
Thursday, October 6, 2011

Applegarchy

It's always sad when someone suffers and dies earlier than they might otherwise have because of cancer. So I am sad that Steve Jobs has died. Of course, were he a nobody instead of a billionaire, I wouldn't have felt anything about it. I would have ignored his death like all the other strangers' deaths.

I am not much of a believer in the sorts of ideals Steve Jobs came to represent, and seeing the outpouring of gratitude in various media outlets for how he "invented the future" and so forth has made me feel more than usually estranged from the culture I live in. So forgive me if I come across as sour or surly. For his hagiographers, Jobs is an innovative, entrepreneurial genius who gave concrete form to the inchoate desires of the masses to live more beautiful lives. Indeed, he is the man whose marketing savvy brought us the gadgets that set us free to become what we wanted to be, the stylish silhouettes dancing in the old original iPod advertisements, opaque and indistinguishable in their solipsism.


What I see when instructed to appreciate the awesomeness of the world that Jobs helped create is a world full of atomized consumers enthralled by gadgets that promise to augment their lives but just as often compress them, reify them, codify them into quantified data. I see a superficial aesthetic anchored in fastidious fonts and hermetic product design that I am supposed to receive as a special consolation, a privilege of my era as wonderful as electricity or refrigeration. I see commercial products specifically designed to repel curiosity and DIY modification championed as harbingers of the triumph of the "personal." I see gadgets designed to accelerate consumption and subsume more of everyday life to the anxieties of mediation represented as great enablers of productive self-expression. Apple under Jobs put a sleek, brushed-aluminum case on the ideology of consumerism and convinced us it had sparked some sort of revolution.

I have no special complaints about the functionality of Apple's products, though they are relatively overpriced. Their vaunted ease of use has only occasionally disappointed me, though I have never understood why I was supposed to be so grateful for it. Praising products for merely working seems to speak of our undue tolerance for broken, shabby things, not a generalized elevation of expectations. And outside of fast fashion, perhaps no company exemplifies the commitment to obsolescence more rigorously than Apple. No other company has been more successful in leveraging the media to make its perfectly functional products seem useless and outdated on a regular schedule. All hail "innovation"!

Still, my problem has always been more with the cult of Apple and of Jobs himself. To me, Jobs represented the tyranny of design, the soft command of seductive interfaces, the covert control through cleverly marketed convenience, the triumph of closed, hierarchical systems over open-source ones, of commercial protocols and the ethos of the gated community over the commons. More than any other corporate executive, he commoditized creativity and sold it as a fungible status symbol. Apple is supposed to serve as proof that good design can drive capitalist expansion, that market competition will ultimately produce only things that are held by consensus to be not only utilitarian but beautiful. But one could also see this as a demonstration of capitalist ideology's advance -- it no longer needs appeals to utility and rationality to justify itself, but can presume its subjects will regard exchange itself as beautiful, that its logic cannot but yield pleasure. Apple thus betokens a growing dependence on the market in order to experience pleasure. We must buy things to entitle ourselves to an aesthetic feeling.

That I own an iPod probably opens me to accusations of hypocrisy in some people's eyes. Complaining about consumerism but still shopping for things probably makes me a hypocrite to such people (if they are not straw men) too. If you participate at all in the status quo -- if it ensnares you as it is intended to -- you have no right to criticize it. It's incumbent on you instead to celebrate it. A cursory look at Twitter shows there is certainly no shortage of cheerleaders. When I listen to music, it doesn't mean that much to me if it happens to come from an iPod. But Apple ideology tells me it should, that the device is more significant than what it conveys. My reactionary response has been to fetishize vinyl.

Part of me feels a visceral envy with regard to Jobs that marks the degree to which I've vicariously participated in the myth that has been built around him, in the entrepreneur worship, the fantasy of power -- of being able to alter others' lives and still be regarded as benevolent. Technology is a perfect vector for that sort of power, masking the agency of those who develop it and program it and representing that as irresistible progress. That instinctive envy engenders a deep skepticism of Silicon Valley, of the sort of people drawn to it, those who seek technocratic means to dominate the world, impose a vision, dictate the contours of others' lives. Jobs worship perpetuates the idea that proprietary technology is developed for us, for our improvement and our needs, rather than for profit or for the egos of venture capitalists and self-proclaimed visionaries. It makes more sense to me, if you want to worship tech gurus, to choose someone like Linus Torvalds, though I doubt he'll be on the cover of Time when he dies.

Friday, August 19, 2011

Marshall McLuhan Centennial (21 July 2011)

To mark the 100th anniversary of the birth of Canadian media guru Marshall McLuhan, Megan Garber has an extensive post about his ideas at the Nieman Journalism Lab site, pointing out, somewhat cryptically, that "McLuhan’s theories seem epic and urgent and obvious all at the same time. And McLuhan himself — the teacher, the thinker, the darling of the media he both measured and mocked — seems both more relevant, and less so, than ever before." I think that means that we take McLuhan's useful insights more or less for granted even as they shape the contours of the debate about the impact of mediatization. McLuhan certainly wasn't afraid to make sweeping, unsubstantiated generalizations, which definitely makes his account of history occasionally "epic," but almost unfalsifiable as well. So sometimes it seems like McLuhan is just relabeling phenomena (this is a "hot" medium, this is a "cold" one) without performing much analysis, translating things into jargon without necessarily developing arguments.

Garber notes a recent essay by Paul Ford about the media's role in imposing narratives on the flux of events and regularizing time and points out that "If McLuhan is to be believed, the much-discussed and often-assumed human need for narrative — or, at least, our need for narrative that has explicit beginnings and endings — may be contingent rather than implicit." That is, the norms of our reading, or rather our media consumption generally, are shaped by existing levels of technology and how that technology is assimilated socially. We don't come hardwired with a love of stories, as literary humanists sometimes insist. Narrative conventions are part of what society is always in the process of negotiating -- they are political, ideological, like just about every other kind of relation. McLuhan believed that new media forms would retribalize humanity, undoing some of the specific sorts of freedoms market society (which he links specifically to books and literacy) guaranteed and introducing different ways to construe freedom. The danger, as Garber implies, is that we will get swallowed by real time, which old media broke into manageable increments but which new media have redissolved. This opens up possibilities of deliberate disorientation and unsustainable acceleration of consumption.

Anyway, I recently read McLuhan's Understanding Media, and this is what I took away from it. The general gist is that print media support individualism and economistic rationality: "If Western literate man undergoes much dissociation of inner sensibility from his use of the alphabet, he also wins his personal freedom to dissociate himself from clan and family" (88). Literacy, in McLuhan's view, makes capitalist-style consumer markets possible: "Nonliterate societies are quite lacking in the psychic resources to create and sustain the enormous structures of statistical information that we call markets and prices.... The extreme abstraction and detachment represented by our pricing system is quite unthinkable and unusable amidst populations for whom the exciting drama of price haggling occurs with every transaction" (137). This ties in to the idea that humans must learn to be rational in an economic sense -- that such calculation is not inherent but socially constructed. Capitalist society (and its media) equips us with this form of reason during the process of subjectivation.

But the atomized, anonymized individuals of the literate world are prone to anomie, to being "massified." Whereas subsequent media (more immersive and real-time; accelerated) are returning culture dialectically to a more "tribal" orientation -- the "global village." We collectively try to defeat time by pursuing the instantaneousness of new media; this speed, this accelerated transience begins to undo economism in favor of some new collectivity. "Fragmented, literate and visual individualism is not possible in an electrically patterned and imploded society" (51). So it's obvious why the P2P types and the technoutopian futurists are attracted to McLuhan, who more or less established their rhetorical mode. But McLuhan occasionally issues some warnings about the mediated future as well. This, for example, seems like a prescient critique of the attention economy and recommendation engines:

Once we have surrendered our senses and nervous systems to the private manipulation of those who would try to benefit from taking a lease on our eyes and ears and nerves, we don't really have any rights left. (68)

And later he writes, "The avid desire of mankind to prostitute itself stands up against the chaos of revolution" (189). In other words, technology will be commercialized rather than become subversive.

McLuhan claims that "the effect of electric technology had at first been anxiety. Now it appears to create boredom" (26). That is, it exacerbates the paradoxes of choice, encourages us to suspend decision making for as long as possible, since switching among a newly vast array of alternatives appears easy. But such suspension, such switching may have hidden cognitive costs, may contribute to ego depletion. He points out how technology tends to accelerate exchange, noting that, for example, "by coordinating and accelerating human meetings and goings-on, clocks increase the sheer quantity of human exchange." This seems to be a structural fit with capitalism's need to maximize exchange to maximize opportunities to realize profit. Photographs, too, create a world of "accelerated transience" (196).

He also notes that certain technologies seek to make self-service labor possible, eliminating service requirements and prompting us to take on more responsibility for ourselves as a form of progress (36). That is, technology institutes convenience as a desirable value that trumps other values; ease and efficiency make collectivity appear progressively more annoying, a social ill to be eradicated in the name of individualist freedom, the only freedom that counts.

McLuhan anticipates the rise of immaterial labor, as "commodities themselves assume more and more the character of information" -- they become signifiers, bearers of design distinctions and lifestyle accents. "As electric information levels rise, almost any kind of material will serve any kind of need or function, forcing the intellectual more and more into the role of social command and into the service of production." Hence the rise of the "creative class" and the importance of social production, building brands and meanings and distributing them authoritatively. Manufacturing becomes a pretense for information, where the real profit margins are:

At the end of the mechanical age people still imagined that press and radio and even TV were merely forms of information paid for by the makers and users of "hardware," like cars and soap and gasoline. As automation takes hold, it becomes obvious that information is the crucial commodity, and that solid products are merely incidental to information movement. The early stages by which information itself became the basic economic commodity of the electric age were obscured by the ways in which advertising and entertainment put people off the track. Advertisers pay for space and time in paper and magazine, on radio and TV; that is, they buy a piece of the reader, listener, or viewer as definitely as if they hired our homes for a public meeting. They would gladly pay the reader, listener, or viewer directly for his time and attention if they knew how to do so. The only way so far devised is to put on a free show. Movies in America have not developed advertising intervals simply because the movie itself is the greatest of all forms of advertisement for consumer goods.

McLuhan insists that "the product matters less as the audience participation increases" -- that is because that participation is the product, the manufactured good, the pretense. "Any acceptable ad is a vigorous dramatization of communal experience," McLuhan claims (228); by this I think he might mean that ads plunge us into visceral experience of what Baudrillard calls the "code" of consumerism. McLuhan asserts that ads draw us into neotribal experiences of collectivity; I think this claim is undermined by the rise of personalization and design ideology. We collectively participate in the idea of customizing our consumer goods, but finding a unique angle on this common culture is the main avenue for hipster distinction. We craft our own niche for ourselves, and become anxious to isolate ourselves from others within the various constituencies brands create for themselves. Belonging to the communities facilitated by media products fosters a simultaneous tension to escape their embrace, to make one's participation singular. That is to say, media participation is as competitive as it is collaborative.

In the last chapter, McLuhan says this of the future of work:

The future of work consists of earning a living in the automation age. This is a familiar pattern in electric technology in general. It ends the old dichotomies between culture and technology, between art and commerce, and between work and leisure. Whereas in the mechanical age of fragmentation leisure had been the absence of work, or mere idleness, the reverse is true in the electric age. As the age of information demands the simultaneous use of all our faculties, we discover that we are most at leisure when we are most intensely involved, very much as with the artists in all ages.

This sounds a lot like the autonomist idea of the general intellect, which kicks in after automation becomes standard in industry. McLuhan's way of putting it: "Many people, in consequence, have begun to look on the whole of society as a single unified machine for creating wealth.... With electricity as energizer and synchronizer, all aspects of production, consumption, and organization become incidental to communications." He suggests that the only profession of the future will be teacher. We will all be teaching each other new ways to please and divert ourselves, new ways to want more things. Learning itself becomes "the principal form of production and consumption" (351). That sounds like a good thing, but one must factor in the ramifications of widespread, institutionalized narcissism, which leads us to become experts in one very particular subject: ourselves. When the alleged structural unemployment subsides, this is the sort of service economy we will be left with -- the full flowering of communicative capitalism. We are consigned by automation to industrialized, mass-produced individuality that we must never stop blathering about.

Non-Time and Hauntology (5 May 2011)

I went to a talk last night at NYU by Mark Fisher about "hauntology," which refers to a kind of intermediate space-time between places palpably shaped by organic time and nonplaces (shopping malls, etc. -- see Marc Augé), which are wrenched out of time and posit an unending nontime, the end of history, an undisruptable retailing present that perpetually recurs. I didn't really get what hauntology was all about: it seemed to have to do with cultural productions that are aware of the nonplace/nontime crisis -- the way neoliberalism has foisted non-space/time on us, along with a subjectivity without depth that must flaunt its requisite flexibility by shuffling the deck of floating signifiers -- and are "reflexive" and "critical" and "negative" about this condition. Fisher made this point with music: British pop music now is blithely appropriational of the past without foregrounding that in any particular way; retro has ceased to be a meaningful descriptor. So music made now would not be at all disruptive, he argues, if someone living in 1979 heard it. There would be no retroactive future shock. It doesn't sound like the future; the future that should be occurring now has been thwarted, lost, effaced. The sense of cultural teleology is gone, vanished, perhaps, in the now pervasive relativism that regards all culture product as potentially valuable.

There are lots of plausible and interrelated explanations for why the pop-culture future can no longer occur, including:

(1) The demise of a hegemonic culture industry (and the rise of digitization and peer-to-peer distribution) brought the end of a shared sense of the cultural moment. We're not all watching the same TV show at the same time and hearing the same records on the radio. Instead we have access to all culture all at once, on demand -- whether it's, say, Lamb Lies Down on Broadway, the complete works of Margaret Cavendish, yesterday's episode of Survivor, or all of them at once. This AV Club article by Steven Hyden about Def Leppard's Hysteria gets at the idea:

As everything changes rapidly around us, we as music fans in many ways still think we’re living in a Def Leppard world, where winning a Grammy means you’ve arrived, and going to No. 1 on the charts makes you a pop star. In reality, we live in a culture where the terms “mainstream” and “underground” have become virtually meaningless, as practically every song by every band ever is equally accessible, frequently at no cost, to anyone with an Internet connection and the interest to seek it out ... It’s clear that music rarely unites us under the banner of mass-accepted artists anymore; even in a concert audience, we’re all just a bunch of individuals, with little connecting us to one another beyond a shared interest in the artist onstage—one artist among hundreds on our abundantly stocked iPods. Sounds lonely, doesn’t it? Sometimes I yearn for the old world, the one I grew up in, a place where dinosaurs like Hysteria stomped around pop culture for months, if not years, leaving sizable impressions in the hearts of a generation, whether they liked it or not.

The availability of everything means that particular works of pop music lose "symbolic efficiency" to use (and possibly misuse) a term from Žižek. Nothing successfully connotes the zeitgeist; everything invokes a desire to one-up with a better reference or a new meme or detournement of the contemporary. We are too knowing and skeptical to accept anything as unproblematically representative of the now.

(2) Neoliberalism/post-fordism/late capitalism has projected itself as the end of history, normalized nontime, and generalized the reception of conditions of ontological insecurity as freedom. We lack a subjectivity that can experience or recognize historicity.

Fisher links the idea of a "missing future" with the disappearance of negativity and criticality in contemporary pop culture, which (as I interpret it) has no space for anything oppositional or which transforms oppositional gestures into postures that circulate only as signifiers of personal identity. It reminds me of Douglas Haddow's "Hipsters are the dead-end of Western culture" argument:
An artificial appropriation of different styles from different eras, the hipster represents the end of Western civilization – a culture lost in the superficiality of its past and unable to create any new meaning. Not only is it unsustainable, it is suicidal. While previous youth movements have challenged the dysfunction and decadence of their elders, today we have the "hipster" – a youth subculture that mirrors the doomed shallowness of mainstream society.
Hipsters don't experience non-time negatively, as a loss, as melancholic, as indicative of deep alienation. Instead they seem to be thoroughly subjectivized by neoliberalism, to the extent that they regard it as an opportunity to show off how creative they can be in their cycle of appropriations. The last thing they want is to be reminded of how their personality is conditioned by the times they live in; in nontime, one can feel transcendent and immortal, one can permanently defer adulthood.

Hauntological music (like Burial) tries to at least evoke the feeling of loss, tries to register the missing future as a kind of catastrophe, Fisher argues, though it can't actually instantiate this missing future. It tries to at least restore meaning to the concept of retro, foregrounding the appropriations of the past by sounding like a scratchy record, and so on. (I don't know; all electronic music literally sounds the same to me.) I wasn't persuaded that a work's reflexivity about how symptomatic it is itself of the impossibility of escaping non-time made it viable as a mode of resistance. I'm probably too skeptical of reflexivity to ever regard it as resistance; I see reflexivity as the quintessential mode of neoliberalist subjectivity -- a calculating self-consciousness that can't be escaped, that forces us to consider our identity as an alienated thing to be developed and invested entrepreneurially. (The following is highly provisional and may ultimately prove embarrassing): Whatever is reflexive needs to become collective. The problem of non-spacetime is that of an isolated individual subject who admits of no possibility for intersubjectivity, which is perhaps the primary way we experience history, through how our relations with others subjectivize us in particular, contingent ways. Reflexivity about our loss of that intersubjectivity seems to still cling to the individuation, to see and secretly cherish one's isolated uniqueness and incontingency in the recognition of it as a loss.

In my view, social media have become the extension of non-spacetime, where nothing, no identity or incident, is necessarily contingent or organic, and one is doomed to the "freedom" of endless ontological insecurity, the forever search for a grounding authenticity that can only generate more memes. Social media are where we go to protect our experience of nontime, which is threatened by the Real, by historicity, by death. Facebook is the ultimate nonplace. Being on it is to enter non-time, to maintain a continual pseudo-presence.

The non-spacetime crisis, I think, is a crisis of presence. When we exist in non-spacetime, presence becomes impossible -- or it is known by its absence, in a kind of negative theology. To put that less cryptically (or maybe not): technology has basically dissolved the unity of the subject in a particular place in time. Smart phones, etc., let us be in many places at once, conducting any number of conversations and self-presentations asynchronously. This casts an air of provisionality over everything we do; our lack of total commitment to that place at that moment is always implied, always understood. No one is even bothered anymore when someone they are talking to looks at their phone. There is no ethical requirement to be fully present, and without that, there is no genuine (I know, how can you even ever define "genuine") intersubjectivity. The refusal to be fully present is a restatement of the refusal to permit our identity to be socially contingent or to be palpably collective. The smart phone reserves our right to check out of any collective identity formation at any time. This is the essence of contemporary "convenience," which I have long interpreted as being able to avoid interaction with other humans, to avoid being forced to empathize with them and recognize their existence as other. (We can only tolerate other people when we regard them as extras in our movie.)

Fisher referred to Jameson's distinction between psychological nostalgia and formal nostalgia, between the ability to evoke a real lost past and being trapped in pastiche. What I took from this is that the postmodern/neoliberal subject cannot access psychological nostalgia, but can only simulate it through pastiche, as this sort of subject has only existed in nontime as opposed to historical time. My sense is that this subject doesn't yearn for historical time at all but worries about historical time erupting into nontime via some sort of terrible Event. When something that threatens to be an Event happens, subjects rush to assimilate it to nontime by mediatizing it, "sharing" it in social media, or meme-ifying it. I'm not sure if this holds, but it may be possible to interpret the ad hoc celebrations of Osama bin Laden's execution this way -- an effort to experience a historical moment in a way that dehistoricizes it -- puts the partyers back at the center of their personal hermetic history, claims the Event as just an event in their individual story.

Because we have no access anymore to psychological nostalgia, we end up nostalgic for the capability for nostalgia; we feel homesickness for a home we never had. This leads to a compensatory attraction to childhood kitsch, to moribund objects (joining a typewriter club is an extreme manifestation of this), to anachronism, atavism, whatever seems genuinely and indelibly marked by a past. This perpetuates the cycle that denies the creation of a distinctive future, guarantees that the future is a more attenuated and annotated reconfiguration of detritus from the past.

(Malcolm Harris has more thoughts inspired by the talk here.)

Thursday, August 18, 2011

"This generation got no destination to hold" (4 April 2011)

I enjoyed the essay about the "iPod era" by Nikil Saval in the new n+1 (excerpt by Slate here), which considers the ramifications of solitary listening for political action. I thought it would cover the same ground as sociologist Michael Bull's book about the iPod, Sound Moves (which I wrote about here), but it turned out to be something quite different.

I don't accept Saval's claim that the music of the 1960s was "an incitement to social change" without a lot of qualifications that are not supplied in the essay; I think the music was shaped by political ferment that preceded it rather than vice versa, and most politicized pop was cashing in on the zeitgeist. The explicit merging of politics and youth music culture in the popular mind undermined political action, diverting it into self-neutralizing spectacles like Woodstock (funny how that more or less heralded the end of radicalism for many of its participants even as they claimed it was just the beginning; instead it was as if they knew they had achieved the goal of having participated in something "historical").

I'm more persuaded that the surge in politicized pop in the 1960s was a demographic phenomenon, a Baby Boomer thing. There is always music for youth expressing a rejection of certain aspects of the status quo; when there was more youth, there was more of it, and it was more prominent culturally, as that bulge of young consumers made what they consumed significant to the entire capitalist system. There was no golden age when music could be put to good political use. All music conjures a feeling of solidarity, I think; iPodization has just made the vicariousness of the process explicit.

What I liked best about Saval's essay was the survey of pop music sociology: Saval suggests Adorno and Bourdieu offer "the two most considered attempts to connect music and society" and contrasts them: Adorno held out for traditional aesthetics and the importance of high culture as a form of resistance; Bourdieu rejected musical taste's autonomy from the social order, arguing that it reflected status rather than an appreciation for something transcendent. I'm glad Saval takes some tentative steps to refute the idea that Adorno was "wrong" about popular music (I've written about that before for what it's worth), but still seems to want to emphasize the autonomy of the consumer in an administered consumer system. However, the essay makes this excellent point:
The danger now is different. The man no longer needs a monopoly on musical taste. He just wants a few cents on the dollar of every song you download, he doesn't care what that song says. Other times he doesn't even care if you pay that dollar, as long as you listen to your stolen music on his portable MP3 player, store it on his Apple computer, send it to your friends through his Verizon network.

Popular culture is already subsumed by capital; this is not different from the situation in the 1960s. RCA didn't care what kind of polemic Jefferson Airplane put on its albums, because the company just wanted to sell records. The radical sentiment was already commodified and neutralized; what it inflamed in listeners was to a degree already contained, already likely to express itself as radical chic, vicarious fantasy, and scenesterism rather than radicalism. And scenesterism is good for the culture industry; it enriches the value of products with new, valuable meanings for customers. It builds brand equity.

Saval suggests Bourdieu is a "philistine" who asserts the "falsehood" that "music is the 'pure' art par excellence. It says nothing and it has nothing to say." I don't think that's false at all; I think it is a recognition that music, like any other form of art, is not an untarnished container for humanistic pieties about what constitutes "greatness." But music, like abstract art, more easily masquerades as such because it seems "purified" of interpretable content and presents audiences with the higher truth of form qua form.

Bourdieu, Saval claims, refuted Adorno, but I think that Adorno and Bourdieu have complementary perspectives; both see popular culture as manifesting its failure to be autonomous from capitalism and the classes capitalism structures to support itself.

Saval's essay concludes, probably ironically, by recommending silence as a mode of resistance, as a means of steering between the Scylla of Adornesque snobbery and the Charybdis of Bourdieuian identity self-consciousness:
One radical option remains: abnegation—some "Great Refusal" to obey the obscure social injunction that condemns us to a lifetime of listening. Silence: The word suggests the torture of enforced isolation, or a particularly monkish kind of social death. But it was the tremendously congenial avant-garde gadabout John Cage who showed, just as the avalanche of recorded music was starting to bury us, how there was "no such thing as silence," that listening to an absence of listener-directed sounds represented a profounder and far more heroic submission than the regular attitude adopted in concert halls—a willingness to "let sounds be," as he put it ... Silence is the most endangered musical experience in our time. Turning it up, we might figure out what all our music listening is meant to drown out, the thing we can't bear to hear.
That seems like a Baudrillardian fatal strategy to me: "against the acceleration of networks and circuits, we will look also for slowness," he wrote in Fatal Strategies, in 1983. "Not the nostalgic slowness of the mind, but insoluble immobility, the slower than slow: inertia and silence, inertia insoluble by effort, silence insoluble by dialogue. There is a secret here too."

The dilemma Saval diagnoses is painfully familiar to me; I am always trying to find a way to "really" hear music, stop instrumentalizing it. But I probably won't ever choose silence or to "Enjoy the Silence" or even listen to Hymns to the Silence. My strategy recently has been repetition. (Insert obligatory citation of Derrida and/or Lacan here.) I tend to listen to the same handful of albums over and over again and hope that constitutes a nullification of the imperative to seek and enjoy novelty for its own sake through the medium of popular culture. Right now (god help me) one of those albums is Wings' Wild Life.

Surveillance and the Social Layer (2 March 2011)

I didn't listen to the interview with the founder of startup company Hashable described in this Silicon Alley Insider post, but a section of the recap caught my attention. It details the thinking process behind a serial entrepreneur's approach to the "social layer" -- the industry jargon for how immaterial labor can be captured in digital form through internet connectivity:
They thought about the gestures that business people and white collared professionals were conducting day after day that were not making it to the Internet.

They recognized that the acts of people meeting with and introducing other people were not making it in a structured way to the Internet. When you have breakfast/lunch/coffee/drinks with friends, that data would be valuable to users if a service would gather it and make it shareable. So, they decided it might be cool to build an application that was fun to use, social, and drives people to create that information and send it to Hashable. While the interface looks a lot like Foursquare, instead of saying “I’m at Starbucks”, you would say “I’m having coffee with Mark”. Hashable is saving very important information for its users and creating a multipurpose address book of the people its users interact with.

Notice how this exchange is structured. What is regarded as inherently intolerable is that any sort of social behavior could escape digital capture, could slip through the net of commercial surveillance. Innovation has become a matter of perfecting that surveillance, allowing all our behavior to be mediated and translated into marketing data to fuel the engines of consumerism -- to perfect the management of demand.

The contemporary tech startup's critical ("cool") task is to somehow entice you to share your private information in a standardized digital form in as close to real time as possible by making it "fun" and "social" and more or less compulsive, if not compulsory. It should find ways to "drive" users to report on themselves without the burden becoming intolerable.

"Fun" and "social" in this sort of context tend to be undefinable; their meanings are presented as obvious common sense, and thus they can only be elaborated through tautology: what is fun is social, and what is social is fun. Behind that screen, gamification tactics are deployed to encourage users to regard the progressive surrender of privacy as individuating accomplishment, a kind of glory that can measured only insofar as we let ourselves be tracked. The surveillance apparatus disguises itself as a giant scoreboard.

The point of all this data collection is, of course, privatized profit.
To the question of how Hashable plans to monetize, Mike answered honestly that “we’re not sure”. Hashable is creating unique data sets. The relationships that users have with people, the strength of those relationships – Hashable may be able to monetize access to that information. Users gain points every time they use Hashable. A certain number of points could be required for certain access, so Hashable might offer an option to pay for points or charge for access if you don’t have points.
It's interesting that though users supply the content, it is Hashable that "creates" the data sets -- a subtle rhetorical move that allows social-media companies to justify the property rights they claim with regard to the information they collect. And here we also see that the original purpose of the service -- to "save very important information for its users" -- has shifted to "monetizing access to that information" by selling it to outside parties. It's also interesting to see how using the service is convertible to currency from the company's perspective -- it extracts more value from users using the service than those users get from being ostensibly served by it. One would have to pay to not use it, presumably after one had invested just enough labor into building a personal network within Hashable's proprietary clutches to not just give it up altogether.
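To make the usage-as-currency logic in that recap concrete, here is a minimal sketch of a points-gated ledger. Everything specific in it -- the class name, the point values, the threshold, the per-point price -- is my own invented illustration, not Hashable's actual system.

class PointsLedger:
    """Toy model of usage-as-currency: every logged interaction earns points,
    and access to the aggregated data is gated behind a threshold (or a fee).
    All numbers here are invented for illustration."""

    ACCESS_THRESHOLD = 100   # hypothetical points needed to unlock the full contact graph
    PRICE_PER_POINT = 0.05   # hypothetical cash substitute for participation

    def __init__(self):
        self.points = 0
        self.interactions = []  # the "unique data set" the service, not the user, ends up owning

    def log_meeting(self, activity: str, person: str) -> None:
        """'I'm having coffee with Mark' becomes a structured, storable record."""
        self.interactions.append((activity, person))
        self.points += 10

    def can_access_graph(self) -> bool:
        return self.points >= self.ACCESS_THRESHOLD

    def cash_needed_for_access(self) -> float:
        """What a low-usage user would have to pay instead of participating."""
        shortfall = max(0, self.ACCESS_THRESHOLD - self.points)
        return shortfall * self.PRICE_PER_POINT

The asymmetry is visible even in the toy version: the user's reward is a number on a scoreboard, while the accumulating list of interactions is the asset.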

As always with social media, the goal is to get you to invest enough of yourself in someone else's proprietary network so that you become trapped by it. Then the company can hold that part of yourself hostage if you object to the way they whore it out.

Tuesday, August 16, 2011

Paying Attention (22 Nov 2010)

Nicholas Carr is not happy about this NYT Magazine column by Virginia Heffernan about the "attention-span myth." Heffernan contends that technology critics like Carr err in imagining that something like an attention span exists.
The problem with the attention-span discourse is that it’s founded on the phantom idea of an attention span. A healthy “attention span” becomes just another ineffable quality to remember having, to believe you’ve lost, to worry about your kids lacking, to blame the culture for destroying. Who needs it?
Apparently Heffernan regards the pathologizing of short attention spans as a disciplinary ruse to stifle children's creativity and discourage the artistic temperament. There should be no normative correction of the ability to pay attention; distraction is only a different form of attention, or attention paid to things society disapproves of. Distraction is a refusal to pay attention to the things you are supposed to attend to in order to conform. The inability to concentrate, then, is the triumph of the human spirit and its refusal to submit. If you feel like you have a hard time concentrating, it's just a lame excuse for procrastinating, a disguised form of nostalgia for a personal epoch of total focus that never really existed. Technology has nothing to do with it, because there is no "it".

Carr responds by insisting that attentiveness in fact exists and takes different forms (not merely "long" or "short"), and these forms and their prevalence are affected by technological context.
One can, for instance, be attentive to rapid-paced changes in the environment, a form of attentiveness characterized by quick, deliberate shifts in focus.... There is a very different form of attentiveness that involves filtering out, or ignoring, environmental stimuli in order to focus steadily on one thing - reading a long book, say, or repairing a watch. Our capacity for this kind of sustained attention is being eroded, I argue, by the streams of enticing info-bits pouring out of our networked gadgets. There are also differing degrees of control that we wield over our attention.
I agree with that; perhaps the way to split the difference and avoid the semantic arguments is to say that technology has certainly changed the sorts of things we want to give our attention to. I would add that built into consumerism is an incentive to make sure people scatter their attention as widely as possible on the greatest number of things and experiences, all of which have now been successfully packaged (often thanks to technological change) as exchangeable commodities. When a person's attention is fixed on a specific activity, it registers as lost opportunities to make sales -- one for each infinitely divisible moment that passed in which the person could have been distracted, could have consciously shifted attention, but didn't. That's why, unlike Heffernan, I see concentration rather than distraction as an act of cultural resistance.

The problem with reckoning with attention is not that it is ineffable but that it doesn't correspond to an economic model that has us spending and replenishing some quantifiable supply of it. But the metaphors built into an "attention span" or "paying attention" or the "attention economy" imagine a scarce resource rather than a quality of consciousness, a mindfulness. It may be that the notion of an attention economy is a sort of self-fulfilling prophecy, bringing into being the problems it posits through the way it frames experience. It may not be constructive to regard attention as scarce or something that can be wasted and let those conceptions govern our relation to our consciousness. The metaphor of how we exert control over our focus may be more applicable, more politically useful in imagining an alternative to the utility-maximizing neoliberal self. The goal would then be not to maximize the amount of stuff we can pay attention to but instead an awareness that much of what nips at us is beneath our attention.

When the algorithms ignore you (draft) (1 Nov 2010)

For a long time, well past the point of reasonability, I was one of those people who didn't want a cell phone. I romanticized the idea of disappearing completely as technology left me behind and I could assume a "pure" form of unmediated existence. I professed a fear of being too easily reachable. But more likely I was afraid that carrying a phone around would provide continual proof that in fact no one wanted to reach me, that I had already disappeared and didn't even know it.

But what I harped on most was my fear of having to ignore certain calls and having the person who called assume that I was choosing to ignore them. Maybe they would trust that I have a good reason; maybe they would think, like I often do, that I am a self-centered jerk. But the plausible and conveniently neutral explanation that I am not home would no longer apply, would no longer be at the ready to mitigate the missed connection.

Inevitably, I thought, this would degrade my moral obligation to reciprocate with friends in order to sustain for myself the idea that they actually are friends. The burden of reciprocation becomes too great: Because we are expected to carry around phones, it's easy to presume that by default, the conversational channel is always open and that when your call is not taken, it's as if the person you were trying to talk to had turned their back on you mid-sentence. A new burden emerges: an obligation to explicitly redraw the boundaries of availability. Being "not at home" becomes a state of mind, a choice that brings forward any feelings of self-importance that might otherwise lie dormant.

It would seem natural enough to want to associate availability with presence, but thanks to technology, we can be virtually present yet dispositionally unavailable. This conundrum evokes those scenes in 19th century novels when the card of a hopeful drawing-room visitor is sent up, and the servant returns to say that no one is in, even though everyone knows that they are. Etiquette deems this a frank, acceptable lie, a recoding that blunts the truth that one person has refused to speak to another.

I have always been uncomfortable with that, probably because I too readily imagine myself being refused. It seemed to me that cell-phone technology was muddling social signals, putting both too much and too little ego at stake in the effort to communicate. To restore clarity, I thought it might be morally useful for everyone to unambiguously experience the full weight of their refusing, and then maybe they would do less of it. (I believed somewhat naively that no one thrived on the capricious power to reject.) Naturally I overlooked how much refusing I do. When I didn't have a cell phone, I operated under the illusion that my various refusals were disguised by the "not at home" fiction; now I am in the position of having to feel like one of those drawing-room snobs every time I ignore a cell-phone call. But I don't want to be the snob, I protest to myself -- the snobs are supposed to be picking on me. I'm the underdog! I'm the underdog!

If anything, social media have made our responsibilities toward those friends trying to reach us even murkier. Some of the success of those platforms must be attributable to how they ease the pain of refusing people. When you are socializing in a broadcast medium, you don't have to refuse anyone. To accept a friend request on Facebook, for instance, sends out a self-satisfying burst of good will and burdens us very little; at worst, we may have to defriend the person later, an invisible action that the defriended may never notice. The awkwardness of building cliques is displaced to the medium and becomes part of the platform's functionality, how it allows you to regroup friends and filter the results of their gross social product online.

And increasingly, Facebook is performing the social filtering for us, absolving us of the guilt implicit in that as well. Social media structure communication between friends so that the responsibility for listening -- inescapably built into earlier mediums that structured talk between friends as person-to-person -- is modulated into a vaguer injunction to respond if and when you feel like it. Because status updates and the like are not addressed to anyone specific, they don't generate an obligation in anyone specific to pay attention. The messages instead compete in an attention marketplace that Facebook's interface creates and moderates, in which the currency is comments and "likes" and the other click-driven responses that the company can measure and process algorithmically. The results of that process -- which is explored in this Daily Beast article by Thomas E. Weber -- determine which messages will be featured prominently in friends' newsfeeds, and which ones will be suppressed.

The algorithms that Facebook uses to generate the newsfeed it presents to users do a lot of pre-emptive refusal for us, filtering out material it has determined will be less compelling to us (or to someone in our friend pool, or to someone statistically similar to us, at least). But its sense of our interests is only one aspect of its filtering criteria. Weber points out that the "most recent" updates are not simply the most recent ones, that posts that others have responded to are more likely to show up in news feeds, and so forth. He advises that you "try to get a few friends to click like crazy on your items" if you want to show up in your friends' newsfeeds more regularly. Facebook apparently prioritizes updates that prompt "user engagement" (links requiring a click-through) over ones that are just thoughts or ideas.
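A rough sketch of the sort of engagement-weighted filtering Weber describes might look like the following. The weights, the recency decay, and the link bonus are all assumptions of mine for illustration; Facebook's actual ranking is proprietary and far more elaborate.

from dataclasses import dataclass
from datetime import datetime
from typing import Dict, List

@dataclass
class Post:
    author: str
    text: str
    posted_at: datetime
    likes: int = 0
    comments: int = 0
    has_link: bool = False  # click-through content, the "user engagement" that gets rewarded

def score(post: Post, affinity: Dict[str, float], now: datetime) -> float:
    """Toy relevance score: measured engagement x closeness to the author x recency decay."""
    engagement = 1.0 + post.likes + 2.0 * post.comments
    closeness = affinity.get(post.author, 0.1)   # how often you interact with this person
    hours_old = (now - post.posted_at).total_seconds() / 3600
    recency = 1.0 / (1.0 + hours_old)
    link_bonus = 1.5 if post.has_link else 1.0
    return engagement * closeness * recency * link_bonus

def build_feed(posts: List[Post], affinity: Dict[str, float], now: datetime, top_n: int = 10) -> List[Post]:
    """Return only the top-N posts; everything below the cut is pre-emptively refused for you."""
    return sorted(posts, key=lambda p: score(p, affinity, now), reverse=True)[:top_n]

The point of the sketch is simply that every term in the score is something the platform measures about our behavior, not something we ever chose as a criterion for our friendships.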

In short, Facebook filters what we are saying for its own purposes of keeping people logged in and generating data trails it can use. So what?
For average users, cracking the Facebook code is something of a fun puzzle. But for marketers trying to tap Facebook—or individuals who see the service as a way to promote themselves—understanding how content propagates through the system is anything but a game.
But the deeper danger that Weber glosses over here is that Facebook may be systematically obliterating the distinction he maintains between average users and marketers. It makes friendship into a game of self-promotion, a struggle to get noticed, for everyone brought into its sphere.

Basically on Facebook, our communication is assessed like online advertising -- how many click-throughs did it inspire? Presumably this will prompt us to make what we say on Facebook sound like advertising discourse. That in turn furthers the sense that on Facebook and other social media, we are not "friends" so much as competing personal brands, and that the media themselves encourage that kind of identity formation. After all, the way we talk about ourselves is to a large degree who we think we are. Self-expression is insufficient for sustaining friendship in the realm of Facebook; you need to offer actionable communiques, you need to be selling something, and you need the numbers on your side to make an impact.

In a sense, this is always how markets function; they depersonalize exchange and reduce transaction costs, thereby increasing the number of exchanges that occur. Accordingly, the volume of friend communication we consume thanks to Facebook has increased exponentially, but we have next to no ethical obligation with regard to any of it -- that's understood by all parties entering into Facebook's market for friendly discourse. That market works like any other successfully efficient one. But what has radically changed is the nature of friendship, which once upon a time was something intended specifically as a bulwark against depersonalization, against market logic. But with Facebook, the consumerist allure of "more, faster" has fused with a closely related moral cowardice about rejecting people to drive us en masse to bring the efficiencies of commercialization right into the heart of our social lives.



Monday, August 15, 2011

Conversation and Convenience (9 Sept 2010)

A few prefatory remarks: I dislike talking on the phone and generally avoid it if I can. I dislike voice mail even more, a moribund technology that I wish would be discontinued so that those behind the curve don't accidentally use it and expect people to listen to their messages. I am not proud of my attitudes, though, despite frequently airing them. These attitudes reflect my fear of being engaged in a conversation I can't control, of having to expend my precious time on discourse from another human being that I can't skim through. Not to be overdramatic, but I think I am in some small way psychically murdering those people who are trying to reach me when I refuse to listen to their messages, when I choose my convenience over their relative inarticulateness and the slowness of the technology they have chosen to use. The same goes for timeshifting the act of conversation so that it can cease to be reciprocal and take the form of broadcasting. It's a casual act of cruelty to deny that reciprocity, to withhold the possibility of spontaneous sympathy and understanding.

It makes me think of that section of My Dinner With Andre in which Andre talks about having to endure small talk while he was in the midst of watching his mother die.
ANDRE: [Long pause.] Well, you know, I may be in a very emotional state right now, Wally, but since I've come back home, I've just been finding the world we're living in more and more upsetting. I mean. Last week I went down to the public theater one afternoon. You know, when I walked in I said "hello" to everybody, 'cause I know them all and they all know me, and they're always very friendly. You know that seven or eight people told me how wonderful I looked, and then one person, one, a woman who runs the casting office, said: "Gee, you look horrible! Is something wrong?" Now she, we started talking, of course I started telling her things, and she suddenly burst into tears because an aunt of hers, who's eighty, whom she's very fond of, went into the hospital for a cataract, which was solved, but the nurse was so sloppy she didn't put the bed rails up, so the aunt fell out of bed and is now a complete cripple! So, you know, we were talking about hospitals. Now, you know, this woman, because of who she is, you know, 'cause this had happened to her very, very recently, she could see me with complete clarity. [Wally says "Un-hunh."] She didn't know anything about what I've been going through. But the other people, what they saw was this tan or this shirt, or the fact that the shirt goes well with the tan, so they say: "Gee, you look wonderful!" Now, they're living in an insane dream world! They're not looking. That seems very strange to me.

WALLY: Right, because they just didn't see anything somehow, except the few little things that they wanted to see.

ANDRE: Yeah. You know, it's like what happened just before my mother died. You know, we'd gone to the hospital to see my mother, and I went in to see her. And I saw this woman who looked as bad as any survivor of Auschwitz or Dachau. And I was out in the hall, sort of comforting my father, when a doctor who is a specialist in a problem that she had with her arm, went into her room and came out just beaming. And he said: "Boy! Don't we have a lot of reason to feel great! Isn't it wonderful how she's coming along!" Now, all he saw was the arm, that's all he saw. Now, here's another person who's existing in a dream. Who on top of that is a kind of butcher, who's committing a kind of familial murder, because when he comes out of that room he psychically kills us by taking us into a dream world, where we become confused and frightened. Because the moment before we saw somebody who already looked dead and now here comes a specialist who tells us they're in wonderful shape! I mean, you know, they were literally driving my father crazy. I mean, you know, here's an eighty-two-year-old man who's very emotional, and, you know, if you go in one moment, and you see the person's dying, and you don't want them to die, and then a doctor comes out five minutes later and tells you they're in wonderful shape! I mean, you know, you can go crazy!

WALLY: Yeah, I know what you mean.

ANDRE: I mean, the doctor didn't see my mother. People at the public theater didn't see me. I mean, we're just walking around in some kind of fog. I think we're all in a trance! We're walking around like zombies! I don't think we're even aware of ourselves or our own reaction to things, we're just going around all day like unconscious machines, I mean, while there's all of this rage and worry and uneasiness just building up and building up inside us!
I think our communication devices are making it harder to escape the dream world; they're making us all conversational butchers who deny one another's reality because we can't be bothered to look beyond our own fantasia. And signs of the resultant passive-aggressive hostility seem to be everywhere -- to me anyway, it seems obvious in myself, when I find myself cursing at my phone like I have Tourette's because it tells me there is a voice mail for me to listen to. I know then that I live in an insane dream world.

Not everyone sees it this way. At the Economist's Free Exchange blog, Ryan Avent responds to a Kevin Drum post about hating the phone. Avent defends the drift toward textual communications:

Younger people want to talk on the phone less because the opportunity cost of setting everything else aside is higher, and because the substitutes for phone conversations are better than ever.

At any given moment, I'm carrying on many, many different conversations. Some of these conversations are conducted through blog arguments. Others, via email. Still others take place using instant messaging or Twitter. Other people use other modes—Facebook, Flickr, comment threads, and probably other social network tools I've not heard of. But what all these options have in common is that the participants in the discussions can engage in them at their convenience. I can return an email whenever I have a spare moment....

A phone call, on the other hand, requires both participants to be talking to each other in real time.... time spent on a constrictive phone call is time not spent on the many other conversations an individual has going.

Of course, this takes some getting used to. What is actually an increase in productivity feels to those used to long phone calls like an overwhelming and thought eviscerating wave of distraction. Plus, it's hard to hear over cell phones! But if phone calls feel burdensome to young people, it's because they're often actually burdensome. And the conversion of a convenience into a burden is representative, above all else, of progress.

Here I must disagree with Avent. I don't regard the experience of convenience or lack of it as an automatic indication of "progress." I think putting a private illusion of productivity ahead of the fostering of a shared psychic space through conversation is a terrible mistake, an inhumane selfishness that our gadgets make all too easy for us to indulge. I don't think the ability to conduct "conversations" at our convenience is especially beneficial. It erodes the very concept of social reciprocity, of necessarily willed attention in the moment, even if it is against one's inclination. If there is to be a meaningful public sphere, it requires resistance, friction, argument, confrontation. It requires inconvenience, the inconvenience of focusing our attention most of all. It requires the difficulty that Jonah Lehrer talks about here with regard to e-readers.
I’d love them to include a feature that allows us to undo their ease, to make the act of reading just a little bit more difficult. Perhaps we need to alter the fonts, or reduce the contrast, or invert the monochrome color scheme. Our eyes will need to struggle, and we’ll certainly read slower, but that’s the point: Only then will we process the text a little less unconsciously, with less reliance on the ventral pathway. We won’t just scan the words – we will contemplate their meaning.

Difficulty prompts contemplation. It disrupts the insane dream world; it forbids the caustic solipsism that ultimately doesn't even serve ourselves, but shuts us in a crypt of incomparable and thus impotent self-regard.

The more convenience we introduce to conversation, the more we're winnowing away the difficulties that preserve the possibilities of discourse. Instead we get a simulation of communication that precludes a confrontation with anything outside the dream world, and makes sure that the world we share with others will not change in any meaningful way. Convenient communication lets existing power relations further entrench themselves; the convenience assures that discussions of their inequity can never be broached.

Gmail's Priority Inbox (31 Aug 2010)

I suppose it's a measure of my social insignificance that I don't get very many emails to my personal account in a day -- no more than a dozen or so, tops, and a few of those are from automated mailing lists. I have never had much trouble keeping up with that influx. So I can't say that the priority inbox feature that Google is rolling out -- described here by Slate's Farhad Manjoo -- is targeted at me. But it bothers me nonetheless, and for the usual reasons. Google wants to assume responsibility for some of our tedious decision-making tasks so as to better develop a simulacrum of our thinking processes, to anticipate what we want as an intermediary step to knowing better how to dictate our desires.

Google proposes to collect information about your priorities so that it can sort your mail for you before you see it and perhaps even ensure that some emails deemed unworthy of you never offend your eyes, much the way most spam is effectively banished to junk folders. Manjoo describes how it is supposed to work:
Priority Inbox looks for signals that a message is especially valuable. Among other things, it analyzes your experience with a particular sender—is a message from someone whose mail you tend to open and reply to? Was the e-mail sent only to you, or was it part of an e-mail list? Did the message contain keywords that have proved interesting to you in the past? If a message makes the threshold for importance, Gmail marks it with a small yellow tag. These messages will appear at the top of your inbox, above the rest of your mail.
That seems semi-innocuous, automating a process you can set up manually with Gmail's existing filtering features. What makes this whole thing sinister is this: Manjoo notes that "Priority Inbox promises to get better the more I use it. Google has added two buttons that let you train the system—you press one button to mark a message as important, and another to mark it unimportant." To call it "training the system" is to put a benign aspect on what's really the collection of highly intimate personal data about your social life and the encoding of it in a way that makes it immediately operational. It's in effect a Google initiative to get hold of some of the "social graph" data Facebook is designed to collect and own -- who cares about who, what, when, and why. Facebook spits out its status update feed on the basis of such information and is always refining it through its monitoring of our interactions with the site. No doubt Google would like to offer future marketers something similar in terms of proof-of-concept: "See, we can mimic their own decision making so well that they accept it as their own!"
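Just to make the mechanics concrete: here is a toy sketch, in Python, of the kind of scoring-plus-feedback loop Manjoo describes -- weight a few signals (do I reply to this sender? was the mail addressed only to me? does it mention words I've cared about before?), compare against a threshold, and let the "important/unimportant" buttons nudge the model. This is emphatically not Gmail's actual code; every name and number in it is invented for illustration.

```python
# Hypothetical sketch of a priority scorer in the spirit of Manjoo's description.
# Not Google's algorithm; all names, weights, and thresholds are made up.

from dataclasses import dataclass, field

@dataclass
class Message:
    sender: str
    to_only_me: bool
    body: str

@dataclass
class PriorityModel:
    reply_rate: dict = field(default_factory=dict)   # sender -> fraction of their mail you reply to
    hot_keywords: set = field(default_factory=set)   # words that have mattered to you before
    weights: dict = field(default_factory=lambda: {"sender": 2.0, "direct": 1.0, "keywords": 1.5})
    threshold: float = 1.5

    def score(self, msg: Message) -> float:
        s = self.weights["sender"] * self.reply_rate.get(msg.sender, 0.0)
        s += self.weights["direct"] * (1.0 if msg.to_only_me else 0.0)
        hits = sum(1 for w in msg.body.lower().split() if w in self.hot_keywords)
        s += self.weights["keywords"] * min(hits, 3) / 3.0
        return s

    def is_important(self, msg: Message) -> bool:
        return self.score(msg) >= self.threshold

    def train(self, msg: Message, important: bool) -> None:
        # The "two buttons": explicit feedback shifts the sender's standing
        # and the keyword set, so similar mail is promoted or demoted next time.
        prev = self.reply_rate.get(msg.sender, 0.0)
        target = 1.0 if important else 0.0
        self.reply_rate[msg.sender] = prev + 0.3 * (target - prev)
        if important:
            self.hot_keywords.update(msg.body.lower().split())

model = PriorityModel(reply_rate={"editor@example.com": 0.8}, hot_keywords={"deadline"})
msg = Message("editor@example.com", to_only_me=True, body="Reminder: deadline Friday")
print(model.is_important(msg))  # True: a replied-to sender, addressed only to me, hot keyword
model.train(Message("deals@example.com", False, "Huge sale"), important=False)
```

Every press of those buttons, in other words, is another data point about who and what you care about, stored where it can be put to work.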

Paying for the Internet in Vulnerability (6 Aug 2010)

Often, despite my not infrequent fulminations, as I find myself spending more and more time in front of computer screens, reading and writing and even Twittering with ever more frequency, I start to wonder if I have been too pessimistic about the internet, about its role in accelerating our consumption of culture, the degree to which it more thoroughly saturates our everyday lives with marketing and its associated ideology: the celebration of novelty for its own sake, the embrace of narcissism as a mode of hyperfriendship, the supplanting of knowledge with information and data, the transformation of consumption into meme production, the mobilization of identity into a circulating personal brand that articulates the amount of society's attention one is worth, the disappearance of contemplation in favor of increased mental throughput, the sense that quality, though frequently brandished as a goal, is in truth a liability unless it can serve as an emollient to our alacritous neuroprocessing. (I was going for a sentence of Ruskin-like expansiveness -- how did I do? Perhaps protracted Proustian periods will persuade us all to take the long view now and then.)

But when I read a news item like this WSJ article, by Nick Wingfield, I am reminded all over again that I am not as cynical as I should be. The article details how Microsoft considered developing its internet browser so that user privacy would be better protected by default, but then decided that such a course would inhibit the true purpose of internet accessibility.
In early 2008, Microsoft Corp.'s product planners for the Internet Explorer 8.0 browser intended to give users a simple, effective way to avoid being tracked online. They wanted to design the software to automatically thwart common tracking tools, unless a user deliberately switched to settings affording less privacy.... In the end, the product planners lost a key part of the debate. The winners: executives who argued that giving automatic privacy to consumers would make it tougher for Microsoft to profit from selling online ads.
The internet is ultimately not a commons, and our access to it is conditional on our vulnerability within it. Neither Microsoft nor any other tech company is in business to open our access to free-flowing information or protect our privacy for nothing. (The companies that do want to help you do that are parasites who rely on the others to intentionally endanger it.) Their business, as network architects and technicians, is ultimately surveillance -- to make sure one is connected to the network and appropriately exposed, exploitable as a node. Wingfield points out that "the 50 most-popular U.S. websites, including four run by Microsoft, installed an average of 64 pieces of tracking technology each onto a test computer." We get to use the internet, or rather companies want to make it possible for us to use the internet, because they can reap the rewards from our data processing there -- that's the only reason. And at tech companies that survive, executives are in place to smack down the wild-eyed dreamers among the product developers who think otherwise. This graphic illustrates the way the tracking systems work, and how we, in our lust for information, work to transform ourselves into demographic data.
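For what it's worth, the mechanism the article (and its graphic) describes is simple enough to caricature in a few lines: one tracker embedded on many unrelated sites, one cookie per browser, one profile that quietly accumulates everything. The sketch below is a schematic simulation with invented names, not any company's actual code.

```python
# Schematic simulation of third-party tracking: many "publisher" pages embed
# the same tracker, the tracker sets one cookie per browser, and every embed
# reports back to the same log -- which is how separate visits get stitched
# into a single behavioral profile. All names here are invented for illustration.

import uuid
from collections import defaultdict

class ThirdPartyTracker:
    def __init__(self):
        self.profiles = defaultdict(list)   # cookie id -> list of (site, page) visits

    def embed(self, browser_cookies, site, page):
        # The embedded beacon: reuse the tracker's cookie if present, set it if not.
        cookie = browser_cookies.setdefault("trk_id", str(uuid.uuid4()))
        self.profiles[cookie].append((site, page))
        return cookie

tracker = ThirdPartyTracker()
browser = {}   # stands in for one user's cookie jar

# The same browser visits three unrelated sites that all embed the tracker.
tracker.embed(browser, "news.example", "/politics/deficit")
tracker.embed(browser, "shop.example", "/running-shoes")
tracker.embed(browser, "health.example", "/symptoms/insomnia")

# One cookie, one profile: the demographic data we pay with.
print(tracker.profiles[browser["trk_id"]])
```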

And here's more reason for cynicism: Google's negotiations with Verizon to in effect put an end to net neutrality. They are discussing placing a burden on content creators to pay to have their content distributed efficiently on the internet. This seems like it would ultimately reinstitute the gatekeeping power of the media companies, which would quickly turn such costs into something that mimics the costs of printing and distributing bundles of paper, or pressing grooves into vinyl, or what have you. So any dream of the internet being a democratizing, disintermediating force in the realm of cultural production would be effectively quashed. Amateurs would be on the ham-radio section of the net, with transmissions at lugubrious levels, while the professional media would be on the "real" internet.

Death of the Author? (5 Aug 2010)

I've thought this over a bit today and basically agree with Matt Yglesias that the claim Trip Gabriel reports in this NYT article -- that the ethos of the internet is prompting kids to plagiarize more than they used to -- is pretty dubious. Here's the core of Gabriel's article:
Professors used to deal with plagiarism by admonishing students to give credit to others and to follow the style guide for citations, and pretty much left it at that.

But these cases — typical ones, according to writing tutors and officials responsible for discipline at the three schools who described the plagiarism — suggest that many students simply do not grasp that using words they did not write is a serious misdeed.

It is a disconnect that is growing in the Internet age as concepts of intellectual property, copyright and originality are under assault in the unbridled exchange of online information, say educators who study plagiarism.

Digital technology makes copying and pasting easy, of course. But that is the least of it. The Internet may also be redefining how students — who came of age with music file-sharing, Wikipedia and Web-linking — understand the concept of authorship and the singularity of any text or image.

Gabriel's article seems like misplaced anxiety; the stakes are pretty low with plagiarism: students are basically only "hurting themselves" by cheating on their homework, as the proverb goes, and it's not like the papers are up for publication. These cheaters are not David Shields or Jonathan Lethem. The idea that students suddenly don't understand the concept of authorship reminds me of the worst nightmares of the fuddy-duddy professors who would fulminate about Barthes and Foucault and "this so-called textuality" when I was a graduate student. Where was the proper respect for Genius?

Gabriel interviews anthropologist Susan Blum, who seems like this species of worrywart.

In an interview, she said the idea of an author whose singular effort creates an original work is rooted in Enlightenment ideas of the individual. It is buttressed by the Western concept of intellectual property rights as secured by copyright law. But both traditions are being challenged.

“Our notion of authorship and originality was born, it flourished, and it may be waning,” Ms. Blum said.

She contends that undergraduates are less interested in cultivating a unique and authentic identity — as their 1960s counterparts were — than in trying on many different personas, which the Web enables with social networking.
Obviously she hasn't heard Mark Zuckerberg lecture about integrity and a single online identity. I'd be surprised, too, to find that students are uninterested in authenticity and unique identity and are seeking to merge with the multitude in a gesture of postmodern antisubjectivity. Self-broadcasting media and Web 2.0 seem to emphasize the value of a unique identity, not dissolve it.

Students, I suspect, don't take attribution seriously because the work they are being asked to do is not serious to them. They don't have much of a sense of scholarship as a collective enterprise, or of what they do in college as scholarship. With gen-ed classes, they know they are just marking time and doing busy work for the most part. They are right to think that plagiarism is not "a serious misdeed" that is somehow different from any other form of academic dishonesty. To pretend otherwise is to serve the ideological bidding of the lords of intellectual property.

The implication of plagiarism hysteria is that scholarship is a process of claiming ownership of proprietary information, an exceedingly unnatural attitude that students have always needed to be indoctrinated into, particularly if they want an academic career. This usually involves a series of ritualized genuflections in the form of citations of the recognized masters of a particular discipline as part of a student's professionalization into the academy.


Yglesias notes that the Web prioritizes the association of data with its metadata -- song files with the artists, etc. -- and thus generally organizes information so that it is easier to deduce where it originated if you are so inclined. It's never been easier to catch cheaters, he points out, something that was true even when I last taught college courses, in 2001.

I am inclined to think that the ubiquity of material available for appropriation and the ease of cutting and pasting itself explains most of the alleged rise in plagiarism. In my experience, most students who were inclined to cheat were way too lazy to retype passages out of a book.
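And on Yglesias's point about how much easier the catching has become: verbatim cut-and-paste is about the easiest thing in the world to detect mechanically. The crude sketch below -- shingle both texts into overlapping word runs and intersect them -- is only an illustration; real detectors are far more elaborate, and the "source" here is simply the Gabriel sentence quoted above.

```python
# A crude illustration of why cut-and-paste plagiarism is easy to catch:
# break both texts into overlapping word n-grams ("shingles") and look for
# shared runs. Real plagiarism checkers are much more sophisticated; this is
# just a sketch with the quoted Gabriel sentence standing in as the source.

def shingles(text, n=6):
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def shared_runs(submission, source, n=6):
    return shingles(submission, n) & shingles(source, n)

source = ("The Internet may also be redefining how students understand "
          "the concept of authorship and the singularity of any text or image.")
submission = ("Many believe the Internet may also be redefining how students understand "
              "the concept of authorship, for better or worse.")

# Any six-word run lifted verbatim shows up immediately.
print(shared_runs(submission, source))
```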

Friday, August 12, 2011

Information Processing and Pleasure (30 July 2010)

I've been reading Tyler Cowen's provocative book The Age of the Infovore (a.k.a. Create Your Own Economy), which argues for the beneficial potential in seizing upon information organization as a form of pleasure itself rather than preparatory work that leads to pleasure. I'm somewhat skeptical of that; I tend to lament the time I spend sorting my library on iTunes instead of hearing the music. The need to organize and accumulate feels like a screen between me and the music; I can't even hear it anymore until it's organized, and I find myself listening as a way of processing to know how to sort a song, put it in its proper playlist, rather than to enjoy it in a more sensation-oriented way. I add so much metadata that it begins to obscure the data; the metapleasure cannibalizes from the pleasure I once derived from music. I end up just collecting music and information about it; much of it never gets played at all. And that gnaws at me at times. I fantasize about getting the "never played" playlist down to zero -- sometimes I consider leaving my iTunes playing while I sleep.

Cowen asserts that the organization makes the music "actually sound better" -- presumably that satisfaction from organizing can be enjoyed as sensuous. To me these are distinct satisfactions -- the organization "pleasure" feels more like OCD compulsion, an anxious restlessness at everything not being in its proper place. Whereas getting lost in the music is something entirely different, a suspension of anxiety and the need to "get things done." Perhaps the way I experience pleasure is no longer in sync with society -- i.e., my generation was socialized in a disappeared age, and the structure of everyday life now demands a different kind of subjectivity, responsive to different modes of pleasure. I may be insufficiently autistic; Cowen suggests that the pleasure in ordering and processing is a quintessentially autistic trait that is becoming advantageous in an infocentric economy.

Cowen argues that ordering can be a mode of relaxation, rather than a mere manifestation of the psychic pressure to be productive: "Ordering and manipulating information is useful, fun, alternately intense and calming, and it helps us plumb philosophical depths.... It is a path toward many of the best rewards in life and a path toward creating your economy and taking control of your own education and entertainment." In other words, the infiltration of digitally mediated information processing into our daily practices gives a chance to experience more autonomy in our lives, provided we are content to live life at the level of "little bits," as he calls them -- memes, cultural fragments, decontextualized informational nuggets, isolated data points and so forth. Cowen makes this crucial point: When access is easier (which it has become, thanks to the internet), we tend to favor smaller pieces of information as a way of diversifying our options. This could be a matter of our inherent preference for novelty, though it may be a consequence of the values we inherit from our society, which privileges novelty over security, omnivorous dabbling over deep geekery. Either way, our internal filters are winnowing, such that we start to choke on anything more substantial than a tweet, become restless at the thought of assimilating larger, holistic hunks of culture. This seems to be a conceptual shift in how we approach experience, not as something overwhelming to lose ourselves in but as something to collect and integrate within ourselves as a series of discrete, manipulatable objects.

Social norms, biological imperatives and technological developments, then, have fragmented culture into ever smaller bits, as our identities have been cut free from traditional anchors. And experiences have been reified, in part because of the ease with which they can be digitized and distributed. As a result, we now carry the burden (or enjoy the freedom) of having to continually reassemble such fragments into something coherent and useful for ourselves -- into our self-identity, into an amalgam that represents our interests and self-perceptions, as well as the image we want to present socially. The Internet "encourages us to pursue our identities and alliances based around very specific and articulable interests," Cowen notes -- they need to be simplified to match the bittiness of how we all have begun to see the world.

As Cowen points out, culture was once largely ordered for us collectively by the nature of the slow media through which it reached us. Songs came in a prescribed order on an album. K-tel picked the hits for Music Explosion. Now we do the selection and the arranging for ourselves. "A lot of the value production has been moved inside the individual human mind," Cowen writes.

The key word is "individual," though. These amalgamations are increasingly private and intensely personal, but nonetheless need social validation, which was intrinsic to the cultural order when it was mandated for everyone. When there were only three TV channels, everyone wanted to know who shot J.R. and no one needed to explain what they were talking about with that or why they cared. Now I would need to do a lot of explaining if I was intensely curious about who shot J.R. (which I am, and please don't spoil it for me!).

The point is, we want our identities -- our cultural investments -- recognized; we want to be understood. So we end up having to explicate ourselves, "share" our private organizational schemes with ever more urgency on the host of new media forms designed primarily to facilitate this sort of communication -- the communication of privately curated little bits organized into a hierarchy, commented upon, glossed in an effort to make their contingent coherence more broadly comprehensible so that we feel less alone, less like we're treading water alone in a vast sea of information.

Our ongoing effort to communicate the significance of our assemblages is itself a harvestable kind of information processing -- it has personal value to us, making us feel understood and recognized. But it has monetary value to media companies and marketers as demographic data and semantic enrichment for their brands and products. Our quest for coherence and recognition and ontological security turns out to be very useful intellectual labor when resituated outside the crucible of our own identity.

Sometimes this seems very sinister to me, a monetization of our social being in a way that cuts us out of the rewards, even as it makes some "knowledge work" jobs expendable. It also leaves us with an identity that feels more fragile and reified at the same time; we are alienated from our immediate experience of ourselves and instead relate to ourselves as though our identity is a brand. It also means that the public sphere becomes "the social factory," as the autonomistas say, a realm that blends production and consumption so seamlessly that leisure and for-itself social activity and pure sensual immersion become impossible. They become irrelevant, outmoded forms of pleasure -- contemplation (decidedly and necessarily inefficient) is a casualty of the joys of efficient processing as pleasure. (Cowen calls this the Buddhist critique -- ordering precludes a sense of oneness and harmony with the universe that Buddhists pursue. Nicholas Carr makes similar points about focus in The Shallows; our brains are being changed by internet use to disregard contemplation as joy.)

We are driven to be producing informational value and accepting that as pleasure, rather than engaging in the kinds of pleasure Bataille grouped under the notion of expenditure -- waste, symbolic destruction, eliminating meanings, destructuration, entropic anarchy. That may be a good thing, unless you believe the need for "expenditure" builds up within a rationalized society and may explode into fascist movements if not ventilated. It seems that digitization means that our visions of excess are directed into a rage for ever larger collections of things (think hoarders) or ever more order.

Nothing's gonna stop the flow (27 July 2010)

Alan Jacobs, responding to Peggy Nelson's celebration of the flow, asks:
In the Flow, are “listening” and “consuming” distinguishable activities?
That's an interesting distinction: Listening, if I'm reading Jacobs right, is a way of appropriating knowledge that is not simultaneously productive or immediately situated in an exchange process, as the word "consuming" implies. The digitally mediated flow seeks to make any noneconomic responsiveness to art or culture or anything else in life more or less impossible -- or at least ideologically undesirable. Why just attend a lecture when you can liveblog it and "add value" with your coverage?

The implied imperative in Web 2.0 is to make all consumption productive, allowing us to avoid the ignominious fate of becoming a passive consumer -- the straw-man figure of our era, the inauthentic conformist couch potato who has surrendered all agency. Perhaps no subject position has been more demonized than that one in late-capitalist consumerism, as various investigations of the rebel consumer illustrate.

The flow basically eliminates the lag time that listening presupposes, the space in which a more considered response can germinate (if warranted). You might call it the space that makes a deliberate aesthetic possible. (Ross Douthat suggests this space for contemplative reading is becoming a luxury, a class-based privilege contingent on being able to afford a distraction-free retreat. I would add that it's also a matter of class whether one feels impelled to be relevant through accelerated productive consumption or whether one will be confident of one's relevance as a matter of habitus.) If the immediate aesthetic response is simply a coded form of obedience to existing relations of power, the social order inscribing itself on our hearts as Eagleton argues, then obviating the space of rumination reinforces the aesthetic's ideological function. We can't dispense with the aesthetic, which allows for real experiential pleasure, a pleasure that seems to resound deep within us and call forth a certain holistic sense of ourselves that is wedded to enduring ideals of the good. But when we make our aesthetic response more deliberate, there is a chance to align our pleasure and our identity with consciously affirmed social ideals.

The real-time revolution, the rise of the flow, basically requires all responses be even more spontaneous than aesthetic approval or else be forgotten and ignored as everyone moves on with the tide of events. From the perspective of real-time hegemony, listening is an arrogant effort to arrest the flow of events rather than swim with them and contribute to the flow's momentum. Trying to stand still amid the flow, to stop and listen, to focus longer than the flow's pace permits, is to ask to be drowned as the flood washes over you.

Long live the new efficiencies (7 July 2010)

I always knew there was something suspicious about concentration, considering how inconvenient and inefficient it is, slowing my consumption down unconscionably. I'm so glad the network can glean the by-products of my perpetual boredom and restlessness and make proper, efficient use of them. I am glad we are evolving.

In an essay for the NYTimes Opinionator blog, evolutionary-psychology proponent and "card-carrying Darwinian" Robert Wright responds to Nicholas Carr's case in The Shallows that our interaction with the internet affects our ability to concentrate, leaving us permanently distracted. Wright suggests that this feeling of permanent distraction is a good sign, indicating that our brains are being fused to others, contributing to purposes larger than we are capable of comprehending. If we could actually concentrate on what we were doing in responding to too many things at once, we would end up cutting out the myriad networked connections to others that put informational tidbits in motion and make them useful.
On balance, technology is letting people link up with more and more people who share a vocational or avocational interest. And it’s at this level, the social level, that the new efficiencies reside. The fact that we don’t feel efficient — that we feel, as Carr puts it, like “chronic scatterbrains” — is in a sense the source of the new efficiencies; the scattering of attention among lots of tasks is what allows us to add value to lots of social endeavors. The incoherence of the individual mind lends coherence to group minds.
That's pretty chilling. It reminds me of The Charge of the Light Brigade:
Theirs not to make reply,
Theirs not to reason why,
Theirs but to do and die:
All hail the new efficiencies! What difference does it make if they obliterate subjectivity as we have known it? An overrated propensity, identity, unless it can "add value" to "social endeavors," that is, unless it has brand equity. But as for individual autonomy? Who needs it. As Wright tells us, "this fragmenting at the individual level translates, however ironically, into broader and more intricate cohesion at the social level — cohesion of an increasingly organic sort." The group mind must know what it's all about -- it's organic, after all. Wright notes that he is "nostalgic as the next middle-aged guy for the time when focus was easier to come by," but he is not worried that the superbrain is malevolent or totalitarian.

I, for one, also welcome the group mind. It relieves me of all my anxieties and responsibilities. It makes me feel comfortable in my ignorance, which obviously is integral to a larger purpose determined by the magic conjunction of those who share my avocational interests, who are kept equally ignorant. A thousand gut reactions always amount to more than one considered response.



Thursday, August 11, 2011

Logan's Run Economy (21 June 2010)

To follow up on the post yesterday about foolish austerity, and the noxious view that unemployment benefits shouldn't be extended because it would be rewarding a bunch of lazy slackers: some of the unemployed would love to be hired but are (a) stuck in a house in a depressed area or (b) old. Or both. Ezra Klein links to this Washington Independent piece by Annie Lowrey that details the difficulties older workers have in getting hired.
There are structural reasons that the unemployment crisis is hitting older Americans so hard. Older workers are more likely to be underwater homeowners, unable to sell their house and move away. They often have highly specific marketable skills, and seek positions more selectively. They also often have skills rendered obsolete by the recession, in outdated trades. But too often, employers illegally presume that older workers will be harder to train, more likely to leave for other positions, less productive, less technologically able or less willing to move — and do not hire them for those reasons.

I worry about this a lot. I work in an industry (magazine publishing) that is notorious for age discrimination. Also, as a PhD dropout who was "overqualified" for most of the entry-level jobs I had to apply for, I can readily remember the frustration and futility, the despair -- and this was when the economy was recovering robustly. I ended up having to spend a lot of time in temp agencies, proving my ability to alphabetize and to open up files in Microsoft Word. I felt like a useless piece of garbage, unwanted by society because I made the error of extending my education until I reached 30, and I wondered when the time would come when they would send me to the soul-renewal chamber like in Logan's Run. How much worse must it be for those laid off in a downturn because they are older, draw more benefits and better salaries, and are ruled inefficient relative to younger, less demanding workers.

Whenever the economy undergoes a structural shift, it seems that older workers will be disadvantaged without remedy, since they will have outdated skills and will be subject to uncorrectable discrimination. Lowrey notes that "policy experts fear that age discrimination in hiring, compounded by the recession, is a problem without a solution. Individuals can bring cases against individual companies, but discrimination is virtually impossible to prove, even if it is easy to see as an aggregate phenomenon." So even though the trend is obvious at the macro level, micro-level concerns make it difficult to change. (Sort of like the problem of inadequate demand in the economy generally.) Thus we have a Logan's Run economy, concentrating unemployment among older workers while making them scapegoats for the runaway deficit that is "stealing from our children." Perhaps the Republicans who stonewalled the jobs bill should stop pussyfooting around and tell these older workers to just go die.

Hang the DJ (8 June 2010)

How can this possibly be true?
“Of the twenty hours a week that an average American spends listening to music, only three of it is stuff you own. The rest is radio,” Tim Westergren told me.

That's from Sasha Frere-Jones's New Yorker article about Pandora. I suppose I am blinkered by my own habits. The amount of time I listen to the radio is generally limited to the time I am trapped in environments playing the radio (e.g., the supermarket on the corner near my apartment, the barber shop, etc.). I don't spend much time in cars, which is where I once listened to the radio, mainly on car trips back and forth from Tucson to Phoenix and Las Vegas. That was mainly to alleviate boredom while traveling alone; trying to find a listenable song is a way to stay awake. All of that has made me feel somewhat alienated from the culture in which I live.

The songs DJs play on the radio are the result of a variety of institutional forces -- playlists, payola, personal preferences, radio station formats -- which is what makes those songs listenable, I think. They encode the zeitgeist, balancing various social pressures and contextual factors, expressing market forces as well as advancing certain trends while leaving a window of theoretical possibility for individual expression and taste. The medium of radio posits a collective audience that can be pleasurable to join or judge, or both. It allows for passive participation in something beyond oneself.

Frere-Jones lauds an online music service: "In some ways, it’s an improvement on the radio model: the number of potentially appealing d.j.s here dwarfs what you might have once found on radio." But that seems backwards -- his assumption that we want more options to suit our individual niche when we listen to the radio seems wrong. I don't think we care as much about hearing something we like as we do about exposing ourselves to what's going on in the world. Listening to the radio feels like cultural participation precisely because the options are limited. If one can choose from a huge number of stations, one doesn't end up with the feeling of participating in culture -- instead one seems to be escaping from it (to where?). The limited number of stations mimics the limits our cultural context puts on our identity. The boundaries are necessary to create a sense of belonging to something, of being something in particular. If "what's on the radio nowadays" becomes an unbounded set of songs, the radio becomes useless as a cultural barometer. And music itself becomes less intrinsic to social life, the more choices there are about what to listen to.

So I am skeptical that DJs will ever be replaced by computer-generated, Pandora-like applications that try to play what you as an individual really want and like. Listening to music is only partially about expressing an individual taste. It is also about reading the collective mind, belonging to it, participating in a cultural conversation that one needs DJs (or perhaps other consumers that the Internet could connect us with) to moderate in order to believe it exists. The iTunes autofill playlists put me into a conversation with an opaque algorithm that tends to infuriate me more than anything else. (Hey, "Genius": Please stop playing Three Dog Night.)