Showing posts with label data processing. Show all posts

Monday, January 30, 2012

The Rise of the Data Self (25 Jan 2012)

This Smithsonian post (via 3QD) offers some more support for my fledgling thesis from Monday's post that "normal" identity is becoming explicitly data-based -- that it's natural to think about who we "really" are in terms of statistics-driven self-surveillance rather than depth psychology or self-actualization quests or anything like that. Freud is out; Facebook et al. are in. For example, we try things that seem self-expressive using media that can give us quantified feedback, and only when the results come back do we decide whether what was expressed was "true." We can likewise convert ourselves into data, which can be made into a statistical profile that returns to us what other people with similar profiles are doing, and hence what we ourselves should be doing.

Recommendation engines are perhaps the most explicit form of this: "Customers Who Bought Items in Your Recent History Also Bought..." Data-driven micro-marketing approaches are another form. The Smithsonian piece offers other examples derived from "quantified self" initiatives that have people monitoring their vital statistics and uploading them for analysis and aggregation.
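The "also bought" logic reduces, mechanically, to co-occurrence counting over purchase histories: you are shown whatever most often accompanies your items in other people's baskets. A toy sketch, not any retailer's actual system -- the data and the function name are invented:

```python
from collections import Counter

# Invented purchase histories: each user's basket of items.
baskets = [
    {"pedometer", "scale", "vitamins"},
    {"pedometer", "scale"},
    {"pedometer", "scale", "vitamins"},
    {"vitamins"},
]

def also_bought(item, baskets, n=2):
    """Rank other items by how often they co-occur with `item`."""
    co = Counter()
    for basket in baskets:
        if item in basket:
            co.update(basket - {item})
    return [i for i, _ in co.most_common(n)]

print(also_bought("pedometer", baskets))  # ranked by co-occurrence count
```

The point of the sketch is how little "you" figures in it: the recommendation is entirely a report on what people with overlapping data profiles have done.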

Consider the possibilities in health care. In the past, anyone analyzing who gets ill and why had to rely on data skewed heavily toward sick people–statistics from hospitals, info from doctors. But now, with more and more healthy people collecting daily stats on everything from their blood pressure to their calorie consumption to how many hours of REM sleep they get a night, there’s potentially a trove of new health data that could reshape what experts analyze. As Shamus Husheer, CEO of the British firm Cambridge Temperature Concepts, told the Wall Street Journal, “You can compare sleep patterns from normal people with, say, pain sufferers. If you don’t know what normal sleep looks like, how do you tease out the data?”

What a dream come true! We can collect enough data to create the profile of the ultimate superbeing: the perfectly average human. And then we can use health-insurance protocols to force everyone to become this or else.

But my suspicion is that this runs deeper — that data collection is slowly becoming the ideological basis of the self — what we regard as the real self. Data is the authorized way to pursue self-knowledge in the networked society; the other means are suspicious, deluded or outmoded. This is not just a matter of the evergreen appeal of naive empiricism. (It has numbers; it can be graphed; ergo, it's true! Numbers don't lie! Who cares how they are contextualized?) Since interactions within social networks are now easily captured and standardized, the quantifiable data thereby produced have become far more constitutive of identity. Just read this article about the designers of Facebook's Timeline function. As the post explains, the user interface is what is supposed to dictate the self as you navigate through your own data heap. With Facebook's organizational help, you muck around in there looking to build the real you.

The assumption is that by letting Facebook capture and process everything, a more reliable version of the self than our own memory can give us will be produced. As the post's title suggests, the UI has "soul"; you do not. Or, as a subheading in the piece claims, life should be seen as having a UI. There is no direct experience of life; it's entirely a data network that we need mediated for us. In one of the more disturbing hubristic-techie quotes I've read in a while, one of the designers tells us what Facebook Timeline lets us do: "You gently consume time.” Rage, rage against the dying of the light, etc.

And though Facebook wants "the Timeline to be a place for self-expression: A way for users to reveal who they are and what their lives are about," it has provided a tightly controlled and highly formatted medium for it that emphasizes standardization (echoing the old Facebook vs. MySpace distinction; Facebook was "clean" because it retained aesthetic control). It imposes the metaphor of life and memory as a stream, which, as Eric Harvey notes, is not some natural, neutral reflection of how we remember but a reshaping of life into narrative, which suits Facebook's ends. The more work we put into making a coherent story out of the data Facebook collects, the more useful, marketable information we give them.

It makes little sense to look within for the true self when we have available immediate (and processable) reactions from and comparisons with masses of other people to help sketch out our contours, when we have an enormous data trail we have created incidentally to be reprocessed by outside parties as self-revelation. Our filters are who we are; the "social graph" is not something on which we are merely one point, but it's instead a map of our identity (or at least what capital wants us to think of as our identity). We become more of a person the more we build out this graph, let the flows of information it facilitates constitute us.

As Nicholas Carr pointed out here, the "right to be forgotten" by tech companies -- companies that want to own our memories and even the process by which we remember -- may be in danger, despite the E.U.'s effort to institutionalize it. The degree to which the data self is naturalized for us will determine how much such a right will seem beside the point.

The new strategies of desire (23 Jan 2012)

The Economist's holiday double issue a few weeks ago had an article about 1950s motivational-research guru Ernest Dichter (author of The Strategy of Desire) that argued that newfangled behavioral economics marks a kind of return to his approach to consumer behavior -- that most of it is "irrational" and dictated by unconscious impulses and emotional needs, not by the perceived usefulness of a particular commodity. For a few decades, those assumptions were regarded as dubious -- rejected as being patronizing toward consumers, refusing to grant them agency or the sophistication to desire things for complicated yet still conscious reasons. Motivational research and hysteria about consumer manipulation at the hands of evil corporations were based on assumptions that consumers are passive saps who are brainwashed into wanting this and that, whereas it had become more politically expedient for both the left and the right to argue that consumers exercised real power in and through their deliberate shopping choices. Consumerism was touted as a genuine forum for self-expression, an arena in which the identity-enriching fruits of capitalism could be harvested. Or it was a place where consumers could genuinely subvert the hegemonic order, repurposing consumer goods to suit their own "revolutionary" purposes and undermine systems of control.

In reality, both of those interpretations of consumer behavior reinforced one another: individualistic identity projects became hard to distinguish from subversive detournement of goods, and squabbling over fashion-derived hierarchies leeched energy away from confronting institutionalized economic ones. That is part of what makes consumer capitalism so durable. It commodifies identity and thereby makes it seem more powerful, the key to solving all of capitalism's other inequities. It starts to seem plausible that the problems with capitalism are simply problems of self-expression.

So what then to make of the return of the irrational consumer? Here's how the Economist article synthesizes recent behavioral research:
Humans, it turns out, are impressionable, emotional and irrational. We buy things we don’t need, often at arbitrary prices and for silly reasons. Studies show that when a store plays soothing music, shoppers will linger for longer and often spend more. If customers are in a good mood, they are more susceptible to persuasion. We believe price tends to indicate the value of things, not the other way around. And many people will squander valuable time to get something free.
In Dichter's time, figuring out how to manipulate consumers beneath the level of consciousness was a matter of applying Freudian theory in what seemed to be intuitive, arbitrary ways ("To elevate typewriter sales, [Dichter] suggested the machines be modelled on the female body, 'making the keyboard more receptive, more concave,' " the article notes). Now it's more a matter of Paco Underhill-style applied surveillance, where retailers spy on their consumers, amass data on their aggregate behavior and draw conclusions that no individual consumer would have been able to explain -- what kind of music leads to more purchases, how wide the aisles should be, which products should be placed at eye level, and so on.

This suggests how our improved capacity for quantification and data processing has changed the way we think of the "true self": It's no longer about depth psychology or the formation of a unique unconscious on the basis of universalized childhood experiences or what have you. Instead, we are starting to think the truth about ourselves is hidden from us not by our defense mechanisms but by our lack of computing power. We only have so much data in our memory (mainly our own limited personal experience) to process with our minuscule capacity to determine what is going on around us or why we are acting in such a way, whereas computers aggregating the behavior of thousands or millions can reveal the genuine, normal response. Computers are our best analysts, not Freudians. We understand our irrationality not in reference to childhood trauma but to some composite of normal behavior built from masses of collected "shared" data and fed back to us under the guise of automated recommendations, superior filtering technology, influencing power within networks, augmented reality, etc. The empirical sheen to the computed conclusions about "ordinary" human behavior in the contrived situations being measured makes them seem all the more incontrovertible. We begin to believe for expediency's sake that the recommendation engines know the real us better than we could ever know ourselves. Accepting that makes it easier to fit into our world.
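The comparison to a "composite of normal behavior" is, mechanically, just a distance from the crowd's average -- something like a z-score. A toy sketch with invented numbers (hours of sleep "shared" by a crowd of users), purely to illustrate the operation:

```python
from statistics import mean, stdev

# Invented hours-of-sleep readings aggregated from other users.
crowd = [7.1, 6.8, 7.4, 7.0, 6.5, 7.2, 6.9, 7.3]

def normality_score(own_value, population):
    """How many standard deviations `own_value` sits from the crowd mean."""
    return (own_value - mean(population)) / stdev(population)

z = normality_score(5.5, crowd)
print(f"{z:.1f} standard deviations from 'normal'")
```

The number is trivial to compute and meaningless without context, which is rather the point: it arrives with an empirical sheen regardless.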

I wonder if capitalism's system of control is evolving in a similar way. The ways capitalism offers subjects the opportunity to elaborate a unique identity are perhaps becoming either insufficient or irrelevant. They are being supplanted by improved surveillance, autosurveillance, constant confessions of the self. The politically useful concept of the unique personal identity is giving way to the more productive networked self, a disseminated identity normed through exhaustive data aggregation. The exhortation to "be ourselves" and discover the authentic self is steadily giving way to the soft commands to always be measuring ourselves, and sharing more information as a means to take that measure.

UPDATE: This All Things Digital post says basically the same thing: "The 'Mad Men' Years Are Giving Way to the 'Math Men' Era" -- i.e., data is more important than creativity in intuiting what will manipulate people.

Wednesday, August 17, 2011

Facebook "Sponsored Stories" (26 Jan 2011)

At ReadWriteWeb, Sarah Perez reports on Facebook's unrepentant new marketing scheme, which it calls, nauseatingly enough, "Sponsored Stories." You, in the course of innocently and eagerly "sharing" updates about the wonderful products and services in your life, craft a magical "story" for your "friends," and advertisers would like to help your "story" reach more of your "friends" by co-opting it and featuring it more prominently on others' home pages. (Sorry for the Carles level of scare quotes, but social media's aggressive transvaluation of the language of intimacy seems to necessitate them.) This ruse allows marketers to indulge one of their favorite creation myths about their business: that ordinary people are always already enthusiastically advertising products to one another, and advertisers emerged to serve as shrewd editors of those conversations, standardizing them and elevating them into a clearer expression of the zeitgeist. We make ad discourse ubiquitous through our free choice. Advertisers just follow our lead.

Facebook's program extends that notion, putting forward the idea that anything we do is best understood as some sort of promotion for something (of course it's always at least self-promotion). Perez explains:
With Facebook's Sponsored Stories, your activity is now up for grabs, available to the advertiser associated with the brand, business or app you interacted with. Just checked in to a restaurant? That's an ad. Just liked a brand? That's an ad. Just shared a news story from the Web? That's an ad.
And naturally, Facebook doesn't allow its users to opt out of this program. You don't control when your endorsement gets adopted and redistributed. Your profile picture just shows up in a sidebar ad as though you're a willing shill.

The lesson here is that Facebook is systematically blurring the line between promotional discourse and nonpromotional discourse -- it acts as though an update and an ad are fundamentally the same, as though context doesn't color your willingness to celebrate a particular product. Arguably, being on Facebook and recapitulating your life there is the process of turning it into one long advertisement -- recasting experience into a media-commodity form. The more time one spends on the site, the more one will structure one's experience in those terms, preconceiving it for its promotional potential. It encourages you to understand yourself as a product requiring advertising, a brand seeking synergies.

The Sponsored Stories program makes that conversion absolutely explicit, and Facebook seems to assume everyone is on board, already accustomed to the idea of marketing discourse being the only relevant sort of public discussion. Facebook already filters potential content from your friends, using algorithms to generate what it thinks you should see. The existence of these algorithms invites efforts to game them, to figure out what will get your update noticed and disseminated the most. Sponsored Stories supplements the algorithms, giving users a chance to jump the line, to craft their updates more like marketing, so they will receive wider play. This then feeds the loop, making personal disclosures seem ever more like marketing, implying that they should be mined even more thoroughly for their advertising potential. Thanks, Zuck!
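The "jump the line" mechanic can be pictured as a toy ranking function: organic stories get a relevance score, and a sponsorship flag simply multiplies it. This is pure illustration -- Facebook's actual ranking internals are not public in this form, and every name and number here is invented:

```python
# Invented mechanics: sponsored stories get a flat score multiplier.
SPONSOR_BOOST = 3.0

stories = [
    {"text": "checked in to a restaurant", "affinity": 0.4, "sponsored": True},
    {"text": "shared a news story",        "affinity": 0.9, "sponsored": False},
    {"text": "liked a brand",              "affinity": 0.2, "sponsored": True},
]

def rank_feed(stories):
    """Sort stories by affinity, letting sponsored ones jump the line."""
    def score(s):
        return s["affinity"] * (SPONSOR_BOOST if s["sponsored"] else 1.0)
    return sorted(stories, key=score, reverse=True)

for s in rank_feed(stories):
    print(s["text"])
```

Note that in this sketch the lowest-affinity organic update can outrank a higher-affinity one the moment the flag flips -- which is exactly the blurring of promotional and nonpromotional discourse described above.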

Quantification, self-exploitation, prostitution (25 Jan 2011)

I've written before about my suspicions regarding the quantified-self movement and the data-driven life. To want to turn one's life experience into data is obviously an expression of a deep-seated urge to become an object. It reflects a yearning to make one's life knowable in an "objective" way -- or rather, to be known in the way that our highly rationalized society regards as authentic. You know, numbers don't lie.

Some of that urge is pragmatic: since bureaucracies are already turning us into data, why not manage the process more directly? But the process of making fresh data by experimenting on oneself seems an especially sinister kind of alienation, reifying a mind-body split that threatens to rob one of the ability to experience life as flow, installing instead a Taylorist manager in one's mind, a superego run amok. With all these "objective" numbers, the potential for self-flagellation becomes limitless, as the numbers vividly render the illusion of life as the ceaseless quest for measurable progress. But there is always another measurement to take. One never arrives. The data we generate becomes a miasma into which our soul eventually disappears; we become a set of numbers pre-formatted for the institutional data processing and recommendation engines that increasingly structure contemporary life. We become ghosts in the machine at best.

I called the obsession with measuring oneself and mimicking the way technology already translates us into data "robot envy," an idea that apparently finds some support in Sherry Turkle's new book Alone Together, which I need to read ASAP. This Chronicle Review piece about Turkle's recent work lays out her thesis about "sociable robots" and the possibility that we will prefer relationships of convenience with selfless machines to fraught and necessarily ambivalent relationships to other humans.
"What if we get used to relationships that are made to measure?" Turkle asks. "Is that teaching us that relationships can be just the way we want them?" After all, if a robotic partner were to become annoying, we could just switch it off.

The article doesn't mention it, but one obvious use for the sociable robots of the future would be as prostitutes. The prostitute-client relationship is already about supplanting awkward intimacy with convenience and efficiency and providing a "made to measure" relationship. It's also a realm in which social behavior has already been rigorously quantified: time is strictly metered and billed at a negotiated rate, which serves as a putative measure of the value of the prostitute's company. This essay in Salon by a prostitute writing under the name Charlotte Shane explains how prostitution can be regarded as the logical end point of the urge to quantify one's social impact. She argues that we "are living in a world where a woman's worth is constantly equated with her sex appeal. Is it any wonder that many women might find it compelling to take that equation to its logical end?" That a woman's "worth" is often regarded as entirely a matter of her sexual usefulness is depressing enough, and one could certainly blame patriarchy, but one shouldn't neglect the way patriarchy is served by the capitalist drive toward ever greater rationalization, efficiency, and quantification. These are all vectors for self-criticism and for insecurity in relation to a numerical standard abstracted from any particular context that could make it meaningful. The strictures of femininity in capitalist culture manifest themselves in lots of numbers: dress sizes, weight, caloric intake, breast measurement, etc. Shane writes, tellingly, "I was so highly self-critical as a young adult that by the time I was 12 I vowed I'd have breast surgery."

Prostitution supplies another of those numbers, one which seems to synthesize all the others into a master index. What unifies the numbers is the underlying notion that genuine, inarguable objectification is sexual objectification, rooted in nature and vested by the inescapable imperative to procreate, which makes the choice of sexual partner seem like the archetypal consumer choice. This logic fuels the crypto-evolutionary suspicion -- epitomized in, say, Michel Houellebecq novels -- that all consumerism is a matter of status competition, but status is merely a genteel way of expressing the competition for suitable mates. Sex sells everything. Once a market for a woman's sexual attention has been created (it is, after all, the "oldest profession"), the ideology that locates a woman's essential value in sexuality seems concrete, material, an inescapable truth -- after all, a specific figure can be quoted. Shane writes:
There's something almost merciful about finally having the clarity of a number, and once you're an escort, you've quite literally put a price on your sexual powers. That's an intimidating assignment for any young woman with a less than robust sense of self-esteem, but it can also be perversely satisfying: You've finally quantified your appeal.
This seems like an especially pernicious example of the alienation inherent in wages generally, but it captures the temptation of self-quantification. It's a weird, entirely temporary form of mercy, though, because the price we obtain is just one quote. Being priced only reminds you that your value is always being negotiated by others, and you are mostly at the mercy of that often arbitrary conversation. Once you have been priced, you have to keep being priced to procure a moment's respite of security in your worth.

The eagerness to be quantified is one of those cures that worsen the disease. It promises to resolve insecurity and afford us more control over our lives and how we are perceived, but instead it radically destabilizes the self and gives over the arbitration of our self-worth to an impersonal market. It impels us to keep generating occasions to be measured -- the yardstick dictates our behavior, supplies the form that shapes our identity. We feel obliged to create information in order to know ourselves, but it doesn't constitute capital for us; it becomes a mode of autoexploitation -- we work hard to turn ourselves into a product for someone else ultimately to sell. The self-exploitation process culminates in the development of the personal brand, which codifies the illusion that we are building equity through compulsive sharing and self-measuring: The qualities of the self aren't seen as ineffable and transcendent, as processes rather than possessions, as indivisible clusters of skill, experience and intuition rather than market-assessed commodities. Instead we find social validation in being on the market and being sold -- an understandable conclusion, given the hegemonic idea that value is somehow conferred by markets. How can you know something is worth anything if you don't try to sell it? How can you know your full potential as a human unless you have leveraged your brand and maximized its value?

Tuesday, August 16, 2011

The Taxonomical Drive and Girl Talk (19 Nov 2010)

Citing Nina Power's book, One Dimensional Woman, Jodi Dean posts about the "taxonomical drive":
[Power] introduces the idea in the context of contemporary pornography: on the internet, one can find whatever one wants, although almost as soon as one finds it, one doesn't really want it anymore. Rather, one wants to see what else is out there. The item itself no longer scintillates. The drive to find other images, to keep moving and looking and marking, takes its place. After you've seen five or six busty amputee tops, you've seen them all--or have you? maybe there are different types? let's look for them! Desire switches into drive, now a drive to taxonomize and classify (blond, with shoes; shaved, no gun; etc...).

I've written a lot over the years about this concept, mainly with regard to music, where amassing more songs and managing the metadata and organizing the music library all begin to cannibalize the pleasure of the music itself. Or rather, these data-driven pleasures mediate our experience of music in a different way from what we knew before mp3s. The music becomes more like information, requiring less of a sensual surrender. Girl Talk seems emblematic of music created to suit this new aesthetic; classifying the samples becomes inseparable from the pleasures of listening to it.

You could draw the conclusion that Girl Talk, despite being "free" to all and seeming emblematic of the potential of a cultural commons over and against intellectual property, also serves to naturalize cultural labor (assigning and classifying semiotic meanings) as the chief pleasure, acclimating us to our doom of being data processors for the media companies that ultimately control the repositories of our lives, which we are turning into data banks with their help. Girl Talk thus functions the same way Power argues that online porn does; in Dean's words, Power "links taxonomical drive with contemporary porn's endeavor to bore us all to death and turn sex into work." Girl Talk models how listening should be immaterial labor. That's not necessarily bad, if one regards that sort of work as its own reward. The fear is that cultural-processing work cuts us off from some other way of experiencing life and pleasure, one beyond the fixation on being useful. It traps us in the "mirror of production," to borrow from Baudrillard.

I find that this foregrounded data component makes it impossible for me to hear music as music; it doesn't engage what feels to me like a deeper part of my brain that responds more directly to sound seemingly stripped of semiosis. But the subjectively deeper experience I am imagining may be an illusion, an ideological chimera conjured by my investment in classifying the "real" in a specific way, privileging a certain nostalgic access to "the way things were" as a kind of protective revenge against youth, and against my becoming moribund. I want to be able to believe that I really hear the music and grant myself permission to condescendingly pity those whose entire listening life has been lived in the digital age. Fetishizing vinyl reflects this as well -- record players are magical time machines transporting us back to the era of "real" listening, where the patina of crackles and surface noise and skips all serve to certify the authenticity of our response to what we hear. Not a clean data stream, real analog sound, embossed on a material substrate that bears the traces of decay, the marks of time -- so much more like our own mortal flesh and therefore more true.

But this is all mystification of course. Music never comes to us purified of signification and thus closer to some unmediated truth; it is always mediated by some degree of contextual information that prepares us, puts us in a certain state of receptivity that will then allow us to flatter ourselves with our responsiveness. "Ooh, the Brahms, it washes over me so!" There's no way of listening to music that would allow it to reveal what our true inner response would be, no way it can test our spontaneous appreciation, however much we might want to leverage it as proof of our intrinsic noble sensibility. We can't prove good taste at the level of the sensual, the level of the music itself; it is always an argument conducted on the level of signs, the level of ideology.

The fantasies about authentic listening and real experience are not just reflections of the will-to-distinction; they are also counter-fantasies to the dominant consumerist dream of achieving the complete archive, of having the most direct access to every possible option, of even being able to at once hold all those possibilities, if not in our heads, then in some other tangible way. We oscillate between seeking the uncollectible, ineffable and thus "real" experience that can't be repeated or precisely commodified, that seems to elude reification; and seeking to collect everything, to taxonomize so as to seem to have a handle on every possible future we could choose for ourselves -- assuming the future is (as consumerist ideology tells us) merely a matter of what we choose to consume. Dean argues that
In a just-in-time culture, a culture of preemption, where connectedness has taken the place of planning, the archive serves as a kind of fortress of planning, a backup plan, a reserve army of the not yet desired but could be. We store up for the future, presuming we can access these stores rather than just add to them.
But that future never comes; the future is always now, and the storing up is the mode of consumption, not a kind of savings, not a deferral. The archive eases the fear of commitment, of having to choose and thereby forgo other pleasures. We collect the options on possible experiences, possible possessions, and as with financial derivatives, the notional total of these grows exponentially, far beyond the limits imposed by real attention scarcity, allowing us the illusion of transcending the constraints of time. That is what it means, I think, to consume the archive, to take pleasure in the metadata, in the metaexperience, in the theoretical possibility of future enjoyment -- this allows us to compress many experiences and goods into a smaller space in time. Of course, that means capitalism can overcome yet another barrier to endless consumer-demand growth and more profit can be squeezed out of ever-shorter circulation cycles, which now have become quantum.

Friday, August 12, 2011

Long live the new efficiencies (7 July 2010)

I always knew there was something suspicious about concentration, considering how inconvenient and inefficient it is, slowing my consumption down unconscionably. I'm so glad the network can glean the by-products of my perpetual boredom and restlessness and make proper, efficient use of them. I am glad we are evolving.

In an essay for the NYTimes Opinionator blog, evolutionary-psychology proponent and "card-carrying Darwinian" Robert Wright responds to Nicholas Carr's case in The Shallows that our interaction with the internet affects our ability to concentrate, leaving us permanently distracted. Wright suggests that this feeling of permanent distraction is a good sign, indicating that our brains are being fused to others, contributing to purposes larger than we are capable of comprehending. If we could actually concentrate on what we were doing instead of responding to too many things at once, we would end up cutting out the myriad networked connections to others that put informational tidbits in motion and make them useful.
On balance, technology is letting people link up with more and more people who share a vocational or avocational interest. And it’s at this level, the social level, that the new efficiencies reside. The fact that we don’t feel efficient — that we feel, as Carr puts it, like “chronic scatterbrains” — is in a sense the source of the new efficiencies; the scattering of attention among lots of tasks is what allows us to add value to lots of social endeavors. The incoherence of the individual mind lends coherence to group minds.
That's pretty chilling. It reminds me of The Charge of the Light Brigade:
Theirs not to make reply,
Theirs not to reason why,
Theirs but to do and die:
All hail the new efficiencies! What difference does it make if they obliterate subjectivity as we have known it? An overrated propensity, identity, unless it can "add value" to "social endeavors," that is, unless it has brand equity. But as for individual autonomy? Who needs it. As Wright tells us, "this fragmenting at the individual level translates, however ironically, into broader and more intricate cohesion at the social level — cohesion of an increasingly organic sort." The group mind must know what it's all about -- it's organic, after all. Wright notes that he is "nostalgic as the next middle-aged guy for the time when focus was easier to come by," but he is not worried that the superbrain is malevolent or totalitarian.

I, for one, also welcome the group mind. It relieves me of all my anxieties and responsibilities. It makes me feel comfortable in my ignorance, which obviously is integral to a larger purpose determined by the magic conjunction of those who share my avocational interests, who are kept equally ignorant. A thousand gut reactions always amount to more than one considered response.


Thursday, July 28, 2011

Slowing down (2 July 2009)

Complaining about the technologically mediated acceleration of life and the loss of the time for contemplation has become a lot like crying wolf. From what I gather, people seem to be sick of hearing it -- as a meme it had its moment several months ago. Even though I've beaten that drum many times, I find myself thinking: Okay. Concentrating is hard, but then when hasn't it been? There is a surfeit of distractions; I get it. But it's not like I am going to go on an information fast and spend my free time meditating. I'm not going to dismantle my RSS feed and devote an hour a night instead to reading a single poem. Those seem like idealistic, nostalgic fantasies about the "life of the mind," which in practice would most likely amount to a refusal to engage with life as it is actually being lived. For example, I very much wish I were in a world without Twitter and maybe even without telephones, but that doesn't mean it's imperative that I live as if it were so. Down that road lies the technological equivalent of veganism, wherein everyone in my life would need to adapt to my fussy, righteous rules about which ubiquitous behaviors were permissible in my little world.

Still, though David Bollier's account of an April 2009 lecture (probably based on this paper, pdf) by media studies professor David Levy has its share of neo-Thoreauvianism in it, it nevertheless raises some points worth considering. The main gist is this: "The digital communications apparatus has transformed our consciousness in some unwholesome ways. It privileges thinking that is rapid, productive and short-term, and crowds out deeper, more deliberative modes of thinking and relationships." I have said the same sort of thing lots of times, but, as Levy asks, what actually constitutes the difference between "productive" thought and "deliberative" thought? I tend to think of the former as data processing -- tagging mp3 files, for instance -- and the latter as analytical inquiry, but it may not be so easy to distinguish the two. The mental modes tend to flow into one another. Working through menial mental tasks sometimes allows for inspiration to break through -- and after all, what is one supposed to be doing with one's mind when it is taking its time to deliberate? The "information overload" critique sometimes centers on the idea of slowing down the mind. But the mind is always moving, thinking one thought after another; the problem with the internet is that it gives it too many places to go all at once, has the potential to gratify too many idle curiosities. Bollier suggests that "We are sabotaging those inner capacities of consciousness that we need to be present to others and ourselves." But the dream that Levy attributes to Vannevar Bush seems a more apt description of what we've tried to do. "Bush’s intention was clear: by automating the routine aspects of thinking, such as search and selection, he hoped to free up researchers’ time to think more deeply and creatively." It's just that the two functions can't be separated; the way in which we think about things doesn't have degrees. It's holistic; we require routine tasks to fire our creativity, and creativity can often become routinized.

It's important to distinguish between having information at our disposal and lacking the discipline to make contemplative use of it. Often the two are implicitly elided, as if too much information automatically leads to frivolous surfing through it. Bollier writes, "Fast-time activities absolutely crowd out slow-time alternatives. The now eclipses the timeless. And we are becoming diminished creatures in the process." I don't quite understand this. We have to live in the now, because we are not "timeless." We die. And the problem with information overload doesn't lie with the activities and the media so much as with the approach we take to them, the ideology about information consumption we have internalized in the course of mastering these new technologies. We think they are supposed to make our lives convenient, and we measure that in terms of time efficiency. If we do many different things in the same span of time in which we once were forced to do only a few things -- if on the train we can read 17 books simultaneously on a Kindle rather than one -- then we are "winning." The pressure to consume more is not inherent to the technology or to some new perception of time, but is instead inherent to consumer capitalism, which fetishizes quantity. As Levy points out, the roots of this are in the "production problem" -- how to keep making more stuff if people are already sated and don't have the time to consume more. The solution: manufacture new wants and speed up consumption. So the consumerist imperative probably led us to develop many of these technologies. But still, we should be careful not to blame the tools for the kind of people we have become. (If Twitter went out of business tomorrow, many people's discourse would still remain superficial and inane.) If we have ceased to be able to love, it is not because we lack the leisure or are too distracted. It is because we have learned to privilege different sorts of experience, are rewarded for different sorts of accomplishments.

So the call for "an 'information environmentalism' to help educate people about the myriad and aggressive forms of mental pollution afflicting our lives" seems misguided. The "mental pollution" is an effect, not a cause, of our loss of contemplative peace. That is, our mental lives are not degraded by information but by a pervasive cultural attitude about it, one that treats ideas as things to be collected and/or consumed.

ADDENDUM: Ben Casnocha's review of Tyler Cowen's new book presents a far more cogent critique of the "attention crisis" hullabaloo than what I've provided above.
We have always had distractions. We have never had long attention spans. We have never had a golden age where our minds could freely concentrate on one thing and spawn a million complex and nuanced thoughts. Cowen reminds us that charges to the contrary have been made at the birth of every new cultural medium throughout history. Moreover, the technologies that are supposedly turning our brain into mush are very much within our control. The difference between the new distractions (a flickering TV in the kitchen) and age-old ones (crying infant) is that the TV can be turned off, whereas the crying infant cannot.
He also notes the way in which chaos and "un-focus" can lead us to breakthrough insights. Though I don't remember agreeing with much of Sam Anderson's New York magazine essay in praise of distraction, this point that Casnocha highlights seems apropos: "We ought to consider the possibility that attention may not be only reflective or reactive, that thinking may not only be deep or shallow, or focus only deployed either on task or off. There might be a synthesis that amounts to what Anderson calls 'mindful distraction.' " That's what I was struggling to express above: thinking is thinking; subjecting it to binary categorizations does injustice to how it actually works and leads to unnecessary and useless prescriptions for how to provoke thinking of a certain type.

Friday, July 15, 2011

"Projects for paying attention to attention" (26 Jan 2009)

PSFK linked to this post by Russell Davies, in which he explains his strategy to make himself actually pay attention to what he listens to:
unless I trick myself into paying attention to music, I either just revert to tried and trusted favourites or let all sorts of new stuff drift by me in an ambient haze. Not really listening.
So I thought I'd try a 26 week experiment; listening to a new letter every week. Just to see what I notice. This is week A.
Projects for paying attention to attention. Those seem interesting now.
I'm sympathetic to the desperate feeling of being overwhelmed by all the music that's accessible to us, but I'm not sure that setting up arbitrary limits is the best solution. Ideally, there would be something at least semi-organic about how we pursue pleasure and pay attention. Is that the impasse we have reached, where we have to force ourselves to pay attention to the things we intended to do for fun? Maybe the "ambient haze" he worries about is actually preferable -- the best we can do nowadays. (And maybe that explains why people like Animal Collective -- and it's even in the A's!)

That said, my arbitrary listening approach has to do with waging a campaign to play every song I have on my iPod at least once. It's a Sisyphean task and a bit joyless too. It certainly isn't what I want music in my life for -- to be the arbitrary yardstick for how much entertainment-industry product I've compelled myself to consume.

Anyway, this makes me wonder if we really have in fact entered into the so-called attention economy -- the idea that our most precious currency is the attention we pay to something, since so much is freely offered and the competition amongst marketers for our eyeballs has never been fiercer. We do have a limited amount of time in which to concentrate our attention on things, but these limitations have not translated into a clear hierarchy of where to focus. Instead, we seem more bewildered than ever by all the possibilities, and we become more and more whimsical without necessarily wanting to. The need to pay attention to attention suggests that our culture has been too successful in promoting distractions; it may be that we can no longer tell distractions apart from things we want to be interested in. Distraction has become the common denominator of all leisure experiences; the only alternative is basically work, socially constructed as joyless.

(Bonus: My 15-song all-A playlist, judging by iTunes play count:
1. Abba, "Bang-a-boomerang"
2. AC/DC, "Dog Eat Dog"
3. Addrisi Brothers, "Time to Love"
4. Al Kooper, "Be Yourself (Be Real)"
5. Andrew Bird, "Fake Palindromes"
6. Alice Cooper, "Under My Wheels"
7. the Arrows, "Toughen Up"
8. Allman Brothers, "Don't Keep Me Wonderin' "
9. Asha Bhosle, "Ankhen Meri Maikhana"
10. Andy Kim, "Shoot 'em Up Baby"
11. Angry Samoans, "Lights Out"
12. The Association, "It'll Take a Little Time"
13. Aerovons, "World of You"
14. A.C. Newman, "Battle for Straight Time"
15. Allen Toussaint, "Electricity")

Re-editing frenzy (14 Jan 2009)

I generally have my portable MP3 player on shuffle, playing random songs from a 5,000-track grab bag. In effect, this makes my device an ad hoc radio station, and as such, I find that it requires radio edits of songs that will be wrenched out of whatever context they originally garnered from their place on an album. I used to scorn the radio edits of songs -- the truncated version of "Green-Eyed Lady" is especially egregious, as is the radio edit of Fleetwood Mac's "Sara" in the original and disgraceful CD issue of Tusk. But now I am seeing their usefulness. When you are not listening to the songs in the environment they were designed for, you must adapt them to suit your particular circumstances.

At first, for me, this was a matter of removing things like the tedious sound-clip intros on Wu-Tang Clan songs, and removing unnecessary space from the beginning and end of songs that once were hidden bonus tracks on CDs. (What a horrible trend that was.) Then I found that I had started to remove boring musical intros and long fades -- the sonorous organ solo at the beginning of Led Zeppelin's "Your Time Is Gonna Come," and the drum machine loops at the end of Eric B. and Rakim's "Microphone Fiend," for instance.
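For what it's worth, the simplest of these edits -- lopping a fixed stretch off the front or back of a track -- doesn't even require an audio editor. Here is a minimal sketch using Python's standard-library wave module (the function name and parameters are my own invention, and it handles uncompressed WAV files only, not MP3s):

```python
import wave

def trim_wav(src, dst, intro_seconds=0.0, fade_seconds=0.0):
    """Copy a WAV file, dropping a fixed-length intro and outro."""
    with wave.open(src, "rb") as w:
        params = w.getparams()
        rate = w.getframerate()
        total = w.getnframes()
        start = int(intro_seconds * rate)       # frames to skip up front
        end = total - int(fade_seconds * rate)  # last frame to keep
        w.setpos(start)
        frames = w.readframes(max(end - start, 0))
    with wave.open(dst, "wb") as out:
        out.setparams(params)  # frame count is corrected when the file closes
        out.writeframes(frames)
```

Cutting that organ intro would then be something like trim_wav("your_time_is_gonna_come.wav", "edit.wav", intro_seconds=72), give or take wherever the song proper actually starts.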

Emboldened, I now have even started to remove parts of songs I don't like no matter where they fall -- that pointless drony part in Nirvana's "Drain You," the noise solo in Pere Ubu's "The Modern Dance" and so on. Who has the time? Just give me the hooks.

When I first began doing this, I felt like a philistine tampering with the artistic vision embodied in these songs. Before I could re-edit them, I had to deal with them as they were, as did everyone else. We could only differ in our interpretations and opinions about what we heard. Now we can all make our own customized versions -- the triumph of read/write culture! (Tom Slee makes some skeptical remarks about read/write culture in this review of Lawrence Lessig's Remix -- the key one, I think, is that hobbies in the digital age have become more subject to depersonalized commodification because the internet is eroding face-to-face interaction in localized, hobby-based economies -- what he calls small-scale culture. The internet can entice us with a limitless audience, prompting us to underrate, or worse, ignore, the ready-made audience of friends and family we would have had without it.)

Gradually, I ceased to have any qualms about my song re-editing. Now I wonder if I am going to end up in Girl Talk territory, composing my own Stars on 45 mashups, or somewhere even more radical. And I wonder if this is a good thing, a liberation from top-down, culture-industry domination. I wonder if I am making laudable strides toward making my consumption more like production.

Consumption always is production, in the sense that we are reproducing ourselves (reconstituting our labor power, as Marx would have it). The problem is that even though I am being "productive," I reproduce myself precisely as a consumer, an identity I alternately dread and wallow in. That's not what I'm usually hoping to accomplish when I exhibit a bias toward "being productive": I'm thinking instead about trying not to be passive in the face of the onslaught of data and products and messages and images and such, but to engage it actively -- usually in a doomed-to-fail attempt to manage it all. (Hence so much of my "leisure" time is spent on organizational tasks.)

But the problem with consumerism may lie specifically in that kind of engagement with cultural goods, particularly when it fails to bring the pleasure that it seems to promise or delivers the pleasure in addictive microdoses that create prolonged interludes of suffering want. In such productive "creative" activity, I am still reproducing myself with consumerism's preferred tools and reinforcing in myself the desires that it suits consumerism for me to have -- though I am not sure if I have any alternative.

This is the problem with the Situationist approach of détournement. Derivative by definition, it seems neutered, forced, circumscribed. Its subversiveness never actually registers on the level it would need to in order to fundamentally alter social relations or capitalism; for all its confrontationalism, it's not actually disruptive. It just permits those subjected to capitalism to feel as though they are struggling if they choose to; it permits us to redecorate our cages with more individualistic creativity, with signs of our unbroken but ineffectual spirit.

Friday, July 8, 2011

The Numerati, by Stephen Baker (4 September 2008)

The advance of digital technology further and further into the nooks and crannies of our lives is based on an elementary trade-off. It supplies us with a great deal of convenience: It lets us communicate with one another wherever and whenever we want to; it provides us with instantaneous access to and limitless storage of media, everything from personal photos to films to most of the history of recorded music on a terabyte hard drive; it's capable of building in a level of redundancy in our lives, preserving what we might otherwise forget and protecting us from oversights -- if you lose tickets to an event, chances are the barcode on them can be canceled and new ones issued to you. And if your credit card number is stolen, chances are the bank will recognize suspicious purchases and notify you. But in exchange for all this convenience, we sacrifice privacy and spontaneity: We permit all our public actions to be cataloged and processed, and we make ourselves completely and instantly accessible not just to our friends and family, but to marketers who seek to guide our behavior in contexts that they can detect and analyze perhaps even before we have a chance to, and to the state, which may seek to stifle dissent before it has the opportunity to assemble and gather force. We become willing parties to our own reification, to our assimilation into the giant digital data machine. Obviously there is pleasure in this, not only in the expanded access to entertainment but also in the thrill of losing ourselves, of ceding responsibility, of having an all-powerful deity-like entity feed us what it thinks we need to know to be happy in whatever situation we end up in. In short, we have an easier time navigating the world as we experience it because it has been preformatted by powerful institutions. Unfortunately our interests are more or less tangential to these institutions, whose primary concern is their own survival and growth.

So, considering how technology threatens to render our wishes irrelevant even as it pretends to cater to them -- that is, to our desires boiled down to the need for convenience, to consume more faster and with maximum indiscriminateness -- it would seem diligent to regard technology's encroachments with circumspection and skepticism. Because information technology makes so much of our private lives public and because it flattens our experience into a universal code of ones and zeros that threatens to annihilate our sense of its uniqueness, it's natural and prudent to be ambivalent about IT and the dislocating change it incurs. But The Numerati, a new collection of profiles of mathematician data miners by frequent BusinessWeek contributor Stephen Baker, offers mostly token displays of such ambivalence. The book -- whose chapters explore how data about us can be used to make us the target for ads and political appeals, how it can be used to better surveil us at work and capture terrorists (or at least casino cheaters), how it can expose our health issues, and how it can predict the fate of our relationships -- is not really for skeptics. While occasionally paying lip service to privacy advocates, it is generally fawning in its coverage of the companies who sell their abilities to profile us in terms of what we might be susceptible to buy. It regards their invasive business practices as inevitable, the inescapable result of increased competition, and a reflection of the dubious proposition that consumer preferences dictate the direction of the economy. Companies need to spy on their own customers, the logic goes, in order to know what those customers will want just in time to provide it to them, maximizing whatever logistical competitive advantage can thereby be derived. "Retailers simply cannot afford to keep herding us blindly through stores and malls, flashing discounts on Pampers to widowers in wheelchairs," Baker warns in a typical passage.

But if you are not primarily worried about what companies can or can't "afford," the values implicit in the book may bother you. You might not celebrate as a company learns to shed its "barnacle" customers -- i.e. the ones that try to keep companies to their word and make them deliver on their promises. You might not be happy that shopping carts can persuade people to buy more at the supermarket than they otherwise would have. You won't cheer when a computer figures out who you voted for based on contextual clues, opening you up to a new slew of fundraising appeals. Baker seems to register just how dehumanizing and awful the world of surveillance and forced digitalization of our lives will be, but in the book, the craven instincts of the business journalist usually take over, and he presents corporate management's side as the final word -- our inevitable fate that we may as well start loving since we are powerless to alter it.
Think of the endless rows of workers threading together electronic cables in a Mexican assembly plant or the thousands of soldiers rushing into machine-gun fire at Verdun -- even the blissed out crowd pushing through the turnstiles at a Grateful Dead concert. From management's point of view, all of us in these scenarios might as well be nameless and faceless. Turning us into simple numbers was what happened in the industrial age. That was yesterday's story.
The examples cited here are bizarrely incongruous -- are we supposed to be happy to be compared to soldiers being ordered to march into certain death? is that at all comparable to Deadheads at a stadium show? and does the mere fact that a lot of people have gathered in one place mean they have been ontologically reduced to a statistic? But setting that aside, the phrase yesterday's story is enough to tip us off to Baker's teleological impulses, while his elision of management's point of view with that destiny, with the end of the story, with the point of view that shapes the story, is characteristic of the book as a whole. It is our fate to become numbers in the eyes of the powers that be, because it suits those powers that we be organized in that much-more-manageable fashion. But Baker would have us believe that history itself is responsible, not the institutions and those who profit by them.

The confusions about cause and effect then extend to the means of data collection. "When it comes to producing data," he declares, "we are prolific." This seems an innocuous enough statement, but it's totally backward. Our behavior is simply our behavior; to us it is lived experience, memory, sense stimuli. We don't "produce" the data; the technology that collects it transforms our lived experience into the data that institutions (corporations, the state) crave. It works to have us reconceive ourselves as numbers, as the sum of datapoints, and then presents its manipulations of that data as the means for our personal extension, even though we are now limited to the field it has defined. "Once they have a bead on our data, they can decode our desires," Baker notes, but it seems more appropriate to say that they encode them, trapping them in the mediated digital world. Amazon, for example, usefully tells us what we might want based on our behavior, and then buying the books it has suggested begins to seem a way of completing ourselves. The data -- the preexisting categories, the defaults, the automated processes built into the systems that capture information -- has started to produce us.
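As a crude illustration of what that encoding looks like mechanically, the "customers who bought this also bought" logic can be reduced to little more than co-occurrence counting over purchase baskets. This is a toy sketch, not Amazon's actual system, which layers far more on top; the function name and example data are mine:

```python
from collections import Counter

def also_bought(orders, item, top_n=3):
    """Rank other items by how often they share a basket with `item`."""
    co = Counter()
    for basket in orders:
        if item in basket:
            co.update(i for i in basket if i != item)
    return [i for i, _ in co.most_common(top_n)]

# Toy purchase histories: each set is one customer's basket.
orders = [
    {"tusk", "rumours", "the modern dance"},
    {"tusk", "rumours"},
    {"tusk", "paid in full"},
]
print(also_bought(orders, "tusk"))  # "rumours" ranks first (2 co-occurrences)
```

The point of the sketch is that the "desire" being decoded is nothing but the behavioral trace itself: the system recommends whatever the data field already contains.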

The most obvious example of this is social networks, or the even more totalizing Second Life. These data-harvesting applications hope to encourage us to conduct our social lives in their petri dishes and behave in preconditioned ways the service providers can measure and exploit -- attaching ads and recommendations to social exchanges that in the real world would transpire with unencumbered spontaneity, with no commercial subtext. Online, though, our behavior -- now transformed into marketing data -- suddenly works, to those we "network" with, like a sales pitch -- a means to some other end rather than being autonomous. Our actions seem less real until they are posted and shared and processed to our maximum advantage with regard to the impression we would like to create or the number of page views we would like to garner. Our consciousness, when reduced to data out of convenience, becomes merely instrumental, something easily reprogrammed to accomplish various tasks. We can automate our social life or refashion our identities thanks to the tools the networks provide, but the thrill of lived experience vanishes to a degree, becoming more and more a matter of adjustments on the spreadsheet of self.

After Baker has misconstrued our role in turning ourselves into data, it's a short leap to claim that "the only folks who can make sense of the data we create are crack mathematicians." In other words, don't try to understand yourself; you need a math genius to tell you who you are and what you're meant to do through your behavior. Statisticians are better managers of our datasets than we are, and they are better able to manipulate our data to see what it will yield -- to see what our true possibilities are. Apparently our own account of our hopes and dreams and intentions is irrelevant to the degree that it is not conditioned by what the math geniuses have calculated and made permissible. Once we are data, we are inscrutable to ourselves.

Not only does our reduction to data make us strangers to ourselves, but Baker goes so far as to opine that in the future, we will be "happy to pay for the privilege of remaining, to some degree or other, in the dark" about the selves that can be constructed from our data. He has in mind the disconcerting probabilities that we will contract diseases, but it applies plausibly to the whole range of knowledge that can be produced about us. When we begin to be overtargeted, we will need filters to discover our authentic reflection in the efforts to persuade us. We will want liberation from the self left behind by the trail we've blazed through commercial culture, as that identity is merely the one that shopping permits us to have. A more integral self will fight that commercially derived one for social space in which to manifest. But the hegemony of consumerism will require us to pay for that privilege of being able to conceive an authentic self independent of our data stream.

What can we do to thwart our being converted to data? Baker suggests a can't-beat-em-join-em approach, urging us to make spreadsheets of our achievements to demonstrate our worth. As digital data hounds become more thoroughly intrusive, we can probably count on the advent of services that would throw out false scents in our name, creating fake data trails to muddy the image of ourselves therein, to obscure our health concerns from insurance companies who would like to exclude us, and to mask our shopping proclivities to ensure that we don't suffer price discrimination or perhaps attract favorable discounts. Just as credit-score doctors learned how to game FICO, a counter-Numerati is sure to emerge to try and thwart their efforts to define us. Short of that, it will increasingly be to our benefit to conduct ourselves anonymously if we want to preserve any sense of self at all.

Wednesday, June 29, 2011

The unheard music (1 May 2008)

Over the past few years I have amassed a mountain of songs that I've never listened to, and lately I've begun the quixotic project of trying to listen to it all and sort out which songs I actually like so I can find them more easily. Consequently, I feel like I never listen to music for sheer pleasure or distraction anymore; it's systematic, Sisyphean work, as I keep adding more unheard music to the pile. Not that it has deterred me, but I quickly realized that this is no way to decide whether I actually like these songs. In fact, most songs, if they have managed to make it to my hard drive, are pretty okay. The snap decision about whether they will make it into the "good" playlist is often an arbitrary one, based on whim and giving me the gratification of decisiveness for its own sake -- the joy in this procedure doesn't come from hearing the music itself. (This is a clue to why record reviews are so often irrelevant.)

And even then, when allegedly deciding I like a song, it's not that I really like it in that moment exactly. It's more that I have made a promise to myself to like it later, that at some point down the road it will be in rotation on my iPod and I will grow to truly appreciate it then. This realization leads me to believe that the value of any song has little to do with its intrinsic qualities and more to do with what I have managed to invest in it; songs are repositories for my emotional energy, the energy I've spent consuming and remembering them, linking them in various ways to the story I tell myself about my life.

It may be that certain qualities in songs lend themselves to this kind of emotional investment. It helps if they are a relatively blank slate. If they are too specific, they will crowd out the feeling I need to be able to pour into them to like them. If the songs have timely political messages of their own or are specific gripes about how being a professional musician sucks, they will rarely attract any emotional energy investment. Generic songs about having feelings -- falling in love, going to a party, leaving home, etc. -- seem to work the best. Also, context contributes to whether or not a song can attract emotional investment. If it is in the right genre, or was in a movie, or was referenced by friends or something along those lines, it gives one a reason to pay extra attention to a song, and once you have singled a song out to actually pay attention to it, you are 99 percent of the way to liking it. (Not to belabor the obvious, but liking a song is no more than a willingness to really pay attention to it when it is playing.)

As part of my project, I was listening to an album called She & Him and I was thinking it was mediocre and was going to delete it. Then I remembered why I acquired it in the first place -- because M. Ward (whose other albums I have already decided to like) was part of the band. That simple piece of knowledge changed the whole way I perceived the music; it focused my attention and shifted my attitude away from looking for reasons to reject it toward listening carefully for things to like. The songs are occasions for bringing to bear pieces of information like that, for connecting memories and data about what brought pleasure before. If a song can fit into a larger structure -- a musician's oeuvre, an approved genre, memories of having heard it at the bar or whatever -- it becomes more listenable, likable for that reason. But songs are too insubstantial in isolation to be fairly judged on their own merits. The criteria can't emerge from some ideal notion of what a song should be; the criteria in practice emerge from the richness of the situation, which paradoxically enough, is a product of the limitations it imposes on what you can consume.

In general, I liked music a lot more when it was scarce. When it was scarce, I was much more likely to look for reasons to include songs in my life rather than reject them. It's often constraints that make music meaningful to me -- for example, I won't forget the one tape I had in the car when I drove across New Mexico (a compilation of the Music Machine, the West Coast Pop Art Experimental Band, and the Gestures); those songs will always have that peculiar resonance. The songs in heavy rotation on the oldies station in Phoenix I was partial to in the 1990s -- "Woman, Woman" by Gary Puckett and the Union Gap; "Summer Rain" by Johnny Rivers, etc. -- will always signify that specific time and place, what I was feeling then, the drives I used to take down I-10 late at night, crossing the Maricopa County Line on the way to Tucson. I was discovering new music in a very measured way, and I felt like it was expanding my mind at a pace at which I could assimilate it, enjoy it.

Now, there's no danger of my ever running out of music; there is no need for me to be discovering more. (Maybe I'm just old, and that's why my discovery phase is over. Just about everything I hear sounds like something else I've heard already, and if it doesn't, I get cranky over its newfangledness.) Instead, I am haunted by the fear of running out of attention. So it helps when there are limits imposed on how much music there is to consume, a limitation that was once imposed by radio playlists and the amount of money I was willing to spend on music.

Back in the day, I imagine the infancy of the culture industry also limited things -- the number of records that received distribution was much smaller. This morning, I had reached a compilation of the Shangri-Las' greatest hits. After sorting out the obvious keepers -- "Walking in the Sand," etc. -- I was left with 20 songs that were all cut from the same cloth, all decent in their own right, but indistinguishable from one another. Being able to hear them all at once, with no expenditure or effort, undermined songs that in isolation might have seemed dramatic, powerful, singular. And they all probably seemed that way when they were singles, and you lived with them on the radio for a finite amount of time and grew to like them or not. The songs weren't made to withstand being clicked through, rapid-fire, to determine which are good and which aren't. (No music is made for that.) I ended up grasping for reasons to pick one over the other to put on the keeper list, thinking ashamedly to myself, If this song were to crop up in a commercial or get covered by some other band I heard of, I'd keep it for sure.

So while I think the subscription-type services Reihan Salam describes in this Slate article -- services that would allow users access to all of recorded music -- are inevitable, I don't think they will do much for people's enjoyment of music. People may discover a lot more stuff, but only in the collector's sense of having filed away an awareness of it. It will become much harder to find the time and the discipline to invest emotional energy in a few songs when the temptation will always be there to indulge that antithetical pleasure of judging -- in or out? keep or toss from the playlist? The editing will be a never-ending process, and we'll never get to the point where we have the time to listen to the carefully compiled playlist and start making the effort of investing ourselves in the music, in bringing the songs to life so that they can return the favor later on.

Perhaps that is why muxtape, the site that lets you upload and share online "mixtapes" of 12 or so songs, is such an attractive idea -- not so much for the consumer but for the uploader. It takes those playlists of chosen songs and gives them an immediate broader context for emotional investment -- a community of fellow listeners. It helpfully imposes some parameters, limits that sharpen your focus. It becomes a forum for making your listening habits performative. Which 12 songs will go together? How can I put my tastes to use to impress somebody out there who might be listening? Isn't that the bottom line in amassing a mammoth knowledge of pop music in the first place -- impressing people? But when you are simply listening to music -- for yourself, rather than brandishing the extent of your familiarity for others -- you are just remembering yourself and what effort you spent in the past to really listen. That energy returns to you, as if the song supplies it. That seems to me to be what it means to like a song. And if we don't budget the time to make that investment, if we feel too overwhelmed with choices to bother to attach much feeling to the choices we make, we'll end up amassing all kinds of music, enjoying the pleasures of curating a collection while not really liking any of the music.