Friday, July 29, 2011

Nanostories, etc. (31 Aug 2009)

Harper's editor Bill Wasik, the inventor of the purposely pointless internet-driven media event known as a flash mob, has expanded on what that experiment taught him in a book, And Then There's This. Fittingly enough, I finished reading it while I was down the shore, in the land that the internet seems to have forgot. (When they hear wi-fi, many in Wildwood would probably think you are talking about WIFI 92, the top 40 station in Philly circa 1978.) The book is primarily about how the internet encourages the acceleration of our cultural consumption by prompting us -- now no longer passive consumers but media operatives ourselves, fascinated by our own impact and keen to play at being an insider -- to refashion news as "nanostories," microstories whose popularity (measured by internet metrics) peaks quickly and then rapidly dissipates. Whatever real underlying fundamental trends there might be get lost in the noise. Culture accelerates, becomes quicker in its payouts, and becomes more compulsive and addictive. This, as Wasik notes, makes the internet just like a slot machine, whose quick-hitting but apparently random rewards are engineered to make players addicted: "games of chance seem to be more addictive in direct proportion to the rapidity and continuity of their 'action' -- how quickly, that is, a gambler is able to learn the outcome of his wager and then make another." Online, the action is the tracing of trends and our own statistically determined significance. Twittering, and then seeing what sort of response it provokes, etc. We are never at a loss for an opportunity to try to garner attention, and these efforts are archived, deepening our potential self, even if it is all noise. The internet's archiving capacity means there is an excess of the narratives from which we shape our sense of self. 
"With the Long Tail of Truth, telling ourselves new stories about ourselves has never been easier: abundant, cheap distribution of facts means an abundant, cheap and unlimited variety of narratives, on demand, all the time."

But the internet is not only a machine for generating memes, but also for manufacturing spurious hermeneutics. Wasik contends that we have all become conscious analysts of how media narratives operate (we have the "media mind," as he puts it); the presence of so many independent operators in the media space compresses those narratives, turns them over quickly as we all experiment to see which framing techniques attract the most attention. (Popularity tends to snowball, since popularity is factored into what choices are given prominence.) The internet has given us means to sell ourselves the way products have long been sold to us, and we've embraced them, adopting advertising measuring tools (the data on popularity the internet makes available to us for our personal pages) as markers of moral value. The potential scope of our reach invalidates previous mores:
When your words or actions or art are available not only to your friends but to potentially thousands or even millions of strangers, it changes what you say, how you act, how you see yourself. You become aware of yourself as a character on a stage, as a public figure with a meaning.
As a result, we manage our public meaning like brand managers, and perfect the art of culture monitoring -- meta consumption of media. We begin to consume the buzz about buzz, or pure buzz, with no concern for what it's about, only whether we can exploit it for self-promotion.

What's lost in the focus on the meta-story of something's popularity and usefulness for our own carefully monitored identity is obviously the thing itself, which becomes difficult to recognize and consume in traditional ways. Artists are seen as the "instantiation of a trend," and their work is assessed in that regard -- the mythical organic reading is even harder to achieve or even simulate. "Call it the age of the model," Wasik writes. "Our meta-analyses of culture (tipping points, long tails, crossing chasms, ideaviruses) have come to seem more relevant and vital than the content of culture itself.... The real vigorish is in learning not about what is cool but about how cool works." When all that resonates about a meme or idea is its viral potential, all ideas are ideas about marketing.

This concern with only the momentary impact of any story and its metasignificance decontextualizes them, allows ideas to function as commodities: "The meme vision of culture -- where ideas compete for brain space, unburdened by history or context -- really resembles nothing so much as an economist's dream of the free market. We are asked to admire the marvelous theoretical efficiencies (no barriers to entry, unfettered competition, persistence of the fittest) but to ignore the factual inequalities." In other words, nanostories, not surprisingly, preserve the status quo, reinforcing our own vanity and self-centeredness along with the market as timeless, unquestionable norm.

Wasik takes a look at the decisive role of boredom. We are not inherently disposed to be bored -- Wasik cites research that suggests boredom is a defense mechanism that we invoke when we are confronted with too many choices. But those choices are what capitalism offers us as proof of our purchasing power as consumers. So we experience boredom as proof of our centrality in the consumerist cosmos, and this boredom is a deliberate achievement of the existing social order -- it fixates us on novelty as a value, and drives us to consume habitually, for ideological rather than material fulfillment. It's pretty self-evident, I guess -- boredom is a product of awareness of choice, and the advertising infrastructure does nothing but make us aware of choices. Wasik argues that the ubiquitous boredom helps drive the acceleration of media consumption by fostering backlashes on schedule; I would only add that the boredom is market-driven -- the oversupply of ideas and goods is stimulating the demand adequate to them, changing the attitudes and self-concepts of consumers in the process.

So the market imposes the possibility of novelty on everyday life, which engenders boredom, the feeling of being hopelessly overwhelmed by choice and the drift into aimless lassitude. In this state we are unwilling to commit to anything deeply -- it might grow boring -- so we invest our time and effort into shallow things that are quickly disposed of, or the most convenient experiences, things which are by their nature not very satisfying. So we become temperamentally insatiable.

In the final chapter, Wasik suggests strategies for fighting the acceleration and compression of cultural consumption: one is rationing our information supply and adopting a renunciative attitude toward the internet. Just say no. Another is time-shifting -- "delaying one's experience of a cultural product long enough that any hype or buzz surrounding it has dissipated." That is something I wholeheartedly endorse and practice: I am currently watching the second season of Dallas -- and loving it. I don't know that it helps anything though. I needed there to be buzz before to even think of watching it now. Ultimately Wasik has no answers -- we must strike a balance, he suggests in Aristotelian fashion, but gives no sense of what that might be. We must choose "judiciously" what information we consume, but he offers no criteria for this. He advocates "sustainable approaches to information" but gives little sense of what that would entail. As Žižek argues, it is easier to imagine the end of the world -- the destruction of the internet by some super virus or something -- than to imagine a way to consume it temperately.

Bonus material: Path dependency and status quo bias as ideology (31 Aug 2009)

I've got a post up at Generation Bubble about the usefulness of such concepts as path dependency and status quo bias to conservatives. Like the placebo effect, which is apparently growing stronger, the strength of these other psychological effects is probably controlled by ideology -- that is, their intensity can be manipulated by how ideas about them are repeated and ratified in the public sphere, ideas that become accepted as common sense, things that we fall back upon as natural explanations for phenomena, and natural ways to respond. These biases are real but perhaps not as inevitable as the way we report on them makes them out to be. And they are downright untrustworthy when deployed in reactionary argument.

A related thought: the rhetorical deployment of the findings of psychological research would seem to have an impact on the ongoing validity of those findings -- our psychology may change in reaction to how certain aspects of it are abused in argument.

Resort Motel Architecture in Wildwood (30 Aug 2009)

I've been on vacation the past week in Wildwood, New Jersey (North Wildwood, to be precise), which is down the shore about 30 miles south of Atlantic City, near the southern tip of the state's coastline. It's long had a reputation as a working-class beach town for immigrants from Philadelphia, and it's still not uncommon to see houses down there with the Italian and Irish flags flying from the awnings alongside the U.S. flag. Like its North Jersey equivalent, Seaside Heights, Wildwood has an extensive boardwalk that retains a carnivalesque atmosphere, where scams and bad bargains of all sorts are made to seem innocuous and where the water-gun-game barkers and snake-handling carneys and iron-on T-shirt makers seem like artisan practitioners of threatened traditional crafts.

I played lots of skee-ball to win tickets (my high-risk, high-reward strategy -- always shoot for the 100s), which I could then trade in for plastic army men, balsa-wood propeller planes, and off-brand candy. I also rode an old wooden coaster that may have concussed me; I got off and wandered the amusement pier punch-drunk, in search of a place to sit and fortify myself with a lime rickey.

The upper middle classes from the Philadelphia area tend to eschew Wildwood in favor of Ocean City and Cape May, Avalon and Long Beach Island -- similar places by and large that have somehow managed to manufacture class distinction for themselves. The aspirational towns tend to push a contrived family-friendliness and institute measures like charging a fee to use the beach to seem exclusive. Thanks to their protected reputations, real estate values rose dramatically during the bubble years in these towns, which meant that many of the earlier generation of buildings (the boarding houses and one-story "shit shacks") have been razed in favor of generic aluminum-sided, triple-decker condos and elaborate second-home mansions.

But in Wildwood, this hasn't really happened, making it an architectural treasure trove, at least for me -- I have an inexhaustible fascination with vacation motels of the late 1950s and early 1960s. Wildwood is still filled with hundreds of them, some of them renovated, some run down, some essentially unchanged since their construction in the heyday of democratized leisure. The vendors on the boardwalk are as up to the minute with trends as they can be -- they were selling Michael Vick Eagles jerseys (and "Hide your beagle, Vick's an Eagle" shirts) a week after he was signed. But away from the boardwalk, the town becomes a museum. To walk into Wildwood Crest at night is to enter a hushed Edward Hopper world, haunted with a pensive sense of unfulfilled possibility, all these lido decks and lounge chairs under the orderly rows of subdued lights, devoid of people.

Just as consumption itself was being democratized after World War II, with the working classes experiencing the thrills of purchasing power amidst the sudden flood of mass-produced goods, so was leisure democratized, with similarly mass-produced motels rapidly constructed in déclassé seaside enclaves like Wildwood. This motel build-out mirrored the construction of housing developments for lower-income families in the string of Levittowns outside Philadelphia -- workers could now for the first time enjoy their own detached homes in suburban communities, and they could also expect to enjoy their own private rooms in resorts meant to accommodate them when their week of summer vacation -- another novelty -- came around.

These motels tended to come in two sizes. The first is a two- to four-story motel in an L-shape, nestling a small standard-issue in-ground pool (the sort with an 8-foot deep end that rarely gets built anymore -- the drowning/lawsuit risk has made them infeasible now) surrounded by plastic chaise longues, and with a sundeck dotted with fake palm trees. Here are a few pretty typical examples:

Many of these have since been converted to efficiency condos, which is part of what saved them from demolition.

The deluxe versions of these are a little larger and typically sit nearer the ocean:

These are the places that are often billed with the somewhat oxymoronic appellation "resort motels."

I probably photographed 50 of these motels, and I only scratched the surface. And this is after untold dozens had been torn down over the years. (In one vacant lot on Ocean Avenue, the local preservation society had salvaged and displayed a few of the neon signs from motels since demolished.) They are pretty much identical, differing only in location and name, and one would think this would make them depressingly banal and interchangeable. But they struck me as strangely hopeful, and that feeling was reinforced with each new iteration of the same model that I came across. In their sameness came not a refutation of the personal uniqueness of the people who chose to stay in these motels -- an ersatz individuality that is largely oversold ideologically and exploited by marketers as a source of insecurity and misery (are you unique enough? have you discovered the real you? how can you be sure if you are not displaying it at all times through a highly personalized set of belongings and experiences? shouldn't you buy more stuff to become more unique? Hmm?).

Instead they evoked an egalitarian feeling that everyone was confident back then of receiving the same standard treatment; that the worth to the patrons of these places was not rooted in laying special claim to some thing or place but in having the opportunity to partake in the vacation spirit embodied by the town as a whole. The sameness in the environment allowed for actual differences among everyone -- differences taken for granted, the peculiarities that aren't dependent on marketing and goods and identity displays -- to show through. It was simply assumed, I imagine, that such stuff as identity mongering was irrelevant in the face of the opportunity to go on the beach, to swim in the ocean, to confront the ceaseless waves. In other words, these cookie-cutter motels conjure a sense that leisure was once beyond the reach of invidious comparison, that pleasure isn't a matter of differentiating our identities or proving our creativity or individuality at all. There is nothing particularly creative or inventive about enjoying the beach, but it is all the more enjoyable for that. You can do what everyone else does and it's okay. It's even liberating -- at least now, when identity and personal branding are so foregrounded.

But it's hard to be sure the original clients of the motels felt that liberation in the same way. We are at the other end of the transformation of subjectivity that for them had only just begun, so liberation for them may have been precisely that birth into branded identity that we are beginning to experience as a prison. It's the elaborate, now-campy names of these places that make me wonder about this -- especially the accompanying signs, which, with their neon and their futurama fonts, seem like cartoonish, hyperbolic attempts at branding something that has no particular distinction.

Their superfluousness, the gratuitous nature of these names, feels significant. Unnecessary on the face of things (many similar accommodations in other seaside towns are content to have only a street address), the garish names were, at that moment in time when the motels were being built, presumably of critical importance to builders and customers alike. The names were something extra that didn't make the places themselves special, but signified that an effort was being made to market something to a class of people who had been mostly ignored by advertising. In a loud, unmistakable way, the names called out to the new mass-market consumers and sent the message that an expense was being made on their behalf, that unnecessary ideas were being called forth, to amuse them for a moment, to make them feel that they were going to stay somewhere where ordinary rules did not apply: places where, for example, doors were painted sky blue and palm trees had blue leaves.

The names and extravagant signs became requisite amenities, along with heat in the rooms and color televisions.

The ham-fisted attempts at branding seem primitive and quaint now, and seem to preclude the possibility of their being aspirational, of generating insecurity. For us, they do the opposite of making us proudly and earnestly self-aware; they conjure a time when branding seemed kitschy and fun, impossible to regard seriously and thus allowing us to transcend it all. But they must have had unironic significance in the 1960s, a resonance similar to the unnecessary but ubiquitous names for housing subdivisions built during the recent real estate boom -- nicely lampooned in this post about Denver's exurbs. Maybe names such as these will seem campy (and not awful) to subsequent generations:

The naming protocols signify a token effort to mask a fundamental contradiction, to make mass-produced "luxuries" come across as bespoke. They are fumbling efforts at establishing social capital where none is to be found, and thus readily strike us as wrong, as embarrassing, as shameful, as violations of some natural order. The faux-pastoral, crypto-aristocratic aspect of the subdivision names exposes what they are trying but failing to do; they indicate how the developments are trying to graft themselves onto a social order that would have forbidden their very existence. The resort motels, at least, attempt something different with their branding, something that to me speaks to the optimism of the era -- each an oasis unto itself in a lush profusion of oases without a desert. They don't try to appropriate a class-inflected language and approximate an old dream of country-club exclusiveness. Instead neon signage is used to invoke an aesthetic never before seen.

(Las Vegas is probably the ultimate example of this style, and Learning From Las Vegas, the manifesto by Venturi et al., is the standard reference for its significance.)

The opportunity for a mass-produced aesthetic that is its own thing, outside of the war of class signaling, seems to have passed us by. We can only return to the dawn of consumerism with nostalgia for its promise, mingled with despair for all its wrong turns since then. It promised to indoctrinate us all into the pleasures of owning and belonging and instead made us more conscious than ever before of the many things we can't have, the many places we don't belong, convincing us all the while that there can be no recompense for unfulfilled desire, no vacation from the yearning for more. A week at the Armada by the Sea will never suffice to make us forget ourselves.

Class consumers (21 Aug 2009)

Yves Smith noted a WSJ article reporting diminishing retail sales and heralding the new austerity in American consumers. This tidbit didn't quite fit the frame: "A cashier at Target in Los Angeles checks the authenticity of $100 bills." Counterfeiting is not exactly the act of an austere, frugal consumer, though it may be the act of a debt-starved one.

Zero Hedge unleashed a long analysis of "the stratified American consumer" last weekend, making the useful point that the way the designation "American consumer" is thrown around tends to conceal the fact that it is not a homogeneous group. Statements like "Major retailers reported that American consumers are continuing to hunker down" from the WSJ article are not especially useful, because they extrapolate a universal mental framework from aggregate macro data. (The same problem arises when a negative savings rate is extrapolated into "all Americans are spendthrifts," a rhetorically tempting logical leap I've certainly been guilty of.) Zero Hedge:
A drill down of disposable net income (after tax) and net worth, demonstrates why any discussion of "generic" consumers should be much more properly phrased as an observation of the "Wealthy" and "Everyone else". The disposable income difference between the richest 10% and even the next richest decile is staggering: a 3x order of magnitude....
While 10% of the population collects 40% of disposable income, it represents 57% of net worth! This is an impressive conclusion: on a lowest common denominator, the Net Worth variance between the 10% of the population that make up the wealthy and the 50% that comprise the middle class is over 8x! No wonder the aspirational consumer was the most vibrant retail category at the peak of the bubble: if the middle class can not accumulate 8x the net worth it needs to migrate into the top decile, it can at least dress like it. Unfortunately, it did these purchases on credit and is now paying for it (or not).
The upshot of this analysis is not surprising, but worth reiterating: that the data trends tracked regarding consumption reveal consumerism as a middle-class phenomenon driven probably by status envy of the upper-middle class for the upper-upper class. That the gap is widening means that the consumers now discovering austerity aren't liking it very much and that there is no paradigm shift to a culture of maximum utility extraction. Also worth noting: The lower classes (the bottom 40% who consume 12% of what's consumed in America) are statistically and economically irrelevant. I wonder if there is a social corollary to that -- beneath a certain income point, one's subsistence-style consumption becomes anonymous. The thought inspires in me a classic middle-class fantasy of the escape into squalor, a la the George Orwell of Down and Out in Paris and London: the dream that subsistence living is automatically authentic, and this authenticity compensates for the misery of relative deprivation.

The conclusions drawn in the Zero Hedge post seem ominous: The recession has hurt the lower and middle classes more than the wealthy, and has merely increased the wealthy's advantage. Calls for a consumer-led recovery will draw on their increased spending power, and when that spending shows up in the data it will mask the fact that more people in America are making do with less -- suffering the new frugality at the conspicuous spenders' expense.
Is it safe to say that the wealthy have managed to game the system yet again and avoided a significant loss of wealth, while maintaining sufficient access to credit? If in fact that is the case, a case could be made for a consumer-led recovery, granted one that is massively skewed to the 10% of the population which consumes 42% in the US.
At that point, all the talk of the era of frugality will be over, even though more of us will be living it. What the Zero Hedge scenario means, as some others have remarked (now I can't find the links, grr), is that the U.S. will become more like Brazil -- a wealthy elite, with a monopoly on the social power that comes from the power to spend in a consumer society, living in gated communities with elaborate collections of luxury goods, with bodyguards for their children and so on, while the rest of the population is increasingly impoverished.

It reminds me of Baudelaire's The Eyes of the Poor, the poor family staring in at the lovers in their leisure at a "dazzling" cafe, whose decorations depicted "all history and all mythology pandering to gluttony." Increasingly, we are becoming that poor family, spectators of the heroic consumption of the upper classes. Thanks to the inexpensiveness of media entertainment, we consume the cheap images of their class status and derive what gratification we can.
The eyes of the father said: "How beautiful it is! How beautiful it is! All the gold of the poor world must have found its way onto those walls." The eyes of the little boy: "How beautiful it is! How beautiful it is! But it is a house where only people who are not like us can go." As for the baby, he was much too fascinated to express anything but joy -- utterly stupid and profound.
Meanwhile economic forces cement the boundary between us and those in the cafe enjoying the splendor, continue the redistribution upward, making sure we can do nothing but marvel at wealth.

Class and classism and the meritocratic fantasy (19 Aug 2009)

Most people would not agree that it's okay to cross the street if you are spooked by the race of someone approaching. But fewer people, I suspect, would feel the same way if you cross because you think the person coming toward you is a lot poorer than you are. In that case, you can ascribe a socially sanctified line of reasoning to the situation -- rationally, that person has a pretty obvious reason to assault you for your money, thus it makes sense to try to avoid them, and if they feel bad about it, well, they should try harder not to be poor.

The idea is that classism is often tolerated where racism (and sexism and bigotry against gays and so on) is not, because prevailing neoliberalism makes it seem okay to ascribe "rational" incentives to other people and discriminate accordingly. After all, cynicism about other people's motives is a positive common-sense virtue in a society ruled in all aspects by a free market. An ability to think in terms of costs and benefits and to find a way of applying such an analysis to any scenario is the mark of a mind thinking at its clearest. At the same time, when we discriminate along those lines and not along the old racist, sexist, etc. lines, we can congratulate ourselves for how far we have come, regard our existing social order as progressive, and assure ourselves that our own advantages are merited, and not necessarily the product of racism. Eschewing bigotry and promoting diversity strengthens our ideological faith in the meritocracy that hardly exists in reality, as Walter Benn Michaels argues in this LRB book review of a collection called Who Cares about the White Working Class?.
My point is not that anti-racism and anti-sexism are not good things. It is rather that they currently have nothing to do with left-wing politics, and that, insofar as they function as a substitute for it, can be a bad thing. American universities are exemplary here: they are less racist and sexist than they were 40 years ago and at the same time more elitist. The one serves as an alibi for the other: when you ask them for more equality, what they give you is more diversity. The neoliberal heart leaps up at the sound of glass ceilings shattering and at the sight of doctors, lawyers and professors of colour taking their place in the upper middle class. Whence the many corporations which pursue diversity almost as enthusiastically as they pursue profits, and proclaim over and over again not only that the two are compatible but that they have a causal connection – that diversity is good for business. But a diversified elite is not made any the less elite by its diversity.
This is an argument spelled out in his book The Trouble With Diversity. In the LRB piece, he pushes the book's argument further, detecting a similar mechanism in the worries about classism manifest in Who Cares About the White Working Class?:
It’s thus a relevant fact about Who Cares about the White Working Class? that Ferdinand Mount, who once advised Thatcher, is twice cited and praised here for condemning the middle class’s bad behaviour in displaying its open contempt for ‘working-class cultures’. He represents an improvement over those who seek to blame the poor for their poverty and who regard the culture of poverty rather than the structure of capitalism as the problem. That is the view of what we might call right-wing neoliberalism and, from the standpoint of what we might call left-wing neoliberalism, it’s nothing but the expression of class prejudice. What left neoliberals want is to offer some ‘positive affirmation for the working classes’. They want us to go beyond race to class, but to do so by treating class as if it were race and to start treating the white working class with the same respect we would, say, the Somalis – giving ‘positive value and meaning to both “workingclassness” and ethnic diversity’. Where right neoliberals want us to condemn the culture of the poor, left neoliberals want us to appreciate it.
The great virtue of this debate is that on both sides inequality gets turned into a stigma. That is, once you start redefining the problem of class difference as the problem of class prejudice – once you complete the transformation of race, gender and class into racism, sexism and classism – you no longer have to worry about the redistribution of wealth. You can just fight over whether poor people should be treated with contempt or respect. And while, in human terms, respect seems the right way to go, politically it’s just as empty as contempt.
Michaels wants the left to worry more about income inequality and fight for the eradication of the income differences that make for social classes. (I wonder what Will Wilkinson would make of that.)

Built into the idea of meritocracy -- an ideal often conflated with the American Dream -- is the corollary that the poor deserve contempt. It's easy to see how people could overrate their own hard work and its relevance to their own success (such as it is) and believe that hating poor people is a way of providing crucial tough love. We can jumble up the causal links and think that hating the poor will help make the meritocratic dream more real. It serves as a way of voicing our belief in the meritocratic ideal.

Overfollowing on Twitter (19 Aug 2009)

Just a few days after having my first experience with Twitter and "real-time search" that could be remotely characterized as useful, I'm reading in this Mark Gimein essay at the Big Money that Twitter is doomed.
The irony of Twitter is that even as it becomes more pervasive, it is in danger of very quickly becoming markedly less useful. Twitter is in danger of collapsing under its own weight. Not because of its problems keeping up with traffic—those are solvable—but because the volume of material that Twitter unleashes now puts impossible demands on its users' time and attention. The problem, in a nutshell, is information overload. The more Twitter grows and the more feeds Twitterers follow, the harder it gets to mine it for what is truly useful and engaging. Even as Twitter reaches a peak in the cultural cred cycle, it's time to start asking how it can be saved from itself.
The problem, in Gimein's view, is that users are too profligate in who they follow, making the concept meaningless -- the number of followers one has is no indication of the number of people who are actually reading what you have to say, even when it comes in telegraphic blasts. This line of reasoning suggests how Twitter works to quantify communication, making the numbers in the audience more important than what's said. Of course, that has always been true of ratings-driven media, but it hasn't been true for our conversations.

But the genius of Twitter as a potential business is that it turns ordinary people into media companies. It lets us subject our conversations to Nielsen-like ratings, to regard our communications as a product conveying our personal brand. Then we can crunch the numerical data Twitter supplies to tweak our brand, and see what works to improve the numbers, which serve as proxy for our relevance and reach and, by extension, our right to feel important. Then these numbers can be used to sell ads as well -- we can indicate to advertisers what sort of demographic we have in our followers, making it a new way to monetize our friendships, following the inroads Facebook has made in that department. In the process, we become a product, a package of manipulatable content.

Gimein's critique has nothing to do with decrying that process of reification. He's more concerned with effective filtering. I think real-time search makes the following/followed concept meaningless to practical information gathering -- the followers number is all about status and ersatz influence measurement, not communication in any conventional sense. Twitter is less about disseminating information than it is about subjects trying to make themselves feel more real, ontologically speaking, in an increasingly mediated world.

Gimein's argument almost incidentally indicates how fragile the illusion of self-branding is -- we can fixate all we want on the numbers and the illusion of control they give us over how popular and influential we can become, but that number is ultimately misleading. Gimein relates an anecdote of having one of his posts pushed on Google's corporate Twitter feed, which has a million followers -- it brought his own post a few hundred hits. That's telling -- the click-through percentage probably diminishes the larger the recommending pool is (niche aggregators are going to be inherently more trustworthy to their followers). But also telling is the way Gimein is willing to subject himself with no apparent hesitation to the sort of analysis usually reserved for online advertising.

Anyway, Twitter foments the fantasy of our vast influence, our endless relevance to everyone, and enlists more or less meaningless numbers to sustain it. Following people and being followed doesn't signify any kind of commitment, any reciprocal responsibility -- it's just an effortless way to give and receive empty recognition. It's a devalued currency, hyperinflated. But we can use that number nonetheless as a focal point, a kind of mandala for our self-worship.

The quantification disguises the emptiness of the social relations it is supposedly counting, an operation that reiterates the kind of instrumental rationality that characterizes the neoliberalism colonizing more and more of everyday life. Despite its early promise as a social-planning tool, Twittering is becoming a self-referential operation; we project things that make us feel important and pretend that it is for the benefit of unseen (and, in fact, often indifferent) others. We get a simulacrum of civic participation minus the trouble of other people and reciprocity and responsibility. We can buy followers for our Twitter feed and then forget in the midst of our fantasy how self-defeating that is.

Beatles Rock Band (18 Aug 2009)

Though it has a great hed, Daniel Radosh's article on the new Beatles Rock Band game reads like a long infomercial, or like a textual equivalent of one of those fake-documentary shows that gets made to promote movies, with interviews with the stars and the director that can be clipped out for use on Entertainment Tonight or get thrown on the DVD as bonus "features." (Though I can understand why Radosh took the assignment -- who wouldn't jump at the chance to interview the last living Beatles?) It did nothing to allay my confusion about music-based games. Changing radio stations in the stolen cars in Grand Theft Auto still seems a more meaningful musical gaming experience to me than playing Simon to the beat on plastic mini-guitars. I still think I may as well try to play guitar and suck, or simply play air guitar rather than try to master the irrelevant and counterproductive ersatz fretwork of Guitar Hero. (Guitar playing involves raising to the level of instinct the movement of your fingers up and down the fretboard along with the rise and fall of melody. Guitar Hero seems capable of thwarting that development.)

And I continue to think statements like this one -- "Playing music games requires an intense focus on the separate elements of a song, which leads to a greater intuitive knowledge of musical composition" -- are pretty ideological, wishful thinking asserted as nebulous fact by the game's marketers. ("Knowledge of the alphabet leads to a greater intuitive knowledge of poetic composition.") This tepid endorsement has the same hopeful vagueness: "Olivia Harrison, George Harrison’s widow, who stopped by Abbey Road while Martin was working, recalled her surprise upon first playing Rock Band a few years earlier. 'You feel like you’re creating music,' she marveled. 'It must engage some creative part of your brain.' " Of course, you are not making music and are only simulating creativity vicariously. But if the game satisfies the brain's creative impulses, why not just call it real creativity, the same way we might call crack-induced euphoria "true happiness"?

My interaction with music has always been a different sort of game, in which I scored points with myself for memorizing lyrics and remembering song titles and the names of the musicians in bands that most people had barely heard of. Music trivia seemed the natural game to play to me, and the rewards didn't seem to cancel out the pleasure of listening but instead enhanced it with context. That trivial information could buy my way in to some conversations and establish bonds with people who otherwise would have ignored me is a happy by-product. The games circumvent all that; they house the information and their interfaces commandeer the social engagement.

The mix of social, cerebral and sensuous elements in my response to music is most satisfying when it seems immediate and fused, a kind of physico-cognitive dance that occurs spontaneously with the sound. I tap my hand, maybe imagine an album cover, sing along in a way that makes me feel as though I am merging with the music and its mood. I wouldn't want someone or something to judge my ability to keep time with the drummer while I am tapping my foot or measure the synced-up relevance of my air-guitar strumming. That would make me more self-conscious rather than less, seeming to defeat the whole point of immersing myself in a song. I'm "inside the music" in an ineffably complex way that seems direct as opposed to mediated.

So I just don't get Rock Band. I don't want a game to mediate music for me when music is already mediating other, more profound experiences -- memories, dreams, secret pathways into the hearts of friends or imagined strangers, sheer abandon to sensory stimuli. These are enough to hold my attention; I wouldn't want those experiences endangered or compromised or supplanted by the discipline enforced by a game that measures your attention. That seems to me like covert industrial training.

Maybe if I could get over the scorekeeping component and think of it more like karaoke, I would understand it better; this explanatory comment of Radosh's made the most sense to me: "What nonmusicians want, it turned out, is a sense of what it’s like to perform the music they already love." I understand that desire, though it seems strange to gratify it in this form, largely free of the effort and frustration that actual collaboration and mastery entail.

I don't quite accept the argument that Rock Band is akin to things like Flight Simulator games that let you pretend to be a pilot without having to actually learn. Music appreciation is richer and potentially deeper than simply getting to pretend to do something you can't actually do. So the music games seem to be ruining richer soil, precluding deeper engagement in favor of a mediated, preordained, regulated one. It's the apotheosis of Adornoesque fears of debased, administered culture squelching the free space of aesthetic creation in which resistance and protest could still be mustered. Instead we'll get Rock Band: Woody Guthrie.

The need for spaces in which interactivity is not preprogrammed and which allow for an unbounded sort of imaginative engagement seems crucial, akin to the information-free zone Jason Zengerle posits in this TNR post.

I like to learn covers of songs on guitar, and I think it has something to do with this, retracing the steps of musicians you have come to think of as being beyond merely creative -- it seems almost unbelievable, miraculous, that you can actually just play "Dear Prudence," that your fingers are retracing that immaculate moment of creation that John Lennon had when he discovered it and it actually sounds sort of the same for you. It's like touring a holy land, or standing in some spot where a famous speech was delivered.

But I don't feel like I become these legendary rock figures when I play their songs on guitar. It seems like the video games are trying to promote a much more direct sort of vicariousness that cuts out the other potential pleasures of music, as if the reason to love music is simply to imagine yourself being loved by an audience, being someone else.
What McCartney says at the end of Radosh's piece is pretty incisive:
The teacup clattered quietly on its saucer, and McCartney thought about the changes he’d seen in the music world. “There were no cassette recorders” when he and Lennon first started writing songs, he noted. “We just had to remember it. Then suddenly there were cassettes, then we were working on four track instead of two track, then you got off tape, then you’ve got stereo — which we thought just made it twice as loud. We thought that was a really brilliant move.” After the Beatles came CDs, digital downloads and now video games. “I don’t really think there’s any difference. At the base of it all, there’s the song. At the base of it, there’s the music.”
And the future? “In 10 years’ time you’ll be standing there, and you will be Paul McCartney. You know that, don’t you?” He made a sound like a “Star Trek” transporter. “You’ll have a holographic case, and it will just encase you, and you will be Paul McCartney.” He paused and then said, “God knows what that will mean for me.” Then he added slyly, “I’ll be the guy on the original record.”
The Beatles game makes him more famous, more relevant, more real. It makes those of us playing the game more anonymous, more immaterial, more unrealized. And the games of the future will make us able to become anyone but ourselves.

Reified design (14 Aug 2009)

The difference between something that is well-designed and something that is merely designy seems pretty self-evident if you base the judgment on functionality. But the difference between the two is always being blurred, usually by marketers trying to gain an edge for a product that's not essentially different from its market competitors. So we get designy bottles of dishwashing liquid, designy stainless-steel appliances, designy retro-looking appliances and other pseudo-novelties. Design improvements that allow us to consume or use a good more efficiently are conflated with improvements that shift our attention away from use to abstract contemplation -- the good becomes a mirror in which we see reflected our own good taste. Designy-ness, like so many consumerist products, lets us consume ourselves.

These efforts to sell products as a vehicle for design combine to create a climate in which design for its own sake is functionality, an aesthetic end that inherently enriches the lives of those who get to handle such "beautiful" objects. Industrial designers like Apple's Jonathan Ive get elevated to the status of artists, as if their aim was not to sell more goods but to create the Good. Consumerism is thereby transformed into a kind of democratized connoisseurship; Target (or, if you are still trying to preserve class distinction, ABC Carpet & Home) becomes a museum from which you can take home the objets d'art.

So what is wrong with that? This may not be a convincing answer, but designy-ness is an ideological sheen on consumerism, redeeming commodification while furthering it, permitting mass-distributed designy-ness to supplant genuine heterogeneity. In genuine heterogeneity is the chance that we might really redeem the promise of individualism -- that we will be able to garner social recognition for being ourselves, and recognition could be separated from being judged or taxonomized. But designy-ness and its off-the-shelf aesthetic (often prepared by lauded gurus) militates against that. However much we enjoy our own tastes in such stuff privately (solipsistically), we become typecast when we exhibit those tastes publicly.

Terry Eagleton gets at this problem in the Kant chapter of The Ideology of the Aesthetic:
In the aesthetic representation...we glimpse for an exhilarated moment the possibility of a non-alienated object, one quite the reverse of the commodity, which like the 'auratic' phenomenon of a Walter Benjamin returns our tender gaze and whispers that it was created for us alone. In another sense, however, this formal, desensualized aesthetic object, which acts as a point of exchange between subjects, can be read as a spiritualized version of the very commodity it resists.
Designy goods, as spiritualized versions of consumer junk, elevate the practice of their exchange to something more seemingly dignified and become the medium for social contact itself. That is, as good possessive individualists molded within capitalism, we are isolated by our tastes and the goods whispering our ersatz uniqueness to us, and we gloat in our transcendence until the loneliness overwhelms us, and we are driven to participate in society, which we can do only on those same terms, at the level of our tastes in everyday goods, in mass entertainments, and that sort of thing. We think we are curators of our own personal museum of tasteful, designy goods, but in the end it's someone else's institution and we are just the guards.

(For more deep thoughts about design, see this post at one of my other venues.)

Fear of Sharing (13 Aug 2009)

Sharing once seemed to me a simple, straightforward thing, but the way tech and social media companies have co-opted it recently has made me increasingly suspicious of it. In the usage that is starting to become prevalent, sharing isn't a matter of giving over a portion of a desirable thing to others. Instead it is a code word for what is in fact a mode of online production, for labor that we perform ostensibly for the benefit of friends we explicitly connect ourselves with in networks but ultimately for the benefit of the companies who hold the fruits of our effort on their servers. When we "share" via upload, we aren't sharing at all, we are working to move information and data into digital space where it can be manipulated and harvested for profit.

Facebook's recent acquisition of FriendFeed, a service that turns everything you upload to various sites into a single RSS stream, threatens to make the cant about sharing even worse. In this piece for The Big Money, Chadwick Matlin argues that the acquisition is Facebook's effort to corner the market in "social aggregation" -- that is, to evolve into a kind of Google Reader in which instead of subscribing to blogs, you subscribe to people. Matlin likens the effort to the Huffington Post, which aggregates and filters content to make it all manageable, only with FriendFeed and Facebook, your friends will filter what you read, not strangers.

Take the devilishly popular HuffPo, for example. For better or worse, the site's mashup of news from disparate sources has struck a chord among its 7 million monthly visitors. Its homepage is a mix of links to blog posts from HuffPo contributors and links to outside stories from the news media. Rather than hunt and peck through all these other sites, people go to HuffPo to be delivered a smattering of links. Aggregators work because they do all the hard work for you.

So now imagine a social aggregator with the size and sway of Facebook. Users would love it because it would make their lives simpler and more streamlined. The other social media sites stand to gain as well, since Facebook would be pointing more users to content offsite. News sites will get more traffic because people will be clicking through on more links. Facebook, of course, would be the biggest victor: It would be able to get people to check in more often and stay longer. Ad rates can then go up, which helps the company's bottom line.

That seems logical enough. But don't we want actual editors filtering content for us rather than our friends? And the FriendFeed stream would become merely another performative medium, like status updates, only with links and photos and other Tumblr-like flotsam and jetsam. This wouldn't make my life more streamlined -- it would mean I would be inundated with more information to process about my friends' efforts to signal who they want to be. And I would have my own performance to worry about as well.

Because what these sorts of services do -- when we passively reveal what music is playing on our computers, or when we update what book we are reading, or post a Twitter or Facebook status update -- is send the message to the world that it is okay to assume that we are always, always performing. And that is an oppressive, sick feeling, for me anyway. That sort of claustrophobic suffocation precludes the possibility of a truly public -- as in not private -- space. Everything that once might have delineated the private is now being compulsively shared or extracted and brought into view. (Even if we close our networks to permit only our "friends," we still must admit the company that polices the border into our circle, and its terms of service give little comfort that it won't abuse the friendship.) I miss the old sharing, spontaneous gifts to specific people, a willingness to show up somewhere and spend time in their company. The new sharing seems only to force me into a narcissistic posture; the new sharing is always on the verge of boasting.

But obviously not everyone is so troubled by the new sharing. Otherwise "real-time search" -- searching Twitter for up-to-the-minute information about something happening in the moment -- wouldn't work at all. I had my first experience with real-time search last weekend, when the Netflix Watch Instantly service wasn't working and I wanted to know if the whole system was down, or if it was something about my computer or account that was messed up. It occurred to me to search Twitter, which quickly revealed that it was the system. I was momentarily delighted by the ingenuity of this and was grateful that other people bothered to update this mundane stuff, share it, but then I felt guilty about not doing likewise. I couldn't imagine making it my responsibility, or making the leap to believing that everything I am experiencing is relevant to the world. I don't accept that I have an infinite responsibility to share potentially useful information with other people. Would Levinas chastise me?

Soviet Consumerism (10 Aug 2009)

Someone on Metafilter linked to Real USSR, which offers essays and photos of Soviet material culture. It seems like a useful resource in imagining what a postconsumer (or non-consumer) society might look like. The Soviet Union apparently failed to provide a positive example of such a society; its citizens, at least in Western representations, were hungrier than their non-Communist counterparts for branded goods and the world of status consumption from which they were by and large excluded. Their seemingly dismal lack of consumer goods was possibly the best propaganda weapon for the U.S. during the Cold War: Dowdy, nondescript proles standing in lines outside gray, barren Soviet distribution centers would be contrasted with the glitz of shopping malls and the endless opportunities for self-aggrandizement. Who wants to work for a collective goal when we can enjoy a solipsistic reverie in which all causes begin and end with ourselves?

Consumerism in Western society, at any rate, is strongly associated with atomistic individualism, offering the illusion of transcending social reciprocity for a higher convenience, in which pleasure is served directly to us through various purchases in well-stocked retail outlets. Pleasure is presumed to be a matter of accumulation -- is constructed to be that sort of thing, a matter of developing the richest self through consuming and mastering the greatest amount of stuff. In Soviet culture, consumerism must have meant something else entirely, carving out a space for subjectivity -- for an alternate currency of information, about goods and what they might signify -- in an authoritarian state premised on surveillance and information control.

Anyway, the posts at Real USSR give a glimpse of the resourcefulness of Russians in the face of deprivation and the ideology that was meant to justify it or excuse it. This resourcefulness is the kind of thing I tend to sentimentalize as what's lost in consumer societies -- only the Soviets probably didn't experience it as pleasure, despite the faint nostalgia of posts such as this one about DIY fashion. It was likely felt as necessity pinching in, complicating everyday life. It was official ideology impinging on autonomy in ways that symbolized how dominated the populace was:

During the Soviet times fashion was first and foremost, an instrument of propaganda of hard work attitudes and education of good taste. Therefore the way people were dressed was very strictly regulated – just like anything else, fashion had to be “planned” and “approved”.

In other words, old-fashioned sumptuary laws were in place to discourage fashion from becoming a medium for suggesting social mobility. The link between autonomy and personal display is maintained where ideally it would be dissolved. The inescapability of invidious comparison is still implied. The post mentions the "fashion neighborhood watch" -- a sort of inverse of the feeling I have when I walk around the East Village feeling helplessly uncool. I tend to aspire toward sartorial anonymity for very different reasons than the Soviets might have -- but then again maybe they are the same, a yearning for safety. Invisibility in a socialist society is suspect because everyone is supposed to be responsible to everyone else (in reality, the state) in the collective project of society -- you have to be visible but undifferentiated. In a consumer society, visibility traps one's aspirations to individuality in the realm of fashion -- judgment of who we are remains on the surface level, and can't break through to the aspects of personality that are not so easily displayed. Consequently, the culture industries work hard to produce legible symbols for every possible personality trait, de-authenticating them in the process so that all identity can be regarded suspiciously as pretense.

This post, about Soviet brands, surprised me, because I had sort of assumed that there were no brands in the USSR. The post highlights a perfume called Red Moscow, the name of which suggests how branding was co-opted for propaganda purposes -- of course it makes sense that the state would invent nationalist brands. I tend to take it for granted that brands of products function only to help individuals brand themselves, to allow them to project certain traits along the lines described in the previous paragraph. (For producers, brands allow for the elaboration of differences between competitors' commodities where they are more or less materially indistinguishable.) But Red Moscow suggests that brands could be contrived to close off avenues for the development of a superficial self. Nationalist brands would enlist users into helping complete the ideological project of the state, not the self -- a state that may not allow for an autonomous self. Such brands would demonstrate conformity and obedience in a much more direct way than our brands, which entice us to show off our conformity to the general significance of participating in fashion and having a lifestyle. Consumerism is soft coercion in that sense; it allows for a space where conformity can comfortably coexist with rebellion -- the revolution is reduced to continually turning over one's personal affect within a game whose rules are thereby protected from change.

Stupid Names (7 Aug 2009)

I don't hate Thomas Pynchon like New York magazine reviewer Sam Anderson apparently does, but I agree with him about this: "I hate -- maybe most of all -- his characters’ stupid names." Picking awful, jokey names for characters does seem to call the whole aesthetic into question, as it puts Pynchon's sense of humor in a bad light. And the names come up continually, reminding readers over and over that they might not be in safe hands. Is it ever actually funny when made-up characters are given funny names? Or is this a rhetorical tactic that only high-school English teachers find droll and amusing?

The complaint about stupid names reminded me of my long-held philosophy of judging bands by their names confidently, without ever having to hear their music. If a band chooses an annoying name -- one of the most definitive choices they have to make collectively about what they are trying to accomplish -- you can count on them to make similarly poor choices in their music. TV on the Radio? Sounds great, let's go for it, it's got something. Clap Your Hands Say Yeah? Yes! Really whimsical. CYHSY! (Generally, it is bad when your band's name is typically compressed to an acronym when your releases are being reviewed. This recourse to abbreviation is to save space, but I'm sure it also helps writers and editors keep from vomiting on their keyboards.)

A band's name is a potent symbol in the use of musical taste for signaling purposes. If the band's name seems dumb to me, I won't adopt that band and integrate it with my sense of self. I won't fly my own identity under that sort of banner. The more musical taste functions as a parameter of fan identity, the more important the name becomes -- I must be far gone, then, since I am consuming the names alone in many instances and never bothering with the music. Often I have a fantasy of finding a radio station that I trust enough to hear a steady stream of music without ever having to hear the names of the artists.

But sadly when I hear something good, I want to possess more information about it. I can't surrender to the song; I need to maintain a facade of self-control by marshaling and mobilizing facts about it. (I can sympathize with the fear that TNR's Jason Zengerle expresses here. To preserve our imaginative faculties, he suggests that "the trick is carving out spaces in your life that are disconnected, or at least at a certain remove, from the information overload that's all around us." But technology is developing to always put temptation in our path, or rather in our palms.)

Sometimes bands with great names (High on Fire) can't live up to their moniker, but rarely have I turned out to be wrong about a band with a dumb name (virtually every indie band from the 1990s). I can think of only a few bands with names I think are terrible whose music isn't also terrible: My Morning Jacket, Spoon, Bonnie "Prince" Billy. Can't think of any others.

UPDATE: This PSFK post mentions "augmented reality," probably the next iteration of technology mediating our interaction with the entire world. Do people really want this sort of service before it's available, or do they have it foisted on them and then reconcile themselves to it after the fact by embracing its seeming advantages? It's only when I know I can readily access more info about something that I have a distracting wish to do so.

Reputation and rescission (5 Aug 2009)

Economist Bryan Caplan, a health-care reform skeptic, argued that health insurers' concern for their reputation would prevent them from engaging in abusive practices like rescission, in which coverage is revoked once patients need expensive care. Insurers are so concerned about reputation, he argues, that German insurers defied the Nazis and demanded the right to pay damages to Jews after Kristallnacht. Paul Krugman's skepticism about the almighty power of reputation prompted Caplan to wish for a chart that would expose companies' rescission rates (funny that didn't turn up so easily) and later to elaborate on the wonderful powers of reputation in a competitive free market.
When we wanted a new house built, we gave 10% of the purchase price to the builder upfront. The builder gave us a contract almost devoid of legal remedies - practically everything was at the builder's "sole and absolute discretion." A few months later, we moved into our house. 99% of the details were exactly right, and they fixed the rest for free. Why would the builder treat us so well? Altruism? Ha. Legal remedies? Ha. Even repeat business is a stretch. What are the odds we'll ever ask them to build a second house for us? The only answer that makes sense is reputation.
Caplan sneers at altruism, but is altruism so different from the professionalism we expect from doctors when we assume they are not keeping us sick to bleed more money out of us in office visits and so on? Altruism, human decency, professional dignity all matter in many of our exchanges, even though it is hard to find a place for them in a formula-driven rationalistic economic analysis of how capitalism functions. In fact (as the film The Corporation depicts) the firm may function to disperse that altruism and mitigate motives other than profit. Responsibility is spread throughout the corporation so no one has to feel particularly guilty about its cutthroat doings -- as when sick patients have their coverage yanked from underneath them. Individuals within the firm can focus on their responsibilities to the hierarchy rather than to customers or society without feeling like unreasonable monsters. (Proprietors of small businesses have to face more of the brunt of the moral consequences of their practices, which makes it harder for them to fend off behemoths like Wal-Mart.) Damage to a brand's reputation can be combated by the same forces that might publicize it, and a corporation is usually going to have more resources for this than those it has wronged.

Arnold Kling, Caplan's co-blogger at EconLog, raises another problem with the reputation idea as it pertains to health care. "Reputation matters when exit matters. That is, if people will switch suppliers based on word of mouth, then reputation will be important." But under the current system we don't do that. And most people don't have a problem with their insurers until they need to actually use the coverage, at which point it will be too late to switch.

UPDATE: Tyler Cowen makes another good point.
Reputation affects market practices, but possibly reputation is part of the problem. It's relative reputation which matters. The operative reputational incentive is not always: provide a better product to get more customers. Sometimes the reputational incentive is: customers tolerate bad treatment, because established reputations suggest they will receive equally bad treatment elsewhere.
This seems to imply the cycle of relative reputation can push all competitors downward.

Televised food and consumption deskilling (4 Aug 2009)

Though it is part of the conspicuous flood of hype for the upcoming film Julie and Julia, Michael Pollan's article about cooking shows for last Sunday's New York Times Magazine is a fascinating look at the way food and cooking are depicted on television and what that indicates about America's relationship to consumption in general. A few things I found particularly interesting:

1. Pollan talks to a market researcher at the NPD Group, which is ubiquitous in stories about retailing in the business press.
Like most people who study consumer behavior, Balzer has developed a somewhat cynical view of human nature, which his research suggests is ever driven by the quest to save time or money or, optimally, both. I kept asking him what his research had to say about the prevalence of the activity I referred to as “real scratch cooking,” but he wouldn’t touch the term. Why? Apparently the activity has become so rarefied as to elude his tools of measurement.
I found it interesting that someone who is only interested in humans insofar as they are shopping would conclude that they are entirely fixated on convenience. The realm of retail is precisely where that potentiality in people is brought to the fore, rewarded, massaged. So it's not surprising that the NPD Group won't deign to measure an activity that runs counter to convenience, that it would prove "elusive." So is the causality backward here? Does the market research and retail analysis start with an assumption of convenience as the desirable value (if that is the goal, then we can seem to solve our problems with shopping) and then impute it to human nature, as a way of shoring up what is an ideological tenet, not a universal psychological truth?

2. Rob Walker highlighted this paragraph, which jumped out at me too:
The historical drift of cooking programs — from a genuine interest in producing food yourself to the spectacle of merely consuming it — surely owes a lot to the decline of cooking in our culture, but it also has something to do with the gravitational field that eventually overtakes anything in television’s orbit. It’s no accident that Julia Child appeared on public television — or educational television, as it used to be called. On a commercial network, a program that actually inspired viewers to get off the couch and spend an hour cooking a meal would be a commercial disaster, for it would mean they were turning off the television to do something else. The ads on the Food Network, at least in prime time, strongly suggest its viewers do no such thing: the food-related ads hardly ever hawk kitchen appliances or ingredients (unless you count A.1. steak sauce) but rather push the usual supermarket cart of edible foodlike substances, including Manwich sloppy joe in a can, Special K protein shakes and Ore-Ida frozen French fries, along with fast-casual eateries like Olive Garden and Red Lobster.
The point is that commercial television's main function is to make viewers into the sort of people who want to watch more and more commercial television. Any of its programs can be reduced to that agenda, ultimately, with the specific content of any show being something of a by-product, an alibi. As a free-flowing, ongoing form of media, television invites us to interact with it constantly; unlike other consumer goods it can refine the wants it satisfies in the process of satisfying them. Watching TV is like eating a meal that tastes great but makes you hungrier. To that end, television wants to provoke us to replace our concrete, direct activities with vicarious ones and demonstrate through its sensory manipulations that the vicarious experience (to paraphrase Baudrillard) is more real than real.

3. Nonetheless, I'm skeptical of this sort of argument: "You’ll be flipping aimlessly through the cable channels when a slow-motion cascade of glistening red cherries or a tongue of flame lapping at a slab of meat on the grill will catch your eye, and your reptilian brain will paralyze your thumb on the remote, forcing you to stop to see what’s cooking." I suppose there has been a study where the brainwave or eyeball movements of participants are tracked and measured to demonstrate some correlation between pictures of food and limbic system activity, but nevertheless, I have a hard time conceding that humans are hard-wired for any sort of vicarious experience. Such explanations for the increasing amount of vicarious experience in our lives seem to excuse the way various forms of media have interposed themselves into our lives and prompted us to consume images. And it seems imperative to investigate how the seemingly irresistible appeal of images is constructed and reinforced within the realm of images -- how manipulation via images is a craft, an applied science. Our media are not neutral terrain that merely permits the unveiling of evolutionary mysteries; to what degree do they posit and condition those discoveries about the so-called reptilian brain?

4. Pollan suspects the appeal of cooking shows is that they vicariously fulfill our longing for meaningful work:
“You know what I love about cooking?” Julie tells us in a voice-over as we watch her field yet another inconclusive call on her headset. “I love that after a day where nothing is sure — and when I say nothing, I mean nothing — you can come home and absolutely know that if you add egg yolks to chocolate and sugar and milk, it will get thick. It’s such a comfort.” How many of us still do work that engages us in a dialogue with the material world and ends — assuming the soufflé doesn’t collapse — with such a gratifying and tasty sense of closure? Come to think of it, even the collapse of the soufflé is at least definitive, which is more than you can say about most of what you will do at work tomorrow.
Television presumably undermines the way in which such satisfaction might have been integrated into our leisure time in the form of craft-based hobbies.

5. Television is part of the culture-industry program of deskilling our leisure, or rather removing life skills from everyday, nonwork life and transforming them into hobbies for the few. Cooking, as Pollan suggests, has become another front in that war.
We seem to be well on our way to turning cooking into a form of weekend recreation, a backyard sport for which we outfit ourselves at Williams-Sonoma, or a televised spectator sport we watch from the couch. Cooking’s fate may be to join some of our other weekend exercises in recreational atavism: camping and gardening and hunting and riding on horseback. Something in us apparently likes to be reminded of our distant origins every now and then and to celebrate whatever rough skills for contending with the natural world might survive in us, beneath the thin crust of 21st-century civilization.
To play at farming or foraging for food strikes us as harmless enough, perhaps because the delegating of those activities to other people in real life is something most of us are generally O.K. with. But to relegate the activity of cooking to a form of play, something that happens just on weekends or mostly on television, seems much more consequential. The fact is that not cooking may well be deleterious to our health, and there is reason to believe that the outsourcing of food preparation to corporations and 16-year-olds has already taken a toll on our physical and psychological well-being.
Chris Dillow made a similar point in this post about cooking shows. He blames deskilling on "the spread of purely instrumental rationality - the idea that utility maximization consists solely in maximizing consumption for minimal expenditure of time and money." Not only is instrumental rationality an animating principle of capitalism; as the Frankfurt School theorists insisted, it is the lifeblood ideology of the culture industry. (See "Enlightenment as Mass Deception.") It also animates techno-utopian celebrations of the efficiency of the internet for all sorts of social-cum-commercial functions. In general, instrumental rationality alienates us from process and fixates us on the end product, convincing us that time spent enjoying processes for their own sake is wasted, since there are so many cool products out there for us to be enjoying in that time instead. Why cook when you can spend that time online hurtling through blog posts and YouTube videos?

This paper by JoAnn Jaffe and Michael Gertler about consumer deskilling and agribusiness probably has lots of interesting ideas on this point, but unfortunately it's gated. The abstract, though, is promising; it sounds like an academicized version of Pollan's ideas:
The prevalence of packaged, processed, and industrially transformed foodstuffs is often explained in terms of consumer preference for convenience. A closer look at the social construction of “consumers” reveals that the agro-food industry has waged a double disinformation campaign to manipulate and to re-educate consumers while appearing to respond to consumer demand. Many consumers have lost the knowledge necessary to make discerning decisions about the multiple dimensions of quality, including the contributions a well-chosen diet can make to health, planetary sustainability, and community economic development. They have also lost the skills needed to make use of basic commodities in a manner that allows them to eat a high quality diet while also eating lower on the food chain and on a lower budget.
I'm attracted especially by the suggestion that consumers are socially constructed to prefer convenience.

Here (pdf) is another paper on consumer deskilling vis-a-vis food, drawing on the paper cited above and Pollan. It focuses on deskilling, and the problems we face in trying to reskill ourselves (a process, by the way, that the Food Network blocks while seeming to facilitate it): "By gaining experiential knowledge of food, food preparation, appreciation of taste and quality, and increasing food literacy, one renders the range of products and services offered by the industrial food system as both useless and undesirable." In other words, you set yourself against a massive institution that controls most of our access points to the food chain, which means you are in for an endless headache until you are prepared to embrace inconvenience as an ethic of its own.

Carl Wilson's 'Let's Talk About Love': A Journey to the End of Taste (3 Aug 2009)

I'm reading the 33 1/3 book about Céline Dion by Carl Wilson (who is not to be confused with Carl Wilson), which is less about Dion than it is a sociology of pop culture taste. It appeals to me because it dispenses with the obfuscating fictions that taste is autonomous (i.e. intrinsic to one's inner being and the music itself), or that taste can be "right", and looks instead at what social functions taste plays, which class boundaries it helps regulate in a society that pretends to be without them.

The book is framed by the ongoing debate over what the function of pop-music criticism should be, or whether there should be any pop criticism at all. I waver on that question. Wilson mentions the rockist/popist debate, which seems like a red herring; at their worst both approaches are condescending, only in different ways. Embedded in most pop criticism is the idea that listeners need their preferences justified or vindicated by a better-informed outsider. Generally, I get impatient with will-to-power would-be tastemakers, and my experience in the magazine business has confirmed for me without question that pop music critics don't have any special listening expertise -- their ears aren't refined like a wine connoisseur's palate. They aren't doing the sonic equivalent of philology. Perhaps their class habitus affords them the instinct of authority. Usually, though, they are compromised by their own supposed qualifications, the concessions they make to be published for pay. At best, reviewers are clever writers who can startle with a turn of phrase; their work should be appreciated on a formal level, not for anything they might say about a particular record. What reviewers and their editors seem good for is establishing the horizons of relevance -- picking out the dozen records worth hearing and talking about in various genres every year. I like reading what other people have to say about a record I already know pretty well; then I can pretend I am part of a conversation, internally agreeing or disagreeing, coming up with objections. I don't read reviews of records I haven't heard already; since it is so easy to sample music for yourself rather than rely on recommendations, I imagine I am not alone in this.

It used to be that reviewers also established the parameters for pretending to your own expertise. They taught the grammar of snobbery. When I was younger, reading about music helped create the context within which I, a nascent taste bully, could enjoy it, positing the elitist club I could earn my way into by mastering various facts and references and attitudes. Music critics in the 1960s and 1970s taught where the cultural capital might be in pop music, basically inventing the idea that mastering the canon of pop could have any cultural value. In other words, they helped integrate the free field of pop, where anything was permissible, listening-wise (it all failed to register as anything but trivial), into the matrix of social class determinations, so that it suddenly became like an investment, something that could open doors for you and allow you to shut the door on others.

But that need for a context is not limited to aspiring music snobs. Without a listening community, literal or implied, it's hard for an individual to get much pleasure out of pop. Listening to pop is a way to consume the zeitgeist as pleasure, and critical conversations (which are now more dispersed and democratic than ever) are a part of that zeitgeist and help render it material. The depth we recognize in music is supplied by the listening context, which works both ways -- some rich music is emptied of its potential depth; some rudimentary music is enriched with contextual content. But a pure listen, without the compromising effects of context, is impossible, though bogus criticism will pretend to such purity.

If we want to opt out of the zeitgeist, the music bound up in it is lost to us -- what happens is we have to discover certain music years after its popularity (as when I started listening to Dookie last year), with some other rationale than belonging to our time, sharing in the pleasure all the other people seem to be getting from whatever it is. In that case, the pleasure may be in the illusion that one's own tastes are unique, consuming one's own special ability to resist conformity. With all this, the quality of the music itself doesn't matter -- it just needs some marked peculiarity, some relative novelty, to hang all the posturing on. The notion of taste then mediates the contradiction between our desire to belong and our desire to be unique, ineffable individuals. Macro-aspects assure us that we're within the appropriate boundaries, whereas the particularities seem to speak to our uniqueness. Tastes can shift routinely and tactically yet somehow seem to us never to have really altered but merely to have come into clearer definition.

Of course, when we listen to music, nothing but its intrinsic quality seems to matter; all the identity-building and social participatory aspects are suppressed. Clearly our experience of our own taste is visceral, spontaneous -- I hear the Red Hot Chili Peppers and my body wants to vomit; my intellectual preferences seem to have nothing to do with it. It seems beyond questioning, like asking why you wouldn't want to eat a chocolate bar smeared with mustard. That's why it seems that criticism's most important function should be to demystify taste so that the ideological freight it carries is at least exposed. Wilson's book is a model for this; it's great at showing the sort of class contempt that gets disguised and authorized by manifesting as musical taste (especially in Chapter 8, which lays out sociologist Pierre Bourdieu's ideas in an accessible way). Wilson suggests that middlebrow, straightforward, sentimental, conformist music like Dion's is hated because it is associated with average people who don't register on the media landscape. It's "schmaltzy" -- it's engineered to succor the status quo. Wilson explains, "It is not just cathartic but socially reinforcing, a vicarious exposure to both the grandest rewards of adhering to norms and their necessary price." It's blandly aspirational for what has already been endorsed, for what seems given, in all its inequities and imperfections. Hating such music makes us feel above average, with bigger dreams, until the opportunity opens up to become avant-garde by salvaging it, loving it. Musical taste, from this perspective, is sublimated prejudice, social bias turned into something you might be proud to display. Which is why it can seem that a good default position is to always insist that people's tastes are wrong.

But in hating other people's music, we may perhaps fall prey to a different quasi-utopian illusion, a kind of aesthetic eugenics: that if we messianically hate what seems like average, mediocre music with enough ferocity, mediocre and average people will somehow vanish also. Everyone will be special and wise in their own way, and society will so attenuate itself that conformist mediocrity simply won't be possible. We won't have the option to like blah things (and thus be blah people) if we make it impossible for the culture industry to manufacture blah art. Social classes determined by cultural capital will be obliterated along with cultural capital itself. Everyone will be forced to be free. It's a contemporary, secular variant on the recurrent fantasy of ending religious wars once and for all by simply forcing everyone to convert to your faith. It can be difficult to resist this and adopt a kind of indifferent tolerance when we yearn to make the music that moves us into a religion.

Is it possible to avoid these pitfalls, or should one discreetly drop the subject of music from one's conversational repertoire? Is what's left a tepid relativism that forces one to feel guilty every time one doesn't like something? Not long ago, a friend randomly sent me a link to this video of "The Witch", by the Rattles. I'd never heard it before, and it was reassuring that I knew someone who would know that I needed to hear it. Discourse about music always has a chance to open up the possibility of feeling recognized, understood, like that, and that can make the discrimination and mockery it engenders seem a price worth paying. And Wilson's book itself exemplifies what careful attention to taste can reveal, though the price then is an endlessly recursive conversation on the level of meta-tastes.

Thursday, July 28, 2011

The intractable health-care market (29 July 2009)

I haven't written much about the health-care-reform debate because it tends to make me irrationally angry. I'm about to lose my health coverage, and it gives me the general feeling that the society in which I live doesn't really care if I die a preventable death, and that seems like a sucky society to live in. The current American system seems to enshrine the worst aspects of this country's prevailing ideology -- that we regard the idea of collective well-being as an empty notion, and that it's okay for people to suffer as long as the "free market" says it must be so. Right now, professional standards among doctors -- basically their pride in their profession -- are all that stand between sick people and their remorseless exploitation, and as economist Kenneth Arrow notes in this interview at The Atlantic's site, those standards are "eroding."
Some doctors understand that they shouldn't abuse the system. But you still see problems in the way doctors behave towards patients. They goof off. Sometimes it's too much work. Some things are difficult and risky to diagnose.
No system will ever remove the inevitability of human error and individual immorality, but certainly we can arrange for a system in which these undesirable behaviors aren't encouraged. Arrow suggests that the free-floating free-market ideology churned out of the University of Chicago economics department is partly responsible for the current cultural climate, in which profit is seen as the only ethics:
I think there has been a general drift around the country towards the idea that greed is good. Look at Wall Street. All of these industries involve a professional element in which information is flowing. You're supposed to be constrained to be honest about it. I don't really know why. But there is now more of an emphasis on popularization, which does improve efficiency but can also lead to an erosion of professional standards. There was this idea that professional standards were a mask for monopoly power--a Chicago theory, which I believe came from George Stigler. I don't know if they were that influential, but they seemed to be saying a lot of things that people were taking up in practice. I'm not totally sure why these professional standards changed, but it's more than medical reasons.

Rather than having patients pay for the treatment and resolution of an illness, the American system instead nickels and dimes customers with a bewildering variety of charges for tests, consultations, transportation of samples, X-rays, test analyses, and so on. Having your blood tested for a vitamin deficiency can result in six or seven different bills from a variety of medical-service providers. Insurers are supposed to manage this process so that it doesn't trouble or confuse the patient, who has his or her health to worry about. But instead this patchwork system has become a chaotic blizzard of invoices that incentivizes doctors to overprescribe and insurance companies to try to deny payment and care. Both are given economic reasons to keep patients ignorant, even as some, like Ronald Bailey in this Reason article, argue that customers (i.e. sick people) must be forced to contain health-care costs, implying that it is their fault that spending has gotten out of hand. In the face of all the confusing billing, customers have failed to be "cost conscious" about health care. But it is impossible to know, without an absurd amount of investigation and scrutiny, what one is even being charged for in the health-care realm; the notion that we would perform price comparisons and veto procedures and tests before the fact, over our doctor's suggestions, is totally unreasonable. The insurance market is also notorious for its confusing contracts, its misleading paperwork, and its practice of denying coverage customers had been led to expect. That's known as rescission and is just about the worst thing in the medical business. Which is why it shouldn't be a business.
Arrow argues that things like nonprofit hospitals and the eschewing of medical advertising signal the medical community's commitment to professional standards and its collective effort to correct the inherent market failures, whereas the recent "emphasis on markets and self-aggrandizement in the context of healthcare" undermines those standards and sends the opposite signals. As a result, patients like me get super-paranoid and even less rational in our medical decision-making, worsening the failures that libertarians and liberals alike complain about. Bailey thinks government intervention and regulation have helped cause this problem and only consumers can fix it by being better watchdogs over what they pay for; he alleges that "competition would provide a strong impetus for medical practitioners to provide consumers with good information about the effectiveness of various treatments and drive innovation." Most liberal analysts believe that patients will always be at an informational disadvantage (not least because being sick renders one vulnerable and incapable of rationally sorting through billing details) and that only government intervention can help fix it.

At his NYT blog, Paul Krugman summed up the essential problems facing market-driven health care (and why the American system so badly needs reforming), drawing on this classic paper (pdf) by Arrow. Krugman's post is worth reading in its entirety, but the gist is this: For-profit insurers "try to deny as many claims as possible, and ... avoid covering people who are actually likely to need care. Both of these strategies use a lot of resources, which is why private insurance has much higher administrative costs than single-payer systems. And since there’s a widespread sense that our fellow citizens should get the care we need -- not everyone agrees, but most do -- this means that private insurance basically spends a lot of money on socially destructive activities." Achieving cost-effective treatment can't be entrusted to for-profit entities, because they will always put profit ahead of your health, defeating the ultimate purpose of medical care in a society. The state has a better chance of organizing workable health-insurance pools without concentrating energy on innovating new ways to deny care to maximize profit.

This is the point at which conservatives complain of rationing -- the government is going to decide who gets what care. This is meant to distract us from the de facto rationing already in place through unaffordable prices. The current arrangement suggests that we as a society believe the poor should just go get sick and die. Reform efforts are aimed at presenting a different face of our society, one currently suppressed, in which we collectively face the mortality risk we all share, the thread that unites us all, no matter what other circumstances we may have been born into.

Going analog (28 July 2009)

A few days ago, in an attempt to recapture some of the deeper pleasure I used to take in listening to music, I hooked up my turntable, which had been sitting in a closet on top of a milk crate holding the few remaining records I didn't give away when I moved two years ago.

When I finally got the cables straightened out and dropped the needle on a record (The Other Woman by Ray Price), the first thing I noticed -- something that I had totally forgotten about playing records -- is that each time you play one, it sounds a little different. There are many contingencies: static, dust, the needle's fidelity, the speed at which the turntable revolves. Records get worn out, obviously; they develop skips and so on. Some of the skips on records I had as a kid are burned into my mind, so that when I hear "Born to Run" on the radio, I still brace myself for a skid across the "wha-uh-uh-ohohoh" part at the end that never comes. I have this unique (albeit mostly useless) relation to that song because of the specific damaged record I owned. (Who knows? Maybe people will come to sentimentalize imperfections in the compression of their audio files. I tend to delete them instead.)

It's silly to sentimentalize skips in records, but they alert me to the fragility of the entire musical encounter; they hint at the miracle of performance that we typically take for granted. That fragility figures music as something rare, something requiring care and preservation, something that is still sanctified despite its commercialization. It seems that digitization has destroyed once and for all that palpable sense of sacredness in music. Music still has its functionality, but it seems less autonomous from the culture industry -- because of its materiality, analog has built-in friction against hypermediatization, against infinite copies and regressive recursivity and the endless production of simulacra of simulacra. With analog, the signal degrades, and in that there seems to be a sort of salvation these days.

I am attracted to the possibility of negating aspects of music consumption that in recent years have made it more "convenient." I'm strangely hoping to make listening to music purposely inconvenient. Instead of delighting in being able to take my whole music collection anywhere on an MP3 player, I'm returning to the quaint idea that I have to go to a particular place to indulge in my music, a de facto listening booth. When I go into the room with the turntable and put a record on, I'm specifically interested in hearing music, not making a soundtrack for my life. Instead of being a pretend DJ for my own household radio station, I put on an album side, from the very limited selection available to me, and listen to every song the producers decided to put on there, for better or worse. No skipping songs impatiently, no opening an audio editor to delete boring parts, no metadata editing and star ranking and image file curating. The music just plays.

If I grow tired of the few records I have, I can't just go to an MP3 blog online and get some new ones. Instead I'd have to go to a thrift store in the neighborhood and pray that something worthwhile will be buried in there. In my mind is a vague wish list of records it would be nice to discover, many of which I used to own but of course got rid of in the excitement over digital convenience. It's hard to explain why, but these seem like records that make more sense as vinyl artifacts: Paul McCartney's Red Rose Speedway. Fleetwood Mac's Tusk. And Hard Nose the Highway by Van Morrison. Bob Dylan's Street-Legal (even though the mastering is horrible).

It's a pure nostalgia move, I know, trying to re-create the listening conditions of my 11-year-old self, and it seems like a counterproductive road to be on, one that leads to becoming a prisoner of history. What am I going to do, only listen to music from the LP era for the rest of my life, or become one of those extreme audiophiles who spends $40 on limited-run vinyl pressings of new material? Of course I'm still going to listen to music through my computer, but I'm hoping to rekindle the memory of an alternative -- as if going analog is some form of cultural resistance, a faint form of negative dialectics. At a time when we can access as much music as we want pretty much anywhere we want, I'm trying to restore a sense of limits to my listening, develop a relationship to songs again as mediated by the physical object of the album. Does this make the medium the message to a degree that the specific content of the records is obviated? That, I guess, is what I am going to find out.

UPDATE: This FT article (via PSFK) details Apple's rumored efforts to rekindle the fetishization of albums. It's like Apple and its minions are one step ahead, preemptively co-opting any efforts to create a sphere of culture immune to its influence. My fantasy is about detechnologizing my relation to music. Apple apparently is trying to leave no escape routes, not even into the past.