Thursday, August 5, 2010

Cosmetic surgery as clothing (14 October 2006)

I'm always a little wary about getting into gender issues because I don't want to lapse, as may be the male tendency, into a paternalistic, patronizing attitude about them: My friend Carolyn has fond grad school memories of being lectured to about what it means to be a woman by some male student who had read a little Cixous and Irigaray ("Don't you see, Carolyn? All women wear the mask"); I definitely don't want to be that guy. I'm not entirely sure which is worse, blundering through a discussion of such issues in ways that reveal my tunnel vision or ignoring them altogether -- but then again ignorance never stopped me from proclaiming opinions on other subjects, so why should it stop me here?

In yesterday's WSJ was a review of Beauty Junkies, a book about the cosmetic surgery industry and women who are "addicted" to having work done. Reviewer Alexandra Wolfe sums the situation up this way:
These days, dots on the derriere are just one clue among many -- including the shiny sheen on stretched cheeks, the ever-alert eyes and the perky, unbouncing breasts -- that someone has "had a little work done." Beauty Junkies captures the sad fate that has befallen the feminine ideal: Since women can achieve an approximation of attractiveness through one procedure or another, they all end up looking vaguely like the same person: an aging porn star. In the end, the book leaves the reader not only aware of the emptiness of cosmetic surgery's results but also conscious of the vacuity of our current concept of beauty itself.
Though I would toss out the reference to the feminine ideal, that seems an apt description of what is so creepy about cosmetic surgery: it permits women to eschew their natural looks in favor of a technologically produced fashionable alternative. The standardizing technology of the surgeries and injections and so on allows the beauty of any person to be judged by the same set of criteria, rather than each person evoking her own criteria to explain her unique beauty. And it makes of this standardized "beauty" a proxy for money and class -- those with the income and the access to the right surgeons can achieve the robotic look Wolfe describes and will thus be held to be beautiful by society, though they are clearly hideous to anyone marveling at their plasticity close up. Aesthetic beauty has no objective reference point (there's no universal "feminine ideal"); it always derives from the imperatives of signifying class. Cosmetic surgery forwards that aspect of the ideology encoded in beauty: it makes beauty a choice but presents that choice as a natural fact (much as class is supposed to be, with blue blood as proof of a natural and God-intended superiority). Cosmetic surgery extends fashion's domain from women's clothes to their very bodies, which in turn allows the outward expression of the natural self to be eradicated altogether. Carolyn's grad-school friend might say they are completely and perpetually safe behind "the mask."

My defensive preface to this post comes in here, because I don't want to sound as though the women who get plastic surgery are either dupes, victims or shallow collaborators with an oppressive male order. As Pandagon blogger Amanda Marcotte is always pointing out, "you can criticize the power inequities that the garment is evidence of without attacking women who are better off wearing it than not, for whatever reason. And same thing with make-up or high heels or shaving or whatever. That women feel they have to act more or 'do' femininity to achieve perfectly reasonable goals, like be attractive or to get a job or whatever is not a sign that those women are somehow awful. It’s a sign that they are in a socially inferior position and have to put up with more shit to get half as much."

But this isn't so much about gender, I guess, as it is about technology standardizing behavior and expectations as it presents more "choices." Women have more choices and options than ever in how to conform to an oppressive standard of beauty; isn't that great? What freedom. The choices are actually coercive in practice; they destabilize one's sense of self and intensify feelings of insecurity; they intentionally create the impression of inadequacy. When consumer choice colonizes a realm of everyday life, it absorbs it into the play of the cycle of fashion and the zero-sum rigor of manufacturing class distinctions, the requirement of consuming conspicuously. That's why Marcotte's prediction that the pressures of self-presentation will come to be distributed more equally between men and women seems both plausible and extremely depressing. "Grooming standards are going to go in this direction, I suspect. As women gain power, we’re going to grow weary of tap dancing for men, but on the other hand, men are going to start tap dancing for us. I’ve got no problem with this; in the abstract sense, a lot of things marked feminine are joyful in themselves, but only problematic because they’ve got the baggage of inferiority attached to them. Ornamental dress and grooming isn’t really a problem, unless you have some sort of grudge against color and beauty." I guess I do have a grudge against color and beauty, because in consumer society those concepts aren't for themselves but are tools of producing, displaying and reinforcing inequality -- now they reinforce gender inequality, but should they shift in the way Marcotte anticipates, then they will express and uphold class inequality. The "baggage of inferiority" is always attached to beauty once it becomes subject to fashion -- that is, once it becomes an on-demand product; ultimately that's the whole point of "feeling beautiful" as opposed to simply being beautiful: to make yourself feel superior and others inferior.
And its byproduct, that we all feel insecure over just where we rank in the beauty hierarchy, just makes us that much more cooperative with the existing social order.

Freedom from choice (13 October 2006)

One of the things I used to naively romanticize about the Soviet Union as a teenager was the idea that there were purportedly so few choices among consumer goods that the stores were bereft of brands. I felt terrorized by the need to own brand-name clothing and crap like that, so I used to daydream about the land beyond such distinctions. This image, which I found on the English Russia blog, pretty well illustrates what my austere fantasies were like.

Since then I've adopted a different attitude toward the defunct Soviet system, but I'm still skeptical about the need for so many consumer choices. Convinced by Barry Schwartz's analysis in The Paradox of Choice, I usually think of the problem of too much choice in terms of option paralysis: the existence of more choices defers our need to make a decision and enhances our fear that we'll make a less-than-optimal choice. The more choices we are presented with, the more likely it is we'll become a "maximizer" and cease being a "satisficer," to use Schwartz's terminology. Apparently people can generally adapt to whatever course they have chosen and rationalize it as the best choice retrospectively, but that benevolent process won't kick in as long as we suspend ourselves over multiple possibilities and "keep our options open." One of my main gripes about cell phones and other communications technology is that it encourages precisely that behavior, a refusal to commit to any plan and an attitude that all decisions are provisional. This, I think, leads to greater uncertainty and further unhappiness and a certain irrational insecurity that manifests, for example, in the insane compulsion to spend every moment while walking down a sidewalk on the phone with someone else. Because one's own decisions have been made provisional, one probably assumes everyone else's have become that way too, and therefore we must keep calling each other up to firm up plans or lobby for what has already supposedly been agreed upon. This adds to the sum total of insecurity we all must wrestle with every day, yet it is extremely difficult to perceive that systemic low-level insecurity; instead we remember those instances when cell phones prove truly convenient.

But in certain cases, too much choice can lead to misery for another reason: overconsumption. This post from economist Chris Dillow's blog Stumbling and Mumbling cites a study about TV watching that reveals that "For the 10% of people who watch most TV, relative to what you'd expect from their demographic features, moving from 3 to 10 TV channels depresses well-being by one-third of the effect of getting divorced." Dillow evokes the notion of akratic individuals -- i.e., people with less-than-average willpower who can't resist temptations they would otherwise prefer to resist -- for whom the additional options prompt unwarranted and ultimately undesired consumption. Akrasia poses a difficulty to neo-classical economic thinking, which holds that consumer choices reveal preferences and that people are in effect incapable of doing things they don't really want to do (and if they say they have, they are lying to themselves and putting up a false front of virtuousness or morality or modesty or what have you). What's more, people are presumed to make choices among potential pleasures that will unerringly yield them the most satisfaction. We are supposedly inherent maximizers, of a sort, but with none of the decision-making agony -- we just automatically find the most utility available to us at the margin. From this point of view, overconsumption is a ludicrous oxymoron.

But evidence and anecdotal experience seem to point the other way. Overconsumption occurs; rational choice isn't a given. Environmental and psychological factors lead people to choose poorly and against their interests and intentions. But because perfect rationality is enshrined in the received analyses of capitalism, and because capitalism shapes our consciousness in ways we can hardly even begin to enumerate, we tend to expect of ourselves this perfect rationality; we tend to overrate the "freedom" that comes from consumer choice and underrate other forms of political and social freedom -- or rather we see our ability to vote, to participate in civil society, to express ourselves more or less freely as finding its most perfect expression in market situations, in the choice among products we'll own. And since we are encouraged by the standard economic analysis of capitalism (which trickles down throughout capitalist culture) never to regard our free choices as constrained or curtailed or shaped by any force other than our own will, we believe the exercise of that will in the market is the most meaningful self-defining activity we can undertake -- consumption trumps production, and we are what we own rather than what we make and do. Also, it gets harder to understand what is happening when the market disappoints us, when we discover we have made the false choices that received ideology has taught us are impossible. Society allows no space for such disappointment to exist, since we can't blame our perfectly rational selves or the perfectly efficient market. So it just builds as a kind of dark matter, perhaps finding expression in the rise of mental illness, stress, and fundamentalist spirituality.

Pop music critics: Raconteurs or racketeers? (11 October 2006)

Though I'm an occasional practitioner, I'm generally turned off by pop music criticism. Usually the critic must strain to establish the reliability and authority of his own tastes, when these are often arbitrary or determined by extra-musical considerations. The critic's need to sound authoritative often becomes an end in itself, so that informing the audience about a piece of music becomes subordinate to establishing his cultural capital through allusive flourishes and headlong rhetorical rushes and crafty phrasemaking. Many reviewers are extremely creative, but the creativity seems misplaced if not parasitical. It's a bait and switch: you begin the review wanting to learn about a band or an album and end up regaled with the reviewer's diaristic ramblings. The reviewer uses the review as a ruse to get you to pay attention to his raconteur performance.

Often this is entertaining and informative, but it feels as if that happens by accident. When I was a teenager, access to music and music opinion writing was so limited that I would consume whatever I got my hands on and probably gave it more credit than it deserved: the scarcity of column space made it seem that those who had it had some privileged insight into the workings of pop, that they had oracular wisdom. There's no scarcity of column space now; now there's a scarcity of attention that readers can pay to all the reviewing that's out there. This superfluity, paradoxically, has produced the monopoly Matt Yglesias argues Pitchfork now has over indie-rock taste formation. I suspect that the long tail of Internet opinion writing makes those few "hits" at the narrow head seem that much more important; the more options there are, the less consumers want to experiment -- a "paradox of choice" scenario. Yglesias explains it differently, citing the decline of local alternative weeklies:
Most categories of media used to rely on a handful of big players that dominated the scene. The Internet, by lowering the barriers to entry, lets more voices get at least some audience and you see a lot of fragmentation. But indie music was very fragmented back in the day thanks to alternative weekly papers. That particular brand of media has, however, been very hurt by the Internet. On the one hand, there's less need for each town to have its own record critic and movie critic when the Web can distribute reviews nationwide at very low cost. At the same time, Craigslist has really undercut the classified advertising market. So we've seen the emergence of a single website with enormous market power -- Pitchfork.
The barriers to entry, of course, are still low. But to prevent a rival from emerging, Pitchfork doesn't need to be perfect -- it just needs to be good enough. Which it is. Their taste is generally reliable. What's more, however, there's an asymmetry to what kind of reliability matters. A website that regularly recommended bands that turned out to suck would be a real problem. You'd waste money on albums and shows that you didn't enjoy. But if the website merely fails to recommend albums that are, in fact, good you won't notice. You just won't buy them. Instead, you'll buy other things that they do recommend. And as long as those things are non-terrible, your life will proceed just fine -- you'll still have plenty of good music to listen to and there won't be an incentive to seek out alternative opinions.
I don't know if Pitchfork has this kind of hegemonic power or not, and I'm not sure there's greater incentive to write negative reviews than positive ones, though perhaps the unlimited space and the emphasis on a reviewer's raconteurship have made preliminary filtering -- the selection of only interesting things to review -- less significant and customary. Still, people turn to reviews for recommendations; they don't need to be told what not to listen to. And there's no pleasure in writing negative reviews. I've written plenty of them, but usually out of a misguided sense that what I was doing was some kind of radical truth-telling about the nature of the culture industry. But truth has nothing to do with it. I thought it made me seem credible and uncompromised to be negative; but record reviews are no place to make the case that commercial music altogether should be stopped.

Writers, knowing that a positive review will be read more than a negative one and will likely be featured more prominently on a site or on a metafilter-type aggregator, have more incentive to review everything glowingly and manufacture hype. And I don't think it hurts Pitchfork or Spin or anyone else to hype bad bands. People are quick to forgive misleading hype because they get a temporary joy from the excitement it infects them with and because it is so universally prevalent that they probably don't bother to hold anyone in particular accountable for it. In fact, it seems Pitchfork rose to prominence on the strength of its breathless hype of bands that succeeded in becoming semipopular. A pop critic's power may seem to come from piggybacking on some high-profile trends and being regarded as the herald of things that have brought pleasure. But because music's ability to give pleasure is so arbitrary, I wonder if a critic's power ultimately has nothing to do with predictive power and everything to do with how entertaining a raconteur he is on a consistent basis. In other words, Pitchfork is widely read because the reviews are funny, not because they are accurate. Trends in indie-rock popularity are likely driven more by TV music supervisors selecting songs for shows or perhaps MySpace momentum than by Internet critics. And most of all they are driven by the word-of-mouth maestros that Gladwell discusses in The Tipping Point, who differ from writers, I think, in the amount of ego invested in the taste-making process.

In an interesting post at Crooked Timber, Henry Farrell picks up on Yglesias's observation of distorted incentives to make a slightly different point about the source of pop critics' alleged power. Farrell cites Diego Gambetta’s work on the Sicilian Mafia in an effort to relate the arbitrariness of pop music criticism to the Mafia's racketeering methods. Just as the Mafia must broker negative outcomes to chosen victims to demonstrate their power, so must critics advocate dubious art to assert their ineffable powers of discernment:
Critics serve to guarantee to the public that certain artists, certain music, is ‘good’ (there are a whole bunch of sociological questions about what constitutes ‘good’ in this sense that I don’t want to get into). But they also want to preserve their own role as critical intermediaries and arbiters of taste – in other words, they don’t want consumers to feel sufficiently secure in their own tastes that they can bypass the critic and formulate their own tastes about artists. Therefore, one could make a plausible case that critics have an incentive to inject certain amounts of aesthetic uncertainty into the marketplace, by deliberately writing reviews which suggest that bad artists are good, or that good artists are bad, so as to screw with the heads of the listening public.
I think critics lack the kind of leverage with consumers to make this work, but for those who have fallen into the trap of looking to "established" critics to foster their own taste's legitimacy, this sort of strategy will keep them ensnared. I doubt critics consciously embark on such a nefarious plan -- it's not as organized as organized crime -- but they probably excuse their abstruse choices as demonstrating their versatility or flexibility or personal growth as a critic rather than as an effort to keep readers guessing. But it's probably right that the motive lurking behind all pop-critic discourse is the need to justify the need for pop critics at all -- they are always threatened by the fact that pop culture is made to be directly accessible to a mass audience without intermediaries, that its aim generally is to cater to broad, simple tastes. The pop critic must always obfuscate that fact if he wants to do anything other than filter.

Defining altruism as impossible (10 October 2006)

Tim Harford's column in the Financial Times this past weekend adopts the familiar economists' argument that altruism is a convenient fiction, and that all behavior at some level is self-interested. "It is not that economists are incapable of imagining - or even modelling - altruism. They can, but they usually don’t. And there is a good reason for that: people aren’t selfless." I don't necessarily disagree: yes, people who do good get something out of it -- the satisfaction of doing things that others, and they themselves, will find meritorious. But I'm not sure what the point of such a case is, other than to argue that it's no good even to try to be anything other than selfish, and thereby to encourage people to spend less time rationalizing their selfishness or checking it.

Is that what's at stake? Is Harford wanting to espouse the Hayekian idea that altruism undermines price signals and makes it impossible for others to properly value what an economy requires? But generosity on the scale he's discussing doesn't really threaten spontaneous order. Is he perhaps at heart one of those Ayn Randians who think charity is actually detrimental, a patronizing impingement on the dignity of those you seek to aid? ("Those children with cerebral palsy don't want your pity. Let them learn through hardship how to cure themselves.") What's wrong with wanting to impress people with how giving you can be? It may be a sneaky way of being ostentatious, but isn't it better than buying a Lamborghini? ("Well, actually, the Lamborghini plant employs…") The whole thing seems like a cheap way to be contrarian without making an especially clear point.

But I was most perplexed by this piece of econothink:
Even the way we choose to dole out cash betrays our true motives. Someone with £50 to give away and a world full of worthy causes should choose the worthiest and write the cheque. We don’t. Instead, we give £2 to the street collector for Save the Children, pledge £15 to Comic Relief, another £15 to Aids research, and so on. But £15 is not going to find a cure for Aids. Either it is the best cause and deserves the entire £50, or it is not and some other cause does. The scattergun approach simply proves that we’re more interested in feeling good than doing good.
I don't follow this at all: why would wanting to support multiple causes call your sincerity into question? Is "authentic" charity really an all-or-nothing proposition? This is what comes, perhaps, of being locked into evaluating utility at the margins, or fixated on the logic that leads one to decide voting is pointless since the chances are astronomical that your vote will be decisive. There the logic is the same: one votes to make oneself feel better and to pretend to be a good citizen. But it's not a pretense; these sorts of "useless" gestures establish important parameters for one's behavior and elevate a principle of doing a virtuous activity for its own sake, not because one has rationally calculated the action that will be maximally efficacious. Donating money to several causes may demonstrate indecisiveness, but it doesn't necessarily imply self-satisfaction. Charitable impulses are necessarily haphazard, because they represent a flight from "rational" selfishness -- the pleasure we take in them is in part how irrational they make us feel, disposing of Bataille's "accursed share." The altruistic move is the one that can never be modeled or predicted; thus it's a way to reassert our human spontaneity in the face of institutions that increasingly anticipate, often to our delight, our every next move.
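For what it's worth, Harford's all-or-nothing logic holds only if each pound does the same amount of good regardless of how many pounds came before it. Under the equally standard assumption of diminishing returns, splitting a budget across causes can do strictly more total good. Here is a toy sketch in Python; the square-root utility functions and the £50 budget are purely illustrative assumptions, not anything from Harford's column:

```python
import math

# Toy model: two causes whose "good done" grows with diminishing
# returns (sqrt is an arbitrary concave choice). Under Harford's
# implicit linear model, the whole 50 pounds should go to one cause;
# under diminishing returns, splitting does more total good.

def total_good(to_cause_a, budget=50.0):
    """Total good from giving `to_cause_a` pounds to cause A
    and the remainder of `budget` to cause B."""
    return math.sqrt(to_cause_a) + math.sqrt(budget - to_cause_a)

all_in = total_good(50.0)   # everything to one cause: ~7.07
split  = total_good(25.0)   # even split: 5 + 5 = 10
```

The point isn't that donors actually compute this; it's that Harford's inference from scattered giving to insincerity depends on a linear model of "good done" that he never argues for.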

Damn Yankees (6 October 2006)

According to the adage, rooting for the Yankees is like rooting for U.S. Steel, but in fact the Yankees with their enormous payroll, entirely disproportionate to the rest of the teams in the league, represent, if anything, the triumph of labor over management, earning close to the maximum of their market value for their efforts, for better or for worse. In the game that fans generally don't care about -- the struggle between workers and bosses -- the Yankee players are succeeding in unprecedented fashion. In the game we watch on the field, it's a different story.

This weekend the Yankees lost a playoff series to the Detroit Tigers, a team with the best pitching in baseball (and the statistics prove it) and which won 95 games, two fewer than New York. Because of the disparity in the teams' payrolls, though, the Tigers were considered underdogs, wildly overmatched, a hopeless longshot to even compete in the series, so their victory was heralded as some kind of karmic triumph, something that had less to do with their efforts than with the hubris of the Yankees and the laziness and indifference of the team's overcompensated players. Commentators got the chance to break out all the ideological notions that go along with big-money athletes: well-paid players can't work together as a team, players care more about their pay than the game, the players are arrogant and aloof and unmanageable, they were overconfident in the face of low-profile upstarts, they don't play for the love of the game.

Then there's the Tigers, who prove that working hard and beating the best is its own sweet reward, no matter what the players' take-home pay is. By repeating these nostrums, are we dittoing management's line in undermining unions? True, it's hard to see guys like Alex Rodriguez and Gary Sheffield as working-class Joes; they tend to be depicted as mercenary "free agents," even though it required union intervention to allow them to negotiate the contracts they received, which were not extorted but given freely as a response to fair competition, at least as we typically define fairness economically -- not having enough money to make a competitive bid doesn't amount to unfairness, despite the complaints of small-market teams. But such free agents are ultimately wage workers; they don't own the means of producing baseball games, and their talent and notoriety -- the only capital they have -- require someone else to build an arena in which to exhibit them. They just happen to be wage workers who have managed to get a much fairer deal -- a larger proportion of the MLB enterprise's profits -- for themselves than most workers, because they have rare skills that are not easily replaceable. But because they have won the labor-management game, we for some reason have a strange desire to see them lose the game on the field as recompense. We crave proof that being a successful worker paid an appropriate wage somehow means you are a bad human being, tainted by money.

We end up cheering the Tigers' victory as some kind of victory for the baseball system or because they have a low payroll and have thereby "overachieved" -- but isn't it odd to cheer an organization for its success in suppressing wages? That is like rooting for U.S. Steel.

Listener fatigue (6 October 2006)

Reason's blog links to this article from the Austin American-Statesman about the abuse of audio compression in contemporary pop music. Compression is a post-production audio-processing effect that eliminates dynamics and makes everything sound equally loud and crisp -- and it's what makes it sound like your radio is going to explode when the chorus of "Smells Like Teen Spirit" kicks in. When records were pressed to vinyl, the medium limited the amount of compression that could be used (which is part of the reason records sound so warm); but digital technology changed all that. With no physical limitations, engineers have gone over the top with compression. Why? Joe Gross, who wrote the article, calls compression the audio equivalent of MSG. It's the audio equivalent of boldface type; it makes things "pop".
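To make the term concrete, here is a minimal, hypothetical sketch of what a compressor does to a signal. Real mastering chains are far more sophisticated (attack/release envelopes, multiband processing, look-ahead limiting), but the basic shape is the same: squash the peaks, then boost everything with "make-up gain" so the whole track reads as uniformly loud. The threshold, ratio, and sample values below are made up for illustration:

```python
# Minimal sketch of dynamic-range compression (illustrative only,
# not any specific mastering tool).

def compress(samples, threshold=0.5, ratio=4.0):
    """Attenuate any sample whose magnitude exceeds `threshold`,
    scaling the excess down by `ratio`."""
    out = []
    for s in samples:
        mag = abs(s)
        if mag > threshold:
            mag = threshold + (mag - threshold) / ratio
        out.append(mag if s >= 0 else -mag)
    return out

def makeup_gain(samples, target_peak=1.0):
    """Boost the compressed signal so its loudest sample hits
    `target_peak` again -- this is what makes everything "loud"."""
    peak = max(abs(s) for s in samples) or 1.0
    return [s * target_peak / peak for s in samples]

# A toy signal with quiet verses and loud choruses:
quiet_and_loud = [0.1, 0.2, 0.9, 0.15, 1.0, 0.1]
squashed = makeup_gain(compress(quiet_and_loud))
# After processing, the ratio between the loudest and quietest
# samples shrinks: the dynamics are gone.
```

The quiet passages come up, the loud passages stay pinned at the ceiling, and the contrast between them -- the "soft" that, per the engineers quoted below, the ear needs to rest -- disappears.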

But it also seems to make listeners' ears hurt. The article cites a letter written by an exasperated Sony A&R man:
"The mistaken belief that a 'super loud' record will sound better and magically turn a song into a hit has caused most major label releases in the past eight years to be an aural assault on the listener," Montrone's letter continued. "Have you ever heard one of those test tones on TV when the station is off the air? Notice how it becomes painfully annoying in a very short time? That's essentially what you do to a song when you super compress it. You eliminate all dynamics." For those already confused, Montrone was essentially saying that there are millions of copies of CDs being released that are physically exhausting listeners, most of whom probably don't know why their ears and brains are feeling worn out.
Another recording engineer concurs:
The brain can't process sounds that lack a dynamic range for very long. It's an almost subconscious response. This is what Montrone was talking about when he mentioned the TV test tone. "It's ear fatigue," Tubbs says, "After three songs you take it off. There's no play to give your ears even a few milliseconds of depth and rest." Alan Bean is a recording/mastering engineer in Harrison, Maine. He's a former professional musician and a doctor of occupational medicine. "It stinks that this has happened," he says. "Our brains just can't handle hearing high average levels of anything very long, whereas we can stand very loud passages, as long as it is not constant. It's the lack of soft that fatigues the human ear." This is part of the reason that some people are really fanatical about vinyl. "It's not necessarily that vinyl sounds 'better,' " Bean says. "It's that it's impossible for vinyl to be fatiguing."
Gross connects this phenomenon to the attention wars that play out in consumer society, with everything competing to be heard. Just as the relentlessly loud record wears us out, the relentlessly nagging media culture, the inexorable progress of ad creep, the invasiveness of entertainment and information access seem to have a tendency to shut our bodies down, and we stumble through life in a state of sensory fatigue. What happens then? Vulnerability? Susceptibility? Depression? Stress? Psychosomatic ailments? While individuals have the opportunity to go Luddite and hole themselves away (with their vinyl record collections), the process seems irreversible at the social level; the causes present themselves as solutions.

Extended warranties (6 October 2006)

Buying extended warranties is foolish, as this Washington Post article clearly shows.
"The things make no rational sense," Harvard economist David Cutler said. "The implied probability that [a product] will break has to be substantially greater than the risk that you can't afford to fix it or replace it. If you're buying a $400 item, for the overwhelming number of consumers that level of spending is not a risk you need to insure under any circumstances."
Since extended warranties don't typically cover wear and tear damage -- the main reason consumer goods fail -- you would basically be buying insurance that covered an extremely unlikely event, that a product would suddenly become a lemon after the manufacturer's warranty lapsed. At the point an extended warranty kicks in, you'd generally be better off replacing whatever item it is with the up-to-date model rather than having a third-party repairman of the insurance company's choosing fix an outdated piece of electronics, probably at great inconvenience to you. You would do better putting that extended warranty money into a slot machine and setting aside whatever money resulted in a repair/replacement kitty.
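A quick expected-value sketch shows why the arithmetic rarely favors the warranty. Every number below except the $400 item price from Cutler's example is hypothetical, chosen only to illustrate the comparison:

```python
# Back-of-envelope expected-value check on an extended warranty.
# All figures except the $400 item price are hypothetical.

item_price      = 400.00   # the $400 item from Cutler's example
warranty_price  = 60.00    # hypothetical cost of the warranty
p_covered_fail  = 0.05     # hypothetical chance of a covered failure
avg_repair_cost = 150.00   # hypothetical out-of-pocket repair cost

# What the warranty pays out on average:
expected_payout = p_covered_fail * avg_repair_cost

# The warranty breaks even only if the failure probability exceeds
# price / repair cost -- an implausibly high rate for most goods,
# which is Cutler's point.
breakeven_prob = warranty_price / avg_repair_cost
```

Under these assumptions the expected payout is a small fraction of the warranty's price; the gap between the two is the retailer's margin, which is exactly why the product exists.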

Behavioral economists point to extended warranty purchases as an example of irrational risk sensitivity, but it seems to me more a case of asymmetrical information. Spending makes consumers feel vulnerable, and retailers exploit that discomfort by selling them an insurance product they know their customers don't really need. Customers buy a sense of well-being that evaporates, probably the minute they walk away from the register, away from the salesperson's nagging predictions of doom. (A variant on this is the pernicious practice of rental-car agents forcing unnecessary insurance on customers in an even more confusing retail and regulatory environment, typically conjuring up doomsday scenarios and implying legal ramifications for customers that are dubious to say the least.) You end up with the feeling that the company hopes the product it just sold you will break, to spite you for rejecting their warranty -- which is where it makes its money.

For what's startling, and what helps explain their popularity, is this fact, also highlighted by Tyler Cowen in this Marginal Revolution post: "Neither Circuit City nor Best Buy discloses how much of its bottom line comes from extended warranty sales. But analysts have estimated that at least 50 percent and in some lean years 100 percent of profits at the electronics retailers come from extended warranty sales." No wonder the salespeople are so pushy.

Despicable, but all too rational (4 October 2006)

Wal-Mart is by no means the only employer guilty of the labor practices this NY Times article details, but as Ezra Klein never tires of pointing out, Wal-Mart sets the standard that others will have to follow to be able to compete. (After all, it is the world's largest private employer.) These most recent moves -- purported to give the company the flexibility to deal efficiently with fluctuations in store sales volume due to seasonal variation and the vagaries of the business cycle -- seek to undermine the benefits that accrue to employees through longevity. Also, older workers are prone to expensive inconveniences like sickness and tend to be more "inflexible" in their ways (less amenable to having their hours rejiggered at management's whim), all of which puts a burden on the company.
some Wal-Mart workers say the changes are further reducing their already modest incomes and putting a serious strain on their child-rearing and personal lives. Current and former Wal-Mart workers say some managers have insisted that they make themselves available around the clock, and assert that the company is making changes with an eye to forcing out longtime higher-wage workers to make way for lower-wage part-time employees.
Since most workers in discount retail don't really gain any skills from long-term employment, and since they have been successfully prevented from unionizing, they are easily and ideally replaceable every so often, before they reach any service-related goals and raises that may have been dangled before them to keep them striving and focused while on the job. "These moves have been unfolding in the year since Wal-Mart’s top human resources official sent the company’s board a confidential memo stating, with evident concern, that experienced employees were paid considerably more than workers with just one year on the job, while being no more productive. The memo, disclosed by The New York Times in October 2005, also recommended hiring healthier workers and more part-time workers because they were less likely to enroll in Wal-Mart’s health plan." Experienced employees figure out how to make the employer's system work more to their advantage. That's why you need to lean on them until they quit.

This is in no way surprising. Employers have no incentive to show any loyalty to their employees -- the illusion that they have ever cared has always stemmed from the pressure the existence of strong unions exerted on them. Pensions, benefits and such -- the entire concept of human resource departments (which are detestable precisely because they pretend to perform the function of union representatives while working in management's interest) -- were often concocted to forestall the progress of unions. But things have changed, and employers have nothing to fear anymore, nothing to prevent them from shifting all the insecurities of the business cycle onto workers, those least fit for coping with them. As Klein explains,
Folks forget sometimes that unions aren't just there to argue for better benefits and salaries, but better working conditions, more stability in hours, more respect for seniority, and easier mediating between family and work. They exist, in other words, to ensure that employers uphold their end of the "work hard and get ahead" bargain. Except, unions don't really exist anymore, and they certainly don't at Wal-Mart. This is the result.

The point is that there is no such bargain in American society, and that there ever was one in the good old days is an illusion. Employers regard labor as a cost to be controlled, not as people whose welfare needs to be considered -- such bleeding-heart sentiment was proven useless with the "defeat" of socialism and the proclaimed end of economic history. If workers and employers both prosper, it's not because of some spirit of fair play and ethics, it's not because some employers are congenitally nice and paternalistic; it's because both sides have leverage over each other that forces them to split the proceeds. The bargain, to the extent that it existed, was forced by labor having a representative in the negotiation in the form of unions. Unions, though, have been systematically stripped of their ability to organize effectively, and the NLRB is staffed with Republicans hostile to their very existence. So employers rationally extend their advantage and insulate stockholders at the expense of employees. This leaves workers to fight with other workers for what protection remains, continually undermining one another while the company blithely sails along, meeting its growth targets and pleasing Wall Street. It's an old story, and it probably sounds like a string of leftist cliches, but its utterly boring predictability -- and the reluctance it breeds to tell that same old tired workers-getting-screwed story yet again -- is one of the most potent weapons in management's arsenal.

other Desert Cities

Part of my trip to Southern California was spent in Palm Desert, one of the "other Desert Cities" referred to on the I-10 exit sign for Highway 111, which runs from Palm Springs out to Indio, before it heads to that environmental disaster area known as the Salton Sea. On the way there from L.A., you first pass through the wind farms in the mountain pass and the creepy rows of wind turbines that render the landscape alien and foreboding. Harvesting wind energy seems a good idea, but still, the hills seem to have been colonized by some relentlessly churning alien life-form -- I felt like I understood the concept of visual pollution at a visceral level. The whirring blades are mesmerizing, in a bad way. They create a delirium of planes and angles shifting and changing in a lulling rhythm, making it impossible to see anything else. It's a wonder there aren't more accidents on that winding downhill stretch of the freeway, where it seems like the average traveling speed is around 85 miles per hour. Beyond the turbines, you enter Palm Springs, the desert city that is not "other" and is the oldest of the group. It's an unremarkable town that sits in the shadow of a stupendous mountain. The sublimity of the landscape makes the human doings there seem a bit insignificant, piddling, so it's suitable that most of what goes on there is golf and tchotchke shopping. From there, on 111, you enter Cathedral City, then Rancho Mirage (home to the Betty Ford Clinic, a rehab center), then Palm Desert, where we stayed. From the highway these towns are indistinguishable -- just one shopping strip after another, with some hotels interspersed here and there. Streets are named for moribund performers: Dinah Shore, Bob Hope, Frank Sinatra, Fred Waring. I could think of no good reason to be there, and that was what made it perfect.

Unlike most touristy places, which garishly try to cajole you into doing and spending, thrusting temptations your way and working to intensify your restlessness, Palm Desert was an oasis of sobriety. No wonder the rehab centers are there. No wonder people talk of going to the desert to dry out. At night -- we were there on a Saturday night, and it was quiet as the moon -- even the lights were subdued; so much so that most of the stores and restaurants seemed to be closed, seemed indifferent to our business. There were no activities imploring us to attend -- no bands playing, no limited-run repertory cinema, no places to see or meet people, no night life of any kind. It seemed we were entirely alone. It was beautiful.

Having nothing to do and feeling no pressure to do anything exciting are two very different things. It seemed like anything we turned to was going to be fulfilling. We went to the outdoor pool in the warmth of the evening and sat in the hot tub, and when we got too hot, we went swimming. We met a few recent graduates of the "program," which seemed to be a Betty Ford euphemism. We went to eat at an anonymous chain restaurant and received pleasant, generic service. We felt like nobodies in nowhere land. I wished we had booked a longer stay.

I'm always troubled by forced leisure, so much so that vacations rarely feel warranted or comfortable to me; they often seem like an alternate form of work. I feel like I'm trapped in what Baudrillard calls the fun morality, the obligation to treat leisure productively, to use it to manufacture distinction if nothing else. It's very hard to just waste time, to let yourself destroy it. The pressure can become intense to find something useful to do with the vacation time, made artificially precious by the meaningless work it's framed with. It can lead to moments of self-consciousness within the vacation -- which remove one from the present moment and place in time and send one to the purgatory of hypotheticals and second guesses: Am I really living up to the time I've been allotted? Has this all been worth it? Worth what? What is the point of comparison?

Living in New York, I'm constantly aware of ambitious people, and the pressure they put on themselves and the people around them. It's in the pace of everything that happens, and I become infected with it -- it shows in the way I am ready to run people over on the sidewalks when they aren't going fast or in the impatience I freely exhibit when the person in front of me dodders around for exact change while I'm anxious for my coffee. Los Angeles has similarly ambitious people, though ambition seems to exhibit itself there as a kind of desperation to be paid attention to rather than a heedless haste. But when you reach the other Desert Cities, ambition seems a million miles away. Urgency is unthinkable there. It dawned on us that this could be the point of the place, to evaporate ambition in the dry heat and leave you adrift in an endless expanse of undifferentiated time. The ultimate vacation is from ambition, from the need to score distinctive accomplishments -- to remove yourself from the ongoing competitive status game that haunts our every action. In the desert cities, places that don't especially want tourists so much as retirees, who are beyond ambition and anxious only to fill out the rest of their days with pleasant distraction, that vacation, possibly a permanent one, is always waiting.

The security of humorlessness (2 October 2006)

I just got back from a brief vacation in Southern California. On the flight back we had a layover in Houston, during which I heard an announcement warning that anyone making jokes about security issues could end up facing imprisonment. At first I thought this was a bit draconian and totalitarian: the TSA was going to dictate my sense of humor to me while I was in the deadness of airport space. But I also thought there was probably no need to be making jokes about security because there's probably no chance that they would be funny. And anyone making such jokes may very well not be joking. It seems as though many dangerous situations, as they begin to unfold, seem like a joke. A homeless person approaches you, makes a request that slowly becomes a demand, and possibly you think, What are you, joking? A teenager tells you to hand over your iPod as you are walking by; it might even seem like a joke as he's coming at you. One of my flights out to Los Angeles had to make an emergency landing (which sounds much more dramatic than it was; it was just an unscheduled stop at the Newark, New Jersey, airport) because the passenger across the aisle -- a uniformed crew member from the plane's previous trip -- was having trouble breathing and seemed like he was about to die. My thought was initially that it was all some kind of joke, some kind of test, not really happening, not really spontaneous and unintentional. It may be that life is so rationalized and bureaucratized that we must regard anything spontaneous and unintentional as some kind of joke. It starts to become clear, when you travel down that path of reasoning, why you would get on the airport intercom and forbid joking around.

But then, if you were joking, and it didn't just seem as though you were joking, then theoretically you aren't serious about what you say, by definition, and you are announcing your harmlessness. What sort of terrorist would joke about his plan in the airport? If it can be identified as a joke, then the harmlessness of the joker is known, and the punishment is just spite for trying to puncture the illusion of protection the TSA veils the airport in. We have all seen the reports of how easy it is to defy the rules at the screening checkpoints and smuggle in contraband fluids or pass through the metal detector without removing our shoes or commit various other prototerroristic acts. What the TSA relies on is a humorless attitude about security, which plays out in every traveler monitoring every other traveler, studying them for signs of suspiciousness. Catching them joking about bombing a plane is not really the point, of course; sending the message that mutual suspicion and snitching over trivialities are encouraged is what it's about. TSA officials perhaps hope this climate can serve a deterrent function. It's hard to imagine, however, who would be incompetent enough to be deterred by citizens on patrol.

Also, isn't it terroristic to show disaster films like Poseidon as the in-flight entertainment? Don't they think passengers can make the simple analogy of air travel with boat travel? And I would like to send a special shout-out to the sadist next to me on the flight from Newark to Los Angeles. United 93 was a great choice for your portable DVD player. After the medical emergency landing and the asphyxiating crew member I had just witnessed, it really was the coup de grace. Thanks for sharing that experience with me.

Risk compensation (2 September 2006)

In my post about aggressive driving, I mentioned that particular species of economic reasoning that holds you are less safe if you wear seat belts, because you will drive more recklessly -- you diminish your incentive to be careful by having taken safety precautions earlier. Steven Landsburg makes that classic case in the first chapter of The Armchair Economist. Mark Thoma has another example here, where he links to a study that reveals "Cyclists who wear helmets are more likely to be knocked off their bicycles than those who do not, according to research. Motorists give helmeted cyclists less leeway than bare-headed riders because they assume that they are more proficient. They give a wider berth to those they think do not look like 'proper' cyclists, including women, than to kitted-out 'lycra-clad warriors.' " Tyler Cowen takes the opportunity to remind readers of the Tullock Effect, which argues that the most important safety device one could add to a car is a spike mounted on the steering wheel pointed at the driver's heart.

This is precisely the sort of economic thinking that non-economists find baffling, if not repellent, because it seems smugly contrarian, mimicking the perversity tropes that Albert Hirschman identified as the hallmarks of reactionary rhetoric. Not only do helmets not make you safer, they put you at greater risk. When you make your silly little attempts at affecting what will happen to you, you actually undermine yourself. But economists aren't typically reactionaries. They seem to prefer to see themselves as radical truth-tellers, burning away clouds of rationalization and demagoguery to reveal the consequences of incentives at work. But I wonder if there isn't some kind of risk compensation going on for economists themselves, snug in the safety of their own mathematical models, which protect them from the ambiguities of the world they have rendered invisible.

Mortgaged future watch (28 September 2006)

The front page of Monday's Wall Street Journal is the gift that keeps on giving to me. Another article there discusses the problems America may face because of its enormous current-account deficit, and explains it all much better than I could in the last column I wrote. It's extremely surprising to see something this bearish in the WSJ; it's like a stiff dose of fiscal castor oil. Here's the lead:
Over the past several years, Americans and their government enjoyed one of the best deals in international finance: They borrowed trillions of dollars from abroad to buy flat-panel TVs, build homes and fight wars, but as those borrowings mounted, the nation's payments on its net foreign debt barely budged. Now, however, the easy money is coming to an end. As interest rates rise, America's debt payments are starting to climb -- so much so that for the first time in at least 90 years, the U.S. is paying noticeably more to its foreign creditors than it receives from its investments abroad.
So what, right? The significance of this is that "in years to come, a growing share of whatever prosperity the nation achieves probably will be sent abroad in the form of debt-service payments. That means Americans will have to work harder to maintain the same living standards -- or cut back sharply to pay down the debt." Says economist Nouriel Roubini: "Your standard of living is going to be reduced unless you work much harder. The longer we wait to adjust our consumption and reduce our debt, the bigger will be the impact on our consumption in the future." So the heedless consumption of today is coming at the expense of posterity -- we're consuming the sweat of our children's brow. This will cause an especially thorny difficulty if we fail to have those children -- which, if you believe demographic doom-mongers like Laurence J. Kotlikoff, is already a problem. See also here for a chart explaining why Americans should forget about retiring.

And this also puts the future of our economy in the hands of China. China's borderline-irrational predilection for our T-bills, at a lower rate of return than it could get investing in its own country, has permitted our spending binge -- "Foreigners have been willing to accept a much lower return on relatively safe U.S. investments than U.S. investors have earned on their assets abroad. Take, for example, China, which since 2001 has invested some $250 billion in U.S. Treasury bonds yielding around 5% or less -- part of a strategy to boost its exports by keeping its currency cheap in relation to the dollar." If China decides to start dumping this debt, it would roil the dollar and send its value plummeting, diminishing our precious purchasing power. Now, it's not really in China's interest to do so; American consumers have helped fuel its healthy growth rates. But it has been known to do some counterproductive things in the past.

Aggressive driving (27 September 2006)

As a frequent traveler on New York City's roadways, I'm a huge believer in aggressive driving -- which I define as taking every advantage other drivers give you to move forward toward your destination. Much like the maximizing selfishness that organizes neoclassical economics, driving aggression provides a simple and predictable principle by which to predict what others will do in any given situation, making navigation through traffic proceed according to a regular logic. This is why driving amidst taxicabs makes me feel secure -- they are the most consistent in their maniacal aggressiveness, perhaps because they have an economic motive to be so and acquire more fares. So it all knits up nicely. Anyway, what this means in practice is that one must be conscious of never showing weakness on the road, never showing signs of yielding as lanes merge around an accident or a construction site, or else you'll be repeatedly cut off. You can't let pedestrians start crossing against the light, or else a wave of them will block the intersection. You have to totally habituate yourself to accelerating when approaching an obstacle in your lane so as to get in front of the cars beside you in other lanes -- merges happen more easily at higher speeds. Most of all you have to trust that other drivers will have the sense to avoid you when you have to make a blind merge, say from Roosevelt Avenue or Queens Boulevard onto the BQE, or from the Grand Central onto the Interborough Parkway, possibly the most dangerous interchange on the Eastern Seaboard. Believing in the good sense of other drivers and presenting them with a uniform code of behavior becomes all the more important as traffic patterns grow more snarled and congested. Nothing is worse than "polite" deferential drivers, whose eagerness to be liked or to be "fair" introduces all sorts of chaos into the system. Traffic is not about justice; it is about flow.

Cautious drivers, as well, undermine things; they tend to be indecisive, disrupting the flow and creating chain reactions of unpredictability. Worst of all are confused drivers, who forget about the other cars around them as they begin to panic over finding their way. My philosophy is that it's better to be lost than to crash, Bonfire of the Vanities notwithstanding. Missing your exit, I think, is better than veering across lanes of traffic at the last minute at inappropriately slow speeds because you didn't know where you were going.

But Monday's WSJ article about Belgium's traffic problems made me wonder about the limitations of my driving philosophy. The opening of the article is great:
The intersection outside Isabelle de Bruyn's row house in a quiet residential neighborhood here is a typical Belgian crossroads. It has no stop signs. Now and then, cars collide outside her front door.
"The air bags explode. One car flipped over in the street. Part of one car ended up here," says Ms. de Bruyn, a real-estate agent, pointing to her front steps. Her brother-in-law, Christophe de Bruyn, adds: "In America, they have stop signs. I think that's a good idea for Belgium, too."
The suggestion isn't popular at the Belgian transport ministry. "We'd have to put signs at every crossroads," says spokeswoman Els Bruggeman. "We have lots of intersections."
I love that -- we can't put up signs; there are too many intersections! There's an almost touching faith on display about the good sense of drivers, of humans in general, that they don't need a sign telling them what to do at every juncture. It's as though Belgium is refusing to acknowledge that society has become so complex as to require bureaucracy to administer it. And signage has the potential problem of making people less cognizant than they should be; if they start to rely on signs to dictate all their driving behavior, they stop using common sense and stop being so alert. This sort of counterintuitive reasoning, favored by libertarian economists, is usually brought out to explain why wearing seat belts or helmets makes you less safe (you and others around you feel safer and all let their guard down) or why social welfare programs generate moral hazards. Any kind of shared social responsibility (dictated by signs or prompted by legislation or manners and mores) theoretically erodes personal responsibility and the state of total vigilance we are presumed to adopt in the state of nature.

So it would seem like Belgium should have the safest streets around, with everyone personally responsible at every moment for how to proceed, based on a simple guiding principle: yield to the driver on the right. The emphasis on personal responsibility quickly leads drivers to escalate the principle into the kind of relentless aggression I was just advocating, an egomaniacal pursuit of maximizing one's traffic advantage. This has not-so-salutary results:
A driver in Belgium who stops to look both ways at an intersection loses the legal right to proceed first. Such caution might seem prudent, given the lack of stop signs. But a driver who merely taps his brakes can find that his pause has sent a dangerous signal to other drivers: Any sign of hesitation often spurs other drivers to hit the gas in a race to get through the crossing first. The result is a game of chicken at crossings, where to slow down is to "show weakness," says Belgian traffic court lawyer Virginie Delannoy. Neither driver wants to lose this traffic game, she says, adding: "And then, bam!"
Traffic becomes yet another zero-sum game, another quotidian task turned into an occasion for intense competition for its own sake.

So all this leads me to think that my attitude toward aggressive driving contributes to the general spread of a "There's no such thing as society" mentality that rejects the social safety nets, etiquette, and so on that make social existence run more smoothly in a spirit of mutual cooperation and occasional sacrifice. Yet I can't imagine venturing out onto the FDR Drive with a different attitude; I can't imagine not believing that the collective welfare is better served by my selfish commitment to thrusting myself forward at every possible instance. Am I under the spell of ideology when behind the wheel -- in the quintessentially American role of individual driver, of my own car and destiny -- or am I really a small-government conservative in what I do, if not in what I say?

The snobbery of Wal-Mart rejection (26 September 2006)

When it comes to Wal-Mart, I can get fully behind critiques that attack its monopsonistic power or its systematic suppression of unionization efforts or its failure to provide adequate benefits. I can even understand the rationale behind Chicago's attempt to force big-box stores like Wal-Mart to pay a "living wage."

But I draw the line at fighting Wal-Mart because it's somehow uncool to shop there: this WSJ story, which details how urban environments are trying to prevent Wal-Mart from opening in their midst, cites Boston mayor Thomas Menino claiming that "Wal-Mart does not suit the clientele we have in the city of Boston." If that were really the case, why make any efforts to prevent them from opening? The store would just go out of business when the clientele fails to materialize. But the mayor knows all too well the store would succeed and be a magnet for lower-income families, for whom the store's cheap goods often provide a real boost in living standards. The mayor favors Target, but not because of its wages or benefits: "It's a different image they have in how they market their product and the appearance of their stores," he says. "That's a lot to do with it, the image of the store." In other words, middle-class shoppers come to Target, whose presence in your neighborhood is likely to improve your property values rather than call them into question. With this classically smug latte-liberal utterance, Menino justifies the cynical culture-war griping of many a Republican reactionary. Ooh, we don't want Wal-Mart's trailer-trash reputation in our precious city, it just wouldn't be Bostonian. What would the Brahmins think? It makes everyone who has ever complained about Wal-Mart seem a little bit more hypocritical in the eyes of its defenders, as they suspect these kinds of aesthetic niceties lie at the root of every protest.

In fact, that's probably part of the reason this story is above the fold on page one of the Journal, and these mayoral quotes feature prominently and early. (It even forced an especially amusing A-hed story, about Belgium's lack of stop signs and the resulting frequency of crashes, down below the fold.) It presents such a perfect picture of those politicos and urbanites who reject Wal-Mart as snobs, and Wal-Mart as the unfortunate victim of unfair discrimination. A city commissioner from Miami is even quoted saying, "I feel bad for Wal-Mart, but that's their image." No one needs to "feel bad" for Wal-Mart, which remains the largest American retailer by a long stretch, in large part because of its successful efforts to broadcast loud and clear its image as a ruthless discounter. Wal-Mart would love to portray itself as a victim of its own irresistibility. The WSJ and Boston's mayor have conspired to oblige, and to make anti-Wal-Mart agitation seem like a variant of the same bourgeois busybody-ism that animates homeowners' associations who fret about people who fail to mow their lawns with sufficient frequency.

But the reason to organize against Wal-Mart is never because it is popular or déclassé; such trivial motives will inevitably trivialize the entire effort. You can't snub the people who shop at Wal-Mart as the wrong sort of people and then complain about the wages the company pays to those very same people; chances are the class bigotry does much more damage than Wal-Mart's business practices ever could do.

Emulating machines (25 September 2006)

Science fiction often exploits the fear that we will invent computers that will become smarter than us and then attempt to extinguish our flawed, feeble, morally compromised race. The excellent Battlestar Galactica, whose third season starts soon (expect a hype barrage like the one recently rolled out for The Wire), does some of the most interesting stuff with this trope, mainly by making the robots indistinguishable from humans and giving them an eschatological worldview. The Cylons have a commitment to quasi-spiritual ideals, lending the conflict religious-war overtones that have obvious significance for the alleged clash of civilizations currently taking place in reality. The robots' unflagging commitment to their beliefs underscores the way humans waver and are repeatedly vulnerable to betraying one another. We don't ever root against the humans -- as we do in Paul Verhoeven's Starship Troopers, in which it slowly dawns on us that the humans are fascists, the real villains of the movie, and the mechanical-looking insects deserve our sympathy -- but we can't fail to see the implication that humankind tends to fracture into warring camps in the face of an implacable enemy. And there are the usual overtones of human hubris and tampering in God's domain and that sort of thing.

But eventually sci-fi will need to evolve a response to a phenomenon that's potentially far more frightening: Rather than robots seeking to eradicate humans, humans become so impressed with the efficiency of machines that they voluntarily seek to emulate them. It's already happening all around us. For example, the book Mind Performance Hacks, recently promoted by BoingBoing, promises "tips and tools for overclocking your brain" and comes fully loaded with a host of other brain-as-processor metaphors. The brain is the hardware and consciousness the output of resident programs. The attraction of computer metaphors is that they seem to solve human problems by allowing us to conceptualize them in a ready-made way that makes them seem easily solvable by the march of technological progress. Thus we talk of ideas as computer viruses, taking a biological metaphor that's been technologized and repatriating it for humans. We see our own minds as programmable, controllable, able to be applied to discrete focused tasks. We talk about plugging ourselves into networks and so on. We imagine social life as a massive operating system in which everything has a deliberate function, so that it can seem comprehensible and manageable. By imagining ourselves more like computers, we come to take the value system technology generates -- one almost hegemonic in business culture -- and apply it to our own behavior.

Well, come to think of it, this humans-wanting-to-be-hyperefficient-computers idea crops up even in the sci-fi I've seen (which is not much). There are the hyperintelligent mentats of Dune who drink a special potion to allow them to become human supercomputers. The Matrix depicted Keanu Reeves downloading information directly into his brain that became immediately functional -- a kind of patch or software update, as though the brain ran on third-party programs. The human brain was regarded as passive, alien to the person whose head it was in. It was simply a matter of overwriting it with whatever the person was supposed to experience. One becomes configured as an end-user of one's own brain, a mere consumer of the experiences it can be programmed to spit out. Consciousness is a step removed from the brain, which provides the data that consciousness enjoys, as though it were a film.

The Mind Hacks book takes mind-machinery a step further, promising to make the brain work more like a machine under the user's conscious direction, which implies the user's consciousness aspires to be more machinelike, more relentlessly productive. Rather than receiving data the brain spits out, consciousness merges with "subroutines" it can perform to think more mechanically, more efficiently. No doubt these things work -- these kinds of ideas for human perfectibility and increased mental acuity have kicked around before as mnemonics or Chisanbop or EST or hypnotherapy, bioengineering, Methedrine, etc. -- but what seems new is the insistence on the computer metaphor, as if to be a computer would be to live the dream.

My vague hypothesis about this is the following: that our economy's emphasis on technology as a means to produce perpetual growth and wealth is having the effect of making us think that by becoming more machinelike, we become more human -- we move closer to our human potential by mirroring the methods that have enhanced economic potential and productivity. This seems to fetishize information for its own sake. Information, now an unconquerable ocean, tempts us to master it through heroic feats of navigation, exploratory expeditions made purely for glory. Human potential, human experience may come to seem entirely a matter of information processing -- and the faster one's brain processes information, the more life one crams into one's allotted time on earth. Efforts to absorb all this information can become a kind of flow experience, a way of entering the "zone" associated with athletic accomplishment, and at that point one may seem to merge with the information itself, to become inseparable from its continual transmission. That might be the aspiration anyway, to become the best data you can be, so you still figure in the techno-future world. Social networking sites, which already seek to reduce ourselves (enhance ourselves?) to a flow of routinely updated data, may be the first florescence of this. And the burgeoning popularity of virtual spaces would be the next, integrating the data in a reconstituted virtual self, bringing people a step closer to having the field for one's identity laid out as a flexible, benevolent operating system, which lets one be ensconced in the safety of programming logic, having shifted existence to a space where inhibiting personal anomalies can simply be debugged.

Retail stocks (23 September 2006)

Amateur stock picking is generally a bad idea, and every straight-talking guide to personal finance will tell you to invest in low-fee mutual funds that track certain indexes -- take the guesswork out of it, since changes in stock prices are generally a random walk that no analyst or fund manager could predict. The theory is that whatever information an investor could act on is already priced into a security by the time you get your order for it in.
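The random-walk claim can be made concrete with a toy simulation (a sketch under the standard independence assumption, not a model of any actual market): draw each day's return independently at random, build a price path from those returns, and then check that yesterday's return carries essentially no information about today's.

```python
import random

random.seed(0)

# Toy random-walk price series: each day's return is independent noise.
returns = [random.gauss(0, 0.01) for _ in range(10_000)]
prices = [100.0]
for r in returns:
    prices.append(prices[-1] * (1 + r))

# Lag-1 autocorrelation of returns: if prices follow a random walk,
# yesterday's return tells you nothing about today's.
mean = sum(returns) / len(returns)
num = sum((returns[i] - mean) * (returns[i + 1] - mean)
          for i in range(len(returns) - 1))
den = sum((r - mean) ** 2 for r in returns)
autocorr = num / den

print(round(autocorr, 3))  # close to 0: past moves don't predict future ones
```

The price path itself wanders and can look trended in hindsight, which is exactly why retrospective stories about why a stock moved are so seductive.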

But this doesn't stop financial publications and financial service providers from pimping stocks and urging stock tips on readers. In One Market Under God Thomas Frank describes some of the hoopla about personal investing during the 1990s bubble, and what he calls "market populism." The idea was that anyone could use the stock market to get rich and that purchasing power rendered political power insignificant and made giant gaps between rich and poor immaterial. Part of the hype of the time regarded wise amateurs who could follow their gut and invest in companies whose products they believed in, as though it were as simple as having a good experience in a Home Depot (I know, a far-fetched example) and then phoning your broker the next day for 100 shares of it. Frank notes that one financial guru advised going to the mall and writing down the names of your favorite stores as a way to generate stock-investment ideas. Then you can have a personal stake in the success of the brands you prefer; you can cheer them on like sports teams, but have a legitimate reason for it.

I'm prone to do the opposite. Not that I'm a big-time stock picker, but whenever I read about recommended securities from the retail sector, I'm skeptical, and it has everything to do with my personal bias against brand-name shopping. I rationalize by thinking that it's foolish to bank on the overtapped American consumer's propensity to continue on a discretionary spending binge forever, but really it is that I don't want to believe that American Eagle Outfitters (AEOS) or Abercrombie and Fitch (ANF) are simply going to continue to grow; that duping teens with sexed-up advertisements can constitute a business strategy that Wall Street respects. I don't even want to take them seriously as businesses; I prefer to think of them as dark cultural forces that will be thwarted once everyone eventually wakes up and realizes how pointless brand-name clothes are. Investing in a company like Chico's (CHS) or Coach (COH) would not only be hypocritical, it would be against my utopian vision of the world, against what I want to believe about universal common sense. (Maybe this is precisely why I should be buying retail stocks. Never a bad idea to bet against utopias.) Perhaps the behavioral finance theorists have a term for this kind of bias, but I'm fully aware that it is irrational. But rejecting retail stocks because of a reactionary personal philosophy seems no less coherent than picking them because of the weather. And it turns out the weather is one of the most significant economic factors for retail stocks, perhaps more than fashionability or personal belief in the brand or a good feeling about a marketing strategy. Justin Lahart's column in Friday's WSJ noted the tendency for September's weather to determine a retail stock's fortunes:
September temperatures tend to vary a lot. And September is a crucial month for retailers. That means the weather plays an outsize role in the month's sales and can trump other economic factors, says Paul Walsh, a meteorologist at weather-analysis firm Planalytics, which advises retailers. September is when retailers, especially in the apparel business, are stocked with fall fare. Cool temperatures early in the season make it easier to sell sweaters and furry boots at full price. Last year, warm weather lasted across much of the U.S. until October, leading retailers to cut prices deeply in an attempt to clear inventory. The jolt of Hurricane Katrina also hurt many, meaning comparisons to last year are especially easy this month.
Obviously, if we follow the money, retailers must be scheming along these lines. Control the weather, control your portfolio. But it's amazing to me to think of all the sophisticated mathematical tools and spreadsheets and models and algorithms, and the vast sums of money at stake, and the myriad brokers and analysts who work every day to try to harness the market, and in the end the kind of logic that is seen retrospectively to have affected the market can run along the lines of "Retail is thriving because September was sort of cold and more shoppers bought sweaters."

Virtual fashion and zero-sum games (22 September 2006)

Whenever I read an article like this one, from today's WSJ, about spending real money for objects that only exist in virtual worlds like Second Life, my first thought is usually something along the lines of "How pathetic." I assume that the online life is a compensation for a circumscribed real life (as though mine were so free and uninhibited) -- without autonomous scope in reality, one seeks refuge in a virtual world where one has quasi-divine powers of generation. And that's not such a terrible thing, I guess, even though it leaves intact the existing institutions that crush aspirations in the name of "being realistic." Online, unconstrained by the givens of genetics and circumstances, one can build an entirely new self that conforms more closely to one's aspirations without having to undergo the struggles and compromises, without having to take the risks or confront the failures that one would while pursuing such ambitions in real life. You start off on a somewhat equal footing, but you make the initial decisions yourself about the context that will shape your Second Life destiny. Decisions have consequences on a much smaller scale, and aren't irreversible. No matter what you look like or what your ethical standards are or how poor you are in real life, you can be both a stripper and a fashionista in Second Life: "The scene -- drama and all -- keeps Janine Hawkins engaged in fashion in a way that wouldn't be possible for her offline. 'It's totally different to pay $15 to keep up with the fashions in Second Life than' the $1,500 that would be necessary in real life, she says. Her avatar, Iris Ophelia, originally paid for outfits by dancing at Second Life bars. 'Every time I had enough money, I'd run there and buy everything I could,' she says." 
One can leverage technology much more directly on the narcissistic project of identity, while shifting this project outside of oneself to appear to legitimize it, as if it were the same as making art or pursuing an entrepreneurial scheme. So in short, my immediate judgmental reaction is to see involvement with these worlds as the product of stunted, misdirected energy, and the economic transactions that mediate between real and online worlds as enabling the misdirection, as making the pretend world seem more real, like having a toy Fisher-Price gas station for your Matchbox cars.

But economic penetration into these worlds actually renders them less of an escape, because it introduces the very elements one may have been trying to flee from -- the competition for limited resources, the positional status games that come along with unequal distributions of income. Suddenly one's limitless autonomy is constrained not by the desired Pavlovian obstacles and rewards built into the game by programmers but by the very same intractable realities of money and status that it would seem one would use role-playing games to render insignificant. The invasion of real-life economic considerations is all the more likely in a game that doesn't dictate an objective, like Second Life: "There are no dragons or wizards to slay. Instead, San Francisco-based Linden Lab, the company behind Second Life, has provided a platform for players -- median age 32 and 57% male, with 40% living outside the U.S. -- to do whatever they want, whether it is building a business, tending bar or launching a space shuttle. Residents chat, shop, build homes, travel and hold down jobs, and they are encouraged to create items in Second Life that they can sell to others or use themselves." It almost sounds like an unbounded space wherein individuals can be left alone to construct their own fantasy lives without the constraints of social pressure or necessity -- a utopian space where both egalitarian and individualistic norms can prevail.

But human nature abhors a utopia. Without a specific fictive goal to pursue, the goals we improvise to direct our ambitions in real life will invade, and the anxieties that beset such ambitions will also follow them into cyberspace. And one of the fundamental invented ambitions to keep ourselves preoccupied is keeping up with fashion, or staying ahead of its curve. Sometimes fashionability is a proxy for wealth, another way of demonstrating it conspicuously. But often -- think of Lower East Side youth innovators, or spontaneous ghetto street styles -- fashion is an alternate means for accruing status, for participating in a game with winners and losers in the absence of other clearly delineated goals and in conditions where vast sums of money are inaccessible. Fashion creates a zero-sum game where none otherwise exists, one that no one has an excuse not to play, sating our need for "meaningful" competition and purpose across any boundary within a society. Hence Second Life's being overrun by the fashion business, which combines two compelling ways to create winners and losers:
Because Second Life creators own their products and can sell them, the game has attracted both professional and amateur designers, says Linden spokeswoman Catherine Smith. That has led to a thriving fashion scene that includes not just dressmaking but also jewelry, hair and even skin design, as people purchase the elements to create a look for their online alter egos. Selling virtual clothes to real people for their avatars can even be lucrative: In August, the 20 best-selling Second Life fashion designers generated a combined $140,466 in sales, Linden says. "We found out pretty quickly that people loved owning things," Ms. Smith says, and many start by buying items for their avatars. "It's not surprising that fashion and hairstyles and skins are as attractive and as exciting and as valuable as they are, because it's part of individualizing" the appearance of a player's online persona.
Individualization online is not an innocuous project of self-actualization but a competition, a contest, just as we are encouraged to see it in real life. Fashion, in order to thrive, must make sure we never forget it.

Adorno for pop critics (20 September 2006)

Yesterday I noticed that elsewhere at PopMatters was a review of Adorno's Philosophy of New Music, which, as the review's author, Patrick Schabe, notes, has nothing to do with "new music" as anyone outside of academic music programs would understand it. Those interested in mounting an Adornoesque critique of contemporary pop music would be better served by reading "Perennial Fashion -- Jazz," from Prisms. The essay, which isn't about jazz so much as it is about whatever musical product is being manufactured and marketed for the masses, features this immortal observation:
The aim of jazz is the mechanical reproduction of a regressive moment, a castration symbolism. 'Give up your masculinity, let yourself be castrated,' the eunuchlike sound of the jazz band both mocks and proclaims, 'and you will be rewarded, accepted into a fraternity which shares the mystery of impotence with you, a mystery revealed at the moment of the initiation rite.'

I think of this whenever the need to concentrate on what I'm listening to makes me feel peevish and disgruntled. Isn't music supposed to make me relax? Then I know for sure I'm in the illustrious fraternity. When I start to write a record review and I try to conjure criteria by which to judge it, I feel a bit haunted by Adorno's words; I know nothing I will say will dispel the notion that I'm pounding out my own castrated, syncopated rhythm on my keyboard. In "On the Fetish Character in Music and the Regression of Listening" (anthologized by Routledge in The Culture Industry), Adorno suggests that with the advent of commercialized music, "the concept of taste is outmoded" because "the subject who could verify such taste has become as questionable as has the right to a freedom of choice which empirically, in any case, no one any longer exercises." Taste may survive as a necessary fiction, more necessary for those who attempt to codify subjective judgments and share them as music writers, people who won't surrender to the idea that cultural product is just product whose significance lies merely in how sellable it is. Adorno notes "the golden age of taste has dawned at the very moment in which taste no longer exists."

Adorno's often criticized for his rampant elitism, which consists mainly of insisting that collectively our ability to even hear music has regressed to the point where authentic music has become incomprehensible to our infantile ears. As Martin Jay explains it in The Dialectical Imagination, Adorno believed that one of the effects of commercialized popular music was to regress a listener to an infantile state where "like children who demand only food they have enjoyed in the past, the listener ... could only respond to a repetition of what he heard before." Such a listener's attitude toward culture is like "that of the meaningless leisure of the unemployed." Commercialized music, by destroying the link between performance and listening (you can hear it in an inferior reproduction, with no apprehension of what's been lost, whenever it's convenient for you, the listener, and with no attention to the work involved in creating it), also destroys the Benjaminian aura of a musical work, which destroys its critical function, its ability to comment negatively on the culture and suggest something larger.

Adorno had none of the faith that later pop culture theorists have had in the ability of consumers to subvert culture industry brainwashing and find liberating, creative use for Justin Timberlake or episodes of Grey's Anatomy. That we mistake playing with the culture industry's toys for a kind of real freedom shows only how impotent and short-sighted we've become. As far as Adorno is concerned, popular music is never about active engagement but always about relaxation, of lulling to sleep the individual's critical awareness, of wallowing in passivity. (Is dancing a passive response to rhythmic music? Yes, in Adorno's mind. It's mimicking the martial movement of troops massed and marching past the dictator's parade stand.)

Critics try to invent standards that will insulate themselves from the consequences of music commercialization, from the reality of exchange value, which levels off all other forms of value and slowly but surely introjects itself into the populace so that top-sales lists are popularly held to be synonymous with best lists. "If one seeks to find out who 'likes' a commercial piece, one cannot avoid the suspicion that liking and disliking are inappropriate to the situation, even if the person questioned clothes his reactions in those words." I can pretend to subjective opinions based on criteria of my own devising, but my own ability to hear has already been too compromised to make these criteria anything but postures of resistance to the market, or collaboration with it -- I can either condemn sell-outs (or laud bands for "authenticity") or hype bands and build their PR image; that is what is left to the music reviewer.

Adorno has this cutting comment for my delusion: As a pop-music critic, I am like "The couple out driving who spends their time identifying every passing car and being happy if they recognize the trademarks speeding by, the girl whose satisfaction consists solely in the fact that she and her boyfriend 'look good,' the jazz enthusiast who legitimizes himself by having knowledge about what is in any case inescapable." As for the masses? "Where they react at all, it no longer makes any difference whether it is to Beethoven's Seventh Symphony or a bikini."

The general idea is that the standardization that comes with preparing a piece of music for the market makes whatever difference may exist between products superficial. Such works cease to be "autonomous" and thereby forfeit aesthetics. "An approach in terms of value judgments has become a fiction for the person who finds himself hemmed in by standardized musical goods. He can neither escape impotence nor decide between the offerings where everything is so completely identical that preference in fact depends merely on biographical details or on the situation in which things are heard." In other words, we choose favorite bands the way we choose favorite sports teams, when really they all play the same game by the same rules and are ultimately exchangeable for one another. We celebrate bands or singers for the cult of personality built up around them, or because we heard the music in conjunction with certain epochs in our own lives.

This relates to what neuroscientist Daniel Levitin found -- in This Is Your Brain on Music (reviewed here) he argues that the cerebellum, the lizard part of the brain that synchronizes movement, is as involved in music listening as the frontal lobe, the cognitive part. Adorno would probably argue that now the cerebellum works at the expense of the frontal lobes, so that we have a mechanical, rhythmic response to music that is pleasurable without ever having our intellect engaged. Levitin also suggests that at times of emotional tumult -- teenage years, etc. -- our brains are more likely to tag certain pieces of music with emotion and have them serve as emotional cues for the rest of our lives. Then as our brain wiring hardens, this tagging ability diminishes, and it becomes harder to become emotionally attached to new music. We recur to the stuff that is "nostalgic" -- the only music we're wired to instinctively appreciate. If Adorno's right, and we've let our intellectual means for appreciating music atrophy, and we've instead become reliant on these emotional/instinctive methods for assimilating music, at a certain point we become frozen; without the intellectual basis upon which to enjoy music we haven't heard before, we're incapable of taking pleasure in anything new. This lost ability is serious stuff to Adorno, because real music, like all real art, when heard with the intellect, offers listeners a means by which to criticize a given reality, to synthesize alternatives. Without it, we're trapped in the status quo, and worse, self-deluded into enjoying it as plenitude.

Blackjack, the new poker? (19 September 2006)

Televising blackjack, as CBS is planning to do, has got to be one of the more desperate attempts to cash in on the poker craze; the thinking seems to be that because they are both gambling games involving cards, audiences will embrace them both with equal fervency. (I guess that's why we aren't seeing World's Ultimate Keno Championship.) But playing blackjack isn't all that exciting; watching it even less so. What's the pitch? "You'll thrill as player A contemplates hitting a soft 17. You'll gasp when player B steals the dealer's bust card. You'll writhe in envy as player C splits eights with the dealer showing 6. You'll marvel as drunk tourists pass out at the table while trying to figure out what their cards add up to. Watch in amazement as player D doubles his initial betting increment after achieving a favorable count!" As this article in today's WSJ points out (buried in the last paragraph), quoting a LV Hilton gambling boss, "Blackjack is what it is; the game itself is strategy and if you know basic strategy, you either hit or you stand." Though CBS has likely tried to add the semblance of show-downs and drama, blackjack's not really a competition against other players so much as it is a feat of mathematics, of counting the deck and hedging one's bets accordingly. There are no bluffs, no betting ruses -- no psychology involved whatsoever if it's being played at the highest level. The only suspense is to see whether an audacious bettor turns out to be unlucky.

When I lived in Vegas, blackjack was what you'd play when you wanted to chill out from the gambling games that are actually exciting and move quickly (craps) or involve multiple levels of analysis and courage (poker). There is no need for assessing personalities in blackjack as there is in poker, and there is no infectious excitement or momentum to sweep you up into gambling euphoria, as there is at a craps table, when a hot roller is visibly multiplying chip stacks all around you and creating mountains of wagers before your eyes on the green felt and the dealers' patter becomes more animated, as they get cut in on all the action, and the bettors cheer like it's the World Series and total strangers start slapping high-fives with each other. At the blackjack table, a much more muted camaraderie develops, one that revolves around how frequently the cocktail waitress returns. The pace, with the frequent reshufflings to stifle card counters, can be glacial, and because it requires no time to make decisions about play (every choice about whether to hit or double down can be determined in advance -- some players bring a chart with them to the table), every pause by another player feels excruciating. When you can see the cards of your tablemates, as with some games, you want to shout out what they should do and get on with it. Really, blackjack is an extremely inefficient slot machine, with lots of room for human error (or crooked deck mechanics).
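That chart some players bring to the table is literally just a lookup from (player total, dealer upcard) to an action. A minimal sketch, covering hard totals only -- no soft hands, pairs, or surrender -- with entries following the commonly published basic-strategy chart (exact entries shift slightly with house rules):

```python
def basic_strategy_hard(player_total, dealer_upcard):
    """Return 'hit', 'stand', or 'double' for a hard total.

    dealer_upcard is 2-11, with 11 standing in for an ace.
    Hard totals only; entries per the standard published chart
    (some variants differ, e.g. 11 vs. ace under certain rules).
    """
    if player_total >= 17:
        return "stand"
    if 13 <= player_total <= 16:
        return "stand" if 2 <= dealer_upcard <= 6 else "hit"
    if player_total == 12:
        return "stand" if 4 <= dealer_upcard <= 6 else "hit"
    if player_total == 11:
        return "double"
    if player_total == 10:
        return "double" if dealer_upcard <= 9 else "hit"
    if player_total == 9:
        return "double" if 3 <= dealer_upcard <= 6 else "hit"
    return "hit"  # 8 or less: always hit

print(basic_strategy_hard(16, 10))  # hit
print(basic_strategy_hard(12, 4))   # stand
print(basic_strategy_hard(11, 6))   # double
```

The point of the sketch is the one made above: there is nothing to decide at the table, so every pause by another player is pure dead air.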

So I'm guessing TV blackjack will fail. You can't really learn anything from watching it, and it lacks drama because the outcomes seem almost entirely random. But maybe people watch gambling for some other reason, for a vicarious thrill at seeing money treated as points. No longer a matter of survival, money in gambling games becomes merely a means of keeping score, and perhaps there's something liberating in that.

Updating iTunes (18 September 2006)

I usually ignore the update notifications that iTunes pops up every time I reopen it, because I expect them to eventually drop some digital-rights hammer on my music collection and render it inoperable. (Kind of like what Microsoft apparently plans to do with its future Zune player.) It will start by doing some unrequested "analyzing", going through all my songs one by one, and then boom, none of them will work without some kind of certification. That's probably unduly paranoid, but at some point it seems inevitable. Eventually music players and subscription services will dominate the music industry (this article in today's WSJ about the innovations of Apple's digital music player rivals gives a peek at the future), and the idea of collecting music may become as moribund as collecting pogs. It's hard to imagine not claiming a sense of ownership over music, but it wasn't so long ago that the best people could do was own sheet music. Music must have meant something very different then; it was always an activity rather than ambient wallpaper or a passive hobby. So attitudes toward music are clearly very malleable and responsive to distribution technology. Future distribution may make music more like on-demand cable TV, where we pay monthly to check in and hear something new whenever we want to, or it may merge seamlessly with satellite radio. Once ownership of some tangible product is undermined as a motive for buying music, the stage is set for a resurgence of the significance of radio. What is the difference between radio and subscriptions to massive music libraries, anyway, other than who picks the songs? Most people want someone else to do that work anyway. I imagine the subscription services will offer playlists to download as well as individual songs.

Anyway, I broke down and upgraded to iTunes 7 last night, mainly because I was enticed by the promise that I could have the iTunes store get all the missing album artwork for me. Of course I had to log in to the store and let them store a credit-card number -- but I took the bait; it seemed a fair trade and I like seeing the covers. Apple has obviously decided that pushing album art is important to the next phase of digital music's takeover. Not only does the promise of all that free art get more customers into their database, one click away from purchasing media, it also brings the experience of digitized music ownership one more sensual step closer to accurately simulating the collecting experience. The new iTunes lets you browse your collection by album cover, which makes a surprising difference in terms of how I understand all the junk I've got on my hard drive. The program even has an option that lets you flick through a virtual shelf of "albums" with mouse clicks, letting you see all the covers lined up neatly next to each other as if they were mounted on a vertical Rolodex. It's still a little clumsy, but it definitely seems like a move toward a whole new paradigm for computerized music consumption that attempts to provide consolation for the loss of the fetishized object. Next some enterprising entrepreneur will get to work scanning the back covers.

Update: The program is extremely buggy for Windows and I've had to roll back to iTunes 6. Looking at pretty covers while browsing isn't worth my computer freezing up every time a new song comes on.

Perverse financial incentives (16 September 2006)

Apart from being an unconvincing defense of Benthamite utilitarianism, economist Richard Layard's Happiness is little more than a compendium of hedonics studies and some general conclusions about their implications -- it is useful as a bibliography if nothing else. One group of studies he cites has to do with performance-related pay, which he is anxious to reveal as a source of stress and hedonic inefficiency. (He also optimistically suggests we should celebrate the income tax as a wonderful way of encouraging a healthy work-life balance.)

But what I found most interesting was this: "Economists and politicians have tended to assume that when financial motives for performance are increased, other motives remain the same. In fact, these motives can change." Layard then cites studies demonstrating that financial motives tend to compromise pre-existing motivations, eroding whatever impulses we might have had to perform an action for its own sake: one found that people paid to solve puzzles will work at them less than those encouraged to solve them merely for the satisfaction of doing so. Apparently once we are paid to do something, we begin to believe that the pay compels us to do it, and the activity takes on the qualities of disutility economists associate with jobs in general -- that we must be compensated financially to waste our time working rather than enjoying leisure. It seems that being paid is a good way to destroy whatever pleasure we take in something; so strong is the alienating tendency of money and profit-tallying that when it intervenes we begin to separate from our involvement in what we are doing in the moment and revert to a position of calculation -- thinking about the future, thinking about theoretical maximization rather than actualizing any of that potential in the present moment. This would seem to have the effect of keeping work and leisure unfortunately opposed to each other, a separation that seems to begin with the compulsion to sell one's labor on the open market for wages.

So as long as we remain dilettantish about our hobbies, we can enjoy them; when we professionalize, we turn them into chores. This explains part of my failure to pivot from researching a dissertation to defending one -- I enjoyed learning as long as it was a hobby of mine to understand trends in 18th-century cultural production, but when it became a matter of packaging and selling that knowledge, I balked. This is what makes me something of a loser as far as our economy goes; I lack the willpower to stomach the loss in pleasure that comes with professionalization -- another word for the ability to power through that hedonic loss? Ambition, which may describe the internal quality of finding pleasure in professionalization rather than experiencing it as tumult and combat and compromise and self-reification. I feel the same way about making music: As long as I have no ambition other than to get together with my friends and make music, I enjoy the pleasures of creation; but if we begin to try to market our band as cultural producers, we're likely to enjoy it less and see it as yet another job, a set of responsibilities imposed from without. But I still perceive these feelings as a kind of failure in myself, balking at the moment when "reality" requires me to summon up ambition and confront the world as it is and integrate the product of my creativity with it in the only apparent way possible: commercially. This is probably why I resent certain bands who are on the brink -- because their music suddenly seems to be about making those compromises and finding the wellspring of ambition to be more than dilettantes, to be content with more than just their own pleasure.