Friday, April 29, 2011

Small business sentimentality (30 Dec 2007)

Yesterday, as I walked to the other side of the neighborhood to get my hair cut, I noticed that a little independent coffee shop that had opened only a few months ago was already closed. This didn't surprise me at all; it was more of a whimsical notion than a coherent business. They sold hip, old-timey nostalgic things like campy paperbacks and candy in 1950s-era packaging, and they also sold homemade pies at exorbitant prices. It had two tables in the back and one in front, not enough room for anyone to loiter comfortably, but enough to make you feel that someone ought to be loitering and wasn't. The baristas, if you'd call them that, were generally on their cell phones or deep in conversation among themselves instead of dispensing service. And its location, sort of on the way to the subway for that side of the neighborhood, was adequate but not prime. Maybe, if this article by Taylor Clark from Slate is to be taken as gospel, they needed to be even closer to the Starbucks that's on the corner a few blocks away.

Clark argues that far from putting mom-and-pop coffee shops out of business, Starbucks teaches local customers to elevate their tastes and to find it reasonable to spend a lot on coffee.
Each new Starbucks store created a local buzz, drawing new converts to the latte-drinking fold. When the lines at Starbucks grew beyond the point of reason, these converts started venturing out—and, Look! There was another coffeehouse right next-door!
The reason Starbucks doesn't obliterate its competition the way Wal-Mart does is that it has far fewer overhead advantages, and the ones it does exploit (cheap, automatic espresso machines) degrade its product. Often, a chief aspect of the service it sells -- convenience -- is spoiled by its own popularity. And we all know how sentimental latte liberals can be about "anti-corporate" businesses -- independent retailers and the like. The presence of Starbucks right next door allows such people to express their political views with much more salience when they actively reject Starbucks for the small-time coffee shop, assuming all the while how clever they are and how much better the service will be from the local people who truly appreciate it.

I admit that such thinking drove me from my local Starbucks to the now defunct independent competitor. I had received a few lukewarm cups from Starbucks (and didn't have the time on the way to work to go back to the store, wait in line, and ask for a new one), and I remain fed up with Starbucks employees' inability to properly prepare an Iced Americano so that it doesn't turn into a piss-warm puddle of watered-down espresso. I figured the new local place would do a better job out of pride. As Clark notes, you don't beat Starbucks on ambiance but by providing a better product. But the local shop ultimately failed in this, serving its own lukewarm brew and sometimes making me wait while the counter person leisurely finished a private conversation. Maybe I'm sensitive (i.e., paranoid), but I hate when clerks are laughing with their friends as I approach. I hate disrupting a good time, especially when all I want is what the establishment is presumed to exist to provide. So I stopped going there and made another effort to get up earlier to have coffee at home in the morning. (Clark cites this shocking statistic: only 10 percent of coffee shops fail. That confirms that my neighborhood java entrepreneurs were unusually misguided.)

So, though little public failures always tend to depress me, I wasn't exactly sad to see this locally owned coffee shop fold. Instead, it was a reminder that I shouldn't make the mistake of being sentimental about mom-and-pop stores. It's the same temptation as being sentimental about small-town life, while forgetting the stultifying conformity and the routine invasions of one's privacy. One benefit of, say, going to Starbucks is that you preserve your anonymity, which is tantamount to remaining basically equal in the eyes of the clerks (though the tall guy with the glasses at the Starbucks on my corner remembers me and gets my small coffee ready without my having to ask -- this is more than the local place would do). Mom-and-pop places are much more prone to the petty graft that comes from familiarity and small-scale aspirations -- extorting tips, using variable, spontaneous pricing to take advantage of neophytes, and so on. Local places will play favorites with customers, decide who belongs and who doesn't, and work in various subtle and unsubtle ways to exclude those deemed undesirable. Some people will get "the nice guy discount" (if they have the gumption to ask for it) and others will get stonewalled mysteriously as they wait to be helped.

Ultimately, how one is treated in a shop depends on the disposition of particular employees, but national chains are more likely to insist on uniform service apart from local considerations. Sentimentality leads us to believe that those considerations work in our favor, that they are the sorts of things that knit us into a community. We forget that they can work the other way, and remind us of our arbitrary exclusion.

Disappointed bridges (28 Dec 2007)

The Economist's year-end double issue seems to be a catch-all for all the evergreen articles that could find no comfortable place in the magazine's rigorously formatted weekly structure. It's comparable in a way to the New York Times Magazine's annual year-in-ideas feature, but far more discursive, almost arbitrary. Among this year's "Christmas specials" (as the table of contents deems them) are articles about Mormons, poker, census-taking, skydiving, Esalen (with some sadly egregious typos in the dek -- probably a printer's error; my heart bleeds for their copy editors) and the sex life of pandas. The two I found most interesting were one about the rise and fall of shopping malls (a photography exhibit prompted me to write about them in the past) and another on the moribund entertainment piers found in coastal resort towns -- places in Atlantic City or Santa Monica where you are encouraged to free yourself of ordinary constraints and waste money on cheap distractions (my favorite is Skee-ball) and synthetic candy and the like.

Why commercialize piers? Not necessarily for their natural beauty, though that can be impressive -- I like staring out into the awesome nothingness, the endless horizon, as much as anyone. It's seductive on several levels, as piers strain and stretch to extend those horizons for us, even if it's for only a few feet. But as with malls and casinos, piers are disorienting places, making them ideal for separating people from their money. According to the article, pier owners turned away from a strategy of exclusivity to instead cater to ordinary people, believing that they "could profit instead from the mob's adventurousness: that sense of being in limbo, neither on sea nor on land, suspended in a state of fantasy."

Because piers are classic liminal spaces -- suspended between land and sea, breaching a conceptual boundary between seemingly immutable categories -- the article suggests they create an opportunity for a "sense of self-discovery." But a pier is also a place where you know you have reached the end of the line -- hence the frequency with which people attempt suicide from them. If one doesn't opt for that direction, it can be reassuring to know that at that point, there is no other direction to go but the way back. Retracing the same ground can be tedious, or a kind of tacit admission of defeat, but not at the end of a pier. Then I accept that there is no shame in going back the way I came.

Egalitarian customer service (26 Dec 2007)

This item from BusinessWeek about Southwest Airlines' recent adjustments highlights a dilemma between treating people fairly and treating them equally:
When Southwest Airlines (LUV) rolled out its new business fares and boarding procedures in early November, the carrier's blog quickly became crowded with comments. Nearly 500 impassioned remarks have been posted recently about the changes, which shook up Southwest's longstanding first-come, first-served boarding policy. The old method often meant long waits in line at the gate. The new way assigns passengers a specific place in line and gives priority to frequent travelers and people who pay extra for "business select" fares. Families with small children who don't check in early now wait longer to board.
To me, these sound like sensible changes. Who wants long waits at the gate? Isn't waiting in the insane security line with your shoes off and your pants falling down indignity enough? First-come, first-served makes sense on the Greyhound (though watch out for the ex-cons), but when you pay hundreds of dollars to travel somewhere, you should be able to book a seat. It makes much more sense to fill the seats in the order that they are purchased, or (most economists would likely argue) to use variable pricing to induce customers to pay extra for the privilege of securing advantageous seats. What is most fair, in terms of being most economically efficient, is to let each seat fetch whatever the airline can get for it. But when you sit beside someone on the same flight, enjoying a comparable square foot of personal space, and find out they paid hundreds of dollars less than you for the opportunity, this doesn't seem so fair. It seems like some kind of discrimination has taken place and that you've been had. At that point, a customer is likely to think, What, is he better than me? Why should he pay less than me for the same thing? Same goods, same price: it seems the democratic way.

Hence Southwest customers don't like that some customers can buy or travel their way into preferential treatment. Part of what travelers had paid for in flying Southwest, apparently, is the leveling experience of having to scramble for seats at departure time. It was a way to purchase an ersatz egalitarianism, since it stood in stark contrast to the first-class, second-class, etc., seating systems at other airlines. Getting that first-row seat on a Southwest flight because you camped out at the gate and earned it was a way of erasing extraneous advantages, of getting a perk that is ordinarily unreachable to those who can't afford to spend a fortune. What Southwest offered was an escape from money-based meritocracy, an escape from having what you are willing to spend serve as a proxy for your worthiness. Of course, most of these same consumers want precisely the opposite from their employers -- they want to be rewarded specially for their merit and for their special talents. Perhaps this inconsistency is a way that consumerism helps capitalist democracies smooth over the perpetual conflict between justice and equality, or to put it another way, between equal opportunity and equal outcomes. As part of the production cycle, we want meritocracy; we want disparate outcomes to reflect our different abilities, ambitions, and efforts. But in the consumption cycle that occurs simultaneously, we want the illusion of egalitarianism, of an equal outcome regardless of effort or ability -- we want the shortcuts and the conveniences, and the feeling that no one else's money is better than our own.

But that doesn't take into account positional goods, which people consume specifically to destroy the spirit of egalitarianism. Positional goods allow us to express, in the realm of consumption, the class prerogatives and inherited advantages that distort our opportunities in general, even though the market would seem to afford the same opportunities to all. The illusion of the democratic marketplace is useful to a point -- to keep a class-riven society complacent through the magic of purchasing power -- but beyond that point it is far more lucrative to exploit class insecurities, to manufacture scarcity and sell the thrill of exclusivity while fattening profit margins.

Thursday, April 28, 2011

Forever catalogs (24 Dec 2007)

In time for the holidays, BusinessWeek wrote about a service called Catalog Choice that will try to get your name off catalog mailing lists. It turns out that is not as straightforward a task as it seems.
When an activist Web site called Catalog Choice contacted the likes of L.L. Bean, Williams-Sonoma (WSM), and Harry & David and asked them to take thousands of people off their mailing lists, the retailers knew they had a public-relations problem.
How did they respond? Some—mostly outdoorsy brands like L.L. Bean and Lands' End (SHLD)—made soothing noises. Others blew off the Web site (and subsequently, the people declining their catalogs), and have done nothing with the names.
You'd think you wouldn't need an "activist" service for this, that expressing a wish not to be pestered by mail isn't a form of activism. It would seem as though you could simply request that the company stop wasting time, postage, and paper by refraining from sending you a catalog you don't wish to receive. But catalog merchants, as persistent as debt collectors in pursuing their aims, apparently know better than their prospective customers what those customers really want.
L.L. Bean says it has removed some of the names on Catalog Choice's list, but is still evaluating it for accuracy. The company wouldn't say how many names it had removed or how long the evaluation would take. Williams-Sonoma, which also distributes the Pottery Barn (WSM) catalog, says it "is still figuring out the right thing to do for our customers" and has only analyzed samples of Catalog Choice's list.
The right thing to do? What is there to "figure out" about a person saying, "Please stop sending me catalogs"? But retailers know that people say one thing -- "I want to save," "I care about the environment" -- and do another when, in the privacy of their own homes, they are confronted with pretty pictures and insinuating fantasies. Knowing this, nothing short of a restraining order will stop the retailers from sending the catalogs. Like all direct-mail operators, they don't care what the recipients say. They only abide by the mathematics of the proposition. If the profit from sales closed through the mailings outweighs the cost of sending out the catalogs, they will continue to do it. And with the microtargeting now available, the math can be more precise: they can likely track the sales returns on catalogs sent to a specific zip code, maybe even to a specific address. Hence, if people don't want catalogs, they probably need to stop shopping.
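To make that mathematics concrete, here is a minimal sketch (in Python) of the kind of break-even calculation a direct mailer might run. Every figure in it -- response rate, order value, margin, per-catalog cost -- is a hypothetical placeholder, not a number from the article.

# Hypothetical break-even arithmetic for a catalog mailing.
# All figures are illustrative assumptions, not data from the article.

def campaign_profit(recipients, response_rate, avg_order_value,
                    gross_margin, cost_per_catalog):
    """Expected profit from mailing a catalog to `recipients` households."""
    expected_orders = recipients * response_rate
    margin_earned = expected_orders * avg_order_value * gross_margin
    mailing_cost = recipients * cost_per_catalog
    return margin_earned - mailing_cost

# Say 100,000 catalogs go out, 2 percent of recipients place a $100 order,
# the retailer keeps a 45 percent gross margin, and each catalog costs
# $0.65 to print and mail.
profit = campaign_profit(100_000, 0.02, 100.0, 0.45, 0.65)
print(f"Expected profit: ${profit:,.0f}")  # $25,000

As long as that number comes out positive, the recipient's stated preference never enters the equation.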

The irrepressible melancholy of year-end best-of-music lists (21 Dec 2007)

The music industry is alleged to be dying, but a look at the scads of best-of lists makes it seem as though there is more good music than ever. The erosion of the big labels' control over what we hear has been mirrored by an explosion of journalistic opportunities on the internet for people to espouse their idiosyncratic tastes. What emerges has less of a cram-down, lockstep feel to it than lists in the past would have, but it can still be bewildering and overwhelming if you fall, as I do, into the fantasy/trap of wanting to be aware of everything people think is worth hearing. Just have a glance at Slate's year-end critics roundtable series of posts. Intentionally or not, these critics make appreciating pop music -- pop music, mind you -- seem like a full-time job. No dilettantes allowed in the world of pop music.

And this is despite the writers' palpable urge to relate to what ordinary people get out of music -- it's almost a desperate plea, really (their irritating populist proclamations aside; these seem like overcompensation for being anything but ordinary music fans), because thinking as much as they do about music is a sure way to forever alienate yourself from a natural, routine relationship with music, the one that is straightforward and brings those lucky enough to preserve it an uncomplicated joy. It's enough to make you wonder whether pop music gets too much coverage, which threatens to suffocate all the pleasure out of it. More likely, though, is that I am too often in front of a computer with nothing better to do than read about music.

But it seems everyone now agrees that the music industry will no longer exist in the form we know, and this will inevitably change how both musicians and fans go about their business. David Byrne's article for Wired about how musicians can adapt to changes in the entertainment industry is fairly comprehensive and surprisingly businesslike (it has infographics and numbered lists, in accordance with the assumption that businesspeople can't process information presented in paragraphs or complete sentences). It's extremely informative without being overly dogmatic, and it's full of eminently sensible and realistic advice that doesn't presume a draconian intellectual-property regime to protect music from technological despoilment. He points out that the overhead costs labels used to cover are no longer an issue, and that all they have left to offer bands is up-front money, which amounts to a life of indentured servitude as the bands give up control over what they create while trying to pay the money back.

Much of his argument has its roots in an idealistic definition of what music is, an inalienable experience that defies commodification and is essentially social.
In the past, music was something you heard and experienced — it was as much a social event as a purely musical one. Before recording technology existed, you could not separate music from its social context. Epic songs and ballads, troubadours, courtly entertainments, church music, shamanic chants, pub sing-alongs, ceremonial music, military music, dance music — it was pretty much all tied to specific social functions. It was communal and often utilitarian. You couldn't take it home, copy it, sell it as a commodity (except as sheet music, but that's not music), or even hear it again. Music was an experience, intimately married to your life. You could pay to hear music, but after you did, it was over, gone — a memory.
Technology changed all that in the 20th century. Music — or its recorded artifact, at least — became a product, a thing that could be bought, sold, traded, and replayed endlessly in any context. This upended the economics of music, but our human instincts remained intact.

That's well put, and in more philosophical moments, I tend to think of "real" music as being that pure. But I'm less optimistic that my "human instincts" are so intact. I sometimes fear that music is something I've never quite experienced because it is so foreign to the consumer culture that is all I have ever known. I feel I've had glimpses of music qua music -- in impromptu jam sessions in a friend's barn, or working on recording music on a four-track, or at a really inspired show when the band seems to be doing it for love. Reading old novels has given me intimations of this too, of women needing "finishing" so that they could supply music in the country houses that the characters in bourgeois novels tend to inhabit, of country fairs and balls being a much bigger deal to characters because of the occasions for music they presented. I would think about how much we take recorded music for granted and how it has robbed music of much of its enchantment. That you had to buy music, making it somewhat scarce, gave it some ersatz magic, but it wasn't like (so I imagined) when you had to know someone who could play or sing in order to hear it, when almost all the music a person would hear in ordinary life was what we would now regard as amateur. And when you heard music, it was compelling; you'd never think to regard it as aural wallpaper. (Classical music, the product of this era, still demands that level of attention. Who has the time?)

Romanticizing garage bands and local scenes is part of pining for the "authentic" music of the days before music as product. Sometimes I fall for this notion, that before there were so many records and so many radio stations, local bands served a real function of supplying music where there was none, and their incompetence was lamentable but tolerated, rather than being kitschy or a perverse and deliberate badge of honor. One didn't have to evolve contrarian tastes to prove one's devotion to music, I imagine; one just had to show up at the high school gymnasium or the VFW or the church social, hear the covers of songs on the radio and maybe some songs that were new -- to you at least, if not altogether original to the band -- and be grateful that there was music at all.

The simplicity of musical taste is what seems so seductive to me, what makes the early 1960s seem a golden age. It's easy to imagine that in the golden age, before the deluge, music appreciation was free of the posturing and calculation that is so palpable in, say, any publication's best-music-of-2007 list. Making these lists forces on us a mentality where we're listening to rate songs and rule music out and exclude things rather than embrace music and make taste inclusive. The selections on such lists are in earnest, for sure, but they still have a groomed, fussed-over quality. And these lists are so discouraging; the music alone ends up seeming insufficient. It feels obligatory to continue to discover new things, to broaden horizons, to incorporate more and more knowledge of what's available. A list of good music seems like it should come across as a service rather than a challenge, but it always feels like homework when I read one. It's no way to discover music; the best ways seem lost to the past -- those days when your local scene and radio station dictated what you heard, and all these surprises were still hidden out there in the world, things someone could bring back for you. Obscurantist MP3 blogs are probably motivated by the wish to bring that feeling of special discovery to people, but the instantaneous availability of everything tends to undermine it.

Part of this is the paradox of choice in action: because there is so much music, so cheaply available, I have a hard time growing too attached to any of it without feeling I'm missing out on something somewhere else. Plus, hearing so much music makes more and more of it seem similar and mediocre. When you have only 25 albums in your world, you can forgive a lot of flaws; but the more reference points one has, the more listening becomes a game of comparison and categorization. It's the nature of collecting music; when it becomes a product, one starts to taxonomize it. It becomes information to be comprehended and organized, rather than a sensation to experience.

Ordinarily I try to reject this sort of dichotomy between intellectualization and spontaneous authenticity, between thought and feeling. If authenticity is going to be assigned to any kind of aesthetic experience, it should be to those which fuse thought and feeling and make them seem synonymous. It's hard to explain what that even means, but I think of it as the feeling that comes when a new level to something becomes comprehensible, when a hidden order reveals itself -- when I realize some innocuous line in a song refers to much more than it initially seems, and the broader implications are suddenly dazzling or devastating or overwhelming: understanding more and then all at once understanding that you hardly understand anything, that the work you are contemplating is inexhaustible.

The assumption that thought ruins real experience is usually urged by those who profit from our impulsiveness, marketers and proselytizers of various stripes. And it's not thinking (mischaracterized as a hyperrational urge to demystify everything) that reifies experience. But the illusion that we can take a shortcut to mastering the experiences life has to offer by turning them into data to be processed and filed is one of consumerism's more seductive lures. The promise is always the same and always a false one: that there can be pleasure without effort, that convenience is an end in itself. In this way catalogable information is the enemy of thought; it refuses to let thought become feeling.

Still, it's impossible to imagine life without recorded music or to pretend that recorded music isn't our primary experience with it. The "economics of music" that Byrne sets against human instincts can't be ignored or separated from the experience of enjoying music. We can't return to an innocent stage where we listen to music instead of consume it.

The NAR's sunshine boys (19 Dec 2007)

Daniel Gross pointed out the obvious in this recent Slate column about the National Association of Realtors: You can't trust anything their forecasters say.
Within the fraternity of financial and fiscal forecasters, the seers at the National Association of Realtors—longtime chief economist David Lereah and his successor Lawrence Yun—may be uniquely ill-equipped to deliver sobering forecasts. They work for a trade group whose mission is to buck up the spirits of real-estate brokers. And real-estate brokers—who live to sell, promote, and market—are constitutionally disinclined to hear anything but good news.
This is apparent to anyone who follows developments in the housing industry in the business press, yet the business press continues to report the NAR's meaningless, sunshiny accounts of the economy as though they constituted news, discrediting economic analysts across the board. Journalists could get much more reputable numbers from the National Association of Home Builders, a trade association rather than a sales association, with less of an agenda in its forecasts.

Since economic analysts have such strong incentives to be optimistic -- it's what clients generally want to hear, and optimistic forecasts foment increased confidence, which tends to feed on itself -- a knee-jerk pessimism is almost de rigueur for economists who want to establish their independence. Nothing but innate contrarianism gives one an incentive to be negative. As a result, bearish views on the economy always seem to be more credible, regardless of the underlying economic data. Of course, the data itself can be made to tell whatever story is preferred, if analysts are suitably unscrupulous and reporters gullible enough. That's why CEPR economist Dean Baker will never run out of material for his blog, Beat the Press, which recounts examples of shoddy or biased economic reporting -- usually a matter of failing to give reference points for figures presented for shock value, or neglecting to adjust for inflation, or cherry-picking data, or presenting predictions as facts, or cheerleading for the Dow or the S&P 500 as though investors' fortunes were synonymous with the fortunes of the economy at large. But like the NAR, the business press has the interests of its readers at heart, and seeks to keep them cheerful and reassured.

Waiting for the Rhapsody (18 Dec 2007)

In BusinessWeek a few issues ago (I'm just starting to catch up on my reading), Peter Burrows was pushing subscription music services, trotting out some sensible arguments against being tied down to enjoying only the music you own -- you can discover so much new stuff, sample music on whims, and listen to a lot of cheesy songs you wouldn't necessarily want to preserve on your hard drive. And you don't have to worry about a hard drive crash erasing your collection, because you won't have a collection: peace of mind through shedding belongings, which bring with them the anxiety of having to protect them. (This always makes me think of Spalding Gray explaining in Swimming to Cambodia how he conquered his fear of swimming in deep water by leaving his wallet in plain view on the beach. He was so worried about the wallet being stolen that he didn't think about the danger of being too far from shore.)

It seems inevitable that eventually a wireless device will be introduced that gives you access to all of recorded music for a subscription fee. The technology seems to be in place; it just requires the right combination of design, promotion, and cooperation among what's left of the music industry. And this will seem like a great idea until people realize what a pain in the ass it is to select what they want to hear from the near-infinite possibilities, and long for the simplicity of radio stations they trust to play good music. This, anyway, is what Sirius seems to be banking on, as their cocky commercials about their portable players imply.



For those who aren't indifferent or open-minded enough to give over control of what music they hear to professionals -- for people who must play DJ for themselves (and probably their friends) -- ownership of music is essential for several reasons. First, making the purchase is a decision-making moment that in itself gives pleasure -- it's a moment in which one gets to make some piece of knowledge one has operational. The decision also invests one emotionally in the thing purchased, increasing the possibility of enjoying it. This is one of the sad realities of consumer societies, that putting your money where your mouth is is a way to fix your attention on something and be optimistically disposed toward its being able to please you. When you download a bunch of music off a borrowed hard drive, your investment in the music is zilch, and the effort to sort through it all is herculean -- all those little decisions about whether you like this or that song as you weed through have less pleasure attached to them because nothing ultimately is at stake in the choice. In such a situation, when I'm trying to assimilate a large quantity of music, I find myself thrown back on my taste alone, and that taste is nebulous, contingent. When I buy music, I find I have more reason to try enjoying it at different times, trying to find the mood or occasion that suits it.

And the big collection is necessary if you want to impress people with mix CDs. You give yourself a much larger vocabulary to speak with when you have more songs to choose from and consequently more juxtapositions to play with. It's nice to have a lot of music when you want to give it as a gift to someone else. I don't know that any recipient of a mix CD has nearly as much invested in it as its creator, but some of the emotion that gets poured into making mixes must survive into the final product. And that residual emotion is a direct result of someone working hard to make the most out of their music collection. (The friend I visited in Seattle recently had a new friend who made him a bunch of compilations, and reading through the track lists, I almost felt like I was getting to know her without actually meeting her. But I didn't ask to listen to them -- accustomed to making compilations myself, I get peevish having to hear other people's; sad, really, that the joy I think compilations can give is something I am generally shut off from.)

Collecting is a means of filtering, as is making compilations, and both of these activities are about bringing knowledge to bear, making decisions with consequences. The subscription service removes the consequences, almost makes the idea of having selective musical taste superfluous. Not that there is anything wrong with that; musical taste's centrality to identity seems a peculiar quirk. Nonetheless, taste in commercial music comes down to what music you are willing to pay for specifically. If you are paying to have it all, you effectively have no taste.

"Reading the spreadsheets upside down" (17 Dec 2007)

Last week, the Economist's Buttonwood columnist, who writes about Wall Street, had an interesting piece about falling corporate profits. With the credit crisis taking its economic toll, profits were bound to start falling -- don't tell the analysts, though.
The consensus is that earnings will grow by 14% in 2008, with every single sector managing an advance. In the first half of the year, when many economists think that America will be dicing with recession, analysts are forecasting that corporate profits will be growing at an annual rate of 9%.
Going by experience, profits start to fall when the annual rate of economic growth falls below 1.5%. “Consensus forecasts for next year's US profit growth border on the hallucinatory,” says Tim Bond of Barclays Capital. “Even allowing for the typical bullish bias, the prevailing consensus suggests that equity analysts are collectively reading their spreadsheets upside down.”

Earnings are likely to fall because consumer spending is likely to drop. The article mentions the consumer discretionary sector (carmakers, retailers, etc.) taking a hit, a reflection of faltering consumer confidence. I'll add the usual caveat here -- I tend to root against consumer spending, particularly of the discretionary sort, because, paradoxically, I take it as a proxy for rote, thoughtless consumerism. Rather than exercising discretion or saving, consumers seem as though they are obliged to spend, going into debt to perpetuate their habits. But it's probably not a good idea to extrapolate individuals' psychology from these aggregate numbers; that's why they conduct the confidence surveys, I suppose. Nonetheless, consumers cannot continue to overspend, no matter how convenient that is to companies' bottom lines and to the economy as a whole. A story in BusinessWeek last week noted the rise in America's credit-card debt, and the rise in delinquency rates on payments -- if this debt has been keeping consumer spending aloft, that spending seems in imminent jeopardy. And in the New York Times recession-forecast bonanza yesterday, economist Stephen S. Roach argues that
The current recession is all about the coming capitulation of the American consumer — whose spending now accounts for a record 72 percent of G.D.P. Consumers have no choice other than to retrench. Home prices are likely to fall for the nation as a whole in 2008, the first such occurrence since 1933. And access to home equity credit lines and mortgage refinancing — the means by which consumers have borrowed against their homes — is likely to be impaired by the aftershocks of the subprime crisis. Consumers will have to resort to spending and saving the old-fashioned way, relying on income rather than assets even as mounting layoffs will make income growth increasingly sluggish.

The inevitable retrenching will likely be painful, crimping the standards of living of even upper-middle-class families (those jumbo mortgages taken out to buy those oversize homes don't seem so smart anymore), but it presents an opportunity nonetheless to shift values toward a more conservational, spartan ethic. I glamorize spartanism (hypocritically, no doubt) because it seems simpler and inherently a more creative way to live than letting consumerism supplant creativity (what is noxious to me about the term creativity is how it reifies the process, makes it into a commodity). But it is certainly inconvenient to live that way, and convenience is so easy to become accustomed to and to celebrate as an end unto itself, or as a means to enable even more consumption.

Corporate profits have been unusually high for several years, and there are different explanations for this, as the Buttonwood column points out:
The optimists argued that profits could stay high because the balance of power had moved in favour of capital and away from labour, thanks to the globalisation of the workforce. But perhaps profits had been boosted by accommodating monetary policy, a credit boom and the associated surge in asset prices.
It's funny how optimism equates to workers getting shafted. The idea is that outsourcing gives capital more leverage over workers, because employers can draw from a much larger reserve army of the unemployed. This forces workers to accept wages that are below the marginal product of their labor, meaning more profits accrue to the companies. That theory was influential enough to persuade Alan Greenspan (if we can believe his memoir) to keep interest rates low without fear of stimulating inflation -- wages would remain tamped down, so the increase in the money supply would lead to capital investment and a more rapid circulation of funds rather than inflation. But then this logic leads to the other explanation for the erstwhile surging profits. Interest rates were low, money was nearly free, and inflating house prices were making consumers feel flush, giving them access to equity lines of credit. So when wondering where the money went as homes are foreclosed and banks go under, those fat profit margins might be somewhere to start.

Wednesday, April 27, 2011

Texting love (14 Dec 2007)

Bookforum linked to this article about the effects of text messaging on traditional courtship practices in the Philippines. I know that sounds fascinating, and you're probably not even reading this sentence because you eagerly clicked on the link. But as I have never understood the allure of texting, I found the story illuminating.

Clearly it makes sense to send messages when they are cheaper than talking, as they are in the Philippines, as the article points out. I don't know if that is true with typical American cell phone plans, but it ought to be. I have long wished there were a plan that would allow nothing but text messages, because I'm not much for chitchat -- when forced to use the phone, I generally just want the pertinent information, two or three of the five Ws maximum. And I don't think I would want a smartphone, which seems like too much technology for my simple needs. I think I need the stupidphone.

Anyway, Randy Jay Solis, the article's author, suggests that texting is apparently well-suited for courtship because it creates an extra-intimate space in which the communication takes place.
Texting allows for depth in the courtship stage, an efficient way to exchange a variety of important, intimate, and personal topics and feelings. “The mobile phone screen is able to create a private space that even if you are far from each other physically, the virtual space created by that technology is apparent,” Arnel [a random Philippine teen] explains. “No one can hear you say those things or no one else can read them, assuming that it is not allowed to be read or seen by others.”
This is probably obvious to everybody who has ever texted, but it never occurred to me that this would be so, that technology would produce a virtual space that users would regard as more intimate rather than one further step removed from intimacy. I usually construe this kind of technology as a filter, a level of protection, a way to deny presence, whereas it apparently can seem more intimate than a whisper in the ear when satellites are recruited into delivering a sweet nothing.

Solis points out how texting facilitates the ability of strangers to meet and become intimate whenever boredom strikes. But this intimacy, perhaps because it is technologically amplified, becomes more addictive.
Texting answers the need for a sustained connection necessary to increase and maintain intimacy, but it has also made couples more dependent on each other. “It became a habit,” *Emmy explains. Partners text each other as often as they can and have a compulsion to keep the communication constantly moving. One respondent attributed this to the “unwritten rule of texting.” Clara elaborates, “Once a person has texted you, you have to reply. If you don’t reply, the person will automatically think you ignored him or her on purpose. So you have to reply no matter what, even when you really have nothing to say.”
Since most of the couples initiating a romantic relationship do not have the luxury to meet up in person or talk over the phone regularly, the frequency of texting becomes a distinct indication of their seriousness about the relationship. “To commit is to be there for the person, 24/7. Texting helps in achieving that despite of the barriers in time and distance,” *Von explains.
This pinpoints what is probably the main reason I have resisted getting a phone all these years, beyond Luddite inertia. I'm a little bit terrified of this kind of dependency and compulsion, of being unable to ignore a message without guilt or to go without sulking when my message garners no response. It's bad enough with email -- I had to abandon instant messaging for the same reason. When the messages are flowing back and forth in rhythm, it's like you are wired into your correspondent, but then if there is a gap, it's like a betrayal, like being abandoned. I would get too impatient and paranoid in the delays, as though I were waiting for someone to pass me the crack pipe. It may take more maturity than I can muster to presume innocence when an urgent or intimate message goes out there and just hangs, and it seems like the texting life would be filled with such mishaps and emotional misfires. In general, communications technology promotes impulsive immediacy over consideration, yielding a fraught, fragile intimacy that is only as deep as the last message. All intimacy requires continual reciprocal contact, but accelerating that contact may be more than our limbic systems can handle. That, anyway, would seem to be part of the argument of an essay Solis cites, Heidegger, Habermas, and the Mobile Phone by George Myerson. According to these notes, Myerson argues that "mobile communication is fragmented, accelerated, highly commoditized, and ultimately meaningless." He suggests that mobile phones are a critical step in the effort to meter all communication, to translate it into a purchasable object, to have it measurable in money. It ceases to be communication and instead becomes a species of exchange. That argument verges on a semantic trick, and my susceptibility to it is probably rooted in my bias for pragmatic talk, but it still seems an apt description for texted testimonials and their cousins, the messages exchanged on social networking sites that are little more than acknowledgments that people are scrutinizing one another.

It seems curmudgeonly to complain about there being more intimacy in the world thanks to technology, I know. But ultimately, the way communication is quantified may be what seems so sinister about the heightened intimacy of texting; it turns the freedom of love into a kind of dope high purchasable on demand. And our bodies are supplanted by the devices we use to reach one another, the ones that let us be everywhere at once, and nowhere.

Winning at white elephant (13 Dec 2007)

As uselessly contrarian as I tend to be, the holiday season tends to bring out my cynicism, leaving me feeling that I believe in nothing, like The Big Lebowski's nihilists. It's not merely a matter of my not being religious; I find myself not wanting to buy into the holiday spirit at all. I grouse about the music and the gifting and the parties and the traveling and the stress and the continual efforts of coordination that must be made for no one's particular satisfaction, just so that one can feel as though one participated in an inescapable social ritual. Part of my uneasiness comes from a misplaced expectation of authenticity, a notion that the contrived aspect of the holiday season destroys spontaneity and replaces the opportunity for it with ersatz obligations. I tend not to see the season's contrivances as opportunities themselves, as moments when society tolerates our going slightly beyond the way we ordinarily treat acquaintances, when we can, generally speaking, safely venture a little bit more of ourselves. So it may be that I, with my customary suspicion that I have little to offer, try to absent myself from the proceedings altogether.

This year, I tried to console myself after the seasonal onset of my lack of belief by reading Eric Hoffer's The True Believer, which I got from the bargain shelf in a Seattle used book store. Hoffer (a "longshoreman philosopher" who wrote as an unaffiliated autodidact) was interested in investigating the nature of mass movements and what permitted the rise of Fascism and Nazism in the 1930s, and he set out his conclusions in this near-aphoristic work. Hoffer's main insight is that one's eagerness to belong to a mass movement derives from a sense of personal frustration, an overwhelming sense of failure in the face of the opportunities afforded in a free society. "When our individual interests and prospects do not seem worth living for, we are in desperate need of something apart from us to live for. All forms of dedication, devotion, loyalty and self-surrender are in essence a desperate clinging to something which might give worth and meaning to our futile, spoiled lives." Talk about cynical. Hoffer regards the rise of mass movements as the almost inevitable consequence of widespread mediocrity coupled with the unreasonable expectations that democracy generates for the common person. "Unless a man has the talents to make something of himself, freedom is an irksome burden. We join a mass movement to escape individual responsibility, or, in the words of one ardent young Nazi, 'to be free from freedom.' " Democratic ideology leaves the impression that all men are equal, but it has the effect of making one's place in the irrepressible hierarchies of society seem entirely the individual's fault. Thus the frustrated people in a capitalist democracy "want to eliminate free competition and the ruthless testing to which the individual is continually subjected in a free society."

So if frustration produces a longing for a mass movement that resolves all personal inadequacies in a dedication to a cause, then what should we make of consumerism, which seems to work in precisely the opposite way, generating frustration while imbuing consumers with the imperative to be an individual and construct a unique, predominantly external identity on the basis of which one can be judged? Is it a recipe for fomenting fascism, instilling the frustration with oneself that makes one susceptible to being led and having society leveled? Or is consumerism just the ideological by-product of a commitment to individualism? Is it a way of embodying that ethic in a set of economic practices, as ideologues like Milton Friedman have always insisted -- i.e., that the freedom of choice offered by market societies derives from a culture prioritizing and prizing personal freedom? Or does it flatter people into believing they are profoundly and uniquely creative, setting them up for permanent frustration with themselves and what they are brought to believe is a personal failure? Consumerism invites dilettantism, because it promises that everything is easy and that you can do anything if you buy the right accoutrements. So consumers are prone to becoming what Hoffer calls permanent misfits: "Whatever they undertake becomes a passionate pursuit; but they never arrive, never pause. They demonstrate the fact that we can never have enough of that which we really do not want, and that we run fastest and farthest when we run from ourselves."

Hoffer exempts from the centripetal pull into the abyss of mass movements those people who have genuine creative outlets: "Nothing so bolsters our self-confidence and reconciles us with ourselves as the continuous ability to create; to see things grow and develop under our hand, day in, day out. The decline of handicrafts in modern times is perhaps one of the causes for the rise of frustration and the increased susceptibility of the individual to mass movements." How can that be, with Michael's Crafts opening in every suburban strip mall?

Consumerism, with its emphasis on passive acquisition, tends to undermine crafts and hobbies, subordinating them to the master hobby of collecting things. And capitalism eviscerates most jobs and empties them of their social meaning. But the counterargument can be made that consumerism makes the necessary materials for hobbies cheap and plentiful, and the division of labor gives people the time to pursue them. It's just their personal weakness and indecision that leaves them bored and unfulfilled.

So it's hard to determine whether consumerism is a stepping stone to fascism or its antidote. And is the holiday-season zeitgeist, then, a proto-fascist expression of mass-movement psychology in the face of capitalist culture's ideology of individualism, or is it actually a perfect encapsulation of that ideology, dressed up in the pseudo-religious garb of a mass movement? By trying to reject holiday cheer and the exposure that comes with the giving spirit it demands, I'm in part trying to keep myself cloaked in the anonymity that Hoffer singles out as a sign of the frustrated weakling: "The passion for equality is partly a passion of anonymity: to be one thread of the many which make up a tunic; one thread not distinguishable from the others. No one can then point us out, measure us against others and expose our inferiority." I feel like one of those permanent misfits, too: out of step, frustrated by the phoniness that seems to surround me but may in fact be coming from within.

But then, I can't decide whether my resistance to the holidays is really an expression of weakness and frustration or rather a defiant attempt to assert my individuality in the face of Santa's marching orders. Maybe my attitude is all wrong. I should go into that white elephant exchange at work with the proper competitive spirit, with the certainty that my gift is going to kick the ass of all the other ones.

Joy in repetition (12 Dec 2007)

I'm still ruminating over my insane need for musical variety. Once, when I was in high school, a friend's girlfriend picked me up in her car to take me somewhere. I don't remember where we were going -- maybe a party or a teen dance night somewhere -- but I'll never forget what we listened to on the drive. She had filled both sides of a 90-minute cassette with the same song taped over and over again: "Burning Flame" by a band called Vitamin Z. Surprisingly, I had enough politeness in me then not to deride her choice of music, but I certainly complained vociferously about it later. I asked my friend who was dating her how he could stand it, but apparently he hadn't even noticed. Hadn't noticed? Back then it hadn't occurred to me that there was much of anything else to notice about someone. (I suspect that if I had had my mind on other things besides music back then, I would have had fewer arguments and more girlfriends.)

At the time, I thought that girl was hopelessly narrow-minded, but since then I have often wondered if she was on to something. I even find myself envying her; she had the secret of being able to know her mind and be satisfied rather than be continually searching. She could find the joy in repetition that tends to elude me, that complacency of which consumerism may indeed train us to be suspicious. Consumerism seeks to instill in us repetitious routines that yield no satisfaction, merely hunger for more, for different.

I find that I am implacably restless in searching for new music, as though if I stop discovering new songs, the emotions music evokes in me will also disappear. Of course, my actual experience with listening has proven to me that the music I know best and have listened to the most yields the richest emotional reactions, especially if the songs have become palimpsests of the things I was feeling each time I made a point of listening to them before. Though some songs become unfortunately encrusted with nostalgia, others remain alive and undepleted despite the freight of emotions they carry.

Nevertheless, I still have the fear that the music I know will somehow fail me and that I need to seek more, need to set aside time not for the music I already know can move me, but for the unsorted hodgepodge I never cease gathering, hoping that something in that effluvia will inspire. It seems like a terrible waste, but for those unexpected moments when, out of nowhere, an album track from some forgotten band delivers an unexpected spark, and it's like falling in love all over again.

But I am waiting for the day when what I already know will be enough, when contentment won't seem like a rumor, when I'll turn inward with what I have and reap the harvest of all that effort of endless accumulation, when I'll supplant the search for that spark with something deeper, with a feeling more like an eternal flame, I suppose.

Taste versus curiosity (11 Dec 2007)

I used to make the mistake of thinking that people with a small record collection had no particular taste in music. I'd assume that they just didn't care about music or else they would be going about assembling an encyclopedic collection. If they knew how much good music was out there, they would know they should have a lot more. Now of course, this is hardly indicative. Anybody can borrow a hard drive from someone else and amass instantaneously a music collection that would dwarf anything even the most astute record collector would have had circa 1996.

In the old days, a small collection seemed to suggest indifference, as though the discs in the collection were just so much flotsam and jetsam that drifted into their possession -- random birthday gifts and impulse buys and the like. And sometimes that is the case. I often forget that not everyone is afflicted with the anxiety of a collector -- the secret egoistic suspicion that if something is not in your possession, it might somehow cease to exist or worse, reveal a weakness, a vulnerability in your base of knowledge. They don't have the peculiar sense of responsibility of needing to have anything you could possibly think to play for someone on hand and ready. Instead they are content to take music as they find it, trusting in the many DJs out there to supply something reasonably entertaining when music is desired, which for a substantial number of people, I've discovered to my absolute shock, is not particularly often.

But other people with spartan music collections are not indifferent; they are just operating with a much more stringent filter, working with assumptions much different from the ones I usually have about music. I'm typically guided by curiosity, and since I am listening to music almost all the time, I can make the time to hear anything, no matter how annoying or uninspired, just to know what it's about. Part of this is to maintain enough familiarity with what is out there to continue to pass as a credible music snob, along with the sheer pleasure of simply knowing things, regardless of what they are. But part of this is also indifference, not holding the music you're hearing to any standard. Falling prey to the sort of consumerist thinking I often complain about, I find I prefer hearing something new to something good. I want to consume novelty rather than appreciate music.

So it often seems like I have no particular taste in music at all, as I will listen to anything, and what comes up on the shuffle of my iTunes gives no indication of the music that I actually think is best. Some find this incomprehensible -- why not listen only to the music that you really are into? They are confounded at the idea of spending any time listening to something patently awful or second rate and can't countenance it. Their collections, a few albums in heavy rotation, explain a lot about their taste, which is revealed to be distinct and well-formed. Meanwhile I maintain a protective distance from anything so definite, always hiding my true feelings behind a mask of comprehensiveness.

Mall shootings (7 Dec 2007)

I'm not sure I can even write about this subject without seeming glib, and I in no way want to make light of the tragedy of random people being murdered as they go about their ordinary lives. But when I was watching the coverage of the Omaha mall shooting yesterday (I was waiting to catch a plane and CNN was inescapable -- why must they do this to public spaces, try to sedate waiting people with TV? Have people become this impatient? On a related note, I flew cross-country on Northwest, and it wasn't until after the flight was over that I realized why it was such a refreshing experience -- no force-fed in-flight entertainment), I kept wondering if the shooting could in any way be interpreted as an act of protest against malls and what they might represent to people. This subject never came up in Anderson Cooper's inane questioning of the various psychologists and local witnesses on the program; instead the focus was on the killer and what sort of mental illness he must have had to prompt such a deranged act, and then he was dutifully compared to other sensationalized killers, glorifying him in precisely the way the psychologists had said he yearned for, feeding his desperate need for recognition and notoriety.

But I was wondering, why the mall? Was this just a natural choice, the place to go to see strangers, the quintessence of public space in America? When planning this horrible crime, did the killer ever once think, this will make people think twice about the emptiness of shopping? This might discourage aimless or rote consumerism? Probably not, but such an angle was not even hinted at in the exhaustive coverage I was subjected to, even when they went through a rundown of other recent mall massacres. The mall was just a null variable; no one mentioned any characteristic about malls that might relate them to the spate of shootings occurring there. Perhaps in all these cases, the mall was an incidental choice. But something about shopping seems to make people especially vulnerable -- people enter malls in order to let their guard down, to open themselves to the pleasing enticements of goods, the fantasies they promise but rarely deliver on. As the staging ground for fantasies of the transformational powers of property, the mall may be the place where consumerism is most satisfying, where it works best and makes the most sense, where the dreams have full play and the action we are being continually prompted for by our dominant public discourse, advertising, can actually be consummated. When you get the goods home, they often aren't half as exciting. Is there something about the heightened sense of reality at malls in consumer society that attracts the deranged lunatics desperate to leave their mark on that society?

Maybe this refusal to rationalize crime as having a political reason emphasizes the horror of the crime for those consuming it as news -- if they provide no political motivation for it -- if it is presented as being as random as possible -- it perhaps provides the greatest vicarious thrill, the greatest amount of the knotted-stomach feeling that comes from witnessing something awful. To offer potential political rationales for murder, no matter how disapprovingly, would still in effect justify the idea that violence can serve political ends, a belief that the state must monopolize. Individuals can't be permitted to conceive of action that way -- politicized violence committed by anyone other than a state agent is uniformly labeled terrorism.

So instead of trying to rationalize this kind of awful crime with any kind of purpose, society seems to prefer the idea that killing is random, senseless, and motivated wholly by psychological defects in the murderer.

The evils of word-of-mouth advertising (4 Dec 2007)

I found a copy of Lewis Hyde's The Gift in the free pile at work -- a quite appropriate place to find it in some ways (somebody's giving it away), totally inappropriate in others (the copy of the book was distributed in a commercial setting solely for marketing purposes, which Hyde argues destroys the gift's essential nature). Hyde's fundamental point is that gifts necessarily form relationships between giver and recipient, while commercial exchanges pointedly do not -- they are arranged to be reciprocal and neutral, to balance out and eradicate any need for gratitude or graciousness or indebtedness. Hyde writes, "In commodity exchange it's as if the buyer and seller were both in plastic bags; there's none of the contact of gift exchange. There is neither motion nor emotion because the whole point is to keep the balance, to make sure the exchange itself doesn't consume anything or involve one person with another." For some, that lack of intimate contact or interpersonal obligation is the whole point; it's much more convenient to accumulate things without accumulating relationships, even though relationships are likely much more fulfilling and are often the point of having things in the first place. We want to have certain things all to ourselves to project a certain kind of identity, but we also want to share things with whom we choose and erect the boundaries we seek to make concrete around our family or our circles of friends. Consumerist ideology works to persuade us that the convenience and the identity display of collecting goods and market exchange are more satisfying than the sharing and the network formation of gift exchange; that isolation from ties and evasion of responsibility are the whole of freedom. But in reality, most people don't want to be free on those terms. We like to feel obliged; it gives us a reason for being, a sense that we matter. Consumer society is set up so that you can live your entire adult life without having anything but frictionless, emotion-free commercial interactions with other people -- an arrangement preferred by commercial interests, since they may then take a cut of the action that occurs every time people interact. Every bit of human interaction in such cases requires market mediation, which allows the intermediaries to extract profits. Ordinary human relations, decommercialized and inconvenient with all those feelings and junk, are not so reliably lucrative. The fair, impartial exchange idealized in the market, in which you get what you pay for (caveat emptor and all), is a way of stifling relationships that occur outside of commercialization. Making the fair deal a cornerstone of morality may foster isolation.

It occurred to me that my contempt for word-of-mouth advertising has something to do with opinions as gifts -- when one offers a word-of-mouth recommendation, it functions as a gift; it fosters a relationship that in some way supersedes the specific thing recommended. The opinion is only an occasion to enrich a relationship. But word-of-mouth advertising, obviously, corrupts that process and invalidates the gift, turning it into a tactic or a product. Few people are soulless enough to spread bogus word of mouth intentionally, but the goal of Facebook and other social networks seems to be to commercialize sincere word-of-mouth recommendations or to automate the opinion-giving process, so that every time you do something online, your actions generate an automatic recommendation to those who are on a feed, receiving updates of your every move. This deprives you of the chance of making a gift of your opinion, transforming it instead into a sales tool preemptively, poisoning the ground of friendship.

Instead of promoting the sharing of ideas and opinions among friends, social networking sites promote posturing and marketing, friendship as spectatorship, surveillance, and imitation. The reciprocity they provoke seems thin, encouraging discourse that is typically taken for granted in friendship -- you don't need special notification that someone is paying attention to you or validating your choices; you don't need testimonials from friends to the effect that they actually really do like you. Social networks offer a way to conduct a friendship without putting forth any specific personalized effort -- they remove the gift of friendship from the relationship and leave the marketing opportunities.

Prime subprime borrowers (3 Dec 2007)

This should come as a shock to absolutely no one: The Wall Street Journal had a research firm crunch the numbers and determined that many subprime loans were issued to borrowers who likely would have qualified for better rates and fewer fees. In 2005, people with credit scores that would have qualified them for conventional loans
got more than half -- 55% -- of all subprime mortgages that were ultimately packaged into securities for sale to investors, as most subprime loans are. The study by First American LoanPerformance, a San Francisco research firm, says the proportion rose even higher by the end of 2006, to 61%. The figure was just 41% in 2000, according to the study. Even a significant number of borrowers with top-notch credit signed up for expensive subprime loans, the firm's analysis found.
How could this have happened? The ever-rational consumer would have shopped around for the best deal, right?

Hardly. The brokers closing mortgages were given lucrative incentives for writing subprime loans, ARMs, and the other now notorious credit products, so they had every reason to preserve the ignorance that all of us generally have when it comes to the credit market and to exploit our vulnerability at a time when we are making one of the most significant decisions of our lives, purchasing property.
Many borrowers whose credit scores might have qualified them for more conventional loans say they were pushed into risky subprime loans. They say lenders or brokers aggressively marketed the loans, offering easier and faster approvals -- and playing down or hiding the onerous price paid over the long haul in higher interest rates or stricter repayment terms.
The subprime sales pitch sometimes was fueled with faxes and emails from lenders to brokers touting easier qualification for borrowers and attractive payouts for mortgage brokers who brought in business. One of the biggest weapons: a compensation structure that rewarded brokers for persuading borrowers to take a loan with an interest rate higher than the borrower might have qualified for.

This handy interactive graphic shows a lender's rate sheet and the yield spread premiums agents could earn by bullying or tricking borrowers into loans at terms worse than they theoretically qualified for. Basically, lenders use financial incentives to prompt agents to put people into shitty loans, with bad rates and prepayment penalties and unwieldy fees. Who's on the side of the borrower? Basically, no one. It was pretty much caveat emptor in the midst of a real-estate-buying frenzy, when everyone was telling everyone else how they had to act fast and buy something, anything, before all the deals were gone and how housing prices were never going to go down again, since, after all, they're not making any more land.
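To make the incentive concrete, here is a minimal, purely hypothetical sketch of how a yield spread premium rewards a broker for closing a loan above the rate the borrower qualified for -- the rates and payout figures are invented for illustration, not taken from the rate sheet in the Journal's graphic:

```python
# Hypothetical yield spread premium (YSP) arithmetic; the numbers are
# invented for illustration, not drawn from any actual rate sheet.

def yield_spread_premium(loan_amount, par_rate, sold_rate, points_per_rate_point=1.5):
    """Lender-paid broker compensation, quoted in 'points' (percent of the
    loan amount) for each percentage point the closed rate exceeds par."""
    extra_rate = max(sold_rate - par_rate, 0.0)  # e.g. 7.5 - 6.5 = 1.0
    return loan_amount * (points_per_rate_point / 100.0) * extra_rate

# A borrower who qualified at 6.5% but was talked into 7.5% on a $200,000 loan
# would earn the broker 200,000 * 1.5% * 1.0 = $3,000, paid by the lender.
print(yield_spread_premium(200_000, 6.5, 7.5))  # 3000.0
```

The only point of the sketch is that the broker's payout grows with the gap between the rate the borrower got and the rate the borrower deserved, which is the compensation structure the article describes.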

It's almost unreasonable to fault mortgage brokers for being negligent and unethical. They are in the real estate racket; that's what it's all about. You don't get into real estate out of a love for humankind and a dream of a better world. You do it because it seems like a good way to make a lot of money. And when greed is what gets you through your work day, why wouldn't you prey on the ignorance of your clientele? It's nothing personal, after all, just business. So one could argue that unscrupulous lending practices should be restrained by law -- to a larger degree than they are already. In other words, the set of what counts as unscrupulous needs to be expanded to encompass what lenders likely consider to be standard operating procedures. But this would amount to legislating away their right to whatever profit they can grab, and once you've done that, what stops the state from interceding in the economy with all sorts of price controls: rent control, caps on food prices, medical expenses, legal fees, and so on? Ethical cases could be made for any of these kinds of intervention, but capitalism, when it comes down to it, isn't arranged to be ethical and doesn't function with elastic definitions of fairness.

Top-down culture (30 Nov 2007)

Consumerism tends to drive us to want more and more culture (more is always better), and thus it requires an apparatus that foists new options on us and strenuously tries to convince us that these new and improved options will give us more than what we've already got and probably haven't exhausted -- can one ever really exhaust works of art, after all? The nature of that apparatus is changing, though, as the old monolithic culture industry -- empowered by economies of scale, driven by profit-seeking, and hewing to the lowest common denominator among the large audiences it hoped to muster and hold together -- is falling apart under the stress of new ad hoc distribution networks made possible by the digitization of works and by the internet hooking together consumers who love to share. Part of this sharing is sheer generosity; part of it is an aggrieved feeling that the really good stuff needs to be heard by more people and that filesharing is a way to strike a blow against the industry that has made its money stifling or neutering independent voices.

The temptation to tell people what they should like is always strong. And people often like to be told -- that's why there are so many services supplying cultural criticism. I remember how seriously I took the album reviews I read in Rolling Stone or even in the local newspaper -- now, having had some exposure to what such reviewers are actually like, I can't even believe that I ever paid any attention to what they thought. Consequently I'm driven even further toward the unfortunate egomaniacal position that I probably know more than many of the people writing these things; then there's also the problem that I'm starting to have decades more experience listening to music than many of them, leaving me in a position where I have nothing to learn from them that's not wrong or superseded by something I already learned about years ago. And then there's the fact that so much highly recommended new music is rehashed old music that is best appreciated when you don't know the antecedents. It makes me feel very old, as though I have outgrown the opportunity to enjoy contemporary culture and have become instead one of those relics, living in the past and arguing that everything was so much better before. This still seems preferable to being one of those old guys trying desperately to stay young and "relevant," as if teenagers determine that relevance for anyone other than marketers looking to tap a lucrative demographic segment.

Those advocating legislating taste in various ways, whether through censorship or some form of sumptuary laws, typically believe that when individuals are left to their own devices, they are too easily influenced by advertising or by other parties who make their suggestions in bad faith. Those who oppose top-down culture believe that individuals should be free to choose to enjoy whatever they want, and that their authentic wishes supersede any attempts at influencing or shaping their choices. Of course, everyone likes freedom in the abstract, and everyone likes to believe that they know their own mind and know what they like without having to be told. But in practice, people tend to seek out opinions, because they may enjoy the suggestion of human company in opinions more than the works of art in themselves. People look to culture to invest them with a sense of belonging, and sharing opinions or liking something that is liked in general is a way of simulating that feeling of inclusion. So a work's reception can be more significant than the work itself, which makes the dogma about individuals following their own tastes and a work's popularity reflecting its intrinsic quality suspect.

If people aren't really following their own hearts in choosing what culture to consume, the question becomes, whom should they follow? People want to follow the tastes of people they like and want to count themselves among; absent that, they'll go along with whoever seems to be liking the same sorts of things -- an impression that can be created either by advertising or by the opinion-making media or by both in conjunction. The alternative would be to mandate popular culture through state-funded educational systems -- this would provide a much more uniform harmony of tastes, but it would discourage variety. It would restore much of the meaning to the currently meaningless terms independent and alternative, however.

But there isn't really enough at stake to warrant state intervention, unless you believe that culture is primarily didactic, teaching people how to behave and interact with one another. Those who want censorship usually invoke this argument: that the loose morals on display in commercial culture -- the sexual objectification of anything beautiful to turn it into a lure, the promotion of always giving in to temptation (how else to keep consumerism rolling in the face of economic crisis?) -- warrant a clampdown. Snobs, on the other hand, claim that commercial culture is vulgar, playing to the tastes of the most ignorant and encouraging everyone else to become stupid too, as in Mike Judge's film Idiocracy. But snobs are in bad faith when they argue that culture should be subjected to top-down control (presumably by wise people like themselves), because their sense of esteem comes from the superiority they feel to common tastes, which would vanish if their wish for power over culture were granted.

This is meandering a bit, but the reason I started thinking about this was the questions I was trying to get at in the previous entry, about what will be lost if commercial culture truly gives way to some new form of cultural participation enabled by technology. The culture industry, that Frankfurt school boogeyman, was one way of controlling culture in the name of a popular taste gauged and tested by functionaries and agents and middlemen who just wanted to make money off other people's talents. In some ways, this is a pure, paradoxically selfless motive -- they weren't trying to foist their own talents on the world or skew what culture was produced with their own idiosyncratic vision. They just worked as conduits, trying to find the easiest way to please people. Sometimes that meant trying to brainwash them and feed them shit that was already on hand; sometimes that meant responding to an unexpected turn, a sea change in demography or popular expectation. Absent a culture industry, these people are out of jobs, but they may be replaced by strict opinion makers, who filter the mass of what artists make available directly to audiences to pick out stuff worthy of attention. A&R men will work after the fact rather than before, and will deal directly with the public. They will perhaps be like stock pickers, marketing their track record. But in order for this to happen, they will need to set their opinions off from the mass of freely offered (and easily aggregated) opinion that's already available on the internet. Opinions would have to become scarce in order to have any value, and that seems unlikely. What is scarce is people who put money behind their opinions, who have "skin in the game." Perhaps what is needed is a futures market in culture to replace the signal investments made by the culture industry.

So instead, tastes will perhaps be formed by aggregators, who collect data on what different groups are actually doing -- so you can tailor your choices to who you want to fit in with. You can have a Muzak-like service supply culture for Brand You the same way they do for retail stores. More likely, the data about what stuff individuals should be getting into to belong to a specific set will be aggregated within social networks on social networking sites. It will be up to clever marketers to figure out how to infiltrate these networks or co-opt the opinion leaders within them, the sort of people Malcolm Gladwell profiles in The Tipping Point. Social networking sites should make these people easier to track down and will yield them opportunities to reap rewards for their natural proselytizing talents. So these folks will have the opportunity to become the new A&R people. And groups of friends will come to be organized as mini culture-industry firms.

Post-commercial entertainment (27 Nov 2007)

The Hollywood writers' strike certainly helps throw this into relief, but it seems clear that the commercial entertainment industry is in trouble. Digitization and dispersed internet distribution have made it impossible for them to control supply, and the intellectual-property concepts their business models depend on seem likely to come under attack or undergo extreme revision in an era when anonymous collaboration and open-source development become more and more customary. Not to wax too utopian about it, but it seems like the idea of commercial artists working for industry middlemen is rapidly becoming a thing of the past, and as that changes, the means by which our society defines what makes for an artist or entertainer will change as well. Reality TV and blogging are just the most obvious examples of semi-professional and, in some cases, post-commercial entertainment supplanting the work of pros. The expectations we have of polish and high-end production values may continue to become more and more relaxed; what lo-fi indie rock helped pioneer could become acceptable in every genre and every medium, as YouTube would suggest. (Though all I ever seem to use YouTube for is watching old clips of bands from the 1960s and 1970s appearing on European TV; it's become sort of a random-access collective memory. In fact, I can safely say that the internet, by enriching my access to obscure cultural detritus from past decades, has guaranteed that I won't pay any attention to contemporary culture for the foreseeable future.) While paid ads still support part of the distribution medium for these works (i.e., Google's ad brokering makes it worth its while to host all this junk), the creators themselves, who are confronted with very little overhead for making and self-distributing their own product, are not necessarily compensated monetarily and seem to have attention (becoming more and more measurable, more and more useful as a means for status competition) rather than monetary reward as their motivation. This seems like a good thing, at first, but is it actually a license or a prod for all art to become even more about ego than communication? In other words, is self-expression as a goal wildly overrated, especially now that it's so easy, now that we are in the so-called age of microcelebrity that Clive Thompson notes in this Wired column? Is art being subsumed to an even greater degree by the (commercially derived) ideology of personal branding? Are we getting the worst of both worlds -- a superficial, narcissistic culture without the discipline brought on by the need to make money?

In his book In Praise of Commercial Culture, economist Tyler Cowen points out that in the 18th century, when the printing press was having effects on culture similar to those the internet is having now, critics worried that the commercialization of art, the market for books, would erode the power of fame as an incentive, without which writers would produce nothing but trash. But with fame devalued now that the trappings of celebrity are open to all, it seems like money and the professionalization that went along with it were last-ditch means to uphold standards. In Cowen's view, 18th-century critics sought to impose aesthetic standards and use fame as the reward that would induce writers to adhere to them. In a similar fashion, centralized cultural production enables a few media corporations, or the state (as in China, Soviet Russia, etc.), to impose similar standards. In a market economy, mass popularity seems to justify after the fact those decisions made early on about which works met the approved standards and were worthy of being supported. But mass popularity, or monetary reward, may not be as significant when you can bask in the recognition of a niche audience and feel righteous about not having sold out. The "microcelebrity" thesis perhaps bears out Cowen's argument that there is not a limited supply of fame, and that technology and the density of intertextual references multiply the amount of fame there is to go around, albeit in ever finer measurements. But conversely, the demands on our attention may be stretched to the limit, leaving us in even greater need of filters and organizers of what's available. Commercial gatekeepers once served this function; perhaps now social networking tools (linked in to targeted advertising) will replace them. Nothing, though, stands to discourage anyone from producing culture and "cluttering" the public sphere with it. I waver between thinking this is a pervasive triumph over passivity and fretting that it's a disaster that has made self-branding and the commercialization of our intimate identity commonplace -- an eagerly sought accomplishment that we hope to confirm in the public sphere.

Having cheered for so long against the culture-industry Goliath (without ever really suspecting it was actually vulnerable), I haven't often stopped to consider what we lose with its decline. The need to make art that will sell is usually derided as forcing artists to pursue the lowest common denominator and compromise their vision. But it may also have required artists to focus, to consider how effective their work would be on audiences. Respect for the bottom line typically makes people more receptive to criticism, and criticism from invested parties generally improves things. And the commercial entertainment industry performed a useful filtering service, putting hurdles between artists and audiences that eliminated some poetasters (and, unfortunately, some talented but easily deterred entertainers). One could be critical of what made it through that initial filter, but usually the fact that it made it through meant it was worth taking the time to criticize -- it had been chosen and produced from among thousands of other contenders. But free from the restrictions of commercialism, artists can ignore criticism and be as self-indulgent as they choose, selecting self-referential topics and making no effort to generalize their subject matter so that others may get more out of it. Instead, artists can develop the expectation that others should be interested in their work for the sake of the person making it, that it need be interesting only on a personal level, the way Facebook pages are supposed to be.

Consumer confidence and consumerism (23 Nov 2007)

After years of people going into deeper debt to fund steady increases in consumption, it seems like consumer spending is finally going to give -- buckling under the strain of increased fuel prices, drops in housing prices, and suddenly tighter lending standards -- just in time for Black Friday and the high retail season. Both the Economist and BusinessWeek ran cover stories about the possibility of a recession in America stemming from consumers' inability to keep on consuming. The Economist story notes that "even if the economy technically avoids a recession, it will feel like one to most Americans—because it will be led by consumers. That will be a big change. Consumer spending has not fallen in a single quarter since 1991; it has not fallen on an annual basis since 1980. Consumers barely noticed America's last recession—when low interest rates and high house prices kept them spending solidly." In other words, easy credit has made consumers feel entitled, even obliged, to spend. The loss of disposable income/loan funds to spend will force consumers to get more creative to stretch their dollars to provide the same amount of shopping excitement. If shopping action can be likened to gambling action, shoppers may have to drop down to cheaper tables and throw out fewer bets for the dealers.

For years, credit was easily available, at interest rates that almost made it imprudent not to borrow, especially considering that housing prices were perpetually increasing, supplying new collateral for further borrowing. Hence people would extract equity from their homes in the form of loans and spend it on consumer goods. The stereotype -- one I admittedly have a weakness for -- is that self-indulgent Americans were splurging on flat-screen TVs, luxury cars, electronic gadgets and whatnot, but it also includes things like college tuition, cell-phone services, child care, medical expenses, and things less glamorous and easy to condemn as wasteful. (Law professor Elizabeth Warren has a good paper on the "overconsumption myth." She argues that "The Over-Consumption story dominates any discussion of the financial condition of America’s families, but when all the plusses and minuses of changes in family spending are added up, a very different picture emerges. Families are spending less on ordinary consumption and more on the basics of being middle class." Whether the basics of being middle class are skewed, or subject to hedonic-treadmill-style escalation into frivolous unnecessaries, is a different question, but people feel obliged to spend what they must to hold on to the status they've achieved, regardless of whether what they spend it on is truly useful or necessary in the abstract.)

Michael Mandel's piece in BusinessWeek surveys the likelihood of a consumer pullback, balancing the optimists against the pessimists and ultimately making it seem as though consumer spending is divorced from underlying economic forces, and that consumers instead respond to vague impressions they get from the economic zeitgeist. Thus, Mandel comments that the Fed needs to use rate policy to encourage consumers to remain calm. "More rate cuts by the Fed can cushion the impact of the consumer cutbacks but not avert them altogether. It's best to think of this as the end of a long-term spending and borrowing bubble, where the role of policy is to keep the inevitable adjustment from turning into panic." If rates stabilize, perhaps people will continue to feel comfortable tapping the "about $4 trillion in unused borrowing capacity on their credit cards" that remains available to them in the aggregate. Because ours is such a consumerism-oriented culture, institutional forces to encourage shopping regardless of conditions are already entrenched -- think of the expanse of the advertising infrastructure, or the way shopping today is a news story on every local news program across the country, or the flood of credit card solicitations that come to our mailboxes virtually daily.

I'm prone to mistaking a drop in consumer confidence for a potentially pervasive loss of faith in consumer values, even though the two have little to do with each other. Just because people report that they are worried about how much they can spend doesn't mean they have suddenly made their peace with doing less shopping and finding alternative preoccupations. It's not like they are losing confidence in the promised power of things to make them happy. If anything, advertisers likely redouble their efforts in down times, and people rely more than ever on the fantasies ads evoke, in lieu of being able to actually get the things advertised. The fantasies can sustain them until purchasing power returns, and the objects of the fantasies probably become even more alluring.

But whenever consumer confidence dips, or consumer spending drops, or retailers report weaker earnings than expected, I tend to see this as good news, as proof that people are busy doing something else. That's probably because I think of consumption mainly as frivolous consumerism, as a self-defeating preoccupation with acquiring things rather than making the best use of them. If economic conditions divert people from consumerism, maybe they will refocus on making the most of what they already have, better conserve what already exists, and find alternatives to consumption as ways of spending time -- consuming leisure rather than goods, availing themselves of shared, cooperative public activities rather than retrenching in private and partaking in invidious comparison -- figuring out ways to gloat about how much higher on the ladder one is, or how one's belongings prove how much better one's taste in things is.

But of course, when consumer confidence drops and consumption levels suffer, growth is restricted, investment falls off, and unemployment rises along with general anxiety. People are not likely to seize upon recessions and relative privation as great opportunities to get in touch with the "things that really matter in life," as consumption measures do take those things into account. This is where the longstanding argument about whether levels of consumption correlate with levels of reported happiness comes into play. On the face of things, the correlation seems weak; people don't tend to be any happier as their incomes improve, since they adapt quickly to their new horizons, and the stress of keeping up with unfamiliar mores in new socioeconomic classes takes its toll. But some argue that self-reporting is no way of measuring happiness, because people have no useful perspective on themselves, and that the clear improvements in standards of living measured in other terms -- in productivity and leisure and in the richness and diversity and quality of goods -- are, though taken for granted, extremely significant advances that no one would voluntarily surrender. These things clearly derive from economic growth driven by stimulating consumption.

Condition branding (or manufacturing depression) (19 Nov 2007)

In the New York Review of Books, Frederick Crews (the Hawthorne scholar?) looks at three books that argue that depression has been fomented by the pharmaceutical industry, which stands to benefit directly from any increase in depression diagnoses. If you have ever seen the film Johnny Mnemonic -- if any of those 12 people are reading -- this will sound familiar:
Most of us naively regard mental disturbances, like physical ones, as timeless realities that our doctors address according to up-to-date research, employing medicines whose appropriateness and safety have been tested and approved by a benignly vigilant government. Here, however, we catch a glimpse of a different world in which convictions, perceived needs, and choices regarding health care are manufactured along with the products that will match them.
The corporate giants popularly known as Big Pharma spend annually, worldwide, some $25 billion on marketing, and they employ more Washington lobbyists than there are legislators. Their power, in relation to all of the forces that might oppose their will, is so disproportionately huge that they can dictate how they are to be (lightly) regulated, shape much of the medical research agenda, spin the findings in their favor, conceal incriminating data, co-opt their potential critics, and insidiously colonize both our doctors' minds and our own.
In Johnny Mnemonic, Keanu Reeves had to rescue the world from some pharmaceutically manufactured chronic disease that would make the world entirely dependent on an evil drug company's ministrations. (I think it also involved the perilous downloading of information into Keanu's overloaded brain.) The filmmakers probably didn't have SSRIs in mind then, but the analogy would have been apt: Citing one of the books, Crews notes the SSRIs' "horrific withdrawal symptoms, such as dizziness, anxiety, nightmares, nausea, and constant agitation, that were frightening some users out of ever terminating their regimen—an especially bitter outcome in view of the manufacturers' promise of enhancing self-sufficiency and peace of mind. The key proclaimed advantage of the new serotonin drugs over the early tranquilizers, freedom from dependency, was simply false."

That loss of individual autonomy in the face of marketing campaigns, and the slipperiness of diagnosing mental illness on the basis of a movable feast of symptoms, is part of the premise of the books Crews looks at: Big Pharma uses advertising to transform what may once have been considered character traits into pathologies that one should treat with medication, promoting the sense that individual idiosyncrasy is a kind of disability that needs to be corrected, so we can all conform to the same master personality, the sort of synthetic pseudo-humans we see impersonated on television, people who are always happy, never hostile or self-sabotaging, never wracked by doubt, never anything but eager to cooperate and behave how society expects. Turning idiosyncrasies into mental-health problems is known as "condition branding" -- the industry treats the name of a disease as a brand and promulgates it with the same marketing techniques that a company would use for toothpaste or laundry detergent, a process that has brought us social anxiety and restless legs syndrome and, these books argue, the depression epidemic.

Such a thesis is certain to offend lots of people who are debilitated by depression and may potentially see this line of argument as an attack on their right to feel better. For them, Crews has this rejoinder:
This isn't to say that people who experience infrequent minor depression without long-term dysfunction aren't sick enough to deserve treatment. Of course they are. But as all three of the books under consideration here attest, the pharmaceutical companies haven't so much answered a need as turbocharged it. And because self-reporting is the only means by which nonpsychotic mental ailments come to notice, a wave of induced panic may wildly inflate the epidemiological numbers, which will then drive the funding of public health campaigns to combat the chosen affliction.

The books are not simply denying the severity of a particular illness; instead they offer a subtler attack on individual autonomy, implying that people can be talked into feeling sick by advertising and other devious promotional campaigns. The premise of these books would seem to imply that people don't really know what to make of what they are feeling, and that it's our inclination to turn to social norms for guidance. In our commercial, consumerist society, of course, those norms are bought and sold, and they hinge on solutions that allow for shopping and consuming objects with magic-seeming properties of transformation -- like, say, Prozac. Crews calls such drugs political sedatives, since any relief they seem to provide also serves to dissuade us from wondering whether commercial imperatives dictated their prescription.

But, as Crews explains, this is an old story, and it goes back to the imperatives that underlie the whole edifice of a consumer society, which hinges on building an insecure populace that can be counted on to seek comfort in goods. Advertising is the art of creating dissatisfaction, and preying on personal vulnerabilities is an especially efficient way of accomplishing that end. The process is perhaps at its most extreme when the goods being advertised are medicines, and the dissatisfaction is elevated to the level of a disease that one ignores only at one's personal peril. The tendency of marketing to drift toward this maximalist approach is one reason sensible countries ban pharmaceutical advertising directed at consumers rather than at theoretically disinterested medical professionals.

The ice-men of competitive minigolf (17 Nov 2007)

I love miniature golf, the more preposterous the holes the better. I like loop-de-loops, rotating obstacles, crossing wood-plank bridges over moats, the whole thing. I even played a glow-in-the-dark goofy golf course in some dingy cellar on Clifton Hill in Niagara Falls.

But though I like a healthy amount of chance mixed into my minigolf, I still play to win. When I used to go down to a friend's beach house in Ocean City, New Jersey, we became dork aficionados of the many boardwalk courses and eventually got to the point where we'd bring our own putters and balls to the courses, to up the level of competition (and perhaps to compensate for the edge taken away by the beverages that were also brought along). But no matter how geeky we got, we never approached the level of the men profiled in this Wall Street Journal story by Charles Forelle about competitive miniature golf, as it's played in Scandinavia.
In Europe, competitors like Mr. Ryner play a rigorously pure form of miniature golf. Course designs are more Mies van der Rohe than Myrtle Beach -- clean lines, crisp angles, geometric obstacles. There are no garden gnomes astride the mini fairways. No toy windmills. No water hazards teeming with plywood crocodiles. Here, minigolf is an athletic fugue of golf and billiards, a challenge of precision and consistency.
I was shocked to discover that these hardcore minigolfers have a range of balls that they use for different surfaces and different angles, and that they can hit shots with deliberate spin. They even go to the trouble of heating or cooling balls when necessary to get the right amount of bounce off the walls.

Forelle maintains throughout the perfect A-hed-story tone of haute seriousness ("athletic fugue" is genius), but what makes the story priceless is the quotes collected from the stern Europeans who compete with such rigorous purity.
Minigolf requires stamina and precise control. Most of all, it takes mental fortitude, says Hans Bergström, a computer specialist at Volvo and president of the European Minigolfsport Federation. "You have a very small muscle movement that makes the difference. If you cannot control your nerves, you will get it wrong," he says. "The very best players in the world are ice-cold men."
What does this say about the Swedes and Germans who seem to dominate the sport?

To grow the sport in America, some entrepreneurs are pushing to enliven it with goofier holes. But one of the champions is not pleased with the idea that courses will become more gimmicky to make the sport more enticing and perhaps televisable (and if you've read this far, I definitely recommend you watch all the clips on the WSJ's interactive video feature):
Walter Erlbruch bristles at the memory of a round of American-style minigolf. The passing blades of a windmill scooped up putted balls and flung them into a pool. "Luck," sniffs Mr. Erlbruch. "If you make a nonsense of my sport, I don't like it."

I'm sure somewhere in America miniature golf is played with this level of intensity, but it has never managed to reach even the level of horseshoes in terms of respectability here as an adult game. That's probably because it tends to be a family activity, meaning competitors are at unequal levels of ability. This encourages course designs that negate the role of talent, or else it prompts adult players not to get too hung up on playing well in order to keep it "fun" for everyone -- so they play down to the level of the kid who's whacking the ball around with no conception of the rules or the purpose of keeping score. Also, it's probably never caught on with adults here because, unlike, say, bowling or darts, drinking is not usually integrated with playing minigolf. Minigolf courses -- inexplicably, to my mind -- don't typically have bars on site. You are discouraged from beer drinking while putt-putting, which is strange considering how commonplace drinking is on real golf courses, where players typically have to pilot motor vehicles around and send flying projectiles through the air with as much velocity as they can muster. But then, my pleasure in minigolf may strictly be a nostalgic thing for childhood, when cutthroat competition meant trying to get the ball in the clown's nose for a free game, not trying to make sure you weren't forced to work overtime without compensation just to keep your job.