Casino Crawl
If you’ve been to an American discount store — Macfrugal’s, Factory 2-U, Dee & Dee, any of the thousands of 99-cent stores — then you’ve already experienced many of the dubious thrills of being a Las Vegas tourist. The garish, chintzy products, ranging from the theoretically practical (melon-ballers, liquid-soap dispensers) to the inexplicable (garlic shampoo, hair mayonnaise), and the overt rip-offs of familiar brands (why buy Sony when you can have Coby?) that you find at discount stores are analogous to the ersatz reconstructions of ritzier tourist destinations (Paris, Venice, New York, New Orleans), and the thin veneer of glitz over everything the Las Vegas Strip has to offer. And the gamble you take when you buy a package of “Blooper” T-shirts, hoping they’re not too irregular to wear, is not so different from taking a spin at the nickel slots.
The gaudy, overlit Strip is a monument to second-rate simulation and faux glamour. The displays are calculated and contrived to mark the point where over-the-top hits rock bottom; where perfect naiveté becomes indistinguishable from perfect cynicism. Both 99-cent stores and tourist Las Vegas make no effort to disguise their exuberant tawdriness; it seems to bespeak a humble wish to please through sheer quantity and brute pandering. In the face of such a generous profusion of stimuli, a critical attitude can seem like fussy, uptight caviling.
The downscale, vaguely disreputable shoppers one sees in the narrow, overstocked aisles at the discount store are much like the people I find in the heart of tourist Las Vegas. One night, as I jostle my way up and across a pedestrian overpass at the intersection of the Strip and Flamingo Road (the latter named for the Flamingo casino, of course), thronged to capacity, buzzing with electricity and the flickering of enormous video screens, I pass the drifters — the desert rats, as they’re known out west, with their leathery tans and deeply etched squints from too much desert sun. Even at night they slit their eyes as if to protect them from the flashing lights. The desert rats manage to hustle jobs distributing pornographic pamphlets printed with phone numbers of “dancers” and “massage artists”. I squeeze past the overweight women bulging out of their undersized clothes and the men in their customary “eveningwear” of Bermuda shorts and auto racing T-shirts, sipping fruity booze drinks through a straw from giant plastic vessels shaped like the Eiffel Tower or a pyramid or the Empire State Building, depending on where they were purchased.
Amid the tourists, I feel a press of momentum toward no particular destination, despite the superfluity of alleged attractions: the dancing fountains at the Bellagio, the erupting volcano at the Mirage, the pirate-ship battle at Treasure Island, the talking statues inside Caesars Palace. On the street, the cars barely move. And the pedestrians, when they’re not forced off street level onto overpasses, clot the intersections, hardly trying to pick out traffic signals from the sea of flashing color. They stand in the road, oblivious, taking pictures while horns blare. But most of the drivers are in no hurry. Their glass-packs rumble, the bass booms. Periodically someone yells something incoherently celebratory out of the window, as if driving through the heart of these crowds and the glare of the casino lights makes them feel as though a TV camera has been pointed at them. Sometimes you see a bunch of people drinking on the back of a pickup truck, usually with California plates and tinted windows, and they’ll all start yelling, too, as soon as they feel they are being looked at.
Inside the casinos, which are surprisingly hard to tell apart considering their disparate themes — perhaps proving that as far as American tourists are concerned, there are no essential differences between, say, Ancient Egypt and contemporary New York — there are lines: lines for the buffets; lines for the steakhouses, where the prime rib can be had for under $10; lines for the casino cage and the change window, where paper and plastic money is changed to coins and chips; and lines for the ATM (“the only thing paying out”, as one cheerful tourist from Massachusetts tells me). The only place where there is no line is at the slot machines, many featuring tie-ins to television shows. The faces of TV stars rotate on the simulated slot reels while the TV theme songs intermittently play, each machine’s music competing with the one next to it.
With a cacophony of beeps and bells going off all around me, I wander confusedly through the aisles. Much as I do on my sojourns to Macfrugal’s, I try to reassure myself that I’m not like “them”. I’m not one of those credulous rubes being taken in by phony bargains and bad risks. Rather, I have some ironic reason for being there. Maybe, I think, that self-consciousness exempts me: If I know I’m being taken, I can’t still be a fool, right?
Both the 99-cent store and tourist Las Vegas are microcosms of American consumer capitalism at its most dubious. These are places where the unscrupulous rush to find the maximum profit margin among the desperate, the gullible, and the easily beguiled, and they make no effort to conceal it. What both offer is the chance, no matter how parodic or benighted, to participate in the luxurious joys of the abundance American society never tires of prescribing, the opulent lifestyle portrayed in entertainment and advertisements alike. When you’re selling hope to the hopeless, you can use the cheapest, most vulgar means to evoke their dreams and counterfeit their fulfillment. To go to Vegas is to immerse yourself in that vulgarity and the exploitation it trumpets. Every gleaming hotel tower, every simulacrum of an ancient wonder, was built not through some noble human aspiration for splendor or permanence, but for carefully harvesting the stupidity and cupidity of others. For no other reason than it exists: a fertile field of pure profit that never need lie fallow.
Every effort has been made to streamline the profit-making potential of people’s ignorance. Living here, you can enjoy the prosperity brought on by that harvest, but only in the form it necessarily takes, as quantity over quality, as lures and traps you know enough to selectively avoid. It’s easy to see why some choose to move here. There’s no income tax, there’s a superfluity of well-paid, low-skill jobs in the construction and service industries, and there’s an abundance of cheap, cheap houses — half the price for something twice the size of what one could get in nearby California. You have construction workers who move here to build houses for the other construction workers who have come to build the new hotels or work on the roads that are replacing the twenty-year-old roads. The roads are torn down as methodically as if their obsolescence had been planned as carefully as that of a make of automobile soon to go out of style.
As a place where the most resonant tradition is an overwhelming rejection of moralistic meddling, Las Vegas seems to represent a chance at pure freedom. It’s a place that seems to refuse to make judgments and won’t ask questions of you when you suddenly arrive, no matter the hour. Everyone’s invited. And virtually no one’s been rooted in this land long enough to be snobbish about latecomers. The fervent wish of many Americans who may be stuck in bad jobs or married to bad spouses or have perhaps done a few bad things, to “start clean”, forget their past, and invent new and better selves, matches the valley’s own perpetual reconstruction and permanent growth, with no apparent compass to steer it and with no legislation to slow it down. People come to town to take advantage: of the weather, of the libertarian climate, of the affordability, of each other.
Because when you come to live in Las Vegas — lest you become one of the exploited — you’re likely to link arms with the exploiters and gleefully find opportunity in the weakness of others, expressed through their gambling or their insatiable appetite for junk. But all the while that corrosive attitude slowly corrupts your own tastes as it sharpens your sense of advantage. It bedazzles away your ability to see others as humans instead of marks. It erodes an ethic of compassion. Before you know it, you have the self-serving ethics of a con man. An article in The New York Times on Las Vegas’s growth (“A Life as a Live! Nude! Girl! Has a Few Strings Attached,” 2 June 2004) offers a perfect example of the instrumental rationality the city encourages, through the story of Trixie, a stripper who “is generous about sharing with other dancers . . . the cold science of milking the customers for everything they’ve got”. “It’s just like predators, just like in the rest of human nature,” she says. “You condition them like Pavlov’s dogs.” In other words, you can enjoy the tax-free life and home-owning ease that Las Vegas offers (a new house is built every 20 minutes on average, and even that can’t keep pace with demand), but you must also have no qualms about holding the leash of the slobbering tourists, or about having your own chain pulled.
Getting In On the Action
Gambling is as good a place as any to turn when you’ve lost faith in the idea that your society rewards merit, and one can only muster a belief in the lucky chance. At the craps table, impotence strangely transforms into omnipotence. As I wait for the shooter to throw the dice, listening to the stickman repeat his mantra of proposition bets — “Your C and E’s, your any seven, the horn, the hard ways” — I start to think the outcome will be affected by how rigidly I adhere to my ritual: first sipping my complimentary gin and tonic, then taking a drag of my cigarette and returning it to the ashtray but without letting it settle in a notch, then touching my glasses, and then waiting, taking care never to look the shooter in the face and never to look at the dice when they land.
Anyone who gets seriously involved with gambling knows that for the gambler, it’s not about money or even the feeling of winning. It’s about the “action”, which is the ability to make bets and keep them perpetually unresolved. The more bets you can keep open-ended, the more action you’re seeing. Action reduces the world to a sublimely manageable scale: the next card, the next roll, the next spin of the wheel. Action means things are in motion, but they’re not going anywhere in particular, and that’s just fine. When you have action, you can’t think of the past or the future, only the present moment matters, intensely. As long as you have action, things matter and apathy is impossible.
Beside me a shriveled man, breathing with the aid of an oxygen tank on wheels, writes down every number thrown in a little spiral notebook; he is “clocking the dice”, searching for patterns. Mathematically, this is insane, but at the table, it seems reasonable enough; it’s proactive, he’s just looking for an edge. “Six, corner six, no field six, comes go to six.” I’m off-and-on my come bet for $34 with $30 more still working. Suddenly my most meaningless actions have great portent. Suddenly I am shaping reality with the power of my superstitious thought. Ordinarily, I’m burdened with a sense that nothing I can do matters, that in an increasingly complex world any individual’s contribution toward solving a problem won’t even register, if it isn’t inadvertently making the problem worse. This creeping nihilism calls the very notion of belief into question, and without belief, one is paralyzed.
At the craps table, however, I forget all that, the same way the thousands who have always come to Las Vegas forget their dead-end situations, with no options and no future to invest any belief in. For a few escapist hours, I have important options (Bet the Pass or the Don’t Pass? Place the numbers or play the Field?) and I can act decisively, and my decisions contribute directly to how much longer I can stay in the action, and for those hours, action is all there is in the world. When you have limited options, be it from a lack of education or ambition, or the absence of a support system, familial or communal, and you are trained most of your life to be a passive, obedient sponge, whether at the workplace, where you’re expected to do no more than follow procedures, or at leisure, watching television shows that cue you when to laugh and cry, the idea of taking action, of being able to do something, anything, that’s meaningful can take on mythic proportions.
But action is not activity, it’s always something you have and never something you do. You can’t generate it yourself, and it can’t be sustained. Action allows you to feel what it’s like to be active rather than passive. It offers the fantasy of significance without ever really being significant. Except to the casinos, which don’t care at all about action but care very much about money. The gaming industry knows precisely, with verifiable certainty, how many pennies it is going to make for every dollar wagered, and there’s nothing romantic or whimsical about these calculations. Its profits can be measured with the relentless exactitude of a pendulum clock. Despite the celebrity of certain casino builders — Bugsy Siegel, Donald Trump, Steve Wynn — the casino business is not a business for dreamers or visionaries. In reality, gaming is the most zero-sum of industries; there is no win-win, there is no fair trade. The laws of probability are simply leveraged ruthlessly against those too ignorant or ornery or desperate to abide them. With mathematical certainty I should know deep down that the house wins, and I lose; if I can believe in nothing else, I should try to believe in that.
Where the Streets Have No Name
Out in the valley’s foothills, in Henderson, Nevada, the rapidly expanding suburb to the southeast of Las Vegas, it feels like I could be anywhere, or nowhere — like an airport, minus the liminality. There are no historical markers, no monuments, no indigenous plants or trees, nothing that you haven’t seen somewhere else but with more contingencies. Without the shopping centers, the apartment complexes, the tract housing, the casinos, there’s literally nothing; no history out here, just a “now” that never ends.
The ersatz glamour and pandering gaudiness of the Strip seem far away, even though you can still see the row of tall hotel towers from the newly built Interstate 215, nearing completion after several years of construction. At the most recently finished exits of that road, at Windmill Lane and Eastern Avenue, housing developments and retail strips have sprung up as fast as they could be built, and the few vacant lots have enormous signs driven deep into their rocky, inhospitable soil, advertising the coming construction. On this apparent blank slate it seems as if humankind can write its dreams, unhampered by anything but the limits of its own imagination and technology. Perhaps this has been the goal of American history all along, to achieve the tabula rasa promised to colonists by the idea of a “New World”. But surveying the rows of identical houses, replicating with viral efficiency, I wonder how such a dream could have amounted to this.
Before the highway was built, before corporations began taking advantage of what the Las Vegas Chamber of Commerce Web site calls a “pro-business environment in which companies do not pay corporate income, franchise, inventory, or unitary taxes”, the idea that anyone would live out in these hills was unthinkable. Located in the hottest desert in North America, this land constitutes a vast emptiness, devoid of all but the most rugged vegetation — low-growing shrubs, cacti and yuccas — as one finds in Death Valley, the shimmering desert just north of town. But now, as the Chamber of Commerce Web site points out, “Every hour, another two acres of Las Vegas land are developed for commercial or residential use”, and “developers of master-planned communities . . . are running out of new street names”.
It takes the full breadth of humankind’s scientific powers to make living out here even conceivable, and perhaps this is a justification in and of itself for doing it, the same way Mount Everest needs to be climbed simply because it’s there. Also, it’s a perfect place for people who regard weather and nature as obstacles to overcome and eliminate, rather than phenomena to experience. Seasons? Who needs them. Weather does have the unfortunate habit of upsetting plans, just as the natural world has the tendency to intrude upon us at inopportune moments. One of the achievements of modernity is to make nature something we can experience on a tourist basis, on our terms, when we feel up to the aesthetic appreciation we’ve come to feel it deserves. We’ll go to Vail when we’re interested in experiencing winter, we’ll go hiking when we want to think about how pleasant trees can be. Otherwise, we’ll live in Henderson, Nevada, Phoenix, Arizona, or San Jose, California, or any of the other similar sprawl towns that have emerged from the desert in the past few decades.
This kind of landscape constitutes a conservative’s dream. It’s a perfect, airtight marketplace where nothing intrudes upon the merciless operation of the market: no local traditions, no environmental concerns, no inclement weather, no government restrictions, no sense of community solidarity. It’s a perfect capitalist laboratory. With such rapid growth, no political constituency has ever had a chance to stabilize; people are overwhelmed or diluted into incoherence before they come to know themselves, let alone identify their own potential power. With little threat of community action, businessmen and politicians can act with impunity, and cooperate with one another in schemes with nothing but the bottom line to answer to. Out here, in the featureless, anonymous desert, inside any of its casinos, all identically beguiling, uniformed employees make the soft count, dumping ten-gallon buckets full of quarters into enormous counting machines while others push wheeled steel cages, full of drab metal boxes stuffed full of money, while still others present to men wearing suits the carbon copies of the receipts that account for it all. Similar people in similar uniforms serve the food in the franchise restaurants, work the registers in the chain clothing stores, stock the shelves of the twenty-four-hour convenience stores, wash the SUVs, and sell them. Here, the people all wear name tags but the names are meaningless; behind every uniform there is only the unnamed and faceless profit making of corporate enfranchisement.
In suburbs like Henderson it’s easy to see what results when the ruthless rationality of the marketplace goes unchecked. The tract housing multiplies, chain stores rush in to rival other chain stores, and the dreams expressed through small businesses or individual entrepreneurship are quietly snuffed. Without anything to restrain it, the psychological compulsion to be always new, the very motor of consumer capitalism with its cycles of fashion and planned obsolescence, is accelerated to the point of absurdity.
Out here where the tourists don’t come, you might expect to find a completely different set of priorities embodied in a wholly different architecture, and a rhythm to life directly opposed to the bustling impatience and endless exhortations to be excited that you see on the Strip. The Strip suggests a squalid decadence, waste for its own sake. And at first glance, the bland homogeneity of Henderson seems to express a kind of streamlined pragmatism. With the tidy concision of a geometric proof, the mixture of new homes and new chain restaurants and grocery stores and enormous parking lots and twenty-four-hour drug stores seem to express the American wish for a quiet life, padded everywhere by unceasing convenience and unsullied cleanliness; the kind of life where you don’t have to talk to anyone outside your family unless that someone is serving you. You can float above the fray, cocoon yourself away from the dog-eat-dog world that Trixie the stripper talks about. The entire desert community is without reference to anything that’s not man-made, and all the nature you see, the tiny, carefully landscaped lawns (which must be watered twice daily to survive) and artfully positioned trees (usually to maximize shade for automobiles) is man-made, engineered, designed, and entirely liberated from the desert’s ecology, with something of the same tactfulness that Target lavishes on a soap dish.
Every few weeks, whenever a new shopping center opens and one more interstate exit stretches farther out into the pristine suburbs, I feel obliged to go, even though the stores are all basically the same — there might be a Macaroni Grill instead of an Olive Garden, or an Albertsons instead of a Smith’s supermarket, a Borders instead of a Barnes and Noble. But the stores themselves aren’t important; it is more the feeling of having virgin territory to explore that matters. The parking lots are half finished, and if you aren’t careful you could drive right off the edge of the concrete into the desert. You’d be hard-pressed to find a single piece of chewed gum stuck to the pavement anywhere. Even the dumpsters around back emanate a halo of freshness. The rows of SUVs, in parking spots actually designed to accommodate them, gleam in the sun; I catch blinding reflections off the tinted windows as I scurry for the air-conditioning inside. By comparison, the shopping centers closer to town feel moribund, antiquated, once they’ve become merely functional and are no longer brand new. My fellow residents of Henderson seem to agree. They seem to flock with me to the new centers simply because they are new, the same way we all check out the new casinos when they are built, even though we all agree they are all basically the same.
Though they affect a kind of contempt for the tourists who underwrite much of the valley’s growth, the permanent residents of Henderson actually mirror them in mentality, as they are tourists in their everyday lives, seeking the same blend of arbitrary novelty and comfortable familiarity, the same diverse shopping opportunities. Both are blithely untroubled by the lack of a meaningful or authentic context for their actions, both are happily without anything in their environment to attach them to concrete, determined reality, to a sense of necessity — both float in simulacra, technologically separated from nature, whether it’s through elaborately executed reconstructions of fantasy landscapes or through the simple efficacy of air-conditioning. If there is a difference, it’s that Hendersonites seem to side with the House instead of searching for action. They prefer the dreary certainties of a convenient life to a life of dreams and surprises.
The citizens suit themselves perfectly to the chain-store universe: no obstacles impel them to be creative or to seek solace in a community. Nothing leads them beyond themselves, and they are all too happy to fend for themselves. After all, cooperation means inevitable compromise, and that just isn’t all that comfortable.
Las Vegas grows and continues to grow — a giant, thirsty weed reaching farther into the desert — because it can, because our economy fetishizes growth for its own sake. But out in the foothills you can see where the building must stop, you see that the growth must end, and then you wonder: what will all those home-owning construction workers do then? Where will they go next? Or will they just tear it all down and start over? Whatever they do, no one is likely to stop them.
Marginal Utility Mirror
Sunday, February 19, 2012
Orwell essay (18 Dec 2008)
It’s a shame that the word Orwellian now signifies totalitarian surveillance and remodeling reality with lies. Judging by the persona George Orwell establishes in his essays, of which Harcourt has recently issued two new collections, Orwellian could easily have come to mean a bluff impatience with pretentiousness, or the tendency to evoke the ordinary person’s point of view as a defense of one’s own tenacious positions, or the no-nonsense voice he achieves by preferring to risk overstatement rather than waste words.
The two new volumes are a welcome and long overdue overhaul of the earlier A Collection of Essays, which now seems skimpy and inadequate in comparison. By including more of his shorter efforts, reviews and occasional journalistic pieces, editor George Packer, a New Yorker staff writer, gives a more complete picture of Orwell’s preoccupations while making palpable the pressures he wrote under. Not only was he sickly—he was wounded in the throat during his Spanish Civil War stint and long struggled with tuberculosis, which would kill him at age 46—but he was entirely engrossed by the Second World War from its origins to its aftermath.
His acute awareness of the moral crisis into which it cast Western civilization colors every word he wrote, whether he was eviscerating antiwar novels and the politics of literary poetasters, trying to rationalize the popularity of obscene popular culture, or assessing his own war experiences both during the Blitz and as a soldier abroad. Reading these essays, you always have the sense that for Orwell, the end of civilization was palpable, that a fog had settled on the world that he and his peers had all taken for granted, and when it lifted, they would find themselves in unknowable circumstances, wherein no received truths, no former certainties about the inherent goodness of human nature and the benevolence of technological progress, could be taken for granted.
In compiling the two volumes, Packer smartly divides Orwell’s essays into narrative pieces (Facing Unpleasant Facts) and critical pieces (All Art Is Propaganda). This useful arrangement keeps the confrontational bluster of his criticism separate from the occasional sanctimony and grandiosity in the autobiographical material to reveal the underlying consistency of his thinking throughout. The critical essays are anchored in his belief that first-hand experience of misery, war, and despotism is virtually mandatory for a writer to have any credibility in an age such as he wrote in: “So much of left-wing thought is a kind of playing with fire by people who don’t even know that fire is hot,” he notes in “Inside the Whale”. In his narrative essays, he is often out to demonstrate his own bona fides on this point and show readers just how hot the fires he has known were.
Calling the essays in Facing Unpleasant Facts “narrative” is something of a misnomer, however. It’s not as though he’s telling stories, except in well-known pieces like “Shooting an Elephant”, and even then he is often baldly interested in illustrating a point. Orwell is not an essayist who is content to describe an incident that’s redolent with metaphoric possibility and let readers work it out if they choose. Generally he comes right out and states his purpose, as in “A Hanging”, when he declares, “It is curious but until that moment I had never realized what it means to destroy a healthy, conscious man.” Even so, what he relates evokes much more than the conclusions he draws or the motives for writing that he shares. He is enamored and mystified enough by things as they are, in and of themselves, that in describing them, he conveys some of that ineluctable mystery that allows human life to persist in the face of wrenching misery and innumerable examples of our unrelenting capacity for cruelty toward our own species.
Rather than tell open-ended tales, Orwell does what essayists since Montaigne have always done—use scraps of personal experience to illustrate concise conclusions, which then are presented as though they are being discovered as the essayist is writing. Whether discussing secondhand bookstores, English cooking, or Luftwaffe bombs raining on London, Orwell will make a sharp point based on a personal hunch and then try to disavow anything unusual in his observation, hoping it will pass as something that would have occurred to anyone with open eyes. He reveals the essence of his method when in his essay about Marrakech he states plainly an attitude that is implicit throughout the book: “I am not commenting, merely pointing to a fact.”
Of course, the facts one chooses to point to, the details we decide to acknowledge, are usually comment enough. The essays make plain that Orwell relishes describing corpses, stenches, and squalor, even if he usually refrains from becoming sensationalistic about it. Such things, clearly, seemed indicative to him of the world as it was. But they more clearly indicate what Orwell thinks his audience doesn’t know or accept about their world. “To survive you have to fight, and to fight you have to dirty yourself,” he notes in “Looking Back on the Spanish War”, but the “intelligentsia”—possibly Orwell’s favorite pejorative—prefers not to understand this. “The fact that such a platitude is worth writing down,” Orwell says of his own observation, “shows what the years of rentier capitalism have done to us.”
Considering how concerned Orwell was with clear expression, it’s tricky to write about his work without lapsing into a pastiche of it. You want to follow the guidelines he lays out in his famous “Politics and the English Language”, particularly since he is uncharacteristically optimistic about the chances of saving the language from its devolution into Newspeak. Generally, he’s successful in being his own best example of what he thinks writing should be—free of slippery, lazy phrases and showy literary flourishes.
That’s not to suggest his essays are free of rhetorical figures; rather, when he deploys a metaphor, you can visualize it immediately and its meaning is always unmistakable. And when he shifts to abstractions, it’s usually with an air of apology. He was constitutionally allergic to all forms of orthodoxy, which he viewed as inherently indicative of the absence of thought. “To write in plain, vigorous language,” he writes in “The Prevention of Literature”, “one has to think fearlessly, and if one thinks fearlessly one cannot be politically orthodox.”
Thus wary of insufficient boldness, Orwell often favors provocative hyperbole and isn’t afraid to contradict himself, sometimes within the span of a few phrases. He often seems to argue with himself, as if he had too much momentum to go back and cross out something ultimately insupportable. Instead he tries to reason with whatever side of himself could have committed such an idea to paper. What results from all this is an occasionally lumpy but always lively prose with a blunt matter-of-fact rhythm that’s hard to argue with and extremely tempting to imitate.
In his criticism, Orwell’s main concern is to avoid coming across as an effete intellectual, or more particularly, a member of the “pansy left” that he never tires of excoriating. By that epithet, he meant what we might now in the US dub the Kucinich left—pacifists enamored of their own self-righteous ideals and unwilling to recognize the impracticability of peace in the face of undeniable threats. Describing his own era’s left intelligentsia in “England Your England”, Orwell writes, “There is little in them except the irresponsible carping of people who have never been and never expect to be in the position of power.”
Among the critical essays, Orwell’s “No, Not One”, a caustic review of a novel by Alex Comfort, is his most emphatic dismissal of kneejerk pacifism. “Pacifism is only a considerable force in places where people feel themselves very safe,” he declares, arguing that it is tantamount to being “pro-Nazi.” Ultimately, Orwell insists pacifism occurs only in those who “have no experience” of violence and can’t understand its inevitability—a species he typifies as “some comfortable English professor defending Russian totalitarianism.”
Elsewhere he claims that “men can only be highly civilized while other men, inevitably less civilized, are there to guard and feed them.” So vigorous is Orwell in the defense of necessary violence—that “good will is powerless unless the policeman is there to back it up”—that he ends up sounding like a precursor of Colonel Jessup, Jack Nicholson’s character in A Few Good Men who apoplectically forwarded “You need me on that wall” line of logic. At times, Orwell seems so flustered with the pansy pacifists he seems to imagine are reading him skeptically, you expect him to declare, “You can’t handle the truth!” (It’s no wonder Orwell was a popular name for neo-conservatives to bandy about in the run-up to the Iraq war; Christopher Hitchens obviously borrowed freely from him.)
Orwell was just as skeptical of technological optimists as he was of pacifists. Of course, 1984makes plain the degree to which he thought technology could be abused by totalitarian power, but skepticism of the benevolence of progress recurs continually in his essays. Though appreciative of modernism, he rejects the idea of literary progress and defends writers who have fallen out of fashion. And in his essay about H.G. Wells, Orwell outlines his own loss of faith in science, which was just as useful to the Nazis as it was for the rest of the world. “The aeroplane,” he writes, “was looked forward to as a civilizing influence but in practice has hardly been used except for dropping bombs.”
In Orwell’s view, Wells was unable to grasp the same things that eluded the pansy left: “He was, and still is, quite incapable of understanding that nationalism, religious bigotry and feudal loyalty are far more powerful forces than what he himself would describe as sanity. Creatures out of the Dark Ages have come marching into the present, and if they are ghosts they are at any rate ghosts which need a strong magic to lay them.” That’s Orwell at his most pessimistic, certain that Western Civilization has exhausted itself and the high literature it nurtured, and apprehensive about what would rise in its stead.
Orwell’s purpose, then, is rummage through the cultural flotsam and jetsam that has already implausibly managed to survive in search of essential human qualities that will remain when the Götterdämmerung finally concludes. In “Lear, Tolstoy, and the Fool” Orwell writes, “In reality, there is no kind of evidence or argument by which one can show that Shakespeare, or any other writer, is ‘good.’… Ultimately there is no test of literary merit but survival, which is itself merely an index to majority opinion.” But he doesn’t scorn majority opinion; he regards it with surprising respect. One of Orwell’s most appealing tendencies as a critic is that he never presumes to improve our tastes. He dispenses with aesthetic appreciation in favor of sociological questions, and he rarely seeks to justify his own preferences. He is pleased to come across as the common man’s representative, delivering common sense to a snob intelligentsia whose contrarian posturing has left it twisted it up with “humbug.”
His interest in what has survived leads him to take popular tastes for what they are, and to take popular culture seriously. In his essay on boy’s weeklies—cheap magazines with formulaic stories for teenagers—he argues that “people are influenced far more than they would care to admit by novels, serial stories, films and so forth, and that from this point of view the worst books are often the most important.” Since what we consume directly affects what we end up believing about the world, Orwell wants to figure out what sort of beliefs derive from “vulgar” popular tastes. What’s striking is that he assumes these beliefs will ultimately be decent and good, whereas the morals of art that fails to win wide approval are inherently suspect.
In “Benefit of Clergy”, his essay on Salvador Dali, he contemptuously dismisses the view that “the artist is to be exempt from the moral laws that are binding on ordinary people.” But he is eager to absolve illustrator Donald McGill, an artist who specialized in ribald postcards of women bending over, and Tropic of Cancer author Henry Miller, whose apathy and obscenity Orwell seems to find irresistible. To Orwell, both McGill’s and Miller’s work testify to the enduring Sancho Panza in all of us, that self-serving side of ourselves that prefers survival to heroism or virtue, that would have us slip passively through our particular era by being true to an aspect of our nature that transcends all eras. Enjoying that side of ourselves vicariously through dirty postcards and smutty novels, we reserve the energy it takes for us to live up to civilized society’s stringent demands—as Orwell insists, we generally do.
Whereas Orwell sympathizes with the earthy weaknesses of common folk and excuses the culture that derives from them, he has no such sympathy for the pampered intellectual class. In a review of a Charlie Chaplin’s The Great Dictator, Orwell goes so far as to declare that “the common man is wiser than the intellectuals, just as animals are wiser than men.” As usual, Orwell posits a kind of instinctual goodness that comes from a simpler, more earthy life. But ordinary people, unlike animals and intellectuals, retain an acute sense of right and wrong. “The common people, on the whole, are still living in the world of absolute good and evil from which the intellectuals have long since escaped,” he writes. Though this sometimes leads to common people preferring simplistic fiction, it also testifies to their moral clarity, while the moral compass of most intellectuals, in Orwell’s view, wass completely askew.
In the essay about Tolstoy’s condemnation of Shakespeare, Orwell writes, “Tolstoy was not a saint, but he tried very hard to make himself into a saint, and the standards he applied to literature were other-worldly.” Shakespeare, who embodies good badness as much as any writer, appeals to the “normal human being” who simply “wants life on earth to continue.” But saintly reformers like Tolstoy (and Gandhi) would prefer to coerce us into the Kingdom of Heaven, and in that attitude, Orwell insists, lies egotism and the “appetite for power.” These are the people who “are convinced of the wickedness of both armies and of police forces, but who are nevertheless much more intolerant and inquisitorial in outlook than the normal person who believes that it is necessary to use violence in certain circumstances,” he writes. “They will, if they can, get inside [the normal person’s] brain and dictate his thoughts for him in the minutest particulars.” In other words, the Kucinich left is equivalent to the Thought Police, though it is generally too hypocritical to recognize it.
In general, Orwell forgives and even champions bad art, obscenity, and quietism if it seems honest. Intellectuals, though, are dishonest about their aims, which they obfuscate by abusing language; their ambitions, which they cloak by abjuring the more explicit forms of violence; and their appreciation of popular culture, as they seek to shun their connection to ordinary people. This Orwell finds unforgivable. He assumes that anything the intellectuals hate must have some redeeming qualities. So he’s willing to hold his nose and dig into a bête noire of the Left like Rudyard Kipling, a writer who despite being “morally insensitive and aesthetically disgusting” has enough sense to recognize that “a humanitarian is always a hypocrite.” Since Kipling has the good imperialist’s sense of responsibility, according to Orwell he is able to “create telling phrases” and become a “good bad poet”—that is, one who grasps universal experiences and renders it in easily remembered language. Kipling “dealt largely in platitudes, and since we live in a world of platitudes, much of what he said sticks.”
Orwell devotes an entire essay to “good bad books”—“the kind of book that has no literary pretensions but which remains readable when more serious productions have perished”—but in a sense all of his literary criticism deals with the subject and the messages that redeem whatever vulgar way they happen to be expressed. As the collection’s title suggests, Orwell saw art as essentially propaganda; “good bad” works had the advantage of propagandizing for humble and obvious ideas rather than dangerous, overambitious ones. Good bad books are written by “natural novelists… who seem to attain sincerity partly because they are not inhibited by good taste.”
Good badness thus works as obscenity does in McGill’s postcards, crudely capturing “ordinary” sentiments we all can recognize in ourselves even if we are not quick to acknowledge them. This, for Orwell, ultimately explains which works survive. Like Kipling, good bad writers are unafraid to appeal straightforwardly to a common sensibility that we all are presumed to share, which allows them to survive when more intentionally iconoclastic work ceases to shock and fades away. Most famously, Orwell credits Dickens with the same staying power: Though Dickens is irredeemably bourgeois, he expressed “in a comic, simplified, and therefore memorable form the native decency of the common man.”
Appreciating avant-garde art, championing utopian crusades, sneering at plebian entertainments: these are available only to a pampered leisure class. Orwell instead romanticizes an emotional Spartanism that’s open to everyone. “Happiness hitherto has been a byproduct, and for all we know it may always remain so,” he writes in “Can Socialists Be Happy?” The reason to fight for Socialism is not to establish heaven on earth or test the mettle of one’s own commitment but to achieve “human brotherhood,” which he defines as “a world in which human beings love one another instead of swindling and murdering one another.”
Orwell’s essays testify to the notion that any effort to accomplish that appealingly straightforward ideal will be threatened by intellectual arrogance. But they also remind us that better than our best intentions is our inescapable nature, our shared ordinariness, which will always have the potential to redeem us all if only we will embrace it.
The two new volumes are a welcome and long overdue overhaul of the earlier A Collection of Essays, which now seems skimpy and inadequate in comparison. By including more of his shorter efforts, reviews and occasional journalistic pieces, editor George Packer, a New Yorker staff writer, gives a more complete picture of Orwell’s preoccupations while making palpable the pressures he wrote under. Not only was he sickly—he was wounded in the throat during his Spanish Civil War stint and long struggled with tuberculosis, which would kill him at age 46—but he was entirely engrossed by the Second World War from its origins to its aftermath.
His acute awareness of the moral crisis into which it cast Western civilization colors every word he wrote, whether he was eviscerating antiwar novels and the politics of literary poetasters, trying to rationalize the popularity of obscene popular culture, or assessing his own war experiences both during the Blitz and as a soldier abroad. Reading these essays, you always have the sense that for Orwell, the end of civilization was palpable, that a fog had settled on the world that he and his peers had all taken for granted, and when it lifted, they would find themselves in unknowable circumstances, wherein no received truths, no former certainties about the inherent goodness of human nature and the benevolence of technological progress, could be taken for granted.
In compiling the two volumes, Packer smartly divides Orwell’s essays into narrative pieces (Facing Unpleasant Facts) and critical pieces (All Art Is Propaganda). This useful arrangement keeps the confrontational bluster of his criticism separate from the occasional sanctimony and grandiosity of the autobiographical material, revealing the underlying consistency of his thinking throughout. The critical essays are anchored in his belief that first-hand experience of misery, war, and despotism is virtually mandatory for a writer to have any credibility in an age such as his: “So much of left-wing thought is a kind of playing with fire by people who don’t even know that fire is hot,” he notes in “Inside the Whale”. In his narrative essays, he is often out to demonstrate his own bona fides on this point and show readers just how hot the fires he has known were.
Calling the essays in Facing Unpleasant Facts “narrative” is something of a misnomer, however. It’s not as though he’s telling stories, except in well-known pieces like “Shooting an Elephant”, and even then he is often baldly interested in illustrating a point. Orwell is not an essayist who is content to describe an incident that’s redolent with metaphoric possibility and let readers work it out if they choose. Generally he comes right out and states his purpose, as in “A Hanging”, when he declares, “It is curious but until that moment I had never realized what it means to destroy a healthy, conscious man.” Even so, what he relates evokes much more than the conclusions he draws or the motives for writing that he shares. He is enamored and mystified enough by things as they are, in and of themselves, that in describing them, he conveys some of that ineluctable mystery that allows human life to persist in the face of wrenching misery and innumerable examples of our unrelenting capacity for cruelty toward our own species.
Rather than tell open-ended tales, Orwell does what essayists since Montaigne have always done—use scraps of personal experience to illustrate concise conclusions, which then are presented as though they are being discovered as the essayist is writing. Whether discussing secondhand bookstores, English cooking, or Luftwaffe bombs raining on London, Orwell will make a sharp point based on a personal hunch and then try to disavow anything unusual in his observation, hoping it will pass as something that would have occurred to anyone with open eyes. He reveals the essence of his method when in his essay about Marrakech he states plainly an attitude that is implicit throughout the book: “I am not commenting, merely pointing to a fact.”
Of course, the facts one chooses to point to, the details we decide to acknowledge, are usually comment enough. The essays make plain that Orwell relishes describing corpses, stenches, and squalor, even if he usually refrains from becoming sensationalistic about it. Such things, clearly, seemed indicative to him of the world as it was. But they more clearly indicate what Orwell thinks his audience doesn’t know or accept about their world. “To survive you have to fight, and to fight you have to dirty yourself,” he notes in “Looking Back on the Spanish War”, but the “intelligentsia”—possibly Orwell’s favorite pejorative—prefers not to understand this. “The fact that such a platitude is worth writing down,” Orwell says of his own observation, “shows what the years of rentier capitalism have done to us.”
Considering how concerned Orwell was with clear expression, it’s tricky to write about his work without lapsing into a pastiche of it. You want to follow the guidelines he lays out in his famous “Politics and the English Language”, particularly since he is uncharacteristically optimistic about the chances of saving the language from its devolution into Newspeak. Generally, he’s successful in being his own best example of what he thinks writing should be—free of slippery, lazy phrases and showy literary flourishes.
That’s not to suggest his essays are free of rhetorical figures; rather, when he deploys a metaphor, you can visualize it immediately, and its meaning is always unmistakable. And when he shifts to abstractions, it’s usually with an air of apology. He was constitutionally allergic to all forms of orthodoxy, which he viewed as inherently indicative of the absence of thought. “To write in plain, vigorous language,” he writes in “The Prevention of Literature”, “one has to think fearlessly, and if one thinks fearlessly one cannot be politically orthodox.”
Thus wary of insufficient boldness, Orwell often favors provocative hyperbole and isn’t afraid to contradict himself, sometimes within the span of a few phrases. He often seems to argue with himself, as if he had too much momentum to go back and cross out something ultimately insupportable. Instead he tries to reason with whatever side of himself could have committed such an idea to paper. What results from all this is an occasionally lumpy but always lively prose with a blunt matter-of-fact rhythm that’s hard to argue with and extremely tempting to imitate.
In his criticism, Orwell’s main concern is to avoid coming across as an effete intellectual, or more particularly, a member of the “pansy left” that he never tires of excoriating. By that epithet, he meant what we might now in the US dub the Kucinich left—pacifists enamored of their own self-righteous ideals and unwilling to recognize the impracticability of peace in the face of undeniable threats. Describing his own era’s left intelligentsia in “England Your England”, Orwell writes, “There is little in them except the irresponsible carping of people who have never been and never expect to be in the position of power.”
Among the critical essays, Orwell’s “No, Not One”, a caustic review of a novel by Alex Comfort, is his most emphatic dismissal of kneejerk pacifism. “Pacifism is only a considerable force in places where people feel themselves very safe,” he declares, arguing that it is tantamount to being “pro-Nazi.” Ultimately, Orwell insists pacifism takes hold only among those who “have no experience” of violence and can’t understand its inevitability—a species he typifies as “some comfortable English professor defending Russian totalitarianism.”
Elsewhere he claims that “men can only be highly civilized while other men, inevitably less civilized, are there to guard and feed them.” So vigorous is Orwell in the defense of necessary violence—that “good will is powerless unless the policeman is there to back it up”—that he ends up sounding like a precursor of Colonel Jessup, Jack Nicholson’s character in A Few Good Men who apoplectically forwarded the “You need me on that wall” line of logic. At times, Orwell seems so flustered with the pansy pacifists he imagines reading him skeptically that you expect him to declare, “You can’t handle the truth!” (It’s no wonder Orwell was a popular name for neo-conservatives to bandy about in the run-up to the Iraq war; Christopher Hitchens obviously borrowed freely from him.)
Orwell was just as skeptical of technological optimists as he was of pacifists. Of course, 1984 makes plain the degree to which he thought technology could be abused by totalitarian power, but skepticism of the benevolence of progress recurs continually in his essays. Though appreciative of modernism, he rejects the idea of literary progress and defends writers who have fallen out of fashion. And in his essay about H.G. Wells, Orwell outlines his own loss of faith in science, which was just as useful to the Nazis as it was to the rest of the world. “The aeroplane,” he writes, “was looked forward to as a civilizing influence but in practice has hardly been used except for dropping bombs.”
In Orwell’s view, Wells was unable to grasp the same things that eluded the pansy left: “He was, and still is, quite incapable of understanding that nationalism, religious bigotry and feudal loyalty are far more powerful forces than what he himself would describe as sanity. Creatures out of the Dark Ages have come marching into the present, and if they are ghosts they are at any rate ghosts which need a strong magic to lay them.” That’s Orwell at his most pessimistic, certain that Western Civilization has exhausted itself and the high literature it nurtured, and apprehensive about what would rise in its stead.
Orwell’s purpose, then, is to rummage through the cultural flotsam and jetsam that has already implausibly managed to survive, in search of essential human qualities that will remain when the Götterdämmerung finally concludes. In “Lear, Tolstoy, and the Fool” Orwell writes, “In reality, there is no kind of evidence or argument by which one can show that Shakespeare, or any other writer, is ‘good.’… Ultimately there is no test of literary merit but survival, which is itself merely an index to majority opinion.” But he doesn’t scorn majority opinion; he regards it with surprising respect. One of Orwell’s most appealing tendencies as a critic is that he never presumes to improve our tastes. He dispenses with aesthetic appreciation in favor of sociological questions, and he rarely seeks to justify his own preferences. He is pleased to come across as the common man’s representative, delivering common sense to a snob intelligentsia whose contrarian posturing has left it twisted up with “humbug.”
His interest in what has survived leads him to take popular tastes for what they are, and to take popular culture seriously. In his essay on boys’ weeklies—cheap magazines with formulaic stories for teenagers—he argues that “people are influenced far more than they would care to admit by novels, serial stories, films and so forth, and that from this point of view the worst books are often the most important.” Since what we consume directly affects what we end up believing about the world, Orwell wants to figure out what sort of beliefs derive from “vulgar” popular tastes. What’s striking is that he assumes these beliefs will ultimately be decent and good, whereas the morals of art that fails to win wide approval are inherently suspect.
In “Benefit of Clergy”, his essay on Salvador Dali, he contemptuously dismisses the view that “the artist is to be exempt from the moral laws that are binding on ordinary people.” But he is eager to absolve illustrator Donald McGill, an artist who specialized in ribald postcards of women bending over, and Tropic of Cancer author Henry Miller, whose apathy and obscenity Orwell seems to find irresistible. To Orwell, both McGill’s and Miller’s work testify to the enduring Sancho Panza in all of us, that self-serving side of ourselves that prefers survival to heroism or virtue, that would have us slip passively through our particular era by being true to an aspect of our nature that transcends all eras. Enjoying that side of ourselves vicariously through dirty postcards and smutty novels, we reserve the energy it takes for us to live up to civilized society’s stringent demands—as Orwell insists, we generally do.
Whereas Orwell sympathizes with the earthy weaknesses of common folk and excuses the culture that derives from them, he has no such sympathy for the pampered intellectual class. In a review of Charlie Chaplin’s The Great Dictator, Orwell goes so far as to declare that “the common man is wiser than the intellectuals, just as animals are wiser than men.” As usual, Orwell posits a kind of instinctual goodness that comes from a simpler, more earthy life. But ordinary people, unlike animals and intellectuals, retain an acute sense of right and wrong. “The common people, on the whole, are still living in the world of absolute good and evil from which the intellectuals have long since escaped,” he writes. Though this sometimes leads to common people preferring simplistic fiction, it also testifies to their moral clarity, while the moral compass of most intellectuals, in Orwell’s view, was completely askew.
In the essay about Tolstoy’s condemnation of Shakespeare, Orwell writes, “Tolstoy was not a saint, but he tried very hard to make himself into a saint, and the standards he applied to literature were other-worldly.” Shakespeare, who embodies good badness as much as any writer, appeals to the “normal human being” who simply “wants life on earth to continue.” But saintly reformers like Tolstoy (and Gandhi) would prefer to coerce us into the Kingdom of Heaven, and in that attitude, Orwell insists, lies egotism and the “appetite for power.” These are the people who “are convinced of the wickedness of both armies and of police forces, but who are nevertheless much more intolerant and inquisitorial in outlook than the normal person who believes that it is necessary to use violence in certain circumstances,” he writes. “They will, if they can, get inside [the normal person’s] brain and dictate his thoughts for him in the minutest particulars.” In other words, the Kucinich left is equivalent to the Thought Police, though it is generally too hypocritical to recognize it.
In general, Orwell forgives and even champions bad art, obscenity, and quietism if it seems honest. Intellectuals, though, are dishonest about their aims, which they obfuscate by abusing language; their ambitions, which they cloak by abjuring the more explicit forms of violence; and their appreciation of popular culture, as they seek to shun their connection to ordinary people. This Orwell finds unforgivable. He assumes that anything the intellectuals hate must have some redeeming qualities. So he’s willing to hold his nose and dig into a bête noire of the Left like Rudyard Kipling, a writer who despite being “morally insensitive and aesthetically disgusting” has enough sense to recognize that “a humanitarian is always a hypocrite.” Since Kipling has the good imperialist’s sense of responsibility, according to Orwell he is able to “create telling phrases” and become a “good bad poet”—that is, one who grasps universal experiences and renders them in easily remembered language. Kipling “dealt largely in platitudes, and since we live in a world of platitudes, much of what he said sticks.”
Orwell devotes an entire essay to “good bad books”—“the kind of book that has no literary pretensions but which remains readable when more serious productions have perished”—but in a sense all of his literary criticism deals with this subject, with messages that redeem whatever vulgar form they happen to be expressed in. As the collection’s title suggests, Orwell saw art as essentially propaganda; “good bad” works had the advantage of propagandizing for humble and obvious ideas rather than dangerous, overambitious ones. Good bad books are written by “natural novelists… who seem to attain sincerity partly because they are not inhibited by good taste.”
Good badness thus works as obscenity does in McGill’s postcards, crudely capturing “ordinary” sentiments we all can recognize in ourselves even if we are not quick to acknowledge them. This, for Orwell, ultimately explains which works survive. Like Kipling, good bad writers are unafraid to appeal straightforwardly to a common sensibility that we all are presumed to share, which allows them to survive when more intentionally iconoclastic work ceases to shock and fades away. Most famously, Orwell credits Dickens with the same staying power: Though Dickens is irredeemably bourgeois, he expressed “in a comic, simplified, and therefore memorable form the native decency of the common man.”
Appreciating avant-garde art, championing utopian crusades, sneering at plebeian entertainments: these are available only to a pampered leisure class. Orwell instead romanticizes an emotional Spartanism that’s open to everyone. “Happiness hitherto has been a byproduct, and for all we know it may always remain so,” he writes in “Can Socialists Be Happy?” The reason to fight for Socialism is not to establish heaven on earth or test the mettle of one’s own commitment but to achieve “human brotherhood,” which he defines as “a world in which human beings love one another instead of swindling and murdering one another.”
Orwell’s essays testify to the notion that any effort to accomplish that appealingly straightforward ideal will be threatened by intellectual arrogance. But they also remind us that better than our best intentions is our inescapable nature, our shared ordinariness, which will always have the potential to redeem us all if only we will embrace it.
Book Arbitrage (2 July 2008)
On a shopping-friendly holiday like Memorial Day or Labor Day, when thrift stores were open and likely to have sales, we would travel to an unfamiliar city, with a few empty milk crates in the car and a full day to waste. Armed with a page torn out of a pay-phone phone book (this was back when there still were pay phones, with phone books in the booths in bolted-down binders) and a map from the automobile club, we would devise our plan of attack and chart out a route for hitting as many thrift stores as we could as quickly as we could: the big, national names—St. Vincent de Paul, Goodwill, Deseret Industries, Salvation Army, Savers, Value Village—as well as the small-time charities for local hospitals and food banks and women’s shelters and the like. Sometimes there would be library-sponsored book sales mixed in. We didn’t discriminate. For our purposes, sheer volume was of the essence.
Though we’d give over entire days to spending our money at these charity stores, we didn’t regard ourselves as philanthropists. We saw ourselves more as a clever criminal gang or, if we were less inclined to romanticize ourselves, as savvy arbitrageurs, seizing upon a loophole we uncovered and exploiting it to the farthest degree we could manage. Our goal was straightforward: Get a trove of books to resell at the independent bookstores in the college town where we lived, amassing store credit and magically transmuting such dreck as Barbara Kingsolver and Tony Hillerman novels into the gold we wanted, the abstruse social-theory texts that looked so imposing and impressive on a budding graduate student’s bookshelf.
So in our blitzkrieg assault on a town’s thrift stores, we didn’t bother with the clothes, shoes, or bric-a-brac. We were interested in books, and only books. Working methodically, shelf by shelf through whatever chockablock collection of battered, secondhand bookcases the store had marshaled to house its collection, we would pull out any title that appeared to have value. We coveted trade-size paperbacks the most; we’d buy any of those, particularly since they were usually priced the same as the pocket-size ones and were generally contemporary enough to make for easy resale. Hardcovers were less attractive because they were inexplicably more expensive and impossible to resell unless they were from some collectible niche or about an unusual subject.
It doesn’t take long to train your eye to spot worthy spines—after a few book-scouting trips, you would start to home in on certain shapes, certain fonts, seemingly arbitrary characteristics that experience had taught us made a book sellable. Often you didn’t need to know the first thing about the title or the author to know that it could be resold.
But the bottom line was typically price. If a thrift store was having a book sale—10 paperbacks for a dollar, maybe—we’d lower our standards accordingly. We would take a chance on some books that might not sell, that might linger in our crates for a while and frustrate and embarrass us. These were books you wouldn’t want people thinking you actually owned—Tom Clancy paperbacks, diet books, books from the Left Behind series, that sort of thing—much less actually liked.
Sometimes, though, we would find a book we intended to keep for ourselves; this was the most triumphant disintermediating achievement of all. This is how I got my copy of Pedagogy of the Oppressed, at Speedway Outlet in Tucson, Arizona, and an Everyman edition of Samuel Richardson’s Pamela Volume II at a Savers in Flagstaff.
On these missions, as the day would wear on, our crates would quickly fill up. We took great care to keep our scores separate—we were competing with each other as well as with the bookstores we intended to eventually sell them to. When the crates were close to overflowing and the sun was about to set, we would head home and begin strategizing for the second half of our book-arbitrage scheme.
You wouldn’t think that taking a box of used books to an independent bookstore would be fraught with anxiety. For most people—for ordinary people—this is a once-a-year activity at most, a satisfying moment perhaps of a successful purging, a species of what New York Times Magazine columnist Rob Walker has called unconsumption. But for us, it was a tactical battle with a wily opponent, the bookstores’ purchasing agent, who was, we imagined, trained to try to thwart renegade book arbitrageurs like ourselves by stonewalling, stalling, or low-balling us on our hard-earned haul.
So we made very precise attack plans, organized the books in our crates carefully to foster the illusion that the stuff throughout was consistently sellable. It was necessary that the sequence of books the buyer would see be well-orchestrated, telling a particular tale about our own curatorial tastes, that they could be trusted implicitly. We wanted to lull the buyer into a comfort zone, build her confidence in our crates, so that she might give the benefit of the doubt to an unfamiliar title, or take a few books merely out of having fallen into a rhythm of pulling out keepers. Nothing was more humiliating than when a buyer passed on what you brought and you had to carry a nearly full crate out of the store. Bearing this burden of shame signaled an open, outright rejection that we felt at a surprisingly personal level. Those crates embodied the deliberate exercise of our judgment, and when they failed to convince, we could only wonder whether we had lost our touch.
Some stores’ buyers were tougher than others, and this led to the establishment of an indie-bookstore hierarchy in our minds. The tougher the buyers were, the scarcer the store credit would be and the more likely it was the store would have special, expensive treasures we yearned for. (Could we have simply bought these books? Well, we were earning a graduate student stipend, which hovers somewhere around minimum wage.) We’d start with the more selective stores, and as our crate dwindled, we’d bring the dregs into Bookman’s, a statewide chain with low standards but which offered notoriously low amounts of credit. And Bookman’s selection was sizeable, but stocked mainly with what we knew were the dregs.
While the bookstores would sometimes quote a cash figure for the books they wanted to buy from us, they always offered far more in store credit. Invariably, we took the credit. We had a lofty notion that we were in this for the knowledge—for more books—not for money. In our rejection of cash, we would even semi-self-deprecatingly quote a line that Joan Didion quoted in Slouching Towards Bethlehem: “the hippies scorn money—they call it bread”.
The problem with cash is that it immediately shifted the terms of the game, and how who was winning and who was losing was assessed. When it was a matter of who ended up with the better books, we believed we were the clear victors. I took away all three volumes of Kolakowski’s Main Currents of Marxism for a hodgepodge of forgettable contemporary fiction that I salvaged from a Mormon thrift store: Me 1, Book Haven 0. But looked at from a more bottom-line oriented perspective, it was not so flattering. The bookstores was getting to mark up by at least 100 percent the books we brought them, after hours and hours of unpaid labor, and their employees got to spend their days reading books and judging the likes of us while we were out there doing the grunt work. Our uncompensated labor helped kept their store going, and all we got out of it were some esoteric scraps. And when the culture of independent bookstores were celebrated, none of that glory redounded to us. We had already pitted ourselves against them in our hearts, and we envied them too.
Because book arbitrage had become a demented hobby for us, we didn’t mind being paid in books—that translated into prestige with the right sort of audience. And we enjoyed the work for its own sake, and for the comforting illusions about ourselves we drew from it. But eventually, when we were no longer in the cloistered economy of graduate school and had to make our way in the real world, we discovered that unread books on the shelf don’t offer much sustenance and no one is nearly as impressed with our collections as we were with ourselves. Then the collections slowly get dismantled and sold, for real, and all the chimerical gains from our arbitrage are wiped out for good.
Though we’d give over entire days to spending our money at these charity stores, we didn’t regard ourselves as philanthropists. We saw ourselves more as a clever criminal gang or, if we were less inclined to romanticize ourselves, as savvy arbitrageurs, seizing upon a loophole we uncovered and exploiting it to the farthest degree we could manage. Our goal was straightforward: Get a trove of books to resell at the independent bookstores in the college town where we lived, amassing store credit and magically transmuting such dreck as Barbara Kingsolver and Tony Hillerman novels into the gold we wanted, the abstruse social-theory texts that looked so imposing and impressive on a budding graduate student’s bookshelf.
So in our blitzkrieg assault on a town’s thrift stores, we didn’t bother with the clothes, shoes, or bric-a-brac. We were interested in books, and only books. Working methodically, shelf by shelf through whatever chockablock collection of battered, secondhand bookcases the store had marshaled to house its collection, we would pull out any title that appeared to have value. We coveted trade-size paperbacks the most; we’d buy any of those, particularly since they were usually priced the same as the pocket-size ones and were generally contemporary enough to make for easy resale. Hardcovers were less attractive because they were inexplicably more expensive and impossible to resell unless they were from some collectible niche or about an unusual subject.
It doesn’t take long to train your eye to spot worthy spines—after a few book-scouting trips, you would start to home in on certain shapes, certain fonts, seemingly arbitrary characteristics that experience had taught us made a book sellable. Often you didn’t need to know the first thing about the title or the author to know that it could be resold.
But the bottom line was typically price. If a thrift store was having a book sale—10 paperbacks for a dollar, maybe—we’d lower our standards accordingly. We would take a chance on some books that might not sell, that might linger in our crates for a while and frustrate and embarrass us. These were books you wouldn’t want people thinking you sincerely owned—Tom Clancy paperbacks, diet books, books from the Left Behind series, that sort of thing—much less actually liked.
Sometimes, though, we would find a book we intended to keep for ourselves; this was the most triumphant disintermediating achievement of all. This is how I got my copy of Pedagogy of the Oppressed, at Speedway Outlet in Tucson, Arizona, and an Everyman edition of Samuel Richardson’s Pamela, Volume II, at a Savers in Flagstaff.
On these missions, as the day would wear on, our crates would quickly fill up. We took great care to keep our scores separate—we were competing with each other as well as with the bookstores we intended to eventually sell them to. When the crates were close to overflowing and the sun was about to set, we would head home and begin strategizing for the second half of our book-arbitrage scheme.
You wouldn’t think that taking a box of used books to an independent bookstore would be fraught with anxiety. For most people—for ordinary people—this is a once-a-year activity at most, a satisfying moment perhaps of a successful purging, a species of what New York Times Magazine columnist Rob Walker has called unconsumption. But for us, it was a tactical battle with a wily opponent, the bookstores’ purchasing agent, who was, we imagined, trained to try to thwart renegade book arbitrageurs like ourselves by stonewalling, stalling, or low-balling us on our hard-earned haul.
So we made very precise attack plans and organized the books in our crates carefully to foster the illusion that the stuff throughout was consistently sellable. It was necessary that the sequence of books the buyer would see be well orchestrated, telling a particular tale about our curatorial tastes: that they could be trusted implicitly. We wanted to lull the buyer into a comfort zone, build her confidence in our crates, so that she might give the benefit of the doubt to an unfamiliar title, or take a few books merely out of having fallen into a rhythm of pulling out keepers. Nothing was more humiliating than when a buyer passed on what you brought and you had to carry a nearly full crate out of the store. Bearing this burden of shame signaled an open, outright rejection that we felt at a surprisingly personal level. Those crates embodied the deliberate exercise of our judgment, and when they failed to convince, we could only wonder whether we had lost our touch.
Some stores’ buyers were tougher than others, and this led to the establishment of an indie-bookstore hierarchy in our minds. The tougher the buyers were, the scarcer the store credit would be and the more likely it was the store would have special, expensive treasures we yearned for. (Could we have simply bought these books? Well, we were earning a graduate student stipend, which hovers somewhere around minimum wage.) We’d start with the more selective stores, and as our crates dwindled, we’d bring what remained into Bookman’s, a statewide chain with low standards but notoriously low amounts of credit. Bookman’s selection was sizable, but it was stocked mainly with what we knew to be dregs.
While the bookstores would sometimes quote a cash figure for the books they wanted to buy from us, they always offered far more in store credit. Invariably, we took the credit. We had a lofty notion that we were in this for the knowledge—for more books—not for money. In our rejection of cash, we would even semi-self-deprecatingly quote a line that Joan Didion quoted in Slouching Towards Bethlehem: “the hippies scorn money—they call it bread”.
The problem with cash is that it immediately shifted the terms of the game—how winning and losing were assessed. When it was a matter of who ended up with the better books, we believed we were the clear victors. I took away all three volumes of Kolakowski’s Main Currents of Marxism for a hodgepodge of forgettable contemporary fiction that I salvaged from a Mormon thrift store: Me 1, Book Haven 0. But looked at from a more bottom-line-oriented perspective, it was not so flattering. The bookstores got to mark up the books we brought them by at least 100 percent, after our hours and hours of unpaid labor, and their employees got to spend their days reading books and judging the likes of us while we were out there doing the grunt work. Our uncompensated labor helped keep their stores going, and all we got out of it were some esoteric scraps. And when the culture of independent bookstores was celebrated, none of that glory redounded to us. We had already pitted ourselves against them in our hearts, and we envied them too.
Because book arbitrage had become a demented hobby for us, we didn’t mind being paid in books—that translated into prestige with the right sort of audience. And we enjoyed the work for its own sake, and for the comforting illusions about ourselves we drew from it. But eventually, when we were no longer in the cloistered economy of graduate school and had to make our way in the real world, we discovered that unread books on the shelf don’t offer much sustenance and no one is nearly as impressed with our collections as we were with ourselves. Then the collections slowly get dismantled and sold, for real, and all the chimerical gains from our arbitrage are wiped out for good.
Thursday, February 2, 2012
Google and the Production of Curiosity (1 Feb 2012)
On Twitter, PJ Rey resurrected this August 2010 op-ed by William Gibson that has new currency given the hullaballoo about Google's privacy-policy changes. Gibson argues that Google is an unanticipated form of artificial intelligence, "a sort of coral reef of human minds and their products." But this description sounds less like artificial intelligence and more like Marx's notion of the general intellect. Anticipating the intensification of technology, Marx claimed that machines would eventually subsume "the process of social life" and integrate it as a form of productivity.
The development of fixed capital indicates to what degree general social knowledge has become a direct force of production, and to what degree, hence, the conditions of the process of social life itself have come under the control of the general intellect and been transformed in accordance with it. To what degree the powers of social production have been produced, not only in the form of knowledge, but also as immediate organs of social practice, of the real life process.
This is pretty obscure even by Marx's standards, but autonomist Marxists (Negri, Lazzarato, Virno) have extrapolated from it a definition of general intellect that embraces, as Virno puts it, "formal and informal knowledge, imagination, ethical tendencies, mentalities and ‘language games’." Because of the membranous nature of the general intellect, when it is harnessed and integrated with capital it can recuperate all social behavior as "immaterial" production -- enriching the valence of signs, producing affects, etc. This means that "even the greater ‘power to enjoy’ is always on the verge of being turned into labouring task." That is, our consumption, especially of information, is a mode of production. The general intellect is the sum of all that information circulation.
Google, then, is the reification of the general intellect. It manages to take human curiosity and turn it into capital.
The consequences of that are profound. Our curiosity is no longer a sign of our leisure; it's an enormously important economic factor. To a degree this has always been true. Our willingness to pay attention to things is at the root of consumer demand. But it is now far more productive of informational goods in and of itself, thanks to ubiquitous online surveillance and data-storage capabilities. Much of the way we express our human curiosity can now be recorded and fed into algorithms and plotted on graphs of connections to generate more information, stimulate more curiosity, produce more demand. That's why, as Gibson points out, Google's Eric Schmidt claimed that people "want Google to tell them what they should be doing next." Google doesn't end lines of inquiry; it gives users momentum. The point of Google is to try to keep you Googling. Not only does that make their ad space more valuable, but it adds value to their search products; it thickens the membrane. As Gibson notes, "In Google, we are at once the surveilled and the individual retinal cells of the surveillant, however many millions of us, constantly if unconsciously participatory."
What that means is that Google's instantiation of the general intellect captures not merely human cooperation and collaboration, as the theorists tend to emphasize when discussing post-Fordist production and the productivity of interpersonal "virtuosity". It also captures and perhaps even emphasizes the lateral surveillance aspect of sociality -- each implementing control on everyone else, recording what they do and annotating it. Human curiosity is intensified and directed at one another. The general intellect becomes a giant spying machine. (Facebook is probably a more explicit example of that than Google, but Google seems more powerful as the received source of answers, the index of approved information.)
Gibson notes how Google makes personal identity a productive factor, a kind of capital it owns. This makes it something we are stuck with. What we have done and would like to have forgotten is part of Google's "fixed capital" that they are loath to relinquish, despite Schmidt's suggestion that teens be issued fresh identities when they become adults. (Gibson ridicules "the prospect of millions of people living out their lives in individual witness protection programs.") Instead we must adapt our understanding of who we are and what identity consists of. In The New Spirit of Capitalism, before launching into a discussion of the ideological usefulness of the term network, Boltanski and Chiapello discuss our demand for the intelligibility of shared social values.
Young cadres in particular feel a need clearly to identify the new forms of success and the new rules of the game in the economic world, in order to know how to conduct themselves and prepare their children. This demand for intelligibility exerts significant pressure for greater explanation and formalization of the rules of conduct, which will then guide action. In fact, the people tend to conform to these emergent new rules, if only because they confer meaning on what would otherwise merely seem like an arbitrary proliferation of ad hoc mechanisms and locally convenient improvisations.

I don't know about that being a "fact," but it seems plausible that social media have taught us all something about "locally convenient improvisations," for good and ill. And the explosion of Facebook and Google into our lives has disrupted the old version of intelligibility -- prompting new rules that are consistent with the new form of capitalism these media are driving.
So our common sense understanding of what it means to have a self is changing under this pressure. We no longer have the luxury of seeing ourselves as isolated individuals who make themselves as an expression of their iron internal will. Now we have our identities explicitly shaped (or maybe even dictated) by our contingent place in social networks and we can't hide that fact from ourselves. We have to relieve the dissonance of our data trail by surrendering the prerogative of claiming to be self-created and learn to love the self the data tells us we are or should be at any particular moment. We let Google tell us what to do next.
Authenticity Issues and the New Intimacies (31 Jan 2012)
Tom Slee recently began posting about MIT sociologist Sherry Turkle's latest book Alone Together. Turkle, in some ways, is the chief theorist of digital dualism; her books The Second Self (1984) and Life on the Screen (1995) helped set the terms for talking about virtual selves in cyberspace as projections of some real self that exists outside it and is deleteriously affected by these interactions. Those books are more than a little dated, but in a way that makes their arguments more striking. Just substitute Facebook for MUD in Life on the Screen; after all, what is Facebook if not a MUD in which you create and play the character of yourself?
Turkle's basic point was that computers change the people who use them (they are not neutral tools). Users begin to transfer programming metaphors to their interactions with people and psychological metaphors to the behavior of machines, and so on. This leakage between our conceptions of humans and nonhuman objects for Turkle threatens the integrity of the category of the human; reading her books sometimes feels like reading the anti-Donna Haraway. (I won't even try to relate Turkle to OOO.) Here's a typical declaration, from the introduction to the 20th anniversary edition of The Second Self:

we stand on the boundary between the physical and virtual. And increasingly, we stand on the boundary between worlds we understand through transparent algorithm and worlds we understand by manipulating opaque simulation.
That boundary seems very important to Turkle; her main concern often seems to be holding on to a firm definition of the "real" and bemoaning the encroachment of simulations on the preserves of genuine human experience. This deeply conservative standpoint stems from her theoretical grounding in Freud. In Alone Together she declares (in a quote Slee also highlights), "I am a psychoanalytically trained psychologist. Both by temperament and profession, I place high value on relationships of intimacy and authenticity." But what can be the basis for determining what is authentic? It can often seem arbitrary, even in Turkle's own anecdotes. (Her psychoanalytical background is surely what makes Turkle so interested in stories, the aspect of Alone Together that Slee focuses on in his post. The tendency of her examples to undermine themselves or dissolve into ambiguity is part of what he finds compelling about them.)
It often appears that Turkle is working with a preconceived, somewhat ahistorical notion of what identity and subjectivity must consist of, which leads her to take a condemnatory tone toward what her research unearths about human-machine hybridity. This tone tends to curtail the analysis -- serve as its own conclusion. She can come across as a "What about the children?!" concern troll, deploying all sorts of rhetorical questions to try to persuade us of the psychological harm of technology's current drift. ("In all of this, there is a nagging question: Does virtual intimacy degrade our experience of the other kind and, indeed, of all encounters, of any kind?" "Are we comfortable with virtual environments that propose themselves not as places for recreation but as new worlds to live in?" "Why am I talking to a robot? and Why do I want this robot to like me?")
Turkle sometimes seems to worry that "real" identity is being thwarted by online sociality, which fosters some sort of inauthentic identity. But I think the concern with authenticity is an expression of nostalgia for a period when it was easier to believe that one had total control over the development of one's personality and that identity came from within. Networked sociality has made that much harder to sustain, and the ideological anchors for identity have also begun to change with the times (hence the legitimization of the data self). Authenticity is a pressing personal issue now for many not because it has been suddenly lost (it's always already irrevocable), but because it has become one of the terms in the accounting system for a different form of mediated selfhood. "Authenticity" is another metric in the attention economy, measuring how believable one is to oneself in the process of broadcasting oneself. I'd expect that soon "authenticity" will be a literal metric, comparing the data trail one produces at one point in time with some earlier point to detect the degree of drift. (I know I should probably spin that out into an analysis of Lana Del Rey, but I'm thinking I'll just let that ship sail without me.)
In Alone Together, Turkle fuses a section about sociable robots with a section about social media usage to basically argue this: social media accustom us to instrumentalized friendship, and once we are used to that, we are open to crypto-relationships with robots (the "new intimacies"), since they offer nothing more than instrumental value. Since we don't want the "drama" of reciprocal real-time sociality anyway, there is basically no difference from our point of view between relating to another person and a robot. They are both merely mirrors for ourselves anyway. To a narcissist, every other person is always already a robot.
My favorite of Turkle's anecdotal subjects is "Brad," who talks about quitting Facebook and is highly articulate about the suffocating, stultifying reflexivity that social media induce. "Online life is about premeditation," he tells Turkle. This is also true of any concern for authenticity -- it involves deciding in advance what sort of spontaneous behavior to indulge in. We try to judge ourselves in terms of some ideal that is not supposed to be an ideal at all but one's natural, revealed self. But there is nothing natural about checking in with yourself on how natural you are being. Direct experience of one's authentic self is impossible once it's conceived as something that can be known in the abstract -- as something fungible, malleable, deployable -- rather than as a process, a mode of action. Assessing one's authenticity, therefore, is impossible too. It either makes no sense to ask (everything I do, I'm doing, and is thus authentic to me by definition) or involves paradoxes of reflexivity and observer bias (every time I try to see myself doing authentic things, my self-consciousness changes my behavior). Nevertheless, social media set themselves up as (or, to be fair, are taken as) forums for authentic-self-assessment -- one can't judge the authenticity of the unmediated self in real time, but one can certainly evaluate the authenticity of one's online profile or the impression others seem to have of you. That is the narcissistic trap social media set out for us.
But it is a trap also to imagine one can have some sort of direct experience of others, as if you could see the "real" person outside social media. We can't access the other's consciousness; it is always an objective performance from the outside. Nobody can ever show you their "real" self.
Slee brings up one of Turkle's anecdotes that gets at a different way of viewing things outside of authenticity:
In social media there is a material basis for an alternative to ingrained tradition in anchoring identity; a networked self could have some solidity that lets the performative nature of identity operate beyond questions of genuineness or authenticity. From a resister's perspective, this all looks as odd and mechanical as the idea of sending actors to love your parents for you. But adopters can take solace in sending out their "Profile" (to use Nathan Jurgenson's term for aggregate online presence) to perform their cemented identity within various social networks. Once you accept that Facebook's data collection roots you, you are "free" to be absent from social rituals but be present nonetheless. Welcome to the new intimacy.
What's dangerous about this is not that it has ruined some previous form of intimacy that was especially precious. The problem is that we believe we construct this social-media identity autonomously and that it is therefore our responsibility, our fault if it's limited. The social-media companies have largely succeeded in persuading users of their platforms' neutrality. What we fail to see is that these new identities are no less contingent and dictated to us than the ones circumscribed by tradition; only now the constraints are imposed by for-profit companies in explicit service of gain.
Turkle's basic point was that computers change the people who use them (they are not neutral tools). Users begin to transfer programming metaphors to their interactions with people and psychological metaphors to the behavior of machines, and so on. This leakage between our conceptions of humans and nonhuman objects for Turkle threatens the integrity of the category of the human; reading her books sometimes feels like reading the anti-Donna Haraway. (I won't even try to relate Turkle to OOO.) Here's a typical declaration, from the introduction to the 20th anniversary edition off The Second Self:
we stand on the boundary between the physical and virtual. And increasingly, we stand on the boundary between worlds we understand through transparent algorithm and worlds we understand by manipulating opaque simulation.
That boundary seems very important to Turkle; her main concern often seems to be holding on to a firm definition of the "real" and bemoaning the encroachment of simulations on the preserves of genuine human experience. This deeply conservative standpoint stems from her theoretical grounding in Freud. In Alone Together she declares (in a quote Slee also highlights), "I am a psychoanalytically trained psychologist. Both by temperament and profession, I place high value on relationships of intimacy and authenticity." But what can be the basis for determining what is authentic? It can often seem arbitrary, even in Turkle's own anecdotes. (Her psychoanalytical background is surely what makes Turkle so interested in stories, the aspect of Alone Together that Slee focuses on in his post. The tendency of her examples to undermine themselves or dissolve into ambiguity is part of what he finds compelling about them.)
It often appears that Turkle is working with a preconceived, somewhat ahistorical notion of what identity and subjectivity must consist of, which leads her to take a condemnatory tone toward what her research unearths about human-machine hybridity. This tone tends to curtail the analysis -- to serve as its own conclusion. She can come across as a "What about the children?!" concern troll, deploying all sorts of rhetorical questions to try to persuade us of the psychological harm of technology's current drift. ("In all of this, there is a nagging question: Does virtual intimacy degrade our experience of the other kind and, indeed, of all encounters, of any kind?" "Are we comfortable with virtual environments that propose themselves not as places for recreation but as new worlds to live in?" "Why am I talking to a robot? and Why do I want this robot to like me?")
Turkle sometimes seems to worry that "real" identity is being thwarted by online sociality, which fosters some sort of inauthentic identity. But I think the concern with authenticity is an expression of nostalgia for a period when it was easier to believe that one had total control over the development of one's personality and that identity came from within. Networked sociality has made that much harder to sustain, and the ideological anchors for identity have also begun to change with the times (hence the legitimization of the data self). Authenticity is a pressing personal issue now for many not because it has been suddenly lost (it's always already irrecoverable), but because it has become one of the terms in the accounting system for a different form of mediated selfhood. "Authenticity" is another metric in the attention economy, measuring how believable one is to oneself in the process of broadcasting oneself. I'd expect that soon "authenticity" will be a literal metric, comparing the data trail one produces at one point in time with that from some earlier point to detect the degree of drift. (I know I should probably spin that out into an analysis of Lana Del Rey, but I'm thinking I'll just let that ship sail without me.)
In Alone Together, Turkle fuses a section about sociable robots with a section about social media usage to basically argue this: social media accustom us to instrumentalized friendship, and once we are used to that, we are open to crypto-relationships with robots (the "new intimacies"), since they offer nothing more than instrumental value. Since we don't want the "drama" of reciprocal real-time sociality anyway, there is basically no difference from our point of view between relating to another person and relating to a robot. They are both merely mirrors for ourselves anyway. To a narcissist, every other person is always already a robot.
My favorite of Turkle's anecdotal subjects is "Brad," who talks about quitting Facebook and is highly articulate about the suffocating, stultifying reflexivity that social media induce. "Online life is about premeditation," he tells Turkle. This is also true of any concern for authenticity -- it involves a sort of deciding in advance what sort of spontaneous behavior to indulge in. We try to judge ourselves in terms of some ideal that is not supposed to be an ideal at all but one's natural, revealed self. But there is nothing natural about checking in with yourself on how natural you are being. Direct experience of one's authentic self is impossible once it's conceived as something that can be known in the abstract -- as something fungible, malleable, deployable -- rather than as a process, a mode of action. Assessing one's authenticity, therefore, is impossible too. It either makes no sense to ask (everything I do, I'm doing, and is thus authentic to me by definition) or involves paradoxes of reflexivity and observer bias (every time I try to see myself doing authentic things, my self-consciousness changes my behavior). Nevertheless, social media set themselves up as (or, to be fair, are taken as) forums for authentic-self-assessment -- one can't judge the authenticity of the unmediated self in real time, but one can certainly evaluate the authenticity of one's online profile or the impression others seem to have of you. That is the narcissistic trap social media set for us.
But it is a trap also to imagine one can have some sort of direct experience of others, as if you could see the "real" person outside social media. We can't access the other's consciousness; it is always an objective performance from the outside. Nobody can ever show you their "real" self.
Slee brings up one of Turkle's anecdotes that gets at a different way of viewing things outside of authenticity:
Visiting Japan in the early 1990s, Turkle heard tales of adult children who, too distant and too busy to visit their aging and infirm parents, hired actors to visit in their stead, playing the part of the adult child. What's more, the parents appreciated and enjoyed the gesture. It's slightly shocking to western sensibilities, but once we hear a little more context it becomes more understandable. First, the actors are not (in all cases, at least) a deception: the parents recognize them for what they are. Yet the parents "enjoyed the company and played the game". In Japan, being elderly is a role, being a child is a role, and parental visits have a strong dose of ritual to them: the recital of scripts by each party. While the child may not be able to act out their role, at least this way the parent gets to enact theirs, and so to reinforce their identity as an elderly, respected person.
Traditional rituals of social interaction allow people a certain measure of ontological security with regard to their place in society and within familial networks. That relatively secure identity still had to be performed to be felt, but the performance is explicitly understood as a performance. The reality of the identity is guaranteed by the rootedness of traditions. Such role-playing doesn't fit with the ideology of individual existential freedom, the glories of unrestricted personal consumer choice, living like a stranger among strangers in urban settings, and so forth. And while I certainly wouldn't want to be saddled with that sort of ritualized social life and have an identity assigned to me on the basis of the circumstances of my birth, I do wonder what it would be like to feel that the basis of my identity was intrinsically secure and that my "authenticity" could never be sullied by missteps of taste.
In social media there is a material basis for an alternative to ingrained tradition in anchoring identity; a networked self could have enough solidity to let the performative nature of identity operate beyond questions of genuineness or authenticity. From a resister's perspective, this all looks as odd and mechanical as the idea of sending actors to love your parents for you. But adopters can take solace in sending out their "Profile" (to use Nathan Jurgenson's term for one's aggregate online presence) to perform their cemented identity within various social networks. Once you accept that Facebook's data collection roots you, you are "free" to be absent from social rituals but be present nonetheless. Welcome to the new intimacy.
What's dangerous about this is not that it has ruined some previous form of intimacy that was especially precious. The problem is that we believe that we construct this social-media identity autonomously and that it is therefore our responsibility, our fault if it's limited. The social-media companies have largely succeeded in persuading users of their platforms' neutrality. What we fail to see is that these new identities are no less contingent and dictated to us than the ones circumscribed by tradition; only now the constraints are imposed by for-profit companies in explicit service of gain.
Labels:
identity,
robots,
sherry turkle,
social media
Data Self redux (30 Jan 2012)
Nathan Jurgenson has some good constructive criticism of my data self posts from last week. He points out that it is not enough to talk about how social media captures some preexisting self; we also have to consider "how the individual, in all of their offline experience, behavior and existence, is simultaneously being created by this very online data."
Since I often tend to depict identity as a residual experiential illusion left over after capitalism subjectivizes us, I was admittedly surprised to see Jurgenson cite me as an example of what he calls "agentic bias" -- "the tendency to conceptually grant too much power to individuals to create their online Profiles by neglecting the ways in which individuals are simultaneously being created by their digital presence." Social media doesn't simply capture what we do online; it shapes what we do online and also what we do offline -- as Jurgenson has argued elsewhere, once social media makes you aware of the ability to document your life as it is happening, it changes what you experience; you begin directing your life as if it were a documentary, choosing what to do in part on the basis of how it can be represented later.
My dialectics certainly need sharpening, but I definitely agree with Jurgenson that social media shape identity rather than merely expressing it. If I underplay the degree to which this is true, it may be because I take it for granted too much in my thinking: of course it's true that having media at one's disposal to share things renders those things subordinate to the process of sharing. Once we have a channel, we live so as to fill it with content, and that content is more self-consciously molded to suit desired audiences and enhance one's watchability -- it's "curated" with an eye to making oneself more followable, more relevant. Once you have Spotify, you have to groom what you listen to and what others see you listening to, and that reflexive grooming cancels out any preceding “innocent” or “authentic” listening behavior. Basically, the ongoing fretting about “authenticity” is a reflection of an ideologically induced blindness to our own agentic bias with regard to ourselves.
Identity has long worked in our particular ideological climate by masking its constructed origins, making it seem as though the profile (our self-directed identity, in Jurgenson's terms) is always making the Profile (the online representation). The stake is our status as a unique individual; other people may be products of the system but not us; we are self-created. So when the Profile begins to change my way of perceiving the world — when I begin to shape my behavior in terms of what social media captures or what I can share — I strive also to disavow that change. I think some of the disavowal is built into the services and the rhetoric of personal sharing they are bathed in. (Apparently, if Jurgenson is right, it has colored my own rhetoric as well.) We don't want to admit that we are being determined to a degree by our media use, so we instead struggle to do the impossible and deliberately communicate authenticity -- try to communicate something so genuine and real and uncompromising, perhaps, that we can make ourselves believe it's not totally obvious that we are posing for the cameras we've pointed at ourselves. Because if we admit to and foreground our "inauthentic" curatorial impulses -- doing things just to tweet them -- then we surrender the old ideal of our having a self-actualized identity, a unique internal self that we discover inside ourselves and then share with the world.
Social media has laid siege to that concept of identity, and the assumptions of privacy it relies on. It is trying to obviate that private self's pleasures and make pleasure derive instead from sharing, from comparing our data with others', from seeing what sort of quantifiable influence we can have as we build out networks. This is what I was calling the "data self," which may not make a whole lot of sense as a term, but there it is. With this self-concept in place, we don't worry about that disavowal of our constructedness so much; perhaps we think we can outrun it with further sharing, reaping further rewards. If we keep screaming out our attempts to self-brand, it still seems like we are controlling the process. I think this helps explain that particular desperate urgency of social-media use; we have to monitor and control the spin about ourselves before any of the many facets of our network come to own the narrative about us. That pleasant Pavlovian buzz of seeing that someone has responded to something I have posted somewhere is not merely pleasure at having gained some attention; it is also a moment that feels like control over an identity that has slipped away into the permanently public realm.
Anyway, that shift is what I was trying to get at in those posts about the “data self” -- that changes were happening at the level of the institutions that determine our concept of who we are, teach us what sort of things go into that. Capitalist subjectivity was once anchored in consumerist “authenticity” — consumerism allows us to discover the unique individual we really are and express it. But social media and smart-phone conduits are changing what anchors capitalist subjectivity to something outlined by data profiling and “sharing.” That is, the pleasures of identity are less about discovering, owning, and operating a particular unique self (as they were, mainly, under consumerism pre-social media); they are becoming more about microaffirmations available through social-media use: we are matched with the people who can affirm us, we see a reflection of ourselves in the data that makes us feel recognized, we are told what to want in a way that assures us we will be doing what is right and normal if we follow automated yet socially derived recommendations, etc. The data self knows itself only to the degree that it shares data and exists within media that can guide it toward various satisfying experiences and allow it to display its satisfactions dependably in formatted and readily circulatable ways. The data self doesn't really exist until social media use has gotten well under way. What may start as a yearning to express the authentic interior self in some new way ends up, as Jurgenson is arguing, obliterating the plausibility and satisfaction of having an interior self. Of course, for people who have had no sense of self that precedes social media, this analysis may make no sense. They can't conceive of authenticity as anything but a rumor.
The threat to this new “data self” sort of subjectivity is not inauthenticity -- the threat that tends to afflict people my age (phoniness as a moral category) -- but lack of access. The threat is being disconnected, having the information flow disrupted. And that threat stems from an underlying terror at the possibility that there's something crucial about our lives that can’t actually be expressed -- integral things about which we can say nothing, as Wittgenstein says. The danger to our identity is not that it will be exposed as a fake, but that endless sharing of it will make it feel increasingly inexpressible, that the key thing is escaping our attempts to tell all.
But I also think that interaction with Facebook, etc., reshapes subjectivity to make us reject the possibility of that inexpressibility. We are only what we express and share; the possibility that something could be meaningful and unshared becomes unthinkable, unfeelable. What measures the "real" about ourselves is not some internal ability to think or feel something but the ability to externalize it as processable data. Social media becomes the media of identity -- not recording it but constituting it. The data self is not just a matter of the data we supply (actively or passively), but also the data and metadata the social-media companies return to us. We increasingly stabilize our self-concept in terms of what social media makes possible, what sorts of rewards it can supply, and what garners those rewards.
I think the logical extension of the data self -- the self that is secure with itself only to the extent that it is constituted in social media as manipulatable data -- will be for Twitter to come preloaded with plausible friends, Facebook preloaded with life experiences, or at least preordained slots of experiences a user is supposed to have. Our self-recording vision of ourselves, as shaped by social media, may not alienate us from some "genuine" experience we would have otherwise had, but it is not autonomous either, not freed of the ideological dispositions companies have built into their platforms. As much as we may come to see our lives as documentaries that we are always in the process of making, we still are typically reliant on someone else's storyboard.
(Note to self: Remember that the work of identity construction for any given individual is always collective. One's identity is not the product of the identity-bearer's labor only, but is also the product of those whose work sustains institutions and expressive codes and everything else that contributes to substantiating and expressing identity.)
Monday, January 30, 2012
Existential Liberalism As Ideological Fog (27 Jan 2012)
This, from "Reflections on the Call" by Leon de Mattis, in Communization and Its Discontents (pdf), is a good point:
It is certain that the division of society into classes would be infinitely more visible if inter-individual relations were the brute and unreserved translation of relations of production. The proletarian would doff his cap in passing to the capitalist with his top hat and cigar, and there would be nothing more to say. But unfortunately things are a little more complicated, and ‘existential liberalism’ is not the unique translation of the effect of relations of production in everyday life...
Class relations disguise themselves at the personal level, and dissolve into "existential liberalism." Capitalists in general are bad, but each individual capitalist seems like a nice enough person, doing their philanthropy and what not, recycling like a good citizen, etc. The same is true of middle class/creative class types, whose personal congeniality and sympathy for proles at the personal level hides from them (read: me) their systemic role in oppression.
This is how ideological mystification at the level of everyday life proceeds; inequality is out there, we know, but for those above a certain level of impoverishment and disenfranchisement, it is not experienced as such, as a personal problem. No one wants to be proletarianized in their own subjectivity, in their own concept of themselves. So they explain the ways inequality affects them in terms of personal failings or bad luck -- not as: the system can and has declassed me despite my efforts to abide by the rules of its game.
Part of our energy is thus spent reproducing in our everyday encounters the ideological fog in which we are all supposedly equal (the 99%). (Consumerist relations in "democratic" marketplaces where everybody's dollar spends are a big part of this, but not all of it.) We actually have to work to reproduce the illusion that "existential liberalism" coheres, that the deviations we experience are anomalies. It's shocking, then, when we experience something we can't resolve through this kind of work — say, when you get subjected to "unfair" police violence or are insulted through some bald piece of snobbery. But it may be that we prefer the ongoing work of sustaining our class-based sense of dignity to refusing the work and living in the full, intolerable glare of the naked relations of power. Who wants to unmask those nice people who mean so well? Who wants to look in that kind of mirror?
Labels:
capitalist subjectivity,
ideology