
Thursday, October 6, 2011

Ostalgia at the New Museum (2 Oct 2011)

In the first question of this interview, philosopher Paolo Virno is asked to explain how he can regard Marxism as a "doctrine of rigorous individualism" -- particularly since the mandated ideology of the "really existing socialism" of Eastern Europe tended to emphasize suppressing individuality in favor of an elaborately professed fidelity to the collective. Virno quotes Russian linguist Lev Vygotsky in response: "the real movement of the development process of the child’s thought is accomplished not from the individual to the socialized, but from the social to the individual." That is, we must learn to think of ourselves as individuals within a given set of social relations, a process that Virno claims continues throughout adult life: "We constantly have to deal with the interiority of the public and with the publicity of the interior."

In many ways, the Ostalgia show at the New Museum, which collects work by artists who were raised in the Soviet Union or its satellite nations in the Eastern Bloc, deals with precisely this question: how does one express or capture that inner sense of individuality publicly, in a society that pretends to officially deny its existence? The show presents ironic support for the idealized notion that true individuality can emerge only under socialism: Despite the manifest material hardships, the artists evince a kind of hardy, irrepressibly idiosyncratic spirit that can seem sharper, more distinctive, more authentic to jaded Western eyes. It's easy, for example, to read the ad hoc creativity captured in Vladimir Arkhipov's images of improvised household gadgetry (e.g., a maraca made from foam rubber, scraps of leather, and a Fanta can; a tub stopper made from a bent fork stuck in a rubber boot heel) as proof that Eastern European living conditions forced everyone to be folk artists.

That is where the nostalgia (albeit vicarious) comes in for us: the wistful fantasy of a world in which glamour and celebrity don't exist, so no one is under pressure to develop their identity with fame in mind. In "Ostalgia," we see art made in a society without an art market, design devised not for a culture of consumer seduction but for individuals desperate for functionality. No crass commercialism -- the pursuit of distinction in these works is not about ego but simply the essential desire to show that one existed. The works have an ontological desperation that seems to purify them.

It's readily apparent that none of the ordinary people in the 1960s and '70s-era photographs by Boris Mikhailov are posing for him to improve their personal brand. The frank, earthy images utterly lack that kind of calculating awareness; they instead carry an erotic charge that depends not on conforming to commercially established beauty standards but on evincing a surprising singularity, a spontaneous particularity, as if they are only just discovering their uniqueness. Since Mikhailov's work was not sanctioned by the state, his subjects ran real risks by agreeing to pose (often nude). This tends to make them appear as co-conspirators in an act of subversive intimacy. The feeling of forbidden liberation is palpable. Rather than being objectified by the photographs (even though they were paid to model), Mikhailov's subjects seem instead to seize the opportunity to be truly subjectivized by them. The images allow them to express individuality for its own sake. Secret moments stolen from a surveillance state, the photos document poignant private alliances, anti-networking.

This is a far cry from what we are accustomed to in the era of social media, where the idea of secret moments is becoming unthinkable, regarded increasingly as aberrant and antisocial. It is hard to imagine achieving such intimacy or liberation as Mikhailov's images evoke in the time of Facebook, in which not only is everything permitted -- there is no repressive state forbidding personal expression -- but people are pressured to share as much of that everything as possible in a tolerated spirit of self-aggrandizement. Social media have turned fears of a surveillance state inside out and made self-revelation ubiquitous, automatic, virtually unintentional. At the same time, given the sorts of skills needed to secure the jobs reserved for the "creative class," we are forced to manufacture and parade an ersatz individuality, showcasing our flexibility and ingenuity, our ability to anticipate trends, our well-rounded sensitivity to the processes of cultural meaning making.

So despite our culture's emphasis on individuality, we don't generally experience it the way it's depicted in "Ostalgia". Instead we fret about the degree to which we are unique, make a fetish of superficial nonconformity. We are becoming used to the notion of an attention economy, in which our individuality is increasingly inexpressible except as a quantitative measure. It becomes harder to imagine a life that is obscure, and so such a life begins to seem as if it contains more reality. Ostalgia becomes a nostalgia not for the all-embracing social order socialist states tried to provide or their uneven successes in suppressing the effects of a class structure on ordinary people, but for the way such total systems left inadvertent room for a contested privacy that had monumental value. Now we just give it away through frictionless sharing, and many of us don't know how to stop it even if we aren't inclined to lend our consent.

Friday, August 19, 2011

Non-Time and Hauntology (5 May 2011)

I went to a talk last night at NYU by Mark Fisher about "hauntology," which refers to a kind of intermediate space-time between places palpably shaped by organic time and nonplaces (shopping malls, etc. -- see Marc Augé), which are wrenched out of time and posit an unending nontime, the end of history, an undisruptable retailing present that perpetually recurs. I didn't really get what hauntology was all about: it seemed to have to do with cultural productions that are aware of the nonplace/nontime crisis -- the way neoliberalism has foisted non-space/time on us, along with a subjectivity without depth that must flaunt its requisite flexibility by shuffling the deck of floating signifiers -- and are "reflexive" and "critical" and "negative" about this condition. Fisher made this point with music: British pop music now is blithely appropriational of the past without foregrounding that in any particular way; retro has ceased to be a meaningful descriptor. So music made now would not be at all disruptive, he argues, if someone living in 1979 heard it. There would be no retroactive future shock. It doesn't sound like the future; the future that should be occurring now has been thwarted, lost, effaced. The sense of cultural teleology is gone, vanished, perhaps, in the now pervasive relativism that regards all culture product as potentially valuable.

There are lots of plausible and interrelated explanations for why the pop-culture future can no longer occur, including:

(1) The demise of a hegemonic culture industry (and the rise of digitization and peer-to-peer distribution) brought the end of a shared sense of the cultural moment. We're not all watching the same TV show at the same time and hearing the same records on the radio. Instead we have access to all culture all at once, on demand -- whether it's, say, The Lamb Lies Down on Broadway, the complete works of Margaret Cavendish, yesterday's episode of Survivor, or all of them at once. This AV Club article by Steven Hyden about Def Leppard's Hysteria gets at the idea:

As everything changes rapidly around us, we as music fans in many ways still think we’re living in a Def Leppard world, where winning a Grammy means you’ve arrived, and going to No. 1 on the charts makes you a pop star. In reality, we live in a culture where the terms “mainstream” and “underground” have become virtually meaningless, as practically every song by every band ever is equally accessible, frequently at no cost, to anyone with an Internet connection and the interest to seek it out ... It’s clear that music rarely unites us under the banner of mass-accepted artists anymore; even in a concert audience, we’re all just a bunch of individuals, with little connecting us to one another beyond a shared interest in the artist onstage—one artist among hundreds on our abundantly stocked iPods. Sounds lonely, doesn’t it? Sometimes I yearn for the old world, the one I grew up in, a place where dinosaurs like Hysteria stomped around pop culture for months, if not years, leaving sizable impressions in the hearts of a generation, whether they liked it or not.

The availability of everything means that particular works of pop music lose "symbolic efficiency" to use (and possibly misuse) a term from Žižek. Nothing successfully connotes the zeitgeist; everything invokes a desire to one-up with a better reference or a new meme or detournement of the contemporary. We are too knowing and skeptical to accept anything as unproblematically representative of the now.

(2) Neoliberalism/post-fordism/late capitalism has projected itself as the end of history, normalized nontime, and generalized the reception of conditions of ontological insecurity as freedom. We lack a subjectivity that can experience or recognize historicity.

Fisher links the idea of a "missing future" with the disappearance of negativity and criticality in contemporary pop culture, which (as I interpret it) has no space for anything oppositional or which transforms oppositional gestures into postures that circulate only as signifiers of personal identity. It reminds me of Douglas Haddow's "Hipsters are the dead-end of Western culture" argument:
An artificial appropriation of different styles from different eras, the hipster represents the end of Western civilization – a culture lost in the superficiality of its past and unable to create any new meaning. Not only is it unsustainable, it is suicidal. While previous youth movements have challenged the dysfunction and decadence of their elders, today we have the "hipster" – a youth subculture that mirrors the doomed shallowness of mainstream society.
Hipsters don't experience non-time negatively, as a loss, as melancholic, as indicative of deep alienation. Instead they seem to be thoroughly subjectivized by neoliberalism to the extent that they regard it as opportunity to show off how creative they can be in their cycle of appropriations. The last thing they want is to be reminded of how their personality is conditioned by the times they live in; in nontime, one can feel transcendent and immortal, one can permanently defer adulthood.

Hauntological music (like Burial) tries to at least evoke the feeling of loss, tries to register the missing future as a kind of catastrophe, Fisher argues, though it can't actually instantiate this missing future. It tries to at least restore meaning to the concept of retro, foregrounding the appropriations of the past by sounding like a scratchy record, and so on. (I don't know; all electronic music literally sounds the same to me.) I wasn't persuaded that a work's reflexivity about how symptomatic it is itself of the impossibility of escaping non-time made it viable as a mode of resistance. I'm probably too skeptical of reflexivity to ever regard it as resistance; I see reflexivity as the quintessential mode of neoliberalist subjectivity -- a calculating self-consciousness that can't be escaped, that forces us to consider our identity as an alienated thing to be developed and invested entrepreneurially. (The following is highly provisional and may ultimately prove embarrassing): Whatever is reflexive needs to become collective. The problem of non-spacetime is that of an isolated individual subject who admits of no possibility for intersubjectivity, which is perhaps the primary way we experience history, through how our relations with others subjectivize us in particular, contingent ways. Reflexivity about our loss of that intersubjectivity seems to still cling to the individuation, to see and secretly cherish one's isolated uniqueness and incontingency in the recognition of it as a loss.

In my view, social media have become the extension of non-spacetime, where nothing, no identity or incident, is necessarily contingent or organic, and one is doomed to the "freedom" of endless ontological insecurity, the forever search for a grounding authenticity that can only generate more memes. Social media are where we go to protect our experience of nontime, which is threatened by the Real, by historicity, by death. Facebook is the ultimate nonplace. Being on it is to enter non-time, to maintain a continual pseudo-presence.

The non-spacetime crisis, I think, is a crisis of presence. When we exist in non-spacetime, presence becomes impossible -- or it is known by its absence, in a kind of negative theology. To put that less cryptically (or maybe not): technology has basically dissolved the unity of the subject in a particular place in time. Smart phones, etc., let us be in many places at once, conducting any number of conversations and self-presentations asynchronously. This casts an air of provisionality over everything we do; our lack of total commitment to that place at that moment is always implied, always understood. No one is even bothered anymore when someone they are talking to looks at their phone. There is no ethical requirement to be fully present, and without that, there is no genuine (I know, how can you even ever define "genuine") intersubjectivity. The refusal to be fully present is a restatement of the refusal to permit our identity to be socially contingent or to be palpably collective. The smart phone reserves our right to check out of any collective identity formation at any time. This is the essence of contemporary "convenience," which I have long interpreted as being able to avoid interaction with other humans and being forced to empathize with them and recognize their existence as other. (We can only tolerate other people when we regard them as extras in our movie.)

Fisher referred to Jameson's distinction between psychological nostalgia and formal nostalgia, between the ability to evoke a real lost past and being trapped in pastiche. What I took from this is that the postmodern/neoliberal subject cannot access psychological nostalgia, but can only simulate it through pastiche, as this sort of subject has only existed in nontime as opposed to historical time. My sense is that this subject doesn't yearn for historical time at all but worries about historical time erupting into nontime via some sort of terrible Event. When something that threatens to be an Event happens, subjects rush to assimilate it to nontime by mediatizing it, "sharing" it in social media, or meme-ifying it. I'm not sure if this holds, but it may be possible to interpret the ad hoc celebrations of Osama bin Laden's execution this way -- an effort to experience a historical moment in a way that dehistoricizes it -- puts the partyers back at the center of their personal hermetic history, claims the Event as just an event in their individual story.

Because we have no access anymore to psychological nostalgia, we end up nostalgic for the capability for nostalgia, we feel homesickness for a home we never had. This leads to a compensatory attraction to childhood kitsch, to moribund objects (joining a typewriter club is an extreme manifestation of this), to anachronism, atavism, whatever seems genuinely and indelibly marked by a past. This perpetuates the cycle that denies the creation of a distinctive future, guarantees that the future is a more attenuated and annotated reconfiguration of detritus from the past.

(Malcolm Harris has more thoughts inspired by the talk here.)

Thursday, August 18, 2011

Dead Media and Mediatized Subjectivity (23 Feb 2011)

Though lots of CDs are still being sold (more than 200 million last year), it still seems fair to pronounce the medium dead, dead as the 8-track or cassette tape. Compact discs were an intermediate technology that full-scale digitization has rendered obsolete. Unfortunately for me, I got caught up in the way that technology was exploited for all it was worth in its 15-year reign, which brought on levels of record company opportunism that, as an insufficiently cynical teenager, I was unprepared for.

Anwyn Crawford surveys the damage wrought by the CD in this essay, which recounts the medium's history. She points out that "a CD’s capacity for 74 minutes of data, as opposed to an average of 40 minutes (20 each side) for vinyl, encouraged artists to record longer and longer albums – or alternatively, for labels to stuff CD album releases with remixes, ‘bonus’ tracks, demos and other filler, particularly in the lucrative market of CD reissues." That in turn destroyed the integrity of albums, always a tenuous idea but one which defined the heyday of rock music. In the CD era, albums were a filler dumping ground, and reissues offered unnecessary rejected material that made the original release seem retroactively provisional. Quality gave way to quantity, both ideologically and on the discs themselves.

Now it seems obvious that CDs were misbegotten, especially since the transformation of songs into digital files has made packaging far more important, and the files' infinite reproducibility has made the analog aura of vinyl into a fetish. But when CDs were first introduced in the 1980s, I took the hype at face value and believed that it made perfect sense to discard my record collection and pay for it all over again, so I could hear albums in their alleged pristine digital state. Back then I imagine I was tired of records getting scratched or warped, tired of plucking dust balls off the phonograph needle, tired of having to remember to step lightly when I walked past the table the stereo sat on so I wouldn't make the record skip and possibly ruin it. I was eager to believe the promise of CDs' indestructibility, their "perfect" sound. I wanted to believe that such a thing existed -- a perfect copy of an album, unworn by time; it suited my impression that the music I was getting into was "timeless," transcendentally great.

I had no idea, however -- couldn't imagine -- that industry engineers would be so indifferent, so negligent, in remastering those transcendental classics from the back catalog. I disbelieved my ears when I heard the hollow, trebly sound of the Byrds CDs and Astral Weeks. I didn't believe that Columbia would simply cut several minutes out of "Sad-Eyed Lady of the Lowlands" for the first Blonde on Blonde CD and not mention it anywhere. Or that the opening chords of "Brown Sugar" would be lopped off the Sticky Fingers disc. Or that the Doors' first album would be mastered at the wrong speed.

And so on. I just trusted for no good reason that record companies cared about the quality of their products. But in retrospect, the compact disc as a medium was definitive proof that they didn't. The CD offered inferior sound for inflated prices, and the product itself was often adulterated (intentionally or through negligence), which forced consumers who cared to repurchase the same albums on CD multiple times. (How many times has the Village Green Preservation Society been reissued?) And then there are the problems of overcompression destroying the dynamics, and discs being normalized at eardrum-bleed decibel levels for no apparent reason. Ultimately, I came to see that I had been duped and felt betrayed by the record industry, though it never really owed me anything anyway.

I wonder if something similar will happen eventually with iPods and MP3s. Obviously the sound quality of MP3s is inferior, but their convenience has always trumped the need for fidelity, and lossless formats haven't caught on, partly because Apple refuses to support open-source codecs like FLAC. As hard-drive capacity increases, presumably some commonly used future version of MP3s will eventually reach CD levels of fidelity, but such files intrinsically cannot compare with analog sound. The purpose of music in the iPod era seems to have changed fundamentally -- the iPod, as sociologist Michael Bull details in Sound Moves, provides a portable sound world that offers solace and privacy in the abrasive environments we must traverse in modern life. (This always makes me think of the film Morvern Callar, in which the main character uses a Walkman to keep herself tuned out, and viewers are made to recognize the disjunction between her sound world and the diegetic sound we don't hear.)

We use music more and more to propel us through other activities; less often do we make listening our primary activity. Virtually the only time I make paying attention to music my primary activity is when I am at a live performance, and I hardly ever do that these days. And more important, as Bull's book suggests, we use it to experience mediated faux togetherness through pop music while managing our privacy and exerting our individualist right to hear what we want and block out everything else. MP3s are designed to indulge our individualism and produce/compensate for our isolation. The surfeit of them, the overwhelming number of songs anyone with sufficient hard-drive space can collect, is a reflection of that need for more and more building blocks for us to construct our sonic uniqueness in playlists. The medium encourages the use of music to express subjectivity, even more so than the previous era of music-oriented youth subcultures. With MP3s, music doesn't express its content so much as it expresses us.

Perhaps at some point it will seem insane that people walked down the street wearing sensory deprivation devices that allowed them to ignore one another, and perhaps the format for piping those supplementary noises into our heads will be reviled. But probably MP3s will be replaced by an aural medium that gives us even more apparent control over how to experience community as a commodity while we remain safely isolated, entombed in sound.

Wednesday, August 17, 2011

Detroit Ruin Porn (20 Jan 2011)

John Patrick Leary's essay in Guernica magazine about Detroit "ruin porn" -- images of decaying, abandoned buildings; of familiar contemporary types of places turned eerily desolate -- is well worth reading. As Leary notes, these images tap into an archeological fantasy that allows viewers to imagine they have survived the apocalypse, rather than confront the truth that they are living in the midst of it, the turbulent and unruly transition to a globalized, postindustrial order. Ruin porn allows us to believe that we are not the victims of the chaotic upheaval; it even offers the hopeful sense that all the requisite suffering is in the past. Leary points out that one rarely sees humans in these photos.
So much ruin photography and ruin film aestheticizes poverty without inquiring of its origins, dramatizes spaces but never seeks out the people that inhabit and transform them, and romanticizes isolated acts of resistance without acknowledging the massive political and social forces aligned against the real transformation, and not just stubborn survival, of the city.
Viewers have no stake in the city's survival or the ongoing struggle to halt the decline; instead the ruin photos drive us inward and invite us to regard what we see as the majestic and irrevocable result of cosmic entropy, an emblem of the vanity of human wishes. Time is cast as the enemy, as the photos depoliticize the consequences of so much negligence and malfeasance, of exploitation and broken promises.

Of course, if one really wants to enjoy the apocalypse as sublime entertainment, one should plan to visit the recently reopened Chernobyl Zone of Exclusion (sometimes translated, more fittingly perhaps, as the "zone of alienation"). "It is very moving and interesting and a beautiful monument to technology gone awry," says Mary Mycio, who wrote a book about the zone. Who doesn't want a monument to out-of-control technology? It lets us think we are always ultimately in control of it. Detroit's ruin porn seeks to cast the city as a kind of zone of exclusion, an anomaly, a disaster area that can be cordoned off from the real America of prosperous innovation and heroic entrepreneurs and can-do strivers. But the reality is that entrepreneurs and innovation always leave a trail of destruction somewhere else, and these images, as Leary suggests, fail to evoke the causal chain.
No photograph can adequately identify the origins for Detroit’s contemporary ruination; all it can represent is the spectacular wreckage left behind in the present, after decades of deindustrialization, housing discrimination, suburbanization, drug violence, municipal corruption and incompetence, highway construction, and other forms of urban renewal have taken their terrible tolls.
The photos license our indifference to the entire question. Leary writes that "ruin photos suggest a vanquished, even glorious past but, like the ruins themselves, present no way to understand our own relationship to the decline we are seeing," but they probably do worse: they suggest such understanding would be an irritating distraction from the decadent beauty.

Ruin photos speak to our desperate desire to have our world re-enchanted. We want the banal structures and scenes of our everyday life dignified by the patina of decay, so that we can imagine ourselves as noble, mythic Greeks and Romans to a later age and, more important, so that we can better tolerate the frequently shoddy and trite material culture that consumerism foists on us, see it once again as capable of mystery. We carry our own personal zone of alienation wherever we go, but seeing the familiar world of our everyday life in ruins externalizes that alienation, makes it seem as though we've exorcised it like a devil. We become larger than this life, than these dentist's offices and deserted boardrooms Leary notes in the photos. We will survive it all, we will outlast the mediocrity that made us.

Belarus watch (14 Jan 2011)

Growing up during the Cold War before the advent of the internet, I wasn't exposed to much pro-Soviet journalism; instead I was forced to hope the Russians loved their children too. Today's youth need not suffer any anxiety nor must their minds be imprisoned in pro-democratic hegemony. For instance, here's a New Statesman article by Neil Clark (via MR) about the glories of Belarus's command economy under Europe's last dictator, Alexander Lukashenko -- "the side of Belarus you won't read much about," Clark tells us. If you fantasize about a more equal society unsullied by vulgar consumerism and inane ephemera from our postmodern age of frivolity, all these false needs being stoked and exploited, then perhaps you should consider moving to Minsk:
While other former Soviet republics rushed to embrace capitalism following the fall of the Berlin Wall, privatising their state-owned enterprises and removing subsidies to industry and agriculture, Belarus kept the old collectivist flame alive. My guidebook describes it as a country "so unspoilt by the trappings of western materialism that it's very easy to feel a sense of having slipped into another time and dimension".... Unlike Ukraine and Russia, Belarus's economy is not dominated by billionaire oligarchs. There is no underclass: according to UN figures, Belarus has one of the lowest levels of social inequality in the world.
In Belarus, as Clark explains, the state-owned tractor firm still sponsors the workers' "theater collective" at the culture center near the factory. Thus it is proved: socialism allows the human to thrive in all aspects of his species being, to develop all sides of his potential.

It's perhaps easy to glamorize life in such a regime from the outside, when one need not suffer the deprivation or the indignities or the depersonalization but can instead celebrate the way people living there can seem to collectively symbolize a lost purity of being. We can pretend they chose their impoverished and necessarily spartan everyday life as a lifestyle and applaud them for it, as though they were on the frontier of a voluntary simplicity movement, living out what Juliet Schor, in a feat of semantic jujitsu, has called the plenitude in her recent book. (The gist: We have everything we need already, so let's stop worrying about growth and income and start focusing on conservation, which will enrich us spiritually.) But that's all projection of our reservations about how we live in the West. In effect, Belarus tempts us to structure our ideas about what might be a better, fairer society as nostalgia for a "backward" past. This reconfigures socialist politics as atavistic daydreams.

Existential hoarding (6 Jan 2011)

This CNN item about decluttering has sensible advice that few people, I suspect, would disagree with, as it fits the zeitgeist by seeming vaguely eco-conscious and conservationist. The author, Tsh Oxenreider, invites us to live as if we are always just about to move overseas:
Ask yourself, Is this thing worth hauling 6,000 miles across an ocean and into a new home? Is it providing that much meaning and value to my life? If not, why bother having it now?
She argues that less is more, in that you have less to take care of or worry about. If you purge yourself of unnecessary things, each remaining object becomes more meaningful, glowing with the value of its being intentionally chosen. The gambit is to create an artificial scarcity for oneself, set limits to manufacture aura.
Living this way isn't about having nothing. It's about everything in your life having value. It's looking at all your belongings and knowing that you've given that thing permission to be there, that the item is truly adding value and beauty to your life. When you get rid of the things that don't matter, the things you do keep become that much more valuable, and you'll have more time and money to invest in quality over quantity.
It's an appealing fantasy. I imagine having that one bookshelf on which every book is one of the best and most powerful books I've ever read; I think of those apocryphal families of yore who only had a bible but knew it backwards and forwards and derived true spiritual nourishment from it as a result. I think of being able to imagine at some point eating every single thing in the refrigerator, or better yet, emptying it totally and eating only what I've bought fresh from the green grocer's on the way home from work each night. (Like I ever do that.) I think of empty closets, save for my all-purpose utilitarian uniform that I can wear at all times in any weather and always avoid the appearance of "trying to look cool." I'll be so free of objects that I will spend all my time in unfettered activity, really doing things -- though with only a few objects at hand it is likely to be the same sorts of things over and over, or it will involve me consuming disposable things or spending lavishly to access meaningful experiences. Or maybe not. Maybe I can convince myself that all I need is a guitar, a laptop, and a dream.

I want very much for Oxenreider's fantasy to be true. I want shedding belongings to generate a lasting and satisfying sense of having purified my life. I want to prove myself the sort of fortified and transcendent soul who can overcome the hegemony of advertising and conspicuous consumption and capitalist reification to see the true value of things, the value that stems from my ability to invest them with part of my own spirit. I can dispense with the vulgar lies of commerce and live a spare, Spartan minimalist life of meditation in a tastefully empty room like the one we see pictured in the article.

Yet when I try to live with that sort of rigor, I experience little lasting joy. When I've purged things in the past, the feeling of lightness that follows tends to be fleeting. Instead, entering into purging mode can sometimes open a yawning void in my life, not because I am ridding myself of things and worry I will feel their loss but because I start to see how little anything "really" means. What's the point in having anything? I'm just going to die anyway. Having a hoard of stuff to sort through and manage and muffle my existence also stifles my sense of mortality, for better or worse.

And perhaps because of how I've been conditioned to experience things, the thrill of getting rid of stuff -- unconsumption, as Rob Walker would have it -- ends up feeling much like the joys of acquiring it. That is, the pleasure is detached from the nature of the object itself, from the intentionality that is supposed to be so obvious within it. Instead the objects prompt multifarious fantasies, depending on where we are coming from, what we have been exposed to, what our particular situation is at that moment. So it is not as simple as it might seem to get rid of the "unnecessary" things in our lives, because necessity is a moving target, much like our own sense of self, our own priorities. Sometimes it seems very useful to me to have every album by Grand Funk Railroad loaded and ready on my computer just in case; sometimes it seems like insane clutter preventing me from noticing the really great music I could be listening to.

Sometimes the meaning of things elapses -- revealing another scary truth about mortality. It doesn't matter how few things you have; your memory is going to slip away. The things that seemed important will merely haunt us then, or we will remember things we didn't save but not remember why, or we'll cherish that lost moment of purging more than what we've kept. Maybe at the point when you can no longer remember why something was important to you, you just throw it away -- but then why was it ever saved in the first place? You save it because you are afraid to forget its meaning, and when you do start to forget, the item may seem more dear than ever in its obscure mystery. Hoarding can be a matter of luxuriating in that surfeit of mystery as much as it is a matter of suffocating on material goods.

I don't think my life is a matter of memories stored in goods, but I am afraid of it seeming as empty as that room in the photo. In my eagerness to purge, the danger is I'll shut everything out. I don't know if I can access a way of organizing my life that doesn't lump people and things together -- a way of living in which I don't need otherwise useless things to remind me of the people I want to keep in touch with. Theoretically, I should be able to have a purity of intention that doesn't require objects in this inefficient way, and I shouldn't think about collecting experiences as if they were things or objectifying the time I want to spend with others. But to live according to that theory, with little social support for it, is to risk the empty room becoming not tasteful and light but a void.

Oxenreider writes as though our intentions are constant, so close to the surface, so readily accessible, but she also writes as though only our individual tastes are at stake in those things. But usefulness and meaning are slippery, social concepts, and we end up implicating one another in our needs for things, multiplying those needs without being able to account for where they are coming from within us. It seems like this weird burden to have things, this inexplicable hoard we end up with through no will of our own. But in fact that is just the burden of being with others, refracted into miscellaneous odds and ends.

Re-reading Bret Easton Ellis's 'Less Than Zero' As an Adult (1 Dec 2010)

Prompted by editing this essay about the Less Than Zero film and by fortuitously coming across a copy of the novel in a thrift store, I decided to re-read Bret Easton Ellis's debut book, Less Than Zero, which was published in 1985. I first read it as a teenager in high school, and it sort of blew my mind. I was working at Waldenbooks in the mall then, and the novel seemed to come out of nowhere; it just appeared on the shelving cart one shift as if it were my destiny to read it. It played to all of my aesthetic proclivities then, all the bad ideas I ever had as an aspiring fiction writer: Write about apathetic teens doing lots of drugs and having sex indiscriminately; dump in a lot of inscrutably allusive pop-culture references; strip the prose of all lyricism and substitute a brutalist stream of consciousness, with the trick that the consciousness you're streaming is so devoid of reflexive insight that it comes across as aleatory and affectless. This would best capture the existential reality of youth boredom, which of course, as all teenagers knew, was the most significant problem confronting society in the Reagan years.

Even as a 15-year-old, though, I had a hard time imagining any adults reading the book or taking it seriously. It seems very much a young-adult novel, dependent on the reader's fascination with and general lack of perspective on the world of haute dissipation it depicts. Lots of vicarious thrills and chills for a teen: it depicts a world in which parents are always absent, money is never an issue, drugs are always plentiful, and everyone is down for sex with everyone else. People o.d. and parents have abdicated all responsibility, but that just sets an appropriate backdrop of extremity; they don't constitute real problems. The only real problems revolve around whether or not you can really open up to a friend. For adults, it's all a bit silly. You don't envy the characters, certainly, and you don't even pity them. At best, it has the junk appeal of MTV's nano-soap-opera Undressed, which was clearly inspired by Ellis's vision.

The plot of Less Than Zero follows Clay, the narrator, a college freshman who has come home to Los Angeles for his winter break from college back east. Though it isn't spelled out, he seems to be the son of wealthy entertainment-industry figures, and his friends are drawn from the same milieu. Though cognizant of no agency of his own, Clay finds himself involved in scenes of what the author apparently regards as steadily increasing shockingness, starting with a casual homosexual tryst, moving on to heroin shooting-gallery parties, a snuff-film viewing session, some gay prostitution, and finally a kidnapping and rape of a prepubescent girl. In between these meant-to-titillate scenes are some maudlin accounts of childhood memories (including the obligatory undergrad-workshop dead grandmother) and some slice-of-ennui observational passages of teens hanging out at pool parties, snorting fat rails and club hopping, hoping to spot members of X or the Go-Go's. (My favorite is a scene in which Clay, hanging out with several of his interchangeable friends at a sushi restaurant, is told that rockabilly will be the next big thing -- "and not those limp-wristed Stray Cats either" -- and that anyone who's anyone has to read The Face. As a teenager, I took that last injunction literally and struggled to track down copies of it -- Waldenbooks did not carry it, alas.)

Though it certainly succeeds in conveying a paradoxical mood of angsty apathy, the book's writing at the sentence level is fairly uneven -- not all that surprising considering Ellis's age when he wrote it, and the eagerness to rush the novel out as some sort of unexpurgated view on youth decadence. Its frequent badness was likely regarded as a badge of its authenticity. Less Than Zero's shocking incidents are generally unconvincing, and melodramatic despite the faux detachment. They read like exploitation-fiction cliches, only told in an approximation of the style of Raymond Carver or, more obviously, Joan Didion circa Play It as It Lays. And even though all sorts of unconscionably horrible events take place, the main conflicts structuring the novel are surprisingly mundane: the narrator's mixed feelings about losing touch with his best friend and breaking up with his high school girlfriend. These are expected to carry significant emotional freight for readers even in the midst of snuff films and raped 12-year-olds. It seems extremely bizarre to say the least for Clay to walk out of a room in which his high school buddies are raping someone, snort a quarter-gram of cocaine, and then pout earnestly about his girlfriend dumping him. It makes it seem as though the depravity might be all in his head or something, weird scenes inside the gold mine that serve as projections of Clay's alienation. But such a reading seems extremely speculative, counter to the explicit intent that we take all the action literally and lament the moral turpitude.

The incongruous tonal juxtapositions foretell the way in which Ellis's later novel American Psycho shifts from gory murder scenes to dementedly positive reviews of Genesis records, but they also betoken a lack of control, or perhaps an editorial hedging against making the novel's characters repellent to the core as they were probably intended to be. I suspect Ellis's ploy was not at all different from American Psycho, whose narrator, I think, is supposed to be Clay's brother: choose a monstrous, contemptible personality type (the spoiled film brat, the Wall Street banker) and have them narrate their own vapidity while having them participate in cartoonishly evil scenarios with no sense of their own moral culpability. But it seems like he was told to leaven Less Than Zero with mawkish passages (often set entirely in italics) that imply Clay has feelings we are supposed to empathize with. It would have been a much more successful book, I think, if Clay had no redeeming interiority, if there really was no there there, especially after all of Ellis's hamfisted repetitions of slogans from billboards and snatches of conversation: "Disappear here", "People are afraid to merge", etc. In a better book, he would endorse these slogans unthinkingly rather than be unnerved by them. Or better still, he would register them without noting how appropriate they were to his condition, and then the reader would have something to do. As it stands, Ellis explicates too much, and much too implausibly.

Tuesday, August 16, 2011

The Taxonomical Drive and Girl Talk (19 Nov 2010)

Citing Nina Power's book, One Dimensional Woman, Jodi Dean posts about the "taxonomical drive":
[Power] introduces the idea in the context of contemporary pornography: on the internet, one can find whatever one wants, although almost as soon as one finds it, one doesn't really want it anymore. Rather, one wants to see what else is out there. The item itself no longer scintillates. The drive to find other images, to keep moving and looking and marking, takes it place. After you've seen five or six busty amputee tops, you've seen them all--or have you? maybe there are different types? let's look for them! Desire switches into drive, now a drive to taxonimize and classify (blond, with shoes; shaved, no gun; etc...).

I've written a lot over the years about this concept, mainly with regard to music, where amassing more songs and managing the metadata and organizing the music library all begin to cannibalize the pleasure of the music itself. Or rather, these data-driven pleasures mediate our experience of music in a different way from what we knew before mp3s. The music becomes more like information, requiring less of a sensual surrender. Girl Talk seems emblematic of music created to suit this new aesthetic; classifying the samples becomes inseparable from the pleasures of listening to it.

You could draw the conclusion that Girl Talk, despite being "free" to all and seeming emblematic of the potential of a cultural commons over and against intellectual property, also serves to naturalize cultural labor (assigning and classifying semiotic meanings) as the chief pleasure, acclimating us to our doom of being data processors for the media companies that ultimately control the repositories of our lives, which we are turning into data banks with their help. Girl Talk thus functions the same way Power argues that online porn does; in Dean's words, Power "links taxonomical drive with contemporary porn's endeavor to bore us all to death and turn sex into work." Girl Talk models how listening should be immaterial labor. That's not necessarily bad, if one regards that sort of work as its own reward. The fear is that cultural-processing work cuts us off from some other way of experiencing life, pleasure, that is beyond the fixation of being useful. It traps us in the "mirror of production," to borrow from Baudrillard.

I find that this foregrounded data component makes it impossible for me to hear music as music; it doesn't engage what feels to me like a deeper part of my brain that responds more directly to sound seemingly stripped of semiosis. But the subjectively deeper experience I am imagining may be an illusion, an ideological chimera conjured by my investment in classifying the "real" in a specific way, privileging a certain nostalgic access to "the way things were" as a kind of protective revenge against youth, and against my becoming moribund. I want to be able to believe that I really hear the music and grant myself permission to condescendingly pity those whose entire listening life has been lived in the digital age. Fetishizing vinyl reflects this as well -- record players are magical time machines transporting us back to the era of "real" listening, where the patina of crackles and surface noise and skips all serve to certify the authenticity of our response to what we hear. Not a clean data stream, real analog sound, embossed on a material substrate that bears the traces of decay, the marks of time -- so much more like our own mortal flesh and therefore more true.

But this is all mystification of course. Music never comes to us purified of signification and thus closer to some unmediated truth; it is always mediated by some degree of contextual information that prepares us, puts us in a certain state of receptivity that will then allow us to flatter ourselves with our responsiveness. "Ooh, the Brahms, it washes over me so!" There's no way of listening to music that would allow it to reveal what our true inner response would be, no way it can test our spontaneous appreciation, however much we might want to leverage it as proof of our intrinsic noble sensibility. We can't prove good taste at the level of the sensual, the level of the music itself; it is always an argument conducted on the level of signs, the level of ideology.

The fantasies about authentic listening and real experience are not just reflections of the will-to-distinction; they are also counter-fantasies to the dominant consumerist dream of achieving the complete archive, of having the most direct access to every possible option, of even being able to at once hold all those possibilities, if not in our heads, then in some other tangible way. We oscillate between seeking the uncollectible, ineffable and thus "real" experience that can't be repeated or precisely commodified, that seems to elude reification; and seeking to collect everything, to taxonomize so as to seem to have a handle on every possible future we could choose for ourselves -- assuming the future is (as consumerist ideology tells us) merely a matter of what we choose to consume. Dean argues that
In a just-in-time culture, a culture of preemption, where connectedness has taken the place of planning, the archive serves as a kind of fortress of planning, a backup plan, a reserve army of the not yet desired but could be. We store up for the future, presuming we can access these stores rather than just add to them.
But that future never comes; the future is always now, and the storing up is the mode of consumption, not a kind of savings, not a deferral. The archive eases the fear of commitment, of having to choose and thereby forgo other pleasures. We collect the options on possible experiences, possible possessions, and as with financial derivatives, the notional total of these grows exponentially, far beyond the limits imposed by real attention scarcity, allowing us the illusion of transcending the constraints of time. That is what it means, I think, to consume the archive, to take pleasure in the metadata, in the metaexperience, in the theoretical possibility of future enjoyment -- this allows us to compress many experiences and goods into a smaller space in time. Of course, that means capitalism can overcome yet another barrier to endless consumer-demand growth and more profit can be squeezed out of ever-shorter circulation cycles, which now have become quantum.

Facebook and Cobain's Dilemma; Or, Reliving the Years That Punk Broke (5 Nov 2010)

I'm enjoying these AV Club essays by Steve Hyden about early-90s alternative rock (via mefi). I can't relate to regarding Pearl Jam (the poor man's Bad Company) as an important band at all, but revisiting Nirvana's emergence and the epochal break it signaled at the time seems worthwhile. It is easy to forget how "revolutionary" the concept of Nirvana on regular rock radio was back then, especially since Nirvana has been played side by side with Skynyrd on classic-rock radio for more than a decade now. At the time, it didn't seem like Nirvana was supposed to be popular; it seemed that something had gone shockingly wrong with the usually smooth workings of the culture industry's demographic programming. Right or wrong, lots of music geeks suddenly felt a new responsibility for what was popular, because suddenly anything seemed possible. This didn't feel so great; no one is more invested in the established rules of musical taste than the connoisseurs who marginalize themselves through them.

What I'll always remember is being dragged to an apartment party full of frat-guy types early in 1992, and as was typical of the era, the Steve Miller Band's greatest hits album was playing. (This was a period in which the worldly-wise sophomores at the dorm I lived in as a freshman told me that if I wanted to get laid, I'd better have some Journey or Chicago tapes at the ready.) Then the disc player changed to Nevermind, and no one at the party even noticed. I felt instinctively and instantaneously that something horrible had happened while I wasn't paying attention, that the ground had been cut out from beneath me, and I'd need a whole new place to stand in order to feel smugly superior in my tastes. I had never experienced co-optation in real time before, and I was young enough and naive enough until then to believe I never would.

After the initial shock passed, I tried to see the upside, tried to glory in what I thought of as "my" music triumphing. I had the idea that I would get newfound respect from the people who had ignored me before for having been into that kind of music all along. Of course, I was grossly mistaken -- the newcomers to Nirvana hadn't changed their attitude toward music at all; they had just embraced the new thing that had come along. It wasn't like they were going to turn to the likes of me and start taking dictation as I pontificated about all the other great bands I was into. It remained true that no magic combination of musical tastes was going to make social anxiety magically disappear. There would be no vindication, no recognition beyond the friendships I'd already made. Nobody, but nobody, cares who "discovered" the good music, and most laugh at the people who think it was themselves. I sort of learned that the hard way, and responded by retreating from contemporary music altogether for years. (Fortunately for me, this was right around the time that tons of fabled albums from the 1960s started to be reissued on CD, and it felt avant-garde to be into the Zombies.)

Hyden's second essay contrasts Axl Rose and Kurt Cobain, who apparently had a feud, despite being pretty similar.
Both men hated the press for spreading “lies” that often turned out to be true, and both were drawn to complicated women who created as much misery as ecstasy in their lives. Both men saw fame as a double-edged sword; it gave them the attention they craved after a lifetime of being ignored, and yet it also seemed to intensify their feelings of self-loathing.
Is it ridiculous to think that the problems of fame Hyden describes, once reserved for reflexive rock stars, now potentially afflict us all? Does microfame yield macro shame? My experience with Facebook has been double-edged in that way: It seemed I had a chance to redeem all that time I felt ignored, when I was one of the first to be into Nirvana and no one gave a damn. But I only rediscovered the same disgust with myself for wallowing in that miserable egomania, as Facebook forced me to recognize yet again that my tastes aren't really my own, because I still want so badly for people to applaud me for them.

Was this Cobain's dilemma, only writ small? You end up ruining all the cool things you thought you wanted to share with the world because you can't share them without tainting them with shameful self-importance. But you never feel important because self-worth is totally bound up with connoisseurship, with having the cool things that you want to share acknowledged. The trap seems even harder to escape now that the pressure and the means to share everything online are ubiquitous, and we're constantly apprised of how our wise deliberations are entirely ignored, at least by somebody.

Friday, August 12, 2011

Remembering Live Aid (13 July 2010)

Remember when pop stars ended world hunger? It was 25 years ago today when Live Aid, a cross-continental benefit concert organized by Boomtown Rats lead singer Bob Geldof as a follow-up to the charity single "Do They Know It's Christmas?" took place. At the Awl, Dave Bry has a video round-up. The event seemed like the height of hubris at the time, and it has not aged well, though it still seems like a better idea than Hands Across America.

I spent the day 25 years ago intermittently watching Live Aid in between summertime wiffle-ball games in the neighborhood where I grew up. I remember thinking that it was surprising Queen was so popular in the UK. Bono seemed to be trying too hard. Why did Bryan Ferry croon into two microphones instead of one? Phil Collins got special attention by jetting across the ocean to drum for acts on both the London and the Philadelphia stages. Duran Duran singer Simon LeBon's voice cracked in a ridiculous manner when they performed "A View to a Kill," their awful James Bond theme. As the sun set that day, it was just starting to dawn on me that the new wave acts I was really into at the time weren't all that good, and that Live Aid wasn't really going to become my generation's Woodstock. It had the long-term effect of making me skeptical about celebrity-charity schemes, and possibly charity in general, making the whole prospect seem like an ego trip. I wonder if Live Aid may have helped galvanize an entire generation of libertarian cynics.

Thursday, August 11, 2011

Gentrification and Justification (25 June 2010)

Historian Claude Fischer makes some interesting points about gentrification in this post, a response to a review essay from the Atlantic by Benjamin Schwarz, on books about Greenwich Village. Both critique the sentimental idea that certain urban neighborhoods once were really authentic and "had soul" but now have been yuppified with the wrong sort of gentrification, which hasn't respected the neighborhoods' uniqueness and has instead imported suburbanized blandness -- chain stores, class homogeneity, a rigid separation from the world of manufacturing, and so on. It's a version of the golden-age fallacy that posits a time just at the horizon of memory when things were the way they should be, everything and everyone in their proper place. It's not an especially dynamic vision; it regards change as inherently corrupting, even though nothing could be more natural than for neighborhoods to evolve over time.

Not to impugn the motives of preservationists, who don't always seem to be acting in bad faith, but neighborhood-preservation efforts are often attempts to entrench power relations, enhance property values, and widen the gap between well-to-do and struggling areas of the city instead of allowing a more egalitarian equilibrium to emerge -- or at least acknowledging dynamism in urban development. There is no platonic ideal of, say, Greenwich Village, just contested ideas that represent competing visions and interests. Fischer describes how authenticity becomes a stalking horse for less lofty concerns:
Struggles over gentrification, even if rooted in matters like rents and loft space, also entail ideological battle. Spokespersons for the current residents invoke local color; they seek to preserve this moment by investing it with historical authenticity; we, they say, are the traditional people of the neighborhood. (One generation’s “traditional” residents are, of course, usually an earlier generation’s outside “invaders.”) The developers, merchants, and middle-class newcomers may be bringing change, but they also often invoke history, a history that looks back before the current residents. One tactic is to use Historical Preservation, to protect the original architecture of a neighborhood, that is the styles that preceded the current residents, for example, the single-family Victorian gingerbread houses, not the stuccoed-over Victorians divided into three flats for immigrant families.

Schwarz deplores the proposition that "the state should create the conditions necessary for favored groups—be they designers, craftspeople, small-batch distillers, researchers, the proprietors of mom-and-pop stores—to live in expensive and fashionable neighborhoods or boroughs. That effort would ultimately be an aesthetic endeavor to ensure that the affluent, well-educated denizens of said neighborhoods be provided with the stage props and scenery necessary for what Jacobs and her heirs define as an enriching urban experience." In other words, it's a way of using aesthetics to disguise a conservative politics, even, perhaps, from the purveyors themselves.

Preservationists, Schwarz argues, try to freeze a particular transitional moment when the gritty aspects of neighborhood have only just begun to give way to revitalization. Fischer points out that struggles over the evolution of a neighborhood often involve competing static visions of its "real" character, all of which should be regarded skeptically. Fischer writes:
Schwartz suggests that this balance of working-class grit and a cleaned-up bohemia was attained only in a few places – the Village most famously – and for just a brief moment before the neighborhoods tipped over into “inauthentic” yuppiedom. (In the Village, that moment came around 1960, just about when Bob Dylan showed up.) Schwartz is impatient with those who, in slamming gentrification, imagine that those thrilling moments could be preserved in “amber.”
These moments shouldn't be privileged over other moments in a neighborhood's life cycle, in part because they can't be artificially constructed -- they are cherished for their organic juxtapositions of unlikely elements. Remove the spontaneity, and these become contrived Urban Outfitters moments.

Gentrification nowadays is more readily experienced and lamented by a broader group of people because, as Schwarz argues, neighborhoods evolve much more quickly than they did when Jane Jacobs was making the case for preserving their organic character. "Indeed, what has changed since Jacobs’s day ... is the speed of the transition of districts from quasi dereliction to artsy to urban shopping mall. This acceleration results from the ways consumption has become the dominant means of self-expression ... and from—relatedly, ultimately—the acceleration of the global economy."

That acceleration reflects consumerist capitalism's permanent imperative to increase our cultural-consumption throughput, which is achieved on a number of fronts. Digitization and mobile communications make it easier for us to be always consuming and producing new consumer meanings that invalidate the old ones and intensify the need to consume more. More-thorough mediatization makes for faster fashion cycles. Retail outlets start to come and go as if they were art installations. People delay in starting families, extending the period of fashion-conscious adolescence and the period in which they want to live among strangers in an urban environment displaying and "discovering" themselves. And so on.

How does consumption become the "dominant means of self-expression"? The division of labor in modern society is partly responsible, removing the meaningfulness from work. That meaningfulness crops up instead in consumption, the symbolic resonance of what we buy, display, and use. Along with that change, identity becomes provisional and open-ended rather than constrained by traditional limits. Rather than having an identity assigned by the conditions of our birth, we become responsible for creating a series of roles for ourselves. The self becomes a goal we never can quite reach but are always moving toward through various experiments with lifestyles and purchases and attitudes. We seek distinction through the pursuit of novelty and originality, which fill the void left by retreating traditional values (which are ushered out by the creative destruction of capitalism).

And therein lies a contradiction: we want to consume the traditional values of our neighborhoods precisely at the moment that we have become the sorts of selves who can't exist in traditional settings. Just as the authenticity of our identity has become something we feel required or anxious to establish over and over again through careful outward displays, so our neighborhoods begin to be held to the same standards, as they are transformed primarily into settings for our personal narratives expected to reflect our self-image. But at the same time, to cater to our identity needs on a practical level (food, clothes, cafes, bars, tchotchke shops, etc.), neighborhoods lose the local color that supports the idea that we are somehow on the urban frontier, pursuing an edgy lifestyle distinct from the safe, boring, blah lives of our parents who fled cities.

Anyway, it's quite possibly better that gentrification tends to temper the edge of contemporary bohemia in America. Otherwise we might have in the U.S. the "alcoholic agoras" of England that Dragan Klaic mentions in this article -- "where young people get drunk by 10:00 pm, vomit in the streets, get into fights and are taken away by ambulances and police or totter into taxis to get back home. This is the daily reality of the creative city pipe dream."

Analog fetishes (4 June 2010)

I read this in a piece of PR about the Thermals' upcoming album and thought it was pretty strange:
Personal Life was recorded to 2" tape and mixed to 1/2" tape, and vinyl enthusiasts will be happy to know the vinyl was mastered straight from tape, with no computers involved.
Really? It would have sounded so much better on 2 1/8'' tape.

Do people really care what inch tape an album was recorded to? It seems even less pertinent than when producers would boast of how many tracks there were on the mixing board. Of course, that was back when digital sound was sold as being for audiophiles, and not as their bane. ("The new Steely Dan record was mixed direct to digital off the 96-track digital masters... The sounds have never been sullied by an actual room tone.") Is it in Doubly? You don't do punk rock in doubly, you know.

The idea, I suppose, is to convey the impression of being a "real" rock band, not some bunch of amateurs recording on a laptop with Garage Band. It seems a flimsy selling point though. Obviously if the music is any good, it doesn't much matter what format it has been recorded in or gets played back on. But I guess a ramification of rapidly proliferating technology and gadgetry is that more people must consciously consume the format rather than the music, like the hi-fi stereo system zealots of old.

I've dabbled with consuming formats over songs lately, trying to see if forcing myself to listen to vinyl records would change my relationship to music. I thought it would function as a sort of musical dietary restriction: I would get into album sides again, and really listen and commune with the artists. I would stop consuming novelty and metadata instead of music. But I realized I was basically chasing a lost moment in my past when the relationship seemed more pure and more important to my self-concept. Playing vinyl records instead of an iPod doesn't make me 18 again, unfortunately. All that happened was that I felt re-authorized to seek out and try to collect difficult-to-find objects -- as much as I yearned for lost youth, I yearned for meaningful shopping quests even more. Now, alas, I am back to digital files.

Tuesday, August 9, 2011

I want my garmonbozia (10 April 2010)

Metafilter has an omnibus of links about Twin Peaks, which is now, unbelievably, more than 20 years old. I have long since thrown away most of my VHS tapes, but one I have held on to is a taping of the Twin Peaks pilot off network TV -- WPVI, channel 6 in the Philadelphia area. It's much easier to understand how revolutionary Twin Peaks was -- particularly that first episode -- when you see it in the context of the commercials that aired with it, the local news-anchor interpolations and the plugs for other ABC shows. The shift in tone when it cuts away from the stark despair of Laura Palmer's mother -- the horror on her face and her otherworldly moan of grief -- to the first commercial break is just insane. Having had my mind blown by Blue Velvet (and Dune, I admit it), I was already a fan of Lynch's, and I was predisposed to be enamored with the show, but Twin Peaks went far beyond anything I could have expected. Each of the early episodes inspired a weird mood that I couldn't shake for hours or even days afterward, an acute awareness of a whole different layer of causality beyond the surface of things, a deliciously ambiguous blend of anxiety and excitement. When "Twin Peaks mania" swept through what was then a much more monolithic mainstream media, I was fully sympathetic -- probably the last cultural fad of that scope that I can say that about. It was very strange to be in such harmony with what was being promoted everywhere -- kind of like those first few months Nirvana was popular, driving the Steve Miller Band off boomboxes at the keg parties.

Anyway, one of the MeFi links led me to this perceptive review by Tom Huddleston of the way-underrated Twin Peaks: Fire Walk With Me, which is without question the film that creeps me out the most of anything I've ever seen. It's an uncompromising portrayal of dissociative emotional pain -- the anguish that produces the surreal. Slicing up eyeballs has nothing on the scene in which Leland is revving the engine in his convertible while the one-armed man is yelling at Laura about her awful destiny. Huddleston captures another such moment in the film:
The infamous David Bowie dream sequence is equal parts entertaining and ludicrous, the Thin White Duke mumbling obscurely about Judy in a mangled pseudo-Southern accent. It’s as willful and pointless as Richard Pryor’s cameo in Lost Highway, a distorted display of counterculture namedropping. But as Bowie mumbles and Lynch’s Gordon Cole yells, the tone shifts and somehow the sequence becomes genuinely unsettling, the face of the monkey behind the mask an unexpected, nightmarish image. This blending of the absurd and the horrifying to dreamlike and disturbing effect has become Lynch’s hallmark, from the chickens in Eraserhead to the hobo behind the diner in Mulholland Drive. Nowhere else in his work does he use the technique as effectively as in Fire Walk With Me. Sudden tonal shifts from joy or security to overwhelming sadness, unease, terror and back again are perhaps the film’s most effective emotional weapons, and Lynch deploys them mercilessly.
In short, this movie will freak you out, even (or perhaps especially) if you have no idea what is going on or what anything is supposed to mean in its garbled cosmology.

A note on the phrase "I want my Garmonbozia." Apparently that is Lynch's cryptic way of saying that there is a debt of pain and suffering -- owed by whom? to whom? Not clear. But the experience of watching the film sinks you right into that ambiguity -- right on the border where voyeurism blurs into victimization.

Watch the tram car please (6 April 2010)

Washington D.C. is trying to build out a streetcar system, but some residents dislike the idea for apparently aesthetic reasons. The overhead wires will, in their view, cause visual clutter. Matt Yglesias explains why this is dumb.

Recently I was in Prague and Berlin, both of which have lots of trams, and I can't say I was distracted by the visual clutter of the wires. I had the impression instead that they made certain parts of the city seem more alive, which was aesthetically pleasing to me. And they were a good way to get around as a visitor, since you get to see the city as you travel at a relatively slow pace.

In some ways I tend to romanticize public transportation and register my using it extensively as a sign that I am on vacation. (I ride the subway to work, but generally use a car for errands and such.) Unless you use a city's public transportation for a routine commute, it's hard to get a feel for how efficient it is. That said, I was impressed by Berlin's extremely extensive system of trains, trams and buses and the convenience of its honor-system method of ticketing, in which you don't have to pay to board, but need a ticket only in case inspectors ask to see it. I mostly rode the trams, and the whole time it never occurred to me that I once commuted regularly on tram cars, when I lived in West Philadelphia and rode the trolley (which sounds jaunty and old-timey and somehow illegitimate). It was a token-driven system that always felt vaguely like a rip-off.

That Philadelphia seems on the verge of reviving those moribund trolley lines that remain seems like an index of how the city's fortunes have changed (though Atrios points out that such talk has been around for a while). It gives the impression that the city is becoming more rather than less dense (which is not apparently the case). And density is ultimately the point of city life. If there are more jobs, residents, and destinations within the city, it makes sense to build out mass transit (that is, public transit that is not buses, which are irreparably stigmatized) that connects the various points and complements the hub-and-spoke regional rail. And then these lines would theoretically make the city more livable, helping make real the density they imply.

At one point in the early 20th century, the Philadelphia trolley system was nearly as elaborate as Berlin's trams; my mother used to talk about riding it out to the suburbs to go to Willow Grove Park, which was later demolished to make way for a bowling alley, and then a haute-bourgeois shopping mall. The legend I always heard was that auto companies bought up the trolley car companies and dismantled the tracks to force people into cars, and the associated path dependency helped hollow out cities, enact forced suburbanization and institutionalize the marginalization of the poor who don't own cars.

I don't know if that is a true story, but more generally, it's clear that middle-class Americans are now too committed to car culture to ever support European-style urbanism. For instance, zoning regulations in many cities seem to privilege parking spaces over buildings themselves, a reflection of the constituency they serve. And home ownership, sadly, is still the American Dream, no matter how isolating or unsustainable or anti-egalitarian it proves to be. (Try renting your way into a good public-school system.) Apparently, being American means avoiding neighborliness on anything but one's own terms.

Saturday, August 6, 2011

Alt.burrito (26 Feb 2010)

Maybe I have been focusing too much on HRO Exegesis lately, but: Is there something that's supposed to be indie or "alt" about eating burritos in America that I am unaware of? Driven to desperate measures by the impassable slush everywhere in New York City, I went next door from my office building to Qdoba, a fast-food burrito chain, and playing over the speakers in the place was "Stillness Is the Move" by the Dirty Projectors. What is that about? I nearly choked on a cilantro leaf. And Chipotle, another haute-burrito chain, has long been in the habit of playing music by the likes of Wilco, Nick Cave, Neko Case, Tom Waits, Bonnie "Prince" Billy, etc., in their dining areas -- perhaps a sneaky effort to quickly drive customers out and keep turnover sufficiently high.

Anyway, what is the explanation for this branding strategy? Is it an effort to make burritos seem safe for white people to eat, to accelerate the process by which Mexican food is demexified?

Band name drought (17 Feb 2010)

I was just wondering how it is that doom rock bands from Sweden can get away with giving themselves elegantly concise names like "Witchcraft" and "Graveyard," while bands in the U.S. feel obliged to come up with something (usually outlandish or unwieldy) that is not already taken on MySpace. And voilà, this WSJ piece comes down the pike about how difficult it supposedly is to name your band. It seems true that a cursory search will reveal that what you thought was a great, original idea was already thought up and acted upon by someone else. (I am still sad that both Black Horse Pike and White Horse Pike have MySpace pages.) It's enough to make you pine for the legendary days of local garage-band scenes, where every township could have its own group called the Outsiders.

But really, this is not that huge of a problem. The Awl does a good job saying what needs to be said about the piece.
I mean, how hard is it to come up with a unique band name? Armed with only Google, a rhyming dictionary, and an urgency to get a post done, I challenged myself to come up with ten new group monikers for which there were no registered alternatives. It took three minutes.
The list he comes up with is worth clicking through to check out.

All that said, the matter of a band name isn't something insignificant. It's arguably as important as the music itself. There are lots and lots of bands, and if some can be ruled out by virtue of having terrible names, then they will be. Rare are the bands that are better than their names: Spoon is the only one I can think of off the top of my head right now. But legion are the bands with bad names that stink.

Thursday, August 4, 2011

A Historical Note on the Hipster (12 Jan 2010)

Five or six years ago a commenter suggested I read Hal Niedzviecki, a Canadian journalist who writes about pop culture and consumerism. I went ahead and ordered his 2000 book We Want Some Too: Underground Desire and the Reinvention of Mass Culture and when it arrived, I promptly filed it into the nether regions of my bookshelves and forgot all about it. Yesterday, in the midst of a sweep for a bag of books I was putting together to take to the Goodwill store, I came across it again and started reading it.

I don't agree with everything in his argument -- he dismisses the Adornoesque view of popular culture and tends to champion cultural consumerism as an integral, inevitable mode of identity production -- and much of it seems to have been rendered obsolete by the emergence of Web 2.0 and social networking and the like. (He has to explain what an MP3 is and spends a chapter explaining the concept that listening to one Girl Talk song conveys much more thoroughly.)

What is interesting though is who he means by "we": a generational cohort he sees as obliged to define their identities solely in terms of pop culture: he calls this "lifestyle culture" and he seems to regard it as inescapable. "To negate pop culture is to negate the very foundation of our lives -- a foundation that is no longer found in religious instruction, in the moral precepts of the state, in the bosom of the family, but in the frantic embrace of a pop emancipation we crave despite, and because of, who we are." (No wonder he hates Adorno -- Adorno rejects and argues for the "negation" of the very basis of identity as Niedzviecki is willing to conceive it.) Since we define ourselves in terms of our tastes and our familiarity with pop culture, we can't admit that it is trivial. Instead we inflate its importance even more as we prolong our adolescence and nostalgize over Saturday morning cartoons.

"Lifestyle culture," he claims, "is our last, desperate, pervasive attempt to rebel against those who seek to reduce us to cogs in the machine." If that was the point of lifestyle culture, it has failed spectacularly; our everyday lives are more integrated with the machine than ever; social networks are harvesting and reselling the details of our cultural cry of self, conveniently translated already by our volunteer labor into terms of brands and trademarks already on the market. (After all, Facebook's founder has declared the end of privacy. Everything we do is fair game for corporate exploitation.) We continually need to assert our own coolness, and our efforts fuel the evolution of new cool signifiers. We work the lifestyle treadmill.

It's shocking that this seemed a new phenomenon in 2000. By 2010, that cohort had become generally known as "hipsters," and everyone was coming around to agreeing that their moment had passed. Not because anything they did has gone definitively out of fashion, but because they truly have become a "we" -- the mode of identity-fashioning through pop culture has become too ordinary a thing to demand a special label. People who behave like hipsters did back then are just normal now.

But when Niedzviecki was writing, there was still such a thing as a consumer "underground" in which devotion to pop obscurities was expressed in hand-mimeographed fanzines and home-taped mix cassettes and other arduous analog means. That culture was just beginning to die its digital death, and the outlines of hipsterism -- the zine mentality without the trouble of zines, the proud consumerism without the effort of digging up trivia and the sacrifice of marginalizing oneself -- were just beginning to become recognizable. Hipsterism is born when the cultural underground dies. It stems from the "crisis in authenticity" -- which is to say, it emerges when authenticity becomes a commodity. The mainstream expands and flattens out, lulls us with promises of mass participation: The internet was basically a build-out of the mainstream, and a way for it to incorporate all the stuff that once was considered outré or unmarketable.

Though often prescient about the incipient dilemmas of consumerist identity, Niedzviecki at times seems blind to how the proto-hipsterism he celebrates ("This process -- in which passive consumers become semi-active hobbyists and then, finally, full-blown creators -- challenges the gatekeepers of culture by asserting the power of the everyperson to be his or her own critic and creator") would metastasize. He is hopeful about what would become the definitive hipster strategy, what he calls "noncompliant compliance" -- "intelligent, meaningful creative actions that nevertheless acknowledge the primacy of pop culture." Without an underground, there is no other choice, he argues. Maybe he's right about that, but that still sounds like a Baudrillardian fatal strategy to me. I see no upside to the "collapse of the underground" -- what the Frankfurt schoolers talk about as the negation of negation.

I wish I had Niedzviecki's optimism. But he is able to sustain it because he hadn't anticipated how mainstream media companies and culture industries would be able to adapt to the long-tailing and nichefication of audiences, how our identity seeking becomes free immaterial labor for corporations -- no matter how small-scale or insular it may seem. Piracy and peer-to-peer sharing have hurt their profits, but the amateur culture and brand co-creation Niedzviecki sees as somewhat subversive and hopeful has more or less been a boon for businesses and hasn't proven a haven from the hegemonic ideologies of consumerism. Alienation endures. Niedzviecki recognizes that lifestyle culture leads to apolitical narcissism: "cogent political activism" is difficult, he notes, "in an age where everybody wants to be their own personal cause, their own underground myth, subject of their own fan club." That sounds like a description of Facebook to me.

With hindsight, it's easy to see how Niedzviecki misdiagnosed the problem. He was concerned that, inundated with culture, we would lose our identity. He imagines us all as desperate to do something that "can't be decoded and marketed back" to us. We never receded into the mass; instead mass culture mutated to cater to us as individuals, and we thrived on the way it could decode us. Identity, as that kind of decoding, has become ubiquitous, a compulsion. Our identity is at once more palpable and more fragile than it has perhaps ever been -- we have a rich and subtle language of objects with which to express it, yet no one seems to understand who we really are, and we keep trying to understand ourselves. We can't escape turning ourselves inside out and signing over our desire to consumerism to try to ease the dislocation, solve the riddle.

Wednesday, August 3, 2011

Avatar and Invisible Republic (29 Dec 2009)

I had no particular interest in seeing Avatar, but ended up seeing it the day after Christmas with my family. It seemed futile to resist. I even saw it in gimmicky 3-D, which added nothing to my enjoyment but did cause me to fidget ceaselessly with the glasses that I had to wear over my regular glasses. The film seemed primarily an exercise in glow-in-the-dark crypto-zoology, with little in the way of plausible plotting or character development. (We know Sigourney Weaver's character is cantankerous and outside-the-lines because they have her smoke a cigarette when she gets out of her cryo-travel pod.) It has a half-baked, programmatic but effective sentimentality that elicits emotional responses to the rite-of-passage cues. It kept me engaged by and large, though much of it reminded me of watching my roommate play Final Fantasy 9 (an oxymoronic title if ever there was one) on PlayStation while I was in college.

Only later did its trite politics annoy me. At first, I found it a little bothersome that I had to watch a bunch of humans get slaughtered by cartoons. Humans as a species haven't looked this bad in a sci-fi film since Paul Verhoeven's Starship Troopers (which I strongly endorse). We are given no explanation why the resource the greedy humans are after is so important other than the tautology that it's worth a lot of money. Weaver's character tries to counter the already confusing insistence on resource extraction with a non sequitur about how the "real value" of the planet the humans are pillaging lies in the fact that the trees are networked together to form a giant bio-Internet. (Great. The last thing we need is metaphors that glorify and naturalize digital, mediatized interconnectedness.) What is valuable about that? It's regarded as unimportant by the film's producers.

What is important to them is the quasi-spiritual mumbo jumbo about the native race on the planet, which seems modeled mainly on American Indian tribes and is represented in an extremely patronizing fashion as a bunch of simple primitives who understand their environment only in supernatural terms. It takes a human outsider, naturally, to teach them the significance of their ways and rally them to defend themselves, since they are helpless against aggression and superior military technology.

The Sociological Images blog sums up the racial politics this way:
Avatar is a fantasy in which the history of colonization is rewritten, but it is a fantasy specifically for white people living with a heavy dose of liberal guilt. And it is one that, ultimately, marginalizes indigenous peoples and affirms white supremacy.
I don't see how anybody can contest that analysis. I feel a bit ashamed, actually, that I was sitting in suburban Bucks County with all my white, middle-class compadres, complacently consuming the spectacle without becoming disgusted as it unfolded. At the time it seemed curmudgeonly and cliched for me to reject the high-imperialist homilies the film was lazily built on and the blithe righteousness I was expected to share with the "good" humans. I didn't resist being constructed as a viewer in that way because it felt good and flattering. It reaffirmed my sense of belonging to a group of wise and morally pure Westerners who would have done colonizing right -- that is, it played to the ingrained sense of superiority that being white and middle class in America provides. I should have been nauseated; instead I was verklempt as the hero claimed his squaw.

By coincidence, I began reading Greil Marcus's Invisible Republic, which in part is about the demise of the 1960s folk movement and Bob Dylan's role in destroying it after having come to exemplify it. The folkies, in Marcus's depiction, had the same patronizing attitude toward Appalachian poverty and civil-rights injustices (the Other America, as Michael Harrington dubbed it) that the makers of Avatar seem to evince about colonization. Capitalism sullied and exploited the pure rural people, as clear-headed bourgeois liberals can best recognize. To adherents, folk music (and Avatar) offers us glimpses of pre-capitalist America, a "democratic oasis unsullied by commerce or greed" in which art seems "the product of no ego but of the inherent genius of a people." The Avatar planet is such a product, for the race occupying it and the film-industry execs who made it.

The substance of this fantasy about indigenous people at harmony with their appropriate environment is the denial of individual subjectivity (the overriding value of the folk revival, according to Marcus), which is rendered unnecessary and impossible. Everyone is at one and merged with one another. Just look at the blue people in the movie sway to the unsounded rhythm as they worship their special tree. Marcus: "As they live in an organic community ... any song belongs to all and none belongs to anyone in particular." This is an attractive fantasy to have about other people, as it leaves oneself as the last unique individual standing -- like the hero of Avatar. Folk music tends to make a virtue out of a subject people's lack of autonomy because its adherents can't see a way to ameliorate those people's powerlessness without surrendering some of their own comfort. Avatar offers a fantasy solution, in which one vicariously becomes one of the subject people without losing one's distinctive identity, and then helps that group achieve autonomy. The story conveniently ends there, before the logic of communal unity eradicates the hero's sense of self.

But the faceless masses are most likely not so keen on being turned into a contemplative object for someone else, not psyched to have their identity and destiny predetermined by historical circumstances. We generally want someone else to be living by that pure code of acceptance of "authentic identity"; we are always tempted to try to reserve for ourselves the power to shape our own destiny and be anything we want. No one seems to volunteer to become the folk if the condition of that is disappearing into holistic anonymity. Instead we impose our notion of authenticity on others, and let their being trapped in it serve to limn the terms of our own private freedom.

Vinyl Sales Surge (9 Dec 2009)

The NYT had a mostly anecdotal piece this past weekend about how more people of the "iPod generation" are suddenly buying turntables and vinyl records. I don't know if I am demographically part of this generation, but I can relate -- I've been reacquiring albums I used to have in an effort to recapture the listening experiences of my youth, in which an entire side of a record would get digested in the full flush of analog warmth.

Interest from younger listeners is what convinced music industry executives that vinyl had staying power this time around. As more record labels added vinyl versions of new releases, the industry had to scramble to find places to press discs, said Mike Jbara, president and chief executive of the sales and distribution division of Warner Music Group.

“It is absolutely easy to say vinyl doesn’t make sense when you look at convenience, portability, all those things,” Mr. Jbara said. “But all the really great stuff in our lives comes from a root of passion or love.”

It makes sense for music companies to push vinyl since they have no choice but to try to reorient consumers to the meaningfulness of physical objects. So it is hard to take Jbara seriously here as he spouts marketing propaganda. Nevertheless I think listening to records is appealing at this moment precisely because it is inconvenient, and maybe inconvenience is, as Jbara suggests, part of the essence of love. Overcoming obstacles is a feature, not a bug, as they say. It brings weight to the experience, makes it contrast with all the experiences that commercial culture has made easy (which in turn makes us consumers passive).

Yeats's "The Fascination of What's Difficult" seems relevant here. He seems to be arguing the opposite:

The fascination of what's difficult
Has dried the sap out of my veins, and rent
Spontaneous joy and natural content
Out of my heart.

It's the myth of "spontaneous joy" and "natural content" that has ushered in the "iPod Generation" and its peculiar restlessness, its voracious, overstimulated cultural appetite, its demand for instantaneous distraction. The absence of "natural content" -- the fact that satisfying oneself requires effort -- opens the possibility of substituting novelty for deeper satisfactions. When something doesn't spontaneously please us, we are invited to think the solution is to try something different, not work at the "difficult" thing that has failed us. The vinyl fascination may be the stirrings of a counter-movement that fetishizes difficulty as a way of fixing value, as measured in attention and consumption effort rather than ease and monetary cost.

Or perhaps it is just an expression of a nostalgia for material souvenirs of our taste. Records are a way to make my musical taste seem more substantial to me -- see, these are the albums I really care about, the ones I like so much, I listen to them on vinyl. On my computer, I have everything, so none of it will seem like it is special to me. The record collection is different; it requires sacrifices. Vinyl is the new way to signify that you are "serious about music," since having access to lots of songs and being familiar with lots of different stuff no longer suffices.

One thing I have learned in my return to vinyl is that records that skip are not endearing; they are totally annoying. And it doesn't take very much to induce a record to start skipping.

Tuesday, August 2, 2011

Music Discovery Stories (10 Nov 2009)

Nicholas Carr linked to Duran Duran bassist John Taylor's essay (!) for the BBC about how the internet changes music consumption. He relates a story about seeing Roxy Music on television in 1972 and riding his bike for miles to go to a shop where he could buy the record.

We had no video recorders, and of course there was no YouTube. There was no way whatsoever that I could watch that appearance again, however badly I wanted to. And the power of that restriction was enormous.... The power of that single television appearance created such pressure, such magnetism, that I got sucked in and I had to respond as I know now previous generations had responded to Elvis Presley on the Ed Sullivan show, or The Beatles, or Jimi Hendrix. I believe there's immense power in restriction and holding back.

The moral is familiar: On-demand culture deprives cultural-industry product of its aura, and consumers are left with a shallow and superficial relation to it. That seems to sell the power of the product itself somewhat short -- if the songs are really good, the aura artificially secured by restricted access presumably shouldn't matter to our aesthetic response. The would-be John Taylors of today should be listening to "Virginia Plain" over and over again despite downloading it. As he points out, the internet can free us from the tyranny of what's popular now and let us discover and become obsessed with culture from a diverse range of eras and locales.

What's lost is the monoculture -- the idea that everyone saw the same TV program and then could differentiate themselves in their community by their diverse responses. In today's cultural environment, everyone seems to be expected to be magpies, amalgamating all sorts of ad hoc bits of culture for themselves. (They are creating their own economy, as Tyler Cowen would say.) So distinctive gestures of musical taste are in some ways harder to find; people find it harder to interpret what your being into Roxy Music is supposed to mean. In the 1980s, it did mean a lot more to be into obscure bands, but much of that meaning was snobbery and exclusivity -- "I know someone who's got a lot of SST records and seven-inches." (I remember feeling weirdly betrayed when certain CD reissues started to come out -- "But I worked hard to gain access to those Gang of Four records! I traveled!") With "restriction" in place, music can serve as a positional good, slotting us into subcultural hierarchies. That the internet has assaulted that citadel is unequivocally positive.

Also, I think we will continue to generate stories to go along with the way we discover songs. It is just that the retail encounter will no longer be part of that story. The restrictions imposed by the artificial scarcity created by the music industry will no longer determine significance; instead our own mnemonic efforts will be crucial to weeding among the stuff we binge-downloaded to elevate the songs that signify more than our idle curiosity.

But that requires an effort that we are no longer forced to make -- we can follow the route that society seems to encourage (through marketing and media triumphalism about discovering new trends and such) and just pursue novelty instead of depth in our relation to culture. It seems harder to summon the willpower to impose consumption restrictions on ourselves now that money doesn't do it for us. (That doesn't mean I want to go back to being culturally poor, though.)