An article at the AV Club by Sam Adams looks at the implications of Netflix's streaming service and the growing popularity of Spotify, a music-streaming company. He begins with an observation that seems unassailable to me -- "Convenience and choice are the watchwords of the digital era, in which content must be instantly accessible and as quickly digested, lest consumers flit off to some more welcoming destination" -- but I was confused by the analysis that follows, which didn't really explain why consumers are so susceptible to novelty and what he calls the "convenience trap," the willingness to consume what's available as opposed to what is presumably good for you. Adams fears we may be "unconsciously downgrading anything that isn’t so ready at hand."
But what does that mean? Why does everything have to be graded? And does an unconscious grade have any meaning? If you can't be bothered to make your tastes conscious, then what difference does it make to you what you watch? And why should anyone else care? Adams is concerned that the great works may be lost to history if streaming services don't assimilate them into their libraries: "Spotify’s great, unless you want to listen to anything Hüsker Dü recorded before its major-label debut. Would you trade New Day Rising for the Black Eyed Peas catalogue?" This doesn't strike me as a serious question. If you badly want to hear New Day Rising, try this. If you care about music, you probably won't let Spotify dictate what you can or can't hear, and digital reproduction has made it fairly likely that digital copies of everything will survive and proliferate. (Our real archival concern should be with the survival of analog artifacts that have yet to be digitized -- even though digitization may lead to a not entirely representative version of a work surviving.) The people who have a lot invested in their entertainment choices will supplement streaming services with ready alternatives. The people who don't diversify their supply basically don't care, and why should they? Because certain art is good for them, and they should be made to consume it through clever institutionalized market nudges?
Adams's implicit concern seems to be that the tasteless masses will be left to languish in their cultural ignorance because the streaming services they thoughtlessly adopt don't force more redeeming content on them. And he also seems to think that if you are not clever enough to make redemptive consumer quests for the great works, you will be too dim or uninterested to understand them: "If you’re not inclined to put forth the effort to get yourself in close proximity to a given artwork, will you be willing to expend the mental energy necessary to understand it?" Apparently if you live next door to the Prado, Goya's works there become more or less indistinguishable from Hagar the Horrible comics.
Working hard to gain access to a work has nothing intrinsic to do with being willing or able to interpret it. Adams offers an S&M take on art appreciation, that art should dominate and master us while we subserviently mold ourselves to its masterful lessons: "the viewer—not, please, the consumer—is fundamentally subservient to a work of art, in which it is our responsibility, and often our pleasure, to come to the work rather than expecting it to come to us. After all, shouldn’t art be inconvenient, if not in the sense of being difficult to access, then because it forces us out of our comfort zones, requiring us to reckon with its way of understanding the world?" I am pretty sympathetic to this, but I don't think my attitude needs to be generalized. It's not the only way to engage with art. And though I may try harder to get something out of a show I have to travel far to see, that doesn't mean I necessarily cruise through a nearby show on autopilot.
The idea that difficulty is necessary to have a "real" art experience is similar to the idea that something more real happens when the art encounter is "spontaneous" -- being surprised by the beauty of a sunset, etc. It is always tempting to extrapolate a dogma out of such experiences when they are profoundly affecting, but that would be a mistake. I don't think there is a prescription for assuring edifying aesthetic moments. Instead, when people try to push some recipe for the aesthetic onto someone else, they are imposing an encapsulated version of a status hierarchy that favors them. Ultimately, whatever they are pushing now (no matter how universal the principles are presented to be) will be repudiated later in pursuit of some fresh form of distinction. Isn't this an extremely elitist question: "How much more likely are you to bail on, say, Apichatpong Weerasethakul’s Uncle Boonmee Who Can Recall His Past Lives, when with a few clicks of your remote you can be watching a favorite episode of Friday Night Lights?" This seems to mean: You dummies should be watching the hard stuff (like me) but instead you are weak and let the technology trick you into watching what is mere middlebrow entertainment. You're trapped in your own lazy tastes.
Adams points out that "we carry around unspoken assumptions about what’s long and what’s short, what’s easy and what’s hard, and when those assumptions calcify, we may no longer be aware they’re there." Yes, this is how ideology typically works, and it extends far beyond how we choose to entertain ourselves. Making ourselves aware of our unthinking assumptions about what is common sense is probably always a good and worthy practice. But we don't encourage people to join in that project when we imply that the reason it is necessary is so that they can conform to some other dogma about what cultural product is correct and appropriate. That replaces one politicized mystification with another. Yes, Netflix -- like many consumer goods manufacturers -- would probably love it if we consumed simplistic mind candy as quickly and as often as possible; that's good business for them. And that incentive contributes to their trying to shape and promulgate a certain ideology about what it is fun to do. Their pay structure contributes to a materialization of that ideology. Convenience almost always serves an agenda of accelerated consumption, which is passed off as maximized happiness or efficiency. (You've consumed more, so you are better off!) But implying that people need to consume the "right" things instead of the convenient things seems to substitute an elitist ideology for a consumerist one, and may trigger reactionary retrenchment among the consumerists one may be trying to rescue with screenings of Béla Tarr films and copies of Metal Circus.
In 2007 I made the argument that subscription services "almost make the idea of having selective musical taste superfluous. Not that there is anything wrong with that; musical taste's centrality to identity seems a peculiar quirk. Nonetheless, taste in commercial music comes down to what music you are willing to pay for specifically. If you are paying to have it all, you effectively have no taste." That is, in a consumer society we have this sense that you have to put your money where your mouth is to "prove" your taste. The idea that you need to suffer to acquire access to "real" art in order to appreciate it has a similar inflection to it -- that art needs to be scarce to have an aura of significance, which derives from people earning/paying for the privilege to consume it. But it seems more interesting to break out of the idea that scarcity imposes some mystical meaning on things to see what they might mean beyond that.
Friday, August 19, 2011
Google and goon squads (15 July 2011)
This December 2010 post by Peter Frase, addressing how capitalism might cope with technology's diminishing the need for labor inputs, has deservedly been put into broader circulation by Matt Yglesias and Metafilter. Frase sets up a thought experiment based on the Star Trek fantasy of a world in which productive labor has been rendered unnecessary, energy supplies are inexhaustible, and all humans apparently share in universal prosperity. Given these conditions, Frase wonders "how would it be possible to maintain a system based on money, profit, and class power?" Would capitalist relations continue to organize society even in the absence of the scarcities that legitimize those relations? If so, how? (Also, are we headed to this sort of society, given the persistence of unemployment and the arguably structural problems with Western economies that economist Michael Spence discusses here?)
Frase imagines that such a society would lean heavily on intellectual property law, presumably enforced by a draconian, all-encompassing surveillance state. It's not too hard to imagine Google facilitating this under the Orwellian auspices of "Don't be evil," especially after reading this article by Evgeny Morozov. "History is rife with examples of how benign and humanistic ideals can yield rather insidious outcomes—especially when backed by unchecked power and messianic rhetoric," he notes, and cites Siva Vaidhyanathan's argument from The Googlization of Everything that "the triumph of neoliberalism has made the 'notion of gentle, creative state involvement to guide processes toward the public good ... impossible to imagine, let alone propose.' " As manufacturing increasingly becomes a matter of information rather than manpower, Google's control of the information economy will potentially afford it the opportunity to implement a social structure. We would all essentially work for Google, whether (to draw on Frase's categories of post-productive labor) we are producing, sorting, and circulating content to augment its social value -- immaterial labor, by Lazzarato's definition, which Hardt expands to affective labor; I've written a bunch of posts about this sort of thing -- or whether we are muscle for intellectual-property enforcement (lawyers and "guard labor," to use the term Frase adopts from this paper).
Both immaterial labor and lateral surveillance seem to be expanding under the auspices of commercial social media and, as Frase notes, gamification, establishing the infrastructure and the mores to prevent informationalization from leading to an expansion of the commons, as P2P enthusiasts hope. Frase links to Yochai Benkler's Wealth of Networks, which sounds an optimistic note about the increased role of sharing and cooperation in production. Benkler's analysis resembles in some ways the Marxist theories regarding the "general intellect" that have evolved out of this cryptic section of the Grundrisse. (My effort at decoding it here.) Hardt and Negri extrapolate from the productive cooperation of the "general intellect" -- the development of which capitalists theoretically must foment to sustain productivity -- something they call the Multitude, an emerging political force that transcends state power and instantiates some sort of spontaneously self-organizing communism made of networks and flows. But it seems as though Web 2.0 companies are developing precisely to pre-empt such possibilities, to enclose the emerging commons and fuse them to structures that emphasize competition and individualism in the midst of enhanced sociality, that foreground status hierarchies rather than dissolve them, that articulate class distinctions rather than undermine them, and so on. Social media foster new forms of "artificial" scarcity (in attention, fame, relevance, identity, etc.) at the same time they ease inequalities in access to cultural goods. We can all download all the music and movies we want and remix them to our hearts' content, but this doesn't touch the inequalities that form the basis of class.
And reproducing class -- guaranteeing that pre-existing inequalities in wealth and power are carried forward even in the absence of more-traditional methods of labor exploitation -- is capitalism's primary raison d'être (not increasing productivity or freedom or the "wealth of nations").
That's an implicit point of Frase's thought experiment, I think -- to suggest that no amount of prosperity or labor reduction will get rid of the class system and the exploitation it engenders structurally. It's not a set of social relations designed to promote equality, but its opposite. It creates a dynamic set of values that protect privilege in the face of abundance, in the face of technological improvements, in the face of developments that threaten to invalidate the aristocratic pretenses to inborn and inaccessible superiority.
Frase wonders where the money will come from to sustain the society of the future if zero-marginal-product workers have no right to expect to earn anything (according to neoclassical economic models) in a post-productive economy:
Thus it seems that the main problem confronting the society of anti-Star Trek is the problem of effective demand: that is, how to ensure that people are able to earn enough money to be able to pay the licensing fees on which private profit depends. Of course, this isn’t so different from the problem that confronted industrial capitalism, but it becomes more severe as human labor is increasingly squeezed out of the system, and human beings become superfluous as elements of production, even as they remain necessary as consumers.

He wonders if capitalist ideology would be flexible enough to permit the guaranteed wage system this dilemma seems to require -- people get issued some token amount of money to keep the wheels spinning -- and if this nonetheless implies stagnation, the end of capitalist growth (and possibly capitalism itself). The issue seems to hinge on the difference between that minimal wage paid out (which stultifies its recipients, locks them in class position) and the creation of economic value that continues to accrue to capitalists. The value creators -- the minions of the general intellect -- need some nominal amount of money circulating among themselves to lubricate the gears of the social factory, but enough real value must be extracted from that factory to sustain the class divide -- to forestall redistributive effects. (My postulate is that capitalists will not create or sustain enterprises that redistribute wealth, only ones that concentrate it.) That value probably can't continue to be denominated in the same currency as the wages. Perhaps this is why more people are becoming content to work for attention, especially in the sectors most transformed by information technology, the ones subsumed by code. Google has indeed rolled out "badges" to reward users for consuming and processing news stories through its interface, as Rob Walker notes here.
In the dystopian Google-run world of the future, workers will have attention rankings and goon-squad thug power to oppress one another and promote general insecurity; meanwhile real power and privilege will adhere to the corporation, its big shareholders, and those politicians it patronizes to protect itself.
Monday, August 15, 2011
Paying for the Internet in Vulnerability (6 Aug 2010)
Often, despite my not infrequent fulminations, as I find myself spending more and more time in front of computer screens, reading and writing and even Twittering with ever more frequency, I start to wonder if I have been too pessimistic about the internet, about its role in accelerating our consumption of culture, the degree to which it more thoroughly saturates our everyday lives with marketing and its associated ideology: the celebration of novelty for its own sake, the embrace of narcissism as a mode of hyperfriendship, the supplanting of knowledge with information and data, the transformation of consumption into meme production, the mobilization of identity into a circulating personal brand that articulates the amount of society's attention one is worth, the disappearance of contemplation in favor of increased mental throughput, the sense that quality, though frequently brandished as a goal, is in truth a liability unless it can serve as an emollient to our alacritous neuroprocessing. (I was going for a sentence of Ruskin-like expansiveness -- how did I do? Perhaps protracted Proustian periods will persuade us all to take the long view now and then.)
But when I read a news item like this WSJ article, by Nick Wingfield, I am reminded all over again that I am not as cynical as I should be. The article details how Microsoft considered developing its internet browser so that user privacy would be better protected as a default, but then decided that such a course would inhibit the true purpose of internet accessibility.
In early 2008, Microsoft Corp.'s product planners for the Internet Explorer 8.0 browser intended to give users a simple, effective way to avoid being tracked online. They wanted to design the software to automatically thwart common tracking tools, unless a user deliberately switched to settings affording less privacy.... In the end, the product planners lost a key part of the debate. The winners: executives who argued that giving automatic privacy to consumers would make it tougher for Microsoft to profit from selling online ads.

The internet is ultimately not a commons, and our access to it is conditional on our vulnerability within it. Neither Microsoft nor any other tech company is in business to open our access to free-flowing information or protect our privacy for nothing. (The companies that do want to help you do that are parasites who rely on the others to intentionally endanger it.) Their business, as network architects and technicians, is ultimately surveillance -- to make sure one is connected to the network and appropriately exposed, exploitable as a node. Wingfield points out that "the 50 most-popular U.S. websites, including four run by Microsoft, installed an average of 64 pieces of tracking technology each onto a test computer." We get to use the internet, or rather companies want to make it possible for us to use the internet, because they can reap the rewards from our data processing there -- that's the only reason. And at tech companies that survive, executives are in place to smack down the wild-eyed dreamers among the product developers who think otherwise. This graphic illustrates the way the tracking systems work, and how we, in our lust for information, work to transform ourselves into demographic data.
And here's more reason for cynicism: Google's negotiations with Verizon to in effect put an end to net neutrality. They are discussing placing a burden on content creators to pay to have their content distributed efficiently on the internet. This seems like it would ultimately reinstitute the gatekeeping power of the media companies, which would quickly turn such costs into something that mimics the costs of printing and distributing bundles of paper, or pressing grooves into vinyl, or what have you. So any dream of the internet being a democratizing, disintermediating force in the realm of cultural production would be effectively quashed. Amateurs would be on the ham-radio section of the net, with transmissions at lugubrious levels, while the professional media would be on the "real" internet.
Death of the Author? (5 Aug 2010)
I've thought this over a bit today and basically agree with Matt Yglesias: the claim Trip Gabriel reports in this NYT article -- that the ethos of the internet is prompting kids to plagiarize more than they used to -- is pretty dubious. Here's the core of Gabriel's article:
Professors used to deal with plagiarism by admonishing students to give credit to others and to follow the style guide for citations, and pretty much left it at that.
But these cases — typical ones, according to writing tutors and officials responsible for discipline at the three schools who described the plagiarism — suggest that many students simply do not grasp that using words they did not write is a serious misdeed.
It is a disconnect that is growing in the Internet age as concepts of intellectual property, copyright and originality are under assault in the unbridled exchange of online information, say educators who study plagiarism.
Digital technology makes copying and pasting easy, of course. But that is the least of it. The Internet may also be redefining how students — who came of age with music file-sharing, Wikipedia and Web-linking — understand the concept of authorship and the singularity of any text or image.
Gabriel's article seems like misplaced anxiety; the stakes are pretty low with plagiarism: students are basically only "hurting themselves" by cheating on their homework, as the proverb goes, and it's not like the papers are up for publication. These cheaters are not David Shields or Jonathan Lethem. The idea that students suddenly don't understand the concept of authorship reminds me of the worst nightmares of the fuddy-duddy professors who would fulminate about Barthes and Foucault and "this so-called textuality" when I was a graduate student. Where was the proper respect for Genius?
Gabriel interviews anthropologist Susan Blum, who seems like this species of worrywart.
In an interview, she said the idea of an author whose singular effort creates an original work is rooted in Enlightenment ideas of the individual. It is buttressed by the Western concept of intellectual property rights as secured by copyright law. But both traditions are being challenged.

Obviously she hasn't heard Mark Zuckerberg lecture about integrity and a single online identity. I'd be surprised, too, to find that students are uninterested in authenticity and unique identity and are seeking to merge with the multitude in a gesture of postmodern antisubjectivity. Self-broadcasting media and Web 2.0 seem to emphasize the value of a unique identity, not dissolve it.
“Our notion of authorship and originality was born, it flourished, and it may be waning,” Ms. Blum said.
She contends that undergraduates are less interested in cultivating a unique and authentic identity — as their 1960s counterparts were — than in trying on many different personas, which the Web enables with social networking.
Students, I suspect, don't take attribution seriously because the work they are being asked to do is not serious to them. They don't have much of a sense of scholarship as a collective enterprise, or of what they do in college as scholarship. With gen-ed classes, they know they are just marking time and doing busy work for the most part. They are right to think that plagiarism is not "a serious misdeed" that is somehow different from any other form of academic dishonesty. To pretend otherwise is to serve the ideological bidding of the lords of intellectual property.
The implication of plagiarism hysteria is that scholarship is a process of claiming ownership of proprietary information, an exceedingly unnatural attitude that students have always needed to be indoctrinated into, particularly if they want an academic career. This usually involves a series of ritualized genuflections in the form of citations of the recognized masters of a particular discipline as part of a student's professionalization into the academy.
Yglesias notes that the Web prioritizes the association of data with its metadata -- song files with the artists, etc. -- and thus generally organizes information so that it is easier to deduce where it originated if you are so inclined. It's never been easier to catch cheaters, he points out, something that was true even when I last taught college courses, in 2001.
I am inclined to think that the ubiquity of material available for appropriation and the ease of cutting and pasting itself explains most of the alleged rise in plagiarism. In my experience, most students who were inclined to cheat were way too lazy to retype passages out of a book.
Labels:
academia,
intellectual property,
social media,
technology,
theory
Friday, August 12, 2011
Memes and Marketing (22 July 2010)
In the most recent NYT Magazine, Rob Walker has an article about the ramifications of internet memes. It's well worth reading and not merely because it mentions Carles.
Any discussion about internet memes -- beyond participating in them by passing them along -- is inevitably a discussion about marketing. Meme circulation is an obvious laboratory of persuasion, a sort of testing process that exposes otherwise hidden ways in which the network connects us, revealing the strength of certain circuits and what causes them to fire. For marketers, memes sound-check the microphone and PA system of the internet in preparation for the commercial messages it will be expected to carry; they also allow a census to be taken of what sort of people are listening. Proven memes serve as a marketable demonstration of pure influence, abstracted from the relevance or utility of the message. So in other words, a meme is a pure, formal expression of marketing potentiality.
Not surprisingly, the meme professionals Walker talks to are all in the marketing business in one way or another. Tim Hwang, the organizer of the academic conference on memes that Walker attends (he now works for a branding company), is puzzled by those who challenge the natural marriage of meme-making and marketing: "There’s been this weird push around ‘Did ROFL culture sell out? Who owns all these spaces? ... This isn’t cool anymore because there’s people making money off it.’ " But those aren't exactly "weird" questions. Because memes pointedly blur the distinction between consumption and production, they prompt all sorts of concerns about the future of knowledge work, of creativity, of everyday life as a social factory -- of who makes "internet awesome," to borrow the terms Walker picks up on to start the article.
Harnessing the way people spread memes is a bit like getting a free media buy, only with far more dynamic potential. Ownership of the mechanisms that harness the value in meme production matters -- people make real money off the free labor of others' "goofing off". And that labor supplants work that has in the past been compensated with wages or intellectual property claims. The content of memes can be exploited, as the LOLcats entrepreneurs have managed to do. (And incidentally, why do people criticize LOLcats for being stupid? They are just mainstreamed Barbara Kruger.) And it is certainly not that memes become uncool when their links to marketing become explicit -- what makes the connection explicit is the marketers' own evocation of cool with regard to the process of meme circulation; their insistence that it be turned to account, made use of, made profitable in some way by urging the reconception of memes as culturally significant.
I don't know if what I mean by that is at all clear. Memes aren't "cool" -- they become cool retroactively when they are reported about, exploited, made to serve another function other than their pure viral transmission, their discharge of an ephemeral need for distraction. The space in which memes are initially generated is noncommercial, almost quasi-utopian. Maybe even postcapitalist. It's the anonymous space of 4chan that Walker cites, for example, a Bataille-like realm of expenditure, negativity, unredeemed human urges, unrationalized and unassimilated expression. It traces potential routes of circulation that operate outside of commercial influence. Walker mentions the American Indian trickster myth in relation to this, but it also exemplifies Bakhtin's notions of the heteroglossic and the carnivalesque -- in other words, it's a discourse that subverts or mocks official culture though doesn't necessarily challenge it (and may in fact reinforce it by venting off popular discontent or surplus creative energies). For better or worse, it's all about the lulz, as Walker points out.
The “for the lulz” attitude can be more broadly thought of as a rationale for the idea that everything is worth making fun of, nothing should be taken seriously, not even a guy getting punched in the face until he bleeds.

As long as that attitude and the material it produces trouble us, the space of memes retains its peculiar autonomy and can function as a kind of unreflexive social critique, if not a return of the repressed. It has the potential to be something other than what Marcuse calls repressive desublimation -- the liberation of formerly suppressed drives as a new mode of social control.
But that is where the meme professionals/marketers step in to play their crucial role. Marketers find such spaces and contrive ways to assimilate them -- a function far more important than selling any particular product or idea. They bring tangible social recognition and even money for those savvy enough to game the system that has sucked them in. And in the process the creative and chaotic energy that fuels the generation of memes is neutered, rechanneled toward supporting the status quo.
Something LOLcat honcho Ben Huh says touches on this:
“What interested me the most was there’s this entire community of people devoted to following the rules and the system behind the framework of Lolcats,” Huh, who is 32, told me. “No one ever said, ‘These are the rules.’ But everybody said, ‘I know the rules.’ ”

At first, there are no rules but instead the spontaneous order of the bacchanalia. But then, as metacommentary about memes begins to be distributed through marketing channels, rules are codified, and what was a spontaneous expression of collectivity becomes obedience and exploited productivity masquerading as free expression.
Labels:
immaterial labor,
intellectual property,
marketing,
viral
Thursday, August 4, 2011
DIY Manufacturing, the DRM Future, the iPad (29 Jan 2010)
Chris Anderson's Wired article about changes in the manufacturing sector is as breathless and hype-ridden as one would expect ("Step inside and the office reveals itself as a mind-blowing example of the power of micro-factories"; "A garage renaissance is spilling over into such phenomena as the booming Maker Faires and local “hackerspaces.” Peer production, open source, crowdsourcing, user-generated content — all these digital trends have begun to play out in the world of atoms, too. The Web was just the proof of concept. Now the revolution hits the real world.") It's full of anecdotes about crowdsourced design and 3-D print shops and the world of liberated freelancers and the like. At Gizmodo, Joel Johnson provides a useful corrective: "We used to call 'micro-factories' 'small businesses,' but that was before we knew they were a revolution."
What Anderson seems to miss in all his glee is the erosion of labor's bargaining power. Since we'll all be small-scale manufacturers, he seems to assume, there will be no laborers, per se. Or they will all be in China, at any rate, and who cares about them? The reality of crowdsourcing is that it is a good way to find someone to do any given piece of work the cheapest. And there are always people out there who underestimate the value of their abilities.
The future may be a time when we can't sell our labor power alone; we'll all need to be small-time entrepreneurs, hawking some small-time idea or contribution to a project, just to hustle up a living. In other words, in the future we will all basically be living off the books, and if you've read Sudhir Venkatesh's book of the same name, you know that's not such a good thing. Another book that is probably relevant to this is The Jobless Future by Stanley Aronowitz and William DiFazio, which I've not yet read.
Still, the question of whether the disintermediation facilitated by the internet is causing a revolution in the means of production -- what they are, who has access to them, how they are related to capital, and so on -- is well worth considering. Michel Bauwens, whose P2P Foundation site is a fount of links and essays about whether the internet can be the basis for a whole new mode of social organization, posted this summary of his ideas about what he calls (unfortunately) "netarchical capitalism" -- an economic system in which the most important means of production is the information infrastructure that allows for participatory networks to form and free labor to be performed and what Paulo Virno (following Marx) calls the general intellect to operate. The general intellect is basically Marxist jargon for decentralized collaboration and cooperation, the generalized sharing of useful information about how to make things or consume things. As Virno defines it, it is "inseparable from the interaction of a plurality of living subjects. The ‘general intellect’ includes formal and informal knowledge, imagination, ethical tendencies, mentalities and ‘language games’. Thoughts and discourses function in themselves as productive ‘machines’ in contemporary labor." In other words, the most valuable thing in the early days of the Industrial Age were machines and factories -- you needed them to compete. Now, those are arguably less important than knowledge, how to operate machines and disseminate their products. And thanks to the internet, that knowledge is starting to belong to all of us.
Virno is glossing the "Fragment on Machines" from the Grundrisse, in which Marx suggests that technology will make human labor time less central to production, even though it remains the critical component in creating surplus value through exchange. Virno: "The main lacerating contradiction outlined here is that between productive processes that now directly and exclusively rely on science and a unit of measure of wealth that still coincides with the quantity of labor embodied in the product. According to Marx, the development of this contradiction leads to the ‘breakdown of production based on exchange value’ and therefore to communism." Optimists believe we are seeing that play out now in the development of "the networked information economy" to use economist Yochai Benkler's term (though Benkler does not seem to think these changes threaten the foundations of capitalism).
But that won't happen without a fight. Bauwens recognizes that the information infrastructure will remain in the control of capitalists and could close off the liberating potential of new technology. "A new capitalist class is emerging," he writes, "the forces which both ‘enable’ and exploit the participatory networks arising in the peer to peer era." He adds, "Although the large netarchical corporations do enable participatory networks, their for-profit nature makes them dangerous trustees of commons-favorable protocols." He lists some examples of netarchical capitalists, but no better example exists than Apple, whose new tablet device is clearly an attempt to toll the flow of information and reinstate the prerogatives of private intellectual property in the face of an emerging commons.
As Tom Foremski explains:
By building a proprietary, closed platform, with its own hardware and software, Apple is able to capture a larger part of the value stream from selling media....If Apple's device is successful, Nicholas Carr writes, "we’ll all be using iPads to play iTunes, read iBooks, watch iShows, and engage in iChats. It will be an iWorld."
Apple is making a bold bid to tie up a dominant share of the future media e-commerce market -- the sale of digital books, movies, newspapers, etc. Its proprietary hardware and software strengthen its DRM; media creators want strong DRM, which will attract them to Apple. And its iTunes store distributes the media for them and collects payment.
That's also the conclusion reached by Tim Lee (via Matt Yglesias): "Apple seems determined to replicate the 20th century business model of paying for copies of content in an age where those copies have a marginal cost of zero."
Apple is canny in leveraging the zeal its early adopters have for its designs into a widely held belief in its products' inevitable superiority. Writes Lee, "In the short term, Apple’s technological and industrial design prowess can help to prop up dying business models." The press goes along with this now, hyping Apple PR as important news. Apple clearly intends to use its perceived advantage in design to dictate the terms of media consumption, and the company's slavish fans appear to be willing servants eager to carry out the dismembering of the general intellect. But is Lee right that this strategy is doomed to long-term failure, that people will tire of being in walled gardens? I really don't know. Are Kindles popular?
Thursday, July 21, 2011
Digital anarchy (17 June 2009)
Generation Bubble reports on the UCLA Mellon Seminar in Digital Humanities and its most recent manifesto, which proposes an aggressive assault on intellectual property: the digital humanists movement "believes that copyright and IP standards must be freed from the stranglehold of Capital, including the capital possessed by heirs who live parasitically off of the achievements of their deceased predecessors." Thus, the manifesto proposes we "pirate and pervert materials by the likes of Disney on such a massive scale that the IP bosses will have to sue your entire neighborhood, school, or country" and "practice digital anarchy by creatively undermining copyright, mashing up media, recutting images, tracks, and texts." By these lights, Girl Talk is not a lame DJ but a Trotskyist firebrand leading the revolution from his laptop mixing board, one mashup salvo at a time. The manifesto regards media miscegenation as an inherent expression of freedom rather than a perhaps lamentable indication of the trap we are in, at a few stages removed from original creation, doomed to fabricate our material culture from shopworn digital remnants.
The manifesto suggests that eradicating intellectual property will lead to more cooperative intellectual labor, mediated by internet-distributed open-source software tools, while facilitating the "reinvention of the solitary, 'eccentric,' even hermetic work carried out by lone individuals both inside and outside the academy". That sounds somewhat sinister -- fomenting an effort to re-educate decadent individualists, perhaps through some rigorous self-criticism and a few self-denunciation sessions, and make them into better-functioning members of the collective, content to have their anonymous contributions to the new society recognized through its success at maintaining total control.
One need not be especially cynical to question the utopianism the manifesto trades in. Intellectual property is not merely some conspiracy cooked up by Capital but a flawed expression of the individual's pursuit of social recognition, which under capitalism is expressed through wages, salary, or payment of some kind or other. Perhaps we are to believe that in the future everyone will be content to disappear into the mass, to be mashed-up in the grand sociocultural remix to end all remixes, but I doubt it; the would-be technoutopians out there also seem to be those most highly networked, those who are most plugged in to the contemporary means of publicity. And it is not like academics in the humanities eschew recognition; their reputational squabbles seem to matter more to them than any aspect of their scholarly contributions.
So doing away with IP, society's current mode of administering recognition, serves only to alienate the creators the manifesto's writers seek to liberate. What must be found is a way to replace IP with a different system for doling out that recognition -- the attention economy's currency. Generation Bubble points out IP's enforcement problems, which have the tendency to invalidate the concept's moral grounding.
The age of virtual reproduction, where the costs associated with making cultural artifacts have in many cases become negligible (just about anyone can, with a little bit of know-how, record studio-quality music on a desktop, for instance), has engendered an unprecedented situation. Gatekeepers of intellectual property now appear as veritable dogs in the manger. Each time they encode a sound-file to prohibit its copying, or each time they install crippleware on an electronic device to inhibit its full functionality, they betray the fact that scarcity is now more a matter of insistence than fact.

That's well put. But the fact that scarcity is always going to seem poorly manufactured suggests that we'll move on to a different tack: encouraging the deluge and enhancing the value of reliable editors and filters. In such a world, an individual's reputation for discernment will become even more valuable, and the economy within which they exist more hierarchical.
Tuesday, July 19, 2011
Performance theft (12 March 2009)
Wired reports on a mobile-phone application that lets a user scan a barcode of a DVD and launch a bit-torrent download of it at their home (link via BoingBoing). Somehow this seems more like stealing than using a search engine to find a torrent in the privacy of one's own home. Handling the object you will no longer have to buy seems to make tangible the notion of intellectual-property theft, which makes me wonder why anybody would bother to do it. Are there those among us gripped by a self-destructive desire to perform theft flamboyantly in public rather than in the peace and anonymity of their own homes? This would be like defiantly parading to the Adult Books store rather than surfing for porn online.
Perhaps straight-up pirates who are looking to steal everything to sell it on subway platforms and the like would benefit from a system in which they could just scan everything on the shelf, but these people would seem to have more reason to want not to be on camera in a retail outlet doing this.
It has been a long time since I browsed in a store looking for DVDs or music (one of the major quality-of-life improvements the internet has brought to my life is that I never have to go to a record store again), so maybe I have lost touch with that level of impulsivity that would make bar-code-automated stealing seem like a good idea. I suppose it has a poetic flavor to it, using the retail machine's tools against the system itself. (And then I'm going to get a tattoo of a bar code on my arm, to make an important statement about conformity.) It's hard to remember what it was like to have to discover new culture by browsing in stores, though it was once my primary mode of cultural discovery. It still is, to some degree, in book stores and libraries. I'm not nostalgic for learning about music from the import section at Listening Booth -- but it is to that sort of nostalgia that this bar-code-reader app seems designed to appeal.
But it seems like most discoveries of new cultural products we want are made online -- a depressing fact, since it means we have our cultural world expanded not by wandering through the world having experiences and encountering unlikely or unexpected things, but through the systematic and highly rationalized, virtually automated mode of searching online. I could set a schedule by my cultural discoveries -- every week or so I spend an hour or so plowing through newsgroups and mp3 blogs to see if anything sounds interesting. I'm not sure these count as "discoveries" anymore. Instead, I'm merely calibrating my internal novelty-seeking metabolism, rendering the very idea of discovery impossible.
Labels:
art,
file sharing,
intellectual property
Friday, November 5, 2010
File-sharing decoys as ads (19 October 2006)
Yesterday The Wall Street Journal reported on a new record-industry ploy to make file sharing work in its favor by flooding LimeWire et al. with dummy decoy files that are actually ads. You search for Audioslave or Dashboard Confessional (why you would do this, frankly, I don't know) and you end up with advertisements and possibly teasers to spread the advertising "virally" in order to unlock the song you wanted in the first place. Or it's a rare two-for-one treat for the would-be pirate who thought he was getting the new Jay-Z tracks; not only does he get something bogus, he also gets an ad cajoling him to drink Coca-Cola.
I'm not sure why Coke would want to associate itself with such a negative experience for the target audience. Wouldn't the person who receives this particular advertising message think, "Fuck you, Coca-Cola, and the bullshit DRM you rode in on"? Is the faith in the razzle-dazzle of new technologies for delivering ads so great that companies fail to imagine the more mundane matters of context? (Maybe they crossed this line long ago when they started running cheerful liquor ads alongside pictures of starving and maimed children in news magazines -- which reminds me of my favorite moment in the TV version of John Berger's Ways of Seeing, when he shows a few of these juxtapositions and declares that Western culture has officially gone insane.)
It was only a matter of time before ads rode to the rescue of intellectual-property thieves. Our society couldn't go on having more and more consumers unrepentantly embracing criminality. Something had to change to reincorporate them. It's impossible to remain an outlaw once ads find you -- what the presence of ads proves is that your deviousness has already been expected and accounted for -- thereby neutralizing it. For a while, with its futile lawsuits against its own customers, it seemed the record industry was going the way of the war on drugs, but this latest turn makes much more sense. There's a nice symmetry to ads and file-sharing; you steal someone else's intellectual property, ads steal some of your intellect right back.
Anyway, the further blurring of ads and content in the pop-music realm is reminiscent of Sigue Sigue Sputnik's pioneering effort in the 1980s, when the band put ads between the songs on their album Flaunt It (featuring "Love Missile F1-11"). This idea, needless to say, did not catch on -- maybe foregrounding the band's crass cynicism wasn't such a good idea. Maybe people, even in the 1980s, didn't find that kind of hollow greed appealing. You didn't have that vicarious pull that pop music typically provides; you didn't think, Gee, I wish I could be a smug, hack, makeup-wearing phony who revels in commercialism and played-out disco beats. But perhaps the time for ads merged with music files has come. SSS probably wasn't wrong about ads and pop songs being essentially interchangeable; they were prescient in predicting their growing symbiosis. Perhaps we're now ready for product placements within pop songs: Just imagine an R&B diva getting all melismatic with brand names: "Aaaa-berr-cro-oh-oh-ah-ah-ohm-bie-eee-aye-eee!"
Labels:
drm,
file sharing,
intellectual property,
marketing,
sigue sigue sputnik,
viral
Thursday, August 5, 2010
Updating iTunes (18 September 2006)
I usually ignore the update notifications that iTunes pops up every time I reopen it, because I expect them to eventually drop some digital-rights hammer on my music collection and render it inoperable. (Kind of like what Microsoft apparently plans to do with its future Zune player.) It will start by doing some unrequested "analyzing," going through all my songs one by one, and then boom, none of them will work without some kind of certification. That's probably unduly paranoid, but at some point it seems inevitable. Eventually music players and subscription services will dominate the music industry (this article in today's WSJ about the innovations of Apple's digital-music-player rivals gives a peek at the future), and the idea of collecting music may become as moribund as collecting pogs. It's hard to imagine not claiming a sense of ownership over music, but it wasn't so long ago that the best people could do was own sheet music. Music must have meant something very different then; it was always an activity rather than ambient wallpaper or a passive hobby. So attitudes toward music are clearly very malleable and responsive to distribution technology. Future distribution may make music more like on-demand cable TV, where we pay monthly to check in and hear something new whenever we want to, or it may merge seamlessly with satellite radio. Once ownership of some tangible product is undermined as a motive for buying music, the stage is set for a resurgence of the significance of radio. What is the difference between radio and subscriptions to massive music libraries, anyway, other than who picks the songs? Most people want someone else to do that work anyway. I imagine the subscription services will offer playlists to download as well as individual songs.
Anyway, I broke down and upgraded to iTunes 7 last night, mainly because I was enticed by the promise that I could have the iTunes store get all the missing album artwork for me. Of course I had to log in to the store and let them store a credit-card number -- but I took the bait; it seemed a fair trade and I like seeing the covers. Apple has obviously decided that pushing album art is important to the next phase of digital music's takeover. Not only does the promise of all that free art get more customers into their database, one click away from purchasing media, it also brings the experience of digitized music ownership one more sensual step closer to accurately simulating the collecting experience. The new iTunes lets you browse your collection by album cover, which makes a surprising difference in terms of how I understand all the junk I've got on my hard drive. The program even has an option that lets you flick through a virtual shelf of "albums" with mouse clicks, letting you see all the covers lined up neatly next to each other as if they were mounted on a vertical Rolodex. It's still a little clumsy, but it definitely seems like a move toward a whole new paradigm for computerized music consumption that attempts to provide consolation for the loss of the fetishized object. Next some enterprising entrepreneur will get to work scanning the back covers.
Update: The program is extremely buggy on Windows and I've had to roll back to iTunes 6. Looking at pretty covers while browsing isn't worth my computer freezing up every time a new song comes on.
Labels:
file sharing,
intellectual property,
music,
pogs,
technology