
The things we do for Google (25 Feb 2010)

Google, as we know, makes a lot of money by providing "relevant" ads alongside its search results. As this cheerleading Wired article by Steven Levy details, Google can provide relevance to users because it has devised ways to harvest the data users provide in their various searches and in what they subsequently click on. This makes the search operation a quid pro quo exchange, even though it seems as if we are getting something for free when we use Google:
Singhal notes that the engineers in Building 43 are exploiting another democracy — the hundreds of millions who search on Google. The data people generate when they search — what results they click on, what words they replace in the query when they’re unsatisfied, how their queries match with their physical locations — turns out to be an invaluable resource in discovering new signals and improving the relevance of results. The most direct example of this process is what Google calls personalized search — an opt-in feature that uses someone’s search history and location as signals to determine what kind of results they’ll find useful. (This applies only to those who sign into Google before they search.) But more generally, Google has used its huge mass of collected data to bolster its algorithm with an amazingly deep knowledge base that helps interpret the complex intent of cryptic queries.
The egregious use of the word democracy in this passage gives a hint of how corporations would like people to view the immaterial or affective labor (the production of meanings or emotion) that their consumers perform. When you do work for Google, it's not work per se, but an expression of your significance as a digital citizen in the great internet super-regime. Your privacy isn't being invaded; no, instead you are getting to vote your desires automatically and have your voice registered in the way Google molds our shared reality online. The "amazingly deep" company can then sort out the "cryptic" nonsense that its users type in and tell them what they really wanted to know, since they are apparently too ignorant to ask for it intelligibly themselves.

Michael Hardt argues that affective labor, the way we collectively "make" emotions in society, used to occur by and large outside of capitalist production, but the shift to an information and services economy brought it inside. "Just as through the process of modernization all production became industrialized, so too through the process of postmodernization all production tends toward the production of services, toward becoming informationalized.... production has become communicative, affective, de-instrumentalized, and 'elevated' to the level of human relations -- but of course a level of human relations entirely dominated by and internal to capital." In other words, the effort we put into making emotional bonds with one another has been co-opted in part by capitalist production, which exploits our human desire to cooperate and like each other and get more done and share things with each other and so on. My sense is that internet platforms have completed that project of co-optation.

The point is that Google and companies like it will increasingly use both promises of convenience and efficiency and celebrations of the democratic joys of open participation to justify privacy invasion and the annexation of the information work users do for their own private purposes. Web 2.0 is an ideological phenomenon more so than a technological one. It's a matter of persuading us to live publicly so that our lives can become the by-product of the data processing we do in the course of living.
