Friday, September 18, 2020
Wednesday, September 9, 2020
It is about the darkening of the digital dream and its rapid mutation into a voracious and utterly novel commercial project that I call surveillance capitalism.

These prediction products are traded in a new kind of marketplace for behavioral predictions that I call behavioral futures markets.

Users provided the raw material in the form of behavioral data, and those data were harvested to improve speed, accuracy, and relevance and to help build ancillary products such as translation. I call this the behavioral value reinvestment cycle, in which all behavioral data are reinvested in the improvement of the product or service.
Surveillance capitalism’s command of the division of learning in society begins with what I call the problem of the two texts. The first text, full of promise, actually functions as the supply operation for the second text: the shadow text. Everything that we contribute to the first text, no matter how trivial or fleeting, becomes a target for surplus extraction.
Behavioral surplus must be vast and varied, but the surest way to predict behavior is to intervene at its source and shape it. The processes invented to achieve this goal are what I call economies of action. The scientists and engineers whom I interviewed identified three key approaches to economies of action, each one aimed at achieving behavior modification. The first two I call “tuning” and “herding.” The third is already familiar as what behavioral psychologists refer to as “conditioning.”

This new level of competitive intensity characterized by scope and action ratchets up the invasive character of supply operations and initiates a new era of surveillance commerce that I call the reality business. There are many buzzwords that gloss over these operations and their economic origins: “ambient computing,” “ubiquitous computing,” and the “internet of things” are but a few examples. For now I will refer to this whole complex more generally as the “apparatus.” Although the labels differ, they share a consistent vision: the everywhere, always-on instrumentation, datafication, connection, communication, and computation of all things, animate and inanimate, and all processes—natural, human, physiological, chemical, machine, administrative, vehicular, financial.
This chapter and the next draw our attention to the gap between experience and data, as well as to the specific operations that target this gap on a mission to transform the one into the other. I call these operations rendition. We have seen that the dispossession of human experience is the original sin of surveillance capitalism, but this dispossession is not mere abstraction. Rendition describes the concrete operational practices through which dispossession is accomplished, as human experience is claimed as raw material for datafication and all that follows, from manufacturing to sales.
In this way, surveillance capitalism births a new species of power that I call instrumentarianism. Part III examines the rise of instrumentarian power; its expression in a ubiquitous sensate, networked, computational infrastructure that I call Big Other; and the novel and deeply antidemocratic vision of society and social relations that these produce. Thanks to Big Other’s capabilities, instrumentarian power reduces human experience to measurable observable behavior while remaining steadfastly indifferent to the meaning of that experience. I call this new way of knowing radical indifference. It is a form of observation without witness that yields the obverse of an intimate violent political religion and bears an utterly different signature of havoc: the remote and abstracted contempt of impenetrably complex systems and the interests that author them, carrying individuals on a fast-moving current to the fulfillment of others’ ends.
The withdrawal of agreement [from surveillance capitalism] takes two broad forms, a distinction that will be useful as we move into Part III. The first is what I call the counter-declaration. These are defensive measures such as encryption and other privacy tools, or arguments for “data ownership.” Such measures may be effective in discrete situations, but they leave the opposing facts intact, acknowledging their persistence and thus paradoxically contributing to their legitimacy. For example, if I “opt out” of tracking, I opt out for me, but my action does not challenge or alter the offending practice. The second form of disagreement is what I call the synthetic declaration. If the declaration is “check,” the counter-declaration is “checkmate,” and the synthetic declaration changes the game. It asserts an alternative framework that transforms the opposing facts.
Sunday, September 6, 2020
I just wanted to put this into the world. It is some cover artwork by the amazing Alice Duke, commissioned for a novel I wrote called I Have Seen You in Forever. In the end it didn't work out with the publisher (in an amiable and mutual way, and we may publish a different book together if I ever get them the manuscript), and I'm not sure the novel will ever go out into the world. It might just hang out in its own world. But I love this strange picture and the secret story it's about that almost nobody knows.