Wrote this piece on affect recognition and other stuff almost a year ago and never got round to posting it; later added some bits and then forgot about it. Decided to post without rejigs- it just about makes sense (except I’m no longer sleeping with screens, as I have the boy to do that with).
Despite being a person who goes to bed surrounded by phone, iPad and laptop, sleep is usually one of the few places I’m not perennially plugged in, so last night’s dream of checking Twitter interactions seemed to signal a new level of saturation: an always-on zombie, dreaming of electric tweets. Not coincidentally, I’d been reading Jonathan Crary’s 24/7, a chilling indictment of life as a value-extraction game where sleep holds out as a final frontier of valueless repose, along with daydreams, rest and the “useless time of reflection and contemplation”, all of which act as problematic barriers to the monetisation of all experience. For one, no technology is as yet capable of intruding into our dreams to program appearances of Twitter, or any other product, service or brand; also, unlike online behaviours, dreams can’t be recorded for clues about what sneaker or holiday destination your subconscious is hankering after. Which is just as well, since, as the facilitator of a Cryptoclass at Mozilla’s London HQ pointed out, a possible future where recorded dreams can put you in jail is reason enough to start taking privacy seriously, before it’s too late.
As well as potentially landing people in jail, technology that could directly access our minds, and dreams, would be manna for marketers, giving an even more accurate portrayal of what it is we ‘really want’, and then targeting ads to meet- and shape- these desires. The notion that ever more responsive, telematic, intimate devices and software will end up not just reading the contents of our minds but actively shaping them has also been a long-standing concern of dystopian strands of science fiction, in books that herald the complete commodification of the soul and the pliant zombie citizens it produces. In Transmetropolitan, a dark comic series by Warren Ellis, advertising has penetrated into the dreamscape, with sponsored messages bombarding the protagonist Spider Jerusalem in his sleep. Similarly, MT Anderson’s Feed extrapolates current trends of targeted online marketing to telepathic levels, where ads pop up in users’ minds in response to life events- a break-up, a job loss- with what algorithms have deduced would be their most likely behavioural response- ice-cream binge, shopping splurge or revenge fantasy. Most of Philip K Dick’s output could also be read as a (fairly accurate, it turns out) prediction of life once people’s affective responses can be parsed into useful data.
While these scenarios are still mostly fictional dystopias, Google’s purchase of various AI and robotics companies, like DeepMind and Boston Dynamics, suggests that ‘divination of private consciousness’, and its hidden desires, is big business. Perhaps with masses of Very Big Data plus algorithms fuelled by some artificially intelligent juice, search engines will finally be able to read between the lines of your URL history, emails and social media interactions and deduce what you want to search for- and buy- before you even know. Of course this still signals a belief in the separation between us (desiring humans) and them (deducing algobots/ machines/ etc), when it’s probably more a process of entanglement where it becomes impossible to tell ‘who makes and who is made in the relation between man and machine’, as Donna Haraway put it.
But anyway, the parsing of affective responses, and the insights they provide into our supposedly unfettered desires, is becoming big business. For example, new affect recognition imaging technologies promise to track what consumers ‘really’ feel/ want, according to the changing weather of facial expressions; and Dataclysm, a book by OkCupid co-founder Christian Rudder, draws conclusions from data gathered about users’ dating proclivities, revealing who they are when, as the subtitle puts it, ‘they think no one’s watching’.
In both cases there seems to be an underlying belief that the data, or rather, the reading of it, unwittingly reveals elusive truths that not even the users themselves are truly aware of. Both rely on the algorithmic parsing of this data into ‘fact’- muscular micro-movements into emotional states, or the eyeballing and clicking on dating profiles into facts about what men want (youth, apparently…). Needless to say, the conclusions (‘facts’) are codified by existing conventions- around what happiness, sadness etc. look like, about how different genders interact (usually with the assumption that there are only ever TWO genders)- and therefore, you could say, the process of reading and interpreting data has as much agency in the shaping of meaning as the data itself. That’s a long-winded way of saying that data, whatever that even means, does not just ‘mean stuff’ objectively, apart from the society and culture that’s measuring it.
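To make that point concrete with a toy sketch (entirely hypothetical- not any vendor’s actual code, and the muscle-movement weights are invented for illustration): a nearest-prototype affect classifier can only ever output the labels its designers chose in advance, so whatever it ‘reads’ off a face is pre-shaped by that taxonomy before any face is seen.

```python
# Toy affect classifier: the label set -- here the conventional six "basic
# emotions" -- is a design decision fixed in advance. Ambivalence, irony or
# a seventh emotion are simply unrepresentable outputs.
BASIC_EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise", "disgust"]

# Hand-written prototype "signatures": which facial muscle movements
# (action units) convention says co-occur with each label. All weights
# here are made up for the sake of the sketch.
PROTOTYPES = {
    "happiness": {"lip_corner_pull": 1.0, "cheek_raise": 0.8},
    "sadness":   {"brow_lower": 0.7, "lip_corner_depress": 1.0},
    "anger":     {"brow_lower": 1.0, "lip_press": 0.9},
    "fear":      {"brow_raise": 0.9, "jaw_drop": 0.6},
    "surprise":  {"brow_raise": 1.0, "jaw_drop": 1.0},
    "disgust":   {"nose_wrinkle": 1.0, "upper_lip_raise": 0.8},
}

def classify(face: dict) -> str:
    """Return whichever pre-chosen label best matches the measured
    movements (each scaled 0..1) -- the system must answer with one of
    the six, whatever the person was actually feeling."""
    def score(label: str) -> float:
        return sum(face.get(au, 0.0) * w for au, w in PROTOTYPES[label].items())
    return max(BASIC_EMOTIONS, key=score)

smile = {"lip_corner_pull": 0.9, "cheek_raise": 0.2}
print(classify(smile))  # -> happiness, whatever the smiler actually felt
```

The point of the sketch is that swapping in a different label set or different weights yields different ‘facts’ from identical faces: the interpretation does as much work as the data.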
Still, the idea of algos reading private consciousness suggests that they become both technologies for accessing inner experience and frustrating barriers to its efficient conveyance: an interface which enables communication but always prevents direct, unmediated, one-to-one experience, where what I feel, you feel. Bolter and Grusin call this desire to efface the medium, so that the represented content (of the video, computer game or innermost desires) appears more real to the viewer, the logic of immediacy- a longstanding tradition within Western art. This erasure of the medium or support gives the viewer the illusion of power by obfuscating the reality of the technical interface and its over-arching control of the viewing (or playing, or whatever) situation. The development of ever-smaller, more portable, wearable tech could also be seen as a desire to remove the pesky interfaces and merge more seamlessly and enjoyably with information flows.
More broadly, the notion of invisible or transparent infrastructure upholds an impossible dream of immaterial connection, and a state of denial towards the social and geopolitical realities that facilitate our tech. Recent debates around Stacktivism, for example, have emphasised the materiality of technology and the physical consequences of e-waste, energy-guzzling servers and the mineral harvesting involved in hardware construction. Stacktivism has been described as ‘a conversation about those hidden technological and social infrastructures and the conventional metaphors that mask them: the cloud, the smooth and playful industrial design, the invisible interface.’
Getting off that fluffy cloud and scratching the surface of these smooth machines- becoming aware of the very material conditions that support them- becomes an act of resistance against the ambient control of interface-less technology that aims to unite us with the contents of the network. If our data has perhaps become an interface through which our ‘inner world’ is mediated and known, then scrutinising this process could also be about making visible the very real, material ways it’s collected, used and monetised through our always-on, and always-on-us, technology.