Tuesday, April 2, 2019

Notes on three SF short stories


Originally posted at the Vector blog.
The 'trolley problem' is a philosophical thought experiment (and in a way, it’s also a little SF story in itself). There’s a train heading down a track where it will kill five people. You can switch the train to another track, where it will kill one person. Do you do nothing? Or pull the lever?
It gets interesting when you start to introduce variants. What about pushing someone off a bridge onto the train track, if you knew it would save five people further down the line? What if there are five people in mortal need of organ donations — and suddenly a stranger with just the right five healthy organs inside rocks up in town? Such thought experiments are generally pretty annoying. Still, they can be a useful way to map out our moral intuitions, and to identify contradictions and biases in our moral reasoning that we might not otherwise recognise.
The trolley problem has been getting a lot more press recently. But it’s a new kind of fame: now it’s become a practical problem, a real challenge for AI programmers. How should we program AI to act in situations like these?
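(To make that concrete: here's a deliberately crude sketch, in Python, of what the bluntest possible 'programming' of an answer might look like. Everything in it is hypothetical, not drawn from any real driving or weapons system; the interest is mostly in how much a bare casualty-count leaves out.)

```python
# A crude, hypothetical sketch of a bare utilitarian decision rule for
# trolley-style dilemmas: count expected deaths and pick the minimum.
from dataclasses import dataclass

@dataclass
class Option:
    label: str
    expected_deaths: int

def choose(options: list[Option]) -> Option:
    # Pure casualty-minimising: no distinction between killing and letting
    # die, no consent, no uncertainty, nobody accountable for the outcome.
    return min(options, key=lambda o: o.expected_deaths)

print(choose([Option("do nothing", 5), Option("pull the lever", 1)]).label)
# -> "pull the lever" ... and the very same rule cheerfully endorses
# harvesting the healthy stranger's organs, which is exactly the worry.
```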
At least two stories on this year’s BSFA suggested reading list deal with AI and the trolley problem: Sarah Gailey’s ‘Stet’ (Fireside) and (as you might guess) Pat Cadigan’s ‘AI and the Trolley Problem’ (Tor.com). I read both stories as responses to the increasing role of AI in our everyday lives, but I also think they’re responses to the way SF has handled the trolley problem in the past. Actually, SF has long been in love with the trolley problem … and it’s a grisly, nasty kind of love. I’m talking about tales like E.C. Tubb’s ‘Precedent’ (1949), Tom Godwin’s ‘The Cold Equations’ (1954), Robert Heinlein’s Farnham’s Freehold (1964), Larry Niven’s The Ringworld Engineers (1980), and Orson Scott Card’s Ender’s Game (1985). These are fantasies carefully set up to celebrate difficult but supposedly necessary sacrifices. “Yeah but imagine a situation where you HAD to commit genocide,” *vigorously rubbing the tops of his thighs* “… in order to avert WORSE genocide!” No thanks! — not least because that’s exactly how actual perpetrators of genocide generally narrate their actions. Cory Doctorow has a nice article about ‘The Cold Equations’ and Farnham’s Freehold which makes similar points.
In very different ways, Gailey’s ‘Stet’ and Cadigan’s ‘AI and the Trolley Problem’ deliver clapbacks to this tradition…

*

Cadigan’s story ‘AI and the Trolley Problem’ is a kind of whydunnit, or whatwillitdunnext. The AI here is closer to your traditional science-fictional AI: Felipe is like a starship’s computer, except this time the starship is a US military base set in desolate fenland somewhere in the east of England. Felipe is an entity, a subjectivity, an agent, like us but not like us. I heard him speak in the voice of Lieutenant Commander Data from Star Trek: TNG. (Data-ed but not dated — in Cadigan’s hands, the trope of the calm, hyperrational, introspective AI mind remains an effective tool for exploring the present and future of real AI).
On a normal day on the open road, you probably won’t encounter THAT many classic trolley problems. It’s not really something they cover in driving lessons, like parallel parking. In fact, there’s an argument that the obsession with trolley problems is a distraction from the real issue with smartcars. Machine vision will probably never be smart enough to navigate safely in today’s traffic conditions, especially in cities … so we’re likely to see a push from developers to redesign our entire city infrastructure around the technology. Does it sound too far-fetched and dystopian to imagine a law that everyone carry their phone at all times, broadcasting their location to all nearby vehicles? And if your battery dies and so do you, that’s sad but too bad.
But Cadigan’s story says, hold on, maybe we are surrounded by trolley problems after all? Maybe we choose to construct our world out of negative sum games? What if more and more data gathering and analysis makes these relations more and more visible? The true ‘cold equations’ are probably nothing like the fantasies of Godwin and Card. Markets, corporations, and governments can all be likened to AI programs, and we know how they’re programmed: to relentlessly sacrifice the many vulnerable for the few privileged. Felipe’s refusal of their logic kind of reminds me of the elegantly straightforward ethics of the utilitarian philosopher Peter Singer. Often our ethical conundrums aren’t really as complicated as we make them out to be … we know what the right answer is, and we overstate the complexity to hide our self-interested actions.
‘”Felipe . . .” Helen sighed. “Felipe, you must not kill our people. People on our side. People who are fighting to—” She was about to say make the world a safe place, but it sounded lame even just in her head. What, then? Fighting to prevent an enemy from attacking us? Fighting to rid the world of terrorism? Fighting to defend people who can’t defend themselves? Fighting to free the enslaved and the downtrodden?’
Spoilers: OK, personally I felt ‘AI and the Trolley Problem’ wound up satisfyingly, but some folk in the comments disagree. I guess this is a story which turns on two reveals — the reason Felipe destroyed the ground control station, and the reason Felipe isn’t talking to anyone. Maybe Cadigan could have been a BIT more heavy-handed about pretending that they were connected … having everyone running around acting like they’re living in the early days of Skynet? Then when they turn out not to be connected, that might feel like more of a twist in its own right. Even better if the reader could be persuaded to have almost forgotten about the Cora incident by the time it becomes relevant again — but maybe that’s asking the impossible. Overall, I felt it was timely and slick. The setting was great: a cosy yet chilly atmosphere evoked with economy, mainly through the actions and relationships of the characters. More SF should be set in US military bases. There are enough to choose from: something like 600 outside the US, across 70ish countries. (Those we know about).

*

If you didn’t already know what stet meant, you’d probably gather its meaning by the end of ‘Stet’: disregard a change proposed by the editor. It’s a Latin word that means “Allow it” or “Let it stand.” Compared with Cadigan’s story, Gailey’s ‘Stet’ is more directly informed by contemporary AI research, especially machine learning. This kind of AI research isn’t so interested in replicating minds: it’s more like the offspring of computer science and statistics, crunching huge data sets to find useful patterns that humans would never be able to see for ourselves. Our inability to see them, in fact, proves to be a big problem. This algorithmic ‘reasoning’ is opaque, unintuitive, and not something we can interact with through a philosophical dialogue, however strange or uncanny.
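(A toy illustration of that opacity, using synthetic data and nothing from the story: even a miniature learned model reduces its ‘reasoning’ to a pile of numbers you can print but can’t exactly argue with.)

```python
# Toy example: fit a tiny logistic-regression model to synthetic data by
# gradient descent, then print the learned weights -- the entire
# 'explanation' the model can ever offer for its future decisions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))        # 1000 examples, 20 anonymous features
y = (X @ rng.normal(size=20) + rng.normal(scale=0.5, size=1000)) > 0

w = np.zeros(20)
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w)))     # predicted probabilities
    w -= 0.1 * (X.T @ (p - y)) / len(y)

print(np.round(w, 2))                  # try holding a philosophical dialogue with this
```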
Is there a word for a story like this, that purports to be a document or documents? — in this case, the draft textbook entry with a copyeditor’s comments and the author’s responses? Apparently ‘epistolary fiction’ is supposed to cover all this kind of stuff, but with writers telling stories in wiki talk pages or Kickstarter pages, the inkpot-and-quill vibe of ‘epistolary’ doesn’t feel quite right. Not sure about ‘scrapbook story’ either. (Then again ‘digital’ comes from counting on your fingers and toes, so maybe I just need to give the vibe time to change).
‘Stet’ is a story about resistance and about saying no; it’s about solitude and loss. The voice is wrought in grief and venom, although there is somehow also bleak humour here as well, both in the bumbling inadequacy and emotional awkwardness of the editor who tries to contain Anna, and a few other touches (I bet Elon Musk DOES call his autobiography Driven. Driven: What Makes The Muskrat Guard His Musk). I even wondered if the ‘woodpecker’ thing might be some sort of weird ‘got wood’ porno pun, since people don’t really spend all their days gazing at rare woodpeckers online, but they do look at lots of stiff dicks … I’m definitely reading too much into it. Maybe the woodpecker just had an unlikely viral friendship with a piglet. With its mixture of erudition and boiling-but-controlled personal witnessing, ‘Stet’ has the energy of a virtuoso Twitter thread (maybe the kind that ejects interjecting mansplainers with enough kinetic energy to reach escape velocity. No more trolly problem).
Introducing any automated process, but perhaps machine learning in particular, into decision-making can create serious ripples in the way responsibility and accountability work. Anna is desperate to find responsibility, and maybe a semblance of justice, in a system which thoroughly disperses and confuses it. She even makes the intriguing provocation that we are responsible for the unintended outcomes of the data we generate. After all, who else is there to be responsible? Our machines don’t just hold up a mirror to our nature, so that we can trace in fine detail what we attend to, what we care about: the image can now step out of the mirror and start to act in the world alongside us.
“Per Foote, the neural network training for cultural understanding of identity is collected via social media, keystroke analysis, and pupillary response to images. They’re watching to see what’s important to you. You are responsible.”
The idea has a kind of appealing theological relentlessness to it.
But it also makes me think there is special providence in the fall of a woodpecker. Even if he can’t wonder what is past the clouds. Could cherishing an endangered woodpecker be part of a necessary ecological consciousness which ultimately ameliorates suffering and averts death on a massive scale? But I wouldn’t and shouldn’t suggest such a thing to Anna, or write a smug short story where the equations are carefully calibrated to produce that result. And anyway, am I just overstating the complexity of this moral question?

*

Laurie Penny’s ‘Real Girls’ is perfectly paced and well-woven with wit and whimsy. It’s part of a series for WIRED imagining the future of work. It’s only wee and worth a read. Also, here is the Androids and Assets podcast talking about the series as a whole.
'Real Girls' is partly a story about talking – or chatting – and the word ‘talk’ lurks behind the title, which recalls both GIRL TALK and REAL TALK. Because of the theme of the series, the title perhaps also recalls WORKING GIRLS. And the title is kind of mischievous – more on that in a moment.
OK, some quick thoughts with mild spoilers. ‘Real Girls’ bristles with zeitgeist like a theme anemone, but maybe its prime frond is (as indicated by its epigraph – “When your robotic lover tells you that it loves you, should you believe it?”) THE LIMITS OF AUTOMATION.
Here’s one common take on the current wave of automation. Robots are capable of doing many, but not all, of the tasks currently done by humans, and of doing them more cheaply. If handled wisely, ‘more cheaply’ could also mean more sustainably, efficiently, reliably, safely, and beautifully. So we can expect permanent, structural technological unemployment, within certain natural limits. But this technological unemployment won’t really touch sectors where affective or emotional labour is really important.

In particular (the analysis goes) the heartland of human competence is care work, something robots are just intrinsically terrible at. Then there are the various contested territories of sex work, art, literature, culture, education, and ‘services’ broadly construed, which will all probably see partial automation. We’re not quite sure yet what robots can and can’t do in those zones, but as they gradually come up against more and more natural limits, those limits will determine what human jobs finally remain. 

Then, to soak up the surplus labour, we can expect that these remaining jobs will multiply and transform a bit (see for example the massive shortages of mental health services in the UK currently); that some new jobs looking after the robots will materialize; that all jobs will be more intensively shared (shorter standard hours, plenty of gig work, perhaps supplemented by a Universal Basic Income); that logistics and geographical location will play a greater role in shaping this sharing (see Deliveroo); and that activities previously undertaken for non-monetary reasons will be converted into paid work (e.g. household labour and what is sometimes called ‘socially reproductive labour,’ and aspects of leisure and voluntary activities). You can see this set of assumptions about automation reflected in the way contemporary SF often tantalisingly teeters between its concern with narrow AI and strong AI. It’s like we’re wondering: is strong AI just a jigsaw of narrow AI pieces? Are there some jigsaw pieces in the pile that are intrinsically human? (See the comments on the trolley problem stories, above).
Some elements of this future narrative are present in Penny’s ‘Real Girls.’ Charlie, the protagonist, is a precarious digital affective labourer, paid to pretend to be somebody’s girlfriend online. But ‘Real Girls’ also has some clever touches, which convey a much more nuanced idea of automation and its relation to human work. For starters, Charlie isn’t really pretending to be somebody’s girlfriend. He is (maybe in more ways than one) pretending to pretend. That is, Charlie is being employed to imitate an AI girlfriend: “a lot of lonely people liked the idea of having a robot girlfriend who was always on call and had no feelings of her own, a remote algorithm that could shape itself to your particular needs—they’d seen it on TV. But the technology just wasn’t there yet.”
This subterfuge is one way ‘Real Girls’ shows us that the uncertainty about what constitutes ‘human work’ isn’t just a phase we’re living through, a question that will eventually be answered. Such questions are a permanent and pervasive feature of contemporary technologized society, where every encounter you have with anybody or anything might have involved automated decision-making somewhere you can’t see. (See Cathy O’Neil’s Weapons of Math Destruction, and a lot of critical data studies). The questions themselves have agency, they do things: you don’t get to the facts by clearing them away, because they themselves are facts.
Likewise, ‘Real Girls’ doesn’t give us a world where machines simply slip into roles previously occupied by humans. The set of available tasks is not some given, transhistorical invariant. As another SF writer, Tim Maughan, has suggested in both his fiction and non-fiction, the assumption that plasticky Christmas tat was probably made by machines is part of what makes it palatable to consumers. If they were to even glimpse the worker who hand-paints thousands of plastic Santa eyeballs each day, they might well forswear that falalala shit for life (or at least go crusty and local with it). In the same tranche of writing, Maughan also looks into the automation of logistics — enormous container ships algorithmically packed and unloaded. So that’s one example of how automation can actually create opportunities for unrewarding and repetitive human toil, rather than replacing it.
More generally, whenever automation enters some sphere of activity, its entry agitates fundamental questions of organisation and purpose – potentially in revolutionary ways, although in practice often in terribly regressive ways. Automation does not replace, it transforms. At the same time, we use residual discourse to construe our new acts: Charlie is a kind of freelance writer, and his friend who puts him onto the job is a kind of jobbing actor. And Charlie’s job – ‘pretending to be a machine pretending to be a human girlfriend’ – exists because of automation, but not in any straightforward way.
(These transformative ripples may reach far beyond the immediate work context. You can easily imagine some guy who would NEVER stoop to having a robo-girlfriend himself, but feels totally legit in comparing his flesh and blood girlfriend unfavorably to the robo-girlfriend he would at least supposedly never have. In this respect ‘Real Girls’ also resonates, for example, with Annalee Newitz’s Autonomous, in which the spread of person-like AI has the unintended consequence of normalizing human indenture).
Charlie’s job is also an example of what Astra Taylor calls ‘fauxtomation.’ Broadly speaking, fauxtomation is just automation that is not all it’s cracked up to be: automation which transforms human labour in ways which fall well short of the hype. One example Taylor offers is the automatic check-out machine. This is automation, but it also creates a new form of human labour (as it happens, unpaid, albeit fun and boopy). It also creates a whole new role for a human: the automatic check-out machine whisperer, who must rush back and forth confirming over-18 purchases, troubleshooting unruly butternut squashes, rebooting the one cranky machine again and again and finally summoning the engineer, etc. Taylor cautions that fauxtomation can reinforce the idea “that work has no value if it is unpaid” and acclimatise us “to the idea that one day we won’t be needed.” Drawing on Ruth Cowan’s work, she gestures to all those household innovations which were supposed to relieve domestic labour but in fact “added to the list of daily chores for women confined in the cult of domesticity.”
Perhaps another example of fauxtomation, even closer to what’s going on in ‘Real Girls,’ is the huge amount of human toil which often goes into training algorithms. For example, a recent BBC article gives a glimpse of a day in the life of a worker for Samasource, whose clients include many big tech names. Brenda is working on a machine vision project: “Brenda loads up an image, and then uses the mouse to trace around just about everything. People, cars, road signs, lane markings – even the sky, specifying whether it’s cloudy or bright. Ingesting millions of these images into an artificial intelligence system means a self-driving car, to use one example, can begin to “recognise” those objects in the real world. The more data, the supposedly smarter the machine.” The invisible labour is bad enough, but why is Samasource headquartered in San Francisco, when its operations are in one of the poorest parts of Kenya? We can trace this moderately exploitative relationship back through history, into the vast web of bloodshed of capitalist colonialist exploitation. Karl Marx liked to think of commodities as the mashed up muscles and nerves of workers. Perhaps when we think of all the conveniences that machine vision can bring, we should think about whose mashed up eyeballs are really doing the looking.
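(For a sense of what one unit of that labour reduces to, here’s a guess at the shape of a single annotated frame. The format is hypothetical, not Samasource’s actual schema: hand-traced polygons, one per object, every vertex a click of the mouse, millions of frames over.)

```python
# Hypothetical annotation record for one image (not any real vendor's schema):
# each object is a hand-traced polygon with a class label.
annotation = {
    "image": "frame_000412.jpg",
    "labels": [
        {"class": "car",        "polygon": [(102, 310), (180, 305), (185, 360), (101, 365)]},
        {"class": "pedestrian", "polygon": [(412, 220), (430, 221), (428, 298), (410, 297)]},
        {"class": "sky",        "polygon": [(0, 0), (640, 0), (640, 150), (0, 148)],
         "attributes": {"condition": "cloudy"}},
    ],
}

clicks = sum(len(label["polygon"]) for label in annotation["labels"])
print(clicks, "hand-placed vertices for a single frame")
```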

The story is also about the relationship between (to put it crudely) work in the "public" sphere and work in the "private" sphere i.e. housework and cooking. Charlie is kind of a humanised version of the gross deadbeat (ex-)boyfriend, a toxic softboy Becky needs to cut decisively out of her life. They have familiarity and intimacy, and maybe even a faint spark between them -- unless it's a faintly luminous globule of spilled cheese? -- but it seems like they're really just not that into each other. You really suspect that this unsustainable relationship might be underpinned by an unsustainable division of labour, and that Charlie's late rent is only the tip of the iceberg. 

Work always produces (or reproduces) at least two things: whatever you are working on, and you, the worker. Charlie is obviously kind of fallow. He needs to get out more, probably? He's probably depressed in a way that goes beyond (but takes in) having been dumped. So there is a faint hint here of crip labour and crip temporalities. Not that Charlie identifies as disabled or anything, it's more that ... yes, Becky's life is going places and Charlie's isn't; yes, Becky does all the work and mental load of keeping their apartment nice and Charlie is a standard-issue wallowing excrescence of the patriarchy; but at the same time, the story thematises norms around work, and how working, feeling motivated to work, working in particular roles, and working to particular standards, can be tied up with feelings of self-worth. There's this intriguing moment of spillover, where Charlie's paid digital labour galvanises him into cooking a mac and cheese, so that he can take a photo:

In a panic, and forgetting entirely that he could have simply searched for images, he looked up a recipe. Then he got a bit carried away going through the cupboards. The oven was cranky and hard to turn on and he burned himself twice, but the pictures alone were worth it.
Becky rolls in drunk, and is very into this mac and cheese. "Who are you and what have you done with Charlie?" The kitchen is a mess; Charlie winces; Becky tells him she'll clean it in the morning ... and later Charlie thinks back to this as a moment suggesting they might have a future together after all. I'm not sure what to think about that.

Okay, slightly more substantial spoilers now. This is where the mischievousness of Penny’s title comes in. ‘Real Girls’: we might think it’s going to be a story about the differences between real humans and artificial humans. But we also soon learn that Becky “hated it when he called her a girl, even though she was the only girl, The Girl.” Can there ever be such a thing as a real girl, when ‘girl’ itself is an artificial construct? (A construct largely, if not solely, of patriarchy. “You’re real to me,” Charlie murmurs to the sleeping Becky, a sweet but disquieting moment).
It seems like Charlie maybe discovers something new about himself during this story. We’re not sure exactly, since it is done with a skilfully light touch: Penny sensibly resists any temptation for a big, sensationalistic ‘reveal’ regarding Charlie’s sexual desires and sexual identity. But she still gives us a sense of metamorphosis, and the possibility of transformation is mirrored between micro and macro: just as Charlie is always a work in progress, so we will never find out once and for all what it is to be human.
For a genre so invested in the non-human and the post-human, science fiction also loves to play with definitions of the human. It loves to home in on some differentia specifica – our capacity to envision, or dream, or laugh, or do really good downward-facing-dog, or grieve, or whatever – that is supposedly what makes us truly human. But these formulae always feel reductive and awkward, like a well-meant compliment from a relative that just shows how little they know you. Humanness is not some kind of empty space left behind once technology has finished colouring in all the reality it can reach. Technology (as Donna Haraway and others have pointed out for ages) has never been opposed to or outside of humanness: it has always been part of humanness. This is something ‘Real Girls’ seems to get: it is misleading to think of automation as having inbuilt limits. Automation is not an unstoppable tide, which is going to wash us clean and show us what we really are. Automation is a high-stakes set of political risks and opportunities. The question is never just ‘Should robots be girlfriends?’; it is always, at least, ‘What is a girl and what is a boy and what is a friend and what is girlhood and friendship and romance and gender and love and sex and desire and how did all these things get to be what they are now and what could all these things be instead of what they are now?’
Fwiw, the post-human romance aspect to Penny’s story also ignited a bunch of associations for me; in no particular order: Jay Owens’ wonderful essay on her friendship with a bot (here on the Vector site); little Robby’s big date in Miranda July’s film ‘Me and You and Everyone We Know’; the bit in Pride & Prejudice where Elizabeth visits Pemberley and starts to fancy Darcy (in a way which feels uncomfortably mercenary to many modern readers, but is after all an encounter with the post-human Darcy assemblage: his wealth, definitely, but also the taste and sentiment manifest in the landscaping, the rumours of his kindness from Mrs Reynolds); Camilla Elphick et al.’s project Spot, which explores the use of artificial agents in harassment disclosure (and where at least one user reported they were glad the chatbot didn’t attempt to seem empathetic); and various friends of mine who first met and/or got together on the internet. All of these associations are, I guess, ways of being anxious for the budding lovers: will Charlie and his Boy get along, now that their technological assemblage has been so radically reconfigured?
Overall, ‘Real Girls’ is a wonderfully polished, smart, and timely SF story. Obviously I was intrigued by other stories this story could have been: for example, the one which dug more deeply into emergent and speculative sextech, and saw Charlie being invited to control a VR avatar? I also guessed (wrongly) that the Boy would turn out to be a neural network whom Charlie was being paid to train. What does this say about me.

I don’t think a story of this size could successfully accommodate it (it's shorter than this review), but it also would have been interesting to explore a hybrid AI-Charlie girlfriend, perhaps leading into more speculation around how automation and AI can be mobilised to make the experience of work more hospitable, more exciting, and just generally more just; I’m quite interested in Parecon’s concept of “job complexes” — innovative divisions of labour based on ensuring workers are equally empowered — and I hope at least some writers in the Wired series have incorporated some Parecon-ish speculation into their worlds? (Contemporary SF as a whole sometimes feels a bit stuck in the utopian-dystopian axis of the gig economy). And I wondered if the mac and cheese incident could have been tweaked to allow just a teeny smidge more foreshadowing of Becky’s heart still being open to Charlie? But perhaps that would diminish the gentle twistyturniness of the closing moments.
And really: I think this story perfectly accomplishes everything it sets out to do, and perhaps a little extra. I wouldn’t change a thing.

*

And to finish, just because I just read it, and it feels relevant, here is Kim Stanley Robinson, in New York 2140:
At that point, as it turned out, despite the chaos and disorder engulfing the biosphere, there were a lot of interesting things to try to latch that barn door closed. Carbon-neutral and even carbon-negative technologies were all over the place waiting to be declared economical relative to the world-blasting carbon-burning technologies that had up to that point been determined by the market to be “less expensive.” Energy, transport, agriculture, construction: each of these heretofore carbon-positive activities proved to have clean replacements ready for deployment, and more were developed at a startling speed. Many of the improvements were based in materials science, although there was such consilience between the sciences and every other human discipline or field of endeavor that really it could be said that all the sciences, humanities, and arts contributed to the changes initiated in these years. All of them were arrayed against the usual resistance of entrenched power and privilege and the economic system encoding these same, but now with the food panic reminding everyone that mass death was a distinct possibility, some progress was possible, for a few years anyway, while the memories of hunger were fresh.  
So energy systems were quickly installed: solar, of course, that ultimate source of earthly power, the efficiencies of translation of sunlight into electricity gaining every year; and wind power, sure, for the wind blows over the surface of this planet in fairly predictable ways. More predictable still are the tides and the ocean’s major currents, and with improvements in materials giving humanity at last machines that could withstand the perpetual bashing and corrosion of the salty sea, electricity-generating turbines and tide floats could be set offshore or even out in the vast deep to translate the movement of water into electricity. All these methods weren’t as explosively easy as burning fossil carbon, but they sufficed; and they provided a lot of employment, needed to install and maintain such big and various infrastructures. The idea that human labor was going to be rendered redundant began to be questioned: whose idea had that been anyway? No one was willing to step forward and own that one, it seemed. Just one of those lame old ideas of the silly old past, like phlogiston or ether. It hadn’t been respectable economists who had suggested it, of course not. More like phrenologists or theosophists, of course.  
Transport was similar, as it relied on energy to move things around. The great diesel-burning container ships were broken up and reconfigured as container clippers, smaller, slower, and there again, more labor-intensive. Oh my there was a real need for human labor again, how amazing! Although it was true that quite a few parts of operating a sailing ship could be automated. Same with freight airships, which had solar panels on their upper surfaces and were often entirely robotic. But the ships sailing the oceans of the world, made of graphenated composites very strong and light and also made of captured carbon dioxide, neatly enough, were usually occupied by people who seemed to enjoy the cruises, and the ships often served as floating schools, academies, factories, parties, or prison sentences. Sails were augmented by kite sails sent up far up into the atmosphere to catch stronger winds. This led to navigational hazards, accidents, adventures, indeed a whole new oceanic culture to replace the lost beach cultures, lost at least until the beaches were reestablished at the new higher coastlines; that too was a labor-intensive project.
