By Aliette de Bodard
I wrote my first original AI by accident.
Before that, I’d made a few clumsy attempts to merge my writing and my day job (as a computer programmer specialising in machine learning, one of the foundational techniques of programming AIs). My AIs were, for the most part, derivative and unconvincing: vague, unformed ideas of programs giving birth to one another, of parallel consciousnesses I couldn’t properly describe or make into characters that felt real.
Then I wrote a story called “The Shipmaker” (later published in Interzone), about a maker of spaceships who met the woman incubating the organic intelligence meant to be a ship’s brain. I wrote it about motherhood and pregnancy and loss – and the AI in it was dead, or more accurately stillborn, never breathing or animating any circuits.
“The Shipmaker” became the first story of a cycle (the Xuya universe) featuring Minds, my own versions of AIs carried in mothers’ wombs. As I unspooled the consequences of this, I had choices to make about how this would play out. The easy trope would be to have the Minds be Other: to have them separated from their mothers at birth and raised by the state or by scientists; so that they could, invisibly, be the animus behind ships and space stations. To have them be lonely, or at any rate separate, their concerns not the ones of the humans who had birthed them.
AIs as Others are nothing new: one of the stories I still remember today (even though I was 9 or 10 when I read it) is Asimov’s “The Bicentennial Man”, in which a robot can only integrate into human society by choosing to die, a deep and abiding tale of loss. In the SF I grew up reading, AIs are a threat, or a salvation. They kill us all, or become our benevolent dictators. They leave to found their own societies, their rules, their priorities, their lives not bound by human concepts. They are, so very often, alien and incomprehensible.
To me, making that choice felt like a cheap cop-out, and a decidedly odd one. As a woman of French and Vietnamese descent who grew up in France, I am intimately familiar with what it means to be the Other. It’s a gulf measured in incomprehension gaps and thoughtless, hurtful remarks, like that of the teacher who once asked me “why are you even here?” It’s a pretext used to exclude: society doesn’t have to make an effort to accommodate Others, because they won’t ever be able to interact with “normal” people. It doesn’t have to worry about their happiness or even the basic necessities of their lives.
I wanted my AIs to be part of a mixed society where they could interact with people – for them to have daily lives and aspirations; to enjoy small and almost insignificant things like the beauty of starlight or a flower. I wanted them to be people rather than things.
This meant making space for them. This meant finding ways humans and AIs could cohabit and interact. I chose to have my Minds be embodied with a central nexus rather than a distributed architecture, but to have that body be inaccessible to almost everyone (except for very close friends and repair engineers). In daily life, AIs would project an avatar, or simply speak through the network. A station would be able to carry on multiple conversations with people at the same time; a ship would know everything on board, from passengers’ entertainment choices to the state of its hull.
And the other thing that made my AIs people: I gave them leisure time. I gave them passions and opinions. I gave them the space to exist beyond the function which society required of them. As humans, our work doesn’t define us – why should it for AIs? They could obsessively collect rare porcelain, or enjoy reading cheap historical dramas. They could love and lose and grieve. They’re of course much longer-lived, with a different experience of consciousness and embodiment – but societies are about different people with different experiences sharing spaces and resources, and the one I created in my stories is no exception.
I also channelled a lot of my day job into my fictional AIs. Much of making an AI, currently, is teaching them: the algorithms that create an AI devoted to a specific task take a dataset, and then show the AI what it means, over and over, until they can generalise to new data. To teach an AI to recognise cars, you take lots of car pictures of all makes and shapes, and run algorithms until the AI has a good ‘idea’ of what a car means (in reality, an abstract model stored in their memory).
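That teach-and-generalise loop can be sketched in a few lines of toy Python. Everything here is illustrative, not the author’s actual day-job code: a tiny perceptron with made-up features standing in for “car pictures”, shown the same labelled examples over and over until its internal model generalises to a point it has never seen.

```python
# Toy supervised learning: repeat labelled examples until the model's
# internal "idea" (a weight vector) generalises to unseen data.
# All features and labels below are made up for illustration.

def train_perceptron(examples, labels, epochs=20, lr=0.1):
    """Learn weights for a two-feature linear classifier."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):              # show the whole dataset, over and over
        for (x1, x2), y in zip(examples, labels):
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = y - pred               # zero when the guess was right
            w[0] += lr * err * x1        # nudge the model toward the answer
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(model, point):
    (w, b), (x1, x2) = model, point
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Pretend features (say, "has wheels", "has windows"); label 1 = car.
data = [(1.0, 1.0), (1.0, 0.9), (0.0, 0.1), (0.1, 0.0)]
labels = [1, 1, 0, 0]
w, b = train_perceptron(data, labels)
print(predict((w, b), (0.9, 1.0)))  # an unseen, car-like point → 1
```

The point of the sketch is the repetition: nothing in the loop is clever, and the model starts with no instinct at all for what a car is; it only converges because we patiently show it the same labelled examples many times.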
It’s very much like raising children – except slower and far more laborious because AIs genuinely have no instinct of what a car or a ball means. But we’re the ones doing the teaching. Everything we put into them is ours: all the rules by which they operate, even the framework by which they create new rules. So the idea that an AI could come out of this being alien just didn’t gel for me. Incomprehensible? Yes, in the way that sometimes I don’t understand how my elder son’s mind works, and why he tells me about red planets in the middle of the park. But alien? Never. AIs contain humankind because we made them (and there might well come a day when that is no longer the case, but that’s not the choice I made in my universe!).
So I wrote my Minds as family. I wrote them as children, raised by the mothers who had borne them – as siblings fighting their human siblings for their parents’ attention – as gangly adolescents having to reconcile their duties carrying passengers with familial obligations – as very old adults, with everyone they’d known as children long dead, and descendants scattered amongst the stars, slowly wending their way home for the family gathering of the Spring Festival. Again, it felt pretty natural to me: balancing personal life and work obligations is part of what we do. I wanted a Mind who was the animus of a station, but also the younger sister declaiming bad poetry at a feast.
It took me time; and in many ways I’m still learning. I take inspiration from AI research; from stories like Ted Chiang’s The Lifecycle of Software Objects or Ann Leckie’s Ancillary Justice, which tackle similar themes. Little by little, I build and expand the world around my Minds, and how it shapes the way that they live. Little by little, I create new Mind characters, trying my best to do what is right by them. In many ways I’m still doing the same iterations as machine learning: building my AIs little by little, trying to teach myself to consider the right parameters and experiences for them.
I wrote my first original AI by accident – but the ones I wrote after that are very much by design.
Aliette de Bodard lives and works in Paris, where she has a day job as a System Engineer. She studied Computer Science and Applied Mathematics, but moonlights as a writer of speculative fiction. She is the author of the critically acclaimed Obsidian and Blood trilogy of Aztec noir fantasies, as well as numerous short stories, which garnered her two Nebula Awards, a Locus Award and two British Science Fiction Association Awards. Her space opera books include The Tea Master and the Detective, a murder mystery set on a space station in a Vietnamese Galactic empire, inspired by the characters of Sherlock Holmes and Dr. Watson. Recent works include the Dominion of the Fallen series, set in a turn-of-the-century Paris devastated by a magical war, which comprises The House of Shattered Wings (Roc/Gollancz, 2015 British Science Fiction Association Award, Locus Award finalist), and its standalone sequel The House of Binding Thorns (Ace/Gollancz). She lives in Paris with her family, in a flat with more computers than warm bodies, and a set of Lovecraftian tentacled plants intent on taking over the place.