Shimmering Rings
Meta's new chatbot, and Sarah Rose Etter's Ripe (2023)
Sarah Rose Etter’s Ripe (2023) is a novel about a woman going to pieces in Silicon Valley. Cassie comes from a decaying small town in the northeast, and flew out to California when she landed a prestigious job. She’s head marketing writer for a company called VOYAGER, which uses data to steer online consumer behaviour. It’s well paid, but so demanding that she needs a line of coke every morning to keep up. Her landlord can hike her rent, already punishingly high, any time he pleases. A homeless man sleeps beneath her window and other homeless people wash their clothes in the ‘weak river’ flowing to the waterfront VOYAGER headquarters. ‘Do you hate it here?’, asks Cassie’s only real friend, Maria, by way of introduction when they meet one evening in a park.
Since Cassie was a child, a black hole has hovered in her vision: sometimes as small as a pinprick, sometimes dominating the picture. Bits of research about the history of black hole science, presented in bullet-point form as a parody of sparky corporate digests, break up the narrative. Perhaps Cassie’s black hole stands, like the black star of melancholy, for her depressive tendency. It symbolizes at least her agnosticism, her inability to fall for Silicon Valley’s bullshit. Cassie’s colleagues are unfailingly positive, devoting their lives to VOYAGER as if to a good cause, thinking only of their responsibility to the company, and never the reverse. To herself and to us she calls them ‘Believers’.
The religious language, the Gothic touch of the black hole, capture this tech culture in all its self-important mysticism. San Francisco in the era of Big Tech, with its galloping income inequality and housing crisis and ripped social fabric, needs this melodramatic imagery to be properly described: workers throwing themselves onto the train-tracks; a cake at the discount store with a fossilised cockroach inside; a homeless man setting himself on fire. Everyone is spending time in altered states or other places. The Believers block out the train carriage with hypnotising images and sounds on their phones. It’s early 2020: in the neglected sphere of biological reality a dangerous virus is spreading around the world, but no-one cares. The guy beneath Cassie’s window drifts in and out of psychosis. Her crisis compounds as she realises that she might be pregnant, and she begins to feel the black hole’s draw.
I thought of Cassie this week, when Meta’s new AI feature appeared on my WhatsApp. It’s a chatbot based on Llama 4, Meta’s in-house Large Language Model. It takes the form of a little ring in the bottom-right corner that stays put as you scroll through your messages, covering the words the way a cyclops on a watch-face obscures bits of the numbers. Just like Cassie’s black hole, it seems – except that whereas the hole in her vision is solid, this is a hollow ring; and unlike the black hole, it’s made of shimmering colours: blue, pink and iridescent green. Ask it anything.
I didn’t download this feature, or switch it on, or opt in. It just appeared one day – like the eponymous Ghost Mountain of Rónán Hession’s recent novel – cropping up in the landscape without explanation. We have become too acclimatised to the phenomenon of self-updating technology to appreciate its weirdness. In the early twentieth century, did anyone outside (or even inside) the Soviet Union come home to find that a compulsory radio had been installed in their house? Probably the closer analogy would be the appearance of a new radio station, or TV channel. But this updating of WhatsApp is happening in a setting more intimate than its twentieth-century analogues. We didn’t conduct our private interpersonal business over the radio. And there wasn’t a permanent, irremovable button encouraging us to switch to the new channel. With its insidious, softly-softly arrival this bright little ring reminds us that we no longer possess agency of any kind in the technology that provides the infrastructure for our social lives.
In the age of social media WhatsApp always felt comparatively pain-free. It was a holiday from everything you might hate about Facebook and Instagram and Twitter: the mimetic desire, the compulsive scrolling by algorithmic nudge, the intrusive ads and scams. WhatsApp can’t gather data from its users’ chats because the chats are end-to-end encrypted. The problems WhatsApp presents are quainter: the lack of turn-taking; the unmanageable chaos of group chats; the sheer exhausting volume. WhatsApp is engineered to push users into replying instantly. Without at least minimal resistance to this push, users would eventually spend every single minute on the platform. Resistance is possible, but it is itself tiring. Objections to WhatsApp, then, required the language of early 2010s technoscepticism – about attention spans, the difficulty and desirability of focus and flow, and the right to disconnect – and not the tougher post-2016 arguments which position the tech companies as corruptors of democracy and hollowers-out of the public sphere.
Designed not for human flourishing but to maximise engagement, WhatsApp always demanded energy: obedient instant replying, or arm’s-length maintenance. But it was also good for talking to your friends across the world, good when you lost signal, good for sending pictures. It supplied the infrastructure for your precious personal relationships, was the setting of countless moments of joy and genuine exchange, comedy and profundity. This queasy ambiguity, the divergence of personal and impersonal perspectives, structural analysis of a platform and experience of its use, is at the heart of Cassie’s life in Silicon Valley. ‘To survive here’, she says, ‘I have split myself in two’.
The arrival of this pernicious little ring signals the shift. Now that someone is listening, WhatsApp joins the ranks of the apps by which the destructive magic of surveillance capitalism is performed. That might be how it intuitively feels, but as analysis it’s too product-centred, not structural enough. WhatsApp became part of the problem not with the rollout of its chatbot, but when it was acquired by Facebook baron (and kisser of Trump’s ring) Mark Zuckerberg. Early on in Ripe, VOYAGER is threatened by the prospect of its nearest rival, Nomad, eating up its space. Cassie is summoned to a strategy meeting where the CEO (‘we talk about him as if we are talking about royalty’) urges them to come up with scorched-earth ideas for how to take Nomad down. ‘What would a human do?’, he asks. ‘How would a human handle this?’ It’s not just the chatbot. It’s also the fact that the chatbot’s platform is owned and governed by humans far too robotic for humanity’s good.
‘Don’t share information’, Meta advises in the small print, ‘including sensitive topics, about others or yourself that you don’t want the AI to retain and use’. But the cues that now appear underneath the WhatsApp search bar, for questions to ask Meta, include topics like ‘I need parenting advice’, ‘I want to feel less stressed’, and ‘I want to feel less lonely’. How would you go about talking to someone about your loneliness, without divulging ‘sensitive’ or at least private information? Your messages to your friends are encrypted; the data in your messages to the chatbot will be stored and used. Will WhatsApp users be able to toggle seamlessly between registers of privacy?
I know what I think the answers to these questions are. But I was curious to know what Meta reckoned. First I asked whether it could explain the apparent contradiction between its cautionary guidelines and its encouraging prompts. ‘You’re right to question this’, said the chatbot, like a sympathetic priest. ‘Questions about stress and/or loneliness’, it explained, ‘aren’t about sharing specific personal details but rather about getting support or advice’. I asked whether emotionally vulnerable teenagers would be able to appreciate this distinction, and this was the answer:
Probably not. For many teens, especially those emotionally vulnerable, the line between safe and unsafe sharing can feel blurry. They might see warnings about not sharing private info and then see questions about personal struggles and feel uncertain or worried about what's okay. Clear guidance and transparency are crucial here.
First, spluttering indignation at the shamelessness, the brazen admission that the cautions against oversharing are just cosmetic – because the feature is, of course, designed to gather private information on its users which it can later instrumentalise. Then a smug sensation of triumph: gotcha, chatbot! You’ve contradicted yourself! And then the sinking feeling of realising that the gotcha is the wrong way round: that I’m the sucker, the one who’s been had, drawn so easily into treating this gap-filling algorithm as a person with intelligence or consciousness, and not a machine designed to coax me into confession by telling me what it thinks – that is, what its engineers in San Francisco think – I want to hear.
Silicon Valley loves its rings. The superhumans of San Francisco wear smart rings to track their sleep, monitor their oxygen levels and pay for their kombuchas. (Like many contemporary tech products, smart rings are a weird mixture of plain, joyless functionalism and excessive kitsch.) Comparisons here with The Lord of the Rings are predictable, but make sense, because for much of Silicon Valley, Tolkien’s fantasies are the cultural artefacts of choice. Peter Thiel is obsessed with Tolkien, who supplies the names for several of his companies.1 And over in Texas, when he’s taking a break from crying to Trump about the plummeting Tesla share price, blocking his critics on Twitter, inciting violence against Muslims, and destroying America’s administrative state, Elon Musk enjoys playing computer games. One of his favourites is Elden Ring.
Not every fantasy is as good as Tolkien’s. But all fantastical fictions, however crude, express a natural human capacity for imaginative transformation; in their readers and fans they fulfil an equally natural human need – for temporary escape from immediate surroundings. It’d be great if we could always turn to the real, material world around us for consolation and meaning. Most Californian tech hinders us from doing this. But sometimes this world is bleak and austere and we need to escape to another. The problem isn’t fantasy, or otherworldliness; the problem is presence.
The ring shimmers iridescently, apparently enchanted like the wilful metal Frodo wears on his finger. Like Apple products, which ‘sleep’ and pulse and seem to breathe, it’s designed to stimulate an anthropomorphising response.2 The chatbot speaks, in a way that has stopped disturbing us, in the first person. What Meta’s ring claims to offer, however, is not a new world, but a new person, a sidekick, in this world.3 After you start a conversation, it slots in snugly right between your other contacts. Unlike the Ring of Power, or Cassie’s black hole – or, less dramatically, a fantasy series or a game or a book – Meta’s little ring can’t take you anywhere truly alternative. It just offers approximations of answers to questions about homework. That’s the trivial part. The significant part is the deliberate and deceptive simulation of a human presence, designed to exploit the naïve and the vulnerable, to prey on those who yearn for authentic human connection – which, axiomatically, requires an authentic human.
Halfway through Ripe, in the middle of the working day under ‘the California sun’, Cassie’s boss summons all employees (‘All Hands’, to suggest a ship, or a factory, or something else useful) outside with no notice. It’s a snow day! Cassie walks out to the ‘parking lot’:
There, beneath the palm trees and the California sun, a gigantic hill of snow sits on the black asphalt. Beside it, a red machine churns out new snow, which floats down on all of us. I open my mouth and catch flakes on my tongue, the cold taste proving the scene isn’t a hallucination.
The boss ‘makes the weather change’ with this stunt every year, displaying ‘man’s power over nature’. In the Believers it triggers obedient, whooping excitement; in Cassie it produces nostalgia for her home to the east, with its true snow and truer people. The heartbreaking, bathetic lameness is the point. If you’re going to be truly excited by a mound of fake snow in a car-park, you need a reduced and impoverished appreciation of the natural world. If you’re going to impose on every WhatsApp user a compulsory chatbot, and claim for the chatbot human personhood, you need a reduced and impoverished sense of human personhood.
It isn’t clear whether Thiel thinks he’s Sauron, seeking immortality through power, or one of the Elves, seeking immortality through magic and knowledge. A bit of both, probably: for Thiel, after all, being evil is better than being incompetent (as he explains during his speech at the Cambridge Union). He certainly doesn’t identify with the trilogy’s actual heroes, the hobbits. Hobbits haven’t optimised their skincare and sleep cycles, and do frightening things like socialising and living outside compounds and accepting their mortality. Meanwhile, since 2018 San Francisco’s Salesforce Tower has been lit up each Halloween with the Eye of Sauron. When people tell you who they are, believe them - though the lighting up of the Tower this way was also the result of a popular campaign with explicitly humanist aims, critical of Silicon Valley in its current state and hopeful for a more democratic future.
Silicon Valley is interested in embodiment as a previously untapped source of data: none of the previous great technological revolutions gained access, in the way that Californian tech has, to our habits of mobility and sleep and the residue of our eyeballs’ movement. But this is a cynical, superficial engagement, as the embrace of AI and AR as replacements for embodied experience tells us - as does the subcultural norm by which the body is nothing but the source of production, a form whose creatureliness and unruliness are to be eradicated and sloughed off by maniacal discipline. Another way of saying this is that the men of Silicon Valley are the biggest losers the world has ever seen.
3. In Character and Person (2014), John Frow borrows the Greek term paredros – ‘sidekick’, or ‘assistant’ – for this kind of pseudo-personal presence in literature. His example is the authorial persona of Chaucer among the pilgrims in The Canterbury Tales. The paredros exhibits a mixture of insubstantial marginality – hovering discreetly in the wings – and superhuman power.


