I remember the moment, a couple of years ago, when I realized that my Facebook timeline was looking at me differently, or, rather, inviting me to look at it differently. The ads liberally scattered between the usual friend-content had taken on a new coquettish posture. A sleeveless, buff man crouched to touch the soil in an Italian winery, steamily meeting my eye as he did. He belonged in some way to Dolce & Gabbana.
As it happens, I had been conducting research into the consequences of targeted ads. I had found a recent study by Spanish scholars, which concluded that, for the purposes of ad targeting, Facebook’s advertising platform labels two-thirds of its users with potentially sensitive tags. These labels are not publicly visible on user profiles, but they allow businesses to direct their ads only to those users whom Facebook has tagged as having certain interests. And the report notes the “extremely worrying” fact that some users tagged as interested in ads related to “homosexuality” live in countries like Saudi Arabia and Nigeria, where homosexuality is punishable by death.
Peculiarly, my research into this disturbing reality seemed to cause a surge of the beseeching male gaze in my Facebook feed. A vertical parade of male models exhibited swimwear, summer slacks, and jogging pants with the camera’s eye roving over their bodies with sensual languor. This was definitely not the “Dude, you gotta buy this!” mode of straight men targeting other straight men with the coolest new life-hacking commodities. I know how the Gillette man winks at his bro audience, and this wasn’t it. I hadn’t declared anything about my desires in Facebook’s vulgar “Interested in” column, but the sweep of the algorithm’s gaydar is silent and broad.
In digital surveillance terms this is called being “profiled,” but such attention registers much more than our profile — this is unabashed full-frontal. By curating the content we see online, by deciding which stories we see on our feeds, by suggesting products for us to buy and movies for us to stream, these algorithms create a strange kind of portrait of ourselves. And we collaborate in its construction with our clicks and cookies. Netflix, for instance, has multiple thumbnails for the same show; you’ll get the one that it thinks will appeal to you most. You can almost see yourself in the eyes of the actor they choose to feature.
Basil Hallward, the fictional artist of the most famous Western literary portrait, is afraid to show his painting of the beautiful Dorian Gray because it would betray a desirous secret of his soul. “I have put too much of myself into it,” he tells his friend. Our algorithmic portraits can be equally treacherous.
Algorithms, then, have the power to project the secrets — or at least the personal topographies — of our inner lives back at us. This technological penetration and display reminds me of a scene in Thomas Mann’s novel The Magic Mountain (1924), which dramatizes the strange implications of early radiography on one’s sense of self. The protagonist, Hans Castorp, is in a Swiss sanatorium, being tested for tuberculosis. The doctor tells Castorp: “We’ll take a handsome x-ray of you — you’ll enjoy seeing what goes on in your own inside.” But enjoyment isn’t Castorp’s feeling when he confronts the hidden truth that the x-ray reveals. His experience in the twilight of the laboratory is eerie. The machine yields up an alien landscape, a bright network of white bones that is as much the real “him” as the familiar surfaces reflected in a mirror. The x-ray’s cold attention perceives and reveals undeniable truths about the mortal self. “You will get a free copy,” the doctor tells him, “then you can project the secrets of your bosom on the wall for your children and grandchildren to see!”
The feeling of being exposed occurs whenever the Internet addresses us in the second person. TikTok’s “For You” page, for instance, serves up a selection of videos that the algorithm anticipates you will enjoy. “Recommendation systems are all around us,” TikTok’s website explains, cheerily yet with a whiff of fatalism.
And before you start asking questions, remember that you enjoy these systems: “They power many of the services we use and love every day.” As we hopscotch from video to video, TikTok categorizes the content we favor, noticing whether or not we watch a clip all the way to the end. Each choice we make “helps the system learn about your interests,” so that “each person’s feed is unique and tailored.” When I set up a new profile, my newborn For You page suggested I follow The Rock, Gordon Ramsay, Billie Eilish, and Ed Sheeran. This is TikTok’s equivalent of the tabula rasa. Starting from nothing, every interaction helps construct an image of us as users, as if an entire personality can be built on the foundation of Dwayne Johnson’s sturdy shoulders.
The Internet adores this second-person voice. There it is, at every cyber–street corner: Recommended for You, Suggestions for You, Here Is Something You Might Like. Behind each of these You’s, an algorithm sits at an easel, squinting, trying to catch Your likeness. But these algorithms are true Renaissance practitioners. Not only portraitists, they’re also psychologists, data-crunchers, and private detectives, extrapolating personality from the evidence of our past actions: from our online histories and, increasingly, from what they can overhear, without any meaningful warrant, in the physical world. From all those toothsome bytes of behavior, they create an image of You. In French there is the formal vous and the intimate tu, but the digital you is somewhere in between, coming from the other side of the screen, spoken by a strange intelligence that seems to know your secrets.
But what is the psychological impact of a bespoke Internet, tailored to you, and one where it is increasingly difficult to outrun yourself? “Bespoke” sounds luxuriously considerate, but it also entails a kind of revelation. It comes from the older word bespeak, which refers to an indication or a piece of evidence. Sewn into the bespoke is the fabric of external judgments. In the tailor’s series of sartorial calculations and decisions, the wearer is shaped, a silhouette is cast.
Philosophers, especially the phenomenologists — those focused on how our consciousness perceives the outside world as a set of experiences — have long been aware that we don’t get a sense of ourselves in isolation. Hegel was key in developing the idea that self-consciousness is only possible in relation to others. Our identities form in relation to how we perceive other people perceiving us. Being self-conscious, then, relies on the gaze of someone else. “The Other penetrates me to the heart,” Jean-Paul Sartre writes, describing Hegel’s intuition about how we are dependent on one another in our very being. “I cannot doubt him without doubting myself, since ‘self-consciousness is real only in so far as it recognizes its echo (and its reflection) in another.’” For Hegel, the penetrating gaze of another person produces in our minds an image of ourselves. Personalizing algorithms thus offer a startling twist on Hegel’s idea, as we begin to see ourselves more and more through the gaze of these unselfconscious but intelligent and highly attentive algorithmic “others.”
Sartre’s existentialism lies downstream from Hegel’s phenomenology, and his concept of hell in his 1944 play No Exit is tinglingly prophetic of our current predicament. The play is about three freshly dead strangers who have just arrived in a windowless room in hell. Garcin, Inèz, and Estelle realize that they can’t escape one another’s scrutiny. The lights are always on; there will be no more sleeping. One of the first things they all notice about hell is the absence of mirrors — and the play is about how they become each other’s looking-glasses. This relationship begins literally, when Inèz helps Estelle with her lipstick.
The three inmates soon feel the borders between themselves and their eternal roommates transgressed. Garcin sounds like a Swiss x-ray machine when he tells Inèz, “I can see into your heart.” Inèz, the most sinister of the three, says of Estelle that she “saw the world through my eyes” once Inèz had “crept inside her skin.” Inèz tells Estelle, “You’re lovely,” but, less pleasantly, she scolds Garcin for the way he holds his “grotesque” mouth. Garcin, meanwhile, who was shot for desertion, needs Estelle to assure him that he’s not a coward, in order to make it so. But there is no escaping Inèz, who describes herself as “a mere breath on the air, a gaze observing you, a formless thought that thinks you,” and declares, “You’re a coward, Garcin, because I wish it!”
The play’s conception of damnation, then, is a life in which your self-image is forged entirely in public. Your “I” only exists because someone says “You.” Garcin understands that “there’s no need for red-hot pokers” in this place, because both the scrutiny of his roommates and his terrible reliance on them are the torture. “Hell is — other people!”
In The Age of Surveillance Capitalism, Shoshana Zuboff mentions No Exit as a predictive text, a warning to the coming digital generations about living under the unblinking gaze of others. She suggests that Sartre’s famous line is “a recognition that the self–other balance can never be adequately struck as long as the ‘others’ are constantly ‘watching.’” Balance depends on our ability to retreat into a truly private place, released from the demands and appellations of our devices. While No Exit foresees the pressures of social media, where self-definition can become unhealthily bound to other people’s reactions to our posts, the dynamic is still between sentient actors, each with their own subjectivity. But as the eyes of non-sentient machine-learning systems open wider, for me the claustrophobia increasingly comes not from being unable to unhook myself from the online judgments of others (“You’re lovely!”) but from being locked in with algorithmically produced images of myself. On the bespoke Internet, hell is — ourselves!
Unlike humans, algorithms often don’t withhold or disguise the conclusions they have drawn about us. Their judgments are unmasked, and yet they lack the x-ray’s objective gaze. They don’t serve us up an irrefutable row of our own clean, white ribs. Their assessments have commercial agendas. Their acuity is sometimes hilariously imperfect; they’re often pedantic and oafishly literal. In some contexts, they consolidate a self-image we are pleased to possess, entrenching our cherished habits by nudging us toward the same kind of content again and again. But at other times the algorithms warp our reflection, as in a hall of mirrors, pulling our self-image into grotesque configurations. Much of the hellishness lies in the uncanny quality of these algorithmic portraits. Beyond the impertinence of their presumptions, we are forced to negotiate with the way they represent us. With every bespoke online “you,” we might ask, “Is that really me?”
TikTok assures us that its sense of You is constantly being refined, with light and shade added to the features, paint building up on the canvas: The “recommendation system is a continuous process as we work to refine accuracy, adjust models, and reassess the factors and weights.” But this endless appraisal does not always bring this digital portrait into higher resolution. The effect of algorithmic scrutiny can be distorting. Take an example from YouTube. Say you decide to dive deep into the stash of Susan Sontag material — a nineties news interview where she pretends not to have heard of Camille Paglia, a more stately library talk, and some commemorative documentaries. This unintended Sontag season is so intense that soon YouTube’s selection of videos is one large Mallen streak. In the bathroom mirror, you imagine what you would look like with a Mallen streak.
But then, the disquieting slide occurs. In between Sontags, there appears a rogue evening with Camille Paglia on Shakespeare. She’s good on the Macbeth witches. But the slide continues, and soon the Sontags have morphed into a boorish crew of provocateurs. It’s like following someone you’ve just met at a party to a second party that is not your scene. There’s a sense of high-school vertigo, of an abortive week spent running with a scary new crowd. You log back into YouTube. “Is this who I’ve become?” you ask yourself. As the algorithm invites you to watch the latest provocateur “DESTROY Feminist B**** in 17 seconds,” you wonder if it’s true that you’re known by the company you keep. In your head, Sontag speaks with the voice of a jilted childhood best friend. “You got weird,” she says. And it’s true that YouTube’s mercurial kind of portraiture is weird. The image of oneself created by these recommendation systems isn’t stable; it drifts and smears. And yet the label YOU remains deceptively enduring and monumental. This is recommended for You; this is Your tube.
Much valuable criticism about digital surveillance has rightly focused on what is being taken from us, the slurping up of our lucrative data in the name of, to take one profitable example, cross-marketing opportunities. Zuboff viscerally describes us as picked-over carcasses. But we also need to remember what this surveillance gives us: an Internet that looks uncannily like ourselves.
In my 2015 book The Four-Dimensional Human, I describe a failed promise of the 1990s Internet: that it would free us from our earthly identities and let us move like quicksilver through cyberspace, inhabiting all kinds of experimental selves in gaudy, rackety chat rooms. Instead, our online movements got pinned with thumbnail avatars of our real-world faces; we solidified in the heat of the personal brand. We developed what I called “chain-store selves” as we spread ourselves across the Internet with the trademarked consistency of a franchise.
In her book Trick Mirror (2019), Jia Tolentino argues that we have become “chained to ourselves online.” She coins this beautiful image: “It’s as if we’ve been placed on a lookout that oversees the entire world and given a pair of binoculars that makes everything look like our own reflection.” Here Tolentino is discussing the self-consciousness that comes from the way social media ties our online presence to our personal profiles, so that every post and comment and retweet is read as a description of selfhood.
Through algorithmic interventions this process is intensifying. Instead of just exhibiting our personal brands in the great mall of the Internet, the Internet that we each experience looks more and more like our personal brands. In my case, apparently, this means men’s crotches and Golden Girls memes. The promised mercurial freedom of the ’90s Internet has morphed into the slipping and sliding of the algorithms trying to pin you down.
When I was young, a TV and movie shorthand for depression was the person on the couch flicking blankly from channel to channel, barely able to hold up the remote. Today, that once grim scene glows with a new appeal: the tranquility of not being catered to, of moving between channels that are blind to our mood.
There are less-morose versions of this dissolution of self. Think of what hallowed ground we now find in secondhand bookstores, so many of which have been sunk by algorithms’ seductive targeting of readers. Certain people I know sigh and say that they could spend all day in the jumbled heaps of used bookstores. Perhaps a corner of that sigh might be reserved for the heavenly indifference of the stacks to customers’ tastes. Online among the algorithms, the closest I come to this rush of self-forgetfulness occurs when I accidentally log myself out of YouTube and load up the site as no one. A tide of off-the-rack smash hits, with millions of views apiece, strikes my anonymous face. Oh yes, I think just then, before hurriedly logging back in, I’m not the only one in here.
We have spent these two pandemic years confronting our own portraits in a more literal way. A day spent on Zoom turned out to be a day spent staring at our own faces. As if to underline the claustrophobia of lockdowns, Zoom’s portal to sociability actually led us straight back to ourselves. It is strange how easily we accepted that an image of ourselves should hang there among our companions. This default has made us intimate with our own listening faces, which we know how to paint in real time to appease our vanity.
But this real-time control over our expressions has been but a consoling decoy for the control we have ceded to algorithms that generate their own images of us. We know well that in a Zoom call, in the moments spent arranging our hair in the live view, we are not at our most receptive to others. On the bespoke Internet, we risk becoming permanently mesmerized by how the algorithms ceaselessly fiddle with our hair, tilt our cameras, and adjust our jawlines in their eternal quest to capture us.
An irony of this uncanny scrutiny is that the more algorithms pay attention to us, the less we may pay attention to our digital roommates, to their subjectivities, tastes, and priorities. Solipsism becomes the coded default. The more frightening formulation that waits down this road may be: “Who are other people?”