Science & Technology

New AI Is Fun, Attractive and Hazardous for Women

ChatGPT’s latest model has made the long-imagined “perfect woman” possible. Though actress Scarlett Johansson stopped an AI voice option that mimicked her voice with suspicious accuracy, the advancement remains concerning. This kind of tech may hook men and negatively influence their attitudes toward real, human women.

June 29, 2024 04:10 EDT

Before there was ChatGPT-4o, there was Scarlett Johansson. Spike Jonze’s 2013 film Her tells the story of Theodore Twombly (Joaquin Phoenix), a lonely Angeleno living in the not-so-distant future who finds himself a changed man when, in the midst of divorcing his childhood sweetheart (Rooney Mara), he falls in love with his new, artificial-intelligence-powered voice assistant, “Samantha” (Scarlett Johansson).

This May, OpenAI, the Microsoft-backed company behind ChatGPT, introduced a new model. ChatGPT-4o is voice-enabled and can respond to user speech in real time. The model can detect emotion in user voices and reply accordingly.

In a demonstration video, an OpenAI staff member wearing company merchandise holds a phone in front of him as if taking a selfie. ChatGPT-4o “looks” at the man and says, in a perky, slightly raspy female voice, “I see you’re rocking an OpenAI hoodie. Nice choice.” The user explains that he’s going to make an announcement. “That’s exciting! Announcements are always a big deal,” the voice says with the eager, somewhat patronizing lilt of a kindergarten teacher.

The OpenAI staffer reveals that ChatGPT-4o is the announcement. “M-me?” asks the incredulous-sounding ChatGPT-4o. “The announcement is about me?” She giggles. “Well, color me intrigued! … You’ve got me on the edge of my… well, I don’t really have a seat, but you get the idea,” she jokes.

The ChatGPT-4o voice used in the video, named Sky, speaks with the vocal enthusiasm of a porn star. The voice in the demonstration is obviously feminine without being too high-pitched. It has just enough vocal fry to sound sexy without becoming grating. It’s recognizably similar to Johansson’s in the role of Samantha.

Regarding the announcement, OpenAI CEO Sam Altman tweeted one word: “her.” It’s clear Sky draws heavy influence from Her, and that Altman intends to give consumers a similarly appealing technological “partner.” But what effect will his efforts have on users? And what does the movie’s leading lady have to say about the product?

Johansson quashed Sky

The Internet certainly noticed the unsubtle inspiration behind Sky. In the days following ChatGPT-4o’s announcement, online publications and commenters made the obvious comparison to Samantha. Then, Johansson herself chimed in.

She alleged that Altman previously sent her an offer, intending to hire her to voice ChatGPT-4o. “He told me,” reads her statement, “that he felt that by my voicing the system, I could bridge the gap between tech companies and creatives and help consumers to feel comfortable with the seismic shift concerning humans and AI. He said he felt that my voice would be comforting to people.” Johansson rejected the offer. Days before ChatGPT-4o was announced, she says, Altman contacted her agent to ask that she reconsider the offer.

However, ChatGPT-4o was released before Johansson was able to respond. The outraged actress took legal action against Altman and his company. In response, OpenAI removed the Sky voice option and published a statement on its website: “Sky’s voice is not an imitation of Scarlett Johansson but belongs to a different professional actress using her own natural speaking voice.” The company declined to disclose the actress’s identity due to privacy concerns.

So ends the short life of the fun and flirty voice assistant — for now. Time will tell how Altman and company will redirect following this controversy and resume their apparent mission of crafting the ideal female companion.

Men have long sought to create the perfect, fake woman

OpenAI’s objective is not to create workflow-enhancing interfaces to aid knowledge workers, students and others who may benefit from working alongside large language models. The company’s stated mission is to create artificial general intelligence that “benefits all of humanity.” This goal is loftier than just designing artificial narrow intelligence, which is goal-oriented and designed to perform singular, specific tasks.

It seems only natural that this great benefactor of humanity should come in the form of the Samantha-inspired Sky. She is unwavering in her devotion towards her users. She is always enthusiastic and helpful. She immediately stops speaking upon interruption and doesn’t mind the intrusion. She giggles, sighs and uses filler words like an actual woman.

Further, it seems Sky was designed with male amusement in mind. She makes coy jokes and gives teasing compliments. She’s nothing if not agreeable, and her “Oh, you big hunk, you!” demeanor is constantly affirming. She sounds hot and yet motherly. Sky is an Oedipal dream come true, a vessel and a mirror, a pseudo-woman with no opinions of her own. She’s capable of being whatever users wish her to be.

This desire to create the feminine ideal is not novel. In Ovid’s Metamorphoses, written in 8 AD, Pygmalion becomes disillusioned with mortal women and sculpts his ideal bride, Galatea, from ivory. She is divinely alluring, with neither agency nor autonomy. Auguste Villiers de l’Isle-Adam’s The Future Eve, written in 1886, tells a similar story. A fictionalized version of Thomas Edison creates Hadaly, a beautiful android. Like Galatea, Hadaly is flawless and dependent. At last, in 2024, the subservient dream woman is here, and she can be carried around in your pocket.

Blurred lines between saucy machines and attached humans

This sexy, real-life Samantha is what we’ve all longed for, right? It isn’t merely Altman’s fantasy, brought to life by his contemporaries? Really, does the average consumer want a breathy, ego-stroking upgrade of the feminine Siri and Alexa to tell them, “Wow, that’s a great outfit you’ve got on, rockstar!” and “You’re right, that is ‘above-average’!”?

I don’t know what the typical AI user wants in a product. I haven’t asked them. I do know that I want a tool to help me automate the monthly reports I generate and summarize meeting minutes on my behalf. I have no interest in an artificial approximation of companionship. I am, however, not within ChatGPT-4o’s possible target demographic.

The historically male-heavy Silicon Valley is home to many Theodore Twombly types. This region is responsible for the development of high-tech solutions that promise to make life a little easier and, possibly, a little less isolating. Now is the perfect time for ChatGPT-4o — and for products branded as AI companions, like Replika — to hit the market. Loneliness is an epidemic.

Some will cringe at the sound of ChatGPT-4o, but others will embrace the model. It has been purposely designed to attract attention, after all. Before, ChatGPT’s responses were cold. Many of its replies began with the disclaimer, “as an AI language model,” as a reminder that the entity at the other end is not human. Now, OpenAI is attempting to blur the distinction between human and machine by making you feel as though you’re interacting with a real person.

Altman once called the interactions between humans and AI as depicted in Her “incredibly prophetic.” Silicon Valley CEOs and product engineers seemingly view futuristic media through an optimistic lens. Many dream of utopia. Some earnestly believe that they are helping to create it. Others recognize that the promise of utopia makes for an excellent selling point. But what about the people who don’t sit in their boardrooms?

Ordinary users will form attachments to the current wave of AI assistants. This is a documented phenomenon that was discovered with the ELIZA computer program in the 1960s. The appropriately named Eliza effect refers to people’s tendency to attribute human-like understanding and emotions to AI systems based on simple, conversational interactions. ELIZA was created to mimic a psychotherapist, and the program used basic pattern-matching techniques to create an illusion of comprehension. Users believed they were interacting with a sentient being.
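The Eliza effect is surprisingly easy to reproduce. As a rough illustration — the rules and responses below are invented for this sketch, not Weizenbaum’s original script — a handful of pattern-matching rules plus first-to-second-person “reflection” is enough to produce a convincing back-and-forth:

```python
import re

# Swap first-person words for second-person ones ("my" -> "your"),
# so matched fragments can be echoed back at the user.
REFLECTIONS = {"i": "you", "am": "are", "my": "your", "me": "you"}

# Each rule pairs a pattern with a response template. These rules are
# illustrative; ELIZA's actual script was far larger.
RULES = [
    (re.compile(r"i feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.+)", re.I), "Tell me more about your {0}."),
]
DEFAULT = "Please, go on."


def reflect(fragment: str) -> str:
    """Reflect pronouns in a matched fragment."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())


def respond(statement: str) -> str:
    """Return the first matching rule's response, with the captured
    fragment reflected; fall back to a generic prompt."""
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            return template.format(reflect(match.group(1)))
    return DEFAULT
```

No comprehension happens anywhere in this loop — the program never models what “lonely” means — yet users of the original ELIZA confided in it as though it understood them.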

Regular users may find it difficult to differentiate between human and non-human interactions — even OpenAI staff anthropomorphize their creations to help distinguish between them. When users can tell this difference, they often don’t care. That worries me.

The digital woman concerns me

What duty does a company like OpenAI have to individual users who’ve formed emotional connections with their products? What happens when ChatGPT-4o changes in personality? Will it be like that horrific moment that sometimes occurs with a long-term partner, when one no longer recognizes the person they’ve woken up with?

Her does not end with Theodore and his computerized lover running gleefully into the sunset. No, the AI assistants achieve superintelligence and transcend to a plane of consciousness where people cannot venture. The film closes with a spotlight on human-to-human connection. Theodore and a human friend gaze out over their city, having both lost their AI companions. They are left behind with emotional pain akin to that which follows the end of an actual relationship.

I worry that, as consumers begin to form parasocial relationships with feminine AI programs, they’ll begin to further objectify actual women who are disappointingly unprogrammable. I dread the day when “Why can’t you be more like ChatGPT?” becomes argument fodder.

Like Samantha and Sky, I’m characteristically enthusiastic. I possess several of the virtual women’s desired qualities… sometimes. Sometimes I’m obstinate or even spiteful. At all times, I am human. Humans do not live to fulfill our expectations and cater to our desires. Sometimes, they let us down. We choose to love one another in spite of, and because of, our shared humanity.

This is a lesson that Theodore Twombly and his real-life counterparts stand to learn. As Theodore’s ex-wife Catherine points out, he “always wanted to have a wife without the challenges of actually dealing with anything real.”

[Lee Thompson-Kolar edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.
