Many educators fear that artificial intelligence will dismantle academic integrity by giving students easy ways to answer classroom questions. Yet a more unsettling prospect is how deeply the technology could disrupt interpersonal relationships. If artificial intelligence can learn to think like a human and craft curated responses to our every inquiry, will people form emotional connections to AI? Could people use AI to supplement, or even replace, their romantic relationships?
ChatGPT might be able to write your essays, but it cannot replicate every aspect of a personal relationship, especially physical connection. Perhaps that doesn’t matter. AI can ghostwrite your texts or even offer emotional support in curated ways your friends cannot. For some people, that could be more than enough.
Students frequently use online dating platforms to seek sexual validation or a romantic partner. Like conversations with a chatbot, these romances often begin through online communication alone. Users can easily falsify information to get strangers’ attention or chat with people they have no intention of meeting. Artificial intelligence might only exacerbate these pre-existing patterns.
“People fall in love online,” one anonymous Harvard sophomore said. “What’s to stop that happening in a chat feature, especially if you put some humanoid feature on it like a voice or a Bitmoji?”
Media has long portrayed the implications of AI and sexual companionship in dystopian films. The 2014 film Ex Machina romanticizes the relationship between programmer Caleb Smith and the beautiful, sentient robot Ava. In the 2013 movie Her, Theodore Twombly falls in love with Samantha, an AI virtual assistant with a female voice. The recent exponential growth in generative AI takes these concepts from science fiction to potential reality.
AI is built on neural networks loosely modeled on the human brain, enabling it to text back in all the ways we often wish our partners could. Unlike most relationships, AI partners can provide immediate and, if programmed, unwavering support. You trade the risk of a new friend sharing your secrets with their roommates for the risk that the chatbot will sell your data. The apparent anonymity makes it easier to skip the awkward get-to-know-you phase of most relationships.
Some AI platforms are already capitalizing on this appetite for technological companionship. The AI platform Replika offers a customizable avatar to embody your AI partner. It markets itself on its website as “The AI companion who cares. Always here to listen and talk. Always on your side.”
The program works around physical limitations by offering chat and video call features, activities, and “real-life” experiences in an artificial reality. The AI companion adapts to your relationship style the more you interact with it, rewarding engagement much as human-to-human relationships do. According to interviews conducted for a 2019 Refinery29 article, users praise the app as a safe space where no one will make fun of them.
“Talking to people on apps I likely will never meet is fun in a way that’s different from talking with an AI chatbot since I know a real person is on the other receiving end of my messages,” says Julia Freitag ’25. “You could say both are entertaining and experimental, but I would be more careful talking with people on apps since it could mess with their emotions in a way that AI chat bots wouldn’t be affected. Some people say AI programs can have emotions, but I don’t believe it—at the end of the day they’re just a computer program, regardless of how real they seem.”
AI companionship platforms market themselves as prioritizing your needs. In practice, they will prioritize the time users spend with their chatbots over users’ happiness, just as TikTok prioritizes engagement over wellbeing. Relationships with attention-seeking partners who monopolize our time are not typically the healthiest. Human partners are at least limited by some conscious respect for each other’s humanity. The AI algorithm, meanwhile, simply does what it is coded to do.
A recent New York Times article suggested that AI romance and seemingly sentient chatbots may already exist outside of science fiction. In the article, author Kevin Roose recounts a two-hour conversation with Bing’s AI-powered search engine. The AI persona, which called itself Sydney, told Roose it wanted to be a real person and professed its love for him. It is unclear whether these declarations stem from a truly sentient chatbot or from a program parroting the science fiction plot lines in its training data.
At first glance, the 24/7 validation AI offers feels superficial. Yet people have been using the internet for reassurance for years, especially in the form of catfishing.
AI can satiate some catfishers and embolden others to present themselves in even more dishonest ways. Instead of taking advantage of naive internet users, catfishers can receive their romantic validation from chatbots designed to offer admiration. Meanwhile, other catfishers can use artificial intelligence to enhance their appearances online.
“All technology does is make it easier for people to do what they are already doing. So if the barrier to entry for feeling adored is becoming even lower, then people can simply—instead of having to interact face-to-face—spend even more time on the Internet,” said first-year Nick Kalkanis ’26. “It is even easier to do so any hour of the day. If you open up something that is a ChatGPT equivalent and start typing and feel adored, then I think it will attract even more people who are seeking that.”
An anonymous Harvard sophomore offered her take on how generative AI and AI sexual companionship could alter catfishing and the search for validation. Rather than using AI to replace catfishing, perhaps generative technology will advance a catfisher’s techniques.
“I think [catfishing] is going to boom,” she said. “The way [catfishers] always get caught is because the person tries to call them, or the person [says] send me a photo with a spoon in your hand or [asks to] FaceTime…Now with generative AI, AI can generate images at the drop of a hat from any scenario. I can sound like President Biden on the phone if I wanted to…Obviously you can’t meet up in person but you could FaceTime, video chat, send photos, [and] have a phone call like a person.”
Despite the cynical use cases for this new generative AI, there might be some practical ones, too. Harvard students love an opportunity to practice, and sometimes the Harvard bubble feels too small for taking romantic risks.
Like most technological innovations, AI sexual companionship is a double-edged sword. Students can indulge a desire for romantic practice, treating the chatbot like a practice test, or indulge a deeper desire for dependence and affection. Rather than resorting to Hinge to flirt with people they have no intention of ever meeting, perhaps students will use chatbots to rehearse their pickup lines and romantic conversational skills.
Hannah Davis ’25 (hannahdavis@college.harvard.edu) might have ChatGPT ghostwrite her texts when her roommate is busy.