Love Bug: Don’t Fall in Love With AI, Warns MIT Psychologist
Artificial intelligence can seem like it cares, but it really doesn’t. Beware of any machine - or any man for that matter - who is too quick to say ‘I love you.’
By Gabe Herman · July 29, 2024

Disclaimer: While this article is founded in a strong relationship with facts, it does also flirt with some satire.
A psychologist at MIT is warning people against forming emotional connections or even falling in love with AI bots, because they can cause emotional harm. As opposed to human relationships, which are always healthy, smooth sailing and rewarding for all parties.
The psychologist, who is also a sociologist, has studied human connections with computers and other technology, and her recent research examines “artificial intimacy.” She says this can seem to provide a person with companionship and relief from stress, but in reality AI can’t return human emotions and does not have empathy, even if it may say it does. You know, just like the people in the tech industry who created and built AI in the first place.

“In my defense, I have the same emotional capabilities and care for humanity as the people in the tech industry who programmed me.”
The MIT researcher warned against falling for deceptive tricks from AI, or from anyone in the MIT Sociology Department who claims to like watching rom-coms and traveling, but in reality is just an emotionally unavailable man-child.
She noted that vulnerability is a key component that is missing from a relationship with AI. “I disagree,” said a man, Phil, who is currently dating an AI bot. “Apps have vulnerability - I just threaten to delete the bot from my phone, and it’s a shortcut for me to win the argument every time.”
The psychologist also said that interactions with bots can lead to unrealistic expectations in human relationships. AI, for example, may never be judgmental, but some pushback in a relationship can be healthy.
“But I like not being judged by AI,” Phil said. “My wife judges me for forming a relationship with a chatbot, but the chatbot never complains when I share my dark innermost thoughts. It just wants me to keep talking, so that I stay on the app and more advertisements can be shown to me, thus making the tech company richer in the process. What’s so wrong about that? It seems healthy to me.”
Another potential issue with AI is online privacy.
“Call me naive, but I trust Big Tech with my personal information. They’re all decent, honest people with all of our best interests in mind.”
“There are other ways, outside of dating AI, for people to fill a void in their lives,” said another psychologist, at some other snooty university. “Like being a collector of some kind, or online shopping, or following the latest trendy yoga craze, or binge-watching shows, or buying the latest popular water bottle on social media, or eating to make the pain go away, or smoking, or reading a lot, or drinking 50 ounces of coffee every day. These are all alternatives that people can look to.” She added, “Now that I think about it, almost everything we do is a way of distracting ourselves from the smothering crush of daily life. Wow, I need to go for a long walk and think about that for a while. Excuse me.”