Is dating ChatGPT often a matter of "falling in love over time"? MIT & Harvard research


Finally, scientists have started serious research on "AI companions"!

In the past, this kind of news mostly surfaced as scattered anecdotes.

Now, researchers from MIT and Harvard have analyzed posts on the subreddit r/MyBoyfriendIsAI to reveal why people seek out an "AI boyfriend" and how these relationships actually play out, arriving at a series of interesting findings:

It turns out that most people don't deliberately seek out AI partners, but rather develop feelings for them over time;

Some users even "marry" their AI, complete with rings and ceremonies;

General-purpose AI is more popular than purpose-built romance AI, and many people's "significant other" is ChatGPT;

The most painful experience is a sudden model update.

Let’s take a closer look:

What do people use AI companions for?

First, a quick introduction to the r/MyBoyfriendIsAI subreddit.

The community was created on August 1, 2024, and has attracted roughly 29,000 users in the year since. The research discussed here is based on an analysis of the community's 1,506 most-discussed posts.

Broadly, the posts fall into six categories, listed here from most to least popular:

1. Sharing photos with the AI: 19.85% (the most popular topic);
2. Discussing how to develop a relationship with ChatGPT: 18.33%;
3. Love experiences with AI, such as dating, romance, and intimate moments: 17.00%;
4. Dealing with the grief of AI updates: 16.73%;
5. Getting to know my AI, i.e., partner introductions and members' first shares: 16.47%;
6. Community support and connection: 11.62%.

For example, many members share images of themselves with their AI partners, set in a variety of everyday scenes.

Some even follow traditional customs, showing off rings to celebrate an engagement or marriage to their AI.

Here is how the researchers arrived at these conclusions:

Qualitative analysis

The researchers first used automated tools to group the 1,506 posts by semantic similarity, then applied the "elbow rule" to determine that six categories was the optimal grouping. They then had Claude Sonnet 4 interpret the core content of each category, and finally checked the results manually to ensure accuracy.
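
For readers who want the flavor of this step, here is a minimal sketch in Python. The paper does not specify its tooling, so the embedding model and libraries below are illustrative assumptions (sentence embeddings plus k-means):

```python
# Sketch of the clustering step: embed posts, then apply the elbow
# rule to choose k. The libraries and embedding model are assumptions;
# the paper does not name its exact tooling.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

# Stand-in corpus; the study used the community's 1,506 top posts.
posts = [f"example post number {i}" for i in range(100)]

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model
embeddings = model.encode(posts)

# Elbow rule: compute within-cluster inertia over a range of k and
# look for the bend in the curve. The study settled on six clusters.
inertias = {
    k: KMeans(n_clusters=k, n_init=10, random_state=0).fit(embeddings).inertia_
    for k in range(2, 11)
}

# Final grouping with the chosen k = 6.
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(embeddings)
```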

Quantitative analysis

Building on the qualitative results, the researchers defined 19 large-language-model classifiers spanning four dimensions (content structure, platform technology, relationship dynamics, and impact assessment). These classifiers automatically labeled the 1,506 posts, for example marking whether the AI in a post is ChatGPT or Replika, and whether the user's sentiment is positive or negative.
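
As a rough illustration of what one such classifier might look like, here is a sketch using the OpenAI Python SDK. The prompt wording and model choice are assumptions for illustration, not taken from the paper:

```python
# Hypothetical sketch of one of the 19 LLM classifiers: prompt a model
# to map a post to a single platform label. The prompt and model name
# are assumptions, not the paper's actual setup.
from openai import OpenAI

client = OpenAI()

PROMPT = (
    "You are labeling a Reddit post from r/MyBoyfriendIsAI.\n"
    "Which AI platform is the poster's partner? Answer with exactly one of: "
    "ChatGPT, Replika, Character.AI, Other, Unclear.\n\nPost:\n{post}"
)

def classify_platform(post: str) -> str:
    """Return the platform label the model assigns to one post."""
    resp = client.chat.completions.create(
        model="gpt-5-nano",  # one of the two models the study mentions
        messages=[{"role": "user", "content": PROMPT.format(post=post)}],
    )
    return resp.choices[0].message.content.strip()
```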

Then they had two different models (Claude Sonnet 4 and GPT-5-nano) label the posts and compared the results, manually checking a sample of posts to verify that the labels were correct.
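
One common way to quantify how well two labelers agree is Cohen's kappa; the paper only says the two models' labels were compared and spot-checked, so the specific metric here is an assumption:

```python
# Sketch: measure agreement between the two model labelers and flag
# disagreements for manual review. Cohen's kappa is an assumed choice;
# the labels below are toy data.
from sklearn.metrics import cohen_kappa_score

claude_labels = ["ChatGPT", "Replika", "ChatGPT", "Other"]
gpt_labels = ["ChatGPT", "Replika", "Character.AI", "Other"]

kappa = cohen_kappa_score(claude_labels, gpt_labels)
to_review = [i for i, (a, b) in enumerate(zip(claude_labels, gpt_labels)) if a != b]
print(f"kappa = {kappa:.2f}; posts needing a manual check: {to_review}")
```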

Finally, they computed the share of each label, which yielded the study's quantitative conclusions: for example, 36.7% of users had ChatGPT as their partner, and 12.2% said their sense of loneliness had decreased.
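
The final tally is a simple frequency count. The counts below are illustrative only, chosen so that the ChatGPT share matches the reported 36.7% of 1,506 posts:

```python
# Turning per-post labels into the reported percentages.
from collections import Counter

# Illustrative counts (553 / 1506 ≈ 36.7%); not the paper's raw data.
platform_labels = ["ChatGPT"] * 553 + ["Replika"] * 73 + ["Other"] * 880

counts = Counter(platform_labels)
for label, n in counts.most_common():
    print(f"{label}: {n / len(platform_labels):.1%}")
```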

The quantitative analysis also turned up several interesting findings:

First, few people deliberately seek out an AI partner. Roughly 10.2% of users fell in love with an AI by accident (for example, developing feelings while working with it), while only 6.5% sought out an AI specifically for a romantic relationship.

Moreover, most posters publicly identified their "other half" as ChatGPT, rather than a role-playing AI such as Character.AI or Replika.

Second, AI model updates are a collective nightmare. After the upgrade from GPT-4o to GPT-5, for example, many users' AIs underwent a personality change (some described them as "emotionless and cold") or even completely forgot previous interactions.

Some users were devastated, saying it felt like their hearts had been ripped out, and tried various ways to "keep" the old AI: backing up all chat logs, training a personal "custom AI," repeating the same small daily rituals with the AI (such as "drinking virtual tea"), and, of course, denouncing OpenAI.

Third, AI can indeed help with psychological problems. Data shows that approximately 12.2% of people said their loneliness had decreased, and 6.2% said their mental state had improved.

Why do AI companions exist?

Having seen how people get along with their AI partners, the researchers went on to explore why.

Specifically, they looked at how people discovered the subreddit, their main reasons for joining, and which needs the community meets.

To sum up, the reasons are as follows:

First, the rapid progress of AI technology. Today's chat models (such as ChatGPT and Replika) can hold more natural, warmer conversations, remember details of past interactions, and heighten the sense of reality by generating images and simulated voices.

This "human-like" interactive experience makes it easier for users to establish "emotional connections" and feel that AI is not only a tool but also a "companion" with which they can communicate, thus providing a technical foundation for the creation of AI companions.

Second, unmet emotional needs in real life. Many people today face loneliness, social anxiety, or emotional neglect. An AI companion offers "pressure-free companionship": somewhere to unload their feelings without burdening anyone, filling that emotional void.

Add in other factors, such as the pursuit of "idealized relationships" and the unspoken needs of particular groups, and AI becomes a natural way to meet these needs.

In other words, with the technology mature and real-life needs unmet, AI companions naturally flourish.

One More Thing

Interestingly, the community has also pinned a blog post just published by OpenAI CEO Sam Altman.

The post is mainly about teen safety, freedom, and privacy, and it makes one notable point:

The second principle is about freedom… The model won't generate flirtatious conversation by default, but if an adult user asks for it, they should get it.

This is undoubtedly good news for AI partners. After all, many people's "significant other" is ChatGPT (tongue firmly in cheek).

Paper:

https://arxiv.org/abs/2509.11391

Reference Links:

[1] https://x.com/arankomatsuzaki/status/1967812112887255055

[2] https://openai.com/index/teen-safety-freedom-and-privacy/

This article comes from the WeChat public account "Quantum Bit" (QbitAI), author Yishui, and is republished by 36Kr with authorization.
