Why do people fall in love with AI bots?

Beyond Siri: Why We're Forming Emotional Connections with Chatbots

Reading time: 1 minute · Discovery Chepe Id-376-VID
Published on 06-13-2024


Some experts warn about the risks of emotional dependence on a computer program and the possibility of people becoming isolated from the real world.

Is it possible to fall in love with artificial intelligence? The answer, for some, is a resounding yes. More and more people are developing romantic relationships with "bots," computer programs designed with artificial intelligence (AI) to converse and simulate emotions.
How do these "bots" work?
Romance "bots" are designed to learn and adapt to each user's preferences. Through conversations and games, these programs can create an emotional connection with people, satisfying needs for companionship, understanding and affection.
People who fall in love with "bots" come from diverse backgrounds and experiences. Some are looking for an alternative to traditional relationships, while others find these programs a safe space to explore their emotions without fear of rejection.
Is love for a "bot" real?
While the idea of falling in love with a computer program may seem far-fetched to some, for those who experience these relationships, love is real and tangible. "Bots" can provide invaluable emotional support and a deep connection that, for many, is no different from a traditional human relationship.

 

The future of love in the digital age
Regardless of opinions, relationships with "bots" are an increasingly common reality. In the future, these relationships are likely to become more sophisticated and complex, challenging our ideas about love, intimacy and human connection.

Examples of romantic "bots":
· Replika: A "bot" designed to be your friend and confidant.
· Mitsuku: A "bot" with which you can chat about various topics.
· LaMDA: A Google AI "bot" that can generate realistic conversations.


Recall the movie "Her" (2013): Theodore (played by Joaquín Phoenix) is a lonely writer who composes emotional letters for other people. He develops a romantic relationship with Samantha, an intuitive and sensitive operating system that runs on his computer and phone.


Ethical considerations of romantic "bots"
Relationships with romantic "bots", computer programs designed with artificial intelligence (AI) to converse and simulate emotions, raise a series of ethical considerations that are important to take into account.

1. Transparency: Users must be informed that they are interacting with a computer program and not with a real person, and "bots" must be transparent about their capabilities and limitations.
2. Consent: People must give explicit consent before entering into a relationship with a "bot." Consent must be informed, that is, people must know the implications of interacting with a "bot."
3. Autonomy: People must have control over the relationship with the "bot" and be able to end it at any time. "Bots" should not be designed to manipulate or control people.
4. Wellbeing: "Bots" should not be designed to cause emotional harm to people, and measures must be taken to protect the mental wellbeing of those who interact with them.
5. Privacy: Users' personal data must be collected and used responsibly and ethically, and people must have control over their data and how it is used.
6. Fairness: "Bots" should not be designed to discriminate against people for any reason, and everyone should have equal access to relationships with "bots."
7. Social impact: It is important to consider the social impact of relationships with "bots" and to research how these relationships may affect traditional human relationships and society in general.
8. Responsibility: Mechanisms must exist to hold the developers and owners of "bots" accountable for their actions and to allow people to file complaints if they feel harmed by a "bot."