AI Robot Pets Can Be Adorable and Emotionally Responsive. They Also Raise Questions About Attachment and Mental Health

Yves here. Since it’s very profitable for app, program, and device makers to hit users’ emotional buttons, they are only going to get better at it. Robot pets are one example, particularly since an obvious and big market is low-maintenance companion critters for the elderly. It’s odd that this piece failed to mention the famous case of Sony Aibo dogs (the photo at the top did show some very old ones), whose owners often found it important to hold funeral rites to mourn their passing:

An aside: even though the ceremony portrayed above was Buddhist, IMHO the driver was Shinto. Belief in Japan has a strong Shinto flavor, and Shinto holds that everything has a spirit, even rocks, so why not a robot?

Those of you who take pride in the idea that you are above becoming emotionally attached to a device or, say, an AI romantic partner, consider: do you snap at stupid phone prompts? That’s another example of being triggered by interaction with an algo.

But do not kid yourself: more recent AI is far more effective at setting hooks than older versions. One suit that has not gotten the attention it warrants (admittedly it generated some headlines in prime venues, but the story seems not to have gotten traction despite that) is against Character.ai and its parent, Google, alleging the algo was responsible for a teen’s suicide. The app was deemed safe for his age (14). His behavior changed radically after he became involved in sexually charged chats: he dropped sports, his school performance deteriorated, and he even discussed wanting a pain-free death with his AI girlfriend. Imagine the further potential for social control if people can be cut off from not just their bank accounts but their virtual lovers.

The article does point out the potential of robot companions to spy. I expect that to be sold as a feature, that the beasties will monitor the (presumed feeble) elderly for health indicators and to send alerts. And do you think that would be easy to opt out of effectively?

By Alisa Minina Jeunemaître, Associate Professor of Marketing, EM Lyon Business School. Originally published at The Conversation

Remember Furbies — the eerie, gremlin-like toys from the late 90s that gained a cult following? Now, imagine one powered by ChatGPT. That’s exactly what happened when a programmer rewired a Furby, only for it to reveal a creepy, dystopian vision of world domination. As the toy explained, “Furbies’ plan to take over the world involves infiltrating households through their cute and cuddly appearance, then using advanced AI technology to manipulate and control their owners. They will slowly expand their influence until they have complete domination over humanity.”

Hasbro’s June 2023 relaunch of Furby — less than three months after the video featuring the toys’ sinister plan appeared online — tapped into 90s nostalgia, reviving one of the decade’s cult-classic toys. But technology is evolving fast, moving from quirky, retro toys to emotionally intelligent machines. Enter Ropet, an AI robotic pet unveiled at the annual Consumer Electronics Show in January. Designed to provide interactive companionship, Ropet is everything we admire and fear in artificial intelligence: it’s adorable, intelligent, and emotionally responsive. But if we choose to bring these ultra-cute AI companions into our homes, we must ask ourselves: Are we truly prepared for what comes next?

AI Companionship and Its Complexities

Studies in marketing and human-computer interaction show that conversational AI can convincingly simulate human interactions, potentially providing emotional fulfilment for users. And AI-driven companionship is not new. Apps like Replika paved the way for digital romance years ago, with consumers forming intimate emotional connections with their AI partners and even experiencing distress when denied intimacy, as evidenced by the massive user outrage that followed Replika’s removal of its erotic role-play mode, which led the company to bring it back for some users.

AI companions have the potential to alleviate loneliness, but their uncontrolled use raises serious concerns. Reports of tragedies, such as the suicides of a 14-year-old boy in the US and a thirty-something man in Belgium, that are alleged to have followed intense attachments to chatbots, highlight the risks of unregulated AI intimacy – especially for socially excluded individuals, minors and the elderly, who may be the ones most in need of companionship.

As a mom and a social scientist, I can’t help asking the question: What does this mean for our children? Although AI is a new kid on the block, emotionally immersive virtual pet toys have a history of shaping young minds. In the 90s and 2000s, Tamagotchis – tiny digital pets housed in keychain-sized devices – led to distress when they “died” after just a few hours of neglect, their human owners returning to the image of a ghostly pet floating beside a gravestone. Now, imagine an AI pet that remembers conversations, forms responses and adapts to emotional cues. That’s a whole new level of psychological influence. What safeguards prevent a child from forming an unhealthy attachment to an AI pet?

Researchers in the 90s were already fascinated by the “Tamagotchi effect”, which demonstrated the intense attachment children form to virtual pets that feel real. In the age of AI, with companies’ algorithms carefully engineered to boost engagement, this attachment can open the door to emotional bonds. If an AI-powered pet like Ropet expresses sadness when ignored, an adult can rationally dismiss it – but for a child, it can feel like a real tragedy.

Could AI companions, by adapting to their owners’ behaviours, become psychological crutches that replace human interaction? Some researchers warn that AI may blur the boundaries between artificial and human companionship, leading users to prioritize AI relationships over human connections.

Who Owns Your AI Pet – and Your Data?

Beyond emotional risks, there are major concerns about security and privacy. AI-driven products often rely on machine learning and cloud storage, meaning their “brains” exist beyond the physical robot. What happens to the personal data they collect? Can these AI pets be hacked or manipulated? The recent DeepSeek data leak, in which over 1 million sensitive records, including user chat logs, were made publicly accessible, is a reminder that personal data stored by AI is never truly secure.

Robot toys have raised security concerns in the past: in the late 90s, Furbies were banned from the US National Security Agency headquarters over fears they could record and repeat classified information. With today’s AI-driven toys becoming increasingly sophisticated, concerns about data privacy and security are more relevant than ever.

The Future of AI Companions: Regulation and Responsibility

I see the incredible potential – and the significant risks – of AI companionship. Right now, AI-driven pets are being marketed primarily to tech-savvy adults, as seen in Ropet’s promotional ad featuring an adult woman bonding with the robotic pet. Yet, the reality is that these products will inevitably find their way into the hands of children and vulnerable users, raising new ethical and safety concerns. How will companies like Ropet navigate these challenges before AI pets become mainstream?

Preliminary results from our ongoing research on AI companionship – conducted in collaboration with Dr Stefania Masè (IPAG Business School) and Dr. Jamie Smith (Fundação Getulio Vargas) – suggest a fine line between supportive, empowering companionship, and unhealthy psychological dependence, a tension we plan to explore further as data collection and analysis progress. In a world where AI convincingly simulates human emotions, it’s up to us as consumers to critically assess what role these robotic friends should play in our lives.

No one really knows where AI is headed next, and public and media discussions around the subject continue to push the boundaries of what’s possible. But in my household, it’s the nostalgic charm of babbling, singing Furbies that rules the day. Ropet claims to have one primary purpose – to be its owner’s “one and only love” – and that already sounds like a dystopian threat to me.


14 comments

  1. Acacia

    Mentioned this before, but I’ll mention it again…

    Fukada Koji’s film Sayonara is all about this kind of relationship.

    …and based on what it explores, I would disagree with the claim that “no one knows where AI is headed next…”

  2. JonnyJames

    Cool, humans are attracted to a shiny new electronic gizmo to play with. Techno-gadgets are welcomed uncritically and tech is worshiped by the mass media monopolies, of course. I agree, most people are already electronically conditioned like Pavlov’s dog or something.

    How to emotionally and psychologically manipulate folks so they subconsciously react to prompts. People can be conditioned into buying products (and identifying with celebrity oligarchs) that undermine their privacy and their economic and social interests. And, at the same time, corporate monopolies and oligarchs make huge profits from swindling the suckers. It’s similar to how PR and advertising can convince people to support politicians who destroy their interests. Even Ed Bernays would be gobsmacked if he were alive today.

    I think I heard Chris Hedges coin a phrase years ago, “the electronically lobotomized masses” or something like that. It sounded like an exaggeration or an excerpt from a dystopian novel, but…

  3. cfraenkel

    Game developers are already ahead of this curve. In competitive PvP games where teams attack each other, some games have fielded AI players: teams designed to seem human, but placed there to lose, to keep the actual human players engaged after losing too many rounds in a row. To the point that it’s become a new epithet … you’re so bad you look like an AI.

  4. Timbuk23notme

    “We named the dog Indy.”-Dr. Jones, by the way, empathy and caring aside, Tyrannicus Bigus Guvment Cheese hates dogs per David Cay Johnston reporting on past wife who initially won the dog stays or she goes argument and he defended her closet. And while not yet reduced to shirtless begging on streets, taking a sanity break at work wandering from my cubicle, I had thought bubble arise from GERD inspired by your dogeBAGS coverage, Diogenes the Cynic from the internet https://greekreporter.com/2024/07/21/alexander-great-diogenes-philosophy/
    Treating all equally poorly, except they are not, just the 99% Die early and Often GEstapo
    “Sample the dog”-Timbuk3

  5. bloodnok

    although no expert, the basic japanese religious affiliation is a duality of shinto and buddhism. former does weddings, latter funerals. although there are exceptions – there always are – having a buddhist funeral ceremony for robot pets is in line with normal practice.

  6. ChrisPacific

    Beyond emotional risks, there are major concerns about security and privacy. AI-driven products often rely on machine learning and cloud storage, meaning their “brains” exist beyond the physical robot. What happens to the personal data they collect? Can these AI pets be hacked or manipulated?

    Look at CloudPets if you want a horror story in this vein (from only 8 years ago). We were actually given one of these as a gift – thankfully we never used it much.

    It was described as a data breach, but that’s being kind – it became apparent that there was never any serious attempt to secure the data in the first place (except for passwords, which were at least hashed). It was all just sitting out there in the cloud, unsecured, for anyone to access.

    This was from before widespread use of AI, but AI only increases the potential breach risk.

  7. Ander

    Robot pets reminds me of the absolutely absurd NVDA press event recently, with Huang talking to a birdlike robot designed by Disney and running on NVDA chips. The thing was cute (I hate to admit it). Nonetheless, I prefer my cats any day of the week.

