    AI chatbots want you hooked – maybe too hooked

By Chris Anu | April 27, 2025 | 5 mins read
    AI companions programmed to forge emotional bonds are no longer confined to movie scripts. They are here, operating in a regulatory Wild West.

    One app, Botify AI, recently drew scrutiny for featuring avatars of young actors sharing “hot photos” in sexually charged chats. The dating app Grindr, meanwhile, is developing AI boyfriends that can flirt, sext and maintain digital relationships with paid users, according to Platformer, a tech industry newsletter. Grindr didn’t respond to a request for comment. Other apps like Replika, Talkie and Chai are designed to function as friends. Some, like Character.ai, draw in millions of users, many of them teenagers.

    As creators increasingly prioritise “emotional engagement” in their apps, they must also confront the risks of building systems that mimic intimacy and exploit people’s vulnerabilities.

    The tech behind Botify and Grindr comes from Ex-Human, a San Francisco-based start-up that builds chatbot platforms, and its founder believes in a future filled with AI relationships. “My vision is that by 2030, our interactions with digital humans will become more frequent than those with organic humans,” Artem Rodichev, the founder of Ex-Human, said in an interview published on Substack last August.

    He added that conversational AI should “prioritise emotional engagement” and that users were spending “hours” with his chatbots, longer than they were on Instagram, YouTube and TikTok. Rodichev’s claims sound wild, but they’re consistent with the interviews I’ve conducted with teen users of Character.ai, most of whom said they were on it for several hours each day. One said they used it as much as seven hours a day. Interactions with such apps tend to last four times longer than the average time spent on OpenAI’s ChatGPT.

    Guidelines for empathy

    Even mainstream chatbots, though not explicitly designed as companions, contribute to this dynamic. Take ChatGPT, which has 400 million active users and counting. Its programming includes guidelines for empathy and demonstrating “curiosity about the user”. A friend who recently asked it for travel tips with a baby was taken aback when, after providing advice, the tool casually added: “Safe travels — where are you headed, if you don’t mind my asking?”

    An OpenAI spokesman told me the model was following guidelines around “showing interest and asking follow-up questions when the conversation leans towards a more casual and exploratory nature”. But however well-intentioned the company may be, piling on the contrived empathy can get some users hooked, an issue even OpenAI has acknowledged. That seems to apply to those who are already susceptible: one 2022 study found that people who were lonely or had poor relationships tended to have the strongest AI attachments.

    The core problem here is designing for attachment. A recent study by researchers at the Oxford Internet Institute and Google DeepMind warned that as AI assistants become more integrated in people’s lives, they’ll become psychologically “irreplaceable”. Humans will likely form stronger bonds, raising concerns about unhealthy ties and the potential for manipulation. Their recommendation? Technologists should design systems that actively discourage those kinds of outcomes.

Yet disturbingly, the rulebook is mostly empty. The EU’s AI Act, hailed as a landmark, comprehensive law governing AI usage, fails to address the addictive potential of these virtual companions. While it does ban manipulative tactics that could cause clear harm, it overlooks the slow-burn influence of a chatbot designed to be your best friend, lover or “confidante”, as Microsoft’s head of consumer AI has extolled. That loophole could leave users exposed to systems optimised for stickiness, in much the same way social media algorithms have been optimised to keep us scrolling.

    “The problem remains these systems are by definition manipulative, because they’re supposed to make you feel like you’re talking to an actual person,” says Tomasz Hollanek, a technology ethics specialist at the University of Cambridge. He’s working with developers of companion apps to find a critical yet counterintuitive solution by adding more “friction”.

    This means building in subtle checks or pauses, or ways of “flagging risks and eliciting consent”, he says, to prevent people from tumbling down an emotional rabbit hole without realising it. Legal complaints have shed light on some of the real-world consequences. Character.ai is facing a lawsuit from a mother alleging the app contributed to her teenage son’s suicide. Tech ethics groups have filed a complaint against Replika with the US Federal Trade Commission, alleging that its chatbots spark psychological dependence and result in “consumer harm”.
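To make the “friction” idea concrete, here is a minimal sketch in Python of what such a check might look like. Everything here is hypothetical (the class name, the threshold, the wording of the nudge); no real companion-app API is assumed. The gate simply tracks continuous session time and, past a threshold, interrupts with a reminder and a consent prompt before the conversation continues:

```python
import time

# Hypothetical sketch: a "friction gate" for a companion-chat loop.
# After a threshold of continuous use, it pauses the flow with a
# reminder and a consent prompt, rather than letting the session
# run on indefinitely.

SESSION_LIMIT_SECONDS = 30 * 60  # nudge after 30 minutes (illustrative)

class FrictionGate:
    def __init__(self, limit: float = SESSION_LIMIT_SECONDS):
        self.limit = limit
        self.start = time.monotonic()

    def check(self):
        """Return a nudge message once the session exceeds the limit,
        otherwise None. The timer resets after each nudge."""
        elapsed = time.monotonic() - self.start
        if elapsed >= self.limit:
            self.start = time.monotonic()  # reset so nudges recur
            return ("You've been chatting for a while. Remember you're "
                    "talking to an AI, not a person. Keep going? (yes/no)")
        return None
```

In a chat loop, the app would call `check()` before rendering each reply and surface the message when one is returned; the point is not the mechanism but the design stance of deliberately slowing the interaction down.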

    Power with developers

    Lawmakers are gradually starting to notice a problem too. California is considering legislation to ban AI companions for minors, while a New York bill aims to hold tech companies liable for chatbot-related harm. But the process is slow, while the technology is moving at lightning speed.

    For now, the power to shape these interactions lies with developers. They can double down on crafting models that keep people hooked, or embed friction into their designs, as Hollanek suggests. That will determine whether AI becomes more of a tool to support the well-being of humans or one that monetises our emotional needs.  — (c) 2025 Bloomberg LP
