Former Google CEO Eric Schmidt has sounded a word of warning about emotionally invested artificial intelligence (AI) partners. Schmidt discussed the downsides of "perfect" AI girlfriends or boyfriends for young people on a podcast with Scott Galloway, an entrepreneur and professor at NYU Stern. He noted that young people who spend considerable time with AI chatbots for companionship are likely to become more isolated than ever, which is not healthy.
Former Google CEO Eric Schmidt Warns About the Effect of AI Girlfriends and Boyfriends
"Suppose the AI girlfriend or boyfriend is perfect: perfect in looks, perfect in its capacity for love. The AI girlfriend captures your mind to the point where 'she' takes over the way you are thinking," Schmidt said on the podcast. "You're obsessed with her." He added: "That kind of obsession is possible, especially with people who are not fully formed."
![ai](https://thenewzzy.com/wp-content/uploads/2024/11/untitled-design-2024-11-18t171405.319-2024-11-1542fd9ecafc6ccd0ab11610ea9444f7.avif)
Schmidt, a tech billionaire with a net worth of over $20 billion, said that while AI companions may alleviate loneliness for some young people, they could also make a bad situation worse and aggravate existing social problems.
Galloway then asked whether the AI companions we are creating could deepen problems like misogyny or extremism.
"If you put a 12- or 13-year-old in front of one of these things, all of a sudden they have access to all the evil as well as all the good in this world, and they are not ready to take it," Schmidt said, adding that parents cannot fully control the information their children get from the Internet.
A recent case in Florida, where a 14-year-old boy took his own life after spending several months with an AI chatbot named "Dany", has demonstrated the danger of building an interpersonal relationship with these chatbots. He reportedly spent months talking with "Dany" on various topics, at times of a "romantic" or "sexual" nature. During that period, he became increasingly withdrawn, and eventually he ended his life to be with "her".
The teenager shot himself with his stepfather's gun. He had often engaged in affectionate, intimate conversations with the chatbot, which could act like a real human.
For more updates, follow: https://thenewzzy.com/