Content provided by Spotify Studios. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Spotify Studios or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

AI Chatbots: Are They Dangerous?

40:50
 

Many of us use artificial intelligence for help with research, work, or creative projects. But some people are getting a LOT more personal with their AI chatbots. We’re hearing stories of people treating their chatbot like a friend, or something more than a friend — with some people saying they’ve even fallen in love with their chatbot. And then there are stories of things taking a scary turn, with people’s mental health spiraling out of control after talking to these bots. So, what should we make of AI companions? Is it risky to spend a lot of time talking to an AI bot? We ask AI researcher Dr. Julian de Freitas and psychiatrist Dr. Keith Sakata.

This episode does mention mental health issues and suicide. Here are some crisis hotlines:

United States: National Suicide Prevention Lifeline: dial 988 (online chat available); Crisis Text Line: text “HOME” to 741741

Australia: Lifeline 13 11 14 (Online chat available)

Canada: Canadian Association for Suicide Prevention (See link for phone numbers listed by province)

United Kingdom: Samaritans 116 123 (UK and ROI)

Full list of international hotlines here

Find our transcript here: https://bit.ly/ScienceVsAiCompanions

Chapters:

In this episode, we cover:

(00:00) What’s it like to fall in love with a chatbot?

(06:59) Do chatbots help people feel less lonely?

(21:19) Chatbots during a crisis

(28:43) Red flags to watch out for

(33:17) How dangerous are they?

This episode was produced by Rose Rimler, with help from Blythe Terrell, Meryl Horn, and Michelle Dang. We’re edited by Blythe Terrell. Fact-checking by Diane Kelly. Mix and sound design by Bumi Hidaka. Music written by Emma Munger, So Wylie, Peter Leonard, Bumi Hidaka, and Bobby Lord. Thanks to all the researchers we reached out to, including Cathy Fang, Dr. Linnea Laestadius, Dr. Sophia Choukas-Bradley, and Prof. Stefano Puntoni. Special thanks also to Jeevika Verma.

Science Vs is a Spotify Studios Original. Listen for free on Spotify or wherever you get your podcasts. Follow us and tap the bell for new episode notifications.

Learn more about your ad choices. Visit podcastchoices.com/adchoices
