Content provided by SAS Podcast Admins, Kimberly Nevala, and Strategic Advisor - SAS. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by SAS Podcast Admins, Kimberly Nevala, and Strategic Advisor - SAS or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://podcastplayer.com/legal.
Regulating Addictive AI with Robert Mahari

54:24
 
Robert Mahari examines the consequences of addictive intelligence, adaptive responses to regulating AI companions, and the benefits of interdisciplinary collaboration.

Robert and Kimberly discuss the attributes of addictive products; the allure of AI companions; AI as a prescription for loneliness; not assuming only the lonely are susceptible; regulatory constraints and gaps; individual rights and societal harms; adaptive guardrails and regulation by design; agentic self-awareness; why uncertainty doesn’t negate accountability; AI’s negative impact on the data commons; economic disincentives; interdisciplinary collaboration and future research.

Robert Mahari is a JD-PhD researcher at the MIT Media Lab and Harvard Law School, where he studies the intersection of technology, law, and business. In addition to computational law, Robert has a keen interest in AI regulation and in embedding regulatory objectives and guardrails into AI design.

A transcript of this episode is here.

Additional Resources:
