Content provided by Nick Standlea. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Nick Standlea or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

Losing Control: How AI Agents Could Run the World — with Oxford’s Christopher Summerfield

53:49

Professor Christopher Summerfield, a leading neuroscientist at the University of Oxford, Research Director at the UK AI Safety Institute, and former Senior Research Scientist at Google DeepMind, discusses his new book, These Strange New Minds, which explores how large language models learned to talk, how they differ from the human brain, and what their rise means for control, agency, and the future of work.

We discuss:
The real risk of AI — losing control, not extinction
How AI agents act in digital loops humans can’t see
Why agency may be more essential than reward
Fragility, feedback loops, and flash-crash analogies
What AI is teaching us about human intelligence
Augmentation vs. replacement in medicine, law, and beyond
Why trust is the social form of agency — and why humans must stay in the loop

🎧 Listen to more episodes: https://www.youtube.com/@TheNickStandleaShow

Guest Notes:
Professor of Cognitive Neuroscience, Department of Experimental Psychology, University of Oxford
🌐 Human Information Processing (HIP) Lab, run by Professor Christopher Summerfield: https://humaninformationprocessing.com/
🏛 Research Director, UK AI Safety Institute

📘 These Strange New Minds (Penguin Random House): https://www.amazon.com/These-Strange-New-Minds-Learned/dp/0593831713

Christopher Summerfield Media:
https://csummerfield.github.io/personal_website/
https://flightlessprofessors.org
Twitter: @summerfieldlab
Bluesky: @summerfieldlab.bsky.social

🔗 Support This Podcast by Checking Out Our Sponsors:
👉 Build your own AI Agent with Zapier (opens the builder with the prompt pre-loaded): https://bit.ly/4hH5JaE

Test Prep Gurus
website: https://www.prepgurus.com
Instagram: @TestPrepGurus

Connect with The Nick Standlea Show:
YouTube: @TheNickStandleaShow
Podcast Website: https://nickshow.podbean.com/
Apple Podcasts: https://podcasts.apple.com/us/podcast/the-nick-standlea-podcast/id1700331903
Spotify: https://open.spotify.com/show/0YqBBneFsKtQ6Y0ArP5CXJ
RSS Feed: https://feed.podbean.com/nickshow/feed.xml

Nick's Socials:
Instagram: @nickstandlea
X (Twitter): @nickstandlea
TikTok: @nickstandleashow
Facebook: @nickstandleapodcast

Ask questions,
Don't accept the status quo,
And be curious.

🕒 Timestamps / Chapters
00:00 Cold open — control, agency, and AI
00:31 Guest intro: Oxford → DeepMind → UK AI Safety Institute
01:02 The real story behind AI “takeover”: loss of control
03:02 Is AI going to kill us? The control problem explained
06:10 Agency as a basic psychological good
10:46 The Faustian bargain: efficiency vs. personal agency
13:12 What are AI agents and why are they fragile?
20:12 Three risk buckets: misuse, errors, systemic effects
24:58 Fragility & flash-crash analogies in AI systems
30:37 Do we really understand how models think? (Transformers 101)
34:16 What AI is teaching us about human intelligence
36:46 Brains vs. neural nets: similarities & differences
43:57 Embodiment and why robotics is still hard
46:28 Augmentation vs. replacement in white-collar work
50:14 Trust as social agency — why humans must stay in the loop
52:49 Where to find Christopher & closing thoughts



