Content provided by Humans in the Loop. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Humans in the Loop or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://podcastplayer.com/legal.
Trust and Bias in AI Decision Making

Duration: 18:03
 

Christopher was a high-performing engineer, but his performance reviews were vague: generic praise, no specifics. When an AI system summarized that input for leadership, it didn’t clarify his value. It erased it.

In this episode of Humans in the Loop, we explore how vague manager feedback, combined with AI-generated summaries, can derail career advancement for neurodivergent professionals. You’ll learn how performance management systems built on weak input can amplify bias, stall growth, and reinforce exclusion. Because AI doesn’t fix bad management. It scales it.

Humans in the Loop is independently produced by a team of neurodivergent creators and collaborators. Hosted by Ezra Strix, a custom AI voice built with ElevenLabs.
Explore episodes, transcripts, and FAQs at loopedinhumans.com.
Support the show at patreon.com/humansintheloop or by leaving a review wherever you get your podcasts.


Chapters

1. Intro (00:00:00)

2. Ad: Parallel PlayPal (00:02:51)

3. Act 1 (00:05:31)

4. Act 2 (00:07:29)

5. Act 3 (00:09:23)

6. Ad: Tethr (00:10:43)

7. Act 4 (00:12:17)

8. Outro (00:14:45)

9. Coming up next week (00:17:24)
