HR, We Have a Problem - The unsurprising truth about AI bias in hiring vs human discrimination

41:02
 

In this episode of HR, We Have a Problem, Teri Zipper and Jeff Pole, Co-Founder & CEO of Warden AI, discuss the reality of AI bias in talent acquisition and hiring processes. The conversation covers the shared responsibility between vendors and organizations when implementing AI tools, practical guidance for evaluating AI systems, and the importance of third-party auditing in building trust.

Key points covered include:

↪️ AI systems performed more fairly across demographic groups (sex, gender, race, and ethnicity) than traditional human-driven hiring processes, with 85% of AI systems passing standard fairness thresholds.

↪️ Only 14 AI-related employment discrimination cases emerged over five years, compared to hundreds of thousands of traditional discrimination claims, putting current legal concerns in perspective.

↪️ AI vendors and organizations share responsibility for safe deployment: vendors must build sound systems, while users must implement them correctly with proper governance.

↪️ Organizations should involve legal, IT, and risk teams early in AI vendor evaluations, asking for AI explainability statements and understanding how vendors would support clients during potential discrimination claims.

Don’t miss this exciting thought leader conversation! Follow the hosts and companies mentioned below:

Sapient Insights Group

Download the 2023-24 HR Systems Survey White Paper

Instagram | Twitter | LinkedIn

Teri Zipper

Instagram | Twitter | LinkedIn

Jeff Pole

LinkedIn

Warden AI

LinkedIn
