16. Unmasking AI: Angeline Corvaglia on Bias, Emotional Design, and Protecting Your Unique Voice

Duration: 45:20
Angeline Corvaglia, founder of Data Girl and Friends and the soon-to-be-announced SHIELD, joins us to explore how AI education can equip the next generation to stay thoughtful, self-aware, and socially grounded in an age of algorithmic influence.

With a background in global finance and digital transformation, Angeline now works at the intersection of AI literacy and youth empowerment. In this expansive conversation, she shares how emotional design, biased data labeling, and chatbot companions are already shaping young minds, and what parents, teachers, and communities can do about it.

Angeline’s insights challenge us to center student voice, rethink “neutral” tech, and reclaim our inner compass in a time of persuasive machines.

Highlights:

  • The chatbot conversation that sparked her shift from CFO to AI literacy advocate
  • How young people are unknowingly outsourcing critical thinking to chatbots
  • What educators need to know about AI bias, trust, and voice development
  • Global stories from Nigeria to Italy revealing AI’s cultural blind spots
  • Simple metaphors (like cookie crumbs!) that make data concepts stick