43. How Do You Assure AI For Bias and Accessibility in the NHS? With Adam Byfield
Manage episode 495665097 series 3459206
Adam Byfield is a Principal Technical Assurance Specialist at NHS England. After his previous appearance on the podcast, in which he discussed providing ethical assurance for AI applications in healthcare, we were keen to get him back to dive into some more specific issues. We chose bias and accessibility, two related issues that are clearly central for anyone concerned with AI, including in healthcare applications. We talked about different forms of bias, how bias can affect accessibility, and what forms of bias, if any, might be acceptable.
Ethics Untangled is produced by IDEA, The Ethics Centre at the University of Leeds.
Bluesky: @ethicsuntangled.bsky.social
Facebook: https://www.facebook.com/ideacetl
LinkedIn: https://www.linkedin.com/company/idea-ethics-centre/
81 episodes