Alessia Falsarone on AI Explainability [Podcast]
Content provided by SCCE. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by SCCE or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://podcastplayer.com/legal.
By Adam Turteltaub

Why did the AI do that? It's a simple and common question, but the answer is often opaque, with people pointing to black boxes, algorithms, and other terms that only those in the know tend to understand.

Alessia Falsarone, a non-executive director of Innovate UK, says that's a problem. In cases where AI has run amok, the fallout is often worse because the company cannot explain why the AI made the decision it made or what data it was relying on. AI, she argues, needs to be explainable to regulators and the public, so that all sides can understand what the AI is doing (or has done) and why.

To make AI more explainable, she recommends creating a dashboard showing the factors that influence the decisions the model makes. Teams also need to track the changes made to the model over time. That way, when a regulator or the public asks why something happened, the organization can respond quickly and clearly. And by embracing a more transparent process and involving compliance early, organizations can head off potential AI issues before they escalate.

Listen in to hear her explain the virtues of explainability.
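Her recommendation boils down to keeping two records side by side: the factors behind each automated decision, and the changes made to the model over time. The sketch below illustrates one way that pairing might look in practice; the class and field names are illustrative assumptions, not anything prescribed in the episode.

```python
# Minimal sketch of an explainability record: for each automated decision,
# capture the factors that influenced it and the model version that produced
# it, plus an audit trail of changes made to the model over time.
# All names here are assumptions for illustration only.

from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Dict, List


@dataclass
class ModelChange:
    """One entry in the audit trail of changes made to the model."""
    timestamp: str
    version: str
    description: str  # e.g. "retrained on Q3 data", "removed ZIP-code feature"


@dataclass
class DecisionRecord:
    """One automated decision, with the factors that influenced it."""
    timestamp: str
    model_version: str
    inputs: Dict[str, float]          # the data the model relied on
    factor_weights: Dict[str, float]  # per-feature contribution to this decision
    outcome: str


class ExplainabilityDashboard:
    """Collects decisions and model changes so 'why did the AI do that?' can be answered quickly."""

    def __init__(self) -> None:
        self.decisions: List[DecisionRecord] = []
        self.model_changes: List[ModelChange] = []

    def log_model_change(self, version: str, description: str) -> None:
        self.model_changes.append(
            ModelChange(datetime.now(timezone.utc).isoformat(), version, description)
        )

    def log_decision(self, model_version: str, inputs: Dict[str, float],
                     factor_weights: Dict[str, float], outcome: str) -> None:
        self.decisions.append(
            DecisionRecord(datetime.now(timezone.utc).isoformat(),
                           model_version, inputs, factor_weights, outcome)
        )

    def explain(self, index: int) -> str:
        """Summarize a single decision: which model made it, and on what factors."""
        d = self.decisions[index]
        ranked = sorted(d.factor_weights.items(), key=lambda kv: abs(kv[1]), reverse=True)
        factors = ", ".join(f"{name} ({weight:+.2f})" for name, weight in ranked[:3])
        return (f"Decision '{d.outcome}' by model {d.model_version} at {d.timestamp}; "
                f"top factors: {factors}")


if __name__ == "__main__":
    dash = ExplainabilityDashboard()
    dash.log_model_change("v1.2", "retrained with updated income data")
    dash.log_decision("v1.2",
                      inputs={"income": 42000, "debt_ratio": 0.35},
                      factor_weights={"income": 0.6, "debt_ratio": -0.9},
                      outcome="loan denied")
    print(dash.explain(0))
```

Keeping the change log next to the decision log is what lets an organization tie a questionable outcome back to the specific model version, and the data, that produced it.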