#007: AI, Open Source, and the Future of Embedded Development: How Much Code Will We Actually Write?

Duration: 55:14
 

In today's Coredump Session, we dive into a wide-ranging conversation about the intersection of AI, open source, and embedded systems with the teams from Memfault and Golioth. From the evolution of AI at the edge to the emerging role of large language models (LLMs) in firmware development, the panel explores where innovation is happening today — and where expectations still outpace reality. Listen in as they untangle the practical, the possible, and the hype shaping the future of IoT devices.

Speakers:

  • François Baldassari: CEO & Founder, Memfault
  • Thomas Sarlandie: Field CTO, Memfault
  • Jonathan Beri: CEO & Founder, Golioth
  • Dan Mangum: CTO, Golioth

Key Takeaways:

  • AI has been quietly powering embedded devices for years, especially in edge applications like voice recognition and computer vision.
  • The biggest gains in IoT today often come from cloud-based AI analytics, not necessarily from AI models running directly on devices.
  • LLMs are reshaping firmware development workflows but are not yet widely adopted for production-grade embedded codebases.
  • Use cases like audio and video processing have seen the fastest real-world adoption of AI at the edge.
  • Caution is warranted when integrating AI into safety-critical systems, where determinism is crucial.
  • Cloud-to-device AI models are becoming the go-to for fleet operations, anomaly detection, and predictive maintenance.
  • Many promising LLM-based consumer products struggle because hardware constraints and cloud dependence create friction.
  • The future of embedded AI may lie in hybrid architectures that balance on-device intelligence with cloud support.

Chapters:

00:00 Episode Teasers & Welcome

01:10 Meet the Panel: Memfault x Golioth

02:56 Why AI at the Edge Isn’t Actually New

05:33 The Real Use Cases for AI in Embedded Devices

08:07 How Much Chaos Are You Willing to Introduce?

11:19 Edge AI vs. Cloud AI: Where It’s Working Today

13:50 LLMs in Embedded: Promise vs. Reality

17:16 Why Hardware Can’t Keep Up with AI’s Pace

20:15 Building Unique Models When Public Datasets Fail

36:14 Open Source’s Big Moment (and What Comes Next)

42:49 Will AI Kill Open Source Contributions?

49:30 How AI Could Change Software Supply Chains

52:24 How to Stay Relevant as an Engineer in the AI Era

Join the Interrupt Slack

Watch this episode on YouTube

Suggest a Guest

Follow Memfault

Other ways to listen:

Apple Podcasts

iHeartRadio

Amazon Music

GoodPods

Castbox

Visit our website

