Backpropagation: The Engine Behind Modern AI
An accessible, concise tour of backpropagation: how the forward pass computes outputs, how the backward pass uses the chain rule to compute gradients efficiently, and why caching intermediates matters. A quick history from 1960s–70s precursors through Paul Werbos to the 1986 Rumelhart–Hinton–Williams breakthrough, with NETtalk and TD-Gammon as milestones. We also discuss limitations like local minima and vanishing/exploding gradients, and what these mean for today's huge models. Brought to you by Embersilk.
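The forward/backward idea the episode describes can be sketched in a few lines. Below is a minimal scalar example (not from the episode, function names are illustrative): the forward pass computes an output and caches intermediates, and the backward pass reuses those cached values with the chain rule instead of recomputing anything.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(w, b, x):
    # Forward pass: compute the output and cache intermediates for backprop.
    z = w * x + b          # pre-activation (cached)
    y = sigmoid(z)         # activation (cached)
    return y, (x, z, y)

def backward(cache, t):
    # Backward pass: apply the chain rule, reusing cached intermediates.
    x, z, y = cache
    dL_dy = 2.0 * (y - t)  # derivative of squared error (y - t)^2
    dy_dz = y * (1.0 - y)  # sigmoid'(z), computed from the cached y
    dL_dz = dL_dy * dy_dz
    dL_dw = dL_dz * x      # chain rule through z = w*x + b
    dL_db = dL_dz
    return dL_dw, dL_db

# One gradient-descent step on a single example.
w, b, lr = 0.5, 0.0, 0.1
x, t = 1.0, 1.0
y, cache = forward(w, b, x)
dw, db = backward(cache, t)
w, b = w - lr * dw, b - lr * db
```

Caching `y` pays off here: the sigmoid's derivative is `y * (1 - y)`, so the backward pass never has to re-evaluate the exponential from the forward pass.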
Note: This podcast was AI-generated, and sometimes AI can make mistakes. Please double-check any critical information.