#313 Developing Better Predictive Models with Graph Transformers with Jure Leskovec, Pioneer of Graph Transformers, Professor at Stanford

51:05
 
Content provided by DataCamp. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by DataCamp or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://podcastplayer.com/legal.

The structured data that powers business decisions is more complex than the sequences processed by traditional AI models. Enterprise databases with their interconnected tables of customers, products, and transactions form intricate graphs that contain valuable predictive signals. But how can we effectively extract insights from these complex relationships without extensive manual feature engineering?

Graph transformers are revolutionizing this space by treating databases as networks and learning directly from raw data. What if you could build models in hours instead of months while achieving better accuracy? How might this technology change the role of data scientists, allowing them to focus on business impact rather than data preparation? Could this be the missing piece that brings the AI revolution to predictive modeling?
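To make the database-as-graph idea concrete, here is a minimal sketch (an illustration, not code from the episode) of how tables of customers, products, and transactions could be expressed as a heterogeneous graph using PyG, the PyTorch Geometric library mentioned below. The table names, sizes, and feature dimensions are assumed placeholders.

```python
# Illustrative sketch only: represent relational tables as a heterogeneous graph
# with PyG (PyTorch Geometric). All sizes and edges below are made-up placeholders.
import torch
from torch_geometric.data import HeteroData

data = HeteroData()

# Each table becomes a node type with its own feature matrix.
data["customer"].x = torch.randn(100, 16)  # 100 customers, 16 features each
data["product"].x = torch.randn(50, 32)    # 50 products, 32 features each

# Each transaction row becomes a typed edge from a customer to a product.
# edge_index has shape [2, num_edges]: row 0 = customer indices, row 1 = product indices.
data["customer", "bought", "product"].edge_index = torch.tensor(
    [[0, 1, 2],
     [5, 5, 9]]
)

print(data)  # prints the node and edge types with their tensor shapes
```

A graph neural network or graph transformer can then learn directly over this structure, rather than over hand-engineered features extracted from the tables.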

Jure Leskovec is a Professor of Computer Science at Stanford University, where he is affiliated with the Stanford AI Lab, the Machine Learning Group, and the Center for Research on Foundation Models.

Previously, he served as Chief Scientist at Pinterest and held a research role at the Chan Zuckerberg Biohub. He is also a co-founder of Kumo.AI, a machine learning startup. Leskovec has contributed significantly to the development of graph neural networks and co-created PyG (PyTorch Geometric), a widely used library for deep learning on graphs. Research from his lab has supported public health efforts during the COVID-19 pandemic and informed product development at companies including Facebook, Pinterest, Uber, YouTube, and Amazon.

His work has received several recognitions, including the Microsoft Research Faculty Fellowship (2011), the Okawa Research Award (2012), the Alfred P. Sloan Fellowship (2012), the Lagrange Prize (2015), and the ICDM Research Contributions Award (2019). His research spans social networks, machine learning, data mining, and computational biomedicine, with a focus on drug discovery. He has received 12 best paper awards and five 10-year Test of Time awards at leading academic conferences.

In the episode, Richie and Jure explore the need for a foundation model for enterprise data, the limitations of current AI models on predictive tasks, the potential of graph transformers for business data, the transformative impact of relational foundation models on machine learning workflows, and much more.

Links Mentioned in the Show:


New to DataCamp?


