Building predictive models shouldn’t feel like wrestling with 20-year-old machine learning hacks. If you’ve ever burned weeks hand-engineering features and stitching fragile pipelines, you know how broken the old way is.
In this talk, we'll introduce Relational Graph Transformers and show a pretrained Relational Foundation Model that enables predictive modeling directly over your data warehouse, eliminating manual joins and feature engineering while boosting predictive accuracy by 30-50%. You'll see benchmarks, dive into architecture details, and watch a live demo of instant, production-grade predictions, turning months of work into hours.
Jure Leskovec is Professor of Computer Science at Stanford University. He is affiliated with the Stanford AI Lab, the Machine Learning Group, and the Center for Research on Foundation Models. In the past, he served as Chief Scientist at Pinterest and was an investigator at Chan Zuckerberg BioHub. Most recently, he co-founded the machine learning startup Kumo.AI.
Leskovec pioneered the field of Graph Neural Networks and created PyG, the most widely used graph neural network library. Research from his group has been used by many countries to fight the COVID-19 pandemic, and has been incorporated into products at Meta, Pinterest, Uber, YouTube, Amazon, and more.
His work has earned numerous awards, including the Microsoft Research Faculty Fellowship (2011), Okawa Research Award (2012), Alfred P. Sloan Fellowship (2012), Lagrange Prize (2015), ICDM Research Contributions Award (2019), and the ACM SIGKDD Innovation Award (2023). His research contributions span social networks, data mining, machine learning, and computational biomedicine with a focus on drug discovery. His work has won 12 best paper awards and 5 test-of-time awards at premier venues in these research areas.
Leskovec earned his bachelor's in Computer Science from the University of Ljubljana, his Ph.D. in Machine Learning from Carnegie Mellon University, and completed postdoctoral training at Cornell University.