If you haven’t heard of it, Depth First Learning is a wonderful resource for learning about machine learning.
Grown out of a Google AI residency, the DFL program builds curricula around specific ML papers. If you were intrigued by the AlphaGoZero paper, for instance, but felt you couldn’t fully appreciate it – maybe you’re a little rusty on the Bellman equation, or you haven’t spent much time with Monte Carlo Tree Search – DFL has built an entire self-paced class, with background reading, lectures, and practice problems, that culminates in the paper itself. So far, they’ve built guides like this for DeepStack, InfoGAN, and TRPO.
Last year, we decided to sponsor Depth First Learning by funding grants for four fellows to create new curricula, and better yet, to use their new materials to run 6-week, open, online classes. To our delight, 113 people applied, and late last week, DFL announced the recipients.
- Steve Kroon - Stellenbosch (South Africa) - Variational Inference with Normalizing Flows
- Sandhya Prabhakaran - New York (USA) - Spherical CNN
- Bhairav Mehta - Montreal (Canada) - Stein Variational Gradient Descent
- Vinay Ramasesh, Piyush Patil, and Riley Edmunds - Berkeley (USA) - Resurrecting the sigmoid in deep learning through dynamical isometry
Congratulations! We can’t wait to take your new courses.
At Jane Street, technical education has always been a core part of the culture. It’s not just about having a library in the office (though that helps, too) – we’re constantly looking for ways to help engineers go a level deeper, whether that takes the form of talks, trips to conferences, or internal classes built, like DFL’s, around specific libraries and projects.
We’ve found that going “depth-first” is always better in the long run.