Discussion about this post

The AI Architect

Brilliant synthesis of pre-deep learning ML history. The kernel trick explanation really captures why academics gravitated toward SVMs over neural nets: there was actual theory to publish. It's fascinating how XGBoost managed to dominate Kaggle for years even after AlexNet, which shows how much inertia there was in the industry despite the obvious potential of deep learning.

Aseem

This does a great job of getting to the heart of ML research motivations, which are often obscured either by overly complex papers or basic blog posts. Looking forward to the next post!
