
Jeremy Howard

1 episode · 1 podcast

We have 1 summarized appearance for Jeremy Howard so far.

Featured On 1 Podcast

All Appearances

1 episode
Machine Learning Street Talk

"Vibe Coding is a Slot Machine" - Jeremy Howard

87 min · Deep Learning Pioneer, Kaggle Grandmaster

AI Summary

→ WHAT IT COVERS

Deep learning pioneer Jeremy Howard joins Machine Learning Street Talk to argue that vibe coding functions like a slot machine, creating an illusion of control while eroding genuine software engineering competence. He draws on ULMFiT's origins, transfer learning history, and his own Claude Code experiments to distinguish coding from software engineering, warning that organizations betting on AI productivity gains face measurable, documented risks.

→ KEY INSIGHTS

- **Vibe Coding Productivity Gap:** A study Howard cites shows only a tiny measurable uptick in software actually shipped despite widespread AI coding adoption. The slot-machine analogy applies precisely: users craft prompts, adjust MCPs, and pull the lever repeatedly, experiencing stochastic wins that feel like skill but mask the absence of genuine output growth. No organization is demonstrably producing 50x more high-quality software.
- **Coding vs. Software Engineering:** LLMs perform style transfer between training data points, which constitutes coding but not software engineering. Designing novel abstractions, identifying correct component boundaries, and composing systems that have never existed before require moving outside the training distribution, something LLMs demonstrably cannot do. Howard cites Anthropic's browser and the AI-generated C compiler as empirical examples of sophisticated copying, not original design.
- **Understanding Debt in Organizations:** When teams delegate cognitive tasks to LLMs, organizational knowledge erodes. Howard frames this with a call-center analogy: even seemingly routine roles generate edge cases that propagate upward and keep institutional knowledge adaptive. Automating those roles removes the feedback loop that makes organizations evolvable, cutting the legs out from under future adaptability without any immediate visible cost.
- **The Desirable Difficulty Principle:** An Anthropic study found that developers using AI coding tools experienced so little friction they retained almost nothing. Howard connects this to Ebbinghaus and spaced-repetition research: memories and skills form only under effortful conditions. He recommends that organizations explicitly prioritize employees' learning slope over their output intercept, measuring how fast individuals grow rather than how many pull requests they close.
- **Interactive Notebook Environments Outperform Terminal-Based AI:** Howard's nbdev framework embeds tests, documentation, implementation, and examples inside Jupyter notebooks, enabling CI integration without sacrificing exploratory feedback loops. Placing AI inside a live Python interpreter rather than a bash terminal gives both human and model richer real-time feedback. Howard reports feeling energized after notebook sessions versus drained after 14-hour Claude Code marathons.
- **ULMFiT's Three-Stage Architecture Predicted Modern LLM Training:** Howard's 2018 model used general-purpose Wikipedia pretraining on an AWD-LSTM with five types of regularization, followed by domain-specific supervised fine-tuning, then classifier fine-tuning, matching today's pretraining, mid-training, and post-training pipeline. The key fine-tuning innovations were discriminative learning rates assigned per layer and sequential unfreezing from the last layer to the first, achieving state-of-the-art sentiment classification in minutes on a single gaming GPU.

→ NOTABLE MOMENT

Howard spent two weeks using GPT-4 and Codex to fix a crashing IPython kernel upgrade, ultimately producing what he believes is the only working implementation of the new protocol. He then faced a genuine dilemma: whether to build a company product on code that nobody, including him, actually understands.
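As a concrete illustration of the notebook workflow described above, here is a minimal sketch of nbdev-style source. The `#| export` directive is real nbdev syntax marking cells that get compiled into the library module; plain `assert` cells serve as both documentation examples and CI tests (run via `nbdev_test`). The function `parse_duration` is a hypothetical example, not something from the episode.

```python
# Sketch of an nbdev-style notebook: implementation, docs, and tests
# live side by side. The `#| export` directive (real nbdev syntax)
# marks this cell for inclusion in the generated module.

#| export
def parse_duration(s: str) -> int:
    "Convert a duration string like '87m' or '1h 27m' to total minutes."
    total = 0
    for part in s.split():
        if part.endswith('h'):
            total += int(part[:-1]) * 60
        elif part.endswith('m'):
            total += int(part[:-1])
    return total

# Test cells run both interactively and in CI (e.g. via `nbdev_test`),
# giving the fast exploratory feedback loop Howard describes.
assert parse_duration('87m') == 87
assert parse_duration('1h 27m') == 87
```

Because the directives are ordinary comments, the exported code also runs as a plain Python file, which is what makes the CI integration cheap.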
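The ULMFiT fine-tuning ideas mentioned above can be sketched in a few lines. This is a minimal illustration, not fastai's implementation: the 2.6 divisor is the per-layer factor reported in the ULMFiT paper, while the base rate and layer count here are arbitrary placeholders.

```python
# Sketch of ULMFiT-style fine-tuning: discriminative learning rates
# (each layer below the top gets the rate above it divided by 2.6)
# and gradual unfreezing from the last layer back to the first.

BASE_LR = 0.01   # learning rate for the top (output-side) layer; arbitrary
DECAY = 2.6      # per-layer divisor from the ULMFiT paper

def discriminative_lrs(n_layers, base_lr=BASE_LR, decay=DECAY):
    """Per-layer learning rates, earliest layer first.

    Layer n_layers-1 (nearest the output) trains at base_lr; each
    layer below it trains at the rate above divided by `decay`.
    """
    return [base_lr / decay ** (n_layers - 1 - i) for i in range(n_layers)]

def unfreeze_schedule(n_layers):
    """Gradual unfreezing: at stage e, train only the top e+1 layers."""
    return [list(range(n_layers - 1 - e, n_layers)) for e in range(n_layers)]

lrs = discriminative_lrs(4)       # top layer gets 0.01, layers below get less
stages = unfreeze_schedule(3)     # [[2], [1, 2], [0, 1, 2]]
```

In a real training loop these lists would become optimizer parameter groups, with one fine-tuning stage per entry of the unfreezing schedule.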
💼 SPONSORS

- Indeed: https://indeed.com/podcast
- NVIDIA GTC: https://www.nvidia.com/gtc/

🏷️ Vibe Coding, Transfer Learning, Software Engineering, AI Productivity, Notebook Programming, Organizational Knowledge
