How I AI

How Coinbase scaled AI to 1,000+ engineers | Chintan Turakhia

58 min episode · 2 min read

Topics: Artificial Intelligence

AI-Generated Summary

Key Takeaways

  • Leadership by demonstration: Engineering leaders who mandate AI tool adoption without personal hands-on usage fail. Turakhia spent January through April 2025 using Cursor daily, identifying specific wins like automated PR creation from plain-language commands, then showed engineers concrete examples rather than issuing top-down directives. Credibility requires personal fluency before organizational change.
  • PR Speed Run format: To accelerate adoption, Turakhia ran a timed all-hands session where every engineer submitted one trivial PR using Cursor within 30 minutes. A team-level run produced 70 PRs; a company-wide run with 800 engineers produced 300–400 PRs. The format creates visible proof of velocity and breaks psychological inertia around AI tools.
  • Target toil first: AI adoption sticks when it eliminates work engineers already resent. Turakhia prioritized unit test generation, linting automation, and git command replacement as entry points. Removing these "soul-draining" tasks creates early wins that build habitual usage before engineers attempt more complex agent-driven workflows in their daily development cycles.
  • Cycle time compression: Coinbase reduced average PR review cycle time from 150 hours to roughly 15 hours — a 10x reduction — by combining AI-assisted authoring with structured review workflows. Turakhia measures AI impact through full ticket-to-user delivery time rather than lines of code, capturing every coordination, review, and deployment stage in one metric.
  • Slack as adoption infrastructure: Turakhia's team built an internal agent called Cloudbot that accepts commands directly in Slack, reads Linear tickets, queries Datadog, Sentry, Amplitude, and Snowflake, then authors PRs across multiple codebases. Routing AI workflows through Slack rather than separate tools makes adoption viral — colleagues observe outputs in real time and self-onboard organically.

What It Covers

Chintan Turakhia, Senior Director of Engineering at Coinbase, details how he drove Cursor adoption across 1,000+ engineers by leading with hands-on usage, running company-wide PR speed runs, and building custom Slack-based agents that compress the full cycle from user feedback to shipped code.


Notable Moment

During a live user call, Turakhia authored and shipped a fix before the 30-minute conversation ended, then asked the user to reload the app to confirm the change. The story reframes what "fast feedback loops" actually means in practice for product teams.
