
Emad Mostaque

1 episode · 1 podcast

We have 1 summarized appearance for Emad Mostaque so far. Browse all podcasts to discover more episodes.

Featured On 1 Podcast

All Appearances

1 episode

AI Summary

→ WHAT IT COVERS

The White House Genesis Mission launches to unite federal supercomputers and datasets for AI-driven scientific discovery. Anthropic releases Claude Opus 4.5 with a 76% gain in token efficiency, outperforming human engineers on coding benchmarks as recursive self-improvement accelerates.

→ KEY INSIGHTS

- **Genesis Mission Structure:** The Department of Energy connects US supercomputers and federal scientific datasets into a unified AI platform targeting biotech, fusion, and quantum computing. The goal is to double American scientific productivity within decades through coordinated compute resources and unlocked government data enclaves for pretraining models.
- **Claude Opus 4.5 Performance:** The new model scores 52% on SWE-Bench Pro without reasoning tokens, surpassing previous versions that required reasoning. Cost drops 67% to $25 per million tokens. Multi-agent orchestration reaches 88% when Opus coordinates Haiku or Sonnet agents, enabling swarm architectures.
- **Recursive Self-Improvement Threshold:** Anthropic reports that incoming employees on performance teams are now outperformed by AI on key homework assignments and tests. Frontier labs allocate more compute to AI researchers than to human researchers, marking a transition point where models improve themselves faster than humans can enhance them.
- **Variable Cost Economics:** AI enables businesses to operate with zero fixed costs through enterprise contracts that bill 30-60 days after service while charging customers upfront. The entire business stack, including tax compliance, financial forecasting, and payment balancing, automates within one year, enabling minute-scale company launches.
- **Brain-Computer Interface Velocity:** Paradromics achieves 200 bits per second of throughput in sheep trials, 20x faster than Neuralink's 10 bits per second. Foundation models trained on fMRI data decode human thought at one million voxels per second despite low spatial and temporal resolution, opening a noninvasive uploading pathway.

→ NOTABLE MOMENT

The panel reveals that multiple research groups, including Meta, now train foundation models directly on fMRI brain scans, capturing human thought patterns at one million voxels per second. This enables noninvasive mind uploading despite fMRI's limited resolution of one cubic millimeter spatially and one- to two-second temporal windows.

💼 SPONSORS: Blitsy (blitsy.com)

🏷️ AI Infrastructure, Brain-Computer Interfaces, Scientific Computing, Autonomous Coding, Economic Transformation
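The cost and throughput figures quoted above are internally consistent, which a few lines of arithmetic can confirm. This is a minimal sanity-check sketch: the $75-per-million-token prior price is implied by the quoted 67% drop to $25, not stated directly in the summary.

```python
# Sanity-check two figures quoted in the summary.
# The prior price below is an inference from "drops 67% to $25", not a quoted number.

old_cost_per_mtok = 75.0   # implied prior price, USD per million tokens (assumption)
new_cost_per_mtok = 25.0   # quoted Claude Opus 4.5 price
drop = (old_cost_per_mtok - new_cost_per_mtok) / old_cost_per_mtok
print(f"Cost reduction: {drop:.0%}")  # rounds to the quoted 67%

paradromics_bps = 200  # quoted sheep-trial throughput, bits per second
neuralink_bps = 10     # quoted Neuralink throughput, bits per second
print(f"Throughput ratio: {paradromics_bps / neuralink_bps:.0f}x")  # the quoted 20x
```

A $75/M-token baseline makes the drop exactly two-thirds, which rounds to the 67% the summary reports; the 20x brain-computer interface ratio follows directly from 200/10.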
