
Steffen Cruz

Eye on AI

#340 Steffen Cruz: Training AI Without Data Centres

46 min · Cofounder and CTO of Macrocosmos

AI Summary

→ WHAT IT COVERS

Steffen Cruz, CTO of Macrocosmos, explains how his company uses BitTensor's blockchain infrastructure to train large language models through distributed compute nodes worldwide, eliminating the need for centralized data centers and enabling cost arbitrage through surplus energy and idle consumer hardware like Mac minis and spare GPUs.

→ KEY INSIGHTS

- **Distributed Pretraining Economics:** Training large language models through geographically distributed nodes enables cost arbitrage unavailable to centralized data centers. When a facility builds out thousands of GPUs, training costs are fixed at construction. Distributed systems can target surplus energy pockets — such as Icelandic renewable energy available only 12 hours daily — reducing training costs to roughly 10–20% of conventional rates.
- **Model Parallelism Architecture:** Macrocosmos's IOTA system (Incentivized Orchestrated Training Architecture) splits models into small slivers across nodes rather than running full model copies on each machine. This approach allows training of frontier-scale models — targeting 70 billion parameters by mid-2025 and 100 billion-plus by 2026 — using consumer-grade hardware like Mac minis and CUDA-enabled GPUs.
- **Supply-Side GPU Utilization Strategy:** Cloud providers and neo-clouds with idle GPU inventory can plug surplus capacity into IOTA's network during rental gaps. Since training commands higher margins than inference token sales, providers earn better returns on underutilized hardware than selling compute at discounted spot rates, creating a direct bottom-line improvement without additional capital expenditure.
- **Consumer Passive Income via Train-at-Home:** Individuals with idle Mac minis, MacBooks, or consumer GPUs can download a one-click app, set availability windows — for example, 10PM to 6AM — and earn passive income contributing to model training runs. Macrocosmos reports 2,500 app downloads within the first two weeks, with the payout system rewarding participation proportionally to hours of compute contributed daily.
- **Blockchain as Coordination Layer, Not Compute:** The blockchain in BitTensor functions as an identity registry, synchronization clock, and transparent payout trigger — not as a compute or storage layer. Off-chain tracking records each node's contribution, then pushes verified totals on-chain to trigger token payouts. This architecture allowed Macrocosmos to scale beyond BitTensor's native 256-node limit to support thousands of simultaneous participants.

→ NOTABLE MOMENT

Cruz describes a near-future scenario where a personal AI agent, after completing its assigned tasks by mid-morning, autonomously decides to contribute the machine's idle compute to a training network and earns money before the owner returns home — reframing personal computers as proactive economic participants rather than passive tools.

💼 SPONSORS

None detected

🏷️ Distributed AI Training, BitTensor, Blockchain Infrastructure, Decentralized Compute, Large Language Models
