Why Are Palantir and OpenAI Scared of Alex Bores?
Episode: 92 min · Read time: 3 min · Topic: Artificial Intelligence
AI-Generated Summary
Key Takeaways
- AI Industry Political Spending: The super PAC "Leading the Future," funded by Palantir cofounder Joe Lonsdale and OpenAI cofounder Greg Brockman, has spent $2.5 million targeting Bores and has signaled willingness to spend up to $10 million. Their stated goal is to make an example of any legislator who passes AI regulation, causing future politicians to avoid the issue entirely, regardless of how popular regulation polls across party lines.
- RAISE Act Baseline Requirements: New York's RAISE Act mandates that AI developers publish and adhere to safety plans following industry best practices, report critical safety incidents to government before harm occurs, and document model testing results. Two stronger provisions — prohibiting release of models that fail developers' own safety tests, and requiring third-party audits similar to financial SOC 2 audits — were removed during negotiation, leading Bores to describe the final bill as embarrassingly weak given the stakes.
- AI Dividend Funding Mechanisms: Bores proposes funding a universal basic income through three parallel mechanisms: a wealth tax on AI company owners, a token tax on commercial AI usage that charges per AI inference rather than taxing human labor, and out-of-the-money government warrants in major AI companies that pay off only if those companies reach extreme valuations — structuring the instrument so it reads as upside participation rather than after-the-fact wealth seizure.
- Data Center Grid Leverage: Rather than opposing data center construction, Bores proposes using AI companies' urgency for grid interconnection as leverage. Companies that build to a high renewable energy threshold and pay a grid resilience fee beyond standard connection costs move to the front of the interconnection queue; competitors that do not meet those standards move to the back. This creates a market incentive without direct mandates and uses private capital to fund grid modernization that taxpayers and ratepayers currently bear.
- Government AI Capacity Gap: AI is the first major technology developed almost entirely outside government structures — unlike the internet, which originated from DARPA's ARPANET, or the space race. Bores argues governments must hire AI experts at competitive salaries, build shared GPU clusters for public universities (New York's Empire AI initiative is a model), and direct grants toward alignment research. Without internal expertise, legislatures cannot write effective regulation or deploy AI for public benefit in areas like drug discovery or tax filing.
What It Covers
New York State Assembly member Alex Bores, author of the RAISE Act — one of the first AI safety laws passed by any U.S. state — discusses AI regulation, job displacement, an AI dividend proposal, and the $2.5 million super PAC funded by Palantir, OpenAI, and Andreessen Horowitz cofounders actively working to end his congressional campaign.
Key Questions Answered
- Bipartisan AI Polling Reality: Surveys consistently show roughly 10% of Americans want AI development halted entirely, 10% want no restrictions, and 80% want to see benefits while managing risks and slowing the pace. This distribution has remained stable across Republicans, Democrats, and Independents longer than most polarizing issues. Bores frames this as an actionable political window: the constituency for thoughtful regulation is large and bipartisan, but only if legislators act before AI companies accumulate enough political spending power to close that window.
- Writing Skill Degradation as Governance Risk: Bores observes that AI-assisted writing is eroding the cognitive practice of forming and editing ideas through composition. In hiring contexts, cover letters have become unreliable signals of analytical ability; in legislative contexts, assigning take-home essays no longer develops critical thinking. His proposed response includes requiring essays written by hand or in keystroke-trackable platforms like Google Docs, and treating writing instruction as a policy priority rather than a pedagogical detail, because writing is the mechanism through which people clarify and strengthen their thinking.
Notable Moment
Bores describes how, while leading Palantir's Department of Justice project under the Obama administration, he personally refused to allow the software to be used for civil immigration enforcement when the Trump DOJ requested it in early 2017 — a refusal his contract permitted. He later resigned in 2019 when Palantir executives declined to write deportation restrictions into an ICE contract renewal.