
Alex Bores

2 episodes · 2 podcasts

We have 2 summarized appearances for Alex Bores so far.

Featured On 2 Podcasts

All Appearances

2 episodes
The Ezra Klein Show

Why Are Palantir and OpenAI Scared of Alex Bores?

The Ezra Klein Show
93 min · New York State Assembly Member, AI Regulation Advocate

AI Summary

→ WHAT IT COVERS

New York State Assembly member Alex Bores, author of the RAISE Act — one of the first AI safety laws passed by any U.S. state — discusses AI regulation, job displacement, an AI dividend proposal, and the $2.5 million super PAC funded by Palantir, OpenAI, and Andreessen Horowitz cofounders actively working to end his congressional campaign.

→ KEY INSIGHTS

- **AI Industry Political Spending:** The super PAC "Leading the Future," funded by Palantir cofounder Joe Lonsdale and OpenAI cofounder Greg Brockman, has spent $2.5 million targeting Bores and has signaled willingness to spend up to $10 million. Their stated goal is to make an example of any legislator who passes AI regulation, causing future politicians to avoid the issue entirely — regardless of how popular regulation polls across party lines.
- **RAISE Act Baseline Requirements:** New York's RAISE Act mandates that AI developers publish and adhere to safety plans following industry best practices, report critical safety incidents to government before harm occurs, and document model testing results. Two stronger provisions — prohibiting release of models that fail developers' own safety tests, and requiring third-party audits similar to SOC 2 audits in finance — were removed during negotiation, leaving Bores to describe the final bill as embarrassingly weak given the stakes.
- **AI Dividend Funding Mechanisms:** Bores proposes funding a universal basic income through three parallel mechanisms: a wealth tax on AI company owners, a token tax on commercial AI usage that charges per AI inference rather than taxing human labor, and out-of-the-money government warrants in major AI companies that only pay off if those companies reach extreme valuations — structuring the instrument so it reads as upside participation rather than wealth seizure after the fact.
- **Data Center Grid Leverage:** Rather than opposing data center construction, Bores proposes using AI companies' urgency for grid interconnection as leverage. Companies that build to a high renewable energy threshold and pay a grid resilience fee beyond standard connection costs get moved to the front of the interconnection queue. Competitors who do not meet those standards get moved to the back, creating a market incentive without direct mandates and using private capital to fund grid modernization that taxpayers and ratepayers currently bear.
- **Government AI Capacity Gap:** AI is the first major technology developed almost entirely outside government structures — unlike the internet, which originated from DARPA's ARPANET, or the space race. Bores argues governments must hire AI experts at competitive salaries, build shared GPU clusters for public universities (New York's Empire AI initiative is a model), and direct grants toward alignment research. Without internal expertise, legislatures cannot write effective regulation or deploy AI for public benefit in areas like drug discovery or tax filing.
- **Bipartisan AI Polling Reality:** Surveys consistently show roughly 10% of Americans want AI development halted entirely, 10% want no restrictions, and 80% want to see benefits while managing risks and slowing the pace. This distribution has remained stable across Republicans, Democrats, and Independents longer than most polarizing issues. Bores frames this as an actionable political window — the constituency for thoughtful regulation is large and bipartisan, but only if legislators are willing to act before AI companies accumulate enough political spending power to close that window.
- **Writing Skill Degradation as Governance Risk:** Bores observes that AI-assisted writing is eroding the cognitive practice of forming and editing ideas through composition. In hiring contexts, cover letters have become unreliable signals of analytical ability. In legislative contexts, he notes that assigning take-home essays no longer develops critical thinking. His proposed response includes requiring essays written by hand or in keystroke-trackable platforms like Google Docs, and treating writing instruction as a policy priority rather than a pedagogical detail, because writing is the mechanism through which people clarify and strengthen their thinking.

→ NOTABLE MOMENT

Bores describes how, while leading Palantir's Department of Justice project under the Obama administration, he personally refused to allow the software to be used for civil immigration enforcement when the Trump DOJ requested it in early 2017 — a refusal his contract permitted. He later resigned in 2019 when Palantir executives declined to write deportation restrictions into an ICE contract renewal.

💼 SPONSORS: The New York Times — https://nytimes.com/gift

🏷️ AI Regulation, Tech Industry Political Spending, RAISE Act, AI Dividend, Job Displacement, Government Technology Capacity, Palantir

AI Summary

→ WHAT IT COVERS

New York Assembly Member Alex Bores discusses his sponsorship of the RAISE Act, an AI safety law, and the resulting $10 million campaign against him by Leading the Future, a Silicon Valley super PAC backed by a16z, OpenAI's Greg Brockman, and Joe Lonsdale, revealing the financial scale of industry opposition to state-level AI regulation.

→ KEY INSIGHTS

- **Regulatory landscape split:** Public opinion on AI breaks into three distinct camps: roughly 10% want AI eliminated entirely, 10% (represented by Leading the Future) want zero regulation, and 80% want measured oversight ensuring broad societal benefit. Politicians targeting that 80% majority face disproportionate financial opposition from the minority "let it rip" faction funding super PACs.
- **RAISE Act scope:** The law applies only to AI companies with over $500 million in revenue that have built sufficiently large frontier models — currently just Google, Meta, OpenAI, xAI, and Anthropic. Requirements include publishing and adhering to a safety plan and reporting catastrophic safety incidents that cause or could imminently cause injury or death.
- **Money asymmetry in AI politics:** Leading the Future has raised $125 million and pledged at least $10 million specifically against Bores, while the pro-regulation Public First Action PAC (backed by $20 million from Anthropic) has spent under $500,000 supporting him — a 20-to-1 spending ratio that signals deliberate intimidation of state legislators.
- **Deepfake policy solution:** Bores argues deepfakes are a solvable problem through mandatory adoption of C2PA, a free, open-source metadata standard already created by industry. New York legislation requiring content provenance via C2PA is near passage, with the governor including a version in her budget — a replicable model other states can adopt immediately.
- **AI training data transparency:** Bores is advancing a bill requiring AI models to disclose basic training data information, including whether copyrighted material or personally identifiable information was used. A similar California bill passed the state assembly unanimously before stalling in the senate, suggesting broad bipartisan support exists for this specific, narrow disclosure requirement.

→ NOTABLE MOMENT

Bores revealed that when Leading the Future announced its campaign against him, he proactively sought meetings with all founding members through mutual contacts. Only one agreed to a fifteen-minute conversation. The group then shifted its ad strategy from opposing AI regulation to attacking his prior employment at Palantir over its ICE contracts — despite Palantir co-founder Joe Lonsdale funding the PAC.

💼 SPONSORS: LifeLock — https://lifelock.com/podcast

🏷️ AI Regulation, State Legislation, Political Lobbying, AI Safety, Tech Policy
