
Why Are Palantir and OpenAI Scared of Alex Bores?
The Ezra Klein Show · AI Summary
→ WHAT IT COVERS

New York State Assembly member Alex Bores, author of the RAISE Act — one of the first AI safety laws passed by any U.S. state — discusses AI regulation, job displacement, an AI dividend proposal, and the $2.5 million super PAC funded by Palantir, OpenAI, and Andreessen Horowitz cofounders that is actively working to end his congressional campaign.

→ KEY INSIGHTS

- **AI Industry Political Spending:** The super PAC "Leading the Future," funded by Palantir cofounder Joe Lonsdale and OpenAI cofounder Greg Brockman, has spent $2.5 million targeting Bores and has signaled willingness to spend up to $10 million. Its stated goal is to make an example of any legislator who passes AI regulation, so that future politicians avoid the issue entirely — regardless of how popular regulation polls across party lines.
- **RAISE Act Baseline Requirements:** New York's RAISE Act mandates that AI developers publish and adhere to safety plans following industry best practices, report critical safety incidents to government before harm occurs, and document model testing results. Two stronger provisions — prohibiting the release of models that fail developers' own safety tests, and requiring third-party audits similar to SOC 2 audits in finance — were removed during negotiation, leaving Bores to describe the final bill as embarrassingly weak given the stakes.
- **AI Dividend Funding Mechanisms:** Bores proposes funding a universal basic income through three parallel mechanisms: a wealth tax on AI company owners; a token tax on commercial AI usage that charges per AI inference rather than taxing human labor; and out-of-the-money government warrants in major AI companies that pay off only if those companies reach extreme valuations — structuring the instrument so it reads as upside participation rather than after-the-fact wealth seizure.
- **Data Center Grid Leverage:** Rather than opposing data center construction, Bores proposes using AI companies' urgency for grid interconnection as leverage. Companies that build to a high renewable energy threshold and pay a grid resilience fee beyond standard connection costs move to the front of the interconnection queue; competitors that do not meet those standards move to the back. This creates a market incentive without direct mandates and uses private capital to fund grid modernization that taxpayers and ratepayers currently bear.
- **Government AI Capacity Gap:** AI is the first major technology developed almost entirely outside government structures — unlike the internet, which originated in DARPA's ARPANET, or the space race. Bores argues governments must hire AI experts at competitive salaries, build shared GPU clusters for public universities (New York's Empire AI initiative is a model), and direct grants toward alignment research. Without internal expertise, legislatures cannot write effective regulation or deploy AI for public benefit in areas like drug discovery or tax filing.
- **Bipartisan AI Polling Reality:** Surveys consistently show that roughly 10% of Americans want AI development halted entirely, 10% want no restrictions, and 80% want the benefits while managing risks and slowing the pace. This distribution has remained stable across Republicans, Democrats, and Independents longer than most polarizing issues. Bores frames it as an actionable political window — the constituency for thoughtful regulation is large and bipartisan, but only if legislators act before AI companies accumulate enough political spending power to close that window.
- **Writing Skill Degradation as Governance Risk:** Bores observes that AI-assisted writing is eroding the cognitive practice of forming and editing ideas through composition. In hiring contexts, cover letters have become unreliable signals of analytical ability; in legislative contexts, assigning take-home essays no longer develops critical thinking. His proposed response includes requiring essays written by hand or in keystroke-trackable platforms like Google Docs, and treating writing instruction as a policy priority rather than a pedagogical detail, because writing is the mechanism through which people clarify and strengthen their thinking.

→ NOTABLE MOMENT

Bores describes how, while leading Palantir's Department of Justice project under the Obama administration, he personally refused to allow the software to be used for civil immigration enforcement when the Trump DOJ requested it in early 2017 — a refusal his contract permitted. He later resigned in 2019 when Palantir executives declined to write deportation restrictions into an ICE contract renewal.

💼 SPONSORS

- The New York Times — https://nytimes.com/gift

🏷️ AI Regulation, Tech Industry Political Spending, RAISE Act, AI Dividend, Job Displacement, Government Technology Capacity, Palantir
