According to Gizmodo, a new super PAC called Leading The Future launched in August with more than $100 million to ensure pro-AI outcomes in the 2026 midterm elections. The group is backed by Andreessen Horowitz, OpenAI president Greg Brockman, Palantir co-founder Joe Lonsdale, and AI search company Perplexity. Its first target is New York assembly member Alex Bores, who co-sponsored the RAISE Act now awaiting Governor Kathy Hochul’s approval. This week, the PAC’s advocacy arm, Build American AI, launched a $10 million campaign pushing Washington to adopt “a uniform national approach to AI.” The effort comes as Republicans revive calls for a moratorium on state AI laws, with President Donald Trump endorsing the idea on Truth Social last week.
Silicon Valley’s Playbook
Here’s the thing – this isn’t Silicon Valley’s first rodeo when it comes to shaping regulation through political spending. It’s running the same playbook that worked for the crypto super PAC Fairshake, which scored significant wins in the 2024 elections. And it makes perfect sense when you think about it. Why fight 50 different state battles when you can push for one federal standard that’s more likely to be industry-friendly?
The timing is everything here. With no comprehensive federal AI regulation in place, states like New York and California are moving ahead with their own safety measures. The industry sees this as innovation-stifling bureaucracy; the states argue they’re filling a vacuum. So now we’ve got a massive financial war chest aimed at stopping what the industry calls a “patchwork” of regulations before it becomes entrenched.
The Political Battle
What’s fascinating is how cross-partisan this effort looks, even as it apparently irritates the White House. The super PAC is targeting a Democratic primary, but it’s also aligning with Republican efforts to impose a moratorium on state AI laws. Trump’s already on board, and there’s talk of either a standalone bill or attaching the moratorium to must-pass legislation like the National Defense Authorization Act.
But here’s where it gets tricky. Some Republicans actually support AI child safety laws, and a blanket moratorium could jeopardize those too. The result is a strange political alignment: industry interests and certain partisan agendas converging, potentially at the expense of targeted safety measures that genuinely have bipartisan support.
The Federal Push
The $10 million ad campaign for a “uniform national approach” sounds reasonable on the surface. Who wants 50 different sets of rules? But the devil’s in the details – that uniform approach would almost certainly override stricter state regulations. And let’s be honest, when industry groups push for “uniform standards,” they’re typically advocating for the least restrictive version possible.
Trump’s potential executive order creating an “AI Litigation Task Force” to sue states over their AI laws adds another layer. That would be a pretty aggressive move – using federal power to actively dismantle state regulations. Meanwhile, his “Genesis Mission” order directing AI toward government science challenges shows an administration all-in on AI acceleration, with a clear theme of centralization running through everything.
What’s at Stake
Look, I get why the industry wants consistency. Dealing with different regulations in every state is a compliance nightmare. But here’s my question: is the goal truly consistency, or is it weaker oversight? Because when you have groups like Build American AI entirely dedicated to this single legislative agenda, you have to wonder whose interests they’re really serving.
The reality is we’re watching a massive power grab play out in real time. With over $100 million and powerful backers, this super PAC could fundamentally reshape how AI gets regulated in America. And given how transformative this technology is becoming across every sector – from healthcare to manufacturing to industrial computing, where consistent hardware standards matter – the outcome of this fight will affect us all. The question is whether we end up with smart regulation or captured regulators.
