What the Latest AI Regulations Mean for NH Businesses

Mar 30, 2026

AI regulation is moving faster than most people expected. A year ago, the conversation was mostly theoretical — academics and policy wonks debating hypotheticals while businesses quietly kept building. Now? There are actual rules on the table, and some of them have teeth.

If you run a business in New Hampshire, or you're building AI-powered products here, it's worth taking a hard look at what's changed and what it actually means for your day-to-day operations.

The Federal Landscape Is Still a Patchwork

Let's be honest — the U.S. federal government hasn't exactly moved with urgency on AI regulation. The Biden-era Executive Order on AI from October 2023 set some meaningful precedents around safety testing and transparency for high-risk AI systems, but the Trump administration rescinded it in early 2025 in favor of a deregulatory, competitiveness-first approach — which sounds great on paper but leaves a lot of ambiguity for businesses trying to figure out what's actually required of them.

The result is a patchwork. There's no single sweeping federal AI law yet, but there is a growing pile of sector-specific guidance from agencies like the FTC, FDA, and EEOC — each with its own interpretation of how existing laws apply to AI. If you're using AI in hiring, lending, healthcare, or advertising, you're already operating under some form of oversight, whether you realize it or not.

What New Hampshire Is (and Isn't) Doing

New Hampshire has generally taken a business-friendly, hands-off approach to tech regulation, and AI is no exception so far. There's no state-level AI law on the books right now. But that doesn't mean NH businesses are operating in a vacuum.

Other states are moving, though — including some close to home. Colorado passed its AI Act in 2024, focused on algorithmic discrimination in high-stakes decisions. Connecticut has been pushing similar legislation, and Massachusetts has had multiple AI bills in committee. These state-level moves matter for NH companies because if you're selling into those markets or employing people there, their rules can apply to you.

It's the kind of thing that sneaks up especially on small and mid-sized businesses. You don't think of yourself as an "AI company" — you're just using an AI tool to screen resumes or generate customer communications — but that might be enough to put you in scope.

The EU AI Act Is More Relevant Than You Think

Okay, hear me out on this one. A lot of NH business owners see "EU regulation" and immediately tune out. But if your company has any European customers, partners, or even uses AI tools built by European companies, the EU AI Act starts to matter.

The Act classifies AI systems by risk level — unacceptable, high, limited, and minimal. High-risk applications (think: CV screening, credit scoring, medical devices, critical infrastructure) face strict requirements around transparency, human oversight, and documentation. The fines for non-compliance are significant — up to 35 million euros or 7% of global turnover for the most serious violations.

[Figure: EU AI Act risk classification pyramid — four tiers from minimal to unacceptable risk, with examples.]

Even if you're not directly subject to it, many of the AI vendors you rely on are building compliance into their products right now. That means changes to how tools behave, what data they can use, and what documentation they require from you as the deployer.

Three Practical Things NH Businesses Should Do Right Now

So what do you actually do with all this? A few things that aren't overly complicated but genuinely matter:

1. Audit what AI you're actually using. This sounds obvious, but most organizations haven't done it. Shadow AI is rampant — employees using ChatGPT, Copilot, or third-party tools without IT or leadership knowing. You can't manage risk you can't see. Do a quick inventory: what tools are being used, by whom, for what decisions?

2. Pay attention to high-stakes use cases. Not all AI use carries the same risk. Using AI to generate a first draft of a newsletter? Pretty low stakes. Using it to score job applicants or flag customer accounts for fraud? That's where regulatory scrutiny is concentrated. Understand where your AI touches decisions that affect people's livelihoods, access to services, or safety.

3. Start building documentation habits now. One of the most consistent requirements across different regulatory frameworks — EU, FTC guidance, sector-specific rules — is documentation. What data was the model trained on? How was it tested? Who's responsible for monitoring it? Companies that already have this documented will have a much easier time when compliance requirements firm up. Companies that don't will be scrambling.
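If it helps to make this concrete, the three steps above can be combined into one lightweight inventory record per AI use. This is a minimal illustrative sketch, not a compliance artifact — all the names here (`AIUseRecord`, the field names, the tools like `ResumeRank`) are made up for the example:

```python
# A minimal sketch of an AI-use inventory: one record per tool/use,
# with a flag for the high-stakes cases that deserve extra documentation.
from dataclasses import dataclass, field

@dataclass
class AIUseRecord:
    tool: str                 # e.g. "ChatGPT", "ResumeRank" (hypothetical)
    owner: str                # who is accountable for this use
    purpose: str              # what decision or task the tool supports
    affects_people: bool      # does output touch hiring, credit, safety, services?
    vendor_docs: list[str] = field(default_factory=list)  # model cards, test reports

    @property
    def high_stakes(self) -> bool:
        # Uses that affect people's livelihoods or access to services are
        # where regulatory scrutiny concentrates (step 2 above).
        return self.affects_people

inventory = [
    AIUseRecord("ChatGPT", "Marketing", "newsletter first drafts",
                affects_people=False),
    AIUseRecord("ResumeRank", "HR", "scoring job applicants",
                affects_people=True, vendor_docs=["vendor_model_card.pdf"]),
]

# Review high-stakes uses first.
for record in sorted(inventory, key=lambda r: not r.high_stakes):
    flag = "HIGH-STAKES" if record.high_stakes else "low-stakes"
    print(f"[{flag}] {record.tool} ({record.owner}): {record.purpose}")
```

Even a spreadsheet with these same columns gets you most of the value — the point is having the inventory and the documentation trail before a regulator or enterprise customer asks for it.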

The Opportunity Hidden in the Compliance Conversation

Here's a take you might not hear often: responsible AI practices aren't just a compliance burden, they're actually a competitive advantage in a lot of markets. Enterprise customers, government contracts, healthcare clients — they're increasingly asking vendors about AI governance before signing deals. Having clear answers is a differentiator.

New Hampshire has a real shot at being a place where businesses build AI thoughtfully and use that as a selling point. We're not Silicon Valley, and we're not trying to be. But there's a growing market of customers who want AI tools built by people who actually thought about the consequences. That's an audience worth speaking to.

Stay Engaged — This Is Moving Fast

The regulatory environment six months from now will probably look different from today. New state laws are advancing, federal agencies are issuing new guidance, and the EU AI Act is rolling out in phases through 2027. Staying informed isn't optional anymore if you're building with AI or making business decisions based on it.

That's honestly one of the best reasons to stay plugged into communities like this one. Not just for the technical side of AI, but for the practical, business-facing questions that don't always get enough airtime. What rules apply to me? What do I actually need to do? Those questions deserve real answers, not just more hype.

We'll keep covering this as things develop — and if you've got specific questions about how regulations might affect your industry or use case, bring them to the next meetup. That's exactly the kind of conversation worth having in person.