Missouri lawmakers are right to take artificial intelligence seriously. AI raises legitimate questions around consumer protection, campaign communications, professional accountability, public trust, and the future of work. These issues deserve thoughtful consideration.
But Senate Bill 1012, as drafted, risks reaching much further than intended. Rather than narrowly targeting harmful uses of AI, it could create legal uncertainty for ordinary Missouri businesses simply trying to use modern tools to compete.
That matters because AI is no longer just a “Big Tech” issue. It is quickly becoming a Main Street issue.
I have worked directly with companies to build and integrate AI solutions that improve efficiency, help teams compete, and better serve customers. Many are not large — no massive legal departments or compliance teams. For them, AI is a practical tool. It helps draft customer communications, summarize meetings, prepare proposals, support scheduling, improve quality control, and reduce repetitive administrative work.
These are probably not the uses most Missourians think of when they hear concerns about AI. These companies are not deploying deepfakes. They are not building autonomous machines to replace professional judgment. They are using ordinary business tools that help them save time, reduce costs, and better serve customers.
That is where SB 1012 raises real concerns.
One of the most concerning provisions involves labeling. The bill would require notice when someone is or may be interacting with AI, or when a human is using AI during the interaction. That may sound reasonable, but the language could reach much further.
Would a manufacturer need to disclose that AI supported a warranty communication or notice? Would a small business need to label a social media post because AI helped draft the copy? The bill does not answer those questions clearly enough.
A closer look at the text reveals why. The definition of “artificial intelligence” is broad enough to capture any automated system that makes predictions, recommendations, or decisions — which describes most modern business software, not just AI. The disclosure requirement applies not only when AI interacts with a customer directly, but whenever a human employee is simultaneously using any such system during a customer interaction — meaning a salesperson with a CRM open, or a support rep with a knowledge base running in the background, could trigger notice obligations. The bill’s liability provisions attach to this same sweeping definition, assigning risk to small businesses using off-the-shelf automation tools. These are not edge cases that might arise in litigation. They are the plain text.
That lack of clarity creates major litigation risk. When businesses do not know what the law requires, they over-comply, delay adoption, or become targets. Small and mid-sized companies are especially vulnerable because they cannot easily absorb lawsuits or complex state-specific compliance programs.
The competitive problem runs in both directions. A contractor in Texas submitting a proposal doesn’t have to disclose that AI helped draft it. A firm in Tennessee doesn’t have to label a client email because an employee used AI to pull background research. Your Missouri customer doesn’t know any of that — they just see your disclosure and wonder what you’re hiding. Meanwhile, a Missouri company serving clients anywhere in the country carries this bill’s obligations with them, while out-of-state competitors do not. And companies deciding where to deploy AI-enabled products will weigh Missouri’s requirements against states that have none. Neither dynamic was likely intended. Both are written into the bill as it stands.
As AI becomes a normal part of business, requiring disclosure for routine AI assistance may soon look as outdated as requiring a company to disclose that a proposal was “produced on a computer” or that “information was sourced from the internet.” Those notices would not help customers evaluate quality or accountability. They would simply make the work look less serious.
Missouri businesses should be able to use responsible AI tools to serve customers, reduce costs, compete with larger firms, and grow here. Lawmakers should fix or scrap SB 1012 to ensure ordinary AI adoption doesn’t become legally risky, reputationally suspect, or commercially disadvantageous.
If you run a Missouri business and you’re uncertain how this bill could affect your operations, that uncertainty is exactly the problem — and it’s worth making your voice heard before this becomes law. Missouri does not need to choose between public trust and innovation. But to protect both, we need rules that are clear, targeted, and practical for the businesses actually trying to build the future here.

Joe Lakin is the CEO of Objective Strategies, a St. Louis company that builds and implements communications and technology solutions.