AI Compliance Checklist for Professional Services Firms: What to Do Before the June 30 Deadline
Published April 30, 2026 · By The Crossing Report · 12 min read · Last updated April 30, 2026
Summary
As of 2026, 14 US states have passed or are advancing AI-specific legislation, with three — Colorado, Illinois, and Texas — directly affecting how professional services firms must disclose AI use to clients. The EU AI Act's final compliance obligations take effect in August 2026. ABA Formal Opinion 512 and updated AICPA guidance have reset the compliance baseline for lawyers and accountants. This guide separates what applies to you now from what you should monitor — written for owners of 5–50 person accounting, law, consulting, and marketing firms who need to stay compliant without a full-time legal or compliance team.
The AI Regulatory Landscape in 2026: What Professional Services Firms Need to Know
If you have been half-reading the AI compliance news and feeling a growing low-grade anxiety about what your firm is supposed to be doing, you are not alone. The regulatory environment for AI has moved fast, the headlines are often misleading, and the language in the actual rules is written for lawyers — not for the owners of 12-person firms who are just trying to use AI responsibly.
Here is the honest overview: most of the regulation that has passed so far applies either to large technology companies or to high-risk automated decision systems. The majority of professional services firms using AI for drafting, research, client communication, and administrative work are not the primary targets. But you are not entirely in the clear — because your professional associations (the ABA, AICPA, and state bars) have been issuing guidance that does apply to you, regardless of whether a statute requires it.
The map of what you need to know in 2026 breaks into four areas:
- State AI laws — Colorado's AI Act is the most relevant. Several other states are following.
- The EU AI Act — only matters if you have EU clients or use EU-regulated AI tools. Most small US firms: low direct impact.
- Professional association guidance — ABA Opinion 512 for lawyers, AICPA AI Framework for accountants. These matter now and are not optional.
- Client disclosure — what you are required or expected to tell clients about AI use in their matters.
We will go through each in turn.
Colorado AI Act: Does It Apply to Your Firm?
Colorado's AI Act (Senate Bill 24-205) takes effect June 30, 2026 — delayed from its original February 1, 2026 start date — and will make Colorado the first US state with a comprehensive AI law. It has received outsized attention because of that first-mover status, but most professional services firms will find its direct application narrower than the headlines suggest.
The Act focuses on high-risk AI systems — defined as AI systems that make or substantially support “consequential decisions” about Colorado residents. Consequential decisions are defined to include decisions about education, employment, financial services, healthcare, housing, insurance, and legal services that have a significant effect on a person's rights, opportunities, or access to essential services.
For a small professional services firm, the practical threshold question is this: Is AI making a consequential decision about a client, or is AI assisting a professional who makes the decision? If a lawyer uses AI to draft a contract and the lawyer reviews and finalizes it, the lawyer is making the consequential decision and the Act's high-risk provisions likely do not apply. If an AI system automatically approves or denies a loan application without meaningful human review, that is high-risk.
The Act does impose obligations on “deployers” — any business that uses a high-risk AI system — including impact assessments, disclosure obligations, and anti-discrimination provisions. If your firm is Colorado-based and uses AI in any workflow that touches legal, financial, or employment decisions about clients or employees, you should have a lawyer review whether any of those tools qualify as high-risk systems under the Act's definitions. The review for most firms will be straightforward and will likely confirm that general-purpose AI writing and research tools fall outside the high-risk category.
Bottom line
Colorado-based firms: audit your AI tools and document that they assist professional judgment rather than replace it. Firms outside Colorado: monitor — at least 5 other states are advancing similar legislation.
EU AI Act: What US-Based Firms Need to Know
The EU AI Act becomes fully applicable in August 2026. It is the world's most comprehensive AI regulation, and it has extraterritorial reach — it applies to any organization that offers AI systems or AI-enabled services to EU users, or whose AI outputs affect people in the EU.
For most US-based professional services firms with a purely domestic client base, the EU AI Act does not apply. If your firm has no EU clients, partners, or employees, you can monitor this one rather than act on it now.
If you do have EU exposure — a client headquartered in Europe, an employee working remotely from an EU country, or services provided to an EU-based entity — the relevant questions are:
1. Do any of your AI tools fall into the Act's high-risk categories? Annex III of the Act lists specific high-risk uses including AI systems for legal interpretation, credit scoring, and employment decisions. Most general-purpose writing and research tools do not qualify.
2. Are your AI tool vendors compliant? The primary compliance burden under the EU AI Act falls on AI system providers (the companies building the tools), not on deployers using those tools for standard professional services work. Confirm with your major AI vendors that their tools are registered and compliant where required.
3. Do you need to update client-facing disclosures for EU clients? The Act requires transparency about AI use in certain contexts, particularly AI-generated content and AI-assisted decisions that affect individuals.
The honest assessment: for the vast majority of small US professional services firms, the EU AI Act is a “monitor and confirm vendor compliance” situation, not an immediate compliance emergency.
ABA Formal Opinion 512 and State Bar AI Guidance: The Lawyer's Compliance Checklist
ABA Formal Opinion 512, issued in July 2024, is the most consequential professional guidance for law firms using AI. It does not create new rules — it interprets how the existing Model Rules of Professional Conduct apply to generative AI use. But that interpretation has teeth, because the underlying rules are enforceable and state bars are increasingly paying attention.
The Opinion establishes four core obligations for lawyers using generative AI:
1. Competence (Rule 1.1). A lawyer must understand the AI tool they are using well enough to evaluate its output critically. You cannot submit AI-generated work product without meaningful review. The Opinion notes that this does not require technical expertise, but it does require understanding the tool's limitations — including its tendency to generate confident-sounding errors (hallucinations) in legal research.
2. Confidentiality (Rule 1.6). Do not enter client confidential information into AI tools that use input data to train their models unless you have client consent and a data processing agreement with the vendor. This is the rule most commonly violated at small firms — not out of bad faith, but because nobody checked the terms of service. Review your AI tools' data handling policies now.
3. Supervision (Rule 5.3). If non-lawyer staff or contract workers are using AI to produce work product on client matters, a supervising lawyer is responsible for the output. You cannot delegate quality control to the AI tool itself. The supervising lawyer must review AI-assisted work before it goes to the client.
4. Disclosure. The Opinion says disclosure of AI use is required when a reasonable client would want to know — which the ABA interprets as situations where AI substantially generates work product the client relies on, or where AI is used in ways that implicate confidentiality. The practical approach: update your engagement letter to describe how your firm uses AI tools. This converts a potential disclosure obligation into a transparent upfront conversation.
State bar variations
California, New York, Florida, and Texas have each issued their own AI guidance that supplements the ABA framework. California's guidance is the most prescriptive on disclosure; New York has focused on AI in court filings. If you practice in any of these states, check your state bar's website for the current guidance — most have been updated in the past 12 months.
AICPA AI Guidelines for Accountants: What's Required, What's Recommended
The AICPA has taken a more principles-based approach to AI guidance than the ABA, releasing its AI Framework for CPA firms in late 2024 and updating it in early 2026. Unlike the ABA Opinion, the AICPA guidance does not attach to a specific enforceable rule — but it reflects the direction state CPA boards are moving in, and it is the framework that peer reviewers and disciplinary bodies will reference when questions arise.
The AICPA Framework focuses on five areas:
Professional skepticism. Accountants must apply the same critical evaluation to AI-generated output that they would apply to any other source of information. AI outputs should be treated as a starting point, not a conclusion. This is especially important in audit — AI that analyzes 100% of transactions can surface anomalies humans would miss, but the interpretation and professional judgment remain the accountant's responsibility.
Data privacy and client confidentiality. The Framework requires firms to establish and document policies for what client data may be entered into AI systems, under what conditions, and with what vendor data protections in place. For firms handling tax returns, financial statements, and payroll data, this is not optional — it is a basic fiduciary responsibility.
Transparency with clients. The AICPA recommends — and state boards are beginning to require — that engagement letters disclose the firm's use of AI tools and describe how client data is protected when AI is involved in service delivery.
Quality control. Firms should have documented policies for reviewing AI-generated work product before delivery. The standard of review should be no less rigorous than for work produced by a junior staff member.
Ongoing education. The Framework expects CPAs to maintain competency in the AI tools they use — both technically (understanding what the tool does and does not do) and ethically (staying current with evolving guidance from the AICPA and state boards).
AI Client Disclosure: What You Must Tell Clients (and What's Best Practice)
The disclosure question is the one that makes most firm owners uncomfortable — not because they are hiding anything, but because nobody knows exactly where the line is. Here is the practical breakdown.
What is currently required: For lawyers in states that follow ABA Opinion 512, disclosure is required when AI substantially generates work product the client will rely on, or when AI use implicates confidentiality. For accountants, disclosure is required when AI tools process personal financial data, and the AICPA recommends it for any material AI use in service delivery. In Colorado (and likely soon in other states), disclosure may be required by statute when high-risk AI systems are involved in decisions affecting clients.
What is best practice (and increasingly standard): Add a clear AI use disclosure to your engagement letter. The leading language firms are using in 2026 covers three things: (1) a description of how AI tools are used in service delivery; (2) a statement that all AI-generated work product is reviewed by a licensed professional before delivery; and (3) a description of how client data is protected when processed by AI systems. Firms that have added this language report that clients receive it positively — it reads as a sign of modernity and rigor, not as a red flag.
Sample disclosure language
“[Firm name] uses AI-assisted tools to support research, drafting, and document analysis in our engagements. All AI-generated work product is reviewed and approved by a licensed [attorney/CPA] before delivery. We do not input personal client data into AI tools that use such data for model training. For questions about our AI practices, please contact [name].”
Adapt to your firm's specific AI use cases and have your professional liability counsel review before adopting.
Your 5-Step AI Compliance Checklist for Professional Services Firms
Most small professional services firms can complete this checklist in a single afternoon. It covers the minimum required actions for 2026 compliance — not a comprehensive risk management program, but the floor that any firm using AI tools should have in place.
1. Inventory your AI tools
Make a list of every AI tool your firm uses, who uses it, what tasks it is used for, and what data it touches. Include tools your team uses informally (ChatGPT, Copilot, Gemini) as well as tools embedded in your practice management, tax prep, or document management software. You cannot manage compliance for tools you do not know about.
2. Review data handling policies for each tool
For each AI tool that touches client data, confirm: (a) does the tool use input data for model training? (b) is there a data processing agreement (DPA) available? (c) does the tool meet your jurisdiction's data protection requirements? Tools that use inputs for training without client consent are a professional responsibility risk. Most major AI providers now offer enterprise tiers that disable training on user inputs — confirm this before entering client data.
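The inventory and data-handling review in steps 1 and 2 amount to a simple structured record per tool plus one filtering rule. A minimal sketch, in Python, of how a firm might track this — tool names, field names, and values here are hypothetical examples, not recommendations of specific products:

```python
# Hypothetical AI tool inventory for a small firm (steps 1 and 2 of the checklist).
# Each record captures: who uses the tool, for what, and how it handles data.
AI_TOOL_INVENTORY = [
    {
        "tool": "General-purpose chatbot",          # example entry, not a product endorsement
        "users": ["associates", "admin staff"],
        "tasks": ["drafting", "research"],
        "touches_client_data": True,
        "trains_on_inputs": True,                   # check the vendor's terms of service
        "dpa_in_place": False,
    },
    {
        "tool": "Practice-management AI assistant",
        "users": ["partners"],
        "tasks": ["document summarization"],
        "touches_client_data": True,
        "trains_on_inputs": False,                  # enterprise tier with training disabled
        "dpa_in_place": True,
    },
]

def flag_risky_tools(inventory):
    """Return tools that touch client data without adequate protections:
    the vendor trains on inputs, or no data processing agreement exists."""
    return [
        t["tool"]
        for t in inventory
        if t["touches_client_data"] and (t["trains_on_inputs"] or not t["dpa_in_place"])
    ]

print(flag_risky_tools(AI_TOOL_INVENTORY))  # → ['General-purpose chatbot']
```

A spreadsheet with the same columns works just as well; the point is that each tool gets a row, and any row that touches client data while training on inputs or lacking a DPA is flagged for review before further use.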
3. Update your engagement letter
Add an AI disclosure clause to your standard engagement letter that describes how your firm uses AI tools, confirms that AI-generated work is reviewed by a licensed professional, and describes your client data protection practices. This is the single highest-leverage compliance action available to small firms — it converts a potential obligation into proactive transparency that most clients appreciate.
4. Establish a review-before-delivery policy
Document a firm-wide policy that AI-generated work product requires professional review before it is delivered to clients. The policy does not need to be long — one paragraph is enough. What matters is that it is written down, communicated to staff, and consistently followed. This is what your professional liability carrier will ask about if an AI-related error reaches a claim.
5. Assign someone to track regulatory updates quarterly
AI regulation is moving fast. Assign one person at your firm — an owner, partner, or office manager — to check for updates from your professional association (ABA, AICPA, state bar or CPA board) every quarter. This does not require hours of reading; most associations now publish AI guidance summaries. Add it to your calendar as a 30-minute quarterly review. The firms that get caught flat-footed by new requirements are usually the ones with nobody assigned to watch for them.
Premium Content
AI Compliance Kit for Professional Services Firms
Premium subscribers get the ready-to-use engagement letter disclosure language (lawyer and accountant versions), the AI tool data handling audit template, state-by-state bar guidance tracker, and the full Issue #8 guide including jurisdiction-specific compliance notes for California, New York, Florida, and Texas.
$19/month · Cancel anytime · First issue free
FAQ: AI Regulation Questions from Professional Services Firm Owners
Q: Does the Colorado AI Act apply to small professional services firms?
A: The Colorado AI Act (SB 24-205, effective June 30, 2026) applies to “deployers” of high-risk AI systems — defined as systems that make, or substantially support, consequential decisions about individuals. For most small professional services firms using AI for drafting, research, and administrative tasks, the Act does not apply directly because you are not making high-risk automated decisions about clients. The threshold that most affects small firms is client-facing AI that produces decisions about credit, insurance, employment, housing, or legal services. If your AI tools assist your judgment rather than replace it, you are likely outside the Act's enforcement scope. Colorado-based firms should document their AI use cases and confirm with counsel whether any deployed tools qualify as high-risk systems.
Q: What does ABA Formal Opinion 512 require law firms to do?
A: ABA Formal Opinion 512 (2024) addresses lawyer competence, confidentiality, and supervision obligations when using generative AI. It requires lawyers to: (1) understand the AI tool well enough to evaluate its output — you cannot simply accept what AI produces without review; (2) protect client confidential information — do not input client data into AI tools that use inputs for training without client consent; (3) supervise non-lawyer staff using AI on client matters; (4) disclose AI use to clients when required by applicable rules or when a reasonable client would want to know. The opinion does not prohibit AI use — it establishes that existing professional conduct rules apply fully to AI-assisted work.
Q: Do I need to tell clients when I use AI on their matter?
A: The disclosure requirement depends on your profession and jurisdiction. For lawyers, ABA Formal Opinion 512 says disclosure is required when a client would reasonably expect to be informed — particularly when AI substantially generates work product the client will rely on, or when AI is used with client confidential data. For accountants, AICPA guidance recommends transparency about AI use in client deliverables and consent for AI use involving personal financial data. The safe, practical approach: add an AI disclosure to your engagement letter that describes how you use AI tools in service delivery. This is now standard practice at firms that have updated their agreements in 2025–2026.
Q: What is the EU AI Act and does it affect US firms?
A: The EU AI Act is the world's first comprehensive AI regulation, with its final provisions becoming fully applicable in August 2026. It applies to any business that offers AI systems or AI-enabled services to users in the EU, or whose AI outputs are used in the EU — regardless of where the business is based. For a US-based professional services firm with no EU clients, the Act does not apply directly. If you have EU-based clients or partners, confirm with your AI vendors that their tools are compliant, and review whether any of your AI use cases qualify as high-risk under the Act's Annex III categories. Most general-purpose AI writing and research tools are not classified as high-risk.
Q: What should my AI compliance checklist include?
A: A practical AI compliance checklist for a professional services firm with 5–50 employees should cover five areas: (1) Inventory — document every AI tool in use, who uses it, and what data it touches; (2) Engagement letter update — add an AI disclosure clause; (3) Data protection review — confirm that client data entered into AI tools is not used for model training; (4) Review policy — establish that AI-generated work product requires professional review before client delivery; (5) Monitoring — assign someone to track regulatory updates from your professional association quarterly. Most small firms can complete this checklist in a single afternoon.
Sources & Further Reading
- Colorado SB 24-205 — Artificial Intelligence Act — Full text of the Colorado AI Act, effective June 30, 2026
- ABA Formal Opinion 512 (2024) — Generative AI use and the Model Rules of Professional Conduct
- AICPA — AI Framework for CPA Firms (2024, updated 2026)
- European Commission — EU AI Act — Official regulatory framework and compliance timeline
- National Conference of State Legislatures — State AI legislation tracker (2026)
Related Reading
- The AI Adoption Gap in Professional Services: 2026 Data and What It Means for Your Firm
- How to Measure AI ROI for Professional Services Firms: The 2026 Framework
- Approved tools for compliant AI use
- The AI Business Model Shift: What Professional Services Firms Need to Change in 2026
- Heppner Ruling: AI and Attorney-Client Privilege (2026)
Get weekly updates on AI regulations affecting your firm — free in The Crossing Report.
Every Thursday: the one AI development that matters most to professional services firm owners, with specific guidance for your kind of firm.
Free weekly digest. No spam. Unsubscribe anytime.