How to Use AI Tools Ethically in a CPA Practice: Data Privacy, Confidentiality, and Workflow Guidance
AI tools are already embedded in CPA workflows. ChatGPT, Claude, Microsoft Copilot, and purpose-built accounting tools from Thomson Reuters, Intuit, and Wolters Kluwer are being used for tax research, document summarization, draft preparation, and client communication. The efficiency gains are real. So are the compliance obligations. Three separate legal frameworks govern how CPAs can use AI with client data — and most practitioners are not fully aware of all three. This guide covers what the rules require, which AI workflows are safe, and how to structure an ethical AI policy for your firm.
The Three Legal Frameworks That Govern AI Use in CPA Practice
Before deploying any AI tool in client-facing work, understand the compliance layer you are operating within.
IRC §7216 — Unauthorized Disclosure of Tax Return Information
Under IRC §7216 and the implementing regulations at 26 CFR §301.7216-1 through -3, it is a federal criminal offense for a tax return preparer to knowingly or recklessly disclose or use tax return information for any purpose other than preparing a tax return, absent the taxpayer's written consent. The penalty is up to one year of imprisonment and a $1,000 fine per violation. The regulations define "tax return information" broadly (26 CFR §301.7216-1(b)(3)): any information furnished by the taxpayer, derived from a return, or obtained in connection with preparing a return. This includes income amounts, deductions, financial account details, and business structure information.
The key compliance question for AI tools: does the AI system receive, process, or store client return data? If you paste a client's tax data into a general-purpose AI tool that uses inputs for model training or retains them in logs accessible to third parties, you have disclosed tax return information without consent. That is a potential §7216 violation unless the client has signed a compliant consent form before disclosure.
AICPA Code of Professional Conduct — ET §1.700.001 (Confidential Client Information)
The AICPA Code of Professional Conduct ET §1.700.001 prohibits CPAs from disclosing confidential client information without the client's specific consent, regardless of what the law permits. This is broader than §7216 — it covers all client information, not just tax return data, and applies to AICPA members regardless of whether they are a tax return preparer under the IRC definition. Using client financial data with an AI tool that transmits that data to a third-party server is a disclosure. Whether that disclosure is permitted depends on what your engagement letter says and what the client has consented to.
FTC Safeguards Rule — 16 CFR Part 314
CPA firms that provide tax preparation, financial planning, and accounting services qualify as "financial institutions" under the Gramm-Leach-Bliley Act. The FTC Safeguards Rule (16 CFR Part 314), strengthened effective June 9, 2023, requires covered firms to implement a written information security program, conduct a risk assessment, and maintain controls over third-party service providers that access customer information. AI vendors are third-party service providers under this rule. Before deploying any AI tool that handles client data, you must assess the vendor's security controls and document that assessment in your information security program.
Which AI Workflows Are Safe vs. Risky
Not all AI use creates the same risk. The distinction that matters most: does client-identifiable data enter the AI system?
Lower-risk AI workflows (no client data required):
- Tax research and law interpretation: "Explain the passive activity loss rules under IRC §469" — no client data needed
- Template drafting: "Draft an engagement letter section covering AI tool use" — no client-specific details
- Summarizing IRS publications, revenue rulings, or Tax Court decisions: paste the public IRS document into the AI, never a client-annotated copy with their information attached
- Writing client-facing educational content, newsletters, or FAQ documents
- Analyzing anonymized, hypothetical scenarios: "A single-member LLC with $200k in net income is considering S-Corp election" — hypothetical, not a specific client
- Internal process documentation: firm workflow design, checklist drafting, training materials
Higher-risk workflows that require consent and vendor vetting:
- Pasting client tax data — income figures, K-1 details, Social Security numbers, EINs, or account numbers — into AI prompts to get return preparation guidance
- Uploading client financial statements for analysis or categorization
- Using AI to draft client-specific responses to IRS notices, referencing actual client facts
- Running client payroll or accounting data through AI-powered bookkeeping tools
For higher-risk workflows, three conditions must be met before proceeding: (1) client written consent specifically referencing AI processing of their data, (2) a vendor that provides a signed data processing agreement (DPA) confirming they do not use your data for model training, and (3) an updated risk assessment in your Safeguards Rule information security program documenting the vendor's controls.
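The gating question (does client-identifiable data enter the AI system?) can be partially automated as a pre-flight check before a prompt leaves the firm. The sketch below uses hypothetical patterns of our own choosing; regex screening catches only obviously formatted identifiers and is no substitute for the consent and vetting conditions above.

```python
import re

# Hypothetical pre-flight screen: flags obviously formatted client
# identifiers before a draft prompt is sent to a general-purpose AI tool.
# Not a complete PII detector -- names, addresses, and unformatted figures
# slip through, so a clean scan is necessary, never sufficient.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EIN": re.compile(r"\b\d{2}-\d{7}\b"),
    "account number": re.compile(r"\b\d{10,17}\b"),  # long bare digit runs
}

def screen_prompt(text: str) -> list[str]:
    """Return the identifier types found in the draft prompt."""
    return [label for label, pattern in PATTERNS.items() if pattern.search(text)]

flags = screen_prompt("Client SSN 123-45-6789, EIN 12-3456789, asks about S-Corp election.")
print(flags)  # -> ['SSN', 'EIN']
```

Treat any flag as a hard stop; treat a clean scan as one control among several, since pattern matching cannot recognize a client's name or business description.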
Evaluating AI Vendors: What to Look For
Enterprise and business-tier versions of AI platforms have materially different data handling policies than consumer versions. Enterprise API products from OpenAI, Anthropic, Microsoft Azure OpenAI, and Google Vertex AI typically do not use customer data for model training and provide data processing agreements. Consumer accounts — free or individual-plan ChatGPT, Claude.ai, Gemini — may use inputs for model improvement by default, though users can opt out.
This distinction is critical and must be verified for each tool before deployment. Do not assume an enterprise designation is sufficient without reviewing the DPA.
Questions to ask any AI vendor before deploying in a CPA practice:
- Data retention: Does the platform retain user inputs? For how long?
- Training data use: Does the platform use client data for model training or fine-tuning?
- Data residency: Where is data stored? Does it leave the United States?
- Access controls: Can the vendor's employees access session data, and under what circumstances?
- Security certifications: Is the vendor SOC 2 Type II certified?
- Data processing agreement: Will they sign a DPA confirming their obligations regarding your customer data?
Purpose-built accounting AI tools — Thomson Reuters CoCounsel Tax, Intuit Assist for ProConnect, Wolters Kluwer CCH Axcess AI — are designed with CPA firm data handling requirements in mind and typically provide appropriate DPAs as a standard part of their enterprise agreements. General-purpose AI tools require direct due diligence before use with client data.
Client Consent: What the Consent Form Must Include
If you intend to use AI tools that process client-specific data, written consent is required. A valid §7216 consent form must satisfy 26 CFR §301.7216-3, which requires:
- The taxpayer's name and the tax return preparer's name
- A description of the information to be disclosed or used
- A description of the purpose of the disclosure or use
- The name of the recipient (for third-party disclosures)
- The tax year of the return to which the consent applies
- The taxpayer's signature and date
Under 26 CFR §301.7216-3 and the IRS formatting guidance that supplements it (Rev. Proc. 2013-14 for Form 1040-series taxpayers), the consent must be a separate document from other authorizations and must be displayed in at least 12-point type. A checkbox buried in an engagement letter does not comply.
Practical approach: Amend your engagement process to include a standalone §7216 consent form that specifically references "AI-assisted tax preparation and research tools" and names the specific vendors — or categories of vendors — you use. Update this form annually as your tool stack changes.
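The required elements listed above lend themselves to a simple completeness check before a consent form is accepted into the file. This sketch verifies presence only; the field names are illustrative, and it does not test the formatting rules (separate document, type size) that also apply.

```python
# Illustrative completeness check against the consent elements listed
# above. Field names are our own; presence-only, not a formatting check.
REQUIRED_FIELDS = (
    "taxpayer_name", "preparer_name", "information_described",
    "purpose", "recipient", "tax_year", "signature", "signed_date",
)

def missing_consent_fields(consent: dict) -> list[str]:
    """Return the required elements that are absent or blank."""
    return [f for f in REQUIRED_FIELDS if not consent.get(f)]

draft = {"taxpayer_name": "A. Client", "preparer_name": "Example Firm LLP",
         "purpose": "AI-assisted tax research", "tax_year": 2025}
print(missing_consent_fields(draft))
# -> ['information_described', 'recipient', 'signature', 'signed_date']
```

A non-empty result means the form goes back to the client before any data touches an AI tool.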
The Competence Obligation Under Circular 230
IRS Circular 230 does not yet specifically address AI, but the general competence requirement under 31 CFR §10.35 applies directly. Practitioners must possess the knowledge, skill, thoroughness, and preparation necessary for the matter at hand. A CPA who relies on an AI output without verifying its accuracy is not practicing competently — regardless of how confident the AI's response appears.
The practical implication: AI is a research and drafting accelerator, not a substitute for professional judgment. Every AI-generated output used in client work must be reviewed and verified by a qualified practitioner before it reaches the client or the IRS. This applies equally to tax research memos, IRS notice responses, and planning recommendations.
Building a Firm AI Policy
A written AI policy is now a professional and regulatory necessity. Your policy should address at minimum:
Approved tools and permitted uses: List specifically which AI tools are approved for which purposes. Distinguish between "research only" tools — where no client data enters the system — and "client-data capable" tools that have been vendor-vetted, have signed DPAs, and require prior client consent.
Prohibited uses: Explicitly prohibit pasting client tax data, Social Security numbers, EINs, or financial account numbers into unapproved AI systems.
Supervision and review requirements: All AI outputs require professional review before any client use. Establish a clear review workflow — who reviews AI-generated research, how draft outputs are verified against applicable authority, and how errors are caught before client delivery or IRS submission.
Staff training: All staff using AI tools must understand the firm's confidentiality obligations and the approved tool list. AI ethics training should be integrated into onboarding and annual CPE programs — most states count technology and ethics content toward license renewal requirements.
Client communication: Clients should know you use AI tools. Frame this around efficiency and accuracy: "We use AI-assisted research and drafting tools reviewed and verified by our CPAs." Transparency here builds trust and provides an opportunity to obtain proper consent before it is needed.
Recordkeeping: Retain all §7216 consent forms. For Safeguards Rule compliance, maintain documentation of your vendor assessments, your written information security program, and your AI policy itself — in a format that can survive regulatory review. For how long these records must be kept and what your obligations are when clients terminate the engagement, see Document Retention Requirements for Business Clients.
AI in the Context of CPA Practice Growth
AI tools create the most leverage in high-volume, repeatable workflows: first-draft tax research memos, engagement letter templates, client education content, and checklist generation. This is precisely the work that consumes associate time in compliance-heavy practices — and where automation creates margin without sacrificing quality.
For firms building Client Advisory Services (CAS) practices, AI-assisted analysis can accelerate the delivery of financial summaries, variance commentary, and cash flow narratives that differentiate Tier 2 and Tier 3 CAS from basic bookkeeping. The key is using AI on firm-generated work product — analysis and templates — rather than routing raw client data through unapproved systems. The combination of value-based pricing and AI-assisted delivery is what makes CAS economically sustainable: subscription revenue at advisory rates, delivered with compliance-level consistency and speed.
FAQ
Can I use ChatGPT to help with tax research without violating client confidentiality?
Yes, if you structure prompts so no client-identifiable information is included. Describing a hypothetical scenario — "a single-member LLC with $180k in net income considering S-Corp election" — does not implicate §7216 or AICPA confidentiality rules because no client information is disclosed. General tax research using AI (analyzing IRC provisions, summarizing IRS publications, exploring planning strategies in the abstract) is lower-risk and does not require consent. The moment you include a client's actual name, SSN, EIN, or specific financial figures, the analysis changes.
Does my engagement letter need to reference AI tools?
Yes, if you intend to use AI tools that process client data. Your engagement letter should describe the services you provide and the technology you use. For §7216 compliance, the written consent must be on a standalone form (not embedded in the engagement letter itself), but the engagement letter should reference AI-assisted tools so clients understand the nature of your workflow before signing anything.
Are AI tools from accounting software vendors safer than general AI tools?
Generally yes, but verify with each vendor before deploying. Purpose-built CPA tools from Thomson Reuters, Intuit, and Wolters Kluwer have designed their AI features for professional services data requirements and typically provide data processing agreements that prohibit using your client data for model training. General-purpose AI tools require you to verify these commitments at the specific tier of service you are using — enterprise plans typically differ significantly from consumer plans, and the default settings often do not reflect enterprise data handling commitments.
What happens if I accidentally share client data with an AI tool without consent?
A §7216 violation requires knowing or reckless disclosure. An accidental paste may not meet the "knowingly" standard, but it can meet "recklessly" if your firm had no written policy prohibiting it. The best defense is an established, documented AI policy that explicitly prohibits client data in unapproved tools. If an accidental disclosure occurs, document it immediately, assess whether state breach notification laws apply, and implement procedural controls to prevent recurrence.
How should I explain AI tool use to clients who are concerned about data privacy?
Be direct about what you do and what you do not do. If you use AI for research only — with no client data in the system — say so clearly. If you use AI-assisted drafting or analysis that processes their data, explain the consent process and what the vendor's data handling commitments are. A clear, confident answer here — "We use AI tools to improve efficiency and accuracy, here is how we protect your data, and here is the consent form" — is a stronger position than a vague non-answer that creates more uncertainty than it resolves.
Does Circular 230 impose any specific requirements for AI-assisted tax advice?
The IRS Office of Professional Responsibility has not issued AI-specific guidance as of early 2026, but the general competence (§10.35), diligence (§10.22), and written advice (§10.37) standards all apply to AI-assisted work. A practitioner who adopts an AI-generated tax research conclusion without independent verification is not exercising the thoroughness Circular 230 requires. The AICPA Ethics Committee and several state boards of accountancy have begun issuing supplemental AI guidance that CPAs should monitor alongside Circular 230 compliance obligations.
Arvori helps CPA firms manage AI-assisted workflows with built-in compliance tracking, client document management, and secure data handling — designed for professional services firms that need efficiency without compromising client confidentiality. Learn more at arvori.app.