Writing an effective AI prompt for an audit
The November A&A Focus webcast also looked at quality management documentation and real-world fraud lessons.
The AICPA A&A Focus webcast on Nov. 5 once again delivered a blend of cutting-edge technology discussion and practical implementation insights for practitioners navigating today’s assurance environment. Hosted by Bob Durak, CPA, CGMA, director–A&A Technical Services for the AICPA, and Andrew Merryman, CPA, senior manager–A&A Technical Services, the program featured returning guest Danielle Supkis Cheek, CPA, senior vice president, AI Analytics and Assurance at Caseware; Jeanne Dee, CPA, CGMA, partner at Anders CPAs and Advisors; and Steve Dawson, CPA, president, DFG Forensic Accounting Services.
Durak opened by noting that this month’s session skipped the usual news segment to allow deeper conversations on three core themes: the practical use of artificial intelligence (AI), documenting firm quality management systems, and preventing and detecting fraud.
The STAR method for better AI prompts
Continuing her ongoing series on AI in accounting, Supkis Cheek returned to show practitioners how to move beyond basic questions and create prompts that yield consistent, professional-grade results. She framed her discussion around a simple but powerful structure — the STAR framework, which stands for Situation, Task, Appearance, and Refine.
Supkis Cheek began by reminding participants that writing a good prompt is itself a learned skill. As she put it, “The better prompt we make, the better question we make in AI, and making sure we word it the right way improves our chances of better results.” The framework, she explained, helps accountants organize their thinking so that nothing important is left out.
To demonstrate, Supkis Cheek walked through a lease agreement review scenario, one that most auditors have faced under FASB ASC Topic 842. She showed how a user might begin with the Situation statement:
“I am an auditor and need to assess a manufacturing client’s recording of a lease. I will be using lease-accounting software and need to input key terms from the lease agreement into the software to assess if the terms have been recorded in accordance with ASC 842.”
That opening context, she explained, gives the AI critical information:
- The user’s role;
- The document type;
- The goal; and
- The accounting framework involved.
The next step, Task, specifies exactly what the system should do. In her example, the prompt continues:
“List the following key terms from the lease so I can input them in my software.”
Once the Task is clear, the user moves to the Appearance step. Supkis Cheek stressed the importance of defining the output format, instructing attendees to tell the AI precisely how to present its answer:
“Present in a tabular format and stay in the same order as the following table [presented in the event deck], and complete the results column, and page number column for the page number that you found the response on. If a term has conflicting or multiple responses include both responses and note the conflict.”
She suggested that auditors could attach an Excel import template or, if their software does not allow that, request a pipe-delimited layout so that results can be copied directly into the working papers.
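If the import template cannot be attached, a pipe-delimited response is straightforward to convert into a spreadsheet-ready file. The short sketch below is only an illustration of that step, assuming the response is copied out of the AI tool as plain text; the column headings and lease terms shown are hypothetical and are not part of Supkis Cheek's example.

```python
import csv

# Hypothetical pipe-delimited output copied from the AI tool's response.
ai_response = """Term|Result|Page|Notes
Lease commencement date|2024-01-01|3|
Lease term|60 months|3|
Renewal option|Two 5-year renewals|4|Conflicts with Exhibit B
Discount rate|Not found||Confidence = Low"""

# Split each line on the pipe character and write a CSV file that Excel or
# the lease-accounting software's import routine can open directly.
rows = [line.split("|") for line in ai_response.splitlines()]
with open("lease_terms.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```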
Finally came Refine, the step that turns a single result into an iterative conversation. Supkis Cheek shared two examples for her sample prompt:
“Add confidence levels to each item, and for missing items or items that cannot be found leave blank and note that Confidence = Low.”
“Assess the lease like a banker and describe the operational risks of the lease to the lessor.”
This final step, she explained, helps professionals push AI to reason more deeply and produce insights that staff can then review, validate, and learn from.
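Taken together, the four STAR components can be kept as a reusable template so staff swap in only the engagement-specific details. The sketch below simply assembles Supkis Cheek's example prompts into one message; the helper function and the idea of storing the pieces separately are illustrative assumptions rather than a tool demonstrated on the webcast, and the finished prompt would be pasted into whatever secure AI platform the firm has approved.

```python
def build_star_prompt(situation: str, task: str, appearance: str, refine: str = "") -> str:
    """Assemble a prompt from the four STAR components, skipping any left blank."""
    parts = [situation, task, appearance, refine]
    return "\n\n".join(p.strip() for p in parts if p.strip())

prompt = build_star_prompt(
    situation=("I am an auditor and need to assess a manufacturing client's recording "
               "of a lease. I will be using lease-accounting software and need to input "
               "key terms from the lease agreement into the software to assess if the "
               "terms have been recorded in accordance with ASC 842."),
    task="List the following key terms from the lease so I can input them in my software.",
    appearance=("Present in a tabular format and stay in the same order as the following "
                "table, and complete the results column, and page number column for the "
                "page number that you found the response on. If a term has conflicting "
                "or multiple responses include both responses and note the conflict."),
    refine=("Add confidence levels to each item, and for missing items or items that "
            "cannot be found leave blank and note that Confidence = Low."),
)
print(prompt)  # Paste the assembled prompt into the firm's approved, secure AI tool.
```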
Supkis Cheek concluded by underscoring that responsible AI use includes proper data governance. “If you are not paying for the product, you are the product,” she said, urging firms to choose secure, paid AI solutions that prevent client data from being used for model training.
Supkis Cheek will return to the broadcast in early 2026 to continue her discussion and provide more practical examples for applying AI to increase efficiency and effectiveness.
Documenting your firm’s system of quality management
Next, Dee turned the program’s attention to one of the most practical and time-sensitive challenges facing firms: documenting the system of quality management (Statement on Quality Management Standards No. 1, A Firm’s System of Quality Management) before its Dec. 15, 2025, effective date. Dee emphasized that the standard “is one of the best examples of a principles-based framework,” meaning it can and should be scaled to fit each firm’s size, structure, and service mix, from sole practitioners to regional networks.
She began by reinforcing that documentation is not optional. The system itself is principles-based, but every firm must be able to demonstrate how its firm-specific quality objectives, risks, and responses are linked. Firms should document this linkage for each of the six components for which the standard establishes required quality objectives: governance and leadership, relevant ethical requirements, acceptance and continuance of client relationships and specific engagements, engagement performance, resources, and information and communication.
Dee explained that the best way to approach documentation is to think in the same terms auditors already use for risk assessment. “Anchor and link everything to objectives, risks, and responses,” she said. A well-prepared system, she noted, starts with identifying the quality objectives that apply to the firm, considering the nature of the firm and the engagements it performs, and then documenting the risks that could prevent the firm from achieving those objectives. Finally, the firm describes the responses it has designed and implemented to address those risks. “If your documentation stops at saying ‘we do annual training,’ but you never link that to the quality objective it supports, you’re missing the intent of the standard,” she said.
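One way to make that linkage explicit, whatever form the documentation takes, is to record each objective, risk, and response as a single connected entry rather than as separate lists. The snippet below is a hypothetical illustration of that idea, not a format required by SQMS No. 1; the component, objective, risk, and response shown are placeholders a firm would replace with its own.

```python
# Hypothetical linkage record: each quality risk points back to the objective
# it threatens and forward to the response designed to address it.
qm_linkage = [
    {
        "component": "Resources",
        "quality_objective": "Personnel have the competence to perform engagements",
        "quality_risk": "Staff assigned to employee benefit plan audits lack specialized training",
        "response": "Annual employee-benefit-plan training required before assignment",
        "monitoring": "Inspect CPE records for all benefit plan engagement staff each year",
    },
]

for entry in qm_linkage:
    print(f"{entry['quality_risk']} -> {entry['response']} "
          f"(objective: {entry['quality_objective']})")
```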
Dee also discussed scaling and tailoring, two of the most frequent areas of confusion among firms. She emphasized that scalability doesn’t mean omitting key components; it means right-sizing the documentation. For example, a sole practitioner might integrate governance and leadership considerations into a single narrative, whereas a multipartner firm might use an organizational chart and delegated responsibilities. Both approaches can be appropriate if they are consistent, risk-based, and traceable.
Addressing post-implementation documentation requirements, Dee explained that quality management is not a one-and-done exercise. “The days of putting a QC [quality control] manual on the shelf are over,” she reminded the audience. Once a firm implements its system, it must continually monitor the system and remediate deficiencies as they are identified. Each finding, inspection, or complaint should trigger a documented analysis of the root cause and, if necessary, an update to policies or responses.
Dee closed her segment by reminding firms that every QM system should look different, and that’s precisely what the standard intends. “A 400-person firm’s system will not look like a sole practitioner’s,” she said. “But both need to be complete, both need to connect the dots, and both must show how the firm’s quality objectives, risks, and responses are linked and monitored.”
As Dee concluded, Merryman highlighted two recent Journal of Accountancy articles, “QM Is Here: Advice From Early Adopters” and “Right-Size Your Quality Management Documentation for SQMS No. 1,” noting that both provide helpful examples from firms already documenting their systems of quality management. He encouraged attendees to review those articles for additional context and implementation insights.
The AICPA has developed several resources to assist firms with the design, implementation, and documentation of their system of quality management.
Understanding and preventing fraud
Rounding out the program, Dawson shared decades of insight from his forensic investigations. He categorized fraud into misappropriation of assets, corruption, and financial statement fraud, noting that shell company schemes remain the most frequent type encountered.
What is a shell company scheme? Dawson explained that a shell company fraud occurs when an employee secretly creates an outside company — a so-called “shell” — and then submits fictitious invoices for payment to that outside entity. Because of weak or absent internal controls, those invoices get processed, and large sums leave the organization for goods or services that were never provided. He said his team encounters this scheme “every time we turn a corner,” emphasizing how easily it can persist when segregation of duties or vendor-approval processes are lax.
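Dawson did not walk through specific test work, but one common data-analytic response to this scheme is to compare the vendor master file against employee records, since a vendor whose address or bank account matches an employee's is a classic shell-company indicator. The sketch below is a hypothetical illustration of that comparison; the names, addresses, and file layout are invented.

```python
# Hypothetical extracts from the vendor master file and the payroll system.
vendors = [
    {"vendor_name": "Acme Industrial Supply", "address": "400 Commerce Blvd"},
    {"vendor_name": "JT Consulting LLC", "address": "118 Maple Street"},
]
employees = [
    {"employee_name": "J. Thompson", "address": "118 Maple Street"},
]

# Normalize addresses before comparing so trivial formatting differences
# do not hide a match.
employee_addresses = {e["address"].strip().lower() for e in employees}

# Flag any vendor whose remittance address matches an employee's home address.
for v in vendors:
    if v["address"].strip().lower() in employee_addresses:
        print(f"Review vendor {v['vendor_name']}: address matches an employee record")
```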
Dawson reminded listeners of the fraud triangle — pressure, rationalization, and opportunity — and presented a striking statistic: “Ninety-five percent of frauds we investigate are committed by decent people facing financial need, not by inherently dishonest individuals.”
That statistic reinforces the need for strong preventive controls rather than reliance on perceived integrity.
For auditors, Dawson emphasized practical red flags:
- Cash flow anomalies (e.g., large increases in receivables without corresponding operating cash).
- Persistent client delays or evasive answers during evidence requests.
- Inconsistent expense trends revealed through horizontal and vertical analyses.
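The last of these red flags lends itself to a quick computation. Below is a minimal sketch of horizontal (year-over-year change) and vertical (common-size) analysis on a few expense lines; the account names and amounts are invented for illustration.

```python
# Illustrative expense balances for the prior year and current year.
expenses = {
    "Salaries":        (1_200_000, 1_260_000),
    "Consulting fees": (  150_000,   410_000),
    "Office supplies": (   45_000,    47_000),
}
revenue = (5_000_000, 5_100_000)

for account, (prior, current) in expenses.items():
    horizontal = (current - prior) / prior   # horizontal analysis: year-over-year change
    vertical = current / revenue[1]          # vertical analysis: percentage of current revenue
    print(f"{account}: {horizontal:+.1%} vs. prior year, {vertical:.1%} of revenue")
```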
He urged auditors to maintain professional skepticism without adopting an adversarial tone, saying “trust but verify.” For smaller entities, he recommended a cost-effective universal control: increase the perception of detection. Simple actions such as fraud-awareness talks, internal tip-reporting channels, and visibly modeled ethical leadership can deter wrongdoing far more effectively than complex procedures.
Audience questions and looking ahead
During the Q&A, Supkis Cheek fielded questions on AI privacy and confirmed that secure, team-based AI subscriptions can meet confidentiality standards. Dee clarified that while SQMS No. 1 does not mandate a single-document “quality manual,” firms that rely solely on a risk-response memo likely under-document required elements. Dawson responded to questions about whether fraud is more prevalent in public or private companies, concluding that both face comparable risk profiles — but AI now influences both fraud detection and perpetration, underscoring the dual-edged nature of the technology (see “AI-Powered Hacking in Accounting: ‘No One Is Safe’,” JofA, Oct. 1, 2025).
Durak closed by reminding viewers to add the final webcast of 2025, scheduled for Dec. 3, to their calendars.
AICPA members are encouraged to attend these monthly events and review the accompanying newsletters for more in-depth coverage of these critical topics. Members can access archives of past sessions at the A&A Focus Series webpage.
— Dave Arman, CPA, MBA, is senior manager–Audit Quality at the Association of International Certified Professional Accountants. To comment on this article or to suggest an idea for another article, contact Jeff Drew at Jeff.Drew@aicpa-cima.com.
