Lurking in the shadows: The costs of unapproved AI tools
Company-approved AI tools often fall short of employees’ needs, prompting workers to turn to “shadow AI” solutions that create a heightened cybersecurity threat.
A recent Cybernews survey of more than 1,000 employees in the United States found that 59% use shadow AI — artificial intelligence tools that haven’t been formally approved by their companies — to get the job done. Among executives and senior managers participating in the survey, 93% said they use shadow AI tools at work.
That shortfall could prove costly. The average data breach costs $670,000 more when caused by shadow AI rather than sanctioned AI, and shadow-AI-related breaches occur more often than sanctioned-AI-related ones, according to a recent IBM report.
“Shadow AI thrives in silence,” Žilvinas Girėnas, head of product at nexos.ai, said in the survey release. “When managers turn a blind eye and there’s no clear policy, employees assume it’s fine to use whatever tool gets the job done. That’s how sensitive data ends up in places it should never be.
“AI use in the workplace should never live in a gray zone, and leaders need to set clear rules and give employees secure options before the shortcuts turn into breaches.”
The survey results suggest that “gray zone” often stems from leadership’s underdeveloped approach to AI:
- About half of employees (52%) said employers provide approved AI tools for work, but just 33% said those tools fully meet their needs.
- Three-quarters of employees who use shadow AI admitted to sharing potentially sensitive information with unapproved AI tools, most commonly employee data, customer data, and internal documents. Even so, 57% of those employees said their direct managers are aware of and support their use of such tools.
- More than half of all employees surveyed said either that their company has no official policy on shadow AI use (23%), that they don’t know whether such a policy exists (16%), or that shadow AI use is actively encouraged (16%). Just 10% said it’s strictly prohibited.
“Once sensitive data enters an unsecured AI tool, you lose control. It can be stored, reused, or exposed in ways you’ll never know about,” Girėnas said. “That’s why companies need secure, approved tools to keep critical information protected and traceable.”
— To comment on this article or to suggest an idea for another article, contact Bryan Strickland at Bryan.Strickland@aicpa-cima.com.
