Perforce research also reveals that 60% of organizations have experienced data breaches or theft in software development, testing, AI, and analytics environments.
MINNEAPOLIS, Sept. 30, 2025 /PRNewswire/ -- Perforce Software, the DevOps company for global teams seeking AI innovation at scale, announced the findings of the 2025 State of Data Compliance and Security Report. This comprehensive research reveals alarming trends when it comes to AI and data privacy, with mass confusion around the safety of sensitive data in AI model training and the frequency of data privacy exposure.
The report reveals that 91% of organizations believe sensitive data should be allowed in AI training, yet 78% express high concern about theft or breach of model training data. This paradox, combined with a widespread failure to understand that sensitive data allowed into an AI model can never be removed or secured, highlights the urgent need for clearer guidance and robust solutions that help organizations accelerate AI initiatives more securely. Given the confusion and concerns around safety, it is no surprise that 86% are planning to invest in AI data privacy solutions over the next 1-2 years.
"The rush to adopt AI presents a dual challenge for organizations: Teams are feeling both immense pressure to innovate with AI and fear about data privacy in AI," said Steve Karam, Principal Product Manager, Perforce. "To navigate this complexity, organizations must adopt AI responsibly and securely, without slowing down innovation. You should never train your AI models with personally identifiable information (PII), especially when there are secure ways to rapidly deliver realistic but synthetic data into AI pipelines."
The report also reveals alarming trends related to sensitive data exposure in non-production environments, with 60% of organizations experiencing data breaches or theft in software development, testing, AI, and analytics environments, an 11% increase from last year. Despite the known threat landscape, 84% of organizations surveyed still allow data compliance exceptions in non-production, perpetuating these exposures.
"These findings underscore the critical need for organizations to address the growing data security and compliance risks in non-production environments," said Ross Millenacker, Senior Product Manager, Perforce. "There's a perception that protecting sensitive data through measures like masking is cumbersome and manual. Too many organizations see the cure of masking data and implementing those steps as worse than the disease of allowing exceptions. But this leads to a significant vulnerability. It's time to close these gaps and truly protect sensitive data."
Perforce is prepared to help organizations address these risks. Earlier this month, Perforce introduced AI-powered synthetic data generation in the Delphix DevOps Data Platform. By uniting data masking, data delivery, and synthetic data generation in a single AI-powered platform, Delphix enables privacy compliance and AI/ML model training to go hand in hand. Learn more about Delphix AI-powered capabilities.
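The report and announcement do not describe how the Delphix platform generates synthetic data; purely as a minimal, hypothetical sketch of the principle Karam describes (replacing PII with realistic but fabricated values before data reaches an AI pipeline), the following Python snippet uses the open-source Faker library and invented field names, and is not the Delphix API.

    # Illustrative sketch only: swap PII fields for realistic synthetic values
    # so downstream AI/ML training never sees real personal data.
    from faker import Faker

    fake = Faker()
    Faker.seed(42)  # seeded for repeatable, deterministic test datasets

    def synthesize_record(record: dict) -> dict:
        """Return a copy of the record with PII fields replaced by synthetic values."""
        return {
            **record,                       # non-sensitive fields pass through unchanged
            "name": fake.name(),
            "email": fake.email(),
            "ssn": fake.ssn(),
            "address": fake.address().replace("\n", ", "),
        }

    production_row = {
        "name": "Jane Doe",
        "email": "jane.doe@example.com",
        "ssn": "123-45-6789",
        "address": "1 Main St, Springfield",
        "purchase_total": 129.99,
    }

    print(synthesize_record(production_row))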
Interested parties can download the full 2025 State of Data Compliance and Security Report on perforce.com.
About Perforce
The best-run DevOps teams in the world choose Perforce. Powered by advanced technology, including powerful AI that takes you from AI ambition to real results, the Perforce suite is purpose-built to handle complexity, maintain speed without compromise, and ensure end-to-end integrity across your DevOps toolchain. With a global footprint spanning more than 80 countries and including over 75% of the Fortune 100, Perforce is the trusted partner for innovation.
Harness the power of AI and accelerate your technology delivery without shortcuts. Build, scale, and innovate with Perforce — where efficiency meets intelligence.
Media Contacts
PERFORCE GLOBAL
Maxine Ambrose
Ambrose Communications
Ph: +44 118 324 1040
[email protected]
SOURCE Perforce Software
