New release empowers enterprises to run AI securely within their own data centres
Cloudera, the only provider enabling AI to operate on data anywhere, has unveiled the latest version of Cloudera Data Services, bringing Private AI capabilities directly into on-premises environments. This development allows enterprises to harness secure, GPU-powered generative AI behind their own firewall.
For many organisations, safeguarding sensitive data and intellectual property remains a major barrier to AI adoption. Research from Accenture reveals that 77% of businesses lack the foundational data and AI security measures necessary to protect models, pipelines, and cloud infrastructure.
With this update, enterprises can cut infrastructure costs, simplify data lifecycle management, enhance productivity, and accelerate deployment times. They can now enjoy cloud-native flexibility within their firewall, scaling operations securely without compromising on compliance or control.
For the first time, both the Cloudera AI Inference Service and Cloudera AI Studios are available for use in the data centre. Previously limited to the cloud, these tools address critical barriers to AI adoption, enabling organisations to build and operate generative AI applications entirely within their own secure environments:
- Cloudera AI Inference Service, powered by NVIDIA, is one of the first on-premises AI inference services to embed NVIDIA NIM microservice capabilities. It delivers a secure, scalable engine for deploying and managing AI models directly where the data resides (see the illustrative sketch after this list).
- Cloudera AI Studios provides low-code templates for developing and launching generative AI applications and agents, making the AI lifecycle more accessible to all teams.
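The announcement itself contains no code, but for context, NVIDIA NIM microservices typically expose an OpenAI-compatible REST API. The minimal sketch below shows how an in-house application might query a model served behind the firewall; the endpoint URL, model name, and token are illustrative placeholders, not values documented by Cloudera.

```python
import os
import requests

# Hypothetical on-premises inference endpoint. NIM microservices generally
# expose an OpenAI-compatible REST API; the URL, model name, and token here
# are placeholders, not Cloudera-documented values.
BASE_URL = os.environ.get("INFERENCE_BASE_URL", "https://inference.internal.example.com/v1")
API_TOKEN = os.environ.get("INFERENCE_API_TOKEN", "")


def chat(prompt: str, model: str = "example-llm") -> str:
    """Send a single chat-completion request to the assumed endpoint."""
    response = requests.post(
        f"{BASE_URL}/chat/completions",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "temperature": 0.2,
        },
        timeout=60,
    )
    response.raise_for_status()
    # Return the first completion's text, as in the OpenAI response schema.
    return response.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(chat("Summarise our data-residency policy in two sentences."))
```

Because requests never leave the corporate network in this pattern, sensitive prompts and model outputs stay under the organisation's existing security and compliance controls.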
An independent Forrester Consulting Total Economic Impact™ study, commissioned by Cloudera, found that enterprises implementing Cloudera Data Services on-premises deployed workloads 80% faster, boosted data team productivity by 20%, and cut costs by 35%. The research also noted improved operational efficiency, with hardware utilisation increasing from 30% to 70% and capacity requirements falling by up to 50% post-modernisation.
“Historically, enterprises have been forced to cobble together complex, fragile DIY solutions to run their AI on-premises,” said Sanjeev Mohan, industry analyst. “Today the urgency to adopt AI is undeniable, but so are the concerns around data security. What enterprises need are solutions that streamline AI adoption, boost productivity, and do so without compromising on security.”

“Cloudera Data Services On-Premises delivers a true cloud-native experience on-premises, providing agility and efficiency without sacrificing security or control,” said Leo Brunnick, Cloudera’s Chief Product Officer. “This release is a significant step forward in data modernization, moving from monolithic clusters to a suite of agile, containerized applications.”
“BNI is proud to be an early adopter of Cloudera’s AI Inference service,” stated Toto Prasetio, Chief Information Officer of BNI. “This technology provides the essential infrastructure to securely and efficiently expand our generative AI initiatives, all while adhering to Indonesia’s dynamic regulatory environment. It marks a significant advancement in our mission to offer smarter, quicker, and more dependable digital banking solutions to the people of Indonesia.”
By delivering secure, high-performance AI within the data centre, Cloudera is enabling customers to progress from prototype to production in weeks rather than months.
