DDN Leans on Better Software for Enterprise AI Success

DDN is best known for building big, fast storage hardware capable of keeping the world's largest supercomputers fed with data. But as the race to build enterprise AI applications heats up, the company is counting on software to provide a competitive advantage, both for itself and for its customers.

“I don’t want to be in the hardware business,” said DDN CEO and Co-Founder Alex Bouzari. “It’s all commodity. The storage is commodity. The networking is commodity. The compute is all commodity. The real value is in the software and how the software leverages the underlying commodity hardware to deliver value to customers.”

That doesn’t mean that Bouzari doesn’t sell hardware. His company sold hundreds of millions of dollars’ worth of appliances in 2024 to satiate hyperscalers’ seemingly limitless appetite for high-end data storage. But the thing that ultimately gives value to the customer is software, he said.

DDN co-founder and CEO Alex Bouzari

To that end, DDN made several software-related announcements at Nvidia's GPU Technology Conference (GTC). The common theme across the announcements is developing better software to help customers build and deploy AI applications.

DDN’s first GTC announcement is the launch of a new platform for enterprise AI. Dubbed xFusionAI, the platform combines the company’s parallel file system, called EXAScaler, with DDN’s object storage system, Infinia.

DDN said that, by fusing file and object storage into a single platform, xFusionAI will give customers the scale-out storage capabilities needed to handle the wide range of steps in the AI lifecycle, from training AI models to powering retrieval augmented generation (RAG) pipelines and running inference efficiently.

“Bringing those two things together makes it easier for organizations to benefit from AI,” Bouzari told BigDATAwire in an interview last week. “It’s not, ‘Well, do I pick EXAScaler or do I pick Infinia?’ No, both of them together will add more value. And so the thought was fusion–let’s integrate it to make it easier for customers to deploy it.”

xFusionAI also provides a platform for integrating other critical elements of AI workflows, including vector databases, memory-augmented transformers, and knowledge graphs. All told, DDN claims the platform can speed AI training by 100x, speed RAG by 10x, and cut inference costs by 60%.
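To picture why one workflow spans both storage types, consider the access patterns involved: training wants large sequential reads from a POSIX file system, while RAG retrieval wants many small, keyed object GETs. Below is a minimal, hypothetical Python sketch of that split. The mount path, endpoint, bucket, and keys are illustrative placeholders, not part of DDN's announcement; the only assumption is a parallel file system exposed as a POSIX mount and an object store reachable over an S3-compatible API.

```python
# Hypothetical sketch: one AI workflow touching both file and object storage.
# Assumes a parallel file system mounted at /mnt/exa (training data) and an
# S3-compatible object endpoint (RAG documents). All names are illustrative.
from pathlib import Path

import boto3  # standard AWS SDK; works with any S3-compatible endpoint

# Training phase: stream datasets from the POSIX mount (large sequential reads).
TRAIN_DIR = Path("/mnt/exa/datasets/corpus")            # illustrative path
training_files = sorted(TRAIN_DIR.glob("*.parquet"))

# Inference/RAG phase: fetch documents from object storage by key.
s3 = boto3.client(
    "s3",
    endpoint_url="http://object-store.example:9000",     # illustrative endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

def fetch_document(bucket: str, key: str) -> bytes:
    """Small, latency-sensitive GETs typical of RAG retrieval."""
    response = s3.get_object(Bucket=bucket, Key=key)
    return response["Body"].read()

if __name__ == "__main__":
    print(f"{len(training_files)} training files on the file system")
    doc = fetch_document("rag-docs", "manuals/install.pdf")  # illustrative key
    print(f"retrieved {len(doc)} bytes from object storage")
```

In DDN's framing, the point of fusing the two is that these two access patterns no longer require buying and managing two separate systems.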

Several DDN customers are already using xFusionAI. The computer maker Supermicro has realized a 15x speedup in its AI data center workflow, DDN said. As more enterprises look to deploy their RAG and agentic AI applications, they will need a unified data platform to support them, Bouzari said.


“Bringing these two together is giving us the ability to better address agentic AI types of use cases, as well as multimodal types of use cases, because now we can greatly accelerate the training side, as well as significantly lower the latency on the inference side,” he said.

DDN’s second GTC announcement, made today, is the launch of Inferno, a new software-defined appliance designed to accelerate AI inference workloads. While xFusionAI enables customers to go from AI training (largely EXAScaler workloads) all the way to RAG and inference (largely Infinia workloads), Inferno gives customers the extra performance kick that is sometimes needed to make inference workable in the real world.

Inferno brings together DDN’s Infinia 2.0 (launched last month) and Nvidia’s Spectrum-X networking to get response times down into the sub-millisecond range and lower costs. DDN says the solution can reduce AI inference latency by 10x compared to running against S3, and cut costs by up to 12x.
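Since DDN's comparison point is a plain S3 backend, the number that matters is per-request object latency under small, repeated reads, which is what dominates retrieval at inference time. A simple way to see that number for any S3-compatible endpoint is to time repeated GETs, as in the hypothetical sketch below; the endpoint, bucket, and key are placeholders, and this is not DDN's benchmark methodology.

```python
# Hypothetical sketch: measure per-request GET latency against an
# S3-compatible endpoint. Endpoint, credentials, bucket, and key are
# placeholders; illustrative only, not DDN's benchmark.
import statistics
import time

import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="http://object-store.example:9000",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

def time_gets(bucket: str, key: str, n: int = 100) -> list[float]:
    """Return wall-clock latency in milliseconds for n sequential GETs."""
    samples = []
    for _ in range(n):
        start = time.perf_counter()
        s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        samples.append((time.perf_counter() - start) * 1000.0)
    return samples

if __name__ == "__main__":
    latencies = time_gets("rag-docs", "chunks/000001.json")
    print(f"p50 = {statistics.median(latencies):.2f} ms, "
          f"p99 = {sorted(latencies)[98]:.2f} ms")
```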

“Inference is so expensive if you don’t do the latency properly,” Bouzari said. “At the end of the day, all of this–the whole infrastructure and the models and the training of the models–if you don’t deliver insight that is business enabling, it’s all b******t. So that last step, that inference step where multimodal resides, where agentic AI resides, cost reducing that is critical. And so that’s really what Inferno is doing. It’s significantly lowering the latency and therefore cost-reducing the inference part of it.”

Not every industry needs the sort of latency reduction that Inferno can deliver, but some do. Autonomous driving and high-frequency trading are two use cases where DDN expects Inferno to find success, but there are undoubtedly other places where latency plays a big factor in whether an AI solution will be a hit or a miss.

DDN is hoping to help its clients hit AI home runs in whatever industry they’re in. And that brings us to the company’s third GTC announcement: the launch of IndustrySync.

With IndustrySync, DDN is launching a series of industry-specific solutions that bring together all of the capabilities that customers need to succeed with AI. That includes hardware (Nvidia DGX boxes), software (DDN Infinia and its Data Intelligence software), and services from DDN, Nvidia, Nvidia Cloud Partners (NCPs), and cloud providers.

DDN is launching three IndustrySync “AI stacks” to start out, including packages for financial services, life sciences, and autonomous driving, but more will be on the way.

IndustrySync is necessary because enterprises are struggling to bring together all of the necessary components to build and run AI, Bouzari said. DDN has experience with hundreds of enterprise AI deployments (AI drove half-a-billion dollars in revenue for the company last year, Bouzari said), and it has seen many of the “gotchas” that can trip up AI projects. The goal with IndustrySync is to smooth the AI deployment so that it’s non-disruptive to the organization, he said.

“Eventually for us, the value is in deploying a software model which is what we’re doing with Nvidia, which can be deployed anywhere, on prem, in the cloud, and so on,” Bouzari said. “We don’t want to be a systems integrator. Look, Nvidia does not want to be a system integrator, but they are providing reference architectures and expert advice and helping handhold the deployment of these solutions. That’s what we’re doing.”

Related Items:

DDN Gooses AI Storage Pipelines with Infinia 2.0

Nvidia Debuts Enterprise Reference Architectures to Build AI Factories

DDN Cranks the Data Throughput with AI400X2 Turbo
