
The world's leading storage and server makers are combining their design and engineering expertise with the NVIDIA AI Data Platform, a customizable reference design for building a new class of AI infrastructure, to deliver systems that enable a new generation of agentic AI applications and tools.
The reference design is now being adopted by storage system leaders worldwide to support AI reasoning agents and unlock the value of data stored in the millions of documents, videos and PDFs enterprises use.
NVIDIA-Certified Storage partners DDN, Dell Technologies, Hewlett Packard Enterprise, Hitachi Vantara, IBM, NetApp, Nutanix, Pure Storage, VAST Data and WEKA are introducing products and solutions built on the NVIDIA AI Data Platform, which incorporates NVIDIA accelerated computing, networking and software.
In addition, AIC, ASUS, Foxconn, Quanta Cloud Technology, Supermicro, Wistron and other original design manufacturers (ODMs) are creating new storage and server hardware platforms that support the NVIDIA reference design. These platforms feature NVIDIA RTX PRO 6000 Blackwell Server Edition GPUs, NVIDIA BlueField DPUs and NVIDIA Spectrum-X Ethernet networking, and are optimized to run NVIDIA AI Enterprise software.
Such integrations let enterprises across industries quickly deploy storage and data platforms that scan, index, classify and retrieve large stores of private and public documents in real time. This augments AI agents as they reason and plan to solve complex, multistep problems.
Building agentic AI infrastructure with these new AI Data Platform-based solutions can help enterprises turn data into actionable knowledge using retrieval-augmented generation (RAG) software, including NVIDIA NeMo Retriever microservices and the AI-Q NVIDIA Blueprint.
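The retrieval step at the heart of a RAG pipeline can be sketched as follows. This is a minimal illustration only, with hypothetical document names and a simple word-overlap score; a production deployment would use embedding-based retrieval such as NVIDIA NeMo Retriever microservices, not this toy scoring.

```python
# Minimal sketch of RAG retrieval: index a small document store, score
# each document against a query, and return the best matches, which
# would then be passed to a language model as grounding context.
# Document names and contents are hypothetical.
from collections import Counter

DOCUMENTS = {
    "q3-report.pdf": "quarterly revenue grew driven by storage sales",
    "security-policy.pdf": "access control rules for classified documents",
    "onboarding-notes.txt": "transcript covering employee onboarding steps",
}

def tokenize(text: str) -> Counter:
    """Split text into a bag of lowercase words."""
    return Counter(text.lower().split())

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k document IDs whose text best overlaps the query."""
    q = tokenize(query)
    scores = {
        doc_id: sum((q & tokenize(text)).values())
        for doc_id, text in DOCUMENTS.items()
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

# The top-ranked chunk grounds the agent's answer.
context = retrieve("how did storage revenue grow this quarter")
print(context[0])
```

Swapping the overlap score for dense embeddings and the dictionary for a vector index yields the same pattern at enterprise scale.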
Storage systems built with the NVIDIA AI Data Platform reference design turn data into knowledge, boosting agentic AI accuracy across many use cases. This can help AI agents and customer service representatives provide faster, more accurate responses.
With more access to data, agents can also generate interactive summaries of complex documents, and even videos, for researchers of all kinds. Plus, they can help cybersecurity teams keep software secure.
Leading Storage Providers Showcase AI Data Platform to Power Agentic AI
Storage system leaders play a crucial role in providing the AI infrastructure that runs AI agents.
Embedding NVIDIA GPUs, networking and NIM microservices closer to storage enhances AI queries by bringing compute closer to critical content. Storage providers can integrate their document-security and access-control expertise into content-indexing and retrieval processes, improving security and data privacy compliance for AI inference.
Data platform leaders such as IBM, NetApp and VAST Data are using the NVIDIA reference design to scale their AI technologies.
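One way such access-control integration can work is to filter retrieval results by the requesting user's entitlements before any content reaches the model. The sketch below uses a hypothetical ACL table and group names; real platforms would enforce this with their existing permission systems rather than an in-memory dictionary.

```python
# Sketch of access-control-aware retrieval: documents are filtered by
# the requesting user's group memberships *before* retrieval results
# are handed to an AI agent, so the agent can never ground an answer
# on a document its user is not entitled to read.
# The ACL table and group names are hypothetical.
DOC_ACL = {
    "q3-report.pdf": {"finance", "exec"},
    "security-policy.pdf": {"security"},
    "all-hands-notes.txt": {"finance", "security", "engineering"},
}

def authorized_docs(user_groups: set[str]) -> list[str]:
    """Return only the documents the user's groups may access."""
    return [doc for doc, allowed in DOC_ACL.items()
            if user_groups & allowed]

# A finance analyst sees finance-tagged material but not the security policy.
print(sorted(authorized_docs({"finance"})))
```

Applying the filter at the storage layer, where permissions already live, is what lets these platforms extend existing compliance guarantees to AI inference.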
IBM Fusion, a hybrid cloud platform for running virtual machines, Kubernetes and AI workloads on Red Hat OpenShift, offers content-aware storage services that unlock the meaning of unstructured enterprise data, enhancing inferencing so AI assistants and agents can deliver better, more relevant answers. Content-aware storage enables faster time to insights for AI applications using RAG when combined with NVIDIA GPUs, NVIDIA networking, the AI-Q NVIDIA Blueprint and NVIDIA NeMo Retriever microservices, all part of the NVIDIA AI Data Platform.
NetApp is advancing enterprise storage for agentic AI with the NetApp AIPod solution built with the NVIDIA reference design. NetApp incorporates NVIDIA GPUs in data compute nodes to run NVIDIA NeMo Retriever microservices and connects these nodes to scalable storage with NVIDIA networking.
VAST Data is integrating NVIDIA AI-Q with the VAST Data Platform to deliver a unified, AI-native infrastructure for building and scaling intelligent multi-agent systems. With high-speed data access, enterprise-grade security and continuous learning loops, organizations can now operationalize agentic AI systems that drive smarter decisions, automate complex workflows and unlock new levels of productivity.
ODMs Innovate on AI Data Platform Hardware
Bringing their extensive experience in server and storage design and manufacturing, ODMs are working with storage system leaders to more quickly deliver innovative AI Data Platform hardware to enterprises.
ODMs provide the chassis design, GPU integration, cooling innovation and storage media connections needed to build AI Data Platform servers that are reliable, compact, power efficient and affordable.
A large share of the ODM industry's market consists of manufacturers based or colocated in Taiwan, making the region a vital hub for building the hardware that runs scalable agentic AI, inference and AI reasoning.
AIC, based in Taoyuan City, Taiwan, is building flash storage servers, powered by NVIDIA BlueField DPUs, that enable higher throughput and better power efficiency than traditional storage designs. These arrays are deployed in many AI Data Platform-based designs.
ASUS partnered with WEKA and IBM to showcase a next-generation unified storage system for AI and high-performance computing workloads, addressing a broad spectrum of storage needs. The RS501A-E12-RS12U, a WEKA-certified software-defined storage solution, overcomes traditional hardware limitations to deliver exceptional flexibility, supporting file, object and block storage, as well as all-flash, tiering and backup capabilities.
Foxconn, based in New Taipei City, builds many of the industry's accelerated servers and storage platforms used for AI Data Platform solutions. Its subsidiary Ingrasys offers NVIDIA-accelerated GPU servers that support the AI Data Platform.
Supermicro is using the reference design to build its intelligent all-flash storage arrays powered by the NVIDIA Grace CPU Superchip or BlueField-3 DPU. The Supermicro Petascale JBOF and Petascale All-Flash Array Storage Server deliver high performance and power efficiency with software-defined storage vendors and support use with AI Data Platform solutions.
Quanta Cloud Technology, also based in Taiwan, is designing and building accelerated server and storage appliances that include NVIDIA GPUs and networking. They're well suited to run NVIDIA AI Enterprise software and support AI Data Platform solutions.
Taipei-based Wistron and Wiwynn offer innovative hardware designs compatible with the AI Data Platform, incorporating NVIDIA GPUs, NVIDIA BlueField DPUs and NVIDIA Ethernet SuperNICs for accelerated compute and data movement.
Learn more about the latest agentic AI developments at NVIDIA GTC Taipei, running May 21-22 at COMPUTEX.
