NetApp is innovating in the AI space with several updates across our partner ecosystem. By deepening our partnership with Domino Data Lab, harnessing NVIDIA accelerated computing and AI software platforms, and releasing the NetApp® AIPod™ with Lenovo solution in its general availability (GA) version, we're driving innovation to new heights. These advancements provide our customers with top-tier, integrated AI solutions, ensuring a seamless and powerful AI experience. Get ready to see how we're shaping the future of enterprise AI.
NetApp's partnership with Domino Data Lab is a strategic move that underscores the importance of smooth integration across AI workflows, from data preparation to training to deployment. By using Amazon FSx for NetApp ONTAP as the underlying storage for Domino's hyperscaler-based AWS solutions, Domino Data Lab and NetApp ensure that the infrastructure supporting AI workloads is both high performing and highly reliable. This collaboration enhances Domino's offerings, giving businesses a seamless way to handle everything from model development to data management. It also further validates and solidifies NetApp's position as a versatile storage provider for cloud-based AI solutions.
NetApp and NVIDIA are long-standing technology partners, serving hundreds of customers with AI model training and inferencing. NetApp is one of the NVIDIA storage partners that can fulfill the data management and storage demands of both NVIDIA DGX BasePOD and NVIDIA DGX SuperPOD systems.
NetApp has begun NVIDIA's certification process for NetApp ONTAP storage on the AFF A90 platform with NVIDIA DGX SuperPOD AI infrastructure, which will enable organizations to leverage industry-leading data management capabilities for their largest AI projects. NetApp ONTAP addresses data management challenges for high-performance computing (HPC), eliminating the need to compromise on data management for AI training workloads.
The certification of AFF A90 systems running the latest ONTAP technology for NVIDIA DGX SuperPOD systems is an exciting prospect. Because ONTAP is designed for consistent, feature-rich data management across the data pipeline, this certification signifies that NetApp is prepared to meet the needs of the most demanding AI deployments, with advanced enterprise data management capabilities that will empower data scientists to push the boundaries of AI innovation.
NetApp is enhancing enterprise AI by integrating NVIDIA NeMo Retriever and NVIDIA NIM, part of the NVIDIA AI Enterprise software platform, with NetApp BlueXP™ services. This integration, initially unveiled at NVIDIA's GTC conference in March, will enable businesses to manage data effortlessly across both on-premises and cloud environments. Using BlueXP, companies can discover, search, and curate data on premises and in the public cloud while adhering to governance and security policies. Subsequently, this data is processed by NVIDIA NeMo Retriever and NVIDIA NIM for use in generative AI applications with the necessary security and privacy provisions. Currently showcased at the NetApp INSIGHT® conference, this integration is slated for a customer technology preview later this year, marking a milestone in NetApp's AI-driven innovation journey.
NetApp, with Lenovo and NVIDIA, is bringing AI inferencing to the enterprise with a simple, scalable, and secure converged infrastructure solution that is now GA. This solution integrates NVIDIA accelerated computing and NVIDIA AI Enterprise software with Lenovo's servers and NetApp's powerful storage systems into one comprehensive offering. NetApp AIPod with Lenovo technology is designed for generative AI and optimized to support retrieval-augmented generation (RAG) techniques. To learn more about the NetApp AIPod with Lenovo for NVIDIA OVX solution, read about how RAG technology can reshape the landscape of enterprise AI.
The E-Series certification update for NVIDIA DGX SuperPOD is a game changer for organizations that require massive computational power for AI model training and inference. With the E-Series now certified for DGX SuperPOD with NVIDIA DGX H100, DGX H200, and DGX B200 systems, users can experience substantial improvements in processing power, energy efficiency, and speed. This enhancement means that IT teams can expect their AI models to train faster, and data scientists can iterate more quickly, leading to a more efficient path from experimentation to production.
Our collaborations, from Domino Data Lab to NVIDIA, as well as the storage certifications for NVIDIA DGX SuperPOD and solutions like NetApp AIPod, are key components of a unified vision: to empower the full AI lifecycle, from data management to deployment, both on premises and in the cloud.
For IT infrastructure teams, this means deploying systems that are robust, scalable, and optimized for AI workloads. For data scientists, it translates to the freedom to experiment with assurance that their data is secure and their models are running as efficiently as possible.
Explore the NetApp suite of AI solutions, designed to empower organizations to innovate freely and effectively. As AI continues to drive business transformation, NetApp remains committed to providing the intelligent data infrastructure that makes it all possible.
Russell Fishman is the Sr. Director of Solutions Product Management at NetApp, leading the businesses for Artificial Intelligence, hybrid cloud virtualization, modern workloads, and enterprise database solutions. His responsibilities include solutions strategy, roadmaps, execution, and associated GTM across these areas.
During his ten years at NetApp, Russell led NetApp's Converged Systems Product Management group. His Converged Solutions innovation portfolio included FlexPod, HCI solutions, joint offerings with other infrastructure providers, manageability platforms, and GTM programs. Russell currently resides in Stamford, CT with his wife and children.