DataCenterNews Asia Pacific - Specialist news for cloud & data center decision-makers

Dell unveils new Pro Max AI PC & innovations for data centres


Dell has announced the launch of the Dell Pro Max AI PC, which features what the company states is the industry's first enterprise-grade discrete NPU in a mobile form factor.

The Dell Pro Max Plus laptop incorporates the Qualcomm AI 100 PC Inference Card, enabling on-device inferencing of large AI models, such as those with 109 billion parameters, that would typically run in the cloud. According to Dell, this addition makes the Pro Max Plus the world's first mobile workstation to offer an enterprise-grade discrete NPU.

The Qualcomm AI 100 PC Inference Card is equipped with 32 AI cores and 64 GB of memory. This configuration is designed to support AI engineers and data scientists working with substantial models for edge inferencing. The device is positioned as a solution for organisations seeking faster, more secure handling of AI workloads directly at the edge.

The Pro Max AI PC is now available as part of the Dell AI Factory portfolio. Dell stated that this release is part of a wider suite of infrastructure updates intended to deliver performance for enterprise AI workload development and deployment across client devices, data centres, edge locations, and cloud environments.

Alongside the new AI PC, Dell has introduced innovations aimed at improving data centre efficiency. Among these is the Dell PowerCool Enclosed Rear Door Heat Exchanger (eRDHx), which is engineered to capture all IT heat output with a self-contained airflow system, potentially reducing cooling energy costs by up to 60% compared with existing solutions. The technology allows data centres to use warmer water for cooling, between 32 and 36 degrees Celsius, removing reliance on expensive traditional chillers and enabling up to 16% more racks of dense compute capacity to be deployed without requiring additional power.

Further enhancements target risk management, offering features such as advanced leak detection, real-time thermal monitoring, and integrated management through Dell's Rack Controller. According to Dell, air cooling capacity can reach up to 80 kW per rack for dense AI and high-performance computing applications.

Dell also announced that its PowerEdge XE9785 and XE9785L servers will support AMD Instinct MI350 series GPUs, which offer 288 GB of HBM3E memory per GPU and, according to the company, up to 35 times greater inferencing performance than previous systems. The servers will be available with liquid and air cooling options to further reduce facility energy costs related to cooling.

The company's storage and data platforms received updates as well. Dell Project Lightning, described by the company as the world's fastest parallel file system based on internal testing, is said to provide double the throughput of competing systems, which could accelerate AI training times for large-scale, complex workflows. Enhancements to the Dell Data Lakehouse are designed to simplify AI workflows by enabling the creation and querying of AI-ready datasets for use cases such as recommendation engines, semantic search, and customer intent detection.

"We're excited to work with Dell to support our cutting-edge AI initiatives, and we expect Project Lightning to be a critical storage technology for our AI innovations," Dr. Paul Calleja, Director, Cambridge Open Zettascale Lab and Research Computing Services, University of Cambridge, commented.

In networking, Dell announced Linear Pluggable Optics, intended to lower power consumption and reduce latency for high-performance computing and AI deployments. The company also introduced AI Security and Resilience Services, which aim to provide end-to-end protection across AI infrastructure, data, applications, and models.

Expansion of Dell's AI partner ecosystem was also outlined, connecting organisations with AI solutions from companies including Cohere, Google, Meta, Glean, and Mistral AI. These partnerships facilitate deployment of enterprise search, agent-based AI applications, and on-premises AI models in a secure environment. Dell also revealed joint engineering efforts with AMD and Intel, supporting stacks such as AMD ROCm software and Intel Gaudi 3 AI accelerators for AI infrastructure.

"It has been a non-stop year of innovating for enterprises, and we're not slowing down. We have introduced more than 200 updates to the Dell AI Factory since last year. Our latest AI advancements — from groundbreaking AI PCs to cutting-edge data centre solutions — are designed to help organisations of every size to seamlessly adopt AI, drive faster insights, improve efficiency and accelerate their results," Jeff Clarke, Chief Operating Officer at Dell Technologies, said.

"We leverage the Dell AI Factory for our oceanic research at Oregon State University to revolutionise and address some of the planet's most critical challenges. Through advanced AI solutions, we're accelerating insights that empower global decision-makers to tackle climate change, safeguard marine ecosystems and drive meaningful progress for humanity," Christopher M. Sullivan, Director of Research and Academic Computing for the College of Earth, Ocean and Atmospheric Sciences at Oregon State University, said.

These announcements collectively aim to address industry needs related to data quality, deployment costs, and security, while supporting the transition of AI projects into production environments for organisations worldwide.
