Via EdgeIR.com: Zededa and Submer have partnered to deliver modular, liquid-cooled edge AI infrastructure for high-density GPU inference in locations without traditional data centers.
Linking Submer’s liquid-cooled AI infrastructure with Zededa’s edge intelligence platform, the partners will deliver scalable, secure and resilient edge AI deployments.
“AI is rapidly moving from centralized cloud environments into real-world operations, from industrial sites to telecom networks and remote energy infrastructure,” says Patrick Smets, CEO of Submer. “Delivering that intelligence requires purpose-built AI infrastructure that operates efficiently in environments where traditional data centers simply cannot exist. By combining Submer’s liquid-cooled high-density AI infrastructure with Zededa’s edge intelligence platform, we’re enabling organizations to deploy scalable, resilient AI infrastructure anywhere it is needed.”
Three modular solutions are offered: edge pods (2 to 8 GPUs), ruggedized micro-data centers (up to 168 GPUs), and megawatt-scale containerized systems (up to 800 GPUs).
Submer’s liquid cooling technology improves energy efficiency, reduces water consumption and enables high-density GPUs in harsh environments.
Zededa’s software-defined resilience reduces hardware redundancy costs by redistributing workloads during node failures, ensuring uptime and lowering total cost of ownership.
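The redistribution pattern described here can be illustrated in miniature. The following Python sketch is a hypothetical example only (the node names, capacities, and the `rebalance` helper are invented for illustration), not Zededa's actual scheduling logic: when a node fails, its workloads are reassigned to healthy nodes with spare GPU capacity instead of relying on idle standby hardware.

```python
# Hypothetical sketch of failover-style workload redistribution.
# Node names, capacities, and workloads are invented for illustration;
# this is not Zededa's implementation.

def rebalance(nodes, workloads):
    """Assign each workload to the healthy node with the most spare capacity."""
    placement = {}
    # Only healthy nodes participate; track remaining GPU capacity per node.
    capacity = {name: cap for name, (healthy, cap) in nodes.items() if healthy}
    # Place the largest workloads first to reduce fragmentation.
    for work, demand in sorted(workloads.items(), key=lambda kv: -kv[1]):
        target = max(capacity, key=capacity.get)
        if capacity[target] < demand:
            raise RuntimeError(f"no capacity left for workload {work}")
        placement[work] = target
        capacity[target] -= demand
    return placement

# Three edge nodes as (healthy?, GPU capacity); node-b has failed.
nodes = {"node-a": (True, 4), "node-b": (False, 4), "node-c": (True, 4)}
workloads = {"vision": 3, "predictive-maint": 2, "automation": 1}

print(rebalance(nodes, workloads))
```

Everything that was running on the failed node lands on the survivors, so uptime is preserved without a dedicated spare node, which is the cost argument the partners are making.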
The deployments address a wide range of AI workloads such as real-time computer vision, predictive maintenance and industrial automation so that AI can run at the edge in remote or extreme environments.
The first pilot deployments are expected to begin with industrial and telecommunications customers later this year.
The partners will focus on delivering the emerging edge AI infrastructure necessary to bring workloads from centralized cloud environments to operational and industrial settings.
Late last year Zededa rolled out full-stack edge Kubernetes to tackle large-scale AI deployments.