The Nokia Data Center AI strategy acknowledges that rapidly expanding artificial intelligence workloads are forcing data centers to rethink how their networks are built, scaled, and operated.
Vach Kompella, Senior Vice President and General Manager of IP Networks at Nokia, put it plainly when he said the “astonishing growth in AI adoption has led to a dramatic overhaul in how data centers operate.” Demand for more bandwidth, lower latency, and smarter automation is outpacing traditional infrastructure, and Nokia’s latest 7220 IXR-H6 switches were engineered specifically to meet these new conditions.
Their extreme performance profile aligns with the needs of modern AI applications, particularly agentic AI systems where large clusters of accelerators must coordinate in real time.
A major technical leap defining the Nokia Data Center AI portfolio comes from hardware capable of supporting 102.4 Tb/s throughput and interface speeds of both 800 Gigabit Ethernet and 1.6 Terabit Ethernet.
This essentially doubles the throughput and port performance of previous-generation equipment without increasing the physical footprint in the rack. For hyperscalers, cloud operators, and colocation providers, density improvements like this are crucial.
Space, power, and cooling constraints are tightening while AI-driven computational loads continue to rise, meaning operators must extract more performance from the same square footage while managing thermals and power budgets with sharper efficiency.
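The density claim is easy to sanity-check with simple arithmetic: a 102.4 Tb/s fabric can feed 128 ports of 800GbE or 64 ports of 1.6TbE at line rate. The sketch below is back-of-the-envelope math derived from those figures, not Nokia's published chassis configurations:

```python
# Back-of-the-envelope port math for a 102.4 Tb/s switching fabric.
# Derived arithmetic only -- not Nokia's published port configurations.

FABRIC_TBPS = 102.4

def max_ports(port_speed_gbps: float) -> int:
    """Number of ports of a given speed the fabric can feed at line rate."""
    return int(FABRIC_TBPS * 1000 // port_speed_gbps)

print(max_ports(800))    # 128 ports of 800GbE
print(max_ports(1600))   # 64 ports of 1.6TbE
```

Doubling the per-port speed halves the port count for the same fabric, which is why the 800GbE/1.6TbE pairing lets operators trade radix against per-link bandwidth without changing the rack footprint.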
Standards play a central role in the Nokia Data Center AI evolution. Alignment with Ultra Ethernet Consortium (UEC) specifications positions Nokia among the leading contributors to the push for AI-optimized networking.
UEC’s work targets the unique communication patterns of distributed AI training and inference, where thousands of GPUs or other accelerators exchange constant, tightly timed messages.
By improving congestion management, bandwidth allocation, and reliability at scale, UEC-aligned designs help ensure AI networks behave predictably under intense load, which is essential for both speed and cost control in industrial-scale training environments.
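The UEC defines its own transport and congestion-control mechanisms; as a generic illustration of the class of behavior involved, the toy loop below shows AIMD-style rate adjustment, where a sender probes upward when the path is clear and backs off sharply on a congestion signal. Function names and constants here are illustrative, not taken from the UEC specification:

```python
# Toy AIMD-style congestion control: additive increase while the path is
# clear, multiplicative decrease on a congestion signal (e.g. an ECN mark).
# Illustrative only -- not the UEC transport specification.

def adjust_rate(rate_gbps: float, congested: bool,
                increase_gbps: float = 10.0,
                decrease_factor: float = 0.5,
                line_rate_gbps: float = 800.0) -> float:
    if congested:
        return rate_gbps * decrease_factor               # back off sharply
    return min(rate_gbps + increase_gbps, line_rate_gbps)  # probe upward

rate = 400.0
for signal in [False, False, True, False]:  # two clear intervals, one congestion event
    rate = adjust_rate(rate, signal)
print(rate)  # 220.0
```

At AI-cluster scale, thousands of senders running logic like this must converge quickly and fairly, which is why UEC-aligned designs focus on faster, more precise congestion signaling than classic TCP-era schemes.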
Flexibility is another major differentiator for Nokia Data Center AI solutions. The new switches support both Nokia’s SR Linux operating system and the open-source Community SONiC platform, making Nokia the only vendor offering hardware that can run either an embedded NOS or SONiC.
According to Heidi Adams, Head of IP Marketing at Nokia, this gives operators the freedom to choose between an open ecosystem with community-based innovation or a fully integrated commercial platform backed by Nokia’s engineering resources. That flexibility helps operators standardize on a single hardware platform even when they maintain different operating models across regions or workloads.
Thermal management, once a background consideration, is now central to the Nokia Data Center AI hardware roadmap. With AI accelerators pushing past 700 watts per chip and approaching 1,000 watts in upcoming generations, traditional air-cooling designs struggle to keep pace.
Nokia’s portfolio includes both air-cooled and liquid-cooled switch variants. Liquid cooling through direct-to-chip or full immersion removes heat more efficiently, cuts energy costs, and supports higher-density deployments. These capabilities matter deeply as customers try to pack more accelerators into the same footprint while keeping operational costs under control.
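The arithmetic behind that thermal pressure is straightforward: moving from 700 W to roughly 1,000 W per accelerator adds over 40 percent more heat in the same footprint before any network gear is counted. The rack configuration below is a hypothetical example for illustration, not a Nokia reference design:

```python
# Hypothetical rack: 4 servers x 8 accelerators each. Per-chip power
# figures from the article (700 W today, ~1,000 W next generation).

accelerators = 4 * 8

today_kw = accelerators * 700 / 1000     # 22.4 kW of accelerator heat
next_gen_kw = accelerators * 1000 / 1000 # 32.0 kW in the same footprint

growth = (next_gen_kw - today_kw) / today_kw
print(f"{today_kw} kW -> {next_gen_kw} kW ({growth:.0%} more heat)")
```

Air cooling becomes marginal well below these densities, which is why direct-to-chip and immersion options move from niche choices to baseline requirements.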
Nokia Data Center AI innovation extends into operations through enhancements to the Event-Driven Automation platform, which now incorporates agentic AI-powered AIOps. This system can interpret natural language, understand network context, and provide reasoning-based guidance for diagnosing and resolving issues.
Combined with real-time telemetry, digital twins, dry-run simulations, and instant rollback features, EDA enables operators to achieve dramatic reductions in downtime. Research from Bell Labs Consulting and Futurum reports a 96 percent decline in data center network downtime when these intelligent automation tools are deployed.
The shift to agentic AI is a fundamental change in how operations work. Earlier automation systems were script-driven, meaning they executed predefined rules. Nokia Data Center AI tools now reason through issues, correlate telemetry signals, evaluate topologies, and propose or execute corrective actions autonomously.
As Adams noted, the system “doesn’t just understand what you ask; it understands your network.” That level of contextual intelligence simply wasn’t possible before the rise of large language models and modern AIOps engines.
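To make the contrast with script-driven automation concrete, the sketch below correlates telemetry events into a diagnosis using simple rules. The event names and rules are hypothetical, and Nokia's EDA/AIOps internals (LLM reasoning, topology awareness, dry-run simulation, rollback) are far richer than this; the point is only the shape of correlation-based diagnosis versus a fixed script:

```python
# Minimal sketch of correlating telemetry events into a diagnosis.
# Hypothetical event types and rules -- not Nokia EDA internals.

from collections import Counter

def diagnose(events: list[dict]) -> str:
    """Map a batch of telemetry events to a suspected root cause."""
    kinds = Counter(e["type"] for e in events)
    if kinds["link_flap"] >= 3:
        return "suspect faulty optic: repeated link flaps"
    if kinds["ecn_mark"] and kinds["pfc_pause"]:
        return "suspect congestion hotspot: ECN marks plus PFC pauses"
    return "no correlated fault pattern"

telemetry = [
    {"type": "ecn_mark", "port": "eth1/1"},
    {"type": "pfc_pause", "port": "eth1/1"},
]
print(diagnose(telemetry))  # suspect congestion hotspot: ECN marks plus PFC pauses
```

An agentic system goes a step further than any fixed rule set: it can generate and rank hypotheses like these from context rather than from predefined conditions, then propose or execute the fix.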
Real-world validation for Nokia Data Center AI comes from customers like data center operator Nscale, which has confirmed that Nokia's technology gives it significant operational advantages as AI demand accelerates. For operators already feeling pressure from rising density, rising thermal loads, and rising customer expectations, endorsements from active deployments help show that Nokia's claims translate into measurable value.
Nokia’s rollout timeline spans both immediate software availability and hardware deliveries expected through early 2026. EDA’s agentic AI capabilities are available for demonstration now and will begin deployment later in 2025. The next-generation 7220 IXR-H6 switches will start shipping in the first quarter of 2026. This phased approach lets customers begin transforming operational workflows through software before committing to large-scale hardware refresh cycles.
Looking forward, the Nokia Data Center AI roadmap is aligned with long-term industry trends. Market analysts at 650 Group expect Ethernet to dominate AI networking due to cost, openness, and scalability. They note that 1.6 Terabit Ethernet interfaces represent a key sweet spot for the next wave of AI infrastructure, validating Nokia’s architectural focus as hyperscalers scale out massive GPU farms.
To follow the infrastructure innovations enabling artificial intelligence's computational demands, and the networking architectures determining which providers will capture value from AI's explosive growth, visit ainewstoday.org for comprehensive coverage of data center switching advances, operational automation breakthroughs, standards development, and the technology choices shaping whether AI systems scale efficiently or hit bottlenecks that limit their transformative potential.