Building the future of AI-native networks
Over the past few decades I’ve lived through, and at times helped shape, networking’s major shifts, including the deployment of IPTV, the global spread of high-speed internet, the wireless explosion (LTE/5G), the ubiquitous growth of Wi-Fi, and the increasing use of cloud architectures and automation. Each of these shifts was ultimately an evolutionary step forward. But artificial intelligence (AI) is fundamentally different, and it requires a new way of thinking about networking technology.
The best way to understand this change is to look at the distinct networking requirements of AI training and AI inferencing. AI training runs on large graphics processing unit (GPU) clusters and depends on lossless, deterministic networking fabrics that absorb massive traffic bursts so jobs complete within tight timeframes. AI inference workloads depend on ultra-low-latency networks to deliver real-time responses and coordinate model execution in microseconds.
In this new world, minor networking inefficiencies slow apps, stall training, waste GPU minutes and drive up costs. The network has become a major constraint on AI performance, scale and return on investment (ROI).
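To make the "wasted GPU minutes" claim concrete, here is a back-of-envelope sketch of why network tail latency dominates synchronous training. The figures (64 workers, a per-worker p99 step latency) are hypothetical assumptions for illustration, not measurements from any specific cluster.

```python
# Back-of-envelope illustration of why tail latency gates synchronous
# AI training. Cluster size and percentile are hypothetical assumptions.

num_workers = 64
p = 0.99  # probability a single worker's step stays below its p99 latency

# A synchronous all-reduce step finishes only when the slowest worker does,
# so a step avoids a p99-or-worse straggler only if ALL workers avoid one.
prob_no_straggler = p ** num_workers
print(f"{prob_no_straggler:.2f}")
```

The result is roughly 0.53: even with only 64 workers, nearly half of all training steps are gated by a worst-1% network event somewhere in the fabric, which is why best-effort behavior translates directly into idle GPUs.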
That’s why we built the Nokia AI Networking Innovation Lab in Sunnyvale, California. At this center of excellence, we’re working with AI ecosystem partners to build highly performant and scalable networking solutions for AI.
Why AI demands a different network—and a different lab
Faster speeds and feeds alone won’t meet the demands of AI. AI workloads call for a more fundamental rethink of the network architecture that prioritizes:
- Reliability by design, eliminating failure domains and minimizing human error.
- Deterministic performance, favoring predictability over best-effort behavior.
- Multidimensional scalability, supporting intensive AI training and inferencing traffic patterns.
- Real-time operational awareness, leveraging automation that understands network state in near-real time.
- End-to-end validation, testing AI networks under realistic conditions across a verticalized stack within a multivendor ecosystem.
We recognized early on that AI changes the way networks need to be designed, along with the economics that underpin them. In response, we proactively built this new lab as a catalyst for innovation. It’s a place where we can co-develop with key partners, incubate new ideas and rigorously validate our solutions under real-world AI conditions.
Inside the Nokia AI Networking Innovation Lab
The lab is a unified hub where we engineer, test and validate next-generation AI networking hardware, protocols and architectures with our partners. It’s built on three pillars: technology innovation, ecosystem collaboration and real-world validation.
Technology innovation
We’re shaping the future of AI networking through rigorous testing and hands-on experimentation across emerging standards and technologies. The lab provides access to cutting-edge switching silicon (including the latest Tomahawk chipset for industry-leading switching capacity), new architectural models, and emerging protocols and features. These innovations are driving progress in congestion control, traffic engineering, real-time telemetry, intelligent automation and more.
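The congestion-control theme above can be illustrated with a toy sketch of ECN-driven rate adaptation, loosely in the spirit of the schemes used in lossless RDMA fabrics. Every constant and the `adjust_rate` function below are illustrative assumptions for this post, not any vendor's implementation.

```python
# Simplified sketch of ECN-driven sender rate control, loosely inspired by
# DCQCN-style schemes in RDMA fabrics. All constants are illustrative
# assumptions, not a real switch or NIC algorithm.

def adjust_rate(rate_gbps, ecn_fraction, alpha=0.5,
                additive_step=1.0, line_rate=100.0):
    """One control interval: back off in proportion to observed ECN marks,
    otherwise probe upward additively toward line rate."""
    if ecn_fraction > 0:
        # Multiplicative decrease scaled by how congested the path looks.
        rate_gbps *= 1 - alpha * ecn_fraction / 2
    else:
        # Additive increase when the path is clean, capped at line rate.
        rate_gbps = min(rate_gbps + additive_step, line_rate)
    return rate_gbps

rate = 100.0
for _ in range(5):    # sustained congestion: half of packets ECN-marked
    rate = adjust_rate(rate, ecn_fraction=0.5)
for _ in range(10):   # congestion clears: gradual additive recovery
    rate = adjust_rate(rate, ecn_fraction=0.0)
print(f"{rate:.1f} Gbps")
```

Even this toy model shows the trade-off the lab's testing targets: aggressive backoff protects the lossless fabric during incast bursts, but slow additive recovery leaves bandwidth idle, which is exactly where smarter congestion signals and real-time telemetry pay off.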
Our networking solutions power some of the world’s most demanding networks, and we weave that rigor into the lab’s methodologies. We actively participate in standards bodies such as the Ultra Ethernet Consortium (UEC) and have recently completed a successful end-to-end Ultra Ethernet test across our data center switches. We are also a platinum member of the Open Compute Project (OCP), where we participate in workstreams like Ethernet Scale-up Networking (ESUN).
Ecosystem collaboration
The lab serves as a co-innovation venue where we collaborate closely with leading AI technology partners to test interoperability and optimize end-to-end integrations. This coordinated approach reduces integration risk, accelerates release cycles and ensures customers can confidently deploy future-proof, ecosystem-ready solutions.
A strong example of ecosystem collaboration is the development of highly optimized and validated designs for front-end and back-end networks in AI factories built on AMD and NVIDIA GPUs and network interface cards (NICs).
Collaborations with ecosystem partners like Lenovo highlight the tangible value and impact of a truly integrated partner ecosystem.
Real-world validation
The lab is used to develop Nokia Validated Designs (NVDs) that minimize deployment risk and accelerate time to value. It enables end-to-end validation under real AI networking conditions, covering training and real-time inference workloads.
The resulting NVDs are downloadable today. Some NVDs are tailored with partners for specific use cases (e.g., Lenovo for sovereign AI, and Supermicro and Weka for training and inferencing).
Validation in realistic conditions across a complete vertical stack delivers predictable performance, faster time to market, minimal rework and downtime, and lower operational risk.
Leading the way in the AI Supercycle
Our AI Networking Innovation Lab is a strategic investment in advancing networking for modern AI workloads, a foundational technology for the AI Supercycle.
Co-innovation is a core focus. By collaborating with leading AI technology companies, we accelerate the creation of next-generation AI networking architectures and help bring new capabilities to market faster, reinforcing our leadership in AI-driven connectivity.
The lab targets investments where we can deliver true differentiation. By concentrating on networking innovation, we empower our partners and customers to unlock more value from AI.
The future of AI is bright, and Nokia will be a big part of it
As AI models grow, workloads diversify and performance expectations increase, the industry will need networks that are faster, smarter, more predictable and tightly integrated with the systems they support. The Nokia AI Networking Innovation Lab is built for this future. It gives us a place to work with partner companies to shape tomorrow’s AI ecosystem by engineering, testing and validating the next generation of AI-native networks.
The advancements emerging from the Sunnyvale lab will influence new architectures, standards and capabilities that extend well beyond the data center. They will guide the way networks scale, enable customers to deploy AI with confidence, and drive higher performance, efficiency and reliability across the industry.
We are committed to leading from the front—driving innovation, strengthening our ecosystem and delivering solutions that unlock the full potential of the AI Supercycle for our customers. The future of AI-native networking is being built now, and the Nokia AI Networking Innovation Lab is at the heart of that effort.