Blaize, a global leader in programmable, energy-efficient AI computing, today announced a strategic Memorandum of Understanding (MOU) with Nokia Solutions and Networks Singapore Pte. Ltd. (“Nokia”), a company incorporated under the laws of Singapore.
Targeting Asia Pacific markets, the MOU establishes a framework for joint exploration, development, and deployment of Practical AI and Physical AI systems designed for real-world operation, combining Nokia’s leadership in networking, automation, and cloud infrastructure with Blaize’s programmable AI inference platform. Together, the companies aim to enable Real World AI that operates reliably at the edge and across hybrid environments where latency, power efficiency, and operational resilience are critical.
Under the MOU, Blaize and Nokia intend to collaborate on:
- Edge and hybrid AI inference use cases, integrating Blaize’s Hybrid AI platform with Nokia’s IP networking, data center networking, and automation systems
- Reference architectures and solution blueprints that position Blaize as a complementary AI inference layer alongside Nokia’s AI networking infrastructure
- Joint validation of AI inference deployments for telecom, industrial, and smart infrastructure environments
- Go-to-market and ecosystem initiatives, including customer workshops, pilot programs, and solution demonstrations focused on production-ready AI
The collaboration reflects a shared belief that the next phase of AI adoption will be driven by inference at scale, not just model training. By pairing energy-efficient AI inference at the edge with centralized GPU resources in the cloud, Blaize and Nokia are addressing the growing demand for Hybrid AI architectures that balance performance, cost, and operational efficiency.
“Our collaboration with Nokia marks an important step forward in delivering Practical AI and Physical AI at scale,” said Dinakar Munagala, Co-Founder and Chief Executive Officer of Blaize. “By combining Nokia’s leadership in connectivity and automation with the Blaize AI inference platform, we are enabling Real World AI that runs efficiently at the edge while integrating seamlessly with cloud and GPU infrastructure. This Hybrid AI approach allows organizations to deploy inference where it matters most and turn intelligence into real operational outcomes.”
Nokia intends to contribute its expertise in networking, automation, and cloud infrastructure, including leadership in IP networking, industrial connectivity, and intelligent network operations. Blaize intends to provide its AI inference hardware and software platform, designed to complement GPU-based systems by enabling scalable, low-power inference closer to where data is generated and actions are taken.
“Our work with Blaize is accelerating, and we are excited to build on this momentum into the coming year,” said Sang Xulei, Senior Vice President and Head of Network Infrastructure, Asia Pacific at Nokia. “As demand grows for Practical AI and Physical AI systems that operate in real world environments, Blaize’s energy efficient AI inference platform gives us the flexibility to extend Hybrid AI architectures across networks, edge systems, and cloud infrastructure. Together, we are enabling scalable, production ready intelligence that brings AI closer to where data is generated and actions are taken in the region.”
The collaboration leverages Blaize’s specialized AI inference platform, chosen for its ability to support real-world AI deployment across edge, cloud, and data center environments.
The MOU is non-binding and outlines a cooperative framework under which the parties may pursue specific projects in the Asia Pacific region through future definitive agreements. The collaboration will focus on enabling secure, scalable, and energy-efficient AI inference deployments that integrate seamlessly into existing network, cloud, and industrial environments.