
VeriSilicon and Google Unveil Open-Source Coral NPU IP, Advancing Ultra-Low-Power Edge LLM Deployment
VeriSilicon has announced a major collaborative milestone with Google: the joint release of the open-source Coral NPU IP, engineered specifically for always-on, ultra-low-energy edge Large Language Model (LLM) applications. This launch brings together Google’s foundational research in machine learning compilers and VeriSilicon’s proven strengths in chip design, silicon verification, and system-level optimization. Together, they are laying the groundwork for a new generation of edge AI solutions that are efficient, secure, and open for developers worldwide.
The Coral NPU IP is particularly significant in today’s rapidly evolving AI landscape, where demand for high-performance on-device inference continues to rise. From smart wearables and ambient sensing systems to next-generation augmented reality devices, the industry is increasingly moving toward edge-based LLMs that can operate continuously while consuming minimal power. The new Coral NPU IP initiative directly addresses these needs by offering an open, flexible, and developer-friendly infrastructure for designing custom intelligent hardware.
A New Open-Source Foundation for Edge AI
At the heart of the Coral NPU IP is Google’s longstanding commitment to open-source machine learning tools and compiler technology. The IP integrates Google’s advanced research in open ML compilers, creating a unified, transparent, and easily extensible platform. This enables developers, researchers, and chipmakers to build and refine edge AI systems without proprietary limitations.
Built on the open RISC-V instruction set architecture, the Coral NPU offers native tensor processing capabilities designed to accelerate complex machine learning workloads directly on the device. RISC-V’s open and highly customizable nature allows silicon designers to fine-tune performance, reduce power consumption, and optimize silicon area—all critical factors for always-on edge devices where resources are constrained.
The NPU also includes built-in support for widely adopted machine learning frameworks such as:
- JAX
- PyTorch
- TensorFlow Lite (TFLite)
This broad compatibility ensures developers can transition their existing AI models to edge-based deployment with minimal friction. It opens the door for a vibrant ecosystem where AI researchers and product developers can seamlessly deploy models across a wide range of edge devices, from wearables and home sensors to mobile accessories and specialized IoT products.
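To give a concrete flavor of what an edge deployment pipeline typically does before a trained model reaches an NPU, the sketch below shows post-training int8 quantization in plain Python. This is a toy illustration, not the Coral NPU toolchain; the function names are invented, and real toolchains use per-channel schemes and calibration data. Shrinking weights from 32-bit floats to 8-bit integers is one of the main levers for the low-power, small-silicon-area operation that always-on edge devices require.

```python
# Hypothetical sketch of symmetric post-training int8 quantization,
# the kind of step an edge toolchain applies before device deployment.
# NOT the Coral NPU toolchain; names and scheme are illustrative only.

def quantize_int8(weights):
    """Map float weights to int8 values plus a single scale factor."""
    max_abs = max(abs(w) for w in weights) or 1.0
    scale = max_abs / 127.0  # largest magnitude maps to +/-127
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 values."""
    return [v * scale for v in q]

weights = [0.42, -1.27, 0.08, 0.9]
q, scale = quantize_int8(weights)
print(q)                      # int8 representation
print(dequantize(q, scale))   # approximate original weights
```

Storing and multiplying 8-bit integers instead of 32-bit floats cuts memory traffic by roughly 4x, which is usually the dominant energy cost in always-on inference.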
Leveraging Open-Standard Compiler Infrastructure
A notable highlight of the Coral NPU design is its reliance on open-standard compiler tools, particularly the Multi-Level Intermediate Representation (MLIR) from the LLVM project. MLIR is an increasingly influential technology in the world of AI acceleration, enabling hardware-agnostic optimization, flexible model transformations, and efficient compilation pipelines.
By embracing MLIR and other open tools, the Coral NPU IP ensures:
- Highly modular compiler architecture
- Visibility into optimization processes
- Customization for different model architectures
- Easier integration with emerging AI workflows
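The modularity claim above can be illustrated with a toy Python sketch of the multi-level lowering idea behind MLIR. This is not real MLIR (real passes operate on typed IR with regions and dialects, and all names here are invented); it only shows why a pipeline of small, swappable transformations is easy to extend: a custom pass for a new model architecture slots in without touching the other stages.

```python
# Toy sketch of a multi-level lowering pipeline (NOT real MLIR).
# Each stage is a plain function from IR to IR, so passes can be
# inspected, reordered, or replaced independently.

def fuse_ops(ir):
    """Pretend fusion pass: merge adjacent mul+add into a fused muladd."""
    out, i = [], 0
    while i < len(ir):
        if i + 1 < len(ir) and ir[i] == "mul" and ir[i + 1] == "add":
            out.append("muladd")
            i += 2
        else:
            out.append(ir[i])
            i += 1
    return out

def lower_to_hw(ir):
    """Pretend lowering pass: map high-level ops to invented NPU opcodes."""
    table = {"muladd": "npu.mac", "mul": "npu.mul", "add": "npu.add"}
    return [table.get(op, op) for op in ir]

PIPELINE = [fuse_ops, lower_to_hw]  # add or swap passes here

def compile_model(ir):
    for stage in PIPELINE:
        ir = stage(ir)
    return ir

print(compile_model(["mul", "add", "add"]))  # ['npu.mac', 'npu.add']
```

In MLIR proper, the same idea appears as dialects lowered step by step (for example, a high-level ML dialect down toward hardware-specific operations), with each conversion pass visible and replaceable rather than hidden inside a monolithic proprietary compiler.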
This open approach is essential for the development of long-term, sustainable edge AI ecosystems. It eliminates traditional barriers imposed by proprietary compilers and creates an environment where hardware and software innovation can move forward together at an accelerated pace.
Advanced AI Security Features Built In
Security has emerged as a key concern for always-on AI systems, especially those processing sensitive audio, visual, or contextual data at the edge. The Coral NPU IP integrates AI-focused security features directly into the architecture, ensuring that edge-based LLM operations remain protected against common attack vectors.
These embedded security enhancements help safeguard:
- Model execution integrity
- Data pathways during inference
- Access to hardware resources
- Firmware and runtime environments
By building security into the core of the NPU, the platform supports more trustworthy edge intelligence—especially for applications like wearables, smart home sensors, and enterprise devices where privacy and reliability are critical.
Available Worldwide Through Google Developers
The Coral NPU IP is now available globally as an open-source project hosted through the Google Developers platform. This move dramatically increases accessibility for universities, startups, semiconductor companies, and independent developers. Anyone can explore the IP, modify it, or integrate it into new hardware designs under a permissive open-source license, free of proprietary restrictions.
This open release also encourages cross-industry collaboration. As more developers and companies experiment with the Coral NPU, the ecosystem around edge LLMs is expected to grow—leading to faster improvement cycles, broader experimentation, and more diverse applications.
VeriSilicon to Deliver Enterprise-Ready, Commercial Versions
While the open-source version is freely accessible, VeriSilicon is simultaneously preparing a commercial-grade edition of the Coral NPU IP. This enterprise version will be optimized for industrial deployment with:
- Silicon-proven design
- Enhanced reliability and performance tuning
- Advanced verification and validation
- Optimized power management features
- Extended support and customization options
VeriSilicon plans to leverage its extensive IP portfolio—ranging from multimedia processing to advanced connectivity blocks—to provide one-stop custom silicon services for companies looking to commercialize Coral NPU-powered chips.
Validation Chip in Development for Wearables and Smart Home AI
To accelerate real-world adoption, VeriSilicon is currently developing a validation chip based on the Coral NPU IP. This chip is targeted at applications such as:
- AI-enhanced AR glasses
- Smart home devices
- Ambient intelligence sensors
- Always-listening AI assistants
- Low-power, continuous monitoring systems
With LLMs increasingly being miniaturized and optimized for on-device processing, the validation chip will help manufacturers rapidly prototype new products and bring commercial solutions to market faster.
Strengthening the Open-Source Edge AI Ecosystem
This collaboration builds on VeriSilicon’s previous work with Google on the Open Se Cura open-source project. The two companies continue to deepen their integration efforts as they push forward the boundaries of accessible and secure edge AI computing.
Wiseway Wang, Executive Vice President and General Manager of VeriSilicon’s Custom Silicon Platform Division, emphasized the long-term vision behind the partnership:
“This builds on our prior experience on the Open Se Cura project and represents the continued deep integration of Google’s open-source technology with VeriSilicon’s chip design and commercialization capabilities. We will continue to leverage our strengths in chip design, verification, and system-level optimization to advance the edge AI ecosystem, support the deployment of open-source technologies in real-world products, and enable edge LLM applications—providing strong support for industry innovation.”
A Significant Step Toward Mass Adoption of Edge LLMs
With the introduction of the open-source Coral NPU IP, VeriSilicon and Google are helping democratize access to high-performance, ultra-efficient edge AI technology. This initiative sets the stage for the next generation of AI devices that are increasingly intelligent, always available, and capable of running powerful LLMs without relying on cloud connectivity.
In an era when privacy, responsiveness, and energy efficiency matter more than ever, the Coral NPU represents a foundational advancement for the global semiconductor and AI communities. As more developers adopt and build upon this open-source infrastructure, the industry will likely see a surge of innovative edge AI products, accelerated chip development cycles, and a stronger focus on secure, sustainable intelligence at the device level.
Source Link: https://www.businesswire.com/



