New Patent by Professor Louri Reinforces Network-on-Chip Security


April 10, 2025

Ahmed Louri

As the demand for faster, more efficient computing grows, parallel computing has become the backbone of modern technology. In parallel computing, key components such as processors and memory units work together seamlessly to perform computations. Networks-on-Chip (NoCs) have emerged as the standard interconnect fabric that allows these components to communicate, but they are not without vulnerabilities. Among the biggest threats are Hardware Trojans (HTs).

“Hardware trojans are just like the trojans of the past: they are stealthy by design, mimicking legitimate circuitry while covertly compromising chip integrity,” said Ahmed Louri, the David and Marilyn Karlgaard Professor of Electrical and Computer Engineering. “Their activation can cripple communication across the NoC, making preemptive detection vital to modern computing security.”

HTs are malicious circuits inserted during design or manufacturing that lie dormant until triggered to leak data, disrupt services, or cause catastrophic failures. To guard against HTs, Louri and his team developed a novel learning-enabled framework to promptly and accurately detect and isolate these threats with minimal performance loss.

Traditional techniques rely on static signature-based monitoring or historical behavior analysis, which struggle to detect HTs and introduce lag due to resource-heavy data storage. Louri’s patented approach, “Systems and methods for learning-based high-performance, energy-efficient, and secure on-chip communication design framework” (Patent Number: 12,128,126), eliminates this bottleneck by embedding artificial intelligence (AI) and machine learning (ML) within the device’s NoC architecture to enable real-time threat detection and mitigation.

The ML-enabled NoC routers continuously analyze traffic patterns to learn normal behavior and identify anomalies indicative of HT activation. Upon threat detection, the framework isolates the compromised routers and reroutes data through secure paths, all without interrupting computational throughput and with minimal energy overhead. Louri shared that AI and ML are the “secret sauce” of this patent.
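The patent itself is not reproduced here, so the sketch below is only a generic illustration of the kind of mechanism described above: a router-level monitor that learns a baseline traffic statistic online, flags large deviations as potential HT activity, and recomputes a route that avoids the flagged router. The chosen feature (flits per time window), the z-score test, Welford’s running statistics, and the mesh/BFS rerouting are all assumptions made for illustration, not the patented ML models.

```python
# Minimal illustrative sketch (not the patented design): each router tracks a
# running baseline of one traffic feature, flags large deviations, and flagged
# routers are excluded when a new route is computed on a 2D mesh NoC.
from collections import deque
import math


class RouterMonitor:
    """Tracks running mean/variance of a traffic feature (e.g., flits per
    time window) with Welford's algorithm and flags large deviations."""

    def __init__(self, z_threshold=4.0, warmup=50):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0
        self.z_threshold = z_threshold
        self.warmup = warmup  # observations used to learn "normal" behavior

    def observe(self, value):
        """Return True if `value` looks anomalous, then fold it into the model."""
        anomalous = False
        if self.n >= self.warmup and self.n > 1:
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(value - self.mean) / std > self.z_threshold:
                anomalous = True
        # Welford update of running mean/variance
        self.n += 1
        delta = value - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (value - self.mean)
        return anomalous


def reroute(src, dst, mesh_dim, blocked):
    """Shortest path on a mesh_dim x mesh_dim NoC mesh via BFS, avoiding
    routers flagged as compromised (`blocked`)."""
    def neighbors(node):
        x, y = node
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < mesh_dim and 0 <= ny < mesh_dim and (nx, ny) not in blocked:
                yield (nx, ny)

    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nbr in neighbors(path[-1]):
            if nbr not in seen:
                seen.add(nbr)
                queue.append(path + [nbr])
    return None  # no secure path available


if __name__ == "__main__":
    monitor = RouterMonitor()
    # Learn a baseline of roughly 100-104 flits per window, then see a burst.
    for i in range(200):
        monitor.observe(100.0 + (i % 5))
    if monitor.observe(500.0):
        blocked = {(1, 1)}  # hypothetically flag router (1, 1) as compromised
        print("Detour around (1, 1):", reroute((0, 0), (2, 2), 4, blocked))
```

In this toy version, detection and rerouting are software functions; the article describes them as hardware-level logic embedded in the routers themselves, which is where the energy and latency advantages come from.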

“Unlike conventional security measures, our framework operates at the hardware level, using on-device learning to adapt to evolving threats in real-time,” he emphasized.

By deploying energy-efficient ML models directly on-chip, the system achieves superior HT detection accuracy and reduces power consumption by up to 30% compared to software-based solutions. This dual focus on security and efficiency ensures robust protection without sacrificing the speed or scalability needed for applications like AI accelerators and edge computing.

With hardware-based cyber threats like HTs escalating due to global semiconductor supply chains, securing NoCs has become a critical frontier in hardware security. “Most research targets processor-level vulnerabilities, but the communication fabric, the NoC, is an equally attractive attack surface in multicores,” Louri noted.

Louri’s latest patent bridges this gap by offering a cross-layer defense mechanism that complements processor-centric security. This is crucial as multicore systems dominate industries where compromised communication may lead to systemic failures, such as autonomous vehicles and cloud infrastructure.

Last fall, Louri secured another patent for a fault-tolerant scheme that tackles reliability issues by ensuring NoCs can recover from faults and internal disruptions without duplicating computing resources or increasing resource consumption. The new invention builds on that pioneering contribution to resilient and sustainable computing by extending resilience to cybersecurity, creating a unified framework for both fault recovery and threat mitigation. Together, the two frameworks enable next-generation architectures that are not only faster and more scalable but also inherently secure and sustainable.

By redefining NoC security through intelligent, adaptive design, Louri’s patent marks another transformative step toward safeguarding the future of computing, where performance, robustness, and protection coexist seamlessly.