New Patents Advance Scalable, Efficient AI Systems
March 13, 2026

Ahmed Louri

Ahmed Louri, the David and Marilyn Karlgaard Professor of Electrical and Computer Engineering, and his students have received two new patents for critical artificial intelligence (AI) accelerator designs, advancing faster, scalable, and more energy-efficient computational platforms for AI systems.

The first patent, with student Jiajun Li, introduces a scalable accelerator architecture that speeds up graph-based AI models (models in which information is represented as a network of connected nodes) used in everyday applications such as e-commerce analysis, recommendation systems, and drug discovery, while providing the flexibility needed for efficiency and workload balancing. The second patent, with students Jiaqi Yang and Hao Zheng, covers a versatile accelerator that enables multiple deep neural networks to run concurrently, featuring advanced memory-access optimization, adaptable communication fabrics, and an intelligent scheduling algorithm that improves performance, efficiency, and applicability.

These innovations build on Louri's ongoing work developing next-generation computing architectures, which has also earned him patents for frameworks that optimize and secure on-chip communication. To date, Louri holds more than 10 patents from the US Patent and Trademark Office for AI-related technologies.