Dr. Gina Adam is an assistant professor who joined our department in Fall 2018. Her research lab aims to develop novel hardware foundations that will enable new ways of computing. In particular, her team works on novel electronic devices called memristors, which can implement the functionality of artificial synapses in a compact and energy-efficient way for artificial neural network hardware. She explains, “A memristor has a programmable state that can be retained naturally thanks to its ionic-based device physics, making it energy efficient. These two characteristics, state programmability and retention, make it ideal for implementing artificial synapses in artificial neural networks, since they are precisely what provide learning and memory capabilities in the brain. Our research endeavors lie at the intersection of identifying materials for these devices, engineering the device design for integration into circuits, and exploring the algorithms that can be implemented using these memristor-based systems.”
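To make the synapse analogy concrete, here is a minimal Python/NumPy sketch, an illustrative simplification with assumed names and a linear device model rather than Dr. Adam’s actual implementation, of a crossbar of programmable conductances acting as a layer of artificial synapses: applying input voltages to the rows yields column currents equal to a weighted sum, i.e. a vector-matrix multiply performed directly in the analog domain.

import numpy as np

# Simplified model of a memristor crossbar used as a synaptic weight array.
# Each device's conductance G[i, j] stores one synaptic weight in a
# non-volatile, programmable state; reading the array performs the
# weighted sum that a layer of synapses computes.
class CrossbarLayer:
    def __init__(self, n_inputs, n_outputs, g_min=1e-6, g_max=1e-4, seed=0):
        rng = np.random.default_rng(seed)
        self.G = rng.uniform(g_min, g_max, size=(n_inputs, n_outputs))  # siemens

    def forward(self, voltages):
        # Kirchhoff's current law: each column current is sum_i V[i] * G[i, j],
        # i.e. one analog vector-matrix multiply per read.
        return voltages @ self.G

layer = CrossbarLayer(n_inputs=4, n_outputs=2)
currents = layer.forward(np.array([0.10, 0.00, 0.20, 0.05]))  # column currents (A)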
There are many challenges and limitations associated with this field of research. Device variability is one of the grand challenges preventing memristor technology from being adopted by industry. Impressive performance has been shown on so-called “hero” devices, the best devices in a fabrication run. However, typically only a few percent of the devices perform extremely well, while the average device performs only moderately and some devices do not switch at all. Building large memristive systems is severely constrained when there is such a wide performance gap between devices.
No wonder device variability is typically seen as an undesirable trait. However, what sets Dr. Adam’s research apart from other work in this area is that her team looks at device variability as an opportunity to learn more about these devices and systems. “We are tackling device variability from different perspectives. For instance, we look at the devices during operation to understand their failure modes, which helps us engineer better materials and designs. We also look at algorithms that are robust to these device non-idealities and can still deliver high performance despite the device variability. Reliable device manufacturing is crucial for robust device performance, so we also carefully investigate how to improve the nanofabrication flow.”
Luckily, the new GW Nanofabrication and Imaging Center (GWNIC), located in the Science and Engineering Hall, offers access to expensive equipment and dedicated staff, making the research achievable. “Such a facility can be used for transformative work and motivate students to join our research efforts,” she remarks. “I wish I had had such a facility when I was an undergraduate student, so I encourage students from different fields to explore using this facility for their work.”

Dr. Adam’s journey toward nanotechnology and memristors began in 2009, when she was an undergraduate student in Romania. “I read an article in IEEE Spectrum about nanoscale memristors and their behavior resembling an artificial synapse. I found the field fascinating and applied for a Fulbright fellowship and Ph.D. studies in the US. The following year I was fortunate to begin my Ph.D. at the University of California, Santa Barbara, working in the group of one of the professors featured in that Spectrum article.”
Now, ten years later, Dr. Adam’s research in memristive systems has been funded by NSF, DARPA/ONR, and GWU through the CDRF and UFF programs. She is currently working with Prof. Igor Efimov from the Biomedical Engineering Department and Prof. Marco Mercader from the School of Medicine on a project on memristive neuromorphic networks for high-definition bioelectric diagnostics and therapy. She also has active collaborations with NIST Gaithersburg and Western Digital Research on memory-efficient hardware for accelerating neural network training.
Dr. Adam and her team recently showed in simulations how batch training can deliver significant improvements, approaching software-level accuracies despite substantial variability in device performance. They have proposed a streaming batch principal component analysis algorithm that uses low-rank approximations to reduce the memory and computational overhead of such neural network training. One of her students, Siyuan Huang, presented a paper on this topic at the AAAI conference in February 2020. They are currently working with their collaborators on experimental demonstrations.
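As a rough illustration of this kind of low-rank, streaming approach (the published algorithm differs; the function below, its interface, and the QR/SVD re-compression step are assumptions made for this sketch), the following NumPy code accumulates a batch of outer-product weight updates into a rank-k factorization, so that the full update matrix never has to be stored and only k outer products would need to be programmed into a crossbar.

import numpy as np

def streaming_batch_update(xs, deltas, k):
    """Accumulate a batch of rank-1 weight updates x_i * delta_i^T as a
    rank-k factorization U @ V.T instead of storing the full matrix.
    (Hypothetical sketch; not the published streaming batch PCA algorithm.)"""
    n, m = xs[0].shape[0], deltas[0].shape[0]
    U, V = np.zeros((n, k)), np.zeros((m, k))
    for x, d in zip(xs, deltas):
        # Append the new rank-1 term to the running factors ...
        U_aug = np.hstack([U, x[:, None]])
        V_aug = np.hstack([V, d[:, None]])
        # ... then re-compress back to rank k with a small QR + SVD step.
        Q_u, R_u = np.linalg.qr(U_aug)
        Q_v, R_v = np.linalg.qr(V_aug)
        Uc, sv, Vct = np.linalg.svd(R_u @ R_v.T)
        U = (Q_u @ Uc[:, :k]) * sv[:k]   # singular values folded into U
        V = Q_v @ Vct.T[:, :k]
    return U, V

# Example: 32 training examples compressed into a rank-4 batch update.
rng = np.random.default_rng(0)
xs = [rng.standard_normal(64) for _ in range(32)]
deltas = [rng.standard_normal(10) for _ in range(32)]
U, V = streaming_batch_update(xs, deltas, k=4)
# U @ V.T approximates the summed batch update; each column pair
# (U[:, j], V[:, j]) maps to one outer-product programming step on a crossbar.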
The long-term goal of this research is to build hardware that can perform complex computations with an efficiency approaching that of the human brain. The immediate goal is to develop hardware accelerators for neural network training and inference for a variety of edge applications. These energy-efficient computing technologies could be used in applications ranging from self-driving cars to handwriting recognition and image classification.
Dr. Adam highlights that all this work has been possible thanks to the hard-working GW students in her group. “Students are the engine behind the work that I do, and I am grateful to have such talented and diverse students on my team, coming from departments ranging from ECE to CS, BME, and MAE.” While Dr. Adam’s group started less than two years ago, the students’ hard work has already shown results. Alyssa Andrade received a Clare Boothe Luce fellowship for 2019-2020. Jonathan Schwartz received an Honorable Mention from the NSF Graduate Research Fellowship Program. Several alumni have been accepted to top MSc and Ph.D. programs at institutions such as MIT, CMU, and Vanderbilt University. Dr. Adam hopes to see her group develop and grow in the future. “I want to thank my students for their dedication, and I welcome motivated graduate and undergraduate students who might be interested in this area of work to contact me at [email protected].”