EECS Rising Stars 2023
Foroozan Karimzadeh

Software and Hardware for Artificial General Intelligence



Research Abstract:

Artificial Intelligence (AI) has profoundly reshaped our society, revolutionizing industries and daily life. It has automated tasks and boosted efficiency and productivity in many important applications, such as healthcare for disease diagnosis, automotive systems for improved road safety, and natural language processing (NLP) for enabling machines to understand and generate human language. As AI models continue to grow in size, complexity, and scale, the demand for computational resources has escalated exponentially. The problem becomes even more acute when deploying AI on edge and mobile devices with constrained resources (memory, battery, etc.). Edge-enabled Internet of Things (IoT) devices encompass a wide array of device types, from sensors and smartphones to self-driving cars and security cameras, and the number of consumer AI-powered edge-enabled IoT devices worldwide is forecast to grow to almost 6.5 billion by 2030.

Mobile systems are becoming progressively more proficient at executing advanced AI tasks, including artificial general intelligence (AGI) models, a form of AI with human-level performance. These models, which include large language models (LLMs), have enabled impressive progress across diverse fields such as computer vision, NLP, and healthcare. Yet effectively deploying and operating these demanding AI models gives rise to notable difficulties.

The growth of generalizable AI requires three key enablers: (A) algorithm design, such as deep neural networks (DNNs) and LLMs; (B) hardware design, such as GPUs, FPGAs, and ASICs; and (C) architecture design that supports deploying algorithms to hardware for general applications. These three enablers are closely related and must be studied together rather than individually. My research is interdisciplinary and bridges the gap between the three enablers to accommodate the rapid growth of AGI. My current and prior work covers the independent study as well as the co-design of these enablers, including: (1) algorithm design for efficient deep learning and LLM processing; (2) DNN and hardware implementation co-design; and (3) architecture and hardware accelerator development for efficient LLM and generative model computing.

Bio:

Foroozan Karimzadeh is currently a postdoctoral researcher at the Georgia Institute of Technology. She received her PhD in Electrical and Computer Engineering from the Georgia Institute of Technology in 2022, under the supervision of Dr. Arijit Raychowdhury. She also received an M.Sc. in Biomedical Engineering and a B.S. in Electrical and Computer Engineering from Shiraz University, Iran, in 2016 and 2012, respectively. Her research interests mainly include developing novel deep learning algorithms and large language and generative models for computer vision, autonomous driving, and biomedical applications. She also works on hardware co-design for memory- and energy-efficient machine learning techniques. She was awarded the prestigious Semiconductor Research Corporation (SRC) Graduate Fellowship, given in partnership with Texas Instruments. She was also named a DAC Young Fellow at the Design Automation Conference 2022. Foroozan was selected twice as a main-stage speaker at the IEEE Women in Engineering International Leadership Conference (IEEE WIE-ILC), in 2021 and 2022.