EECS Rising Stars 2023




Tara Boroushaki

Innovative Multi-Modal Sensing Technologies: Enabling Superhuman Perception in Mobile Systems



Research Abstract:

My research interests are in advancing mobile sensing technologies and systems, with applications in networking, Human-Computer Interaction (HCI), and robotics. I develop innovative algorithms and build integrated systems that harness multi-modal sensing to sense, connect to, and interact with the environment in novel ways. My work centers on leveraging cutting-edge sensing technologies, such as mmWave imaging, RFID technology, and the Internet of Things (IoT), to develop fundamentally new approaches to sensing.

Computational sensing today enables a wide range of applications, from industrial IoT (e.g., warehouse monitoring) and Augmented Reality (AR) to robotics. However, individual sensing modalities are inherently limited. For instance, visual sensing (e.g., an RGB camera) is limited to scenarios with a clear line of sight and good lighting. Radio Frequency (RF) signals, such as WiFi and Bluetooth, pass through everyday objects (like walls and cardboard boxes), but lack the high-resolution information necessary for complex tasks such as scene understanding. As a result, no single sensing modality can provide a comprehensive understanding of complex environments, necessitating the integration of diverse sensing modalities. In recognition of this, a wide spectrum of emerging connected devices, including AR headsets, robots, and mobile devices, already incorporates various sensing technologies such as computer vision, WiFi tracking, and inertial measurement units (IMUs). The key remaining challenge is that this integration requires innovative approaches to strategically combine different modalities in a manner that maximizes functionality while remaining cost-effective. My research aims to deliver superhuman perception for IoT-connected and mobile devices, such as AR headsets and robots.
I leverage advanced signal processing and mathematical modeling techniques to combine different sensing modalities, with a particular focus on RF and computer vision. My research goes beyond developing algorithms to building end-to-end systems and evaluating them in real-world environments. My ultimate goal is to unlock unprecedented functional capabilities through the fusion of diverse sensing modalities: because these modalities complement each other, their fusion enables the perception and execution of tasks that no single modality can support.

Bio:

Tara Boroushaki is a fifth-year Ph.D. student at MIT and a Microsoft Research PhD Fellow. Her research focuses on fusing different sensing modalities, with a particular focus on RF and computer vision. She designs algorithms and builds systems that leverage such fusion to enable previously infeasible capabilities in applications spanning augmented reality, virtual reality, robotics, smart homes, and smart manufacturing. Her research was named one of the 103 ways MIT is making a better world, and has been featured in public media including the Wall Street Journal, the BBC, and the World Economic Forum.