Projects
Visual sensors such as cameras are ubiquitous in almost all of our devices, and lidars can perceive even in the absence of ambient light. However, neither visible nor infrared light penetrates far enough to perceive objects in non-line-of-sight conditions, or through thick smoke, fog, or debris. Radio frequency signals, on the other hand, pass through buildings and walls and reach your phone from towers far away. The field of wireless sensing uses these signals to perceive the world in conditions that are challenging for other wavelengths.
As the figure shows, today's state-of-the-art RF sensors still trail lidars and cameras in several respects. For example, while cameras and lidars offer angular resolutions of about 0.01° and 0.1° respectively, the best commodity radars are at least 10 times worse. In my research, I build application-specific RF sensing systems that address some of these challenges using signal processing, hardware design, machine learning, and sensor fusion techniques.
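The angular-resolution gap follows from a back-of-envelope diffraction argument: an aperture of size D at wavelength λ resolves roughly λ/D radians. The sketch below works this out for a 77 GHz automotive radar; the 10 cm aperture is an assumed figure for a high-end imaging radar, not a measurement of any specific device.

```python
import numpy as np

# Back-of-envelope angular resolution for an RF aperture: theta ~ lambda / D.
# The aperture size is an assumption for a high-end automotive imaging radar;
# real arrays vary widely.
c = 3e8                    # speed of light, m/s
f = 77e9                   # automotive radar carrier, Hz
wavelength = c / f         # ~3.9 mm
aperture = 0.10            # assumed array aperture, m

theta_deg = np.degrees(wavelength / aperture)
print(f"Radar angular resolution ~ {theta_deg:.1f} deg")   # ~2.2 deg
print("Compare: camera ~ 0.01 deg, lidar ~ 0.1 deg")
```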
Osprey
Tire wear affects the safety and performance of automobiles. Past solutions are either inaccurate, require sensors embedded in the tire, or are vulnerable to road debris. Osprey is a radio-frequency tire-wear sensing system, built on automotive millimeter-wave radars, designed to withstand road debris. We develop super-resolution algorithms to measure the millimeter-level changes caused by wear and tear. The same principles also enable detecting and localizing harmful metallic foreign objects lodged in the tire.
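To see why super-resolution is needed at all, the sketch below contrasts the raw range resolution of an FMCW radar with the sensitivity of a standard phase-based displacement trick common in mmWave sensing. The 4 GHz bandwidth is an assumed value for a 77-81 GHz automotive radar, and this is a generic illustration of the resolution problem, not Osprey's actual pipeline.

```python
import numpy as np

# Raw FMCW range resolution: c / (2 * bandwidth). Far too coarse for
# millimeter-level tread measurements.
c = 3e8
bandwidth = 4e9                       # assumed chirp bandwidth, Hz
range_res = c / (2 * bandwidth)       # ~3.75 cm per range-FFT bin
print(f"Raw range resolution: {range_res*100:.2f} cm")

# A common fine-grained trick: track the phase of the reflection's range bin.
# A path-length change of delta_d shifts the phase by 4*pi*delta_d / lambda,
# so sub-millimeter changes become measurable.
wavelength = c / 79e9
delta_d = 1e-4                        # 0.1 mm change in distance
delta_phase = 4 * np.pi * delta_d / wavelength
print(f"Phase shift for 0.1 mm change: {np.degrees(delta_phase):.0f} deg")
```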
Full Paper | Demo Paper | Magazine | Slides | Talk | Video-1min | Video-3min
Press: CMU | Gizmodo | Hackster.io | That’s Cool News - Podcast
Best Paper Honorable Mention (MobiSys 2020)
Best Demo (MobiSys 2020)
ACM GetMobile Research Highlight
Millimetro
How can cars perceive critical roadside infrastructure even when weather conditions defeat visual sensors? Millimetro tackles this with low-power radio-frequency tags that can be mounted on such infrastructure, together with algorithms that run on in-car radars to decode, identify, and accurately localize the tags.
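As a rough intuition for how a radar can tell a tag apart from static clutter, a tag that toggles its reflectivity at a fixed rate shows up as a tone in the radar's slow-time (chirp-to-chirp) signal. The toy simulation below demonstrates that idea; the switching frequency, chirp rate, and amplitudes are made-up values, and the sketch is not Millimetro's actual decoding scheme.

```python
import numpy as np

# Toy slow-time signal at one range bin: constant clutter plus an on-off
# backscatter tag switching at f_tag. An FFT across chirps reveals the tag.
frame_rate = 2000.0          # chirps per second (assumed)
n_chirps = 1000
f_tag = 200.0                # assumed tag switching frequency, Hz
t = np.arange(n_chirps) / frame_rate

static_clutter = 1.0
tag = 0.3 * np.sign(np.cos(2 * np.pi * f_tag * t))   # on-off modulation
noise = 0.05 * np.random.randn(n_chirps)
slow_time = static_clutter + tag + noise

spectrum = np.abs(np.fft.rfft(slow_time - slow_time.mean()))
freqs = np.fft.rfftfreq(n_chirps, d=1 / frame_rate)
print(f"Strongest slow-time tone: {freqs[np.argmax(spectrum)]:.0f} Hz")  # ~200 Hz
```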
Full Paper | Demo Paper | Talk
Press: UIUC
Metamoran
How can we enable depth imaging from a single vantage point that works at long range, to build high-quality surveillance systems that are simple to deploy? Metamoran fuses a monocular camera with a radar, using the strengths of each sensor to offset the weaknesses of the other, and produces accurate depth images of a variety of objects even at long range.
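The complementarity can be seen in a minimal fusion sketch: the camera gives a precise bearing to an object, the radar gives a precise range, and combining the two yields depth at distances where monocular depth estimation struggles. The pinhole-model parameters and radar detections below are illustrative assumptions, not Metamoran's actual pipeline.

```python
import numpy as np

def pixel_to_azimuth(u, image_width=1920, hfov_deg=90.0):
    """Map a pixel column to an azimuth angle under a simple pinhole model."""
    f_px = (image_width / 2) / np.tan(np.radians(hfov_deg / 2))
    return np.degrees(np.arctan((u - image_width / 2) / f_px))

# Radar point detections: (azimuth_deg, range_m) - made-up example data.
radar_points = np.array([(-20.0, 12.5), (3.0, 48.2), (15.0, 31.0)])

# Camera detector reports an object centered at pixel column 1010.
obj_azimuth = pixel_to_azimuth(1010)

# Assign the range of the radar point closest in azimuth to the camera object.
idx = np.argmin(np.abs(radar_points[:, 0] - obj_azimuth))
print(f"Object bearing ~ {obj_azimuth:.1f} deg, fused depth ~ {radar_points[idx, 1]:.1f} m")
```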
Full Paper | Slides | Talk