CMU Researchers Pioneer Privacy-Preserving Activity Tracking with Radar

Imagine a future where your smart devices can understand your activities without invasive cameras. Researchers at Carnegie Mellon University's Future Interfaces Group are making this a reality by developing a novel approach to activity tracking using millimeter wave (mmWave) doppler radar. This technology promises to enhance smart home and personal assistant capabilities while safeguarding user privacy.
The Privacy Imperative
Connected cameras in homes pose significant privacy risks. To address this, the CMU team explored the potential of mmWave doppler radar as a sensing tool for detecting various human activities. Unlike cameras, radar can capture rich signal data without visual recording, offering a more privacy-friendly alternative.
Leveraging mmWave Doppler Radar
mmWave doppler radar offers signal richness comparable to that of microphones and cameras. A key challenge, however, has been the lack of readily available datasets for training AI models to recognize human activities from radar signals. The CMU researchers tackled this by synthesizing doppler data from existing video and developing a software pipeline for training privacy-preserving AI models.
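The article does not detail the synthesis step, but a minimal sketch of the general idea — turning motion extracted from video into a doppler-style velocity-over-time representation — might look like the following. The function name, the virtual radar position, and the use of 3D joint tracks from an off-the-shelf pose estimator are illustrative assumptions, not the researchers' published pipeline.
```python
import numpy as np

def synthetic_doppler(joints, radar_pos, fps=30.0, v_max=3.0, n_bins=64):
    """Build a doppler-style velocity-vs-time histogram from 3D joint tracks.

    joints: array of shape (T, J, 3) -- per-frame 3D joint positions (metres),
            e.g. estimated from public video with an off-the-shelf pose model.
    radar_pos: array of shape (3,) -- position of a hypothetical virtual radar.
    Returns an (n_bins, T-1) spectrogram-like array.
    """
    rays = joints - radar_pos                      # vectors from radar to each joint
    dist = np.linalg.norm(rays, axis=-1)           # (T, J) ranges to the radar
    radial_v = np.diff(dist, axis=0) * fps         # (T-1, J) radial velocities, m/s
    bins = np.linspace(-v_max, v_max, n_bins + 1)
    spec = np.stack(
        [np.histogram(frame, bins=bins)[0] for frame in radial_v], axis=1
    ).astype(np.float32)
    return spec / max(spec.max(), 1e-6)            # normalize for training
```
Each synthetic spectrogram would then be paired with the activity label of the source video clip, giving labeled training data without ever recording real people with a radar.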
Breakthroughs in AI Training
The researchers demonstrated the effectiveness of their approach in a video showing their model correctly identifying activities such as cycling, clapping, waving, and squats. This was achieved by training the AI on public video data and translating that knowledge so the model could interpret mmWave signals. "We show how this cross-domain translation can be successful through a series of experimental results," they stated, highlighting its potential to reduce the burden of training human sensing systems.
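For illustration only, training a classifier on such synthetic doppler spectrograms could look like the sketch below, written in PyTorch. The architecture, class list, and hyperparameters are placeholders and not the model described by the CMU team.
```python
import torch
import torch.nn as nn

class DopplerClassifier(nn.Module):
    """Small CNN mapping a (1, velocity_bins, time_steps) doppler spectrogram
    to activity logits -- a stand-in architecture, not the paper's model."""
    def __init__(self, n_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

# Training-step sketch: synthetic spectrograms in, activity labels out.
model = DopplerClassifier(n_classes=4)  # e.g. cycling, clapping, waving, squats
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_step(batch_specs, batch_labels):
    optimizer.zero_grad()
    loss = loss_fn(model(batch_specs), batch_labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```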
Capabilities and Limitations
While mmWave radar is not sensitive enough to detect very subtle movements such as facial expressions, it can recognize less vigorous activities such as eating or reading. A crucial requirement is a clear line of sight between the subject and the sensing hardware; the technology cannot yet "reach around corners."
Real-World Applications and Future Potential
Companies like Google are already integrating radar technology. Google's Project Soli uses radar sensors in Pixel phones and Nest Hub devices for features like sleep tracking. The CMU research suggests broader applications, including:
- Smarter AI Assistants: Devices could understand context, like knowing when you're eating, exercising, or cleaning.
- Fitness Tracking: Counting exercise repetitions (e.g., squats, bench presses) without wearables; a simple rep-counting sketch follows this list.
- Enhanced Smart Homes: Automatically adjusting mood lighting or playing music based on detected activities.
- Occupancy Detection: More advanced sensing in buildings for energy efficiency or security.
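As a rough illustration of the fitness-tracking idea above, repetitions of a periodic exercise could in principle be counted by finding peaks in the doppler energy over time. This sketch assumes a 1-D energy trace derived from a spectrogram and uses SciPy's peak finder; it is not taken from the CMU system.
```python
import numpy as np
from scipy.signal import find_peaks

def count_reps(doppler_energy, fps=30.0, min_rep_seconds=1.0):
    """Count exercise repetitions from a 1-D doppler energy trace.

    doppler_energy: per-frame sum of velocity-bin energy in the spectrogram,
    which rises and falls once per repetition for periodic exercises like
    squats. The distance constraint enforces a minimum rep duration.
    """
    smoothed = np.convolve(doppler_energy, np.ones(5) / 5, mode="same")
    peaks, _ = find_peaks(
        smoothed,
        distance=int(min_rep_seconds * fps),
        prominence=0.2 * smoothed.max(),
    )
    return len(peaks)
```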
Cost and Accessibility
The cost of radar sensors is rapidly decreasing, with some units available for around $1 on eBay. This affordability makes widespread integration into various devices feasible.
Privacy Advantages of Radar
Compared to cameras, radar data is considered highly anonymizing. Even if radar data were exposed, it would be difficult to identify individuals from it, unlike leaked camera footage. "If your doppler radar data leaked online, it'd be hard to be embarrassed about it. No one would recognize you," noted researcher Chris Harrison.
Overcoming Data Synthesis Challenges
Synthesizing training data for radar models is not entirely turnkey, but the availability of large video datasets (like YouTube-8M) significantly aids the process. Downloading video data and creating synthetic radar data is considerably faster than manually collecting motion data. The process can be parallelized using cloud services, allowing for high throughput.
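As a hedged sketch of how that parallelization might be organized, each video can be treated as an independent job — pose estimation followed by doppler synthesis — and fanned out across local processes or cloud workers. The helper names referenced in the comments below are hypothetical.
```python
from concurrent.futures import ProcessPoolExecutor

def synthesize_one(video_path: str) -> str:
    """Hypothetical per-video job: estimate pose, synthesize doppler, save."""
    # pose = estimate_pose(video_path)            # any off-the-shelf pose model
    # spec = synthetic_doppler(pose, radar_pos)   # from the earlier sketch
    # np.save(video_path + ".doppler.npy", spec)
    return video_path

def synthesize_all(video_paths, workers=8):
    # Each video is independent, so the workload parallelizes trivially;
    # the same pattern scales out to cloud batch services for higher throughput.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(synthesize_one, video_paths))
```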
Related Research and the Future of HCI
The CMU Future Interfaces Group has a history of innovative research in human-computer interaction (HCI), including projects like Pose-on-the-Go (using smartphone sensors for pose estimation), low-cost smart home sensing, and using smartphone cameras for AI context. Their work also explores laser vibrometry, electromagnetic noise, conductive spray paint for touchscreens, and wearable interaction techniques.
This radar-based activity tracking represents a significant step towards more contextually aware and privacy-respecting human-computer interaction, moving beyond the limitations of current smart devices.
Original article available at: https://techcrunch.com/2021/05/11/cmu-researchers-show-potential-of-privacy-preserving-activity-tracking-using-radar/