MIT’s Artificial Intelligence System Can Detect People, Postures and Movements Through Walls

MIT Artificial Intelligence-driven system uses RF signals from wireless devices to remotely detect human movement, locate and track people - tracks multiple people simultaneously. (source: MIT CSAIL)

MIT Artificial Intelligence-driven system uses RF signals from wireless devices to remotely detect human movement, locate and track people - can determine posture. (source: MIT CSAIL)

MIT Artificial Intelligence-driven system uses RF signals from wireless devices to remotely detect human movement, locate and track people - works in poor light and dark conditions. (source: MIT CSAIL)

MIT Artificial Intelligence-driven system uses RF signals from wireless devices to remotely detect human movement, locate and track people - even works through obstructions such as walls and windows. (source: MIT CSAIL)

December 3, 2018 | Source: MIT News, news.mit.edu, Adam Conner-Simons & Rachel Gordon, 12 June 2018

An Artificial Intelligence (AI)-driven system can accurately locate, track, and determine the posture of multiple individuals, even in the dark and through walls. Massachusetts Institute of Technology (MIT) researchers demonstrated that the system could correctly identify a person 83% of the time out of a line-up of 100 individuals. Potential applications include medical monitoring, search and rescue, and hazardous/military operations.


X-ray vision has long seemed like a far-fetched sci-fi fantasy, but over the last decade a team led by Professor Dina Katabi from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has continually gotten us closer to seeing through walls.

Their latest project, “RF-Pose,” uses artificial intelligence (AI) to teach wireless devices to sense people’s postures and movement, even from the other side of a wall.

The researchers use a neural network to analyze radio signals that bounce off people’s bodies, and can then create a dynamic stick figure that walks, stops, sits, and moves its limbs as the person performs those actions.

One challenge the researchers had to address is that most neural networks are trained using data labeled by hand. A neural network trained to identify cats, for example, requires that people look at a big dataset of images and label each one as either “cat” or “not cat.” Radio signals, meanwhile, can’t be easily labeled by humans.

To address this, the researchers collected examples using both their wireless device and a camera. They gathered thousands of images of people doing activities like walking, talking, sitting, opening doors and waiting for elevators.

They then used these images from the camera to extract the stick figures, which they showed to the neural network along with the corresponding radio signal. This combination of examples enabled the system to learn the association between the radio signal and the stick figures of the people in the scene.
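This cross-modal supervision scheme can be sketched as a toy regression. In the sketch below, a stand-in for the camera pipeline supplies keypoint labels, and a linear "student" model learns to predict them from synchronized RF features; all array shapes, names, and the linear model are illustrative assumptions, not the authors' architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes: frames of synchronized recording, RF feature dimension,
# and number of stick-figure keypoints (each with x, y coordinates).
n_frames, rf_dim, n_keypoints = 500, 32, 14

# RF feature vectors captured by the wireless device (hypothetical).
rf_signals = rng.normal(size=(n_frames, rf_dim))

# Stand-in for the vision pipeline that extracts 2-D keypoints from
# camera images: here a fixed random linear map plus a little noise.
true_map = rng.normal(size=(rf_dim, n_keypoints * 2))
keypoints = rf_signals @ true_map + 0.01 * rng.normal(size=(n_frames, n_keypoints * 2))

# Student model: least-squares regression from RF features to the
# camera-derived keypoints (the real system uses a deep network).
weights, *_ = np.linalg.lstsq(rf_signals, keypoints, rcond=None)

# After training, the camera is no longer needed at inference time:
pred = rf_signals @ weights
mse = float(np.mean((pred - keypoints) ** 2))
print(f"training MSE: {mse:.5f}")
```

The key idea the sketch preserves is that the labels come from a second, easier-to-annotate modality (camera images), so no human ever labels the radio signals directly.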

Post-training, RF-Pose was able to estimate a person’s posture and movements without cameras, using only the wireless reflections that bounce off people’s bodies.

Since cameras can’t see through walls, the network was never explicitly trained on data from the other side of a wall, which made it particularly surprising to the MIT team that the network could generalize to handle through-wall movement.

Besides sensing movement, the authors also showed that they could use wireless signals to accurately identify somebody 83% of the time out of a line-up of 100 individuals. This ability could be particularly useful for search-and-rescue operations, where knowing the identity of specific people may be helpful.

For this paper, the model outputs a 2-D stick figure, but the team is also working to create 3-D representations that would be able to reflect even smaller micromovements.
