Scientists from MIT have created a technology remarkably similar to the X-ray vision of sci-fi superheroes. The project from the Computer Science and Artificial Intelligence Laboratory (CSAIL) uses artificial intelligence (AI) to teach wireless devices to sense people’s postures and movements, even from the other side of a wall.

The MIT scientists say the project, called “RF-Pose,” could be used to monitor patients with diseases like Parkinson’s, multiple sclerosis (MS), and muscular dystrophy. By tracking movement, doctors can gain a better understanding of a disease’s progression and adjust medication and care accordingly.

Teaching wireless devices to track movement could also help elderly people live independently for longer by monitoring the way they use space, keeping track of falls, and alerting people to troubling changes in activity patterns. The AI engineers are teaming up with medical doctors to investigate the full range of RF-Pose’s applications.

The technology works by using a neural network to analyze radio signals that bounce off people’s bodies. From these signals, the system creates a dynamic stick figure that mimics the movements of its target.
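
To make that idea concrete, here is a minimal sketch of what such a mapping could look like. This is not the researchers’ actual architecture: the two-channel input (assumed horizontal and vertical radio-reflection “heatmaps”) and the 14-keypoint output are assumptions for illustration. A stick figure is, in effect, just a set of (x, y) joint positions.

```python
import torch
import torch.nn as nn

class RFToStickFigure(nn.Module):
    """Hypothetical sketch: regress stick-figure joints from RF heatmaps."""

    def __init__(self, num_keypoints: int = 14):
        super().__init__()
        self.num_keypoints = num_keypoints
        # Two input channels: assumed horizontal and vertical RF heatmaps.
        self.encoder = nn.Sequential(
            nn.Conv2d(2, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Regress an (x, y) position for each joint of the stick figure.
        self.head = nn.Linear(128, num_keypoints * 2)

    def forward(self, rf_heatmaps: torch.Tensor) -> torch.Tensor:
        features = self.encoder(rf_heatmaps).flatten(1)
        return self.head(features).view(-1, self.num_keypoints, 2)

model = RFToStickFigure()
rf_frame = torch.randn(1, 2, 64, 64)   # one synthetic two-channel RF snapshot
keypoints = model(rf_frame)            # shape (1, 14, 2): the stick figure's joints
```

Run on a stream of RF frames, a model like this would produce the “dynamic” stick figure the article describes, one pose per frame.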

The researchers are well aware of how creepy the project can sound and have already built privacy safeguards into the system. The wireless device can only collect data with the subject’s consent, and that data is anonymized and encrypted to protect user privacy.

Once the technology moves into real-world applications, users would need to provide consent for other parties to access their movement data. “We’ve seen that monitoring patients’ walking speed and ability to do basic activities on their own gives health care providers a window into their lives that they didn’t have before, which could be meaningful for a whole range of diseases,” says lab leader Dina Katabi.

“A key advantage of our approach is that patients do not have to wear sensors or remember to charge their devices.”

Traditionally, the data fed to neural networks has been labeled by hand by humans.

For instance, to teach a neural network to recognize a cat, scientists would show it images labeled either 'cat' or 'not cat'. But radio signals can’t be easily labeled by humans.
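
For readers curious what hand-labeled supervision looks like in code, here is a toy sketch in PyTorch. Random tensors stand in for the cat photos, and the human-assigned 1/0 labels drive the loss; nothing here is taken from the MIT project.

```python
import torch
import torch.nn as nn

# Toy illustration of hand-labeled supervision: each (fake) image tensor is
# paired with a human-assigned label -- 1.0 for "cat", 0.0 for "not cat".
images = torch.randn(8, 3, 32, 32)
labels = torch.tensor([1., 0., 1., 1., 0., 0., 1., 0.])

classifier = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 32 * 32, 1),
)
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.SGD(classifier.parameters(), lr=0.01)

for step in range(100):
    logits = classifier(images).squeeze(1)
    loss = loss_fn(logits, labels)   # compare predictions to the human labels
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The whole scheme depends on a human being able to look at the input and name what it shows, which is exactly what breaks down for raw radio signals.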

To overcome this challenge, the scientists collected thousands of samples of humans walking, moving, and sitting, captured by both their wireless device and a camera. They then extracted a simple stick figure from the video footage and showed the stick figure and the corresponding radio signals to the neural network, teaching it to recognize movements.

By using both the stick figure and the radio signal, the neural network learned the correlation between the two. After an intensive training period, it could recognize people's movements using just the radio signals it detected bouncing off their bodies.
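
The sketch below illustrates this cross-modal training idea under stated assumptions; it is not the team’s actual code. A stand-in `camera_pose_teacher` function plays the role of the video-based stick-figure extractor, and a small “student” network learns to predict the same keypoints from simultaneous radio frames. Here synthetic tensors replace the real recordings.

```python
import torch
import torch.nn as nn

def camera_pose_teacher(video_frame: torch.Tensor) -> torch.Tensor:
    """Stand-in for a vision-based pose estimator that turns synchronized
    video into stick-figure keypoints; here it just returns fake joints."""
    batch = video_frame.shape[0]
    return torch.rand(batch, 14, 2)  # 14 (x, y) keypoints per sample

rf_student = nn.Sequential(          # tiny stand-in for the RF network
    nn.Flatten(),
    nn.Linear(2 * 64 * 64, 256), nn.ReLU(),
    nn.Linear(256, 14 * 2),
)
optimizer = torch.optim.Adam(rf_student.parameters(), lr=1e-3)

for step in range(100):
    video_frame = torch.randn(4, 3, 128, 128)  # synchronized camera frames
    rf_frame = torch.randn(4, 2, 64, 64)       # simultaneous RF heatmaps
    with torch.no_grad():
        target = camera_pose_teacher(video_frame)  # teacher's stick figure
    pred = rf_student(rf_frame).view(-1, 14, 2)    # student's guess from RF
    loss = nn.functional.mse_loss(pred, target)    # learn to match the teacher
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Once trained, the camera is no longer needed: the student predicts
# poses from the radio signals alone, even through walls.
```

The design choice matters: the camera is only a training-time crutch for generating labels, which is why the deployed system can work where a camera cannot see.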

Source: Interesting Engineering