It’s a common trope in futuristic films: a single hand gesture turns on all the lights in a room or switches screens on a holographic table. Scientists may have found the beginnings of this very technology; by scanning human hand gestures with ultrasound, they believe the sensing technology could eventually be built into smartwatches.
“Ultrasound imaging has remained under-explored in the HCI community despite being non-invasive, harmless and capable of imaging internal body parts, with applications including smart-watch interaction, prosthesis control, and instrument tuition,” the authors write in their paper, published in the Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems.
Researchers at the University of Bristol have used an ultrasound machine to detect hand gestures. Interaction technology of this kind is more commonly found in products like video games and household appliances.
However, this latest “physiologic” version could be placed in a smartwatch and used by surgeons, who could browse hands-free through radiological images during scans.
“With current technologies, there are many practical issues that prevent a small, portable ultrasonic imaging sensor being integrated into a smartwatch. Nevertheless, our research is a first step towards what could be the most accurate method for detecting hand gestures in smartwatches,” Jess McIntosh, a Ph.D. student in the Department of Computer Science and the BIG Group, told the University of Bristol News.
The team, led by Professor Mike Fraser, Asier Marzo and Jess McIntosh from the Bristol Interaction Group (BIG) at the University of Bristol, together with University Hospitals Bristol NHS Foundation Trust (UH Bristol), presented their findings at ACM CHI 2017, a leading human-computer interaction conference held in Denver, Colorado, earlier this year.
Now that smartwatches have become mainstream, the research team hopes to bring the technique to these devices, allowing sensors inside the watch to detect individual hand movements. Building on ultrasonic imaging, the team developed a gesture recognition technique called EchoFlex, which combines image processing and neural-network classification to recognise the flexion and extension of 10 discrete hand gestures with accuracy above 98%.
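To make the idea concrete, the minimal sketch below shows what an image-processing-plus-neural-network gesture classifier of this general shape might look like. It is not the authors’ EchoFlex implementation: the feature extraction (coarse intensity grid), the network size, and the placeholder data are all assumptions made for illustration.

```python
# Illustrative sketch only -- the paper's actual EchoFlex pipeline is not
# reproduced here; the feature extraction and network shape are assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier

NUM_GESTURES = 10  # flexion/extension of 10 discrete hand gestures

def extract_features(frame: np.ndarray, grid: int = 8) -> np.ndarray:
    """Downsample an ultrasound frame into a coarse grid of mean
    intensities -- a stand-in for the image-processing step."""
    h, w = frame.shape
    cropped = frame[: h - h % grid, : w - w % grid]
    cells = cropped.reshape(grid, cropped.shape[0] // grid,
                            grid, cropped.shape[1] // grid)
    return cells.mean(axis=(1, 3)).ravel()

# Hypothetical training data: frames captured while a user performs
# each labelled gesture (labels 0..9). Real data would come from the probe.
rng = np.random.default_rng(0)
frames = rng.random((200, 128, 128))          # placeholder ultrasound frames
labels = rng.integers(0, NUM_GESTURES, 200)   # placeholder gesture labels

X = np.stack([extract_features(f) for f in frames])
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500)
clf.fit(X, labels)

# At run time, each incoming frame is featurised and classified the same way.
print(clf.predict(X[:1]))
```

In a real system the training frames and labels would be collected per user, and the predicted gesture would be mapped to a smartwatch command rather than printed.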
Lip Service
Scientists are also looking at other body parts for interaction technology. Binghamton University, State University of New York, recently released its findings on a new framework that interprets mouth gestures as a medium for real-time interaction with virtual reality.
"We hope to make this applicable to more than one person, maybe two. Think Skype interviews and communication," said Binghamton University Professor of Computer Science Lijun Yin. "Imagine if it felt like you were in the same geometric space, face to face, and the computer program can efficiently depict your facial expressions and replicate them, so it looks real."
Combining this technology with an interactive smartwatch would certainly be something to behold in action.
Sources: Interesting Engineering, Science Daily, TechRadar