By Tom Fleischman for the Cornell Chronicle
Human facial movements convey emotions, help us communicate nonverbally, and accompany physical activities such as eating and drinking.
Tracking facial movements – and possibly their cause – is one of the proposed applications for NeckFace, one of the first necklace-type wearable sensing technologies. Developed by a team led by Cheng Zhang, assistant professor of information science in the Cornell Ann S. Bowers College of Computing and Information Science, NeckFace continuously tracks full facial expressions by using infrared cameras to capture images of the chin and face from beneath the neck.
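The article doesn't go into implementation detail, but the core idea – mapping below-the-neck infrared images to a full facial shape with a learned model – can be sketched. The PyTorch example below is purely illustrative: the architecture, the 64x64 input size, the 42-landmark output and the `ChinToFace` name are assumptions for this sketch, not the model described in the paper.

```python
import torch
import torch.nn as nn

# Illustrative sketch only: a small CNN that regresses 3D facial landmark
# positions from a single-channel infrared image of the chin, in the spirit
# of NeckFace's camera-to-expression mapping. The architecture, 64x64 input
# size and 42-landmark output are assumptions, not the paper's model.

N_LANDMARKS = 42  # hypothetical landmark count

class ChinToFace(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 32x32 -> 16x16
        )
        self.head = nn.Linear(32 * 16 * 16, N_LANDMARKS * 3)

    def forward(self, ir_frame):
        # ir_frame: (batch, 1, 64, 64) infrared frames from the neck camera
        x = self.features(ir_frame).flatten(1)
        return self.head(x).view(-1, N_LANDMARKS, 3)

# Toy forward pass on a random stand-in for an IR frame.
model = ChinToFace()
print(model(torch.randn(1, 1, 64, 64)).shape)  # torch.Size([1, 42, 3])
```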
Their work is detailed in “NeckFace: Continuously Tracking Full Facial Expressions on Neck-mounted Wearables,” which was published June 24 in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies.
Co-lead authors are Tuochao Chen of Peking University and Yaxuan Li of McGill University, both visiting students in the Smart Computer Interfaces for Future Interactions (SciFi) Lab, and Cornell MPS student Songyun Tao. Other contributors are HyunChul Lim, Mose Sakashita and Ruidong Zhang, Cornell Ph.D. students in the field of information science, and François Guimbretière, professor of information science in the Cornell Bowers College.
NeckFace is the next generation of Zhang’s previous work, which resulted in C-Face, a similar device in a headset format. Zhang said NeckFace offers significant improvements in performance and privacy, and gives the wearer the option of a less obtrusive neck-mounted device.

In addition to potential emotion tracking, Zhang sees many applications for this technology: virtual conferencing when a front-facing camera is not an option; facial expression detection in virtual reality scenarios; and silent speech recognition.
“The ultimate goal is having the user be able to track their own behaviors, through continuous tracking of facial movements,” said Zhang, principal investigator of the SciFi Lab. “And this hopefully can tell us a lot of information about your physical activity and mental activities.”
Guimbretière said NeckFace also has the potential to change video conferencing.
“The user wouldn’t need to be careful to stay in the field of view of a camera,” he said. “Instead, NeckFace can recreate the perfect head shot as we move around in a classroom, or even walk outside to share a walk with a distant friend.”
To test the effectiveness of NeckFace, Zhang and his collaborators conducted a user study with 13 participants, each of whom was asked to perform eight facial expressions while sitting and eight more while walking. In the sitting scenarios, participants were also asked to rotate their heads while performing the expressions and, in one session, to remove and remount the device.
NeckFace was tested in two designs: a neckband, draped around the back of the neck with twin cameras just below collarbone level; and a necklace, with a pendant-like infrared (IR) camera device hanging below the neck.
The group collected baseline facial movement data using the TrueDepth 3D camera on an iPhone X, then compared it with the data collected by NeckFace. Across the sitting, walking and remounting sessions, participants performed a total of 52 facial shapes.
Using a deep learning model to reconstruct facial shapes from the infrared images, the group found that NeckFace tracked facial movement with nearly the same accuracy as the direct measurements from the phone camera. The neckband proved more accurate than the necklace, the researchers said, possibly because its two cameras could capture more information from both sides of the face than the necklace’s single center-mounted camera.
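To make that comparison concrete: one simple way to score predicted facial shapes against TrueDepth ground truth is the mean Euclidean distance between corresponding landmarks. The sketch below assumes that metric, the array shapes, and the `mean_landmark_error` name for illustration; the paper’s exact evaluation protocol may differ.

```python
import numpy as np

def mean_landmark_error(pred: np.ndarray, truth: np.ndarray) -> float:
    """Mean Euclidean distance between predicted and ground-truth landmarks.

    pred, truth: (n_frames, n_landmarks, 3) arrays, e.g. NeckFace predictions
    vs. TrueDepth measurements of the same frames. Shapes and metric are
    illustrative assumptions, not the paper's exact protocol.
    """
    assert pred.shape == truth.shape
    per_landmark = np.linalg.norm(pred - truth, axis=-1)  # (n_frames, n_landmarks)
    return float(per_landmark.mean())

# Toy usage: 100 frames of 42 landmarks, with small simulated prediction noise.
rng = np.random.default_rng(0)
truth = rng.normal(size=(100, 42, 3))
pred = truth + rng.normal(scale=0.05, size=truth.shape)
print(f"mean landmark error: {mean_landmark_error(pred, truth):.4f}")
```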
Zhang said the device, when optimized, could be particularly useful in the mental health realm, for tracking people’s emotions over the course of a day. While people don’t always wear their emotions on their face, he said, the amount of facial expression change over time could indicate emotional swings.
“Can we actually see how your emotion varies throughout a day?” he said. “With this technology, we could have a database on how you’re doing physically and mentally throughout the day, and that means you could track your own behaviors. And also, a doctor could use the information to support a decision.”
This work is funded by the Cornell Department of Information Science.
This article first appeared at the Cornell Chronicle.