03/25/2026 | Press release | Archived content
A team led by Rutgers University researchers has developed a security system that could change how people log in to virtual and augmented reality platforms by eliminating passwords, personal identification numbers and eye scans and replacing them with something far more seamless.
The team, led by Yingying Chen, a Distinguished Professor and chair of the Department of Electrical and Computer Engineering, set out to make immersive systems more secure. Its system, a software program called VitalID, is based on the team's discovery of a new biometric: tiny vibrations generated by breathing and heartbeats that resonate through the skull in patterns unique to each person's bone structure and facial tissues.
"Extended reality will play a major role in our future," said Chen, a faculty member in the Rutgers School of Engineering and a corresponding author of the study. "If immersive systems are going to become woven into daily life, authentication has to be secure, continuous and effortless."
The work was presented in November in Taipei, Taiwan, at the ACM Conference on Computer and Communications Security, a major annual meeting of the Association for Computing Machinery's Special Interest Group on Security, Audit and Control, and was recognized with a Distinguished Paper Award.
Extended reality, or XR, includes virtual reality, augmented reality and mixed reality technologies that blend digital content with the physical world. As XR systems expand beyond gaming into finance, medicine, education and remote work, security has become increasingly urgent.
"XR is becoming a gateway to everyday internet services, many of which involve sensitive personal data," said Chen, who started this line of research in 2019. "We need authentication that works without adding hardware."
(Fourth and fifth from left) Yingying Chen, research team leader and Distinguished Professor, and Yan Wang, an Associate Professor at Temple University and a co-author of the study, have worked on the VitalID project for years. Wang is a former doctoral student of Chen's. Rutgers doctoral students and members of Chen's Data Analysis and Information Security (DAISY) Lab include (from left) Honglu Li, Zhitao Cheng, Yuchen Sun and (at far right) Changming Li.

Headsets now store personal accounts, confidential documents and access to web services. But typing passwords in a gesture-based environment is awkward, two-factor authentication interrupts immersion and iris-scanning hardware adds cost, Chen said.
If adopted commercially, the technology could allow users to access financial platforms, medical records or enterprise systems inside immersive environments without stopping to log in.
The study was conducted through a collaboration with Cong Shi at the New Jersey Institute of Technology, Yan Wang at Temple University and Nitesh Saxena at Texas A&M University.
At the heart of the research is a simple biological phenomenon.
As Chen explained it, the human body is always moving in tiny ways, even when a person is sitting still. Each breath and each heartbeat create very small vibrations inside the body. Those vibrations travel up through the neck and into the head.
When they reach the skull, they cause it to vibrate slightly. Because every skull has a different shape, thickness and bone structure, the vibrations change in unique ways as they move through each person's head. Soft tissues in the face, such as muscle and fat, also influence how the vibrations travel.
As a result, each person produces a distinct vibration pattern. Motion sensors built into virtual reality headsets can detect these tiny patterns and analyze them like a fingerprint to determine who is wearing the device.
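The published paper's signal processing is not detailed in this release, but the idea can be illustrated with a toy sketch: treat the headset's motion-sensor trace as a mix of a respiratory tone and a cardiac tone, then measure the energy in each physiological frequency band as a simple feature vector. The sample rate, band edges, frequencies and amplitudes below are invented for illustration, not taken from the study.

```python
import numpy as np

FS = 100.0  # IMU sample rate in Hz (assumption for illustration)

def simulate_head_vibration(heart_hz=1.2, breath_hz=0.25, seconds=30, seed=0):
    """Toy signal: a heartbeat tone plus a respiration tone plus sensor noise.
    The relative amplitudes stand in for how a particular skull and facial
    tissue shape the vibrations (purely illustrative)."""
    rng = np.random.default_rng(seed)
    t = np.arange(0, seconds, 1.0 / FS)
    signal = (0.6 * np.sin(2 * np.pi * heart_hz * t)
              + 1.0 * np.sin(2 * np.pi * breath_hz * t)
              + 0.05 * rng.standard_normal(t.size))
    return signal

def band_energy(signal, lo_hz, hi_hz):
    """Energy in [lo_hz, hi_hz) computed from the FFT magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / FS)
    mask = (freqs >= lo_hz) & (freqs < hi_hz)
    return float(np.sum(spectrum[mask] ** 2))

sig = simulate_head_vibration()
features = {
    "respiration": band_energy(sig, 0.1, 0.5),  # breathing band (assumed)
    "heartbeat": band_energy(sig, 0.8, 2.0),    # cardiac band (assumed)
}
print(features)
```

In a real system these band energies would be only the crudest starting point; the point of the sketch is that both vital signs leave measurable, separable traces in the sensor stream without any added hardware.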
"We do not need to add any device or additional hardware," Chen said. "It requires only software."
In testing across 52 users over a 10-month period using two popular XR headsets, the system correctly authenticated legitimate users more than 95% of the time and rejected unauthorized users more than 98% of the time.
The researchers built a filtering system that removes interference from extraneous head and body movement, allowing the headset to focus only on the tiny vibrations in the skull caused by breathing and heartbeat. They then used advanced computer models to analyze those vibration patterns.
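As a loose sketch of that two-stage idea, and not the team's actual pipeline: one crude way to suppress large voluntary movements is to subtract a moving-average trend, then compare the remaining low-frequency spectrum against an enrolled template. Every function name, threshold and frequency below is a hypothetical stand-in.

```python
import numpy as np

FS = 100.0  # IMU sample rate in Hz (assumption for illustration)

def toy_vibration(heart_hz, breath_hz, seconds=30, seed=0):
    """Toy head-vibration trace: cardiac + respiratory tones plus noise."""
    rng = np.random.default_rng(seed)
    t = np.arange(0, seconds, 1.0 / FS)
    return (0.6 * np.sin(2 * np.pi * heart_hz * t)
            + 1.0 * np.sin(2 * np.pi * breath_hz * t)
            + 0.05 * rng.standard_normal(t.size))

def remove_motion_trend(signal, window=50):
    """Subtract a moving-average trend so slow, large head/body movements
    are suppressed -- a crude stand-in for the paper's filtering stage."""
    trend = np.convolve(signal, np.ones(window) / window, mode="same")
    return signal - trend

def signature(signal, max_hz=5.0):
    """Unit-norm low-frequency spectrum used as the vibration 'fingerprint'."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / FS)
    band = spectrum[freqs <= max_hz]
    return band / (np.linalg.norm(band) + 1e-12)

def matches(template, probe, threshold=0.9):
    """Cosine similarity against the enrolled template (threshold illustrative)."""
    return float(np.dot(template, probe)) >= threshold

# Enroll one user, then test a genuine probe and an imposter whose cardiac
# frequency differs (all numbers invented for illustration).
template = signature(remove_motion_trend(toy_vibration(1.2, 0.25, seed=1)))
genuine = signature(remove_motion_trend(toy_vibration(1.2, 0.25, seed=2)))
imposter = signature(remove_motion_trend(toy_vibration(1.6, 0.35, seed=3)))
print(matches(template, genuine), matches(template, imposter))  # prints: True False
```

The study's "advanced computer models" are far more sophisticated than a cosine-similarity match, but the sketch shows the shape of the problem: filter out gross motion, extract a stable vibration signature, then verify it against what was enrolled.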
Because the vibrations travel internally through bone and tissue, they may also be more difficult to spoof, Chen said. Someone might imitate another person's breathing rhythm but cannot easily replicate the biomechanical properties of another person's skull.
The headset would continuously confirm identity in the background simply by sensing the subtle vibrations that come with being alive.
Rutgers Technology Transfer, within the Office for Research, has filed a provisional patent application covering this innovative technology. A non-confidential summary is available at: techfinder.rutgers.edu/tech/Effortless_Biometric_User_Authentication_for_Extended_Reality_(XR)_Headsets_Using_Vital-Sign_Harmonics.
This technology is available for licensing and/or research collaboration. Organizations interested in business development or other collaborative partnerships are encouraged to contact: [email protected]. Rutgers looks forward to advancing this technology towards commercialization and bringing its benefits to the public.
Chen is a prolific inventor in the field of remote sensing.
Explore more of the ways Rutgers research is shaping the future.