A University of Washington-led team has developed a method that uses the camera on a person’s smartphone or computer to take their pulse and respiration signal from a real-time video of their face.
The researchers presented this state-of-the-art system in December at the Neural Information Processing Systems conference.
Now the team is proposing a better system to measure these physiological signals. This system is less likely to be tripped up by different cameras, lighting conditions or facial features, such as skin color. The researchers will present these findings April 8 at the ACM Conference on Health, Inference, and Learning.
“Machine learning is pretty good at classifying images. If you give it a series of photos of cats and then tell it to find cats in other images, it can do it. But for machine learning to be helpful in remote health sensing, we need a system that can identify the region of interest in a video that holds the strongest source of physiological information — pulse, for example — and then measure that over time,” said lead author Xin Liu, a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering.
“Every person is different,” Liu said. “So this system needs to be able to quickly adapt to each person’s unique physiological signature, and separate this from other variations, such as what they look like and what environment they are in.”
Try the researchers’ demo version that can detect a user’s heartbeat over time, which doctors can use to calculate heart rate.
The team’s system is privacy preserving — it runs on the device instead of in the cloud — and uses machine learning to capture subtle changes in how light reflects off a person’s face, which is correlated with changing blood flow. Then it converts these changes into both pulse and respiration rate.
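This approach builds on the general idea of remote photoplethysmography. As a rough illustration of that underlying idea only, and not of the team's machine learning model, the sketch below averages the skin pixels in each frame, band-pass filters the resulting signal, and reads the dominant frequency as a per-minute rate. The function names, filter settings and frequency bands are illustrative assumptions.

```python
# Minimal sketch of the general idea behind camera-based vitals sensing
# (remote photoplethysmography). This is NOT the team's model; the band
# limits, filter order and function names are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_rate_bpm(frames, fps, low_hz, high_hz):
    """Estimate a periodic rate (per minute) from face-cropped video frames.

    frames: sequence of H x W x 3 arrays cropped to the face region.
    fps: camera frame rate in frames per second.
    low_hz, high_hz: band of interest (pulse ~0.7-4 Hz, breathing ~0.1-0.5 Hz).
    """
    # 1. Spatially average the green channel, which tends to carry the
    #    strongest blood-volume signal, giving one sample per frame.
    signal = np.array([np.mean(f[:, :, 1]) for f in frames], dtype=float)
    signal -= signal.mean()

    # 2. Band-pass filter to keep only physiologically plausible frequencies.
    b, a = butter(3, [low_hz, high_hz], btype="band", fs=fps)
    filtered = filtfilt(b, a, signal)

    # 3. Take the dominant in-band frequency and convert it to a per-minute rate.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    in_band = (freqs >= low_hz) & (freqs <= high_hz)
    return freqs[in_band][np.argmax(spectrum[in_band])] * 60.0

# Synthetic demo: tiny "face crops" whose brightness pulses at 1.2 Hz (72 bpm).
fps, seconds = 30, 20
t = np.arange(fps * seconds) / fps
frames = [np.full((8, 8, 3), 128 + 5 * np.sin(2 * np.pi * 1.2 * ti)) for ti in t]
print(estimate_rate_bpm(frames, fps, 0.7, 4.0))  # ~72
```

The same function can be pointed at the respiration band (roughly 0.1 to 0.5 Hz) to get a breathing rate, which is the basic conversion step the article describes.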
The first version of this system was trained with a dataset that contained both videos of people’s faces and “ground truth” information: each person’s pulse and respiration rate measured by standard instruments in the field. The system then used spatial and temporal information from the videos to calculate both vital signs. It outperformed similar machine learning systems on videos where subjects were moving and talking.
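For readers curious what training on videos paired with ground-truth vitals looks like in practice, here is a schematic supervised training loop in PyTorch on (clip, reference waveform) pairs. The network is a generic 3D-convolutional stand-in with made-up shapes, not the architecture from the team's paper, and the dummy tensors stand in for real video and sensor data.

```python
# Schematic supervised training on (face-video clip, reference waveform) pairs.
# Generic stand-in model and shapes; not the architecture from the paper.
import torch
import torch.nn as nn

class SpatioTemporalRegressor(nn.Module):
    """Maps a video clip (B, 3, T, H, W) to a per-frame signal (B, T)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=(3, 5, 5), padding=(1, 2, 2)),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d((None, 1, 1)),  # collapse space, keep time
        )
        self.head = nn.Conv1d(16, 1, kernel_size=1)

    def forward(self, clips):
        x = self.features(clips).squeeze(-1).squeeze(-1)  # (B, 16, T)
        return self.head(x).squeeze(1)                    # (B, T)

model = SpatioTemporalRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Dummy batch: 4 clips of 60 frames with 36x36 face crops, plus reference waveforms.
clips = torch.randn(4, 3, 60, 36, 36)
targets = torch.randn(4, 60)

for step in range(3):  # a few illustrative optimization steps
    pred = model(clips)
    loss = loss_fn(pred, targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(step, loss.item())
```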
But while the system worked well on some datasets, it still struggled with others that contained different people, backgrounds and lighting. This is a common problem known as “overfitting,” the team said.
The researchers improved the system by having it produce a personalized machine learning model for each individual. Specifically, the system learns to find the important areas in a video frame that are likely to contain physiological features correlated with changing blood flow in a face, and to do so across different contexts, such as different skin tones, lighting conditions and environments. From there, it can focus on that area and measure pulse and respiration rate.
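One simplified reading of "a personalized model for each individual" is to copy the shared pretrained network and take a few gradient steps on a short calibration recording of that person, so the weights adapt to their appearance, lighting and environment. The sketch below shows that fine-tuning pattern; the team's actual adaptation procedure is more involved, and every name and shape here is a stand-in for illustration.

```python
# Simplified per-person adaptation: fine-tune a copy of the shared model on a
# short calibration recording. Illustrative stand-ins only, not the paper's method.
import copy
import torch
import torch.nn as nn

def personalize(base_model, calib_clips, calib_targets, steps=5, lr=1e-4):
    """Return a copy of base_model fine-tuned on one person's calibration data."""
    person_model = copy.deepcopy(base_model)  # leave the shared model untouched
    opt = torch.optim.SGD(person_model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(steps):
        loss = loss_fn(person_model(calib_clips), calib_targets)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return person_model

# Stand-in for the shared pretrained network: any model mapping a video clip
# (B, 3, T, H, W) to a per-frame signal (B, T) would fit this slot.
T, H, W = 60, 8, 8
base_model = nn.Sequential(nn.Flatten(1), nn.Linear(3 * T * H * W, T))

# A few seconds of one person's video plus a reference signal (dummy tensors).
calib_clips = torch.randn(2, 3, T, H, W)
calib_targets = torch.randn(2, T)
personal_model = personalize(base_model, calib_clips, calib_targets)
```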
While this new system outperforms its predecessor when given more challenging datasets, especially for people with darker skin tones, there’s still more work to do, the team said.
“We acknowledge that there is still a trend toward inferior performance when the subject’s skin type is darker,” Liu said. “This is in part because light reflects differently off of darker skin, resulting in a weaker signal for the camera to pick up. Our team is actively developing new methods to solve this limitation.”
The researchers are also working on a variety of collaborations with doctors to see how this system performs in the clinic.
“Any ability to sense pulse or respiration rate remotely provides new opportunities for remote patient care and telemedicine. This could include self-care, follow-up care or triage, especially when someone doesn’t have convenient access to a clinic,” said senior author Shwetak Patel, a professor in both the Allen School and the electrical and computer engineering department. “It’s exciting to see academic communities working on new algorithmic approaches to address this with devices that people have in their homes.”
Original Article: New system that uses smartphone or computer cameras to measure pulse, respiration rate could help future personalized telehealth appointments