rPPG vs Continuous Monitoring
AI face scan vs wearable health monitoring
A face scan is a snapshot. A wearable is a continuous signal. That difference matters more than the marketing pages admit.
Camera-based vitals are real technology. Remote photoplethysmography, or rPPG, can detect tiny color changes in facial skin caused by blood flow. Under good lighting, with a still face and a decent camera, that signal can estimate heart rate and sometimes respiratory rate.
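To make the mechanism concrete, here is a minimal sketch of the core rPPG step: estimating heart rate from the dominant frequency of a mean green-channel trace. The function name and synthetic data are illustrative; real pipelines add face tracking, detrending, and motion and illumination rejection before this stage.

```python
import numpy as np

def estimate_heart_rate(green_trace, fps):
    """Estimate heart rate (BPM) from a mean green-channel trace.

    Toy rPPG sketch: finds the strongest frequency in the
    plausible human pulse band and converts it to beats per minute.
    """
    # Remove the slow illumination baseline.
    signal = green_trace - np.mean(green_trace)
    # Power spectrum of the pulse signal.
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    # Restrict to a plausible pulse band (0.7-4 Hz, i.e. 42-240 BPM).
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(power[band])]
    return peak_freq * 60.0

# Synthetic 30 fps trace: a 1.2 Hz (72 BPM) pulse plus camera noise.
np.random.seed(0)
fps = 30
t = np.arange(0, 20, 1.0 / fps)
trace = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.randn(len(t))
print(round(estimate_heart_rate(trace, fps)))  # 72
```

Note how fragile this is in practice: the clean peak above depends on a still subject and stable lighting, which is exactly the gap between lab conditions and a phone held at arm's length.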
The problem is not the physics. The problem is the leap from "heart rate in good conditions" to "blood pressure, glucose, stress, hydration, biological age, and a digital twin from a selfie." That is where the marketing gets ahead of the validation.
Why continuous wearables win for health monitoring
The most useful health signals are usually trends, not isolated numbers. Resting heart rate drifting up for three nights. HRV dropping after a medication change. Skin temperature rising before symptoms. Sleep fragmenting after alcohol. Those patterns only appear when the system is measuring passively against your own baseline.
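The "measuring against your own baseline" idea can be sketched in a few lines: compare each new nightly reading to a rolling personal norm and express the deviation as a z-score. The function, window size, and threshold here are illustrative assumptions, not a clinical rule.

```python
from statistics import mean, stdev

def baseline_deviation(history, today, window=28):
    """Compare today's value against a personal rolling baseline.

    Returns a z-score: how many standard deviations today sits
    from the user's own recent norm. A one-off scan has no such
    history, so it can only report an absolute number.
    """
    recent = history[-window:]
    mu, sigma = mean(recent), stdev(recent)
    if sigma == 0:
        return 0.0
    return (today - mu) / sigma

# Nightly resting heart rate: a stable baseline near 58 BPM.
history = [57, 58, 59, 58, 57, 58, 59, 58, 57, 58, 59, 58, 57, 58]
print(baseline_deviation(history, 58))  # near zero: a normal night
print(baseline_deviation(history, 66))  # several sigma up: worth a look
```

The same 66 BPM reading is unremarkable as a single spot check but clearly anomalous against this user's own history, which is the whole argument for passive, continuous sampling.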
| Capability | AI face scan app | Continuous wearable |
|---|---|---|
| Sampling | One 15-60 second reading when the user remembers to scan | Passive collection across waking hours and overnight |
| Signal quality | Ambient light, motion, distance, camera focus, and skin tone can degrade the signal | Contact sensor with controlled optical signal and less environmental noise |
| Heart rate | Often good in still, well-lit conditions | Strong for continuous trend monitoring and baseline deviation |
| HRV and sleep | Cannot measure while the user sleeps | Captures overnight HRV, sleep duration, and recovery trends |
| Blood pressure | Not validated to ISO 81060-2 as a replacement for a cuff | Still not a cuff, but better suited to longitudinal waveform research and trend context |
| Blood glucose | Not clinically reliable from a selfie | Wearables do not replace CGM or fingerstick testing either |
| Illness signals | Only sees the moment of the scan | Can detect resting heart rate, HRV, and temperature deviations before symptoms |
| Caregiver use | Requires active participation | Works passively for older adults, memory care, and family visibility |
The 3am problem
A face scan cannot catch what happens while you are asleep. It cannot notice an overnight HRV collapse, sustained tachycardia, an irregular rhythm, a fever trend, or repeated oxygen dips unless you happen to wake up, open the app, sit still, and scan your face at the right moment.
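As a sketch of what overnight detection involves, here is run-length logic that flags sustained tachycardia in minute-level heart-rate samples. The threshold and minimum duration are illustrative only; real alerting needs artifact rejection and clinically reviewed parameters.

```python
def sustained_runs(samples, threshold, min_len):
    """Find runs where minute-level heart rate stays above a threshold
    for at least min_len consecutive samples.

    A spot check sees one sample; continuous data lets you ask how
    long an elevation lasted, which a single reading cannot answer.
    """
    runs, start = [], None
    for i, hr in enumerate(samples):
        if hr > threshold:
            if start is None:
                start = i  # run begins
        elif start is not None:
            if i - start >= min_len:
                runs.append((start, i))  # run was long enough to flag
            start = None
    if start is not None and len(samples) - start >= min_len:
        runs.append((start, len(samples)))  # run extends to end of night
    return runs

# Minute-level overnight HR: quiet sleep with a 12-minute episode above 100.
night = [55] * 120 + [112] * 12 + [56] * 120
print(sustained_runs(night, threshold=100, min_len=10))  # [(120, 132)]
```

A user scanning their face at breakfast would see a normal number and never know the episode happened.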
That is why Mother Nature AI is built around continuous data first: Apple Health, Oura, Whoop, Garmin, Fitbit, MyChart, and VitalIQ for users who need a passive wearable. The AI is more useful when it has your baseline, not just a selfie.
Where face scans still make sense
We are not anti-camera. We have already prototyped face-scan vitals and expect to use the parts that earn their place: heart-rate spot checks, atrial fibrillation screening in users who do not wear a watch, and skin-condition analysis. Those are narrower, more defensible claims.
Blood pressure, glucose, hydration, biological age, and "40+ vitals" from a 15-second selfie are not ready to guide health decisions. For now, we would rather ship the boring thing that works than the impressive-looking number people might act on.
Read the full evidence review
For the deeper research breakdown, including rPPG validation studies, bias concerns, and why we chose continuous monitoring first, read Can an AI Face Scan Measure Your Blood Pressure and Glucose?