rPPG vs Continuous Monitoring

AI face scan vs wearable health monitoring

A face scan is a snapshot. A wearable is a continuous signal. That difference matters more than the marketing pages admit.

Camera-based vitals are real technology. Remote photoplethysmography, or rPPG, can detect tiny color changes in facial skin caused by blood flow. Under good lighting, with a still face and a decent camera, that signal can estimate heart rate and sometimes respiratory rate.
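To make the mechanism concrete, here is a minimal sketch of the core rPPG idea: average the green channel of a face-region video (the channel most sensitive to blood-volume changes), then find the dominant frequency in the plausible heart-rate band. This is an illustrative simplification, not any vendor's pipeline; real systems add face tracking, skin segmentation, and motion artifact rejection.

```python
import numpy as np

def estimate_heart_rate(frames, fps=30.0):
    """Estimate heart rate (BPM) from a stack of face-region video frames.

    frames: array of shape (n_frames, height, width, 3), RGB, captured
    from a still, well-lit face. Uses the mean green channel, which
    carries the strongest blood-volume signal in rPPG.
    """
    signal = frames[:, :, :, 1].mean(axis=(1, 2))  # mean green value per frame
    signal = signal - signal.mean()                # remove the DC offset
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    # Restrict to plausible heart rates: 0.7-3.0 Hz (42-180 BPM)
    band = (freqs >= 0.7) & (freqs <= 3.0)
    peak_freq = freqs[band][np.argmax(power[band])]
    return peak_freq * 60.0
```

Notice how much the sketch assumes: a stable face region, steady lighting, and enough still seconds of video to resolve a clean spectral peak. Those assumptions are exactly what break in everyday use.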

The problem is not the physics. The problem is the leap from "heart rate in good conditions" to "blood pressure, glucose, stress, hydration, biological age, and a digital twin from a selfie." That is where the marketing gets ahead of the validation.

Why continuous wearables win for health monitoring

The most useful health signals are usually trends, not isolated numbers. Resting heart rate drifting up for three nights. HRV dropping after a medication change. Skin temperature rising before symptoms. Sleep fragmenting after alcohol. Those patterns only appear when the system is measuring passively against your own baseline.
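Baseline-deviation logic of this kind is simple to state. A minimal sketch, assuming per-night readings such as resting heart rate (the function name and thresholds are illustrative, not a production algorithm):

```python
from statistics import mean, stdev

def baseline_deviation(history, recent, min_history=14):
    """Flag when recent nightly readings drift above a personal baseline.

    history: past nightly values (e.g. resting heart rate in BPM)
    recent:  the last few nights to test against that baseline
    Returns True when the recent average sits more than two standard
    deviations above the historical mean -- a drift that a one-off
    spot check has no baseline to compare against.
    """
    if len(history) < min_history:
        return False  # not enough data to form a personal baseline yet
    mu, sigma = mean(history), stdev(history)
    return mean(recent) > mu + 2 * sigma
```

The point is not the arithmetic; it is that `history` only exists if something is measuring every night.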

| Capability | AI face scan app | Continuous wearable |
| --- | --- | --- |
| Sampling | One 15-60 second reading when the user remembers to scan | Passive collection across waking hours and overnight |
| Signal quality | Ambient light, motion, distance, camera focus, and skin tone can degrade the signal | Contact sensor with a controlled optical signal and less environmental noise |
| Heart rate | Often good in still, well-lit conditions | Strong for continuous trend monitoring and baseline deviation |
| HRV and sleep | Cannot measure while the user sleeps | Captures overnight HRV, sleep duration, and recovery trends |
| Blood pressure | Not validated to ISO 81060-2 as a replacement for a cuff | Still not a cuff, but better suited to longitudinal waveform research and trend context |
| Blood glucose | Not clinically reliable from a selfie | Wearables do not replace CGM or fingerstick testing either |
| Illness signals | Only sees the moment of the scan | Can detect resting heart rate, HRV, and temperature deviations before symptoms |
| Caregiver use | Requires active participation | Works passively for older adults, memory care, and family visibility |

The 3am problem

A face scan cannot catch what happens while you are asleep. It cannot notice an overnight HRV collapse, sustained tachycardia, an irregular rhythm, a fever trend, or repeated oxygen dips unless you happen to wake up, open the app, sit still, and scan your face at the right moment.
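One of those overnight events can be sketched in a few lines. Assuming per-minute heart-rate samples from a wearable (the threshold and window here are illustrative, not clinical criteria):

```python
def sustained_tachycardia(samples, threshold=100, min_minutes=10):
    """Detect a sustained run of elevated sleeping heart rate.

    samples: per-minute heart-rate readings (BPM) from a wearable
    Returns True when the rate stays above `threshold` for at least
    `min_minutes` consecutive minutes -- the kind of overnight event
    a one-off face scan simply never observes.
    """
    run = 0
    for bpm in samples:
        run = run + 1 if bpm > threshold else 0  # count consecutive minutes
        if run >= min_minutes:
            return True
    return False
```

A spot check samples one minute of that stream; the detector needs all of it.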

That is why Mother Nature AI built around continuous data first: Apple Health, Oura, Whoop, Garmin, Fitbit, MyChart, and VitalIQ for users who need a passive wearable. The AI is more useful when it has your baseline, not just a selfie.

Where face scans still make sense

We are not anti-camera. We have already prototyped face-scan vitals and expect to use the parts that earn their place: heart-rate spot checks, atrial fibrillation screening in users who do not wear a watch, and skin-condition analysis. Those are narrower, more defensible claims.

Blood pressure, glucose, hydration, biological age, and "40+ vitals" from a 15-second selfie are not ready to guide health decisions. For now, we would rather ship the boring thing that works than the impressive-looking number people might act on.

Read the full evidence review

For the deeper research breakdown, including rPPG validation studies, bias concerns, and why we chose continuous monitoring first, read "Can an AI Face Scan Measure Your Blood Pressure and Glucose?"