New study shows how AI can unlock deeper cardiac data from Apple Watch’s optical sensor

A few days ago, we looked at how Apple could one day use AirPods’ brainwave sensors to measure sleep quality and even detect seizures.

Now, a new paper shows how the company is exploring deeper insights into heart health with the help of AI. Here are the details:

A little context

With watchOS 26, Apple introduced high blood pressure notifications to Apple Watch.

The company explains:

High blood pressure notification on Apple Watch uses data from an optical heart sensor to analyze how your blood vessels respond to your heartbeat. The algorithm works passively in the background, reviewing data over 30 days and notifying users when it detects consistent signs of high blood pressure.

The feature is far from a medical-grade diagnostic tool, and while Apple was the first to admit that “hypertension notifications do not detect every instance of high blood pressure,” the company also claims that the feature is expected to “notify more than 1 million people with undiagnosed hypertension within one year.”

One important aspect of this feature is that it is based on data over a 30-day period, rather than instantaneous measurements. That is, its algorithm analyzes trends rather than generating real-time hemodynamic readings or estimating specific cardiovascular parameters.

And that’s exactly where this new Apple study comes in.

Get more data from optical sensors

One important point needs clarifying first: although this study refers to the Apple Watch, it makes no claims about future products or features.

This research, like most (if not all) of the research published on Apple’s Machine Learning Research blog, focuses on the fundamental research and the technology itself.

In this particular paper, called “Hybrid Modeling of Photoplethysmography for Non-Invasive Monitoring of Cardiovascular Parameters,” Apple proposes “a hybrid approach to directly estimate cardiovascular biomarkers from PPG signals using hemodynamic simulation and unlabeled clinical data.”

In other words, the researchers demonstrated that it is possible to estimate deeper cardiac metrics using a simple finger pulse sensor, also known as photoplethysmography (PPG). This is the same optical sensing method used in the Apple Watch (although the signal characteristics differ).

The Apple researchers first assembled a large dataset of labeled simulated arterial pressure waveforms (APW), along with simultaneous measurements of real APW and PPG.

Next, they trained a generative model to map PPG data to the co-occurring APW.

In short, this allowed them to infer APW data from PPG measurements with sufficient accuracy for the study's purposes.

The inferred APW was then fed into a second model, trained to estimate cardiac biomarkers such as stroke volume and cardiac output from that data.

They accomplished this by training the second model on simulated APW data paired with known cardiovascular parameter values for stroke volume, cardiac output, and other metrics.

Finally, the pipeline generated multiple valid APW waveforms for each PPG segment, estimated the corresponding cardiovascular parameters for each, and averaged the results to produce a final estimate along with an uncertainty measure.
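To make that last step concrete, here is a minimal sketch of the ensemble-averaging idea in Python. The two model functions are hypothetical stand-ins (the paper's actual generative model and biomarker regressor are far more sophisticated); only the structure, sampling several plausible APWs per PPG segment, estimating a biomarker from each, and reporting the mean plus spread as uncertainty, mirrors the approach described above.

```python
import random
import statistics

# Hypothetical stand-in for the trained generative model: draws one
# plausible arterial pressure waveform (APW) for a given PPG segment.
def sample_apw_from_ppg(ppg_segment):
    return [x * random.uniform(0.9, 1.1) for x in ppg_segment]

# Hypothetical stand-in for the second model: infers a cardiac
# biomarker (here, a toy "stroke volume" in mL) from an APW.
def estimate_stroke_volume(apw):
    return sum(apw) / len(apw) * 70.0

def monte_carlo_estimate(ppg_segment, n_samples=50):
    """Average biomarker estimates over several sampled APWs and
    report the spread of those estimates as an uncertainty measure."""
    values = []
    for _ in range(n_samples):
        apw = sample_apw_from_ppg(ppg_segment)
        values.append(estimate_stroke_volume(apw))
    return statistics.mean(values), statistics.stdev(values)

mean_sv, sv_uncertainty = monte_carlo_estimate([1.0, 0.8, 1.2, 0.9])
print(f"stroke volume ≈ {mean_sv:.1f} mL ± {sv_uncertainty:.2f}")
```

The averaging smooths out variability across the generated waveforms, while the standard deviation gives the caller a sense of how much the candidate APWs disagree, which is useful when a PPG segment is ambiguous.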

Results

Once the training process and model pipeline were in place, the researchers selected an entirely new dataset “consisting of APW and PPG signals from 128 patients undergoing non-cardiac surgery, labeled with cardiovascular biomarkers.”

When they ran this data through their pipeline, they found that the model accurately tracked trends in stroke volume and cardiac output, though not their absolute values.

Still, their method outperforms traditional techniques and shows that AI-assisted modeling can extract more meaningful cardiac insights from simple optical sensors.

Here are the researchers’ conclusions in their own words:

In this study, we use a hybrid modeling approach to infer cardiovascular parameters from in vivo PPG signals. Compared to purely data-driven approaches, which are difficult due to limited labeled data, our method achieves promising results by incorporating simulation and avoiding the need for invasive and costly annotations. While other existing hybrid approaches for cardiovascular modeling either embed physical properties as structural constraints within neural networks or augment traditional physiological models with data-driven components, our method incorporates physical knowledge into the model through SBI. (…) Our results contribute to the characterization of the informativeness of PPG signals for predicting cardiac biomarkers and may be extended beyond what was considered in our experiments. Although our results are promising in monitoring temporal trends, absolute value prediction of complex biomarkers remains challenging and represents an important direction for future research. Future research may also consider alternative generation approaches for PPG to APW mapping and explore different architectural choices. Finally, a learning strategy similar to the one used here for finger PPG could be extended to other modalities, including wearable PPG, opening the door to passive and long-term cardiac biomarker monitoring.

While it’s impossible to know whether Apple will build these features into the Apple Watch, it’s encouraging that the company’s researchers are looking for new ways to extract even more meaningful and potentially life-saving data from the sensors already in use.

The entire study is available on arXiv.
