Scientists have developed an algorithm that can accurately predict when a patient is about to develop dangerously low blood pressure—a condition known as hypotension—while under the knife.

The researchers used machine learning to detect subtle signs in routinely collected physiological data that help predict the onset of hypotension in surgical patients. The algorithm was able to identify hypotension 15 minutes before it occurred in 84 percent of cases, 10 minutes before in 84 percent of cases, and five minutes before in 87 percent of cases.

“Physicians haven't had a way to predict hypotension during surgery, so they have to be reactive, and treat it immediately without any prior warning,” lead researcher Maxime Cannesson, MD, PhD, a professor of anesthesiology and vice chair for perioperative medicine at UCLA Medical Center in Los Angeles, said in a statement.

“Being able to predict hypotension would allow physicians to be proactive instead of reactive,” he added. “By finding a way to predict hypotension, we can avoid its complications—which can include postoperative heart attack and acute kidney injury—that can lead to death in some cases.”

The researchers used a data set that consisted of 1,334 patient records with 545,959 minutes of arterial pressure waveform recordings—recordings of the rise and fall of blood pressure in the arteries during each heartbeat—and 25,461 episodes of hypotension.

A second data set consisted of 204 patient records with 33,236 minutes of arterial pressure waveform recordings and 1,923 episodes of hypotension.

“We are using machine learning to identify which of these individual features, when they happen together and at the same time, predict hypotension,” Cannesson said. “The statistical association between these features and the occurrence of hypotension is fascinating because we can potentially reverse engineer this statistical association and augment our understanding of this complex physiological phenomenon.”

The team extracted 3,022 individual features from each heartbeat; combined, the features yielded more than 2.6 million bits of information that were used to build the algorithm.
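In broad strokes, this is a supervised-learning setup: each heartbeat is described by a vector of waveform features, and a classifier learns which feature combinations precede a hypotensive episode. The minimal sketch below illustrates that general pattern only—the features, labels, model, and scale here are entirely synthetic stand-ins, not the study's actual data or algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data: each row is one "heartbeat" described by waveform
# features (the study used 3,022 features per beat; we use 20 here).
n_beats, n_features = 2000, 20
X = rng.normal(size=(n_beats, n_features))

# Synthetic labels: 1 = hypotension occurs within the prediction window,
# 0 = blood pressure stays stable. Generated from a hidden linear rule
# plus noise, purely so the example has learnable structure.
w_true = rng.normal(size=n_features)
y = (X @ w_true + rng.normal(scale=2.0, size=n_beats) > 0).astype(float)

# Hold out 25 percent of beats to check the model on unseen data.
split = int(0.75 * n_beats)
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

# Train a simple logistic-regression classifier by gradient descent.
w, b, lr = np.zeros(n_features), 0.0, 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X_tr @ w + b)))   # predicted probability
    w -= lr * (X_tr.T @ (p - y_tr)) / len(y_tr)  # gradient step on weights
    b -= lr * np.mean(p - y_tr)                  # gradient step on bias

# Evaluate: fraction of held-out beats classified correctly.
pred = (1.0 / (1.0 + np.exp(-(X_te @ w + b))) > 0.5).astype(float)
acc = float(np.mean(pred == y_te))
print(f"held-out accuracy: {acc:.2f}")
```

In practice, a real system would also report sensitivity and specificity at each prediction horizon (15, 10, and 5 minutes), as the study did, rather than a single accuracy figure.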

“It is the first time machine learning and computer science techniques have been applied to complex physiological signals obtained during surgery,” Cannesson said. “Although future studies are needed to evaluate the real-time value of such algorithms in a broader set of clinical conditions and patients, our research opens the door to the application of these techniques to many other physiological signals, such as EKG for cardiac arrhythmia prediction or EEG for brain function. It could lead to a whole new field of investigation in clinical and physiological sciences and reshape our understanding of human physiology.”

The study was published in Anesthesiology.
