Source: Butterfly Network

Ultrasound machines are crucial but cumbersome devices that play an important role in medical technology.

These large carts house big computers along with multiple probes and transducers, and they get pushed around hospitals. Each transducer is built with small piezoelectric crystals that vibrate to emit sound, receive the echo, and form an image of a developing baby or of abnormalities located close to the surface, such as a clogged artery, vein, or abscess.

But there is an additional layer of complexity for these machines when it comes to delivering an overview of an imaged area or identifying certain conditions, according to Wired.

Each crystal must be individually wired, with cables running to a separate machine for processing. Next, the crystals must be tuned to produce the desired type of ultrasonic wave for imaging at a certain depth. In addition, separate probes are needed for the heart, stomach, uterus, and other areas.

However, a startup has created a unique device that can bypass these limitations.

Butterfly Network created a device called the iQ. It’s a compact, inexpensive ultrasound tool that plugs into the Lightning port of an iPhone.

“What we've done is we've engineered these ultrasound machines onto a chip and we've given it very broad acoustic bandwidth,” said Matt Dejonge, head of product development for Butterfly Network, in an interview with R&D Magazine.

The acoustic transducer contains thousands of tiny drums, each the size of a human hair, which sit on the custom chip developed by the company. Butterfly’s team of engineers was able to integrate all of the functions of the large computers built into traditional ultrasound machines within that component.

Piezoelectric crystals only work well over a very narrow band of frequencies, but this specialized chip means Butterfly’s invention can span the entire range of ultrasound applications with just one transducer.
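To see why frequency range matters so much, here is a back-of-the-envelope sketch of the textbook trade-off that forces conventional piezoelectric probes to specialize. The attenuation figure (~0.5 dB/cm/MHz in soft tissue) is a standard rule of thumb, and the 60 dB loss budget is an assumed value for illustration; neither number comes from the article or from Butterfly.

```python
# Illustrative trade-off: lower frequencies penetrate deeper, higher
# frequencies resolve finer detail, so a narrow-band probe can only
# serve one depth range well.

SPEED_OF_SOUND_CM_PER_S = 154_000       # ~1540 m/s in soft tissue
ATTENUATION_DB_PER_CM_PER_MHZ = 0.5     # common textbook rule of thumb
LOSS_BUDGET_DB = 60                     # assumed system dynamic-range budget

def penetration_depth_cm(freq_mhz: float) -> float:
    """Max one-way depth before the round-trip echo exceeds the loss budget."""
    round_trip_loss_per_cm = 2 * ATTENUATION_DB_PER_CM_PER_MHZ * freq_mhz
    return LOSS_BUDGET_DB / round_trip_loss_per_cm

def axial_resolution_mm(freq_mhz: float) -> float:
    """Resolution scales with wavelength: higher frequency, finer detail."""
    wavelength_cm = SPEED_OF_SOUND_CM_PER_S / (freq_mhz * 1e6)
    return wavelength_cm * 10  # cm -> mm

for f in (2.0, 5.0, 10.0):  # roughly cardiac, abdominal, vascular ranges
    print(f"{f:>4.1f} MHz: depth ~{penetration_depth_cm(f):5.1f} cm, "
          f"resolution ~{axial_resolution_mm(f):.2f} mm")
```

Under these assumptions a 2 MHz wave reaches several times deeper than a 10 MHz wave but with coarser detail, which is why a cart needs separate tuned probes and why one broadband transducer can replace them.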

Market applications

Emergency medicine is one of the core markets the company is targeting with its device, said Dejonge.

“Emergency medicine has embraced what the industry has come to call point-of-care ultrasound. So it's this idea of using ultrasound almost as part of the physical examination.”

Imagine a scenario where an individual gets in a serious car accident.

Typical ultrasound machines would be used to peek inside the body when the patient arrives in the emergency room to determine whether there is internal bleeding. Next, physicians could send the patient off for a CT scan, which could involve a long wait and a very high dose of radiation. It would also take additional time to interpret the results of a CT scan.

Furthermore, a doctor may have to perform what is called a guided procedure, making sure the large needle is placed in the correct area to drain fluid. Such procedures carry very high complication rates because a needle intended for the internal jugular vein can accidentally be placed in the carotid artery.

“It’s a lot like a battlefield in the sense that these physicians are short-staffed, they’re caring for a lot of different patients at once and are lucky if the emergency department even owns an ultrasound machine,” continued Dejonge.

Essentially, a device of the iQ’s size and breadth would give doctors the ability to get answers for diagnosis right at the point of care, thereby optimizing treatment decisions for their patients.

They could either put the probe down on the patient and look inside the body to see if there is internal bleeding, or place the transducer on the individual’s body to pinpoint exactly where the needle needs to be placed to drain the fluid.

Better education

An artificial intelligence application will be built into the iQ to enhance the device’s hardware capabilities, which Dejonge noted will come in two steps: acquisition assistance and automated interpretation.

“You’ve got to really be able to position that probe in just the right way to get the anatomy of interest on screen in a high quality way with the added step of being able to interpret what you are looking at.”

To achieve the acquisition assistance step, the team is building deep learning neural networks into the software that tell the user in real time how to position the probe on the body. In this setup, arrows will be drawn on the screen, providing a path to the designated target through the iPhone’s cameras. A checkmark will appear on screen to alert the user that they have arrived at the right spot.

Automated interpretation would be a process where the software measures the anatomy and delivers feedback regarding what the user is looking at.

FDA approval and updates

The company recently received clearance from the Food and Drug Administration for 13 different clinical use cases, including fetal, abdominal, cardiac, gynecological, urological, and pediatric use cases.

The A.I. features are expected to arrive via a software update in 2018. Dejonge feels this device has a couple of advantages that could make it popular among healthcare practitioners.

“We can manufacture these life-saving ultrasound devices in the same way that you would manufacture all the chips in your iPhone in many cases, at the same factories, using the same technique. We run these chips through hundreds of processes as well and at the end we get a life-saving high-performance ultrasound,” he noted.

Essentially, this keeps costs low: the transducer gets built for under $2,000. It can see inside the entire body and can be carried around in a physician’s pocket.

These tools are expected to start shipping in 2018.