5 ways AI is changing healthcare

A doctor wearing a stethoscope and ID badge speaks to a patient in a medical exam room

As emerging AI tools take hold in healthcare, doctors are spending more time looking at their patients instead of computer screens. Researchers are interpreting medical images faster and more accurately. Operating room schedulers are fitting in more lifesaving surgeries.

With these early advances, clinicians, administrators, researchers and developers say they’re already seeing AI’s positive impact on how they approach patient care, handle administrative tasks and coordinate with care teams.

“This is technology that spans all aspects of our business,” says Eric Shelley, vice president of analytics and digital solutions at Northwestern Medicine. 

Here are five ways AI is transforming healthcare, from the business office to the exam room. 

A person holds a cellphone

Improving patient visits and reducing clinician burnout 

AI tools are personalizing office visits, says Dr. Jorge Scheirer, a physician and the chief medical information officer at St. Luke’s University Health Network in Pennsylvania. He and his colleagues use Microsoft Dragon Copilot, the healthcare industry’s first unified voice AI assistant, to focus fully on patients rather than computer screens. 

Scheirer can query charts ahead of appointments for insights and reminders, and then the system securely records, transcribes and summarizes — sometimes catching pertinent comments he hadn’t heard. It pulls up any health records or vetted medical resources he asks for, helps find the right medical codes, and drafts after-visit notes and referrals for him to proof and sign. 

“It’s uncanny how good a job it does,” Scheirer says. 

Dragon Copilot’s time savings also help alleviate clinician burnout, he says. Scheirer used to work until 10:30 p.m. completing regulatory documentation. Now patients get their after-visit notes sooner, and he’s home in time for dinner with his wife.
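Dragon Copilot is a packaged product, so there is no public code behind Scheirer’s workflow to show. As a rough illustration of the underlying record, transcribe and summarize pattern, here is a minimal sketch built on the Azure AI Speech SDK and an LLM. The model choice, prompt and file names are hypothetical, and this is not how Dragon Copilot itself works.

```python
# Illustrative sketch of the record-transcribe-summarize pattern.
# NOT Dragon Copilot's API: assumes the Azure AI Speech SDK
# (pip install azure-cognitiveservices-speech) and the OpenAI Python SDK.
import os

import azure.cognitiveservices.speech as speechsdk
from openai import OpenAI


def transcribe_visit(wav_path: str) -> str:
    """Transcribe a consented visit recording with Azure AI Speech."""
    speech_config = speechsdk.SpeechConfig(
        subscription=os.environ["SPEECH_KEY"], region=os.environ["SPEECH_REGION"]
    )
    audio_config = speechsdk.audio.AudioConfig(filename=wav_path)
    recognizer = speechsdk.SpeechRecognizer(
        speech_config=speech_config, audio_config=audio_config
    )
    result = recognizer.recognize_once()  # first utterance only; real systems stream
    return result.text


def draft_visit_note(transcript: str) -> str:
    """Ask an LLM to draft an after-visit note for the clinician to proof and sign."""
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o",  # hypothetical model choice
        messages=[
            {
                "role": "system",
                "content": "Draft a concise SOAP-format after-visit note from this "
                           "clinical transcript. Flag anything ambiguous.",
            },
            {"role": "user", "content": transcript},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    # The clinician still proofs and signs the draft, as Scheirer describes.
    print(draft_visit_note(transcribe_visit("visit.wav")))
```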

A person in a suit looks through a microscope, adjusting the focus knob while seated at a lab workstation

Talking to medical images for faster diagnoses

Researchers are testing new tools that may help them identify tumors and diseases. Providence, the University of Washington and Microsoft have developed multimodal AI models intended to help researchers interpret medical imaging and answer questions about it in natural language.

GigaPath stitches the tiny details in microscope slides into one complete picture, unlike traditional tools that analyze single sections at a time. That could help with earlier disease detection and more personalized treatments.

BiomedParse analyzes many types of medical scans to detect and identify abnormalities, even spotting things human eyes can miss, and answers questions about highlighted areas. That could help speed up the diagnostic process and make it more accurate.

“These technologies put us on the path to a future where vision becomes part of the intelligence we have,” says Dr. Carlo Bifulco, chief medical officer at Providence Genomics in Oregon. With AI chat capabilities, “you literally will have conversations with the medical images.” 
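GigaPath’s tile encoder is openly published on Hugging Face (prov-gigapath), so researchers can experiment with it directly. Here is a minimal sketch of embedding a single slide tile, assuming access to the gated checkpoint and following the preprocessing described in its model card:

```python
# Minimal sketch: embed one pathology tile with GigaPath's tile encoder.
# Assumes access to the gated prov-gigapath checkpoint on Hugging Face
# and pip-installed timm, torch, torchvision, pillow.
import timm
import torch
from PIL import Image
from torchvision import transforms

tile_encoder = timm.create_model(
    "hf_hub:prov-gigapath/prov-gigapath", pretrained=True
)
tile_encoder.eval()

# Preprocessing per the model card: 224x224, ImageNet-normalized tiles.
transform = transforms.Compose([
    transforms.Resize(256, interpolation=transforms.InterpolationMode.BICUBIC),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=(0.485, 0.456, 0.406), std=(0.229, 0.224, 0.225)),
])

tile = Image.open("slide_tile.png").convert("RGB")  # one tile cropped from a slide
with torch.no_grad():
    embedding = tile_encoder(transform(tile).unsqueeze(0))
print(embedding.shape)  # one feature vector per tile, e.g. (1, 1536)
```

Tile embeddings like this one are then aggregated by GigaPath’s slide-level encoder, which is what produces the “one complete picture” view of the whole slide described above.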

A stethoscope and pen rest on top of medical charts and data sheets

Providing ‘clean data’ for smarter hospital decisions

Morning huddles are now more informed, as everyone from operating room coordinators to pharmacy managers uses daily data reports to provide better, more efficient healthcare across Northwestern Medicine’s 11 hospitals in and around Chicago. 

About 400 Power BI data visualization reports hosted on the Microsoft Fabric platform provide “a snapshot of the state of the health system,” Shelley says. They track emergency visits, scheduled surgeries and patient appointments to help allocate resources.

Fabric collects data in one secure place, from whatever software the organization’s various groups use, and provides a shared workspace where Northwestern’s data teams can manage it for accuracy. That “clean data” gives hospital workers more confidence and lets them use AI programs to screen reports on everything from medication dosing errors to safety incidents, helping prioritize responses and monitor trends, Shelley says.
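Northwestern hasn’t published the details of its screening programs, but the general pattern of using an AI model to triage free-text safety reports can be sketched. Everything below, including the categories and the prompt, is a hypothetical illustration, assuming the OpenAI Python SDK:

```python
# Hypothetical illustration of AI-assisted screening of safety-event reports.
# Northwestern's actual pipeline isn't public; categories and prompt are
# invented. Assumes the OpenAI Python SDK.
import json

from openai import OpenAI

client = OpenAI()


def triage_report(report_text: str) -> dict:
    """Classify one free-text safety report by type and urgency."""
    response = client.chat.completions.create(
        model="gpt-4o",  # hypothetical model choice
        response_format={"type": "json_object"},
        messages=[
            {
                "role": "system",
                "content": 'Classify this hospital safety report. Return JSON with '
                           '"category" (e.g. medication_dosing, fall, equipment) '
                           'and "urgency" (low, medium, high).',
            },
            {"role": "user", "content": report_text},
        ],
    )
    return json.loads(response.choices[0].message.content)


report = "Patient received 10x the intended heparin dose; no harm observed."
print(triage_report(report))  # e.g. {"category": "medication_dosing", "urgency": "high"}
```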

Two doctors observe medical scans on a computer.

Spotting patterns to match patients with treatments and trials

Integrating AI with medical research and clinical practice can make healthcare more effective, improving outcomes and reducing costs, says Jonathan Carlson, vice president and managing director of Microsoft Health Futures.

AI-powered research is driving progress in precision medicine and personalized treatment plans, Carlson says. AI tools can sort through mounds of data faster than whole teams of researchers could, spotting patterns and making data-based predictions. That information can help doctors match patients with the right clinical trials faster, for example, or find existing medications likely to work on specific tumor mutations. 

“We can use this increasingly holistic image of a patient both to help the clinician reason about, ‘Hey, what’s the next thing I should do to understand this patient?’” Carlson says, “and then, ‘How do I compare that patient with the population and have a better idea of what’s actually going to work?’” 
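As a purely illustrative example of that matching idea, and not Microsoft’s actual method, a patient summary can be compared against trial eligibility text with off-the-shelf embeddings. The trial IDs and clinical details below are invented:

```python
# Illustrative only: rank trials by semantic similarity between a patient
# summary and eligibility text. Trial IDs and clinical details are invented.
# Assumes pip-installed sentence-transformers.
from sentence_transformers import SentenceTransformer, util

# General-purpose model; a clinical-domain model would likely match better.
model = SentenceTransformer("all-MiniLM-L6-v2")

patient = (
    "62-year-old with EGFR-mutant non-small-cell lung cancer, "
    "progression after first-line therapy"
)
trials = {
    "TRIAL-A": "Recruiting adults with EGFR-mutated NSCLC after prior TKI therapy",
    "TRIAL-B": "Phase 2 study of immunotherapy in triple-negative breast cancer",
}

patient_vec = model.encode(patient, convert_to_tensor=True)
for trial_id, criteria in trials.items():
    score = util.cos_sim(patient_vec, model.encode(criteria, convert_to_tensor=True))
    print(trial_id, round(float(score), 3))  # higher score = closer match
```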

A scientist in a lab coat closely examines tech that has a glowing blue light.

Giving new tools to healthcare developers and innovators

Assessing a patient’s health requires more than understanding medical text. That’s where multimodal AI models can help, by integrating and analyzing data sources across modalities such as images, video and audio.

Microsoft Azure’s multimodal medical imaging foundation models allow healthcare organizations to build AI tools specific to their needs. These models, developed in collaboration with Microsoft Research and strategic partners, reduce the extensive computing and data requirements typically involved in creating tools from scratch. 

The RAD-DINO model, for example, converts chest X-rays into digital representations that can be processed and organized to help better identify diseases. MedImageInsight helps classify and sort medical imaging, and MedImageParse-3D helps analyze and interpret MRIs and CT scans. The ECG-FM model detects patterns in electrocardiograms, and HistAI helps with tasks such as identifying the organ a tissue sample came from and whether it’s healthy or diseased.
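Models from the catalog are typically deployed to a managed online endpoint in Azure, which applications then call over HTTPS. Here is a minimal sketch of that call, with a placeholder endpoint URL and an example request shape; the actual schema varies by model, so check each model’s deployment docs:

```python
# Minimal sketch of calling a medical imaging model deployed from the Azure AI
# model catalog to a managed online endpoint. The URL is a placeholder and the
# payload shape is an example only; each model documents its own schema.
import base64

import requests

ENDPOINT = "https://<your-endpoint>.<region>.inference.ml.azure.com/score"
API_KEY = "<your-endpoint-key>"

with open("chest_xray.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

# Example payload shape only; the real schema depends on the deployed model.
payload = {"input_data": {"columns": ["image"], "data": [[image_b64]]}}
headers = {"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"}

response = requests.post(ENDPOINT, json=payload, headers=headers, timeout=60)
response.raise_for_status()
print(response.json())  # e.g. an image embedding or segmentation result
```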

Lead image: Dr. Jorge Scheirer with St. Luke’s University Health Network — photo by Rachel Wisniewski 

Other images: Improving patient visits: DAX Copilot — photo by Rachel Wisniewski; Talking to medical images: Dr. Carlo Bifulco with Providence Genomics — photo provided by Providence Foundation; Providing ‘clean data’: photo by Krisanapong Detraphiphat / Getty; Spotting patterns: photo by Praetorianphoto / Getty; Giving new tools to developers: photo by Halfpoint Images / Getty