Neurosurgeons can leave the operating room more confident than ever about their patient’s brain tumor diagnosis, thanks to the integration of a new system that lets them see diagnostic tissue and tumor margins in near-real time.
And surgeons at Michigan Medicine say the accuracy and precision will only continue to advance as they incorporate deep learning and computer vision to make the whole process quicker. In the operating room, faster also means more affordable.
Todd Hollon, M.D., a chief neurosurgical resident at Michigan Medicine, describes a two-part approach to improving intraoperative diagnostic accuracy and efficiency in a new publication in Nature Medicine.
Hollon, along with Daniel Orringer, M.D., an associate professor of neurosurgery at NYU Langone Health, and colleagues, reports the most recent application of a technique called stimulated Raman histology (SRH), developed at Michigan Medicine to rapidly generate images of tumor tissue at the bedside. Neuropathologists can review these images without a pathology lab, eliminating the long wait required for traditional processing, staining and interpretation.
The researchers also used an artificial intelligence algorithm called a deep convolutional neural network to learn the characteristics of the 10 most common types of brain cancer and predict diagnosis. Surgeons are provided with a diagnostic prediction within minutes at the bedside with accuracy comparable to that of the conventional method.
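At its core, a classifier of this kind scores an image against each tumor class, converts the scores to probabilities, and reports the most likely class. The sketch below illustrates only that final prediction step; the class names and scores are invented for the example and are not the paper's taxonomy or the network's actual outputs.

```python
# Minimal sketch of the prediction step: softmax over per-class scores,
# then argmax. Class names and raw scores (logits) are hypothetical.
import math

classes = ["glioblastoma", "meningioma", "metastasis", "pituitary adenoma"]
scores = [4.1, 1.3, 2.7, 0.2]  # invented raw network outputs for one image

# Softmax: exponentiate and normalize so the scores sum to 1.
exp_scores = [math.exp(s) for s in scores]
total = sum(exp_scores)
probs = [e / total for e in exp_scores]

# The highest-probability class becomes the bedside prediction.
prediction = classes[max(range(len(classes)), key=lambda i: probs[i])]
print(prediction)  # -> glioblastoma
```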
“This is the first prospective trial evaluating the use of artificial intelligence in the operating room,” says Hollon, lead author of the publication. “We have executed clinical translation of an AI-based workflow.”
Diagnosis without delay
Patients hospitalized for brain tumor surgery often don’t know exactly what kind of tumor they have beforehand, only that surgery is required. The whole OR team, after obtaining and processing a specimen to physically deliver to the expert neuropathologist, has to wait around for answers, too. A surgeon can’t be sure how they’ll approach a case until receiving that pathologist feedback.
“The amount of time involved can be prohibitive,” Hollon says.
But when the image is digital and appears within a few minutes, it can be used right away in the operating room to inform the patient’s care.
“It’s so quick that we can image many specimens from right by the patient’s bedside and better judge how successful we’ve been at removing the tumor,” Hollon says.
Michigan Medicine recently made the first commercial purchases of the technology used to perform SRH microscopy. Surgeons have already used the imager on more than 500 patients as a first-line diagnostic tool for neurosurgery and otolaryngology, with more specialties expected to adopt it soon.
“The surgeon and pathologist determine whether they can make the diagnosis using the SRH image, or whether there is a need to send additional tissue to the pathology lab, the way we used to in the past,” Hollon says.
In another use, the authors hope SRH technology will one day allow under-resourced hospitals to easily consult with colleagues at academic medical centers on difficult cases.
A digital collaboration aid
While there’s much to celebrate, brain surgeons acknowledge outcomes are still unacceptably poor for patients with some types of brain cancer, such as glioblastoma. It’s imperative to have reliable decision-making in the OR and a strong partnership between the surgeon and the neuropathologist for the best possible outcome, researchers say.
That’s where artificial intelligence comes in.
Hollon trained a convolutional neural network (CNN) on more than 2.5 million de-identified images from 415 patients, obtained via stimulated Raman histology and focusing on the more common brain tumor types.
The researchers then tested the network prospectively on 278 patients from three institutions, splitting each patient’s specimen in two: one portion was stained and processed in the conventional way and delivered to a pathologist for diagnosis; the other was imaged with SRH and diagnosed by the neural network.
The CNN-based method matched conventional histology’s diagnostic accuracy: 94.6% versus 93.9%.
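As a worked check of those figures, the per-arm correct counts below (263 and 261) are back-calculated from the reported percentages on the 278-patient cohort, not taken from the paper, so treat them as illustrative assumptions.

```python
# Back-calculated accuracy check on the 278-patient prospective cohort.
n_patients = 278
cnn_correct = 263           # assumed count yielding ~94.6%
conventional_correct = 261  # assumed count yielding ~93.9%

cnn_acc = cnn_correct / n_patients
conv_acc = conventional_correct / n_patients
print(f"SRH + CNN accuracy:    {cnn_acc:.1%}")   # -> 94.6%
print(f"Conventional accuracy: {conv_acc:.1%}")  # -> 93.9%
```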
“We’re transforming brain tumor diagnosis,” Hollon says. “It’s a highly standardized tool that can deliver accurate diagnoses to a broad swath of brain tumor patients.”
Hollon notes the CNN correctly diagnosed the cases the pathologists got wrong, while the pathologists correctly diagnosed the cases the AI got wrong, indicating that the two methods are complementary.
“It’s another way to help the pathologists and surgeons increase certainty while making important decisions in the operating room,” he says.
Researchers also worked with the system to help it learn rarer diagnoses.
Hollon says that while the Michigan Medicine team is already taking advantage of intraoperative SRH diagnosis today, routine use of the CNN in clinical practice is possible within the next few years.
“The results reported in our study represent the culmination of a 9-year journey at Michigan Medicine to develop and implement a better way to do brain tumor surgery, one that leverages advances in optics and artificial intelligence to make safer, more effective decisions in the operating room,” says Orringer, the senior author of the article.
It all started when Orringer, then a neurosurgery resident at Michigan Medicine, performed the pivotal work to develop SRH in mouse models and later used the first SRH imager in the Michigan Medicine operating rooms. The team has since developed an SRH imager that’s fast, mobile, user-friendly and registered with the FDA.
Paper cited: “Near real-time intraoperative brain tumor diagnosis using stimulated Raman histology and deep neural networks,” Nature Medicine. DOI: 10.1038/s41591-019-0715-9
Disclosures: Daniel A. Orringer, M.D., is an adviser and shareholder of Invenio Imaging Inc., a company developing SRH imagers. The University of Michigan has intellectual property rights on elements of the technology reported here.