New AI Technology May Have the Potential to Optimize Cancer Surgery

Fact checked by Ariana Pelosci

Anant Madabhushi, PhD, and Farzad Fereidouni, PhD, are developing the MarginCall technology to reduce time lag and improve tumor margin assessment accuracy in breast and ovarian cancer surgery.


Anant Madabhushi, PhD, and Farzad Fereidouni, PhD, are currently developing a tool that utilizes artificial intelligence (AI) and fluorescence-imitating brightfield imaging (FIBI) microscopy technology to optimize modern practices of surgical oncology, particularly for breast and ovarian cancer.1

Madabhushi is the executive director of the Emory Empathetic AI for Health Institute and faculty in the Department of Radiology and Imaging Sciences and the Department of Biomedical Informatics and Pathology at Emory University, and Fereidouni is an associate professor in the Department of Pathology and Laboratory Medicine at Emory University School of Medicine.

CancerNetwork® spoke with Madabhushi and Fereidouni after the Advanced Research Projects Agency for Health (ARPA-H) awarded funding to the MarginCall project. Surgical oncology currently has many inefficiencies; one example highlighted by Fereidouni is the frozen section procedure, in which a surgeon removes tumor tissue from a patient, the tissue is frozen, and the team waits for a pathologist to verify whether the margins are positive or negative for tumor. The MarginCall tool aims to help surgeons make more accurate tumor margin assessments while eliminating the need to freeze a piece of tumor tissue.

MarginCall may also mitigate recurrence rates among patients with breast and ovarian cancer while improving outcomes, reducing the number of patients who need a second surgery. Fereidouni noted that at least 25% of patients with breast cancer need a second surgery, and many times, during the second surgery, the entire breast ends up being removed.

Though this project remains in its early stages, Madabhushi and Fereidouni remain optimistic and hopeful that MarginCall will help patients, surgeons, and pathologists fight cancers.

CancerNetwork: What unmet clinical need led to the beginning of the MarginCall project?

Madabhushi: One of the big challenges that we have in pathology [right now] is being able to provide rapid information and feedback to a surgeon in the operating room when they’re operating on a patient’s cancer. [Another] big challenge that we have in cancer surgeries today is knowing exactly where the [tumor] margin is, and what surgeons tend to do is try to be somewhat conservative, because they don’t want to take out too much normal tissue. They want to take out the tumor, obviously, but they don’t want to take out too much outside the tumor, because [when] more tissue is taken out, that’s going to compromise normal function for the patient.

What typically tends to happen is that the surgeon will take out the tumor and some of the margin, and that piece of tissue gets frozen, and that frozen specimen is then looked at by a pathologist, [and this is called frozen section] … The problem with this current paradigm is that it's somewhat labor-intensive and time-consuming… This could go on from 30 minutes to an hour. It takes time. At that same time, you have the surgeon twiddling her or his thumbs, just waiting for that diagnosis to come back.

The question that we’re trying to address here is: Could we accelerate the time in which we get the information about margin positivity or the lack thereof [back to the surgeon] so that, instead of wasting precious minutes in the operating room, the information gets fed back to the surgeon…in maybe even 2 or 3 minutes?

Fereidouni: Frozen section is destructive. It ruins the tissue; it consumes the tissue; and it affects downstream molecular analysis.

What technology will this tool be using, and how will it lead to improved precision during surgery?

Fereidouni: We are trying to combine 2 cutting-edge technologies here. One of them is the AI tools, which are being developed by [Dr Madabhushi], and the other is the FIBI technology that technically addresses the problem that Dr Madabhushi was explaining. We are trying to image the surface of the tissue without freezing it, so while the patient is [being operated on], we can get the tissue.

We are trying to create an end-to-end solution so that everything is automated, [from] the way that they resect the tissue. They can slice it by hand, and the slices can be automatically stained and imaged very fast, and it’s not destructive. These images can be fed to AI, and the diagnosis can be made either entirely by AI, or the images can help pathologists make the diagnosis.

Madabhushi: First, you have to do rapid imaging, and second, you have to do rapid interpretation. That’s what is exciting here, because we’re taking a high-resolution, rapid imaging technology and combining it with a very powerful, very rapid, efficient AI interpretation algorithm. You take these 2 powerful [technologies], you put them together, and you’re going to get an accurate, rapid diagnosis, which is not subject to interreader variability, and we’re going to be able to get that to the surgeon and therefore minimize the time spent in the [operating room].

What impact will the improved accuracy of tumor margin assessment have for patients with breast and ovarian cancer?

Madabhushi: Apart from reducing the amount of time in the [operating room], hopefully, what we’ll see is that this approach, in the long term, results in better patient outcomes. With this technology, we're reducing errors, and we’re improving the accuracy of the process, therefore ensuring that the surgeon is not leaving any tumor behind. From a patient perspective, that means fewer recurrences, better long-term outcomes, and, overall, better survival.

What are some of the challenges you anticipate facing while developing MarginCall?

Fereidouni: We are going to have fresh tissue to deal with, and it’s not going to be easy in terms of slicing and cutting and avoiding cautery effects from surgery. Speed is [another] issue that we need to handle. For example, the requirement from ARPA-H is that we need to image 10 x 10 x 10 cubic centimeters within 15 minutes. That’s going to require a very fast scanning speed and very fast AI diagnostic tools. The images that we are going to create are going to be different from the standard [hematoxylin and eosin stain] images that we used to see, [that] our AI model [used to run on], and [that pathologists read]. We need to convert them in a way that’s AI- and pathologist-friendly. Data management is going to be a huge problem because we’re going to create gigapixel images, and [there are] going to be 10 or 15 [slices] of them.
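To give a rough sense of the scale these requirements imply, the back-of-envelope sketch below estimates the pixel count, raw data size, and scan rate needed to image the six faces of a 10 x 10 x 10 cm specimen within 15 minutes. The sampling resolution (1 µm/pixel) and pixel depth (3 bytes, RGB) are illustrative assumptions on our part, not actual FIBI specifications:

```python
# Back-of-envelope for the ARPA-H imaging requirement quoted above.
# Assumed parameters (not from the interview): 1 um/pixel sampling,
# 3 bytes/pixel (RGB), and surface imaging of all 6 faces of the cube.

CM_TO_UM = 10_000  # 1 cm = 10,000 micrometers


def face_pixels(side_cm: float, um_per_pixel: float = 1.0) -> int:
    """Pixels needed to image one square face of the specimen."""
    side_px = int(side_cm * CM_TO_UM / um_per_pixel)
    return side_px ** 2


def scan_budget(side_cm: float = 10.0, minutes: float = 15.0,
                um_per_pixel: float = 1.0, bytes_per_pixel: int = 3):
    """Return (total pixels for 6 faces, raw GB, pixels/second required)."""
    total_px = 6 * face_pixels(side_cm, um_per_pixel)
    gigabytes = total_px * bytes_per_pixel / 1e9
    px_per_sec = total_px / (minutes * 60)
    return total_px, gigabytes, px_per_sec


total_px, gb, pps = scan_budget()
print(f"{total_px:.1e} pixels, {gb:.0f} GB raw, {pps:.1e} px/s")
# Each 10 x 10 cm face alone is ~10 gigapixels at this sampling,
# consistent with the "gigapixel images" mentioned above.
```

Even under these simplified assumptions, the raw acquisition rate lands in the tens of megapixels per second, which illustrates why both scanning speed and data management are called out as challenges.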

Madabhushi: One of the things that we are very sensitive to is that, as we develop this, we have to be sensitive to the human aspect. We need to make sure that our pathology colleagues and our surgical colleagues are on board and that they appreciate what we’re trying to accomplish. We work with them very closely to ensure that this fits into the context of the ecosystem, and so that they have a role in that ecosystem. This is a technology that is going to help them.

Are there any patient demographics that this technology will affect more than others?

Fereidouni: We are going to reduce costs, and we are going to create something that is histology in a box, [something that can] be implemented in hospitals that don’t have histology labs. For the time being, when they do surgery, [those hospitals] send tissues out to other places to be [examined]. [The] bottom line is that it’s going to make the surgery easier and more accessible for everyone. That’s a very good thing for global applications and in low-income settings.

Madabhushi: One of the things that we’re acutely aware of is that populations of color and underserved minority populations have not always been beneficiaries of technology. There’s been a certain skepticism and, in some cases, justified distrust of technologies. One of the things that we are thinking about is [ensuring] that, as we develop these technologies, they are being validated in the context of a diverse patient population. One of the things that we want to have is…an advisory board, [as well as] community engagement. We want to make sure that, as we're developing these tools, they are going to benefit multiple different demographics and multiple different populations.

How will this tool change surgical oncology in the future?

Fereidouni: The low-hanging fruit is that we’re going to avoid repeated surgeries; for breast cancer, at least 25% of patients need to come back for a second surgery. It’s not [only that] they need to come back or that it adds anxiety, cost, planning, and all of that. The most important feature of the second surgery is that…it adds complications and changes the patient outcome. [Those patients] are not going to have the same outcome as a patient who has only 1 surgery. Most of the time, [surgeons] try to minimize the amount of tissue that [they] take from the patient, but when it comes to the second surgery, most of the time they remove the entire breast. That…complicates [things] downstream, [like future] treatment plans. What I’m seeing in a much bigger picture is that we are going to facilitate surgery in a way that is going to help surgeons.

Madabhushi: This is going to be disruptive, not just for our surgical oncology colleagues, but it’s also going to be disruptive for our pathology colleagues. As we develop this decision support tool, it is going to mean that a brand-new tissue imaging technology [which pathologists have never seen before] is going to be used. Pathologists are not used to looking at FIBI images, because they have not existed thus far. They have been used to looking at [H and E] images—hematoxylin and eosin-stained images… Pathologists are going to have to start to become more well-versed in understanding what…cancer cells on FIBI look like compared with a standard H and E image.

This is also going to be true for surgeons. They’re going to have to understand and interpret what the predictions are. The way they have been thinking about their practice has been through communication from the pathologist, [with the] pathologist looking at the frozen [tissue] and telling them, “Okay, you’re good” or “You need to take out more tissue.” Now they’re going to rely on this AI-derived or AI-informed technology, which means that the way they practice is going to undergo some [changes]. Even if it’s a minute difference, it’s a change.

Medicine is notoriously conservative, and any disruption means that there’s going to be a ripple effect. It’s something that we’ll have to work…and navigate through, and get our clinical collaborators to understand the importance of that disruption and embrace it. Unless they embrace it, this is not going to scale. We need them to be champions and supporters, and to embrace this so that this technology is not just successful from a technological standpoint but is successful from a deployment [and] implementation standpoint.

Reference

Emory researchers awarded up to $17.6M from ARPA-H to innovate cancer surgery, improve outcomes. Emory Winship Cancer Institute. January 6, 2025. Accessed January 28, 2025. https://tinyurl.com/y3dpmtxr
