MGH breast cancer researchers use AI to spot new details in mammograms

A deep learning model developed by researchers at Massachusetts General Hospital (MGH) identified subtle information in breast cancer images that could help better predict a woman’s risk of developing the disease.

Currently, the main methods of judging an individual woman’s risk include checking for a family history of cancer, evaluating any biopsied tissue and noting whether she has given birth.

Screening mammograms—recommended annually by the American Cancer Society for women between the ages of 45 and 54—are typically used by radiologists to assess the density of the breast.

“Why should we limit ourselves to only breast density when there is such rich digital data embedded in every woman’s mammogram?” said Constance Lehman, M.D., Ph.D., MGH’s division chief of breast imaging and senior author of a paper presented at the annual meeting of the Radiological Society of North America.

“Every woman’s mammogram is unique to her just like her thumbprint,” Lehman said. “It contains imaging biomarkers that are highly predictive of future cancer risk, but until we had the tools of deep learning, we were not able to extract this information to improve patient care.”

RELATED: An instant 2nd opinion: Google's DeepMind AI bests doctors at breast cancer screening

The artificial-intelligence-powered algorithm was built and tested using data gathered from more than 245,000 two-dimensional mammograms taken from over 80,000 patients across seven years. These included women with histories of breast cancer, implants or prior biopsies.

When forecasting whether a woman would develop breast cancer within five years of a mammogram, the model was predictive 71% of the time, compared with 61% for widely used methods.

“Traditional risk assessment models do not leverage the level of detail that is contained within a mammogram,” said Leslie Lamb, M.D., a breast radiologist at MGH. “Even the best existing traditional risk models may separate sub-groups of patients but are not as precise on the individual level.”

RELATED: FDA clears Zebra Medical's breast cancer AI for spotting suspicious mammography lesions

Those models and the work of gathering patient histories can also be time-consuming and inconsistent, Lamb added. “A deep learning image-only risk model can provide increased access to more accurate, less costly risk assessment and help deliver on the promise of precision medicine,” she said.

MGH is making deep learning risk information available through reporting software when a radiologist reads a person’s screening mammogram. The computer model has been externally validated in Sweden and Taiwan, and the researchers are currently planning additional studies in larger African American and minority populations.