Using AI to predict breast cancer and personalize care
New technology could predict cancer up to 5 years in advance.
Despite major advances in genetics and modern imaging, a breast cancer diagnosis still catches most patients by surprise.
For some, it comes too late. Later diagnosis means more aggressive treatments, less certain outcomes, and higher medical expenses. As a result, identifying at-risk patients and detecting cancer early have been central pillars of breast cancer research.
With that in mind, a team from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) and Massachusetts General Hospital (MGH) has created a new deep-learning model that can predict from a mammogram if a patient is likely to develop breast cancer as much as five years in the future. Trained on mammograms and known outcomes from over 60,000 MGH patients, the model learned the subtle patterns in breast tissue that are precursors to malignant tumors.
MIT Professor Regina Barzilay, herself a breast cancer survivor, says that the hope is for systems like these to enable doctors to customize screening and prevention programs at the individual level, making late diagnosis a relic of the past.
Although mammography has been shown to reduce breast cancer mortality, there is continued debate on how often to screen and when to start. While the American Cancer Society recommends annual screening starting at age 45, the U.S. Preventive Services Task Force recommends screening every two years starting at age 50.
"Rather than taking a one-size-fits-all approach, we can personalize screening around a woman's risk of developing cancer," says Barzilay, senior author of a new paper about the project out recently in Radiology. "For example, a doctor might recommend that one group of women get a mammogram every other year, while another higher-risk group might get supplemental MRI screening." Barzilay is the Delta Electronics Professor at CSAIL and the Department of Electrical Engineering and Computer Science at MIT and a member of the Koch Institute for Integrative Cancer Research at MIT.
The team's model was significantly better at predicting risk than existing approaches: It accurately placed 31 percent of all cancer patients in its highest-risk category, compared to only 18 percent for traditional models.
Harvard Professor Constance Lehman says that there's previously been minimal support in the medical community for screening strategies that are risk-based rather than age-based.
"This is because before we did not have accurate risk assessment tools that worked for individual women," says Lehman, a professor of radiology at Harvard Medical School and division chief of breast imaging at MGH. "Our work is the first to show that it's possible."
Barzilay and Lehman co-wrote the paper with lead author Adam Yala, a CSAIL PhD student. Other MIT co-authors include PhD student Tal Schuster and former master's student Tally Portnoi.
How it works
Since the first breast-cancer risk model from 1989, development has largely been driven by human knowledge and intuition of what major risk factors might be, such as age, family history of breast and ovarian cancer, hormonal and reproductive factors, and breast density.
However, most of these markers are only weakly correlated with breast cancer. As a result, such models still aren't very accurate at the individual level, and many organizations continue to feel risk-based screening programs are not possible, given those limitations.
Rather than manually identifying the patterns in a mammogram that drive future cancer, the MIT/MGH team trained a deep-learning model to deduce the patterns directly from the data. Using information from more than 90,000 mammograms, the model learned to pick up on patterns too subtle for the human eye to catch.
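The key idea, learning a risk score directly from pixels rather than from a handful of hand-picked risk factors, can be sketched with a deliberately tiny model. The toy logistic regression below stands in for the team's deep network, and the synthetic "images" and outcomes are invented for illustration only.

```python
# Minimal sketch: fit a risk score directly to image data.
# Synthetic data; a stand-in for a real deep network and mammograms.
import numpy as np

rng = np.random.default_rng(0)

# 200 fake 4x4 "images" (flattened to 16 pixels) whose first-pixel
# intensity weakly predicts a synthetic 5-year outcome.
X = rng.normal(size=(200, 16))
y = (X[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(float)

w = np.zeros(16)
b = 0.0
for _ in range(500):  # plain gradient descent on logistic loss
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted risk
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
acc = np.mean((p > 0.5) == (y == 1))
print(f"training accuracy: {acc:.2f}")
```

The point of the sketch is the workflow, not the architecture: the model receives only raw pixel values and known outcomes, and the predictive pattern (here, the first pixel) is discovered during training rather than specified in advance.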
"Since the 1960s radiologists have noticed that women have unique and widely variable patterns of breast tissue visible on the mammogram," says Lehman. "These patterns can represent the influence of genetics, hormones, pregnancy, lactation, diet, weight loss, and weight gain. We can now leverage this detailed information to be more precise in our risk assessment at the individual level."
Making cancer detection more equitable
The project also aims to make risk assessment more accurate for racial minorities, in particular. Many early models were developed on white populations, and were much less accurate for other races. The MIT/MGH model, meanwhile, is equally accurate for white and black women. This is especially important given that black women have been shown to be 42 percent more likely to die from breast cancer due to a wide range of factors that may include differences in detection and access to health care.
"It's particularly striking that the model performs equally well for white and black people, which has not been the case with prior tools," says Allison Kurian, an associate professor of medicine and health research/policy at Stanford University School of Medicine. "If validated and made available for widespread use, this could really improve on our current strategies to estimate risk."
Barzilay says their system could also one day enable doctors to use mammograms to see if patients are at a greater risk for other health problems, like cardiovascular disease or other cancers. The researchers are eager to apply the models to other diseases and ailments, and especially those with less effective risk models, like pancreatic cancer.
"Our goal is to make these advancements a part of the standard of care," says Yala. "By predicting who will develop cancer in the future, we can hopefully save lives and catch cancer before symptoms ever arise."
What can 3D printing do for medicine? The "sky is the limit," says Northwell Health researcher Dr. Todd Goldstein.
- Medical professionals are currently using 3D printers to create prosthetics and patient-specific organ models that doctors can use to prepare for surgery.
- Eventually, scientists hope to print patient-specific organs that can be transplanted safely into the human body.
- Northwell Health, New York State's largest health care provider, is pioneering 3D printing in medicine in three key ways.
Technology may soon grant us immortality, in a sense. Here's how.
- Through the Connectome Project, we may soon be able to map the pathways of the entire human brain, including memories, and create computer programs that evoke the person whose brain was digitized.
- We age because errors build up in our cells, in the mitochondria in particular.
- With CRISPR technology we may soon be able to edit out errors that build up as we age, and extend the human lifespan.
The controversial herbicide Roundup is everywhere, apparently.
- U.S. PIRG tested 20 beers and wines, including organics, and found Roundup's active ingredient in almost all of them.
- In August 2018, a jury awarded a non-Hodgkin's lymphoma victim $289 million in Roundup damages.
- Bayer/Monsanto says Roundup is totally safe. Others disagree.
The pizza giant Domino's partners with a Silicon Valley startup to start delivering pizza by robot.
- Domino's partnered with the Silicon Valley startup Nuro to have robot cars deliver pizza.
- The trial run will begin in Houston later this year.
- The robots will be half the size of a regular car, and customers will unlock them with a PIN code.
Would you have to tip robots? You might be answering that question sooner than you think, as Domino's is about to start using robots to deliver pizza. Later this year, a fleet of self-driving robotic vehicles will be spreading the joy of pizza throughout the Houston area for the famous pizza chain, using delivery cars made by the Silicon Valley startup Nuro.
The startup, founded by Google veterans, raised $940 million in February and has already been delivering groceries for Kroger around Houston. Partnering with the pizza juggernaut Domino's, which delivers close to 3 million pizzas a day, is another logical step for the expanding autonomous-delivery business.
Kevin Vasconi of Domino's explained in a press release that they see these specially-designed robots as "a valuable partner in our autonomous vehicle journey," adding "The opportunity to bring our customers the choice of an unmanned delivery experience, and our operators an additional delivery solution during a busy store rush, is an important part of our autonomous vehicle testing."
How will they work exactly? Nuro explained in its own press release that this "opportunity to use Nuro's autonomous delivery" will be available for some of the customers who order online. Once they opt in, they'll be able to track the car via an app. When the vehicle gets to them, the customers will use a special PIN code to unlock the pizza compartment.
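The customer-side flow described above (opt in online, track the car, then unlock the compartment with a PIN) can be sketched as a few lines of code. The class and method names below are invented for illustration and are not Nuro's or Domino's actual software.

```python
# Hypothetical sketch of the PIN-unlock step in the delivery flow.
class DeliveryBot:
    def __init__(self, pin):
        self._pin = pin               # PIN issued to the customer
        self.compartment_open = False

    def unlock(self, entered_pin):
        # Open the pizza compartment only on an exact PIN match
        if entered_pin == self._pin:
            self.compartment_open = True
        return self.compartment_open

bot = DeliveryBot(pin="4821")
bot.unlock("0000")   # wrong code: compartment stays shut
bot.unlock("4821")   # correct code: compartment opens
print(bot.compartment_open)
```

The design choice the PIN encodes is simple: with no driver aboard, the vehicle itself has to verify that the person standing at the curb is the customer who placed the order.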
Nuro and its competitors Udelv and Robomart have been focusing specifically on developing such "last-mile product delivery" machines, reports Ars Technica. Nuro's specially made R1 vehicle is about half the size of a regular passenger car and doesn't offer any room for a driver. This makes it safer and lighter too, with less potential to cause harm in case of an accident. It also sticks to a fairly low speed of under 25 miles an hour and slams on the brakes at the first sign of trouble.
What also helps such robot cars is "geofencing" technology which confines them to a limited area surrounding the store.
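A geofence of this kind amounts to a distance check: the vehicle compares its GPS position against the store's and refuses to stray beyond a set radius. The sketch below shows the idea with a standard haversine great-circle distance; the coordinates and radius are made up, not Nuro's actual parameters.

```python
# Hypothetical geofence check for a delivery robot.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(robot, store, radius_m=3000):
    """True if the robot is within radius_m meters of the store."""
    return haversine_m(*robot, *store) <= radius_m

store = (29.7604, -95.3698)  # hypothetical Houston store location
print(inside_geofence((29.7700, -95.3600), store))  # nearby -> True
print(inside_geofence((29.9000, -95.0000), store))  # far -> False
```

In practice the fence would be evaluated continuously by the vehicle's navigation stack, but the core test is no more than this containment check.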
For now, the cars are still followed around the neighborhoods by human-driven vehicles, with monitors to make sure nothing goes haywire. But these "chase cars" should be phased out eventually, an important milestone in the evolution of your robot pizza drivers.