Boston, MA, April 18, 2019 – On-demand crowdsourcing methods can be used to rapidly prototype artificial intelligence (AI) algorithms that replicate the results of an expert radiation oncologist in targeting lung tumors, while reducing associated time costs by 75% to 96.8%, according to a study published in JAMA Oncology.
A team of researchers from the Dana-Farber Cancer Institute, Brigham and Women’s Hospital, Harvard Catalyst, Harvard Business School and the Laboratory for Innovation Science at Harvard, in collaboration with Topcoder, the world’s largest talent network and crowdsourcing platform for developers, designers, data scientists and testers, ran a three-phase, 10-week prize-based crowdsourcing contest to develop an AI-based solution to the critical and resource-intensive task of tumor segmentation. Performed by a radiation oncologist and requiring substantial subspecialty training, tumor segmentation is a key input to a patient’s radiation therapy plan, yet is subject to significant interobserver variance.
“The ability to rapidly develop high-performing algorithms for tumor segmentation, and to do so in a cost-efficient way, has the potential to markedly improve how we deliver lung cancer therapy,” said first author, Dr. Raymond Mak of Brigham and Women’s Hospital and Dana-Farber Cancer Institute. “Reducing the time costs associated with tumor segmentation, without losing any of the accuracy, allows clinicians to reallocate their time and attention to fine-tuning therapy and spending more time with patients, and may also provide a potential solution to the larger skilled oncology workforce crisis experienced by under-resourced health care systems worldwide.”
The paper’s senior author, Dr. Eva Guinan, Professor of Radiation Oncology at Dana-Farber Cancer Institute and Harvard Medical School, noted that “This kind of tool also opens the door to innovative applications in the areas of clinical training, quality assurance, and multi-institutional research trials.”
The authors used a data set of computed tomographic (CT) scans and lung tumor segmentations generated by a thoracic radiation oncology expert from the Brigham and Women’s Hospital and Dana-Farber Cancer Institute. In total, 45 solutions were submitted by contestants across three phases on the Topcoder on-demand crowdsourcing platform. Solutions were then scored by comparing the volumetric segmentation produced by each algorithm on a given patient’s CT scan against the expert segmentation. A higher score reflected an automated segmentation of a given patient’s entire tumor that had a high level of both relative and absolute overlap with the expert segmentation.
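To illustrate the kind of overlap comparison described above, the sketch below computes the Dice similarity coefficient, a standard relative-overlap metric for volumetric segmentation masks. This is a generic illustration, not the contest's actual scoring formula, which combined relative and absolute overlap terms and is detailed in the JAMA Oncology paper.

```python
import numpy as np

def dice_score(auto_mask: np.ndarray, expert_mask: np.ndarray) -> float:
    """Dice similarity coefficient between two binary 3D segmentation masks.

    Illustrative only: a standard overlap metric, not the study's exact
    scoring function. Masks are voxel grids where True marks tumor.
    """
    auto = auto_mask.astype(bool)
    expert = expert_mask.astype(bool)
    intersection = np.logical_and(auto, expert).sum()
    total = auto.sum() + expert.sum()
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * intersection / total

# Identical masks score 1.0; fully disjoint masks score 0.0.
a = np.zeros((4, 4, 4), dtype=bool)
a[1:3, 1:3, 1:3] = True
b = np.zeros((4, 4, 4), dtype=bool)
b[0, 0, 0] = True
print(dice_score(a, a))  # → 1.0
print(dice_score(a, b))  # → 0.0
```

In practice such a metric is computed per patient between the algorithm's mask and the expert's mask, then aggregated across the test set.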
“As the leader in crowdsourced software development and an innovator of the gig economy business model, Topcoder is honored to be part of such humanitarian, groundbreaking research that has a positive impact on global healthcare,” said Mike Morris, CEO of Topcoder. “This team of experts from around the world, across multiple industries, has demonstrated how AI and data science can save time and costs in the collective search for ways to live longer, healthier, better lives.”
The authors found that Phase Two algorithms performed better than commercially available products, and Phase Three algorithms performed at a level comparable to the interobserver variation seen when comparing between human experts. Seventy-five percent of the ensemble segmentations (combinations of Phase Three algorithms) exceeded the lower threshold of interobserver performance, suggesting that the Topcoder contest produced algorithms capable of matching expert performance. Phase Three algorithms produced segmentations in 15 seconds to 2 minutes per scan, compared with the expert’s mean manual segmentation time of 8 minutes per scan.
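The headline 75% to 96.8% time savings follow directly from these per-scan timings; a quick check using the figures quoted above:

```python
# Timings taken from the press release text above.
expert_minutes = 8.0      # mean expert manual segmentation time per scan
slowest_minutes = 2.0     # slowest Phase Three algorithm per scan
fastest_seconds = 15.0    # fastest Phase Three algorithm per scan

reduction_slow = 1 - slowest_minutes / expert_minutes           # 2 min vs 8 min
reduction_fast = 1 - (fastest_seconds / 60.0) / expert_minutes  # 15 s vs 8 min

print(reduction_slow, reduction_fast)  # → 0.75 0.96875
```

The slowest algorithm cuts the time cost by 75%, the fastest by about 96.9% (reported as 96.8% in the study).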
“What’s remarkable is the diversity of solutions we saw from contestants,” said Karim Lakhani, Charles Edward Wilson Professor of Business Administration at Harvard Business School. “Approaches included convolutional neural networks, cluster growth, and random forest algorithms, which reflects the diverse skillsets and community found through crowdsourcing and open innovation, options that may be limited under a traditional development approach.”
The JAMA Oncology article can be found at: https://jamanetwork.com/journals/jamaoncology/fullarticle/2730638