Two studies published in Radiology from Brigham and Women's Hospital illustrate the use of data mining to build open-source toolkits for extracting radiation exposure metrics.
Radiation dose to organs can be calculated from data on the radiopharmaceutical used and how it was administered, explains Aunt Minnie. The first study involved the Perl Automation for Radiopharmaceutical Selection and Extraction (PARSE) toolkit, which looks for this information in the text of radiology reports. It converts units of radioactivity to a standard format, then uses each unit as an anchor point to search the surrounding text for numeric information about administration. Broader searches match radiopharmaceutical products against a defined list.
Reports without complete information were flagged and used in a quality-control program to determine the frequency of incomplete data and the source.
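The unit-anchored extraction strategy described above can be sketched in a few lines. The example below is a simplified Python illustration, not the actual Perl toolkit; the conversion table and regular expression are assumptions chosen for demonstration.

```python
import re

# Conversion factors to a standard unit (millicuries); illustrative subset only
UNIT_TO_MCI = {
    "mci": 1.0,
    "uci": 0.001,
    "mbq": 1.0 / 37.0,   # 1 mCi = 37 MBq
    "gbq": 1000.0 / 37.0,
}

# A number immediately followed by a recognized radioactivity unit
DOSE_RE = re.compile(r"(\d+(?:\.\d+)?)\s*(mCi|uCi|MBq|GBq)", re.IGNORECASE)

def extract_administrations(report_text):
    """Use each radioactivity unit as an anchor: capture the preceding
    numeric value and normalize it to millicuries."""
    doses = []
    for match in DOSE_RE.finditer(report_text):
        value, unit = float(match.group(1)), match.group(2).lower()
        doses.append(round(value * UNIT_TO_MCI[unit], 3))
    return doses

report = "Following IV administration of 740 MBq (20 mCi) of Tc-99m MDP..."
print(extract_administrations(report))  # both anchors normalize to 20 mCi
```

A report that yields no matches, or fewer fields than expected, would be flagged for the kind of quality-control review the study describes.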
Researchers analyzed 2,359 nuclear medicine reports prepared between September 1985 and March 2011. PARSE analyzed the reports to identify those involving a single radiopharmaceutical administered once or multiple times, as well as those involving multiple radiopharmaceuticals administered multiple times. The authors achieved 97.6 percent accuracy, measured as the percentage of complete reports from which all data fields were correctly extracted, and 98.7 percent precision in the data extraction, Aunt Minnie reports.
In addition to the promise that such repositories pose for patient-safety initiatives and individual dose monitoring, the authors suggested that a data repository could be used to monitor exam protocol changes over time. It also would allow individual cumulative organ dose heat maps to be developed.
The second study, led by Dr. Aaron Sodickson, associate professor of radiology at Harvard Medical School, uses technology to extract anatomy-specific CT radiation exposure metrics from existing digital image archives. In addition to information about dose, the generalized radiation observation kit (GROK) pulls out information about the CT scanner, CT protocol, type of examination, and patient demographics. It allows comparison of imaging facilities even when they use different protocol names for the same exam category.
They started with 54,549 CT encounters performed during a specific week each quarter between 2000 and 2010 on scanners manufactured by GE, Philips, Siemens and Toshiba. They noted a dose screen retrieval rate of 99 percent and an anatomic assignment precision rate of 91 percent.
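Comparing facilities that use different protocol names depends on mapping each free-text protocol string to a common anatomic category. A minimal sketch of that idea follows; the keyword table and function are hypothetical illustrations, since the article does not describe GROK's actual mapping logic.

```python
# Hypothetical keyword-to-region table; a real system needs a far larger vocabulary
ANATOMY_KEYWORDS = {
    "head": "head", "brain": "head",
    "chest": "chest", "thorax": "chest",
    "abd": "abdomen", "abdomen": "abdomen",
    "pelvis": "pelvis",
}

def assign_region(protocol_name):
    """Map a facility-specific CT protocol name to a standard anatomic region."""
    name = protocol_name.lower()
    for keyword, region in ANATOMY_KEYWORDS.items():
        if keyword in name:
            return region
    return "unassigned"  # left for manual review

# Different facility naming conventions collapse to the same categories
for protocol in ("CT HEAD W/O CONTRAST", "Thorax routine", "CT_ABD_PELVIS"):
    print(assign_region(protocol))
```

Protocol names that match no keyword fall into an "unassigned" bucket, which is one plausible source of the 9 percent of encounters that escaped correct anatomic assignment.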
The toolkit is strictly an extraction tool; there is no software to analyze the data. The data also are not correlated with patient size, which tends to be lacking in most archives, the authors noted. They added that the toolkit may need to be customized as scanner hardware and software are upgraded.
In an accompanying editorial, Cynthia McCollough, professor of biomedical engineering and medical physics at the Mayo Clinic in Rochester, Minn., lauded the work.
"The availability of data describing the amount of radiation used for different examinations or procedures, stratified by patient size and clinical indication, is foundational for quality improvement initiatives in the field of radiation dose utilization," she wrote.
Though she commended the American College of Radiology (ACR) for creating its Dose Index Registry, she said each dose index value must be associated with not only a specific anatomic region but also a specific patient size and diagnostic task.
The ACR has been working to create dose standards with its registry as the industry focuses on reducing radiation exposure while preserving image quality. Child-sizing exposure and adjusting dose by body weight have gained considerable attention, since cumulative exposure from multiple scans has been linked to increased cancer risk.
To learn more:
- read the Aunt Minnie article
- here's the Ikuta abstract
- read the Sodickson abstract
- check out the editorial