TY - JOUR
T1 - Multi-organ segmentation of abdominal structures from non-contrast and contrast enhanced CT images
AU - Yu, Cenji
AU - Anakwenze, Chidinma P.
AU - Zhao, Yao
AU - Martin, Rachael M.
AU - Ludmir, Ethan B.
AU - Niedzielski, Joshua S.
AU - Qureshi, Asad
AU - Das, Prajnan
AU - Holliday, Emma B.
AU - Raldow, Ann C.
AU - Nguyen, Callistus M.
AU - Mumme, Raymond P.
AU - Netherton, Tucker J.
AU - Rhee, Dong Joo
AU - Gay, Skylar S.
AU - Yang, Jinzhong
AU - Court, Laurence E.
AU - Cardenas, Carlos E.
N1 - Publisher Copyright:
© 2022, The Author(s).
PY - 2022/12
Y1 - 2022/12
N2 - Manually delineating upper abdominal organs at risk (OARs) is a time-consuming task. To develop a deep-learning-based tool for accurate and robust auto-segmentation of these OARs, forty pancreatic cancer patients with contrast-enhanced breath-hold computed tomographic (CT) images were selected. We trained a three-dimensional (3D) U-Net ensemble that automatically segments all organ contours concurrently with the self-configuring nnU-Net framework. The tool’s performance was quantitatively assessed on a held-out test set of 30 patients. Five radiation oncologists from three different institutions assessed the performance of the tool using a 5-point Likert scale on an additional 75 randomly selected test patients. The mean (± std. dev.) Dice similarity coefficient values between the automatic segmentation and the ground truth on contrast-enhanced CT images were 0.80 ± 0.08, 0.89 ± 0.05, 0.90 ± 0.06, 0.92 ± 0.03, 0.96 ± 0.01, 0.97 ± 0.01, 0.96 ± 0.01, and 0.96 ± 0.01 for the duodenum, small bowel, large bowel, stomach, liver, spleen, right kidney, and left kidney, respectively. 89.3% (contrast-enhanced) and 85.3% (non-contrast-enhanced) of duodenum contours were scored as a 3 or above, indicating that only minor edits were required. More than 90% of the other organs’ contours were scored as a 3 or above. Our tool achieved a high level of clinical acceptability with a small training dataset and provides accurate contours for treatment planning.
UR - http://www.scopus.com/inward/record.url?scp=85141469199&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85141469199&partnerID=8YFLogxK
U2 - 10.1038/s41598-022-21206-3
DO - 10.1038/s41598-022-21206-3
M3 - Article
C2 - 36351987
AN - SCOPUS:85141469199
SN - 2045-2322
VL - 12
JO - Scientific Reports
JF - Scientific Reports
IS - 1
M1 - 19093
ER -