Head and neck cancer patient images for determining auto-segmentation accuracy in T2-weighted magnetic resonance imaging through expert manual segmentations

Carlos E. Cardenas, Abdallah S.R. Mohamed, Jinzhong Yang, Mark Gooding, Harini Veeraraghavan, Jayashree Kalpathy-Cramer, Sweet Ping Ng, Yao Ding, Jihong Wang, Stephen Y. Lai, Clifton D. Fuller, Greg Sharp

Research output: Contribution to journal › Article › peer-review


Abstract

Purpose: The use of magnetic resonance imaging (MRI) in radiotherapy treatment planning has increased rapidly because it can evaluate a patient's anatomy without ionizing radiation and because of its high soft-tissue contrast. For these reasons, MRI has become the modality of choice for longitudinal and adaptive treatment studies. Automatic segmentation could offer many benefits for these studies. In this work, we describe a T2-weighted MRI dataset of head and neck cancer patients that can be used to evaluate the accuracy of head and neck normal tissue auto-segmentation systems through comparisons to the available expert manual segmentations.

Acquisition and validation methods: T2-weighted MRI scans were acquired for 55 head and neck cancer patients. These scans were collected after the radiotherapy computed tomography (CT) simulation scans, using a thermoplastic mask to replicate the patient treatment position. All scans were acquired on a single 1.5 T Siemens MAGNETOM Aera MRI scanner with two large four-channel flex phased-array coils. In the superior–inferior direction, the scans covered the region from the nasopharynx cranially to the supraclavicular lymph node region caudally, when possible. Manual contours were created for the left/right submandibular glands, left/right parotid glands, left/right level II lymph nodes, and left/right level III lymph nodes. These contours underwent quality assurance to ensure adherence to predefined guidelines and were corrected where edits were necessary.

Data format and usage notes: The T2-weighted images and RTSTRUCT files are available in DICOM format. The regions of interest are named according to the nomenclature recommendations of AAPM Task Group 263 (Glnd_Submand_L, Glnd_Submand_R, LN_Neck_II_L, Parotid_L, Parotid_R, LN_Neck_II_R, LN_Neck_III_L, LN_Neck_III_R). This dataset is available on The Cancer Imaging Archive (TCIA), hosted by the National Cancer Institute, under the collection “AAPM RT-MAC Grand Challenge 2019” (https://doi.org/10.7937/tcia.2019.bcfjqfqb).

Potential applications: This dataset provides head and neck patient MRI scans for evaluating auto-segmentation systems on T2-weighted images. Additional anatomies could be provided at a later time to enhance the existing library of contours.
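As an illustration of the usage notes above, the following is a minimal sketch of how the TG-263 structure names in an RTSTRUCT file from this collection might be checked. It assumes the collection has already been downloaded from TCIA, uses the pydicom library, and the file path shown is a hypothetical example, not the collection's actual directory layout.

```python
# Minimal sketch: list ROI names in one RTSTRUCT file and compare them
# against the TG-263 names described for this dataset.
# Assumes pydicom is installed; the file path below is hypothetical.
import pydicom

# Expected TG-263 region-of-interest names from the dataset description.
EXPECTED_ROIS = {
    "Glnd_Submand_L", "Glnd_Submand_R",
    "Parotid_L", "Parotid_R",
    "LN_Neck_II_L", "LN_Neck_II_R",
    "LN_Neck_III_L", "LN_Neck_III_R",
}

# Read one patient's RTSTRUCT file (hypothetical path).
rtstruct = pydicom.dcmread("RT-MAC/patient_001/rtstruct.dcm")

# Collect the ROI names stored in the structure set.
roi_names = {roi.ROIName for roi in rtstruct.StructureSetROISequence}

print("ROIs found:", sorted(roi_names))
print("Missing expected ROIs:", sorted(EXPECTED_ROIS - roi_names))
```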

Original language: English (US)
Pages (from-to): 2317-2322
Number of pages: 6
Journal: Medical Physics
Volume: 47
Issue number: 5
DOIs
State: Published - Jun 1 2020

Keywords

  • MRI
  • automatic segmentation
  • grand challenge
  • head and neck cancer
  • radiation therapy

ASJC Scopus subject areas

  • Biophysics
  • Radiology, Nuclear Medicine and Imaging

MD Anderson CCSG core facilities

  • Clinical Trials Office
  • High Performance Compute and Data Storage
