Minimally interactive segmentation of 4D dynamic upper airway MR images via fuzzy connectedness

Yubing Tong, Jayaram K. Udupa, Dewey Odhner, Caiyun Wu, Sanghun Sin, Mark E. Wagshul, Raanan Arens

Research output: Contribution to journal › Article › peer-review



Purpose: There are several disease conditions that lead to upper airway restrictive disorders. In the study of these conditions, it is important to take into account the dynamic nature of the upper airway. Currently, dynamic magnetic resonance imaging is the modality of choice for studying these diseases. Unfortunately, the contrast resolution obtainable in the images poses many challenges for effective segmentation of the upper airway structures, and no viable methods have been developed to date to solve this problem. In this paper, the authors demonstrate a practical solution by employing an iterative relative fuzzy connectedness delineation algorithm.

Methods: 3D dynamic images were collected at ten equally spaced instants over the respiratory cycle (i.e., 4D) in 20 female subjects with obstructive sleep apnea syndrome. The proposed segmentation approach consists of the following steps. First, image background nonuniformities are corrected; this is followed by correction for the nonstandardness of MR image intensities. Next, standardized image intensity statistics are gathered for the nasopharynx and oropharynx portions of the upper airway as well as the surrounding structures, including air outside the body region, hard palate, soft palate, tongue, and other soft tissues around the airway such as the tonsils (left and right) and adenoid. The affinity functions needed for fuzzy connectedness computation are derived from these tissue intensity statistics. Seeds for the fuzzy connectedness computation are then specified for the airway and the background tissue components. Seed specification is needed only in the 3D image corresponding to the first time instant of the 4D volume; from this information, the 3D volume for the first time point is segmented. Seeds for each subsequent time point are generated automatically from the segmentation of the previous time point, so the process runs without further human interaction and segments the airway structure in the whole 4D volume in 10 s.

Results: Qualitative evaluations of the smoothness and continuity of motion of the entire upper airway, as well as of its transverse sections at critical anatomic locations, indicate that the segmentations are consistent. Quantitative evaluations of the 200 individual 3D volumes and the 20 4D volumes yielded true positive and false positive volume fractions of around 95% and 0.1%, respectively, and mean boundary placement errors under 0.5 mm. The method is robust to variations in the subjective action of seed specification. Compared with an approach that propagates segmentations via a registration technique, the proposed method is more efficient, more accurate, and less prone to error propagation from one respiratory time point to the next.

Conclusions: The proposed method is the first demonstration of a viable and practical approach for segmenting the upper airway structures in dynamic MR images. Compared to registration-based methods, it effectively reduces error propagation and consequently achieves not only more accurate segmentations but also more consistent motion representation. The method is practical, requiring minimal user interaction and computational time.
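The two core mechanisms the abstract describes — fuzzy connectedness computed from a seed set, and automatic seed transfer from one respiratory time point to the next — can be sketched as below. This is a simplified 2D single-object illustration, not the authors' implementation: the Gaussian affinity model, 4-neighbor adjacency, and one-pixel erosion for seed transfer are assumptions made for brevity.

```python
import heapq
import numpy as np

def affinity(img, c, d, mean, std):
    """Affinity of adjacent pixels c, d: high when their average intensity
    matches the target tissue statistics (a Gaussian model, standing in for
    the paper's statistics-derived affinity functions)."""
    avg = 0.5 * (img[c] + img[d])
    return float(np.exp(-0.5 * ((avg - mean) / std) ** 2))

def fuzzy_connectedness(img, seeds, mean, std):
    """Fuzzy connectedness map: for each pixel, the strength of the best
    path from the seed set, where a path's strength is the minimum affinity
    along it. Computed by Dijkstra-style max-min propagation."""
    conn = np.zeros(img.shape)
    heap = []
    for s in seeds:
        conn[s] = 1.0
        heapq.heappush(heap, (-1.0, s))
    while heap:
        neg_k, (y, x) = heapq.heappop(heap)
        k = -neg_k
        if k < conn[y, x]:
            continue  # stale queue entry
        for d in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= d[0] < img.shape[0] and 0 <= d[1] < img.shape[1]:
                k2 = min(k, affinity(img, (y, x), d, mean, std))
                if k2 > conn[d]:
                    conn[d] = k2
                    heapq.heappush(heap, (-k2, d))
    return conn

def propagate_seeds(prev_seg):
    """Seeds for the next time point: interior pixels of the previous time
    point's segmentation. A one-pixel erosion keeps the automatic seeds
    safely inside the (slightly moving) airway."""
    core = prev_seg.copy()
    core[1:] &= prev_seg[:-1]        # require neighbor above
    core[:-1] &= prev_seg[1:]        # require neighbor below
    core[:, 1:] &= prev_seg[:, :-1]  # require neighbor to the left
    core[:, :-1] &= prev_seg[:, 1:]  # require neighbor to the right
    return [tuple(p) for p in np.argwhere(core)]
```

In the paper's iterative *relative* fuzzy connectedness setting, such connectivity maps are computed for the airway and for each competing background tissue, and a voxel is assigned to the object to which it is most strongly connected; the single-object map above shows only the underlying max-min propagation.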

Original language: English (US)
Pages (from-to): 2323-2333
Number of pages: 11
Journal: Medical Physics
Issue number: 5
State: Published - May 1 2016


Keywords

  • 4D MR imaging
  • fuzzy connectedness
  • segmentation
  • upper airway

ASJC Scopus subject areas

  • Biophysics
  • Radiology, Nuclear Medicine and Imaging

