A generative adversarial approach to facilitate archival-quality histopathologic diagnoses from frozen tissue sections

Kianoush Falahkheirkhah, Tao Guo, Michael Hwang, Pheroze Tamboli, Christopher G. Wood, Jose A. Karam, Kanishka Sircar, Rohit Bhargava

Research output: Contribution to journal › Article › peer-review

8 Scopus citations

Abstract

In clinical diagnostics and research involving histopathology, formalin-fixed paraffin-embedded (FFPE) tissue is almost universally favored for its superb image quality. However, tissue processing time (>24 h) can slow decision-making. In contrast, fresh frozen (FF) processing (<1 h) can yield rapid information, but diagnostic accuracy is suboptimal due to lack of clearing, morphologic deformation, and more frequent artifacts. Here, we bridge this gap using artificial intelligence. We synthesize FFPE-like images (“virtual FFPE”) from FF images using a generative adversarial network (GAN) trained on 98 paired kidney samples derived from 40 patients. Five board-certified pathologists evaluated the results in a blinded test. Image quality of the virtual FFPE data was assessed to be high and showed a close resemblance to real FFPE images. Clinical assessments of disease on the virtual FFPE images showed higher inter-observer agreement compared to FF images. The nearly instantaneously generated virtual FFPE images can not only reduce time to information but also facilitate more precise diagnosis from routine FF images without extraneous costs and effort.
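The abstract describes paired image-to-image translation with a GAN: a generator maps an FF patch to a virtual FFPE patch, and its loss combines an adversarial term with a reconstruction term against the paired FFPE ground truth. The sketch below illustrates only that objective (a pix2pix-style loss) on toy NumPy arrays; the `generator` and `discriminator` stand-ins, the λ weight, and the patch shapes are illustrative assumptions, not the authors' actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def discriminator(img):
    # Toy "realness" score in (0, 1): a sigmoid of the mean intensity.
    # A real model would be a convolutional network (e.g., a PatchGAN).
    return 1.0 / (1.0 + np.exp(-img.mean()))

def generator(ff_img):
    # Toy generator: a fixed intensity offset standing in for a learned
    # FF -> virtual-FFPE mapping (assumption for illustration only).
    return ff_img + 0.1

ff = rng.random((8, 8))   # fresh-frozen patch (toy data)
ffpe = ff + 0.1           # paired FFPE ground-truth patch (toy data)

fake = generator(ff)

# Non-saturating adversarial loss for the generator, plus an L1
# reconstruction term against the paired FFPE image.
adv_loss = -np.log(discriminator(fake))
l1_loss = np.abs(fake - ffpe).mean()
lam = 100.0               # L1 weight (pix2pix's default; an assumption here)
g_loss = adv_loss + lam * l1_loss
```

Because the toy generator happens to reproduce the paired target exactly, the L1 term vanishes and the total loss reduces to the adversarial term; in training, gradients of `g_loss` would update the generator's parameters.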

Original language: English (US)
Pages (from-to): 554-559
Number of pages: 6
Journal: Laboratory Investigation
Volume: 102
Issue number: 5
DOIs
State: Published - May 2022

ASJC Scopus subject areas

  • Pathology and Forensic Medicine
  • Molecular Biology
  • Cell Biology

MD Anderson CCSG core facilities

  • Tissue Biospecimen and Pathology Resource
