
Abstract

Tumor budding (TB), a special formation of cancerous cells that bud from the tumor front, is fast becoming a key indicator in modern clinical practice, where it plays a significant role in the prognosis and evaluation of colorectal cancer in histopathological images. Computational methods have been evolving rapidly in the domain of digital pathology, yet the literature lacks computerized approaches that automate the localization and segmentation of TBs in hematoxylin and eosin (HE)-stained images. This research addresses that challenging task by presenting different deep learning architectures designed for the semantic segmentation of TBs in HE images. The proposed Convolutional Neural Network (CNN) incorporates convolution filters with different dilation factors. Multiple experiments on a newly constructed collection of colorectal cancer histopathological images yielded promising performance: a best average intersection over union (IoU) for TB of 0.11, an IoU for non-TB of 0.86, a mean IoU of 0.49, and a weighted IoU of 0.83.
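The abstract names two concrete techniques without giving their configuration: a segmentation CNN whose convolution filters use different dilation factors, and evaluation by per-class, mean, and weighted IoU. The PyTorch sketch below is a minimal illustration of both, not the authors' actual architecture or code; the layer width, the dilation factors (1, 2, 4), the two-class setup (TB vs. non-TB), and all names are assumptions made for illustration.

import torch
import torch.nn as nn

class DilatedSegNet(nn.Module):
    """Toy semantic-segmentation CNN with parallel dilated convolutions.

    Parallel 3x3 branches at dilation factors 1, 2, and 4 enlarge the
    receptive field without downsampling; a 1x1 head then produces
    per-pixel class logits. All sizes are illustrative assumptions.
    """

    def __init__(self, in_channels=3, num_classes=2, width=32):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(in_channels, width, 3, padding=1),
            nn.ReLU(inplace=True),
        )
        # padding=d with a 3x3 kernel keeps the spatial size constant.
        self.branches = nn.ModuleList(
            [nn.Conv2d(width, width, 3, padding=d, dilation=d) for d in (1, 2, 4)]
        )
        self.head = nn.Conv2d(3 * width, num_classes, 1)

    def forward(self, x):
        x = self.stem(x)
        x = torch.cat([branch(x) for branch in self.branches], dim=1)
        return self.head(x)  # (N, num_classes, H, W) logits


def iou_scores(pred, target, num_classes=2):
    """Per-class IoU, mean IoU, and pixel-frequency-weighted IoU.

    `pred` and `target` are integer label maps of identical shape.
    """
    ious, weights = [], []
    for c in range(num_classes):
        inter = ((pred == c) & (target == c)).sum().item()
        union = ((pred == c) | (target == c)).sum().item()
        ious.append(inter / union if union else float("nan"))
        weights.append((target == c).sum().item())
    valid = [(w, i) for w, i in zip(weights, ious) if i == i]  # drop NaN classes
    mean_iou = sum(i for _, i in valid) / len(valid)
    weighted_iou = sum(w * i for w, i in valid) / sum(weights)
    return ious, mean_iou, weighted_iou


# Example: one forward pass and scoring against a placeholder ground truth.
model = DilatedSegNet()
image = torch.randn(1, 3, 128, 128)          # one HE-like RGB tile
pred = model(image).argmax(dim=1)            # per-pixel class labels
truth = torch.randint(0, 2, (1, 128, 128))   # placeholder annotation
print(iou_scores(pred, truth))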

Original language: English
Title of host publication: 2020 IEEE Region 10 Conference, TENCON 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 52-56
Number of pages: 5
ISBN (Electronic): 9781728184555
DOIs
State: Published - Nov 16 2020
Event: 2020 IEEE Region 10 Conference, TENCON 2020 - Virtual, Osaka, Japan
Duration: Nov 16 2020 - Nov 19 2020

Publication series

Name: IEEE Region 10 Annual International Conference, Proceedings/TENCON
Volume: 2020-November
ISSN (Print): 2159-3442
ISSN (Electronic): 2159-3450

Conference

Conference: 2020 IEEE Region 10 Conference, TENCON 2020
Country/Territory: Japan
City: Virtual, Osaka
Period: 11/16/20 - 11/19/20

Keywords

  • Colorectal Cancer
  • Deep Learning
  • Digital Pathology
  • Tumor Budding Detection
