dc.contributor.author | Eren, Furkan | |
dc.contributor.author | Aslan, Mete | |
dc.contributor.author | Kanarya, Dilek | |
dc.contributor.author | Uysallı, Yiğit | |
dc.contributor.author | Aydın, Musa | |
dc.contributor.author | Kiraz, Berna | |
dc.contributor.author | Aydın, Ömer | |
dc.contributor.author | Kiraz, Alper | |
dc.date.accessioned | 2022-09-30T14:20:58Z | |
dc.date.available | 2022-09-30T14:20:58Z | |
dc.date.issued | 2022 | en_US |
dc.identifier.citation | EREN, Furkan, Mete ASLAN, Dilek KANARYA, Yiğit UYSALLI, Musa AYDIN, Berna KİRAZ, Ömer AYDIN & Alper KİRAZ. "DeepCAN: A Modular Deep Learning System for Automated Cell Counting and Viability Analysis". IEEE Journal of Biomedical and Health Informatics, 20 (2022): 1-9. | en_US |
dc.identifier.uri | https://hdl.handle.net/11352/4179 | |
dc.description.abstract | Precise and quick monitoring of key cytometric features such as cell count, cell size, cell morphology, and DNA content is crucial for applications in biotechnology, medical sciences, and cell culture research. Traditionally, image cytometry relies on a hemocytometer accompanied by visual inspection by an operator under a microscope. This approach is prone to error due to the subjective decisions of the operator. Recently, deep learning approaches have emerged as powerful tools enabling quick and highly accurate image cytometric analysis that is easily generalizable to different cell types. Leading to simpler, more compact, and less expensive solutions, these approaches have revealed image cytometry as a viable alternative to flow cytometry or Coulter counting. In this study, we demonstrate a modular deep learning system, DeepCAN, that provides a complete solution for automated cell counting and viability analysis. DeepCAN employs three neural network blocks, called Parallel Segmenter, Cluster CNN, and Viability CNN, that are trained for initial segmentation, cluster separation, and cell viability analysis, respectively. The Parallel Segmenter and Cluster CNN blocks achieve highly accurate segmentation of individual cells, while the Viability CNN block performs viability classification. A modified U-Net network, a well-known deep neural network model for bioimage analysis, is used in the Parallel Segmenter, while the LeNet-5 architecture and its modified version, called Opto-Net, are used for the Cluster CNN and the Viability CNN, respectively. We train the Parallel Segmenter using 15 images of A2780 cells and 5 images of yeast cells, containing, in total, 14742 individual cell images. Similarly, 6101 and 5900 A2780 cell images are employed for training the Cluster CNN and Viability CNN models, respectively. 2514 individual A2780 cell images are used to test the overall segmentation performance of the Parallel Segmenter combined with the Cluster CNN, revealing high Precision/Recall/F1-Score values of 96.52%/96.45%/98.06%. The overall cell counting/viability analysis performance of DeepCAN is tested with A2780 (2514 cells), A549 (601 cells), Colo (356 cells), and MDA-MB-231 (887 cells) cell images, revealing high counting/viability analysis accuracies of 96.76%/99.02%, 93.82%/95.93%, 92.18%/97.90%, and 85.32%/97.40%, respectively. | en_US |
dc.language.iso | eng | en_US |
dc.publisher | IEEE | en_US |
dc.relation.isversionof | 10.1109/JBHI.2022.3203893 | en_US |
dc.rights | info:eu-repo/semantics/embargoedAccess | en_US |
dc.subject | Bioimage Segmentation | en_US |
dc.subject | Bright Field Imaging | en_US |
dc.subject | Cell Counting | en_US |
dc.subject | Convolutional Neural Network | en_US |
dc.subject | Viability Analysis | en_US |
dc.title | DeepCAN: A Modular Deep Learning System for Automated Cell Counting and Viability Analysis | en_US |
dc.type | article | en_US |
dc.relation.journal | IEEE Journal of Biomedical and Health Informatics | en_US |
dc.contributor.department | FSM Vakıf University, Faculty of Engineering, Department of Computer Engineering | en_US |
dc.identifier.issue | 20 | en_US |
dc.identifier.startpage | 1 | en_US |
dc.identifier.endpage | 9 | en_US |
dc.relation.publicationcategory | Article - International Peer-Reviewed Journal - Institutional Faculty Member | en_US |
dc.contributor.institutionauthor | Aydın, Musa | |
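
The abstract in the dc.description.abstract field describes a three-stage pipeline: a modified U-Net (Parallel Segmenter) for initial segmentation, a LeNet-5-based Cluster CNN for cluster separation, and an Opto-Net-based Viability CNN for live/dead classification. The following is a minimal, hypothetical PyTorch sketch of that data flow only. The class names TinySegmenter and LeNet5, the function deepcan_pipeline, and all layer sizes are illustrative assumptions, not the authors' published implementation; untrained random weights are used purely to show how the stages connect.

import torch
import torch.nn as nn

class TinySegmenter(nn.Module):
    # Stand-in for the modified U-Net "Parallel Segmenter" block:
    # a single down/up encoder-decoder producing a foreground probability mask.
    def __init__(self):
        super().__init__()
        self.down = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.up = nn.Sequential(
            nn.ConvTranspose2d(8, 8, 2, stride=2), nn.ReLU(),
            nn.Conv2d(8, 1, 3, padding=1))
    def forward(self, x):
        return torch.sigmoid(self.up(self.down(x)))

class LeNet5(nn.Module):
    # Classic LeNet-5-style classifier; the abstract uses this family for the
    # Cluster CNN, and a modified variant (Opto-Net) for the Viability CNN.
    # Expects 1x32x32 crops.
    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5), nn.Tanh(), nn.AvgPool2d(2),
            nn.Conv2d(6, 16, kernel_size=5), nn.Tanh(), nn.AvgPool2d(2))
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120), nn.Tanh(),
            nn.Linear(120, 84), nn.Tanh(),
            nn.Linear(84, num_classes))
    def forward(self, x):
        return self.classifier(self.features(x))

def deepcan_pipeline(image, segmenter, cluster_cnn, viability_cnn):
    # Stage 1: segment foreground cells from the bright-field image.
    mask = segmenter(image)
    # Stages 2-3: the real system crops each connected component of the mask;
    # here a fixed 32x32 crop stands in for one detected cell.
    crop = image[..., :32, :32]
    is_cluster = cluster_cnn(crop).argmax(dim=1)   # cluster vs. single cell
    is_viable = viability_cnn(crop).argmax(dim=1)  # live vs. dead
    return mask, is_cluster, is_viable

if __name__ == "__main__":
    img = torch.rand(1, 1, 64, 64)  # dummy single-channel bright-field image
    with torch.no_grad():
        mask, cluster, viable = deepcan_pipeline(
            img, TinySegmenter(), LeNet5(2), LeNet5(2))
    print(mask.shape, cluster.item(), viable.item())

In the published system each stage is trained separately on its own labeled crops (segmentation masks, cluster/single labels, live/dead labels); this sketch only mirrors the modular composition, which is what lets each block be retrained or swapped independently.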