Publication Detail

Label-free nuclear staining reconstruction in quantitative phase images using deep learning

VIČAR, T.; GUMULEC, J.; BALVAN, J.; HRACHO, M.; KOLÁŘ, R.

Original Title

Label-free nuclear staining reconstruction in quantitative phase images using deep learning

English Title

Label-free nuclear staining reconstruction in quantitative phase images using deep learning

Language

en

Original Abstract

Fluorescence microscopy is the gold standard in contemporary biological studies. However, since fluorescent dyes can interfere with biological processes, a label-free approach is more desirable. The aim of this study is to create artificial, fluorescence-like nuclear labeling from label-free images using a convolutional neural network (CNN), where training data are easy to obtain if simultaneous label-free and fluorescence acquisition is available. This approach was tested on a holographic microscopy image set of prostate non-tumor (PNT1A) and metastatic tumor (DU145) cells. SegNet and U-Net were tested and provide "synthetic" fluorescence staining that is qualitatively sufficient for further analysis. Further improvement was achieved by adding the bright-field image (a by-product of holographic quantitative phase imaging) to the analysis, and a two-step learning approach, first without and then with augmentation, was introduced. The reconstructed staining was used for nucleus segmentation, where Dice coefficients of 0.784 and 0.781 (for DU145 and PNT1A, respectively) were achieved.

English Abstract

Fluorescence microscopy is the gold standard in contemporary biological studies. However, since fluorescent dyes can interfere with biological processes, a label-free approach is more desirable. The aim of this study is to create artificial, fluorescence-like nuclear labeling from label-free images using a convolutional neural network (CNN), where training data are easy to obtain if simultaneous label-free and fluorescence acquisition is available. This approach was tested on a holographic microscopy image set of prostate non-tumor (PNT1A) and metastatic tumor (DU145) cells. SegNet and U-Net were tested and provide "synthetic" fluorescence staining that is qualitatively sufficient for further analysis. Further improvement was achieved by adding the bright-field image (a by-product of holographic quantitative phase imaging) to the analysis, and a two-step learning approach, first without and then with augmentation, was introduced. The reconstructed staining was used for nucleus segmentation, where Dice coefficients of 0.784 and 0.781 (for DU145 and PNT1A, respectively) were achieved.
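The reported segmentation quality is measured with the Dice coefficient, the standard overlap score between a predicted and a reference nucleus mask. A minimal NumPy sketch of this metric (an illustration, not the authors' code) could look like:

```python
import numpy as np

def dice_coefficient(pred, target):
    """Dice similarity of two binary masks: 2*|A ∩ B| / (|A| + |B|)."""
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    total = pred.sum() + target.sum()
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, target).sum() / total
```

A score of 1.0 means the segmented and ground-truth nuclei coincide exactly, 0.0 means no overlap; the paper's values of ~0.78 indicate substantial but imperfect overlap.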

Documents

BibTex


@inproceedings{BUT147411,
  author="Tomáš {Vičar} and Jaromír {Gumulec} and Jan {Balvan} and Michal {Hracho} and Radim {Kolář}",
  title="Label-free nuclear staining reconstruction in quantitative phase images using deep learning",
  annote="Fluorescence microscopy is the gold standard in contemporary biological studies. However, since fluorescent dyes can interfere with biological processes, a label-free approach is more desirable. The aim of this study is to create artificial, fluorescence-like nuclear labeling from label-free images using a convolutional neural network (CNN), where training data are easy to obtain if simultaneous label-free and fluorescence acquisition is available. This approach was tested on a holographic microscopy image set of prostate non-tumor (PNT1A) and metastatic tumor (DU145) cells. SegNet and U-Net were tested and provide ``synthetic'' fluorescence staining that is qualitatively sufficient for further analysis. Further improvement was achieved by adding the bright-field image (a by-product of holographic quantitative phase imaging) to the analysis, and a two-step learning approach, first without and then with augmentation, was introduced. The reconstructed staining was used for nucleus segmentation, where Dice coefficients of 0.784 and 0.781 (for DU145 and PNT1A, respectively) were achieved.",
  address="Springer, Singapore",
  booktitle="World Congress on Medical Physics and Biomedical Engineering, June 3-8, 2018, Prague, Czech Republic",
  chapter="147411",
  doi="10.1007/978-981-10-9035-6_43",
  howpublished="online",
  institution="Springer, Singapore",
  year="2019",
  month="January",
  pages="239--242",
  publisher="Springer, Singapore",
  type="conference paper"
}