Volume 19, Issue 2 (Journal of Lasers in Medicine, 2022) | lmj 2022, 19(2): 22-31



Optical Bio-imaging Lab, Laser and Plasma Research Institute, Shahid Beheshti University, Tehran, Iran
Abstract:
Introduction: Diffuse optical tomography (DOT) is a non-invasive imaging technique that uses near-infrared light to estimate the optical properties of biological tissue from boundary measurements. Image reconstruction in this method is an inverse, ill-posed, and nonlinear problem, which traditional optimization methods cannot fully overcome. Recently, deep neural networks have been applied to image reconstruction and have achieved significant improvements. In this research, we apply neural network algorithms to reconstruct the absorption coefficient distribution of three-dimensional phantoms and show that deep learning performs reliably in reconstructing DOT images in comparison with the model-based method.
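For context on the model-based baseline mentioned above, DOT reconstruction is commonly cast as a regularized nonlinear least-squares problem (a generic formulation; the exact objective and regularizer used in this work are not specified in the abstract):

    \min_{\mu_a} \; \lVert y - F(\mu_a) \rVert_2^2 + \lambda \, \lVert L \mu_a \rVert_2^2

where y denotes the boundary measurements, F(\mu_a) is the forward model evaluated at the absorption distribution \mu_a, and \lambda and L are a regularization weight and operator.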
Methodology: We generated 17,000 digital cubic phantoms of size 64×64×64 mm³, containing tumors at depths of 21 to 45 mm with varying sizes, shapes, locations, and absorption coefficients. An imaging system with 25 sources and 25 detectors, placed above and below the tissue, was considered. We propose two different neural network architectures: a fully connected network and a convolutional network; an illustrative sketch of both is given below.
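The sketch below illustrates the two architecture families named above in PyTorch. The layer sizes, channel counts, and the measurement dimensionality (25 sources × 25 detectors) are illustrative assumptions, not the paper's exact design.

    import torch
    import torch.nn as nn

    # Illustrative dimensions (assumptions, not taken from the paper):
    N_MEAS = 25 * 25          # e.g. one reading per source-detector pair
    VOL = (64, 64, 64)        # reconstructed absorption-coefficient grid

    class FullyConnectedNet(nn.Module):
        """Maps boundary measurements directly to the 64x64x64 voxel grid."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(N_MEAS, 4096), nn.ReLU(),
                nn.Linear(4096, VOL[0] * VOL[1] * VOL[2]),
            )
        def forward(self, y):
            return self.net(y).view(-1, 1, *VOL)

    class ConvNet(nn.Module):
        """Projects measurements to a coarse 3-D grid, then upsamples with 3-D convolutions."""
        def __init__(self):
            super().__init__()
            self.fc = nn.Linear(N_MEAS, 16 * 8 * 8 * 8)
            self.decoder = nn.Sequential(
                nn.ConvTranspose3d(16, 8, kernel_size=4, stride=2, padding=1), nn.ReLU(),  # 8 -> 16
                nn.ConvTranspose3d(8, 4, kernel_size=4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
                nn.ConvTranspose3d(4, 1, kernel_size=4, stride=2, padding=1),              # 32 -> 64
            )
        def forward(self, y):
            x = self.fc(y).view(-1, 16, 8, 8, 8)
            return self.decoder(x)

    # Quick shape check: both networks output a (batch, 1, 64, 64, 64) volume.
    y = torch.randn(2, N_MEAS)
    print(FullyConnectedNet()(y).shape, ConvNet()(y).shape)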
Findings: The performance of the networks was evaluated with four metrics: mean absolute error (MAE), mean squared error (MSE), peak signal-to-noise ratio (PSNR), and structural similarity index metric (SSIM). The results show that, using the fully connected and convolutional neural networks, MAE was reduced by 86%, MSE was reduced by 81%, and PSNR was doubled.
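These four metrics can be computed directly from the true and reconstructed absorption volumes, for example as in the minimal sketch below (using NumPy and scikit-image); the choice of data_range and any preprocessing are assumptions, since the abstract does not state how the metrics were evaluated.

    import numpy as np
    from skimage.metrics import structural_similarity

    def evaluate(mu_true, mu_pred):
        """MAE, MSE, PSNR, and SSIM between two 64x64x64 absorption maps."""
        mae = np.mean(np.abs(mu_true - mu_pred))
        mse = np.mean((mu_true - mu_pred) ** 2)
        data_range = mu_true.max() - mu_true.min()   # assumed dynamic range
        psnr = 10.0 * np.log10(data_range ** 2 / mse)
        ssim = structural_similarity(mu_true, mu_pred, data_range=data_range)
        return {"MAE": mae, "MSE": mse, "PSNR": psnr, "SSIM": ssim}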
Conclusion: We show that deep learning performs reliably in reconstructing DOT images in comparison with the model-based method.
Article Type: Research | Subject: General
Received: 2022/11/16 | Accepted: 2022/11/1 | Published: 2022/11/1

Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.