Article

Automatic segmentation of Caenorhabditis elegans skeletons in worm aggregations using improved U-Net in low-resolution image sequences

Journal

HELIYON
Volume 9, Issue 4, Article e14715

Publisher

CELL PRESS
DOI: 10.1016/j.heliyon.2023.e14715

Keywords

Caenorhabditis elegans; Skeletonizing; Synthetic dataset; Low-resolution image; U-Net


This article proposes a novel method for predicting the poses of multiple C. elegans worms in cases of aggregation and noise. The method utilizes an improved U-Net model to address the challenges posed by low-resolution images. The model is trained and validated using a custom-generated dataset with a synthetic image simulator, and tested with a dataset of real images, achieving precision greater than 75% and an Intersection over Union (IoU) value of 0.65.
Pose estimation of C. elegans in image sequences is challenging and even more difficult in low-resolution images. Problems range from occlusions, loss of worm identity, and overlaps to aggregations that are too complex or difficult to resolve, even for the human eye. Neural networks, on the other hand, have shown good results in both low-resolution and high-resolution images. However, training a neural network model requires a very large and balanced dataset, which is sometimes impossible or too expensive to obtain. In this article, a novel method for predicting C. elegans poses in cases of multi-worm aggregation and aggregation with noise is proposed. To solve this problem, we use an improved U-Net model capable of obtaining images of the next aggregated worm posture. This neural network model was trained and validated on a custom dataset generated with a synthetic image simulator, and subsequently tested on a dataset of real images. The results obtained were greater than 75% precision and an Intersection over Union (IoU) of 0.65.
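For reference, the Intersection over Union reported above is the standard overlap ratio between a predicted binary mask (here, a worm skeleton or body region) and its ground-truth mask. The following is a minimal NumPy sketch of that metric, not the authors' code; the function name and example masks are illustrative only.

```python
import numpy as np

def iou(pred: np.ndarray, target: np.ndarray) -> float:
    """Intersection over Union between two binary masks of equal shape."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    union = np.logical_or(pred, target).sum()
    if union == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return np.logical_and(pred, target).sum() / union

# Example: two overlapping 3x3 squares inside a 5x5 image
a = np.zeros((5, 5), dtype=bool); a[1:4, 1:4] = True   # 9 foreground pixels
b = np.zeros((5, 5), dtype=bool); b[2:5, 2:5] = True   # 9 foreground pixels
print(round(iou(a, b), 3))  # intersection 4, union 14 -> ~0.286
```

An IoU of 0.65 on real low-resolution images, under this definition, means roughly two thirds of the combined predicted and true worm pixels are shared between the two masks.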


