Cell Segmentation Guide
This guide helps you choose the optimal segmentation method for your bacterial images and achieve the best possible cell detection results.
Overview
mAIcrobe offers four main segmentation approaches:
| Method | Type | Training Required | Speed | Accuracy |
|---|---|---|---|---|
| StarDist | Deep Learning | Yes (custom model) | Medium | High |
| Cellpose | Deep Learning | No (pre-trained) | Medium | High |
| U-Net | Deep Learning | Yes (custom model) | Medium | High |
| Thresholding | Classical | No | Fast | Medium |
Although StarDist and U-Net models require custom training, mAIcrobe ships with pre-trained models for several bacterial species and imaging modalities. The models are downloaded from our GitHub repository on first use and stored in your user folder under .maicrobecache.
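If you want to see which models have already been downloaded, the short sketch below lists the contents of that cache. It assumes the cache sits directly under your home directory; the folder layout inside .maicrobecache is an implementation detail and may differ.

```python
from pathlib import Path

# Assumed cache location, based on the description above; the internal layout may differ.
cache_dir = Path.home() / ".maicrobecache"

if cache_dir.exists():
    for item in sorted(cache_dir.rglob("*")):
        print(item.relative_to(cache_dir))
else:
    print("Nothing cached yet - models are downloaded on first use.")
```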
Tip: You can always refine the segmentation manually using napari's built-in label editing tools. Check the official napari Labels layer documentation for more details.
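If you prefer working from a script, the sketch below shows one way to open an image together with an editable Labels layer; the file names are placeholders, and napari's paint, fill and erase tools then work on the labels directly.

```python
import napari
from skimage.io import imread

# Placeholder file names - replace with your own image and label mask.
image = imread("cells.tif")
labels = imread("cells_labels.tif").astype(int)

viewer = napari.Viewer()
viewer.add_image(image, name="cells")
viewer.add_labels(labels, name="segmentation")  # editable with napari's label tools
napari.run()
```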
StarDist Models
Best for: Star-convex shaped cells (most bacteria)
Key Features
- Purpose: Deep learning-based segmentation for star-convex shapes
- Requirement: Custom-trained model needed
Getting Started
- Learn more: Check the StarDist paper and repository
- Training: Use our example notebook at notebooks/StarDistSegmentationTraining.ipynb (https://github.com/HenriquesLab/mAIcrobe/blob/main/notebooks/StarDistSegmentationTraining.ipynb)
- Examples: See StarDist training examples
Note: mAIcrobe provides a pre-trained StarDist model for S. aureus SIM images stained with the Nile Red membrane dye. Select "StarDist S.aureus SIM" in the segmentation widget.
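The segmentation widget takes care of loading the bundled weights, but if you want to experiment with StarDist directly in Python, a minimal sketch using the public stardist API looks like this. It uses StarDist's generic demo model, not the mAIcrobe S. aureus weights.

```python
import numpy as np
from csbdeep.utils import normalize
from stardist.models import StarDist2D

# Generic pre-trained 2D model shipped with StarDist (not the mAIcrobe-specific model).
model = StarDist2D.from_pretrained("2D_versatile_fluo")

image = np.random.rand(256, 256)  # replace with your own image
labels, details = model.predict_instances(normalize(image, 1, 99.8))
print("Detected cells:", labels.max())
```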
Cellpose Models
Best for: General cell segmentation across diverse cell types
Key Features
- Purpose: Universal deep learning segmentation model
- Ready to use: Pre-trained cyto3 model included
- Versatile: Trained on diverse cell types and imaging modalities
Getting Started
- Learn more: Check the Cellpose paper and repository
- First run: Model weights download automatically on first use (can take several minutes depending on your internet connection)
- Usage: Select "CellPose cyto3" in the segmentation widget
Tip: Cellpose is great for getting started quickly without training custom models.
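For reference, the same cyto3 model can also be run through the public Cellpose Python API (Cellpose 3.x shown; the file name is a placeholder):

```python
from cellpose import models
from skimage.io import imread

image = imread("cells.tif")  # placeholder path

# "cyto3" is the generalist model behind the "CellPose cyto3" widget option.
model = models.Cellpose(gpu=False, model_type="cyto3")
masks, flows, styles, diams = model.eval(image, diameter=None, channels=[0, 0])
print("Detected cells:", masks.max())
```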
U-Net Models
Best for: Custom applications with specific imaging conditions
Key Features
- Purpose: Convolutional neural network
- Format: Requires Keras model files (.keras)
- Flexible: Can be trained for specific cell types and conditions
- ZeroCostDL4Mic: Recommended tool for training U-Net models that integrate with mAIcrobe
Model Requirements
Your U-Net model should output:
- 0: Background
- 1: Cell boundary
- 2: Cell interior
mAIcrobe converts this to individual cell labels using scikit-image watershed segmentation.
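As a rough illustration of that conversion (not mAIcrobe's exact implementation; the model file name is a placeholder and the class ordering follows the list above):

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.io import imread
from skimage.measure import label
from skimage.segmentation import watershed
from tensorflow import keras

model = keras.models.load_model("my_unet.keras")  # placeholder .keras file
image = imread("cells.tif").astype("float32")     # placeholder; normalise as your model expects

# Per-pixel class map: 0 = background, 1 = cell boundary, 2 = cell interior.
pred = model.predict(image[np.newaxis, ..., np.newaxis])[0]
classes = np.argmax(pred, axis=-1)

interior = classes == 2
cells = classes > 0  # interior plus boundary

# Seed one marker per connected interior region, then flood out to the boundaries.
markers = label(interior)
distance = ndi.distance_transform_edt(cells)
instance_labels = watershed(-distance, markers, mask=cells)
print("Detected cells:", instance_labels.max())
```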
Note: mAIcrobe provides several pre-trained U-Net models for different bacterial species and imaging modalities. In the compute_label widget, select from the following options:
- "Ph.C. S. pneumo" : Phase contrast S. pneumoniae
- "WF FtsZ B. subtilis": Widefield fluorescence B. subtilis expressing FtsZ-GFP
- "Unet S. aureus": Membrane labeled SIM S. aureus
Getting Started
- Learn more: Read the U-Net paper
- Training: Use ZeroCostDL4Mic
- Technical details: See watershed documentation
Thresholding-Based Methods
Best for: Quick analysis without training requirements
Key Features
- Speed: Fastest segmentation method
- No training: Classical image processing
- Trade-off: Lower accuracy for complex images
Available Methods
Isodata Thresholding
- Type: Global automatic threshold
- How it works: Analyzes the image histogram to find a single global threshold (see the sketch below)
- Best for: Images with clear intensity separation. Phase contrast is a typical example.
- Reference: scikit-image documentation
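With scikit-image this is a two-liner; in phase contrast the cells are darker than the background, hence the `<` comparison (the file name is a placeholder):

```python
from skimage.filters import threshold_isodata
from skimage.io import imread

image = imread("phase_contrast.tif")  # placeholder path

t = threshold_isodata(image)
binary = image < t  # cells are darker than the background in phase contrast
```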
Local Average Thresholding
- Type: Adaptive local threshold
- How it works: Computes a per-pixel threshold from the local neighborhood (see the sketch below)
- Best for: Images with uneven illumination
- Reference: scikit-image documentation
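A minimal scikit-image sketch; block_size must be odd and should roughly match the scale of the illumination gradient, and the comparison should be flipped if your cells are darker than the background:

```python
from skimage.filters import threshold_local
from skimage.io import imread

image = imread("fluorescence.tif")  # placeholder path

# "mean" averages the local neighbourhood, matching the "local average" description.
local_thresh = threshold_local(image, block_size=51, method="mean")
binary = image > local_thresh  # cells brighter than their local background
```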
Processing Pipeline
- Threshold → Binary image
- Distance transform → Separate touching cells. See the scipy distance transform docs.
- Watershed → Individual cell labels. See the scikit-image watershed docs.
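Assuming a binary mask like the ones produced above (True inside cells), the three steps map onto SciPy and scikit-image roughly as follows; the min_distance value is an arbitrary example and should be tuned to your cell size:

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

# 1. binary = thresholded mask from either method above (True inside cells)

# 2. Distance transform: distance from each cell pixel to the nearest background pixel.
distance = ndi.distance_transform_edt(binary)

# One marker per local maximum of the distance map, used to split touching cells.
coords = peak_local_max(distance, min_distance=5, labels=binary)
markers = np.zeros(distance.shape, dtype=int)
markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)

# 3. Watershed from the markers, constrained to the binary mask.
cell_labels = watershed(-distance, markers, mask=binary)
```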
Validation and Quality Control
Manual Validation Checklist
Always validate segmentation results:
- Sample size: Check 50-100 randomly chosen cells (see the sampling sketch below)
- Visual inspection: Look for common segmentation errors:
- Under-segmentation (multiple cells as one)
- Over-segmentation (one cell split into multiple)
- Boundary accuracy
- Missing cells
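A quick way to draw that random sample is sketched below, assuming labels is the label image produced by any of the methods above:

```python
import numpy as np

rng = np.random.default_rng(seed=0)  # fixed seed so the same cells can be revisited

cell_ids = np.unique(labels)
cell_ids = cell_ids[cell_ids != 0]  # drop the background label

# Inspect 50 randomly chosen cells (or all of them if there are fewer).
sample = rng.choice(cell_ids, size=min(50, cell_ids.size), replace=False)
print("Cell IDs to inspect:", sorted(sample.tolist()))
```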
Automated Quality Metrics
Key indicators to monitor (a sketch for computing them follows this list):
- Cell count consistency across similar images
- Size distribution of segmented cells - look for outliers
- Circularity of segmented cells - should be species-appropriate
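All three indicators can be computed directly from the label image with skimage.measure.regionprops; a minimal sketch, where circularity is the standard 4π·area/perimeter² measure:

```python
import numpy as np
from skimage.measure import regionprops

props = regionprops(labels)  # labels = label image from any of the methods above

areas = np.array([p.area for p in props])
circularity = np.array(
    [4 * np.pi * p.area / p.perimeter**2 if p.perimeter > 0 else 0.0 for p in props]
)

print("Cell count:", len(props))
print("Area: median", np.median(areas), "IQR", np.percentile(areas, [25, 75]))
print("Circularity: median", round(float(np.median(circularity)), 3))
```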
Further Reading
- Cell Analysis Guide - What to do after segmentation
- Cell Classification Guide - Cell classification workflows
- API Reference - Programmatic control
Scientific References
- StarDist: Schmidt et al., MICCAI 2018
- Cellpose: Stringer et al., Nature Methods 2021
- U-Net: Ronneberger et al., MICCAI 2015
Technical Documentation
- Watershed segmentation: scikit-image docs
- Image filters: scikit-image docs
Next: Learn how to analyze your segmented cells in the Cell Analysis Guide.