Static Segmentation by Tracking: A Label-Efficient Approach for Fine-Grained Specimen Image Segmentation
Zhenyang Feng and 24 other authors
Abstract: We study image segmentation in the biological domain, particularly trait segmentation from specimen images (e.g., butterfly wing stripes, beetle elytra). This fine-grained task is crucial for understanding the biology of organisms, but it traditionally requires manually annotating segmentation masks for hundreds of images per species, making it highly labor-intensive. To address this challenge, we propose a label-efficient approach, Static Segmentation by Tracking (SST), based on a key insight: while specimens of the same species exhibit natural variation, the traits of interest show up consistently. This motivates us to concatenate specimen images into a “pseudo-video” and reframe trait segmentation as a tracking problem. Specifically, SST generates masks for unlabeled images by propagating annotated or predicted masks from the “pseudo-preceding” images. Built upon recent video segmentation models, such as Segment Anything Model 2, SST achieves high-quality trait segmentation with only one labeled image per species, marking a breakthrough in specimen image analysis. To further enhance segmentation quality, we introduce a cycle-consistent loss for fine-tuning, again requiring only one labeled image. Additionally, we demonstrate the broader potential of SST, including one-shot instance segmentation in natural images and trait-based image retrieval.
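The propagation idea in the abstract can be illustrated with SAM 2's video predictor. The sketch below is a minimal, hypothetical example, not the authors' released SST code: it assumes a directory pseudo_video/ of same-species specimen images (with the single labeled image as frame 0), a binary trait mask saved as labeled_mask.npy, and the checkpoint/config names from the public SAM 2 repository.

```python
# Minimal sketch of "static segmentation by tracking" on top of SAM 2's
# video predictor API (facebookresearch/sam2). Paths, file names, and the
# single-object setup are assumptions for illustration only.
import numpy as np
import torch
from sam2.build_sam import build_sam2_video_predictor

CHECKPOINT = "checkpoints/sam2_hiera_large.pt"  # assumed local checkpoint
MODEL_CFG = "sam2_hiera_l.yaml"                 # assumed config name
FRAMES_DIR = "pseudo_video"  # specimen images of one species; labeled image first

predictor = build_sam2_video_predictor(MODEL_CFG, CHECKPOINT)

with torch.inference_mode():
    # Treat the folder of concatenated specimen images as a "pseudo-video".
    state = predictor.init_state(video_path=FRAMES_DIR)

    # Seed frame 0 with the one annotated trait mask (H x W boolean array).
    trait_mask = np.load("labeled_mask.npy")
    predictor.add_new_mask(state, frame_idx=0, obj_id=1, mask=trait_mask)

    # Propagate the mask through the remaining (unlabeled) specimen images,
    # so each "pseudo-preceding" prediction guides the next frame.
    masks = {}
    for frame_idx, obj_ids, mask_logits in predictor.propagate_in_video(state):
        masks[frame_idx] = (mask_logits[0] > 0.0).cpu().numpy()
```

In this sketch each propagated mask plays the role of the annotated or predicted mask from the pseudo-preceding image; the cycle-consistent fine-tuning described in the abstract is not shown here.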
Submission history
From: Wei-Lun Chao
[v1] Sun, 12 Jan 2025 08:27:14 UTC (33,260 KB)
[v2] Fri, 4 Jul 2025 22:40:19 UTC (33,470 KB)