Publication Details
| Category | Text Publication |
| Reference Category | Journals |
| DOI | 10.5194/bg-22-6545-2025 |
| Title (Primary) | Automated mask generation in citizen science smartphone photos and their value for mapping plant species in drone imagery |
| Author | Soltani, S.; Gillespie, L.E.; Exposito-Alonso, M.; Ferlian, O.; Eisenhauer, N.; Feilhauer, H.; Kattenborn, T. |
| Source Title | Biogeosciences |
| Year | 2025 |
| Department | iDiv; RS |
| Volume | 22 |
| Issue | 21 |
| Page From | 6545 |
| Page To | 6561 |
| Language | English |
| Topic | T5 Future Landscapes |
| Data and Software links | https://doi.org/10.5281/zenodo.10019552 ; https://doi.org/10.5281/zenodo.17456239 |
| Abstract | Spatially accurate information on plant species is essential for monitoring in forestry, agriculture and nature conservation. Unoccupied aerial vehicle (UAV)-based remote sensing combined with supervised deep learning segmentation methods can provide accurate segmentation of plant species. However, labeling training data for supervised deep learning methods in vegetation monitoring is a resource-intensive task. Citizen science photographs annotated with species recognition apps could solve this challenge. However, citizen science photographs only have weak species classification labels and no segmentation masks, which are required to train state-of-the-art segmentation methods for fine-grained species recognition. Here, we explore the potential of an automated workflow that integrates the Segment Anything Model (SAM) with Gradient-weighted Class Activation Mapping (Grad-CAM) to automatically generate segmentation masks from citizen science plant photographs. We evaluated the workflow by using the generated masks to train CNN-based segmentation models to segment 10 broadleaf tree species in UAV images. Our results demonstrate that segmentation models can be trained directly using citizen science-sourced plant photographs, automating mask generation without the need for extensive manual labeling. Despite the inherent complexity of segmenting broadleaf tree species, the model achieved an overall acceptable performance for several species. In the context of monitoring vegetation dynamics across space and time, this study highlights the potential of integrating foundation models with citizen science data and remote sensing into automated vegetation mapping workflows, providing a scalable and cost-effective solution for biodiversity monitoring. |
| Persistent UFZ Identifier | https://www.ufz.de/index.php?en=20939&ufzPublicationIdentifier=31710 |
| Soltani, S., Gillespie, L.E., Exposito-Alonso, M., Ferlian, O., Eisenhauer, N., Feilhauer, H., Kattenborn, T. (2025): Automated mask generation in citizen science smartphone photos and their value for mapping plant species in drone imagery. Biogeosciences 22 (21), 6545-6561. https://doi.org/10.5194/bg-22-6545-2025 |
