A drone imagery dataset for semantic segmentation of urban garden ground covers in biodiversity studies


Abstract

Urban gardens promote urban biodiversity by providing diverse ground covers that support habitat provision, pollination, pest control, and soil functions. However, in the absence of high-spatial-resolution imagery, their spatial heterogeneity remains poorly mapped, limiting our understanding of how these features support ecosystem services. This study presents a high-resolution dataset derived from unmanned aerial vehicle (UAV) RGB imagery for the semantic segmentation of diverse ground covers in urban community gardens. The dataset consists of 2,521 images processed into 24 orthomosaics, acquired in 2021–2022 at five garden locations in Munich, Germany. Each image (18.9–146.4 Mpx; 3.2–7.9 mm resolution) is manually annotated into eight ground-cover classes (grass, herb, litter, soil, stone, straw, wood, and woodchip). We evaluated deep-learning segmentation models, including UNet and DeepLabV3+. DeepLabV3+ achieved high classification accuracy in distinguishing these complex classes (overall accuracy = 93.2%; Intersection over Union = 69.4%). This dataset is intended to support research on urban biodiversity, habitat modelling, garden management, and remote sensing, and can be integrated with other fine-scale datasets to advance sustainable urban green planning.
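For readers reusing the dataset, the two metrics reported above (overall accuracy and Intersection over Union) can be derived from a class confusion matrix over the annotated pixels. The sketch below is illustrative only, not the authors' evaluation code; the class list follows the eight ground-cover classes above, and the function names are hypothetical.

```python
import numpy as np

# The eight ground-cover classes annotated in the dataset.
CLASSES = ["grass", "herb", "litter", "soil", "stone", "straw", "wood", "woodchip"]

def confusion_matrix(y_true, y_pred, n_classes=len(CLASSES)):
    """Accumulate an n_classes x n_classes confusion matrix from two
    integer label maps (rows: reference class, columns: predicted class)."""
    cm = np.zeros((n_classes, n_classes), dtype=np.int64)
    np.add.at(cm, (y_true.ravel(), y_pred.ravel()), 1)
    return cm

def overall_accuracy(cm):
    """Fraction of pixels whose predicted class matches the reference."""
    return np.trace(cm) / cm.sum()

def mean_iou(cm):
    """Mean Intersection over Union across classes present in the data."""
    tp = np.diag(cm).astype(float)
    fp = cm.sum(axis=0) - tp   # predicted as class c but labelled otherwise
    fn = cm.sum(axis=1) - tp   # labelled class c but predicted otherwise
    denom = tp + fp + fn
    iou = np.divide(tp, denom, out=np.zeros_like(tp), where=denom > 0)
    return iou[denom > 0].mean()
```

Per-class IoU (the `iou` vector before averaging) is often worth reporting alongside the mean, since rare classes such as straw or stone can behave very differently from dominant ones such as grass.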

Data availability

The dataset generated by the current study is available on Zenodo53 (https://doi.org/10.5281/zenodo.18757882).

Code availability

The code is available on GitHub (https://github.com/paglab/ugc-mapping).

References

  1. Jha, S. et al. Multiple ecosystem service synergies and landscape mediation of biodiversity within urban agroecosystems. Ecol. Lett. 26, 369–383, https://doi.org/10.1111/ELE.14146 (2023).

  2. Lin, B. B. et al. COVID‐19 gardening could herald a greener, healthier future. Front. Ecol. Environ. 19, 491–493, https://doi.org/10.1002/FEE.2416 (2021).

  3. Cabral, I. et al. Ecosystem services of allotment and community gardens: A Leipzig, Germany case study. Urban For. Urban Green. 23, 44–53, https://doi.org/10.1016/J.UFUG.2017.02.008 (2017).

  4. Klein, A. M. et al. Importance of pollinators in changing landscapes for world crops. Proc. R. Soc. B Biol. Sci. 274, 303–313, https://doi.org/10.1098/rspb.2006.3721 (2007).

  5. Plascencia, M. & Philpott, S. M. Floral abundance, richness, and spatial distribution drive urban garden bee communities. Bull. Entomol. Res. 107, 658–667, https://doi.org/10.1017/S0007485317000153 (2017).

  6. Reganold, J. P., Glover, J. D., Andrews, P. K. & Hinman, H. R. Sustainability of three apple production systems. Nature 410, 926–930, https://doi.org/10.1038/35073574 (2001).

  7. Tresch, S. et al. Direct and indirect effects of urban gardening on aboveground and belowground diversity influencing soil multifunctionality. Sci. Rep. 9, 9769, https://doi.org/10.1038/S41598-019-46024-Y (2019).

  8. Igalavithana, A. D. et al. Assessment of soil health in urban agriculture: Soil enzymes and microbial properties. Sustainability 9, 310, https://doi.org/10.3390/su9020310 (2017).

  9. Salomon, M. J., Watts-Williams, S. J., McLaughlin, M. J. & Cavagnaro, T. R. Urban soil health: A city-wide survey of chemical and biological properties of urban agriculture soils. J. Clean. Prod. 275, 122900, https://doi.org/10.1016/J.JCLEPRO.2020.122900 (2020).

  10. Wander, M. M. et al. Developments in Agricultural Soil Quality and Health: Reflections by the Research Committee on Soil Organic Matter Management. Front. Environ. Sci. 7, 1–9, https://doi.org/10.3389/fenvs.2019.00109 (2019).

  11. App, M., Strohbach, M. W., Schneider, A. K. & Schröder, B. Making the case for gardens: Estimating the contribution of urban gardens to habitat provision and connectivity based on hedgehogs (Erinaceus europaeus). Landsc. Urban Plan. 220, 104347, https://doi.org/10.1016/J.LANDURBPLAN.2021.104347 (2022).

  12. Graffigna, S., González-Vaquero, R. A., Torretta, J. P. & Marrero, H. J. Importance of urban green areas’ connectivity for the conservation of pollinators. Urban Ecosyst. 27, 417–426, https://doi.org/10.1007/S11252-023-01457-2 (2024).

  13. Taylor, J. R. & Lovell, S. T. Urban home gardens in the Global North: A mixed methods study of ethnic and migrant home gardens in Chicago, IL. Renew. Agric. Food Syst. 30, 22–32, https://doi.org/10.1017/S1742170514000180 (2015).

  14. Davies, Z. G. et al. A national scale inventory of resource provision for biodiversity within domestic gardens. Biol. Conserv. 142, 761–771, https://doi.org/10.1016/j.biocon.2008.12.016 (2009).

  15. Felderhoff, J., Gathof, A. K., Buchholz, S. & Egerer, M. Vegetation complexity and nesting resource availability predict bee diversity and functional traits in community gardens. Ecol. Appl. 33, e2759, https://doi.org/10.1002/eap.2759 (2023).

  16. Egerer, M. H., Wagner, B., Lin, B. B., Kendal, D. & Zhu, K. New methods of spatial analysis in urban gardens inform future vegetation surveying. Landsc. Ecol. 35, 761–778, https://doi.org/10.1007/S10980-020-00974-1 (2020).

  17. Huang, B., Zhao, B. & Song, Y. Urban land-use mapping using a deep convolutional neural network with high spatial resolution multispectral remote sensing imagery. Remote Sens. Environ. 214, 73–86, https://doi.org/10.1016/J.RSE.2018.04.050 (2018).

  18. Giang, T. L. et al. Coastal landscape classification using convolutional neural network and remote sensing data in Vietnam. J. Environ. Manage. 335, 117537, https://doi.org/10.1016/J.JENVMAN.2023.117537 (2023).

  19. Deshpande, P., Belwalkar, A., Dikshit, O. & Tripathi, S. Historical land cover classification from CORONA imagery using convolutional neural networks and geometric moments. Int. J. Remote Sens. 42, 5148–5175, https://doi.org/10.1080/01431161.2021.1910365 (2021).

  20. Hermosilla, T., Wulder, M. A., White, J. C. & Coops, N. C. Land cover classification in an era of big and open data: Optimizing localized implementation and training data selection to improve mapping outcomes. Remote Sens. Environ. 268, 112780, https://doi.org/10.1016/J.RSE.2021.112780 (2022).

  21. Bansal, K. & Tripathi, A. K. Dual level attention based lightweight vision transformer for streambed land use change classification using remote sensing. Comput. Geosci. 191, 105676, https://doi.org/10.1016/J.CAGEO.2024.105676 (2024).

  22. Yao, J., Zhang, B., Li, C., Hong, D. & Chanussot, J. Extended Vision Transformer (ExViT) for Land Use and Land Cover Classification: A Multimodal Deep Learning Framework. IEEE Trans. Geosci. Remote Sens. 61, https://doi.org/10.1109/TGRS.2023.3284671 (2023).

  23. Wagner, B. & Egerer, M. Application of UAV remote sensing and machine learning to model and map land use in urban gardens. J. Urban Ecol. 8, 1–12, https://doi.org/10.1093/jue/juac008 (2022).

  24. Seitz, B. et al. Land sharing between cultivated and wild plants: urban gardens as hotspots for plant diversity in cities. Urban Ecosyst. 25, 927–939, https://doi.org/10.1007/S11252-021-01198-0 (2022).

  25. Goddard, M. A., Dougill, A. J. & Benton, T. G. Scaling up from gardens: biodiversity conservation in urban environments. Trends Ecol. Evol. 25, 90–98, https://doi.org/10.1016/j.tree.2009.07.016 (2010).

  26. Anderson, K. & Gaston, K. J. Lightweight unmanned aerial vehicles will revolutionize spatial ecology. Front. Ecol. Environ. 11, 138–146, https://doi.org/10.1890/120150 (2013).

  27. Pádua, L. et al. UAS, sensors, and data processing in agroforestry: A review towards practical applications. Int. J. Remote Sens. 38, 2349–2391, https://doi.org/10.1080/01431161.2017.1297548 (2017).

  28. Colomina, I. & Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 92, 79–97, https://doi.org/10.1016/J.ISPRSJPRS.2014.02.013 (2014).

  29. Toth, C. & Jóźków, G. Remote sensing platforms and sensors: A survey. ISPRS J. Photogramm. Remote Sens. 115, 22–36, https://doi.org/10.1016/J.ISPRSJPRS.2015.10.004 (2016).

  30. Pajares, G. Overview and current status of remote sensing applications based on unmanned aerial vehicles (UAVs). Photogramm. Eng. Remote Sens. 81, 281–330, https://doi.org/10.14358/PERS.81.4.281 (2015).

  31. Aasen, H., Honkavaara, E., Lucieer, A. & Zarco-Tejada, P. J. Quantitative remote sensing at ultra-high resolution with UAV spectroscopy: A review of sensor technology, measurement procedures, and data correction workflows. Remote Sens. 10, 1091, https://doi.org/10.3390/rs10071091 (2018).

  32. Cui, H. et al. Knowledge evolution learning: A cost-free weakly supervised semantic segmentation framework for high-resolution land cover classification. ISPRS J. Photogramm. Remote Sens. 207, 74–91, https://doi.org/10.1016/J.ISPRSJPRS.2023.11.015 (2024).

  33. Zhu, Q. et al. Knowledge-guided land pattern depiction for urban land use mapping: A case study of Chinese cities. Remote Sens. Environ. 272, 112916, https://doi.org/10.1016/j.rse.2022.112916 (2022).

  34. Du, S., Du, S., Liu, B. & Zhang, X. Mapping large-scale and fine-grained urban functional zones from VHR images using a multi-scale semantic segmentation network and object based approach. Remote Sens. Environ. 261, 112480, https://doi.org/10.1016/J.RSE.2021.112480 (2021).

  35. Li, Z., Weng, Q., Zhou, Y., Dou, P. & Ding, X. Learning spectral-indices-fused deep models for time-series land use and land cover mapping in cloud-prone areas: The case of Pearl River Delta. Remote Sens. Environ. 308, 114190, https://doi.org/10.1016/J.RSE.2024.114190 (2024).

  36. Ghosh, A., Sharma, R. & Joshi, P. K. Random forest classification of urban landscape using Landsat archive and ancillary data: Combining seasonal maps with decision level fusion. Appl. Geogr. 48, 31–41, https://doi.org/10.1016/J.APGEOG.2014.01.003 (2014).

  37. Chowdhury, M. S. Comparison of accuracy and reliability of random forest, support vector machine, artificial neural network and maximum likelihood method in land use/cover classification of urban setting. Environ. Challenges 14, 100800, https://doi.org/10.1016/J.ENVC.2023.100800 (2024).

  38. Christovam, L. E., Pessoa, G. G., Shimabukuro, M. H. & Galo, M. L. B. T. Land use and land cover classification using hyperspectral imagery: Evaluating the performance of spectral angle mapper, support vector machine and random forest. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. XLII-2/W13, 1841–1847, https://doi.org/10.5194/isprs-archives-XLII-2-W13-1841-2019 (2019).

  39. Kavzoglu, T. & Bilucan, F. Effects of auxiliary and ancillary data on LULC classification in a heterogeneous environment using optimized random forest algorithm. Earth Sci. Informatics 16, 415–435, https://doi.org/10.1007/S12145-022-00874-9 (2023).

  40. Li, Z. et al. Deep learning for urban land use category classification: A review and experimental assessment. Remote Sens. Environ. 311, 114290, https://doi.org/10.1016/J.RSE.2024.114290 (2024).

  41. Alom, M. Z. et al. A state-of-the-art survey on deep learning theory and architectures. Electron. 8, 292, https://doi.org/10.3390/electronics8030292 (2019).

  42. LeCun, Y., Bengio, Y. & Hinton, G. Deep learning. Nature 521, 436–444, https://doi.org/10.1038/nature14539 (2015).

  43. Bengio, Y., Courville, A. & Vincent, P. Representation learning: A review and new perspectives. IEEE Trans. Pattern Anal. Mach. Intell. 35, 1798–1828, https://doi.org/10.1109/TPAMI.2013.50 (2013).

  44. Ball, J. E., Anderson, D. T. & Chan, C. S. Comprehensive survey of deep learning in remote sensing: theories, tools, and challenges for the community. J. Appl. Remote Sens. 11, 042609, https://doi.org/10.1117/1.JRS.11.042609 (2017).

  45. Yuan, Q. et al. Deep learning in environmental remote sensing: Achievements and challenges. Remote Sens. Environ. 241, 111716, https://doi.org/10.1016/j.rse.2020.111716 (2020).

  46. Dalal, N. & Triggs, B. Histograms of oriented gradients for human detection. In Proc. IEEE Conf. Comput. Vis. Pattern Recognit. 886–893, https://doi.org/10.1109/CVPR.2005.177 (2005).

  47. Guo, Z., Wen, J. & Xu, R. A Shape and Size Free-CNN for Urban Functional Zone Mapping With High-Resolution Satellite Images and POI Data. IEEE Trans. Geosci. Remote Sens. 61, https://doi.org/10.1109/TGRS.2023.3320658 (2023).

  48. Fan, Y., Ding, X., Wu, J., Ge, J. & Li, Y. High spatial-resolution classification of urban surfaces using a deep learning method. Build. Environ. 200, 107949, https://doi.org/10.1016/J.BUILDENV.2021.107949 (2021).

  49. Bergado, J. R., Persello, C. & Stein, A. Land use classification using deep multitask networks. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. XLIII-B3-2020, 17–21, https://doi.org/10.5194/ISPRS-ARCHIVES-XLIII-B3-2020-17-2020 (2020).

  50. Reichstein, M. et al. Deep learning and process understanding for data-driven Earth system science. Nature 566, 195–204, https://doi.org/10.1038/s41586-019-0912-1 (2019).

  51. Afrasiabian, Y., Contiz, F., Van Cleemput, E., Egerer, M. & Yu, K. Biodiversity monitoring in urban community gardens using proximal sensing and drone remote sensing. Remote Sens. Appl. Soc. Environ. 39, 101685, https://doi.org/10.1016/J.RSASE.2025.101685 (2025).

  52. Wadoux, A. M. J. C., Heuvelink, G. B. M., de Bruin, S. & Brus, D. J. Spatial cross-validation is not the right way to evaluate map accuracy. Ecol. Modell. 457, 109692, https://doi.org/10.1016/J.ECOLMODEL.2021.109692 (2021).

  53. Afrasiabian, Y. et al. Urban Garden Ground-Cover UAV RGB Orthomosaic Dataset for Semantic Segmentation. Zenodo. Version 2 https://doi.org/10.5281/zenodo.18757882 (2026).

  54. Lizarazo, I. & Elsner, P. Fuzzy segmentation for object-based image classification. Int. J. Remote Sens. 30, 1643–1649, https://doi.org/10.1080/01431160802460062 (2009).

  55. Colkesen, I. & Kavzoglu, T. The use of logistic model tree (LMT) for pixel- and object-based classifications using high-resolution WorldView-2 imagery. Geocarto Int. 32, 71–86, https://doi.org/10.1080/10106049.2015.1128486 (2017).

  56. Goodfellow, I., Bengio, Y. & Courville, A. Deep Learning. (MIT Press, 2016).

  57. Zhao, Y. et al. Land-Unet: A deep learning network for precise segmentation and identification of non-structured land use types in rural areas for green urban space analysis. Ecol. Inform. 87, 103078, https://doi.org/10.1016/J.ECOINF.2025.103078 (2025).

  58. Ronneberger, O., Fischer, P. & Brox, T. U-Net: Convolutional networks for biomedical image segmentation. In Med. Image Comput. Comput.-Assist. Interv. (MICCAI) 234–241 https://doi.org/10.1007/978-3-319-24574-4_28 (2015).

  59. Chen, L. C., Zhu, Y., Papandreou, G., Schroff, F. & Adam, H. Encoder-decoder with atrous separable convolution for semantic image segmentation. Lect. Notes Comput. Sci. (including Subser. Lect. Notes Artif. Intell. Lect. Notes Bioinformatics) 11211 LNCS, 833–851, https://doi.org/10.1007/978-3-030-01234-2_49 (2018).

  60. Iglovikov, V. & Shvets, A. TernausNet: U-Net with VGG11 Encoder Pre-Trained on ImageNet for Image Segmentation. https://arxiv.org/abs/1801.05746v1 (2018).

  61. Deng, J. et al. ImageNet: A Large-Scale Hierarchical Image Database. 2009 IEEE Conf. Comput. Vis. Pattern Recognition, CVPR 2009 248–255 https://doi.org/10.1109/CVPR.2009.5206848 (2009).

  62. Kingma, D. P. & Ba, J. L. Adam: A Method for Stochastic Optimization. 3rd Int. Conf. Learn. Represent. ICLR 2015 – Conf. Track Proc. https://arxiv.org/abs/1412.6980v9 (2014).

  63. Prechelt, L. Early stopping — but when? In Neural Networks: Tricks of the Trade, Lect. Notes Comput. Sci. 1524, 55–69, https://doi.org/10.1007/3-540-49430-8_3 (1998).

  64. Zhang, Y., Li, K., Zhang, G., Zhu, Z. & Wang, P. DFA-UNet: Efficient Railroad Image Segmentation. Appl. Sci. 13, https://doi.org/10.3390/APP13010662 (2023).

  65. Dong, R., Pan, X. & Li, F. DenseU-Net-Based Semantic Segmentation of Small Objects in Urban Remote Sensing Images. IEEE Access 7, 65347–65356, https://doi.org/10.1109/ACCESS.2019.2917952 (2019).

  66. Shahinfar, S., Meek, P. & Falzon, G. “How many images do I need?” Understanding how sample size per class affects deep learning model performance metrics for balanced designs in autonomous wildlife monitoring. Ecol. Inform. 57, 101085, https://doi.org/10.1016/J.ECOINF.2020.101085 (2020).

  67. Sergeev, A. & Del Balso, M. Horovod: fast and easy distributed deep learning in TensorFlow. https://arxiv.org/pdf/1802.05799 (2018).

  68. Zhu, X. X. et al. Deep Learning in Remote Sensing: A Comprehensive Review and List of Resources. IEEE Geosci. Remote Sens. Mag. 5, 8–36, https://doi.org/10.1109/MGRS.2017.2762307 (2017).

Acknowledgements

This work was supported by the Hans Eisenmann-Forum für Agrarwissenschaften (HEF) Seed Fund 2021 and the NH3-Min project. The authors gratefully acknowledge the assistance of Ali Mokhtari, Jürgen Pluss, Wuhua Wang, and Jingcheng Zhang. During the preparation of this work the authors used ChatGPT to improve language. After using this tool, the authors reviewed and edited the content as needed and take full responsibility for the content of the publication.

Funding

Open Access funding enabled and organized by Projekt DEAL.

Author information

Contributions

Yasamin Afrasiabian: Investigation, Writing – Original Draft, Conceptualization, Validation. Chenghao Lu: Methodology, Visualization, Writing – Original Draft, Conceptualization. Anirudh Belwalkar: Conceptualization, Investigation, Validation. Hany Elsharawy: Methodology. Xiaoxin Song: Methodology. Ying Yuan: Methodology. Fei Wu: Methodology. Xiang Su: Methodology, Validation. Elisa Van Cleemput: Conceptualization, Writing – Review & Editing. Monika Egerer: Conceptualization, Resources, Writing – Review & Editing. Kang Yu: Conceptualization, Methodology, Resources, Writing – Review & Editing, Supervision.

Corresponding author

Correspondence to Kang Yu.

Ethics declarations

Competing interests

The authors declare no competing financial interests or personal relationships that could have influenced the work presented in this manuscript.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary information (PDF)

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Afrasiabian, Y., Lu, C., Belwalkar, A. et al. A drone imagery dataset for semantic segmentation of urban garden ground covers in biodiversity studies.
Sci Data (2026). https://doi.org/10.1038/s41597-026-07152-z

  • DOI: https://doi.org/10.1038/s41597-026-07152-z

