Full text
Peer reviewed, Open access
  • CONTEXTS.py (CS.py): A supe...
    Ferrara, Vincenza; Lindberg, Johan; Wästfelt, Anders

    MethodsX, 06/2024, Volume 12
    Journal Article

    The qualitative dimensions of visible features in space can be captured by connecting spatial configurations, arranged in a variety of different ways, to diverse conceptual spaces. By conceptual spaces, we mean mental concepts describing specific spatial configurations present in a geographical area, defined by the contextual relationships among their constitutive elements. This paper presents a new supervised post-classification method that allows the extraction of semantically complex spatial objects from a single image of the Earth, for instance diverse conceptual spaces referring to multiple dimensions of land use (temporal, cultural, social, etc.). Computationally, the method is operationalised by CONTEXTS.py (CS.py), a plugin written in Python for QGIS. CS.py relies on training areas, defined by the user at diverse scales, to identify and extract from the input image conceptual spaces whose spatial contexts share the spatial features present in the training areas. Applied to a case study on the island of Sicily, where millennial land use dynamics have resulted in a mosaic landscape, CS.py detected from an orthophoto diverse conceptual spaces of land use in an area ordinarily classified as a single land cover, thus expanding the capabilities of geospatial analysis to reach additional qualitative dimensions of information from image data.
    • CS.py simplifies a supervised contextual post-classification routine into an easy-to-use, practical and accessible QGIS plugin.
    • CS.py joins a family of tools for supervised object-based classification (e.g. OTB, GRASS), additionally providing the possibility to include contextual information as spatial criteria to train the classification routine.
    • CS.py has broad applications across disciplines investigating landscape from quantitative and qualitative perspectives, supporting both, and in multiple environments.
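
A minimal sketch of the general idea behind such a supervised contextual post-classification, written in plain Python with numpy and rasterio rather than as a QGIS plugin. It is not the authors' CS.py code; the input file name, window size, similarity measure, threshold and training windows below are illustrative assumptions only. The sketch compares the local class composition (the "context") around each location in a pre-classified land-cover raster with composition signatures taken from training areas, and labels locations whose context matches a signature closely enough.

# Illustrative sketch only; not the CS.py plugin's actual implementation.
import numpy as np
import rasterio

WINDOW = 15          # assumed context window size, in pixels
THRESHOLD = 0.8      # assumed minimum similarity to accept a conceptual space

def class_histogram(block, n_classes):
    # Normalised class-composition histogram of a window of class labels.
    counts = np.bincount(block.ravel().astype(np.int64), minlength=n_classes).astype(float)
    return counts / counts.sum()

def context_similarity(hist_a, hist_b):
    # Histogram intersection: 1.0 means identical class composition.
    return np.minimum(hist_a, hist_b).sum()

# Hypothetical pre-classified land-cover raster (integer class labels).
with rasterio.open("land_cover_classified.tif") as src:
    classes = src.read(1)

n_classes = int(classes.max()) + 1
half = WINDOW // 2

# Signatures for two hypothetical conceptual spaces, taken here from fixed
# pixel windows; the plugin instead lets the user digitise training areas.
training_windows = {
    1: classes[100:100 + WINDOW, 100:100 + WINDOW],
    2: classes[400:400 + WINDOW, 250:250 + WINDOW],
}
signatures = {k: class_histogram(v, n_classes) for k, v in training_windows.items()}

output = np.zeros_like(classes)
rows, cols = classes.shape
# Coarse, non-overlapping scan: assign the best-matching conceptual space
# wherever the local context is similar enough to a training signature.
for i in range(half, rows - half, WINDOW):
    for j in range(half, cols - half, WINDOW):
        block = classes[i - half:i + half + 1, j - half:j + half + 1]
        hist = class_histogram(block, n_classes)
        best_label, best_sim = 0, 0.0
        for label, sig in signatures.items():
            sim = context_similarity(hist, sig)
            if sim > best_sim:
                best_label, best_sim = label, sim
        if best_sim >= THRESHOLD:
            output[i - half:i + half + 1, j - half:j + half + 1] = best_label

In the plugin itself the result would be returned as a new QGIS layer; here `output` simply holds the conceptual-space labels as a numpy array, which could be written back to a raster with rasterio if needed.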