• Site-connected habitat amount accounts for movements across habitat networks.
• Integrated habitat amount evaluates aggregations of connected habitat in area units.
• Together these provide a consistent and equitable method for calculating potential occupancy.
• Raster-based assessment best represents fine-grained, heterogeneous habitats.
• We resolve issues of equitable trading between habitat quality and extent.
Landscape connectivity measures based on metapopulation theory were developed over 20 years ago. Initially, they applied classic metapopulation models to simple patch-based representations of landscapes using vector spatial data structures. Realism was improved by developing dynamic estimates of occupancy and metapopulation capacity, the latter providing a measure of the integrated habitat amount. Such measures are used to estimate the ability of habitat networks to support metapopulation persistence. The original methods for occupancy mapping and metapopulation capacity were later adapted to work with fine-grained, continuous-value raster data. That step shifted the method outside the classic metapopulation model and left some methodological issues unresolved; in particular, what has been termed the deceptive paradox of patch-based connectivity, whereby perverse and, as we describe them, inequitable results are obtained through arbitrary circumscription of the analysis grid and through the trading of habitat between quality, extent and connectivity. We provide a solution to this issue and apply it within the frame of Drielsma and Ferrier's (2009) raster-based Rapid Evaluation of Metapopulation Persistence (REMP).
We demonstrate our solution using simple hypothetical examples and, to show its practicality in real-world settings, apply it to habitat suitability mapping of the White-browed Treecreeper (Climacteris affinis) in south-eastern New South Wales, Australia.
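The classical patch-based quantity underlying this line of work, metapopulation capacity, is the leading eigenvalue of a landscape matrix built from patch areas and inter-patch distances (Hanski and Ovaskainen's patch formulation, not REMP's raster extension). A minimal sketch, assuming exponential dispersal decay at rate α:

```python
import numpy as np

def metapopulation_capacity(areas, coords, alpha):
    """Leading eigenvalue of the landscape matrix:
    m_ij = exp(-alpha * d_ij) * A_i * A_j for i != j, 0 on the diagonal."""
    areas = np.asarray(areas, dtype=float)
    coords = np.asarray(coords, dtype=float)
    # Pairwise Euclidean distances between patch centroids
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    m = np.exp(-alpha * d) * np.outer(areas, areas)
    np.fill_diagonal(m, 0.0)          # no self-colonisation term
    return float(np.max(np.linalg.eigvalsh(m)))  # matrix is symmetric
```

For two unit-area patches one distance unit apart with α = 1, the matrix is [[0, e⁻¹], [e⁻¹, 0]] and the capacity is e⁻¹.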
Computer screens often constrain the level of detail and clarity of displays. High-density data require a predefined strategy for selecting significant features hierarchically to allow interactive data zooming. Although many methods are available for hierarchically selecting rivers from vector data, approaches that operate directly on raster data better maintain accuracy during generalization when the original river data are in raster format. In this study, a raster-based approach is proposed for hierarchical superpixel selection in river networks. Linear spectral clustering segmentation was applied to divide the original raster river networks into superpixels at multiple levels. A graph was constructed to organize the generated river-network superpixels based on the distances between adjacent superpixels, with edge weights determined by four types of rules. Finally, the total weight values were ranked, the river-network superpixels were selected according to their weights, and the redundant pixels at river-network intersections were removed. Compared with the traditional vector selection method, the proposed superpixel selection method effectively accounts for river width without artificial river grading and preserves the main structural and connectivity features during hierarchical mapping. Notably, the average geometry and density changes decreased by 15.8% and 5.1%, respectively.
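The weighting-and-ranking step can be sketched generically. The abstract does not detail the four rule types, so the rule functions, their weights, and the feature names below are placeholder assumptions:

```python
def rank_and_select(superpixels, rules, keep):
    """Rank river-network superpixels by a total weight and keep the top ones.

    superpixels: dict of superpixel id -> feature dict (features are hypothetical)
    rules: list of (score_fn, weight) pairs standing in for the paper's four rules
    keep: number of superpixels to retain at this hierarchy level
    """
    totals = {
        sid: sum(w * fn(feat) for fn, w in rules)
        for sid, feat in superpixels.items()
    }
    # Highest total weight first; the top `keep` survive generalization
    order = sorted(totals, key=totals.get, reverse=True)
    return order[:keep], totals
```

At each zoom level, `keep` would shrink, so only the most heavily weighted superpixels (e.g. wide, well-connected main stems) remain.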
► Modified GNN method provides means for regional-scale mapping of tree species.
► Integrates field plots with moderate-resolution raster data in ecological model.
► Power of CCA model bolstered by both ecological zone and phenology data.
► At scales assessed, model results compared favorably with plot-based estimates.
► Accuracy improved by using stratification layer and via optimal value for k.
The paper describes an efficient approach for mapping multiple individual tree species over large spatial domains. The method integrates vegetation phenology derived from MODIS imagery and raster data describing relevant environmental parameters with extensive field plot data of tree species basal area to create maps of tree species abundance and distribution at a 250-m pixel size for the entire eastern contiguous United States. The approach uses the modeling techniques of k-nearest neighbors and canonical correspondence analysis, where model predictions are calculated using a weighting of nearest neighbors based on proximity in a feature space derived from the model. The approach also utilizes a stratification derived from the 2001 National Land-Cover Database tree canopy cover layer. Data pre-processing is also described, which includes the use of Fourier series transformation for data reduction and characterizing seasonal vegetation phenology patterns that are apparent in the MODIS imagery.
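The Fourier step can be illustrated with a minimal sketch: an annual reflectance series is compressed to its mean plus the first few harmonics, a standard way of summarising seasonal phenology (the harmonic count and series length below are illustrative assumptions, not the paper's settings):

```python
import numpy as np

def harmonic_coefficients(series, n_harmonics=2):
    """Compress an annual time series (e.g. MODIS composites) to its mean
    plus the first few Fourier harmonics -- a compact phenology descriptor."""
    series = np.asarray(series, dtype=float)
    n = len(series)
    fft = np.fft.rfft(series)
    feats = [fft[0].real / n]                      # annual mean
    for h in range(1, n_harmonics + 1):
        feats += [2 * fft[h].real / n,             # cosine amplitude
                  2 * fft[h].imag / n]             # sine amplitude (sign per rfft)
    return feats
```

A pure annual cosine therefore reduces to a zero mean and a unit first-harmonic cosine amplitude, so 23 or 46 composites per year collapse to a handful of features.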
A suite of assessment procedures is applied to each of the modeled datasets presented. These indicate high accuracies, at the scales of assessment used, for total live-tree basal area per hectare and for many of the most common tree species in the study area. The end result is an approach that enables mapping of individual tree species distributions, while retaining much of the species covariance found on the forest inventory plots, at a level of spatial detail approaching that required for many regional management and planning applications. The approach has the potential for operational application in simultaneously mapping the distribution and abundance of numerous common tree species across large spatial domains.
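The distance-weighted k-nearest-neighbour prediction described above can be sketched as follows; the inverse-distance weighting and the toy one-dimensional feature space are assumptions standing in for the paper's CCA-derived space:

```python
import numpy as np

def knn_predict(train_X, train_y, query, k=5, eps=1e-6):
    """Distance-weighted k-NN: predict a plot attribute (e.g. basal area)
    for a pixel from its k nearest field plots in feature space."""
    train_X = np.asarray(train_X, dtype=float)
    train_y = np.asarray(train_y, dtype=float)
    d = np.linalg.norm(train_X - np.asarray(query, dtype=float), axis=1)
    idx = np.argsort(d)[:k]            # k nearest plots
    w = 1.0 / (d[idx] + eps)           # closer plots weigh more
    return float(np.sum(w * train_y[idx]) / np.sum(w))
```

Because each pixel's prediction is a weighted combination of whole plots, covariance between co-occurring species on the same plots carries over into the maps, which is the property the paragraph above highlights.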
Numerous methods based on square rasters have been proposed for polygon generalization. However, these methods ignore the inconsistent distance measurement among the neighborhoods of squares, which may result in imbalanced generalization in different directions. As an alternative raster, a hexagon has consistent connectivity and isotropic neighborhoods. This study proposes a hexagon-based method for polygon generalization using morphological operators. First, we defined three generalization operators (aggregation, elimination, and line simplification) based on hexagonal morphological operations. We then used corrective operations with selection, skeleton, and exaggeration to detect, classify, and correct unreasonably reduced narrow parts of the polygons. To assess the effectiveness of the proposed method, we conducted experiments comparing the hexagonal raster to square-raster and vector data. Unlike vector-based methods, in which separate algorithms simplify either areal objects or exterior boundaries, the hexagon-based method performs both simplifications simultaneously. Compared to the square-based method, the results of the hexagon-based method were more balanced across neighborhood directions, matched the original polygons better, and had smoother simplified boundaries. Moreover, it ran faster than the square-based method: the minimal time difference was under 1 minute and the maximal difference exceeded 50 minutes.
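The appeal of hexagons for morphological operators is that every cell has exactly six equidistant neighbours, so dilation and erosion behave identically in all directions. A minimal sketch in axial coordinates (the coordinate scheme and one-ring structuring element are assumptions, not the authors' implementation):

```python
# Axial-coordinate hex grid: the six neighbour offsets of any cell
HEX_DIRS = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]

def hex_dilate(cells):
    """Dilation: add all six neighbours of every filled hex."""
    out = set(cells)
    for q, r in cells:
        out.update((q + dq, r + dr) for dq, dr in HEX_DIRS)
    return out

def hex_erode(cells):
    """Erosion: keep only hexes whose six neighbours are all filled."""
    cells = set(cells)
    return {(q, r) for q, r in cells
            if all((q + dq, r + dr) in cells for dq, dr in HEX_DIRS)}

def hex_close(cells):
    """Closing = dilation then erosion; aggregates nearby polygon parts."""
    return hex_erode(hex_dilate(cells))
```

On a square raster the same operators would use neighbours at two different distances (orthogonal vs diagonal), which is exactly the anisotropy the hexagonal grid avoids.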
This paper proposes a GPU algorithm for converting raster data to hexagonal DGGSs by redesigning the encoding and decoding mechanisms. The researchers first designed a data structure based on rhombic tiles to convert the hexagonal DGGS to a texture format acceptable to GPUs, thus avoiding the irregularity of the hexagonal DGGS. Then, encoding and decoding methods for the tile data based on space-filling curves were designed to reduce the amount of data transmitted from the CPU to the GPU. Finally, the researchers improved algorithmic efficiency through thread design. To validate this design, raster integration experiments were conducted on the global ASTER 30 m digital elevation model (DEM) data, and the results showed that the raster integration accuracy of the algorithm was around 1 m, while its efficiency was more than 600 times that of the equivalent CPU-executed algorithm for integrating raster data into hexagonal DGGS data. The researchers therefore believe this study provides a feasible method for the efficient and stable integration of massive raster data on a hexagonal grid, which may well support the organization of massive raster data in the field of GIS.
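Space-filling-curve encoding of tile addresses can be illustrated with Z-order (Morton) interleaving, one common such curve; the curve and bit widths actually used in the paper may differ:

```python
def morton_encode(x, y, bits=16):
    """Interleave the bits of (x, y) into one Z-order (Morton) index,
    linearising a 2-D tile address into a single transferable integer."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)        # x bits -> even positions
        code |= ((y >> i) & 1) << (2 * i + 1)    # y bits -> odd positions
    return code

def morton_decode(code, bits=16):
    """Invert the interleaving back to the (x, y) tile address."""
    x = y = 0
    for i in range(bits):
        x |= ((code >> (2 * i)) & 1) << i
        y |= ((code >> (2 * i + 1)) & 1) << i
    return x, y
```

Because nearby (x, y) addresses map to nearby curve indices, tiles that are spatially adjacent tend to be contiguous in the linearised stream, which is what makes such encodings attractive for CPU-to-GPU transfer.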
Simplification is a classical research area in automated cartographic generalization. Over the past few decades, numerous scholars have proposed methods for polygon and line simplification, most of which have focused on vector data. However, with the rapid development of computer vision technology, unstructured image analysis and processing provides a wealth of information as well as new challenges. In this article, we therefore propose a new method for simplifying polygonal and linear features: a superpixel segmentation (SUSS) method specially designed for image data. In this method, polygonal boundaries are first divided by a superpixel algorithm called simple linear iterative clustering. Then, three types of curves (convex, concave, and flat) are globally simplified by comparing and selecting superpixels. Finally, uneven local features are removed by Fourier descriptors. In addition, the proposed SUSS method is extended to linear features, and it maintains topological relationships. To demonstrate the effectiveness of this approach, we perform experiments with contour and water-area data. Compared with the classic Douglas-Peucker and Wang-Muller algorithms, the proposed method properly simplifies the curves of polygonal and linear features while maintaining their essential shapes, and it maintains a steady change in area for large-scale applications while effectively avoiding self-intersection issues. Compared with the typical smoothing and Raposo algorithms, the proposed SUSS method can simplify lines at different scales and guarantee effective smoothing while maintaining displacement.
Given a grid of cells, each having a value indicating its cost per unit area, a variant of the least-cost path problem is to find a corridor of a specified width connecting two termini such that its cost-weighted area is minimized. A computationally efficient method exists for finding such corridors, but as is the case with conventional raster-based least-cost paths, their incremental orientations are limited to a fixed number of (typically eight orthogonal and diagonal) directions, and therefore, regardless of the grid resolution, they tend to deviate from those conceivable on the Euclidean plane. In this paper, we propose a method for solving the raster-based least-cost corridor problem with reduced distortion by adapting a distortion reduction technique originally designed for least-cost paths and applying it to an efficient but distortion-prone least-cost corridor algorithm. The proposed method is, in theory, guaranteed to generate no less accurate solutions than the existing one in polynomial time and, in practice, expected to generate more accurate solutions, as demonstrated experimentally using synthetic and real-world data.
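The distortion at issue stems from the fixed move set of raster least-cost search. A minimal sketch of the standard 8-direction Dijkstra baseline (averaging the two cell costs per step is a common convention and an assumption here):

```python
import heapq

def least_cost_path_cost(cost, start, goal):
    """Dijkstra over a cost grid with the usual 8 orthogonal/diagonal moves.
    Step cost = mean of the two cell costs times the step length; the fixed
    move set is exactly what causes the angular distortion discussed above."""
    rows, cols = len(cost), len(cost[0])
    moves = [(dr, dc, (dr * dr + dc * dc) ** 0.5)
             for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]
    best = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return d
        if d > best[(r, c)]:          # stale queue entry
            continue
        for dr, dc, length in moves:
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + length * (cost[r][c] + cost[nr][nc]) / 2.0
                if nd < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return float("inf")
```

On a uniform grid, a path between diagonal corners costs 2√2 rather than the Euclidean-straight value, and paths at angles not in the 8-direction set overestimate further, regardless of resolution.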
As an important bottom‐up driver of ecosystem processes, rainfall is intrinsically linked to the dynamics of vegetation and species distributions through its effects on soil moisture content and surface water availability. Rainfall effects are thus spatially and temporally specific to different environmental role‐players. Knowledge of its spatio‐temporal pattern is therefore essential to understanding natural ecosystem flux and potential climate change effects. Climate change poses a serious threat to protected areas in particular, as they are often isolated in fragmented landscapes and confined within hard park boundaries. In consequence, a species' natural movement response to resulting climate‐induced niche shifts is often obstructed. Long‐term, accurate and consistent climate monitoring data are therefore important resources for managers in large protected areas like the Kruger National Park (Kruger). In this article we model local rainfall measurements as a function of global rainfall surfaces, elevation and distance to the ocean using a generalized additive mixed effects model to produce fine‐scale (1 km2) monthly rainfall surfaces from July 1981 to June 2015. Results show a clear seasonal cycle nested within an oscillating multi‐decadal trend. Most noticeably, seasonality is shifting both temporally and spatially as rainfall moves outside of the typical dry/wet periods and areas. In addition, high‐rainfall seasons are generally receiving more rainfall while low‐rainfall seasons are receiving less. Northwestern regions of the park are experiencing more extreme annual rainfall differences, while far northern and southern regions show greater seasonality changes. The well‐described north–south and east–west rainfall gradient is still visible but the spatial complexity of this pattern is more pronounced than expected. Taken together, we show that Kruger's spatio‐temporal rainfall patterns are changing significantly in the short to medium term.
The resulting raster data set is made freely available to promote holistic ecosystem studies and support longer‐term climate change research (http://dataknp.sanparks.org/sanparks/).
Using generalized additive mixed effects models to explore the variability of local rainfall observations as a function of global rainfall surfaces, elevation and distance to the ocean. Results suggest significant change in spatio‐temporal patterns of rainfall in the Kruger National Park, South Africa from 1985 to 2015.
One of the main aims of the infrastructural organization of geospatial data is to enable users to acquire complete, accurate and up-to-date datasets at the right time. This is necessary to provide an environment in which all stakeholders can work collaboratively and effectively to solve environmental issues and fulfil their goals. The main finding of this study is the necessity of organizing geospatial data infrastructure at the global level, incorporating official geospatial datasets developed by national mapping organizations, for environmental monitoring, protection, and early-warning management at the international level. Data standardization of the Global Map, as a contributor to GSDI and GEOSS, is analyzed through the Albanian GM dataset. The main components considered in the analysis are data and metadata, technology, institutional framework, policies, interoperability, network services, search capabilities, and data sharing within the GSDI.