Accuracy 2010 Conference

 

Proceedings of the Ninth International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences

Edited by: Nicholas J Tate, Peter F Fisher

Local Organising Committee

International Steering Committee

Programme Committee

 

A Comparison of Indicator and Poisson Kriging of Herbivore Species Abundance in Kruger National Park, South Africa

Ruth Kerry1, Pierre Goovaerts2, Izak Smit3, Benjamin R. Ingram4
1.Department of Geography, Brigham Young University, Provo, USA.
2.Biomedware Inc., Ann Arbor, USA.
3.South African National Parks, Skukuza, South Africa.
4.Department of Computer Science, University of Talca, Talca, Chile.
1. ruth_kerry@byu.edu; 2. goovaerts.pierre@gmail.com; 3. izaks@sanparks.org; 4. ingrambr@googlemail.com

Abstract: This paper investigates the potential of indicator and Poisson kriging for populating gaps in aerial transect surveys of herbivore species abundance in Kruger National Park, South Africa. Indicator kriging does not perform well due to a lack of zero counts in the raw observations and due to poor variogram structure for rare large counts. Two Poisson approaches perform better than indicator kriging. Poisson approach (1), which investigates spatial density, performs better than Poisson approach (2), which investigates the proportion each species forms of all animals. However, Poisson approach (2) seems promising for mapping the abundance of the rarer animals.

Keywords: indicator kriging; Poisson kriging; species abundance

Attachment: KerryAccuracy2010.pdf (370.41 KB)

A Comparison of Text Mining and Semantic Approaches for Integrating National and Local Habitat Data

Alexis Comber1, Andy Lear2, Richard Wadsworth3
1. Department of Geography, University of Leicester, Leicester, LE1 7RH, UK
2. Leicestershire and Rutland Wildlife Trust, Leicester, LE5 2JJ, UK
3. Centre for Ecology and Hydrology, Lancaster, LA1 4AP, UK
1. ajc36@le.ac.uk; 2. alear@lrwt.org.uk; 3. rawad@ceh.ac.uk

Abstract: This paper compares two approaches for integrating data semantics: data primitives and latent semantic analysis. The former is shown to be appropriate when the user has an understanding of the domain. The latter is suitable when the user has no a priori knowledge of the domain.

Keywords: land cover, habitat, integration, semantics

Attachment: ComberAccuracy2010.pdf (517.67 KB)

A Dynamic Web-based Data Model for Representing Geographic Points with Uncertain Locations

Evan Bowling1, Ashton Shortridge2
1. Department of Computer Science and Engineering Michigan State University East Lansing, MI, USA
2. Department of Geography Michigan State University East Lansing, MI, USA

1. bowlin10@msu.edu; 2. ashton@msu.edu

Abstract: This paper considers the problem of implementing positional error models for point data within the context of internet mapping. We identify three particular challenges: existing internet spatial data model standards, which were not designed for stochastic representation; the means by which the parameters and algorithms for a positional uncertainty model might be transmitted with the spatial data; and the nature, timing, and efficiency of simulation algorithms. We develop and implement several solutions of increasing complexity. These solutions implement on-the-fly point position modification through independent and spatially autocorrelated error model realizations on the client- and server-side. We discuss the limitations of both client-side and server-side solutions and the implications for current web mapping applications.
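
A minimal sketch of the two error models contrasted above, assuming Gaussian positional error with a hypothetical magnitude and an exponential covariance for the autocorrelated case (illustrative only, not the authors' implementation):

```python
import numpy as np

def perturb(points, sigma=5.0, corr_range=100.0, correlated=True, rng=None):
    """Draw one error-model realization of point positions.

    points     : (n, 2) array of x, y coordinates
    sigma      : positional error standard deviation (map units)
    corr_range : range parameter of the exponential covariance model
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(points)
    if not correlated:
        # Independent errors: each point is perturbed on its own
        return points + rng.normal(0.0, sigma, size=(n, 2))
    # Spatially autocorrelated errors: covariance decays with distance
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    cov = sigma**2 * np.exp(-d / corr_range)
    L = np.linalg.cholesky(cov + 1e-9 * np.eye(n))  # jitter for stability
    return points + np.column_stack(
        [L @ rng.standard_normal(n) for _ in range(2)])
```

The same draw could run per map refresh; the trade-offs the paper discusses concern where (client or server) and when this simulation executes.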

Keywords: positional uncertainty; internet mapping; error propagation

Attachment: BowlingAccuracy2010.pdf (521.81 KB)

A Geo-Spatial Site Characterization Framework for Determining Optimal Offshore Locations of Hydrokinetic Tidal Energy Devices in the Pentland Firth

Eric P.M. Grist and Jason D. Mc Ilvenny
Environmental Research Institute (ERI) North Highland College, UHI Millennium Institute Thurso, Caithness, UK
{Eric.Grist, Jason.Mcilveny}@thurso.uhi.ac.uk

Abstract: Hydrokinetic marine renewable energy devices are forecast to play a major role in contributing to the UK renewable power generation capacity. In the north of Scotland, the Pentland Firth (located between the north coast of mainland Scotland (UK) and the Orkney Islands) is considered to be a prime marine region for the deployment of hydrokinetic tidal energy devices (Mackay, 2008). One of the key unresolved development issues is how to identify locations and configurations that are most suitable for hydrokinetic device deployment under the various physical, ecological and economic constraints existing within a particular domain. In this paper, we develop a site characterization modelling approach that determines optimal locations within a geographic information systems (GIS) framework. A central feature is the spatial incorporation of available tidal energy power density (henceforth referred to as 'tidal power density') estimated through raster coverage weighted by dominant cost drivers associated with installation of offshore renewable energy devices. The latter include geo-spatial factors such as offshore site distance, seabed depth and cable connection distance to the electricity grid. Operational restrictions and uncertainties imposed in time and space by tidal ranges and fluctuations on specific hydrokinetic devices are also examined. The framework provides a generic management decision tool to assist with evaluating uncertainties and concerns connected with marine renewable energy development.

Keywords: seabed depth; coastal distance; installation; cost drivers; raster coverage; tidal power density

Attachment: EricGristAccuracy2010.pdf (784.87 KB)

A High Accuracy and High Speed Method and its Application to Ecological Modelling

Tian Xiang Yue1,2, Chuan-Fa Chen3, Ze Meng Fan3, and Xiao-Fang Sun3
1. Institute of Geographical Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing, China
2. Ecological Complexity and Modeling Laboratory, University of California, Riverside, USA
3. Institute of Geographical Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing, China
1. Yue@lreis.ac.cn; 2. txyue@ucr.edu

Abstract: The high accuracy and high speed method for surface modelling (HASM) is transformed into a symmetric positive definite and large sparse linear system. It is applicable to developing scenarios of terrestrial ecosystems at the global level.
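
Because the abstract reduces HASM to a symmetric positive definite, large sparse system, a Krylov solver such as conjugate gradients is the natural fit. A hedged illustration with a stand-in matrix (not the actual HASM discretisation):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

# Stand-in SPD sparse system: tridiagonal and diagonally dominant.
n = 100_000
A = diags([-1.0, 2.5, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Conjugate gradients exploits symmetry and positive definiteness;
# memory stays O(n) because A is never densified.
x, info = cg(A, b)
print(info == 0, np.linalg.norm(A @ x - b))  # info == 0 means converged
```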

Keywords: High accuracy; high speed; surface modelling; ecosystem; HASM

Attachment: YueAccuracy2010.pdf (237.42 KB)

A Sketch-based Language for Representing Uncertainty in the Locations of Origin of Herbarium Specimens

Barry J. Kronenfeld1 and Andrea Weeks2
1. Department of Geography and Geoinformation Science George Mason University Fairfax, Virginia, USA
2. Department of Environmental Science and Policy & Ted R. Bradley Herbarium - George Mason University Fairfax, Virginia, USA

1. bkronenf@gmu.edu; 2. aweeks3@gmu.edu

Abstract: Uncertainty fields have been suggested as an appropriate model for retrospective georeferencing of herbarium specimens. Previous work has focused only on automated data capture methods, but techniques for manual data specification may be able to harness human spatial cognition skills to quickly interpret complex spatial propositions. This paper develops a formal modeling language by which location uncertainty fields can be derived from manually sketched features. The language consists of low-level specification of critical probability isolines from which a surface can be uniquely derived, and high-level specification of features and predicates from which low-level isolines can be derived. In a case study, five specimens of Kosteletzkya pentacarpos housed in the Ted R. Bradley Herbarium at George Mason University are retrospectively georeferenced, and the locational uncertainties of error distance, possibility region and uncertainty field representations are compared.

Keywords: Herbarium databases; Retrospective georeferencing; Uncertainty fields

Attachment: KronenfeldAccuracy2010.pdf (545.17 KB)

A New Look at Semantic Accuracy

Ola Ahlqvist
Department of Geography, The Ohio State University Columbus, OH, USA
ahlqvist.1@osu.edu

Attachment: Alhqvist2010Accuracy.pdf (390.54 KB)

A New Method for Checking the Planimetric Accuracy of Digital Elevation Model Data Derived by Airborne Laser Scanning

Joachim Höhle and Christian Øster Pedersen
Department of Development and Planning, Aalborg University
jh@land.aau.dk, choepe@gmail.com

Abstract: The current state of the art in checking the planimetric accuracy of Digital Elevation Models derived from Airborne Laser Scanning is analyzed. The principle of a proposed method is presented, including the mathematical equations. Special emphasis is given to the precision of derived points which are used for the comparison with true values. Least squares adjustment is applied, and the influence of blunders in the observations is reduced by means of the iterative determination of weights as a function of the size of the corrections. Practical tests have been carried out with data from the new Digital Elevation Model of Denmark. The required reference values were derived by means of aerial images and photogrammetric techniques. A few ground control points were determined by GPS. The reliability and practicability of the method are then discussed on the basis of the experiences obtained from the practical usage of the method. It is concluded that the proposed method is accurate, robust against blunders, and has potential for automation.
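
A numerical sketch of the robust element described above: least squares with weights re-determined each iteration from the size of the corrections, so blunders lose influence. The specific weight function is an assumption for illustration, not the paper's formula.

```python
import numpy as np

def robust_adjustment(A, y, k=2.0, iterations=10):
    """Least-squares adjustment with iteratively re-determined weights."""
    w = np.ones(len(y))
    for _ in range(iterations):
        W = np.diag(w)
        x = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
        v = y - A @ x                                   # corrections
        s = max(1.4826 * np.median(np.abs(v)), 1e-12)   # robust scale
        # Down-weight observations with large corrections (suspected blunders)
        w = np.where(np.abs(v) <= k * s, 1.0,
                     np.exp(-((np.abs(v) / (k * s)) ** 2)))
    return x, v, w
```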

Keywords: Digital Elevation Models; Airborne Laser Scanning; planimetric accuracy; robust adjustment; precision

Attachment: JoachimAccuracy2010.pdf (405.91 KB)

Accuracy Sampling Design Bias on Coarse Spatial Resolution Land Cover Data in the Great Lakes Region (United States and Canada)

John S. Iiames, Jr.
US Environmental Protection Agency Research Triangle Park, NC, USA
iiames.john@epa.gov

Abstract: A number of articles have investigated the impact of sampling design on remotely sensed landcover accuracy estimates. Gong and Howarth (1990) found significant differences in Kappa accuracy values when comparing pure-pixel sampling, stratified random sampling, and stratified systematic unaligned sampling. This study compares accuracy assessment results for landcover derived from 2007 Moderate Resolution Imaging Spectroradiometer (MODIS) 250 m normalized difference vegetation index (NDVI) time-series data for the Great Lakes Basin (GLB), USA. Here, two sampling schemes are compared: (1) pure-pixel sampling (center pixel within a 3 x 3 window) and (2) isolated independent pixel sampling (i.e. 'edge pixels'). MODIS spectral characteristics typically 'bleed' between adjacent pixels, making the homogeneity of 'edge pixels' suspect. This study examines the possible bias introduced by including these fringe pixels in the assessment process. Our study focuses on the Northern Lakes and Forests Level III Omernik ecological region (115,934 km2) south of Lake Superior, spanning three states: Michigan, Wisconsin, and Minnesota.

Keywords: MODIS, NDVI, accuracy

Attachment: IiamesAccuracy2010.pdf (342.74 KB)

Accounting for Spatial Sampling Effects in Regional Uncertainty Propagation Analysis

Gerard B.M. Heuvelink*, Dick J. Brus and Gertjan Reinds
Environmental Sciences Group, Wageningen University and Research Centre, Wageningen, the Netherlands
* gerard.heuvelink@wur.nl

Abstract: Spatial uncertainty propagation analysis (UPA) aims at analysing how uncertainties in model inputs propagate through spatial models. Monte Carlo methods are often used, which estimate the output uncertainty by repeatedly running the model with inputs that are sampled from their probability distribution. Regional application of UPA usually means that the model output must be aggregated to a larger spatial support. For instance, decision makers may want to know the uncertainty about the annual nitrate leaching averaged over an entire region, whereas a model typically predicts the leaching for small plots. For models without spatial interactions there is no need to run the model at all points within the region of interest. A sufficiently large sample of locations may represent the region sufficiently well. The reduction in computational load can then be used to increase the number of Monte Carlo runs, which decreases the Monte Carlo sampling error. In this paper we analyse how a combination of analytical and numerical methods can be used to evaluate the errors introduced by Monte Carlo and spatial sampling. This is important to be able to correct for the bias inflicted by the spatial sampling, to determine how many model runs are needed to reach sufficiently accurate results and to determine the optimum ratio of the Monte Carlo and spatial sample sizes. Results are briefly illustrated with a UPA of a linear regression model that predicts the terrestrial nitrous-oxide emission for Europe.
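
Schematically, the computation analysed here nests a spatial sample inside a Monte Carlo loop; the split of effort between the two sample sizes is exactly the trade-off studied. The model, region and input distributions below are placeholders.

```python
import numpy as np

rng = np.random.default_rng(42)
n_mc, n_space = 500, 200                 # Monte Carlo runs vs. spatial sample

xy = rng.uniform(0, 1, (n_space, 2))     # point sample standing in for the region

def plot_model(theta, xy):
    """Placeholder for the plot-scale model (e.g. nitrate leaching)."""
    return theta[0] + theta[1] * np.sin(3 * xy[:, 0]) * xy[:, 1]

regional_means = np.empty(n_mc)
for i in range(n_mc):
    theta = rng.normal([1.0, 0.5], [0.1, 0.05])       # draw uncertain inputs
    regional_means[i] = plot_model(theta, xy).mean()  # aggregate over space

print("regional mean:", regional_means.mean())
print("std (Monte Carlo + spatial sampling error):", regional_means.std(ddof=1))
```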

Keywords: spatial uncertainty analysis, sampling error, Monte Carlo error, optimization, spatial aggregation

Attachment: HeuvelinkAccuracy2010.pdf (479.39 KB)

Accuracy Assessment with Complex Sampling Designs

Raymond L. Czaplewski
United States Forest Service Rocky Mountain Research Station Fort Collins, Colorado USA
rczaplewski@fs.fed.us

Abstract: A reliable accuracy assessment of remotely sensed geospatial data requires a sufficiently large probability sample of expensive reference data. Complex sampling designs reduce cost or increase precision, especially with regional, continental and global projects. The General Restriction (GR) Estimator and the Recursive Restriction (RR) Estimator separate a complex sample survey into simple statistical components, each of which is sequentially combined into the final estimate. GR and RR produce a design-consistent Empirical Best Linear Unbiased Estimator (EBLUE) for any sample survey design, regardless of its complexity.

Keywords: Kalman filter; error matrix; GIS; geospatial database; MODIS; Landsat; LiDAR; photo-interpretation

Attachment: CzaplewskiAccuracy2010.pdf (443.92 KB)

Accuracy of Built-up Area Mapping in Europe at Varying Scales and Thresholds

Pavol Hurbanek1, Peter M. Atkinson1, Jeganathan Chockalingam1, Robert Pazur2, Konstantin Rosina3
1. University of Southampton Southampton, United Kingdom
2.Slovak Academy of Sciences Bratislava, Slovakia
3. Comenius University Bratislava, Slovakia

1. pavolhurbanek@gmail.com, P.M.Atkinson@soton.ac.uk, j.chockalingam@soton.ac.uk; 2. geogpazu@savba.sk; 3. konstantin.rosina@gmail.com

Abstract: The paper provides an accuracy assessment of the Soil Sealing Layer (SSL), a map of impervious surfaces in most of Europe. It focuses on the extent of mapped built-up area and the accuracy of built-up area mapping as a function of the soil sealing threshold applied, the spatial resolution used, and the spatial configuration of built-up areas mapped. The results from the stratified random sampling comparison of SSL with aerial orthophotos derived for Slovakia are compared with those for other countries and complemented by "complete coverage" comparisons in three 6x6 km study areas in Slovakia.

Keywords: soil sealing; impervious surfaces; land cover; spatial resolution; threshold; Europe; Slovakia; population downscaling

Attachment: HurbanekAccuracy2010.pdf (1.01 MB)

Accuracy Assessment for Boolean and Fuzzy Classification in Tripoli, Libya

Abdulhakim Khmag, Alexis Comber, and Peter Fisher
Department of Geography, University of Leicester, Leicester, LE1 7RH, United Kingdom
{aek9, ajc36, pff1}@le.ac.uk

Abstract: Satellite imagery is a longstanding and effective resource for environmental analysis and monitoring at local, regional and global scales. Thematic map accuracy continues to be problematic, especially when Boolean representations are used, as each image pixel is assumed to be pure and is classified to one and only one class. In reality a pixel may be mixed, containing many classes. Fuzzy classifications may be useful here, as multiple class memberships are assigned: a membership function is defined for each class against the feature value (digital number), and the membership values of the classes for a particular pixel are determined from the function definition. Quantifying classification accuracy is an important aspect of map production as it allows confidences to be attached to the classifications for their effective end use. Accuracy measures serve as an analysis of the errors arising from the classification process due to complex interactions between the spatial structure of the landscape, classification algorithms, land cover change and sensor resolutions. The accuracy of Boolean classifications may be assessed in a number of different ways: traditionally the error matrix generates overall accuracy, producer and user accuracies and kappa coefficients, and a number of additional measures exist, for example the classification success index and the Tau coefficient. Other accuracy measures may appropriately incorporate the fuzziness in the classification outputs and/or reference (ground) data; these include Euclidean, entropy, cross-entropy and L1 distances, fuzzy set operators, and fuzzy error matrix based measures. Generally, the confusion matrix compares ground observations for a given set of validation samples with the classification result. Accuracy assessment usually includes three essential mechanisms: sampling design, response design, and estimation and analysis procedures. Selection of a suitable sampling strategy is a critical step; the major components of a sampling plan include the sampling unit (pixels or polygons), sampling design and sample size. Possible sampling designs include random, stratified random, systematic, double and cluster sampling. This paper compares different accuracy assessment measures for Boolean and fuzzy classified images at different dates in Tripoli, Libya. The results show that classification accuracy is related to a range of different factors, and that the use of kappa coefficients for accuracy assessment gives good results for both fuzzy and Boolean classifications.
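
For the Boolean case, the traditional measures listed above all derive from the error matrix; a compact sketch of the computation (the counts are illustrative, not the Tripoli data):

```python
import numpy as np

def error_matrix_measures(cm):
    """cm[i, j] = pixels mapped as class i whose reference class is j."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    overall = np.trace(cm) / n
    user = np.diag(cm) / cm.sum(axis=1)       # per mapped class (commission)
    producer = np.diag(cm) / cm.sum(axis=0)   # per reference class (omission)
    p_chance = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / n**2
    kappa = (overall - p_chance) / (1 - p_chance)
    return overall, user, producer, kappa

cm = [[50, 3, 2],
      [4, 45, 6],
      [1, 5, 40]]
overall, user, producer, kappa = error_matrix_measures(cm)
print(f"overall={overall:.3f}, kappa={kappa:.3f}")
```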

Keywords: Accuracy assessment, Boolean classification, fuzzy classifications, kappa coefficient, error matrix

Accuracy-aware Web-based Geoprocessing Chains

Andreas Donaubauer1, Tatjana Kutzner2, Florian Straub2
1. Geographic Information Systems Group, ETH Zurich, Zurich, Switzerland
2. Fachgebiet Geoinformationssysteme, Technische Universität München, München, Germany
1. donaubauer@geod.baug.ethz.ch; 2. {tatjana.kutzner, florian.straub}@bv.tum.de

Abstract: Quality information is essential for determining the fitness for use of geospatial data for certain applications, and furthermore for correctly interpreting the results of spatial analyses. However, the latter aspect is still not sufficiently supported by geospatial applications and geospatial web services. This paper presents an approach to dynamically generate quality information for the results of spatial analyses within a Web-based chain of geoprocessing operations, taking into account the quality of the input data. The approach is based on standards of the International Organization for Standardization (ISO) and the Open Geospatial Consortium (OGC). An implemented prototype together with a use case from forestry shows the practicability and relevance of this approach.

Keywords: Metadata, Data Quality, Positional Accuracy, Geoprocessing Workflow, OGC, Web Processing Service, SOA, Error Propagation, Model Driven Architecture, ISO 19100

Attachment: DonaubauerAccuracy2010.pdf (622.33 KB)

Adjusting for Measurement Lag in the Estimation of Rapid Marine Tidal Current Flow with Sparse Spatiotemporal Boat Survey Data

Eric P.M. Grist and Jason D. Mc Ilvenny
Environmental Research Institute (ERI) North Highland College, UHI Millennium Institute Thurso, Caithness, UK
{Eric.Grist, Jason.Mcilveny}@thurso.uhi.ac.uk

Abstract: Accurate determination of marine tidal current flow is a crucial component in assessing offshore site suitability for marine renewable energy devices. The use of real time boat surveys to provide field data of estimated current speeds and directions can help identify potential sites for development. However, in a rapidly changing tidal flow system the time lag between measurements at different points in space and time may induce significant error into estimates of current speeds and directions. This in turn implies substantial inaccuracies from spatiotemporal computations which attempt to extrapolate tidal flow patterns over a wider regional scale. Whereas hydrodynamic models can provide flow parameter estimates at any position and time, they cannot be relied on to be accurate without verification or correction from such field measurements. Here, we introduce a statistical approach which adjusts for measurement lags inherent in measured flows, by combining boat survey data with hydrodynamic model output. The approach exploits a regression model which is fitted to observed differences between survey data and output from the hydrodynamic model. It is exemplified with the POLPRED® (POLPRED, 2007) hydrodynamic model for the Pentland Firth in the north of Scotland, a key region designated for development of tidal stream technology. The regression model is used to estimate currents with associated uncertainties over a spatiotemporal range within the domain of the survey region. Our results indicate that correction for measurement lag is likely to be a major factor in achieving accurate estimation of currents in such dynamic marine environments.
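
A stripped-down sketch of the correction idea: regress survey-minus-model differences on measurement lag, then apply the fitted bias elsewhere in the domain. The linear form and the numbers are assumptions for illustration, not the paper's fitted model.

```python
import numpy as np

# lag: time offset (s) of each boat measurement from a common reference;
# diff: surveyed current speed minus hydrodynamic-model speed (m/s).
lag = np.array([30.0, 90.0, 150.0, 240.0, 310.0])
diff = np.array([0.05, 0.12, 0.21, 0.33, 0.41])

slope, intercept = np.polyfit(lag, diff, 1)   # lag-dependent bias model

def lag_corrected(model_speed, lag_s):
    """Adjust a hydrodynamic-model speed for measurement lag."""
    return model_speed + intercept + slope * lag_s

print(lag_corrected(2.5, 180.0))
```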

Keywords: regression; current profile; hydrodynamic model; ADCP; POLPRED

Attachment: GristAccuracy2010.pdf (451.04 KB)

Air Quality Mapping: Combining Models, In Situ Measurements and Remote Sensing

Nicholas S. Hamm1*, Tiblets Z. Demwez2, Alfred Stein3, Richard Boucherie4
1. Faculty of Geoinformation Science and Earth Observation (ITC), Enschede The Netherlands
2. Faculty of Geoinformation Science and Earth Observation (ITC), Enschede The Netherlands, Faculty of Electrical Engineering, Mathematics and Computer Science,University of Twente, Postbus 6, 7500 AA, Enschede, The Netherlands
3. Faculty of Geoinformation Science and Earth Observation (ITC), Enschede The Netherlands
4. Faculty of Electrical Engineering, Mathematics and Computer Science University of Twente, Postbus 6, 7500 AA Enschede, The Netherlands
* hamm@itc.nl

Abstract: This paper presents a system for modelling traffic flow, estimating vehicle emissions and validating them against remotely sensed and in situ data. The results showed that the queuing model was satisfactory and that the vehicle emissions model was internally consistent. The validation results were inconclusive and point to future important areas of research. Notably, it is necessary to develop a system to predict concentrations (rather than emissions) in time and space.

Keywords: Queuing model, emissions, PM10, remote sensing, in situ data

Attachment: HammAccuracy2010.pdf (479.98 KB)

An Accuracy Assessment of Spaceborne X-band (TerraSAR-X) Spotlight Mode InSAR DEMs

Cutberto U. Paredes-Hernandez, Kevin J. Tansey and Nicholas J. Tate
Department of Geography, University of Leicester, Leicester, United Kingdom
{cupl, kjt7, njt9}@le.ac.uk

Abstract: In this paper we present the initial results of an accuracy assessment of Digital Elevation Models extracted from two TerraSAR-X interferometric datasets collected in Spotlight mode. Our results show that for sparsely vegetated areas of low relief, TerraSAR-X Spotlight mode DEMs can achieve a vertical accuracy comparable to that achieved using airborne X-band InSAR.

Keywords: DEM Accuracy, Incidence Angle, Multi-looking, TerraSAR-X

Attachment: CutbertoAccuracy2010.pdf (1 MB)

An ID3-improved Approach for Optimum Rule Mining Through Granular Computing Search

Hadis Samadi Alinia1, Mahmoud Reza Delavar2, Yiyu Yao3
1. Department of Surveying and Geomatics Engineering, University of Tehran, Tehran, Iran
2. Department of Surveying and Geomatics Engineering, Center of Excellence in Geomatics Engineering and Disaster Management, University of Tehran, Tehran, Iran
3. Department of Computer Science, University of Regina, Saskatchewan, Canada
1. alinia@ut.ac.ir; 2. medelavar@ut.ac.ir; 3. yyao@cs.uregina.ca

Abstract: Rule induction is an area of machine learning in which formal rules are extracted from a set of observations or a training dataset. Induced rules can be expressed as the final result of a decision tree, in which each branch represents a possible decision scenario and its outcome. Existing decision-tree learning algorithms such as Iterative Dichotomiser 3 (ID3) are attribute-centered methods, which may introduce unnecessary attributes into the classification rules. To overcome this problem, coverage and confidence measures are applied to select the most promising attribute-value pair at each step. The proposed approach is granule-centered: instead of focusing on the selection of a suitable partition (i.e., a family of granules defined, at each step, by the values of an attribute), it concentrates on the selection of a single granule. The ID3 decision tree learning algorithm and the granule network are successfully applied to an information table for a test dataset of seismic vulnerability of urban areas in Tehran, the capital of Iran.

Keywords: ID3 decision tree; Granule network; Uncertainty; Consistent classification; Seismic vulnerability

Attachment: AliniaAccuracy2010.pdf (483.47 KB)

An Analysis of Propagated Uncertainties in Ecological Niche Spatial Modelling

Marco Antonio Marinelli and Robert Corner
Department of Spatial Sciences Curtin University of Technology Perth, Western Australia
{m.marinelli, r.corner}@bom.gov.au

Abstract: BIOCLIM is a probabilistic ecological niche model that can be used to investigate and predict species distributions for both native and agricultural species. Its results have greatest validity when studying relatively large (subcontinental) areas. The version of BIOCLIM used in this study uses three basic spatial climatic input layers (monthly maximum and minimum temperature and precipitation layers) and a dataset describing the current spatial distribution of the species of interest. Our work has investigated how uncertainty in the input data propagates through to the estimated spatial distribution for field peas (Pisum sativum) in the agriculturally significant region of south west Western Australia. Our results clearly show the effect of uncertainty in the input layers on the predicted species distribution map. In places the uncertainty significantly influences the final validity of the result, and the spatial distribution of the validity also varies significantly.

Keywords: BIOCLIM; prediction; uncertainty

Attachment: MarinelliAccuracy2010.pdf (489.52 KB)

Analysing the Precision of Resource-Aware Localisation Algorithms for Wireless Sensor Networks

Frank Niemeyer, Alexander Born and Ralf Bill
Faculty for Agricultural and Environmental Sciences, Chair of Geodesy and Geoinformatics, University of Rostock, Germany
{frank.niemeyer,Alexander.born,ralf.bill}@uni-rostock.de

Abstract: One of the key issues in wireless sensor networks is the precise determination of the positions of arbitrarily located sensor nodes with low complexity and lowest energy consumption. Depending on the application, different accuracy levels are required. A distinction can be made between coarse-grained algorithms, which have lower hardware requirements but low precision, and fine-grained algorithms, which offer high accuracy but are also very resource-intensive. In this paper we investigate and compare the precision of selected methods with respect to the network geometry and highly inaccurate distance measurements.
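
As one concrete example of the fine-grained family, a linearised least-squares multilateration from noisy anchor distances (a generic textbook method, not necessarily among the algorithms the paper selects):

```python
import numpy as np

def multilaterate(anchors, dists):
    """Position estimate from noisy distances to known anchor nodes.

    Subtracting the first anchor's range equation cancels the quadratic
    terms, leaving a linear system solvable by least squares.
    """
    anchors, dists = np.asarray(anchors, float), np.asarray(dists, float)
    x0, d0 = anchors[0], dists[0]
    A = 2.0 * (anchors[1:] - x0)
    b = (d0**2 - dists[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(x0**2))
    return np.linalg.lstsq(A, b, rcond=None)[0]

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
true_pos = np.array([3.0, 4.0])
rng = np.random.default_rng(0)
noisy = np.linalg.norm(anchors - true_pos, axis=1) + rng.normal(0, 0.3, 4)
print(multilaterate(anchors, noisy))   # close to (3, 4), degraded by noise
```

The conditioning of the matrix A is where the network-geometry effect highlighted in the abstract shows up.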

Keywords: wireless sensor networks, geosensors, localisation, accuracy, precise positioning

Attachment: NiemeyerAccuracy2010.pdf (667.69 KB)

Analysis of Spatial Autocorrelation for Point Objects based on Line Buffer

Jiangping Chen and Jingchao Jiang
School of Remote Sensing and Information Engineering, Wuhan University, Wuhan, Hubei, China
chenjp_lisa@163.com; 77068177@qq.com

Abstract: Tobler's first law states that everything is related to everything else, but near things are more related than distant things. This kind of relation is called spatial autocorrelation: it occurs when values of a variable sampled at nearby locations are more similar than those sampled at locations more distant from each other. Moran's I is perhaps the oldest and best-known index of spatial autocorrelation. The way the weight matrix is defined greatly affects the value of Moran's I. Since some weight matrices cannot differentiate the strength of spatial linkages between adjacent locations, more complex spatial weight matrices have been proposed to capture spatial linkages more precisely. There are many ways to define a spatial weight matrix, for example the Rook's weight matrix, Queen's matrix, binary connectivity matrix, k-nearest-neighbour matrix and distance threshold matrix. For point objects in the real world, however, the distance between objects is not the only feature affecting their autocorrelation, which is influenced more by their accessibility. The autocorrelation of two cities, for example, is affected more by the roads between them and their mutual accessibility than by the straight-line distance. It is therefore natural to consider the spatial autocorrelation of point objects based on the line between them. This paper proposes a new form of spatial weight matrix for point objects based on a line buffer. The method has three steps. First, the line buffer range of the point objects is set according to thematic knowledge of the research field. Second, the relationship between two point objects is defined by the line buffer: if both point objects fall within the line buffer they are spatially related, and their relation is measured by a formula offered in the paper; otherwise their spatial autocorrelation is considered zero. Third, the resulting weights are applied in Moran's I. An experiment on the spatial autocorrelation of building prices shows that the method is effective.
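
Once the weight matrix is fixed, Moran's I itself is a short computation; the paper's contribution is the buffer-based construction of W. A generic sketch into which such a W would be plugged:

```python
import numpy as np

def morans_i(values, W):
    """Moran's I for an arbitrary spatial weight matrix W.

    W[i, j] would here come from the line-buffer rule: nonzero when
    both points fall inside the buffer, zero otherwise.
    """
    z = values - values.mean()
    n, s0 = len(values), W.sum()
    return (n / s0) * (z @ W @ z) / (z @ z)
```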

Keywords: Spatial autocorrelation; Spatial weight matrix; Moran's I; Point objects; Line buffer

Attachment: JiangpingAccuracy2010.pdf (444.51 KB)

Analyzing Small Geographic Area Datasets Containing Values Having High Levels of Uncertainty

Daniel A. Griffith1 and Robert P. Haining2
1. School of EPPS University of Texas at Dallas Richardson, TX, USA
2. Department of Geography University of Cambridge Cambridge, UK

1. dagriffith@utdallas.edu; 2. rph26@cam.ac.uk

Abstract: Data collected and then post-stratified by small geographic areas frequently result in small, or even zero, sample sizes for some areas. Government agencies faced with this outcome commonly suppress many of the sample-based attribute measures for confidentiality reasons. Meanwhile, modeling concerns of spatial scientists faced with this situation include the accompanying large standard errors for parameter estimates obtained with such data, as well as how to deal directly with any missing values. This paper addresses these two issues in the context of spatial statistical modeling that accounts for high levels of uncertainty for some data values in two specific datasets. Its purpose is to demonstrate ways of handling high levels of uncertainty in georeferenced data. In doing so, empirically-based findings summarized in this paper illustrate selected approaches that can be employed to account for high levels of uncertainty for some data values in a dataset. Its implications should be of interest to users in government, the private sector, and the academic community who engage in the modeling of georeferenced data.

Keywords: Bayesian; error; imputation; missing data; Poisson regression; sample size; uncertainty

Attachment: GriffithAccuracy2010.pdf (507.46 KB)

Application of Hyperspectral Remotely Sensed Data for Water Quality Monitoring: Accuracy and Limitations

Asif M. Bhatti1, John Schalles2, Donald Rundquist3, Luis Ramirez3 and Seigo Nasu4
1. Kochi University of Technology (KUT), Kochi, Japan
2. Biology Dept., Creighton University, Omaha, USA.
3. Center for Advanced Land Management Information Technologies (CALMIT), School of Natural Resources, University of Nebraska-Lincoln, Lincoln, USA
4. Kochi University of Technology (KUT), Kochi, Japan

1. asif_engr@yahoo.com

Abstract: Remote sensing is a valuable tool for monitoring water quality parameters in inland and coastal waters. The prime objective of the present research was to investigate the accuracy and limitations of hyperspectral remotely sensed data for water quality monitoring. In situ hyperspectral spectroradiometer data for the Altamaha River, Georgia, USA, and the St. Marys River, Georgia, USA, were collected below the water surface. Pronounced differences were observed between the subsurface spectral reflectance of different sampling points within the same water body. The spectral signatures were found to be strongly correlated with the optically active constituents present within the water body. The collected hyperspectral and in situ water quality data were analyzed to develop models for the estimation of total suspended sediment (TSS), colored dissolved organic matter (CDOM), chlorophyll-a and turbidity. Band ratio algorithms were developed by means of the collected remotely sensed hyperspectral data. The developed regression models showed good correlation with the water quality parameters. It is imperative to comprehensively understand the spectral nature, the spectral response to individual water quality parameters, and the effect of influencing factors on the reflected signals. The research work demonstrates the operational feasibility of remotely sensed data for monitoring water quality parameters.

Keywords: Hyperspectral data, water quality models, accuracy

Attachment: BhattiAccuracy2010.pdf (439.06 KB)

Assessing the Accuracy of 'Crowdsourced' Data and its Integration with Official Spatial Data Sets

Richard Jones*, Lucy Bastin, Dan Cornford and Matthew Williams
Knowledge Engineering Group, Aston University, Birmingham, United Kingdom
*jonesrml@aston.ac.uk

Abstract: Recent developments in service-oriented and distributed computing have created exciting opportunities for the integration of models in service chains to create the Model Web. This offers the potential for orchestrating web data and processing services, in complex chains; a flexible approach which exploits the increased access to products and tools, and the scalability offered by the Web. However, the uncertainty inherent in data and models must be quantified and communicated in an interoperable way, in order for its effects to be effectively assessed as errors propagate through complex automated model chains. We describe a proposed set of tools for handling, characterizing and communicating uncertainty in this context, and show how they can be used to 'uncertainty-enable' Web Services in a model chain. An example implementation is presented, which combines environmental and publicly-contributed data to produce estimates of sea-level air pressure, with estimates of uncertainty which incorporate the effects of model approximation as well as the uncertainty inherent in the observational and derived data.

Keywords: uncertainty propagation, probability, web services

Attachment: BakriAccuracy2010.pdf (475.88 KB)

Assessing the Spatial Characteristics of DEM Interpolation Error

Stephen Wise
Department of Geography, University of Sheffield, Sheffield, UK
s.wise@sheffield.ac.uk

Abstract: Studies of the detailed characteristics of DEM error have been hampered by the difficulty of obtaining a large sample of error values for a DEM. The approach proposed in this paper is to resample a DEM to a lower resolution and then re-interpolate back to the original resolution, which produces a large sample of error values well distributed across the DEM. It is argued that this is analogous to creating a DEM from low-density data sources such as spot height and contour data. This method is applied to a sample area from Scotland which contains a variety of terrain types. The results show that the standard measure of error, the Root Mean Square Error (RMSE) of elevation, shows only moderate correlation with a visual assessment of the quality of DEMs produced by a range of interpolation methods. The usual assumptions that elevation error has a Gaussian distribution and is strongly correlated are thrown into doubt: the distribution is much more strongly clustered around zero than the Gaussian, and the level of spatial autocorrelation varies markedly depending on the density of the source data and the interpolation method. At the level of the individual DEM point, elevation error is shown to be a poor predictor of error in slope derivatives, which depend on the spatial pattern of elevation errors around the point and are also sensitive to differences in terrain. At the level of a whole DEM, however, RMSE of elevation is a good predictor of RMSE in gradient and aspect.
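
The error-sampling mechanics described above are simple to reproduce; a sketch with bilinear re-interpolation standing in for the range of interpolators the paper tests (the DEM here is synthetic):

```python
import numpy as np
from scipy.ndimage import zoom

dem = np.random.default_rng(7).random((200, 200))   # stand-in for a real DEM
coarse = dem[::4, ::4]              # resample to lower resolution
back = zoom(coarse, 4, order=1)     # re-interpolate (bilinear) to original grid
err = back - dem                    # dense, well-distributed error sample

rmse = np.sqrt(np.mean(err**2))
print(rmse, np.percentile(err, [5, 50, 95]))  # the distribution, not just RMSE
```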

Keywords: Error modelling, terrain analysis, spatial interpolation, spatial statistics

Attachment: WiseAccuracy2010.pdf (592.47 KB)

Assimilation of Remote Sensing Data into Land-surface Models: the Importance of Uncertainty

Darren Ghent, Heiko Balzter and Jörg Kaduk

Department of Geography, University of Leicester, Leicester, LE1 7RH, UK.

{djg20, hb91,jk61}@le.ac.uk

Abstract: Land-surface models calculate the surface-to-atmosphere fluxes of heat, water and carbon, and are crucial elements of General Circulation Models (GCMs). Much variation, however, exists in their parameterization and representation of physical processes, leading to uncertainty in how climate change influences the land surface on a regional or global scale. A key variable in the calculation of the surface energy budget is land-surface temperature (LST), which influences the partitioning of downward radiant energy into ground, sensible and latent heat fluxes. Furthermore, LST can be applied in the prediction of live fuel moisture content, a critical variable determining fire ignition and propagation, and is crucial to soil moisture-climate feedbacks. Reductions in the uncertainty of model-predicted soil moisture and surface energy fluxes are achievable by constraining LST simulations with remote sensing data through the process of assimilation. An often-used data assimilation mechanism is the Ensemble Kalman Filter (EnKF), a variant of the Kalman Filter sequential assimilation method that takes a Monte Carlo approach. Of key importance to the performance of the filter are the determination of the uncertainty in the observation source and the size of the ensemble. Results presented here indicate that significantly different assimilated LST can emerge as a consequence of changes made to either of these two factors.
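
For reference, the stochastic (perturbed-observation) form of the EnKF analysis step; the ensemble size and the observation-error variance r are precisely the two factors whose influence the abstract reports. A generic sketch, not the authors' code:

```python
import numpy as np

def enkf_update(X, y, H, r, rng):
    """Stochastic EnKF analysis step.

    X : (n_state, n_ens) forecast ensemble (e.g. modelled LST fields)
    y : (n_obs,) observations (e.g. satellite-retrieved LST)
    H : (n_obs, n_state) observation operator
    r : observation-error variance
    """
    n_obs, n_ens = len(y), X.shape[1]
    Xp = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
    HXp = H @ Xp
    P_yy = HXp @ HXp.T / (n_ens - 1) + r * np.eye(n_obs)
    P_xy = Xp @ HXp.T / (n_ens - 1)
    K = P_xy @ np.linalg.inv(P_yy)                    # Kalman gain
    Y = y[:, None] + rng.normal(0, np.sqrt(r), (n_obs, n_ens))  # perturbed obs
    return X + K @ (Y - H @ X)                        # analysis ensemble
```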

Keywords: data assimilation; Ensemble Kalman Filter; land-surface modelling

Attachment: GhentAccuracy2010.pdf (353.2 KB)

Assigning Confidence Limits to Outputs from a Nitrate Leaching Model: A Case Study from the River Ure Catchment, UK

Paulette Posen1, Andrew Lovett1, Michael Hutchins2, Helen Davies2

1. School of Environmental Sciences, University of East Anglia, Norwich, UK
2. Centre for Ecology and Hydrology, Crowmarsh Gifford, Wallingford, UK
1. {p.posen, a.lovett}@uea.ac.uk; 2. {mihu, hnd}@ceh.ac.uk

Abstract: Diffuse pollution from agriculture is a major concern for surface water management under the EU Water Framework Directive, and water quality models are frequently used as decision support tools for policy implementation to achieve water quality objectives. The current study uses uncertainty analysis techniques to assign confidence limits to outputs from a nitrate leaching model. A sensitivity analysis determines which input variables require most detailed attention in terms of limiting output error. The issue is illustrated through application of a nitrate leaching model to the River Ure catchment in northern England. Implications for predicting possible outcomes of future, policy-driven, land use change are examined, and recommendations are made regarding the use of such models as decision support tools.

Keywords: diffuse pollution, nitrates, SLIMMER, uncertainty analysis, Water Framework Directive

Attachment: PosenAccuracy2010.pdf (584.17 KB)

Challenges Associated with Integrating Data from Multiple Scales to Assess Relationships

Linda J. Young1, Carol A. Gotway2, Kenneth K. Lopiano1
1.Department of Statistics; IFAS, University of Florida, Gainesville, FL, USA
2.Epidemiology and Analysis Program Office, U.S. Centers for Disease Control and Prevention, Atlanta, GA, USA
1.LJYoung@ufl.edu; KLopiano@ufl.edu; 2.Cdg7@cdc.gov

Abstract: Existing data from multiple sources (e.g., surveillance systems, health registries, governmental agencies) are used increasingly in programs and studies for analysis and inference. More often than not, the data have been collected on different geographical or spatial units, and each of these may be different from the ones of interest. Rarely are investigators satisfied with combining the data on a common scale; after linking the variables, the focus naturally turns to exploring the relationships among the linked variables. Regression of the health outcomes on environmental factors, adjusted for appropriate covariates, is commonly used to quantify such associations. The effect of change-of-support is considered in this setting. Efforts to quantify the association between myocardial infarction (MI) and ozone within Florida illustrate some of the challenges.

Keywords: change-of-support; Berkson error; classical measurement error; spatial misalignment

Attachment: YoungAccuracy2010.pdf (542.92 KB)

Clustering Detection for Amazonia Deforestation Using Spatio-temporal Scan Statistics

Carlos Antonio Oliveira Vieira1, Nerilson Terra Santos1, Antonio Policarpo de Souza Carneiro1, Antonio Alcirley da Silva Balieiro2
1. Federal University of Viçosa, Viçosa-MG, Brazil
2. Fundação de Vigilância em Saúde do Estado do Amazonas - FVS/AM, Av. André Araújo nº 701 - Aleixo, CEP 69.060-001, Manaus, Amazonas, Brazil

1. {carlos.vieira, nsantos, policarpo}@ufv.br; 2. alcirley@gmail.com

Abstract: Spatio-temporal models developed for the analysis of diseases can also be used in other fields of study, including studies of forests and deforestation. This paper therefore evaluates a methodology for the detection of space-time clusters of deforestation alerts in Amazonas State. Each alert records the location and the year in which deforestation was detected; the alerts are mapped by DETER (Detection System of Deforestation in Real Time in Amazonia). The study area is in the south of Amazonas State, covering the counties of Boca do Acre, Lábrea, Canutama, Humaitá, Manicoré, Novo Aripuanã and Apuí. This area has shown significant land cover change, which has increased the number of deforestation alerts; the situation is therefore a concern and warrants further investigation of the factors that increase the number of cases in the area. The outcome shows an efficient model for detecting space-time clusters of deforestation alerts. The model detected the location, size, order and characteristics of the clusters at the end of the study. Two clusters were considered alive and remained alive until the end of the study; these clusters are located in Canutama and Lábrea counties.

Keywords: deforestation alerts, clusters, scan statistics

Attachment: VieiraAccuracy2010.pdf (563.57 KB)

Comparison of Three Spatial Sensitivity Analysis Techniques

Nathalie Saint-Geours1 and Linda Lilburne2
1.AgroParisTech, UMR TETIS, F-34035, Montpellier, France
2.Landcare Research, PO Box 40, Lincoln 7640, Canterbury, New Zealand
1.nathalie.saint-geours@teledetection.fr; 2. lilburnel@landcareresearch.co.nz

Abstract: This paper compares the spatial Sobol' sensitivity analysis approach with two other sensitivity analysis techniques on a model with spatially distributed inputs. The comparison is performed on AquiferSim, a model that simulates groundwater flow and nitrate transport from paddock (i.e. field) to aquifer. Some of the input layers have considerable uncertainty. Alternative soil and land-use layers were simulated through Monte Carlo simulation based on expert-derived confusion matrices. Uncertainty in the raster rainfall layer was simulated via geostatistical unconditional simulation of error fields. The three sensitivity techniques are: (1) the spatial Sobol' technique, (2) one-at-a-time (OAT) variation around base sample points, and (3) the Elementary Effects method. The results show that the spatial Sobol' approach gives the best insight into AquiferSim behaviour. OAT local variation of inputs around sample points allows checking of the robustness of model predictions around those points, but gives no insight into the relative importance of inputs. The Elementary Effects method shows that the land-use layer is the most influential input factor, but fails to capture interactions between input factors. The spatial Sobol' approach also identifies the land-use layer as the most influential, and shows that strong interactions occur between most of the inputs, explaining 43% of the output variability.
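
A compact pick-and-freeze estimator of a first-order Sobol' index, the quantity behind technique (1); the model and input distributions are placeholders, not AquiferSim:

```python
import numpy as np

rng = np.random.default_rng(1)
N, k = 100_000, 3

def f(x):                                  # placeholder model
    return x[:, 0] + 2.0 * x[:, 1] * x[:, 2]

A, B = rng.uniform(-1, 1, (N, k)), rng.uniform(-1, 1, (N, k))
fA, fB = f(A), f(B)
var_y = fA.var()

for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                    # vary only factor i
    S_i = np.mean(fB * (f(ABi) - fA)) / var_y   # Saltelli (2010) estimator
    print(f"S_{i + 1} = {S_i:.3f}")
```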

Keywords: sensitivity analysis, Sobol', Elementary Effects

Attachment: NathalieAccuracy2010.pdf (422.12 KB)

Comparison of the Accuracy of OpenStreetMap for Ireland with Google Maps and Bing Maps

Blazej Ciepluch1, Ricky Jacob1, Adam Winstanley1 and Peter Mooney2
1. Geotechnology Research Group, Department of Computer Science, National University of Ireland Maynooth (NUIM), Maynooth, Co. Kildare, Ireland
2. Environmental Research Center, Environmental Protection Agency, Clonskeagh, Dublin 14, Ireland
1. bciepluch@cs.nuim.ie; 2. p.mooney@epa.ie

Abstract: We describe a comparison of the accuracy of OpenStreetMap for Ireland with Google Maps and Bing Maps. Five case study cities and towns are chosen for this comparison. Each mapping system is analysed for accuracy under three main headings: spatial coverage, currency, and ground-truth positional accuracy. We find that while there is no clear winner amongst the three mapping platforms, each shows individual differences and similarities for each of the case study locations. We believe the results described in this paper are useful for those developing location-based services for countries such as Ireland, where access to high-quality geospatial data is often prohibitively expensive or made difficult by other barriers such as lack of data or access restrictions.

Keywords: OpenStreetMap, Google Maps, Bing Maps, Ireland

Attachment: CipeluchAccuracy2010.pdf (614.15 KB)

Conservative Updating of Sampling Designs

Kristina B. Helle*, Edzer Pebesma
Institute for Geoinformatics, University of Münster, Münster, Germany
*kristina.helle@uni-muenster.de

Abstract: When improving existing monitoring networks to adapt to changed requirements, keeping as many stations as possible is cheapest and therefore often preferred over a completely new setup. Here, the sampling design for ambient gamma dose monitoring in Norway is optimised. We consider two goals: equal spread of stations, and detection of plumes that affect densely populated areas. For optimisation, we compare and improve algorithms that replace the existing stations one by one: a greedy algorithm replaces the most unimportant station with the best candidate station; random replacement keeps all random improvements. A new approach is random replacement that rejects all sampling designs with too many stations moved. We add a penalty term to the cost function to favour sampling designs with few station moves. This combines the advantages of the two previous approaches: the greedy algorithm replaces only the most unimportant stations, so as many stations as possible are kept; random search can consider more candidates and is often faster. Random replacement with penalty is faster than the greedy algorithm, whereas for detection the resulting sampling designs were of the same quality: moving a station pays off with a similar improvement in cost.
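
Schematically, one greedy replacement step as described above; `cost` stands for whichever design criterion (spatial spread or plume detection) is being optimised. A sketch under those assumptions:

```python
def greedy_step(design, candidates, cost):
    """Replace the least important station with the best candidate,
    provided this lowers the design cost; otherwise keep the design."""
    best_cost, best_out, best_in = cost(design), None, None
    for station in design:
        reduced = [s for s in design if s != station]
        for cand in candidates:
            c = cost(reduced + [cand])
            if c < best_cost:
                best_cost, best_out, best_in = c, station, cand
    if best_out is not None:
        design = [s for s in design if s != best_out] + [best_in]
    return design
```

Random replacement with penalty would instead draw (station, candidate) pairs at random and add a move-count term to `cost`, which is what keeps most existing stations in place.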

Keywords: update cost, spatial sampling design, space coverage, plume detection, greedy algorithm

Attachment: HelleAccuracy2010.pdf (437.22 KB)

Countering New Challenges Regarding Classification Quality Assessment Methods with the Help of Fuzzy Boundaries

Christoph Kinkeldey and Jochen Schiewe
Lab for Geoinformatics and Geovisualization (g2lab) HafenCity University (HCU) Hamburg, Germany
{christoph.kinkeldey, jochen.schiewe}@hcu-hamburg.de

Abstract: The quality assessment of classified remote sensing data has become more challenging since the geometrical and spectral resolution of remote sensing data has increased. The project CLAIM (Classification Assessment using an Integrated Method) deals with the development of an a posteriori quality assessment method countering these challenges. We propose the consideration of uncertainties in the classification data as well as in the ground truth data (integrated method). By substituting the discrete object boundaries by symmetrical buffer areas we introduce transition zones as a model for the geometric and semantic uncertainties. A key aspect of the application is the selection of two parameters, i.e., the transition zone width and the class membership function. In this contribution we focus on the characteristics of the model parameters and how to choose appropriate values.

Keywords: classification evaluation; accuracy assessment; uncertainty; remote sensing; fuzzy logic; transition zones

Attachment: KinkeldeyAccuracy2010.pdf (619.32 KB)

DEMView: 3-D Visualization of DEM Error

Michael B. Gousie
Department of Math & Computer Science Wheaton College Massachusetts, USA
mgousie@wheatoncollege.edu

Abstract: It is well known that a digital elevation model (DEM) may contain systematic or other errors. In many 3-D visualization systems, problems in the data may be highlighted, but it is often difficult for the viewer to discern the exact nature of the problem. We present DEMView, a viewing and error assessment system specifically for use with DEMs. The system displays a DEM in 3-D with the usual translation, rotation, and zooming tools. However, the system incorporates a suite of visual (qualitative) and statistical (quantitative) assessment tools that help a researcher determine and analyze errors and uncertainty in a given DEM. A case study shows the efficacy of the system.

Keywords: DEM; error; viewer; 3-D; visualization

Attachment: GousieAccuracy2010.pdf (949.75 KB)

Data Quality and the Detection of Woody Vegetation

Elizabeth Farmer*, Karin J. Reinke, Alex M. Lechner and Simon D. Jones
School of Mathematical and Geospatial Sciences RMIT University Melbourne, Australia
* elizabeth.farmer@rmit.edu.au

Abstract: This paper presents preliminary work on the data quality of a series of woody vegetation presence maps of differing spatial resolutions. The correspondence between two differing map products is compared. The non-woody/woody map products considered were derived from medium resolution satellite imagery, including the SPOT and Landsat satellites, and produced by government agencies at a national and regional scale. These 'off-the-shelf' products are routinely utilised in environmental, conservation and landscape management. Whilst metadata and data quality statements exist for these map products, localised error in the detection of small patches of woody vegetation remains largely unquantified and poorly understood by users. This work describes and quantifies differences in mapped woody vegetation extent as a consequence of the remote sensing data product used. Localised errors are compared to landscape scale error estimates, and the implications for the delineation of small woody vegetation patches are discussed. Small patches of remnant woody vegetation have recognised ecological and conservation importance in terms of the ecosystem services they offer. Consequently, there is a distinct need to map and monitor these critical ecological structures. It is in this context that remote sensing technologies, due to their spatial coverage and synoptic view of landscape structure, are increasingly being utilised to provide assessments of woody vegetation extent. The ability of remote sensing technologies, coupled with standard processing methods, to accurately map and characterise small patches of woody vegetation needs to be understood in order for users of this information to achieve best practice in environmental and conservation management. Previous research (Farmer et al., in review) has indicated that the minimum mapping unit of remote sensing derived map products is significantly greater than a single pixel when considering small, discrete patches of woody vegetation. As a result, small patches of woody vegetation are under-represented by up to 40% in current methods of vegetation mapping. The magnitude of these localised errors and their implications for landscape scale estimates of woody vegetation cover are demonstrated to be a consequence of map product and landscape structure, in particular landscape fragmentation and the associated patch size distribution.

Keywords: Fitness for purpose, Landscape structure, Scale, Spatial resolution, Woody vegetation, Extent

Attachment: FarmerAccuracy2010.pdf (477.01 KB)

Designing a Reference Validation Database for Accuracy Assessment of Land Cover

Stephen Stehman1, Pontus Olofsson2, Curtis Woodcock2, Mark Friedl2, Adam Sibley2, Jared Newell2, Damien Sulla-Menashe2 and Martin Herold3
1. College of Environmental Science & Forestry, State University of New York, Syracuse, NY, USA
2.Department of Geography and Environment, Boston University Boston, MA, USA
3. Laboratory of Geo-Information Science and Remote Sensing, Wageningen University, Wageningen, NL
1. svstehma@syr.edu; 2. olofsson@bu.edu; 3. Martin.Herold@wur.nl

Abstract: The cost efficiency and consistency of accuracy assessments of global and regional land-cover maps would be greatly enhanced by creating a global reference validation database applicable to a variety of land-cover maps. This validation database would have at its core a common underlying rigorous sampling design and a consistent protocol for interpreting the reference (ground condition) classification against which the map classification would be compared. The basic sampling design proposed for collecting reference data is a cluster design stratified on Köppen climate classes and population density. Some features of a sample of 500 clusters selected by this design are described.

Keywords: sampling design; stratified random sampling; error matrix

Attachment: StehmanAccuracy2010.pdf (378.93 KB)

Detecting Land Cover Change When the Cover Classes are Modelled as Type 2 Fuzzy Sets

Peter Fisher
Department of Geography University of Leicester, Leicester, United Kingdom
pffl@le.ac.uk

Abstract: As an alternative to the normal Boolean set approach, fuzzy sets have been presented as a basis for the analysis of landscapes when the classes in that landscape are considered to be poorly defined or vague. A fuzzy set, however, can easily be transformed by an α-cut into a number of Boolean sets, but this is addressed by the extension into type 2 fuzzy sets, where the result of an α-cut operation is a new fuzzy set. Examining an area of the Bolivian savanna-forest ecotone, this paper shows how fuzzy change analysis can be applied to type 2 fuzzy sets, and then examines the rich potential of the resulting analysis.
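
The α-cut distinction at the heart of the argument, written out with standard definitions (textbook type-2 fuzzy set notation, not quoted from the paper):

```latex
% Type 1: an alpha-cut collapses fuzzy set A to a crisp (Boolean) set
A_\alpha = \{\, x \mid \mu_A(x) \ge \alpha \,\}
% Type 2: each x has a fuzzy grade f_x over its primary memberships J_x,
% so an alpha-cut on the secondary memberships returns a new fuzzy set
\tilde{A}_\alpha = \{\, (x, u) \mid u \in J_x,\; f_x(u) \ge \alpha \,\},
\qquad J_x \subseteq [0, 1]
```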

Keywords: fuzzy sets, type 2 fuzzy sets, land cover change

Attachment: img-921091753.pdf (405.41 KB)

Digital Chart Error Theory and Quality Assessment

Attachment: DiWuAccuracy2010.pdf (995.95 KB)

Discriminant Models for Uncertainty Characterization in Remotely Sensed Land Cover

Jingxiong Zhang*, Jiong You**, and Yunwei Tang
School of Remote Sensing and Information Engineering Wuhan University Wuhan 430079, China
*jxzhang@whu.edu.cn; **jone19851105@126.com

Abstract: Discriminant space defining area classes is an important conceptual model for uncertainty characterization in area-class maps. It needs to be adapted for use with real data sets, as the intended area classes are rarely completely and unambiguously defined by empirical data classes. This paper explores its applications in land cover mapping and land cover change analyses. Through experiments using real data sets, it was found that there are significant differences between the results obtained by referring to data classes and those obtained by referring to information classes, and that uncertainty characterization is well supported by discriminant models and geostatistics, which accommodate spatio-temporal interdependence in error occurrences and enable quantification of effects due to partially random measurement errors and systematic categorical discrepancy, respectively.

Keywords: uncertainty, area classes, discriminant space, geostatistics, land cover change

AttachmentSize
ZhangAccuracy2010.pdf528.5 KB

Downscaling the Character of Spatial Variation

Peter M. Atkinson and Jeganathan Chockalingam
School of Geography, University of Southampton, Southampton, UK
pma@soton.ac.uk

Abstract: The study attempts to simulate the spatial and spectral patterns realised in a fine spatial resolution (5 m) image using the spectral variability observed in a coarse spatial resolution (30 m) image. Two approaches were attempted: (i) inverse modelling using the punctual variogram (PUNC) and (ii) a multi-resolution lookup space (MRLS). The lookup space approach yielded a reliable mechanism for predicting fine spatial resolution patterns for patches or features where the loss of information was not extreme, and multi-resolution lookup space downscaling proved to be a powerful tool for scaling the character of spatial variation. The problem identified in this paper relates to features whose relation between scales is unusual relative to the majority of the image and could not be differentiated using some additional dimension.

Keywords: local variogram; lookup space; Euclidean distance; convolution; deconvolution.

AttachmentSize
AtkinsonAccuracy2010.pdf959.69 KB

Error Propagation in Space-time Prisms

Harvey J. Miller1, Tetsuo Kobayashi1 and Walied Othman2
1.Department of Geography University of Utah, Salt Lake City, Utah, USA
2.Theoretical Computer Science Group, Hasselt University, Diepenbeek, Belgium
harvey.miller@geog.utah.edu; tetsuo.kobayashi@geog.utah.edu; walied.othman@uhasselt.be

Abstract: The space-time prism demarcates all locations in space that a mobile object or person can occupy during an episode of potential or unobserved movement. The prism is a central concept in time geography as a measure of accessibility, and in mobile object databases as a measure of object location possibilities given sampling error. This paper develops an analytical approach to assessing error propagation in space-time prisms and prism-prism intersections. We analyze the geometry of the prisms to derive a core set of geometric problems involving the intersection of circles and ellipses. Analytical error propagation techniques such as the Taylor linearization method based on first-order partial derivatives are not available, since explicit functions describing the intersections and their derivatives are unwieldy. However, since we have implicit functions describing the core geometry, we modify this approach using an implicit function theorem that provides the required first-order partials without the explicit expressions.

Keywords: mobile objects, space-time prism, error propagation, implicit function theorem

AttachmentSize
MillerAccuracy2010.pdf526.79 KB

Error Propagation in the Fusion of Multi-source and Multi-scale Spatial Information

Yee Leung1, Jiang-Hong Ma2, Tung Fung1
1.Department of Geography and Resource Management The Chinese University of Hong Kong Shatin, Hong Kong;
2. Department of Mathematics and Information Science Chang'an University Xi'an, China;

1.{yeeleung; tungfung}@cuhk.edu.hk; 2. jhma@chd.edu.cn

Abstract: This paper proposes an optimization model for error propagation in the fusion of multi-source and multi-scale spatial information, as commonly encountered in the analysis of heterogeneous spatial data. The proposed method is mathematically simple and practical, and can markedly reduce the error variance of the fused data. Different cases of multi-source and multi-scale measurements are explored and a theoretical analysis of the method is established. A simulation study supports the theoretical arguments. The model paves the way for further analysis of uncertainty in the fusion of multi-source and multi-scale data.

Keywords: information fusion, multi-source, multi-scale, error propagation
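
The paper's optimization model is not reproduced here, but the baseline that any such fusion scheme builds on can be sketched: inverse-variance weighting of independent measurements of the same quantity, for which the fused variance never exceeds the smallest input variance (all numbers invented):

import numpy as np

def fuse(estimates, variances):
    """Minimum-variance linear fusion of independent measurements:
    weights proportional to 1/variance."""
    estimates = np.asarray(estimates, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)
    fused = np.sum(w * estimates) / np.sum(w)
    fused_var = 1.0 / np.sum(w)          # always <= min(variances)
    return fused, fused_var

# Three sources observing the same quantity with different accuracies.
value, var = fuse([10.2, 9.8, 10.5], [0.5, 0.2, 1.0])
print(value, var)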

 

AttachmentSize
LeungAccuracy2010.pdf384.24 KB

Estimation of Digitizing and Polygonal Approximation Errors in the Computation of Length in Vector Databases

Jean-Francois Girres1 and Patrick Julien2
1. COGIT Laboratory, Institut Géographique National, Saint-Mandé, France
2. MATIS Laboratory, Institut Géographique National, Saint-Mandé, France
1. jean-francois.girres@ign.fr; 2. patrick.julien@ign.fr

Abstract: In vector databases, errors generally result from several causes. This research aims to identify the different components of error in order to estimate their impact on basic measurements (length, area). This article focuses on two sources of error affecting length computation in linear vector databases: digitizing error and polygonal approximation. The impact of these sources of error on length computation is expressed analytically and illustrated by experiments on the road network of a French topographic database. Results show that the proportion of digitizing error decreases as the total length measured increases.

Keywords: errors; length computation; vector databases; accuracy
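
A hedged sketch of how digitizing error can be propagated into a length measurement by Monte Carlo simulation; the polyline, the 2.5 m standard deviation and the assumption of independent vertex errors are invented for illustration, whereas the paper derives the impact analytically:

import numpy as np

rng = np.random.default_rng(0)

def length(xy):
    """Total length of a polyline given as an (n, 2) vertex array."""
    return np.sum(np.hypot(*np.diff(xy, axis=0).T))

# Hypothetical digitized road: 50 vertices along a gentle curve (metres).
t = np.linspace(0, 1000, 50)
line = np.column_stack([t, 50 * np.sin(t / 200)])

sigma = 2.5  # assumed digitizing standard deviation per coordinate (m)
sims = [length(line + rng.normal(0, sigma, line.shape)) for _ in range(2000)]
print(f"true {length(line):.1f} m, "
      f"simulated mean {np.mean(sims):.1f} m, std {np.std(sims):.2f} m")

Note the upward bias of the simulated mean: random vertex displacement tends to lengthen a line, and its relative effect shrinks as the total length grows, consistent with the result reported above.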

AttachmentSize
GirresAccuracy2010.pdf354.92 KB

Extracting Spatial Decision Rules Using Rough Fuzzy Sets

Hexiang Bai and Yong Ge*
State Key Laboratory of Resources and Environmental Information System, Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Science, Beijing, China
*gey@lreis.ac.cn

Abstract: Rough sets and fuzzy sets have been widely used in spatial analysis. Classical rough set theory cannot handle fuzzy uncertainty, whereas a rough fuzzy set can deal with the case where the conditional attributes are crisp and the decision attributes are fuzzy concepts in the decision information system. This type of decision information system is called a fuzzy decision information system, and rough fuzzy sets are specialized for analyzing it. Based on the rough fuzzy set, this paper proposes a new method for extracting spatial fuzzy decision rules. The method first converts sample data into a fuzzy decision information system from which spatial fuzzy decision rules can be extracted. The conditional attributes in the fuzzy decision information system are then discretized. Third, the fuzzy decision information system is reduced, and the selected conditional attributes are used to extract fuzzy decision rules from the discretized fuzzy decision information system. Finally, these rules can be used to predict spatial objects with no fuzzy decisions. The general process of this method is demonstrated by an example.

Keywords: Rough fuzzy set; spatial fuzzy decision rule; spatial data analysis component

AttachmentSize
BaiAccuracy2010.pdf356.34 KB

Geostatistical Analysis of Multiple Correlated Variables from Salt Weathering Simulations

Jennifer McKinley1, Stephen McCabe1, Antoinette Keaney1 and Joanne Curran2
1. School of Geography, Archaeology and Palaeoecology Queen's University Belfast Belfast, Northern Ireland
2. Stone Conservation Services Consarc Design Group Belfast, Northern Ireland

1.{j.mckinley, stephen.mccabe, akeaney04}@qub.ac.uk; 2. joanne.curran@consarc-design.co.uk

Abstract: Geostatistical analysis is used to investigate the spatial relationships between the trigger factors of stone decay in salt weathering and their effect on salt penetration, and to quantify the associated spatial uncertainty. Interactions between the rock properties of permeability, porosity and mineralogy have implications for moisture movement and for salt input, output and storage. The weathering simulation in this study was designed to simulate pre-loading of salt during a wet winter followed by complete drying out in summer. Non-destructive (probe permeametry) and destructive techniques were applied to the weathered block. This provided a spatial dataset of several variables for geostatistical analysis and the potential to quantify the degree of spatial correlation between variables to characterise the salt-weathered block. The linear model of coregionalisation (LMC) was used to simultaneously model the direct and cross variograms for subsequent kriging and simulation. The results showed the strongest spatial cross correlation between permeability data and salt (NaCl) data at a depth of 4-6 cm in the salt-loaded sandstone block and suggest that, initially, salt concentration in the near-surface zone decreases permeability. However, continual salt wetting and alternate heating increase permeability, enabling salt and moisture to move more effectively through the stone.

Keywords: salt weathering; spatial cross correlation; cross variograms; permeability

AttachmentSize
MckinleyAccuracy2010.pdf619.32 KB

Geostatistical Modeling Using Non-gaussian Copulas

Hannes Kazianka1 and Jürgen Pilz2
1.Department of Business Mathematics, Vienna University of Technology, Vienna, Austria
2.Department of Statistics, Alpen-Adria University of Klagenfurt, Klagenfurt, Austria
Hannes.kazianka@tuwien.ac.at; juergen.pilz@uni-klu.ac.at

Abstract: Copula-based spatial models have recently attracted much attention and are used as a flexible tool for spatial interpolation. For computational reasons, in most applications only the radially symmetric Gaussian copula is employed. However, radial asymmetry is a property often observed in environmental data, i.e. high values of the data have a stronger spatial dependence than low values. This paper presents a case study where radiological measurements have been taken in the region of Gomel, near Chernobyl. We show that copula models based on the radially asymmetric chi-squared copula outperform Gaussian copula models and classical interpolation methods like ordinary kriging.

Keywords: copula; spatial interpolation; radial asymmetry; Bayesian prediction; predictive distribution

AttachmentSize
KaziankaAccuracy2010.pdf586.48 KB

Geostatistical Regression for Areal Data

Phaedon C. Kyriakidis1 and Nicholas N. Nagle2
1. Department of Geography University of California Santa Barbara, CA, U.S.A.
2. Department of Geography University of Tennessee Knoxville, TN, U.S.A.

1.phaedon@geog.ucsb.edu; 2.nnagle@utk.edu

Abstract: This paper presents a geostatistical approach for linear regression with areal data, applicable when such data are defined as aggregates of point-level attribute values within regular or irregular areal units (pixels or polygons), and when Gaussian assumptions are made for the regression model disturbances or errors. The distinctive features of the proposed geostatistical regression approach as compared to the more common spatial econometrics approach for linear regression involving lattice data are highlighted, and recommendations are provided for choosing between the two approaches.

Keywords: spatial aggregation; spatial autocorrelation; Kriging; spatial econometrics; Modifiable Area Unit Problem; Ecological Inference Problem

AttachmentSize
KyriakidisAccuracy2010.pdf440.63 KB

Geostatistical Tools for National-scale Soil Monitoring

Ben P. Marchant1, R. Murray Lark1, Nicolas P.A. Saby2, Claudy C. Jolivet2 and Dominique Arrouays2
1.Rothamsted Research, Harpenden, UK
2.Infosol, INRA, Orléans, France
Ben.marchant@bbsrc.ac.uk

Abstract: National-scale soil datasets exhibit variation over widely disparate spatial scales. This includes variation because of localised anomalous processes, such as point-source pollution or geological anomalies, which lead to outliers within the dataset. Conventional geostatistical methods are not suited to the analysis of datasets which include outliers. We demonstrate that robust geostatistical methods can be used to predict the underlying variation of a soil property in the presence of outliers, whereas copula-based models are appropriate if the outliers are to be included in the predictions.

Keywords: Soil monitoring, robust geostatistics, copulas

AttachmentSize
MarchantAccuracy2010.pdf456.43 KB

Handling Positional Uncertainty in a Real-time Bus Tracking System

Bashir Shalaik and Adam Winstanley
Computer Science Department, National University of Ireland, Maynooth, Kildare
{bsalaik, adam.winstanley}@cs.nuim.ie

Abstract: Automatic Vehicle Location (AVL) systems are increasingly being used by transit agencies for the real-time monitoring of their vehicles. AVL systems can be used to improve the service given to passengers by using information on the current position of buses to maintain headways, to increase reliability through improved operational control, and to provide an estimate of the arrival time of the next bus at a stop. In real-time bus tracking systems, some positional uncertainty is usually associated with the location of buses in service that are tracked using a locational device such as a Global Positioning System (GPS) receiver. Rather than raw coordinates, the location is usually better understood in terms of the landmarks along the route, particularly the named stop. Three prediction models have been implemented to estimate the location of vehicles on bus routes. Analysis indicates that a model based on mining historical data for patterns gives more accurate results than regression and Kalman filter models when travel is disrupted by one-off events.

Keywords: AVL; GPS; APC; GPRS; real-time bus tracking

AttachmentSize
ShalaikAccuracy2010.pdf387.98 KB

Handling and Communicating Uncertainty in Chained Geospatial Web Services

Richard Jones*, Lucy Bastin, Dan Cornford, Matthew Williams
Knowledge Engineering Group, Aston University, Birmingham, United Kingdom
*jonesrml@aston.ac.uk

Abstract: Recent developments in service-oriented and distributed computing have created exciting opportunities for the integration of models in service chains to create the Model Web. This offers the potential for orchestrating web data and processing services in complex chains: a flexible approach which exploits the increased access to products and tools, and the scalability offered by the Web. However, the uncertainty inherent in data and models must be quantified and communicated in an interoperable way in order for its effects to be assessed as errors propagate through complex automated model chains. We describe a proposed set of tools for handling, characterizing and communicating uncertainty in this context, and show how they can be used to 'uncertainty-enable' Web Services in a model chain. An example implementation is presented, which combines environmental and publicly-contributed data to produce estimates of sea-level air pressure, with estimates of uncertainty which incorporate the effects of model approximation as well as the uncertainty inherent in the observational and derived data.

Keywords: uncertainty propagation, probability, web services

AttachmentSize
JonesAccuracy2010.pdf578.23 KB

Hydrological Model Hypothesis Testing Using Imprecise Spatial Flux Measurements

Tobias Krueger1, Jim Freer2, John N. Quinton3, Philip M. Haygarth3, Christopher J. A. Macleod4, Jane M. B. Hawkins4, Gary S. Bilotta5, Richard E. Brazier6
1. School of Environmental Sciences, University of East Anglia, Norwich, UK
2. School of Geographical Sciences, University of Bristol, Bristol, UK
3. Lancaster Environment Centre, Lancaster University, Lancaster, UK
4. Cross Institute Programme for Sustainable Soil Function, North Wyke Research, Okehampton, UK
5. School of Environment and Technology, University of Brighton, Brighton, UK
6. School of Geography, University of Exeter, Exeter, UK
1. t.krueger@uea.ac.uk; 2. jim.freer@bristol.ac.uk; 3. {j.quinton, p.haygarth}@lancaster.ac.uk;
4. {kit.macleod, jane.hawkins}@bbsrc.ac.uk; 5. g.s.bilotta@brighton.ac.uk; 6. r.e.brazier@ex.ac.uk

Abstract: The semi-distributed conceptual Dynamic Topmodel is set up to model runoff generation in the headwater catchment of Den Brook in Devon, UK. The model is tested for its spatial consistency using discharge measurements made at four nested locations, within the extended Generalised Likelihood Uncertainty Estimation (GLUE) framework, accounting explicitly for parameter and observational uncertainties. It is shown how model simulations deemed behavioural at the catchment outlet can be composed of highly unrealistic internal fluxes. It follows that a catchment model's spatial predictions should not be assumed reasonable unless tested against observations. A close integration of model development and field experimentation is advocated as part of an iterative learning framework for catchment hydrology.

Keywords: data uncertainty; catchment topology; Dynamic Topmodel; extended GLUE
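
The core GLUE loop referred to above can be sketched as follows; the toy linear-store model, prior parameter range, informal likelihood and behavioural threshold are invented stand-ins, not the Dynamic Topmodel setup:

import numpy as np

rng = np.random.default_rng(1)

# Stand-in 'hydrological model': discharge = k * driver series.
def toy_model(k, drivers):
    return k * drivers

drivers = np.array([1.0, 1.5, 2.0, 1.4])
observed = np.array([2.0, 3.1, 4.2, 2.8])

n = 5000
k_samples = rng.uniform(0.5, 4.0, n)              # assumed prior range
sim = np.array([toy_model(k, drivers) for k in k_samples])
# Informal likelihood: inverse mean squared error, a common GLUE choice.
lik = 1.0 / np.mean((sim - observed) ** 2, axis=1)
behavioural = k_samples[lik > np.quantile(lik, 0.95)]  # keep best 5%
print("behavioural k range:", behavioural.min(), behavioural.max())

The paper's point is that this filtering must also be applied at the internal (nested) gauges: a parameter set retained at the outlet can still produce unrealistic internal fluxes.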

AttachmentSize
KruegerAccuracy2010.pdf483.47 KB

IDW Implementation in Map Calculus and Commercial GIS Products

Zhiwei Cao, Mordechai (Muki) Haklay and Tao Cheng
Department of Geomatic Engineering, University College London (UCL), Gower Street, London, WC1E 6BT, UK
{uceszwc, m.haklay, tao.cheng}@ucl.ac.uk

Abstract: The aim of this paper is to explore the development of spatial analysis functions in Map Calculus by comparing the implementation of the IDW (Inverse Distance Weighting) interpolation method in Map Calculus and in commercial GIS (Geographic Information System) software. In contrast to traditional GIS products, Map Calculus is based on the Lambda Calculus and Functional Programming (Haklay 2004), and computation therefore occurs only at the end of the analysis. Investigating IDW implementation across these fundamentally different system designs provides a first step towards the development of Map Calculus.

Keywords: GIS; IDW; Map Calculus; Lambda Calculus; Data Accuracy
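
For reference, a minimal IDW implementation of the kind compared in the paper, followed by a functional-style variant that, in the spirit of Map Calculus, returns a surface function whose evaluation is deferred (an assumed illustration, not the authors' code):

import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse Distance Weighting: weights are 1/d**power; an exact
    coincidence with a data point returns that point's value."""
    d = np.hypot(xy_query[0] - xy_known[:, 0], xy_query[1] - xy_known[:, 1])
    if np.any(d == 0):
        return values[np.argmin(d)]
    w = 1.0 / d ** power
    return np.sum(w * values) / np.sum(w)

pts = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
z = np.array([1.0, 5.0, 9.0])
print(idw(pts, z, (4.0, 4.0)))          # eager, GIS-style evaluation

# Map Calculus style: the 'layer' is a function; nothing is computed
# until the surface is actually queried at the end of the analysis.
surface = lambda x, y: idw(pts, z, (x, y))
print(surface(4.0, 4.0))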

AttachmentSize
CaoAccuracy2010.pdf666.73 KB

Identification of Optimal Spatial Resolution with Local Variance, Semivariogram and Wavelet Method – Case Studying Typical Landscape of Mountainous Area in Southeast China

Qiu Bingwen*, Sui Yinbo, Chen Chongcheng and Tu Xiaoyang
Key Laboratory of Spatial Data Mining and Information Sharing of Ministry of Education, Fuzhou University,Fuzhou 350002, Fujian Province, China
*qiubingwen@fzu.edu.cn

Abstract: Understanding the spatial structure of images and selecting an optimal spatial resolution have always been important issues in remote sensing applications. The prime objective of this study was to compare the optimal spatial resolutions for measuring a typical landscape of a mountainous area in Southeast China, as detected with local variance, the semivariogram and wavelet analysis. The first principal component of a SPOT image was used. The local variance results indicated optimal spatial resolutions of around 10-20 m, 30-40 m and 50-80 m for urban, agricultural and forest landscapes respectively; the wavelet transform gave 40 m, 80-160 m and 160 m respectively; and the semivariogram, fitted with exponential and spherical models, indicated that the values should be at most 160 m, 328 m and 309 m respectively. Although the values gained from variogram modelling fall within the bounds of the values derived from wavelet analysis, the values obtained from local variance are relatively small. The optimal spatial resolution obtained can vary with the methods and data source chosen, and with the prime objects of interest in the study area.

Keywords: Optimal spatial resolution; Landscape; semivariogram; wavelet transform; local variance
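
A sketch of the local-variance method named above: aggregate an image to successively coarser pixel sizes and track the mean local variance in a small moving window; the pixel size at which it peaks indicates the dominant scale of spatial structure. The synthetic scene, window size and aggregation factors are invented, and scipy is assumed available:

import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(2)

def mean_local_variance(img, w=3):
    """Mean variance within a w x w moving window."""
    m = uniform_filter(img, w)
    m2 = uniform_filter(img ** 2, w)
    return float(np.mean(m2 - m ** 2))

def aggregate(img, f):
    """Block-average by factor f, i.e. coarsen the spatial resolution."""
    h, w = (img.shape[0] // f) * f, (img.shape[1] // f) * f
    return img[:h, :w].reshape(h // f, f, w // f, f).mean(axis=(1, 3))

# Synthetic scene with ~8-pixel patches standing in for landscape objects.
img = uniform_filter(rng.normal(size=(256, 256)), 8)
for f in (1, 2, 4, 8, 16):
    print(f, mean_local_variance(aggregate(img, f)))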

AttachmentSize
QiubingwenAccuracy2010.pdf516.92 KB

Identification of Priority Zones for Conservation Considering Water Resources in the Alto Paraguai Watershed Region

Carlos Antonio Oliveira Vieira1, Leonardo Campos de Assis1, Michael Becker2
1.Federal University of Vicosa - Civil Engineering Dept., Surveying Engineering Sector, Vicosa-MG, Brazil
2. World Wildlife Fund – WWF, Brasilia - DF, Brazil

1.{carlos.vieira, leonardo.assis}@ufv.br; 2.michael@wwf.org.br

Abstract: Due to the environmental importance of the Pantanal (a wetland region in western Brazil) and the constant alterations occurring in the region as a result of natural-resource exploitation, this work set out to use multi-criteria analysis to define priority zones for conservation in the Alto Paraguai River Basin. Separate scenarios were constructed for anthropic (human activity) use, water resources and natural soil conservation, using criteria associated with elevation, slope face orientation, degree of slope, morphology, accumulated flow, distance from urban centres, distance from road and rail systems, soil type, vegetation cover, NDVI and average annual rainfall accumulation. The priority zones were obtained by combining the three scenarios and reclassifying the values into five conservation priority categories: very low, low, medium, high and very high. The results show that the highland areas inside the basin, where the headwaters are located, are the top priority zones for conservation, as they were defined taking into account anthropic activity, water resources and natural soil conservation.

Keywords: multi-criteria analysis, geographic information system, conservation areas

AttachmentSize
VieriaAccuracy2010.pdf547.51 KB

Identifying Fuzzy Land Use from Network and Graph Theoretical Approaches

Alexis Comber1, Masahiro Umezaki2 and Chris Brunsdon1
1. Department of Geography, University of Leicester, Leicester, LE1 7RH, UK
2. Department of Human Ecology, Tokyo University, Tokyo, Japan

1.{ajc36, cb179}@le.ac.uk; 2.umezaki@humeco.m.u-tokyo.ac.jp

Abstract: This paper describes the application of network science community finding techniques to networks of land cover objects in order to identify areas of land use. The results show that areas of homogeneous land use can be identified and fuzzy land use determined using training data generated from a linear discriminant analysis. Further work will explore the application of different parameters and attributes to community identification algorithms.

Keywords: network, graph, communities, land use, land cover

AttachmentSize
AlexisComerAccuracy2010.pdf626.19 KB

Impacts of Error on the Predicted Pattern of Change in a Post-Classification Change Analysis

Amy C. Burnicki

Department of Geography, University of Wisconsin-Madison,Madison, WI, USA
burnicki@wisc.edu

Abstract: One of the most common uses of time-series classified imagery is the monitoring of changes in land-cover composition and structure over time. A common approach to mapping changes in land cover is post-classification change detection. The accuracy of a post-classified map of change depends directly on the patterns of error associated with the time-series land-cover maps. This research examined the impact of classification error on the spatial pattern of change observed in a change map. A series of land-cover-change maps were produced using a simulation approach that controlled: (1) the amount and pattern of change occurring between the time-1 and time-2 classified maps; and (2) the amount and pattern of classification error associated with the time-1 and time-2 classified maps. Both error-free and error-perturbed maps of change were produced and compared using landscape pattern indices. Results showed an increase in fragmentation within the land-cover-change classes (e.g., increases in the number of land-cover-change patches, total edge and shape complexity) under all error conditions. Fragmentation was greatest when the spatial autocorrelation of the change class increased. Regardless of the pattern of change considered, errors associated with classified maps significantly altered the pattern of change simulated in the post-classification change analysis.

Keywords: land-cover change; simulation; error propagation; landscape metrics
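
Post-classification change detection itself reduces to a cross-tabulation of the two classified maps, which makes the effect studied above easy to reproduce in miniature: a few percent of random label error in either map inflates the apparent change (toy maps and error rate invented):

import numpy as np

rng = np.random.default_rng(3)

def change_matrix(t1, t2, n_classes):
    """Cross-tabulate time-1 against time-2 class labels."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    np.add.at(m, (t1.ravel(), t2.ravel()), 1)
    return m

t1 = rng.integers(0, 3, (100, 100))      # time-1 map, 3 classes
t2 = t1.copy()
t2[:20] = (t2[:20] + 1) % 3              # genuine change in one region
err = rng.random(t2.shape) < 0.05        # 5% random classification error
t2[err] = rng.integers(0, 3, err.sum())

m = change_matrix(t1, t2, 3)
print(m)
print("apparent change fraction:", 1 - np.trace(m) / m.sum())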

AttachmentSize
BurnickiAccuracy2010.pdf602.25 KB

Improvement of the Accuracy of the Image Classification Process through Incorporation of Contextual Information

Leonardo Campos de Assis, Carlos Antonio Oliveira Vieira and Fabyano Fonseca Silva
Federal University of Vicosa Vicosa-MG, Brazil
{leonardo.assis,carlos.vieira,fabyanofonseca}@ufv.br

Abstract: The present study used contextual information modelled through Bayesian inference to improve image classification in urban areas, with the objective of producing a vegetation map of the Belo Horizonte municipal district (capital of Minas Gerais state, Brazil). Contextual information was input to the classification process through ancillary data and specialist knowledge about land cover types. Accuracy assessment shows that, despite the volume of data generated (one image of probability values for each vegetative category), the proposed method improved image classification accuracy and was more efficient than a conventional Gaussian maximum likelihood estimator.

Keywords: contextual information; image processing; Bayesian Inference.

AttachmentSize
AssisAccuracy2010.pdf603.89 KB

Improving Image Classification Accuracy: a Method to Incorporate Uncertainty in the Selection of the Training Sample Set

Luisa M S Goncalves1, Cidalia C Fonte2, Hugo Carrao3 and Mario Caetano3
1.Polytechnic Institute of Leiria Department of Civil Engineering, Portugal Institute for Systems and Computers Engineering at Coimbra (INESC Coimbra) Coimbra, Portugal
2.Institute for Systems and Computers Engineering at Coimbra (INESC Coimbra) Coimbra, Portugal Department of Mathematics - University of Coimbra Coimbra, Portugal
3.Remote Sensing Unit (RSU) Portuguese Geographic Institute (IGP) Lisboa, Portugal Research Centre for Statistics and Information Management (CEGI), Institute for Statistics and Information Management (ISEGI),Universidade Nova de Lisboa, Campus de Campolide Lisboa, Portugal

1.luisa.goncalves@ipleiria.pt; 2. cfonte@mat.uc.pt; 3. {hugo.carrao, mario.caetano}@igeo.pt

Abstract: The automatic production of land cover maps using multispectral remote sensing images requires the use of learning classifiers for mapping the imagery data into a set of discrete classes. A group of classifiers commonly used are the supervised classifiers. The first stage of a supervised classification consists of identifying training areas in the satellite image for each class, which are then used as descriptors of the spectral characteristics of the different classes. The classification results are therefore influenced by the sample pixels selected as training sets. This paper proposes an automatic method to assist the selection of training samples for mapping land cover from satellite images with the aid of ancillary information, namely older or contemporaneous maps of lower spatial resolution, the Normalized Difference Vegetation Index, and information provided by the classification uncertainty. It is shown that more accurate outputs may be derived with this methodology, and some conclusions are drawn.

Keywords: training samples, soft classification, measures of uncertainty, classification accuracy

AttachmentSize
Concalves.pdf970.48 KB

Incorporating Uncertainty in the Accuracy Assessment of Land Cover Maps Using Fuzzy Numbers and Fuzzy Arithmetic

Pedro Sarmento, Hugo Carrão, Mario Caetano*, Cidália C. Fonte** and Nuno Cortês
Centro de Estatística e Gestão de Informação (CEGI), Lisbon, Portugal
Instituto Superior de Estatística e Gestão de Informação (ISEGI), Lisbon, Portugal
Remote Sensing Unit (RSU), Portuguese Geographic Institute (IGP), Lisbon, Portugal
Institute for Systems and Computer Engineering at Coimbra; Department of Mathematics, University of Coimbra, Coimbra, Portugal
*Mario.caetano@igeo.pt; **cfonte@mat.uc.pt

Abstract: This paper describes an effort to include uncertainty in the reference databases used to assess the accuracy of land cover maps. Five linguistic levels of confidence in land cover labelling are assigned to each sample observation and converted into fuzzy numbers. This information is introduced into a fuzzy confusion matrix, and fuzzy accuracy measures, analogous to the global, user's and producer's accuracies, are then derived from the fuzzy confusion matrix using fuzzy arithmetic. These measures consist of fuzzy numbers that incorporate the uncertainty in identifying the reference land cover class of the sample data. The fuzzy accuracy measures can be defuzzified to generate real numbers, enabling conversion into crisp measures and comparison with the accuracy results obtained with traditional confusion matrices. The proposed methodology is tested on a case study: the quality of a map of Continental Portugal, derived from the automatic classification of MERIS images, is evaluated using a reference database generated with the proposed methodology.

Keywords: land cover maps; accuracy assessment; reference database uncertainty; fuzzy numbers; fuzzy arithmetic
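
A sketch of the fuzzy-arithmetic step: represent each sample's agreement as a triangular fuzzy number derived from a linguistic confidence level, sum the fuzzy numbers, divide by the sample size to obtain a fuzzy overall accuracy, and defuzzify for comparison with crisp measures. The confidence-to-fuzzy-number mapping and sample labels below are invented:

import numpy as np

# Triangular fuzzy number represented as (low, mode, high); sums of
# triangular fuzzy numbers add component-wise.
def tfn_sum(tfns):
    return tuple(np.sum(tfns, axis=0))

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number."""
    return sum(tfn) / 3.0

# Hypothetical linguistic confidence -> fuzzy agreement of one sample.
conf_to_tfn = {"sure": (1.0, 1.0, 1.0),
               "likely": (0.6, 0.8, 1.0),
               "doubtful": (0.2, 0.5, 0.8)}
samples = ["sure", "sure", "likely", "doubtful", "sure"]

low, mode, high = tfn_sum([conf_to_tfn[s] for s in samples])
n = len(samples)
fuzzy_oa = (low / n, mode / n, high / n)   # fuzzy overall accuracy
print(fuzzy_oa, defuzzify(fuzzy_oa))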

AttachmentSize
img-921091746.pdf534.35 KB

Increasing the Accuracy of Digital Elevation Models by Means of Geostatistical Conflation

Cutberto U. Paredes-Hernandez, Nicholas J. Tate, Kevin J. Tansey, Peter F. Fisher
Department of Geography, University of Leicester, Leicester, UK
{cup1, njt9, kjt7, pff1}@le.ac.uk

Abstract: In this paper we compare the use of two geostatistical conflation techniques, Ordinary Cokriging (OCK) and Kriging with an External Drift (KED), to increase the accuracy of a Digital Elevation Model using a set of sparsely distributed accurate Ground Control Points. Our results show that both conflation techniques produce more accurate DEMs than any of the data sources used individually, with KED producing the most accurate results.

Keywords: DEM Accuracy, Geostatistical Conflation, Cokriging, Kriging with an external drift.

AttachmentSize
img-X03141145.pdf610.8 KB

Joint Space-time Modeling of West Nile Virus Vector Mosquito Abundance

Eun-Hye [Enki] Yoo
Department of Geography University at Buffalo, The State University of New York Buffalo, NY, USA
eunhye@buffalo.edu

Abstract: Mosquitoes are involved in the transmission of West Nile virus; their abundance, which depends on weather conditions and the physical environment, is therefore closely linked to major disease outbreaks. The objective of this paper is to provide a geostatistical tool for predicting mosquito abundance, based on the established relationship between mosquito abundance and meteorological and environmental factors, as well as on stochastic spatiotemporal patterns of mosquito abundance informed by observed mosquito count data. The proposed geostatistical model allows the joint space-time uncertainty of mosquito abundance predictions at unsampled locations during a week of the trapping season to be assessed.

Keywords: West Nile Virus (WNv); Joint space-time modeling; Simple cokriging; Uncertainty assessment

AttachmentSize
YooAccuracy2010.pdf318.76 KB

Land Suitability Analysis Comparing Boolean Logic with Fuzzy Analytic Hierarchy Process

Mukhtar Elaalem, Alexis Comber, and Peter Fisher
Department of Geography University of Leicester Leicester, United Kingdom
{me84, ajc36, pff1}@le.ac.uk

Abstract: Sustainable land management in agriculture is a very complex and challenging concept, especially in developing countries. Optimal use of land is very important in the context of rapid population growth and urban expansion, which make land available for agriculture a relatively scarce commodity. Agricultural land management planning seeks to identify the most beneficial land uses whilst improving and conserving land resources for the future. This paper compares land suitability models for barley based on Boolean logic and on the Fuzzy Analytic Hierarchy Process (FAHP) for a test area within the north-western region of the Jeffara Plain of Libya. The logic of the different models is explored in relation to land qualities and land characteristics and their influence on the model outputs. The FAHP models are based on standardizing the land characteristics using different fuzzy set models and applying pairwise comparisons for criteria weighting to derive a land suitability map for barley. Land suitability results for barley were derived using both the FAHP and Boolean methodologies, and an error matrix and analysis of the results are presented using two assessment techniques: overall accuracy and the KHAT statistic.

Keywords: sustainability; land evaluation; pairwise comparisons; Fuzzy Analytical Hierarchy Process; Boolean evaluation.
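
The pairwise-comparison weighting that underlies FAHP can be sketched with the standard crisp AHP computation: derive criterion weights from the principal eigenvector of a Saaty-scale comparison matrix and check its consistency (the comparison values are invented; the paper's fuzzy variant extends this step with fuzzy numbers):

import numpy as np

# Hypothetical pairwise comparisons (Saaty 1-9 scale) for three land
# characteristics, e.g. soil depth vs. slope vs. salinity.
A = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   2.0],
              [1/5.0, 1/2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                    # normalised weights
ci = (eigvals[k].real - len(A)) / (len(A) - 1)  # consistency index
print("weights:", w, "CI:", ci)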

AttachmentSize
ElaalemAccuracy2010.pdf387.08 KB

Land-cover Class as a Qualifier for Quoted Elevation Error in Aerial LiDAR

Seamus Coveney
National Centre for Geocomputation, NUI Maynooth, Ireland
seamus.coveney@nuim.ie

AttachmentSize
CoveneyAccuracy2010.pdf321.8 KB

Latin Hypercube Sampling of Gaussian Random Fields for Sobol' Global Sensitivity Analysis of Models with Spatial Inputs and Scalar Outputs

Nathalie Saint-Geours1, Jean-Stephane Bailly1, Frederic Grelot2, Christian Lavergne3
1. AgroParisTech, UMR TETIS, F-34093 Montpellier, France
2. Cemagref, UMR G-EAU, F-34093 Montpellier, France
3. Mathematics and Modelling Institute of Montpellier, Montpellier 3 University, Montpellier, France
1.{saintge, bailly}@teledetection.fr; 2. frederic.grelot@cemagref.fr; 3. Christian.lavergne@univ-montp3.fr

Abstract: The variance-based Sobol' approach is one of the few global sensitivity analysis methods that is suitable for complex models with spatially distributed inputs. Yet it needs a large number of model runs to compute sensitivity indices: in the case of models where some inputs are 2D Gaussian random fields, it is of great importance to generate a relatively small set of map realizations capturing most of the variability of the spatial inputs. The purpose of this paper is to discuss the use of Latin Hypercube Sampling (LHS) of geostatistical simulations to reach better efficiency in the computation of Sobol' sensitivity indices on spatial models. Sensitivity indices are estimated on a simple analytical model with a spatial input, for increasing sample size, using either Simple Random Sampling (SRS) or LHS to generate input map realizations. Results show that using LHS rather than SRS yields sensitivity indices estimates which are slightly more precise (smaller variance), with no significant improvement of bias.

Keywords: global sensitivity analysis; Latin Hypercube Sampling; Gaussian random field; unconditional simulation
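
A crude sketch of the idea of Latin Hypercube Sampling over map realizations: generate a large pool of random-field realizations, rank them by a scalar summary, and retain one realization per equal-probability stratum so that the retained maps span the variability of the spatial input. The field generator (smoothed white noise) and the summary statistic are simplified stand-ins, not the authors' geostatistical simulation:

import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(4)

def pseudo_field(shape=(64, 64), corr=5.0):
    """Cheap stand-in for a Gaussian random field realization."""
    return gaussian_filter(rng.normal(size=shape), corr)

pool = [pseudo_field() for _ in range(200)]
summary = np.array([f.mean() for f in pool])   # scalar ranking statistic

n_keep = 10
order = np.argsort(summary)
strata = np.array_split(order, n_keep)         # equal-probability strata
kept = [pool[int(rng.choice(s))] for s in strata]  # one per stratum
print([round(float(f.mean()), 4) for f in kept])   # spans the summary range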

AttachmentSize
GeoursAccuracy2010.pdf517.96 KB

Locally Accurate Prediction Standard Errors with Spatially Varying Regression Coefficient Models

Paul Harris1, Stewart Fotheringham1 and Chris Brunsdon2

1. National Centre for Geocomputation, National University of Ireland Maynooth, Maynooth, Co. Kildare, Ireland
2. Department of Geography, University of Leicester, Leicester, LE1 7RH, UK
1. {paul.harris, stewart.fotheringham}@nuim.ie
2. cb179@le.ac.uk

Abstract: This study assesses the prediction and prediction-uncertainty performance of models that cater for both: (i) a nonstationary relationship between the response and a contextual variable; and (ii) a nonstationary residual variance (or variogram), at point locations for a single-realisation spatial process. Here the crucial aspect of the model specification is allowing the residual variance (or variogram) to vary across space. Without this, the estimated prediction standard errors are only likely to be accurate in a global (or overall) sense, not in the desired local sense. Locally accurate prediction standard errors allow locally relevant prediction confidence intervals and/or locally relevant estimates of risk (e.g. the risk of exceeding some critical threshold), which is valuable not only to researchers who attempt to model spatial processes, but also to policy makers who need to plan for and manage the outcomes of spatial processes at different spatial scales.

Keywords: geographically weighted regression, moving window kriging, heteroskedastic, Bayesian prediction models

AttachmentSize
HarrisAccuracy2010.pdf449.28 KB

Managing Uncertainty in Complex Geospatial Models

Dan Cornford and Remi Barillec
Non-linearity and Complexity Research Group Aston University Birmingham, UK
{d.cornford, r.barillec}@aston.ac.uk

Abstract: The development of increasingly complex, often process-based, models of geospatial phenomena makes the management of uncertainty in such models ever more pressing. Uncertainties in inputs to the models, including parameters within the models, can have complex and difficult-to-predict effects on uncertainties in the outputs. One approach to managing uncertainties in complex models is to develop emulators, which are statistical representations of our beliefs about the model we are analysing. The emulator, or surrogate statistical model, can then be applied to a range of typically Monte Carlo based analysis methods, including uncertainty analysis, sensitivity analysis, calibration and optimal decision making. Applying such emulation ideas to geospatial models presents several challenges. First, it is necessary to be supplied with, or to elicit, prior beliefs over the distribution of the various inputs to the model; where the geospatial model has many inputs, joint specification of beliefs remains challenging. Geospatial models often also have high numbers of outputs, for example the value of some variable across a spatial or spatio-temporal field. Handling high-dimensional outputs in emulation is challenging, since this requires the specification of a multivariate Gaussian process (Bayesian cokriging). The paper describes the emulation framework, including the subjective Bayesian approach within which it forms a component tool. Particular focus is placed on extending emulator approaches to analysing geospatial models in the context of dynamic spatio-temporal simulators. The results illustrate that emulation can be applied jointly to relatively high-dimensional outputs, particularly where these admit an intrinsically lower-dimensional representation. The conclusion addresses the strengths and weaknesses of the emulation approach to managing uncertainties in complex geospatial models.

Keywords: emulation, meta-models, surrogate models, Monte Carlo, Gaussian processes, kriging.
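
A minimal Gaussian-process emulator of the kind described above, fitted to a handful of 'simulator' runs with a fixed squared-exponential kernel; the hyperparameters are set by hand rather than estimated, and a one-dimensional input stands in for the high-dimensional inputs and outputs discussed in the paper:

import numpy as np

def rbf(a, b, ell=0.3, s2=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    return s2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

# A handful of expensive 'simulator' runs (stand-in function).
x_run = np.linspace(0.0, 1.0, 8)
y_run = np.sin(2 * np.pi * x_run)

K = rbf(x_run, x_run) + 1e-8 * np.eye(len(x_run))  # jitter for stability
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_run))

x_new = np.array([0.37])
k_star = rbf(x_new, x_run)
mean = k_star @ alpha                         # emulator predictive mean
v = np.linalg.solve(L, k_star.T)
var = rbf(x_new, x_new) - v.T @ v             # emulator predictive variance
print(mean[0], var[0, 0])

The emulator's predictive variance is exactly what downstream Monte Carlo analyses (uncertainty and sensitivity analysis, calibration) exploit in place of further expensive simulator runs.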

AttachmentSize
CornfordAccuracy2010.pdf496.06 KB

Modelling Uncertainty in Watershed Divides from SRTM and GDEM

Laura Poggio1 and Pierre Soille2
1. The Macaulay Land Use Research Institute, Aberdeen, United Kingdom
2. Joint Research centre, European Commission, Ispra, Italy
1. poggio@macaulay.ac.uk; 2. pierre.soille@jrc.ec.europa.eu

Abstract: Watersheds are considered important units in many environmental decision-making processes. The delineation of watersheds using digital elevation models (DEMs) is common and presents many advantages; however, it is very sensitive to the uncertainty of the elevation datasets used. The main aim of this work is to use a probabilistic approach to extract watershed divides from two widely available datasets in order to estimate this uncertainty. One hundred simulations of each input dataset were generated using a Monte Carlo probabilistic approach, and watershed divides were delineated for each realisation. The realisations were combined to produce a cumulative probability surface representing how many times a cell was part of a watershed divide. The preliminary results showed high uncertainty in most of the test area; the highest uncertainty was related to small sub-watersheds of low Strahler order streams. For both of the datasets considered, the modelling of the elevation errors improved the delineation process, providing important additional information.

Keywords: digital elevation models, simulations, Strahler orders, probabilistic.

AttachmentSize
LauraPoggioAccuracy2010.pdf668.2 KB

Multi-scale Analysis on NDVI and Topographical Factors of Wuyi Mountain Reserve Area Using Wavelet Method

Qiu Bingwen1*, Su Zanyou1, Chen Chongcheng1 and Bela Markus2
1. Key Laboratory of Spatial Data Mining and Information Sharing of Ministry of Education, Spatial Information Research Center of Fujian Province, Fuzhou University, Fuzhou 350002, Fujian Province, China
2. University of West Hungary, 8002 Szekesfehervar, P.O.B. 52, Hungary

*qiubingwen@fzu.edu.cn

Abstract: The Wuyi Mountain Reserve area is one of the few reserve areas in China recognised for world biosphere as well as cultural and natural heritage. This paper addresses the scale dependency of the vegetation-topography relations of the Wuyi Mountain Reserve area. The wavelet transform was applied to analyze the multi-scale correlations between the Normalized Difference Vegetation Index (NDVI) and several topographic factors (DEM, slope and aspect). The wavelet coefficient variograms show that the spatial patterns of NDVI and the topographical indicators both exhibit two dominant scales. The nearly synchronous scale domains, especially the small-scale domain, suggest that a tightly-coupled relationship exists between NDVI and the DEM. The multi-scale correlations among NDVI and the topographical factors suggest that the correlation is scale-dependent, i.e. different scales yield different coefficients among the factors. The coefficients between NDVI and the topographical factors are larger at coarser scales than at finer scales, which suggests that topographical factors play an important role in controlling NDVI patterns at larger scales. The relationships between NDVI and slope and aspect exhibit complicated variation with spatial scale. This study may improve the understanding of the multi-scale role that topography plays in the formation of vegetation patterns in mountainous areas, and also suggests that the wavelet transform is useful in exploring the multi-scale pattern of natural resources.

Keywords: Wuyi Mountain Reserve area; multi-scale analysis; NDVI; DEM

AttachmentSize
QiuAccuracy2010.pdf457.37 KB

Non-stationary Modelling and Simulation of LIDAR DEM Uncertainty

Juha Oksanen* and Tapani Sarjakoski
Department of Geoinformatics and Cartography Finnish Geodetic Institute Masala, Finland
*juha.oksanen@fgi.fi

Abstract: Appropriate modelling and simulation of digital elevation model (DEM) uncertainty is among the most long-lasting topics in geographical information science, because DEMs and terrain analysis are widely used in tasks with high societal impact. Decisions based on such analysis are expected to be of better quality if the uncertainty of the analysis results is taken into account. Despite the long research history of the topic, a few big challenges have delayed the final breakthrough of uncertainty-aware terrain analysis. Firstly, Monte Carlo simulation, the most flexible method for investigating the propagation of uncertainty in terrain analysis, is time-consuming, and the use of massive high-resolution DEMs based on airborne light detection and ranging (LIDAR) has made the performance issue even worse. Secondly, mainstream uncertainty-aware terrain analysis applies stationary models of DEM uncertainty, even though a number of experiments have shown that the uncertainty would be modelled more realistically as a non-stationary stochastic process. The paper demonstrates how the process convolution method can be applied in a realistic and efficient non-stationary simulation of LIDAR DEM uncertainty.

Keywords: process convolution, unconditional simulation

AttachmentSize
OksanenAccuracy2010.pdf583.45 KB

Object-oriented Remote Sensed Image Classification Accuracy Assessment

Zhaocong Wu, Lina Yi, and Guifeng Zhang
School of Remote Sensing and Information Engineering, Wuhan University, Wuhan, China
Hnal986350@163.com

Abstract: This paper proposes a new accuracy assessment scheme for object-oriented remote sensing image classification that measures both the geometrical and the thematic accuracy of objects. Geometrical accuracy is measured in two respects, area accuracy and boundary accuracy, while thematic accuracy is measured per object. The classification results of a Quickbird image with different object locations are evaluated by the proposed method. Experiments show that the method provides more information about classification accuracy and has the potential to address the shortcomings of traditional statistical accuracy assessment measures.

Keywords: object-oriented; classification accuracy assessment; segmentation quality

AttachmentSize
WuAccuracy2010.pdf740.07 KB

On the Reproducibility of Reflectance Factors: Implications for EO Science

Karen Anderson1, E. J. Milton2, Vincent Odongo2 and Jennifer L. Dungan3
1.School of Geography, University of Exeter, UK
2.School of Geography, University of Southampton, UK
3.NASA Ames Research Center, Moffett Field, CA, USA
1.karen.anderson@exeter.ac.uk; 2. milton@soton.ac.uk; 3.Jennifer.L.Dungan@nasa.gov

Abstract: Measurements of reflectance quantities collected in natural radiation conditions underpin quantitative Earth observation (EO) science through calibration, validation and atmospheric correction techniques. Despite their importance to the longevity of EO data, only a few studies have commented on the reliability of such measurements. This paper reports results from three experiments, each designed to explore a different facet of measurement uncertainty in field measurements of hemispherical conical reflectance factors (HCRF) and to consider the broader implications for EO. From an end-user's standpoint, we describe a simple methodology for standard uncertainty characterisation for any reflectance-measurement scenario. The work provides a broad basis for considering standardised approaches to uncertainty characterisation and reporting, which is a necessary step towards improving the reproducibility and traceability of EO data and associated products.

Keywords: reflectance, uncertainty, spectroradiometer, HCRF

AttachmentSize
AndersonAccuracy2010.pdf420.97 KB

Overall Accuracy Estimation for Geographic Object-based Image Classification

Julien Radoux*, Patrick Bogaert, and Pierre Defourny
Earth and Life Institute - Environmental Sciences, Université catholique de Louvain, Louvain-la-Neuve, Belgium
*Julien.Radoux@uclouvain.be

Abstract: Geographic object-based image analysis is a processing method in which groups of spatially adjacent pixels are classified as elementary units. This approach raises concerns about the design of subsequent validation strategies. Indeed, classical point-based sampling strategies based on the spatial distribution of sample points (using systematic, probabilistic or stratified probabilistic sampling) do not rely on the same concept of objects. New methods explicitly built on the concept of objects used for the classification step are thus needed. An original object-based sampling strategy is therefore proposed and compared with other approaches used in the literature for the thematic accuracy assessment of object-based classifications. The new sampling scheme and sample analysis are founded on a sound theoretical framework based on a few working hypotheses. The performance of the sampling strategies is quantified using object-based classification results simulated for Quickbird imagery. The bias and the variance of the overall accuracy estimates were used as indicators of the methods' benefits. The main advantage of the object-based overall accuracy predictor is its performance: for a given confidence interval, it requires fewer sampling units than the other methods. In many cases, this can help to noticeably reduce the sampling effort. The use of object-based sampling units leads to practical and conceptual issues which are sometimes, but not always, similar to those of point-based accuracy assessment. These issues (mixed entities, spatial correlation, the effect of geolocation errors, sample representativity) are discussed with regard to the representation of environmental variables, together with the limitations of the proposed method.

Keywords: Overall accuracy; Object; Spatial region
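
The object-based estimator itself is not reproduced here, but the basic issue it addresses can be illustrated: when sampling units are objects of unequal area, an unweighted per-object accuracy differs from an area-referenced one, so object weights matter (validated objects invented):

import numpy as np

# Hypothetical validated objects: (area in pixels, correctly labelled?).
areas = np.array([120, 45, 300, 80, 15, 220], dtype=float)
correct = np.array([1, 1, 0, 1, 0, 1], dtype=float)

oa_object = correct.mean()                       # each object counts once
oa_area = np.sum(areas * correct) / areas.sum()  # large objects dominate
print(f"per-object OA {oa_object:.2f}, area-weighted OA {oa_area:.2f}")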

AttachmentSize
img-921091403.pdf478.79 KB

Positional Error Propagation Analysis in Habitat Distribution Modelling

Babak Naimi1*, Andrew K. Skidmore2, Nicholas A.S. Hamm2, and Thomas A. Groen2
1. Faculty of Geo-Information Science and Earth Observation (ITC), Enschede, The Netherlands; Graduate School of the Environment and Energy, Science and Research Branch, Islamic Azad University, Tehran, Iran
2. Faculty of Geo-Information Science and Earth Observation (ITC), Enschede, The Netherlands
* naimi@itc.nl

Abstract: This study examines how robust habitat distribution models are to uncertainty in the position of species occurrences. An artificial species was simulated and mapped in southern Spain (Malaga) and error was introduced into the locations of the samples. Three commonly used habitat distribution modelling algorithms (GAM, BRT and MaxEnt) were selected, and the propagation of error into the predictions was analyzed using Monte Carlo (MC) simulation. The models were evaluated for overall performance using the area under the receiver operating characteristic curve (AUC). The Root Mean Square Error (RMSE) was also calculated to assess the accuracy of the probabilities predicted at grid cells. The results indicate only a small decline in the performance of the models when error is introduced into the species positions. Visualizing the RMSEs at grid cells indicates that uncertainty varies with location.

Keywords: Habitat distribution modeling; positional uncertainty; spatial error propagation

AttachmentSize
MaimiAccuracy2010.pdf527.66 KB

Positional and Thematic Tolerance Operators for the Intercomparable Accuracy Measures of Land Use/Land Cover Base-maps

Stephane Couturier
CentroGeo - Centre for Geography and Geomatics Research (Centro de Investigación en Geografía y Geomática 'Ing. Jorge L. Tamayo' A.C.), Mexico City, Mexico
Stephane.Couturier@CentroGeo.org.mx

Abstract: Accuracy assessments at regional scale are generally tailored to give one measure of error (either global or per class), which may vary substantially with the uncertainty contained in the assessment process. In fact, procedures are generally incorporated in the assessment that implicitly set a degree of tolerance, regarding positional and/or thematic aspects, at which the map is evaluated. This research focuses on the adoption of a standardized response design for map accuracy assessments, enabling the comparison of agreement definitions among maps with parametrized, moveable degrees of tolerance. To this end, a formalization based on fuzzy GIS-based tolerance operators is described, whose continuous and discrete-field parameters are able to approximate the major (positional and thematic) aspects of published regional accuracy assessment designs. The operators are applied to two cartographic datasets in Mexico, facilitating comparison with the accuracy of the international cartography.

Keywords: accuracy assessment; fuzzy map comparison; reference maplet; agreement definition

AttachmentSize
CuturierAccuracy2010.pdf489.53 KB

Quality Evaluation of DEMs

Haris Papasaika and Emmanuel Baltsavias
Institute of Geodesy and Photogrammetry, ETH Zurich, Wolfgang-Pauli-Str. 15, CH-8093 Zurich, Switzerland. Telephone: +41 44 6336808
{haris, manos}@geod.baug.ethz.ch

Abstract: Nowadays, sensors and processing techniques provide DEMs of the same site with different geometric characteristics and accuracies. Each DEM contains intrinsic errors due to the primary data acquisition technology and the processing methodology, in relation to the particular terrain and land-cover type. The accuracy of these datasets is often unknown, is inhomogeneous within each dataset, and is almost always reported only as a global measure. We present a new concept for the quality characterization of DEMs based on different quality measures.

Keywords: Multi-source data, Integration, Quality Control, Spatio-temporal, Digital Elevation Model

AttachmentSize
PapasaikaAccuracy2010.pdf913.76 KB

Random and Spatially Autocorrelated Sensor Noise Effects on Image Classification

Tarmo K. Remmel1 and Scott W. Mitchell2
1.Department of Geography York University, Toronto, Ontario, Canada
2.Department of Geography and Environmental Studies Carleton University, Ottawa, Ontario, Canada
1. remmelt@yorku.ca; 2. Scott_Mitchell@carleton.ca

Abstract: One factor limiting the accuracy of land cover maps derived from classified, remotely-sensed imagery is the quality of the spectral data used in the classification process. Satellite data is routinely pre-processed to improve both its geometric and radiometric qualities. We implement a factorial design that assesses the individual and joint effects of simulated sensor noise on specific spectral bands and along continua of intensity and spatial configuration; an image with no added simulated noise is our control. Our focus is on the radiometric component of image quality, as we assume that for our single-image controlled experiment, the multispectral bands are all perfectly aligned and that topographic relief insignificantly affects the geometric properties of our data. For each simulated noisy image we produce a detailed land cover classification using identically-defined classification tree decisions and observe the spatial changes relative to the classification of the control image. We assess the classification accuracy between all noisy cases and the control using traditional error matrices and measures of overall thematic agreement. The objective is to perform a full sensitivity analysis that quantifies the effect of noisy data on image classification, both in terms of the aspatial class area tabulations and their spatial configurations. We link the classification differences with uncertainty metrics as a guide to improving the selection of classifiers and pre-processing techniques.

Keywords: CART; random noise; autocorrelated noise; uncertainty; composition; configuration
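
A sketch of how the two noise treatments can be generated: white Gaussian noise, and spatially autocorrelated noise obtained by smoothing white noise and rescaling it to the target standard deviation (scipy assumed; the band values, noise level and correlation range are invented):

import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(5)

def white_noise(shape, sd):
    return rng.normal(0.0, sd, shape)

def autocorrelated_noise(shape, sd, corr_range=4.0):
    """Smooth white noise, then rescale so the marginal std equals sd."""
    n = gaussian_filter(rng.normal(size=shape), corr_range)
    return n * (sd / n.std())

band = rng.uniform(50, 200, (128, 128))       # stand-in spectral band
noisy_white = band + white_noise(band.shape, sd=5.0)
noisy_auto = band + autocorrelated_noise(band.shape, sd=5.0)
print(noisy_white.std(), noisy_auto.std())    # same marginal noise level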

AttachmentSize
img-921091737.pdf576.79 KB

Raster Data Transformation for Land Change Analyses

Zachary Christman1 and John Rogan2

1. Rutgers University Department of Geography, Piscataway, NJ, USA
2. Clark University Graduate School of Geography Worcester, MA, USA
1. zachxman@rci.rutgers.edu; 2. jrogan@clarku.edu

Abstract: The use of raster-based categorical maps from multiple sources necessitates transformation of their geometric characteristics so that maps can be compared directly, as in land change analyses. The operations of reprojecting maps to a new geographic reference framework and rescaling pixel values to a new size introduce distortions of map information that can affect both the proportion and the arrangement of thematic classes across the landscape. Using a sample land cover dataset, images were reprojected and rescaled using three common raster-based transformation methods and one new vector-based method. An evaluation of changing class areas and landscape ecology metrics demonstrates that the values of more than a third of the pixels in a categorical map may be affected by common reprojecting and rescaling methods. While relative class area was best preserved by a nearest-neighbour resampling method, the contiguity of thematic classes and the overall fragmentation of the landscape were best preserved when using a vector-based reprojecting and resampling method. The results reinforce the need for careful attention to categorical data transformations in land change analyses.

Keywords: land change; data integration; projection; scale; reference system; transformation

AttachmentSize
ChristmanAccuracy2010.pdf450.91 KB

Sampling for Validation of Ecotope Maps of Floodplains in the Netherlands

Martin Knotters, Dick J. Brus
Soil Science Centre, Alterra, Wageningen University and Research Centre, Wageningen, the Netherlands
martin.knotters@wur.nl

Abstract: Ecotope maps of five districts of main water courses in the Netherlands were validated by independent field observations of ecotopes. The map quality was quantified by the overall map purity, and by the user's and producer's accuracies of the map units. In four districts the validation locations were selected by purposive (targeted) sampling. In these purposively sampled districts the sampling points were clustered in a limited number of compact validation areas, in order to reduce travel costs. For the fifth district a stratified two-stage probability sample was designed, such that the spatial pattern resembles that of the purposive samples. In this way the same practical and budgetary constraints are met as in the purposively sampled districts. In the first stage the district was divided into eight geographical strata, representing the main river branches, and each stratum was divided into validation areas of approximately constant size (primary sampling units). In each geographical stratum two validation areas were selected by simple random sampling without replacement. In the second stage, in each selected validation area a simple random sample of points was selected (the secondary sampling units). At these locations ecotopes were observed in the field. For the maps validated by purposive sampling the quality measures were estimated by model-based inference based on a stochastic model for the spatial variation of classification errors. For the map validated by probability sampling the quality measures were estimated by design-based inference based on the inclusion probabilities of the validation locations. The total map purities varied from 56 to 76 % among the five districts. Both user's and producer's accuracies showed large variation among the map units, depending on the contribution of several sources of error in the mapping process. Stratified two-stage sampling combined with a design-based estimation method results in model-free estimates of total map purity, user's and producer's accuracies. This is an important advantage in validation, because the results do not depend on the quality of model assumptions. This means that the validity of the estimated map purities, user's and producer's accuracies is beyond discussion if a design-based approach is followed.

Keywords: map accuracy; ratio estimator; two-stage sampling

AttachmentSize
KnottersAccuracy2010.pdf388.23 KB

Satellite Precipitation Assessments for Flash Flood Events in Turkey

Fatih Keskin1, İsmail Yücel2, Robert J. Kuligowski3
1. State Hydraulic Works, Ankara, Turkey
2. Middle East Technical University, Ankara, Turkey
3. NOAA/NESDIS Center for Satellite Applications and Research, Camp Springs, USA
1. fatihk@dsi.gov.tr; 2. iyucel@metu.edu.tr; 3. Bob.Kuligowski@noaa.gov

Abstract: This study investigates the performance of the NOAA/NESDIS operational precipitation estimation algorithm, called the Hydro-Estimator (HE), in its depiction of the timing, intensity, and duration of flash flood events that occurred at several locations in northwestern Turkey during the period of 7-11 September 2009. Precipitation estimates from the HE algorithm were evaluated against point observations collected from rain gauges and against radar data where available.

Keywords: precipitation, interpolation, kriging, hydro- estimator, satellite, cross validation

AttachmentSize
KeskinAccuracy2010.pdf637.24 KB

Scale, Resolution and Missing Data in a Long Term Spatiotemporal Epidemiological Study

Robert Corner*, Grit Schuster and Bernhard Klingseisen
Department of Spatial Sciences, Curtin University, Perth, Western Australia
* R.Corner@curtin.edu.au

Abstract: As part of Australia's biosecurity effort, we are working to understand the relationship between the spatiotemporal variability of climatic and environmental conditions and the occurrence of two vector-borne viruses: Murray Valley Encephalitis Virus (MVEV), which causes disease in humans, and Bluetongue Virus (BTV), which causes disease in animals. This involves bringing together remotely sensed and other environmental data with long term serological test data for disease occurrence in sentinel animals. In working with these datasets we are faced with a number of challenges involving scale, resolution and data completeness.

Keywords: Epidemiology, uncertainty, sparse data sets, Remote Sensing

AttachmentSize
CornerAccuracy2010.pdf591.78 KB

Second-phase Spatial Sampling: Local and Global Objectives to Optimize Sampling Patterns

Eric Delmelle
Department of Geography and Earth Sciences University of North Carolina - Charlotte Charlotte, U.S.A.
eric.delmelle@uncc.edu

Abstract: In geographic sampling, once initial samples of the primary variable have been collected, it is possible to take additional measurements, an approach known as second-phase sampling. It is generally desirable to collect such additional samples in areas far away from existing observations to reduce redundancy; these areas coincide with regions where the kriging variance is maximal. However, the kriging variance is independent of the data values and is computed under the assumption of a stationary spatial process, which is often violated in practice. Weighting the kriging variance with another criterion, giving greater sampling importance to locations exhibiting significant spatial roughness, can serve as an alternative objective (Delmelle & Goovaerts 2009). This roughness is computed with a spatial moving-average window. Another objective function uses locally determined variogram models to obtain local kriging variances, reflecting non-stationarity (Haas 1990). The benefits and drawbacks of these three approaches are illustrated in a case study using an exhaustive remote sensing image. Combinations of first-phase systematic and nested sampling designs (or patterns) are generated, while the location of additional observations is guided in a way which optimizes each objective function. Augmented sampling sets minimizing the weighted kriging variance or minimizing the kriging variance computed by local variograms lead to better reconstruction of the true image, while patterns minimizing the kriging variance computed by a global variogram lead to reconstruction similar to a random addition. This indicates that accounting for spatial roughness in second-phase sampling improves the overall accuracy of the prediction.
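
The weighted objective can be sketched compactly: candidate grid locations are scored by the kriging variance multiplied by a local roughness term, taken here as the absolute deviation of each cell from its moving-window average, and the next sample goes where the score is highest. This is a minimal sketch with invented surrogate surfaces, not the paper's exact objective functions.

import numpy as np

def next_sample(kriging_var, z, window=3):
    """Return the grid index maximizing kriging variance weighted by roughness."""
    pad = window // 2
    zp = np.pad(z, pad, mode="edge")
    rough = np.empty_like(z, dtype=float)
    for i in range(z.shape[0]):
        for j in range(z.shape[1]):
            # roughness = deviation from the moving-window mean around (i, j)
            rough[i, j] = abs(z[i, j] - zp[i:i + window, j:j + window].mean())
    score = kriging_var * rough  # joint objective: uncertain AND rough
    return np.unravel_index(np.argmax(score), z.shape)

rng = np.random.default_rng(0)
z = rng.random((50, 50))   # surrogate attribute surface (illustrative)
kv = rng.random((50, 50))  # surrogate kriging variance surface (illustrative)
print(next_sample(kv, z))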

Keywords: sampling pattern, local variance, spatial roughness, weighted kriging variance

AttachmentSize
DelmelleAccuracy2010.pdf669.58 KB

Sensitivity of Quasi-Dynamic Topographic Wetness Index to Choice of DEM Resolution, Flow Routing Algorithm, and Soil Variability

Thao T. M. Nguyen and John P. Wilson
GIS Research Laboratory University of Southern California Los Angeles, California, USA
{thaotngu, jpwilson}@usc.edu

Abstract: The steady-state topographic wetness index, which is frequently used to predict soil water content and source areas for saturated overland flow, relies on two terrain attributes whose values have been shown to vary with both the grid resolution and the flow routing algorithm that are used. A quasi-dynamic index (QD-TWI) has been proposed to help overcome the steady-state assumption that is implicit in the traditional form and not applicable in most semi-arid and arid landscapes. Few studies have examined the sensitivity of this index to variations in its inputs, and this paper examines its sensitivity to simulated DEM error, the choice of DEM resolution and flow routing algorithm, and the variability of soil parameters for four small catchments in Dane County, Wisconsin.
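
For context, the steady-state index that QD-TWI generalizes combines the two terrain attributes mentioned above, specific catchment area a and local slope beta, as TWI = ln(a / tan(beta)). A minimal sketch with illustrative values:

import math

def twi(specific_catchment_area, slope_deg):
    """Steady-state topographic wetness index ln(a / tan(beta))."""
    beta = math.radians(slope_deg)
    return math.log(specific_catchment_area / math.tan(beta))

# e.g. 500 m^2/m of upslope area draining across a 10-degree cell (toy values)
print(round(twi(500.0, 10.0), 2))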

Keywords: quasi-dynamic topographic wetness index; sensitivity analysis; error propagation

AttachmentSize
NguyenAccuracy2010.pdf506.37 KB

Significance Analysis of Multi-temporal RapidEye Satellite Images in a Land-cover Classification

Michael Förster, Christian Schuster and Birgit Kleinschmit
Department of Geoinformation Processing for Landscape and Environmental Planning, Technische Universitaet Berlin, Berlin, Germany
{michael.foerster, christian.schuster, birgit.kleinschmit}@tu-berlin.de

Abstract: Multi-temporal satellite information can supply valuable information about changing patterns of land-cover, especially for vegetation species. As a contribution to evaluating the additional information content of intra-annual high spatial resolution satellite images, the presented study assesses the significance of the classification accuracy for the identification of land-cover data. A Microarray Significance Analysis (MSA) was used to evaluate a sequence of nearest neighbor classifications (with training areas from field spectral measurements and existing geo-data) using image combinations of different dates and spectral bands for the utilized land cover classes. The resulting microarray of accuracy percentages for single classes and for the overall classification was used for the subsequent MSA. The results from the MSA showed a higher significance when more images were included in the classification process. The scene from September 2009, in particular, indicated a positive significance within the land-cover classification.

Keywords: Microarray Significance Analysis; RapidEye; object-based image analysis

AttachmentSize
FosterAccuracy2010.pdf625.26 KB

Sources of Uncertainty in Predicting Land Surface Fluxes Using Diverse Data and Models

Jennifer L. Dungan*, Weile Wang, Andrew Michaelis, Petr Votava and Ramakrishna Nemani
Biospheric Science Branch, NASA Ames Research Center, Moffett Field, CA, USA
*Jennifer.L.Dungan@nasa.gov

Abstract: In the domain of predicting land surface fluxes, models are used to bring data from large observation networks and satellite remote sensing together to make predictions about present and future states of the Earth. Characterizing the uncertainty about such predictions is a complex process and one that is not yet fully understood. Uncertainty exists about the initialization, measurement and interpolation of input variables; model parameters; model structure; and mixed spatial and temporal supports. Multiple models or structures often exist to describe the same processes. Uncertainty about structure is currently addressed by running an ensemble of different models and examining the distribution of model outputs. To illustrate structural uncertainty, a multi-model ensemble experiment we have been conducting using the Terrestrial Observation and Prediction System (TOPS) will be discussed. TOPS uses public versions of process-based ecosystem models that use satellite-derived inputs along with surface climate data and land surface characterization to produce predictions of ecosystem fluxes, including gross and net primary production and net ecosystem exchange. Using the TOPS framework, we have explored the uncertainty arising from the application of models with different assumptions, structures, parameters, and variable definitions. With a small number of models, this only begins to capture the range of possible spatial fields of ecosystem fluxes. Few attempts have been made to systematically address the components of uncertainty in such a framework. We discuss the characterization of uncertainty for this approach, including both quantifiable and poorly known aspects.

Keywords: structural uncertainty, ensemble modeling, carbon flux, North America

AttachmentSize
DunganAccuracy2010.pdf480.4 KB

Spatial Entropy for the Measurement of the Spatial Accuracy of Classified Remote Sensing Imagery

Didier G. Leibovici*, Giles M. Foody and Doreen Boyd
School of Geography University of Nottingham Nottingham, United Kingdom
*didier.leibovici@nottingham.ac.uk

Abstract: Accuracy assessment is now widely accepted as an integral part of any mapping programme from remote sensing. Accuracy assessments are usually performed on either a global basis or at class level without consideration of spatial properties. Analysing the spatial distribution of the accuracy may be valuable, especially in providing information that aids evaluation of the quality of the derived map and of the methods used in its production. The paper focuses on spatial accuracy measurements and their spatial pattern using statistics, taking into account proximities of accurate and non-accurate pixel values (e.g. classified image versus a reference). A methodological framework using spatial entropy measures based on co-occurrences of the predicates (accurate and non-accurate values) is described and illustrated with a one-class classification of land cover from Landsat ETM+ data.
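
A minimal reading of the co-occurrence idea can be sketched as follows, assuming a binary accuracy map (1 = correctly classified pixel): tabulate the joint distribution of predicate pairs over horizontally and vertically adjacent pixels, then take its Shannon entropy. This is an illustrative simplification on toy data, not the paper's full framework.

import numpy as np

def cooccurrence_entropy(acc):
    """acc: 2-D 0/1 array, 1 = pixel classified correctly."""
    pairs = np.concatenate([
        np.stack([acc[:, :-1].ravel(), acc[:, 1:].ravel()], axis=1),  # horizontal pairs
        np.stack([acc[:-1, :].ravel(), acc[1:, :].ravel()], axis=1),  # vertical pairs
    ])
    counts = np.zeros((2, 2))
    for a, b in pairs:
        counts[a, b] += 1  # co-occurrence table of (accurate, non-accurate)
    p = counts.ravel() / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())  # bits; 2 would mean no spatial structure

acc = (np.random.default_rng(1).random((20, 20)) > 0.2).astype(int)  # toy accuracy map
print(round(cooccurrence_entropy(acc), 3))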

Keywords: Remote Sensing; land cover; Accuracy; Spatial Pattern; Entropy; Scale

AttachmentSize
iLeiboviciAccuracy2010.pdf425.46 KB

Spatial Uncertainty and Hypothesis Testing in Medical Geography: a Geostatistical Perspective

Pierre Goovaerts
BioMedware, Inc Ann Arbor, Michigan, USA
goovaerts@biomedware.com

Abstract: This paper provides an overview of geostatistical methods available for the analysis of both individual-level and aggregated health outcomes, with applications to cancer mortality and incidence in the US. Traditional kriging and stochastic simulation algorithms are tailored to the characteristics of health data, allowing the incorporation of positional and spatial uncertainty into mapping and explaining geographical variation in the risk of late-stage diagnosis for breast cancer. In another study, uncertainty about cervix cancer mortality rates is propagated through the detection of significant changes in mortality across county boundaries. Both applications use a novel simulation-based multiple testing correction procedure that is very flexible and less conservative than the traditional false discovery rate approach.

Keywords: cancer; boundary detection; logistic regression; false positives; geocoding errors.

AttachmentSize
Goovaerts2010Accuracy.pdf650.42 KB

Statistical Inference and Local Spatial Modelling: Living and Working with Uncertainty

Chris Brunsdon
Department of Geography, University of Leicester, Leicester, United Kingdom
cb179@le.ac.uk

Abstract: Uncertainty exists in all forms of spatial process modelling. Errors exist in data, the correct model form for the process is often uncertain, and the actual process being modelled is itself random. This all suggests that there is a need to handle uncertainty in the modelling and data analysis process. One emerging approach to inference in this framework is that of Bayesian inference based on Monte Carlo Markov Chain simulations. Until the advent of these techniques, Bayesian approaches, although of theoretical interest, were often computationally impractical. The simulation approach overcomes many of these problems and allows great flexibility in terms of the questions that may be addressed. Here these methods will be reviewed, and in particular their application to spatial data analysis will be discussed, together with examples.
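
The simulation idea can be conveyed with a minimal random-walk Metropolis sampler, here targeting the posterior of a normal mean under a flat prior. This is purely an illustration of the MCMC mechanism the abstract refers to; the data, proposal scale and model below are invented.

import math, random

data = [2.1, 1.9, 2.4, 2.2, 1.8]  # toy observations

def log_post(mu, sigma=0.3):
    # flat prior => log posterior proportional to the log likelihood
    return -sum((x - mu) ** 2 for x in data) / (2 * sigma ** 2)

mu, samples = 0.0, []
for _ in range(20000):
    prop = mu + random.gauss(0, 0.2)  # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(mu):
        mu = prop                     # accept; otherwise keep current state
    samples.append(mu)
post = samples[5000:]                 # discard burn-in
print(round(sum(post) / len(post), 3))  # posterior mean, close to data mean 2.08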

Keywords: Bayesian data analysis; Monte Carlo Markov Chain; Geographically Varying Coefficients

AttachmentSize
Brunsdon2010Accuracy.pdf141.75 KB

Statistical Mapping of Air Quality by Remote Sensing

Alessandro Fassò* and Francesco Finazzi
University of Bergamo, Dept. of Information Technology and Mathematical Methods, Bergamo, Italy
*alessandro.fasso@unibg.it

Abstract: In this paper we consider the multivariate dynamic coregionalization model which has recently been introduced in environmental spatio-temporal statistics. The main modelling objective here is the dynamic mapping of airborne particulate matter (PM10) by merging measurements from an irregularly spaced ground-level monitoring network with regularly spaced satellite measurements of aerosol optical thickness (AOT). Because AOT measurements are not available under cloudy conditions, we have to manage a large amount of missing data. In principle this task is naturally handled by the state space representation of the model and by maximum likelihood estimation through the EM algorithm. After discussing the uncertainty sources of this model, we check the model's sensitivity to missingness in the case of the Padano-Veneto region, North Italy, including the Alps. To do this, an extensive simulation campaign is performed with missing rates ranging from 0% to 90%, showing the reliability of the method for the case under study.
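
How a state space model absorbs missing observations can be illustrated with a scalar Kalman filter that simply skips the measurement update wherever the AOT value is absent. This is a minimal sketch of the mechanism only; the paper's multivariate coregionalization model and EM estimation are far richer, and all parameter values below are invented.

def kalman_with_gaps(obs, q=0.1, r=0.5, x0=0.0, p0=1.0):
    """Scalar random-walk Kalman filter; None marks a missing observation."""
    x, p, out = x0, p0, []
    for y in obs:
        p += q                # predict step (random-walk state)
        if y is not None:     # update only where data exist
            k = p / (p + r)   # Kalman gain
            x += k * (y - x)
            p *= (1 - k)
        out.append(x)
    return out

series = [0.3, None, None, 0.5, 0.4, None, 0.6]  # toy AOT series with cloud gaps
print([round(v, 3) for v in kalman_with_gaps(series)])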

Keywords: dynamic coregionalization model; aerosol optical thickness; multivariate spatio-temporal modeling; EM algorithm

AttachmentSize
FassoAccuracy2010.pdf598.8 KB

Study of Spatial Fusion of Geographical Entities and Quantitative Information in Accordance with Their Imprecision

Cyril de Runz1, Eric Desjardin1 and Herman Akdag2
1. Université de Reims Champagne-Ardenne, Reims, France
2. University of Paris 6, Paris, France
Cyril.de-runz@univ-reims.fr; eric.Desjardin@univ-reims.fr; Herman.Akdag@lip6.fr

Abstract: The article deals with imprecise geographical entities modelled according to fuzzy set theory for both spatial and quantitative information. It presents the issues of fusing fuzzy geographical objects according to their storage modes (raster, vector) in a mutualised geographical information system (GIS). We study the aggregation of imprecise spatial entities and its impact on the imprecise description of quantities associated with space. The article also presents the choices currently made in the Observox Project, which manages multiple sources of information.

Keywords: fusion; fuzzy representation; imprecise geographical entities; fuzzy spatial object storage; mutualized GIS

AttachmentSize
Cyrial2010Accuracy.pdf555.75 KB

The Effect of Map Accuracy on Estimates of Terrestrial Carbon Budgets

Curtis E. Woodcock1*, Pontus Olofsson1, Sung Bae Jeon1, and Stephen Stehman2
1. Department of Geography and Environment, Boston University, Boston, MA, USA
2. College of Environmental Science and Forestry, State University of New York, Syracuse, NY, USA
* Curtis@bu.edu; 2. svstehman@syr.edu

Abstract: Uncertainty in estimates of rates of forest change significantly influences the magnitude of a modelled sink for atmospheric carbon dioxide for five states in New England, USA. Accuracy assessment of remote sensing results is essential to characterizing uncertainty in estimates of rates of forest change. The width of the confidence intervals around the area estimates of forest change is a better indication of the value of remote sensing products than overall or per-class accuracies.

Keywords: greenhouse gas emissions, remote sensing, map accuracy, area estimation.

AttachmentSize
img-921091320.pdf279.47 KB

The Quantification and Effect of Data Accuracy on Spatial Configuration and Composition Measurements of Landscape Pattern in Ecology

Alex M. Lechner*, Karin J. Reinke, Elizabeth Farmer, Bill T. Langford and Simon D. Jones
RMIT University Melbourne, Australia
*alex.lechner@rmit.edu.au

Abstract: In natural resource management, ecological models are increasingly used to investigate the relationship between environmental and ecological processes (e.g. quantifying the response of rare and threatened species to habitat fragmentation). This paper examines the potential effect of data accuracy upon the measurement of landscape pattern using spatial configuration and composition measures commonly used in ecological modelling. This was achieved via an accuracy assessment of two standard environmental metrics calculated from commonly used continental and regional vegetation datasets and compared to a higher spatial resolution dataset treated as if it were the truth. Three study areas were selected from within Victoria, Australia, representing varying degrees of spatial heterogeneity. The two metrics calculated to describe the environment within each plot were vegetation extent and Nearest Neighbour (NN). The first metric, vegetation extent, describes spatial composition, whilst the second metric, NN, describes spatial configuration. Assessments were made at the local scale, represented by 80 one-hectare circular plots for each study area. The effect of transforming the data was also tested, as in many cases the relationship between ecological phenomena and land cover measurements may not be linear. Results confirm the expectation that the use of lower-accuracy spatial data results in the derivation of less accurate environmental metrics. Further findings showed that vegetation extent was less sensitive to map product than nearest neighbour, suggesting that certain metrics may be more susceptible to error. Importantly, the magnitude of error was also influenced by the type of mathematical function used to transform the data. The overall magnitude of error recorded for vegetation extent, for each of the three landscapes, was qualitatively shown to poorly predict the magnitude of error that would occur using the spatially explicit NN metric. Consequently, global error statements (e.g. confusion matrices) that are not spatially explicit are potentially inadequate for describing error in map products that are to be used for modelling spatially explicit phenomena.
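
The two plot metrics can be sketched simply: vegetation extent as the share of vegetated cells in a plot, and NN as the mean distance from each patch to its closest neighbouring patch. In this illustrative sketch patch centroids are taken as given (deriving patches from the raster is omitted), and all values are invented.

import numpy as np

def vegetation_extent(plot):
    """Fraction of vegetated cells in a boolean plot raster (composition)."""
    return float(plot.mean())

def mean_nearest_neighbour(centroids):
    """Mean distance from each patch centroid to its nearest other patch."""
    c = np.asarray(centroids, dtype=float)
    d = np.sqrt(((c[:, None, :] - c[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(d, np.inf)  # ignore self-distances
    return float(d.min(axis=1).mean())

plot = np.random.default_rng(9).random((50, 50)) > 0.7  # toy 1-ha plot raster
print(round(vegetation_extent(plot), 3))
print(round(mean_nearest_neighbour([(5, 5), (20, 8), (40, 35)]), 2))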

Keywords: ecology; environmental metrics; error propagation; landscape pattern; remote sensing; spatial error

AttachmentSize
LechnerAccuracy2010.pdf499.37 KB

Time Dependent Variance-based Sensitivity Analysis of Development Aggregation Generated by Heterogeneous Land Use Agents

Arika Ligmann-Zielinska1 and Libo Sun2
1. Department of Geography, Environmental Science and Policy Program, Michigan State University, East Lansing, USA
2. Department of Statistics and Probability, Michigan State University, East Lansing, USA
1.ligmannz@msu.edu; 2.sunlibo@msu.edu

Abstract: Agent-based models have been recognized as computational laboratories furnishing spatial scientists with a plausible exploratory apparatus for learning about land use dynamics through an explicit representation of human behavior. At the same time research suggests that the utility of agent-based modeling has been hampered by a limited understanding of the decision processes involving a wide array of stakeholders with different perceptions and preferences. Therefore, it is critically important to offer new tools for a more comprehensive inspection of uncertainties related to the interrelationships between individual choices and land development patterns. In this paper, we propose a new approach to evaluating agent behavioral uncertainty using time dependent variance-based global sensitivity analysis. The method produces time series of first order sensitivity indices that allocate the variance of development patterning to two heterogeneous behavioral features: risk perceptions, quantified through attitude utility functions, and land preferences, in the form of weights assigned to different decision criteria. We experiment with three ABM scenarios that emphasize the various decision components. The scenarios utilize a fixed number of parameters with changing distributions reflecting the behavioral characteristic under consideration. Outcome maps for each time step are summarized using the aggregation index, which is further employed in sensitivity computation. The resulting sensitivity indices are plotted against time to track the impact of input conditions on land use compactness. The comparisons of the plots reveal varying sensitivity trajectories that depend on the modified decision rule.
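
A time series of first-order indices can be sketched as S_i(t) = Var(E[Y_t | X_i]) / Var(Y_t), estimated here by binning each input factor in a Monte Carlo sample. The toy model, factor names and settings below are hypothetical, standing in for the paper's ABM and aggregation index.

import numpy as np

def first_order_index(x, y, bins=20):
    """Binning estimator of the first-order sensitivity index of y to x."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    weights = np.array([(idx == b).mean() for b in range(bins)])
    grand = (weights * cond_means).sum()
    return float((weights * (cond_means - grand) ** 2).sum() / y.var())

rng = np.random.default_rng(2)
n, steps = 5000, 10
risk, pref = rng.random(n), rng.random(n)  # two behavioural factors (hypothetical)
for t in range(1, steps + 1):
    y = t * risk + pref ** 2 + rng.normal(0, 0.1, n)  # toy aggregation index
    print(t, round(first_order_index(risk, y), 2), round(first_order_index(pref, y), 2))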

Keywords: sensitivity analysis; agent-based model; behavioral heterogeneity

AttachmentSize
ZielinskaAccuracy2010.pdf558.34 KB

Toward Quantitative Geocode Accuracy Metrics

Daniel W. Goldberg1, John P. Wilson2, Myles G. Cockburn3
1.Department of Computer Science University of Southern California Los Angeles, CA USA
2.Departments of Computer Science, and Geography University of Southern California Los Angeles, CA USA
3.Department of Preventive Medicine University of Southern California Los Angeles, CA USA
1. dwgoldbe@usc.edu; 2. jpwilson@usc.edu; 3. mylesc@usc.edu

Abstract: Existing geocode quality metrics provide little utility for those interested in the spatial uncertainty associated with a geocoded location. The per-geocode metrics describe aspatial characteristics of individual aspects of the geocoding process, while the per-dataset spatial metrics provide only general information that may not apply to a single geocode of interest. In this paper we develop a method for describing the certainty of a geocoded datum as a spatial probability surface based on an uncertainty propagation model which takes into account the certainty stemming from each portion of the geocoding process. This surface-based geocode output structure provides a more truthful view of the uncertainty present in these data and will enable more realistic estimates of information derived from them in such tasks as environmental exposure modeling.
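
One way to realize such a surface, sketched under strong simplifying assumptions, is to model the uncertainty contributed by each geocoding stage as a 2-D Gaussian and combine the stages into a mixture evaluated over a grid around the nominal point. The component weights, centres and spreads below are invented for illustration and are not the paper's propagation model.

import numpy as np

def probability_surface(components, grid_x, grid_y):
    """components: list of (weight, (mx, my), sigma), one per geocoding stage."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    surf = np.zeros_like(gx, dtype=float)
    for w, (mx, my), s in components:
        surf += (w / (2 * np.pi * s ** 2)
                 * np.exp(-((gx - mx) ** 2 + (gy - my) ** 2) / (2 * s ** 2)))
    return surf / surf.sum()  # normalize to a probability mass over the grid

x = y = np.linspace(-50, 50, 101)  # metres around the nominal geocode (toy)
surface = probability_surface(
    [(0.7, (0.0, 0.0), 5.0),     # hypothetical: interpolation along the segment
     (0.3, (12.0, -4.0), 15.0)], # hypothetical: reference data displacement
    x, y)
print(surface.max())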

Keywords: geocode; uncertainty; spatial probability distributions

AttachmentSize
GoldbergAccuracy2010.pdf592.45 KB

Transgression of Semantic Boundaries by Methodical Terminology Management: Application to the Terminology and Metrology of Satellite Based Localisation Systems

Lars Schnieder and Marco Wegener
Technische Universität Braunschweig, Institute for Traffic Safety and Automation Engineering, Braunschweig, Germany
{schnieder, wegener}@iva.ing.tu-bs.de

Abstract: In the future, satellite based positioning systems will be applied to a broad application area ranging from civil to military use. This paper focuses on the civil transportation sector and is intended as a contribution to the consistent and domain-spanning use of satellite based localisation systems with possible safety-related applications. Applying satellite based localization systems to surface transportation requires bridging the currently existing gap between different terminological worlds. This includes a binding definition of the properties, characteristics and values further describing the meaning of the applied terms. On the one hand there is the terminology of aviation, which formed the base for the specification of the satellite-based localization system; on the other hand there is the terminology of the application domain (here rail transportation, or whichever transportation domain is concerned). Especially for safety-related applications it is mandatory to ensure a clear understanding between the terminological worlds, which shall also enable a certification process for receivers of satellite-based localization systems. This paper outlines an approach based on the linguistic concept of lexemes, as well as a structured procedure for reaching consensus on the terminology being applied in an increasingly interdisciplinary context.

Keywords: specification, terminology, terminology management

AttachmentSize
SchniederAccuracy2010.pdf521.31 KB

Uncertainties of Cultivated Landscape Drainage Network Mapping and its Consequences for Hydrological Flux Estimations

Florent Levavasseur1, Philippe Lagacherie1, Michael Rabotin1, Jean-Stephane Bailly2, Francois Colin3
1. UMR 1221 LISAH, F-34035, INRA, Montpellier, France
2. UMR 1221 LISAH, F-34035, UMR TETIS, F-34093, AgroParisTech, Montpellier, France
3. UMR 1221 LISAH, F-34035, Montpellier SupAgro, Montpellier, France
1. {levavass lagache rabotin}@supagro.inra.fr 2. jean-stephane.bailly@teledetection.fr 3. francois.colin@supagro.inra.fr

Abstract: In this study, we use a network generation method to simulate equi-probable artificial drainage networks on a small Mediterranean cultivated catchment. The method consists of a stochastic algorithm, assimilating sampled observations of the network, that generates a ditch network based on the map of field boundaries. A set of one hundred simulations is used to represent the uncertainty in the ditch network. The similarity of the simulations to the real network is computed with regard to geometrical (length indices), topological (Shreve indices) and topographical (slope indices) metrics. Secondly, the uncertainty in the network is propagated through the hydrological model MHYDAS. The induced hydrological responses in hydrographs present a variability that can be linked to the previous network metrics. At this stage of the network generation process, this uncertainty propagation study helps drive the choice of criteria for future improvements of the network generation process.

Keywords: artificial drainage network, directed tree, uncertainties, mapping, error propagation, hydrological response.

AttachmentSize
LevavasseurAccuracy2010.pdf734.02 KB

Uncertainty Assessment for Soil Remediation Projects

Ana Horta and Amilcar Soares
Centre for Natural Resources Evaluation, Instituto Superior Técnico, Technical University of Lisbon, Lisbon, Portugal
{ahorta, asoares}@ist.utl.pt

Abstract: Uncertainty evaluation provides crucial information for soil contamination assessment. However, for remediation planning, uncertainty has to be combined with risk evaluation and subsequent costs. This paper proposes a new model to integrate uncertainty and average remediation costs to define risk target areas for decision making.

Keywords: soil contamination, geostatistics, cost functions

AttachmentSize
img-X03141113.pdf1.76 MB

Uncertainty Assessment of Land Use/Cover Maps: User Beware!

Bryan C. Pijanowski*, Kimberly D. Robinson and James Plourde
Department of Forestry and Natural Resources Purdue University West Lafayette, Indiana 47906 USA
*bpijanow@purdue.edu

Abstract: We compare land use/cover maps developed by local government with those developed by state agencies for two locations in the Upper Midwest, USA. A map comparison rubric is presented that includes the examination of (1) the amount of each land use/cover class in each map; (2) the percent map agreement using a cross-tabulation; and (3) patch metrics that include the number of patches, patch shape, and the arrangement of patches in a landscape. We compare several of these statistics across 4 different cell size aggregations. We find that there is more difference between maps derived using different methods than there is across the 20-30 years of change represented in maps developed using the same method. The implications of this work are significant if decision makers and modelers rely on these maps for planning and simulation. Furthermore, this analysis throws into question what the "true" land use map might be or even how it might be created. We argue that methods need to be developed that incorporate land use map errors into the decision making process and into scientific research that uses these maps.
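
The cross-tabulation step of the rubric reduces to overlaying the two categorical rasters cell by cell and reading the percent agreement off the diagonal of the resulting matrix, as in the sketch below. The toy arrays stand in for the local and state maps; class counts and agreement rates are invented.

import numpy as np

def percent_agreement(map_a, map_b, n_classes):
    """Cross-tabulate two categorical rasters; return % agreement and the table."""
    ct = np.zeros((n_classes, n_classes), dtype=int)
    for a, b in zip(map_a.ravel(), map_b.ravel()):
        ct[a, b] += 1  # confusion / cross-tabulation matrix
    return 100.0 * np.trace(ct) / ct.sum(), ct

rng = np.random.default_rng(3)
local = rng.integers(0, 4, (100, 100))  # stand-in for the local government map
state = np.where(rng.random((100, 100)) < 0.8, local, rng.integers(0, 4, (100, 100)))
pct, table = percent_agreement(local, state, 4)
print(round(pct, 1))  # roughly 85% agreement for these toy maps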

Keywords: land use maps; uncertainty; classification methods.

AttachmentSize
PijanowskiAccuracy2010.pdf630.51 KB

Uncertainty Quantification with Support Vector Regression Prediction Models

Vasily Demyanov1, Alexei Pozdnoukhov2, Mikhail Kanevski3, Mike Christie1

1. Institute of Petroleum Engineering, Heriot-Watt University, Edinburgh, UK
2. National Centre for Geocomputation, National University of Ireland, Maynooth, Ireland
3. Institute of Geomatics and Risk Analysis, University of Lausanne, Switzerland
1. { vasily.demyanov, mike.christie }@pet.hw.ac.uk
2. alexei.pozdnoukhov@nuim.ie
3. Mikhail.Kanevski@unil.ch

Abstract: Uncertainty quantification of petroleum reservoir models is one of the present challenges, which is usually approached with a wide range of geostatistical tools linked with statistical optimisation and/or inference algorithms. The paper considers a data-driven approach to modelling uncertainty in spatial predictions. The proposed semi-supervised Support Vector Regression (SVR) model has demonstrated its capability to represent realistic features and to describe the stochastic variability and non-uniqueness of spatial properties. It is able to capture and preserve key spatial dependencies such as connectivity, which is often difficult to achieve with two-point geostatistical models. Semi-supervised SVR is designed to integrate various kinds of conditioning data and to learn dependencies from them. A stochastic semi-supervised SVR model is integrated into a Bayesian framework to quantify uncertainty with multiple models fitted to dynamic observations. The developed approach is illustrated with a reservoir case study. The resulting probabilistic production forecasts are described by uncertainty envelopes.
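
As a rough illustration of SVR used as a spatial predictor, the sketch below fits a plain supervised SVR (scikit-learn) to synthetic well data. The paper's semi-supervised variant and its Bayesian integration are not reproduced here; all data and hyperparameters are invented.

import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(10)
xy = rng.uniform(0, 1, (200, 2))  # toy well locations
prop = np.sin(4 * xy[:, 0]) + 0.5 * xy[:, 1] + rng.normal(0, 0.05, 200)  # toy property

# RBF-kernel SVR as a spatial regression of property on coordinates
model = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(xy, prop)
grid = np.array([[0.25, 0.5], [0.75, 0.5]])  # unsampled prediction locations
print(model.predict(grid).round(3))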

Keywords: uncertainty; prediction; petroleum; machine learning; support vectors; data integration

AttachmentSize
DemyanovAccuracy2010.pdf570.46 KB

Uncertainty in Demographic Small Area Estimates

Stefan Leyk1, Barbara P. Buttenfield1 and Nicholas N. Nagle2
1.Department of Geography, University of Colorado, Boulder, CO, USA
2. Department of Geography, University of Tennessee, Knoxville, TN, USA
1. {stefan.leyk, babs}@colorado.edu; 2. nnagle@utk.edu

Abstract: This paper describes a methodology to model the uncertainties underlying the spatial allocation of demographic microdata to small areas such as census tracts or subunits of census tracts. The procedure models probable allocations of each single household to one of the small area units, as derived from a bootstrap analysis. The method allows evaluation of how the allocation uncertainty depends on the constraint variables used for the imputation of population weights, which is based on empirical likelihood methods. Dasymetric mapping is demonstrated to be an effective tool to spatially refine the allocation of households and shows great potential for more advanced spatial allocation processes.
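
The bootstrap step can be sketched as repeatedly perturbing the imputed weights and re-allocating a household, with the spread of resulting allocations measuring the allocation uncertainty. Unit names, weights and the Dirichlet perturbation below are hypothetical choices for illustration, not the paper's empirical likelihood procedure.

import numpy as np

rng = np.random.default_rng(7)
units = ["tract_1", "tract_2", "tract_3"]     # hypothetical small-area units
weights = np.array([0.5, 0.3, 0.2])           # imputed weights for one household

counts = np.zeros(len(units))
for _ in range(1000):                          # bootstrap replicates
    w = rng.dirichlet(weights * 50)            # perturb weights around the estimate
    counts[rng.choice(len(units), p=w)] += 1   # allocate the household once
probs = counts / counts.sum()
print(dict(zip(units, probs.round(3))))        # allocation probabilities per unit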

Keywords: Small area estimates, uncertainty, dasymetric modeling, census data

AttachmentSize
LeykAccuracy2010.pdf795.54 KB

Uncertainty in Determining the Extension of Anomalous Zones of Spatial Environmental Variables

Peter Bossew
Bundesamt fur Strahlenschutz (German Federal Radioprotection Authority) Berlin, Germany
pbossew@bfs.de

Abstract: Spatial anomalies, extremes and outliers are of great interest in environmental studies, because they may represent zones of hazard or of valuable resources. Estimating the risks or benefits associated with them is therefore closely related to estimating their spatial extension. Since the extension is estimated from an inevitably finite number of samples, any estimate has an associated uncertainty. In this contribution I present some ideas on how the problem could be tackled and show results for a particularly simple case.

Keywords: spatial field, anomaly, uncertainty

AttachmentSize
BossevAccuracy2010.pdf485.02 KB

Uncertainty in Habitat Quality Maps for Elk: Implications for Estimates of Carrying Capacity

Ronald W. Davis1, John G. Cook2, Rachel Cook2, Louis C. Bender3, Richard E. Warner4
1. Geosciences and Natural Resources, Western Carolina University, Cullowhee, NC USA
2.National Council for Air and Stream Improvement, La Grande, OR USA
3. New Mexico Cooperative Research Unit, New Mexico State University, Las Cruces, NM USA
4. Natural Resources and Environmental Sciences, University of Illinois, Urbana, IL USA
1. rdavis@wcu.edu; 2. jgcook@nacasi.gmail.com; 3. lbender@nmsu.edu; 4. dickw@uiuc.edu

Abstract: Long term declines in ungulate populations in the Pacific Northwestern U.S. have been attributed to reduced forage production within closed-canopy forests and subsequent declines in elk nutritional body condition (i.e. autumn fat levels). Remote sensing imagery has been extensively used to monitor forest resources in the region, yet error and inaccuracy in habitat models produced from these layers can be high given the detailed data required to evaluate forage resources. To estimate elk habitat quality in a way directly related to their use of forage, and to explore the effects of uncertainty, we used GIS to apply a relationship between the biomass of selected forages (BSF) for elk and percent overstory canopy cover (PCC) (R2 > 0.68) to estimate forage-based habitat quality in 3 managed forests in western Oregon and Washington. We used data-splitting to develop and validate remote sensing models and applied a Tasseled Cap transformation of Landsat ETM imagery to develop maps of PCC and estimate BSF. PCC was predictable from Tasseled Cap indices (R2 > 0.64), and we calculated supportable elk densities from BSF map estimates and compared these to actual densities and autumn fat levels in wild elk at these sites. To examine potential error and uncertainty, we created a random raster layer representing the error calculated between actual and estimated PCC and incorporated this into the BSF models. We produced 1000 iterations and calculated the proportion of iterations in which BSF estimates met or exceeded a minimum viable BSF of >400 kg/ha. Minimum viable BSF occupied an estimated 6-29% of the study area, producing estimates of supportable elk densities of 6-20 elk/km2. Models incorporating error predicted viable BSF over only 2.7-6.5% of the study area in at least 700-800 of 1000 model iterations. Error-based estimates of supportable densities were reduced to 0.65-6.3 elk/km2, with actual densities ranging from 0.20-5.0 elk/km2. The highest autumn body fat levels (12.4%) occurred where BSF estimates were highest and elk densities were lowest, while the poorest condition (6.2%) occurred at intermediate BSF levels but with the highest wild elk density. While the actual levels of nutrition acquired by elk will ultimately determine body condition, the incorporation of error and uncertainty into these habitat quality estimates produced estimates of habitat quality more in line with actual elk performance at these sites.
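
The error-propagation loop can be sketched as follows: add a random error raster to the estimated canopy cover, convert to forage biomass, and count the iterations in which each cell meets the 400 kg/ha threshold. The linear BSF-PCC stand-in below is not the paper's fitted equation, and all numbers are illustrative.

import numpy as np

rng = np.random.default_rng(4)
pcc = rng.uniform(20, 95, (100, 100))  # estimated % canopy cover (toy raster)

def bsf(pcc):
    # hypothetical stand-in for the fitted BSF-PCC relationship
    return np.clip(900.0 - 8.0 * pcc, 0, None)

n_iter, hits = 1000, np.zeros_like(pcc)
for _ in range(n_iter):
    err = rng.normal(0, 8.0, pcc.shape)  # error raster from PCC validation (toy sd)
    hits += bsf(np.clip(pcc + err, 0, 100)) >= 400.0  # viable in this iteration?
prop = hits / n_iter  # per-cell proportion of iterations meeting the threshold
print(round(float((prop >= 0.7).mean()) * 100, 1), "% of cells viable in >=700 runs")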

Keywords: elk, habitat evaluation, GIS, remote sensing, error, uncertainty

AttachmentSize
DavixAccuracy2010.pdf498.21 KB

Uncertainty in Image Interpretation as Reference for Accuracy Assessment in Object-based Image Analysis

Florian Albrecht
Centre for Geoinformatics, Salzburg University, Salzburg, Austria
florian.albrecht@sbg.ac.at

Abstract: For accuracy assessment in object-based image analysis (OBIA), image interpretations can be used as reference data. This approach has some drawbacks, as image interpretation contains uncertainty that may affect the reliability of the accuracy assessment of an OBIA product. Therefore, thematic and spatial uncertainties in image interpretation need to be quantified so that they can be accounted for in an object-based accuracy assessment. In this study the spatial uncertainty in object delineation was emphasised, because boundaries are more relevant for OBIA products than they were for traditional pixel-based image analysis approaches. Twenty-two image interpretations, acquired through an assignment within a postgraduate distance-learning programme, were the basis for analysing uncertainty. The interpretations were grouped according to their delineation detail and overlaid to identify uncertain areas. The group of detailed interpretations agreed on the same class for 83.4% of the area, with 16.6% of the area remaining uncertain. Based on the overlay, median interpretations serving as a reference were retrieved for each group. Object Fate Analysis (OFA) was applied to estimate boundary deviation. The area affected by uncertainty was measured for several levels of agreement between the interpreters. For example, for the detailed group the boundaries of 75% of the objects deviated less than 4.91 metres from their corresponding reference object. The deviation values of the boundaries can be interpreted as distances for an error band along the delineations.

Keywords: classification accuracy; remote sensing; OBIA; object fate analysis

AttachmentSize
img-921091331.pdf954.76 KB

Uncertainty in the Estimation of Drought Risk due to Soil-climate Interactions in Scotland

Laura Poggio*, Alessandro Gimona, Iain Brown, and Marie Castellazzi
The Macaulay Land Use Research Institute, Aberdeen, UK
*l.poggio@macaulay.ac.uk

Abstract: The impact of climate change on ecosystems is a global issue. As a result of the interaction between decreasing precipitation during the growth period and soil properties, the water available for plants and crops may become a limiting factor for crops or certain forest species in some areas of Scotland. The aim of this study was to estimate the uncertainty of a model predicting drought risk in the Dee catchment in the North East of Scotland. The model focuses on the fundamental interactions between soil and climate, which are the critical drivers for determining the available water capacity. Soil available water capacity was calculated, using pedotransfer functions, with data derived from the Scottish Soil Survey Database at ca. 100 profiles. We used a variation of regression kriging to interpolate the data. The preliminary results showed that the uncertainty related to soil modelling is higher in areas with rougher morphology and complex hydrology. A Bayesian framework for uncertainty integration of soil and climate interactions is briefly presented. The evaluated overall uncertainty is useful to underpin informed policy decisions, via risk assessment.

Keywords: Gaussian simulations, spatial uncertainty, geostatistics, stochastic modelling, General Additive Model

AttachmentSize
PoggioAccuracy2010.pdf757.45 KB

Uncertainty in the Extracted Drainage Network Associated with the Applied DEM Correction Method: Implementation of a New DEM Correction Approach

Juan Camilo Castro Gallego1, Veronica Botero Fernandez2, and Jaime Ignacio Velez3
Facultad de Minas, Universidad Nacional De Colombia, Medellin, Colombia
1. *jccastro70@gmail.com; 2. vbotero@unal.edu.co; 3. jivelezu@gmail.com

Abstract: Automated generation of drainage networks from digital elevation models (DEMs) has become a popular practice due to the increase in available information. Along with the different methods proposed for the automated extraction of drainage networks, new methods to correct the DEMs have also been suggested. This paper analyses the uncertainty related to the extraction of drainage networks from several DEMs when using different correction methods, including a new approach suggested by the authors. The automated extraction of the drainage networks was made from DEMs, at different scales, corrected with three different approaches: 1) ArcGIS's sink filling and flow direction method (ArcGIS is a commercial GIS package from the Environmental Systems Research Institute, ESRI); 2) Tarboton's (1997) TauDEM (a methodology proposed and implemented by David Tarboton, students and collaborators at Utah State University), implemented for the free/open software MapWindow GIS in 2005 by his team and other contributors from Idaho State University; 3) HidroSIG 4.0 DEM processing functions (a methodology proposed and implemented by the authors of this paper in the free software HidroSIG 4.0 from the Universidad Nacional de Colombia, sede Medellin). This methodology suggests a new approach based on the physical processes that take place on river networks and their geomorphological tendencies due to the work done by long term erosive processes. The different river networks extracted from the differently processed DEMs, from a mountainous tropical area in Colombia, at four different resolutions, were analyzed to determine the uncertainty inherent in the correction and flow direction definition method used for the automated extraction.

Keywords: Uncertainty, digital elevation models (DEM), automated extraction of drainage networks, DEM processing, river networks

AttachmentSize
GallegoAccuracy2010.pdf491.34 KB

Using Geographically Weighted Regression for Analysing Elevation Error of High-resolution DEMs

Michal Gallay1,2, Chris Lloyd1 and Jennifer McKinley1
1. School of Geography, Archaeology and Palaeoecology, Queen's University Belfast, Belfast, UK
2. Institute of Geography, University of Pavol Jozef Safarik, Košice, Slovakia
1. {mgallay01, c.lloyd, j.mckinley}@qub.ac.uk; 2. michal.gallay@upjs.sk

Abstract: This case study compares five different high-resolution digital elevation models (DEMs) originating from light detection and ranging (LiDAR), interferometric SAR, photogrammetric acquisition and contour map digitizing. The LiDAR DEM derived from last-return points was considered as the reference DEM. The aim was to analyse the statistical and spatial distribution of the residuals and their relationship with the surface roughness of the analysed DEMs. Surface roughness measured as area ratio and inverted vector strength was used to parameterise the DEM surface. The results show that globally no linear relationship exists between surface roughness and the DEM residuals, but the relationship was found to be very diverse locally. High elevation errors occurred along DEM artefacts and sharply defined landforms. The applied surface roughness parameters were found to be useful predictors of such features and could be used for their identification. The findings also suggest that the assumption of stationarity and a Gaussian distribution of the DEM error field is questionable.
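
The GWR idea referred to in the title can be sketched as a weighted least-squares fit of DEM error on roughness at each location, with weights from a Gaussian kernel of distance, so that the regression coefficients are allowed to vary locally. Coordinates, bandwidth and data below are synthetic, not the study's DEMs.

import numpy as np

def gwr_coefficients(coords, x, y, bandwidth):
    """Local [intercept, slope] at every observation location."""
    X = np.column_stack([np.ones_like(x), x])  # intercept + roughness predictor
    betas = []
    for c in coords:
        d2 = ((coords - c) ** 2).sum(axis=1)
        w = np.exp(-d2 / (2 * bandwidth ** 2))  # Gaussian kernel weights
        XtW = X.T * w
        betas.append(np.linalg.solve(XtW @ X, XtW @ y))  # weighted least squares
    return np.array(betas)

rng = np.random.default_rng(5)
coords = rng.uniform(0, 100, (200, 2))
rough = rng.random(200)
err = (0.2 + 0.01 * coords[:, 0]) * rough + rng.normal(0, 0.05, 200)  # drifting slope
print(gwr_coefficients(coords, rough, err, bandwidth=15.0)[:3])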

Keywords: GWR, DEM error, LiDAR, roughness, local analysis

AttachmentSize
GallayAccuracy2010.pdf1.03 MB

Using Signal Propagation Models to Improve Distance Estimations for Localisation in Wireless Geosensor Networks

Alexander Born, Frank Niemeyer, Mario Schwiede and Ralf Bill
Faculty for Agricultural and Environmental Sciences, Chair of Geodesy and Geoinformatics, University of Rostock, Germany
{alexander.born, frank.niemeyer, mario.schwiede, ralf.bill}@uni-rostock.de

Abstract: The determination of a precise position in wireless geosensor networks requires the use of e.g. distance measurements. These distance observations derived by Received Signal Strength (RSS) measurements are inherently inaccurate. Furthermore, in general, the distance observations using RSS do not take obstacles into account. In this paper we present a new approach to correct erroneous RSS measurements affected by obstacles. This technique is combined with the known "Anomaly Correction in Localization" (ACL) algorithm, where sensor measurements are used to improve the determined sensor node positions and to detect and eliminate outliers.
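
A common way to turn RSS into distance, and the kind of correction the approach implies, is the log-distance path-loss model d = d0 * 10^((P0 - Pr - L_obs) / (10 n)), where subtracting an estimated obstacle loss L_obs before inversion removes obstacle-induced bias. A minimal sketch with illustrative parameter values (not the paper's):

def rss_to_distance(p_rx_dbm, p0_dbm=-40.0, n=2.7, d0=1.0, obstacle_loss_db=0.0):
    """Invert the log-distance path-loss model after removing known obstacle loss."""
    return d0 * 10 ** ((p0_dbm - p_rx_dbm - obstacle_loss_db) / (10 * n))

print(round(rss_to_distance(-70.0), 2))                         # no obstacle assumed
print(round(rss_to_distance(-70.0, obstacle_loss_db=10.0), 2))  # e.g. a wall in the path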

Keywords: wireless sensor networks, geosensor, localisation, raytracing, received signal strength, precise positioning

AttachmentSize
BornAccuracy2010.pdf548.38 KB

Using Sound to Represent Positional Accuracy of Address Locations

Blazej Ciepluch1, Ricky Jacob1, Adam Winstanley1 and Peter Mooney2
1.Geotechnology Research Group, Department of Computer Science National University of Ireland Maynooth (NUIM) Maynooth, Co. Kildare. Ireland
2.Environmental Research Center Environmental Protection Agency Clonskeagh, Dublin 14. Ireland
1. bciepluch@cs.nuim.ie; 2.p.mooney@epa.ie

Abstract: We describe a comparison of the accuracy of OpenStreetMap for Ireland with Google Maps and Bing Maps. Five case study cities and towns are chosen for this comparison. Each mapping system is analysed for accuracy under three main headings: spatial coverage, currency, and ground-truth positional accuracy. We find that while there is no clear winner amongst the three mapping platforms, each shows individual differences and similarities for each of the case study locations. We believe the results described in this paper are useful for those developing Location-based services for countries such as Ireland, where access to high-quality geospatial data is often prohibitively expensive or made difficult by other barriers such as lack of data or access restrictions.

Keywords: OpenStreetMap, Google Maps, Bing Maps, Ireland

AttachmentSize
BearmanAccuracy2010.pdf843.21 KB

Visual Analysis of Sensitivity in CAT Models: Interactive Visualisation for CAT Model Sensitivity Analysis

Aidan Slingsby1, Jo Wood1, Jason Dykes1, David Clouston2 and Matthew Foote2
1. giCentre, Department of Information Science, City University London, London, UK
2. Willis Analytics, Willis London, UK
1. {sbbb717, jwo, jad7}@soi.city.ac.uk; 2. {cloustond,footem}@willis.com

Abstract: We demonstrate how visual interactive graphics can support both spatial and aspatial model sensitivity analysis, using a Venezuela-based earthquake CAT model as a case study. We identify the model inputs that drive the model's estimated losses using interactive maps and treemaps to give overviews, and linked barcharts, spineplots and maps to explore the effects of specific input combinations on the estimated loss outputs. Interactively linking these methods allows them to be integrated into the workflows of analysts.

Keywords: interactive visualisation, sensitivity, spatial, multivariate, treemaps

AttachmentSize
SlingsbyAccuracy2010.pdf1.02 MB

Visualizing Uncertainty in Spatio-temporal Data

Lydia Gerharz*, Edzer Pebesma and Harald Hecking
Institute for Geoinformatics, University of Münster, Münster, Germany
*lydia.gerharz@uni-muenster.de

Abstract: Visualization methods to show uncertainties in geospatial data are important tools for communication. Methods have been mainly developed for marginal probability distribution functions (pdfs) describing uncertainties independently for each location in space and time. Often uncertainties can be described better by joint pdfs, including the spatio-temporal dependencies of uncertainties. In this paper, methods for visualization of marginal distributions for space-time grids or features were compared to the case where the full joint distribution needs to be considered in order to find typical or rare spatial or spatio-temporal patterns, such as in ensemble weather forecasts. A number of statistical methods to sample representative realizations from a collection of model ensembles based on the spatio-temporal dependencies such as Mahalanobis distance were investigated and compared. We conclude that taking the full joint probability into account by showing a set of selected ensembles besides visualization methods using marginal distributions is helpful to understand the spatio-temporal structure.
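
The similarity selection can be sketched by ranking ensemble members by their Mahalanobis distance to the ensemble mean, so that both typical (small distance) and rare (large distance) realizations can be picked for display. The toy ensemble below is invented; the paper's methods operate on full space-time model ensembles.

import numpy as np

def mahalanobis_ranking(ensemble):
    """ensemble: (n_members, n_cells) array of flattened space-time fields."""
    mean = ensemble.mean(axis=0)
    cov = np.cov(ensemble, rowvar=False) + 1e-6 * np.eye(ensemble.shape[1])  # ridge for stability
    inv = np.linalg.inv(cov)
    d = [float(np.sqrt((m - mean) @ inv @ (m - mean))) for m in ensemble]
    return np.argsort(d)  # first = most typical member, last = most unusual

rng = np.random.default_rng(6)
ens = rng.normal(0, 1, (30, 5))  # 30 members, 5 grid cells (toy)
order = mahalanobis_ranking(ens)
print("typical member:", order[0], "rare member:", order[-1])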

Keywords: uncertainty visualization; ensembles; Mahalanobis distance; similarity selection

AttachmentSize
GerharzAccuracy2010.pdf541.88 KB

Web-based Assessment of Operator Performance and Variability in Remote Sensing Image Analysis

Soetkin Gardin1, Sebastien M.J. Van Laere2, Frieke M.B. Van Coillie1, Frederik Anseel3, Wouter Duyck2, and Robert R. De Wulf1
1 Laboratory of Forest Management and Spatial Information Techniques
2 Department of Experimental Psychology
3 Department of Personnel Management, Work and Organizational Psychology, Ghent University, Ghent, Belgium
{Soetkin.Gardin, Sebastien.Vanlaere, Frieke.VanCoillie, Frederik.Anseel, Wouter.Duyck, Robert.DeWulf}@UGent.be

Abstract: Human perception and interpretation is an indispensable component in many aspects of remote sensing image analysis. Human intervention is a requisite for visual image interpretation, and even in computer-based digital image processing, human screening and interpretation is still needed at certain stages. Beyond the remote sensing domain, human intervention plays an important role in other types of geodata processing such as GIS and cartography. Although it is crucial for adequately assessing automated systems' performance, virtually no research has focussed on operator functioning. Instead, it is implicitly assumed that operator performance approaches perfection, and that infrequent errors are randomly distributed across time, operators and image types. The goal of the present study is to test these assumptions, and to determine the human factors that influence operator functioning. To this end a web application has been developed that includes several experiments testing operator performance. In the first part, a personal profile is compiled, consisting of demographics such as age and gender. Next, a personality questionnaire is presented and an interactive tool measures the capacity of visual working memory. The second part consists of a long series of digitizing tasks. So far, a try-out has taken place in a controlled environment. The results of this control group already showed significant variability amongst operators that could partly be explained by human factors.

Keywords: image analysis; accuracy; human factors

AttachmentSize
GardinAccuracy2010.pdf.pdf517.62 KB