Accuracy 2006 Conference

7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences. Edited by M. Caetano and M. Painho.


Symposium chairs:
Chair: Mário Caetano (Portuguese Geographic Institute, PT)
Co-chair: Marco Painho (ISEGI – New University of Lisbon, PT)

Organizing Committee:
Secretary: Maria Pereira (Portuguese Geographic Institute, PT)
Members: Amílcar Soares (IST, Technical University of Lisbon, PT), Cidália Fonte (University of Coimbra, PT), Gerard Heuvelink (Wageningen University, NL), Linda Lilburne (Landcare Research, NZ), Rui Juliao (Portuguese Geographic Institute, PT)

OFFICIAL BROCHURE | THE FINAL REPORT

PLENARY LECTURES

APPROACHES TO SPATIAL ACCURACY ASSESSMENT

CHARACTERISING UNCERTAINTY IN DEM

ERROR SENSITIVE SPATIAL DATABASES

METADATA AND DATA QUALITY

POSITIONAL UNCERTAINTY

PROPAGATION OF UNCERTAINTY

SPATIAL UNCERTAINTY MODELLING FOR CATEGORICAL DATA

SPATIO-TEMPORAL ANALYSES AND UNCERTAINTY

STOCHASTIC SPATIAL SIMULATION

UNCERTAINTY IN REMOTELY SENSED DATA

UNCERTAINTY IN SPATIAL DATA FUSION

UNCERTAINTY IN SPATIAL DECISION MAKING

USING FUZZY SET THEORY TO CHARACTERISE SPATIAL UNCERTAINTY

VISUALISATION OF UNCERTAINTY IN GEOGRAPHICAL DATA

POSTER SESSION

NOTE: The papers listed above have not been peer-reviewed. Please also read the website disclaimer.

 

Attachments:
report2006accuracy.pdf (404.11 KB)
2nd_Ann2006accuracy.pdf (537.6 KB)
preface2006accuracy.pdf (722.34 KB)

A classification method for remotely sensed imagery by integrating with spatial structure information

Yong Ge 1, Hexiang Bai 1,2 and Deyu Li 2
1 State Key Laboratory of Resources and Environmental Information System, Institute of Geographic
Sciences & Natural Resources Research, Chinese Academy of Sciences
Beijing 100101, China.
Tel: +86 10 64888967; Fax: +86 10 64889630
gey@lreis.ac.cn.
2 School of Computer and Information Technology, Shanxi University
Taiyuan 030006, China
Tel: +86 351 7018775
baihx@lreis.ac.cn; lidy@sxu.edu.cn

Abstract
Remote sensing technologies have been widely applied to monitoring natural and man-made phenomena such as desertification, land cover change, coastal environments and environmental pollution. Information extraction from remotely sensed imagery, an important tool for understanding and analysing natural phenomena on Earth, has received great attention over the past decades. However, spectral information alone is not enough to obtain accurate information of interest in some cases; for instance, the spectral values of cloud shadows and water bodies are easily confused during classification of TM imagery. How to incorporate the spatial structure or spatial pattern of surface features into the extraction process to improve the reliability of the results has therefore been investigated in a large body of literature. In this paper, we propose the application of multiple-point simulation (MPS) to the classification of remotely sensed imagery. To illustrate the advantage of integrating spatial structure information into the classification process, we compare the results of Maximum Likelihood Classification (MLC) with those of MLC augmented by spatial structure information from MPS. The latter gives a superior overall performance in the classification of remotely sensed imagery.
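The MLC baseline that the abstract compares against can be sketched as a per-pixel Gaussian discriminant. The following is an illustrative sketch on toy two-band data, not the authors' implementation; the MPS step that supplies spatial structure information is omitted:

```python
import numpy as np

def mlc_classify(pixels, class_stats):
    """Gaussian maximum likelihood classification.

    pixels: (n, b) array of spectral vectors.
    class_stats: list of (mean, covariance) pairs estimated from training data.
    Returns, for each pixel, the index of the class with the highest
    Gaussian log-likelihood (constant terms dropped).
    """
    scores = []
    for mean, cov in class_stats:
        diff = pixels - mean                         # (n, b) deviations
        inv = np.linalg.inv(cov)
        # Discriminant: -ln|Sigma| - (x - mu)^T Sigma^-1 (x - mu)
        maha = np.einsum('ij,jk,ik->i', diff, inv, diff)
        scores.append(-np.log(np.linalg.det(cov)) - maha)
    return np.argmax(np.stack(scores, axis=1), axis=1)

# Two toy, well-separated classes in a two-band spectral space
rng = np.random.default_rng(0)
a = rng.normal([0.2, 0.3], 0.05, size=(50, 2))   # e.g. "water"
b = rng.normal([0.7, 0.6], 0.05, size=(50, 2))   # e.g. "cloud shadow"
stats = [(a.mean(0), np.cov(a.T)), (b.mean(0), np.cov(b.T))]
labels = mlc_classify(np.vstack([a, b]), stats)
```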

Keywords: remotely sensed imagery, information extraction, multiple-point simulation, MLC

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Ge2006accuracy.pdf (734.23 KB)

A comparison of error propagation analysis techniques applied to agricultural models

Marco Marinelli, Robert Corner and Graeme Wright
Department of Spatial Sciences
Curtin University
Kent Street, Bentley,
Western Australia
Tel.: + 61 89 266 7565; Fax: + 61 89 266 2703
Marco.Marinelli@postgrad.curtin.edu.au
R.Corner@curtin.edu.au, G.Wright@curtin.edu.au

Abstract
This paper examines two different methods by which the error propagated through a GIS model, applicable to precision agriculture, may be calculated. The two methods are an analytical method, referred to here as a first-order Taylor series method, and the Monte Carlo simulation method. Good agreement is found between the spatial distributions of the resulting error surfaces. However, the magnitude of the error calculated using the full analytical method, which incorporates a term for the correlation between input data sets, is larger than expected. The model is then examined for sensitivity to its inputs and found to be more sensitive to one of them. Some limiting conditions are also noted.
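The two techniques being compared can be illustrated on a toy model; `z = x·y` below is a hypothetical stand-in, not the paper's agricultural model, and the input statistics are invented for the sketch:

```python
import numpy as np

x0, y0 = 4.0, 2.5              # nominal input values
sx, sy, rho = 0.2, 0.1, 0.5    # input standard deviations and correlation
cov_xy = rho * sx * sy

# First-order Taylor series for z = x*y:
# var(z) ~ (dz/dx)^2 sx^2 + (dz/dy)^2 sy^2 + 2 (dz/dx)(dz/dy) cov(x, y)
var_taylor = (y0 * sx) ** 2 + (x0 * sy) ** 2 + 2 * x0 * y0 * cov_xy

# Monte Carlo: sample correlated inputs, push them through the model
rng = np.random.default_rng(1)
cov = [[sx ** 2, cov_xy], [cov_xy, sy ** 2]]
xs, ys = rng.multivariate_normal([x0, y0], cov, size=200_000).T
var_mc = (xs * ys).var()
```

For a nearly linear model the two variance estimates agree closely; the Taylor expression also makes explicit the correlation term discussed in the abstract.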

Keywords: spatial accuracy, error propagation, Monte Carlo simulation, Precision Agriculture

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Marinelli2006accuracy.pdf (820.03 KB)

A methodology for translating positional error into measures of attribute error, and combining the two error sources

Yohay Carmel 1, Curtis Flather 2 and Denis Dean 3
1 Faculty of Civil and Environmental Engineering
The Technion, Haifa 32000, Israel
Tel.: + 972 4 829 2609; Fax: + 972 822 8898
yohay@technion.ac.il  
2 USDA, Forest Service
Rocky Mountain Research Station
2150 Centre Ave, Bldg A
Fort Collins, CO 80526-1891
Tel.: +001 970 295 5910; Fax: + 001 970 295 5959
cflather@fs.fed.us
3 Remote Sensing/GIS Program
Colorado State University
113 Forestry Building
Fort Collins, CO 80523-1472
Tel.: +001 970 491 2378; Fax: +001 970 491 6754
denis@cnr.colostate.edu 
 
Abstract

This paper summarizes our efforts to investigate the nature, behavior, and implications of positional error and attribute error in spatiotemporal datasets. Estimating the combined influence of these errors on map analysis has been hindered by the fact that the two error types are traditionally expressed in different units (distance units and categorical units, respectively). We devised a conceptual model that enables the translation of positional error into terms of thematic error, allowing a simultaneous assessment of the impact of positional error on thematic error – a property that is particularly useful in the case of change detection. A linear algebra-based error model combines the two error types into a single measure of overall thematic accuracy. This model was tested in a series of simulations using artificial maps, in which complex error patterns and interactions between the two error types were introduced. The model accommodated most of these complexities, but interaction between the two error types was found to violate model assumptions and reduced its performance. A systematic study of the spatiotemporal structure of error in actual datasets was thus conducted. Only weak and insignificant interactions were found between the two error types. Application of this error model to real-world time-series data indicated that such data are much less accurate than is typically thought. These results highlight the importance of reducing positional error. The second part of our paper presents an analysis of how to reduce the impacts of positional error through aggregation (i.e., increasing the observation grain). Aggregation involves information loss, and thus the choice of a proper cell size for aggregation is important. A model was developed to quantify the decay in the impact of positional error with increasing cell size.
Applying the model to actual data sets, a major reduction in effective positional error was found for cell sizes ranging between 3 and 10 times the average positional error (RMSE). The model may help users decide on an optimal aggregation level given the tradeoff between information loss and accuracy gains.

Keywords: positional accuracy, attribute accuracy, thematic accuracy, post-classification change detection, Combined Location-Classification model

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Carmel2006accuracy.pdf (1.05 MB)

A new method for obtaining geoidal undulations through Artificial Neural Networks

Maurício Roberto Veronez 1, Adriane Brill Thum 1 and Genival Côrrea de Souza 2
1 Graduate Program in Geology - Vale do Rio dos Sinos University
Avenida Unisinos, 950, São Leopoldo/RS, Brazil – CEP: 93022-000
Tel.: + 55 (51) 3591-1100 – Ramal 1769; Fax: + 55 (51) 3590-8177
veronez@unisinos.br
2 Graduate Program in Civil and Environmental Engineering. Feira de Santana State University
Feira de Santana/BA Brazil. Br 116 - Km 03 , Campus Universitário – CEP:44031-460 
Tel.: +55(75)3224-8240; Fax: +55(75)3224-8056

Abstract
The height supplied by GPS is purely mathematical; in most studies the height must be referred to the geoid. Given a sufficient number of level references with known horizontal and vertical coordinates, geoidal undulations are almost always interpolated by polynomials adjusted with the least squares method. These polynomials are inefficient when extrapolating beyond the study area. The aim of this work is therefore to present a new method for modeling the local geoid surface based on artificial neural networks. The study area is the hydrological basin of Rio dos Sinos, located in Rio Grande do Sul State, Brazil; for the training of the neural network, undulation data supplied by the MAPGEO2004 program were used. The program was developed by the Instituto Brasileiro de Geografia e Estatística (Brazilian Institute of Geography and Statistics) and has an absolute error band larger than 0.5 m. Even with such a large error, the data supplied by MAPGEO2004 can be used to train a neural network, because the network is tolerant of errors and noise. The efficiency of the model was tested at 8 points with known undulations distributed throughout the study area. At these points the model presented, through the calculated discrepancies, a root mean square error of approximately 0.170 m. The study shows that this method can be an alternative for modeling the local and/or regional geoid.
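The approach can be sketched as fitting a small feed-forward network to undulation samples. Everything below is a synthetic stand-in: the target function replaces the MAPGEO2004 undulations, and the network size and training schedule are illustrative, not the authors' configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: undulation N (metres) as a smooth function of
# normalised coordinates (the study used MAPGEO2004 output instead).
X = rng.uniform(0, 1, size=(300, 2))
y = 2.0 * np.sin(3 * X[:, 0]) + 1.5 * np.cos(2 * X[:, 1])

# One-hidden-layer tanh network trained by full-batch gradient descent.
n_hidden, lr = 20, 0.05
W1 = rng.normal(0, 1, (2, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 1, (n_hidden, 1)); b2 = np.zeros(1)
for _ in range(5000):
    H = np.tanh(X @ W1 + b1)                    # hidden activations
    pred = (H @ W2 + b2).ravel()
    err = pred - y                              # residuals
    gH = (err[:, None] @ W2.T) * (1 - H ** 2)   # backprop through tanh
    W2 -= lr * (H.T @ err[:, None]) / len(y)
    b2 -= lr * err.mean()
    W1 -= lr * (X.T @ gH) / len(y)
    b1 -= lr * gH.mean(axis=0)

rmse = np.sqrt(np.mean((pred - y) ** 2))        # in-sample fit of the network
```

Unlike a fitted polynomial, the trained network can be evaluated anywhere in (and, with caution, near) the sampled region, which is the motivation given in the abstract.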

Keywords: GPS, artificial neural networks, geoidal undulation, MAPGEO2004

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Veronez2006accuracy.pdf (890.43 KB)

A performance index developed by data envelopment analysis (DEA) to compare the efficiency of fire risk monitoring actions in municipalities of Brazilian Amazon region

José Alberto Quintanilha and Linda Lee Ho
Escola Politécnica – USP – Brazil
jaquinta@usp.br; lindalee@usp.br

Abstract
PROARCO (Programa de Prevenção, Controle e Combate a Incêndios Florestais na Amazônia), a program for the prevention and control of burning and forest fires in the Arc of Deforestation, began in the spring of 1998. The program is jointly administered by the Instituto Brasileiro do Meio Ambiente e dos Recursos Naturais Renováveis (IBAMA, the Brazilian government’s official environmental agency) and the Ministry for the Environment, Water Resources, and the Amazon region. The program was designed to monitor agricultural burning and forest fires (including monitoring fire risk), to enforce regulations regarding the use of fire in land management, to prevent and combat forest fires, and to establish a strategic task force. The fire monitoring program managed by IBAMA collected fire pixel counts from 1998 to 2002 and used them as a measure of wildfire threats for the Amazon region. The objective of this study is to develop a performance index, based on the frequency of fire pixel counts and explanatory variables related to land management, census and agricultural data, to compare the efficiency of fire risk monitoring actions in municipalities of the Brazilian Amazon region. The index was developed using data envelopment analysis (DEA).
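DEA itself is a family of linear programs. A minimal input-oriented CCR sketch in multiplier form is given below with toy data; the "municipalities" and their input/output variables are hypothetical, not the study's:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y):
    """Input-oriented CCR DEA efficiencies, multiplier LP form:
    max u.y_j  s.t.  v.x_j = 1,  u.y_k - v.x_k <= 0 for all DMUs k,  u, v >= 0.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs). Illustrative sketch only.
    """
    n, m = X.shape
    _, s = Y.shape
    effs = []
    for j in range(n):
        c = np.concatenate([-Y[j], np.zeros(m)])            # maximise u.y_j
        A_ub = np.hstack([Y, -X])                           # u.y_k - v.x_k <= 0
        A_eq = np.concatenate([np.zeros(s), X[j]])[None]    # v.x_j = 1
        res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                      A_eq=A_eq, b_eq=[1.0],
                      bounds=(0, None), method='highs')
        effs.append(-res.fun)
    return np.array(effs)

# Three hypothetical municipalities: input = monitoring effort, output = fires handled
eff = dea_ccr_efficiency(np.array([[2.0], [4.0], [3.0]]),
                         np.array([[2.0], [2.0], [3.0]]))
```

Units on the efficient frontier score 1.0; the second unit, using twice the input for the same output as the first, scores 0.5.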

Keywords: data envelopment analysis, fire hot pixels, efficiency, Amazon Region, fire monitoring

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Quintanilha2006accuracy.pdf (775.45 KB)

A stratified sampling approach to the spatial accuracy assessment of digital cartography: an application to the Portuguese Ecological Reserve

Miguel Peixoto 1, Ana Cristina Costa 1, Marco Painho 1 and Thomas Bartoschek 1, 2 
1 Instituto Superior de Estatística e Gestão de Informação
Universidade Nova de Lisboa
Campus de Campolide, 1070-312 Lisboa, Portugal
Tel.: + 351 21 387 04 13; Fax: + 351 21 387 21 40
mpeixoto@isegi.unl.pt; ccosta@isegi.unl.pt; painho@isegi.unl.pt; tbarto@isegi.unl.pt
2 Institute for Geoinformatics
University of Münster
Robert-Koch-Str. 26-28, 48149 Münster, Germany
Tel.: +49 (0) 251 83 33083; Fax: +49 (0) 251 83 39763
bartoschek@uni-muenster.de
 

Abstract
Managing spatial data in paper maps is quite different from managing digital spatial information. Sometimes manual vectorization of scanned maps is the only way to produce digital cartography, especially when the only source of spatial information is a paper map. Digital scanning and manual vectorization are two processes well known for adding error to data, and accuracy is a particularly important issue for users of spatial information. The National Ecological Reserve (Reserva Ecológica Nacional - REN), established in Portuguese national law, protects areas with a diversified biophysical structure and specific ecological characteristics. This information is often required to manage several human activities, such as mineral extraction, real estate, industry and tourism. REN maps were originally produced on paper and were vectorized to produce digital cartography. The objective of this study is to measure the spatial accuracy and to assure conformity with the original cartography. The accuracy of the REN digital cartography was assessed through a stratified sampling scheme, with strata defined as ecological classes within each county. The global sample size and the strata sample sizes were first determined assuring proportional area representation of each class. The sampling methodology assumes a sampling error of 1% for the estimation of the proportion of digital cartography errors in the whole study area, at a 95% confidence level. The complexity and uncertainty inherent to each ecological class were taken into consideration in the computation of those sample sizes by using the results from a previous pilot study. The final global sample size and the strata sample sizes were established assuring at least two sample units in each stratum. The sampling methodology and the accuracy assessment procedure are detailed; finally, the results are discussed and some conclusions are drawn.
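The sample-size logic described (1% sampling error at 95% confidence, proportional-to-area allocation, at least two units per stratum) can be sketched as follows; the strata names and areas are hypothetical, and the pilot-study weighting mentioned in the abstract is omitted:

```python
import math

def stratified_sample_sizes(strata_areas, p=0.5, e=0.01, z=1.96, min_units=2):
    """Global sample size for estimating an error proportion p with margin e
    at the given confidence (z), then proportional-to-area allocation per
    stratum with a floor of min_units. Illustrative sketch only."""
    n = math.ceil(z ** 2 * p * (1 - p) / e ** 2)   # worst case p = 0.5
    total = sum(strata_areas.values())
    alloc = {k: max(min_units, round(n * a / total))
             for k, a in strata_areas.items()}
    return n, alloc

# Hypothetical ecological classes with areas in km^2
n, alloc = stratified_sample_sizes({'dunes': 120.0, 'cliffs': 30.0, 'wetland': 0.5})
```

With e = 1% and 95% confidence the worst-case global size is 9604 units; tiny strata such as the hypothetical `wetland` class are still guaranteed their minimum of two units.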

Keywords: data quality; quality control; stratified sampling; National Ecological Reserve

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Peixoto2006accuracy.pdf (373.77 KB)

Accuracy assessment methodology for the Mexican national forest inventory: a pilot study in the Cuitzeo lake watershed

Stéphane Couturier 1, Jean-François Mas 2, Erna López 2, Gabriela Cuevas 2, Álvaro Vega 1 and Valdemar Tapia 1
1 Geography Institute, Universidad Nacional Autónoma de México (UNAM), 
Ciudad Universitaria, Coyoacán 04510, Mexico City, Mexico
Tel.: + 00 52 56 22 3443; Fax: + 00 52 56 16 2145
andres@igiris.igeograf.unam.mx
2 Geography Institute, UNAM, Unidad Académica Morelia,
Aquiles Serdan 382, Col. Centro 58000 Morelia, Michoacán, Mexico
Tel.: + 00 52 443 317 9423; Fax: + 00 52 443 317 9425
jfmas@igiris.igeograf.unam.mx

Abstract
A methodology for assessing the accuracy of the Mexican National Forest Inventory (NFI) map is presented. This methodology emerged as the most adequate strategy found after various trials along the successive steps of the assessment design. A main challenge was to integrate the high diversity of classes encompassed in the classification scheme within a cost-controlled, statistically sound assessment. A pilot study focused on the Cuitzeo Lake watershed region, covering 400,000 ha of the 2000 Landsat-derived NFI. The availability of detailed quasi-synchronous reference data and the high variability of mapped classes allowed a careful thematic analysis of the selected region, relevant for national extrapolation. The assessment strategy incorporated an original two-stage sampling design. The selection of Primary Sampling Units (PSU) was done under separate schemes for commonly and scarcely distributed classes. A compromise was found statistically that maximizes the PSU spatial distribution while including all classes. The verification protocol included stereoscopic photo-interpretation and a digital restitution to the geometry of the Landsat data. A scale adjustment operator, based on the epsilon probabilistic band approach, was applied to the PSU verification maplets in order to reduce the inclusion of errors due to scale. A total of 2023 punctual secondary sampling units were then compared with their NFI map label, according to conventional Boolean and linguistic fuzzy criteria. Issues regarding the assessment strategy and trends of class confusions are discussed. Conclusions are drawn in terms of separability of classes on remote-sensing supports, classification system, geographic stratification and scale. This methodology is to be applied to a larger territory including a wider set of classes in the classification system.
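When sampling is stratified by mapped class, the per-class agreement rates must be area-weighted to estimate overall accuracy. A minimal sketch with toy counts (the real study compared 2023 secondary units across many classes; the numbers below are invented):

```python
import numpy as np

# Area-weighted overall accuracy: OA = sum_k W_k * p_k, where
# W_k = mapped-area proportion of class k and
# p_k = agreement rate among its sampled verification units.
W = np.array([0.60, 0.30, 0.10])   # mapped-area proportions (sum to 1)
agree = np.array([540, 255, 70])   # sampled units agreeing with the reference
n_k = np.array([600, 300, 100])    # secondary sampling units per class

p_k = agree / n_k                  # per-class agreement rates
overall = float(W @ p_k)           # area-weighted overall accuracy
```

Weighting by mapped area (rather than pooling the samples) prevents the heavily oversampled rare classes from biasing the overall estimate.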

Keywords: double sampling, rare class, scale, fuzzy, classification system

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Couturier2006accuracy.pdf (1.09 MB)

Accuracy assessment of High Resolution Satellite Imagery by Leave-one-out method

Maria Antonia Brovelli 1, Mattia Crespi 2, Francesca Fratarcangeli 2, Francesca Giannone 2 and Eugenio Realini 1
1 DIIAR, Politecnico di Milano, Polo Regionale di Como
via Valleggio 11 - 22100 Como, Italy
Tel.: +39 0313327517; Fax: +39 0313327519
maria.brovelli@polimi.it; eugenio.realini@polimi.it
2 DITS - Area di Geodesia e Geomatica, Università di Roma “La Sapienza”
via Eudossiana, 18 - 00184 Rome, Italy
Tel.: +39 0644585068; Fax: +39 0644585515
mattia.crespi@uniroma1.it; francesca.giannone@uniroma1.it

Abstract
Interest in high-resolution satellite imagery (HRSI) is spreading in several application fields, at both scientific and commercial levels. A fundamental and critical goal for the geometric use of this kind of imagery is its orientation and orthorectification, the process able to correct the geometric deformations the imagery undergoes during acquisition. One of the main objectives of studies on orthorectification is the definition of an effective methodology to assess the spatial accuracy achievable from orthorectified imagery. Currently, the most used method to compute this accuracy, hold-out validation (HOV), consists in partitioning the known ground points into two sets, the first used in the orientation-orthorectification model (GCPs – Ground Control Points) and the second to validate the model itself (CPs – Check Points); in this respect, the accuracy is simply the RMSE of the residuals between imagery-derived coordinates and CP coordinates. However, this method has some drawbacks: it is generally not reliable and it is not applicable when a low number of ground points is available. First of all, once the two sets are selected, the accuracy estimate is not reliable, since it is strictly dependent on the points used as CPs; if outliers or poor-quality points are included in the CP set, the accuracy estimate is biased. In addition, when a low number of ground points is available, almost all of them are used as GCPs and very few CPs remain, so that the RMSE may be computed on a poor (not significant) sample. In these cases accuracy assessment with the usual procedure is essentially lost. In the present work we propose an alternative to the previously described method for performing a spatial accuracy assessment: the use of the Leave-one-out cross-validation (LOOCV) method for the orientation and orthorectification of HRSI.
The method consists in the iterative application of the orthorectification model using all the known ground points (or a subset of them) as GCPs except one, different in each iteration, which is used as a CP. In every iteration the residual between the imagery-derived coordinates and the CP coordinates (the prediction error of the model on the CP coordinates) is calculated; the overall spatial accuracy achievable from the orthorectified image may be estimated by calculating the usual RMSE or, better, a robust accuracy index like the mAD (median Absolute Deviation) of the prediction errors over all the iterations. In this way we solve both of the mentioned drawbacks of the classical procedure: the method is reliable and robust, not dependent on a particular set of CPs or on outliers, and it allows us to use each known ground point both as a GCP and as a CP, capitalising on all the available ground information. To test this method we modified the software SISAR, developed by the Geodesy and Geomatics Team at the University of Rome “La Sapienza” to perform rigorous orientation of HRSI, integrating it with a module suited to carrying out the core algorithm iteratively with the point configurations required to apply the Leave-one-out method. The software was tested on EROS-A1 and QuickBird imagery, confirming the good features of the Leave-one-out method. Moreover, SISAR was compared with the widely recognized commercial software OrthoEngine v. 10 (PCI Geomatica), which required manual iterations to realize the Leave-one-out procedure; this comparison showed the quite good performance of the SISAR rigorous model.
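The LOOCV loop and the RMSE-versus-mAD contrast can be sketched generically. The `fit`/`predict` callables below are a hypothetical stand-in for the orientation-orthorectification model (here, trivially, a mean and its residual), used only to show why the mAD resists an outlying check point while the RMSE does not:

```python
import numpy as np

def loocv_residuals(points, fit, predict):
    """Leave-one-out: refit with every ground point but one, record the
    prediction error on the held-out point. fit/predict stand in for the
    orthorectification model (hypothetical API)."""
    res = []
    for i in range(len(points)):
        train = np.delete(points, i, axis=0)   # all points except the i-th
        model = fit(train)
        res.append(predict(model, points[i]))  # error on the held-out point
    return np.asarray(res)

rng = np.random.default_rng(2)
pts = rng.normal(0, 1, 30)
pts[0] = 15.0   # one gross outlier among the ground points

errs = loocv_residuals(pts[:, None],
                       lambda t: t.mean(),                # "model": sample mean
                       lambda m, p: float(p[0] - m))      # prediction error
rmse = np.sqrt(np.mean(errs ** 2))
mad = np.median(np.abs(errs - np.median(errs)))           # robust index (mAD)
```

Every point serves once as a check point, and the single outlier inflates the RMSE far more than the mAD, which is the robustness argument made in the abstract.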

Keywords: HRSI, accuracy, Leave-one-out

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Brovelli2006accuracy.pdf (665.52 KB)

Accuracy assessment of digital elevation model using stochastic simulation

Annamaria Castrignanò 1, Gabriele Buttafuoco 2, Roberto Comolli 3 and Cristiano Ballabio 3
1 CRA - Agronomic Research Institute
Via Celso Ulpiani, 5 - 70125 Bari, Italy
Tel.: + 39 080 5475024; Fax: + 39 080 5475023
annamaria.castrignano@entecra.it
2 CNR - Institute for Agricultural and Forest Systems in the Mediterranean
Via Cavour, 4-6 - 87030 Rende (CS), Italy, 
Tel.: + 39 0984 466036; Fax: + 39 0984 466052
g.buttafuoco@isafom.cs.cnr.it
3 Department of Environmental Sciences (DISAT), University of Milano-Bicocca
Piazza della Scienza 1 - 20126 Milano, Italy
Tel.: + 39 02 64482875; Fax: + 39 02 64482994
roberto.comolli@unimib.it

Abstract
Various methods have been applied to generate Digital Elevation Models (DEM), but whatever method is used, DEM estimates will always be affected by errors. Comparing interpolated values with the actual elevation values, obtained with a high-precision field survey, allows us to assess DEM accuracy. At every spot height location it is possible to subtract the actual value from the DEM value to obtain the error at that point. Using this error information, different types of statistics can be computed; however, the Root Mean Squared Error (RMSE) is the standard measure of error used by surveyors around the world. Such a way of reporting error uses a single value for the whole DEM and makes several implicit and unacceptable assumptions about the error: that it has no spatial distribution and is statistically stationary across a region. An alternative approach is proposed here to achieve an improved estimate of local error. It is based on error modelling using conditional stochastic simulations to produce alternative realisations of the data, resulting in a probabilistic assessment of DEM accuracy. The study area is a doline of about 1.5 ha located in the Alps (northern Italy) at a mean elevation of 1900 m above sea level. In this study, to test the accuracy of a previously generated DEM, elevation data were measured at 110 randomly distributed points using a laser distance system linked to an electronic theodolite. Five hundred simulations were generated using conditional sequential Gaussian simulation algorithms. Statistical information was extracted from the set of simulated error images by: 1) averaging the values for each pixel, producing the map of the ‘expected’ error at any location and that of its standard deviation; 2) counting the number of times each pixel exceeded the null value and converting the sum to a proportion, in order to produce probability maps of overestimation and underestimation.
Basic statistics and the histogram of errors showed them to be of approximately normal distribution with a small positive bias. The map of expected values revealed a clear correlation of the errors in the DEM with the slope of the land surface. The highest values were localised on the steepest areas in the northern half of the doline. The spatial distribution of errors was not random: there was a high probability of overestimation in the northern area, while a high probability of underestimation was restricted to the southern area. Multiple DEM realisations generated by incorporating different images of the error field also allow us to derive probable versions of the products used in subsequent decision-making processes.

Keywords: digital elevation model, geostatistics, stochastic simulation, accuracy

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Castrignano2006accuracy.pdf (1.21 MB)

Accuracy improvement program for VMap1 to Multinational Geospatial Co-production Program (MGCP) using artificial neural networks

Luis Filipe Nunes
Portuguese Army Geographic Institute
Avª. Dr. Alfredo Bensaúde 
Tel.: + 351 218505313
lnunes@igeoe.pt

Abstract
As an institution deeply committed to and involved in international military geospatial co-production groups such as the Multinational Geospatial Co-production Program (MGCP), the Army Geographic Institute is assessing the spatial accuracy of Vector Smart Map Level 1 (VMap 1), so that the product can serve as the basis for the production of the Portuguese national territory according to the standards approved by all MGCP nations. In order to evaluate the positional, attribute and temporal accuracy of the data, a process was developed to check its suitability for a densification of the base information required by the 1:100k-scale MGCP project cells. Simultaneous tests were made to determine the uncertainty of positional and attribute elements in the main features of VMap 1, so as to evaluate the applicability of the data to both products. An expert system using artificial neural networks to improve the accuracy of data from the VMap 1 database, allowing its integration with the MGCP database in a common environment, is proposed and analyzed.

Keywords: MGCP, VMAP1, positional accuracy, attribute accuracy, artificial neural networks

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Nunes_p2006accuracy.pdf (698.18 KB)

Analysing the sensitivity of two variogram models for the characterisation of the spatial pattern of depth in rivers

Monica Rivas Casado 1, Sue White 1, Pat Bellamy 1, Mike Dunbar 2, Douglas Booker 2, Ian Maddock 3
1 Cranfield University at Silsoe, Institute of Water and Environment
Barton Road, Silsoe, Bedfordshire, MK45 4DT, UK
Tel.:+44(0)1525863354
m.rivas-casado.s03@cranfield.ac.uk 
2 Centre for Ecology and Hydrology, CEH Wallingford
Maclean Building, Benson Lane, Crowmarsh Gifford, Wallingford, Oxfordshire, OX10 8BB, UK
Tel.:+44(0)1491838800
mdu@ceh.ac.uk, dobo@ceh.ac.uk 
3 Department of Applied Sciences, Geography & Archaeology
University of Worcester, Henwick Grove, Worcester, WR 6AJ, UK.
Tel.:+44(0)1905855180
i.maddock@worc.ac.uk

Abstract
Depth data are measured in river channels to obtain a three-dimensional representation of the habitat available for the species inhabiting the system. The data can be interpolated through the application of geostatistical techniques to obtain information about areas that have not been sampled. This interpolation requires the fitting of a model to an experimental variogram, which is then used to predict values of depth at non-measured locations. Before calculating the experimental variogram, the following parameters need to be decided on: lag distance (or step), azimuth tolerance and maximum distance of analysis. This poster summarises the results of a sensitivity analysis of the calculation of experimental variograms with respect to the three parameters noted above, and demonstrates the effect on the fitting of two models: the spherical and the exponential. The analysis was carried out for depth data sets collected at eighteen river sites. Results show that it is necessary to be aware of the effect of the chosen values of the three parameters when estimating the experimental variogram and hence the model. Models can give distorted values of range, sill and nugget for variograms calculated with specific combinations of minimum lag distance and maximum distance. The spherical and exponential variogram models can introduce considerable differences in terms of spatial structure for the same river sites.
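The two models being compared have standard closed forms in terms of nugget, sill and range. A minimal sketch (using the common "practical range" convention for the exponential model, under which it reaches about 95% of the sill at the range):

```python
import numpy as np

def spherical(h, nugget, sill, rng_):
    """Spherical variogram: rises as 1.5(h/a) - 0.5(h/a)^3, flat at the sill
    beyond the range a; zero at h = 0."""
    h = np.asarray(h, dtype=float)
    g = np.where(h < rng_,
                 nugget + (sill - nugget) * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3),
                 sill)
    return np.where(h == 0, 0.0, g)

def exponential(h, nugget, sill, rng_):
    """Exponential variogram with practical range: approaches the sill
    asymptotically, ~95% of it at h = rng_; zero at h = 0."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1 - np.exp(-3 * h / rng_))
    return np.where(h == 0, 0.0, g)
```

The contrast the poster reports is visible directly: at the range the spherical model sits exactly at the sill while the exponential is still about 5% below it, so fitting the two to the same experimental points yields different apparent spatial structure.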

Keywords: spatial pattern, geostatistical analysis, kriging, river depth

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: CasadoWhite2006accuracy.pdf (577.99 KB)

Analysis of homogeneity and isotropy of spatial uncertainty by means of GPS kinematic check lines and circular statistics

Ángel M. Felicísimo 1, Aurora Cuartero 1 and María E. Polo 2
 1 Escuela Politécnica, Universidad de Extremadura
Avenida de la Universidad s/n, Cáceres 10071, Spain
Tel.: + 34 927 257 195; Fax: + 34 927 257 203
amfeli@unex.es; acuartero@unex.es
2 Centro Universitario de Mérida
Santa Teresa de Jornet, 38, Mérida 06800, Spain
Tel.: + 34 924 387 068; Fax: + 34 924 303 782
mepolo@unex.es

Abstract

We propose an alternative way to analyze planimetric error, with a special application to quality control of geometric corrections of satellite images. Our proposal proceeds in three stages: a) the capture of check lines by kinematic GPS procedures, b) the calculation of the disagreement between the GPS check lines and the corresponding linear structures (roads) extracted from the images, and c) the analysis of error by means of circular statistics. Kinematic GPS allows hundreds of check points along the roads to be obtained quickly. The automatic recognition and extraction of these roads allow the calculation of error vectors due to the defective overlap between the GPS data and the image. A vector represents the deviation between the GPS data (true position) and the corresponding point in the image; each vector is defined by its modulus and azimuth. The last step allows the magnitude, homogeneity and isotropy of the errors to be checked.
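The modulus/azimuth decomposition and the basic circular statistics can be sketched on synthetic error vectors; the deliberate eastward shift below is invented to show how the mean resultant length flags anisotropy:

```python
import numpy as np

rng = np.random.default_rng(4)

# Error vectors (dx, dy): deviations between GPS "truth" and image positions.
dx = rng.normal(0.5, 0.2, 200)   # systematic eastward shift (illustrative)
dy = rng.normal(0.0, 0.2, 200)

modulus = np.hypot(dx, dy)       # magnitude of each error vector
azimuth = np.arctan2(dx, dy)     # angle from north, clockwise positive (east = pi/2)

# Circular mean direction and mean resultant length R_bar:
# R_bar near 1 means the azimuths are concentrated (anisotropic errors);
# R_bar near 0 means they are spread around the circle (isotropic errors).
C, S = np.cos(azimuth).mean(), np.sin(azimuth).mean()
mean_dir = np.arctan2(S, C)
R_bar = np.hypot(C, S)
```

Here the concentration of azimuths around east (high `R_bar`) reveals the systematic shift; genuinely isotropic errors would give azimuths spread uniformly and a small `R_bar`.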

Keywords: spatial accuracy, error analysis, uncertainty, environmental data, model

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Felicisimo2006accuracy.pdf739.56 KB

Analysis of interpolation errors in urban digital surface models created from Lidar data

Gil Gonçalves
Faculdade de Ciências e Tecnologia da Universidade de Coimbra, 
Apartado 3008, 3001-154 Coimbra, Portugal
Tel.: + 351 239 79 11 50 ; Fax: + 351 239 832 568
gil@mat.uc.pt 

Abstract
In urban areas, Light Detection and Ranging (LiDAR) data is becoming a widely available source for the construction of Digital Surface Models (DSMs). Because urban surfaces have specific geometric characteristics, such as discontinuities in elevation and slope gradient, the interpolation of the irregularly spaced set of LiDAR points to a regular grid has to be done carefully. In fact, many commercial GIS interpolation packages are based on the assumption that the surface is smoothly undulating. The choice of one of these interpolation functions will introduce errors across the surface model, and these will be most pronounced in the presence of surface discontinuities. Any subsequent analysis of the interpolated urban surface model, such as flood or viewshed analysis, feature extraction or image segmentation, will be affected by the propagation of these errors. In this paper, the error patterns and general characteristics of six well-known interpolation methods (nearest neighbour, inverse distance weighting, triangulation with linear interpolation, minimum curvature, kriging and radial basis functions) are analysed using data both from synthetic surface models and from real urban scenes.
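The effect of a smooth interpolator at a discontinuity can be sketched with two of the six methods mentioned (nearest neighbour and inverse distance weighting) on a synthetic step surface; the point set and the 10 m "building edge" are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
pts = rng.uniform(0.0, 10.0, size=(400, 2))
# Synthetic urban discontinuity: a 10 m elevation jump at x = 5 (a wall)
z = np.where(pts[:, 0] < 5.0, 0.0, 10.0)

def nearest(p, pts, z):
    """Nearest-neighbour interpolation: keeps the discontinuity crisp."""
    return z[np.argmin(np.linalg.norm(pts - p, axis=1))]

def idw(p, pts, z, power=2):
    """Inverse distance weighting: a smooth estimator that blends
    values from both sides of the elevation jump."""
    d = np.linalg.norm(pts - p, axis=1)
    if d.min() < 1e-12:          # query point coincides with a sample
        return float(z[d.argmin()])
    w = 1.0 / d ** power
    return float((w * z).sum() / w.sum())

p = np.array([5.0, 5.0])         # a point right on the discontinuity
z_nn, z_idw = nearest(p, pts, z), idw(p, pts, z)
```

Nearest neighbour returns one of the two true surface levels, while IDW returns an intermediate elevation that exists nowhere on the real surface.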

Keywords: Digital Surface Models (DSMs), urban surfaces, Lidar data, interpolation errors

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Goncalves2006accuracy.pdf1.07 MB

Analysis of spatial variability of pH, P and K in red latosoil cultivated in direct planting system

Adriane Brill Thum 1,Getulio Cerioli 1 and Maurício Roberto Veronez 2
1 Civil Engineering, Vale do Rio dos Sinos University, 
São Leopoldo - RS, Brazil, Avenida Unisinos, 950 – CEP: 93022-000 
Tel.: 051 35911100 Extension 1769 Fax: 051 35908177 
adrianebt@unisinos.br
2 Post-Graduation Program in Geology, Vale do Rio dos Sinos University, 
São Leopoldo - RS, Brazil, Avenida Unisinos, 950 – CEP: 93022-000 
Tel.: 051 35911100 Extension 1769 Fax: 051 35908177 
veronez@unisinos.br

Abstract
Concern about environmental quality, together with new technologies such as Geographic Information Systems (GIS), has been helping many sectors, including agriculture. Digital mapping performed during harvesting, using machines equipped with mass sensors and GPS receivers, allows the geographic identification of the sites best suited to high productivity. This technique is known as precision agriculture. With it, the specific factors of each site can be analysed, allowing its productive potential to be maximised. Increased productivity in cultivated fields comes only with knowledge of the soil attributes and their characterisation in specific areas within the cultivated plot. In a direct (no-till) planting system, the surface broadcasting of limestone and the application of traditional fertilisers in the sowing line produce vertical and horizontal variability in some soil attributes, which may cause great variation in the content of nutrients with low mobility in the soil, even between samples collected a few centimetres apart. Against this background, the aim of this research is to evaluate the variability of phosphorus and potassium contents and of soil pH, from samples collected at depths of 0–10 cm and 10–20 cm. The study was carried out on a plot that has been cultivated for twelve years under a direct planting system, part of a farm in the countryside of Lagoa Vermelha, Rio Grande do Sul State, Brazil. The exact geographic locations of the sampling points and the geostatistical treatment allowed the definition of the spatial variability and nutrient dispersion across the analysed area. The preparation of variograms to quantify the range of each attribute, together with the kriging technique, made it possible to draw thematic maps of the analysed attributes.
Results show that in the study area the soil pH needs correction in specific places, and the geostatistical analysis showed that variable-rate application of phosphorus and potassium is not feasible.
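The variogram-plus-kriging step can be sketched as ordinary kriging under stated assumptions (a known spherical model with zero nugget, and hypothetical pH values at four sample points; these are not the study's parameters or data).

```python
import numpy as np

def spherical_cov(h, sill=1.0, a=30.0):
    """Covariance from a spherical variogram with zero nugget, range a."""
    s = np.clip(np.asarray(h, dtype=float) / a, 0.0, 1.0)
    return sill * (1.0 - (1.5 * s - 0.5 * s ** 3))

def ordinary_kriging(coords, values, target):
    """Ordinary kriging in covariance form, with a Lagrange multiplier
    enforcing that the weights sum to one."""
    n = len(values)
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical_cov(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = spherical_cov(np.linalg.norm(coords - target, axis=1))
    w = np.linalg.solve(A, b)
    return float(w[:n] @ values)

# Hypothetical soil-pH samples at the corners of a 10 m square
coords = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
ph = np.array([5.2, 5.6, 5.4, 6.0])
center = ordinary_kriging(coords, ph, np.array([5.0, 5.0]))
# By symmetry, the estimate at the centre equals the sample mean here
```

Evaluating this estimator over a grid of targets is what produces the thematic maps the abstract describes.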

Keywords: analysis, spatial variability, geostatistics

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Thum2006accuracy.pdf1.14 MB

Analyzing the Accuracy of Spatial and Temporal Dynamics of Land Use Pattern in Turkey: Case study in İnayet and Yenice Forest Planning Units

Ali Ihsan Kadıoğulları 1, E. Zeki Başkent 2, Sedat Keleş 3 and Alkan Günlü 1
1 Karadeniz Technical University, Faculty of Forestry, 61080, Trabzon, Turkey
Tel: +90 462 377 28 46, Fax: +90 462 325 74 99
alikadi@ktu.edu.tr , alkan61@ktu.edu.tr 
2 Karadeniz Technical University, Faculty of Forestry, 61080, Trabzon, Turkey
Tel: +90 462 377 37 34, Fax: +90 462 325 74 99
baskent@ktu.edu.tr
3 Karadeniz Technical University, Faculty of Forestry, 61080, Trabzon, Turkey
Tel: +90 462 377 28 36, Fax: +90 462 325 74 99
skeles@ktu.edu.tr

Abstract
Recognition and understanding of landscape dynamics as a historical legacy of disturbances are necessary for the sustainable management of forest ecosystems. This study analysed the effect of accuracy when analysing forest dynamics and spatio-temporal changes in the land use/land cover pattern of a sub-temperate alluvial forest area of 32,660 ha near the Aegean coast of Turkey (İnegöl). The area was studied by comparing LANDSAT images from 1987 and 2001, and evaluated with spatial analyses of forest cover type maps from 1972, 1993 and 2004 using GIS. The study investigated temporal changes in the spatial structure of forest conditions over the period using FRAGSTATS™ software. The results showed that the forested areas increased between 1972 and 1993 (3.61%), between 1993 and 2004 (3.37%), over the whole 1972–2004 period (7.10%), and between 1987 and 2001 (4.74%). In terms of spatial configuration, forest areas were generally fragmented in the latter periods owing to intensive forest utilisation, illegal use, expansion of settlements and infrastructural development in the lowlands. The land use pattern changed significantly over time, depending on factors such as unregulated management actions, social pressure and demographic movements. In conclusion, land use changed in favour of forestry between 1972–2004 and 1987–2001. The study also analysed the agreement between the forest stand type map (accepted as ground truth) and the land cover map produced from the Landsat imagery. The overall accuracy of the supervised classification used to determine cover types was 91% for Landsat TM (1987), with a conditional kappa (per-category) value of 0.945, and 91.04% for Landsat ETM (2001), with a kappa value of 0.9018.
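The overall accuracy and kappa statistics reported above are derived from an error matrix; a minimal sketch with an invented 3-class confusion matrix (not the study's data) shows the computations.

```python
import numpy as np

# Toy confusion matrix (rows: reference classes, cols: mapped classes)
cm = np.array([[50,  3,  2],
               [ 4, 60,  1],
               [ 2,  2, 76]], dtype=float)

n = cm.sum()
po = np.trace(cm) / n                        # overall accuracy
pe = (cm.sum(0) * cm.sum(1)).sum() / n ** 2  # chance agreement
kappa = (po - pe) / (1.0 - pe)               # Cohen's kappa

def conditional_kappa(cm, i):
    """Conditional (per-category) kappa for class i, user's-accuracy form."""
    n = cm.sum()
    pii = cm[i, i] / n
    pi_dot = cm[i, :].sum() / n   # row marginal
    pdot_i = cm[:, i].sum() / n   # column marginal
    return (pii - pi_dot * pdot_i) / (pi_dot - pi_dot * pdot_i)
```

For this toy matrix the overall accuracy is 0.93 while kappa is lower (about 0.89), since kappa discounts agreement expected by chance.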

Keywords: Land use/Land cover change, GIS,  Remote Sensing, Fragmentation, Forest Management

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Kadiogullari2006accuracy.pdf1.77 MB

Application of statistics to detection of green resources changes at Yangmingshan area using remote sensing imagery

Shu-Ping Teng 1, Ke-Sheng Cheng 2, Hann-Chung Lo 1 and Yeong-Kuan Chen 3
1 School of Forestry and Resource Conservation, National Taiwan University
No. 1, Sec. 4, Roosevelt Road, Taipei 10617, Taiwan
Tel.: + 886 233664624; Fax: +886 223654520
d90625004@ntu.edu.tw; hclo@ntu.edu.tw
2 Department of Bioenvironmental Systems Engineering, National Taiwan University
No. 1, Sec. 4, Roosevelt Road, Taipei 10617, Taiwan
Tel.: + 886 233663465; Fax: +886 223635854
rslab@ntu.edu.tw
3 Department of Leisure & Recreation Management, TOKO University
No.51, Sec. 2, University Road, Pu-Tzu City, Chia-Yi County, 613  Taiwan 
Tel.: + 886 233664624; Fax: +886 23366
ykchen@ntu.edu.tw

Abstract
Change detection is one of the most important remote sensing applications for environmental monitoring, and green resources investigation is essential for the sustainable development of natural resources. In this study, change detection of green resources in the Yangmingshan area near Taipei was conducted using multi-temporal remote sensing images. The change detection process comprised several steps. Firstly, preprocessing of the satellite images, including geometric and atmospheric corrections, was conducted. Supervised land cover classification using the maximum likelihood method was then applied to extract unchanged vegetation pixels. A bivariate normal distribution for the NDVI of unchanged vegetation pixels was then established and used to construct the critical region for change detection. At the 5% level of significance, the proposed approach detected changes in approximately 8.47% of green resources over a period of 15 years (1986 to 2001).
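The bivariate-normal critical region can be sketched as a Mahalanobis-distance test against the chi-square quantile at the 5% significance level (synthetic NDVI pairs; assumes NumPy and SciPy).

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(3)
# Hypothetical NDVI pairs for unchanged vegetation pixels (dates 1 and 2):
# strongly correlated values around 0.6 with small temporal noise
n1 = rng.normal(0.6, 0.05, 2000)
unchanged = np.column_stack([n1, n1 + rng.normal(0.0, 0.02, 2000)])

mu = unchanged.mean(0)
cov = np.cov(unchanged, rowvar=False)
inv = np.linalg.inv(cov)
crit = chi2.ppf(0.95, df=2)   # critical squared Mahalanobis distance, alpha = 5%

def changed(pixels):
    """Flag pixels whose NDVI pair falls outside the 95% ellipse."""
    d = pixels - mu
    md2 = np.einsum('ij,jk,ik->i', d, inv, d)
    return md2 > crit

test_pixels = np.array([[0.60, 0.61],    # stable vegetation
                        [0.65, 0.20]])   # vegetation lost between dates
flags = changed(test_pixels)
```

A pixel close to the fitted cloud is kept as "no change"; a pixel whose second-date NDVI collapses is flagged.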

Keywords: change detection, remote sensing, geometric correction, statistics

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Teng2006accuracy.pdf2.78 MB

Artificial Neural Networks applied in the determination of Soil Surface Temperature – SST

Maurício Roberto Veronez 1, Adriane Brill Thum 1, Anderson Silva Luz 2 and Deivis R. da Silva 2
1 Graduate Program in Geology - Vale do Rio dos Sinos University
Avenida Unisinos, 950, São Leopoldo - RS, Brazil – CEP: 93022-000
Tel.: + 55 (51) 3591-1100 – Extension 1769 Fax: + 55 (51) 3590-8177
veronez@unisinos.br, adrianebt@unisinos.br
2 Students of Civil Engineering. - Vale do Rio dos Sinos University
Avenida Unisinos, 950, São Leopoldo - RS, Brazil – CEP: 93022-000
Tel.: + 55 (51) 3591-1100 – Extension 1769; Fax: + 55 (51) 3590-8177
andersonluz@terra.com.br

Abstract
Artificial intelligence techniques are being used to facilitate modelling in many different research areas. One example is the use of Artificial Neural Networks (ANNs). ANNs are not a new technique and have been studied since the 1940s. The technique was almost forgotten during the 1970s, but reappeared in the 1980s as a possible alternative to traditional computing. Today, ANNs are used in many projects, especially for forecasting data, learning algorithms, optimising systems and recognising patterns, among other uses. The surface temperature (ST) is a parameter influenced by weather conditions (temperature and relative air humidity, wind speed, precipitation, etc.) and indicates the water status of a plant. Estimating the ST is therefore very useful in monitoring projects that attend to the water demands of crops, which contributes to irrigation programmes. Another important application is in the determination of evapotranspiration, which, together with other components of the hydrological cycle, is important for evaluating the recharge of underground aquifers. A recent method of estimating the ST uses the analysis of NOAA-AVHRR thermal images with the split-window equation. This modelling relates emissivity variables (generated from the images) to atmospheric data. It is a complex methodology because, besides the difficult statistical modelling, the images must be digitally processed to determine the emissivity. ANNs are indicated in this study because of their excellent capacity for generalisation, classification, interpolation and extrapolation, their tolerance to errors and noise, and the fact that they do not require the specification of explicit parameters to complete the modelling process. The purpose of this project is to verify the possibility of using ANNs to estimate soil temperatures. The test area selected is part of an urban area located in Ivoti, RS.
At different points in the area, ST values were collected using a portable temperature sensor. For each collection point, the position was given by the UTM coordinates (E, N) and the elevation (H). Different network topologies were trained in a supervised way with the backpropagation algorithm, using the E, N and H coordinates as inputs and producing ST as output. The best topology found was the best achievable within the time available for the successive refinements required. Although several tests were carried out with two and three hidden layers, a simple topology was adopted, with a single hidden layer of 3 and 5 neurons. Four learning algorithms were tested. Scaled Conjugate Gradient, Levenberg-Marquardt and Resilient backpropagation produced ST values with mean errors below 0.9 ºC in the simulations, while Gradient Descent showed a mean error below 1.9 ºC.
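The supervised backpropagation training described above can be sketched as a single-hidden-layer network in plain NumPy (synthetic coordinates and temperatures with a standardised target; the 5-neuron topology and the learning rate are illustrative choices, not the study's).

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical survey points: rescaled easting, northing and elevation in
# [0, 1], with a temperature that falls with elevation plus a mild trend
X = rng.uniform(0.0, 1.0, size=(200, 3))
t = 25.0 - 8.0 * X[:, 2] + 2.0 * np.sin(3.0 * X[:, 0]) \
    + rng.normal(0.0, 0.3, 200)
y = ((t - t.mean()) / t.std())[:, None]   # standardised target

# One hidden layer of 5 tanh neurons, trained with plain backpropagation
W1 = rng.normal(0.0, 0.5, (3, 5)); b1 = np.zeros(5)
W2 = rng.normal(0.0, 0.5, (5, 1)); b2 = np.zeros(1)
lr = 0.05

losses = []
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)              # forward pass
    pred = h @ W2 + b2
    err = pred - y
    losses.append(float((err ** 2).mean()))
    g2 = 2.0 * err / len(X)               # dMSE/dpred
    gW2 = h.T @ g2; gb2 = g2.sum(0)
    gh = (g2 @ W2.T) * (1.0 - h ** 2)     # back through tanh
    gW1 = X.T @ gh; gb1 = gh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2        # gradient-descent update
    W1 -= lr * gW1; b1 -= lr * gb1
```

The recorded loss curve falls well below the initial value, illustrating the supervised fitting of ST from (E, N, H).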

Keywords: Artificial Neural Networks, modeling, soil surface temperature

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
VeronezThum2006accuracy.pdf925.54 KB

Assessing spatial clustering of MRSA with stochastic simulations, kernel estimation and SaTScan

L. Bastin 1, J. Rollason 2, A.C. Hilton 2, D.G. Pillay 3, C. Corcoran 3, J. Elgy 1, P. Lambert 2, T. Worthington 2, P. De 3, and K. Burrows 2
1 School of Engineering and Applied Science, University of Aston
Birmingham, UK
+44 (0) 121-204-3560
l.bastin@aston.ac.uk
2 School of Life and Health Sciences, University of Aston
Birmingham, UK
3 Good Hope Hospital NHS Trust
Sutton Coldfield, Birmingham, UK

Abstract
Apparent spatial disease clusters may stem from a combination of factors, including transmission events between individuals, heterogeneous environmental influences, population clustering and/or chance. 832 cases of methicillin-resistant Staphylococcus aureus (MRSA) in the West Midlands (UK) were located to postcode-centroid level to test for evidence of community transmission. In an exploratory kernel estimation analysis, clustering effects due to local population density were visualized and assessed for significance by thresholding against ‘spatial nulls’ (based on the 97.5th percentile of 1000 age-stratified Poisson-process realisations with no a priori assumptions of spatial autocorrelation). This approach, combined with spatial and spatio-temporal scans, was of particular value in identifying apparent outbreaks at nursing and residential care homes. An attempt to disaggregate the approach to postcodes caused notable accuracy problems in modelling expected MRSA occurrences, biasing the apparent significance of localised occurrences. Stochastically simulated cases were therefore aggregated to Census Output Area centroids to mitigate the effects of spatial aggregation in the real data. Isolates of methicillin-sensitive Staphylococcus aureus (MSSA) from the same region and time period were used as controls in a ‘random labelling’ approach to investigate possible variation in testing intensity among family doctors and primary health centres. We demonstrate the combination of standard spatial epidemiological tools with more novel simulation techniques in an exploratory analysis which identified community MRSA clusters. In the absence of occupational/lifestyle data on the patients, the assumption was made that an individual’s location, and consequent risk, is adequately represented by their residential postcode. The problems of this assumption are discussed.
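The ‘spatial null’ thresholding can be sketched per areal unit (invented populations and baseline rate; the real analysis thresholded age-stratified kernel surfaces rather than raw counts): observed counts are compared against the 97.5th percentile of 1000 Poisson realisations.

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical populations for 50 census output areas and an invented
# regional baseline infection rate
pop = rng.integers(200, 1000, 50)
rate = 0.005
observed = rng.poisson(pop * rate)
observed[7] += 20   # inject an outbreak-like excess into area 7

# 1000 Poisson realisations under the spatial null hypothesis
# (case counts driven by population alone, no transmission clustering)
sims = rng.poisson(pop * rate, size=(1000, 50))
threshold = np.percentile(sims, 97.5, axis=0)   # per-area null threshold

significant = np.flatnonzero(observed > threshold)
```

Areas whose observed counts exceed their simulated threshold are candidate clusters; the injected excess in area 7 is recovered.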

Keywords: epidemiology, stochastic simulation, cluster, MRSA

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Bastin2006accuracy.pdf926.85 KB

Assessing the Thematic Accuracy for Coral Reef Classification

Supawan Wongprayoon 1, Carlos A.O. Vieira 2, and Joseph J. H. Leach 1
1 Department of Geomatics, 
The University of Melbourne, 
Parkville , VIC 3010,Australia 
s.wongprayoon@pgrad.unimelb.edu.au, leach@unimelb.edu.au
2 Departamento de Engenharia Civil, 
Universidade Federal de Viçosa, 
Viçosa, MG, 36570-000, Brazil 
carlos.vieira@ufv.br

Abstract
This paper describes methods for assessing classifier performance for coral reef classifications that explicitly include the spatial pattern of classification errors, and which present the user with a visual indication of the reliability of the pixel label assignments. Two coral reef test sites in Thailand were used in this study, with two IKONOS images covering the respective sites. A supervised classification was performed using the maximum likelihood classifier. Non-spatial statistics, such as overall accuracy, the Kappa coefficient, variance and the Z statistic, were computed from the error matrix, and thematic images were also generated. None of these statistics explicitly considers the spatial distribution of misclassified pixels. Results show that although the statistical accuracy measures give outstanding values, accuracy is not spatially homogeneous from pixel to pixel across the image. Thus, a considerable amount of research and development needs to be accomplished before the spatial characterisation of thematic accuracy associated with remote sensing products can be adequately reported in standardised formats and legends.

Keywords: reliability of the remote sensing products, thematic accuracy, coral reef mapping, IKONOS, image classification

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Wongprayoon2006accuracy.pdf1.11 MB

Assessing the accuracy of hexagonal versus square tiled grids in preserving DEM surface flow directions

Luís de Sousa, Fernanda Nery, Ricardo Sousa and João Matos
Departamento de Engenharia Civil e Arquitectura
Instituto Superior Técnico (IST)
Avenida Rovisco Pais
1049-001 Lisbon, Portugal
Tel. +351-218418350, Fax: +351-218419765
lads@mega.ist.utl.pt; nery@ist.utl.pt; rts@civil.ist.utl.pt; jmatos@civil.ist.utl.pt

Abstract
The theoretical advantages of hexagonal grids over rectangular grids have been known for many years. Among these, two stand out: the higher spatial resolution achieved with the same number of samples, and the isotropy of local neighbourhoods. This work explores these advantages in the representation of flow directions used in hydrologic modelling. The 3 arc-second resolution DEM data collected by the Shuttle Radar Topography Mission (SRTM) was resampled to increasingly lower-resolution grids, both of squares and of hexagons. The flow direction vectors were computed in each of these grids using the steepest-downslope-neighbour criterion; for the hexagonal grids an equivalent model was used. Reference data were obtained from the original full-resolution DEM by calculating the resultant flow direction vectors of the samples contained inside each of the lower-resolution cells. The accuracy of each model in preserving the original DEM flow direction characteristics was assessed by comparing the angles defined by the lower-resolution flow vectors with the resultant vectors of the corresponding full-resolution samples. In the data analysis phase, the influence of local terrain morphology and variability was evaluated. A set of tests was conducted in the Minho River basin (ca. 16,950 km2 in the northwest of the Iberian Peninsula). The results obtained suggest a superior capacity of the hexagonally tiled grids to maintain the original flow directions, as given by the resultant vector of the original samples.
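The steepest-downslope-neighbour comparison can be sketched on an inclined plane (the gradient values are illustrative; NumPy only): for this dip direction, the six hexagonal directions land closer to the true downslope azimuth than the eight square-grid directions do.

```python
import numpy as np

def surface(x, y):
    """A plane dipping towards the south-west (illustrative gradient)."""
    return 0.01 * x + 0.03 * y

def steepest_direction(offsets):
    """Math angle (degrees CCW from +x) of the neighbour with the
    steepest downslope gradient, i.e. largest drop per unit distance."""
    z0 = surface(0.0, 0.0)
    slopes = [(z0 - surface(dx, dy)) / np.hypot(dx, dy) for dx, dy in offsets]
    dx, dy = offsets[int(np.argmax(slopes))]
    return float(np.degrees(np.arctan2(dy, dx)) % 360.0)

# Neighbour offsets for the two tessellations (equal cell spacing)
square = [(i, j) for i in (-1, 0, 1) for j in (-1, 0, 1) if (i, j) != (0, 0)]
hexagon = [(np.cos(a), np.sin(a))
           for a in np.radians(np.arange(0.0, 360.0, 60.0))]

true_dir = float(np.degrees(np.arctan2(-0.03, -0.01)) % 360.0)  # ~251.6 deg
sq_dir = steepest_direction(square)    # quantised to multiples of 45 deg
hx_dir = steepest_direction(hexagon)   # quantised to multiples of 60 deg
```

Here the square grid snaps the flow to 270° while the hexagonal grid snaps it to 240°, a smaller angular error relative to the true 251.6° downslope direction.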

Keywords: regular tessellation, hexagonal grid, flow direction

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
deSousa2006accuracy.pdf340.89 KB

Best locations for river water quality monitoring sensors through fuzzy interpolation

Angelo Marcello Anile 1, Salvatore Spinella 2 and Marco Ostoich 3
1 Università degli studi di Catania
Dipartimento di Matematica e Informatica,
viale A. Doria 6, 95125 Catania, Italy
anile@dmi.unict.it
2 Consorzio Catania Ricerche
Via A. Sangiuliano 262, I95124 Catania, Italy 
spins@unical.it
3 ARPAV - Dipartimento Prov. Padova
Osservatorio Regionale Acque Interne
Ufficio Studi e Progetti
Piazzale Stazione n. 1 35131 Padova, Italy
mostoich@arpa.veneto.it

Abstract
This work concerns the interpolation of environmental data using fuzzy splines in order to monitor water quality in a river. A fuzzy interpolated model representing the river water quality is constructed and then queried in order to retrieve information useful for planning precautionary measures. Moreover, the retrieved information can be used to improve the distribution of the monitoring sensors over the basin area, optimising coverage. Geographical data concerning environmental pollution consist of a large set of temporal measurements (e.g. monthly measurements over one year) at a few scattered spatial sites. In this case the temporal data at a given site must be summarised in some form in order to serve as input for building a spatial model. Summarising the temporal data (data reduction) necessarily introduces some form of uncertainty, which must be taken into account. Fuzzy numbers can represent this uncertainty in a conservative way without any a priori statistical hypothesis. This method has been applied to ocean-floor geographical data by Patrikalakis (1995), in the interval case, and by Anile et al. (2000), for fuzzy numbers, and to environmental pollution data by Anile et al. (2004). Fuzzy interpolation is carried out with splines to obtain a deterministic model of the environmental pollution data. The model is then interrogated with fuzzy queries to find the sites exceeding a quality threshold. The results suggest the areas of the basin that should be subjected to further rigorous examination; the method could therefore be useful for reorganising the monitoring network so as to be more representative of water quality.

Keywords: uncertainty, fuzzy number, fuzzy interpolation, fuzzy queries, spline

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Anile2006accuracy.pdf1.06 MB

Categorical models for spatial data uncertainty

Sarah L. Hession, Ashton M. Shortridge and Nathan M. Torbick
Department of Geography
Michigan State University
East Lansing, MI 48824, USA
Tel.: + 001 517 355 4649; Fax: + 001 517 432 1671
hessions@msu.edu; ashton@msu.edu; torbick@msu.edu

Abstract
Considerable disparity exists between the current state of the art for categorical spatial data error modeling and the current state of the practice for reporting categorical data quality. On one hand, the general Monte Carlo simulation-based error propagation framework is a fixture in spatial data error handling; researchers have identified potentially powerful approaches to characterizing categorical data error so that its effects on application uncertainty may be assessed. On the other hand, standard data quality assessments for categorical data are 'spatially unaware,' fail to provide critical information for error propagation models, and neglect the fitness for use paradigm underlying the longstanding rationale for accuracy metadata. Many error assessments rely on area-averaged indicators of map error that do not reflect spatial variability, such as the confusion matrix. How might this gulf between state of the art and state of the practice be bridged? In the present work we lay the foundation for such an edifice: we contrast several categorical error models proposed in the literature in terms of input parameters and performance for a heterogeneous land cover dataset. Familiar methods such as the confusion matrix are considered for their utility in developing error propagation models, as well as theoretically-based, spatially explicit methods like indicator simulation that are not commonly employed in applied research. We develop a comparative matrix to summarize different model requirements, characteristics, and performance, and utilize available secondary data sources where possible to develop improved inputs for the analysis of uncertainty propagation.
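A confusion-matrix-driven error model of the kind discussed can be sketched as Monte Carlo realisations (invented matrix and map; each pixel is perturbed independently, which is precisely the 'spatially unaware' limitation the abstract notes).

```python
import numpy as np

rng = np.random.default_rng(6)
# Toy confusion matrix (rows: mapped class, cols: reference class)
cm = np.array([[80, 15,  5],
               [10, 70, 20],
               [ 5, 10, 85]], dtype=float)
# Conditional probability of each reference class given the mapped class
p_true_given_mapped = cm / cm.sum(axis=1, keepdims=True)

mapped = rng.integers(0, 3, size=(50, 50))   # a hypothetical land cover map

def simulate(mapped, p, rng):
    """One spatially unaware error realisation: each pixel's class is
    redrawn independently from its row of the confusion matrix."""
    flat = mapped.ravel()
    u = rng.random(flat.size)
    cum = np.cumsum(p[flat], axis=1)          # per-pixel CDF over classes
    return (u[:, None] < cum).argmax(axis=1).reshape(mapped.shape)

realisations = np.stack([simulate(mapped, p_true_given_mapped, rng)
                         for _ in range(100)])
# Per-pixel agreement with the mapped class across realisations
agreement = (realisations == mapped).mean(axis=0)
```

Feeding such realisations through a GIS operation is the Monte Carlo error propagation framework the abstract refers to; spatially explicit alternatives such as indicator simulation would add spatial correlation between the per-pixel draws.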

Keywords: categorical data, land cover data, indicator kriging, indicator simulation, uncertainty propagation

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Hession2006accuracy.pdf759.26 KB

Checking the spatial accuracy of class boundaries with a varying range of accuracy requirements

David Bley 1 and Ruedi Haller 2 
1 Swiss National Park (SNP)
Schlossstr. 25, 38871 Ilsenburg, Germany
Tel.: + 49 171 4789890
david_bley@hotmail.com
2 Swiss National Park (SNP)
Casa dal park, 7530 Zernez, Switzerland
Tel.: +41 81 856 12 82; Fax: +41 81 845 17 40
rhaller@nationalpark.ch

Abstract
This paper presents a method to assess the horizontal accuracy of class boundary locations in large-scale categorical data for large mountainous and protected areas. The test was performed on a land cover dataset created for the Swiss National Park (SNP), with an overall area of 370 km2. The average polygon area was 0.66 ha, with a minimum distance between two lines of 5 metres. A stratified random sampling design was therefore required, owing to the lack of sufficiently accurate reference data and to restricted access to the area under conservation policies. Additionally, we took into account the assumption that spatial properties in maps are a function of thematic categories, and that accuracy requirements differ for boundaries between different classes. The comparison is based on orthoimages with a well-defined spatial quality description. We defined all possible pairs of neighbouring habitat classes and classified the required accuracy in a confusion matrix; the accuracy values are based on the guidelines for interpretation. We intersected a regular set of directed transect lines with the dataset and obtained a point dataset of all intersections between transect lines and class boundaries. The intersection points were buffered according to the varying accuracy requirements between the adjacent habitat types. With an independent, manually operated control procedure, the accuracy of each line was accepted or rejected according to whether the delineation fell within or outside the buffer. In an area lacking higher-accuracy data or subject to restricted access, the method has proved a simple approach for assessing the overall accuracy of categorical data, as well as giving a well-differentiated picture of the delineation of different adjacent classes. Moreover, the design of the control points guarantees an overall view over a large area.

Keywords: spatial uncertainty, categorical data, accuracy assessment

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Bley2006accuracy.pdf605.64 KB

Classification of Spatial Variation Pattern for Identifying Pollution Sources: A Case Study on Sediment Contamination in Bizerte Lagoon, Tunisia

Mitsuo Yoshida 1 and RPP-SEPMCL2002 Shipboard Scientist Team 2
1 Institute for International Cooperation, Japan International Cooperation Agency (JICA)
10-5, Ichigaya Honmuracho, Shinjuku-ku, Tokyo 162-8433 JAPAN
Tel.: + 0081 3 3269 3851; Fax: + 0081 3 3269 6992
Yoshida.Mitsuo.2@jica.go.jp
2 Institut National de Recherche Scientifique et Technique (INRST)
B.P. 95, 2050 Hammam-Lif, TUNISIA

Abstract
The concentrations of potentially toxic elements (PTEs) were measured for a total of 180 sediment samples collected from the Upper Layer (surface sediments) and the Lower Layer (sediment repository) of the lagoon bottom sediments of the Bizerte lagoon, on the Mediterranean coast of northern Tunisia. The spatial variation pattern of PTE contamination in the lagoon bottom sediments is classified into three types. (i) Type I: a ‘peak’-shaped, very high concentration zone appears only at several shoreline sites near the Menzel Bourguiba industrial zone, while the other sites show a relatively homogeneous distribution with low concentration values. This pattern implies that the contamination is caused by effluent from the industrial zone and has not spread widely towards the lagoon basin. Twelve elements are of this type: Sb, As, Ba, Cd, Cu, Pb, Mo, Se, Ag, Tl, U and Zn. Three heavy metals, Ba, Pb and Zn, and two metalloids, As and Se, show toxic values above the regulatory level. (ii) Type II: a ‘plateau’-shaped high concentration zone extends from the south-western part (Menzel Bourguiba, Tinja and the area to its north) to the west-central part of the lagoon basin, with the concentration decreasing significantly eastward, southward and northward. This ‘plateau’ pattern implies that the pollution source is on the south-western side of the lagoon and that the contamination migrates towards its central part. Seven elements show this pattern: Al, Cr, Co, Fe, Mn, Ni and V. The concentrations of five metals, Al, Cr, Co, Mn and V, are above or around the criteria. (iii) Type III: the Hg concentration shows a unique spatial variation in which higher values appear only near the south-eastern shoreline of the lagoon basin. The Hg contamination is probably due to agricultural chemicals, such as pesticides, derived from the agricultural zone to the east and south.

Keywords: sediment contamination, lagoon, potentially toxic elements, spatial variation

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Yoshida2006accuracy.pdf981.64 KB

Comparing accuracy of classified Landsat data with land use maps reclassified from the stand type maps

Fatih Sivrikaya, Sedat Keles, Günay Cakir, Emin Zeki Baskent and Selahattin Köse
Karadeniz Technical University, Faculty of Forestry
61080, Trabzon, Turkey
Tel: +90 462 377 37 34, Fax: +90 462 325 74 99
fatihs@ktu.edu.tr; gcakir@ktu.edu.tr; skeles@ktu.edu.tr; baskent@ktu.edu.tr; skose@ktu.edu.tr

Abstract
A key step in natural resource management is the delineation of land units that are similar in type, structure, and productivity of vegetation in a given area. This is accomplished with a land classification system based on satellite images of appropriate specifications, which serves as an essential database for forest management planning. Managing forests on an ecosystem basis relies on accurate forest land classification that responds in a consistent manner to management practices. The integration of remotely sensed data into a GIS offers a wide variety of new possibilities for the analysis, evaluation and interpretation of such data in combination with auxiliary digital map information. The aim of this study is to compare the accuracy of classified data with land use maps reclassified from stand type maps. Two forest planning areas in Turkey, the Artvin and Bulanıkdere Forest Planning Units (FPUs), were selected as case study areas. Two land use maps were produced, one from Landsat ETM+ (2000) data and one by reclassifying the stand type maps. The results show that the accuracies of the classified Landsat ETM+ data for the Artvin and Bulanıkdere FPUs are 82.14% and 88.75%, respectively. The classified Landsat ETM+ data were then converted to a vector database, overlaid with the reclassified stand type maps, and the appropriate spatial analyses were carried out in GIS. These analyses show that the accuracy of the vectorized Landsat ETM+ data is around 5–10% lower than that of the classified Landsat ETM+ data. In addition, taking the land use map derived from the stand type map as reference, the accuracy of the land use map derived from Landsat ETM+ data is 64.2% in the Bulanıkdere FPU and 44.2% in the Artvin FPU. These differences result from the slope and vegetation homogeneity of the case study areas, and from geometric accuracy.
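
The overall accuracies quoted above come from confusion-matrix analysis. As a minimal illustration of that computation (the matrix below is invented for illustration, not the study's data):

```python
# Overall accuracy from a confusion matrix: correctly classified
# samples lie on the diagonal. Rows are reference classes, columns
# are mapped classes; the counts below are hypothetical.
confusion = [
    [82, 5, 3],
    [4, 90, 6],
    [2, 8, 70],
]

total = sum(sum(row) for row in confusion)
correct = sum(confusion[i][i] for i in range(len(confusion)))
overall_accuracy = 100.0 * correct / total

# Per-class producer's accuracy: diagonal count / reference (row) total.
producers = [100.0 * confusion[i][i] / sum(confusion[i])
             for i in range(len(confusion))]

print(f"overall: {overall_accuracy:.2f}%")
print("producer's:", [f"{p:.1f}%" for p in producers])
```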

Keywords: supervised classification, Landsat, forest management planning, accuracy 

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Sivrikaya2006accuracy.pdf (2.73 MB)

Comparison of Methods for Deriving a Digital Elevation Model from Contours and Modelling of the Associated Uncertainty

Anicet Sindayihebura 1, Marc Van Meirvenne1 and Stanislas Nsabimana2
1 Department of Soil Management and Soil Care, Ghent University
Coupure 653, 9000 Gent, Belgium
Tel.: + 32 (0)9 264 6056; Fax: + 32 (0)9 264 47
anicet_fr@yahoo.fr; marc.vanmeirvenne@ugent.be
2 Department of Geography, University of Burundi
P.O. Box: 1550 Bujumbura, Burundi
Tel.: + 257 244006; Fax: + 257 22 3288
snsabim@yahoo.fr

Abstract
Topographic maps are the most common sources of Digital Elevation Models (DEMs). On these maps elevation data are explicitly displayed along contour lines. However, such an input is not fully representative of the terrain shape, which is known to vary continuously and smoothly in space. Moreover, in many instances the DEM is not the final objective. In our case, we intend to use it, along with its derivatives, as secondary information in the interpolation of soil attributes, so DEM errors will propagate into the target variable. Assessment of the amount and spatial distribution of these errors is therefore crucial. Often what matters is not the absolute DEM accuracy but the reproduction of the terrain shape; in this respect quantitative assessment criteria should be complemented by qualitative criteria that account, for example, for artefacts and inconsistencies. The topographic data source is a 1:50,000 topographic map with a 20 m contour interval. The study area covers 14.8 km2 and comprises two contiguous watersheds located in a mountainous and lithologically contrasting zone of Burundi. Correlations between soil attributes and DEM-derived topographic attributes are significant, but too weak to be readily exploited in soil attribute interpolation. The propagation of DEM uncertainty into the DEM derivatives is also too pronounced and precludes their use as secondary information in soil attribute interpolation. We therefore suggest converting the DEM into topographic units and using these, together with lithologic units, as categorical soft information.
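
As an aside on how DEM errors feed into derivatives: slope is typically obtained from finite differences of neighbouring elevations, so elevation errors enter the difference quotients directly. A minimal sketch using central differences and made-up elevations (not necessarily the derivative scheme used in the paper):

```python
import math

# Hypothetical 3x3 elevation window (metres) on a 20 m grid;
# central-difference slope at the centre cell, a common DEM derivative.
z = [[100.0, 102.0, 104.0],
     [101.0, 103.0, 105.0],
     [102.0, 104.0, 106.0]]
cell = 20.0

dz_dx = (z[1][2] - z[1][0]) / (2 * cell)   # west-east gradient
dz_dy = (z[2][1] - z[0][1]) / (2 * cell)   # south-north gradient
slope_deg = math.degrees(math.atan(math.hypot(dz_dx, dz_dy)))
print(f"slope: {slope_deg:.2f} degrees")
```

Because each gradient divides an elevation difference by twice the cell size, a 1 m contour-derived elevation error perturbs the gradient by 1/(2 × 20) = 0.025, which is large relative to the gradients themselves here.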

Keywords: Digital Elevation Model, correlation, interpolation, uncertainty

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Sindayihebura2006accuracy.pdf (464.52 KB)

Conversion between the vector and raster data structures using Fuzzy Geographical Entities

Cidália Fonte
Department of Mathematics
Faculty of Sciences and Technology
University of Coimbra, Apartado 3008, 3001 – 454 Coimbra, Portugal
Tel.: + 351 239 791150; Fax: + 351 239 832568
cfonte@mat.uc.pt

Abstract
The conversion of geographical entities between the raster and vector data structures introduces errors in the entities' positions. The vector-to-raster conversion results in a loss of information since, when a Boolean classification of each pixel is used, the shape of the entities must follow the shape of the pixels; the information about the position of the entities in the vector data structure is therefore lost in the conversion. This loss can be minimized if, instead of a Boolean classification of the pixels, a fuzzy classification is performed, building Fuzzy Geographical Entities. These entities keep, for each pixel, the information about the fraction of the pixel area that lay inside the vector entities, so only the information about the position of the entity boundaries inside the pixels is lost. The Fuzzy Geographical Entities obtained through this conversion may be converted back to the vector data structure. An algorithm was developed to reconstruct the boundaries of the vector geographical entities using the information stored in the raster Fuzzy Geographical Entities. Since the grades of membership represent partial membership of the pixels to the entities, this information is valuable for reconstructing the entity boundaries in the vector data structure, generating boundaries of the resulting vector entities that are as close as possible to their original position. Even though slight changes in the entities' shape and position are expected after the re-conversion to the vector data structure, when Fuzzy Geographical Entities are used the entities' areas are always preserved during successive conversions between both structures. Moreover, if the conversion and re-conversion methods are successively applied with the same pixel size, origin and orientation, the results are always identical; that is, the positional errors introduced will not propagate indefinitely.
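
The membership grades described above can be illustrated with a toy conversion. The sketch below uses a hypothetical axis-aligned rectangle so that the covered fraction of each pixel can be computed exactly (real entities would require general polygon clipping), and checks the area-preservation property:

```python
# Vector-to-raster conversion with fuzzy membership: each pixel stores
# the fraction of its area covered by the entity. The entity here is a
# hypothetical axis-aligned rectangle for which the fraction is exact.
def overlap(a0, a1, b0, b1):
    """Length of the overlap between intervals [a0,a1] and [b0,b1]."""
    return max(0.0, min(a1, b1) - max(a0, b0))

pixel = 1.0                      # pixel size
rect = (0.3, 0.4, 2.7, 1.9)      # entity: xmin, ymin, xmax, ymax

grid = {}
for i in range(3):               # grid covers the entity's bounding box
    for j in range(2):
        fx = overlap(i * pixel, (i + 1) * pixel, rect[0], rect[2])
        fy = overlap(j * pixel, (j + 1) * pixel, rect[1], rect[3])
        grid[(i, j)] = fx * fy / (pixel * pixel)   # membership grade

raster_area = sum(grid.values()) * pixel * pixel
vector_area = (rect[2] - rect[0]) * (rect[3] - rect[1])
print(raster_area, vector_area)   # the two areas agree
```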

Keywords: vector to raster conversion, raster to vector conversion, fuzzy geographical entities, positional error, error propagation

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Fonte2006accuracy.pdf (740.61 KB)

Dealing with Multiple Accuracy Levels in Spatial Databases with Continuous Update

Alberto Belussi 1, Maria Antonia Brovelli 2, Mauro Negri 3, Giuseppe Pelagatti 3 and Fernando Sansò 2
1 Dipartimento di Informatica, Università degli studi di Verona
Strada Le Grazie, 15, 37100 Verona, Italy
Fax: + 0039 045 802 7xxx
alberto.belussi@univr.it
2 Dipartimento di Ingegneria Idraulica, Ambientale, Infrastrutture Viarie, Rilevamento, 
Politecnico di Milano – Polo Regionale di Como
Via Valleggio 11, 22100 Como, Italy
Fax: + 0039 031 3327519
maria.brovelli@polimi.it, fernando.sanso@polimi.it
3 Dipartimento di Elettronica e Informazione, Politecnico di Milano 
Politecnico di Milano
Via Ponzio, 34/5, 20100 Milano, Italy
Fax: + 004 555 874 414

Abstract
It is widely recognized that most spatial databases should be continuously updated, and that this goal can be reached only through integration with the information systems that they serve, since most of the information required to perform the updates must be obtained from the administrative processes of the institutions which control the territory. However, since the different update processes collect spatial data with different accuracies, one of the main problems in dealing with continuous update is the need to manage data having different levels of accuracy, even at the instance level. Moreover, in this scenario a correct update process should address the two following goals: (i) to use updates having a high level of accuracy in order to increase the overall accuracy level of the spatial database; (ii) to maintain the correct representation of all the qualitative spatial properties of the information (topological properties, like containment and adjacency, but also other geometric properties, like parallelism and shape types). It can be shown that these two goals sometimes conflict in a way that is difficult to overcome. The main contribution of the paper is to propose an approach to performing an update that overcomes this conflict and achieves both goals. In particular, one of the techniques in data handling that could reasonably cope with this endeavour is a Bayesian approach to updating. Some elementary examples will clarify the benefits of this approach.
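
The Bayesian flavour of such an update can be illustrated in one dimension: combining a stored coordinate of low accuracy with a new, more accurate observation as a precision-weighted mean. This is a generic Gaussian sketch with invented numbers, not the authors' formulation:

```python
# Bayesian update of a coordinate held with accuracy sigma_db by a new
# observation with accuracy sigma_obs (both treated as Gaussian): the
# posterior mean is the precision-weighted mean, and the posterior
# standard deviation is smaller than either input accuracy.
def bayes_update(x_db, sigma_db, x_obs, sigma_obs):
    w_db, w_obs = 1 / sigma_db**2, 1 / sigma_obs**2   # precisions
    x_post = (w_db * x_db + w_obs * x_obs) / (w_db + w_obs)
    sigma_post = (w_db + w_obs) ** -0.5
    return x_post, sigma_post

# Database coordinate accurate to 2 m; new survey accurate to 0.2 m.
x, s = bayes_update(100.0, 2.0, 101.0, 0.2)
print(x, s)   # the result is dominated by the accurate observation
```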

Keywords: Bayesian approach, spatial databases, multiple accuracy levels, continuous update

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Belussi2006accuracy.pdf (351.49 KB)

Deriving thematic uncertainty measures in remote sensing using classification outputs

Kyle M. Brown 1, Giles M. Foody 2 and Peter M. Atkinson 2
1 Environment Agency, Science Group - Technology, 
Lower Bristol Road, Bath, UK BA2 9ES, UK, 
kyle.brown@environment-agency.gov.uk
2 School of Geography, University of Southampton, 
Highfield, Southampton SO17 1BJ, UK, 
G.M.Foody@soton.ac.uk ; P.M.Atkinson@soton.ac.uk

Abstract
The process of estimating thematic errors in the classification of remotely sensed imagery generally involves the use of the confusion matrix. Though measures of overall and per-class accuracy may be derived from the confusion matrix, it gives no information on where thematic error occurs. However, spatial variation in thematic error can be a key variable in determining errors when overlay operations such as change detection are carried out. One method of indicating thematic error on a per-pixel basis is to use the outputs of a classifier to estimate the uncertainty associated with the allocation of a particular class to the pixel. This probabilistic approach has been used previously, but studies have generally used a single classifier, so comparisons of the relative accuracy of classifiers for deriving thematic uncertainty measures have not been made. Also, the effects of classifier training on the estimation of thematic uncertainty have not yet been examined. This paper compares three classification methods for estimating thematic uncertainty for a sand dune test site at Ainsdale, Southport, UK, using data acquired by the Compact Airborne Spectrographic Imager. The classifiers used were the maximum likelihood (ML) classifier, the multi-layer perceptron (MLP) neural network and the probabilistic neural network (PNN). The paper also examines the effect of varying the training of the neural network classifiers on the estimation of thematic uncertainty, by altering the number of iterations and the architecture of the MLP and the smoothing function of the PNN. The MLP and PNN with the largest proportion of correctly allocated cases, Po, had larger overall accuracies than the ML (ML Po = 0.774; MLP Po = 0.827; PNN Po = 0.827). A significant (at 99.999% confidence) one-to-one relationship between predicted and actual thematic uncertainty was found for all classifiers, indicating that all the classifiers tested were able to estimate thematic uncertainty.
The MLP and PNN estimated thematic uncertainty with similar accuracy, but the ML was less accurate than both. However, the MLP and PNN with the largest overall accuracy were not the ones that estimated thematic uncertainty most accurately. The number of iterations used in training the MLP was significantly correlated with the accuracy of the thematic uncertainty estimates (adjusted r² = 0.324, p = 0.046). The smoothing function of the PNN clearly influenced the overall accuracy and the accuracy of thematic uncertainty estimation, though no significant correlation was found when linear, log-linear and polynomial regressions were applied. The results of the study are discussed in terms of selecting the most suitable classifier for mapping land cover or predicting thematic error, as the most appropriate classifier for each task may differ.
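
One common per-pixel uncertainty measure derivable from classifier outputs is one minus the maximum class membership probability; whether this is exactly the measure used in the paper is not stated in the abstract. A minimal sketch with invented posterior vectors:

```python
# Per-pixel thematic uncertainty from classifier outputs: the pixel is
# allocated to the class with the highest posterior probability, and
# 1 - max(posterior) serves as the uncertainty of that allocation.
# The posterior vectors below are invented for illustration.
pixels = {
    "confident": [0.92, 0.05, 0.03],
    "ambiguous": [0.40, 0.35, 0.25],
}

for name, posteriors in pixels.items():
    label = posteriors.index(max(posteriors))   # hard class allocation
    uncertainty = 1.0 - max(posteriors)         # per-pixel uncertainty
    print(f"{name}: class {label}, uncertainty {uncertainty:.2f}")
```

Mapping this uncertainty for every pixel gives the spatial picture of thematic error that a confusion matrix alone cannot provide.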

Keywords: thematic uncertainty, classification, maximum likelihood classifier, multi-layer perceptron, probabilistic neural network

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Brown2006accuracy.pdf (1.3 MB)

Determining the effects of different scanners and scanning resolutions on orientation errors in the production of orthophotos

Günay Çakır, Sedat Keleş, Fatih Sivrikaya, Emin Zeki Başkent and Selahattin Köse
Karadeniz Technical University, Faculty of Forestry
61080, Trabzon, Turkey
Tel: +90 462 377 37 34, Fax: +90 462 325 74 99
gcakir@ktu.edu.tr; fatihs@ktu.edu.tr; skeles@ktu.edu.tr; baskent@ktu.edu.tr; skose@ktu.edu.tr

Abstract
An orthophoto is a geo-referenced image produced from aerial photos. It has the geometric properties of a map and the qualities of a photograph, representing all terrain features. Because of this versatility, it can be used in various areas such as forest management, ecosystem management, environmental planning and social science. The quality of the final orthophoto depends mainly on several major factors: the accuracy of the Digital Surface Model (DSM), the clarity of the air photos, scanning resolution and quality, Ground Control Point (GCP) accuracy, the camera model and mosaicking. The objective of this study is to examine the effects of different scanners and scanning resolutions on spatial accuracy in the conversion of analog aerial photos to orthophotos. We selected two different scanners, a normal desktop scanner and a photogrammetric scanner. 23 × 23 cm diapositives and 23 × 23 cm analog prints of the air photos were scanned with the desktop scanner at a resolution of 600 dpi; in addition, the 23 × 23 cm diapositives were scanned with the photogrammetric scanner at a resolution of 1200 dpi. Orthophotos were created with ERDAS IMAGINE OrthoBASE software; a DEM was also used to produce the orthophoto images. The results show that the orientation errors of digital orthophoto generation for the diapositives and analog prints scanned with the desktop scanner are 2–3 pixels and 10–11 pixels, respectively, whereas the errors for the diapositives scanned with the photogrammetric scanner are 0.1–0.3 pixels.
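
The link between scanning resolution and ground detail can be made concrete: the pixel footprint on the ground is the pixel size on film multiplied by the photo scale. The 1:16,000 scale below is an assumed example, since the abstract does not state the photo scale:

```python
# Ground sample distance (GSD) implied by a scanning resolution:
# pixel size on film (25.4 mm per inch divided by dpi) times the
# photo scale denominator. The 1:16,000 scale is an assumed example.
MM_PER_INCH = 25.4

def gsd_metres(dpi, scale_denominator):
    pixel_on_film_mm = MM_PER_INCH / dpi
    return pixel_on_film_mm * scale_denominator / 1000.0

for dpi in (600, 1200):
    print(f"{dpi} dpi -> GSD {gsd_metres(dpi, 16000):.2f} m")
```

Doubling the scanning resolution halves the ground pixel footprint, which is one reason the pixel-denominated orientation errors of the two scanners are not directly comparable.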

Keywords: orthophoto, accuracy, digital image processing, forest management

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Cakir2006accuracy.pdf (1.11 MB)

Development of a decision framework to identify appropriate spatial and temporal scales for modeling N flows

Egon Dumont 1, Carolien Kroeze 1, Evert Jan Bakker 2, Alfred Stein 2 and Lex Bouwman 3
1 Environmental Systems Analyses Group, Wageningen University 
Ritzema Bosweg 32a, Wageningen, The Netherlands
Tel.: + 31 317 484812; Fax: + 31 317 484839
egon.dumont@wur.nl; carolien.kroeze@wur.nl
2 Mathematical and Statistical Methods Group
Wageningen University, Bornsesteeg 47, Wageningen, The Netherlands
Tel.: + 31 317 484085; Fax: + 31 317 483554
alfred.stein@wur.nl
3 Netherlands Environmental Assessment Agency
A. van Leeuwenhoeklaan 9, Bilthoven, The Netherlands
Tel.: + 31 30 2742745; Fax: + 31 30 2744479
lex.bouwman@mnp.nl

Abstract
Decision makers need to know the effects of alternative watershed management strategies on the export of nitrogen (N) to coastal waters. They use this knowledge to reduce undesirable effects of excess N in coastal waters. Various models exist to predict the effects of watershed management on the export of N. This paper describes a decision framework to identify the appropriate spatial and temporal scales for using N flux models. The framework is developed for existing models that predict N export from large watersheds and the contribution of N sources and N sinks to this export. With this framework, modelers can identify the appropriate scale for model predictions and for independently scalable model parts. The framework bases the appropriateness of model scales on indicators, which are to be specified by the modeler and which are associated with four criteria. The four criteria require modeling scales to correspond with (A) data, mitigation options, and scenarios, (B) model assumptions, (C) available resources for modeling, and (D) requirements of prediction users. A successful application of the framework is illustrated for a global model of dissolved inorganic nitrogen export from watersheds to coastal waters. Ranges of appropriate scales are determined for the model output and for five independently scalable model parts, which model (1) the surface N balance, (2) point sources, (3) N flux in sediments and small streams, (4) retention in dammed reservoirs, and (5) riverine retention. Appropriate model scales were found when the four criteria were not set too strictly. We conclude that the decision framework can contribute substantially to selecting appropriate modeling scales in a balanced and comprehensive manner.

Keywords: nitrogen cycling, surface water quality, decision support, models, scale

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Dumont2006accuracy.pdf (605.48 KB)

Digital Terrain Model generation using airborne LiDAR in a forested area of Galicia, Spain

Luis Gonçalves-Seco, David Miranda, Rafael Crecente and Javier Farto
Land Laboratory – Department of Agroforestry Engineering
University of Santiago de Compostela – Campus universitario s/n, Lugo 27002, SPAIN
Tel.: + 34 982252231 Fax: + 34982285926
lgseco@usc.es; dmiranda@usc.es; rcrecente@usc.es; jfarto@usc.es

Abstract
Information on the shape and relief of the Earth’s surface is essential for improving land management practices that promote more sustainable development. The need for such information is even greater in regions with rough topography and a high percentage of woodland cover. In the last few years, Airborne Laser Scanning (ALS) has demonstrated that laser altimetry is a reliable technology for deriving accurate Digital Terrain Models (DTMs). This paper presents a method for filtering LiDAR data, based on mathematical morphology, that uses point cloud data from both the first and last returns to discriminate terrain points and to segment the objects in forested areas into low and high vegetation. A pilot project was conducted in a mountainous area of 4 km2 covered by Eucalyptus globulus plantations. In the study area, four zones were differentiated according to land use to allow for better presentation and interpretation of the results. To validate the results, more than 40 control plots were distributed over the study area. In general, the results obtained were better than expected, considering the hilly nature of the study area, often covered by dense shrub layers. RMSE values in the range 0.12 m – 0.27 m were obtained for the different zones studied, which shows the suitability of the method for this type of data and this area. The inclusion of the first and last returns enabled an average increase of 27% in the number of terrain points, and guaranteed a final point density of 2 points/m2 before interpolation.
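
Morphological filtering of the kind mentioned above can be sketched in one dimension: an opening (erosion followed by dilation) flattens narrow positive spikes such as vegetation returns, and returns close to the opened surface are kept as terrain. A toy version with invented elevations (the paper's actual filter operates on 2-D point clouds with progressive window sizes):

```python
# 1-D morphological opening as a toy LiDAR ground filter: narrow
# "vegetation" spikes are removed by erosion+dilation, and returns
# near the opened surface are classified as terrain.
def erode(z, w):
    half = w // 2
    return [min(z[max(0, i - half):i + half + 1]) for i in range(len(z))]

def dilate(z, w):
    half = w // 2
    return [max(z[max(0, i - half):i + half + 1]) for i in range(len(z))]

profile = [10.0, 10.2, 14.8, 15.1, 10.4, 10.5, 12.9, 10.7]  # metres
opened = dilate(erode(profile, 3), 3)
ground = [abs(p - o) < 0.5 for p, o in zip(profile, opened)]
print(ground)   # True where the return is classified as terrain
```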

Keywords: airborne laser scanning, filtering, DTM, segmentation

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: GoncalvesSeco2006accuracy.pdf (1.38 MB)

Do spatial data consumers really understand data quality information?

Anna T. Boin and Gary J. Hunter
Cooperative Research Centre for Spatial Information
Department of Geomatics
The University of Melbourne
VIC 3010, Australia
Tel.: + 61 3 8344 9200; Fax: + 61 3 9349 5185
a.boin2@pgrad.unimelb.edu.au, garyh@unimelb.edu.au

Abstract
Metadata is an important tool for recording data assets, and in the spatial information industry it is also used to describe the quality of data to consumers. It is text-based information with industry-specific terms; however, there is a general impression in the industry that consumers still do not fully understand its meaning. This research aims to bridge the communication gap between data providers and consumers by designing, developing and testing better ways to communicate spatial quality information. The research intends to develop strategies for presenting quality information that meet the needs of the consumer when deciding whether or not to use a particular dataset. Accordingly, we believe there are three questions to be answered: (1) Can this be achieved without losing information? (2) Is the quality information prescribed in current standards adequate and helpful? and (3) Will data providers be able to conform to the new protocols? Our aim is not to categorize the quality of data as ‘good’ or ‘bad’ (since that is task dependent), but rather to communicate the reliability of the data in a way that allows consumers to make the necessary value judgments about its suitability for their needs.

Keywords: spatial data, consumers, quality information, usefulness

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Boin2006accuracy.pdf (223.51 KB)

Do users ignore spatial data quality?

P.A.J. van Oort, A.K. Bregt and S. de Bruin
Wageningen University, Centre for Geo-Information, P.O. Box 47,  6700 AA Wageningen, The Netherlands 
pepijn.vanoort@wur.nl, arnold.bregt@wur.nl, sytze.debruin@wur.nl

Abstract
Risk analysis (RA) has been proposed as a means of assessing the fitness for use of spatial data. However, the literature suggests that users of spatial data often ignore the potential implications of inherent uncertainties and errors in their decision making. There is very little empirical research on how users cope with issues of spatial data quality. In this paper, we explore a number of hypotheses for why users would be more or less willing to spend resources on RA. The hypotheses were tested using questionnaires distributed to researchers, policy-makers and GIS technicians. Our results suggest that the willingness of users to spend resources on quantifying the implications of spatial data quality depends on the presence of feedback mechanisms in the decision-making process. For example, stakeholder interaction at the start of a project may identify and eliminate errors and, as a result, decision-makers may consider RA unnecessary. Another form of feedback, at the end of the decision, is accountability for the implications of errors and uncertainties; we found a very high willingness to spend resources on RA in the case of such accountability. Apart from feedback mechanisms, we found that this willingness depends on how much is at stake and, to a minor extent, on how well the decision-making process can be modeled.

Keywords: spatial data quality, fitness for use, risk analysis, decision making

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: vanOort2006accuracy.pdf (612.43 KB)

Dynamic visualisation of spatial and spatio-temporal probability distribution functions

Edzer J. Pebesma 1, Derek Karssenberg 1 and Kor de Jong
1 Dept of Physical Geography, Geosciences Faculty
PO Box 80.115, 3508 TC Utrecht, The Netherlands
Tel.: +31 30 2533051; Fax: +31 30 2531145
e.pebesma@geo.uu.nl

Abstract
In this paper we present and demonstrate aguila, a tool for interactive dynamic visual analysis of gridded data that come as spatial or spatio-temporal probability distribution functions. Probability distribution functions are analysed in their cumulative form, and we can choose to visualize exceedance probabilities given a threshold value or, inversely, the quantile values. The threshold value or quantile level can be modified dynamically. In addition, classified probabilities in terms of (1 − α) × 100% (e.g. 95%) confidence or prediction intervals can be visualized for a given threshold value. Different modelling scenarios can be compared by organizing maps in a regular lattice, where individual maps (scenarios) are shown in panels that share a common legend and respond identically to actions like zooming, panning, and identifying (querying) cells. Variability over time is incorporated by showing sets of maps as animated movies. We will demonstrate this tool using sea floor sediment quality predictions under different spatial aggregation scenarios (block sizes), covering the Dutch part of the North Sea. The tool is freely available in binary and source code form; the source code is distributed under the GNU GPL; grid maps are read from disc through the GDAL library, or from memory as, e.g., in an R session.
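
The two complementary views described above, exceedance probability for a given threshold and its inverse, the quantile for a given level, can be sketched for a single grid cell from an ensemble of realisations (the values below are invented):

```python
# Exceedance probability and empirical quantile for one grid cell,
# estimated from an ensemble of realisations; these are the two views
# a cumulative-distribution tool lets the user switch between.
values = sorted([3.1, 4.7, 2.8, 5.9, 4.1, 3.6, 5.2, 4.9, 3.3, 4.4])

def prob_exceed(threshold):
    """P(value > threshold), estimated from the ensemble."""
    return sum(v > threshold for v in values) / len(values)

def quantile(level):
    """Empirical quantile at 0 <= level <= 1 (nearest-rank rule)."""
    idx = min(int(level * len(values)), len(values) - 1)
    return values[idx]

print(prob_exceed(4.5))   # fraction of realisations above 4.5
print(quantile(0.5))      # the median-rank value
```

Dragging a threshold slider corresponds to re-evaluating `prob_exceed` per cell; dragging a quantile slider corresponds to re-evaluating `quantile`.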

Keywords: dynamic graphics, interactive graphics, exploratory data analysis

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Pebesma2006accuracy.pdf (1.04 MB)

Effects of different methods for estimating impervious surface cover on runoff estimation at catchment level

Frank Canters 1, Jarek Chormanski 2, Tim Van de Voorde 1 and Okke Batelaan 2
1 Department of Geography, Vrije Universiteit Brussel
Pleinlaan 2, B-1050 Brussel, Belgium
Tel.: + 0032 2 629 3381; Fax: + 0032 2 6293378
fcanters@vub.ac.be, tvdvoord@vub.ac.be
2 Department of Hydrology and Hydraulic Engineering, Vrije Universiteit Brussel
Pleinlaan 2, B-1050 Brussel, Belgium
Tel.: + 0032 2 629 3039; Fax: + 0032 2 6293022
jchorman@vub.ac.be, batelaan@vub.ac.be

Abstract
One of the most obvious effects of urbanization is an increase in impervious surface cover. Surface imperviousness has an important impact on hydrology, stream water quality and ecology, and is often used as an overall indicator of the health status of urbanized watersheds. It has also been identified as one of the key factors in the occurrence of flash floods. This paper examines the impact of different methods for estimating impervious surface cover on the outcome of a distributed rainfall-runoff model for the upper catchment of the Woluwe River in the southeastern part of Brussels. The study shows that mapping the impervious surface distribution using remotely sensed data produces substantially different estimates of discharge at catchment level than traditional approaches based on expert judgment of average imperviousness for different types of urban land use. Little difference is observed between results obtained with detailed impervious surface maps derived from high resolution satellite data and sub-pixel estimates of imperviousness derived from medium resolution data. This demonstrates that sub-pixel classification may be an interesting alternative to more expensive high resolution mapping of imperviousness for rainfall-runoff modeling at catchment level.

Keywords: impervious surfaces, runoff modeling, remote sensing, sub-pixel classification

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Canters2006accuracy.pdf (1.28 MB)

Enhancement of image-to-image co-registration accuracy using spectral matching methods

Keith Rennolls 1 and Mingliang Wang 1,2,3 
1 School of Computing and Mathematical Sciences, University of Greenwich
London SE10 9LS
k.rennolls@gre.ac.uk
2 Warnell School of Forest Resources, University of Georgia, USA.
mwang@smokey.forestry.uga.edu
3 Chinese Academy of Forestry, Beijing, PRC.

Abstract
Two of the important stages in the production of maps from remotely sensed imagery are rectification and classification. Residual geometric errors following rectification produce a “component” of the apparent classification error. Hence, besides being of value in its own right, reduction of the rectification error will enhance the quality of the classification stage, and the classification accuracy statistics will more closely reflect “pure” classification error rather than classification error due to pixel mismatch. In rectification to ground control points (GCPs) there is a natural limit to what can be achieved, determined by the cost of collecting good GCPs. However, if we are concerned with a sequence of images, primarily for estimating change and growth, the rectification is image-to-image, the process becomes one of pixel matching, and the only cost is processing time. We present a spectrally based pixel-matching algorithm which appears to offer considerable scope for very accurate pixel-to-pixel, and hence image-to-image, matching. The algorithm maximizes local correlations in each spectral band at each co-registration point: a multivariate anisotropic spatial auto-correlation approach. The results are demonstrated and validated in a case study of a 1987 Landsat 5 TM image, a 1997 Landsat 5 TM image and a 2000 Landsat 7 ETM+ image, all of the same forested region in China.
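
The core of such pixel matching can be sketched as a correlation search: slide a template from one image over candidate offsets in the other and keep the offset with the highest correlation. A 1-D, single-band toy with invented values (the paper's algorithm is multivariate and anisotropic over 2-D imagery):

```python
# Offset estimation for image-to-image co-registration: pick the
# candidate offset whose local correlation with the template is
# maximal. 1-D toy with a single spectral "band"; values are invented.
def correlation(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

reference = [10, 12, 30, 55, 31, 12, 11, 10, 9, 10]
shifted   = [11, 10, 12, 29, 54, 32, 13, 10, 9, 11]  # same feature, +1 shift

template = reference[2:6]            # window around the bright feature
best = max(range(-2, 3),
           key=lambda d: correlation(template, shifted[2 + d:6 + d]))
print("estimated offset:", best)     # recovers the simulated +1 shift
```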

Keywords: image-to-image, pixel-to-pixel, co-registration, spectral-correlation-search, sub-pixel accuracy

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Rennolls2006accuracy.pdf (1.2 MB)

Error propagation in groundwater pesticide vulnerability modelling

Paulette Posen 1, Andrew Lovett 1, Kevin Hiscock 1, Brian Reid 1, Sarah Evers 2 and Rob Ward 2
1 School of Environmental Sciences
University of East Anglia, Norwich NR4 7TJ, UK
Tel.: + 44 1603 592547; Fax: + 44 1603 591327
p.posen@uea.ac.uk
2 Environment Agency
Olton Court, 10 Warwick Road, Olton, Solihull B92 7HX, UK

Abstract
Although environmental modelling is increasingly performed within a GIS framework, analysis of the associated error is far from routine, and rarely presented with the results. An important benefit of performing error analysis is its value in determining which elements of a vulnerability assessment framework need improving. With this in mind, it was decided to examine the extent to which error might propagate through a model of groundwater vulnerability to pesticide contamination. A pesticide leaching model was developed and incorporated into an assessment of groundwater contamination risk from normal agricultural use of the herbicide isoproturon, in a 30 km x 37 km area of river catchment to the north-west of London, England. The model, which comprised two main components accounting for (i) degradation and (ii) attenuation of the pesticide, was based on conventional contaminant transport calculations, combined with existing soil, rainfall, hydrogeological and depth to water table data. The results of an error analysis on the model were used to assign confidence limits to the resulting risk maps. In this instance, correlation of model variables led to a reduction of error in the final output. However, the results of the analysis showed how inclusion of low quality input data can lead to a large increase in output uncertainty. It is suggested that error propagation analysis should be routinely included in groundwater vulnerability assessment.
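The effect of input correlation on propagated error can be sketched with a Monte Carlo run through a toy two-component (degradation times attenuation) model. All parameter values here are hypothetical stand-ins, not the paper's calibrated model:

```python
import numpy as np

def mc_propagate(n=20000, rho=0.0, seed=42):
    """Monte Carlo error propagation through a toy leaching model:
    returns the spread of the output as a function of the correlation
    `rho` between the two uncertain inputs."""
    rng = np.random.default_rng(seed)
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    half_life = 20.0 * np.exp(0.2 * z[:, 0])   # degradation input (days, hypothetical)
    atten = 0.5 * np.exp(0.1 * z[:, 1])        # attenuation input (hypothetical)
    # residual pesticide fraction after 30 days of first-order decay
    residual = np.exp(-np.log(2) * 30.0 / half_life) * atten
    return residual.std()
```

Because the output is monotone in both inputs, a negative input correlation shrinks the output spread, mirroring the abstract's finding that correlation of model variables reduced the error in the final output.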

Keywords: error propagation, GIS, groundwater, pesticide, leaching model

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Posen2006accuracy.pdf1.18 MB

Estimating error and uncertainty in change detection analyses of historical aerial photographs

Joanne N. Halls 1 and Lindsey Kraatz 2
1 University of North Carolina Wilmington
Department of Earth sciences
601 S. College Road, NC 28403, USA
Tel.: + 001 910 962 7614; Fax: + 001 910 962 7077
hallsj@uncw.edu
2 Virginia Institute of Marine Science
Rt. 1208 Greate Rd., Gloucester Point, Virginia, 23062
lindsey@vims.edu

Abstract
By gathering, rectifying, interpreting, and digitizing historical aerial photography (from 1938 to 1998) we computed the rate of change of back-barrier land cover types and used GIS spatial analysis tools to compute the degree of fragmentation of marshes through time and space. To quantify the significance of this historical change, a series of tests was designed and conducted to describe the amount of spatial variability and accuracy of the rectified photographs, the digitized polygons, and the quantification of change. First, a digitizing accuracy assessment was conducted in which randomly chosen locations were identified on the aerial photographs and compared with the digitized data. The accuracy assessment resulted in greater than 80 percent accuracy, which is acceptable. Second, the digitized polygons were tested for degree of crenulation, or curviness, and line generalization tests were conducted, which indicated that the interpretation of the photographs was not a factor in the results. Third, we incorporated a fuzziness test (using derived epsilon bands) into the GIS data to identify and quantify true changes in the marsh habitats versus positional changes, or sliver polygons. Results indicate that rectification of aerial photography (even with an RMS error of less than 1), interpretation, and digitizing can lead to erroneous results; however, by using fuzziness techniques we can minimize the errors and predict which areas are changing through time.
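The epsilon-band idea reduces to a tolerance test: a boundary shift counts as real change only if it exceeds the combined positional uncertainty of the two photo epochs. The quadrature combination and the 1.96 factor below are common conventions, assumed here rather than taken from the paper:

```python
import math

def combined_epsilon(rmse_a, rmse_b, k=1.96):
    """Epsilon band width when the positional uncertainties of two
    photo epochs combine in quadrature (a common assumption)."""
    return k * math.sqrt(rmse_a ** 2 + rmse_b ** 2)

def true_change(displacement_m, epsilon_m):
    """Flag a boundary shift as real marsh change only if it exceeds
    the epsilon band; smaller shifts are treated as sliver polygons."""
    return displacement_m > epsilon_m
```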

Keywords: change detection, aerial photography, digitizing accuracy

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Halls2006accuracy.pdf1.14 MB

Expanding the conceptual, mathematical and practical methods for map comparison

Robert Gilmore Pontius Jr and John Connors
Clark University
School of Geography
950 Main Street
Tel.: + 001 508 793 7761; Fax: + 001 508 793 8881
rpontius@clarku.edu; jconnors@clarku.edu

Abstract

Conventional methods of map comparison frequently produce unhelpful results for a variety of reasons. In particular, conventional methods usually analyze pixels at a single default scale and frequently insist that each pixel belongs to exactly one category. The purpose of this paper is to offer improved methods so that scientists can obtain more helpful results by performing multiple resolution analysis on pixels that belong simultaneously to several categories. This paper examines the fundamentals of map comparison beginning from the elementary comparison between two pixels that have partial membership to multiple categories. We examine the conceptual foundation of three methods to create a crosstabulation matrix for a single pair of pixels, and then show how to extend those concepts to compare entire maps at multiple spatial resolutions. This approach is important because the crosstabulation matrix is the basis for numerous popular measurements of spatial accuracy. The three methods show the range of possibilities for constructing a crosstabulation matrix based on possible variations in the spatial arrangement of the categories within a single pixel. A smaller range in the possible spatial distribution of categories within the pixel corresponds to more certainty in the crosstabulation matrix. The quantity of each category within each pixel constrains the range for possible arrangements in subpixel mapping, since there is more certainty for pixels that are dominated by a single category. In this respect, the proposed approach is placed in the context of a philosophy of map comparison that focuses on two separable components of information in a map: 1) information concerning the proportional distribution of the quantity of categories and 2) information concerning the spatial distribution of the location of categories.
The methods apply to cases where a scientist needs to compare two maps that show categories even when the categories in one map are different from the categories in the other map. We offer a fourth method that is designed for the common case where a scientist needs to compare two maps that show the same set of categories. Results show that the methods can produce extremely different measurements, and that it is possible to interpret the differences at multiple resolutions in a manner that reveals patterns in the maps. The method is designed to present the results graphically in order to facilitate communication. We describe the concepts using simplified examples, and then apply the methods to characterize the change in land cover between 1971 and 1999 in Massachusetts, USA.
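For a single pair of soft-classified pixels, one possible crosstabulation rule can be sketched as follows: agreement on the diagonal is the minimum membership per category, and the leftover memberships are allocated proportionally off the diagonal. This is one common choice within the range of methods the paper discusses, shown as an illustration rather than the authors' exact construction:

```python
import numpy as np

def crosstab_min(p, q):
    """Crosstabulation matrix for one pair of soft pixels with
    membership vectors p and q (each summing to 1): diagonal
    agreement = min membership, remainder allocated proportionally."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = np.zeros((len(p), len(p)))
    diag = np.minimum(p, q)
    np.fill_diagonal(m, diag)
    rp, rq = p - diag, q - diag          # leftover memberships
    total = rp.sum()
    if total > 1e-12:
        m += np.outer(rp, rq) / total     # proportional off-diagonal allocation
    return m
```

By construction the row sums reproduce the memberships of the first map and the column sums those of the second, which is what makes the matrix a basis for the usual accuracy measures.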

Keywords: accuracy, fuzzy, error, matrix, uncertainty

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Pontius2006accuracy.pdf411.91 KB

Expressing attribute uncertainty in spatial data using blinking regions

Julian Kardos 1, Antoni Moore 2 and George Benwell 2
1 Fenomenal Media
PO Box 3100
Ohope, New Zealand
Tel.: + 64 21 763 725; 
media@fenomenal.co.nz
2 Spatial Information Research Centre
Department of Information Science
University of Otago, PO Box 56, Dunedin, New Zealand
Tel.: + 64 3 479 8138; Fax: + 64 3 479 8311
amoore@infoscience.otago.ac.nz

Abstract
This paper defines a new method of attribute uncertainty representation, the blinking region. It specifically communicates uncertainty in values associated with choropleth map regions. In this way, error and attribute (census) data can be made to alternate in the user’s view, communicating both datasets. The blinking region uncertainty visualization was tested via a web-based survey. Results show that it outperforms most current techniques for visualising attribute uncertainty (e.g. image sharpness) in terms of visual appeal, speed of comprehension and overall effectiveness. Theoretically, the regions with the most attribute error could flicker at the fastest rate, with the speed of blinking being proportional to the error.
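The proportionality between attribute error and blink rate could be realised with a simple linear mapping. The interval bounds below are illustrative assumptions; the paper only states that blink speed could be made proportional to error:

```python
def blink_interval_ms(error, max_error, fastest_ms=200, slowest_ms=2000):
    """Map attribute error to a blink period: larger error gives a
    shorter period, i.e. faster blinking (linear, clamped to [0, 1])."""
    f = max(0.0, min(1.0, error / max_error))
    return slowest_ms - f * (slowest_ms - fastest_ms)
```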

Keywords: geovisualisation, dynamic variables, uncertainty

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Kardos2006accuracy.pdf753.5 KB

Fitness for Use - to Support Military Decision Making

Edward J. Wright
Information Extraction and Transport, Inc.
1911 N. Ft. Myer Dr., Suite 600
Arlington, VA, USA, 22209
ewright@iet.com

Abstract
“Determining fitness for use is solely the user's responsibility.” These, or similar words, are common in the licenses or disclaimers associated with most government or commercial producers of spatial data. Unfortunately, the user, who is often not an expert on spatial data or spatial data quality, has no tools or methodology for accomplishing this fitness for use determination. This paper presents results of research to develop technology and a methodology that allow users to evaluate how uncertainty in the spatial data translates into risk in operational decisions of interest to the user. A prototype application based on this research has been integrated with a commercially available GIS software package. The approach starts with the user definition of the spatial data to be evaluated and the decision process that the data supports. Although we know there is uncertainty in the data, it is initially declared to be “truth” data, and is used to make a baseline decision. Next, we have developed error models that represent the data quality attributes of the spatial data layers for the features of interest to the user. The error models handle discrete and continuous data, and include spatial correlation. Additional models define the relationships between geographic features, and the influence of geographic features on parameters of the error models. We have implemented algorithms to generate simulated spatial data that varies from the assumed ground “truth” data in a way that is consistent with the relationship and error models. The simulated data is used as input to a simulated decision process; the result of the decision is then implemented on the “truth” data and compared to the baseline decision. The differences in results are in an operational dimension that the user understands. Because this is a stochastic process, the process is automated and repeated a large number of times.
Graphical displays allow the user to assess the impact of the data uncertainty on operational decisions. This paper describes error models and relationship models, and an example of an experiment to evaluate the “fitness for use” for a military decision support product. While motivated by military applications, the research provides a tool which implements a methodology applicable to any user who must determine fitness for use of a spatial dataset.
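The simulate-decide-compare loop can be sketched in a few lines. The trafficability threshold and the Gaussian error model below are hypothetical stand-ins for the paper's error and relationship models:

```python
import numpy as np

def decide(slope_deg):
    # hypothetical decision rule: terrain is trafficable below 15 degrees
    return slope_deg < 15.0

def decision_risk(true_slope, sigma, n=5000, seed=7):
    """Fraction of simulated datasets whose decision differs from the
    baseline decision made on the assumed-'truth' data."""
    baseline = decide(true_slope)
    draws = true_slope + np.random.default_rng(seed).normal(0.0, sigma, n)
    return float(np.mean([decide(s) != baseline for s in draws]))
```

Data far from the decision threshold yields near-zero risk however uncertain it is, while data near the threshold is highly sensitive, which is exactly the operational reading of "fitness for use".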

Keywords: decision making, uncertainty, risk, probabilistic models, simulation

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Wright2006accuracy.pdf728.81 KB

Hydraulic and event knowledge to reduce the positional uncertainty in SAR flood images for improved flood model calibration and development

Guy Schumann 1,2, Andrew Black 2,Mark Cutler 2, Jean-Baptiste Henry 3, Lucien Hoffmann 1, Patrick Matgen 1 and Laurent Pfister 1
1 Department of Environment and Agro-Biotechnologies, Public Research Centre-Gabriel Lippmann
41, rue du Brill, L-4422, Belvaux, Luxembourg
Tel: +352 47 02 61 417
schumann@lippmann.lu
2 Department of Geography, Dundee University
Nethergate Dundee, DD14HN, UK
3 Service Régional de Traitement d’Images et de Télédétection 
BP 10413, F-67412 Illkirch, Strasbourg, France

Abstract
In an attempt to reduce the positional uncertainty in SAR images depicting floods and lakes, and thereby improve the spatial accuracy of SAR-derived maps, this paper presents a Monte Carlo (MC) simulation-like image-shifting procedure. Assuming a horizontal water surface on river cross sections, the procedure shifts the X and Y coordinates of a SAR image until an acceptable minimum absolute error in elevation between the left and right flood extent levels is found. Independent Check Points (ICPs) indicating flood positions at bridges and elevated roads and railways are integrated with the MC-based procedure to provide direct validation of the image shifts. The proposed method, which requires hydraulic as well as event knowledge, greatly improves the positional accuracy of SAR-derived flood maps and thus presents an inviting alternative for adjusting the inaccurate initial geo-referencing often encountered with SAR images. The importance of the proposed method is illustrated by applying it to a flood inundation model calibration procedure using a SAR-derived flood extent.
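The shift search can be sketched on a toy valley DEM, with an exhaustive grid search standing in for the paper's MC-style sampling. The geometry and function name are invented for illustration:

```python
import numpy as np

def best_image_shift(dem, left_px, right_px, shifts=range(-2, 3)):
    """Search for the (dy, dx) image shift minimising the mean absolute
    elevation difference between left- and right-bank flood edge pixels,
    i.e. enforcing a horizontal water surface across each cross section."""
    def z(points, dy, dx):
        return np.array([dem[r + dy, c + dx] for r, c in points])
    best, best_err = (0, 0), np.inf
    for dy in shifts:
        for dx in shifts:
            err = np.mean(np.abs(z(left_px, dy, dx) - z(right_px, dy, dx)))
            if err < best_err:
                best_err, best = err, (dy, dx)
    return best, best_err
```

On a symmetric valley, a flood extent digitised one pixel too far downslope on both banks is corrected by the shift that brings the two edge elevations back into agreement.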

Keywords: SAR flood imagery, geometric correction, Monte Carlo Simulation, flood inundation model calibration, positional uncertainty

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Schumann2006accuracy.pdf730.38 KB

IGeoE: Positional quality control in the 1/25000 cartography

António Jaime Gago Afonso 1, Rui Alberto Ferreira Coelho Dias 2 and Rui Francisco da Silva Teodoro 3
1 Data Acquisition Department, Army Geographic Institute, Lisbon, Portugal, 
Av. Dr. Alfredo Bensaúde 1849-014 Lisboa, 
afonso@igeoe.pt
2 Data Acquisition Department, Army Geographic Institute, Lisbon, Portugal, 
Av. Dr. Alfredo Bensaúde 1849-014 Lisboa, 
ruidias@igeoe.pt
3 Data Acquisition Department, Army Geographic Institute, Lisbon, Portugal, 
Av. Dr. Alfredo Bensaúde 1849-014 Lisboa, 
rteodoro@igeoe.pt

Abstract
Aware of the importance of knowing the accuracy of the geographic information contained in its digital cartographic products, the Army Geographic Institute (IGeoE) has been investing human and material resources in developing work methodologies to evaluate the positional accuracy of that information. The IGeoE uses as its reference the NATO Standardization Agreement (STANAG) 2215 of 15 October 2001, entitled Evaluation of Land Maps, Aeronautical Charts and Digital Topographic Data, which describes a methodology for acquiring the planimetric and altimetric coordinates of a sample of 167 diagnostic points for an area produced at a given scale. Knowing from experience the difficulty of finding good reference points for quality control, the IGeoE developed a work methodology that makes such studies possible. This methodology allows the areas of land to be searched selectively so as to cover a significant number of the details contained in the cartography; given the huge number of details in a sheet of the 1/25000 military map, it is not possible to check them all. In conclusion, this article briefly describes the main phases of the quality control work carried out on IGeoE cartography.
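The check-point computation underlying such control can be sketched as follows. The 2.146 factor for a 90% circular error is the standard convention for near-circular, zero-mean errors; the exact STANAG 2215 formulae remain the authoritative source:

```python
import math

def planimetric_accuracy(errors_xy):
    """RMSE in x and y from check-point discrepancies (map minus
    reference), plus a 90% circular accuracy figure, CMAS-style:
    2.146 * sigma_c with sigma_c = (sx + sy) / 2."""
    n = len(errors_xy)
    sx = math.sqrt(sum(dx * dx for dx, _ in errors_xy) / n)
    sy = math.sqrt(sum(dy * dy for _, dy in errors_xy) / n)
    sigma_c = 0.5 * (sx + sy)
    return sx, sy, 2.146 * sigma_c
```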

Keywords: accuracy, methodology, quality control

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Afonso2006accuracy.pdf608.6 KB

IGeoE: Positional quality control with different rtk positioning methods

António Jaime Gago Afonso, Rui Alberto Ferreira Coelho Dias, Rui Francisco da Silva Teodoro
Data Acquisition Department, Army Geographic Institute, Lisbon, Portugal, 
Av. Dr. Alfredo Bensaúde 1849-014 Lisboa, 
afonso@igeoe.pt, ruidias@igeoe.pt, rteodoro@igeoe.pt

Abstract
The Army Geographic Institute (IGeoE) uses GPS equipment with the RTK (Real Time Kinematic) positioning technique in several of its topographic surveys. However, there are several associated limitations, related either to human resources or to the accuracy obtained as the GPS rover moves away from the base. To minimize these problems, at the end of 2005 the IGeoE implemented a network of 7 GPS reference stations for RTK use in the Lisbon area. During the first quarter of 2006, several positional quality control tests were made with the objective of evaluating the human resources needed to perform this control and the positional quality of the coordinates obtained by the two positioning methods, Network RTK versus Single Base Station. For Network RTK, the tests included well-known points in the interior, on the bounds and outside the area of the network. This article briefly describes the main phases of the work done on the positional quality control of the positioning methods referred to above.

Keywords: network, GPS, RTK

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
AfonsoDias2006accuracy.pdf826.96 KB

Identification of inhomogeneities in precipitation time series using SUR models and the Ellipse test

Ana Cristina M. Costa 1 and Amílcar Soares 2
1 Instituto Superior de Estatística e Gestão de Informação, Universidade Nova de Lisboa
Campus de Campolide, 1070-312 Lisboa, Portugal
Tel.: + 351 213 870 413; Fax: + 351 213 872 140
ccosta@isegi.unl.pt
2 Instituto Superior Técnico, Universidade Técnica de Lisboa
Av. Rovisco Pais, 1049-001 Lisboa, Portugal
Tel.: + 351 218 417 444; Fax: + 351 218 417 389
ncmrp@alfa.ist.utl.pt

Abstract
The homogenization and analysis of long-term meteorological data sets are currently of unprecedented interest to the scientific community. If the monitoring station network is dense enough, many techniques use data from nearby stations (‘reference’ stations) in order to account for regional climate changes and to isolate the effects of irregularities in a ‘candidate’ station. We propose an extension of the method of cumulative residuals (Ellipse test) that takes into account the contemporaneous relationship between several candidate series from the same climatic area. The proposed technique uses the residuals from a seemingly unrelated regression equations (SUR) model. This procedure (SUR+Ellipse test) was applied to a testing variable, with annual resolution, derived from the daily precipitation data from 27 stations located in the southern region of Portugal. Three well-established statistical tests were also applied: the Standard normal homogeneity test (SNHT) for a single break, the Buishand range test and the Pettitt test. The promising results from this case study indicate that the proposed methodology is a valuable tool for homogeneity testing of climate time series if the station network is dense enough.
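The core of the cumulative-residuals (ellipse) test is a containment check: the cumulative residual path must stay inside an ellipse centred mid-series. The semi-minor axis b comes from the test's confidence-level formula, which is taken as given and passed in here; this sketch is illustrative, not the authors' SUR extension:

```python
import numpy as np

def ellipse_test(residuals, b):
    """Bois-style ellipse test sketch: the cumulative sum of the
    (mean-centred) residuals at step k must satisfy
    (k - n/2)^2 / (n/2)^2 + S_k^2 / b^2 <= 1 for all k."""
    r = np.asarray(residuals, float) - np.mean(residuals)
    s = np.cumsum(r)
    n = len(r)
    k = np.arange(1, n + 1)
    inside = ((k - n / 2) ** 2 / (n / 2) ** 2 + s ** 2 / b ** 2) <= 1.0 + 1e-9
    return bool(np.all(inside)), s
```

A homogeneous series produces small, sign-alternating residuals whose cumulative path stays near zero; an inhomogeneity (a break) makes the cumulative sum drift far enough to leave the ellipse.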

Keywords: homogeneity testing, ellipse test, seemingly unrelated regression equations, precipitation

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Costa2006accuracy.pdf733.33 KB

Improvement of positional accuracy of a landslide database using digital photogrammetry techniques

Tomás Fernández 1, Jorge Delgado 1, Javier Cardenal 1, Clemente Irigaray 2, Rachid El Hamdouni 2 and José Chacón 2
1 Departamento de Ingeniería Cartográfica, Geodésica y Fotogrametría. Universidad de Jaén.
Campus de las Lagunillas s/n, Edificio de Ingeniería y Tecnología. 23071 Jaén, Spain
Tel.: + 0034 53 212 843; + 0034 53 212 454; 0034 53 212 844; Fax: + 0034 53 212 855
tfernan@ujaen.es; jdelgado@ujaen.es; jcardena@ujaen.es
2 Departamento de Ingeniería Civil. Universidad de Granada.
Campus de Fuentenueva, s/n, Edificio Politécnico. 18071 Granada, Spain
Tel.: + 0034 58 212 498; + 0034 58 212 496; + 0034 58 212 499; Fax: + 0034 58 212 499
clemente@ugr.es; rachidej@ugr.es; jchacon@ugr.es

Abstract
In this work, several techniques for building landslide databases are compared: digitisation on orthophotographs (monoplotting), digitisation on aerial photographs followed by geometric correction, transfer to a topographic map and digitisation, and, finally, stereoplotting using digital photogrammetry. The landslide scarp databases derived from the different methodologies were compared using several indices, such as the displacement between significant points of scarps, the lengths of scarps, and the fit of scarps to a DTM. The analysis shows some important discrepancies between the databases, with displacements between 18 and 40 metres depending on the methodologies compared. The best overall results are obtained with digital stereoplotting, whose scarp database fits the DTM well. Among the remaining methodologies, digitisation on orthophotographs shows the smallest differences from stereoplotting, followed by digitisation on the map; in both cases the displacements and other changes show no pattern, which suggests errors of interpretation and digitisation. Digitisation on the photograph gives the worst results, although with a certain spatial pattern related to relief displacement. We therefore recommend digital stereoplotting for building landslide databases, with digitisation on orthophotographs or on a topographic map as possible alternatives, and advise against digitisation on the photograph, at least in zones with strong relief.

Keywords: spatial accuracy, landslides database, digital photogrammetry

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Fernandez2006accuracy.pdf977.54 KB

Improving positional accuracy and preserving topology through spatial data fusion

Sue Hope, Allison Kealy and Gary Hunter
Cooperative Research Centre for Spatial Information
Department of Geomatics, The University of Melbourne, VIC 3010, Australia
Tel.: + 61 3 8344 3176; Fax: + 61 3 9349 5185
s.hope3@pgrad.unimelb.edu.au

Abstract
The spatial information industry is currently moving from an era of digital dataset production to one of dataset maintenance. This is being facilitated by the widespread availability of technologies such as Global Navigation Satellite Systems (GNSS) and high resolution imagery, which enable the rapid collection of large amounts of data with high positional accuracy. Typically, these new data are being integrated with existing datasets that possess different quality characteristics. As this trend continues, new spatial data quality concerns are emerging, in particular regarding the spatial variability of quality. Rigorous software solutions are needed that can handle the integration of data of different accuracies. This is already occurring for land parcel mapping, where least squares adjustment techniques have been developed to integrate high accuracy subdivision data with the existing cadastral database and its inherent geometry. A best fit between the two datasets is determined, together with well-defined local positional accuracy parameters, resulting in incremental improvement to the overall positional accuracy of the cadastral database. This research aims to extend such integration methodology to the fusion of more complex features within vertical vector datasets, whilst maintaining the relevant topological relationships between them. Vertical datasets cover the same regional extent and often include duplicate representations of the same features. There may also be rules governing the spatial relationships between features. It is proposed that the development of rigorous integration algorithms that allow for the inclusion of geometric and topological constraints will enable the successful fusion of vertical spatial datasets. Features will be optimally positioned and well-defined positional accuracy parameters will be generated that, in turn, may be reported as spatial variation in the quality of the resultant dataset. 
This will ultimately result in methods that can be used to improve existing spatial datasets, hence reducing the observed discrepancies as new, higher accuracy data are introduced.
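The scalar core of such variance-weighted adjustment can be sketched as follows: two observations of the same coordinate, each with its own accuracy, are fused by weighted least squares. This is the underlying idea only, without the geometric and topological constraints the paper adds:

```python
def fuse(x1, var1, x2, var2):
    """Weighted least squares fusion of two estimates of one coordinate.
    Weights are inverse variances; the fused variance is never larger
    than that of the better input, i.e. accuracy improves incrementally."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)
    var = 1.0 / (w1 + w2)
    return x, var
```

Fusing an existing cadastral coordinate (large variance) with a new GNSS-derived one (small variance) pulls the result toward the new data while still reporting a well-defined local accuracy for the fused position.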

Keywords: spatial data, positional accuracy, vertical topology, data fusion

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Hope2006accuracy.pdf1.17 MB

Incorporating process knowledge in spatial interpolation of environmental variables

Gerard B.M. Heuvelink
Environmental Sciences Group
Wageningen University and Research Centre
P.O. Box 37
6700 AA Wageningen, The Netherlands
Tel.: + 0031 317 474628; Fax + 0031 317 419000
gerard.heuvelink@wur.nl

Abstract

Ordinary kriging is the most commonly used geostatistical interpolation technique. It predicts the value of a spatially distributed environmental variable at an unobserved location by taking a weighted linear combination of neighbouring observations. Ordinary kriging has proven to be very rewarding to the earth and environmental sciences, but a disadvantage is that it is a purely empirical technique that relies solely on point observations of the target variable. Potentially one ought to be able to do much better by also exploiting all sorts of ancillary information, such as that derived from digital terrain models, remote sensing imagery, and geological, soil and land-use maps. Knowledge about the physical, chemical, biological or socio-economic processes that caused the spatial variation in the environmental variable is potentially also of much value. This paper explores two recent approaches that incorporate ancillary information and process knowledge in spatial interpolation. Regression kriging differs from ordinary kriging by including a trend function that is steered by ancillary information and process knowledge. Space-time Kalman filtering takes a dynamic approach by formulating a state equation that computes the state of the system at the next time point from driving variables and the current state. The Kalman filter conditions the state predictions to measurements in the measurement update step. Regression kriging and space-time Kalman filtering are compared and their application in practice is illustrated with examples. Incorporating process knowledge in spatial interpolation is advantageous not only because using more information yields more accurate maps, but also because it gives insight into how processes affect the state of the environment and is better suited to making extrapolations.
Although these techniques are increasingly applied and have a bright future, several important theoretical and practical issues need to be resolved before routine application is in place.
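The Kalman filter cycle described above can be sketched in scalar form: a time update driven by the state equation, then an optional measurement update that conditions the prediction on an observation. The random-walk-with-forcing state equation is a simplification for illustration:

```python
def kalman_step(x, P, u, Q, z=None, R=None):
    """One Kalman filter cycle for a scalar state:
    time update x' = x + u with process noise variance Q,
    then (if z is given) measurement update with noise variance R."""
    x, P = x + u, P + Q              # time update from the state equation
    if z is not None:
        K = P / (P + R)              # Kalman gain
        x = x + K * (z - x)          # condition prediction on observation
        P = (1 - K) * P
    return x, P
```

Repeated over time points (and, in the space-time setting, over locations), this is how state predictions are kept consistent with both the process model and the measurements.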

Keywords: dynamic modelling, geostatistics, Kalman filtering, regression kriging

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Heuvelink2006accuracy.pdf1.11 MB

Kriging of spatial-temporal water vapor data

Roderik Lindenbergh, Maxim Keshin, Hans van der Marel and Ramon Hanssen
Delft Institute of Earth Observation and Space Systems,
Delft University of Technology, 
P.O.Box 5058, 2600 GB,
Delft, The Netherlands,
Tel.: +31 15 27 87 649; Fax: +31 15 27 83 711
r.c.lindenbergh@tudelft.nl, m.o.kechine@tudelft.nl, h.vandermarel@tudelft.nl, r.f.hanssen@tudelft.nl

Abstract
Water vapor is the dominant greenhouse gas, but it varies strongly in both the spatial and temporal domains. MERIS water vapor data offer a high spatial resolution of down to 300 m, but the temporal resolution is only three days. On the other hand, water vapor observations from GPS ground stations have temporal resolutions on the order of 1 hour, but neighbouring stations are mostly tens of kilometers apart. A collocated cokriging approach, incorporating the results of a structural spatial and temporal analysis of the different water vapor signals, is used to combine both observation sources into a combined water vapor product of high spatial and temporal resolution.

Keywords: spatial-temporal, cokriging, water vapor, accuracy

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Lindenbergh2006accuracy.pdf1.02 MB

Local error evaluation in DEM using direct sequential simulation (DSS) methodology

Jorge Delgado 1, Amilcar Soares 2, José Luis Pérez 1 and Julia Carvalho 2
1 Dpto. Ingeniería Cartográfica, Geodésica y Fotogrametría.
Escuela Politécnica Superior. Universidad de Jaén
Campus de las Lagunillas, Edif. A-3, 23071 Jaén, Spain
Tel.: + 034 953 212 468; Fax: + 034 953 212 855
jdelgado@ujaen.es; jlperez@ujaen.es
2 Environmental Group of the Center for Modelling Petroleum Reservoirs, CMRP/IST
Av. Rovisco Pais, 1049-001 Lisboa, Portugal
Tel.: + 351 218 417 444; Fax: + 351 218 417 389
asoares@ist.utl.pt; jcarvalho@ist.utl.pt

Abstract
In this paper, a methodology for estimating the local uncertainty due to the modelling process in DEM generation is presented. Local uncertainty is a basic aspect in determining the final quality of a DEM and is crucial for establishing the uses to which the model can be put. The proposed methodology is based on a geostatistical simulation procedure that provides “possible versions” of reality from limited terrain information (XYZ terrain points). The simulated models generated by the geostatistical simulation procedure (direct sequential simulation) yield the cdf of the elevation at each grid node. Using these cdfs, it is possible to calculate the uncertainty of the DEM (variance, interquartile range, entropy) through a simple probabilistic analysis, together with several realisations of the DEM that preserve the statistical characteristics of the measured data (histogram, variogram, …).
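Given a stack of equally probable simulated DEMs, the per-node summaries the abstract lists can be computed directly from the local distribution of simulated elevations. The 10-bin discretisation of the local cdf for the entropy is an illustrative assumption:

```python
import numpy as np

def local_uncertainty(realisations):
    """Per-grid-node spread across simulated DEM realisations
    (shape (n_sims, ny, nx)): variance, interquartile range, and
    Shannon entropy over 10 equal-width elevation bins."""
    r = np.asarray(realisations, float)
    var = r.var(axis=0)
    iqr = np.percentile(r, 75, axis=0) - np.percentile(r, 25, axis=0)
    edges = np.linspace(r.min(), r.max(), 11)   # shared elevation bins
    ent = np.zeros(r.shape[1:])
    for idx in np.ndindex(*r.shape[1:]):
        counts, _ = np.histogram(r[(slice(None),) + idx], bins=edges)
        p = counts / counts.sum()
        nz = p[p > 0]
        ent[idx] = -np.sum(nz * np.log(nz))     # 0 where all sims agree
    return var, iqr, ent
```

Nodes where the realisations agree get zero variance and zero entropy; nodes where they scatter get large values, mapping exactly the local uncertainty the DSS approach is after.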

Keywords: local error, uncertainty, DEM, direct sequential simulation, geostatistics

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Delgado2006accuracy.pdf1.66 MB

Mapping trends in water table depths in a Brazilian Cerrado area

Rodrigo Lilla Manzione 1, Martin Knotters 2 and Gerard B. M. Heuvelink 2
1 INPE (National Institute for Spatial Research) – DPI (Image Processing Division)
Avenida dos Astronautas, 1758. Caixa Postal 515, 12227-010, São José dos Campos, SP, Brazil
Tel.: + 055 12 39456481
manzione@dpi.inpe.br 
2 Alterra – Soil Science Centre
Droevendaalsesteeg 3, 6708PB Wageningen, The Netherlands
Tel.: + 031 317 474240
Martin.Knotters@wur.nl, Gerard.Heuvelink@wur.nl 

Abstract
The Cerrado region is the most extensive woodland-savannah in South America, situated on the central Brazilian Plateau and characterized by well-defined wet and dry periods during the year. During the past 30 years, the original vegetation has been replaced by extensive cattle fields and agricultural crops, which are less adapted to drought than the Cerrado vegetation. Therefore, irrigation is increasingly applied, resulting in changes to the hydrological system. The aim of this study is to map systematic changes of the water table depths, in order to indicate areas with potential risks of future water shortage. To this purpose the PIRFICT model is applied, a transfer function-noise (TFN) model with a Predefined Impulse Response Function In Continuous Time. Being the most important driving forces of water table fluctuation, precipitation and evapotranspiration are incorporated as exogenous variables into the model. In addition, a linear trend component is incorporated, reflecting systematic changes of water table depths over time. The linear trend parameter of the time series model is interpolated spatially using universal kriging, utilizing actual land use derived from Landsat images as ancillary information. Series of semi-monthly observed water table depths, 30 months in length, are available from 40 wells in the Jardim river watershed. In this area almost all natural Cerrado vegetation has been replaced by agricultural crops, some of which are intensively irrigated. The time series models are calibrated to the 40 series, and next the trend parameter reflecting systematic changes of water table depths is mapped. The resulting map indicates potential risks of water shortage. The kriged trend parameter is evaluated by cross-validation. The significance of the interpolated trend parameter is also mapped.
The uncertainties associated with the mapped trend parameter are large, suggesting that longer time series of water table depths are needed to obtain more accurate results.
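A much-simplified sketch of the per-well trend step: here a linear trend is fitted directly to each series, ignoring the precipitation/evapotranspiration transfer part of the PIRFICT model. Function and variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def well_trends(times, depths_by_well):
    """Least-squares linear trend of water table depth per well.

    A crude stand-in for the trend component of a TFN model: the
    slope is fitted directly to each series. depths_by_well maps a
    well id to the depths observed at the common observation times.
    Returns well id -> trend slope (depth units per time unit),
    the quantity that would subsequently be interpolated by kriging.
    """
    return {well: np.polyfit(times, depths, 1)[0]
            for well, depths in depths_by_well.items()}
```

The resulting slope per well is the point support that a universal kriging step would then spread over the watershed.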

Keywords: groundwater levels, phreatic levels, time series modeling, spatio-temporal modeling, trend analysis

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Manzione2006accuracy.pdf907.97 KB

Mapping uncertainty in land cover characterization by comparison of land cover cartographies - A case study for Portugal

Vasco B. Nunes and Mário Caetano
Remote Sensing Unit
Portuguese Geographic Institute (IGP), Lisbon, Portugal
Tel.: + 0351 21 381 96 00
Vasco.nunes@igeo.pt; Mario.caetano@igeo.pt

Abstract
In this paper we propose a methodology to assess the uncertainty of land cover cartographies and its spatial distribution within a ranged scale. The method consists of overlaying different land cover databases and determining the uncertainty values for the intersecting areas. This assessment is based on comparison at different hierarchic levels of the nomenclature and on the number of cartographies in agreement, assuming that different sources of information in agreement provide a higher level of confidence than the same sources in disagreement. A case study was developed for the mainland territory of Portugal. The selected spatial information databases had very different technical specifications, and therefore had to be made comparable through harmonisation procedures; the results are consequently all directly dependent on these procedures. The universe of analysis was dramatically reduced, and the overall agreement between all the applied cartographies was considered low at the highest class detail. However, it can be assumed that the resulting areas are now represented with a known level of uncertainty concerning land cover characterization.
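The agreement-counting idea can be sketched on rasterised, harmonised maps: the more sources that agree on a pixel's class, the lower its uncertainty. This is an illustrative sketch under those assumptions, not the paper's polygon-overlay implementation.

```python
import numpy as np

def agreement_uncertainty(maps):
    """Per-pixel agreement among co-registered land cover maps.

    maps: array of shape (n_maps, rows, cols) of harmonised class
    codes. Returns, for each pixel, the number of maps carrying the
    modal (most frequent) class there: n_maps means full agreement,
    lower values mean higher uncertainty about the land cover label.
    """
    maps = np.asarray(maps)
    agree = np.zeros(maps.shape[1:], dtype=int)
    # For every class present, count how many maps assign it per pixel
    # and keep the largest count.
    for c in np.unique(maps):
        agree = np.maximum(agree, (maps == c).sum(axis=0))
    return agree
```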

Keywords: land cover, error, uncertainty, map comparison

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Nunes2006accuracy.pdf745.78 KB

Maps are not what they seem: representing uncertainty in soil-property maps

Tomislav Hengl 1 and Norair Toomanian 2
1 European Commission, Directorate General JRC Institute for Environment and Sustainability Land Management & Natural Hazards Unit
TP 280, JRC Via E. Fermi 1, I-21020 Ispra (VA), Italy 
Tel.: + 39 0332 785535; Fax: + 39 0332 786394
tomislav.hengl@jrc.it
2 College of Agriculture, Isfahan University of Technology
Isfahan University of Technology, 84154 Isfahan, Iran
Tel.: + 98 311 3913360; Fax: + 98 311 3912254
norair@sepahan.iut.ac.ir

Abstract
The paper discusses the use of static visualization techniques for the representation of uncertainty in spatial prediction models, illustrated with examples from soil mapping. The uncertainty of a prediction model, represented by the prediction error, is commonly ignored or visualized only separately from the predictions. Two techniques that can be used to visualize the uncertainty are colour mixing (whitening) and pixel mixing. In both cases, the uncertainty is coded with the white colour and quantitative values are coded with hues. An additional hybrid static visualization technique (pixel mixing with simulations) that shows both the short-range variation and the overall uncertainty is described. Examples from a case study from Central Iran (42×71 km) were used to demonstrate the possible applications and emphasize the importance of visualizing the uncertainty in maps. The soil predictions were made using 118 soil profiles and 16 predictors ranging from terrain parameters to Landsat 7 bands. All variables were mapped using regression-kriging and a grid resolution of 100 m. Final maps of texture fractions, EC and organic matter in the topsoil were visualized using whitening, pixel mixing and pixel mixing combined with simulations. Visualization of uncertainty allows users to compare the success of spatial prediction models for various variables. In this case study, the results showed that there can be considerable differences in the achieved precision of predictions for the various variables and that some soil variables need to be collected with a much higher inspection density to satisfy the required precision. Visualization of uncertainty also allows users to dynamically improve the precision of predictions by collecting additional samples. Links to scripts that users can download and use to visualize their own datasets are given.
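The whitening idea — pushing the prediction colour toward white in proportion to its uncertainty — can be sketched with a simple linear RGB mix. This is an assumption-laden illustration, not the authors' hue-based implementation.

```python
import numpy as np

def whiten(rgb, error):
    """Mix mapped colours with white in proportion to prediction error.

    rgb:   array (rows, cols, 3) in [0, 1], colours coding the predictions.
    error: array (rows, cols) in [0, 1], normalised prediction error
           (e.g. kriging variance rescaled by its maximum).
    Pixels with error 0 keep their colour; error 1 becomes pure white.
    """
    e = np.clip(error, 0.0, 1.0)[..., None]
    white = np.ones_like(rgb)
    return (1.0 - e) * rgb + e * white
```

A reader of the whitened map sees saturated colour where the model is confident and washed-out, near-white areas where the prediction error is large.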

Keywords: visualization, spatial prediction error, conditional simulations, regression kriging

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Hengl2006accuracy.pdf2.5 MB

Merging Landsat and SPOT digital data using stochastic simulation with reference images

Júlia Carvalho 1, Jorge Delgado-García 2 and Amílcar Soares 1
1 Environmental Group of the Centre for Modelling Petroleum Reservoirs, CMRP/IST
Av. Rovisco Pais, 1049-001 Lisboa, Portugal
Tel.: + 351 218 417 444; Fax: + 351 218 417 389
jcarvalho@ist.utl.pt; asoares@ist.utl.pt
2 Dpto. Ingeniería Cartográfica, Geodésica y Fotogrametria, Escuela Politécnica Superior, Univ. de Jaén
Campus de las Lagunillas, Edif. A-3. 23071 Jaén, España
Tel.: + 344 953 212 468; Fax: + 34 953 212 855
jdelgado@ujaen.es

Abstract
There is a wide range of systems providing digital satellite imagery with different spatial and spectral resolutions. Unfortunately, these resolutions are in most cases in opposition; i.e., the high spatial resolution sensors have low spectral resolution, whereas the multispectral sensors have good spectral resolution but poor spatial resolution, making their use in detailed applications difficult. The problem is solved using digital image merging procedures. The main objective of these methods is to obtain synthetic images that combine the advantage of the high spatial resolution of one image with the high spectral resolution of another. Ideally, the method used to merge data sets of high spatial resolution and high spectral resolution should not distort the spectral characteristics of the high spectral resolution data. The classical merging procedures (PCA, IHS, HPS, etc.) present several drawbacks. The objective of this paper is to present a geostatistical merging methodology based on direct sequential co-simulation with reference images (Carvalho et al., 2006). With the stochastic simulation one generates a high spatial resolution image with the characteristics of the higher spectral resolution image. It is an iterative inverse optimization procedure that tends to reach the matching of an objective function by preserving the spectral characteristics and spatial pattern, as revealed by the variograms, of the higher spectral resolution images, both in terms of descriptive statistics and band correlation coefficients. The method was applied to Landsat TM and SPOT-P images. The results were compared with the original Landsat image and with the images provided by classical merging procedures.
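The co-simulation itself is involved, but the fidelity criteria the abstract names — descriptive statistics and band correlation coefficients — are easy to check. The sketch below is a hypothetical evaluation helper, not part of the authors' method; it assumes both images have been brought to a common grid.

```python
import numpy as np

def merge_quality(original, merged):
    """Simple fidelity checks for a merged image against the original
    multispectral image (both on a common grid): per-band mean and
    standard deviation differences, and the difference of the band
    correlation matrices -- the statistics a spectral-preserving
    merging method should leave close to zero.

    original, merged: arrays of shape (bands, rows, cols).
    """
    o = original.reshape(original.shape[0], -1)
    m = merged.reshape(merged.shape[0], -1)
    mean_diff = m.mean(axis=1) - o.mean(axis=1)
    std_diff = m.std(axis=1) - o.std(axis=1)
    corr_diff = np.corrcoef(m) - np.corrcoef(o)
    return mean_diff, std_diff, corr_diff
```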

Keywords: digital image merging method, geostatistics, stochastic simulation

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Carvalho2006accuracy.pdf791.35 KB

Modeling spatial variation in data quality using linear referencing

MGSM Zaffar Sadiq 1, Matt Duckham 2 and Gary J Hunter 2
1 Cooperative Research Centre for Spatial Information, Department of Geomatics, 
University of Melbourne, 723, Swanston Street, Victoria - 3010, Australia.
Tel.: + 061 3 83443177; Fax: + 061 3 93495185 
s.mohamedghouse@pgrad.unimelb.edu.au
2 Department of Geomatics, 
University of Melbourne, Victoria - 3010, Australia.
Tel.: + 061 3 83446935; Fax: + 061 3 93472916
mduckham@unimelb.edu.au, garyh@unimelb.edu.au

Abstract
Spatial data quality (SDQ) is conventionally presented in the form of a report. The data quality statements in the report refer to the entire data set. In reality the quality of data varies spatially due to data collection methods, data capturing techniques, and analysis. Thus, the quality of spatial data for one area may not be applicable to spatial data describing other regions. The present systems for reporting and representing SDQ (data quality statements) cannot address the data user’s requirements as they are not location specific. Consequently, conventional approaches to SDQ that ignore variation in quality within a data set impair the data producer’s ability to correctly communicate knowledge about data quality and jeopardize the user’s ability to assess fitness for use. To enable proper communication of SDQ, spatially varying data quality needs to be represented in the database. This paper discusses the representation of spatial variation of data quality in spatial databases using three models: per-feature, feature-independent, and feature-hybrid. In the per-feature model, quality information is stored against each spatial feature (object) stored in the database. In the feature-independent model, quality information is stored independently of particular features (as a field). The feature-hybrid model is derived from a combination of the per-feature and feature-independent models. One example of an existing data management technique that can be adapted for use as a feature-hybrid model is linear referencing. Applying linear referencing in this way is a new approach to representing spatial variation in quality. The paper concludes with a review of the relative merits of the different strategies for storing spatially varying data quality information.
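Linear referencing stores attributes against measure ranges along a feature rather than against the whole feature. A minimal sketch of that data structure (all names hypothetical, not the paper's schema):

```python
class LinearQuality:
    """Quality information stored against measure ranges along a
    linear feature (linear referencing), so a single road or river
    can carry different quality statements on different stretches.

    events: iterable of (from_measure, to_measure, quality) records.
    """

    def __init__(self, events):
        self.events = sorted(events)

    def quality_at(self, m):
        """Return the quality statement applying at measure m, or
        None if no record covers that stretch."""
        for lo, hi, quality in self.events:
            if lo <= m < hi:
                return quality
        return None
```

A per-feature model would attach one quality value to the whole feature; here a user querying measure 120 of a 250-unit road can get a different answer than at measure 50.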

Keywords: spatial data quality, uncertainty, metadata, sub-feature variation, linear referencing

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Sadiq2006accuracy.pdf653.2 KB

Modeling uncertainty in spatiotemporal objects

T. Shokri 1, M. R. Delavar 1, M. R. Malek 1 and A. U. Frank 1,2
1 Center of Excellence in Geomatics Engineering  and Disaster Management, Dept. of Surveying and Geomatics Engineering,
Engineering Faculty, University of Tehran, Tehran, Iran
talashokri@geomatics.ut.ac.ir, mdelavar@ut.ac.ir, malek@ncc.neda.net.ir
2 Institute of Geoinformation and Cartography, TU WIEN, Austria
frank@geoinfo.tuwien.ac.at

Abstract
Spatiotemporal modeling has received great attention in both the research and user communities. Although considerable research efforts exist and valuable results have been obtained, many of the studies and proposed approaches are based on the assumption that objects have crisp boundaries, that relationships among them are precisely defined, and that accurate measurement of positions leads to error-free representations. This simplification is not sufficient for all applications; spatial objects, attributes, relationships, time points, time periods, events and their changes may all carry uncertainty that influences spatiotemporal systems. In this paper, we first explore the semantics of spatial and temporal indeterminacy, to better understand their nature and behavior, and show how the fundamental modeling concepts of spatial objects, time and events are influenced by indeterminacy and how these concepts can subsequently be combined. Afterwards, we focus on the change of spatial objects and their geometry in time, develop the models required for the uncertainties mentioned, and finally present the proposed methodology for modeling indeterminate changes through a simulation case study.

Keywords: temporal uncertainty, uncertainty, temporal data quality, temporal GIS

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Shokri2006accuracy.pdf662.7 KB

Modelling positional errors with isotropic random vector fields

João Casaca 1 and Ana Maria Fonseca 1
1 National Laboratory for Civil Engineering
Av. Brasil 101, 1700-066 Lisbon, Portugal
Tel.: + 351 21 844 3000; Fax: + 351 21 844 3026
jcasaca@lnec.pt; anafonseca@lnec.pt

Abstract
Analysis of spatially distributed random multidimensional phenomena, such as positional errors, requires some generalization of the usual concepts of Geostatistics: random vector fields, associating random vectors with space positions, must be used instead of random scalar fields; covariance matrix functions must replace scalar covariance functions; matrix variograms must replace scalar variograms; etc. In the case of isotropic random vector fields, which are well suited to model the space distribution of positional errors, the multidimensional generalization is easier, allowing the construction of a scalar pseudo-variogram which may be used as an efficient tool to partially estimate the field’s autocovariance matrix functions. After a theoretical introduction presenting the concept of the scalar pseudo-variogram of an isotropic random vector field, the paper describes its application to the positional error vector of a geometrically corrected high resolution numeric image acquired with the sensors of the Quickbird satellite.
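An empirical scalar pseudo-variogram for a vector field can be estimated, under the isotropy assumption, as half the mean squared norm of error-vector differences over point pairs at a given separation distance. This is a generic textbook-style sketch, not the paper's estimator; names and the binning rule are assumptions.

```python
import numpy as np

def pseudo_variogram(coords, errors, lags, tol):
    """Empirical scalar pseudo-variogram of a positional error field.

    coords: (n, 2) point locations; errors: (n, 2) error vectors at
    those points. For each lag h the estimator averages
    0.5 * ||e(x_i) - e(x_j)||^2 over all pairs whose separation
    distance is within tol of h (isotropy assumed, so only the
    distance between points matters, not the direction).
    """
    coords = np.asarray(coords, float)
    errors = np.asarray(errors, float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * np.sum((errors[:, None, :] - errors[None, :, :]) ** 2, axis=-1)
    gamma = []
    for h in lags:
        mask = (np.abs(d - h) <= tol) & (d > 0)
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)
```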

Keywords: autocovariance function, isotropy, positional error vector, pseudo-variogram, vector random field

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Casaca2006accuracy.pdf579.99 KB

On dealing with spatially correlated residuals in remote sensing and GIS

Nicholas A. S. Hamm 1, Peter M. Atkinson 2 and Edward J. Milton 3
School of Geography
University of Southampton
Southampton SO17 3AT
United Kingdom
Tel.: +44 7887 578 442; Fax: +44 23 8059 3295
nick@hamm.org
pma@soton.ac.uk
ejm@soton.ac.uk

Abstract
Key assumptions in standard regression models are that the residuals are independent and identically distributed. These assumptions are often not met in practice. In particular, the issue of spatial correlation amongst the residuals has had limited attention in the remote sensing and GIS literature. This paper discusses approaches that use familiar authorized models to specify the covariance amongst the residuals. The model parameters were estimated using a maximum likelihood approach (ML or REML). The accuracy of the approach was investigated using simulated data and found to give accurate estimates of the error variance (σ²), although estimates of the range (a) and nugget component (s) were less accurate. The approach was found to be robust to the choice of covariance function (exponential or spherical). The approach was then extended to deal with heteroskedastic residuals by incorporating a weighting component analogous to that used in weighted least squares. This was also shown to yield accurate results for σ², a and s. Finally, possibilities for extending these approaches to prediction are considered.
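The core ingredients — an exponential residual covariance with nugget, and generalized least squares under it — can be sketched as follows. This is a minimal illustration assuming the covariance parameters are already known; the paper estimates them by ML/REML, which is not reproduced here, and all names are assumptions.

```python
import numpy as np

def exp_cov(d, sigma2, a, nugget):
    """Exponential covariance with nugget:
    C(d) = sigma2 * exp(-d / a) + nugget * 1{d == 0}."""
    return sigma2 * np.exp(-d / a) + nugget * (d == 0)

def gls_fit(X, y, coords, sigma2, a, nugget):
    """GLS regression coefficients and log-likelihood under an
    exponential residual covariance. The covariance parameters are
    taken as given here; in practice they would be estimated by
    maximising this same likelihood (ML) or its restricted form (REML).
    """
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    C = exp_cov(d, sigma2, a, nugget)
    Ci = np.linalg.inv(C)
    beta = np.linalg.solve(X.T @ Ci @ X, X.T @ Ci @ y)
    r = y - X @ beta
    sign, logdet = np.linalg.slogdet(C)
    loglik = -0.5 * (logdet + r @ Ci @ r + len(y) * np.log(2 * np.pi))
    return beta, loglik
```

When the off-diagonal covariance vanishes this reduces to ordinary least squares; spatially correlated residuals shift the weighting of nearby, redundant observations.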

Keywords: correlated residuals, regression, GIS, remote sensing, variogram

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Hamm2006accuracy.pdf816.25 KB

Positional Accuracy of Biological Research Data in GIS – A Case Study in the Swiss National Park

Stephan Imfeld 1, Ruedi Haller 2 and Patrick Laube 3
1 Department of Geography, University of Zurich
Winterthurerstr.190, CH-8057 Zurich, Switzerland
Tel.: + 41 44 635 5253; Fax: + 41 44 635 6848
imfeld@geo.unizh.ch
2 Swiss National Park
Chasa dal Parc, CH-7530 Zernez, Switzerland
Tel.: + 41 81 856 1282; Fax: + 41 81 856 1740
rhaller@nationalpark.ch
3 School of Geography and Environmental Science
University of Auckland
Private Bag 92019
Auckland, New Zealand
Tel: +64 9 3737599; Fax: +64 9 3737434
p.laube@auckland.ac.nz

Abstract
Original field research data require information about the positional accuracy of objects located in the field, especially when analysed in the context of GIS. We present the results of a case study assessing the spatial accuracy of vegetation sampling data. The positional accuracy of a research grid used for vegetation studies, consisting of adjacent squares of 20 m × 20 m set up using a large scale orthophoto (1:2000), was assessed using surveying techniques. To study the absolute positional accuracy of the setup, the exact locations of a large part of these squares were determined. The mean positional error was 5.2 m (span 0.9-9.1 m) for pegs located in the corners of the squares. The size of the individual squares ranged from 64% to 133% of the planned size of 400 m². The average (horizontal) distance between the true locations (n=335) was exactly as planned (20.0±2.1 m). The minimum distance was 14.8 m and the maximum distance was 25.4 m. The mean horizontal angle in the corners of the plots was 89.9±3.3° (span 77.4-102.3°) (n=615). Overall, 67.4% of the whole area was in accordance with the GIS database, while 32.6% was falsely attributed to the wrong sampling squares. The influence on vegetation classification statistics was small (maximum of 0.58%). Even with the aid of relatively sophisticated instruments such as orthophotos, the positional accuracy in the original study was low, resulting in differences in plot area of over 200%. Nevertheless, the influence on the results of a single study is moderate. By contrast, these errors are of high concern in areas of intense interdisciplinary research such as national parks. It is thus recommended that for a focal research area such as the site under investigation, surveying techniques be implemented to enable long-term research and to minimize the risk of incorrect research results due to inaccurate spatial data.

Keywords: spatial accuracy, error analysis, uncertainty, environmental data, model

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Imfeld2006accuracy.pdf595.68 KB

Precipitation measurement and the analysis of hydrological resources in a river basin

Arminda Manuela Gonçalves 1 and Teresa Alpuim 2
1 Department of Mathematics for Science and Technology, University of Minho, 
Campus de Azurém 4800-058 Guimarães, Portugal
Tel.: +351 253510433; Fax: +351 253510401
mneves@mct.uminho.pt
2 Department of Statistics and Operations Research, University of Lisbon
Campo Grande, Edifício C5, 1148-016 Lisboa, Portugal
Tel.: +351 21750047
mtalpuim@fc.ul.pt 

Abstract
The spatial and temporal measurement of precipitation over a geographical area is essential when calculating the hydrological balance, for the indirect estimation of water flow, and for the study of the watershed charge. It is also very important for modelling many environmental phenomena, for example the variation of the water quantity of the hydrological basin of a river. The quality of water at a certain location is a reflection of the dominant conditions in the source basin of that location, namely the hydro-meteorological factors. The variation of a space-time quality variable is associated with the variation of the flow (variable dilution effect), which in turn is related, in general, to the seasonal variation of rainfall. We present the problem of areal precipitation measurement in order to estimate a hydro-meteorological factor that will be used in the modelling of the surface water quality of river basins. In our study we need monthly measurements of areal average rainfall, which influence a certain area of the River Ave basin (located in the north-west of Portugal) and which represent the hydro-meteorological factor in the modelling of the quantity of water. Our main goal is to identify models that accurately estimate monthly average rainfall in places where there are no observed values (at the water quality monitoring points), with the help of the spatial distribution of the rainfall and measurements at other locations. Due to the large spatial and temporal variability of rainfall, the precise evaluation, in real time, of mean areal estimates poses a difficult problem. These estimates are obtained from a set of rain gauges located at 19 points in the area of the Ave River hydrological basin. We estimate the monthly values of rainfall per area flowing into each monitoring site. To accomplish this, we propose a methodology combining both deterministic (Thiessen’s polygons) and stochastic (kriging) processes.
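The deterministic (Thiessen) half of the methodology can be sketched by assigning every basin cell to its nearest gauge and weighting each gauge by the fraction of the area it serves. This is an illustrative discretised version under those assumptions, not the authors' implementation.

```python
import numpy as np

def thiessen_weights(gauges, grid):
    """Thiessen (nearest-gauge) weights for areal rainfall estimation.

    gauges: (m, 2) gauge coordinates; grid: (n, 2) cell centres
    covering the basin area. Each cell is assigned to its nearest
    gauge; a gauge's weight is the fraction of cells it serves,
    i.e. a discrete approximation of its Thiessen polygon area.
    """
    d = np.linalg.norm(grid[:, None, :] - gauges[None, :, :], axis=-1)
    nearest = d.argmin(axis=1)
    return np.bincount(nearest, minlength=len(gauges)) / len(grid)

def areal_rainfall(gauges, grid, rainfall):
    """Monthly areal rainfall as the Thiessen-weighted gauge average."""
    return thiessen_weights(gauges, grid) @ rainfall
```

The stochastic alternative (kriging) would replace these piecewise-constant weights with covariance-based ones, which is where the two approaches in the abstract differ.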

Keywords: hydrological basin, hydro-meteorological factor, area rainfall estimation, Thiessen’s polygons, Kriging

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
GoncalvesAlpuim2006accuracy.pdf1.1 MB

Producing digital elevation models with uncertainty estimates using a multi-scale Kalman filter

John C. Gallant 1 and Michael F. Hutchinson 2
1 CSIRO Land and Water
Clunies Ross St, Acton ACT 2602, Australia
Tel.: +612 6246 5734; Fax: +612 6246 5965
John.Gallant@csiro.au
2 Centre for Resource and Environmental Studies
Australian National University, Acton ACT 2602, Australia
Tel: +612 6125 4783
Michael.Hutchinson@anu.edu.au

Abstract
The Shuttle Radar Topographic Mission (SRTM) digital elevation data provide near-global coverage at about 90 m resolution and in much of the world are now the best available topographic data. Their application for quantitative analysis is limited by random noise and systematic offsets due to vegetation. This paper describes a multiscale Kalman smoothing algorithm for removing vegetation effects and smoothing random variations. The algorithm assimilates dense SRTM data, a vegetation mask and sparser but more accurate ICESat satellite laser altimetry data to produce improved estimates of ground height. The method is found to be effective provided the vegetation mask accurately reflects the location of vegetation-induced offsets in the SRTM data. The method also produces estimates of uncertainty in the elevations, facilitating the use of methods for propagating error through derived terrain attributes.
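At the heart of any such assimilation is the Kalman measurement update: two height estimates with known variances are combined by inverse-variance weighting, and the fused variance is smaller than either input. The single-point sketch below illustrates only that update, not the multiscale structure of the algorithm; names are assumptions.

```python
def fuse_heights(z_srtm, var_srtm, z_icesat, var_icesat):
    """One Kalman measurement update: combine a (bias-corrected) SRTM
    height with an ICESat height, weighting by inverse variance.
    Returns the fused height and its reduced variance.
    """
    k = var_srtm / (var_srtm + var_icesat)   # Kalman gain
    z = z_srtm + k * (z_icesat - z_srtm)     # pulled toward the more certain source
    var = (1.0 - k) * var_srtm               # always <= min of the inputs here
    return z, var
```

With equal input variances the fused height is the plain average and the variance halves; when ICESat is far more accurate, the fused estimate sits close to the ICESat value.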

Keywords: multiscale smoothing, digital elevation model, SRTM, uncertainty estimates

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Gallant2006accuracy.pdf973.5 KB

Progress with the design of a soil uncertainty database, and associated tools for simulating spatial realisations of soil properties

Linda Lilburne, Allan Hewitt and Steve Ferriss
Landcare Research
Private Bag 69, Lincoln, Canterbury 8152, New Zealand
Tel.: +64 (3) 325 6700; Fax: +64 (3) 325 2418
lilburnel@landcareresearch.co.nz; hewitta@landcareresearch.co.nz, ferrisss@landcareresearch.co.nz

Abstract
Uncertainty assessment in soil information has been well served by the geostatistical community. Mathematical techniques based on kriging theory allow for the spatial autocorrelation of soil measurements to be characterised in a semi-variogram, and then conditionally simulated, providing spatial realisations of soil properties of interest. However, in some countries with a particularly diverse landscape and a poor network of soil samples, data measurements are simply too sparse for this approach to be used, especially in a national-scale database. This is the case for New Zealand, where soil surveys are based on soil–landform relationships, producing polygonal rather than raster data. An alternative, cruder approach to characterising uncertainty is needed that relies upon a cohort of very experienced pedologists. This paper describes how uncertainty information has been incorporated in the design of S-map, New Zealand’s new national spatial soils database. These uncertainty data are primarily derived from expert knowledge due to the lack of other sources. Some tools have been written in Python to access this uncertainty information and create randomly simulated spatial realisations for a variety of soil properties.
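One plausible shape for such expert-driven simulation (the abstract's tools are in Python, though the actual S-map code is not reproduced here) is to draw each polygon's property from an expert-supplied minimum / most-likely / maximum triple; the triangular distribution and all names are assumptions.

```python
import numpy as np

def simulate_property(polygons, n_sim, rng=None):
    """Draw realisations of a soil property from expert-supplied
    (minimum, most likely, maximum) triples per map polygon.

    polygons: dict mapping polygon id -> (lo, mode, hi).
    Returns dict mapping polygon id -> array of n_sim triangular
    draws, one set of values per simulated spatial realisation.
    """
    if rng is None:
        rng = np.random.default_rng()
    return {pid: rng.triangular(lo, mode, hi, size=n_sim)
            for pid, (lo, mode, hi) in polygons.items()}
```

Propagating the n_sim realisations through a downstream model (e.g. a leaching calculation) then yields a distribution of outcomes per polygon rather than a single value.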

Keywords: expert knowledge, simulation, soil data model, soil survey

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
LilburneHewitt2006accuracy.pdf857.73 KB

RENalyzer: A tool to facilitate the spatial accuracy assessment of digital cartography

Thomas Bartoschek 1,2, Marco Painho 1, Roberto Henriques 1, Miguel Peixoto 1, Ana Cristina Costa 1
1 Instituto Superior de Estatística e Gestão de Informação
Universidade Nova de Lisboa
Campus de Campolide, 1070-312 Lisboa, Portugal
Tel.: + 351 21 387 04 13; Fax: + 351 21 387 21 40
tbarto@isegi.unl.pt, painho@isegi.unl.pt, roberto@isegi.unl.pt, mpeixoto@isegi.unl.pt, ccosta@isegi.unl.pt
2 Institute for Geoinformatics
University of Münster
Robert-Koch-Str. 26-28, 48149 Münster, Germany
Tel.: +49 (0) 251 83 33083; Fax: +49 (0) 251 83 39763
bartoschek@uni-muenster.de

Abstract
Managing spatial data in paper maps is quite different from managing digital spatial information. Sometimes manual vectorization of scanned maps is the only way to produce digital cartography, especially when the only source of spatial information is a paper map. Digital scanning and manual vectorization are two processes well known for adding error to data, and accuracy is a particularly important issue for users of spatial information. The National Ecological Reserve (REN), established in Portuguese national law, protects areas with a diversified bio-physical structure and specific ecological characteristics. This information is often required to manage several human activities, such as mineral extraction, real estate, industry, tourism, etc. REN maps were originally produced on paper and were vectorized to produce digital cartography. The objective of this study is to measure the spatial accuracy and to assure conformity with the original cartography. The accuracy of the REN digital cartography was assessed through a stratified sampling scheme, as described in Peixoto et al. (2006). The preparation of the digitized maps and the application of the sampling method, as well as the data quality control, were combined in the RENalyzer application. The application calculates the areas of all polygons with respect to their attribute classes using object-oriented computation methods. As described in the sampling method, all class combinations (overlapping polygons) are considered. These areas and the global area are the basis of the global sample size and of the sample sizes per class and class combination. Following these guidelines the application creates random points in randomly chosen polygons. The data quality control is done by facilitating the map reviewing process. After adding the scanned map and automatically setting map properties, the application zooms to each point and allows fast and accurate quality control.
Errors in digitizing are computed and classified into spatial and thematic errors. The attributes distance to original class and error description are added to the table.
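The per-class sample sizing can be sketched as area-proportional allocation with a floor so that small classes are still checked. This is an illustrative allocation rule, not the exact scheme of Peixoto et al. (2006); names and the floor value are assumptions.

```python
def allocate_samples(class_areas, n_total, n_min=5):
    """Proportional allocation of QC sample points to map classes.

    class_areas: dict mapping class -> area; n_total: global sample
    size. Each class receives points in proportion to its share of
    the total area, with a floor of n_min points so that rare
    classes are still represented in the quality control.
    """
    total = sum(class_areas.values())
    return {c: max(n_min, round(n_total * a / total))
            for c, a in class_areas.items()}
```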

Keywords: data quality, quality control, geocomputation, National Ecological Reserve

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Bartoschek2006accuracy.pdf764.05 KB

Reducing positional error in spatio-temporal analyses

Jean-François Mas
Unidad Académica Morelia, 
Instituto de Geografía, Universidad Nacional Autónoma de México, 
Aquilés Serdán 382, Colonia Centro, Morelia, Michoacán, México
Tel.: + 00 52 443 317 94 23; Fax: + 00 52 443 317 94 25
jfmas@igg.unam.unam.mx

Abstract
A frequent method used to assess land use/land cover (LULC) change is the comparison (overlay) of digital maps of an area within a geographic information system (GIS). However, positional errors of the maps involved in the comparison tend to create “false” change polygons and to overestimate LULC change. In this work, a simple method to improve change estimates by detecting and correcting erroneous changes resulting from positional errors is presented. In this approach, boundary uncertainty is managed using the epsilon band, which encloses a confidence region with a specific probability of including the true location of the boundary. In order to test the method, a map of change was first generated through the overlay of two digital versions of the same analogue cartography. False changes resulting from both positional (digitizing) and thematic (labelling) errors were quantified. Finally, the method was applied to a LULC change monitoring project in a region of south-eastern Mexico, which has been undergoing important land cover changes during the past decades. In both cases, the method allowed a significant reduction of erroneous changes due to positional errors. Given the complexity of error modelling approaches, such simple methods are likely to be the most useful in the context of practical projects aimed at assessing LULC change.
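A raster version of the epsilon-band filter can be sketched as: find the class boundaries of the earlier map, grow them into a band of half-width epsilon, and discard "change" pixels inside that band. This is an illustrative sketch (the paper works with polygons, not rasters), and all names are assumptions.

```python
import numpy as np

def filter_false_change(map1, map2, epsilon):
    """Discard 'change' pixels lying inside the epsilon band around
    the class boundaries of the first map, where apparent change is
    likely an artefact of positional error rather than real change.

    map1, map2: integer class rasters on a common grid.
    epsilon: band half-width in pixels (integer).
    Returns a boolean raster of the retained (likely real) changes.
    """
    change = map1 != map2
    # Boundary pixels of map1: any pixel whose 4-neighbour differs.
    band = np.zeros_like(change)
    band[:-1, :] |= map1[:-1, :] != map1[1:, :]
    band[1:, :] |= map1[1:, :] != map1[:-1, :]
    band[:, :-1] |= map1[:, :-1] != map1[:, 1:]
    band[:, 1:] |= map1[:, 1:] != map1[:, :-1]
    # Grow the boundary into an epsilon band by repeated dilation.
    for _ in range(int(epsilon)):
        grown = band.copy()
        grown[:-1, :] |= band[1:, :]
        grown[1:, :] |= band[:-1, :]
        grown[:, :-1] |= band[:, 1:]
        grown[:, 1:] |= band[:, :-1]
        band = grown
    return change & ~band
```

Changes well inside a polygon survive the filter; sliver "changes" hugging a boundary, the typical signature of digitizing error, are removed.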

Keywords: spatial accuracy, positional error, epsilon band, land cover change monitoring

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Mas2006accuracy.pdf703.73 KB

Reducing uncertainty in analysis of relationship between vegetation patterns and precipitation

Pavel Propastin 1, Nadiya Muratova 2 and Martin Kappas 1
1 Department of Geography, Georg-August-University
Goldschmidtstr. 3, 37077 Goettingen, Germany
Tel.: + 049 0551 389 487; Fax: + 049 0551
ppavel@gmx.de; mkappas@uni-goettingen.de 
2 Institute for Space Research and Earth Observation
Shevchenko Str. 15, 480100, Almaty, Kazakhstan
Tel.: + 007 3272 494 274; Fax: + 007 3272 918 077
nmuratova@mail.com

Abstract
The spatial relationship between vegetation patterns and rainfall, and its trend over the period 1985-2001, in the desert, semi-desert and steppe grassland of Middle Kazakhstan was investigated using Normalized Difference Vegetation Index (NDVI) images (1985-2001) derived from the Advanced Very High Resolution Radiometer (AVHRR) and rainfall data from weather stations. The growing-season relationship was examined using a conventional global ordinary least squares (OLS) regression technique and a local regression technique known as geographically weighted regression (GWR). Regression models between NDVI and precipitation for every analysis year (1985-2001) were calculated separately with both statistical approaches, OLS and GWR. The study found high spatial and temporal non-stationarity in the strength of the relationships and in the regression parameters. The ordinary least squares regression model applied to the whole study area was superficially strong (R² = 0.63), but it gave no local description of the relationship. Applying OLS at the scale of the separate land cover classes revealed a different response of the various vegetation types to rainfall within the study area. The strength of the relationship between NDVI and rainfall increased from desert (R² = 0.36) to semi-desert (R² = 0.52) and steppe grassland (R² = 0.67). Geographically weighted regression provided considerably stronger relationships from the same data sets (mean R² = 0.88) and highlighted local variations within the land cover classes. Relationships between vegetation patterns and rainfall amounts are generally assumed to be spatially and temporally stationary. This assumption was not satisfied in this study: the relationship varied significantly in space and time.
In such circumstances the results provided by a global regression model were uncertain and incorrectly represented the relationship between the two variables locally. The application of local regression techniques such as GWR may reveal local patterns of the relationship and significantly reduce the uncertainty of the calculations.
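The OLS-versus-GWR contrast described above can be sketched on synthetic data. This is a minimal, hypothetical illustration (NumPy only); the 1-D station layout, bandwidth and variable names are assumptions, not the study's data or implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
s = rng.uniform(0, 1, n)            # station locations (1-D for brevity)
rain = rng.uniform(0, 1, n)         # standardized rainfall
# Synthetic non-stationary response: the NDVI-rainfall slope varies in space.
ndvi = (1.0 + 2.0 * s) * rain + 0.05 * rng.normal(size=n)

X = np.column_stack([np.ones(n), rain])

def wls(X, y, w):
    """Weighted least squares via the weighted normal equations."""
    Xw = X * w[:, None]
    return np.linalg.solve(Xw.T @ X, Xw.T @ y)

# Global OLS: a single slope for the whole region (masks local variation).
beta_ols = wls(X, ndvi, np.ones(n))

# GWR: refit at each location with Gaussian kernel weights on distance.
bandwidth = 0.1
slopes = np.empty(n)
for i in range(n):
    w = np.exp(-0.5 * ((s - s[i]) / bandwidth) ** 2)
    slopes[i] = wls(X, ndvi, w)[1]
```

The global slope sits near the spatial average, while the GWR slopes track the locally varying relationship, which is the behaviour the abstract reports.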

Keywords: NDVI, precipitation, regression modeling

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Propastin2006accuracy.pdf860.23 KB

Sampling simulation on multi-source output forest maps - an application for small areas

Matti Katila 1 and Erkki Tomppo
Finnish Forest Research Institute 
Unioninkatu 40 A, FIN-00170 Helsinki, Finland 
Tel.: +358 10 2111 ; Fax: +358102112104 
matti.katila@metla.fi; erkki.tomppo@metla.fi

Abstract
Systematic samples were simulated on multi-source National Forest Inventory (MS-NFI) output thematic forest maps obtained by the k-nearest neighbours (k-NN) method for areas of 1 km² and 100 km². The standard deviations of the simulated sample means were used as estimates of the standard errors (SE) of the particular designs. The variables of interest were the mean tree stem volume (m³/ha) and the volumes by tree species. The effect of the temporal and thematic accuracy of the forest maps on the precision of the SE estimates from the simulations was studied. The simulated SEs were validated against independent field inventory data measured at three test sites of 1 km² and seven sites of 100 km² in Eastern Finland. The effect of the estimation parameters of the k-NN method on the error estimates was also studied. The study showed that the Finnish MS-NFI thematic maps can be used to estimate sampling errors for systematic field sampling designs with plot distances down to 75-100 m on areas of 1 km² and larger.
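The simulation scheme described above, repeated systematic samples laid over a map with the standard deviation of the sample means taken as the SE estimate, can be sketched as follows. The synthetic map and the grid spacing are illustrative assumptions, not the MS-NFI data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic thematic "stem volume map" (m3/ha): smooth trend plus noise on a
# 100 x 100 grid, a stand-in for a k-NN multi-source forest map.
nx = ny = 100
xx, yy = np.meshgrid(np.arange(nx), np.arange(ny))
volume = 150 + 50 * np.sin(xx / 15.0) + 30 * np.cos(yy / 25.0) \
         + 10 * rng.normal(size=(ny, nx))

def systematic_means(grid, step):
    """Sample mean for every possible start offset of a square systematic grid."""
    means = []
    for dy in range(step):
        for dx in range(step):
            means.append(grid[dy::step, dx::step].mean())
    return np.array(means)

# With step=10 there are 100 possible systematic samples; the spread of their
# means estimates the SE of the systematic design on this map.
means = systematic_means(volume, step=10)
se_systematic = means.std(ddof=1)
```

Because every pixel falls in exactly one offset sample, the mean of the sample means reproduces the map mean exactly, so the only quantity being estimated is the design's standard error.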

Keywords: multi-source forest inventory, sample simulation, standard error, Landsat TM

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Katila2006accuracy.pdf752.35 KB

Semantic similarity assessment in support of spatial data integration

Mir Abolfazl Mostafavi
Department of geomatics, 
Laval University, Quebec city, Canada
Tel: +001 418 656 2131 2750, Fax: +001 418 656 7411
mir-abolfazl.mostafavi@scg.ulaval.ca

Abstract
New advances in spatial information technology have made available large amounts of spatial data from different sources, with different quality levels. An important challenge for geographical information science today is how to integrate these data in order to respond to the new and emerging needs of society for higher spatial data quality. Modern, personalized applications of geospatial data require efficient, interactive, on-the-fly data integration. Semantic similarity assessment plays a very important role in ontology and spatial data integration. This paper reviews different methods for semantic similarity assessment and proposes a new logic-based method to establish the necessary links between different ontologies. These links are then used to create a new ontology that serves as a basis for the integration of the spatial databases. For this study, we used the national topographic database of Canada and the topographic database of the province of Quebec, both of which cover the same geographical area. Most of the features in the databases are the same, but differences occur in the definitions of concepts, categories, classification, granularity and resolution, spatial relations, and metric and topological constraints. In this experiment, the ontologies of the databases were formalised and represented in a knowledge base; a matching process was then applied between the two ontologies, and the results were analysed in order to evaluate the proposed method for similarity assessment of concepts. Finally, further investigations are proposed to take into account the ontology of the users in the integration process, in order to guarantee the external quality of the integrated databases.

Keywords: similarity assessment, Ontology, data quality, data integration, semantic

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Mostafavi2006accuracy.pdf666.9 KB

Sensitivity analysis and uncertainty analysis for vector geographical applications

Olivier Bonin 1
1 IGN / COGIT lab
2-4 avenue Pasteur, F-94165 Saint-Mandé CEDEX
Tel.: + 331 43 98 84 09
olivier.bonin@ign.fr

Abstract
The problem of assessing the quality of results from geographical applications can be tackled with the help of sensitivity analysis and uncertainty analysis techniques. Sensitivity analysis studies the relationships between the output and the inputs of an application. Uncertainty analysis aims at quantifying the overall uncertainty associated with the response of an application. Both techniques rely on the description of a geographical application by a numerical model. Uncertainty and sensitivity analyses are very well established techniques, with applications in many fields (nuclear, environmental, etc.). They rely on an array of techniques including Monte Carlo simulations and ANOVA. However, they can only be performed when the output and input variables are scalar variables. This restriction can be handled when the geographical data are in raster format, but has many consequences for vector data. We discuss this point and propose methods to overcome these limitations in the case of vector and raster data. Our methods rely on statistical modelling of uncertainty in the input variables that takes correlation into account, on the definition of summaries of the results in non-Euclidean spaces, and on simple meta-models linking the summaries of the output to the input variables. We also investigate the use of functional analysis and high-dimensional model representations.

Keywords: uncertainty analysis, sensitivity analysis, statistical modeling, error propagation

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Bonin2006accuracy.pdf667.72 KB

Sensitivity analysis on spatial models: a new approach

Linda Lilburne 1, Debora Gatelli 2 and Stefano Tarantola 2
1 Landcare Research
Private Bag 69, Lincoln, Canterbury 8152, New Zealand
Tel.: +64 3 325 6700; Fax +64 3 325 2418
lilburnel@landcareresearch.co.nz
2 Econometric and Applied Statistics Unit, Joint Research Centre of the European Commission
Via E. Fermi 1, 21020 – Ispra,  Italy
Tel.: +39 0332 789 928; Fax: +39 0332 785 733
debora.gatelli@jrc.it; stefano.tarantola@jrc.it

Abstract
Sensitivity analysis involves determining the contribution of individual input factors to the uncertainty in model predictions. The most common approach to sensitivity analysis of spatial models is Monte Carlo simulation. There are a number of techniques for calculating sensitivity indices from the Monte Carlo simulations, some more effective or efficient than others. These techniques are summarised along with their limitations. A new technique for undertaking a spatial sensitivity analysis, based on the Sobol' method, is proposed and tested. This method is global, variance-based and model-free. The technique is illustrated with two simple test models.
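A minimal sketch of the variance-based first-order Sobol' estimator that this line of work builds on. The Saltelli-style Monte Carlo estimator shown here is a standard textbook form, not necessarily the authors' exact implementation, and the toy model is an assumption chosen so the analytical indices are known.

```python
import numpy as np

rng = np.random.default_rng(1)

def sobol_first_order(model, k, n=100_000):
    """First-order Sobol' indices via a Saltelli-style Monte Carlo estimator:
    S_i = E[ f(B) * (f(A_B^i) - f(A)) ] / Var(Y), with A, B independent
    sample matrices and A_B^i equal to A with column i taken from B."""
    A = rng.uniform(0, 1, size=(n, k))
    B = rng.uniform(0, 1, size=(n, k))
    fA, fB = model(A), model(B)
    var_y = np.var(np.concatenate([fA, fB]))
    indices = np.empty(k)
    for i in range(k):
        ABi = A.copy()
        ABi[:, i] = B[:, i]           # resample only factor i
        indices[i] = np.mean(fB * (model(ABi) - fA)) / var_y
    return indices

# Toy model dominated by its first input factor. For Y = 2*X1 + X2 with
# X1, X2 ~ U(0,1), the analytical indices are S1 = 0.8, S2 = 0.2.
model = lambda x: 2.0 * x[:, 0] + x[:, 1]
indices = sobol_first_order(model, k=2)
```

The estimator is model-free in the sense the abstract uses: it needs only model evaluations, not derivatives or a closed form.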

Keywords: sensitivity analysis, uncertainty analysis, simulation, Monte Carlo

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Lilburne2006accuracy.pdf686.59 KB

Simulating space-time uncertainty in continental-scale gridded precipitation fields for agrometeorological modelling

Allard de Wit and Sytze de Bruin
Centre for geo-information, Wageningen UR
P.O. Box 47, 6700 AA, Wageningen
Tel.: + 0031 317 474761; Fax: + 0031 317 419000
Allard.dewit@wur.nl; Sytze.debruin@wur.nl

Abstract
Previous analyses of the effects of uncertainty in precipitation fields on the output of the EU Crop Growth Monitoring System (CGMS) demonstrated that the influence on simulated crop yield was limited at the national scale, but considerable at local and regional scales. We aim to propagate the uncertainty due to precipitation through the crop model by Monte Carlo sampling of the precipitation field. We use an error model fitted to a highly accurate precipitation dataset (ELDAS) which was available for the year 2000. Our error model consists of two components. The first is an additive component generating precipitation residuals over the entire spatial domain. The residuals are generated by quantile-based back-transformation of standard Gaussian fields using a set of histograms for different CGMS precipitation bins. The second component is multiplicative and generates binary rain/no-rain events at locations where the CGMS precipitation records report nil precipitation. Our results demonstrate that the model generates realistic patterns of precipitation and reproduces the histograms of the reference precipitation dataset well. A remaining problem, due to our model choice, is the inability to model prolonged dry spells. The precipitation realizations were used as input to a crop growth model. The first results indicate that the uncertainty in precipitation is sufficient to sustain divergence in the soil moisture ensemble, but not in the leaf area index ensemble.

Keywords: precipitation, error model, multiple realisations, Gaussian field, crop model

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
deWit2006accuracy.pdf1.33 MB

Some Basic Mathematical Constraints for the Geometric Conflation Problem

María Luisa Casado
Department of Surveying Engineering and Cartography
Universidad Politécnica de Madrid
Campus Sur, Autovía de Valencia km 7, E-28031 Madrid (Spain)
ml.casado@upm.es

Abstract
The geometric conflation problem is particularly pressing nowadays. New demands and challenges arising from SDIs (Spatial Data Infrastructures) make the coordination of diverse information ever more necessary in order to answer new questions. As the starting point of a PAI (Program of Accuracy Improvement), the UPM (Polytechnic University of Madrid) has signed a collaboration agreement with the IGN (National Geographic Institute) to research harmonization procedures between its 1:25 000 series, the 1:1000 cadastral cartography and the 1:1000 street guide information collected by the INE (National Statistical Institute). This paper does not tackle the attribute issue but the geometric one, so only the first two cartographies mentioned have been considered. A number of situations found in this geometric conflation project are analysed in this work, and the different types of geometric transformations that can correct these defects are discussed. We also collect and formalize some basic mathematical constraints derived from cartographic rules, such as the preservation of topological relationships.

Keywords: geometric conflation, mathematical constraints, cartographic restrictions

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Casado2006accuracy.pdf635.84 KB

Spatial sampling design for prediction taking account of uncertain covariance structure

Jürgen Pilz 1 and Gunter Spöck 2 
1 Universität Klagenfurt, Institut für Mathematik
Universitätsstraße 65-67, 9020 Klagenfurt, Austria
Tel.: + 0043 463 2700-3113; Fax: + 0043 463 2700-3199
juergen.pilz@uni-klu.ac.at
2 Universität Klagenfurt, Institut für Mathematik
Universitätsstraße 65-67, 9020 Klagenfurt, Austria
Tel.: + 0043 463 2700-3125; Fax: + 0043 463 2700-3199
gunter.spoeck@uni-klu.ac.at
 

Abstract
This paper presents a model-based approach to the problem of the optimal choice of a spatial design in the presence of uncertainty about the distribution of the observations, a topic which has received only little attention in the geostatistics literature so far. In spatial sampling one usually starts with an initial design to estimate the covariance function; such a design is non-model-based and chosen e.g. according to principles of deterministic sampling, cluster sampling, or simple random and stratified sampling. The basic difficulty for rigorous model-based approaches to spatial sampling is that the spatial correlatedness of the observations leads to analytically intractable design criteria, whereas for linear regression models with uncorrelated observations one obtains tractable objective functions which can be optimized using a rich and fully developed methodology from convex analysis. After briefly reviewing the "classical" experimental design theory approach to regression models with uncorrelated observations, we show how this approach can be extended to spatial sampling with correlated errors, using an approximation of the random field by linear regression models with random coefficients. In this context, we also present a useful algorithm for the iterative generation of an optimal design for spatial prediction. The results are illustrated by means of a real data set of Cs-137 measurements.

Keywords: spatial model-based design, experimental design theory, uncertain covariance function, Bayes kriging

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Pilz2006accuracy.pdf276.97 KB

Spatial simulation of forest using Bayesian state-space models and remotely sensed data

Jörgen Wallerman 1, Coomaren P. Vencatasawmy 2 and Lennart Bondesson 3
1 Department of Forest Resource Management and Geomatics,
Swedish University of Agricultural Sciences,
SE-901 83 Umeå, Sweden
Tel.: + 46 (0)90 786 8570; Fax: + 46 (0)90 778 116
jorgen.wallerman@resgeom.slu.se
2 Aviva,  Aviva plc, St Helen's, 1 Undershaft,
London EC3P  3DQ
Tel: +44 (0)20 7662 7157; Fax: +44 (0)20 7662 4122
coomaren.vencatasawmy@aviva.com
3 Department of Mathematics and Mathematical Statistics, 
Umeå University, 
SE-901 87 Umeå, Sweden
Tel.: + 46 (90) 786 6529; Fax: + 46 (0)90 786 7658
lennart.bondesson@matstat.umu.se 

Abstract
Utilizing the spatial properties of forest attributes may provide increased accuracy in forestry remote sensing applications compared to common non-spatial methods. Spatial models are often complex, though, and inference for them is usually difficult. Such problems may be addressed using Bayesian models estimated with computer-intensive Markov chain Monte Carlo (MCMC) stochastic simulation methods. This article presents a Bayesian state-space model of forest attributes using field measurements and remote sensing data. The model is defined on a spatial lattice where each cell corresponds to the spatial extent of one raster cell in the remote sensing data. The Conditional Autoregressive (CAR) model is used as the prior distribution, since it is well suited to simulation using the Gibbs sampler; the parameters of the CAR model were estimated using a variogram model. Inference is provided by the Gibbs sampler, an MCMC method which allows inference for very complex models: estimation is made by simulating from the posterior distribution (conditional on the available field measurements and remote sensing data). A case study is presented in which the model is applied to produce a 5917 ha raster map of forest stem volume using field measurements and Landsat 5 TM data in northern Sweden. This corresponds to the estimation of a parameter vector of size exceeding 360 000. The mapping accuracy was assessed using sampled field plots and field-measured forest stands not utilized in the model. Simulations using the Gibbs sampler converged to reasonable realizations of the forest, in spite of the very large size of the estimated parameter vector. The overall mapping accuracy was low, however: a root mean square error (RMSE) of 76.1% of the mean for raster cell (25 m by 25 m) predictions, and 60.5% for stand (0.5-22.1 ha) predictions.
The methodology shows substantial potential, although further development of the model would clearly be beneficial.
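The Gibbs-sampling idea behind the model above can be sketched on a toy lattice. This is a heavily simplified, hypothetical illustration: an intrinsic CAR (neighbour-average) conditional with a 4-neighbour scheme, a small grid, and made-up plot values, not the authors' calibrated state-space model.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy lattice "stem volume" field (m3/ha) with a few cells clamped to
# hypothetical field-plot measurements; all numbers are illustrative.
n = 20
field = np.full((n, n), 100.0)
obs = {(3, 4): 140.0, (10, 15): 60.0, (16, 5): 110.0}
for (i, j), v in obs.items():
    field[i, j] = v
tau2 = 25.0                                   # conditional variance parameter

def gibbs_sweep(field):
    """One in-place Gibbs sweep: each free cell is drawn from
    N(mean of its 4-neighbourhood, tau2 / number of neighbours)."""
    for i in range(n):
        for j in range(n):
            if (i, j) in obs:
                continue                      # conditioning data stay fixed
            nb = [field[a, b]
                  for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                  if 0 <= a < n and 0 <= b < n]
            field[i, j] = rng.normal(np.mean(nb), np.sqrt(tau2 / len(nb)))
    return field

for _ in range(200):                          # burn-in; one realization remains
    gibbs_sweep(field)
```

After burn-in, the lattice is one realization from the conditional distribution; repeating the procedure yields the ensemble of realizations from which maps and accuracy measures are derived.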

Keywords: Markov-Chain Monte Carlo, Gibbs Sampler, Bayesian models, forestry, Remote Sensing

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Wallerman2006accuracy.pdf807.37 KB

Spatial uncertainty analysis of pesticide leaching using a metamodel of PEARL

Ton van der Linden 1, Aaldrik Tiktak 2, Gerard Heuvelink 3 and Toon Leijnse 4
1 National Institute for Public Health and the Environment
PO Box 1, 3720 BA Bilthoven, The Netherlands
Tel.: + 0031 30 274 3342; Fax: + 0031 30 274 4413
ton.van.der.linden@rivm.nl
2 Netherlands Environmental Assessment Agency
PO Box 303, 3720 AH Bilthoven, The Netherlands 
Tel.: + 0031 30 274 3343; Fax: + 0031 30 274 4419
aaldrik.tiktak@mnp.nl
3 Soil Science Centre, Wageningen University and Research Centre
PO Box 47, 6700 AA Wageningen, The Netherlands.
Tel.: + 0031 30 274 3343; Fax: + 0031 30 274 4419
gerard.heuvelink@wur.nl
4 TNO Environment and Geosciences
PO Box 80015, 3508 TA Utrecht, The Netherlands
Tel.: + 0031 30 274 3343; Fax: + 0031 30 274 4419
toon.leijnse@tno.nl

Abstract
Spatially distributed modelling has recently been introduced into pesticide registration procedures and policy evaluations. The assessments are computationally demanding, which is probably the reason why little attention has so far been paid to uncertainty analyses of leaching concentrations at the regional scale. Recently, a metamodel of pesticide leaching was developed and calibrated with results of the spatially distributed model EuroPEARL (Tiktak et al. 2004; Tiktak et al., acc.). The metamodel is based on an analytical expression for the transport of solutes through porous media and calculates the fraction of the applied amounts which leaches to a reference depth in the soil profile. Important inputs to the metamodel are the half-life and sorption constant of a pesticide, the soil organic carbon content and soil moisture at field capacity, as well as long-term averages of temperature and precipitation surplus. The metamodel explained more than 90% of the variance in the simulation results and thus can be considered a good approximation of the original model. As opposed to the original model, the metamodel is computationally very inexpensive, which makes it suitable for calculations at high resolutions (i.e. small grid sizes) and for uncertainty analyses using Monte Carlo techniques. Using the metamodel, this paper investigates the uncertainty in pesticide leaching assessments due to uncertainty in the half-life and sorption constant of pesticides and to uncertainty and spatial variability in soil parameters and climatic conditions. The magnitude of the uncertainty in annual pesticide leaching is computed for the temperate region of the EU, and the relative contributions of the individual uncertain inputs are compared. Possible implications for policy evaluations are discussed.

Keywords: groundwater, spatial correlation, error propagation, Monte Carlo simulation

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
vanderLinden2006accuracy.pdf728.69 KB

Statistical efficiency of model-informed geographic sampling designs

Daniel A. Griffith
School of Social Sciences, University of Texas @ Dallas, Richardson, Texas, USA, P.O. Box 830688, GR31, 75083-0688
Tel. : + 001 972 883 4950; Fax : + 001 883 6297
dagriffith@utdallas.edu

Abstract

As the spatial autocorrelation latent in georeferenced data increases, the amount of duplicate information contained in those data also increases, whether an entire population or some type of random sample drawn from that population is being analyzed. Conventional power and sample size calculation formulae therefore give incorrect sample sizes. Griffith (2005) exploits this context to formulate equations for estimating the sample size needed to obtain a predetermined level of precision in an analysis of georeferenced data under a tessellation stratified random sampling design, labelling this approach model-informed, since a model of the latent spatial autocorrelation is required. Spatial autocorrelation is accounted for in these power and sample size equations using the following spatial statistical model specifications: (1) simultaneous autoregressive; (2) geostatistical semivariogram; and (3) spatial filter. Sample size results are somewhat sensitive to which model is employed to capture spatial autocorrelation effects. This paper addresses issues of efficiency associated with each of these models in the presence of spatial autocorrelation effects. It summarizes results from a set of simulation experiments, following the experimental design guidelines of Overton and Stehman (1993), that explore continuous linear, quadratic and sinusoidal response surfaces.
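As a rough numerical companion to the point that autocorrelated data carry duplicate information, the generic "design effect" adjustment below shows how an average pairwise correlation shrinks the effective sample size. This is a standard textbook adjustment, not Griffith's (2005) model-specific equations, and `rho_bar` is an illustrative parameter.

```python
import math

def effective_sample_size(n, rho_bar):
    """Effective number of independent observations for n observations with
    mean pairwise correlation rho_bar (generic design-effect adjustment):
    n_eff = n / (1 + (n - 1) * rho_bar)."""
    return n / (1.0 + (n - 1.0) * rho_bar)

def required_n(n_eff_target, rho_bar):
    """Invert the adjustment: sample size needed to reach a target n_eff."""
    denom = 1.0 - n_eff_target * rho_bar
    if denom <= 0:
        return math.inf         # target unreachable under this correlation
    return n_eff_target * (1.0 - rho_bar) / denom

# Even mild autocorrelation is costly: 100 observations with rho_bar = 0.05
# carry the information of only about 17 independent observations.
n_eff = effective_sample_size(100, 0.05)
```

This is the qualitative mechanism behind the abstract's claim that conventional sample-size formulae, which assume independence, understate the sample needed for georeferenced data.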

Keywords: autoregressive model, geostatistical model, spatial autocorrelation, spatial filter model, spatial sampling

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Griffith2006accuracy.pdf246.42 KB

The critical role of geographic information system (GIS) and remote sensing (RS) in forest site classification and mapping

Alkan Günlü 1, Ali Ihsan Kadıoğulları 2, E. Zeki Başkent 3, Deniz Güney 4, Gürcan Buyuksalih 5
1 Karadeniz Technical University, Faculty of Forestry, 61080, Trabzon, Turkey
Tel: +90 462 377 37 34, Fax: +90 462 325 74 99
alkan61@ktu.edu.tr
2 Karadeniz Technical University, Faculty of Forestry, 61080, Trabzon, Turkey
Tel: +90 462 377 28 46, Fax: +90 462 325 74 99
alikadi@ktu.edu.tr
3 Karadeniz Technical University, Faculty of Forestry, 61080, Trabzon, Turkey
Tel: +90 462 377 37 34, Fax: +90 462 325 74 99
baskent@ktu.edu.tr
4 Karadeniz Technical University, Faculty of Forestry, 61080, Trabzon, Turkey
Tel: +90 462 377 28 36, Fax: +90 462 325 74 99
d_guney61@ktu.edu.tr
5 Zonguldak Karaelmas University, Department of Geodesy & Photogrammetry 
Faculty of Engineering, 67100, Zonguldak, Turkey
Tel: +90 372 257 59 20, 
gurcan61@hotmail.com

Abstract
It is essential that forest sites be characterized and classified in order to realize effective planning and implementation of sustainable forest management activities and regulations. Developing, as well as conducting, harvesting activities on the ground in the absence of forest site information is highly ineffective, which calls for recognizing the potential production capacity and conditions of forest sites. Forest site classification has been a major problem in Turkish forestry. In Turkey, the productivity of forest sites has been determined for the sole objective of wood production in forest management, using the dominant height at a reference age. However, forest sites ought to be determined on the basis of various factors such as landscape structure, climatic profile, biotic features and soil characteristics, as typical site parameters. This direct process is highly time-consuming, expensive and hard to conduct, necessitating the use of information technologies such as Geographic Information Systems (GIS) and Remote Sensing (RS). This research was therefore designed to demonstrate the integrated use of GIS and RS in characterizing, classifying and mapping forest sites on Genya Mountain, located in the central Management District of the Artvin State Forest Enterprise of Turkey. The ground measurement data, used as ground truth, were correlated with a supervised classification of a Landsat ETM (2001) image.

Keywords: forest site classification, geographic information system, remote sensing

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Gunlu2006accuracy.pdf2.1 MB

The evaluation and comparison of thematic maps derived from remote sensing

Giles M. Foody
School of Geography, University of Southampton, Southampton, S017 1BJ, UK
Tel: +44 (0)2380 595 493; Fax: +44 (0)2380 593 295
g.m.foody@soton.ac.uk

Abstract
The accuracy of thematic maps derived from remote sensing is often viewed negatively. This reflects difficulties in mapping, but also issues connected with accuracy evaluation targets and assessment approaches. This paper focuses on the latter issues, suggesting that the widely used target accuracy of 85% may often be inappropriate and that the approach to accuracy assessment commonly adopted in remote sensing can be pessimistically biased. Problems are also encountered in the comparison of thematic maps, hindering research that seeks to increase classification accuracy, which is often based upon evaluations of the accuracy of maps derived from a set of classification algorithms. It is hoped that a greater awareness of the problems encountered in accuracy assessment and comparison may help ensure that perceptions of classification accuracy are realistic and fair.
 

Keywords: accuracy, classification, difference, target, imagery

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Foody2006accuracy.pdf269.75 KB

Topological relations using two models of uncertainty for lines

Rui Reis 1, Max Egenhofer 2 and João Matos 3

1 Centro para a Exploração e Gestão de Informação Geográfica, Instituto  Geográfico Português, Rua
Artilharia Um, 107, 1099-052 Lisboa, Portugal. 
rui.reis@igeo.pt
2 National Center for Geographic Information and Analysis, Department of Spatial Information Science and Engineering and Department of Computer Science, University of Maine, Orono, ME 04469-5711, USA. 
max@spatial.maine.edu
3 Departamento de Engenharia Civil e Arquitectura, Instituto Superior Técnico, Av. Rovisco Pais, 1049-001 Lisboa, Portugal. 
jmatos@civil.ist.utl.pt

Abstract
This paper presents two models for determining the uncertainty in the topological relations between two lines. One model considers the uncertainty associated with the position of the entire line, while the other captures the uncertainty associated with the position of its nodes. The first case considers a region of uncertainty surrounding the entire line, called a broad line, whereas the second considers two regions of uncertainty at the end points of the line, called a line with a broad boundary. The 9-intersection, as a generic model for binary topological relations, identifies 33 different relations for lines without uncertainty. We found that for broad lines the number of distinguishable cases reduces to 5, while for lines with broad boundaries it extends to 77.

Keywords: modeling uncertainty, lines, topological relations, broad lines, lines with broad boundaries

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Reis2006accuracy.pdf733.58 KB

Towards a systems approach to the visualization of spatial uncertainty

Phaedon C. Kyriakidis
Department of Geography 
University of California Santa Barbara
Santa Barbara, CA 93106-4060, USA
Tel.: + 001 805 893 2266; Fax: + 001 805 893 3146
phaedon@geog.ucsb.edu

Abstract
Most existing approaches to spatial uncertainty visualization are concerned with the depiction of local or per-pixel uncertainty measures, such as standard errors of attribute prediction in a spatial interpolation setting or posterior probabilities of class occurrence in a classification setting. Most vision operations, however, are pattern-detection endeavors, which are by definition multi-pixel in nature. Consequently, per-pixel uncertainty measures cannot adequately characterize uncertainty in the outcomes of vision operations applied to maps. To overcome these limitations, a formal quantitative framework for the visualization of spatial uncertainty is advocated, building on an analogy with engineering systems. A system is a model of some aspect or process of the real world, often approximated by a set of mathematical equations, which is excited by a set of inputs to produce a set of outputs or model predictions. In a similar fashion, a map user can be viewed as a system: his or her visual perception and cognition are extremely complex operations that, via map analyses, lead to decisions and actions. In analogy with engineering systems, quantification of the impact of an uncertain input map on vision-related tasks requires that these tasks be applied to a set of alternative input maps, all of which are processed by the user to arrive at a set of possible analysis results. The proposed framework can thus be seen as a two-step data mining endeavor: (i) exploration of the attribute uncertainty model via stochastic simulation, by generating alternative synthetic attribute realizations, and (ii) exploration of the outputs of early-vision operations applied to this set of realizations, in meaningful ways that enable the user to distill the uncertainty in those outputs.

Keywords: spatial uncertainty assessment, geostatistical simulation, early-vision operations, image segmentation

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Kyriakidis2006accuracy.pdf (651.39 KB)

UGML: an extension of GML to provide support for geographic objects with uncertain boundaries

Ashley Morris 1 and Frederick E. Petry 2
1 DePaul University
School of Computer Science
243 S Wabash Ave, Chicago IL, 60604, USA
Tel.: 1 312 362 8252
ashley@ashleymorris.com
2 Naval Research Laboratory
Stennis Space Center, MS 39529, USA
fpetry@nrlssc.navy.mil

Abstract
The GML (Geography Markup Language) created by the Open Geospatial Consortium has emerged as the standard for spatial data interchange. Whether objects are to be stored in a relational database, an object-oriented database, or a file, they can be exchanged if they are encoded in the GML format. The designers of GML realized that by basing GML on XML (eXtensible Markup Language), users could extend the language to provide additional support for spatial objects that are not explicitly defined by the original GML specifications. In this paper, we discuss how GML can be extended (using the XML Schema format) to fully support objects with uncertain boundaries, using the framework outlined in our prior work (Morris 2003). The basis of our framework is that, by using fuzzy logic, we can provide for indeterminate boundaries for any geographic object. We may store a number of alpha-cuts (crisp subsets of a fuzzy set) for each spatial object (only one if the object has a crisp boundary), and each alpha-cut representation of that object will represent the boundary of that object with a certain degree of membership. The user may then define any number of alpha-cuts for each spatial object, depending upon the degree of precision they require. Our UGML (Uncertain GML) implements this multiple-alpha-cut framework within the GML specification.
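The multiple-alpha-cut idea can be sketched as follows. Note that the element and attribute names (`ugml:FuzzyFeature`, `ugml:AlphaCut`) are illustrative placeholders, not the schema proposed in the paper:

```python
import xml.etree.ElementTree as ET

# A fuzzy region stored as a set of alpha-cuts: each alpha level maps to
# a crisp boundary (here a polygon coordinate string). A crisp object
# would carry a single cut at alpha = 1.0. Geometry is invented.
lake = {
    1.0: "0,0 10,0 10,10 0,10 0,0",       # core: definitely lake
    0.5: "-2,-2 12,-2 12,12 -2,12 -2,-2", # wider extent: membership >= 0.5
}

def to_ugml(name, alpha_cuts):
    """Serialize the alpha-cuts as a UGML-style XML fragment, with one
    crisp boundary element per membership level."""
    feat = ET.Element("ugml:FuzzyFeature", {"name": name})
    for alpha in sorted(alpha_cuts, reverse=True):
        cut = ET.SubElement(feat, "ugml:AlphaCut", {"alpha": str(alpha)})
        ET.SubElement(cut, "gml:coordinates").text = alpha_cuts[alpha]
    return ET.tostring(feat, encoding="unicode")

fragment = to_ugml("lake", lake)
```

A consumer that ignores uncertainty can simply read the alpha = 1.0 cut and treat the object as crisp.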

Keywords: OpenGIS, fuzzy logic, uncertainty, GML

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Morris2006accuracy.pdf (625.42 KB)

Uncertainty and Risk Analysis in Hydrological Models for Land-use Management

Kim E. Lowell 1,2 and Kurt K. Benke 1
1, 2 Primary Industries Research Victoria
621 Burwood Highway, Knoxfield, Private Bag 15, Ferntree Gully DC, Victoria 3156 AUSTRALIA 
Tel: +61 3 9210 9205, Fax: +61 3 9800 3521 
Kim.Lowell@dpi.vic.gov.au; Kurt.Benke@dpi.vic.gov.au
1 CRC for Spatial Information, University of Melbourne
723 Swanston St., Ground floor, Carlton, VIC 3052 AUSTRALIA
Tel: +61 3 8344 9192
klowell@crcsi.com.au

Abstract
The characterisation of uncertainty and risk analysis are two research domains that have many common elements. Specifically, they are both based on an acknowledgment that decisions are never made using perfect knowledge and therefore always have some degree of associated risk or uncertainty. Research in uncertainty addresses this by seeking to understand the causes of uncertainty, and to quantify the magnitude of uncertainty associated with limited information. Risk analysis is more focussed upon making decisions in the face of uncertainty. In this paper, areas of overlap between uncertainty research and risk analysis are presented and discussed in the context of hydrological models. A number of fundamental terms in uncertainty and risk are defined, and ways of presenting uncertainty related to risk analysis are described.

Keywords: epistemic uncertainty, linguistic uncertainty, hydrology, models, risk analysis

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Lowell2006accuracy.pdf (668.05 KB)

Uncertainty characterization in remotely sensed land cover information

Jingxiong Zhang 1 and Jiabing Sun 2
1 School of Remote Sensing, Wuhan University
LIESMARS – Laboratory for Information Engineering in Surveying, Mapping and Remote Sensing
129 LuoYu Road, Wuhan 430079, China
Tel: +86 27 63187371; Fax: +86 27 68778086
jxzhang@whu.edu.cn
2 School of Remote Sensing, Wuhan University
129 LuoYu Road, Wuhan 430079, China

Abstract
Uncertainty characterization has become increasingly recognized as an integral component in thematic mapping based on remotely sensed imagery, and descriptors such as percent correctly classified pixels (PCC) and kappa coefficients of agreement have been devised as thematic accuracy metrics. However, such spatially averaged accuracy measures neither offer hints about spatial variation in misclassification, nor are they useful for quantifying error margins in derivatives, such as areal extents of different land cover types and land cover change statistics. These limitations originate from the fact that spatial dependency is not accommodated in conventional methods for error analysis. Geostatistics provides a sound framework for uncertainty characterization in land cover information. Methods for predicting and propagating misclassification will be developed on the basis of indicator samples and covariates, such as spectrally derived posterior probabilities. An experiment using simulated data sets was carried out to quantify error in land cover change derived from post-classification comparison. It was found that significant biases result from applying joint probability rules that assume temporal independence between misclassifications across time, thus confirming the need for stochastic simulation in error modeling. Further investigations are anticipated incorporating indicators and probabilistic data for mapping and propagating misclassification.
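The two spatially averaged descriptors the abstract starts from, PCC and the kappa coefficient, are both computed from an error (confusion) matrix; a minimal sketch with an illustrative two-class matrix:

```python
def pcc_and_kappa(matrix):
    """Percent correctly classified (PCC) and Cohen's kappa from a
    square error matrix: rows = reference classes, cols = classified."""
    n = sum(sum(row) for row in matrix)
    diag = sum(matrix[i][i] for i in range(len(matrix)))
    po = diag / n                                    # observed agreement = PCC
    pe = sum(sum(matrix[i]) * sum(r[i] for r in matrix)
             for i in range(len(matrix))) / n ** 2   # chance agreement
    return po, (po - pe) / (1 - pe)

# Invented example: two land cover classes, 100 reference pixels
pcc, kappa = pcc_and_kappa([[40, 10], [5, 45]])
# pcc ≈ 0.85, kappa ≈ 0.7
```

Note that both numbers are single map-wide summaries: nothing in them says where the 15 misclassified pixels sit, which is exactly the limitation the abstract addresses.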

Keywords: geostatistics, land cover change, misclassification, stochastic simulation

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Zhang2006accuracy.pdf (1.8 MB)

Use of historical flights for landslide monitoring

Javier Cardenal 1, Jorge Delgado 1, Emilio Mata 1, Alberto González 2 and Ignacio Olague 2
1 Department of Ingenieria Cartografica, Geodesica y Fotogrametria
University of Jaen. Campus Las Lagunillas, s/n. 23071-Jaen (Spain)
jcardena@ujaen.es; jdelgado@ujaen.es; emata@ujaen.es
2 Department of Earth Sciences and Condensed Matter Physics.
University of Cantabria. Avenida de Los Castros, s/n. 39005-Santander (Spain)
gonzalea@unican.es; olaguei@unican.es

Abstract
Landslides are among the most frequent deformations of the shallow layers of the Earth. These deformations have very important social and economic consequences, making landslides one of the most hazardous natural risks. For this reason, landslide knowledge and modelling are currently an active research line, aimed at establishing the susceptibility of a given zone to these processes. Digital photogrammetry can be used for monitoring landslides since it allows high-accuracy calculation of the spatial coordinates of points on unstable slopes and their surroundings. Present aerial digital photogrammetric techniques have proved effective in the susceptibility and hazard analysis of land instability processes at moderate cost. Digital Elevation Models (DEM) obtained by means of digital photogrammetry have high accuracy and precision and improve the results of mass-movement susceptibility models. This methodology has been applied in an area located in the internal valleys of the Cantabrian Range (Northern Spain). An interesting aspect of the project has been the processing of available historical flights in order to extend the historical record of present information. A total of 5 flights over the study areas were available from 1958 to 2001, with scales between 1:33,000 and 1:15,000 and with color and panchromatic films. The final quality of the available originals (obtaining negatives in good condition for scanning was really difficult) and the available information (in some cases there was no information about the geometric camera parameters) led to the selection of two flights: a 1:20,000 scale panchromatic flight flown in 1970 and a 1:15,000 scale color flight flown in 1988. DEM accuracy was better than 1.5 m and 2 m in XY and Z, respectively.
In these cases, comparison with present photogrammetric flights (larger-scale 1:10,000 and 1:5,000 ad hoc flights for the project) has allowed monitoring of the temporal evolution of landslide crowns (expressed as annual rates of displacement). Other historical flights have also been processed to assess the metric capabilities of old aerial paper-print photographs when important information about the flight project is lacking. All flights have been processed with a digital photogrammetric workstation (DPW) running Leica Photogrammetry Suite™ (LPS).
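The annual displacement rates mentioned above reduce to a simple computation once a homologous crown point has been measured in two epochs; a minimal sketch with invented coordinates:

```python
from math import hypot

def annual_rate(p_old, p_new, years):
    """Horizontal displacement of a landslide crown point between two
    photogrammetric epochs, expressed as an annual rate in m/yr.
    Inputs are (x, y) coordinates in metres; values are illustrative."""
    dx, dy = p_new[0] - p_old[0], p_new[1] - p_old[1]
    return hypot(dx, dy) / years

# Same crown point measured on the 1970 and 1988 flights (invented)
rate = annual_rate((1000.0, 2000.0), (1003.6, 2004.8), years=18.0)
# 6 m over 18 years → rate ≈ 0.33 m/yr
```

Since the DEM accuracy quoted above is on the order of 1.5 m in XY, displacements of this magnitude sit near the detection limit, which is why long historical baselines are valuable.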

Keywords: digital photogrammetry, historical flights, deformation monitoring, landslides 

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Cardenal2006accuracy.pdf (1.52 MB)

Using spatially constrained clustering in land cover mapping

Fernanda Néry 1,2, Luís de Sousa 1, Pedro Marrecas 2, Ricardo Sousa 1 and João Matos 1
1 Department of Civil Engineering and Architecture, IST, Technical University of Lisbon, Av. Rovisco Pais, 1049-001 Lisboa, Portugal
Tel.: + 001 555 832 1155; Fax: + 001 555 832 1156
nery@ist.utl.pt; rts@civil.ist.utl.pt; lads@ist.utl.pt; jmatos@civil.ist.utl.pt
2 Instituto Geográfico Português
Rua Artilharia Um, n.º 107, 1099-052 Lisboa, Portugal
Tel.: + 004 555 874 414; Fax: + 004 555 874 414
fernanda.nery@igeo.pt, pedro.marrecas@igeo.pt

Abstract
Traditional land cover mapping imposes a predefined taxonomy with mutually exclusive hard categories upon a surface which can be perceived as continuous. Boundary uncertainty and heterogeneity of the resulting regions are inherent to such an approach, but can nevertheless be reduced through the definition and application of explicit criteria. Following a design-based evaluation of the positional and attribute uncertainty in a photo-interpreted land cover map, categories were identified that do not attain the predefined accuracy levels. As expected, those were categories representing land use instead of land cover (e.g. sport and leisure facilities; green urban areas) and heterogeneous categories. This paper focuses on the latter case. Heterogeneous categories (e.g. “complex cultivation patterns” in the CORINE Land Cover nomenclature) are a result of limitations in the support of either the input information (e.g. spectral mixture due to pixel size in remote sensing applications) or the specified output information (e.g. the minimum mapping unit [MMU] of vector polygon maps). Information regarding the degree of heterogeneity is generally not provided to the final user. Improvement of data accuracy can be achieved using ancillary information and/or spatially constrained clustering algorithms. Spatial constraints are built using connectivity criteria (which objects are connected?) and distance criteria (how far apart are two connected objects?). Connectivity can be defined geometrically or topologically. The resulting graph structure, or its equivalent binary incidence matrix, can be used directly in the clustering algorithm or be weighted using distance functions. This allows further flexibility and the integration of spatial and semantic constraints. Simple Euclidean distance can be used, or any empirical or heuristic measure of similarity between objects or the categories they originally belong to.
As an output of the clustering process, a set of heterogeneity measures is obtained for each object, which can be used to evaluate the results against the original visual interpretation.
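The connectivity-constrained clustering described above can be sketched as a toy agglomerative procedure in which only adjacent objects may merge; the objects, attribute values and adjacency relation are all invented for illustration:

```python
# One attribute value per map object, and the pairs of adjacent objects
# (the connectivity graph). Both are illustrative.
values = {"a": 1.0, "b": 1.2, "c": 5.0, "d": 5.3, "e": 9.0}
adjacency = {("a", "b"), ("b", "c"), ("c", "d"), ("d", "e")}

def connected(c1, c2):
    """Two clusters are connected if any pair of members is adjacent."""
    return any((i, j) in adjacency or (j, i) in adjacency
               for i in c1 for j in c2)

def mean(cluster):
    return sum(values[i] for i in cluster) / len(cluster)

def constrained_cluster(k):
    clusters = [{i} for i in values]  # start with one cluster per object
    while len(clusters) > k:
        candidates = [(abs(mean(c1) - mean(c2)), c1, c2)
                      for n, c1 in enumerate(clusters)
                      for c2 in clusters[n + 1:]
                      if connected(c1, c2)]       # the spatial constraint
        _, c1, c2 = min(candidates, key=lambda t: t[0])  # most similar pair
        clusters = [c for c in clusters if c is not c1 and c is not c2]
        clusters.append(c1 | c2)
    return clusters

result = constrained_cluster(3)  # merges {a,b} and {c,d}, leaves {e} alone
```

The spread of attribute values inside each final cluster is one of the per-object heterogeneity measures the abstract mentions.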

Keywords: spatial constraints, clustering, classification, land cover

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Nery2006accuracy.pdf (1.77 MB)

Weighting fidelity versus classified area in remote sensing classifications from a pixel and a polygon perspective

Pere Serra 1, Gerard Moré 2 and Xavier Pons 1,2
1 Department of Geography 
Edifici B, Campus de la Universitat Autònoma de Barcelona, 08193-Cerdanyola del Vallès (SPAIN).
Tel.: 00 34 93 581 3273; Fax: 00 34 93 581 2001
pere.serra@uab.cat
2 Center for Ecological Research and Forestry Applications (CREAF)
Edifici C, Universitat Autònoma de Barcelona, 08193-Cerdanyola del Vallès (SPAIN).
Tel.: 00 34 93 581 1312; Fax: 00 34 93 581 4151
g.more@creaf.uab.cat; xavier.pons@uab.cat

Abstract
This paper summarizes the consequences, for the area classified and for thematic accuracy, of being more or less conservative in a hybrid classifier. The most important parameter of this classification is fidelity (the threshold proportion at which a spectral class is accepted as part of a thematic category). Two options have been tested: the first less conservative, the second more conservative. These fidelities have been applied to ten Mediterranean crops and tested using error matrices. Thematic accuracies were quantified following the classical approach (number of pixels correctly classified), a polygon approach (number of polygons correctly classified) and, finally, an area approach (area correctly classified). Results showed that the most restrictive fidelity produces less classified area but higher thematic accuracy when unclassified pixels are not included in the quantification of accuracy. This occurred for all the options (pixel, polygon and area), although it did not affect all the crops equally.
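The fidelity parameter can be sketched as a simple acceptance rule; the composition values and threshold settings below are invented, not the paper's:

```python
def assign_category(composition, fidelity):
    """Assign a spectral class to the thematic category that dominates its
    training composition, but only if that dominant proportion reaches the
    fidelity threshold; otherwise leave the spectral class unclassified."""
    category, proportion = max(composition.items(), key=lambda kv: kv[1])
    return category if proportion >= fidelity else None

# Share of training pixels per thematic category for one spectral class
composition = {"wheat": 0.72, "barley": 0.20, "fallow": 0.08}

lenient = assign_category(composition, fidelity=0.60)  # accepted: "wheat"
strict = assign_category(composition, fidelity=0.80)   # rejected: None
```

Raising the fidelity threshold leaves more spectral classes (and hence more area) unclassified, which is the area-versus-accuracy trade-off the abstract quantifies.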

Keywords: hybrid classifier, fidelity, producer’s accuracy, Mediterranean crops, area classified

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

Attachment: Serra2006accuracy.pdf (738.41 KB)