Accuracy 2008 Conference

8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences. Edited by J. Zhang and M. F. Goodchild

To order the printed hardcover proceedings, please contact the publisher, World Academic Press, at publishing@WAU.org.uk. A number of papers have been selected for publication in the International Journal of Remote Sensing special issue "Uncertainty in Remotely Sensed Information, Spatial Analysis, and Process Modeling".

 
Accuracy 2008 Proceedings
ISBN: 1-84626-170-8
 

 

Symposium chairs:
Hon. Chairs: Jingnan Liu (Wuhan University, CN), Xiaowen Li (Institute of Remote Sensing Applications, CN)
Chairs: Deren Li (Wuhan University, CN), Michael Goodchild (University of California, US)
Co-chairs: Jianya Gong (Wuhan University, CN), Chenghu Zhou (Institute of Geographical Science, CN)

Organizing Committee:
Jingxiong Zhang (Wuhan University, CN)
Secretary: Yong Ge (Wuhan University, CN)

Sponsors: Chinese Society for Geodesy, Photogrammetry and Cartography (Specialty Committee on Photogrammetry and Remote Sensing), China. The World Academic Union funded publication of the proceedings.

Accuracy 2008 group photo


 

PLENARY LECTURES

GEOSTATISTICS

ERROR, INFORMATION, AND SPATIAL CORRELATION

VAGUENESS AND UNCERTAINTY

ERROR PROPAGATION

UNCERTAINTY VISUALIZATION

VALIDATION IN REMOTE SENSING

UNCERTAINTY IN SPATIAL MODELLING

SCALE

DATA ASSIMILATION

GEO-DATA AND ONTOLOGY

POSITIONAL ACCURACY

ACCURACY IN CLASSIFICATION

ACCURACY IN DEMs

POSTER SESSION

NOTE: The papers listed above have not been peer-reviewed. Please also read the website disclaimer. To order the printed hardcover proceedings, please contact the publisher, World Academic Press, at publishing@WAU.org.uk. For corrections and errata, please contact the proceedings editor: Jingxiong Zhang (Wuhan University, CN).

Attachment: Accuracy2008_call_for_Papers.pdf (294.24 KB)

Uncertainty in Spatial Information, Analysis, and Applications: Chinese Perspectives

Deren Li
State Key Lab for Information Engineering in Surveying, Mapping and Remote Sensing Wuhan University, 129 Luoyu Road, Wuhan 430079, China

Abstract. This paper reviews past research by Chinese researchers on error analysis in geomatic engineering. A trend towards uncertainty characterization in geoinformation, analysis, and applications is now clearly visible in China. Future developments will be seen not only in theory and technology but also in applications that contribute to the national economy and social welfare, which will in turn strengthen the international standing of the Chinese spatial-uncertainty research community.

Keywords: geomatic engineering, error analysis, uncertainty, geostatistics, validation, indeterminate objects 

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: DerenLi2008accuracy.pdf (250.82 KB)

Spatial Accuracy 2.0

Michael F. Goodchild
University of California, Santa Barbara

Abstract. Research on spatial accuracy assessment occurs within a broader context that provides its motivation. That broader context is dynamic, and has been changing at an accelerating rate. The concept of the Geospatial Web imagines a world of distributed, interoperable, georeferenced information in which it is possible to know where everything of importance is located in real time. It assumes an ability to conflate that is far beyond today’s capabilities. Web 2.0 describes a substantial involvement of the user in creating the content of the Web, and has particular relevance to geospatial information. Metadata 2.0 shifts the onus for metadata production to the user, and addresses some of the growing issues surrounding existing standards. There is a growing need to address the accuracy assessment of the vast quantities of geospatial data being contributed by individual Web users.

Keywords: Geospatial Web, user-generated content, volunteered geographic information, Web 2.0, metadata

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: Goodchild2008accuracy.pdf (302.96 KB)

Geostatistics: What’s Hot, What’s Not, and Other Food for Thought

Carol A. Gotway Crawford 1 + and Linda J. Young 2
1 Centers for Disease Control and Prevention, Atlanta, GA, USA
2 Department of Statistics, University of Florida, Gainesville, FL USA

Abstract. The field of geostatistics has evolved tremendously since it was invented for use in the mining industry. Today, geostatistics is used in a variety of disciplines including agriculture and natural resources, environmental science, and, most recently, geography and public health. This advancement into new disciplines has facilitated the development of many new geostatistical methods and techniques. In this paper, we review several of the areas that have seen significant advances within the field of geostatistics, and those that deserve more attention by geostatisticians. The list of topics identified here is intended to be provocative, as we believe questioning leads to the scientific and practical advancement of a discipline.

Keywords: support, non-Euclidean distance, convolutions, geospatial workforce, geoinformatics

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: Gotway2008accuracy.pdf (250.64 KB)

A Method to Improve the Accuracy of Remote Sensing Data Classification by Exploiting the Multi-Scale Properties in the Scene

Yanchen Bo 1, 2
1 State Key Laboratory of Remote Sensing Science, Jointly Sponsored by Beijing Normal University and the Institute of Remote Sensing Applications, CAS, Beijing, China
2 Research Center for Remote Sensing and GIS, School of Geography, Beijing Key Laboratory for Remote Sensing of Environment and Digital Cities, Beijing Normal University, Beijing, 100875, China

Abstract. Land use mapping is one of the major applications of remote sensing. While most studies focus on advanced thematic classification algorithms for land use mapping, the scale factor in remote sensing data classification has received less attention. Previous studies have shown that remotely sensed data used for land use classification have multi-scale characteristics: some classes are most accurately classified at finer resolutions and others at coarser ones. It is therefore possible to improve the overall classification accuracy by mapping different land use classes at different scales. This paper presents a framework for improving land use classification accuracy by exploiting the multi-scale properties of remotely sensed data. First, the remotely sensed data at the original fine resolution are up-scaled to several coarser resolutions. Second, the up-scaled data are classified at every resolution by an independently trained Maximum Likelihood Classifier (MLC), and the corresponding a posteriori probabilities are saved. Third, the classification results at the different resolutions are integrated by comparing the a posteriori probabilities at every resolution; the final class of each pixel is the class with the maximum a posteriori probability. A case study of land use mapping with Landsat TM data using this framework was conducted in the Dianchi Watershed in Yunnan Province, China. Land use was categorized into six classes, and classification accuracy was assessed using a confusion matrix. Comparison between the multi-scale classification and the classification at the original resolution showed an improvement in overall accuracy of about 10%. The study shows that by exploiting the multi-scale properties of remotely sensed data, the accuracy of land use mapping can be improved significantly.
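
A rough illustration of the fusion step described above is sketched below: for each pixel, the class whose maximum likelihood posterior probability is highest across the considered resolutions is kept. The array names and the assumption that the per-resolution posteriors have already been resampled to a common pixel grid are hypothetical, not taken from the paper.

```python
import numpy as np

def fuse_multiscale_posteriors(posteriors):
    """Fuse per-resolution MLC results by maximum a posteriori probability.

    posteriors: list of arrays shaped (n_classes, rows, cols), one per
    resolution, all assumed to be resampled onto the same pixel grid.
    Returns the fused class label map of shape (rows, cols).
    """
    # Highest posterior probability and its class at each resolution.
    best_prob = np.stack([p.max(axis=0) for p in posteriors])      # (n_res, rows, cols)
    best_class = np.stack([p.argmax(axis=0) for p in posteriors])  # (n_res, rows, cols)

    # For every pixel, keep the class from the resolution whose
    # posterior probability is largest.
    winning_res = best_prob.argmax(axis=0)
    rows, cols = np.indices(winning_res.shape)
    return best_class[winning_res, rows, cols]

# Example with two resolutions, three classes, on a 4x4 grid.
rng = np.random.default_rng(0)
p_fine = rng.dirichlet(np.ones(3), size=(4, 4)).transpose(2, 0, 1)
p_coarse = rng.dirichlet(np.ones(3), size=(4, 4)).transpose(2, 0, 1)
print(fuse_multiscale_posteriors([p_fine, p_coarse]))
```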

Keywords: remote sensing, classification, multi-scale characteristics, uncertainty, land use

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: YanchenBo2008accuracy1.pdf (314.77 KB)

Accuracy Evaluation of Three Dimensional Laser Range Scanner Based on Field Calibration

Yunlin Zhang 1 +, Hangbin Wu 1, Xiaojun Cheng 1, 2 and Chun Liu 1, 2
1 Department of Surveying and Geo-Informatics, Tongji University, Shanghai, 200092, China
2 Key Laboratory of Advanced Engineering Surveying of SBSM, Shanghai, 200092, China 

Abstract. As an advanced, automatic, high-precision three dimensional data acquisition technology, three dimensional laser range scanning is widely used in high-precision modeling and reconstruction for reverse engineering. It is also called a "realistic copy" technology because of how well it reproduces real objects. By scanning a complicated real environment in detail with a large volume of points, the technology can reconstruct an object in the computer as a point cloud, regardless of the size, complexity and irregularity of the real landscape. However, the precision of the object model may be degraded by errors in the data from the three dimensional range scanner, which in turn biases any analysis based on such a model. It is therefore important to calibrate the instrument based on a good understanding of these error influences. To this end, several calibration schemes are designed in this paper to analyze the error sources of the laser range scanner. The working principle and operational workflow of the three dimensional laser scanner are reviewed first as the basis for the subsequent work. The accuracy of the instrument is then evaluated with respect to distance measurement, angle measurement, temperature, echo intensity, time, echo mode and so on. In the experiments, an LR1-R scanner from 3D Laser Mapping is used as the three dimensional laser range scanner, a Kern Mekometer as the precision distance-measuring instrument, a Kern prism as the reflecting prism, and a TCA2003 as the total station.

Keywords: three dimensional range scanning, calibration, accuracy evaluation, field testing

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: YunlinZhang2008accuracy.pdf (458.08 KB)

Edge Detection of Riverway in Remote Sensing Images Based on Curvelet Transform and GVF Snake

Moyan Xiao 1 +, Yonghong Jia 1, Zhibiao He 2 and Yan Chen 1
1 School of Remote Sensing and Information Engineering,
 2 Satellite and Navigation Location Technology Research Center, Wuhan University, 129 Luoyu Road, Wuhan 430079, P.R. China

Abstract. This paper applies the curvelet transform and the gradient vector flow (GVF) snake to improve the accuracy of riverway edge detection in remote sensing images. Multi-scale geometric analysis (MGA) has been a booming research topic in recent years; it aims to obtain flexible, fast and effective signal processing algorithms through efficient approximation and characterization of the inherent geometric structure of high-dimensional data. The curvelet transform is a member of this emerging MGA family that overcomes an inherent limitation of traditional multi-scale representations such as wavelets, which ignore the geometric properties of objects with edges and do not exploit the regularity of edge curves in higher dimensions. The edge detection process consists of three main parts. First, the initial snake is obtained from the curvelet-denoised image using region growing and morphological methods. Second, the edge map is derived from the curvelet-enhanced image. Finally, the converged snake is obtained by evolving the GVF snake. The edge detection results for the Yangtze River derived from the proposed method, a wavelet-based GVF snake and the Canny method are compared. Experiments demonstrate that the new algorithm is more effective and accurate than the other methods.

Keywords: edge detection, remote sensing image, multi-scale geometric analysis (MGA), curvelet transform (CT), GVF Snake 

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: MoyanXiao2008accuracy.pdf (440.42 KB)

Estimation of Linear Vectorial Semiparametric Models by Least Squares

Kun Zhang 1 and Songlin Zhang 2
1 Laboratory of Geographic Information Science, East China Normal University, Shanghai, 200062, China
2 Department of Surveying and Geomatics, Tongji University, Shanghai, 200092, China

Abstract. A semiparametric model is a statistical model consisting of both parametric and nonparametric components and can be viewed as a mixture of the two. The theoretical properties of such models, such as their large-sample behaviour, have been studied extensively. However, most research has addressed the scalar case, in which the dimension of the observation at each epoch is one. In spatial data processing fields such as econometrics, GPS, engineering surveying and deformation monitoring, the observations at each epoch typically have dimension greater than one, so the models are vectorial rather than scalar. This paper focuses on the estimation theory of vectorial semiparametric models under the least-squares principle. We derive the formulas for the weighted-function estimator and the spline estimator. Kernel and nearest-neighbour weights are the most common weight functions. In kernel estimation, the weights of the observations are determined by kernel functions, which are usually probability density functions, as in the Nadaraya-Watson kernel. Nearest-neighbour weighting means that only the nearest neighbours affect a given observation point. Spline estimation solves a penalized least-squares problem in which the criterion function trades off fidelity to the data against smoothness of the function. The differences between the scalar and vectorial semiparametric models are compared: for the weighted-function estimator, a single weight parameter is replaced by a weight matrix; for the spline estimator, ordinary multiplication is replaced by the Kronecker product.
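
To make the kernel-weighting idea concrete, the sketch below computes Nadaraya-Watson weights with a Gaussian kernel and uses them to estimate the nonparametric component at a query point. It is a one-dimensional, scalar illustration only; the bandwidth value and the synthetic data are arbitrary assumptions rather than anything prescribed in the paper.

```python
import numpy as np

def nadaraya_watson(t_query, t_obs, y_obs, bandwidth=0.5):
    """Nadaraya-Watson estimate of the nonparametric component at t_query.

    The weights come from a Gaussian kernel of the distances between the
    query point and the observation points, normalised to sum to one.
    """
    u = (t_query - t_obs) / bandwidth
    weights = np.exp(-0.5 * u ** 2)   # Gaussian kernel values
    weights /= weights.sum()          # normalise
    return np.dot(weights, y_obs)

# Noisy observations of a smooth signal.
t = np.linspace(0.0, 10.0, 50)
y = np.sin(t) + np.random.default_rng(1).normal(scale=0.1, size=t.size)
print(nadaraya_watson(5.0, t, y))   # should be close to sin(5.0)
```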

Keywords: linear vectorial semiparametric model; scalar semiparametric model; least-square estimator

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: KunZhang2008accuracy.pdf (245.15 KB)

Forest Characteristics and Effects on LiDAR Waveforms Modeling and Simulation

Xiaohuan Xi +, Ran Li, Zhaoyan Liu and Xiaoguang Jiang
Academy of Opto-Electronics, Chinese Academy of Sciences
No.95 Zhongguancundonglu, Beijing, 100190, China

Abstract. LiDAR (Light Detection And Ranging) remote sensing has been used to extract surface information because it can acquire highly accurate object shape characteristics as geo-registered 3D points, and it has therefore proven satisfactory for many applications such as high-resolution elevation model generation, 3D city mapping and vegetation structure estimation. Large-footprint LiDAR in particular offers great potential for effectively measuring tree parameters in forested areas. Based on the radiative transfer function for vegetation structure, a simplified model taken from a previous paper is used to simulate how vegetation parameters and terrain slope affect LiDAR waveform characteristics, and an extinction coefficient is introduced into the model to account for the influence of a dense vegetation canopy on this simplified, idealized model. The experimental results show that the vegetation canopy, the distribution of trees within a footprint and the terrain slope influence LiDAR waveform shapes, while tree height only affects the starting position of the waveform. The model with the extinction coefficient shows that the canopy attenuates the laser pulse, so the return echo from the lower canopy is weaker. Although they are based on simplifying assumptions and ideal conditions, the results obtained from GLAS data are applicable to other LiDAR systems and are significant for LiDAR applications in forest parameter extraction; in sloped areas in particular, it is necessary to correct terrain effects when deriving vegetation height from LiDAR returns. The model will need to be refined in future work to make it more practical.
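
A minimal sketch of the extinction idea, assuming a Beer-Lambert style attenuation: the pulse energy reaching a canopy layer decays exponentially with the leaf area accumulated above it, so returns from the lower canopy are weaker. The layer spacing, LAI profile and extinction coefficient below are invented for illustration and are not the formulation or parameter values used in the paper.

```python
import numpy as np

# Canopy described as thin horizontal layers from the tree top downward.
layer_depth = np.arange(0.0, 20.0, 0.5)          # metres below the canopy top
lai_per_layer = np.full(layer_depth.size, 0.1)   # leaf area index in each layer
k_ext = 0.6                                      # assumed extinction coefficient

# Cumulative LAI above each layer controls how much pulse energy survives.
cumulative_lai = np.cumsum(lai_per_layer) - lai_per_layer
transmittance = np.exp(-k_ext * cumulative_lai)

# Simulated return: energy scattered back by each layer is proportional to
# the leaf area it contains times the energy that actually reaches it.
returned_energy = transmittance * lai_per_layer
print(returned_energy[:5])    # upper layers return more energy
print(returned_energy[-5:])   # lower layers return noticeably less
```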

Keywords: LiDAR, GLAS, vegetation parameters, waveform, extinction coefficient

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: XiaohuanXi2008accuracy.pdf (321.3 KB)

Geostatistical Approaches to Conflation of Continental Snow Data

Jingxiong Zhang 1+, Phaedon Kyriakidis 2 and Richard Kelly 3
1 School of Remote Sensing Information, and Laboratory for Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, 129 LuoYu Road, Wuhan 430079, China
2 Department of Geography, University of California, Ellison Hall 5710, Santa Barbara, CA 93106-4060 
3 Department of Geography, Faculty of Environmental Studies, University of Waterloo,  200 University Avenue West, Waterloo, Ontario, N2L 3G1, Canada

Abstract. Information on snow cover extent and mass is important for characterizing hydrological systems at different spatial and temporal scales and for effective water resources management. This paper explores geostatistics for the conflation of ground-measured and passive microwave remotely sensed snow data, commonly known as primary and secondary data, respectively. A modification to conventional co-kriging is proposed, which first estimates the differenced local means between the sparsely distributed primary data and the densely sampled secondary data by co-kriging, followed by a best linear estimation of the primary variable based on the primary data and the bias-corrected secondary data, with variogram models revised in light of the corrections made to the original secondary data. An experiment was carried out with snow depth (SD) data derived from the Advanced Microwave Scanning Radiometer for EOS (AMSR-E) instrument and World Meteorological Organization (WMO) SD measurements, confirming the effectiveness of the proposed methodology.

 Keywords: geostatistics, conflation, co-kriging, snow data, passive microwave remote sensing 

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: JingxiongZhang2008accuracy.pdf (868.09 KB)

Image-Based Spatial Information Systems for Geologic Logging

Hao Li, Xiaomin Jia, Yueqin Niu and Rui Wang
College of Civil Engineering, Hohai University, 1 Xikang Road, Nanjing 210098, China

Abstract. At present, geologic logging is still accomplished primarily by manual sketching and field surveying; this way of working cannot keep pace with practical production or with the demand for fast, on-site digital workflows. To digitize data collection, processing, management and analysis in engineering geology, the authors have investigated the use of measurements from non-metric cameras, which reduces the need for special equipment and professional skills in image-based geologic logging. On this basis, an image-based spatial information system for engineering geologic logging has been developed, and a digital logging workflow has been built around it. The system is a specialized information system grounded in photogrammetric theory; it makes good use of close-range photogrammetry, digital image processing and GIS techniques and is inherently interdisciplinary. Its main functions include the automatic creation of stretched image maps of logging objects, digital plotting of structural features, image-based measurement of structural-plane attitude, management and query of logging attribute and graphic data, and output of AutoCAD drawings; in addition, it can produce image products for geological reconnaissance in engineering construction. Theoretical precision analysis and practical applications show that, in image-based geologic logging, the mean square error of position is less than 0.2 m for tunnels and wells and less than 0.3 m for slopes and foundation pits, and the mean square error of structural-plane attitude measurement is less than 0.2°, all of which meet the requirements of engineering geologic logging specifications. The system has transformed engineering geologic logging from manual to computer-assisted work and has generally improved the technical level and working efficiency of data collection, processing, management and application. This paper describes the main aspects of the image-based spatial information system for geologic logging, including its structure and functional module design, its working principles and spatial information processing algorithms, its precision estimation method, and an analysis of an application instance.

Keywords:  image-based spatial information system, engineering geologic logging, photogrammetry, precision, non-metric image

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: HaoLi2008accuracy.pdf (365.32 KB)

Land Use/Cover Change Detection Using Feature Database Based on Vector-Image Data Conflation

Hong Zhang + and Ning Shu
1 School of Remote Sensing and Information Engineering, Wuhan University,
129 Luoyu Road, Wuhan 430079, China

Abstract. Change detection in remotely sensed imagery is the procedure of quantitatively analyzing and identifying changes that have occurred on the earth's surface from remotely sensed images acquired at different times. Land use change surveying with remotely sensed imagery has become one of the important methods for land management departments to understand and manage land resources, and it has attracted wide attention. As a key element of many remote sensing applications such as resource inventory, environmental monitoring and the updating of fundamental geographical databases, change detection is in urgent demand and has great potential in scientific applications. Conflation is the process of combining the information from two (or more) geodata sets to create a master data set that is superior to either source in spatial or attribute terms; its objectives include increasing spatial accuracy and consistency, and updating data sets or adding new spatial features to them. Based on an analysis and summary of research at home and abroad, this paper focuses on land use/cover change (LUCC) detection using a feature database of basic types built on vector-image data conflation: the land use map and the remote sensing image are combined, and features are extracted. The methodology belongs to the "feature class" of LUCC approaches, and the analysis is based on land use spans rather than on the individual pixels of traditional methods. The main contributions of the study are as follows. (1) Feature extraction based on land use spans: each span is expressed as a vector polygon together with its raster region, and a spectral feature database containing histogram, texture and shape features of each span is built. (2) Creation and updating of the feature database: first, using sample spans from the land use map at time T1, the features of each land use class at T1 are obtained; then each sample is analyzed, and if the regional similarity index between the image span at T1 and at T2 is acceptable, the sample is retained for T2; otherwise new samples around that sample are selected and judged by their similarity to the T1 samples. (3) Change detection based on the feature database: each span at T2 is classified according to the minimum Euclidean distance to the accepted T2 sample spans, and the corresponding land use type is assigned to the span (see the sketch below). (4) Automatic extraction of change information based on Boolean operations: after classification, the changed spans are vectorized, change statistics are obtained through Boolean operations in GIS, and various change analyses can be made (e.g. urbanization and loss of the stew). The method was tested on QuickBird images of a district in Wuhan; the accuracy of the results is as high as 85.7% (loss of the stew) and 92.6% (urbanization), with an overall accuracy of 88.3%.
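
The classification step in contribution (3) can be sketched as a nearest-sample rule: each span's feature vector is assigned the land use type of the closest accepted T2 sample span in Euclidean feature space. The feature dimensions and sample values below are placeholders, not the histogram/texture/shape features actually used in the paper.

```python
import numpy as np

def classify_spans(span_features, sample_features, sample_labels):
    """Assign each span the land use type of its nearest sample span.

    span_features:   (n_spans, n_features) per-span feature vectors
    sample_features: (n_samples, n_features) accepted sample feature vectors
    sample_labels:   (n_samples,) land use types of the samples
    """
    # Pairwise Euclidean distances between spans and samples.
    diff = span_features[:, None, :] - sample_features[None, :, :]
    distances = np.sqrt((diff ** 2).sum(axis=2))
    nearest = distances.argmin(axis=1)
    return np.asarray(sample_labels)[nearest]

# Toy example: two-feature spans, samples from three land use types.
samples = np.array([[0.2, 0.1], [0.8, 0.7], [0.5, 0.9]])
labels = np.array(["water", "built-up", "vegetation"])
spans = np.array([[0.25, 0.15], [0.75, 0.80]])
print(classify_spans(spans, samples, labels))   # ['water' 'built-up']
```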

Keywords: land use map, sample span, vector-image data conflation, feature database, LUCC, accuracy analysis 

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: HongZhang2008accuracy.pdf (335.02 KB)

Metric and Identification of Spatial Objects Based on Data Fields

Shuliang Wang 1, 2, Juebo Wu 2 +, Feng Cheng 1 and Hong Jin 1
1 International School of Software, Wuhan University, Wuhan 430079, China
2 State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, China

Abstract. Identification of spatial objects, such as action identification and validity identification, has been a hot research topic in recent years and has attracted much study. In this paper, data fields are proposed as a metric for describing spatial objects and improving identification accuracy. A potential function, as part of the data field, is introduced to describe the power of each object. Two kinds of data fields are created to express the individual and the common characteristics of each object, respectively. A Weighted Euclidean Distance (WED) classifier is used for the final identification. An experiment on real-time person tracking is carried out, and an accuracy analysis is also presented.
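
As a hedged illustration of the two ingredients named above, the sketch below computes a data-field potential as a sum of Gaussian-decay contributions from a set of data points, and then a weighted Euclidean distance between two feature vectors. The influence radius and feature weights are made-up values, and this is a generic formulation rather than the exact model of the paper.

```python
import numpy as np

def potential(query, data_points, sigma=1.0):
    """Data-field potential at `query`: sum of Gaussian-decay contributions
    from every data point (a common data-field form)."""
    d = np.linalg.norm(data_points - query, axis=1)
    return np.exp(-(d / sigma) ** 2).sum()

def weighted_euclidean_distance(x, y, weights):
    """Weighted Euclidean distance used for the final identification step."""
    return np.sqrt(np.sum(weights * (x - y) ** 2))

points = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 2.0]])
print(potential(np.array([0.5, 0.2]), points))

w = np.array([0.7, 0.3])   # assumed feature weights
print(weighted_euclidean_distance(np.array([1.0, 2.0]),
                                  np.array([1.5, 1.0]), w))
```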

Keywords: data field, spatial object metric, spatial object identification, accuracy analysis.

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: ShuliangWang2008accuracy.pdf (496.92 KB)

Multi-Level Measurements for Uncertainty in Classified Remotely Sensed Imagery

Yong Ge 1+, Sanping Li 1, Ruifang Duan 1, Hexiang Bai 1 and Feng Cao 1 
1 Institute of Geographic Sciences and Natural Resources Research, CAS, Beijing, 100101, China

Abstract. How to measure the uncertainty in remotely sensed data is one of the key issues in uncertainty research on remotely sensed information. In this paper, we use information theory and rough set theory to measure the uncertainty in classified remotely sensed imagery and propose multi-level measurement indices for classified imagery, namely a pixel-level index, class/object-level indices and image-level indices. A case study using a Landsat TM image of the Yellow River Delta, China, is then used to illustrate the multi-level measurements.
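
A small sketch of one plausible pixel-level index, assuming the classifier outputs a per-pixel class probability vector: the Shannon entropy of that vector, normalised by the log of the number of classes so that 0 means a certain pixel and 1 a maximally uncertain one. The normalisation is an illustrative choice, not necessarily the index defined in the paper.

```python
import numpy as np

def pixel_entropy(prob, eps=1e-12):
    """Normalised Shannon entropy of a per-pixel class probability vector.

    prob: array of shape (n_classes,) summing to one.
    Returns a value in [0, 1]; larger means more classification uncertainty.
    """
    p = np.clip(prob, eps, 1.0)
    h = -np.sum(p * np.log(p))
    return h / np.log(len(p))

print(pixel_entropy(np.array([0.90, 0.05, 0.05])))   # low uncertainty
print(pixel_entropy(np.array([0.34, 0.33, 0.33])))   # close to 1.0
```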

Keywords: remote sensing classification image, uncertainty, measurement indices, visual expression

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: YongGe2008accuracy.pdf (607.03 KB)

Simulation Analysis of Error Propagation on Land Cover Change Maps: A Comparison of Contextual and Non-contextual Models

Desheng Liu 1 + and Yongwan Chun 2 
1 Department of Geography and Statistics, The Ohio State University
2 School of Economic, Political and Policy Sciences, The University of Texas at Dallas

Abstract. Land cover change maps are subject to error propagated from the multi-temporal land cover classification maps used to produce them. Understanding the factors that determine how errors propagate into land cover change maps helps in selecting appropriate classification models and characterizing the associated uncertainties. In this paper, we present a simulation analysis of the rates of error propagation for both non-contextual and contextual classification models. The simulation approach is based on simulated annealing, with careful experimental designs to control two related factors: the spatial and the temporal patterns of the errors in spectral probability estimation. The results show that the two factors influence error propagation differently for non-contextual and contextual classification models. For non-contextual models, increasing the temporal dependence of errors reduces the rate of error propagation, while the spatial dependence of errors has no impact on error propagation. For contextual classification models, the use of spatial-temporal information significantly reduces the rate of error propagation; however, the utility of the spatial-temporal information in mitigating error propagation depends on the spatial dependence of errors, and the impact of the temporal dependence of errors is weakened in the contextual models.

Keywords: error propagation, land cover change, simulation, contextual models  

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: DeshengLiu2008accuracy.pdf (307.24 KB)

The Reconstruction and Applications of the Contour Lines Based on TINs

Zhiwei Wang 1, Changqing Zhu 1, 2 and Yaoge Wang 1
1 Institute of Surveying and Mapping, Information Engineering University, Zhengzhou 450052, China
2 Key Laboratory of Virtual Geographic Environment, Nanjing Normal University, Nanjing 210054, China

Abstract. In this paper, the principle of contour interpolation is introduced first. We then derive the propagation formula for the horizontal accuracy of interpolated contour lines, discuss the accuracy of contours interpolated on triangulated irregular networks built under different constraint conditions, and find that the quality of contour interpolation is closely related to the quality of the triangulated irregular network. Finally, the interpolated contours are applied to two practical applications: the automatic connection of discontinuous contours and the Chinese-English metric unit conversion of the contours. The experiments indicate that contours reconstructed from triangulated irregular networks can satisfy the different accuracy requirements of the two applications.
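
The basic interpolation step can be sketched as linear interpolation along a triangle edge: if a contour level lies between the elevations of the edge's two vertices, the crossing point is located by the elevation ratio. This is the standard construction; the coordinates and contour level below are arbitrary.

```python
def contour_crossing(p1, z1, p2, z2, level):
    """Point where a contour of height `level` crosses the TIN edge p1-p2.

    p1, p2: (x, y) vertex coordinates; z1, z2: vertex elevations.
    Returns None if the contour does not cross this edge.
    """
    lo, hi = min(z1, z2), max(z1, z2)
    if not (lo <= level <= hi) or z1 == z2:
        return None
    t = (level - z1) / (z2 - z1)           # linear interpolation parameter
    return (p1[0] + t * (p2[0] - p1[0]),
            p1[1] + t * (p2[1] - p1[1]))

# The 105 m contour crosses this edge 40 % of the way from p1 to p2.
print(contour_crossing((0.0, 0.0), 101.0, (10.0, 5.0), 111.0, 105.0))
```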

Keywords: triangulated irregular network (TIN), DEM, accuracy, quality, contour line, breakpoint connection, unit conversion, error analysis

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: ZhiweiWang2008accuracy.pdf (481.22 KB)

Uncertainty Analysis for Remote Sensing Classification in the Context of Disaster Studies in Shanghai

Zhane Yin 1+, Jiahong Wen 1 and Shiyuan Xu 2
Geography Department of Shanghai Normal University, 100 Guilin Road, Shanghai, 200234, China
2 Geography Department of East China Normal University, 3663 North Zhongshan Road, Shanghai, China

Abstract. Our study applies object-oriented technology to extract urban green space automatically, which can increase accuracy by evaluating and analyzing the quality of disaster spatial data, measuring disfigurement points and the disfigurement rate in a disaster GIS on the basis of error analysis. The study shows that, for high resolution remote sensing images, accuracy may increase by about 20% when object-oriented classification with the remote sensing image processing software eCognition is used instead of the traditional supervised classification method with ERDAS.

Keywords: accuracy, remote sensing data, rate of disfigurement in GIS, Shanghai

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: ZhaneYin2008accuracy.pdf (422.87 KB)

Visualizing Positional Uncertainties of Geometric Corrected Remote Sensing Images

Carlos Vieira +, Giuliano Marotta, Dalto Rodrigues and Rafael Andrade
Federal University of Viçosa – Civil Engineering Department – DEC
Surveying Engineering Sector, Viçosa – MG – Brazil CEP 36570-000

Abstract. A new method to evaluate and visualize the positional uncertainties that arise in the geometric correction of remote sensing images is presented. Five different transformation methods are described. Results show that the best RMS error was obtained by the modified 3D projective model and the worst by the 2D affine model; the 3D projective model and the modified 3D projective model performed very similarly. These results also highlight the importance of choosing an appropriate transformation model for the geometric correction of remotely sensed data and of selecting ground control points (GCPs) spread over the entire study area. Moreover, using the proposed positional error map it is possible to evaluate every observation together with its precision, offering high confidence in the transformed image coordinates. The use of variance propagation rules also allows the residual uncertainties of the transformation parameters to be analyzed spatially over the entire image.
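
A minimal sketch of the simplest transformation mentioned above, the 2D affine model, fitted to ground control points by least squares and evaluated by its RMS residual. The GCP coordinates are fabricated, and the more elaborate 3D projective models of the paper are not reproduced here.

```python
import numpy as np

def fit_affine_2d(src, dst):
    """Least-squares 2D affine transform mapping src (x, y) to dst (X, Y).

    Solves X = a0 + a1*x + a2*y and Y = b0 + b1*x + b2*y and returns the
    two parameter vectors plus the RMS residual over the GCPs.
    """
    A = np.column_stack([np.ones(len(src)), src[:, 0], src[:, 1]])
    a, *_ = np.linalg.lstsq(A, dst[:, 0], rcond=None)
    b, *_ = np.linalg.lstsq(A, dst[:, 1], rcond=None)
    residuals = np.column_stack([A @ a, A @ b]) - dst
    rms = np.sqrt((residuals ** 2).sum(axis=1).mean())
    return a, b, rms

src = np.array([[0, 0], [100, 0], [0, 100], [100, 100], [50, 60]], float)
dst = np.array([[10, 5], [112, 8], [8, 108], [110, 111], [62, 70]], float)
a, b, rms = fit_affine_2d(src, dst)
print(rms)   # overall RMS error of the fitted transformation
```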

Keywords: positional accuracy, error  propagation, geometric correction, least mean square, visualizing uncertainty.

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: Vieira2008accuracy.pdf (292.4 KB)

A Comprehensive Evaluation of Digital Image Map Quality Based on the Cloud Model

Sheng Luo +, Baoming Zhang and Haitao Guo
Department of Remote Sensing Engineering, College of Surveying and Mapping,
Zhengzhou University of Information Engineering, Zhengzhou 450052, China 

Abstract. This paper first presents a comprehensive evaluation method for digital image map quality based on the cloud model, using the cloud model's mapping between qualitative and quantitative knowledge. The cloud parameters of each quality element are then calculated using the theory of virtual clouds. After the final cloud parameters are calculated, the overall quality of the digital image map is described by its numerical characteristics and probability status. Experimental results show that the model is feasible, reliable and consistent with the actual situation.

Keywords: quality comprehensive evaluation, digital image map production, cloud model.

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: ShengLuo2008accuracy.pdf (243.81 KB)

A Covariance Conversion Approach of Gamma Random Field Simulation

Jun-Jih Liou 1, Yuan-Fong Su 1, Jie-Lun Chiang 2 and Ke-Sheng Cheng1+ 
1 Department of Bioenvironmental Systems Engineering, National Taiwan University, Taiwan
2 Dept. of Soil and Water Conservation, National Pingtung University of Science and Technology, Taiwan

Abstract. In studies involving environmental risk assessment, random field generators such as the sequential Gaussian simulation are often used to yield realizations of a Gaussian random field, and then realizations of the non-Gaussian target random field are obtained by an inverse-normal transformation. Such simulation requires a set of observed data for estimation of the empirical cumulative distribution function and covariance function of the random field under investigation. However, such observed-data-based simulation will not work if no observed data are available and realizations of a non-Gaussian random field with a specific probability density and covariance function are needed. In this paper we present details of a gamma random field simulation approach which does not require a set of observed data. A key element of the approach lies in the theoretical relationship between the covariance functions of a gamma random field and its corresponding standard normal random field. The proposed gamma random field simulation technique is composed of three sequential components: (1) covariance function conversion between a gamma random field and a corresponding Gaussian random field, (2) generating realizations of a Gaussian random field with standard normal density and the desired covariance function, and (3) transforming Gaussian realizations to corresponding gamma realizations. Through a set of devised simulation scenarios, the proposed technique is shown to be capable of generating realizations of the given gamma random fields. The approximation function of the gamma-Gaussian covariance conversion works well for coefficients of skewness of the gamma density not exceeding 3.0.
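
Component (3), transforming standard normal realizations into gamma realizations, can be sketched with the usual CDF-matching step: each Gaussian value is mapped through the standard normal CDF and then through the inverse gamma CDF. The shape/scale parameters and field size below are arbitrary, and the covariance conversion and conditional simulation of components (1)-(2) are not reproduced.

```python
import numpy as np
from scipy import stats

# A standard normal "realization" stands in here for the output of a
# sequential Gaussian simulation with the converted covariance function.
rng = np.random.default_rng(42)
gaussian_field = rng.standard_normal(size=(100, 100))

# Target gamma marginal (arbitrary shape and scale for illustration).
shape, scale = 2.0, 1.5

# CDF matching: N(0,1) value -> uniform quantile -> gamma quantile.
uniform = stats.norm.cdf(gaussian_field)
gamma_field = stats.gamma.ppf(uniform, a=shape, scale=scale)

print(gamma_field.mean(), shape * scale)   # sample mean vs. theoretical mean
```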

Keywords: stochastic simulation, gamma distribution, geostatistics, random field simulation

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: Jun-JihLiou2008accuracy.pdf (440.53 KB)

A Fuzzy Synthetic Evaluation Approach for Land Cover Cartography Accuracy Assessment

Pedro Sarmento 1, 2, Hugo Carrão 1, 2 and Mario Caetano 1, 2 + 
Portuguese Geographic Institute (IGP), Remote Sensing Unit (RSU), Rua Artilharia Um, 107, 1099-052 Lisboa, Portugal
CEGI, Instituto Superior de Estatística e Gestão de Informação, ISEGI, Universidade Nova de Lisboa, 1070-312 Lisboa, Lisboa, Portugal

Abstract. The accuracy assessment of land cover maps is traditionally based on reference sample observations randomly selected over the study area. It is assumed that the reference sample observations, which represent the "real" land cover at the Earth's surface, are free of errors. However, some of them may be erroneous. These errors are sometimes due to uncertainty in identifying the most adequate reference land cover classes by visual interpretation of aerial images and/or field work; this uncertainty is caused by landscape fragmentation and/or the presence of more than one land cover class in the sampled areas. The introduction of uncertainty into thematic accuracy measures remains an open issue, but ignoring this uncertainty can significantly influence the land cover map accuracy reported to end users. In this paper we propose a simple and understandable method for the thematic accuracy assessment of land cover maps that uses reference uncertainty as an input feature. This fuzzy synthetic evaluation (FSE) approach is based on the combination of linguistic fuzzy operators. Specifically, we evaluate the magnitude of the errors per land cover class and weight their importance in the map accuracy assessment process. We then compare our approach with the most traditional accuracy assessment measures and evaluate its methodological gains and disadvantages. To this end we present a case study based on a land cover map of Continental Portugal derived from the automatic classification of MERIS images. We demonstrate that the fuzzy synthetic evaluation approach provides accuracy descriptors that are more comprehensible for map users. In fact, this approach allows end users to decide more easily whether a land cover map satisfies their needs and to become more aware of the extent of map errors and their particular impacts.

Keywords: fuzzy synthetic evaluation, reference databases uncertainty, land cover maps, accuracy assessment.

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: Sarmento2008accuracy.pdf (293.84 KB)

A General Method for Assessing the Uncertainty in Classified Remotely Sensed Data at Pixel Scale

Yanchen Bo 1, 2  + and Jinfeng Wang 3
1 State Key Laboratory of Remote Sensing Science, Jointly Sponsored by Beijing Normal University and Institute of Remote Sensing Applications, CAS
2 Research Center for Remote Sensing and GIS, School of Geography, Beijing Key Laboratory for Remote Sensing of Environment and Digital Cities, Beijing Normal University, Beijing, 100875, China
3 LREIS, Institute of Geographical Science and Natural Resources Research
Chinese Academy of Sciences, Beijing, 100101, China

Abstract. Assessing the uncertainty of classified remotely sensed data is a critical problem in both research and applications. The conventional solution is based on the error matrix (i.e. confusion matrix) and the kappa statistics derived from it; however, this method provides no information on the spatial distribution of classification uncertainty. A probability vector-based method has been developed for assessing classification uncertainty at the pixel scale, but its use is severely limited because the probability vector can be derived only from Bayesian classification. In practice, other classifiers such as artificial neural network, minimum distance, Mahalanobis distance and fuzzy classifiers are widely used for remote sensing data classification. To assess the pixel-scale uncertainty of thematic maps produced by these classifiers, this paper presents a general method that extends the probability vector-based approach to classifiers other than the Bayesian classifier. The extension is realized through a transformation that converts the "membership vector" produced by the various classifiers into a "transformed probability vector" comparable to the probability vector of the Bayesian classifier. Uncertainty measures that can be derived from the probability vector are evaluated, and the probability residual and the entropy derived from the extended probability vector are used as indicators of absolute and relative uncertainty, respectively. The uncertainties of different classifiers are compared at the pixel scale, and examples of uncertainty maps from distance classifiers are presented and compared with those from the maximum likelihood classifier.

Keywords: scale, a posteriori probability vector, uncertainty measure.

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: YanchenBo2008accuracy2.pdf (301.21 KB)

A Method to Incorporate Uncertainty in the Classification of Remote Sensing Images

Luísa M S Gonçalves 1+, Cidália Fonte 2, 3, Eduardo N B S Júlio 4 and Mario Caetano 5, 6
 1 Polytechnic Institute of Leiria, School of Technology and Management, Portugal
2 Institute for Systems and Computers Engineering at Coimbra, Portugal
3 Department of Mathematics, University of Coimbra, P – 3001 454 Coimbra, Portugal
4 ISISE, Civil Engineering Department, University of Coimbra
5 Portuguese Geographic Institute (IGP), Remote Sensing Unit (RSU), Lisboa, Portugal
6 CEGI, Instituto Superior de Estatística e Gestão de Informação, ISEGI,Universidade Nova de Lisboa,  1070-312 Lisboa, Portugal

Abstract. The authors analyze in this paper whether introducing the uncertainty associated with the classification of surface elements into the classification of landscape units can improve the accuracy of the results. To this end, a hybrid classification method is developed that incorporates uncertainty information into the automatic classification of very high spatial resolution multispectral satellite images to obtain a map of landscape units. The proposed classification methodology includes the following steps: 1) a soft pixel-based classification; 2) computation of the classification uncertainty; 3) image segmentation; and 4) object classification based on decision rules. The first step is a soft pixel-based classification performed with the maximum likelihood classifier, aiming to identify the surface elements (e.g., tree crown, shade, bare soil, buildings). The posterior probabilities are then computed for all pixels of the image, which enables the computation of the classification uncertainty. An image segmentation is then performed to obtain image objects. The resulting objects are classified into landscape units using a set of decision rules that incorporate the probabilities assigned to the several classes at each pixel and the degree of uncertainty associated with these assignments. The proposed methodology was tested on the classification of an IKONOS satellite image. The accuracy of the classification was computed using a probabilistic error matrix. The comparison between the results obtained with the proposed approach and those obtained without considering the classification uncertainty revealed a considerable improvement in classification accuracy. This shows that information about uncertainty can be valuable when taking decisions and can actually increase the accuracy of the classification results.

Keywords: soft classification, maximum likelihood classifier, hybrid classification, uncertainty.

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: Luisa2008accuracy.pdf (362.82 KB)

A Mine Environmental GIS: Framework, Key Techniques, and Case Study

Hairong Zhang 1, 2 +, Fengjie Gao 1 and Qiyan Feng 1, 2
1 School of Environmental Science and Spatial Informatics, China University of Mining and Technology, Xuzhou, Jiangsu, 221008, China 
2 Jiangsu Key Laboratory of Resources and Environmental Information Engineering, Xuzhou 221008, China

Abstract. Large-scale coal mining has caused serious environmental pollution and ecological damage in China's coal mining areas. The main problems are: geological hazards such as landslides, collapses and debris flows triggered by mining subsidence; side effects such as land occupation, slope instability, seepage and dust caused by waste rock and gangue heaps; the disruption of the water cycle when groundwater is drained and waste water is discharged; and air and noise pollution during coal mining and transport. Based on the characteristics of the acquisition and management of environmental information in China, as well as the theory and technology of environmental informatics and GIS, a Mine Environmental GIS (MEGIS) was designed and developed, and some of its key techniques were investigated. The objective of MEGIS is to manage, query and analyze environmental data and to provide users with decision support for environmental protection. In MEGIS, both the point mode and the region mode of the Gaussian air diffusion model are used to simulate air pollution, and water quality models are used to simulate water pollution in one and two dimensions. Discrete concentration points are used to build a Triangulated Irregular Network (TIN), from which isolines or isosurfaces are generated for environmental analysis in mining areas. A mining subsidence simulation and prediction model is used to analyze ground surface subsidence caused by mining. With the proposed MEGIS it is much easier to update, manage and analyze environmental data and, further, to support environmental management decision-making.
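
For the point-source case, a generic Gaussian plume formula of the kind the abstract refers to can be sketched as below. The emission rate, wind speed, stack height and dispersion parameters are placeholder values, and the exact point-mode/region-mode formulation used in MEGIS is not reproduced here.

```python
import numpy as np

def gaussian_plume(y, z, Q, u, H, sigma_y, sigma_z):
    """Gaussian plume concentration for a continuous point source.

    Q: emission rate, u: wind speed along the plume axis, H: effective
    stack height, sigma_y/sigma_z: horizontal/vertical dispersion
    parameters at the downwind distance of interest. The second vertical
    term reflects the plume off the ground surface.
    """
    lateral = np.exp(-y ** 2 / (2.0 * sigma_y ** 2))
    vertical = (np.exp(-(z - H) ** 2 / (2.0 * sigma_z ** 2)) +
                np.exp(-(z + H) ** 2 / (2.0 * sigma_z ** 2)))
    return Q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Concentration on the plume centreline at ground level, with dispersion
# parameters assumed for roughly 500 m downwind.
print(gaussian_plume(y=0.0, z=0.0, Q=50.0, u=3.0,
                     H=40.0, sigma_y=36.0, sigma_z=18.0))
```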

Keywords: mining area, environmental impacts, mine environmental GIS (MEGIS), environmental information science (EIS)

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: HairongZhang2008accuracy.pdf (196.14 KB)

A Multi-Scale Image Segmentation Algorithm Based on the Cloud Model

Weihong Cui +, Zequn Guan and Kun Qin
School of Remote Sensing and Information Engineering, Wuhan University,  129 Luoyu Road, Wuhan 430079, P.R. China

Abstract. A new method of image segmentation based on cloud model theory is proposed in this paper. A major contribution of this work is the incorporation of image uncertainty into the segmentation algorithm. Segmentation is realized in three stages. First, cloud model theory is used to transform the image's qualitative model into its quantitative model (a concept tree). Second, a concept-climbing strategy is used to obtain concepts at different levels, which represent objects at different levels. Finally, each pixel is assigned to a concept. This process generates a scale-space hierarchical tree that induces a segmentation without a priori knowledge. Experimental results on natural images, evaluated with respect to the concept tree and the segmentation, show that this multi-scale segmentation algorithm extracts objects at different levels well and is good at resolving the uncertain edges between different objects.

Keywords: multi-scale, image segmentation, uncertainty, cloud model, concept tree

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: WeihongCui2008accuracy.pdf (498.42 KB)

A New Method for Multi-Temporal SAR Image-Based Change Detection

Juntuan Zhang 1, 2, 3, Shiqi Huang 3 +, Guangliang Zhu 3 and Jun Lin 1
1 Texas Instruments-Jilin University DSPs Laboratory, Jilin University, 130026, Changchun, P.R. China
2 Xi’an Communication Institute, Xi’an, 710106, P.R. China
3 Xi’an Research Inst. of Hi-Tech, Hongqing Town, Xi’an, 710025, P.R. China

Abstract. Synthetic aperture radar (SAR) imaging can acquire remote sensing data in all weather conditions and at any time, so SAR image change detection techniques have a great advantage for sudden natural and man-made disasters. However, the inherent speckle noise of SAR images severely hampers change detection applications. SAR image data are in general non-Gaussian, which satisfies the conditions of independent component analysis (ICA) theory, and, importantly, both ICA and the wavelet transform can reduce speckle noise. Therefore, a new change detection algorithm for multi-temporal SAR images based on ICA and the stationary wavelet transform (ICA-SWT-CD) is proposed in this paper. The merit of the algorithm is that it is insensitive to speckle noise. Comparative experiments on real SAR image data verify that the proposed algorithm is effective and feasible.

Keywords: SAR image, change detection, independent component analysis, stationary wavelet transform

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: JuntuanZhang2008accuracy.pdf (468 KB)

A Realistic Structure Model for Forest Radiosity Simulation on Large Scales

Huaguo Huang 1, 2, Min Chen 1, Qinhuo Liu 1, Qiang Liu 1, Yang Zhang 1 and Wenhan Qin 1 +
1 State Key Laboratory of Remote Sensing Science, Jointly Sponsored by the Institute of Remote Sensing Applications of Chinese Academy of Sciences and Beijing Normal University, 100101, Beijing, P.R. China
2 Key Laboratory for Silviculture and Conservation, Ministry of Education; College of Forestry, Beijing Forestry University, 100083, Beijing, P.R. China

Abstract. The Radiosity-Graphics Model (RGM) is an important branch of computer simulation models for vegetation BRDF. Because the radiosity method is based on a global solution technique and computing capability is limited, RGM cannot handle scenes beyond a certain number of polygons (estimated at 500,000 for a normal PC). RGM has therefore only been used for small-scale scenes such as farmland, lawns or homogeneous forest. However, natural terrain is commonly rugged, so the main work of this paper is to extend RGM to simulate forest radiosity over complex topography and to put simulations of large-scale forest scenes into practice. Based on RGM, we focus on methods for constructing a large-scale virtual forest scene and use a scene division method to simulate large-scale forest radiosity under complex topographic conditions. Our work can be described as follows. (1) Virtual forest scene generation combined with a DEM. (2) Scene division: we implement a scene division method, correcting the shading effects of the sub-scenes and considering multiple scattering. (3) BRF of the whole scene: we implement an algorithm that merges the simulation results from all the divided sub-scenes and calculates the BRF of the whole scene. (4) Evaluation and application: by comparing the sub-scene method with the original RGM model in a relatively small scene, the code is confirmed to be correct, and by applying the new model to a conifer forest scene on the GAUSS terrain from RAMI3 (http://rami-benchmark.jrc.it), we verify the plausibility of the simulation results. This application fills the gap left by the absence of RGM from the large-scale scene simulations in RAMI3. Although good results have been achieved, some limitations and errors of the scene division method are also discussed, and future research directions are proposed.

Keywords: RGM, large scale, BRDF, scene division

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: HuaguoHuang2008accuracy1.pdf (802.09 KB)

A Remote Sensing Feature Discretization Method Accommodating Uncertainty in Classification Systems

Guifeng Zhang, Zhaocong Wu and Lina Yi
School of Remote Sensing and Information Engineering, Wuhan University, Wuhan 430079, China

Abstract. Most classification methods used in remote sensing, such as rough set approaches, can only process discrete feature data, so feature discretization plays a very important role in remote sensing image classification systems. At present, remote sensing features are generally discretized using methods borrowed from fields other than remote sensing. Because these methods do not consider the uncertainty of the classification system, it cannot be predicted whether the discretization will affect the classification accuracy. This paper introduces a discretization method that takes the uncertainty of the classification system into account. It comprises three components: the construction of the initial set of candidate cut points, the selection of cut points based on information entropy, and the deletion of redundant cut points. All three parts are executed iteratively; the first two proceed from top to bottom, while the last proceeds from bottom to top. The stopping criterion of the iterative process is a threshold representing the maximum permissible change in the uncertainty of the classification. Therefore, the change in the uncertainty of the classification system caused by discretization is limited to a certain range, and its influence on classification can be predicted and controlled. The experiments show that the proposed method produces results comparable to those of Ent-MDLC and can lessen the influence of the discretization on classification accuracy.
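
The entropy-driven selection of cut points can be sketched as below: among candidate cuts (midpoints between adjacent sorted feature values), choose the one that minimises the class-label entropy weighted by the sizes of the two resulting intervals. This is the generic entropy criterion used in Ent-MDLC-style methods, not the paper's full iterative procedure with its uncertainty threshold.

```python
import numpy as np
from collections import Counter

def entropy(labels):
    """Shannon entropy of a sequence of class labels."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def best_cut(values, labels):
    """Candidate cut point minimising the size-weighted class entropy."""
    order = np.argsort(values)
    v, y = np.asarray(values)[order], np.asarray(labels)[order]
    best = (None, np.inf)
    for i in range(1, len(v)):
        if v[i] == v[i - 1]:
            continue
        cut = 0.5 * (v[i] + v[i - 1])
        w_left, w_right = i / len(v), (len(v) - i) / len(v)
        score = w_left * entropy(y[:i]) + w_right * entropy(y[i:])
        if score < best[1]:
            best = (cut, score)
    return best

values = [0.12, 0.15, 0.31, 0.45, 0.47, 0.80, 0.85]
labels = ["water", "water", "water", "soil", "soil", "soil", "soil"]
print(best_cut(values, labels))   # a cut near 0.38 separates the classes cleanly
```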

Keywords: uncertainty, discretization, segment range, rough entropy, accuracy

In: Wan, Y. et al. (eds) Proceedings of the 8th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, World Academic Union (Press).

Attachment: GuifengZhang2008accuracy.pdf (852.91 KB)

A Scale Transform Method for Leaf Area Index Retrieved from Multi-Resolutions Remote Sensing Data

Xin Tao, Binyan Yan, Daihui Wu, Wenjie Fan +, and Xiru Xu 
Institute of Remote Sensing and GIS, Peking University, Beijing, 100871, China

Abstract. The scaling effect of leaf area index (LAI) in remote sensing means that LAI values derived from images of different resolutions differ. The foundation for discussing the LAI scaling effect is to retrieve LAI values accurately from remote sensing images while trying to exclude the influence of other factors. A hybrid model for continuous vegetation is chosen to retrieve LAI from 2.5 m and 10 m SPOT reflectance images as well as from 250 m and 500 m MODIS data. The task of scale transformation of LAI is to derive the expected LAI value at one specific scale from values at other scales. A formula for transforming LAI values between scales is presented and successfully applied to these remote sensing images. The application results match well with ground truth.
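
As a hedged illustration of why a non-linear retrieval produces such a scaling effect, and of the kind of correction a scale transform can apply (this is the generic second-order Taylor argument, not necessarily the authors' exact formula): if LAI is retrieved through a non-linear function $f$ of reflectance $x$, then

\[
\mathrm{E}\big[\,f(x)\,\big] \;\approx\; f\!\left(\mathrm{E}[x]\right) \;+\; \tfrac{1}{2}\,f''\!\left(\mathrm{E}[x]\right)\,\mathrm{Var}(x),
\]

so the LAI obtained from a coarse pixel, $f(\mathrm{E}[x])$, differs from the average of the fine-scale LAI values, $\mathrm{E}[f(x)]$, by a term driven by the sub-pixel variance; a scale transform estimates and compensates for this term.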

Keywords: scaling transform, leaf area index (LAI), spatial scaling, non-linearity

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
XinTao2008accuracy.pdf305.31 KB

A Study on the LAI Up-Scaling Based on Mathematic Transformation

Zhe Yan 1, Hua Yang 2 + and Yuan Chai 2 
1 Nanjing University Library, Nanjing 210093, China
2 State Key Laboratory of Remote Sensing Science, Jointly Sponsored by Beijing Normal University and  Institute of Remote Sensing Applications of Chinese Academy of Sciences, School of Geography and Remote Sensing Science, Beijing Normal University, Beijing 100875, China.

Abstract. This paper investigates how mathematical transformations can be applied to scaling in remote sensing, taking LAI (Leaf Area Index) up-scaling with Fourier and wavelet transformations as an example. Commonly, a larger-scale process is acquired by averaging the smaller-scale remotely sensed process, but the high-frequency components are eliminated by the averaging operation. The Fourier transformation is essentially a low-pass filter, so the outline information of a high-resolution remotely sensed image can be obtained with it; however, some detailed information is lost at the same time. Therefore, the wavelet transformation is applied, and the up-scaled image is acquired by combining the detailed information with the outline information. Test results show that the overall evaluation index suggested in the paper is correct and reasonable. A transfer function related to a scale-correction factor is also introduced into this up-scaling method to improve the results, but it is data-dependent. Further study on the scale-correction factor and the transformation parameters is under way.

Keywords: LAI, up-scaling, transfer function, Fourier transformation, wavelet transformation.

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
ZheYan2008accuracy.pdf315.33 KB

A System for Automatic Processing of MODIS L1B Data

Lingkui Meng +, Liang Tao, Jiyuan Li and Chunxiang Wang
School of Remote Sensing and Information Engineering, Wuhan University,
129 Luoyu Road, Wuhan 430079, P.R. China

Abstract. MODIS is an important sensor of the EOS program. In this paper we introduce the main characteristics of MODIS L1B data and its application status, and then describe the design of the MODIS L1B Data Automatic Processing System, explaining its concept, main functions and implementation. The products of the system include L1B radiance products, L1B reflectance products, daily NDVI mosaic images, colour-composite mosaic images, brightness-temperature mosaic images, ten-day NDVI mosaic images, etc. Finally, we summarize the system, give a spatial accuracy assessment and an optimization strategy, and point out future work.
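
As a small illustration of the NDVI mosaic products mentioned above (band handling and the compositing rule are simplified assumptions; the system's actual ENVI-based implementation is not shown):

    import numpy as np

    def ndvi(red, nir):
        """NDVI from calibrated reflectance bands (for MODIS, band 1 = red, band 2 = NIR)."""
        red = red.astype(np.float64)
        nir = nir.astype(np.float64)
        denom = nir + red
        return np.where(denom > 0, (nir - red) / denom, np.nan)

    def tenday_composite(daily_ndvi_grids):
        """Maximum-value composite over a ten-day stack of co-registered NDVI grids
        (a common compositing rule, used here for illustration)."""
        return np.nanmax(np.stack(daily_ndvi_grids), axis=0)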

Keywords: MODIS L1B, automatic process, ENVI, bow-tie, mosaic, NDVI, brightness temperature. 

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
LingkuiMeng2008accuracy.pdf521.01 KB

A Fuzzy-Based Tool for Spatial Reasoning: A Case Study on Soil Erosion Hazard Prediction

Zuhal Akyurek 1 and Kıvanç Okalp 2
1 METU Civil Eng. Dept. 
06531 Ankara, Turkey
Tel.: + 90 312 2102481; Fax: + 90 312 210 7956
zakyurek@metu.edu.tr
2 METU Geodetic and Geographic Information Technologies, Institute of Natural and Applied Sciences 
06531 Ankara , Turkey

Abstract. Fuzzy set theory provides a formal system for representing and reasoning with uncertain information. The linguistic variable concept in a fuzzy logic system makes it possible to handle numerical data and linguistic knowledge simultaneously. Even L. A. Zadeh (1965), who formulated the initial statement of fuzzy set theory, at first never expected fuzzy sets to be used in consumer products or in geographic information. A collection of objects of any kind forms a classical set, and the objects themselves are called elements or members of the set. Since classical set theory is used in conventional decision-making systems to model the uncertain real world, the natural variability in environmental phenomena cannot be modelled appropriately: the pervasive imprecision of the real world is unavoidably reduced to artificially precise spatial entities when conventional crisp logic is used for modelling. In this study, fuzzy sets and fuzzy logic algebra were used to predict soil erosion hazard. Annual soil loss rates were estimated using the Universal Soil Loss Equation (USLE), which has been used for five decades all over the world. Fuzzification of the landscape elements used in the model was done with a fuzzy Semantic Import modelling approach. FuzzyCell, which has been developed on top of commercial GIS software, namely ArcMap, was used to implement the fuzzy algebra operators for determining the likelihood that an area belongs to the low, moderate or high erosion hazard class. The results were compared with the traditional USLE model results. When the results of the traditional and fuzzified USLE implementations are compared, it is observed that the traditional USLE overestimates the areas prone to low-level erosion risk as well as the areas prone to high-level erosion risk. Although the model provides qualitative estimations, it proved very useful for exploring relationships and incorporating uncertainty in spatial decision making.
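
A minimal sketch of the two ingredients the abstract combines, the USLE product and a Semantic Import style membership function (the factor values, breakpoints and the minimum operator below are illustrative assumptions, not the study's calibrated model):

    import numpy as np

    def usle(R, K, L, S, C, P):
        """Universal Soil Loss Equation: annual soil loss A = R*K*L*S*C*P."""
        return R * K * L * S * C * P

    def linear_membership(x, low, high):
        """Semantic-import style membership: 0 below `low`, 1 above `high`, linear in between."""
        return np.clip((x - low) / (high - low), 0.0, 1.0)

    # per-cell factor grids (toy 2x2 example; units depend on the input data)
    R, K = np.array([[300., 450.], [500., 600.]]), np.full((2, 2), 0.3)
    LS, C, P = np.array([[1.2, 2.0], [0.8, 3.5]]), np.full((2, 2), 0.2), np.ones((2, 2))

    A = usle(R, K, LS, 1.0, C, P)                               # annual soil loss grid
    high_hazard = linear_membership(A, low=20.0, high=80.0)     # fuzzy "high erosion hazard"
    steep = linear_membership(LS, low=1.0, high=3.0)            # fuzzy "steep terrain"
    combined = np.minimum(high_hazard, steep)                   # fuzzy AND (minimum operator)
    print(combined)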

Keywords: fuzzy logic, spatial reasoning, uncertainty, soil erosion, USLE

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Akyurek2006accuracy.pdf5.06 MB

Accuracy Evaluation and Gross Error Detection in Digital Elevation Models Based on LIDAR Data

Weiwei Sun 1 +, Chun Liu 1, 2 and Hangbin Wu 1
1 Department of Survey and Geo-Informatics, Tongji University, Shanghai, 200092, China 
2 Key Laboratory of Advanced Engineering Surveying of SBSM, Shanghai, 200092, China

Abstract. With the wider use of LIDAR data in DEM products, especially in urban-area DEM modelling, research on grid DEMs derived from LIDAR data is increasingly urgent. Users also wish to become acquainted with LIDAR data, and especially with the accuracy of grid DEMs based on it, so it is necessary to explore the accuracy of DEMs derived from LIDAR data. In this paper, firstly, a local-detection method using moving windows is introduced to detect and eliminate discrete errors, purifying the LIDAR data for later DEM modelling; secondly, from the remaining regular ground points, a finer grid DEM is generated by fitting a second-degree curved surface under the least-squares criterion and interpolating grid points; finally, the RMSE is used to analyse the precision of the resulting grid DEM. As a case study, 4,000 preprocessed LIDAR points were selected to test the method, which proved feasible. The experiment shows that the accuracy of the DEM based on LIDAR data is approximately 0.6 m. This precision information will help users make the best use of LIDAR data and will be conducive to its practical applications.
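
A minimal sketch, under simplifying assumptions, of the two processing steps the abstract outlines: a moving-window test to flag discrete (gross) errors, and a least-squares second-degree surface fitted to the remaining points. Window size, threshold and the RMSE helper are illustrative, not the paper's exact parameters.

    import numpy as np

    def flag_gross_errors(x, y, z, radius=10.0, k=3.0):
        """Keep points whose elevation stays within k standard deviations
        of the local (moving-window) mean."""
        keep = np.ones(len(z), dtype=bool)
        for i in range(len(z)):
            near = (np.abs(x - x[i]) < radius) & (np.abs(y - y[i]) < radius)
            zi = z[near]
            if len(zi) > 3 and abs(z[i] - zi.mean()) > k * (zi.std() + 1e-9):
                keep[i] = False
        return keep

    def fit_quadric(x, y, z):
        """Least-squares second-degree surface z = a0 + a1 x + a2 y + a3 x^2 + a4 xy + a5 y^2."""
        A = np.column_stack([np.ones_like(x), x, y, x**2, x*y, y**2])
        coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
        return coeffs

    def rmse(predicted, observed):
        return float(np.sqrt(np.mean((predicted - observed) ** 2)))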

Keywords: LIDAR, accuracy evaluation, gross error, local detection

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
WeiweiSun2008accuracy.pdf330.57 KB

Accuracy Issues Associated with Satellite Remote Sensing Soil Moisture Data and Their Assimilation

Xiwu Zhan 
NOAA NESDIS Center for Satellite Applications and Research, Camp Springs, MD 20746, USA

Abstract. Satellite remote sensing is widely used for monitoring the changing planet Earth, and many remote sensing data products are generated and used every day. Among these data products are microwave remote sensing retrievals of land surface soil moisture. Soil moisture often limits the exchanges of water and energy between the atmosphere and the land surface, controls the partitioning of rainfall among evaporation, infiltration and runoff, and affects vegetation photosynthetic rate and soil microbiological respiratory activity. The accuracy of these products plays an essential role in the success of their applications, and accurate measurement of this variable across the global land surface is thus required for global water, energy and carbon cycle sciences and for many civil and military applications. Currently available satellite soil moisture products are generated from the low-frequency channel observations of the currently flying microwave sensors (the TRMM Microwave Imager, TMI; the Aqua Advanced Microwave Scanning Radiometer, AMSR-E; and the Naval Research Laboratory's WindSat). However, because of several accuracy issues, none of these soil moisture products has yet been used in operational applications. The most apparent accuracy issue is that the soil moisture retrievals from the three different sensors differ significantly from each other even when they are retrieved with the same algorithm; this may be caused by calibration errors in their brightness temperatures, and a Simultaneous Conical-scanning Overpass (SCO) method is tested to address it. Secondly, satellite sensor footprints are usually several orders of magnitude larger than the local points where in situ soil moisture measurements for validation are obtained, so how to appropriately compare the satellite retrievals over large areas with the in situ measurements becomes an important issue; a point-to-pixel mapping approach is examined as a solution. The third issue is how to handle biases between the soil moisture retrievals and land surface model (LSM) simulations when the retrievals are assimilated into the LSM. Existing solutions for this issue are summarized, and whether these error-handling strategies are effective and reliable is discussed. Finally, general conclusions of this study are presented for users interested in satellite soil moisture data assimilation.
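
One widely used bias-handling strategy of the kind the abstract surveys is CDF matching, which rescales the satellite retrievals so that their cumulative distribution matches the model climatology before assimilation. The sketch below is a generic quantile-mapping implementation under that assumption, not the author's or NESDIS's operational code.

    import numpy as np

    def cdf_match(satellite_sm, model_sm):
        """Rescale a satellite soil-moisture time series so its empirical CDF
        matches that of the land-surface-model climatology (same grid cell)."""
        sat_sorted = np.sort(satellite_sm)
        mod_sorted = np.sort(model_sm)
        # empirical non-exceedance probability of each satellite value
        ranks = np.searchsorted(sat_sorted, satellite_sm, side="right") / len(sat_sorted)
        # model value at the same quantile
        model_quantiles = np.linspace(0.0, 1.0, len(mod_sorted))
        return np.interp(ranks, model_quantiles, mod_sorted)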

Keywords: data accuracy, soil moisture, satellite remote sensing, data assimilation

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
XiwuZhan2008accuracy.pdf337.89 KB

Accuracy in Remotely Sensed Urban Greenery Land Cover

Junqi Zhou + and Jiabing Sun
School of Remote Sensing and Information, Wuhan University, Wuhan 430079, China

Abstract. This paper analyses and compares the accuracy of urban greenery extraction from images of different sensors using different methods. The accuracy is compared for the same area using three images of different resolution: a 30 m TM image, a 10 m SPOT image and a 2.44 m QuickBird image. Visual interpretation, statistics-based classification and object-based classification are used to extract urban greenery from the three kinds of sensor images. For the high-resolution image, urban greenery extraction achieves higher accuracy with object-based classification than with statistics-based classification. The results show that the samples and the statistics are the main factors affecting the accuracy of statistics-based classification, while the segmentation scale is the main factor affecting the accuracy of object-based classification.

Keywords: remote sensor image, image classification, greenery extraction, accuracy comparison.

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
JunqiZhou2008accuracy.pdf339.86 KB

Accuracy of Matching between Probe-Vehicle and GIS Map

Huifeng Ji + and Aigong Xu
School of Geomatics, Liaoning Technical University, Fuxin, Liaoning, 123000, China 

Abstract. Map matching is the most basic problem in integrating GPS with GIS, and the efficiency and accuracy of the matching algorithm directly influence floating-car applications. A regression equation is established from factors such as the perpendicular distance between the GPS point and the road segments, the heading angle of the floating car, and the angles constructed by the previous two points and the following two points. The probe-vehicle points are located one by one by the algorithm. The matching method can resolve both the flutter between parallel segments and mismatches at intersections. The method also takes into account the disadvantages of point-to-line matching and the advantages of line-to-line matching, and it guarantees both speed and accuracy.
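
A minimal sketch of one of the geometric factors the regression uses, the perpendicular (shortest) distance from a GPS fix to a candidate road segment; the regression itself and the angle factors are not reproduced here, and coordinates are assumed to be in a projected system.

    import numpy as np

    def point_to_segment_distance(p, a, b):
        """Shortest distance from GPS point p to the road segment with endpoints a and b."""
        p, a, b = map(np.asarray, (p, a, b))
        ab = b - a
        t = np.dot(p - a, ab) / np.dot(ab, ab)   # projection parameter along the segment
        t = np.clip(t, 0.0, 1.0)                 # clamp to the segment ends
        closest = a + t * ab
        return float(np.linalg.norm(p - closest))

    print(point_to_segment_distance((3.0, 4.0), (0.0, 0.0), (10.0, 0.0)))   # 4.0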

Keywords: probe-vehicle; matching algorithm; parallel segments; intersections

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press). 

AttachmentSize
HuifengJi2008accuracy.pdf463.71 KB

An Image Decomposition Model Using the Dual Method and H−1 Norm

Qibin Fan + and Tao Zhang
School of Mathematics and Statistics, Wuhan University, Wuhan 430072, China 

Abstract. In the image denoising process it is difficult to separate texture from noise. In order to separate them, we must know their different characteristics, or use some metrics (such as norms) to distinguish them. In this paper, we propose a new model which decomposes an initial image f into three components: a structure part u, a texture part v and a noise part w, and we use the H−1 norm investigated in the work of Aujol and Chambolle to separate texture from noise.

Keywords: metric, total variation, Sobolev space, H−1 norm, texture, dual transform

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).


AttachmentSize
QibinFan2008accuracy.pdf380.23 KB

An Improved Algorithm for Image Fractal Dimension

Zifan Yu 1 and Lihong Hu 2 +
1 School of Resource & Environment Science, Wuhan University, Wuhan 430079, China
2 School of Remote Sensing & Information Engineering, Wuhan University, Wuhan 430079, China

Abstract. There are several methods for estimating fractal dimension under the fractional Brownian motion model. In this paper, one of them, the variance method, is analysed in detail and some problems with it are pointed out. To resolve these problems, an improved method is proposed. To validate it, a comparative experiment of the two methods is designed: a SPOT image is selected, and five categories in the image, flourishing fields, bare land, residential areas, water areas and mountainous areas, are taken as examples. The experiment shows that the result of the improved method is consistent with human perception and that the improved method is also helpful for classification.
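
A minimal sketch of the baseline variance method for a grey-level image modelled as fractional Brownian motion (the paper's improvements are not reproduced; the lags and the log-log regression details are illustrative assumptions). The mean squared grey-level increment grows as the lag to the power 2H, and the fractal dimension of the image surface is D = 3 − H.

    import numpy as np

    def fractal_dimension_variance(image, lags=(1, 2, 4, 8)):
        """Estimate D = 3 - H from the log-log regression of the mean squared
        grey-level increment against lag distance."""
        img = image.astype(np.float64)
        log_lag, log_var = [], []
        for d in lags:
            dx = img[:, d:] - img[:, :-d]        # horizontal increments at lag d
            dy = img[d:, :] - img[:-d, :]        # vertical increments at lag d
            v = np.mean(np.concatenate([dx.ravel() ** 2, dy.ravel() ** 2]))
            log_lag.append(np.log(d))
            log_var.append(np.log(v))
        slope, _ = np.polyfit(log_lag, log_var, 1)   # slope = 2H
        return 3.0 - slope / 2.0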

Keywords: fractal dimension, fractional Brownian motion, texture, classification

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
ZifanYu2008accuracy.pdf245.79 KB

Assessing Uncertainties for Lognormal Kriging Estimates

Jorge Kazuo Yamamoto +
 Department of Environmental and Sedimentary Geology, Institute of Geosciences, University of Sao Paulo, Brazil.

Abstract. Lognormal data call for lognormal kriging, in which the original data are transformed into logarithms. Ordinary lognormal kriging estimates are then back-transformed into the original scale of measurement by exponentiation. Evidently, we must add a non-bias term to the estimates before back-transforming them; after that, we have lognormal kriging estimates back-transformed into the original data scale. However, we have no idea about the errors, because they remain on the logarithmic scale. This paper presents a very simple way to bring the errors from the logarithmic domain back into the original data domain. Three data sets with different coefficients of variation are used to show how reliable the proposed procedure is. Furthermore, the non-bias term can also be back-transformed, giving the estimated smoothing error. This estimated smoothing error presents a reasonable correlation with the true smoothing error; in other words, when the back-transformed estimate is highly correlated with the actual unknown value, the estimated and true smoothing errors will also be positively correlated.
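
For reference, the standard back-transform with its non-bias term, for simple lognormal kriging, is

\[
Z^{*}(\mathbf{u}) \;=\; \exp\!\Big( Y^{*}(\mathbf{u}) \;+\; \tfrac{1}{2}\,\sigma_{K}^{2}(\mathbf{u}) \Big),
\]

where $Y^{*}$ is the kriging estimate of $\ln Z$ and $\sigma_{K}^{2}$ its kriging variance; ordinary lognormal kriging modifies the correction with the Lagrange multiplier, and the author's exact non-bias term may differ from this textbook form. The paper's contribution is to back-transform the error measure itself in an analogous way.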
 

Keywords: uncertainty, interpolation variance, smoothing error, lognormal kriging

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
Yamamoto2008accuracy1.pdf533.53 KB

Assimilation of Remote Sensing Data Products into Common Land Model for Evapotranspiration Forecasting

Chunlin Huang +, Xin Li, Jiemin Wang and Juan Gu 
Cold and Arid Region Environmental and Engineering Research Institute, CAS, Lanzhou, 730000, China

Abstract. Evapotranspiration (ET), the sum of water lost to the atmosphere from the soil surface through evaporation and from plant tissues via transpiration, is a vital component of the water cycle, and accurate measurements of ET are required for the global water and energy cycles. However, ET varies in time and space and is difficult to estimate because it depends on many interacting processes. At the local scale, ET may be accurately estimated from detailed ground observations; at the regional scale, sufficient ground observations will never be available. Remote sensing data provide spatially continuous information over vegetated surfaces, which compensates for the frequent lack of ground-measured variables and parameters required to apply local models at a regional scale. Optical remote sensing data are, however, strongly affected by atmospheric conditions, so uncertainty also exists in the estimation of ET from remote sensing. In this work, we develop a data assimilation scheme to improve the estimation of ET. The Common Land Model (CoLM) is adopted as the model operator to simulate the temporal variation of ET, and the ensemble Kalman filter is chosen as the data assimilation algorithm. The scheme can dynamically assimilate MODIS land products such as land surface temperature (LST) and leaf area index (LAI). The scheme is tested with automatic weather station (AWS) and flux tower data obtained from Xiaotangshan station in China. The results indicate that assimilating MODIS land products can improve the estimation of ET.
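
A minimal sketch of the ensemble Kalman filter analysis step used in schemes of this kind (the state vector, observation operator and error levels are toy assumptions; CoLM and the actual MODIS observation operators are not reproduced):

    import numpy as np

    def enkf_update(X, y, H, R, seed=0):
        """Perturbed-observation EnKF analysis.
        X : (n_state, n_ens) forecast ensemble
        y : (n_obs,) observation vector (e.g. MODIS LST/LAI)
        H : (n_obs, n_state) linear observation operator
        R : (n_obs, n_obs) observation-error covariance
        Returns the analysis ensemble."""
        n_state, n_ens = X.shape
        A = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
        P = A @ A.T / (n_ens - 1)                        # sample forecast covariance
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
        rng = np.random.default_rng(seed)
        Xa = np.empty_like(X)
        for i in range(n_ens):
            y_pert = y + rng.multivariate_normal(np.zeros(len(y)), R)
            Xa[:, i] = X[:, i] + K @ (y_pert - H @ X[:, i])
        return Xa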

Keywords: ET, data assimilation, Ensemble Kalman Filter, Common Land Model, MODIS

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
ChunlinHuang2008accuracy.pdf694.98 KB

Automatic Recognition and Localization of Ground Marked Points Based on Template

Hui Cao
School of Remote Sensing and Information Engineering, Wuhan University, Wuhan 430079, China

Abstract. Distributing ground marks as ground control points is a common method in photogrammetry, but in traditional photogrammetry the measurement of ground control points remains manual or semi-automatic. This paper presents a method based on standard templates and least-squares image matching, by which ground marks are automatically detected in the digital images that contain them. A template image library is designed according to the different shapes of the ground marks. First, the template images are used in a difference operation on the digital images to obtain energy images, from which the approximate positions of the marks are determined. Then, precise positioning is carried out by least-squares image matching between the standard template images and the approximate object image patches. Experiments show that the method has strong object identification and localization capabilities and can effectively solve the automatic identification and localization of marked points, thus achieving the automatic measurement of ground control points.

Keywords: object recognition, template match, object localization

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
HuiCao2008accuracy.pdf342.17 KB

Building Semantic Ontology Databases Based on Remote Sensing Images

Zhenfeng Shao 1, Jun Liu 2 and Xianqiang Zhu 1
1 State Key Laboratory for Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, 129 Luoyu Road, Wuhan 430079, China
2 School of Remote Sensing and Information Engineering, Wuhan University, Wuhan 430079, China

Abstract. This paper puts forward a novel method for automatically generating the elements necessary for building a sharing ontology database and studies the associated uncertainty problems. A concept called the sharing ontology is proposed, defined as a bridge between all kinds of spatial information systems involving domain spatial data acquisition, updating, transfer, storage, processing, analysis, information extraction, knowledge discovery and geo-services. By establishing the spatial information sharing ontology database, we can provide users with an integrated, semantics-based spatial information service and application platform, and then link all spatial information service platforms on the Internet at the semantic level, so as to provide a feasible way of achieving semantic-based spatial information sharing and interoperability. Through data mining based on remote sensing images and GIS databases, the sharing ontology database can be built automatically, which achieves semantic sharing of heterogeneous spatial information. Some innovative ideas of this research are: (1) raising the uncertainty problem of features extracted from remote sensing images to the level of information theory and exploring a consolidated mathematical expression relating information quantity and uncertainty for features mined from remote sensing images; (2) building the sharing ontology database by combining remote sensing and GIS spatial data, and establishing its feasibility in practice; (3) establishing a stable foundation for precise semantic sharing between heterogeneous spatial information systems.

Keywords:  sharing ontology database, domain ontology, semantic-based spatial information sharing and interoperability, uncertainty

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
ZhenfengShao2008accuracy2.pdf473.12 KB

Chlorophyll a (Chl-a) Concentration Measurement and Prediction in Taihu Lake Based on MODIS Image Data

Wanshun Zhang 1+, Caisheng Huang 1, Hong Peng 2, Yan Wang 1, Yanxin Zhao 1 and Tao Chen
1 School of Resource and Environmental Science, Wuhan University, Wuhan 430079, China
2 School of Water Resources and Hydropower, Wuhan University, Wuhan 430079, China

Abstract. Compared with chlorophyll-a (Chl-a) concentration measurement in the ocean, measurement in inland water bodies based on MODIS image data is far from mature. Inland lakes face a series of harsher pollution problems yet lack efficient methods for fast measurement and prediction, so the measurement and prediction of inland water pollution and eutrophication have become critical. Taihu Lake is one of the largest freshwater inland lakes in China and is also a most important water source. Eutrophication in Taihu Lake is becoming more serious as the economy of the Taihu Lake drainage basin develops; in summer 2007, algal blooms broke out widely across the lake. A method of Chl-a concentration measurement and prediction based on MODIS image data is presented in this paper. The method has two components. The first is the estimation of Chl-a concentration from MODIS imagery. DN values were derived from the MODIS image data and translated into normalized reflectance, and an image-based atmospheric correction method was applied in preprocessing to reduce atmospheric effects, including calculating and removing Rayleigh scattering and removing the aerosol contribution at the desired wavelengths. Based on the spectral characteristics of Chl-a, suitable MODIS bands and band combinations were correlated with Chl-a measurements, and the Chl-a concentration was estimated from a chlorophyll empirical algorithm and the remote sensing reflectance (Rrs); field data were used to correct the rough Chl-a concentration estimates. The second component is an eco-dynamic model of Taihu Lake, comprising a hydrodynamic model and an ecological model of the lake, which has been calibrated and tested with field-measured data. The hydrodynamic model was used to simulate the wind-driven flow field, and the future distribution of Chl-a concentration can be predicted by the ecological model. MODIS data for two dates, 8 May and 19 May 2007, over Taihu Lake, China, were used in this study. The results show that the most serious eutrophication occurs in the north of Taihu Lake and that the eutrophication state in the eastern part of the lake is better than in other regions, which agrees well with the local measured data. The numerical simulations also provided satisfactory results in comparison with the Chl-a distributions derived from MODIS. This approach could be applied to other coastal or inland regions for the measurement and prediction of Chl-a concentration, but the specific relationship between MODIS reflectance and Chl-a may vary with the water body; the presence of other constituents can also be investigated in further research.

Keywords: MODIS, Chl-a concentration, eco-dynamic model in Taihu Lake, measurement, prediction

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
WanshunZhang2008accuracy.pdf372.98 KB

Climate Change Impacts on Protea Species: PDEAR Model Predictions

Danni Guo 1, Renkuan Guo 2, Guy F. Midgley 1 and A. G. Rebelo 1 
1 Kirstenbosch Research Center, South African National Biodiversity Institute, Private Bag X7, Claremont 7735, Cape Town, South Africa
2 Department of Statistical Sciences, Univ. of Cape Town, Private Bag, Rondebosch 7701, Cape Town, South Africa

Abstract. One of the major concerns today is global warming and climate change impacts, and how they are changing the distribution and behaviour of plant species. Protea species in the Cape Floristic Region, South Africa, for example, are very sensitive to climate change. In this paper, we first argue for a random fuzzy error structure for spatial modelling accuracy, and then focus on the population category of rare Proteas with an estimated population size of 1 to 10 individuals per sample site, which is very small. We develop a bivariate partial differential equation associated regression (PDEAR) model for investigating the impacts of rainfall and temperature on the Protea species. Under the same average biodiversity structure assumptions, we explore the future spatial change patterns of Protea species under future (average) predicted rainfall and temperature. Our investigation shows that the impacts of global climate change on the distributional patterns of the endangered Protea species are significant.

Keywords: random fuzzy variable, bivariate partial differential equation associated regression model, Protea, South Africa, climate change, Cape Floristic Region

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
DanniGuo2008accuracy.pdf371.27 KB

Combined Fitting Based on Robust Trend Surface and Orthogonal Multiquadric with Applications in DEMs Fitting

Juqing Zhang 1 + and Pingzhi Liu 2
1 School of Geology Engineering and Geomatics, Chang’an University, Yanta Road, Xi’an 710054, China
2 Xi’an Research Institute of Surveying and Mapping, No.1 Mid-Yanta Road, Xi’an 710054, China

Abstract. Spatial data interpolation is an indispensable process in DEM construction. A combined fitting of a trend surface and multiquadric functions is proposed, which removes the trend of the DEM in the research area and then re-fits the residuals by surface fitting. In order to control the influence of outlying points on the surface fitting, robust fitting of the trend surface using equivalent weights is adopted. An adaptive node-selection method is proposed, based on the effect of each node on the fitting as calculated by orthogonal least squares; it not only ensures stability but also improves the precision of the fitting. A practical example of terrain fitting shows that the proposed combined method is effective in improving the quality of DEM fitting.
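
A minimal sketch of the combined fit: a second-order trend surface estimated by least squares, followed by multiquadric interpolation of the residuals. The robust equivalent-weight scheme and the adaptive node selection of the paper are not reproduced, and the multiquadric shape parameter is an illustrative assumption.

    import numpy as np

    def fit_trend(x, y, z):
        """Second-order polynomial trend surface by ordinary least squares."""
        A = np.column_stack([np.ones_like(x), x, y, x**2, x*y, y**2])
        c, *_ = np.linalg.lstsq(A, z, rcond=None)
        return c

    def eval_trend(c, x, y):
        return c[0] + c[1]*x + c[2]*y + c[3]*x**2 + c[4]*x*y + c[5]*y**2

    def multiquadric_fit(x, y, residuals, eps=50.0):
        """Coefficients of a multiquadric surface interpolating the trend residuals."""
        d2 = (x[:, None] - x[None, :])**2 + (y[:, None] - y[None, :])**2
        Phi = np.sqrt(d2 + eps**2)
        return np.linalg.solve(Phi, residuals)

    def multiquadric_eval(coef, xn, yn, xq, yq, eps=50.0):
        d2 = (xq[:, None] - xn[None, :])**2 + (yq[:, None] - yn[None, :])**2
        return np.sqrt(d2 + eps**2) @ coef

    # combined prediction at query points (xq, yq):
    #   z_hat = eval_trend(c, xq, yq) + multiquadric_eval(coef, x, y, xq, yq)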

Keywords: trend surface fitting, multiquadric, node, fitting

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
JuqingZhang2008accuracy.pdf303.52 KB

Combining Transition Probabilities in the Prediction and Simulation of Categorical Fields

Guofeng Cao + and Phaedon C. Kyriakidis
Department of Geography, University of California Santa Barbara, U.S.A.

Abstract. Categorical spatial data, such as land cover classes or soil types, are important data sources in many scientific fields, including geography, geology and environmental sciences. In geostatistics, indicator kriging (IK) and indicator cokriging (ICK) are typically used for estimating posterior probabilities of class occurrence at any location in space given known class labels at data locations within a neighborhood around that prediction location. In addition, IK and ICK constitute the core of the sequential indicator simulation (SIS) algorithm used for generating realizations of categorical fields. Both IK and ICK require a set of consistently specified indicator (cross)covariance or (cross)variogram models, whose parameter inference can become cumbersome. In addition, IK and ICK may yield estimated probabilities that do not satisfy fundamental probability constraints. To overcome these limitations, transition probability diagrams have been used as an alternative measure of spatial structure for categorical data. More recently, a Spatial Markov Chain (SMC) model was developed for combining transition probabilities into posterior probabilities of class occurrence, under the assumption of conditional independence between neighboring data. This paper surveys alternative approaches for combining pre-posterior (two-point) auto- and cross-transition probabilities of class occurrence between any datum location and a prediction or simulation location into conditional or posterior (multi-point) probabilities. Advantages and disadvantages of existing approaches are highlighted. Last, a proposal is made to synthesize elements of geostatistical and Markov Chain approaches for combining transition probabilities for the prediction and simulation of categorical fields.

Keywords: indicator cokriging, transition probability, Markov Chain, geostatistics, spatial statistics

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
GuofengCao2008accuracy.pdf541.17 KB

Consistency Analysis of Multi-Source Remotely Sensed Images for Land Cover Classification

Peijun Du 1 +, Guangli Li 1, Linshan Yuan 1 and Paul Aplin 2 
1 China University of Mining and Technology, Xuzhou, Jiangsu Province, 221116, China 
2  University of Nottingham, University Park, Nottingham NG7 2RD, UK

Abstract. The importance of accurately describing the nature of land cover resources is increasing. With the aim of analysing the consistency of remotely sensed images from different sensors for land cover classification, three medium-spatial-resolution optical image sources covering Xuzhou city, CBERS, ETM and ASTER, were classified in this study. Land cover classification was conducted by Maximum Likelihood Classification (MLC), Support Vector Machines (SVM) and Decision Tree (DT). Comparison of the classification results showed that SVM performed best, so the SVM results were used for the consistency analysis. The results suggest that different images obtained around the same time can lead to dissimilar classification results. Consistency analysis was carried out on the experimental results of two groups of data. Apart from the individual data sources, the two types of image data in each group were combined to form a mixed multi-source dataset and then used as the input to the SVM classifier. This proved that a mixed dataset consisting of multi-source data can improve the classification performance over a single image, so the collaborative use of multi-source data is feasible for land cover classification.

Keywords: consistency analysis, ASTER, CBERS, Landsat ETM+, classification, support vector machine

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
PeijunDu2008accuracy2.pdf387.34 KB

Corn 3D Reconstruction for Remote Sensing Validation

Wuming Zhang 1, 2, 3 +, Haoxing Wang 1, 2, 3, Guoqing Zhou 1, 2, 3 and Guangjian Yan 1, 2, 3 
1 School of Geography, Beijing Normal University
2 State Key Laboratory of Remote Sensing Science, Jointly Sponsored by Beijing Normal University and  Institute of Remote Sensing Applications of Chinese Academy of Sciences, Beijing, China
3 Beijing Key Laboratory for Remote Sensing of Environment and Digital Cities, Beijing, China

Abstract. Ground-based leaf area and leaf direction measurements are crucial for remote sensing validation of leaf area index (LAI) and leaf angle distribution (LAD) products. Existing methods have drawbacks: the instruments are expensive, or the operation is time-consuming and labour-intensive. Tailored to the characteristics of corn, an image-based method is proposed to avoid these limitations. On the basis of photogrammetry, a 3D corn model is reconstructed from captured images, and accurate leaf areas and leaf directions can then be measured on this 3D model. The extension from individual to group measurement and its application in remote sensing LAI inversion and validation are also discussed.

Keywords: corn, leaf area, three-dimensional reconstruction, remote sensing validation

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
WumingZhang2008accuracy.pdf497.61 KB

Data Quality—What Can an Ontological Analysis Contribute?

Andrew U. Frank 
Department of Geoinformation Technical University Vienna Gusshausstrasse 27-29/E127-1 A-1040 Vienna, Austria

Abstract. Progress in research on data quality is slow and relevance of results for practice is low. Can an ontological analysis make significant contributions? The “road block” in data quality research seems to be an ontological one. Approaching “data quality” with an ordinary language philosophy method reveals the inherent contradiction in the concept. The ontological analysis reveals the necessity to separate the ontology (reality) proper from the epistemology (data).  Data quality reveals itself when data is used, which focuses our attention on the double linkage between reality and data: (1) the observation that reflects reality into the data and (2) the decision that links the plan to the changes in reality. The analysis of the processes leading from raw observations to decisions leads to operational definitions for “fitness for use” and an effective method to assess the fitness of data for a decision. Novel is the consideration of data quality as transformation through the whole process from data collection to decision.

Keywords: ontology, fitness for use

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
Frank2008accuracy.pdf186.25 KB

Degree of Site Suitability Measurement in a GIS: The Effect of Various Standardization Methods

Badri Basnet
Australian Centre for Sustainable Catchments and Faculty of Engineering and Surveying
University of Southern Queensland, Toowoomba, Queensland, 4350, Australia

Abstract. Suitability analysis is performed to identify sites (usually grid cells or pixels) suitable for a specific purpose so that management decisions can be made in a site-specific manner. However, sites identified as suitable are rarely equally suitable in the real world. Measurement of the degree of site suitability (DoSS) is therefore crucial to be able to manage sites in a truly site-specific manner. Conventionally, site suitability analysis is performed using a weighted linear combination (WLC) of standardized input factors within Geographic Information Systems (GIS). The input factors used in such an analysis can be standardized in a number of different ways, and the method of standardization could have varying effects on the DoSS measurement; however, this effect is yet to be assessed and quantified. The objective of this study was therefore to quantify the effect of various standardization methods on the DoSS measurement. In this study, the DoSS of an agricultural field was measured for site-specific application of animal waste as fertilizer. Seven input factors were used in the analysis. They were standardized using (a) Boolean logic, (b) discrete classification, and (c) continuous rescaling methods. The Boolean logic method classified factor attributes as either 'suitable' (class weight of 100) or 'unsuitable' (class weight of zero). The discrete classification method grouped attributes into up to five classes of approximately equal size, weighted with equally incremented class weights adding up to 100. The continuous rescaling method rescaled the attribute range to a suitability value of 0 to 100. The standardized input factors were then combined using a WLC model to produce composite suitability maps, whose DoSS was assessed using the weighted average (WA), coefficient of variation (CV) and value range (VR). Standardization using the Boolean logic method was of no consequence, since it did not produce different degrees of site suitability: all suitable grid cells were equally suitable (WA = 700, CV = 0 and VR = 0). The discrete classification method produced diverse suitability values, with the weighted average ranging between 221.9 (CV = 6.3, VR = 100) and 700 (CV = 0, VR = 0) depending on the number of classes, which highlighted the measurement inconsistencies of this method; further investigation is therefore essential to quantify its effect on the DoSS measurement. The continuous rescaling method produced a DoSS map with a WA of 419.05 (CV = 8.04, VR = 332). This method of standardization is more consistent in the DoSS measurement and hence potentially useful for future DoSS assessment. However, there is a need to further assess the effect of rescaling using different attribute endpoint values on the DoSS measurement.
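
A minimal sketch of the continuous-rescaling standardization and the weighted linear combination used to build a composite suitability surface; the factor grids and weights below are illustrative assumptions, not the study's seven calibrated factors.

    import numpy as np

    def rescale_0_100(factor):
        """Continuous rescaling of a factor's attribute range to a 0-100 suitability score."""
        fmin, fmax = factor.min(), factor.max()
        return 100.0 * (factor - fmin) / (fmax - fmin)

    def weighted_linear_combination(factors, weights):
        """Composite suitability as the weighted sum of standardized factor grids."""
        stack = np.stack([rescale_0_100(f) for f in factors])
        w = np.asarray(weights, dtype=float)[:, None, None]
        return np.sum(w * stack, axis=0)

    # toy example: two 2x2 factor grids (e.g. slope and distance to a waterway)
    slope = np.array([[2.0, 5.0], [8.0, 1.0]])
    dist = np.array([[100.0, 400.0], [250.0, 50.0]])
    suitability = weighted_linear_combination([slope, dist], weights=[0.6, 0.4])
    print(suitability)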

Keywords: degree of site suitability, input factors, standardization, weighted linear combination model

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
Basnet2008accuracy.pdf343.85 KB

Distributed Error Propagation Analysis for Automatic Drainage Basin Delineation

Tomas Ukkonen +, Tapani Rousi, Juha Oksanen and Tapani Sarjakoski
Finnish Geodetic Institute, Department of Geoinformatics and Cartography, 
P.O.Box 15, FIN-02431, Masala, FINLAND.

Abstract. Delineation of drainage basins is a popular terrain analysis method for digital elevation model (DEM) data. Currently, a deterministic delineation method is available in a number of terrain analysis software applications. The method uses elevation data to define flow directions for each elevation point of a DEM and then follows flow paths from a pour point to all upstream points. The influence of DEM error on the delineation process can be handled by replacing a single DEM D with the distribution of possible correct DEMs p(D). This uncertainty can be integrated into automatic delineation by using the Monte Carlo method, which uses realisations of DEMs drawn from p(D) to calculate probability maps for drainage basin delineations. The benefits of Monte Carlo-based probability estimation over deterministic delineation are numerous. Firstly, the Monte Carlo method gives additional information by providing a clear 'probability band' for the catchment boundary, where the width of the band depends on local topography and the parameters of the DEM error model; in the extreme case, the band covers large areas around the drainage divide. Secondly, in some cases there exist two or more alternative boundaries that become visible with the Monte Carlo method, whereas a deterministic approach is forced to pick only one of them. The number of samples required to accurately estimate the uncertainties in delineations is not clear, but according to earlier studies, estimation can require hundreds or thousands of samples. Our experiments on a single computer have shown that using the Monte Carlo method together with drainage basin delineation algorithms is computationally demanding. In addition, current terrain analysis software applications are not designed for large DEMs, which require distribution of the processed data and computations between multiple computers. In this paper, we improve existing drainage basin delineation methods for uncertain DEM data by developing and comparing distributed algorithms for computing probability maps of the delineations. We measure the performance and behaviour of the algorithms in different cases and compare the results of MPI-based (with spatial distribution of the data) and GRID-based (without spatial distribution) implementations.
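
A minimal, single-machine sketch of the Monte Carlo probability-map idea: a DEM error realisation is added, the basin is delineated, and the per-cell membership frequency is accumulated. The delineation routine here is only a stand-in (any flow-routing implementation could be plugged in), the error model is simplified to uncorrelated Gaussian noise, and no distribution across computers is shown.

    import numpy as np

    def delineate_basin(dem, pour_point):
        """Stand-in for a deterministic drainage-basin delineation returning a boolean
        mask; purely for illustration it marks cells not lower than the pour point."""
        return dem >= dem[pour_point]

    def basin_probability_map(dem, pour_point, sigma=1.0, n_realisations=500, seed=0):
        """Monte Carlo probability of each cell belonging to the basin of `pour_point`,
        under an (illustrative) uncorrelated Gaussian DEM error model."""
        rng = np.random.default_rng(seed)
        count = np.zeros(dem.shape, dtype=np.int32)
        for _ in range(n_realisations):
            realisation = dem + rng.normal(0.0, sigma, size=dem.shape)
            count += delineate_basin(realisation, pour_point)
        return count / n_realisations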

Keywords: drainage basin delineation, removal of surface depressions, accuracy, parallel computing, algorithms.

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
Ukkonen2008accuracy.pdf226.26 KB

Effects of Exterior Orientation Elements on Direct Georeferencing in POS-Supported Aerial Photogrammetry

Xueping Zhang and Xiuxiao Yuan
School of Remote Sensing and Information Engineering, Wuhan University,  
129 Luoyu Road, Wuhan 430079, China

Abstract. Direct georeferencing has become more and more important in aerial photogrammetry, but its accuracy cannot satisfy the requirements of small mapping scales in practical projects. The main reason is that the accuracy of the exterior orientation elements determined by the POS is not high enough. According to the theory of space intersection and the error propagation law, a mathematical model of the effects of exterior orientation elements on direct georeferencing based on space intersection is proposed first. Then, a mathematical model of these effects based on collinearity equations is proposed according to the collinearity equations and least-squares adjustment. The mathematical models are tested using three sets of actual data at different photographic scales and over different terrains. Based on the empirical results, the effects of exterior orientation element errors on direct georeferencing are analysed both in theory and in practice. Finally, the effects of different combinations of the same exterior orientation element accuracies on direct georeferencing are discussed.

Keywords: Position and Orientation System, exterior orientation elements, direct georeferencing, error

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
XuepingZhang2008accuracy.pdf236.08 KB

Effects of Spatial Grain Size on Landscape Pattern of Qingpu District in Shanghai

Honglin Zhao 1,Donghui Chen 2 +,Jiakai Shi 1 and Saijie Zhou 1 
1 College of Environmental Science and Engineering, Donghua University, Shanghai 201620, China
2 Shanghai Institute of Technology, Shanghai 200235, China

Abstract. In this paper, the effects of spatial grain size on the landscape pattern of Qingpu district are discussed at the class and landscape levels. To reveal the effects of grain size on landscape pattern, a series of raster maps with different grid-cell sizes was constructed using the majority rule from TM satellite remote sensing data and GIS techniques. The Grid module of Arc/Info and Fragstats 3.3 were used as the landscape pattern analysis tools. The conclusions are as follows: (1) There are obvious responses to different grain sizes at both the class level and the landscape level. Among the seven landscape types, road land and residential land change significantly with grain size; the indices LSI, CLUMPY and NP are sensitive to grain size, in contrast to SIDI, SIEI and PAFRAC. Generally, dominant landscape types are strengthened with increasing grain size, while regular landscapes are not sensitive to increasing grain size. (2) The index curves show scale inflexions, most of which appear at 40 m, 60 m, 80 m and 120 m. The appropriate grain size should be chosen in the first scale domain; the range of appropriate grain sizes for the landscape indices of Qingpu district is 30 m to 40 m. (3) FRAC_MN and PAFRAC show opposite trends as grain size increases, which reveals, to a certain degree, the limitation of using indices to quantify landscape pattern.

Keywords: landscape pattern; landscape indices; grain size; effect of grain size; scale inflexion

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
HonglinZhao2008accuracy.pdf281.96 KB

Error Analysis and Accuracy Assessment of InSAR Phase Unwrapping

Bin Pan and Jun Gan +
School of Remote Sensing Information Engineering, Wuhan University, 129 Luoyu Road, Wuhan 430079, China 

Abstract. The accuracy of phase unwrapping directly influences the accuracy of the DEM in InSAR processing. This paper begins by classifying existing phase unwrapping algorithms according to their unwrapping principles; the error sources of InSAR phase unwrapping are then summarized, and simulated data are used to analyse their influence. Next, two assessment methods, rewrapping after unwrapping and elevation comparison, are adopted to evaluate the phase unwrapping accuracy of a group of simulated and real data sets. Finally, after the error analysis and accuracy assessment, some conclusions are drawn and an optimal strategy for phase unwrapping is presented.
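
A minimal sketch of the rewrapping-after-unwrapping check: the unwrapped phase is wrapped back to (−π, π] and compared with the original interferogram, so residual differences flag unwrapping errors. Thresholds are illustrative, and the elevation-comparison method is not shown.

    import numpy as np

    def wrap(phase):
        """Wrap phase values into the interval (-pi, pi]."""
        return np.angle(np.exp(1j * phase))

    def rewrap_error_rate(wrapped_phase, unwrapped_phase, tol=1e-3):
        """Fraction of pixels where rewrapping the unwrapped solution disagrees
        with the original wrapped interferogram (a correct solution differs by 2*pi*k)."""
        diff = wrap(unwrapped_phase - wrapped_phase)
        return float(np.mean(np.abs(diff) > tol))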

Keywords: phase unwrapping, error analysis, accuracy assessment, residual.

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
BinPan2008accuracy.pdf378.39 KB

Estimation of Travel Times in the Context of Intermodal Transportation

Tao Xu 1, 2 +, Liming Wang 1, Daquan Zhang 1 and Lin Gao 1, 2
1 Institute of Geographic Science and Natural Resources Research, Chinese Academy of Science, Beijing, 100101, P.R. China
2 Graduate School of Chinese Academy of Sciences, Beijing, 100039, P.R. China

Abstract. Travel time is recognized as a critical ingredient in assessing the service level of transportation. Two approaches to estimating travel time using geographical information systems (GIS), the raster calculator (Euclidean distance or cost distance algorithms) and network analysis, are described. After analysing the travel-time components and the tendencies of modern transport systems, it is suggested that the travel time for a specific journey is uncertain and that, with the emergence of intermodal transport, transportation changes the relationship between space and time. Focusing on the estimation of travel time in the context of intermodal transport, a hybrid model combining the raster calculator and network analysis is proposed in this paper for calculating travel times between an origin point and the whole space. In the proposed model, the travel-time components may include an initial walk time from the origin point to a public transport stop, transport time within or between transportation network modes, and a final walk time from a public transport stop to the destination point. The uncertain parts of the travel-time components, such as the time of travel (whether of departure or arrival), transport congestion, and the time for transfers between modes, can be handled through network settings. The paper integrates the model into a GIS environment for regional planning, which successfully copes with the complexity of the public transport of a large region. Finally, the results of the different methods are compared.

Keywords: intermodal transportation, travel times, time geography, spatial structure

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
TaoXu2008accuracy.pdf550.93 KB

Evaluation of Current DEM Accuracy for Condamine Catchment

Kevin McDougall  +, Xiaoye Liu, Badri Basnet and Armando Apan 
Australian Centre for Sustainable Catchment, and Faculty of Engineering and Surveying, University of Southern Queensland, West Street, Toowoomba, QLD 4350, Australia

Abstract. A digital elevation model (DEM) is one of the most important datasets for catchment management and planning. It provides elevation information that is useful for many environmental applications, including hydrologic modelling and flood management planning. Currently, the most comprehensive catchment-wide DEM for the Condamine Catchment is a dataset derived from 1:100,000 topographic mapping. It is identified as the Department of Natural Resources and Water (NRW) DEM and has an estimated average vertical accuracy of approximately 10 m on a 25 m horizontal grid. The Shuttle Radar Topographic Mission (SRTM) derived DEM is also available for this catchment; it has a horizontal resolution of 87 m and a vertical accuracy of about 16 m. The accuracy of these DEMs may not be suitable for all application areas, so an accuracy assessment of the current DEMs was deemed necessary. In this study, existing ground survey marks were used as 'true' elevation data for the DEM accuracy assessment. For assessment purposes, the catchment area was classified into flat, moderate and steeper slope categories, and each category was evaluated separately. Elevations corresponding to the existing survey marks were extracted from each of the above DEMs, and the differences between the survey marks and the extracted elevations were calculated. These differences were grouped into several error ranges for each slope category and each DEM type. The results indicate that the NRW 25 m DEM has better than 10 m accuracy at the 90% confidence level. The SRTM 87 m DEM over the same area proved slightly better, with better than 10 m accuracy at the 95% confidence level. However, the accuracy of these DEMs varies over the different slope categories and land uses.
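
A minimal sketch of the comparison against survey marks: elevation differences, RMSE, and the 90% and 95% absolute-error levels behind statements such as "better than 10 m at the 90% confidence level". Sampling of the DEM at the mark locations is assumed to have been done already.

    import numpy as np

    def dem_accuracy(surveyed_z, dem_z):
        """Summary accuracy statistics for DEM elevations extracted at survey marks."""
        diff = dem_z - surveyed_z
        return {
            "mean_error": float(np.mean(diff)),
            "rmse": float(np.sqrt(np.mean(diff ** 2))),
            "abs_error_90pct": float(np.percentile(np.abs(diff), 90)),
            "abs_error_95pct": float(np.percentile(np.abs(diff), 95)),
        }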

Keywords: digital elevation model, shuttle radar topographic mission, accuracy assessment, catchment

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
McDougall2008accuracy.pdf501.9 KB

Field-Scale Mapping of Soil Organic Carbon with Soil-Landscape Modeling

Feng Chen 1 +, Larry T. West 2, David E. Kissel 1, Rex Clark 3, and Wayne Adkins 1 
1 Department of Crop & Soil Sciences, University of Georgia, USA
2 National Soil Survey Center, USDA-NRCS, USA
3 Department of Biological & Agricultural Engineering, University of Georgia, USA

Abstract. Predicting soil organic carbon (SOC) at the field scale plays an important role in field management practices for studies of both soil quality and carbon sequestration. Examining SOC concentration through soil-landscape relations provides an alternative technique for mapping SOC concentrations. The objectives of this study were to develop soil-landscape models for a crop field by quantifying the relationships between SOC concentration and terrain attributes derived from digital elevation models (DEMs), and to refine the models by delineating the sub-watersheds within the field. Separate soil sample sets were obtained from a 115 ha field located in the coastal plain region of Georgia for model development and model validation. High-accuracy GPS measurements over the field were obtained with a survey-grade GPS system. DEMs with 1, 2, 4 and 8 m grid sizes were created by interpolating the GPS data, and terrain attributes were then derived from these DEMs. Correlation coefficients between SOC concentration and the terrain attributes indicated that the topographic wetness index was the best single predictor for mapping SOC concentration. The study found that prediction of SOC concentration using the DEM with a 2 m grid size yielded the best accuracy in both cases. The effects of grid size on sub-watershed delineation and prediction accuracy are also discussed.
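
The topographic wetness index referred to above is commonly defined (the authors' exact formulation is assumed to follow this standard form) as

\[
\mathrm{TWI} \;=\; \ln\!\left( \frac{a}{\tan \beta} \right),
\]

where $a$ is the specific upslope contributing area per unit contour length and $\beta$ is the local slope, both derived from the DEM.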

Keywords: digital elevation model, global positioning system, soil-landscape  modelling, soil organic carbon, sub-watershed.

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
FengChen2008accuracy.pdf384.67 KB

Forest Characteristics and Effects on LiDAR Waveforms Modeling and Simulation

Xiaohuan Xi +, Ran Li, Zhaoyan Liu and Xiaoguang Jiang
Academy of Opto-Electronic, Chinese Academy of Sciences
No.95 Zhongguancundonglu, Beijing, 100190, China

Abstract. LiDAR (Light Detection And Ranging) remote sensing has been used to extract surface information, as it can acquire highly accurate object shape characteristics from geo-registered 3D points; it has therefore proven satisfactory for many applications such as high-resolution elevation model generation, 3D city mapping and vegetation structure estimation. Large-footprint LiDAR in particular offers great potential for effectively measuring tree parameters in forested areas. Based on the radiative transfer function for vegetation structure, a simplified model from a previous paper was used to simulate how vegetation parameters and slope affect LiDAR waveform characteristics, and an extinction coefficient was introduced into the model to account for the attenuating effect of a dense vegetation canopy that the simplified, idealized model omits. The experimental results show that the vegetation canopy, the distribution of trees within one footprint and the terrain slope all influence LiDAR waveform shape, while tree height only affects the starting position of the waveform. The model with the extinction coefficient explains how the vegetation canopy attenuates the laser pulse, making the return echo from the lower canopy weaker. Although they rest on simple, idealized assumptions, the results obtained from GLAS data are applicable to other LiDAR systems and are significant for LiDAR applications in forest parameter extraction; in sloped areas in particular, it is necessary to correct terrain effects when deriving vegetation height from LiDAR returns. The model will need to be refined in later work to become more practical.

Keywords: LiDAR, GLAS, vegetation parameters, waveform, extinction coefficient

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
XiaohuanXi2008accuracy.pdf321.3 KB

From Uncertainty Description to Spatial Data Quality Control

Wenzhong Shi
Advanced Research Centre for Spatial Information Technology
Department of Land Surveying and Geo-Informatics
The Hong Kong Polytechnic University, Hong Kong

Keywords: error description, quality control, spatial data, GIS

AttachmentSize
WenzhongShi2008accuracy.pdf247.84 KB

Fuzzy Evaluation of Quality in DLGs, DRGs, DEMs, and DOMs

Dejun Xu 1, 2, 3, Mei Zhong 1, 2  and Qingyun Du 1, 2 
1 School of Resource and Environment Science, Wuhan University, 129 Luoyu Road, Wuhan 430079
2 Key Laboratory of Geographic Information Systems, Ministry of Education,
Wuhan University, 129 Luoyu Road, Wuhan 430079, China
3 The Bureau of Land Resource of Linhai, 299 Renming Road, Linhai, Zhejiang, 31700, China

Abstract. Conventional analogue maps are gradually being replaced by the 4D products, i.e., DLGs, DRGs, DEMs, and DOMs. The 4D products are applied extensively in many sectors of the national economy, and much importance is attached to their quality. The paper discusses fuzzy comprehensive evaluation techniques; the implementation process and its characteristics are also analyzed. Taking the comprehensive evaluation of DEMs as an example, a model for fuzzy comprehensive evaluation of DEMs is developed.

Keywords: DLGs, DRGs, DEMs, DOMs, quality model, fuzzy evaluation

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
DejunXu2008accuracy.pdf265.52 KB

Geo-Data User Chains

P.A.J. van Oort 1
1 Wageningen University, Centre for Geo-Information, P.O. Box 47, 6700 AA Wageningen, The Netherlands, Tel: +31-317-474403, E-mail: pepijn.vanoort@wur.nl

Abstract. The internet is changing the way we exchange information: exchange is becoming more efficient and more anonymous. For geo-data, one can often observe long user chains from producer to end-user. For example, a dataset is produced by the producer, then handed to the salesperson or data manager, and next received by the data managers of various other organizations. Internally, the receiving data manager may place the geo-dataset on an internal server to which anyone within the organization has access. In the end, nobody knows who is actually using the dataset. From various perspectives this can be undesirable: producers are less well informed about users’ needs, and users are less well informed about data quality and other users’ experiences. User surveys and user conferences may overcome these problems, but their response rates are often very low, so one does not know whether these surveys accurately represent the user community. In this paper, we present the results of a study in which we aimed to identify and classify all users in the population. Differences between organizations are analyzed. We investigate which fraction of users gives feedback to the producer and whether these users are representative of their community, and we investigate the importance of printed metadata relative to personal contacts. To our knowledge, this is the first effort to identify the complete user population of a geo-dataset. We conclude the paper with a vision of a new research agenda on geo-data user chains.

Keywords: geo-data user chains, spatial data infrastructures, user typology

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
Oort2008accuracy.pdf245.1 KB

Geometrical Uncertainty of Objects and Its Influence in the Object-Oriented Multi-Source Remote Sensing Imagery Processing

Zhaocong Wu, Lina Yi and Guifeng Zhang
School of Remote Sensing and Information Engineering, Wuhan University, Wuhan 430079, China

Abstract. In object-oriented classification, object information extracted from multi-source remote sensing imagery can be integrated to improve the accuracy and reliability of the acquired remote sensing information. Before information is extracted from multi-source images, the objects must be geometrically located on them. The geometrical location of objects consists mainly of three steps: the extraction and representation of objects, the pre-processing of images with different spatial resolutions, and the transfer of objects. Each of these steps may introduce uncertainties. This paper investigates and analyzes the geometrical uncertainty of objects and its influence on the extracted features and the classification accuracy. Different transfer methods may cause different uncertainties, and the uncertainty of boundary pixels is the main source of geometrical uncertainty. In this paper, two methods of transferring the objects are used: transfer of raster objects and transfer of vector objects. To analyze the influence on the extracted features, spectral features, NDVI (normalized difference vegetation index) and energy are calculated with two feature extraction methods for comparison: one based on all the object pixels and the other based on the interior pixels only, excluding the boundary pixels. To analyze the influence on the classification accuracy, the extracted features are used to classify the objects with a Bayesian maximum likelihood classifier. The results show that the influence of the geometrical uncertainty is small but not constant. To avoid the instability caused by geometrical uncertainty, transferring the raster object without considering the influence of the boundary pixels may be more suitable in practice.

Keywords: object-oriented, classification, feature, uncertainty, geometrical location 

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
ZhaocongWu2008accuracy.pdf332.2 KB

Geostatistical Analysis of GPS Trajectory Data: Space-Time Densities

Tomislav Hengl+, E. Emiel van Loon, Judy Shamoun-Baranes and Willem Bouten
Computational Geo-Ecology (CGE), Institute for Biodiversity and Ecosystem Dynamics, University of Amsterdam, Nieuwe Achtergracht 166, 1018 WV Amsterdam, Netherlands 
 
Abstract. Creation of density maps and estimation of home range is problematic for observations of animal movement at irregular intervals. We propose a technique to estimate space-time densities by separately modeling animal movement paths and velocities, both as continuous fields. First the length of trajectories for a given grid is derived; then the velocity of individual birds is interpolated using 3D kriging; finally the space-time density is calculated by dividing the density of trajectories (total length of lines per grid cell) by the aggregated velocity at that grid cell. The resulting map shows the density of a species in both space and time, expressed in s/m2 units. This length-by-velocity (LV) technique is illustrated using two case studies: (1) a synthetically generated dataset using the Lorenz model; and (2) GPS recordings of 14 individual lesser black-backed gulls (Larus fuscus). The proposed technique is compared with a kernel smoother, a technique commonly used to derive the home range of a species. The results on the synthetic dataset showed that the LV method produces different outputs than kernel smoothing, especially if irregular observation intervals are used. The main advantages of the proposed technique over a kernel smoother are: (1) it is not sensitive to missing observations; (2) it is suited to analyzing flight paths (e.g. it preserves information about velocities and directions); and (3) it allows the movement of birds (velocity, trajectory) to be modeled separately, e.g. as a function of environmental conditions such as wind and time of day. The remaining research issues are the development of methodology for selecting the optimal grid size and the optimal time interval between recordings.
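A minimal sketch of the length-by-velocity idea, assuming hypothetical GPS fixes and a crude midpoint-binning grid in place of the kriged, continuous fields described in the abstract:

import numpy as np

# Hypothetical GPS fixes for one bird: x, y in metres, t in seconds.
x = np.array([0., 120., 260., 420., 600.])
y = np.array([0.,  80., 150., 300., 480.])
t = np.array([0.,  60., 130., 200., 260.])

# Segment lengths, velocities and midpoints between consecutive fixes.
seg_len = np.hypot(np.diff(x), np.diff(y))
seg_vel = seg_len / np.diff(t)                      # m/s
xm, ym = (x[:-1] + x[1:]) / 2, (y[:-1] + y[1:]) / 2

# Rasterise onto a coarse grid (200 m cells) -- a crude stand-in for the
# kriged trajectory-density and velocity fields used in the paper.
cell = 200.0
edges = np.arange(0.0, 800.1, cell)
length_grid, _, _ = np.histogram2d(xm, ym, bins=[edges, edges], weights=seg_len)
vel_sum, _, _ = np.histogram2d(xm, ym, bins=[edges, edges], weights=seg_vel)
count, _, _ = np.histogram2d(xm, ym, bins=[edges, edges])
vel_grid = np.divide(vel_sum, count, out=np.zeros_like(vel_sum), where=count > 0)

# Space-time density in s/m^2: trajectory length per unit area, divided by velocity.
density = np.divide(length_grid / cell**2, vel_grid,
                    out=np.zeros_like(vel_grid), where=vel_grid > 0)
print(density)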

Keywords: animal movement, 3D kriging, velocity, home range, space-time cube.

 

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
Hengl2008accuracy.pdf1.73 MB

Handling Uncertainties in Image Mining for Remote Sensing Studies

Alfred Stein
ITC International Institute for Geoinformation Science and Earth Observation, PO Box 6, 7500 AA
Enschede The Netherlands. Email: stein@itc.nl

Abstract. This paper presents an overview of uncertainty handling in remote sensing studies. It takes an image mining perspective and identifies ways of handling different uncertainties. It starts at the pixel level and, through object identification and modeling, proceeds towards monitoring and decision making. The methods presented originate from both probability-based and fuzzy-logic-based approaches.

Keywords: image mining, remote sensing, statistics

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
Stein2008accuracy.pdf239.22 KB

Handling Uncertainty in Underwater Terrain during Dredging by Using Filled-Matrix Algorithms

Lihua Wang 1, Yimin Shi 1, Jun Wan 2 and Wei Chen 1+ 
1 Dept. of Surveying and Geo-informatics, Tongji University 1239, Siping Rd, Shanghai, 200092, China 
2 Shanghai Dahua Surveying & Mapping Co., Ltd, N0.2501, Pudong Avenue, Shanghai, 200136, China 

Abstract. This paper presents a new method for building underwater terrain models that uses a filled matrix to represent the terrain, addressing the difficulty of updating changing underwater terrain in real time during dredging construction. With the filled matrix, the underwater terrain under construction can be represented in real time, so that the construction situation is known and redundant work is avoided. First, the paper defines the filled matrix and explains how the filled-matrix terrain model is built on the conventional DEM method. Second, the uncertainty of the underwater terrain during construction is analyzed, along with the necessity of resolving that uncertainty. Third, the updating algorithm for the filled-matrix terrain is discussed. Finally, the paper describes the application of the updated filled-matrix terrain during dredging to obtain real-time terrain modelling for supervision.

Keywords: filled-matrix, underwater terrain, uncertainty, update

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
LihuaWang2008accuracy.pdf272.68 KB

Hyperspectral Image Classification Based on Compound Kernels of Support Vector Machine

Yuyong Cui +, Zhiyuan Zeng and Bitao Fu
Digital Engineering and Simulation Research Center, Huazhong University of Science and Technology, Wuhan 430074, China

Abstract. The support vector machine (SVM) is a pattern classification algorithm based on statistical learning theory. This paper proposes to estimate abundances from hyperspectral imagery using the probability outputs of support vector machines, training an SVM with a Gaussian kernel function, and discusses the relationship between kernel functions, nonlinear mappings and the mapped spaces. A new compound kernel function is then given. The compound kernel is applied in the SVM and compared with other kernels in hyperspectral image classification; the comparison validates the approach for remote sensing and shows that the method is more accurate than the other kernels tested.
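One way to realise a compound kernel, sketched here as a hypothetical convex combination of a Gaussian (RBF) kernel and a polynomial kernel passed to scikit-learn's SVC; the actual kernel form and weights used in the paper may differ:

import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel

def compound_kernel(X, Y, w=0.7, gamma=0.5, degree=2):
    # Weighted sum of two valid kernels is itself a valid kernel.
    return w * rbf_kernel(X, Y, gamma=gamma) + (1 - w) * polynomial_kernel(X, Y, degree=degree)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))           # 100 hypothetical pixel spectra, 20 bands
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # toy labels

clf = SVC(kernel=compound_kernel, probability=True).fit(X, y)
print(clf.predict_proba(X[:3]))          # probability outputs, as used for abundance estimation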

Keywords: support vector machine, compound kernels, hyperspectral image, classification, accuracy

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
YuyongCui2008accuracy.pdf343.28 KB

Hyperspectral Remote Sensing Image Interpretation Based on Spatial Information Analysis of Homogeneous-Regions

Yan Gong +, Ning Shu and Liqun Lin
School of Remote Sensing Information Engineering, Wuhan University, Wuhan 430079, China

Abstract. This paper presents a hyperspectral remote sensing image analysis framework based on homogeneous regions. Within this framework, a multi-scale segmentation method based on Spectral Code Mapping (SCM) is proposed, and spatial relationships are discussed as a means of improving classification.

Keywords: spatial information; multi-scale; spatial-relationship; image interpretation; pattern cognition; hyperspectral remote sensing

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
YanGong2008accuracy.pdf346.16 KB

Image Mining for Generating Ontology Databases of Geographical Entities

Zhenfeng Shao 1, Jun Liu 2 and Xianqiang Zhu 1
1State Key Laboratory for Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, 129 Luoyu Road, Wuhan 430079, China.
2 School of Remote Sensing and Information Engineering, Wuhan University, Wuhan 430079, China.

Abstract. This paper first extracts basic geographic information from remote sensing images, and then studies the resolution granularity at which globally covering, multi-spectral, multi-resolution remote sensing images can distinguish the features of the corresponding objects. On this basis, the feature information needed for a geographical ontology database, such as texture information, can be mined from remote sensing images; through a data mining strategy based on formal concept analysis, data mining methods for texture features are developed. The emphasis of the paper is the mining of texture characteristics for generating an ontology database of geographical entities. By mining texture characteristics, we can find the partial structures that frequently appear in remote sensing image data and the constraint relationships between a central pixel and its neighbouring pixels in local regions of the images. The process consists of four steps: partitioning and normalizing the sampling areas, mining characteristic data, building the Hasse diagram, and generating rules. Through these computations, the uncertainty of the mined characteristics is treated from the standpoint of information theory, and a consolidated mathematical expression linking information quantity and the uncertainty of the characteristics is sought, in order to address the quantitative evaluation of information quantity and uncertainty in remote sensing imagery. The paper introduces a concept-driven data mining framework for uncertainty processing, which guides the specific algorithms and procedures of the image mining process. According to the characteristics of remote sensing images, and combining them with various kinds of GIS data, we can describe the essential characteristics needed to build an ontology database of geographical entities.

Keywords: image  mining, ontology database, semantic-based spatial information sharing and interoperability, uncertainty

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
ZhenfengShao2008accuracy1.pdf265.54 KB

Impacts of Noise on the Accuracy of Hyperspectral Image Classification by SVM

Peijun Du 1 +, Xiaomei Wang 1, Kun Tan 1 and Giles M. Foody 2 
1 China University of Mining and Technology, Xuzhou City, Jiangsu Province, 221116, China 
2  University of Nottingham, University Park, Nottingham NG7 2RD, UK

Abstract. The support vector machine (SVM) has recently become a popular tool for image classification. The performance of the SVM for hyperspectral image classification has been examined from a range of perspectives, but the impacts of noise, errors and uncertainties have attracted less attention. This paper aims to evaluate the impacts of noise on SVM classification. The research is undertaken using real imagery acquired by the OMIS hyperspectral sensor. To assess the sensitivity of the SVM classifier to different types of noise and its capacity to reduce their effects, a simulation study is undertaken using two types of noise. The first type of noise is striping, in which some rows or columns of the image have markedly abnormal signals. The second type of noise is caused by uncertain factors that may affect one band, one pixel or one line; this noise may be evaluated by introducing salt and pepper noise. A variety of datasets containing different types of noise are generated and classified using a SVM. The results of the classifications, with particular regard to their accuracy, are compared against a classification of the original dataset and against comparative analyses obtained using traditional classifiers, including the spectral angle mapper (SAM) and binary encoding (BE). The results indicate that the SVM is more effective than SAM and BE at alleviating the effects of noise.
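A small sketch of the kind of simulation described, assuming hypothetical spectra and labels rather than the OMIS imagery: salt-and-pepper noise is injected into a copy of the data and an RBF-kernel SVM trained on clean samples is evaluated on both versions:

import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
# Hypothetical hyperspectral samples: n pixels x b bands, with toy labels.
X = rng.normal(size=(500, 64))
y = (X[:, :8].mean(axis=1) > 0).astype(int)

def salt_and_pepper(data, fraction=0.05, low=-5.0, high=5.0):
    # Corrupt a random fraction of values with extreme 'salt' or 'pepper' levels.
    noisy = data.copy()
    mask = rng.random(data.shape) < fraction
    noisy[mask] = rng.choice([low, high], size=mask.sum())
    return noisy

X_noisy = salt_and_pepper(X)
clf = SVC(kernel="rbf", C=10, gamma="scale").fit(X[:400], y[:400])
print("clean :", accuracy_score(y[400:], clf.predict(X[400:])))
print("noisy :", accuracy_score(y[400:], clf.predict(X_noisy[400:])))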

Keywords: support vector machine (SVM), hyperspectral remote sensing, classification, noise.

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
PeijunDu2008accuracy1.pdf692.53 KB

Information-Theoretical Comparison between Actual and Potential Natural Vegetation

Yan Chen 1+, Zongjian Lin 2 and Jiong You 1 
1 School of Remote Sensing Information Engineering, Wuhan University, 129 Luoyu Road, Wuhan 430079
2 Chinese Academy of Surveying and Mapping 16 Beitaiping Road, Beijing 100039, China

Abstract. An information-theoretic measure of shared information, average mutual information (AMI), was applied to compare the similarity between potential natural vegetation (PNV) and actual vegetation (AV), where terrain factors were incorporated for mapping PNV, and TM imagery and MNDVI were used for AV, respectively. Object-oriented approaches were used to produce categorical maps of the distributions of PNV and AV. In order to find the maximum relation between PNV and AV, AMI and contingency tables were used. Results show that the AMI between PNV and AV is low and varies with scale. Therefore, simple combination of PNV and AV should be implemented with care.
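Average mutual information can be computed directly from the PNV-AV contingency table; a short sketch with a hypothetical cross-tabulation of pixel counts (the real class labels and counts would come from the categorical maps):

import numpy as np

def average_mutual_info(table):
    # AMI in bits from a contingency table of joint class counts.
    p = table / table.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

# Hypothetical cross-tabulation: PNV classes (rows) vs AV classes (columns).
table = np.array([[120,  30,  10],
                  [ 25, 200,  40],
                  [ 15,  35,  90]], dtype=float)
print(f"AMI = {average_mutual_info(table):.3f} bits")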

Keywords: potential natural vegetation (PNV); actual vegetation (AV); average mutual information (AMI); contingency table 

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

 

AttachmentSize
YanChen2008accuracy.pdf418.08 KB

Inversion of Surface Temperature Based on MODIS and ASTER Imagery

Shenghui Fang +, Yuanyong Dian and Yuanchao Fan
School of Remote Sensing and Information Engineering, Wuhan University, Wuhan 430079, China

Abstract. On the basis of a summary of the main algorithms for retrieving land surface temperature, this paper describes the principle of the multi-channel temperature retrieval algorithm and analyzes the factors affecting retrieval accuracy. In addition, mixed-pixel emissivity is discussed and a relevant estimation method using the results of classifying visible-band images is put forward. Experimental retrieval of land surface temperature is conducted using MODIS and ASTER imagery over the Wuhan area, and the empirical results are analyzed.

Keywords: retrieval; land surface temperature (LST); mixed pixels

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
ShenghuiFang2008accuracy.pdf303.98 KB

Irrigation Intensification or Extensification Assessment: A GIS-Based Spatial Fuzzy Multi-Criteria Evaluation

Yun Chen 1, 3, Shahbaz Khan 1, 2 and Zahra Padar 1, 3 
1 CSIRO Land and Water, Canberra, Australia
2 International Centre of Water for Food Security, Charles Sturt University, Wagga Wagga, Australia
3 Cooperative Research Centre for Irrigation Futures (CRC IF), Wagga Wagga, Australia

Abstract. This paper presents some preliminary results from a study of spatial multi-criteria evaluation of land suitability for intensification or extensification of irrigated cropland at a catchment scale in Australia. The project was conducted using the fuzzy linguistic ordered weighted averaging (FLOWA) approach, which integrates AHP and fuzzy linguistic OWA operators in the ArcGIS 9.2 environment. Several scenarios were derived to show how the uncertainties involved in the suitability decision-making process influence the outcomes. The study indicates that there is no need for irrigation extensification in the catchment, but there is good potential for intensifying irrigation in some areas if water is available.

Keywords: irrigated cropping, land suitability, fuzzy linguistic ordered weighted averaging

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
YunChen2008accuracy.pdf401 KB

Issues of Uncertainty in Super-Resolution Mapping and the Design of an Inter-Comparison Study

Peter M. Atkinson
School of Geography, University of Southampton, Highfield, Southampton, SO17 1BJ, UK

Abstract. Super-resolution mapping is a relatively new field in remote sensing whereby classification is undertaken at a finer spatial resolution than that of the input remotely sensed multiple-waveband imagery. A variety of different methods for super-resolution mapping have been proposed, including spatial pixel-swapping, spatial simulated annealing, Hopfield neural networks, feed-forward back-propagation neural networks and geostatistical methods. The accuracy of all of these new approaches has been tested, but the tests have been individual (i.e., with little bench-marking against other techniques) and have used different measures of accuracy. There is, therefore, a need for greater inter-comparison between the various methods available, and a super-resolution inter-comparison study would be a welcome step towards this goal. This paper describes some of the issues that should be considered in the design of such a study.

Keywords: super-resolution mapping, inter-comparison, accuracy assessment

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
Atkinson2008accuracy.pdf174.66 KB

Land Cover Classification Information Decision Making Fusion Based on Dempster-Shafer Theory: Results and Uncertainty

Youhua Ran 1, Xin Li 1, Ling Lu 1 and Zhigang Bai 2
 1 Cold and Arid Regions Environmental and Engineering Research Institute, Chinese Academy of Sciences, Lanzhou 730000, P.R. China
2 Beijing Soil and Water Conservation and Ecological Engineering LTD CO., Beijing, 100055, P.R. China

Abstract. Land cover plays a significant role in earth system science, reflecting the influence of human activities and environmental change (Sellers et al., 1997; IGBP, 1990; Aspinall et al., 2004). In China, many land use/cover maps derived from remote sensing observations have become available in recent years. Whether and how these data can be combined effectively to produce a better land cover map is a key question. Dempster-Shafer evidential reasoning is a method for decision-level fusion of multisource data. The method is based on the recognition that the knowledge and information we use in making decisions such as image classification is often uncertain, incomplete, and occasionally imprecise. Past research has shown that evidential reasoning can produce more accurate results than traditional classifiers. This paper reports an experiment in the Heihe River Basin to develop a land cover map using Dempster-Shafer (DS) evidence theory. The Chinese 1:100,000 land use data, the 1:1,000,000 vegetation map and the MODIS land cover classification product are used as multiple sources of evidence to support each land cover class, and these pieces of evidence are combined using the DS combination rule. Results show that evidence theory can be used to fuse multiple sources of classification information and can effectively report the spatial distribution of the uncertainty interval. The most important issue is how to accurately determine and express the uncertainties in the decision-making process, such as the uncertainty of the input data, the uncertainty of the evidence and the uncertainty of the frame of discernment.
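A minimal sketch of Dempster's combination rule for one pixel, with hypothetical mass assignments from two evidence sources (the paper combines three map sources in the same pairwise way):

from itertools import product

def dempster_combine(m1, m2):
    # Dempster's rule for two mass functions keyed by frozenset focal elements.
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical masses for one pixel from two evidence sources
# (e.g. a land use map and the MODIS product) over classes {grass, forest}.
G, F = frozenset({"grass"}), frozenset({"forest"})
m_landuse = {G: 0.6, F: 0.1, G | F: 0.3}
m_modis   = {G: 0.5, F: 0.2, G | F: 0.3}
print(dempster_combine(m_landuse, m_modis))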

Keywords: land cover, data fusion, Dempster-Shafer theory, uncertainty

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
YouhuaRan2008accuracy.pdf631.36 KB

Large Area DEM Generation Using Airborne LiDAR Data and Quality Control

Xiaoye Liu 1, 2 +, Zhenyu Zhang 1, 2, Jim Peterson 2 and Shobhit Chandra 2
1 Australian Centre for Sustainable Catchments and Faculty of Engineering and Surveying University of Southern Queensland, Toowoomba, QLD 4350, Australia
2 Centre for GIS, School of Geography and Environmental Science
Monash University, Clayton, Vic 3800, Melbourne, Australia

Abstract. A Digital Elevation Model (DEM) is a crucial component in terrain-related applications, and research on terrain data collection and DEM generation has received great attention. Traditional methods such as field surveying and photogrammetry can yield high-accuracy terrain data, but they are time consuming and labour intensive, especially for large areas. Airborne Light Detection and Ranging (LiDAR), also referred to as airborne laser scanning (ALS), provides an alternative for acquiring high-density, high-accuracy three-dimensional terrain point data. LiDAR data have become a major source of digital terrain information and have been used in a wide range of areas, with terrain modeling being the primary focus of most LiDAR collection missions. The use of LiDAR for terrain data collection and DEM generation is highly effective and is becoming standard practice in the spatial science community. Although LiDAR data have become more affordable for users as collection costs have gradually dropped, how to effectively process the raw LiDAR data and extract useful information remains a big challenge. This paper presents ways to generate a high-quality DEM over a large catchment area using LiDAR data. A number of research challenges, such as terrain modeling methods, interpolation algorithms, DEM resolution, and data reduction, are identified and discussed in detail for quality control of a LiDAR-derived DEM.

Keywords: LiDAR, DEM, interpolation, resolution, data reduction

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
XiaoyeLiu2008accuracy.pdf937.87 KB

Matching imperfect spatial data

Ana-Maria Olteanu, Sébastien Mustière and Anne Ruas
COGIT Laboratory, Institut Géographique National
2 av. Pasteur, 94160 Saint-Mandé, France
Tel.: + 33; Fax: + 001 555 832 1156
ana-maria.olteanu@ign.fr; sebastien.mustiere@ign.fr; anne.ruas@ign.fr

Abstract. Currently, many independent geographical databases cover the same areas, and users need to fuse the various information coming from these databases. In order to integrate databases, redundancy and inconsistency between data should be identified. Many steps are required to finalise database integration, and one of them is automatic data matching. In this paper we study which knowledge is required to guide the matching process and, more particularly, how to manage uncertain knowledge. Firstly, we analyse how interactive matching is performed and identify the basic knowledge used in this process: objects with similar locations, shapes and attributes are matched. The knowledge used is imperfect, and the data manipulated contain a certain degree of error and vagueness. We classify the various kinds of imperfect information used, distinguishing imprecision, uncertainty and incompleteness, and for each class of imperfection we choose an appropriate theory to model it. Finally, we illustrate imperfection in spatial data through the results of data-matching experiments based on the comparison of toponyms in the context of ethnographical data.

Keywords: integration, data matching, uncertainty and imprecise knowledge

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Olteanu2006accuracy.pdf915.16 KB

Measurement Indices of Positional Uncertainty for Plane Line Segments Based on the σε Model

Guoqin Zhang 1 and Changqing Zhu 2 
1 Information Engineering University, Zhengzhou 450052, China 
2 Ministry of Education Key Laboratory of VGE, Nanjing 210046, China

Abstract. First, eight cases of random line segments are discussed. Second, for these eight cases, the analytic expressions of the error band for the σε uncertainty model of a line segment are deduced, the parametric equations of the error band boundary are obtained, and it is proved that the boundary of the error band is continuous. Third, the average error band width and the algebraic expressions for the area of the uncertainty region bounded by the error band are calculated. Finally, visual graphics of the error band are drawn from the analytic expressions by means of examples. Thus, three indices are given to measure the precision of line uncertainty: the average error band width, the area of the uncertainty region, and the visual graphics.

Keywords: plane line, uncertainty, σε model, error band

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press). 

AttachmentSize
GuoqinZhang2008accuracy.pdf255.19 KB

Methods for Estimating the Accuracy of Per-Pixel, Per-Parcel and Expert Visual Classification of High Resolution Optical Satellite Imagery

Neil Stuart +, Tom Jaas, Ioannis Zisopoulos and Karin Viergever
Institute of Geography, University of Edinburgh, Drummond Street Edinburgh EH8 9XP. 

Abstract. We describe methods for collecting appropriate quantities and types of reference data for validating classifications of high resolution satellite data, using the example of collecting reference data to test classifications of 1m spatial resolution IKONOS data for an open woodland savanna in Central America. Reference data were collected in the field by GPS survey to ensure the purity and representativeness of the ground areas and a precise match between the ground data and the corresponding image pixels. The image is then classified by three methods: automatic per-pixel maximum likelihood (ML), automatic per-parcel nearest neighbour, and visual classification by experienced image interpreters. We find that the per-parcel classifier achieves higher accuracy than the per-pixel ML classifier for all the required land cover classes. The overall accuracy for the per-parcel classifier is 82% (producer accuracy range: 47-95%, k=0.73) compared to 57% (range: 36-70%, k=0.5) for ML. The classification by expert visual interpretation yields an overall accuracy of 96% (range: 89-100%, k=0.95). The per-parcel classification exceeded the minimum accuracy requirement of 70% for two of the five required land cover classes and approached the target of 85% suggested for the overall accuracy required in natural resource mapping. We conclude that a per-parcel classifier can achieve an acceptable standard of accuracy for some of the savanna land cover classes, but that further work is needed to improve the classification of smaller groups of trees and sparse woodland. Since visual classification is still commonly used in developing countries for classifying imagery, and in some cases is the desired output that an automated classifier seeks to reproduce, we developed a means to measure the stability or reliability of a visual classification. We estimate the average accuracy of a series of manual, visual classifications of the same image by different interpreters, by comparing each to an agreed ‘master’ classification by an expert interpreter. The result shows which map features are frequently classified correctly (or not) by different interpreters according to their level of expertise. This information allows further training to focus on these classes.
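The accuracy figures quoted (overall accuracy, producer accuracy and kappa) can all be derived from an error matrix; a short sketch with hypothetical reference and mapped labels standing in for the field and classified data:

import numpy as np
from sklearn.metrics import confusion_matrix, accuracy_score, cohen_kappa_score

# Hypothetical reference labels (from GPS field survey) and mapped labels.
reference = np.array([0, 0, 1, 1, 2, 2, 2, 1, 0, 2])
mapped    = np.array([0, 1, 1, 1, 2, 2, 1, 1, 0, 2])

cm = confusion_matrix(reference, mapped)      # rows: reference, columns: mapped
producer = np.diag(cm) / cm.sum(axis=1)       # accuracy per reference class
print(cm)
print("overall :", accuracy_score(reference, mapped))
print("kappa   :", cohen_kappa_score(reference, mapped))
print("producer:", producer)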

Keywords: accuracy comparison, per-parcel, visual interpretation, reference data, IKONOS

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
Stuart2008accuracy.pdf344.99 KB

Mining Spatial Association Rules with Geostatistics

Jiangping Chen 1, 2+ and Xiaojin Tan 3

1 School of Remote Sensing and Information Engineering, Wuhan University, Wuhan, Hubei 430079, PR China
2 Department of Geography, University of Cambridge, Downing Place, Cambridge, UK, CB2 3EN
3 International School of Software, Wuhan University, Wuhan 430079, China

Abstract. In 1962, G. Matheron introduced the term geostatistics to describe a scientific approach to evaluating problems in geology and mining, from ore reserve estimation to grade control. Geostatistics provides statistical methods to describe spatial relationships among sample data and to apply this analysis to the prediction of spatial and temporal phenomena; these methods are used to explain spatial patterns and to interpolate values at unsampled locations. Geostatistics has traditionally been used in the geosciences: meteorology, mining, soil science, forestry, fisheries, remote sensing, and cartography. It was later successfully applied to economics, health, and other disciplines. Currently, there is a trend to integrate the powerful methods of geostatistics into geographic information systems (GIS). This paper puts forward a new algorithm for mining association rules with geostatistics, applied to the analysis of an epidemic problem. A key feature of epidemic data is their location in a space-time continuum. Geostatistics is independent of the mean-variance relationship and can therefore be used to verify more traditional methods of evaluating the inner spatial structure of the data. During structural analysis, spatial autocorrelation can be analyzed using the covariance and the semivariogram; predictions at unsampled locations can then be made using geostatistical methods such as kriging (i.e. multiple linear regression in a spatial context). Geostatistical analysis can interpret the statistical distribution of data and also examine spatial relationships; it is capable of revealing how cohesion values vary over distance and of predicting areas of high and low cohesion values. Geostatistical software provides tools for capturing maximum information on a phenomenon from sparse, often biased, and often under-sampled data, and it is well suited to spatial data mining because it takes account of the autocorrelation between spatial data. In this paper, the first step is to use geostatistical methods such as kriging and the spatial autoregressive model (SAR) to analyse and estimate the correlation between land use/cover change and hay fever incidence. A spatial autocorrelation model is then built and used to mine the spatial association rules. The spatial frequent itemsets can be obtained from the autocorrelation model, which replaces the repeated scanning of the spatial database required by conventional spatial association rule mining. The example shows that the method is quicker and more efficient than the traditional Apriori data mining algorithm.
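As a reminder of the structural-analysis step mentioned above, a minimal sketch of the classical empirical semivariogram estimator on hypothetical point data (the kriging and rule-mining steps are not reproduced here):

import numpy as np

def empirical_semivariogram(coords, values, lags, tol):
    # Classical estimator: gamma(h) = 0.5 * mean of squared differences per lag bin.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    dv2 = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)          # count each pair once
    d, dv2 = d[iu], dv2[iu]
    gamma = []
    for h in lags:
        sel = np.abs(d - h) <= tol
        gamma.append(0.5 * dv2[sel].mean() if sel.any() else np.nan)
    return np.array(gamma)

# Hypothetical incidence values at sample locations (x, y in km).
rng = np.random.default_rng(2)
coords = rng.uniform(0, 10, size=(80, 2))
values = np.sin(coords[:, 0]) + 0.1 * rng.normal(size=80)
print(empirical_semivariogram(coords, values, lags=np.arange(1, 6), tol=0.5))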

Keywords: geographical information science, statistical analysis, spatial autocorrelation, geostatistics, spatial association rule

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
JiangpingChen2008accuracy.pdf353.74 KB

Mining Subsidence Monitoring by Using the Information of Ground Sunk Seeper

Rui Liu 1 +, Fang Miao 2 and Baocun Wang 1
2 Chengdu University of Technology, Chengdu 610059, China

Abstract. On the basis of a comprehensive analysis of the causes of ground sunk seeper and its relation to coal mining collapse, this paper first establishes the feasibility of monitoring surface collapse by using information on ground sunk seeper caused by mining subsidence. Then, taking the Kailuan Coal Field as an example, the relation between coal field collapse and ground sunk seeper is analyzed by means of ground sunk seeper information extracted from six temporal remote sensing images. The trend of ground collapse over the coal field can be reflected more clearly by using the ground sunk seeper information.

Keywords: ground sunk seeper, mining subsidence, dynamic detection, remote sensing

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
RuiLiu2008accuracy.pdf940.43 KB

Multi-Index Regional Forest Fire Ratings Appraisal Platform Design and Implementation

Lihua Tang, Xiongwei Lou, Haitao Lv and Xuekui Ge
School of Information Engineering, Zhejiang Forestry University, Lin’an, Zhejiang, 311300, China 

Abstract. At present, the appraisal of forest fire danger is the basis for all forest fire management systems. Because appraisal standards and regional environments differ, the index system, index weights, and quantitative index values vary. This paper sets up six categories comprising 16 basic indices and introduces a spatial index concept that comprehensively considers the natural environment, the social and economic environment, and the forestry resources of Lin’an city. A regional forest fire danger rating appraisal system was developed with ArcGIS and C#, realizing dynamic appraisal of the degree of forest fire danger. The results for Yuqian town are: rank IV fire danger covers about 5% of the forest area, where strict fire prevention measures should be adopted; 13% of the area has a potential forest fire danger of rank III, where awareness of fire accidents should be strengthened; about half of the area is at rank II and 32% at rank I, where routine fire prevention measures should be taken.

Keywords: regional forest fire; index system; spatial index; appraisal of the degree

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
LihuaTang2008accuracy.pdf282.75 KB

Object-Based Correspondence Analysis for Improved Accuracy in Remotely Sensed Change Detection

Hao Gong +, Jinping Zhang and Shaohong Shen
School of Remote Sensing & Information Engineering, Wuhan University
129 Luoyu Road, Wuhan 430079, China

Abstract. The correspondence analysis (CA) method, a multivariate technique widely used in ecology, is relatively new in remote sensing. In the CA differencing method, bi-temporal images are transformed into component space, and differencing of individual component images can then be performed to detect possible changes, in a way similar to principal component analysis (PCA). The advantage of the CA method is that more of the variance of the original data is captured in the first component than with the PCA method. However, these techniques are all performed on a pixel-by-pixel basis, which becomes unsatisfactory in some circumstances due to the higher spectral heterogeneity of high spatial resolution imagery. This problem can be alleviated by an object-based strategy, which segments the image into regions of relative homogeneity that are, in turn, used as the basic units for data analysis. This paper proposes an object-based approach to correspondence analysis for change detection, whose performance is compared with those of pixel-based PCA and CA. Results show that the object-based CA method produced the best accuracy in change detection.

Keywords: change detection, correspondence analysis, objects, accuracy

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
HaoGong2008accuracy.pdf581.36 KB

Object-Oriented Classification of High Resolution Satellite Image for Better Accuracy

Luyao Huang  + and Ling Ni
School of Remote Sensing and Information Engineering, Wuhan University, Wuhan 430079, China 

Abstract. Compared with middle and low resolution satellite images, high resolution satellite images have richer spatial but less spectral information. When these images are used for classification and land cover/use extraction, previous studies have shown that conventional pixel-based statistical methods cannot obtain very satisfying results. To address this problem, the paper introduces and tests an object-oriented, segmentation-based method for classifying high resolution remotely sensed data: the image is subdivided into separate regions called objects according to spectral and spatial heterogeneity in the segmentation process, and the objects are then assigned to specific classes according to detailed class descriptions in the classification process. Taking a QuickBird image of Wuhan as an example, with Erdas Imagine and eCognition as the software platforms, the paper carries out two kinds of supervised classification, pixel-based and object-oriented, and uses the error matrix to analyse and compare the final classification accuracy. The experiment demonstrates that, when adequate samples and segmentation parameters are chosen, the object-oriented method greatly reduces the influence of noise and achieves higher classification accuracy and efficiency than the pixel-based method. Meanwhile, the classification result of the object-oriented method is much easier to understand and explain.

Keywords: object-oriented classification, supervised classification, high resolution, eCognition

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
LuyaoHuang2008accuracy.pdf456.23 KB

On Effects of Weights in Spatial Interpolation

Chaokui Li, Liang Chen, Yong Wang and Shuanning Zheng 
Institute of Geospatial Information Science, Hunan University of Science and Technology, Xiangtan,  Hunan 411201, China

Abstract. Nowadays, GIS software systems provide some generic interpolation methods, and errors may occur due to inappropriate setting of parameters. For example, in IDW the default value of the exponent is 2, but experiments show that 2 may not be the best choice. This article discusses the inverse distance weighted (IDW) method and spline functions. It was found that: (1) the choice of weights in IDW has a great impact on the interpolation results, so the exponent must be chosen according to the specific circumstances of the problem domain; (2) in areas where data points are scarce or missing, increasing the IDW exponent appropriately can improve the results; (3) in comparison to IDW, splines are more suitable for situations where the surface changes gradually, such as temperature, elevation, groundwater temperature, concentration of pollution, and so on.
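A minimal IDW sketch with a tunable exponent, using hypothetical points, which makes the effect of the weight (power) parameter easy to see:

import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    # Inverse distance weighted interpolation with a tunable exponent.
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=-1)
    w = 1.0 / np.maximum(d, eps) ** power
    return (w @ z_known) / w.sum(axis=1)

# Hypothetical known points and one query point; compare three exponents.
xy = np.array([[0., 0.], [1., 0.], [0., 1.], [2., 2.]])
z  = np.array([10., 12., 11., 20.])
q  = np.array([[0.5, 0.5]])
print("power=1:", idw(xy, z, q, power=1.0))
print("power=2:", idw(xy, z, q, power=2.0))
print("power=4:", idw(xy, z, q, power=4.0))

A larger exponent concentrates the weight on the nearest points, which is why the choice of exponent matters in sparsely sampled areas.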

Keywords: IDW, regularized spline, tension spline, exponent, weight

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
ChaokuiLi2008accuracy.pdf426.96 KB

On the Representation of Spatial Uncertainties with Stochastic Simulation in Land Data Assimilation

Xujun Han + and Xin Li
Cold and Arid Regions Environmental and Engineering Research Institute, Chinese Academy of Sciences, Lanzhou, Gansu, 730000, China

Abstract. We use random simulation and geostatistical sequential simulation to represent model uncertainties by generating ensembles of uncertain inputs in land data assimilation, since a proper representation of the spatial uncertainties is very important for ensemble-based filtering methods in land data assimilation and the efficiency of the filter depends on a proper representation of the model noise statistics. The method outlined in this paper explicitly acknowledges three sources of uncertainty and takes the spatial structure of the variables into consideration. To restrict the simulation interval of the uncertain inputs, a geostatistical interpolation technique, a geostatistical extrapolation technique and a truncated normal random number generator are applied. Simulation results from these uncertain inputs indicate that the method is sufficient to guarantee the separation of the soil moisture ensembles and makes it easy to introduce non-additive noise. We demonstrate the applicability of stochastic simulation for representing the spatial uncertainties of the model in land data assimilation.
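A minimal sketch of the truncated-normal perturbation step, assuming a hypothetical nominal soil moisture field and bounds; the spatial correlation that the paper adds through sequential Gaussian simulation is not reproduced here:

import numpy as np
from scipy.stats import truncnorm

# Hypothetical ensemble of perturbed soil moisture fields: each member is the
# nominal field plus truncated-normal noise, so values stay in a physical range.
nominal = np.full((10, 10), 0.25)      # volumetric soil moisture (m3/m3)
sigma, lower, upper = 0.05, 0.05, 0.45
a, b = (lower - nominal) / sigma, (upper - nominal) / sigma

n_members = 20
ensemble = np.stack([
    truncnorm.rvs(a, b, loc=nominal, scale=sigma, random_state=i)
    for i in range(n_members)
])
print(ensemble.min(), ensemble.max())  # all members stay within [0.05, 0.45]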

Keywords: random simulation, geostatistics, sequential Gaussian simulation, data assimilation.

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
XujunHan2008accuracy.pdf321.12 KB

Propagation and Visualization of Uncertainty in NL-Based Spatial Analysis

Danhuai Guo 1+ 
1 Institute of Remote Sensing Applications, Chinese Academy of Sciences, Beijing, China 

Abstract. Natural language (NL), as the most natural means of human expression, is an ideal interface for spatial analysis. GIS and AI scientists are paying growing attention to the uncertainty of NL-based spatial queries and spatial analysis, which includes the uncertainty of natural language expression and processing as well as that of the GIS model and the spatial data source. The uncertainty of natural language expression, which originates from the vagueness of natural language itself, the limits of the spatial cognitive ability of speakers and the incompleteness of natural language expression, will probably be carried into the natural language processing result. In NL-based spatial analysis, the uncertainty of the calculated result comes from the uncertainty of the starting feature, the uncertainty of the operator and the uncertainty of the calculation factors, and the confidence of the spatial analysis result is a mixture of these uncertainty sources. Furthermore, the visualization of spatial analysis uncertainty, another active issue in uncertainty research, helps domain experts and other users plainly appreciate the reliability of spatial analysis results. Focusing on the propagation and visualization of uncertainty in NL-based spatial analysis, this paper first summarizes the uncertainty sources of NL-based spatial analysis, especially in NL-based location determination systems; second, it sets up an uncertainty propagation model for NL-based spatial analysis, illustrated with the varying confidence distribution along different parts of linear and areal features; third, it designs a plain uncertainty visualization in the GIS GUI; finally, an experimental prototype is developed to verify the models, the analysis results and the visualization design.

Keywords: spatial analysis, natural language, natural  language-based spatial  analysis, uncertainty propagation, uncertainty visualization, confidence

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
DanhuaiGuo2008accuracy.pdf390.27 KB

Properties and Applications of the Interpolation Variance Associated with Ordinary Kriging Estimates

Jorge Kazuo Yamamoto + and Marcelo Monteiro da Rocha 
Department of Environmental and Sedimentary Geology, Institute of Geosciences, University of Sao Paulo, Brazil. 

Abstract. This paper shows some properties and applications of the interpolation variance, a reliable alternative to the kriging variance. The interpolation standard deviation is correlated with the estimates and also shows spatial correlation that follows the spatial pattern of the original data. Moreover, it can be used to compute the global estimation variance associated with the average of the ordinary kriging estimates. Based on this uncertainty measure we have developed numerical procedures for correcting the smoothing effect of ordinary kriging estimates and for backtransforming lognormal kriging estimates. Corrected or backtransformed estimates do not present any bias when compared to classical solutions. Properties and applications of the interpolation variance are illustrated on three data sets with different frequency distributions: lognormal, normal and negatively skewed.
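A short sketch of the interpolation variance for a single ordinary kriging estimate, assuming the commonly used definition as the weighted squared deviation of the neighbouring data from the estimate, with the ordinary kriging weights as the weights (the inputs below are hypothetical):

import numpy as np

def interpolation_variance(weights, data, estimate):
    # s0^2 = sum_i lambda_i * (z_i - z*_0)^2, using the ordinary kriging weights.
    return float(np.sum(weights * (data - estimate) ** 2))

# Hypothetical ordinary kriging result at one location: neighbour values and weights.
z = np.array([3.1, 2.7, 4.0, 3.5])
lam = np.array([0.4, 0.3, 0.2, 0.1])   # OK weights, summing to 1
z_star = float(lam @ z)                 # the OK estimate itself
print(z_star, interpolation_variance(lam, z, z_star))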

Keywords: kriging variance, interpolation variance, smoothing effect, ordinary kriging

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
Yamamoto2008accuracy2.pdf457.54 KB

Quality Check in Urban and Rural Cadastral Spatial Data Updating

Dezhu Gui 1, 2+, Gang Li 1, Chengming Li 2 and Chengcheng Zhang 1, 2
1 School of Environment Science and Spatial Informatics, China University of Mining and Technology, Xuzhou, Jiangsu, 221008, China
2 Chinese Academy Surveying and Mapping, Beijing, 100039, China

Abstract. Quality control and maintenance of cadastral spatial databases is a key issue in constructing and updating cadastral information systems. The quality of cadastral spatial data includes positional accuracy, attribute accuracy, and topological consistency. In this study, different types of topological inconsistency in cadastral spatial data are analyzed, such as node mis-matching, cracks, and superposition between cadastral parcels. Methods were developed to check and correct cadastral spatial data.

Keywords: cadastral, topological relationships, spatial data quality, check, modification

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
DezhuGui2008accuracy.pdf226.08 KB

Quantifying Degrees of Information in Remote Sensing Imagery

Zongjian Lin 1 +, Bing Deng 2 
1 Chinese Academy of Surveying and Mapping, 16 Beitaiping Road, Haidian District, Beijing 100039, China
2 School of Remote Sensing and Information Engineering, Wuhan University, Wuhan 430079, China

Abstract. So far there has been no established metrology for measuring the information content of remote sensing imagery. We introduce the concept of "information quantity" from information theory to address this problem. In this paper, the method and formulation for calculating the information content of remote sensing imagery are discussed. Furthermore, we calculate the information quantity of some imagery and analyze the factors that affect the calculation of information quantity.
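A minimal sketch of one common first-order measure, the Shannon entropy of a band's grey-level histogram; note that it ignores the inter-pixel correlation that the paper also analyses, and the input images are hypothetical:

import numpy as np

def information_quantity(image, levels=256):
    # Shannon entropy (bits/pixel) of the grey-level histogram of one band.
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# A uniform-noise 8-bit band carries close to 8 bits/pixel; a constant band carries 0.
rng = np.random.default_rng(3)
print(information_quantity(rng.integers(0, 256, size=(100, 100))))
print(information_quantity(np.full((100, 100), 128)))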

Keywords: information quantity, uncertainty, entropy, remote sensing imagery, correlation

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
ZongjianLin2008accuracy.pdf312.9 KB

Remote Sensing Image Classification Based on Improved Fast Independent Component Analysis

Fangfang Li 1, Benlin Xiao 2 +, Yonghong Jia 1 and Xingliang Mao 3
1 School of Remote Sensing and Information Engineering, Wuhan University, Wuhan, 430079, China
2 Civil Engineering & Architecture School, Hubei University of Technology, Wuhan, 430068, China
3 Information Office of the People's Government of Hunan Province, Changsha, 410011, China

Abstract. As the number of classification categories increases, the probability of misclassification rises and classification speed decreases. If certain types of pixels can be separated out in advance and the remaining pixels then classified, the probability of mistakes can be reduced effectively. This paper proposes an improved Fast Independent Component Analysis (Fast-ICA) based remote sensing image classification algorithm. Firstly, we analyze the core iterative process of the Fast-ICA algorithm and adopt adaptive step-size control in the search strategy, thus avoiding the large number of iterations caused by a step that is too small or too large. Secondly, because the initial value strongly affects the results of the Fast-ICA algorithm, a favourable initial matrix is selected before the iterative process. Next, the improved algorithm is used to separate out certain types of pixels in advance, so as to simplify the subsequent classification. Finally, we compare the results of this algorithm with the standard Fast-ICA algorithm, principal component analysis (PCA) and ratio transformation. The experimental results show the effectiveness of using this algorithm in image classification.
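A hedged sketch using the standard FastICA implementation in scikit-learn as a stand-in for the paper's improved variant (the adaptive step size and initial-matrix selection are not reproduced); the image cube and the threshold are hypothetical:

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(5)
# Hypothetical image cube: 100x100 pixels, 6 bands, flattened to (pixels, bands).
cube = rng.normal(size=(100, 100, 6))
X = cube.reshape(-1, 6)

# Standard FastICA as a stand-in for the improved algorithm described above.
ica = FastICA(n_components=4, random_state=0)
components = ica.fit_transform(X).reshape(100, 100, 4)

# Pixels with extreme values in one independent component could be
# "separated out in advance" before classifying the remainder.
mask = np.abs(components[..., 0]) > 2.0
print(mask.sum(), "pixels flagged for early separation")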

Keywords:  independent component analysis, Fast-ICA algorithm, image classification, principal component analysis

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
FangfangLi2008accuracy.pdf483.17 KB

Residual Error Analysis of GPS Data Sequence Based on WP

Guoqing Qu +, Xiaoqing Su and Baomin Han
Shandong University of Technology, Zibo 255049, China

Abstract. A GPS observation sequence contains many kinds of influencing factors, and the functional relations between them are complicated. This hinders the extraction of feature information and limits the explanatory power of parameter models. In the residual error that remains in the signal after a series of standard corrections (differencing, tide correction, and so on), systematic error is relatively large compared with random error. Furthermore, the information contributed by different factors behaves differently in the systematic error and shows some periodicity in the frequency domain. If these factors can be separated during positioning and orbit determination, not only will the precision of both be enhanced, but the separated components can also provide material for studies in other disciplines. Traditional parameter estimation methods are not very efficient for these problems. In this paper, the periodicity of the error in a GPS data sequence is analyzed, and residual error terms with periods of a year, half a year, a month and half a month are extracted from different frequency bands obtained by the wavelet packet transform; the corresponding residual errors are then acquired. Frequency aliasing between sub-bands appears during this process: the wavelet filters applied are not ideal, so a sub-band contains some frequencies belonging to neighbouring bands, and up-sampling and down-sampling these mixed sub-bands causes frequency folding because the sampling theorem is not satisfied. Both effects lead to frequency aliasing during decomposition and reconstruction in the wavelet packet algorithm. To eliminate or weaken the impact of aliasing, measures are studied to improve the quality of the periodic terms extracted from GPS data. The applicability of the method is demonstrated in a test example, and the terms with periods of a year, half a year, a month and half a month acquired with it are more reliable.
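
As a minimal sketch of the decomposition step (using the PyWavelets package, with a synthetic daily residual series standing in for real GPS data), one low-frequency wavelet-packet sub-band, in which an annual term would be expected to appear, can be isolated and reconstructed as follows. The anti-aliasing measures discussed in the paper are not reproduced here.

    import numpy as np
    import pywt

    t = np.arange(2048)                                   # hypothetical daily epochs
    resid = np.sin(2 * np.pi * t / 365.25) + 0.1 * np.random.randn(t.size)

    wp = pywt.WaveletPacket(resid, wavelet='db4', mode='symmetric', maxlevel=5)

    # Keep only the lowest-frequency node at level 5 and reconstruct it.
    sub = pywt.WaveletPacket(None, wavelet='db4', mode='symmetric', maxlevel=5)
    sub['aaaaa'] = wp['aaaaa'].data
    annual_component = sub.reconstruct(update=False)[:resid.size]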

Keywords: GPS, wavelet packet, frequency aliasing, feature extraction, residual error

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
GuoqingQu2008accuracy.pdf265.18 KB

Revealing Long Term Land Use and Land Cover Change in a Severely Disturbed Environment

Zhenyu Zhang 1, 2 +, Jim Peterson 1, Xuan Zhu 1 and Wendy Wright 3  
1 Centre for GIS, School of Geography and Environmental Science, Monash University, Clayton, Victoria 3800, Australia
2 Faculty of Engineering and Surveying, University of Southern Queensland, Toowoomba, Queensland 4350, Australia
3 School of Applied Sciences and Engineering, Monash University, Churchill, Victoria 3842, Australia

Abstract. Land use and land cover change (LUCC) is one of the important drivers of environmental change on all spatial and temporal scales. LUCC contributes significantly to earth atmosphere interactions, forest fragmentation, and biodiversity loss. It has become one of the major issues for environmental change monitoring and natural resource management. This study aims to reveal the long term land use and land cover changes (from 1939 to 2004) in the Strzelecki Ranges, Victoria, Australia by integrating remote sensing and geographical information system (GIS) and to provide quantitative analysis of LUCC information in the area. The land use and land cover is derived from historical aerial photography with the support of Vicmap Elevation, Ecological Vegetation Classes (EVCs) map and stereo models established by using stereo pair of aerial photographs. The EVC map provides a good ground truth not only for the 2004 imagery, but also is part of the reference for interpreting 1988, 1972, 1954 and 1939 orthoimages. The interpretation was carried out with  respect to the forest canopy patterns that appeared on the imagery, relationships with other land covers, and DEM derived attributes such as aspects and slope declivity. The results show that land use and land cover in this area changed substantially from 1939 to 2004. Large areas of cleared land and natural forest regrowth on previously cleared land were gradually converted to plantations. The area covered by cool temperate rainforest has remained relatively stable throughout the period. 

Keywords: land use and land cover, GIS, classification, environment, ecology. 

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
ZhenyuZhang2008accuracy.pdf898.78 KB

Robust Interpolation of Agricultural Census Data to Hydrological Units and Implications for Diffuse Pollution Modelling

Paulette Posen 1+, Michael Hutchins 2, Andrew Lovett 1 and Helen Davies 2 
1 School of Environmental Sciences, University of East Anglia, Norwich, Norfolk NR4 7TJ, UK
2 Centre for Ecology and Hydrology Wallingford, Maclean Building, Crowmarsh Gifford, Wallingford, Oxfordshire OX10 8BB, UK

Abstract. Diffuse pollution from agriculture is often responsible for observed concentrations of agricultural compounds being in excess of the upper limits prescribed by the EU Water Framework Directive (WFD) in some river catchments, and reductions in these concentrations will require widespread changes in farm practice. One of the aims of the Catchment hydrology, Resources, Economics and Management (ChREAM) study at the University of East Anglia in the UK is to assess likely impacts of WFD implementation on agricultural land use, and consequent implications for water quality and farm incomes. An element of this has involved updating an existing diffuse pollution model to reflect present-day land use profiles and comparing outputs (in terms of nitrate concentrations) from current land use with those modelled from early 1990s land use. Combining agricultural land use data with hydrological spatial units can involve a number of problems arising from the integration of a variety of data formats at a range of spatial and temporal resolutions, and the aggregation of source data over different spatial extents. This work assesses uncertainty arising from areal interpolation of agricultural census data to hydrological units in the River Derwent catchment in north-east England. The study sets out to identify the range of spatial resolutions at which robust estimations of agricultural land use can be made and examines the implications for diffuse pollution modelling.
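
For context, the simplest form of areal interpolation mentioned above is area weighting, sketched below with geopandas; the layer names and attribute columns are hypothetical, and the paper's own robust procedure and uncertainty assessment are in the PDF.

    import geopandas as gpd

    parishes = gpd.read_file("census_units.shp")      # hypothetical, has column 'arable_ha'
    subcatch = gpd.read_file("hydro_units.shp")       # hypothetical, has column 'unit_id'

    parishes["src_area"] = parishes.geometry.area
    pieces = gpd.overlay(parishes, subcatch, how="intersection")
    pieces["weight"] = pieces.geometry.area / pieces["src_area"]
    pieces["arable_alloc"] = pieces["arable_ha"] * pieces["weight"]

    # Area-weighted estimate of arable area per hydrological unit.
    estimate = pieces.groupby("unit_id")["arable_alloc"].sum()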

Keywords: agricultural census, diffuse pollution modelling, land use, nitrates, Water Framework Directive

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

 

AttachmentSize
Posen2008accuracy.pdf365.92 KB

Sample Size Determination for Image Classification Accuracy Assessment and Comparison

Giles M. Foody +
School of Geography, University of Nottingham, NG7 2RD, UK

Abstract. The classification accuracy statement is the basis of the evaluation of a classification's fitness for purpose. Accuracy statements are also used for applications such as the evaluation of classifiers, with attention focused especially on differences in the accuracy with which data are classified. Many factors influence the value of a classification accuracy assessment and evaluation programme. This paper focuses on the size of the testing set(s) and its impact on accuracy assessment and comparison. Testing set size is important, as an inappropriately large or small sample could lead to limited and sometimes erroneous assessments of accuracy and of differences in accuracy. In this paper the basic statistical principles of sample size determination are outlined, and some of the basic issues of sample size determination for accuracy assessment and accuracy comparison are discussed. With the latter, the researcher should specify the effect size (minimum meaningful difference), significance level and power used in an analysis, and ideally also fit confidence limits to estimates. This will help design a study as well as aid interpretation. In particular, it will help avoid problems such as under-powered analyses and provide a richer information base for classification evaluation. Central to the argument is a discussion of Type II errors and their control. The paper includes equations that could be used to determine sample sizes for common applications in remote sensing, using both independent and related samples.
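
One standard formula of the kind referred to above, for comparing two independent proportions (for example, the overall accuracies of two classifications) with a specified effect size, significance level and power, is sketched below; it is a textbook expression and not necessarily the specific equations given in the paper.

    import math
    from scipy.stats import norm

    def n_per_group(p1, p2, alpha=0.05, power=0.8):
        """Per-group sample size to detect p1 vs p2 with a two-sided z test."""
        z_a = norm.ppf(1 - alpha / 2)
        z_b = norm.ppf(power)
        p_bar = (p1 + p2) / 2
        num = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
               + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
        return math.ceil(num / (p1 - p2) ** 2)

    # e.g. to detect a difference between 85% and 90% overall accuracy:
    print(n_per_group(0.85, 0.90))   # about 686 testing cases per classification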

Keywords: remote sensing classification, sample size, Type I and II error, power, confidence interval.

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
Foody2008accuracy.pdf264.06 KB

Sampling Designs for Assessing Map Accuracy

Stephen V. Stehman
State University of New York (SUNY)
College of Environmental Science and Forestry (ESF)
320 Bray Hall, Syracuse, NY 13210 USA

Abstract. Assessing map accuracy requires comparing the categories or quantities mapped to the reality of what is on the ground. Practical necessity dictates that the ground condition can only be determined for a sample of locations. Thus sampling design becomes a critical component of accuracy assessment. Historically, the basic sampling designs implemented for map accuracy assessment were simple random, systematic, stratified random, and cluster sampling. These designs remain the fundamental building blocks of effective sampling design for accuracy assessment. The demands placed upon accuracy assessment have increased as the richness of spatial data and applications have expanded. Desirable assessment objectives now extend beyond the analysis based on an error matrix to include accuracy of gross and net change, composition of the classes mapped (at one or more levels of support), and landscape features (e.g. patch size and shape distributions). Quantitative map products, for example, maps of percent impervious surface or percent forest canopy cover, pose new sampling design challenges. Perhaps the biggest challenge of an expanded set of objectives is the requirement to collect reference data for assessment units of different sizes (e.g. 30 m by 30 m pixel, 3x3 pixel block, or 5 km by 5 km block). Multi-stage cluster sampling becomes a prominent design option when attempting to meet multiple objectives targeting multiple sizes of assessment units. Sampling design choices become more difficult as the number of accuracy objectives increases. Different sampling designs are suited to achieve some objectives better than others, and trade-offs among desirable design criteria must be recognized and factored into the decision-making process. As mapping science continues to advance, to keep pace, accuracy assessment sampling designs need to be developed and evaluated to address these emerging new objectives. Sampling designs can no longer just target the traditional descriptive accuracy objectives encapsulated by the error matrix analyses, but must simultaneously permit assessment of additional objectives such as accuracy of land-cover composition and landscape pattern, and accuracy of quantitative map products.
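
As a concrete illustration of one of the basic building blocks named above, a stratified random sample of reference pixels with a fixed allocation per mapped class can be drawn as in the minimal sketch below; the raster and allocation are hypothetical and no particular assessment protocol is implied.

    import numpy as np

    def stratified_sample(class_map: np.ndarray, n_per_class: int, seed: int = 0):
        """Return {class value: [(row, col), ...]} with n_per_class samples per mapped class."""
        rng = np.random.default_rng(seed)
        samples = {}
        for c in np.unique(class_map):
            rows, cols = np.nonzero(class_map == c)
            idx = rng.choice(rows.size, size=min(n_per_class, rows.size), replace=False)
            samples[int(c)] = list(zip(rows[idx].tolist(), cols[idx].tolist()))
        return samples

    # Example with a hypothetical categorical raster.
    class_map = np.random.randint(1, 5, size=(200, 200))
    sample = stratified_sample(class_map, n_per_class=50)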

Keywords: multi-stage cluster sampling, landscape pattern, land-cover composition, quantitative outputs

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
Stehman2008accuracy.pdf202.07 KB

Scale Effects and Correction for Land Surface Albedo in Rugged Terrain

Jianguang Wen 1, 3 +, Qiang Liu 1, Qinhuo Liu 1, Qing Xiao 1, 2 and Xiaowen Li 1 

1 State Key Laboratory of Remote Sensing Science, Jointly Sponsored by the Institute of Remote Sensing
Applications of Chinese Academy of Sciences and Beijing Normal University, Beijing 100101, China
2 Beijing Research Institute of Uranium Geology, Beijing 100029, China
3 Graduate School of the Chinese Academy of Sciences, Beijing 100049, China

Abstract. It is well known that the influence of topography must be accounted for when using high-resolution remote sensing data to estimate land surface reflectance or albedo in rugged terrain. However, when moderate- or low-resolution satellite data are used, topographic effects on albedo calculation are generally considered negligible because the slope of low-resolution pixels is usually small. A potential problem is that topographic effects within a low-resolution pixel are thereby omitted, which may cause error in albedo estimation. This problem arises from the scale effect in land surface albedo. This paper investigates, in theory, whether there is a scale effect in land surface albedo when upscaling high-resolution surface albedo to low-resolution surface albedo in rugged terrain. Based on this analysis, we present a method to upscale high-resolution surface albedo to coarse resolution, and finally derive a correction factor for land surface albedo derived from low-resolution remote sensing data. The method put forward in this paper is useful and effective for scale correction of land surface albedo, and it is also a good way to calculate albedo from low-resolution remote sensing data in rugged terrain.

Keywords: albedo, scale effect, rugged terrain, topographic effects

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
JianguangWen2008accuracy.pdf418.22 KB

Scale-Span Classification of Multispectral Images Based on Feature Construction and Decision Trees

Ning Shu 1, 2+, Liqun Lin 1, Yan Gong 1, Jun Xiao 2 and Fangfang Jin 3 
1 The School of Remote Sensing Information Engineering, Wuhan University, Wuhan, China
2 National Lab for Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan, China
3 North Star Power Science and Technology CO., LTD, Hangzhou, China

Abstract. Most existing classification methods based on homogeneous regions involve choosing a best segmentation criterion. This paper argues that using a single so-called best scale to classify multi-scale objects, whose scales are defined subjectively by humans, is not the best way to meet the scale requirements of interpretation. The paper therefore proposes a new scale-span classification method based on a multi-scale homogeneous-region model. The method uses feature construction to build scale-span features, so that the best-scale choice is implicit in the newly constructed features rather than being made directly. The experimental results show that the new scale-span features can reduce the dimension of the feature space and make full use of information across different scales, thus improving classification accuracy.

Keywords: feature construction, scale-span, genetic programming, multispectral images, classification, homogeneous-region

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
NingShu2008accuracy.pdf393.25 KB

Simulation of Coastlines Based on Cloud Fractal

Xue Yang +, Kun Qin, Cijun Wu and Li Chen
School of Remote Sensing Information Engineering, Wuhan University, 129 Luoyu Road, Wuhan, 430079, China

Abstract. Simulating coastlines with fractal methods is an important subject. In order to describe coastlines realistically, uncertainty, including fuzziness, randomness and other forms, should be considered. Based on cloud model theory, this paper puts forward an improved fractal method of midpoint subdivision interpolation, which uses the cloud model to express the randomness contained in the self-similarity of natural scenery and embodies the diversity of uncertainty. Coastline simulation experiments validate the proposed improved method.
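
A minimal sketch of the general idea, assuming a 1-D normal cloud generator: classical midpoint displacement in which each random offset is drawn as a "cloud drop" (expectation Ex, entropy En, hyper-entropy He) instead of a plain Gaussian. The parameter values and cooling of the amplitude are illustrative, not the paper's settings.

    import numpy as np

    rng = np.random.default_rng(0)

    def cloud_drop(ex: float, en: float, he: float) -> float:
        en_prime = abs(rng.normal(en, he))       # randomised entropy (normal cloud model)
        return rng.normal(ex, en_prime)          # the drop itself

    def fractal_line(y0, y1, depth, en, he, roughness=0.5):
        pts = [y0, y1]
        for level in range(depth):
            amp = en * roughness ** level        # displacement amplitude decays with level
            new_pts = []
            for a, b in zip(pts[:-1], pts[1:]):
                mid = (a + b) / 2 + cloud_drop(0.0, amp, he * roughness ** level)
                new_pts.extend([a, mid])
            new_pts.append(pts[-1])
            pts = new_pts
        return np.array(pts)

    coast = fractal_line(0.0, 0.0, depth=8, en=1.0, he=0.1)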

Keywords: coastline simulation, fractal, cloud model, uncertainty

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
XueYang2008accuracy.pdf270.61 KB

Spatial Autocorrelation and Random Effects in Digitizing Error

Daniel A. Griffith
Ashbel Smith Professor of Geospatial Information Sciences
University of Texas at Dallas, USA

Abstract. The volume of georeferenced data that have involved manual digitizing suggests that error associated with this process merits a more thorough analysis, especially given recent advances in conventional and spatial statistical methodology. Spatial filtering and mixed modeling techniques are used to analyze manual digitizing outcomes of an experiment involving Hill's famous drumlins data of Northern Ireland. Findings include: (a) spatial autocorrelation plays an important role, and (b) a random effects term accounts for a nontrivial amount of variability.

Keywords: digitizing error, drumlins, random effects, spatial autocorrelation

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
Griffith2008accuracy.pdf438.76 KB

Spatial Projection Rectification for Densifying Ground Control Points

Di Wu 1 +, Yanchun Liu 1, Lixin Guo 1 2, Fang Cheng 1 and Gaixiao Li 1 
1 Department of Hydrography and Cartography, Dalian Naval Academy, No. 667, Jiefang Road, Dalian, Liaoning Province, China
2 Surveying and Mapping Academy, Information Engineering University of the People’s Liberation Army, No. 66, Longhai Road, Zhengzhou, Henan Province, China

Abstract. Ground control points for marine and coastal remote sensing images often cannot be properly selected because of the special characteristics of the ocean, so the accuracy of geometric rectification is restricted. A new exact geometric rectification method based on spatial projection is put forward, which can provide dense ground control points. The rectification principle and steps of this method are systematically studied, and the rectification model is examined. Experimental results show that the accuracy of the spatial projection rectification model is 1.078 pixels, so it is an effective method for remote sensing images lacking ground control points.

Keywords:  geometric exact rectification, spatial projection, rectification accuracy, ground control point, coastal zone, remote sensing image

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
DiWu2008accuracy.pdf375.33 KB

Spatial Uncertainty Assessment in the Reconstruction of Presettlement Forest Patterns in Western NY, USA

E.-H. Yoo 1, Y.-C. Wang 2 and A. Trgovac 1+ 
1 Department of Geography, University at Buffalo, The State University of New York
2 Department of Geography, National University of Singapore

Keywords: sequential indicator simulation, forest landscape, presettlement forest, vegetation reconstruction, spatial uncertainty

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
Yoo2008accuracy.pdf168.57 KB

Spatial properties of design-based versus model-based approaches to environmental sampling

Don L. Stevens, Jr.
Department of Statistics, 44 Kidder Hall
Oregon State University, Corvallis, OR 97333 USA
Tel.: +001 541 737 3587; Fax +001 541 737 3489
stevens@stat.oregonstate.edu

Abstract. It is widely recognized that an efficient sample of a spatially distributed resource will have some degree of regularity. For example, locating sample points at the nodes of a regular grid is an optimal model-based design for some semivariograms and domain shapes. Locating points becomes more complicated if the domain has an irregular shape or if the design incorporates existing sample points. In this talk, I review some model-based techniques, such as spatial simulated annealing, for incorporating prior knowledge in locating new sample points. These techniques are contrasted with design-based techniques, such as generalized random tessellation stratification, that can also incorporate prior knowledge and existing sample points.
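
A minimal sketch of spatial simulated annealing follows, assuming a deliberately simplified design criterion (maximise the minimum inter-point distance in a unit-square domain); real applications would typically optimise a variogram-based criterion and honour fixed existing points.

    import numpy as np

    rng = np.random.default_rng(1)

    def min_dist(pts):
        d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
        np.fill_diagonal(d, np.inf)
        return d.min()

    pts = rng.random((30, 2))                      # initial random sample of 30 points
    crit = min_dist(pts)
    temp = 0.1
    for it in range(20000):
        cand = pts.copy()
        i = rng.integers(len(pts))
        cand[i] = np.clip(cand[i] + rng.normal(0, temp, 2), 0, 1)   # perturb one point
        c = min_dist(cand)
        if c > crit or rng.random() < np.exp((c - crit) / temp):    # annealing acceptance rule
            pts, crit = cand, c
        temp *= 0.9998                                              # cooling schedule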

Keywords: spatial simulated annealing, optimal spatial design

In: Caetano, M. and Painho, M. (eds). Proceedings of the 7th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, 5 – 7 July 2006, Lisboa, Instituto Geográfico Português

AttachmentSize
Stevens2006accuracy.pdf343.53 KB

Spatio-Temporal Reconstruction of MODIS NDVI Data Sets Based on Data Assimilation Methods

Juan Gu +, Xin Li and Chunlin Huang
Cold and Arid Region Environmental and Engineering Research Institute, CAS, Lanzhou, 730000, China

Abstract. A consistent Normalized Difference Vegetation Index (NDVI) time series is a powerful tool for monitoring ecological resources that are being altered by climate and human impacts, since its temporal evolution is strongly linked to changes in the state of the land surface. However, noise caused mainly by cloud contamination, heavy aerosol, atmospheric variability, background soil signal and bi-directional effects impedes further application of NDVI data. In this work, a data assimilation method is proposed to reconstruct high-quality, spatially and temporally continuous MODIS NDVI data. Historical MODIS NDVI data are used to generate a background field of NDVI with a simple three-point smoothing technique, which generally captures the annual pattern of vegetation change. At every time step, the quality assurance (QA) flags in the MODIS VI products are used to determine empirically the weight between the background field and the NDVI observation. Additionally, a gradient inverse weighted (GIW) filter is applied to remove spatial discontinuities. Finally, more reliable NDVI data are generated. The method is applied to the 16-day L3 Global 1 km SIN Grid NDVI data sets covering west China during 2003-2006. The results indicate that the newly developed method is simple and effective in reconstructing high-quality MODIS NDVI time series.
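
A minimal sketch of the weighting idea described above: a three-point-smoothed background series is blended with each observation using an empirical weight looked up from the QA flag. The weight values, flag coding and series below are hypothetical, not those used in the paper, and the GIW spatial filtering step is omitted.

    import numpy as np

    QA_WEIGHT = {0: 0.9, 1: 0.6, 2: 0.2, 3: 0.0}      # good ... cloudy (assumed coding)

    def three_point_smooth(series: np.ndarray) -> np.ndarray:
        out = series.copy()
        out[1:-1] = (series[:-2] + series[1:-1] + series[2:]) / 3.0
        return out

    def assimilate(background: np.ndarray, obs: np.ndarray, qa: np.ndarray) -> np.ndarray:
        w = np.vectorize(QA_WEIGHT.get)(qa).astype(float)
        return w * obs + (1.0 - w) * background       # observation weighted by its quality

    bg = three_point_smooth(np.random.rand(23))       # hypothetical 16-day background series
    obs = np.random.rand(23)
    qa = np.random.randint(0, 4, 23)
    ndvi = assimilate(bg, obs, qa)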

Keywords: data assimilation, gradient inverse weighted filter, MODIS NDVI, spatio-temporal reconstruction

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
JuanGu2008accuracy.pdf311.49 KB

Study on Forest Fire Prevention Three-Dimensional Background Database

Aijun Xu 1, Yongshun Li 2 and Danfeng Wang 1
1School of Information Engineering, Zhejiang Forestry University, Lin’an, Zhejiang, 311300, China 
2The Real Estate Surveying & Mapping Office, Kaituo, Hangzhou, Zhejiang, 310003, China

Abstract. The forest fire prevention background database is the data set that contains all data related to forest fire prevention, mainly including attribute data, spatial data and remote sensing image data. It is an important basis for directing forest fire prevention. By researching methods for building three-dimensional background databases, this paper introduces a creation method for a forest fire prevention three-dimensional background database that uses MapX and OpenGL as development tools. In the MapX and OpenGL environment, the following aspects are studied: three-dimensional terrain modelling for the background database, texture mapping, two-dimensional and three-dimensional linkage, the fusion of forest resource and fire prevention resource data, and the design of the forest fire prevention three-dimensional background database itself. Realistic three-dimensional terrain is established with OpenGL from DEM altitude data together with illumination and normal vector settings, and realistic display is achieved by integrating the OpenGL texture mapping mechanism with remote sensing data over the three-dimensional terrain. Taking the two-dimensional attribute data in MapInfo format as a foundation and applying a suitable coordinate transformation, linkage between the two-dimensional map and the three-dimensional scene is achieved.

Keywords: forest fire prevention, three-dimensional background database, data fusion, coordinate linkage, 3D modelling

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
AijunXu2008accuracy.pdf513.45 KB

Sustainability, Climate Change and Uncertainty

Bryan C. Pijanowski
Department of Forestry and Natural Resources,
Human-Environment Modeling and Analysis (HEMA) Laboratory and Purdue Climate Change Research Center, Purdue University, West Lafayette, Indiana 47906 USA

Abstract. I present what I view as three levels of uncertainty that are inherent in the sustainability, climate change and human well-being problem. I argue that new approaches are needed to address these uncertainties. These include the need to study more problems that link ecosystem services to human well-being, the need to reduce the scientific community's dependency on "getting it perfect" before knowledge is communicated to decision makers, and the need to use tools and approaches from other disciplines to expand our "imagination" about future pathways and endpoints.

Keywords: uncertainty, ecosystem services, sustainability, climate change and resilience

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
Pijanowski2008accuracy.pdf398.49 KB

Temporal and Spatial Thermal Radiation Distribution Analysis within and above Crop Canopies by 3D Simulation

Huaguo Huang 1, 2, Lei Wang 1, Yang Zhang 2 and Qinhuo Liu 2 + 
1 Key Laboratory for Silviculture and Conservation, Ministry of Education;            
College of Forestry, Beijing Forestry University, 100083, Beijing, P.R. China
2 State Key Laboratory of Remote Sensing Science, Jointly Sponsored by the Institute of Remote Sensing Applications of Chinese Academy of Sciences and Beijing Normal University, 100101, Beijing, P.R. China

Abstract. The thermal radiation distribution within and above the canopy is explored. The CUPID model is chosen to simulate canopy component temperatures. The TRGM model, based on a 3D realistic structure and a radiosity solution, is used to simulate the component brightness temperature distribution within the canopy and the directional distribution of thermal radiation emission TB(θ) above the canopy. It is noted that the vegetation component temperature profile mostly resembles an inverse "S" curve. The component temperature histogram is a good indicator of daily variation. The difference between component brightness temperature and thermodynamic temperature varies mainly with LAI, relative height and leaf angle distribution. By establishing the relationship between TB(θ) variation, LAI and environmental parameters for crop canopies, the temporal and angular effects of the thermal radiation distribution are discussed.

Keywords: 3D simulation, thermal radiation distribution, TRGM, CUPID model

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
HuaguoHuang2008accuracy2.pdf685 KB

The Impacts of Landscape Patterns on the Accuracy of Remotely Sensed Data Classification

Haobo Lin 1, 2, Jindi Wang 1 +, Yanchen Bo 1 and Jing Bian 2
1 Research Center for Remote Sensing and GIS, School of Geography and Remote Sensing, Beijing Normal University, China; State Key Laboratory of Remote Sensing Science, Beijing, China
2 Hebei University, Baoding, Hebei Province, China.

Abstract. The accuracy of Land Use/Land Cover (LULC) data derived from remote sensing images is critical for many applications. Classification error is caused by the interaction of numerous factors, including landscape characteristics, sensor resolution, spectral overlap, preprocessing algorithms, and classification procedures [1,2]. The purpose of this paper is to analyze the impacts of landscape characteristics on classification accuracy and to analyze the distribution of errors from a landscape pattern perspective. Logistic regression was employed to assess the impact of landscape characteristics on classification accuracy. Two landscape variables, patch size and heterogeneity, were calculated at the pixel level and sub-pixel level respectively, and their effects were evaluated. The results indicate that classification accuracy increases as land cover patch size increases and as heterogeneity decreases. The effect of patch size is more important than that of heterogeneity, and the impact of variables calculated at the sub-pixel level is more important than that of variables calculated at the pixel level.

Keywords: classification, accuracy, landscape, logistic

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
HaoboLin2008accuracy.pdf362.45 KB

The Study on Trinary Join-Counts for Spatial Autocorrelation

Songlin Zhang 1, Kun Zhang 2
1 Department of Surveying and Geomatics, Tongji University, Shanghai, 200092, China
2 Lab. of Geographic Information Science, East China Normal University, Shanghai 200062, China

Abstract. Spatial autocorrelation must handle two kinds of geographic data. One is continuous-valued variables, in which the observations are real numbers. The other is nominal variables, which consist of a set of discrete categories. The spatial autocorrelation statistic most frequently used for nominal variables is the "join-counts" statistic, which deals with two categories often referred to as "black" and "white". However, three categories are also a common case. For example, in land cover the attribute of each parcel may be changed, unchanged or uncertain, the last representing parcels belonging to neither of the first two categories; in three-valued logic, values can be true, false or unknown. This paper extends join-counts to trinary join-counts. The trinary categories are referred to as "black", "white" and "gray", and the possible types of joins are limited to black-black (BB), white-white (WW), gray-gray (GG), black-white (BW), black-gray (BG) and white-gray (WG). In order to count joins, a trinary variable xi is assigned to each region, with xi = 1 if the region is "black", xi = 0 if it is "white", and xi = -1 if it is "gray". Formulas for counting the six kinds of joins are deduced. The aim of trinary join-counts is to test the null hypothesis that the values are assigned to the regions randomly and independently. Means and variances of the trinary join-counts are calculated under the assumption of sampling without replacement. The test statistic is computed by transforming the join-counts into standardized values. Finally, two kinds of examples are designed. The first is a regular grid with trinary values; the choice of a regular grid is particularly motivated by the wide use of remote sensing data. The second is a set of irregular spatial regions, such as land cover status. The results suggest that the trinary join-count is useful and can be applied in spatial autocorrelation analysis.
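
A minimal sketch of the counting step on a regular grid with rook adjacency is given below, for a label array with values in {1, 0, -1} for black, white and gray; join keys are alphabetically sorted pairs (BB, BG, BW, GG, GW, WW). The paper's closed-form formulas, means, variances and significance test are not reproduced here.

    import numpy as np
    from collections import Counter

    LABEL = {1: "B", 0: "W", -1: "G"}

    def trinary_join_counts(grid: np.ndarray) -> Counter:
        counts = Counter()
        # horizontal and vertical neighbour pairs (each join counted once)
        for a, b in [(grid[:, :-1], grid[:, 1:]), (grid[:-1, :], grid[1:, :])]:
            for u, v in zip(a.ravel(), b.ravel()):
                key = "".join(sorted(LABEL[int(u)] + LABEL[int(v)]))
                counts[key] += 1
        return counts

    grid = np.random.choice([1, 0, -1], size=(20, 20))
    print(trinary_join_counts(grid))     # e.g. Counter({'BW': ..., 'GW': ..., ...})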

Keywords: trinary variable, join-counts statistics, spatial autocorrelation

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
SonglinZhang2008accuracy.pdf221.67 KB

The YUE-HASM Method

Tianxiang Yue and Yinjun Song
Institute of Geographical Sciences and Natural Resources Research, Chinese Academy of Sciences, 11A, Datun Road, Anwai, Beijing, 100101, China 

Abstract. Multigrid is employed to solve the partial differential equation set of SMTS, and a method for high accuracy surface modelling (YUE-HASM) is developed. Numerical tests demonstrate that the computing time of YUE-HASM is proportional to the first power of the total number of grid cells, while the computing time of SMTS is proportional to the third power of the total number of grid cells. YUE-HASM thus greatly accelerates computation, especially for simulations with a huge computational load, and at the same time greatly increases simulation accuracy.

Keywords: computational speed, multigrid method, simulation accuracy, surface modelling, YUE-HASM

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
TianxiangYue2008accuracy.pdf358.57 KB

Uncertainty Analysis of Radar Wind Retrieval with Data Assimilation

Ming Wei 1, 2+, Nan Li 1, Hao Kang 1, Yan Shi 3 and Junling Jiang 4 
1 Nanjing Sino-America Cooperative Remote Sensing Laboratory
2 Jiangsu Key Laboratory of Meteorological Disaster, Nanjing University of Information Science and Technology, Nanjing 210044, P.R. China
3 Institute of Heavy Rains, CMA, Wuhan 430074, P.R.China
4 Yantai Observatory, Yantai, Shangdong, 264003, P.R.China

Abstract. The uncertainty of Doppler radar wind retrieval with data assimilation is discussed. A single Doppler radar measures only the radial velocity of precipitation particles, and simple adjoint-model retrieval with data assimilation is one of the main retrieval methods. An experiment was carried out with Yantai single Doppler radar snowstorm data from 2:00 am, 6 December 2005. To reduce the erroneous information in the retrieved wind field, the retrieval uncertainty and the causes of error are analyzed. Mathematically and physically, parameter retrieval is an inversion problem for the differential equation, which is usually ill-posed, and many factors can affect retrieval precision. To minimize the uncertainty of wind retrieval using an adjoint model with data assimilation, its mathematical and physical characteristics must be studied in order to improve the retrieval and obtain more accurate information.

Keywords: Doppler radar, wind retrieval, uncertainty, ill-posed, local minimum

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
MingWei2008accuracy.pdf795.62 KB

Uncertainty Analysis of the GeoPEARL Pesticide Leaching Model

G.B.M. Heuvelink 1, F. Van Den Berg 1, S.L.G.E. Burgers 2 and A. Tiktak 3
1 Alterra, Wageningen University and Research Centre, PO Box 47, 6700 AA Wageningen, The Netherlands
2 Biometris, Wageningen University and Research Centre, PO Box 9101, 6700 HB Wageningen, The Netherlands
3 Netherlands Environmental Assessment Agency, PO Box 303, 3720 AH Bilthoven, The Netherlands

Abstract. GeoPEARL is a spatially distributed model describing the fate of pesticides in the soil-plant system. It calculates the drainage of pesticides into local surface waters and the leaching into the regional groundwater. GeoPEARL plays an important role in the evaluation of Dutch pesticide policy plans. This study analysed how uncertainties in soil and pesticide properties propagate through GeoPEARL for three representative pesticides. The GeoPEARL output considered is the 90th percentile of the spatial distribution of the temporal median of the leaching concentration (P90). The uncertain pesticide properties are the coefficient of sorption on organic matter and the half-life of transformation in soil. Both were assumed uncorrelated in space and were represented by lognormal probability distributions. Uncertain soil properties considered were horizon thickness, texture, organic matter content, hydraulic conductivity and the water retention characteristic. Probability distributions were derived from meta-data stored in the Dutch soil information system. A regular grid sample of 258 points covering the agricultural area in the Netherlands was randomly selected. At the grid nodes, realisations from the probability distributions of uncertain inputs were generated and used as input to a Monte Carlo uncertainty propagation analysis. The results show large uncertainties in P90, with interquartile ranges larger than the median for all three pesticides. Further analysis showed that the pesticide properties were the main source of uncertainty and that uncertainty in soil organic matter contributed to a lesser extent. Uncertainty contributions from other soil properties were negligible. These results suggest that improved assessment of soil properties will hardly improve the accuracy of the predicted pesticide leaching. Instead, more accurate assessment of the pesticide properties is required, but this is difficult because these uncertainties in fact reflect the simplified process descriptions of GeoPEARL.
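
A minimal sketch of the Monte Carlo step described above, with a placeholder leaching model and hypothetical parameter values (it is emphatically not GeoPEARL): lognormal sorption coefficients and half-lives are sampled per grid node, the model is run per realisation, and the spatial 90th percentile of the temporal median concentration (P90) is collected.

    import numpy as np

    rng = np.random.default_rng(42)
    n_nodes, n_real, n_years = 258, 100, 20

    def leaching_model(kom, dt50, years):
        """Placeholder, NOT GeoPEARL: yearly leaching concentrations (arbitrary units)."""
        return 10.0 * np.exp(-kom / 50.0) * (dt50 / 20.0) * rng.random(years)

    p90 = np.empty(n_real)
    for r in range(n_real):
        kom = rng.lognormal(mean=np.log(35.0), sigma=0.4, size=n_nodes)    # sorption coefficient
        dt50 = rng.lognormal(mean=np.log(20.0), sigma=0.3, size=n_nodes)   # half-life
        conc = np.array([np.median(leaching_model(k, d, n_years))
                         for k, d in zip(kom, dt50)])
        p90[r] = np.percentile(conc, 90)            # spatial 90th percentile per realisation

    print(np.percentile(p90, [25, 50, 75]))         # spread of P90 across realisations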

Keywords: error propagation, Monte Carlo, stochastic simulation, upscaling.

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
Heuvelink2008accuracy.pdf439.61 KB

Uncertainty and Its Propagation in Land Investigation

Haixia Mao and Wenzhong Shi
Department of Land Surveying and Geo-informatics, The Hong Kong Polytechnic University,  Hung Hom, Kowloon, Hong Kong

Abstract. The result of land investigation is very important for understanding the status of national land resources and for decision making, while the uncertainty that arises and propagates throughout the land investigation process inevitably influences this result. In other words, reducing or avoiding uncertainty is as important as the investigation itself. In this paper, we first discuss each land investigation procedure, such as image rectification, image classification and land parcel information capture, in order to detect and classify the uncertainty and its propagation. The land use map, the usual product of land investigation, consists of discrete point features, linear features and land parcels, and the uncertainty of these features can be described from several aspects, namely positional accuracy, attribute accuracy and so on. Based on a study of the practical workflow, the positional accuracy of land parcels is clearly the most important part of uncertainty estimation in land investigation. We therefore focus on describing the positional uncertainty of land parcels with a proposed model, from which a quantitative description of the positional accuracy of land parcels, and hence of the quality of land investigation, can be obtained.

Keywords: land investigation, uncertainty, propagation, model, quantitative estimation

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
HaixiaMao2008accuracy.pdf201.67 KB

Validation of Spatial Prediction Models for Landslide Susceptibility Maps

S.B.Bai 1, J.Wang 1, A. Pozdnoukhov 2 and M. Kanevski 2 
1 National Education Administration Key Laboratory of Virtual Geographic Environments, Nanjing Normal University, Nanjing, 210046, China
2 Institute of Geomatics and Analysis of Risk, University of Lausanne, Amphipole, 1015 Lausanne, Switzerland

Abstract. A wide range of numerical models and tools has been developed over recent decades to support decision making in environmental applications, ranging from physical models to a variety of statistically based methods. In this study, a landslide susceptibility map of part of the Bailongjiang River in northwest China was produced using binary logistic regression analysis. The available information includes the digital elevation model of the region, a geological map and different GIS layers, including land cover data obtained from satellite imagery. The landslides were observed and documented during field studies. To achieve the most appropriate results, some sensitivity analyses were also carried out. To validate the quality of the mapping, the study area was divided into a training part (3 sub-basins) and a validation part (2 sub-basins). The correct classification percentage and Root Mean Square Error (RMSE) for the validation data were estimated as 76.6% and 0.432, respectively.
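
A minimal sketch of the modelling and validation steps (scikit-learn), assuming a table of per-cell predictor values with a binary landslide label; the synthetic data and the simple index split stand in for the real GIS layers and the sub-basin partition used in the paper.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.random((5000, 6))                      # hypothetical predictors (slope, lithology, ...)
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.3, 5000) > 1.0).astype(int)
    train, valid = slice(0, 3000), slice(3000, None)   # stand-in for the sub-basin split

    model = LogisticRegression(max_iter=1000).fit(X[train], y[train])
    p = model.predict_proba(X[valid])[:, 1]            # susceptibility in [0, 1]
    pcc = ((p > 0.5).astype(int) == y[valid]).mean() * 100
    rmse = np.sqrt(((p - y[valid]) ** 2).mean())
    print(f"correct: {pcc:.1f}%  RMSE: {rmse:.3f}")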

Keywords: landslide susceptibility, GIS, binary logistic regression, validation

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
ShibiaoBai2008accuracy.pdf777.42 KB

Visualising Uncertainty in Spatial Decision Support

Rachel O’Brien +
Institute for Land Water and Society, Charles Sturt University

Abstract. Uncertainty is an issue in environmental spatial decision support, as it is in most spatial modelling problems. When uncertainty is ignored in spatial modelling, issues can arise around the validity of decisions based on these models. This paper discusses sources of uncertainty in Spatial Decision Support Systems (SDSS) and introduces the SDSS CaNaSTA (Crop Niche Selection in Tropical Agriculture), based on Bayesian probability modelling. CaNaSTA focuses in particular on visualising uncertainty introduced through lack of data or knowledge. The SDSS incorporates some sources of uncertainty into the structure of the model itself, and provides tools to visualise other sources of uncertainty. Although CaNaSTA has been developed for use in agricultural decision-making, the model and tools used to handle and visualise uncertainty are applicable to all spatial decision tasks. This paper provides a case-study approach to acknowledging this uncertainty and ways of managing it in a spatial decision-making context.

Keywords: Spatial Decision Support Systems, CaNaSTA, visualising uncertainty

In: Wan, Y. et al. (eds) Proceeding of the 8th international symposium on spatial accuracy assessment in natural resources and environmental sciences, World Academic Union (Press).

AttachmentSize
O'Brien2008accuracy.pdf232.43 KB