Accuracy 2004 Conference


The Joint Proceedings of the Accuracy and Environmetrics 2004 conferences

Preface and table of contents

INVITED PRESENTATIONS

Brillinger, David R., University of California, USA "Wildfire chances and probabilistic assessment"

Calder, Catherine, Ohio State University, USA "Efficient posterior inference and prediction of space-time processes using dynamic process convolutions"

Fasso, Alessandro, University of Bergamo, Italy "Data quality and uncertainty in fine particulate monitoring"

Goovaerts, Pierre, Biomedware, Inc., USA "Modeling uncertainty about pollutant concentration and human exposure using geostatistics and a space-time information system: application to arsenic in groundwater of southeast Michigan"

Lowell, Kim, Université Laval, Canada "Estimating boundary existence and width from a single forest map"

Myers, Donald, University of Arizona, USA "Estimating and modeling space-time variograms"

Quintanilha, J.A., Universidade de São Paulo, Brazil "Wildfire threat count analysis by longitudinal models"

Scott, Marian, University of Glasgow, Scotland "Spatial scale and its effects on comparisons of airborne and ground-based gamma-ray spectrometry for mapping environmental radioactivity"
 

CONTRIBUTED PRESENTATIONS

NOTE: The papers listed in these proceedings have not been peer-reviewed. Please also read the website disclaimer.

 

Attachment: accuracy2004.jpg (57.41 KB)
Attachment: TitlePrefaceContents.pdf (197 KB)

A Bayesian Approach to Determining the “Paternity” of Environmental Contamination

Douglas E. Splitstone 1 and Michael E. Ginevan 2
1 SPLITSTONE & ASSOCIATES
4530 William Penn Hwy. #110
Murrysville, Pennsylvania 15668
Phone (724) 325-7421 Fax (724) 327-5958

2 Exponent
1730 Rhode Island Ave., NW
Washington, DC  20036
Phone (202) 441-6484  Fax (703) 834-2707

Abstract

Most evaluations of the contribution of different sources to environmental contamination begin with a set of measurements of chemical species at different locations. One then calculates either a variance-covariance or correlation matrix and performs some type of factor analysis. The multivariate patterns shown in the factor solution are then used to identify the contribution of different sources to contamination and to apportion the cleanup cost. The present discussion takes a different approach, which derives from the Bayesian analysis of genetic data used to determine the likely paternity of a child given data from the potential fathers, the mother, and the child. We demonstrate the use of two variants of this approach. One assumes independence of chemical contaminant production, while the other considers the specific multi-contaminant production pattern of each source. In both, we begin with a non-informative prior that assumes all sources are equally likely to have contributed to the site contamination and use the data to calculate a posterior probability that each source is responsible for the contamination.
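As a hedged illustration of the independence variant described in this abstract, the following sketch updates a uniform prior over candidate sources to a posterior proportional to the product of per-chemical likelihoods. The source names, per-chemical likelihoods, and function below are invented for the sketch, not taken from the paper:

```python
# Invented per-chemical likelihoods for three hypothetical candidate sources;
# under the independence variant the joint likelihood is a simple product.
from math import prod

def posterior_source_probs(likelihoods_per_source):
    """likelihoods_per_source: {source: [P(observed chemical c | source), ...]}."""
    joint = {s: prod(ps) for s, ps in likelihoods_per_source.items()}
    total = sum(joint.values())  # the uniform (non-informative) prior cancels out
    return {s: v / total for s, v in joint.items()}

probs = posterior_source_probs({
    "plant_A": [0.9, 0.8, 0.7],
    "plant_B": [0.4, 0.5, 0.6],
    "plant_C": [0.2, 0.3, 0.1],
})
```

With a non-informative prior the prior terms cancel, so only the likelihood products matter; the multi-contaminant variant would replace the independence product with a joint likelihood over the full contaminant profile.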

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.


Attachment: Splitstone2004accuracy.pdf (33.43 KB)

A Method to Distinguish Real Landscape Change from Map Error During Map Comparison

R. Gil Pontius Jr. and Christopher D. Lippitt
Clark University 
School of Geography
950 Main St. Box 1151
Worcester, MA 01610
Phone: (508) 769-4980
Fax: (508) 793-8881
Email: rpontius@clarku.edu

Abstract 
This paper describes a method to assess uncertainty in the measurement of change among categories of land cover between two points in time. Our method acknowledges that error is an inherent part of map production, so differences between maps can be due both to real landscape change and to map error. For example, if two producers create a map of the same location for the same time, we would expect disagreement between those two maps due to producer error. If we compare two maps of the same location from different times, we would expect disagreement for two reasons: 1) producer error, and 2) true landscape change between the former time and the latter time. Our method uses matrix algebra and the conventional error matrix to distinguish disagreement due to error from disagreement due to real landscape change. The technique is applicable to conventional accuracy assessment because it relies on standard probability theory. We illustrate the technique with maps of Anderson Level I categories from 1971 and 1999 in Central Massachusetts and find a dominant transition from Forest to Built over those three decades. The method also shows that a substantial portion of the difference in Water and Wetland is attributable to map error.
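A rough sketch of the underlying probability argument (not the authors' exact matrix algebra; the accuracy matrices below are hypothetical): if each map's producer error is summarised as P(mapped category | true category), then the cross-tabulation expected under zero landscape change is a mixture over the true categories, and its off-diagonal mass is the disagreement attributable to error alone:

```python
import numpy as np

def expected_crosstab_no_change(p_true, acc1, acc2):
    """p_true[t]: true category proportions; acck[t, m] = P(map k shows m | true t)."""
    return np.einsum("t,ti,tj->ij", p_true, acc1, acc2)

def expected_error_disagreement(p_true, acc1, acc2):
    """Off-diagonal mass of the expected cross-tabulation: disagreement from error alone."""
    e = expected_crosstab_no_change(p_true, acc1, acc2)
    return float(e.sum() - np.trace(e))
```

Any observed disagreement beyond this expected amount would then be a candidate for real landscape change; with two perfectly accurate producers (identity accuracy matrices) the expected disagreement is zero.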

Keywords: Accuracy, Change, Error, GIS, Uncertainty, Matrix

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

Attachment: Pontius2004accuracy.pdf (303.2 KB)

A space-time dynamic model based on image warping

S. Aberg, F. Lindgren and A. Malmberg
Centre for Mathematical Sciences, Division of Mathematical Statistics
Lund University
Box 118, SE-221 00 Lund, Sweden
Phone: +46 46 2227974; Fax: +46 46 2224623
E-mail: s_aberg@maths.lth.se

Abstract
In this paper we present a spatio-temporal dynamic model that can be realized using image warping, a technique used to match images. Image warping is a non-linear transformation that maps all positions in one image plane to positions in a second plane. Using thin-plate splines, this transformation is defined by a small set of matching points, a feature that makes the method work even for large data sets. In our case the dynamics of the process are described by warping transformations between consecutive images in the space-time series. Finding these transformations is a trade-off between a good match of the images and a smooth, physically plausible deformation, which leads to a penalized likelihood method for finding the transformation. The deformations that can be described with this approach include all affine transformations as well as a large class of non-linear deformations. Our method is applied to the problem of nowcasting radar precipitation.

Keywords: Dynamic models; Spatio-temporal modelling; Image warping; Thin-plate splines

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

Attachment: Aberg2004accuracy.pdf (111.83 KB)

Accounting for error propagation in the development of a leaf area index (LAI) reference map to assess the MODIS MOD15A2 LAI product

J.S. Iiames 1*, R. Congalton 2, A. Pilant 1, and T. Lewis

1 Environmental Protection Agency, Research Triangle Park, NC
2 University of New Hampshire, Durham, NH
* Corresponding author: iiames.john@epa.gov; (w) 919-541-3039; (fax) 919-541-9420

Abstract
The ability to effectively use remotely sensed data for environmental analysis depends on understanding the underlying procedures and the variances attributed to the data processing and image analysis techniques. Equally important is understanding the error associated with the reference data used to assess the accuracy of image products. This paper details the measurement variance accumulated in the development of a leaf area index (LAI) reference map used to assess the accuracy of the Moderate Resolution Imaging Spectroradiometer (MODIS) MOD15A2 LAI 1000-m product in the southeastern United States. MODIS LAI was compared with reference data derived from Landsat Enhanced Thematic Mapper (ETM+) during the 2002 field season in the Albemarle-Pamlico Basin in south-central Virginia. Ground-based optical LAI estimates, at scales ranging from 1 m to 100 m, were correlated with various ETM+-derived vegetation indices at a 30-m pixel resolution. These 30-m pixels were scaled up to the 1000-m MODIS LAI pixel resolution and averaged to give one LAI value. A detailed listing of error propagation for this reference data set includes uncertainty associated with: (1) the integrated optical LAI field estimation technique combining the Tracing Radiation and Architecture of Canopies (TRAC) instrument and hemispherical photography, (2) the classification of ETM+ into land-cover components, and (3) the scaling of this 30-m classified data up to the MODIS (1000-m x 1000-m) resolution.

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

Attachment: Iiames2004accuracy.pdf (236.1 KB)

Accuracy Assessment and Uncertainty in Baseline Projections for Land-Use Change Forestry Projects

Louis Paladino 1 and R Gil Pontius, Jr. 2
1 Research Scientist
ISciences LLC
685 Centre Street #207
Jamaica Plain, MA 02130
(617) 524-1115
loupaladino@yahoo.com
2Assistant Professor
Clark University, Graduate School of Geography
Department of International Development, Community and Environment
950 Main Street, Worcester MA 01610-1477, USA
508-793-7761
rpontius@clarku.edu

Abstract
This paper uses state-of-the-art validation techniques to estimate uncertainty in the prediction of future disturbance on a landscape. Interpreted satellite imagery from 1975 to 1992 was used to calibrate the land change model. Data from 1992 to 2000 was used to assess the goodness-of-fit of validation as measured by the statistic Kappa for Location (Klocation), which is a variant of the traditional Kappa index of agreement. Based on the goodness-of-fit in the year 2000, Klocation is extrapolated to predict the goodness-of-fit for the year 2026. The extrapolation of Klocation allows the scientist to predict the model's accuracy with regard to the location of future disturbance. Based on the extrapolated Klocation, the scientist can estimate the conditional probability that a location will be disturbed in the future, given that the model says it will be disturbed. For the validation year of 2000, Klocation is 0.22, which means that the model is 22% of the way between random and perfect in predicting the location of disturbed land versus undisturbed land. The predicted Klocation in the year 2026 is 0.008. Therefore, the estimated probability that a pixel will be disturbed in 2026, given that the model says it will be disturbed, is 1.8%. The probability that a pixel will be disturbed given that the model says it will be undisturbed is 1.0%. The results allow us to understand the uncertainty when using models for land-use change forestry project baseline estimates. In this example, the uncertainty is very high, which means that either models need to improve dramatically or carbon trading and Kyoto Protocol policy needs to be reevaluated.
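For context, the traditional Kappa index that Klocation modifies compares observed agreement with agreement expected by chance. A minimal sketch follows (standard Kappa only; the Klocation variant additionally conditions on the quantity of each category, which is not reproduced here):

```python
import numpy as np

def kappa(crosstab):
    """Traditional Kappa: (observed agreement - chance agreement) / (1 - chance)."""
    m = np.asarray(crosstab, float)
    m = m / m.sum()                              # convert counts to proportions
    po = float(np.trace(m))                      # observed proportion agreement
    pe = float(m.sum(axis=0) @ m.sum(axis=1))    # agreement expected by chance
    return (po - pe) / (1 - pe)
```

A value of 1 means perfect agreement and 0 means no better than chance, matching the abstract's reading of Klocation = 0.22 as "22% of the way between random and perfect".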

Keywords: carbon, Kappa, land use/land cover change, model prediction, validation.

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

Attachment: Paladino2004accuracy.pdf (793.81 KB)

An Iterative Uncertainty Assessment Technique for Environmental Modeling

D.W. Engel, A.M. Liebetrau, K.D. Jarman, T.A. Ferryman, T.D. Scheibe, and B.T. Didier
Pacific Northwest National Laboratory, P.O. Box 999, Richland, Washington 99352
Telephone (509) 375-2307; e-mail dave.engel@pnl.gov

Abstract
The reliability of and confidence in predictions from  model simulations are crucial—these predictions can significantly affect risk assessment decisions.  For example, the fate of contaminants at the U.S. Department of Energy’s Hanford Site has critical impacts on long-term waste management strategies.  In the uncertainty estimation efforts for the Hanford Site-Wide Groundwater Modeling program, computational issues severely constrain both the number of uncertain parameters that can be considered and the degree of realism that can be included in the models.  Substantial improvements in the overall efficiency of uncertainty analysis are needed to fully explore and quantify significant sources of uncertainty.  We have combined state-of-the-art statistical and mathematical techniques in a unique iterative, limited sampling approach to efficiently quantify both local and global prediction uncertainties resulting from model input uncertainties.  The approach is designed for application to widely diverse problems across multiple scientific domains.  Results are presented for both an analytical model where the response surface is “known” and a simplified contaminant fate transport and groundwater flow model.  The results show that our iterative method for approximating a response surface (for subsequent calculation of uncertainty estimates) of specified precision requires less computing time than traditional approaches based upon noniterative sampling methods.

Keywords: iterative, uncertainty, risk, groundwater, sampling

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

 

Attachment: Engel2004accuracy.pdf (266.49 KB)

An Object-based Method for Quantifying Errors in Meteorological Models

C. A. Davis, B. G. Brown, R. Bullock, M. Chapman, K. Manning, and A. Takacs
National Center for Atmospheric Research
P.O. Box 3000
Boulder, CO 80307 USA

Abstract
Measures-based characterizations of errors in forecasts fail to provide useful information as increasingly complex spatial structures become evident in numerical weather forecasts. The perceived utility of such forecasts often lies in their ability to predict localized and episodic events. Subtle forecast timing and location errors yield low skill scores by traditional measures because the phenomena of interest (e.g. precipitation, turbulence, icing) contain large spatial gradients. Yet, the occurrence of such features in forecasts can provide forecasters with important clues to the possible occurrence of important weather events. An alternative verification strategy is to decompose numerical forecasts into objects whose position and attributes can be objectively compared between models and observations. We describe a recently developed method for defining rain areas for the purpose of verifying precipitation produced by numerical weather prediction models. Objects are defined in both forecasts and observations based on a convolution (smoothing) and thresholding procedure. Nearby objects are merged according to a rule set involving separation distance and orientation. Objects in the two datasets are matched and a statistical analysis of matched pairs is performed. In addition, the raw rainfall values within each object are retained and the distribution of intensities is analyzed as another object attribute. Extension of this method to a wide variety of environmental forecast verification problems is also discussed. This approach allows us to more appropriately assess the spatial accuracy of these types of forecasts than previously used methodologies.

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

Attachment: Davis2004accuracy.pdf (313.02 KB)

Assessing the Spatial Uncertainty of Boundaries on Forest Maps Using an Ecological Similarity Index

Maria-Gabriela Orzanco 1, Kim Lowell 2 and Marie-Josée Fortin 3
1 Centre de recherche en géomatique
Université Laval, Pavillon Casault
Québec (Québec) - G1K 7P4 - CANADA 
(418) 656 2131 ext. 12677
Maria-Gabriela.Orzanco@scg.ulaval.ca
2 Centre de recherche en géomatique
Université Laval, Pavillon Casault 
Québec (Québec) - G1K 7P4 - CANADA
Kim.Lowell@scg.ulaval.ca 
3Landscape Ecology Laboratory
Department of Zoology
University of Toronto
Toronto (Ontario) - M5S 3G5 – CANADA
mjfortin@zoo.utoronto.ca

Abstract
Forest stand boundaries are usually represented on vegetation maps as fine lines all having the same width. Thus spatial accuracy and precision are considered to be uniform over an entire map. However, real forest boundaries, i.e., those that can be observed on the ground, differ in degree of definition and contrast between adjacent stands. This study examines the association between the ecological contrast and spatial context of forest boundaries and the uncertainty associated with their spatial reliability, i.e., location and probability of true existence. In this study, the contrast between adjacent forest stands was quantified via an ecological similarity index (ESI). The attributes used to estimate the value of this index for each mapped boundary were species composition, density class, height class and age class. In addition, the context, or neighbourhood, of each boundary was quantified by using the contrast values and their locations to calculate a local autocorrelation index (local Moran's I). The relationships among the context, the contrast and the existence probability (consistency) were explored using discriminant analysis. (Consistency was measured by overlaying three multi-temporal forest maps and analyzing the result.) It was found that consistency was most closely related to the contrast between the forest types present on either side of a forest stand boundary. Context was found to have no relationship to stand boundary consistency.
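The local autocorrelation index mentioned above has a standard form. Here is a minimal sketch of local Moran's I under the usual definition; the toy attribute values and row-standardised weights are invented, and the paper's actual neighbourhood weighting may differ:

```python
import numpy as np

def local_morans_i(x, w):
    """x: attribute values per boundary; w: row-standardised spatial weights matrix."""
    x = np.asarray(x, float)
    z = x - x.mean()                      # deviations from the mean
    s2 = (z @ z) / len(x)                 # variance (population form)
    return (z / s2) * (np.asarray(w, float) @ z)
```

Positive values flag boundaries whose contrast resembles that of their neighbours (clustering); negative values flag boundaries that stand out from their surroundings.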

Keywords: spatial uncertainty, forest boundary, ecological contrast and context

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

Attachment: Orzanco2004accuracy.pdf (550.29 KB)

Categorical Coefficients of Agreement for Assessing Soft-Classified Maps at Multiple Resolutions

Kristopher Kuzera and R. Gil Pontius, Jr.
Department of International Development, Community and Environment
Department of Geography
Clark University
950 Main Street, Worcester, MA 01610 USA
E-mail: kkuzera@clarku.edu, rpontius@clarku.edu

Abstract
An important issue in map comparison is examining the agreement of pixels between two categorical maps. Information in the maps becomes less precise spatially as the resolution of the maps changes from fine to coarse. This paper consists of two major components addressing this issue. First, cross-tabulation matrices are produced for multiple resolutions using hard classification and three different soft pixel classification operators: Multiplication, Minimum, and Composite. Second, the cross-tabulation matrices are analyzed through various statistical measures to produce the following categorical coefficients of agreement: user's accuracy, producer's accuracy, conditional kappa by row, and conditional kappa by column. These statistical measures are graphed to demonstrate their behavior over multiple resolutions. Land-cover maps of the same subject area for two different years are compared to illustrate the analysis. The area examined is a part of Worcester County, Massachusetts, which experienced about 10% change in land cover between 1971 and 1999. The results show that over multiple resolutions the Hard operator behaves chaotically, the Multiplication operator decreases agreement, and the Minimum operator is difficult to interpret, while the Composite operator offers increasing agreement, is interpretable, and is recommended for a multiple resolution analysis.
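A hedged sketch of how the Multiplication and Minimum operators might score per-pixel agreement between two soft membership vectors (the Composite operator is not defined in the abstract and is omitted; the function below is an illustration, not the authors' implementation):

```python
import numpy as np

def agreement(a, b, op):
    """Per-pixel agreement between two soft membership vectors (each sums to 1)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    if op == "multiplication":
        return float(np.sum(a * b))             # chance both maps pick the same category
    if op == "minimum":
        return float(np.sum(np.minimum(a, b)))  # shared membership mass per category
    raise ValueError(f"unknown operator: {op}")
```

For hard (one-hot) pixels the two operators coincide; for genuinely mixed pixels the Minimum operator credits all shared mass while Multiplication discounts it, which is one reading of why their behaviors diverge across resolutions.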

Keywords: categorical coefficients of agreement, multiple resolutions, soft-classified maps

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

Attachment: Kuzera2004accuracy.pdf (330.56 KB)

Characterization of the Spatial and Parameter Variability in a Subtropical Wetland

S. Grunwald, K.R. Reddy and J.P. Prenger  
Soil and Water Science Department
University of Florida
2169 McCarty Hall, P.O. Box 110290
Gainesville, FL 32611-0290
Phone: 352-392-1951 ext. 204 ; Fax: 352-392-3902
Email: SGrunwald@ifas.ufl.edu

Abstract
The eutrophication of subtropical wetland ecosystems has profound effects on biogeochemical patterns, pedo- and biodiversity, and ecological function. We investigated a nutrient-enriched subtropical wetland which is undergoing natural succession. We collected 20 biogeochemical soil properties at 267 sites to characterize the current ecological status. This exhaustive dataset served as a reference. Our goal was to identify properties which accounted for much of the spatial and parameter variability. We used Conditional Sequential Gaussian Simulation (CSGS) to generate realizations of biogeochemical properties to characterize spatial patterns and to assess explicitly the uncertainty of predictions. We used Principal Component (PC) Analysis to transform a number of possibly correlated variables into a smaller number of uncorrelated variables. CSGS was used to generate realizations of PCs. Each biogeochemical property was mapped into a PC indicating its significance in explaining variability in the dataset. A spatial sensitivity analysis using subsets drawn randomly from the exhaustive dataset suggested that dynamic biogeochemical properties grouped into PC1 require sampling densities of > 3.75 sites/100 ha, whereas properties grouped into PC2 require a sampling density of > 2.52 sites/100 ha to reproduce the spatial patterns and variability across the wetland. Our results are valuable for documenting the current ecological spatial patterns in this wetland and for optimizing future sampling designs in subtropical wetlands while explicitly considering parameter and spatial variability.

Keywords: stochastic simulation, realizations, biogeochemical soil properties, principal component analysis, spatial patterns, spatial variability

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

 

Attachment: Grunwald2004accuracy.pdf (140.01 KB)

Comparison of Methods for Normalisation and Trend Testing of Water Quality Data

Claudia Libiseller
Department of Mathematics, Division of Statistics
Linköping University
SE-58183 Linköping, Sweden
E-mail: cllib@mai.liu.se

Abstract
To correctly assess trends in water quality data, influencing variables such as discharge or temperature must be taken into account. This can be done by using (i) one-step procedures like the Partial Mann-Kendall (PMK) test or multiple regression, or (ii) two-step techniques that include a normalisation followed by a trend test on the residuals. Which approach is most appropriate depends strongly on the relationship between the response variable under consideration and the influencing variables. For example, PMK tests can be superior if there are long and varying time lags in the water quality response. Two-step procedures are particularly useful when the shape of the temporal trend is the primary interest, but they can be misleading if one of the influencing variables itself exhibits a trend or long-term tendency. The present study discusses the advantages and disadvantages of some trend testing techniques, using Swedish water quality data to illustrate the properties of the methods.
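A minimal sketch of the two-step idea described above, assuming a simple linear normalisation followed by the Mann-Kendall S statistic on the residuals (an illustration only; the PMK test and the paper's normalisation methods are more elaborate):

```python
import numpy as np

def mann_kendall_s(x):
    """Mann-Kendall S statistic: sum of signs of all pairwise forward differences."""
    x = np.asarray(x, float)
    n = len(x)
    return int(sum(np.sign(x[j] - x[i]) for i in range(n) for j in range(i + 1, n)))

def two_step_trend(y, covariate):
    """Step 1: remove a linear covariate effect; step 2: test the residuals for trend."""
    X = np.column_stack([np.ones(len(covariate)), np.asarray(covariate, float)])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)
    residuals = np.asarray(y, float) - X @ beta
    return mann_kendall_s(residuals)
```

A large positive (or negative) S on the residuals suggests a monotone trend beyond what the covariate explains; as the abstract warns, this two-step route can mislead when the covariate itself carries a trend.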

Keywords: long-term changes in covariate, non-monotonic changes, long memory effects, seasonal variation

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

Attachment: Libiseller2004accuracy.pdf (304.31 KB)

Complex Systems Analysis using Space-Time Information Systems and Model Transition Sensitivity Analysis

G. M. Jacquez, G. AvRuskin, E. Do, H. Durbeck, D. A. Greiling, P. Goovaerts,  A. Kaufmann, and B. Rommel 
BioMedware Inc.
516 North State Street
Ann Arbor, MI 48104-1236
Ph. 734.913.1098; Fax 734.913.2201

Abstract
Real-world systems are dynamic, complex and geographic, yet many mathematical modeling tools do not evaluate the sensitivity of results to underlying assumptions, and GIS do not adequately represent time. This presentation describes two new approaches: Space-Time Information Systems (STIS) and Model Transition Sensitivity Analysis (MTSA). Current GIS are based on spatial data models that inadequately characterize the temporal dimension needed for effective representation of complex systems. They do not deal readily with space-time georeferencing or space-time queries, and are best suited to “snapshots” of static systems. These deficiencies prompted many geographers to call for a “higher-dimensional GIS” (a STIS) to better represent space-time dynamics. When formulating models of complex systems, critical choices are made regarding model type and complexity. Model type is the mathematical approach employed, for example, a deterministic model versus a stochastic model. Model complexity is determined by the amount of abstraction and simplification employed during model construction. A growing body of work demonstrates that the choice of model type and complexity has substantial impacts on simulation results and on model-based decisions. This paper briefly describes STIS and MTSA approaches that allow researchers to more effectively represent complex systems and to evaluate the sensitivity of their results to underlying assumptions.

Keywords: space time information system; model transition sensitivity analysis; complex systems

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

Attachment: Jacquez2004accuracy.pdf (624.25 KB)

Continuous-domain model-assisted variance estimation

 Cynthia Cooper 
Oregon State University
Department of Statistics
44 Kidder Hall
Corvallis OR 97330
Ph. 541-737-5936
E-mail:  cooper@stat.orst.edu

Abstract
Model-based and design-based approaches to estimation differ in how the variance of an estimator is quantified. Design-based variance estimators account for covariance by incorporating within-cluster variance as used in multi-stage sampling. The estimators are unbiased, intuitive and free of model assumptions, but require subsamples at all levels of the sample design, a problem in some stratified and systematic samples. This potential inadequacy of data is analogous to small-area estimation scenarios. Model-based quantification of mean square error, which combines the variance of the underlying process and squared bias, will not necessarily be representative of variation due to the sampling process. In this paper, the application of a model-assisted approach to quantifying variance due to the sampling process is explored. The concept of employing a model of covariance is illustrated for two scenarios: a model-based scenario in which the objective is to quantify the precision of an estimate afforded by the sampling process, and a design-based scenario in which inadequate subsampling precludes quantifying within-cluster variance by direct estimation. Application of a parametric model to quantify variance due to sampling is demonstrated to be more representative of the empirical variance observed in simulations of repeated samples on a fixed realization of a random process.

Keywords: model-based sampling/estimation, design-based sampling/estimation, model-assisted variance estimation, kriging variance, continuous-domain sampling, spatial covariance, non-exchangeable response

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

Attachment: Cooper2004accuracy.pdf (69.25 KB)

Data quality and uncertainty in fine particulate monitoring

A. Fassò 1, O. Nicolis 1, V. Gianelle 2
1 Department IGI, University of Bergamo
Via Marconi 5, 24044 Dalmine BG, Italy
tel. +39 035 2052 323, fax. +39 035 562 779
e-mail: alessandro.fasso@unibg.it
2 ARPA Lombardia, Milan, Italy

Abstract
In order to assess compliance with air quality standards, European regulations prescribe monitoring the concentration of particulate matter and controlling both annual and daily averages. The measurement accuracy varies according to monitor type, temperature and pollution level, often in a complex nonlinear manner. Consequently, comparisons, the interpretation of threshold exceedances, and compliance assessment are often difficult. In this paper, we consider the displaced dynamical calibration (DDC) model, which is able to calibrate biased readings by using displaced data obtained from reference instruments. Moreover, we discuss the uncertainty of annual averages of daily concentrations. An application to the Northern Italy air quality network allows us to draw some empirical conclusions.

Keywords: fine particulate matters, TEOM, gravimetric, calibration, state-space modelling, quality standard assessment

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

Attachment: Fasso2004accuracy.pdf (62.95 KB)

Detecting the Influence of Protection on Landscape Transformation in Southwestern Ghana

Clement Aga Alo and R Gil Pontius Jr
Department of International Development, Community and Environment
Graduate School of Geography
Clark University
950 Main Street, Worcester MA 01610-1477, USA
Ph. +1 508 798 0731; Fax +1 508 793 8820
Email: calo@clarku.edu, rpontius@clarku.edu

Abstract
This paper examines the transitions among six land cover categories in southwestern Ghana and compares the transitions within protected areas to those outside protected areas. Landsat Thematic Mapper (TM) satellite images of 1990 and 2000 are used to create two land cover classifications, and then the two maps are compared to produce cross-tabulation matrices for both the protected and unprotected areas. These matrices are analyzed according to their various components to identify the most systematic landscape transitions. It is necessary to consider the amount of gain in each category separately from the amount of loss of each category between 1990 and 2000. The amount of gain of a category is measured relative to the distribution of the other categories in 1990 in order to compute the amount of gain that would be expected in each category due to a random process of gain. The expected gain is then compared to the observed gain to detect systematic transitions. In a manner analogous to the analysis of gains, the amount of loss of a category is measured relative to the distribution of the other categories in 2000, and then the observed loss is compared to the expected loss due to a random process of loss. A non-random gain and a non-random loss in a particular transition imply a systematic process of change. The results show that, in the protected areas, Closed Forest transitions systematically to Bare Ground, but outside the protected areas Closed Forest transitions systematically to Bush & Scattered Trees. Evidently, the process of land transformation inside the protected areas differs from that outside them. The research highlights the need for practical application of this methodological approach to landscape change identification in Ghana. Identifying the strong signals of forest transformation is particularly important in light of efforts by policy makers to halt, or at least slow, deforestation in Ghana.
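The expected-gain computation described above can be sketched as follows. This is a hedged reading of the random-gain idea, not necessarily the authors' exact formula: the observed gain of each category is distributed over the other categories in proportion to their 1990 sizes, and systematic transitions are those where the observed off-diagonal entry exceeds this expectation:

```python
import numpy as np

def expected_gain_matrix(crosstab):
    """crosstab[i, j]: proportion of area in category i in 1990 and j in 2000."""
    ct = np.asarray(crosstab, float)
    n = ct.shape[0]
    t1 = ct.sum(axis=1)                    # category proportions in 1990
    gain = ct.sum(axis=0) - np.diag(ct)    # observed gain of each category
    exp_gain = np.zeros((n, n))
    for j in range(n):
        others = [i for i in range(n) if i != j]
        w = t1[others] / t1[others].sum()  # 1990 distribution of the other categories
        exp_gain[others, j] = gain[j] * w  # gain of j spread randomly over them
    return exp_gain
```

Comparing `exp_gain` with the off-diagonal entries of the observed cross-tabulation flags transitions that are larger than a random process of gain would produce; the analysis of losses is analogous with the 2000 distribution.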

Keywords: deforestation, Ghana, matrix, land, protection

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Alo2004accuracy.pdf950.7 KB

Detection of Boundaries in Regression Data in the Presence of Spatial Correlation

L. Xie and I. B. MacNeill

 

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Xi2004accuracy.pdf345.65 KB

Effect of Support Size on the Accuracy of Spatial Models: Findings of Rockfall Simulations on Forested Slopes

 
Luuk K.A. Dorren 1, Gerard B.M. Heuvelink 2 and Frédéric Berger 1
 1 Cemagref Grenoble 2, rue de la Papeterie, B.P. 76, 38402, Saint Martin d’Hères, France
Tel: +33 4 76762806; Fax: +33 4 76513803
E-mail: luuk.dorren@cemagref.fr
2 Laboratory of Soil Science and Geology
Wageningen University
P.O. Box 37, 6700 AA Wageningen, The Netherlands 

Abstract 
The accuracy of model output generally increases as the support size of the input data decreases, due to the increase in detail. This paper examines whether this holds for various spatial models developed for simulating rockfall. We analyze the effect of the support size on the accuracy of a set of models and their parameters. Both calibration and validation data were obtained from real-size rockfall experiments in France, where high-speed video cameras recorded the trajectories and velocities of more than 200 individual falling rocks with diameters between 0.8 and 1.5 meters. A second validation set was obtained in the Austrian Alps, where we mapped rockfall impacts on trees to obtain the spatial distribution of rockfall impacts throughout the study site. These observed data are thoroughly compared with the output of a rockfall simulation model. One of the main findings is that a larger support size can be a more important cause of model error than poor data quality.

Keywords: support, accuracy, rockfall simulation, spatial model

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Dorren2004accuracy.pdf630.7 KB

Efficient Posterior Inference and Prediction of Space-Time Processes Using Dynamic Process Convolutions

Catherine A. Calder
Department of Statistics
The Ohio State University
1958 Neil Avenue
Columbus, OH 43221
Ph. 614-688-0004; Fax 614-292-2096
E-mail: calder@stat.ohio-state.edu

Abstract
Bayesian dynamic process convolution models provide an appealing approach for modeling both univariate and multivariate spatial temporal data. Their structure can be exploited to significantly reduce the dimensionality of a complex spatial temporal process. This results in efficient Markov chain Monte Carlo (MCMC) algorithms required for full Bayesian inference. In addition, the dynamic process convolution framework readily handles both missing data and misaligned multivariate space-time data without the need for imputation. We review the dynamic process convolution framework and discuss these and other computational advantages of the approach. We present an application involving the modeling of air pollutants to demonstrate how this approach can be used to effectively model a space-time process and provide predictions along with corresponding uncertainty statements.
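The dimension-reduction idea behind process convolutions can be sketched in a few lines. This is a generic 1-D, Gaussian-kernel illustration, not the paper's specification; the knot layout, kernel, bandwidth, and latent values are all assumptions:

```python
import numpy as np

def process_convolution(sites, knots, latent, bandwidth):
    """Field value at each site = Gaussian-kernel-weighted sum of latent
    values attached to a coarse grid of knots. Dimension reduction comes
    from len(knots) << len(sites); in the dynamic version the latent
    values evolve over time (e.g. as a random walk)."""
    d = sites[:, None] - knots[None, :]
    K = np.exp(-0.5 * (d / bandwidth) ** 2)   # site-by-knot kernel weights
    return K @ latent

sites = np.linspace(0.0, 10.0, 101)   # fine prediction grid
knots = np.linspace(0.0, 10.0, 6)     # coarse latent grid
latent = np.array([0.0, 1.0, -1.0, 2.0, 0.5, 0.0])
field = process_convolution(sites, knots, latent, bandwidth=1.5)
```

Inference then targets only the six latent values rather than the 101-dimensional field, which is what makes the MCMC efficient.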

Keywords: Bayesian, factor analysis, dimension reduction, air pollution

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Calder2004accuracy.pdf115.68 KB

Estimating and Modeling Space-Time Variograms

Donald E. Myers
Department of Mathematics
University of Arizona
Tucson, AZ 85721 USA
Ph. 520 621 6859, Fax 520 621 6859
E-mail: myers@math.arizona.edu

Abstract
As with a spatial variogram or spatial covariance, a principal purpose of estimating and modeling a space-time variogram is to quantify the spatial-temporal dependence reflected in a data set. The resulting model might then be used for spatial interpolation and/or temporal prediction, which might take several forms, e.g., kriging and radial basis functions. There are significant problems to overcome in both the estimation and the modeling stages for space-time problems, unlike the purely spatial application where estimation is the more difficult step. The key point is that a spatial-temporal variogram as a function must be conditionally negative definite (not just semi-definite), which can be a difficult condition to verify in specific cases. In the purely spatial context one relies on a known list of valid isotropic models, e.g., the Matern class as well as the exponential and gaussian models, as well as on positive linear combinations of known valid models. Bochner's Theorem (or the extension given for generalized covariances by Matheron) characterizes non-negative definite functions but does not easily distinguish the strictly positive definite functions. Geometric anisotropies can be incorporated via an affine transformation, and space-time might be viewed as simply a higher dimensional space, though possibly with an anisotropy in the model. This approach implies that there is an appropriate and natural choice of a norm (or metric) on space-time analogous to the usual Euclidean norm for space. The most obvious way to construct a model for space-time is to “separate” the dependence on space and on time. This is not new, and in fact a similar problem can occur in spatial applications, i.e., a zonal anisotropy. Early attempts used either the sum of two covariances or the sum of two variograms, in either case one component being defined on space and the other on time.
It is easily shown that this leads to semi-definite models, and hence if used in kriging equations the result may be a non-invertible coefficient matrix. It is also easy to see that the product of two variograms (even on the same domain) can violate the growth condition. However, it is well known that the product of two strictly positive definite functions is again strictly positive definite. In fact, a gaussian covariance model might be viewed as a product (of several gaussian models, each defined on a lower dimensional space). Likewise one form of the exponential covariance, often used in hydrology applications, is also a product. When converted to variogram form, there is not only a product (with a negative sign) but also a sum. It turns out that the variogram form is more convenient in the estimation stage. The simple product covariance is somewhat too limiting, however: each component effectively must have the same “sill”. An obvious extension is the product-sum model, which when converted to variogram form is the same as for the product (but with different coefficients). This can be further generalized to an integrated product-sum model. In the estimation stage there are two separate problems: one is to determine the appropriate model type and the other is to estimate the model parameters. In a typical spatial application the list of possible models is usually kept small, and hence the primary emphasis is on estimating the model parameters. In the spatial-temporal context the list of possible models is likely to be much greater and model type selection more difficult. De Iaco, Myers and Posa have shown that the use of marginal variograms is one way to attack this problem, and have given an example of extending to the integrated product-sum model as well as to the multivariate case using a Linear Coregionalization Model.
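As a concrete sketch of the product-sum construction discussed above, the variogram form built from two exponential marginals might look as follows. The parameter values are illustrative, and validity requires 0 < k ≤ 1/max(sill_s, sill_t):

```python
import numpy as np

def exp_variogram(lag, sill, practical_range):
    """Exponential variogram reaching ~95% of the sill at the practical range."""
    return sill * (1.0 - np.exp(-3.0 * np.abs(lag) / practical_range))

def product_sum_variogram(h, u, k, sill_s=1.0, rng_s=10.0,
                          sill_t=1.0, rng_t=5.0):
    """Product-sum space-time variogram built from the two marginals:
    gamma(h, u) = gamma_s(h) + gamma_t(u) - k * gamma_s(h) * gamma_t(u)."""
    gs = exp_variogram(h, sill_s, rng_s)   # spatial marginal gamma(h, 0)
    gt = exp_variogram(u, sill_t, rng_t)   # temporal marginal gamma(0, u)
    return gs + gt - k * gs * gt
```

Setting k = 0 recovers the pure-sum model whose semi-definiteness problems the abstract warns about; the marginals γ(h, 0) and γ(0, u) are exactly what the estimation strategy attributed to De Iaco, Myers and Posa fits from the data.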

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Myers2004accuracy.pdf102.49 KB

Evaluating Classified MODIS Satellite Imagery as a Stratification Tool

Greg C. Liknes, Mark D. Nelson, and Ronald E. McRoberts
USDA Forest Service
1992 Folwell Ave
St. Paul, Minnesota, United States of America  55108
Ph. +1 651 6495192 Fax. +1 651 6495285
E-mail:  gliknes@fs.fed.us

Abstract
The Forest Inventory and Analysis (FIA) program of the USDA Forest Service collects forest attribute data on permanent plots arranged on a hexagonal network across all 50 states and Puerto Rico. Due to budget constraints, sample sizes sufficient to satisfy national FIA precision standards are seldom achieved for most inventory variables unless the estimation process is enhanced with ancillary data. When used to create strata for stratified estimation, satellite imagery can be effective ancillary data. The National Land Cover Dataset (NLCD), a land cover classification based on satellite imagery, has been used to produce substantial increases in the precision of statewide forest inventory estimates. Because inventories are conducted on an annual basis, it is desirable to create strata using a product that is updated more frequently than the 10-year update cycle of the NLCD. In particular, data from the MODIS sensor are available every 1-2 days, although at a much coarser spatial resolution than the Landsat data used in the creation of the NLCD (250-1000 m vs. 30 m). In this study, the effectiveness of strata created by classifying MODIS satellite imagery is compared to that of strata created from the NLCD. Results indicate that precision decreases by 0.9 percent per million acres when using a 1-km dataset versus a 30-m dataset.
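The stratified estimator underlying this comparison can be sketched as follows. The stratum weights and plot values are hypothetical, with weights taken as area proportions from the classified imagery:

```python
import numpy as np

def stratified_estimate(strata):
    """strata: list of (weight, samples) pairs, weights summing to 1.
    Returns the stratified mean and its standard error; a smaller SE than
    simple random sampling is the payoff of well-chosen strata."""
    mean = sum(w * np.mean(y) for w, y in strata)
    var = sum(w ** 2 * np.var(y, ddof=1) / len(y) for w, y in strata)
    return mean, np.sqrt(var)

# Hypothetical example: forest/nonforest strata from a classified image,
# with percent-canopy-cover observations on the plots in each stratum.
strata = [(0.6, np.array([80.0, 90.0, 85.0, 95.0])),   # forest plots
          (0.4, np.array([5.0, 10.0, 0.0, 5.0]))]      # nonforest plots
mean, se = stratified_estimate(strata)
```

Coarser imagery (MODIS vs. Landsat) produces noisier strata, i.e. more within-stratum variance, which is exactly what drives the precision loss the abstract reports.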

Keywords: Stratified estimation, MODIS, spatial resolution

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Liknes2004accuracy.pdf208.79 KB

Evaluating Patterns of Spatial Relations to Enhance Data Quality

David Gadish, Ph.D.
Assistant Professor of Information Systems
California State University Los Angeles
5151 State University Drive
Los Angeles, CA, 90032
Tel: 323-343-2924
Email: dgadish@calstatela.edu

Abstract  

Effective use of data stored in spatial databases requires methods for evaluation and enhancement of the quality of the data. Spatial data quality can be evaluated using a measure of internal validity, or consistency, of a data set. Capturing spatial data consistency is possible with a multi-step approach. A distance measure is used to detect implicit spatial relations between neighboring objects. The next step involves identifying the types of relations between these neighboring objects using topology-based constraints. The semantic information of objects, together with topological relations, is combined to discover patterns, or rules, in the data. These rules are based on the analysis of the relations between each object and each of its neighbors, as well as between each object and all of its neighbors. Patterns of spatial relations, represented as rules, are validated using available metadata, as well as trend analysis and Monte Carlo simulation techniques. These can now be used as the basis for automated detection of inconsistencies among spatial objects, where possible inconsistencies are detected when one or more rules are violated. Detected inconsistencies can then be adjusted, thus increasing the quality of spatial data sets.

Keywords: Consistency, Patterns, Spatial Relations 

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

 

AttachmentSize
Gadish2004accuracy.pdf75.74 KB

Extracting the Essence of Process-Based Models of the Flow of Nitrogen through Catchments

K. Wahlin 1, D. Shahsavani 1, A. Grimvall 1, A. J. Wade 2, D. Butterfield 2, and H. P. Jarvie 3
1 Department of Mathematics
Linköping University, SE-58183 Linköping, Sweden
Ph. +46 13 281000; Fax +46 13 100746
E-mail: kawah@mai.liu.se
2 Aquatic Environments Research Centre
School of Human and Environmental Sciences,
The University of Reading, UK, RG6 6AB
3 Centre for Ecology and Hydrology
Wallingford, UK, OX10 8BB

Abstract
The dynamic properties of process-oriented models of the flow of nitrogen through catchments can be very complex. We introduced several types of ensemble runs that can provide informative summaries of meteorologically normalised model outputs and also clarify the extent to which such outputs are related to various model parameters. Specifically, we showed how to assess travel times for nitrogen and water in the unsaturated and saturated zones. Studies of the catchment model INCA-N revealed that the temporal distribution of the modelled water quality  response to changes in fertiliser application was determined primarily by the hydromechanical parameters of the model, whereas the magnitude of the total intervention effect was influenced mainly by the parameters governing the turnover of nitrogen in soil. Furthermore, the travel times in this model were invariably shorter for nitrogen than for water due to preferential removal of the nitrogen exhibiting unusually long residence times.

Keywords: Model reduction; Ensemble runs; Catchment; Nitrogen; Retention.

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Wahlin2004accuracy.pdf55.31 KB

Fast Cross Validation of a Paleoclimate Model Using IRMCMC

 
S. Bhattacharya and J. Haslett
 

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Bhattacharya2004accuracy.pdf439.69 KB

Indexing Structure For Handling Uncertain Spatial Data

Bir Bhanu, Rui Li, Chinya Ravishankar, Michael Kurth and Jinfeng Ni
Center for Research in Intelligent Systems 
University of California, Riverside, 92521
{bhanu, rli}@vislab.ucr.edu; {ravi, kurthm, jni}@cs.ucr.edu

Abstract
Consideration of uncertainty in the manipulation and management of spatial data is important. Unlike traditional fuzzy approaches, in this paper we use a probability-based method to model and index uncertain data in the application of Mojave Desert endangered species protection. The query is a feature vector describing the habitat for a certain species, and we are interested in finding geographic locations suitable for that species. We select appropriate layers of the geo-spatial data affecting species life, called habitat features, and model the uncertainty for each feature as a probability density function (PDF). We partition the geographic area into grids, assign an uncertain feature vector to each cell, and develop a filter-and-refine indexing method. The filter part is a bottom-up binary tree based on the automated clustering result obtained using the EM algorithm. The refine part processes the filtered results based on the “similarity” between the query and the properties of each cell. We compare the performance of our proposed indexing structure with the intersection method from the Mojave Desert Ecosystem Program (MDEP); our method is more selective and efficient.

Keywords: geographic information system, uncertainty, probability density function, optimized Gaussian mixture hierarchy, mixture model, R-tree, Mojave desert, desert tortoise

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

 

AttachmentSize
Bhanu2004accuracy.pdf186.59 KB

Method of Evaluation by Order Theory (METEOR) applied on the Topic of Water Contamination with Pharmaceuticals

K. Voigt 1 and R. Brüggemann 2 
1 GSF - National Research Center for Environment and Health, Institute of Biomathematics and Biometry, 85764 Neuherberg, Germany, kvoigt@gsf.de
2 Leibniz-Institute of Freshwater Ecology and Inland Fisheries, 12587 Berlin, Germany, brg@igb-berlin.de

Abstract
A growing concern to be tackled today is the potential contamination of water resources by chemicals, including pharmaceuticals entering food chains and causing adverse effects in animals and humans. In this paper 75 publications from the years 2000-2004 are analysed concerning their information on 12 chosen pharmaceuticals. These pharmaceuticals are recognised as potential pollutants of water and soil. A 12x75 data matrix is the basis for a mathematical evaluation (ranking) approach using the Method of Evaluation by Order Theory (METEOR). The original data matrix of 12 objects and 75 attributes is subject to several logical aggregation steps applying the Hasse Diagram Technique. It can be demonstrated that the aggregation into three super-indicators corresponding to different kinds of journals leads to a highly structured diagram with more comparabilities than incomparabilities. The aggregation steps, with different weights normalised to 1, lead to linear order diagrams. These diagrams all show the same ranking of the objects: EES>IBU>BEZ>ROX>MET>FEN. All partial order diagrams and linear order diagrams demonstrate that the pharmaceutical EES (Estinyl Estradiol) is always in a maximal position, whereas FEN (Fenofibrate) is always a minimal object.
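The comparabilities and incomparabilities counted in a Hasse diagram reduce to attribute-wise dominance checks. A minimal sketch with invented attribute vectors (not the paper's data):

```python
def dominates(a, b):
    """a <= b in the product order iff b is at least as large in every attribute."""
    return all(x <= y for x, y in zip(a, b))

def comparable(a, b):
    """Two objects are comparable iff one dominates the other; otherwise
    they are incomparable, and the Hasse diagram branches at that point."""
    return dominates(a, b) or dominates(b, a)

# Toy example: three objects scored on three aggregated indicators.
obj = {"A": (3, 2, 5), "B": (4, 2, 6), "C": (1, 4, 2)}
```

Aggregating attributes into fewer super-indicators (as METEOR does) tends to turn incomparable pairs into comparable ones, which is why the aggregated diagrams in the abstract are more linearly ordered.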

Keywords: Partial order theory, Method of Evaluation  by Order Theory (METEOR), Hasse Diagram Technique, water pollution, drinking water, groundwater, pharmaceuticals

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Voigt2004accuracy.pdf250.98 KB

Minimizing Information Loss in Continuous Representations: A Fuzzy Classification Technique based on Principal Components Analysis

Barry J. Kronenfeld 1, Nathan D. Kronenfeld 2
1 Department of Geography, University at Buffalo
105 Wilkeson Quad, Amherst, NY 14261
Tel: (716) 645-2722    Fax: (716) 645-2329
E-mail: bjk3@buffalo.edu
2 Oakville, Ontario

Abstract
The increasing use of fuzzy classification methods to generalize environmental data has led to a persistent question of how to determine class membership values, as well as how to interpret these values once they have been determined.  This paper integrates the above two problems as complementary aspects of the same data reduction process.  Within this process, it is shown that a fuzzy classification technique based on Principal Components Analysis will minimize the amount of information lost through classification.  The PCA-based fuzzy classification technique is analogous to linear spectral unmixing models in remote sensing, and differs from algorithms such as fuzzy k-means in that primary attention is focused on preserving an accurate representation of the underlying attribute data, rather than maximizing the internal consistency of classes.  This focus on accuracy suggests PCA-based fuzzy classification as appropriate for data modeling applications.  However, further research is required to balance the goal of accuracy with the desire for simple (less fuzzy) representations.
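One way to read the linear-unmixing analogy is a least-squares solve of memberships against class centroids. The centroids and the sum-to-one augmentation below are illustrative assumptions, and no non-negativity constraint is imposed in this sketch:

```python
import numpy as np

def unmix_memberships(x, centroids):
    """Membership vector w minimizing ||x - w @ centroids|| with sum(w) = 1,
    the constraint enforced by augmenting the system with a row of ones."""
    A = np.vstack([centroids.T, np.ones(len(centroids))])
    b = np.append(x, 1.0)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

# Toy 2-attribute data with three class centroids (rows).
centroids = np.array([[0.0, 0.0],
                      [1.0, 0.0],
                      [0.0, 1.0]])
w = unmix_memberships(np.array([0.5, 0.5]), centroids)
```

Reconstructing x as w @ centroids and measuring the residual is one way to quantify the "information lost through classification" that the abstract aims to minimize.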

Keywords: Data reduction, fuzzy classification, principal components analysis, linear unmixing models

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Kronenfeld2004accuracy.pdf2.42 MB

Model testing for spatial strong mixing data

R. Ignaccolo 1 and N. Ribecco 2
1 Dipartimento di Statistica e Matematica Applicata
Università degli Studi di Torino, Italy
e-mail: ignaccolo@econ.unito.it
2 Dipartimento di Scienze Statistiche
Università degli Studi di Bari, Italy
e-mail: ribecco@dss.uniba.it

Abstract
In analysing the distribution of a variable in a space, each value is subject not only to the source of the phenomenon but also to its localisation. In this paper, we fit the model of the distribution, taking explicitly into account the spatial autocorrelation among the observed data. To this end we first suppose that the observations are generated by a strong-mixing random field. Then, after estimating the density of the considered variable, we construct a test statistic in order to verify the goodness of fit of the observed spatial data. The proposed class of tests is a generalization of the classical chi-square test and of the Neyman smooth test. The asymptotic behaviour of the test is analysed and some indications about its implementation are provided.

Keywords: goodness of fit; correlated data; spatial process; mixing random field.

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Ignaccolo2004accuracy.pdf206.21 KB

Modeling Uncertainty about Pollutant Concentration and Human Exposure using Geostatistics and a Space-time Information System: Application to Arsenic in Groundwater of Southeast Michigan

P. Goovaerts *, G. Avruskin *, J. Meliker **, M. Slotnick **, G. Jacquez *, J. Nriagu **
* Biomedware, Inc.
516 North State Street
Ann Arbor, MI 48104, USA
Ph.  (734) 913-1098; Fax: (734) 913-2201
E-mail: goovaerts@biomedware.com 
** School of Public Health
The University of Michigan
Ann Arbor, MI 48109-2029, USA

Abstract
The last decade has witnessed an increasing interest in assessing health risks caused by exposure to contaminants present in the soil, air, and water. A key component of any exposure study is a reliable model for the space-time distribution of pollutants. This paper compares the performances of multiGaussian and indicator kriging for modeling probabilistically the space-time distribution of arsenic concentrations in groundwater of Southeast Michigan, accounting for information collected at private residential wells and the hydrogeochemistry of the area. This model will later be combined with a space-time information system to compute individual exposures and analyze their relationship to the occurrence of bladder cancer. Because of the small changes in concentration observed in time, the study has focused on the spatial variability of arsenic, which can be considerable over very short distances. Factorial kriging was used to filter this short-range variability, leading to a significant increase (17 to 65%) in the proportion of variance explained by secondary information, such as type of surficial rock formation and proximity to the Marshall Sandstone subcrop. Cross validation of 8,212 well data shows that accounting for this regional background does not improve the local prediction of arsenic, indicating the presence of unexplained sources of variability and the importance of modeling the uncertainty attached to these predictions.

Keywords: indicator kriging, arsenic, cross validation, exposure

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Gooverts2004accuracy.pdf1.98 MB

Non Parametric Confidence Bands for β-Diversity Profiles

Tonio Di Battista and Stefano A. Gattone
Department of Quantitative Methods and Economic Theory
University ”G. d’Annunzio” of Chieti
Viale Pindaro, 42 I-65127 Pescara, Italy
E-mail: Dibattis@dmqte.unich.it

Abstract
The aim of this paper is to construct simultaneous confidence bands for the β-diversity profiles, which are a parametric family of diversity indices for a biological population. In this framework drawbacks arise when simultaneous inference has to be performed. Moreover, biologists rarely have large sample sizes at hand, so the derivation of an asymptotic sampling distribution may be impractical. We try to overcome these problems by building simultaneous confidence bands with a non-parametric bootstrap procedure.
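The bootstrap band construction can be sketched as follows. The Patil-Taillie-style profile and the max-deviation form of the simultaneous band are assumptions of this sketch, not necessarily the authors' exact definitions:

```python
import numpy as np

rng = np.random.default_rng(0)

def profile(p, betas):
    """Diversity profile over a grid of beta values (Patil-Taillie-style
    form, assumed here for illustration)."""
    return np.array([(1.0 - np.sum(p ** (b + 1.0))) / b for b in betas])

def bootstrap_band(counts, betas, B=1000, alpha=0.05):
    """Simultaneous band: resample species counts multinomially, then widen
    the plug-in profile by the (1 - alpha) quantile of the maximum absolute
    deviation across the whole beta grid."""
    n = counts.sum()
    p_hat = counts / n
    center = profile(p_hat, betas)
    boot = np.array([profile(rng.multinomial(n, p_hat) / n, betas)
                     for _ in range(B)])
    q = np.quantile(np.abs(boot - center).max(axis=1), 1.0 - alpha)
    return center - q, center + q

counts = np.array([30, 20, 10, 5])   # hypothetical species abundances
betas = np.array([0.5, 1.0, 2.0])
lower, upper = bootstrap_band(counts, betas)
```

Using the maximum deviation over the whole grid (rather than pointwise quantiles) is what makes the band simultaneous, the property the abstract emphasizes.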

Keywords: Biological population, Bootstrap, Diversity profile, Simultaneous confidence region.

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
DiBattista2004accuracy.pdf95.41 KB

Nonparametric spatial-temporal analysis of SO2 across Europe

Marco Giannitrapani, Adrian Bowman, Marian Scott, Ron Smith
Dept. of Statistics, University of Glasgow, G12 8QW Scotland, U.K.
Tel: +44(0)141 330 6118 - Fax: +44(0)141 330 4814
email: marco@stats.gla.ac.uk

Abstract
From the 1970s, a co-ordinated international programme monitoring acidifying air pollution was initiated in direct response to observed acidification. At the same time, several international protocols on the reduction of acidifying and eutrophying emissions (SO2, SO4, etc.) were also agreed. This work presents an evaluation of the observed spatial and temporal trends in SO2 in Europe for the last quarter of the twentieth century, on the basis of data from EMEP (Co-operative Programme for Monitoring and Evaluation of the Long Range Transmission of Air Pollutants in Europe). The policy question of interest is whether the protocols have resulted in a real improvement in environmental quality and a real change in the acidifying environment. In the first part of the work, we report on nonparametric modelling of the temporal trends, accounting for the effect of meteorological covariates using generalized additive models (GAMs). The second part of the work considers the spatial patterns in the SO2 field and their temporal evolution.

Keywords: additive model, smoother, backfitting algorithm, variogram

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

 

AttachmentSize
Giannitrapani2004accuracy.pdf1.06 MB

Optimizing METAR Network Design for Verification of Cloud Ceiling Height and Visibility Forecasts

Eric Gilleland
National Center for Atmospheric Research
Research Applications Program
Boulder, CO 80307-3000
Email: ericg@ucar.edu

Abstract
Methods are given and explored for thinning METAR stations in order to make more meaningful verification analyses of cloud ceiling and visibility forecasts for use within the general aviation community. Verification of these forecasts is performed based on data from surface METAR stations, which for some areas are densely located and for others only sparsely located. Forecasts, which are made over an entire grid, may be rewarded or penalized multiple times for a correct or incorrect forecast if there are many METAR stations situated closely together. A coverage design technique in conjunction with a percent-of-agreement analysis is used to find an “optimal” network design in order to better score forecasts over densely monitored regions. Preliminary results for a network of 104 monitors in the New England area suggest that the removal of some stations is appropriate.
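A greedy, distance-based variant of station thinning gives the flavor of the problem; the paper uses a coverage design criterion, so this sketch, including the separation threshold, is an assumption:

```python
import numpy as np

def thin_stations(coords, min_sep):
    """Greedily remove stations until no two kept stations are closer than
    `min_sep`. Returns indices of kept stations. A distance-based sketch;
    real coverage designs optimize a space-filling criterion instead."""
    keep = list(range(len(coords)))
    coords = np.asarray(coords, dtype=float)
    while True:
        pts = coords[keep]
        d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
        np.fill_diagonal(d, np.inf)
        i, j = np.unravel_index(np.argmin(d), d.shape)
        if d[i, j] >= min_sep:
            return keep
        del keep[max(i, j)]   # drop one member of the closest pair

# Two clustered stations plus two isolated ones (coordinates invented).
kept = thin_stations([(0, 0), (0.1, 0), (5, 5), (10, 0)], min_sep=1.0)
```

Dropping one member of each too-close pair removes exactly the double-counting the abstract describes, where clustered stations reward or penalize the same gridded forecast several times.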

Keywords: cloud ceiling height, visibility, verification, coverage design

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Gilleland2004accuracy.pdf267.98 KB

Powers of Anova Tests for Variables with General Distribution from Exponential Class

Zuzana Hrdlickova
Department of Applied Mathematics,
Faculty of Science, Masaryk University,
Janackovo nam. 2a, Brno, Czech Republic
Ph. +420 549 491 412; Fax +420 541 210 337
E-mail: zuzka@math.muni.cz

Abstract
The asymptotic powers of the tests used in generalized linear models for comparison of population means (ANOVA-type models), based on scaled deviance statistics or score statistics, are derived in this paper. An algorithm for calculating the asymptotic power of the above-mentioned tests is described. This algorithm was implemented in MATLAB and used to compare the asymptotic powers with the simulated powers of the considered tests for small sample sizes (for models with Poisson, gamma, binomial and negative binomial distributed response variables). Finally, the results are applied to environmental data for finding a suitable experimental design.

Keywords: power function, exponential class of distributions, generalized linear model, ANOVA type model

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Hrdlickova2004accuracy.pdf274.43 KB

Regression With Spatially Misaligned Data

Lisa Madsen and David Ruppert
Department of Statistics
Oregon State University
44 Kidder Hall, Corvallis OR 97331
E-mail: lmadsen@stat.orst.edu

Abstract
Suppose X(s) and e(s) are stationary spatially autocorrelated Gaussian processes and Y(s) = ß0 + ß1X(s) + e(s) for any location s. Our problem is to estimate the ß's, particularly ß1, when X and Y are not necessarily observed at the same locations. This situation may arise when the data are recorded by different agencies or when there are missing data values. A natural but naive approach is to predict ("krige") the missing X's at the locations where Y is observed, and then use least squares to estimate ("regress") the ß's as if these X's were actually observed. This krige-and-regress estimator is consistent, even when the spatial covariance parameters are estimated. If we use it as a starting value for a Newton-Raphson maximization of the likelihood, the resulting maximum likelihood estimator is asymptotically efficient. We can then use an information-based variance estimator for inference.
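The krige-and-regress starting value can be sketched in one dimension with a fixed exponential covariance. In the paper the covariance parameters are estimated; here they are fixed purely for illustration, and X and Y are taken colocated so that the kriging step is nearly exact:

```python
import numpy as np

def ordinary_krige(xs, zs, x0, sill=1.0, corr_range=2.0, nugget=1e-8):
    """Ordinary kriging prediction at x0 from observations (xs, zs),
    using an exponential covariance and a Lagrange multiplier for the
    unbiasedness (weights-sum-to-one) constraint."""
    def cov(d):
        return sill * np.exp(-np.abs(d) / corr_range)
    n = len(xs)
    A = np.empty((n + 1, n + 1))
    A[:n, :n] = cov(xs[:, None] - xs[None, :]) + nugget * np.eye(n)
    A[:n, n] = 1.0
    A[n, :n] = 1.0
    A[n, n] = 0.0
    rhs = np.append(cov(xs - x0), 1.0)
    w = np.linalg.solve(A, rhs)
    return w[:n] @ zs

def krige_and_regress(x_locs, x_vals, y_locs, y_vals):
    """Step 1: krige X at Y's locations. Step 2: OLS of Y on the kriged X."""
    x_hat = np.array([ordinary_krige(x_locs, x_vals, s) for s in y_locs])
    M = np.column_stack([np.ones_like(x_hat), x_hat])
    beta, *_ = np.linalg.lstsq(M, y_vals, rcond=None)
    return beta

# Synthetic check: a smooth X field and Y = 2 + 3 X, observed colocated.
x_locs = np.linspace(0.0, 10.0, 21)
x_vals = np.sin(x_locs)
beta = krige_and_regress(x_locs, x_vals, x_locs, 2.0 + 3.0 * x_vals)
```

With genuinely misaligned locations the kriged X's are smoothed versions of the truth, which is why the abstract treats this estimator as a consistent starting value rather than the final, efficient estimator.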

Keywords: Spatial regression, Maximum likelihood, Misaligned data

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Madsen2004accuracy.pdf289.05 KB

Satellite “Image-Banks” in Forest Information Systems? Prospects for Accurate Estimation of Forest Change and Growth from Data-Fusion Models.

Keith Rennolls
School of Computing and Mathematical Sciences,
University of Greenwich, London SE10 9LS, UK.
k.rennolls@greenwich.ac.uk

Abstract
Forest Information Services and Systems (FISs) are becoming increasingly numerous, largely because distributed systems have recently become much easier to develop within the context of the Internet and WWW. Such systems vary in both design and implementation. Some use remotely sensed imagery, and in particular satellite imagery, as their primary data source for land-use map construction. In general, on a rather longer time scale, such systems seem to discard their base imagery and offer maps and summary tables as their information products. Other FISs make no use of remotely sensed imagery. We ask the question: which FISs use sequences of satellite imagery as a continuing, explicit feature of their information offerings, either as a means of or adjunct to information display, or as a means of re-analysis? A partial review conducted in relation to this question yields the answer “very few”. It is argued and conjectured that any FIS concerned with spatially explicit information on forests, the environment, or land use in general, which aims to be flexible in what is mapped, accurate in estimating and mapping change, and able to make forecasts, MUST, in the long term, retain and make adaptive use of historical series of satellite imagery. This conjecture is based on the fact that any map, no matter how good, is just a particular model, calibrated at one time from the data then available. Re-analysis and improved modelling with new data demand access to the old raw data! The reviewed FISs are examined in relation to this conjecture. The paper considers some of the problems involved in making use of historical series of satellite imagery, and makes some suggestions on how future FISs might use image-fusion modelling techniques to make the conjecture become fact.

Keywords: Forest Information System, remotely sensed images, historical series, accuracy, spatio-temporal models, image-fusion, change-detection.

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Rennolls2004accuracy.pdf246.84 KB

Simulating Geological Structures Based on Training Images and Pattern Classifications

P. Switzer, T. Zhang, A. Journel
Department of Geological and Environmental Sciences
Stanford University
CA, 94305, U.S.A
Ph. +01 650 7232879; Fax +01 650 7250979
E-mail: switzer@stanford.edu

Abstract
Local two-dimensional spatial structure, as represented by a training image, can be summarized by a system of filter scores. Local patterns are then classified according to these scores. Sequential point-support simulation proceeds by selecting the score class most resembling the local data and then patching a pattern from this class at the simulation location. This procedure can handle both categorical and continuous variable training images. In addition, because the score space has low dimension, computation is efficient. Examples show spatial simulations derived from training images of sand channels and lithofacies.

Keywords: training images, filters, filter scores, simulations, spatial uncertainty

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Switzer2004accuracy.pdf458.26 KB

Sources of Uncertainty in Land Use Pressure Mapping

L.R. Lilburne and H.C. North
Landcare Research
P.O. Box 69, Lincoln, Canterbury 8152, New Zealand
Ph. +64 3 3256700; Fax +64 3 3252418
E-mail: lilburnel@landcareresearch.co.nz

Abstract
A key agricultural management pressure in Canterbury, New Zealand, is the practice of leaving land fallow during winter, because any nitrate present is likely to be leached since there is no plant uptake. Remote sensing imagery has been used successfully to identify land bare for significant periods in winter and early spring. This was achieved by applying simple rules to a temporal sequence of three Landsat 7 images. In particular, the ability to correctly classify land into bare, sparsely vegetated, and fully vegetated categories according to percentage vegetation cover was investigated using detailed field data along with the images. This paper analyses the sources of error in mapping fallow ground using remote sensing image sequences. Accurate assessment of total area and correct assessment at each spatial location were the aims. Potential sources of error include the suitability of the logical model, timing of image acquisition, scale mismatch, incorrect reference data, geometric errors between images, and radiometric variation both within and between images (month to month and year to year). A probabilistic approach to providing estimates of uncertainty was applied to draw together the most important of these error sources. Radiometric error was the most significant error source. A map of uncertainty is also produced.

Keywords: landuse pressure, multi-temporal, radiometric error, geometric error

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Lilburne2004accuracy.pdf679.5 KB

Spatial scale and its effects on comparisons of airborne and ground-based gamma-ray spectrometry for mapping environmental radioactivity

E M Scott 1, D C W Sanderson 2, A J Cresswell 2, J J Lang
1 Dept of Statistics, University of Glasgow, Glasgow G12 8QW, UK
2 Scottish Universities Research and Reactor Centre, East Kilbride G75 0QF, UK

Abstract
Airborne gamma ray spectrometry (AGS) has emerged as an important method for measuring environmental radioactivity (particularly radiocaesium, Cs-137) over wide areas. In early summer of 2002, a number of European AGS teams mapped three large common areas in a European inter-calibration exercise (RESUMÉ 2002). Part of the exercise also involved ground-based teams who conducted soil sampling, in-situ and dose rate measurements in the same areas. The study design required airborne and ground-based measurements to be taken at three calibration sites (pre-characterised prior to the exercise), three common areas, and a large composite area. The three calibration sites had 31 sampling points arranged in an expanding hexagonal pattern, with more than 500 laboratory measurements being made. The three common areas, X, Y and Z, were measured by each of the AGS teams, and a set of 42 ground-based sites (control points) was defined within the common areas and investigated by in-situ gamma spectrometry, soil sampling and dose rate measurement. Analysis of the results focussed on the comparability of the AGS results and on the agreement between the AGS and ground-based results. The statistical issues in the analysis of the exercise results include the spatial resolution of the measurements made using the different measurement systems and the substantial natural variation compounded with the variation in measurement techniques (and the importance of the spatial (lateral) homogeneity of the source of the radioactivity).

Keywords:  mapping, spatial scale

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Scott2004accuracy.pdf478.55 KB

Spatio-temporal analysis of extreme values from lichenometric studies and their relationships to climate

Daniel Cooley, Vincent Jomelli, Philippe Naveau

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Cooley2004accuracy.pdf222.25 KB

The Credible Diversity Plot: A Graphic for the Comparison of Biodiversity

Christopher J. Mecklin
Department of Mathematics & Statistics
Murray State University
Murray KY 42071
E-mail: christopher.mecklin@murraystate.edu

Abstract
A wide variety of indices exist for the measurement of ecological diversity. Common choices include the Shannon index and the Simpson index. Unfortunately, the arbitrary selection of diversity index can lead to conflicting results. However, both Shannon’s and Simpson’s measures are special cases of Renyi entropy. Others have developed “diversity profiles”, which use Renyi entropy to graphically compare the diversity index collected from different locations or from the same location over time. We extend this work by using both the bootstrap and Bayesian methods to estimate Renyi entropy and hence construct a graphic that we refer to as “credible diversity profiles”. Several examples using our method will be shown.
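
For readers unfamiliar with Rényi entropy, a brief sketch of how a diversity profile is computed follows. The abundance counts are hypothetical, and the bootstrap/Bayesian credible band that distinguishes the paper's method is not shown.

```python
import math

def renyi(p, alpha):
    # Renyi entropy of order alpha; the limit alpha -> 1 is Shannon entropy.
    if abs(alpha - 1.0) < 1e-9:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1.0 - alpha)

counts = [50, 30, 15, 5]            # hypothetical species abundances
total = sum(counts)
p = [c / total for c in counts]

shannon = renyi(p, 1.0)                      # Shannon index
simpson = 1.0 - sum(pi ** 2 for pi in p)     # Gini-Simpson index, = 1 - exp(-H_2)

# A diversity profile: Renyi entropy over a grid of orders (nonincreasing in alpha).
profile = [(a, renyi(p, a)) for a in (0.0, 0.5, 1.0, 2.0, 4.0)]
```

Plotting such profiles for two communities, with uncertainty bands around each curve, lets one see whether a diversity ranking holds across all orders or only for some.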

Keywords: Statistical Ecology, Biodiversity, Entropy, Bayesian Estimation, Bootstrap

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Mecklin2004accuracy.pdf142.48 KB

The effect of blurred plot coordinates on interpolating forest biomass: a case study

J.W. Coulston 1 and G.A Reams 2
1 Department of Forestry
North Carolina State University
Southern Research Station
Forestry Sciences Laboratory
3041 Cornwallis Road
Research Triangle Park, NC 27709
Phone 919-549-4071
jcoulston@fs.fed.us 
2 USDA Forest Service
Southern Research Station
Forestry Sciences Laboratory
3041 Cornwallis Road
Research Triangle Park, NC 27709
Phone 919-549-4010
greams@fs.fed.us

Abstract 
Interpolated surfaces of forest attributes are important analytical tools and have been used in risk assessments, forest inventories, and forest health assessments.  The USDA Forest Service Forest Inventory and Analysis program (FIA) annually collects information on forest attributes in a consistent fashion nationwide.  Users of these data typically perform interpolations with kriging or inverse distance weighting methods, which require the coordinates of each FIA plot.  However, because of privacy issues, FIA uses two methods to manipulate plot locations to ensure landowner privacy.  The influence these manipulations have on the accuracy of interpolated surfaces is unknown.  We investigated this influence by comparing actual estimates of forest biomass with interpolated estimates created from data with manipulated coordinates, for three interpolation techniques.  We found that kriging consistently under-performed the inverse distance and Thiessen polygon methods.  Overall, the inverse distance method performed best.  We suggest using the inverse distance method for spatial interpolation of FIA data with blurred plot coordinates when relatively little spatial autocorrelation exists.
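
A minimal sketch of the inverse distance weighting estimator that performed best in this study. The plot coordinates and biomass values below are made up; the FIA data and the cross-validation design of the paper are not reproduced.

```python
def idw(pts, vals, q, power=2.0):
    # Inverse distance weighted estimate at query point q.
    num = den = 0.0
    for (x, y), v in zip(pts, vals):
        d2 = (x - q[0]) ** 2 + (y - q[1]) ** 2
        if d2 == 0.0:
            return v  # query coincides with an observed plot
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den

# Hypothetical (possibly blurred) plot coordinates and biomass values.
plots = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
biomass = [10.0, 20.0, 30.0, 40.0]
est = idw(plots, biomass, (0.5, 0.5))
```

Because the estimate depends only on relative distances, small coordinate blurring perturbs the weights smoothly, which is consistent with the robustness the authors report for this method.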

Keywords: forest inventory and analysis, spatial statistics, cross-validation, Food Security Act of 1985

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Coulston2004accuracy.pdf668.94 KB

Two-stage wavelet analysis assessment of dependencies in time series of disease incidence

Nina H. Fefferman, Jyotsna S. Jagai, Elena N. Naumova
Tufts University School of Medicine
Family Medicine and Community Health
136 Harrison Avenue, Boston, MA 02111 USA
e-mail: nina.fefferman@tufts.edu

Abstract 
In epidemiology, techniques that examine periodicity in time series data can be used to understand weekly, biannual or seasonal patterns of disease.  However, a simple understanding of periodicity is not sufficient to examine the possible influence of variation in incubation period, distributed sources of infection, and infection due to environmental factors, especially if these influences affect the rate of disease on various spatio-temporal scales.  Wavelet analysis provides the ability to consider influences on various spatio-temporal scales.  To examine the feasibility of using wavelets to assess dependencies over different spatio-temporal scales in a time series of disease incidence, we abstracted 10 years of daily records of ambient temperature and precipitation, in addition to daily disease incidence data, for Massachusetts for two enterically transmitted diseases.  We eliminated periodic fluctuation in both seasonal and weekly case reporting using various techniques (Fourier transformation and “loess” smoothing) on each time series of disease data.  These different methods were employed in order to examine the possible effect of the removed periodicities on the variance of the data.  We then performed a wavelet decomposition to examine the residuals from these analyses on a variety of temporal scales and examined the resulting correlations with the environmental data.
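
The deseasonalize-then-decompose pipeline can be sketched as follows, using a day-of-week mean adjustment in place of the paper's Fourier/loess filtering and a Haar transform standing in for whatever wavelet family the authors used; the daily series is synthetic.

```python
import math

def haar_levels(x):
    # Full Haar decomposition: detail coefficients per scale plus final approximation.
    details, approx = [], list(x)
    while len(approx) > 1:
        a = [(approx[i] + approx[i + 1]) / math.sqrt(2) for i in range(0, len(approx), 2)]
        d = [(approx[i] - approx[i + 1]) / math.sqrt(2) for i in range(0, len(approx), 2)]
        details.append(d)
        approx = a
    return details, approx

# Synthetic daily incidence: a weekly cycle plus a slow trend (illustration only).
n = 64  # a power of two keeps the Haar recursion simple
series = [math.sin(2 * math.pi * t / 7) + 0.05 * t for t in range(n)]

# Crude weekly-cycle removal: subtract the mean for each day of the week,
# standing in for the Fourier transformation / loess smoothing of the abstract.
dow_mean = [sum(series[t] for t in range(d, n, 7)) / len(range(d, n, 7)) for d in range(7)]
resid = [series[t] - dow_mean[t % 7] for t in range(n)]

# Multiscale view of the residuals: details[0] is the finest (2-day) scale.
details, approx = haar_levels(resid)
```

In the paper's setting, each scale's coefficients would then be correlated against temperature and precipitation series decomposed the same way.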

Keywords:  wavelet decomposition, time series, Fourier decomposition, loess smoothing, environmental epidemiology

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Fefferman2004accuracy.pdf169.16 KB

Why aren’t we making better use of uncertainty information in decision-making?

Kim Lowell
Centre de recherche en géomatique
Pavillon Casault, Université Laval
Québec, Québec  G1K 7P4  Canada
(418) 656-2131 ext. 7998  Fax : (418) 656-7411
Kim.Lowell@scg.ulaval.ca

Abstract
In their day-to-day lives, human beings are quite comfortable making decisions based on uncertain information. However, in using decision-support tools, natural resource management is conducted based upon the most likely or “average” outcome of a given event.  While this is appropriate over a long time period and/or a large area, it is not appropriate for individual events.  This paper argues that management of natural resources should be conducted using the same paradigm that human beings use in everyday life: the risk of a given event should be known, the potential consequences of that event occurring should be estimated, and a decision should be made based on the risk of the event relative to its potential consequences.  It is quite possible that this could lead to protecting against events that have a low likelihood of occurring but would cause catastrophic effects should they occur.  A formal analytical structure for doing this is presented, as is a discussion of how adopting such a management paradigm will change decision-support tools.  Finally, a description is provided of the new visualisation and analytical tools that would be required to employ such a paradigm.

Keywords: Model uncertainty, data uncertainty, spatial uncertainty, decision-support, probability, likelihood

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.
 

AttachmentSize
Lowell2004accuracy.pdf279.56 KB

Wildfire threat count analysis by longitudinal models

J.A. Quintanilha and L.L. Ho
Escola Politécnica – Universidade de São Paulo
Av. Prof. Almeida Prado Trav2, n.83
05508-900 São Paulo SP Brazil
E-mail:jaquinta@usp.br

Abstract
The current operational fire monitoring program conducted by IBAMA (Brazil) has collected hot spot counts, as a measure of wildfire threats, together with other explanatory variables for the Amazon region. The aim of this paper is to present the results of a statistical analysis of this dataset from 1999 to 2002. New variables were created from the original data. The sample unit was the municipality. The density of the hot spot count (the ratio of the hot spot count to the municipality area) was selected as the dependent variable. A longitudinal linear model was fitted, and it identified as relevant explanatory variables: administrative limits, municipality area, year, rain conditions, legal status of the areas, and the percentages of deforestation, illegal human occupation, population growth and agricultural area. It also pointed out different variance structures in the dependent variable for different types of legal status of the areas.  In the residual analysis, most standardized residuals (near 90%) fall in the interval (-3, +3). However, some neighbouring municipalities must be considered separately, since their hot spot counts are not associated with any of the explanatory variables used in this analysis.

Keywords: wildfire threats, hot spot, longitudinal analysis, Amazon region

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Quintanilha2004accuracy.pdf290.23 KB

Wildfire Chances and Probabilistic Risk Assessment

D. R. Brillinger 1, H.K.Preisler 2 and H. M. Naderi
1Statistics Department
University of California
Berkeley, CA, 94720-3860
2 Pacific Southwest Research Station
USDA Forest Service
Albany, CA, 94710

Abstract
Forest fires are an important societal problem in many countries and regions. They cause extensive damage, and substantial funds are spent preparing for and fighting them. This work applies methods of probabilistic risk assessment to estimate the chances of fires at a future time given explanatory variables. One focus of the work is random effects models. Questions of interest include: Are random effects needed in the risk model? If yes, how is the analysis to be implemented? An exploratory data analysis approach is taken, employing both fixed and random effects models for data concerning the state of Oregon during the years 1989-1996.

Keywords: biased sampling, false discovery rate, forest fires, generalized mixed model, penalized quasi-likelihood, risk

In: McRoberts, R. et al. (eds).  Proceedings of the joint meeting of The 6th International Symposium On Spatial Accuracy Assessment In Natural Resources and Environmental Sciences and The 15th Annual Conference of The International Environmetrics Society, June 28 – July 1 2004, Portland, Maine, USA.

AttachmentSize
Brillinger2004accuracy.pdf505.59 KB