A storage and access architecture for efficient query processing in spatial database systems
Due to the high complexity of objects and queries and also due to extremely
large data volumes, geographic database systems impose stringent requirements on their
storage and access architecture with respect to efficient query processing. Performance
improving concepts such as spatial storage and access structures, approximations, object
decompositions and multi-phase query processing have been suggested and analyzed as
single building blocks. In this paper, we describe a storage and access architecture that combines the above
building blocks in a modular fashion. Additionally, we incorporate a new ingredient
into our architecture, the scene organization, which efficiently
supports set-oriented access for large-area region queries. An experimental performance
comparison demonstrates that the concept of scene organization leads to considerable
performance improvements for large-area region queries, by a factor of up to 150.
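The multi-phase query processing mentioned above is commonly realized as a filter-and-refine scheme: a cheap approximation (here an axis-aligned bounding box) discards most objects before the expensive exact geometry test runs. The following is a minimal sketch of that idea, not the paper's architecture; all names and the placeholder exact test are illustrative.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned bounding box used as a cheap approximation."""
    xmin: float
    ymin: float
    xmax: float
    ymax: float

    def intersects(self, other: "Rect") -> bool:
        return (self.xmin <= other.xmax and other.xmin <= self.xmax and
                self.ymin <= other.ymax and other.ymin <= self.ymax)

@dataclass
class SpatialObject:
    name: str
    polygon: list   # exact geometry: list of (x, y) vertices
    bbox: Rect      # precomputed approximation

def exact_overlap(obj: SpatialObject, query: Rect) -> bool:
    # Stand-in for the expensive exact test: checks whether any vertex
    # of the polygon lies inside the query rectangle.
    return any(query.xmin <= x <= query.xmax and query.ymin <= y <= query.ymax
               for x, y in obj.polygon)

def region_query(objects, query: Rect):
    # Filter phase: cheap bounding-box test prunes most objects.
    candidates = [o for o in objects if o.bbox.intersects(query)]
    # Refinement phase: costly exact test runs only on the survivors.
    return [o for o in candidates if exact_overlap(o, query)]
```

A spatial access structure (e.g. an R-tree) would normally supply the candidate set instead of the linear scan shown here; the two-phase split is the point of the sketch.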
Query processing of spatial objects: Complexity versus Redundancy
The management of complex spatial objects in applications, such as geography and cartography,
imposes stringent new requirements on spatial database systems, in particular on efficient
query processing. As shown before, the performance of spatial query processing can be improved
by decomposing complex spatial objects into simple components. Up to now, only decomposition
techniques generating a linear number of very simple components, e.g. triangles or trapezoids, have
been considered. In this paper, we will investigate the natural trade-off between the complexity of
the components and the redundancy, i.e. the number of components, with respect to its effect on
efficient query processing. In particular, we present two new decomposition methods generating
a better balance between the complexity and the number of components than previously known
techniques. We compare these new decomposition methods to the traditional undecomposed representation
as well as to the well-known decomposition into convex polygons with respect to their
performance in spatial query processing. This comparison points out that for a wide range of query
selectivity the new decomposition techniques clearly outperform both the undecomposed representation
and the convex decomposition method. More important than the absolute gain in performance
by a factor of up to an order of magnitude is the robust performance of our new decomposition
techniques over the whole range of query selectivity.
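To make the complexity-versus-redundancy trade-off concrete: querying a decomposed object means testing several simple components instead of one complex polygon, and the redundancy is the component count. The sketch below uses a fan triangulation of a convex polygon as a stand-in for the paper's decomposition methods; all names are illustrative.

```python
def fan_triangulate(polygon):
    """Decompose a convex polygon (list of (x, y) vertices) into triangles.
    The redundancy is the number of components: len(polygon) - 2."""
    v0 = polygon[0]
    return [(v0, polygon[i], polygon[i + 1]) for i in range(1, len(polygon) - 1)]

def point_in_triangle(p, tri):
    """Per-component test via signed areas (cross products); a point is
    inside iff all three cross products share a sign (or are zero)."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1 = cross(tri[0], tri[1], p)
    d2 = cross(tri[1], tri[2], p)
    d3 = cross(tri[2], tri[0], p)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

def point_in_polygon(p, polygon):
    # Query against the decomposed representation: the point lies in the
    # polygon iff it lies in any one of the simple components.
    return any(point_in_triangle(p, t) for t in fan_triangulate(polygon))
```

Simpler components make each individual test cheaper but raise the number of tests; the paper's new methods aim at a better balance between the two than either extreme.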
BayesX: Analysing Bayesian structured additive regression models
There has been much recent interest in Bayesian inference for generalized additive and related models. The increasing popularity of Bayesian methods for these and other model classes is mainly caused by the introduction of Markov chain Monte Carlo (MCMC) simulation techniques, which allow the estimation of very complex and realistic models. This paper describes the capabilities of the public domain software BayesX for estimating complex regression models with a structured additive predictor. The program extends the capabilities of existing software for semiparametric regression. Many model classes well known from the literature are special cases of the models supported by BayesX. Examples are Generalized Additive (Mixed) Models, Dynamic Models, Varying Coefficient Models, Geoadditive Models, Geographically Weighted Regression and models for space-time regression. BayesX supports the most common distributions for the response variable. For univariate responses these are Gaussian, Binomial, Poisson, Gamma and negative Binomial. For multicategorical responses, both multinomial logit and probit models for unordered categories of the response as well as cumulative threshold models for ordered categories may be estimated. Moreover, BayesX allows the estimation of complex continuous-time survival and hazard-rate models.
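The abstract credits MCMC simulation with making such complex Bayesian models estimable. A minimal sketch of the underlying mechanism, using a random-walk Metropolis sampler for a toy Gaussian-mean model (this is not BayesX and not its algorithm; all names and parameter values are illustrative):

```python
import math
import random

def log_posterior(mu, data, prior_mean=0.0, prior_sd=10.0, noise_sd=1.0):
    """Log posterior (up to an additive constant) for a Gaussian mean
    with a Gaussian prior and Gaussian observation noise."""
    log_prior = -0.5 * ((mu - prior_mean) / prior_sd) ** 2
    log_lik = sum(-0.5 * ((x - mu) / noise_sd) ** 2 for x in data)
    return log_prior + log_lik

def metropolis(data, n_samples=5000, step=0.5, seed=0):
    """Random-walk Metropolis: propose a perturbed mu and accept it with
    probability min(1, posterior ratio), yielding draws from the posterior."""
    rng = random.Random(seed)
    mu = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = mu + rng.gauss(0.0, step)
        # Accept/reject on the log scale for numerical stability.
        if math.log(rng.random()) < log_posterior(proposal, data) - log_posterior(mu, data):
            mu = proposal
        samples.append(mu)
    return samples
```

Structured additive models replace the single parameter here with many blocks of coefficients (smooth terms, spatial effects, random effects), each updated in turn, but the accept/reject core is the same.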
Recognition by directed attention to recursively partitioned images
A learning/recognition model (and instantiating program) is described which recursively combines the learning paradigms of conceptual clustering (Michalski, 1980) and learning-from-examples to resolve the ambiguities of real-world recognition. The model is based on neuropsychological and psychological evidence that the visual system is analytic, hierarchical, and composed of a parallel/serial dichotomy (see, e.g., conclusions by Crick, 1984). Emulating the experimental evidence, parallel processes in the model decompose the image into components and cluster the constituents in much the same way as the image processing technique known as moment analysis (Alt, 1962). Serial, attentive mechanisms then reassemble the decompositions by investigating spatial relationships between components. The use of attentive mechanisms extends the moment analysis technique to handle alterations in structure and solves the contention problem created by combining the two learning paradigms. The contention results from a disagreement between the teacher and the model on what constitutes the salient features at the highest level of the symbol. There are four cases ZBT must handle, two of which result from the disagreement with the teacher. The parallel/serial dichotomy represents a vertical/horizontal tradeoff between the invariant and variant features of a domain. The resultant learned hierarchy allows ZBT to recognize structural differences while avoiding problems of exponential growth.
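The parallel decomposition stage is likened to moment analysis (Alt, 1962), which summarizes an image region by weighted sums of pixel intensities. A minimal sketch of raw image moments and the centroid they yield (illustrative only, not the described program's implementation):

```python
def raw_moment(image, p, q):
    """Raw moment M_pq = sum over pixels of x^p * y^q * I(x, y),
    where image is a 2-D list of intensities indexed [y][x]."""
    return sum((x ** p) * (y ** q) * val
               for y, row in enumerate(image)
               for x, val in enumerate(row))

def centroid(image):
    """Component centroid (x_bar, y_bar) from the zeroth and first moments."""
    m00 = raw_moment(image, 0, 0)
    return raw_moment(image, 1, 0) / m00, raw_moment(image, 0, 1) / m00
```

Higher-order (central) moments built the same way capture orientation and spread, which is what makes moments useful for clustering image components; moments alone are insensitive to structural rearrangement, which is the gap the attentive, serial stage addresses.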
A Quantitative Analysis of IRAS Maps of Molecular Clouds
We present an analysis of IRAS maps of five molecular clouds: Orion,
Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description
of these astrophysical maps, we use a newly developed technique which considers
all maps of a given type to be elements of a pseudometric space. For each
physical characteristic of interest, this formal system assigns a distance
function (a pseudometric) to the space of all maps; this procedure allows us to
measure quantitatively the difference between any two maps and to order the
space of all maps. We thus obtain a quantitative classification scheme for
molecular clouds. In the present study we use the IRAS continuum maps at
100 μm and 60 μm to produce column density (or optical depth) maps for
the five molecular cloud regions given above. For this sample of clouds, we
compute the "output" functions which measure the distribution of density, the
distribution of topological components, the self-gravity, and the filamentary
nature of the clouds. The results of this work provide a quantitative
description of the structure in these molecular cloud regions. We then order
the clouds according to the overall environmental "complexity" of these star
forming regions. Finally, we compare our results with the observed populations
of young stellar objects in these clouds and discuss the possible environmental
effects on the star formation process. Our results are consistent with the
recently stated conjecture that more massive stars tend to form in more
``complex'' environments.Comment: 27 pages Plain TeX, submitted to ApJ, UM-AC 93-15, 15 figures
available upon reques
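The classification scheme rests on a pseudometric: each map is reduced to an "output" function, and the distance between two maps is the distance between those functions, so distinct maps that share the same distribution get distance zero. A minimal sketch of that construction, using a normalized value histogram as the output function (illustrative only; the paper's specific output functions differ):

```python
def output_function(column_density_map, n_bins=8):
    """Reduce a map (2-D list of values) to its normalized value
    histogram, a simple 'output' function describing the density
    distribution."""
    values = [v for row in column_density_map for v in row]
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0  # guard against constant maps
    hist = [0] * n_bins
    for v in values:
        hist[min(int((v - lo) / width), n_bins - 1)] += 1
    total = len(values)
    return [h / total for h in hist]

def pseudometric(map_a, map_b, n_bins=8):
    """Distance between maps = L1 distance between their output functions.
    Non-negative, symmetric, and obeys the triangle inequality, but
    different maps can have distance zero: a pseudometric, not a metric."""
    fa = output_function(map_a, n_bins)
    fb = output_function(map_b, n_bins)
    return sum(abs(a - b) for a, b in zip(fa, fb))
```

With one such distance function per physical characteristic (density distribution, topology, self-gravity, filamentarity), maps can be compared and ordered along each axis, which is exactly the kind of quantitative ordering the abstract describes.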