Cosmological models with flat spatial geometry
The imposition of symmetries or special geometric properties on submanifolds
is less restrictive than imposing them on the full space-time. Starting from
this idea, in this paper we study irrotational dust cosmological models in
which the geometry of the hypersurfaces generated by the fluid velocity is
flat, which amounts to a relaxation of the restrictions imposed by the
Cosmological Principle. The method of study combines covariant and tetrad
methods that exploit the geometrical and physical properties of these models.
This procedure allows us to determine all the space-times within this class
as well as to study their properties. Some important consequences and
applications of this study are also discussed.

Comment: 12 pages, LaTeX2e, IOP style. To appear in Classical and Quantum
Gravity
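For orientation, the fully symmetric member of this class is the spatially flat FLRW model; its well-known line element in comoving coordinates is

```latex
% Spatially flat FLRW line element (the maximally symmetric special case):
ds^2 = -\,dt^2 + a^2(t)\,\bigl(dx^2 + dy^2 + dz^2\bigr)
```

The models described in the abstract retain the flatness of the hypersurfaces orthogonal to the dust velocity while relaxing the homogeneity and isotropy that the Cosmological Principle would additionally impose; this framing is a reading of the abstract, not a result stated in it.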
Spatial models for flood risk assessment
The problem of computing risk measures associated with flood events is extremely important, not only from the point of view of civil protection systems but also because of the need for municipalities to insure against the damages. In this work we propose, in the framework of an integrated strategy, an operating solution which merges, within a conditional approach, the information usually available in this setting. First we use a Logistic Auto-Logistic (LAM) model to estimate the univariate conditional probabilities of flood events. This approach has two fundamental advantages: it allows us to incorporate auxiliary information and does not require the target variables to be independent. Then we simulate the joint distribution of floodings by means of the Gibbs sampler. Finally we propose an algorithm to increase ex post the spatial autocorrelation of the simulated events. The methodology is shown to be effective by means of an application to the estimation of the flood probability of Italian hydrographic regions.
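The simulation step described above can be sketched as a single-site Gibbs sampler for a binary autologistic field. The covariates, coefficients, and neighbourhood matrix below are illustrative assumptions, not the paper's fitted model:

```python
import numpy as np

def gibbs_autologistic(X, beta, rho, W, n_iter=500, seed=0):
    """Gibbs sampler for a binary autologistic (LAM-style) field.

    X    : (n, p) covariate matrix (auxiliary information)
    beta : (p,) regression coefficients
    rho  : spatial interaction parameter
    W    : (n, n) binary adjacency matrix of the regions

    All parameter values and the neighbourhood structure are
    illustrative assumptions, not quantities from the paper.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    y = rng.integers(0, 2, size=n).astype(float)
    for _ in range(n_iter):
        for i in range(n):
            # Full conditional: logit depends on covariates and neighbours.
            eta = X[i] @ beta + rho * (W[i] @ y)
            p = 1.0 / (1.0 + np.exp(-eta))
            y[i] = rng.random() < p
    return y

# Toy example: 5 regions on a line, one covariate.
X = np.linspace(-1, 1, 5).reshape(-1, 1)
W = np.zeros((5, 5))
for i in range(4):
    W[i, i + 1] = W[i + 1, i] = 1
sample = gibbs_autologistic(X, beta=np.array([0.5]), rho=0.8, W=W)
```

After burn-in, repeated draws of `sample` approximate the joint distribution of flood indicators; the paper's ex-post autocorrelation adjustment is a separate step not shown here.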
Reducing Spatial Data Complexity for Classification Models
Intelligent data analytics is gradually becoming a day-to-day reality of today's businesses. However, despite rapidly
increasing storage and computational power, current state-of-the-art predictive models still cannot handle massive and noisy
corporate data warehouses. What is more, adaptive and real-time operational environments require multiple models to be
frequently retrained, which further hinders their use. Various data reduction techniques, ranging from data sampling up to
density retention models, attempt to address this challenge by capturing a summarised data structure, yet they either do
not account for labelled data or degrade the classification performance of the model trained on the condensed dataset. Our
response is a proposition of a new general framework for reducing the complexity of labelled data by means of controlled
spatial redistribution of class densities in the input space. Using the example of the Parzen Labelled Data Compressor (PLDC) we
demonstrate a simulated data condensation process directly inspired by electrostatic field interaction, where the data are
moved and merged following the attracting and repelling interactions with the other labelled data. The process is controlled
by the class density function built on the original data, which acts as a class-sensitive potential field ensuring preservation of
the original class density distributions, yet allowing data to rearrange and merge, joining together their soft class partitions.
As a result we obtain a model that reduces labelled datasets much further than any competitive approach, yet with
maximum retention of the original class densities and hence of the classification performance. PLDC leaves the reduced
dataset with soft accumulative class weights, allowing for efficient online updates, and, as shown in a series of experiments,
when coupled with the Parzen Density Classifier (PDC) it significantly outperforms competitive data condensation methods in terms of
classification performance at comparable compression levels.
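A toy sketch in the spirit of the condensation process described above: same-class points attract, opposite-class points repel, and points that drift close to a same-class neighbour are merged with their weights accumulated. The force law, step size, and merge tolerance are invented for illustration; this is not the PLDC algorithm itself:

```python
import numpy as np

def condense(X, y, step=0.05, merge_tol=0.3, n_iter=50):
    """Charge-like data condensation sketch (inspired by, not equal to, PLDC).

    Same-class points attract, opposite-class points repel; a point that
    comes within `merge_tol` of a same-class point is merged into it and
    the weights are accumulated. All parameters are illustrative.
    """
    X = X.astype(float).copy()
    y = np.asarray(y)
    w = np.ones(len(X))               # accumulative weights
    alive = np.ones(len(X), dtype=bool)
    for _ in range(n_iter):
        idx = np.flatnonzero(alive)
        for i in idx:                 # move each point along the net force
            force = np.zeros(X.shape[1])
            for j in idx:
                if i == j:
                    continue
                d = X[j] - X[i]
                sign = 1.0 if y[i] == y[j] else -1.0   # attract same class
                force += sign * w[j] * d / (d @ d + 1e-9)
            X[i] += step * force
        idx = np.flatnonzero(alive)   # merge close same-class pairs
        for a, i in enumerate(idx):
            if not alive[i]:
                continue
            for j in idx[a + 1:]:
                if alive[j] and y[i] == y[j] and \
                        np.linalg.norm(X[i] - X[j]) < merge_tol:
                    X[i] = (w[i] * X[i] + w[j] * X[j]) / (w[i] + w[j])
                    w[i] += w[j]
                    alive[j] = False
    return X[alive], y[alive], w[alive]

# Toy example: two tight same-class pairs, far apart from each other.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 0.0], [5.1, 0.0]])
y = np.array([0, 0, 1, 1])
Xc, yc, wc = condense(X, y)
```

Each same-class pair collapses into a single weighted prototype, so the reduced set keeps one representative per cluster while the accumulated weights preserve the original class mass.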
Panel VAR Models with Spatial Dependence
I consider a panel vector-autoregressive model with cross-sectional dependence of the disturbances characterized by a spatial autoregressive process. I propose a three-step estimation procedure. Its first step is an instrumental variable estimation that ignores the spatial correlation. In the second step, the estimated disturbances are used in a multivariate spatial generalized moments estimation to infer the degree of spatial correlation. The final step of the procedure uses transformed data and applies standard techniques for estimation of panel vector-autoregressive models. I compare the small-sample performance of various estimation strategies in a Monte Carlo study.

Keywords: Spatial PVAR, Multivariate dynamic panel data model, Spatial GM, Spatial Cochrane-Orcutt transformation, Constrained maximum likelihood estimation
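The data transformation in the final step can be sketched as a spatial Cochrane-Orcutt filter: with disturbances following u = λWu + ε, premultiplying the data by (I − λW) yields spatially whitened disturbances. The weight matrix and λ below are assumed values for illustration; in the procedure λ would come from the spatial GM step:

```python
import numpy as np

def spatial_cochrane_orcutt(Y, W, lam):
    """Filter SAR-type spatial autocorrelation out of cross-sectional data.

    With u = lam * W u + eps, premultiplying the data by (I - lam * W)
    whitens the disturbances spatially. `lam` is an assumed value here;
    the estimation procedure would obtain it from the spatial GM step.
    """
    n = W.shape[0]
    return (np.eye(n) - lam * W) @ Y

# Toy example: 4 cross-sectional units on a ring, row-standardised weights.
W = np.array([[0.0, 0.5, 0.0, 0.5],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.5, 0.0, 0.5, 0.0]])
Y = np.arange(8.0).reshape(4, 2)   # two variables observed on 4 units
Y_star = spatial_cochrane_orcutt(Y, W, lam=0.4)
```

The transformed data `Y_star` can then be fed to standard panel-VAR estimators, since the spatial dependence of the disturbances has been filtered out.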
