Selected Challenges From Spatial Statistics For Spatial Econometricians
Griffith and Paelinck (2011) present selected non-standard spatial statistics and spatial econometrics topics that address issues associated with spatial econometric methodology. This paper addresses the following challenges posed by spatial autocorrelation, alluded to in and/or derived from the spatial statistics topics of that book: the Gaussian random variable Jacobian term for massive datasets; topological features of georeferenced data; eigenvector spatial filtering-based georeferenced data generating mechanisms; and the interpretation of random effects.
Comparing estimation methods for spatial econometrics techniques using R.
Recent advances in spatial econometrics model fitting techniques have made it more desirable to be able to compare results and timings. Results should correspond between implementations using different applications, while timings are more readily compared within a single application. A broad range of model fitting techniques is provided by the contributed R packages for spatial econometrics. These model fitting techniques are associated with methods for estimating impacts and some tests, which will also be presented and compared. This review constitutes an up-to-date demonstration of techniques now available in R, and mentions some that will shortly become more generally available.
Keywords: spatial autoregression; econometric software.
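The spatial lag model whose fitting such reviews compare can be sketched compactly with concentrated maximum likelihood. The following is a minimal NumPy illustration under a toy circular weights matrix, not the spdep/spatialreg implementation; all names and the data-generating setup are assumptions for demonstration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy SAR model y = rho*W y + X beta + eps, fitted by concentrated ML.
# The Jacobian ln|I - rho W| uses the eigenvalues of W (fine for small n).
rng = np.random.default_rng(0)
n = 100
# Row-standardised weights on a ring: each unit has two neighbours.
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true, rho_true = np.array([1.0, 2.0]), 0.4
y = np.linalg.solve(np.eye(n) - rho_true * W, X @ beta_true + rng.normal(size=n))

eigs = np.linalg.eigvalsh(W)  # symmetric toy W: real eigenvalues

def neg_conc_loglik(rho):
    # Concentrate beta and sigma^2 out of the likelihood for fixed rho.
    Ay = y - rho * (W @ y)
    beta = np.linalg.lstsq(X, Ay, rcond=None)[0]
    e = Ay - X @ beta
    sigma2 = e @ e / n
    logdet = np.sum(np.log(1.0 - rho * eigs))  # ln|I - rho W|
    return -(logdet - n / 2 * np.log(sigma2))

rho_hat = minimize_scalar(neg_conc_loglik, bounds=(-0.99, 0.99),
                          method="bounded").x
```

Timing such a loop for growing n is exactly the kind of comparison the review performs across implementations.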
A Scalable MCEM Estimator for Spatio-Temporal Autoregressive Models
Very large spatio-temporal lattice data are becoming increasingly common
across a variety of disciplines. However, estimating interdependence across
space and time in large areal datasets remains challenging, as existing
approaches are often (i) not scalable, (ii) designed for conditionally Gaussian
outcome data, or (iii) limited to cross-sectional and univariate outcomes.
This paper proposes an MCEM estimation strategy for a family of latent-Gaussian
multivariate spatio-temporal models that addresses these issues. The proposed
estimator is applicable to a wide range of non-Gaussian outcomes, and
implementations for binary and count outcomes are discussed explicitly. The
methodology is illustrated on simulated data, as well as on weekly data of
IS-related events in Syrian districts.
Comment: 29 pages, 8 figures
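The MCEM idea behind such estimators can be illustrated on a deliberately simple latent-Gaussian binary model, not the paper's spatio-temporal specification: draw the latent Gaussian variable in a Monte Carlo E-step and update the parameter in a closed-form M-step. Everything below (model, sample sizes, seeds) is an assumption for illustration.

```python
import numpy as np
from scipy.stats import truncnorm

# Toy MCEM: binary outcomes y_i = 1{z_i > 0} with latent z_i ~ N(mu, 1).
# E-step: sample z from its truncated-normal conditional given y and mu;
# M-step: update mu as the mean of the imputed latent draws.
rng = np.random.default_rng(1)
mu_true, n, m = 0.5, 2000, 50          # m Monte Carlo draws per observation
y = (rng.normal(mu_true, 1.0, size=n) > 0).astype(float)

mu = 0.0
for _ in range(25):
    # truncnorm bounds are standardised: z > 0 when y = 1, z < 0 when y = 0.
    a = np.where(y == 1, -mu, -np.inf)
    b = np.where(y == 1, np.inf, -mu)
    z = truncnorm.rvs(a, b, loc=mu, scale=1.0,
                      size=(m, n), random_state=rng)
    mu = z.mean()                       # M-step for a Gaussian mean
```

The paper's contribution is making this scheme scale to multivariate spatio-temporal dependence; the two-step structure, however, is the same.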
After “Raising the Bar”: applied maximum likelihood estimation of families of models in spatial econometrics.
Elhorst (2010) shows how the recent publication of LeSage and Pace (2009), in his expression, "raises the bar" for our fitting of spatial econometrics models. By extending the family of models that deserve attention, Elhorst reveals the need to explore how they might be fitted, and discusses some alternatives. This paper attempts to take up this challenge with respect to implementation in the R spdep package for the maximum likelihood case, using a smaller data set to see whether earlier conclusions would be changed when newer techniques are used, and two larger data sets to examine model fitting issues.
Keywords: models; econometrics.
Robust Cardiac Motion Estimation using Ultrafast Ultrasound Data: A Low-Rank-Topology-Preserving Approach
Cardiac motion estimation is an important diagnostic tool to detect heart
diseases and it has been explored with modalities such as MRI and conventional
ultrasound (US) sequences. US cardiac motion estimation still presents
challenges because of the complex motion patterns and the presence of noise. In
this work, we propose a novel approach to estimate the cardiac motion using
ultrafast ultrasound data. Our solution is based on a variational
formulation of the L2-regularized class. The displacement is
represented by a lattice of B-splines, and we ensure robustness by applying a
maximum-likelihood-type estimator. While this is an important part of our
solution, the main highlight of this paper is to combine a low-rank data
representation with topology preservation. Low-rank data representation
(achieved by finding the k dominant singular values of a Casorati matrix
arranged from the data sequence) speeds up the global solution and achieves
noise reduction. On the other hand, topology preservation (achieved by
monitoring the Jacobian determinant) allows us to radically rule out distortions
while carefully controlling the size of allowed expansions and contractions.
Our variational approach is carried out on a realistic dataset as well as on a
simulated one. We demonstrate how our proposed variational solution deals with
complex deformations through careful numerical experiments. While maintaining
the accuracy of the solution, the low-rank preprocessing is shown to speed up
the convergence of the variational problem. Beyond cardiac motion estimation,
our approach is promising for the analysis of other organs that experience
motion.
Comment: 15 pages, 10 figures, Physics in Medicine and Biology, 201
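The topology-preservation check described above, monitoring the Jacobian determinant of the deformation, can be sketched for a 2-D displacement field with finite differences. This is an illustrative sketch under assumed names and a synthetic field, not the paper's solver.

```python
import numpy as np

# Monitor topology preservation of the mapping x -> x + u(x) on a pixel
# grid: det J > 0 everywhere means no folding, and values far from 1 flag
# large expansions or contractions.
def jacobian_determinant(ux, uy):
    # Finite-difference partials of the displacement (unit grid spacing).
    duxdy, duxdx = np.gradient(ux)   # axis 0 = rows (y), axis 1 = cols (x)
    duydy, duydx = np.gradient(uy)
    # J = I + grad u; determinant of the 2x2 Jacobian at every grid point.
    return (1.0 + duxdx) * (1.0 + duydy) - duxdy * duydx

# Smooth, small synthetic displacement: determinant stays positive, near 1.
yy, xx = np.mgrid[0:64, 0:64].astype(float)
ux = 0.05 * np.sin(2 * np.pi * xx / 64)
uy = 0.05 * np.cos(2 * np.pi * yy / 64)
detJ = jacobian_determinant(ux, uy)
```

In a registration loop one would reject or re-project updates whose determinant map leaves an allowed band around 1, which is the control the abstract describes.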
Computing the Jacobian in spatial models: an applied survey.
Despite attempts to get around the Jacobian in fitting spatial econometric models by using GMM and other approximations, it remains a central problem for maximum likelihood estimation. In principle, and for smaller data sets, the use of the eigenvalues of the spatial weights matrix provides a very rapid and satisfactory resolution. For somewhat larger problems, including those induced in spatial panel and dyadic (network) problems, solving the eigenproblem is not as attractive, and a number of alternatives have been proposed. This paper surveys selected alternatives, and comments on their relative usefulness.
Keywords: spatial autoregression; maximum likelihood estimation; Jacobian computation; econometric software.
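The eigenvalue route the survey starts from rests on the identity ln|I - rho*W| = sum_i ln(1 - rho*lambda_i): once the eigenvalues of W are in hand, each new rho costs only O(n). A small NumPy check against a direct dense log-determinant (toy random weights, not a real dataset):

```python
import numpy as np

# Jacobian term ln|I - rho*W| via precomputed eigenvalues of W, compared
# against a direct dense log-determinant.
rng = np.random.default_rng(2)
n = 200
# Symmetric toy adjacency, then row-standardised weights.
A = (rng.random((n, n)) < 0.05).astype(float)
A = np.triu(A, 1)
A = A + A.T
W = A / np.maximum(A.sum(axis=1, keepdims=True), 1.0)

# W is similar to a symmetric matrix, so its eigenvalues are real.
eigs = np.linalg.eigvals(W).real
rho = 0.5
logdet_eig = np.sum(np.log(1.0 - rho * eigs))
sign, logdet_direct = np.linalg.slogdet(np.eye(n) - rho * W)
```

The one-off O(n^3) eigendecomposition is exactly what becomes unattractive for the larger panel and network problems the paper turns to, motivating the surveyed alternatives.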
Beam-beam simulation code BBSIM for particle accelerators
A highly efficient, fully parallelized, six-dimensional tracking model for
simulating interactions of colliding hadron beams in high energy ring colliders
and simulating schemes for mitigating their effects is described. The model
uses the weak-strong approximation for calculating the head-on interactions
when the test beam has lower intensity than the other beam, a look-up table for
the efficient calculation of long-range beam-beam forces, and a self-consistent
Poisson solver when both beams have comparable intensities. A performance test
of the model in a parallel environment is presented. The code is used to
calculate beam emittance and beam loss in the Tevatron at Fermilab, and the
results are compared with measurements. We also present results from studies of two schemes
proposed to compensate the beam-beam interactions: a) the compensation of
long-range interactions in the Relativistic Heavy Ion Collider (RHIC) at
Brookhaven and the Large Hadron Collider (LHC) at CERN with a current-carrying
wire, b) the use of a low energy electron beam to compensate the head-on
interactions in RHIC.
NetKet 3: Machine Learning Toolbox for Many-Body Quantum Systems
We introduce version 3 of NetKet, the machine learning toolbox for many-body quantum physics. NetKet is built around neural-network quantum states and provides efficient algorithms for their evaluation and optimization. This new version is built on top of JAX, a differentiable programming and accelerated linear algebra framework for the Python programming language. The most significant new feature is the possibility to define arbitrary neural network ansätze in pure Python code using the concise notation of machine-learning frameworks, which allows for just-in-time compilation as well as the implicit generation of gradients thanks to automatic differentiation. NetKet 3 also comes with support for GPU and TPU accelerators, advanced support for discrete symmetry groups, chunking to scale up to thousands of degrees of freedom, drivers for quantum dynamics applications, and improved modularity, allowing users to use only parts of the toolbox as a foundation for their own code.
Spatial regression in large datasets: problem set solution
In this dissertation we investigate how Data Mining methods can be combined with traditional spatial autoregressive models in the context of large spatial datasets.
We begin by considering the numerical difficulties of handling massive datasets with the usual approaches: Maximum Likelihood estimation for spatial models and Spatial Two-Stage Least Squares.
We then conduct a Monte Carlo simulation experiment to compare the accuracy and computational complexity of decomposition and approximation techniques for computing the Jacobian in spatial models, for various regular lattice structures. In particular, we consider one of the most common spatial econometric models: the spatial lag (or SAR, spatial autoregressive) model.
We also provide new evidence for the literature by examining the double effect of these methods on computational complexity: the influence of a "size effect" and a "sparsity effect".
To overcome this computational problem, we propose a data mining methodology, CART (Classification and Regression Trees), that explicitly considers the phenomenon of spatial autocorrelation on pseudo-residuals, in order to remove this effect and improve accuracy, with significant savings in computational complexity across a wide range of spatial datasets: real and simulated data.
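The "size effect" on the Jacobian computation can be made concrete with a sparse factorisation on the regular lattices the dissertation studies. A minimal sketch, assuming a rook-contiguity k x k grid and SciPy's sparse LU; package implementations differ.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import splu

# ln|I - rho*W| for a k x k rook-contiguity lattice, via sparse LU.
# Dense eigenvalue methods are O(n^3) in n = k*k; a sparse factorisation
# exploits the lattice's sparsity instead.
def lattice_logdet(k, rho):
    n = k * k
    eye_k = sparse.eye(k, format="csr")
    path = sparse.diags([1.0, 1.0], [-1, 1], shape=(k, k), format="csr")
    A = sparse.kron(eye_k, path) + sparse.kron(path, eye_k)  # adjacency
    deg = np.asarray(A.sum(axis=1)).ravel()
    W = sparse.diags(1.0 / deg) @ A          # row-standardised weights
    lu = splu((sparse.eye(n) - rho * W).tocsc())
    # det = det(P_r) det(L) det(U); L has unit diagonal, permutations are
    # +/-1, and I - rho*W has positive determinant here, so use |diag(U)|.
    return np.sum(np.log(np.abs(lu.U.diagonal())))
```

Timing this function as k grows, and comparing against the dense eigenvalue route, reproduces in miniature the size/sparsity comparison the dissertation reports.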