Manycore processing of repeated range queries over massive moving objects observations
The ability to process large amounts of continuously updated spatial data in a timely fashion is mandatory for an increasing number of applications. Parallelism enables such applications to face this data-intensive challenge and allows the resulting systems to achieve low latency and high scalability. In this paper we
focus on a specific data-intensive problem, concerning the repeated processing
of huge amounts of range queries over massive sets of moving objects, where the
spatial extents of queries and objects are continuously modified over time. To
tackle this problem and significantly accelerate query processing, we devise a hybrid CPU/GPU pipeline that compresses the data output and saves query-processing work. The system relies on an ad-hoc spatial index that leads to a problem
decomposition that results in a set of independent data-parallel tasks. The
index is based on a point-region quadtree space decomposition and can effectively handle a broad range of spatial object distributions, even highly skewed ones. Moreover, to cope with the architectural peculiarities and limitations
of the GPUs, we adopt non-trivial GPU data structures that avoid the need for locked memory accesses and favour coalesced ones, thus enhancing the
overall memory throughput. To the best of our knowledge this is the first work
that exploits GPUs to efficiently solve repeated range queries over massive
sets of continuously moving objects, characterized by highly skewed spatial
distributions. In comparison with state-of-the-art CPU-based implementations,
our method highlights significant speedups in the order of 14x-20x, depending
on the datasets, even when considering very cheap GPUs
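To illustrate the decomposition idea behind the index, here is a minimal CPU-only sketch of a point-region quadtree whose leaves become independent data-parallel tasks for range-query processing. All names (`Quad`, `CAPACITY`, `range_query`) and the leaf capacity are invented for the example; this is not the paper's GPU implementation.

```python
# Sketch of a point-region (PR) quadtree: each node covers a fixed quadrant,
# leaves hold points up to CAPACITY, and overflowing leaves split into four
# equal sub-quadrants. Leaves form independent range-query tasks.

CAPACITY = 4  # max points per leaf before splitting (illustrative value)

class Quad:
    def __init__(self, x0, y0, x1, y1):
        self.x0, self.y0, self.x1, self.y1 = x0, y0, x1, y1
        self.points = []      # only used while this node is a leaf
        self.children = None  # four sub-quadrants after a split

    def insert(self, p):
        if self.children is None:
            self.points.append(p)
            if len(self.points) > CAPACITY:
                self._split()
            return
        self._child_for(p).insert(p)

    def _split(self):
        mx, my = (self.x0 + self.x1) / 2, (self.y0 + self.y1) / 2
        self.children = [Quad(self.x0, self.y0, mx, my),
                         Quad(mx, self.y0, self.x1, my),
                         Quad(self.x0, my, mx, self.y1),
                         Quad(mx, my, self.x1, self.y1)]
        pts, self.points = self.points, []
        for p in pts:
            self._child_for(p).insert(p)

    def _child_for(self, p):
        mx, my = (self.x0 + self.x1) / 2, (self.y0 + self.y1) / 2
        return self.children[(2 if p[1] >= my else 0) + (1 if p[0] >= mx else 0)]

    def leaves(self):
        """Leaves are the independent data-parallel tasks."""
        if self.children is None:
            yield self
        else:
            for c in self.children:
                yield from c.leaves()

def range_query(root, qx0, qy0, qx1, qy1):
    out = []
    for leaf in root.leaves():  # each leaf could be processed in parallel
        if leaf.x1 < qx0 or leaf.x0 > qx1 or leaf.y1 < qy0 or leaf.y0 > qy1:
            continue  # leaf quadrant does not intersect the query rectangle
        out.extend(p for p in leaf.points
                   if qx0 <= p[0] <= qx1 and qy0 <= p[1] <= qy1)
    return out
```

On a GPU, each intersecting leaf would map to a block of threads; the pruning of non-intersecting leaves is what saves query-processing work.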
Studying the universality of field induced tunnel ionization times via high-order harmonic spectroscopy
High-harmonic generation spectroscopy is a promising tool for resolving electron dynamics and structure in atomic and molecular systems. This scheme, commonly described by the strong-field approximation, requires deep insight into the basic mechanism that leads to harmonic generation. Recently, we
have demonstrated the ability to resolve the first stage of the process --
field induced tunnel ionization -- by adding a weak perturbation to the strong
fundamental field. Here we generalize this approach and show that the
assumptions behind the strong field approximation are valid over a wide range
of tunnel ionization conditions. Performing a systematic study -- modifying the
fundamental wavelength, intensity and atomic system -- we observed good agreement with quantum-path analysis over a range of Keldysh parameters. The
generality of this scheme opens new perspectives in high harmonics
spectroscopy, holding the potential of probing large, complex molecular systems.
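For reference, the Keldysh parameter $\gamma$ mentioned above separates the tunnel-ionization regime ($\gamma \ll 1$) from the multiphoton regime ($\gamma \gg 1$); in its standard form,

```latex
\gamma = \sqrt{\frac{I_p}{2 U_p}} = \frac{\omega \sqrt{2 m_e I_p}}{e E_0},
```

where $I_p$ is the ionization potential, $U_p$ the ponderomotive energy, and $\omega$ and $E_0$ the laser frequency and peak field (this definition is standard and not specific to the paper).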
Diffraction-free light droplets for axially-resolved volume imaging
An ideal direct imaging system entails a method to illuminate on command a single diffraction-limited region in a generally thick and turbid volume. The best approximation to this is the use of large-aperture lenses that focus light into a spot. This strategy fails for regions embedded deep in the sample, where diffraction and scattering prevail. Airy beams and Bessel beams are solutions of the Helmholtz equation that are both non-diffracting and self-healing, features that make them naturally able to overcome the effects of propagation distance into the volume but intrinsically offer no resolution along the propagation axis. Here, we demonstrate diffraction-free, self-healing, three-dimensional monochromatic light spots able to penetrate deep into the volume of a sample, resist deflection in turbid environments, and offer axial resolution comparable to that of Gaussian beams. The fields, formed from coherent mixtures of Bessel beams, exhibit a more than ten-fold increase in undistorted penetration, even in turbid milk solutions, compared to diffraction-limited beams. In a fluorescence imaging scheme, we find a ten-fold increase in image contrast compared to diffraction-limited illumination, and a constant axial resolution even after four Rayleigh lengths. These results pave the way to new opportunities in three-dimensional microscopy.
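For context, the non-diffracting ingredient mixed above is the ideal Bessel beam, a textbook solution of the Helmholtz equation (this is the standard form, not the paper's specific superposition):

```latex
E(r, z) = J_0(k_r r)\, e^{i k_z z}, \qquad k_r^2 + k_z^2 = k^2,
```

whose transverse profile $J_0(k_r r)$ is independent of the propagation distance $z$, which is precisely why a single Bessel beam carries no axial resolution.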
Application of data fusion techniques to direct geographical traceability indicators
A hierarchical data fusion approach has been developed that uses multivariate curve resolution (MCR) as a variable-reduction tool.
The case study presented concerns the characterization of soil samples from the Modena district. It was performed in order to understand, at a pilot-study stage, the geographical variability of the zone prior to planning a representative soil sampling to derive geographical traceability models for Lambrusco wines. Soil samples were collected from four producers of Lambrusco wines, located in plain and hill areas. Depending on the extent of the sampled fields, the number of points collected varies from three to five and, for each point, five depth levels were considered.
The different data blocks consisted of X-ray powder diffraction (XRDP) spectra, the concentrations of thirty-four metal elements and the 87Sr/86Sr isotopic abundance ratio, a very promising geographical traceability marker.
A multi-step data fusion strategy was adopted. First, the metal-concentration dataset was weighted, concatenated with the strontium isotopic ratio values, and compressed. The resolved components described common patterns of variation of the metal content and the strontium isotopic ratio. The X-ray powder diffraction profiles were resolved into three main components that can be attributed to calcite, quartz and clay contributions. Then, a high-level data fusion approach was applied by combining the components arising from the previous data sets.
The results show interesting links among the different components arising from XRDP and the metal patterns, and indicate which of these the 87Sr/86Sr isotopic ratio variation follows most closely. The combined information allowed capturing the variability of the analyzed soil samples.
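The first fusion step described above (weighting a block and concatenating it with the isotopic ratio) can be sketched as block-scaled concatenation. This is an illustrative sketch with invented array sizes and random data, not the paper's procedure; only the 34-metal/one-ratio layout comes from the abstract.

```python
import numpy as np

# Illustrative low-level fusion: autoscale each block, then weight it by
# 1/sqrt(n_vars) so every block contributes the same total variance,
# and concatenate. Sample count (20) is invented for the example.

rng = np.random.default_rng(1)
metals = rng.normal(size=(20, 34))    # 20 soil samples x 34 elements
sr_ratio = rng.normal(size=(20, 1))   # one 87Sr/86Sr value per sample

def autoscale(x):
    # center each column and scale to unit variance
    return (x - x.mean(axis=0)) / x.std(axis=0, ddof=1)

def block_weight(x):
    # divide by sqrt(n_vars): total block variance becomes 1
    return autoscale(x) / np.sqrt(x.shape[1])

fused = np.hstack([block_weight(metals), block_weight(sr_ratio)])
```

After this step the fused matrix can be compressed (e.g. by MCR or PCA) without the larger block dominating purely through its variable count.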
Load-Sensitive Selective Pruning for Distributed Search
A search engine infrastructure must be able to provide the same quality of service to all queries received during a day. During normal operating conditions, the demand for resources is considerably lower than under peak conditions, yet an oversized infrastructure would result in an unnecessary waste of computing power. A possible solution in this situation is to define a maximum processing time for each query and drop queries that exceed this threshold, leading to disappointed users. In this paper, we propose and evaluate a different approach where, given a set of query processing strategies with differing efficiency, each query is handled by a framework that sets a maximum query processing time and selects the processing strategy best suited to that query, such that the processing time of every query is kept below the threshold. The processing-time estimates used by the scheduler are learned from past queries. We experimentally validate our approach on 10,000 queries from a standard TREC dataset with over 50 million documents, and we compare it with several baselines. These experiments encompass testing the system under different query loads and different maximum tolerated query response times. Our results show that, at the cost of a marginal loss in response quality, our search system is able to answer 90% of queries within half a second during times of high query volume.
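The selection step can be sketched as follows. The strategy names, the hard-coded time estimates and quality scores are all invented for illustration; in the paper these estimates are learned from past queries.

```python
# Hypothetical sketch of load-sensitive strategy selection: among the
# strategies predicted to finish within the per-query time budget, pick the
# highest-quality one; if none fits, degrade to the cheapest strategy.

def select_strategy(estimates, budget):
    """estimates: list of (name, est_time_s, quality); budget in seconds."""
    feasible = [s for s in estimates if s[1] <= budget]
    if feasible:
        return max(feasible, key=lambda s: s[2])[0]  # best quality in budget
    return min(estimates, key=lambda s: s[1])[0]     # graceful degradation

# Invented example strategies: (name, predicted time, relative quality).
strategies = [("exhaustive", 0.80, 1.00),
              ("conjunctive", 0.30, 0.95),
              ("pruned",      0.10, 0.85)]
```

Under high load the scheduler would shrink the budget, pushing more queries onto the cheaper, slightly lower-quality strategies instead of dropping them.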
A mid level data fusion strategy for the Varietal Classification of Lambrusco PDO wines
Nowadays, the need to reveal the hidden information in complex data sets is increasing due to the development of high-throughput instrumentation. The possibility to jointly analyze data sets arising from different sources (e.g. different analytical determinations/platforms) allows capturing latent information that would not be extracted by the individual analysis of each block of data. Several such approaches are proposed in the literature and are generally referred to as data fusion approaches. In this work a mid-level data fusion approach is proposed for the characterization of three varieties (Salamino di Santa Croce, Grasparossa di Castelvetro, Sorbara) of Lambrusco wine, a typical PDO wine of the district of Modena (Italy). Wine samples of the three varieties were analyzed by means of 1H-NMR spectroscopy, emission-excitation fluorescence spectroscopy and HPLC-DAD of the phenolic compounds.
Since the analytical outputs are characterized by different dimensionalities (matrix and tensor), several multivariate analyses were applied (PCA, PARAFAC, MCR-ALS) in order to extract and merge, in a hierarchical way, the information present in each data set.
The results showed that this approach was able to characterize the Lambrusco samples well, while also making it possible to understand the correlations among the sources of information arising from the three analytical techniques.
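The core of a mid-level fusion like the one above is compressing each analytical block to a few latent scores and concatenating the scores. The sketch below uses SVD-based PCA for both blocks; block shapes, component counts and the random data are invented, and the paper additionally uses PARAFAC and MCR-ALS for the tensor-shaped blocks, which are not shown here.

```python
import numpy as np

# Illustrative mid-level data fusion: compress each block to PCA scores,
# then concatenate the score matrices sample-wise into one fused matrix.

rng = np.random.default_rng(0)

def pca_scores(block, n_components):
    centered = block - block.mean(axis=0)        # column-wise mean centering
    u, s, _ = np.linalg.svd(centered, full_matrices=False)
    return u[:, :n_components] * s[:n_components]  # scores = U_k * S_k

nmr  = rng.normal(size=(12, 200))  # e.g. 1H-NMR spectra, 12 wine samples
hplc = rng.normal(size=(12, 50))   # e.g. HPLC-DAD phenolic profiles

fused = np.hstack([pca_scores(nmr, 3), pca_scores(hplc, 2)])
```

The fused score matrix (samples x total components) is then the input to the final classification model, so each technique contributes only its extracted latent information rather than its raw variables.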
Kuznets curve in municipal solid waste production: An empirical analysis based on municipal-level panel data from the Lombardy region (Italy)
Using a novel database covering 1,497 municipalities of the Lombardy region in Italy between 2005 and 2011, this paper provides an empirical test of the Waste Kuznets Curve (WKC) hypothesis. Fixed-effects regression analyses, generalized method of moments models and a number of robustness checks strongly indicate that, among the municipalities under scrutiny, there is an inverted U-shaped relationship between economic development and waste generation. Nevertheless, only a few of these municipalities reach the turning point of the estimated curve. These findings contribute to the expanding empirical literature testing the WKC with municipal data, considered the most appropriate for this kind of analysis.
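The inverted-U test reduces to a quadratic regression of waste on income: an inverted U requires a negative coefficient on the squared term, with the turning point at -b1/(2*b2). The sketch below fits such a curve on synthetic data (the true coefficients and income range are invented); it is a single pooled OLS fit, not the fixed-effects or GMM panel estimators used in the paper.

```python
import numpy as np

# Quadratic Kuznets-style fit on synthetic data: waste rises with income,
# peaks, then falls. True curve: 100 + 12*x - 0.25*x^2 (turning point 24).

rng = np.random.default_rng(2)
income = rng.uniform(10, 40, size=500)  # e.g. income per capita, kEUR
waste = 100 + 12 * income - 0.25 * income**2 + rng.normal(0, 2, size=500)

b2, b1, b0 = np.polyfit(income, waste, deg=2)  # highest degree first
turning_point = -b1 / (2 * b2)                 # income where waste peaks
```

An inverted-U finding corresponds to `b2 < 0` with a turning point inside the observed income range; the paper's point that few municipalities reach it means most observed incomes lie to the left of this peak.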