
    Space-Based Cosmic-Ray and Gamma-Ray Detectors: a Review

    Prepared for the 2014 ISAPP summer school, this review focuses on space-borne and balloon-borne cosmic-ray and gamma-ray detectors. It is meant to introduce the fundamental concepts necessary to understand the instrument performance metrics, how they tie to the design choices and how they can be effectively used in sensitivity studies. While the write-up does not aim to be complete or exhaustive, it is largely self-contained in that related topics, such as the basic physical processes governing the interaction of radiation with matter and the near-Earth environment, are briefly reviewed.
    Comment: 86 pages, 70 figures, prepared for the 2014 ISAPP summer school. Change log in the write-up, ancillary material at https://bitbucket.org/lbaldini/crdetector

    Alternative approaches to Long Term Care financing. Distributive implications and sustainability for Italy.

    In the last decade, many countries have adopted tax schemes specifically aimed at financing programs for Long Term Care (LTC). These mechanisms have important distributional implications both within and across generations. Given the process of demographic ageing, the issue of inter- and intra-generational fairness is deeply linked with the problem of the long-term financial equilibrium of an LTC fund. In this paper we first compare, on a microdata sample of the Italian population, the distributive effects (both on current income and across generations) of six alternative approaches to financing an LTC scheme. In particular, we consider a hypothetical LTC scheme (with a size equivalent to that of the German one) to be introduced in Italy and analyse the distributive implications of some tax options, taken from the financing mechanisms implemented or under discussion in Germany, Luxembourg, Japan and Italy. In the second part of the paper we move from a static to a dynamic perspective: we study the long-term sustainability of a hypothetical Pay-As-You-Go (PAYG) LTC scheme operating in Italy (that is, assuming the projected Italian demographic trends) under scenarios that consider alternative indexation rules, growth rates of GNP and the future incidence of disability among age groups.
    Keywords: long term care; distributive effects; tax-benefit model; intertemporal sustainability; trust fund
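    The PAYG balance logic behind such a sustainability exercise can be sketched in a few lines of Python. The figures and parameter names below are made-up placeholders for illustration only; this is not the tax-benefit model used in the paper, merely the equilibrium condition that outlays must equal contributions in every year.

    # Illustrative sketch (not the paper's model): the equilibrium contribution rate of a
    # Pay-As-You-Go LTC fund, i.e. the rate that balances projected benefit outlays against
    # the projected contribution base year by year. All numbers are hypothetical.
    def payg_equilibrium_rate(beneficiaries, avg_benefit, contribution_base,
                              benefit_indexation, base_growth):
        """Yearly contribution rate keeping the scheme in balance.

        beneficiaries      : projected number of LTC recipients per year
        avg_benefit        : benefit per recipient in the first year
        contribution_base  : aggregate taxable base in the first year
        benefit_indexation : yearly growth rate applied to benefits (e.g. prices vs wages)
        base_growth        : yearly growth rate of the contribution base (GNP growth)
        """
        rates = []
        for t, n_t in enumerate(beneficiaries):
            outlays = n_t * avg_benefit * (1 + benefit_indexation) ** t
            base = contribution_base * (1 + base_growth) ** t
            rates.append(outlays / base)
        return rates

    # Hypothetical scenario: rising disability incidence, price indexation, 1.5% GNP growth.
    print(payg_equilibrium_rate([2.0e6, 2.2e6, 2.5e6], 6000.0, 8.0e11, 0.02, 0.015))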

    Influence of Beam Broadening on the Accuracy of Radar Polarimetric Rainfall Estimation

    The quantitative estimation of rain rates using meteorological radar has been a major theme in radar meteorology and radar hydrology. The increase of interest in polarimetric radar is in part because polarization diversity can reduce the effect of raindrop size variability on radar precipitation estimates, which has allowed progress on radar rainfall estimation and on hydrometeorological applications. From an operational point of view, the promised improvements in radar rainfall accuracy have not yet been completely proven. The main reason behind these limits is the geometry of radar measurements combined with the variability of the spatial structure of precipitation systems. To overcome these difficulties, a methodology has been developed to transform the estimated drop size distribution (DSD) provided by a vertically pointing micro rain radar into a profile as seen by a ground-based polarimetric radar. As a result, the rainfall rate at the ground is fixed at all ranges, whereas the broadening beam encompasses a large variability of DSDs. The resulting DSD profile is used to simulate the corresponding profile of radar measurements at C band. Rainfall algorithms based on polarimetric radar measurements are then used to estimate the rainfall within the radar beam. Finally, merit factors are used to achieve a quantitative analysis of the performance of the rainfall algorithms in comparison with the corresponding measurements at the ground obtained from a 2D video disdrometer (2DVD) positioned beside the micro rain radar. In this method, the change in the merit factors with range is directly attributable to the DSD variability inside the radar measurement volume, thus providing an assessment of the effects due to beam broadening.
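    As a rough illustration of the evaluation logic described above, the sketch below computes the rain rate from a DSD and two simple merit factors (normalized bias and RMSE) of beam estimates against a ground reference. The formulas are standard textbook expressions and the numbers are hypothetical; this is not the paper's configuration or data.

    # Minimal sketch: rain rate from N(D) and per-range merit factors vs a 2DVD-like reference.
    import numpy as np

    def rain_rate(nd, d, dd, v=None):
        """Rain rate (mm/h) from a DSD N(D) [mm^-1 m^-3] over diameters d [mm], bin width dd [mm]."""
        if v is None:
            v = 3.78 * d ** 0.67          # Atlas-Ulbrich fall-speed approximation (m/s)
        return 6e-4 * np.pi * np.sum(nd * d ** 3 * v * dd)

    def merit_factors(r_beam, r_ground):
        """Normalized bias and RMSE of beam estimates against the ground reference."""
        r_beam, r_ground = np.asarray(r_beam), np.asarray(r_ground)
        nb = np.mean(r_beam - r_ground) / np.mean(r_ground)
        rmse = np.sqrt(np.mean((r_beam - r_ground) ** 2))
        return nb, rmse

    # Hypothetical numbers: estimates at increasing range drift from the fixed ground value.
    print(merit_factors(r_beam=[10.2, 9.1, 7.8], r_ground=[10.0, 10.0, 10.0]))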

    Complexity vs. performance in granular embedding spaces for graph classification

    The most distinctive trait of structural pattern recognition in the graph domain is the ability to deal with the organization of, and the relations between, the constituent entities of the pattern. Even if this can be convenient and/or necessary in many contexts, most state-of-the-art classification techniques cannot be deployed directly in the graph domain without first embedding graph patterns into a metric space. Granular Computing is a powerful information processing paradigm that can be employed to drive the synthesis of automatic embedding spaces from structured domains. In this paper we investigate several classification techniques built on Granular Computing-based embedding procedures and provide a thorough overview in terms of model complexity, embedding space complexity and performance on several open-access datasets for graph classification. We observe that certain classification techniques perform poorly both in terms of complexity and of learning performance, as in the case of non-linear SVMs, suggesting that the high dimensionality of the synthesized embedding space can negatively affect the effectiveness of these approaches. On the other hand, linear support vector machines, neuro-fuzzy networks and nearest-neighbour classifiers have comparable performance in terms of accuracy, with the second being the most competitive in terms of structural complexity and the latter being the most competitive in terms of embedding space dimensionality.
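    A minimal sketch of this kind of comparison, assuming an already-computed granular embedding matrix (replaced here by random placeholder data) and off-the-shelf scikit-learn classifiers; it is not the authors' experimental pipeline.

    # Compare classifiers on a (placeholder) symbolic-histogram embedding by cross-validated accuracy.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC, LinearSVC
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)
    X = rng.poisson(1.0, size=(200, 500)).astype(float)   # placeholder embedding (graphs x granules)
    y = rng.integers(0, 2, size=200)                      # placeholder class labels

    for name, clf in [("linear SVM", LinearSVC(dual=False)),
                      ("non-linear SVM", SVC(kernel="rbf")),
                      ("k-NN", KNeighborsClassifier(n_neighbors=5))]:
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"{name}: mean CV accuracy = {acc:.3f}")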

    Relaxed Dissimilarity-based Symbolic Histogram Variants for Granular Graph Embedding

    Graph embedding is an established and popular approach when designing graph-based pattern recognition systems. Amongst the several strategies, in the last ten years Granular Computing has emerged as a promising framework for structural pattern recognition. In the late 2000s, symbolic histograms were proposed as the driving force for the graph embedding procedure, counting the number of times each granule of information appears in the graph to be embedded. Similarly to a bag-of-words representation of a text corpus, symbolic histograms were originally conceived as integer-valued vectorial representations of the graphs. In this paper, we propose six 'relaxed' versions of symbolic histograms, in which the actual dissimilarity values between the information granules and the constituent parts of the graph to be embedded are taken into account, information which is discarded in the original symbolic histogram formulation due to the hard-limited nature of the counting procedure. Experimental results on six open-access datasets of fully-labelled graphs show comparable performance in terms of classification accuracy with respect to the original symbolic histograms (average accuracy shift ranging from -7% to +2%), counterbalanced by a great improvement in the number of resulting information granules, hence in the number of features in the embedding space (up to 75% fewer features, on average).
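    The contrast between hard counting and a dissimilarity-aware relaxation can be illustrated with a short sketch. The relaxation below is one possible formulation written for illustration; the paper proposes six specific variants that are not reproduced here.

    # Hard symbolic histogram: count substructures within a matching threshold of each granule.
    # 'Relaxed' variant: keep the actual dissimilarity information instead of a 0/1 count.
    import numpy as np

    def hard_histogram(dissimilarities, tau):
        """dissimilarities: (n_substructures, n_granules) matrix; tau: matching threshold."""
        return (dissimilarities <= tau).sum(axis=0).astype(float)

    def relaxed_histogram(dissimilarities, tau):
        """One possible relaxation: accumulate a similarity score instead of a hard count."""
        hits = dissimilarities <= tau
        scores = 1.0 - dissimilarities / tau          # 1 at a perfect match, 0 at the threshold
        return np.where(hits, scores, 0.0).sum(axis=0)

    d = np.array([[0.1, 0.9], [0.4, 0.2], [0.8, 0.6]])   # toy dissimilarities to 2 granules
    print(hard_histogram(d, tau=0.5))     # [2. 1.]
    print(relaxed_histogram(d, tau=0.5))  # approx [1.0, 0.6]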

    Intrusion detection in wi-fi networks by modular and optimized ensemble of classifiers

    With the breakthrough of pervasive advanced networking infrastructures and paradigms such as 5G and IoT, cybersecurity has become an active and crucial field in recent years. Furthermore, machine learning techniques are gaining more and more attention as prospective tools for mining (possibly malicious) packet traces and for the automatic synthesis of network intrusion detection systems. In this work, we propose a modular ensemble of classifiers for spotting malicious attacks on Wi-Fi networks. Each classifier in the ensemble is tailored to characterize a given attack class and is individually optimized by means of a genetic algorithm wrapper with the dual goal of tuning hyper-parameters and retaining only the features relevant to a specific attack class. Our approach also includes a novel false alarm management procedure thanks to a proper reliability measure formulation. The proposed system has been tested on the well-known AWID dataset, showing performance comparable with other state-of-the-art works both in terms of accuracy and knowledge discovery capabilities. Our system is also characterized by a modular design of the classification model, allowing new attack classes to be included in an efficient way.
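    The modular structure described above can be sketched schematically as follows. This is an assumed illustration: the genetic-algorithm wrapper is replaced by fixed placeholder hyper-parameters and feature subsets, and a simple probability threshold stands in for the paper's reliability measure.

    # One binary detector per attack class, each on its own feature subset; low-confidence
    # detections fall back to 'normal' (a crude stand-in for false alarm management).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    class ModularIDS:
        def __init__(self, attack_classes, feature_subsets, reliability_threshold=0.7):
            self.attack_classes = attack_classes
            self.feature_subsets = feature_subsets            # per-class selected feature indices
            self.reliability_threshold = reliability_threshold
            self.models = {c: RandomForestClassifier(n_estimators=100, random_state=0)
                           for c in attack_classes}

        def fit(self, X, y):
            for c in self.attack_classes:                     # one-vs-rest training per attack class
                self.models[c].fit(X[:, self.feature_subsets[c]], (y == c).astype(int))
            return self

        def predict(self, X):
            # Each detector votes with a probability; unreliable detections are suppressed.
            probs = np.column_stack(
                [self.models[c].predict_proba(X[:, self.feature_subsets[c]])[:, 1]
                 for c in self.attack_classes])
            best = probs.argmax(axis=1)
            reliable = probs.max(axis=1) >= self.reliability_threshold
            labels = np.array(self.attack_classes, dtype=object)[best]
            labels[~reliable] = "normal"
            return labels

    # Usage (hypothetical class names and feature subsets):
    # ModularIDS(["flooding", "injection", "impersonation"], feature_subsets).fit(X_train, y_train)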

    Structural optimization of automotive chassis: theory, set up, design

    Improvements in the design of structural components are often achieved on a trial-and-error basis guided by the designer's know-how. Although the designer's experience must remain a fundamental aspect of design, such an approach is likely to allow only marginal product enhancements. A different turn of mind that could boost structural design is needed, and it could be provided by structural optimization methods linked with finite element analyses. These methods are briefly introduced here, and some applications are presented and discussed with the aim of showing their potential. A particular focus is given to weight reduction in automotive chassis design applications, following the experience gained at MilleChili Lab.

    Impact of multiple radar reflectivity data assimilation on the numerical simulation of a flash flood event during the HyMeX campaign

    An analysis to evaluate the impact of assimilating multiple radar reflectivity data with a three-dimensional variational (3D-Var) system on a heavy precipitation event is presented. The main goal is to build a regionally tuned numerical prediction model and a decision-support system for environmental civil protection services, demonstrate it in the central Italian regions, and distinguish which type of observations, conventional or not (or a combination of them), is more effective in improving the accuracy of the forecasted rainfall. In that respect, during the first special observation period (SOP1) of the HyMeX (Hydrological cycle in the Mediterranean Experiment) campaign several intensive observing periods (IOPs) were launched, nine of which occurred in Italy. Among them, IOP4 was chosen for this study because of its low predictability regarding the exact location and amount of precipitation. This event hit central Italy on 14 September 2012, producing heavy precipitation and causing several cases of damage to buildings, infrastructure, and roads. Reflectivity data from three C-band Doppler radars running operationally during the event are assimilated using the 3D-Var technique to improve high-resolution initial conditions. In order to evaluate the impact of the assimilation procedure at different horizontal resolutions and to assess the impact of assimilating reflectivity data from multiple radars, several experiments using the Weather Research and Forecasting (WRF) model are performed. Finally, traditional verification scores such as accuracy, equitable threat score, false alarm ratio, and frequency bias, interpreted by analysing their uncertainty through bootstrap confidence intervals (CIs), are used to objectively compare the experiments, using rain gauge data as a benchmark.
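    For reference, the verification scores mentioned above have standard definitions based on the 2x2 contingency table. The sketch below computes them at a rain threshold and attaches a bootstrap confidence interval by resampling rain gauges; it is an illustrative implementation, not the one used in the study.

    # Dichotomous verification scores at a rain threshold, with a simple bootstrap CI.
    import numpy as np

    def scores(fcst, obs, threshold):
        f, o = np.asarray(fcst) >= threshold, np.asarray(obs) >= threshold
        hits = np.sum(f & o); fa = np.sum(f & ~o); miss = np.sum(~f & o); cn = np.sum(~f & ~o)
        n = hits + fa + miss + cn
        hits_rand = (hits + miss) * (hits + fa) / n           # hits expected by chance
        ets_den = hits + miss + fa - hits_rand
        return {
            "accuracy": (hits + cn) / n,
            "ETS": (hits - hits_rand) / ets_den if ets_den else np.nan,
            "FAR": fa / (hits + fa) if hits + fa else np.nan,
            "freq_bias": (hits + fa) / (hits + miss) if hits + miss else np.nan,
        }

    def bootstrap_ci(fcst, obs, threshold, score="ETS", n_boot=1000, alpha=0.05, seed=0):
        rng = np.random.default_rng(seed)
        fcst, obs = np.asarray(fcst), np.asarray(obs)
        vals = []
        for _ in range(n_boot):
            idx = rng.integers(0, len(obs), len(obs))         # resample gauges with replacement
            vals.append(scores(fcst[idx], obs[idx], threshold)[score])
        return np.nanpercentile(vals, [100 * alpha / 2, 100 * (1 - alpha / 2)])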

    Influence of Manufacturing Constraints on the Topology Optimization of an Automotive Dashboard

    Topology Optimization (TO) methods optimize material layout to design lightweight, high-performance products. However, when applied to components or assemblies with highly complex shapes, or to structures with a large number of parts, TO methods do not usually take into account the manufacturability of the optimized geometries; substantial further work is then required to engineer the product, at the risk of compromising the mass reduction achieved. Within an Industry 4.0 approach, we propose to evaluate manufacturing constraints from the early stages of conceptual design, so as to perform a TO that is coherent with the chosen manufacturing technology. Several TO approaches with different manufacturing constraints, such as casting and extrusion, are proposed and the solutions are compared. The optimal conceptual design is determined in order to minimize the component weight while satisfying both the structural targets and the manufacturing constraints; a case study on a high-performance sports car dashboard is finally presented.
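    For context, a generic density-based formulation of topology optimization with a volume limit and a casting draw-direction constraint can be written as below. This is a textbook-style sketch, not necessarily the exact formulation implemented by the authors or their solver.

    \begin{aligned}
    \min_{\boldsymbol{\rho}} \quad & c(\boldsymbol{\rho}) = \mathbf{u}^{\mathsf T}\mathbf{K}(\boldsymbol{\rho})\,\mathbf{u} && \text{(compliance)}\\
    \text{s.t.} \quad & \mathbf{K}(\boldsymbol{\rho})\,\mathbf{u} = \mathbf{f} && \text{(equilibrium)}\\
    & \textstyle\sum_e \rho_e v_e \le V_{\max} && \text{(volume / mass limit)}\\
    & \rho_e \ge \rho_{e'} \ \text{for every element } e' \text{ above } e \text{ along the draw direction} && \text{(casting constraint)}\\
    & 0 < \rho_{\min} \le \rho_e \le 1 && \text{(density bounds)}
    \end{aligned}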

    Static temperature guideline for comparative testing of sorption heat storage systems for building application

    The performance of sorption heat storage systems depends heavily on the operating temperature, yet testing temperatures reported in the literature vary widely. With respect to the building application for space heating, the reported testing temperatures are often outside the application scope and at times even incomplete. This has led to overestimation of application performance and prevents sound comparison between reports. This issue is addressed in this paper, and a remedy is pursued by proposing a static temperature and vapor pressure-based testing guideline for building-integrated sorption heat storage systems. By following this guideline, comparable testing results with respect to temperature gain, power and energy density become possible, in turn providing a measure for evaluating progress.