
    Fast Simulations of the ATLAS and CMS experiments at LHC

    The fast simulation of a high energy physics experiment is a tool used by experimentalists to quickly assess the potential of their detectors for a specific analysis or reconstruction technique, before embarking on a more time- and CPU-expensive detailed study with the full simulation. In some cases, it can also be considered the access point for theoreticians who want to see how their model would look in real life. The aim of this contribution is to introduce how fast simulations work in the ATLAS and CMS experiments at the LHC, and what the main differences are with respect to a full simulation. A comprehensive comparison of a few results obtained with the full and the fast simulation in CMS is also given, in order to provide an example of application of the two methods.
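
    At their core, fast simulations typically replace the detailed particle transport of a full simulation with parameterized detector response functions. The following minimal sketch illustrates the idea with Gaussian energy smearing; the resolution terms a and b are invented for illustration and are not actual ATLAS or CMS values.

```python
import numpy as np

rng = np.random.default_rng(42)

def smear_energy(e_true, a=0.10, b=0.01):
    """Parameterized calorimeter response: sigma(E)/E = a/sqrt(E) (+) b.

    a (stochastic term) and b (constant term) are hypothetical values,
    not taken from any real detector. Energies are in GeV.
    """
    sigma = np.sqrt(a**2 * e_true + b**2 * e_true**2)
    return rng.normal(e_true, sigma)

# Generator-level energies (GeV) smeared to "reconstructed" values.
e_gen = np.array([20.0, 50.0, 100.0, 250.0])
e_reco = smear_energy(e_gen)
print(np.column_stack([e_gen, e_reco]))
```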

    The spleen: a hub connecting nervous and immune systems in cardiovascular and metabolic diseases

    Metabolic disorders have been identified as major health problems affecting a large portion of the world population. In addition, obesity and insulin resistance are principal risk factors for the development of cardiovascular diseases. Altered immune responses are common features of both hypertension and obesity, and the involvement of the nervous system in the modulation of the immune system is gaining increasing attention in both pathophysiological contexts. For these reasons, during the last decades, researchers have focused their efforts on understanding the molecular mechanisms connecting the immune system to cardiovascular and metabolic diseases. On the other hand, it has been reported that in these pathological conditions, central neural pathways modulate the activity of the peripheral nervous system, which is strongly involved in the onset and progression of the disease. Interestingly, neural reflexes can also participate in the modulation of immune functions. In this scenario, the spleen becomes the crucial hub allowing the interaction of the different systems involved in metabolic and cardiovascular diseases. Here, we summarize the major findings that dissect the role of the immune system in disorders related to metabolic and cardiovascular dysfunctions, and how this could also be influenced by neural reflexes.

    On characterizations and tests of Benford’s law

    Benford's law defines a probability distribution for patterns of significant digits in real numbers. When the law is expected to hold for genuine observations, deviation from it can be taken as evidence of possible data manipulation. We derive results on a transform of the significand function that provide motivation for new tests of conformance to Benford's law exploiting its sum-invariance characterization. We also study the connection between sum invariance of the first digit and the corresponding marginal probability distribution. We approximate the exact distribution of the new test statistics through a computationally efficient Monte Carlo algorithm. We investigate the power of our tests under different alternatives and we point out relevant situations in which they are clearly preferable to the available procedures. Finally, we show the application potential of our approach in the context of fraud detection in international trade.
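
    For reference, Benford's law assigns first-digit probabilities P(d) = log10(1 + 1/d) for d = 1, ..., 9. The sketch below checks a sample against these probabilities with a plain chi-square goodness-of-fit test; it is a baseline illustration only, not the sum-invariance tests proposed in the paper.

```python
import numpy as np
from scipy import stats

def first_digits(x):
    """Extract the leading significant digit of each positive number."""
    x = np.abs(np.asarray(x, dtype=float))
    exponents = np.floor(np.log10(x))
    return (x / 10.0 ** exponents).astype(int)

# Benford first-digit probabilities: P(d) = log10(1 + 1/d), d = 1..9.
digits = np.arange(1, 10)
benford_p = np.log10(1.0 + 1.0 / digits)

# Example: a lognormal sample with large sigma is roughly Benford-like.
rng = np.random.default_rng(0)
sample = rng.lognormal(mean=0.0, sigma=2.5, size=5000)
observed = np.bincount(first_digits(sample), minlength=10)[1:]
chi2, pval = stats.chisquare(observed, benford_p * observed.sum())
print(f"chi2 = {chi2:.2f}, p-value = {pval:.3f}")
```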

    The Forward Search for Very Large Datasets

    The identification of atypical observations and the immunization of data analysis against both outliers and failures of modeling are important aspects of modern statistics. The forward search is a graphics-rich approach that leads to the formal detection of outliers and to the detection of model inadequacy combined with suggestions for model enhancement. The key idea is to monitor quantities of interest, such as parameter estimates and test statistics, as the model is fitted to data subsets of increasing size. In this paper we propose some computational improvements of the forward search algorithm and we provide a recursive implementation of the procedure which exploits the information of the previous step. The output is a set of efficient routines for fast updating of the model parameter estimates, which do not require any data sorting, and fast computation of likelihood contributions, which do not require matrix inversion or QR decomposition. It is shown that the new algorithms enable a reduction of the computation time by more than 80%. Furthermore, the running time now increases almost linearly with the sample size. All the routines described in this paper are included in the FSDA toolbox for MATLAB, which is freely downloadable from the internet.
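
    The monitoring loop of a forward search for regression can be sketched as follows. This is a naive illustration that refits by least squares at every step; the function and variable names are my own, and the recursive updating routines that make the paper's implementation fast are deliberately not reproduced here.

```python
import numpy as np

def forward_search(X, y, m0):
    """Naive forward search for linear regression (refits at every step).

    At each step m, fit OLS on the current subset of size m, then form
    the next subset from the m+1 observations with the smallest squared
    residuals. Returns the subset sizes and the maximum squared residual
    inside each subset, which can be monitored to flag outliers.
    """
    n = X.shape[0]
    # Crude (non-robust) initial subset: the m0 points with smallest
    # residuals from a full fit; robust starts such as LMS are preferable.
    beta0, *_ = np.linalg.lstsq(X, y, rcond=None)
    subset = np.argsort((y - X @ beta0) ** 2)[:m0]
    trace = []
    for m in range(m0, n):
        beta, *_ = np.linalg.lstsq(X[subset], y[subset], rcond=None)
        res2 = (y - X @ beta) ** 2
        trace.append((m, res2[subset].max()))
        subset = np.argsort(res2)[: m + 1]   # grow the subset by one unit
    return trace

# Example: regression data with a few gross outliers entering last.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=100)
y[-5:] += 10.0                               # contamination
for m, r2 in forward_search(X, y, m0=20)[-8:]:
    print(m, round(r2, 2))                   # jump signals outlier entry
```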

    A study of wall-injected flows into closed–open rectangular cylinders

    The present work concerns the study of the fluid motion within a channel when a gas is injected from its wall. This kind of flow appears in a wide variety of engineering applications and is realized with either low or high injection velocity. The most fruitful scientific production on this subject comes from the propulsion community. In fact, a Solid Rocket Motor (SRM) is a hollowed cylinder whose inner walls burn and hence inject gas into a channel. Thus, propulsive applications involve wall-injected flows and combustion. Actually, most of the combustion takes place in a very thin region near the propellant (the flame) and is often neglected in modeling. However, pressure and temperature are extremely high in a SRM. Therefore, due to these extreme conditions, real firing tests provide datasets limited to the number of probes. Usually, sophisticated pressure transducers withstanding high temperature, placed at the head end and aft end of the motor, and ultrasound pads are employed in order to gauge pressure oscillations and monitor the radial regression of the propellant surface. As stated in [1], it is extremely hard to equip a model rocket motor so as to allow visual access to its interior, and pointwise temperature measurements are of limited interest. A first simplification has been to employ a cold flow injected into a hollow cylinder. Although this has been very useful to confirm theoretically predicted instabilities, such experimental equipment was also limited to pointwise measurements. A further simplification leads to a closed-open rectangular cylinder, which can be seen as a two-dimensional approximation of what happens in three dimensions. Although this kind of configuration has been developed, even with visual access, it has mostly been used for visualizations, and no global flowfield measurements were conducted. In this work, a closed-open rectangular cylinder has been adopted in order to obtain a global velocity flowfield by means of Particle Image Velocimetry (PIV). The test section of the experiment is a 24 cm-long, 4 cm-wide and 2 cm-high rectangular channel into which a mixture of air and oil droplets, necessary for the PIV measurement, is injected through a porous material in order to simulate the propellant gas injection. Like the propellant heterogeneity, the injection due to the flame presents a non-trivial spatial pattern and unsteady temporal behavior. Therefore, the porosity of the adopted material has been investigated in order to understand whether it yields a phenomenon simulative of the propellant flame evolution and, if so, which mechanism is behind it. Since porosity implies morphological structures of the material on the scale of tens of microns, a high-resolution measuring technique is required for the analysis of the flow generated by injection through this porous material. Hence, Single Pixel Ensemble Correlation PIV, which has a resolution as small as one pixel, has been applied in addition to the more classical Window Correlation technique, whose resolution is instead related to the window size, usually around 16 pixels. The velocity field of the whole channel, reconstructed by means of Window Correlation PIV, is then compared with suitably chosen analytical models. The influence of the flow structure due to porosity has been analyzed.
    The injection Reynolds number, based on the injection velocity and the height of the channel, is around 100 for the present set-up, and the flowfield does not seem to show any transition to turbulence within the full length of the channel. The presence of corners and cavities is a fundamental point of interest for a better comprehension of the internal fluid dynamics of a SRM. Therefore, a second configuration has been taken into account. It presents a ninety-degree backward-facing step on the injecting wall, doubling the port area, and a subsequent non-porous, movable block forming a cavity. Investigations have been performed for different cavity lengths, with and without injection from the bottom of the cavity itself. This aspect is strictly related to the flow structure in the aft region of a segmented grain with a tubular segment at the head followed by a star-shaped segment. In fact, the star-shaped segment and the nozzle form a cavity that injects gas as long as the propellant of the star-shaped segment burns; thereafter the cavity becomes inert and the flow is due to the tubular segment. The capability of a numerical scheme to accurately resolve acoustic waves is fundamental in this circumstance, because acoustic interaction plays a crucial role in the fluid dynamics of cavities. Indeed, numerical simulations provide a more complete picture and will also be employed in order to compare with, and go beyond, the limitations of the experimental apparatus. A numerical code has been developed implementing high-order centered finite difference schemes for the compressible Navier-Stokes equations. Since the physical phenomenon is confined to a bounded region with several boundary conditions, the Navier-Stokes Characteristic Boundary Conditions (NSCBC) technique has been adopted. It consists in a local one-dimensional approximation near the boundary, where the method of characteristics is adopted in order to adequately compute the quantities deriving from the boundary conditions. Since a fourth-order finite difference scheme has been adopted, a selective low-pass filtering process was mandatory in order to suppress the naturally arising spurious oscillations. Moreover, the coefficients of the schemes and filters have been chosen so that the scheme minimizes the dispersion error, in order to resolve the acoustic waves as accurately as possible.
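
    As an illustration of the kind of analytical model such PIV data are typically compared against, the sketch below evaluates a Taylor-type inviscid similarity solution for planar injection-driven flow. This is my own illustrative choice (the abstract does not specify which analytical models are used), assuming uniform injection velocity V through one wall of a channel of height h, with y measured from the injecting wall; the values of V, h and the viscosity are example numbers chosen only to reproduce an injection Reynolds number of about 100.

```python
import numpy as np

def taylor_injection_flow(x, y, V=0.075, h=0.02):
    """Taylor-type inviscid similarity solution for planar flow driven by
    uniform injection through one wall (y = 0); the opposite wall (y = h)
    is impermeable. Continuity holds: du/dx + dv/dy = 0. The axial
    velocity u grows linearly with x as injected mass is turned axially.
    """
    eta = np.pi * y / (2.0 * h)
    u = (np.pi * V * x / (2.0 * h)) * np.sin(eta)  # axial velocity
    v = V * np.cos(eta)                            # wall-normal velocity
    return u, v

# Injection Reynolds number based on injection velocity and channel height.
nu_air = 1.5e-5                      # kinematic viscosity of air, m^2/s
V, h = 0.075, 0.02                   # example values giving Re_inj ~ 100
print("Re_inj =", V * h / nu_air)    # -> 100.0

x, y = 0.12, 0.01                    # a mid-channel point, in meters
u, v = taylor_injection_flow(x, y, V, h)
print(f"u = {u:.4f} m/s, v = {v:.4f} m/s")
```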

    Simulating mixtures of multivariate data with fixed cluster overlap in FSDA library

    We extend the capabilities of MixSim, a framework which is useful for evaluating the performance of clustering algorithms, on the basis of measures of agreement between data partitioning and flexible generation methods for data, outliers and noise. The peculiarity of the method is that data are simulated from normal mixture distributions on the basis of pre-specified synthesis statistics on an overlap measure, defined as a sum of pairwise misclassification probabilities. We provide new tools which enable us to control additional overlapping statistics and departures from homogeneity and sphericity among groups, together with new outlier contamination schemes. The output of this extension is a more flexible framework for generation of data to better address modern robust clustering scenarios in the presence of possible contamination. We also study the properties and the implications that this new way of simulating clustering data entails in terms of coverage of space, goodness of fit to theoretical distributions, and degree of convergence to nominal values. We demonstrate the new features using our MATLAB implementation that we have integrated in the Flexible Statistics for Data Analysis (FSDA) toolbox for MATLAB. With MixSim, FSDA now integrates in the same environment state-of-the-art robust clustering algorithms and principled routines for their evaluation and calibration. A spin-off of our work is a general and complex routine, translated from C to MATLAB, to compute the distribution function of a linear combination of noncentral chi-squared random variables, which is at the core of MixSim and is of independent interest for many test statistics.
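
    The overlap measure at the heart of MixSim is a sum of pairwise misclassification probabilities. For two univariate Gaussian clusters with equal mixing proportions and a common variance, these probabilities have a closed form; the sketch below shows this deliberately simplified special case, including calibrating the separation to hit a pre-specified overlap as MixSim does for its synthesis statistics. It is not the general algorithm implemented in MixSim/FSDA.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def pairwise_overlap(mu1, mu2, sigma):
    """Sum of the two misclassification probabilities for a balanced
    two-component Gaussian mixture with common variance.

    A point from component 1 is misclassified when it falls on the far
    side of the Bayes boundary, here the midpoint of the two means.
    """
    delta = abs(mu2 - mu1)
    w = norm.cdf(-delta / (2.0 * sigma))  # one-sided misclassification
    return 2.0 * w                        # overlap = w(1|2) + w(2|1)

# Solve for the separation delta that yields a target overlap of 0.05.
target = 0.05
delta = brentq(lambda d: pairwise_overlap(0.0, d, 1.0) - target, 1e-6, 20.0)
print(f"separation giving overlap {target}: delta = {delta:.4f}")  # ~3.92
```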

    The Robust Estimation of Monthly Prices of Goods Traded by the European Union

    The general problem addressed in this document is the estimation of “fair” import prices from international trade data. The work is in support of the determination of the customs value at the moment of the customs formalities, to establish how much duty the importer must pay, and of the post-clearance checks of individual transactions. The proposed approach can be naturally extended to the analysis of export flows and used for other purposes, including general market analyses. The Joint Research Centre of the European Commission has previously addressed (Arsenis et al., 2015) the trade price estimation problem by considering data for fixed product, origin and destination over a multiannual time period, typically of 3 or 4 years, leading to price estimates that are specific to each EU Member State. This report illustrates a different model whereby each price estimate is calculated on a monthly basis, using data for fixed time (month), product and origin. The approach differentiates between trades originating from different third countries and is therefore particularly useful for monitoring trends and anomalies in specific EU trade markets. These Estimated European Monthly Prices are published every month by the Joint Research Centre in a dedicated section of the THESEUS website (https://theseus.jrc.ec.europa.eu), accessible by authorized users of the EU and Member State services. The section, called Monthly Fair Prices, also shows the time evolution of worldwide price estimates computed with the same approach by fixing only time and product.
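
    The grouping logic of a monthly, product- and origin-specific price estimate can be illustrated with a toy computation. The median is used here purely for illustration as a simple robust location estimate; the report's actual estimator is not reproduced, and the product code and figures below are invented.

```python
import pandas as pd

# Toy transaction records: one row per import declaration.
trades = pd.DataFrame({
    "month":   ["2023-01"] * 4 + ["2023-02"] * 3,
    "product": ["080810"] * 7,            # hypothetical HS product code
    "origin":  ["CN"] * 7,
    "value":   [1000.0, 1100.0, 950.0, 5000.0, 1020.0, 990.0, 1010.0],
    "weight":  [100.0, 105.0, 98.0, 100.0, 101.0, 99.0, 100.0],  # kg
})

# Unit price of each transaction, then a robust (median) estimate per
# month-product-origin cell; the 5000/100 record is an obvious anomaly
# that a mean would absorb but the median resists.
trades["unit_price"] = trades["value"] / trades["weight"]
fair = trades.groupby(["month", "product", "origin"])["unit_price"].median()
print(fair)
```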

    A new family of tempered distributions

    Tempered distributions have received considerable attention, both from a theoretical point of view and in several important application fields. The most popular choice is perhaps the Tweedie model, which is obtained by tempering the Positive Stable distribution. Through tempering, we suggest a very flexible four-parameter family of distributions that generalizes the Tweedie model and that could be applied to data sets of non-negative observations with complex (and difficult to accommodate) features. We derive the main theoretical properties of our proposal, through which we show its wide application potential. We also embed our proposal within the theory of Lévy processes, thus providing a strengthened probabilistic motivation for its introduction. Furthermore, we derive a series expansion for the probability density function which allows us to develop algorithms for fitting the distribution to data. We finally provide applications to challenging real-world examples taken from international trade.
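
    For context, the Tweedie model that the proposed family generalizes can be simulated exactly in the compound Poisson-gamma regime (power parameter 1 < p < 2), under the standard mean-power-variance parameterization Var(Y) = phi * mu^p. The sketch below illustrates this baseline Tweedie model only, not the new four-parameter family.

```python
import numpy as np

def rtweedie(n, mu, phi, p, rng=None):
    """Simulate Tweedie(mu, phi, p) for 1 < p < 2 via its exact
    compound Poisson-gamma representation: Y is a sum of N gamma
    jumps with N ~ Poisson(lam), so Y has an atom at zero.
    """
    assert 1.0 < p < 2.0
    rng = rng or np.random.default_rng()
    lam = mu ** (2.0 - p) / (phi * (2.0 - p))      # Poisson rate
    alpha = (2.0 - p) / (p - 1.0)                  # gamma shape per jump
    theta = phi * (p - 1.0) * mu ** (p - 1.0)      # gamma scale
    counts = rng.poisson(lam, size=n)
    # Sum of k iid Gamma(alpha, theta) jumps is Gamma(k * alpha, theta).
    return np.array([rng.gamma(alpha * k, theta) if k else 0.0
                     for k in counts])

y = rtweedie(100_000, mu=2.0, phi=1.0, p=1.5, rng=np.random.default_rng(7))
# Sample moments should approach mu = 2.0 and phi * mu**p ~ 2.83,
# with P(Y = 0) = exp(-lam) ~ 0.059.
print(y.mean(), y.var(), (y == 0).mean())
```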

    Data Analytics for Credit Risk Models in Retail Banking: a new era for the banking system

    Given the nature of the lending industry and its importance for global economic stability, financial institutions have always been keen on estimating the risk profile of their clients. For this reason, in the last few years several sophisticated techniques for modelling credit risk have been developed and implemented. After the financial crisis of 2007-2008, credit risk management has been further expanded and has acquired significant regulatory importance. Specifically, the Basel II and III Accords have strengthened the conditions that banks must fulfil to develop their own internal models for estimating the regulatory capital and expected losses. After motivating the importance of credit risk modelling in the banking sector, in this contribution we review the traditional statistical methods used for credit risk management. Then we focus on more recent approaches based on Machine Learning, and we critically compare tradition and innovation in credit risk modelling. Finally, we present a case study addressing the main steps needed to practically develop and validate a Probability of Default model for risk prediction via Machine Learning techniques.
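
    The main steps of such a Probability of Default pipeline can be sketched as follows: fit a traditional logistic regression benchmark and a Machine Learning challenger, then validate their discriminatory power. This is a generic scikit-learn illustration on synthetic data, not the case study from the paper.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic borrower data standing in for real credit bureau features.
X, y = make_classification(n_samples=5000, n_features=10, weights=[0.9],
                           random_state=0)   # ~10% defaults (y = 1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

# Traditional benchmark: scorecard-style logistic regression.
logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
# Machine Learning challenger: gradient boosting.
gbm = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

# Validation: discriminatory power via AUC (Gini = 2 * AUC - 1).
for name, model in [("logit", logit), ("gbm", gbm)]:
    pd_hat = model.predict_proba(X_te)[:, 1]   # estimated PDs
    auc = roc_auc_score(y_te, pd_hat)
    print(f"{name}: AUC = {auc:.3f}, Gini = {2 * auc - 1:.3f}")
```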