Non-convex scenario optimization
Scenario optimization is an approach to data-driven decision-making that was introduced some fifteen years ago and has grown rapidly ever since. Its most remarkable feature is that it blends the heuristic nature of data-driven methods with a rigorous theory that allows one to gain factual, reliable insight into the solution. The usability of the scenario theory, however, has so far been restrained by the fact that most results rest on the assumption of convexity. With this paper, we aim to free the theory from this limitation. Specifically, we focus on the body of results known under the name of “wait-and-judge” and show that its fundamental achievements remain valid in a non-convex setup. While optimization is a major center of attention, this paper travels beyond it and into data-driven decision-making. Adopting such a broad framework opens the door to building a new theory of truly vast applicability.
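To make the setting concrete, the following is a minimal sketch of a sampled (scenario) program with a "wait-and-judge"-style count of support scenarios performed a posteriori; the demand distribution, the sample size and the toy one-dimensional decision are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch of the scenario approach: draw N i.i.d. scenarios, solve the
# sampled program, and count the support scenarios a posteriori in the
# "wait-and-judge" spirit.  All problem data below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N = 500                                                # number of i.i.d. scenarios
demands = rng.lognormal(mean=1.0, sigma=0.5, size=N)   # hypothetical uncertain demands

# Scenario program: choose the smallest capacity x satisfying every sampled
# constraint x >= d_i.  Its solution is simply the sample maximum.
x_star = demands.max()

# "Wait-and-judge": count the support scenarios, i.e. scenarios whose removal
# would change the solution.  Here only the maximal demand is of support.
support = int(np.sum(np.isclose(demands, x_star)))
print(f"decision x* = {x_star:.3f}, support scenarios = {support}")
```

The scenario theory then converts the observed number of support scenarios into a bound on the probability that a new, unseen scenario violates the computed decision.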
Lattice Simulation of Nuclear Multifragmentation
Motivated by the decade-long debate over the issue of criticality supposedly observed in nuclear multifragmentation, we propose a dynamical lattice model to simulate the phenomenon. Its Ising Hamiltonian mimics a short-range attractive interaction which competes with a thermal-like dissipative process. The results presented here, generated through an event-by-event analysis, are in agreement with both experiment and the results produced by a percolative (non-dynamical) model.
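As a rough illustration of the kind of short-range attractive Ising Hamiltonian mentioned above, the sketch below runs a plain Metropolis simulation on a 2D lattice; the lattice size, coupling and temperature are illustrative choices, and the paper's dissipative dynamics and event-by-event analysis are not reproduced.

```python
# Generic Metropolis sweeps over a 2D Ising lattice with nearest-neighbour
# coupling J.  Parameters are illustrative, not the paper's.
import numpy as np

L, J, T = 32, 1.0, 2.0
rng = np.random.default_rng(1)
spins = rng.choice([-1, 1], size=(L, L))   # occupied/empty sites in lattice-gas language

def sweep(spins):
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        # Energy change for flipping site (i, j), periodic boundary conditions
        nn = spins[(i + 1) % L, j] + spins[(i - 1) % L, j] \
           + spins[i, (j + 1) % L] + spins[i, (j - 1) % L]
        dE = 2.0 * J * spins[i, j] * nn
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1

for _ in range(100):
    sweep(spins)
print("mean magnetisation:", spins.mean())
```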
The Hough Transform and the Impact of Chronic Leukemia on the Compact Bone Tissue from CT-Images Analysis
Computational analysis of X-ray Computed Tomography (CT) images allows the assessment of alteration of bone structure in adult patients with Advanced Chronic Lymphocytic Leukemia (ACLL), and may even offer a powerful tool to assess the development of the disease (prognostic potential). The crucial requirement for this kind of analysis is the application of a pattern recognition method able to accurately segment the intra-bone space in clinical CT images of the human skeleton. Our purpose is to show how this task can be accomplished by a procedure based on the use of the Hough transform technique for special families of algebraic curves. The dataset used for this study is composed of sixteen subjects: eight control subjects, one ACLL survivor, and seven ACLL victims. We apply the Hough transform approach to the set of CT images of appendicular bones for detecting the compact and trabecular bone contours by using ellipses, and we use the computed semi-axes values to infer information on bone alterations in the population affected by ACLL. The effectiveness of this method is assessed by comparison against ground truth. We show that features depending on the semi-axes values detect a statistically significant difference between the class of control subjects plus the ACLL survivor and the class of ACLL victims.
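As an illustration of ellipse detection with a Hough-type transform, the sketch below uses scikit-image's generic hough_ellipse on a single CT slice; the file name and parameter values are illustrative assumptions, and the authors' own Hough implementation for special families of algebraic curves is not reproduced.

```python
# Minimal ellipse-Hough sketch on one CT slice (hypothetical file name and
# illustrative parameters; not the paper's implementation).
import numpy as np
from skimage import io, feature
from skimage.transform import hough_ellipse

slice_img = io.imread("ct_slice.png", as_gray=True)      # hypothetical CT slice
edges = feature.canny(slice_img, sigma=2.0)

# Accumulate votes for candidate ellipses within a plausible bone-size range
result = hough_ellipse(edges, accuracy=10, threshold=50, min_size=20, max_size=120)
result.sort(order="accumulator")

best = result[-1]                                         # highest-voted ellipse
yc, xc, a, b = (int(round(best[f])) for f in ("yc", "xc", "a", "b"))
print(f"contour centre = ({xc},{yc}), semi-axes = ({a},{b})")
```

The fitted semi-axes of the compact- and trabecular-bone contours are the quantities from which the bone-alteration features are then derived.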
Sign-Perturbed Sums (SPS) with Asymmetric Noise: Robustness Analysis and Robustification Techniques
Sign-Perturbed Sums (SPS) is a recently developed finite-sample system identification method that can build exact confidence regions for linear regression problems under mild statistical assumptions. The regions are well-shaped, e.g., they are centred around the least-squares (LS) estimate, star-convex and strongly consistent. One of the main assumptions of SPS is that the distribution of the noise terms is symmetric about zero. This paper analyses how robust SPS is with respect to the violation of this assumption and how it could be robustified against non-symmetric noises. First, some alternative solutions are reviewed; then a robustness analysis is performed, resulting in a robustified version of SPS. We also suggest a modification of SPS, called LAD-SPS, which builds exact confidence regions around the least-absolute-deviations (LAD) estimate instead of the LS estimate. LAD-SPS requires fewer assumptions, as the noise only needs to have a conditionally zero median (w.r.t. the past). Furthermore, that approach can also be robustified using similar ideas as in the LS-SPS case. Finally, some numerical experiments are presented.
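For orientation, the following is a minimal sketch of the basic SPS membership test for a candidate parameter under the symmetric-noise assumption; the data generation, the shaping via an inverse Cholesky factor and the simplified tie handling are illustrative choices, not the paper's robustified or LAD constructions.

```python
# Minimal SPS indicator test: a candidate theta is accepted when the reference
# sum is not among the q largest of the m shaped sums.  Confidence is 1 - q/m.
# Ties are ignored for simplicity.
import numpy as np

rng = np.random.default_rng(2)
n, m, q = 100, 20, 1                            # sample size, number of sums, rank threshold

phi = rng.normal(size=(n, 2))                   # regressors
theta_true = np.array([1.0, -0.5])
y = phi @ theta_true + rng.standard_normal(n)   # symmetric noise (key SPS assumption)

def sps_accepts(theta):
    residuals = y - phi @ theta
    R = phi.T @ phi / n
    shape = np.linalg.inv(np.linalg.cholesky(R))    # inverse Cholesky factor (shaping)
    norms = []
    for i in range(m):
        signs = np.ones(n) if i == 0 else rng.choice([-1.0, 1.0], size=n)
        s = shape @ (phi.T @ (signs * residuals)) / n
        norms.append(s @ s)
    # Accept theta when the reference sum (i = 0) is not among the q largest
    rank = sum(norms[0] > v for v in norms[1:])
    return rank < m - q

print("true parameter accepted:", sps_accepts(theta_true))
```

With m = 20 and q = 1, the true parameter is rejected with probability exactly q/m = 5% under symmetric noise, which is what makes the regions exact for any finite sample size.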
Asymptotic properties of SPS confidence regions
Sign-Perturbed Sums (SPS) is a system identification method that constructs non-asymptotic confidence regions for the parameters of linear regression models under mild statistical assumptions. One of its main features is that, for any finite number of data points and any user-specified probability, the constructed confidence region contains the true system parameter with exactly the user-chosen probability. In this paper we examine the size and the shape of the confidence regions, and we show that the regions are strongly consistent, i.e., they almost surely shrink around the true parameter as the number of data points increases. Furthermore, the confidence region is contained in a marginally inflated version of the confidence ellipsoid obtained from the asymptotic system identification theory. The results are also illustrated by a simulation example
Highly Automated Dipole EStimation (HADES)
Automatic estimation of current dipoles from biomagnetic data is still a problematic task. This is due not only to the ill-posedness of the inverse problem but also to two intrinsic difficulties introduced by the dipolar model: the unknown number of sources and the nonlinear relationship between the source locations and the data. Recently, we have developed a new Bayesian approach, particle filtering, based on dynamical tracking of the dipole constellation. Contrary to many dipole-based methods, particle filtering does not assume stationarity of the source configuration: the number of dipoles and their positions are estimated and updated dynamically during the course of the MEG sequence. We have now developed a Matlab-based graphical user interface, which allows non-expert users to perform automatic dipole estimation from MEG data with particle filtering. In the present paper, we describe the main features of the software and show the analysis of both a synthetic dataset and an experimental dataset.
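To convey the flavour of the particle-filtering idea, the sketch below runs a generic bootstrap particle filter on a heavily simplified problem: a single source reduced to a scalar location, with a toy nonlinear forward model standing in for the MEG lead field; none of this reproduces the HADES implementation, and all names and parameter values are illustrative.

```python
# Generic bootstrap particle filter tracking a scalar "source location" from
# noisy measurements; a toy stand-in for dynamical dipole tracking.
import numpy as np

rng = np.random.default_rng(3)
T, P = 50, 2000                          # time samples, number of particles

def forward(loc):
    return np.sin(loc)                   # toy nonlinear forward model

# Simulated data from a slowly drifting source location
true_loc = np.cumsum(rng.normal(0, 0.05, size=T)) + 1.0
data = forward(true_loc) + rng.normal(0, 0.1, size=T)

particles = rng.uniform(0.0, 2.0, size=P)           # initial location hypotheses
estimates = []
for t in range(T):
    particles += rng.normal(0, 0.05, size=P)        # propagate: random-walk dynamics
    w = np.exp(-0.5 * ((data[t] - forward(particles)) / 0.1) ** 2)
    w /= w.sum()                                     # likelihood weights
    estimates.append(np.sum(w * particles))          # posterior-mean estimate
    idx = rng.choice(P, size=P, p=w)                 # resample (bootstrap step)
    particles = particles[idx]

print("final location estimate:", estimates[-1], "true:", true_loc[-1])
```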
Parametric cost modelling of components for turbomachines: Preliminary study
Ever-increasing competitiveness, due to market globalisation, has forced industries to modify their design and production strategies. Hence, it is crucial to estimate and optimise costs as early as possible, since any later changes will negatively impact the redesign effort and lead time. This paper aims to compare different parametric cost estimation methods that can be used for analysing mechanical components. The current work presents a cost estimation methodology which uses non-historical data for the database population. The database is populated using should-cost data obtained from analytical cost models implemented in a cost estimation software tool. The paper then compares different parametric cost modelling techniques (artificial neural networks, deep learning, random forest and linear regression) to define the best one for industrial components. These methods have been tested on nine axial compressor discs of different dimensions. Then, by considering other materials and batch sizes, it was possible to reach a training dataset of 90 records. From the analysis carried out in this work, it is possible to conclude that machine learning techniques are a valid alternative to traditional linear regression.
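As an illustration of the kind of model comparison described above, the sketch below cross-validates a linear regression against a random forest on synthetic component features; the features, the synthetic "should cost" and the sample size of 90 only loosely mirror the paper's setting and are assumptions made for illustration.

```python
# Compare two parametric cost models on synthetic stand-in data (illustrative
# features and costs, not the paper's should-cost database).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n = 90
X = np.column_stack([
    rng.uniform(200, 800, n),        # disc diameter [mm]
    rng.uniform(5, 80, n),           # mass [kg]
    rng.choice([1.0, 1.6, 2.3], n),  # material cost factor
    rng.choice([1, 10, 50], n),      # batch size
])
# Synthetic "should cost": material + machining terms, batch discount, noise
y = 3.0 * X[:, 1] * X[:, 2] + 0.05 * X[:, 0] - 20.0 * np.log(X[:, 3]) + rng.normal(0, 10, n)

for name, model in [("linear regression", LinearRegression()),
                    ("random forest", RandomForestRegressor(n_estimators=200, random_state=0))]:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean cross-validated R^2 = {r2:.2f}")
```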
Sc substitution for Mg in MgB2: effects on Tc and Kohn anomaly
Here we report the synthesis and characterization of Mg_{1-x}Sc_{x}B_{2} (0.12 < x < …) samples with T_{c} > 6 K. We find that the Sc doping moves the chemical potential through the 2D/3D electronic topological transition (ETT) in the sigma band, where the “shape resonance” of interband pairing occurs. In the 3D regime beyond the ETT we observe a hardening of the E_{2g} Raman mode with a significant line-width narrowing, due to suppression of the Kohn anomaly over the range 0 < q < 2k_{F}.