Sifting data in the real world
In the real world, experimental data are rarely, if ever, distributed as a
normal (Gaussian) distribution. As an example, a large set of data--such as the
cross sections for particle scattering as a function of energy contained in the
archives of the Particle Data Group--is a compendium of all published data, and
hence, unscreened. Inspection of similar data sets quickly shows that, for many
reasons, these data sets have many outliers--points well beyond what is
expected from a normal distribution--thus ruling out the use of conventional
techniques. This note suggests an adaptive algorithm that allows a
phenomenologist to apply to the data sample a sieve whose mesh is coarse enough
to let the background fall through, but fine enough to retain the preponderance
of the signal, thus sifting the data. A prescription is given for finding a
robust estimate of the best-fit model parameters in the presence of a noisy
background, together with a robust estimate of the model parameter errors, as
well as a determination of the goodness-of-fit of the data to the theoretical
hypothesis. Extensive computer simulations are carried out to test the
algorithm for both its accuracy and stability under varying background
conditions.
Comment: 29 pages, 13 figures. Version to appear in Nucl. Instr. & Meth.
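The sieve idea can be illustrated with a minimal sketch, not the paper's exact prescription: fit the model, discard points whose individual chi-square contribution exceeds a cutoff, and refit until the retained set stabilizes. The straight-line model, the cutoff value, and the synthetic data below are illustrative assumptions.

```python
import numpy as np

def sieve_fit(x, y, sigma, fit_fn, eval_fn, cutoff=6.0, max_iter=20):
    """Iteratively fit, then sieve out points whose individual chi^2
    contribution exceeds `cutoff`, until the retained set stabilizes."""
    keep = np.ones(len(x), dtype=bool)
    params = None
    for _ in range(max_iter):
        params = fit_fn(x[keep], y[keep], sigma[keep])
        chi2_i = ((y - eval_fn(params, x)) / sigma) ** 2
        new_keep = chi2_i < cutoff
        if np.array_equal(new_keep, keep):
            break
        keep = new_keep
    return params, keep

# Toy example: straight-line signal plus a sprinkling of background outliers.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)
sigma = np.full_like(x, 0.5)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.5, x.size)
y[::25] += 15.0  # background points far outside the Gaussian expectation

fit_fn = lambda xs, ys, ss: np.polyfit(xs, ys, 1, w=1.0 / ss)  # weighted LSQ
eval_fn = lambda p, xs: np.polyval(p, xs)
params, keep = sieve_fit(x, y, sigma, fit_fn, eval_fn)
```

The coarse mesh is the chi-square cutoff: large enough that outlying background falls through, small enough that the bulk of the signal is retained for the final robust fit.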
Network On Network for Tabular Data Classification in Real-world Applications
Tabular data is the most common data format adopted by our customers, ranging from retail and finance to e-commerce, and tabular data classification plays an essential role in their businesses. In this paper, we present Network On
Network (NON), a practical tabular data classification model based on deep
neural network to provide accurate predictions. Various deep methods have been
proposed and promising progress has been made. However, most of them use
operations like neural network and factorization machines to fuse the
embeddings of different features directly, and linearly combine the outputs of
those operations to get the final prediction. As a result, the intra-field
information and the non-linear interactions between those operations (e.g.
neural network and factorization machines) are ignored. Intra-field information is the information shared by the features within a field by virtue of belonging to that same field.
NON is proposed to take full advantage of intra-field information and
non-linear interactions. It consists of three components: a field-wise network at the bottom to capture the intra-field information, an across-field network in the middle to choose suitable operations in a data-driven manner, and an operation fusion network on top to fuse the outputs of the chosen operations deeply. Extensive
experiments on six real-world datasets demonstrate NON can outperform the
state-of-the-art models significantly. Furthermore, both qualitative and quantitative studies of the features in the embedding space show that NON captures intra-field information effectively.
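Read as a single forward pass, the three components stack as follows. This sketch uses toy layer sizes, random weights, and two stand-in operations (a linear fusion and an FM-style element-wise product); none of these choices are the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(1)
relu = lambda z: np.maximum(z, 0.0)

# Toy setup: 3 fields, each a 4-dim feature embedding, batch of 2.
fields = [rng.normal(size=(2, 4)) for _ in range(3)]

# 1) Field-wise network: a small per-field transform captures
#    intra-field information before any fields are mixed.
W_field = [rng.normal(size=(4, 4)) for _ in range(3)]
hidden = [relu(f @ W) for f, W in zip(fields, W_field)]

# 2) Across-field network: candidate "operations" combine the fields;
#    a linear fusion and an element-wise product stand in here.
concat = np.concatenate(hidden, axis=1)            # shape (2, 12)
op_linear = relu(concat @ rng.normal(size=(12, 4)))
op_product = hidden[0] * hidden[1] * hidden[2]     # shape (2, 4)

# 3) Operation fusion network: a further non-linear layer fuses the
#    operations' outputs instead of just summing them linearly.
fused = np.concatenate([op_linear, op_product], axis=1)   # shape (2, 8)
logits = relu(fused @ rng.normal(size=(8, 4))) @ rng.normal(size=(4, 1))
probs = 1.0 / (1.0 + np.exp(-logits))              # (2, 1) class predictions
```

The point of stage 3 is visible in the last two lines: the operations' outputs pass through another non-linear layer rather than being linearly combined, which is exactly the interaction the abstract says prior models ignore.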
Using Simulation and Domain Adaptation to Improve Efficiency of Deep Robotic Grasping
Instrumenting and collecting annotated visual grasping datasets to train
modern machine learning algorithms can be extremely time-consuming and
expensive. An appealing alternative is to use off-the-shelf simulators to
render synthetic data for which ground-truth annotations are generated
automatically. Unfortunately, models trained purely on simulated data often
fail to generalize to the real world. We study how randomized simulated
environments and domain adaptation methods can be extended to train a grasping
system to grasp novel objects from raw monocular RGB images. We extensively
evaluate our approaches with a total of more than 25,000 physical test grasps,
studying a range of simulation conditions and domain adaptation methods,
including a novel extension of pixel-level domain adaptation that we term the
GraspGAN. We show that, by using synthetic data and domain adaptation, we are
able to reduce the number of real-world samples needed to achieve a given level
of performance by up to 50 times, using only randomly generated simulated
objects. We also show that by using only unlabeled real-world data and our
GraspGAN methodology, we obtain real-world grasping performance without any
real-world labels that is similar to that achieved with 939,777 labeled
real-world samples.
Comment: 9 pages, 5 figures, 3 tables
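Pixel-level domain adaptation of this kind is typically trained with a composite generator objective: an adversarial term to make adapted images look real, the downstream task loss, and a content term keeping the adapted image close to the simulated scene. The function below is a hedged illustration of that composition; the term names, weights, and losses are assumptions, not the published GraspGAN objective.

```python
import numpy as np

def generator_loss(d_fake, task_pred, task_label, sim_img, adapted_img,
                   w_adv=1.0, w_task=1.0, w_content=0.1):
    """Composite generator loss for a pixel-level domain-adaptation sketch."""
    adv = -np.mean(np.log(d_fake + 1e-8))            # fool the discriminator
    task = np.mean((task_pred - task_label) ** 2)    # grasp-prediction loss
    content = np.mean((adapted_img - sim_img) ** 2)  # stay close to the sim scene
    return w_adv * adv + w_task * task + w_content * content

# Toy call: discriminator scores on adapted images, a task prediction,
# and a simulated/adapted image pair.
loss = generator_loss(
    d_fake=np.array([0.5, 0.9]),
    task_pred=np.array([0.2]), task_label=np.array([0.0]),
    sim_img=np.zeros((2, 2)), adapted_img=np.full((2, 2), 0.1),
)
```

Balancing the content weight against the adversarial weight is what lets the generator change low-level appearance toward the real domain without destroying the geometry the grasping task depends on.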
Real-world heart rate norms in the Health eHeart study.
Emerging technology allows patients to measure and record their heart rate (HR) remotely by photoplethysmography (PPG) using smart devices like smartphones. However, the validity and expected distribution of such measurements are unclear, making it difficult for physicians to help patients interpret real-world, remote and on-demand HR measurements. Our goal was to validate HR-PPG, measured using a smartphone app, against HR-electrocardiogram (ECG) measurements and describe out-of-clinic, real-world, HR-PPG values according to age, demographics, body mass index, physical activity level, and disease. To validate the measurements, we obtained simultaneous HR-PPG and HR-ECG in 50 consecutive patients at our cardiology clinic. We then used data from participants enrolled in the Health eHeart cohort between 1 April 2014 and 30 April 2018 to derive real-world norms of HR-PPG according to demographics and medical conditions. HR-PPG and HR-ECG were highly correlated (intraclass correlation = 0.90). A total of 66,788 Health eHeart Study participants contributed 3,144,332 HR-PPG measurements. The mean real-world HR was 79.1 ± 14.5 bpm. The 95th percentile of real-world HR was ≤110 bpm in individuals aged 18-45, ≤100 bpm in those aged 45-60, and ≤95 bpm in individuals older than 60 years. In multivariable linear regression, the number of medical conditions, female gender, increasing body mass index, and being Hispanic were associated with an increased HR, whereas increasing age was associated with a reduced HR. Our study provides the largest set of real-world norms for remotely obtained HR across various strata, which may help physicians interpret and engage with patients presenting such data.
Leaky Bucket in the Real World: Estimating Inequality Aversion Using Survey Data
Existing evidence of inequality aversion relies on data from classroom experiments where subjects face hypothetical questions. This paper estimates the magnitude of inequality aversion using representative survey data, with questions related to the real-economy situations the respondents face. The results reveal that the magnitude of inequality aversion can be measured in a meaningful way using survey data, but the estimates depend dramatically on the framing of the question. No matter how measured, the revealed inequality aversion predicts opinions on a wide range of questions related to the welfare state, such as the level of taxation, tax progressivity and the structure of unemployment benefits.
Keywords: inequality aversion, social welfare functions, welfare state
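The leaky-bucket thought experiment behind the title can be made concrete with an Atkinson-type social welfare function: with inequality-aversion parameter ε, a marginal transfer from a rich person with income y_r to a poor person with income y_p raises welfare as long as at least (y_p/y_r)^ε of it reaches the recipient. This is the standard textbook illustration, not the paper's survey-based estimation procedure.

```python
def min_retained_fraction(y_poor, y_rich, epsilon):
    """Smallest fraction of a marginal rich-to-poor transfer that must
    survive the 'leaky bucket' for an Atkinson social welfare function
    with inequality-aversion parameter epsilon to favor the transfer.

    Derivation: marginal social utility of income is y**(-epsilon), so
    the transfer helps iff t * y_poor**(-epsilon) >= y_rich**(-epsilon),
    i.e. t >= (y_poor / y_rich)**epsilon.
    """
    return (y_poor / y_rich) ** epsilon

# No aversion (epsilon = 0): no leakage is tolerated at all.
# With epsilon = 1 and incomes 20,000 vs 40,000, up to half may leak away.
```

The framing effects the paper reports correspond to respondents implicitly answering with very different ε values depending on how the leaky-bucket question is posed.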
