On the fast computation of the weight enumerator polynomial and the t-value of digital nets over finite abelian groups
In this paper we introduce digital nets over finite abelian groups, which
contain digital nets over finite fields and certain rings as special cases. We
prove a MacWilliams-type identity for such digital nets. This identity can be
used to compute the strict t-value of a digital net over finite abelian
groups. If the digital net has a given number of points in the s-dimensional
unit cube [0,1)^s, then the t-value and the weight enumerator polynomial can
both be computed in a number of operations that is polynomial in the number of
points and the dimension s, where operations mean arithmetic operations on
integers. By precomputing some values, the number of operations needed to
compute the weight enumerator polynomial can be reduced further.
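As a point of reference for the identity mentioned above, the following Python sketch checks the classical MacWilliams identity for binary linear codes, which the paper generalizes to digital nets over finite abelian groups. It is only an illustration of the classical special case: the [7,4] generator matrix, the helper names and the brute-force dual computation are choices made here, not the paper's algorithm.

# Minimal sketch (hypothetical example): the classical binary MacWilliams
# identity that the paper generalizes to digital nets over abelian groups.
from itertools import product

import numpy as np


def codewords(G):
    """All codewords of the binary linear code generated by the rows of G."""
    k, _ = G.shape
    return {tuple(np.mod(np.array(m) @ G, 2)) for m in product([0, 1], repeat=k)}


def weight_distribution(words, n):
    """A[w] = number of codewords of Hamming weight w."""
    A = [0] * (n + 1)
    for c in words:
        A[sum(c)] += 1
    return A


def W(A, x, y):
    """Homogeneous weight enumerator W(x, y) = sum_w A[w] x^(n-w) y^w."""
    n = len(A) - 1
    return sum(A[w] * x ** (n - w) * y ** w for w in range(n + 1))


# [7,4] binary Hamming code (a standard generator matrix, assumed here).
G = np.array([[1, 0, 0, 0, 0, 1, 1],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 1, 1, 0],
              [0, 0, 0, 1, 1, 1, 1]])
n = G.shape[1]
C = codewords(G)

# Dual code: all length-n binary vectors orthogonal (mod 2) to every codeword.
C_dual = {v for v in product([0, 1], repeat=n)
          if all(sum(a * b for a, b in zip(v, c)) % 2 == 0 for c in C)}

A, B = weight_distribution(C, n), weight_distribution(C_dual, n)

# Classical MacWilliams identity: W_dual(x, y) = W_C(x + y, x - y) / |C|.
for x, y in [(1.0, 0.5), (2.0, 3.0)]:
    assert abs(W(B, x, y) - W(A, x + y, x - y) / len(C)) < 1e-9
print("weights of C:     ", A)
print("weights of dual C:", B)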
Top-antitop-bottom-antibottom hadroproduction with massive bottom quarks with PowHel
The associated production of top-antitop-bottom-antibottom quarks is a
relevant irreducible background for Higgs boson analyses in the
top-antitop-Higgs production channel, with Higgs decaying into a
bottom-antibottom quark pair. We implement this process in the PowHel event
generator, treating the bottom quarks as massive in all steps of the
computation, which uses hard-scattering matrix elements in the 4-flavour
number scheme combined with 4-flavour Parton Distribution Functions.
Predictions with NLO QCD + Parton Shower accuracy, as obtained with PowHel +
PYTHIA, are compared to those from a previous PowHel implementation with
hard-scattering matrix elements in the 5-flavour number scheme, taking as a
baseline a realistic analysis of top-antitop hadroproduction with additional
b-jet activity performed by the CMS collaboration at the Large Hadron Collider.
Validating Network Value of Influencers by means of Explanations
Recently, there has been significant interest in social influence analysis.
One of the central problems in this area is identifying influencers: users
such that, by convincing them to perform a certain action (like buying a new
product), a large number of other users are influenced to follow the action.
The client of such an application is a marketer who would target these
influencers for marketing a given new product, say by providing free samples
or discounts. It is natural that, before committing resources to targeting an
influencer, the marketer would want to validate the influence (or network
value) of the influencers returned. This requires digging deeper into such
analytical questions as: who are their followers, on which actions (or
products) are they influential, and so on. However, the current
approaches to identifying influencers largely work as a black box in this
respect. The goal of this paper is to open up the black box, address these
questions and provide informative and crisp explanations for validating the
network value of influencers.
We formulate the problem of providing explanations (called PROXI) as a
discrete optimization problem of feature selection. We show that PROXI is not
only NP-hard to solve exactly, but also NP-hard to approximate within any
reasonable factor. Nevertheless, we show interesting properties of the
objective function and develop an intuitive greedy heuristic. We perform a
detailed experimental analysis on two real-world datasets, Twitter and
Flixster, and show that our approach is useful in generating concise and
insightful explanations of the influence distribution of users, and that our
greedy algorithm is effective and efficient with respect to several baselines.
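Since the abstract does not spell out the PROXI objective, the following Python sketch only illustrates the general shape of a greedy heuristic for this kind of feature-selection problem: a budgeted selection that repeatedly adds the candidate with the largest marginal gain. The coverage score and the toy data are hypothetical stand-ins, not the paper's formulation.

# Hypothetical sketch: greedily pick k explanatory features (e.g. actions or
# follower communities) that best "cover" a user's observed influence.
# The objective below is a toy coverage score, not the PROXI objective.

def greedy_select(candidates, score, k):
    """Pick k candidates, each time adding the one with the largest marginal gain."""
    selected = []
    while len(selected) < k:
        best, best_gain = None, float("-inf")
        base = score(selected)
        for c in candidates:
            if c in selected:
                continue
            gain = score(selected + [c]) - base
            if gain > best_gain:
                best, best_gain = c, gain
        if best is None:
            break
        selected.append(best)
    return selected


# Toy data: each candidate feature "explains" a set of influenced users.
explains = {
    "action:camera": {"u1", "u2", "u3"},
    "action:phone": {"u2", "u4"},
    "community:photo": {"u1", "u3", "u5", "u6"},
    "community:tech": {"u4", "u7"},
}

def coverage(selected):
    """Number of distinct influenced users explained by the selected features."""
    return len(set().union(*(explains[c] for c in selected))) if selected else 0

print(greedy_select(list(explains), coverage, k=2))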
Non-Gaussian Geostatistical Modeling using (skew) t Processes
We propose a new model for regression and dependence analysis when addressing
spatial data with possibly heavy tails and an asymmetric marginal distribution.
We first propose a stationary process with marginals obtained through scale
mixing of a Gaussian process with an inverse square root process with Gamma
marginals. We then generalize this construction by considering a skew-Gaussian
process, thus obtaining a process with skew-t marginal distributions. For the
proposed (skew) t process we study the second-order and geometrical
properties and, in the t case, we provide analytic expressions for the
bivariate distribution. In an extensive simulation study, we investigate the
use of the weighted pairwise likelihood as a method of estimation for the t
process. Moreover, we compare the performance of the optimal linear predictor
of the t process versus the optimal Gaussian predictor. Finally, the
effectiveness of our methodology is illustrated by analyzing a georeferenced
dataset on maximum temperatures in Australia.
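The scale-mixture construction can be illustrated numerically. The Python sketch below uses a simplified, classical version of the idea (a single chi-square mixing variable per replicate and an exponential correlation on a one-dimensional grid, with made-up parameters), not the paper's process-level mixing or its skewed extension: dividing a Gaussian field by the square root of a chi-square/nu variable yields Student-t marginals.

# Minimal sketch (simplified, not the paper's construction): t marginals via a
# Gaussian scale mixture on a 1-D grid with exponential correlation.
import numpy as np

rng = np.random.default_rng(0)
nu = 5                       # degrees of freedom (assumed)
n, phi = 200, 0.3            # grid size and correlation range (assumed)

s = np.linspace(0.0, 1.0, n)
cov = np.exp(-np.abs(s[:, None] - s[None, :]) / phi)   # exponential correlation

n_rep = 5000
L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))
gauss = rng.standard_normal((n_rep, n)) @ L.T           # Gaussian replicates
w = rng.chisquare(nu, size=(n_rep, 1)) / nu             # one mixing scale per replicate
t_field = gauss / np.sqrt(w)                            # heavy-tailed replicates

# The marginal variance of a t_nu variable is nu / (nu - 2); compare empirically.
print("empirical marginal variance:", t_field.var(axis=0).mean().round(2))
print("theoretical nu/(nu-2):      ", round(nu / (nu - 2), 2))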
A room-temperature alternating current susceptometer - Data analysis, calibration, and test
An AC susceptometer operating in the range of 10 Hz to 100 kHz and at room
temperature is designed, built, calibrated and used to characterize the
magnetic behaviour of coated magnetic nanoparticles. Other weakly magnetic
materials (in amounts of some millilitres) can be analyzed as well. The setup
makes use of a DAQ-based acquisition system in order to determine the amplitude
and the phase of the sample magnetization as a function of the frequency of the
driving magnetic field, which is powered by a digital waveform generator. A
specific acquisition strategy makes the response directly proportional to the
sample susceptibility, taking advantage of the differential nature of the coil
assembly. A calibration method based on conductive samples is developed.
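The amplitude-and-phase determination described above is essentially digital demodulation at the drive frequency. The Python sketch below shows a generic version of that step on a simulated pickup voltage; the sampling rate, drive frequency and signal parameters are made up and this is not the instrument's acquisition code.

# Generic sketch of digital demodulation at the drive frequency f0
# (hypothetical parameters, simulated signal).
import numpy as np

fs, f0 = 1.0e6, 10.0e3               # sampling rate and drive frequency (assumed)
n_periods = 200
t = np.arange(int(fs / f0 * n_periods)) / fs

# Simulated pickup voltage: amplitude 2.5 mV, phase lag 0.3 rad, plus noise.
rng = np.random.default_rng(1)
signal = 2.5e-3 * np.cos(2 * np.pi * f0 * t - 0.3) + 1e-4 * rng.standard_normal(t.size)

# Project onto quadrature references over an integer number of periods.
ref_c = np.cos(2 * np.pi * f0 * t)
ref_s = np.sin(2 * np.pi * f0 * t)
X = 2 * np.mean(signal * ref_c)      # in-phase component
Y = 2 * np.mean(signal * ref_s)      # quadrature component

amplitude = np.hypot(X, Y)
phase = np.arctan2(Y, X)             # positive phase = lag behind the drive
print(f"amplitude ~ {amplitude*1e3:.2f} mV, phase ~ {phase:.2f} rad")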
Bioelectronic technologies and artificial intelligence for medical diagnosis and healthcare
The application of electronic technologies to biology and medicine has significantly impacted health and wellbeing. Recent technology advances have allowed the development of new systems that can provide diagnostic information on portable point-of-care devices or smartphones. The decreasing size of electronic technologies down to the atomic scale and the advances in system, cell, and molecular biology have the potential to increase the quality and reduce the costs of healthcare.
Clinicians have pervasive access to new data from complex sensors, imaging tools, and a multitude of other sources, including personal health e-records and smart environments. Humans are far from being able to process this unprecedented volume of available data without advanced tools. Artificial intelligence (AI) can help clinicians identify patterns in this huge amount of data to inform better choices for patients.
In this Special Issue, some original research papers focusing on recent advances have been collected, covering novel theories, innovative methods, and meaningful applications that could potentially lead to significant advances in the field.
Performance of screening for aneuploidies by cell-free DNA analysis of maternal blood in twin pregnancies
Objectives
To report clinical implementation of cell‐free DNA (cfDNA) analysis of maternal blood in screening for trisomies 21, 18 and 13 in twin pregnancies and examine variables that could influence the failure rate of the test.
Methods
cfDNA testing was performed in 515 twin pregnancies at 10–28 weeks' gestation. The failure rate of the test to provide results was compared with that in 1847 singleton pregnancies, and logistic regression analysis was used to determine which factors among maternal and pregnancy characteristics were significant predictors of test failure.
Results
Failure rate of the cfDNA test at first sampling was 1.7% in singletons and 5.6% in twins. Of those with a test result, the median fetal fraction in twins was 8.7% (range, 4.1–30.0%), which was lower than that in singletons (11.7% (range, 4.0–38.9%)). Multivariable regression analysis demonstrated that twin pregnancy, higher maternal weight and conception by in‐vitro fertilization provided significant independent prediction of test failure. Follow‐up was available in 351 (68.2%) of the twin pregnancies and comprised 334 with euploid fetuses, 12 discordant for trisomy 21 and five discordant for trisomy 18. In all 323 euploid cases with a result, the risk score for each trisomy was < 1:10 000. In 11 of the 12 cases with trisomy 21 and in the five with trisomy 18, the cfDNA test gave a high‐risk result, but in one case of trisomy 21, the score was < 1:10 000.
Conclusion
In twin pregnancies, screening by cfDNA testing is feasible, but the failure rate is higher and the detection rate may be lower than in singletons.
Blockwise Euclidean likelihood for spatio-temporal covariance models
A spatio-temporal blockwise Euclidean likelihood method is proposed for the estimation of covariance models when dealing with large spatio-temporal Gaussian data. The method uses moment conditions coming from the score of the pairwise composite likelihood. The blockwise approach guarantees considerable computational improvements over the standard pairwise composite likelihood method. In order to further speed up computation, a general-purpose graphics processing unit implementation using OpenCL is developed. The asymptotic properties of the proposed estimator are derived, and the finite-sample properties of the methodology are investigated by means of a simulation study highlighting the computational gains of the OpenCL implementation. Finally, the estimation method is applied to a wind component data set.
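As a rough illustration of the blockwise pairwise idea (not the paper's Euclidean-likelihood estimator), the Python sketch below accumulates bivariate Gaussian log-density contributions only over pairs of observations that fall in the same spatial block, using an assumed exponential covariance and made-up inputs. The paper builds moment conditions from the score of such pairwise terms and offloads the computation to a GPU via OpenCL.

# Toy sketch: blockwise pairwise Gaussian log-likelihood with an exponential
# covariance (illustrative only; spatial-only, made-up inputs).
import numpy as np


def pairwise_loglik_block(z, coords, sigma2, phi):
    """Sum of bivariate Gaussian log-densities over all pairs in one block."""
    ll = 0.0
    m = len(z)
    for i in range(m):
        for j in range(i + 1, m):
            d = np.linalg.norm(coords[i] - coords[j])
            c = sigma2 * np.exp(-d / phi)            # exponential covariance
            det = sigma2 ** 2 - c ** 2
            quad = (sigma2 * (z[i] ** 2 + z[j] ** 2) - 2 * c * z[i] * z[j]) / det
            ll += -np.log(2 * np.pi) - 0.5 * np.log(det) - 0.5 * quad
    return ll


def blockwise_loglik(z, coords, block_id, sigma2, phi):
    """Accumulate pairwise contributions block by block (pairs never cross blocks)."""
    total = 0.0
    for b in np.unique(block_id):
        idx = np.where(block_id == b)[0]
        total += pairwise_loglik_block(z[idx], coords[idx], sigma2, phi)
    return total


# Made-up example: 200 sites on the unit square, split into 4 spatial blocks.
rng = np.random.default_rng(2)
coords = rng.uniform(size=(200, 2))
z = rng.standard_normal(200)                    # placeholder data
block_id = (coords[:, 0] > 0.5).astype(int) * 2 + (coords[:, 1] > 0.5).astype(int)
print(blockwise_loglik(z, coords, block_id, sigma2=1.0, phi=0.2))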
HELAC-NLO
Based on the OPP technique and the HELAC framework, HELAC-1LOOP is a program
that is capable of numerically evaluating QCD virtual corrections to scattering
amplitudes. A detailed presentation of the algorithm is given, along with
instructions to run the code and benchmark results. The program is part of the
HELAC-NLO framework that allows for a complete evaluation of QCD NLO
corrections.
Larmor frequency dressing by an anharmonic transverse magnetic field
We present a theoretical and experimental study of spin precession in the
presence of both a static and an orthogonal oscillating magnetic field, which
is nonresonant, not harmonically related to the Larmor precession, and of
arbitrary strength. Due to the intrinsic nonlinearity of the system, previous
models that account only for the simple sinusoidal case cannot be applied. We
suggest an alternative approach and develop a model that closely agrees with
experimental data produced by an optical-pumping atomic magnetometer. We
demonstrate that an appropriately designed nonharmonic field makes it possible
to extract a linear response to a weak dc transverse field, despite the scalar
nature of the magnetometer, which normally causes a much weaker, second-order
response.
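To make the setting concrete, the Python sketch below numerically integrates the classical precession equation dS/dt = gamma S x B for a static field along z and a non-sinusoidal (anharmonic) transverse dressing field along x. All field values, the gyromagnetic ratio and the two-component waveform are hypothetical choices for illustration; this is not the model developed in the paper.

# Generic sketch (hypothetical parameters): classical spin precession under a
# static field plus an anharmonic transverse dressing field.
import numpy as np
from scipy.integrate import solve_ivp

gamma = 2 * np.pi * 7.0              # gyromagnetic ratio, rad/s per uT (made-up)
B0 = 1.0                             # static field along z, uT (made-up)
B1, omega = 5.0, 2 * np.pi * 300.0   # dressing amplitude and angular frequency (made-up)

def B(t):
    """Static field plus an anharmonic (two-component) transverse waveform."""
    bx = B1 * (np.cos(omega * t) + 0.4 * np.cos(3 * omega * t))
    return np.array([bx, 0.0, B0])

def bloch(t, S):
    return gamma * np.cross(S, B(t))

S0 = np.array([1.0, 0.0, 0.0])       # spin initially along x
sol = solve_ivp(bloch, (0.0, 0.5), S0, max_step=1e-4, dense_output=True)

# Read the dressed precession frequency off the zero crossings of Sx.
t = np.linspace(0.0, 0.5, 5000)
Sx = sol.sol(t)[0]
zero_crossings = np.sum(np.diff(np.sign(Sx)) != 0)
print("approximate dressed precession frequency (Hz):", zero_crossings / 2 / 0.5)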