
    Tracking and predicting U.S. influenza activity with a real-time surveillance network

    Each year in the United States, influenza causes illness in 9.2 to 35.6 million individuals and is responsible for 12,000 to 56,000 deaths. The U.S. Centers for Disease Control and Prevention (CDC) tracks influenza activity through a national surveillance network. These data are only available after a delay of 1 to 2 weeks, and thus influenza epidemiologists and transmission modelers have explored the use of other data sources to produce more timely estimates and predictions of influenza activity. We evaluated whether data collected from a national commercial network of influenza diagnostic machines could produce valid estimates of the current burden and help to predict influenza trends in the United States. Quidel Corporation provided us with de-identified influenza test results transmitted in real time from a national network of influenza test machines called the Influenza Test System (ITS). We used this ITS dataset to estimate and predict influenza-like illness (ILI) activity in the United States over the 2015-2016 and 2016-2017 influenza seasons. First, we developed linear logistic models on national and regional geographic scales that accurately estimated two CDC influenza metrics: the proportion of influenza test results that are positive and the proportion of physician visits that are ILI-related. We then used our estimated ILI-related proportion of physician visits in transmission models to produce improved predictions of influenza trends in the United States at both the regional and national scales. These findings suggest that ITS can be leveraged to improve "nowcasts" and short-term forecasts of U.S. influenza activity.
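    The nowcasting step described in this abstract, regressing a delayed CDC metric on real-time device positivity, can be sketched as follows. The weekly values and the single-predictor form are illustrative assumptions, not the paper's actual data or model:

```python
import numpy as np

# Hypothetical weekly data (illustrative values, not from the paper):
# proportion of positive tests on the device network (ITS) and the CDC
# ILI-related proportion of physician visits for the same weeks.
its_positive = np.array([0.02, 0.05, 0.11, 0.18, 0.15, 0.08, 0.03])
cdc_ili = np.array([0.010, 0.018, 0.035, 0.052, 0.047, 0.028, 0.013])

def logit(p):
    return np.log(p / (1 - p))

def expit(x):
    return 1.0 / (1.0 + np.exp(-x))

# Fit CDC ILI on the logit scale as a linear function of logit ITS
# positivity, mirroring the "linear logistic" models in the abstract.
slope, intercept = np.polyfit(logit(its_positive), logit(cdc_ili), 1)

# Nowcast: this week's ITS positivity is available in real time,
# 1-2 weeks before the corresponding CDC report.
this_week_positivity = 0.12
nowcast = expit(intercept + slope * logit(this_week_positivity))
print(f"Nowcast ILI proportion: {nowcast:.3f}")
```

    Working on the logit scale keeps predicted proportions inside (0, 1); the fitted nowcast can then be fed into a transmission model as the abstract describes.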

    Beyond 100 GHz: High frequency device characterization for THz applications


    Measuring the proton spectrum in neutron decay - latest results with aSPECT

    The retardation spectrometer aSPECT was built to measure the shape of the proton spectrum in free neutron decay with high precision. This allows us to determine the antineutrino-electron angular correlation coefficient a. We aim for a precision more than one order of magnitude better than the present best value, which is Delta_a/a = 5%. In a recent beam time performed at the Institut Laue-Langevin during April/May 2008, we reached a statistical accuracy of about 2% per 24 hours of measurement time. Several systematic effects were investigated experimentally. We expect the total relative uncertainty to be well below 5%.
    Comment: Accepted for publication in the Conference Proceedings of the International Workshop on Particle Physics with Slow Neutrons 2008 held at the ILL, France. To be published in Nuclear Instruments and Methods in Physics Research, Section

    The age of data-driven proteomics: how machine learning enables novel workflows

    A lot of energy in the field of proteomics is dedicated to the application of challenging experimental workflows, which include metaproteomics, proteogenomics, data-independent acquisition (DIA), non-specific proteolysis, immunopeptidomics, and open modification searches. These workflows are all challenging because of ambiguity in the identification stage; they either expand the search space and thus increase the ambiguity of identifications, or, in the case of DIA, they generate data that are inherently more ambiguous. In this context, machine learning-based predictive models are now generating considerable excitement in the field of proteomics because these models hold great potential to drastically reduce the ambiguity in the identification process of the above-mentioned workflows. Indeed, the field has already produced classical machine learning and deep learning models to predict almost every aspect of a liquid chromatography-mass spectrometry (LC-MS) experiment. Yet despite all the excitement, thorough integration of predictive models in these challenging LC-MS workflows is still limited, and further improvements to the modeling and validation procedures can still be made. In this viewpoint, we therefore point out highly promising recent machine learning developments in proteomics, alongside some of the remaining challenges.
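    As an illustration of how a predictive model can reduce identification ambiguity, here is a hypothetical sketch of retention-time-based filtering of candidate peptide-spectrum matches. The length-based predictor and the candidate list are invented stand-ins; a real workflow would use a trained deep learning model:

```python
# Two candidate peptides explain the same spectrum at the same observed
# retention time (RT, in minutes); a predictor breaks the tie.
candidates = [
    {"peptide": "PEPTIDEK", "observed_rt": 31.5},
    {"peptide": "LNGERPEPTIDEK", "observed_rt": 31.5},
]

def predicted_rt(peptide):
    # Stand-in model: retention time grows with peptide length.
    # In practice this would be a trained RT predictor.
    return 4.0 * len(peptide)

def rescore(psms, tolerance=5.0):
    # Keep only candidates whose observed RT is close to the predicted
    # RT, shrinking the effective search space.
    return [p for p in psms
            if abs(p["observed_rt"] - predicted_rt(p["peptide"])) <= tolerance]

print(rescore(candidates))
```

    The same pattern extends to predicted fragment intensities or collisional cross sections: each predicted property adds a filter that ambiguous candidates must pass.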

    Ultracold-neutron infrastructure for the gravitational spectrometer GRANIT

    The gravitational spectrometer GRANIT will be set up at the Institut Laue-Langevin. It will profit from the high ultracold neutron density produced by a dedicated source. A monochromator made of crystals of potassium-intercalated graphite will provide a neutron beam with a wavelength of 0.89 nm incident on the source. The source employs superthermal conversion of cold neutrons in superfluid helium, in a vessel made from BeO ceramics with Be windows. A special extraction technique has been tested which feeds the spectrometer only with neutrons whose vertical velocity component is v < 20 cm/s, thus keeping the density in the source high. This new source is expected to provide a density of up to 800 cm^-3 for the spectrometer.
    Comment: accepted for publication in Proceedings International Workshop on Particle Physics with Slow Neutron

    Random billiards with wall temperature and associated Markov chains

    By a random billiard we mean a billiard system in which the standard specular reflection rule is replaced with a Markov transition probabilities operator P that, at each collision of the billiard particle with the boundary of the billiard domain, gives the probability distribution of the post-collision velocity for a given pre-collision velocity. A random billiard with microstructure (RBM) is a random billiard for which P is derived from a choice of geometric/mechanical structure on the boundary of the billiard domain. RBMs provide simple and explicit mechanical models of particle-surface interaction that can incorporate thermal effects and permit a detailed study of thermostatic action from the perspective of the standard theory of Markov chains on general state spaces. We focus on the operator P itself and how it relates to the mechanical/geometric features of the microstructure, such as mass ratios, curvatures, and potentials. The main results are as follows: (1) we characterize the stationary probabilities (equilibrium states) of P and show how standard equilibrium distributions studied in classical statistical mechanics, such as the Maxwell-Boltzmann distribution and the Knudsen cosine law, arise naturally as generalized invariant billiard measures; (2) we obtain some basic functional-theoretic properties of P. Under very general conditions, we show that P is a self-adjoint operator of norm 1 on an appropriate Hilbert space. In a simple but illustrative example, we show that P is a compact (Hilbert-Schmidt) operator. This leads to the issue of relating the spectrum of eigenvalues of P to the features of the microstructure; (3) we explore the latter issue both analytically and numerically in a few representative examples; (4) we present a general algorithm for simulating these Markov chains based on a geometric description of the invariant volumes of classical statistical mechanics.
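    One of the invariant distributions named in the abstract, the Knudsen cosine law, is easy to sample directly. The sketch below is an illustrative check, not the paper's general simulation algorithm: in 2D the outgoing angle theta from the wall normal has density proportional to cos(theta) on (-pi/2, pi/2), so inverse-transform sampling gives theta = arcsin(2u - 1):

```python
import numpy as np

rng = np.random.default_rng(0)

def cosine_law_angles(n):
    # Inverse-transform sampling of the Knudsen cosine law in 2D:
    # the CDF of the density cos(theta)/2 is (1 + sin(theta))/2,
    # so theta = arcsin(2u - 1) for u uniform on (0, 1).
    u = rng.uniform(size=n)
    return np.arcsin(2.0 * u - 1.0)

theta = cosine_law_angles(100_000)

# Sanity checks: under the cosine law the mean outgoing angle is 0
# and E[cos(theta)] = pi/4.
print(theta.mean(), np.cos(theta).mean())
```

    A full RBM simulation would draw the post-collision velocity from the operator P at each wall collision; the cosine law is the special case that appears as a generalized invariant billiard measure.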

    The Millennium Arecibo 21-CM Absorption Line Survey. II. Properties of the Warm and Cold Neutral Media

    We use the Gaussian-fit results of Paper I to investigate the properties of interstellar HI in the Solar neighborhood. The Warm and Cold Neutral Media (WNM and CNM) are physically distinct components. The CNM spin temperature histogram peaks at about 40 K. About 60% of all HI is WNM. At z=0, we derive a volume filling fraction of about 0.50 for the WNM; this value is very rough. The upper-limit WNM temperatures determined from line width range upward from about 500 K; a minimum of about 48% of the WNM lies in the thermally unstable region 500 to 5000 K. The WNM is a prominent constituent of the interstellar medium and its properties depend on many factors, requiring global models that include all relevant energy sources, of which there are many. We use Principal Components Analysis, together with a form of least squares fitting that accounts for errors in both the independent and dependent parameters, to discuss the relationships among the four CNM Gaussian parameters. The spin temperature T_s and column density N(HI) are, approximately, the two most important eigenvectors; as such, they are sufficient, convenient, and physically meaningful primary parameters for describing CNM clouds. The Mach number of internal macroscopic motions for CNM clouds is typically 1.3, but there are wide variations. We discuss the historical tau-T_s relationship in some detail and show that it has little physical meaning. We discuss CNM morphology using the CNM pressure known from UV stellar absorption lines. Knowing the pressure allows us to show that CNM structures cannot be isotropic but instead are sheetlike, with length-to-thickness aspect ratios ranging up to about 280. We present large-scale maps of two regions where CNM lies in very large ``blobby sheets''.
    Comment: Revised submission to Ap.J. Changes include: (1) correction of the turbulent Mach number in equation 16 and figure 12; the new typical value is 1.3 versus the old, incorrect value of 2.5; (2) smaller typeface for the astro-ph version to conserve paper. 60 pages, 16 figures
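    The turbulent Mach number mentioned in the abstract can be sketched from standard relations between line width and temperature. The assumptions below (thermal FWHM of HI equal to 0.214*sqrt(T) km/s, isotropic turbulence, isothermal sound speed with mean molecular weight 1.4, and spin temperature used as the kinetic temperature) are textbook conventions, not necessarily the paper's equation 16, and the example cloud is invented:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
M_H = 1.6735575e-27  # hydrogen atom mass, kg

def turbulent_mach(fwhm_obs_kms, t_spin_k, mu=1.4):
    # Thermal FWHM of the 21-cm line at kinetic temperature T:
    # sqrt(8 ln 2 * k T / m_H) = 0.214 * sqrt(T) km/s.
    fwhm_th = 0.214 * math.sqrt(t_spin_k)
    # 1D turbulent velocity dispersion from quadrature subtraction.
    sigma_turb_sq = (fwhm_obs_kms**2 - fwhm_th**2) / (8 * math.log(2))
    sigma_turb = math.sqrt(max(sigma_turb_sq, 0.0))
    # Isothermal sound speed with mean molecular weight mu, in km/s.
    c_s = math.sqrt(K_B * t_spin_k / (mu * M_H)) / 1e3
    # 3D Mach number assuming isotropic turbulence.
    return math.sqrt(3) * sigma_turb / c_s

# Example: a CNM cloud with T_s = 40 K (the histogram peak above) and
# an observed FWHM of 1.5 km/s gives a Mach number near 1.
print(turbulent_mach(1.5, 40.0))
```

    With these conventions a typical 40 K CNM cloud lands near the abstract's corrected typical value of order unity; the exact number depends on the adopted mean molecular weight and on whether T_s traces the kinetic temperature.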