
    Velocity autocorrelation function of a Brownian particle

    In this article, we present a molecular dynamics study of the velocity autocorrelation function (VACF) of a Brownian particle. We compare the results of the simulation with the exact analytic predictions for a compressible fluid from [6] and with an approximate result that combines the predictions of hydrodynamics at short and long times. The physical quantities that govern the decay were obtained from separate bulk simulations of the Lennard-Jones fluid at the same thermodynamic state point. We observe that the long-time regime of the VACF compares well with the predictions of macroscopic hydrodynamics, whereas the intermediate decay is sensitive to the viscoelastic nature of the solvent.
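    As a point of reference for how a VACF is typically estimated from simulation output, the sketch below averages the dot product v(0)·v(t) over time origins of a stored velocity trajectory. The array shapes, normalization, and file name are illustrative assumptions, not the authors' analysis code.

```python
import numpy as np

def vacf(velocities, max_lag):
    """Estimate the velocity autocorrelation function C(t) = <v(0) . v(t)>
    from a single trajectory of particle velocities.

    velocities : ndarray of shape (n_steps, 3)
    max_lag    : number of time lags to evaluate
    """
    n_steps = velocities.shape[0]
    c = np.empty(max_lag)
    for lag in range(max_lag):
        # average of v(t0) . v(t0 + lag) over all available time origins t0
        dots = np.einsum("ij,ij->i", velocities[:n_steps - lag], velocities[lag:])
        c[lag] = dots.mean()
    return c / c[0]  # normalize so that C(0) = 1

# Hypothetical usage with a stored trajectory (file name is a placeholder):
# v = np.load("brownian_particle_velocities.npy")
# c = vacf(v, max_lag=2000)
```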

    Acceptance criteria for new approach methods in toxicology and human health-relevant life science research - part I

    Every test procedure, scientific and non-scientific, has inherent uncertainties, even when performed according to a standard operating procedure (SOP). In addition, it is prone to errors, defects, and mistakes introduced by operators, laboratory equipment, or materials used. Adherence to an SOP and comprehensive validation of the test method cannot guarantee that each test run produces data within the acceptable range of variability and with the precision and accuracy determined during the method validation. We illustrate here (part I) why controlling the validity of each test run is an important element of experimental design. The definition and application of acceptance criteria (AC) for the validity of test runs are important for the setup and use of test methods, particularly for the use of new approach methods (NAM) in toxicity testing. AC can be used for decision rules on how to handle data, e.g., to accept the data for further use (AC fulfilled) or to reject the data (AC not fulfilled). The adherence to AC has important requirements and consequences that may seem surprising at first sight: (i) AC depend on a test method's objectives, e.g., on the types/concentrations of chemicals tested, the regulatory context, and the desired throughput; (ii) AC are applied and documented at each test run, while validation of a method (including the definition of AC) is performed only once; (iii) if AC are altered, then the set of data produced by a method can change. AC, if missing, are the blind spot of quality assurance: test results may not be reliable and comparable. The establishment and use of AC will be further detailed in part II of this series.
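    To make the accept/reject decision rule concrete, here is a minimal sketch of AC applied to a single test run. The criterion names and threshold values are hypothetical placeholders chosen for illustration, not acceptance criteria defined in the article.

```python
def run_is_valid(run):
    """Apply acceptance criteria (AC) to one test run and return True only
    if all criteria are fulfilled (data accepted for further use)."""
    criteria = [
        run["positive_control_response"] >= 0.70,  # positive control must respond
        run["negative_control_response"] <= 0.10,  # negative control must stay low
        run["replicate_cv"] <= 0.15,               # replicate variability bounded
    ]
    return all(criteria)

# Hypothetical run record; the rule is a binary accept/reject decision
run = {"positive_control_response": 0.82,
       "negative_control_response": 0.05,
       "replicate_cv": 0.09}
decision = "accept" if run_is_valid(run) else "reject"
print(decision)
```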

    An Indication of Anisotropy in Arrival Directions of Ultra-high-energy Cosmic Rays through Comparison to the Flux Pattern of Extragalactic Gamma-Ray Sources

    A new analysis of the data set from the Pierre Auger Observatory provides evidence for anisotropy in the arrival directions of ultra-high-energy cosmic rays on an intermediate angular scale, which is indicative of excess arrivals from strong, nearby sources. The data consist of 5514 events above 20 EeV with zenith angles up to 80 degrees, recorded before 2017 April 30. Sky models have been created for two distinct populations of extragalactic gamma-ray emitters: active galactic nuclei from the second catalog of hard Fermi-LAT sources (2FHL) and starburst galaxies from a sample that was examined with Fermi-LAT. Flux-limited samples, which include all types of galaxies from the Swift-BAT and 2MASS surveys, have been investigated for comparison. The sky model of cosmic-ray density constructed using each catalog has two free parameters: the fraction of events correlating with astrophysical objects and an angular scale characterizing the clustering of cosmic rays around extragalactic sources. A maximum-likelihood ratio test is used to evaluate the best values of these parameters and to quantify the strength of each model by contrast with isotropy. It is found that the starburst model fits the data better than the hypothesis of isotropy with a statistical significance of 4.0 sigma, the highest value of the test statistic being for energies above 39 EeV. The three alternative models are favored against isotropy with 2.7 to 3.2 sigma significance. The origin of the indicated deviation from isotropy is examined, and prospects for more sensitive future studies are discussed.
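    The test-statistic construction can be illustrated with a simplified one-parameter version of such a likelihood ratio test: the source map is taken as already smeared with a fixed angular scale, and only the correlating fraction is fitted. This is a sketch under those assumptions, not the collaboration's analysis code.

```python
import numpy as np
from scipy.optimize import minimize

def log_likelihood(f, p_source, p_iso):
    """Unbinned log-likelihood of a two-component sky model in which a
    fraction f of events follows the (pre-smeared) source map and the
    remaining 1 - f are isotropic.  p_source and p_iso are the per-event
    probability densities under the two hypotheses."""
    return np.sum(np.log(f * p_source + (1.0 - f) * p_iso))

def test_statistic(p_source, p_iso):
    """TS = 2 * (max_f log L(f) - log L(f = 0)), i.e. best fit vs. isotropy."""
    res = minimize(lambda x: -log_likelihood(x[0], p_source, p_iso),
                   x0=[0.1], bounds=[(0.0, 1.0)])
    return 2.0 * (-res.fun - log_likelihood(0.0, p_source, p_iso))
```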

    A Targeted Search for Point Sources of EeV Photons with the Pierre Auger Observatory

    Simultaneous measurements of air showers with the fluorescence and surface detectors of the Pierre Auger Observatory allow a sensitive search for EeV photon point sources. Several Galactic and extragalactic candidate objects are grouped into classes to reduce the statistical penalty of many trials relative to that of a blind search and are analyzed for a significant excess above the background expectation. The presented search does not find any evidence for photon emission at candidate sources, and combined p-values for each class are reported. Particle and energy flux upper limits are given for selected candidate sources. These limits significantly constrain predictions of EeV proton emission models from non-transient Galactic and nearby extragalactic sources, as illustrated for the particular case of the Galactic center region.
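    As an illustration of how per-source p-values within a candidate class can be merged into a single combined p-value, the sketch below uses Fisher's method. This is a generic choice made for the illustration and not necessarily the combination procedure used in the article.

```python
import numpy as np
from scipy.stats import chi2

def fisher_combined_pvalue(p_values):
    """Fisher's method: under the null hypothesis, -2 * sum(ln p_i) follows
    a chi-square distribution with 2k degrees of freedom for k independent
    p-values."""
    p = np.asarray(p_values, dtype=float)
    statistic = -2.0 * np.sum(np.log(p))
    return chi2.sf(statistic, df=2 * p.size)

# Hypothetical per-source p-values within one class of candidate objects
print(fisher_combined_pvalue([0.20, 0.35, 0.08, 0.50]))
```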

    Inferences on mass composition and tests of hadronic interactions from 0.3 to 100 EeV using the water-Cherenkov detectors of the Pierre Auger Observatory

    We present a new method for probing the hadronic interaction models at ultrahigh energy and extracting details about mass composition. This is done using the time profiles of the signals recorded with the water-Cherenkov detectors of the Pierre Auger Observatory. The profiles arise from a mix of the muon and electromagnetic components of air showers. Using the risetimes of the recorded signals, we define a new parameter, which we use to compare our observations with predictions from simulations. We find, first, inconsistencies between our data and predictions over a greater energy range and with substantially more events than in previous studies. Second, by calibrating the new parameter with fluorescence measurements from observations made at the Auger Observatory, we can infer the depth of shower maximum, Xmax, for a sample of over 81,000 events extending from 0.3 to over 100 EeV. Above 30 EeV, the sample is nearly 14 times larger than what is currently available from fluorescence measurements and extends the covered energy range by half a decade. The energy dependence of ⟨Xmax⟩ is compared to simulations and interpreted in terms of the mean of the logarithmic mass. We find good agreement with previous work and extend the measurement of the mean depth of shower maximum to greater energies than before, significantly reducing the statistical uncertainty associated with the inferences about mass composition.
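    The calibration step can be pictured as fitting the risetime-based parameter against Xmax for hybrid events (those with both fluorescence and surface-detector data) and then applying the fitted relation to surface-detector-only events. The linear form, variable names, and sample values below are illustrative assumptions, not the parametrization used in the paper.

```python
import numpy as np

def calibrate_and_infer(delta_hybrid, xmax_hybrid, delta_sd_only):
    """Fit Xmax = a * Delta + b on hybrid events, then infer Xmax for
    events observed with the surface detector alone."""
    a, b = np.polyfit(delta_hybrid, xmax_hybrid, deg=1)  # least-squares line
    return a * np.asarray(delta_sd_only) + b

# Hypothetical hybrid calibration sample and SD-only measurements
delta_hybrid = np.array([-1.2, -0.4, 0.1, 0.8, 1.5])
xmax_hybrid = np.array([690.0, 720.0, 745.0, 770.0, 800.0])  # g / cm^2
print(calibrate_and_infer(delta_hybrid, xmax_hybrid, [0.3, 1.1]))
```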

    G × E interactions as a basis for toxicological uncertainty.

    To transfer toxicological findings from model systems, e.g. animals, to humans, standardized safety factors are applied to account for intra-species and inter-species variabilities. An alternative approach would be to measure and model the actual compound-specific uncertainties. This biological concept assumes that all observed toxicities depend not only on the exposure situation (environment = E), but also on the genetic (G) background of the model (G × E). As a quantitative discipline, toxicology needs to move beyond merely qualitative G × E concepts. Research programs are required that determine the major biological variabilities affecting toxicity and categorize their relative weights and contributions. In a complementary approach, detailed case studies need to explore the role of genetic backgrounds in the adverse effects of defined chemicals. In addition, current understanding of the selection and propagation of adverse outcome pathways (AOP) in different biological environments is very limited. To improve understanding, a particular focus is required on modulatory and counter-regulatory steps. For quantitative approaches to address uncertainties, the concept of "genetic" influence needs a more precise definition. What is usually meant by this term in the context of G × E is the set of protein functions encoded by the genes. Besides the gene sequence, the regulation of gene expression and function should also be accounted for. The widened concept of past and present "gene expression" influences is summarized here as G_e. Also, the concept of "environment" needs some reconsideration in situations where exposure timing (E_t) is pivotal: prolonged or repeated exposure to the insult (chemical, physical, lifestyle) affects G_e. This implies that it changes the model system. The interaction of G_e with E_t might be denoted as G_e × E_t. We provide here general explanations and specific examples for this concept and show how it could be applied in the context of New Approach Methodologies (NAM).
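    One minimal way to picture the G_e × E_t notion quantitatively is a response model with an explicit interaction term, in which the effect of exposure timing depends on the expression-level genetic background. The model form and coefficients below are purely illustrative assumptions, not a model from the article.

```python
def toxicity_response(g_e, e_t, beta=(0.2, 0.5, 0.8, 1.5)):
    """Illustrative response model: baseline + main effects of G_e and E_t
    plus a G_e x E_t interaction term that captures how exposure timing
    modulates the genetically determined sensitivity."""
    b0, b_g, b_e, b_ge = beta
    return b0 + b_g * g_e + b_e * e_t + b_ge * g_e * e_t

# Same exposure, two hypothetical expression backgrounds
print(toxicity_response(g_e=0.2, e_t=1.0), toxicity_response(g_e=1.0, e_t=1.0))
```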