
    The MICRO-BOSS scheduling system: Current status and future efforts

    This paper describes a micro-opportunistic approach to factory scheduling that closely monitors the evolution of bottlenecks during the construction of the schedule and continuously redirects search towards the bottleneck that currently appears most critical. This approach differs from earlier opportunistic approaches in that it does not require scheduling large resource subproblems or large job subproblems before revising the current scheduling strategy. The micro-opportunistic approach was implemented in the MICRO-BOSS factory scheduling system. A study comparing MICRO-BOSS against a macro-opportunistic scheduler suggests that the additional flexibility of the micro-opportunistic approach generally yields important reductions in both tardiness and inventory.
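
    The bottleneck-monitoring idea described above can be sketched in a few lines. This is a hypothetical illustration, not the actual MICRO-BOSS heuristics: here resource criticality is simply remaining demand divided by remaining capacity, recomputed as the schedule is built, and the search focuses on whichever resource scores highest.

```python
# Hypothetical sketch of micro-opportunistic bottleneck tracking: after each
# scheduling decision, recompute a criticality score for every resource and
# redirect attention to the most contended one. The demand/capacity ratio
# used here is an illustrative assumption, not the MICRO-BOSS measure.

def most_critical_resource(demand, capacity):
    """Return the resource with the highest demand/capacity ratio."""
    return max(demand, key=lambda r: demand[r] / capacity[r])

# Toy instance: three machines with remaining processing demand (hours)
# and remaining available capacity (hours) in the scheduling horizon.
demand = {"lathe": 40.0, "mill": 55.0, "drill": 20.0}
capacity = {"lathe": 50.0, "mill": 48.0, "drill": 60.0}

bottleneck = most_critical_resource(demand, capacity)
print(bottleneck)  # the mill is over-subscribed (55/48 > 1)
```

    In a full scheduler this score would be refreshed after every operation is placed, so the focus of the search can shift as soon as a different resource becomes the tightest constraint.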

    A framework for applying natural language processing in digital health interventions

    BACKGROUND: Digital health interventions (DHIs) are poised to reduce target symptoms in a scalable, affordable, and empirically supported way. DHIs that involve coaching or clinical support often collect text data from 2 sources: (1) open correspondence between users and the trained practitioners supporting them through a messaging system and (2) text data recorded during the intervention by users, such as diary entries. Natural language processing (NLP) offers methods for analyzing text, augmenting the understanding of intervention effects, and informing therapeutic decision making. OBJECTIVE: This study aimed to present a technical framework that supports the automated analysis of both types of text data often present in DHIs. This framework generates text features and helps to build statistical models to predict target variables, including user engagement, symptom change, and therapeutic outcomes. METHODS: We first discussed various NLP techniques and demonstrated how they are implemented in the presented framework. We then applied the framework in a case study of the Healthy Body Image Program, a Web-based intervention trial for eating disorders (EDs). A total of 372 participants who screened positive for an ED received a DHI aimed at reducing ED psychopathology (including binge eating and purging behaviors) and improving body image. These users generated 37,228 intervention text snippets and exchanged 4285 user-coach messages, which were analyzed using the proposed model. RESULTS: We applied the framework to predict binge eating behavior, resulting in an area under the curve between 0.57 (when applied to new users) and 0.72 (when applied to new symptom reports of known users). In addition, initial evidence indicated that specific text features predicted the therapeutic outcome of reducing ED symptoms. CONCLUSIONS: The case study demonstrates the usefulness of a structured approach to text data analytics. 
NLP techniques improve the prediction of symptom changes in DHIs. We present a technical framework that can be easily applied in other clinical trials and clinical presentations and encourage other groups to apply the framework in similar contexts.
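
    The pipeline the framework describes (text in, features out, statistical model, predictive score) can be sketched end to end. Everything below is a toy assumption: the bag-of-words features, the gradient-descent logistic fit, and the six diary snippets stand in for the study's far richer NLP features, models, and data; only the overall shape (features, model, AUC evaluation) mirrors the framework.

```python
# Minimal sketch of a text-to-prediction pipeline: bag-of-words features,
# a logistic model, and an AUC score. All names and data are illustrative.
import math

def featurize(texts):
    """Bag-of-words count vectors over the corpus vocabulary."""
    vocab = sorted({w for t in texts for w in t.lower().split()})
    index = {w: i for i, w in enumerate(vocab)}
    rows = []
    for t in texts:
        row = [0.0] * len(vocab)
        for w in t.lower().split():
            row[index[w]] += 1.0
        rows.append(row)
    return rows, vocab

def fit_logistic(X, y, lr=0.5, epochs=500):
    """Logistic regression via stochastic gradient descent (toy scale only)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            g = 1.0 / (1.0 + math.exp(-z)) - yi
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict(w, b, X):
    return [1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
            for xi in X]

def auc(y, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic."""
    pos = [s for s, t in zip(scores, y) if t == 1]
    neg = [s for s, t in zip(scores, y) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy diary snippets labelled for a hypothetical binge-eating report.
texts = ["felt urge to binge tonight", "calm day good meal",
         "binge again after dinner", "walked and felt fine",
         "strong binge urge", "relaxed evening no urge"]
labels = [1, 0, 1, 0, 1, 0]
X, vocab = featurize(texts)
w, b = fit_logistic(X, labels)
print(round(auc(labels, predict(w, b, X)), 2))
```

    The in-sample AUC on this tiny separable corpus is trivially high; the study's reported 0.57-0.72 range reflects the much harder setting of held-out users and reports.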

    Experiment for Testing Special Relativity Theory

    An experiment aimed at testing special relativity via a comparison of the velocity of a non-matter particle (an annihilation photon) with the velocity of a matter particle (a Compton electron) produced by the second annihilation photon from the decay Na-22(beta^+)Ne-22 is proposed. Comment: 7 pages, 1 figure. Report at the Conference of the Nuclear Physics Division of the Russian Academy of Sciences "Physics of Fundamental Interactions", ITEP, Moscow, November 26-30, 200

    Spectrometric method to detect exoplanets as another test to verify the invariance of the velocity of light

    Hypothetical influences of a variability of the velocity of light, caused by parameters of the radiation source, on the results of spectral measurements of stars in searches for exoplanets are considered. The accelerations of stars relative to the barycenter of the star-planet(s) system are taken into account. A dependence of the velocity of light on the barycentric radial velocity and on the barycentric radial acceleration of the star would lead to a substantial increase (up to an order of magnitude) in the semi-major axes of the orbits of detected exoplanet candidates. Consequently, a careful comparison of the results of the spectral method with the results of other well-known modern methods of detecting extrasolar planets allows the results obtained in this paper to be regarded as a reliable test of the invariance of the velocity of light. Comment: 11 pages, 5 figures

    Spherical collapse model in dark energy cosmologies

    We study the spherical collapse model for several dark energy scenarios using the fully nonlinear differential equation for the evolution of the density contrast within homogeneous spherical overdensities derived from Newtonian hydrodynamics. While mathematically equivalent to the more common approach based on the differential equation for the radius of the perturbation, this approach has substantial conceptual as well as numerical advantages. Among the most important are that no singularities appear at early times, which avoids numerical problems in particular in applications to cosmologies with dynamical and early dark energy, and that the assumption of time-reversal symmetry can easily be dropped where it is not strictly satisfied. We use this approach to derive the two parameters characterising the spherical-collapse model, i.e. the linear density threshold for collapse delta_c and the virial overdensity Delta_V, for a broad variety of dark-energy models, and to reconsider these parameters in cosmologies with early dark energy. We find that, independently of the model under investigation, delta_c and Delta_V are always very close to the values obtained for the standard LambdaCDM model, arguing that the abundance of, and the mean density within, non-linear structures are quite insensitive to the differences between dark-energy cosmologies. Regarding early dark energy, we thus arrive at a different conclusion than some earlier papers, including one from our group, and we explain why. Comment: 11 pages, 7 figures, accepted for publication in MNRAS
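
    The density-contrast formulation can be demonstrated in its simplest limit. In an Einstein-de Sitter cosmology (Omega_m = 1, no dark energy), writing x = ln a, the nonlinear equation reduces to delta'' + delta'/2 = (4/3) delta'^2/(1+delta) + (3/2) delta (1+delta), with linear growing mode delta_L proportional to a. Bisecting on the initial amplitude so that collapse (delta diverging) occurs at a = 1 recovers the classic threshold delta_c ≈ 1.686. This is a numerical sketch of that one special case, not the paper's dark-energy calculation, which adds the E(a) and Omega_m(a) dependence; step counts and thresholds are numerical choices of this sketch.

```python
# EdS spherical collapse via the nonlinear density-contrast equation,
# integrated with fixed-step RK4 in x = ln a. A bisection on the linearly
# extrapolated initial amplitude finds the value that collapses at a = 1.
import math

A_INIT = 1e-5          # starting scale factor
STEPS = 4000           # RK4 steps from ln(A_INIT) to 0
BLOWUP = 1e7           # treat delta above this as collapsed

def rhs(delta, v):
    """delta' = v;  v' from the EdS nonlinear equation in x = ln a."""
    return v, (-0.5 * v + (4.0 / 3.0) * v * v / (1.0 + delta)
               + 1.5 * delta * (1.0 + delta))

def collapses_by_a1(delta_i):
    """Integrate from A_INIT to a = 1; True if delta diverges on the way."""
    h = -math.log(A_INIT) / STEPS
    d, v = delta_i, delta_i          # pure growing mode: delta ∝ a = e^x
    for _ in range(STEPS):
        k1d, k1v = rhs(d, v)
        k2d, k2v = rhs(d + 0.5 * h * k1d, v + 0.5 * h * k1v)
        k3d, k3v = rhs(d + 0.5 * h * k2d, v + 0.5 * h * k2v)
        k4d, k4v = rhs(d + h * k3d, v + h * k3v)
        d += h * (k1d + 2 * k2d + 2 * k3d + k4d) / 6.0
        v += h * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
        if not d < BLOWUP:           # also catches inf/nan from overshoot
            return True
    return False

# Bisect on the linearly extrapolated amplitude delta_i / A_INIT.
lo, hi = 1.0, 3.0
for _ in range(50):
    mid = 0.5 * (lo + hi)
    if collapses_by_a1(mid * A_INIT):
        hi = mid
    else:
        lo = mid
delta_c = 0.5 * (lo + hi)
print(round(delta_c, 3))  # close to the classic EdS value 1.686
```

    Note how the approach sidesteps the radius equation entirely: there is no singular initial radius, and the same integrator runs unchanged once the dark-energy background terms are added to `rhs`.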

    Disagreeable Privacy Policies: Mismatches between Meaning and Users’ Understanding

    Privacy policies are verbose, difficult to understand, take too long to read, and may be the least-read items on most websites even as users express growing concerns about information collection practices. For all their faults, though, privacy policies remain the single most important source of information for users to attempt to learn how companies collect, use, and share data. Likewise, these policies form the basis for the self-regulatory notice and choice framework that is designed and promoted as a replacement for regulation. The underlying value and legitimacy of notice and choice depends, however, on the ability of users to understand privacy policies. This paper investigates the differences in interpretation among expert, knowledgeable, and typical users and explores whether those groups can understand the practices described in privacy policies at a level sufficient to support rational decision-making. The paper seeks to fill an important gap in the understanding of privacy policies through primary research on user interpretation and to inform the development of technologies combining natural language processing, machine learning and crowdsourcing for policy interpretation and summarization. For this research, we recruited a group of law and public policy graduate students at Fordham University, Carnegie Mellon University, and the University of Pittsburgh (“knowledgeable users”) and presented these law and policy researchers with a set of privacy policies from companies in the e-commerce and news & entertainment industries. We asked them nine basic questions about the policies’ statements regarding data collection, data use, and retention. We then presented the same set of policies to a group of privacy experts and to a group of non-expert users. 
The findings show areas of common understanding across all groups for certain data collection and deletion practices, but also demonstrate very important discrepancies in the interpretation of privacy policy language, particularly with respect to data sharing. The discordant interpretations arose both within groups and between the experts and the two other groups. The presence of these significant discrepancies has critical implications. First, the common understandings of some attributes of described data practices mean that semi-automated extraction of meaning from website privacy policies may be able to assist typical users and improve the effectiveness of notice by conveying the true meaning to users. However, the disagreements among experts and disagreement between experts and the other groups reflect that ambiguous wording in typical privacy policies undermines the ability of privacy policies to effectively convey notice of data practices to the general public. The results of this research will, consequently, have significant policy implications for the construction of the notice and choice framework and for the US reliance on this approach. The gap in interpretation indicates that privacy policies may be misleading the general public and that those policies could be considered legally unfair and deceptive. And, where websites are not effectively conveying privacy policies to consumers in a way that a “reasonable person” could, in fact, understand the policies, “notice and choice” fails as a framework. Such a failure has broad international implications since websites extend their reach beyond the United States.
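
    The within-group and between-group disagreement the study reports can be quantified with a standard inter-annotator agreement statistic such as Fleiss' kappa, computed over the answers several raters give to the same policy questions. The statistic and the toy data below are illustrative assumptions of this sketch, not the paper's actual analysis or results.

```python
# Fleiss' kappa: chance-corrected agreement for many raters answering the
# same items. ratings[i][k] = number of raters assigning item i to
# category k; each item must have the same number of raters.

def fleiss_kappa(ratings):
    n_items = len(ratings)
    n_raters = sum(ratings[0])
    n_cats = len(ratings[0])
    # Mean per-item agreement: fraction of concordant rater pairs.
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in ratings) / n_items
    # Chance agreement from the marginal category proportions.
    p_e = sum(
        (sum(row[k] for row in ratings) / (n_items * n_raters)) ** 2
        for k in range(n_cats))
    return (p_bar - p_e) / (1.0 - p_e)

# Toy example: 4 policy questions, 6 raters, 3 answer categories
# ("yes", "no", "unclear") -- counts per question.
answers = [
    [6, 0, 0],   # unanimous
    [5, 1, 0],
    [2, 2, 2],   # maximal disagreement
    [0, 5, 1],
]
print(round(fleiss_kappa(answers), 3))
```

    Kappa near 1 indicates the shared understanding seen for some collection and deletion questions; values near 0 (agreement no better than chance) correspond to the ambiguous data-sharing language where even experts diverged.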

    Very-high-energy observations of the binaries V 404 Cyg and 4U 0115+634 during giant X-ray outbursts

    Transient X-ray binaries produce major outbursts in which the X-ray flux can increase over the quiescent level by factors as large as 10^7. The low-mass X-ray binary V 404 Cyg and the high-mass system 4U 0115+634 underwent such major outbursts in June and October 2015, respectively. We present here observations at energies above hundreds of GeV with the VERITAS observatory taken during some of the brightest X-ray activity ever observed from these systems. No gamma-ray emission has been detected by VERITAS in 2.5 hours of observations of the microquasar V 404 Cyg from 2015 June 20-21. The upper limit derived from these observations on the gamma-ray flux above 200 GeV, F < 4.4 x 10^-12 cm^-2 s^-1, corresponds to a tiny fraction (about 10^-6) of the Eddington luminosity of the system, in stark contrast to that seen in the X-ray band. No gamma rays have been detected during observations of 4U 0115+634 in the period of major X-ray activity in October 2015. The flux upper limit derived from our observations is F < 2.1 x 10^-12 cm^-2 s^-1 for gamma rays above 300 GeV, setting an upper limit on the ratio of gamma-ray to X-ray luminosity of less than 4%. Comment: Accepted for publication in the Astrophysical Journal

    Evolution of Massive Haloes in non-Gaussian Scenarios

    We have performed high-resolution cosmological N-body simulations of a concordance LCDM model to study the evolution of virialized dark matter haloes in the presence of primordial non-Gaussianity. Following a standard procedure, departures from Gaussianity are modeled through a term quadratic in the Gaussian primordial gravitational potential, characterized by a dimensionless non-linearity strength parameter f_NL. We find that the halo mass function and its redshift evolution closely follow the analytic predictions of Matarrese et al. (2000). The existence of precise analytic predictions makes the observation of rare, massive objects at large redshift an even more attractive test to detect primordial non-Gaussian features in the large-scale structure of the universe. Comment: 7 pages, 3 figures, submitted to MNRAS
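
    The quadratic model enters only through the primordial potential, Phi = phi + f_NL (phi^2 - <phi^2>) with phi Gaussian, so even a one-point sketch shows its signature: positive f_NL skews the potential distribution. The field amplitude and f_NL value below are toy choices for illustration, not the simulation parameters.

```python
# One-point illustration of the f_NL quadratic transform: apply
# Phi = phi + f_NL (phi^2 - <phi^2>) to Gaussian draws and check that the
# resulting distribution is positively skewed for f_NL > 0.
import random
import statistics

random.seed(42)
f_nl = 100.0
sigma_phi = 1e-3                      # toy r.m.s. of the Gaussian potential

phi = [random.gauss(0.0, sigma_phi) for _ in range(200_000)]
var = statistics.pvariance(phi)
Phi = [p + f_nl * (p * p - var) for p in phi]

def skewness(xs):
    mu = statistics.fmean(xs)
    sd = statistics.pstdev(xs)
    return statistics.fmean([((x - mu) / sd) ** 3 for x in xs])

print(skewness(phi), skewness(Phi))   # Gaussian ~0; f_NL > 0 gives positive skew
```

    To leading order the induced skewness scales as 6 f_NL times the potential r.m.s., which is why rare, massive haloes, which form from the extreme positive tail, are such a sensitive probe.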

    Measurement of Cosmic-ray Electrons at TeV Energies by VERITAS

    Cosmic-ray electrons and positrons (CREs) at GeV-TeV energies are a unique probe of our local Galactic neighborhood. CREs lose energy rapidly via synchrotron radiation and inverse-Compton scattering processes while propagating within the Galaxy, and these losses limit their propagation distance. For electrons with TeV energies, the limit is on the order of a kiloparsec. Within that distance there are only a few known astrophysical objects capable of accelerating electrons to such high energies. It is also possible that the CREs are the products of the annihilation or decay of heavy dark matter (DM) particles. VERITAS, an array of imaging air Cherenkov telescopes in southern Arizona, USA, is primarily utilized for gamma-ray astronomy, but also simultaneously collects CREs during all observations. We describe our methods of identifying CREs in VERITAS data and present an energy spectrum, extending from 300 GeV to 5 TeV, obtained from approximately 300 hours of observations. A single power-law fit is ruled out by the VERITAS data. We find that the spectrum of CREs is consistent with a broken power law, with a break energy at 710 ± 40 (stat) ± 140 (syst) GeV. Comment: 17 pages, 2 figures, accepted for publication in PR
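
    A broken power law of the kind preferred by the data can be written as a differential flux with two spectral indices joined continuously at the break. The break energy below is the measured 710 GeV; the two indices and the normalisation are placeholder values of this sketch, not the VERITAS best-fit parameters.

```python
# Broken power law continuous at the break energy: below E_BREAK the flux
# falls as E^-GAMMA_1, above it as E^-GAMMA_2, matched at the break.

E_BREAK = 710.0        # GeV, break energy from the measurement
GAMMA_1 = 3.1          # index below the break (assumed placeholder)
GAMMA_2 = 4.1          # index above the break (assumed placeholder)
NORM = 1.0             # arbitrary flux normalisation at the break

def dnde(energy_gev):
    """Differential flux dN/dE of a broken power law, continuous at E_BREAK."""
    x = energy_gev / E_BREAK
    index = GAMMA_1 if energy_gev < E_BREAK else GAMMA_2
    return NORM * x ** (-index)

# Continuity check across the break:
print(dnde(709.999), dnde(710.0), dnde(710.001))
```

    Ruling out a single power law amounts to showing that no single index fits both sides of the break within the statistical and systematic uncertainties.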

    The Sunyaev-Zel'dovich effects from a cosmological hydrodynamical simulation: large-scale properties and correlation with the soft X-ray signal

    Using the results of a cosmological hydrodynamical simulation of the concordance LambdaCDM model, we study the global properties of the Sunyaev-Zel'dovich (SZ) effects, considering both the thermal (tSZ) and the kinetic (kSZ) component. The simulation follows gravitation and gas dynamics and also includes several physical processes that affect the baryonic component, such as a simple reionization scenario, radiative cooling, star formation and supernova feedback. Starting from the outputs of the simulation, we create mock maps of the SZ signals due to the large-scale structures of the Universe integrated in the range 0 < z < 6. We predict that the Compton y-parameter has an average value of (1.19 +/- 0.32) x 10^-6 and is lognormally distributed on the sky; half of the whole signal comes from z < 1 and about 10 per cent from z > 2. The Doppler b-parameter shows approximately a normal distribution with vanishing mean value and a standard deviation of 1.6 x 10^-6, with a significant contribution from high-redshift (z > 3) gas. We find that the tSZ effect is expected to dominate over the primary CMB anisotropies for l >~ 3000 in the Rayleigh-Jeans limit, while interestingly the kSZ effect dominates at all frequencies at very high multipoles (l >~ 7 x 10^4). We also analyse the cross-correlation between the two SZ effects and the soft (0.5-2 keV) X-ray emission from the intergalactic medium, and we find a strong correlation between the three signals, especially between the X-ray emission and the tSZ effect (r_l ~ 0.8-0.9) at all angular scales. Comment: 12 pages, 15 figures. Accepted for publication in MNRAS. Minor changes, added references
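
    The thermal SZ signal is governed by the Compton y-parameter, y = (sigma_T k_B / m_e c^2) * integral of n_e T_e along the line of sight. A back-of-the-envelope estimate for a toy uniform cluster slab (the electron density, temperature, and path length below are assumed round numbers, not simulation values) shows the characteristic order of magnitude.

```python
# Compton y for a uniform slab: y = (sigma_T k_B T_e / m_e c^2) * n_e * L.
# Physical constants in SI; the cluster parameters are toy assumptions.

SIGMA_T = 6.6524e-29     # Thomson cross-section, m^2
K_B = 1.3807e-23         # Boltzmann constant, J/K
ME_C2 = 8.1871e-14       # electron rest energy, J
MPC = 3.0857e22          # megaparsec, m

def compton_y(n_e, t_e, path_length):
    """y-parameter for constant electron density n_e (m^-3), temperature
    t_e (K), over a line-of-sight path_length (m)."""
    return SIGMA_T * K_B * t_e / ME_C2 * n_e * path_length

y = compton_y(n_e=1.0e3, t_e=1.0e8, path_length=1.0 * MPC)  # toy cluster core
print(f"{y:.2e}")
```

    A dense, hot cluster core gives y of a few times 10^-5; averaged over mostly empty lines of sight, the sky mean drops to the 10^-6 level quoted above.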