
    Product Variety and Demand Uncertainty

    We show that demand uncertainty leads to vertical product differentiation even when consumers are homogeneous. When a firm anticipates that its inventory or capacity may not be fully utilized, product variety can reduce its expected costs of excess capacity. When the firm offers a continuum of product varieties, the highest-quality product has the highest absolute margin but the lowest percentage margin, while the lowest-quality product has the highest percentage margin but the lowest absolute margin. We derive these results in both a monopoly model and a variety of competitive models. We conclude with a discussion of empirical predictions and a brief review of supporting evidence from marketing studies.
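
    The contrast between absolute and percentage margins can be illustrated with a small numerical sketch (the prices and unit costs below are hypothetical, chosen only to illustrate the pattern the abstract describes; Python is used for the worked example):

        # Illustrative only: hypothetical prices and unit costs for a quality ladder,
        # showing how the highest-quality product can carry the largest absolute
        # margin while the lowest-quality product carries the largest percentage margin.
        products = [
            {"name": "high quality", "price": 100.0, "unit_cost": 80.0},
            {"name": "mid quality", "price": 60.0, "unit_cost": 45.0},
            {"name": "low quality", "price": 20.0, "unit_cost": 10.0},
        ]

        for p in products:
            absolute_margin = p["price"] - p["unit_cost"]
            percentage_margin = absolute_margin / p["price"]
            print(f"{p['name']}: absolute margin = {absolute_margin:.0f}, "
                  f"percentage margin = {percentage_margin:.0%}")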

    On the measurement of a weak classical force coupled to a quantum-mechanical oscillator. I. Issues of principle

    The monitoring of a quantum-mechanical harmonic oscillator on which a classical force acts is important in a variety of high-precision experiments, such as the attempt to detect gravitational radiation. This paper reviews the standard techniques for monitoring the oscillator, and introduces a new technique which, in principle, can determine the details of the force with arbitrary accuracy, despite the quantum properties of the oscillator. The standard method for monitoring the oscillator is the "amplitude-and-phase" method (position or momentum transducer with output fed through a narrow-band amplifier). The accuracy obtainable by this method is limited by the uncertainty principle ("standard quantum limit"). To do better requires a measurement of the type which Braginsky has called "quantum nondemolition." A well-known quantum nondemolition technique is "quantum counting," which can detect an arbitrarily weak classical force, but which cannot provide good accuracy in determining its precise time dependence. This paper considers extensively a new type of quantum nondemolition measurement—a "back-action-evading" measurement of the real part X_1 (or the imaginary part X_2) of the oscillator's complex amplitude. In principle X_1 can be measured "arbitrarily quickly and arbitrarily accurately," and a sequence of such measurements can lead to an arbitrarily accurate monitoring of the classical force. The authors describe explicit Gedanken experiments which demonstrate that X_1 can be measured arbitrarily quickly and arbitrarily accurately. In these experiments the measuring apparatus must be coupled to both the position (position transducer) and the momentum (momentum transducer) of the oscillator, and both couplings must be modulated sinusoidally. For a given measurement time the strength of the coupling determines the accuracy of the measurement; for arbitrarily strong coupling the measurement can be arbitrarily accurate. The "momentum transducer" is constructed by combining a "velocity transducer" with a "negative capacitor" or "negative spring." The modulated couplings are provided by an external, classical generator, which can be realized as a harmonic oscillator excited in an arbitrarily energetic, coherent state. One can avoid the use of two transducers by making "stroboscopic measurements" of X_1, in which one measures position (or momentum) at half-cycle intervals. Alternatively, one can make "continuous single-transducer" measurements of X_1 by modulating appropriately the output of a single transducer (position or momentum), and then filtering the output to pick out the information about X_1 and reject information about X_2. Continuous single-transducer measurements are useful in the case of weak coupling. In this case long measurement times are required to achieve good accuracy, and continuous single-transducer measurements are almost as good as perfectly coupled two-transducer measurements. Finally, the authors develop a theory of quantum nondemolition measurement for arbitrary systems. This paper (Paper I) concentrates on issues of principle; a sequel (Paper II) will consider issues of practice.
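
    For reference, the quadrature amplitudes X_1 and X_2 referred to above are conventionally defined as follows (a standard sketch in the usual notation, with m the oscillator mass and ω its angular frequency; not a quotation from the paper):

        % Quadrature amplitudes of a harmonic oscillator (conventional definitions).
        \begin{align}
          X_1 &= x\cos\omega t - \frac{p}{m\omega}\sin\omega t, &
          X_2 &= x\sin\omega t + \frac{p}{m\omega}\cos\omega t .
        \end{align}
        % Both are conserved under free evolution but do not commute,
        % which is what forces the trade-off exploited by back-action evasion:
        \begin{equation}
          [X_1, X_2] = \frac{i\hbar}{m\omega}
          \quad\Longrightarrow\quad
          \Delta X_1\,\Delta X_2 \ge \frac{\hbar}{2m\omega} .
        \end{equation}

    Coupling the meter to X_1 alone funnels the measurement back action into the unmonitored X_2, which is why X_1 can in principle be tracked with accuracy better than the standard quantum limit.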

    Identifying Patients Suitable for Discharge After a Single-Presentation High-Sensitivity Troponin Result: A Comparison of Five Established Risk Scores and Two High-Sensitivity Assays.

    STUDY OBJECTIVE: We compare the ability of 5 established risk scores to identify patients with suspected acute coronary syndromes who are suitable for discharge after a modified single-presentation high-sensitivity troponin result. METHODS: This was a prospective observational study conducted in a UK district general hospital emergency department. Consecutive adults with suspected acute coronary syndrome were recruited when attending physicians determined that evaluation with serial troponin testing was required. Index tests were definitions of low risk applied to the modified Goldman, Thrombolysis in Myocardial Infarction (TIMI), Global Registry of Acute Coronary Events (GRACE), History, ECG, Age, Risk Factors, Troponin (HEART), and Vancouver Chest Pain Rule risk scores, incorporating either high-sensitivity troponin T or I results. The endpoint was acute myocardial infarction within 30 days. A test sensitivity threshold for acute myocardial infarction of 98% was chosen. Clinical utility was defined as a negative predictive value greater than or equal to 99.5% and identification of greater than 30% of patients as suitable for discharge. RESULTS: Nine hundred fifty-nine patients underwent high-sensitivity troponin T analysis and 867 underwent high-sensitivity troponin I analysis. In the high-sensitivity troponin T group, 79 of 959 (8.2%) had an acute myocardial infarction, as did 66 of 867 (7.6%) in the high-sensitivity troponin I group. Two risk scores (GRACE <80 and HEART ≤3) did not have the potential to achieve a sensitivity of 98% with high-sensitivity troponin T, nor did 3 scores (Goldman ≤1, TIMI ≤1, and GRACE <80) with high-sensitivity troponin I. A TIMI score of 0 or less than or equal to 1 and a modified Goldman score less than or equal to 1 with high-sensitivity troponin T, and a TIMI score of 0 and a HEART score less than or equal to 3 with high-sensitivity troponin I, had the potential to achieve a negative predictive value greater than or equal to 99.5% while identifying greater than 30% of patients as suitable for immediate discharge. CONCLUSION: With established risk scores, it may be possible to identify greater than 30% of patients as suitable for discharge, with a negative predictive value greater than or equal to 99.5% for the diagnosis of acute myocardial infarction, using a single high-sensitivity troponin test result at presentation. There is variation in high-sensitivity troponin assays, which may have implications for introducing rapid rule-out protocols.
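
    The clinical-utility criteria quoted above (sensitivity of at least 98%, negative predictive value of at least 99.5%, and more than 30% of patients identified as suitable for discharge) reduce to simple arithmetic on a 2x2 classification. A minimal Python sketch with invented counts (not study data) is shown below:

        # Minimal sketch: check a rule-out strategy against the utility criteria
        # described in the abstract. All counts are hypothetical, not study data.

        def evaluate_rule_out(tp, fn, fp, tn):
            """tp/fn: MI patients flagged high/low risk; fp/tn: non-MI patients flagged high/low risk."""
            sensitivity = tp / (tp + fn)                  # MIs correctly kept for further testing
            npv = tn / (tn + fn)                          # low-risk patients who truly had no MI
            discharge_fraction = (tn + fn) / (tp + fn + fp + tn)
            return sensitivity, npv, discharge_fraction

        # Hypothetical cohort: 80 MIs and 920 non-MIs
        sens, npv, frac = evaluate_rule_out(tp=79, fn=1, fp=560, tn=360)
        meets_criteria = sens >= 0.98 and npv >= 0.995 and frac > 0.30
        print(f"sensitivity={sens:.1%}, NPV={npv:.1%}, discharged={frac:.1%}, clinically useful={meets_criteria}")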

    Classical phase-space descriptions of continuous-variable teleportation

    Because all the quantum states involved in teleportation of Gaussian states via the standard continuous-variable teleportation protocol have nonnegative Wigner functions, the process admits a local realistic phase-space description. This includes the coherent states teleported in experiments to date. We extend the phase-space description to teleportation of non-Gaussian states using the standard protocol and conclude that teleportation of non-Gaussian states with a fidelity of 2/3 is a "gold standard" for this kind of teleportation.
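
    For context, the 2/3 figure can be located against a standard result for the same protocol applied to coherent-state inputs (a textbook sketch, not taken from the paper): with two-mode squeezing parameter r and unit gain, the average teleportation fidelity is

        % Coherent-state teleportation fidelity of the standard unit-gain protocol,
        % as a function of the two-mode squeezing parameter r.
        \begin{equation}
          F = \frac{1}{1 + e^{-2r}} ,
        \end{equation}
        % so F = 1/2 with no entanglement (r = 0) and F = 2/3 at roughly 3 dB of
        % two-mode squeezing (e^{-2r} = 1/2).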

    Editorial: Leading People - Managing Organizations: Contemporary Public Health Leadership

    Effectively leading people engaged in the practice of public health has never been more critical than in the early years of the twenty-first century. Likewise, effectively managing the organizations in which these individuals practice the various professional disciplines of public health has become increasingly important and difficult. Taken together, leading people and managing public health organizations require well-educated and appropriately trained public health leaders and managers. Although leadership is often viewed as one of the key attributes of management, not every great manager will be a great leader, and vice versa. While some leaders may be born with the inherent skills to lead, most effective leaders develop the requisite skills through education, additional training, and practice. Our aim is to focus the attention of public health practitioners on the importance of effectively leading public health organizations. Public health managers should recognize that their most valuable resource is the people they lead. The eBook Leading People – Managing Organizations comprises opinion articles expressing their authors' views on the need for effective public health leaders; perspective articles presenting their authors' understanding of how leadership may be applied in various situations; methods articles demonstrating how public health leadership may be applied; and original research articles that establish the role of public health leadership research.

    Simulation of a storm event in marine microcosms

    A storm of moderate intensity lasting 14 hours was simulated in three of the Marine Ecosystems Research Laboratory (MERL) microcosms by increasing the intensity of microcosm mixing. The simulation resuspended ~0.3 cm of sediment and increased suspended particulate loads by 2 orders of magnitude. Concurrently, the concentrations of metals and nutrients increased in the water column...

    Do walking strategies to increase physical activity reduce reported sitting in workplaces: a randomized control trial

    Background: Interventions designed to increase workplace physical activity may not automatically reduce high volumes of sitting, a behaviour independently linked to chronic diseases such as obesity and type II diabetes. This study compared the impact of two different walking strategies on step counts and reported sitting times. Methods: Participants were white-collar university employees (n = 179; age 41.3 ± 10.1 years; 141 women) who volunteered and undertook a standardised ten-week intervention at three sites. Pre-intervention step counts (Yamax SW-200) and self-reported sitting times were measured over five consecutive workdays. Using pre-intervention step counts, employees at each site were randomly allocated to a control group (n = 60; maintain normal behaviour), a route-based walking group (n = 60; at least 10 minutes of sustained walking each workday) or an incidental walking group (n = 59; walking built into workday tasks). Workday step counts and reported sitting times were re-assessed at the beginning, midpoint and endpoint of the intervention, and group mean ± SD steps/day and reported sitting times for the pre-intervention and intervention measurement points were compared using a mixed factorial ANOVA; paired-sample t-tests were used for follow-up simple-effect analyses. Results: A significant interaction effect (F = 3.5; p < 0.003) was found between group and step counts. Daily steps decreased for controls over the intervention period (-391 steps/day) and increased for the route (968 steps/day; t = 3.9, p < 0.001) and incidental (699 steps/day; t = 2.5, p < 0.014) groups. There were no significant changes in reported sitting times, but average values did decrease relative to the control (route group = 7 minutes/day; incidental group = 15 minutes/day). Reductions were most evident for the incidental group in the first week of the intervention, where reported sitting decreased by an average of 21 minutes/day (t = 1.9; p < 0.057). Conclusion: Compared to controls, both route-based and incidental walking increased physical activity in white-collar employees. Our data suggest that workplace walking, particularly through incidental movement, also has the potential to decrease employee sitting times, but there is a need for ongoing research using concurrent and objective measures of sitting, standing and walking.
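
    A minimal sketch of the follow-up simple-effect analysis described above (a paired-sample t-test on workday step counts before versus during the intervention), written in Python with SciPy; the simulated arrays are placeholders, not study data:

        # Minimal sketch of a paired-sample t-test for pre- vs during-intervention
        # workday step counts in one group. Values are simulated placeholders.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        pre_steps = rng.normal(7500, 1500, size=60)               # baseline steps/day
        during_steps = pre_steps + rng.normal(900, 800, size=60)  # simulated increase

        t_stat, p_value = stats.ttest_rel(during_steps, pre_steps)
        mean_change = (during_steps - pre_steps).mean()
        print(f"mean change = {mean_change:.0f} steps/day, t = {t_stat:.2f}, p = {p_value:.3f}")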

    Qubit metrology and decoherence

    Quantum properties of the probes used to estimate a classical parameter can be used to attain accuracies that beat the standard quantum limit. When qubits are used to construct a quantum probe, it is known that initializing n qubits in an entangled "cat state," rather than in a separable state, can improve the measurement uncertainty by a factor of 1/√n. We investigate how the measurement uncertainty is affected when the individual qubits in a probe are subjected to decoherence. In the face of such decoherence, we regard the rate R at which qubits can be generated and the total duration τ of a measurement as fixed resources, and we determine the optimal use of entanglement among the qubits and the resulting optimal measurement uncertainty as functions of R and τ.
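
    The 1/√n improvement quoted above is the usual comparison between the shot-noise (standard quantum) limit and the Heisenberg limit for phase estimation with n qubits; in conventional notation (a standard sketch, not a quotation from the paper):

        % Phase-estimation uncertainty with n qubits: separable probe
        % (standard quantum limit) versus an entangled cat state.
        \begin{equation}
          \delta\varphi_{\mathrm{sep}} \sim \frac{1}{\sqrt{n}} ,
          \qquad
          \delta\varphi_{\mathrm{cat}} \sim \frac{1}{n} ,
          \qquad
          \frac{\delta\varphi_{\mathrm{cat}}}{\delta\varphi_{\mathrm{sep}}} \sim \frac{1}{\sqrt{n}} .
        \end{equation}

    With the generation rate R and total duration τ treated as fixed resources, roughly n = Rτ qubits are available in all, and the question addressed in the paper is how decoherence changes the optimal way to deploy them.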

    Direct neutron capture of 48Ca at kT = 52 keV

    The neutron capture cross section of 48Ca was measured relative to the known gold cross section at kT = 52 keV using the fast cyclic activation technique. The experiment was performed at the Van de Graaff accelerator of the Universität Tübingen. The new experimental result is in good agreement with a calculation using the direct capture model. The 1/v behaviour of the capture cross section at thermonuclear energies is confirmed, and the adopted reaction rate, which is based on several previous experimental investigations, remains unchanged.
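
    The statement that the adopted reaction rate remains unchanged rests on a standard property of 1/v cross sections, sketched here in conventional notation (not a quotation from the paper):

        % Stellar reaction rate per particle pair as a thermal average over the
        % Maxwell-Boltzmann velocity distribution phi(v).
        \begin{equation}
          \langle \sigma v \rangle = \int_0^\infty \sigma(v)\, v\, \phi(v)\, \mathrm{d}v ,
          \qquad
          \int_0^\infty \phi(v)\, \mathrm{d}v = 1 .
        \end{equation}
        % If sigma(v) is proportional to 1/v, the product sigma(v)*v is constant and
        % <sigma v> is independent of temperature, so confirming 1/v behaviour at
        % kT = 52 keV leaves the previously adopted rate unchanged.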