
    Enrollee Mix, Treatment Intensity, and Cost in Competing Indemnity and HMO Plans

    We examine why managed care plans are less expensive than traditional indemnity insurance plans. Our database consists of the insurance experiences of over 200,000 state and local employees in Massachusetts and their families, who are insured in a single pool. Within this group, average HMO costs are 40 percent below those of the indemnity plan. We evaluate cost differences for 8 conditions representing over 10 percent of total health expenditures. They are: heart attacks, cancers (breast, cervical, colon, prostate), diabetes (type I and II), and live births. For each condition, we identify the portions of the cost differential arising from differences in treatment intensity, enrollee mix, and prices paid for the same treatment. Surprisingly, treatment intensity differs hardly at all between the HMOs and the indemnity plan. That is, relative to their fee-for-service competitor, HMOs do not curb the use of expensive treatments. Across the 8 conditions, roughly half of the HMO cost savings is due to the lower incidence of the diseases in the HMOs. Virtually all of the remaining savings come because HMOs pay lower prices for the same treatment.
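The decomposition the abstract describes can be sketched as a simple multiplicative (log) accounting identity: per-enrollee cost is incidence times treatment intensity times price, so the log cost gap splits exactly into those three components. The numbers below are purely illustrative assumptions, not figures from the study.

```python
import math

def decompose(incidence_a, intensity_a, price_a,
              incidence_b, intensity_b, price_b):
    """Log-decompose the per-enrollee cost gap of plan A vs plan B
    into incidence (enrollee mix), intensity, and price components."""
    cost_a = incidence_a * intensity_a * price_a
    cost_b = incidence_b * intensity_b * price_b
    total = math.log(cost_a / cost_b)
    parts = {
        "incidence": math.log(incidence_a / incidence_b),
        "intensity": math.log(intensity_a / intensity_b),
        "price": math.log(price_a / price_b),
    }
    # Shares of the total cost gap attributable to each component;
    # they sum to 1 by construction of the log identity.
    shares = {k: v / total for k, v in parts.items()}
    return cost_a, cost_b, shares

# Hypothetical numbers mimicking the abstract's finding: the HMO has
# lower disease incidence and pays lower prices, but identical
# treatment intensity, so the intensity share of the gap is zero.
cost_hmo, cost_indemnity, shares = decompose(
    incidence_a=0.008, intensity_a=1.0, price_a=7000,   # HMO
    incidence_b=0.012, intensity_b=1.0, price_b=9500)   # indemnity
```

With equal intensities the intensity term drops out, and the remaining gap splits between incidence and price, mirroring the roughly half-and-half split the abstract reports.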

    The Continuing Struggle for Automotive Safety


    A comparative fMRI meta-analysis of altruistic and strategic decisions to give

    The decision to share resources is fundamental for cohesive societies. Humans can be motivated to give for many reasons. Some generosity incurs a definite cost, with no extrinsic reward to the act, but instead provides intrinsic satisfaction (labelled here as 'altruistic' giving). Other giving behaviours are done with the prospect of improving one's own situation via reciprocity, reputation, or public good (labelled here as 'strategic' giving). These contexts differ in the source, certainty, and timing of rewards as well as the inferences made about others' mental states. We executed a combined statistical map and coordinate-based fMRI meta-analysis of decisions to give (36 studies, 1150 participants). Methods included a novel approach for accommodating variable signal dropout between studies in meta-analysis. Results reveal consistent, cross-paradigm neural correlates of each decision type, commonalities, and informative differences. Relative to being selfish, altruistic and strategic giving activate overlapping reward networks. However, strategic decisions showed greater activity in striatal regions than altruistic choices. Altruistic giving, more than strategic, activated subgenual anterior cingulate cortex (sgACC). Ventromedial prefrontal cortex (vmPFC) is consistently involved during generous decisions, and processing across a posterior-to-anterior axis differentiates the altruistic/strategic context. Posterior vmPFC was preferentially recruited during altruistic decisions. Regions of the 'social brain' showed distinct patterns of activity between choice types, reflecting the different use of theory of mind in the two contexts. We identify consistent neural correlates of decisions to give, and show that many depend on the source of incentives.

    Ultrahigh precision cosmology from gravitational waves

    We show that the Big Bang Observer (BBO), a proposed space-based gravitational-wave (GW) detector, would provide ultraprecise measurements of cosmological parameters. By detecting ∼3×10^5 compact-star binaries, and utilizing them as standard sirens, BBO would determine the Hubble constant to ∼0.1%, and the dark-energy parameters w_0 and w_a to ∼0.01 and ∼0.1, respectively. BBO’s dark-energy figure-of-merit would be approximately an order of magnitude better than all other proposed, dedicated dark-energy missions. To date, BBO has been designed with the primary goal of searching for gravitational waves from inflation, down to the level Ω_(GW)∼10^(-17); this requirement determines BBO’s frequency band (deci-Hz) and its sensitivity requirement (strain measured to ∼10^(-24)). To observe an inflationary GW background, BBO would first have to detect and subtract out ∼3×10^5 merging compact-star binaries, out to a redshift z ∼ 5. It is precisely this carefully measured foreground which would enable high-precision cosmology. BBO would determine the luminosity distance to each binary to percent-level accuracy. In addition, BBO’s angular resolution would be sufficient to uniquely identify the host galaxy for the majority of binaries; a coordinated optical/infrared observing campaign could obtain the redshifts. Combining the GW-derived distances and the electromagnetically-derived redshifts for such a large sample of objects, out to such high redshift, naturally leads to extraordinarily tight constraints on cosmological parameters. We emphasize that such “standard siren” measurements of cosmology avoid many of the systematic errors associated with other techniques: GWs offer a physics-based, absolute measurement of distance. In addition, we show that BBO would also serve as an exceptionally powerful gravitational-lensing mission, and we briefly discuss other astronomical uses of BBO, including providing an early warning system for all short/hard gamma-ray bursts.
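As a toy illustration of the standard-siren idea (not the BBO analysis, which reaches z ∼ 5 and fits a full dark-energy model), at low redshift the Hubble law d_L ≈ cz/H0 means GW-measured distances combined with electromagnetically measured redshifts pin down H0 directly. All numbers below are synthetic assumptions.

```python
import random

C_KM_S = 299792.458  # speed of light, km/s
H0_TRUE = 70.0       # km/s/Mpc, assumed value for the simulation

random.seed(0)
# Synthetic low-redshift binaries with EM-derived redshifts
zs = [0.01 + 0.001 * i for i in range(100)]
# GW "standard siren" luminosity distances with ~1% measurement scatter
dists = [C_KM_S * z / H0_TRUE * (1 + random.gauss(0, 0.01)) for z in zs]

# Least-squares slope of d_L vs z through the origin: d_L = (c/H0) * z
slope = sum(d * z for d, z in zip(dists, zs)) / sum(z * z for z in zs)
h0_est = C_KM_S / slope  # recovered Hubble constant, km/s/Mpc
```

Averaging over many sirens beats down the per-source distance error, which is the mechanism behind the sub-percent H0 constraint quoted in the abstract.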

    Trustworthy Experimentation Under Telemetry Loss

    Failure to accurately measure the outcomes of an experiment can lead to bias and incorrect conclusions. Online controlled experiments (also known as A/B tests) are increasingly being used to make decisions to improve websites as well as mobile and desktop applications. We argue that loss of telemetry data (during upload or post-processing) can skew the results of experiments, leading to loss of statistical power and inaccurate or erroneous conclusions. By systematically investigating the causes of telemetry loss, we argue that it is not practical to entirely eliminate it. Consequently, experimentation systems need to be robust to its effects. Furthermore, we note that it is nontrivial to measure the absolute level of telemetry loss in an experimentation system. In this paper, we take a top-down approach towards solving this problem. We motivate the impact of loss qualitatively using experiments in real applications deployed at scale, and formalize the problem by presenting a theoretical breakdown of the bias introduced by loss. Based on this foundation, we present a general framework for quantitatively evaluating the impact of telemetry loss, and present two solutions to measure the absolute levels of loss. This framework is used by well-known applications at Microsoft, with millions of users and billions of sessions. These general principles can be adopted by any application to improve the overall trustworthiness of experimentation and data-driven decision making. Comment: Proceedings of the 27th ACM International Conference on Information and Knowledge Management, October 2018.
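The core hazard the abstract formalizes, telemetry loss correlated with the outcome, can be demonstrated with a toy simulation (synthetic data, not Microsoft's framework): when logs from low-outcome treatment sessions fail to upload, the observed treatment effect is inflated even though the true effect is zero.

```python
import random
from statistics import fmean

random.seed(1)

def simulate(loss_correlated):
    """Return the observed treatment-minus-control mean difference.
    The true treatment effect is zero by construction."""
    control, treatment = [], []
    for _ in range(100_000):
        y_c = random.gauss(10.0, 2.0)  # control session outcome
        y_t = random.gauss(10.0, 2.0)  # treatment session outcome
        control.append(y_c)
        # Hypothetical loss mechanism: treatment sessions with low
        # engagement fail to upload telemetry half the time.
        if loss_correlated and y_t < 9.0 and random.random() < 0.5:
            continue  # this session's outcome is never observed
        treatment.append(y_t)
    return fmean(treatment) - fmean(control)

unbiased = simulate(loss_correlated=False)  # ≈ 0: no loss, no bias
biased = simulate(loss_correlated=True)     # > 0: spurious "effect"
```

Because the lost sessions are a non-random, low-outcome slice of the treatment arm, the surviving sample's mean shifts upward, which is exactly the kind of skew the paper argues experimentation systems must detect and correct for.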

    Time-Restricted Feeding Improves Circadian Dysfunction as well as Motor Symptoms in the Q175 Mouse Model of Huntington's Disease.

    Huntington's disease (HD) patients suffer from a progressive neurodegeneration that results in cognitive, psychiatric, cardiovascular, and motor dysfunction. Disturbances in sleep/wake cycles are common among HD patients, with reports of delayed sleep onset, frequent bedtime awakenings, and fatigue during the day. The heterozygous Q175 mouse model of HD has been shown to phenocopy many HD core symptoms, including circadian dysfunction. Because circadian dysfunction manifests early in the disease in both patients and mouse models, we sought to determine whether an early intervention that improves circadian rhythmicity can benefit HD patients and delay disease progression. We determined the effects of time-restricted feeding (TRF) on the Q175 mouse model. At six months of age, the animals were divided into two groups: ad libitum (ad lib) and TRF. The TRF-treated Q175 mice were exposed to a 6-h feeding/18-h fasting regimen that was designed to be aligned with the middle of the time when mice are normally active. After three months of treatment (when mice reached the early disease stage), the TRF-treated Q175 mice showed improvements in their locomotor activity rhythm and sleep awakening time. Furthermore, we found improved heart rate variability (HRV), suggesting that their autonomic nervous system dysfunction was improved. Importantly, treated Q175 mice exhibited improved motor performance compared to untreated Q175 controls, and the motor improvements were correlated with improved circadian output. Finally, using NanoString gene expression assays, we found that the expression of several HD-relevant markers was restored to WT levels in the striatum of the treated mice.

    Multi-Point Interferometric Rayleigh Scattering using Dual-Pass Light Recirculation

    This paper describes, for the first time, an interferometric Rayleigh scattering system using dual-pass light recirculation (IRS-LR) capable of simultaneously measuring, at multiple points, two orthogonal components of flow velocity in combustion flows using single-shot laser probing. An additional optical path containing the interferometer input mirror, a quarter-wave plate, a polarization-dependent beam combiner, and a high-reflectivity mirror partially recirculates the light that is rejected by the interferometer. Temporally and spatially resolved acquisitions of Rayleigh spectra in a large-scale combustion-heated supersonic axisymmetric jet were performed to demonstrate the technique. Recirculation of the Rayleigh-scattered light increases the number of photons analyzed by the system by up to a factor of 1.8 compared with previous configurations. This is equivalent to performing measurements with less laser energy, or to performing measurements with the previous system in gas flows at higher temperatures.

    Spatially and Temporally-Resolved Multi-Parameter Interferometric Rayleigh Scattering

    A novel approach to simultaneously measure the translational temperature, bulk velocity, and density in gases by collecting, referencing, and analyzing nanosecond time-scale Rayleigh-scattered light from molecules is described. A narrow-band pulsed laser source is used to probe two widely separated measurement locations, one of which is used for reference. The elastically scattered photons containing information from both measurement locations are collected at the same time and analyzed spectrally using a planar Fabry-Perot interferometer. A practical means of referencing the velocity measurement to the laser frequency, and the density and temperature measurements to the reference location maintained at constant properties, is described. To demonstrate the technique, single-shot spectra of elastically scattered light are obtained in a near-zero-velocity H2-air Hencken burner flame and simultaneously in an N2-filled gas cell. A simplified Gaussian model is fitted to the scattered-light spectra to obtain the flame properties. Corrections to this model are applied at lower gas temperatures, where the simplified Gaussian approximation is no longer suitable. The near-zero measured velocity as a function of the measured flame temperature, and a comparison of the measured flame density and temperature with the perfect gas law, are presented.
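The simplified Gaussian spectral model the abstract mentions can be sketched as follows: the Doppler shift of the spectrum's center tracks the bulk velocity, while its width scales as the square root of temperature. The wavelength, 90° scattering geometry, and moment-based recovery below are illustrative assumptions, not the paper's instrument parameters.

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
M_N2 = 4.65e-26      # mass of an N2 molecule, kg
LAMBDA = 532e-9      # assumed probe wavelength, m
THETA = 90.0         # assumed scattering angle, degrees
# Magnitude of the scattering wavevector, |k_s - k_i|
K = 2 * math.sin(math.radians(THETA) / 2) * 2 * math.pi / LAMBDA

def synth_spectrum(temp_k, velocity_ms, n=2001):
    """Synthesize an idealized Gaussian Rayleigh spectrum (freq, amp)."""
    sigma = (K / (2 * math.pi)) * math.sqrt(K_B * temp_k / M_N2)  # Hz
    mu = (K / (2 * math.pi)) * velocity_ms  # bulk Doppler shift, Hz
    freqs = [mu + sigma * (-4 + 8 * i / (n - 1)) for i in range(n)]
    amps = [math.exp(-0.5 * ((f - mu) / sigma) ** 2) for f in freqs]
    return freqs, amps

def fit_moments(freqs, amps):
    """Recover shift and width from spectral moments, then v and T."""
    total = sum(amps)
    mu = sum(f * a for f, a in zip(freqs, amps)) / total
    var = sum((f - mu) ** 2 * a for f, a in zip(freqs, amps)) / total
    velocity = 2 * math.pi * mu / K
    temp = M_N2 * (2 * math.pi * math.sqrt(var) / K) ** 2 / K_B
    return temp, velocity

# Near-zero-velocity flame at an assumed 1800 K, as in the Hencken demo
freqs, amps = synth_spectrum(temp_k=1800.0, velocity_ms=0.0)
temp_est, vel_est = fit_moments(freqs, amps)
```

A real spectrum also carries instrument broadening and, at lower temperatures, non-Gaussian (kinetic-regime) line shapes, which is why the paper applies corrections to the simple Gaussian model there.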