
    Responsiveness of the Eating Disorders Quality of Life Scale (EDQLS) in a longitudinal multi-site sample

    Background: In eating disorders (EDs), treatment outcome measurement has traditionally focused on symptom reduction rather than functioning or quality of life (QoL). The Eating Disorders Quality of Life Scale (EDQLS) was recently developed to allow measurement of broader outcomes. We examined the responsiveness of the EDQLS in a longitudinal multi-site study.
    Methods: The EDQLS and comparator generic QoL scales were administered in person at baseline, 3 months, and 6 months to 130 participants (mean age 25.6 years; range 14-60) in 12 treatment programs in four Canadian provinces. Total score differences across the time points and responsiveness were examined using both anchor- and distribution-based methods.
    Results: 98 (75%) and 85 (65%) responses were received at 3 and 6 months, respectively. No statistically significant differences were found between the baseline sample and those lost to follow-up on any measured characteristic. Mean EDQLS total scores increased from 110 (SD = 24) to 124.5 (SD = 29) at 3 months and 129 (SD = 28) at 6 months; the difference over time was tested using a general linear model (GLM) to account for repeated measurement (p < .001). Responsiveness was good overall (Cohen's d = .61 and .80) and was confirmed using anchor methods across 5 levels of self-reported improvement in health status (p < .001). Effect sizes across time were moderate or large for all age groups. Internal consistency (Cronbach's alpha = .96) held across measurement points, and patterns of responsiveness held across subscales. EDQLS responsiveness exceeded that of the Quality of Life Inventory and the Short Form-12 (mental and physical subscales), and was similar to that of the 16-dimension quality of life scale.
    Conclusions: The EDQLS is responsive to change in geographically diverse and clinically heterogeneous programs over a relatively short time period in adolescents and adults. It shows promise as an outcome measure for both research and clinical practice.
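    The reported effect sizes are roughly reproducible if the standardized mean change is divided by the baseline SD, which is one common convention in responsiveness analyses; the abstract does not state which denominator was used, so this is an assumption. A minimal sketch using the means and SD reported above:

```python
def cohens_d_change(mean_follow_up, mean_baseline, sd_baseline):
    """Standardized mean change, dividing by the baseline SD.
    (Which SD the study actually used is an assumption here.)"""
    return (mean_follow_up - mean_baseline) / sd_baseline

# EDQLS values as reported in the abstract (baseline 110, SD 24)
d_3mo = cohens_d_change(124.5, 110.0, 24.0)  # ~0.60, close to the reported .61
d_6mo = cohens_d_change(129.0, 110.0, 24.0)  # ~0.79, close to the reported .80
```

    The small gaps from the published .61 and .80 likely reflect rounding of the reported means and SDs.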

    An Analysis of Private School Closings

    We add to the small literature on private school supply by exploring exits of K-12 private schools. Using national survey data from the National Center for Education Statistics, we find that closure of private schools is not an infrequent event. We model the probability of an exit as a function of the excess supply of private schools relative to demand, as well as school characteristics such as age, size, and religious affiliation. Our empirical results generally support the implications of the model. Working Paper 07-0
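    An exit model of this form is typically estimated as a discrete-choice (e.g., logistic) regression. The sketch below shows only the functional form; the abstract reports no estimates, so every coefficient value is a hypothetical placeholder:

```python
import math

def exit_probability(excess_supply, school_age, enrollment, religious, coefs):
    """Logistic model of the probability that a private school closes.
    All coefficient values passed in are hypothetical placeholders."""
    b0, b1, b2, b3, b4 = coefs
    z = (b0 + b1 * excess_supply + b2 * school_age
         + b3 * enrollment + b4 * religious)
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative coefficients: a positive coefficient on excess supply
# means more excess supply raises the predicted exit probability.
coefs = (-2.0, 0.8, -0.02, -0.001, -0.5)  # hypothetical
p_high_excess = exit_probability(1.5, 20, 150, 1, coefs)
p_low_excess = exit_probability(-0.5, 20, 150, 1, coefs)
```

    With the (hypothetical) positive coefficient on excess supply, `p_high_excess` exceeds `p_low_excess`, matching the paper's premise that excess supply over demand drives exits.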

    An Infrared Census of Star Formation in the Horsehead Nebula

    At ~400 pc, the Horsehead Nebula (B33) is the closest radiatively sculpted pillar to the Sun, but the state and extent of star formation in this structure are not well understood. We present deep near-infrared (IRSF/SIRIUS JHKs) and mid-infrared (Spitzer/IRAC) observations of the Horsehead Nebula in order to characterize the star-forming properties of this region and to assess the likelihood of triggered star formation. Infrared color-color and color-magnitude diagrams are used to identify young stars based on infrared excess emission and positions to the right of the Zero-Age Main Sequence, respectively. Of the 45 sources detected at both near- and mid-infrared wavelengths, three bona fide and five candidate young stars are identified in this 7' by 7' region. Two bona fide young stars have flat infrared SEDs and are located at the western irradiated tip of the pillar. The spatial coincidence of the protostars at the leading edge of this elephant trunk is consistent with the Radiation-Driven Implosion (RDI) model of triggered star formation. There is no evidence, however, for sequential star formation within the immediate ~1.5' (0.17 pc) region from the cloud/H II region interface. Comment: 30 pages, 11 figures; accepted for publication in The Astronomical Journal

    Statistical design of personalized medicine interventions: The Clarification of Optimal Anticoagulation through Genetics (COAG) trial

    Background: There is currently much interest in pharmacogenetics: determining variation in genes that regulate drug effects, with a particular emphasis on improving drug safety and efficacy. The ability to determine such variation motivates personalized drug therapies that use a patient's genetic makeup to determine a safe and effective drug at the correct dose. To ascertain whether a genotype-guided drug therapy improves patient care, a personalized medicine intervention may be evaluated within the framework of a randomized controlled trial. The statistical design of this type of intervention requires special considerations: the distribution of relevant allelic variants in the study population, and whether the pharmacogenetic intervention is equally effective across subpopulations defined by allelic variants.
    Methods: The statistical design of the Clarification of Optimal Anticoagulation through Genetics (COAG) trial serves as an illustrative example of a personalized medicine intervention that uses each subject's genotype information. The COAG trial is a multicenter, double-blind, randomized clinical trial that will compare two approaches to the initiation of warfarin therapy: genotype-guided dosing, which initiates warfarin therapy based on algorithms using clinical information and genotypes for polymorphisms in CYP2C9 and VKORC1; and clinical-guided dosing, which initiates warfarin therapy based on algorithms using only clinical information.
    Results: We determine an absolute minimum detectable difference of 5.49% based on an assumed 60% population prevalence of zero or multiple genetic variants in either CYP2C9 or VKORC1 and an assumed 15% relative effectiveness of genotype-guided warfarin initiation for those with zero or multiple genetic variants. We thus calculate a sample size of 1238 to achieve 80% power for the primary outcome, and we show that reasonable departures from these assumptions may decrease statistical power to 65%.
    Conclusions: In a personalized medicine intervention, the minimum detectable difference used in sample size calculations is not a known quantity, but rather an unknown quantity that depends on the genetic makeup of the subjects enrolled. Given the possible sensitivity of sample size and power calculations to these key assumptions, we recommend that they be monitored during the conduct of a personalized medicine intervention.
    Trial Registration: clinicaltrials.gov NCT00839657
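    How strongly the required sample size depends on the minimum detectable difference can be illustrated with the standard normal-approximation formula for comparing two group means. This is a generic sketch, not the COAG trial's actual calculation: the abstract does not report the outcome SD, so `sd` below is a placeholder and the result will not reproduce the trial's 1238.

```python
import math
from statistics import NormalDist

def n_per_arm(delta, sd, alpha=0.05, power=0.80):
    """Per-arm sample size for a two-sample comparison of means
    (normal approximation, two-sided test at level alpha)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha=0.05
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    return math.ceil(2 * sd ** 2 * (z_alpha + z_beta) ** 2 / delta ** 2)

# Minimum detectable difference from the abstract; sd=20 is hypothetical.
n = n_per_arm(delta=5.49, sd=20.0)
# Since n scales as 1/delta**2, halving the detectable difference
# roughly quadruples the required sample size.
n_small_delta = n_per_arm(delta=5.49 / 2, sd=20.0)
```

    The inverse-square dependence on `delta` is why the abstract stresses that an uncertain minimum detectable difference, driven by the enrolled population's genetic makeup, can substantially erode power.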

    Experimental Philosophical Bioethics

    There is a rich tradition in bioethics of gathering empirical data to inform, supplement, or test the implications of normative ethical analysis. To this end, bioethicists have drawn on diverse methods, including qualitative interviews, focus groups, ethnographic studies, and opinion surveys to advance understanding of key issues in bioethics. In so doing, they have developed strong ties with neighboring disciplines such as anthropology, history, law, and sociology. Collectively, these lines of research have flourished in the broader field of “empirical bioethics” for more than 30 years (Sugarman & Sulmasy 2010). More recently, philosophers from outside the field of bioethics have similarly employed empirical methods—drawn primarily from psychology, the cognitive sciences, economics, and related disciplines—to advance theoretical debates. This approach, which has come to be called experimental philosophy (or x-phi), relies primarily on controlled experiments to interrogate the concepts, intuitions, reasoning, implicit mental processes, and empirical assumptions about the mind that play a role in traditional philosophical arguments (Knobe et al. 2012). Within the moral domain, for example, experimental philosophy has begun to contribute to long-standing debates about the nature of moral judgment and reasoning; the sources of our moral emotions and biases; the qualities of a good person or a good life; and the psychological basis of moral theory itself (Alfano, Loeb, & Plakias 2018). We believe that experimental philosophical bioethics—or “bioxphi”—can make a similar contribution to bioethics. Here, we explain how it is distinct from empirical bioethics more broadly construed, and attempt to characterize how it might advance theory and practice in this area.

    Updated Nucleosynthesis Constraints on Unstable Relic Particles

    We revisit the upper limits on the abundance of unstable massive relic particles provided by the success of Big-Bang Nucleosynthesis calculations. We use cosmic microwave background data to constrain the baryon-to-photon ratio, and incorporate an extensively updated compilation of cross sections into a new calculation of the network of reactions induced by electromagnetic showers that create and destroy the light elements deuterium, helium-3, helium-4, lithium-6 and lithium-7. We derive analytic approximations that complement and check the full numerical calculations. Considerations of the abundances of helium-4 and lithium-6 exclude exceptional regions of parameter space that would otherwise have been permitted by deuterium alone. We illustrate our results by applying them to massive gravitinos. If they weigh ~100 GeV, their primordial abundance should have been below about 10^{-13} of the total entropy. This would imply an upper limit on the reheating temperature of a few times 10^7 GeV, which could be a potential difficulty for some models of inflation. We discuss possible ways of evading this problem. Comment: 40 pages LaTeX, 18 eps figures