
    Feasibility Test of the MedaCube

    Poor adherence is a significant barrier to achieving better patient outcomes. Rates of non-adherence approach 40%, accounting for 10% of all emergency department visits and 23% of admissions to skilled nursing facilities. Many factors contribute to medication non-adherence, including psychological and memory disorders, aging, and pill burden. The MedaCube is a medication management system intended to help solve unintentional medication non-adherence. The device is designed to dispense scheduled and as-needed oral medications. The MedaCube provides audio and visual prompts alerting subjects to administer their medications. Caregivers receive notifications of missed doses, late doses, and refill requests. The null hypothesis is that use of the MedaCube results in no difference in medication adherence compared with each subject's adherence during the prior six months.
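
    The comparison described above is within-subject: each participant's adherence with the device is set against that same participant's adherence over the prior six months. A minimal sketch of such a paired comparison, assuming adherence is summarised per subject as the proportion of prescribed doses taken; the sample size and values below are illustrative, not study data:

        # Hypothetical per-subject adherence (proportion of prescribed doses taken).
        import numpy as np
        from scipy import stats

        prior_six_months = np.array([0.55, 0.62, 0.48, 0.71, 0.60, 0.58, 0.66, 0.52])
        with_medacube    = np.array([0.92, 0.88, 0.95, 0.90, 0.97, 0.85, 0.93, 0.89])

        # Paired test of the null hypothesis of no within-subject difference in adherence.
        t_stat, p_value = stats.ttest_rel(with_medacube, prior_six_months)
        print(f"mean change = {np.mean(with_medacube - prior_six_months):.3f}, "
              f"t = {t_stat:.2f}, p = {p_value:.4g}")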

    Statistical Methods for Evaluating and Comparing Biomarkers for Patient Treatment Selection

    Despite heightened interest in developing biomarkers that predict treatment response and can be used to optimize patient treatment decisions, relatively little statistical methodology has been developed to evaluate these markers. There is currently no unified statistical framework for marker evaluation. This paper proposes a suite of descriptive and inferential methods designed to evaluate individual markers and to compare candidate markers. An R software package implementing these methods has been developed. Their utility is illustrated in the breast cancer treatment context, where candidate markers are evaluated for their ability to identify a subset of women who do not benefit from adjuvant chemotherapy and can therefore avoid its toxicity.
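
    As a rough illustration of the core idea (a Python sketch, not the paper's R package), one can compare the expected event rate under a marker-based treatment rule with the rate under treating everyone, using the two randomized arms; the column names, data values, and threshold below are hypothetical:

        # Event rate under a marker-based treatment rule vs. treating everyone,
        # estimated from randomized-trial arms.
        import pandas as pd

        def event_rate_under_rule(df, threshold):
            """Expected event rate if only marker-high patients (marker >= threshold) are treated."""
            high = df["marker"] >= threshold
            rate_high = df.loc[high & (df["trt"] == 1), "event"].mean()   # treated arm, marker-high
            rate_low  = df.loc[~high & (df["trt"] == 0), "event"].mean()  # control arm, marker-low
            return high.mean() * rate_high + (~high).mean() * rate_low

        trial = pd.DataFrame({
            "trt":    [1, 1, 1, 1, 0, 0, 0, 0],
            "event":  [0, 1, 0, 0, 1, 1, 0, 1],
            "marker": [0.9, 0.2, 0.8, 0.7, 0.85, 0.3, 0.1, 0.75],
        })
        treat_all_rate = trial.loc[trial["trt"] == 1, "event"].mean()
        print(event_rate_under_rule(trial, threshold=0.5), treat_all_rate)
        # The gap between the two rates, together with the share of patients spared treatment
        # ((trial["marker"] < 0.5).mean()), summarises the marker's clinical value.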

    Evaluating the impact of policies recommending PrEP to subpopulations of men and transgender women who have sex with men based on demographic and behavioral risk factors.

    Introduction: Developing guidelines to inform the use of antiretroviral pre-exposure prophylaxis (PrEP) for HIV prevention in resource-limited settings must necessarily be informed by considering the resources and infrastructure needed for PrEP delivery. We describe an approach that identifies subpopulations of cisgender men who have sex with men (MSM) and transgender women (TGW) to prioritize for the rollout of PrEP in resource-limited settings. Methods: We use data from the iPrEx study, a multi-national phase III study of PrEP for HIV prevention in MSM/TGW, to build statistical models that identify subpopulations at high risk of HIV acquisition without PrEP, and with high expected PrEP benefit. We then evaluate empirically the population impact of policies recommending PrEP to these subpopulations, and contrast these with existing policies. Results: A policy recommending PrEP to a high-risk subpopulation of MSM/TGW reporting condomless receptive anal intercourse over the last 3 months (estimated 3.3% 1-year HIV incidence) yields an estimated 1.95% absolute reduction in 1-year HIV incidence at the population level, and a 3.83% reduction over 2 years. Importantly, such a policy requires rolling PrEP out to just 59.7% of MSM/TGW in the iPrEx population. We find that this policy is identical to the one that prioritizes MSM/TGW with high expected PrEP benefit. It is estimated to achieve nearly the same reduction in HIV incidence as the PrEP guideline put forth by the US Centers for Disease Control, which relies on the measurement of more behavioral risk factors and would recommend PrEP to a larger subset of the MSM/TGW population (86% vs. 60%). Conclusions: These findings may be used to focus future mathematical modelling studies of PrEP in resource-limited settings on prioritizing PrEP for high-risk subpopulations of MSM/TGW. The statistical approach we took could be employed to develop PrEP policies for other at-risk populations and resource-limited settings.
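
    The population-level figures quoted above follow from simple arithmetic: the absolute reduction in incidence is roughly the fraction of the population covered by the policy, times the incidence in that subgroup, times the fraction of infections PrEP averts among users. A back-of-the-envelope sketch, where the efficacy value is an assumption for illustration rather than a number reported in the paper:

        # Back-of-the-envelope population impact of a subgroup-targeted PrEP policy.
        coverage = 0.597            # fraction of MSM/TGW covered by the policy (from the abstract)
        subgroup_incidence = 0.033  # 1-year HIV incidence in the covered subgroup without PrEP
        assumed_efficacy = 1.0      # assumed fraction of infections averted among PrEP users

        absolute_reduction = coverage * subgroup_incidence * assumed_efficacy
        print(f"estimated absolute 1-year incidence reduction: {absolute_reduction:.2%}")
        # ~1.97% with full assumed efficacy, in line with the 1.95% figure reported above.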

    Exploring and adjusting for potential learning effects in ROLARR: a randomised controlled trial comparing robotic-assisted vs. standard laparoscopic surgery for rectal cancer resection

    Background: In surgical randomised controlled trials (RCTs) the experimental treatment is commonly a relatively new technique that surgeons may still be learning, while the control is a well-established standard. This can lead to biased comparisons between treatments. In this paper we discuss the implementation of approaches for addressing this issue in the ROLARR trial and highlight points of consideration for future surgical trials. Methods: ROLARR was an international, randomised, parallel-group trial comparing robotic vs. laparoscopic surgery for the curative treatment of rectal cancer. The primary endpoint was conversion to open surgery (binary). A surgeon inclusion criterion mandating a minimum level of experience in each technique was incorporated. Additionally, surgeon self-reported data were collected periodically throughout the trial to capture the level of experience of every participating surgeon. Multi-level logistic regression adjusting for operating surgeon as a random effect is used to estimate the odds ratio for conversion to open surgery between the treatment groups. We present and contrast the results from the primary analysis, which did not account for learning effects, and a sensitivity analysis which did. Results: The primary analysis yields an estimated odds ratio (robotic/laparoscopic) of 0.614 (95% CI 0.311, 1.211; p = 0.16), providing insufficient evidence to conclude superiority of robotic surgery over laparoscopic surgery in terms of the risk of conversion to open. The sensitivity analysis reveals that while participating surgeons in ROLARR were expert at laparoscopic surgery, some, if not all, were still learning robotic surgery. The treatment-effect odds ratio decreases by a factor of 0.341 (95% CI 0.121, 0.960; p = 0.042) per unit increase in log-number of previous robotic operations performed by the operating surgeon. The odds ratio for a patient whose operating surgeon has the mean experience level in ROLARR (152.46 previous laparoscopic and 67.93 previous robotic operations) is 0.40 (95% CI 0.168, 0.953; p = 0.039). Conclusions: In this paper we have demonstrated the implementation of approaches for accounting for learning in a practical example of a surgical RCT analysis. The results demonstrate the value of implementing such approaches, since we have shown that without them the ROLARR analysis would indeed have been confounded by learning effects.
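
    The reported interaction can be turned into a simple curve: relative to the odds ratio at the mean experience level, the treatment odds ratio is multiplied by 0.341 for every unit increase in the log-number of prior robotic operations. A sketch of that arithmetic, assuming the 0.40 odds ratio applies at log(67.93) prior robotic operations (the trial's exact centering may differ):

        # Robotic-vs-laparoscopic odds ratio for conversion as a function of surgeon experience,
        # using the figures reported above.
        import math

        OR_AT_MEAN = 0.40            # treatment odds ratio at the mean experience level
        FACTOR_PER_LOG_UNIT = 0.341  # change in OR per unit increase in log(prior robotic operations)
        MEAN_ROBOTIC_OPS = 67.93

        def conversion_odds_ratio(prior_robotic_ops):
            shift = math.log(prior_robotic_ops) - math.log(MEAN_ROBOTIC_OPS)
            return OR_AT_MEAN * FACTOR_PER_LOG_UNIT ** shift

        for n in (10, 30, 67.93, 150, 300):
            print(f"{n:>7.2f} prior robotic operations -> OR ~ {conversion_odds_ratio(n):.2f}")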

    A broad-band FT-ICR Penning trap system for KATRIN

    The KArlsruhe TRItium Neutrino experiment KATRIN aims at improving the upper limit on the mass of the electron antineutrino to about 0.2 eV (90% C.L.) by investigating the beta decay of tritium gas molecules, $T_2 \rightarrow (^3\mathrm{He}T)^+ + e^- + \bar{\nu}_e$. The experiment is currently under construction, with first data taking planned to start in 2012. One source of systematic uncertainty in the KATRIN experiment is the formation of ion clusters when tritium decays and the decay products interact with residual tritium molecules. It is essential to monitor the abundances of these clusters since they have different final-state energies than tritium ions. For this purpose, a prototype of a cylindrical Penning trap, which will be installed in the KATRIN beam line, has been constructed and tested at the Max Planck Institute for Nuclear Physics in Heidelberg. This system employs the technique of Fourier-Transform Ion-Cyclotron-Resonance (FT-ICR) in order to measure the abundances of the different stored ion species. The two Penning traps have been financed by the BMBF (grant to the University of Karlsruhe) under project codes 05CK5VKA/5 and 05A08VK2. The support of the Deutsche Forschungsgemeinschaft for the development of the FT-ICR detection technique for precision mass spectrometry under contract number BL981-2-1 is gratefully acknowledged. We thank A. Gotsova for her help during tests in Mainz and Prof. C. Weinheimer for useful discussions related to this project. We warmly thank the LPC trappers group for providing the attenuation grids. D. Rodríguez is a Juan de la Cierva fellow and acknowledges support from the Spanish Ministry of Science and Innovation through the José Castillejo program, which provided funding for a 5-month stay at the MPI-K. Sz. Nagy acknowledges support from the Alliance Program of the Helmholtz Association (EMMI). S. Lukic acknowledges support by the Transregional Collaborative Research Centre No. 27 "Neutrinos and Beyond", funded by the Deutsche Forschungsgemeinschaft.
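
    In an FT-ICR measurement the stored species are distinguished by their cyclotron frequencies, $f_c = qB / (2\pi m)$, so different tritium-cluster ions appear at different frequencies in the Fourier spectrum. A sketch of the ideal (non-relativistic) frequencies for a few singly charged species; the magnetic field value and the species list are illustrative assumptions, not KATRIN parameters:

        # Ideal cyclotron frequencies f_c = qB / (2*pi*m) for singly charged ion species.
        import math

        E_CHARGE = 1.602176634e-19   # C
        AMU = 1.66053906660e-27      # kg
        B_FIELD = 5.0                # T (assumed field, for illustration only)

        species_mass_u = {           # approximate masses in atomic mass units (example species)
            "T+": 3.016,
            "(3HeT)+": 6.032,
            "T3+": 9.048,
            "T5+": 15.080,
        }

        for name, m_u in species_mass_u.items():
            f_c = E_CHARGE * B_FIELD / (2 * math.pi * m_u * AMU)
            print(f"{name:>8}: f_c ~ {f_c / 1e6:.3f} MHz")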

    Risky Driving by Recently Licensed Teens: Self-Reports and Simulated Performance

    U.S. teens are overrepresented in motor vehicle crashes, the majority of which are due to driver error; however, the causal pathways remain to be elucidated. This research aimed to identify driving performance factors that might underlie newly licensed male teens' risk. Surveys were conducted with twenty-one 16-year-olds at the time of intermediate licensure. During the second month of licensure they completed drives in a high-fidelity simulator. Simulator scenarios allowed assessment of responses to yellow traffic lights changing to red and to a visual search task, for which previous data on older age groups of drivers were available. All teens had an A or B grade point average, previously found to be associated with lower crash and citation risk. Nonetheless, 71% reported risky driving in the form of prior unlicensed, unsupervised driving. In the simulator, 46% went through an intersection as the light turned red, compared to 33% of adults. In the visual search task, teens had shorter mean perception-reaction times and identified more targets than adults and older drivers, but performed similarly to young drivers. Therefore, even teens with good grades, perceived to be less risky, were willing to take driving risks. Their driving performance suggests there may be subtle differences in the way recently licensed teens drive that might predispose them to crashes. Further research of this nature can increase understanding of such differences and inform the development of more targeted interventions.

    Exploring the role of professional associations in collective learning in London and New York's advertising and law professional service firm clusters.

    The value of regional economies for collective learning has been reported by numerous scholars. However, this work has often been criticised for lacking analytical clarity and for failing to explore the architectures of collective learning and the role of the knowledge produced in making firms in a cluster economy successful. This paper engages with these problematics and investigates how collective learning is facilitated in the advertising and law professional service firm clusters in London and New York. It explores the role of professional associations and investigates how they mediate a collective learning process in each city. It argues that professional associations seed urban communities of practice that emerge outside of the associations' formal activities. In these communities, individuals with shared interests in advertising and law learn from one another and are therefore able to adapt and evolve one another's approaches to common industry challenges. The paper suggests this is another form of the variation Marshall highlighted in relation to cluster-based collective learning. It also shows how the collective learning process is affected by the presence, absence and strength of institutional thickness. It is therefore argued that a richer understanding of institutional effects on collective learning is needed.