
    Intervention planning and modification of the BUMP intervention: a digital intervention for the early detection of raised blood pressure in pregnancy

    Background: Hypertensive disorders in pregnancy, particularly pre-eclampsia, pose a substantial risk to both maternal and foetal health. The BUMP (Blood Pressure Self-Monitoring in Pregnancy) interventions are being tested in a trial. They aim to facilitate the early detection of raised blood pressure through self-monitoring. This article outlines how the self-monitoring interventions in the BUMP trial were developed and modified using the person-based approach to promote engagement and adherence. Methods: Key behavioural challenges associated with blood pressure self-monitoring in pregnancy were identified by synthesising qualitative pilot data and existing evidence, which informed guiding principles for the development process. Social cognitive theory was identified as an appropriate theoretical framework. A testable logic model was developed to illustrate the hypothesised processes of change associated with the intervention. Iterative qualitative feedback from women and staff informed modifications to the participant materials. Results: The evidence synthesis suggested that women face challenges integrating self-monitoring into their lives and that adherence is challenging at certain time points in pregnancy (for example, starting maternity leave). Intervention modification included strategies to address adherence but also focussed on modifying outcome expectancies by providing messages explaining pre-eclampsia and outlining the potential benefits of self-monitoring. Conclusions: With an in-depth understanding of the target population, several methods and approaches to plan and develop interventions specifically relevant to pregnant women were successfully integrated to address barriers to behaviour change while ensuring the interventions are easy to engage with, persuasive and acceptable.

    Study protocol: developing a decision system for inclusive housing: applying a systematic, mixed-method quasi-experimental design

    Background: Identifying the housing preferences of people with complex disabilities is a much-needed but under-developed area of practice and scholarship. Despite the recognition that housing is a social determinant of health and quality of life, there is an absence of empirical methodologies that can practically and systematically involve consumers in this complex service delivery and housing design market. A rigorous process for making effective and consistent development decisions is needed to ensure resources are used effectively and the needs of consumers with complex disability are properly met. Methods/Design: This 3-year project aims to identify how the public and private housing market in Australia can better respond to the needs of people with complex disabilities whilst simultaneously achieving key corporate objectives. First, using the Customer Relationship Management framework, qualitative (Nominal Group Technique) and quantitative (Discrete Choice Experiment) methods will be used to quantify the housing preferences of consumers and their carers. A systematic mixed-method, quasi-experimental design will then be used to quantify the development priorities of other key stakeholders (e.g., architects, developers, Government housing services) in relation to inclusive housing for people with complex disabilities. Stakeholders randomly assigned to Group 1 (experimental group) will participate in a series of focus groups employing the Analytical Hierarchical Process (AHP) methodology. Stakeholders randomly assigned to Group 2 (control group) will participate in focus groups employing existing decision-making processes for inclusive housing development (e.g., Risk, Opportunity, Cost, Benefit considerations). Using comparative stakeholder analysis, this research design will enable the AHP methodology (a proposed tool to guide inclusive housing development decisions) to be tested. Discussion: It is anticipated that the findings of this study will enable stakeholders to incorporate consumer housing preferences into commercial decisions. Housing designers and developers will benefit from the creation of a parsimonious set of consumer-led housing preferences by which to make informed investments in future housing and contribute to future housing policy. This research design has not been applied in the Australian research context or elsewhere, and will provide a much-needed blueprint for market investment to develop viable, consumer-directed inclusive housing options for people with complex disability.
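
    The Analytical Hierarchical Process referred to in this protocol derives priority weights for competing criteria from a matrix of pairwise comparisons. As a rough illustration only (the criteria, judgement values and threshold below are hypothetical and not taken from the study), a minimal sketch in Python:

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three illustrative housing
# criteria (accessibility, location, cost) on Saaty's 1-9 judgement scale.
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])
n = A.shape[0]

# Priority weights: principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
lambda_max = eigvals[k].real
ci = (lambda_max - n) / (n - 1)
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random consistency index
cr = ci / ri

print("priority weights:", np.round(weights, 3))
print("consistency ratio:", round(cr, 3))  # CR < 0.1 is conventionally acceptable
```

    In a focus-group setting, each stakeholder group would supply its own comparison matrix, and the resulting weight vectors could then be compared across groups.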

    Electrochemical activation and inhibition of neuromuscular systems through modulation of ion concentrations with ion-selective membranes

    Conventional functional electrical stimulation aims to restore functional motor activity of patients with disabilities resulting from spinal cord injury or neurological disorders. However, intervention with functional electrical stimulation in neurological diseases lacks an effective implantable method that suppresses unwanted nerve signals. We have developed an electrochemical method to activate and inhibit a nerve by electrically modulating ion concentrations in situ along the nerve. Using ion-selective membranes to achieve different excitability states of the nerve, we observe either a reduction of the electrical threshold for stimulation by up to approximately 40%, or voluntary, reversible inhibition of nerve signal propagation. This low-threshold electrochemical stimulation method is applicable in current implantable neuroprosthetic devices, whereas the on-demand nerve-blocking mechanism could offer effective clinical intervention in disease states caused by uncontrolled nerve activation, such as epilepsy and chronic pain syndromes.
    Funding: Massachusetts Institute of Technology Faculty Discretionary Research Fund; National Institutes of Health (U.S.) (Award UL1 RR 025758); Harvard Catalyst (Grant

    The relationship between literacy and multimorbidity in a primary care setting

    Background: Multimorbidity is now acknowledged as a research priority in primary care. The identification of risk factors and people most at risk is an important step in guiding prevention and intervention strategies. The aim of this study was to examine the relationship between literacy and multimorbidity while controlling for potential confounders. Methods: Participants were adult patients attending the family medicine clinic of a regional health centre in Saguenay (Quebec), Canada. Literacy was measured with the Newest Vital Sign (NVS). Multimorbidity was measured with the Disease Burden Morbidity Assessment (DBMA) by self-report. Information on potential confounders (age, sex, education and family income) was also collected. The association between literacy (independent variable) and multimorbidity was examined in bivariate and multivariate analyses. Two operational definitions of multimorbidity were used successively as the dependent variable; confounding variables were introduced into the model as potential predictors. Results: One hundred three patients (36 men) 19–83 years old were recruited; 41.8% had completed 12 years of school or less. Forty-seven percent of patients provided fewer than four correct answers on the NVS (possible low literacy) whereas 53% had four correct responses or more. Literacy and multimorbidity were associated in bivariate analyses (p < 0.01) but not in multivariate analyses, including age and family income. Conclusion: This study suggests that there is no relationship between literacy and multimorbidity when controlling for age and family income.
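
    The adjusted analysis described above can be approximated with an ordinary logistic regression in which literacy is entered alongside the confounders. A minimal sketch, assuming a hypothetical data file and column names (neither comes from the study):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per patient. nvs_score is the Newest Vital Sign
# result (0-6); multimorbid is 1 when the DBMA meets the chosen operational
# definition of multimorbidity, else 0.
df = pd.read_csv("patients.csv")  # columns: multimorbid, nvs_score, age, income

# Bivariate model: literacy alone.
bivariate = smf.logit("multimorbid ~ nvs_score", data=df).fit()

# Multivariate model: literacy adjusted for the potential confounders.
adjusted = smf.logit("multimorbid ~ nvs_score + age + income", data=df).fit()

print(bivariate.summary())
print(adjusted.summary())  # a non-significant nvs_score term mirrors the reported finding
```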

    Performance of CMS muon reconstruction in pp collision events at sqrt(s) = 7 TeV

    The performance of muon reconstruction, identification, and triggering in CMS has been studied using 40 inverse picobarns of data collected in pp collisions at sqrt(s) = 7 TeV at the LHC in 2010. A few benchmark sets of selection criteria covering a wide range of physics analysis needs have been examined. For all considered selections, the efficiency to reconstruct and identify a muon with a transverse momentum pT larger than a few GeV is above 95% over the whole region of pseudorapidity covered by the CMS muon system, abs(eta) < 2.4, while the probability to misidentify a hadron as a muon is well below 1%. The efficiency to trigger on single muons with pT above a few GeV is higher than 90% over the full eta range, and typically substantially better. The overall momentum scale is measured to a precision of 0.2% with muons from Z decays. The transverse momentum resolution varies from 1% to 6% depending on pseudorapidity for muons with pT below 100 GeV and, using cosmic rays, it is shown to be better than 10% in the central region up to pT = 1 TeV. Observed distributions of all quantities are well reproduced by the Monte Carlo simulation.
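
    Efficiencies such as those quoted above are typically reported per bin of pT or pseudorapidity together with a binomial confidence interval. The sketch below illustrates only that bookkeeping, using Clopper-Pearson intervals; the bin edges and counts are invented for illustration and are not CMS data:

```python
import numpy as np
from scipy.stats import beta

def efficiency(n_pass, n_total, cl=0.683):
    """Efficiency with a Clopper-Pearson interval at confidence level cl."""
    eff = n_pass / n_total
    lo = beta.ppf((1 - cl) / 2, n_pass, n_total - n_pass + 1) if n_pass > 0 else 0.0
    hi = beta.ppf(1 - (1 - cl) / 2, n_pass + 1, n_total - n_pass) if n_pass < n_total else 1.0
    return eff, lo, hi

# Hypothetical probe counts in four pseudorapidity bins.
eta_edges = [-2.4, -1.2, 0.0, 1.2, 2.4]
passed = [9620, 19230, 19310, 9580]
total = [10000, 20000, 20000, 10000]

for (lo_e, hi_e), p, t in zip(zip(eta_edges[:-1], eta_edges[1:]), passed, total):
    eff, lo, hi = efficiency(p, t)
    print(f"eta [{lo_e:+.1f}, {hi_e:+.1f}]: {eff:.3f} (+{hi - eff:.3f} / -{eff - lo:.3f})")
```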

    X-ray emission from the Sombrero galaxy: discrete sources

    We present a study of discrete X-ray sources in and around the bulge-dominated, massive Sa galaxy Sombrero (M104), based on new and archival Chandra observations with a total exposure of ~200 ks. With a detection limit of L_X = 1E37 erg/s and a field of view covering a galactocentric radius of ~30 kpc (11.5 arcmin), 383 sources are detected. Cross-correlation with Spitler et al.'s catalogue of Sombrero globular clusters (GCs) identified from HST/ACS observations reveals 41 X-ray sources in GCs, presumably low-mass X-ray binaries (LMXBs). We quantify the differential luminosity functions (LFs) for both the detected GC and field LMXBs, whose power-law indices (~1.1 for the GC LF and ~1.6 for the field LF) are consistent with previous studies of elliptical galaxies. With precise sky positions of the GCs without a detected X-ray source, we further quantify, through a fluctuation analysis, the GC LF at fainter luminosities down to 1E35 erg/s. The derived index rules out a faint-end slope flatter than 1.1 at 2 sigma significance, contrary to recent findings in several elliptical galaxies and the bulge of M31. On the other hand, the 2-6 keV unresolved emission places a tight constraint on the field LF, implying a flattened index of ~1.0 below 1E37 erg/s. We also detect 101 sources in the halo of Sombrero. These sources cannot be interpreted as galactic LMXBs, whose spatial distribution empirically follows the starlight. Their number is also higher than the expected number of cosmic AGNs (52+/-11 [1 sigma]) whose surface density is constrained by deep X-ray surveys. We suggest that either the cosmic X-ray background is unusually high in the direction of Sombrero, or a distinct population of X-ray sources is present in the halo of Sombrero.
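
    The differential LF slopes quoted above (dN/dL ∝ L^-alpha with alpha ≈ 1.1 and ≈ 1.6) can be estimated from a source list with a simple maximum-likelihood fit above the completeness limit. A sketch using synthetic luminosities rather than the Chandra catalogue:

```python
import numpy as np

def powerlaw_index_mle(lum, lmin):
    """MLE of alpha for dN/dL ~ L^-alpha, using sources above the limit lmin."""
    lum = np.asarray(lum, dtype=float)
    lum = lum[lum >= lmin]
    n = lum.size
    alpha = 1.0 + n / np.sum(np.log(lum / lmin))
    sigma = (alpha - 1.0) / np.sqrt(n)  # asymptotic standard error
    return alpha, sigma

# Synthetic luminosities (erg/s) drawn from a power law with alpha = 1.6
# above a 1e37 erg/s detection limit, standing in for a real source list.
rng = np.random.default_rng(0)
lmin = 1e37
lum = lmin * (1.0 - rng.random(300)) ** (-1.0 / 0.6)

alpha, sigma = powerlaw_index_mle(lum, lmin)
print(f"fitted differential LF slope: {alpha:.2f} +/- {sigma:.2f}")
```

    Constraining the slope below the detection limit, as the fluctuation analysis in the paper does, requires modelling the unresolved emission and is not captured by this simple estimator.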

    Azimuthal anisotropy of charged particles at high transverse momenta in PbPb collisions at sqrt(s[NN]) = 2.76 TeV

    The azimuthal anisotropy of charged particles in PbPb collisions at a nucleon-nucleon center-of-mass energy of 2.76 TeV is measured with the CMS detector at the LHC over an extended transverse momentum (pt) range up to approximately 60 GeV. The data cover both the low-pt region associated with hydrodynamic flow phenomena and the high-pt region where the anisotropies may reflect the path-length dependence of parton energy loss in the created medium. The anisotropy parameter (v2) of the particles is extracted by correlating charged tracks with respect to the event plane reconstructed using the energy deposited in forward-angle calorimeters. For the six bins of collision centrality studied, spanning the range of 0-60% most-central events, the observed v2 values are found to first increase with pt, reaching a maximum around pt = 3 GeV, and then to gradually decrease to almost zero, with the decline persisting up to at least pt = 40 GeV over the full centrality range measured.
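
    In the event-plane method referred to above, v2 is the average of cos 2(phi - Psi_2) over the selected tracks, divided by a resolution correction for the reconstructed plane. A toy sketch (the event-plane angle, track sample and v2 value are synthetic, and the resolution correction is left at 1 because the toy plane is exact):

```python
import numpy as np

def v2_event_plane(track_phi, psi2, resolution=1.0):
    """Event-plane v2: <cos 2(phi - Psi_2)> divided by the plane resolution."""
    return np.mean(np.cos(2.0 * (np.asarray(track_phi) - psi2))) / resolution

# Toy event sample: draw track angles from dN/dphi ~ 1 + 2 v2 cos 2(phi - Psi_2)
# by accept-reject sampling.
rng = np.random.default_rng(1)
psi2, true_v2 = 0.4, 0.08
phi = rng.uniform(-np.pi, np.pi, 200000)
accept = rng.random(phi.size) < (1 + 2 * true_v2 * np.cos(2 * (phi - psi2))) / (1 + 2 * true_v2)
phi = phi[accept]

# In real data Psi_2 comes from the forward calorimeters and has a finite
# resolution < 1, by which the observed average must be divided.
print("v2 estimate:", round(v2_event_plane(phi, psi2), 3))  # ~0.08
```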

    Search for new physics with same-sign isolated dilepton events with jets and missing transverse energy

    A search for new physics is performed in events with two same-sign isolated leptons, hadronic jets, and missing transverse energy in the final state. The analysis is based on a data sample corresponding to an integrated luminosity of 4.98 inverse femtobarns produced in pp collisions at a center-of-mass energy of 7 TeV collected by the CMS experiment at the LHC. This constitutes a factor of 140 increase in integrated luminosity over previously published results. The observed yields agree with the standard model predictions and thus no evidence for new physics is found. The observations are used to set upper limits on possible new physics contributions and to constrain supersymmetric models. To facilitate the interpretation of the data in a broader range of new physics scenarios, information on the event selection, detector response, and efficiencies is provided.
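
    Upper limits in a counting search like this come from comparing the observed event count with the expected background in each search region. The published analysis uses dedicated statistical machinery, but the basic shape of such a limit can be sketched with a simple Bayesian counting calculation (the counts below are invented, and the flat-prior method is a stand-in rather than the procedure used in the paper):

```python
import numpy as np
from scipy.stats import poisson

def counting_upper_limit(n_obs, bkg, cl=0.95, s_max=200.0, grid=200001):
    """Upper limit on a signal yield s for observed count n_obs and known
    background bkg, using a flat prior on s >= 0 (a simple stand-in method)."""
    s = np.linspace(0.0, s_max, grid)
    posterior = poisson.pmf(n_obs, s + bkg)  # likelihood times flat prior
    cdf = np.cumsum(posterior)
    cdf /= cdf[-1]
    return s[np.searchsorted(cdf, cl)]

# Invented counts for a single search region.
print("95% CL upper limit on signal events:", round(counting_upper_limit(4, 3.2), 2))
```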