    Study protocol: developing a decision system for inclusive housing: applying a systematic, mixed-method quasi-experimental design

    Background Identifying the housing preferences of people with complex disabilities is a much-needed but under-developed area of practice and scholarship. Despite the recognition that housing is a social determinant of health and quality of life, there is an absence of empirical methodologies that can practically and systematically involve consumers in this complex service delivery and housing design market. A rigorous process for making effective and consistent development decisions is needed to ensure resources are used effectively and the needs of consumers with complex disability are properly met. Methods/Design This 3-year project aims to identify how the public and private housing market in Australia can better respond to the needs of people with complex disabilities whilst simultaneously achieving key corporate objectives. First, using the Customer Relationship Management framework, qualitative (Nominal Group Technique) and quantitative (Discrete Choice Experiment) methods will be used to quantify the housing preferences of consumers and their carers. A systematic mixed-method, quasi-experimental design will then be used to quantify the development priorities of other key stakeholders (e.g., architects, developers, government housing services) in relation to inclusive housing for people with complex disabilities. Stakeholders randomly assigned to Group 1 (experimental group) will participate in a series of focus groups employing the Analytic Hierarchy Process (AHP) methodology. Stakeholders randomly assigned to Group 2 (control group) will participate in focus groups employing existing decision-making processes for inclusive housing development (e.g., risk, opportunity, cost and benefit considerations). Using comparative stakeholder analysis, this research design will enable the AHP methodology (a proposed tool to guide inclusive housing development decisions) to be tested. Discussion It is anticipated that the findings of this study will enable stakeholders to incorporate consumer housing preferences into commercial decisions. Housing designers and developers will benefit from the creation of a parsimonious set of consumer-led housing preferences by which to make informed investments in future housing and contribute to future housing policy. The research design has not been applied in the Australian research context or elsewhere, and will provide a much-needed blueprint for market investment to develop viable, consumer-directed inclusive housing options for people with complex disability.
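
    The Analytic Hierarchy Process named in this protocol derives priority weights for competing criteria from the principal eigenvector of a reciprocal pairwise-comparison matrix. The sketch below illustrates only that calculation; the criteria names and judgement values are hypothetical placeholders, not taken from the study.

    ```python
    # Minimal AHP sketch: priority weights from a pairwise comparison matrix.
    # Criteria and judgements are hypothetical, not from the study protocol.
    import numpy as np

    criteria = ["accessibility", "location", "cost", "adaptability"]

    # Saaty-style reciprocal matrix: A[i, j] = relative importance of i over j.
    A = np.array([
        [1.0, 3.0, 5.0, 2.0],
        [1/3, 1.0, 3.0, 1/2],
        [1/5, 1/3, 1.0, 1/4],
        [1/2, 2.0, 4.0, 1.0],
    ])

    # Priority weights: normalised principal eigenvector.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()

    # Consistency ratio; CR < 0.1 is the conventional acceptability threshold.
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)
    cr = ci / 0.90                      # Saaty random index for n = 4
    print(dict(zip(criteria, weights.round(3))), f"CR = {cr:.3f}")
    ```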

    Talking in primary care (TIP): protocol for a cluster-randomised controlled trial in UK primary care to assess clinical and cost-effectiveness of communication skills e-learning for practitioners on patients' musculoskeletal pain and enablement.

    INTRODUCTION: Effective communication can help optimise healthcare interactions and patient outcomes. However, few interventions have been tested clinically, subjected to cost-effectiveness analysis or are sufficiently brief and well-described for implementation in primary care. This paper presents the protocol for determining the effectiveness and cost-effectiveness of a rigorously developed brief eLearning tool, EMPathicO, among patients with and without musculoskeletal pain. METHODS AND ANALYSIS: A cluster randomised controlled trial in general practitioner (GP) surgeries in England and Wales serving patients from diverse geographic, socioeconomic and ethnic backgrounds. GP surgeries are randomised (1:1) to receive EMPathicO e-learning immediately, or at trial end. Eligible practitioners (eg, GPs, physiotherapists and nurse practitioners) are involved in managing primary care patients with musculoskeletal pain. Patient recruitment is managed by practice staff and researchers. Target recruitment is 840 adults with and 840 without musculoskeletal pain consulting face-to-face, by telephone or video. Patients complete web-based questionnaires at preconsultation baseline, 1 week and 1, 3 and 6 months later. There are two patient-reported primary outcomes: pain intensity and patient enablement. Cost-effectiveness is considered from the National Health Service and societal perspectives. Secondary and process measures include practitioner patterns of use of EMPathicO, practitioner-reported self-efficacy and intentions, patient-reported symptom severity, quality of life, satisfaction, perceptions of practitioner empathy and optimism, treatment expectancies, anxiety, depression and continuity of care. Purposive subsamples of patients, practitioners and practice staff take part in up to two qualitative, semistructured interviews. ETHICS APPROVAL AND DISSEMINATION: Approved by the South Central Hampshire B Research Ethics Committee on 1 July 2022 and the Health Research Authority and Health and Care Research Wales on 6 July 2022 (REC reference 22/SC/0145; IRAS project ID 312208). Results will be disseminated via peer-reviewed academic publications, conference presentations and patient and practitioner outlets. If successful, EMPathicO could quickly be made available at a low cost to primary care practices across the country. TRIAL REGISTRATION NUMBER: ISRCTN18010240.

    NUScon: a community-driven platform for quantitative evaluation of nonuniform sampling in NMR

    Although the concepts of nonuniform sampling (NUS) and non-Fourier spectral reconstruction in multidimensional NMR began to emerge 4 decades ago (Bodenhausen and Ernst, 1981; Barna and Laue, 1987), it is only relatively recently that NUS has become more commonplace. Advantages of NUS include the ability to tailor experiments to reduce data collection time and to improve spectral quality, whether through detection of closely spaced peaks (i.e., “resolution”) or peaks of weak intensity (i.e., “sensitivity”). Wider adoption of these methods is the result of improvements in computational performance, a growing abundance and flexibility of software, support from NMR spectrometer vendors, and the increased data sampling demands imposed by higher magnetic fields. However, the identification of best practices still remains a significant and unmet challenge. Unlike the discrete Fourier transform, non-Fourier methods used to reconstruct spectra from NUS data are nonlinear, depend on the complexity and nature of the signals, and lack quantitative or formal theory describing their performance. Seemingly subtle algorithmic differences may lead to significant variabilities in spectral qualities and artifacts. A community-based critical assessment of NUS challenge problems has been initiated, called the “Nonuniform Sampling Contest” (NUScon), with the objective of determining best practices for processing and analyzing NUS experiments. We address this objective by constructing challenges from NMR experiments that we inject with synthetic signals, and we process these challenges using workflows submitted by the community. In the initial rounds of NUScon our aim is to establish objective criteria for evaluating the quality of spectral reconstructions. We present here a software package for performing the quantitative analyses, and we present the results from the first two rounds of NUScon. We discuss the challenges that remain and present a roadmap for continued community-driven development with the ultimate aim of providing best practices in this rapidly evolving field. The NUScon software package and all data from evaluating the challenge problems are hosted on the NMRbox platform.
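
    As a toy illustration of the kind of quantitative comparison NUScon formalises, the sketch below simulates a 1D FID containing two injected synthetic signals, applies a random 25% sampling schedule, and scores a naive zero-filled reconstruction against the fully sampled reference with a normalised RMSE. The schedule, "reconstruction" and metric are all simplifying assumptions chosen for brevity; this is not the NUScon workflow or its evaluation criteria.

    ```python
    # Toy NUS illustration: compare a zero-filled spectrum from 25% of the data
    # against the fully sampled reference. Not the NUScon pipeline or metrics.
    import numpy as np

    n = 512                                   # complex points, full acquisition
    t = np.arange(n)
    # Two synthetic decaying sinusoids stand in for "injected" signals.
    fid = (np.exp(2j * np.pi * 0.11 * t)
           + 0.3 * np.exp(2j * np.pi * 0.23 * t)) * np.exp(-t / 150)

    rng = np.random.default_rng(0)
    keep = np.sort(rng.choice(n, size=n // 4, replace=False))  # NUS schedule

    nus_fid = np.zeros_like(fid)
    nus_fid[keep] = fid[keep]                 # unmeasured points left at zero

    ref = np.fft.fftshift(np.fft.fft(fid))
    rec = np.fft.fftshift(np.fft.fft(nus_fid))

    # Crude fidelity metric: normalised RMS deviation from the reference spectrum.
    rmse = np.linalg.norm(rec - ref) / np.linalg.norm(ref)
    print(f"normalised RMSE of zero-filled NUS spectrum: {rmse:.3f}")
    ```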

    Cabinet Tree: an orthogonal enclosure approach to visualizing and exploring big data

    Treemaps are well-known for visualizing hierarchical data. Most related approaches have been focused on layout algorithms and paid little attention to other display properties and interactions. Furthermore, the structural information in conventional Treemaps is too implicit for viewers to perceive. This paper presents Cabinet Tree, an approach that: i) draws branches explicitly to show relational structures, ii) adapts a space-optimized layout for leaves and maximizes the space utilization, iii) uses coloring and labeling strategies to clearly reveal patterns and contrast different attributes intuitively. We also apply the continuous node selection and detail window techniques to support user interaction with different levels of the hierarchies. Our quantitative evaluations demonstrate that Cabinet Tree achieves good scalability for increased resolutions and big datasets
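
    For contrast with the conventional treemaps discussed above, the sketch below implements the basic slice-and-dice layout: each node's rectangle is divided among its children in proportion to their sizes, alternating the split direction at each level. It is included only to make that baseline concrete; it is not the Cabinet Tree algorithm, and the example hierarchy is invented.

    ```python
    # Conventional slice-and-dice treemap layout (baseline illustration only;
    # not the Cabinet Tree algorithm). A node is (label, size, children).
    def slice_and_dice(node, x, y, w, h, depth=0, out=None):
        if out is None:
            out = []
        label, size, children = node
        out.append((label, (round(x, 3), round(y, 3), round(w, 3), round(h, 3))))
        if children:
            total = sum(c[1] for c in children)
            offset = 0.0
            for child in children:
                frac = child[1] / total
                if depth % 2 == 0:        # split horizontally on even depths
                    slice_and_dice(child, x + offset * w, y, frac * w, h, depth + 1, out)
                else:                     # split vertically on odd depths
                    slice_and_dice(child, x, y + offset * h, w, frac * h, depth + 1, out)
                offset += frac
        return out

    tree = ("root", 10, [("a", 6, []), ("b", 4, [("b1", 3, []), ("b2", 1, [])])])
    for label, rect in slice_and_dice(tree, 0.0, 0.0, 1.0, 1.0):
        print(label, rect)
    ```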

    Herbivore benefits from vectoring plant virus through reduction of period of vulnerability to predation

    Herbivores can profit from vectoring plant pathogens because the induced defence of plants against pathogens sometimes interferes with the induced defence of plants against herbivores. Plants can also defend themselves indirectly by the action of the natural enemies of the herbivores. It is unknown whether the defence against pathogens induced in the plant also interferes with the indirect defence against herbivores mediated via the third trophic level. We previously showed that infection of plants with Tomato spotted wilt virus (TSWV) increased the developmental rate and juvenile survival of its vector, the thrips Frankliniella occidentalis. Here, we present the results of a study on the effects of TSWV infections of plants on the effectiveness of three species of natural enemies of F. occidentalis: the predatory mites Neoseiulus cucumeris and Iphiseius degenerans, and the predatory bug Orius laevigatus. The growth rate of thrips larvae was positively affected by the presence of virus in the host plant. Because large larvae are invulnerable to predation by the two species of predatory mites, this resulted in a shorter period of vulnerability to predation for thrips that developed on virus-infected plants than for thrips developing on uninfected plants (4.4 vs. 7.9 days, respectively). Because large thrips larvae remain vulnerable to predation by the predatory bug Orius laevigatus, infection of the plant did not affect the predation risk of thrips larvae from this predator. This is the first demonstration of a negative effect of a plant pathogen on the predation risk of its vector.

    Brane-World Gravity

    The observable universe could be a (1+3)-surface (the "brane") embedded in a (1+3+d)-dimensional spacetime (the "bulk"), with Standard Model particles and fields trapped on the brane while gravity is free to access the bulk. At least one of the d extra spatial dimensions could be very large relative to the Planck scale, which lowers the fundamental gravity scale, possibly even down to the electroweak (~TeV) level. This revolutionary picture arises in the framework of recent developments in M theory. The 1+10-dimensional M theory encompasses the known 1+9-dimensional superstring theories, and is widely considered to be a promising potential route to quantum gravity. At low energies, gravity is localized at the brane and general relativity is recovered, but at high energies gravity "leaks" into the bulk, behaving in a truly higher-dimensional way. This introduces significant changes to gravitational dynamics and perturbations, with interesting and potentially testable implications for high-energy astrophysics, black holes, and cosmology. Brane-world models offer a phenomenological way to test some of the novel predictions and corrections to general relativity that are implied by M theory. This review analyzes the geometry, dynamics and perturbations of simple brane-world models for cosmology and astrophysics, mainly focusing on warped 5-dimensional brane-worlds based on the Randall-Sundrum models. We also cover the simplest brane-world models in which 4-dimensional gravity on the brane is modified at low energies: the 5-dimensional Dvali-Gabadadze-Porrati models. Then we discuss co-dimension two branes in 6-dimensional models. Comment: A major update of Living Reviews in Relativity 7:7 (2004) "Brane-World Gravity", 119 pages, 28 figures; the update contains new material on RS perturbations, including full numerical solutions of gravitational waves and scalar perturbations, on DGP models, and also on 6D models. A published version in Living Reviews in Relativity.
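
    The claim that large extra dimensions lower the fundamental gravity scale can be made explicit with the standard volume relation for flat (ADD-type) compactifications, shown below; the warped Randall-Sundrum models the review focuses on replace the volume factor by an integral over the warp factor, but the qualitative point is the same.

    ```latex
    % Effective 4D Planck mass from a fundamental scale M_* and d flat extra
    % dimensions of characteristic size L (compactification volume V_d ~ L^d):
    \begin{equation}
      M_{\mathrm{Pl}}^{2} \sim M_{*}^{\,2+d}\, V_{d} \sim M_{*}^{\,2+d} L^{d} ,
    \end{equation}
    % so a sufficiently large L allows M_* to lie near the TeV scale while the
    % observed M_Pl ~ 10^{19} GeV is reproduced.
    ```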

    Performance of CMS muon reconstruction in pp collision events at sqrt(s) = 7 TeV

    The performance of muon reconstruction, identification, and triggering in CMS has been studied using 40 inverse picobarns of data collected in pp collisions at sqrt(s) = 7 TeV at the LHC in 2010. A few benchmark sets of selection criteria covering a wide range of physics analysis needs have been examined. For all considered selections, the efficiency to reconstruct and identify a muon with a transverse momentum pT larger than a few GeV is above 95% over the whole region of pseudorapidity covered by the CMS muon system, abs(eta) < 2.4, while the probability to misidentify a hadron as a muon is well below 1%. The efficiency to trigger on single muons with pT above a few GeV is higher than 90% over the full eta range, and typically substantially better. The overall momentum scale is measured to a precision of 0.2% with muons from Z decays. The transverse momentum resolution varies from 1% to 6% depending on pseudorapidity for muons with pT below 100 GeV and, using cosmic rays, it is shown to be better than 10% in the central region up to pT = 1 TeV. Observed distributions of all quantities are well reproduced by the Monte Carlo simulation. Comment: Replaced with published version. Added journal reference and DOI.
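
    The efficiencies quoted here are, at bottom, ratios of counts (probes passing a selection over all probes). The sketch below shows that ratio with a simple binomial uncertainty, using invented counts; it does not reproduce the CMS tag-and-probe procedure or its treatment of backgrounds and correlations.

    ```python
    # Toy selection efficiency with a normal-approximation binomial uncertainty.
    # Counts are invented for illustration; not the CMS tag-and-probe analysis.
    import math

    def efficiency(n_pass: int, n_total: int) -> tuple[float, float]:
        eff = n_pass / n_total
        err = math.sqrt(eff * (1.0 - eff) / n_total)
        return eff, err

    eff, err = efficiency(n_pass=9612, n_total=10000)
    print(f"efficiency = {eff:.4f} +/- {err:.4f}")
    ```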

    X-ray emission from the Sombrero galaxy: discrete sources

    We present a study of discrete X-ray sources in and around the bulge-dominated, massive Sa galaxy, Sombrero (M104), based on new and archival Chandra observations with a total exposure of ~200 ks. With a detection limit of L_X = 1E37 erg/s and a field of view covering a galactocentric radius of ~30 kpc (11.5 arcminute), 383 sources are detected. Cross-correlation with Spitler et al.'s catalogue of Sombrero globular clusters (GCs) identified from HST/ACS observations reveals 41 X-ray sources in GCs, presumably low-mass X-ray binaries (LMXBs). We quantify the differential luminosity functions (LFs) for both the detected GC and field LMXBs, whose power-law indices (~1.1 for the GC LF and ~1.6 for the field LF) are consistent with previous studies of elliptical galaxies. With precise sky positions of the GCs without a detected X-ray source, we further quantify, through a fluctuation analysis, the GC LF at fainter luminosities down to 1E35 erg/s. The derived index rules out a faint-end slope flatter than 1.1 at 2 sigma significance, contrary to recent findings in several elliptical galaxies and the bulge of M31. On the other hand, the 2-6 keV unresolved emission places a tight constraint on the field LF, implying a flattened index of ~1.0 below 1E37 erg/s. We also detect 101 sources in the halo of Sombrero. These sources cannot all be interpreted as galactic LMXBs, whose spatial distribution empirically follows the starlight. Their number is also higher than the expected number of cosmic AGNs (52+/-11 [1 sigma]) whose surface density is constrained by deep X-ray surveys. We suggest that either the cosmic X-ray background is unusually high in the direction of Sombrero, or a distinct population of X-ray sources is present in the halo of Sombrero. Comment: 11 figures, 5 tables, ApJ in press.
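
    Schematically, the differential luminosity functions quantified in this study are power laws in source luminosity (normalisation omitted):

    ```latex
    % Differential X-ray luminosity function; the abstract's fitted slopes are
    % alpha ~ 1.1 for the globular-cluster LMXBs and alpha ~ 1.6 for field LMXBs.
    \begin{equation}
      \frac{dN}{dL_X} \propto L_X^{-\alpha}
    \end{equation}
    ```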