
    Incidence and risk factors for placenta accreta/increta/percreta in the UK: a national case-control study.

    Placenta accreta/increta/percreta is associated with major pregnancy complications and is thought to be becoming more common. The aims of this study were to estimate the incidence of placenta accreta/increta/percreta in the UK and to investigate and quantify the associated risk factors.

    BNCI systems as a potential assistive technology: ethical issues and participatory research in the BrainAble project

    This paper highlights aspects of current research and thinking about ethical issues in Brain Computer Interface (BCI) and Brain-Neuronal Computer Interface (BNCI) research through the experience of one particular project, BrainAble, which is exploring and developing the potential of these technologies to enable people with complex disabilities to control computers. It describes how ethical practice has been developed both within the multidisciplinary research team and with participants. Results: The paper presents findings in which participants shared their views of the project prototypes, of the potential of BCI/BNCI systems as an assistive technology, and of their other possible applications. This draws attention to the importance of ethical practice in projects where high expectations of technologies and representations of “ideal types” of disabled users may reinforce stereotypes or drown out participant “voices”. Conclusions: Ethical frameworks for research and development in emergent areas such as BCI/BNCI systems should be based on broad notions of a “duty of care”, while being sufficiently flexible that researchers can adapt project procedures to participant needs. They need to be revisited frequently, not only in the light of experience but also to ensure that they reflect new research findings and ever more complex and powerful technologies.

    Beyond maternal death: improving the quality of maternal care through national studies of ‘near-miss’ maternal morbidity

    BACKGROUND: Studies of maternal mortality have been shown to result in important improvements to women’s health. It is now recognised that in countries such as the UK, where maternal deaths are rare, the study of near-miss severe maternal morbidity provides additional information to aid disease prevention, treatment and service provision. OBJECTIVES: To (1) estimate the incidence of specific near-miss morbidities; (2) assess the contribution of existing risk factors to incidence; (3) describe different interventions and their impact on outcomes and costs; (4) identify any groups in which outcomes differ; (5) investigate factors associated with maternal death; (6) compare an external confidential enquiry or a local review approach for investigating quality of care for affected women; and (7) assess the longer-term impacts. METHODS: Mixed quantitative and qualitative methods including primary national observational studies, database analyses, surveys and case studies overseen by a user advisory group. SETTING: Maternity units in all four countries of the UK. PARTICIPANTS: Women with near-miss maternal morbidities, their partners and comparison women without severe morbidity. MAIN OUTCOME MEASURES: The incidence, risk factors, management and outcomes of uterine rupture, placenta accreta, haemolysis, elevated liver enzymes and low platelets (HELLP) syndrome, severe sepsis, amniotic fluid embolism and pregnancy at advanced maternal age (≥ 48 years at completion of pregnancy); factors associated with progression from severe morbidity to death; associations between severe maternal morbidity and ethnicity and socioeconomic status; lessons for care identified by local and external review; economic evaluation of interventions for management of postpartum haemorrhage (PPH); women’s experiences of near-miss maternal morbidity; long-term outcomes; and models of maternity care commissioned through experience-led and standard approaches. 
RESULTS: Women and their partners reported long-term impacts of near-miss maternal morbidities on their physical and mental health. Older maternal age and caesarean delivery are associated with severe maternal morbidity in both current and future pregnancies. Antibiotic prescription for pregnant or postpartum women with suspected infection does not necessarily prevent progression to severe sepsis, which may be rapidly progressive. Delay of delivery for up to 48 hours may be safely undertaken in women with HELLP syndrome in whom there is no fetal compromise. Uterine compression sutures are a cost-effective second-line therapy for PPH. Medical comorbidities are associated with a fivefold increase in the odds of maternal death from direct pregnancy complications. External reviews identified more specific clinical messages for care than local reviews. Experience-led commissioning may be used to commission maternity services. LIMITATIONS: This programme used observational studies, some with limited sample size, and the possibility of uncontrolled confounding cannot be excluded. CONCLUSIONS: Implementation of the findings of this research could both prevent future severe pregnancy complications and improve pregnancy outcomes for women. One of the clearest findings relates to women with other medical and mental health problems in pregnancy and their risk of severe morbidity. Further research into models of pre-pregnancy, pregnancy and postnatal care is clearly needed.

    Formalization of the classification pattern: Survey of classification modeling in information systems engineering

    Formalization is becoming more common in all stages of the development of information systems, as a better understanding of its benefits emerges. Classification systems are ubiquitous, nowhere more so than in domain modeling. The classification pattern that underlies these systems provides a good case study of the move towards formalization, in part because it illustrates some of the barriers to formalization, including the formal complexity of the pattern and the ontological issues surrounding the ‘one and the many’. Powersets are a way of characterizing the (complex) formal structure of the classification pattern, and their formalization has been extensively studied in mathematics since Cantor’s work in the late 19th century. One can use this formalization to develop a useful benchmark. Various communities within Information Systems Engineering (ISE) are gradually working towards a formalization of the classification pattern. However, for most of these communities this work is incomplete, in that they have not yet arrived at a solution with the expressiveness of the powerset benchmark. This contrasts with the early, smooth adoption of powersets by other Information Systems communities to formalize, for example, relations. One way of understanding the varying rates of adoption is to recognize that the different communities carry different historical baggage. Many conceptual modeling communities emerged from work on database design, and this creates hurdles to adopting the high level of expressiveness of powersets. Another relevant factor is that these communities often feel, particularly in the case of domain modeling, a responsibility to explain the semantics of whatever formal structures they adopt.
This paper aims to make sense of the formalization of the classification pattern in ISE and surveys its history through the literature, starting from the relevant theoretical works of the mathematical literature and gradually shifting focus to the ISE literature. The literature survey follows the evolution of ISE’s understanding of how to formalize the classification pattern. The various proposals are assessed against the classical example of classification, the Linnaean taxonomy, formalized using powersets as a benchmark for formal expressiveness. The broad conclusions of the survey are that (1) the ISE community is currently in the early stages of understanding how to formalize the classification pattern, particularly with respect to the expressiveness exemplified by powersets, and (2) there is an opportunity to intervene and speed up the process of adoption by clarifying this expressiveness. Given the central place of the classification pattern in domain modeling, this intervention has the potential to lead to significant improvements. This work was supported by the UK Engineering and Physical Sciences Research Council (grant EP/K009923/1).
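The powerset benchmark discussed above can be made concrete with a small sketch. The fragment below is illustrative only and not from the paper: the organism names and the `powerset` helper are invented. It models a slice of the Linnaean example: a class such as a genus is one element of the powerset of the domain of individuals, and a class of classes (e.g. the category "genus" itself) lives one powerset level higher, which is where much of the pattern's formal complexity comes from.

```python
from itertools import chain, combinations

def powerset(s):
    """All subsets of s, i.e. the mathematical powerset P(s)."""
    items = list(s)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))]

# A toy domain of individual organisms (invented names).
organisms = {"Panthera leo #1", "Panthera tigris #1", "Canis lupus #1"}

# A classification such as the genus Panthera is simply one member of
# the powerset of the domain: a set of individuals.
panthera = frozenset({"Panthera leo #1", "Panthera tigris #1"})
assert panthera in powerset(organisms)

# Higher-order classes (classes of classes, e.g. the rank "genus")
# live in the powerset of the powerset, one level up.
genus_rank = frozenset({panthera, frozenset({"Canis lupus #1"})})
```

The point of the benchmark is expressiveness: a modeling language that can only talk about the first level (sets of individuals) cannot directly represent the second (sets of sets), which is exactly the gap the survey identifies.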

    Performance of CMS muon reconstruction in pp collision events at sqrt(s) = 7 TeV

    The performance of muon reconstruction, identification, and triggering in CMS has been studied using 40 inverse picobarns of data collected in pp collisions at sqrt(s) = 7 TeV at the LHC in 2010. A few benchmark sets of selection criteria covering a wide range of physics analysis needs have been examined. For all considered selections, the efficiency to reconstruct and identify a muon with a transverse momentum pT larger than a few GeV is above 95% over the whole region of pseudorapidity covered by the CMS muon system, abs(eta) < 2.4, while the probability to misidentify a hadron as a muon is well below 1%. The efficiency to trigger on single muons with pT above a few GeV is higher than 90% over the full eta range, and typically substantially better. The overall momentum scale is measured to a precision of 0.2% with muons from Z decays. The transverse momentum resolution varies from 1% to 6% depending on pseudorapidity for muons with pT below 100 GeV and, using cosmic rays, it is shown to be better than 10% in the central region up to pT = 1 TeV. Observed distributions of all quantities are well reproduced by the Monte Carlo simulation. Comment: replaced with published version; added journal reference and DOI.
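The acceptance quoted in the abstract can be unpacked with a short sketch. Pseudorapidity is defined as eta = -ln tan(theta/2), where theta is the polar angle from the beam axis. The snippet below is illustrative only: the 3 GeV pT threshold is an assumed stand-in for the abstract's "a few GeV", while the |eta| < 2.4 cut is the muon-system coverage stated above.

```python
import math

def pseudorapidity(theta):
    """eta = -ln tan(theta/2), with theta the polar angle from the beam axis."""
    return -math.log(math.tan(theta / 2.0))

def in_cms_muon_acceptance(pt_gev, theta):
    """Acceptance quoted in the abstract: |eta| < 2.4 and pT above a few GeV
    (3 GeV is used here purely as an illustrative threshold)."""
    return pt_gev > 3.0 and abs(pseudorapidity(theta)) < 2.4

# A muon emitted at 90 degrees to the beam has eta = 0: central, well inside.
assert in_cms_muon_acceptance(pt_gev=20.0, theta=math.pi / 2)
```

Particles emitted close to the beam line have large |eta| and escape the muon system, which is why the efficiency figures are quoted only within |eta| < 2.4.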

    Jet energy measurement with the ATLAS detector in proton-proton collisions at √s = 7 TeV

    The jet energy scale and its systematic uncertainty are determined for jets measured with the ATLAS detector at the LHC in proton-proton collision data at a centre-of-mass energy of √s = 7 TeV corresponding to an integrated luminosity of 38 pb^-1. Jets are reconstructed with the anti-kt algorithm with distance parameters R = 0.4 or R = 0.6. Jet energy and angle corrections are determined from Monte Carlo simulations to calibrate jets with transverse momenta pT ≥ 20 GeV and pseudorapidities |η| < 4.5. The jet energy systematic uncertainty is estimated using the single isolated hadron response measured in situ and in test-beams, exploiting the transverse momentum balance between central and forward jets in events with dijet topologies, and studying systematic variations in Monte Carlo simulations. The jet energy uncertainty is less than 2.5% in the central calorimeter region (|η| < 0.8) for jets with 60 ≤ pT < 800 GeV, and is at most 14% for pT < 30 GeV in the most forward region 3.2 ≤ |η| < 4.5. The jet energy is validated for jet transverse momenta up to 1 TeV to the level of a few percent using several in situ techniques, comparing against a well-known reference such as the recoiling photon pT, the sum of the transverse momenta of tracks associated to the jet, or a system of low-pT jets recoiling against a high-pT jet. More sophisticated jet calibration schemes are presented, based on calorimeter cell energy density weighting or hadronic properties of jets, aiming for an improved jet energy resolution and a reduced flavour dependence of the jet response. The systematic uncertainty of the jet energy determined from a combination of in situ techniques is consistent with the one derived from single hadron response measurements over a wide kinematic range. The nominal corrections and uncertainties are derived for isolated jets in an inclusive sample of high-pT jets. Special cases, such as event topologies with close-by jets or selections of samples with an enhanced content of jets originating from light quarks, heavy quarks or gluons, are also discussed and the corresponding uncertainties are determined. © 2013 CERN for the benefit of the ATLAS collaboration.
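The anti-kt distance measure referred to in the abstract is compact enough to sketch. The snippet below is an illustration, not ATLAS code, and the particle kinematics are invented: it computes the anti-kt pairwise distance d_ij = min(pT_i^-2, pT_j^-2) · ΔR_ij^2 / R^2 and the beam distance d_iB = pT_i^-2, and shows why soft particles cluster onto hard cores before anything else happens.

```python
import math

def antikt_dij(pt_i, y_i, phi_i, pt_j, y_j, phi_j, R=0.4):
    """Anti-kt pairwise distance: min(pT_i^-2, pT_j^-2) * dR^2 / R^2,
    with dR^2 = (y_i - y_j)^2 + dphi^2 measured in rapidity-azimuth."""
    dphi = abs(phi_i - phi_j)
    if dphi > math.pi:                      # wrap azimuthal difference
        dphi = 2 * math.pi - dphi
    dr2 = (y_i - y_j) ** 2 + dphi ** 2
    return min(pt_i ** -2, pt_j ** -2) * dr2 / R ** 2

def antikt_diB(pt_i):
    """Anti-kt beam distance: pT_i^-2."""
    return pt_i ** -2

# A soft particle near a hard one: d_ij is driven by the hard particle's
# large pT, so it is smaller than the soft particle's beam distance and
# the pair merges first -- the defining anti-kt behaviour, which yields
# cone-like jets centred on hard cores.
hard = (100.0, 0.0, 0.0)   # (pT [GeV], rapidity, phi) -- invented values
soft = (1.0, 0.1, 0.1)
assert antikt_dij(*hard, *soft) < antikt_diB(soft[0])
```

This inverse-pT weighting is what makes anti-kt jets regular in shape and is the reason simple geometric corrections like those described above work well.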

    Measurement of the inclusive and dijet cross-sections of b-jets in pp collisions at sqrt(s) = 7 TeV with the ATLAS detector

    The inclusive and dijet production cross-sections have been measured for jets containing b-hadrons (b-jets) in proton-proton collisions at a centre-of-mass energy of sqrt(s) = 7 TeV, using the ATLAS detector at the LHC. The measurements use data corresponding to an integrated luminosity of 34 pb^-1. The b-jets are identified using either a lifetime-based method, where secondary decay vertices of b-hadrons in jets are reconstructed using information from the tracking detectors, or a muon-based method, where the presence of a muon is used to identify semileptonic decays of b-hadrons inside jets. The inclusive b-jet cross-section is measured as a function of transverse momentum in the range 20 < pT < 400 GeV and rapidity in the range |y| < 2.1. The bbbar-dijet cross-section is measured as a function of the dijet invariant mass in the range 110 < m_jj < 760 GeV, the azimuthal angle difference between the two jets, and the angular variable chi in two dijet mass regions. The results are compared with next-to-leading-order QCD predictions. Good agreement is observed between the measured cross-sections and the predictions obtained using POWHEG + Pythia. MC@NLO + Herwig shows good agreement with the measured bbbar-dijet cross-section. However, it does not reproduce the measured inclusive cross-section well, particularly for central b-jets with large transverse momenta. Comment: 10 pages plus author list (21 pages total), 8 figures, 1 table; final version published in the European Physical Journal.

    Search for pair-produced long-lived neutral particles decaying to jets in the ATLAS hadronic calorimeter in pp collisions at √s = 8 TeV

    The ATLAS detector at the Large Hadron Collider at CERN is used to search for the decay of a scalar boson to a pair of long-lived particles, neutral under the Standard Model gauge group, in 20.3 fb^-1 of data collected in proton–proton collisions at √s = 8 TeV. This search is sensitive to long-lived particles that decay to Standard Model particles producing jets at the outer edge of the ATLAS electromagnetic calorimeter or inside the hadronic calorimeter. No significant excess of events is observed. Limits are reported on the product of the scalar boson production cross section times branching ratio into long-lived neutral particles as a function of the proper lifetime of the particles. Limits are reported for boson masses from 100 GeV to 900 GeV and long-lived neutral particle masses from 10 GeV to 150 GeV.