
    Open, more democratic or better at hiding? Two decades of local government transparency in the UK

    Local government has long been the site of experiments and innovation in transparency. Since the 1990s, waves of reforms have sought to open up local government in Britain, from the Freedom of Information (FOI) Act in the 2000s to Open Data in the 2010s. This paper looks across the evidence to see how well these new transparency tools have worked, who is using them and why. It then moves to analyse what impact the changes have had on local government, in line with the hopes of campaigners and the fears of (some) politicians. Have reforms succeeded in making local government more open, more accountable and more participative, and in what situations? Or have they, as some claim, simply driven decision making into other arenas, and made local bodies ‘better at hiding’? Finally, how does transparency sit with the fragmented and disjointed landscape of local politics today, from outsourcing and devolution to financial crisis?

    Comparative transcriptomics in the Triticeae

    Background: Barley and particularly wheat are two grass species of immense agricultural importance. In spite of polyploidization events within the latter, studies have shown that genotypically and phenotypically these species are very closely related and, indeed, fertile hybrids can be created by interbreeding. The advent of two genome-scale Affymetrix GeneChips now allows comparative studies of their transcriptomes. Results: We have used the Wheat GeneChip to create a "gene expression atlas" for the wheat transcriptome (cv. Chinese Spring). For this, we chose mRNA from a range of tissues and developmental stages closely mirroring a comparable study carried out for barley (cv. Morex) using the Barley1 GeneChip. This, together with large-scale clustering of the probesets from the two GeneChips into "homologous groups", has allowed us to perform a genomic-scale comparative study of expression patterns in these two species. We explore the influence of the polyploidy of wheat on the results obtained with the Wheat GeneChip and quantify the correlation between conservation in gene sequence and gene expression in wheat and barley. In addition, we show how the conservation of expression patterns can be used to assess, probeset by probeset, the reliability of the Wheat GeneChip. Conclusion: While there are many differences in expression at the level of individual genes and tissues, we demonstrate that the wheat and barley transcriptomes appear highly correlated. This finding is significant not only because, given the small evolutionary distance between the two species, it is widely expected, but also because it demonstrates that it is possible to use the two GeneChips for comparative studies. This is the case even though their probeset composition reflects rather different design principles as well as, of course, the present incomplete knowledge of the gene content of the two species. We also show that, in general, the Wheat GeneChip is not able to distinguish contributions from individual homoeologs. Furthermore, the comparison between the two species leads us to conclude that the conservation of both gene sequence and gene expression is positively correlated with absolute expression levels, presumably reflecting increased selection pressure on genes coding for proteins present at high levels. In addition, the results indicate the presence of a correlation between sequence and expression conservation within the Triticeae.
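
    As a rough illustration of the kind of cross-species comparison described above, the sketch below correlates expression profiles of matched "homologous groups" across comparable tissues. It is a hypothetical Python example with made-up array names, random placeholder data and an illustrative log2 transform; it is not the authors' actual analysis pipeline.

        # Hypothetical sketch: correlate wheat and barley expression profiles for
        # probesets clustered into "homologous groups", across matched tissues.
        # Array names, shapes and the log2 transform are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(0)
        n_groups, n_tissues = 1000, 10                       # homologous groups x matched tissues
        wheat = rng.lognormal(size=(n_groups, n_tissues))    # placeholder expression values
        barley = rng.lognormal(size=(n_groups, n_tissues))

        def rowwise_pearson(a, b):
            # Pearson correlation of each row of a with the matching row of b.
            a = a - a.mean(axis=1, keepdims=True)
            b = b - b.mean(axis=1, keepdims=True)
            num = (a * b).sum(axis=1)
            den = np.sqrt((a ** 2).sum(axis=1) * (b ** 2).sum(axis=1))
            return num / den

        r = rowwise_pearson(np.log2(wheat + 1), np.log2(barley + 1))
        print(f"median per-group expression correlation: {np.median(r):.2f}")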

    C-STICH2: emergency cervical cerclage to prevent miscarriage and preterm birth—study protocol for a randomised controlled trial

    Background: Cervical cerclage is a recognised treatment to prevent late miscarriage and pre-term birth (PTB). Emergency cervical cerclage (ECC) for cervical dilatation with exposed unruptured membranes is less common, and the potential benefits of cerclage are less certain. A randomised controlled trial is needed to accurately assess the effectiveness of ECC in preventing pregnancy loss compared to an expectant approach. Methods: C-STICH2 is a multicentre randomised controlled trial in which women presenting with cervical dilatation and unruptured exposed membranes at 16+0 to 27+6 weeks' gestation are randomised to ECC or expectant management. The trial design includes an 18-month internal pilot with an embedded qualitative process evaluation, a minimal data set and a within-trial health economic analysis. Inclusion criteria are age ≥16 years, singleton pregnancy, exposed membranes at the external os, gestation 16+0 to 27+6 weeks, and informed consent. Exclusion criteria are contraindication to cerclage, cerclage in situ or previous cerclage in this pregnancy. Randomisation occurs via an online service in a 1:1 ratio, using a minimisation algorithm to reduce chance imbalances in key prognostic variables (site, gestation and dilatation). The primary outcome is pregnancy loss, a composite including miscarriage, termination of pregnancy and perinatal mortality, defined as stillbirth and neonatal death in the first week of life. Secondary outcomes include all core outcomes for PTB. Two-year development outcomes will be assessed using general health and Parent Report of Children's Abilities-Revised (PARCA-R) questionnaires. The intended sample size is 260 participants (130 in each arm), based on a 60% rate of pregnancy loss in the expectant management arm and 40% in the ECC arm, with 90% power and alpha 0.05. Analysis will be by intention to treat. Discussion: To date there has been one small trial of ECC, in 23 participants, which included twin and singleton pregnancies. This small trial, along with the largest observational study (n = 161), found ECC to prolong pregnancy duration and reduce deliveries before 34 weeks' gestation. It is important to generate high-quality evidence on the effectiveness of ECC in preventing pregnancy loss, and to improve understanding of the prevalence of the condition and the frequency of complications associated with ECC. An adequately powered RCT will provide the highest quality evidence regarding optimum care for these women and their babies. Trial registration: ISRCTN Registry ISRCTN12981869. Registered on 13 June 2018.
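
    For orientation, the quoted sample size is consistent with the standard normal-approximation formula for comparing two proportions. The sketch below is a generic illustration of that calculation using the rates, power and alpha stated in the protocol; the formula, variable names and use of scipy are illustrative assumptions, not the trial statisticians' own code.

        # Two-proportion sample-size calculation with the quoted assumptions:
        # 60% vs 40% pregnancy loss, two-sided alpha 0.05, 90% power.
        from math import ceil
        from scipy.stats import norm

        p1, p2 = 0.60, 0.40            # expectant management vs ECC pregnancy-loss rates
        alpha, power = 0.05, 0.90

        z_a = norm.ppf(1 - alpha / 2)  # two-sided significance threshold
        z_b = norm.ppf(power)
        n_per_arm = ((z_a + z_b) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2))) / (p1 - p2) ** 2
        print(ceil(n_per_arm))         # about 127 per arm; the protocol specifies 130 per arm (260 total)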

    Can One Trust Quantum Simulators?

    Various fundamental phenomena of strongly correlated quantum systems, such as high-T_c superconductivity, the fractional quantum Hall effect, and quark confinement, are still awaiting a universally accepted explanation. The main obstacle is the computational complexity of solving even the most simplified theoretical models that are designed to capture the relevant quantum correlations of the many-body system of interest. In his seminal 1982 paper [Int. J. Theor. Phys. 21, 467], Richard Feynman suggested that such models might be solved by "simulation" with a new type of computer whose constituent parts are effectively governed by a desired quantum many-body dynamics. Measurements on this engineered machine, now known as a "quantum simulator", would reveal some unknown or difficult-to-compute properties of a model of interest. We argue that a useful quantum simulator must satisfy four conditions: relevance, controllability, reliability, and efficiency. We review the current state of the art of digital and analog quantum simulators. Whereas so far the majority of the focus, both theoretically and experimentally, has been on controllability of relevant models, we emphasize here the need for a careful analysis of reliability and efficiency in the presence of imperfections. We discuss how disorder and noise can impact these conditions, and illustrate our concerns with novel numerical simulations of a paradigmatic example: a disordered quantum spin chain governed by the Ising model in a transverse magnetic field. We find that disorder can decrease the reliability of an analog quantum simulator of this model, although large errors in local observables are introduced only for strong levels of disorder. We conclude that the answer to the question "Can we trust quantum simulators?" is... to some extent.
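
    For reference, the paradigmatic model mentioned above is usually written as a one-dimensional transverse-field Ising Hamiltonian; a standard form, with disorder entering through site-dependent couplings and/or fields, is shown below. The exact parameterisation used in the paper may differ.

        H = -\sum_{i} J_i \, \sigma^{z}_{i} \sigma^{z}_{i+1} \;-\; \sum_{i} h_i \, \sigma^{x}_{i},

    where \sigma^{x}_{i} and \sigma^{z}_{i} are Pauli operators on site i, and disorder corresponds to drawing the couplings J_i and/or the transverse fields h_i at random.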

    CCL2-CCR2 axis promotes metastasis of nasopharyngeal carcinoma by activating ERK1/2-MMP2/9 pathway

    Distant metastasis remains the major cause of treatment failure in nasopharyngeal carcinoma (NPC). In this study, the roles of chemokine C-C motif ligand 2 (CCL2) and its receptor, chemokine C-C motif receptor type 2 (CCR2), in NPC metastasis were investigated. Serum CCL2 levels and CCL2/CCR2 expression levels were markedly increased in NPC patients compared to non-tumor patients by ELISA and IHC analyses. High expression of CCL2/CCR2 was significantly associated with NPC metastasis and poor overall survival (OS). High expression of CCR2 was an independent adverse prognostic factor for OS and distant metastasis-free survival (DMFS). Overexpression of CCL2 and CCR2 was detected in highly metastatic NPC cell lines. Upregulating CCL2 and CCR2, respectively, in low-metastatic NPC cell lines promoted cell migration and invasion, and exogenous CCL2 enhanced motility in CCR2-overexpressing cells. Conversely, downregulating CCL2 and CCR2, respectively, in highly metastatic NPC cell lines by shRNA decreased cell migration and invasion. However, exogenous CCL2 could not rescue the weakened motility of CCR2-silenced cells. In a nude mouse model, distant metastasis was significantly facilitated in both the CCL2-overexpressing and CCR2-overexpressing groups, and more markedly in the CCR2-overexpressing group; distant metastasis was considerably inhibited in both the CCL2-silenced and CCR2-silenced groups. Dual overexpression of CCL2/CCR2 activated the extracellular signal-regulated kinase (ERK1/2) signaling pathway, which in turn induced upregulation of matrix metalloproteinases (MMP) 2 and 9 downstream. In conclusion, the CCL2-CCR2 axis promotes NPC metastasis by activating the ERK1/2-MMP2/9 pathway. This study helps to identify novel therapeutic targets for distant metastasis in NPC.

    “I like them, but won't ‘like’ them”: An examination of impression management associated with visible political party affiliation on Facebook

    Unlike traditional media, our interactions with political parties via social media are generally public, subject to scrutiny by others and, consequently, a self-presentation concern. This paper contributes to theory on impression management within social network sites (SNSs) by providing an understanding of the effect of visible affiliation on page ‘Liking’ behavior in the context of political parties; specifically, the possible association with social anxiety and the use of protective impression management. We predict that while users may be motivated to ‘Like’ a political party, some may feel socially anxious about the impressions their friends may derive from this action, and so ultimately choose to refrain from ‘Liking’ the party. Furthermore, we propose a new function of ‘Secret Likes’ (i.e. ‘Likes’ that others cannot see) as a means to increase gateway interactions. A survey of eligible voters (n = 225) was conducted in the month prior to the 2015 UK general election, examining behavior associated with the Facebook pages of the two largest political parties. The results support the prediction that conspicuous affiliation with political parties hinders the intention to ‘Like’ political pages and is associated with social anxiety. ‘Secret Likes’ were found to be a successful method of increasing gateway interactions. In addition to the theoretical contribution, implications for political party communications and site designers are considered.

    Hunt for new phenomena using large jet multiplicities and missing transverse momentum with ATLAS in 4.7 fb−1 of √s = 7 TeV proton-proton collisions

    Results are presented of a search for new particles decaying to large numbers of jets in association with missing transverse momentum, using 4.7 fb−1 of pp collision data at √s = 7 TeV collected by the ATLAS experiment at the Large Hadron Collider in 2011. The event selection requires missing transverse momentum, no isolated electrons or muons, and from ≥6 to ≥9 jets. No evidence is found for physics beyond the Standard Model. The results are interpreted in the context of a MSUGRA/CMSSM supersymmetric model, where, for large universal scalar mass m_0, gluino masses smaller than 840 GeV are excluded at the 95% confidence level, extending previously published limits. Within a simplified model containing only a gluino octet and a neutralino, gluino masses smaller than 870 GeV are similarly excluded for neutralino masses below 100 GeV.

    Measurement of the inclusive and dijet cross-sections of b-jets in pp collisions at sqrt(s) = 7 TeV with the ATLAS detector

    The inclusive and dijet production cross-sections have been measured for jets containing b-hadrons (b-jets) in proton-proton collisions at a centre-of-mass energy of sqrt(s) = 7 TeV, using the ATLAS detector at the LHC. The measurements use data corresponding to an integrated luminosity of 34 pb^-1. The b-jets are identified using either a lifetime-based method, where secondary decay vertices of b-hadrons in jets are reconstructed using information from the tracking detectors, or a muon-based method, where the presence of a muon is used to identify semileptonic decays of b-hadrons inside jets. The inclusive b-jet cross-section is measured as a function of transverse momentum in the range 20 < pT < 400 GeV and rapidity in the range |y| < 2.1. The b-bbar dijet cross-section is measured as a function of the dijet invariant mass in the range 110 < m_jj < 760 GeV, the azimuthal angle difference between the two jets, and the angular variable chi in two dijet mass regions. The results are compared with next-to-leading-order QCD predictions. Good agreement is observed between the measured cross-sections and the predictions obtained using POWHEG + Pythia. MC@NLO + Herwig shows good agreement with the measured b-bbar dijet cross-section. However, it does not reproduce the measured inclusive cross-section well, particularly for central b-jets with large transverse momenta.

    Jet energy measurement with the ATLAS detector in proton-proton collisions at √s = 7 TeV

    The jet energy scale and its systematic uncertainty are determined for jets measured with the ATLAS detector at the LHC in proton-proton collision data at a centre-of-mass energy of √s = 7 TeV, corresponding to an integrated luminosity of 38 pb−1. Jets are reconstructed with the anti-kt algorithm with distance parameters R = 0.4 or R = 0.6. Jet energy and angle corrections are determined from Monte Carlo simulations to calibrate jets with transverse momenta pT ≥ 20 GeV and pseudorapidities |η| < 4.5. The jet energy systematic uncertainty is estimated using the single isolated hadron response measured in situ and in test-beams, exploiting the transverse momentum balance between central and forward jets in events with dijet topologies, and studying systematic variations in Monte Carlo simulations. The jet energy uncertainty is less than 2.5% in the central calorimeter region (|η| < 0.8) for jets with 60 ≤ pT < 800 GeV, and is maximally 14% for pT < 30 GeV in the most forward region 3.2 ≤ |η| < 4.5. The jet energy is validated for jet transverse momenta up to 1 TeV to the level of a few percent using several in situ techniques, by comparing with a well-known reference such as the recoiling photon pT, the sum of the transverse momenta of tracks associated to the jet, or a system of low-pT jets recoiling against a high-pT jet. More sophisticated jet calibration schemes are presented based on calorimeter cell energy density weighting or hadronic properties of jets, aiming for an improved jet energy resolution and a reduced flavour dependence of the jet response. The systematic uncertainty of the jet energy determined from a combination of in situ techniques is consistent with the one derived from single hadron response measurements over a wide kinematic range. The nominal corrections and uncertainties are derived for isolated jets in an inclusive sample of high-pT jets. Special cases such as event topologies with close-by jets, or selections of samples with an enhanced content of jets originating from light quarks, heavy quarks or gluons, are also discussed and the corresponding uncertainties are determined. © 2013 CERN for the benefit of the ATLAS collaboration
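
    For readers unfamiliar with the jet definition, the anti-kt algorithm referred to above repeatedly clusters the pair of entities with the smallest distance d_ij, with R the distance parameter (0.4 or 0.6 here). The measure below is the generic definition from the algorithm's literature, not anything specific to this measurement.

        d_{ij} = \min\left(p_{T,i}^{-2},\, p_{T,j}^{-2}\right) \frac{\Delta R_{ij}^{2}}{R^{2}}, \qquad
        d_{iB} = p_{T,i}^{-2}, \qquad
        \Delta R_{ij}^{2} = (y_i - y_j)^2 + (\phi_i - \phi_j)^2,

    where entities i and j are merged when d_{ij} is the smallest distance, and entity i is declared a jet and removed when its beam distance d_{iB} is smallest.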

    Single hadron response measurement and calorimeter jet energy scale uncertainty with the ATLAS detector at the LHC

    The uncertainty on the calorimeter energy response to jets of particles is derived for the ATLAS experiment at the Large Hadron Collider (LHC). First, the calorimeter response to single isolated charged hadrons is measured and compared to the Monte Carlo simulation using proton-proton collisions at centre-of-mass energies of sqrt(s) = 900 GeV and 7 TeV collected during 2009 and 2010. Then, using the decay of K_s and Lambda particles, the calorimeter response to specific types of particles (positively and negatively charged pions, protons, and anti-protons) is measured and compared to the Monte Carlo predictions. Finally, the jet energy scale uncertainty is determined by propagating the response uncertainty for single charged and neutral particles to jets. The response uncertainty is 2-5% for central isolated hadrons and 1-3% for the final calorimeter jet energy scale.
