6,367 research outputs found

    Students as producers and active partners in enhancing equality and diversity: ‘culturosity’ at Canterbury Christ Church University

    Equality and diversity of truths, of opportunity, of outcome, of dignity and of identities lie at the heart of the idea of a university (Wolff, 1992, p. 68). However, although the UK ‘has well-established equality law and practice’ and the Equality Act 2010 requires universities to implement changes that protect their students and employees from various forms and effects of discrimination, ‘inequality remains, albeit often in more complex and subtle forms than have been understood before’, argues David Ruebain (2012, p. 3). This study contributes to the discussion of equality and diversity practices in the university context by proposing strategies for embedding equality and diversity, and the graduate attributes that follow from them, into students’ learning community. The case study is the Culturosity Project: an equality and diversity training initiative co-created by Dr Kasia Lech and a group of final-year students and graduates from Drama and Performing Arts programmes and delivered – as a Canterbury Christ Church University Partners in Learning project – to L4 and foundation-year students. The project was first delivered in 2015 and has since become part of student induction at the CCCU Faculty of Arts and Humanities.

    Identifying evidence of the effectiveness of photovoice: a systematic review and meta-analysis of the international healthcare literature

    BACKGROUND: Photovoice (PV) was conceptualized in the early 1990s to engage community members in capturing and communicating their lived experience narratives through photography. However, no meta-analyses in health research have assessed whether PV achieves its purported effects. METHODS: We carried forward any relevant references from a previous review identifying PV studies before 2008 and searched MEDLINE, Embase, PsycINFO and the Cochrane Central Register of Controlled Trials from 2008 up until October 2019. We included both published and grey literature, in any population or context. We assessed quality with the Effective Public Health Practice Project's (EPHPP) tool and pooled studies using the standardized mean difference (SMD) and 95% confidence intervals (CIs). RESULTS: Twenty-eight studies were included, showing significant post-treatment effects only for health knowledge (SMD, 95% CIs = 0.41, 0.09 to 0.73, n = 16) and community functions (SMD, 95% CIs = 0.22, 0.03 to 0.40, n = 4). Strong heterogeneity was indicated for health knowledge, potentially explained by a larger effect in ethnic minority populations. Follow-up data for health knowledge were insufficient, while for community functions the post-treatment effect was lost at follow-up. CONCLUSIONS: PV's post-treatment effect on health knowledge did not translate into positive health behaviours or physical and mental health outcomes, longer-term community functions, or health service outcomes.
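The inverse-variance pooling of standardized mean differences described above can be illustrated with a minimal fixed-effect sketch. The study values below are made up for illustration, and a fixed-effect model is a simplification: given the heterogeneity the review reports, a random-effects model would likely have been used.

```python
import math

def pool_smd(effects):
    """Inverse-variance (fixed-effect) pooling of standardized mean
    differences. `effects` holds (smd, variance) per study; returns
    the pooled SMD and its 95% confidence interval."""
    weights = [1.0 / var for _, var in effects]
    pooled = sum(w * d for (d, _), w in zip(effects, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-study SMDs and variances (not the review's data).
studies = [(0.5, 0.04), (0.3, 0.09), (0.45, 0.02)]
smd, (lo, hi) = pool_smd(studies)
```

Studies with smaller variance (larger samples) receive proportionally more weight, which is why the pooled estimate sits closest to the most precise study.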

    Thinking beyond the hybrid: “actually-existing” cities “after neoliberalism” in Boyle et al.

    In their article, ‘The spatialities of actually existing neoliberalism in Glasgow, 1977 to present’, Mark Boyle, Christopher McWilliams and Gareth Rice (2008) usefully problematise our current understanding of neoliberal urbanism. Our response develops a sympathetic but critical approach to Boyle et al.'s understanding of neoliberal urbanism as illustrated by the Glasgow example. In particular, Boyle et al.'s counterposing of a 'hybrid, mutant' model to a 'pure' model of neoliberalism, in our view, misrepresents existing models of neoliberalism as a perfectly finished object rather than a roughly mottled process. Because they do not identify any ‘pure’ model, they create a straw construct against which they can claim a more sophisticated, refined approach to the messiness of neoliberal urbanism. In contrast, we view neoliberalism as a contested and unstable response to accumulation crises at various scales of analysis.

    A Thermo-Compositional Model of the African Cratonic Lithosphere

    Recently, the continually increasing availability of seismic data has allowed high-resolution imaging of lithospheric structure beneath the African cratons. In this study, S-wave seismic tomography is combined with high-resolution satellite gravity data in an integrated approach to investigate the structure of the cratonic lithosphere of Africa. A new model for the Moho depth and data on the crustal density structure are employed along with global dynamic models to calculate residual topography and mantle gravity residuals. Corrections for thermal effects of an initially juvenile mantle are estimated based on S-wave tomography and mineral physics. Joint inversion of the residuals yields the compositional adjustments needed to recalculate the thermal effects. After several iterations, we obtain a consistent model of upper mantle temperature, thermal and compositional density variations, and Mg# as a measure of depletion, as well as an improved crustal density model. Our results show that thick and cold depleted lithosphere underlies the West African, northern to central eastern Congo, and Zimbabwe Cratons. However, for most of these regions, the areal extent of their depleted lithosphere differs from the respective exposed Archean shields. Meanwhile, the lithosphere of Uganda, Tanzania, most of eastern and southern Congo, and the Kaapvaal Craton is thinner, warmer, and shows little or no depletion. Furthermore, the results allow us to infer that the lithosphere of the exposed Archean shields of the Congo and West African cratons was depleted before the individual blocks were merged into their respective cratons.
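The iteration the abstract describes (invert residuals for a compositional adjustment, recompute the thermal effects it implies, repeat until consistent) is, at its core, a fixed-point loop. A purely schematic sketch with toy scalar stand-ins, not the authors' actual gravity/tomography inversion:

```python
def iterate_to_consistency(thermal0, invert_composition, recompute_thermal,
                           tol=1e-4, max_iter=50):
    """Fixed-point loop: invert the residual for a compositional
    adjustment, recompute the thermal effect it implies, and repeat
    until the thermal estimate stops changing."""
    thermal = thermal0
    for _ in range(max_iter):
        composition = invert_composition(thermal)     # compositional step
        new_thermal = recompute_thermal(composition)  # updated thermal effect
        if abs(new_thermal - thermal) < tol:          # converged?
            break
        thermal = new_thermal
    return thermal, composition

# Toy linear stand-ins (hypothetical); the loop converges to their
# joint fixed point rather than oscillating, because the combined
# update is a contraction.
thermal, comp = iterate_to_consistency(
    0.0,
    lambda t: 1.0 - 0.5 * t,  # invented composition-from-thermal relation
    lambda c: 0.2 * c,        # invented thermal effect of composition
)
```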

    Developing Public-Private Key Pairs Using Highly-Available Technology

    Research on IPv4 has investigated simulated annealing, and current trends suggest that the emulation of congestion control will soon emerge. In this paper, we show the development of randomized algorithms. Our focus is not on whether the well-known decentralized algorithm for the development of agents by Z. Wang [3] runs in Θ(n) time, but rather on constructing an analysis of von Neumann machines (Swamp). Although this at first glance seems perverse, it is supported by previous work in the field.

    Consolidating the set of known human protein-protein interactions in preparation for large-scale mapping of the human interactome

    BACKGROUND: Extensive protein interaction maps are being constructed for yeast, worm, and fly to ask how the proteins organize into pathways and systems, but no such genome-wide interaction map yet exists for the set of human proteins. To prepare for studies in humans, we wished to establish tests for the accuracy of future interaction assays and to consolidate the known interactions among human proteins. RESULTS: We established two tests of the accuracy of human protein interaction datasets and measured the relative accuracy of the available data. We then developed and applied natural language processing and literature-mining algorithms to recover from Medline abstracts 6,580 interactions among 3,737 human proteins. A three-part algorithm was used: first, human protein names were identified in Medline abstracts using a discriminator based on conditional random fields; next, interactions were identified by the co-occurrence of protein names across the set of Medline abstracts; finally, the interactions were filtered with a Bayesian classifier to enrich for legitimate physical interactions. These mined interactions were combined with existing interaction data to obtain a network of 31,609 interactions among 7,748 human proteins, accurate to the same degree as the existing datasets. CONCLUSION: These interactions and the accuracy benchmarks will aid interpretation of current functional genomics data and provide a basis for determining the quality of future large-scale human protein interaction assays. Projecting from the approximately 15 interactions per protein in the best-sampled interaction set to the estimated 25,000 human genes implies more than 375,000 interactions in the complete human protein interaction network. This set therefore represents no more than 10% of the complete network.
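The middle, co-occurrence step of the three-part algorithm can be sketched in a few lines. This is a toy illustration with invented gene symbols, not the paper's pipeline, which adds a conditional-random-field name recognizer upstream and a Bayesian classifier downstream:

```python
from collections import Counter
from itertools import combinations

def cooccurring_pairs(abstracts, min_count=2):
    """Count how often two recognized protein names appear in the same
    abstract; pairs at or above the threshold become candidate
    interactions for downstream filtering."""
    counts = Counter()
    for names in abstracts:  # each abstract reduced to its protein names
        for pair in combinations(sorted(names), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_count}

# Toy corpus: each abstract represented by the protein names found in it.
docs = [{"TP53", "MDM2"}, {"TP53", "MDM2", "BRCA1"}, {"BRCA1", "RAD51"}]
candidates = cooccurring_pairs(docs)
```

Raw co-occurrence alone is noisy (proteins can be mentioned together without physically interacting), which is why the paper's final filtering stage matters.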

    Scientific basis for safely shutting in the Macondo Well after the April 20, 2010 Deepwater Horizon blowout

    As part of the government response to the Deepwater Horizon blowout, a Well Integrity Team evaluated the geologic hazards of shutting in the Macondo Well at the seafloor and determined the conditions under which it could safely be undertaken. Of particular concern was the possibility that, under the anticipated high shut-in pressures, oil could leak out of the well casing below the seafloor. Such a leak could lead to new geologic pathways for hydrocarbon release to the Gulf of Mexico. Evaluating this hazard required analyses of 2D and 3D seismic surveys, seafloor bathymetry, sediment properties, geophysical well logs, and drilling data to assess the geological, hydrological, and geomechanical conditions around the Macondo Well. After the well was successfully capped and shut in on July 15, 2010, a variety of monitoring activities were used to assess subsurface well integrity. These activities included acquisition of wellhead pressure data, marine multichannel seismic profiles, seafloor and water-column sonar surveys, and wellhead visual/acoustic monitoring. These data showed that the Macondo Well was not leaking after shut in, and therefore, it could remain safely shut until reservoir pressures were suppressed (killed) with heavy drilling mud and the well was sealed with cement.

    Statistical Mechanics of Semi-Supervised Clustering in Sparse Graphs

    We theoretically study semi-supervised clustering in sparse graphs in the presence of pairwise constraints on the cluster assignments of nodes. We focus on bi-cluster graphs, and study the impact of semi-supervision for varying constraint density and overlap between the clusters. Recent results for unsupervised clustering in sparse graphs indicate that there is a critical ratio of within-cluster and between-cluster connectivities below which clusters cannot be recovered with better than random accuracy. The goal of this paper is to examine the impact of pairwise constraints on the clustering accuracy. Our results suggest that the addition of constraints does not provide automatic improvement over the unsupervised case. When the density of the constraints is sufficiently small, their only impact is to shift the detection threshold while preserving the criticality. Conversely, if the density of (hard) constraints is above the percolation threshold, the criticality is suppressed and the detection threshold disappears.
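The critical ratio referred to above appears to be the sparse-graph detectability threshold of Decelle et al.: for two equal-size clusters with within- and between-cluster mean degrees c_in and c_out, recovery better than chance requires |c_in - c_out| > 2*sqrt(c), where c is the overall mean degree. A minimal check, under that two-cluster assumption:

```python
import math

def detectable(c_in, c_out):
    """Two equal clusters in a sparse graph can be recovered better
    than chance only when |c_in - c_out| > 2*sqrt(c), with
    c = (c_in + c_out)/2 the mean degree (Decelle et al. threshold)."""
    c = (c_in + c_out) / 2.0
    return abs(c_in - c_out) > 2.0 * math.sqrt(c)

detectable(8, 2)  # True: gap 6 exceeds 2*sqrt(5) ≈ 4.47
detectable(6, 4)  # False: gap 2 is below the threshold
```

The two example calls share the same mean degree c = 5; only the within/between gap differs, which is exactly the quantity the threshold tests.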
