
    Investigating a potential oncogenic role for AGR2 in pro-survival autophagy

    Members of the Protein Disulphide Isomerase (PDI) family have essential roles in mediating the oxidation, reduction and isomerisation of disulphide bonds during protein maturation in the endoplasmic reticulum. PDI family members are characterised by the presence of a thioredoxin motif (CXXC) whose dual cysteines confer oxidoreductase activity. However, PDI proteins that harbour evolutionarily divergent thioredoxin motifs have also been identified; one such protein is Anterior Gradient-2 (AGR2), which possesses a single cysteine residue in its thioredoxin-like domain (CPHS). AGR2 is required for the correct folding and secretion of mucins, the primary gel-forming proteins within mucus. Although its expression is normally restricted to certain secretory and reproductive organs, AGR2 is derepressed in various cancers. We have previously shown that AGR2 forms a disulphide-dependent interaction with the autophagy receptor Sequestosome 1 (SQSTM1). The oxidation of SQSTM1 is required for the stimulation of autophagy to promote cell survival under conditions of oxidative and proteotoxic stress. The disulphide-dependent interaction between AGR2 and SQSTM1 links AGR2 to autophagy for the first time. Tumours are exposed to high levels of oxidative stress, and autophagy can prevent cytotoxicity by removing cellular components damaged by oxidative stress that would otherwise be toxic to the cell. It is plausible, therefore, that the upregulation of SQSTM1 and AGR2 in human cancers could be involved in the induction of pro-survival autophagy.

    Callisto: a cryptographic approach to detecting serial perpetrators of sexual misconduct

    Sexual misconduct is prevalent in workplace and education settings, but stigma and the risk of further harm deter many victims from seeking justice. Callisto, a non-profit that has created an online sexual assault reporting platform for college campuses, is expanding its work to combat sexual assault and harassment in other industries. In this new product, users will be invited to an online "matching escrow" that will detect repeat perpetrators and create pathways to support for victims. Users submit encrypted data about their perpetrator, and this data can be decrypted by the Callisto Options Counselor (a lawyer) only when another user enters the identity of the same perpetrator. If the perpetrator identities match, both users will be put in touch independently with the Options Counselor, who will connect them to each other (if appropriate) and help them determine their best path towards justice. The client relationships with the Options Counselors are structured so that any client-counselor communications would be privileged. A combination of client-side encryption, encrypted communication channels, oblivious pseudo-random functions, key federation, and Shamir Secret Sharing keeps data confidential in transit, at rest, and during the matching process, with the guarantee that only the lawyer ever has access to user-submitted data, and even then only when a match is identified.
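    The abstract names Shamir Secret Sharing as one of the primitives that gate decryption on a match. As a minimal, hedged sketch of that primitive alone (the field prime, share count and function names below are illustrative assumptions, not details of Callisto's implementation), a (2, n) scheme hides a key as points on a random line, so any single share reveals nothing while any two shares recover the key:

```python
# Minimal (2, n) Shamir Secret Sharing over a prime field. The prime,
# share count and names are illustrative assumptions, not taken from
# the Callisto implementation.
import secrets

PRIME = 2**127 - 1  # a Mersenne prime used as the field modulus

def split_secret(secret: int, n_shares: int) -> list[tuple[int, int]]:
    """Split `secret` into n_shares points on a random line; any 2 recover it."""
    assert 0 <= secret < PRIME
    slope = secrets.randbelow(PRIME)            # random degree-1 coefficient
    return [(x, (secret + slope * x) % PRIME)   # f(x) = secret + slope * x
            for x in range(1, n_shares + 1)]

def recover_secret(a: tuple[int, int], b: tuple[int, int]) -> int:
    """Lagrange interpolation at x = 0 from two distinct shares."""
    (xa, ya), (xb, yb) = a, b
    inv = pow(xb - xa, -1, PRIME)               # modular inverse of (xb - xa)
    return (ya * xb * inv - yb * xa * inv) % PRIME  # f(0) = the secret

key = secrets.randbelow(PRIME)                  # stand-in for a data-encryption key
shares = split_secret(key, 3)
assert recover_secret(shares[0], shares[2]) == key
```

    Under such a scheme, a counselor could hold one share per report and reconstruct a decryption key only once two independent reports of the same perpetrator exist, mirroring the match-gated access the abstract describes.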

    How good are citizen weather stations? Addressing a biased opinion

    The UK is home to a dense network of Citizen Weather Stations (CWS), primarily set up by members of the public. The majority of these stations record air temperature, relative humidity and precipitation, amongst other variables, at sub-hourly intervals. This high-resolution network could benefit many applications, but only if the data quality is well characterised. Here we present results from an intercomparison field study in which popular CWS models were tested against Met Office standard equipment. The study identifies some common instrumental biases and their dependencies, which will help to quantify and correct such biases across the CWS network.
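    To make the correction step concrete, the sketch below fits a linear gain-and-offset bias for a CWS temperature sensor against a co-located reference and then inverts it. The synthetic data and the linear form are illustrative assumptions, not the biases or correction models reported by the study.

```python
# Hypothetical sketch of the kind of bias correction an intercomparison
# enables: fit CWS temperature against a reference station, then invert
# the fitted relationship to correct the CWS record.
import numpy as np

rng = np.random.default_rng(0)
reference = rng.uniform(-5, 30, 500)                    # reference-grade readings (degC)
cws = 1.04 * reference + 0.6 + rng.normal(0, 0.3, 500)  # CWS with gain + offset bias

# Least-squares fit: cws ~ gain * reference + offset
gain, offset = np.polyfit(reference, cws, 1)

def correct(cws_reading: float) -> float:
    """Invert the fitted linear bias to estimate the true temperature."""
    return (cws_reading - offset) / gain

print(f"gain={gain:.3f}, offset={offset:.3f}")
print(f"corrected 25.0 degC CWS reading -> {correct(25.0):.2f} degC")
```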

    UncertWeb processing service: making models easier to access on the web

    Models are central tools for modern scientists and decision makers, and there are many existing frameworks to support their creation, execution and composition. Many frameworks are based on proprietary interfaces, and do not lend themselves to the integration of models from diverse disciplines. Web-based systems, or systems based on web services, such as Taverna and Kepler, allow composition of models based on standard web service technologies. At the same time, the Open Geospatial Consortium has been developing its own service stack, which includes the Web Processing Service, designed to facilitate the execution of geospatial processing, including complex environmental models. The current Open Geospatial Consortium service stack employs Extensible Markup Language as its default data exchange standard, and widely used encodings such as JavaScript Object Notation can often only be used when incorporated within Extensible Markup Language. Similarly, the Web Processing Service standard has not yet been successfully engaged with the well-supported technologies of Simple Object Access Protocol and Web Services Description Language. In this paper we propose a pure Simple Object Access Protocol/Web Services Description Language processing service which addresses some of the issues with the Web Processing Service specification and brings us closer to achieving a degree of interoperability between geospatial models, and thus realising the vision of a useful 'model web'.
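    To illustrate what a WSDL-described service buys a client, the sketch below uses the zeep library to bind to a hypothetical WSDL document. The endpoint URL, operation name and parameters are invented for illustration; they are not the service the paper describes.

```python
# Hypothetical client of a SOAP/WSDL processing service. The WSDL URL,
# operation name and parameters below are invented for illustration;
# a real client would take them from the deployed service's WSDL.
from zeep import Client

client = Client("https://example.org/uncertweb/processing?wsdl")
client.wsdl.dump()  # print the operations and typed messages the WSDL declares

# Invoke a (hypothetical) processing operation; zeep marshals the typed
# request and response according to the WSDL definitions.
result = client.service.ExecuteProcess(
    processId="lotka-volterra",
    inputs={"realisations": 100},
)
```

    Because the interface is fully described by the WSDL document, generic tooling can generate the client binding without any service-specific code, which is the interoperability benefit the paper argues for.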

    Describing and communicating uncertainty within the semantic web

    The Semantic Web relies on carefully structured, well-defined data to allow machines to communicate and understand one another. In many domains (e.g. geospatial) the data being described contains some uncertainty, often due to incomplete knowledge; meaningful processing of this data requires these uncertainties to be carefully analysed and integrated into the process chain. Currently, within the Semantic Web there is no standard mechanism for the interoperable description and exchange of uncertain information, which renders the automated processing of such information implausible, particularly where error must be considered and captured as it propagates through a processing sequence. In particular, we adopt a Bayesian perspective and focus on the case where the inputs and outputs are naturally treated as random variables. This paper discusses a solution to the problem in the form of the Uncertainty Markup Language (UncertML). UncertML is a conceptual model, realised as an XML schema, that allows uncertainty to be quantified in a variety of ways, i.e. as realisations, statistics and probability distributions. UncertML is based on a soft-typed XML schema design that provides a generic framework from which any statistic or distribution may be created. Making extensive use of Geography Markup Language (GML) dictionaries, UncertML provides a collection of definitions for common uncertainty types. Containing both written descriptions and mathematical functions encoded as MathML, the definitions within these dictionaries provide a robust mechanism for defining any statistic or distribution, and can easily be extended. Uniform Resource Identifiers (URIs) introduce semantics to the soft-typed elements by linking to these dictionary definitions. The INTAMAP (INTeroperability and Automated MAPping) project provides a use case for UncertML. This paper demonstrates how observation errors can be quantified using UncertML and wrapped within an Observations & Measurements (O&M) Observation. The interpolation service uses the information within these observations to influence the prediction outcome. The output uncertainties may be encoded in a variety of UncertML types, e.g. a series of marginal Gaussian distributions; a set of statistics, such as the first three marginal moments; or a set of realisations from a Monte Carlo treatment. Quantifying and propagating uncertainty in this way allows such interpolation results to be consumed by other services. This could form part of a risk management chain or a decision support system, and ultimately paves the way for complex data processing chains in the Semantic Web.
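    The soft-typed design means an encoding names a generic element and points at a dictionary definition for its meaning. As a rough sketch of that pattern (the namespace, element names and dictionary URI below are approximations for illustration, not verified against the published UncertML schema), a marginal Gaussian might be built like this:

```python
# Rough sketch of a soft-typed, UncertML-style encoding of a Gaussian.
# The namespace, element names and dictionary URI are illustrative
# approximations of the described pattern, not verified schema details.
import xml.etree.ElementTree as ET

NS = "http://www.uncertml.org/"  # assumed namespace

gaussian = ET.Element(
    f"{{{NS}}}Distribution",
    definition="http://dictionary.uncertml.org/distributions/gaussian",
)
for name, value in (("mean", "14.2"), ("variance", "0.81")):
    param = ET.SubElement(gaussian, f"{{{NS}}}parameter", name=name)
    param.text = value

# The generic element carries no statistical meaning itself; the linked
# dictionary definition supplies it, which is what "soft-typed" buys.
print(ET.tostring(gaussian, encoding="unicode"))
```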

    Prevalence and outcomes of acute kidney injury in term neonates with perinatal asphyxia

    Background: The kidney is the most damaged organ in asphyxiated full-term infants, and the severity of its damage is correlated with the severity of neurological damage. We determined the prevalence of perinatal asphyxia-associated acute kidney injury (AKI). Methods: We conducted a prospective cohort study of 60 full-term neonates admitted to the Kenyatta National Hospital newborn unit (NBU) in Nairobi with hypoxic ischaemic encephalopathy (HIE) from June 2012 to November 2012. Renal function was assessed by measuring serum creatinine on day 3 of life. AKI was defined as a creatinine level above 133 μmol/l. The degree of neurological impairment was determined daily until discharge, death or day 7 of life. Results: Of the 60 infants, 36.6% had HIE I, 51.6% HIE II and 11.8% HIE III. The prevalence of AKI was 11.7%. There was a 15-fold increased risk of developing AKI in HIE III versus HIE I (p=0.034). The mortality rate in perinatal asphyxia-associated AKI was 71.4%, with a 24-fold increased risk of death in neonates with AKI (p=0.001). Conclusions: AKI is common and associated with poorer outcomes in perinatal asphyxia. Larger studies are needed to correlate maternal factors with perinatal asphyxia-associated AKI.

    Integrating OpenMI and UncertWeb: managing uncertainty in OpenMI models

    OpenMI is a widely used standard allowing the exchange of data between integrated models, and has mostly been applied to dynamic, deterministic models. Within the FP7 UncertWeb project we are developing mechanisms and tools to support the management of uncertainty in environmental models. In this paper we explore the integration of the UncertWeb framework with OpenMI, to assess the issues that arise when propagating uncertainty in OpenMI model compositions, and the degree of integration possible with UncertWeb tools. In particular, we develop an uncertainty-enabled model for a simple Lotka-Volterra system with an interface conforming to the OpenMI standard, exploring uncertainty in the initial predator and prey levels and in the parameters of the model equations. We use the Elicitator tool developed within UncertWeb to identify the initial-condition uncertainties, and show how these can be integrated, using UncertML, with simple Monte Carlo propagation mechanisms. The mediators we develop for OpenMI models are generic and produce standard Web services that expose the OpenMI models to a Web-based framework. We discuss what further work is needed to allow a more complete system to be developed, and show how this might be used practically.
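    The propagation step can be made concrete with a small sketch: sample the uncertain initial conditions, run the Lotka-Volterra equations for each sample, and summarise the ensemble. The distributions and rate constants below are illustrative stand-ins, not the elicited values or the OpenMI plumbing used in the paper.

```python
# Minimal Monte Carlo propagation of initial-condition uncertainty
# through a Lotka-Volterra model. Distributions and rate constants are
# illustrative assumptions, not the study's elicited values.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(42)
alpha, beta, delta, gamma = 1.0, 0.1, 0.075, 1.5    # assumed rate parameters

def lotka_volterra(t, z):
    prey, predator = z
    return [alpha * prey - beta * prey * predator,
            delta * prey * predator - gamma * predator]

t_eval = np.linspace(0, 15, 200)
runs = []
for _ in range(500):                                 # Monte Carlo ensemble
    prey0 = rng.normal(10, 1)                        # uncertain initial prey level
    pred0 = rng.normal(5, 0.5)                       # uncertain initial predator level
    sol = solve_ivp(lotka_volterra, (0, 15), [prey0, pred0], t_eval=t_eval)
    runs.append(sol.y[0])                            # keep the prey trajectory

runs = np.array(runs)
mean = runs.mean(axis=0)
lo, hi = np.percentile(runs, [2.5, 97.5], axis=0)    # 95% ensemble envelope
print(f"prey at t=15: {mean[-1]:.2f} (95% interval {lo[-1]:.2f}-{hi[-1]:.2f})")
```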

    Sociality and embodiment: online communication during and after Covid-19

    During the Covid-19 pandemic we increasingly turned to technology to stay in touch with our family, friends, and colleagues. Even as lockdowns and restrictions ease, many are encouraging us to embrace the replacement of face-to-face encounters with technologically mediated ones. Yet, as philosophers of technology have highlighted, technology can transform the situations we find ourselves in. Drawing on insights from the phenomenology of sociality, we consider how digitally enabled forms of communication and sociality affect our experience of one another. In particular, we draw attention to the way in which our embodied experience of one another is altered when we meet in digital spaces, taking as our focus the themes of perceptual access, intercorporeality, shared space, transitional spaces, and self-presentation. In light of the way in which technological mediation alters various dimensions of our social encounters, we argue that digital encounters constitute their own forms of sociality, requiring their own phenomenological analysis. We conclude by raising some broader concerns about the very framework of thinking about digitally and non-digitally mediated social encounters simply in terms of replacement.

    Finding a solution: Heparinised saline versus normal saline in the maintenance of invasive arterial lines in intensive care

    Background: We assessed the impact of heparinised saline versus 0.9% normal saline on arterial line patency. Maintaining the patency of arterial lines is essential for obtaining accurate physiological measurements, enabling blood sampling and minimising line replacement. Use of heparinised saline is associated with risks such as thrombocytopenia, haemorrhage and mis-selection. Historical studies draw variable conclusions but suggest that normal saline is at least as effective at maintaining line patency, although recent evidence has questioned this. Methods: We conducted a prospective analysis of heparinised saline versus normal saline in unselected patients in the intensive care unit of our hospital. Data concerning duration of insertion and reason for removal were collected for 471 lines. Results: We found a higher risk of blockage for lines flushed with normal saline than for those flushed with heparinised saline (RR = 2.15, 95% CI 1.392–3.32, p ≤ 0.001). Of the 56 lines that blocked initially (19 heparinised saline and 37 normal saline), 16 were replaced with new lines (5 heparinised saline and 11 normal saline); 5 of these replacement lines subsequently blocked again, 3 of which were flushed with normal saline. Conclusions: Our study demonstrates a clinically important reduction in arterial line longevity due to blockages when lines are flushed with normal saline compared with heparinised saline. These excess blockages have a significant clinical impact, with further lines being inserted after blockage, resulting in increased risk to patients, wasted time and cost of resources. Our findings suggest that the current UK guidance favouring normal saline flushes should be reviewed.
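    For readers unfamiliar with the headline statistic, the sketch below shows how a relative risk and its Wald 95% confidence interval are computed from a 2x2 table. The blockage counts (37 and 19) are taken from the abstract, but the per-arm denominators are hypothetical, since the abstract reports only the overall total of 471 lines.

```python
# How a relative risk and its 95% CI are derived from a 2x2 table.
# Blockage counts (37 vs 19) are from the abstract; the per-arm line
# counts below are HYPOTHETICAL, as only the overall total is reported.
import math

blocked_ns, total_ns = 37, 230   # normal saline arm (denominator assumed)
blocked_hs, total_hs = 19, 241   # heparinised saline arm (denominator assumed)

risk_ns = blocked_ns / total_ns
risk_hs = blocked_hs / total_hs
rr = risk_ns / risk_hs

# Standard error of log(RR), then a Wald 95% confidence interval
se = math.sqrt(1/blocked_ns - 1/total_ns + 1/blocked_hs - 1/total_hs)
ci_low = math.exp(math.log(rr) - 1.96 * se)
ci_high = math.exp(math.log(rr) + 1.96 * se)
print(f"RR = {rr:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```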