
    Self Optimizing Control Of An Evaporation Process Under Noisy Measurements

    Recently, Cao (2004) presented a decentralized cascade self-optimizing control strategy and implemented it on an evaporation process. In this method, the local optimality condition of a self-optimizing control system is derived and expressed as a gradient function in terms of the existing process measurements. This gradient function can then be used as a controlled variable to achieve local self-optimization. Good results were obtained with noise-free measurements, but performance deteriorates when measurement noise is present. This paper presents a method to overcome the detrimental effect of measurement noise on self-optimising control. Filtering the process measurements in conjunction with self-optimising control can reduce the effect of measurement noise on process performance. The benefit of this method is quantified in terms of the total operating cost reduction compared to non-filtered gradient control. A comparison of operating costs over a 10-hour period for various cases subject to the same disturbances clearly shows that the proposed strategy reduces the operating cost.
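    The abstract describes the filtering idea only at a high level; the following is a minimal, hypothetical sketch of it, assuming a first-order (exponential) low-pass filter and a placeholder gradient function used as the self-optimizing controlled variable. Neither the filter constant nor the gradient expression is taken from the paper.

```python
import numpy as np

# Hypothetical sketch only: filter noisy measurements before evaluating a
# gradient-based controlled variable. The gradient function and filter
# constant below are illustrative assumptions, not taken from the paper.

def gradient_cv(y):
    """Placeholder gradient function of the measurements; the self-optimizing
    controller would drive this controlled variable to zero."""
    return y[0] - 0.5 * y[1]

def low_pass(prev, sample, alpha=0.1):
    """First-order exponential filter: blend the new sample with the old estimate."""
    return alpha * sample + (1.0 - alpha) * prev

rng = np.random.default_rng(0)
y_true = np.array([2.0, 1.0])   # nominal steady-state measurements
y_filt = y_true.copy()          # filter state

for _ in range(100):
    y_noisy = y_true + rng.normal(0.0, 0.05, size=2)  # additive measurement noise
    y_filt = low_pass(y_filt, y_noisy)

print(f"unfiltered CV: {gradient_cv(y_noisy):+.3f}  filtered CV: {gradient_cv(y_filt):+.3f}")
```

    The filtered gradient fluctuates far less than the raw one, which is the mechanism the paper exploits to keep the operating cost close to its noise-free optimum.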

    Looking inside the black box: a theory-based process evaluation alongside a randomised controlled trial of printed educational materials (the Ontario printed educational message, OPEM) to improve referral and prescribing practices in primary care in Ontario, Canada

    Background: Randomised controlled trials of implementation strategies tell us whether (or not) an intervention results in changes in professional behaviour but little about the causal mechanisms that produce any change. Theory-based process evaluations collect data on theoretical constructs alongside randomised trials to explore possible causal mechanisms and effect modifiers. This is similar to measuring intermediate endpoints in clinical trials to further understand the biological basis of any observed effects (for example, measuring lipid profiles alongside trials of lipid-lowering drugs where the primary endpoint could be reduction in vascular-related deaths). This study protocol describes a theory-based process evaluation alongside the Ontario Printed Educational Message (OPEM) trial. We hypothesize that the OPEM interventions are most likely to operate through changes in physicians' behavioural intentions due to improved attitudes or subjective norms, with little or no change in perceived behavioural control. We will test this hypothesis using a well-validated social cognition model, the theory of planned behaviour (TPB), which incorporates these constructs. Methods/design: We will develop theory-based surveys using standard methods based upon the TPB for the second and third replications, and survey a subsample of Ontario family physicians from each arm of the trial two months before and six months after the dissemination of the index edition of informed, the evidence-based newsletter used for the interventions. In the third replication, our study will converge with the "TRY-ME" protocol (a second study conducted alongside the OPEM trial), in which the content of educational messages was constructed using both standard methods and methods informed by psychological theory. We will modify Dillman's total design method to maximise response rates. Preliminary analyses will initially assess the internal reliability of the measures and use regression to explore the relationships between predictor and dependent variables (intention to advise diabetic patients to have annual retinopathy screening and to prescribe thiazide diuretics for first-line treatment of uncomplicated hypertension). We will then compare groups using methods appropriate for comparing independent samples to determine whether there have been changes in the predicted constructs (attitudes, subjective norms, or intentions) across the study groups as hypothesised, and will assess the convergence between the process evaluation results and the main trial results. The OPEM trial and OPEM process evaluation are funded by the Canadian Institutes of Health Research (CIHR). The OPEM process evaluation study was developed as part of the CIHR-funded interdisciplinary capacity enhancement team KT-ICEBeRG. Gaston Godin, Jeremy Grimshaw and France Légaré hold Canada Research Chairs. Louise Lemyre holds an R.S. McLaughlin Research Chair.
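    As a purely illustrative sketch of the planned regression analysis, the snippet below fits an ordinary least-squares model of behavioural intention on the three TPB constructs. The data and variable names are synthetic assumptions; the study itself will analyse physicians' survey responses.

```python
import numpy as np

# Synthetic illustration of a TPB regression:
# intention ~ attitude + subjective norm + perceived behavioural control.
# Scores and effect sizes below are made up purely for demonstration.

rng = np.random.default_rng(1)
n = 120
attitude = rng.normal(5.0, 1.0, n)      # assumed 7-point scale scores
subj_norm = rng.normal(4.5, 1.2, n)
pbc = rng.normal(5.5, 0.8, n)
intention = 0.5 * attitude + 0.3 * subj_norm + 0.1 * pbc + rng.normal(0.0, 0.5, n)

X = np.column_stack([np.ones(n), attitude, subj_norm, pbc])  # design matrix with intercept
coef, *_ = np.linalg.lstsq(X, intention, rcond=None)         # OLS fit
print(dict(zip(["intercept", "attitude", "subjective_norm", "pbc"], np.round(coef, 2))))
```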

    Boundary operator algebras for free uniform tree lattices

    Let $X$ be a finite connected graph, each of whose vertices has degree at least three. The fundamental group $\Gamma$ of $X$ is a free group and acts on the universal covering tree $\Delta$ and on its boundary $\partial\Delta$, endowed with a natural topology and Borel measure. The crossed product $C^*$-algebra $C(\partial\Delta) \rtimes \Gamma$ depends only on the rank of $\Gamma$ and is a Cuntz-Krieger algebra whose structure is explicitly determined. The crossed product von Neumann algebra does not possess this rigidity. If $X$ is homogeneous of degree $q+1$ then the von Neumann algebra $L^\infty(\partial\Delta) \rtimes \Gamma$ is the hyperfinite factor of type $III_\lambda$, where $\lambda = 1/q^2$ if $X$ is bipartite and $\lambda = 1/q$ otherwise.

    Globalization, austerity and health equity politics: taming the inequality machine, and why it matters.

    The recognition that globalization has an important role in explaining health inequalities has now moved into the mainstream. Much of that role relates to what has been called ‘[t]he inequality machine [that] is reshaping the planet’. At the same time, more attention must be paid to how the state can tame the inequality machine or compensate for its effects. I argue that governments have more flexibility in this respect than is often acknowledged. With an emphasis on current and recent social policy in Britain, I illustrate the need for researchers and practitioners to focus not only on external constraints associated with globalization but also on domestic political mechanisms and dynamics that may limit the extent to which governments can reduce health inequalities by addressing underlying social determinants.

    Automated Cryptographic Analysis of the Pedersen Commitment Scheme

    In the pursuit of strong security assurance, there has recently been increasing interest in the formal verification of cryptographic constructions. This paper presents a mechanised formal verification of the popular Pedersen commitment protocol, proving its security properties of correctness, perfect hiding, and computational binding. To formally verify the protocol, we extended the theory of EasyCrypt, a framework which allows for reasoning in the computational model, to support the discrete logarithm and an abstraction of commitment protocols. Commitments are building blocks of many cryptographic constructions, for example, verifiable secret sharing, zero-knowledge proofs, and e-voting. Our work paves the way for the verification of those more complex constructions. Comment: 12 pages, conference MMM-ACNS 201
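    For readers unfamiliar with the construction being verified, the sketch below shows the arithmetic of a Pedersen commitment in plain Python with deliberately tiny, insecure toy parameters. The paper itself works in EasyCrypt; the group parameters and helper names here are illustrative assumptions.

```python
import secrets

# Toy illustration of the Pedersen commitment scheme (not the paper's EasyCrypt
# development). Parameters are deliberately small and insecure: p = 2q + 1 is a
# safe prime, and g, h generate the order-q subgroup of Z_p^*, with log_g(h)
# assumed unknown to the committer.

p, q = 23, 11
g, h = 4, 9        # elements of the order-11 subgroup of Z_23^*

def commit(m: int, r: int) -> int:
    """Commitment c = g^m * h^r mod p."""
    return (pow(g, m % q, p) * pow(h, r % q, p)) % p

def open_commitment(c: int, m: int, r: int) -> bool:
    """Verifier recomputes the commitment from the revealed pair (m, r)."""
    return c == commit(m, r)

m = 7                                 # message in Z_q
r = secrets.randbelow(q)              # blinding factor, uniform in Z_q
c = commit(m, r)
assert open_commitment(c, m, r)       # correctness
assert not open_commitment(c, m + 1, r)
print(f"commitment to {m}: {c}")
```

    Because r is uniform in Z_q, the commitment reveals nothing about m (perfect hiding), while opening it to a different message would require knowing log_g(h) (computational binding); these are exactly the properties the paper proves mechanically.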

    Effective action for the Yukawa model in curved spacetime

    We consider the one-loop renormalization of a real scalar field interacting with a Dirac spinor field in curved spacetime. A general Yukawa interaction is considered which includes both a scalar and a pseudoscalar coupling. The scalar field is assumed to be non-minimally coupled to the gravitational field and to have a general quartic self-interaction potential. All of the one-loop renormalization group functions are evaluated, and in the special case where there is no mass scale present in the classical theory (apart from the fields) we evaluate the one-loop effective action up to and including order $R^2$ in the curvature. In the case where the fermion is massive we include a chiral term in $\gamma_5$ and we show that, although the $\gamma_5$ term can be removed by a redefinition of the spinor field, an anomaly in the effective action arises that is related to the familiar axial current anomaly. Comment: 28 pages
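    For orientation, one conventional way of writing the kind of classical action the abstract describes is sketched below; the signs, normalisations, and coupling names ($\xi$, $\lambda$, $g$, $g'$) are standard choices assumed here, not taken from the paper.

```latex
% Illustrative form of a non-minimally coupled scalar with a quartic potential
% and a general (scalar + pseudoscalar) Yukawa coupling to a Dirac spinor:
S = \int d^4x \sqrt{-g}\,\Big[
      \tfrac{1}{2}\,\nabla_\mu\phi\,\nabla^\mu\phi
    - \tfrac{1}{2}\,m^2\phi^2
    - \tfrac{1}{2}\,\xi R\,\phi^2
    - \tfrac{\lambda}{4!}\,\phi^4
    + \bar\psi\,(i\gamma^\mu\nabla_\mu - m_\psi)\,\psi
    - \bar\psi\,(g + i g'\gamma_5)\,\psi\,\phi
  \Big]
```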

    The development of a theory-based intervention to promote appropriate disclosure of a diagnosis of dementia

    Background: The development and description of interventions to change professional practice are often limited by the lack of an explicit theoretical and empirical basis. We set out to develop an intervention to promote appropriate disclosure of a diagnosis of dementia based on theoretical and empirical work. Methods: We identified three key disclosure behaviours: finding out what the patient already knows or suspects about their diagnosis; using the actual words 'dementia' or 'Alzheimer's disease' when talking to the patient; and exploring what the diagnosis means to the patient. We conducted a questionnaire survey of older people's mental health teams (MHTs) based upon theoretical constructs from the Theory of Planned Behaviour (TPB) and Social Cognitive Theory (SCT) and used the findings to identify factors that predicted mental health professionals' intentions to perform each behaviour. We selected behaviour change techniques likely to alter these factors. Results: The change techniques selected were: persuasive communication to target subjective norm; behavioural modelling and graded tasks to target self-efficacy; persuasive communication to target attitude towards the use of explicit terminology when talking to the patient; and behavioural modelling by MHTs to target perceived behavioural control for finding out what the patient already knows or suspects and exploring what the diagnosis means to the patient. We operationalised these behaviour change techniques using an interactive 'pen and paper' intervention designed to increase intentions to perform the three target behaviours. Conclusion: It is feasible to develop an intervention to change professional behaviour based upon theoretical models, empirical data and evidence-based behaviour change techniques. The next step is to evaluate the effect of such an intervention on behavioural intention. We argue that this approach to the development and reporting of interventions will contribute to the science of implementation by providing replicable interventions that illuminate the principles and processes underlying change. This project is funded by the UK Medical Research Council, grant reference number G0300999. Jeremy Grimshaw holds a Canada Research Chair in Health Knowledge Transfer and Uptake. Jill Francis is funded by the Chief Scientist Office of the Scottish Government Health Directorate. The views expressed in this study are those of the authors.

    Popular music and/as event: subjectivity, love and fidelity in the aftermath of rock ’n’ roll

    This article concerns the usefulness of attaching a philosophy of the event to popular music studies. I am attempting to think about the ways that rock ’n’ roll functions as a musical revolution that becomes subjected to a narrative of loss accompanying the belief that the revolution has floundered, or even disappeared completely. In order to think about what this narrative of loss might entail I have found myself going back to the emergence of rock ’n’ roll, to what we might term its ‘event’, and then working towards the present to take stock of the current situation. The article is divided into three parts. Part One attempts to think of the emergence of rock ’n’ roll and its attendant discourse alongside Alain Badiou’s notion of event, looking at ways in which listening subjects are formed. Part Two continues the discussion of listening subjectivity while shifting the focus to objects associated with phonography. Part Three attends to a number of difficulties encountered in the Badiouian project and asks to what extent rock music might be thought of as a lost cause. All three parts deal with notions of subjectivity, love and fidelity.

    Abstracting Asynchronous Multi-Valued Networks: An Initial Investigation

    Multi-valued networks provide a simple yet expressive qualitative, state-based modelling approach for biological systems. In this paper we develop an abstraction theory for asynchronous multi-valued network models that allows the state space of a model to be reduced while preserving key properties of the model. The abstraction theory therefore provides a mechanism for coping with the state space explosion problem and supports the analysis and comparison of multi-valued networks. We take as our starting point the abstraction theory for synchronous multi-valued networks, which is based on the finite set of traces that represent the behaviour of such a model. The problem with extending this approach to the asynchronous case is that we can now have an infinite set of traces associated with a model, making a simple trace inclusion test infeasible. To address this we develop a decision procedure for checking asynchronous abstractions based on using the finite state graph of an asynchronous multi-valued network to reason about its trace semantics. We illustrate the abstraction techniques developed by considering a detailed case study based on a multi-valued network model of the regulation of tryptophan biosynthesis in Escherichia coli. Comment: Presented at MeCBIC 201
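    To make the objects concrete, the sketch below builds the finite state graph of a toy two-variable asynchronous multi-valued network; the update rules and the unit-step asynchronous semantics are illustrative assumptions rather than the paper's formal definitions.

```python
from itertools import product

# Toy asynchronous multi-valued network over two variables a, b with values
# {0, 1, 2}. Each variable has a next-value function of the global state; under
# the asynchronous semantics assumed here, at most one variable is updated per
# step and it moves only one level towards its target value.

VALUES = (0, 1, 2)

def next_a(a, b):      # illustrative update rules
    return min(2, b + 1)

def next_b(a, b):
    return max(0, a - 1)

def successors(state):
    """Asynchronous successors: update one variable, stepping it by at most 1."""
    a, b = state
    succs = set()
    for var, target in (("a", next_a(a, b)), ("b", next_b(a, b))):
        cur = a if var == "a" else b
        if target != cur:
            step = cur + (1 if target > cur else -1)   # unit step towards target
            succs.add((step, b) if var == "a" else (a, step))
    return succs or {state}     # steady state loops on itself

# Build the full (finite) state graph.
graph = {s: successors(s) for s in product(VALUES, VALUES)}
print(graph[(0, 0)], graph[(2, 2)])
```

    Because the state graph is finite even though the set of traces it generates may be infinite, it is the natural object on which a decision procedure for trace-based abstraction checks can operate.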