1,452 research outputs found

    From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument

    Background: Although empirical and theoretical understanding of processes of implementation in health care is advancing, the translation of theory into structured measures that capture the complex interplay between interventions, individuals, and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. Methods: A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. Results: The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. Conclusions: To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made, relating to (1) greater attention to underlying theoretical assumptions and the extent of translation work required; (2) the need for appropriate but flexible approaches to outcome measurement; (3) representation of multiple perspectives and the collaborative nature of the work; and (4) emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.
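    An illustrative aside on the kind of analysis the Results sentence describes: scoring a multi-item Likert instrument and testing its association with a single 'routineness' rating. This is a hypothetical sketch with randomly generated responses; the variable names, the 1-5 scale, and the analysis choice (Spearman correlation) are assumptions, not details from the TARS study.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical sketch: relate mean scores on 30 Likert items measuring
# NPT 'processes' to a single-item rating of whether e-health feels 'routine'.
# All data below are randomly generated placeholders, not the TARS data.
rng = np.random.default_rng(0)
n_staff = 46                                            # size of the first pre-test sample
process_items = rng.integers(1, 6, size=(n_staff, 30))  # 30 items, 1-5 Likert scale
routine_rating = rng.integers(1, 6, size=n_staff)       # perceived 'routineness', 1-5

process_score = process_items.mean(axis=1)              # simple mean-score scale
rho, p = spearmanr(process_score, routine_rating)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```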

    Improving the normalization of complex interventions: measure development based on normalization process theory (NoMAD): study protocol

    Background: Understanding implementation processes is key to ensuring that complex interventions in healthcare are taken up in practice and thus maximize intended benefits for service provision and (ultimately) care to patients. Normalization Process Theory (NPT) provides a framework for understanding how a new intervention becomes part of normal practice. This study aims to develop and validate simple generic tools derived from NPT, to be used to improve the implementation of complex healthcare interventions. Objectives: The objectives of this study are to: develop a set of NPT-based measures and formatively evaluate their use for identifying implementation problems and monitoring progress; conduct preliminary evaluation of these measures across a range of interventions and contexts, and identify factors that affect this process; explore the utility of these measures for predicting outcomes; and develop an online users’ manual for the measures. Methods: A combination of qualitative (workshops, item development, user feedback, cognitive interviews) and quantitative (survey) methods will be used to develop NPT measures, and test the utility of the measures in six healthcare intervention settings. Discussion: The measures developed in the study will be available for use by those involved in planning, implementing, and evaluating complex interventions in healthcare and have the potential to enhance the chances of their implementation, leading to sustained changes in working practices.

    Rethinking the patient: using Burden of Treatment Theory to understand the changing dynamics of illness

    Background: In this article we outline Burden of Treatment Theory, a new model of the relationship between sick people, their social networks, and healthcare services. Health services face the challenge of growing populations with long-term and life-limiting conditions; they have responded to this by delegating to sick people and their networks routine work aimed at managing symptoms, and at retarding, and sometimes preventing, disease progression. This is the new proactive work of patienthood for which patients are increasingly accountable: founded on ideas about self-care, self-empowerment, and self-actualization, and on new technologies and treatment modalities which can be shifted from the clinic into the community. These place new demands on sick people, which they may experience as burdens of treatment. Discussion: As the burdens accumulate, some patients are overwhelmed, and the consequences are likely to be poor healthcare outcomes for individual patients, increasing strain on caregivers, and rising demand for and costs of healthcare services. In the face of these challenges we need to better understand the resources that patients draw upon as they respond to the demands of both burdens of illness and burdens of treatment, and the ways that resources interact with healthcare utilization. Summary: Burden of Treatment Theory is oriented to understanding how capacity for action interacts with the work that stems from healthcare. It is a structural model that focuses on the work that patients and their networks do, and it thus helps us understand variations in healthcare utilization and adherence in different healthcare settings and clinical contexts.

    Azumaya Objects in Triangulated Bicategories

    We introduce the notion of an Azumaya object in general homotopy-theoretic settings. We give a self-contained account of Azumaya objects and Brauer groups in bicategorical contexts, generalizing the Brauer group of a commutative ring. We go on to describe triangulated bicategories and prove a characterization theorem for Azumaya objects therein. This theory applies to give a homotopical Brauer group for derived categories of rings and ring spectra. We show that the homotopical Brauer group of an Eilenberg-Mac Lane spectrum is isomorphic to the homotopical Brauer group of its underlying commutative ring. We also discuss tilting theory as an application of invertibility in triangulated bicategories. Comment: 23 pages; final version; to appear in the Journal of Homotopy and Related Structures.
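    For orientation, here is the classical ring-theoretic notion that the bicategorical definition generalizes (standard background material, sketched under the usual conventions rather than taken from the paper): an algebra A over a commutative ring R is Azumaya when A is a faithful, finitely generated projective R-module and the action map below is an isomorphism.

```latex
\[
  A \otimes_R A^{\mathrm{op}} \xrightarrow{\ \sim\ } \operatorname{End}_R(A),
  \qquad a \otimes b \longmapsto \bigl(x \mapsto a\,x\,b\bigr)
\]
% The Brauer group Br(R) is then the set of Morita-equivalence classes of
% Azumaya R-algebras, made into a group by the tensor product over R.
```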

    How Gaussian competition leads to lumpy or uniform species distributions

    A central model in theoretical ecology considers the competition of a range of species for a broad spectrum of resources. Recent studies have shown that essentially two different outcomes are possible. Either the species surviving competition are more or less uniformly distributed over the resource spectrum, or their distribution is 'lumped' (or 'clumped'), consisting of clusters of species with similar resource use that are separated by gaps in resource space. Which of these outcomes will occur crucially depends on the competition kernel, which reflects the shape of the resource utilization pattern of the competing species. Most models considered in the literature assume a Gaussian competition kernel. This is unfortunate, since predictions based on such a Gaussian assumption are not robust. In fact, Gaussian kernels are a borderline case, and slight deviations from this function can lead to either uniform or lumped species distributions. Here we illustrate the non-robustness of the Gaussian assumption by simulating different implementations of the standard competition model with constant carrying capacity. In this scenario, lumped species distributions can come about through secondary ecological or evolutionary mechanisms or through details of the numerical implementation of the model. We analyze the origin of this sensitivity and discuss it in the context of recent applications of the model. Comment: 11 pages, 3 figures, revised version.
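    A minimal sketch of the standard competition model in question, assuming Lotka-Volterra dynamics on a one-dimensional niche axis with a Gaussian kernel and constant carrying capacity; the parameter values and discretization below are illustrative choices, not the paper's setup.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Niche axis: n species evenly spaced on [0, 1].
n = 101
x = np.linspace(0.0, 1.0, n)
sigma = 0.15            # competition width (illustrative)
r, K = 1.0, 1.0         # growth rate and constant carrying capacity

# Gaussian competition kernel: alpha_ij = exp(-(x_i - x_j)^2 / (2 sigma^2)).
d = x[:, None] - x[None, :]
alpha = np.exp(-d**2 / (2.0 * sigma**2))

def lotka_volterra(t, N):
    # dN_i/dt = r * N_i * (1 - sum_j alpha_ij N_j / K)
    return r * N * (1.0 - alpha @ N / K)

N0 = np.full(n, 0.01)   # small initial abundances everywhere
sol = solve_ivp(lotka_volterra, (0.0, 2000.0), N0, method="LSODA")

# Which niche positions persist decides 'uniform' vs 'lumped'.
survivors = x[sol.y[:, -1] > 1e-6]
print(f"{survivors.size} species above threshold at t = 2000")
```

    Consistent with the abstract's point, the surviving pattern under a Gaussian kernel is sensitive to implementation details such as the extinction threshold, boundary handling, and discretization; in the related literature, non-Gaussian kernels of the form exp(-|d/sigma|^p) are commonly reported to give robustly lumped (roughly, p > 2) or uniform (roughly, p < 2) outcomes instead.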

    Two attacks on rank metric code-based schemes: RankSign and an Identity-Based-Encryption scheme

    RankSign [GRSZ14a] is a code-based signature scheme proposed to the NIST competition for quantum-safe cryptography [AGHRZ17] and, moreover, is a fundamental building block of a new Identity-Based-Encryption (IBE) scheme [GHPT17a]. This signature scheme is based on the rank metric and enjoys remarkably small key sizes, about 10 kilobytes for an intended security level of 128 bits. Unfortunately, we show that all the parameters proposed for this scheme in [AGHRZ17] can be broken by an algebraic attack that exploits the fact that the augmented LRPC codes used in the scheme have very low weight codewords. Therefore, without RankSign the IBE cannot be instantiated at this time. As a second contribution, we show that the problem is deeper than finding a new signature scheme in rank-based cryptography: we also found an attack on the generic problem upon which the IBE's security reduction relies. However, contrary to the RankSign scheme, it seems that the parameters of the IBE scheme could be chosen so as to avoid our attack. Finally, we show that if one replaces the rank metric in the [GHPT17a] IBE scheme with the Hamming metric, then a devastating attack can be found.
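    For readers unfamiliar with the rank metric underlying RankSign and LRPC codes, a small self-contained sketch (of the metric itself, not of the attack): a codeword over F_{2^m} is expanded coordinate-wise over a fixed F_2-basis into an m x n binary matrix, and its rank weight is the F_2-rank of that matrix. The "very low weight codewords" the attack exploits are codewords whose columns lie in a small subspace. The matrix below is a made-up example, not taken from the schemes in question.

```python
import numpy as np

def rank_weight(M: np.ndarray) -> int:
    """F_2-rank of the m x n binary expansion of a codeword in F_{2^m}^n.

    Gaussian elimination mod 2; the rank weight is at most the Hamming
    weight, since each nonzero coordinate contributes one column.
    """
    A = (M.copy() % 2).astype(np.uint8)
    m, n = A.shape
    rank = 0
    for col in range(n):
        pivot = next((r for r in range(rank, m) if A[r, col]), None)
        if pivot is None:
            continue                       # no pivot in this column
        A[[rank, pivot]] = A[[pivot, rank]]
        for r in range(m):
            if r != rank and A[r, col]:
                A[r] ^= A[rank]            # eliminate mod 2
        rank += 1
    return rank

# Made-up expansion of a length-4 codeword over F_{2^3}: all 4 coordinates
# are nonzero (Hamming weight 4), but the columns span only a 2-dimensional
# F_2-subspace, so the rank weight is 2.
M = np.array([[1, 0, 1, 1],
              [0, 1, 1, 0],
              [1, 1, 0, 1]])
print(rank_weight(M))  # -> 2
```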

    Arduous implementation: Does the Normalisation Process Model explain why it's so difficult to embed decision support technologies for patients in routine clinical practice?

    Background: Decision support technologies (DSTs, also known as decision aids) help patients and professionals take part in collaborative decision-making processes. Trials have shown favorable impacts on patient knowledge, satisfaction, decisional conflict and confidence. However, they have not become routinely embedded in health care settings. Few studies have approached this issue using a theoretical framework. We explained problems of implementing DSTs using the Normalization Process Model, a conceptual model that focuses attention on how complex interventions become routinely embedded in practice. Methods: The Normalization Process Model was used as the basis of conceptual analysis of the outcomes of previous primary research and reviews. Using a virtual working environment we applied the model and its main concepts to examine: the 'workability' of DSTs in professional-patient interactions; how DSTs affect knowledge relations between their users; how DSTs impact on users' skills and performance; and the impact of DSTs on the allocation of organizational resources. Results: Conceptual analysis using the Normalization Process Model provided insight into implementation problems for DSTs in routine settings. Current research focuses mainly on the interactional workability of these technologies, but factors related to divisions of labor in health care, and the organizational contexts in which DSTs are used, are poorly described and understood. Conclusion: The model successfully provided a framework for identifying factors that promote and inhibit the implementation of DSTs in healthcare, and it gave us insights into factors influencing the introduction of new technologies into contexts where negotiations are characterized by asymmetries of power and knowledge. Future research and development on the deployment of DSTs needs to take a more holistic approach and give emphasis to the structural conditions and social norms in which these technologies are enacted.

    Edge-Based Compartmental Modeling for Infectious Disease Spread Part III: Disease and Population Structure

    We consider the edge-based compartmental models for infectious disease spread introduced in Part I. These models allow us to consider standard SIR diseases spreading in random populations. In this paper we show how to handle deviations of the disease or population from the simplistic assumptions of Part I. We allow the population to have structure due to effects such as demographic detail or multiple types of risk behavior, and the disease to have a more complicated natural history. We introduce these modifications in the static network context, though it is straightforward to incorporate them into dynamic networks. We also consider serosorting, which requires using the dynamic network models. The basic methods we use to derive these generalizations are widely applicable, and so it is straightforward to introduce many other generalizations not considered here.
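    For reference, the Part I baseline that this paper extends reduces SIR dynamics on a static configuration-model network to a single ODE for theta(t), the probability that a random partner has not yet transmitted to a randomly chosen test node: theta' = -beta*theta + beta*psi'(theta)/psi'(1) + gamma*(1 - theta), with S = psi(theta) for the degree-distribution PGF psi. A minimal sketch, assuming a Poisson degree distribution and illustrative parameter values:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Edge-based compartmental SIR on a static configuration-model network.
# beta, gamma, kmean are assumed example values, not taken from the paper.
beta, gamma, kmean = 0.6, 1.0, 5.0

def psi(x):   # PGF of the Poisson(kmean) degree distribution
    return np.exp(kmean * (x - 1.0))

def dpsi(x):  # derivative of the PGF
    return kmean * np.exp(kmean * (x - 1.0))

def rhs(t, y):
    theta, R = y
    dtheta = -beta * theta + beta * dpsi(theta) / dpsi(1.0) + gamma * (1.0 - theta)
    S = psi(theta)          # probability a random test node is still susceptible
    I = 1.0 - S - R
    return [dtheta, gamma * I]   # dR/dt = gamma * I

sol = solve_ivp(rhs, (0.0, 40.0), [1.0 - 1e-4, 0.0], rtol=1e-8)
print(f"final epidemic size R ~ {sol.y[1, -1]:.3f}")
```

    Roughly speaking, the generalizations described in this Part III enrich this baseline by replacing the single theta and the single PGF with structured analogues (multiple risk types, additional disease stages).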

    Process evaluation for complex interventions in primary care: understanding trials using the normalization process model

    Background: The Normalization Process Model is a conceptual tool intended to assist in understanding the factors that affect implementation processes in clinical trials and other evaluations of complex interventions. It focuses on the ways that the implementation of complex interventions is shaped by problems of workability and integration. Method: In this paper the model is applied to two different complex trials: (i) the delivery of problem-solving therapies for psychosocial distress, and (ii) the delivery of nurse-led clinics for heart failure treatment in primary care. Results: Application of the model shows how process evaluations need to focus on more than the immediate contexts in which trial outcomes are generated. Problems relating to intervention workability and integration also need to be understood. The model may be used effectively to explain the implementation process in trials of complex interventions. Conclusion: The model invites evaluators to attend equally to how a complex intervention interacts with existing patterns of service organization, professional practice, and professional-patient interaction. The justification for this may be found in the abundance of reports of clinical effectiveness for interventions that have little hope of being implemented in real healthcare settings.

    A Case Study: Women in Highway Patrol Group in Ilocos Norte

    This case study examined women police officers of the Philippine National Police (PNP) under the Highway Patrol Group of Ilocos Norte. It aimed to determine the challenges encountered by female police officers, including the effects of these challenges and the coping mechanisms used to overcome those associated with their role. This qualitative study utilized a descriptive case design in which personal and online interviews were conducted with three (3) participants working at the Philippine Highway Patrol Team of Laoag City, Abra, and Ilocos Sur. Purposive sampling was used in choosing the participants of the study. Findings showed that the participants encountered challenges in the Highway Patrol Group, as revealed in their answers to questions that revolved around three (3) areas: (a) challenges encountered by women in the PNP Highway Patrol Group, (b) effects of the challenges faced on their job performance, and (c) coping mechanisms used to overcome the challenges encountered. Nine (9) themes emerged from these questions: “shortlisted recruitment opportunity,” “financial inadequacy,” “doubt in competence,” “feeling of discouragement,” “burnout,” “destruction of family connection,” “self-growth,” “coping through faith,” and “mind over body.” On the basis of the data gathered and analyzed, this study revealed that the challenges experienced by women in the Highway Patrol Group include not only organizational challenges but also personal problems, which both positively and negatively affect their lives and job performance. Moreover, after experiencing these challenges, the participants resorted to practices of spirituality through faith, praying as a way to deal with feelings of anxiety, stress, and exhaustion. They also focused on the mind rather than the body to cope with and manage challenges, helping them to do more than they thought themselves capable of.