
    Harbinger I: The Development and Evaluation of the First PACT Replication

    While Assertive Community Treatment (originally known as the PACT program) is now recognized around the world as an effective model for the rehabilitation of persons with severe mental illness, this was not the case 20 years ago. Harbinger of Grand Rapids, in Kent County, Michigan, was the first replication of the PACT model that sought fidelity and included an experimental design for assessing effectiveness. The design and results are presented from an initial 30-month and a follow-up 66-month evaluation of Harbinger. The 30-month evaluation showed significant differences favoring Harbinger over the control group on independent living, employment, and client functioning. At 66 months, there were fewer experimental-control group differences. The differences in results are analyzed in terms of design and data collection problems, changes in the treatment environment for the control group, and the longitudinal course of mental illness. The discussion focuses on next steps in ACT research, utilizing program theory to better establish the mechanisms for successful intervention models.
    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/44093/1/10488_2004_Article_414561.pd

    The effectiveness of generic emails versus a remote knowledge broker to integrate mood management into a smoking cessation program in team-based primary care: A cluster randomized trial

    Background: Knowledge brokering is a knowledge translation approach that has been gaining popularity in Canada, although its effectiveness is unknown. This study evaluated the effectiveness of generalised, exclusively email-based prompts versus a personalised remote knowledge broker for delivering evidence-based mood management interventions within an existing smoking cessation programme in primary care settings. Methods: The study design is a cluster randomised controlled trial of 123 Ontario Family Health Teams participating in the Smoking Treatment for Ontario Patients programme. Teams were randomly allocated 1:1, with healthcare providers receiving either a remote knowledge broker offering tailored support via phone and email (group A) or a generalised monthly email focused on tobacco and depression treatment (group B), to encourage the implementation of an evidence-based mood management intervention for smokers presenting with depressive symptoms. The primary outcome was participants’ acceptance of a self-help mood management resource. The secondary outcome was smoking abstinence at 6-month follow-up, measured by self-report of smoking abstinence for at least the previous 7 days. The tertiary outcome was the costs of delivering each intervention arm, which, together with the effectiveness outcomes, were used to undertake a cost minimisation analysis.
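    The abstract does not spell out the cost-minimisation step, so the sketch below illustrates the general idea under stated assumptions: effectiveness is taken as equivalent between arms, and the cost and participant figures are hypothetical placeholders, not the trial's data.

```python
# Minimal sketch (hypothetical figures): a cost-minimisation comparison reports the
# per-unit cost of each arm and identifies the cheaper strategy, on the assumption
# that effectiveness did not differ between arms.
arm_costs = {
    "A: remote knowledge broker": {"total_cost": 48_000.0, "teams": 60},
    "B: generalised monthly email": {"total_cost": 9_500.0, "teams": 63},
}

per_team = {name: c["total_cost"] / c["teams"] for name, c in arm_costs.items()}
for name, cost in per_team.items():
    print(f"{name}: {cost:,.2f} per team")

cheapest = min(per_team, key=per_team.get)
print("Lower-cost strategy (given equivalent effectiveness):", cheapest)
```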

    Modelling competencies for computing education beyond 2020: a research-based approach to defining competencies in the computing disciplines.

    How might the content and outcomes of tertiary education programmes be described and analysed in order to understand how they are structured and function? To address this question we develop a framework for modelling graduate competencies linked to tertiary degree programmes in the computing disciplines. While the focus of our work is computing, the framework is applicable to education more broadly. The work presented here draws upon the pioneering curricular document for information technology (IT2017), curricular competency frameworks, other related documents such as the software engineering competency model (SWECOM), the Skills Framework for the Information Age (SFIA), current research in competency models, and elicitation workshop results from recent computing conferences. The aim is to inform the ongoing Computing Curricula (CC2020) project, an endeavour supported by the Association for Computing Machinery (ACM) and the IEEE Computer Society. We develop the Competency Learning Framework (CoLeaF), providing an internationally relevant tool for describing competencies. We argue that this competency-based approach is well suited for constructing learning environments and assists degree programme architects in dealing with the challenge of developing, describing and including competencies relevant to computer and IT professionals. In this paper we demonstrate how the CoLeaF competency framework can be applied in practice and, through a series of case studies, demonstrate its effectiveness and analytical power as a tool for describing and comparing degree programmes in the international higher education landscape.
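    The abstract describes comparing degree programmes by the competencies they cover but gives no concrete mechanics, so the following sketch shows one plausible, highly simplified reading. The competency labels and programme contents are invented for illustration and are not taken from CoLeaF, CC2020, or any of the cited frameworks.

```python
# Minimal sketch (hypothetical competencies): comparing two degree programmes by the
# competencies each claims to develop, the kind of analysis a competency framework
# is meant to support. The labels below are illustrative placeholders only.
programme_a = {"requirements elicitation", "secure coding", "data modelling", "teamwork"}
programme_b = {"secure coding", "machine learning", "teamwork", "project management"}

shared = programme_a & programme_b   # competencies both programmes address
only_a = programme_a - programme_b   # covered by A but missing from B
only_b = programme_b - programme_a   # covered by B but missing from A

print("shared competencies:", sorted(shared))
print("only in programme A:", sorted(only_a))
print("only in programme B:", sorted(only_b))
```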

    The rate of telomere loss is related to maximum lifespan in birds

    Telomeres are highly conserved regions of DNA that protect the ends of linear chromosomes. The loss of telomeres can signal an irreversible change to a cell's state, including cellular senescence. Senescent cells no longer divide and can damage nearby healthy cells, thus potentially placing them at the crossroads of cancer and ageing. While the epidemiology, cellular and molecular biology of telomeres are well studied, a newer field exploring telomere biology in the context of ecology and evolution is just emerging. Work to date has focused on how telomere shortening relates to individual mortality, so less is known about how telomeres relate to ageing rates across species. Here, we investigated telomere length in cross-sectional samples from 19 bird species to determine how rates of telomere loss relate to interspecific variation in maximum lifespan. We found that bird species with longer lifespans lose fewer telomeric repeats each year compared with species with shorter lifespans. In addition, phylogenetic analysis revealed that the rate of telomere loss is evolutionarily conserved within bird families. This suggests that the physiological causes of telomere shortening, or the ability to maintain telomeres, are features that may be responsible for, or co-evolved with, different lifespans observed across species. This article is part of the theme issue 'Understanding diversity in telomere dynamics'.
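    As a rough illustration of the cross-species comparison described above, the sketch below estimates a per-species telomere loss rate from cross-sectional samples and regresses it against maximum lifespan. All numbers are simulated placeholders, and the phylogenetic correction the authors mention is omitted.

```python
# Minimal sketch (simulated data): per-species telomere loss rate vs. maximum lifespan.
# A real analysis would use measured telomere lengths and account for phylogeny
# (e.g. phylogenetic generalised least squares), which is not done here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

max_lifespan = np.array([8.0, 12.0, 20.0, 35.0, 60.0])  # years; placeholder species
loss_rates = []
for lifespan in max_lifespan:
    ages = rng.uniform(0, lifespan, 30)                  # ages of sampled individuals
    true_rate = -400.0 / lifespan                        # assumed bp lost per year
    telomere_bp = 9000 + true_rate * ages + rng.normal(0, 150, ages.size)
    slope, *_ = stats.linregress(ages, telomere_bp)      # estimated loss rate (bp/year)
    loss_rates.append(slope)

# Do longer-lived species lose fewer repeats per year?
res = stats.linregress(np.log(max_lifespan), loss_rates)
print(f"slope = {res.slope:.1f} bp/yr per unit log-lifespan, r = {res.rvalue:.2f}")
```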

    Constraining sub-seismic deep-water stratal elements with electrofacies analysis; A case study from the Upper Cretaceous of the Måløy Slope, offshore Norway

    Electrofacies represent rock facies identified from wireline-log measurements, and allow extrapolation of petrophysical characteristics away from stratigraphic intervals that are calibrated to core. This approach has been employed to reduce uncertainty in the identification of sub-seismic depositional elements in the late Cenomanian-Coniacian succession of the northern Måløy Slope, offshore Norway. Core logging permits identification of eleven distinct sedimentary facies that are grouped into four facies associations: FA A (turbidite sandstones), FA B (heterolithic siltstones and sandstones), FA C (debrites) and FA D (slide and slump deposits). Each facies association is defined by a distinct combination of petrophysical characteristics, including porosity, density, gamma-ray, sonic and resistivity. Using neural network analysis, electrofacies are calibrated with sedimentary facies, thereby allowing us to map their thickness and stacking patterns within the studied deep-water succession. We demonstrate that this approach is particularly useful where the presence of glauconite makes it difficult to distinguish sandstone-rich from shale-rich units using gamma-ray logs alone. Our results indicate that the succession of interest is dominated by debris flows and slide and slump deposits, which are commonly poorly imaged on seismic reflection datasets in the northern North Sea. The methodology presented here can aid the correlation of deep-water stratal elements at production and exploration scales in stratigraphic successions that have undergone similar burial histories. Furthermore, this method may help in the identification of mass flow deposits that are present in Upper Cretaceous deep-water systems of the North Sea.
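    The calibration step described above, training a classifier that maps wireline-log measurements onto core-derived facies associations, can be sketched as follows. The data, feature choices, and network settings are hypothetical placeholders, not the authors' actual model or dataset.

```python
# Minimal sketch (synthetic data): a neural-network electrofacies classifier that
# learns facies-association labels (FA A-D) from wireline-log measurements.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 400

# Placeholder log measurements per sample: gamma-ray, bulk density, sonic,
# deep resistivity and porosity.
X = np.column_stack([
    rng.normal(75, 25, n),        # gamma-ray (API)
    rng.normal(2.4, 0.15, n),     # bulk density (g/cm3)
    rng.normal(90, 15, n),        # sonic (us/ft)
    rng.lognormal(0.5, 0.6, n),   # deep resistivity (ohm.m)
    rng.normal(0.18, 0.06, n),    # porosity (fraction)
])
# Placeholder facies-association labels from cored intervals.
y = rng.choice(["FA A", "FA B", "FA C", "FA D"], size=n)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Small feed-forward network; the study's actual architecture is not specified here.
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)

# Uncored intervals would be classified the same way from their log responses.
print("held-out accuracy:", round(model.score(X_test, y_test), 2))
```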

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain measured on a numerical analogue scale from 0 to 100% and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10 (i.q.r. 1-30)% of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468) compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.

    Diagnosis and management of Cornelia de Lange syndrome: first international consensus statement

    Cornelia de Lange syndrome (CdLS) is an archetypical genetic syndrome that is characterized by intellectual disability, well-defined facial features, upper limb anomalies and atypical growth, among numerous other signs and symptoms. It is caused by variants in any one of seven genes, all of which have a structural or regulatory function in the cohesin complex. Although recent advances in next-generation sequencing have improved molecular diagnostics, marked heterogeneity exists in clinical and molecular diagnostic approaches and care practices worldwide. Here, we outline a series of recommendations that document the consensus of a group of international experts on clinical diagnostic criteria, both for classic CdLS and non-classic CdLS phenotypes, molecular investigations, long-term management and care planning.

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
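    The trial's primary analysis is a Bayesian model summarised by a median odds ratio, a 95% credible interval, and a posterior probability of harm. The sketch below shows how those summaries are read off posterior draws; the draws are simulated to roughly resemble the reported ACE inhibitor estimate and are not the trial's actual posterior.

```python
# Minimal sketch (simulated draws): summarising a treatment-effect posterior as a
# median OR, a 95% credible interval, and the posterior probability that OR < 1
# (i.e. that the treatment worsens organ support-free days).
import numpy as np

rng = np.random.default_rng(2)

# Placeholder posterior draws of the log odds ratio; a real analysis would take
# these from the fitted bayesian cumulative logistic model.
log_or_draws = rng.normal(loc=np.log(0.77), scale=0.16, size=20_000)
or_draws = np.exp(log_or_draws)

median_or = np.median(or_draws)
ci_low, ci_high = np.percentile(or_draws, [2.5, 97.5])
p_harm = (or_draws < 1.0).mean()

print(f"OR {median_or:.2f} (95% CrI {ci_low:.2f}-{ci_high:.2f}); P(harm) = {p_harm:.1%}")
```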

    Lactoferrin in immune function, cancer and disease resistance
