Chlorpromazine for schizophrenia: a Cochrane systematic review of 50 years of randomised controlled trials
BACKGROUND:
Chlorpromazine (CPZ) remains one of the most common drugs used for people with schizophrenia worldwide, and a benchmark against which other treatments can be evaluated. Quantitative reviews are rare; this one evaluates the effects of chlorpromazine in the treatment of schizophrenia in comparison with placebo.
METHODS:
We sought all relevant randomised controlled trials (RCT) comparing chlorpromazine to placebo by electronic and reference searching, and by contacting trial authors and the pharmaceutical industry. Data were extracted from selected trials and, where possible, synthesised; random-effects relative risk (RR), the number needed to treat (NNT) and their 95% confidence intervals (CI) were calculated.
RESULTS:
Fifty RCTs from 1955–2000 were included, with 5276 people randomised to CPZ or placebo; together they constitute 2008 person-years spent in trials. Meta-analysis of these trials showed that chlorpromazine promotes a global improvement (n = 1121, 13 RCTs, RR 0.76 CI 0.7 to 0.9, NNT 7 CI 5 to 10), although a considerable placebo response is also seen. People allocated to chlorpromazine tended not to leave trials early in both the short term (n = 945, 16 RCTs, RR 0.74 CI 0.5 to 1.1) and the medium term (n = 1861, 25 RCTs, RR 0.79 CI 0.6 to 1.1). There were, however, many adverse effects. Chlorpromazine is sedating (n = 1242, 18 RCTs, RR 2.3 CI 1.7 to 3.1, number needed to harm (NNH) 6 CI 5 to 8), increases a person's chances of experiencing acute movement disorders and Parkinsonism, and causes low blood pressure with dizziness and dry mouth.
CONCLUSION:
It is understandable why the World Health Organization (WHO) has endorsed and included chlorpromazine in its list of essential drugs for use in schizophrenia. Low- and middle-income countries may have more complete evidence upon which to base their practice compared with richer nations using recent innovations.
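As a rough illustration of the arithmetic behind figures such as "RR 0.76, NNT 7", the sketch below computes a relative risk, its 95% confidence interval, and the number needed to treat from a single trial's 2×2 counts. The event counts are invented for the example, and the sketch does not reproduce the review's random-effects pooling across trials.

```python
import math

def rr_and_nnt(events_tx, n_tx, events_ctrl, n_ctrl):
    """Relative risk, 95% CI and NNT from one trial's 2x2 table.

    Here an 'event' is the unfavourable outcome (e.g. no global
    improvement), so RR < 1 favours the treatment.
    """
    risk_tx = events_tx / n_tx
    risk_ctrl = events_ctrl / n_ctrl
    rr = risk_tx / risk_ctrl

    # Standard error of log(RR) and a Wald-type 95% CI.
    se_log_rr = math.sqrt(1 / events_tx - 1 / n_tx + 1 / events_ctrl - 1 / n_ctrl)
    ci_low = math.exp(math.log(rr) - 1.96 * se_log_rr)
    ci_high = math.exp(math.log(rr) + 1.96 * se_log_rr)

    # NNT is the reciprocal of the absolute risk reduction.
    arr = risk_ctrl - risk_tx
    nnt = math.inf if arr == 0 else 1 / abs(arr)
    return rr, (ci_low, ci_high), nnt

# Hypothetical counts: 60/100 'not improved' on treatment vs 80/100 on placebo.
print(rr_and_nnt(60, 100, 80, 100))   # RR 0.75, CI ~(0.62, 0.90), NNT 5
```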
Implementation and evaluation of a nurse-centered computerized potassium regulation protocol in the intensive care unit - a before and after analysis
BACKGROUND: Potassium disorders can cause major complications and must be avoided in critically ill patients. Regulation of potassium in the intensive care unit (ICU) requires potassium administration with frequent blood potassium measurements and subsequent adjustments of the amount of potassium administered. The use of a potassium replacement protocol can improve potassium regulation. For safety and efficiency, computerized protocols appear to be superior to paper protocols. The aim of this study was to evaluate whether a computerized potassium regulation protocol in the ICU improved potassium regulation.
METHODS: In our surgical ICU (12 beds) and cardiothoracic ICU (14 beds) at a tertiary academic center, we implemented a nurse-centered computerized potassium protocol integrated with the pre-existing glucose control program GRIP (Glucose Regulation in Intensive Care patients). Before implementation of the computerized protocol, potassium replacement was physician-driven. Potassium was delivered continuously either by central venous catheter or by gastric, duodenal or jejunal tube. After every potassium measurement, nurses received a recommendation for the potassium administration rate and the time to the next measurement. In this before-after study we evaluated potassium regulation with GRIP. The attitude of the nursing staff towards potassium regulation with computer support was measured with questionnaires.
RESULTS: The patient cohort consisted of 775 patients before and 1435 patients after the implementation of computerized potassium control. The numbers of patients with hypokalemia (<3.5 mmol/L) and hyperkalemia (>5.0 mmol/L) were recorded, as well as the time course of potassium levels after ICU admission, and the incidence of hypokalemia and hyperkalemia was calculated. Median potassium levels were similar in both study periods, but the level of potassium control improved: the incidence of hypokalemia decreased from 2.4% to 1.7% (P < 0.001) and that of hyperkalemia from 7.4% to 4.8% (P < 0.001). Nurses indicated that they considered computerized potassium control an improvement over previous practice.
CONCLUSIONS: Computerized potassium control, integrated with the nurse-centered GRIP program for glucose regulation, is effective and reduces the prevalence of hypo- and hyperkalemia in the ICU compared with physician-driven potassium regulation.
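The abstract does not give the GRIP potassium algorithm itself, but the workflow it describes (measure, then recommend an administration rate and a time to the next measurement) can be sketched as follows. The thresholds, rates, and intervals below are hypothetical placeholders for illustration only, not the rules used in the study.

```python
from dataclasses import dataclass

@dataclass
class Advice:
    rate_mmol_per_h: float      # recommended potassium administration rate
    next_measurement_h: float   # recommended time to the next blood sample

def potassium_advice(k_mmol_per_l: float) -> Advice:
    """Toy nurse-facing recommendation mimicking the described workflow.

    All numbers are hypothetical and illustrative, not the GRIP rules.
    """
    if k_mmol_per_l < 3.5:        # hypokalemia as defined in the study
        return Advice(rate_mmol_per_h=10.0, next_measurement_h=2.0)
    if k_mmol_per_l > 5.0:        # hyperkalemia as defined in the study
        return Advice(rate_mmol_per_h=0.0, next_measurement_h=2.0)
    # within the normal range: maintain a modest rate, re-check later
    return Advice(rate_mmol_per_h=4.0, next_measurement_h=6.0)

print(potassium_advice(3.2))  # Advice(rate_mmol_per_h=10.0, next_measurement_h=2.0)
```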
Dynamic clamp with StdpC software
Dynamic clamp is a powerful method that allows the introduction of artificial electrical components into target cells to simulate ionic conductances and synaptic inputs. This method is based on a fast cycle of measuring the membrane potential of a cell, calculating the current of a desired simulated component using an appropriate model and injecting this current into the cell. Here we present a dynamic clamp protocol using free, fully integrated, open-source software (StdpC, for spike timing-dependent plasticity clamp). Use of this protocol does not require specialist hardware, costly commercial software, experience in real-time operating systems or a strong programming background. The software enables the configuration and operation of a wide range of complex and fully automated dynamic clamp experiments through an intuitive and powerful interface with a minimal initial lead time of a few hours. After initial configuration, experimental results can be generated within minutes of establishing a cell recording.
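The core dynamic clamp cycle described above (measure the membrane potential, compute a model current, inject it) can be illustrated with a minimal sketch. This is not StdpC itself, which runs this loop at high rates against real amplifier hardware; the read/write functions and the single fixed conductance below are stand-ins.

```python
# Minimal illustration of one dynamic clamp cycle for an artificial conductance.
# read_membrane_potential() and inject_current() are hypothetical stand-ins for
# the data-acquisition calls that a real dynamic clamp performs against hardware.

G_ARTIFICIAL = 2.0e-9    # simulated conductance (S), e.g. an artificial synapse
E_REV = -70.0e-3         # reversal potential of the simulated component (V)

def dynamic_clamp_step(read_membrane_potential, inject_current):
    v_m = read_membrane_potential()          # 1. measure membrane potential
    i_inj = G_ARTIFICIAL * (E_REV - v_m)     # 2. compute current from the model
    inject_current(i_inj)                    # 3. inject it back into the cell
    return i_inj

# Stand-in demo: a cell sitting at -55 mV receives -30 pA from this conductance.
print(dynamic_clamp_step(lambda: -55.0e-3, lambda i: None))  # -3e-11 A

# In a real experiment this step repeats every few tens of microseconds, and
# StdpC additionally updates conductances for plasticity rules such as STDP.
```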
Trial design: Computer guided normal-low versus normal-high potassium control in critically ill patients: Rationale of the GRIP-COMPASS study
Background: Potassium depletion is common in hospitalized patients and can cause serious complications such as cardiac arrhythmias. In the intensive care unit (ICU) the majority of patients require potassium suppletion. However, there are no data regarding the optimal control target in critically ill patients. After open-heart surgery, patients have a strongly increased risk of atrial fibrillation or atrial flutter (AFF). In a novel trial design, we examined whether, in these patients, different potassium control targets within the normal range may have different effects on the incidence of AFF. Methods/Design: The "computer-driven Glucose and potassium Regulation program in Intensive care Patients with COMparison of PotASSium targets within normokalemic range (GRIP-COMPASS)" trial is a single-center prospective trial in which a total of 1200 patients are assigned to either a potassium control target of 4.0 mmol/L or 4.5 mmol/L in consecutive alternating blocks of 50 patients each. Potassium levels are regulated by the computer-assisted potassium suppletion algorithm called GRIP-II (Glucose and potassium regulation for Intensive care Patients). The primary endpoint is the in-hospital incidence of AFF after cardiac surgery. Secondary endpoints are: in-hospital AFF in medical patients or patients after non-cardiac surgery, the potassium levels actually achieved and their variation, electrolyte and glucose levels, potassium and insulin requirements, cumulative fluid balance, (ICU) length of stay, ICU mortality, hospital mortality and 90-day mortality. Discussion: The GRIP-COMPASS trial is the first controlled clinical trial to date that compares potassium targets. Other novel methodological elements of the study are that it is performed in ICU patients, that both targets are within the normal range, and that a computer-assisted potassium suppletion algorithm is used.
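The allocation scheme described above (consecutive alternating blocks of 50 patients per target) is simple enough to state in a few lines. The sketch below illustrates that rule only; which target the first block received is an assumption, and this is not the trial's actual software.

```python
def grip_compass_target(patient_index: int, block_size: int = 50) -> float:
    """Potassium target (mmol/L) under alternating blocks of 50 consecutive patients.

    patient_index is 0-based enrolment order; assigning 4.0 mmol/L to the
    first block is assumed here, not stated in the abstract.
    """
    block = patient_index // block_size
    return 4.0 if block % 2 == 0 else 4.5

# First 50 patients -> 4.0 mmol/L, next 50 -> 4.5 mmol/L, and so on,
# until 1200 patients have been included.
print([grip_compass_target(i) for i in (0, 49, 50, 99, 100)])  # [4.0, 4.0, 4.5, 4.5, 4.0]
```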
Consensus guidelines on analgesia and sedation in dying intensive care unit patients
BACKGROUND: Intensivists must provide enough analgesia and sedation to ensure dying patients receive good palliative care. However, if it is perceived that too much is given, they risk prosecution for committing euthanasia. The goal of this study is to develop consensus guidelines on analgesia and sedation in dying intensive care unit patients that help distinguish palliative care from euthanasia. METHODS: Using the Delphi technique, panelists rated levels of agreement with statements describing how analgesics and sedatives should be given to dying ICU patients and how palliative care should be distinguished from euthanasia. Participants were drawn from 3 panels: 1) Canadian Academic Adult Intensive Care Fellowship program directors and Intensive Care division chiefs (N = 9); 2) Deputy chief provincial coroners (N = 5); 3) a validation panel of intensivists attending the Canadian Critical Care Trials Group meeting (N = 12). RESULTS: After three Delphi rounds, consensus was achieved on 16 statements encompassing the role of palliative care in the intensive care unit, the management of pain and suffering, current areas of controversy, and ways of improving palliative care in the ICU. CONCLUSION: Consensus guidelines were developed to guide the administration of analgesics and sedatives to dying ICU patients and to help distinguish palliative care from euthanasia.
Long-term reductions in tinnitus severity
BACKGROUND: This study was undertaken to assess long-term changes in tinnitus severity exhibited by patients who completed a comprehensive tinnitus management program; to identify factors that contributed to changes in tinnitus severity within this population; and to contribute to the development and refinement of effective assessment and management procedures for tinnitus. METHODS: Detailed questionnaires were mailed to 300 consecutive patients prior to their initial appointment at the Oregon Health & Science University Tinnitus Clinic. All patients were then evaluated and treated within a comprehensive tinnitus management program. Follow-up questionnaires were mailed to the same 300 patients 6 to 36 months after their initial tinnitus clinic appointment. RESULTS: One hundred ninety patients (133 males, 57 females; mean age 57 years) returned follow-up questionnaires 6 to 36 months (mean = 22 months) after their initial tinnitus clinic appointment. This group of patients exhibited significant long-term reductions in self-rated tinnitus loudness, Tinnitus Severity Index scores, tinnitus-related anxiety and prevalence of current depression. Patients who improved their sleep patterns or Beck Depression Inventory scores exhibited greater reductions of tinnitus severity scores than patients who continued to experience insomnia and depression at follow-up. CONCLUSIONS: Individualized tinnitus management programs that were designed for each patient contributed to overall reductions in tinnitus severity exhibited on follow-up questionnaires. Identification and treatment of patients experiencing anxiety, insomnia or depression are vital components of an effective tinnitus management program. Utilization of acoustic therapy also contributed to improvements exhibited by these patients.
Global forecasting of thermal health hazards: the skill of probabilistic predictions of the Universal Thermal Climate Index (UTCI)
Although over a hundred thermal indices can be used for assessing thermal health hazards, many ignore the human heat budget, physiology and clothing. The Universal Thermal Climate Index (UTCI) addresses these shortcomings by using an advanced thermo-physiological model. This paper assesses the potential of using the UTCI for forecasting thermal health hazards. Traditionally, such hazard forecasting has had two further limitations: it has been narrowly focused on a particular region or nation and has relied on the use of single 'deterministic' forecasts. Here, the UTCI is computed on a global scale, which is essential for international health-hazard warnings and disaster preparedness, and it is provided as a probabilistic forecast. It is shown that probabilistic UTCI forecasts are superior in skill to deterministic forecasts and that, despite global variations, the UTCI forecast is skilful for lead times up to 10 days. The paper also demonstrates the utility of probabilistic UTCI forecasts using the example of the 2010 heat wave in Russia.
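One common way to quantify the skill advantage of probabilistic over deterministic forecasts is a Brier score for a hazard event such as the UTCI exceeding a heat-stress threshold. The sketch below illustrates that comparison; the threshold, the toy ensemble values, and the choice of the Brier score are assumptions for illustration, since the abstract does not specify the paper's verification metric.

```python
import numpy as np

def brier_score(forecast_prob, observed_event):
    """Mean squared difference between forecast probabilities and outcomes (0/1)."""
    forecast_prob = np.asarray(forecast_prob, dtype=float)
    observed_event = np.asarray(observed_event, dtype=float)
    return np.mean((forecast_prob - observed_event) ** 2)

THRESHOLD = 32.0  # hypothetical UTCI value (deg C) marking strong heat stress

# Hypothetical 5-member ensemble forecasts of UTCI for four days, plus observations.
ensemble = np.array([[30, 33, 34, 31, 29],
                     [25, 26, 27, 24, 26],
                     [33, 34, 36, 35, 33],
                     [29, 30, 31, 28, 33]])
observed_utci = np.array([34.0, 25.0, 35.5, 30.0])

observed_exceed = observed_utci > THRESHOLD
prob_forecast = (ensemble > THRESHOLD).mean(axis=1)                # probabilistic forecast
det_forecast = (ensemble.mean(axis=1) > THRESHOLD).astype(float)   # deterministic (ensemble mean)

print(brier_score(prob_forecast, observed_exceed))  # 0.10: lower (better) ...
print(brier_score(det_forecast, observed_exceed))   # 0.25: ... than the deterministic score here
```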
The Transient Receptor Potential Ion Channel TRPV6 Is Expressed at Low Levels in Osteoblasts and Has Little Role in Osteoblast Calcium Uptake
Background: TRPV6 ion channels are key mediators of regulated transepithelial absorption of Ca²⁺ within the small intestine. Trpv6⁻/⁻ mice were reported to have lower bone density than wild-type littermates and significant disturbances in calcium homeostasis that suggested a role for TRPV6 in osteoblasts during bone formation and mineralization. TRPV6 and molecules related to transepithelial Ca²⁺ transport have been reported to be expressed at high levels in human and mouse osteoblasts.
Results: Transmembrane ion currents in whole-cell patch-clamped SaOS-2 osteoblasts did not show sensitivity to ruthenium red, an inhibitor of TRPV5/6 ion channels, and ⁴⁵Ca uptake was not significantly affected by ruthenium red in either SaOS-2 (P = 0.77) or TE-85 (P = 0.69) osteoblastic cells. In contrast, ion currents and ⁴⁵Ca uptake were both significantly affected in a human bronchial epithelial cell line known to express TRPV6. TRPV6 was expressed at lower levels in osteoblastic cells than has been reported in some literature. In SaOS-2 cells TRPV6 mRNA was below the assay detection limit; in TE-85 cells TRPV6 mRNA was detected at 6.90 ± 1.9 × 10⁻⁵ relative to B2M. In contrast, TRPV6 was detected at 7.7 ± 3.0 × 10⁻² and 2.38 ± 0.28 × 10⁻⁴ times the level of B2M in the human carcinoma-derived cell lines LNCaP and CaCO-2, respectively. In murine primary calvarial osteoblasts TRPV6 was detected at 3.80 ± 0.24 × 10⁻⁵ relative to GAPDH, in contrast with 4.3 ± 1.5 × 10⁻² relative to GAPDH in murine duodenum. By immunohistochemistry, TRPV6 was expressed mainly in myelocytic cells of the murine bone marrow and was observed only at low levels in murine osteoblasts, osteocytes or growth plate cartilage.
Conclusions: TRPV6 is expressed only at low levels in osteoblasts and plays little functional role in osteoblastic calcium uptake.
Molecular and cellular mechanisms underlying the evolution of form and function in the amniote jaw.
The amniote jaw complex is a remarkable amalgamation of derivatives from distinct embryonic cell lineages. During development, the cells in these lineages experience concerted movements, migrations, and signaling interactions that take them from their initial origins to their final destinations and imbue their derivatives with aspects of form including their axial orientation, anatomical identity, size, and shape. Perturbations along the way can produce defects and disease, but also generate the variation necessary for jaw evolution and adaptation. We focus on molecular and cellular mechanisms that regulate form in the amniote jaw complex, and that enable structural and functional integration. Special emphasis is placed on the role of cranial neural crest mesenchyme (NCM) during the species-specific patterning of bone, cartilage, tendon, muscle, and other jaw tissues. We also address the effects of biomechanical forces during jaw development and discuss ways in which certain molecular and cellular responses add adaptive and evolutionary plasticity to jaw morphology. Overall, we highlight how variation in molecular and cellular programs can promote the phenomenal diversity and functional morphology achieved during amniote jaw evolution or lead to the range of jaw defects and disease that affect the human condition.
Efficacy of the mRNA-1273 SARS-CoV-2 vaccine at completion of blinded phase
BACKGROUND At interim analysis in a phase 3, observer-blinded, placebo-controlled clinical trial, the mRNA-1273 vaccine showed 94.1% efficacy in preventing coronavirus disease 2019 (Covid-19). After emergency use of the vaccine was authorized, the protocol was amended to include an open-label phase. Final analyses of efficacy and safety data from the blinded phase of the trial are reported.
METHODS We enrolled volunteers who were at high risk for Covid-19 or its complications; participants were randomly assigned in a 1:1 ratio to receive two intramuscular injections of mRNA-1273 (100 μg) or placebo, 28 days apart, at 99 centers across the United States. The primary end point was prevention of Covid-19 illness with onset at least 14 days after the second injection in participants who had not previously been infected with the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). The data cutoff date was March 26, 2021.
RESULTS The trial enrolled 30,415 participants; 15,209 were assigned to receive the mRNA-1273 vaccine, and 15,206 to receive placebo. More than 96% of participants received both injections, 2.3% had evidence of SARS-CoV-2 infection at baseline, and the median follow-up was 5.3 months in the blinded phase. Vaccine efficacy in preventing Covid-19 illness was 93.2% (95% confidence interval [CI], 91.0 to 94.8), with 55 confirmed cases in the mRNA-1273 group (9.6 per 1000 person-years; 95% CI, 7.2 to 12.5) and 744 in the placebo group (136.6 per 1000 person-years; 95% CI, 127.0 to 146.8). The efficacy in preventing severe disease was 98.2% (95% CI, 92.8 to 99.6), with 2 cases in the mRNA-1273 group and 106 in the placebo group, and the efficacy in preventing asymptomatic infection starting 14 days after the second injection was 63.0% (95% CI, 56.6 to 68.5), with 214 cases in the mRNA-1273 group and 498 in the placebo group. Vaccine efficacy was consistent across ethnic and racial groups, age groups, and participants with coexisting conditions. No safety concerns were identified.
CONCLUSIONS The mRNA-1273 vaccine continued to be efficacious in preventing Covid-19 illness and severe disease at more than 5 months, with an acceptable safety profile, and protection against asymptomatic infection was observed.
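As a back-of-the-envelope check on the headline figure, vaccine efficacy can be approximated as one minus the ratio of the incidence rates quoted in the abstract. The sketch below reproduces roughly 93% from the per-1000-person-year rates; the published 93.2% comes from the trial's formal statistical analysis, so this crude rate-ratio calculation differs slightly.

```python
def vaccine_efficacy(rate_vaccine, rate_placebo):
    """Approximate efficacy as 1 - incidence rate ratio (rates per 1000 person-years)."""
    return 1.0 - rate_vaccine / rate_placebo

# Rates quoted in the abstract: 9.6 vs 136.6 cases per 1000 person-years.
ve = vaccine_efficacy(9.6, 136.6)
print(f"{ve:.1%}")  # ~93.0%, close to the published estimate of 93.2%
```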