
    Current Pyuria Cutoffs Promote Inappropriate Urinary Tract Infection Diagnosis in Older Women

    Background: Pre-existing lower urinary tract symptoms (LUTS), cognitive impairment, and the high prevalence of asymptomatic bacteriuria (ASB) complicate the diagnosis of urinary tract infection (UTI) in older women. The presence of pyuria remains the cornerstone of UTI diagnosis. However, >90% of ASB patients have pyuria, prompting unnecessary treatment. We quantified pyuria by automated microscopy and flow cytometry to determine the diagnostic accuracy for UTI and to derive pyuria thresholds for UTI in older women. Methods: Women ≥65 years with ≥2 new-onset LUTS and 1 uropathogen ≥10⁴ colony-forming units (CFU)/mL were included in the UTI group. Controls were asymptomatic and classified as ASB (1 uropathogen ≥10⁵ CFU/mL), negative culture, or mixed flora. Patients with an indwelling catheter or antimicrobial pretreatment were excluded. Leukocyte medians were compared and sensitivity–specificity pairs were derived from a receiver operating characteristic curve. Results: We included 164 participants. UTI patients had higher median urinary leukocyte counts than control patients (microscopy: 900 vs 26 leukocytes/µL; flow cytometry: 1575 vs 23 leukocytes/µL; P < .001). The area under the curve was 0.93 for both methods. At a cutoff of 264 leukocytes/µL, sensitivity and specificity of microscopy were both 88% (positive and negative likelihood ratios: 7.2 and 0.1, respectively). The commonly used cutoff of 10 leukocytes/µL had a poor specificity (36%) and a sensitivity of 100%. Conclusions: The degree of pyuria can help to distinguish UTI in older women from ASB and asymptomatic controls with pyuria. Current pyuria cutoffs are too low and promote inappropriate UTI diagnosis in older women.
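
    The reported likelihood ratios follow directly from the sensitivity–specificity pairs. The sketch below is not code from the study; it simply recomputes the ratios for the two cutoffs mentioned in the abstract, and the small difference from the published 7.2 and 0.1 reflects rounding of the 88% inputs.

    ```python
    # Illustrative only: derive positive/negative likelihood ratios from the
    # sensitivity-specificity pairs reported for the two pyuria cutoffs.

    def likelihood_ratios(sensitivity: float, specificity: float) -> tuple[float, float]:
        """LR+ = sens / (1 - spec); LR- = (1 - sens) / spec."""
        lr_pos = sensitivity / (1.0 - specificity)   # how strongly a positive result supports UTI
        lr_neg = (1.0 - sensitivity) / specificity   # how strongly a negative result argues against UTI
        return lr_pos, lr_neg

    for label, sens, spec in [
        ("microscopy at 264 leukocytes/uL", 0.88, 0.88),
        ("conventional cutoff of 10 leukocytes/uL", 1.00, 0.36),
    ]:
        lr_pos, lr_neg = likelihood_ratios(sens, spec)
        print(f"{label}: LR+ = {lr_pos:.1f}, LR- = {lr_neg:.2f}")
    ```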

    Campylobacter jejuni bacteremia and Helicobacter pylori in a patient with X-linked agammaglobulinemia

    We describe a 15-year-old patient with X-linked agammaglobulinemia who developed malabsorption and bacteremia due to infection with Helicobacter pylori and Campylobacter jejuni. The Campylobacter bacteremia was only recognized after subculturing of blood culture bottles that failed to signal in the automated system. After 2 weeks of treatment with meropenem and 4 weeks of erythromycin, the patient developed a relapse of bacteremia 10 months later with a high-level erythromycin-resistant C. jejuni. Sequencing revealed an A2058C mutation in the 23S rRNA gene associated with this resistance. Treatment with doxycycline for 4 weeks finally resulted in complete eradication. This case report illustrates the importance for physicians of using adapted culture methods and adequate, prolonged therapy in patients with an immunodeficiency. A summary of published case reports and series of patients with hypogammaglobulinemia or agammaglobulinemia with Campylobacter or Helicobacter bacteremia is given.

    Process evaluation of a participatory ergonomics programme to prevent low back pain and neck pain among workers

    Background: Both low back pain (LBP) and neck pain (NP) are major occupational health problems. In the workplace, participatory ergonomics (PE) is frequently used to tackle musculoskeletal disorders. However, evidence on the effectiveness of PE to prevent LBP and NP obtained from randomised controlled trials (RCTs) is scarce. This study evaluates the process of the Stay@Work participatory ergonomics programme, including the perceived implementation of the prioritised ergonomic measures. Methods: This cluster-RCT was conducted at the departments of four Dutch companies (a railway transportation company, an airline company, a steel company, and a university including its university medical hospital). Directly after the randomisation outcome, intervention departments formed a working group that followed the steps of PE during a six-hour working group meeting. Guided by an ergonomist, working groups identified and prioritised risk factors for LBP and NP, and composed and prioritised ergonomic measures. Within three months after the meeting, working groups had to implement the prioritised ergonomic measures at their department. Data on various process components (recruitment, reach, fidelity, satisfaction, and implementation components, i.e., dose delivered and dose received) were collected and analysed on two levels: department (i.e., working group members from intervention departments) and participant (i.e., workers from intervention departments). Results: A total of 19 intervention departments (n = 10 with mental workloads, n = 1 with a light physical workload, n = 4 with physical and mental workloads, and n = 4 with heavy physical workloads) were recruited for participation, and the reach among working group members who participated was high (87%). Fidelity and satisfaction with the PE programme, as rated by the working group members, were good (7.3 or higher). The same was found for the Stay@Work ergocoach training (7.5 or higher). In total, 66 ergonomic measures were prioritised by the working groups. Altogether, 34% of all prioritised ergonomic measures were perceived as implemented (dose delivered), while the workers at the intervention departments perceived 26% as implemented (dose received). Conclusions: PE can be a successful method to develop and to prioritise ergonomic measures to prevent LBP and NP. Despite the positive rating of the PE programme, the implementation of the prioritised ergonomic measures was lower than expected. © 2010 Driessen et al; licensee BioMed Central Ltd.
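
    A minimal sketch of how the two implementation measures relate to the prioritised measures is given below. The raw counts are hypothetical: only the 66 prioritised measures and the 34%/26% percentages are reported in the abstract, so the counts were chosen merely to reproduce percentages close to those values.

    ```python
    # Hypothetical counts, chosen only to approximate the reported percentages;
    # the abstract reports 66 prioritised measures, ~34% dose delivered, ~26% dose received.
    prioritised = 66
    implemented_per_working_groups = 22   # assumption: perceived as implemented at department level
    implemented_per_workers = 17          # assumption: perceived as implemented by workers

    dose_delivered = implemented_per_working_groups / prioritised
    dose_received = implemented_per_workers / prioritised
    print(f"dose delivered: {dose_delivered:.0%}")   # ~33%
    print(f"dose received:  {dose_received:.0%}")    # ~26%
    ```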

    Fixed Dystonia in Complex Regional Pain Syndrome: a Descriptive and Computational Modeling Approach

    Background: Complex regional pain syndrome (CRPS) may occur after trauma, usually to one limb, and is characterized by pain and disturbed blood flow, temperature regulation and motor control. Approximately 25% of cases develop fixed dystonia. Involvement of dysfunctional GABAergic interneurons has been suggested; however, the mechanisms that underpin fixed dystonia are still unknown. We hypothesized that dystonia could be the result of aberrant proprioceptive reflex strengths of position, velocity or force feedback. Methods: We systematically characterized the pattern of dystonia in 85 CRPS patients with dystonia according to the posture held at each joint of the affected limb. We compared the patterns with a neuromuscular computer model simulating aberrations of proprioceptive reflexes. The computer model consists of an antagonistic muscle pair with explicit contributions of the musculotendinous system and reflex pathways originating from muscle spindles and Golgi tendon organs, with time delays reflective of neural latencies. Three scenarios were simulated with the model: (i) increased reflex sensitivity (increased sensitivity of the agonistic and antagonistic reflex loops); (ii) imbalanced reflex sensitivity (increased sensitivity of the agonistic reflex loop); (iii) imbalanced reflex offset (an offset to the reflex output of the agonistic proprioceptors). Results: For the arm, fixed postures were present in 123 arms of 77 patients. The dominant pattern involved flexion of the fingers (116/123), the wrists (41/123) and elbows (38/123). For the leg, fixed postures were present in 114 legs of 77 patients. The dominant pattern was plantar flexion of the toes (55/114 legs), plantar flexion and inversion of the ankle (73/114) and flexion of the knee (55/114). Only the computer simulations of imbalanced reflex sensitivity to muscle force from Golgi tendon organs caused patterns that closely resembled the observed patient characteristics. In parallel experiments using robot manipulators, we have shown that patients with dystonia were less able to adapt their force feedback strength. Conclusions: Findings derived from a neuromuscular model suggest that aberrant force feedback regulation from Golgi tendon organs involving an inhibitory interneuron may underpin the typical fixed flexion postures in CRPS patients with dystonia.
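
    The modelling result can be illustrated with a much-reduced sketch of the same idea: a single joint driven by an antagonistic muscle pair, with delayed spindle (position and velocity) feedback and inhibitory Golgi tendon organ (force) feedback. This is not the authors' musculotendinous model; all parameters below are assumed values chosen for illustration. The point is only that an imbalance in force-feedback sensitivity between the two muscles shifts the equilibrium to a fixed flexed posture, whereas balanced feedback restores the neutral position.

    ```python
    # Minimal single-joint sketch (assumed parameters, not the study's model):
    # an antagonistic flexor/extensor pair with delayed proprioceptive feedback.
    # Spindle feedback (muscle stretch and stretch velocity) is excitatory;
    # Golgi tendon organ (GTO) force feedback is inhibitory.
    import math
    import numpy as np

    def simulate(gto_gain_flexor, gto_gain_extensor, t_end=4.0, dt=0.001, delay=0.02):
        n = int(t_end / dt)
        d = int(delay / dt)                        # neural latency in samples
        theta = np.zeros(n); theta[0] = 0.05       # joint angle (rad), flexion positive
        omega = np.zeros(n)                        # angular velocity (rad/s)
        f_flex = np.zeros(n); f_ext = np.zeros(n)  # normalized muscle forces

        inertia, damping = 0.01, 0.5               # simple joint plant
        baseline = 0.5                             # tonic drive to both muscles
        k_pos, k_vel = 0.5, 0.1                    # spindle gains (stretch, stretch velocity)

        for k in range(1, n):
            j = max(k - 1 - d, 0)                  # delayed feedback sample
            # flexor shortens (negative stretch) as the joint flexes; extensor lengthens
            a_flex = baseline + k_pos * (-theta[j]) + k_vel * (-omega[j]) - gto_gain_flexor * f_flex[j]
            a_ext  = baseline + k_pos * ( theta[j]) + k_vel * ( omega[j]) - gto_gain_extensor * f_ext[j]
            f_flex[k] = min(max(a_flex, 0.0), 1.0)     # activation ~ force, clipped to [0, 1]
            f_ext[k]  = min(max(a_ext, 0.0), 1.0)
            torque = f_flex[k] - f_ext[k]
            omega[k] = omega[k-1] + (torque - damping * omega[k-1]) / inertia * dt
            theta[k] = theta[k-1] + omega[k] * dt
        return math.degrees(theta[-1])

    # Balanced GTO feedback -> the joint settles near neutral; reduced inhibition of
    # the flexor (imbalanced force feedback) -> a sustained fixed flexion posture.
    print(f"balanced force feedback:   {simulate(0.55, 0.55):5.1f} deg")
    print(f"imbalanced force feedback: {simulate(0.20, 0.90):5.1f} deg")
    ```

    In this reduced form, the abstract's offset scenario could be sketched the same way by adding a constant term to the flexor reflex drive instead of changing its GTO gain.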

    A theoretical framework to describe communication processes during medical disability assessment interviews

    BACKGROUND: Research in different fields of medicine suggests that communication is important in physician-patient encounters and influences satisfaction with these encounters. It is argued that this also applies to the non-curative tasks that physicians perform, such as sickness certification and medical disability assessments. However, there is no conceptualised theoretical framework that can be used to describe intentions with regard to communication behaviour, communication behaviour itself, and satisfaction with communication behaviour in a medical disability assessment context. OBJECTIVE: The objective of this paper is to describe the conceptualisation of a model for the communication behaviour of physicians performing medical disability assessments in a social insurance context, and of their claimants, in face-to-face encounters during medical disability assessment interviews and the preparation thereof. CONCEPTUALISATION: The behavioural model, based on the Theory of Planned Behaviour (TPB), is conceptualised for the communication behaviour of social insurance physicians and claimants separately, but also combined during the assessment interview. Other important concepts in the model are the evaluation of communication behaviour (satisfaction), intentions, attitudes, skills, and barriers for communication. CONCLUSION: The conceptualisation of the TPB-based behavioural model will help to provide insight into the communication behaviour of social insurance physicians and claimants during disability assessment interviews. After empirical testing of the relationships in the model, it can be used in other studies to obtain more insight into communication behaviour in non-curative medicine, and it could help social insurance physicians to adapt their communication behaviour to their task when performing disability assessments.

    Clinical characteristics of women captured by extending the definition of severe postpartum haemorrhage with 'refractoriness to treatment': a cohort study

    Background: The absence of a uniform and clinically relevant definition of severe postpartum haemorrhage hampers comparative studies and optimization of clinical management. The concept of persistent postpartum haemorrhage, based on refractoriness to initial first-line treatment, was proposed as an alternative to common definitions that are based either on estimations of blood loss or on transfused units of packed red blood cells (RBC). We compared characteristics and outcomes of women with severe postpartum haemorrhage captured by these three types of definitions. Methods: In this large retrospective cohort study in 61 hospitals in the Netherlands, we included 1391 consecutive women with postpartum haemorrhage who received either ≥4 units of RBC or a multicomponent transfusion. Clinical characteristics and outcomes of women with severe postpartum haemorrhage defined as persistent postpartum haemorrhage were compared to definitions based on estimated blood loss or transfused units of RBC within 24 h following birth. Adverse maternal outcome was a composite of maternal mortality, hysterectomy, arterial embolisation and intensive care unit admission. Results: One thousand two hundred sixty out of 1391 women (90.6%) with postpartum haemorrhage fulfilled the definition of persistent postpartum haemorrhage. The majority, 820/1260 (65.1%), fulfilled this definition within 1 h following birth, compared to 819/1391 (58.7%) applying the definition of ≥1 L blood loss and 37/845 (4.4%) applying the definition of ≥4 units of RBC. The definition of persistent postpartum haemorrhage captured 430/471 adverse maternal outcomes (91.3%), compared to 471/471 (100%) for ≥1 L blood loss and 383/471 (81.3%) for ≥4 units of RBC. Persistent postpartum haemorrhage did not capture all adverse outcomes because of missing data on timing of initial, first-line treatment. Conclusion: The definition of persistent postpartum haemorrhage captured the large majority of adverse maternal outcomes and identified women with severe haemorrhage earlier than definitions based on estimated blood loss or transfused units of RBC.
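
    As a simple restatement of the capture comparison (not code from the study), the counts reported in the abstract can be turned into capture proportions per definition:

    ```python
    # Capture of the 471 adverse maternal outcomes by each definition of severe
    # postpartum haemorrhage, using the counts reported in the abstract.
    adverse_outcomes_total = 471
    captured = {
        "persistent PPH (refractory to first-line treatment)": 430,
        ">=1 L estimated blood loss": 471,
        ">=4 units of RBC": 383,
    }

    for definition, n in captured.items():
        print(f"{definition}: {n}/{adverse_outcomes_total} = {n / adverse_outcomes_total:.1%}")
    ```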

    Aggravation of Chronic Stress Effects on Hippocampal Neurogenesis and Spatial Memory in LPA1 Receptor Knockout Mice

    The lysophosphatidic acid LPA₁ receptor regulates plasticity and neurogenesis in the adult hippocampus. Here, we studied whether absence of the LPA₁ receptor modulated the detrimental effects of chronic stress on hippocampal neurogenesis and spatial memory. Male LPA₁-null (NULL) and wild-type (WT) mice were assigned to control or chronic stress conditions (21 days of restraint, 3 h/day). Immunohistochemistry for bromodeoxyuridine and endogenous markers was performed to examine hippocampal cell proliferation, survival, number and maturation of young neurons, hippocampal structure and apoptosis. Corticosterone levels were measured in a separate cohort of mice. Finally, the hole-board test assessed spatial reference and working memory. Under control conditions, NULL mice showed reduced cell proliferation, a defective population of young neurons, reduced hippocampal volume and moderate spatial memory deficits. However, the primary result is that chronic stress impaired hippocampal neurogenesis in NULLs more severely than in WT mice in terms of cell proliferation; apoptosis; the number and maturation of young neurons; and both the volume and neuronal density in the granular zone. Only stressed NULLs presented hypocortisolemia. Moreover, a dramatic deficit in spatial reference memory consolidation was observed in chronically stressed NULL mice, in contrast to the minor effect observed in stressed WT mice. These results reveal that the absence of the LPA₁ receptor aggravates the chronic stress-induced impairment of hippocampal neurogenesis and its dependent functions. Thus, modulation of the LPA₁ receptor pathway may be of interest with respect to the treatment of stress-induced hippocampal pathology.

    Spatial correlation bias in late-Cenozoic erosion histories derived from thermochronology

    The potential link between erosion rates at the Earth's surface and changes in global climate has intrigued geoscientists for decades [1,2] because such a coupling has implications for the influence of silicate weathering [3,4] and organic-carbon burial [5] on climate and for the role of Quaternary glaciations in landscape evolution [1,6]. A global increase in late-Cenozoic erosion rates in response to a cooling, more variable climate has been proposed on the basis of worldwide sedimentation rates [7]. Other studies have indicated, however, that global erosion rates may have remained steady, suggesting that the reported increases in sediment-accumulation rates are due to preservation biases, depositional hiatuses and varying measurement intervals [8-10]. More recently, a global compilation of thermochronology data has been used to infer a nearly twofold increase in the erosion rate in mountainous landscapes over late-Cenozoic times [6]. It has been contended that this result is free of the biases that affect sedimentary records [11], although others have argued that it contains biases related to how thermochronological data are averaged [12] and to erosion hiatuses in glaciated landscapes [13]. Here we investigate the 30 locations with reported accelerated erosion during the late Cenozoic [6]. Our analysis shows that in 23 of these locations, the reported increases are the result of a spatial correlation bias: combining data with disparate exhumation histories converts spatial erosion-rate variations into apparent temporal increases. In four locations, the increases can be explained by changes in tectonic boundary conditions. In three cases, climatically induced accelerations are recorded, driven by localized glacial valley incision. Our findings suggest that thermochronology data currently have insufficient resolution to assess whether late-Cenozoic climate change affected erosion rates on a global scale. We suggest that a synthesis of local findings that include location-specific information may help to further investigate drivers of global erosion rates.
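
    The spatial correlation bias can be illustrated with a toy calculation (not the paper's analysis): give many synthetic sites different but perfectly steady erosion rates, convert each into a single thermochronometric cooling age, and pool the rates into common time bins. Because only slowly eroding sites have ages old enough to inform the older bins, the pooled average rises toward the present even though no site ever accelerated. The closure depth and rate distribution below are arbitrary assumptions.

    ```python
    # Toy illustration of the spatial correlation bias (assumed values throughout).
    import numpy as np

    rng = np.random.default_rng(0)
    closure_depth_km = 4.0                      # assumed closure depth of the thermochronometer
    rates = rng.uniform(0.1, 2.0, size=200)     # steady site erosion rates (km/Myr)
    ages = closure_depth_km / rates             # cooling ages (Myr); fast sites give young ages

    # Each sample constrains the average rate over [0, age], so a time bin is only
    # informed by samples whose ages reach at least as far back as the bin's old edge.
    for t_young, t_old in [(0, 2), (2, 4), (4, 8), (8, 16), (16, 32)]:
        informing = rates[ages >= t_old]
        if informing.size:
            print(f"{t_young:>2}-{t_old:<2} Myr: mean apparent rate "
                  f"{informing.mean():.2f} km/Myr (n = {informing.size})")
    ```

    The apparent acceleration toward the present disappears when each site is compared only against its own history, which mirrors the suggestion above that location-specific syntheses, rather than pooled global compilations, are needed to assess temporal changes.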