    Verified and potential pathogens of predatory mites (Acari: Phytoseiidae)

    Several species of phytoseiid mites (Acari: Phytoseiidae), including species of the genera Amblyseius, Galendromus, Metaseiulus, Neoseiulus, Phytoseiulus and Typhlodromus, are currently reared for biological control of various crop pests and/or as model organisms for the study of predator–prey interactions. Pathogen-free phytoseiid mites are important for obtaining high efficacy in biological pest control and reliable data in mite research, as pathogens may affect the performance of their host or alter their reproduction and behaviour. Potential and verified pathogens have been reported for phytoseiid mites during the past 25 years. The present review provides an overview, including potential pathogens with unknown host effects (17 reports), endosymbiotic Wolbachia (seven reports), other bacteria, including Cardinium and Spiroplasma (four reports), cases of unidentified diseases (three reports) and cases of verified pathogens (six reports). Of the latter group, four reports refer to Microsporidia, one to a fungus and one to a bacterium. Only five entities have been studied in detail: Wolbachia infecting seven predatory mite species, other endosymbiotic bacteria infecting Metaseiulus (Galendromus, Typhlodromus) occidentalis (Nesbitt), the bacterium Acaricomes phytoseiuli infecting Phytoseiulus persimilis Athias-Henriot, the microsporidium Microsporidium phytoseiuli infecting P. persimilis and the microsporidium Oligosporidium occidentalis infecting M. occidentalis. In four cases (Wolbachia, A. phytoseiuli, M. phytoseiuli and O. occidentalis) infection may carry fitness costs for the host. Moreover, infection is not always readily visible, as obvious gross symptoms may be absent. Monitoring of these entities on a routine and continuous basis should therefore receive more attention, especially in commercial mass production. Special attention should be paid to field-collected mites before introduction into the laboratory or mass rearing, and to mites that are exchanged among rearing facilities. However, general pathogen monitoring is not yet practical at present because the effects of many entities are unknown. More research effort is needed on verified and potential pathogens of commercially reared arthropods and of those used as model organisms in research.

    The role of agonist and antagonist muscles in explaining isometric knee extension torque variation with hip joint angle.

    PURPOSE: The biarticular rectus femoris (RF), operating on the ascending limb of its force-length curve, produces more force at longer lengths. However, experimental studies consistently report higher knee extension torque in the supine position (longer RF length) than in the seated position (shorter RF length). Incomplete activation in the supine position has been proposed as the reason for this discrepancy, but differences in antagonistic co-activation could also be responsible, owing to altered hamstrings length. We examined the role of agonist and antagonist muscles in explaining the variation of isometric knee extension torque with hip joint angle. METHODS: Maximum voluntary isometric knee extension torque (joint MVC) was recorded in seated and supine positions from nine healthy males (30.2 ± 7.7 years). Antagonistic torque was estimated using EMG and added to the respective joint MVC (corrected MVC). Submaximal tetanic stimulation quadriceps torque was also recorded. RESULTS: Joint MVC did not differ between supine (245 ± 71.8 Nm) and seated (241 ± 69.8 Nm) positions, and neither did corrected MVC (257 ± 77.7 and 267 ± 87.0 Nm, respectively). Antagonistic torque was higher when seated (26 ± 20.4 Nm) than when supine (12 ± 7.4 Nm). Tetanic torque was higher when supine (111 ± 31.9 Nm) than when seated (99 ± 27.5 Nm). CONCLUSION: Antagonistic co-activation differences between hip positions do not account for the reduced MVC in the supine position. Rather, reduced voluntary knee extensor muscle activation in that position is the major reason for the lower MVC torque when RF is lengthened (hip extended). These findings can assist in standardising muscle function assessment and in improving musculoskeletal modelling applications.
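    The antagonist correction described here is, at its core, simple arithmetic: the flexion torque produced by hamstring co-activation, estimated from antagonist EMG via a calibration of EMG against torque, is added back onto the measured extension MVC. The sketch below illustrates only that step; the EMG-to-torque slope and the numbers are hypothetical stand-ins, not the study's calibration.

```python
# Minimal sketch of the antagonist correction described above.
# The EMG-to-torque calibration slope is hypothetical; studies of this
# kind typically derive it from submaximal knee-flexion trials.

def corrected_mvc(joint_mvc_nm: float,
                  antagonist_emg_mv: float,
                  emg_to_torque_slope: float) -> float:
    """Add the estimated antagonist (hamstrings) torque back onto the
    measured knee-extension MVC."""
    antagonist_torque_nm = antagonist_emg_mv * emg_to_torque_slope
    return joint_mvc_nm + antagonist_torque_nm

# Example with the supine group means reported above:
# joint MVC 245 Nm plus ~12 Nm antagonist torque gives ~257 Nm.
print(corrected_mvc(245.0, antagonist_emg_mv=0.4, emg_to_torque_slope=30.0))
```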

    Global, regional, and national comparative risk assessment of 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks, 1990-2015: a systematic analysis for the Global Burden of Disease Study 2015

    Background The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 provides an up-to-date synthesis of the evidence for risk factor exposure and the attributable burden of disease. By providing national and subnational assessments spanning the past 25 years, this study can inform debates on the importance of addressing risks in context. Methods We used the comparative risk assessment framework developed for previous iterations of the Global Burden of Disease Study to estimate attributable deaths, disability-adjusted life-years (DALYs), and trends in exposure by age group, sex, year, and geography for 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks from 1990 to 2015. This study included 388 risk-outcome pairs that met World Cancer Research Fund-defined criteria for convincing or probable evidence. We extracted relative risk and exposure estimates from randomised controlled trials, cohorts, pooled cohorts, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. We developed a metric that allows comparisons of exposure across risk factors—the summary exposure value. Using the counterfactual scenario of theoretical minimum risk level, we estimated the portion of deaths and DALYs that could be attributed to a given risk. We decomposed trends in attributable burden into contributions from population growth, population age structure, risk exposure, and risk-deleted cause-specific DALY rates. We characterised risk exposure in relation to a Socio-demographic Index (SDI). Findings Between 1990 and 2015, global exposure to unsafe sanitation, household air pollution, childhood underweight, childhood stunting, and smoking each decreased by more than 25%. Global exposure for several occupational risks, high body-mass index (BMI), and drug use increased by more than 25% over the same period. All risks jointly evaluated in 2015 accounted for 57·8% (95% CI 56·6–58·8) of global deaths and 41·2% (39·8–42·8) of DALYs. In 2015, the ten largest contributors to global DALYs among Level 3 risks were high systolic blood pressure (211·8 million [192·7 million to 231·1 million] global DALYs), smoking (148·6 million [134·2 million to 163·1 million]), high fasting plasma glucose (143·1 million [125·1 million to 163·5 million]), high BMI (120·1 million [83·8 million to 158·4 million]), childhood undernutrition (113·3 million [103·9 million to 123·4 million]), ambient particulate matter (103·1 million [90·8 million to 115·1 million]), high total cholesterol (88·7 million [74·6 million to 105·7 million]), household air pollution (85·6 million [66·7 million to 106·1 million]), alcohol use (85·0 million [77·2 million to 93·0 million]), and diets high in sodium (83·0 million [49·3 million to 127·5 million]). From 1990 to 2015, attributable DALYs declined for micronutrient deficiencies, childhood undernutrition, unsafe sanitation and water, and household air pollution; reductions in risk-deleted DALY rates rather than reductions in exposure drove these declines. Rising exposure contributed to notable increases in attributable DALYs from high BMI, high fasting plasma glucose, occupational carcinogens, and drug use. Environmental risks and childhood undernutrition declined steadily with SDI; low physical activity, high BMI, and high fasting plasma glucose increased with SDI. In 119 countries, metabolic risks, such as high BMI and fasting plasma glucose, contributed the most attributable DALYs in 2015. Regionally, smoking still ranked among the leading five risk factors for attributable DALYs in 109 countries; childhood underweight and unsafe sex remained primary drivers of early death and disability in much of sub-Saharan Africa. Interpretation Declines in some key environmental risks have contributed to declines in critical infectious diseases. Some risks appear to be invariant to SDI. Increasing risks, including high BMI, high fasting plasma glucose, drug use, and some occupational exposures, contribute to rising burden from some conditions, but also provide opportunities for intervention. Some highly preventable risks, such as smoking, remain major causes of attributable DALYs, even as exposure is declining. Public policy makers need to pay attention to the risks that are increasingly major contributors to global burden. Funding Bill & Melinda Gates Foundation.
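    The attributable-burden machinery referred to above rests on the population attributable fraction (PAF), which compares the observed exposure distribution with the theoretical-minimum-risk counterfactual: PAF = (sum_i P_i RR_i - sum_i P'_i RR_i) / sum_i P_i RR_i. A minimal sketch of that categorical formula follows; the exposure categories and relative risks are invented for illustration and are not GBD estimates.

```python
# Illustrative population attributable fraction (PAF) under the
# counterfactual-exposure formulation used in comparative risk
# assessment; exposure categories and relative risks are made up.

def paf(prevalence, counterfactual, relative_risk):
    """PAF = (sum(P_i * RR_i) - sum(P'_i * RR_i)) / sum(P_i * RR_i)."""
    observed = sum(p * rr for p, rr in zip(prevalence, relative_risk))
    minimum = sum(q * rr for q, rr in zip(counterfactual, relative_risk))
    return (observed - minimum) / observed

# Three exposure levels; the theoretical minimum puts everyone in the
# lowest-risk category (RR = 1).
p_observed = [0.5, 0.3, 0.2]   # current exposure distribution
p_minimum  = [1.0, 0.0, 0.0]   # theoretical-minimum counterfactual
rr         = [1.0, 1.5, 2.5]   # relative risk per category

print(paf(p_observed, p_minimum, rr))  # ~0.31 of burden attributable
# Attributable DALYs = PAF * total DALYs for the linked outcome.
```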

    Mathematical modeling of the dynamic storage of iron in ferritin

    Background: Iron is essential for the maintenance of basic cellular processes. In the regulation of its cellular levels, ferritin acts as the main intracellular iron storage protein. In this work we present a mathematical model for the dynamics of iron storage in ferritin during the process of intestinal iron absorption. A set of differential equations was established, considering kinetic expressions for the main reactions and mass balances for ferritin, iron and a discrete population of ferritin species defined by their respective iron content. Results: Simulation results showing the evolution of ferritin iron content following a pulse of iron were compared with experimental data for ferritin iron distribution obtained with purified ferritin incubated in vitro with different iron levels. Distinctive features observed experimentally were successfully captured by the model, namely the distribution pattern of iron into ferritin protein nanocages with different iron content and the role of ferritin as a controller of the cytosolic labile iron pool (cLIP). Ferritin stabilizes the cLIP for a wide range of total intracellular iron concentrations, but the model predicts an exponential increase of the cLIP at an iron content >2,500 Fe/ferritin protein cage, when the storage capacity of ferritin is exceeded. Conclusions: The results presented support the role of ferritin as an iron buffer in a cellular system. Moreover, the model predicts desirable characteristics for a buffer protein, such as effective removal of excess iron, which keeps intracellular cLIP levels approximately constant even when large perturbations are introduced, and a freely available source of iron under iron starvation. In addition, the simulated dynamics of the iron removal process are extremely fast, with ferritin acting as a first defense against dangerous iron fluctuations and providing the time required by the cell to activate slower transcriptional regulation mechanisms and adapt to iron stress conditions. In summary, the model captures the complexity of the iron-ferritin equilibrium, and can be used for further theoretical exploration of the role of ferritin in the regulation of intracellular labile iron levels and, in particular, as a relevant regulator of transepithelial iron transport during the process of intestinal iron absorption.
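    The paper's full model tracks a discrete population of ferritin species by iron content, but the buffering behaviour it describes can be sketched with a much cruder two-state caricature, in which ferritin sequesters labile iron at a rate that falls as cages approach capacity. The rate constants and the iron pulse below are hypothetical and the equations are illustrative, not the paper's; only the ~2,500 Fe/cage capacity comes from the abstract.

```python
# Minimal two-state caricature of the ferritin buffer described above,
# not the paper's full per-species model. Rate constants and the iron
# pulse are hypothetical; capacity ~2,500 Fe per ferritin cage follows
# the saturation threshold quoted in the abstract.
from scipy.integrate import solve_ivp

CAPACITY = 2500.0   # Fe atoms per ferritin cage (saturation threshold)
K_IN = 0.05         # sequestration rate constant (hypothetical)
K_OUT = 1e-4        # slow iron release from ferritin (hypothetical)

def model(t, y, influx):
    clip, load = y  # cytosolic labile iron pool; mean Fe per cage
    free_capacity = max(CAPACITY - load, 0.0)
    uptake = K_IN * clip * free_capacity / CAPACITY  # slows near capacity
    release = K_OUT * load
    return [influx(t) - uptake + release, uptake - release]

# A short iron pulse entering the cytosol at the start of the run.
pulse = lambda t: 50.0 if t < 10.0 else 0.0

sol = solve_ivp(model, (0.0, 200.0), [1.0, 100.0], args=(pulse,),
                max_step=1.0)
clip_end, load_end = sol.y[:, -1]
print(f"cLIP ~ {clip_end:.2f}, Fe/cage ~ {load_end:.1f}")
```

    Because uptake scales with the remaining free capacity, the cLIP in this toy model stays nearly flat after the pulse until the cages fill, which is the qualitative buffering-then-overflow behaviour the abstract describes.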

    'Care and Prevent': rationale for investigating skin and soft tissue infections and AA amyloidosis among people who inject drugs in London.

    BACKGROUND: Skin and soft tissue infections (SSTIs) are a leading cause of morbidity and mortality among people who inject drugs (PWID). International data indicate that up to one third of PWID have experienced an SSTI within the past month. Complications include sepsis, endocarditis and amyloid A (AA) amyloidosis. AA amyloidosis is a serious sequela of chronic SSTI among PWID. Though the literature on AA amyloidosis among PWID is sparse, what has been published suggests a likely causal relationship between AA amyloidosis and injecting-related SSTI. If left untreated, AA amyloidosis can lead to renal failure, and premature mortality among diagnosed PWID is high; early intervention may reverse the disease. Despite the high societal and individual burden of SSTI among PWID, empirical evidence on the barriers and facilitators to injecting-related SSTI prevention and care, and on the feasibility and acceptability of AA amyloidosis screening and treatment referral, is limited. This study aims to fill these gaps and assess the prevalence of AA amyloidosis among PWID. METHODS: Care and Prevent is a UK National Institute for Health Research-funded mixed-methods study. In five phases (P1-P5), we aim to assess the evidence for AA amyloidosis among PWID (P1); assess the feasibility of AA amyloidosis screening, diagnosis and treatment referral among PWID in London (P2); investigate the barriers and facilitators to AA amyloidosis care (P3); explore SSTI protection and risk (P4); and co-create harm reduction resources with the affected community (P5). This paper describes the conceptual framework, methodological design and proposed analysis for the mixed-methods multi-phase study. RESULTS: We are implementing the Care and Prevent protocol in London. The systematic review component of the study has been completed and published. Care and Prevent will generate an estimate of AA amyloidosis prevalence among community-recruited PWID in London, with implications for the development of screening recommendations and intervention implementation. We aim to recruit 400 PWID from drug treatment services in London, UK. CONCLUSIONS: Care and Prevent is the first study to assess screening feasibility and the prevalence of positive proteinuria, as a marker for AA amyloidosis, among PWID accessing drug treatment services. AA amyloidosis is a serious yet under-recognised condition for which early intervention is available but not employed.

    Clinical Predictors of Immune Reconstitution following Combination Antiretroviral Therapy in Patients from the Australian HIV Observational Database

    A small but significant number of patients do not achieve CD4 T-cell counts >500 cells/µl despite years of suppressive cART. These patients remain at risk of AIDS and non-AIDS defining illnesses. The aim of this study was to identify clinical factors associated with CD4 T-cell recovery following long-term cART. Patients with the following inclusion criteria were selected from the Australian HIV Observational Database (AHOD): cART as their first regimen, initiated at a CD4 T-cell count <500 cells/µl, with HIV RNA <500 copies/ml after 6 months of cART and sustained for at least 12 months. The Cox proportional hazards model was used to identify determinants associated with time to achieve CD4 T-cell counts >500 cells/µl and >200 cells/µl. In total, 501 patients were eligible for inclusion from AHOD (n = 2853). The median (IQR) age and baseline CD4 T-cell count were 39 (32-47) years and 236 (130-350) cells/µl, respectively. A major strength of this study is the long follow-up duration, median (IQR) = 6.5 (3-10) years. Most patients (80%) achieved CD4 T-cell counts >500 cells/µl, but in 8% this took >5 years. Among the patients who failed to reach a CD4 T-cell count >500 cells/µl, 16% received cART for >10 years. In a multivariate analysis, faster time to achieve a CD4 T-cell count >500 cells/µl was associated with higher baseline CD4 T-cell counts (p<0.001), younger age (p = 0.019) and treatment initiation with a protease inhibitor (PI)-based regimen (vs. non-nucleoside reverse transcriptase inhibitor, NNRTI; p = 0.043). Factors associated with achieving CD4 T-cell counts >200 cells/µl included higher baseline CD4 T-cell count (p<0.001), not having a prior AIDS-defining illness (p = 0.018) and higher baseline HIV RNA (p<0.001). The time taken to achieve a CD4 T-cell count >500 cells/µl despite long-term cART is prolonged in a subset of patients in AHOD. Starting cART early with a PI-based regimen (vs. an NNRTI-based regimen) is associated with more rapid recovery of a CD4 T-cell count >500 cells/µl.
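    A minimal sketch of the kind of time-to-event analysis described above, using the lifelines implementation of the Cox proportional hazards model; the toy DataFrame and column names are hypothetical stand-ins for the AHOD variables (time to CD4 >500 cells/µl, baseline CD4, age, PI- vs NNRTI-based first regimen).

```python
# Sketch of a Cox proportional hazards fit for time to CD4 recovery.
# Data and column names are hypothetical, not AHOD records.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years_to_cd4_500": [2.1, 6.5, 3.0, 10.0, 1.4, 8.2, 4.7, 9.1],
    "reached_cd4_500":  [1, 1, 1, 0, 1, 0, 1, 1],   # event indicator
    "baseline_cd4":     [350, 130, 236, 90, 400, 150, 280, 110],
    "age":              [32, 47, 39, 55, 28, 44, 36, 50],
    "pi_based_regimen": [1, 0, 1, 0, 1, 0, 1, 0],   # 1 = PI, 0 = NNRTI
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_cd4_500", event_col="reached_cd4_500")
cph.print_summary()  # hazard ratios >1 imply faster attainment of the endpoint
```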

    A review of abnormalities in the perception of visual illusions in schizophrenia

    Specific abnormalities of vision in schizophrenia have been observed to affect high-level and some low-level integration mechanisms, suggesting that people with schizophrenia may experience anomalies across different stages of the visual system, affecting early processing, late processing or both. Here, we review the research into visual illusion perception in schizophrenia and the issues which previous research has faced. One general finding that emerged from the literature is that those with schizophrenia are mostly immune to the effects of high-level illusory displays, but this immunity is not consistent across all low-level illusions. The present review suggests that this resistance is due to the weakening of top–down perceptual mechanisms and may be relevant to the understanding of symptoms of visual distortion rather than hallucinations, as previously thought.

    Can group-based reassuring information alter low back pain behavior? A cluster-randomized controlled trial

    Background Low back pain (LBP) is common in the population and multifactorial in nature, often involving negative consequences. Reassuring information to improve coping is recommended for reducing the negative consequences of LBP. Adding a simple non-threatening explanation for the pain (temporary muscular dysfunction) has been successful at altering beliefs and behavior when delivered alongside other intervention elements. This study investigates the isolated effect of this specific information on future occupational behavior outcomes when delivered to the workforce. Design A cluster-randomized controlled trial. Methods Publicly employed workers (n = 505) from 11 Danish municipality centers were randomized at center level (cluster) to either intervention (two 1-hour group-based talks at the workplace) or control. The talks provided reassuring information together with a simple non-threatening explanation for LBP - the ‘functional disturbance’ model. Data collection took place monthly over a 1-year period using text message (SMS) tracking. Primary outcomes were self-reported days of cutting down usual activities and work participation. Secondary outcomes were self-reported back beliefs, work ability, number of healthcare visits, bothersomeness, restricted activity, use of pain medication, and sadness/depression. Results There was no between-group difference in the development of LBP during follow-up. Cumulative logistic regression analyses showed no between-group difference in days of cutting down activities, but increased odds of more days of work participation in the intervention group (OR = 1.83, 95% CI: 1.08-3.12). Furthermore, the intervention group was more likely to report higher work ability, fewer visits to healthcare professionals, lower bothersomeness, lower levels of sadness/depression, and positive back beliefs. Conclusion Reassuring information involving a simple non-threatening explanation for LBP significantly increased the odds of days of work participation and higher work ability among workers who went on to experience LBP during the 12-month follow-up. Our results confirm the potential of public-health education for LBP, and add to the discussion of simple versus multidisciplinary interventions.
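    The primary analysis named above is a cumulative (proportional-odds) logistic regression of an ordinal outcome on group allocation. A minimal sketch using statsmodels' OrderedModel follows; the data are simulated, and the centre-level clustering the trial must account for is omitted here for brevity.

```python
# Sketch of a cumulative (proportional-odds) logistic regression of an
# ordinal outcome on group allocation, using statsmodels' OrderedModel.
# Data are simulated; cluster (centre-level) structure is ignored.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 200
group = rng.integers(0, 2, n)                 # 0 = control, 1 = talks
# Ordinal outcome: 0/1/2 = low/medium/high days of work participation,
# shifted upward for the intervention group to mimic a positive effect.
outcome = np.clip(rng.integers(0, 3, n) + group, 0, 2)

model = OrderedModel(pd.Series(outcome),
                     pd.DataFrame({"group": group}),
                     distr="logit")
res = model.fit(method="bfgs", disp=False)
print(res.summary())
print("OR for intervention:", np.exp(res.params["group"]))
```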

    A Policy-into-Practice Intervention to Increase the Uptake of Evidence-Based Management of Low Back Pain in Primary Care: A Prospective Cohort Study

    BACKGROUND: Persistent non-specific low back pain (nsLBP) is poorly understood by the general community and by educators, researchers and health professionals, making effective care problematic. This study evaluated the effectiveness of a policy-into-practice intervention developed for primary care physicians (PCPs). METHODS: To encourage PCPs to adopt practical evidence-based approaches and facilitate time-efficient, integrated management of patients with nsLBP, we developed an interdisciplinary, evidence-based, practical pain education program (gPEP) based on a contemporary biopsychosocial framework. One hundred and twenty-six PCPs from primary care settings in Western Australia were recruited. PCPs participated in a 6.5-hour gPEP. Self-report measures recorded at baseline and at 2 months post-intervention included PCPs' attitudes and beliefs (modified Health Care Providers' Pain and Impairment Relationship Scale, HC-PAIRS), evidence-based clinical practices (knowledge and skills regarding nsLBP management: 5-point Likert scale with 1 = nil and 5 = excellent) and practice behaviours (recommendations based on a patient vignette; 5-point Likert scale). RESULTS: Ninety-one PCPs participated (attendance rate 72%; post-intervention response rate 88%). PCP responders adopted more positive, guideline-consistent beliefs, evidenced by clinically significant HC-PAIRS score differences (mean change = -5.6 ± 8.2, p<0.0001; 95% confidence interval: -7.6 to -3.6) and significant positive shifts on all measures of clinical knowledge and skills (p<0.0001 for all questions). Self-management strategies were recommended more frequently post-intervention. The majority of responders who were guideline-inconsistent in their work and bed rest recommendations at pre-intervention (82% and 62%, respectively) gave guideline-consistent responses post-intervention. CONCLUSION: An interprofessional pain education program set within a framework that aligns health policy and practice encourages PCPs to adopt more self-reported evidence-based attitudes, beliefs and clinical behaviours in their management of patients with nsLBP. However, further research is required to determine the cost-effectiveness of this approach compared with other modes of educational delivery and to examine PCP behaviours in actual clinical practice.
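    The headline HC-PAIRS result reads as a paired pre/post comparison reported as a mean change with a 95% confidence interval. A minimal sketch of that computation follows, on hypothetical scores (the abstract does not specify the exact test used, so the paired t-test here is an assumption).

```python
# Sketch of a paired pre/post comparison like the HC-PAIRS result above:
# mean change with a 95% confidence interval plus a paired t-test.
# Scores are hypothetical; lower HC-PAIRS = more guideline-consistent.
import numpy as np
from scipy import stats

pre  = np.array([52, 48, 55, 60, 47, 51, 58, 49])   # baseline scores
post = np.array([46, 44, 50, 52, 43, 45, 51, 45])   # 2 months after gPEP

change = post - pre
mean_change = change.mean()
ci = stats.t.interval(0.95, df=len(change) - 1,
                      loc=mean_change, scale=stats.sem(change))
t_stat, p_value = stats.ttest_rel(post, pre)

print(f"mean change = {mean_change:.1f}, 95% CI {ci[0]:.1f} to {ci[1]:.1f}")
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
```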