
    Cognitive loading affects motor awareness and movement kinematics but not locomotor trajectories during goal-directed walking in a virtual reality environment.

    The primary purpose of this study was to investigate the effects of cognitive loading on movement kinematics and trajectory formation during goal-directed walking in a virtual reality (VR) environment. The secondary objective was to measure how participants corrected their trajectories in response to perturbed feedback and how their awareness of such perturbations changed under cognitive loading. We asked 14 healthy young adults to walk towards four different target locations in a VR environment while their movements were tracked and played back in real time on a large projection screen. In 75% of all trials we introduced angular deviations of ±5° to ±30° between the veridical walking trajectory and the visual feedback. Participants performed a second experimental block under cognitive load (serial-7 subtraction, counterbalanced across participants). We measured walking kinematics (joint angles, velocity profiles) and motor performance (end-point compensation, trajectory deviations). Motor awareness was determined by asking participants to rate the veracity of the feedback after every trial. In line with previous findings in natural settings, participants displayed stereotypical walking trajectories in a VR environment. Our results extend these findings by demonstrating that taxing cognitive resources did not affect trajectory formation or deviations, although it did interfere with participants' movement kinematics, in particular walking velocity. Additionally, we report that motor awareness was selectively impaired by the secondary task in trials with high perceptual uncertainty. Compared with data on eye and arm movements, our findings lend support to the hypothesis that the central nervous system (CNS) uses common mechanisms to govern goal-directed movements, including locomotion. We discuss our results with respect to the use of VR methods in gait control and rehabilitation.
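    The angular feedback perturbation described above can be illustrated with a minimal sketch: rotating a tracked 2D walking trajectory by a fixed angle before it is displayed. This is hypothetical illustration code, not the authors' implementation; the function name and data are invented.

```python
import math

def perturb_trajectory(points, angle_deg):
    """Rotate 2D trajectory points about the origin by angle_deg,
    simulating an angular deviation between the veridical walking
    path and the visual feedback (hypothetical illustration)."""
    theta = math.radians(angle_deg)
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    return [(x * cos_t - y * sin_t, x * sin_t + y * cos_t)
            for x, y in points]

# A straight walk along the y-axis, fed back with a +30° deviation
path = [(0.0, float(i)) for i in range(5)]
perturbed = perturb_trajectory(path, 30.0)
```

    In a real-time setup the rotation would be applied to each incoming tracking sample before rendering, so the displayed path diverges smoothly from the walked one.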

    Organizing to counter terrorism: sensemaking amidst dynamic complexity

    Pre-print draft (version 1). The final, definitive version of this paper was published in Human Relations, September 2013, 66(9): 1201–1223, by SAGE Publications Ltd. All rights reserved. © The Author. Organizations increasingly find themselves contending with circumstances that are suffused with dynamic complexity. So how do they make sense of and contend with this? Using a sensemaking approach, our empirical case analysis of the shooting of Mr Jean Charles de Menezes shows how sensemaking is tested under such conditions. Through elaborating the relationship between the concepts of frames and cues, we find that the introduction of a new organizational routine to anticipate action in changing circumstances leads to discrepant sensemaking. This reveals how novel routines do not necessarily replace extant ones but instead overlay each other and give rise to novel, dissonant identities, which in turn can lead to an increase in equivocality rather than a reduction. This has important implications for sensemaking and organizing amidst unprecedented circumstances.

    Inflammatory profile of patients with tuberculosis with or without HIV-1 co-infection: a prospective cohort study and immunological network analysis

    Background HIV-1-mediated dysregulation of the immune response to tuberculosis and its effect on the response to antitubercular therapy (ATT) is incompletely understood. We aimed to analyse the inflammatory profile of patients with tuberculosis with or without HIV-1 co-infection undergoing ATT, with specific focus on the effect of ART and HIV-1 viraemia in those co-infected with HIV-1. Methods In this prospective cohort study and immunological network analysis, a panel of 38 inflammatory markers was measured in the plasma of a prospective patient cohort undergoing ATT at Khayelitsha Site B clinic, Cape Town, South Africa. We recruited patients with sputum Xpert MTB/RIF-positive, rifampicin-susceptible pulmonary tuberculosis. Patients were excluded from the primary discovery cohort if they were younger than 18 years, unable to commence ATT for any reason, pregnant, had unknown HIV-1 status, were unable to consent to study participation, were unable to provide baseline sputum samples, had received more than three doses of ATT, or were being re-treated for tuberculosis within 6 months of their previous ATT regimen. Plasma samples were collected at baseline (1–5 days after commencing ATT), week 8, and week 20 of ATT. We applied network and multivariate analysis to investigate the dynamic inflammatory profile of these patients in relation to ATT and by HIV status. In addition to the discovery cohort, a validation cohort of patients with HIV-1 admitted to hospital with CD4 counts less than 350 cells per μL and a high clinical suspicion of new tuberculosis was recruited. Findings Between March 1, 2013, and July 31, 2014, we assessed a cohort of 129 participants (55 [43%] female and 74 [57%] male; median age 35·1 years [IQR 30·1–43·7]), of whom 76 were co-infected with HIV-1. HIV-1 status markedly influenced the inflammatory profile regardless of ATT duration. HIV-1 viral load emerged as a major factor driving differential inflammatory marker expression and strongly affected the correlation profiles observed in the HIV-1 co-infected group. Interleukin (IL)-17A emerged as a key correlate of HIV-1-induced inflammation during HIV–tuberculosis co-infection. Interpretation Our findings show the effect of HIV-1 co-infection on the complexity of plasma inflammatory profiles in patients with tuberculosis. Through network analysis we identified IL-17A as an important node in HIV–tuberculosis co-infection, implicating this cytokine's capacity to correlate with, and regulate, other inflammatory markers. Further mechanistic studies are required to identify specific IL-17A-related inflammatory pathways mediating immunopathology in HIV–tuberculosis co-infection, which could illuminate targets for future host-directed therapies. Funding National Institutes of Health, The Wellcome Trust, UK Research and Innovation, Cancer Research UK, European and Developing Countries Clinical Trials Partnership, and South African Medical Research Council.
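    The correlation-network idea behind an analysis like this can be sketched in outline: compute pairwise correlations across the marker panel and keep only edges whose absolute correlation exceeds a threshold, so that highly connected markers (such as IL-17A here) emerge as network nodes. This is an illustrative approach with made-up toy data, not the authors' pipeline.

```python
import itertools
import math

def pearson(xs, ys):
    """Plain Pearson correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def correlation_network(markers, threshold=0.7):
    """Build an edge list linking marker pairs whose absolute
    correlation exceeds the threshold (illustrative sketch)."""
    edges = []
    for (a, xs), (b, ys) in itertools.combinations(markers.items(), 2):
        r = pearson(xs, ys)
        if abs(r) >= threshold:
            edges.append((a, b, r))
    return edges

# Toy data: three markers measured in five plasma samples (made up)
markers = {
    "IL-17A": [1.0, 2.0, 3.0, 4.0, 5.0],
    "IL-6":   [1.1, 2.2, 2.9, 4.1, 5.2],   # tracks IL-17A closely
    "CRP":    [5.0, 1.0, 4.0, 2.0, 3.0],   # unrelated
}
edges = correlation_network(markers, threshold=0.7)
```

    A marker's importance can then be read off the resulting graph, for example as its degree (number of retained edges).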

    The effect of gold kiwifruit consumed with an iron fortified breakfast cereal meal on iron status in women with low iron stores: A 16 week randomised controlled intervention study

    Background: Dietary treatment is often recommended as the first line of treatment for women with mild iron deficiency. Although it is well established that ascorbic acid enhances iron absorption, it is less clear whether the consumption of ascorbic acid rich foods (such as kiwifruit) with meals fortified with iron improves iron status. The aim of this study is to investigate whether the consumption of ZESPRI® GOLD kiwifruit (a fruit high in ascorbic acid and carotenoids) with an iron fortified breakfast cereal meal increases iron status in women with low iron stores. Methods/Design: Eighty-nine healthy women aged 18–44 years with low iron stores (serum ferritin (SF) ≤ 25 μg/L, haemoglobin (Hb) ≥ 115 g/L) living in Auckland, New Zealand were randomised to receive an iron fortified breakfast cereal (16 mg iron per serve) and either two ZESPRI® GOLD kiwifruit or a banana (low ascorbic acid and carotenoid content) to eat at breakfast time every day for 16 weeks. Iron status (SF, Hb, C-reactive protein (CRP) and soluble transferrin receptor (sTfR)), ascorbic acid and carotenoid status were measured at baseline and after 16 weeks. Anthropometric measures, dietary intake, physical activity and blood loss were measured before and after the 16 week intervention. Discussion: This randomised controlled intervention study will be the first study to investigate the effect of a dietary based intervention of an iron fortified breakfast cereal meal combined with an ascorbic acid and carotenoid rich fruit on improving iron status in women with low iron stores. Trial registration: ACTRN12608000360314.

    A Critical Review of Biomarkers Used for Monitoring Human Exposure to Lead: Advantages, Limitations, and Future Needs

    Lead concentration in whole blood (BPb) is the primary biomarker used to monitor exposure to this metallic element. The U.S. Centers for Disease Control and Prevention and the World Health Organization define a BPb of 10 μg/dL (0.48 μmol/L) as the threshold of concern in young children. However, recent studies have reported the possibility of adverse health effects, including intellectual impairment in young children, at BPb levels < 10 μg/dL, suggesting that there is no safe level of exposure. It appears impossible to differentiate between low-level chronic Pb exposure and high-level short-term Pb exposure based on a single BPb measurement; therefore, serial BPb measurements offer a better estimation of possible health outcomes. The difficulty in assessing the exact nature of Pb exposure stems not so much from problems with current analytical methodologies as from the complex toxicokinetics of Pb within various body compartments (i.e., cycling of Pb between bone, blood, and soft tissues). If we are to differentiate more effectively between Pb stored in the body for years and Pb from recent exposure, information on other biomarkers of exposure may be needed. None of the current biomarkers of internal Pb dose has yet been accepted by the scientific community as a reliable substitute for a BPb measurement. This review focuses on the limitations of biomarkers of Pb exposure and the need to improve the accuracy of their measurement. We present here only the traditional analytical protocols in current use, and we attempt to assess the influence of confounding variables on BPb levels. Finally, we discuss the interpretation of BPb data with respect to both external and endogenous Pb exposure, past or recent exposure, and the significance of Pb determinations in human specimens including hair, nails, saliva, bone, blood (plasma, whole blood), urine, feces, and exfoliated teeth.
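    The μg/dL-to-μmol/L conversion quoted above (10 μg/dL ≈ 0.48 μmol/L) follows directly from the standard atomic weight of lead, about 207.2 g/mol. A minimal sketch (the function name is illustrative):

```python
PB_ATOMIC_MASS = 207.2  # g/mol, standard atomic weight of lead

def ugdl_to_umoll(ug_per_dl):
    """Convert a blood lead concentration from µg/dL to µmol/L.
    1 dL = 0.1 L, so multiplying µg/dL by 10 gives µg/L; dividing
    by the molar mass (g/mol is numerically µg/µmol) yields µmol/L."""
    return ug_per_dl * 10.0 / PB_ATOMIC_MASS

# The 10 µg/dL threshold of concern corresponds to ~0.48 µmol/L
print(round(ugdl_to_umoll(10.0), 2))  # → 0.48
```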

    Conservation of a pH-sensitive structure in the C-terminal region of spider silk extends across the entire silk gene family

    Spiders produce multiple silks with different physical properties that allow them to occupy a diverse range of ecological niches, including the underwater environment. Despite this functional diversity, past molecular analyses show a high degree of amino acid sequence similarity between C-terminal regions of silk genes that appears to be independent of the physical properties of the resulting silks; instead, this domain is crucial to the formation of silk fibres. Here we present an analysis of the C-terminal domain of all known types of spider silk and include silk sequences from the spider Argyroneta aquatica, which spins the majority of its silk underwater. Our work indicates that spiders have retained a highly conserved mechanism of silk assembly, despite the extraordinary diversification of species, silk types and applications of silk over 350 million years. Sequence analysis of the silk C-terminal domain across the entire gene family shows the conservation of two uncommon amino acids that are implicated in the formation of a salt bridge, a functional bond essential to protein assembly. This conservation extends to the novel sequences isolated from A. aquatica. This finding is relevant to research regarding the artificial synthesis of spider silk, suggesting that synthesis of all silk types will be possible using a single process.

    Synaptic Defects in the Spinal and Neuromuscular Circuitry in a Mouse Model of Spinal Muscular Atrophy

    Spinal muscular atrophy (SMA) is a major genetic cause of death in childhood characterized by marked muscle weakness. To investigate mechanisms underlying motor impairment in SMA, we examined the spinal and neuromuscular circuitry governing hindlimb ambulatory behavior in SMA model mice (SMNΔ7). In the neuromuscular circuitry, we found that nearly all neuromuscular junctions (NMJs) in hindlimb muscles of SMNΔ7 mice remained fully innervated at the disease end stage and were capable of eliciting muscle contraction, despite a modest reduction in quantal content. In the spinal circuitry, we observed a ∼28% loss of synapses onto spinal motoneurons in the lateral column of lumbar segments 3–5, and a significant reduction in proprioceptive sensory neurons, which may contribute to the 50% reduction in vesicular glutamate transporter 1 (VGLUT1)-positive synapses onto SMNΔ7 motoneurons. In addition, there was an increase in the association of activated microglia with SMNΔ7 motoneurons. Together, our results present a novel concept that synaptic defects occur at multiple levels of the spinal and neuromuscular circuitry in SMNΔ7 mice, and that proprioceptive spinal synapses could be a potential target for SMA therapy.

    Testing a global standard for quantifying species recovery and assessing conservation impact

    Recognizing the imperative to evaluate species recovery and conservation impact, in 2012 the International Union for Conservation of Nature (IUCN) called for development of a “Green List of Species” (now the IUCN Green Status of Species). A draft Green Status framework for assessing species’ progress toward recovery, published in 2018, proposed 2 separate but interlinked components: a standardized method (i.e., measurement against benchmarks of species’ viability, functionality, and preimpact distribution) to determine current species recovery status (herein species recovery score) and application of that method to estimate past and potential future impacts of conservation based on 4 metrics (conservation legacy, conservation dependence, conservation gain, and recovery potential). We tested the framework with 181 species representing diverse taxa, life histories, biomes, and IUCN Red List categories (extinction risk). Based on the observed distribution of species’ recovery scores, we propose the following species recovery categories: fully recovered, slightly depleted, moderately depleted, largely depleted, critically depleted, extinct in the wild, and indeterminate. Fifty-nine percent of tested species were considered largely or critically depleted. Although there was a negative relationship between extinction risk and species recovery score, variation was considerable. Some species in lower risk categories were assessed as farther from recovery than those at higher risk. This emphasizes that species recovery is conceptually different from extinction risk and reinforces the utility of the IUCN Green Status of Species for more fully understanding species conservation status. Although extinction risk did not predict conservation legacy, conservation dependence, or conservation gain, it was positively correlated with recovery potential. Only 1.7% of tested species were categorized as zero across all 4 of these conservation impact metrics, indicating that conservation has played, or will play, a role in improving or maintaining species status for the vast majority of these species. Based on our results, we devised an updated assessment framework that introduces the option of using a dynamic baseline to assess future impacts of conservation over the short term, to avoid the misleading results generated in a small number of cases, and redefines short term as 10 years to better align with conservation planning. These changes are reflected in the IUCN Green Status of Species Standard.
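    The score-to-category step described above can be sketched as a simple threshold lookup. The cut-offs below are purely illustrative placeholders, not the boundaries defined by the IUCN Green Status of Species Standard, and the score-independent categories (extinct in the wild, indeterminate) are deliberately omitted.

```python
# Illustrative thresholds only -- the IUCN Green Status Standard
# defines its own category boundaries, which are not reproduced here.
CATEGORIES = [
    (90, "fully recovered"),
    (70, "slightly depleted"),
    (50, "moderately depleted"),
    (20, "largely depleted"),
    (0,  "critically depleted"),
]

def recovery_category(score):
    """Map a 0-100 species recovery score to a named category
    using hypothetical cut-offs (for illustration only)."""
    for cutoff, name in CATEGORIES:
        if score >= cutoff:
            return name
    raise ValueError("score must be between 0 and 100")
```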

    Evaluation of Bioelectrical Impedance Analysis for Identifying Overweight Individuals at Increased Cardiometabolic Risk: A Cross-Sectional Study

    OBJECTIVE: To investigate whether bioelectrical impedance analysis could be used to identify overweight individuals at increased cardiometabolic risk, defined as the presence of metabolic syndrome and/or diabetes. DESIGN AND METHODS: Cross-sectional study of a Scottish population including 1210 women and 788 men. The diagnostic performance of thresholds of percentage body fat measured by bioelectrical impedance analysis to identify people at increased cardiometabolic risk was assessed using receiver-operating characteristic curves. Odds ratios for increased cardiometabolic risk in body mass index categories associated with values above compared to below sex-specific percentage body fat thresholds with optimal diagnostic performance were calculated using multivariable logistic regression analyses. The validity of bioelectrical impedance analysis to measure percentage body fat in this population was tested by examining agreement between bioelectrical impedance analysis and dual-energy X-ray absorptiometry in a subgroup of individuals. RESULTS: Participants were aged 16–91 years, and the optimal bioelectrical impedance analysis cut-points for percentage body fat for identifying people at increased cardiometabolic risk were 25.9% for men and 37.1% for women. Stratifying by these percentage body fat cut-points, the prevalence of increased cardiometabolic risk was 48% and 38% above the thresholds and 24% and 19% below the thresholds for men and women, respectively. By comparison, stratifying by percentage body fat category had little impact on identifying increased cardiometabolic risk in normal weight and obese individuals. Fully adjusted odds ratios of being at increased cardiometabolic risk among overweight people with percentage body fat ≥25.9/37.1% compared with percentage body fat <25.9/37.1% as a reference were 1.93 (95% confidence interval: 1.20–3.10) for men and 1.79 (1.10–2.92) for women. CONCLUSION: Percentage body fat measured using bioelectrical impedance analysis above a sex-specific threshold could be used in overweight people to identify individuals at increased cardiometabolic risk, who could benefit from risk factor management.
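    Optimal ROC cut-points of the kind reported above are often chosen by maximising Youden's J statistic (sensitivity + specificity − 1) over candidate thresholds. The sketch below illustrates that generic approach on made-up data; the abstract does not state the study's exact criterion, so this is an assumption, not the published analysis.

```python
def youden_optimal_threshold(values, labels):
    """Pick the threshold maximising Youden's J
    (sensitivity + specificity - 1). `labels` are 1 for
    increased cardiometabolic risk, 0 otherwise.
    Illustrative sketch, not the study's analysis code."""
    positives = sum(labels)
    negatives = len(labels) - positives
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= t and y == 1)
        fp = sum(1 for v, y in zip(values, labels) if v >= t and y == 0)
        sens = tp / positives
        spec = 1 - fp / negatives
        j = sens + spec - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Toy data: percentage body fat and risk status (made up)
fat = [20, 22, 24, 26, 28, 30, 32, 34]
risk = [0, 0, 0, 0, 1, 1, 1, 1]
threshold, j = youden_optimal_threshold(fat, risk)
```

    With perfectly separated toy data the routine recovers the boundary value; on real data the maximum J trades sensitivity against specificity.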