
    Neuromechanical adaptations of foot function to changes in surface stiffness during hopping

    This is the author accepted manuscript. The final version is available from the American Physiological Society via the DOI in this record. Humans choose work-minimizing movement strategies when interacting with compliant surfaces. Our ankles are credited with stiffening our lower limbs and maintaining the excursion of our body's center of mass on a range of surface stiffnesses. We may also be able to stiffen our feet through an active contribution from our plantar intrinsic muscles (PIMs) on such surfaces. However, traditional modelling of the ankle joint has masked this contribution. We compared foot and ankle mechanics and muscle activation on Low, Medium and High stiffness surfaces during bilateral hopping using a traditional and an anatomical ankle model. The traditional ankle model overestimated work and underestimated quasi-stiffness compared to the anatomical model. Hopping on the Low stiffness surface resulted in less longitudinal arch compression than on the High stiffness surface. However, because midfoot torque was also reduced, midfoot quasi-stiffness remained unchanged. We observed lower activation of the PIMs, soleus and tibialis anterior in the Low and Medium stiffness conditions, which paralleled the pattern we saw in the work performed by the foot and ankle. Rather than performing unnecessary work, participants altered their landing posture to harness the energy stored by the sprung surface in the Low and Medium conditions. These findings highlight our preference to minimize mechanical work when transitioning to compliant surfaces and the importance of considering the foot as an active, multi-articular part of the human leg.
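The quasi-stiffness compared between the two ankle models is, in studies of this kind, typically estimated as the slope of the joint torque-angle curve during loading. A minimal sketch of that calculation, using made-up values rather than data from this study:

```python
# Hypothetical loading-phase samples for one hop (values are illustrative):
angle = [0.00, 0.02, 0.04, 0.06, 0.08, 0.10]   # midfoot angle (rad)
torque = [0.0, 1.1, 2.3, 3.4, 4.4, 5.5]        # midfoot torque (N*m)

def quasi_stiffness(theta, tau):
    """Joint quasi-stiffness: least-squares slope of the torque-angle
    curve over the loading phase, in N*m/rad."""
    n = len(theta)
    mean_t, mean_q = sum(theta) / n, sum(tau) / n
    sxy = sum((t - mean_t) * (q - mean_q) for t, q in zip(theta, tau))
    sxx = sum((t - mean_t) ** 2 for t in theta)
    return sxy / sxx

print(quasi_stiffness(angle, torque))  # ~55 N*m/rad for these made-up values
```

A steeper torque-angle slope means a stiffer joint; the abstract's point is that a traditional single-segment ankle model biases this estimate downward relative to an anatomical model.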

    Neuromechanical adaptations of foot function when hopping on a damped surface

    This is the author accepted manuscript. The final version is available on open access from the American Physiological Society via the DOI in this record. To preserve motion, humans must adopt actuator-like dynamics to replace energy that is dissipated during contact with damped surfaces. Our ankle plantar flexors are credited as the primary source of work generation. Our feet and their intrinsic foot muscles also appear to be an important source of generative work, but their contributions to restoring energy to the body remain unclear. Here, we test the hypothesis that our feet help to replace work dissipated by a damped surface through controlled activation of the intrinsic foot muscles. We used custom-built platforms to provide both elastic and damped surfaces and asked participants to perform a bilateral hopping protocol on each. We recorded foot motion and ground reaction forces, alongside muscle activation, using intramuscular electromyography from flexor digitorum brevis, abductor hallucis, soleus and tibialis anterior. Hopping in the Damped condition resulted in significantly greater positive work and contact-phase muscle activation compared to the Elastic condition. The foot contributed 25% of the positive work performed about the ankle, highlighting the importance of the foot when humans adapt to different surfaces. Funded by the Australian Research Council (ARC) and the QUEX Institute.
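Positive joint work of the kind reported here is conventionally obtained by integrating joint power (torque times angular velocity) over the contact phase and keeping only the generative portion. A minimal sketch with hypothetical numbers, not data from the study:

```python
def positive_work(torque, omega, dt):
    """Positive mechanical work (J): sum joint power over time,
    counting only samples where power (torque * omega) is generative."""
    return sum(t * w * dt for t, w in zip(torque, omega) if t * w > 0)

# Hypothetical samples at 1 kHz over a short stretch of contact:
dt = 0.001                                             # s
torque = [-20, -10, 0, 30, 60, 80, 60, 30]             # N*m
omega = [1.0, 0.5, 0.0, -0.5, -1.0, 0.5, 1.5, 2.0]     # rad/s

print(positive_work(torque, omega, dt))  # 0.19 J for these made-up values
```

With foot and ankle power computed separately from the same trials, the foot's share of ankle positive work (the 25% figure in the abstract) is the ratio of the two integrals.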

    Factorization Properties of Soft Graviton Amplitudes

    We apply recently developed path integral resummation methods to perturbative quantum gravity. In particular, we provide supporting evidence that eikonal graviton amplitudes factorize into hard and soft parts, and confirm a recent hypothesis that soft gravitons are modelled by vacuum expectation values of products of certain Wilson line operators, which differ for massless and massive particles. We also investigate terms which break this factorization, and find that they are subleading with respect to the eikonal amplitude. The results may help in understanding the connections between gravity and gauge theories in more detail, as well as in studying gravitational radiation beyond the eikonal approximation. Comment: 35 pages, 5 figures.
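Schematically, the factorization being tested can be written as follows (the notation is illustrative, not taken from the paper): the eikonal amplitude splits into a hard function times a vacuum expectation value of Wilson-line operators, one per hard external particle:

```latex
% Schematic eikonal factorization: hard part times a soft function
% given by a VEV of products of Wilson-line operators.
\mathcal{A}_{\text{eik}} \;\simeq\; \mathcal{A}_{\text{hard}}
  \times \Big\langle\, 0 \,\Big|\, \prod_{i} W_i \,\Big|\, 0 \,\Big\rangle,
\qquad
W_i \;=\; \exp\!\left( \frac{i\kappa}{2} \int \! ds\,
  \dot{x}_i^{\mu}(s)\, \dot{x}_i^{\nu}(s)\,
  h_{\mu\nu}\big(x_i(s)\big) \right),
```

where $h_{\mu\nu}$ is the graviton field, $\dot{x}_i^{\mu}$ the trajectory of hard particle $i$, and $\kappa^2 = 32\pi G_N$; per the abstract, the precise form of $W_i$ differs between massless and massive external lines.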

    ARABIDOPSIS DEHISCENCE ZONE POLYGALACTURONASE 1 (ADPG1) releases latent defense signals in stems with reduced lignin content

    There is considerable interest in engineering plant cell wall components, particularly lignin, to improve forage quality and biomass properties for processing to fuels and bioproducts. However, modifying lignin content and/or composition in transgenic plants through down-regulation of lignin biosynthetic enzymes can induce expression of defense response genes in the absence of biotic or abiotic stress. Arabidopsis thaliana lines with altered lignin through down-regulation of hydroxycinnamoyl CoA:shikimate/quinate hydroxycinnamoyl transferase (HCT) or loss of function of cinnamoyl CoA reductase 1 (CCR1) express a suite of pathogenesis-related (PR) protein genes. The plants also exhibit extensive cell wall remodeling associated with induction of multiple cell wall-degrading enzymes, a process which renders the corresponding biomass a substrate for growth of the cellulolytic thermophile Caldicellulosiruptor bescii lacking a functional pectinase gene cluster. The cell wall remodeling also results in the release of size- and charge-heterogeneous pectic oligosaccharide elicitors of PR gene expression. Genetic analysis shows that both in planta PR gene expression and release of elicitors are the result of ectopic expression in xylem of the gene ARABIDOPSIS DEHISCENCE ZONE POLYGALACTURONASE 1 (ADPG1), which is normally expressed during anther and silique dehiscence. These data highlight the importance of pectin in cell wall integrity and the value of lignin modification as a tool to interrogate the informational content of plant cell walls.

    A Corroborative Approach to Verification and Validation of Human-Robot Teams

    We present an approach for the verification and validation (V&V) of robot assistants in the context of human-robot interactions (HRI), to demonstrate their trustworthiness through corroborative evidence of their safety and functional correctness. Key challenges include the complex and unpredictable nature of the real world in which assistant and service robots operate, the limitations on available V&V techniques when used individually, and the consequent lack of confidence in the V&V results. Our approach, called corroborative V&V, addresses these challenges by combining several different V&V techniques; in this paper we use formal verification (model checking), simulation-based testing, and user validation in experiments with a real robot. We demonstrate our corroborative V&V approach through a handover task, the most critical part of a complex cooperative manufacturing scenario, for which we propose some safety and liveness requirements to verify and validate. We construct formal models, simulations and an experimental test rig for the HRI. To capture requirements we use temporal logic properties, assertion checkers and textual descriptions. This combination of approaches allows V&V of the HRI task at different levels of modelling detail and thoroughness of exploration, thus overcoming the individual limitations of each technique. Should the resulting V&V evidence present discrepancies, an iterative process takes place between the different techniques, refining and improving the assets (i.e., system and requirement models) until they represent the HRI task more truthfully and the techniques corroborate one another. Therefore, corroborative V&V affords a systematic approach to 'meta-V&V,' in which different V&V techniques can be used to corroborate and check one another, increasing the level of certainty in the results of V&V.
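Requirements of the kind verified here are commonly written as temporal logic properties and also checked at runtime with assertion checkers over simulation traces. A hypothetical sketch of the latter, for a safety property of the shape "always (human in workspace implies robot speed below a threshold)"; the state fields and threshold are invented for illustration, not taken from the paper:

```python
# Runtime assertion checker for a safety property of the form
#   G (human_in_workspace -> robot_speed <= SAFE_SPEED)
# evaluated over a recorded simulation trace.
SAFE_SPEED = 0.25  # m/s; illustrative threshold

def holds_globally(trace):
    """True iff the safety property holds at every step of the trace."""
    return all(
        (not step["human_in_workspace"]) or step["robot_speed"] <= SAFE_SPEED
        for step in trace
    )

trace = [
    {"human_in_workspace": False, "robot_speed": 0.9},
    {"human_in_workspace": True,  "robot_speed": 0.2},
    {"human_in_workspace": True,  "robot_speed": 0.1},
]
print(holds_globally(trace))  # True for this trace
```

The same property, phrased in temporal logic for the model checker and as an assertion checker for simulation, is what allows the evidence from the two techniques to be compared and corroborated.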

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    © 2013 Martin et al.; licensee BioMed Central Ltd. This article has been made available through the Brunel Open Access Publishing Fund. Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can account both for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data and recommend adopting a pragmatic, but scientifically better founded, approach to mixture risk assessment. Funded by the Oak Foundation.
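The default factor of 100 is conventionally the product of a 10-fold interspecies and a 10-fold intraspecies factor. The abstract's point about probabilistic approaches can be illustrated with a short Monte Carlo sketch: treating two sub-factors as lognormal distributions with median 1, the combined "conservative" factor at a given percentile depends on the geometric standard deviations assumed (the values below are hypothetical, not regulatory figures):

```python
import math
import random

random.seed(1)

def product_percentile(gsd_a, gsd_b, p=0.95, n=100_000):
    """Monte Carlo p-th percentile of the product of two lognormal
    sub-factors with median 1 and the given geometric standard deviations."""
    draws = sorted(
        math.exp(random.gauss(0, math.log(gsd_a)) +
                 random.gauss(0, math.log(gsd_b)))
        for _ in range(n)
    )
    return draws[int(p * n)]

# Two distribution choices, same medians, different combined 95th percentiles:
print(round(product_percentile(3.0, 3.0), 1))
print(round(product_percentile(2.0, 4.5), 1))
```

The two runs give noticeably different combined factors, which is the dependence on distribution choice that the abstract flags.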

    Conditional Facilitation of an Aphid Vector, Acyrthosiphon pisum, by the Plant Pathogen, Pea Enation Mosaic Virus

    Plant pathogens can induce symptoms that affect the performance of insect herbivores utilizing the same host plant. Previous studies examining the effects of infection of tic bean, Vicia faba L. (Fabales: Fabaceae), by pea enation mosaic virus (PEMV), an important disease of legume crops, indicated there were no changes in the growth and reproductive rate of its primary vector, the pea aphid Acyrthosiphon pisum (Harris) (Hemiptera: Aphididae). Here, we report the results of laboratory experiments investigating how A. pisum responded to PEMV infection of a different host plant, Pisum sativum L., at different stages of symptom development. Aphid growth rate was negatively related to the age of the host plant, but aphids introduced onto older plants with well-developed PEMV symptoms exhibited a higher growth rate than those developing on uninfected plants of the same age. In choice tests using leaf discs, A. pisum showed a strong preference for discs from PEMV-infected peas, probably in response to visual cues from the yellowed and mottled infected leaves. When adults were crowded onto leaves using clip-cages, they produced more winged progeny on PEMV-infected plants. The results indicate that PEMV produces symptoms in the host plant that can enhance the performance of A. pisum as a vector, modify the production of winged progeny and affect their spatial distribution. The findings provide further evidence that some insect vector/plant pathogen interactions could be regarded as mutualistic rather than commensal when certain conditions regarding the age, stage of infection and species of host plant are met.

    Informing the design of a national screening and treatment programme for chronic viral hepatitis in primary care: qualitative study of at-risk immigrant communities and healthcare professionals

    This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited. This paper presents independent research funded by the National Institute for Health Research (NIHR) under the Programme Grants for Applied Research programme (RP-PG-1209-10038).

    Patient-centred standards of care for adults with myositis


    Toward Engineering Biosystems With Emergent Collective Functions

    Many complex behaviors in biological systems emerge from large populations of interacting molecules or cells, generating functions that go beyond the capabilities of the individual parts. Such collective phenomena are of great interest to bioengineers due to their robustness and scalability. However, engineering emergent collective functions is difficult because they arise as a consequence of complex multi-level feedback, which often spans many length-scales. Here, we present a perspective on how some of these challenges could be overcome by using multi-agent modeling as a design framework within synthetic biology. Using case studies ranging from the construction of synthetic ecologies to biological computation and synthetic cellularity, we show how multi-agent modeling can capture the core features of complex multi-scale systems and provide novel insights into the underlying mechanisms that guide emergent functionalities across scales. The ability to unravel the design rules underpinning these behaviors offers a means to take synthetic biology beyond single molecules or cells and toward the creation of systems with functions that can only emerge from collectives at multiple scales.
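A toy instance of the multi-agent modeling the perspective advocates (entirely hypothetical, not a model from the paper): agents that each follow a simple local rule, copying the majority state of a few randomly sampled peers, collectively converge on a population-wide consensus that no individual agent's rule encodes.

```python
import random

random.seed(0)

def simulate(n_agents=200, steps=50_000):
    """Toy agent-based model: each agent holds a binary state; at every
    step one randomly chosen agent adopts the majority state of three
    randomly sampled peers. Consensus emerges from local rules only."""
    states = [random.randint(0, 1) for _ in range(n_agents)]
    for _ in range(steps):
        i = random.randrange(n_agents)
        peers = random.sample(range(n_agents), 3)
        states[i] = 1 if sum(states[j] for j in peers) >= 2 else 0
    return states

final = simulate()
# Consensus fraction; majority dynamics typically drives this to 0.0 or 1.0.
print(round(sum(final) / len(final), 2))
```

The point of framing the system this way is that the collective outcome (consensus) can be studied and tuned by varying agent-level parameters (population size, sampling rule), which is the design loop multi-agent modeling offers synthetic biology.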