
    Regional Growth Rate Differences Specified by Apical Notch Activities Regulate Liverwort Thallus Shape

    Plants have undergone 470 million years of evolution on land and different groups have distinct body shapes. Liverworts are the most ancient land plant lineage and have a flattened, creeping body (the thallus), which grows from apical cells in an invaginated "notch." The genetic mechanisms regulating liverwort shape are almost totally unknown, yet they provide a blueprint for the radiation of land plant forms. We have used a combination of live imaging, growth analyses, and computational modeling to determine what regulates liverwort thallus shape in Marchantia polymorpha. We find that the thallus undergoes a stereotypical sequence of shape transitions during the first 2 weeks of growth and that key aspects of global shape depend on regional growth rate differences generated by the coordinated activities of the apical notches. A "notch-drives-growth" model, in which a diffusible morphogen produced at each notch promotes specified isotropic growth, can reproduce the growth rate distributions that generate thallus shape, given growth suppression at the apex. However, in surgical experiments, tissue growth persists following notch excision, showing that this model is insufficient to explain thallus growth. In an alternative "notch-pre-patterns-growth" model, a persistently acting growth regulator whose distribution is pre-patterned by the notches can account for the discrepancies between growth dynamics in the notch-drives-growth model and real plants following excision. Our work shows that growth rate heterogeneity is the primary shape determinant in Marchantia polymorpha and suggests that the thallus is likely to have zones with specialized functions. We thank the BBSRC (BB/F016581/1) for funding J.E.S.'s PhD research and the Gatsby Charitable Foundation (GAT2962) and the Royal Society (RG54416) for funding C.J.H.'s research.
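The "notch-drives-growth" idea can be sketched in a few lines: a morphogen is produced at a notch, diffuses along the tissue, and decays, so local growth rate falls off with distance from the notch. The following is a minimal illustrative model, not the authors' implementation; all parameters and the 1D geometry are our assumptions.

```python
# Minimal 1D sketch of a "notch-drives-growth" style model: a morphogen
# is produced at one end of the tissue (the notch), diffuses, and decays;
# local growth rate is taken to be proportional to its concentration.
# Parameters are illustrative only.

def simulate_morphogen(n_cells=50, D=0.2, decay=0.05, production=1.0,
                       steps=5000, dt=0.1):
    c = [0.0] * n_cells
    for _ in range(steps):
        new = c[:]
        for i in range(n_cells):
            # zero-flux boundaries: mirror the edge cells
            left = c[i - 1] if i > 0 else c[i]
            right = c[i + 1] if i < n_cells - 1 else c[i]
            new[i] = c[i] + dt * (D * (left - 2 * c[i] + right) - decay * c[i])
        new[0] += dt * production  # morphogen produced at the notch
        c = new
    return c

profile = simulate_morphogen()
# growth rate assumed proportional to morphogen level: highest at the notch
growth = [0.1 * v for v in profile]
```

In this toy version the concentration (and hence growth rate) decays away from the notch, which is the qualitative behaviour the model needs in order to concentrate growth near the apical regions.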

    Cost-effective control of plant disease when epidemiological knowledge is incomplete: modelling Bahia bark scaling of citrus.

    A spatially-explicit, stochastic model is developed for Bahia bark scaling, a threat to citrus production in north-eastern Brazil, and is used to assess epidemiological principles underlying the cost-effectiveness of disease control strategies. The model is fitted via Markov chain Monte Carlo with data augmentation to snapshots of disease spread derived from a previously-reported multi-year experiment. Goodness-of-fit tests strongly supported the fit of the model, even though the detailed etiology of the disease is unknown and was not explicitly included in the model. Key epidemiological parameters including the infection rate, incubation period and scale of dispersal are estimated from the spread data. This allows us to scale up the experimental results to predict the effect of the level of initial inoculum on disease progression in a typically-sized citrus grove. The efficacies of two cultural control measures are assessed: altering the spacing of host plants, and roguing symptomatic trees. Reducing planting density can slow disease spread significantly if the distance between hosts is sufficiently large. However, low density groves have fewer plants per hectare. The optimum density of productive plants is therefore recovered at an intermediate host spacing. Roguing, even when detection of symptomatic plants is imperfect, can lead to very effective control. However, scouting for disease symptoms incurs a cost. We use the model to balance the cost of scouting against the number of plants lost to disease, and show how to determine a roguing schedule that optimises profit. The trade-offs underlying the two optima we identify, the optimal host spacing and the optimal roguing schedule, are applicable to many pathosystems. Our work demonstrates how a carefully parameterised mathematical model can be used to find these optima. It also illustrates how mathematical models can be used in even this most challenging of situations in which the underlying epidemiology is ill-understood. FFL was funded via a CNPq Fellowship (Brazil's National Council for Scientific and Technological Development, see http://memoria.cnpq.br/english/cnpq/index.htm). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
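The host-spacing trade-off described above (denser planting means more plants per hectare but faster spread, so productive yield peaks at an intermediate density) can be illustrated with a toy calculation. This is not the paper's fitted model; the incidence curve and its parameter are arbitrary stand-ins chosen only to show the shape of the trade-off.

```python
# Toy illustration of the host-spacing trade-off (not the paper's model):
# final disease incidence is assumed to rise with planting density via an
# arbitrary saturating curve, so the count of healthy, productive plants
# per hectare peaks at an intermediate density.

import math

def productive_plants(density, beta=0.004):
    """Expected healthy plants per hectare under a toy incidence curve."""
    incidence = 1.0 - math.exp(-beta * density)  # rises with density
    return density * (1.0 - incidence)

densities = range(50, 1001, 10)  # candidate plants per hectare
best = max(densities, key=productive_plants)
print(best)  # 250: the intermediate optimum for this toy curve
```

With this functional form the optimum sits at 1/beta, i.e. neither the sparsest nor the densest planting maximises productive plants, mirroring the intermediate host spacing identified in the study.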

    Pyrenoid loss in Chlamydomonas reinhardtii causes limitations in CO2 supply, but not thylakoid operating efficiency

    The pyrenoid of the unicellular green alga Chlamydomonas reinhardtii is a microcompartment situated in the centre of the cup-shaped chloroplast, containing up to 90% of cellular Rubisco. Traversed by a network of dense, knotted thylakoid tubules, the pyrenoid has been proposed to influence thylakoid biogenesis and ultrastructure. Mutants that are unable to assemble a pyrenoid matrix, due to expressing a vascular plant version of the Rubisco small subunit, exhibit severe growth and photosynthetic defects and have an ineffective carbon-concentrating mechanism (CCM). The present study set out to determine the cause of photosynthetic limitation in these pyrenoid-less lines. We tested whether electron transport and light use were compromised as a direct structural consequence of pyrenoid loss or as a metabolic effect downstream of lower CCM activity and resulting CO₂ limitation. Thylakoid organization was unchanged in the mutants, including the retention of intrapyrenoid-type thylakoid tubules, and photosynthetic limitations associated with the absence of the pyrenoid were rescued by exposing cells to elevated CO₂ levels. These results demonstrate that Rubisco aggregation in the pyrenoid functions as an essential element for CO₂ delivery as part of the CCM, and does not play other roles in maintenance of photosynthetic membrane energetics. We wish to gratefully acknowledge financial support to ODC by Wolfson College, the Cambridge Philosophical Society and the TH Middleton Fund (Department of Plant Sciences) toward research-related travel, which was crucial to enable this collaborative study. This work was supported by the Biotechnology and Biological Sciences Research Council (PhD studentship 1090746 to ODC and BB/M007693/1 to MTM and HG). The work was also supported by NSF grant MCB 0951094 and US Department of Energy Grants DE-FG02-07ER64427 and DE-FG02-12ER16338 awarded to ARG.

    Three Aphid-Transmitted Viruses Encourage Vector Migration From Infected Common Bean (Phaseolus vulgaris) Plants Through a Combination of Volatile and Surface Cues

    Bean common mosaic virus (BCMV), bean common mosaic necrosis virus (BCMNV), and cucumber mosaic virus (CMV) are important pathogens of common bean (Phaseolus vulgaris), a crop vital for food security in sub-Saharan Africa. These viruses are vectored by aphids non-persistently, with virions bound loosely to stylet receptors. These viruses also manipulate aphid-mediated transmission by altering host properties. Virus-induced effects on host-aphid interactions were investigated using choice test (migration) assays, olfactometry, and analysis of insect-perceivable volatile organic compounds (VOCs) using gas chromatography (GC)-coupled mass spectrometry and GC-coupled electroantennography. When allowed to choose freely between infected and uninfected plants, aphids of the legume specialist species Aphis fabae, and of the generalist species Myzus persicae, were repelled by plants infected with BCMV, BCMNV, or CMV. However, in olfactometer experiments with A. fabae, only the VOCs emitted by BCMNV-infected plants repelled aphids. Although BCMV, BCMNV, and CMV each induced distinctive changes in emission of aphid-perceivable volatiles, all three suppressed emission of an attractant sesquiterpene, α-copaene, suggesting these three different viruses promote migration of virus-bearing aphids in a similar fashion.

    How achievable are COVID-19 clinical trial recruitment targets? A UK observational cohort study and trials registry analysis

    OBJECTIVES: To analyse enrolment to interventional trials during the first wave of the COVID-19 pandemic in England and describe the barriers to successful recruitment in the circumstance of a further wave or future pandemics. DESIGN: We analysed registered interventional COVID-19 trial data and concurrently did a prospective observational study of hospitalised patients with COVID-19 who were being assessed for eligibility to one of the RECOVERY, C19-ACS or SIMPLE trials. SETTING: Interventional COVID-19 trial data were analysed from the clinicaltrials.gov and International Standard Randomized Controlled Trial Number databases on 12 July 2020. The patient cohort was taken from five centres in a respiratory National Institute for Health Research network. Population and modelling data were taken from published reports from the UK government and Medical Research Council Biostatistics Unit. PARTICIPANTS: 2082 consecutive admitted patients with laboratory-confirmed SARS-CoV-2 infection from 27 March 2020 were included. MAIN OUTCOME MEASURES: Proportions enrolled, and reasons for exclusion from the aforementioned trials. Comparisons of trial recruitment targets with estimated feasible recruitment numbers. RESULTS: Analysis of trial registration data for COVID-19 treatment studies enrolling in England showed that by 12 July 2020, 29 142 participants were needed. In the observational study, 430 (20.7%) proceeded to randomisation. 82 (3.9%) declined participation, 699 (33.6%) were excluded on clinical grounds, 363 (17.4%) were medically fit for discharge and 153 (7.3%) were receiving palliative care. With 111 037 people hospitalised with COVID-19 in England by 12 July 2020, we determine that 22 985 people were potentially suitable for trial enrolment. We estimate a UK hospitalisation rate of 2.38%, and that another 1.25 million infections would be required to meet recruitment targets of ongoing trials. 
CONCLUSIONS: Feasible recruitment rates, study design and the proliferation of trials can limit the number, and size, of studies that successfully complete recruitment. We consider that fewer, more appropriately designed trials prioritising cooperation between centres would maximise productivity in a further wave.
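The recruitment-feasibility figures reported above can be reproduced with back-of-envelope arithmetic. The numbers are taken from the abstract, but the chain of reasoning (treating the shortfall as target minus those already potentially suitable) is our reconstruction, not necessarily the authors' exact calculation.

```python
# Back-of-envelope reconstruction of the reported recruitment arithmetic.
# Figures come from the abstract; the derivation is an illustrative
# reconstruction, not the authors' exact method.

target = 29_142         # participants needed by registered trials (12 July 2020)
suitable = 22_985       # potentially suitable among those already hospitalised
hospitalised = 111_037  # hospitalised with COVID-19 in England by 12 July 2020
hosp_rate = 0.0238      # estimated UK hospitalisation rate per infection

eligible_fraction = suitable / hospitalised          # ~0.207
shortfall = target - suitable                        # participants still needed
extra_hospitalisations = shortfall / eligible_fraction
extra_infections = extra_hospitalisations / hosp_rate

print(round(extra_infections / 1e6, 2))  # 1.25 (million further infections)
```

Under these assumptions the shortfall of roughly 6,200 participants implies about 30,000 additional hospitalisations, consistent with the abstract's figure of around 1.25 million further infections.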

    Estimating the delay between host infection and disease (incubation period) and assessing its significance to the epidemiology of plant diseases.

    Knowledge of the incubation period of infectious diseases (time between host infection and expression of disease symptoms) is crucial to our epidemiological understanding and the design of appropriate prevention and control policies. Plant diseases cause substantial damage to agricultural and arboricultural systems, but there is still very little information about how the incubation period varies within host populations. In this paper, we focus on the incubation period of soilborne plant pathogens, which are difficult to detect as they spread and infect the hosts underground and above-ground symptoms occur considerably later. We conducted experiments on Rhizoctonia solani in sugar beet, as an example pathosystem, and used modelling approaches to estimate the incubation period distribution and demonstrate the impact of differing estimations on our epidemiological understanding of plant diseases. We present measurements of the incubation period obtained in field conditions, fit alternative probability models to the data, and show that the incubation period distribution changes with host age. By simulating spatially-explicit epidemiological models with different incubation-period distributions, we study the conditions for a significant time lag between epidemics of cryptic infection and the associated epidemics of symptomatic disease. We examine the sensitivity of this lag to differing distributional assumptions about the incubation period (i.e. exponential versus Gamma). We demonstrate that accurate information about the incubation period distribution of a pathosystem can be critical in assessing the true scale of pathogen invasion behind early disease symptoms in the field; likewise, it can be central to model-based prediction of epidemic risk and evaluation of disease management strategies. Our results highlight that reliance on observation of disease symptoms can cause significant delay in detection of soil-borne pathogen epidemics and mislead practitioners and epidemiologists about the timing, extent, and viability of disease control measures for limiting economic loss. ML thanks the Institut Technique français de la Betterave industrielle (ITB) for funding this project. CAG and JANF were funded by the UK's Biotechnology and Biological Sciences Research Council (BBSRC). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
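Fitting alternative probability models to incubation-period data, as described above, can be illustrated with a method-of-moments Gamma fit. The sample values below are hypothetical and the estimator is our choice; the paper fits models to real Rhizoctonia solani field data.

```python
# Toy method-of-moments fit of a Gamma distribution to hypothetical
# incubation periods (days). Illustrative only: the study fits
# probability models to real R. solani / sugar beet field measurements.

import statistics

incubation_days = [5, 7, 8, 9, 10, 12, 14, 15, 18, 22]  # hypothetical sample

mean = statistics.mean(incubation_days)
var = statistics.variance(incubation_days)  # sample variance

# For Gamma(shape k, scale theta): mean = k*theta, variance = k*theta^2,
# so method-of-moments gives:
shape = mean ** 2 / var
scale = var / mean

# An exponential model forces shape = 1; here shape is well above 1,
# meaning the sample is less dispersed than an exponential with the same
# mean, which is exactly the kind of distributional difference that can
# change the inferred lag between cryptic infection and visible disease.
```

Whether an exponential or Gamma shape is assumed matters because the exponential's mode at zero implies many near-instant symptom expressions, while a Gamma with shape above 1 concentrates symptom onset around a delayed peak.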