
    Greenland records of aerosol source and atmospheric lifetime changes from the Eemian to the Holocene

    The Northern Hemisphere experienced dramatic changes during the last glacial, featuring vast ice sheets and abrupt climate events, while high northern latitudes during the last interglacial (Eemian) were warmer than today. Here we use high-resolution aerosol records from the Greenland NEEM ice core to reconstruct the environmental alterations in aerosol source regions that accompanied these changes. Separating source and transport effects, we find strongly reduced terrestrial biogenic emissions during glacial times, reflecting a net loss of vegetated area in North America. Rapid climate changes during the glacial have little effect on terrestrial biogenic aerosol emissions. A strong increase in terrestrial dust emissions during the coldest intervals indicates higher aridity and dust storm activity in East Asian deserts. Glacial sea salt aerosol emissions in the North Atlantic region increase only moderately (50%), likely due to sea ice expansion. Lower aerosol concentrations in Eemian ice compared to the Holocene are mainly due to a shortened atmospheric residence time, while emissions changed little. This work is supported by funding agencies and institutions in Belgium (FNRS-CFB and FWO), Canada (NRCan/GSC), China (CAS), Denmark (FIST), France (IPEV, CNRS/INSU, CEA and ANR), Germany (AWI), Iceland (RannIs), Japan (NIPR), Korea (KOPRI), The Netherlands (NWO/ALW), Sweden (VR), Switzerland (SNF), United Kingdom (NERC), and the USA (US NSF, Office of Polar Programs). Long-term support of ice core research at the University of Bern by SNF is gratefully acknowledged.

    In vivo imaging of lymphocytes in the CNS reveals different behaviour of naïve T cells in health and autoimmunity

    Background: Two-photon laser scanning microscopy (TPLSM) has become a powerful tool for visualizing immune cell dynamics and cellular communication within the complex biological networks of the inflamed central nervous system (CNS). Whereas many previous studies focused mainly on the role of effector or effector memory T cells, the role of naïve T cells as possible key players in immune regulation directly in the CNS is still highly debated.
    Methods: We applied ex vivo and intravital TPLSM to investigate the migratory pathways of naïve T cells in the inflamed and non-inflamed CNS. MACS-sorted naïve CD4+ T cells were either applied to healthy CNS slices or injected intravenously into RAG1 -/- mice affected by experimental autoimmune encephalomyelitis (EAE). We further checked for the generation of second harmonic generation (SHG) signals produced by extracellular matrix (ECM) structures.
    Results: By applying TPLSM to living brain slices we could show that the migratory capacity of activated CD4+ T cells is not strongly influenced by antigen specificity and is independent of a regulatory or effector T cell phenotype. Naïve T cells, however, cannot find sufficient migratory signals in healthy, non-inflamed CNS parenchyma, where they showed only stationary behaviour. This is in contrast to the high motility of naïve CD4+ T cells in lymphoid organs. In inflamed brain tissue of living EAE-affected mice we observed a highly motile migration pattern for naïve T cells compared to effector CD4+ T cells. Interestingly, in the inflamed CNS we could detect reticular structures by their SHG signal which partially co-localise with naïve CD4+ T cell tracks.
    Conclusions: The activation status, rather than antigen specificity or regulatory phenotype, is the central requirement for CD4+ T cell migration within healthy CNS tissue. Under inflammatory conditions, however, naïve CD4+ T cells can gain access to the CNS parenchyma and partially migrate along inflammation-induced extracellular SHG structures, which are similar to those seen in lymphoid organs. These SHG structures apparently provide essential migratory signals for naïve CD4+ T cells within the diseased CNS.

    Machine Learning Feature Selection Methods for Landslide Susceptibility Mapping

    This paper explores the use of adaptive support vector machines, random forests and AdaBoost for landslide susceptibility mapping in three separate regions of Canton Vaud, Switzerland, based on a set of geological, hydrological and morphological features. The feature selection properties of the three algorithms are studied to analyze the relevance of features in controlling the spatial distribution of landslides. The elimination of irrelevant features gives simpler, lower-dimensional models while keeping the classification performance high. An object-based sampling procedure is used to reduce the spatial autocorrelation of the data and to estimate generalization skill more reliably when applying the model to predict the occurrence of new, unknown landslides. The accuracy of the models, the relevance of features and the quality of the landslide susceptibility maps were found to be high in the regions characterized by shallow landslides and low in those with deep-seated landslides. Despite providing similar skill, random forests and AdaBoost were found to be more efficient than adaptive support vector machines at performing feature selection. The results of this study reveal the strengths of the classification algorithms, but also highlight: (1) the need to rely on more than one method for the identification of relevant variables; (2) the weakness of the adaptive scaling algorithm when used with landslide data; and (3) the lack of additional features that characterize the spatial distribution of deep-seated landslides.
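
The feature-relevance idea above can be illustrated with a small, hedged sketch: rank features by permutation importance, i.e. shuffle one feature column and measure the resulting drop in accuracy. This is a generic stand-in, not the paper's adaptive-scaling or embedded selection procedures; the model and data are invented.

```python
# Hypothetical sketch of permutation-based feature relevance (illustrative
# only). Shuffling a relevant feature should degrade accuracy; shuffling an
# irrelevant one should not.
import random

def accuracy(model, X, y):
    return sum(model(row) == label for row, label in zip(X, y)) / len(y)

def permutation_importance(model, X, y, n_features, seed=0):
    rng = random.Random(seed)
    base = accuracy(model, X, y)
    scores = []
    for j in range(n_features):
        col = [row[j] for row in X]
        rng.shuffle(col)                       # break the feature-label link
        X_perm = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, col)]
        scores.append(base - accuracy(model, X_perm, y))
    return scores                              # large drop => relevant feature
```

Applied to the landslide setting, `model` would be a trained classifier and `X` the geological, hydrological and morphological features; features with near-zero importance are candidates for elimination.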

    3D cut-cell modelling for high-resolution atmospheric simulations

    Owing to the recent, rapid development of computer technology, the resolution of atmospheric numerical models has increased substantially. With the use of next-generation supercomputers, atmospheric simulations using horizontal grid intervals of O(100) m or less will gain popularity. At such high resolution more of the steep gradients in mountainous terrain will be resolved, which may result in large truncation errors in models using terrain-following coordinates. In this study, a new 3D Cartesian-coordinate non-hydrostatic atmospheric model is developed. A cut-cell representation of topography based on finite-volume discretization is combined with a cell-merging approach, in which small cut-cells are merged with neighboring cells either vertically or horizontally. In addition, a block-structured mesh-refinement technique is introduced to achieve a variable resolution on the model grid, with the finest resolution occurring close to the terrain surface. The model successfully reproduces a flow over a 3D bell-shaped hill that shows good agreement with the flow predicted by linear theory. The ability of the model to simulate flows over steep terrain is demonstrated using a hemisphere-shaped hill where the maximum resolved slope angle is 71 degrees. The advantage of a locally refined grid around a 3D hill, with cut-cells at the terrain surface, is also demonstrated using the hemisphere-shaped hill. The model reproduces smooth mountain waves propagating over varying grid resolution without introducing large errors associated with the change of mesh resolution. At the same time, the model shows good scalability on a locally refined grid with the use of OpenMP. Comment: 19 pages, 16 figures. Revised version, accepted for publication in QJRMS.
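
The cell-merging step can be sketched in one dimension (my own simplification, not the paper's implementation): walk up a column of cut-cells and group cells until the accumulated open-volume fraction reaches a threshold, so that no merged control volume becomes arbitrarily small and the time-step restriction stays bounded. The 0.5 threshold is an illustrative assumption.

```python
# Minimal 1D sketch of vertical cell merging for cut-cells. Each entry of
# volume_fractions is the open (fluid) fraction of a cell, bottom to top.
def merge_small_cells(volume_fractions, threshold=0.5):
    """Return groups of vertically merged cell indices, bottom to top."""
    merged, current, acc = [], [], 0.0
    for k, frac in enumerate(volume_fractions):
        current.append(k)
        acc += frac
        if acc >= threshold:          # group is large enough; close it
            merged.append(current)
            current, acc = [], 0.0
    if current:                       # leftover small cells join the top group
        if merged:
            merged[-1].extend(current)
        else:
            merged.append(current)
    return merged
```

For example, a column with fractions [0.1, 0.3, 0.2, 1.0, 1.0] yields one merged cell spanning the three small cut-cells plus two ordinary cells.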

    Using 81Kr and Noble Gases to Characterize and Date Groundwater and Brines in the Baltic Artesian Basin on the One-Million-Year Timescale

    Analyses for 81Kr and noble gases on groundwater from the deepest aquifer system of the Baltic Artesian Basin (BAB) were performed to determine groundwater ages and uncover the flow dynamics of the system on a timescale of several hundred thousand years. We find that the system is controlled by mixing of three distinct water masses: interglacial or recent meteoric water (δ18O ≈ −10.4‰) with a poorly evolved chemical and noble gas signature, glacial meltwater (δ18O ≤ −18‰) with elevated noble gas concentrations, and an old, high-salinity brine component (δ18O ≥ −4.5‰, ≥ 90 g Cl−/L) with strongly depleted atmospheric noble gas concentrations. The 81Kr measurements are interpreted within this mixing framework to estimate the age of the end-members. Deconvoluted 81Kr ages range from 300 ka to 1.3 Ma for the interglacial or recent meteoric water and the glacial meltwater. For the brine component, ages exceed the dating range of the ATTA 3 instrument of 1.3 Ma. The radiogenic noble gas components 4He* and 40Ar* are less conclusive but also support an age of > 1 Ma for the brine. Based on the chemical and noble gas concentrations and the dating results, we conclude that the brine originates from evaporated seawater that was modified by later water-rock interaction. As the obtained tracer ages cover several glacial cycles, we discuss the impact of the glacial cycles on flow patterns in the studied aquifer system. Comment: Accepted for publication in Geochimica et Cosmochimica Acta.
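
The three-end-member deconvolution can be sketched as a small linear inversion: two conservative tracers (δ18O and Cl−) plus the mass-balance constraint f1 + f2 + f3 = 1 determine the three mixing fractions. The end-member values in the example below are illustrative placeholders, not the study's fitted values.

```python
# Hedged sketch of three-end-member mixing: solve a 3x3 linear system
# (mass balance plus two tracer balances) by Cramer's rule.
def mixing_fractions(sample, end_members):
    """sample = (d18O, Cl); end_members = three (d18O, Cl) tuples."""
    (a1, c1), (a2, c2), (a3, c3) = end_members
    d18o, cl = sample

    def det(m):  # determinant of a 3x3 matrix
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    # Rows: f1+f2+f3 = 1, then tracer balances for d18O and Cl.
    A = [[1.0, 1.0, 1.0], [a1, a2, a3], [c1, c2, c3]]
    b = [1.0, d18o, cl]
    D = det(A)
    fractions = []
    for j in range(3):                 # Cramer's rule, column by column
        Aj = [row[:] for row in A]
        for i in range(3):
            Aj[i][j] = b[i]
        fractions.append(det(Aj) / D)
    return fractions
```

With illustrative end-members such as (−10.4‰, 0.1 g/L) meteoric, (−18‰, 0.2 g/L) glacial and (−4.5‰, 90 g/L) brine, each sample's (δ18O, Cl−) pair maps to a unique fraction triple, which is the framework within which the 81Kr ages are then deconvoluted.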

    A framework for parameter estimation and model selection from experimental data in systems biology using approximate Bayesian computation.

    As modeling becomes a more widespread practice in the life and biomedical sciences, researchers need reliable tools to calibrate models against ever more complex and detailed data. Here we present an approximate Bayesian computation (ABC) framework and software environment, ABC-SysBio, a Python package for Linux and Mac OS X that enables parameter estimation and model selection in the Bayesian formalism using sequential Monte Carlo (SMC) approaches. We outline the underlying rationale, discuss the computational and practical issues, and provide detailed guidance on how the important tasks of parameter inference and model selection can be performed in practice. Unlike other available packages, ABC-SysBio is particularly well suited to the challenging problem of fitting stochastic models to data. To demonstrate the use of ABC-SysBio, in this protocol we postulate the existence of an imaginary reaction network composed of seven interrelated biological reactions (involving a specific mRNA, the protein it encodes and a post-translationally modified version of the protein), a network defined by two files containing 'observed' data that we provide as supplementary information. In the first part of the PROCEDURE, ABC-SysBio is used to infer the parameters of this system; in the second part we use ABC-SysBio's relevant functionality to discriminate between two different reaction network models, one of them being the 'true' one. Although computationally expensive, the additional insights gained in the Bayesian formalism more than make up for this cost, especially in complex problems.
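
To make the ABC idea concrete, here is a minimal rejection-ABC sketch in plain Python. ABC-SysBio itself implements the more efficient sequential Monte Carlo variant; the toy model, prior and tolerance below are illustrative choices, not the package's API.

```python
# Rejection ABC: keep prior draws whose simulated data fall within a
# tolerance eps of the observation; the kept draws approximate the posterior.
import random

def rejection_abc(observed, simulate, prior_sample, distance, eps, n, seed=0):
    rng = random.Random(seed)
    accepted = []
    for _ in range(n):
        theta = prior_sample(rng)                   # draw a candidate parameter
        if distance(simulate(theta, rng), observed) < eps:
            accepted.append(theta)                  # approximate posterior sample
    return accepted

# Toy usage: infer the mean of a Gaussian from an observed sample mean of 2.0.
simulate = lambda theta, rng: sum(rng.gauss(theta, 1.0) for _ in range(20)) / 20
prior_sample = lambda rng: rng.uniform(0.0, 4.0)
distance = lambda a, b: abs(a - b)
posterior = rejection_abc(2.0, simulate, prior_sample, distance, 0.3, 2000)
```

Because acceptance is based only on a distance between simulated and observed data, the same loop works for stochastic models with intractable likelihoods, which is the setting ABC-SysBio targets.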

    Model selection in historical research using approximate Bayesian computation

    Formal Models and History: Computational models are increasingly being used to study historical dynamics. This new trend, which could be named Model-Based History, makes use of recently published datasets and innovative quantitative methods to improve our understanding of past societies based on their written sources. The extensive use of formal models allows historians to reevaluate hypotheses formulated decades ago that are still subject to debate due to the lack of an adequate quantitative framework. The initiative has the potential to transform the discipline if it solves the challenges posed by the study of historical dynamics. These difficulties stem from the complexities of modelling social interaction and from the methodological issues raised by evaluating formal models against data with low sample size, high variance and strong fragmentation. This work examines an alternative approach to this evaluation based on a Bayesian-inspired model selection method. The validity of the classical Lanchester laws of combat is examined against a dataset comprising over a thousand battles spanning 300 years. Four variations of the basic equations are discussed, including the three most common formulations (linear, squared, and logarithmic) and a new variant introducing fatigue. Approximate Bayesian computation is then used to infer both parameter values and model selection via Bayes factors. Results indicate decisive evidence favouring the new fatigue model. The interpretation of both the parameter estimates and the model selection provides new insights into the factors guiding the evolution of warfare.
    At a methodological level, the case study shows how model selection methods can be used to guide historical research through the comparison between existing hypotheses and empirical evidence. Funding for this work was provided by the SimulPast Consolider Ingenio project (CSD2010-00034) of the former Ministry for Science and Innovation of the Spanish Government and the European Research Council Advanced Grant EPNet (340828).
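
As a hedged sketch of the approach (not the paper's code, and using only the two textbook Lanchester laws rather than the four variants studied), ABC model selection can approximate a Bayes factor by the ratio of acceptance counts when both models are proposed with equal prior probability. All initial strengths, priors and tolerances below are illustrative.

```python
# Toy ABC model choice between the Lanchester linear and square laws.
import random

def lanchester(x0, y0, kappa, law, dt=0.01, steps=1000):
    """Forward-Euler integration of a Lanchester combat model."""
    x, y = x0, y0
    for _ in range(steps):
        if law == "linear":                     # dx/dt = dy/dt = -k*x*y
            dx = dy = -kappa * x * y
        else:                                   # square law: dx/dt = -k*y
            dx, dy = -kappa * y, -kappa * x
        x, y = max(x + dx * dt, 0.0), max(y + dy * dt, 0.0)
    return x, y

def abc_model_choice(observed, kappa_prior, eps, n, seed=0):
    rng = random.Random(seed)
    counts = {"linear": 0, "square": 0}
    for _ in range(n):
        law = rng.choice(["linear", "square"])  # uniform prior over models
        kappa = kappa_prior(rng)
        sim = lanchester(10.0, 5.0, kappa, law)
        if abs(sim[0] - observed[0]) + abs(sim[1] - observed[1]) < eps:
            counts[law] += 1                    # accepted draw for this model
    return counts                               # count ratio ~ Bayes factor
```

Run against an outcome generated by one of the laws, the acceptance counts favour the generating model, mirroring how the paper's Bayes factors decisively favoured the fatigue variant over the classical formulations.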

    Bayesian computation: a summary of the current state, and samples backwards and forwards
