
    Oscillatory behavior of two nonlinear microbial models of soil carbon decomposition

    A number of nonlinear models have recently been proposed for simulating soil carbon decomposition. Their predictions of soil carbon responses to fresh litter input and warming differ significantly from those of conventional linear models. Using both stability analysis and numerical simulations, we showed that two of those nonlinear models (a two-pool model and a three-pool model) exhibit damped oscillatory responses to small perturbations. Stability analysis showed that the frequency of oscillation is proportional to √(ε⁻¹ − 1) Ks/Vs in the two-pool model and to √(ε⁻¹ − 1) Kl/Vl in the three-pool model, where ε is microbial growth efficiency, Ks and Kl are the half-saturation constants of soil and litter carbon, respectively, and Vs and Vl are the maximal rates of carbon decomposition per unit of microbial biomass for soil and litter carbon, respectively. For both models, the oscillation has a period of between 5 and 15 years, depending on the other parameter values, and has smaller amplitude at soil temperatures between 0 and 15 °C. In addition, the equilibrium pool sizes of litter and soil carbon are insensitive to carbon inputs in the nonlinear models, but are proportional to carbon input in the conventional linear models. Under warming, the microbial biomass and litter carbon pools simulated by the nonlinear models can increase or decrease, depending on whether ε varies with temperature. In contrast, the conventional linear models always simulate a decrease in both microbial and litter carbon pools with warming. Based on the evidence available, we concluded that the oscillatory behavior and the insensitivity of soil carbon to carbon input are notable features of these nonlinear models that are somewhat unrealistic. We recommend that a better model for capturing soil carbon dynamics over decadal to centennial timescales would combine the sensitivity of the conventional models to carbon influx with the flexible response to warming of the nonlinear models.
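
    The damped oscillation described above can be reproduced with a generic two-pool microbial model in which soil carbon is decomposed by Michaelis-Menten microbial uptake. The sketch below is a minimal illustration, not the paper's exact formulation: the parameter values (influx, eps, v_s, k_s, mu_b) are hypothetical, chosen only so that the spiral toward equilibrium is visible in the printed output.

        def derivatives(c_s, c_b, influx=5.0, eps=0.4, v_s=8.0, k_s=200.0, mu_b=0.5):
            """Per-year rates of change for soil carbon c_s and microbial biomass c_b."""
            uptake = v_s * c_b * c_s / (c_s + k_s)  # Michaelis-Menten decomposition flux
            dc_s = influx + mu_b * c_b - uptake     # litter input + microbial turnover - uptake
            dc_b = eps * uptake - mu_b * c_b        # growth at efficiency eps - turnover
            return dc_s, dc_b

        # Forward-Euler integration from a perturbed state; the two pools spiral
        # toward equilibrium (a damped oscillation) rather than relaxing monotonically.
        c_s, c_b = 120.0, 2.0
        dt = 0.01
        for step in range(int(150 / dt) + 1):
            if step % int(15 / dt) == 0:
                print(f"t = {step * dt:5.1f} yr   C_s = {c_s:7.2f}   C_b = {c_b:6.3f}")
            dc_s, dc_b = derivatives(c_s, c_b)
            c_s += dc_s * dt
            c_b += dc_b * dt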

    Predicting the Next Best View for 3D Mesh Refinement

    3D reconstruction is a core task in many applications such as robot navigation or site inspection. Finding the best poses from which to capture part of the scene is one of the most challenging problems in this area, and goes under the name of Next Best View. Recently, many volumetric methods have been proposed; they choose the Next Best View by reasoning over a 3D voxelized space and finding the pose that minimizes the uncertainty encoded in the voxels. Such methods are effective, but they do not scale well, since the underlying representation requires a huge amount of memory. In this paper we propose a novel mesh-based approach which focuses on the worst reconstructed regions of the environment mesh. We define a photo-consistency index to evaluate the accuracy of the 3D mesh, and an energy function over the worst regions of the mesh which takes into account the mutual parallax with respect to the previous cameras, the angle of incidence of the viewing ray on the surface, and the visibility of the region. We test our approach on a well-known dataset and achieve state-of-the-art results.
    Comment: 13 pages, 5 figures, to be published in IAS-1
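
    The view-selection idea lends itself to a compact score: for each poorly reconstructed mesh region, a candidate camera pose is rated by the parallax it forms with a previous camera, the incidence angle of its viewing ray on the surface, and whether the region is visible. The sketch below is a hypothetical toy scoring function with made-up weights and a trivial visibility test, intended only to show how three such terms might combine; it is not the paper's actual energy formulation.

        import numpy as np

        def view_energy(region_center, region_normal, candidate_pos, prev_cam_pos,
                        w_parallax=1.0, w_incidence=1.0):
            """Toy energy for one candidate view of one badly reconstructed region.
            Higher is better: good parallax with the previous camera and a viewing
            ray close to the surface normal."""
            ray = region_center - candidate_pos
            ray /= np.linalg.norm(ray)
            prev_ray = region_center - prev_cam_pos
            prev_ray /= np.linalg.norm(prev_ray)

            # Crude visibility proxy: the region must face the candidate camera.
            if (ray @ region_normal) >= 0.0:
                return 0.0

            # Parallax term: angle between the two viewing rays (peaks near 90 degrees).
            parallax = np.arccos(np.clip(ray @ prev_ray, -1.0, 1.0))
            parallax_term = np.sin(parallax)

            # Incidence term: prefer rays that hit the surface head-on.
            incidence_term = -(ray @ region_normal)

            return w_parallax * parallax_term + w_incidence * incidence_term

        center = np.array([0.0, 0.0, 0.0])
        normal = np.array([0.0, 0.0, 1.0])
        prev_cam = np.array([2.0, 0.0, 2.0])
        candidates = [np.array([-2.0, 0.0, 2.0]), np.array([2.0, 0.1, 2.0])]
        best = max(candidates, key=lambda c: view_energy(center, normal, c, prev_cam))
        print("next best view:", best)  # picks the pose with the larger parallax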

    The removal of thermally aged films of triacylglycerides by surfactant solutions

    Thermal ageing of triacylglycerides (TAG) at high temperatures produces films which resist removal by aqueous surfactant solutions. We used a mass-loss method to investigate the removal of thermally aged TAG films from hard surfaces using aqueous solutions of surfactants of different charge types. It was found that cationic surfactants are most effective at high pH, whereas anionic surfactants are most effective at low pH and a non-ionic surfactant is most effective at intermediate pH. We showed that the TAG film removal process occurs in several stages. In the first, "lag phase" stage, no TAG removal occurs; the surfactant first partitions into the thermally aged film. In the second stage, the TAG film containing surfactant is removed by solubilisation into micelles in the aqueous solution. The effects of pH and surfactant charge on the TAG removal process correlate with the effects of these variables on the extent of surfactant partitioning into the TAG film and on the maximum extent of TAG solubilisation within the micelles. Additionally, we showed how TAG removal is enhanced by the addition of amphiphilic additives such as alcohols, which act as co-surfactants. The study demonstrates that aqueous surfactant solutions provide a viable and more benign alternative to current methods for the removal of thermally aged TAG films.
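
    The two-stage picture, a lag phase with no mass loss while surfactant partitions into the film followed by removal via micellar solubilisation, can be summarised as a simple piecewise kinetic model. The sketch below, with an entirely hypothetical lag time and rate constant, shows the shape such a mass-loss curve would take; it illustrates the described mechanism and is not a fit reported in the paper.

        from math import exp

        def tag_mass_remaining(t, m0=1.0, t_lag=5.0, k=0.3):
            """Piecewise model of thermally aged TAG film removal.
            t_lag : lag phase (min) while surfactant partitions into the film
            k     : first-order rate (1/min) of micellar solubilisation after the lag
            (both values are hypothetical, for illustration only)"""
            if t < t_lag:
                return m0                        # stage 1: no removal yet
            return m0 * exp(-k * (t - t_lag))    # stage 2: solubilisation into micelles

        for t in range(0, 21, 2):
            print(f"t = {t:2d} min   film mass = {tag_mass_remaining(t):.3f}")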

    Personalized online information search and visualization

    BACKGROUND: The rapid growth of online publication sources such as Medline raises the question of how to retrieve relevant information efficiently. It is important for a bench scientist, for example, to monitor related publications constantly, and for a clinician to access patient records anywhere and anytime. Although time-consuming, this kind of searching procedure is usually similar and simple: typically it involves a search engine and a visualization interface, with different words or combinations of words reflecting different research topics. The objective of this study was to automate this tedious procedure by recording those words/terms in a database, along with the online sources to search, and using this information for automated search and retrieval, with the retrieved information made available anytime and anywhere through a secure web server. RESULTS: We developed a database that stores search terms, journals of interest, and related settings, and implemented software for automatically searching medical subject heading (MeSH)-indexed sources such as Medline as well as other online sources. The returned information was stored as-is on a local server and made visible through a Web-based interface. The searches were performed daily or on another schedule, and users could log on to the website at any time without typing any search terms. The system also has the potential to retrieve information from non-MeSH-indexed literature or from privileged information sources such as a clinical information system. Issues such as security and the presentation and visualization of the retrieved information were addressed, and wireless access was tested as one mode of presentation. A user survey showed that the personalized online searches saved time and increased relevancy; handheld devices could also be used to access the stored information, but less satisfactorily. CONCLUSION: This Web-searching software, or a similar system, has the potential to be an efficient tool for both bench scientists and clinicians for their daily information needs.
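
    The core of the described system, stored search terms replayed on a schedule against a MeSH-indexed source, can be sketched with NCBI's public E-utilities interface, which is a real service; the SQLite schema, the example term, and the storage layout below are hypothetical stand-ins for the paper's own database and software.

        import sqlite3
        import urllib.parse
        import urllib.request

        ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

        # Hypothetical store of each user's saved search terms.
        db = sqlite3.connect("saved_searches.db")
        db.execute("CREATE TABLE IF NOT EXISTS terms (user TEXT, query TEXT)")
        db.execute("INSERT INTO terms VALUES (?, ?)", ("demo_user", "soil carbon decomposition"))

        def run_saved_searches():
            """Replay every saved query against PubMed and store the raw response."""
            db.execute("CREATE TABLE IF NOT EXISTS results (user TEXT, query TEXT, xml BLOB)")
            for user, query in db.execute("SELECT user, query FROM terms").fetchall():
                params = urllib.parse.urlencode({"db": "pubmed", "term": query, "retmax": 20})
                with urllib.request.urlopen(f"{ESEARCH}?{params}") as resp:
                    xml = resp.read()
                # Store the response as-is; a web interface would render it later.
                db.execute("INSERT INTO results VALUES (?, ?, ?)", (user, query, xml))
            db.commit()

        run_saved_searches()  # in production this would run from a daily scheduler, e.g. cron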

    Testing statistical significance scores of sequence comparison methods with structure similarity

    BACKGROUND: In recent years the Smith-Waterman sequence comparison algorithm has gained popularity due to improved implementations and rapidly increasing computing power. However, the quality and sensitivity of a database search are determined not only by the algorithm but also by the statistical significance testing applied to an alignment. The e-value is the most commonly used statistical validation method for sequence database searching. The CluSTr database and the Protein World database have been created using an alternative statistical significance test: a Z-score based on Monte-Carlo statistics. Several papers have described the superiority of the Z-score over the e-value using simulated data. We were interested in whether this superiority could be validated on existing, evolutionarily related protein sequences. RESULTS: All experiments were performed on the ASTRAL SCOP database. The Smith-Waterman sequence comparison algorithm with both e-value and Z-score statistics was evaluated using ROC, CVE, and AP measures, with the BLAST and FASTA algorithms as references. We found that two out of three Smith-Waterman implementations with the e-value are better at predicting structural similarities between proteins than the Smith-Waterman implementation with the Z-score; SSEARCH in particular scores very highly. CONCLUSION: The compute-intensive Z-score does not have a clear advantage over the e-value, and the Smith-Waterman implementations give generally better results than their heuristic counterparts. We recommend using the SSEARCH algorithm combined with e-values for pairwise sequence comparisons.
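
    A Monte-Carlo Z-score of the kind compared here measures how far the true alignment score lies above the score distribution obtained after shuffling one of the sequences. The sketch below pairs it with a deliberately minimal Smith-Waterman scorer (linear gap penalty, toy scoring values); the scoring parameters and the shuffle count are illustrative choices, not those of the evaluated implementations.

        import random

        def sw_score(a, b, match=2, mismatch=-1, gap=-2):
            """Minimal Smith-Waterman local alignment score with a linear gap penalty."""
            prev = [0] * (len(b) + 1)
            best = 0
            for ca in a:
                curr = [0]
                for j, cb in enumerate(b, 1):
                    s = max(0,
                            prev[j - 1] + (match if ca == cb else mismatch),
                            prev[j] + gap,
                            curr[j - 1] + gap)
                    curr.append(s)
                    best = max(best, s)
                prev = curr
            return best

        def z_score(a, b, n_shuffles=200, seed=0):
            """Monte-Carlo Z-score: distance of the real score above the score
            distribution for shuffled sequences (same length and composition)."""
            rng = random.Random(seed)
            real = sw_score(a, b)
            scores = []
            for _ in range(n_shuffles):
                perm = list(b)
                rng.shuffle(perm)
                scores.append(sw_score(a, "".join(perm)))
            mean = sum(scores) / n_shuffles
            sd = (sum((s - mean) ** 2 for s in scores) / (n_shuffles - 1)) ** 0.5
            return (real - mean) / (sd or 1.0)

        print(f"Z = {z_score('HEAGAWGHEE', 'PAWHEAE'):.2f}")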

    Unsupervised Bayesian linear unmixing of gene expression microarrays

    Background: This paper introduces a new constrained model and the corresponding algorithm, called unsupervised Bayesian linear unmixing (uBLU), to identify biological signatures from high-dimensional assays like gene expression microarrays. The basis of uBLU is a Bayesian model in which the data samples are represented as an additive mixture of random positive gene signatures, called factors, with random positive mixing coefficients, called factor scores, that specify the relative contribution of each signature to a specific sample. What distinguishes the proposed method is that uBLU constrains the factor loadings to be non-negative and the factor scores to be probability distributions over the factors; it also provides estimates of the number of factors. A Gibbs sampling strategy is adopted to generate random samples according to the posterior distribution of the factors, factor scores, and number of factors, and these samples are then used to estimate all the unknown parameters. Results: First, the proposed uBLU method is applied to several simulated datasets with known ground truth and compared with previous factor decomposition methods, such as principal component analysis (PCA), non-negative matrix factorization (NMF), Bayesian factor regression modeling (BFRM), and the gradient-based algorithm for general matrix factorization (GB-GMF). Second, we illustrate the application of uBLU on a real, time-evolving gene expression dataset from a recent viral challenge study in which individuals were inoculated with influenza A/H3N2/Wisconsin. We show that the uBLU method significantly outperforms the other methods on the simulated and real data sets considered here. Conclusions: The results obtained on synthetic and real data illustrate the accuracy of the proposed uBLU method compared to other factor decomposition methods from the literature (PCA, NMF, BFRM, and GB-GMF). The uBLU method identifies an inflammatory component closely associated with the clinical symptom scores collected during the study, and the constrained model allows recovery of all the inflammatory genes in a single factor.
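
    The constrained mixing model underlying uBLU, samples expressed as non-negative factor signatures weighted by scores that sum to one, is easy to state generatively. The sketch below simulates data from such a model with numpy and checks the two constraints; the dimensions and the exponential/Dirichlet choices are hypothetical, and the Gibbs sampler itself is not implemented here.

        import numpy as np

        rng = np.random.default_rng(0)

        n_genes, n_samples, n_factors = 500, 40, 3

        # Non-negative factor signatures (gene loadings), one column per factor.
        factors = rng.exponential(scale=1.0, size=(n_genes, n_factors))

        # Factor scores: each sample's column is a probability distribution
        # over the factors (non-negative, summing to one).
        scores = rng.dirichlet(alpha=np.ones(n_factors), size=n_samples).T

        # Observed expression = additive mixture + noise.
        noise = rng.normal(scale=0.05, size=(n_genes, n_samples))
        data = factors @ scores + noise

        assert (factors >= 0).all()                    # non-negativity constraint
        assert np.allclose(scores.sum(axis=0), 1.0)    # sum-to-one constraint
        print("simulated expression matrix:", data.shape)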

    Evaluation of a Bayesian inference network for ligand-based virtual screening

    Background: Bayesian inference networks enable the computation of the probability that an event will occur. They have previously been used to rank textual documents in order of decreasing relevance to a user-defined query. Here, we modify the approach so that a Bayesian inference network can be used for chemical similarity searching, where a database is ranked in order of decreasing probability of bioactivity. Results: Bayesian inference networks were implemented using two different types of network and four different types of belief function. Experiments with the MDDR and WOMBAT databases show that a Bayesian inference network can provide effective ligand-based screening, especially when the active molecules being sought have a high degree of structural homogeneity; in such cases, the network substantially outperforms a conventional, Tanimoto-based similarity searching system. However, the network is much less effective when structurally heterogeneous sets of actives are being sought. Conclusion: A Bayesian inference network provides an interesting alternative to existing tools for ligand-based virtual screening.
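
    The contrast drawn here, probabilistic evidence combination versus conventional Tanimoto similarity over binary fingerprints, can be made concrete. The sketch below implements the Tanimoto coefficient exactly and a simple sum-of-weighted-evidence score loosely in the spirit of an inference network; the inverse-frequency weighting and the toy fingerprints are hypothetical stand-ins, not one of the paper's four belief functions.

        import math

        def tanimoto(fp_a, fp_b):
            """Conventional Tanimoto similarity between binary fingerprints (bit sets)."""
            return len(fp_a & fp_b) / len(fp_a | fp_b)

        def idf_weights(database):
            """Weight each bit by its rarity across the database (rarer = stronger
            evidence), a simple stand-in for a belief function."""
            n = len(database)
            counts = {}
            for fp in database.values():
                for bit in fp:
                    counts[bit] = counts.get(bit, 0) + 1
            return {bit: math.log(1 + n / c) for bit, c in counts.items()}

        def belief_score(query, fp, weights):
            """Toy evidence combination: sum the weights of the shared on-bits."""
            return sum(weights.get(bit, 0.0) for bit in query & fp)

        database = {"mol_a": {3, 17, 42, 600}, "mol_b": {42, 99, 512, 640}, "mol_c": {3, 42, 700}}
        query = {3, 17, 42, 99}
        weights = idf_weights(database)

        # Rank the database by decreasing belief, the network-style ordering.
        for name in sorted(database, key=lambda m: belief_score(query, database[m], weights),
                           reverse=True):
            print(name,
                  round(belief_score(query, database[name], weights), 3),
                  round(tanimoto(query, database[name]), 3))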

    Stem cell differentiation increases membrane-actin adhesion regulating cell blebability, migration and mechanics

    This work is licensed under a Creative Commons Attribution 4.0 International License. The images or other third-party material in this article are included in the article's Creative Commons license, unless indicated otherwise in the credit line; if the material is not included under the Creative Commons license, users will need to obtain permission from the license holder in order to reproduce the material. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/. K.S. is funded by an EPSRC PhD studentship. S.T. is funded by an EU Marie Curie Intra-European Fellowship (GENOMICDIFF).

    New Limits on the Ultra-high Energy Cosmic Neutrino Flux from the ANITA Experiment

    We report initial results from the first flight of the Antarctic Impulsive Transient Antenna (ANITA-1), a long-duration balloon payload flown in 2006-2007 that searched for evidence of a diffuse flux of cosmic neutrinos above energies of 3 EeV. ANITA-1 flew for 35 days looking for radio impulses produced via the Askaryan effect in neutrino-induced electromagnetic showers within the Antarctic ice sheets. We report here on our initial analysis, which was performed as a blind search of the data. No neutrino candidates are seen, with no detected physics background, and we set model-independent limits based on this result. The upper limits derived from our analysis rule out the highest cosmogenic neutrino models. In a background horizontal-polarization channel, we also detect six events consistent with radio impulses from ultra-high-energy extensive air showers.
    Comment: 4 pages, 2 table
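
    With zero candidates and no detected background, a model-independent flux limit follows from elementary Poisson statistics: the 90% CL upper limit on the expected number of signal events is about 2.3 (or 2.44 with the Feldman-Cousins construction), divided by the experiment's exposure. The sketch below works through that arithmetic; the exposure value is a placeholder invented for illustration and is not ANITA's.

        import math

        n_observed = 0      # no neutrino candidates seen
        confidence = 0.90

        # For a Poisson process with zero observed events and zero background,
        # P(0 | mu) = exp(-mu), so the upper limit solves exp(-mu) = 1 - CL.
        mu_upper = -math.log(1.0 - confidence)
        print(f"90% CL upper limit on expected events: {mu_upper:.2f}")  # ~2.30

        # A model-independent limit divides this by the energy-dependent exposure.
        exposure = 1.0e16   # hypothetical exposure in cm^2 s sr at some energy
        flux_limit = mu_upper / exposure
        print(f"flux upper limit: {flux_limit:.2e} per cm^2 s sr")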