
    Increasing the Reliability of Adaptive Quadrature Using Explicit Interpolants

    Full text link
    We present two new adaptive quadrature routines. Both routines differ from previously published algorithms in many respects, most significantly in how they represent the integrand, how they treat non-numerical values of the integrand, how they deal with improper divergent integrals and how they estimate the integration error. The main focus of these improvements is to increase the reliability of the algorithms without significantly impacting their efficiency. Both algorithms are implemented in Matlab and tested using both the "families" suggested by Lyness and Kaganove and the battery test used by Gander and Gautschi and Kahaner. They are shown to be more reliable, albeit in some cases less efficient, than other commonly used adaptive integrators. Comment: 32 pages, submitted to ACM Transactions on Mathematical Software
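
    For illustration only, here is a minimal Python sketch of the generic adaptive-bisection idea behind such integrators, using a standard Simpson-based error estimate; it is not the interpolant-based algorithm of the paper, and the tolerance splitting and recursion limit are assumptions.

```python
# Generic adaptive bisection quadrature (illustrative, not the paper's method):
# compare Simpson's rule on an interval with its two-panel refinement,
# and subdivide wherever the resulting error estimate exceeds the tolerance.
import math

def _simpson(f, a, b):
    return (b - a) / 6.0 * (f(a) + 4.0 * f((a + b) / 2.0) + f(b))

def adaptive_quad(f, a, b, tol=1e-8, depth=50):
    m = (a + b) / 2.0
    whole = _simpson(f, a, b)
    left, right = _simpson(f, a, m), _simpson(f, m, b)
    err = (left + right - whole) / 15.0          # standard Richardson-type estimate
    if abs(err) <= tol or depth == 0:
        return left + right + err                # error-corrected local result
    return (adaptive_quad(f, a, m, tol / 2.0, depth - 1) +
            adaptive_quad(f, m, b, tol / 2.0, depth - 1))

if __name__ == "__main__":
    print(adaptive_quad(math.sin, 0.0, math.pi))  # ~2.0
```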

    Practical Evaluation of Lempel-Ziv-78 and Lempel-Ziv-Welch Tries

    Full text link
    We present the first thorough practical study of Lempel-Ziv-78 and Lempel-Ziv-Welch computation based on trie data structures. With a careful selection of trie representations, we can beat well-tuned popular trie data structures like Judy, m-Bonsai or Cedar.
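
    As a point of reference, the sketch below shows LZ78 factorization driven by a naive dict-of-dicts trie in Python; the compact trie representations evaluated in the paper (Judy, m-Bonsai, Cedar) address exactly the memory cost that this naive structure ignores.

```python
# LZ78 factorization with a naive dict-of-dicts trie (illustrative only).
def lz78_factorize(text):
    root = {}                 # trie node: char -> (factor_id, child node)
    factors = []              # list of (referenced factor id, new char)
    node, next_id = root, 1
    ref = 0                   # id of the longest previously seen factor matched so far
    for ch in text:
        if ch in node:        # extend the current match inside the trie
            ref, node = node[ch]
        else:                 # new factor: record it and add a trie edge
            node[ch] = (next_id, {})
            factors.append((ref, ch))
            next_id += 1
            node, ref = root, 0
    if node is not root:      # pending partial match at end of input
        factors.append((ref, ""))
    return factors

if __name__ == "__main__":
    print(lz78_factorize("abababcab"))
```

    On "abababcab" this yields the factors a, b, ab, abc, ab, i.e. [(0, 'a'), (0, 'b'), (1, 'b'), (3, 'c'), (3, '')].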

    Electromagnetic plasma modeling in circuit breaker within the finite volume method

    Get PDF
    In order to ensure the galvanic isolation of an electrical system following a manual operation or the occurrence of a fault, the current-limiting properties of the electric arc are used to force a fast decrease of the current to zero. Modeling this process is complex, since it involves a large number of physical phenomena (radiation, phase transitions, electromagnetism, fluid dynamics, plasma physics). To obtain a robust solution, favouring strongly coupled resolution and compatibility of the different time constants, the Finite Volume Method was chosen. This method was first implemented on stand-alone electromagnetic problems (current flow, magnetostatics including non-linear materials, and magnetodynamics). Once validated, the models were successfully integrated into Schneider Electric's dedicated current-interruption software, allowing a significantly improved simulation of Schneider Electric circuit breakers
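
    To make the finite-volume idea concrete, here is a minimal 1D sketch of the steady current-flow problem d/dx(sigma dV/dx) = 0 discretized by flux balances over cells; the geometry, conductivities and boundary potentials are illustrative assumptions, and the strongly coupled plasma models described above go far beyond this.

```python
# 1D finite-volume discretization of d/dx( sigma(x) dV/dx ) = 0 with fixed
# potentials at both ends (unit cell size). Illustrates only the face-flux
# balance of the FVM, not the coupled magnetodynamic/plasma models.
import numpy as np

def solve_current_flow(sigma, v_left, v_right):
    """sigma: conductivity per cell (length n); returns cell-centre potentials."""
    n = len(sigma)
    # Harmonic mean gives the conductance of each interior face.
    face = 2.0 * sigma[:-1] * sigma[1:] / (sigma[:-1] + sigma[1:])
    A = np.zeros((n, n))
    b = np.zeros(n)
    for i in range(n):
        gl = 2.0 * sigma[0] if i == 0 else face[i - 1]       # left-face conductance
        gr = 2.0 * sigma[-1] if i == n - 1 else face[i]      # right-face conductance
        A[i, i] = gl + gr
        if i > 0:
            A[i, i - 1] = -gl
        if i < n - 1:
            A[i, i + 1] = -gr
    b[0] = 2.0 * sigma[0] * v_left     # boundary value sits half a cell away
    b[-1] = 2.0 * sigma[-1] * v_right
    return np.linalg.solve(A, b)

if __name__ == "__main__":
    sigma = np.array([1.0, 1.0, 0.1, 0.1, 1.0])   # a poorly conducting region
    print(solve_current_flow(sigma, v_left=1.0, v_right=0.0))
```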

    Poly(ethylene) glycols and mechanochemistry for the preparation of bioactive 3,5-disubstituted hydantoins

    Get PDF
    Mechanochemistry was effective for the preparation of 3,5-disubstituted hydantoins from α-amino methyl esters, using either 1,1′-carbonyldiimidazole (CDI) or alkyl isocyanates. The antimicrobial additives 3-allyl-5,5′-dimethylhydantoin (ADMH) and 1-chloro-3-ethyl-5,5′-dimethylhydantoin (CEDMH) were prepared by grinding. A chlorination reaction, never previously described by mechanochemistry, was achieved with Ca(ClO)2, while the bioactive anticonvulsant marketed drug ethotoin was prepared by a novel approach based on poly(ethylene) glycol (PEG)-assisted grinding

    Digital media in primary schools: literacy or technology? Analyzing government and media discourses

    Get PDF
    This article examines the political and media discourses concerning the Portuguese governmental program responsible for delivering a laptop named "Magalhães" to all primary school children. The analysis is based on the official documents related to the launch and development of the initiative, as well as the press coverage of the topic. The main purpose is to identify the dominant public discourses and to find out what the media select for debate in the public sphere. The analysis was conducted with a particular focus on the critical media literacy framework. The results reveal that the press highlighted the negative aspects of the program and that this framing could have had a strong impact on how it was accepted and understood by public opinion. The analysis also reveals that the governmental initiative was predominantly driven by technological objectives, in particular access to technology, rather than by media literacy objectives. The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: this paper is part of a three-year project named "Navigating with 'Magalhães': Study on the Impact of Digital Media in Schoolchildren", funded by FCT - Fundação para a Ciência e a Tecnologia (Portuguese Foundation for Science and Technology) and co-funded by FEDER - Fundo Europeu de Desenvolvimento Regional (ERDF: European Regional Development Fund) through COMPETE - Programa Operacional Factores de Competitividade (Operational Competitiveness Programme)

    Shaping Biological Knowledge: Applications in Proteomics

    Get PDF
    The central dogma of molecular biology has provided a meaningful principle for data integration in the field of genomics. In this context, integration reflects the known transitions from a chromosome to a protein sequence: transcription, intron splicing, exon assembly and translation. There is no such clear principle for integrating proteomics data, since the laws governing protein folding and interactivity are not yet fully understood. In our effort to bring together independent pieces of information relating to proteins in a biologically meaningful way, we assess the bias of bioinformatics resources and the resulting approximations in the framework of small-scale studies. We analyse proteomics data following both a data-driven approach (focusing on proteins smaller than 10 kDa) and a hypothesis-driven approach (focusing on whole bacterial proteomes). These applications are potential sources of specialized complements to classical biological ontologies

    A General Approach for Predicting the Filtration of Soft and Permeable Colloids: The Milk Example

    Get PDF
    Membrane filtration operations (ultra-, microfiltration) are now extensively used for concentrating or separating an ever-growing variety of colloidal dispersions. However, the phenomena that determine the efficiency of these operations are not yet fully understood. This is especially the case when dealing with colloids that are soft, deformable, and permeable. In this paper, we propose a methodology for building a model that is able to predict the performance (flux, concentration profiles) of the filtration of such objects in relation to the operating conditions. This is done by focusing on the case of milk filtration, all experiments being performed with dispersions of milk casein micelles, which are a sort of "natural" colloidal microgel. Using this example, we develop the general idea that a filtration model can always be built for a given colloidal dispersion as long as this dispersion has been characterized in terms of osmotic pressure Π and hydraulic permeability k. For soft and permeable colloids, the major issue is that the permeability k cannot be assessed in a trivial way, as it can be for hard-sphere colloids. To get around this difficulty, we follow two distinct approaches to measure k: a direct approach, involving osmotic stress experiments, and a reverse-calculation approach, which consists of estimating k through well-controlled filtration experiments. The resulting filtration model is then validated against experimental measurements obtained from combined milk filtration/SAXS experiments. We also give precise examples of how the model can be used, as well as a brief discussion on the possible universality of the approach presented here
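
    As an illustration of how Π and k enter such a model, the sketch below estimates a permeate flux from Darcy's law with a membrane resistance, the hydraulic resistance of a concentrated layer, and the osmotic back-pressure at the membrane wall; the power-law forms of Π(φ) and k(φ), the viscosity, the membrane resistance and the concentration profile are placeholder assumptions, not the casein-micelle laws measured in the paper.

```python
# Resistances-in-series filtration flux estimate built from an osmotic
# pressure law Pi(phi) and a permeability law k(phi). The laws and constants
# below are placeholder assumptions for illustration only.
import numpy as np

MU = 1.0e-3          # permeate viscosity, Pa.s (water, assumed)
R_MEMBRANE = 1.0e12  # clean-membrane resistance, 1/m (assumed)

def osmotic_pressure(phi):
    """Placeholder osmotic pressure law, Pa."""
    return 5.0e3 * phi ** 3

def permeability(phi):
    """Placeholder hydraulic permeability law, m^2."""
    return 1.0e-18 / phi ** 2

def permeate_flux(delta_p, phi_profile, layer_thickness):
    """Darcy flux (m/s) through membrane plus concentrated layer.

    phi_profile: volume fractions across the accumulated layer,
    ordered from the bulk side to the membrane side.
    """
    dz = layer_thickness / len(phi_profile)
    r_layer = np.sum(dz / permeability(phi_profile))      # hydraulic resistance of the layer
    counter_pressure = osmotic_pressure(phi_profile[-1])  # osmotic back-pressure at the wall
    return (delta_p - counter_pressure) / (MU * (R_MEMBRANE + r_layer))

if __name__ == "__main__":
    profile = np.linspace(0.15, 0.5, 50)       # assumed concentration profile
    print(permeate_flux(1.0e5, profile, 50e-6))  # flux under 1 bar, 50 micron layer
```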

    Viral to metazoan marine plankton nucleotide sequences from the Tara Oceans expedition

    Get PDF
    A unique collection of oceanic samples was gathered by the Tara Oceans expeditions (2009-2013), targeting plankton organisms ranging from viruses to metazoans, and providing rich environmental context measurements. Thanks to recent advances in the field of genomics, extensive sequencing has been performed for a deep genomic analysis of this huge collection of samples. A strategy based on different approaches, such as metabarcoding, metagenomics, single-cell genomics and metatranscriptomics, has been chosen for the analysis of size-fractionated plankton communities. Here, we provide detailed procedures applied for genomic data generation, from nucleic acid extraction to sequence production, and we describe registries of genomics datasets available at the European Nucleotide Archive (ENA, www.ebi.ac.uk/ena). The association of these metadata with the experimental procedures applied for their generation will help the scientific community to access these data and facilitate their analysis. This paper complements other efforts to provide a full description of experiments and open science resources generated from the Tara Oceans project, further extending their value for the study of the world's planktonic ecosystems
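
    For readers who want to retrieve registered metadata programmatically, the sketch below queries ENA's file-report endpoint for the runs of one study; the endpoint path and field names reflect ENA's public portal API as commonly documented and should be treated as assumptions, and the accession is a hypothetical placeholder to be replaced by one of the accessions listed in the paper's registries.

```python
# Fetch run-level metadata for one ENA study accession as a list of dicts.
# Endpoint and fields are assumptions based on ENA's public portal API.
import csv
import io
import requests

ENA_FILEREPORT = "https://www.ebi.ac.uk/ena/portal/api/filereport"

def list_runs(study_accession):
    """Return one dict per sequencing run registered under the study."""
    params = {
        "accession": study_accession,
        "result": "read_run",
        "fields": "run_accession,sample_accession,fastq_ftp",
        "format": "tsv",
    }
    resp = requests.get(ENA_FILEREPORT, params=params, timeout=60)
    resp.raise_for_status()
    return list(csv.DictReader(io.StringIO(resp.text), delimiter="\t"))

if __name__ == "__main__":
    # Hypothetical placeholder -- substitute a real Tara Oceans accession
    # from the registries described in the paper.
    for run in list_runs("PRJEBxxxxx")[:5]:
        print(run["run_accession"], run["fastq_ftp"])
```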

    Robust Padé Approximation via SVD

    Full text link

    Optimal Arrangement of Keys in a Hash Table

    Full text link