
    OpenTURNS: An industrial software for uncertainty quantification in simulation

    The need to assess the robust performance of complex systems and to meet tighter regulatory requirements (security, safety, environmental control, health impacts, etc.) has led to the emergence of a new industrial simulation challenge: taking uncertainties into account when dealing with complex numerical simulation frameworks. A generic methodology has therefore emerged from the joint effort of several industrial companies and academic institutions. EDF R&D, Airbus Group and Phimeca Engineering started a collaboration at the beginning of 2005, joined by IMACS in 2014, to develop an open source software platform dedicated to uncertainty propagation by probabilistic methods, named OpenTURNS for Open source Treatment of Uncertainty, Risk 'N Statistics. OpenTURNS addresses the specific industrial challenges attached to uncertainties: transparency, genericity, modularity and multi-accessibility. This paper focuses on OpenTURNS and presents its main features: OpenTURNS is open source software under the LGPL license, provided as a C++ library and a Python TUI, and it runs under Linux and Windows environments. All the methodological tools are described in the different sections of this paper: uncertainty quantification, uncertainty propagation, sensitivity analysis and metamodeling. A section also explains the generic wrapper mechanism used to link OpenTURNS to any external code. The paper illustrates the methodological tools as much as possible on an educational example that simulates the height of a river and compares it to the height of a dyke protecting industrial facilities. Finally, it gives an overview of the main developments planned for the next few years.
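    The river-and-dyke example described above maps naturally onto the Python TUI. The sketch below is a minimal illustration of probabilistic uncertainty propagation; the distributions, parameter values and the simplified river-height formula are assumptions for illustration only, not the exact flood model used in the paper.

```python
import math
import openturns as ot

# Illustrative input uncertainties (assumed values, not the paper's exact setup).
Q  = ot.Uniform(500.0, 3000.0)   # river flow rate [m3/s]
Ks = ot.Normal(30.0, 3.0)        # Strickler friction coefficient
Zv = ot.Uniform(49.0, 51.0)      # downstream river bed level [m]
Zm = ot.Uniform(54.0, 56.0)      # upstream river bed level [m]

def river_height(x):
    """Simplified steady-state river height (assumed formula, reach length 5000 m, width 300 m)."""
    q, ks, zv, zm = x
    slope = (zm - zv) / 5000.0
    return [(q / (ks * 300.0 * math.sqrt(slope))) ** 0.6]

g = ot.PythonFunction(4, 1, river_height)
inputs = ot.ComposedDistribution([Q, Ks, Zv, Zm])
X = ot.RandomVector(inputs)
H = ot.CompositeRandomVector(g, X)

# Plain Monte Carlo propagation of the input uncertainties through the model.
sample = H.getSample(10000)
print("mean height:", sample.computeMean())
print("95% quantile:", sample.computeQuantilePerComponent(0.95))
```

    Comparing the resulting height distribution to the dyke height would then give the probability of overflow, which is the kind of question the platform is designed to answer.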

    Markovian Dynamics on Complex Reaction Networks

    Complex networks, composed of individual elements that interact with each other through reaction channels, are ubiquitous across many scientific and engineering disciplines. Examples include biochemical, pharmacokinetic, epidemiological, ecological, social, neural, and multi-agent networks. A common approach to modeling such networks is by a master equation that governs the dynamic evolution of the joint probability mass function of the underlying population process and naturally leads to Markovian dynamics for that process. However, due to the nonlinear nature of most reactions, the computation and analysis of the resulting stochastic population dynamics is a difficult task. This review article provides a coherent and comprehensive coverage of recently developed approaches and methods to tackle this problem. After reviewing a general framework for modeling Markovian reaction networks and giving specific examples, the authors present numerical and computational techniques capable of evaluating or approximating the solution of the master equation, discuss a recently developed approach for studying the stationary behavior of Markovian reaction networks using a potential energy landscape perspective, and provide an introduction to the emerging theory of thermodynamic analysis of such networks. Three representative problems of opinion formation, transcription regulation, and neural network dynamics are used as illustrative examples. (52 pages, 11 figures; freely available MATLAB software at http://www.cis.jhu.edu/~goutsias/CSS%20lab/software.htm)
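    As a concrete illustration of Markovian dynamics on a reaction network, the sketch below simulates a simple birth-death network (a transcription/degradation-style pair of reactions) with Gillespie's stochastic simulation algorithm, which samples exact trajectories of the Markov jump process whose probability mass function obeys the master equation. The network, rate constants and propensity forms are illustrative assumptions; the review and its MATLAB software cover far more general networks.

```python
import numpy as np

def gillespie_birth_death(k_birth=10.0, k_death=0.5, x0=0, t_end=20.0, rng=None):
    """Exact SSA for the reactions 0 -> X (rate k_birth) and X -> 0 (rate k_death * X)."""
    rng = np.random.default_rng() if rng is None else rng
    t, x = 0.0, x0
    times, states = [t], [x]
    while t < t_end:
        a = np.array([k_birth, k_death * x])      # reaction propensities
        a0 = a.sum()
        if a0 == 0.0:
            break
        t += rng.exponential(1.0 / a0)            # exponential waiting time to the next reaction
        x += 1 if rng.random() < a[0] / a0 else -1  # choose which reaction fires
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

times, states = gillespie_birth_death()
print("final copy number:", states[-1], "(stationary mean is k_birth/k_death =", 10.0 / 0.5, ")")
```

    Averaging many such trajectories approximates the time-dependent probability mass function that the master equation describes exactly.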

    Full Resolution Image Compression with Recurrent Neural Networks

    This paper presents a set of full-resolution lossy image compression methods based on neural networks. Each of the architectures we describe can provide variable compression rates during deployment without requiring retraining of the network: each network need only be trained once. All of our architectures consist of a recurrent neural network (RNN)-based encoder and decoder, a binarizer, and a neural network for entropy coding. We compare RNN types (LSTM, associative LSTM) and introduce a new hybrid of GRU and ResNet. We also study "one-shot" versus additive reconstruction architectures and introduce a new scaled-additive framework. We compare against previous work, showing improvements of 4.3%-8.8% AUC (area under the rate-distortion curve), depending on the perceptual metric used. As far as we know, this is the first neural network architecture able to outperform JPEG at image compression across most bitrates on the rate-distortion curve on the Kodak dataset images, both with and without the aid of entropy coding. (Updated with content for CVPR; supplemental material moved to an external link due to size limitations.)
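    The additive-reconstruction idea can be sketched compactly: at each iteration the network encodes the current residual into a small set of bits and adds a decoded correction to the running reconstruction, so more iterations mean more bits and higher fidelity. The PyTorch sketch below is a simplification under stated assumptions: it uses plain convolutions instead of the paper's LSTM/GRU recurrent units, a sign binarizer with a straight-through gradient, and it omits the entropy-coding network.

```python
import torch
import torch.nn as nn

class Binarizer(nn.Module):
    """Sign binarization with a straight-through gradient estimator (simplified assumption)."""
    def forward(self, x):
        x = torch.tanh(x)
        return x + (torch.sign(x) - x).detach()

class AdditiveCompressor(nn.Module):
    def __init__(self, bottleneck_channels=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, bottleneck_channels, 3, stride=2, padding=1))
        self.binarizer = Binarizer()
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(bottleneck_channels, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1))

    def forward(self, image, iterations=8):
        reconstruction = torch.zeros_like(image)
        residual = image
        codes = []
        for _ in range(iterations):
            bits = self.binarizer(self.encoder(residual))          # variable rate: fewer iterations, fewer bits
            codes.append(bits)
            reconstruction = reconstruction + self.decoder(bits)   # additive reconstruction
            residual = image - reconstruction                      # next step encodes what is still missing
        return reconstruction, codes

# Example: compress a random 64x64 image with 8 iterations.
recon, codes = AdditiveCompressor()(torch.rand(1, 3, 64, 64))
print(recon.shape, len(codes))
```

    A "one-shot" variant would instead decode the full image from scratch at every iteration rather than accumulating corrections.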

    An annotated and classified bibliography of software metrics publications: 1988 to 1994

    With the growth of the software industry, the measurement of software plays an ever increasing role. To help software metrics researchers and practitioners quickly identify the references of particular interest to them, over 60 of the many publications on software metrics that have appeared since 1988 are classified into four tables covering, respectively, (1) Metrics through the Life Cycle, (2) Classic Metrics, (3) Programming Language Metrics, and (4) New Metrics. Table 1 serves as a complete list of all the classified publications, while Tables 2, 3 and 4 are subsets of Table 1 that present more detailed information. The bibliographic reference section contains brief summaries of the publications in the classified tables. Continuing the 1988 survey by V. Cote, P. Bourque, S. Oligny and N. Rivard, "Software metrics: an overview of recent results", this project was conducted to identify current trends in software metrics practice and to report how those trends have shifted since the 1988 paper by comparing the results of the two surveys. All table comparisons between the two surveys are given in percentages. We are fully aware of the limits of our selection from the wealth of publications in the software metrics field, but we are confident that this survey is a good indicator of practice in the field. [Abstract abridged by UMI]

    Potentials of a Harmonised Database for Agricultural Market Modelling

    The study analysed existing databases of agricultural market data for errors and discrepancies and elaborated the possibilities for harmonising datasets for policy modelling. The study supports DG AGRI in improving the quality and timely availability of data for market modelling and in ensuring that data from different sources are consistent. It aims to provide a structure for a consolidated database for policy modelling that does not alter existing databases. Within this report, existing databases are analysed to derive key insights for setting up a harmonised metabase. As available databases comprise statistical databases as well as scientific model databases, both groups are studied. For the purpose of this study, statistical databases are defined as providers of the information that international institutes receive from their reporters, while the reporters are required to provide harmonised, complete, consistent and, where possible, timely data series for establishing models or other quantitative methods. Nevertheless, a statistical database can also serve as a model database, e.g. PS&D. Statistical databases from international institutions (FAO, USDA, Eurostat), as well as model databases (AGLINK/COSIMO, AGMEMOD, CAPRI/CAPSIM, ESIM, FAPRI, GTAP, FARM, IMPACT), were studied to find ways of consolidating data and providing insights that allow for a better comparison of model results. For this reason, various classification schemes used in agricultural statistics were reviewed (country, product, balance item, year, unit), as was the manner in which the different modelling groups have dealt with these classifications in their databases. Besides a common classification, a harmonised database for market modelling will require further effort to consolidate the original data, and such a procedure must be supplemented by methods for completion and balancing.
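    A small part of the harmonisation task described above, mapping heterogeneous country and product codes onto a common classification before merging and comparing series, can be sketched with pandas. All column names, codes and values below are hypothetical; they only illustrate the mapping-and-merge step, not the actual FAO, Eurostat or model database layouts.

```python
import pandas as pd

# Hypothetical source extracts, each with its own country/product codes and units.
faostat = pd.DataFrame({
    "area": ["Germany", "France"], "item": ["Wheat", "Wheat"],
    "year": [2005, 2005], "production_t": [23_600_000, 36_900_000]})
eurostat = pd.DataFrame({
    "geo": ["DE", "FR"], "crop": ["C1100", "C1100"],
    "year": [2005, 2005], "production_1000t": [23_700, 36_900]})

# Harmonised keys: map every source classification onto a common one (assumed codes).
country_map = {"Germany": "DE", "France": "FR"}       # FAO-style names -> ISO codes
product_map = {"Wheat": "WHEA", "C1100": "WHEA"}      # FAO item / Eurostat crop -> common product code

fao_h = faostat.assign(country=faostat["area"].map(country_map),
                       product=faostat["item"].map(product_map),
                       production_1000t=faostat["production_t"] / 1000.0)
eur_h = eurostat.assign(country=eurostat["geo"],
                        product=eurostat["crop"].map(product_map))

keys = ["country", "product", "year"]
merged = fao_h[keys + ["production_1000t"]].merge(
    eur_h[keys + ["production_1000t"]], on=keys, suffixes=("_fao", "_eurostat"))
merged["discrepancy_pct"] = (
    100.0 * (merged["production_1000t_fao"] - merged["production_1000t_eurostat"])
    / merged["production_1000t_eurostat"])
print(merged)
```

    The discrepancy column is where a consolidation procedure would then apply completion and balancing rules.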

    Input variable selection in time-critical knowledge integration applications: A review, analysis, and recommendation paper

    The purpose of this research is twofold: first, to undertake a thorough appraisal of existing Input Variable Selection (IVS) methods within the context of time-critical and computation-resource-limited dimensionality reduction problems; second, to demonstrate improvements to, and the application of, a recently proposed time-critical sensitivity analysis method called EventTracker in an environmental science industrial use case, i.e., sub-surface drilling. Producing accurate, time-critical knowledge about the state of a system (effect) under computational and data-acquisition (cause) constraints is a major challenge, especially when the required knowledge is critical to system operation and the safety of operators or the integrity of costly equipment is at stake. Understanding and interpreting a chain of interrelated events, predicted or unpredicted, that may or may not result in a specific state of the system is the core challenge of this research. The main objective is then to identify which set of input data signals has a significant impact on the set of system state information (i.e., output). Through cause-effect analysis, the proposed technique supports the filtering of unsolicited data that would otherwise clog the communication and computational capabilities of a standard supervisory control and data acquisition (SCADA) system. The paper analyzes the performance of input variable selection techniques from a series of perspectives. It then expands the categorization and assessment of sensitivity analysis methods in a structured framework that takes into account the relationship between inputs and outputs, the nature of their time series, and the computational effort required. The outcome of this analysis is that established methods have limited suitability for time-critical variable selection applications. By way of a geological drilling monitoring scenario, the suitability of the proposed EventTracker sensitivity analysis method for high-volume and time-critical input variable selection problems is demonstrated.
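    The exact EventTracker algorithm is not reproduced here; the sketch below only illustrates the general shape of an event-based input variable selection pass under simple assumptions: each signal is reduced to a binary "event" indicator (a significant change between consecutive samples), and inputs are ranked by how often their events coincide with events in the output, so that low-scoring channels can be filtered before they reach the SCADA or processing layer. Signal names and thresholds are hypothetical.

```python
import numpy as np

def event_indicator(signal, threshold):
    """1 where the signal changes by more than `threshold` between consecutive samples (assumed event definition)."""
    return (np.abs(np.diff(signal)) > threshold).astype(float)

def rank_inputs(inputs, output, in_threshold=0.5, out_threshold=0.5):
    """Score each input by the fraction of output events that coincide with one of its own events."""
    out_events = event_indicator(output, out_threshold)
    scores = {}
    for name, series in inputs.items():
        in_events = event_indicator(series, in_threshold)
        coincidences = np.logical_and(in_events, out_events).sum()
        scores[name] = coincidences / max(out_events.sum(), 1.0)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Synthetic example: one input drives the output, the other is unrelated noise.
rng = np.random.default_rng(0)
driver = np.cumsum(rng.normal(0, 1, 500))
noise = np.cumsum(rng.normal(0, 1, 500))
output = driver + rng.normal(0, 0.1, 500)
print(rank_inputs({"drilling_torque": driver, "unrelated_sensor": noise}, output))
```

    The appeal of this style of scoring in a time-critical setting is that it needs only a single pass over streaming data, in contrast to the variance-based or regression-based sensitivity methods reviewed in the paper.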