50,113 research outputs found

    On the dialog between experimentalist and modeler in catchment hydrology

    The dialog between experimentalist and modeler in catchment hydrology has been minimal to date. The experimentalist often has a highly detailed yet largely qualitative understanding of the dominant runoff processes, so there is often far more information content on the catchment than is actually used for model calibration. While modelers appreciate the need for 'hard data' in the calibration process, little thought has been given to how modelers might access this 'soft' process knowledge. We present a new method in which soft data (i.e., qualitative knowledge from the experimentalist that cannot be used directly as exact numbers) are made useful through fuzzy measures of model-simulation and parameter-value acceptability. We developed a three-box lumped conceptual model for the Maimai catchment in New Zealand, a particularly well-studied process-hydrological research catchment. The boxes represent the key hydrological reservoirs that are known to have distinct groundwater dynamics, isotopic composition and solute chemistry. The model was calibrated against hard data (runoff and groundwater levels) as well as a number of criteria derived from the soft data (e.g., percent new water and reservoir volume). We achieved very good fits for the three-box model when optimizing the parameter values against runoff alone (Reff = 0.93). However, parameter sets obtained in this way generally showed poor goodness-of-fit for other criteria, such as the simulated new-water contributions to peak runoff. Including the soft-data criteria in the calibration resulted in lower Reff values (around 0.84 when including all criteria) but led to better overall performance, as judged against the experimentalist's view of catchment runoff dynamics. Model performance with respect to the soft data (for instance, the new-water ratio) improved significantly, and parameter uncertainty was reduced by 60% on average with the introduction of the soft-data multi-criteria calibration. We argue that accepting lower model efficiencies for runoff is 'worth it' if one can develop a more 'real' model of catchment behavior. The use of soft data is an approach to formalize this exchange between experimentalist and modeler and to more fully utilize the information content from experimental catchments.
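    The multi-criteria idea described above can be sketched in a few lines: score a candidate parameter set against the hard runoff data with a Nash-Sutcliffe efficiency and against a soft criterion with a fuzzy acceptability function, then combine the two. The acceptable range for the new-water ratio and the equal weighting used below are hypothetical placeholders, not values from the paper.

```python
# Minimal sketch of fuzzy multi-criteria acceptability (not the paper's code).
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency (Reff) for the hard runoff criterion."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def trapezoidal_membership(x, a, b, c, d):
    """Fuzzy acceptability: 0 outside [a, d], 1 inside [b, c], linear in between."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def combined_score(obs_q, sim_q, sim_new_water_ratio, weights=(0.5, 0.5)):
    """Weighted combination of hard and soft acceptability (illustrative only)."""
    hard = max(nash_sutcliffe(obs_q, sim_q), 0.0)
    # Hypothetical soft criterion: a new-water ratio of 0.10-0.30 is fully acceptable.
    soft = trapezoidal_membership(sim_new_water_ratio, 0.05, 0.10, 0.30, 0.40)
    return weights[0] * hard + weights[1] * soft

score = combined_score(obs_q=[1.0, 2.5, 1.8], sim_q=[1.1, 2.3, 1.9],
                       sim_new_water_ratio=0.22)
```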

    Data granulation by the principles of uncertainty

    Research in granular modeling has produced a variety of mathematical models, such as intervals, (higher-order) fuzzy sets, rough sets, and shadowed sets, all of which are suitable for characterizing so-called information granules. Modeling the uncertainty of the input data is recognized as a crucial aspect of information granulation. Moreover, uncertainty is a well-studied concept in many mathematical settings, such as probability theory, fuzzy set theory, and possibility theory. This suggests that an appropriate quantification of the uncertainty expressed by the information granule model could be used to define an invariant property to be exploited in practical situations of information granulation. In this perspective, a procedure of information granulation is effective if the uncertainty conveyed by the synthesized information granule is in a monotonically increasing relation with the uncertainty of the input data. In this paper, we present a data granulation framework that builds on the principles of uncertainty introduced by Klir. Since uncertainty is a mesoscopic descriptor of systems and data, these principles can be applied regardless of the input data type and the specific mathematical setting adopted for the information granules. The proposed framework is conceived (i) to offer a guideline for the synthesis of information granules and (ii) to build a groundwork for comparing and quantitatively judging different data granulation procedures. As a case study, we introduce a new data granulation technique based on the minimum sum of distances, which is designed to generate type-2 fuzzy sets. We analyze the procedure through experiments on two distinct data types: feature vectors and labeled graphs. Results show that the uncertainty of the input data is suitably conveyed by the generated type-2 fuzzy set models. (Comment: 16 pages, 9 figures, 52 references)
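    The monotonicity principle at the core of the framework (more input uncertainty should yield a more uncertain granule) can be illustrated with a deliberately simple granulation step. The interval-percentile granule and the width-based uncertainty measure below are illustrative stand-ins, not the paper's minimum-sum-of-distances procedure or its type-2 fuzzy set construction.

```python
# Toy illustration of the monotonicity principle for information granulation.
import numpy as np

def interval_granule(data, coverage=0.9):
    """Granule = central interval covering `coverage` of the samples."""
    lo = np.percentile(data, 100 * (1 - coverage) / 2)
    hi = np.percentile(data, 100 * (1 + coverage) / 2)
    return lo, hi

def granule_uncertainty(granule):
    """Interval width as a simple (Hartley-like) uncertainty measure."""
    lo, hi = granule
    return hi - lo

rng = np.random.default_rng(0)
tight = rng.normal(0.0, 0.5, 1000)    # low input uncertainty
spread = rng.normal(0.0, 2.0, 1000)   # high input uncertainty
# The granule built from the more scattered data conveys more uncertainty.
assert granule_uncertainty(interval_granule(tight)) < granule_uncertainty(interval_granule(spread))
```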

    Smoothing and filtering with a class of outer measures

    Filtering and smoothing with a generalised representation of uncertainty are considered. Here, uncertainty is represented using a class of outer measures. It is shown how this representation of uncertainty can be propagated using outer-measure-type versions of Markov kernels and generalised Bayesian-like update equations. This leads to a system of generalised smoothing and filtering equations in which integrals are replaced by supremums and probability density functions are replaced by positive functions with supremum equal to one. Interestingly, these equations retain most of the structure found in the classical Bayesian filtering framework. It is additionally shown that the Kalman filter recursion can be recovered from weaker assumptions on the available information about the corresponding hidden Markov model.
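    The flavor of the resulting recursions can be conveyed with a small discrete-state sketch in which sums become maxima and every function is renormalised so that its supremum equals one. The state space, kernels and observation function below are made up for illustration; the paper works with a general class of outer measures rather than this particular possibilistic special case.

```python
# Discrete-state sketch of a supremum-based (possibilistic) filter recursion.
import numpy as np

def normalise(phi):
    """Rescale a positive function so that its supremum equals 1."""
    return phi / phi.max()

def predict(prior, transition):
    """Prediction: supremum over previous states of transition * prior."""
    # transition[i, j] = possibility of moving from state i to state j
    return normalise(np.max(transition * prior[:, None], axis=0))

def update(predicted, likelihood):
    """Update: pointwise product with the observation possibility, renormalised."""
    return normalise(predicted * likelihood)

# Three hidden states, a vague prior, and an observation favouring state 2.
prior = normalise(np.array([1.0, 0.8, 0.6]))
transition = np.array([[1.0, 0.5, 0.1],
                       [0.5, 1.0, 0.5],
                       [0.1, 0.5, 1.0]])
likelihood = np.array([0.2, 0.6, 1.0])
posterior = update(predict(prior, transition), likelihood)
```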

    Valuing information from mesoscale forecasts

    The development of meso-gamma-scale numerical weather prediction (NWP) models requires a substantial investment in research, development and computational resources. Traditional objective verification of deterministic model output fails to demonstrate the added value of high-resolution forecasts made by such models. It is generally accepted from subjective verification that these models nevertheless have predictive potential for small-scale weather phenomena and extreme weather events. This has prompted an extensive body of research into new verification techniques and scores aimed at developing mesoscale performance measures that objectively demonstrate the return on investment in meso-gamma NWP. In this article it is argued that the evaluation of the information in mesoscale forecasts should be tied to the method used to extract that information from the direct model output (DMO). This could be an evaluation by a forecaster but, given the probabilistic nature of small-scale weather, is more likely to be a form of statistical post-processing. Using model output statistics (MOS) and traditional verification scores, the potential of this approach is demonstrated on both an educational abstraction and a real-world example. The MOS approach in this article incorporates concepts from fuzzy verification; it objectively weighs different forecast quality measures and is, as such, an essential extension of fuzzy methods.
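    As a toy version of the post-processing step the article builds on, the snippet below fits a simple MOS-style linear correction of raw model output against observations and compares the verification score before and after. The synthetic bias, the predictor choice and the train/test split are illustrative assumptions, not the article's actual setup.

```python
# Schematic MOS-style post-processing of direct model output (DMO).
import numpy as np

def fit_mos(dmo, obs):
    """Fit obs = a * dmo + b by ordinary least squares."""
    A = np.column_stack([dmo, np.ones_like(dmo)])
    coef, *_ = np.linalg.lstsq(A, obs, rcond=None)
    return coef  # (slope, intercept)

def apply_mos(dmo, coef):
    return coef[0] * dmo + coef[1]

def rmse(pred, obs):
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

# Synthetic data standing in for raw forecasts with a systematic warm bias.
rng = np.random.default_rng(1)
obs = rng.normal(10.0, 3.0, 500)
dmo = obs + 1.5 + rng.normal(0.0, 1.0, 500)   # biased, noisy direct model output
coef = fit_mos(dmo[:400], obs[:400])          # train on the first 400 cases
print(rmse(dmo[400:], obs[400:]),             # raw DMO error
      rmse(apply_mos(dmo[400:], coef), obs[400:]))  # error after MOS correction
```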

    A Formal Approach based on Fuzzy Logic for the Specification of Component-Based Interactive Systems

    Formal methods are widely recognized as a powerful engineering method for the specification, simulation, development, and verification of distributed interactive systems. However, most formal methods rely on a two-valued logic and are therefore limited to the axioms of that logic: a specification is valid or invalid, component behavior is realizable or not, safety properties hold or are violated, systems are available or unavailable. Especially when the problem domain entails uncertainty, imprecision, and vagueness, the application of such methods becomes a challenging task. In order to overcome the limitations resulting from the strict modus operandi of formal methods, the main objective of this work is to relax the boolean notion of formal specifications by using fuzzy logic. The present approach is based on the Focus theory, a model-based and strictly formal method for component-based interactive systems. The contribution of this work is twofold: (i) we introduce a specification technique based on fuzzy logic which can be used on top of Focus to develop formal specifications in a qualitative fashion; (ii) we partially extend the Focus theory to a fuzzy one, which allows the specification of fuzzy components and fuzzy interactions. While the former provides a methodology for approximating I/O behaviors under imprecision, the latter makes it possible to capture a more quantitative view of specification properties such as realizability. (Comment: In Proceedings FESCA 2015, arXiv:1503.0437)
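    The shift from boolean to graded satisfaction can be illustrated independently of the Focus formalism: a specification property is evaluated to a degree in [0, 1] over an observed I/O trace, with fuzzy conjunction taken as the minimum. The latency and set-point predicates below are hypothetical examples, not part of the proposed specification technique.

```python
# Evaluating a specification property to a degree instead of true/false.
def degree_fast(latency_ms, full_at=50.0, zero_at=200.0):
    """1.0 up to `full_at` ms, 0.0 beyond `zero_at` ms, linear in between."""
    if latency_ms <= full_at:
        return 1.0
    if latency_ms >= zero_at:
        return 0.0
    return (zero_at - latency_ms) / (zero_at - full_at)

def degree_close(output, setpoint, tolerance=5.0):
    """Degree of closeness to the set-point, decaying linearly with the error."""
    return max(0.0, 1.0 - abs(output - setpoint) / tolerance)

def spec_satisfaction(trace, setpoint):
    """Fuzzy conjunction (min) of 'fast AND close' over all observed I/O events."""
    return min(min(degree_fast(lat), degree_close(out, setpoint))
               for lat, out in trace)

trace = [(40.0, 99.0), (120.0, 101.5), (80.0, 100.2)]  # (latency, output) pairs
print(spec_satisfaction(trace, setpoint=100.0))        # a degree, not a boolean
```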

    Fuzzy-based Propagation of Prior Knowledge to Improve Large-Scale Image Analysis Pipelines

    Many automatically analyzable scientific questions are well-posed and offer a variety of a priori information about the expected outcome. Although often neglected, this prior knowledge can be systematically exploited to make automated analysis operations sensitive to a desired phenomenon or to evaluate extracted content with respect to this prior knowledge. For instance, the performance of processing operators can be greatly enhanced by a more focused detection strategy and by direct information about the ambiguity inherent in the extracted data. We present a new concept for the estimation and propagation of the uncertainty involved in image analysis operators. This allows the use of simple processing operators that are suitable for analyzing large-scale 3D+t microscopy images without compromising result quality. On the foundation of fuzzy set theory, we transform available prior knowledge into a mathematical representation and use it extensively to enhance the result quality of various processing operators. All presented concepts are illustrated on a typical bioimage analysis pipeline comprising seed point detection, segmentation, multiview fusion and tracking. Furthermore, the functionality of the proposed approach is validated on a comprehensive simulated 3D+t benchmark data set that mimics embryonic development and on large-scale light-sheet microscopy data of a zebrafish embryo. The general concept introduced in this contribution represents a new approach to efficiently exploiting prior knowledge to improve the result quality of image analysis pipelines. In particular, the automated analysis of terabyte-scale microscopy data will benefit from sophisticated and efficient algorithms that enable a quantitative and fast readout. The generality of the concept, however, makes it applicable to practically any other field with processing strategies arranged as linear pipelines. (Comment: 39 pages, 12 figures)
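    One ingredient of the approach, attaching a fuzzy confidence to each detection from prior knowledge and propagating it along the pipeline, can be sketched as follows. The size prior, the SNR-based detector confidence and the product combination are illustrative assumptions rather than the operators used in the actual pipeline.

```python
# Illustrative fuzzy weighting of detections by prior knowledge.
import numpy as np

def membership_expected_size(size, expected=30.0, spread=15.0):
    """Triangular membership around a hypothetical expected object size."""
    return float(max(0.0, 1.0 - abs(size - expected) / spread))

def detection_confidence(intensity_snr):
    """Detector-level confidence, squashed to [0, 1]."""
    return float(1.0 - np.exp(-intensity_snr))

def fused_confidence(sizes, snrs):
    """Per-object confidence after combining detector output and the size prior."""
    return [detection_confidence(snr) * membership_expected_size(size)
            for size, snr in zip(sizes, snrs)]

# An implausibly large object is down-weighted despite a good detector response.
print(fused_confidence(sizes=[28.0, 70.0], snrs=[3.0, 3.0]))
```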

    A metric to represent the evolution of CAD/analysis models in collaborative design

    Computer Aided Design (CAD) and Computer Aided Engineering (CAE) models are often used during product design. The various interactions between the different models must be managed for the designed system to be robust and in accordance with the initially defined specifications. Research published to date has, for example, considered the link between the digital mock-up and analysis models. However, design/analysis integration must take into account the large number of models (digital mock-up and simulation) arising from model evolution over time, as well as systems engineering considerations. To effectively manage modifications made to the system, the dependencies between the different models must be known and the nature of each modification must be characterised in order to estimate its impact throughout the dependent models. We propose a technique to describe the nature of a modification, which may be used to determine its consequences for other models, as well as a way to qualify the modified information. To achieve this, a metric is proposed that allows the qualification and evaluation of data or information, based on the maturity and validity of the information and models.
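    The kind of metric described above can be sketched as a qualification score attached to each piece of model information, derived from its maturity and validity levels. The weighted-sum aggregation and the example values below are hypothetical; the paper's actual metric may be defined differently.

```python
# Hypothetical qualification score for a piece of model information.
from dataclasses import dataclass

@dataclass
class ModelInformation:
    name: str
    maturity: float   # 0.0 = preliminary, 1.0 = frozen/released
    validity: float   # 0.0 = invalidated, 1.0 = fully validated against tests

def qualification(info: ModelInformation, w_maturity: float = 0.5) -> float:
    """Weighted score used to judge how reliable the information is when propagating a change."""
    return w_maturity * info.maturity + (1.0 - w_maturity) * info.validity

geometry = ModelInformation("wing_panel_thickness", maturity=0.8, validity=0.4)
print(qualification(geometry))  # low validity lowers the score despite high maturity
```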