
    Numerical modelling of ground-tunnel support interaction using bedded-beam-spring model with fuzzy parameters

    The study of ground-tunnel interaction by introducing a predetermined degree of variation (fuzziness) in some parameters of the chosen model is presented and discussed. The research is motivated by the observation that tunnel model parameters and geometry are usually affected by a degree of uncertainty, mainly due to construction imprecision and the great variability of rock mass properties. The research has been developed using fuzzy set theory, assuming that three model parameters are affected by a certain amount of uncertainty (defined by so-called membership functions). The response of the numerical model is calculated by solving the fuzzy equations for different shapes of the membership functions. To investigate the effects of selected model parameters, and to provide designers with a simple procedure and tool, a fuzzy study of the effect of tunnel boundary conditions has been carried out using a simple but well-known and widely used design method, the bedded-beam-spring model.
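The core idea above, evaluating a model response over intervals of a fuzzy parameter at each membership level, can be sketched as follows. This is not the paper's model: the triangular membership function, the parameter values, and the toy response formula u = load / k are all assumed purely for illustration.

```python
# Illustrative sketch only: propagating a triangular fuzzy parameter
# through a simple response formula via alpha-cuts.

def tri_alpha_cut(low, peak, high, alpha):
    """Interval of a triangular fuzzy number at membership level alpha."""
    return (low + alpha * (peak - low), high - alpha * (high - peak))

def response_interval(k_cut, load=100.0):
    """Toy displacement u = load / k over an interval of stiffness k.

    Since u decreases as k grows, the interval endpoints swap.
    """
    k_lo, k_hi = k_cut
    return (load / k_hi, load / k_lo)

# Hypothetical fuzzy spring stiffness: "about 50", spread 40..70.
for alpha in (0.0, 0.5, 1.0):
    cut = tri_alpha_cut(40.0, 50.0, 70.0, alpha)
    print(alpha, response_interval(cut))
```

At membership level 1 the interval collapses to the crisp value, so the fuzzy response reduces to the conventional deterministic result; lower alpha levels widen the response interval, showing how parameter uncertainty propagates to the model output.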

    Semantic Information G Theory and Logical Bayesian Inference for Machine Learning

    An important problem with machine learning is that when the label number n > 2, it is very difficult to construct and optimize a group of learning functions, and we wish that optimized learning functions remain useful when the prior distribution P(x) (where x is an instance) is changed. To resolve this problem, the semantic information G theory, Logical Bayesian Inference (LBI), and a group of Channel Matching (CM) algorithms together form a systematic solution. A semantic channel in the G theory consists of a group of truth functions or membership functions. In comparison with the likelihood functions, Bayesian posteriors, and logistic functions used by popular methods, membership functions can be more conveniently used as learning functions without the above problem. In LBI, every label's learning is independent. For multilabel learning, we can directly obtain a group of optimized membership functions from a sufficiently large labeled sample, without preparing different samples for different labels. A group of CM algorithms is developed for machine learning. For the Maximum Mutual Information (MMI) classification of three classes with Gaussian distributions on a two-dimensional feature space, 2-3 iterations can make the mutual information between three classes and three labels surpass 99% of the MMI for most initial partitions. For mixture models, the Expectation-Maximization (EM) algorithm is improved into the CM-EM algorithm, which can outperform the EM algorithm when mixture ratios are imbalanced or local convergence exists. The CM iteration algorithm needs to be combined with neural networks for MMI classification on high-dimensional feature spaces. LBI needs further study toward the unification of statistics and logic.
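The claim that every label's membership function can be learned independently from one labeled sample can be illustrated with a minimal sketch. The normalization used here (scaling the empirical posterior P(y|x) so its peak is 1) is an assumed stand-in for illustration, not the paper's exact formula, and the toy data are invented.

```python
# Illustrative sketch (assumed normalization, toy data): learning one
# membership function per label, independently, from a single labeled sample.
from collections import Counter

sample = [  # (instance x, label y) pairs
    (0, "small"), (0, "small"), (1, "small"), (1, "medium"),
    (2, "medium"), (2, "medium"), (2, "large"), (3, "large"), (3, "large"),
]

count_xy = Counter(sample)
count_x = Counter(x for x, _ in sample)
labels = {y for _, y in sample}

def posterior(y, x):
    """Empirical P(y | x) from the labeled sample."""
    return count_xy[(x, y)] / count_x[x]

# One membership function per label: P(y|x) scaled to a peak of 1.
# Each label uses the same sample, so no per-label sample is needed.
membership = {}
for y in labels:
    post = {x: posterior(y, x) for x in count_x}
    peak = max(post.values())
    membership[y] = {x: p / peak for x, p in post.items()}
```

Because the peak scaling makes each membership function insensitive to an overall rescaling of P(y|x), the learned functions remain usable when the instance prior P(x) shifts, which is the portability property the abstract highlights.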

    Mass assignment fuzzy ID3 with applications
