
    Analysing imperfect temporal information in GIS using the Triangular Model

    Rough sets and fuzzy sets are two frequently used approaches for modelling and reasoning about imperfect time intervals. In this paper, we focus on imperfect time intervals that can be modelled by rough sets and use an innovative graphic model, the triangular model (TM), to represent this kind of imperfect time interval. This work shows that the TM is potentially advantageous for visualising and querying imperfect time intervals, and that its analytical power can be better exploited when it is implemented in a computer application with a graphical user interface and interactive functions. Moreover, a probabilistic framework is proposed to handle uncertainty in temporal queries. We use a case study to illustrate how the unique insights gained through the TM can assist a geographical information system in exploratory spatio-temporal analysis.
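A minimal sketch of the TM idea described in the abstract: each interval [a, b] is plotted as a single apex point (midpoint, duration), and a rough time interval occupies the region between the apex of its lower approximation and that of its upper approximation. Function names and the dictionary layout are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch: mapping crisp and rough time intervals into the
# Triangular Model (TM), where an interval [a, b] is represented by
# the apex point (midpoint, duration).

def tm_point(start, end):
    """Map the interval [start, end] to its TM coordinates."""
    return ((start + end) / 2.0, end - start)

def rough_interval_region(lower, upper):
    """A rough time interval is bounded by a lower approximation
    (certainly inside) and an upper approximation (possibly inside);
    in the TM it spans the region between the two apex points."""
    return {"certain": tm_point(*lower), "possible": tm_point(*upper)}

region = rough_interval_region(lower=(2, 5), upper=(1, 7))
print(region)  # {'certain': (3.5, 3.0), 'possible': (4.0, 6.0)}
```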

    Neuro-fuzzy chip to handle complex tasks with analog performance

    This paper presents a mixed-signal neuro-fuzzy controller chip which, in terms of power consumption, input-output delay, and precision, performs as a fully analog implementation, yet has much larger complexity than its purely analog counterparts. This combination of performance and complexity is achieved through a mixed-signal architecture consisting of a programmable analog core of reduced complexity, together with a strategy, and the associated mixed-signal circuitry, to cover the whole input space through dynamic programming of this core. Since errors and delays are proportional to the reduced number of fuzzy rules included in the analog core, they are much smaller than when the whole rule set is implemented by analog circuitry. The area and power consumption of the new architecture are also smaller than those of its purely analog counterparts, simply because most rules are implemented through programming. The paper presents a set of building blocks associated with this architecture and gives results for an exemplary prototype. This prototype, called the multiplexing fuzzy controller (MFCON), has been realised in a 0.7 um standard CMOS technology. It has two inputs, implements 64 rules, and features 500 ns of input-to-output delay with 16 mW of power consumption. Results from the chip in a control application with a DC motor are also provided.
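The multiplexing idea behind MFCON can be sketched in software: rather than wiring all 64 rules into analog circuitry, only the small subset of rules whose membership functions overlap the current input is loaded into the reduced core and evaluated. The triangular memberships, the zero-order Takagi-Sugeno defuzzification, and all names below are illustrative assumptions, not the chip's actual rule base.

```python
# Hedged sketch of dynamic rule multiplexing: select the active
# rule subset (the "reduced core") for the current input, then
# defuzzify by weighted average.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def active_rules(rules, x1, x2):
    """Keep only rules with non-zero activation for this input."""
    fired = []
    for m1, m2, consequent in rules:
        w = min(tri(x1, *m1), tri(x2, *m2))
        if w > 0.0:
            fired.append((w, consequent))
    return fired

def infer(rules, x1, x2):
    """Weighted-average (zero-order Takagi-Sugeno) output."""
    fired = active_rules(rules, x1, x2)
    total = sum(w for w, _ in fired)
    return sum(w * c for w, c in fired) / total if total else 0.0

# Two-input rules: ((a, b, c) for input 1, (a, b, c) for input 2, output)
rules = [((0, 1, 2), (0, 1, 2), 10.0),
         ((1, 2, 3), (1, 2, 3), 20.0)]
print(infer(rules, 1.0, 1.0))  # 10.0: only the first rule fires
```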


    Multivariate adaptive regression splines for estimating riverine constituent concentrations

    Regression-based methods are commonly used for estimating riverine constituent concentrations and fluxes, which is essential for guiding water-quality protection practices and environmental decision making. This paper develops a multivariate adaptive regression splines model for estimating riverine constituent concentrations (MARS-EC). The process, interpretability, and flexibility of the MARS-EC modelling approach are demonstrated for total nitrogen in the Patuxent River, a major river input to Chesapeake Bay. Model accuracy and uncertainty of the MARS-EC approach are further analysed using nitrate-plus-nitrite datasets from eight tributary rivers to Chesapeake Bay. Results show that the MARS-EC approach integrates the advantages of both parametric and nonparametric regression methods, and its accuracy is demonstrated to be superior to that of the traditionally used ESTIMATOR model. MARS-EC is flexible and allows consideration of auxiliary variables; the variables and interactions can be selected automatically. MARS-EC does not constrain concentration-predictor curves to be constant but rather can identify shifts in these curves from mathematical expressions and visual graphics. The MARS-EC approach provides an effective and complementary tool alongside existing approaches for estimating riverine constituent concentrations.
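The slope shifts that MARS can capture come from its hinge basis functions, max(0, x - t) and max(0, t - x), which let a fitted concentration-predictor curve change slope at data-driven knots. The fixed knot and two-term least-squares fit below are a simplified illustration of that basis, not the MARS-EC model itself.

```python
# Hedged sketch of the MARS hinge basis: fit piecewise-linear data
# with an intercept plus a mirrored pair of hinge functions.
import numpy as np

def hinge_basis(x, knot):
    """Mirrored hinge pair at a knot: max(0, x-t) and max(0, t-x)."""
    return np.column_stack([np.maximum(0.0, x - knot),
                            np.maximum(0.0, knot - x)])

# Synthetic data whose slope changes at x = 5
x = np.linspace(0.0, 10.0, 50)
y = np.where(x < 5, 2.0 * x, 10.0 + 0.5 * (x - 5))

X = np.column_stack([np.ones_like(x), hinge_basis(x, knot=5.0)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 6))  # intercept 10, right slope 0.5, left slope -2
```

A full MARS fit would additionally search over candidate knots and variables, pruning terms automatically; this sketch only shows why hinge terms can represent a shift in the curve.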

    Data granulation by the principles of uncertainty

    Research in granular modelling has produced a variety of mathematical models, such as intervals, (higher-order) fuzzy sets, rough sets, and shadowed sets, all suitable for characterising so-called information granules. Modelling the uncertainty of the input data is recognised as a crucial aspect of information granulation. Moreover, uncertainty is a well-studied concept in many mathematical settings, such as probability theory, fuzzy set theory, and possibility theory. This suggests that an appropriate quantification of the uncertainty expressed by the information granule model could be used to define an invariant property, to be exploited in practical situations of information granulation. From this perspective, a procedure of information granulation is effective if the uncertainty conveyed by the synthesised information granule is in a monotonically increasing relation with the uncertainty of the input data. In this paper, we present a data granulation framework that elaborates on the principles of uncertainty introduced by Klir. Since uncertainty is a mesoscopic descriptor of systems and data, these principles can be applied regardless of the input data type and of the specific mathematical setting adopted for the information granules. The proposed framework is conceived (i) to offer a guideline for the synthesis of information granules and (ii) to build a groundwork for comparing and quantitatively judging different data granulation procedures. As a case study, we introduce a new data granulation technique based on the minimum sum of distances, designed to generate type-2 fuzzy sets. We analyse the procedure through experiments on two distinct data types: feature vectors and labelled graphs. Results show that the uncertainty of the input data is suitably conveyed by the generated type-2 fuzzy set models.
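Two ingredients from this abstract can be sketched concretely: a prototype chosen by minimum sum of distances (a medoid), and the monotonicity requirement that a more spread-out input yields a more uncertain granule. The interval granule with width as its uncertainty measure is an assumed stand-in for the Klir-style measures and type-2 fuzzy sets used in the paper.

```python
# Hedged sketch: medoid selection and an uncertainty-monotonicity
# check for the simplest possible granule (an enclosing interval).

def medoid(points, dist):
    """Element with minimum sum of distances to all other points."""
    return min(points, key=lambda p: sum(dist(p, q) for q in points))

def interval_granule(points):
    """Enclosing-interval granule; its width serves as a crude
    uncertainty measure (illustrative, not Klir's measures)."""
    lo, hi = min(points), max(points)
    return (lo, hi), hi - lo

data_tight = [4.0, 5.0, 6.0]
data_wide = [1.0, 5.0, 9.0]

print(medoid(data_wide, lambda a, b: abs(a - b)))  # 5.0
_, u_tight = interval_granule(data_tight)
_, u_wide = interval_granule(data_wide)
print(u_tight < u_wide)  # True: wider input data -> more uncertain granule
```

Because `dist` is passed in as a function, the same medoid routine applies to feature vectors or labelled graphs alike, which mirrors the framework's claim of independence from the input data type.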

    Analysing the Creative Process through a Modelling of Tools and Methods for Composition in Hans Tutschku’s Entwurzelt

    The analysis of the creative processes involved in electroacoustic music may to a large extent rely on the thorough study of the technological tools used in the realisation of a musical work, on both the composition and the performance sides. Understanding the behaviour and the potential range of aesthetic results of such tools enables the musicologist to approach the studied work well beyond its final form, as presented on tape or as performed on a particular occasion: knowledge of the wider technological context makes it possible to consider the actual artistic decisions in light of the potential outcomes that the composer and performer could have faced but did not necessarily adopt. Hence, analysing an electroacoustic work on the basis of its creative context, technological tools, and compositional methods may constitute a useful approach to a better understanding of its related creative processes. However, implementing such an approach, based mainly on the hardware or software elements used during the creation of a given work, is not straightforward. First, it implies that the considered technologies are still in use and have not become irreversibly obsolete. In this respect, new performances of a work are good opportunities for such investigations, as they often involve a technical update and require a deep understanding of the composer's intentions. The musicologist also needs access to the resources, which may not be available without direct contact with the composer. Assuming these conditions are met, the musicological and organological studies can encounter another issue, particularly in the digital domain: the sources are not always presented in forms that are directly readable by the analyst, for instance when written in a specific programming language.
Despite all these possible difficulties, many technological tools lend themselves to in-depth investigation, leading to relevant conclusions about some of the creative processes at work in the field of electroacoustic music. In the context of a joint session of several analytical approaches to the same electroacoustic piece, Hans Tutschku's Entwurzelt for six singers and electronics (2012), this article focuses on the investigation and modelling of the tools and methods used in the compositional stage of the realisation of the work. During a performance of Entwurzelt, the electronic materials are simply triggered as events by one of the singers, without further interactivity; thus, the essential part of the research on the electroacoustic realisation aims at exploring the processes used during the compositional stage itself. As the electronics serve as an extension of the live vocal expression by means of harmonic amplification and complex texturing, the tools for the generation and processing of both symbolic representations and audio are explored. Since the software tools that constitute the primary sources for our research were not designed to be used beyond their creative purposes, this article presents software modelling implemented by the two authors to demonstrate the technological context in which Tutschku could compose Entwurzelt, emphasising his creative methods and the decisions he could make from a wider range of possible materials and processing techniques.