6,044 research outputs found

    On the role of pre and post-processing in environmental data mining

    The quality of discovered knowledge depends strongly on data quality. Unfortunately, real data tend to contain noise, uncertainty, errors, redundancies or even irrelevant information. The more complex the reality to be analyzed, the higher the risk of obtaining low-quality data. Knowledge Discovery from Databases (KDD) offers a global framework for preparing data in the right form to perform correct analyses. On the other hand, the quality of decisions taken upon KDD results depends not only on the quality of the results themselves, but also on the capacity of the system to communicate those results in an understandable form. Environmental systems are particularly complex, and environmental users particularly require clarity in their results. In this paper some details about how this can be achieved are provided, and the role of pre- and post-processing in the whole process of Knowledge Discovery in environmental systems is discussed.
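    As a concrete illustration of the pre-processing step discussed above, the sketch below cleans a small, hypothetical environmental dataset; the column names, values, and plausibility range are assumptions for illustration, not taken from the paper. Duplicates are removed, implausible readings are masked as noise, and the remaining gaps are imputed.

```python
# Minimal pre-processing sketch in the spirit of the KDD data-preparation
# step described above; all column names and thresholds are illustrative
# assumptions, not the paper's data.
import numpy as np
import pandas as pd

# Hypothetical sensor readings with typical quality problems: a duplicate
# record, a missing value, and an implausible spike.
raw = pd.DataFrame({
    "station":  ["A", "A", "B", "B", "C"],
    "no2_ugm3": [21.0, 21.0, np.nan, 900.0, 18.2],   # 900 is an implausible spike
    "temp_c":   [12.1, 12.1, 14.0, 13.9, np.nan],
})

def preprocess(df: pd.DataFrame, valid_range=(0.0, 400.0)) -> pd.DataFrame:
    """Remove redundancies, mask implausible readings, and impute the gaps."""
    df = df.drop_duplicates()                              # redundancy removal
    mask = df["no2_ugm3"].between(*valid_range)            # domain plausibility filter
    df = df.assign(no2_ugm3=df["no2_ugm3"].where(mask))    # treat spikes as missing
    return df.fillna(df.mean(numeric_only=True))           # simple mean imputation

print(preprocess(raw))
```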

    What is Computational Intelligence and where is it going?

    What is Computational Intelligence (CI) and what are its relations with Artificial Intelligence (AI)? A brief survey of the scope of CI journals and books with "computational intelligence" in their title shows that at present it is an umbrella for three core technologies (neural, fuzzy and evolutionary), their applications, and selected fashionable pattern recognition methods. At present CI has no comprehensive foundations and is more a bag of tricks than a solid branch of science. A change of focus from methods to challenging problems is advocated, with CI defined as the part of computer and engineering sciences devoted to the solution of non-algorithmizable problems. In this view AI is a part of CI focused on problems related to higher cognitive functions, while the rest of the CI community works on problems related to perception and control, or lower cognitive functions. Grand challenges on both sides of this spectrum are addressed.

    Study of Discrete Choice Models and Adaptive Neuro-Fuzzy Inference System in the Prediction of Economic Crisis Periods in USA

    In this study two approaches are applied to the prediction of economic recession or expansion periods in the USA. The first approach includes Logit and Probit models, and the second is an Adaptive Neuro-Fuzzy Inference System (ANFIS) with Gaussian and Generalized Bell membership functions. The in-sample period 1950-2006 is examined and the forecasting performance of the two approaches is evaluated over the out-of-sample period 2007-2010. The estimation results show that the ANFIS model outperforms the Logit and Probit models. This indicates that the neuro-fuzzy model provides a better and more reliable signal on whether or not a financial crisis will take place.
    Keywords: ANFIS, Discrete Choice Models, Error Back-propagation, Financial Crisis, Fuzzy Logic, US Economy
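    For the discrete-choice side of the comparison, a minimal sketch of fitting Logit and Probit models to a binary recession indicator is given below, using synthetic data; the predictor variables, coefficients, and sample size are illustrative assumptions, not the paper's dataset or specification.

```python
# Hedged sketch of Logit/Probit recession models; the synthetic predictors
# (e.g. a yield-curve spread and GDP growth) are assumptions for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([rng.normal(1.0, 1.0, n),    # hypothetical yield-curve spread
                     rng.normal(2.0, 1.5, n)])   # hypothetical GDP growth
# Synthetic recession indicator: more likely when spread and growth are low.
p = 1.0 / (1.0 + np.exp(1.5 * X[:, 0] + 0.8 * X[:, 1] - 2.0))
y = rng.binomial(1, p)

X_const = sm.add_constant(X)
logit_res = sm.Logit(y, X_const).fit(disp=False)
probit_res = sm.Probit(y, X_const).fit(disp=False)

# In-sample fit shown here; an out-of-sample evaluation (as in the paper's
# 2007-2010 window) would score predictions on held-out periods instead.
print(logit_res.params)
print(probit_res.params)
```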

    Neurocognitive Informatics Manifesto.

    Informatics studies all aspects of the structure of natural and artificial information systems. Theoretical and abstract approaches to information have made great advances, but human information processing is still unmatched in many areas, including information management, representation and understanding. Neurocognitive informatics is a new, emerging field that should help to improve the matching of artificial and natural systems, and inspire better computational algorithms to solve problems that are still beyond the reach of machines. In this position paper, examples of neurocognitive inspirations and promising directions in this area are given.

    Uncertainty Analysis of the Adequacy Assessment Model of a Distributed Generation System

    Due to the inherent aleatory uncertainties in renewable generators, reliability/adequacy assessments of distributed generation (DG) systems have focused in particular on the probabilistic modeling of random behaviors, given sufficiently informative data. However, another type of uncertainty (epistemic uncertainty) must be accounted for in the modeling, due to incomplete knowledge of the phenomena and imprecise evaluation of the related characteristic parameters. In circumstances of few informative data, this type of uncertainty calls for alternative methods of representation, propagation, analysis and interpretation. In this study, we make a first attempt to identify, model, and jointly propagate aleatory and epistemic uncertainties in the context of DG system modeling for adequacy assessment. Probability and possibility distributions are used to model the aleatory and epistemic uncertainties, respectively. Evidence theory is used to incorporate the two uncertainties under a single framework. Based on the plausibility and belief functions of evidence theory, a hybrid propagation approach is introduced. A demonstration is given on a DG system adapted from the IEEE 34-node distribution test feeder. Compared to the pure probabilistic approach, it is shown that the hybrid propagation is capable of explicitly expressing the imprecision in the knowledge of the DG parameters in the final adequacy values assessed. It also effectively captures the growth of uncertainty with higher DG penetration levels.
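    A minimal sketch of the kind of hybrid propagation described above is given below, under assumed toy quantities: Monte Carlo sampling represents the aleatory variable (probabilistic demand), alpha-cuts of a triangular possibility distribution represent the epistemic variable (imprecise renewable capacity), and the resulting focal intervals are aggregated into belief and plausibility bounds on an adequacy event. The adequacy model, distributions, and numbers are illustrative assumptions, not the paper's IEEE 34-node case study.

```python
# Hedged sketch of joint aleatory/epistemic propagation with belief and
# plausibility bounds; the toy adequacy model g() and all parameters are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

def g(load, capacity):
    """Toy adequacy margin: available capacity minus demand (MW)."""
    return capacity - load

def tri_alpha_cut(a, m, b, alpha):
    """Alpha-cut [lo, hi] of a triangular possibility distribution (a, m, b)."""
    return a + alpha * (m - a), b - alpha * (b - m)

n_mc = 2000
alphas = np.linspace(0.05, 1.0, 20)        # discretized possibility levels, equal mass each
loads = rng.normal(10.0, 1.5, n_mc)        # aleatory: demand, sampled probabilistically

# Each (Monte Carlo sample, alpha-cut) pair yields a focal interval of the margin.
focal_lo, focal_hi = [], []
for load in loads:
    for a in alphas:
        c_lo, c_hi = tri_alpha_cut(8.0, 11.0, 13.0, a)   # epistemic: imprecise capacity
        focal_lo.append(g(load, c_lo))                   # g is increasing in capacity
        focal_hi.append(g(load, c_hi))
focal_lo, focal_hi = np.array(focal_lo), np.array(focal_hi)

# Evidence-theory bounds on the adequacy event {margin >= 0}.
belief = np.mean(focal_lo >= 0.0)          # focal intervals entirely inside the event
plausibility = np.mean(focal_hi >= 0.0)    # focal intervals intersecting the event
print(f"Bel = {belief:.3f} <= P(adequate) <= Pl = {plausibility:.3f}")
```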

    Building Fuzzy Elevation Maps from a Ground-based 3D Laser Scan for Outdoor Mobile Robots

    Mandow, A.; Cantador, T.J.; Reina, A.J.; Martínez, J.L.; Morales, J.; García-Cerezo, A. "Building Fuzzy Elevation Maps from a Ground-based 3D Laser Scan for Outdoor Mobile Robots," Robot2015: Second Iberian Robotics Conference, Advances in Robotics, Advances in Intelligent Systems and Computing, vol. 418 (2016). This is a self-archiving copy of the author's accepted manuscript; the final publication is available at Springer via http://link.springer.com/book/10.1007/978-3-319-27149-1.
    The paper addresses terrain modeling for mobile robots with fuzzy elevation maps, improving computational speed and performance over previous work on fuzzy terrain identification from a three-dimensional (3D) scan. To this end, spherical sub-sampling of the raw scan is proposed to select training data that does not filter out salient obstacles. Besides, the rule structure is systematically defined by considering triangular sets with an unevenly distributed standard fuzzy partition and zero-order Sugeno-type consequents. This structure, which favors a faster training time and reduces the number of rule parameters, also serves to compute a fuzzy reliability mask for the continuous fuzzy surface. The paper offers a case study using a Hokuyo-based 3D rangefinder to model terrain with and without outstanding obstacles. Performance regarding error and model size compares favorably with a solution that uses quadric-based surface simplification (QSlim).
    This work was partially supported by the Spanish CICYT project DPI 2011-22443, the Andalusian project PE-2010 TEP-6101, and Universidad de Málaga-Andalucía Tech.
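    The sketch below illustrates a zero-order Sugeno fuzzy surface built from triangular sets on a standard fuzzy partition and fitted to synthetic scan points; the evenly spaced grid, the least-squares fit, and all numbers are illustrative assumptions rather than the identification procedure or data used in the paper.

```python
# Hedged sketch of a continuous fuzzy elevation surface: triangular sets on a
# standard partition, zero-order Sugeno consequents fitted by least squares.
import numpy as np

def tri_mf(v, centers, j):
    """Triangular membership of v in the j-th set of a standard fuzzy partition."""
    c = centers
    if j == 0:                                       # left shoulder
        return np.interp(v, [c[0], c[1]], [1.0, 0.0])
    if j == len(c) - 1:                              # right shoulder
        return np.interp(v, [c[-2], c[-1]], [0.0, 1.0])
    return np.interp(v, [c[j - 1], c[j], c[j + 1]], [0.0, 1.0, 0.0])

def firing_matrix(xy, xc, yc):
    """Rule firing strengths (rows: points, cols: grid rules) via product t-norm."""
    mx = np.stack([tri_mf(xy[:, 0], xc, i) for i in range(len(xc))], axis=1)
    my = np.stack([tri_mf(xy[:, 1], yc, j) for j in range(len(yc))], axis=1)
    return (mx[:, :, None] * my[:, None, :]).reshape(len(xy), -1)

# Synthetic "scan" points over an undulating terrain patch (illustrative only).
rng = np.random.default_rng(2)
pts = rng.uniform(0.0, 10.0, size=(500, 2))
elev = 0.3 * np.sin(pts[:, 0]) + 0.1 * pts[:, 1] + rng.normal(0.0, 0.02, 500)

xc = yc = np.linspace(0.0, 10.0, 6)                  # evenly spaced here for simplicity
W = firing_matrix(pts, xc, yc)
z, *_ = np.linalg.lstsq(W, elev, rcond=None)         # zero-order Sugeno consequents

def elevation(x, y):
    """Continuous fuzzy surface: weighted average of rule consequents."""
    w = firing_matrix(np.array([[x, y]]), xc, yc)
    return float((w @ z)[0] / w.sum())

print(elevation(3.0, 4.0))
```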