
    Evolving granular systems (Sistemas granulares evolutivos)

    Advisor: Fernando Antonio Campos Gomide. Thesis (doctorate), Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação. Abstract: In recent years there has been increasing interest in computational modeling approaches to deal with real-world data streams. Methods and algorithms have been proposed to uncover meaningful knowledge from very large (often unbounded) data sets with, in principle, no apparent value. This thesis introduces a framework for evolving granular modeling of uncertain data streams. Evolving granular systems comprise an array of online modeling approaches inspired by the way humans deal with complexity. These systems explore the information flow in dynamic environments and derive from it models that can be linguistically understood. In particular, information granulation is a natural technique for dispensing with unnecessary detail and emphasizing the transparency, interpretability, and scalability of information systems. Uncertain (granular) data arise from imprecise perception or description of the value of a variable. Broadly stated, various factors can affect one's choice of data representation such that the representing object conveys the meaning of the concept it is used to represent. Of particular concern to this work are numerical, interval, and fuzzy types of granular data, and interval, fuzzy, and neurofuzzy modeling frameworks. Learning in evolving granular systems is based on incremental algorithms that build the model structure from scratch, on a per-sample basis, and adapt model parameters whenever necessary. This learning paradigm is important because it avoids redesigning and retraining models whenever the environment changes. Application examples in classification, function approximation, time-series prediction, and control using real and synthetic data illustrate the usefulness of the proposed granular approaches and framework. The behavior of nonstationary data streams with gradual and abrupt regime shifts is also analyzed within the paradigm of evolving granular computing. We shed light on the role of interval, fuzzy, and neurofuzzy computing in processing uncertain data and providing high-quality approximate solutions and rule summaries of input-output data sets. The approaches and framework introduced constitute a natural extension of evolving intelligent systems over numeric data streams to evolving granular systems over granular data streams. Doctorate in Automation (Doutor em Engenharia Elétrica).
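    The per-sample learning loop described above can be made concrete with a deliberately simplified sketch. The interval-granule class, the width bound rho, and the fit/absorb rules below are illustrative assumptions for this listing, not the thesis' actual algorithms:

```python
# Minimal sketch (not the thesis' exact method) of incremental
# interval-granule learning: granules are axis-aligned intervals that
# are created or expanded one sample at a time, with no retraining.

class IntervalGranule:
    def __init__(self, x, rho):
        self.lower = list(x)   # per-attribute lower bounds
        self.upper = list(x)   # per-attribute upper bounds
        self.rho = rho         # assumed maximum granule width

    def fits(self, x):
        # The sample fits if absorbing it keeps every attribute's
        # interval width within rho.
        return all(min(l, xi) + self.rho >= max(u, xi)
                   for l, u, xi in zip(self.lower, self.upper, x))

    def absorb(self, x):
        self.lower = [min(l, xi) for l, xi in zip(self.lower, x)]
        self.upper = [max(u, xi) for u, xi in zip(self.upper, x)]

def learn_stream(stream, rho=0.3):
    granules = []
    for x in stream:
        g = next((g for g in granules if g.fits(x)), None)
        if g is None:
            granules.append(IntervalGranule(x, rho))  # create structure
        else:
            g.absorb(x)                               # adapt in place
    return granules
```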

    Data-stream driven Fuzzy-granular approaches for system maintenance

    Intelligent systems are now inherent to society, supporting a synergistic human-machine collaboration. Beyond economic and climate factors, energy consumption is strongly affected by the performance of computing systems, and poor software functioning may invalidate any improvement attempt. In addition, data-driven machine learning algorithms are the basis for human-centered applications, and their interpretability is one of the most important features of computational systems. Software maintenance is a critical discipline for supporting automatic, life-long system operation. Since most software registers its internal events by means of logs, log analysis is a natural approach to keeping systems operational. Logs are characterized as Big Data assembled in large-flow streams, being unstructured, heterogeneous, imprecise, and uncertain. This thesis presents fuzzy and neuro-granular methods that provide maintenance solutions for anomaly detection (AD) and log parsing (LP), dealing with data uncertainty and identifying ideal time periods for detailed software analyses; LP additionally provides a deeper semantic interpretation of anomalous occurrences. The solutions evolve over time and are general-purpose, making them highly applicable, scalable, and maintainable. Granular classification models, namely the Fuzzy set-Based evolving Model (FBeM), the evolving Granular Neural Network (eGNN), and the evolving Gaussian Fuzzy Classifier (eGFC), are compared on the AD problem. The evolving Log Parsing (eLP) method is proposed for the automatic parsing of system logs. All the methods employ recursive mechanisms to create, update, merge, and delete information granules according to the data behavior. For the first time in the evolving intelligent systems literature, the proposed method, eLP, is able to process streams of words and sentences. In terms of AD accuracy, FBeM achieved (85.64±3.69)%, eGNN reached (96.17±0.78)%, eGFC obtained (92.48±1.21)%, and eLP reached (96.05±1.04)%. Besides being competitive, eLP additionally generates a log grammar and presents a higher level of model interpretability.
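    The create/update mechanics behind such evolving log parsers can be illustrated with a toy streaming template learner. This is not the eLP algorithm; the token-similarity threshold and the wildcard rule below are assumptions chosen for brevity:

```python
# Simplified sketch of streaming log parsing (illustrative only):
# each incoming log line is matched against known templates of the
# same length; diverging token positions become wildcards.

def parse_stream(lines, threshold=0.5):
    templates = []  # each template is a token list; '<*>' marks a variable
    for line in lines:
        tokens = line.split()
        best, best_sim = None, 0.0
        for t in templates:
            if len(t) != len(tokens):
                continue
            sim = sum(a == b for a, b in zip(t, tokens)) / len(tokens)
            if sim > best_sim:
                best, best_sim = t, sim
        if best is not None and best_sim >= threshold:
            # Update: generalize positions where the tokens disagree
            for i, (a, b) in enumerate(zip(best, tokens)):
                if a != b:
                    best[i] = '<*>'
        else:
            templates.append(tokens)  # create: unseen message structure
    return templates

# Two lines sharing structure collapse into one template:
print(parse_stream(["Connected to 10.0.0.1", "Connected to 10.0.0.2"]))
# [['Connected', 'to', '<*>']]
```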

    Untangling hotel industry’s inefficiency: An SFA approach applied to a renowned Portuguese hotel chain

    This paper explores the technical efficiency of four hotels from the Teixeira Duarte Group, a renowned Portuguese hotel chain. An efficiency ranking of these four hotel units located in Portugal is established using Stochastic Frontier Analysis. This methodology makes it possible to discriminate between measurement error and systematic inefficiency during estimation, enabling investigation of the main causes of inefficiency. Several suggestions for efficiency improvement are offered for each hotel studied.
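    For readers unfamiliar with the method, the decomposition that Stochastic Frontier Analysis relies on can be stated in its standard textbook form; the paper's exact specification (e.g. its distributional assumptions) may differ:

```latex
% Standard stochastic production frontier (Aigner-Lovell-Schmidt form):
%   v_i : symmetric noise term capturing measurement error
%   u_i : non-negative term capturing systematic inefficiency
y_i = x_i^{\top}\beta + v_i - u_i, \qquad
v_i \sim \mathcal{N}(0, \sigma_v^2), \qquad
u_i \sim \mathcal{N}^{+}(0, \sigma_u^2)
```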

    Concepts in Action

    This open access book is a timely contribution presenting recent issues, approaches, and results that are not only central to the highly interdisciplinary field of concept research but also particularly important to newly emergent paradigms and challenges. The contributors present a unique, holistic picture for the understanding and use of concepts from a wide range of fields, including cognitive science, linguistics, philosophy, psychology, artificial intelligence, and computer science. The chapters focus on three distinct points of view that lie at the core of concept research: representation, learning, and application. The contributions combine theoretical, experimental, computational, and applied methods that appeal to students and researchers working in these fields.

    Eighth International Workshop "What can FCA do for Artificial Intelligence?" (FCA4AI at ECAI 2020)

    Proceedings of the 8th International Workshop "What can FCA do for Artificial Intelligence?" (FCA4AI 2020), co-located with the 24th European Conference on Artificial Intelligence (ECAI 2020), Santiago de Compostela, Spain, August 29, 2020.

    New Fundamental Technologies in Data Mining

    The progress of data mining technology and its broad public popularity establish a need for a comprehensive text on the subject. The book series entitled "Data Mining" addresses this need by presenting in-depth descriptions of novel mining algorithms and many useful applications. Beyond helping readers understand each chapter deeply, the two books present useful hints and strategies for solving the problems discussed in subsequent chapters. The contributing authors have highlighted many future research directions that will foster multi-disciplinary collaborations and hence lead to significant development in the field of data mining.

    Computerized cancer malignancy grading of fine needle aspirates

    According to the World Health Organization, breast cancer is a leading cause of death among middle-aged women. Precise diagnosis and correct treatment significantly reduce the high number of deaths caused by breast cancer. Successful treatment relies strictly on the diagnosis, specifically on its accuracy and on the stage at which the cancer is diagnosed. Precise and early diagnosis has a major impact on the survival rate, which indicates how many patients will live after treatment. For many years researchers in the medical and computer science fields have been working together to find approaches for precise diagnosis. For this thesis, precise diagnosis means finding a cancer at as early a stage as possible by developing new computer-aided diagnostic tools. These tools differ depending on the type of cancer and the type of examination used for diagnosis. This work concentrates on cytological images of breast cancer produced during fine needle aspiration biopsy examination. This kind of examination allows pathologists to estimate the malignancy of the cancer with very high accuracy. Malignancy estimation is very important when assessing a patient's survival rate and choosing the type of treatment. To achieve precise malignancy estimation, a classification framework is presented. This framework classifies breast cancer malignancy into two malignancy classes and is based on features calculated according to the Bloom-Richardson grading scheme, which is commonly used by pathologists when grading breast cancer tissue. In the Bloom-Richardson scheme, two types of features are assessed depending on the magnification: low-magnification images are used to examine the dispersion of the cells in the image, while high-magnification images are used for precise analysis of the cells' nuclear features. In this thesis, different segmentation algorithms were compared to find one that allows relatively fast and accurate nuclear segmentation. Based on that segmentation, a set of 34 features was extracted for malignancy classification, and 6 different classifiers were compared. From all of the tests, a set of the best performing features was chosen. The presented system is able to classify images of fine needle aspiration biopsy slides with high accuracy.
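    The classifier-comparison stage admits a compact sketch. The feature matrix X (the 34 extracted features) and label vector y are assumed precomputed; the three scikit-learn models below merely stand in for the six classifiers actually compared, and the 10-fold protocol is an assumption:

```python
# Illustrative sketch of comparing classifiers on precomputed
# Bloom-Richardson features; not the thesis' exact protocol.
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

def compare_classifiers(X, y):
    candidates = {
        "kNN": KNeighborsClassifier(n_neighbors=5),
        "SVM": SVC(kernel="rbf"),
        "Tree": DecisionTreeClassifier(max_depth=5),
    }
    for name, clf in candidates.items():
        # 10-fold cross-validated accuracy for each candidate model
        scores = cross_val_score(clf, X, y, cv=10)
        print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```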

    Acta Polytechnica Hungarica 2014


    New geometric algorithms and data structures for collision detection of dynamically deforming objects

    Any virtual environment that supports interactions between virtual objects and/or between a user and objects needs a collision detection system to handle all interactions in a physically correct or plausible way. A collision detection system determines whether objects are in contact or interpenetrate; interpenetrations are then resolved by a collision handling system. Because objects can interact with each other in nearly all simulations, collision detection is a fundamental technology needed across physically based simulation, robotic path and motion planning, virtual prototyping, and many other areas. Most virtual environments aim to represent the real world as realistically as possible and are therefore becoming more and more complex. Furthermore, all models in a virtual environment should react to applied forces as real objects do: nearly all real-world objects deform or break into their individual parts when forces act upon them. Deformable objects are thus becoming increasingly common in realistic virtual environments, which presents new challenges to the collision detection system. The necessary collision detection computations can be very complex, which makes collision detection the performance bottleneck in most simulations. Most rigid body collision detection approaches use a bounding volume hierarchy (BVH) as an acceleration data structure. This technique is perfectly suitable as long as the object does not change its shape. For a soft body, an update step is necessary to ensure that the underlying acceleration data structure remains valid after each simulation step. This update step can be very time-consuming, is often hard to implement, and in most cases produces a degenerate BVH after some simulation steps if the objects deform significantly. The collision detection approach presented here therefore works entirely without an acceleration data structure and supports both rigid and soft bodies. Furthermore, it can compute inter-object and intra-object collisions of rigid and deformable objects consisting of many tens of thousands of triangles in a few milliseconds. To realize this, the scene is subdivided into parts using a fuzzy clustering approach; all further steps can then be performed for each cluster in parallel and, if desired, distributed to different GPUs. Tests have been performed to judge the performance of our approach against other state-of-the-art collision detection algorithms. Additionally, we integrated our approach into Bullet, a commonly used physics engine, to evaluate our algorithm in dynamic situations. In order to make a fair comparison of different rigid body collision detection algorithms, we propose a new collision detection Benchmarking Suite. It evaluates both performance and the quality of the collision response, and is accordingly subdivided into a Performance Benchmark and a Quality Benchmark. In the future, this approach is to be extended to support soft body collision detection algorithms.
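    The cluster-then-test pipeline can be sketched as follows. The membership formula is the standard fuzzy c-means one, but the fixed cluster centers, the membership threshold tau, and the single-threaded brute-force pairing are simplifications of the thesis' parallel GPU implementation:

```python
# Conceptual sketch of fuzzy-cluster-based collision culling
# (simplified and single-threaded; illustrative only).
import numpy as np

def fuzzy_memberships(points, centers, m=2.0):
    # Standard fuzzy c-means membership degrees for fixed centers:
    # u[i, k] = d_ik^(-2/(m-1)) / sum_j d_ij^(-2/(m-1))
    d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)            # avoid division by zero
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum(axis=1, keepdims=True)   # shape (N, c)

def candidate_pairs(centroids, centers, tau=0.25):
    # A triangle joins every cluster where its membership exceeds tau,
    # so pairs straddling cluster borders are not missed.
    u = fuzzy_memberships(centroids, centers)
    pairs = set()
    for k in range(centers.shape[0]):
        members = np.flatnonzero(u[:, k] > tau)
        # Brute-force candidate pairs inside each cluster; exact
        # triangle-triangle intersection tests would follow here.
        for a in range(len(members)):
            for b in range(a + 1, len(members)):
                pairs.add((members[a], members[b]))
    return pairs
```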