17 research outputs found

    Mathematical methods for recognizing emergency situations under uncertainty

    It has been shown that emergency event recognition systems exhibit several types of uncertainty: incomplete data streams, errors in the data streams, and inappropriate patterns of complex events. An overview of existing approaches to complex event recognition under uncertainty was presented. It was noted that the field of complex event recognition under uncertainty is relatively new, and it was proposed to adapt methods from the field of activity recognition. It was shown that the streams of time-stamped derived events arriving at a complex event recognition system carry a certain degree of uncertainty and ambiguity. Information sources are heterogeneous, with different data structures, schemas, and procedures for responding to corrupted data. Even with perfectly accurate sensors, the domain might be difficult to model precisely, leading to another type of uncertainty. It is therefore important to consider methods for recognizing complex events that can be classified as uncertain, and appropriate model objects were proposed for this purpose. The analysis of the key aspects of building complex event recognition systems capable of working effectively under uncertainty covered stochastic modeling, time representation models, and relational models. Techniques based on automata, probabilistic graphical models, first-order logic, Petri nets, and hidden Petri nets were considered. It is specified that an intermediate stage of the corresponding algorithms should be the construction of a hierarchy of complex objects that are not always clearly defined. A number of limitations were found regarding the syntax, models, and performance used, and these were compared with specific variants of their implementation. An approach was proposed for the transition from a deterministic mathematical apparatus to a system for recognizing complex events under conditions of uncertainty, through the introduction of an event probability function. The developed methodology made it possible to highlight directions for further investigation and to estimate the efficiency of the mathematical methods to be used.
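
    To make the final step concrete, here is a minimal Python sketch (our illustration, not the paper's apparatus) of replacing a deterministic complex-event rule with an event probability function: each derived event in the stream carries a confidence value, and, assuming independence, the probability of the complex event is the product of the probabilities of its matched parts. All names and values are hypothetical.

# Minimal sketch (not from the paper): extending a deterministic
# complex-event rule with an event probability function. Each derived
# event carries a confidence; assuming independence, the probability of
# the complex event is the product of the probabilities of its parts
# (a probability of 1.0 on every event recovers the deterministic case).
from dataclasses import dataclass

@dataclass
class DerivedEvent:
    name: str
    timestamp: float
    probability: float

def complex_event_probability(pattern: list[str],
                              stream: list[DerivedEvent],
                              window: float) -> float:
    """Probability that `pattern` occurred, in order, within `window`
    of its first event (greedy matching; best match over start points)."""
    best = 0.0
    events = sorted(stream, key=lambda e: e.timestamp)
    for i, first in enumerate(events):
        if first.name != pattern[0]:
            continue
        p, t, idx = first.probability, first.timestamp, 1
        for e in events[i + 1:]:
            if idx == len(pattern):
                break
            if e.name == pattern[idx] and e.timestamp - t <= window:
                p *= e.probability
                idx += 1
        if idx == len(pattern):
            best = max(best, p)
    return best

stream = [DerivedEvent("smoke", 0.0, 0.9),
          DerivedEvent("high_temp", 2.0, 0.8),
          DerivedEvent("alarm", 3.0, 0.95)]
print(complex_event_probability(["smoke", "high_temp", "alarm"], stream, 5.0))
# 0.9 * 0.8 * 0.95 = 0.684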

    Self-adaptive event recognition for intelligent transport management

    Intelligent transport management involves the use of voluminous amounts of uncertain sensor data to identify and effectively manage issues of congestion and quality of service. In particular, urban traffic has been in the eye of the storm for many years and gathers increasing interest as cities become bigger, more crowded, and “smart”. In this work we tackle the issue of uncertainty in the sensor streams reported by transportation systems. The variety of existing data sources opens new opportunities for testing the validity of sensor reports and, as a result, for self-adapting the recognition of complex events. We report on the use of a logic-based event reasoning tool to identify regions of uncertainty within a stream and demonstrate our method with a real-world use case from the city of Dublin. Our empirical analysis shows the feasibility of the approach when dealing with voluminous and highly uncertain streams.
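
    As an illustration of how disagreement between overlapping data sources can expose regions of uncertainty in a stream, here is a minimal Python sketch (hypothetical; the paper itself uses a logic-based event reasoning tool): two sources reporting speeds for the same road segment are compared per time window, and windows where they diverge are flagged as uncertain.

# Minimal sketch (hypothetical, not the paper's logic-based tool):
# cross-check two overlapping sensor sources and flag the time windows
# where their reports disagree, treating those windows as regions of
# uncertainty for downstream event recognition.
from statistics import mean

def uncertain_windows(source_a: dict[int, float],
                      source_b: dict[int, float],
                      window: int = 60,
                      tolerance: float = 0.25) -> list[int]:
    """Window start times where the two sources disagree by more than
    `tolerance` (relative difference of per-window means)."""
    flagged = []
    timestamps = sorted(set(source_a) | set(source_b))
    if not timestamps:
        return flagged
    start, end = timestamps[0], timestamps[-1]
    for w in range(start, end + 1, window):
        a = [v for t, v in source_a.items() if w <= t < w + window]
        b = [v for t, v in source_b.items() if w <= t < w + window]
        if a and b:
            ma, mb = mean(a), mean(b)
            if abs(ma - mb) > tolerance * max(ma, mb):
                flagged.append(w)
    return flagged

# e.g. bus GPS speeds vs. loop-detector speeds for the same road segment
gps = {0: 32.0, 30: 30.5, 60: 12.0, 90: 11.5}
loops = {0: 31.0, 30: 30.0, 60: 29.0, 90: 28.5}
print(uncertain_windows(gps, loops))  # [60]: the sources disagree here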

    Handling Location Uncertainty in Event Driven Experimentation

    Singapore National Research Foundation under International Research Centre @ Singapore Funding Initiative

    Extending Event-Driven Architecture for Proactive Systems

    Proactive event-driven computing is a new paradigm in which a decision is made neither in response to explicit user requests nor as a reaction to past events; rather, it is triggered autonomously by forecasting future states. Proactive event-driven computing requires a departure from current event-driven architectures to ones capable of handling uncertainty, future events, and real-time decision making. We present a proactive event-driven architecture for Scalable Proactive Event-Driven Decision-making (SPEEDD), which combines these capabilities. The proposed architecture is composed of three main components: complex event processing, real-time decision making, and visualization. The architecture is instantiated with a real use case from the traffic management domain. In the future, the results of actual implementations of the use case will help us revise and refine the proposed architecture.
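
    A minimal Python sketch of the proactive loop described above (hypothetical component names, not the SPEEDD implementation): event processing feeds a forecaster, and the decision is triggered by the projected future state rather than by a past event.

# Minimal sketch (hypothetical, not SPEEDD's API) of the proactive loop:
# observations flow in, a forecaster projects a future state, and the
# decision fires before the predicted congestion actually occurs.
from collections import deque

class CongestionForecaster:
    def __init__(self, horizon_steps: int = 3, capacity: float = 100.0):
        self.history = deque(maxlen=5)   # recent occupancy readings
        self.horizon = horizon_steps
        self.capacity = capacity

    def observe(self, occupancy: float) -> None:
        self.history.append(occupancy)

    def forecast(self) -> float:
        """Naive linear extrapolation over the recent trend."""
        if len(self.history) < 2:
            return self.history[-1] if self.history else 0.0
        slope = (self.history[-1] - self.history[0]) / (len(self.history) - 1)
        return self.history[-1] + slope * self.horizon

def decide(predicted_occupancy: float, capacity: float) -> str:
    # Proactive decision: act on the *forecast*, not on a past event.
    return "reroute_traffic" if predicted_occupancy > 0.9 * capacity else "no_action"

f = CongestionForecaster()
for occupancy in [40, 52, 63, 75, 86]:   # rising road-segment occupancy (%)
    f.observe(occupancy)
print(decide(f.forecast(), f.capacity))  # "reroute_traffic" before saturation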

    Big data and social media: A scientometrics analysis

    The purpose of this research is to investigate the status and evolution of scientific studies on the effect of social networks on big data and on the use of big data for modeling the behavior of social network users. This paper presents a comprehensive review of the studies associated with big data in social media. The study uses the Scopus database as its primary search engine and covers 2,000 highly cited articles over the period 2012-2019. The records are statistically analyzed and categorized according to different criteria. The findings show that research output has grown exponentially since 2014 and that the trend has continued at a relatively stable rate. Based on the survey, “decision support systems” is the keyword with the highest density, followed by heuristic methods. Among the most cited articles, papers published by researchers in the United States have received the highest number of citations (7,548), followed by the United Kingdom (588) and China (543). Thematic analysis shows that the subject has nearly established itself as an important and well-developed research field, and that better results could come from merging this research with “big data analytics” and “twitter”, which are important topics in this field but not yet well developed.
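
    As an illustration of the kind of keyword-density analysis the paper describes, here is a short Python sketch over hypothetical records (not the paper's Scopus export):

# Illustrative sketch: count how often each author keyword appears
# across a set of bibliographic records; the records are hypothetical.
from collections import Counter

records = [
    {"title": "...", "keywords": ["big data", "decision support systems"]},
    {"title": "...", "keywords": ["social media", "decision support systems"]},
    {"title": "...", "keywords": ["big data", "heuristics", "twitter"]},
]

density = Counter(kw for rec in records for kw in rec["keywords"])
for keyword, count in density.most_common():
    print(f"{keyword}: {count / len(records):.2f} per record")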

    Real-time probabilistic reasoning system using Lambda architecture

    Thesis (MTech (Information Technology))--Cape Peninsula University of Technology, 2019.
    The proliferation of data from sources such as social media and sensor devices has become overwhelming for traditional data storage and analysis technologies. This has prompted radical improvements in data management techniques, tools, and technologies to meet the increasing demand for the effective collection, storage, and curation of large data sets. Most of these technologies are open-source. Big data is usually described as a very large data set; however, a major feature of big data is its velocity. Data flows in as a continuous stream and must be acted on in real time to yield meaningful, relevant value. Although there is an explosion of technologies for handling big data, they usually target the processing of large (historic) data sets and real-time big data independently, hence the need for a unified framework that handles both. This led to the development of models such as the Lambda architecture. Effective decision-making requires the processing of historic data as well as real-time data. Some decision-making involves complex processes that depend on the likelihood of events. To handle uncertainty, probabilistic systems were designed. Probabilistic systems use probabilistic models, such as hidden Markov models, together with inference algorithms to process data and produce probabilistic scores. However, developing these models requires extensive knowledge of statistics and machine learning, making it an uphill task to model real-life circumstances. A new research area called probabilistic programming has been introduced to alleviate this bottleneck. This research proposes the combination of modern open-source big data technologies with probabilistic programming and the Lambda architecture, on readily available hardware, to develop a highly fault-tolerant and scalable processing tool that handles both historic and real-time big data in real time: a common solution. The system empowers decision makers with the capacity to make better-informed resolutions, especially in the face of uncertainty. The outcome of this research is a technology product, built and assessed using experimental evaluation methods. The research follows the Design Science Research (DSR) methodology, as it prescribes guidelines for the effective and rigorous construction and evaluation of an artefact. Probabilistic programming in the big data domain is still in its infancy; however, the developed artefact demonstrates the important potential of probabilistic programming combined with the Lambda architecture in the processing of big data.
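
    A minimal Python sketch of the Lambda architecture idea the thesis builds on (an assumed structure, not the thesis's implementation): a batch layer recomputes views over the full historic store, a speed layer maintains incremental views over events not yet absorbed by a batch run, and a serving layer merges the two at query time.

# Minimal sketch (assumed structure, not the thesis implementation) of
# the Lambda architecture: batch layer + speed layer + merged serving.
class BatchLayer:
    def __init__(self):
        self.master_data = []      # immutable, append-only store
        self.batch_view = {}

    def append(self, event):
        self.master_data.append(event)

    def recompute(self):
        # Full recomputation: slow but simple and fault-tolerant.
        self.batch_view = {}
        for key, value in self.master_data:
            self.batch_view[key] = self.batch_view.get(key, 0) + value

class SpeedLayer:
    def __init__(self):
        self.realtime_view = {}

    def update(self, event):
        # Incremental update: fast, covers data not yet in the batch view.
        key, value = event
        self.realtime_view[key] = self.realtime_view.get(key, 0) + value

    def flush(self):
        self.realtime_view = {}    # discarded once the batch layer catches up

def serve(batch: BatchLayer, speed: SpeedLayer, key):
    # Serving layer: merge the precomputed and the real-time views.
    return batch.batch_view.get(key, 0) + speed.realtime_view.get(key, 0)

batch, speed = BatchLayer(), SpeedLayer()
for e in [("road_42", 3), ("road_42", 5)]:   # historic events
    batch.append(e)
batch.recompute()
speed.update(("road_42", 2))                 # event arriving right now
print(serve(batch, speed, "road_42"))        # 10: merged batch + real-time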

    Participatory Sensing and Crowdsourcing in Urban Environment

    With an increasing number of people living in cities, urban mobility has become one of the most important research fields in so-called smart city environments. Urban mobility can be defined as the ability of people to move around the city, living in and interacting with its space. For these reasons, urban accessibility is a primary factor to take into account for social inclusion and for the effective exercise of citizenship. In this thesis, we researched how to use crowdsourcing and participatory sensing to collect data effectively and efficiently about aPOIs (accessible Points Of Interest), with the aim of obtaining an updated, trusted, and complete accessibility map of the urban environment. The data gathered in this way were integrated with data retrieved from external open data sets and used to compute personalized accessible urban paths. To investigate the issues related to this research in depth, we designed and prototyped mPASS, a context-aware and location-based accessible way-finding system.
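
    As an illustration of personalized accessible path computation, here is a minimal Python sketch (hypothetical; not mPASS's actual algorithm): a shortest-path search over a street graph that skips segments whose barriers conflict with the user's accessibility profile.

# Minimal sketch (hypothetical, not mPASS's implementation) of
# personalized accessible routing: Dijkstra on a street graph, skipping
# edges whose barriers conflict with the user's profile.
import heapq

# edge: (neighbor, length in metres, barriers on that segment)
graph = {
    "home":   [("corner", 120, set()), ("stairs", 40, {"steps"})],
    "corner": [("plaza", 200, set())],
    "stairs": [("plaza", 60, set())],
    "plaza":  [],
}

def accessible_shortest_path(graph, start, goal, avoid: set[str]):
    dist = {start: 0}
    prev = {}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return d, path[::-1]
        if d > dist.get(node, float("inf")):
            continue   # stale heap entry
        for nxt, length, barriers in graph[node]:
            if barriers & avoid:      # segment unusable for this profile
                continue
            nd = d + length
            if nd < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(heap, (nd, nxt))
    return float("inf"), []

# A wheelchair user avoids steps: the shorter stairway route is excluded.
print(accessible_shortest_path(graph, "home", "plaza", avoid={"steps"}))
# (320, ['home', 'corner', 'plaza'])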

    The Power of Exogenous Variables in Predicting West Nile Virus in South Carolina

    Despite the availability of medical data, environmental surveillance tools, and heightened public awareness, West Nile Virus (WNv) remains a global health hazard. Reliable methods for predicting WNv outbreaks remain elusive, and environmental health managers must take preventive actions without the benefit of simple predictive tools. The purpose of this ex post facto research was to examine the accuracy and timeliness of exogenous data in predicting outbreaks of WNv in South Carolina. Decision theory, the Cynefin framework, and systems theory provided the theoretical framework for this study, allowing the researcher to broaden traditional decision theory concepts with powerful system-level precepts. Using WNv as an example of decision making in complex environments, a statistical model for predicting the likelihood of the presence of WNv was developed through the exclusive use of exogenous explanatory variables (EEVs). The key research questions focused on whether EEVs alone can predict the likelihood of WNv presence with the statistical confidence needed to make timely preventive resource decisions. Results indicated that publicly accessible EEVs such as average temperature, average wind speed, and average population can be used to predict the presence of WNv in a South Carolina locality 30 days before an incident, although they did not accurately predict incident counts higher than four. The social implications of this research can be far-reaching. The ability to predict emerging infectious diseases (EIDs) for which there is no vaccine or cure can give decision makers the ability to take proactive measures to mitigate EID outbreaks.
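
    As an illustration of the kind of model the abstract describes, here is a short Python sketch of a logistic regression estimating the likelihood of WNv presence from exogenous variables; the feature names follow the abstract, but all values are synthetic, not the study's data or fitted model.

# Illustrative sketch: logistic regression on exogenous explanatory
# variables (EEVs). All data below is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: avg temperature (F), avg wind speed (mph), avg population (thousands)
X = np.array([
    [84.0,  4.1, 110.0],
    [71.0,  9.8,  22.0],
    [88.5,  3.2, 205.0],
    [69.0, 11.0,  15.0],
    [82.0,  5.0,  95.0],
    [73.5,  8.5,  30.0],
])
y = np.array([1, 0, 1, 0, 1, 0])  # WNv detected 30 days later?

model = LogisticRegression().fit(X, y)

# Estimated likelihood of WNv presence for a new locality, 30 days ahead:
new_locality = np.array([[86.0, 4.5, 150.0]])
print(model.predict_proba(new_locality)[0, 1])  # P(presence), near 1 here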

    Real time predictive monitoring system for urban transport

    Ubiquitous access to mobile and internet technology has driven a significant increase, in recent years, in the amount of data produced, communicated, and stored by corporations as well as by individual users. The research presented in this thesis proposes an architectural framework to acquire, store, manipulate, and integrate data and information within an urban transport environment, in order to optimise its operations in real time. The deployed architecture is based on the integration of a number of technologies and tailor-made algorithms, implemented to provide a management tool that aids traffic monitoring through intelligent decision-making processes. A creative combination of data mining techniques and machine learning algorithms was used to implement predictive analytics as a key component in addressing the challenges of monitoring and managing an urban transport network operation in real time. The proposed solution was then applied to an actual urban transport management system within a partner company, Mermaid Technology, Copenhagen, to test and evaluate the proposed algorithms and the architectural integration principles used. Various visualization methods were employed at numerous stages of the project to dynamically interpret the large volume and diversity of data and to effectively aid the monitoring and decision-making process. The deliverables of this project include the system architecture design as well as software solutions that facilitate predictive analytics and effective visualisation strategies to aid the real-time monitoring of a large system, in the context of urban transport. The proposed solutions were implemented, tested, and evaluated in a case study in collaboration with Mermaid Technology, using live data from their network operations to evaluate the efficiency of the proposed system.
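
    As an illustration of real-time predictive monitoring on a transport stream, here is a minimal Python sketch (hypothetical; not the thesis architecture): an exponentially weighted moving average tracks per-route delay and its trend, and an alert fires when the projected delay crosses a threshold before the observed delay does.

# Minimal sketch (hypothetical, not the thesis architecture) of real-time
# predictive monitoring: an EWMA tracks per-route delay and its trend;
# an alert fires when the projected delay breaches the threshold.
class RoutePredictiveMonitor:
    def __init__(self, alpha: float = 0.3, horizon: int = 5,
                 alert_threshold_s: float = 300.0):
        self.alpha = alpha
        self.horizon = horizon            # how many steps ahead to project
        self.threshold = alert_threshold_s
        self.ewma = None                  # smoothed delay level
        self.trend = 0.0                  # smoothed step-to-step change

    def update(self, delay_s: float) -> str:
        if self.ewma is None:
            self.ewma = delay_s
        else:
            prev = self.ewma
            self.ewma = self.alpha * delay_s + (1 - self.alpha) * self.ewma
            self.trend = self.alpha * (self.ewma - prev) + (1 - self.alpha) * self.trend
        projected = self.ewma + self.horizon * self.trend
        return "ALERT" if projected > self.threshold else "ok"

monitor = RoutePredictiveMonitor()
for delay in [60, 90, 140, 210, 290]:    # successive bus delay samples (s)
    status = monitor.update(delay)
print(status)  # "ALERT": projected delay breaches 300 s before the observed delay does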