
    Data Systems Fault Coping for Real-time Big Data Analytics Required Architectural Crucibles

    This paper analyzes the properties and characteristics of unknown and unexpected faults introduced into information systems while processing Big Data in real time. The authors hypothesize that such systems face new classes of faults and new fault-handling requirements, and propose an analytic model and architectural framework to assess and manage those faults, mitigate the risks of correlating or integrating otherwise uncorrelated Big Data, and ensure the source pedigree, quality, set integrity, freshness, and validity of the data being consumed. We argue that new architectures, methods, and tools for Big Data systems operating in real time must address and mitigate faults arising from real-time streaming processes while accounting for variables such as synchronization, redundancy, and latency. The paper concludes that, with improved designs, real-time Big Data systems can continuously deliver the value and benefits of streaming Big Data.
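The abstract's notion of gating consumed data on freshness and validity can be illustrated with a minimal sketch; the field names and threshold here are assumptions for illustration, not the authors' model.

```python
# Illustrative freshness/validity gate for a real-time pipeline.
# `ts`, `value`, and MAX_AGE_S are hypothetical names, not from the paper.
import time

MAX_AGE_S = 30  # records older than this are treated as stale

def admit(record, now=None):
    """Return True if the record passes basic validity and freshness checks."""
    now = time.time() if now is None else now
    if record.get("value") is None:      # validity: payload must be present
        return False
    if now - record["ts"] > MAX_AGE_S:   # freshness: reject stale records
        return False
    return True

assert admit({"ts": 100.0, "value": 1}, now=110.0)       # fresh, valid
assert not admit({"ts": 100.0, "value": 1}, now=200.0)   # stale
```

In a real system such a gate would sit in front of the correlation step the paper describes, so that stale or invalid records never contaminate integrated results.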

    A Survey on Transactional Stream Processing

    Transactional stream processing (TSP) strives to create a cohesive model that merges the advantages of both transactional and stream-oriented guarantees. Over the past decade, numerous endeavors have contributed to the evolution of TSP solutions, uncovering similarities and distinctions among them. Despite these advances, a universally accepted standard approach for integrating transactional functionality with stream processing remains to be established. Existing TSP solutions predominantly concentrate on specific application characteristics and involve complex design trade-offs. This survey intends to introduce TSP and present our perspective on its future progression. Our primary goals are twofold: to provide insights into the diverse TSP requirements and methodologies, and to inspire the design and development of groundbreaking TSP systems.
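The core idea the survey covers — applying stream events to shared state with all-or-nothing semantics — can be sketched minimally. This is a toy illustration of the transactional guarantee, not the design of any surveyed system; `StateStore` and `apply_txn` are invented names.

```python
# Toy transactional stream processing: each event batch updates shared
# state atomically -- either every write in the batch commits, or none do.

class StateStore:
    def __init__(self):
        self.data = {}

    def apply_txn(self, writes):
        """Apply a batch of (key, delta) writes atomically.

        Any write that would drive a value negative aborts the whole
        transaction, restoring the pre-transaction state.
        """
        snapshot = dict(self.data)  # cheap copy serves as an undo log
        try:
            for key, delta in writes:
                new_val = self.data.get(key, 0) + delta
                if new_val < 0:
                    raise ValueError(f"constraint violated on {key}")
                self.data[key] = new_val
            return True            # commit
        except ValueError:
            self.data = snapshot   # rollback
            return False

store = StateStore()
stream = [
    [("alice", 100)],                  # deposit
    [("alice", -30), ("bob", 30)],     # transfer: both writes or neither
    [("alice", -500), ("bob", 500)],   # overdraft: must abort entirely
]
results = [store.apply_txn(txn) for txn in stream]
# results -> [True, True, False]; store is left consistent after the abort
```

Real TSP systems replace the snapshot with write-ahead logging or multi-version state, and must additionally order transactions across parallel stream partitions, which is where the design trade-offs the survey discusses arise.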

    From Sensing to Action: Quick and Reliable Access to Information in Cities Vulnerable to Heavy Rain

    Cities need to constantly monitor weather to anticipate heavy storm events and reduce the impact of floods. Information describing precipitation and ground conditions at high spatio-temporal resolution is essential for taking timely action and preventing damage. Traditionally, rain gauges and weather radars are used to monitor rain events, but these sources provide low spatial resolution and are subject to inaccuracy. Their information therefore needs to be complemented with data from other sources, from citizens' phone calls to the authorities to relevant online media posts, which have the potential of providing timely and valuable information on weather conditions in the city. This information is often scattered across different, static, and non-public databases, which makes it impossible to use in an aggregate, standard way and hampers the efficiency of emergency response. In this paper, we describe information sources relating to a heavy rain event in Rotterdam on October 12-14, 2013. Rotterdam's weather monitoring infrastructure is composed of a number of rain gauges installed at different locations in the city, as well as a weather radar network. This sensing network is currently scarcely integrated, and logged data are not easily accessible during an emergency. We therefore propose a reliable, efficient, and low-cost ICT infrastructure that takes information from all relevant sources, including sensors as well as social and user-contributed information, and integrates them into a unique, cloud-based interface. The proposed infrastructure will improve the efficiency of emergency responses to extreme weather events and, ultimately, guarantee more safety to the urban population.
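The integration layer the paper proposes amounts to mapping heterogeneous observations into one shared record format. The sketch below illustrates that idea only; the schemas and field names are hypothetical, not taken from the Rotterdam deployment.

```python
# Hypothetical normalization step: rain observations from gauges and from
# citizen reports are mapped to a common record so downstream emergency
# tools can read a single feed. All field names here are assumptions.

def normalize(source, raw):
    """Map a source-specific record to a shared schema."""
    if source == "gauge":
        return {"t": raw["timestamp"], "loc": raw["station"],
                "mm_h": raw["rate"], "origin": "gauge"}
    if source == "social":
        # citizen reports carry no measured rain rate; keep them as
        # qualitative observations with no numeric value
        return {"t": raw["posted_at"], "loc": raw["geotag"],
                "mm_h": None, "origin": "social"}
    raise ValueError(f"unknown source {source!r}")

feed = [
    normalize("gauge", {"timestamp": "2013-10-13T14:00",
                        "station": "Rotterdam-N", "rate": 12.5}),
    normalize("social", {"posted_at": "2013-10-13T14:03",
                         "geotag": "Delfshaven"}),
]
```

The point of the shared schema is that an aggregation or alerting service never needs source-specific parsing, which is exactly the bottleneck the paper attributes to the scattered, non-public databases.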

    Comparative Study Of Implementing The On-Premises and Cloud Business Intelligence On Business Problems In a Multi-National Software Development Company

    Internship Report presented as the partial requirement for obtaining a Master's degree in Information Management, specialization in Knowledge Management and Business Intelligence.
    Nowadays every enterprise wants to be competitive. Over the last decade, data volumes have increased dramatically, and as the amount of data in the market grows each year, the ability to extract, analyze, and manage data has become a baseline condition for an organization to remain competitive. Organizations therefore need to adapt their technologies to the new business reality and provide solutions that meet new demands. Business Intelligence, by its main definition, is the ability to extract, analyze, and manage data so that an organization gains a competitive advantage. Before adopting this approach, it is important to decide which computing model it will be based on, considering the volume of data, the business context of the organization, and the technology requirements of the market. In the last 10 years, the growing popularity of cloud computing has divided computing systems into On-Premises and cloud. The cloud's benefits are scalability, availability, and lower costs; traditional On-Premises systems, on the other hand, provide independence of software configuration, control over data, and high security. The final decision as to which computing paradigm to follow is not an easy task, and depends on the organization's business context and on how well its current On-Premises systems perform in its business processes. Business Intelligence functions therefore require in-depth analysis to understand whether cloud computing technologies could perform better in those processes than traditional systems. The objective of this internship is to conduct a comparative study of the two computing systems across routine Business Intelligence functions. The study compares On-Premises Business Intelligence based on Oracle architecture with cloud Business Intelligence based on Google Cloud services. The comparison is conducted through 12 months of participation in the activities and projects of the Business Intelligence department of a company that develops software solutions for the telecommunications market, as an internship student in the 2nd year of a master's degree in Information Management, with a specialization in Knowledge Management and Business Intelligence, at Nova Information Management School (NOVA IMS).

    Database Management Systems: A NoSQL Analysis

    Volume 1 Issue 7 (September 2013)

    Novel Charging and Discharging Schemes for Electric Vehicles in Smart Grids

    PhD Thesis.
    This thesis presents smart Charging and Discharging (C&D) schemes for the smart grid that enable decentralised scheduling with large volumes of Electric Vehicle (EV) participation. The proposed C&D schemes use different strategies to flatten the power consumption profile by manipulating the charging or discharging electricity quantity. The novelty of this thesis lies in: 1. A user-behaviour based smart EV charging scheme that lowers the overall peak demand with an optimised EV charging schedule. It achieves minimal impact on users' daily routines while satisfying EV charging demands. 2. A decentralised EV electricity exchange process that matches power demand with an adaptive blockchain-enabled C&D scheme and an iceberg order execution algorithm. It demonstrates improved performance in terms of charging costs and power consumption profile. 3. A Peer-to-Peer (P2P) electricity C&D scheme that stimulates trading depth and the energy market profile with a best-price guide. It also increases EV users' autonomy and achieves maximal benefits for the network peers while protecting against potential attacks. 4. A novel consensus-mechanism-driven EV C&D scheme for a blockchain-based system that accommodates high-volume EV scenarios and substantially reduces the power fluctuation level. Theoretical analysis and comprehensive simulations prove that EV penetration under the proposed schemes minimises the power fluctuation level in an urban area and increases the resilience of the smart grid system.
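The profile-flattening goal shared by the thesis's schemes can be illustrated with a simple valley-filling heuristic. This sketch is not one of the thesis's actual C&D schemes; it only shows what "flattening the consumption profile" means in code.

```python
# Illustrative valley-filling heuristic: each 1-kWh charging increment is
# placed in the hour with the lowest current load, so EV charging fills
# demand valleys instead of adding to the peak.

def schedule_charging(base_load, ev_demand_kwh):
    """Greedily assign charging increments to the least-loaded hours."""
    load = list(base_load)
    for _ in range(ev_demand_kwh):
        h = min(range(len(load)), key=lambda i: load[i])
        load[h] += 1  # charge 1 kWh in the least-loaded hour
    return load

base = [5, 3, 2, 2, 4, 8]   # hypothetical hourly demand, kW
flattened = schedule_charging(base, 6)
# the overnight valleys fill first, so the existing peak of 8 is untouched
```

The decentralised schemes in the thesis pursue the same objective, but let each vehicle (or blockchain peer) choose its own slots from price or consensus signals rather than relying on a central greedy scheduler.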

    Knowledge-infused and Consistent Complex Event Processing over Real-time and Persistent Streams

    Emerging applications in Internet of Things (IoT) and Cyber-Physical Systems (CPS) present novel challenges to Big Data platforms for performing online analytics. Ubiquitous sensors from IoT deployments are able to generate data streams at high velocity, that include information from a variety of domains, and accumulate to large volumes on disk. Complex Event Processing (CEP) is recognized as an important real-time computing paradigm for analyzing continuous data streams. However, existing work on CEP is largely limited to relational query processing, exposing two distinctive gaps for query specification and execution: (1) infusing the relational query model with higher-level knowledge semantics, and (2) seamless query evaluation across temporal spaces that span past, present, and future events. Closing these gaps allows accessible analytics over data streams having properties from different disciplines, and helps span the velocity (real-time) and volume (persistent) dimensions. In this article, we introduce a Knowledge-infused CEP (X-CEP) framework that provides domain-aware knowledge query constructs along with temporal operators that allow end-to-end queries to span across real-time and persistent streams. We translate this query model to efficient query execution over online and offline data streams, proposing several optimizations to mitigate the overheads introduced by evaluating semantic predicates and in accessing high-volume historic data streams. The proposed X-CEP query model and execution approaches are implemented in our prototype semantic CEP engine, SCEPter. We validate our query model using domain-aware CEP queries from a real-world Smart Power Grid application, and experimentally analyze the benefits of our optimizations for executing these queries, using event streams from a campus-microgrid IoT deployment. Comment: 34 pages, 16 figures, accepted in Future Generation Computer Systems, October 27, 201
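The temporal pattern matching at the heart of CEP can be shown with a toy matcher. This is a generic illustration of a "followed-by within window" operator, not the X-CEP query model or the SCEPter engine; the event types are invented.

```python
# Toy CEP matcher: detect a 'spike' event FOLLOWED BY a 'drop' event
# within `window` time units, over a time-ordered event stream.

def detect_spike_then_drop(events, window):
    """events: list of (timestamp, type) pairs in time order.

    Returns (spike_ts, drop_ts) pairs where the drop occurred within
    `window` time units of the spike.
    """
    matches = []
    pending = []  # open spike events still awaiting a drop
    for ts, etype in events:
        # temporal operator: expire spikes that fell out of the window
        pending = [p for p in pending if ts - p <= window]
        if etype == "spike":
            pending.append(ts)
        elif etype == "drop":
            matches.extend((p, ts) for p in pending)
            pending = []  # matched spikes are consumed by this drop
    return matches

stream = [(1, "spike"), (3, "drop"), (10, "spike"), (20, "drop")]
print(detect_spike_then_drop(stream, window=5))  # [(1, 3)]
```

A knowledge-infused engine of the kind the article describes would additionally let the pattern refer to domain concepts (for instance, matching any event whose type is a subclass of "power anomaly" in an ontology) rather than literal type strings, and would evaluate the same pattern over archived as well as live streams.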

    Dagstuhl News January - December 2008

    "Dagstuhl News" is a publication edited especially for the members of the Foundation "Informatikzentrum Schloss Dagstuhl" to thank them for their support. The News give a summary of the scientific work being done in Dagstuhl. Each Dagstuhl Seminar is presented by a small abstract describing the contents and scientific highlights of the seminar as well as the perspectives or challenges of the research topic