
    When Things Matter: A Data-Centric View of the Internet of Things

    With recent advances in radio-frequency identification (RFID), low-cost wireless sensor devices, and Web technologies, the Internet of Things (IoT) approach has gained momentum in connecting everyday objects to the Internet and facilitating machine-to-human and machine-to-machine communication with the physical world. While IoT offers the capability to connect and integrate both digital and physical entities, enabling a whole new class of applications and services, several significant challenges need to be addressed before these applications and services can be fully realized. A fundamental challenge centers on managing IoT data, which is typically produced in dynamic and volatile environments and is not only extremely large in scale and volume but also noisy and continuous. This article surveys the main techniques and state-of-the-art research efforts in IoT from a data-centric perspective, including data stream processing, data storage models, complex event processing, and searching in IoT. Open research issues for IoT data management are also discussed.
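
    The survey's data stream processing and complex event processing themes can be made concrete with a short Python sketch. The sliding-window span, temperature threshold, and alert rule below are illustrative assumptions, not techniques taken from the article:

        from collections import deque
        from statistics import mean

        class SlidingWindow:
            """Keep the last `span_seconds` of (timestamp, value) readings."""
            def __init__(self, span_seconds):
                self.span = span_seconds
                self.buf = deque()

            def push(self, ts, value):
                self.buf.append((ts, value))
                # Evict readings that have fallen out of the window.
                while self.buf and ts - self.buf[0][0] > self.span:
                    self.buf.popleft()

            def average(self):
                return mean(v for _, v in self.buf)

        # Simple complex-event rule: fire when the 60 s average exceeds a
        # threshold, smoothing over noisy, continuous sensor readings.
        window = SlidingWindow(span_seconds=60)
        THRESHOLD = 30.0  # assumed alert level, e.g. degrees Celsius

        def on_reading(ts, value):
            window.push(ts, value)
            avg = window.average()
            if avg > THRESHOLD:
                print(f"event: high 60 s average {avg:.1f} at t={ts}")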

    Big Data Analysis

    The value of big data is predicated on the ability to detect trends and patterns, and more generally to make sense of large volumes of data that often comprise a heterogeneous mix of formats, structures, and semantics. Big data analysis is the component of the big data value chain that focuses on transforming raw acquired data into a coherent, usable resource suitable for analysis. Drawing on a range of interviews with key stakeholders in small and large companies and in academia, this chapter outlines key insights, the state of the art, emerging trends, future requirements, and sectoral case studies for data analysis.

    Alexandria: Extensible Framework for Rapid Exploration of Social Media

    The Alexandria system under development at IBM Research provides an extensible framework and platform for supporting a variety of big-data analytics and visualizations. The system is currently focused on enabling rapid exploration of text-based social media data. It provides tools to help construct "domain models" (i.e., families of keywords and extractors that focus attention on tweets and other social media documents relevant to a project), to rapidly extract and segment the relevant social media and its authors, to apply further analytics (such as finding trends and anomalous terms), and to visualize the results. The system architecture is centered on a variety of REST-based service APIs to enable flexible orchestration of the system capabilities; these are especially useful for supporting knowledge-worker-driven iterative exploration of social phenomena. The architecture also enables rapid integration of Alexandria capabilities with other social media analytics systems, as has been demonstrated through an integration with IBM Research's SystemG. This paper describes a prototypical usage scenario for Alexandria, along with the architecture and key underlying analytics.
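
    To make the REST-based orchestration concrete, here is a hypothetical Python sketch of how a client might define a domain model and request analytics through such service APIs. The host, endpoint paths, and response fields are invented for illustration; they are not Alexandria's actual API:

        import requests

        BASE = "http://alexandria.example.com/api"  # placeholder host, not a real endpoint

        # 1. Define a domain model: keywords that focus extraction on relevant documents.
        domain = {"name": "storm-watch",
                  "keywords": ["hurricane", "flooding", "evacuation"]}
        domain_id = requests.post(f"{BASE}/domains", json=domain).json()["id"]

        # 2. Extract and segment the matching social media, then run a trend analytic.
        requests.post(f"{BASE}/domains/{domain_id}/extract")
        trends = requests.get(f"{BASE}/domains/{domain_id}/analytics/trends").json()

        for term in trends.get("anomalous_terms", []):
            print(term)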

    Big Data Analytics Architecture for Cybersecurity Applications

    The technological and social changes of the current information age pose new challenges for security analysts. Novel strategies and security solutions are sought to improve security operations concerning the detection and analysis of security threats and attacks. Security analysts address these challenges by analyzing large amounts of data from server logs, communication equipment, security solutions, and information-security blogs, in a variety of structured and unstructured formats. In this paper, we examine the application of big data to support security activities and conceptual models for generating knowledge that can be used for decision making or for automating security response actions. Concretely, we present a massive data processing methodology and introduce a big data architecture devised for cybersecurity applications. This architecture identifies anomalous behavior patterns and trends in order to anticipate cybersecurity attacks characterized as relatively random, spontaneous, and out of the ordinary.
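
    As an illustration of the kind of analytics such an architecture might run (the abstract does not specify its algorithms), the following Python sketch flags anomalous request rates in per-minute server-log counts; the rolling window size and the 3-sigma rule are assumptions:

        from statistics import mean, stdev

        def detect_anomalies(counts_per_minute, window=30, k=3.0):
            """Flag minutes whose count deviates more than k sigma
            from the preceding `window` minutes."""
            anomalies = []
            for i in range(window, len(counts_per_minute)):
                baseline = counts_per_minute[i - window:i]
                mu, sigma = mean(baseline), stdev(baseline)
                if sigma > 0 and abs(counts_per_minute[i] - mu) > k * sigma:
                    anomalies.append(i)
            return anomalies

        # A burst at minute 35 stands out against a quiet baseline.
        traffic = [100] * 30 + [105, 98, 102, 101, 99, 900]
        print(detect_anomalies(traffic))  # -> [35]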

    Fast Data in the Era of Big Data: Twitter's Real-Time Related Query Suggestion Architecture

    We present the architecture behind Twitter's real-time related query suggestion and spelling correction service. Although these tasks have received much attention in the web search literature, the Twitter context introduces a real-time "twist": after significant breaking news events, we aim to provide relevant results within minutes. This paper provides a case study illustrating the challenges of real-time data processing in the era of "big data". We tell the story of how our system was built twice: our first implementation was built on a typical Hadoop-based analytics stack, but was later replaced because it did not meet the latency requirements necessary to generate meaningful real-time results. The second implementation, which is the system deployed in production, is a custom in-memory processing engine specifically designed for the task. This experience taught us that the current typical usage of Hadoop as a "big data" platform, while great for experimentation, is not well suited to low-latency processing, and points the way to future work on data analytics platforms that can handle "big" as well as "fast" data.
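
    A minimal sketch, not Twitter's production engine: the heart of such an in-memory approach is a query co-occurrence index whose counts decay over time, so that suggestions track breaking news within minutes. The half-life and the session-pair input are assumed details:

        import math
        import time
        from collections import defaultdict

        HALF_LIFE = 600.0  # seconds; an assumed decay parameter

        class RelatedQueries:
            def __init__(self):
                # cooccur[q][r] = (decayed_weight, last_update_timestamp)
                self.cooccur = defaultdict(dict)

            def _decayed(self, weight, last_ts, now):
                return weight * math.exp(-math.log(2) * (now - last_ts) / HALF_LIFE)

            def observe(self, query_a, query_b, now=None):
                """Record that two queries co-occurred (e.g. in one user session)."""
                now = time.time() if now is None else now
                for q, r in ((query_a, query_b), (query_b, query_a)):
                    w, ts = self.cooccur[q].get(r, (0.0, now))
                    self.cooccur[q][r] = (self._decayed(w, ts, now) + 1.0, now)

            def suggest(self, query, k=5, now=None):
                """Return the k related queries with the highest decayed weight."""
                now = time.time() if now is None else now
                scored = ((self._decayed(w, ts, now), r)
                          for r, (w, ts) in self.cooccur.get(query, {}).items())
                return [r for _, r in sorted(scored, reverse=True)[:k]]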

    Post Event Investigation of Multi-stream Video Data Utilizing Hadoop Cluster

    Rapid advances in technology and inexpensive cameras have raised the need for monitoring systems in surveillance applications. As a result, the data acquired from the numerous cameras deployed for surveillance is tremendous, and manually investigating such massive data when an event is triggered is a complex task. It is therefore essential to explore an approach that can both store massive multi-stream video data and process it to find useful information. To address the challenge of storing and processing multi-stream video data, we use Hadoop, which has grown into a leading computing model for data-intensive applications. In this paper we propose a novel technique for performing post-event investigation on stored surveillance video data. Our algorithm stores video data in HDFS in such a way that the location of the data can be identified efficiently from the time of occurrence of an event, after which further processing is performed. To demonstrate the efficiency of the proposed work, we perform event detection in the video based on the time period provided by the user. To estimate the performance of our approach, we evaluate the storage and processing of video data by varying (i) the pixel resolution of the video frames, (ii) the size of the video data, (iii) the number of reducers (workers) executing the task, and (iv) the number of nodes in the cluster. The proposed framework achieves a speedup of 5.9 for large files of 1024×1024-pixel video frames, making it appropriate for practical deployment.
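
    The time-based location of data in HDFS can be sketched as follows; the directory layout, one-minute segment granularity, and file extension are assumptions for illustration, since the abstract does not publish the exact scheme:

        from datetime import datetime, timedelta

        def segment_path(camera_id, ts, root="/surveillance"):
            # One directory per camera/day/hour, one file per one-minute segment.
            return f"{root}/{camera_id}/{ts:%Y/%m/%d/%H}/{ts:%M}.seq"

        def paths_for_interval(camera_id, start, end):
            """Enumerate the segment paths overlapping [start, end] for one camera."""
            t = start.replace(second=0, microsecond=0)
            while t <= end:
                yield segment_path(camera_id, t)
                t += timedelta(minutes=1)

        # Locate the data to scan for an event reported between 14:58 and 15:03;
        # these paths become the inputs of the MapReduce event-detection job.
        for p in paths_for_interval("cam-07",
                                    datetime(2024, 5, 1, 14, 58),
                                    datetime(2024, 5, 1, 15, 3)):
            print(p)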

    Analysis and Extraction of Tempo-Spatial Events in an Efficient Archival CDN with Emphasis on Telegram

    This paper presents an efficient archival framework for exploring and tracking large-scale cyberspace data, called the Tempo-Spatial Content Delivery Network (TS-CDN). Social media data streams are continuously renewed in both the time and spatial dimensions; various types of websites and social networks (i.e., channels, groups, pages, etc.) are considered spatial in cyberspace, and accurate analysis entails encompassing the bulk of the data. TS-CDN builds an efficient content delivery network by applying a hash function to the big data: hashing eliminates data redundancy and yields a unique large-scale data archive. Given an input query, the framework enables transparent monitoring and exploration of data in the tempo-spatial dimension based on TF-IDF scores, and conformance with the i18n standard resolves Unicode issues. To evaluate the TS-CDN framework, a dataset from Telegram news channels spanning March 23, 2020 (1399-01-01) to September 21, 2020 (1399-06-31) was used, covering topics including Coronavirus (COVID-19), vaccines, school reopening, floods, earthquakes, justice shares, petroleum, and quarantine. Applying the hash to the Telegram dataset over this interval significantly reduced media storage: 39.8% for videos (from 79.5 GB to 47.8 GB) and 10% for images (from 4 GB to 3.6 GB). The TS-CDN infrastructure is presented as a web-based, service-oriented system. Experiments conducted on enormous time-series data covering different spatial dimensions (i.e., the Khabare Fouri, Khabarhaye Fouri, Akhbare Rouze Iran, and Akhbare Rasmi Telegram news channels) demonstrate the efficiency and applicability of the implemented TS-CDN framework.
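
    The hash-based deduplication at the core of TS-CDN can be illustrated with a short Python sketch; SHA-256 and the directory-scanning details are assumptions, as the abstract does not name the hash function used:

        import hashlib
        from pathlib import Path

        def file_digest(path, chunk=1 << 20):
            """Hash a file's content in 1 MiB chunks."""
            h = hashlib.sha256()
            with open(path, "rb") as f:
                while block := f.read(chunk):
                    h.update(block)
            return h.hexdigest()

        def deduplicate(paths):
            """Keep one archived copy per unique digest; report the rest as duplicates."""
            archive, duplicates = {}, []
            for p in paths:
                d = file_digest(p)
                if d in archive:
                    duplicates.append(p)  # store a reference to archive[d] instead
                else:
                    archive[d] = p
            return archive, duplicates

        # Example over an assumed local folder of downloaded Telegram media.
        archive, dups = deduplicate(Path("media").glob("*.mp4"))
        print(f"{len(archive)} unique files, {len(dups)} duplicates skipped")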