Time Series Management Systems: A Survey
The collection of time series data increases as more monitoring and
automation are being deployed. These deployments range in scale from an
Internet of things (IoT) device located in a household to enormous distributed
Cyber-Physical Systems (CPSs) producing large volumes of data at high velocity.
To store and analyze these vast amounts of data, specialized Time Series
Management Systems (TSMSs) have been developed to overcome the limitations of
general-purpose Database Management Systems (DBMSs) for time series
management. In this paper, we present a thorough analysis and classification of
TSMSs developed through academic or industrial research and documented through
publications. Our classification is organized into categories based on the
architectures observed during our analysis. In addition, we provide an overview
of each system with a focus on the motivational use case that drove the
development of the system, the functionality for storage and querying of time
series a system implements, the components the system is composed of, and the
capabilities of each system with regard to Stream Processing and Approximate
Query Processing (AQP). Last, we provide a summary of research directions
proposed by other researchers in the field and present our vision for a next
generation TSMS.
Comment: 20 pages, 15 figures, 2 tables. Accepted for publication in IEEE TKDE.
Cybersecurity of Industrial Cyber-Physical Systems: A Review
Industrial cyber-physical systems (ICPSs) manage critical infrastructures by
controlling the processes based on the "physics" data gathered by edge sensor
networks. Recent innovations in ubiquitous computing and communication
technologies have prompted the rapid integration of highly interconnected
systems to ICPSs. Hence, the "security by obscurity" principle provided by
air-gapping is no longer followed. As the interconnectivity in ICPSs increases,
so does the attack surface. Industrial vulnerability assessment reports have
shown that a variety of new vulnerabilities have emerged due to this
transition, the most common of which relate to weak boundary protection.
Although there are existing surveys in this context, very little is mentioned
regarding these reports. This paper bridges this gap by defining and reviewing
ICPSs from a cybersecurity perspective. In particular, a multi-dimensional
adaptive attack taxonomy is presented and utilized for evaluating real-life
ICPS cyber incidents. We also identify the general shortcomings and highlight
the points that cause a gap in existing literature while defining future
research directions.
Comment: 32 pages, 10 figures.
Cybersecurity of industrial cyber-physical systems: a review
Industrial cyber-physical systems (ICPSs) manage critical infrastructures by controlling the processes based on the “physics” data gathered by edge sensor networks. Recent innovations in ubiquitous computing and communication technologies have prompted the rapid integration of highly interconnected systems to ICPSs. Hence, the “security by obscurity” principle provided by air-gapping is no longer followed. As the interconnectivity in ICPSs increases, so does the attack surface. Industrial vulnerability assessment reports have shown that a variety of new vulnerabilities have emerged due to this transition. Although there are existing surveys in this context, very little is mentioned regarding the outputs of these reports. While these reports show that the most exploited vulnerabilities occur due to weak boundary protection, such vulnerabilities also arise from limited or ill-defined security policies. However, current literature focuses on intrusion detection systems (IDS), network traffic analysis (NTA) methods, or anomaly detection techniques. Hence, finding a solution for the problems mentioned in these reports is relatively hard. We bridge this gap by defining and reviewing ICPSs from a cybersecurity perspective. In particular, a multi-dimensional adaptive attack taxonomy is presented and utilized for evaluating real-life ICPS cyber incidents. Finally, we identify the general shortcomings and highlight the points that cause a gap in existing literature while defining future research directions.
An object-oriented component-based approach to building real-time software systems
A project report submitted to the Faculty of Engineering, University of the Witwatersrand,
Johannesburg, in partial fulfilment of the requirements for the degree of Master of Science in
Engineering.
Johannesburg, 1993.
This Project Report reports on the study of an approach to building integrated real-time software
systems based on re-usable object-oriented components. The basis of the approach is the
development of a three-layered structure of components, where each layer is built on the underlying
layer of components.
The lower layer of components consists of generic re-usable building blocks that may be re-used
for building and integrating other real-time applications. The middle layer consists of components
that are generic to the application domain, and the top layer consists of components that are
specific to each application of that application domain.
The Report includes the research and development of methods for communicating between these
building blocks using an OSI/CMIP-conformant "software highway", and in this regard particular
attention is given to the formal and de facto industry standards.
With this approach, it is argued that the application engineer can effectively build new applications
using the re-usable components. This is demonstrated by reporting on the implementation of a
large real-world Telecommunications Network Management application.
The Project Report contains a critical analysis of the technical, organisational and project
management issues of this object-oriented component approach as compared to the traditional
development approach. The Report concludes that despite certain technical and organisational
concerns, the object-oriented approach does indeed yield several worthwhile benefits for
developing real-time software systems. These benefits include genuine re-usability and improved
productivity, testability and maintainability.
Large-Scale Indexing, Discovery, and Ranking for the Internet of Things (IoT)
Network-enabled sensing and actuation devices are key enablers to connect real-world objects to the cyber world. The Internet of Things (IoT) consists of the network-enabled devices and communication technologies that allow connectivity and integration of physical objects (Things) into the digital world (Internet). Enormous amounts of dynamic IoT data are collected from Internet-connected devices. IoT data are usually multi-variant streams that are heterogeneous, sporadic, multi-modal, and spatio-temporal. IoT data can be disseminated with different granularities and have diverse structures, types, and qualities. Dealing with the data deluge from heterogeneous IoT resources and services imposes new challenges on indexing, discovery, and ranking mechanisms that will allow building applications requiring on-line access and retrieval of ad-hoc IoT data. However, the existing IoT data indexing and discovery approaches are complex or centralised, which hinders their scalability. The primary objective of this article is to provide a holistic overview of the state of the art on indexing, discovery, and ranking of IoT data. The article aims to pave the way for researchers to design, develop, implement, and evaluate techniques and approaches for on-line large-scale distributed IoT applications and services.
Santa Clara Magazine, Volume 57 Number 1, Fall 2015
24 - ART HAPPENING HERE Inside the Edward M. Dowd Art & Art History Building. Illustration by Harry Campbell. Words by Steven Boyd Saum.
28 - CALL HER A WORLD CHAMPION And call them America’s Team. Julie Johnston ’14 and the Women’s World Cup. By Ann Killion.
34 - A WILD GENEROSITY The energy and genius of Steve Nash ’96 on the court. By Brian Doyle.
37 - BELIEVE IN US An oral history of a 1993 NCAA playoff game that became an upset for the ages. By Jeff Gire and Harold Gutmann.
40 - CHANGE THE GAME Pope Francis speaks about our common home. Here is what a theologian, an engineer, and an environmentalist hear. By John S. Farnsworth.
46 - SERRA’S SOJOURN Mallorca to Mexico to the missions of Alta California. And now to sainthood. So who was he really? By Robert Senkewicz and Rose Marie Beebe ’76.
ST-Hadoop: A MapReduce Framework for Big Spatio-temporal Data Management
University of Minnesota Ph.D. dissertation. May 2019. Major: Computer Science. Advisor: Mohamed Mokbel. 1 computer file (PDF); x, 123 pages.
Apache Hadoop, employing the MapReduce programming paradigm, has been widely accepted as the standard framework for analyzing big data in distributed environments. Unfortunately, this rich framework was not genuinely exploited towards processing large-scale spatio-temporal data, especially with the emergence and popularity of applications that create such data at large scale. The huge volumes of spatio-temporal data come from applications such as taxi fleets in urban computing, asteroids in astronomy research studies, animal movements in habitat studies, neuron analysis in neuroscience research studies, and the contents of social networks (e.g., Twitter or Facebook). Space and time are two fundamental characteristics that raise the demand for processing the spatio-temporal data created by these applications. Besides the massive size of the data, the complexity of the shapes and formats associated with these data raises many challenges in managing spatio-temporal data. The goal of the dissertation is centered on establishing a full-fledged big spatio-temporal data management system that serves the needs of a wide range of spatio-temporal applications. This involves indexing, querying, and analyzing spatio-temporal data. We propose ST-Hadoop, the first full-fledged open-source system with native support for big spatio-temporal data, available for download at http://st-hadoop.cs.umn.edu/. ST-Hadoop injects spatio-temporal data awareness inside the highly popular Hadoop system, which is considered state-of-the-art for off-line analysis of big data.
Considering a distributed environment, we focus on the following: (1) indexing spatio-temporal data; (2) supporting various fundamental spatio-temporal operations, such as range, kNN, and join; and (3) supporting the indexing and querying of trajectories, a special class of spatio-temporal data that requires special handling. Throughout this dissertation, we touch on the background and related work, motivate the proposed system, and highlight our contributions.