Detecting Irregular Patterns in IoT Streaming Data for Fall Detection
Detecting patterns in real-time streaming data is an interesting and
challenging data analytics problem. With the proliferation of a variety of
sensor devices, real-time analytics of data from the Internet of Things (IoT)
to learn regular and irregular patterns has become an important machine
learning problem, enabling predictive analytics for automated notification and
decision support. In this work, we address the problem of learning an irregular
human activity pattern, the fall, from streaming IoT data produced by wearable
sensors. We present a deep neural network model for detecting falls from
accelerometer data, achieving 98.75 percent accuracy on an online physical
activity monitoring dataset called "MobiAct", published by Vavoulas et al. The
initial model was developed using IBM Watson Studio and later deployed on IBM
Cloud with the streaming analytics service supported by IBM Streams for
monitoring real-time IoT data. We also present the system architecture of the
real-time fall detection framework that we intend to use with mbientlabs
wearable health monitoring sensors for real-time patient monitoring at
retirement homes or rehabilitation clinics.
Comment: 7 pages
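As an illustration of the detection task described above (not the authors' MobiAct deep network), a minimal sliding-window fall detector over accelerometer magnitude can be sketched as follows; the 2.5 g impact threshold and the tiny windows are illustrative assumptions only:

```python
import math

def magnitude(sample):
    """Euclidean norm of one (ax, ay, az) accelerometer sample, in g."""
    ax, ay, az = sample
    return math.sqrt(ax * ax + ay * ay + az * az)

def detect_fall(window, impact_g=2.5):
    """Flag a window of (ax, ay, az) samples as a candidate fall when the
    peak acceleration magnitude exceeds an impact threshold. A trained
    model such as the deep network above would replace this hand-tuned
    rule with learned features; the threshold here is an assumption."""
    return max(magnitude(s) for s in window) > impact_g

# Normal walking stays near 1 g; an impact spike exceeds the threshold.
walking = [(0.0, 0.0, 1.0), (0.1, 0.0, 1.1), (0.0, 0.1, 0.9)]
fall    = [(0.0, 0.0, 1.0), (2.0, 1.5, 2.2), (0.2, 0.1, 0.3)]
print(detect_fall(walking))  # False
print(detect_fall(fall))     # True
```

In a streaming deployment such as the one described, `detect_fall` would run per window on the incoming sensor stream, with positives forwarded for notification.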
Real-Time Streaming Analytics using Big Data Paradigm and Predictive Modelling based on Deep Learning
With the evolution of distributed streaming platforms, analysing the enormous volumes of time series data streamed continuously from IoT devices has become much easier. In most IoT networks the data are either in motion or at rest in a data centre or cloud. It is possible to process these data in real time, much as edge devices do, using a big data framework. In data-intensive applications, predictive analytics requires substantial resources to perform complex computations. The Apache Flink framework can perform real-time streaming of schema-less data and scales well in a distributed environment with low latency; here it is used to collect the data and store it in the cloud. This work proposes a suitable environment to collect, transport, preprocess and aggregate the data stream in order to perform predictive analytics using deep learning models. Deep learning automatically extracts features and builds models during training, and it has the potential to solve problems that conventional machine learning models cannot. The use of algorithms based on deep learning is therefore recommended for forecasting temporal data. We also discuss a number of deep learning forecasting models and analyse their performance in order to determine which is most effective for single-step, multi-step and multivariate forecasting, based on error functions computed over the streamed sensor data.
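The model comparison described above rests on standard error functions; as a sketch (with made-up forecast values, not results from the paper), MAE and RMSE for a single-step forecast can be computed as:

```python
import math

def mae(actual, predicted):
    """Mean absolute error between two equal-length sequences."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root mean squared error between two equal-length sequences."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

# Single-step: one prediction per time step (illustrative numbers).
actual    = [20.0, 21.0, 22.5, 23.0]
predicted = [19.5, 21.5, 22.0, 24.0]
print(round(mae(actual, predicted), 3))   # 0.625
print(round(rmse(actual, predicted), 3))  # 0.661

# Multi-step forecasts apply the same functions per horizon: compare the
# predicted trajectory for steps t+1..t+h against the realised values.
```

RMSE penalises large deviations more heavily than MAE, which is why both are commonly reported when ranking forecasting models.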
Deep Learning for Network Traffic Monitoring and Analysis (NTMA): A Survey
Modern communication systems and networks, e.g., Internet of Things (IoT) and cellular networks, generate a massive and heterogeneous amount of traffic data. In such networks, traditional network management techniques for monitoring and data analytics face challenges, e.g., in accuracy and in effective real-time processing of big data. Moreover, the pattern of network traffic, especially in cellular networks, shows very complex behaviour because of various factors, such as device mobility and network heterogeneity. Deep learning has been efficiently employed to facilitate analytics and knowledge discovery in big data systems by recognising hidden and complex patterns. Motivated by these successes, researchers in the field of networking apply deep learning models to Network Traffic Monitoring and Analysis (NTMA) applications, e.g., traffic classification and prediction. This paper provides a comprehensive review of applications of deep learning in NTMA. We first provide fundamental background relevant to our review. Then, we give an insight into the confluence of deep learning and NTMA, and review deep learning techniques proposed for NTMA applications. Finally, we discuss key challenges, open issues, and future research directions for using deep learning in NTMA applications.
IoT Enabled Sensory Monitoring System for Fog Optimal Resource Provisioning Method in Health Monitoring System
Fog computing is a data management and analytics service. This paper presents a novel and effective approach to providing IoT-enabled services in healthcare applications using fog computing. For this research, data were collected from the Google Scholar, ScienceDirect and MEDLINE databases. IoT-based fog computing techniques are proposed for delivering quality of service to the user. An Optimal Resource Provisioning method is proposed to determine edges, service level agreements and administration services for the IoT client. The DeepQ residue information processing technique is applied to connect the cloud data centre, and the computing-paradigms technique finds the depth reference of the fog levels. The proposed Optimal Resource Provisioning algorithm examines the dataset, and the TensorFlow tool is used to simulate the environment. The fog computing layer consists of IoT sensor data inputs, cloud data centres and connected layers for simulation. A deep belief network is generated from these inputs using a 256 X 256 X 3 layer system, with 5000 training samples and 1000 test samples used in the simulations. Each dataset simulation is recorded using supervised and unsupervised learning methods. Based on these results, the IoT-enabled fog computing data management and analytics system achieved 95% accuracy, and compared with existing computing techniques the proposed system shows better efficiency with respect to safety and convenience.
Deep Learning: Edge-Cloud Data Analytics for IoT
Sensors, wearables, mobile and other Internet of Things (IoT) devices are becoming increasingly integrated into all aspects of our lives. They are capable of collecting massive quantities of data that are typically transmitted to the cloud for processing. However, this results in increased network traffic and latencies. Edge computing has the potential to remedy these challenges by moving computation physically closer to the network edge, where the data are generated. However, edge computing does not have sufficient resources for complex data analytics tasks. Consequently, this paper investigates merging cloud and edge computing for IoT data analytics and presents a deep learning-based approach that performs data reduction on the edge and machine learning on the cloud. The encoder part of an autoencoder is located on the edge to reduce data dimensions. The reduced data are sent to the cloud, where they are either used directly for machine learning or expanded back to the original features using the decoder part of the autoencoder. The proposed approach has been evaluated on human activity recognition tasks. Results show that 50% data reduction did not have a significant impact on the classification accuracy, and 77% reduction caused only a 1% change.
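The edge/cloud split described above can be sketched with a toy linear "autoencoder"; the fixed pairwise-sum weights below are a stand-in assumption, whereas a real autoencoder would learn an encoder/decoder pair so the decoder approximately inverts the encoder:

```python
def encode(sample):
    """Edge side: reduce a 4-feature sensor sample to 2 values (50% reduction)."""
    x0, x1, x2, x3 = sample
    return (x0 + x1, x2 + x3)        # pairwise sums as a stand-in projection

def decode(code):
    """Cloud side: expand the 2-value code back to 4 approximate features."""
    c0, c1 = code
    return (c0 / 2, c0 / 2, c1 / 2, c1 / 2)  # split each sum evenly

sample = (0.9, 1.1, 0.2, 0.4)
code = encode(sample)             # only this is sent over the network
print(len(code) / len(sample))    # 0.5 -> matches the paper's 50% reduction case
print(decode(code))               # approximate reconstruction on the cloud
```

Only the compressed `code` crosses the network, which is how the approach trades reconstruction fidelity against the reduced traffic and latency the abstract reports.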
Smart Asset Management for Electric Utilities: Big Data and Future
This paper discusses future challenges in terms of big data and new
technologies. Utilities have been collecting data in large amounts, but these
data are hardly utilized because of their sheer volume and the uncertainty
associated with them. Condition monitoring of assets collects large amounts of
data during daily operations. The question arises: "How do we extract
information from large chunks of data?" The concept of "rich data and poor
information" is being challenged by big data analytics with the advent of
machine learning techniques. Along with technological advancements such as the
Internet of Things (IoT), big data analytics will play an important role for
electric utilities. In this paper, these challenges are answered with pathways
and guidelines to make current asset management practices smarter for the
future.
Comment: 13 pages, 3 figures, Proceedings of 12th World Congress on
Engineering Asset Management (WCEAM) 201
Big Data and the Internet of Things
Advances in sensing and computing capabilities are making it possible to
embed increasing computing power in small devices. This has enabled the sensing
devices not just to passively capture data at very high resolution but also to
take sophisticated actions in response. Combined with advances in
communication, this is resulting in an ecosystem of highly interconnected
devices referred to as the Internet of Things (IoT). In conjunction, advances
in machine learning have made it possible to build models on these
ever-increasing amounts of data. Consequently, devices all the way from heavy
assets such as aircraft engines to wearables such as health monitors can now
not only generate massive amounts of data but also draw on aggregate analytics
to "improve" their performance over time. Big data analytics has been
identified as a key enabler for the IoT. In this chapter, we discuss various
avenues of the IoT where big data analytics either is already making a
significant impact or is on the cusp of doing so. We also discuss social
implications and areas of concern.
Comment: 33 pages. Draft of upcoming book chapter in Japkowicz and Stefanowski
(eds.) Big Data Analysis: New Algorithms for a New Society, Springer Series
on Studies in Big Data, to appear
An IoT Endpoint System-on-Chip for Secure and Energy-Efficient Near-Sensor Analytics
Near-sensor data analytics is a promising direction for IoT endpoints, as it
minimizes energy spent on communication and reduces network load - but it also
poses security concerns, as valuable data is stored or sent over the network at
various stages of the analytics pipeline. Using encryption to protect sensitive
data at the boundary of the on-chip analytics engine is a way to address data
security issues. To cope with the combined workload of analytics and encryption
in a tight power envelope, we propose Fulmine, a System-on-Chip based on a
tightly-coupled multi-core cluster augmented with specialized blocks for
compute-intensive data processing and encryption functions, supporting software
programmability for regular computing tasks. The Fulmine SoC, fabricated in
65nm technology, consumes less than 20mW on average at 0.8V achieving an
efficiency of up to 70pJ/B in encryption, 50pJ/px in convolution, or up to
25MIPS/mW in software. As a strong argument for real-life flexible application
of our platform, we show experimental results for three secure analytics use
cases: secure autonomous aerial surveillance with a state-of-the-art deep CNN
consuming 3.16pJ per equivalent RISC op; local CNN-based face detection with
secured remote recognition in 5.74pJ/op; and seizure detection with encrypted
data collection from EEG within 12.7pJ/op.
Comment: 15 pages, 12 figures, accepted for publication in the IEEE
Transactions on Circuits and Systems I: Regular Papers
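The per-byte figures above translate directly into energy budgets; as a back-of-the-envelope check using the abstract's 70 pJ/B encryption efficiency and its sub-20 mW average power (the 1 MB payload size is an assumption for illustration):

```python
# Energy to encrypt a payload at the reported efficiency of 70 pJ/B.
PJ_PER_BYTE = 70              # from the abstract: up to 70 pJ/B in encryption
payload_bytes = 1_000_000     # assumed 1 MB sensor payload (illustrative)

energy_pj = PJ_PER_BYTE * payload_bytes
energy_uj = energy_pj / 1e6   # 1 uJ = 1e6 pJ
print(energy_uj)              # 70.0 microjoules

# Upper-bound time if the chip's whole ~20 mW budget went to encryption.
avg_power_w = 0.020           # <20 mW average at 0.8 V, per the abstract
seconds = (energy_uj * 1e-6) / avg_power_w
print(round(seconds * 1000, 2))  # ~3.5 ms
```

Arithmetic like this is how the tight power envelope claim cashes out: a megabyte of encrypted sensor data costs tens of microjoules, i.e. milliseconds of the SoC's average power budget.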