
    Reputation-aware Trajectory-based Data Mining in the Internet of Things (IoT)

    The Internet of Things (IoT) is a critically important technology for the acquisition of spatiotemporally dense data in diverse applications, ranging from environmental monitoring to surveillance systems. Such data help us improve our transportation systems, monitor air quality and the spread of diseases, respond to natural disasters, and support a host of other applications. However, IoT sensor data are error-prone for a number of reasons: sensors may be deployed in hazardous environments, may deplete their energy resources, may suffer mechanical faults, or may become the targets of malicious attacks by adversaries. While previous research has attempted to improve the quality of IoT data, it is limited in terms of capturing the sensing context and providing resilience against malicious attackers in real time. For instance, data fusion techniques that process data in batches cannot be applied to time-critical applications because they take a long time to respond. Furthermore, context awareness allows us to examine the sensing environment and react to environmental changes. While previous research has considered geographical context, no contemporary work has studied how a variety of sensing contexts (e.g., terrain elevation, wind speed, and user movement during sensing) can be combined with spatiotemporal relationships for online data prediction. This dissertation aims at developing online methods for data prediction by fusing spatiotemporal and contextual relationships among participating resource-constrained mobile IoT devices (e.g., smartphones, smartwatches, and fitness-tracking devices). To achieve this goal, we first introduce a data prediction mechanism that considers the spatiotemporal and contextual relationships among the sensors. Second, we develop a real-time outlier detection approach based on a window-based sub-trajectory clustering method that finds behavioral movement similarity in terms of space, time, direction, and location semantics. We relax the prior assumption of cooperative sensors in the concluding section. Finally, we develop a reputation-aware context-based data fusion mechanism by exploiting inter-sensor-category correlations. On one hand, this method can defend against false data injection by differentiating malicious and honest participants based on their reported data in real time. On the other hand, it yields a lower data prediction error rate.
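    As a rough illustration of the reputation idea in the final contribution, the sketch below shows a generic reputation-weighted fusion loop in Python: sensors whose reports stray from the consensus lose weight over successive rounds. The function name, the agreement-based update rule, and the learning rate are illustrative assumptions, not the dissertation's actual mechanism.

```python
import numpy as np

def reputation_weighted_fusion(reports, reputations, learning_rate=0.1):
    """Fuse one round of sensor reports using current reputation weights,
    then update each sensor's reputation based on how far its report
    deviates from the fused estimate. (Illustrative update rule.)"""
    reports = np.asarray(reports, dtype=float)
    weights = reputations / reputations.sum()
    fused = float(np.dot(weights, reports))

    # Sensors far from the fused estimate lose reputation; sensors that
    # agree with the consensus gain it.
    deviation = np.abs(reports - fused)
    scale = deviation.max() if deviation.max() > 0 else 1.0
    agreement = 1.0 - deviation / scale
    new_reputations = (1 - learning_rate) * reputations + learning_rate * agreement
    return fused, new_reputations

# Example: sensor 3 injects false data and its reputation decays over rounds.
reputations = np.ones(4)
for round_reports in [[21.0, 20.5, 21.2, 35.0],
                      [20.8, 21.0, 20.9, 36.0],
                      [21.1, 20.7, 21.0, 34.5]]:
    fused, reputations = reputation_weighted_fusion(round_reports, reputations)
    print(f"fused={fused:.2f}  reputations={np.round(reputations, 2)}")
```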

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential in terms of supporting a broad range of complex and compelling applications in both military and civilian fields, where users are able to enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex, heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have had great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios of future wireless networks.
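    To make the reinforcement-learning category concrete in a wireless setting, the toy sketch below applies a stateless (bandit-style) Q-update to dynamic channel selection. The channel set, success probabilities, and parameters are invented for illustration and are not drawn from the article.

```python
import random

# Toy agent choosing among a few radio channels; channel success
# probabilities are hidden from the agent and stand in for a hypothetical
# interference environment.
CHANNELS = [0, 1, 2, 3]
SUCCESS_PROB = {0: 0.2, 1: 0.5, 2: 0.9, 3: 0.4}  # assumed, for illustration only

q = {c: 0.0 for c in CHANNELS}
alpha, epsilon = 0.1, 0.1  # learning rate and exploration rate

for step in range(5000):
    # Epsilon-greedy action selection.
    if random.random() < epsilon:
        channel = random.choice(CHANNELS)
    else:
        channel = max(q, key=q.get)
    # Reward is 1 for a successful transmission, 0 otherwise.
    reward = 1.0 if random.random() < SUCCESS_PROB[channel] else 0.0
    # Stateless (bandit-style) Q-update toward the observed reward.
    q[channel] += alpha * (reward - q[channel])

print({c: round(v, 2) for c, v in q.items()})  # channel 2 should dominate
```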

    A survey of machine learning techniques applied to self organizing cellular networks

    In this paper, a survey of the literature of the past fifteen years involving Machine Learning (ML) algorithms applied to self-organizing cellular networks is performed. In order for future networks to overcome the current limitations and address the issues of current cellular systems, it is clear that more intelligence needs to be deployed so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self-Organizing Networks (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks, but also a classification of each surveyed paper in terms of its learning solution, together with examples. The authors also classify each paper in terms of its self-organizing use case and discuss how each proposed solution performed. In addition, a comparison between the most commonly found ML algorithms in terms of certain SON metrics is performed, and general guidelines on when to choose each ML algorithm for each SON function are proposed. Lastly, this work also provides future research directions and new paradigms that the use of more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networks domain, potentially enabling the concept of SON fully in the near future.

    Machine Learning for Identifying Group Trajectory Outliers

    Prior works on the trajectory outlier detection problem consider only individual outliers. However, in real-world scenarios, trajectory outliers often appear in groups, e.g., a group of bikes that deviates from the usual trajectory due to street maintenance in the context of intelligent transportation. The current paper considers the Group Trajectory Outlier (GTO) problem and proposes three algorithms. The first and second algorithms are extensions of the well-known DBSCAN and kNN algorithms, while the third models the GTO problem as a feature selection problem. Furthermore, two enhancements of the proposed algorithms are introduced. The first is based on ensemble learning and computational intelligence, which allows the algorithms' outputs to be merged and may improve the final result. The second is a general high-performance computing framework for big trajectory databases, which we used for a GPU-based implementation. Experimental results on different real trajectory databases show the scalability of the proposed approaches.
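    A minimal sketch of the DBSCAN-based flavour of the idea is given below: trajectories are resampled to fixed-length feature vectors, DBSCAN marks individual outliers as noise, and a second pass groups outliers that deviate together. The resampling representation, parameters, and second clustering pass are illustrative assumptions rather than the paper's algorithms.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def resample(traj, n_points=20):
    """Resample a 2-D trajectory (list of (x, y)) to a fixed number of points
    so that trajectories of different lengths become comparable vectors."""
    traj = np.asarray(traj, dtype=float)
    t_old = np.linspace(0, 1, len(traj))
    t_new = np.linspace(0, 1, n_points)
    x = np.interp(t_new, t_old, traj[:, 0])
    y = np.interp(t_new, t_old, traj[:, 1])
    return np.column_stack([x, y]).ravel()

def group_trajectory_outliers(trajectories, eps=2.0, min_samples=3):
    """Cluster trajectory feature vectors with DBSCAN; vectors labelled -1
    (noise) are individual outliers, and a second DBSCAN pass over those
    outliers groups the ones that deviate together."""
    features = np.array([resample(t) for t in trajectories])
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(features)
    outlier_idx = np.where(labels == -1)[0]
    if len(outlier_idx) == 0:
        return []
    group_labels = DBSCAN(eps=eps, min_samples=2).fit_predict(features[outlier_idx])
    groups = {}
    for idx, g in zip(outlier_idx, group_labels):
        groups.setdefault(g, []).append(int(idx))
    return list(groups.values())
```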

    Traffic Prediction using Artificial Intelligence: Review of Recent Advances and Emerging Opportunities

    Traffic prediction plays a crucial role in alleviating traffic congestion, a critical problem globally that results in negative consequences such as lost hours of additional travel time and increased fuel consumption. Integrating emerging technologies into transportation systems provides opportunities for significantly improving traffic prediction and brings about new research problems. In order to lay the foundation for understanding the open research challenges in traffic prediction, this survey aims to provide a comprehensive overview of traffic prediction methodologies. Specifically, we focus on recent advances and emerging research opportunities in Artificial Intelligence (AI)-based traffic prediction methods, due to their recent success and potential, with an emphasis on multivariate traffic time series modeling. We first provide a list and explanation of the various data types and resources used in the literature. Next, the essential data preprocessing methods within the traffic prediction context are categorized, and the prediction methods and applications are subsequently summarized. Lastly, we present primary research challenges in traffic prediction and discuss some directions for future research.
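    As a small, self-contained example of multivariate traffic time-series modelling, the sketch below frames one-step-ahead prediction as sliding-window ridge regression over synthetic flows from three hypothetical road sensors. The data, lag length, and model choice are assumptions for illustration, not methods endorsed by the survey.

```python
import numpy as np
from sklearn.linear_model import Ridge

def make_windows(series, lags=6):
    """Turn a multivariate series of shape (T, n_sensors) into supervised
    pairs: the previous `lags` time steps (flattened) predict the next step."""
    X, y = [], []
    for t in range(lags, len(series)):
        X.append(series[t - lags:t].ravel())
        y.append(series[t])
    return np.array(X), np.array(y)

# Synthetic "traffic flow" from 3 road sensors sharing a daily pattern.
rng = np.random.default_rng(0)
t = np.arange(500)
base = 50 + 20 * np.sin(2 * np.pi * t / 96)          # 96 steps ~ one day at 15 min
series = base[:, None] + rng.normal(0, 3, size=(500, 3))

X, y = make_windows(series, lags=6)
split = 400
model = Ridge(alpha=1.0).fit(X[:split], y[:split])    # multi-output regression
pred = model.predict(X[split:])
mae = np.abs(pred - y[split:]).mean()
print(f"one-step-ahead MAE: {mae:.2f} vehicles per interval")
```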

    USING PROBABILISTIC GRAPHICAL MODELS TO DRAW INFERENCES IN SENSOR NETWORKS WITH TRACKING APPLICATIONS

    Sensor networks have been an active research area in the past decade due to the variety of their applications. Many studies have addressed the problems underlying the middleware services of sensor networks, such as self-deployment, self-localization, and synchronization. With these middleware services in place, sensor networks have matured into a detection and surveillance paradigm for many real-world applications. Individual sensors are small, so they can be deployed in areas with limited space and make unobstructed measurements in locations that traditional centralized systems would have trouble reaching. However, a few physical limitations can prevent sensors from performing at their maximum potential: individual sensors have a limited power supply, the wireless band can become very cluttered when multiple sensors try to transmit at the same time, and limited communication range means the network may not have a 1-hop communication topology, so routing can be a problem in many cases. Carefully designed algorithms can alleviate these physical limitations and allow sensor networks to be utilized to their full potential. Graphical models are an intuitive choice for designing sensor network algorithms. This thesis focuses on a classic application of sensor networks, the detection and tracking of targets. It develops feasible inference techniques for sensor networks using statistical graphical-model inference, binary sensor detection, event isolation and dynamic clustering. The main strategy is to use only binary data for rough global inferences, and then dynamically form small-scale clusters around the target for detailed computations. This framework is then extended to network topology manipulation, so that it can be applied to tracking under different network topology settings. Finally, the system was tested in both simulation and real-world environments. The simulations were performed on various network topologies, from regularly distributed to randomly distributed networks. The results show that the algorithm performs well in randomly distributed networks and hence requires minimal deployment effort. The experiments were carried out in both corridor and open-space settings. An in-home fall detection system was simulated with real-world settings: it was set up with 30 Bumblebee radars and 30 ultrasonic sensors driven by TI EZ430-RF2500 boards scanning a typical 800 sq ft apartment. The Bumblebee radars are calibrated to detect falls of the human body, and the two-tier tracking algorithm is used with the ultrasonic sensors to track the location of the elderly resident.
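    In the spirit of the "binary data for rough global inference, then dynamic clustering" strategy, the sketch below localizes a target on a coarse grid from binary detections via a simple likelihood model and then selects nearby sensors as the dynamic cluster. The grid model, detection and false-alarm probabilities, and radii are illustrative assumptions and not the thesis's graphical-model formulation.

```python
import numpy as np

# Coarse target localization from binary detections on a grid, followed by
# dynamic clustering of sensors near the most likely cell.
GRID = 20                      # GRID x GRID cells over the unit square
P_DETECT, P_FALSE = 0.9, 0.05  # assumed detection / false-alarm probabilities
SENSE_RADIUS = 0.15

def locate_and_cluster(sensor_xy, detections, cluster_radius=0.2):
    xs = (np.arange(GRID) + 0.5) / GRID
    cells = np.array([(x, y) for x in xs for y in xs])   # candidate target cells
    # Distance from every cell to every sensor: shape (cells, sensors).
    dist = np.linalg.norm(cells[:, None, :] - sensor_xy[None, :, :], axis=2)
    in_range = dist < SENSE_RADIUS
    # Probability each sensor fires given a target in each cell.
    p_fire = np.where(in_range, P_DETECT, P_FALSE)
    det = np.asarray(detections, dtype=float)
    log_lik = (det * np.log(p_fire) + (1 - det) * np.log(1 - p_fire)).sum(axis=1)
    best_cell = cells[np.argmax(log_lik)]
    # Dynamic cluster: sensors close to the rough estimate do detailed work.
    cluster = np.where(np.linalg.norm(sensor_xy - best_cell, axis=1) < cluster_radius)[0]
    return best_cell, cluster

sensors = np.random.default_rng(1).uniform(0, 1, size=(30, 2))
target = np.array([0.6, 0.4])
fired = (np.linalg.norm(sensors - target, axis=1) < SENSE_RADIUS).astype(int)
estimate, cluster = locate_and_cluster(sensors, fired)
print("estimate:", np.round(estimate, 2), "cluster sensors:", cluster)
```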

    Learning Human Behaviour Patterns by Trajectory and Activity Recognition

    The world's population is ageing, increasing the awareness of neurological and behavioural impairments that may arise from human ageing. These impairments can manifest as cognitive conditions or reduced mobility. Such conditions are difficult to detect in time when relying only on periodic medical appointments, and this lack of routine screening demands the development of solutions to better assist and monitor human behaviour. The available technologies for monitoring human behaviour are limited to indoor environments and require the installation of sensors around the user's home, which entails high maintenance and installation costs. With the widespread use of smartphones, it is possible to take advantage of their sensing capabilities to better assist the elderly population. This study investigates what we can learn about human behaviour patterns from this rich and pervasive mobile sensing data. A data collection campaign over a period of six months was designed to measure three different human routines through human trajectory analysis and activity recognition, comprising indoor and outdoor environments. A framework for modelling human behaviour was developed using human motion features, extracted in both unsupervised and supervised manners. The unsupervised feature extraction measures mobility properties such as step length, user points of interest, and locomotion activities inferred from a user-independent trained classifier. The supervised feature extraction was designed to be user-dependent, as each user may have specific behaviours that are common to his or her routine. The human patterns were modelled through probability density functions and clustering approaches. Using the learned patterns, inferences about the current human behaviour were continuously quantified by an anomaly detection algorithm in which distance measurements were used to detect significant changes in behaviour. Experimental results demonstrate the effectiveness of the proposed framework, which showed increased potential to learn behaviour patterns and detect anomalies.
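    A compact sketch of the distance-based anomaly detection step might look as follows: daily motion features are summarized by a mean and covariance, and a day whose Mahalanobis distance from the learned routine exceeds a threshold is flagged. The feature choice, the Mahalanobis distance, and the threshold are assumptions for illustration; the study itself models patterns with probability density functions and clustering approaches.

```python
import numpy as np

def fit_routine_model(daily_features):
    """Model a user's routine as the mean and covariance of daily feature
    vectors (e.g. step length, minutes outdoors, number of visited places)."""
    X = np.asarray(daily_features, dtype=float)
    mean = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # regularized
    return mean, np.linalg.inv(cov)

def is_anomalous(day, mean, cov_inv, threshold=3.0):
    """Flag a day whose Mahalanobis distance to the learned routine exceeds
    the threshold, i.e. a significant change in behaviour."""
    diff = np.asarray(day, dtype=float) - mean
    distance = float(np.sqrt(diff @ cov_inv @ diff))
    return distance > threshold, distance

# Example: 60 routine days, then one day with far less mobility.
rng = np.random.default_rng(2)
routine = rng.normal(loc=[0.7, 120.0, 5.0], scale=[0.05, 15.0, 1.0], size=(60, 3))
mean, cov_inv = fit_routine_model(routine)
flag, dist = is_anomalous([0.4, 20.0, 1.0], mean, cov_inv)
print(f"anomalous={flag}, distance={dist:.1f}")
```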