766 research outputs found

    Machine Learning for Microcontroller-Class Hardware -- A Review

    Get PDF
    The advancements in machine learning have opened a new opportunity to bring intelligence to low-end Internet-of-Things nodes such as microcontrollers. Conventional machine learning deployment has a high memory and compute footprint, hindering direct deployment on ultra-resource-constrained microcontrollers. This paper highlights the unique requirements of enabling onboard machine learning for microcontroller-class devices. Researchers use a specialized model development workflow for resource-limited applications to ensure that the compute and latency budget is within the device limits while still maintaining the desired performance. We characterize a closed-loop, widely applicable workflow of machine learning model development for microcontroller-class devices and show that several classes of applications adopt a specific instance of it. We present both qualitative and numerical insights into different stages of model development by showcasing several use cases. Finally, we identify the open research challenges and unsolved questions demanding careful consideration moving forward. Comment: Accepted for publication at IEEE Sensors Journal
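A key step in the microcontroller-class workflow the review describes is shrinking a trained model's memory footprint. A common technique is post-training affine int8 quantization; the sketch below is a minimal illustration of that idea (not code from the paper), mapping a float32 weight tensor to int8 with a per-tensor scale and zero point:

```python
import numpy as np

def quantize_int8(weights):
    """Affine (asymmetric) post-training quantization of a float tensor
    to int8, as commonly used to shrink models for microcontrollers."""
    w_min, w_max = float(weights.min()), float(weights.max())
    scale = (w_max - w_min) / 255.0 or 1.0  # guard against constant tensors
    zero_point = int(round(-128 - w_min / scale))  # maps w_min -> -128
    q = np.clip(np.round(weights / scale) + zero_point, -128, 127).astype(np.int8)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover an approximate float tensor from the int8 representation."""
    return (q.astype(np.float32) - zero_point) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64)).astype(np.float32)
q, scale, zp = quantize_int8(w)
w_hat = dequantize(q, scale, zp)
print(q.nbytes, w.nbytes)                 # 4096 vs 16384: 4x smaller
print(float(np.abs(w - w_hat).max()))     # reconstruction error ~ scale/2
```

The 4x size reduction (and cheaper integer arithmetic) is what makes such models fit the flash and SRAM budgets of typical microcontrollers, at the cost of a bounded per-weight rounding error.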

    Next Generation Internet of Things – Distributed Intelligence at the Edge and Human-Machine Interactions

    Get PDF
    This book provides an overview of the next generation Internet of Things (IoT), ranging from research, innovation, and development priorities to enabling technologies in a global context. It is intended as a standalone volume in a series covering the activities of the Internet of Things European Research Cluster (IERC), including research, technological innovation, validation, and deployment. The following chapters build on the ideas put forward by the European Research Cluster, the IoT European Platform Initiative (IoT–EPI), the IoT European Large-Scale Pilots Programme, and the IoT European Security and Privacy Projects, presenting global views and state-of-the-art results regarding the next generation of IoT research, innovation, development, and deployment. The IoT and Industrial Internet of Things (IIoT) are evolving towards the next generation of Tactile IoT/IIoT, bringing together hyperconnectivity (5G and beyond), edge computing, Distributed Ledger Technologies (DLTs), virtual and augmented reality (VR/AR), and artificial intelligence (AI) transformation. Following the wider adoption of consumer IoT, the next generation of IoT/IIoT innovation for business is driven by industries, addressing interoperability issues and providing new end-to-end security solutions to face continuous threats. The advances of AI technology in vision, speech recognition, natural language processing, and dialog are enabling the development of end-to-end intelligent systems encapsulating multiple technologies, delivering services in real time using limited resources.
These developments focus on designing and delivering embedded and hierarchical AI solutions in IoT/IIoT and edge computing, using distributed architectures, DLT platforms, and distributed end-to-end security, which provide real-time decisions using less data and fewer computational resources, while accessing each type of resource in a way that enhances the accuracy and performance of models in the various IoT/IIoT applications. The convergence and combination of IoT, AI, and other related technologies to derive insights, decisions, and revenue from sensor data provide new business models and sources of monetization. Meanwhile, scalable, IoT-enabled applications have become part of larger business objectives, enabling digital transformation with a focus on new services and applications. Serving the next generation of Tactile IoT/IIoT real-time use cases over 5G and Network Slicing technology is essential for consumer and industrial applications, and supports reducing operational costs, increasing efficiency, and leveraging additional capabilities for real-time autonomous systems. New IoT distributed architectures, combined with system-level architectures for edge/fog computing, are evolving IoT platforms, including AI and DLTs, with embedded intelligence in the hyperconnectivity infrastructure. The next generation of IoT/IIoT technologies is highly transformational, enabling innovation at scale and autonomous decision-making in various application domains such as healthcare, smart homes, smart buildings, smart cities, energy, agriculture, transportation and autonomous vehicles, the military, logistics and supply chain, retail and wholesale, manufacturing, mining, and oil and gas.

    A Framework for Preserving Privacy and Cybersecurity in Brain-Computer Interfacing Applications

    Full text link
    Brain-Computer Interfaces (BCIs) comprise a rapidly evolving field of technology with the potential for far-reaching impact in domains ranging from medicine through industry to art, gaming, and the military. Today, these emerging BCI applications are typically still at early technology readiness levels, but because BCIs create novel technical communication channels for the human brain, they have raised privacy and security concerns. To mitigate such risks, a large body of countermeasures has been proposed in the literature, but a general framework is lacking that would describe how the privacy and security of BCI applications can be protected by design, i.e., already as an integral part of the early BCI design process, in a systematic manner, and allowing suitable depth of analysis for different contexts such as commercial BCI product development vs. academic research and lab prototypes. Here we propose adapting recent systems-engineering methodologies for privacy threat modeling, risk assessment, and privacy engineering to the BCI field. These methodologies address privacy and security concerns in a more systematic and holistic way than previous approaches, and provide reusable patterns for moving from principles to actions. We apply these methodologies to BCI systems and data flows and derive a generic, extensible, and actionable framework for brain-privacy-preserving cybersecurity in BCI applications. This framework is designed for flexible application to the wide range of current and future BCI applications. We also propose a range of novel privacy-by-design features for BCIs, with an emphasis on features promoting BCI transparency as a prerequisite for informational self-determination of BCI users, as well as design features for ensuring BCI user autonomy. We anticipate that our framework will contribute to the development of privacy-respecting, trustworthy BCI technologies.

    An Ethics for the New (and Old) Surveillance

    Get PDF

    Detection of Anomalous Behavior of IoT/CPS Devices Using Their Power Signals

    Get PDF
    Embedded computing devices, in the Internet of Things (IoT) or Cyber-Physical Systems (CPS), are becoming pervasive in many domains around the world. Their wide deployment in simple applications (e.g., smart buildings, fleet management, and smart agriculture) or in more critical operations (e.g., industrial control, smart power grids, and self-driving cars) creates significant market potential ($4-11 trillion in annual revenue expected by 2025). A main requirement for the success of such systems and applications is the capacity to ensure the performance of these devices. This task includes equipping them to be resilient against security threats and failures. Globally, several critical infrastructure applications have been the target of cyber attacks. These recent incidents, as well as the rich applicable literature, confirm that more research is needed to overcome such challenges. Consequently, the need for robust approaches that detect anomalously behaving devices in security- and safety-critical applications has become paramount. Solving such a problem minimizes different kinds of losses (e.g., confidential data theft, financial loss, service access restriction, or even casualties). In light of the aforementioned motivation and discussion, this thesis focuses on the problem of detecting the anomalous behavior of IoT/CPS devices by considering their side-channel information. Solving such a problem is extremely important in maintaining the security and dependability of critical systems and applications. Although several side-channel-based approaches are found in the literature, there are still important research gaps that need to be addressed. First, the intrusive nature of the monitoring in some of the proposed techniques results in resource overhead and requires instrumentation of the internal components of a device, which makes them impractical. It also raises a data integrity flag.
Second, the lack of realistic experimental power consumption datasets that reflect the normal and anomalous behaviors of IoT and CPS devices has prevented fair and coherent comparisons with the state of the art in this domain. Finally, most of the research to date has concentrated on the accuracy of detection and not the novelty of detecting new anomalies. Such a direction relies on: (i) the availability of labeled datasets; (ii) the complexity of the extracted features; and (iii) the available compute resources. These assumptions and requirements are usually unrealistic and unrepresentative. This research aims to bridge these gaps as follows. First, this study extends the state of the art that adopts the idea of leveraging the power consumption of devices as a signal, and the concept of decoupling the monitoring system from the devices to be monitored, to detect and classify the "operational health" of the devices. Second, this thesis provides and builds power consumption-based datasets that can be utilized by the AI as well as security research communities to validate newly developed detection techniques. The collected datasets cover a wide range of anomalous device behavior due to the main aspects of device security (i.e., confidentiality, integrity, and availability) and partial system failures. The extensive experiments include: a wide spectrum of emulated malware scenarios; five real malware applications taken from the well-known Drebin dataset; distributed denial-of-service (DDoS) attacks where an IoT device is treated as (1) the victim of a DDoS attack and (2) the source of a DDoS attack; cryptomining malware where the resources of an IoT device are hijacked to be used to the attacker's advantage; and faulty CPU cores. This level of extensive validation has not yet been reported in any study in the literature.
Third, this research presents a novel supervised technique to detect anomalous device behavior based on transforming the problem into an image classification problem. The main aim of this methodology is to improve the detection performance. In order to achieve the goals of this study, the methodology combines two powerful computer vision tools, namely Histograms of Oriented Gradients (HOG) and a Convolutional Neural Network (CNN). Such a detection technique is not only useful in this present case but can contribute to most time-series classification (TSC) problems. Finally, this thesis proposes a novel unsupervised detection technique that requires only the normal behavior of a device in the training phase. Therefore, this methodology aims at detecting new/unseen anomalous behavior. The methodology leverages the power consumption of a device and Restricted Boltzmann Machine (RBM) AutoEncoders (AE) to build a model that is robust to the presence of security threats. The methodology makes use of a stacked RBM AE and Principal Component Analysis (PCA) to extract feature vectors based on the AE's reconstruction errors. A One-Class Support Vector Machine (OC-SVM) classifier is then trained to perform the detection task. Across 18 different datasets, both of our proposed detection techniques demonstrated high detection performance, with at least ~88% accuracy and 85% F-score on average. The empirical results indicate the effectiveness of the proposed techniques and demonstrate a detection performance gain of 9-17% over results reported by other methods.
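The unsupervised pipeline described above (reconstruction-error features from a model trained only on normal behavior, fed to a one-class classifier) can be sketched with scikit-learn. In this illustrative approximation, a PCA reconstruction model stands in for the thesis's stacked RBM autoencoder, and the power traces are synthetic, so this is a sketch of the general approach rather than the thesis's implementation:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(42)

# Synthetic "power traces": normal behavior is a noisy sine;
# anomalous behavior adds a high-frequency component.
t = np.linspace(0, 2 * np.pi, 128)
normal = np.sin(t) + 0.05 * rng.normal(size=(200, 128))
anomalous = np.sin(t) + 0.5 * np.sin(12 * t) + 0.05 * rng.normal(size=(50, 128))

# Stand-in for the stacked RBM autoencoder: a PCA model fitted on
# normal traces only; its per-sample reconstruction error plays the
# role of the AE reconstruction-error feature vector.
scaler = StandardScaler().fit(normal)
pca = PCA(n_components=8).fit(scaler.transform(normal))

def recon_error_features(x):
    z = scaler.transform(x)
    recon = pca.inverse_transform(pca.transform(z))
    return np.abs(z - recon)  # per-feature reconstruction error

# One-class SVM trained only on normal behavior; anomalies are
# flagged at test time as outliers (-1), inliers are +1.
clf = OneClassSVM(nu=0.05, gamma="scale").fit(recon_error_features(normal))

pred_normal = clf.predict(recon_error_features(normal))
pred_anom = clf.predict(recon_error_features(anomalous))
print((pred_normal == 1).mean(), (pred_anom == -1).mean())
```

The key property, as in the thesis, is that no anomalous examples are needed for training: the anomaly's extra frequency component inflates the reconstruction error, pushing those traces outside the boundary the OC-SVM learned from normal behavior alone.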

    Sensor-based ICT Systems for Smart Societies

    Get PDF
    The abstract is in the attachment.