
    TrustShadow: Secure Execution of Unmodified Applications with ARM TrustZone

    The rapid evolution of Internet-of-Things (IoT) technologies has led to an emerging need to make these devices smarter. A variety of applications now run simultaneously on an ARM-based processor. For example, devices on the edge of the Internet are provided with higher horsepower so they can be entrusted with storing, processing and analyzing data collected from IoT devices. This significantly improves efficiency and reduces the amount of data that needs to be transported to the cloud for data processing, analysis and storage. However, commodity OSes are prone to compromise. Once they are exploited, attackers can access the data on these devices. Since the data stored and processed on these devices can be sensitive, this threat, left untackled, is particularly disconcerting. In this paper, we propose a new system, TrustShadow, that shields legacy applications from untrusted OSes. TrustShadow takes advantage of ARM TrustZone technology and partitions resources into the secure and normal worlds. In the secure world, TrustShadow constructs a trusted execution environment for security-critical applications. This trusted environment is maintained by a lightweight runtime system that coordinates the communication between applications and the ordinary OS running in the normal world. The runtime system does not provide system services itself. Rather, it forwards requests for system services to the ordinary OS and verifies the correctness of the responses. To demonstrate the efficiency of this design, we prototyped TrustShadow on a real chip board with ARM TrustZone support and evaluated its performance using both microbenchmarks and real-world applications. We showed TrustShadow introduces only negligible overhead to real-world applications. Comment: MobiSys 201
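    The forward-and-verify pattern described above can be sketched in a few lines of Python. This is a hedged, conceptual illustration only and not TrustShadow's implementation (which intercepts system calls and protects memory pages inside TrustZone); all class and method names below are invented for the example.

```python
# Conceptual sketch only, not TrustShadow's actual mechanism: a "secure-world" runtime
# forwards service requests to an untrusted "normal-world" OS and verifies replies
# against integrity metadata it keeps. Names and the toy file store are illustrative.
import hashlib


class UntrustedOS:
    """Stand-in for the normal-world OS that actually services requests."""

    def __init__(self):
        self.files = {}

    def handle(self, request):
        # A compromised OS could return arbitrary data here.
        if request["op"] == "write":
            self.files[request["path"]] = request["data"]
            return b"ok"
        if request["op"] == "read":
            return self.files.get(request["path"], b"")
        raise NotImplementedError(request["op"])


class ShadowRuntime:
    """Stand-in for the secure-world runtime: forwards requests, verifies replies."""

    def __init__(self, untrusted_os):
        self._os = untrusted_os
        self._digests = {}  # path -> digest recorded when data left the secure world

    def write(self, path, data):
        self._digests[path] = hashlib.sha256(data).hexdigest()
        self._os.handle({"op": "write", "path": path, "data": data})

    def read(self, path):
        reply = self._os.handle({"op": "read", "path": path})
        expected = self._digests.get(path)
        if expected is not None and hashlib.sha256(reply).hexdigest() != expected:
            raise RuntimeError(f"untrusted OS returned tampered data for {path}")
        return reply


if __name__ == "__main__":
    normal_world = UntrustedOS()
    runtime = ShadowRuntime(normal_world)
    runtime.write("secret.bin", b"sensitive payload")
    print(runtime.read("secret.bin"))           # verified read succeeds
    normal_world.files["secret.bin"] = b"evil"  # simulate a compromised OS
    try:
        runtime.read("secret.bin")
    except RuntimeError as err:
        print("detected:", err)
```

    The point of the sketch is simply that the runtime never trusts the normal-world OS: every response is checked in the secure world before it reaches the shielded application.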

    The application of a business intelligence tool for service delivery improvement : the case of South Africa

    Abstract: The global environment requires organisations to adapt and respond quickly to its complex nature. Responding to such an environment depends on real-time information. In the last decade, organisations have relied heavily on human expertise to extract, analyse and process data into meaningful information for decision making. Many will probably agree with the assertion that the complexity of globalisation has led to a complexity in modern data analysis, which encompasses different elements (technology and innovation, the Internet of Things and the influx of data, to name but a few), resulting in modern scientific problems. It is evident that organisational knowledge has become the enabling factor for decision-making in both the private and public sector. Yet, the study is of the opinion that the advancement of technology and the Internet of Things has complicated matters further, making it difficult for humankind to interpret complex and vast amounts of data at the speed required to keep up with the demands of the global environment in which organisations operate. Therefore, it is likely that the discovered knowledge may be inaccurate at times. In responding to these dynamics, organisations require computational intelligence systems to transform the data they acquire into real-time meaningful information in order to make informed decisions. ... D.Phil. (Engineering Management)

    Data Mining in Internet of Things Systems: A Literature Review

    The Internet of Things (IoT) and cloud technologies have been the main focus of recent research, allowing for the accumulation of a vast amount of data generated from this diverse environment. These data undoubtedly contain priceless knowledge, provided it can be correctly discovered and correlated in an efficient manner. Data mining algorithms can be applied to the Internet of Things (IoT) to extract hidden information from the massive amounts of data that are generated by IoT and are thought to have high business value. In this paper, the most important data mining approaches, covering classification, clustering, association analysis, time series analysis, and outlier analysis, will be covered. Additionally, a survey of recent work in this direction is included. Other significant challenges in the field are collecting, storing, and managing the large number of devices along with their associated features. In this paper, a deep look at data mining for IoT platforms is given, concentrating on real applications found in the literature.
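    As a hedged illustration of two of the listed approaches, the following sketch applies clustering and outlier analysis to synthetic IoT sensor readings with scikit-learn; the data, parameters and thresholds are invented for the example and are not taken from the survey.

```python
# Illustrative only: clustering and outlier analysis on synthetic IoT sensor readings.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Synthetic (temperature, humidity) readings: two operating regimes plus a few faulty samples.
normal_a = rng.normal(loc=[21.0, 40.0], scale=0.5, size=(200, 2))
normal_b = rng.normal(loc=[24.0, 55.0], scale=0.5, size=(200, 2))
faulty = rng.normal(loc=[35.0, 10.0], scale=1.0, size=(5, 2))
readings = np.vstack([normal_a, normal_b, faulty])

# Clustering: recover the two operating regimes.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(readings)

# Outlier analysis: flag readings that do not fit either regime.
outlier_flags = IsolationForest(contamination=0.02, random_state=0).fit_predict(readings)

print("cluster sizes:", np.bincount(clusters))
print("flagged as outliers:", int(np.sum(outlier_flags == -1)))
```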

    Proposed Model for Real-Time Anomaly Detection in Big IoT Sensor Data for Smart City

    A smart city represents an advanced urban environment that utilizes digital technologies to improve the well-being of residents, efficiently manage urban operations, and prioritize long-term sustainability. These technologically advanced cities collect significant data through various Internet of Things (IoT) sensors, highlighting the crucial importance of detecting anomalies to ensure both efficient operation and security. However, real-time identification of anomalies presents challenges due to the sheer volume, rapidity, and diversity of the data streams. This manuscript introduces an innovative framework designed for the immediate detection of anomalies within extensive IoT sensor data in the context of a smart city. Our proposed approach integrates a combination of unsupervised machine learning techniques, statistical analysis, and expert feature engineering to achieve real-time anomaly detection. Through an empirical assessment of a practical dataset obtained from a smart city environment, we demonstrate that our model outperforms established techniques for anomaly detection
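    The general recipe the abstract describes, hand-crafted rolling-window features combined with an unsupervised detector, can be sketched as follows. This is an assumption-laden illustration rather than the authors' model: the window size, retraining interval and choice of IsolationForest are all placeholders.

```python
# Hedged sketch of streaming anomaly detection: rolling-statistics features feeding an
# unsupervised detector that is periodically retrained on the feature history.
from collections import deque

import numpy as np
from sklearn.ensemble import IsolationForest


class StreamingAnomalyDetector:
    def __init__(self, window=60, retrain_every=300):
        self.window = deque(maxlen=window)   # recent raw readings used for features
        self.history = []                    # feature vectors seen so far
        self.retrain_every = retrain_every
        self.model = None

    def _features(self, value):
        arr = np.array(self.window) if self.window else np.array([value])
        mean, std = arr.mean(), arr.std() + 1e-9
        return [value, mean, std, (value - mean) / std]  # raw value, rolling stats, z-score

    def update(self, value):
        feats = self._features(value)
        self.window.append(value)
        self.history.append(feats)
        if len(self.history) % self.retrain_every == 0:
            self.model = IsolationForest(contamination=0.01, random_state=0).fit(self.history)
        if self.model is None:
            return False                      # still warming up
        return self.model.predict([feats])[0] == -1


detector = StreamingAnomalyDetector()
rng = np.random.default_rng(1)
stream = list(rng.normal(50, 2, 1200)) + [120.0]   # simulated sensor stream with one spike
flags = [detector.update(x) for x in stream]
print("anomaly flagged on final spike:", flags[-1])
```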

    Anomaly Detection in IoT: Methods, Techniques and Tools

    [Abstract] Nowadays, the Internet of Things (IoT) network, as a system of interrelated computing devices with the ability to transfer data over a network, is present in many scenarios of everyday life. Understanding how traffic behaves can be done more easily if the real environment is replicated in a virtualized environment. In this paper, we propose a methodology to develop a systematic approach to dataset analysis for detecting traffic anomalies in an IoT network. The reader will become familiar with the specific techniques and tools that are used. The methodology has five stages: definition of the scenario, injection of anomalous packets, dataset analysis, implementation of classification algorithms for anomaly detection, and conclusions.
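    A minimal sketch of the fourth stage, training a classifier to separate normal traffic from injected anomalies, is shown below; the per-flow features, labels and choice of a random forest are assumptions for illustration and are not taken from the paper.

```python
# Illustrative only: supervised anomaly classification on toy, labeled per-flow features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

# Toy per-flow features: [packet count, mean packet size, mean inter-arrival time].
normal = np.column_stack([rng.poisson(20, 500), rng.normal(300, 40, 500), rng.normal(0.5, 0.1, 500)])
anomalous = np.column_stack([rng.poisson(200, 50), rng.normal(80, 10, 50), rng.normal(0.01, 0.005, 50)])
X = np.vstack([normal, anomalous])
y = np.array([0] * len(normal) + [1] * len(anomalous))   # 1 = injected anomalous traffic

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test), target_names=["normal", "anomalous"]))
```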

    EVALUATING THE CYBER SECURITY IN THE INTERNET OF THINGS: SMART HOME VULNERABILITIES

    The need for advanced cyber security measures and strategies is attributed to the modern sophistication of cyber-attacks and the intense media attention when attacks and breaches occur. In May 2014, a congressional report suggested that Americans used approximately 500 million Internet-capable devices at home, including, but not limited to, smartphones, tablets, and other Internet-connected devices, which run various unimpeded applications. Owing to this high level of connectivity, our home environment is not immune to the cyber-attack paradigm; rather, the home has evolved to become one of the markets most influenced by the Internet of Things, with extensive attack surfaces, attack vectors, and unanswered security concerns. Thus, the aim of the present research was to investigate behavioral heuristics of the Internet of Things by adopting an exploratory multiple case study approach. A controlled Internet of Things ecosystem was constructed, consisting of real-life data observed during a typical life cycle of initial configuration and average use. The information obtained during the course of this study involved the systematic acquisition and analysis of Smart Home ecosystem link-layer protocol data units (PDUs). The methodology employed during this study involved a recursive multiple case study evaluation of the Smart Home ecosystem data-link layer PDUs and aligned the case studies to the existing Intrusion Kill Chain design model. The proposed solution emerging from the case studies builds the appropriate data collection template while concurrently developing a Security as a Service (SECaaS) capability to evaluate the collected results.
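    The raw material of such a study, link-layer PDUs grouped per device, can be gathered with a few lines of Scapy, as in the hedged sketch below; the interface name is an assumption, capture requires elevated privileges, and this is not the study's actual collection template.

```python
# Illustrative only: capture link-layer frames and summarise them per source MAC address.
from collections import Counter, defaultdict

from scapy.all import Ether, sniff

frames_per_device = Counter()
bytes_per_device = defaultdict(int)


def record(pkt):
    """Tally every captured frame by its source MAC address."""
    if Ether in pkt:
        src = pkt[Ether].src
        frames_per_device[src] += 1
        bytes_per_device[src] += len(pkt)


# Capture 500 frames from the smart-home segment (interface name is illustrative).
sniff(iface="wlan0", prn=record, store=False, count=500)

for mac, n in frames_per_device.most_common():
    print(f"{mac}: {n} frames, {bytes_per_device[mac]} bytes")
```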

    Real-Time Big Data Analytics in Smart Cities from LoRa-Based IoT Networks

    The current burst of Internet of Things (IoT) technologies implies the emergence of new lines of investigation regarding not only hardware and protocols but also new methods for analysing the produced data that satisfy the IoT environment's constraints: a real-time and a big data approach. The real-time restriction stems from the continuous generation of data by the endpoints connected to an IoT network; due to the connection and scaling capabilities of such a network, the amount of data to process is so high that big data techniques become essential. In this article, we present a system consisting of two main modules: on the one hand, the infrastructure, a complete LoRa-based network designed, tested and deployed at the Pablo de Olavide University, and, on the other, the analytics, a big data streaming system that processes the inputs produced by the network to obtain useful, valid and hidden information. Ministerio de Economía y Competitividad TIN2017-88209-C2-1-
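    A minimal sketch of the kind of streaming computation such an analytics module might run, tumbling-window averages per sensor over LoRa uplink messages, is given below; the message format, field names and window size are assumptions for illustration and do not reflect the deployed system.

```python
# Illustrative only: tumbling-window per-sensor averages over a stream of uplink messages.
from collections import defaultdict


def windowed_averages(messages, window_s=60):
    """Yield (window_start, sensor_id, mean_value) for tumbling time windows.

    `messages` is an iterable of dicts such as
    {"sensor_id": "node-07", "timestamp": 1712345678.2, "value": 21.4},
    assumed to arrive roughly ordered by timestamp.
    """
    current_window = None
    sums = defaultdict(float)
    counts = defaultdict(int)

    for msg in messages:
        window = int(msg["timestamp"] // window_s) * window_s
        if current_window is not None and window != current_window:
            for sensor, total in sums.items():
                yield current_window, sensor, total / counts[sensor]
            sums.clear()
            counts.clear()
        current_window = window
        sums[msg["sensor_id"]] += msg["value"]
        counts[msg["sensor_id"]] += 1

    for sensor, total in sums.items():   # flush the final window
        yield current_window, sensor, total / counts[sensor]


demo = [
    {"sensor_id": "node-07", "timestamp": 0.0, "value": 20.0},
    {"sensor_id": "node-07", "timestamp": 30.0, "value": 22.0},
    {"sensor_id": "node-07", "timestamp": 65.0, "value": 25.0},
]
for row in windowed_averages(demo):
    print(row)
```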

    Implementation of Integration VaaMSN and SEMAR for Wide Coverage Air Quality Monitoring

    Current air quality monitoring systems cannot cover a large area, are not real-time, and have not implemented big data analysis technology with high accuracy. The purpose of integrating a Mobile Sensor Network with an Internet of Things system is to build an air quality monitoring system able to monitor over a wide coverage area. This system consists of Vehicle as a Mobile Sensors Network (VaaMSN) as edge computing and Smart Environment Monitoring and Analytic in Real-time (SEMAR) cloud computing. VaaMSN packages an air quality sensor, GPS, a 4G WiFi modem and a single-board computer. SEMAR cloud computing has a time-series database for real-time visualization and a big data environment whose analytics use the Support Vector Machine (SVM) and Decision Tree (DT) algorithms. The outputs from the system are map, table, and graph visualizations. The evaluation of the experimental results shows that the accuracy of both algorithms reaches more than 90%; however, the Mean Square Error (MSE) value of the SVM algorithm is about 0.03076293, while the DT algorithm has an MSE value roughly 10x smaller than that of the SVM algorithm.
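    The reported comparison can be illustrated with a hedged scikit-learn sketch that trains an SVM and a Decision Tree on a toy air-quality dataset and reports both accuracy and MSE on the encoded class labels; the features, labels and model settings are assumptions and the numbers will not match the paper's.

```python
# Illustrative only: SVM vs Decision Tree on toy air-quality classes, reporting accuracy and MSE.
import numpy as np
from sklearn.metrics import accuracy_score, mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(3)

# Toy samples: [PM2.5, PM10, CO] readings; class 0 = good, 1 = moderate, 2 = unhealthy.
X = rng.uniform(low=[0, 0, 0], high=[150, 250, 10], size=(1000, 3))
y = np.digitize(X[:, 0], bins=[35, 75])   # class driven mainly by the PM2.5 reading

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("SVM", SVC(kernel="rbf", C=10.0)),
                    ("Decision Tree", DecisionTreeClassifier(random_state=0))]:
    pred = model.fit(X_train, y_train).predict(X_test)
    print(f"{name}: accuracy={accuracy_score(y_test, pred):.3f}, "
          f"MSE={mean_squared_error(y_test, pred):.4f}")
```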

    New-normal market entry mode for pharmaceuticals: an Internet of Things (IoT) market entry framework stemming from COVID-19

    Purpose: To determine new-normal uncertainty considerations stemming from the COVID-19 pandemic to consider within transaction-cost analysis for pharmaceuticals, and to propose new-normal market entry strategies that address the uncertainty resulting from COVID-19's implications and compensate for the lack of knowledge and information in an uncertain business environment by way of an Internet of Things (IoT) ecosystem for pharmaceutical market entry. Methodology: In this paper, we focus on the uncertainty facet within transaction-cost analysis and utilise a descriptive three-case study approach, taking in Johnson & Johnson (J&J), GlaxoSmithKline (GSK) and Novartis, to present an ADO (Antecedents-Decisions-Outcomes) understanding of their usual market entry approach, the approach undertaken during the pandemic, and the outcomes thereafter, facilitating the new-normal uncertainty considerations to factor in. With this insight, we further develop a conceptual framework addressing the transaction-cost analysis implications of uncertainty, in terms of the lack of knowledge and information, for a new-normal market entry approach and operating strategy for pharmaceuticals, made applicable by the IoT. Findings: Uncertainty (external and internal) is different now in the new-normal business environment for pharmaceuticals and boils down to an acute shortage of the knowledge and information needed to make an appropriately informed decision. Therefore, given these changed factors, pharmaceuticals need to be able to undertake market entry with vaccines and medicines by way of IoT, thereby enabling the gap to be filled via real-time data access and sharing, including enhanced predictive analysis for sustenance. Originality: To our knowledge, this is the first study that throws light on the uncertainty facet of transaction-cost analysis theory for pharmaceuticals. It is also the first study that provides a new-normal market entry strategy for pharmaceutical companies built on the interoperability of real-time IoT.