
    Deep Learning-Based Dynamic Watermarking for Secure Signal Authentication in the Internet of Things

    Securing the Internet of Things (IoT) is a necessary milestone toward expediting the deployment of its applications and services. In particular, the functionality of IoT devices is extremely dependent on the reliability of their message transmission. Cyber attacks such as data injection, eavesdropping, and man-in-the-middle threats can lead to serious security challenges. Securing IoT devices against such attacks requires accounting for their stringent computational power and their need for low-latency operations. In this paper, a novel deep learning method is proposed for dynamic watermarking of IoT signals to detect cyber attacks. The proposed learning framework, based on a long short-term memory (LSTM) structure, enables IoT devices to extract a set of stochastic features from their generated signal and dynamically watermark these features into the signal. This method enables the IoT's cloud center, which collects signals from the IoT devices, to effectively authenticate the reliability of the signals. Furthermore, the proposed method prevents complicated attack scenarios such as eavesdropping, in which the cyber attacker collects the data from the IoT devices and aims to break the watermarking algorithm. Simulation results show that, with an attack detection delay of under 1 second, messages can be transmitted from IoT devices with almost 100% reliability. Comment: 6 pages, 9 figures
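The authentication flow described above can be pictured with a deliberately simplified stand-in: plain statistical features and a keyed HMAC replace the paper's LSTM-extracted stochastic features, and the tag travels alongside the signal rather than being embedded in the waveform as the paper proposes. The pre-shared key and all names are illustrative, not from the paper.

```python
import hashlib
import hmac
import random
import statistics

def extract_features(window):
    # Stand-in for the paper's LSTM feature extractor: simple
    # stochastic features (mean, std dev) of one signal window.
    return (statistics.mean(window), statistics.pstdev(window))

def watermark(window, key):
    # Device side: derive a keyed tag from the extracted features.
    mean, std = extract_features(window)
    msg = f"{mean:.6f}:{std:.6f}".encode()
    tag = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return window, tag

def authenticate(window, tag, key):
    # Cloud side: recompute features from the received signal and
    # verify the keyed tag; a mismatch indicates tampering/injection.
    _, expected = watermark(window, key)
    return hmac.compare_digest(expected, tag)

key = b"shared-device-key"  # hypothetical pre-shared key
signal = [random.gauss(0.0, 1.0) for _ in range(64)]
win, tag = watermark(signal, key)
print(authenticate(win, tag, key))       # authentic window verifies
tampered = win[:]
tampered[0] += 5.0                       # injected sample shifts the features
print(authenticate(tampered, tag, key))  # verification fails
```

Because an eavesdropper without the key cannot reproduce the tag for a forged window, this sketch mirrors the property the paper claims for its learned watermark, albeit with far weaker, hand-picked features.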

    IoT Based Real Time Early Age Concrete Compressive Strength Monitoring

    Concrete strength determination has traditionally been an expensive and laborious job because of the orthodox methodology of measuring concrete strength, in which cylinders are filled with concrete and strength is measured by crushing the concrete (compression test). A significant amount of waste is generated when this test is performed multiple times during the execution of a project. The present study proposes a new IoT-based framework comprising a low-cost sensor and a Windows dashboard to estimate and monitor real-time early-age concrete strength. This system will significantly help the construction industry avoid onsite laboratory testing of concrete for strength. In this study, a temperature sensor, along with an ESP32 microcontroller, is used to acquire and transmit the recorded temperature in real time to a cloud database. The Windows application developed loads data from the cloud database and presents it as figures and graphs relating concrete strength to time. The strength calculated using the developed sensor was compared with the actual strength determined using a compression test for the same mix design, and the two showed a significant match. The project is a contribution toward the non-destructive testing of concrete. By knowing the concrete strength of any structural member in advance, practitioners can make decisions well before time to avoid delays in the project
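The abstract does not name its temperature-to-strength model, but the standard route from periodic temperature readings to early-age strength is the Nurse-Saul maturity method. A minimal sketch, assuming hourly readings and hypothetical calibration constants (in practice the constants come from lab compression tests on the same mix design):

```python
import math

def maturity_index(temps_c, dt_hours, datum_c=-10.0):
    """Nurse-Saul maturity index (degC * hours) from periodic
    temperature readings, e.g. those streamed by the sensor node.
    Temperatures below the datum contribute nothing."""
    return sum(max(t - datum_c, 0.0) * dt_hours for t in temps_c)

def strength_mpa(maturity, a=-20.0, b=8.0):
    # Logarithmic maturity-strength relation S = a + b * ln(M),
    # with hypothetical constants a, b for one mix design.
    return max(a + b * math.log(maturity), 0.0)

# Hourly readings over 4 days of curing (illustrative values)
readings = [22.0, 24.5, 26.0, 25.0] * 24
m = maturity_index(readings, dt_hours=1.0)
print(f"maturity = {m:.0f} degC-h, est. strength = {strength_mpa(m):.1f} MPa")
```

The cloud dashboard described in the abstract would evaluate exactly this kind of running sum as each new reading arrives, which is why a single temperature sensor suffices to track strength gain without crushing cylinders.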

    International conference on software engineering and knowledge engineering: Session chair

    The Thirtieth International Conference on Software Engineering and Knowledge Engineering (SEKE 2018) will be held at the Hotel Pullman, San Francisco Bay, USA, from July 1 to July 3, 2018. SEKE 2018 will also be dedicated to the memory of Professor Lotfi Zadeh, a great scholar, pioneer, and leader in fuzzy set theory and soft computing. The conference aims at bringing together experts in software engineering and knowledge engineering to discuss relevant results in either software engineering or knowledge engineering, or both. Special emphasis will be put on the transfer of methods between the two domains. The theme this year is soft computing in software engineering & knowledge engineering. Submissions of papers and demos are both welcome

    Blockchain Enabled Reparations in Smart Buildings Cyber Physical System

    Blockchain technology is evolving across the globe and is widely regarded as a definite part of the future. Blockchain is often associated with Bitcoin and the finance domain, but over the last decade this backend technology to Bitcoin has spread into almost every domain we can think of. Further to this, smart contracts are making the blockchain ecosystem better. Other evolving technologies, such as the Internet of Things, the Industrial Internet of Things, and cyber-physical systems, are also making their onset on the global platform. Smart buildings link Internet of Things connectivity, sensors, and the cloud to remotely supervise and assure efficient heating, air conditioning, lighting, and security systems, among others, to improve efficiency and overall sustainability. The global buildings sector is expected to add 230 billion square meters of fresh construction over the next 40 years, i.e., the equivalent of adding a Paris every week. Integrating these technologies right at the onset, before they grow in isolation, is thus a pressing need today. This paper proposes a prototype to simulate the architecture and discusses how blockchain-enabled smart buildings can further expedite automation, security, and transparency. For illustration purposes, the paper focuses on smart-contract-enabled repairs and service in smart buildings

    Expedite requests in Raytheon's North Texas supply chain

    Thesis (M.B.A.)--Massachusetts Institute of Technology, Sloan School of Management; and (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science; in conjunction with the Leaders for Manufacturing Program at MIT, 2006. Includes bibliographical references (p. 69-70). In December 2004, a manager at Raytheon Company articulated, in the form of an LFM (Leaders for Manufacturing) internship proposal, his belief that someone should do something about the amounts of time and money that Raytheon's North Texas plants spent handling expedite requests: requests that someone provide goods or services more quickly than normal. This thesis attempts to summarize the thoughts, learnings, initiatives, and outcomes associated with the ensuing effort. In particular, a large section of the paper is devoted to a case study of the most involved initiative: the devising and implementing of a new dispatching method in one small but central operation in an organization with a long history of processing things first in, first out. While for the project team the compelling factor was achieving a specific dollar impact, the reader of this paper will probably be more interested in the methodology than in Raytheon's ROI. Research for this thesis was conducted during a six-month internship with Raytheon Company's Space and Airborne Systems Supply Chain Management group in McKinney, TX, and Dallas, TX. The internship was affiliated with the Massachusetts Institute of Technology's Leaders for Manufacturing (LFM) Program. By Scott K. Hiroshige. S.M., M.B.A.
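The abstract leaves the new dispatching rule unspecified. As a generic illustration of moving off pure FIFO (not the thesis's actual method), a two-level priority queue lets expedite requests jump ahead while preserving first-in, first-out order within each class:

```python
import heapq
import itertools

class ExpediteQueue:
    """Illustrative dispatcher: routine jobs run first-in, first-out,
    but jobs flagged as expedited jump ahead of all routine work."""

    def __init__(self):
        self._heap = []
        self._seq = itertools.count()  # tiebreaker keeps FIFO within a class

    def put(self, job, expedite=False):
        priority = 0 if expedite else 1
        heapq.heappush(self._heap, (priority, next(self._seq), job))

    def get(self):
        return heapq.heappop(self._heap)[2]

q = ExpediteQueue()
q.put("routine-1")
q.put("routine-2")
q.put("hot-order", expedite=True)
print([q.get() for _ in range(3)])  # → ['hot-order', 'routine-1', 'routine-2']
```

The sequence counter is the design point: without it, two jobs at the same priority would compare on their payloads, and FIFO order within a class would be lost.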

    Underpinning Quality Assurance: Identifying Core Testing Strategies for Multiple Layers of Internet-of-Things-Based Applications

    The Internet of Things (IoT) constitutes a digitally integrated network of intelligent devices equipped with sensors, software, and communication capabilities, facilitating data exchange among a multitude of digital systems via the Internet. Despite the pivotal role of testing in the software development life cycle (SDLC) for ensuring software quality in both functional and non-functional aspects, testing within this intricate software–hardware ecosystem has been somewhat overlooked. To address this, various testing techniques are applied for real-time minimization of failure rates in IoT applications. However, the execution of a comprehensive test suite for specific IoT software remains a complex undertaking. This paper proposes a holistic framework aimed at aiding quality assurance engineers in delineating essential testing methods across the different testing levels within the IoT. This delineation is crucial for effective quality assurance, ultimately reducing failure rates in real-time scenarios. Furthermore, the paper offers a mapping of these identified tests to each layer within the layered framework of the IoT. This comprehensive approach seeks to enhance the reliability and performance of IoT-based applications

    A Model of Factors Influencing the Implementation of Artificial Intelligence in Crisis Management: A Case Study of National Crisis and Emergency Management Authority (NCEMA)

    This paper outlines the development of a structural equation model focusing on factors influencing the implementation of AI in crisis management within the UAE National Crisis and Emergency Management Authority. The literature has identified 28 factors, categorized into seven domains, that influence the implementation of AI in crisis management for the model. The model was constructed and evaluated using SmartPLS software, at both its measurement and structural components. The results revealed that the model met all evaluation criteria at the measurement component. At the structural component, the relationship between 'CoV' and 'CrM' was statistically significant (T-statistic = 2.633, P-value = 0.009), indicating a robust connection. However, the links between 'ReF' and 'CrM' and between 'LSM' and 'CrM' were not statistically significant (P-values = 0.999 and 0.949, respectively), suggesting limited impact on 'CrM'. The relationships of 'RoB', 'IoT', 'DeL', and 'NLP' with 'CrM' showed moderate evidence but lacked statistical significance, possibly due to data limitations. Furthermore, the model demonstrated a strong fit, with an R-squared (R²) value of 0.761: the seven independent variables explain approximately 76.1% of the variance in 'CrM'. Lastly, for predictive relevance, 'CrM' as a dependent construct displayed a Q² value of 0.608, indicating that around 60.8% of the variation in 'CrM' is explained by the model beyond random chance, confirming its strong predictive value
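The reported R² of 0.761 is the ordinary coefficient of determination, R² = 1 - SS_res / SS_tot. A minimal computation with made-up observed and model-predicted values (not data from the study):

```python
def r_squared(y, y_hat):
    """Coefficient of determination: share of variance in y
    explained by the predictions y_hat."""
    mean_y = sum(y) / len(y)
    ss_res = sum((obs - pred) ** 2 for obs, pred in zip(y, y_hat))
    ss_tot = sum((obs - mean_y) ** 2 for obs in y)
    return 1.0 - ss_res / ss_tot

observed  = [3.0, 5.0, 7.0, 9.0]   # illustrative outcome values
predicted = [2.8, 5.1, 7.2, 8.9]   # illustrative model predictions
print(round(r_squared(observed, predicted), 3))  # → 0.995
```

SmartPLS reports this same quantity per endogenous construct; the Q² statistic additionally penalizes the model using blindfolded (held-out) predictions, which is why it is the smaller of the two figures in the abstract.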

    Advancing Chronic Respiratory Disease Care with Real-Time Vital Sign Prediction

    Cardiovascular and chronic respiratory diseases, being pervasive in nature, pose formidable challenges to the well-being of the global populace. With an alarming annual mortality of approximately 19 million individuals worldwide, these diseases have emerged as significant public health concerns warranting immediate attention and comprehensive understanding. This elevated mortality rate can be mitigated through cutting-edge technological innovations in medical science that enable continuous monitoring of physiological indicators, including but not limited to blood pressure, cholesterol levels, and blood glucose concentrations. Forecasts of these vital sign parameters not only facilitate prompt intervention by medical professionals and carers, but also empower patients to manage their health status through pertinent periodic notifications and guidance from healthcare practitioners. In this research, we present a novel framework that leverages machine learning algorithms to forecast and categorise forthcoming values of physiological indicators relevant to cardiovascular and chronic respiratory ailments. Drawing upon these predictions, the envisaged framework can categorise the health condition of individuals, thereby alerting both caretakers and medical professionals. In the present study, a machine-learning-driven prediction and classification framework has been employed on a genuine dataset of vital signs. To anticipate the forthcoming 1-3 minutes of vital sign values, a series of regression techniques, namely linear regression and polynomial regression of degrees 2, 3, and 4, have been rigorously examined and evaluated.
In the caregiving context, a concise 60-second prognostication enables the expeditious provision of emergency medical aid, while a more comprehensive 3-minute prognostication of vital signs serves the same purpose over a longer horizon. The patient's overall health is then evaluated on the basis of the anticipated vital sign values using three machine learning classifiers, namely Support Vector Machine (SVM), Decision Tree, and Random Forest. The findings of our study indicate that the Decision Tree algorithm achieves high accuracy in categorising a patient's health status by leveraging anomalous vital sign values. This approach demonstrates its potential to facilitate prompt and effective medical interventions, thereby enhancing the overall quality of care provided to patients
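The forecast-then-classify idea can be sketched in miniature. This is not the paper's trained pipeline: ordinary least-squares linear regression (the simplest of the regression techniques the abstract lists) extrapolates a vital sign a few minutes ahead, and a hypothetical threshold rule stands in for the SVM / Decision Tree / Random Forest classifiers:

```python
def linear_forecast(series, steps_ahead):
    """Fit y = a + b*t by ordinary least squares over the history
    and extrapolate steps_ahead samples past the last reading."""
    n = len(series)
    ts = list(range(n))
    mean_t = sum(ts) / n
    mean_y = sum(series) / n
    b = sum((t - mean_t) * (y - mean_y) for t, y in zip(ts, series)) \
        / sum((t - mean_t) ** 2 for t in ts)
    a = mean_y - b * mean_t
    return a + b * (n - 1 + steps_ahead)

def classify(hr):
    # Hypothetical alert thresholds for heart rate (bpm); the paper
    # instead trains SVM / Decision Tree / Random Forest classifiers
    # on the predicted vital sign values.
    return "alert" if hr < 50 or hr > 120 else "normal"

history = [72, 74, 75, 77, 79, 80]                 # one reading per minute
next_hr = linear_forecast(history, steps_ahead=3)  # the 3-minute horizon
print(round(next_hr, 1), classify(next_hr))
```

The point of forecasting before classifying, as the abstract argues, is lead time: an alert fires on where the vital sign is heading, not only on where it already is.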

    Implementation and evaluation of semantic clustering techniques for Fog nodes

    Growing at an extremely rapid rate, Internet of Things (IoT) devices are becoming a crucial part of our everyday lives. They are embedded in almost everything we do on a daily basis: from simple sensors, cell phones, and wearable devices to smart city technologies, we are becoming heavily dependent on such devices. At this current state, the Cloud paradigm is being flooded by massive amounts of data continuously. The current amounts of data are minimal compared to the amounts that we are about to witness in the near future, mainly because of the expediting 5G deployment and the increase in network intelligence. This increased data could lead to more network congestion and higher latency, due to the physical distance between the devices and the Cloud data centers. Therefore, the need for a new model is paramount, and such a model will be essential in realizing the Internet of Everything (IoE) and the next stage in the digital evolution. Fog computing is one of the promising paradigms, since it extends the Cloud with intelligent computing units placed closer to where the data is being generated, to offload the Cloud. This tackles the issues of latency, mobility, and network congestion. In this work we present a conceptual Fog computing ecosystem, where we model the Cloud-to-Fog (C2F) environment. We then implement two dynamic clustering techniques for Fog nodes to utilize combined resources, using a semantic description of the Fog nodes' resources and the properties of the edge devices. Finally, we optimize the assignment of applications over Fog cluster resources using linear programming and a first-fit heuristic algorithm. We evaluate our implementation by analyzing the differences between the two clustering techniques. We perform several experiments, and the results prove that the heuristic optimization of task allocation is much faster and more consistent than the linear programming solver, as expected. Moreover, the results show that clustering Fog nodes is beneficial in offloading the Cloud and reducing response times
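The first-fit heuristic mentioned above is the classic bin-packing strategy: scan the clusters in a fixed order and place each task on the first one with enough spare capacity. A minimal sketch with hypothetical cluster capacities and a Cloud fallback for tasks no cluster can host:

```python
def first_fit(tasks, clusters):
    """First-fit heuristic: place each task on the first Fog cluster
    with enough remaining capacity; unplaced tasks fall back to the Cloud."""
    remaining = dict(clusters)           # cluster name -> free capacity
    placement = {}
    for name, demand in tasks:
        for cluster, free in remaining.items():
            if demand <= free:
                remaining[cluster] -= demand
                placement[name] = cluster
                break
        else:
            placement[name] = "cloud"    # offload when no cluster fits
    return placement

clusters = {"fog-A": 4.0, "fog-B": 2.0}  # hypothetical CPU capacities
tasks = [("cam-feed", 3.0), ("sensor-agg", 1.5), ("analytics", 2.5)]
print(first_fit(tasks, clusters))
# → {'cam-feed': 'fog-A', 'sensor-agg': 'fog-B', 'analytics': 'cloud'}
```

A single greedy pass like this runs in O(tasks x clusters) with no solver in the loop, which is consistent with the thesis's finding that the heuristic is much faster and more consistent than the linear programming formulation, at the cost of possibly suboptimal packings.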