16 research outputs found

    Secure Smart Wearable Computing through Artificial Intelligence-Enabled Internet of Things and Cyber-Physical Systems for Health Monitoring

    Get PDF
    The functionality of the Internet is continually changing from the Internet of Computers (IoC) to the Internet of Things (IoT). Most connected systems, called Cyber-Physical Systems (CPS), are formed by integrating numerous elements such as humans, the physical environment, smart objects, embedded devices, and infrastructure. A few critical problems, such as security risks and ethical issues, could affect the IoT and CPS. When every piece of data and every device is connected and reachable over the network, hackers can obtain the data and exploit it for various scams. In a medical healthcare IoT-CPS, a patient's everyday medical and physical data may be gathered through wearable sensors. This paper proposes an AI-enabled IoT-CPS that doctors can use to discover diseases in patients. The AI component was designed to detect several disorders, such as diabetes, heart disease, and gait disturbances, each of which presents different symptoms across patients and the elderly. A dataset retrieved from the Kaggle repository is used to evaluate the AI-enabled IoT-CPS technology. For classification, the AI-enabled IoT-CPS algorithm is used to detect diseases. The experimental results demonstrate that, compared with existing algorithms, the proposed AI-enabled IoT-CPS algorithm detects patient diseases and fall events in the elderly more efficiently in terms of accuracy, precision, recall, and F-measure.
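    The four evaluation metrics named in the abstract can be computed directly from a confusion matrix. The sketch below shows that calculation for a binary disease label; the prediction vectors are illustrative, not the paper's actual results.

```python
# Sketch: accuracy, precision, recall, and F-measure for a binary label.
def binary_metrics(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f_measure = (2 * precision * recall / (precision + recall)
                 if precision + recall else 0.0)
    return accuracy, precision, recall, f_measure

# Illustrative labels: 1 = disease present, 0 = absent.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
acc, prec, rec, f1 = binary_metrics(y_true, y_pred)
```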

    A metaheuristic optimization approach for energy efficiency in the IoT networks

    Get PDF
    © 2020 John Wiley & Sons, Ltd. Recently, the Internet of Things (IoT) has been used in several fields such as smart cities, agriculture, weather forecasting, smart grids, and waste management. Even though IoT has huge potential in several applications, there is room for improvement. In the current work, we have concentrated on minimizing the energy consumption of sensors in the IoT network, which leads to an increase in the network lifetime. To optimize energy consumption, the most appropriate Cluster Head (CH) is chosen in the IoT network. The proposed work makes use of a hybrid metaheuristic algorithm, namely the Whale Optimization Algorithm (WOA) combined with Simulated Annealing (SA). To select the optimal CH in the clusters of the IoT network, several performance metrics, such as the number of alive nodes, load, temperature, residual energy, and cost function, have been used. The proposed approach is then compared with several state-of-the-art optimization algorithms, such as the Artificial Bee Colony algorithm, the Genetic Algorithm, the Adaptive Gravitational Search algorithm, and WOA alone. The results prove the superiority of the proposed hybrid approach over existing approaches.
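    The core idea of such a WOA + SA hybrid is to explore candidate cluster heads and apply a simulated-annealing acceptance test, so worse candidates are occasionally kept while the temperature is high. The following is a minimal toy sketch, not the paper's algorithm: the node attributes, cost weights, and cooling schedule are all illustrative assumptions.

```python
import math
import random

random.seed(42)

# Hypothetical node attributes standing in for the paper's CH-selection
# metrics (residual energy, load, etc.).
nodes = [{"energy": random.random(), "load": random.random()} for _ in range(20)]

def cost(i):
    # Lower cost = better cluster-head candidate: prefer high residual
    # energy and low load (the 0.5 weight is illustrative).
    return (1.0 - nodes[i]["energy"]) + 0.5 * nodes[i]["load"]

current = random.randrange(len(nodes))
best, temp = current, 1.0
for _ in range(200):
    candidate = random.randrange(len(nodes))   # exploration step
    delta = cost(candidate) - cost(current)
    # SA acceptance: always take improvements, sometimes accept worse
    # candidates while the temperature is still high.
    if delta < 0 or random.random() < math.exp(-delta / temp):
        current = candidate
    if cost(current) < cost(best):
        best = current
    temp *= 0.95  # cooling schedule

# 'best' is the index of the selected cluster head.
```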

    SCBC: Smart city monitoring with blockchain using Internet of Things and neuro-fuzzy procedures.

    Get PDF
    The security of the Internet of Things (IoT) is crucial in various application platforms, such as the smart city monitoring system, which encompasses comprehensive monitoring of various conditions. Therefore, this study analyses the use of blockchain technology for monitoring IoT systems, employing parametric objective functions. In the IoT context, it is imperative to establish well-defined intervals for job execution, ensuring that the completion status of each action is promptly monitored and assessed. The major significance of the proposed method is the integration of a blockchain technique with a neuro-fuzzy algorithm, thereby improving the security of the data processing units in all smart city applications. Because the entire process is carried out over the IoT, data in both the processing and storage units would otherwise not be secured; the proposed integration therefore maximizes the confidence level of the monitoring units at each state. Owing to this integration, the proposed system model is implemented with minimal energy consumption, with 93% of tasks completed and security improved by about 90%.
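    The tamper-evidence that blockchain brings to such monitoring records comes from hash chaining: each record stores the hash of its predecessor, so altering any record breaks verification. A minimal sketch follows; the record fields ("task_id", "status") are illustrative assumptions, not the paper's schema.

```python
import hashlib
import json

def make_block(prev_hash, payload):
    # Hash covers both the payload and the link to the previous block.
    body = json.dumps({"prev": prev_hash, "data": payload}, sort_keys=True)
    return {"prev": prev_hash, "data": payload,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

# Build a tiny chain of monitoring records.
chain = [make_block("0" * 64, {"task_id": 0, "status": "genesis"})]
for i in range(1, 4):
    chain.append(make_block(chain[-1]["hash"],
                            {"task_id": i, "status": "completed"}))

def verify(chain):
    # Recompute each hash and check the linkage to detect tampering.
    for i, block in enumerate(chain):
        body = json.dumps({"prev": block["prev"], "data": block["data"]},
                          sort_keys=True)
        if hashlib.sha256(body.encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True
```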

    A review on classification of imbalanced data for wireless sensor networks

    Get PDF
    © The Author(s) 2020. The classification of imbalanced data has been a widely explored issue over the last two decades, and it remains important because data are essential today, and classification becomes crucial when data are distributed across several classes. The term imbalance refers to an uneven distribution of data across classes, which severely affects the performance of traditional classifiers; that is, classifiers become biased toward the class with the larger amount of data. The data generated from wireless sensor networks exhibit several such imbalances. This review article provides a thorough analysis of the imbalance issue for wireless sensor networks and other application domains, which will help the community understand the WHAT, WHY, and WHEN of imbalance in data and its remedies.
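    One of the common remedies such reviews survey is random oversampling: duplicating minority-class samples until the classes are balanced, so the classifier is no longer starved of minority examples. A minimal sketch, with made-up sensor readings at a 95:5 imbalance:

```python
import random

random.seed(0)

# Illustrative imbalanced dataset: label 1 (rare event) vs label 0.
data = [("reading_%d" % i, 0) for i in range(95)] + \
       [("event_%d" % i, 1) for i in range(5)]

majority = [s for s in data if s[1] == 0]
minority = [s for s in data if s[1] == 1]

# Duplicate random minority samples until both classes have equal size.
oversampled = minority + [random.choice(minority)
                          for _ in range(len(majority) - len(minority))]
balanced = majority + oversampled
```

A design note: oversampling risks overfitting to the duplicated samples, which is why the literature also covers undersampling, synthetic generation (e.g. SMOTE), and cost-sensitive classifiers.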

    COVID-19 patient health prediction using boosted random forest algorithm

    Get PDF
    © 2020 Iwendi, Bashir, Peshkar, Sujatha, Chatterjee, Pasupuleti, Mishra, Pillai and Jo. Integration of artificial intelligence (AI) techniques in wireless infrastructure and the real-time collection and processing of end-user device data is now in high demand, and AI is well suited to detecting and predicting pandemics of a colossal nature. The Coronavirus disease 2019 (COVID-19) pandemic, which originated in Wuhan, China, has had disastrous effects on the global community and has overburdened advanced healthcare systems throughout the world. Globally, over 4,063,525 confirmed cases and 282,244 deaths had been recorded as of 11th May 2020, according to the European Centre for Disease Prevention and Control. The rapid and exponential rise in the number of patients has necessitated efficient and quick prediction of an infected patient's possible outcome for appropriate treatment using AI techniques. This paper proposes a fine-tuned Random Forest model boosted by the AdaBoost algorithm. The model uses the COVID-19 patient's geographical, travel, health, and demographic data to predict the severity of the case and the possible outcome: recovery or death. The model has an accuracy of 94% and an F1 score of 0.86 on the dataset used. The data analysis reveals a positive correlation between patients' gender and deaths, and also indicates that the majority of patients are aged between 20 and 70 years.
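    The mechanism AdaBoost uses to "boost" a base learner such as a Random Forest is sample re-weighting: after each round, misclassified samples gain weight so the next learner focuses on them. The sketch below shows one such round from scratch; the labels and predictions are illustrative, not the paper's data.

```python
import math

def adaboost_round(weights, y_true, y_pred):
    # Weighted error of the current base learner.
    err = sum(w for w, t, p in zip(weights, y_true, y_pred) if t != p)
    err = max(min(err, 1 - 1e-10), 1e-10)       # numeric guard
    alpha = 0.5 * math.log((1 - err) / err)     # learner's vote weight
    # Down-weight correct samples, up-weight mistakes, then renormalize.
    new = [w * math.exp(-alpha if t == p else alpha)
           for w, t, p in zip(weights, y_true, y_pred)]
    total = sum(new)
    return alpha, [w / total for w in new]

# One illustrative round: sample 1 is misclassified.
weights = [0.25] * 4
alpha, weights = adaboost_round(weights, [1, 1, -1, -1], [1, -1, -1, -1])
```

After the update, the single misclassified sample carries half the total weight, which is what pushes the next learner toward the hard cases.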

    Carbonic acid gas emission rating by vehicles using data science techniques

    No full text
    One factor contributing to the warming of the upper atmosphere is the release of man-made pollutants into the ecosystem (biogas, carbon dioxide, nitrous oxide, and so on). Approximately 14% of total worldwide carbon dioxide emissions are attributed to road transport. Vehicle exhaust is dangerous to humans and contains greenhouse gases that drive climate change. Combustion products of petrol and diesel fuels include NO2, CO, hydrocarbons, C6H6, and CH2O; vehicles also emit CO2, the most common human-caused greenhouse gas. Emission targets have been set to dramatically reduce road transport's contribution to carbon dioxide emissions. These are inferred from the global climate conference's goal of keeping peak planetary warming to a maximum of 2 degrees Celsius through 2100. To this end, this study developed a hybrid machine learning algorithm that combines several classification algorithms to estimate vehicle CO2 emissions with a high accuracy rate. The results show that hybrid models can achieve higher accuracy with a lower error rate when developing an application for emission rating. Accurate carbon emission prediction models can aid in the development of emission-reduction policies.
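    One common way to combine several classification algorithms into a hybrid, as the abstract describes, is majority voting over their predictions. The sketch below shows only that combination step; the three "models" are hard-coded stand-ins for trained classifiers, and the emission bands are illustrative.

```python
from collections import Counter

def majority_vote(predictions_per_model):
    # For each sample, take the most common label across base models.
    combined = []
    for votes in zip(*predictions_per_model):
        combined.append(Counter(votes).most_common(1)[0][0])
    return combined

# Illustrative per-sample emission-band predictions from three base models.
model_a = ["low", "high", "medium", "high"]
model_b = ["low", "medium", "medium", "high"]
model_c = ["medium", "high", "medium", "low"]
print(majority_vote([model_a, model_b, model_c]))
# → ['low', 'high', 'medium', 'high']
```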

    Internet of Vehicles-based application using deep learning approach

    No full text
    Edge computing is a promising technology that can extend the necessary support for vehicular applications. In this paper, an effective edge-computing framework is developed to improve task scheduling. A task partition and scheduling algorithm is developed to decide the workload allocation and to schedule the execution order of offloaded tasks. Then, according to the characteristics of task scheduling, the corresponding state-action space and reward function are designed. Finally, taking into consideration the complexity of task scheduling and computing resource allocation, a pointer network is trained by multi-agent fuzzy deep reinforcement learning, which allows the pointer network to account for the dynamic nature of the network; during network fusion, it is used to solve the weight-distribution problem for each agent. The simulation shows that the proposed method is superior; it has significant advantages in terms of convergence speed and optimal performance, and a high degree of flexibility in the ever-changing and intricate electromagnetic environment. The capabilities of the Internet of Vehicles' job-offloading system are significantly increased because of this improvement. It is widely believed that the Internet of Vehicles (IoV), which incorporates cutting-edge technologies such as connectivity, big data, and artificial intelligence, will play a significant role in the development of the next-generation intelligent transportation system. In recent years, the IoV has given rise to a significant number of novel computing tasks, such as augmented reality and autonomous driving. These tasks must adhere to stringent real-time constraints, and it takes a significant amount of computing resources to bring them to a successful conclusion.
Since volume, weight, and other limitations prevent vehicles from being outfitted with powerful computing devices, the computing resources of the onboard devices now in use are often insufficient to fulfill the processing requirements of these tasks. One remedy is to install edge servers in the immediate vicinity of the vehicle. Edge computing, in contrast to cloud computing, can provide consumers with computing services located relatively near them. Instead of being sent to the cloud, the computing tasks generated by the vehicle are immediately offloaded to the edge server, which reduces transmission time. As a result, the implementation of edge computing in IoVs is a potential solution to the problem of inadequate vehicle processing power and a means of satisfying the low-latency requirements imposed by tasks. Offloading computational tasks may, in general, effectively lower the amount of energy the vehicle requires to operate, so consumers are often more inclined to offload tasks in the interest of keeping the vehicle's battery alive for as long as possible. The number of tasks that must be offloaded and executed within the Internet of Vehicles will therefore continue to grow. When a significant number of tasks are offloaded, the server cannot provide computing resources for all of them at the same time, so tasks not yet allocated computing resources must wait to be executed. This waiting time cannot be disregarded when the queued computing tasks have delay requirements. Therefore, to effectively offer computing services for a greater number of offloaded tasks, it is important to establish an acceptable scheduling strategy according to the execution-time and delay needs of computing tasks.
This paper integrates software-defined networking (SDN) into the Internet of Vehicles, constructs an SDN-assisted computing-task offloading system for the Internet of Vehicles in an edge-computing environment, and presents a computing-task offloading and scheduling model for vehicles; SDN is chosen because it can manage network resources more conveniently and effectively. An improved pointer network is then trained using deep reinforcement learning to solve the offload-scheduling problem of delay-constrained computing tasks across multiple edge servers on the Internet of Vehicles, in consideration of the complexity of task scheduling and the allocation of computing resources.
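    To make the delay-constrained scheduling problem concrete, the toy sketch below schedules offloaded tasks on a single edge server in earliest-deadline-first order and counts a task as met only if it finishes before its deadline. This is a greatly simplified stand-in for the paper's DRL-trained pointer network; the task names, execution times, and deadlines are illustrative.

```python
# Illustrative offloaded tasks: (task_id, execution_time, deadline).
tasks = [
    ("nav", 2, 9),
    ("ar_overlay", 3, 4),
    ("sensor_fusion", 1, 3),
    ("video_analytics", 4, 15),
]

# Earliest-deadline-first: a classic heuristic for delay-constrained queues.
schedule = sorted(tasks, key=lambda t: t[2])

clock, met = 0, []
for task_id, exec_time, deadline in schedule:
    clock += exec_time                 # task runs to completion
    if clock <= deadline:              # deadline satisfied?
        met.append(task_id)
```

Under this ordering all four illustrative tasks meet their deadlines; a learned scheduler becomes attractive precisely when multiple servers, arrival dynamics, and resource contention make such a static heuristic insufficient.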

    Junk mail content detection using logistic regression algorithm

    No full text
    In contemporary times, communication has moved away from traditional methods to sophisticated channels such as social media. One of the most common ways information is disseminated is through email. Email is very effective, easy, and inexpensive for the sender but invariably costly to the recipient, owing to the tons of unwarranted messages received daily. This paper focuses on developing an effective junk mail content detector that analyses the content of messages and classifies them properly, thereby eliminating spurious emails. Logistic regression and Random Forest algorithms were employed, and the results showed that our logistic regression model achieved superior performance.
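    Logistic regression classifies a message by passing a weighted sum of its features through a sigmoid and thresholding at 0.5. The from-scratch sketch below trains on bag-of-words counts; the vocabulary, messages, and labels are made up for illustration and have no connection to the paper's dataset.

```python
import math

# Hypothetical vocabulary and training data: counts of each vocab word
# per message, label 1 = junk, 0 = legitimate.
vocab = ["free", "winner", "meeting", "report"]
train = [
    ([2, 1, 0, 0], 1), ([1, 2, 0, 0], 1), ([3, 0, 0, 1], 1),
    ([0, 0, 2, 1], 0), ([0, 0, 1, 2], 0), ([1, 0, 2, 2], 0),
]

w = [0.0] * len(vocab)
b = 0.0
rate = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Stochastic gradient descent on the log-loss.
for _ in range(500):
    for x, y in train:
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
        g = p - y                                  # gradient of log-loss
        w = [wi - rate * g * xi for wi, xi in zip(w, x)]
        b -= rate * g

def predict(x):
    return int(sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) >= 0.5)
```

In practice the feature extraction (e.g. TF-IDF) and a library implementation with regularization would replace this hand-rolled loop.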

    Effective house price prediction using machine learning

    No full text
    In recent times, there has been a surge in the housing business, such that the prediction of house prices is of utmost importance both for the seller and for the potential buyer, and it is influenced by several key indices. Many approaches have been used to tackle the issue of predicting house prices, helping house owners and real estate agents maximise their profit while prospective buyers make better-informed decisions. This study focuses on building an effective model for the prediction of house prices. Since price is a continuous variable, regression models were appropriate. Several regression models were employed: linear regression, Random Forest (RF) regressor, Extreme Gradient Boosting (XGBoost) regressor, Support Vector Machine (SVM) regressor, and K-Nearest Neighbours (KNN). The results showed that the Random Forest regressor achieved superior performance, with an R2 score of 99.97%, while the SVM regressor performed poorly, with an R2 score of −4.11%. This establishes the Random Forest regressor as an effective machine learning model for predicting house prices.
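    The R2 score used above to compare the regressors measures how much of the price variance a model explains, and it can go negative when the model is worse than simply predicting the mean (as with the SVM result). A from-scratch sketch, with illustrative prices and predictions rather than the study's data:

```python
def r2_score(y_true, y_pred):
    # R2 = 1 - SS_res / SS_tot; negative when worse than the mean model.
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Illustrative true prices and two hypothetical models' predictions.
prices = [200_000, 250_000, 300_000, 350_000]
good_model = [205_000, 245_000, 310_000, 340_000]
bad_model = [350_000, 200_000, 350_000, 200_000]
```

Here the close predictions score near 1, while the scrambled ones score well below 0, mirroring the gap between the Random Forest and SVM results reported above.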