International Journal on Recent and Innovation Trends in Computing and Communication
    7998 research outputs found


    Addressing Persistent Storage Challenges in Kubernetes Environments

    The widespread adoption of Kubernetes as a cornerstone for container orchestration highlights the platform's robust capabilities in managing complex, distributed applications. However, an area that consistently presents challenges is the integration of persistent storage solutions. This review paper examines the persistent storage challenges in Kubernetes environments, focusing on the key issues of volume management, data durability, and scalability across diverse infrastructures. Through an extensive literature review, the paper aggregates findings from various studies, revealing common obstacles and innovative solutions in this field. It analyzes how Kubernetes handles stateful applications via Persistent Volumes (PVs), Persistent Volume Claims (PVCs), and StorageClasses, and evaluates the effectiveness of these mechanisms in real-world scenarios. Moreover, the paper delves into the advancements in Container Storage Interface (CSI) drivers and their role in enhancing storage flexibility and operational ease. By synthesizing current research, this review not only outlines the persistent issues but also highlights emerging trends and potential future directions in Kubernetes storage management. It serves as a critical resource for developers, IT professionals, and organizations aiming to leverage Kubernetes for applications requiring reliable and scalable storage solutions.
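
    As a rough illustration of the PV/PVC/StorageClass mechanism the abstract refers to, the sketch below requests storage through the official Kubernetes Python client. The claim name "data-pvc", the class name "fast-ssd", and the 10Gi size are illustrative placeholders, not values from the paper.

```python
# Minimal sketch: requesting persistent storage via a PVC bound to a
# StorageClass (typically provisioned by a CSI driver), using the official
# Kubernetes Python client. Names and sizes are illustrative only.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="data-pvc"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="fast-ssd",  # class backed by a CSI provisioner
        resources=client.V1ResourceRequirements(requests={"storage": "10Gi"}),
    ),
)

client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=pvc
)
```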

    Bidirectional Braille Transcription for Kannada and Telugu text using Natural Language Processing

    In today's modern society, where information is readily accessible through various sources such as the internet and newspapers, individuals with visual impairments encounter significant challenges in accessing this wealth of knowledge. Unlike their sighted counterparts who effortlessly stay informed about current events and knowledge, visually impaired individuals face obstacles in harnessing this information. To address this disparity, there is an urgent need to develop a system that enables the bidirectional conversion of natural language text into Braille, thereby offering enhanced learning opportunities for the visually impaired. This paper presents a pioneering approach to bidirectional Braille transcription for Kannada and Telugu texts, employing advanced Natural Language Processing (NLP) techniques exclusively on text-based data. Given the essential role of Braille transcription in enabling visually impaired individuals to access text, the complexity of Indian scripts like Kannada and Telugu poses unique challenges. Our proposed system utilizes state-of-the-art NLP algorithms to facilitate accurate and efficient translation between printed text and Braille. The methodology encompasses tailored preprocessing steps addressing the intricate orthographic structures of Kannada and Telugu, alongside a robust transliteration engine for converting text to Braille, and an inverse transcription mechanism to revert Braille back to standard text. Through comprehensive testing on diverse text samples, the system demonstrates high accuracy and reliability. This research significantly enhances accessibility for visually impaired Kannada and Telugu speakers and sets a precedent for the application of advanced NLP techniques in regional language Braille transcription.
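
    A minimal sketch of the bidirectional idea (transliteration engine plus inverse transcription) is shown below as a simple character map. The two Kannada entries are purely illustrative; the paper's actual mapping covers the full Kannada and Telugu orthographies, including conjuncts and vowel signs, which this sketch does not attempt.

```python
# Illustrative bidirectional transliteration built around a character map.
# The mapping below is a tiny invented sample, not the authors' table.
KANNADA_MAP = {
    "ಕ": "⠅",  # ka
    "ನ": "⠝",  # na
}
REVERSE_MAP = {braille: char for char, braille in KANNADA_MAP.items()}

def to_braille(text: str) -> str:
    """Convert text to Braille cells, leaving unmapped characters untouched."""
    return "".join(KANNADA_MAP.get(ch, ch) for ch in text)

def from_braille(braille: str) -> str:
    """Invert the transcription back to standard text."""
    return "".join(REVERSE_MAP.get(ch, ch) for ch in braille)

assert from_braille(to_braille("ಕನ")) == "ಕನ"  # round trip holds for mapped characters
```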

    Unified Modelling Language (UML) Tool for Software

    Software production has moved well beyond the traditional approach described by the structured programming paradigm of the late sixties and early 1970s, in which software was built as interacting autonomous components. The study seeks to identify a UML profile, outline software projects in which UML was used, evaluate the use and efficiency of UML diagrams, determine the use of CASE tools, and record the perceived usefulness of the UML language. A survey was developed for IT professionals and university students and distributed through mailing lists. The findings indicate that UML is used successfully in most software development projects and that many users consider UML valuable because it helps them build systems more rapidly and produce high-quality software. For performance risk assessment, UML diagrams are used to create a software model for each scenario, which is then translated into a system execution model using deployment data.

    Personalized Health Assessment and Recommendations Through IoT and MLP Classifier Algorithms

    Maintaining a healthy lifestyle involves a holistic approach of personalized dietary and exercise recommendations based on individual health status. In this study, we present a new paradigm for assessing individual health status that enables easy self-assessment without specialist help. At its core is a kit of assessment instruments that can measure critical parameters, namely body temperature, pulse rate, blood oxygen level, and body mass index, with minimal medical assistance. The research uses a dataset obtained from a broad group of volunteers aged 17 to 24, including both males and females. Vital signs such as SpO2, BPM, temperature, and BMI are measured using integrated Internet of Things units. The dataset is then carefully preprocessed and balanced before analysis with machine learning algorithms. The basis of this model is a two-tier classifier system that generates personalized dietary and exercise recommendations according to the assessed health status. Multiple machine learning techniques are evaluated, including Decision Tree, KNN, and other classifiers, with the MLP classifier proving to be the best-performing model. The MLP classifier achieves approximately 86% accuracy when the training and testing datasets are split in a 70:30 ratio.
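
    A minimal sketch of this setup with scikit-learn is given below: an MLP classifier trained on vital-sign features with a 70:30 split. The synthetic data, class labels, and layer sizes are assumptions for illustration and are not taken from the paper's dataset or model.

```python
# Illustrative MLP classifier on vital-sign features with a 70:30 split.
# The synthetic data and hidden-layer sizes are assumptions, not the paper's.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Feature columns: SpO2 (%), pulse (BPM), temperature (°C), BMI
X = rng.normal([97, 75, 36.8, 22], [2, 10, 0.4, 3], size=(500, 4))
y = rng.integers(0, 3, size=500)  # 0 = healthy, 1 = dietary advice, 2 = exercise advice

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.30, random_state=42
)

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=42)
clf.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```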

    Lesion-Based Detection of Cardiovascular Diseases Using Deep Learning and Red Deer Optimization

    Nowadays, cardiovascular disease is a major health concern. Medical imaging plays an important role in the detection of many diseases, and magnetic resonance imaging (MRI) is a non-invasive, sophisticated diagnostic tool for cardiovascular disease (CVD) that allows full visualization of the heart and blood vessels. MRI provides high-quality images of the blood vessels, which helps in detecting various heart-related diseases, supports their early diagnosis, and informs preventive measures. Deep learning and its advanced features are proving very helpful in this task and have brought many new capabilities to the field. The article presents the Red Deer Optimizer with Deep Learning (ACVD-RDODL) algorithm for automated cardiovascular disease identification using MRI. The primary goal of the proposed approach is to use deep learning models on cardiac MRI to detect cardiovascular issues. A dynamic histogram equalisation (DHE) based noise removal model is used to pre-process the images, and an Attention Based Convolutional Gated Recurrent Unit Network (ACGRU) model is used to classify cardiovascular diseases.
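
    The sketch below shows one plausible shape of a convolutional-plus-GRU classifier with a simple additive attention head, in the spirit of the ACGRU component. The input resolution, layer sizes, and attention form are assumptions for illustration, and the DHE preprocessing and Red Deer optimization stages are omitted; this is not the authors' exact architecture.

```python
# Illustrative attention-based convolutional GRU classifier (ACGRU-style).
# All hyperparameters are assumptions; preprocessing/optimizer stages omitted.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_acgru(input_shape=(128, 128, 1), n_classes=2):
    inputs = layers.Input(shape=input_shape)
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
    x = layers.MaxPooling2D(2)(x)
    x = layers.Conv2D(64, 3, padding="same", activation="relu")(x)
    x = layers.MaxPooling2D(2)(x)
    # Treat each row of the 32x32x64 feature map as one step of a sequence.
    x = layers.Reshape((32, 32 * 64))(x)
    x = layers.GRU(64, return_sequences=True)(x)
    # Simple additive attention: one learned weight per time step.
    scores = layers.Dense(1, activation="tanh")(x)
    weights = layers.Softmax(axis=1)(scores)
    context = layers.Lambda(lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([x, weights])
    outputs = layers.Dense(n_classes, activation="softmax")(context)
    return models.Model(inputs, outputs)

model = build_acgru()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
```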

    Innovative Solutions for Agriculture: Sensor-Driven Soil Parameter Monitoring and Deep Learning in Potato Disease Detection

    The primary obstacle facing modern agriculture is the lack of advanced technologies capable of efficiently and proactively identifying crop diseases, a gap that is most noticeable while the crop is at the key stem stage. Taking note of this difficulty, the suggested solution calls for the deliberate insertion of cutting-edge sensors directly into the soil at the root level. The objective of this integration is to offer a comprehensive and in-depth evaluation of factors crucial to plant health, including temperature dynamics, moisture content, and soil nutrient levels. While the temperature sensors serve a dual purpose by monitoring the external environment and evaluating the condition of mechanical assets vital to agricultural operations, the soil moisture and index sensors are essential for precisely determining irrigation needs and assessing soil nutrient levels. The project incorporates a cutting-edge Convolutional Neural Network (CNN) deep learning algorithm designed especially for the identification of potato leaf diseases, which represents a significant improvement in disease detection capabilities. This sophisticated algorithm improves the accuracy and efficiency of disease identification by using deep learning to analyze and comprehend complex patterns found in the plant's leaves. This comprehensive initiative's main goal is to create a seamlessly integrated sensor system that can monitor crop health dynamically, provide real-time insights into critical soil characteristics, and use state-of-the-art CNN deep learning technology to detect potato leaf diseases in the agricultural landscape with extreme precision.
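
    A minimal sketch of such a CNN leaf classifier in Keras follows. The three class labels (early blight, late blight, healthy), the 128x128 input resolution, and all layer sizes are assumptions commonly used with public potato-leaf datasets, not details taken from the paper.

```python
# Illustrative CNN for potato leaf disease classification.
# Class labels and architecture are assumptions, not the paper's model.
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(128, 128, 3)),
    layers.Rescaling(1.0 / 255),                 # normalize pixel values
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(3, activation="softmax"),       # early blight / late blight / healthy
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```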

    A Framework to Automate Requirements Specification Task

    Requirement identification and prioritisation are two principal activities of the requirement engineering process in the Software Development Life Cycle. There are several approaches to prioritising the requirements identified by stakeholders; however, a deeper understanding of the optimal approach is still needed. Much research has been done, and machine learning has proven helpful in automating requirement engineering tasks, yet a framework that both identifies the types of requirements and assigns priorities to them does not exist. This study examines the behaviour of the different machine learning algorithms used for software requirements identification and prioritisation. Due to variations in research methodologies and datasets, the results of various studies are inherently contradictory. This paper therefore proposes and implements a framework covering text preparation of requirements stated in natural language, type identification, and requirements prioritisation. After analysing the ML algorithms currently in use, it can be concluded that the various types of requirements must be taken into account when identifying and classifying requirements. A Multiple Correlation Coefficient-based Decision Tree (MCC-based DT) algorithm considers multiple features mapped to a requirement and hence overcomes the limitations of the existing machine learning algorithms. The results demonstrated that the MCC-based DT algorithm has enhanced type identification performance compared to existing ML methods, being 4.42% more accurate than the standard Decision Tree algorithm. This study also seeks an optimisation algorithm suited to prioritising software requirements and evaluates its performance. The sparse matrix produced for the text dataset indicates that the Adam optimisation method must be modified to assign requirements a more precise priority. To address the limitations of the Adam algorithm, the Automated Requirement Prioritisation Technique (ARPT), an innovative algorithm, is implemented in this work. Testing the ARPT on 43 projects reveals that the mean squared error is reduced to 1.34 and the error cost to 0.0001. The results indicate an 84% improvement in the prioritisation of requirements compared to the Adam algorithm.
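
    For context, the sketch below shows the baseline type-identification step with TF-IDF features and a plain decision tree; the MCC-based DT and the ARPT prioritiser described in the abstract are not reproduced here, and the two example requirements are invented.

```python
# Illustrative baseline: classify natural-language requirements by type
# using TF-IDF features and a standard decision tree (not the MCC-based DT).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.tree import DecisionTreeClassifier
from sklearn.pipeline import make_pipeline

requirements = [
    "The system shall export monthly reports as PDF.",          # invented example
    "The login page shall respond within 2 seconds under load.", # invented example
]
labels = ["functional", "non-functional"]

pipeline = make_pipeline(
    TfidfVectorizer(stop_words="english"),
    DecisionTreeClassifier(random_state=0),
)
pipeline.fit(requirements, labels)
print(pipeline.predict(["Reports shall be generated in CSV format."]))
```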

    Optimizing Energy Efficiency in UAV-Based Wireless Communication Networks: A Comparative Analysis of TAODV and DSR Protocols using the Trust Score Algorithm for Signal Processing

    This study presents a comprehensive analysis of energy efficiency optimization in signal processing algorithms for UAV-based wireless communication networks. Employing a multifaceted approach that integrates mathematical modeling, game theory analysis, and an array of testing methodologies, the research aims to address the critical challenge of enhancing communication protocol performance while minimizing energy consumption. Central to our investigation is the development and application of the Trust Score Algorithm (TSA), a novel quantitative tool designed to evaluate and compare the efficacy of various signal processing algorithms across multiple dimensions, including energy efficiency, reliability, adaptability, security, and latency. Through detailed comparative analysis and data visualization techniques, the study reveals that the Proposed_TAODV protocol significantly outperforms traditional TAODV and DSR protocols in several key metrics, including throughput efficiency, end-to-end delay, and packet delivery ratio, particularly as the number of UAV nodes scales up. Such findings underscore the Proposed_TAODV protocol's superior stability and performance, advocating for its potential in improving the sustainability and effectiveness of UAV-based communication networks. The research methodology encompasses both theoretical and empirical testing phases, including simulation-based analysis, to validate the performance of the signal processing algorithms under varied operational conditions. The results not only affirm the superior performance of the Proposed_TAODV protocol but also highlight the utility of the TSA in guiding the selection and optimization of signal processing algorithms for UAV networks.
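
    One simple way to realise a trust score over the five dimensions named in the abstract is a normalised weighted sum, sketched below. The weights and the 0-1 metric values are illustrative assumptions and do not reproduce the paper's TSA formulation.

```python
# Illustrative weighted trust score over the five dimensions from the abstract.
# Weights and example metric values are assumptions, not the paper's TSA.
WEIGHTS = {
    "energy_efficiency": 0.30,
    "reliability": 0.25,
    "adaptability": 0.15,
    "security": 0.15,
    "latency": 0.15,  # normalised so that higher means lower delay
}

def trust_score(metrics: dict) -> float:
    """Weighted sum of per-dimension metrics, each normalised to [0, 1]."""
    return sum(WEIGHTS[name] * metrics[name] for name in WEIGHTS)

proposed_taodv = {"energy_efficiency": 0.82, "reliability": 0.88,
                  "adaptability": 0.75, "security": 0.80, "latency": 0.78}
print(f"Proposed_TAODV trust score: {trust_score(proposed_taodv):.3f}")
```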

    Investigation of Enhanced Performance in Flexible Solar Cells Using Passive Cooling Technique

    The lack of flexibility and the enormous weight of conventional photovoltaic (PV) modules limit their applications. The advantages of flexibility and light weight have made flexible solar cells popular in various applications. However, flexible PVs suffer efficiency degradation due to an increase in module temperature caused by incoming solar infrared radiation. Both the power output and the electrical efficiency of a PV module depend linearly on the operating temperature: for every degree increase in PV temperature, the efficiency decreases by 0.45-0.65%. Here, the novel concept of applying a nanomaterial-based heat-resistant coating for the passive cooling of flexible solar cells was experimentally investigated. A heat-resistant coating generally keeps buildings cool by filtering UV and infrared rays while transmitting visible light. This approach works by controlling the incoming solar radiation, thereby passively decreasing the overall temperature of flexible solar cells without adding much weight. A transparent flexible polyacrylic sheet 0.25 mm in thickness was used, and two coats of a silver nanomaterial-based coating were applied. The sheet was placed over a flexible solar photovoltaic module with a power rating of 6 watts. The temperature of the flexible module was recorded at different time intervals during August, September, and October using temperature sensors, taking note of factors such as wind speed and solar irradiation. These readings were compared with those taken from a solar panel without any coating. A temperature reduction of 6-7°C and an improved solar power efficiency of 2.5-4% were observed for the cooled flexible solar panels.
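
    The linear efficiency-temperature relation quoted in the abstract can be sketched as below. The reference efficiency, reference temperature, and module temperatures are illustrative assumptions; only the ~0.45%/°C coefficient and the 6-7°C cooling figure come from the abstract.

```python
# Illustrative linear PV efficiency-temperature model (relative coefficient).
# eta_ref, temp_ref_c, and the module temperatures are assumed values.
def efficiency_at(temp_c, eta_ref=0.18, temp_ref_c=25.0, beta_per_c=0.0045):
    """PV efficiency at a given module temperature under a linear model."""
    return eta_ref * (1.0 - beta_per_c * (temp_c - temp_ref_c))

uncoated = efficiency_at(55.0)        # assumed uncoated module temperature
coated = efficiency_at(55.0 - 6.5)    # roughly 6-7 °C cooler with the coating
print(f"relative efficiency gain: {(coated - uncoated) / uncoated:.1%}")
# ~3.4%, consistent with the 2.5-4% improvement reported above
```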
