
    SAgric-IoT: an IoT-based platform and deep learning for greenhouse monitoring

    The integration of the Internet of Things (IoT) and convolutional neural networks (CNNs) is a growing topic of interest for researchers as a technology that will contribute to transforming agriculture. IoT enables farmers to decide and act based on data collected from sensor nodes about field conditions, rather than purely on experience, thus minimizing the wastage of supplies (seeds, water, pesticides, and fumigants). CNNs, in turn, complement monitoring systems with tasks such as the early detection of crop diseases or the prediction of the consumable resources and supplies (water, fertilizers) needed to increase productivity. This paper proposes SAgric-IoT, a technology platform based on IoT and CNNs for precision agriculture, which monitors environmental and physical variables and provides early disease detection while automatically controlling irrigation and fertilization in greenhouses. The results show that SAgric-IoT is a reliable IoT platform with low packet loss that considerably reduces energy consumption and achieves a disease identification and classification accuracy of over 90%.
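    As an illustrative sketch (not part of the SAgric-IoT implementation), the kind of sensor-driven irrigation decision such a platform automates can be expressed as a simple rule; the thresholds and command names here are hypothetical:

    ```python
    # Rule-based irrigation control driven by sensor-node readings --
    # the sort of decision logic an IoT greenhouse platform automates.
    # Thresholds are invented for illustration; a real platform would
    # calibrate them per crop and soil type.

    def irrigation_action(soil_moisture_pct, temperature_c,
                          moisture_low=30.0, moisture_high=60.0):
        """Return an actuator command from current sensor readings."""
        if soil_moisture_pct < moisture_low:
            # Dry soil: irrigate longer when it is also hot.
            return "irrigate_long" if temperature_c > 30.0 else "irrigate_short"
        if soil_moisture_pct > moisture_high:
            return "pause_irrigation"
        return "no_action"
    ```

    A real deployment would feed this from periodic sensor-node packets and publish the command back to the actuator over the network, with the CNN-based disease detector running as a separate pipeline on camera images.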

    A review of deep learning applications for the next generation of cognitive networks

    Intelligence capabilities will be the cornerstone of next-generation cognitive networks. These capabilities allow such networks to observe network conditions, learn from them, and then, using the knowledge gained, respond to their operating environment to optimize network performance. This study offers an overview of the current state of the art in the use of deep learning for intelligent cognitive networks, intended to serve as a reference for future initiatives in this field. To this end, a systematic literature review was carried out across three databases, selecting eligible articles that focused on using deep learning to solve challenges presented by current cognitive networks. As a result, 14 articles were analyzed. The results show that applying deep learning algorithms to optimize cognitive data networks has been approached from different perspectives in recent years, largely experimentally, to test its technological feasibility. The implications for solving fundamental challenges in current wireless networks are also discussed.

    The use of deep learning to improve player engagement in a video game through a dynamic difficulty adjustment based on skills classification

    The balance between game difficulty and player skill is a significant factor in player engagement in the evolving landscape of the video game industry. This study introduces a deep learning (DL) approach that enhances gameplay by dynamically adjusting game difficulty based on a player's skill level. The methodology aims to prevent player disengagement, which can occur when the game difficulty significantly exceeds or falls short of the player's skill level. Our evaluation indicates that such dynamic adjustment leads to improved gameplay and increased player involvement, with 90% of players reporting high levels of game enjoyment and immersion.
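    A minimal sketch of the dynamic difficulty adjustment loop described above, with a threshold rule standing in for the paper's deep learning classifier; the skill labels, win-rate cutoffs, and difficulty scale are assumptions for illustration:

    ```python
    # Classify the player's skill from a rolling win rate, then nudge
    # the difficulty one step toward the level matching that skill.
    # In the paper a DL classifier does the first step; a simple
    # threshold rule stands in for it here.

    def classify_skill(recent_wins, recent_rounds):
        """Label skill as low/medium/high from the recent win rate."""
        rate = recent_wins / max(recent_rounds, 1)
        if rate > 0.7:
            return "high"
        if rate < 0.3:
            return "low"
        return "medium"

    def adjust_difficulty(current, skill):
        """Move difficulty (1=easy, 2=normal, 3=hard) one step toward
        the player's skill, avoiding abrupt jumps."""
        target = {"low": 1, "medium": 2, "high": 3}[skill]
        if current < target:
            return current + 1
        if current > target:
            return current - 1
        return current
    ```

    Stepping the difficulty gradually, rather than jumping straight to the target, keeps the adjustment invisible to the player, which matters for maintaining immersion.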

    Olfactory Interfaces: Recent Trends and Challenges of E-Noses in Human–Computer Interaction

    An electronic nose (e-nose) is an electronic device composed of one or more odor sensors, a microcontroller, electronic components, and software that acquires and analyzes a gas or volatile organic compound (VOC) present in an environment. E-noses attempt to identify the gas or VOC by its chemical composition, sending electronic data about the detected odor signature to a computer, much as an animal's nose identifies odors and sends electrochemical signals to the brain. The computer then attempts to identify the perceived odor. In human–computer interaction, e-noses have been used in specialized computing applications whose user interface (UI) serves one purpose: helping the user identify an odor and its properties and communicating information about that odor on the UI.
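    The identification step described above can be sketched as matching a raw sensor reading against stored odor signatures; the odor names, three-sensor profiles, and nearest-neighbor rule here are invented for the example (real e-noses use trained classifiers over many VOC sensors):

    ```python
    # Match an e-nose reading to the nearest stored odor signature.
    # Signatures are hypothetical 3-sensor response profiles.

    import math

    SIGNATURES = {
        "ethanol": (0.8, 0.2, 0.1),
        "ammonia": (0.1, 0.9, 0.3),
        "acetone": (0.5, 0.4, 0.7),
    }

    def identify_odor(reading):
        """Return the signature name closest (Euclidean distance)
        to the given sensor reading."""
        return min(SIGNATURES,
                   key=lambda name: math.dist(reading, SIGNATURES[name]))
    ```

    The UI layer of an olfactory interface would then display the matched odor name and its properties to the user.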

    Natural User Interfaces to Teach Math on Higher Education

    A common and well-known problem among students in higher education (and, in general, throughout school life) is the difficulty of learning math, yet teaching methods for math remain largely unchanged. However, with the emergence of new technologies such as portable devices (smartphones and tablets) and motion-based interaction devices (Nintendo Wii Remote, Microsoft Kinect, and PlayStation Move), learners and teachers have a strong interest in new ways of interacting throughout the teaching-learning process. In this sense, natural user interfaces (NUIs) offer great potential to facilitate new forms of computer-enhanced learning: they can enhance classroom interactions by increasing learner participation, facilitating teachers' presentations, and creating opportunities for discussion. We present a system that combines gestural and touch interactions across multiple personal devices and public displays to enhance and support math education for college students. In a formative usability study, learners and teachers were positive about the interaction design and the learning possibilities for math education, and expressed their intention to continue using the system.

    Prioritization-Driven Congestion Control in Networks for the Internet of Medical Things: A Cross-Layer Proposal

    Real-life implementation of the Internet of Things (IoT) in healthcare requires sufficient quality of service (QoS) to transmit the collected data successfully. However, unsolved prioritization and congestion issues limit the functionality of IoT networks by increasing the likelihood of packet loss, latency, and high power consumption in healthcare systems. This study proposes a priority-based cross-layer congestion control protocol called QCCP, managed by the transport and medium access control (MAC) layers of the communication devices. Unlike existing methods, the novelty of QCCP lies in how it estimates and resolves wireless channel congestion: it generates no control packets, operates in a distributed manner, and carries only a one-bit overhead. At the same time, QCCP schedules packets according to each packet's network load and QoS requirements. Experiments demonstrated that, with a 95% confidence level, QCCP performs well enough to support the QoS requirements for transmitting health signals. Finally, a comparison study shows that QCCP outperforms other TCP protocols, with 64.31% higher throughput, 18.66% less packet loss, and 47.87% lower latency.
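    A minimal sketch (not the QCCP protocol itself) of the two ideas the abstract combines, priority-based packet scheduling and a one-bit congestion flag; the priority values and the rule of shedding best-effort traffic under congestion are assumptions for illustration:

    ```python
    # Priority scheduler with a one-bit congestion flag: urgent medical
    # traffic is always served, while best-effort traffic is shed when
    # the channel is congested.

    import heapq

    class PriorityScheduler:
        def __init__(self):
            self._heap = []
            self._seq = 0           # FIFO tie-break within a priority
            self.congested = False  # the "one-bit" channel-state flag

        def enqueue(self, packet, priority):
            """Lower numeric priority = more urgent (e.g. vital signs)."""
            heapq.heappush(self._heap, (priority, self._seq, packet))
            self._seq += 1

        def dequeue(self):
            """Pop the most urgent packet; under congestion, drop
            best-effort traffic (priority >= 2) instead of sending it."""
            while self._heap:
                priority, _, packet = heapq.heappop(self._heap)
                if self.congested and priority >= 2:
                    continue  # shed low-priority load
                return packet
            return None
    ```

    In QCCP the congestion estimate comes from the MAC layer without dedicated control packets; here the flag is simply set by the caller to show how scheduling reacts to it.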

    A Comparison of Three Machine Learning Methods for Multivariate Genomic Prediction Using the Sparse Kernels Method (SKM) Library

    Genomic selection (GS) has changed the way plant breeders select genotypes. GS takes advantage of phenotypic and genotypic information to train a statistical machine learning model, which is then used to predict the phenotypic (or breeding) values of new lines for which only genotypic information is available. Many statistical machine learning methods have been proposed for this task. Multi-trait (MT) genomic prediction models take advantage of correlated traits to improve prediction accuracy, so some multivariate statistical machine learning methods are popular for GS. In this paper, we compare the prediction performance of three MT methods: the MT genomic best linear unbiased predictor (GBLUP), MT partial least squares (PLS), and MT random forest (RF). Benchmarking was performed with six real datasets. We found that the three investigated methods produce similar results, but with predictors including environment (E) and genotype (G), that is, E + G, the MT GBLUP achieved superior performance, whereas with predictors E + G + genotype × environment interaction (GE) and G + GE, random forest achieved the best results. We also found that the best predictions were achieved with the predictors E + G and E + G + GE. We also provide the R code for implementing these three statistical machine learning methods in the sparse kernel method (SKM) library, which offers not only options for single-trait prediction with various statistical machine learning methods but also options for MT prediction that can help capture complex patterns in datasets common in genomic selection.
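    The paper's code uses the R-based SKM library; as a language-agnostic illustration of the multi-trait idea, here is a minimal NumPy sketch of multi-trait ridge regression, the penalized linear model underlying GBLUP-style genomic prediction (the data shapes and penalty value are assumptions, not the paper's setup):

    ```python
    # Fit one coefficient matrix mapping marker genotypes X to several
    # traits Y at once, so correlated traits share a single model.

    import numpy as np

    def multi_trait_ridge(X, Y, lam=1.0):
        """Closed-form ridge fit: B = (X'X + lam*I)^-1 X'Y.

        X: (n_lines, n_markers) genotype matrix
        Y: (n_lines, n_traits)  phenotype matrix
        Returns B with one column of marker effects per trait.
        """
        n_markers = X.shape[1]
        return np.linalg.solve(X.T @ X + lam * np.eye(n_markers), X.T @ Y)

    def predict(X_new, B):
        """Predict all traits for new lines from genotypes alone."""
        return X_new @ B
    ```

    This mirrors the GS workflow described above: the model is trained on lines with both phenotypes and genotypes, then applied to candidate lines for which only genotypes are available.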