
    Edge-Cloud Polarization and Collaboration: A Comprehensive Survey for AI

    Influenced by the great success of deep learning via cloud computing and the rapid development of edge chips, research in artificial intelligence (AI) has shifted to both computing paradigms, i.e., cloud computing and edge computing. In recent years, we have witnessed significant progress in developing more advanced AI models on cloud servers that surpass traditional deep learning models, owing to model innovations (e.g., Transformers and pretrained model families), the explosion of training data, and soaring computing capabilities. However, edge computing, especially edge-cloud collaborative computing, is still in its infancy, because resource-constrained IoT scenarios allow only very limited algorithms to be deployed. In this survey, we conduct a systematic review of both cloud and edge AI. Specifically, we are the first to set up a collaborative learning mechanism for cloud and edge modeling, with a thorough review of the architectures that enable such a mechanism. We also discuss the potential of, and practical experience with, several ongoing advanced edge-AI topics, including pretraining models, graph neural networks, and reinforcement learning. Finally, we discuss the promising directions and challenges in this field. Comment: 20 pages, Transactions on Knowledge and Data Engineering.
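
    To make the collaboration concrete, here is a minimal, hypothetical sketch of one pattern commonly studied in this area: an edge model answers locally when it is confident and defers hard inputs to a cloud model. The class names, weights, and threshold are illustrative assumptions, not taken from the survey.

```python
# Hypothetical edge-cloud collaborative inference sketch.
# EdgeModel, CloudClient, and CONF_THRESHOLD are illustrative stand-ins.

import numpy as np

CONF_THRESHOLD = 0.85  # below this confidence, defer the input to the cloud

class EdgeModel:
    """Stand-in for a compact on-device classifier."""
    def predict_proba(self, x: np.ndarray) -> np.ndarray:
        # Random projection + softmax, just to produce a probability vector.
        logits = x @ np.random.default_rng(0).normal(size=(x.size, 3))
        e = np.exp(logits - logits.max())
        return e / e.sum()

class CloudClient:
    """Stand-in for an RPC call to a large cloud-hosted model."""
    def predict(self, x: np.ndarray) -> int:
        return 0  # placeholder: a real client would call a remote service

def collaborative_predict(x, edge: EdgeModel, cloud: CloudClient) -> int:
    probs = edge.predict_proba(x)
    if probs.max() >= CONF_THRESHOLD:
        return int(probs.argmax())   # resolved locally, no uplink cost
    return cloud.predict(x)          # only hard cases pay the cloud round-trip

print(collaborative_predict(np.ones(4), EdgeModel(), CloudClient()))
```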

    Ecosystemic Evolution Feeded by Smart Systems

    The Information Society is advancing along a path of ecosystemic evolution, grounded in advances in ICT and the Internet together with a systemic approach to the enhancement and application of Smart Systems. This approach is expected to evolve to meet the basic requirement of significantly enhancing human and social well-being in all spheres of life (public, private, and professional). That implies enhancing and exploiting the net-living virtual space so that it becomes a virtuous, beneficial complement to the real-life space. Meanwhile, the contextual evolution of smart cities aims to empower this ecosystemic approach by extending net-living benefits across the lived territory, while also targeting stable socio-economic local development that satisfies social, ecological, and economic sustainability requirements. This territorial focus matches a new glocal vision, which enables a more effective diffusion of well-being benefits and thus moderates the current global vision fed primarily by a global-scale market development view. Basic technological advancements must therefore be pursued at the system level. They include system architecting for the virtualization of functions, data integration and sharing, flexible basic service composition, and viable end-service personalization, enabling smart systems to operate and interoperate in support of net-living advancements in all application fields. Growing, essentially mandatory importance must also be given to human-technical and social-technical factors, and to the associated need to strengthen the cross-disciplinary approach in related research and innovation. The prospected ecosystemic impact also implies proactive social participation, as well as coping with possible negative effects of net-living such as social exclusion and isolation, which require incisive actions for a conformal socio-cultural development. In this respect, the speed, continuity, and expected long duration of innovation processes, pushed by basic technological advancements, make the ecosystemic requirements stricter. This evolution also requires a new approach to developing the basic and vocational education needed for net-living, which should be considered an engine for the development of the related 'new living know-how' as well as of the conformal 'new making know-how'.

    Internet of Underwater Things and Big Marine Data Analytics -- A Comprehensive Survey

    The Internet of Underwater Things (IoUT) is an emerging communication ecosystem developed for connecting underwater objects in maritime and underwater environments. IoUT technology is intricately linked with intelligent boats and ships, smart shores and oceans, automatic marine transportation, positioning and navigation, underwater exploration, disaster prediction and prevention, as well as intelligent monitoring and security. The IoUT exerts influence at various scales, ranging from a small scientific observatory, through a mid-sized harbor, up to global oceanic trade. The network architecture of the IoUT is intrinsically heterogeneous and should be sufficiently resilient to operate in harsh environments, which creates major challenges for underwater communications that must rely on limited energy resources. Additionally, the volume, velocity, and variety of data produced by sensors, hydrophones, and cameras in the IoUT are enormous, giving rise to the concept of Big Marine Data (BMD), which has its own processing challenges. Hence, conventional data processing techniques will falter, and bespoke Machine Learning (ML) solutions have to be employed to automatically learn the specific behavior and features of BMD, facilitating knowledge extraction and decision support. The motivation of this paper is to comprehensively survey the IoUT, BMD, and their synthesis, and to explore the nexus of BMD with ML. We set out from underwater data collection and then discuss the family of IoUT data communication techniques, with an emphasis on state-of-the-art research challenges. We then review the suite of ML solutions suitable for BMD handling and analytics. We treat the subject deductively from an educational perspective, critically appraising the material surveyed. Comment: 54 pages, 11 figures, 19 tables, IEEE Communications Surveys & Tutorials, peer-reviewed academic journal.

    Smart and Pervasive Healthcare

    Smart and pervasive healthcare aims at facilitating better healthcare access, provision, and delivery by overcoming spatial and temporal barriers. It represents a shift toward understanding what patients and clinicians really need when placed within a specific context, where traditional face-to-face encounters may not be possible or sufficient. As such, technological innovation is a necessary facilitating conduit. This book is a collection of chapters written by prominent researchers and academics worldwide that provide insights into the design and adoption of new platforms in smart and pervasive healthcare. With the COVID-19 pandemic necessitating changes to the traditional model of healthcare access and delivery around the world, this book is a timely contribution.

    pHealth 2021. Proc. of the 18th Internat. Conf. on Wearable Micro and Nano Technologies for Personalised Health, 8-10 November 2021, Genoa, Italy

    Smart mobile systems – microsystems, smart textiles, smart implants, and sensor-controlled medical devices – together with related body, local, and wide-area networks up to cloud services, have become important enablers for telemedicine and the next generation of healthcare services. pHealth technologies offer enormous potential benefits for all stakeholder communities, not only in terms of improvements in medical quality and industrial competitiveness, but also for the management of healthcare costs and, last but not least, the improvement of the patient experience. This book presents the proceedings of pHealth 2021, the 18th in a series of conferences on wearable micro and nano technologies for personalized health with personal health management systems, hosted by the University of Genoa, Italy, and held as an online event from 8 to 10 November 2021. The conference focused on digital health ecosystems in the transformation of healthcare towards personalized, participative, preventive, predictive precision medicine (5P medicine). The book contains 46 peer-reviewed papers (1 keynote, 5 invited papers, 33 full papers, and 7 poster papers). Subjects covered include the deployment of mobile technologies, micro-nano-bio smart systems, bio-data management and analytics, autonomous and intelligent systems, the Health Internet of Things (HIoT), potential risks to security and privacy, and the motivation and empowerment of patients in care processes. Providing an overview of current advances in personalized health and health management, the book will be of interest to all those working in the field of healthcare today.

    Big Data and Its Applications in Smart Real Estate and the Disaster Management Life Cycle: A Systematic Analysis

    Big data refers to the enormous amounts of data generated daily in different fields due to the increased use of technology and internet sources. Despite various advancements and the hope of better understanding, big data management and analysis remain a challenge, calling for more rigorous and detailed research, as well as for the identification of methods by which big data can be tackled and put to good use. Existing research falls short in discussing and evaluating the pertinent tools and technologies for analyzing big data efficiently, which calls for a comprehensive and holistic analysis of the published articles to summarize the concept of big data and survey field-specific applications. To address this gap and keep the focus recent, research articles published in the last decade in top-tier, high-impact journals were retrieved using the search engines of Google Scholar, Scopus, and Web of Science, and narrowed down to a set of 139 relevant research articles. Different analyses were conducted on the retrieved papers, including bibliometric analysis, keyword analysis, big data search trends, and the authors' names, countries, and affiliated institutes contributing most to the field of big data. The comparative analyses show that, conceptually, big data lies at the intersection of the storage, statistics, technology, and research fields and emerged as an amalgam of these four fields, with interlinked aspects such as data hosting and computing, data management, data refining, data patterns, and machine learning. The results further show that the major characteristics of big data can be summarized by the seven Vs: variety, volume, variability, value, visualization, veracity, and velocity. Furthermore, the existing methods for big data analysis, their shortcomings, and possible directions for harnessing technology so that data analysis tools can be upgraded to be fast and efficient were also explored. The major challenges in handling big data include efficient storage, retrieval, analysis, and visualization of large heterogeneous data, which can be tackled through authentication schemes such as Kerberos and encrypted files, logging of attacks, secure communication through Secure Sockets Layer (SSL) and Transport Layer Security (TLS), data imputation, building learning models, dividing computations into sub-tasks, checkpointing applications for recursive tasks, and using Solid State Drives (SSDs) and Phase Change Material (PCM) for storage. In terms of frameworks for big data management, two principal frameworks exist, Hadoop and Apache Spark, which must be used together to capture the holistic essence of the data and make the analyses meaningful and swift. Field-specific applications of big data in two promising and integrated fields, i.e., smart real estate and disaster management, were further investigated, and a framework for field-specific applications, as well as a merger of the two areas through big data, was highlighted. The proposed frameworks show that big data can tackle the ever-present issue of customer regret caused by poor or missing information in smart real estate, increasing customer satisfaction through an intermediate organization that processes and checks the data provided to customers by sellers and real estate managers.
Similarly, for disaster risk management, data from social media, drones, multimedia, and search engines can be used to tackle natural disasters such as floods, bushfires, and earthquakes, as well as to plan emergency responses. In addition, a merged framework for smart real estate and disaster risk management shows that big data generated from smart real estate, in the form of occupant data, facilities management, and building integration and maintenance, can be shared with disaster risk management and emergency response teams to help prevent, prepare for, respond to, or recover from disasters.
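
    As a concrete illustration of the Hadoop/Spark pairing the abstract mentions, here is a minimal PySpark sketch in which data at rest on HDFS (Hadoop's distributed file system) is aggregated in memory by Spark. The file path and column names are assumptions for illustration, not taken from the reviewed articles.

```python
# Hypothetical sketch: Spark processing of listing records stored on HDFS.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("smart-real-estate-demo").getOrCreate()

# Read records from HDFS (assumed path and schema).
listings = spark.read.json("hdfs:///data/smart_real_estate/listings.json")

# A simple volume-scale aggregation: average price and count per suburb.
summary = (listings
           .groupBy("suburb")
           .agg(F.avg("price").alias("avg_price"),
                F.count("*").alias("n_listings")))

summary.show()
spark.stop()
```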

    Transmission Rate Compression Based on Kalman Filter Using Spatio-temporal Correlation for Wireless Sensor Networks

    Wireless sensor networks (WSNs), composed of spatially distributed autonomous sensor nodes, have been applied to a wide variety of applications. Due to the limited energy budget of the sensor nodes and the long-term operation requirement of the network, energy efficiency is a primary concern in almost any application. Radio communication, one of the most energy-expensive processes, can be suppressed by exploiting temporal and spatial correlations. However, it is a challenge to compress the communication as much as possible while reconstructing the system state with the highest quality. This work proposes the PKF method for compressing the transmission rate in cluster-based WSNs, which combines a k-step-ahead Kalman predictor with a Kalman filter (KF). It provides the optimal reconstruction solution based on the compressed information of a single node for a linear system. Instead of approximating the noisy raw data, PKF aims to reconstruct the internal state of the system. It achieves data filtering, state estimation, data compression, and reconstruction within one KF framework, and it allows the signal reconstructed from the compressed transmissions to be even more precise than transmitting all of the raw measurements without processing. The second contribution is a detailed analysis of PKF. It not only characterizes the effect of the system parameters on the performance of PKF but also supplies a common framework for analyzing the underlying process of prediction-based schemes. The transmission rate and reconstruction quality are functions of the system parameters, which are calculated with the aid of the (truncated) multivariate normal (MVN) distribution. A transmission from a node using PKF not only determines the current optimal estimate of the system state, but also indicates the range and the transmission probability of the cluster head's k-step-ahead prediction. One of the prominent results is an explicit expression for the covariance of the doubly truncated MVN distribution; this is the first work to calculate it using the Hessian matrix of the probability density function of an MVN distribution, which improves on traditional methods based on the moment-generating function and is more general. This contribution is important not only for WSNs but also for other domains, e.g., statistics and economics. Based on the above analysis, the PKF method is extended to exploit spatial correlation in multi-node systems without any intra-communication or coordinator. Each leaf node executes a PKF independently, and the reconstruction quality is further improved by the cluster head using the received information, which is equivalent to further reducing the transmission rate of each node under a guaranteed reconstruction quality. The optimal reconstruction solution, called Rand-ST, is obtained when the cluster head uses the incomplete information by treating the transmission of each node as random. Rand-ST actually solves the KF fusion problem with colored and randomly transmitted observations, which, to the best of our knowledge, makes this the first work addressing that problem. It proves that the KF with state augmentation is more accurate than the measurement-differencing approach in this scenario. The suboptimality of Rand-ST in neglecting useful information is analyzed for the case where the transmission of each node is controlled by PKF, and heuristic EPKF methods are thereupon proposed to utilize the complete information while solving the nonlinear problem through linear approximations.
Compared with the available techniques, the EPKF methods not only ensure an error bound on the reconstruction for each node, but also allow the nodes to report emergency events in time, which avoids the loss of potentially important information.
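
    To illustrate the prediction-based transmission idea behind PKF, here is a minimal sketch for a scalar random-walk system: the node transmits a measurement only when it deviates too far from the state estimate the cluster head can already predict, so both sides stay consistent at a fraction of the radio cost. The noise parameters and error bound are illustrative assumptions, not the thesis's values, and the full PKF machinery (k-step predictor analysis, truncated-MVN rates) is not reproduced.

```python
# Hypothetical sketch of prediction-based transmission suppression
# with a scalar Kalman filter; Q, R, and EPS are illustrative.
import numpy as np

Q, R, EPS = 1e-3, 1e-2, 0.3   # process noise, measurement noise, error bound

def kf_update(x, P, z):
    """Standard scalar KF predict+update for the random walk x_t = x_{t-1} + w."""
    P = P + Q                  # predict (state transition F = 1)
    K = P / (P + R)            # Kalman gain
    return x + K * (z - x), (1 - K) * P

rng = np.random.default_rng(1)
truth, x, P = 0.0, 0.0, 1.0    # x, P mirror the cluster head's estimate
sent = 0
for t in range(200):
    truth += rng.normal(scale=Q ** 0.5)       # system evolves
    z = truth + rng.normal(scale=R ** 0.5)    # node measures
    if abs(z - x) > EPS:                      # prediction too far off: transmit
        x, P = kf_update(x, P, z)
        sent += 1
    else:                                     # head keeps predicting on its own
        P = P + Q
print(f"transmitted {sent}/200 samples")
```

    The key point the sketch captures is that silence is informative: by not transmitting, the node tells the cluster head that its prediction is still within the agreed error bound.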

    Management And Security Of Multi-Cloud Applications

    Single-cloud management platform technology has reached maturity and is quite successful in information technology applications. Enterprises and application service providers are increasingly adopting a multi-cloud strategy to reduce the risk of cloud service provider lock-in and cloud blackouts while, at the same time, gaining benefits such as competitive pricing, flexible resource provisioning, and better points of presence. Another class of applications attracting growing interest from cloud service providers is carriers' virtualized network services. However, virtualized carrier services require high levels of availability and performance and impose stringent requirements on cloud services; they necessitate multi-cloud management and innovative techniques for placement and performance management. We consider two classes of distributed applications – virtual network services and the next generation of healthcare – that would benefit immensely from deployment over multiple clouds. This thesis deals with the design and development of new processes and algorithms to enable these classes of applications. We have developed a method for the optimization of multi-cloud platforms that paves the way for optimized placement of both classes of services; the placement approach itself is predictive, cost-optimized, latency-controlled virtual resource placement for both types of applications. To improve the availability of virtual network services, we have made innovative use of machine and deep learning to develop a framework for fault detection and localization. Finally, to secure patient data flowing through the wide expanse of sensors, the cloud hierarchy, the virtualized network, and the visualization domain, we have developed hierarchical autoencoder models for data in motion between the IoT domain and the multi-cloud domain, and within the multi-cloud hierarchy.
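
    As a rough illustration of the autoencoder idea used here for protecting data in motion, the sketch below trains a generic autoencoder on "normal" records and flags records with unusually high reconstruction error. It uses a plain sklearn MLP and synthetic data as assumptions for illustration; the thesis's hierarchical models and feature pipeline are not reproduced.

```python
# Hypothetical autoencoder-style anomaly detector for data-in-motion records.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(500, 8))    # stand-in "normal" sensor records

# Autoencoder: inputs double as targets, squeezed through a narrow hidden layer.
ae = MLPRegressor(hidden_layer_sizes=(3,), max_iter=2000, random_state=0)
ae.fit(normal, normal)

# Threshold = 99th percentile of reconstruction error on normal data.
train_err = np.mean((ae.predict(normal) - normal) ** 2, axis=1)
threshold = np.quantile(train_err, 0.99)

suspect = rng.normal(4.0, 1.0, size=(1, 8))     # an out-of-distribution record
err = float(np.mean((ae.predict(suspect) - suspect) ** 2))
print("anomalous" if err > threshold else "normal", err)
```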

    eHealth in Chronic Diseases

    This book provides a review of the management of chronic diseases (evaluation and treatment) through eHealth. Studies that examine how eHealth can help to prevent, evaluate, or treat chronic diseases and their outcomes are included.