
    Models of everywhere revisited: a technological perspective

    The concept ‘models of everywhere’ was first introduced in the mid-2000s as a means of reasoning about the environmental science of a place, changing the nature of the underlying modelling process from one in which general model structures are used to one in which modelling becomes a learning process about specific places, in particular capturing the idiosyncrasies of those places. At one level, this is a straightforward concept, but at another it is a rich multi-dimensional conceptual framework involving the following key dimensions: models of everywhere, models of everything and models at all times, constantly re-evaluated against the most current evidence. This is a compelling approach with the potential to deal with epistemic uncertainties and nonlinearities. However, the approach has not yet been fully utilised or explored. This paper examines the concept of models of everywhere in the light of recent advances in technology. The paper argues that, when the concept was first proposed, technology was a limiting factor, but now, with advances in areas such as the Internet of Things, cloud computing and data analytics, many of the barriers have been removed. Consequently, it is timely to look again at the concept of models of everywhere in practical conditions as part of a trans-disciplinary effort to tackle the remaining research questions. The paper concludes by identifying the key elements of a research agenda that should underpin such experimentation and deployment.

    The edge cloud: A holistic view of communication, computation and caching

    The evolution of communication networks shows a clear shift of focus from merely improving the communications aspects to enabling important new services, from Industry 4.0 to automated driving, virtual/augmented reality, the Internet of Things (IoT), and so on. This trend is evident in the roadmap planned for the deployment of the fifth generation (5G) communication networks. This ambitious goal requires a paradigm shift towards a vision that looks at communication, computation and caching (3C) resources as three components of a single holistic system. A further step is to bring these 3C resources closer to the mobile user, at the edge of the network, to enable very low latency and high reliability services. The scope of this chapter is to show that signal processing techniques can play a key role in this new vision. In particular, we motivate the joint optimization of 3C resources. Then we show how graph-based representations can play a key role in building effective learning methods and devising innovative resource allocation techniques.
    Comment: to appear in the book "Cooperative and Graph Signal Processing: Principles and Applications", P. Djuric and C. Richard Eds., Academic Press, Elsevier, 201
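    The joint 3C trade-off described above can be made concrete with a toy latency model. The sketch below is a hypothetical illustration, not the chapter's optimization method: for each task it compares serving a cached result at the edge, computing at the edge, or offloading to the cloud, and picks the option with the lowest estimated delay. All names, rates and parameters are assumed values.

```python
# Toy 3C decision sketch: per-task choice between edge cache, edge compute and
# cloud offload, based on rough latency estimates. Hypothetical illustration.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    input_bits: float      # data to upload over the radio link
    cycles: float          # CPU cycles required by the task
    cached_at_edge: bool   # is the result already cached at the edge?

def best_option(task: Task,
                uplink_bps: float = 50e6,       # assumed radio uplink rate
                edge_cps: float = 5e9,          # assumed edge CPU cycles/s
                cloud_cps: float = 50e9,        # assumed cloud CPU cycles/s
                backhaul_rtt_s: float = 0.040,  # assumed edge-to-cloud round trip
                cache_lookup_s: float = 0.002) -> tuple[str, float]:
    """Return the lowest-latency way to serve the task and its estimated delay."""
    options = {
        # Caching: pay only the radio hop and a lookup, if the result is cached.
        "edge-cache": task.input_bits / uplink_bps + cache_lookup_s
                      if task.cached_at_edge else float("inf"),
        # Edge computation: upload over the radio link, compute on the edge CPU.
        "edge-compute": task.input_bits / uplink_bps + task.cycles / edge_cps,
        # Cloud offload: add the backhaul round trip, but use a faster CPU.
        "cloud": task.input_bits / uplink_bps + backhaul_rtt_s
                 + task.cycles / cloud_cps,
    }
    choice = min(options, key=options.get)
    return choice, options[choice]

if __name__ == "__main__":
    for t in [Task("object-detection", 4e6, 2e9, cached_at_edge=False),
              Task("map-tile", 1e5, 1e7, cached_at_edge=True)]:
        print(t.name, *best_option(t))
```

    In the chapter's setting, such a per-task heuristic would be replaced by a joint, graph-based optimization across users and edge nodes.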

    Monitoring the waste to energy plant using the latest AI methods and tools

    Solid wastes, for instance municipal and industrial wastes, present great environmental concerns and challenges all over the world. This has led to the development of innovative waste-to-energy process technologies capable of handling different waste materials in a more sustainable and energy-efficient manner. However, as in many other complex industrial process operations, waste-to-energy plants require sophisticated process monitoring systems in order to realize very high overall plant efficiencies. Conventional data-driven statistical methods, which include principal component analysis, partial least squares, multivariable linear regression and so forth, are normally applied in process monitoring. Recently, however, the latest artificial intelligence (AI) methods, in particular deep learning algorithms, have demonstrated remarkable performance in several important areas such as machine vision, natural language processing and pattern recognition. The new AI algorithms have gained increasing attention in process industry applications, for instance in areas such as predictive product quality control and machine health monitoring. Moreover, the availability of big-data processing tools and cloud computing technologies further supports the use of deep learning based algorithms for process monitoring. In this work, a process monitoring scheme based on state-of-the-art artificial intelligence methods and cloud computing platforms is proposed for a waste-to-energy industrial use case. The monitoring scheme supports the use of the latest AI methods, leveraging big-data processing tools and taking advantage of available cloud computing platforms. Deep learning algorithms are able to describe nonlinear, dynamic and high-dimensional systems better than most conventional data-based process monitoring methods. Moreover, deep learning based methods are better suited to big-data analytics than traditional statistical machine learning methods, which are less efficient. Furthermore, the proposed monitoring scheme emphasizes real-time process monitoring in addition to offline data analysis. To achieve this, the scheme proposes the use of big-data analytics software frameworks and tools such as Microsoft Azure Stream Analytics, Apache Storm, Apache Spark, Hadoop and many others. The availability of open source as well as proprietary cloud computing platforms, AI and big-data software tools all supports the realization of the proposed monitoring scheme.
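    For reference, the conventional baseline mentioned above can be sketched in a few lines. The example below is an assumed, simplified illustration rather than the paper's actual scheme: it monitors a simulated stream of plant sensor readings with a PCA squared-prediction-error (SPE) statistic and raises an alarm when the error exceeds a control limit. In the proposed scheme, a deep learning model (e.g. an autoencoder) and a streaming framework such as Apache Spark or Azure Stream Analytics would take the place of the PCA step and the plain Python loop.

```python
# Simplified illustration of conventional PCA-based process monitoring (SPE/Q
# statistic) on a simulated sensor stream. Not the paper's proposed scheme.
import numpy as np

rng = np.random.default_rng(0)

# "Historical" data from normal operation: 500 samples, 8 correlated sensors.
latent = rng.normal(size=(500, 3))
mixing = rng.normal(size=(3, 8))
train = latent @ mixing + 0.1 * rng.normal(size=(500, 8))

# Fit PCA on normal-operation data: keep the 3 leading principal directions.
mean = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
components = vt[:3]                      # retained loadings

def spe(x: np.ndarray) -> float:
    """Squared prediction error of one sample against the PCA model."""
    centred = x - mean
    reconstructed = (centred @ components.T) @ components
    return float(np.sum((centred - reconstructed) ** 2))

# Control limit: here simply the 99th percentile of SPE on the training data.
limit = np.percentile([spe(x) for x in train], 99)

# Simulated real-time stream: normal samples, then a slowly drifting fault.
for t in range(20):
    sample = rng.normal(size=3) @ mixing + 0.1 * rng.normal(size=8)
    if t >= 10:
        sample[4] += 0.3 * (t - 9)       # incipient fault on sensor 5
    status = "ALARM" if spe(sample) > limit else "ok"
    print(f"t={t:02d}  SPE={spe(sample):7.3f}  {status}")
```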

    An Intelligent model for supporting Edge Migration for Virtual Function Chains in Next Generation Internet of Things

    The development of next-generation IoT sensing devices, with advances in their low-power computational capabilities and high-speed networking, has led to the introduction of the edge computing paradigm. Within an edge cloud environment, services may generate and consume data locally, without involving cloud computing infrastructures. Aiming to tackle the low computational resources of IoT nodes, the Virtual-Function-Chain has been proposed as an intelligent distribution model for exploiting the maximum computational power available at the edge, thus enabling the support of demanding services. An intelligent migration model with the capacity to support Virtual-Function-Chains is introduced in this work. According to this model, migration at the edge can support individual features of a Virtual-Function-Chain. First, auto-healing can be implemented with cold migrations if a Virtual Function fails unexpectedly. Second, a Quality of Service monitoring model can trigger live migrations, aiming to avoid overloading edge devices. The evaluation studies of the proposed model revealed that it has the capacity to increase the robustness of an edge-based service on low-powered IoT devices. Finally, comparison with similar frameworks, such as Kubernetes, showed that the migration model can react effectively to edge network fluctuations.
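    The two migration triggers described above can be expressed as a small decision rule. The sketch below is a hypothetical illustration; node names, thresholds and interfaces are assumed rather than taken from the paper. A failed Virtual Function is cold-migrated to another node (auto-healing), while an overloaded but otherwise healthy host triggers a live migration.

```python
# Hypothetical sketch of cold vs. live migration triggers for a Virtual Function
# in a chain running on low-powered edge nodes. Names and thresholds are assumed.
from dataclasses import dataclass

@dataclass
class EdgeNode:
    name: str
    cpu_load: float      # fraction of CPU in use, 0..1
    alive: bool = True

@dataclass
class VirtualFunction:
    name: str
    host: EdgeNode
    healthy: bool = True

CPU_OVERLOAD = 0.85      # assumed QoS threshold that triggers live migration

def least_loaded(nodes: list[EdgeNode], exclude: EdgeNode) -> EdgeNode:
    candidates = [n for n in nodes if n.alive and n is not exclude]
    return min(candidates, key=lambda n: n.cpu_load)

def migration_decision(vf: VirtualFunction, nodes: list[EdgeNode]) -> str:
    # Auto-healing: the VF (or its host) failed, so restart it elsewhere from scratch.
    if not vf.healthy or not vf.host.alive:
        target = least_loaded(nodes, exclude=vf.host)
        return f"COLD migrate {vf.name}: {vf.host.name} -> {target.name}"
    # QoS monitoring: the host is overloaded, so move the VF while it keeps running.
    if vf.host.cpu_load > CPU_OVERLOAD:
        target = least_loaded(nodes, exclude=vf.host)
        return f"LIVE migrate {vf.name}: {vf.host.name} -> {target.name}"
    return f"keep {vf.name} on {vf.host.name}"

if __name__ == "__main__":
    nodes = [EdgeNode("pi-1", 0.92), EdgeNode("pi-2", 0.35), EdgeNode("pi-3", 0.50)]
    chain = [VirtualFunction("decode", nodes[0]),
             VirtualFunction("detect", nodes[1], healthy=False)]
    for vf in chain:
        print(migration_decision(vf, nodes))
```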

    Interactive Experience Design: Integrated and Tangible Storytelling with Maritime Museum Artefacts

    Museums play the role of intermediary between cultural heritage and visitors, and are often described as places and environments for education and enjoyment. The European Union also encourages innovative uses of museums to support education through cultural heritage resources. The importance of visitors' active role in museums as places for education and entertainment, on the one hand, and the growing and indispensable presence of technology in the cultural heritage domain, on the other, provided the initial ideas for this research. This thesis presents the study and design of an interactive storytelling installation for a maritime museum. The installation is designed to integrate different museum artefacts into the storytelling system to enrich the visitors' experience through tangible storytelling. The project was conducted in collaboration with another PhD student, Luca Ciotoli. His contribution was mainly focused on the narrative and storytelling features of the research, while my contribution focused on the interaction- and technology-related features, including the design and implementation of the prototype. The research follows a four-phase iterative approach. The first phase, Study, deals with the literature review and the studies carried out to identify the requirements. The second phase, Design, determines the broad outlines of the project, i.e., an interactive storytelling installation; here we investigated different design approaches, such as interaction design and museum experience design, to develop a conceptual design. The third phase, Prototype, determines how to fulfill the tasks and meet the requirements established for the research; prototyping involves content creation, storyboarding, and integrating augmented artefacts into the storytelling system. The final phase, Test, refers to the evaluations conducted during the aforementioned phases, e.g., formative testing and the final usability testing with users. The outcome of the research confirms previous results in the literature about how digital narratives can be enriched with the tangible dimension; moreover, it shows how this dimension makes it possible to communicate complex stories and knowledge of the past, such as the art of navigation in the past, by integrating tangible objects that play different roles in the storytelling process.

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential in terms of supporting a broad range of complex, compelling applications in both military and civilian fields, where users are able to enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex heterogeneous nature of network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services as well as scenarios of future wireless networks.
    Comment: 46 pages, 22 fig

    Advancing the Smart Region Digital Twin: The Case of UNESCO GEOfood

    Organic food waste recycling is one of the final frontiers for a sustainable food lifecycle. Digital twins may help close the loop in more advanced food supply chains, but there is a lack of guidelines on how to adopt this emerging technology in community composting. This paper presents a digital twin-driven design to address this need in a UNESCO-protected region of geological relevance (a Geopark). The design science research project offers the foundations for creating intelligent composting networks supported by digital technologies. Six initial design principles are suggested for developing digital twins at a regional level. This study contributes to the deployment of layered digital twins in sustainable regional development. Moreover, our proposal supports the integration of local food producers into regional food supply chains and increases the impact of sustainability brands like GEOfood.