1,202 research outputs found

    Serverless Strategies and Tools in the Cloud Computing Continuum

    Full text link
    Thesis by compendium. In recent years, the popularity of Cloud computing has allowed users to access unprecedented compute, network, and storage resources under a pay-per-use model. This popularity has led to new services that solve specific large-scale computing challenges and simplify the development and deployment of applications. Among the most prominent services in recent years are FaaS (Function as a Service) platforms, whose primary appeal is the ease of deploying small pieces of code in certain programming languages to perform specific tasks in response to events.
These functions are executed on the Cloud provider's servers without users worrying about their maintenance or elasticity management, always keeping a fine-grained pay-per-use model. FaaS platforms belong to the computing paradigm known as Serverless, which aims to abstract server management away from users, allowing them to focus their efforts solely on application development. The problem with FaaS is that it focuses mainly on microservices and tends to have limitations regarding execution time and computing capabilities (e.g. lack of support for acceleration hardware such as GPUs). However, it has been demonstrated that the self-provisioning capability and high degree of parallelism of these services can suit a broader range of applications. In addition, their inherent event-driven triggering makes functions well suited to be defined as steps in file-processing workflows (e.g. scientific computing workflows). Furthermore, the rise of smart and embedded devices (IoT), innovations in communication networks, and the need to reduce latency in challenging use cases have led to the concept of Edge computing, which consists of processing data on devices close to the data sources to improve response times. The coupling of this paradigm with Cloud computing, involving architectures with devices at different levels depending on their proximity to the source and their compute capability, has been coined the Cloud Computing Continuum (or Computing Continuum). This PhD thesis therefore aims to apply different Serverless strategies to enable the deployment of generalist applications, packaged in software containers, across the different tiers of the Cloud Computing Continuum. To this end, multiple tools have been developed in order to: i) adapt FaaS services from public Cloud providers; ii) integrate different software components to define a Serverless platform on on-premises and Edge infrastructures; iii) leverage acceleration devices on Serverless platforms; and iv) facilitate the deployment of applications and workflows through user interfaces. Additionally, several use cases have been created and adapted to assess the developments achieved. Risco Gallardo, S. (2023). Serverless Strategies and Tools in the Cloud Computing Continuum [Doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/202013
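
As a rough illustration of the FaaS model the abstract describes, the following minimal Python sketch shows a small function that a platform could invoke in response to an object-storage event (e.g. a new file upload). The event layout and all names are assumptions for illustration only, not the tooling developed in the thesis.

```python
# Hypothetical sketch of a FaaS-style function: a small piece of code that the
# platform runs in response to an object-storage event (e.g. a new file upload),
# with no server management on the user's side. The bucket/key event layout is
# only an assumption for illustration.
import json


def handler(event, context=None):
    """Entry point invoked by the FaaS platform for each incoming event."""
    # Extract the storage location of the file that triggered the invocation.
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    # A real function would download the object, run one step of a
    # file-processing workflow (e.g. a scientific computation), and upload
    # the result; here we only report what would be processed.
    result = {"input": f"{bucket}/{key}", "status": "processed"}
    return {"statusCode": 200, "body": json.dumps(result)}


if __name__ == "__main__":
    # Local invocation with a synthetic event, emulating the platform trigger.
    sample_event = {"Records": [{"s3": {"bucket": {"name": "input-bucket"},
                                        "object": {"key": "data/sample.csv"}}}]}
    print(handler(sample_event))
```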

    Modern computing: Vision and challenges

    Get PDF
    Over the past six decades, the computing systems field has experienced significant transformations, profoundly impacting society through developments such as the Internet and the commodification of computing. Underpinned by technological advancements, computer systems, far from being static, have continuously evolved and adapted to cover multifaceted societal niches. This has led to new paradigms such as cloud, fog, and edge computing and the Internet of Things (IoT), which offer fresh economic and creative opportunities. Nevertheless, this rapid change poses complex research challenges, especially in maximizing potential and enhancing functionality. As such, to maintain an economical level of performance that meets ever-tighter requirements, one must understand the drivers of new model emergence and expansion, and how contemporary challenges differ from past ones. To that end, this article investigates and assesses the factors influencing the evolution of computing systems, covering established systems and architectures as well as newer developments, such as serverless computing, quantum computing, and on-device AI on edge devices. Trends emerge when one traces the technological trajectory, including the rapid obsolescence of frameworks due to business and technical constraints, a move towards specialized systems and models, and varying approaches to centralized and decentralized control. This comprehensive review of modern computing systems looks ahead to the future of research in the field, highlighting key challenges and emerging trends, and underscoring their importance in cost-effectively driving technological progress.

    Investigating the learning potential of the Second Quantum Revolution: development of an approach for secondary school students

    Get PDF
    In recent years we have witnessed important changes: the Second Quantum Revolution is in the spotlight of many countries, and it is creating a new generation of technologies. To unlock the potential of the Second Quantum Revolution, several countries have launched strategic plans and research programs that finance and set the pace of research and development of these new technologies (such as the Quantum Flagship and the National Quantum Initiative Act). The increasing pace of technological change also challenges science education and institutional systems, requiring them to help prepare new generations of experts. This work is placed within physics education research and contributes to the challenge by developing an approach and a course about the Second Quantum Revolution. The aims are to promote quantum literacy and, in particular, to highlight the cultural and educational value of the Second Quantum Revolution. The dissertation is articulated in two parts. In the first, we unpack the Second Quantum Revolution from a cultural perspective and shed light on the main revolutionary aspects, which are elevated to the rank of principles implemented in the design of a course for secondary school students and prospective and in-service teachers. The design process and the educational reconstruction of the activities are presented, as well as the results of a pilot study conducted to investigate the impact of the approach on students' understanding and to gather feedback to refine and improve the instructional materials. The second part consists of the exploration of the Second Quantum Revolution as a context to introduce some basic concepts of quantum physics. We present the results of an implementation with secondary school students to investigate whether, and to what extent, external representations can play a role in promoting students' understanding and acceptance of quantum physics as a personally reliable description of the world.

    AI: Limits and Prospects of Artificial Intelligence

    Get PDF
    The emergence of artificial intelligence has triggered enthusiasm and the promise of boundless opportunities as much as uncertainty about its limits. The contributions to this volume explore the limits of AI, describe the necessary conditions for its functionality, reveal its attendant technical and social problems, and present some existing and potential solutions. At the same time, the contributors highlight the societal and attendant economic hopes and fears, utopias and dystopias, that are associated with the current and future development of artificial intelligence.

    Enabling dynamic and intelligent workflows for HPC, data analytics, and AI convergence

    Get PDF
    The evolution of High-Performance Computing (HPC) platforms enables the design and execution of progressively larger and more complex workflow applications in these systems. The complexity comes not only from the number of elements that compose the workflows but also from the type of computations they perform. While traditional HPC workflows target simulations and modelling of physical phenomena, current needs additionally require data analytics (DA) and artificial intelligence (AI) tasks. However, the development of these workflows is hampered by the lack of proper programming models and environments that support the integration of HPC, DA, and AI, as well as the lack of tools to easily deploy and execute the workflows in HPC systems. To progress in this direction, this paper presents use cases where complex workflows are required and investigates the main issues to be addressed for the HPC/DA/AI convergence. Based on this study, the paper identifies the challenges of a new workflow platform to manage complex workflows. Finally, it proposes a development approach for such a workflow platform addressing these challenges in two directions: first, by defining a software stack that provides the functionalities to manage these complex workflows; and second, by proposing the HPC Workflow as a Service (HPCWaaS) paradigm, which leverages the software stack to facilitate the reusability of complex workflows in federated HPC infrastructures. Proposals presented in this work are subject to study and development as part of the EuroHPC eFlows4HPC project. This work has received funding from the European High-Performance Computing Joint Undertaking (JU) under grant agreement No 955558. The JU receives support from the European Union’s Horizon 2020 research and innovation programme and Spain, Germany, France, Italy, Poland, Switzerland and Norway. In Spain, it has received complementary funding from MCIN/AEI/10.13039/501100011033, Spain and the European Union NextGenerationEU/PRTR (contracts PCI2021-121957, PCI2021-121931, PCI2021-121944, and PCI2021-121927). In Germany, it has received complementary funding from the German Federal Ministry of Education and Research (contracts 16HPC016K, 6GPC016K, 16HPC017 and 16HPC018). In France, it has received financial support from Caisse des dépôts et consignations (CDC) under the action PIA ADEIP (project Calculateurs). In Italy, it has been preliminarily approved for complementary funding by Ministero dello Sviluppo Economico (MiSE) (ref. project prop. 2659). In Norway, it has received complementary funding from the Norwegian Research Council under project number 323825. In Switzerland, it has been preliminarily approved for complementary funding by the State Secretariat for Education, Research, and Innovation (SERI). In Poland, it is partially supported by the National Centre for Research and Development under decision DWM/EuroHPCJU/4/2021. The authors also acknowledge financial support by MCIN/AEI/10.13039/501100011033, Spain through the “Severo Ochoa Programme for Centres of Excellence in R&D” under Grant CEX2018-000797-S, the Spanish Government, Spain (contract PID2019-107255 GB) and by Generalitat de Catalunya, Spain (contract 2017-SGR-01414). Anna Queralt is a Serra Húnter Fellow. With funding from the Spanish government through the ‘Severo Ochoa Centre of Excellence’ accreditation (CEX2018-000797-S).
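
To make the idea of an HPC/DA/AI workflow concrete, the following hypothetical Python sketch chains a simulation step, a data-analytics step, and an AI step through a tiny task registry. The task names and the registry are illustrative assumptions only, not the eFlows4HPC software stack or the HPCWaaS interface.

```python
# Hypothetical sketch of an HPC/DA/AI workflow expressed as composable tasks.
# The step names and the tiny registry below are illustrative; a real platform
# would schedule the steps on federated HPC resources.
from typing import Callable, Dict, List

TASKS: Dict[str, Callable] = {}


def task(name: str):
    """Register a function as a named workflow step."""
    def wrap(fn: Callable) -> Callable:
        TASKS[name] = fn
        return fn
    return wrap


@task("simulate")
def simulate(n: int) -> List[float]:
    # Stand-in for an HPC simulation producing raw outputs.
    return [i * 0.5 for i in range(n)]


@task("analyse")
def analyse(samples: List[float]) -> Dict[str, float]:
    # Stand-in for a data-analytics step that reduces the raw data.
    return {"mean": sum(samples) / len(samples), "max": max(samples)}


@task("train")
def train(features: Dict[str, float]) -> str:
    # Stand-in for an AI step consuming the analytics results.
    return f"model trained on features {sorted(features)}"


def run_workflow(n: int) -> str:
    # A workflow is an ordered composition of the registered steps.
    data = TASKS["simulate"](n)
    stats = TASKS["analyse"](data)
    return TASKS["train"](stats)


if __name__ == "__main__":
    print(run_workflow(8))
```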

    The Impact of the Atlantic Meridional Overturning Circulation (AMOC) Variability on the Mediterranean Climate for the last six decades

    Get PDF
    The Atlantic Meridional Overturning Circulation (AMOC) is a major ocean circulation system in the Atlantic Ocean. The AMOC transports heat to the poles and cold saline waters to the tropics. This mechanism is responsible for many climate and weather systems, firstly by keeping temperatures mild even at high latitudes and secondly by providing heat and moisture to the air. The mechanism is naturally unstable and changes over time, depending on sea-ice melt, wind patterns, variations in solar radiation, and other factors. Anthropogenic climate change and global warming have altered the balance of the global system, accelerating change.
As models evolve and improve, recent research has indicated that the AMOC is in a weaker state than previously thought, with impacts more severe than expected. As the AMOC is a global circulation system, its impacts can reach remote systems many kilometres away. Given that the Mediterranean is a hot spot for climate change and the risks it faces can become severe, we focus on the relationship between two remote systems: the slowing AMOC and the Mediterranean climate. We use the ECMWF ERA5 reanalysis, one of the newest reanalysis datasets, to obtain monthly fields (sea surface temperature, surface air temperature, and precipitation) from 1959 to 2021 and correlate them with an AMOC fingerprint. We find that there is indeed a relationship, although it is not homogeneous and shows no evident, strict pattern. More specifically, there is a connection between the AMOC and Mediterranean sea surface and air temperatures with a time lag of up to two years, and a strong winter correlation with precipitation in the Mediterranean up to three years ahead. The results are encouraging. They suggest the possibility of improved forecasts in the region, as well as of predicting extreme events several months in advance.
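
The lagged-correlation analysis described above can be pictured with a minimal Python sketch. Synthetic monthly series stand in for the ERA5 fields and the AMOC fingerprint; the lag structure and numbers are illustrative assumptions, not the study's data.

```python
# Hypothetical sketch of a lagged-correlation analysis: correlate an AMOC
# index with a Mediterranean climate series at several time lags.
# Synthetic monthly data stand in for the ERA5 fields and the AMOC fingerprint.
import numpy as np


def lagged_correlations(index: np.ndarray, series: np.ndarray, max_lag_months: int):
    """Pearson correlation between index(t) and series(t + lag) for each lag."""
    results = {}
    for lag in range(max_lag_months + 1):
        x = index[: len(index) - lag] if lag else index
        y = series[lag:]
        results[lag] = float(np.corrcoef(x, y)[0, 1])
    return results


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    months = (2021 - 1959 + 1) * 12            # monthly values, 1959-2021
    amoc_index = rng.standard_normal(months)   # stand-in AMOC fingerprint

    # Synthetic "Mediterranean temperature" that lags the index by 24 months.
    med_temp = np.roll(amoc_index, 24) * 0.6 + rng.standard_normal(months) * 0.8

    corrs = lagged_correlations(amoc_index, med_temp, max_lag_months=36)
    best_lag = max(corrs, key=corrs.get)
    print(f"strongest correlation at lag {best_lag} months: {corrs[best_lag]:.2f}")
```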

    Synthetic Aperture Radar (SAR) Meets Deep Learning

    Get PDF
    This reprint focuses on the application of the combination of synthetic aperture radar and deep learning technology. It aims to further promote the development of SAR image intelligent interpretation technology. A synthetic aperture radar (SAR) is an important active microwave imaging sensor, whose all-day and all-weather working capacity gives it an important place in the remote sensing community. Since the United States launched the first SAR satellite, SAR has received much attention in the remote sensing community, e.g., in geological exploration, topographic mapping, disaster forecasting, and traffic monitoring. It is therefore valuable and meaningful to study SAR-based remote sensing applications. In recent years, deep learning, represented by convolutional neural networks, has driven significant progress in the computer vision community, e.g., in face recognition, autonomous driving, and the Internet of Things (IoT). Deep learning enables computational models with multiple processing layers to learn data representations at multiple levels of abstraction, which can greatly improve the performance of various applications. This reprint provides a platform for researchers to address the above challenges and present their innovative and cutting-edge research results when applying deep learning to SAR in various manuscript types, e.g., articles, letters, reviews and technical reports.
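
As a rough sketch of the kind of convolutional model the reprint targets, the following hypothetical PyTorch snippet defines a small classifier for single-channel SAR image patches. The architecture, sizes, and class count are illustrative assumptions only.

```python
# Hypothetical sketch of a small convolutional classifier for single-channel
# SAR image patches (e.g. target or scene classification). Architecture and
# sizes are illustrative, not taken from any contribution in the reprint.
import torch
import torch.nn as nn


class SmallSARNet(nn.Module):
    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # SAR patches: 1 channel
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                     # global pooling
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x).flatten(1)
        return self.classifier(x)


if __name__ == "__main__":
    model = SmallSARNet()
    patches = torch.randn(8, 1, 64, 64)  # a batch of synthetic 64x64 patches
    logits = model(patches)
    print(logits.shape)                  # torch.Size([8, 4])
```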

    The Power of Patents: Leveraging Text Mining and Social Network Analysis to Forecast IoT Trends

    Full text link
    Technology has become an indispensable competitive tool as science and technology have progressed throughout history, and organizations can compete on an equal footing by adopting technology appropriately. A technology's lifecycle begins with an initiation phase and, after entering the maturity phase, eventually reaches saturation; once a technology is saturated, it is removed or replaced by another, which makes further investment in it unjustifiable. Technology forecasting is therefore a critical tool for research and development to determine the future direction of technology. Based on registered patents, this study examined the trends of IoT technologies. A total of 3,697 patents related to the Internet of Things from the last six years of patenting were gathered using lens.org for this purpose. The key individuals and companies were identified by constructing the IoT patent-registration collaboration network, and the main groups active in patent registration were identified with a community detection technique. The patents were then divided into six technology categories: Safety and Security, Information Services, Public Safety and Environment Monitoring, Collaborative Aware Systems, Smart Homes/Buildings, and Smart Grid, and their technical maturity was identified and examined using the SigmaPlot software. Based on the findings, Information Services, Smart Homes/Buildings, and Smart Grid technologies are in the saturation stage, while Safety and Security, Public Safety and Environment Monitoring, and Collaborative Aware Systems are in the maturity stage.
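
A technology-maturity analysis of this kind typically fits an S-shaped (logistic) growth curve to cumulative patent counts and reads off how close a technology is to saturation. The following Python sketch shows such a fit on synthetic yearly counts; the data and fitted parameters are illustrative assumptions, not the study's results.

```python
# Hypothetical sketch of a technology-lifecycle (S-curve) fit: fit a logistic
# growth model to cumulative patent counts and estimate how close the
# technology is to saturation. The yearly counts below are synthetic.
import numpy as np
from scipy.optimize import curve_fit


def logistic(t, L, k, t0):
    """Cumulative patents: L = saturation level, k = growth rate, t0 = midpoint."""
    return L / (1.0 + np.exp(-k * (t - t0)))


years = np.arange(2017, 2023)
cumulative_patents = np.array([220, 540, 1100, 2050, 2900, 3400], dtype=float)

# Fit the S-curve; p0 gives rough starting guesses for the optimizer.
params, _ = curve_fit(logistic, years, cumulative_patents,
                      p0=[4000.0, 1.0, 2020.0], maxfev=10000)
L, k, t0 = params

latest_fraction = logistic(years[-1], L, k, t0) / L
print(f"estimated saturation level: {L:.0f} patents")
print(f"fraction of saturation reached by {years[-1]}: {latest_fraction:.0%}")
```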

    Digital agriculture: research, development and innovation in production chains.

    Get PDF
    Digital transformation in the field towards sustainable and smart agriculture. Digital agriculture: definitions and technologies. Agroenvironmental modeling and the digital transformation of agriculture. Geotechnologies in digital agriculture. Scientific computing in agriculture. Computer vision applied to agriculture. Technologies developed in precision agriculture. Information engineering: contributions to digital agriculture. DIPN: a dictionary of the internal proteins nanoenvironments and their potential for transformation into agricultural assets. Applications of bioinformatics in agriculture. Genomics applied to climate change: biotechnology for digital agriculture. Innovation ecosystem in agriculture: Embrapa's evolution and contributions. The law related to the digitization of agriculture. Innovating communication in the age of digital agriculture. Driving forces for Brazilian agriculture in the next decade: implications for digital agriculture. Challenges, trends and opportunities in digital agriculture in Brazil.