
    Application for managing container-based software development environments

    Virtualizing the software development process can enhance efficiency through unified, remotely managed environments. Docker containers, a popular technology in software development, are widely used for application testing and deployment. This thesis examines the use of containers as cloud-based development environments: it first explores the history and implementation of container-based virtualization, then presents containers as a novel cloud-based software development environment. Virtual containers, like virtual machines, have been used extensively in software development for code testing, but not as development environments; they are also prevalent in the final stages of software production, specifically in the distribution and deployment of completed applications. In the practical part of the thesis, an application was implemented to improve the usability of a container-based development environment, addressing challenges in adopting new work environments. The work was conducted for a private company, with input from multiple experts. The management application enhanced the container-based development environment's efficiency by improving user rights management, virtual container management, and the user interface. Additionally, the new management tools reduced training time for new employees by 50%, facilitating their integration into the organization. Container-based development environments with efficient management tools provide a secure, efficient, and unified platform for large-scale software development. Virtual containers also hold potential for future improvements in energy-saving strategies and in harmonizing and integrating organizational work methods.
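The management concerns described above (user-rights management and virtual container lifecycle control) can be sketched as a minimal, illustrative model. Every name below is hypothetical, and a real implementation would drive an actual container runtime rather than in-memory state:

```python
from dataclasses import dataclass

# Illustrative model of the two management concerns the thesis names:
# per-user access rights and container lifecycle control.

@dataclass
class DevEnvironment:
    name: str
    image: str
    running: bool = False

class EnvironmentManager:
    def __init__(self):
        self._envs = {}    # environment name -> DevEnvironment
        self._grants = {}  # user -> set of environment names

    def create(self, name, image):
        self._envs[name] = DevEnvironment(name, image)

    def grant(self, user, name):
        # Record that `user` may manage environment `name`.
        self._grants.setdefault(user, set()).add(name)

    def start(self, user, name):
        # Enforce user rights before touching the environment.
        if name not in self._grants.get(user, set()):
            raise PermissionError(f"{user} may not start {name}")
        self._envs[name].running = True
        return self._envs[name]
```

A management layer like this centralizes the policy check in one place, which is one plausible way the thesis's tool could have unified rights and lifecycle management behind a single interface.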

    Big Data and Large-scale Data Analytics: Efficiency of Sustainable Scalability and Security of Centralized Clouds and Edge Deployment Architectures

    One of the significant shifts in next-generation computing technologies will certainly be the development of Big Data (BD) deployment architectures. Apache Hadoop, the BD landmark, has evolved into a widely deployed BD operating system. Its new features include a federation structure and many associated frameworks, which give Hadoop 3.x the maturity to serve different markets. This dissertation addresses two leading issues in exploiting BD and large-scale data analytics on the Hadoop platform: (i) scalability, which directly affects system performance and overall throughput, addressed using portable Docker containers; and (ii) security, which spreads the adoption of data-protection practices among practitioners, addressed using access controls. An Enhanced MapReduce Environment (EME), the OPportunistic and Elastic Resource Allocation (OPERA) scheduler, a BD Federation Access Broker (BDFAB), and a Secure Intelligent Transportation System (SITS) with a multi-tier architecture for streaming data to the cloud are the main contributions of this thesis.
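The opportunistic-and-elastic allocation idea behind a scheduler like OPERA can be illustrated with a toy policy in which guaranteed jobs may preempt opportunistic ones to reclaim capacity. The class and policy below are assumptions for illustration, not the dissertation's actual scheduler:

```python
# Toy opportunistic scheduler: guaranteed jobs reserve capacity;
# opportunistic jobs borrow leftover slots and are preempted first.

class OpportunisticScheduler:
    def __init__(self, capacity):
        self.capacity = capacity
        self.running = []  # (job, slots, is_opportunistic)

    def used(self):
        return sum(slots for _, slots, _ in self.running)

    def submit(self, job, slots, opportunistic=False):
        # Guaranteed jobs evict opportunistic work until they fit.
        while not opportunistic and self.used() + slots > self.capacity:
            victims = [r for r in self.running if r[2]]
            if not victims:
                return False  # cluster genuinely full of guaranteed work
            self.running.remove(victims[0])
        if self.used() + slots > self.capacity:
            return False      # opportunistic job must not displace anyone
        self.running.append((job, slots, opportunistic))
        return True
```

The asymmetry (guaranteed work preempts, opportunistic work never does) is what lets a cluster stay busy with scavenger jobs without hurting throughput guarantees.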

    Paving the path towards platform engineering using a comprehensive reference model

    Amidst the growing popularity of platform engineering, which promises improved productivity and an enhanced developer experience through an internal developer platform (IDP), this research addresses the prevalent lack of a shared understanding in the field and the complications in defining effective, customized strategies. It introduces a definitive Platform Engineering Reference Model (PE-RM), based on the Open Distributed Processing reference model (ODP-RM) framework, to provide a common understanding. This model offers a structured framework for software organizations to create tailored platform engineering strategies and realize the full potential of platform engineering. The reference model is validated through a case study in which a contextual design and a technical implementation guided by the reference model are proposed. The case study offers guidance in designing platform engineering in the context of a software organization. Furthermore, it showcases how to construct a technical platform engineering implementation, including experiments that expose the productivity improvements and applicability of the implementation. By facilitating a shared vocabulary and providing a roadmap for implementation, this research aims to mitigate prevailing complexities and accelerate the adoption and effectiveness of platform engineering across organizations.

    Measuring Success for a Future Vision: Defining Impact in Science Gateways/Virtual Research Environments

    Scholars worldwide leverage science gateways/VREs for a wide variety of research and education endeavors spanning diverse scientific fields. Evaluating the value of a given science gateway/VRE to its constituent community is critical to obtaining the financial and human resources necessary to sustain operations and increase adoption in the user community. In this paper, we feature a variety of exemplar science gateways/VREs and detail how they define impact in terms of, e.g., their purpose, operating principles, and the size of their user base. Further, the exemplars recognize that their science gateways/VREs will continuously evolve with technological advancements and standards in cloud computing platforms, web service architectures, data management tools, and cybersecurity. Correspondingly, we present a number of technology advances that could be incorporated into next-generation science gateways/VREs to enhance the scope and scale of their operations for greater success/impact. The exemplars are selected from owners of science gateways in the Science Gateways Community Institute (SGCI) clientele in the United States, and from owners of VREs in the International Virtual Research Environment Interest Group (VRE-IG) of the Research Data Alliance. Thus, community-driven best practices and technology advances are compiled from diverse expert groups with an international perspective to envisage futuristic science gateway/VRE innovations.

    On-premise containerized, light-weight software solutions for Biomedicine

    Bioinformatics software systems are critical tools for analysing large-scale biological data, but their design and implementation can be challenging due to the need for reliability, scalability, and performance. This thesis investigates the impact of several software approaches on the design and implementation of bioinformatics software systems. These approaches include software patterns, microservices, distributed computing, containerisation, and container orchestration. The research focuses on understanding how these techniques affect the reliability, scalability, performance, and efficiency of bioinformatics software systems, and it highlights the challenges and considerations involved in their implementation. This study also examines potential solutions for implementing container orchestration in bioinformatics research teams with limited resources and the challenges of using container orchestration. Additionally, the thesis considers microservices and distributed computing and how these can be optimised in the design and implementation process to enhance the productivity and performance of bioinformatics software systems. The research was conducted using a combination of software development, experimentation, and evaluation. The results show that implementing software patterns can significantly improve the code accessibility and structure of bioinformatics software systems. Microservices and containerisation, in particular, enhanced system reliability, scalability, and performance. Additionally, the study indicates that adopting advanced software engineering practices, such as model-driven design and container orchestration, can facilitate efficient and productive deployment and management of bioinformatics software systems, even for researchers with limited resources. Overall, we developed a software system integrating all our findings; the proposed system demonstrated its ability to address challenges in bioinformatics.
The thesis makes several key contributions in addressing the research questions surrounding the design, implementation, and optimisation of bioinformatics software systems using software patterns, microservices, containerisation, and advanced software engineering principles and practices. Our findings suggest that incorporating these technologies can significantly improve the reliability, scalability, performance, efficiency, and productivity of bioinformatics software systems.
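As an illustration of how software patterns keep bioinformatics code composable and testable, the sketch below chains small, independent stages in a simple pipeline pattern. The stages and names are invented for the example, not taken from the thesis:

```python
# Pipeline pattern: each stage is a small, independently testable unit,
# in the spirit of the thesis's use of software patterns for structure.

def gc_content(seq):
    # Fraction of G/C bases in a nucleotide sequence.
    return (seq.count("G") + seq.count("C")) / len(seq)

class Pipeline:
    def __init__(self, *stages):
        self.stages = stages

    def run(self, data):
        # Feed each stage's output into the next stage.
        for stage in self.stages:
            data = stage(data)
        return data

clean = lambda s: s.strip().upper()
pipeline = Pipeline(clean, gc_content)
```

In a microservice deployment, each stage would typically become its own containerized service, but the composition logic stays the same.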

    Big data reference architecture for industry 4.0: including economic and ethical Implications

    The rapid progress in Industry 4.0 is achieved through innovations in several fields, e.g., manufacturing, big data, and artificial intelligence. The thesis motivates the need for a Big Data architecture to apply artificial intelligence in Industry 4.0 and presents a cognitive architecture for artificial intelligence (CAAI) as a possible solution, which is especially suited to the challenges of small and medium-sized enterprises. The work examines the economic and ethical implications of those technologies and highlights the benefits, but also the challenges, for countries, companies, and individual workers. The "Industry 4.0 Questionnaire for SMEs" was conducted to gain insights into small and medium-sized companies' requirements and needs. Thus, the new CAAI architecture presents a software design blueprint and provides a set of open-source building blocks to support companies during implementation. Different use cases demonstrate the applicability of the architecture, and the subsequent evaluation verifies its functionality.

    A Cognitive Routing framework for Self-Organised Knowledge Defined Networks

    This study investigates the applicability of machine learning methods to routing protocols for achieving rapid convergence in self-organized knowledge-defined networks. The research explores the constituents of the Self-Organized Networking (SON) paradigm for 5G and beyond, aiming to design a routing protocol that complies with the SON requirements. It also exploits a contemporary discipline called Knowledge-Defined Networking (KDN) to extend routing capability by calculating the "most reliable" path rather than the shortest one. The research identifies the potential key areas and techniques for meeting these objectives by surveying the state of the art of the relevant fields, such as QoS-aware routing, hybrid SDN architectures, intelligent routing models, and service migration techniques. The design phase focuses primarily on the mathematical modelling of the routing problem and approaches the solution by optimizing at the structural level. The work contributes the Stochastic Temporal Edge Normalization (STEN) technique, which fuses link and node utilization for cost calculation; MRoute, a hybrid routing algorithm for SDN that leverages STEN to provide constant-time convergence; and Most Reliable Route First (MRRF), which uses a Recurrent Neural Network (RNN) to approximate route reliability as its metric. Additionally, the research outcomes include a cross-platform SDN integration framework (SDN-SIM) and a secure migration technique for containerized services in a Multi-access Edge Computing environment using Distributed Ledger Technology. Future work targets the development of 6G standards and compliance with Industry 5.0, enhancing the present outcomes in light of Deep Reinforcement Learning and Quantum Computing.
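The idea of fusing link and node utilization into a single routing cost, as STEN does, can be sketched as follows. The weighting scheme and the Dijkstra-based path selection below are illustrative assumptions, not the thesis's actual formulation:

```python
import heapq

def fused_cost(link_util, node_util_u, node_util_v, alpha=0.5):
    # Blend link utilization with the mean utilization of the
    # edge's endpoint nodes; alpha is an assumed weighting.
    node_part = (node_util_u + node_util_v) / 2
    return alpha * link_util + (1 - alpha) * node_part

def cheapest_path(graph, node_util, src, dst):
    # graph: {u: {v: link_utilization}}; Dijkstra over fused costs.
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, lu in graph.get(u, {}).items():
            nd = d + fused_cost(lu, node_util[u], node_util[v])
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")
```

Routing on a fused metric like this steers traffic away from paths whose links are idle but whose switches are overloaded, which a pure link-utilization metric would miss.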