80 research outputs found

    Microservice API Evolution in Practice: A Study on Strategies and Challenges

    Nowadays, many companies design and develop their software systems as a set of loosely coupled microservices that communicate via their Application Programming Interfaces (APIs). While the loose coupling improves maintainability, scalability, and fault tolerance, it poses new challenges to the API evolution process. Related works identified communication and integration as major API evolution challenges but did not provide the underlying reasons and research directions to mitigate them. In this paper, we aim to identify microservice API evolution strategies and challenges in practice and gain a broader perspective of their relationships. We conducted 17 semi-structured interviews with developers, architects, and managers in 11 companies and analyzed the interviews with open coding used in grounded theory. In total, we identified six strategies and six challenges for REpresentational State Transfer (REST) and event-driven communication via message brokers. The strategies mainly focus on API backward compatibility, versioning, and close collaboration between teams. The challenges include change impact analysis efforts, ineffective communication of changes, and consumer reliance on outdated versions, leading to API design degradation. We defined two important problems in microservice API evolution resulting from the challenges and their coping strategies: tight organizational coupling and consumer lock-in. To mitigate these two problems, we propose automating the change impact analysis and investigating effective communication of changes as open research directions.
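
    The backward-compatibility strategy described above can be sketched with a minimal tolerant-reader consumer in Python (the payload shape and field names are hypothetical, not taken from the study):

```python
# Tolerant-reader consumer: it survives additive, backward-compatible API
# changes by reading only the fields it needs, defaulting fields that newer
# API versions added, and ignoring unknown keys entirely.

def parse_order(payload: dict) -> dict:
    """Extract only the required fields from an API response."""
    return {
        "id": payload["id"],                         # present in every version
        "status": payload.get("status", "unknown"),  # added in a later version
    }

# A v1 response (no "status") and a v2 response both parse without breaking:
assert parse_order({"id": 1}) == {"id": 1, "status": "unknown"}
assert parse_order({"id": 2, "status": "shipped", "extra": True})["status"] == "shipped"
```

    This consumer-side discipline is one half of the strategy; the provider-side half is to make only additive changes within a major API version.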

    Evaluation of Arrowhead Framework in Condition Monitoring Application

    The technological advancement in the fields of electronics and information technology is changing how industrial automation systems are built. This phenomenon is commonly referred to as the fourth industrial revolution. However, before this vision can be realized, new architectural solutions are needed to fully leverage the capabilities brought by cheaper sensors, more advanced communication technology, and more powerful processing units. The Arrowhead Framework tackles this problem by providing means for service-oriented architecture via a System-of-Systems approach, in which so-called application systems consume services provided by so-called core systems, which offer service discovery, service registration, and service authorization. The goal of the thesis was to evaluate the Arrowhead Framework by developing a demo application on the edge-cloud setup used in the condition monitoring system of vibrating screens manufactured by Metso. The demo application's objective was to ease the configuration and installation of industrial Linux PCs at the edge of the network. The methodological model for the evaluation was based on the design science research process (DSRP), which provides a model for research on IT artefacts. As a result, the Arrowhead Framework's core features were found helpful in the problem domain and suitable for a small-scale test setup. However, the implementation of the framework was found to be of low quality and lacking features expected of a production-ready software artefact. The shortcomings found were reported as feedback for the ongoing development process of the framework.
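
    The division of labour between application systems and core systems can be illustrated with a toy in-memory registry; the class and method names below are invented for illustration and do not reflect the Arrowhead Framework's real API:

```python
# Toy registry sketching the roles of Arrowhead's core systems: service
# registration, service discovery (lookup), and service authorization.

class ServiceRegistry:
    def __init__(self):
        self._services = {}       # service name -> endpoint
        self._authorized = set()  # allowed (consumer, service) pairs

    def register(self, name: str, endpoint: str) -> None:
        """Service registration: a provider announces its endpoint."""
        self._services[name] = endpoint

    def authorize(self, consumer: str, name: str) -> None:
        """Service authorization: allow a consumer to use a service."""
        self._authorized.add((consumer, name))

    def lookup(self, consumer: str, name: str) -> str:
        """Service discovery: resolve a name, enforcing authorization."""
        if (consumer, name) not in self._authorized:
            raise PermissionError(f"{consumer} may not consume {name}")
        return self._services[name]

registry = ServiceRegistry()
registry.register("vibration-monitor", "http://edge-pc-01:8080")
registry.authorize("dashboard", "vibration-monitor")
assert registry.lookup("dashboard", "vibration-monitor") == "http://edge-pc-01:8080"
```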

    GA4GH: International policies and standards for data sharing across genomic research and healthcare.

    The Global Alliance for Genomics and Health (GA4GH) aims to accelerate biomedical advances by enabling the responsible sharing of clinical and genomic data through both harmonized data aggregation and federated approaches. The decreasing cost of genomic sequencing (along with other genome-wide molecular assays) and increasing evidence of its clinical utility will soon drive the generation of sequence data from tens of millions of humans, with increasing levels of diversity. In this perspective, we present the GA4GH strategies for addressing the major challenges of this data revolution. We describe the GA4GH organization, which is fueled by the development efforts of eight Work Streams and informed by the needs of 24 Driver Projects and other key stakeholders. We present the GA4GH suite of secure, interoperable technical standards and policy frameworks and review the current status of standards, their relevance to key domains of research and clinical care, and future plans of GA4GH. Broad international participation in building, adopting, and deploying GA4GH standards and frameworks will catalyze an unprecedented effort in data sharing that will be critical to advancing genomic medicine and ensuring that all populations can access its benefits.

    Develop a generic Rules Engine to quality control a CV database

    This bachelor’s thesis presents a software solution to enhance Bouvet’s quality control process for employee CVs. By implementing a generic rule engine with extended functionalities, we identified that 90% of the CVs at Bouvet did not meet the company’s business standards. Using Scrum with Extreme Programming as our project management system, we developed a scalable and maintainable pilot employing Microservices, Event-Driven, and Command and Query Responsibility Segregation architecture. Our pilot allows for future modifications using create, read, update, and delete operations. The software solution presented in this thesis can be extended to a production-ready state by implementing Role-based access control and an API Gateway. When the event bus project by another group at Bouvet is completed, our implementation will be able to notify employees about their CVs’ status, further improving the quality control process. Overall, our results demonstrate the effectiveness of our software solution and project management system in enhancing the quality control of employee CVs at Bouvet.
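
    A generic rule engine of the kind described can be sketched as a list of named predicates over a CV record; the field names and rules below are hypothetical, not Bouvet's actual business standards:

```python
# Minimal generic rule engine: each rule is a (name, predicate) pair over a
# CV record, and checking a CV returns the names of all violated rules.
# New rules can be added without touching the engine itself.

RULES = [
    ("has_summary", lambda cv: bool(cv.get("summary"))),
    ("has_recent_project", lambda cv: any(p.get("year", 0) >= 2020
                                          for p in cv.get("projects", []))),
]

def check_cv(cv: dict) -> list[str]:
    """Return the names of all rules the CV violates."""
    return [name for name, rule in RULES if not rule(cv)]

cv = {"summary": "Backend developer", "projects": [{"year": 2018}]}
assert check_cv(cv) == ["has_recent_project"]
```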

    FuzzTheREST - Intelligent Automated Blackbox RESTful API Fuzzer

    In recent years, the pervasive influence of technology has deeply intertwined with human life, impacting diverse fields. This relationship has evolved into a dependency, with software systems playing a pivotal role, necessitating a high level of trust. Today, a substantial portion of software is accessed through Application Programming Interfaces, particularly web APIs, which predominantly adhere to the Representational State Transfer (REST) architecture. However, this architectural choice introduces a wide range of potential vulnerabilities, which are exposed and accessible at the network level. The significance of software testing becomes evident when considering the widespread use of software in daily tasks that affect personal safety and security, making the identification and assessment of faulty software of paramount importance. In this thesis, FuzzTheREST, a black-box RESTful API fuzz testing framework, is introduced with the primary aim of addressing the challenges of understanding the context of each system under test and conducting comprehensive automated testing with diverse inputs. Operating from a black-box perspective, the fuzzer leverages Reinforcement Learning to efficiently uncover vulnerabilities in RESTful APIs by optimizing input values and combinations, relying on mutation methods for input exploration. The system's value is further enhanced by providing the user with a thoroughly documented vulnerability discovery process. The proposal stands out for its emphasis on explainability and its application of RL to learn the context of each API, eliminating the need for source code knowledge and expediting the testing process. The developed solution adheres to software engineering best practices and incorporates a novel Reinforcement Learning algorithm, comprising a customized environment for API fuzz testing and a multi-table Q-Learning agent.
    The quality and applicability of the tool are assessed based on the results achieved in two case studies, involving the Petstore API and an Emotion Detection module that was part of the CyberFactory#1 European research project. The results demonstrate the tool's effectiveness in discovering vulnerabilities: it found 7 different vulnerabilities, and the agents were able to learn different API contexts from API responses while maintaining reasonable code coverage levels.
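
    The tabular Q-Learning idea behind such a fuzzer can be sketched as follows; the states, actions, reward scheme, and hyperparameters are invented for illustration and do not reproduce FuzzTheREST's actual design:

```python
import random
from collections import defaultdict

# Sketch of tabular Q-learning for fuzzing: the agent learns, per API state
# (here simply the endpoint), which input mutation tends to trigger faults.

ACTIONS = ["flip_type", "boundary_value", "drop_field", "random_string"]
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # learning rate, discount, exploration

# One Q-table row per state, mapping each mutation action to its value.
q_table = defaultdict(lambda: {a: 0.0 for a in ACTIONS})

def choose_action(state: str) -> str:
    """Epsilon-greedy action selection over the state's Q-values."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)                    # explore
    return max(q_table[state], key=q_table[state].get)   # exploit

def update(state: str, action: str, reward: float, next_state: str) -> None:
    """Standard Q-learning update toward reward plus discounted best next value."""
    best_next = max(q_table[next_state].values())
    q_table[state][action] += ALPHA * (
        reward + GAMMA * best_next - q_table[state][action])

# Example: a 500 response to a mutated request is rewarded as a likely fault.
update("POST /pets", "drop_field", reward=1.0, next_state="POST /pets")
assert q_table["POST /pets"]["drop_field"] > 0.0
```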

    Reactive Microservices - An Experiment

    Microservices are generally adopted when the scalability and flexibility of an application are essential to its success. Despite this, dependencies between services communicating over synchronous protocols mean that a single failure can affect multiple microservices. Adopting responsiveness in a microservices-based architecture, through reactivity, can contain and minimize the proliferation of errors between services and in the communication between them by prioritizing the responsiveness and resilience of each service. This dissertation provides an overview of the state of the art of reactive microservices, structured through a systematic mapping process, in which their most important quality attributes, common pitfalls, suitable evaluation metrics, and most relevant frameworks are analysed. With the gathered information, the value of this work is presented: the project and framework decisions are made through the Technique for Order Preference by Similarity to Ideal Solution and the Analytic Hierarchy Process, respectively. Then, the analysis and design of the solution are laid out for the chosen project, highlighting the architectural changes necessary to convert it into a reactive microservices project. Next, the solution implementation is described, starting with the project setup needed to speed up the development process, followed by the key implementation details employed to ensure reactivity and how the framework streamlines their implementation, finalized by the setup of metrics tools in the project to support the testing and evaluation of the solution. Finally, the solution validation is planned and executed based on the Goals, Questions, Metrics (GQM) approach to structure its analysis regarding maintainability, scalability, performance, testability, availability, monitorability, and security. The dissertation closes with a conclusion of the overall work, listing contributions, threats to validity, and possible future work.

    Knowledge-driven architecture composition

    Service interoperability for embedded devices is a mandatory feature for dynamically changing Internet-of-Things and Industry 4.0 software platforms. Service interoperability is achieved on a technical, syntactic, and semantic level. If service interoperability is achieved on all layers, plug-and-play functionality known from USB storage sticks or printer drivers becomes feasible. As a result, micro batch size production, individualized automation solutions, or job-order production become affordable. However, interoperability at the semantic layer is still a problem for the maturing class of IoT systems. Current solutions to achieve semantic integration of IoT devices' heterogeneous services include standards, machine-understandable service descriptions, and the implementation of software adapters. Standardization bodies such as the VDMA tackle the problem by providing a reference software architecture and an information meta model for building up domain standards. For instance, the universal machine technology interface (UMATI) facilitates the data exchange between machines, components, installations, and their integration into a customer- and user-specific IT ecosystem for mechanical engineering and plant construction worldwide. Automated component integration approaches fill the gap of software interfaces that are not relying on a global standard. These approaches translate required into provided software interfaces based on the needed architectural styles (e.g., client-server, layered, publish-subscribe, or cloud-based) using additional component descriptions. Interoperability at the semantic layer is achieved by relying on a shared domain vocabulary (e.g., an ontology) and service description (e.g., SAWSDL) used by all devices involved. If these service descriptions are available and machine-understandable knowledge of how to integrate software components on the functional and behavioral level is available, plug-and-play scenarios are feasible.
    Both standards and formal service descriptions cannot be applied effectively to IoT systems, as they rely on the assumption that the semantic domain is completely known when they are written down. This assumption is hard to sustain given the growing number of independently developed and connected IoT devices (30.73 billion in 2020 and a projected 75.44 billion in 2025). If standards are applied in IoT systems, they must be updated continuously so that they contain the most recent domain knowledge agreed upon centrally and ahead of application. Although formal descriptions of concrete integration contexts can happen in a decentralized manner, they still rely on the assumption that the knowledge once written down is complete. Hence, if an interoperable service from a new device becomes available that was not considered in the initial integration context, the formal descriptions must be updated continuously. Both the formalization effort and keeping standards up to date result in too much additional engineering effort. Consequently, practitioners rely on implementing software adapters manually. However, this manual approach hardly scales with the increasing number of IoT devices. In this work, we introduce a novel engineering method that explicitly allows for an incomplete semantic domain description without losing the ability for automated IoT system integration. Dropping the completeness claim requires the management of incomplete integration knowledge. By sharing integration knowledge centrally, we assist the system integrator in automating software adapter generation. In addition to existing approaches, we enable semantic integration for services by making integration knowledge reusable. In an empirical study with students, we show that integration effort can be lowered in a home automation context.
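
    A software adapter of the kind the method aims to generate automatically translates a required interface into a provided one. A minimal hand-written example, with both interfaces invented for illustration:

```python
# Hand-written adapter: the consumer requires temperatures in degrees
# Celsius, while the device provides tenths of degrees Fahrenheit. The
# adapter bridges the two interfaces without changing either side.

class LegacyThermometer:
    """Provided interface: temperature in tenths of degrees Fahrenheit."""
    def read_tenths_f(self) -> int:
        return 685  # i.e., 68.5 degrees Fahrenheit

class CelsiusAdapter:
    """Required interface: temperature in degrees Celsius."""
    def __init__(self, device: LegacyThermometer):
        self._device = device

    def read_celsius(self) -> float:
        fahrenheit = self._device.read_tenths_f() / 10
        return round((fahrenheit - 32) * 5 / 9, 2)

assert CelsiusAdapter(LegacyThermometer()).read_celsius() == 20.28
```

    Writing such adapters by hand is exactly the step that, as the abstract argues, does not scale with the number of IoT devices.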

    Microservice architecture patterns with GraphQL

    In this thesis, the GraphQL query language was studied in the context of the microservice architecture. The thesis combined a literature study with expert interviews. The most prevalent microservice architecture patterns were gathered from peer-reviewed studies and grey literature. Four expert interviews were held to determine which patterns were used in projects that used a microservice architecture and GraphQL APIs. Based on the interviews, a typical GraphQL microservice architecture consists of up to 10 REST or GraphQL microservices exposed to public networks through a GraphQL API gateway. The studied microservice architectures typically had a flat structure in which a GraphQL API gateway exposed the other microservices in a protected network, which used a persistence layer to store data. Most of the microservice architectures seemed to rely on synchronous communication patterns, using platform-provided service discovery mechanisms to implement load balancing. The most prominent authorization mechanism was to pass web tokens via an authorization header. An aggregating GraphQL service seemed to work well as an API gateway layer that exposes the functionality of underlying microservices to clients. A pattern called schema stitching was successfully used in some of the interviewed projects to automatically combine multiple microservice GraphQL schemas. Overall, the interviewed developers had a positive experience using GraphQL in their projects. The API exploration tools and built-in documentation enabled by GraphQL's schema introspection functionality made client application development easier.
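
    Schema stitching, as mentioned above, combines several microservice schemas into one gateway schema. A deliberately simplified sketch, with invented schemas represented as plain dicts rather than real GraphQL SDL:

```python
# Simplified schema stitching: the gateway merges the root Query fields of
# several microservice schemas into one combined schema. Real stitching
# tools also merge types and delegate resolvers to the owning service.

users_schema = {"Query": {"user": "User", "users": "[User]"}}
orders_schema = {"Query": {"order": "Order", "orders": "[Order]"}}

def stitch(*schemas: dict) -> dict:
    """Merge root Query fields, rejecting name collisions between services."""
    combined: dict = {"Query": {}}
    for schema in schemas:
        for field, field_type in schema["Query"].items():
            if field in combined["Query"]:
                raise ValueError(f"conflicting field: {field}")
            combined["Query"][field] = field_type
    return combined

gateway_schema = stitch(users_schema, orders_schema)
assert set(gateway_schema["Query"]) == {"user", "users", "order", "orders"}
```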

    GraphQL API generating service for Django

    This bachelor’s thesis describes the development of a module for generating a GraphQL API for the Django web framework. The main part of the thesis is concerned with furthering the development of the Simple API module, which aims to facilitate fast creation of API endpoints from existing Django definitions. The thesis contains an analysis of existing modules for API creation in Django, covering not just GraphQL-based modules but also those generating REST-style APIs. Based on an analysis of the current state of Simple API development, new additions to Simple API are designed and then implemented. Beyond functional additions, the development also covers the design and implementation of security features necessary to prevent Denial of Service (DoS) and batching attacks through complex queries. Furthermore, a web application for testing the generated API is developed as part of the thesis. The result of this thesis is the expanded Simple API module and the Simple API Admin web application for exploring and testing the generated endpoint.
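
    One common guard against DoS through complex queries, as discussed above, is limiting query depth. A minimal sketch, with the query represented as a nested dict instead of a real GraphQL AST:

```python
# Depth limiting as a DoS defence: reject any query whose selection tree
# is nested deeper than a configured maximum. A production server would
# walk the parsed GraphQL AST; nested dicts stand in for it here.

MAX_DEPTH = 5

def depth(selection: dict) -> int:
    """Depth of a selection tree; a leaf field maps to an empty dict."""
    if not selection:
        return 0
    return 1 + max(depth(child) for child in selection.values())

def validate(query: dict) -> None:
    """Raise before execution if the query is too deeply nested."""
    if depth(query) > MAX_DEPTH:
        raise ValueError(f"query depth {depth(query)} exceeds {MAX_DEPTH}")

# Depth 4, so this query is accepted; a friends-of-friends chain six
# levels deep would be rejected before any resolver runs.
validate({"user": {"friends": {"friends": {"name": {}}}}})
```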