
    Creation of a Cloud-Native Application: Building and operating applications that utilize the benefits of the cloud computing distribution approach

    Dissertation presented as the partial requirement for obtaining a Master's degree in Information Management, specialization in Information Systems and Technologies Management. VMware is a world-renowned company in the field of cloud infrastructure and digital workspace technology that supports organizations in their digital transformations. VMware accelerates digital transformation for evolving IT environments by empowering clients to adopt a software-defined strategy towards their business and information technology. Previously present in the private cloud segment, the company has recently focused on developing offerings related to the public cloud. Understanding how to design cloud-compatible systems has become increasingly important. Cloud computing is rapidly evolving from a specialized technology favored by tech-savvy companies and startups into the cornerstone on which enterprise systems are built for future growth. To stay competitive in the current market, both large and small organizations are adopting cloud architectures and methodologies. As a member of the technical pre-sales team, I had as the main goal of my internship the design, development, and deployment of a cloud-native application, which is therefore the subject of this internship report. The application interfaces with an existing one and demonstrates possible uses of VMware's virtualization infrastructure and automation offerings. Since its official release, the application has already been presented to various existing and prospective customers and at conferences. The purpose of this work is to provide a permanent record of my internship experience at VMware. Through this undertaking, I am able to reflect on the professional facets of my internship and the competencies I gained along the way. This work is a descriptive and theoretical reflection, methodologically oriented towards the development of a cloud-native application in the context of my internship in the system engineering team at VMware. The scientific content of the report focuses on the benefits, including but not limited to scalability and maintainability, of moving from a monolithic architecture to microservices.
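
    The monolith-to-microservices benefit mentioned above can be made concrete with a minimal, generic sketch (not taken from the dissertation): a cloud-native application is decomposed into small services that each expose an HTTP API and can be deployed, probed, and scaled independently. The service name, endpoint, and port below are illustrative assumptions.

```python
# Minimal, self-contained sketch (not from the dissertation) of one
# independently deployable service: it exposes a health endpoint so a
# cloud platform can probe and scale it on its own.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class CatalogService(BaseHTTPRequestHandler):
    """Hypothetical 'catalog' microservice extracted from a monolith."""

    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # Each service runs in its own process/container and scales independently.
    HTTPServer(("0.0.0.0", 8080), CatalogService).serve_forever()
```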

    Technical Challenges of Microservices Migration

    The microservices architecture is a recent trend in the software engineering community, with the number of research articles in the field increasing and more companies adopting the architectural style every year. However, the migration of a monolith to the microservices architecture is an error-prone process, with a lack of guidelines for its execution. Microservices also introduce many challenges that are not faced when following a monolithic architecture. This work aims to fill some gaps in current microservices research by providing a catalogue of the most common challenges of adopting this architectural style and possible solutions for them. To this end, a systematic mapping study was conducted, analysing 54 different articles. In addition, 30 industry professionals participated in a questionnaire on the topic, and a participant observation experiment was performed to retrieve additional industry data. Furthermore, one of the identified challenges, distributed transaction management, was examined in more detail and a solution was implemented using the choreographed saga pattern. The solution is publicly available as an open-source project. Finally, multiple experts in the microservices field validated the results of the research and the distributed transactions solution and provided insights regarding the value of this work.
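
    The choreographed saga solution referenced above is published as an open-source project; since that code is not reproduced here, the following is a minimal, generic sketch of the choreographed saga pattern itself: each service performs a local transaction, publishes an event, and reacts to events from other services, with failures handled by compensating actions rather than a central orchestrator. The event bus, service names, and events are illustrative assumptions.

```python
# Minimal, generic sketch of a choreographed saga (not the project's code).
# Each service subscribes to events, does local work, and emits the next
# event; failures trigger compensating events instead of a global rollback.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, event, handler):
        self.handlers[event].append(handler)

    def publish(self, event, payload):
        for handler in self.handlers[event]:
            handler(payload)

bus = EventBus()

# Order service: starts the saga and compensates on failure.
def place_order(payload):
    print("order created", payload)
    bus.publish("OrderCreated", payload)

def cancel_order(payload):
    print("order cancelled (compensation)", payload)

# Payment service: local transaction plus a success or failure event.
def charge_payment(payload):
    if payload.get("card_ok", True):
        bus.publish("PaymentCompleted", payload)
    else:
        bus.publish("PaymentFailed", payload)

bus.subscribe("OrderCreated", charge_payment)
bus.subscribe("PaymentFailed", cancel_order)
bus.subscribe("PaymentCompleted", lambda p: print("shipping scheduled", p))

place_order({"order_id": 1, "card_ok": False})  # exercises the compensation path
```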

    Modernization of Legacy Information Technology Systems

    Large enterprises spend a large portion of their Information Technology (IT) budget on maintaining their legacy systems. Legacy systems modernization projects are a catalyst for IT architects to save cost, provide new and efficient systems that increase profitability, and create value for their organization. Grounded in sociotechnical systems theory, the purpose of this qualitative multiple case study was to explore strategies IT architects use to modernize their legacy systems. The population included IT architects in large enterprises involved in legacy systems modernization projects, one in healthcare and one in the financial services industry, in the San Antonio-New Braunfels, Texas, metropolitan area of the United States. Data collection included interviews with eight IT architects and a review of 12 organizational documents and pertinent artifacts. Data were analyzed using thematic analysis. Prominent themes included collaboration in modernization projects, systems and process documentation, and resource upskilling and technical training. A key recommendation is for IT architects in large enterprises to ensure that team collaboration, system documentation, and resource technical training are built into all aspects of legacy systems modernization projects. The implications for positive social change include the potential to bring together individuals with diverse backgrounds and different perspectives and skills to develop trust and build positive relationships during legacy systems modernization projects.

    The Agile Model-Driven Method

    Today the development of business applications is influenced by increased project complexity, shortened development cycles and high quality expectations. Rising software development costs are an additional motivation to improve productivity through the choice of a suitable development process. In the development of complex applications, models are of great importance. Models reduce complexity by abstraction. Additionally, models offer the possibility to build different views onto an application. If models are sufficiently formal, they are suitable for automated transformation into source code. For this reason, Model-Driven Software Development is regarded as an important acceleration and quality factor in software development. On the other hand, Model-Driven Software Development requires considerable initial work to define meta-models, domain-specific languages and transformation rules for the code generation process. A different approach to improving productivity is the use of agile process models like Scrum, Extreme Programming (XP) or Feature Driven Development (FDD). For these process models, the early production of source code and the adjustment of executable partial results are important aspects of the process. Communication with the end user and direct feedback are the most important success factors for a project and facilitate quick reactions to requirement changes. In agile methods, modelling often plays a subordinate role. Requirements are documented via “user stories” (XP) or “features” (Scrum, FDD) and summarized either in product or sprint backlogs (Scrum) or in feature sets (FDD). From this arises the idea of applying agile work practices and techniques in a process tailored to model-driven development. First, existing process models for model-driven development are identified and described. Their common features, such as process steps, artefacts and team organisation, are worked out and abstracted in a metamodel. The aim is to reuse these process elements in a new agile process model. At the same time, suitable agile modelling practices that can support such a process are identified. Additional criteria and suggestions for the improvement of such a process are identified on the basis of case studies from practical model-driven projects. The Agile Model-Driven Method (AMDM) combines agile procedures and modelling techniques with the technology of model-driven development. AMDM is iterative and incremental and relies on proven concepts from accepted agile standards. AMDM integrates the development of a domain-specific modelling language, the modelling of problem domains and the development of the software architecture into a common context. Development takes place in several cycles of sprints (iterations), which are divided into an initial sprint, domain sprints and value sprints. In parallel with the development of the domain language and the application, the software architecture evolves and is transferred into development. Finally, based on the mentioned case studies from practice and on investigations of model-driven projects in other industrial companies and business fields, it is examined how AMDM's agile concepts can increase efficiency in model-driven projects and how the criticisms and problems raised in these studies can be avoided.
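
    The automated transformation of sufficiently formal models into source code, which the abstract identifies as the key acceleration factor of model-driven development, can be illustrated with a generic sketch (not part of AMDM or its tooling): a tiny, hypothetical domain model is turned into class skeletons by a simple generator. The model format and the generated output are assumptions.

```python
# Minimal, generic sketch of model-to-code generation (not AMDM's tooling).
# A tiny, hypothetical domain model is transformed into Python class skeletons.
domain_model = {
    "Customer": {"name": "str", "email": "str"},
    "Order": {"customer": "Customer", "total": "float"},
}

def generate_class(entity, attributes):
    # Emit a class skeleton with a constructor for each modelled attribute.
    lines = [f"class {entity}:"]
    params = ", ".join(f"{attr}: {typ}" for attr, typ in attributes.items())
    lines.append(f"    def __init__(self, {params}):")
    for attr in attributes:
        lines.append(f"        self.{attr} = {attr}")
    return "\n".join(lines)

if __name__ == "__main__":
    source = "\n\n".join(generate_class(e, a) for e, a in domain_model.items())
    print(source)  # generated source code for the modelled entities
```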

    Migration from Legacy to Reactive Applications in OutSystems

    A legacy system is an information system that significantly resists evolution. Through a migration, these systems can be moved to a more modern environment without having to be redeveloped. OutSystems is a software company with a platform to develop and maintain applications, using abstraction to increase productivity. In October 2019, OutSystems launched a new paradigm that allows developers to build reactive web applications. Because of this, applications implemented in the old web paradigm became legacy systems. The OutSystems approach to this problem was a manual migration; however, it discards a considerable part of the effort previously invested in the legacy system. A well-founded case study allowed us to identify the UI as the highest-priority feature and, at the same time, the major bottleneck in migrations. Therefore, this project had the following objectives: (1) the design and implementation of an automatic migration approach capable of converting UI elements to accelerate the manual migration; (2) the integration of the developed tool into the OutSystems platform. To transform the elements of the OutSystems paradigms, model-driven transformation rules must be defined to receive the source UI elements and produce the equivalent target implementation in the new paradigm (each according to its model). However, the transformations may not be straightforward, and a set of elements may need to be migrated to a different implementation due to Reactive Web's best practices. By defining and searching for UI patterns, it is possible to apply special transformations in such scenarios. As a result, a migration approach was developed, allowing for the migration of UI (and other) elements. To complement this objective, the developed tool was integrated into the OutSystems platform with an easy-to-use interaction. Performance and usability tests demonstrated the necessity of the tool and the impact the final result had on the migration problem. This dissertation's objectives were fully met and even exceeded, accelerating the manual migration by providing an automatic UI conversion. This increased the quality of the existing process and its results, giving OutSystems and its users the possibility of evolving their applications with considerably less effort and investment.
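
    The transformation approach described above, mapping each source UI element to an equivalent target implementation while applying special rules to recognised UI patterns, can be illustrated with a generic, rule-based sketch (not the OutSystems migration tool); the element names, pattern, and rules are illustrative assumptions.

```python
# Generic sketch of rule-based model transformation with UI-pattern handling
# (illustrative only; not the OutSystems migration tool).
def default_rule(element):
    # One-to-one mapping: the same kind of widget in the target paradigm.
    return {"type": "Reactive" + element["type"], "props": element.get("props", {})}

def table_with_pagination_rule(element):
    # Pattern-specific rule: a legacy table with pagination becomes a
    # different target construct, per the new paradigm's best practices.
    return {"type": "ReactiveDataGrid", "props": {"paging": True, **element.get("props", {})}}

PATTERN_RULES = {
    ("Table", "Pagination"): table_with_pagination_rule,
}

def transform(element):
    # Match the element (and its children) against known UI patterns first.
    pattern = (element["type"], *sorted(c["type"] for c in element.get("children", [])))
    rule = PATTERN_RULES.get(pattern, default_rule)
    return rule(element)

legacy_screen = [
    {"type": "Button", "props": {"label": "Save"}},
    {"type": "Table", "children": [{"type": "Pagination"}]},
]
print([transform(e) for e in legacy_screen])
```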

    Concepts for handling heterogeneous data transformation logic and their integration with TraDE middleware

    The concept of programming-in-the-large became a substantial part of modern computer-based scientific research with the advent of web services and orchestration languages. While the notions of workflows and service choreographies help to reduce complexity by providing means to support communication between the involved participants, the process generally remains complex. The TraDE Middleware and its underlying concepts were introduced in order to provide means for performing the modeled data exchange across choreography participants in a transparent and automated fashion. However, in order to achieve both transparency and automation, the TraDE Middleware must be capable of transforming the data along its path. The transparency of data transformation can be difficult to achieve due to various factors, including the diversity of required execution environments and complicated configuration processes, as well as the heterogeneity of data transformation software, which results in tedious integration processes that often involve manually wrapping the software. Having a method of handling data transformation applications in a standardized manner can help to simplify the process of modeling and executing scientific service choreographies with the TraDE concepts applied. In this master's thesis, we analyze various aspects of this problem and conceptualize an extensible framework for handling data transformation applications. The resulting prototypical implementation of the presented framework provides the means to address data transformation applications in a standardized manner.
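
    The standardized handling of heterogeneous transformation software described above suggests a common wrapper contract behind which each tool is registered and invoked uniformly; the sketch below is a generic illustration of that idea (not the thesis prototype), and the class names, registry, and example transformer are assumptions.

```python
# Generic sketch of a standardized wrapper interface for heterogeneous data
# transformation applications (illustrative; not the thesis prototype).
import csv, io, json
from abc import ABC, abstractmethod

class DataTransformer(ABC):
    """Uniform contract every wrapped transformation tool must fulfil."""

    @abstractmethod
    def transform(self, data: bytes) -> bytes: ...

class CsvToJsonTransformer(DataTransformer):
    # Hypothetical wrapper around a CSV-to-JSON conversion step.
    def transform(self, data: bytes) -> bytes:
        rows = list(csv.DictReader(io.StringIO(data.decode())))
        return json.dumps(rows).encode()

REGISTRY = {"csv-to-json": CsvToJsonTransformer()}

def run_transformation(name: str, data: bytes) -> bytes:
    # Middleware-side entry point: every tool is invoked the same way,
    # regardless of how it is implemented internally.
    return REGISTRY[name].transform(data)

print(run_transformation("csv-to-json", b"id,value\n1,42\n"))
```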

    Adaptive monitoring and control framework in Application Service Management environment

    The economics of data centres and cloud computing services have pushed hardware and software requirements to the limits, leaving only a very small performance overhead before systems reach saturation. For Application Service Management (ASM), this carries a growing risk of impacting the execution times of various processes. In order to deliver a stable service at times of great demand for computational power, enterprise data centres and cloud providers must implement fast and robust control mechanisms that are capable of adapting to changing operating conditions while satisfying service-level agreements. In ASM practice, there are normally two methods for dealing with increased load: increasing computational power or shedding load. The first approach typically involves allocating additional machines, which must be available, waiting idle, to deal with high-demand situations. The second approach is implemented by terminating incoming actions that are less important under new activity demand patterns, by throttling, or by rescheduling jobs. Although most modern cloud platforms and operating systems do not allow adaptive, automatic termination of processes, tasks or actions, it is common practice for administrators to manually end or stop tasks or actions at any level of the system, such as a node, function, or process, or to kill a long-running session executing on a database server. In this context, adaptive control of action termination remains a significantly underutilised subject in Application Service Management and deserves further consideration. For example, this approach may be eminently suitable for systems with strict execution-time Service Level Agreements, such as real-time systems, systems running under hard pressure on power supplies, systems running under variable priority, or systems subject to constraints set by the green computing paradigm. Along this line of work, the thesis investigates the potential of dimension relevance and metric signal decomposition as methods that enable more efficient action termination. These methods are integrated into adaptive control emulators and actuators powered by neural networks, which adjust the operation of the system towards better conditions in environments with established goals, seen from both system performance and economic perspectives. The behaviour of the proposed control framework is evaluated using complex load and service-agreement scenarios for systems compatible with the requirements of on-premises deployments, elastic compute cloud deployments, serverless computing, and microservices architectures.
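
    As a generic illustration of the adaptive action-termination idea discussed above (not the thesis's neural-network-based framework), the sketch below attributes load to running actions and, when the system approaches saturation, terminates the lowest-priority actions until utilisation falls back below a target; the thresholds, priorities, and action names are assumptions.

```python
# Generic sketch of adaptive action termination under load (illustrative only;
# not the thesis's neural-network-based framework).
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    priority: float      # business importance, 0..1
    cpu_share: float     # fraction of CPU the action currently consumes

SATURATION_THRESHOLD = 0.9  # assumed utilisation level that triggers control

def decompose_utilisation(actions):
    # Stand-in for metric signal decomposition: total load attributed per action.
    return sum(a.cpu_share for a in actions)

def select_terminations(actions, target_utilisation=0.8):
    """Pick low-priority actions to terminate until utilisation drops below target."""
    load = decompose_utilisation(actions)
    victims = []
    for action in sorted(actions, key=lambda a: a.priority):
        if load <= target_utilisation:
            break
        victims.append(action)
        load -= action.cpu_share
    return victims

running = [Action("report-export", 0.2, 0.3),
           Action("checkout", 0.9, 0.4),
           Action("batch-reindex", 0.1, 0.25)]

if decompose_utilisation(running) > SATURATION_THRESHOLD:
    for victim in select_terminations(running):
        print("terminating", victim.name)
```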

    Smart data management with BIM for Architectural Heritage

    In recent years, the smart buildings topic has received much attention, as have Building Information Modelling (BIM) and interoperability as independent fields. Linking these topics is an essential research target to help designers and stakeholders run processes more efficiently. Working on a smart building requires the use of Information and Communication Technology (ICT) to optimize design, construction and management. In this regard, several technologies, such as sensors for remote monitoring and control, building equipment and management software, are available on the market. As BIM provides an enormous amount of information in its database and is theoretically able to work with all kinds of data sources through interoperability, it is essential to define standards for both data content and exchange formats. One way to align research activity with Horizon 2020 is to investigate energy saving using ICT. Unfortunately, comparing the Architecture, Engineering and Construction (AEC) industry with other sectors, it is clear that advanced information technology applications have not yet been adopted in the building field. In recent years, however, many researchers have investigated the adoption of new methods for data management. Based on the above considerations, the main purpose of this thesis is to investigate the use of the BIM methodology for existing buildings with respect to three main topics:
    • Smart data management for architectural heritage preservation;
    • District data management for energy reduction;
    • The maintenance of high-rises.
    For these reasons, data management is of great value for optimizing the building process and is considered the most important goal of this research. Taking into account different kinds of architectural heritage, attention is focused on existing and historical buildings, which are usually characterized by several constraints. Starting from data collection, a BIM model was developed and customized according to its objectives, providing information for different simulation tests. Finally, data visualization was investigated through Virtual Reality (VR) and Augmented Reality (AR). The creation of a 3D parametric model implies that data is organized according to the needs of the individual users involved in the building process. This means that each 3D model can be developed with different Levels of Detail/Development (LODs) depending on the goal of the data source. Throughout this thesis, the importance of LODs is considered in relation to the kind of information entered in a BIM model. In fact, depending on the objectives of each project, a BIM model can be developed in different ways to facilitate querying data for the simulation tests. The three topics were compared considering each step of the building process workflow, highlighting the main differences and evaluating the strengths and weaknesses of the BIM methodology. In this regard, the importance of setting up a BIM template before the modelling step was pointed out, because it makes it possible to manage information so that it can be collected and extracted for different purposes and by specific users. Moreover, based on the results obtained in terms of the 3D parametric model and the process, a proper BIM maturity level was determined for each topic. Finally, the value of interoperability emerged from these tests, considering that it provided the opportunity to develop a framework for collaboration involving all parties in the building industry.
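
    The LOD-driven data management described above can be illustrated with a generic sketch (not taken from the thesis): BIM elements carry a Level of Detail/Development attribute and are filtered before their data is exported for a given simulation test; the element records, attributes, and threshold are assumptions.

```python
# Generic sketch of filtering BIM elements by Level of Detail/Development (LOD)
# before exporting data for a simulation (illustrative; not the thesis workflow).
bim_elements = [
    {"id": "wall-01", "category": "Wall", "lod": 300, "u_value": 0.35},
    {"id": "vault-07", "category": "Vault", "lod": 400, "u_value": 1.10},
    {"id": "furniture-12", "category": "Furniture", "lod": 200},
]

def export_for_energy_simulation(elements, min_lod=300):
    # Keep only elements detailed enough (and thermally described) for the test.
    return [
        {"id": e["id"], "u_value": e["u_value"]}
        for e in elements
        if e["lod"] >= min_lod and "u_value" in e
    ]

print(export_for_energy_simulation(bim_elements))
```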