    Software Defined Application Delivery Networking

    In this thesis we present the architecture, design, and prototype implementation details of AppFabric. AppFabric is a next-generation application delivery platform for easily creating, managing, and controlling massively distributed and very dynamic application deployments that may span multiple datacenters. Over the last few years, the need for more flexibility, finer control, and automatic management of large (and messy) datacenters has stimulated technologies for virtualizing infrastructure components and placing them under software-based management and control, generically called Software-Defined Infrastructure (SDI). However, current applications are not designed to leverage the dynamism and flexibility offered by SDI; they mostly depend on a mix of manual configuration, specialized appliances (middleboxes), and (mostly) proprietary middleware solutions, together with a team of extremely conscientious and talented system engineers, to get deployed and running. AppFabric 1) automates the whole control and management stack of application deployment and delivery, 2) allows application architects to define logical workflows consisting of application servers, message-level middleboxes, packet-level middleboxes, and network services (both local and wide-area) composed over application-level routing policies, and 3) provides the abstraction of an application cloud that allows the application to dynamically (and automatically) expand and shrink its distributed footprint across multiple geographically distributed datacenters operated by different cloud providers. The architecture consists of a hierarchical control plane system called Lighthouse and a fully distributed data plane design (with no special hardware components such as service orchestrators, load balancers, message brokers, etc.) called OpenADN. The current implementation (under active development) consists of ~10,000 lines of Python and C code. AppFabric will allow applications to fully leverage the opportunities provided by modern virtualized Software-Defined Infrastructures. It will serve as the platform for deploying massively distributed and extremely dynamic next-generation application use-cases, including:

    Internet-of-Things/Cyber-Physical Systems: through support for managing the distributed gather-aggregate topologies common to most Internet-of-Things (IoT) and Cyber-Physical Systems (CPS) use-cases. By their very nature, IoT and CPS use-cases are massively distributed, have different levels of computation and storage requirements at different locations, and have variable latency requirements across their distributed sites. Some services in an IoT/CPS application workflow, such as device controllers, may need to gather, process, and forward data under near-real-time constraints and hence need to be as close to the device as possible. Other services may need more computation to process aggregated data that drives long-term business intelligence functions. AppFabric has been designed to support such very dynamic, highly diversified, and massively distributed application use-cases.

    Network Function Virtualization: through support for heterogeneous workflows, application-aware networking, and network-aware application deployments, AppFabric will enable new partnerships between Application Service Providers (ASPs) and Network Service Providers (NSPs). An application workflow in AppFabric may comprise application services, packet- and message-level middleboxes, and network transport services chained together over an application-level routing substrate. This substrate allows policy-based service chaining, where the application may specify policies for routing its traffic over different services based on application-level content or context; a small illustrative sketch follows this abstract.

    Virtual worlds/multiplayer games: through support for creating, managing, and controlling the dynamic and distributed application clouds these applications need. AppFabric allows the application to easily specify policies that grow and shrink its footprint over different geographical sites, on demand.

    Mobile apps: through support for the extremely diversified and very dynamic application contexts typical of such applications. AppFabric also automatically manages massively distributed service deployments and controls application traffic based on application-level policies, allowing mobile applications to provide the best Quality-of-Experience to their users.

    This thesis is the first to handle, and provide a complete solution for, such a complex and relevant architectural problem, one that is expected to touch each of our lives by enabling exciting new application use-cases that are not possible today. AppFabric is also a non-proprietary platform that is expected to spawn many innovations, both in the design of the platform itself and in the features it provides to applications. AppFabric still needs many iterations, both in design and in implementation maturity. This thesis is not the end of the journey for AppFabric but rather just the beginning.
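    The policy-based service chaining described above can be made concrete with a short sketch. The following Python fragment is a minimal illustration only; the names (Service, Route, Workflow, when, dispatch) are invented for exposition and are not the actual OpenADN or Lighthouse interfaces.

        # Hypothetical sketch of application-level, policy-based service chaining.
        # None of these names come from the AppFabric code base.
        from dataclasses import dataclass, field
        from typing import Callable, List

        @dataclass
        class Service:
            """A workflow element: app server, message-level or packet-level middlebox."""
            name: str
            kind: str

        @dataclass
        class Route:
            predicate: Callable[[dict], bool]  # test on application-level content/context
            chain: List[Service]               # services the matching traffic traverses

        @dataclass
        class Workflow:
            routes: List[Route] = field(default_factory=list)

            def when(self, predicate: Callable[[dict], bool], chain: List[Service]) -> None:
                self.routes.append(Route(predicate, chain))

            def dispatch(self, message: dict) -> List[Service]:
                """Return the service chain of the first policy matching the message."""
                for route in self.routes:
                    if route.predicate(message):
                        return route.chain
                return []

        # Example policy: premium traffic passes through a transcoder middlebox first.
        wf = Workflow()
        transcoder = Service("transcoder", "msg-middlebox")
        frontend = Service("video-frontend", "app-server")
        wf.when(lambda m: m.get("tier") == "premium", [transcoder, frontend])
        wf.when(lambda m: True, [frontend])  # default chain
        print([s.name for s in wf.dispatch({"tier": "premium"})])
        # -> ['transcoder', 'video-frontend']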

    System design for periodic data production management

    This research project introduces a new type of information system, the periodic data production management system, and proposes several innovative system design concepts for this application area. Periodic data production systems are common in the information industry for the production of information. These systems process large quantities of data in order to produce statistical reports at predefined intervals. The workflow of such a system is typically distributed world-wide and consists of several semi-computerized production steps which transform data packages. For example, market research companies apply these systems in order to sell marketing information over specified timelines. A lack of concepts for IT-aided management in this area has been identified. This thesis clearly defines the complex requirements of periodic data production management systems. It is shown that these systems can be defined as IT support for planning, monitoring, and controlling periodic data production processes. Their significant advantage is that the information industry will be enabled to increase production performance and to ease (and speed up) the identification of production progress, as well as the achievable optimisation potential, in order to control rationalisation goals. In addition, this thesis provides solutions for the generic problem of how to introduce such a management system on top of an unchangeable periodic data production system. Two promising system designs for periodic data production management are derived, analysed, and compared in order to gain knowledge about appropriate concepts for this application area. Production planning systems are the metaphor model used for the so-called closely coupled approach; the metaphor model for the loosely coupled approach is project management. The latter approach is prototyped as an application in the market research industry and used as a case study. Evaluation results are real-world experiences which demonstrate the extraordinary efficiency of systems based on the loosely coupled approach. Of particular note is a scenario-based evaluation that accurately demonstrates the many improvements achievable with this approach. The main results are that production planning and process quality can be vitally improved. Finally, among other propositions, it is suggested that future work concentrate on the development of product lines for periodic data production management systems in order to increase their reuse.
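    The loosely coupled approach, with project management as its metaphor, can be pictured as a thin planning and monitoring layer that only observes the unchangeable production system. The sketch below is an assumption-laden Python illustration; class and field names are invented and do not come from the thesis prototype.

        # Hypothetical sketch: one periodic production cycle treated like a project
        # whose tasks mirror the (unchangeable) production steps, so progress can be
        # planned and monitored externally without modifying the production system.
        from dataclasses import dataclass, field
        from datetime import date
        from typing import List

        @dataclass
        class ProductionStep:
            name: str
            planned_end: date
            done: bool = False

        @dataclass
        class ProductionCycle:
            """One periodic run, e.g. a monthly report for a market-research client."""
            period: str
            steps: List[ProductionStep] = field(default_factory=list)

            def progress(self) -> float:
                """Share of completed steps; the management layer only observes."""
                if not self.steps:
                    return 0.0
                return sum(s.done for s in self.steps) / len(self.steps)

            def late_steps(self, today: date) -> List[str]:
                """Steps whose planned end has passed without completion."""
                return [s.name for s in self.steps if not s.done and s.planned_end < today]

        cycle = ProductionCycle("2024-03", [
            ProductionStep("collect data packages", date(2024, 3, 10), done=True),
            ProductionStep("transform and validate", date(2024, 3, 15)),
            ProductionStep("produce statistical report", date(2024, 3, 25)),
        ])
        print(round(cycle.progress(), 2))            # 0.33
        print(cycle.late_steps(date(2024, 3, 18)))   # ['transform and validate']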

    An investigation into the relevance of flexibility- and interoperability requirements for implementation processes for workflow-management-applications

    Flexibility and interoperability have become important characteristics for organisations and their business processes. The need to control flexible business processes within an organisation's boundaries and between organisations imposes major requirements on a company's process control capabilities. Workflow Management Systems (WFMS) try to fulfil these requirements by offering respective product features. Evidence suggests that the achievement of flexible business processes and inter-organisational process control is also influenced by the implementation processes for Workflow Management Applications (WFMA). [A WFMA comprises the WFMS and "all WFMS specific data with regard to one or more business processes" [VER01].] The impact of a WFMA implementation methodology on the fulfilment of these requirements is the research scope of the project. The thesis provides knowledge in the following areas:
    1. Review of the relationship between workflow management and the demand for process flexibility and interoperability.
    2. Definition of a research/evaluation framework for workflow projects, composed of all relevant research variables that have been identified for the thesis.
    3. Empirical survey of relevant workflow-project objectives and their priority in the context of process flexibility and interoperability.
    4. Empirical survey of the objectives' achievement.
    5. Empirical survey of methodologies and activities that have been applied within workflow projects.
    6. Derivation of the project methodologies' effectiveness in terms of the impact that applied activities had on project objectives.
    7. Evaluation of existing workflow life-cycle models in accordance with the research framework.
    8. Identification of basic improvements for workflow implementation processes with respect to the achievement of flexible and interoperable business processes.
    The first part of the thesis argues the relevance of the subject. Afterwards, the research variables that constitute the evaluation framework for WFMA implementation processes are identified and defined step by step. An empirical study then proves the variables' effectiveness for the achievement of process flexibility and interoperability within the WFMA implementation process. After this, the framework is applied to evaluate chosen WFMA implementation methodologies. Identified weaknesses and effective methodological aspects are utilised to develop generic methodological improvements. These improvements are later validated by means of a case study and interviews with workflow experts.

    50 jaar informatiesystemen 1978-2028: liber amicorum voor Theo Bemelmans

    no abstract

    Assessing the relationship between bpm maturity and the success of organizations

    Dissertation presented as partial requirement for obtaining a Master's degree in Information Management, specialization in Information Systems and Technologies Management. For the past decades, organizations have been investing heavily in BPM projects in the hope of improving their competitive advantage in an increasingly complex environment. However, although it is believed that the higher the level of BPM maturity, the greater the success of the organization, experience shows that this relationship is not always possible to prove. The purpose of this study is to help clarify the relationship between the level of BPM maturity and the success of an organization. This was done through case study-based research within a global company with operations in Portugal, focusing on its shared services organization. An analysis of the existing BPM maturity models and their coverage of BPM core areas was conducted in order to select the most suitable BPM maturity model for assessing the organization's current BPM maturity level. A framework to characterize the success of an organization was also established. These two inputs, along with information gathered to understand the process improvements that were implemented and their impact on the organization, were the basis for the research. Results show a successful organization, with a high maturity level according to the OMG BPM maturity model, that has been investing in continually improving its processes with a strong focus on digital transformation. The benefits identified from a high level of BPM maturity, namely improved productivity, cost reduction, error and risk prevention, higher agility, employee upskilling, and knowledge retention, were shown to have a positive influence on the majority of the dimensions used to characterize the success of the organization.

    S-BPM in the Wild

    This is the first book to present field studies on the application of subject-oriented business process management (S-BPM). Each case presents a specific story and focuses on an essential modeling or implementation issue, and most end with implications or suggestions for further studies. Significant variables and success factors discovered during the respective studies are identified, leading to suggested S-BPM novelties. For each case, the authors explain step by step how the story develops, and provide readers guidance by detailing the respective rationale. The studies covered are clustered according to three main S-BPM themes: Part I, "Business Operation Support", documents approaches to the practical development of S-BPM solutions in various application domains and organizational settings, while Part II, "Consultancy and Education Support", highlights cases that can help to train readers in S-BPM modeling and knowledge acquisition for S-BPM lifecycle iterations. It also refers to architecting S-BPM solutions for application cases based on hands-on experience. Part III, "Technical Execution Support", focuses on concepts for utilizing specific theories and technologies to execute S-BPM models. It also addresses how to create reference models for certain settings in the field. Lastly, the appendix covers all relevant aspects needed to grasp S-BPM modeling and apply it based on fundamental examples. Its format reconciles semantic precision with syntactic rigor. Addressing the needs of developers, educators, and practitioners, this book will help companies to learn from the experiences of first-time users and to develop systems that fit their business processes, explaining the latest key methodological and technological S-BPM developments in the fields of training, research, and application.
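    The subject-oriented paradigm these field studies apply can be conveyed with a toy example: a process is a set of subjects whose internal behaviors (send, receive, and function states) exchange messages. The Python sketch below is illustrative only and is not taken from the book or from any S-BPM tool.

        # Toy illustration of the S-BPM idea: subjects exchanging messages.
        # State names follow S-BPM's send/receive/function triad; everything
        # else (class and method names, the scenario) is invented.
        import queue

        class Subject:
            def __init__(self, name: str):
                self.name = name
                self.inbox = queue.Queue()

            def send(self, other: "Subject", msg: str) -> None:
                print(f"{self.name} -> {other.name}: {msg}")   # send state
                other.inbox.put((self.name, msg))

            def receive(self) -> str:
                sender, msg = self.inbox.get()                 # receive state
                print(f"{self.name} received '{msg}' from {sender}")
                return msg

        # "Employee" and "Manager" subjects in a vacation-request process,
        # a canonical S-BPM teaching example.
        employee, manager = Subject("Employee"), Subject("Manager")
        employee.send(manager, "vacation request")
        manager.receive()                  # a function state would decide here
        manager.send(employee, "approved")
        employee.receive()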

    Virtual learning process environment (VLPE): a BPM-based learning process management architecture

    E-learning systems have significantly impacted the way that learning takes place within universities, particularly in providing self-learning support and flexibility of course delivery. Virtual Learning Environments help facilitate the management of educational courses for students, in particular by assisting course designers and by supporting the management of the learning itself. Current literature has shown that pedagogical modelling and learning process management facilitation are inadequate. In particular, the quantitative information on the process of learning that is needed to perform real-time or reflective monitoring and statistical analysis of students' learning process performance is deficient. Therefore, for a course designer, pedagogical evaluation and reform decisions can be difficult. This thesis presents an alternative e-learning systems architecture, the Virtual Learning Process Environment (VLPE), which uses the Business Process Management (BPM) conceptual framework to design an architecture that addresses the critical quantitative learning process information gaps associated with conventional VLE frameworks. Within VLPE, course designers can model desired education pedagogies in the form of learning process workflows using an intuitive graphical flow-diagram user interface. Automated agents associated with BPM frameworks are employed to capture quantitative learning information from the learning process workflow. Consequently, course designers are able to monitor, analyse, and re-evaluate in real time the effectiveness of their chosen pedagogy using live interactive learning process dashboards. Once a course delivery is complete, the collated quantitative information can also be used to make major revisions to the pedagogy design for the next iteration of the course. An additional contribution of this work is that the new architecture enables individual students to monitor and analyse their own learning performance in comparison to their peers, in real time and anonymously, through a personal learning analytics dashboard. A case scenario of the quantitative statistical analysis of a cohort of ten learners is presented. The analytical results of their learning processes, performances, and progressions on a short Mathematics course over a five-week period are also presented, in order to demonstrate that the proposed framework can significantly help to advance learning analytics and the visualisation of real-time learning data.
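    The core mechanism described here, automated agents logging workflow events so that per-step learning metrics can feed live dashboards, can be sketched briefly. The Python below is a hypothetical illustration; the event schema and function names are assumptions, not the VLPE implementation.

        # Hypothetical sketch: agents log timestamped workflow-step events, and
        # simple aggregations feed a cohort dashboard plus an anonymous
        # personal-vs-cohort comparison for each student.
        from collections import defaultdict
        from statistics import mean

        events = []  # (student_id, step, t_start, t_end) captured by an agent

        def log_step(student_id: str, step: str, t_start: float, t_end: float) -> None:
            events.append((student_id, step, t_start, t_end))

        def step_durations() -> dict:
            """Mean per-step duration across the cohort, for the dashboard."""
            by_step = defaultdict(list)
            for _, step, t0, t1 in events:
                by_step[step].append(t1 - t0)
            return {step: mean(ds) for step, ds in by_step.items()}

        def personal_vs_cohort(student_id: str, step: str):
            """Anonymous comparison: a student's mean time next to the cohort mean."""
            mine = [t1 - t0 for s, st, t0, t1 in events if s == student_id and st == step]
            return (mean(mine) if mine else None, step_durations().get(step))

        log_step("s1", "quiz-1", 0, 300)
        log_step("s2", "quiz-1", 0, 480)
        print(step_durations())                    # {'quiz-1': 390}
        print(personal_vs_cohort("s1", "quiz-1"))  # (300, 390)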