
    Facilitating the analysis of a UK national blood service supply chain using distributed simulation

    In an attempt to investigate blood unit ordering policies, researchers have created a discrete-event model of the UK National Blood Service (NBS) supply chain in the Southampton area of the UK. The model has been created using Simul8, a commercial-off-the-shelf discrete-event simulation package (CSP). However, as more hospitals were added to the model, the time needed to perform a single simulation run increased severely. It has been claimed that distributed simulation, a technique that uses the resources of many computers to execute a simulation model, can reduce simulation runtime. Further, an emerging standardized approach exists that supports distributed simulation with CSPs. These CSP Interoperability (CSPI) standards are compatible with the IEEE 1516 High Level Architecture (HLA), the de facto interoperability standard for distributed simulation. To investigate whether distributed simulation can reduce the execution time of the NBS supply chain simulation, this paper presents our experience of creating a distributed version of the CSP Simul8 according to the CSPI/HLA standards. It shows that the distributed version of the simulation does indeed run faster once the model reaches a certain size. Further, we argue that understanding how model features relate to performance is key. This is illustrated by experimentation with two different protocol implementations, using Time Advance Request (TAR) and Next Event Request (NER). Our contribution is therefore the demonstration that distributed simulation is a useful technique for the timely execution of supply chain simulations of this type, and that careful analysis of model features can further increase performance.
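
    The TAR/NER comparison above comes down to how often a federate must synchronize with the HLA runtime: TAR advances in fixed time steps, while NER jumps directly to the timestamp of the next event. The Python sketch below is only a conceptual illustration of that difference, not the paper's Simul8/CSPI implementation; the rti_grant stand-in for the RTI, the event list and the step size are all hypothetical.

        # Conceptual sketch (not the paper's implementation): contrasts the two
        # HLA time-advance styles. With TAR a federate asks to move forward by a
        # fixed step; with NER it asks to jump to the next event's timestamp.
        import heapq

        def rti_grant(requested_time, incoming_events):
            """Pretend-RTI: grants the requested time, delivering events due by then."""
            delivered = []
            while incoming_events and incoming_events[0][0] <= requested_time:
                delivered.append(heapq.heappop(incoming_events))
            return requested_time, delivered

        def run_tar(events, end_time, step=1.0):
            """Time-stepped advance: one grant per fixed step, even with no events due."""
            queue, t, grants = list(events), 0.0, 0
            heapq.heapify(queue)
            while t < end_time:
                t, _ = rti_grant(t + step, queue)
                grants += 1
            return grants

        def run_ner(events, end_time):
            """Event-driven advance: jump straight to the next event's timestamp."""
            queue, t, grants = list(events), 0.0, 0
            heapq.heapify(queue)
            while queue and t < end_time:
                t, _ = rti_grant(queue[0][0], queue)
                grants += 1
            return grants

        if __name__ == "__main__":
            evts = [(5.0, "delivery"), (12.0, "order"), (40.0, "stock check")]
            print("TAR grants:", run_tar(evts, 50.0))   # 50 lock-step grants
            print("NER grants:", run_ner(evts, 50.0))   # 3 grants, one per event

    Which style wins in practice depends on the model: a sparse event schedule favours NER, while densely and regularly spaced events can make fixed-step TAR competitive, which is consistent with the paper's point that model features drive performance.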

    Mapping Big Data into Knowledge Space with Cognitive Cyber-Infrastructure

    Big data research has attracted great attention in science, technology, industry and society. It is developing alongside the evolving scientific paradigm, the fourth industrial revolution, and the transformational innovation of technologies. However, its nature and fundamental challenge have not been recognized, and a methodology of its own has not yet been formed. This paper explores and answers the following questions: What is big data? What are the basic methods for representing, managing and analyzing big data? What is the relationship between big data and knowledge? Can we find a mapping from big data into knowledge space? What kind of infrastructure is required to support not only big data management and analysis but also knowledge discovery, sharing and management? What is the relationship between big data and the scientific paradigm? What is the nature and fundamental challenge of big data computing? A multi-dimensional perspective is presented toward a methodology of big data computing.

    A model-driven approach to broaden the detection of software performance antipatterns at runtime

    Performance antipatterns document bad design practices that have a negative influence on system performance. In our previous work we formalized such antipatterns as logical predicates defined over four views: (i) the static view, which captures the software elements (e.g. classes, components) and the static relationships among them; (ii) the dynamic view, which represents the interactions (e.g. messages) that occur between the software elements to provide the system functionalities; (iii) the deployment view, which describes the hardware elements (e.g. processing nodes) and the mapping of the software elements onto the hardware platform; (iv) the performance view, which collects specific performance indices. In this paper we present a lightweight infrastructure that is able to detect performance antipatterns at runtime through monitoring. The proposed approach pre-evaluates such predicates, identifies antipatterns whose static, dynamic and deployment sub-predicates are satisfied by the current system configuration, and defers the verification of the performance sub-predicates to runtime. The proposed infrastructure leverages model-driven techniques to generate probes for monitoring the performance sub-predicates and detecting antipatterns at runtime. (In Proceedings FESCA 2014, arXiv:1404.043)
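
    The split described above, where the static, dynamic and deployment sub-predicates are checked against the model up front and only the performance sub-predicate is evaluated from monitored indices, can be pictured with the following minimal Python sketch. It is not the paper's formalization or its model-driven infrastructure; the Blob-style predicate, the view encodings and the thresholds are invented for illustration.

        # Minimal sketch: an antipattern predicate as a conjunction of
        # sub-predicates. The design-time part is evaluated once against the
        # model; only the performance part is re-checked from monitored indices.
        from dataclasses import dataclass
        from typing import Callable, Dict

        @dataclass
        class Antipattern:
            name: str
            design_time: Callable[[dict], bool]              # static/dynamic/deployment views
            performance: Callable[[Dict[str, float]], bool]  # monitored performance indices

        BLOB = Antipattern(
            name="Blob",
            # Design-time part: one component concentrates many operations and messages.
            design_time=lambda model: model["ops_per_component"]["OrderManager"] > 20
                                      and model["msgs_sent"]["OrderManager"] > 100,
            # Runtime part: its measured utilization exceeds a threshold.
            performance=lambda indices: indices["util_OrderManager"] > 0.85,
        )

        def candidates(antipatterns, model):
            """Pre-filter: keep only antipatterns whose design-time part already holds."""
            return [ap for ap in antipatterns if ap.design_time(model)]

        def detect(cands, indices):
            """Runtime check driven by monitoring probes."""
            return [ap.name for ap in cands if ap.performance(indices)]

        model = {"ops_per_component": {"OrderManager": 35},
                 "msgs_sent": {"OrderManager": 140}}
        probes = {"util_OrderManager": 0.91}
        print(detect(candidates([BLOB], model), probes))   # ['Blob']

    The point of the pre-filtering step is that only the candidate antipatterns surviving the design-time check need monitoring probes at all, which keeps the runtime overhead low.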

    ERP implementation methodologies and frameworks: a literature review

    Enterprise Resource Planning (ERP) implementation is a complex and dynamic process, one that involves a combination of technological and organizational interactions. Often an ERP implementation project is the single largest IT project that an organization has ever launched, and it requires a mutual fit of system and organization. Moreover, an ERP implementation supporting business processes across many different departments is not a generic, rigid and uniform undertaking; it depends on a variety of factors. As a result, the issues surrounding the ERP implementation process have become one of the major concerns in industry, and ERP implementation receives attention from both practitioners and scholars; the business as well as the academic literature is abundant but not always conclusive or coherent. However, research on ERP systems has so far focused mainly on diffusion, use and impact issues. Less attention has been given to the methods used during the configuration and implementation of ERP systems; although these methods are commonly used in practice, they remain largely unexplored and undocumented in Information Systems research. The academic relevance of this research is therefore its contribution to the existing body of scientific knowledge. A brief annotated literature review is conducted to evaluate the current state of the academic literature. The purpose is to present a systematic overview of relevant ERP implementation methodologies and frameworks, with the aim of achieving a better taxonomy of ERP implementation methodologies. This paper is useful to researchers who are interested in ERP implementation methodologies and frameworks, and the results will serve as input for a classification of the existing ERP implementation methodologies and frameworks. The paper also addresses the professional ERP community involved in the process of ERP implementation by promoting a better understanding of ERP implementation methodologies and frameworks, their variety and their history.

    A model-driven approach for facilitating user-friendly design of complex event patterns

    Complex Event Processing (CEP) is an emerging technology that allows us to efficiently process and correlate huge amounts of data in order to discover relevant or critical situations of interest (complex events) for a specific domain. This technology requires domain experts to define complex event patterns, where the conditions to be detected are specified by means of event processing languages. However, these experts face the problem of having to define such patterns with editors that are not sufficiently user-friendly. To solve this problem, a model-driven approach for facilitating the user-friendly design of complex event patterns is proposed and developed in this paper. In addition, the proposal has been applied to different domains and several event processing languages have been compared. As a result, we can affirm that the presented approach is independent both of the domain in which CEP technology is applied and of the concrete event processing language required for defining event patterns.
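
    As a rough illustration of what a complex event pattern expresses, the sketch below hand-codes one in Python: raise a complex event when several high temperature readings arrive within a sliding time window. The event fields, threshold and window are hypothetical; the paper's contribution is precisely to let domain experts obtain such patterns from models instead of writing them against a concrete event processing language by hand.

        # Illustrative only: a hand-coded version of the kind of pattern a CEP
        # engine would express in an event processing language, e.g. "three
        # temperature readings above 40 degrees within 60 seconds".
        from collections import deque

        def high_temp_pattern(events, threshold=40.0, count=3, window=60.0):
            """Yield a complex event each time `count` readings exceed `threshold`
            within a sliding time `window`; events are (timestamp, value) pairs."""
            recent = deque()
            for ts, value in events:
                if value > threshold:
                    recent.append(ts)
                    # Drop readings that have fallen out of the window.
                    while recent and ts - recent[0] > window:
                        recent.popleft()
                    if len(recent) >= count:
                        yield ("HighTemperatureAlert", recent[0], ts)

        stream = [(0, 38.1), (10, 41.2), (25, 42.0), (50, 43.5), (200, 41.0)]
        print(list(high_temp_pattern(stream)))
        # [('HighTemperatureAlert', 10, 50)]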