
    Towards automatic Markov reliability modeling of computer architectures

    The analysis and evaluation of reliability measures using time-varying Markov models is required for Processor-Memory-Switch (PMS) structures that have competing processes, such as standby redundancy and repair, or renewal processes, such as transient or intermittent faults. The task of generating these models is tedious and prone to human error because of the large number of states and transitions involved in any reasonable system. Model formulation is therefore a major analysis bottleneck, and model verification a major validation problem. The general unfamiliarity of computer architects with Markov modeling techniques further increases the need to automate model formulation. This paper presents an overview of the Automated Reliability Modeling (ARM) program, under development at NASA Langley Research Center. ARM will accept as input a description of the PMS interconnection graph, the behavior of the PMS components, the fault-tolerant strategies, and the operational requirements. The output of ARM will be the reliability or availability Markov model, formulated for direct use by evaluation programs. The advantages of such an approach are (a) utility to a large class of users, not necessarily expert in reliability analysis, and (b) a lower probability of human error in the computation.
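
    As a sketch of the kind of model ARM is meant to generate automatically (this is not ARM itself, and the rates below are hypothetical), the following builds and solves a small continuous-time Markov availability model for a unit with one active and one standby component:

    # Minimal sketch (not ARM): availability model for a unit with one active
    # and one standby component; failure rate lam, repair rate mu (hypothetical).
    import numpy as np

    lam, mu = 1e-3, 1e-1          # hypothetical per-hour failure / repair rates

    # States: 0 = both components up, 1 = one failed, 2 = both failed (system down).
    Q = np.array([
        [-lam,         lam,  0.0],
        [  mu, -(lam + mu),  lam],
        [ 0.0,          mu,  -mu],
    ])

    # Steady-state probabilities: solve pi Q = 0 together with sum(pi) = 1.
    A = np.vstack([Q.T, np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)

    availability = pi[0] + pi[1]   # up unless both components have failed
    print(f"steady-state availability ~ {availability:.6f}")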

    An Automated procedure for simulating complex arrival processes: A Web-based approach

    In industry, simulation is one of the most widely used probabilistic modeling tools for modeling highly complex systems. Major sources of complexity include the inputs that drive the logic of the model. Effective simulation input modeling requires the use of accurate and efficient input modeling procedures. This research focuses on nonstationary arrival processes. The fundamental stochastic model on which this study is conducted is the nonhomogeneous Poisson process (NHPP), which has successfully been used to characterize arrival processes whose arrival rate changes over time. Although a number of methods exist for modeling the rate and mean value functions that define the behavior of NHPPs, one of the most flexible is a multiresolution procedure used to model the mean value function of processes possessing long-term trends over time or asymmetric, multiple cyclic behavior. In this research, a statistical estimation procedure for automating the multiresolution procedure is developed that involves the following steps at each resolution level corresponding to a basic cycle: (a) transforming the cumulative relative frequency of arrivals within the cycle to obtain a linear statistical model having normal residuals with homogeneous variance; (b) fitting specially formulated polynomials to the transformed arrival data; (c) performing a likelihood ratio test to determine the degree of the fitted polynomial; and (d) fitting a polynomial of the degree determined in (c) to the original (untransformed) arrival data. Next, an experimental performance evaluation is conducted to test the effectiveness of the estimation method. A web-based application for modeling NHPPs with the automated multiresolution procedure and for generating realizations of the NHPP is developed. Finally, a web-based simulation infrastructure that integrates modeling, input analysis, verification, validation, and output analysis is discussed.
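
    As an illustration of generating realizations of an NHPP (the last step the web-based tool supports), the sketch below uses Lewis-Shedler thinning with a hypothetical cyclic rate function; it is not the multiresolution estimation procedure itself:

    # Sketch: one NHPP realization on [0, T] via thinning, for a hypothetical rate.
    import math
    import random

    def rate(t):
        # hypothetical arrival rate with a 24-unit cycle
        return 5.0 + 4.0 * math.sin(2 * math.pi * t / 24.0)

    def simulate_nhpp(T, rate, rate_max, rng=random.Random(42)):
        """Lewis-Shedler thinning: propose homogeneous arrivals at rate_max and
        accept a proposal at time t with probability rate(t) / rate_max."""
        arrivals, t = [], 0.0
        while True:
            t += rng.expovariate(rate_max)      # next candidate arrival time
            if t > T:
                return arrivals
            if rng.random() < rate(t) / rate_max:
                arrivals.append(t)

    times = simulate_nhpp(T=24.0, rate=rate, rate_max=9.0)
    print(len(times), "arrivals over one 24-unit cycle")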

    Extending a dashboard meta-model to account for users’ characteristics and goals for enhancing personalization

    Information dashboards are useful tools for exploiting datasets and supporting decision-making processes. However, these tools are not trivial to design and build. Information dashboards not only involve a set of visualizations and handlers to manage the presented data, but also a set of users that will potentially benefit from the knowledge generated by interacting with the data. It is important to know and understand the requirements of the final users of a dashboard because they will influence the design processes. Several user profiles can be involved, however, making these processes even more complicated. This paper identifies and discusses why it is essential to include the final users when modeling a dashboard. Through meta-modeling, different characteristics of potential users are structured, thus obtaining a meta-model that dissects not only the technical and functional features of a dashboard (from an abstract point of view) but also the different aspects of the final users that will make use of it. By identifying these user characteristics and by arranging them into a meta-model, software engineering paradigms such as model-driven development or software product lines can employ it as an input for generating concrete dashboard products. This approach could be useful for generating Learning Analytics dashboards that take into account the users' motivations, beliefs, and knowledge.
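
    A minimal, hypothetical rendering of the idea (not the meta-model defined in the paper) is sketched below: user characteristics and goals are modeled alongside the dashboard's structural elements, so that a generator can tailor the set of visualizations to a profile:

    # Hypothetical sketch of a dashboard meta-model that includes user
    # characteristics and goals next to structural elements.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class UserProfile:
        role: str                 # e.g. "teacher", "student"
        goals: List[str]          # analysis goals driving personalization
        data_literacy: str        # e.g. "low", "medium", "high"

    @dataclass
    class Visualization:
        kind: str                 # e.g. "bar", "line", "heatmap"
        dataset: str
        suited_for: List[str]     # roles this component targets

    @dataclass
    class Dashboard:
        components: List[Visualization] = field(default_factory=list)

        def personalize(self, user: UserProfile) -> "Dashboard":
            """Keep only the components targeted at the user's role."""
            kept = [c for c in self.components if user.role in c.suited_for]
            return Dashboard(components=kept)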

    EIGENVALUE EXPRESSION FOR A BATCH MARKOVIAN ARRIVAL PROCESS

    Consider a batch Markovian arrival process (BMAP) as the counting process of an underlying Markov process representing the state of the environment. Such processes are useful, for example, for representing correlated inputs; they are used both as a modeling tool and as a theoretical device to represent and approximate superpositions of input processes and complex large systems. Our objective is to consider the first and second moments of the counting process depending on time and state. Assuming that the probability generating functions of the batch sizes are analytic and that the eigenvalues of the infinitesimal generator are simple, we derive an analytic diagonalization of the matrix generating function of the counting process. Our main result gives the time-dependent form of the first and second factorial moments of the counting process, represented in terms of the eigenvalues and eigenvectors of the matrix generating function of the batch size.
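
    For a concrete feel of the first-moment quantities involved, the sketch below computes the stationary phase distribution and mean arrival rate of a BMAP from hypothetical batch matrices D0, D1, D2, using the standard stationary-rate formula rather than the paper's eigenvalue expression:

    # Sketch: mean arrival rate of a BMAP with hypothetical batch matrices
    # D0, D1, D2 (batch sizes 0, 1, 2); E[N(t)] ~ lambda * t for large t.
    import numpy as np

    D0 = np.array([[-3.0, 1.0], [0.5, -2.0]])
    D1 = np.array([[ 1.5, 0.2], [0.5,  0.5]])
    D2 = np.array([[ 0.2, 0.1], [0.3,  0.2]])

    D = D0 + D1 + D2              # generator of the underlying phase process

    # Stationary phase distribution: theta D = 0 with theta summing to one.
    A = np.vstack([D.T, np.ones(2)])
    theta, *_ = np.linalg.lstsq(A, np.array([0.0, 0.0, 1.0]), rcond=None)

    lam = theta @ (1 * D1 + 2 * D2) @ np.ones(2)   # lambda = theta (sum_k k D_k) 1
    print("mean arrival rate:", lam)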

    SPoT: Representing the Social, Spatial, and Temporal Dimensions of Human Mobility with a Unifying Framework

    Modeling human mobility is crucial in the analysis and simulation of opportunistic networks, where contacts are exploited as opportunities for peer-to-peer message forwarding. The current approach to human mobility modeling has been to continuously modify models, trying to embed in them the mobility properties (e.g., visiting patterns to locations or specific distributions of inter-contact times) as they emerge from trace analysis. As a consequence, with these models it is difficult, if not impossible, to modify the features of mobility or to control the exact shape of mobility metrics (e.g., modifying the distribution of inter-contact times). For these reasons, in this paper we propose a mobility framework rather than a mobility model, with the explicit goal of providing a flexible and controllable tool for mathematically modeling and simulatively generating different possible features of human mobility. Our framework, named SPoT, is able to incorporate the three dimensions of human mobility: spatial, social, and temporal. SPoT does so by mapping the different social communities of the network onto different locations, which their members visit with a configurable temporal pattern. In order to characterize the temporal patterns of user visits to locations and the relative positioning of locations based on their shared users, we analyze traces of real user movements extracted from three location-based online social networks (Gowalla, Foursquare, and Altergeo). We observe that a Bernoulli process effectively approximates user visits to locations in the majority of cases, and that locations sharing many common users who visit them frequently tend to be located close to each other. In addition, we use these traces to test the flexibility of the framework, and we show that SPoT is able to accurately reproduce the mobility behavior observed in the traces. Finally, relying on the Bernoulli assumption for arrival processes, we provide a thorough mathematical analysis of the controllability of the framework, deriving the conditions under which heavy-tailed and exponentially tailed aggregate inter-contact times (often observed in real traces) emerge.
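
    The Bernoulli visiting assumption can be sketched as follows (communities, locations, and probabilities are hypothetical; this is not the SPoT implementation):

    # Sketch: Bernoulli model of user visits. In each time slot, a user visits
    # its community's location with a fixed probability; co-located users count
    # as a contact opportunity. All numbers are hypothetical.
    import random

    rng = random.Random(0)
    communities = {"c1": ["u1", "u2", "u3"], "c2": ["u4", "u5"]}
    visit_prob = {"c1": 0.3, "c2": 0.6}   # per-slot visit probability per community
    slots = 100

    contacts = 0
    for _ in range(slots):
        for community, members in communities.items():
            present = [u for u in members if rng.random() < visit_prob[community]]
            contacts += len(present) * (len(present) - 1) // 2

    print("pairwise contact opportunities:", contacts)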

    Product design-Process selection-Process planning Integration based on Modelling and Simulation

    Because the traditional design process has many drawbacks when it comes to manufacturing, the integration of product design, process selection, and process planning is carried out in the early design phase. Technological, economic, and logistic parameters are taken into account simultaneously, and manufacturing constraints are integrated into the product design. As a consequence, the most feasible alternative for the product's detailed design is extracted while satisfying the product's functional requirements. Subsequently, several conceptual process plans are proposed, based on the manufacturing processes preliminarily selected in the conceptual design phase. Virtual manufacturing under CAM software is employed to simulate the fabrication process for each candidate process plan. Ultimately, the most suitable process plan for fabricating the part is recommended on the basis of a multi-criteria analysis that supports the decision making.
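
    The final selection step can be illustrated with a simple weighted-sum multi-criteria scoring sketch; the criteria, weights, and scores below are hypothetical and are not taken from the paper:

    # Sketch: weighted-sum multi-criteria ranking of candidate process plans.
    weights = {"cost": 0.4, "lead_time": 0.3, "quality": 0.3}

    # Scores normalized to [0, 1]; higher is better for every criterion.
    plans = {
        "plan_A": {"cost": 0.7, "lead_time": 0.8, "quality": 0.6},
        "plan_B": {"cost": 0.9, "lead_time": 0.5, "quality": 0.7},
    }

    def score(plan):
        return sum(weights[c] * plan[c] for c in weights)

    best = max(plans, key=lambda name: score(plans[name]))
    print(best, "with weighted score", round(score(plans[best]), 3))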

    A System for Deduction-based Formal Verification of Workflow-oriented Software Models

    The work concerns formal verification of workflow-oriented software models using a deductive approach, considering the formal correctness of a model's behaviour. Manually building logical specifications, understood as sets of temporal logic formulas, is a significant obstacle for an inexperienced user applying the deductive approach. A system, and its architecture, for the deduction-based verification of workflow-oriented models is proposed. The process of inference is based on the semantic tableaux method, which has some advantages when compared to traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is the standard and dominant notation for the modeling of business processes. The main idea of the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives that enable the transformation of models into temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach goes some way towards supporting, and hopefully enhancing our understanding of, deduction-based formal verification of workflow-oriented models.
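
    The pattern-to-formula idea can be sketched as below: each workflow pattern instance is mapped to a temporal-logic formula string, and the conjunction of those formulas forms the logical specification. The pattern templates and task names are illustrative only, not the paper's actual generation rules:

    # Sketch: mapping BPMN-style workflow pattern instances to LTL-like formulas.
    PATTERN_TEMPLATES = {
        # sequence: once task a happens, task b must eventually follow
        "sequence": "G({a} -> F {b})",
        # exclusive choice: after a, eventually b or c, but never both
        "exclusive_choice": "G({a} -> F ({b} | {c})) & G(!({b} & {c}))",
    }

    def pattern_to_formula(pattern, **tasks):
        return PATTERN_TEMPLATES[pattern].format(**tasks)

    workflow = [
        ("sequence", {"a": "receiveOrder", "b": "checkStock"}),
        ("exclusive_choice", {"a": "checkStock", "b": "ship", "c": "reject"}),
    ]

    # The logical specification is the conjunction of the per-pattern formulas.
    specification = " & ".join(pattern_to_formula(p, **t) for p, t in workflow)
    print(specification)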
