
    Seeking stability of supply chain management decisions under uncertain criteria

    The leading theme of MOSIM’12 is "Performance, Interoperability and Safety for sustainable development". This paper tackles the question of anticipating a supply chain partner's decisional behaviour under uncertain criteria. In other words, we propose a model to support sequential decisions under uncertainty in which the decision maker has to make hypotheses about the decision criteria. For example, the Hurwicz criterion weights extreme optimism and pessimism positions, and a classic criticism of this criterion concerns the difficulty of assessing the weight and the resulting decision instability. To address this, we present a method based on a fuzzy representation of the decision maker's vision of the weight. Finally, the model allows the sequential decisions of a decision tree to be computed thanks to a pignistic probability treatment of the fuzzy representation of the decision maker's optimism-pessimism index. This approach is illustrated through an industrial case study.
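
    To make the idea concrete, here is a minimal, hypothetical sketch (not the paper's model): the Hurwicz value of each action is averaged over a discretised triangular fuzzy optimism-pessimism index whose membership function is normalised into a probability distribution, in the spirit of a pignistic treatment. The action names, payoffs and fuzzy index below are invented for illustration.

    def hurwicz(payoffs, alpha):
        """Hurwicz criterion: weight the best and worst outcomes of an action."""
        return alpha * max(payoffs) + (1.0 - alpha) * min(payoffs)

    def triangular_membership(x, low, mode, high):
        """Membership degree of x in a triangular fuzzy number (low, mode, high)."""
        if x <= low or x >= high:
            return 0.0
        if x <= mode:
            return (x - low) / (mode - low)
        return (high - x) / (high - mode)

    def expected_hurwicz(payoffs, fuzzy_alpha, steps=100):
        """Average the Hurwicz value over a discretised, normalised fuzzy alpha."""
        low, mode, high = fuzzy_alpha
        grid = [low + i * (high - low) / steps for i in range(steps + 1)]
        weights = [triangular_membership(a, low, mode, high) for a in grid]
        total = sum(weights)
        if total == 0.0:
            return hurwicz(payoffs, mode)
        return sum(w / total * hurwicz(payoffs, a) for a, w in zip(grid, weights))

    # Illustrative actions with payoffs under three market scenarios.
    actions = {"subcontract": [40, 70, 90], "produce_in_house": [10, 80, 120]}
    fuzzy_alpha = (0.3, 0.5, 0.7)  # assumed fuzzy optimism-pessimism index
    best = max(actions, key=lambda a: expected_hurwicz(actions[a], fuzzy_alpha))
    print(best)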

    Architecture value mapping: using fuzzy cognitive maps as a reasoning mechanism for multi-criteria conceptual design evaluation

    The conceptual design phase is the most critical phase in the systems engineering life cycle. The design concept chosen during this phase determines the structure and behavior of the system, and consequently, its ability to fulfill its intended function. A good conceptual design is the first step in the development of a successful artifact. However, decision-making during conceptual design is inherently challenging and often unreliable. The conceptual design phase is marked by an ambiguous and imprecise set of requirements, and ill-defined system boundaries. A lack of usable data for design evaluation makes the problem worse. In order to assess a system accurately, it is necessary to capture the relationships between its physical attributes and the stakeholders' value objectives. This research presents a novel conceptual architecture evaluation approach that utilizes attribute-value networks, designated as 'Architecture Value Maps', to replicate the decision makers' cogitative processes. Ambiguity in the system's overall objectives is reduced hierarchically to reveal a network of criteria that range from the abstract value measures to the design-specific performance measures. A symbolic representation scheme, the 2-Tuple Linguistic Representation, is used to integrate different types of information into a common computational format, and Fuzzy Cognitive Maps are utilized as the reasoning engine to quantitatively evaluate potential design concepts. A Linguistic Ordered Weighted Average aggregation operator is used to rank the final alternatives based on the decision makers' risk preferences. The proposed methodology provides systems architects with the capability to exploit the interrelationships between a system's design attributes and the value that stakeholders associate with these attributes, in order to design robust, flexible, and affordable systems. --Abstract, page iii
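
    As a rough illustration of the reasoning step, the sketch below shows a generic Fuzzy Cognitive Map iteration (it is not the dissertation's Architecture Value Map implementation; the concepts, weights and activation values are invented): design-attribute concepts influence value-measure concepts through signed weights, and the map is iterated with a sigmoid threshold function until the activations stabilise.

    import math

    def sigmoid(x, lam=1.0):
        return 1.0 / (1.0 + math.exp(-lam * x))

    def run_fcm(weights, state, iterations=50, tol=1e-4):
        """Iterate x_{t+1} = f(x_t + W^T x_t) until the concept activations converge."""
        n = len(state)
        for _ in range(iterations):
            new_state = [
                sigmoid(state[j] + sum(weights[i][j] * state[i] for i in range(n)))
                for j in range(n)
            ]
            if max(abs(a - b) for a, b in zip(new_state, state)) < tol:
                return new_state
            state = new_state
        return state

    # Illustrative 4-concept map: two design attributes driving two value measures.
    # W[i][j] is the causal influence of concept i on concept j, in [-1, 1].
    W = [
        [0.0, 0.0, 0.7, -0.3],  # attribute: redundancy level
        [0.0, 0.0, -0.4, 0.8],  # attribute: component commonality
        [0.0, 0.0, 0.0, 0.0],   # value measure: reliability
        [0.0, 0.0, 0.0, 0.0],   # value measure: affordability
    ]
    initial = [0.9, 0.6, 0.0, 0.0]  # activations for one candidate architecture
    final = run_fcm(W, initial)
    print("reliability=%.2f affordability=%.2f" % (final[2], final[3]))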

    Fuzzy Set Ranking Methods and Multiple Expert Decision Making

    The present report further investigates the multi-criteria decision-making tool named Fuzzy Compromise Programming. A comparison of different fuzzy set ranking methods (required for processing fuzzy information) is performed. A complete sensitivity analysis concerning the decision maker’s risk preferences was carried out for three water resources systems, and compromise solutions were identified. Then, a weight sensitivity analysis was performed on one of the three systems to see whether the rankings would change in response to changing weights. It was found that this particular system was robust to changes in the weights. An inquiry was made into the possibility of modifying Fuzzy Compromise Programming to include the participation of multiple decision makers or experts. This was accomplished by merging a technique known as Group Decision Making Under Fuzziness with Fuzzy Compromise Programming. The modified technique provides support for group decision making under multiple criteria in a fuzzy environment.
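
    For context, the sketch below gives the crisp compromise-programming step that the fuzzy tool builds on (a simplified illustration, not the report's method): each alternative is scored by its weighted L_p distance to the ideal point, and the alternative closest to the ideal is the compromise solution. In the fuzzy variant the criterion values and weights become fuzzy sets, which is why a fuzzy set ranking method is needed. The alternatives, criteria and weights below are invented.

    def compromise_distance(values, ideal, worst, weights, p=2):
        """Weighted L_p distance of one alternative from the ideal point."""
        total = 0.0
        for v, f_star, f_worst, w in zip(values, ideal, worst, weights):
            total += (w * (f_star - v) / (f_star - f_worst)) ** p
        return total ** (1.0 / p)

    # Three hypothetical alternatives scored on three benefit criteria
    # (all normalised so that larger is better).
    alternatives = {
        "A": [0.70, 0.55, 0.80],
        "B": [0.60, 0.90, 0.65],
        "C": [0.85, 0.60, 0.50],
    }
    weights = [0.5, 0.3, 0.2]
    ideal = [max(v[i] for v in alternatives.values()) for i in range(3)]
    worst = [min(v[i] for v in alternatives.values()) for i in range(3)]
    ranking = sorted(alternatives,
                     key=lambda a: compromise_distance(alternatives[a], ideal, worst, weights))
    print(ranking)  # compromise solution (closest to the ideal point) first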

    Cognitive Information Processing

    Contains reports on six research projects. National Institutes of Health (Grant 5 PO1 GM14940-03); National Institutes of Health (Grant 5 PO1 GM15006-03); Joint Services Electronics Programs (U.S. Army, U.S. Navy, and U.S. Air Force) under Contract DA 28-043-AMC-02536(E); National Institutes of Health (Grant 5 TO1 GM01555-03).

    Contention management for distributed data replication

    PhD thesis. Optimistic replication schemes provide distributed applications with access to shared data at lower latencies and greater availability. This is achieved by allowing clients to replicate shared data and execute actions locally. A consequence of this scheme is that it raises issues regarding shared data consistency. Sometimes an action executed by a client produces shared data that conflicts and, as a consequence, subsequent actions caused by the conflicting action may conflict as well. This requires a client to roll back to the action that caused the conflicting data and to execute some exception handling. This can be achieved by relying on the application layer to either ignore or handle shared data inconsistencies when they are discovered during the reconciliation phase of an optimistic protocol. Inconsistency of shared data has an impact on the causality relationship across client actions. In protocol design, it is desirable to preserve the property of causality between different actions occurring across a distributed application. Without application-level knowledge, we assume an action causes all subsequent actions at the same client. With application knowledge, we can significantly ease the protocol burden of providing causal ordering, as we can identify which actions do not cause other actions (even if they precede them). This, in turn, enables a client to roll back to past actions and change them without having to alter subsequent actions. Unfortunately, an increased number of application-level causal relations between actions leads to significant protocol overhead. Therefore, minimizing the rollback associated with conflicting actions, while preserving causality, is desirable because it lowers the amount of exception handling in the application layer. In this thesis, we present a framework that utilizes causality to create a scheduler that can inform a contention management scheme to reduce the rollback associated with conflicting access to shared data. Our framework uses a backoff contention management scheme to preserve causality for those optimistic replication systems with high causality requirements, without the need for application-layer knowledge. We present experiments which demonstrate that our framework reduces clients’ rollback and, more importantly, that the overall throughput of the system is improved when the contention management scheme is used with applications that require causality to be preserved across all actions.
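
    As a rough sketch of the ingredients described above (not the thesis framework; the Action type, the reads-from notion of dependency and the parameters are invented for illustration), the code below rolls back only the actions that causally depend on a conflicting action and applies randomised exponential backoff before a conflicting action is retried.

    import random
    import time

    class Action:
        def __init__(self, client, item, reads_from=None):
            self.client = client          # client that executed the action
            self.item = item              # shared data item it touched
            self.reads_from = reads_from  # earlier action whose result it observed, if any

    def depends_on(action, other):
        """Application-level causality: follow the reads-from chain."""
        cur = action.reads_from
        while cur is not None:
            if cur is other:
                return True
            cur = cur.reads_from
        return False

    def rollback_set(log, conflicting):
        """Roll back only the actions that causally depend on the conflicting one."""
        return [a for a in log if depends_on(a, conflicting)]

    def backoff(attempt, base=0.01, cap=1.0):
        """Randomised exponential backoff before retrying a conflicting action."""
        delay = min(cap, base * (2 ** attempt)) * random.random()
        time.sleep(delay)
        return delay

    a1 = Action("c1", "x")
    a2 = Action("c1", "y", reads_from=a1)  # causally depends on a1
    a3 = Action("c1", "z")                 # independent of a1
    print([a.item for a in rollback_set([a2, a3], a1)])  # only 'y' must be rolled back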

    Cross Entropy-based Analysis of Spacecraft Control Systems

    Space missions increasingly require sophisticated guidance, navigation and control algorithms, the development of which is reliant on verification and validation (V&V) techniques to ensure mission safety and success. A crucial element of V&V is the assessment of control system robust performance in the presence of uncertainty. In addition to estimating average performance under uncertainty, it is critical to determine the worst case performance. Industrial V&V approaches typically employ mu-analysis in the early control design stages, and Monte Carlo simulations on high-fidelity full engineering simulators at advanced stages of the design cycle. While highly capable, such techniques present a critical gap between pessimistic worst case estimates found using analytical methods, and the optimistic outlook often presented by Monte Carlo runs. Conservative worst case estimates are problematic because they can demand a controller redesign procedure, which is not justified if the poor performance is unlikely to occur. Gaining insight into the probability associated with the worst case performance is valuable in bridging this gap. It should be noted that, due to the complexity of industrial-scale systems, V&V techniques must be capable of efficiently analysing non-linear models in the presence of significant uncertainty, and they must also be computationally tractable. It is desirable that such techniques demand little engineering effort before each analysis, so they can be applied widely in industrial systems. Motivated by these factors, this thesis proposes and develops an efficient algorithm based on the cross entropy simulation method. The proposed algorithm efficiently estimates the probabilities associated with various performance levels, from nominal performance up to degraded performance values, resulting in a curve of probabilities associated with various performance values. Such a curve is termed the probability profile of performance (PPoP), and is introduced as a tool that offers insight into a control system's performance, principally the probability associated with the worst case performance. The cross entropy-based robust performance analysis is implemented here on various industrial systems in European Space Agency-funded research projects. The implementation on autonomous rendezvous and docking models for the Mars Sample Return mission constitutes the core of the thesis. The proposed technique is also implemented on high-fidelity models of the Vega launcher, as well as on a generic long coasting launcher upper stage. In summary, this thesis (a) develops an algorithm based on the cross entropy simulation method to estimate the probability associated with the worst case, (b) proposes the cross entropy-based PPoP tool to gain insight into system performance, (c) presents results of the robust performance analysis of three space industry systems using the proposed technique in conjunction with existing methods, and (d) proposes an integrated template for conducting robust performance analysis of linearised aerospace systems.
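
    To illustrate the general cross-entropy idea in the simplest possible setting (a generic textbook-style sketch, not the thesis algorithm: the scalar uncertain parameter, the toy performance function and all tuning constants are assumptions), the code below tilts a Gaussian sampling distribution toward the region where a performance index exceeds a degraded level, then estimates the exceedance probability with importance-sampling likelihood ratios. Repeating this for several levels yields one point per level of a probability-versus-performance curve, the building block behind a probability profile of performance.

    import math
    import random

    def perf(x):
        """Toy performance index: grows monotonically with the uncertain parameter."""
        return math.exp(0.8 * x)

    def normal_pdf(x, mu, sigma):
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    def ce_probability(gamma, mu0=0.0, sigma0=1.0, n=2000, rho=0.1, iters=20):
        """Cross-entropy estimate of P(perf(X) >= gamma) for X ~ N(mu0, sigma0^2)."""
        mu, sigma = mu0, sigma0
        for _ in range(iters):
            xs = [random.gauss(mu, sigma) for _ in range(n)]
            scores = sorted((perf(x) for x in xs), reverse=True)
            level = min(gamma, scores[int(rho * n)])  # elite threshold for this iteration
            elite = [x for x in xs if perf(x) >= level]
            # Likelihood-ratio-weighted update of the sampling mean and spread.
            w = [normal_pdf(x, mu0, sigma0) / normal_pdf(x, mu, sigma) for x in elite]
            mu = sum(wi * x for wi, x in zip(w, elite)) / sum(w)
            sigma = math.sqrt(sum(wi * (x - mu) ** 2 for wi, x in zip(w, elite)) / sum(w))
            if level >= gamma:
                break
        # Final importance-sampling estimate under the tilted distribution.
        xs = [random.gauss(mu, sigma) for _ in range(n)]
        return sum(
            normal_pdf(x, mu0, sigma0) / normal_pdf(x, mu, sigma)
            for x in xs if perf(x) >= gamma
        ) / n

    # One probability per performance level: points of a probability profile.
    for g in (5.0, 10.0, 20.0):
        print(g, ce_probability(g))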

    A fuzzy hierarchical decision model and its application in networking datacenters and in infrastructure acquisitions and design

    According to several studies, an inordinate number of major business decisions to acquire, design, plan, and implement networking infrastructures fail. A networking infrastructure is a collaborative group of telecommunications systems providing services needed for a firm's operations and business growth. The analytical hierarchy process (AHP) is a well-established decision-making process used to analyze decisions related to networking infrastructures. AHP is concerned with decomposing complex decisions into a set of factors and solutions. However, AHP has difficulties in handling uncertainty in decision information. This study addressed the research question of how to resolve these AHP deficiencies. The solution was accomplished through the development of a model capable of handling decisions with incomplete information and an uncertain decision operating environment. This model is based on the AHP framework and fuzzy sets theory. Fuzzy sets are sets whose memberships are gradual. A member of a fuzzy set may have a strong, weak, or moderate membership. The methodology for this study was based primarily on the analytical research design method, which is neither quantitative nor qualitative, but based on mathematical concepts, proofs, and logic. The model's constructs were verified by a simulated practical case study based on current literature and the input of networking experts. To further verify the research objectives, the investigator developed, tested, and validated a software platform. The results showed tangible improvements in analyzing complex networking infrastructure decisions. The ability of this model to analyze decisions with incomplete information and an uncertain economic outlook can be employed in socially important areas such as renewable energy, forest management, and environmental studies to achieve large savings.
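
    For orientation, the sketch below shows the classic crisp AHP step that such a model extends (a generic illustration, not the dissertation's fuzzy model; the criteria and pairwise judgments are invented): priorities are derived from a reciprocal pairwise comparison matrix with the geometric-mean method. In the fuzzy hierarchical model, the crisp ratios would be replaced by fuzzy numbers so that incomplete or uncertain judgments can be expressed.

    from math import prod

    def ahp_priorities(pairwise):
        """Geometric-mean priority vector of a reciprocal pairwise comparison matrix."""
        n = len(pairwise)
        geo = [prod(row) ** (1.0 / n) for row in pairwise]
        total = sum(geo)
        return [g / total for g in geo]

    # Pairwise judgments for three hypothetical networking-infrastructure criteria:
    # cost, scalability, reliability; matrix[i][j] says how strongly i dominates j.
    matrix = [
        [1.0, 3.0, 0.5],
        [1.0 / 3.0, 1.0, 0.25],
        [2.0, 4.0, 1.0],
    ]
    print(ahp_priorities(matrix))  # reliability receives the largest weight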

    A Bayesian approach to robust identification: application to fault detection

    In the Control Engineering field, the so-called Robust Identification techniques deal with the problem of obtaining not only a nominal model of the plant, but also an estimate of the uncertainty associated with the nominal model. Such a model of uncertainty is typically characterized as a region in the parameter space or as an uncertainty band around the frequency response of the nominal model. Uncertainty models have been widely used in the design of robust controllers and, recently, their use in model-based fault detection procedures is increasing. In this latter case, consistency between new measurements and the uncertainty region is checked. When an inconsistency is found, the existence of a fault is decided. There exist two main approaches to the modeling of model uncertainty: the deterministic/worst-case methods and the stochastic/probabilistic methods. At present, there are a number of different methods, e.g., model error modeling, set-membership identification and non-stationary stochastic embedding. In this dissertation we summarize the main procedures and illustrate their results by means of several examples from the literature. As a contribution, we propose a Bayesian methodology to solve the robust identification problem. The approach is highly unifying since many robust identification techniques can be interpreted as particular cases of the Bayesian framework. Also, the methodology can deal with non-linear structures such as the ones derived from the use of observers. The obtained Bayesian uncertainty models are used to detect faults in a quadruple-tank process and in a three-bladed wind turbine.
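
    As a toy illustration of the consistency-checking idea (a minimal sketch, not the dissertation's methodology; the single-parameter static model, the noise level and the 3-sigma test are assumptions), the code below computes a Gaussian posterior over the parameter of y = theta*u + e from healthy-plant data and flags a fault when a new measurement falls outside the posterior predictive band.

    import math

    def posterior(us, ys, noise_std=0.1, prior_mean=0.0, prior_std=10.0):
        """Conjugate Gaussian posterior over theta for the model y = theta*u + e."""
        prec = 1.0 / prior_std ** 2 + sum(u * u for u in us) / noise_std ** 2
        mean = (prior_mean / prior_std ** 2 +
                sum(u * y for u, y in zip(us, ys)) / noise_std ** 2) / prec
        return mean, math.sqrt(1.0 / prec)

    def is_faulty(u_new, y_new, theta_mean, theta_std, noise_std=0.1, k=3.0):
        """Flag a fault if the new measurement leaves the k-sigma predictive band."""
        pred_mean = theta_mean * u_new
        pred_std = math.sqrt(noise_std ** 2 + (u_new * theta_std) ** 2)
        return abs(y_new - pred_mean) > k * pred_std

    # Identification data from the healthy plant (illustrative; true theta is about 2).
    us = [0.5, 1.0, 1.5, 2.0]
    ys = [1.02, 1.97, 3.05, 3.98]
    m, s = posterior(us, ys)
    print(is_faulty(1.0, 2.05, m, s))  # consistent measurement -> False
    print(is_faulty(1.0, 3.20, m, s))  # inconsistent measurement -> True (possible fault)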