
    An overview of decision table literature 1982-1995.

    This report gives an overview of the literature on decision tables over the past 15 years. As far as possible, each reference is accompanied by an author-supplied abstract, a number of keywords and a classification. In some cases our own comments are added; their purpose is to show where, how and why decision tables are used. The literature is classified according to application area, theoretical versus practical character, year of publication, country of origin (not necessarily country of publication) and the language of the document. After a description of the scope of the overview, the classification results and the classification by topic are presented. The main body of the paper is the ordered list of publications with abstract, classification and comments.
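
    Since the report surveys where and how decision tables are used, a minimal sketch of a limited-entry decision table as a lookup structure may help fix the idea; the condition and action names below are illustrative assumptions, not taken from the report.

```python
# Minimal sketch of a limited-entry decision table: each rule maps a
# combination of condition outcomes (True/False) to a list of actions.
# Condition and action names are illustrative, not taken from the report.

from typing import Dict, List, Tuple

# Conditions evaluated for an incoming order (hypothetical example).
CONDITIONS = ("customer_is_known", "credit_ok", "amount_over_limit")

# Rules: condition outcome tuple -> actions to execute.
RULES: Dict[Tuple[bool, bool, bool], List[str]] = {
    (True,  True,  False): ["approve_order"],
    (True,  True,  True):  ["approve_order", "notify_manager"],
    (True,  False, False): ["request_prepayment"],
    (False, False, False): ["reject_order"],
}

def decide(facts: Dict[str, bool]) -> List[str]:
    """Look up the actions for the observed condition outcomes."""
    key = tuple(facts[c] for c in CONDITIONS)
    return RULES.get(key, ["refer_to_clerk"])  # default for uncovered rules

print(decide({"customer_is_known": True, "credit_ok": True, "amount_over_limit": False}))
```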

    Dependence in probabilistic modeling, Dempster-Shafer theory, and probability bounds analysis.


    Context classification for service robots

    This dissertation presents a solution for environment sensing using sensor fusion techniques and a context/environment classification of a service robot's surroundings, so that the robot can change its behavior according to the different reasoning outputs. For example, if the robot knows it is outdoors in a field environment, the ground may be sandy, in which case it should slow down; in indoor environments, by contrast, sandy ground is statistically unlikely. This simple assumption shows the importance of context awareness in automated guided vehicles.
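
    As a hedged illustration of the behavior adaptation described above, the sketch below maps a crude indoor/outdoor classification, fused from a few sensor cues, to a speed limit; the sensor names, weights and thresholds are assumptions, not the dissertation's actual fusion method.

```python
# Hypothetical sketch: fuse simple sensor evidence into an indoor/outdoor
# context estimate and adapt the robot's speed limit accordingly.
# Sensor cues, weights and thresholds are illustrative assumptions.

def classify_context(gps_fix_quality: float, light_level: float, wifi_ap_count: int) -> str:
    """Very small weighted vote: a higher score favours 'outdoor'."""
    score = 0.0
    score += 0.5 * gps_fix_quality                        # good GPS fix suggests outdoors
    score += 0.3 * light_level                            # bright ambient light suggests outdoors
    score += 0.2 * (1.0 if wifi_ap_count < 3 else 0.0)    # few Wi-Fi APs suggests outdoors
    return "outdoor" if score > 0.5 else "indoor"

def speed_limit(context: str) -> float:
    """Slow down outdoors, where sandy or uneven ground is more likely (values in m/s, illustrative)."""
    return 0.4 if context == "outdoor" else 1.0

ctx = classify_context(gps_fix_quality=0.9, light_level=0.8, wifi_ap_count=1)
print(ctx, speed_limit(ctx))
```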

    Reliable visual analytics, a prerequisite for outcome assessment of engineering systems

    Various evaluation approaches exist for multi-purpose visual analytics (VA) frameworks. They are based on empirical studies in information visualization or on community activities, for example the VA Science and Technology Challenge (2006-2014), created as a community evaluation resource to “decide upon the right metrics to use, and the appropriate implementation of those metrics including datasets and evaluators”. In this paper, we propose to use evaluated VA environments for computer-based processes or systems with the main goal of aligning user plans, system models and software results. For this purpose, trust in the VA outcome must be established, which can be done by following the (meta-)design principles of a human-centered verification and validation assessment and also in dependence on users’ task models and interaction styles, since the possibility to work with the visualization interactively is an integral part of VA. To define reliable VA, we point out various dimensions of reliability along with their quality criteria, requirements, attributes and metrics. Several software packages are used to illustrate the concepts.
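
    As a rough sketch of how reliability dimensions with criteria and metrics could be recorded for a VA pipeline, the snippet below defines a small record type; the dimension names, criteria and metrics are illustrative assumptions, not the authors' taxonomy.

```python
# Illustrative sketch of recording reliability dimensions of a VA pipeline,
# each with quality criteria and a metric; names and values are assumptions,
# not the paper's actual taxonomy.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ReliabilityDimension:
    name: str
    criteria: List[str]
    metric: Callable[[dict], float]   # maps evaluation data to a score in [0, 1]

dimensions = [
    ReliabilityDimension(
        name="numerical_accuracy",
        criteria=["verified solver", "error bounds reported"],
        metric=lambda d: 1.0 - min(1.0, d.get("relative_error", 1.0)),
    ),
    ReliabilityDimension(
        name="interaction_latency",
        criteria=["interactive response under 100 ms"],
        metric=lambda d: 1.0 if d.get("latency_ms", 1e9) < 100 else 0.0,
    ),
]

evaluation = {"relative_error": 0.02, "latency_ms": 80}
for dim in dimensions:
    print(dim.name, round(dim.metric(evaluation), 2))
```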

    Focusing ATMS Problem-Solving: A Formal Approach

    The Assumption-based Truth Maintenance System (ATMS) is a general and powerful problem-solving tool in AI. Unfortunately, its generality usually entails a high computational cost. In this paper, we study how a general notion of cost function can be incorporated into the design of an algorithm for focusing the ATMS, called BF-ATMS. The BF-ATMS algorithm explores a search space of size polynomial in the number of assumptions, even for problems that are proven to have exponential-size labels. Experimental results indicate significant speedups over the standard ATMS for such problems. In addition to its improved efficiency, the BF-ATMS algorithm retains the multiple-context capability of an ATMS and the important properties of consistency, minimality and soundness, as well as bounded completeness. The usefulness of the new algorithm is demonstrated by its application to the task of consistency-based diagnosis, where dramatic efficiency improvements over the standard solution technique are obtained.
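
    The abstract describes focusing the ATMS with a general cost function; the sketch below shows a best-first exploration of environments (assumption sets) ordered by such a cost function. The cost function, consistency check and goal test are illustrative stand-ins, not the actual BF-ATMS machinery.

```python
# Hedged sketch of cost-guided (best-first) exploration of ATMS environments:
# expand assumption sets in order of increasing cost and stop at the first
# consistent environment that supports the goal. The cost function, the
# consistency check and the goal test are illustrative stand-ins.

import heapq
from itertools import count
from typing import Callable, FrozenSet, Iterable, Optional

def focused_search(
    assumptions: Iterable[str],
    cost: Callable[[FrozenSet[str]], float],
    consistent: Callable[[FrozenSet[str]], bool],
    supports_goal: Callable[[FrozenSet[str]], bool],
) -> Optional[FrozenSet[str]]:
    assumptions = list(assumptions)
    tie = count()                           # tie-breaker so heapq never compares sets
    frontier = [(cost(frozenset()), next(tie), frozenset())]
    seen = set()
    while frontier:
        _, _, env = heapq.heappop(frontier)
        if env in seen or not consistent(env):
            continue
        seen.add(env)
        if supports_goal(env):
            return env                      # cheapest supporting environment found
        for a in assumptions:               # grow the environment one assumption at a time
            if a not in env:
                child = env | {a}
                heapq.heappush(frontier, (cost(child), next(tie), child))
    return None

# Toy usage: the goal needs assumptions a and b; c conflicts with a.
env = focused_search(
    ["a", "b", "c"],
    cost=len,                                    # cheaper = fewer assumptions
    consistent=lambda e: not {"a", "c"} <= e,
    supports_goal=lambda e: {"a", "b"} <= e,
)
print(env)  # frozenset({'a', 'b'})
```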

    A step towards understanding paper documents

    This report focuses on the analysis steps necessary for paper document processing. It is divided into three major parts: document image preprocessing, knowledge-based geometric classification of the image, and expectation-driven text recognition. It first illustrates the low-level image processing procedures that provide the physical document structure of a scanned document image. Furthermore, it describes a knowledge-based approach developed for the identification of logical objects (e.g., the sender or the footnote of a letter) in a document image. The logical identifiers allow a context-restricted consideration of the text they contain. By using specific logical dictionaries, expectation-driven text recognition makes it possible to identify text parts of specific interest. The system has been implemented for the analysis of single-sided business letters in Common Lisp on a SUN 3/60 workstation. It runs on a large population of different letters. The report also illustrates and discusses examples of typical results obtained by the system.
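
    A minimal sketch of the three-stage pipeline described above (preprocessing, knowledge-based classification of layout objects, expectation-driven recognition) follows; all functions are placeholders that only mirror the structure of the system, not its implementation.

```python
# Illustrative sketch of the three analysis stages described above:
# image preprocessing, knowledge-based classification of layout objects,
# and expectation-driven recognition restricted by a logical dictionary.
# All functions are placeholders mirroring the structure of the system.

def preprocess(image):
    """Binarize, deskew and segment the scanned page into layout blocks (stubbed)."""
    return [{"bbox": (50, 40, 400, 90)}]                     # physical structure

def classify_blocks(blocks):
    """Assign logical labels (e.g. 'sender', 'body') using simple layout knowledge."""
    for block in blocks:
        block["logical_label"] = "sender" if block["bbox"][1] < 100 else "body"
    return blocks

def recognize(blocks, dictionaries):
    """Placeholder recognition: report the candidate words allowed for each logical label."""
    return {b["logical_label"]: dictionaries.get(b["logical_label"], ["<unknown>"]) for b in blocks}

blocks = classify_blocks(preprocess(image=None))
print(recognize(blocks, {"sender": ["ACME Corp."]}))
```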

    Developing a simple yet rigorous approach for operational risk management for small vessels

    Fishing is seen as one of the most dangerous occupations in the world, and the people affected by accidents at sea are often among the poorest in society, as found by the International Labour Organization (ILO). About 95% of fishers worldwide are small-scale fishers, and it is estimated that as much as 40% of global landings come from small-scale fisheries, according to recent studies conducted by the Food and Agriculture Organization (FAO) in partnership with Duke University and WorldFish. Some past studies have documented fishing accidents and spelled out various hazards and consequences, including injury, vessel damage and loss, and death. There is, however, limited information on the national and global ranking of these hazards and consequences that would help identify the patterns associated with the risk, which makes it difficult to target training resources at the most probable occurrences. It is therefore essential to study and assess the interactions among the influential risk factors and the management strategies that can be employed to mitigate their impacts and improve training. This research seeks to develop a simple but rigorous operational risk modelling and management approach for small vessels used in fishing and transportation. A comprehensive probabilistic analysis was required to propose a simple, applicable method for analyzing the risk causal factors of small fishing vessel operations. This was followed by the development of an operational risk model for small fishing vessels. The model was further analyzed with expert data, along with secondary data from the literature, using a hybrid quantitative model for operational risk. In completing the research study, an operational risk management approach for small fishing vessels is proposed using the cost per unit risk reduction (CURR) model to select a risk control option. Several small fishing vessel accident events were attributed to operator error, vessel factors and environmental factors. Based on the findings of the research, it is recommended that a combination of administrative and personal protective equipment control measures be adopted by the stakeholders.
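
    The cost per unit risk reduction (CURR) criterion mentioned above can be illustrated with a small computation; the control options, costs and risk values below are invented for illustration and do not come from the study.

```python
# Hedged illustration of selecting a risk control option by cost per unit
# risk reduction: CURR = cost / (baseline_risk - residual_risk).
# The options, costs and risk values below are invented for illustration.

baseline_risk = 0.20          # probability-weighted loss before any control

options = {
    "life_jacket_enforcement": {"cost": 5_000.0,  "residual_risk": 0.12},
    "weather_alert_training":  {"cost": 8_000.0,  "residual_risk": 0.10},
    "hull_reinforcement":      {"cost": 40_000.0, "residual_risk": 0.09},
}

def curr(cost: float, residual: float) -> float:
    reduction = baseline_risk - residual
    return float("inf") if reduction <= 0 else cost / reduction

ranked = sorted(options.items(), key=lambda kv: curr(kv[1]["cost"], kv[1]["residual_risk"]))
for name, o in ranked:
    print(f"{name}: CURR = {curr(o['cost'], o['residual_risk']):,.0f}")
# The option with the lowest CURR buys a unit of risk reduction most cheaply.
```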

    Resilient Monitoring and Control Systems: Design, Analysis, and Performance Evaluation.

    Critical infrastructure systems (i.e., power plants, transportation networks, chemical plants, etc.) and their sensor networks are vulnerable to cyber-physical attacks. Cyber-attacks refer to the malicious manipulation of the sensor data, while physical attacks refer to the intentional damage of the plant components, by an adversary. The goal of this dissertation is to develop monitoring and control systems that are resilient to these attacks. The monitoring system is termed resilient if it provides the least uncertain process variable estimates and plant condition assessment. The control system is termed resilient if it identifies the attacked actuators and generates the best possible control signals (in terms of the largest probability of maintaining the process variables in the desired range). The resilient monitoring system (RMS) developed in this research consists of five layers: Data quality acquisition, process variable assessment, plant condition assessment, sensor network adaptation, and decentralized knowledge fusion. The techniques involved in each of these layers are rigorously analyzed and are shown to identify the plant condition in a reliable and timely manner. The RMS is applied to a power plant model, and its performance is evaluated under several cyber-physical attack scenarios. The measure of resiliency is quantified using Kullback-Leibler divergence and is shown to be high in all scenarios considered. The resilient control system (RCS) is developed based on two approaches: Model predictive control (MPC)-based approach and synchronous detection (SD)-based approach. In the MPC approach, a control input is calculated using the information provided by the RMS. The goal here is to steer the process variable to the desired value, while ensuring that it always remains within a safe domain. In the SD approach, the condition of the sensor and actuator is assessed using the method of synchronous detection. Then, the controller is modified so that the effects of the attacks are eliminated. Using simulations, it is shown that both these approaches are viable for the design of RCS. Thus, the main contribution of this research is in providing the theoretical foundation for the design of RMS and RCS applicable to critical infrastructures that are characterized by complex interactions of process variables.
    PhD, Electrical Engineering: Systems. University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/113431/1/marutrav_1.pd
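
    The resiliency measure based on Kullback-Leibler divergence can be sketched as comparing an assessed plant-condition distribution with the true one; the condition labels and probability values below are hypothetical, not the dissertation's case-study numbers.

```python
# Hedged sketch: quantify how close the assessed plant-condition probabilities
# are to the true condition using Kullback-Leibler divergence. The condition
# labels and probability values are hypothetical.

import math

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i * log(p_i / q_i), with 0*log(0) taken as 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Plant conditions: [normal, under_cyber_attack, physically_damaged]
true_condition   = [1.00, 0.00, 0.00]   # the plant is actually normal
assessment_rms   = [0.90, 0.07, 0.03]   # assessment by the monitoring system
assessment_naive = [0.40, 0.35, 0.25]   # assessment that trusts attacked sensors

# Smaller divergence from the true condition indicates higher resiliency.
print(kl_divergence(true_condition, assessment_rms))    # ~0.105
print(kl_divergence(true_condition, assessment_naive))  # ~0.916
```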

    Advances and Applications of Dezert-Smarandache Theory (DSmT) for Information Fusion (Collected Works), Vol. 4

    The fourth volume on Advances and Applications of Dezert-Smarandache Theory (DSmT) for information fusion collects theoretical and applied contributions of researchers working in different fields of applications and in mathematics. The contributions (see the list of articles published in this book, at the end of the volume) have been published or presented after the dissemination of the third volume (2009, http://fs.unm.edu/DSmT-book3.pdf) in international conferences, seminars, workshops and journals. The first part of this book presents the theoretical advancement of DSmT, dealing with belief functions, conditioning and deconditioning, the Analytic Hierarchy Process, decision making, multi-criteria analysis, evidence theory, combination rules, evidence distance, conflicting belief, sources of evidence with different importances and reliabilities, importance of sources, the pignistic probability transformation, qualitative reasoning under uncertainty, imprecise belief structures, the 2-tuple linguistic label, the Electre Tri method, hierarchical proportional redistribution, basic belief assignment, subjective probability measure, Smarandache codification, neutrosophic logic, outranking methods, Dempster-Shafer theory, the Bayes fusion rule, frequentist probability, mean square error, controlling factor, optimal assignment solution, data association, the Transferable Belief Model, and others. More applications of DSmT have emerged in recent years since the appearance of the third DSmT book in 2009. Accordingly, the second part of this volume is about applications of DSmT in connection with electronic support measures, belief functions, sensor networks, ground moving target and multiple target tracking, vehicle-borne improvised explosive devices, the belief interacting multiple model filter, seismic and acoustic sensors, support vector machines, alarm classification, the ability of the human visual system, the Uncertainty Representation and Reasoning Evaluation Framework, threat assessment, handwritten signature verification, automatic aircraft recognition, dynamic data-driven application systems, adjustment of secure communication trust analysis, and so on. Finally, the third part presents a list of references related to DSmT, published or presented over the years since its inception in 2004, in chronological order.
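
    As a baseline for the combination rules discussed in the book, the sketch below implements classical Dempster-Shafer combination on a small frame of discernment; the mass values are invented, and DSmT's PCR rules redistribute the conflicting mass differently rather than normalizing it away.

```python
# Baseline sketch: Dempster's rule of combination on a small frame of
# discernment, as a point of comparison for the DSmT rules discussed in the
# book. Mass assignments are invented; DSmT's PCR rules handle the
# conflicting mass by proportional redistribution instead of normalization.

from itertools import product

def dempster_combine(m1: dict, m2: dict) -> dict:
    """Combine two basic belief assignments keyed by frozenset focal elements."""
    combined, conflict = {}, 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2                      # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

A, B = frozenset({"A"}), frozenset({"B"})
theta = frozenset({"A", "B"})                        # the whole frame
m1 = {A: 0.6, theta: 0.4}
m2 = {B: 0.7, theta: 0.3}
print(dempster_combine(m1, m2))
```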