5,689 research outputs found

    A layered abduction model of perception: Integrating bottom-up and top-down processing in a multi-sense agent

    A layered-abduction model of perception is presented which unifies bottom-up and top-down processing in a single logical and information-processing framework. The process of interpreting the input from each sense is broken down into discrete layers of interpretation, where at each layer a best-explanation hypothesis is formed of the data presented by the layer or layers below, with the help of information available laterally and from above. The formation of this hypothesis is treated as a problem of abductive inference, similar to diagnosis and theory formation. Thus this model brings a knowledge-based problem-solving approach to the analysis of perception, treating perception as a kind of compiled cognition. The bottom-up passing of information from layer to layer defines channels of information flow, which separate and converge in a specific way for any specific sense modality. Multi-modal perception occurs where channels converge from more than one sense. This model has not yet been implemented, though it is based on systems which have been successful in medical and mechanical diagnosis and medical test interpretation.
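The abstract describes an architecture, not an implementation (the paper itself reports none). As a rough illustration of the core idea, the sketch below shows each layer selecting the hypothesis that best explains the output of the layer below. All layer names, hypotheses, features, and the matching score are invented for this example.

```python
def best_explanation(data, hypotheses):
    """Pick the hypothesis whose predicted data best overlaps the input:
    a crude stand-in for abductive 'inference to the best explanation'."""
    def score(hyp):
        return len(set(hyp["explains"]) & set(data))
    return max(hypotheses, key=score)

# Two toy layers of a single (auditory) channel, bottom to top.
LAYERS = [
    {"name": "phoneme", "hypotheses": [
        {"label": "/k/", "explains": ["burst", "high-freq"]},
        {"label": "/a/", "explains": ["voiced", "low-freq"]},
    ]},
    {"name": "word", "hypotheses": [
        {"label": "cat", "explains": ["/k/", "/a/", "/t/"]},
        {"label": "at",  "explains": ["/a/", "/t/"]},
    ]},
]

def interpret(raw_features):
    """Bottom-up pass: each layer abduces a hypothesis from the data
    below and passes the accepted label upward as data for the next layer."""
    data = raw_features
    trace = []
    for layer in LAYERS:
        hyp = best_explanation(data, layer["hypotheses"])
        trace.append((layer["name"], hyp["label"]))
        data = [hyp["label"]]
    return trace

print(interpret(["burst", "high-freq"]))
```

In the model as described, lateral and top-down information would also feed each layer's choice; this sketch keeps only the bottom-up channel for brevity.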

    Some thoughts on theoretical physics

    Some thoughts are presented on the inter-relation between beauty and truth in science in general and theoretical physics in particular. Some conjectural procedures that can be used to create new ideas, concepts and results are illustrated in both Boltzmann-Gibbs and nonextensive statistical mechanics. The sociological components of scientific progress and its unavoidable and beneficial controversies are, mainly through existing literary texts, briefly addressed as well.
    Comment: Short essay based on the plenary talk given at the International Workshop on Trends and Perspectives in Extensive and Non-Extensive Statistical Mechanics, held November 19-21, 2003, in Angra dos Reis, Brazil. To appear in a Physica A special volume (2004) edited by E.M.F. Curado, H.J. Herrmann and M. Barbosa. 23 pages, including 3 figures. The new version has 25 pages and the same figures. The texts by Saramago and by Bersanelli are now translated into English. A few typos and minor improvements are included as well.

    Space exploration: The interstellar goal and Titan demonstration

    Automated interstellar space exploration is reviewed. The Titan demonstration mission is discussed. Remote sensing and automated modeling are considered. Nuclear electric propulsion, main orbiting spacecraft, lander/rover, subsatellites, atmospheric probes, powered air vehicles, and a surface science network comprise mission component concepts. Machine intelligence in space exploration is discussed.

    Complexity of Non-Monotonic Logics

    Over the past few decades, non-monotonic reasoning has developed into one of the most important topics in computational logic and artificial intelligence. Different ways to introduce non-monotonic aspects into classical logic have been considered, e.g., extension with default rules, extension with modal belief operators, or modification of the semantics. In this survey we consider a logical formalism from each of these possibilities, namely Reiter's default logic, Moore's autoepistemic logic and McCarthy's circumscription. Additionally, we consider abduction, where one is not interested in inferences from a given knowledge base but in computing possible explanations for an observation with respect to a given knowledge base. Complexity results for different reasoning tasks for propositional variants of these logics were already studied in the nineties. In recent years, however, a renewed interest in complexity issues can be observed. One current focal approach is to consider parameterized problems and identify reasonable parameters that allow for FPT algorithms. In another approach, the emphasis lies on identifying fragments, i.e., restrictions of the logical language, that allow more efficient algorithms for the most important reasoning tasks. In this survey we focus on this second aspect. We describe complexity results for fragments of logical languages obtained either by restricting the allowed set of operators (e.g., by forbidding negations one might consider only monotone formulae) or by considering only formulae in conjunctive normal form but with generalized clause types. The algorithmic problems we consider are suitable variants of satisfiability and implication in each of the logics, but also counting problems, where one is not only interested in the existence of certain objects (e.g., models of a formula) but asks for their number.
    Comment: To appear in Bulletin of the EATCS
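The abduction task mentioned in the abstract can be made concrete with a small brute-force sketch: given a propositional knowledge base KB (in CNF), a set of candidate hypotheses H, and an observation m, an explanation is a subset E of H such that KB together with E is consistent and entails m. The clause encoding, the subset-minimality filter, and the rain/sprinkler example below are illustrative choices, not the survey's definitions.

```python
from itertools import combinations, product

def models(clauses, variables):
    """Yield every assignment (dict) satisfying a CNF clause set.
    A clause is a set of literal strings: 'x' positive, '-x' negated."""
    for bits in product([False, True], repeat=len(variables)):
        assign = dict(zip(variables, bits))
        if all(any(assign[l] if not l.startswith("-") else not assign[l[1:]]
                   for l in clause)
               for clause in clauses):
            yield assign

def explanations(kb, hyps, obs, variables):
    """Subset-minimal E ⊆ hyps with KB ∪ E consistent and KB ∪ E ⊨ obs."""
    found = []
    for k in range(len(hyps) + 1):
        for e in combinations(hyps, k):
            if any(frozenset(f) <= frozenset(e) for f in found):
                continue  # skip supersets of an explanation already found
            theory = kb + [{h} for h in e]  # add hypotheses as unit clauses
            ms = list(models(theory, variables))
            if ms and all(m[obs] for m in ms):  # consistent and entails obs
                found.append(e)
    return found

# Toy KB: rain -> wet, sprinkler -> wet; the observation is 'wet'.
kb = [{"-rain", "wet"}, {"-sprinkler", "wet"}]
print(explanations(kb, ["rain", "sprinkler"], "wet",
                   ["rain", "sprinkler", "wet"]))
```

Enumerating all assignments is exponential in the number of variables, which is exactly why the survey's question of which language fragments admit more efficient algorithms is interesting.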

    Epistemological Foundations for Neuroeconomics

    Neuroeconomics is an emerging field crossing neuroscientific data, the use of brain-imaging tools, experimental and behavioral economics, and an attempt at a better understanding of the cognitive assumptions that underlie theoretical predictive economic models. In this paper the authors try two things: 1) To assess the epistemological biases that affect Neuroeconomics as it is currently done. A number of significant experiments are discussed in that perspective. 2) To imagine an original way - apart from what is already being done - to run experiments in brain-imaging that are relevant to the discussion of rationality assumptions at the core of economic theory.
    Keywords: Neuroeconomics, Rationality Assumptions, Abduction

    A granularity-based framework of deduction, induction, and abduction

    In this paper, we propose a granularity-based framework of deduction, induction, and abduction using variable precision rough set models proposed by Ziarko and measure-based semantics for modal logic proposed by Murai et al. The proposed framework is based on α-level fuzzy measure models on the basis of background knowledge, as described in the paper. In the proposed framework, deduction, induction, and abduction are characterized as reasoning processes based on typical situations about the facts and rules used in these processes. Using variable precision rough set models, we consider β-lower approximation of truth sets of nonmodal sentences as typical situations of the given facts and rules, instead of the truth sets of the sentences as correct representations of the facts and rules. Moreover, we represent deduction, induction, and abduction as relationships between typical situations.
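The β-lower approximation from variable precision rough set models that the abstract relies on has a compact standard definition: an equivalence class is included in the β-lower approximation of a set X when at least a fraction β of its elements lies in X. The sketch below (not the authors' code; the universe and partition are invented) illustrates how relaxing β from 1 admits classes that only mostly fall inside X.

```python
def beta_lower(partition, X, beta):
    """Union of the equivalence classes whose inclusion degree in X,
    |block ∩ X| / |block|, is at least beta."""
    X = set(X)
    lower = set()
    for block in partition:
        degree = len(block & X) / len(block)
        if degree >= beta:
            lower |= block
    return lower

# Universe {1..6} partitioned into equivalence classes by some
# (invented) background knowledge.
partition = [{1, 2}, {3, 4, 5}, {6}]
X = {1, 2, 3, 4}  # truth set of some nonmodal sentence

print(sorted(beta_lower(partition, X, beta=1.0)))  # classical lower approximation
print(sorted(beta_lower(partition, X, beta=0.6)))  # relaxed: {3,4,5} now admitted
```

With beta=1.0 this reduces to the classical Pawlak lower approximation; with beta=0.6 the class {3, 4, 5}, two-thirds of which lies in X, is also counted as a typical situation.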

    Inference, Belief and Interpretation in Science

    This monograph explores the deeply cognitive roots of the human scientific quest. The process of making scientific inferences is continuous with the day-to-day inferential activity of individuals, and is predominantly inductive in nature. Inductive inference, which is fallible, exploratory, and open-ended, is of essential relevance in our incessant efforts at making sense of a complex and uncertain world around us, and covers a vast range of cognitive activities, among which scientific exploration constitutes the pinnacle. Inductive inference has a personal aspect to it, being rooted in the cognitive unconscious of individuals, which has recently been found to be of paramount importance in a wide range of complex cognitive processes. One other major aspect of the process of inference making, including the making of scientific inferences, is the role of a vast web of beliefs lodged in the human mind, as well as of a huge repertoire of heuristics, that constitute an important component of 'unconscious intelligence'. Finally, human cognitive activity is dependent in large measure on emotions and affects that operate mostly at an unconscious level. Of special importance in scientific inferential activity is the process of hypothesis making, which is examined in this book, along with the above aspects of inductive inference, at considerable depth. The book focuses on the inadequacy of the viewpoint of naive realism in understanding the context-dependence of scientific theories, where a cumulative progress towards an ultimate truth about Nature appears to be too simplistic a generalization. It poses a critique of the commonly perceived image of science where it is seen as the last word in logic and objectivity, the latter in the double sense of being independent of individual psychological propensities and, at the same time, approaching a correct understanding of the workings of a mind-independent nature.
    Adopting the naturalist point of view, it examines the essential tension between the cognitive endeavors of individuals and scientific communities, immersed in belief systems and cultures, on the one hand, and the engagement with a mind-independent reality on the other. In the end, science emerges as an interpretation of nature, which is perceived by us only contextually, as successively emerging cross-sections of a limited scope and extent. Successive waves of theory building in science appear as episodic and kaleidoscopic changes in perspective as certain in-built borders are crossed, rather than as a cumulative progress towards some ultimate truth. Based on current literature, I aim to set up, in the form of a plausible hypothesis, a framework for understanding the mechanisms underlying inductive inference in general and abduction in particular.