
    Intelligent flight control systems

    The capabilities of flight control systems can be enhanced by designing them to emulate functions of natural intelligence. Intelligent control functions fall into three categories. Declarative actions involve decision-making, providing models for system monitoring, goal planning, and system/scenario identification. Procedural actions concern skilled behavior and have parallels in guidance, navigation, and adaptation. Reflexive actions are spontaneous, inner-loop responses for control and estimation. Intelligent flight control systems acquire knowledge of the aircraft and its mission and adapt to changes in the flight environment. Cognitive models form an efficient basis for integrating 'outer-loop/inner-loop' control functions and for developing robust parallel-processing algorithms.
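
    The three categories above map naturally onto a layered control architecture. The sketch below is a minimal Python illustration of that decomposition; all class names, control laws, and numbers are hypothetical and not taken from the paper.

```python
# A minimal sketch of the declarative/procedural/reflexive split described
# above. All class names, control laws, and numbers are hypothetical
# illustrations, not taken from the paper.
from dataclasses import dataclass


@dataclass
class AircraftState:
    altitude_m: float
    airspeed_mps: float
    pitch_deg: float


class DeclarativeLayer:
    """Decision-making / goal planning: choose the current mission goal."""

    def __init__(self, goal_altitudes_m):
        self.goal_altitudes_m = list(goal_altitudes_m)

    def current_goal(self) -> float:
        return self.goal_altitudes_m[0] if self.goal_altitudes_m else 0.0


class ProceduralLayer:
    """Guidance / navigation: turn the goal altitude into a pitch target."""

    def pitch_target(self, state: AircraftState, goal_altitude_m: float) -> float:
        error_m = goal_altitude_m - state.altitude_m
        return max(-10.0, min(10.0, 0.01 * error_m))  # clipped proportional guidance


class ReflexiveLayer:
    """Inner-loop control: fast, spontaneous response (a toy P controller)."""

    def __init__(self, gain: float = 0.5):
        self.gain = gain

    def elevator_command(self, pitch_error_deg: float) -> float:
        return self.gain * pitch_error_deg


# One outer-loop/inner-loop pass through the hierarchy.
state = AircraftState(altitude_m=900.0, airspeed_mps=70.0, pitch_deg=2.0)
goal = DeclarativeLayer([1000.0, 1500.0]).current_goal()
pitch_cmd = ProceduralLayer().pitch_target(state, goal)
elevator = ReflexiveLayer().elevator_command(pitch_cmd - state.pitch_deg)
print(f"goal={goal} m, pitch target={pitch_cmd:.2f} deg, elevator={elevator:.2f}")
```

    In a real system each layer would be far richer (learned models, adaptation laws, estimators); the point of the sketch is only the outer-loop/inner-loop separation described in the abstract.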

    Knowledge modelling for the motion detection task

    This article introduces knowledge modelling, at the knowledge level, for the task of detecting moving objects in image sequences. The approach focuses on three items: (1) the convenience of modelling tasks and methods as a library of reusable components, in advance of the operationalization of the primitive inferences; (2) the potential utility of looking to biology for inspiration; and (3) the convenience of using these biologically inspired problem-solving methods (PSMs) to solve motion detection tasks. After a summary of existing methods for the motion detection task, the task of detecting moving targets in indefinite sequences of images is approached by means of the algorithmic lateral inhibition (ALI) PSM. The task is decomposed into four subtasks: (a) thresholded segmentation; (b) motion detection; (c) obtaining silhouette parts; and (d) fusing moving-object silhouettes. For each of these subtasks, the inferential scheme is first obtained and each of its inferences is then operationalized. Finally, some experimental results are presented along with comments on the potential value of our approach.
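
    As a concrete illustration of the four-subtask decomposition (not of the ALI problem-solving method itself), the following sketch pushes a toy two-frame sequence through thresholded segmentation, frame-difference motion detection, blockwise extraction of silhouette parts, and fusion. All function names, thresholds, and block sizes are our own assumptions.

```python
# Minimal sketch of the four-subtask decomposition described above, using
# plain frame differencing in NumPy; it does not implement the ALI method
# itself, and all names and thresholds are illustrative assumptions.
import numpy as np


def thresholded_segmentation(frame: np.ndarray, level: int = 128) -> np.ndarray:
    """Subtask (a): binarize a grey-level frame."""
    return (frame >= level).astype(np.uint8)


def motion_detection(prev_seg: np.ndarray, curr_seg: np.ndarray) -> np.ndarray:
    """Subtask (b): mark pixels whose segmented value changed between frames."""
    return (prev_seg != curr_seg).astype(np.uint8)


def silhouette_parts(motion: np.ndarray, block: int = 8) -> np.ndarray:
    """Subtask (c): keep coarse blocks that contain enough moving pixels."""
    h, w = motion.shape
    parts = np.zeros_like(motion)
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            if motion[y:y + block, x:x + block].sum() > block:
                parts[y:y + block, x:x + block] = 1
    return parts


def fuse_silhouettes(parts_per_frame) -> np.ndarray:
    """Subtask (d): fuse per-frame parts into one moving-object silhouette."""
    return np.clip(sum(parts_per_frame), 0, 1).astype(np.uint8)


# Toy two-frame sequence with a bright square that shifts to the right.
f0 = np.zeros((32, 32), dtype=np.uint8); f0[8:16, 8:16] = 200
f1 = np.zeros((32, 32), dtype=np.uint8); f1[8:16, 12:20] = 200
s0, s1 = thresholded_segmentation(f0), thresholded_segmentation(f1)
silhouette = fuse_silhouettes([silhouette_parts(motion_detection(s0, s1))])
print("moving pixels:", int(silhouette.sum()))
```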

    A Promethean Philosophy of External Technologies, Empiricism, & the Concept: Second-Order Cybernetics, Deep Learning, and Predictive Processing

    Beginning with a survey of the shortcomings of theories of organology/media-as-externalization of mind/body—a philosophical-anthropological tradition that stretches from Plato through Ernst Kapp and finds its contemporary proponent in Bernard Stiegler—I propose that the phenomenological treatment of media as an outpouching and extension of mind qua intentionality is not sufficient to counter the 'black-box' mystification of today's deep learning algorithms. Focusing on a close study of Simondon's On the Existence of Technical Objects and Individuation, I argue that the process-philosophical work of Gilbert Simondon, with its critique of Norbert Wiener's first-order cybernetics, offers a precursor to the conception of second-order cybernetics (as endorsed by Francisco Varela, Humberto Maturana, and Ricardo B. Uribe) and, specifically, its autopoietic treatment of information. It has been argued by those such as Frank Pasquale that neuro-inferential deep learning systems premised on predictive patterning, such as AlphaGo Zero, have a veiled logic and, thus, are 'black boxes'. In detailing a philosophical-historical approach to demystify predictive patterning/processing and the logic of such deep learning algorithms, this paper attempts to shine a light on such systems and their inner workings à la Simondon.

    A Unified Framework for Gradient-based Hyperparameter Optimization and Meta-learning

    Machine learning algorithms and systems are progressively becoming part of our societies, leading to a growing need to build a vast multitude of accurate, reliable, and interpretable models that should, where possible, exploit similarities among tasks. Automating segments of machine learning itself seems a natural step towards delivering increasingly capable systems able to perform well in both the big-data and the few-shot learning regimes. Hyperparameter optimization (HPO) and meta-learning (MTL) constitute two building blocks of this growing effort. We explore these two topics under a unifying perspective, presenting a mathematical framework linked to bilevel programming that captures existing similarities and translates into procedures of practical interest rooted in algorithmic differentiation. We discuss the derivation, applicability, and computational complexity of these methods and establish several approximation properties for a class of objective functions of the underlying bilevel programs. In HPO, these algorithms generalize and extend previous work on gradient-based methods. In MTL, the resulting framework subsumes classic and emerging strategies and provides a starting basis from which to build and analyze novel techniques. A series of examples and numerical simulations offer insight and highlight some limitations of these approaches. Experiments on larger-scale problems show the potential gains of the proposed methods in real-world applications. Finally, we develop two extensions of the basic algorithms: one to optimize a class of discrete hyperparameters (graph edges) in an application to relational learning, and one to tune online learning rate schedules for training neural network models, an old but crucially important issue in machine learning.
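
    The bilevel view can be made concrete with a toy instance: the inner problem is ridge regression, solved in closed form, and the outer problem tunes the regularization strength by descending the exact hypergradient obtained from the implicit derivative of the inner solution. This is an illustrative sketch under those simplifying assumptions, not the algorithms developed in the work.

```python
# Toy gradient-based hyperparameter optimization on a bilevel problem:
# inner problem = ridge regression (closed form), outer problem = tune the
# regularization strength lam against validation loss. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
X_tr, X_val = rng.normal(size=(80, 5)), rng.normal(size=(40, 5))
w_true = rng.normal(size=5)
y_tr = X_tr @ w_true + 0.3 * rng.normal(size=80)
y_val = X_val @ w_true + 0.3 * rng.normal(size=40)


def inner_solution(lam: float) -> np.ndarray:
    """Inner problem: argmin_w ||X_tr w - y_tr||^2 + lam ||w||^2 (closed form)."""
    A = X_tr.T @ X_tr + lam * np.eye(X_tr.shape[1])
    return np.linalg.solve(A, X_tr.T @ y_tr)


def hypergradient(lam: float) -> float:
    """d(validation loss)/d(lam), via the implicit derivative dw*/dlam = -A^{-1} w*."""
    A = X_tr.T @ X_tr + lam * np.eye(X_tr.shape[1])
    w_star = np.linalg.solve(A, X_tr.T @ y_tr)
    dw_dlam = -np.linalg.solve(A, w_star)
    residual = X_val @ w_star - y_val
    return float(residual @ X_val @ dw_dlam / len(y_val))


# Outer loop: plain gradient descent on the hyperparameter.
lam = 1.0
for _ in range(200):
    lam = max(1e-6, lam - 0.5 * hypergradient(lam))
w = inner_solution(lam)
print(f"lam = {lam:.4f}, val MSE = {np.mean((X_val @ w - y_val) ** 2):.4f}")
```

    For inner problems without a closed-form solution, the hypergradient is instead approximated by differentiating through the inner optimization iterations, or via implicit differentiation of the optimality conditions, which is where algorithmic differentiation enters.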

    The Machine as Data: A Computational View of Emergence and Definability

    Turing's (Proceedings of the London Mathematical Society 42:230–265, 1936) paper on computable numbers has played its role in underpinning different perspectives on the world of information. On the one hand, it encourages a digital ontology, with a perceived flatness of computational structure comprehensively hosting causality at the physical level and beyond. On the other (the main point of Turing's paper), it can give an insight into the way in which higher-order information arises and leads to loss of computational control—while demonstrating how the control can be re-established, in special circumstances, via suitable type reductions. We examine the classical computational framework more closely than is usual, drawing out lessons for the wider application of information-theoretic approaches to characterizing the real world. The problem which arises across a range of contexts is that of characterizing the balance of power between the complexity of informational structure (with emergence, chaos, randomness and 'big data' prominently on the scene) and the means available (simulation, codes, statistical sampling, human intuition, semantic constructs) to bring this information back into the computational fold. We proceed via appropriate mathematical modelling to a more coherent view of the computational structure of information, relevant to a wide spectrum of areas of investigation.