    The Integration of Connectionism and First-Order Knowledge Representation and Reasoning as a Challenge for Artificial Intelligence

    Intelligent systems based on first-order logic on the one hand, and on artificial neural networks (also called connectionist systems) on the other, differ substantially. It would be very desirable to combine the robust neural networking machinery with symbolic knowledge representation and reasoning paradigms like logic programming in such a way that the strengths of either paradigm are retained. Current state-of-the-art research, however, falls far short of this ultimate goal. As one of the main obstacles to be overcome, we see the question of how symbolic knowledge can be encoded by means of connectionist systems: satisfactory answers to this question will naturally lead the way to knowledge extraction algorithms and to integrated neural-symbolic systems. Comment: In Proceedings of INFORMATION'2004, Tokyo, Japan, to appear. 12 pages.
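    One concrete starting point for the encoding question raised above is the classical propositional case: the core method of Hölldobler and Kalinke translates a propositional logic program into a threshold network whose iteration computes the immediate-consequence operator T_P. The Python sketch below illustrates that construction on a made-up program; the atoms, clauses, and layer sizes are illustrative choices, not taken from the paper.

        import numpy as np

        # Core-method sketch: each clause becomes a hidden threshold unit, each
        # atom an input/output unit; iterating the network applies T_P.
        atoms = ["a", "b", "c", "d"]
        # Example program (made up): b <- .   c <- .   a <- b, c.   d <- a.
        clauses = [("b", []), ("c", []), ("a", ["b", "c"]), ("d", ["a"])]

        idx = {x: i for i, x in enumerate(atoms)}
        n, m = len(atoms), len(clauses)

        W_in = np.zeros((m, n)); theta_h = np.zeros(m)   # atom layer -> clause layer
        W_out = np.zeros((n, m))                         # clause layer -> atom layer
        for j, (head, body) in enumerate(clauses):
            for b in body:
                W_in[j, idx[b]] = 1.0
            theta_h[j] = len(body) - 0.5                 # fires iff the whole body is true
            W_out[idx[head], j] = 1.0                    # head is true iff some clause fires

        def t_p(state):                                  # one application of T_P
            hidden = (W_in @ state > theta_h).astype(float)
            return (W_out @ hidden > 0.5).astype(float)

        state = np.zeros(n)                              # start from the empty interpretation
        for _ in range(n + 1):                           # reaches the least model of the program
            state = t_p(state)
        print(sorted(a for a in atoms if state[idx[a]] > 0))   # -> ['a', 'b', 'c', 'd']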

    Chaos and Asymptotical Stability in Discrete-time Neural Networks

    By applying Marotto's theorem, this paper proves that both transiently chaotic neural networks (TCNN) and discrete-time recurrent neural networks (DRNN) have a chaotic structure. A significant property of TCNN and DRNN is that they have only one fixed point when the absolute values of the self-feedback connection weights in TCNN and of the difference time in DRNN are sufficiently large. We show that this unique fixed point can actually evolve into a snap-back repeller, which generates a chaotic structure, if several conditions are satisfied. On the other hand, by using Lyapunov functions, we also derive sufficient conditions for asymptotic stability of symmetrical versions of both TCNN and DRNN, under which TCNN and DRNN asymptotically converge to a fixed point. Furthermore, generic bifurcations are also considered. Since TCNN and DRNN are not special but simple and general, the obtained theoretical results hold for a wide class of discrete-time neural networks. To better demonstrate the theoretical results, several numerical simulations are provided as illustrative examples. Comment: This paper will be published in Physica D. Figures should be requested from the first author.
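    As a rough, self-contained illustration of the kind of dynamics discussed (not the paper's full TCNN/DRNN analysis), the sketch below iterates a single discrete-time neuron with a constant self-feedback weight and estimates a finite-time Lyapunov exponent from the map's derivative; all parameter values are illustrative assumptions, and whether the estimate comes out positive depends on them.

        import numpy as np

        # Single-neuron map y(t+1) = k*y(t) - z*(f(y(t)) - I0) with a steep sigmoid f.
        def f(y, eps=1.0 / 250):
            return 1.0 / (1.0 + np.exp(-np.clip(y / eps, -60.0, 60.0)))

        def df(y, eps=1.0 / 250):                  # derivative of the sigmoid
            s = f(y, eps)
            return s * (1.0 - s) / eps

        def step(y, k=0.9, z=0.08, I0=0.65):
            return k * y - z * (f(y) - I0)

        def lyapunov_estimate(y0=0.1, steps=5000, k=0.9, z=0.08, I0=0.65):
            y, acc = y0, 0.0
            for _ in range(steps):
                acc += np.log(abs(k - z * df(y)) + 1e-12)   # log of local stretching
                y = step(y, k, z, I0)
            return acc / steps                     # > 0 suggests chaotic dynamics

        if __name__ == "__main__":
            for z in (0.01, 0.08, 0.5):            # vary the self-feedback strength
                print(f"z={z:5.2f}  lambda = {lyapunov_estimate(z=z):+.3f}")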

    COMPUTER SIMULATION AND COMPUTABILITY OF BIOLOGICAL SYSTEMS

    The ability to simulate a biological organism by employing a computer is related to the ability of the computer to calculate the behavior of such a dynamical system, or the "computability" of the system. However, the two questions of computability and simulation are not equivalent. Since the question of computability can be given a precise answer in terms of recursive functions, automata theory and dynamical systems, it will be appropriate to consider it first. The more elusive question of adequate simulation of biological systems by a computer will then be addressed, and a possible connection between the two answers will be considered. A conjecture is formulated that suggests the possibility of employing an algebraic-topological, "quantum" computer (Baianu, 1971b) for analogous and symbolic simulations of biological systems that may include chaotic processes which are not, in general, either recursively or digitally computable. Depending on the biological network being modelled, such as the Human Genome/Cell Interactome or a trillion-cell Cognitive Neural Network system, the appropriate logical structure for such simulations might be either the Quantum MV-Logic (QMV) discussed in recent publications (Chiara, 2004, and references cited therein) or Lukasiewicz Logic Algebras, which were shown to be isomorphic to MV-logic algebras (Georgescu et al., 2001).

    A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks

    We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, with a generic Hebbian learning rule including passive forgetting and different time scales for neuronal activity and learning dynamics. Previous numerical work has reported that Hebbian learning drives the system from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, involving a complex coupling between neuronal dynamics and synaptic graph structure, can be analyzed using Jacobian matrices, which provide both a structural and a dynamical point of view on the evolution of the neural network. Furthermore, we show that the sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0. We discuss how neural networks may take advantage of this regime of high functional interest.
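    A minimal numerical sketch of this setting, under assumed parameter values rather than those of the paper, is a random recurrent network x(t+1) = tanh(g W x(t)) whose weights receive a slow Hebbian update with passive forgetting, while the largest Lyapunov exponent is tracked through products of the Jacobian matrices:

        import numpy as np

        rng = np.random.default_rng(0)
        N, g = 100, 3.0                      # network size and coupling gain (assumed)
        eps_learn, lam_forget = 1e-3, 1e-3   # Hebbian rate and passive-forgetting rate

        W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
        x = rng.uniform(-1, 1, size=N)
        v = rng.normal(size=N)               # tangent vector for the Lyapunov estimate
        v /= np.linalg.norm(v)

        lyap_sum, T = 0.0, 5000
        for t in range(T):
            x_new = np.tanh(g * W @ x)
            J = g * (1.0 - x_new**2)[:, None] * W    # Jacobian of the map at x
            v = J @ v
            norm = np.linalg.norm(v) + 1e-300
            lyap_sum += np.log(norm)
            v /= norm
            # Hebbian update with passive forgetting (slow relative to the dynamics)
            W = (1.0 - lam_forget) * W + (eps_learn / N) * np.outer(x_new, x)
            x = x_new

        print("largest Lyapunov exponent estimate:", lyap_sum / T)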

    Dynamics of Unperturbed and Noisy Generalized Boolean Networks

    For years, we have been building models of gene regulatory networks (GRNs), and recent advances in molecular biology have shed new light on the structural and dynamical properties of such highly complex systems. In this work, we propose a novel timing of updates in Random and Scale-Free Boolean Networks, inspired by recent findings in molecular biology. This update sequence is neither fully synchronous nor asynchronous, but rather takes into account the sequence in which genes affect each other. We have used both Kauffman's original model and Aldana's extension, which accounts for the structural properties of known parts of actual GRNs, where the degree distribution is right-skewed and long-tailed. Computer simulations of the dynamics of the new model compare favorably to the original ones and show biologically plausible results in terms of both the number and the length of attractors. We complement this study with a complete analysis of the stability of our systems under transient perturbations, which is one of the defining attributes of biological networks. The results are encouraging, as our model shows comparable and usually even better behavior than preceding ones without losing the attractive simplicity of Boolean networks. Comment: 29 pages, published.
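    For readers unfamiliar with the baseline being extended, the sketch below simulates a classical Kauffman random Boolean network (N nodes, K inputs per node, synchronous updates) and measures attractor lengths; it does not implement the paper's novel update sequence, and N, K and the random seed are illustrative choices.

        import numpy as np

        rng = np.random.default_rng(1)
        N, K = 12, 2
        inputs = np.array([rng.choice(N, size=K, replace=False) for _ in range(N)])
        tables = rng.integers(0, 2, size=(N, 2**K))      # random Boolean functions

        def step(state):
            # look up each node's next value from its truth table
            idx = np.zeros(N, dtype=int)
            for j in range(K):
                idx = (idx << 1) | state[inputs[:, j]]
            return tables[np.arange(N), idx]

        def attractor_length(state, max_steps=10000):
            seen = {}
            for t in range(max_steps):
                key = tuple(state)
                if key in seen:
                    return t - seen[key]                 # period of the attractor
                seen[key] = t
                state = step(state)
            return None

        lengths = [attractor_length(rng.integers(0, 2, size=N)) for _ in range(20)]
        print("attractor lengths from 20 random initial states:", lengths)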

    Information Scrambling in Quantum Neural Networks

    The quantum neural network is one of the promising applications for near-term noisy intermediate-scale quantum computers. A quantum neural network distills the information from the input wave function into the output qubits. In this Letter, we show that this process can also be viewed from the opposite direction: the quantum information in the output qubits is scrambled into the input. This observation motivates us to use the tripartite information, a quantity recently developed to characterize information scrambling, to diagnose the training dynamics of quantum neural networks. We empirically find a strong correlation between the dynamical behavior of the tripartite information and the loss function in the training process, from which we identify two stages of training for randomly initialized networks. In the early stage, the network performance improves rapidly and the tripartite information increases linearly with a universal slope, meaning that the neural network becomes less scrambled than the random unitary. In the latter stage, the network performance improves slowly while the tripartite information decreases. We present evidence that the network constructs local correlations in the early stage and learns large-scale structures in the latter stage. We believe this two-stage training dynamics is universal and applicable to a wide range of problems. Our work builds a bridge between two research subjects, quantum neural networks and information scrambling, and opens up a new perspective for understanding quantum neural networks.
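    The diagnostic itself can be reproduced numerically in a few lines: the sketch below computes the tripartite information I3(A:C:D) = I(A:C) + I(A:D) - I(A:CD) of a unitary through its Choi state, with a Haar-random unitary standing in for a (hypothetical) quantum neural network circuit; the qubit counts and the A/B and C/D splits are illustrative assumptions.

        import numpy as np

        def haar_unitary(dim, rng):
            z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
            q, r = np.linalg.qr(z)
            return q * (np.diag(r) / np.abs(np.diag(r)))   # fix phases -> Haar measure

        def reduced_density(psi, keep, n_qubits):
            # partial trace of |psi><psi| keeping the listed qubit indices
            traced = [q for q in range(n_qubits) if q not in keep]
            psi = np.moveaxis(psi.reshape([2] * n_qubits), list(keep) + traced,
                              range(n_qubits))
            m = psi.reshape(2 ** len(keep), 2 ** len(traced))
            return m @ m.conj().T

        def entropy(rho):
            vals = np.linalg.eigvalsh(rho)
            vals = vals[vals > 1e-12]
            return float(-np.sum(vals * np.log2(vals)))

        def tripartite_information(U, n, a, c):
            d = 2 ** n
            psi = (U.T / np.sqrt(d)).reshape(-1)           # Choi state on 2n qubits
            A = list(range(a))                             # part of the input register
            C = list(range(n, n + c))                      # part of the output register
            D = list(range(n + c, 2 * n))
            S = lambda part: entropy(reduced_density(psi, part, 2 * n))
            I_AC = S(A) + S(C) - S(A + C)
            I_AD = S(A) + S(D) - S(A + D)
            I_ACD = S(A) + S(C + D) - S(A + C + D)
            return I_AC + I_AD - I_ACD

        rng = np.random.default_rng(2)
        U = haar_unitary(2 ** 3, rng)
        print("I3 for a Haar-random 3-qubit unitary:", tripartite_information(U, 3, 1, 1))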