193 research outputs found

    A derivative for complex Lipschitz maps with generalised Cauchy–Riemann equations

    Get PDF
We introduce the Lipschitz derivative, or L-derivative, of a locally Lipschitz complex map: it is a Scott continuous, compact and convex set-valued map that extends the classical derivative to the larger class of locally Lipschitz maps, allows an extension of the fundamental theorem of calculus, and gives a new generalisation of the Cauchy–Riemann equations to these maps, which form a continuous Scott domain. We show that a complex Lipschitz map is analytic in an open set if and only if its L-derivative is a singleton at all points of the open set. The calculus of the L-derivative for the sum, product and composition of maps is derived. The notion of contour integration is extended to Scott continuous, non-empty compact, convex valued functions on the complex plane, and, using the L-derivative, the fundamental theorem of contour integration is extended to these functions.
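The analyticity criterion can be checked numerically in a simple way: for an analytic map the difference quotients at a point agree in every direction, while for a non-analytic Lipschitz map such as z ↦ z̄ they spread over a whole set (here, a circle of radius 1). The sketch below is our own illustration of this dichotomy, not code from the paper; the function and step-size names are ours.

```python
import cmath

def directional_quotients(f, z0, h=1e-6, n=8):
    """Difference quotients of f at z0 along n equally spaced directions."""
    qs = []
    for k in range(n):
        d = h * cmath.exp(2j * cmath.pi * k / n)
        qs.append((f(z0 + d) - f(z0)) / d)
    return qs

def spread(qs):
    """Diameter of the set of quotients (max pairwise distance)."""
    return max(abs(a - b) for a in qs for b in qs)

# analytic map: quotients agree (spread ~ h), so the L-derivative is a singleton
analytic = directional_quotients(lambda z: z * z, 1 + 1j)
# z -> conj(z): quotients trace the circle e^{-2i*theta}, a genuine set
nonanalytic = directional_quotients(lambda z: z.conjugate(), 1 + 1j)
```

For z², the quotient along direction d is 2z₀ + d, so all quotients collapse as h → 0; for z̄ it is exactly conj(d)/d, whose values fill the unit circle.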

    Introduction to self-attachment and its neural basis

    Get PDF

    Reinforcement Learning for Nash Equilibrium Generation

    Get PDF
Copyright © 2015, International Foundation for Autonomous Agents and Multiagent Systems (www.ifaamas.org). All rights reserved. We propose a new conceptual multi-agent framework which, given a game with an undesirable Nash equilibrium, will almost surely generate a new Nash equilibrium at some predetermined, more desirable pure action profile. The agent(s) targeted for reinforcement learn independently according to a standard model-free algorithm, using internally generated states corresponding to high-level preference rankings over outcomes. We focus in particular on the case in which the additional reward can be considered as resulting from an internal (re-)appraisal, such that the new equilibrium is stable independent of the continued application of the procedure.
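As a toy illustration of the idea (our own construction, not the paper's framework): two independent stateless Q-learners play the prisoner's dilemma, and an internal bonus reward at the target profile (cooperate, cooperate) makes that profile a strict equilibrium the learners settle on. The payoff matrix, bonus size and learning parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Prisoner's dilemma payoffs for the row player; actions: 0 = cooperate, 1 = defect.
payoff = np.array([[3, 0],
                   [5, 1]])
bonus = 4.0            # hypothetical internal (re-)appraisal reward at the target (C, C)

Q = np.zeros((2, 2))   # Q[agent, action]: stateless, model-free learners
alpha, eps = 0.1, 0.1  # learning rate and exploration rate

for _ in range(20000):
    # epsilon-greedy action for each agent
    acts = [a if rng.random() > eps else rng.integers(2) for a in Q.argmax(axis=1)]
    for i in (0, 1):
        r = payoff[acts[i], acts[1 - i]]
        if acts[0] == 0 and acts[1] == 0:
            r += bonus          # internal reward makes (C, C) a strict equilibrium
        Q[i, acts[i]] += alpha * (r - Q[i, acts[i]])

greedy = Q.argmax(axis=1)       # both agents should settle on cooperation (action 0)
```

With the bonus, (C, C) yields 3 + 4 = 7 against a temptation payoff of 5, so unilateral defection no longer pays and the learners lock into the new equilibrium.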

    Self-attachment: A self-administrable intervention for chronic anxiety and depression

    Get PDF
There has been increasing evidence to suggest that the root cause of much mental illness lies in a sub-optimal capacity for affect regulation. Cognition and emotion are intricately linked, and cognitive deficits, which are characteristic of many psychiatric conditions, are often driven by affect dysregulation, which itself can usually be traced back to sub-optimal childhood development as supported by Attachment Theory. Individuals with insecure attachment types in their childhoods are prone to a variety of mental illnesses, whereas a secure attachment type in childhood provides a secure base in life. We therefore propose a holistic approach to tackle chronic anxiety and depression, typical of Axis II clinical disorders, which is informed by the development of the infant brain in social interaction with its primary care-givers. We formulate, in a self-administrable way, the protocols governing the interaction of a securely attached child with its primary care-givers that produce the capacity for affect regulation in the child. We posit that these protocols construct, by neuroplasticity and long-term potentiation, new optimal neural pathways in the brains of adults with insecure childhood attachment who suffer from mental disorder. This procedure is called self-attachment and aims to help individuals create their own attachment objects in the form of their adult self looking after their inner child.

    Power domains and iterated function systems

    No full text
We introduce the notion of a weakly hyperbolic iterated function system (IFS) on a compact metric space, which generalises that of a hyperbolic IFS. Based on a domain-theoretic model, which uses the Plotkin power domain and the probabilistic power domain respectively, we prove the existence and uniqueness of the attractor of a weakly hyperbolic IFS and the invariant measure of a weakly hyperbolic IFS with probabilities, extending the classic results of Hutchinson for hyperbolic IFSs to this more general setting. We also present finite algorithms to obtain discrete and digitised approximations to the attractor and the invariant measure, extending the corresponding algorithms for hyperbolic IFSs. We then prove the existence and uniqueness of the invariant distribution of a weakly hyperbolic recurrent IFS and obtain an algorithm to generate the invariant distribution on the digitised screen. The generalised Riemann integral is used to provide a formula for the expected value of almost everywhere…
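For the hyperbolic special case, the random-iteration ("chaos game") algorithm already yields discrete approximations to the attractor. The sketch below is the standard textbook construction, not the paper's domain-theoretic algorithm: it approximates the Sierpinski gasket for an IFS of three contractions with ratio 1/2 on the right triangle with vertices (0,0), (1,0), (0,1).

```python
import numpy as np

# Hyperbolic IFS for the (right-angled) Sierpinski gasket: three contractions
# p -> p/2 + v/2 toward the triangle's vertices, each with contraction ratio 1/2.
vertices = [np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([0.0, 1.0])]
maps = [lambda p, v=v: 0.5 * p + 0.5 * v for v in vertices]

def chaos_game(n=20000, seed=0):
    """Random-iteration approximation of the attractor (equal probabilities)."""
    rng = np.random.default_rng(seed)
    p = np.array([0.2, 0.2])          # arbitrary start inside the triangle
    pts = []
    for _ in range(n):
        p = maps[rng.integers(3)](p)  # pick one contraction at random
        pts.append(p.copy())
    return np.array(pts[100:])        # discard the transient toward the attractor

pts = chaos_game()
```

After the transient, every plotted point lies within 2⁻¹⁰⁰ of the attractor; in particular no point falls in the removed middle triangle {x < 1/2, y < 1/2, x + y > 1/2}.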

    Decidability of consistency of function and derivative information for a triangle and a convex quadrilateral

    Get PDF
Given a triangle in the plane, a planar convex compact set, and an upper and a lower bound, we derive a linear programming algorithm which checks whether there exists a real-valued Lipschitz map, defined on the triangle and bounded by the lower and upper bounds, whose Clarke subgradient lies within the convex compact set. We show that the problem is in fact equivalent to finding a piecewise linear surface with the above property. We extend the result to a convex quadrilateral in the plane. In addition, we obtain some partial results for this problem in higher dimensions.
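The reduction to linear programming can be sketched on a toy instance. The check below restricts to a single affine piece, a simplification of the paper's full piecewise linear algorithm: the unknowns are the function values at the triangle's vertices, the gradient of the affine interpolant is a linear function of those values, and membership of the gradient in a box-shaped convex set together with the value bounds yields a feasibility LP. All concrete numbers are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

P = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])   # triangle vertices
lo, hi = 0.0, 1.0                                    # lower and upper bounds on f
K = np.array([[-1.0, 1.0], [-1.0, 1.0]])             # gradient box (convex compact set)

# The gradient g of the affine interpolant satisfies A @ g = (f1 - f0, f2 - f0),
# so g = G @ (f0, f1, f2) with G linear in the unknown vertex values.
A = np.array([P[1] - P[0], P[2] - P[0]])
G = np.linalg.inv(A) @ np.array([[-1.0, 1.0, 0.0],
                                 [-1.0, 0.0, 1.0]])

# Feasibility LP (zero objective): K[:,0] <= G f <= K[:,1] and lo <= f <= hi.
A_ub = np.vstack([G, -G])
b_ub = np.concatenate([K[:, 1], -K[:, 0]])
res = linprog(np.zeros(3), A_ub=A_ub, b_ub=b_ub, bounds=[(lo, hi)] * 3)
print("consistent:", res.success)
```

Here `res.success` answers the decidability question for this restricted instance; the constant map f ≡ 0 is one witness of consistency.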

    Smooth approximation of Lipschitz maps and their subgradients

    Get PDF
We derive new representations for the generalised Jacobian of a locally Lipschitz map between finite dimensional real Euclidean spaces as the lower limit (i.e., limit inferior) of the classical derivative of the map where it exists. The new representations lead to significantly shorter proofs for the basic properties of the subgradient and the generalised Jacobian including the chain rule. We establish that a sequence of locally Lipschitz maps between finite dimensional Euclidean spaces converges to a given locally Lipschitz map in the L-topology—that is, the weakest refinement of the sup norm topology on the space of locally Lipschitz maps that makes the generalised Jacobian a continuous functional—if and only if the limit superior of the sequence of directional derivatives of the maps in a given vector direction coincides with the generalised directional derivative of the given map in that direction, with the convergence to the limit superior being uniform for all unit vectors. We then prove our main result that the subspace of Lipschitz C∞ maps between finite dimensional Euclidean spaces is dense in the space of Lipschitz maps equipped with the L-topology, and, for a given Lipschitz map, we explicitly construct a sequence of Lipschitz C∞ maps converging to it in the L-topology, allowing global smooth approximation of a Lipschitz map and its differential properties. As an application, we obtain a short proof of the extension of Green’s theorem to interval-valued vector fields. For infinite dimensions, we show that the subgradient of a Lipschitz map on a Banach space is upper continuous, and, for a given real-valued Lipschitz map on a separable Banach space, we construct a sequence of Gateaux differentiable functions that converges to the map in the sup norm topology such that the limit superior of the directional derivatives in any direction coincides with the generalised directional derivative of the Lipschitz map in that direction.
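The smooth-approximation theme can be illustrated with the standard Gaussian mollification of the Lipschitz map f(x) = |x| (a generic textbook construction, not the paper's explicit one): the smoothed map E|x + εZ| for Z ~ N(0,1) has a closed form, converges uniformly to |x| as ε → 0 with error at most ε·√(2/π), and its derivative erf(x/(√2 ε)) always lies in the Clarke subgradient range [−1, 1].

```python
from math import erf, exp, pi, sqrt

def mollified_abs(x, eps):
    """Gaussian mollification of |x|: closed form of E|x + eps*Z|, Z ~ N(0,1)."""
    return x * erf(x / (sqrt(2) * eps)) + eps * sqrt(2 / pi) * exp(-x * x / (2 * eps * eps))

def mollified_abs_deriv(x, eps):
    """Its derivative, which stays in the Clarke range [-1, 1] for all x."""
    return erf(x / (sqrt(2) * eps))
```

As ε → 0 the derivative converges to sign(x) away from the origin, while at 0 it remains 0, the midpoint of the Clarke subgradient [−1, 1] of |x| at 0.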

    Domain theory and differential calculus (functions of one variable)

    No full text
Published version

    Strong Attractors of Hopfield Neural Networks to Model Attachment Types and Behavioural Patterns

    No full text
We study the notion of a strong attractor of a Hopfield neural model as a pattern that has been stored multiple times in the network, and examine its properties using basic mathematical techniques as well as a variety of simulations. It is proposed that strong attractors can be used to model attachment types in developmental psychology as well as behavioural patterns in psychology and psychotherapy. We study the stability and basins of attraction of strong attractors in the presence of other simple attractors and show that they are indeed more stable with a larger basin of attraction compared with simple attractors. We also show that the perturbation of a strong attractor by random noise results in a cluster of attractors near the original strong attractor, measured by the Hamming distance. We investigate the stability and basins of attraction of such clusters as the noise increases and establish that the unfolding of the strong attractor, leading to its breakup, goes through three different stages. Finally, the relation between strong attractors of different multiplicity and their influence on each other is studied, and we show how the impact of a strong attractor can be replaced with that of a new strong attractor. This retraining of the network is proposed as a model of how attachment types and behavioural patterns can undergo change.
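A minimal simulation in the spirit of the paper (our parameters are illustrative: 100 neurons, one pattern stored five times against one stored once) exhibits the strong attractor tolerating more noise, i.e. having the larger basin of attraction:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100

def hebbian(patterns):
    """Hebbian weight matrix; storing a pattern d times gives it multiplicity d."""
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0)
    return W / len(patterns)

def recall(W, x, steps=50):
    """Synchronous Hopfield updates until a fixed point (or step limit)."""
    for _ in range(steps):
        y = np.sign(W @ x)
        y[y == 0] = 1
        if np.array_equal(y, x):
            break
        x = y
    return x

strong = rng.choice([-1, 1], n)   # stored five times: a strong attractor
simple = rng.choice([-1, 1], n)   # stored once: a simple attractor
W = hebbian([strong] * 5 + [simple])

def basin_hits(W, target, flips, trials=200):
    """Fraction of probes with `flips` flipped bits that recall `target` exactly."""
    hits = 0
    for _ in range(trials):
        x = target.copy()
        idx = rng.choice(n, flips, replace=False)
        x[idx] *= -1
        hits += np.array_equal(recall(W, x), target)
    return hits / trials
```

With 30 of 100 bits flipped, `basin_hits(W, strong, 30)` recovers the strong attractor every time, while the simple attractor's recall rate is no better: its retrieval signal is six times weaker relative to the cross-talk from the dominant pattern.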