
    First-order logic learning in artificial neural networks

    Get PDF
    Artificial Neural Networks have previously been applied in neuro-symbolic learning to learn ground logic program rules. However, there are few results on learning relations with neuro-symbolic methods. This paper presents the system PAN, which can learn relations. The inputs to PAN are one or more atoms, representing the conditions of a logic rule, and the output is the conclusion of the rule. The symbolic inputs may include functional terms of arbitrary depth and arity, and the output may include terms constructed from the input functors. Symbolic inputs are encoded as an integer using an invertible encoding function, which is applied in reverse to extract the output terms. The main advance of this system is a convention that allows the construction of Artificial Neural Networks able to learn rules with the same expressive power as first-order definite clauses. The system is tested on three examples and the results are discussed.
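
    A minimal sketch of an invertible term encoding of the kind described, built on the classic Cantor pairing function (the scheme, symbol table, and example are hypothetical; the abstract does not specify PAN's actual encoding beyond its invertibility):

        import math

        # Cantor pairing: a bijection between pairs of naturals and naturals,
        # so every step of the term encoding below can be undone exactly.
        def pair(a, b):
            return (a + b) * (a + b + 1) // 2 + b

        def unpair(z):
            w = (math.isqrt(8 * z + 1) - 1) // 2
            b = z - w * (w + 1) // 2
            return w - b, b

        SYMBOLS = ["0", "s", "f"]  # illustrative functor alphabet
        SYM2ID = {s: i for i, s in enumerate(SYMBOLS)}

        # A term is (functor, [argument terms]), of arbitrary depth and arity.
        def encode(term):
            functor, args = term
            return pair(SYM2ID[functor], encode_args(args))

        def encode_args(args):
            return 0 if not args else 1 + pair(encode(args[0]), encode_args(args[1:]))

        def decode(code):
            fid, rest = unpair(code)
            return (SYMBOLS[fid], decode_args(rest))

        def decode_args(n):
            if n == 0:
                return []
            head, tail = unpair(n - 1)
            return [decode(head)] + decode_args(tail)

        term = ("s", [("s", [("0", [])])])  # the Peano numeral 2
        assert decode(encode(term)) == term  # round-trips, as needed for output extraction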

    Spin-selective localization due to intrinsic spin-orbit coupling

    Full text link
    We study spin-dependent diffusive transport in the presence of a tunable spin-orbit (SO) interaction in a two-dimensional electron system. The spin precession of an electron in the SO coupling field is expressed in terms of a covariant curvature, which affects the quantum interference between different electronic trajectories. Controlling this curvature field, by modulating the SO coupling strength and its gradients through, e.g., electric or elastic means, opens intriguing possibilities for exploring spin-selective localization physics. In particular, applying a weak magnetic field allows the electron localization to be controlled independently for the two spin directions, with a spin-quantization axis that can be "engineered" by appropriate SO interaction gradients.
    Comment: 7 pages, 1 figure

    The Economic Stimulus Payments of 2008 and the aggregate demand for consumption

    Get PDF
    Households in the Nielsen Consumer Panel were surveyed about their 2008 Economic Stimulus Payment. In estimates identified by the randomized timing of disbursement, the average household's spending rose by 10 percent the week it received a Payment and remained high, cumulating to 1.5–3.8 percent of spending over three months. These estimates imply partial-equilibrium increases in aggregate demand of 1.3 percent of consumption in the second quarter of 2008 and 0.6 percent in the third. Spending is concentrated among households with low wealth or low past income; a household's spending did not increase significantly when it learned about its Payment.
    Sloan School of Management; Northwestern University (Evanston, Ill.), Kellogg School of Management; University of Chicago, Initiative for Global Markets; Northwestern University (Evanston, Ill.), Kellogg School of Management, Zell Center for Risk Research; Harvard University, Laboratory for Applied Economics and Policy

    On Distributions of Ratios

    Full text link
    A large number of exact inferential procedures in statistics and econometrics involve the sampling distribution of ratios of random variables. If the denominator variable is positive, then tail probabilities of the ratio can be expressed as those of a suitably defined difference of random variables. If, in addition, the joint characteristic function of the numerator and denominator is known, then standard Fourier inversion techniques can be used to reconstruct the distribution function from it. Most research in this field has been based on this correspondence, which, however, breaks down when both numerator and denominator are supported on the entire real line. The present manuscript derives inversion formulae and saddlepoint approximations that remain valid in this case and reduce to known results when the denominator is almost surely positive. Applications include the IV estimator of a structural parameter in a just-identified equation.
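
    The positive-denominator correspondence lends itself to a direct numerical rendering: for R = X/Y with Y > 0 almost surely, P(R <= r) = P(X - rY <= 0), and the characteristic function of D = X - rY is phi(t, -rt), so Gil-Pelaez inversion applies. A minimal sketch under those assumptions (the bivariate-normal test case is illustrative, not from the manuscript):

        import numpy as np
        from scipy.integrate import quad

        def ratio_cdf(phi_joint, r):
            """P(X/Y <= r) for Y > 0 a.s., via Gil-Pelaez inversion of D = X - r*Y,
            whose characteristic function is phi_joint(t, -r*t)."""
            integrand = lambda t: np.imag(phi_joint(t, -r * t)) / t
            val, _ = quad(integrand, 0.0, np.inf)
            return 0.5 - val / np.pi

        # Illustrative joint CF: independent X ~ N(0, 1) and Y ~ N(5, 0.25),
        # so the denominator is positive with overwhelming probability.
        phi = lambda t1, t2: np.exp(1j * 5.0 * t2 - 0.5 * (t1**2 + 0.25 * t2**2))

        print(ratio_cdf(phi, 0.0))  # ~0.5: X is symmetric about 0
        print(ratio_cdf(phi, 0.2))  # ~0.84: P(D <= 0) for D ~ N(-1, 1.01)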

    Labelled Natural Deduction for Substructural Logics

    Full text link
    In this paper a uniform methodology to perform Natural Deduction over the family of linear, relevance and intuitionistic logics is proposed. The methodology follows the Labelled Deductive Systems (LDS) discipline, where the deductive process manipulates declarative units: formulas labelled according to a labelling algebra. In the system described here, labels are either ground terms or variables of a given labelling language, and inference rules manipulate formulas and labels simultaneously, generating (whenever necessary) constraints on the labels used in the rules. A set of natural deduction style inference rules is given, and the notion of a derivation is defined, which associates a labelled natural deduction style "structural derivation" with a set of generated constraints. Algorithmic procedures, based on a technique called resource abduction, are defined to solve the constraints generated within a derivation, and their termination conditions are discussed. A natural deduction derivation is correct with respect to a given substructural logic if, under the condition that the algorithmic procedures terminate, the associated set of constraints is satisfied with respect to the underlying labelling algebra. This is shown by proving that the natural deduction system is sound and complete with respect to the LKE tableaux system [DG94].
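
    A hypothetical sketch of the central data structure, a labelled formula, and of one inference rule that manipulates labels and records a constraint for later solving (the rule format and label algebra here are simplified stand-ins for the paper's):

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Unit:
            label: str    # a ground term or variable of the labelling language
            formula: str

        # Implication elimination, LDS style: from x : A -> B and y : A derive
        # (x * y) : B. Whether the composed label is admissible is not decided
        # here; a constraint is recorded for a separate solving phase
        # (cf. resource abduction), whose outcome depends on the substructural logic.
        def implies_elim(major, minor, constraints):
            antecedent, _, consequent = major.formula.partition(" -> ")
            assert consequent and minor.formula == antecedent
            constraints.append(f"compose({major.label}, {minor.label})")
            return Unit(label=f"({major.label} * {minor.label})", formula=consequent)

        constraints = []
        b = implies_elim(Unit("x", "A -> B"), Unit("y", "A"), constraints)
        print(b, constraints)  # derives (x * y) : B with constraint compose(x, y)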

    Transparent modelling of finite stochastic processes for multiple agents

    Get PDF
    Stochastic processes are ubiquitous, from automated engineering, through financial markets, to space exploration. These systems are typically highly dynamic, unpredictable and resistant to analytic methods, and they demand the orchestration of long control sequences that are both highly complex and uncertain. This report examines some existing single- and multi-agent modelling frameworks, details their strengths and weaknesses, and uses that experience to identify some fundamental tenets of good practice in modelling stochastic processes. It goes on to develop a new family of frameworks based on these tenets, which can model single- and multi-agent domains with equal clarity and flexibility, while remaining close enough to the existing frameworks that existing analytic and learning tools can be applied with little or no adaptation. Some simple and larger examples illustrate the similarities and differences of this approach, and a discussion of the challenges inherent in developing more flexible tools to exploit these new frameworks concludes the report.

    Mapping UML models incorporating OCL constraints into Object-Z

    Get PDF
    Focusing on object-oriented designs, this paper proposes a mapping for translating systems modelled in the Unified Modelling Language (UML) and incorporating Object Constraint Language (OCL) constraints into formal software specifications in Object-Z. The joint treatment of semi-formal model constructs and constraints within a single translation framework and conversion tool is novel, and it leads to the generation of much richer formal specifications than is otherwise possible. This paper complements previous analyses by paying particular attention to the generation of complete Object-Z structures. The integration of proposals to extend the OCL with action constraints further boosts the expressivity of the translated specifications. The main features of the supporting tool are also described.
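
    A minimal illustration of the flavour of such a mapping (the class and constraint are invented for illustration; the paper's translation rules cover far richer structure, including action constraints):

        UML/OCL source:
            context Account inv: balance >= 0

        Object-Z target (ASCII rendering of the class box, with the state
        schema's declarations above the line and its invariant below):
            Account
              balance : Z
              ---------------
              balance >= 0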

    ARCHModels.jl: Estimating ARCH Models in Julia

    Get PDF
    This paper introduces ARCHModels.jl, a package for the Julia programming language that implements a number of univariate and multivariate autoregressive conditional heteroskedasticity (ARCH) models. This model class is the workhorse tool for modeling the conditional volatility of financial assets. The distinguishing feature of these models is that they model the latent volatility as a (deterministic) function of past returns and volatilities. This recursive structure results in loop-heavy code, which Julia, thanks to its just-in-time compiler, is well equipped to handle. As such, the entire package is written in Julia, without any binary dependencies. We benchmark the performance of ARCHModels.jl against popular implementations in MATLAB, R, and Python, and illustrate its use in a detailed case study.
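
    The recursion in question can be sketched in a few lines; this is a generic GARCH(1,1) filter in Python for illustration, not ARCHModels.jl's API (initialization at the sample variance is one common convention):

        import numpy as np

        def garch11_filter(returns, omega, alpha, beta):
            """Conditional variance recursion of a GARCH(1,1):
                sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]
            Each sigma2[t] depends on sigma2[t-1], so the loop cannot be
            vectorized; this is the loop-heavy structure the abstract refers to."""
            r = np.asarray(returns, dtype=float)
            sigma2 = np.empty_like(r)
            sigma2[0] = r.var()  # start at the sample (unconditional) variance
            for t in range(1, len(r)):
                sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
            return sigma2

        rng = np.random.default_rng(0)
        r = 0.01 * rng.standard_normal(1000)
        print(garch11_filter(r, omega=1e-6, alpha=0.1, beta=0.85)[:5])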

    Search space expansion for efficient incremental inductive logic programming from streamed data

    Get PDF
    In the past decade, several systems for learning Answer Set Programs (ASP) have been proposed, including the recent FastLAS system. Compared to other state-of-the-art approaches to learning ASP, FastLAS is more scalable: rather than computing the hypothesis space in full, it computes a much smaller subset relative to a given set of examples that is nonetheless guaranteed to contain an optimal solution to the task (called an OPT-sufficient subset). On the other hand, like many other Inductive Logic Programming (ILP) systems, FastLAS is designed to be run on a fixed learning task, meaning that if new examples are discovered after learning, the whole process must be run again. In many real applications, data arrives in a stream, and rerunning an ILP system from scratch each time new examples arrive is inefficient. In this paper we address this problem by presenting IncrementalLAS, a system that uses a new technique, called hypothesis space expansion, to enable a FastLAS-like OPT-sufficient subset to be expanded each time new examples are discovered. We prove that this preserves FastLAS's guarantee of finding an optimal solution to the full task (including the new examples), while removing the need to repeat previous computations. Through our evaluation, we demonstrate that running IncrementalLAS on tasks updated with sequences of new examples is significantly faster than re-running FastLAS from scratch on each updated task.
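
    A deliberately toy sketch of the control flow this describes: the candidate subset is expanded for each new batch instead of being recomputed, and solving always uses all examples seen so far. Hypotheses are reduced to single rule ids and examples to the sets of rules that cover them; none of this is the real FastLAS/IncrementalLAS machinery:

        def expand_subset(subset, new_examples):
            # only candidates relevant to the *new* examples are added,
            # so earlier subset computations are reused, not repeated
            for ex in new_examples:
                subset |= ex
            return subset

        def solve(subset, examples):
            # cheapest rule in the subset covering every example seen so far
            covering = [h for h in subset if all(h in ex for ex in examples)]
            return min(covering, default=None)

        def incremental_learn(batches):
            examples, subset = [], set()
            for batch in batches:
                subset = expand_subset(subset, batch)  # expansion, not a rerun
                examples.extend(batch)
                yield solve(subset, examples)

        # each example is the set of rule ids that cover it
        batches = [[{1, 2}, {2, 3}], [{2, 4}]]
        print(list(incremental_learn(batches)))  # [2, 2]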