
    Embodied Artificial Intelligence through Distributed Adaptive Control: An Integrated Framework

    In this paper, we argue that the future of Artificial Intelligence research resides in two keywords: integration and embodiment. We support this claim by analyzing the recent advances of the field. Regarding integration, we note that the most impactful recent contributions have been made possible through the integration of recent Machine Learning methods (based in particular on Deep Learning and Recurrent Neural Networks) with more traditional ones (e.g. Monte-Carlo tree search, goal babbling exploration or addressable memory systems). Regarding embodiment, we note that the traditional benchmark tasks (e.g. visual classification or board games) are becoming obsolete as state-of-the-art learning algorithms approach or even surpass human performance in most of them, which has recently encouraged the development of first-person 3D game platforms embedding realistic physics. Building upon this analysis, we first propose an embodied cognitive architecture integrating heterogeneous sub-fields of Artificial Intelligence into a unified framework. We demonstrate the utility of our approach by showing how major contributions of the field can be expressed within the proposed framework. We then claim that benchmarking environments need to reproduce ecologically valid conditions for bootstrapping the acquisition of increasingly complex cognitive skills through the concept of a cognitive arms race between embodied agents. Comment: Updated version of the paper accepted to the ICDL-Epirob 2017 conference (Lisbon, Portugal).

    QVAST: a new Quantum GIS plugin for estimating volcanic susceptibility

    One of the most important tasks of modern volcanology is the construction of hazard maps simulating different eruptive scenarios that can be used in risk-based decision making in land-use planning and emergency management. The first step in the quantitative assessment of volcanic hazards is the development of susceptibility maps (i.e., the spatial probability of a future vent opening given the past eruptive activity of a volcano). This challenging issue is generally tackled using probabilistic methods that calculate a kernel function at each data location to estimate probability density functions (PDFs). The smoothness and the modeling ability of the kernel function are controlled by the smoothing parameter, also known as the bandwidth. Here we present a new tool, QVAST, part of the open-source geographic information system Quantum GIS, which is designed to create user-friendly quantitative assessments of volcanic susceptibility. QVAST allows the selection of an appropriate method for evaluating the bandwidth for the kernel function on the basis of the input parameters and the shapefile geometry, and can also evaluate the PDF with the Gaussian kernel. When different input data sets are available for the area, the total susceptibility map is obtained by assigning different weights to each of the PDFs, which are then combined via a weighted summation and modeled as a non-homogeneous Poisson process. The potential of QVAST, developed in a free and user-friendly environment, is shown here through its application in the volcanic fields of Lanzarote (Canary Islands) and La Garrotxa (NE Spain).
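
    As a rough illustration of the kernel-based approach described above (not QVAST's actual QGIS implementation), the sketch below evaluates a 2-D Gaussian kernel density estimate of vent-opening susceptibility on a regular grid and then combines two weighted PDFs into a total susceptibility map. The vent coordinates, bandwidths and weights are hypothetical.

        # Minimal sketch of Gaussian kernel density estimation for vent-opening
        # susceptibility; all coordinates, bandwidths and weights are made up.
        import numpy as np

        def gaussian_kde_grid(vents, grid_x, grid_y, bandwidth):
            """Evaluate a 2-D Gaussian KDE at every node of a regular grid."""
            xx, yy = np.meshgrid(grid_x, grid_y)
            pdf = np.zeros_like(xx)
            for vx, vy in vents:
                d2 = (xx - vx) ** 2 + (yy - vy) ** 2
                pdf += np.exp(-d2 / (2.0 * bandwidth ** 2))
            pdf /= 2.0 * np.pi * bandwidth ** 2 * len(vents)
            return pdf / pdf.sum()  # normalise so the grid cells sum to 1

        # Two hypothetical input data sets (e.g. past vents and eruptive fissures).
        vents_a = [(3.0, 4.0), (3.5, 4.2), (5.0, 1.0)]
        vents_b = [(2.5, 3.8), (4.8, 1.2)]

        grid_x = np.linspace(0.0, 6.0, 120)
        grid_y = np.linspace(0.0, 6.0, 120)

        pdf_a = gaussian_kde_grid(vents_a, grid_x, grid_y, bandwidth=0.8)
        pdf_b = gaussian_kde_grid(vents_b, grid_x, grid_y, bandwidth=1.2)

        # Total susceptibility as a weighted sum of the individual PDFs.
        susceptibility = 0.7 * pdf_a + 0.3 * pdf_b

    In QVAST the bandwidth is selected automatically from the input parameters and shapefile geometry; in this sketch it is simply fixed by hand.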

    A multidimensional account of democratic legitimacy: how to make robust decisions in a non-idealized deliberative context

    This paper analyses the possibility of granting legitimacy to democratic decision-making procedures in a context of deep pluralism. We defend a multidimensional account according to which a legitimate system needs to guarantee, on the one hand, that citizens are included on an equal footing and acknowledged as reflexive political agents rather than mere beneficiaries of policies, and, on the other hand, that their decisions have an epistemic quality. While Estlund's account of imperfect epistemic proceduralism might seem to embody a dualistic conception of democratic legitimacy, we point out that it is not able to recognize citizens as reflexive political agents and is grounded in an idealized model of the circumstances of deliberation. To overcome these ambiguities, we develop an account of democratic legitimacy according to which disagreement is the proper expression of citizens' reflexive agency, and the attribution of epistemic authority does not stem from superior expertise or a specific ability but emerges through public confrontation among disagreeing agents. Consequently, the epistemic value of deliberation should be derived from the reason-giving process rather than from the alleged quality of its outcomes. In this way, we demonstrate the validity of the multidimensional perspective on legitimacy while abstaining from introducing any outcome-oriented criterion. Finally, we argue that this account of legitimacy is well suited for modeling deliberative democracy as a decision-making procedure that respects the agency of every citizen and grants her the opportunity to influence public choices.

    Quantum memories at finite temperature

    To use quantum systems for technological applications, one first needs to preserve their coherence over macroscopic time scales, even at finite temperature. Quantum error correction has made it possible to actively correct errors that affect a quantum memory. An attractive scenario is the construction of passive storage of quantum information with minimal active support. Indeed, passive protection is the basis of robust and scalable classical technology, physically realized in the form of the transistor and the ferromagnetic hard disk. The discovery of an analogous quantum system is a challenging open problem, plagued by a variety of no-go theorems. Several approaches have been devised to overcome these theorems by taking advantage of their loopholes. The state-of-the-art developments in this field are reviewed in an informative and pedagogical way. The main principles of self-correcting quantum memories are given, and several milestone examples from the literature on two-, three- and higher-dimensional quantum memories are analyzed.
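
    For readers unfamiliar with active error correction, the toy sketch below classically simulates its simplest instance, the three-qubit bit-flip repetition code: a logical bit is copied onto three carriers, independent flips are applied, and a majority vote recovers the logical value whenever at most one carrier was corrupted. This is only an illustrative analogue of active correction, not one of the self-correcting memories reviewed in the paper, and the error rate is made up.

        # Toy classical simulation of the three-qubit bit-flip repetition code.
        import random

        def encode(bit):
            return [bit, bit, bit]                 # repeat the logical bit

        def noisy_channel(codeword, p_flip):
            # Flip each carrier independently with probability p_flip.
            return [b ^ (random.random() < p_flip) for b in codeword]

        def decode(codeword):
            return int(sum(codeword) >= 2)         # majority vote

        random.seed(0)
        p_flip, trials = 0.1, 100_000
        failures = sum(decode(noisy_channel(encode(1), p_flip)) != 1
                       for _ in range(trials))
        print(f"logical error rate ~ {failures / trials:.4f} "
              f"(physical error rate {p_flip})")

    With p_flip = 0.1 the logical error rate drops to roughly 3p^2 - 2p^3, about 0.028, illustrating the basic gain that quantum error-correcting codes extend to phase errors as well.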

    Discovery of extreme particle acceleration in the microquasar Cygnus X-3

    The study of relativistic particle acceleration is a major topic of high-energy astrophysics. It is well known that massive black holes in active galaxies can release a substantial fraction of their accretion power into energetic particles, producing gamma-rays and relativistic jets. Galactic microquasars (hosting a compact star of 1-10 solar masses which accretes matter from a binary companion) also produce relativistic jets. However, no direct evidence of particle acceleration above GeV energies has ever been obtained in microquasar ejections, leaving open the issue of the occurrence and timing of extreme matter energization during jet formation. Here we report the detection of transient gamma-ray emission above 100 MeV from the microquasar Cygnus X-3, an exceptional X-ray binary which sporadically produces powerful radio jets. Four gamma-ray flares (each lasting 1-2 days) were detected by the AGILE satellite simultaneously with special spectral states of Cygnus X-3 during the period mid-2007 to mid-2009. Our observations show that very efficient particle acceleration and gamma-ray propagation out of the inner disk of a microquasar usually occur a few days before major relativistic jet ejections. Flaring particle energies can be thousands of times larger than previously detected maximum values (with Lorentz factors of 10^5 and 10^2 for electrons and protons, respectively). We show that the transitional nature of gamma-ray flares and particle acceleration above GeV energies in Cygnus X-3 is clearly linked to special radio/X-ray states preceding strong radio flares. Thus gamma-rays provide unique insight into the nature of physical processes in microquasars. Comment: 29 pages (including Supplementary Information), 8 figures, 2 tables; version submitted to Nature on August 7, 2009 (accepted version available at http://www.nature.com/nature/journal/vaop/ncurrent/pdf/nature08578.pdf).
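
    To put the quoted Lorentz factors in context, a relativistic particle's total energy is E = \gamma m c^2, so (as a back-of-the-envelope check, not a figure taken from the paper) they correspond roughly to

        E_e \approx 10^{5} \times 0.511\,\mathrm{MeV} \approx 51\,\mathrm{GeV},
        \qquad
        E_p \approx 10^{2} \times 938\,\mathrm{MeV} \approx 94\,\mathrm{GeV},

    i.e. particle energies of several tens of GeV for both electrons and protons.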

    Financial constraints in family firms and the role of venture capital

    Based on the natural reluctance of family-controlled firms (FCFs) to accept external shareholders, in this paper we analyze whether investment sensitivity to internally generated cash flow is a driver of venture capital (VC) participation in those firms. We argue that FCFs are more likely to accept external investors when they are subject to serious financial constraints. We also aim to ascertain to what extent VC involvement contributes to reducing the dependence of investment on internal cash flow. We focus on a representative sample of Spanish privately held FCFs that received the initial VC investment between 1997 and 2006, and compare the investment-cash flow sensitivity of VC-backed FCFs with that of non-VC-backed FCFs. We find that FCFs that received VC were more financially constrained than other, similar non-VC-backed FCFs before receiving VC. This is especially true in first-generation FCFs, providing additional evidence of the reluctance of FCFs to accept external shareholders. We also find that VC-backed FCFs, in particular first-generation ones, significantly reduce the sensitivity of investments to cash flow after the initial VC round.
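
    The investment-cash flow sensitivity discussed above is conventionally measured as the coefficient on cash flow in an investment regression; a generic specification (illustrative notation, not necessarily the exact model estimated in the paper) is

        \frac{I_{it}}{K_{i,t-1}} \;=\; \alpha \;+\; \beta\,\frac{CF_{it}}{K_{i,t-1}} \;+\; \gamma\,Q_{it} \;+\; \eta_i \;+\; \tau_t \;+\; \varepsilon_{it},

    where I is investment, K the capital stock, CF internally generated cash flow, Q a proxy for investment opportunities, and \eta_i, \tau_t firm and year effects. A larger \beta is read as tighter financial constraints; the result reported here is that \beta is higher for VC-backed FCFs before the initial VC round and falls afterwards.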

    Investment-cash flow sensitivity in family-controlled firms and the impact of venture capital funding

    In this paper we analyze investment sensitivity to cash flow in family-controlled businesses (FCBs) before and after the initial venture capital (VC) investment. We argue that highly constrained FCBs are more inclined to set aside the preservation of socioemotional wealth as their highest-order reference point and, hence, to accept the entry of external shareholders such as VC institutions. We find that financial constraints are significantly higher in first-generation VC-backed FCBs than in similar untreated firms. We also find that VC involvement alleviates, but does not fully eliminate, the investment-cash flow sensitivity in first-generation investee FCBs.