Demon-like Algorithmic Quantum Cooling and its Realization with Quantum Optics
The simulation of low-temperature properties of many-body systems remains one
of the major challenges in theoretical and experimental quantum information
science. We present, and demonstrate experimentally, a universal cooling method
which is applicable to any physical system that can be simulated by a quantum
computer. This method allows us to distill and eliminate hot components of
quantum states, i.e., a quantum Maxwell's demon. The experimental
implementation is realized with a quantum-optical network, and the results are
in full agreement with theoretical predictions (with fidelity higher than
0.978). These results open a new path for simulating low-temperature properties
of physical and chemical systems that are intractable with classical methods.
Comment: 7 pages, 5 figures, plus supplementary material
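The cooling round the abstract describes can be illustrated on a toy system. One ancilla-assisted step applies (I + e^{-iHt})/2 to the system state when the ancilla measurement yields 0, which damps high-energy amplitudes by a factor of magnitude cos(Et/2). The Hamiltonian, time step, and initial state below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

# Hypothetical two-level system with energies 0 and 1 (illustrative).
H = np.diag([0.0, 1.0])
t = 1.0
U = np.diag(np.exp(-1j * np.diag(H) * t))  # U = e^{-iHt} (diagonal here)

# Equal superposition of the two energy eigenstates.
psi = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)

def energy(psi, H):
    """Expectation value <psi|H|psi>."""
    return np.real(psi.conj() @ H @ psi)

def cool(psi, U):
    """One demon-like round: ancilla |0> -> Hadamard -> controlled-U ->
    Hadamard -> measure; post-selecting outcome 0 applies (I + U)/2,
    which weights each eigenstate by |cos(E t / 2)|, suppressing
    the 'hot' components."""
    out = (np.eye(len(psi)) + U) @ psi / 2.0
    return out / np.linalg.norm(out)  # renormalise after post-selection

e0 = energy(psi, H)
psi = cool(psi, U)
e1 = energy(psi, H)  # strictly lower than e0: the state is colder
```

Iterating the round cools further, at the price of a shrinking post-selection probability, which is the usual trade-off in distillation-style schemes.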
Towards large scale continuous EDA: a random matrix theory perspective
Estimation of distribution algorithms (EDAs) are a major branch of evolutionary algorithms (EAs) with some unique advantages in principle. They are able to exploit correlation structure to drive the search more efficiently, and they can provide insights about the structure of the search space. However, model building in high dimensions is extremely challenging, and as a result existing EDAs lose their strengths in large scale problems.
Large scale continuous global optimisation is key to many real-world problems of modern days. Scaling up EAs to large scale problems has become one of the biggest challenges of the field.
This paper pins down some fundamental roots of the problem and makes a start at developing a new and generic framework to yield effective EDA-type algorithms for large scale continuous global optimisation problems. Our concept is to introduce an ensemble of random projections of the set of fittest search points to low dimensions as a basis for a new and generic divide-and-conquer methodology. This is rooted in the theory of random projections developed in theoretical computer science, and exploits recent advances in non-asymptotic random matrix theory.
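The ensemble idea can be sketched in a few lines: project the fittest points down with several random Gaussian matrices, fit and sample a cheap low-dimensional model in each subspace, then lift and average the samples. All dimensions, scaling constants, and the choice of a Gaussian model below are illustrative assumptions; the paper's actual algorithm may combine the projections differently.

```python
import numpy as np

rng = np.random.default_rng(0)
D, d, M, n = 100, 3, 10, 50  # ambient dim, subspace dim, #projections, #points

# Hypothetical set of fittest search points (one per row).
X = rng.normal(size=(n, D))

lifted = np.zeros((n, D))
for _ in range(M):
    R = rng.normal(size=(d, D)) / np.sqrt(d)      # random Gaussian projection
    Y = X @ R.T                                   # project fittest points to R^d
    mu = Y.mean(axis=0)                           # fit a low-dim Gaussian model
    cov = np.cov(Y, rowvar=False)
    Z = rng.multivariate_normal(mu, cov, size=n)  # sample the subspace model
    lifted += Z @ R                               # lift samples back to R^D

offspring = lifted / M  # ensemble average over the M projections
```

Model building now happens only in d-dimensional subspaces (a 3x3 covariance instead of a 100x100 one), which is the divide-and-conquer point: each cheap model captures a random shadow of the correlation structure, and the ensemble average recombines them.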
Experiencing information: using systems theory to develop a theoretical framework of information interaction
2021 Spring. Includes bibliographical references. This study outlines the construction, development, and initial testing of a proposed theoretical framework and measure for information interaction. To address the challenges associated with experiencing information, I synthesized existing literature from complementary and multidisciplinary domains of cognitive psychology, computer science, and organizational communication. I initially proposed theoretically driven components of information interaction based on a literature review, followed by a multimethod evaluation to further develop and refine the framework. Quantitatively, I researched organizational practices used for managing the information environment. Empirically, I collected data using multiple samples to test the psychometric properties of a proposed measure of information interaction. I used structural equation modeling to assess relationships associated with information interaction to develop its nomological network. The findings of these studies have implications for research and practice by establishing a new theoretical space in Industrial and Organizational Psychology, using a systems approach to construct development and application, and providing organizations with a mechanism for constant, minimally obtrusive collection and assessment of the information experience of members within the organizational system.
Computers in Ramsey Theory; Testing, Constructions and Nonexistence
Ramsey theory is often regarded as the study of how order emerges from randomness. While it originated in mathematical logic, it has applications in geometry, number theory, game theory, information theory, approximation algorithms, and other areas of mathematics and theoretical computer science. Ramsey theory studies the conditions under which a combinatorial object necessarily contains some smaller given objects. The central concept in Ramsey theory is that of arrowing, which in the case of graphs describes when colorings of larger graphs necessarily contain monochromatic copies of given smaller graphs. The role of Ramsey numbers is to quantify some of the general existential theorems in Ramsey theory, always involving arrowing. The determination of whether this arrowing holds is notoriously difficult, and thus it leads to numerous computational challenges concerning various types of Ramsey numbers and closely related Folkman numbers. This talk will overview how computers are increasingly used to study the bounds on Ramsey and Folkman numbers, and properties of Ramsey arrowing in general. This is happening in an area where traditional approaches typically call for classical computer-free proofs. It is evident that we now understand Ramsey theory much better than a few decades ago, increasingly due to computations. Further, more such progress and new insights based on computations should be anticipated.
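Arrowing as defined above can be made concrete with a brute-force check, here for the classic case F -> (K3, K3): does every red/blue colouring of the edges of K_n contain a monochromatic triangle? This naive enumeration is only a sketch of the concept; serious computations on Ramsey and Folkman numbers rely on far more sophisticated search and SAT-based methods.

```python
from itertools import combinations, product

def arrows_triangle(n):
    """Check K_n -> (K3, K3) by exhausting all 2-colourings of the
    edges of the complete graph K_n."""
    edges = list(combinations(range(n), 2))
    triangles = list(combinations(range(n), 3))
    for colouring in product([0, 1], repeat=len(edges)):
        colour = dict(zip(edges, colouring))
        mono = any(
            colour[(a, b)] == colour[(a, c)] == colour[(b, c)]
            for a, b, c in triangles
        )
        if not mono:
            return False  # witness: a colouring with no mono triangle
    return True

# R(3,3) = 6: K6 arrows (K3, K3) but K5 does not (the pentagon/pentagram
# colouring of K5 avoids monochromatic triangles in both colours).
print(arrows_triangle(5), arrows_triangle(6))  # False True
```

The exponential blow-up (2^15 colourings already for K6) shows why deciding arrowing is computationally hard and why the field leans on computer-assisted pruning rather than raw enumeration.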
The Theoretical Argument for Disproving Asymptotic Upper-Bounds on the Accuracy of Part-of-Speech Tagging Algorithms: Adopting a Linguistics, Rule-Based Approach
This paper takes a deep dive into a particular area of the interdisciplinary domain of Computational Linguistics, Part-of-Speech Tagging algorithms.
The author relies primarily on scholarly Computer Science and Linguistics papers to describe previous approaches to this task and the often-hypothesized existence of an asymptotic accuracy rate of around 98%, by which this task is allegedly bound. However, after doing more research into why the accuracy of previous algorithms has behaved in this asymptotic manner, the author identifies valid and empirically backed reasons why the accuracy of previous approaches does not necessarily reflect any sort of general asymptotic bound on the task of automated Part-of-Speech Tagging. In response, a theoretical argument is proposed to circumvent the shortcomings of previous approaches to this task, which involves abandoning the flawed status quo of training machine learning algorithms and predictive models on outdated corpora, and instead walks the reader from conception through implementation of a rule-based algorithm with roots in both practical and theoretical Linguistics.
While the resulting algorithm is simply a prototype that cannot currently be verified as achieving a tagging-accuracy rate of over 98%, its multi-tiered methodology, meant to mirror aspects of human cognition in Natural Language Understanding, is intended to serve as a theoretical blueprint for a new and inevitably more reliable way to deal with the challenges in Part-of-Speech Tagging, and to provide much-needed advances in the popular area of Natural Language Processing.
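The general shape of a tiered rule-based tagger can be sketched as a lexicon lookup backed off to morphological (suffix) rules, followed by contextual correction rules. The tiny lexicon, suffix table, tag set, and the single contextual rule below are all illustrative assumptions, not the paper's prototype.

```python
# Toy lexicon and suffix rules (illustrative entries only).
LEXICON = {"the": "DET", "a": "DET", "dog": "NOUN", "dogs": "NOUN",
           "runs": "VERB", "quickly": "ADV"}
SUFFIX_RULES = [("ly", "ADV"), ("ing", "VERB"), ("s", "NOUN")]

def tag(tokens):
    """Tier 1: lexicon lookup. Tier 2: suffix back-off (default NOUN).
    Tier 3: contextual repair rules over the tag sequence."""
    tags = []
    for tok in tokens:
        t = LEXICON.get(tok.lower())
        if t is None:  # back off to morphology for unknown words
            t = next((tg for suf, tg in SUFFIX_RULES
                      if tok.lower().endswith(suf)), "NOUN")
        tags.append(t)
    # One contextual rule: a word tagged VERB right after a determiner
    # is retagged NOUN ("the running" -> DET NOUN).
    for i in range(1, len(tags)):
        if tags[i - 1] == "DET" and tags[i] == "VERB":
            tags[i] = "NOUN"
    return tags

print(tag("the dog runs quickly".split()))  # ['DET', 'NOUN', 'VERB', 'ADV']
```

The point of the tiering is that each layer is an inspectable linguistic claim, unlike corpus-trained weights, which is what makes the rule-based route attractive when training corpora are outdated.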
Proactive Independent Learning Approach: A case study in computer arithmetic
The rapid growth of knowledge and scientific challenges requires lifelong continuous education in computational science and engineering.
Computer numerical system representation and computer arithmetic are the basis of numerical computing of scientific models. In this work an adapted student-centered and problem-based learning strategy is presented. Development of problem solving, effective self-directed reasoning, and communication skills is promoted. A pilot study was conducted to determine the validity of the proposed alternatives. The study aimed to evaluate the performance of students in solving new problems and effectively describing the problems, the theoretical context, and the possible solutions. Preliminary results are presented for a particular population from which the sample is actually drawn.
VI Workshop de Innovación en Educación en Informática (WIEI). Red de Universidades con Carreras en Informática (RedUNCI).
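A typical problem in the computer-arithmetic material the case study covers is unpacking how a real number is actually stored. As a minimal illustration (not an exercise from the course), the IEEE-754 double fields of 0.1 can be extracted directly, and their finiteness explains the classic surprise that 0.1 + 0.2 is not exactly 0.3.

```python
import math
import struct

# Reinterpret the 64-bit IEEE-754 double as an integer bit pattern.
x = 0.1
bits = struct.unpack(">Q", struct.pack(">d", x))[0]
sign = bits >> 63                  # 1 sign bit
exponent = (bits >> 52) & 0x7FF    # 11 biased-exponent bits
mantissa = bits & ((1 << 52) - 1)  # 52 fraction bits
print(sign, exponent - 1023, hex(mantissa))  # 0.1 = 1.6 * 2**-4, rounded

# Consequence for numerical computing: compare with a tolerance,
# never with == on results of floating-point arithmetic.
print(0.1 + 0.2 == 0.3)               # False
print(math.isclose(0.1 + 0.2, 0.3))   # True
```

Working through the bit fields by hand is exactly the kind of self-directed problem the described strategy promotes: the representation explains the behaviour, rather than the behaviour being memorised as a rule.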
A neural-symbolic system for temporal reasoning with application to model verification and learning
The effective integration of knowledge representation, reasoning and learning into a robust computational model is one of the key challenges in Computer Science and Artificial Intelligence. In particular, temporal models have been fundamental in describing the behaviour of Computational and Neural-Symbolic Systems. Furthermore, knowledge acquisition of correct descriptions of the desired system’s behaviour is a complex task in several domains. Several efforts have been directed towards the development of tools that are capable of learning, describing and evolving software models.
This thesis contributes to two major areas of Computer Science, namely Artificial Intelligence (AI) and Software Engineering. From an AI perspective, we present a novel neural-symbolic computational model capable of representing and learning temporal knowledge in recurrent networks. The model works in an integrated fashion. It enables the effective representation of temporal knowledge, the adaptation of temporal models to a set of desirable system properties, and effective learning from examples, which in turn can lead to symbolic temporal knowledge extraction from the corresponding trained neural networks. The model is sound from a theoretical standpoint, and it is also tested in a number of case studies.
An extension to the framework is shown to tackle aspects of verification and adaptation from the Software Engineering perspective. As regards verification, we make use of established techniques for model checking, which allow the verification of properties described as temporal models and return counter-examples whenever the properties are not satisfied. Our neural-symbolic framework is then extended to deal with different sources of information. This includes the translation of model descriptions into the neural structure, the evolution of such descriptions by learning from counter-examples, and also the learning of new models from simple observation of their behaviour.
In summary, we believe the thesis describes a principled methodology for temporal knowledge representation, learning and extraction, shedding new light on predictive temporal models, not only from a theoretical standpoint, but also with respect to a potentially large number of applications in AI, Neural Computation and Software Engineering, where temporal knowledge plays a fundamental role.
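The temporal properties the verification step checks can be illustrated by evaluating two standard operators over a finite execution trace. This is only a conceptual sketch under simplified assumptions (finite traces, propositional states); the thesis relies on full model checking, not this evaluator.

```python
# Each state in a trace is a dict of propositions; illustrative names only.
def eventually(prop, trace):
    """F prop: prop holds at some state of the trace."""
    return any(prop(s) for s in trace)

def globally(prop, trace):
    """G prop: prop holds at every state of the trace."""
    return all(prop(s) for s in trace)

trace = [{"req": True, "ack": False},
         {"req": False, "ack": True}]

print(eventually(lambda s: s["ack"], trace))  # True: the request is acked
print(globally(lambda s: s["req"], trace))    # False: req drops in state 1
```

A model checker does the same kind of evaluation over all behaviours of a model rather than one trace, and a failing `globally` check yields exactly the counter-example trace that the framework then feeds back into learning.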