    The Quotient in Preorder Theories

    Get PDF
    Seeking the largest solution to an expression of the form Ax ≤ B is a common task in several domains of engineering and computer science. This largest solution is commonly called the quotient. Across domains, the meanings of the binary operation and the preorder are quite different, yet the syntax for computing the largest solution is remarkably similar. This paper is about finding a common framework to reason about quotients. We only assume we operate on a preorder endowed with an abstract monotonic multiplication and an involution. We provide a condition, called admissibility, which guarantees the existence of the quotient and yields its closed form. We call preordered heaps those structures satisfying the admissibility condition. We show that many existing theories in computer science are preordered heaps, and we are thus able to derive a quotient for them, subsuming existing solutions when available in the literature. We introduce the concept of sieved heaps to deal with structures which are given over multiple domains of definition. We show that sieved heaps also have well-defined quotients. (Comment: In Proceedings GandALF 2020, arXiv:2009.0936)
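
    Stated as a formula, with \otimes standing for the abstract monotonic multiplication and \preceq for the preorder (these symbols are chosen here for illustration and are not necessarily the paper's own notation), the quotient is the largest solution to the inequality:

        \[
          B / A \;=\; \max_{\preceq} \{\, x \mid A \otimes x \preceq B \,\}
        \]

    Per the abstract, admissibility is precisely the condition under which this maximum exists, and it yields a closed form for it in terms of the involution.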

    Combinatorial Representation Theory

    Full text link
    We attempt to survey the field of combinatorial representation theory, describe the main results and main questions, and give an update on its current status. We give a personal viewpoint on the field, while remaining aware that there is much important and beautiful work that we have not been able to mention.

    CEEME: compensating events based execution monitoring enforcement for Cyber-Physical Systems

    Get PDF
    Fundamentally, inherently observable events in Cyber-Physical Systems with tight coupling between cyber and physical components can result in a confidentiality violation. By observing how the physical elements react to cyber commands, adversaries can identify critical links in the system and force the cyber control algorithm to make erroneous decisions. Thus, there is a propensity for a breach in confidentiality leading to further attacks on availability or integrity. Due to the highly integrated nature of Cyber-Physical Systems, it is also extremely difficult to map the system semantics into a security framework under existing security models. The far-reaching objective of this research is to develop a science of self-obfuscating systems based on the composition of simple building blocks. A model of Nondeducibility composes the building blocks under Information Flow Security Properties. To this end, this work presents fundamental theories on external observability for basic regular networks and the novel concept of event compensation that can enforce Information Flow Security Properties at runtime. --Abstract, page iii

    Making decisions based on context: models and applications in cognitive sciences and natural language processing

    Full text link
    It is known that humans are capable of making decisions based on context and generalizing what they have learned. This dissertation considers two related problem areas and proposes different models that take context information into account. By including the context, the proposed models exhibit strong performance in each of the problem areas considered. The first problem area focuses on a context association task studied in cognitive science, which evaluates the ability of a learning agent to associate specific stimuli with an appropriate response in particular spatial contexts. Four neural circuit models are proposed to model how the stimulus and context information are processed to produce a response. The neural networks are trained by modifying the strength of neural connections (weights) using principles of Hebbian learning. Such learning is considered biologically plausible, in contrast to backpropagation techniques that do not have a solid neurophysiological basis. A series of theoretical results for the neural circuit models are established, guaranteeing convergence to an optimal configuration when all the stimulus-context pairs are provided during training. Among all the models, a specific model based on ideas from recommender systems, trained with a primal-dual update rule, achieves perfect performance in learning and generalizing the mapping from context-stimulus pairs to correct responses. The second problem area considered in the thesis focuses on clinical natural language processing (NLP). A particular application is the development of deep-learning models for analyzing radiology reports. Four NLP tasks are considered, including anatomy named entity recognition, negation detection, incidental finding detection, and clinical concept extraction. A hierarchical Recurrent Neural Network (RNN) is proposed for anatomy named entity recognition, which is then used to produce a set of features for incidental finding detection of pulmonary nodules. A clinical context word embedding model is obtained, which is used with an RNN to model clinical concept extraction. Finally, feature-enriched RNN and transformer-based models with contextual word embedding are proposed for negation detection. All these models take the (clinical) context information into account. The models are evaluated on different datasets and are shown to achieve strong performance, largely outperforming the state of the art.
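
    The Hebbian principle mentioned above has a compact algorithmic core: a connection weight grows in proportion to the co-activation of the units it links. The following is a minimal sketch, not the dissertation's actual models; the learning rate, dimensions, and random binary patterns are assumptions chosen for illustration.

        import numpy as np

        # Minimal Hebbian update: strengthen W wherever pre-synaptic input x
        # and post-synaptic response y are active together.
        def hebbian_step(W, x, y, eta=0.1):
            return W + eta * np.outer(y, x)

        rng = np.random.default_rng(0)
        n_inputs, n_outputs = 8, 4           # illustrative dimensions
        W = np.zeros((n_outputs, n_inputs))
        for _ in range(100):
            x = rng.integers(0, 2, n_inputs)     # binary stimulus pattern
            y = rng.integers(0, 2, n_outputs)    # paired response pattern
            W = hebbian_step(W, x, y)

    Note that the update uses only locally available pre- and post-synaptic activity, which is what makes it biologically plausible in contrast to backpropagation.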

    Doctor of Philosophy

    Get PDF
    Synthetic biology is a new field in which engineers, biologists, and chemists are working together to transform genetic engineering into an advanced engineering discipline, one in which the design and construction of novel genetic circuits are made possible through the application of engineering principles. This dissertation explores two engineering strategies to address the challenges of working with genetic technology, namely the development of standards for describing genetic components and circuits at separate yet connected levels of detail and the use of Genetic Design Automation (GDA) software tools to simplify and speed up the process of optimally designing genetic circuits. Its contributions to the field of synthetic biology include (1) a proposal for the next version of the Synthetic Biology Open Language (SBOL), an existing standard for specifying and exchanging genetic designs electronically, and (2) a GDA workflow that enables users of the software tool iBioSim to create an abstract functional specification, automatically select genetic components that satisfy the specification from a design library, and compose the selected components into a standardized genetic circuit design for subsequent analysis and physical construction. Ultimately, this dissertation demonstrates how existing techniques and concepts from electrical and computer engineering can be adapted to overcome the challenges of genetic design and is an example of what is possible when working with publicly available standards for genetic design.
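
    As a deliberately toy illustration of the workflow described above (select parts from a design library that satisfy an abstract functional specification, then compose them into an ordered circuit design): this sketch is plain Python, not iBioSim's or SBOL's actual API, and the part names and roles are hypothetical.

        # Toy design library: each part has a name and a functional role.
        library = [
            {"name": "pTet",  "role": "promoter"},
            {"name": "B0034", "role": "rbs"},
            {"name": "GFP",   "role": "cds"},
            {"name": "B0015", "role": "terminator"},
        ]

        # Abstract functional specification: the roles the circuit needs, in order.
        spec = ["promoter", "rbs", "cds", "terminator"]

        def select(library, role):
            """Pick a library part satisfying the required role."""
            return next(p for p in library if p["role"] == role)

        # Compose the selected components into an ordered circuit design.
        circuit = [select(library, role) for role in spec]
        print(" -> ".join(p["name"] for p in circuit))  # pTet -> B0034 -> GFP -> B0015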

    Power subsystem automation study

    Get PDF
    The purpose of Phase 2 of the power subsystem automation study was to demonstrate the feasibility of using computer software to manage an aspect of the electrical power subsystem on a space station. The state of the art in expert systems software was investigated in this study. This effort resulted in the demonstration of prototype expert system software for managing one aspect of a simulated space station power subsystem.

    BeyondPixels: A Comprehensive Review of the Evolution of Neural Radiance Fields

    Full text link
    Neural rendering combines ideas from classical computer graphics and machine learning to synthesize images from real-world observations. NeRF, short for Neural Radiance Fields, is a recent innovation that uses AI algorithms to create 3D objects from 2D images. By leveraging an interpolation approach, NeRF can produce new 3D reconstructed views of complicated scenes. Rather than directly restoring the whole 3D scene geometry, NeRF generates a volumetric representation called a "radiance field," which is capable of creating color and density for every point within the relevant 3D space. The broad appeal and prominence of NeRF make it imperative to examine the existing research on the topic comprehensively. While previous surveys on 3D rendering have primarily focused on traditional computer vision-based or deep learning-based approaches, only a handful of them discuss the potential of NeRF. However, such surveys have predominantly focused on NeRF's early contributions and have not explored its full potential. NeRF is a relatively new technique continuously being investigated for its capabilities and limitations. This survey reviews recent advances in NeRF and categorizes them according to their architectural designs, especially in the field of novel view synthesis. (Comment: 22 pages, 1 figure, 5 tables)
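
    To make the "radiance field" idea concrete: it is a learned function from a 3D point and a viewing direction to a color and a density. The sketch below uses a tiny random-weight network as a stand-in for a trained model; the layer sizes, activations, and names are illustrative assumptions, not the original NeRF architecture.

        import numpy as np

        rng = np.random.default_rng(0)
        W1 = rng.normal(size=(6, 64))   # input: point (x, y, z) + view dir (dx, dy, dz)
        W2 = rng.normal(size=(64, 4))   # output: (r, g, b, sigma)

        def radiance_field(point, view_dir):
            """Map one sample along a camera ray to (rgb, density)."""
            h = np.maximum(np.concatenate([point, view_dir]) @ W1, 0.0)  # ReLU layer
            out = h @ W2
            rgb = 1.0 / (1.0 + np.exp(-out[:3]))  # sigmoid keeps colors in [0, 1]
            sigma = np.logaddexp(0.0, out[3])     # softplus keeps density non-negative
            return rgb, sigma

        rgb, sigma = radiance_field(np.array([0.1, 0.2, 0.3]),
                                    np.array([0.0, 0.0, 1.0]))

    Rendering then integrates these colors and densities along each camera ray (volume rendering) to produce a pixel, which is how new views of a scene are synthesized.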