
    On Folding and Twisting (and whatknot): towards a characterization of workspaces in syntax

    Full text link
    Syntactic theory has traditionally adopted a constructivist approach, in which a set of atomic elements is manipulated by combinatory operations to yield derived, complex elements. Syntactic structure is thus seen as the result of discrete recursive combinatorics over lexical items, which get assembled into phrases, which are themselves combined to form sentences. This view is common to European and American structuralism (e.g., Benveniste, 1971; Hockett, 1958) and to different incarnations of generative grammar, transformational and non-transformational (Chomsky, 1956, 1995; Kaplan & Bresnan, 1982; Gazdar, 1982). Since at least Uriagereka (2002), some attention has been paid to the fact that syntactic operations must apply somewhere, particularly when copying and movement operations are considered. Contemporary syntactic theory has thus somewhat acknowledged the importance of formalizing aspects of the spaces in which elements are manipulated, but this remains a vastly underexplored area. In this paper we explore the consequences of conceptualizing syntax as a set of topological operations applying over spaces rather than over discrete elements. We argue that such a view offers empirical advantages in the treatment of long-distance dependencies and cross-derivational dependencies: constraints on possible configurations emerge from the dynamics of the system. Comment: Manuscript. Do not cite without permission. Comments welcome.
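The contrast the abstract draws, between combining discrete elements and tracking operations against an explicit workspace, can be sketched minimally as follows. This is an illustrative toy (names and the set-based representation are assumptions, not the paper's topological formalism): Merge builds an unordered set from two objects, but whether an operation may apply is decided by the workspace, not by the objects alone.

```python
# Hypothetical sketch: a derivation tracked against an explicit workspace,
# in the spirit of workspace-based formalizations of Merge. The set-based
# representation is illustrative, not the paper's own formalism.

def merge(x, y):
    """Combine two syntactic objects into an unordered set."""
    return frozenset([x, y])

def derive_in_workspace(workspace, x, y):
    """Apply Merge to two objects accessible in the workspace.

    The output replaces its inputs, so the workspace, not the objects
    alone, determines what operations may apply next.
    """
    assert x in workspace and y in workspace, "operands must be in the workspace"
    return (workspace - {x, y}) | {merge(x, y)}

ws = frozenset(["the", "dog", "barked"])
ws = derive_in_workspace(ws, "the", "dog")                      # {{the, dog}, barked}
ws = derive_in_workspace(ws, frozenset(["the", "dog"]), "barked")
assert len(ws) == 1  # the derivation has converged on a single object
```

The point of the sketch is that constraints such as "an object consumed by Merge is no longer accessible" fall out of the bookkeeping of the space itself, which is the intuition the paper generalizes topologically.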

    Learning stochastic finite automata from experts

    Full text link

    Static Transformation of Power Consumption for Program Tracing and Software Attestation

    Get PDF
    This thesis presents methods to statically modify programs at compile time to improve the effectiveness of power-consumption-based program analyses. Two related applications are considered, and algorithms are introduced for both. The first is power-consumption-based program tracing, and the second is software attestation with power consumption as a side effect. We propose a framework for increasing the effectiveness of power-based program tracing techniques. These systems determine the most likely block of source code that produced an observed power trace. Our framework maximizes distinguishability between power traces for different code sections. To this end, we introduce a static transformation that reduces the probability of misclassification by reordering intermediate representation (IR) instructions to find the ordering that produces power traces with the highest distances between them. Experimental results confirm the effectiveness of our technique. We also consider improvements to the algorithm, replacing the naïve, exhaustive permutation algorithm used in the original solution with Monte Carlo permutations. Due to the complexity of the naïve solution, its search space is constrained, making it unlikely to find a good solution when the number of instructions in a program section is too large. Variations on a basic stochastic implementation are described, and their expected results are compared. The Monte Carlo algorithms consistently found better solutions than their exhaustive counterpart while showing improved scalability. We then introduce a related technique to statically transform programs to use power consumption as the side effect for software attestation. We show how to circumvent the undecidable nature of program execution for this purpose and present a static compiler transformation which implements the algorithm.
Our approach is less intrusive than traditional software attestation because the system does not require interruption to compute a cryptographic checksum. It is particularly well suited to real-time systems where consistent timing is more important than speed.
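The exhaustive-versus-Monte-Carlo contrast the abstract describes can be sketched as below. This is not the thesis's implementation: the per-instruction power costs are assumed values, instructions are treated as freely reorderable (real IR reordering must respect data dependences), and Euclidean distance between cost sequences stands in for trace distinguishability.

```python
import itertools
import math
import random

# Assumed per-operation power costs (illustrative, not measured values).
POWER = {"add": 1.0, "mul": 3.0, "load": 5.0, "store": 4.0}

def trace(section):
    """Predicted power trace: the sequence of per-instruction costs."""
    return [POWER[op] for op in section]

def distance(a, b):
    """Euclidean distance between the power traces of two sections."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(trace(a), trace(b))))

def exhaustive(sec_a, sec_b):
    """Naive search over all permutation pairs; cost grows as (n!)^2."""
    return max(distance(pa, pb)
               for pa in itertools.permutations(sec_a)
               for pb in itertools.permutations(sec_b))

def monte_carlo(sec_a, sec_b, samples=2000, seed=0):
    """Stochastic search: sample random orderings; scales to long sections."""
    rng = random.Random(seed)
    best = 0.0
    for _ in range(samples):
        pa = rng.sample(sec_a, len(sec_a))
        pb = rng.sample(sec_b, len(sec_b))
        best = max(best, distance(pa, pb))
    return best

a = ["add", "mul", "load", "store"]
b = ["add", "add", "load", "mul"]
print(exhaustive(a, b), monte_carlo(a, b))
```

On sections this small the Monte Carlo search matches the exhaustive optimum; the thesis's observation is that once sections grow, the exhaustive search must constrain its space while the stochastic search keeps exploring, which is why the latter finds better orderings in practice.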

    Acta Cybernetica : Volume 22. Number 2.

    Get PDF

    Dynamic Network Notation: A Graphical Modeling Language to Support the Visualization and Management of Network Effects in Service Platforms

    Get PDF
    Service platforms have moved into the center of interest in both academic research and the IT industry due to their economic and technical impact. These multi-tenant platforms provide their own or third-party software as metered, on-demand services. The corresponding service offers exhibit network effects. The present work introduces a graphical modeling language to support service platform design, with a focus on the exploitation of these network effects.
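A hypothetical sketch of the kind of model such a language describes, not the Dynamic Network Notation itself: a service platform as a typed graph of providers, services, and consumers, with a toy network-effect metric attached. The quadratic value function is a Metcalfe-style assumption for illustration, not a claim from the work.

```python
from collections import defaultdict

# Illustrative platform model (all names and the value metric are assumed).
class PlatformModel:
    def __init__(self):
        self.nodes = {}               # name -> node kind
        self.uses = defaultdict(set)  # service -> set of consumers

    def add(self, name, kind):
        self.nodes[name] = kind       # "provider", "service", or "consumer"

    def connect(self, consumer, service):
        self.uses[service].add(consumer)

    def service_value(self, service):
        # Toy network effect: value grows quadratically with adoption
        # (Metcalfe-style assumption, purely for illustration).
        n = len(self.uses[service])
        return n * n

m = PlatformModel()
m.add("billing", "service")
for c in ["acme", "globex", "initech"]:
    m.add(c, "consumer")
    m.connect(c, "billing")
print(m.service_value("billing"))  # 9 under the toy quadratic assumption
```

A graphical notation over such a graph lets a platform designer see where adding one participant raises the value of many existing offers, which is the management problem the work targets.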

    Designing teachable robots

    Get PDF
    This thesis advances the design of teachable, adaptable robots. I propose two paths of improvement to the popular, easy-to-use leading method, in which a teacher literally leads the robot by its hand through movements. The improvements enable motor commands to be changed and conditional branches to be formed without the need for a keyboard or other explicit programming device. On improvement path 1, the addition of a verbal correcting (VC) scheme would enable a teacher to make on-line verbal corrections to a robot's movement sequences. The further addition of a production system of corrections (PSC) would enable a robot to remember and use verbally taught conditional corrections. On path 2, a goal-seeking (GS) system and VC would enable a teacher to set goals, lead movements, and verbally correct the robot. The robot then selects its own motor commands for achieving goals. A multiple context learning system (MCLS), a multiple, extended GS system, combines the two paths. It enables both sequences and goals to be taught to a led robot. A simple, but real, led MCLS-robot is demonstrated. I establish four important properties of MCLSs: (a) an MCLS can enable a robot to learn to perform motor commands that are initially performed only by reflex, so that eye and speech motor commands, neither suitable for being led, can still be learned; (b) an MCLS can learn to be a Turing machine, which is a universal computing machine, explicitly showing the error in criticisms of MCLSs' computational power; (c) the selections of a context learning system in an MCLS converge on the optimal motor commands for achieving goals; and (d) an MCLS-robot can handle the negation problem: doing something positive in the absence of a certain condition.
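The "production system of corrections" idea can be sketched as condition-action rules that override steps of a taught sequence. This is a hypothetical illustration; the rule format, step names, and sensed-state representation are assumptions, not the thesis's architecture.

```python
# Hypothetical sketch of a production system of corrections (PSC): the robot
# replays a taught movement sequence, but verbally taught conditional rules
# override individual motor commands when their condition holds.

taught_sequence = ["reach", "grasp", "lift", "place"]

# Each rule: (condition over sensed state, targeted step, replacement step).
corrections = [
    (lambda state: state.get("obstacle"), "reach", "reach_around"),
    (lambda state: state.get("fragile"), "grasp", "grasp_gently"),
]

def execute(sequence, state):
    """Replay the taught sequence, applying any correction whose condition holds."""
    performed = []
    for step in sequence:
        for condition, target, replacement in corrections:
            if target == step and condition(state):
                step = replacement  # a verbally taught correction fires
                break
        performed.append(step)
    return performed

print(execute(taught_sequence, {"fragile": True}))
# ['reach', 'grasp_gently', 'lift', 'place']
```

The point is that the teacher never touches a keyboard: leading supplies the base sequence, and speech supplies the conditional rules that specialize it.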

    Hybrid spintronics and straintronics: An ultra-low-energy computing paradigm

    Get PDF
    The primary obstacle to continued downscaling of charge-based electronic devices in accordance with Moore's law is the excessive energy dissipation that takes place in the device during switching of bits. Unlike charge-based devices, spin-based devices are switched by flipping spins without moving charge in space. Although some energy is still dissipated in flipping spins, it can be considerably less than the energy associated with current flow in charge-based devices. Unfortunately, this advantage will be squandered if the method adopted to switch the spin is so energy-inefficient that the energy dissipated in the switching circuit far exceeds the energy dissipated inside the system. Regrettably, this is often the case, e.g., switching spins with a magnetic field or with the spin-transfer-torque mechanism. In this dissertation, it is shown theoretically that the magnetization of two-phase multiferroic single-domain nanomagnets can be switched very energy-efficiently, more so than in any device currently extant, leading possibly to new magnetic logic and memory systems which might be an important contributor to Beyond-Moore's-Law technology. A multiferroic composite structure consists of a layer of piezoelectric material in intimate contact with a magnetostrictive layer. When a tiny voltage of a few millivolts is applied across the structure, it generates strain in the piezoelectric layer, and the strain is transferred to the magnetostrictive nanomagnet. This strain generates magnetostrictive anisotropy in the nanomagnet and thus rotates its direction of magnetization, resulting in magnetization reversal or 'bit-flip'. It is shown after detailed analysis that full 180-degree switching of magnetization can occur in the symmetric potential landscape of the magnetostrictive nanomagnet, even in the presence of room-temperature thermal fluctuations, which differs from the general perception of binary switching.
With a proper choice of materials, the energy dissipated in the bit-flip can be made as low as one attojoule at room temperature. Also, sub-nanosecond switching delay can be achieved, so that the device is adequately fast for general-purpose computing. The above idea, explored in this dissertation, has the potential to produce an extremely low-power, yet high-density and high-speed, non-volatile magnetic logic and memory system. Such processors would be well suited for embedded applications, e.g., implantable medical devices that could run on energy harvested from the patient's body motion.

    From quantum foundations to quantum information protocols and back

    Get PDF
    Physics has two main ambitions: to predict and to understand. Indeed, physics aims for the prediction of all natural phenomena. Prediction entails modeling the correlation between an action, the input, and what is subsequently observed, the output. Understanding, on the other hand, involves developing insightful principles and models that can explain the widest possible variety of correlations present in nature. Remarkably, advances in both prediction and understanding foster our physical intuition and, as a consequence, novel and powerful applications are discovered. Quantum mechanics is a very successful physical theory, both in terms of its predictive power and in its wide applicability. Nonetheless, and despite many decades of development, we do not yet have a proper physical intuition of quantum phenomena. I believe that improvements in our understanding of quantum theory will yield better, and more innovative, protocols, and vice versa. This dissertation aims at advancing our understanding and developing novel protocols. This is done through four approaches. The first one is to study quantum theory within a broad family of theories. In particular, we study quantum theory within the family of locally quantum theories. We found that the principle that singles quantum theory out of this family, thus connecting quantum local and nonlocal structure, is dynamical reversibility. This implies that the viability of large-scale quantum computing can be based on concrete physical principles that can be experimentally tested at a local level, without needing to test millions of qubits simultaneously. The second approach is to study quantum correlations from a black-box perspective, thus making as few assumptions as possible. The strategy is to study the completeness of quantum predictions by benchmarking them against alternative models. Three main results and applications come out of our study.
Firstly, we prove that performing complete amplification of randomness starting from a source of arbitrarily weak randomness, a task that is impossible with classical resources, is indeed possible via nonlocality. This establishes, in our opinion, the strongest evidence for a truly random event in nature so far. Secondly, we prove that there exist finite events where quantum theory gives predictions as complete as any no-signaling theory can give, showing that the completeness of quantum theory is not an asymptotic property. Finally, we prove that maximally nonlocal theories can never be maximally random while quantum theory can, showing a trade-off between the nonlocality of a theory and its randomness capabilities. We also prove that quantum theory is not unique in this respect. The third approach we follow is to study quantum correlations in scenarios where some parties have a restriction on the available quantum degrees of freedom. The future progress of semi-device-independent quantum information depends crucially on our ability to bound the strength of these correlations. Here we provide a full characterization via a complete hierarchy of sets that approximate the target set from the outside. Each set can in turn be characterized using standard numerical techniques. One application of our work is certifying multidimensional entanglement device-independently. The fourth approach is to confront quantum theory with computer science principles. In particular, we establish two interesting implications for quantum theory that result from raising the Church-Turing thesis to the level of a postulate. Firstly, we show how different preparations of the same mixed state, indistinguishable according to the quantum postulates, become distinguishable when prepared computably.
Secondly, we identify a new loophole for Bell-like experiments: if some parties in a Bell-like experiment use private pseudorandomness to choose their measurement inputs, the computational resources of an eavesdropper have to be limited in order to observe a proper violation of nonlocality.
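The nonlocality that the randomness results build on is usually quantified by the CHSH value. The sketch below is standard textbook material, not one of the thesis's protocols: it evaluates the CHSH expression at the optimal quantum measurement angles and checks that it exceeds the classical bound of 2, reaching the Tsirelson bound of 2√2.

```python
import math

# Standard CHSH illustration: correlators for a maximally entangled two-qubit
# state measured in the x-z plane satisfy E(a, b) = cos(2 * (theta_a - theta_b)).

def correlator(theta_a, theta_b):
    return math.cos(2 * (theta_a - theta_b))

# Measurement angles achieving the optimal quantum violation.
a0, a1 = 0.0, math.pi / 4
b0, b1 = math.pi / 8, -math.pi / 8

S = (correlator(a0, b0) + correlator(a0, b1)
     + correlator(a1, b0) - correlator(a1, b1))

print(S)      # 2 * sqrt(2) ≈ 2.828, the Tsirelson bound
assert S > 2  # violates the classical (local hidden variable) bound
```

A violation of this inequality certified from observed statistics alone is exactly the black-box resource that the amplification-of-randomness result consumes: no model in which the outcomes are predetermined can reproduce S > 2.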