
    Structure and Problem Hardness: Goal Asymmetry and DPLL Proofs in SAT-Based Planning

    In Verification and in (optimal) AI Planning, a successful method is to formulate the application as Boolean satisfiability (SAT), and solve it with state-of-the-art DPLL-based procedures. There is a lack of understanding of why this works so well. Focusing on the Planning context, we identify a form of problem structure concerned with the symmetrical or asymmetrical nature of the cost of achieving the individual planning goals. We quantify this sort of structure with a simple numeric parameter called AsymRatio, ranging between 0 and 1. We run experiments in 10 benchmark domains from the International Planning Competitions since 2000; we show that AsymRatio is a good indicator of SAT solver performance in 8 of these domains. We then examine carefully crafted synthetic planning domains that allow control of the amount of structure, and that are clean enough for a rigorous analysis of the combinatorial search space. The domains are parameterized by size, and by the amount of structure. The CNFs we examine are unsatisfiable, encoding one planning step less than the length of the optimal plan. We prove upper and lower bounds on the size of the best possible DPLL refutations, under different settings of the amount of structure, as a function of size. We also identify the best possible sets of branching variables (backdoors). With minimum AsymRatio, we prove exponential lower bounds, and identify minimal backdoors of size linear in the number of variables. With maximum AsymRatio, we identify logarithmic DPLL refutations (and backdoors), showing a doubly exponential gap between the two structural extreme cases. The reasons for this behavior -- the proof arguments -- illuminate the prototypical patterns of structure causing the empirical behavior observed in the competition benchmarks.
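
    To make the flavor of such a parameter concrete, the sketch below computes a simple goal-cost asymmetry ratio; the formula (most expensive single goal over the summed goal costs) and the function name are illustrative assumptions, not the paper's exact definition of AsymRatio.

```python
# Hypothetical sketch: quantify how asymmetric the per-goal costs of a
# planning task are. The paper's actual AsymRatio definition may differ;
# this version takes the most expensive single goal relative to the total.

def asym_ratio(goal_costs):
    """Return a value in (0, 1]: near 1/n when the n goals cost roughly the
    same, approaching 1 when a single goal dominates the total cost."""
    total = sum(goal_costs)
    if total == 0:
        return 0.0
    return max(goal_costs) / total

# Example: four goals with equal estimated costs vs. one dominating goal.
print(asym_ratio([5, 5, 5, 5]))   # 0.25 -- symmetric
print(asym_ratio([20, 1, 1, 1]))  # ~0.87 -- highly asymmetric
```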

    On Secure Distributed Data Storage Under Repair Dynamics

    We address the problem of securing distributed storage systems against passive eavesdroppers that can observe a limited number of storage nodes. An important aspect of these systems is node failures over time, which demand a repair mechanism aimed at maintaining a targeted high level of system reliability. If an eavesdropper observes a node that is added to the system to replace a failed node, it will have access to all the data downloaded during repair, which can potentially compromise the entire information in the system. We are interested in determining the secrecy capacity of distributed storage systems under repair dynamics, i.e., the maximum amount of data that can be securely stored and made available to a legitimate user without revealing any information to any eavesdropper. We derive a general upper bound on the secrecy capacity and show that this bound is tight for the bandwidth-limited regime which is of importance in scenarios such as peer-to-peer distributed storage systems. We also provide a simple explicit code construction that achieves the capacity for this regime.
    Comment: 5 pages, 4 figures, to appear in Proceedings of IEEE ISIT 201
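
    For context, bounds of this kind are typically stated in terms of the per-node storage \alpha, the per-node repair download \beta, the repair degree d, the reconstruction parameter k, and the number of observed nodes \ell. The expression below is a hedged sketch of the general form such an upper bound takes, not a verbatim restatement of the paper's theorem.

```latex
% Sketch of the general form of a secrecy-capacity upper bound for a
% distributed storage system: data recoverable from any k nodes, each repair
% downloads \beta from d helpers, eavesdropper observes \ell nodes.
\[
  C_s \;\le\; \sum_{i=\ell+1}^{k} \min\bigl\{\alpha,\; (d-i+1)\beta\bigr\}
\]
% In the bandwidth-limited regime the minimum is attained by (d-i+1)\beta.
```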

    Beyond Markov Chains, Towards Adaptive Memristor Network-based Music Generation

    We undertook a study of the use of a memristor network for music generation, making use of the memristor's memory to go beyond the Markov hypothesis. Seed transition matrices are created and populated using memristor equations, and are shown to generate musical melodies that change in style over time as a result of feedback into the transition matrix. The spiking properties of simple memristor networks are demonstrated and discussed with reference to applications in music making. The limitations of simulating composing memristor networks in von Neumann hardware are discussed, and a hardware solution based on physical memristor properties is presented.
    Comment: 22 pages, 13 pages, conference paper
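
    To make the transition-matrix idea concrete, here is a minimal sketch of Markov-style melody generation with a feedback update to the matrix after each emitted note; the update rule and pitch set are invented for illustration and stand in for the memristor equations described in the paper.

```python
import random

# Hypothetical sketch: a first-order transition matrix over a small pitch set,
# with a simple feedback rule that reinforces recently used transitions.
# The paper derives and updates its matrices from memristor equations instead.
NOTES = ["C", "D", "E", "F", "G", "A", "B"]

def generate(matrix, start, length, feedback=0.05):
    melody, current = [start], start
    for _ in range(length - 1):
        weights = matrix[current]
        nxt = random.choices(NOTES, weights=weights)[0]
        # Feedback: strengthen the transition just taken, then renormalize,
        # so the style of the generated melody drifts over time.
        weights[NOTES.index(nxt)] += feedback
        total = sum(weights)
        matrix[current] = [w / total for w in weights]
        melody.append(nxt)
        current = nxt
    return melody

# Uniform seed matrix; in the paper the seed comes from memristor dynamics.
seed = {n: [1.0 / len(NOTES)] * len(NOTES) for n in NOTES}
print(generate(seed, "C", 16))
```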

    A spatially-structured PCG method for content diversity in a Physics-based simulation game

    This paper presents a spatially-structured evolutionary algorithm (EA) to procedurally generate game maps of different levels of difficulty in Gravityvolve!, a physics-based simulation videogame that we have implemented and which is inspired by the n-body problem, a classical problem in the field of physics and mathematics. The proposal consists of a steady-state EA whose population is partitioned into three groups according to the difficulty of the generated content (hard, medium or easy), and which can be easily adapted to handle the automatic creation of content of diverse nature in other games. In addition, we present three fitness functions, based on multiple criteria (i.e., intersections, gravitational acceleration and simulations), that were used experimentally to conduct the search process for creating a database of maps of different difficulty in Gravityvolve!.
    Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
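
    A minimal sketch of the partitioned steady-state idea follows; the difficulty measure, bucket thresholds, population sizes, and variation operators are placeholders invented for illustration and are not the paper's actual fitness functions or map representation.

```python
import random

# Hypothetical sketch of a steady-state EA whose population is split into
# three difficulty buckets (easy / medium / hard). Each step produces one
# offspring and files it into the bucket its measured difficulty indicates.
def difficulty(genome):
    # Placeholder difficulty measure; the paper uses physics-based criteria
    # such as intersections, gravitational acceleration and simulations.
    return sum(genome) / len(genome)

def bucket(d):
    return "easy" if d < 0.33 else "medium" if d < 0.66 else "hard"

def steady_state_step(pop, genome_len=10):
    parents = random.sample([g for group in pop.values() for g in group], 2)
    cut = random.randrange(1, genome_len)          # one-point crossover
    child = parents[0][:cut] + parents[1][cut:]
    if random.random() < 0.1:                      # mutation
        child[random.randrange(genome_len)] = random.random()
    b = bucket(difficulty(child))
    pop[b].append(child)
    if len(pop[b]) > 20:                           # keep each bucket bounded
        pop[b].pop(0)

pop = {b: [[random.random() for _ in range(10)] for _ in range(5)]
       for b in ("easy", "medium", "hard")}
for _ in range(100):
    steady_state_step(pop)
print({b: len(group) for b, group in pop.items()})
```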

    Privacy preserving linkage and sharing of sensitive data

    Sensitive data, such as personal and business information, is collected by many service providers nowadays. This data is considered a rich source of information for research purposes that could benefit individuals, researchers and service providers. However, because of the sensitivity of such data, privacy concerns, legislation, and conflicts of interest, data holders are reluctant to share their data with others. Data holders typically filter out or obliterate privacy-related sensitive information from their data before sharing it, which limits the utility of this data and affects the accuracy of research. Such practice will protect individuals' privacy; however, it prevents researchers from linking records belonging to the same individual across different sources. This is commonly referred to as the record linkage problem by the healthcare industry. In this dissertation, our main focus is on designing and implementing efficient privacy preserving methods that will encourage sensitive information sources to share their data with researchers without compromising the privacy of the clients or affecting the quality of the research data. The proposed solution should be scalable and efficient for real-world deployments and provide good privacy assurance. While this problem has been investigated before, most of the proposed solutions were either considered partial solutions, not accurate, or impractical, and therefore subject to further improvements. We have identified several issues and limitations in the state of the art solutions and provided a number of contributions that improve upon existing solutions. Our first contribution is the design of a privacy preserving record linkage protocol using a semi-trusted third party. The protocol allows a set of data publishers (data holders) who compete with each other to share sensitive information with subscribers (researchers) while preserving the privacy of their clients and without sharing encryption keys. Our second contribution is the design and implementation of a probabilistic privacy preserving record linkage protocol that accommodates discrepancies and errors in the data such as typos. This work builds upon the previous work by linking the records that are similar, where the similarity range is formally defined. Our third contribution is a protocol that performs information integration and sharing without third party services. We use garbled circuits secure computation to design and build a system that performs the record linkage between two parties without sharing their data. Our design uses Bloom filters as inputs to the garbled circuits and performs a probabilistic record linkage using the Dice coefficient similarity measure. As garbled circuits are known for their expensive computations, we propose new approaches that reduce the computation overhead needed to achieve a given level of privacy. We built a scalable record linkage system using garbled circuits that could be deployed in a distributed computation environment like the cloud, and evaluated its security and performance. One of the performance issues for linking large datasets is the amount of secure computation needed to compare every pair of records across the linked datasets to find all possible record matches. To reduce the amount of computation, a method known as blocking is used to filter out as many as possible of the record pairs that will not match, and to limit the comparison to a subset of the record pairs (called candidate pairs) that possibly match.
    Most of the current blocking methods either require the parties to share blocking keys (called block identifiers), extracted from the domain of some record attributes (termed blocking variables), or to share reference data points and group their records around these points using some similarity measure. Though these methods reduce the computation substantially, they leak too much information about the records within each block. Toward this end, we proposed a novel privacy preserving approximate blocking scheme that allows parties to generate the list of candidate pairs with high accuracy while protecting the privacy of the records in each block. Our scheme is configurable, such that the desired trade-off between performance and accuracy can be achieved according to the required level of privacy. We analyzed the accuracy and privacy of our scheme, implemented a prototype of the scheme, and experimentally evaluated its accuracy and performance against different levels of privacy.
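
    As a plaintext illustration of the similarity measure involved (the dissertation evaluates it inside garbled circuits rather than in the clear), the sketch below encodes strings as Bloom filters over character bigrams and compares them with the Dice coefficient; the filter size, hash count, and bigram tokenization are assumptions made for the example.

```python
import hashlib

# Hypothetical sketch: Bloom-filter encoding of identifiers plus a
# Dice-coefficient comparison, computed in the clear. A privacy preserving
# deployment would evaluate the comparison inside a secure computation
# (e.g. garbled circuits) instead of exposing the filters.
M, K = 256, 4  # filter size in bits and number of hash functions (assumed)

def bigrams(s):
    s = f"_{s.lower()}_"
    return {s[i:i + 2] for i in range(len(s) - 1)}

def bloom(s):
    bits = set()
    for gram in bigrams(s):
        for k in range(K):
            h = hashlib.sha256(f"{k}:{gram}".encode()).hexdigest()
            bits.add(int(h, 16) % M)
    return bits

def dice(a, b):
    # Dice coefficient over the set bits: 2|A ∩ B| / (|A| + |B|)
    return 2 * len(a & b) / (len(a) + len(b)) if (a or b) else 1.0

print(dice(bloom("katherine"), bloom("catherine")))  # high similarity (typo)
print(dice(bloom("katherine"), bloom("robert")))     # low similarity
```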

    Approaching the Rate-Distortion Limit with Spatial Coupling, Belief Propagation and Decimation

    Get PDF
    We investigate an encoding scheme for lossy compression of a binary symmetric source based on simple spatially coupled Low-Density Generator-Matrix codes. The check-node degree is regular, and the code-bit degree is Poisson distributed with an average depending on the compression rate. The performance of a low-complexity Belief Propagation Guided Decimation algorithm is excellent. The algorithmic rate-distortion curve approaches the optimal curve of the ensemble as the width of the coupling window grows. Moreover, as the check degree grows, both curves approach the ultimate Shannon rate-distortion limit. The Belief Propagation Guided Decimation encoder is based on the posterior measure of a binary symmetric test-channel. This measure can be interpreted as a random Gibbs measure at a "temperature" directly related to the "noise level of the test-channel". We investigate the links between the algorithmic performance of the Belief Propagation Guided Decimation encoder and the phase diagram of this Gibbs measure. The phase diagram is investigated thanks to the cavity method of spin glass theory, which predicts a number of phase transition thresholds. In particular, the dynamical and condensation "phase transition temperatures" (equivalently, test-channel noise thresholds) are computed. We observe that: (i) the dynamical temperature of the spatially coupled construction saturates towards the condensation temperature; (ii) for large degrees, the condensation temperature approaches the temperature (i.e. noise level) related to the information-theoretic Shannon test-channel noise parameter of rate-distortion theory. This provides heuristic insight into the excellent performance of the Belief Propagation Guided Decimation algorithm. The paper contains an introduction to the cavity method.
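
    For reference, the ultimate Shannon rate-distortion limit referred to above has a closed form for a binary symmetric (Bernoulli(1/2)) source under Hamming distortion:

```latex
% Rate-distortion function of a Bernoulli(1/2) source under Hamming distortion.
\[
  R(D) \;=\; 1 - h_2(D), \qquad 0 \le D \le \tfrac{1}{2},
\]
% where h_2 is the binary entropy function
\[
  h_2(D) \;=\; -D \log_2 D \;-\; (1 - D)\log_2 (1 - D).
\]
```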

    A Distribution Network Reconfiguration and Islanding Strategy

    With the development of the Smart Grid, the reliability and stability of the power system are significantly improved. However, a large-scale outage can still occur when the power system is exposed to extreme conditions. Power system blackstart, the restoration after a complete or partial outage, is a key issue that needs to be studied for the safety of the power system. Network reconfiguration is one of the most important steps when crews try to rapidly restore the network. Therefore, planning an optimal network reconfiguration scheme with the most efficient restoration target at the primary stage of system restoration is necessary, and it also lays the foundation for the subsequent restoration process. In addition, the utilization of distributed generators (DGs) has risen sharply in the power system, and DGs play a critical role in modernizing the power grid in the future Smart Grid. The emerging Smart Grid technology, which enables self-sufficient power systems with DGs, provides further opportunities to enhance self-healing capability. The introduction of DGs makes a quick and efficient restoration of the power system possible. In this thesis, based on the topological characteristics of scale-free networks and the Discrete Particle Swarm Optimization (DPSO) algorithm, a network reconfiguration scheme is proposed. A power system structure can be converted into a graph consisting of nodes and edges. Indices that reflect the nodes' and edges' topological characteristics in graph theory can be utilized to describe the importance of loads and transmission lines in the power system. Therefore, indices such as node importance degree, line betweenness centrality and clustering coefficient are introduced to weigh the importance of loads and transmission lines. Based on these indices, an objective function, subject to constraints, that aims to restore as many important loads and transmission lines as possible is formulated. The effectiveness of a potential reconfiguration scheme is verified by a Depth First Search (DFS) algorithm. Finally, the DPSO algorithm is employed to obtain the optimal reconfiguration scheme. The comprehensive reconfiguration scheme proposed in this thesis can serve as a theoretical basis for power grid dispatchers. In addition, DGs are introduced in this thesis to enhance the restoration efficiency and success rate at the primary stage of network restoration. First, the selection and classification principles of DGs are introduced. Next, the start sequence principle of DGs is presented as a foundation for the subsequent stability analysis of network restoration with DGs. Then, an objective function, subject to constraints, that aims to restore as many important loads as possible is formulated. Based on the restoration objective, islands that include some of the important and restorable loads are formed, because the DGs' capacity cannot support a complete restoration of the outage areas. Finally, DPSO is used to obtain the optimal islanding strategy, and a state sequence matrix is utilized to represent the solution space. It is believed that this work will provide some useful insight into improving power system resiliency in the face of extreme events such as natural or man-made disasters.
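
    A small sketch of the graph-index weighting idea follows, using networkx to compute degree, betweenness and clustering indices on a toy network and combine them into a restoration priority score; the toy topology and the combination weights are invented for illustration and are not the thesis's actual objective function.

```python
import networkx as nx

# Hypothetical sketch: rank buses and lines of a toy network for restoration
# priority using topological indices (degree, betweenness, clustering),
# in the spirit of the node/edge importance measures described above.
G = nx.Graph()
G.add_edges_from([(1, 2), (2, 3), (3, 4), (4, 5), (2, 5), (5, 6), (3, 6)])

deg = nx.degree_centrality(G)
btw = nx.betweenness_centrality(G)
clu = nx.clustering(G)

# Illustrative node importance: weighted mix of the three indices
# (the weights 0.4 / 0.4 / 0.2 are arbitrary placeholders).
importance = {n: 0.4 * deg[n] + 0.4 * btw[n] + 0.2 * clu[n] for n in G}
edge_btw = nx.edge_betweenness_centrality(G)   # line (edge) importance

print(sorted(importance, key=importance.get, reverse=True))  # node priority
print(max(edge_btw, key=edge_btw.get))                       # most critical line
```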

    Practical free-space quantum key distribution

    Within the last two decades, the world has seen an exponential increase in the quantity of data traffic exchanged electronically. Currently, the widespread use of classical encryption technology provides tolerable levels of security for data in day-to-day life. However, with one somewhat impractical exception, these technologies are based on mathematical complexity and have never been proven to be secure. Significant advances in mathematics or new computer architectures could render these technologies obsolete in a very short timescale. By contrast, Quantum Key Distribution (or Quantum Cryptography, as it is sometimes called) offers a theoretically secure method of cryptographic key generation and exchange which is guaranteed by physical laws. Moreover, the technique is capable of eavesdropper detection during the key exchange process. Much research and development work has been undertaken, but most of this work has concentrated on the use of optical fibres as the transmission medium for the quantum channel. This thesis discusses the requirements, theoretical basis and practical development of a compact, free-space transmission quantum key distribution system from inception to system tests. Experiments conducted over several distances are outlined which verify the feasibility of quantum key distribution operating continuously over ranges from metres to intercity distances and finally to global reach via the use of satellites.
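
    As an illustration of the eavesdropper-detection mechanism behind quantum key distribution, here is a minimal classical simulation of BB84-style key sifting and error estimation; the key length, sample fraction, and abort criterion are arbitrary choices, and the quantum channel is modeled only as classical bookkeeping.

```python
import secrets

# Hypothetical sketch of BB84-style sifting and error estimation, the
# mechanism behind eavesdropper detection. A real free-space system encodes
# the bits on single photons; this simulation only tracks the bookkeeping.
def random_bits(n):
    return [secrets.randbelow(2) for _ in range(n)]

n = 1000
alice_bits, alice_bases = random_bits(n), random_bits(n)
bob_bases = random_bits(n)

# Bob's result matches Alice's bit when the bases agree; otherwise it is random.
bob_bits = [a if ab == bb else secrets.randbelow(2)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: keep only the positions where the bases matched (~half of them).
sifted = [(a, b) for a, b, ab, bb in zip(alice_bits, bob_bits,
                                         alice_bases, bob_bases) if ab == bb]

# Error estimation: sacrifice a sample of the sifted key; a high error rate
# signals eavesdropping (or excessive channel noise) and the key is discarded.
sample = sifted[: len(sifted) // 10]
qber = sum(a != b for a, b in sample) / max(len(sample), 1)
print(len(sifted), qber)
```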