5,005 research outputs found

    Self-Stabilizing Token Distribution with Constant-Space for Trees

    Self-stabilizing and silent distributed algorithms for token distribution in rooted tree networks are given. Initially, each process of the network holds at most l tokens; the goal is to distribute the tokens across the whole network so that every process holds exactly k tokens. In the initial configuration, the total number of tokens in the network may differ from nk, where n is the number of processes, so the root process is given the ability to create new tokens or remove tokens from the network. We aim to minimize the convergence time, the number of token moves, and the space complexity. A self-stabilizing token distribution algorithm is given that converges within O(nl) asynchronous rounds and needs Θ(nhε) redundant (unnecessary) token moves, where ε = min(k, l-k) and h is the height of the tree network. Two novel ideas to reduce the number of redundant token moves are presented: one reduces it to O(nh) at no additional cost, while the other reduces it to O(n) but increases the convergence time to O(nhl). All the algorithms given use constant space at each process and each link register.
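
    To make the problem concrete, here is a minimal centralized Python sketch of the token-distribution task itself (not the paper's self-stabilizing, constant-space algorithm): surpluses are pushed toward the root, which then creates or removes tokens so that every process ends up with exactly k. All names are illustrative.

```python
# Centralized sketch of the token-distribution problem on a rooted tree.
# A negative intermediate count models tokens owed, i.e. flowing downward.

def distribute_tokens(children, tokens, root, k):
    """children: dict node -> list of child nodes; tokens: dict node -> count."""
    def rebalance(v):
        for c in children.get(v, []):
            rebalance(c)
            surplus = tokens[c] - k   # excess (or deficit) at the child
            tokens[c] -= surplus      # child keeps exactly k tokens
            tokens[v] += surplus      # surplus moves over the edge (v, c)
        return tokens[v]

    rebalance(root)
    tokens[root] = k                  # root creates/removes tokens as needed
    return tokens

tree = {"r": ["a", "b"], "a": ["c"]}
print(distribute_tokens(tree, {"r": 0, "a": 3, "b": 0, "c": 2}, "r", 1))
# {'r': 1, 'a': 1, 'b': 1, 'c': 1}  -- the total is adjusted at the root
```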

    Fast Discrete Consensus Based on Gossip for Makespan Minimization in Networked Systems

    In this paper we propose a novel algorithm for the discrete consensus problem, i.e., the problem of evenly distributing a set of tokens of arbitrary weight among the nodes of a networked system. Tokens represent tasks to be executed by the nodes, and the proposed distributed algorithm monotonically decreases the makespan of the assigned tasks. The algorithm is based on gossip-like asynchronous local interactions between the nodes. Its convergence time improves on the state of the art in discrete and quantized consensus by at least a factor of O(n), in both theoretical and empirical comparisons.
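
    As a rough illustration of the gossip idea (a sketch under simplified assumptions, not the paper's algorithm or its O(n) speed-up): two adjacent nodes repeatedly move a weighted task between them whenever doing so strictly lowers the larger of their two loads, so the makespan never increases.

```python
import random

def gossip_balance(edges, loads, rounds=10_000, seed=0):
    """edges: list of (u, v) pairs; loads: dict node -> list of task weights."""
    rng = random.Random(seed)
    for _ in range(rounds):
        u, v = rng.choice(edges)              # gossip-like pairwise interaction
        if sum(loads[u]) < sum(loads[v]):
            u, v = v, u                       # u is the more loaded node
        gap = sum(loads[u]) - sum(loads[v])
        # Moving weight w strictly reduces max(load_u, load_v) iff w < gap;
        # greedily pick the largest such task (a heuristic choice).
        movable = [w for w in loads[u] if w < gap]
        if movable:
            w = max(movable)
            loads[u].remove(w)
            loads[v].append(w)
    return max(sum(ts) for ts in loads.values())   # resulting makespan

loads = {0: [5, 3, 2], 1: [1], 2: []}
print(gossip_balance([(0, 1), (1, 2)], loads))     # makespan after gossiping
```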

    IST Austria Thesis

    This thesis considers two examples of reconfiguration problems: flipping edges in edge-labelled triangulations of planar point sets, and swapping labelled tokens placed on the vertices of a graph. In both cases the studied structures – all triangulations of a given point set, or all token placements on a given graph – can be thought of as the vertices of a so-called reconfiguration graph, in which two vertices are adjacent if the corresponding structures differ by a single elementary operation: a flip of a diagonal in a triangulation, or a swap of tokens on adjacent vertices, respectively. We study the reconfiguration of one instance of a structure into another via (shortest) paths in the reconfiguration graph. For triangulations of point sets in which each edge has a unique label, and a flip transfers the label of the removed edge to the new edge, we prove a polynomial-time testable condition, called the Orbit Theorem, that characterizes when two triangulations of the same point set lie in the same connected component of the reconfiguration graph. The condition was first conjectured by Bose, Lubiw, Pathak and Verdonschot. We additionally provide a polynomial-time algorithm that computes a reconfiguring flip sequence, if one exists. Our proof of the Orbit Theorem uses topological properties of a certain high-dimensional cell complex that has the usual reconfiguration graph as its 1-skeleton. In the context of token swapping on a tree, we make partial progress on the problem of finding shortest reconfiguration sequences: we disprove the so-called Happy Leaf Conjecture and demonstrate the importance of swapping tokens that are already placed at their correct vertices. We also prove that a generalization of the problem, weighted coloured token swapping, is NP-hard on trees but solvable in polynomial time on paths and stars.
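
    For the simplest special case, a path, token swapping is classical: the minimum number of swaps of tokens on adjacent vertices equals the number of inversions of the placement. A tiny illustrative sketch (the thesis's tree results are far subtler):

```python
# Vertex i of the path holds the distinct token perm[i]; only tokens on
# adjacent vertices may be swapped. The optimum equals the inversion count.

def min_swaps_on_path(perm):
    """Count inversions = optimal number of adjacent token swaps."""
    return sum(perm[i] > perm[j]
               for i in range(len(perm))
               for j in range(i + 1, len(perm)))

print(min_swaps_on_path([2, 0, 1]))  # 2 swaps: [2,0,1] -> [0,2,1] -> [0,1,2]
```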

    Learning to Infer Graphics Programs from Hand-Drawn Images

    We introduce a model that learns to convert simple hand drawings into graphics programs written in a subset of \LaTeX. The model combines techniques from deep learning and program synthesis. We learn a convolutional neural network that proposes plausible drawing primitives that explain an image; these primitives are like a trace of the set of primitive commands issued by a graphics program. We then learn a model that uses program synthesis techniques to recover a graphics program from that trace. These programs have constructs like variable bindings, iterative loops, and simple kinds of conditionals. With a graphics program in hand, we can correct errors made by the deep network, measure similarity between drawings by their high-level geometric structure, and extrapolate drawings. Taken together, these results are a step towards agents that induce useful, human-readable programs from perceptual input.
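
    As a toy illustration of the trace-to-program step (far simpler than the paper's synthesizer; the primitive format here is made up): detect a regular pattern in a trace of drawing commands and compress it into a loop.

```python
# Hypothetical trace format: a list of ('circle', x, y) commands.

def synthesize(trace):
    """Return a loop if the circle positions form an arithmetic progression."""
    xs = [x for _, x, _ in trace]
    ys = [y for _, _, y in trace]
    if len(trace) >= 3 and len(set(ys)) == 1:
        step = xs[1] - xs[0]
        if all(xs[i + 1] - xs[i] == step for i in range(len(xs) - 1)):
            return f"for i in range({len(trace)}): circle({xs[0]} + {step}*i, {ys[0]})"
    return "; ".join(f"circle({x}, {y})" for _, x, y in trace)

print(synthesize([("circle", 0, 5), ("circle", 4, 5), ("circle", 8, 5)]))
# for i in range(3): circle(0 + 4*i, 5)
```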

    JaxNet: Scalable Blockchain Network

    Today's world is organized around merit and value, and a global economy needs a single, decentralized global currency. Bitcoin is a partial answer to this need; however, it suffers from scalability problems that prevent it from being mass-adopted. Moreover, the deflationary nature of Bitcoin motivates people to hoard and speculate on it rather than use it for day-to-day transactions. We propose a scalable, decentralized cryptocurrency based on Proof of Work. The solution involves parallel chains in a closed network, with a mechanism that rewards miners in proportion to their effort in maintaining the network. The proposed design introduces a novel approach, based on merged mining, to solving the scalability problem in blockchain networks. Comment: 55 pages, 10 figures.

    Empirical Analysis of the Top 800 Cryptocurrencies using Machine Learning Techniques

    The International Token Classification (ITC) Framework by the Blockchain Center in Frankfurt classifies 795 cryptocurrency tokens based on their economic, technological, legal and industry categorization. This work analyzes cryptocurrency data to evaluate the categorization against real-world market data. The feature space includes price, volume and market capitalization data; additional metrics such as the moving average and the relative strength index are added to gain a more in-depth understanding of market movements. The data set is used to build supervised and unsupervised machine learning models. Prediction accuracies varied among labels and all remained below 90%: the technological label had the highest prediction accuracy, 88.9%, using Random Forests, and the economic label could be predicted with an accuracy of 81.7% using K-Nearest Neighbors. Classification using machine learning techniques is thus not yet accurate enough to automate the classification process, but it can be improved by adding further features. The unsupervised clustering shows that there are more layers to the data that could be added to the ITC; the additional categories build upon a combination of token mining, maximal supply, volume and market capitalization data. As a result, we suggest that a data-driven extension of the categorization into a token profile would allow investors and regulators to gain a deeper understanding of token performance, maturity and usage.
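
    A hedged sketch of the kind of supervised experiment described above (the features and labels below are synthetic stand-ins; the ITC data set itself is not reproduced here):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Hypothetical features: price, volume, market cap, moving average, RSI.
X = rng.normal(size=(795, 5))
y = rng.integers(0, 3, size=795)   # hypothetical technological label (3 classes)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"accuracy: {accuracy_score(y_te, clf.predict(X_te)):.3f}")
```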

    Memorization for Good: Encryption with Autoregressive Language Models

    Over-parameterized neural language models (LMs) can memorize and recite long sequences of training data. While such memorization is normally associated with undesired properties such as overfitting and information leakage, our work casts memorization as an unexplored capability of LMs. We propose the first symmetric encryption algorithm with autoregressive language models (SELM). We show that autoregressive LMs can encode arbitrary data into a compact real-valued vector (i.e., encryption) and then losslessly decode the vector back to the original message (i.e., decryption) via random subspace optimization and greedy decoding. While SELM is not amenable to conventional cryptanalysis, we investigate its security through a novel empirical variant of the classic IND-CPA (indistinguishability under chosen-plaintext attack) game and show promising results on security. Our code and datasets are available at https://github.com/OSU-NLP-Group/SELM. Comment: Main text: 9 pages, 4 figures, 1 table. Work in progress. Project website at https://samuelstevens.me/research/encryption
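
    The IND-CPA game itself has a simple generic structure, sketched below (this is the textbook game, not SELM's empirical variant): the adversary submits two equal-length messages, the challenger encrypts one chosen by a secret coin, and the adversary tries to guess which. A scheme is secure if no efficient adversary wins with probability non-negligibly above 1/2.

```python
import secrets

def ind_cpa_game(encrypt, adversary_choose, adversary_guess):
    m0, m1 = adversary_choose()          # two equal-length plaintexts
    assert len(m0) == len(m1)
    b = secrets.randbits(1)              # challenger's secret coin
    c = encrypt(m0 if b == 0 else m1)    # challenge ciphertext
    return adversary_guess(c) == b       # did the adversary win?

# Toy run against a (deliberately insecure) identity "cipher":
wins = sum(ind_cpa_game(lambda m: m,
                        lambda: (b"attack", b"retrea"),
                        lambda c: 0 if c == b"attack" else 1)
           for _ in range(1000))
print(wins / 1000)   # ~1.0: the identity cipher is trivially distinguishable
```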

    07391 Abstracts Collection -- Probabilistic Methods in the Design and Analysis of Algorithms

    From 23.09.2007 to 28.09.2007, the Dagstuhl Seminar 07391 "Probabilistic Methods in the Design and Analysis of Algorithms" was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. The seminar brought together leading researchers in probabilistic methods to strengthen and foster collaborations among various areas of Theoretical Computer Science. The interaction between researchers using randomization in algorithm design and researchers studying known algorithms and heuristics in probabilistic models enhanced the research of both groups in developing new complexity frameworks and in obtaining new algorithmic results. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar, as well as abstracts of seminar results and ideas, are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided where available.

    Improved Internet Security Protocols Using Cryptographic One-Way Hash Chains

    In this dissertation, new approaches that utilize one-way cryptographic hash functions in designing improved network security protocols are investigated. The proposed approaches are designed to be scalable and easy to implement in modern technology. The first contribution explores session cookies, with emphasis on the threat of session hijacking attacks resulting from session cookie theft or sniffing. In the proposed scheme, these cookies are replaced by easily computed authentication credentials based on Lamport's well-known one-time passwords. The basic idea of this scheme revolves around utilizing sparse caching units, where authentication credentials pertaining to cookies are stored and fetched when needed, thereby mitigating the computational overhead generally associated with one-way hash constructions. The second and third proposed schemes rely on dividing the one-way hash construction into a hierarchical two-tier construction, where each tier component is responsible for some aspect of authentication and uses a different hash function. By arranging different cryptographic hash functions in two tiers, the hierarchical two-tier protocol (our second contribution) achieves a significant performance improvement over previously proposed solutions for securing Internet cookies. By indexing authentication credentials by their position within the hash chain in a multi-dimensional chain, the third contribution achieves further performance improvements. In the fourth proposed scheme, the one-way hash construction is applied to user and broadcast authentication in wireless sensor networks; owing to known energy and memory constraints, the scheme is modified to mitigate computational overhead so it can be applied readily in this setting. The fifth scheme combines the benefits of the sparse cache-supported scheme and the hierarchical scheme, achieving efficient performance at the lowest possible caching cost. In the sixth proposal, an authentication scheme tailored to the multi-server single sign-on (SSO) environment is presented; it utilizes the one-way hash construction in a Merkle hash tree and a hash calendar to prevent impersonation and session hijacking attacks, and it explores the optimal configuration of the one-way hash chain in this environment. All the proposed protocols are validated by extensive experimental analysis, obtained by running simulations of the many scenarios envisioned. These simulations are supported by analytical models derived from mathematical formulas that take the environment under investigation into consideration.
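
    Several of these schemes build on Lamport's one-time-password hash chain, whose generic form (not the dissertation's cached or two-tier variants) can be sketched as follows: the server stores only the chain tip, and the client reveals successive preimages, each verified by a single hash.

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def make_chain(seed: bytes, n: int):
    """Return [seed, h(seed), h^2(seed), ..., h^n(seed)]."""
    chain = [seed]
    for _ in range(n):
        chain.append(h(chain[-1]))
    return chain

chain = make_chain(b"secret-seed", 1000)
server_anchor = chain[-1]               # the server stores only h^1000(seed)

# i-th login: the client sends h^(1000-i)(seed); the server hashes once,
# compares against its stored anchor, then replaces the anchor.
credential = chain[-2]                  # first one-time password
assert h(credential) == server_anchor   # server accepts; anchor becomes credential
```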