    Domain Randomization and Generative Models for Robotic Grasping

    Deep learning-based robotic grasping has made significant progress thanks to algorithmic improvements and increased data availability. However, state-of-the-art models are often trained on as few as hundreds or thousands of unique object instances, and as a result generalization can be a challenge. In this work, we explore a novel data generation pipeline for training a deep neural network to perform grasp planning that applies the idea of domain randomization to object synthesis. We generate millions of unique, unrealistic procedurally generated objects, and train a deep neural network to perform grasp planning on these objects. Since the distribution of successful grasps for a given object can be highly multimodal, we propose an autoregressive grasp planning model that maps sensor inputs of a scene to a probability distribution over possible grasps. This model allows us to sample grasps efficiently at test time (or avoid sampling entirely). We evaluate our model architecture and data generation pipeline in simulation and the real world. We find we can achieve a >90% success rate on previously unseen realistic objects at test time in simulation despite having only been trained on random objects. We also demonstrate an 80% success rate on real-world grasp attempts despite having only been trained on random simulated objects. Comment: 8 pages, 11 figures. Submitted to the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2018).
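
    The abstract does not give implementation details, so the following is only a minimal sketch of what an autoregressive grasp-sampling head could look like, assuming a PyTorch setup, a precomputed image embedding, and grasp parameters discretized into bins; the class name, dimensions and layer sizes are hypothetical, not the authors' architecture.

    import torch
    import torch.nn as nn

    class AutoregressiveGraspHead(nn.Module):
        """Hypothetical sketch: sample grasp parameters (e.g. x, y, angle) one at a
        time, each conditioned on the image embedding and the parameters already drawn."""
        def __init__(self, embed_dim=256, n_params=3, n_bins=64):
            super().__init__()
            self.n_bins = n_bins
            # one small MLP per grasp parameter; its input is the image embedding
            # concatenated with the one-hot encodings of the previously sampled bins
            self.heads = nn.ModuleList([
                nn.Sequential(nn.Linear(embed_dim + i * n_bins, 128), nn.ReLU(),
                              nn.Linear(128, n_bins))
                for i in range(n_params)
            ])

        def sample(self, embedding):
            """Draw one grasp per embedding in the batch; returns bin indices."""
            context, bins = embedding, []
            for head in self.heads:
                logits = head(context)
                b = torch.distributions.Categorical(logits=logits).sample()
                one_hot = nn.functional.one_hot(b, self.n_bins).float()
                context = torch.cat([context, one_hot], dim=-1)  # condition the next parameter
                bins.append(b)
            return torch.stack(bins, dim=-1)

    Replacing each sampling step by an argmax over the logits yields a single deterministic grasp, which is one way to read the abstract's remark that sampling can be avoided entirely.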

    A code for square permutations and convex permutominoes

    In this article we consider square permutations, a natural subclass of permutations defined in terms of geometric conditions that can also be described in terms of pattern-avoiding permutations, and convex permutominoes, a related subclass of polyominoes. While these two classes of objects arose independently in various contexts, they play a natural role in the description of certain random horizontally and vertically convex grid configurations. We propose a common approach to the enumeration of these two classes of objects that allows us to explain the known common form of their generating functions, and to derive new refined formulas and linear-time random generation algorithms for these objects and the associated grid configurations. Comment: 18 pages, 10 figures. Revision according to referees' remarks.
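
    The abstract does not spell out the geometric condition; under the commonly used characterization of square permutations (every point of the permutation is a record, i.e. a left-to-right or right-to-left minimum or maximum), a direct membership test can be sketched as follows. This is an illustrative check only, not the paper's enumeration or random generation algorithm.

    def is_square(perm):
        """True if every entry of perm is a record in at least one of the four
        directions (left-to-right max/min or right-to-left max/min)."""
        n = len(perm)
        record = [False] * n
        for order in (range(n), reversed(range(n))):  # scan left-to-right, then right-to-left
            hi, lo = float('-inf'), float('inf')
            for i in order:
                if perm[i] > hi:
                    hi, record[i] = perm[i], True     # new maximum in this direction
                if perm[i] < lo:
                    lo, record[i] = perm[i], True     # new minimum in this direction
        return all(record)

    # [2, 4, 5, 3, 1] is square; [1, 5, 3, 2, 4] is not (the entry 3 is interior).
    print(is_square([2, 4, 5, 3, 1]), is_square([1, 5, 3, 2, 4]))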

    Boltzmann samplers for random generation of lambda terms

    Randomly generating structured objects is important in testing and optimizing functional programs, and generating random lambda terms is more specifically needed for testing and optimizing compilers. For this purpose, a tool called QuickCheck has been proposed, but it leaves control of the random generation to the programmer. Ten years ago, a method called Boltzmann sampling was proposed to generate combinatorial structures. In this paper, we show how Boltzmann samplers can be developed to generate lambda terms, as well as other data structures such as trees. These samplers rely on a critical value that parametrizes the main random selector; we exhibit this value and explain how it is computed. Haskell programs are given to show how the samplers are actually implemented.
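
    The paper's Haskell programs are not reproduced here, but the core mechanism of a Boltzmann sampler (a random selector whose branching probability is driven by the value of the generating function at the chosen parameter) can be sketched in a few lines of Python for binary trees, one of the "other data structures" mentioned. The specification B(z) = z + z*B(z)^2, with critical value z = 1/2, is an illustrative choice, not the paper's lambda-term grammar.

    import math
    import random
    import sys

    def boltzmann_binary_tree(z, rng=random):
        """Free Boltzmann sampler for binary trees specified by B(z) = z + z*B(z)^2
        (every node, leaf or internal, has size 1). Valid for 0 < z < 1/2; at the
        critical value 1/2 the expected size diverges."""
        B = (1 - math.sqrt(1 - 4 * z * z)) / (2 * z)  # value of the generating function at z
        p_internal = z * B                            # P(root is internal) = z*B(z)^2 / B(z)
        def gen():
            if rng.random() < p_internal:
                return (gen(), gen())                 # internal node: two independent subtrees
            return None                               # leaf
        return gen()

    def size(t):
        return 1 if t is None else 1 + size(t[0]) + size(t[1])

    sys.setrecursionlimit(100000)
    print(size(boltzmann_binary_tree(0.49)))  # parameters closer to 1/2 give larger trees on average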

    Perfect sampling algorithm for Schur processes

    We describe random generation algorithms for a large class of random combinatorial objects called Schur processes, which are sequences of random (integer) partitions subject to certain interlacing conditions. This class contains several fundamental combinatorial objects as special cases, such as plane partitions, tilings of Aztec diamonds, pyramid partitions and, more generally, steep domino tilings of the plane. Our algorithm, which is of polynomial complexity, is both exact (i.e. the output follows exactly the target probability law, which is either Boltzmann or uniform in our case) and entropy optimal (i.e. it reads a minimal number of random bits as an input). The algorithm encompasses previous growth procedures for special Schur processes related to the primal and dual RSK algorithms, as well as the famous domino shuffling algorithm for domino tilings of the Aztec diamond. It can be easily adapted to deal with symmetric Schur processes and general Schur processes involving infinitely many parameters, and it is more concrete and easier to implement than Borodin's algorithm while remaining entropy optimal. At a technical level, it relies on unified bijective proofs of the different types of Cauchy and Littlewood identities for Schur functions, and on an adaptation of Fomin's growth diagram description of the RSK algorithm to that setting. Simulations performed with this algorithm suggest interesting limit shape phenomena for the corresponding tiling models, some of which are new. Comment: 26 pages, 19 figures (v3: final version, corrected a few misprints present in v2).
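
    To make the interlacing conditions concrete: two partitions interlace (their skew difference is a horizontal strip) when lambda_1 >= mu_1 >= lambda_2 >= mu_2 >= ... . The small Python helper below only checks this condition for a candidate sequence of partitions; it is an illustration of the definition, not of the paper's sampling algorithm.

    def interlaces(lam, mu):
        """True if lam_1 >= mu_1 >= lam_2 >= mu_2 >= ..., i.e. lam/mu is a horizontal
        strip. Partitions are weakly decreasing lists, e.g. [4, 2, 1]."""
        k = max(len(lam), len(mu))
        lam = lam + [0] * (k - len(lam) + 1)   # pad with zero parts
        mu = mu + [0] * (k - len(mu) + 1)
        return all(lam[i] >= mu[i] >= lam[i + 1] for i in range(k))

    # A Schur-process-like sequence must interlace (in one direction or the other) at every step:
    seq = [[], [3], [2], [4, 2], [3], []]
    print(all(interlaces(a, b) or interlaces(b, a) for a, b in zip(seq, seq[1:])))  # True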

    On Lightweight Privacy-Preserving Collaborative Learning for IoT Objects

    The Internet of Things (IoT) will be a main data generation infrastructure for achieving better system intelligence. This paper considers the design and implementation of a practical privacy-preserving collaborative learning scheme, in which a curious learning coordinator trains a better machine learning model based on the data samples contributed by a number of IoT objects, while the confidentiality of the raw forms of the training data is protected against the coordinator. Existing distributed machine learning and data encryption approaches incur significant computation and communication overhead, rendering them ill-suited for resource-constrained IoT objects. We study an approach that applies an independent Gaussian random projection at each IoT object to obfuscate the data and trains a deep neural network at the coordinator based on the projected data from the IoT objects. This approach introduces light computation overhead on the IoT objects and moves most of the workload to the coordinator, which can have sufficient computing resources. Although the independent projections performed by the IoT objects address potential collusion between the curious coordinator and some compromised IoT objects, they significantly increase the complexity of the projected data. In this paper, we leverage the superior capability of deep learning to capture sophisticated patterns in order to maintain good learning performance. Extensive comparative evaluation shows that this approach outperforms other lightweight approaches that apply additive noisification for differential privacy and/or support vector machines for learning in applications with light data pattern complexities. Comment: 12 pages, IoTDI 201
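
    The obfuscation step described in the abstract (each IoT object multiplying its raw samples by its own, private Gaussian random projection matrix before upload) is easy to sketch with NumPy; the dimensions and the seeding scheme below are illustrative assumptions, not the paper's parameters.

    import numpy as np

    def project_samples(x, out_dim, seed):
        """Obfuscate raw samples x (shape: n_samples x d) with a Gaussian random
        projection matrix R (shape: d x out_dim) kept private to this IoT object."""
        rng = np.random.default_rng(seed)       # each object draws its own independent matrix
        d = x.shape[1]
        R = rng.normal(0.0, 1.0 / np.sqrt(out_dim), size=(d, out_dim))
        return x @ R                            # only the projected data leaves the device

    # Two objects project their data with independent matrices; the coordinator
    # trains a deep neural network on the pooled projected samples.
    obj1 = project_samples(np.random.rand(100, 32), out_dim=16, seed=1)
    obj2 = project_samples(np.random.rand(100, 32), out_dim=16, seed=2)
    training_inputs = np.vstack([obj1, obj2])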

    Recursive Combinatorial Structures: Enumeration, Probabilistic Analysis and Random Generation

    In a probabilistic context, the main data structures of computer science are viewed as random combinatorial objects. Analytic Combinatorics, as described in the book by Flajolet and Sedgewick, provides a set of high-level tools for their probabilistic analysis. Recursive combinatorial definitions lead to generating function equations from which efficient algorithms can be designed for enumeration, random generation and, to some extent, asymptotic analysis. With a focus on random generation, this tutorial first covers the basics of Analytic Combinatorics and then describes the idea of Boltzmann sampling and its realisation. The tutorial addresses a broad TCS audience and assumes no prior knowledge of analytic combinatorics.
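
    As a minimal example of the pipeline the tutorial describes (a recursive combinatorial definition becomes a generating function equation, which in turn yields an enumeration algorithm), the specification B = Z + Z * B * B for binary trees gives B(z) = z + z*B(z)^2, whose coefficients can be extracted by a simple convolution. The choice of binary trees is an illustrative assumption, not an example taken from the tutorial.

    def count_binary_trees(n_max):
        """Coefficients b[n] of B(z) = z + z*B(z)^2: the number of binary trees with
        n nodes (each node, leaf or internal, has size 1)."""
        b = [0] * (n_max + 1)
        b[1] = 1                                          # a single leaf
        for n in range(2, n_max + 1):
            # an internal root of size 1 plus two subtrees whose sizes sum to n - 1
            b[n] = sum(b[k] * b[n - 1 - k] for k in range(1, n - 1))
        return b

    print(count_binary_trees(9))  # [0, 1, 0, 1, 0, 2, 0, 5, 0, 14]: the Catalan numbers appear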