
    Classification of Random Boolean Networks

    We provide the first classification of different types of Random Boolean Networks (RBNs), studying how RBNs differ depending on the degree of synchronicity and determinism of their updating scheme. To do so, we first define three new types of RBN. With the aid of a public software laboratory we developed, we note similarities and differences between the types. In particular, we find that point attractors are independent of the updating scheme, and that RBNs differ more by their determinism or non-determinism than by their synchronicity or asynchronicity. We also show a way of mapping non-synchronous deterministic RBNs into synchronous RBNs. Our results are important for justifying the use of specific types of RBN for modelling natural phenomena.
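
    The abstract's claim that point attractors are independent of the updating scheme has a short explanation: a point attractor is a state fixed by every node's Boolean function, so any update order leaves it unchanged. Below is a minimal Python sketch of an RBN under synchronous and deterministic asynchronous updating; the network size, random seed, and helper names are illustrative assumptions, not the paper's software laboratory.

    import random

    # Minimal sketch of a Random Boolean Network: n nodes, each reading
    # k inputs through a random Boolean lookup table. All names and
    # sizes here are illustrative assumptions.
    n, k = 5, 2
    random.seed(0)
    inputs = [random.sample(range(n), k) for _ in range(n)]
    tables = [[random.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]

    def next_value(state, i):
        """Evaluate node i's Boolean function on the current state."""
        idx = sum(state[j] << b for b, j in enumerate(inputs[i]))
        return tables[i][idx]

    def sync_step(state):
        """Classical RBN: all nodes update simultaneously."""
        return tuple(next_value(state, i) for i in range(n))

    def async_step(state, order):
        """Deterministic asynchronous scheme: nodes update one at a time
        in a fixed order."""
        s = list(state)
        for i in order:
            s[i] = next_value(tuple(s), i)
        return tuple(s)

    # A point attractor is a fixed point of sync_step; the same state is
    # then fixed under async_step for every order, independent of scheme.
    state = tuple(random.randint(0, 1) for _ in range(n))
    for _ in range(50):
        state = sync_step(state)
    if sync_step(state) == state:
        print("point attractor:", state)
        print("fixed under async too:", async_step(state, range(n)) == state)
    else:
        print("trajectory ended on a cycle, not a point attractor")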

    CodNN – Robust Neural Networks From Coded Classification

    Deep Neural Networks (DNNs) are a revolutionary force in the ongoing information revolution, and yet their intrinsic properties remain a mystery. In particular, it is widely known that DNNs are highly sensitive to noise, whether adversarial or random. This poses a fundamental challenge for hardware implementations of DNNs, and for their deployment in critical applications such as autonomous driving. In this paper we construct robust DNNs via error correcting codes. In our approach, either the data or internal layers of the DNN are coded with error correcting codes, and successful computation under noise is guaranteed. Since DNNs can be seen as a layered concatenation of classification tasks, our research begins with the core task of classifying noisy coded inputs, and progresses towards robust DNNs. We focus on binary data and linear codes. Our main result is that the prevalent parity code can guarantee robustness for a large family of DNNs, which includes the recently popularized binarized neural networks. Further, we show that the coded classification problem has a deep connection to Fourier analysis of Boolean functions. In contrast to existing solutions in the literature, our results do not rely on altering the training process of the DNN, and provide mathematically rigorous guarantees rather than experimental evidence.

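    The coding ingredient named in the abstract, the parity code, is easy to illustrate. The following is a minimal Python sketch of single-parity-check encoding of binary data and detection of one noisy bit; it shows only the standard code the paper builds on, not the CodNN construction itself, and the example vector is an illustrative assumption.

    def encode(bits):
        """Append one parity bit so every codeword has even parity."""
        return bits + [sum(bits) % 2]

    def parity_ok(codeword):
        """Overall parity stays even iff an even number of bits flipped,
        so any single noisy bit is detected."""
        return sum(codeword) % 2 == 0

    x = [1, 0, 1, 1]           # binary data fed to the classifier
    c = encode(x)              # coded input ("classifying noisy coded inputs")
    noisy = c.copy()
    noisy[2] ^= 1              # one bit of random or adversarial noise
    print(parity_ok(c), parity_ok(noisy))   # True False

    The paper's guarantee is stronger than detection: for a large family of DNNs, including binarized ones, computation on parity-coded inputs provably succeeds under such noise.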

    Entropy of complex relevant components of Boolean networks

    Boolean network models of strongly connected modules are capable of capturing the high regulatory complexity of many biological gene regulatory circuits. We study numerically the previously introduced basin entropy, a measure of the dynamical uncertainty or information storage capacity of a network, as well as the average transient time in random relevant components, as a function of their connectivity. We also demonstrate that basin entropy can be estimated from time-series data and is therefore also applicable to non-deterministic network models. Comment: 8 pages, 6 figures
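
    For a deterministic network small enough to enumerate, basin entropy has a direct computational reading: weight each attractor by the fraction of state space in its basin and take the Shannon entropy H = -Σ_a p_a log2(p_a) of those weights. A minimal Python sketch, using a made-up three-node network rather than one from the paper:

    from itertools import product
    from collections import Counter
    from math import log2

    # Enumerate all 2^n states of a small deterministic Boolean network,
    # follow each trajectory to its attractor, and compute the entropy of
    # the basin-size distribution. The three rules below are illustrative.
    rules = [
        lambda s: s[1] and s[2],   # node 0
        lambda s: s[0] or s[2],    # node 1
        lambda s: not s[0],        # node 2
    ]

    def step(s):
        return tuple(int(r(s)) for r in rules)

    def attractor_of(s):
        """Iterate until a state repeats; label the attractor by the
        lexicographically smallest state on its cycle."""
        seen = {}
        while s not in seen:
            seen[s] = len(seen)
            s = step(s)
        first = seen[s]
        return min(t for t, i in seen.items() if i >= first)

    basins = Counter(attractor_of(s) for s in product((0, 1), repeat=3))
    total = sum(basins.values())
    H = -sum(v / total * log2(v / total) for v in basins.values())
    print(dict(basins), "basin entropy:", H)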