
    Gradient Regularization Improves Accuracy of Discriminative Models

    Regularizing the gradient norm of the output of a neural network is a powerful technique that has been rediscovered several times. This paper presents evidence that gradient regularization can consistently improve classification accuracy on vision tasks with modern deep neural networks, especially when the amount of training data is small. We introduce our regularizers as members of a broader class of Jacobian-based regularizers. We demonstrate empirically, on real and synthetic data, that the learning process leads to gradients that are controlled beyond the training points and results in solutions that generalize well.
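
    The core idea can be sketched in a few lines. The following PyTorch snippet is an illustrative sketch of input-gradient-norm regularization (double backpropagation), not the paper's code; the model, the synthetic batch, and the penalty weight LAMBDA are assumptions.

```python
# Sketch of gradient-norm regularization via double backpropagation.
# Model, data, and LAMBDA are illustrative assumptions.
import torch
import torch.nn as nn

LAMBDA = 0.01  # assumed penalty weight

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 3))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(64, 10, requires_grad=True)  # synthetic batch
y = torch.randint(0, 3, (64,))

logits = model(x)
loss = criterion(logits, y)

# Gradient of the loss with respect to the *inputs*; create_graph=True
# lets the penalty itself be differentiated during the outer backward pass.
(grad_x,) = torch.autograd.grad(loss, x, create_graph=True)
penalty = grad_x.pow(2).sum(dim=1).mean()  # mean squared L2 norm per example

total = loss + LAMBDA * penalty
optimizer.zero_grad()
total.backward()
optimizer.step()
```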

    ParallelGlobal with Low Thread Interactions

    Global is an optimization algorithm conceived in the ’80s. Since then, several papers have discussed improvements to the algorithm, but adapting it to a multi-threaded execution environment is only a recent branch of development [1]. Our previous work focused on a parallel implementation on a single machine, but sometimes the use of distributed systems is inevitable. In this paper, we introduce a new version of Global that is the first step towards a fully distributed algorithm. While the proposed implementation still works on a single machine, it is easy to see how gossip-based information sharing can be built into the algorithm and utilized by it. We show that ParallelGlobal is a feasible way to implement Global on a distributed system. However, further improvements must be made to solve real-world problems with the algorithm.
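
    As a rough illustration of how gossip-based sharing of incumbent solutions could work, here is a simplified Python sketch written from the abstract's description; it is not the paper's implementation, and the objective function, the number of workers, and the exchange rule are all assumptions.

```python
# Simplified gossip-style exchange of incumbents among workers.
# Objective and parameters are illustrative assumptions.
import random

def objective(x):            # assumed test function (1-D quadratic)
    return (x - 3.0) ** 2

N_WORKERS, ROUNDS = 8, 50
best = [(float("inf"), None)] * N_WORKERS  # (value, point) per worker

for _ in range(ROUNDS):
    for i in range(N_WORKERS):
        # Each worker samples a candidate and keeps it if it improves.
        x = random.uniform(-10, 10)
        fx = objective(x)
        if fx < best[i][0]:
            best[i] = (fx, x)
        # Gossip step: exchange incumbents with one random peer so good
        # solutions spread without any global coordinator.
        j = random.randrange(N_WORKERS)
        if best[j][0] < best[i][0]:
            best[i] = best[j]
        elif best[i][0] < best[j][0]:
            best[j] = best[i]

print(min(best))  # overall incumbent after gossiping
```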

    Effects of Pooling in ParallelGlobal with Low Thread Interactions

    We discuss the first step toward a new version of Global that is intended to become a fully distributed algorithm. While the proposed implementation runs on a single machine, gossip-based information sharing can be built into the algorithm and utilized by it. ParallelGlobal shows a feasible way to implement Global on a distributed system, but further improvements must be made to solve large real-world problems with the algorithm.

    Fooling A Complete Neural Network Verifier

    The efficient and accurate characterization of the robustness of neural networks to input perturbation is an important open problem. Many approaches exist, including heuristic and exact (or complete) methods. Complete methods are expensive, but their mathematical formulation guarantees that they provide exact robustness metrics. However, this guarantee is valid only if we assume that the verified network applies arbitrary-precision arithmetic and the verifier is reliable. In practice, both the networks and the verifiers apply limited-precision floating-point arithmetic. In this paper, we show that numerical roundoff errors can be exploited to craft adversarial networks in which the actual robustness and the robustness computed by a state-of-the-art complete verifier radically differ. We also show that such adversarial networks can be used to insert a backdoor into any network in such a way that the backdoor is completely missed by the verifier. The attack is easy to detect in its naive form but, as we show, the adversarial network can be transformed to make its detection less trivial. We offer a simple defense against our particular attack based on adding a very small perturbation to the network weights. However, our conjecture is that other numerical attacks are possible, and exact verification has to take into account all the details of the computation executed by the verified networks, which makes the problem significantly harder.
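
    To make the failure mode concrete, here is a toy Python illustration (ours, not from the paper) of how limited-precision evaluation can deactivate a ReLU unit that exact arithmetic would keep active; the constants are chosen purely for illustration.

```python
# Toy demonstration of the class of roundoff effects the paper exploits:
# the same tiny "network" gives different outputs in float32 and float64.
import numpy as np

def tiny_net(x, dtype):
    # One hidden computation: relu((x + 1) - x - 0.5).
    # With exact arithmetic, (x + 1) - x == 1, so the output is relu(0.5) = 0.5.
    x = dtype(x)
    pre = (x + dtype(1.0)) - x - dtype(0.5)
    return max(pre, dtype(0.0))

x = 1e8
print(tiny_net(x, np.float64))  # 0.5: float64 has enough precision here
print(tiny_net(x, np.float32))  # 0.0: x + 1 rounds back to x, flipping the ReLU
```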

    The GLOBAL optimization algorithm: newly updated with Java implementation and parallelization

    This book explores the updated version of the GLOBAL algorithm, which contains improvements to the local search algorithm and a new Java implementation. Efficiency comparisons to earlier versions, as well as the speedup achieved by parallelization, are detailed. Examples are provided so that students, researchers, and practitioners in optimization, operations research, and mathematics can compose their own scripts with ease. A GLOBAL manual is presented in the appendix to assist new users with modules and test functions. GLOBAL is a successful stochastic multistart global optimization algorithm that has passed several computational tests and is efficient and reliable for small- to medium-dimensional global optimization problems. The algorithm uses clustering to ensure efficiency and is modular with regard to the two local search methods it starts with, but it can also easily apply other local search techniques. The strength of this algorithm lies in its reliability and its adaptive algorithm parameters. The GLOBAL algorithm is also freely available in the earlier Fortran, C, and MATLAB implementations.
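
    As a rough sketch of the multistart-plus-clustering scheme the book describes, the following Python snippet is our simplified reading of the idea; the test function, the sample sizes, the selection fraction, and the distance-based clustering rule are illustrative assumptions, not GLOBAL's actual rules.

```python
# Simplified multistart-with-clustering loop in the spirit of GLOBAL.
# Objective, bounds, sample sizes, and clustering radius are assumptions.
import numpy as np
from scipy.optimize import minimize

def f(x):                        # assumed test function (Rastrigin)
    return np.sum(x**2 - 10*np.cos(2*np.pi*x) + 10)

rng = np.random.default_rng(0)
lo, hi, dim = -5.0, 5.0, 2
seeds, minima = [], []           # cluster seeds and local minima found

for _ in range(10):              # iterations of sample / select / search
    sample = rng.uniform(lo, hi, size=(100, dim))
    vals = np.array([f(x) for x in sample])
    best = sample[np.argsort(vals)[:20]]    # keep the best 20% of points

    for x in best:
        # Clustering rule: skip points close to a known seed; they are
        # assumed to lead to an already-found local minimum.
        if any(np.linalg.norm(x - s) < 1.0 for s in seeds):
            continue
        res = minimize(f, x, method="BFGS")  # one possible local search
        seeds.extend([x, res.x])             # grow the cluster around it
        minima.append((res.fun, res.x))

print(min(minima, key=lambda t: t[0]))  # best local minimum found
```

    The clustering step is what keeps the number of local searches low: most well-placed sample points fall near an existing seed and are never searched again.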