
    The Linking Probability of Deep Spider-Web Networks

    Get PDF
    We consider crossbar switching networks with base $b$ (that is, constructed from $b \times b$ crossbar switches), scale $k$ (that is, with $b^k$ inputs, $b^k$ outputs and $b^k$ links between each consecutive pair of stages) and depth $l$ (that is, with $l$ stages). We assume that the crossbars are interconnected according to the spider-web pattern, whereby two diverging paths reconverge only after at least $k$ stages. We assume that each vertex is independently idle with probability $q$, the vacancy probability. We assume that $b \ge 2$ and the vacancy probability $q$ are fixed, and that $k$ and $l = ck$ tend to infinity with ratio a fixed constant $c > 1$. We consider the linking probability $Q$ (the probability that there exists at least one idle path between a given idle input and a given idle output). In a previous paper it was shown that if $c \le 2$, then the linking probability $Q$ tends to 0 if $0 < q < q_c$ (where $q_c = 1/b^{(c-1)/c}$ is the critical vacancy probability), and tends to $(1-\xi)^2$ (where $\xi$ is the unique solution of the equation $(1 - q(1-x))^b = x$ in the range $0 < x < 1$) if $q_c < q < 1$. In this paper we extend this result to all rational $c > 1$. This is done by using generating functions and complex-variable techniques to estimate the second moments of various random variables involved in the analysis of the networks.
    Comment: i+21 p
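
    As a concrete illustration of the quantities named in this abstract (not code from the paper), the sketch below computes the critical vacancy probability $q_c = 1/b^{(c-1)/c}$ and, for $q > q_c$, finds $\xi$ by fixed-point iteration of $x \mapsto (1 - q(1-x))^b$ starting from 0, returning the limiting value $(1-\xi)^2$. The function name, iteration scheme, and example parameters are assumptions made for illustration.

    ```python
    def linking_probability_limit(b, c, q, iters=10_000, tol=1e-12):
        """Illustrative numerics for the limit described in the abstract.

        q_c = 1 / b**((c - 1) / c) is the critical vacancy probability.
        For q > q_c, xi is the unique root of (1 - q*(1 - x))**b = x in (0, 1),
        found here by fixed-point iteration from 0; the linking probability
        then tends to (1 - xi)**2.  For q <= q_c it tends to 0.
        """
        q_c = b ** (-(c - 1) / c)
        if q <= q_c:
            return 0.0
        x = 0.0
        for _ in range(iters):
            x_new = (1.0 - q * (1.0 - x)) ** b
            if abs(x_new - x) < tol:
                break
            x = x_new
        return (1.0 - x) ** 2

    # Example with hypothetical parameters: base 2, depth ratio c = 3, vacancy 0.9
    print(linking_probability_limit(b=2, c=3.0, q=0.9))
    ```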

    Non-perturbative corrections to mean-field behavior: spherical model on spider-web graph

    Full text link
    We consider the spherical model on a spider-web graph. This graph is effectively infinite-dimensional, similar to the Bethe lattice, but has loops. We show that these lead to non-trivial corrections to the simple mean-field behavior. We first determine all normal modes of the coupled springs problem on this graph, using its large symmetry group. In the thermodynamic limit, the spectrum is a set of $\delta$-functions, and all the modes are localized. The fractional number of modes with frequency less than $\omega$ varies as $\exp(-C/\omega)$ for $\omega$ tending to zero, where $C$ is a constant. For an unbiased random walk on the vertices of this graph, this implies that the probability of return to the origin at time $t$ varies as $\exp(-C' t^{1/3})$, for large $t$, where $C'$ is a constant. For the spherical model, we show that while the critical exponents take the values expected from mean-field theory, the free energy per site at temperature $T$, near and above the critical temperature $T_c$, also has an essential singularity of the type $\exp[-K (T - T_c)^{-1/2}]$.
    Comment: substantially revised, a section added
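
    The step from the small-$\omega$ mode density to the stretched-exponential return probability quoted above follows from a standard Laplace-transform/saddle-point estimate. A schematic version (not taken from the paper, and assuming the usual identification of walk relaxation rates with $\lambda \sim \omega^2$, with $a$, $C$, $C'$ unspecified constants) is:

    ```latex
    % Schematic saddle-point estimate, assuming N(\omega) \sim e^{-C/\omega}
    % modes below frequency \omega and relaxation rates \lambda \sim \omega^2.
    \[
      P_0(t) \;\sim\; \int_0^{\infty} d\omega \,\frac{dN}{d\omega}\,
          e^{-a\,\omega^{2} t}
      \;\sim\; \exp\!\Big[-\min_{\omega>0}\Big(\tfrac{C}{\omega} + a\,\omega^{2} t\Big)\Big],
    \]
    \[
      \frac{d}{d\omega}\Big(\frac{C}{\omega} + a\,\omega^{2} t\Big) = 0
      \;\Longrightarrow\; \omega_* \propto t^{-1/3}
      \;\Longrightarrow\; P_0(t) \sim \exp\big(-C'\, t^{1/3}\big).
    \]
    ```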

    Efficiency Analysis of Swarm Intelligence and Randomization Techniques

    Full text link
    Swarm intelligence has become a powerful technique for solving design and scheduling tasks. Metaheuristic algorithms are an integral part of this paradigm, and particle swarm optimization is often viewed as an important landmark. The outstanding performance and efficiency of swarm-based algorithms has inspired many new developments, though the mathematical understanding of metaheuristics remains partly a mystery. In contrast to classic deterministic algorithms, metaheuristics such as PSO always use some form of randomness, and such randomization now employs various techniques. This paper reviews and analyzes some of the convergence and efficiency results associated with metaheuristics such as the firefly algorithm, random walks, and L\'evy flights. We discuss how these techniques are used and their implications for further research.
    Comment: 10 pages. arXiv admin note: substantial text overlap with arXiv:1212.0220, arXiv:1208.0527, arXiv:1003.146
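
    For readers unfamiliar with the randomization techniques named here, the sketch below draws L\'evy-flight step lengths with Mantegna's algorithm, one common way heavy-tailed steps are injected into swarm-based metaheuristics. This is a generic illustration rather than code from the paper; the function name, the stability index beta = 1.5, and the step scale 0.01 are arbitrary choices.

    ```python
    from math import gamma, pi, sin

    import numpy as np

    def levy_steps(n, beta=1.5, rng=None):
        """Draw n heavy-tailed step lengths via Mantegna's algorithm.

        beta is the stability index (0 < beta <= 2); beta = 1.5 is a common
        default in firefly/cuckoo-style metaheuristics.
        """
        rng = np.random.default_rng() if rng is None else rng
        sigma_u = (gamma(1 + beta) * sin(pi * beta / 2)
                   / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
        u = rng.normal(0.0, sigma_u, size=n)   # numerator: N(0, sigma_u^2)
        v = rng.normal(0.0, 1.0, size=n)       # denominator: N(0, 1)
        return u / np.abs(v) ** (1 / beta)     # steps with a ~|s|^(-1-beta) tail

    # Example: perturb a candidate solution x with a small Levy-distributed move
    x = np.zeros(5)
    x_new = x + 0.01 * levy_steps(5)
    ```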

    Master index: volumes 31–40

    Get PDF

    L\'evy walks

    Full text link
    The random walk is a fundamental concept with applications ranging from quantum physics to econometrics. Remarkably, one specific model of random walks appears to be ubiquitous across many fields as a tool to analyze transport phenomena in which the dispersal process is faster than dictated by Brownian diffusion. The L\'{e}vy walk model combines two key features: the ability to generate anomalously fast diffusion and a finite velocity of the random walker. Recent results in optics, Hamiltonian chaos, cold-atom dynamics, biophysics, and behavioral science demonstrate that this particular type of random walk provides significant insight into complex transport phenomena. This review provides a self-consistent introduction to L\'{e}vy walks, surveys their existing applications, including the latest advances, and outlines further perspectives.
    Comment: 50 pages
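
    As a minimal illustration of the model described here (not code from the review), the following sketch simulates a one-dimensional L\'{e}vy walk: flight durations are drawn from a heavy-tailed Pareto law and the walker moves at a fixed finite speed for each flight, which is exactly the combination of features the abstract highlights. Names and parameter values are illustrative assumptions.

    ```python
    import numpy as np

    def levy_walk_1d(n_flights, alpha=1.5, speed=1.0, t0=1.0, rng=None):
        """Simulate a 1D Levy walk: constant speed, heavy-tailed flight times.

        Flight durations follow a Pareto law P(tau > t) ~ (t/t0)**(-alpha),
        so the displacement during each flight is +/- speed * tau.
        Returns cumulative time and position after each flight.
        """
        rng = np.random.default_rng() if rng is None else rng
        durations = t0 * (1.0 + rng.pareto(alpha, size=n_flights))
        directions = rng.choice([-1.0, 1.0], size=n_flights)
        times = np.cumsum(durations)
        positions = np.cumsum(directions * speed * durations)
        return times, positions

    # Example: 10^4 flights; for 1 < alpha < 2 the spread grows faster than sqrt(t)
    t, x = levy_walk_1d(10_000)
    ```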

    Toward enhancement of deep learning techniques using fuzzy logic: a survey

    Get PDF
    Deep learning has recently emerged as a branch of artificial intelligence (AI) and machine learning (ML) that loosely imitates the way humans acquire particular kinds of knowledge. It is considered an essential element of data science, which comprises predictive modeling and statistics, and it makes collecting, interpreting, and analyzing big data easier and faster. Deep neural networks are a kind of ML model in which non-linear processing units are layered to extract particular features from the inputs. Training such networks is expensive and depends on the optimization method used, so optimal results are not guaranteed; deep learning techniques are also vulnerable to noisy data. For these reasons, fuzzy systems are used to improve the performance of deep learning algorithms, especially in combination with neural networks, for example by improving the representation accuracy of deep learning models. This survey reviews deep learning based fuzzy logic models and techniques proposed in previous studies, where fuzzy logic is used to improve deep learning performance. The approaches are divided into two categories based on how the two paradigms are combined. Furthermore, the practicality of these models in real-world applications is discussed.
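
    One common way of combining the two paradigms, in the spirit of the hybrids this survey covers, is to feed Gaussian fuzzy membership degrees into a neural network as features. The toy forward pass below is a generic sketch under that assumption, not a model from the survey; all names, shapes, and parameter values are hypothetical.

    ```python
    import numpy as np

    def fuzzy_memberships(x, centers, widths):
        """Gaussian membership degrees of each input feature to each fuzzy set."""
        # x: (n_samples, n_features); centers/widths: (n_features, n_sets)
        diff = x[:, :, None] - centers[None, :, :]
        return np.exp(-0.5 * (diff / widths[None, :, :]) ** 2)

    def forward(x, centers, widths, W, b):
        """Toy fuzzy-then-dense forward pass: memberships -> flatten -> dense layer."""
        mu = fuzzy_memberships(x, centers, widths)      # (n, features, sets)
        feats = mu.reshape(x.shape[0], -1)              # fuzzified features
        return np.tanh(feats @ W + b)                   # small dense layer

    # Illustrative shapes: 4 raw features, 3 fuzzy sets each ("low", "medium", "high")
    rng = np.random.default_rng(0)
    x = rng.normal(size=(8, 4))
    centers = np.tile(np.array([-1.0, 0.0, 1.0]), (4, 1))
    widths = np.full((4, 3), 0.5)
    W = rng.normal(scale=0.1, size=(12, 2))
    b = np.zeros(2)
    print(forward(x, centers, widths, W, b).shape)      # (8, 2)
    ```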