
    Finding efficient nonlinear functions by means of genetic programming

    7th International Conference, KES 2003, Proceedings, Part I, Oxford, UK, September 3-5, 2003. The design of highly nonlinear functions is relevant for a number of different applications, ranging from database hashing to message authentication. Besides being useful, however, it is quite a challenging task. In this work, we propose the use of genetic programming for finding functions that optimize a particular nonlinearity criterion, the avalanche effect, using only very efficient operations, so that the resulting functions are extremely efficient in both hardware and software. Supported by the Spanish Ministerio de Ciencia y Tecnología research project TIC2002-04498-C05-4.
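
    As a rough illustration of the fitness criterion mentioned in the abstract, the sketch below (ours, not the paper's; the candidate function and primitive set are illustrative assumptions) scores a candidate by its deviation from the ideal avalanche effect, which a genetic-programming loop could then minimize:

```python
# Illustrative sketch: estimate the avalanche effect of a candidate
# function built only from cheap operations (rotate, XOR, ADD).
# Nothing here is taken from the paper; it only shows the criterion.
import random

N_BITS = 32
MASK = (1 << N_BITS) - 1

def rotl(x, r):
    """Rotate a 32-bit word left by r positions."""
    return ((x << r) | (x >> (N_BITS - r))) & MASK

def candidate(x):
    # A hypothetical GP individual using only rotations, XOR and ADD.
    return ((rotl(x, 7) ^ x) + rotl(x, 13)) & MASK

def avalanche_deviation(f, samples=500):
    """Mean absolute deviation from ideal avalanche behaviour:
    flipping one input bit should flip about half the output bits.
    0.0 is ideal; a GP fitness function would minimize this."""
    total, trials = 0.0, 0
    for _ in range(samples):
        x = random.getrandbits(N_BITS)
        y = f(x)
        for i in range(N_BITS):
            flipped = f(x ^ (1 << i)) ^ y
            frac = bin(flipped).count("1") / N_BITS
            total += abs(frac - 0.5)
            trials += 1
    return total / trials

print(avalanche_deviation(candidate))
```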

    Weighted complex projective 2-designs from bases: optimal state determination by orthogonal measurements

    We introduce the problem of constructing weighted complex projective 2-designs from the union of a family of orthonormal bases. If the weight remains constant across elements of the same basis, then such designs can be interpreted as generalizations of complete sets of mutually unbiased bases, being equivalent whenever the design is composed of d+1 bases in dimension d. We show that, for the purpose of quantum state determination, these designs specify an optimal collection of orthogonal measurements. Using highly nonlinear functions on abelian groups, we construct explicit examples from d+2 orthonormal bases whenever d+1 is a prime power, covering dimensions d=6, 10, and 12, for example, where no complete sets of mutually unbiased bases have thus far been found. Comment: 28 pages, to appear in J. Math. Phys.
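
    As a small numerical illustration of the connection drawn in the abstract (our sketch, not the paper's construction; the odd-prime MUB family and the frame-potential test are standard facts assumed here): a complete set of d+1 mutually unbiased bases with equal weights is a weighted 2-design, which can be verified through the frame potential, sum over i,j of w_i w_j |<v_i|v_j>|^4 = 2/(d(d+1)).

```python
# Check the weighted 2-design condition for a complete MUB set.
# Everything below is an illustrative sketch, not the paper's code.
import numpy as np

def mub_bases(d):
    """Standard d+1 mutually unbiased bases for an odd prime d:
    the computational basis plus d bases with a quadratic phase
    (Ivanovic / Wootters-Fields construction)."""
    w = np.exp(2j * np.pi / d)
    bases = [np.eye(d, dtype=complex)]
    for a in range(d):
        B = np.array([[w ** (a * k**2 + b * k) for b in range(d)]
                      for k in range(d)]) / np.sqrt(d)
        bases.append(B)
    return bases

def frame_potential(vectors, weights):
    """sum_{i,j} w_i w_j |<v_i|v_j>|^4; equals 2/(d(d+1)) for a
    weighted complex projective 2-design."""
    G = np.abs(np.conj(vectors) @ vectors.T) ** 4
    return (weights @ G @ weights).real

d = 5  # an odd prime, so a complete MUB set exists
vecs = np.vstack([B.T for B in mub_bases(d)])  # rows are unit vectors
wts = np.full(len(vecs), 1.0 / (d * (d + 1)))  # equal weights per vector
print(frame_potential(vecs, wts), 2 / (d * (d + 1)))  # should agree
```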

    A Hedged Monte Carlo Approach to Real Option Pricing

    In this work we are concerned with valuing the optionality to invest in, or to delay investment in, a project when the information available to the manager comes from simulated cash-flow data under the historical (or subjective) measure in a possibly incomplete market. Our approach can also incorporate subjective views from management or market experts, as well as stochastic investment costs. It is based on the Hedged Monte Carlo strategy proposed by Potters et al (2001), where options are priced simultaneously with the determination of the corresponding hedging strategy. The approach is particularly well suited to the evaluation of commodity-related projects, for which pricing formulae are rarely available, scenario simulations are usually available only under the historical measure, and the cash flows can be highly nonlinear functions of the prices. Comment: 25 pages, 14 figures
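
    A compressed sketch of the Hedged Monte Carlo recursion referred to above (our own simplified rendering in the spirit of Potters et al (2001), not the authors' code; the polynomial basis, the parameters, and the GBM toy paths are assumptions): at each time step the option value and the hedge are fitted jointly by least-squares minimization of the local hedging risk.

```python
import numpy as np

def hedged_mc_price(S, payoff, r, dt, n_basis=8):
    """Hedged Monte Carlo sketch: backward induction on simulated
    paths S[path, time], jointly fitting the price C_t and hedge
    phi_t by regressing disc*C_{t+1} on C_t(S_t) + phi_t(S_t)*dS,
    i.e. minimizing the local hedging risk in least squares."""
    n_paths, n_steps = S.shape
    disc = np.exp(-r * dt)
    C = payoff(S[:, -1])                    # option value at maturity
    for t in range(n_steps - 2, -1, -1):
        x = S[:, t] / S[:, t].mean()        # normalized price level
        basis = np.vander(x, n_basis)       # polynomial basis for C_t
        dS = disc * S[:, t + 1] - S[:, t]   # discounted price move
        hedge = basis * dS[:, None]         # basis for phi_t * dS
        A = np.hstack([basis, hedge])
        coef, *_ = np.linalg.lstsq(A, disc * C, rcond=None)
        C = basis @ coef[:n_basis]          # fitted option value at t
    return C.mean()

# Illustrative use on GBM paths simulated under the *historical*
# measure (drift mu != r), as in the real-options setting above.
rng = np.random.default_rng(0)
n_paths, n_steps, dt = 20000, 50, 1 / 50
mu, sigma, S0, K, r = 0.10, 0.2, 100.0, 100.0, 0.05
Z = rng.standard_normal((n_paths, n_steps))
logS = np.cumsum((mu - sigma**2 / 2) * dt + sigma * np.sqrt(dt) * Z, axis=1)
S = S0 * np.exp(np.hstack([np.zeros((n_paths, 1)), logS]))
print(hedged_mc_price(S, lambda s: np.maximum(s - K, 0.0), r, dt))
```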

    A Note on Cyclic Codes from APN Functions

    Cyclic codes, as linear block error-correcting codes in coding theory, play a vital role and have wide applications. Ding in \cite{D} constructed a number of classes of cyclic codes from almost perfect nonlinear (APN) functions and planar functions over finite fields and presented ten open problems on cyclic codes from highly nonlinear functions. In this paper, we consider two open problems involving the inverse APN function $f(x)=x^{q^m-2}$ and the Dobbertin APN function $f(x)=x^{2^{4i}+2^{3i}+2^{2i}+2^{i}-1}$. From the calculation of the linear spans and minimal polynomials of two sequences generated by these two classes of APN functions, the dimensions of the corresponding cyclic codes are determined and lower bounds on the minimum weight of these cyclic codes are presented. In fact, we present a framework for computing the minimal polynomial and linear span of the sequence $s^{\infty}$ defined by $s_t=\mathrm{Tr}((1+\alpha^t)^e)$, where $\alpha$ is a primitive element of $GF(q)$. These techniques can also be applied to other open problems in \cite{D}
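
    As a toy illustration of the sequence construction in the abstract (our sketch, not the paper's method; the small field $GF(2^4)$, its primitive polynomial, and the use of Berlekamp-Massey as a numerical check are all assumptions), the following computes $s_t=\mathrm{Tr}((1+\alpha^t)^e)$ for the inverse APN exponent $e=2^m-2$ and measures the linear span of the resulting binary sequence:

```python
# Toy sketch: build GF(2^4), generate s_t = Tr((1 + alpha^t)^e) for
# the inverse APN exponent e = 2^m - 2, and measure the linear span
# of the sequence with the Berlekamp-Massey algorithm.
M = 4                # work in GF(2^4)
POLY = 0b10011       # x^4 + x + 1, primitive over GF(2)
Q = 1 << M

def gf_mul(a, b):
    """Carry-less product modulo the primitive polynomial."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & Q:
            a ^= POLY
        b >>= 1
    return r

def gf_pow(a, e):
    """Square-and-multiply exponentiation in GF(2^M)."""
    r = 1
    while e:
        if e & 1:
            r = gf_mul(r, a)
        a = gf_mul(a, a)
        e >>= 1
    return r

def trace(a):
    """Absolute trace Tr(a) = a + a^2 + ... + a^(2^(M-1)); lies in {0, 1}."""
    t, x = 0, a
    for _ in range(M):
        t ^= x
        x = gf_mul(x, x)
    return t

def berlekamp_massey(s):
    """Length of the shortest LFSR over GF(2) generating s."""
    n = len(s)
    C, B = [0] * n, [0] * n
    C[0] = B[0] = 1
    L, m = 0, -1
    for i in range(n):
        d = s[i]
        for j in range(1, L + 1):
            d ^= C[j] & s[i - j]
        if d:                      # discrepancy: C(x) += x^(i-m) B(x)
            T = C[:]
            for j in range(n - (i - m)):
                if B[j]:
                    C[j + i - m] ^= 1
            if 2 * L <= i:
                L, m, B = i + 1 - L, i, T
    return L

alpha = 0b0010       # the class of x, which is primitive for this POLY
e = Q - 2            # inverse APN exponent 2^m - 2
period = [trace(gf_pow(1 ^ gf_pow(alpha, t), e)) for t in range(Q - 1)]
print("linear span =", berlekamp_massey(period * 2))  # feed two periods
```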

    Integrated Inference and Learning of Neural Factors in Structural Support Vector Machines

    Tackling pattern recognition problems in areas such as computer vision, bioinformatics, speech, or text recognition is often done best by taking into account task-specific statistical relations between output variables. In structured prediction, this internal structure is used to predict multiple outputs simultaneously, leading to more accurate and coherent predictions. Structural support vector machines (SSVMs) are nonprobabilistic models that optimize a joint input-output function through margin-based learning. Because SSVMs generally disregard the interplay between unary and interaction factors during the training phase, the final parameters are suboptimal. Moreover, their factors are often restricted to linear combinations of input features, limiting their generalization power. To improve prediction accuracy, this paper proposes: (i) joint inference and learning by integrating back-propagation and loss-augmented inference in SSVM subgradient descent; (ii) extending SSVM factors to neural networks that form highly nonlinear functions of the input features. Image segmentation benchmark results demonstrate improvements over conventional SSVM training methods in terms of accuracy, highlighting the feasibility of end-to-end SSVM training with neural factors.
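
    A minimal sketch of the training step the abstract describes (ours, not the authors' implementation; the network sizes, Hamming loss, and chain structure are illustrative assumptions): loss-augmented inference finds the most-violating labeling, and the structured-hinge subgradient is back-propagated through the neural unary factors.

```python
# One margin-rescaled subgradient step for a chain SSVM whose unary
# factors come from a small neural network. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(0)
n_feat, n_labels, n_nodes, n_hidden, lr = 8, 3, 5, 16, 0.1
W1 = rng.normal(0, 0.1, (n_hidden, n_feat))    # unary net, hidden layer
W2 = rng.normal(0, 0.1, (n_labels, n_hidden))  # unary net, output layer
P = np.zeros((n_labels, n_labels))             # pairwise (transition) scores

def unary_scores(X):
    H = np.maximum(0.0, X @ W1.T)              # ReLU hidden activations
    return H @ W2.T, H                         # node-by-label scores

def viterbi(U, P):
    """Highest-scoring label sequence for unary U and pairwise P."""
    n, L = U.shape
    score, back = U[0].copy(), np.zeros((n, L), dtype=int)
    for t in range(1, n):
        cand = score[:, None] + P + U[t]       # prev-label x next-label
        back[t], score = cand.argmax(0), cand.max(0)
    y = [int(score.argmax())]
    for t in range(n - 1, 0, -1):
        y.append(int(back[t][y[-1]]))
    return y[::-1]

X = rng.normal(size=(n_nodes, n_feat))         # toy node features
y_true = rng.integers(0, n_labels, n_nodes)    # toy ground-truth labels

U, H = unary_scores(X)
U_aug = U + 1.0                                # add Hamming loss everywhere...
U_aug[np.arange(n_nodes), y_true] -= 1.0       # ...except at the true labels
y_hat = viterbi(U_aug, P)                      # loss-augmented inference

# Structured-hinge subgradient w.r.t. the unary scores: +1 at the
# most-violating labels, -1 at the true labels, then back-propagated
# through the unary network (back-propagation integrated with
# loss-augmented inference, as in proposal (i) above).
G = np.zeros_like(U)
G[np.arange(n_nodes), y_hat] += 1.0
G[np.arange(n_nodes), y_true] -= 1.0
dH = (G @ W2) * (H > 0.0)                      # gradient through the ReLU
W2 -= lr * (G.T @ H)                           # output-layer update
W1 -= lr * (dH.T @ X)                          # hidden-layer update
for t in range(1, n_nodes):                    # pairwise subgradient
    P[y_hat[t - 1], y_hat[t]] -= lr            # push down the violator
    P[y_true[t - 1], y_true[t]] += lr          # push up the true path
```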