
    Evolutionary dynamics in heterogeneous populations: a general framework for an arbitrary type distribution

    A general framework for evolutionary dynamics in heterogeneous populations is presented. The framework allows for continuously many types of agents, for heterogeneity both in payoff functions and in revision protocols, and for the entire joint distribution of strategies and types to influence agents' payoffs. We clarify regularity conditions for the existence and uniqueness of a solution trajectory and for the existence of equilibrium. We confirm that equilibrium stationarity in general, and equilibrium stability in potential games, extend from the homogeneous setting to the heterogeneous setting. In particular, a wide class of admissible dynamics share the same set of locally stable equilibria in a potential game through local maximization of the potential.
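    A minimal sketch of the kind of dynamic the framework covers, assuming a logit revision protocol, a two-strategy coordination-style game, and a uniform grid discretizing the continuum of types (the payoff form, noise level, and grid are illustrative assumptions, not the paper's model):

```python
import numpy as np

# Hypothetical illustration: heterogeneous logit dynamics in a two-strategy
# game, with a finite grid approximating a continuum of agent types.

n_types = 50
thetas = np.linspace(0.0, 1.0, n_types)    # type parameter shifting payoffs
weights = np.full(n_types, 1.0 / n_types)  # uniform type distribution

def payoffs(x_bar, theta):
    """Payoffs to strategies (0, 1) given aggregate share x_bar playing 1."""
    # Coordination-style game; theta biases strategy 1 (assumed form).
    return np.array([1.0 - x_bar, x_bar + theta])

def logit_choice(pi, eta=0.1):
    """Logit revision protocol with noise level eta."""
    z = np.exp(pi / eta)
    return z / z.sum()

x = np.full(n_types, 0.5)   # share of each type playing strategy 1
dt = 0.05
for _ in range(400):
    x_bar = weights @ x                       # aggregate strategy share
    for i, theta in enumerate(thetas):
        target = logit_choice(payoffs(x_bar, theta))[1]
        x[i] += dt * (target - x[i])          # logit dynamic, Euler step

print("stationary aggregate share of strategy 1:", weights @ x)
```

    Because each type's revision target depends only on the aggregate share, the loop exhibits the structure the abstract emphasizes: the joint distribution over types and strategies feeds back into every agent's payoffs.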

    Optimized normal and distance matching for heterogeneous object modeling

    This paper presents a new optimization methodology for material blending in heterogeneous object modeling that matches the material-governing features used to design a heterogeneous object. The proposed method establishes a point-to-point correspondence, represented by a set of connecting lines, between two material directrices. To blend the material features between the directrices, a heuristic optimization method is developed with the objective of maximizing the sum of the inner products of the unit normals at the end points of the connecting lines and minimizing the sum of the lengths of the connecting lines. The geometric features with material information are matched to generate non-self-intersecting and non-twisted connecting surfaces. By subdividing the connecting lines into an equal number of segments, a series of intermediate piecewise curves is generated to represent the material metamorphosis between the governing material features. Alternatively, a dynamic programming approach developed in our earlier work is presented for comparison purposes. The results and computational efficiency of the proposed heuristic method are also compared with earlier techniques in the literature. Computer interface implementation and illustrative examples are also presented in this paper.
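    A hedged sketch of the matching objective described above, assuming two directrices sampled with unit normals and a simple cyclic-shift search over correspondences (the toy curves, the weight on line length, and the search strategy are illustrative, not the paper's method):

```python
import numpy as np

# Hypothetical sketch: score a point-to-point correspondence between two
# sampled directrices by the sum of inner products of unit end-point normals
# minus a weighted sum of connecting-line lengths, then search cyclic shifts.

def correspondence_score(p1, n1, p2, n2, w_len=1.0):
    """Higher is better: aligned normals, short connecting lines."""
    normal_term = np.sum(np.einsum('ij,ij->i', n1, n2))    # row-wise dot products
    length_term = np.sum(np.linalg.norm(p1 - p2, axis=1))  # connecting-line lengths
    return normal_term - w_len * length_term

def best_cyclic_shift(p1, n1, p2, n2):
    """Try every cyclic re-indexing of the second directrix's samples."""
    m = len(p2)
    shifts = [(correspondence_score(p1, n1, np.roll(p2, s, axis=0),
                                    np.roll(n2, s, axis=0)), s)
              for s in range(m)]
    return max(shifts)  # (score, shift)

# Two closed curves sampled at 100 points each (toy concentric circles).
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
p1 = np.c_[np.cos(t), np.sin(t), np.zeros_like(t)]
n1 = np.c_[np.cos(t), np.sin(t), np.zeros_like(t)]   # outward unit normals
p2 = np.c_[1.5 * np.cos(t), 1.5 * np.sin(t), np.ones_like(t)]
n2 = n1.copy()

score, shift = best_cyclic_shift(p1, n1, p2, n2)
print(f"best shift {shift} with score {score:.2f}")
```

    Once a correspondence is fixed, subdividing each connecting line into an equal number of segments and joining corresponding subdivision points yields the intermediate piecewise curves the abstract describes.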

    SHADHO: Massively Scalable Hardware-Aware Distributed Hyperparameter Optimization

    Computer vision is experiencing an AI renaissance, in which machine learning models are expediting important breakthroughs in academic research and commercial applications. Effectively training these models, however, is not trivial, due in part to hyperparameters: user-configured values that control a model's ability to learn from data. Existing hyperparameter optimization methods are highly parallel but make no effort to balance the search across heterogeneous hardware or to prioritize searching high-impact spaces. In this paper, we introduce a framework for massively Scalable Hardware-Aware Distributed Hyperparameter Optimization (SHADHO). Our framework calculates the relative complexity of each search space and monitors performance on the learning task over all trials. These metrics are then used as heuristics to assign hyperparameters to distributed workers based on their hardware. We first demonstrate that our framework achieves double the throughput of a standard distributed hyperparameter optimization framework by optimizing an SVM on MNIST using 150 distributed workers. We then conduct model search with SHADHO over the course of one week using 74 GPUs across two compute clusters to optimize U-Net for a cell segmentation task, discovering 515 models that achieve a lower validation loss than standard U-Net.
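    A hedged sketch of the scheduling heuristic described above, assuming a toy complexity estimate and a rank-and-match assignment of search spaces to workers (all names, the priority formula, and the data structures are illustrative, not SHADHO's actual API):

```python
import random

# Hypothetical sketch: rank search spaces by estimated complexity and recent
# improvement, rank workers by hardware capability, and assign the most
# complex / most promising spaces to the fastest hardware.

search_spaces = {
    "svm_rbf":   {"dims": 2, "loss_history": [0.9, 0.6, 0.5]},
    "unet_deep": {"dims": 8, "loss_history": [0.8, 0.7, 0.65]},
    "unet_wide": {"dims": 5, "loss_history": [0.85, 0.5, 0.3]},
}
workers = [
    {"name": "gpu-node-1", "speed": 10.0},
    {"name": "gpu-node-2", "speed": 8.0},
    {"name": "cpu-node-1", "speed": 1.0},
]

def priority(space):
    """Combine search-space size with observed improvement over trials."""
    h = space["loss_history"]
    improvement = h[0] - h[-1]   # how much trials have helped so far
    return space["dims"] * (1.0 + improvement)

ranked_spaces = sorted(search_spaces,
                       key=lambda k: priority(search_spaces[k]), reverse=True)
ranked_workers = sorted(workers, key=lambda w: w["speed"], reverse=True)

for space, worker in zip(ranked_spaces, ranked_workers):
    # Sample one trial from the assigned space and dispatch it.
    trial = {d: random.random() for d in range(search_spaces[space]["dims"])}
    print(f"assign {space} -> {worker['name']}, sample trial {trial}")
```

    The key design choice the abstract highlights is that both the complexity metric and the per-task performance history feed the assignment, so slow hardware is not wasted on large, high-impact spaces.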