
    Hold your horses: temporal multiplexity and conflict moderation in the “Palio di Siena” (1743–2010)

    The paper elaborates the concept of temporal multiplexity, defined as the overlaying of ties of different duration, such as transient employment and enduring organizational ties. This concept is instrumental in resolving long-standing challenges in network research, such as capturing the interplay between different levels of analysis or time horizons. This is made possible by longitudinal network and mobility data (1743–2010) from the Palio di Siena—the famous horse race in Siena, Italy. The outcome of interest is Palio-related collective violence. The analysis shows that relationally loaded organizational ties of rivalry or friendship increase the likelihood of incidents, whereas mobility along the same lines reduces it. The results support sociological arguments that a symmetrical social space of friendship or rivalry is conducive to conflict. Mobility is a factor of moderation—by connecting employers within the actor and transferring relational content between them, it creates misalignment between the assumption of a role and the fulfillment of its expectations. Mobility relaxes the relational constraints of jockeys, reducing their compliance with bellicose demands. The uncertainty resulting from mobility may have a collective benefit that is ignored by employers: the moderation of conflict.
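    A toy sketch of the data structure the concept implies: durable organizational ties overlaid with transient employment spells, with dyads flagged where mobility runs along a loaded tie. All names, tie types, and years below are invented; this is not the paper's data or estimation method.

```python
# Toy illustration of temporal multiplexity: durable organizational ties
# (rivalry/friendship) overlaid with transient employment spells.
# All names, tie types, and years are invented for illustration.
durable_ties = {("Oca", "Torre"): "rivalry",
                ("Oca", "Selva"): "friendship"}
employment = {"Rossi": [("Oca", 1995, 1997), ("Torre", 1998, 1999)],
              "Bianchi": [("Selva", 1996, 2001)]}

def rode_for(jockey, contrada):
    """True if the jockey ever held an employment spell with this contrada."""
    return any(c == contrada for c, _, _ in employment[jockey])

# A tie is "bridged by mobility" when some jockey has ridden for both sides.
for (a, b), kind in durable_ties.items():
    movers = [j for j in employment if rode_for(j, a) and rode_for(j, b)]
    print(f"{a}-{b} ({kind}): bridged by {movers if movers else 'no one'}")
```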

    Packing of concave polyhedra with continuous rotations using nonlinear optimisation

    We study the problem of packing a given collection of arbitrary, in general concave, polyhedra into a cuboid of minimal volume. Continuous rotations and translations of the polyhedra are allowed. In addition, minimal allowable distances between polyhedra are taken into account. We derive an exact mathematical model using adjusted radical-free quasi phi-functions for concave polyhedra to describe the non-overlapping and distance constraints. The model is a nonlinear programming formulation. We develop an efficient solution algorithm, which employs a fast starting-point algorithm and a new compaction procedure. The procedure reduces our problem to a sequence of nonlinear programming subproblems of considerably smaller dimension and with a smaller number of nonlinear inequalities. The benefit of this approach is borne out by the computational results, which include comparisons with previously published instances as well as new ones.
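    A loose stand-in for the formulation, not the paper's method: the sketch below packs spheres (for which non-overlap is just a distance inequality, the simplest phi-function) into a cuboid of minimal volume with scipy's NLP solver. The paper's adjusted radical-free quasi phi-functions extend this idea to concave polyhedra under continuous rotation, which the toy omits; the radii, minimal distance d_min, and starting point are invented.

```python
import numpy as np
from scipy.optimize import minimize

# Toy compaction problem: three spheres in a cuboid of minimal volume.
radii = np.array([1.0, 0.8, 0.6])
d_min = 0.1          # minimal allowable distance between objects
n = len(radii)

def unpack(x):
    centers = x[:3 * n].reshape(n, 3)
    box = x[3 * n:]  # cuboid edge lengths (L, W, H)
    return centers, box

def volume(x):
    _, box = unpack(x)
    return box[0] * box[1] * box[2]

def feasibility(x):
    """Inequality constraints, all required to be >= 0."""
    centers, box = unpack(x)
    cons = []
    for i in range(n):
        for j in range(i + 1, n):   # pairwise non-overlap ("phi >= 0")
            cons.append(np.linalg.norm(centers[i] - centers[j])
                        - radii[i] - radii[j] - d_min)
    for i in range(n):              # containment in [0,L] x [0,W] x [0,H]
        cons.extend(centers[i] - radii[i])
        cons.extend(box - centers[i] - radii[i])
    return np.array(cons)

# Feasible starting point: spheres along a diagonal of a generous box.
centers0 = np.array([[2.0, 2.0, 2.0], [4.0, 4.0, 4.0], [6.0, 6.0, 6.0]])
x0 = np.concatenate([centers0.ravel(), [8.0, 8.0, 8.0]])
res = minimize(volume, x0, constraints={'type': 'ineq', 'fun': feasibility})
print("cuboid dimensions:", np.round(unpack(res.x)[1], 3))
```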

    Optimal clustering of a pair of irregular objects

    Cutting and packing problems arise in many fields of application and theory. When dealing with irregular objects, an important subproblem is the identification of the optimal clustering of two objects. In this paper we consider a container (a rectangle, circle, or convex polygon) of variable size and two irregular objects bounded by circular arcs and/or line segments, which can be continuously translated and rotated. In addition, minimal allowable distances between the objects, and between each object and the frontier of the container, may be imposed. The objects should be arranged within the container such that a given objective attains its minimal value. We consider as the objective a polynomial function that depends on the variable parameters associated with the objects and the container. The paper presents a universal mathematical model and a solution strategy based on the concept of phi-functions, and provides new benchmark instances of finding the containing region that has either minimal area, perimeter, or homothetic coefficient of a given container, as well as finding the convex polygonal hull (or its approximation) of a pair of objects.
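    A minimal 2D sketch of one such objective under invented inputs: place two polygons (translating and rotating the second) so that the area of their convex hull is minimal, with overlap discouraged by a penalty. The paper's phi-functions give exact non-overlap constraints for arc- and segment-bounded objects; here a shapely intersection-area penalty merely stands in for them, and the polygons, penalty weight, and starting placement are arbitrary.

```python
import numpy as np
from scipy.optimize import minimize
from shapely.geometry import Polygon
from shapely.affinity import rotate, translate
from shapely.ops import unary_union

A = Polygon([(0, 0), (3, 0), (3, 1), (0, 1)])   # fixed object
B0 = Polygon([(0, 0), (2, 0), (1, 2)])          # movable object

def objective(x):
    tx, ty, theta = x
    B = translate(rotate(B0, theta, use_radians=True), tx, ty)
    hull_area = unary_union([A, B]).convex_hull.area  # clustering objective
    overlap = A.intersection(B).area                  # non-overlap surrogate
    return hull_area + 1e3 * overlap                  # penalty method

# Nelder-Mead, since the penalized objective is not smooth.
res = minimize(objective, x0=[4.0, 0.0, 0.0], method='Nelder-Mead')
print("placement (tx, ty, theta):", np.round(res.x, 3),
      "hull area:", round(res.fun, 3))
```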

    Mark correlations: relating physical properties to spatial distributions

    Mark correlations provide a systematic approach to look at objects that are both distributed in space and bear intrinsic information, for instance on physical properties. The interplay of the objects' properties (marks) with their spatial clustering is of vivid interest for many applications: are, e.g., galaxies with high luminosities more strongly clustered than dim ones? Do neighboring pores in a sandstone have similar sizes? How does the shape of impact craters on a planet depend on the geological surface properties? In this article, we give an introduction to the appropriate mathematical framework to deal with such questions, i.e., the theory of marked point processes. After clarifying the notion of segregation effects, we define universal test quantities applicable to realizations of a marked point process. We show their power using concrete data sets in analyzing the luminosity dependence of galaxy clustering, the alignment of dark matter halos in gravitational N-body simulations, the morphology and diameter dependence of the Martian crater distribution, and the size correlations of pores in sandstone. In order to understand our data in more detail, we discuss the Boolean depletion model, the random field model, and the Cox random field model. The first model describes depletion effects in the distribution of Martian craters and pores in sandstone, whereas the last one accounts at least qualitatively for the observed luminosity dependence of galaxy clustering.
    Comment: 35 pages, 12 figures; to be published in Lecture Notes in Physics, second Wuppertal conference “Spatial statistics and statistical physics”.
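    A bare-bones version of one such test quantity, computed on synthetic data: the mark correlation function k_mm(r), the mean product of marks over pairs at distance roughly r, normalized by the squared mean mark, so that k_mm ≈ 1 signals no mark segregation. Edge corrections, which a real estimator needs, are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
pts = rng.uniform(0, 10, size=(500, 2))   # point positions in a 10 x 10 window
marks = rng.lognormal(size=500)           # e.g. luminosities or pore sizes

# All pairwise distances and mark products (upper triangle, i < j).
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
mm = marks[:, None] * marks[None, :]
iu = np.triu_indices(len(pts), k=1)
d, mm = d[iu], mm[iu]

# Bin pairs by separation and average the mark products per bin.
bins = np.linspace(0.1, 3.0, 15)
idx = np.digitize(d, bins)
kmm = np.array([mm[idx == b].mean() for b in range(1, len(bins))])
kmm /= marks.mean() ** 2                  # normalization: 1 = no segregation
print(np.round(kmm, 3))                   # ~1 in every bin for independent marks
```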

    Modeling Heterogeneous Materials via Two-Point Correlation Functions: I. Basic Principles

    Heterogeneous materials abound in nature and man-made situations. Examples include porous media, biological materials, and composite materials. Diverse and interesting properties exhibited by these materials result from their complex microstructures, which also make it difficult to model the materials. In this first part of a series of two papers, we collect the known necessary conditions on the standard two-point correlation function S2(r) and formulate a new conjecture. In particular, we argue that given a complete two-point correlation function space, S2(r) of any statistically homogeneous material can be expressed through a map on a selected set of bases of the function space. We provide new examples of realizable two-point correlation functions and suggest a set of analytical basis functions. Moreover, we devise an efficient and isotropy-preserving construction algorithm, namely, the Lattice-Point algorithm, to generate realizations of materials from their two-point correlation functions based on the Yeong-Torquato technique. Subsequent analysis can be performed on the generated images to obtain desired macroscopic properties. These developments are integrated here into a general scheme that enables one to model and categorize heterogeneous materials via two-point correlation functions.
    Comment: 37 pages, 26 figures.
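    A sketch of the central quantity on synthetic data: S2(r) for a two-phase image, computed as an FFT autocorrelation of the phase indicator under periodic boundary conditions. For an uncorrelated image, S2 equals the volume fraction at r = 0 and decays to its square at large r; Yeong-Torquato-type constructions, like the Lattice-Point algorithm named above, evolve a trial image until its S2 matches a target.

```python
import numpy as np

# Uncorrelated two-phase image with phase-1 volume fraction ~0.3.
img = (np.random.default_rng(1).random((128, 128)) < 0.3)

# Autocorrelation of the indicator function via FFT (periodic boundaries):
# corr[dy, dx] = probability that two pixels offset by (dy, dx) are both phase 1.
F = np.fft.fft2(img.astype(float))
corr = np.fft.ifft2(F * np.conj(F)).real / img.size   # S2 on the pixel grid

print("volume fraction   S2(0)      :", round(corr[0, 0], 3))    # ~0.30
print("long-range limit  S2 -> phi^2:", round(corr[0, 64], 3))   # ~0.09
```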

    Stochastic reconstruction of sandstones

    A simulated annealing algorithm is employed to generate a stochastic model for a Berea and a Fontainebleau sandstone with prescribed two-point probability function, lineal-path function, and “pore size” distribution function, respectively. We find that the temperature decrease of the annealing has to be rather quick to yield isotropic and percolating configurations. A comparison of simple morphological quantities indicates good agreement between the reconstructions and the original sandstones. Also, the mean survival time of a random walker in the pore space is reproduced with good accuracy. However, a more detailed investigation by means of local porosity theory shows that there may be significant differences in the geometrical connectivity between the reconstructed and the experimental samples.
    Comment: 12 pages, 5 figures.
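    A one-dimensional toy of the annealing loop, not the paper's 3D reconstruction: swap two pixels, recompute the two-point function, and accept or reject with the Metropolis rule under a geometric cooling schedule. The paper additionally matches the lineal-path and pore-size functions; the target medium, temperature, and cooling rate here are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

def s2(x):
    """Two-point probability function of a 1D binary medium (periodic, via FFT)."""
    f = np.fft.fft(x.astype(float))
    return np.fft.ifft(f * np.conj(f)).real / x.size

target = (rng.random(256) < 0.25)       # "measured" medium
trial = rng.permutation(target)         # random start, same volume fraction
s2_target = s2(target)

T, cool = 1e-3, 0.999                   # fast cooling, as the paper recommends
energy = np.sum((s2(trial) - s2_target) ** 2)
for _ in range(20000):
    i, j = rng.integers(0, trial.size, 2)
    trial[i], trial[j] = trial[j], trial[i]              # propose a pixel swap
    e_new = np.sum((s2(trial) - s2_target) ** 2)
    if e_new < energy or rng.random() < np.exp((energy - e_new) / T):
        energy = e_new                                    # accept (Metropolis)
    else:
        trial[i], trial[j] = trial[j], trial[i]           # reject: undo swap
    T *= cool
print("final S2 mismatch:", energy)
```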

    Optimal spatial transportation networks where link-costs are sublinear in link-capacity

    Consider designing a transportation network on n vertices in the plane, with traffic demand uniform over all source-destination pairs. Suppose the cost of a link of length ℓ and capacity c scales as ℓc^β for fixed 0 < β < 1. Under appropriate standardization, the cost of the minimum-cost Gilbert network grows essentially as n^α(β), where α(β) = 1 − β/2 on 0 < β ≤ 1/2 and α(β) = 1/2 + β/2 on 1/2 ≤ β < 1. This quantity is an upper bound in the worst case (of vertex positions), and a lower bound under mild regularity assumptions. Essentially the same bounds hold if we constrain the network to be efficient in the sense that average route length is only 1 + o(1) times average straight-line length. The transition at β = 1/2 corresponds to the dominant cost contribution changing from short links to long links. The upper bounds arise in the following type of hierarchical networks, which are therefore optimal in an order-of-magnitude sense. On the large scale, use a sparse Poisson line process to provide long-range links. On the medium scale, use hierarchical routing on the square lattice. On the small scale, link vertices directly to medium-grid points. We discuss one of many possible variant models, in which links also have a designed maximum speed s and the cost becomes ℓc^β s^γ.
    Comment: 13 pages.
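    The exponent in code, using the formula quoted above; the only addition is the arbitrary choice of sample values of β, which show the transition at β = 1/2 where the growth rate is smallest.

```python
def alpha(beta: float) -> float:
    """Cost exponent alpha(beta) from the abstract: the minimum-cost
    Gilbert network grows essentially as n**alpha(beta)."""
    assert 0 < beta < 1
    return 1 - beta / 2 if beta <= 0.5 else 0.5 + beta / 2

for b in (0.25, 0.5, 0.75):
    print(f"beta = {b}: cost ~ n^{alpha(b)}")  # n^0.875, n^0.75, n^0.875
```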

    Sampling-based Algorithms for Optimal Motion Planning

    During the last decade, sampling-based path planning algorithms, such as Probabilistic RoadMaps (PRM) and Rapidly-exploring Random Trees (RRT), have been shown to work well in practice and possess theoretical guarantees such as probabilistic completeness. However, little effort has been devoted to the formal analysis of the quality of the solution returned by such algorithms, e.g., as a function of the number of samples. The purpose of this paper is to fill this gap, by rigorously analyzing the asymptotic behavior of the cost of the solution returned by stochastic sampling-based algorithms as the number of samples increases. A number of negative results are provided, characterizing existing algorithms, e.g., showing that, under mild technical conditions, the cost of the solution returned by broadly used sampling-based algorithms converges almost surely to a non-optimal value. The main contribution of the paper is the introduction of new algorithms, namely, PRM* and RRT*, which are provably asymptotically optimal, i.e., such that the cost of the returned solution converges almost surely to the optimum. Moreover, it is shown that the computational complexity of the new algorithms is within a constant factor of that of their probabilistically complete (but not asymptotically optimal) counterparts. The analysis in this paper hinges on novel connections between stochastic sampling-based path planning algorithms and the theory of random geometric graphs.
    Comment: 76 pages, 26 figures; to appear in the International Journal of Robotics Research.
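    A minimal sketch of the RRT* rewiring idea on an obstacle-free unit square, not the paper's full algorithm: each sample is wired to the cheapest nearby node, and nearby nodes are rerouted through it when that shortens their paths. Collision checking and cost propagation to descendants are omitted, and the connection-radius constant 1.5 is an arbitrary stand-in for the paper's gamma in gamma (log n / n)^(1/d).

```python
import math, random

random.seed(0)
start = (0.1, 0.1)
nodes = [start]
parent = {start: None}
cost = {start: 0.0}

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

for n in range(1, 500):
    x = (random.random(), random.random())        # uniform sample
    r = 1.5 * (math.log(n + 1) / (n + 1)) ** 0.5  # shrinking connection radius
    near = [v for v in nodes if dist(v, x) <= r] or \
           [min(nodes, key=lambda v: dist(v, x))]
    # Choose the parent giving the cheapest path to the new sample.
    best = min(near, key=lambda v: cost[v] + dist(v, x))
    parent[x], cost[x] = best, cost[best] + dist(best, x)
    nodes.append(x)
    # Rewire: route nearby nodes through x when that is cheaper.
    for v in near:
        if cost[x] + dist(x, v) < cost[v]:
            parent[v], cost[v] = x, cost[x] + dist(x, v)

goal = min(nodes, key=lambda v: dist(v, (0.9, 0.9)))
print("cost to the node nearest the goal:", round(cost[goal], 3))
```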