
    Goldberg, Fuller, Caspar, Klug and Coxeter and a general approach to local symmetry-preserving operations

    Cubic polyhedra with icosahedral symmetry where all faces are pentagons or hexagons have been studied in chemistry and biology as well as mathematics. In chemistry one of these is buckminsterfullerene, a pure carbon cage with maximal symmetry, whereas in biology they describe the structure of spherical viruses. Parameterized operations to construct all such polyhedra were first described by Goldberg in 1937 in a mathematical context and later, in 1962, by Caspar and Klug -- who did not know about Goldberg's work -- in a biological context. In the meantime Buckminster Fuller also used subdivided icosahedral structures for the construction of his geodesic domes. In 1971 Coxeter published a survey article that refers to these constructions. Subsequently, the literature often refers to the Goldberg-Coxeter construction. This construction is actually that of Caspar and Klug. Moreover, there are essential differences between this (Caspar/Klug/Coxeter) approach and the approaches of Fuller and of Goldberg. We sketch the different approaches and generalize Goldberg's approach to a systematic one encompassing all local symmetry-preserving operations on polyhedra.
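
The constructions mentioned above are parameterized by two non-negative integers (h, k); the standard triangulation number T = h² + hk + k² then fixes the size of the resulting cubic icosahedral polyhedron. A minimal sketch using these well-known counting formulas (the formulas are standard for Goldberg polyhedra, not taken from this abstract):

```python
def goldberg_counts(h: int, k: int) -> dict:
    """Vertex/edge/face counts of the Goldberg polyhedron GP(h, k)."""
    if h < 0 or k < 0 or (h == 0 and k == 0):
        raise ValueError("need h, k >= 0 with (h, k) != (0, 0)")
    t = h * h + h * k + k * k          # triangulation number T
    return {
        "T": t,
        "vertices": 20 * t,            # carbon atoms in the fullerene C_{20T}
        "edges": 30 * t,
        "pentagons": 12,               # always exactly 12, by Euler's formula
        "hexagons": 10 * (t - 1),
    }

# GP(1, 1) is the truncated icosahedron: buckminsterfullerene C60,
# with 60 vertices, 12 pentagons and 20 hexagons.
print(goldberg_counts(1, 1))
```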

    Coactive Learning for Locally Optimal Problem Solving

    Coactive learning is an online problem-solving setting in which the solutions provided by a solver are interactively improved by a domain expert, which in turn drives learning. In this paper we extend the study of coactive learning to problems where obtaining a globally optimal or near-optimal solution may be intractable, or where an expert can only be expected to make small, local improvements to a candidate solution. The goal of learning in this new setting is to minimize the cost as measured by the expert effort over time. We first establish theoretical bounds on the average cost of the existing coactive Perceptron algorithm. In addition, we consider new online algorithms that use cost-sensitive and Passive-Aggressive (PA) updates, showing similar or improved theoretical bounds. We provide an empirical evaluation of the learners in various domains, which shows that the Perceptron-based algorithms are quite effective and that, unlike the case for online classification, the PA algorithms do not yield significant performance gains.
    Comment: AAAI 2014 paper, including appendix.
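
The basic coactive Perceptron update moves the weight vector toward the expert's locally improved solution; the cost-sensitive and PA variants studied in the paper modify the step size. A hedged sketch of the generic update (the feature vectors here are hypothetical stand-ins for joint features of a context and a solution):

```python
def perceptron_update(w, phi_solver, phi_improved, lr=1.0):
    """Coactive Perceptron step: w += lr * (phi(x, y_improved) - phi(x, y_solver)).

    w, phi_solver, phi_improved: equal-length lists of floats, where phi_*
    are joint feature vectors of (context, solution).
    """
    return [wi + lr * (pi - ps)
            for wi, ps, pi in zip(w, phi_solver, phi_improved)]

# Toy example: the expert's small local improvement scores higher on feature 0,
# so the update increases that feature's weight and leaves the other unchanged.
w = [0.0, 0.0]
w = perceptron_update(w, phi_solver=[1.0, 2.0], phi_improved=[2.0, 2.0])
print(w)  # [1.0, 0.0]
```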

    Continuous correlated beta processes

    In this paper we consider a (possibly continuous) space of Bernoulli experiments. We assume that the Bernoulli distributions are correlated. All evidence data comes in the form of successful or failed experiments at different points. Current state-of-the-art methods for expressing a distribution over a continuum of Bernoulli distributions use logistic Gaussian processes or Gaussian copula processes. However, both of these require computationally expensive matrix operations (cubic in the general case). We introduce a more intuitive approach, directly correlating beta distributions by sharing evidence between them according to a kernel function, an approach which has linear time complexity. The approach can easily be extended to multiple outcomes, giving a continuous correlated Dirichlet process, and can be used for both classification and learning the actual probabilities of the Bernoulli distributions. We show results for a number of data sets, as well as a case study where a mixture of continuous beta processes is used as part of an automated stroke rehabilitation system.
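
A minimal sketch of the kernel-based evidence sharing described above: each query point keeps Beta(alpha, beta) pseudo-counts, and each observed success or failure at location x_j contributes to point x_i with weight k(x_i, x_j), which is linear in the number of observations. The squared-exponential kernel and its length scale are assumptions for illustration, not necessarily the paper's choices:

```python
import math

def kernel(x, y, length_scale=1.0):
    # Squared-exponential similarity between two experiment locations.
    return math.exp(-((x - y) ** 2) / (2 * length_scale ** 2))

def correlated_beta(query_points, observations, prior=(1.0, 1.0)):
    """observations: list of (location, outcome), outcome 1=success, 0=failure.

    Returns one (alpha, beta) pseudo-count pair per query point, where each
    observation contributes kernel-weighted evidence.
    """
    posteriors = []
    for x in query_points:
        a, b = prior
        for loc, outcome in observations:
            w = kernel(x, loc)
            if outcome:
                a += w
            else:
                b += w
        posteriors.append((a, b))
    return posteriors

# Two successes near x=0 and one failure at x=2: the estimated success
# probability is high near 0 and low near 2.
obs = [(0.0, 1), (0.1, 1), (2.0, 0)]
for x, (a, b) in zip([0.0, 2.0], correlated_beta([0.0, 2.0], obs)):
    print(f"x={x}: posterior mean {a / (a + b):.2f}")
```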

    Local symmetry preserving operations on polyhedra


    Strategic Design of a Robust Supply Chain

    The strategic design of a robust supply chain has as its goal the configuration of the supply chain structure so that the performance of the supply chain remains of consistently high quality under all possible future scenarios. We model this goal with an objective function that trades off the central tendency of the supply chain profit against the dispersion of the profit, as measured by the standard deviation, for any value of the weights assigned to the two components. However, the standard deviation, used as the dispersion penalty for profit maximization, has a square-root expression that makes standard maximization algorithms not applicable. The focus in this article is on the development of the strategic and tactical models. The application of the methodology to an industrial case will be reported; the optimization algorithm and detailed numerical experiments will be described in future research.
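
The mean-dispersion trade-off described above can be written as maximizing E[profit] - lambda * stdev(profit) over scenarios. A small sketch with invented scenario profits and weight (the square root in the standard deviation is exactly what makes this objective awkward for standard solvers):

```python
import math

def robust_objective(scenario_profits, lam):
    """Mean profit minus lam times the (population) standard deviation."""
    n = len(scenario_profits)
    mean = sum(scenario_profits) / n
    var = sum((p - mean) ** 2 for p in scenario_profits) / n
    return mean - lam * math.sqrt(var)

# Configuration A: higher mean, wildly dispersed; B: lower mean, stable.
a = [100.0, 300.0]   # mean 200, stdev 100
b = [180.0, 200.0]   # mean 190, stdev 10
for lam in (0.0, 0.5):
    print(lam, robust_objective(a, lam), robust_objective(b, lam))
# With lam = 0, A wins; with lam = 0.5 the dispersion penalty flips the ranking.
```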

    A Framework for the Robust Design of Unit Load Storage Systems

    The unit load storage assignment problem determines the assignment of a set of unit loads with known arrival and departure times to a set of unit storage locations in a warehouse. The material handling device(s) can carry at most one unit load at a time. In this research it is assumed that each of the storage locations can be accessed directly without load relocations or rearrangements, and that the travel times between the storage locations and from and to the warehousing docks can be computed in advance. The objective is to minimize the total travel time of the material handling device for performing a number of storage and retrieval operations. This type of storage system is in widespread use and implemented in both mechanized and automated systems. It is one of the most common storage system architectures for unit loads. The formulation of this problem belongs to the class of Assignment Problems (AP), but finding the optimal solution for the most general variant is provably hard for large problem instances. A classification of the different variants of the AP for unit loads will be presented. The size of the problem instance is proportional to the product of the number of loads, the number of locations, and the number of periods in the planning horizon, and is typically very large for real-world problem instances. Efficient solution algorithms only exist for product-based storage policies or for the very special case of a perfectly balanced warehouse for load-based storage policies. However, for load-based storage policies the integrality property is not satisfied in general. This results in very large binary programming problems that to date cannot be solved to optimality. However, the formulations have special structure that can be exploited to design efficient solution algorithms. Properties and the special structure of the formulation will be presented.
    A specialized compound solution algorithm combines primal and dual approaches and heuristics to reduce the optimality gap. Initial computational experience will be shared. It is anticipated that the solution algorithm can either be directly implemented in commercial warehouse management systems or that it becomes a tool to evaluate the performance of commercially implemented storage policies. The above formulation is the subproblem in a decomposition algorithm for the design of unit load storage systems that identifies the trade-offs between efficiency and risk of the performance of the storage system. Different risk measures such as the standard deviation and the downside risk can be used. An example based on realistic data values shows that in this case operator-controlled systems are less expensive and more risky than automated systems. However, if the same level of risk is mandated then the automated system is less expensive.
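
A toy single-period flavor of the problem described above: assign each load to one location to minimize total dock travel, with relocations excluded and travel times known in advance. This is a hedged illustration only, solved by brute force on a tiny invented instance; it is not the compound primal-dual algorithm from the abstract, and real instances are far too large for enumeration:

```python
from itertools import permutations

def best_assignment(travel_time, n_loads):
    """travel_time[loc] = dock round-trip time for a location.

    Each load makes one storage trip and one retrieval trip, so a load
    assigned to a location costs 2 * travel_time[loc]. Returns the minimum
    total cost and the chosen locations (one per load).
    """
    locations = range(len(travel_time))
    best = None
    for perm in permutations(locations, n_loads):
        cost = sum(2 * travel_time[loc] for loc in perm)
        if best is None or cost < best[0]:
            best = (cost, perm)
    return best

# Two loads, four candidate locations: the two closest locations win.
cost, assignment = best_assignment([5.0, 3.0, 8.0, 2.0], n_loads=2)
print(cost, assignment)
```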

    Robust Material Handling System Design Based on The Risk Versus Cost Tradeoff

    The design and planning of major material handling systems belongs to the class of systems design problems under uncertainty. The overall structure of the system is decided during the current design stage, while the values of the future conditions and the future planning decisions are not known with certainty. Typically, the future uncertainty is modeled through a number of scenarios, and each scenario has an individual time-discounted total system cost. The overall performance of the material handling system is characterized by the distribution of these scenario costs. The central tendency of the cost distribution is almost always computed as the expected value of the distribution. Several alternatives can be used for the dispersion of the distribution, such as the standard deviation and variance. In this study the standard deviation of the cost distribution is used as the measure of the risk of the system. The goal is to identify all configurations of the material handling system that are Pareto optimal with respect to the trade-off between the expected value and the standard deviation of the costs; such Pareto-optimal configurations are also called efficient. The final selection of the material handling system for implementation can then be made based on the Pareto graph and other considerations such as the risk preferences of the system owner.
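
The Pareto screening described above can be sketched directly: a configuration is efficient if no other configuration has both a lower expected cost and a lower standard deviation. The candidate configurations and their scenario costs below are invented for illustration:

```python
import math

def mean_std(costs):
    m = sum(costs) / len(costs)
    return m, math.sqrt(sum((c - m) ** 2 for c in costs) / len(costs))

def pareto_efficient(configs):
    """configs: dict name -> list of scenario costs.

    Returns the names whose (mean, stdev) pair is not dominated by any
    other configuration (dominated = at least as good on both, strictly
    better on one).
    """
    stats = {name: mean_std(costs) for name, costs in configs.items()}
    efficient = []
    for name, (m, s) in stats.items():
        dominated = any(
            m2 <= m and s2 <= s and (m2 < m or s2 < s)
            for other, (m2, s2) in stats.items() if other != name
        )
        if not dominated:
            efficient.append(name)
    return efficient

configs = {
    "manual":    [80.0, 140.0],   # cheaper on average, but risky
    "automated": [115.0, 125.0],  # dearer, but very stable
    "hybrid":    [125.0, 165.0],  # worse on both counts: dominated
}
print(pareto_efficient(configs))
```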

    Generation of local symmetry-preserving operations on polyhedra

    We introduce a new, practical and more general definition of local symmetry-preserving operations on polyhedra. These can be applied to arbitrary embedded graphs and result in embedded graphs with the same or higher symmetry. With some additional properties we can restrict the connectivity, e.g. when we only want to consider polyhedra. Using some base structures and a list of 10 extensions, we can generate all possible local symmetry-preserving operations isomorph-free.