
    Beyond Chance-Constrained Convex Mixed-Integer Optimization: A Generalized Calafiore-Campi Algorithm and the notion of S-optimization

    The scenario approach developed by Calafiore and Campi to attack chance-constrained convex programs uses random sampling of the uncertainty parameter to replace the original problem with a representative continuous convex optimization problem with N convex constraints, which is a relaxation of the original. Calafiore and Campi provided an explicit estimate of the sample size N needed for the relaxation to yield high-likelihood feasible solutions of the chance-constrained problem: they bounded the probability that the original constraints are violated by the random optimal solution of the relaxation of size N. This paper has two main contributions. First, we present a generalization of the Calafiore-Campi results to both integer and mixed-integer variables. In fact, we demonstrate that their sampling estimates work naturally for variables restricted to some subset S of R^d. The key elements are generalizations of Helly's theorem in which the convex sets are required to intersect S ⊂ R^d. The sample sizes in both algorithms are directly determined by the S-Helly numbers. Motivated by the first half of the paper, for any subset S ⊂ R^d we introduce the notion of an S-optimization problem, in which the variables take values over S. It generalizes continuous, integer, and mixed-integer optimization. We illustrate with examples the expressive power of S-optimization to capture sophisticated combinatorial optimization problems with difficult modular constraints. We reinforce the evidence that S-optimization is "the right concept" by showing that the well-known randomized sampling algorithm of K. Clarkson for low-dimensional convex optimization problems can be extended to work with variables taking values over S.
    Comment: 16 pages, 0 figures. This paper has been revised and split into two parts; this version is the second part of the original paper. The first part is arXiv:1508.02380 (the original article contained 24 pages, 3 figures).
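The sampling relaxation described in this abstract can be sketched concretely. The following is a minimal Python illustration, not the authors' method: it replaces a chance constraint with N sampled hard constraints and solves the resulting linear program. The distribution, the dimensions, and the choice N = 200 are all hypothetical; a real application would pick N from the Calafiore-Campi bound.

```python
import numpy as np
from scipy.optimize import linprog

# Scenario approach sketch (illustrative only): replace a chance constraint
#   P( a(w)^T x <= b ) >= 1 - eps
# with N sampled hard constraints  a(w_i)^T x <= b,  i = 1..N.
# N is fixed arbitrarily here; the Calafiore-Campi bound is not computed.

rng = np.random.default_rng(0)
N = 200          # number of sampled scenarios (hypothetical choice)
d = 2            # decision dimension

# Sample uncertain constraint vectors a(w_i) ~ N([1, 1], 0.2^2 I) (assumed model).
A = rng.normal(loc=[1.0, 1.0], scale=0.2, size=(N, d))
b = np.ones(N)

# Maximize x1 + x2 (i.e., minimize -x1 - x2) subject to every sampled
# constraint and x >= 0.
res = linprog(c=[-1.0, -1.0], A_ub=A, b_ub=b, bounds=[(0, None)] * d)

# Empirically estimate the violation probability on fresh samples.
A_test = rng.normal(loc=[1.0, 1.0], scale=0.2, size=(10_000, d))
viol = np.mean(A_test @ res.x > 1.0)
print(res.x, viol)
```

The empirical violation rate on fresh scenarios is what the Calafiore-Campi estimate controls a priori through the choice of N.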

    Many is beautiful: commoditization as a source of disruptive innovation

    Thesis (S.M.M.O.T.)--Massachusetts Institute of Technology, Sloan School of Management, Management of Technology Program, 2003. Includes bibliographical references (leaves 44-45). This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
    The expression "disruptive technology" is now firmly embedded in the modern business lexicon. The mental model summarized by this concise phrase has great explanatory power for ex-post analysis of many revolutionary changes in business. Unfortunately, this paradigm can rarely be applied prescriptively. The classic formulation of a "disruptive technology" sheds little light on potential sources of innovation. This thesis seeks to extend this analysis by suggesting that many important disruptive technologies arise from commodities. The sudden availability of a high-performance factor input at a low price often enables innovation in adjacent market segments. The thesis suggests five main reasons that commodities spur innovation:
    ** The emergence of a commodity collapses competition to the single dimension of price. Sudden changes in factor prices create new opportunities for supply-driven innovation. Low prices enable innovators to substitute quantity for quality.
    ** The price/performance curve of a commodity creates an attractor that promotes demand aggregation.
    ** Commodities emerge after the establishment of a dominant design. Commodities have defined and stable interfaces. Well-developed tool sets and experienced developer communities are available to work with commodities, decreasing the price of experimentation.
    ** Distributed architectures based on large numbers of simple, redundant components offer more predictable performance. Systems based on a small number of high-performance components will have a higher standard deviation for uptime than high-granularity systems based on large numbers of low-power components.
    ** Distributed architectures are much more flexible than low-granularity systems. Large integrated facilities often provide cost advantages when operating at the Minimum Efficient Scale of production. However, distributed architectures that can efficiently change production levels over time may be a superior solution based on the ability to adapt to changing market demand patterns.
    The evolution of third-generation bus architectures in personal computers provides a comprehensive example of commodity-based disruption, incorporating all five forces.
    by Richard Ellert Willey. S.M.M.O.T.
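The uptime claim about redundant components has a simple quantitative core: if capacity is spread over n independent components, each up with probability p, the available-capacity fraction follows Binomial(n, p)/n, whose standard deviation sqrt(p(1-p)/n) shrinks as n grows. A small sketch with hypothetical numbers:

```python
import math

# Std. dev. of the available-capacity fraction for a system of n
# independent components, each up with probability p (Binomial(n, p)/n).
def capacity_stddev(n, p=0.99):
    return math.sqrt(p * (1 - p) / n)

coarse = capacity_stddev(4)    # a few large components
fine = capacity_stddev(400)    # many small components
print(coarse, fine)            # the fine-grained system varies far less
```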

    Quality of service over ATM networks

    PhD thesis. Abstract not available.

    QoS routing for MPLS networks employing mobile agents


    Architecture and Information Requirements to Assess and Predict Flight Safety Risks During Highly Autonomous Urban Flight Operations

    As aviation adopts new and increasingly complex operational paradigms, vehicle types, and technologies to broaden airspace capability and efficiency, maintaining a safe system will require recognition and timely mitigation of new safety issues as they emerge and before significant consequences occur. A shift toward a more predictive risk-mitigation capability becomes critical to meet this challenge. In-time safety assurance comprises monitoring, assessment, and mitigation functions that proactively reduce risk in complex operational environments where the interplay of hazards may not be known (and therefore not accounted for) during design. These functions can also help to understand and predict emergent effects caused by the increased use of automation or autonomous functions that may exhibit unexpected non-deterministic behaviors. The envisioned monitoring and assessment functions can look for precursors, anomalies, and trends (PATs) by applying model-based and data-driven methods. Outputs would then drive downstream mitigation(s) if needed to reduce risk. These mitigations may be accomplished using traditional design-revision processes or via operational (and sometimes automated) mechanisms; the latter is the in-time aspect of the system concept. This report comprises architecture and information requirements and considerations toward enabling such a capability within the domain of low-altitude, highly autonomous urban flight operations. This domain may span, for example, public-use surveillance missions flown by small unmanned aircraft (e.g., infrastructure inspection, facility management, emergency response, law enforcement, and/or security) to transportation missions flown by larger aircraft that may carry passengers or deliver products. Caveat: Any stated requirements in this report should be considered initial requirements that are intended to drive research and development (R&D). These initial requirements are likely to evolve based on R&D findings, refinement of operational concepts, industry advances, and new industry or regulatory policies or standards related to safety assurance.
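As a toy illustration of the monitoring function this abstract describes (a hypothetical design, far simpler than the model-based and data-driven methods the report envisions), a rolling-baseline detector can flag a telemetry sample that deviates from recent history by more than k standard deviations, which would then trigger downstream assessment and mitigation:

```python
from collections import deque

# Toy sketch of an in-time monitoring function (hypothetical design):
# flag a sample as anomalous when it deviates from a rolling baseline
# by more than k standard deviations.
class RollingAnomalyMonitor:
    def __init__(self, window=20, k=3.0):
        self.window = deque(maxlen=window)
        self.k = k

    def observe(self, x):
        """Return True if x is anomalous relative to the rolling window."""
        if len(self.window) >= 5:  # need a minimal baseline first
            mean = sum(self.window) / len(self.window)
            var = sum((v - mean) ** 2 for v in self.window) / len(self.window)
            std = var ** 0.5
            if std > 0 and abs(x - mean) > self.k * std:
                return True        # anomaly: would drive mitigation
        self.window.append(x)      # normal sample extends the baseline
        return False

monitor = RollingAnomalyMonitor()
stream = [10.0, 10.1, 9.9, 10.0, 10.2, 10.1, 9.8, 10.0, 25.0, 10.1]
flags = [monitor.observe(x) for x in stream]
print(flags)                       # only the 25.0 sample is flagged
```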

    A New Model for Location-Allocation Problem within Queuing Framework

    This paper proposes a bi-objective model for the facility location problem under a congestion system. The model is motivated by applications such as locating servers for bank automated teller machines (ATMs), communication networks, and so on. It is specifically suited to situations in which fixed service facilities are congested by stochastic demand within a queueing framework. We formulate the model from two perspectives simultaneously: (i) the customers and (ii) the service provider. The objectives are to minimize (i) the total expected traveling and waiting time and (ii) the average facility idle time. The model is a mixed-integer nonlinear programming problem, which belongs to the class of NP-hard problems. To solve it, two metaheuristic algorithms are proposed: the non-dominated sorting genetic algorithm (NSGA-II) and the non-dominated ranking genetic algorithm (NRGA). Finally, to evaluate the performance of the two algorithms, some numerical examples are produced and analyzed with several metrics to determine which algorithm works better.
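The queueing side of such a model can be illustrated with a small sketch (not the paper's formulation; all numbers and names are hypothetical). Treating each open facility as an M/M/1 queue, the expected time in system is W = 1/(mu - lambda) and the idle probability is 1 - lambda/mu, which yields the two objective values for one candidate assignment:

```python
# Evaluate both objectives for one candidate location-allocation
# (hypothetical data). Each open facility is an M/M/1 queue.
def evaluate(assignment, travel_time, demand_rate, service_rate):
    """assignment[i] = index of the facility serving customer zone i."""
    # Aggregate the arrival rate at each open facility.
    arrivals = {}
    for zone, fac in enumerate(assignment):
        arrivals[fac] = arrivals.get(fac, 0.0) + demand_rate[zone]

    total_time = 0.0
    for zone, fac in enumerate(assignment):
        lam, mu = arrivals[fac], service_rate[fac]
        assert lam < mu, "facility would be unstable"
        wait = 1.0 / (mu - lam)  # M/M/1 expected time in system
        total_time += demand_rate[zone] * (travel_time[zone][fac] + wait)

    # Objective 2: average idle probability across open facilities.
    idle = sum(1.0 - arrivals[f] / service_rate[f] for f in arrivals) / len(arrivals)
    return total_time, idle

travel = [[1.0, 3.0], [2.0, 1.0], [3.0, 1.5]]   # zone -> facility times
obj1, obj2 = evaluate([0, 1, 1], travel,
                      demand_rate=[2.0, 1.0, 1.5],
                      service_rate=[5.0, 6.0])
print(obj1, obj2)
```

A metaheuristic such as NSGA-II would search over assignments (and which facilities to open), using a pair of such objective values for non-dominated sorting.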