1,795 research outputs found

    An Optimal Control Theory for the Traveling Salesman Problem and Its Variants

    We show that the traveling salesman problem (TSP) and its many variants may be modeled as functional optimization problems over a graph. In this formulation, all vertices and arcs of the graph are functionals, i.e., mappings from a space of measurable functions to the field of real numbers. Many variants of the TSP, such as those with neighborhoods, with forbidden neighborhoods, with time windows and with profits, can all be framed under this construct. In sharp contrast to their discrete-optimization counterparts, the modeling constructs presented in this paper represent a fundamentally new domain of analysis and computation for TSPs and their variants. Beyond its apparent mathematical unification of a class of problems in graph theory, the main advantage of the new approach is that it facilitates the modeling of certain application-specific problems in their home space of measurable functions. Consequently, certain elements of economic system theory, such as dynamical models and continuous-time cost/profit functionals, can be directly incorporated in the new optimization problem formulation. Furthermore, subtour elimination constraints, prevalent in discrete optimization formulations, are naturally enforced through continuity requirements. The price for the new modeling framework is nonsmooth functionals. Although a number of theoretical issues remain open in the proposed mathematical framework, we demonstrate the computational viability of the new modeling constructs over a sample set of problems to illustrate the rapid production of end-to-end TSP solutions to extensively constrained practical problems. Comment: 24 pages, 8 figures

    Asymmetry measures for QSOs and companions

    An asymmetry index is derived from ellipse fitting to galaxy images; it gives weight to faint outer features and is not strongly redshift-dependent. These measures are made on a sample of 13 2MASS QSOs and their neighbour galaxies, and on a control sample of field galaxies from the same wide-field imaging data. The QSO host galaxy asymmetries correlate well with previously published visual tidal interaction indices. The companion galaxies have somewhat higher asymmetry than the control galaxy sample, and their asymmetry is inversely correlated with distance from the QSO. The distribution of QSO-companion asymmetry indices differs from that of matched control field galaxies at the ~95% significance level. We present the data and discuss this evidence for tidal and other disturbances in the vicinity of QSOs. Comment: 13 pages, 2 tables, 4 figures; to appear in A
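    To illustrate the kind of measure involved: the paper derives its index from ellipse fitting, but a common, simpler stand-in is the rotate-by-180° asymmetry, which likewise penalizes departures from point symmetry. A minimal sketch on synthetic images (all data here is invented for illustration):

    ```python
    import numpy as np

    def rotational_asymmetry(img):
        # A = sum|I - I_180| / (2 * sum|I|): zero for a point-symmetric image,
        # larger when flux is distributed asymmetrically (e.g. tidal features).
        rot = np.rot90(img, 2)  # rotate the image by 180 degrees
        return np.abs(img - rot).sum() / (2.0 * np.abs(img).sum())

    sym = np.outer(np.hanning(64), np.hanning(64))   # smooth, symmetric "galaxy"
    asym = sym.copy()
    asym[:20, :20] += 0.3                            # off-center "tidal" feature

    a0 = rotational_asymmetry(sym)
    a1 = rotational_asymmetry(asym)
    print(a0, a1)
    ```

    The symmetric model scores essentially zero, while the disturbed one scores well above it, which is the qualitative behaviour the QSO-companion comparison relies on.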

    PCV102 Drug Use Among Seniors On Public Drug Programs In Canada, 2012


    Fast Mesh Refinement in Pseudospectral Optimal Control

    Mesh refinement in pseudospectral (PS) optimal control is embarrassingly easy: simply increase the order N of the Lagrange interpolating polynomial, and the mathematics of convergence automates the distribution of the grid points. Unfortunately, as N increases, the condition number of the resulting linear algebra increases as N^2; hence, spectral efficiency and accuracy are lost in practice. In this paper, we advance Birkhoff interpolation concepts over an arbitrary grid to generate well-conditioned PS optimal control discretizations. We show that the condition number increases only as sqrt(N) in general, and is independent of N for the special case in which one of the boundary points is fixed. Hence, spectral accuracy and efficiency are maintained as N increases. The effectiveness of the resulting fast mesh refinement strategy is demonstrated by using polynomials of over a thousandth order to solve a low-thrust, long-duration orbit transfer problem. Comment: 27 pages, 12 figures, JGCD April 201
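    The N^2 conditioning growth that motivates the paper is easy to reproduce. The sketch below builds the standard Chebyshev-Gauss-Lobatto differentiation matrix (the classical PS construction, not the paper's Birkhoff matrices) and prints its condition number as the grid is refined:

    ```python
    import numpy as np

    def cheb(N):
        # Chebyshev-Gauss-Lobatto differentiation matrix of order N
        # (classical construction; see Trefethen, "Spectral Methods in MATLAB").
        x = np.cos(np.pi * np.arange(N + 1) / N)            # CGL points on [-1, 1]
        c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
        dX = x[:, None] - x[None, :]
        D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))     # off-diagonal entries
        D -= np.diag(D.sum(axis=1))                         # negative-sum trick for diagonal
        return D, x

    conds = {}
    for N in (8, 16, 32, 64):
        D, _ = cheb(N)
        # Drop the first row/column (one fixed boundary point) so D is invertible.
        conds[N] = np.linalg.cond(D[1:, 1:])
        print(N, conds[N])
    ```

    The printed condition numbers grow roughly quadratically in N, which is exactly the ill-conditioning the Birkhoff discretization is designed to avoid.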

    A Two-Method Comparison of Muscle Testing the Serratus Anterior: Daniels and Worthingham vs. Kendall and McCreary

    The purpose of this study was to compare the amount of force produced by the left serratus anterior muscle under two methods of muscle testing. Thirty subjects (5 men, 25 women) participated. A manual muscle test was performed with each subject properly positioned for testing a good-to-normal muscle grade using the Daniels and Worthingham and the Kendall and McCreary methods of muscle testing. A practice test of each method was performed, and a rest period of one and a half minutes was allowed between tests. A hand-held dynamometer, the Dynatron II, recorded the objective data; strength was recorded in pounds of force. The results reveal a significant difference in the force produced by the left serratus anterior muscle between the two methods. The Daniels and Worthingham method produced the larger force, with a mean value of 41.37 pounds; the mean force produced with the Kendall and McCreary method was 27.39 pounds. The difference is statistically significant at the .0001 criterion level. There is, however, a strong positive correlation between the two methods of muscle testing.
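    The analysis described amounts to a paired t-test plus a Pearson correlation between the two methods. A minimal NumPy sketch on hypothetical data (the means are taken from the abstract; the spreads and the correlation structure are invented for illustration, not the study's actual measurements):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 30  # thirty subjects, as in the study

    # Hypothetical forces in pounds, loosely matched to the reported means
    # (41.37 lb for Daniels & Worthingham, 27.39 lb for Kendall & McCreary).
    dw = rng.normal(41.37, 8.0, n)
    km = dw - 13.98 + rng.normal(0.0, 4.0, n)  # correlated with dw, ~14 lb lower

    d = dw - km
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))  # paired t statistic, df = n - 1
    r = np.corrcoef(dw, km)[0, 1]                # Pearson correlation between methods
    print(f"t = {t:.1f}, r = {r:.2f}")
    ```

    A large t with a strong positive r is the pattern the abstract reports: the two methods disagree systematically on magnitude yet rank subjects similarly.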

    Fusions et économies de dimension sur le marché des assurances générales au Québec (Mergers and economies of scale in the Quebec general insurance market)

    Cross-section regressions are used to evaluate the extent of economies of scale in the property-liability insurance industry prior to 1978 and to explain the increased number of mergers between 1978 and 1981 in the Quebec market. The second and third sections of the paper discuss the appropriate approach to the problem of measuring economies of scale in the insurance industry. The fourth section describes the data and the empirical results.

    Mechanisms of short-term false memory formation (Kısa süreli sahte bellek formasyonlarının mekanizmaları)

    False memories are the erroneous recollection of events that did not actually occur. They have been investigated broadly within the domain of long-term memory, whereas studies involving short-term memory are less common and provide a far less detailed picture of the phenomenon. We tested participants in a short-term memory task involving lists of four semantically related words that had to be matched against a probe word. Crucially, the probe could be one of the four words in the list, semantically related to them, or semantically unrelated to the list. Participants had to decide whether the probe was in the list. To this task we added articulatory suppression to impair rehearsal, concurrent material to remember, and changes to the visual appearance of the probes, in order to assess the mechanisms involved in short-term memory retrieval. The results showed that, as in studies of long-term memory, false memories emerged more frequently for probes semantically related to the list and when rehearsal was impaired by concurrent material. The visual appearance of the stimuli did not play an important role. This set of results suggests that deep semantic processing, rather than only superficial visual processing, takes place within a few seconds of the presentation of the probes.

    Experimental Implementation of Riemann–Stieltjes Optimal Control for Agile Imaging Satellites

    The article of record as published may be found at http://dx.doi.org/10.2514/1.G00132

    Implementations of the Universal Birkhoff Theory for Fast Trajectory Optimization

    This is Part II of a two-part paper. Part I presented a universal Birkhoff theory for fast and accurate trajectory optimization. The theory rested on two main hypotheses. In this paper, it is shown that if the computational grid is selected from any one of the Legendre and Chebyshev families of node points, be it Lobatto, Radau or Gauss, then the resulting collection of trajectory optimization methods satisfies the hypotheses required for the universal Birkhoff theory to hold. All of these grid points can be generated at O(1) computational speed. Furthermore, all Birkhoff-generated solutions can be tested for optimality by a joint application of Pontryagin's Principle and the Covector Mapping Principle, the latter of which was developed in Part I. More importantly, the optimality checks can be performed without resorting to an indirect method, or even explicitly producing the full differential-algebraic boundary value problem that results from an application of Pontryagin's Principle. Numerical problems are solved to illustrate all these ideas. The examples are chosen to highlight three practically useful features of Birkhoff methods: (1) bang-bang optimal controls can be produced without suffering any Gibbs phenomenon, (2) discontinuous and even Dirac delta covector trajectories can be well approximated, and (3) extremal solutions over dense grids can be computed in a stable and efficient manner.