
    Graph Morphing via Orthogonal Box Drawings

    Abstract: A graph is a set of vertices, with some pairwise connections given by a set of edges. A graph drawing, such as a node-link diagram, visualizes a graph with geometric features. Among the most common forms of graph drawings are straight-line point drawings, which represent each vertex with a point and each edge with a line segment connecting the points of its endpoints, and poly-line point drawings, which more generally allow edges to be represented by poly-lines. Of particular interest to this work are planar straight-line drawings and planar poly-line drawings, in which no two vertices share a location and no two edges cross (except at shared endpoints). We study the morphing problem for planar drawings: given two planar drawings of the same graph, can we output a continuous transformation (a “morph”) from one to the other, such that each intermediate drawing is also a planar drawing? It is quite easy to test whether a morph exists, but the test is non-constructive. We are interested in the problem of constructing morphs with simple representations. Specifically, we study sequences of linear morphs, which represent the overall morph with a sequence of drawings such that each pair of adjacent drawings in the sequence can be linearly interpolated. Each drawing in the sequence is called an “explicit” intermediate drawing, since it is given explicitly in the output. Previous work has shown that a pair of straight-line drawings of an n-vertex graph can be morphed using O(n) linear morphs, so that every explicit intermediate drawing is a straight-line drawing. We show that an additional constraint can be added at the cost of a small tradeoff: we further restrict the explicit intermediate drawings to lie on an O(n)×O(n) grid, while allowing them to be poly-line drawings with O(1) bends per edge. Additionally, we give an algorithm that computes this sequence in O(n^2) time, which is known to be tight. Our methods involve morphing another class of drawings, orthogonal box drawings, which represent each vertex with an axis-aligned rectangle and each edge with an orthogonal poly-line. Our methods for morphing orthogonal box drawings make use of methods known for morphing orthogonal point drawings, which are poly-line drawings that restrict each poly-line to use only axis-aligned line segments.
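
    A minimal sketch of the interpolation step that defines a linear morph, assuming drawings are given as dictionaries mapping vertices to points (a hypothetical data layout, not the paper's algorithm; planarity of the intermediate drawings is the paper's guarantee and is not checked here):

        # Sketch: playing back a sequence of linear morphs between drawings.
        # A drawing maps each vertex to an (x, y) point; edges are implicit.

        def lerp_drawing(d0, d1, t):
            """Linearly interpolate two drawings of the same graph, t in [0, 1]."""
            return {v: ((1 - t) * d0[v][0] + t * d1[v][0],
                        (1 - t) * d0[v][1] + t * d1[v][1]) for v in d0}

        def play_morph(explicit_drawings, steps_per_morph=10):
            """Sample drawings along a sequence of linear morphs."""
            for d0, d1 in zip(explicit_drawings, explicit_drawings[1:]):
                for s in range(steps_per_morph):
                    yield lerp_drawing(d0, d1, s / steps_per_morph)
            yield explicit_drawings[-1]

        # Two explicit intermediate drawings of a path on three vertices.
        D0 = {"a": (0.0, 0.0), "b": (1.0, 0.0), "c": (2.0, 0.0)}
        D1 = {"a": (0.0, 0.0), "b": (1.0, 1.0), "c": (2.0, 0.0)}
        for drawing in play_morph([D0, D1], steps_per_morph=2):
            print(drawing)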

    Computing Low-Cost Convex Partitions for Planar Point Sets with Randomized Local Search and Constraint Programming (CG Challenge)

    The Minimum Convex Partition problem (MCP) asks for a planar subdivision on a given point set, using the points as vertices, that minimizes the number of edges. In this subdivision, the outer face is the convex hull of the point set and all interior faces are convex. In this paper, we discuss and implement an approach to this problem using randomized local search, together with different initialization techniques based on maximizing collinearity. We also solve small instances optimally using a SAT formulation. We explored this as part of the 2020 Computational Geometry Challenge, where we placed first as Team UBC.
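
    A minimal sketch of the randomized-local-search loop, assuming a feasibility predicate that tests whether deleting an edge keeps every face convex (the predicate below is a stand-in, and the names and data layout are illustrative, not Team UBC's code):

        import random

        def local_search(edges, deletion_keeps_convex, rounds=10_000, seed=0):
            """Greedily shrink an edge set while the feasibility predicate holds."""
            rng = random.Random(seed)
            edges = set(edges)
            for _ in range(rounds):
                if not edges:
                    break
                e = rng.choice(sorted(edges))        # random candidate edge
                if deletion_keeps_convex(edges, e):  # would all faces stay convex?
                    edges.remove(e)                  # keep the improving move
            return edges

        # Stand-in predicate so the sketch runs; the real test merges the two
        # faces incident to e and checks that their union is convex.
        def never_ok(edges, e):
            return False

        print(len(local_search({(0, 1), (1, 2), (2, 0)}, never_ok)))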

    Conflict Optimization for Binary CSP Applied to Minimum Partition into Plane Subgraphs and Graph Coloring

    CG:SHOP is an annual geometric optimization challenge; the 2022 edition posed the problem of coloring a certain geometric graph defined by line segments. Surprisingly, the top three teams used the same technique, called conflict optimization. This technique was introduced in the 2021 edition of the challenge to solve a coordinated motion planning problem. In this paper, we present the technique in the more general framework of binary constraint satisfaction problems (binary CSP). Then, the top three teams describe their different implementations of the same underlying strategy. We evaluate the performance of these implementations on vertex coloring not only geometric graphs but also other types of graphs. To appear in the ACM Journal of Experimental Algorithmics.
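
    A generic reconstruction of the conflict optimization strategy for k-coloring, in the spirit sketched above (not any team's exact implementation): each vertex takes the color minimizing the summed weights of conflicting neighbors, which are then uncolored, re-queued, and made heavier.

        from collections import deque

        def conflict_color(adj, k, max_steps=100_000):
            color = {v: None for v in adj}
            weight = {v: 1 for v in adj}
            queue = deque(adj)
            steps = 0
            while queue and steps < max_steps:
                steps += 1
                v = queue.popleft()
                # Cost of a color = total weight of neighbors already using it.
                costs = [sum(weight[u] for u in adj[v] if color[u] == c)
                         for c in range(k)]
                c = min(range(k), key=costs.__getitem__)
                for u in adj[v]:          # evict conflicting neighbors
                    if color[u] == c:
                        color[u] = None
                        weight[u] += 1    # repeat conflicts get more expensive
                        queue.append(u)
                color[v] = c
            return color if all(c is not None for c in color.values()) else None

        # A triangle needs three colors; with k = 3 a proper coloring is found.
        print(conflict_color({0: [1, 2], 1: [0, 2], 2: [0, 1]}, k=3))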

    Maize Genomes to Fields: 2014 and 2015 field season genotype, phenotype, environment, and inbred ear image datasets

    Objectives: Crop improvement relies on analysis of phenotypic, genotypic, and environmental data. Given large, well-integrated, multi-year datasets, diverse queries can be made: Which lines perform best in hot, dry environments? Which alleles of specific genes are required for optimal performance in each environment? Such datasets can also be leveraged to predict cultivar performance, even in uncharacterized environments. The maize Genomes to Fields (G2F) Initiative is a multi-institutional organization of scientists working to generate and analyze such datasets from existing, publicly available inbred lines and hybrids. G2F’s genotype-by-environment project has released its 2014 and 2015 datasets to the public, with the 2016 and 2017 datasets collected and soon to be made available. Data description: Datasets include DNA sequences; traditional phenotype descriptions, as well as detailed ear, cob, and kernel phenotypes quantified by image analysis; weather station measurements; and soil characterizations by site. Data are released as comma-separated value spreadsheets accompanied by extensive README text descriptions. For genotypic and phenotypic data, both raw data and a version with outliers removed are reported. For weather data, two versions are reported: a full dataset calibrated against nearby National Weather Service sites and a second calibrated set with outliers and apparent artifacts removed.
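
    A brief sketch of querying such a release with pandas; the file and column names below are hypothetical stand-ins (the actual names are documented in the release READMEs):

        import pandas as pd

        # Load the outlier-filtered phenotype table and the calibrated weather
        # table; both are plain comma-separated files in the release.
        pheno = pd.read_csv("g2f_2014_phenotypes_clean.csv")
        weather = pd.read_csv("g2f_2014_weather_calibrated.csv")

        # Example query in the spirit of the abstract: mean yield per site,
        # joined with mean temperature, to see which lines do well where.
        by_site = pheno.groupby("Field-Location")["Grain Yield"].mean()
        temp = weather.groupby("Field-Location")["Temperature"].mean()
        print(by_site.to_frame().join(temp).sort_values("Temperature"))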

    Conditions for RE Deployment and Energy Development

    Irrespective of technical abundance, RE potential per se does not imply a structural and inclusive expansion of energy access or overall sustainable energy development in EA. Proper technological, economic, institutional, and policy considerations must be made to assess the best ways and most apt policies to sustain the exploitation of this potential in the regional context, in relation to other energy sources, and to identify the roadblocks and challenges faced. A first meaningful consideration in this sense is that EA is characterised by a strong rural-urban imbalance: the majority of the population lives in poorly interconnected rural communities away from the electricity grid, which serves predominantly densely populated urban centres. While plans to tackle the imbalance are in place in virtually every country (both Kenya and South Africa have achieved notable results in this sense), the issue is not going to be structurally overcome rapidly. Thus, as highlighted by the least-cost electrification scenarios in Chap. 4, when discussing the case for renewables to increase and improve access, a distinction must be made between national grid expansion to reach additional shares of the population and specific decentralised solutions.

    Risk Portfolio Optimization Using the Markowitz MVO Model in Relation to Human Limitations in Predicting the Future, from the Perspective of the Al-Qur'an

    Risk portfolio management in modern finance has become increasingly technical, requiring the use of sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, given the human inability to predict the future precisely, as written in the Al-Qur'an, surah Luqman, verse 34, they have to manage risk to yield an optimal portfolio. The objective is to minimize the variance among all portfolios that attain at least a certain expected return or, alternatively, to maximize the expected return subject to a bound on the variance. Furthermore, this study focuses on optimizing the risk portfolio via Markowitz MVO (Mean-Variance Optimization). The theoretical frameworks for the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Finding a minimum-variance portfolio leads to a convex quadratic program: minimize the objective function x^T ÎŁ x subject to the constraints ÎŒ^T x ≄ r and Ax = b. The outcome of this research is the optimal risk portfolio for a set of investments, computed using MATLAB R2007b together with graphical analysis.
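
    A minimal sketch of this quadratic program in SciPy rather than MATLAB (the returns, covariance, and target r below are made-up illustrative numbers; a long-only budget constraint is assumed):

        import numpy as np
        from scipy.optimize import minimize

        mu = np.array([0.10, 0.07, 0.03])        # expected returns
        Sigma = np.array([[0.09, 0.01, 0.00],    # covariance matrix
                          [0.01, 0.04, 0.00],
                          [0.00, 0.00, 0.01]])
        r = 0.06                                 # required expected return

        # Minimize x^T Sigma x subject to mu^T x >= r, sum(x) = 1, x >= 0.
        res = minimize(
            lambda x: x @ Sigma @ x,
            x0=np.full(3, 1 / 3),
            constraints=[{"type": "ineq", "fun": lambda x: mu @ x - r},
                         {"type": "eq", "fun": lambda x: x.sum() - 1}],
            bounds=[(0, 1)] * 3,
            method="SLSQP",
        )
        print(res.x.round(3), "variance:", round(res.fun, 4))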

    Search for Physics beyond the Standard Model in Events with Overlapping Photons and Jets

    Results are reported from a search for new particles that decay into a photon and two gluons, in events with jets. Novel jet substructure techniques are developed that allow photons to be identified in an environment densely populated with hadrons. The analyzed proton-proton collision data were collected by the CMS experiment at the LHC in 2016, at √s = 13 TeV, and correspond to an integrated luminosity of 35.9 fb^-1. The spectra of total transverse hadronic energy of candidate events are examined for deviations from the standard model predictions. No statistically significant excess is observed over the expected background. The first cross section limits on new physics processes resulting in such events are set. The results are interpreted as upper limits on the rate of gluino pair production, utilizing a simplified stealth supersymmetry model. The excluded gluino masses extend up to 1.7 TeV for a neutralino mass of 200 GeV, and exceed previous mass constraints set by analyses targeting events with isolated photons.

    Search for supersymmetry in events with one lepton and multiple jets in proton-proton collisions at √s = 13 TeV


    Measurement of the top quark forward-backward production asymmetry and the anomalous chromoelectric and chromomagnetic moments in pp collisions at √s = 13 TeV

    Abstract: The parton-level top quark (t) forward-backward asymmetry and the anomalous chromoelectric (d̂_t) and chromomagnetic (Ό̂_t) moments have been measured using LHC pp collisions at a center-of-mass energy of 13 TeV, collected in the CMS detector in a data sample corresponding to an integrated luminosity of 35.9 fb^-1. The linearized variable A_FB^(1) is used to approximate the asymmetry. Candidate tt̄ events decaying to a muon or electron and jets in final states with low and high Lorentz boosts are selected and reconstructed using a fit of the kinematic distributions of the decay products to those expected for tt̄ final states. The values found for the parameters are A_FB^(1) = 0.048 +0.095/-0.087 (stat) +0.020/-0.029 (syst) and Ό̂_t = -0.024 +0.013/-0.009 (stat) +0.016/-0.011 (syst), and a limit is placed on the magnitude of the chromoelectric moment: |d̂_t| < 0.03 at 95% confidence level.
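
    For orientation, a toy computation of the plain (not linearized) forward-backward asymmetry from the top-antitop rapidity difference, A_FB = (N(Δy > 0) − N(Δy < 0)) / (N(Δy > 0) + N(Δy < 0)), on made-up values; the paper's A_FB^(1) is a linearized approximation of this quantity:

        import numpy as np

        # Δy = y_t − y_tbar for a handful of toy events.
        delta_y = np.array([0.4, -0.1, 0.9, -0.3, 0.2, 0.7, -0.5, 0.1])
        n_f = np.count_nonzero(delta_y > 0)   # forward events
        n_b = np.count_nonzero(delta_y < 0)   # backward events
        print("A_FB =", (n_f - n_b) / (n_f + n_b))   # -> 0.25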

    Calibration of the CMS hadron calorimeters using proton-proton collision data at √s = 13 TeV

    Methods are presented for calibrating the hadron calorimeter system of the CMS detector at the LHC. The hadron calorimeters of the CMS experiment are sampling calorimeters of brass and scintillator, and are in the form of one central detector and two endcaps. These calorimeters cover pseudorapidities |η| < 3. The energy scale of the outer calorimeter has been determined with test beam data and is confirmed through data with high transverse momentum jets. In this paper, we present the details of the calibration methods and their accuracy.
    • 

    corecore