
    Cone-volume measures of polytopes

    The cone-volume measure of a polytope with centroid at the origin is proved to satisfy the subspace concentration condition. As a consequence, a fundamental sharp affine isoperimetric inequality for the U-functional, conjectured a dozen years ago, is completely established, along with its equality conditions.
    Comment: Slightly revised version thanks to the suggestions of the referees and other readers; two figures added
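    For orientation, a sketch of the two notions named above, in the form they commonly take in the literature (normalizations vary; the paper's own conventions are not reproduced here): the cone-volume measure of a convex body K in R^n containing the origin in its interior, and the subspace concentration condition.

        \[
          V_K(\omega) \;=\; \frac{1}{n} \int_{\nu_K^{-1}(\omega)} \langle x, \nu_K(x) \rangle \, d\mathcal{H}^{n-1}(x)
          \qquad \text{for Borel } \omega \subseteq S^{n-1},
        \]
        \[
          \text{subspace concentration:}\quad
          V_K(\xi \cap S^{n-1}) \;\le\; \frac{\dim \xi}{n}\, V_K(S^{n-1})
          \quad \text{for every proper linear subspace } \xi \subset \mathbb{R}^n,
        \]

    where \nu_K denotes the Gauss map (outer unit normal) of K and \mathcal{H}^{n-1} is (n-1)-dimensional Hausdorff measure.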

    Combinatorics and Geometry of Transportation Polytopes: An Update

    A transportation polytope consists of all multidimensional arrays or tables of non-negative real numbers that satisfy certain sum conditions on subsets of the entries. They arise naturally in optimization and statistics, and are also of interest in discrete mathematics because permutation matrices, Latin squares, and magic squares appear naturally as lattice points of these polytopes. In this paper we survey advances in the understanding of the combinatorics and geometry of these polyhedra and include some recent unpublished results on the diameter of graphs of these polytopes. In particular, this is a thirty-year update on the status of a list of open questions last visited in the 1984 book by Yemelichev, Kovalev and Kravtsov and the 1986 survey paper of Vlach.
    Comment: 35 pages, 13 figures
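    As a small concrete companion to the survey, the sketch below constructs a vertex of a classical two-way transportation polytope (nonnegative m x n matrices with prescribed row and column sums) using the textbook northwest-corner rule; the rule and the toy margins are illustrative choices, not material from the paper.

        # Minimal sketch: a vertex of the classical transportation polytope
        #   T(r, c) = { X >= 0 : row sums = r, column sums = c },  sum(r) == sum(c),
        # produced by the standard northwest-corner rule.

        def northwest_corner(r, c):
            """Return a basic feasible solution (a vertex of T(r, c)) as a list of rows."""
            r, c = list(r), list(c)
            assert abs(sum(r) - sum(c)) < 1e-9, "row and column sums must balance"
            X = [[0.0] * len(c) for _ in r]
            i = j = 0
            while i < len(r) and j < len(c):
                x = min(r[i], c[j])          # ship as much as possible at the current cell
                X[i][j] = x
                r[i] -= x
                c[j] -= x
                if r[i] <= 1e-12:            # row supply exhausted: move down
                    i += 1
                else:                        # column demand exhausted: move right
                    j += 1
            return X

        if __name__ == "__main__":
            # Toy 2x3 example with margins (3, 7) and (2, 4, 4).
            for row in northwest_corner([3, 7], [2, 4, 4]):
                print(row)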

    The Convex Hull Problem in Practice: Improving the Running Time of the Double Description Method

    The double description method is a simple but widely used algorithm for the computation of extreme points in polyhedral sets. One key aspect of its implementation is the question of how to efficiently test extreme points for adjacency. In this dissertation, two significant contributions related to adjacency testing are presented. First, the currently used data structures are revisited and various optimizations are proposed. Empirical evidence is provided to demonstrate their competitiveness. Second, a new adjacency test is introduced. It is a refinement of the well-known algebraic test, featuring a technique for avoiding redundant computations. Its correctness is formally proven. Its superiority in multiple degenerate scenarios is demonstrated through experimental results. Parallel computation is one further aspect of the double description method covered in this work. A recently introduced divide-and-conquer technique is revisited and considerable practical limitations are demonstrated.
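    For context, a minimal sketch of the classical combinatorial adjacency test commonly used in double description implementations: two extreme rays are adjacent iff no third ray is tight on every constraint tight on both. This is the standard test, not the refined algebraic test introduced in the dissertation; the ray data below are hypothetical.

        # Minimal sketch of the classical combinatorial adjacency test for extreme rays.
        # Each ray is represented only by its "zero set": the indices of the inequality
        # constraints that are tight (satisfied with equality) on that ray.

        def adjacent(i, j, zero_sets):
            """True iff rays i and j are adjacent: no other ray's zero set contains
            the intersection of their zero sets (standard combinatorial test)."""
            common = zero_sets[i] & zero_sets[j]
            return not any(common <= zero_sets[k]
                           for k in range(len(zero_sets)) if k not in (i, j))

        if __name__ == "__main__":
            # Hypothetical zero sets for four extreme rays of some cone, for illustration only.
            zero_sets = [{0, 1, 2}, {1, 2, 3}, {0, 3}, {2, 3, 4}]
            for i in range(len(zero_sets)):
                for j in range(i + 1, len(zero_sets)):
                    print(i, j, adjacent(i, j, zero_sets))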

    Arithmetic geometry of toric varieties. Metrics, measures and heights

    We show that the height of a toric variety with respect to a toric metrized line bundle can be expressed as the integral over a polytope of a certain adelic family of concave functions. To state and prove this result, we study the Arakelov geometry of toric varieties. In particular, we consider models over a discrete valuation ring, metrized line bundles, and their associated measures and heights. We show that these notions can be expressed in terms of convex analysis, and are closely related to objects like polyhedral complexes, concave functions, real Monge-Ampère measures, and Legendre-Fenchel duality. We also present a closed formula for the integral over a polytope of a function of one variable composed with a linear form. This allows us to compute the height of toric varieties with respect to some interesting metrics arising from polytopes. We also compute the height of toric projective curves with respect to the Fubini-Study metric, and of some toric bundles.
    Comment: Revised version, 230 pages, 3 figures
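    For orientation, the Legendre-Fenchel operation that underlies this convex-analysis dictionary, together with a purely schematic form of the height formula described in the abstract; the roof functions \vartheta_v, the weights n_v, and the exact normalizations are the paper's own and are not reproduced here, so the second line is only meant to convey the shape of the result.

        \[
          f^{\vee}(y) \;=\; \inf_{x \in \Delta} \bigl( \langle x, y \rangle - f(x) \bigr)
          \qquad \text{(Legendre-Fenchel dual of a concave function } f \text{ on a polytope } \Delta\text{)},
        \]
        \[
          h_{\overline{L}}(X) \;=\; (n+1)! \, \sum_{v} n_v \int_{\Delta} \vartheta_{\overline{L},v}\, d\mathrm{vol}
          \qquad \text{(schematic: an adelic sum of integrals of concave functions over } \Delta\text{).}
        \]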

    Geometric uncertainty models for correspondence problems in digital image processing

    Many recent advances in technology rely heavily on the correct interpretation of an enormous amount of visual information. All available sources of visual data (e.g. cameras in surveillance networks, smartphones, game consoles) must be adequately processed to retrieve the most interesting user information. Therefore, computer vision and image processing techniques currently attract significant interest, and will continue to do so in the near future. Most commonly applied image processing algorithms require a reliable solution for correspondence problems. The solution involves, first, the localization of corresponding points (points visualizing the same 3D point in the observed scene) in the different images of distinct sources, and second, the computation of consistent geometric transformations relating correspondences on scene objects. This PhD thesis presents a theoretical framework for solving correspondence problems with geometric features (such as points and straight lines) representing rigid objects in image sequences of complex scenes with static and dynamic cameras. The research focuses on localization uncertainty due to errors in feature detection and measurement, and its effect on each step in the solution of a correspondence problem. Whereas most other recent methods apply statistics-based models for spatial localization uncertainty, this work considers a novel geometric approach. Localization uncertainty is modeled as a convex polygonal region in the image space. This model can be efficiently propagated throughout the correspondence-finding procedure. It allows for an easy extension toward transformation uncertainty models, and for inferring confidence measures to verify the reliability of the outcome of the correspondence framework. Our procedure aims at finding reliable consistent transformations in sets of few and ill-localized features, possibly containing a large fraction of false candidate correspondences. The evaluation of the proposed procedure in practical correspondence problems shows that correct consistent correspondence sets are returned in over 95% of the experiments for small sets of 10-40 features contaminated with up to 400% false positives and 40% false negatives. The presented techniques prove to be beneficial in typical image processing applications, such as image registration and rigid object tracking.
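    A minimal sketch of the central modeling idea, under our own simplifying assumption that the uncertainty region of a detected point is a convex polygon propagated through an affine image transformation by mapping its vertices; the matrices and the square region below are hypothetical, and the thesis' actual propagation and confidence measures are more involved.

        import numpy as np

        # Minimal sketch: a polygonal localization-uncertainty region propagated through
        # an affine transformation.  An affine map sends a convex polygon to a convex
        # polygon, so mapping the vertices is enough for this illustration.

        def propagate_polygon(vertices, A, t):
            """vertices: (k, 2) array of polygon corners in order; A: 2x2 matrix; t: 2-vector."""
            return vertices @ A.T + t

        if __name__ == "__main__":
            # Hypothetical +/- 1 px square uncertainty region around a detected point at (10, 5).
            region = np.array([[-1.0, -1.0], [1.0, -1.0], [1.0, 1.0], [-1.0, 1.0]]) + np.array([10.0, 5.0])
            A = np.array([[1.2, 0.1], [-0.1, 1.2]])   # hypothetical rotation/scale part
            t = np.array([3.0, -2.0])                 # hypothetical translation
            print(propagate_polygon(region, A, t))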

    Scenedesmus biomass productivity and nutrient removal from wet market wastewater: a bio-kinetic study

    The current study aims to investigate the production of microalgae biomass as a function of different wet market wastewater ratios (10, 25, 50, 75 and 100%) and Scenedesmus sp. initial concentrations (10^4, 10^5, 10^6 and 10^7 cells/mL) through the phycoremediation process. The biomass production, total nitrogen (TN), total phosphorus (TP) and total organic compounds (TOC) were determined daily. The pseudo-first-order kinetic model was used to measure the potential of Scenedesmus sp. in removing nutrients, while the Verhulst logistic kinetic model was used to study the growth kinetics. The study revealed that the maximum productivity of Scenedesmus sp. biomass (98.54 mg/L/day) was recorded with an initial concentration of 10^6 cells/mL in 50% wet market wastewater, and the highest removal of TP, TN and TOC was obtained (85, 90 and 65%, respectively). Total protein and lipid contents in the biomass produced in the wet market wastewater were higher than in the biomass produced in BBM (41.7 vs. 37.4% and 23.2 vs. 19.2%, respectively). GC–MS analysis confirmed the detection of 44 compounds in the biomass from the wet market wastewater, compared to four compounds in the BBM biomass. These compounds have several applications in pharmaceuticals, personal care products, the chemical industry and as antimicrobial agents. These findings indicate the applicability of wet market wastewater as a production medium for microalgae biomass.
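    A minimal sketch of the two kinetic models named in the abstract, fitted by nonlinear least squares; the time series below are hypothetical placeholders (not data from the study), and the pseudo-first-order form shown (exponential nutrient decay) is one common parameterization.

        import numpy as np
        from scipy.optimize import curve_fit

        # Minimal sketch of the two models named in the abstract.
        # The data points below are hypothetical placeholders, NOT values from the study.

        def pseudo_first_order(t, C0, k):
            """Nutrient concentration decaying as C(t) = C0 * exp(-k t) (one common form)."""
            return C0 * np.exp(-k * t)

        def verhulst_logistic(t, X0, Xmax, mu):
            """Logistic biomass growth X(t) = Xmax / (1 + ((Xmax - X0)/X0) * exp(-mu t))."""
            return Xmax / (1.0 + ((Xmax - X0) / X0) * np.exp(-mu * t))

        if __name__ == "__main__":
            t  = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0])                 # days (placeholder)
            TN = np.array([40.0, 30.5, 23.0, 17.5, 13.0, 10.0, 7.5])           # mg/L (placeholder)
            X  = np.array([50.0, 90.0, 160.0, 260.0, 380.0, 470.0, 520.0])     # mg/L (placeholder)

            (C0, k), _ = curve_fit(pseudo_first_order, t, TN, p0=[40.0, 0.3])
            (X0, Xmax, mu), _ = curve_fit(verhulst_logistic, t, X, p0=[50.0, 600.0, 0.5])

            print(f"pseudo-first-order: C0={C0:.1f} mg/L, k={k:.2f} 1/day")
            print(f"logistic growth: X0={X0:.1f} mg/L, Xmax={Xmax:.1f} mg/L, mu={mu:.2f} 1/day")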