
    An update on the Hirsch conjecture

    The Hirsch conjecture was posed in 1957 in a letter from Warren M. Hirsch to George Dantzig. It states that the graph of a d-dimensional polytope with n facets cannot have diameter greater than n - d. Despite being one of the most fundamental and long-standing problems in polytope theory, what we know about it is quite scarce. Most notably, no polynomial upper bound is known for the diameters, which are conjectured to be linear. In contrast, very few polytopes are known where the bound n - d is attained. This paper collects known results and remarks on both the positive and the negative side of the conjecture. Some proofs are included, but only those that we hope are accessible to a general mathematical audience without introducing too many technicalities. Comment: 28 pages, 6 figures. Many proofs have been taken out from version 2 and put into the appendix. arXiv:0912.423
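    For reference, the conjectured bound can be written in one line; this is just a restatement of the abstract's claim in standard notation, where G(P) denotes the graph (1-skeleton) of the polytope P.

```latex
% The Hirsch conjecture as described in the abstract: the graph G(P) of a
% d-dimensional polytope P with n facets has diameter at most n - d.
\[
  \operatorname{diam}\bigl(G(P)\bigr) \;\le\; n - d .
\]
```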

    Level Set Methods for Stochastic Discontinuity Detection in Nonlinear Problems

    Stochastic physical problems governed by nonlinear conservation laws are challenging due to solution discontinuities in stochastic and physical space. In this paper, we present a level set method to track discontinuities in stochastic space by solving a Hamilton-Jacobi equation. By introducing a speed function that vanishes at discontinuities, the iso-zero of the level set function coincides with the discontinuities of the conservation law. The level set problem is solved on a sequence of successively finer grids in stochastic space. The method is adaptive in the sense that costly evaluations of the conservation law of interest are performed only in the vicinity of the discontinuities during the refinement stage. In regions of stochastic space where the solution is smooth, a surrogate method replaces expensive evaluations of the conservation law. The proposed method is tested in conjunction with different sets of localized orthogonal basis functions on simplex elements, as well as frames based on piecewise polynomials conforming to the level set function. The performance of the proposed method is compared to existing adaptive multi-element generalized polynomial chaos methods.
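    A minimal one-dimensional sketch of the idea described above: a level set function is evolved under a Hamilton-Jacobi equation whose speed vanishes where a surrogate solution has a steep gradient, so its zero level set is transported toward the discontinuity and stops there. The grid resolution, the surrogate solution, and the particular speed function are illustrative assumptions, not the construction used in the paper.

```python
import numpy as np

# 1-D sketch: evolve phi under phi_t + F(x) |phi_x| = 0 with a speed F that
# vanishes where a surrogate solution u has a steep gradient, so the zero
# level set of phi stops at the discontinuity of u.
npts = 201
x = np.linspace(0.0, 1.0, npts)
dx = x[1] - x[0]

# Surrogate solution with a jump at x = 0.6 (stands in for the expensive
# conservation-law evaluations in stochastic space).
u = np.where(x < 0.6, 1.0, 0.0)

# Speed function: close to 1 where u is smooth, close to 0 at the jump.
grad_u = np.abs(np.gradient(u, dx))
F = 1.0 / (1.0 + 50.0 * grad_u)

# Initial level set: zero crossing at x = 0.2, well left of the jump.
phi = x - 0.2

dt = 0.5 * dx / F.max()                          # explicit CFL-type time step
for _ in range(1500):
    # Godunov upwind approximation of |phi_x| for non-negative speed F.
    dminus = np.diff(phi, prepend=phi[0]) / dx   # backward differences
    dplus = np.diff(phi, append=phi[-1]) / dx    # forward differences
    grad = np.sqrt(np.maximum(dminus, 0.0)**2 + np.minimum(dplus, 0.0)**2)
    phi = phi - dt * F * grad

# The zero crossing of phi approximates the location of the discontinuity.
print(f"detected interface near x = {x[np.argmin(np.abs(phi))]:.3f}")
```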

    Smoothed Analysis of the Minimum-Mean Cycle Canceling Algorithm and the Network Simplex Algorithm

    The minimum-cost flow (MCF) problem is a fundamental optimization problem with many applications and seems to be well understood. Over the last half century many algorithms have been developed to solve the MCF problem, and these algorithms have varying worst-case bounds on their running time. However, these worst-case bounds are not always a good indication of the algorithms' performance in practice. The Network Simplex (NS) algorithm needs an exponential number of iterations for some instances, but it is considered the best algorithm in practice and performs best in experimental studies. On the other hand, the Minimum-Mean Cycle Canceling (MMCC) algorithm is strongly polynomial, but performs badly in experimental studies. To explain these differences in performance in practice we apply the framework of smoothed analysis. We show an upper bound of $O(mn^2\log(n)\log(\phi))$ for the number of iterations of the MMCC algorithm. Here $n$ is the number of nodes, $m$ is the number of edges, and $\phi$ is a parameter limiting the degree to which the edge costs are perturbed. We also show a lower bound of $\Omega(m\log(\phi))$ for the number of iterations of the MMCC algorithm, which can be strengthened to $\Omega(mn)$ when $\phi=\Theta(n^2)$. For the number of iterations of the NS algorithm we show a smoothed lower bound of $\Omega(m \cdot \min\{n, \phi\} \cdot \phi)$. Comment: Extended abstract to appear in the proceedings of COCOON 201
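    The MMCC algorithm repeatedly augments flow along a residual cycle of minimum mean cost. As a minimal sketch of that core subroutine (not of the smoothed analysis itself), the snippet below implements Karp's classical minimum mean-cycle computation; the function name and the zero-cost super-source construction are illustrative choices.

```python
import math

def min_mean_cycle_value(n, edges):
    """Karp's algorithm for the minimum mean weight of a directed cycle.

    n     -- number of vertices, labelled 0 .. n-1
    edges -- iterable of (u, v, w): a directed edge u -> v with cost w
    Returns the minimum mean cycle weight, or None if the graph is acyclic.
    """
    INF = math.inf
    # Add a super source (vertex n) with zero-cost edges to every vertex so
    # that every cycle is reachable; this creates no new cycles.
    N = n + 1
    aug = list(edges) + [(n, v, 0.0) for v in range(n)]
    # d[k][v] = minimum cost of a walk with exactly k edges from the super
    # source to v (INF if no such walk exists).
    d = [[INF] * N for _ in range(N + 1)]
    d[0][n] = 0.0
    for k in range(1, N + 1):
        for u, v, w in aug:
            if d[k - 1][u] < INF and d[k - 1][u] + w < d[k][v]:
                d[k][v] = d[k - 1][u] + w
    # Karp's formula: mu* = min_v max_k (d[N][v] - d[k][v]) / (N - k).
    best = None
    for v in range(N):
        if d[N][v] == INF:
            continue
        worst = max((d[N][v] - d[k][v]) / (N - k)
                    for k in range(N) if d[k][v] < INF)
        if best is None or worst < best:
            best = worst
    return best
```

    For example, min_mean_cycle_value(3, [(0, 1, 2.0), (1, 2, -1.0), (2, 0, -4.0)]) returns -1.0, the mean weight of the single cycle 0 -> 1 -> 2 -> 0.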

    Optimal Joint Routing and Scheduling in Millimeter-Wave Cellular Networks

    Millimeter-wave (mmWave) communication is a promising technology to cope with the expected exponential increase in data traffic in 5G networks. mmWave networks typically require a very dense deployment of mmWave base stations (mmBS). To reduce cost and increase flexibility, wireless backhauling is needed to connect the mmBSs. The characteristics of mmWave communication, and specifically its high directionality, imply new requirements for efficient routing and scheduling paradigms. We propose an efficient scheduling method, called schedule-oriented optimization, based on matching theory, that optimizes QoS metrics jointly with routing. It is capable of solving any scheduling problem that can be formulated as a linear program whose variables are link times and QoS metrics. As an example of schedule-oriented optimization, we show the optimal solution of maximum throughput fair scheduling (MTFS). In practice, the optimal schedule can be obtained even for networks with over 200 mmBSs. To further increase the runtime performance, we propose an efficient edge-coloring based approximation algorithm with a provable performance bound. It achieves over 80% of the optimal max-min throughput and runs 5 to 100 times faster than the optimal algorithm in practice. Finally, we extend the optimal and approximation algorithms to the cases of multi-RF-chain mmBSs and integrated backhaul and access networks. Comment: To appear in Proceedings of INFOCOM '1
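    The edge-coloring view of scheduling mentioned above can be illustrated with a short sketch: backhaul links that share a base station interfere, so a proper edge coloring partitions the links into conflict-free time slots. The toy topology and the greedy line-graph coloring below are illustrative assumptions, not the approximation algorithm proposed in the paper.

```python
import networkx as nx

# Toy backhaul topology: nodes stand in for mmWave base stations, edges for
# backhaul links; two links interfere when they share a base station.
G = nx.Graph()
G.add_edges_from([(0, 1), (1, 2), (2, 3), (1, 4), (4, 5), (2, 5)])

# A proper edge coloring gives adjacent (interfering) links different colors,
# so each color class is a set of links that can be active in the same time
# slot.  Greedy coloring of the line graph is a simple heuristic stand-in.
line_graph = nx.line_graph(G)
coloring = nx.greedy_color(line_graph, strategy="largest_first")

# Group links by color to obtain a conflict-free slot schedule.
slots = {}
for link, color in coloring.items():
    slots.setdefault(color, []).append(link)
for slot in sorted(slots):
    print(f"time slot {slot}: active links {slots[slot]}")
```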