2 research outputs found

    $L_\infty$ Isotonic Regression for Linear, Multidimensional, and Tree Orders

    Algorithms are given for determining $L_\infty$ isotonic regression of weighted data. For a linear order, grid in multidimensional space, or tree, of $n$ vertices, optimal algorithms are given, taking $\Theta(n)$ time. These improve upon previous algorithms by a factor of $\Omega(\log n)$. For vertices at arbitrary positions in $d$-dimensional space a $\Theta(n \log^{d-1} n)$ algorithm employs iterative sorting to yield the functionality of a multidimensional structure while using only $\Theta(n)$ space. The algorithms utilize a new non-constructive feasibility test on a rendezvous graph, with bounded error envelopes at each vertex.
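    The paper's optimal algorithms handle weighted data on linear, grid, and tree orders. As a small illustration of the underlying problem only (not the paper's algorithm), the sketch below computes unweighted $L_\infty$ isotonic regression on a linear order via the classical prefix-max/suffix-min midpoint construction, which yields a nondecreasing fit with the optimal $L_\infty$ error in $\Theta(n)$ time; the function name is illustrative.

    # Minimal sketch, unweighted linear order only: the fit at position i is
    # the midpoint of max(y[0..i]) and min(y[i..n-1]).
    def linf_isotonic_linear(y):
        n = len(y)
        prefix_max = [0.0] * n
        suffix_min = [0.0] * n
        running = float("-inf")
        for i, v in enumerate(y):
            running = max(running, v)
            prefix_max[i] = running
        running = float("inf")
        for i in range(n - 1, -1, -1):
            running = min(running, y[i])
            suffix_min[i] = running
        return [(a + b) / 2.0 for a, b in zip(prefix_max, suffix_min)]

    if __name__ == "__main__":
        y = [1.0, 3.0, 2.0, 5.0, 4.0]
        fit = linf_isotonic_linear(y)
        print(fit)                                       # nondecreasing fit
        print(max(abs(a - b) for a, b in zip(y, fit)))   # optimal L_inf error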

    Decomposing Isotonic Regression for Efficiently Solving Large Problems

    A new algorithm for isotonic regression is presented based on recursively partitioning the solution space. We develop efficient methods for each partitioning subproblem through an equivalent representation as a network flow problem, and prove that this sequence of partitions converges to the global solution. These network flow problems can further be decomposed in order to solve very large problems. Success of isotonic regression in prediction and our algorithm's favorable computational properties are demonstrated through simulated examples as large as $2 \times 10^5$ variables and $10^7$ constraints.
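    As context for the abstract's flow-based formulation, the sketch below casts a single partitioning subproblem as a maximum-weight-closure / minimum s-t cut problem: a block of points is split around its weighted mean into a lower part and an upper set of the partial order. This is a hedged illustration using networkx; the function name, the exact cut construction, and the stopping rule are assumptions, not the paper's implementation.

    import networkx as nx

    def partition_block(values, weights, edges):
        """Split one block of an isotonic regression problem.

        values[i], weights[i]: observed value and positive weight of point i.
        edges: pairs (i, j) meaning the fit at i must not exceed the fit at j
               (order constraints restricted to this block).

        Returns the set U of points placed in the upper part; the remaining
        points form the lower part. An empty or full U means the block is
        not split and is fitted with its weighted mean.
        """
        mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
        gain = [w * (v - mean) for w, v in zip(weights, values)]

        G = nx.DiGraph()
        for i, g in enumerate(gain):
            if g > 0:                       # points pulling the block upward
                G.add_edge("s", i, capacity=g)
            elif g < 0:                     # points pulling the block downward
                G.add_edge(i, "t", capacity=-g)
        for i, j in edges:                  # closure edges: i in U forces j in U
            G.add_edge(i, j, capacity=float("inf"))

        if "s" not in G or "t" not in G:    # all gains on one side: no split
            return set()
        _, (source_side, _) = nx.minimum_cut(G, "s", "t")
        return {i for i in source_side if i != "s"}

    # Example: three points under the chain order 0 <= 1 <= 2; the block is
    # not split, so it is fitted with its weighted mean.
    print(partition_block([3.0, 1.0, 2.0], [1.0, 1.0, 1.0], [(0, 1), (1, 2)]))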