    Optimal bounds with semidefinite programming: an application to stress driven shear flows

    We introduce an innovative numerical technique based on convex optimization to solve a range of infinite-dimensional variational problems arising from the application of the background method to fluid flows. In contrast to most existing schemes, we do not consider the Euler--Lagrange equations for the minimizer. Instead, we use series expansions to formulate a finite-dimensional semidefinite program (SDP) whose solution converges to that of the original variational problem. Our formulation accounts for the influence of all modes in the expansion, and the feasible set of the SDP corresponds to a subset of the feasible set of the original problem. Moreover, SDPs can be easily formulated when the fluid is subject to imposed boundary fluxes, which pose a challenge for traditional methods. We apply this technique to compute rigorous and near-optimal upper bounds on the dissipation coefficient for flows driven by a surface stress. We improve on previous analytical bounds by more than a factor of 10, and show that the bounds become independent of the domain aspect ratio in the limit of vanishing viscosity. We also confirm that the dissipation properties of stress-driven flows are similar to those of flows subject to a body force localized in a narrow layer near the surface. Finally, we show that SDP relaxations are an efficient method to investigate the energy stability of laminar flows driven by a surface stress.

    Comment: 17 pages; typos removed; extended discussion of linear matrix inequalities in Section III; revised argument in Section IVC, results unchanged; extended discussion of computational setup and limitations in Sections IVE-IVF. Submitted to Phys. Rev.
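    The core idea of bounding a quantity via a semidefinite program is that the set of admissible bounds is cut out by a linear matrix inequality (LMI), so the optimal bound is found at the boundary of an LMI-feasible set. The following toy sketch (not the paper's formulation, which builds the LMI from series expansions of the background method) illustrates this with a hypothetical 3x3 problem: find the least c such that c*I - A is positive semidefinite, which here is exactly the largest eigenvalue of A.

    ```python
    import numpy as np

    # Toy "variational problem": the smallest c certified by the LMI
    # c*I - A >= 0 is the largest eigenvalue of A, giving an exact
    # answer to check the bisection against.
    A = np.array([[2.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])

    def lmi_feasible(c):
        # A symmetric matrix is positive semidefinite iff its smallest
        # eigenvalue is non-negative.
        return np.linalg.eigvalsh(c * np.eye(3) - A).min() >= 0.0

    # The feasible values of c form a half-line, so bisection on c
    # converges to the optimal (smallest certifiable) bound.
    lo, hi = 0.0, 10.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if lmi_feasible(mid):
            hi = mid
        else:
            lo = mid

    print(hi)  # converges to max eigenvalue of A, which is 4.0
    ```

    A full SDP solver optimizes over matrix-valued variables directly rather than bisecting a scalar, but the feasibility structure is the same: any feasible point of the LMI yields a rigorous (possibly suboptimal) bound, which is why truncating the expansion still produces valid bounds.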

    The Limits of Post-Selection Generalization

    While statistics and machine learning offer numerous methods for ensuring generalization, these methods often fail in the presence of adaptivity: the common practice in which the choice of analysis depends on previous interactions with the same dataset. A recent line of work has introduced powerful, general-purpose algorithms that ensure post hoc generalization (also called robust or post-selection generalization), which says that, given the output of the algorithm, it is hard to find any statistic for which the data differs significantly from the population it came from. In this work we show several limitations on the power of algorithms satisfying post hoc generalization. First, we show a tight lower bound on the error of any algorithm that satisfies post hoc generalization and answers adaptively chosen statistical queries, showing a strong barrier to progress in post-selection data analysis. Second, we show that post hoc generalization is not closed under composition, despite many examples of such algorithms exhibiting strong composition properties.
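    The failure mode the abstract describes is easy to reproduce: when a later query is chosen based on earlier answers on the same sample, the empirical answer can drift far from the population value. The simulation below (a standard illustration of adaptive overfitting, not taken from the paper) asks one statistical query per feature, then adaptively builds a composite feature from those that looked correlated with random labels; its empirical correlation is clearly positive even though the true correlation is zero.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 100, 1000
    X = rng.standard_normal((n, d))       # features, independent of the labels
    y = rng.choice([-1.0, 1.0], size=n)   # labels carry no signal at all

    # Round 1: statistical queries, one per feature -- the empirical
    # correlation of each feature with y on this sample.
    corrs = X.T @ y / n

    # Adaptive step: keep only the features that happened to look
    # positively correlated on this sample.
    selected = corrs > 0

    # Round 2: query the adaptively chosen composite feature. Its
    # empirical correlation is inflated, while the population value is 0.
    composite = X[:, selected].mean(axis=1)
    empirical = composite @ y / n

    print(empirical)  # noticeably positive despite zero true correlation
    ```

    With n = 100 samples the selection step harvests roughly half the features, each with an empirical correlation of order 1/sqrt(n), so the composite query reports a spurious correlation near 0.08. Post hoc generalization is exactly the guarantee meant to rule this out, and the abstract's lower bound limits how cheaply any algorithm can provide it.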