2,731 research outputs found

    Variational Principle for Velocity-Pressure Formulation of Navier-Stokes Equations

    The work described here shows that the known variational principle for the Navier-Stokes equations and the adjoint system can be modified to produce a set of Euler-Lagrange variational equations that have the same order and the same solution as the Navier-Stokes equations, provided that the adjoint system has a unique solution and, in the steady-state case, that the Reynolds number remains finite. Comment: 10 pages
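
    For orientation, the velocity-pressure (primitive-variable) form of the incompressible Navier-Stokes equations named in the title, written here in nondimensional form, together with one standard adjoint-weighted residual functional, can be sketched as follows; the specific functional that the paper modifies may differ in detail.

        % Incompressible Navier-Stokes equations, velocity-pressure form (nondimensional)
        \begin{align}
          \partial_t \mathbf{u} + (\mathbf{u}\cdot\nabla)\mathbf{u}
            &= -\nabla p + \frac{1}{Re}\,\nabla^{2}\mathbf{u}, \\
          \nabla\cdot\mathbf{u} &= 0 .
        \end{align}

        % One standard adjoint-weighted residual functional (illustrative form only,
        % not necessarily the functional used in the paper):
        \begin{equation}
          J[\mathbf{u},p;\mathbf{v},q] = \int_{0}^{T}\!\!\int_{\Omega}
            \Big[\, \mathbf{v}\cdot\big(\partial_t\mathbf{u} + (\mathbf{u}\cdot\nabla)\mathbf{u}
                  + \nabla p - \tfrac{1}{Re}\nabla^{2}\mathbf{u}\big)
                + q\,\nabla\cdot\mathbf{u} \,\Big]\, d\Omega\, dt .
        \end{equation}

    Stationarity of such a functional with respect to the adjoint variables (v, q) recovers the Navier-Stokes system, while stationarity with respect to (u, p) yields the adjoint equations; this is the kind of structure the abstract refers to when it speaks of modifying the known variational principle.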

    Exact analytical solution of viscous Korteweg-deVries equation for water waves

    The evolution of a solitary wave with very weak nonlinearity, originally investigated by Miles [4], is revisited. The solution for a one-dimensional gravity wave in water of uniform depth is considered, which leads to a Korteweg-de Vries (KdV) equation in which the nonlinear term is small. The asymptotic solution of the linearized KdV equation is also considered, both analytically and numerically. As in Miles [4], the asymptotic solution of the KdV equation for both the linear and the weakly nonlinear cases is found using the method of inverse-scattering theory. Additionally, the analytical solution of the viscous KdV equation is investigated; it reveals the formation of a Peregrine soliton that decays to the initial sech^2(\xi) soliton and eventually grows back into a narrower, higher-amplitude bifurcated Peregrine-type soliton. Comment: 15 pages
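
    For reference, the standard Korteweg-de Vries equation, its sech^2 solitary-wave solution, and one commonly used weakly viscous extension (the exact viscous term adopted in the paper may differ) can be written as:

        % Korteweg-de Vries equation and its sech^2 soliton of speed c
        u_t + 6\,u\,u_x + u_{xxx} = 0, \qquad
        u(x,t) = \frac{c}{2}\,\operatorname{sech}^{2}\!\Big(\frac{\sqrt{c}}{2}\,(x - c\,t - x_0)\Big).

        % A commonly used weakly damped (viscous) KdV form, with 0 < \nu \ll 1;
        % shown only as an illustration of how a small viscous term enters:
        u_t + 6\,u\,u_x + u_{xxx} = \nu\,u_{xx} .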

    Mutual Exclusivity Loss for Semi-Supervised Deep Learning

    In this paper we consider the problem of semi-supervised learning with deep Convolutional Neural Networks (ConvNets). Semi-supervised learning is motivated by the observation that unlabeled data is cheap and can be used to improve the accuracy of classifiers. We propose an unsupervised regularization term that explicitly forces the classifier's predictions for the different classes to be mutually exclusive and effectively guides the decision boundary to lie in the low-density region between the manifolds corresponding to different classes of data. Our proposed approach is general and can be used with any backpropagation-based learning method. We show through different experiments that our method can improve the object recognition performance of ConvNets using unlabeled data. Comment: 5 pages, 1 figure, ICIP 2016
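
    As a concrete illustration of the kind of regularizer described above, the sketch below implements one common mutual-exclusivity penalty on unlabeled samples in PyTorch. The function name, the sigmoid parameterization, and the suggested weighting are assumptions for illustration and may differ from the loss actually used in the paper.

        # Minimal sketch of a mutual-exclusivity regularizer for unlabeled data.
        # Hypothetical helper; the paper's exact loss may differ in detail.
        import torch

        def mutual_exclusivity_loss(logits: torch.Tensor) -> torch.Tensor:
            """Encourage each prediction to activate exactly one class.

            logits: (batch, num_classes) unnormalized scores for unlabeled samples.
            The returned scalar is minimized when one class probability is near 1
            and all others are near 0, pushing the decision boundary toward
            low-density regions between class manifolds.
            """
            probs = torch.sigmoid(logits)                 # per-class probabilities in (0, 1)
            eps = 1e-8
            log_one_minus = torch.log(1.0 - probs + eps)  # log(1 - p_j), shape (batch, C)
            total = log_one_minus.sum(dim=1, keepdim=True)
            # prod_{j != i} (1 - p_j), computed stably in log space
            prod_except_i = torch.exp(total - log_one_minus)
            # sum_i p_i * prod_{j != i} (1 - p_j): large when the output is one-hot-like
            exclusive_mass = (probs * prod_except_i).sum(dim=1)
            return -exclusive_mass.mean()

        # Assumed usage: total loss = cross-entropy on the labeled batch
        #   + lambda_u * mutual_exclusivity_loss(model(unlabeled_batch))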