
    On angular momentum of gravitational radiation

    The quasigroup approach to the conservation laws (Phys. Rev. D56, R7498 (1997)) is completed by imposing new gauge conditions for asymptotic symmetries. The Noether charge associated with an arbitrary element of the Poincar\'e quasialgebra is free from the supertranslational ambiguity and vanishes identically in a flat spacetime.

    Nonassociativity, Dirac monopoles and Aharonov-Bohm effect

    The Aharonov-Bohm (AB) effect for the singular string associated with a Dirac monopole carrying an arbitrary magnetic charge is studied. It is shown that the difficulties that arise in explaining the AB effect can be removed by introducing nonassociative path-dependent wave functions. This ensures the absence of the AB effect for the Dirac string of a magnetic monopole with an arbitrary magnetic charge.
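    For orientation, the standard computation behind the difficulty (a textbook relation in Gaussian units, not taken from the paper itself): a charge q transported around the Dirac string of a monopole of magnetic charge g encircles the return flux \Phi = 4\pi g carried by the string and acquires the AB phase

        \delta = \frac{q}{\hbar c}\oint {\bf A}\cdot d{\bf l} = \frac{4\pi q g}{\hbar c}.

    The string is unobservable only when \delta is a multiple of 2\pi, i.e. under the Dirac quantization condition qg = n\hbar c/2; for an arbitrary magnetic charge the phase is nontrivial, which is the obstruction the nonassociative path-dependent wave functions are designed to remove.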

    Non-Hermitian Quantum Systems and Time-Optimal Quantum Evolution

    Recently, Bender et al. have considered the quantum brachistochrone problem for the non-Hermitian \cal PT-symmetric quantum system and have shown that the optimal time evolution required to transform a given initial state |\psi_i\rangle into a specific final state |\psi_f\rangle can be made arbitrarily small. Additionally, it has been shown that finding the shortest possible time requires only the solution of the two-dimensional problem for the quantum system governed by the effective Hamiltonian acting in the subspace spanned by |\psi_i\rangle and |\psi_f\rangle. In this paper, we study a similar problem for the generic non-Hermitian Hamiltonian, focusing our attention on the geometric aspects of the problem.
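    For comparison, the Hermitian benchmark (a standard result recalled here for context, not quoted from this paper): for a Hermitian Hamiltonian whose spectral width is \omega = E_{\max} - E_{\min}, the minimal evolution time is

        \tau = \frac{2\hbar}{\omega}\,\arccos|\langle\psi_f|\psi_i\rangle|,

    so orthogonal states cannot be connected faster than \pi\hbar/\omega. The striking feature of the \cal PT-symmetric setting is that this bound can be evaded, which is what makes arbitrarily small evolution times possible.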

    Random gradient-free minimization of convex functions

    In this paper, we prove complexity bounds for methods of Convex Optimization based only on computation of the function value. The search directions of our schemes are normally distributed random Gaussian vectors. It appears that such methods usually need at most n times more iterations than the standard gradient methods, where n is the dimension of the space of variables. This conclusion is true for both nonsmooth and smooth problems. For the latter class, we also present an accelerated scheme with the expected rate of convergence O(n^2/k^2), where k is the iteration counter. For Stochastic Optimization, we propose a zero-order scheme and justify its expected rate of convergence O(n/k^{1/2}). We also give some bounds for the rate of convergence of the random gradient-free methods to stationary points of nonconvex functions, in both the smooth and nonsmooth cases. Our theoretical results are supported by preliminary computational experiments.
    Keywords: convex optimization, stochastic optimization, derivative-free methods, random methods, complexity bounds
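    A minimal sketch of the Gaussian finite-difference oracle that such zero-order schemes are built on (the step size, smoothing parameter, and function names below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def gradient_free_step(f, x, mu=1e-6, step=0.01, rng=np.random.default_rng()):
    """One iteration of a random gradient-free scheme.

    Samples a Gaussian direction u and forms the finite-difference
    estimate g = (f(x + mu*u) - f(x)) / mu * u, which approximates the
    gradient of the Gaussian-smoothed version of f.
    """
    u = rng.standard_normal(x.shape)
    g = (f(x + mu * u) - f(x)) / mu * u
    return x - step * g

# Usage: minimize a smooth convex quadratic using only function values.
f = lambda x: 0.5 * np.dot(x, x)
x = np.ones(10)
for _ in range(5000):
    x = gradient_free_step(f, x)
print(f(x))  # should be close to 0
```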

    Barrier subgradient method

    In this paper we develop a new primal-dual subgradient method for nonsmooth convex optimization problems. This scheme is based on a self-concordant barrier for the basic feasible set. It is suitable for finding approximate solutions with a certain relative accuracy. We discuss some applications of this technique, including the fractional covering problem, the maximal concurrent flow problem, semidefinite relaxations, and nonlinear online optimization.
    Keywords: convex optimization, subgradient methods, non-smooth optimization, minimax problems, saddle points, variational inequalities, stochastic optimization, black-box methods, lower complexity bounds
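    A sketch in the spirit of the idea, not the paper's actual scheme: dual-averaging steps on the probability simplex regularized by its log barrier F(x) = -\sum_i \ln x_i, which is self-concordant. The coefficient schedule and test objective are illustrative assumptions.

```python
import numpy as np

def barrier_argmin(s, beta):
    """Solve min over the simplex of <s, x> + beta * F(x), F(x) = -sum(log x).

    Stationarity gives x_i = beta / (s_i + lam); bisect on the
    multiplier lam until sum_i x_i = 1.
    """
    lo = -s.min() + 1e-12            # keeps every s_i + lam positive
    hi = -s.min() + beta * s.size    # here sum_i x_i <= 1 already
    for _ in range(100):
        lam = 0.5 * (lo + hi)
        if np.sum(beta / (s + lam)) > 1.0:
            lo = lam                 # sum too large: raise the multiplier
        else:
            hi = lam
    return beta / (s + 0.5 * (lo + hi))

# Dual-averaging loop with a growing barrier coefficient (illustrative).
n, T = 20, 2000
s = np.zeros(n)                      # accumulated subgradients
for k in range(1, T + 1):
    x = barrier_argmin(s, np.sqrt(k))
    g = np.zeros(n)
    g[np.argmax(x)] = 1.0            # subgradient of f(x) = max_i x_i
    s += g

print(max(x))  # approaches the optimal value 1/n = 0.05
```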

    Smoothness parameter of power of Euclidean norm

    In this paper, we study derivatives of powers of the Euclidean norm. We prove their H\"older continuity and establish explicit expressions for the corresponding constants. We show that these constants are optimal for odd derivatives and at most two times suboptimal for the even ones. In the particular case of integer powers, when the H\"older continuity transforms into Lipschitz continuity, we improve this result and obtain the optimal constants.
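    As a concrete instance (a standard computation, recalled for context rather than quoted from the paper): for f(x) = \|x\|^p with p > 1, the first derivative is

        \nabla f(x) = p\,\|x\|^{p-2}\,x,

    which for 1 < p \le 2 is H\"older continuous of degree p - 1; at the integer power p = 2 this becomes \nabla f(x) = 2x, which is Lipschitz continuous with the optimal constant 2.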

    Efficiency of coordinate descent methods on huge-scale optimization problems

    In this paper we propose new methods for solving huge-scale optimization problems. For problems of this size, even the simplest full-dimensional vector operations are very expensive. Hence, we propose to apply an optimization technique based on random partial updates of the decision variables. For these methods, we prove global estimates for the rate of convergence. Surprisingly enough, for certain classes of objective functions, our results are better than the standard worst-case bounds for deterministic algorithms. We present constrained and unconstrained versions of the method, as well as its accelerated variant. Our numerical tests confirm the high efficiency of this technique on problems of very large size.
    Keywords: convex optimization, coordinate relaxation, worst-case efficiency estimates, fast gradient schemes, Google problem
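    A minimal sketch of the core idea, random partial updates of the decision variables, on a least-squares objective (the problem instance and step rule below are illustrative assumptions, not the paper's experiments):

```python
import numpy as np

def random_coordinate_descent(A, b, iters=100000, rng=np.random.default_rng()):
    """Minimize f(x) = 0.5*||Ax - b||^2 by random coordinate descent.

    Each iteration touches a single column of A, so the per-step cost
    is O(rows) instead of the O(rows*cols) of a full gradient step.
    """
    m, n = A.shape
    L = (A ** 2).sum(axis=0)         # coordinate Lipschitz constants ||A_i||^2
    x = np.zeros(n)
    r = -b.copy()                    # running residual r = Ax - b
    for _ in range(iters):
        i = rng.integers(n)          # pick a coordinate uniformly at random
        step = (A[:, i] @ r) / L[i]  # partial derivative over L_i
        x[i] -= step
        r -= step * A[:, i]          # cheap residual update
    return x

# Usage on a random least-squares instance.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((200, 50)), rng.standard_normal(200)
x = random_coordinate_descent(A, b, rng=rng)
print(0.5 * np.sum((A @ x - b) ** 2))  # near the least-squares optimum
```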