
    Investigations in Hadamard spaces

    This thesis investigates the interplay between geometry and convex analysis in Hadamard spaces. Motivated by numerous applications of CAT(0) geometry, our work builds upon the results in convex analysis and Alexandrov geometry of many previous authors. Our investigations answer several questions in the theory of CAT(0) spaces, some of which were posed as open problems in the recent literature. In a nutshell, our thesis develops along the following lines: 1. Weak topologies in Hadamard spaces, 2. Convex hulls of compact sets, 3. Mean tree problem in phylogenetic tree spaces, 4. Mosco convergence in Hadamard spaces, 5. Firmly nonexpansive operators and their applications in Hadamard spaces.

    On the role of the coefficients in the strong convergence of a general type Mann iterative scheme

    Let $H$ be a Hilbert space, let $(W_n)_{n\in\mathbb{N}}$ be a suitable family of mappings, let $S$ be a nonexpansive mapping, and let $D$ be a strongly monotone operator. We study the convergence of the general scheme $x_{n+1}=W_n(\alpha_n Sx_n+(1-\alpha_n)(I-\mu_n D)x_n)$ in dependence on the coefficients $(\alpha_n)_{n\in\mathbb{N}}$ and $(\mu_n)_{n\in\mathbb{N}}$.
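The general scheme can be sketched numerically. The following is a minimal toy instance in $\mathbb{R}^2$; the particular choices of $S$ (projection onto the unit ball), $D(x) = x - b$, $W_n = I$, and the coefficient sequences are our illustrative assumptions, not taken from the paper.

```python
import numpy as np

def mann_scheme(x0, S, D, W, alpha, mu, n_iter):
    """General Mann-type scheme:
    x_{n+1} = W_n(alpha_n * S x_n + (1 - alpha_n) * (x_n - mu_n * D x_n))."""
    x = np.asarray(x0, dtype=float)
    for n in range(n_iter):
        x = W(n, alpha[n] * S(x) + (1.0 - alpha[n]) * (x - mu[n] * D(x)))
    return x

# Toy instance (hypothetical choices):
# S: projection onto the unit ball (nonexpansive),
# D: D(x) = x - b, the gradient of ||x - b||^2 / 2 (strongly monotone),
# W_n: identity for every n.
b = np.array([0.3, -0.2])
S = lambda x: x / max(1.0, np.linalg.norm(x))
D = lambda x: x - b
W = lambda n, x: x
N = 2000
alpha = [1.0 / (n + 2) for n in range(N)]  # alpha_n -> 0
mu = [0.5] * N                             # constant step size
x_star = mann_scheme(np.ones(2), S, D, W, alpha, mu, n_iter=N)
```

With these choices the iterates approach $b$, the unique zero of $D$.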

    Quantitative results on Fejér monotone sequences

    We provide, in a unified way, quantitative forms of strong convergence results for numerous iterative procedures which satisfy a general type of Fejér monotonicity, where the convergence uses the compactness of the underlying set. These quantitative versions are in the form of explicit rates of so-called metastability in the sense of T. Tao. Our approach covers examples ranging from the proximal point algorithm for maximal monotone operators to various fixed point iterations $(x_n)$ for firmly nonexpansive, asymptotically nonexpansive, strictly pseudo-contractive and other types of mappings. Many of the results hold in a general metric setting with some convexity structure added (so-called W-hyperbolic spaces). Sometimes uniform convexity is assumed, still covering the important class of CAT(0) spaces due to Gromov. German Science Foundation; Romanian National Authority for Scientific Research
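The proximal point algorithm mentioned above illustrates Fejér monotonicity concretely: the iterates $x_{n+1} = J_{\lambda_n} x_n$, with $J_\lambda = (I + \lambda A)^{-1}$ the resolvent of a maximal monotone operator $A$, never move away from any zero of $A$. A minimal sketch, where the choice $A(x) = x$ (so $J_\lambda(x) = x/(1+\lambda)$, with unique zero at the origin) is our toy example, not from the paper:

```python
import numpy as np

def proximal_point(x0, resolvent, lambdas):
    """Proximal point algorithm x_{n+1} = J_{lambda_n}(x_n), where
    J_lambda = (I + lambda A)^{-1} is the resolvent of a maximal
    monotone operator A. The iterates are Fejer monotone with
    respect to the zero set of A."""
    xs = [np.asarray(x0, dtype=float)]
    for lam in lambdas:
        xs.append(resolvent(lam, xs[-1]))
    return xs

# Toy operator (our choice): A = grad f for f(x) = ||x||^2 / 2,
# so A(x) = x and J_lambda(x) = x / (1 + lambda); the zero set is {0}.
resolvent = lambda lam, x: x / (1.0 + lam)
xs = proximal_point(np.array([2.0, -1.0]), resolvent, [0.5] * 20)
dists = [np.linalg.norm(x) for x in xs]  # distances to the zero {0}
fejer = all(d1 <= d0 for d0, d1 in zip(dists, dists[1:]))
```

Here `fejer` confirms the defining inequality $\|x_{n+1} - z\| \le \|x_n - z\|$ for the zero $z = 0$ along the whole trajectory.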

    First order algorithms in variational image processing

    Variational methods in imaging are nowadays developing towards a quite universal and flexible tool, allowing for highly successful approaches to tasks like denoising, deblurring, inpainting, segmentation, super-resolution, disparity, and optical flow estimation. The overall structure of such approaches is of the form ${\cal D}(Ku) + \alpha {\cal R}(u) \rightarrow \min_u$, where the functional ${\cal D}$ is a data fidelity term, also depending on some input data $f$ and measuring the deviation of $Ku$ from it, and ${\cal R}$ is a regularization functional. Moreover, $K$ is a (often linear) forward operator modeling the dependence of the data on an underlying image, and $\alpha$ is a positive regularization parameter. While ${\cal D}$ is often smooth and (strictly) convex, current practice almost exclusively uses nonsmooth regularization functionals. The majority of successful techniques uses nonsmooth and convex functionals like the total variation and generalizations thereof, or $\ell_1$-norms of coefficients arising from scalar products with some frame system. The efficient solution of such variational problems in imaging demands appropriate algorithms. Taking into account the specific structure as a sum of two very different terms to be minimized, splitting algorithms are a quite canonical choice. Consequently, this field has revived the interest in techniques like operator splittings or augmented Lagrangians. Here we shall provide an overview of methods currently developed and recent results, as well as some computational studies providing a comparison of different methods and also illustrating their success in applications. Comment: 60 pages, 33 figures
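The splitting idea for ${\cal D}(Ku) + \alpha {\cal R}(u)$ can be sketched with the simplest first-order method of this family, forward-backward splitting (ISTA): a gradient step on the smooth data term followed by the proximal step of the nonsmooth regularizer, which for the $\ell_1$-norm is soft-thresholding. The toy problem below (a small random $K$ and a sparse ground truth) is our illustrative assumption, not an example from the survey.

```python
import numpy as np

def ista(K, f, alpha, tau, n_iter=2000):
    """Forward-backward splitting (ISTA) for
    min_u 0.5 * ||K u - f||^2 + alpha * ||u||_1."""
    u = np.zeros(K.shape[1])
    for _ in range(n_iter):
        grad = K.T @ (K @ u - f)     # gradient of the smooth data term
        v = u - tau * grad           # forward (gradient) step
        # backward (prox) step: soft-thresholding, the prox of tau*alpha*||.||_1
        u = np.sign(v) * np.maximum(np.abs(v) - tau * alpha, 0.0)
    return u

# Toy problem (illustrative only): recover a sparse vector from linear data.
rng = np.random.default_rng(0)
K = rng.standard_normal((30, 10))
u_true = np.zeros(10)
u_true[[2, 7]] = [1.5, -2.0]
f = K @ u_true
tau = 1.0 / np.linalg.norm(K, 2) ** 2   # step size <= 1 / ||K||^2
u_hat = ista(K, f, alpha=0.1, tau=tau)
```

The step-size bound $\tau \le 1/\|K\|^2$ guarantees the forward step is nonexpansive in the relevant sense; with exact data the minimizer recovers the support of `u_true` up to the usual small $\ell_1$ shrinkage bias.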