    Some New Observations for F-Contractions in Vector-Valued Metric Spaces of Perov's Type

    The main purpose of this article is to improve, generalize, and complement some recently established results for Perov-type F-contractions. In our approach, we use only property (F1) of Wardowski, while other authors employed all three conditions. Working only with the fact that the function F is strictly increasing on (0, +∞)^m, we obtain as a consequence new families of contractive conditions in the realm of vector-valued metric spaces of Perov's type. At the end of the article, we present an example that supports the obtained theoretical results and genuinely generalizes several known results in the existing literature.
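    For context, a minimal sketch of the Wardowski conditions referred to above, in the scalar case m = 1; the constant τ and conditions (F2), (F3) follow Wardowski's standard definition and are not spelled out in this abstract. The paper's point is that the conclusion survives under (F1) alone, with F strictly increasing componentwise on (0, +∞)^m.

```latex
% Wardowski's conditions on F : (0, +\infty) \to \mathbb{R}:
%   (F1) F is strictly increasing;
%   (F2) \lim_{n\to\infty} \alpha_n = 0 \iff \lim_{n\to\infty} F(\alpha_n) = -\infty;
%   (F3) \lim_{\alpha \to 0^+} \alpha^{k} F(\alpha) = 0 for some k \in (0, 1).
% A mapping T is an F-contraction if, for some \tau > 0,
\tau + F\bigl(d(Tx, Ty)\bigr) \le F\bigl(d(x, y)\bigr)
\quad \text{whenever } d(Tx, Ty) > 0.
```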

    Fixed Point Theory and Related Topics

    Nonlinear Analysis and Optimization with Applications

    Nonlinear analysis has wide and significant applications in many areas of mathematics, including functional analysis, variational analysis, nonlinear optimization, convex analysis, nonlinear ordinary and partial differential equations, dynamical system theory, mathematical economics, game theory, signal processing, control theory, data mining, and so forth. Optimization problems have been intensively investigated, and various feasible methods for analyzing the convergence of algorithms have been developed over the last half century. In this Special Issue, we focus on the connection between nonlinear analysis and optimization, as well as on their applications that integrate basic science into the real world.

    Generalized Cyclic p-Contractions and p-Contraction Pairs: Some Properties of Asymptotic Regularity, Best Proximity Points, and Fixed Points

    This paper studies a general p-contractive condition for a self-mapping T on X, where (X, d) is either a metric space or a dislocated metric space. The condition bounds d(Tx, Ty), for arbitrary x and y in X, by a weighted combination of the distances d(x, y), d(x, Tx), d(y, Ty), d(x, Ty), d(y, Tx), |d(x, Tx) − d(y, Ty)|, and |d(x, Ty) − d(y, Tx)|. The asymptotic regularity of the self-mapping T on X and the convergence of Cauchy sequences to a unique fixed point are also discussed when (X, d) is complete. Subsequently, (T, S) generalized cyclic p-contraction pairs are discussed on a pair of non-empty, in general disjoint, subsets of X. The proposed contraction involves a combination of several distances associated with the (T, S) pair. Among the properties demonstrated: (a) the asymptotic convergence of the relevant sequences to best proximity points of both sets is proved; (b) the best proximity points are unique if the involved subsets are closed and convex, the metric is norm-induced, or the metric space is a uniformly convex Banach space. It can be pointed out that both a metric and a metric-like (or dislocated metric) possess the symmetry property, since their respective distance values for any given pair of elements of the corresponding space are identical after exchanging the roles of both elements. This research was funded by the Basque Government, grant number IT1555-22.
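    For illustration, the upper bound described above can be written out as follows; the weights a_1, …, a_7 and the normalization Σ a_i < 1 are assumptions supplied here for concreteness, not taken from the paper.

```latex
d(Tx, Ty) \le a_1\, d(x, y) + a_2\, d(x, Tx) + a_3\, d(y, Ty)
          + a_4\, d(x, Ty) + a_5\, d(y, Tx)
          + a_6 \bigl| d(x, Tx) - d(y, Ty) \bigr|
          + a_7 \bigl| d(x, Ty) - d(y, Tx) \bigr|,
\qquad a_i \ge 0, \quad \textstyle\sum_{i=1}^{7} a_i < 1.
```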

    Fixed point theorems of Perov type

    This dissertation introduces a new class of contractions in the setting of cone metric spaces, both solid and normal, by including an operator as the contractive constant. Some well-known fixed point theorems are improved, and the obtained results generalize the Banach, Perov, Ćirić, and Fisher theorems, among others. The common fixed point problem for a pair or a sequence of mappings is studied from a different point of view. A wide range of applications is corroborated with numerous examples.
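    For reference, the classical Perov theorem that the dissertation generalizes can be sketched as follows; this statement is standard background, not a result of the dissertation itself.

```latex
% Perov's theorem (sketch). Let (X, d) be a complete generalized metric
% space with a vector-valued metric d : X \times X \to \mathbb{R}^m, and
% let A be an m \times m matrix with nonnegative entries and spectral
% radius \rho(A) < 1 (equivalently, A^n \to 0). If T : X \to X satisfies
d(Tx, Ty) \preceq A \, d(x, y) \qquad \text{for all } x, y \in X,
% then T has a unique fixed point x^*, and the Picard iterates
% T^n x_0 converge to x^* for every starting point x_0 \in X.
```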

    Tensor Networks for Dimensionality Reduction and Large-Scale Optimizations. Part 2: Applications and Future Perspectives

    Part 2 of this monograph builds on the introduction to tensor networks and their operations presented in Part 1. It focuses on tensor network models for super-compressed higher-order representation of data/parameters and related cost functions, while providing an outline of their applications in machine learning and data analytics. A particular emphasis is on the tensor train (TT) and Hierarchical Tucker (HT) decompositions, and their physically meaningful interpretations which reflect the scalability of the tensor network approach. Through a graphical approach, we also elucidate how, by virtue of the underlying low-rank tensor approximations and sophisticated contractions of core tensors, tensor networks have the ability to perform distributed computations on otherwise prohibitively large volumes of data/parameters, thereby alleviating or even eliminating the curse of dimensionality. The usefulness of this concept is illustrated over a number of applied areas, including generalized regression and classification (support tensor machines, canonical correlation analysis, higher-order partial least squares), generalized eigenvalue decomposition, Riemannian optimization, and the optimization of deep neural networks. Part 1 and Part 2 of this work can be used either as stand-alone separate texts, or indeed as a conjoint comprehensive review of the exciting field of low-rank tensor networks and tensor decompositions. Comment: 232 pages.
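    As a rough, self-contained illustration of the tensor train (TT) format emphasized above, the sketch below factors a dense d-way NumPy array into TT cores by sequential truncated SVDs (the TT-SVD idea); the function name, the uniform rank cap, and the NumPy implementation are assumptions made for this example, not code from the monograph.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Sketch of TT-SVD: factor a d-way array into a tensor train
    by sequential truncated SVDs. `max_rank` caps every TT rank."""
    dims = tensor.shape
    d = len(dims)
    cores = []
    rank_prev = 1
    mat = tensor.reshape(dims[0], -1)  # unfold along the first mode
    for k in range(d - 1):
        # truncated SVD of the current unfolding
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))
        u, s, vt = u[:, :r], s[:r], vt[:r, :]
        # store the k-th TT core of shape (r_{k-1}, n_k, r_k)
        cores.append(u.reshape(rank_prev, dims[k], r))
        # carry the remainder on to the next mode
        mat = (np.diag(s) @ vt).reshape(r * dims[k + 1], -1)
        rank_prev = r
    cores.append(mat.reshape(rank_prev, dims[-1], 1))
    return cores

# Example: compress a random 8x8x8x8 array and inspect the TT ranks.
x = np.random.rand(8, 8, 8, 8)
cores = tt_svd(x, max_rank=4)
print([c.shape for c in cores])  # [(1, 8, 4), (4, 8, 4), (4, 8, 4), (4, 8, 1)]
```

    Storing the cores instead of the full array is what yields the "super-compressed" representation the abstract refers to: the memory cost grows linearly in the order d rather than exponentially, provided the TT ranks stay small.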

    Theory and Application of Fixed Point

    In the past few decades, several interesting problems have been solved using fixed point theory. In addition to classical ordinary differential equations and integral equations, researchers also focus on fractional differential equations (FDEs) and fractional integral equations (FIEs). Indeed, FDEs and FIEs lead to a better understanding of several physical phenomena, which is why such equations have been highly appreciated and explored. We also note the importance of distinct abstract spaces, such as quasi-metric, b-metric, symmetric, partial metric, and dislocated metric spaces. Sometimes, one of these spaces is more suitable for a particular application. Fixed point theory techniques in partial metric spaces have been used to solve classical problems of the semantics and domain theory of computer science. This book contains some very recent theoretical results related to some new types of contraction mappings defined in various types of spaces. There are also studies related to applications of the theoretical findings to mathematical models of specific problems, and their approximate computations. In this sense, this book will contribute to the area and provide directions for further developments in fixed point theory and its applications.
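    As a concrete example of one of the generalized spaces listed above, the standard b-metric axioms can be stated as follows; this is textbook background rather than material from the book itself.

```latex
% A b-metric on X with constant s \ge 1 satisfies, for all x, y, z \in X:
%   d(x, y) = 0 \iff x = y, \qquad d(x, y) = d(y, x),
% together with the relaxed triangle inequality
d(x, z) \le s \bigl( d(x, y) + d(y, z) \bigr).
% For s = 1 this reduces to an ordinary metric.
```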