130 research outputs found

    Robust portfolio management with multiple financial analysts

    Portfolio selection theory, developed by Markowitz (1952), is one of the best known and most widely applied methods for allocating funds among possible investment choices, where investment decision making is a trade-off between the expected return and the risk of the portfolio. Many portfolio selection models have been developed on the basis of Markowitz's theory. Most of them assume that complete investment information is available and that it can be accurately extracted from the historical data. However, such complete information never exists in reality. There are many kinds of ambiguity and vagueness which cannot be extracted from the historical data but still need to be considered in portfolio selection. For example, to address the issue of uncertainty caused by estimation errors, the robust counterpart approach of Ben-Tal and Nemirovski (1998) has been employed frequently in recent years. Robustification, however, often leads to a more conservative solution. As a consequence, one of the most common critiques of the robust counterpart approach is the excessively pessimistic character of the robust asset allocation. This thesis develops new approaches that improve the performance of the robust counterpart approach by incorporating additional sources of investment information, so that the optimal portfolio can be more reliable and, at the same time, achieve a greater return. [Continues.]
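The Markowitz trade-off and the conservatism of robustification can be sketched numerically. The three-asset data below is purely illustrative, and the box-uncertainty penalty shown is only one simple instance of the robust counterpart idea, not the thesis's model:

```python
import numpy as np

# Hypothetical 3-asset example (all numbers are illustrative only).
mu = np.array([0.08, 0.12, 0.10])          # estimated expected returns
Sigma = np.array([[0.10, 0.02, 0.01],
                  [0.02, 0.15, 0.03],
                  [0.01, 0.03, 0.12]])      # covariance (risk) matrix

# Global minimum-variance portfolio: w = Sigma^-1 1 / (1' Sigma^-1 1)
ones = np.ones(3)
w = np.linalg.solve(Sigma, ones)
w /= w.sum()

ret = mu @ w                       # expected portfolio return
risk = np.sqrt(w @ Sigma @ w)      # portfolio standard deviation

# Box uncertainty: each mu_i may be off by up to delta, so the worst-case
# return subtracts delta * sum|w_i| from the nominal return -- the robust
# objective is always more pessimistic than the nominal one.
delta = 0.02
worst_ret = ret - delta * np.abs(w).sum()
```

By construction the minimum-variance weights carry less risk than, e.g., equal weighting, while the robust (worst-case) return is strictly below the nominal return whenever delta is positive.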

    Fuzzy Efficiency Measures in Data Envelopment Analysis Using Lexicographic Multiobjective Approach

    There is an extensive literature in data envelopment analysis (DEA) aimed at evaluating the relative efficiency of a set of decision-making units (DMUs). Conventional DEA models use definite and precise data, while real-life problems often involve ambiguous and vague information, such as linguistic terms. Fuzzy set theory can be effectively used to handle data ambiguity and vagueness in DEA problems. This paper proposes a novel fully fuzzified DEA (FFDEA) approach where, in addition to the input and output data, all the variables are considered fuzzy, including the resulting efficiency scores. A lexicographic multi-objective linear programming (MOLP) approach is suggested to solve the fuzzy models proposed in this study. The contribution of this paper is fivefold: (1) both fuzzy Constant and Variable Returns to Scale models are considered to measure fuzzy efficiencies; (2) a classification scheme for DMUs, based on their fuzzy efficiencies, is defined with three categories; (3) fuzzy input and output targets are computed for improving the inefficient DMUs; (4) a super-efficiency FFDEA model is also formulated to rank the fuzzy efficient DMUs; and (5) the proposed approach is illustrated, and compared with existing methods, using a dataset from the literature.
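The crisp CCR efficiency score that fully fuzzified DEA generalises can be computed with a small linear program in multiplier form. The single-input, single-output DMU data below is a made-up toy example, not from the paper:

```python
import numpy as np
from scipy.optimize import linprog

# Toy single-input / single-output data for 3 DMUs (illustrative numbers).
X = np.array([[2.0], [4.0], [3.0]])   # inputs, one row per DMU
Y = np.array([[1.0], [2.0], [3.0]])   # outputs

def ccr_efficiency(o):
    """CRS (CCR) efficiency of DMU o, multiplier form, crisp data."""
    n, m = X.shape
    s = Y.shape[1]
    # decision variables z = [u (s output weights), v (m input weights)]
    c = np.concatenate([-Y[o], np.zeros(m)])          # maximize u'y_o
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]  # v'x_o = 1
    b_eq = [1.0]
    A_ub = np.hstack([Y, -X])                         # u'y_j - v'x_j <= 0
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m))
    return -res.fun

effs = [ccr_efficiency(o) for o in range(3)]
```

A fuzzy DEA approach would replace the crisp entries of `X` and `Y` (and, in the FFDEA case, the weights and scores) with fuzzy numbers and solve a family of such programs.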

    Assessing productive efficiency of banks using integrated Fuzzy-DEA and bootstrapping: a case of Mozambican banks

    Performance analysis has become a vital part of management practices in the banking industry. There are numerous applications using DEA models to estimate efficiency in banking, and most of them assume that inputs and outputs are known with absolute precision. Here, we propose new Fuzzy-DEA α-level models to assess the underlying uncertainty. Further, bootstrap truncated regressions with fixed factors are used to measure the impact of each model on the efficiency scores and to identify the most relevant contextual variables affecting efficiency. The proposed models are demonstrated in an application to Mozambican banks. Findings reveal that fuzziness is predominant over randomness in interpreting the results. In addition, fuzziness can be used by decision-makers to identify missing variables that help in interpreting the results. The price of labor, the price of capital, and market share were found to be significant factors in measuring bank efficiency. Managerial implications are addressed.
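The α-level device behind such Fuzzy-DEA models can be illustrated in isolation: each fuzzy datum is replaced, at every membership level α, by a closed interval, and the DEA models are then solved on interval data. A minimal sketch for a triangular fuzzy number with illustrative parameters:

```python
def alpha_cut(tri, alpha):
    """Closed interval of a triangular fuzzy number (a, b, c) at level alpha.

    (a, b, c) are the left support end, the peak, and the right support end.
    """
    a, b, c = tri
    return (a + alpha * (b - a), c - alpha * (c - b))

# A fuzzy input such as "price of labor is about 10, between 8 and 13"
# (numbers purely illustrative).
lo, hi = alpha_cut((8.0, 10.0, 13.0), 0.5)
```

At α = 1 the interval collapses to the peak (the crisp case), and at α = 0 it spans the whole support; running a DEA model at each level yields interval efficiency scores per α.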

    The estimation of electric power losses in electrical networks by fuzzy regression model using genetic algorithm

    This paper presents a comparative study of a fuzzy regression model using linear programming, a fuzzy regression model using genetic algorithms, and a standard regression model. The fuzzy and standard models were developed for the estimation of electric power losses in electrical networks. Simulation was carried out with a tool developed in MATLAB.
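The genetic-algorithm route can be sketched on the simplest possible regression: evolving the two coefficients of a line by selection, crossover, and mutation. This is a generic GA on synthetic data, not the paper's fuzzy model or its MATLAB tool:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "power loss vs. load" data (illustrative): y ~ 1 + 2x + noise.
x = np.linspace(0, 1, 40)
y = 1.0 + 2.0 * x + rng.normal(0, 0.05, x.size)

def fitness(pop):
    """Sum of squared errors for each candidate (intercept, slope)."""
    pred = pop[:, :1] + pop[:, 1:] * x          # (n_pop, n_points)
    return ((pred - y) ** 2).sum(axis=1)

pop = rng.uniform(-5, 5, (60, 2))               # initial random population
for _ in range(200):
    f = fitness(pop)
    parents = pop[np.argsort(f)[:20]]           # truncation selection (elitist)
    # Uniform crossover: each child gene comes from a random parent.
    kids = parents[rng.integers(0, 20, (40, 2)), [0, 1]]
    kids += rng.normal(0, 0.1, kids.shape)      # Gaussian mutation
    pop = np.vstack([parents, kids])

a, b = pop[np.argmin(fitness(pop))]             # best fit, near (1, 2)
```

A fuzzy regression variant would instead evolve interval or triangular coefficients and score candidates by spread and inclusion criteria rather than plain squared error.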

    Representation and combination of uncertain information: new results with applications to nuclear safety studies

    It often happens that the values of some parameters or variables of a system are imperfectly known, either because of the variability of the modelled phenomena, or because the available information is imprecise or incomplete. Classical probability theory is usually used to treat these uncertainties. However, recent years have witnessed the appearance of arguments pointing to the conclusion that classical probabilities are inadequate to handle imprecise or incomplete information. Other frameworks have thus been proposed to address this problem, the three main ones being probability sets, random sets and possibility theory. Many questions concerning uncertainty treatment within these frameworks remain open. More precisely, it is necessary to build bridges between these three frameworks to advance toward a unified handling of uncertainty. There is also a need for practical methods to treat information, as using these frameworks can be computationally costly. In this work, we propose some answers to these two needs for a set of commonly encountered problems. In particular, we focus on the problems of:
    - uncertainty representation
    - fusion and evaluation of multiple-source information
    - independence modelling
    The aim is to provide tools, both theoretical and practical, for treating uncertainty. Some of these tools are then applied to problems related to nuclear safety.
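Possibility theory, one of the three frameworks named above, grades events with a possibility and a necessity degree rather than a single probability. A minimal discrete sketch with an illustrative distribution (the variable and numbers are invented for the example):

```python
# Possibility distribution over a discrete quantity, e.g. a failure
# temperature in kelvin (values and degrees purely illustrative).
pi = {300: 0.2, 320: 0.7, 340: 1.0, 360: 0.6, 380: 0.1}

def possibility(event):
    """Pi(A) = sup of pi over the event A."""
    return max(pi[x] for x in event)

def necessity(event):
    """N(A) = 1 - Pi(complement of A): how much evidence forces A."""
    comp = set(pi) - set(event)
    return 1.0 - (max(pi[x] for x in comp) if comp else 0.0)

A = {340, 360, 380}   # "the temperature is at least 340"
```

The gap between N(A) and Pi(A) quantifies the imprecision of the information, which is exactly what a single classical probability cannot express.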

    Robust optimality analysis for linear programming problems with uncertain objective function coefficients: an outer approximation approach

    Linear programming (LP) problems with uncertain objective function coefficients (OFCs) are treated in this paper. In such problems, the decision-maker is interested in an optimal solution that is robust against uncertainty. A solution optimal for all conceivable OFCs can be considered a robust optimal solution. We therefore investigate an efficient method for checking whether a given non-degenerate basic feasible (NBF) solution is optimal for all OFC vectors in a specified range. When the specified range of the OFC vectors is a hyper-box, i.e., the marginal range of each OFC is given by an interval, it has been shown that the tolerance approach can efficiently solve the robust optimality test problem of an NBF solution. However, the hyper-box case is a special case in which the marginal ranges of some OFCs are the same no matter what values the remaining OFCs take. In real life, we encounter cases where some OFCs' marginal ranges depend on the remaining OFCs' values. For example, the prices of products rise together in tandem with raw materials, the gross profit of exported products increases while that of imported products decreases because both depend on currency exchange rates, and so on. To capture such dependencies, we consider a case where the range of the OFC vector is specified by a convex polytope. In this case, the tolerance approach to the robust optimality test problem of an NBF solution no longer applies. To treat the problem, we propose an algorithm based on the outer approximation approach. Numerical experiments demonstrate how the proposed algorithm efficiently solves the robust optimality test problems of NBF solutions compared to a conventional vertex-listing method.
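The vertex-listing baseline mentioned above rests on a simple fact: the reduced costs of a fixed basis are linear in the OFC vector c, so a basis optimal at every vertex of the cost polytope is optimal over the whole polytope. A toy sketch (made-up LP and vertices; the paper's outer approximation avoids enumerating the vertices):

```python
import numpy as np

# min c'x  s.t.  Ax = b, x >= 0, with slack variables already added.
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 2.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])
basis = [0, 1]                     # candidate NBF solution uses x1, x2
nonbasis = [2, 3]

B = A[:, basis]
x_B = np.linalg.solve(B, b)        # basic variables: [2, 2], feasible

def optimal_for(c):
    """True if the basis is optimal for cost vector c (reduced costs >= 0)."""
    y = np.linalg.solve(B.T, c[basis])            # simplex multipliers
    reduced = c[nonbasis] - y @ A[:, nonbasis]
    return bool(np.all(reduced >= -1e-9))

# Vertex-listing test over an illustrative OFC polytope given by 3 vertices.
vertices = [np.array([-2.0, -3.0, 0.0, 0.0]),
            np.array([-2.5, -3.5, 0.0, 0.0]),
            np.array([-1.5, -3.0, 0.0, 0.0])]
robust = all(optimal_for(c) for c in vertices)
```

The enumeration becomes impractical when the polytope has many vertices, which is precisely the case the outer approximation algorithm is designed for.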

    Fuzzy-Analysis in a Generic Polymorphic Uncertainty Quantification Framework

    In this thesis, a framework for generic uncertainty analysis is developed. The two basic uncertainty characteristics, aleatoric and epistemic uncertainty, are differentiated. Polymorphic uncertainty, as the combination of these two characteristics, is discussed. The main focus is on epistemic uncertainty, with fuzziness as the uncertainty model. Properties and classes of fuzzy quantities are discussed. Information reduction measures that reduce a fuzzy quantity to a characteristic value are briefly discussed. Analysis approaches for aleatoric, epistemic and polymorphic uncertainty are discussed. For fuzzy analysis, α-level-based and α-level-free methods are described. As a hybridization of both methods, non-flat α-level optimization is proposed. For numerical uncertainty analysis, the framework PUQpy, which stands for "Polymorphic Uncertainty Quantification in Python", is introduced. The conception, structure, data structures, modules and design principles of PUQpy are documented. Sequential Weighted Sampling (SWS) is presented as an optimization algorithm both for general-purpose optimization and for fuzzy analysis. Slice sampling as a component of SWS is shown. Routines to update Pareto fronts, which are required for optimization, are benchmarked. Finally, PUQpy is used to analyze example problems as a proof of concept. In these problems, analytical functions with uncertain parameters, characterized by fuzzy and polymorphic uncertainty, are examined.
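The α-level-based fuzzy analysis mentioned above amounts to interval optimization at each membership level: cut the fuzzy input at α, then minimize and maximize the response over the resulting interval. A brute-force sketch with an illustrative function, unrelated to PUQpy's actual implementation:

```python
import numpy as np

def propagate(f, tri, alphas, grid=1001):
    """alpha-level-based fuzzy analysis for one triangular input (a, b, c):
    at each level, bound f over the alpha-cut interval by grid search."""
    a, b, c = tri
    cuts = []
    for al in alphas:
        lo, hi = a + al * (b - a), c - al * (c - b)   # alpha-cut interval
        xs = np.linspace(lo, hi, grid)                # brute-force optimization
        ys = f(xs)
        cuts.append((al, ys.min(), ys.max()))         # output interval at al
    return cuts

# Illustrative: squared response of a fuzzy parameter "about 2" on [1, 4].
cuts = propagate(lambda x: x**2, (1.0, 2.0, 4.0), [0.0, 0.5, 1.0])
```

Stacking the output intervals over all α reconstructs the fuzzy response; real analyses replace the grid search with a proper optimizer (as SWS does for the general case), since grid search scales poorly with the number of fuzzy inputs.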