49 research outputs found

    A Possibilistic and Probabilistic Approach to Precautionary Saving

    This paper proposes two mixed models to study a consumer's optimal saving in the presence of two types of risk. Comment: Panoeconomicus, 201

    Asset allocation with multiple analysts’ views: a robust approach

    Retail investors often make decisions based on professional analysts’ investment recommendations. Although these recommendations contain up-to-date financial information, they are usually expressed in sophisticated but vague forms. In addition, their quality differs from analyst to analyst, and recommendations may even be mutually conflicting. This paper addresses these issues by extending the Black–Litterman (BL) method and developing a multi-analyst portfolio selection method that is balanced against over-optimistic forecasts. Our methods accommodate analysts’ ambiguous investment recommendations and the heterogeneity of data from disparate sources. We validate our model with an empirical analysis of around 1,000 daily financial newsletters collected from two top-10 Taiwanese brokerage firms over a two-year period. We conclude that analysts’ views contribute to the investment allocation process and enhance portfolio performance. We confirm that the degree of investors’ confidence in these views influences the portfolio outcome, thus extending the idea of the BL model and improving the practicality of robust optimisation.
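    As a rough sketch of the Black–Litterman machinery this paper extends: the posterior expected returns blend an equilibrium prior with analyst views, weighted by the confidence placed in each view. All matrices and numbers below are invented for illustration, not taken from the paper.

```python
import numpy as np

def black_litterman_posterior(pi, Sigma, P, Q, Omega, tau=0.05):
    """Posterior expected returns of the Black-Litterman model.

    pi    : (n,) equilibrium (prior) expected returns
    Sigma : (n, n) asset return covariance
    P     : (k, n) view pick matrix (one row per analyst view)
    Q     : (k,) view returns
    Omega : (k, k) view uncertainty (larger entries = less confidence)
    tau   : scalar scaling the prior uncertainty
    """
    ts_inv = np.linalg.inv(tau * Sigma)
    om_inv = np.linalg.inv(Omega)
    A = ts_inv + P.T @ om_inv @ P            # combined precision
    b = ts_inv @ pi + P.T @ om_inv @ Q       # precision-weighted means
    return np.linalg.solve(A, b)

# Two analyst views on a three-asset universe (illustrative numbers).
pi = np.array([0.04, 0.05, 0.06])
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])
P = np.array([[1.0, 0.0, 0.0],      # view 1: absolute view on asset 1
              [0.0, 1.0, -1.0]])    # view 2: asset 2 outperforms asset 3
Q = np.array([0.08, 0.01])
Omega = np.diag([0.001, 0.04])      # analyst 1 is far more confident
mu = black_litterman_posterior(pi, Sigma, P, Q, Omega)
```

    Shrinking a diagonal entry of Omega raises confidence in that view and pulls the posterior further towards it, which is the lever the paper's confidence analysis turns.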

    Multimodel Approaches for Plasma Glucose Estimation in Continuous Glucose Monitoring. Development of New Calibration Algorithms

    Diabetes Mellitus (DM) comprises a group of metabolic diseases whose main characteristic is the presence of high glucose levels in the blood. It is one of the diseases with the greatest social and health impact, both for its prevalence and for the consequences of the chronic complications it implies. One line of research to improve the quality of life of people with diabetes is technical in focus. It includes the development and improvement of devices to estimate plasma glucose "online": continuous glucose monitoring systems (CGMS), both invasive and non-invasive. These devices estimate plasma glucose from sensor measurements in compartments other than blood. Currently available commercial CGMS are minimally invasive and estimate plasma glucose from measurements in the interstitial fluid. CGMS are a key component of the technical approach to building an artificial pancreas, aiming at closing the loop in combination with an insulin pump. Yet the accuracy of current CGMS is still poor, and this may partly depend on the low performance of the implemented calibration algorithm (CA). In addition, the sensor-to-patient sensitivity differs between patients, and for the same patient over time. It is clear, then, that the development of new, efficient calibration algorithms for CGMS is an interesting and challenging problem. The indirect measurement of plasma glucose through interstitial glucose is a main confounder of CGMS accuracy. Many components take part in the glucose transport dynamics. Indeed, physiology suggests the existence of different local behaviours in the glucose transport process. For this reason, local modelling techniques may be the best option for the structure of the desired CA: similar input samples are represented by the same local model, and the integration of all of them, considering the input regions where they are valid, forms the final model of the whole data set.
    Barceló Rico, F. (2012). Multimodel Approaches for Plasma Glucose Estimation in Continuous Glucose Monitoring. Development of New Calibration Algorithms [Unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/17173
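    The local-modelling structure described in the abstract can be sketched as a set of linear models, each fitted with cluster-membership weights and blended by normalised membership. The Gaussian memberships, centres and data below are illustrative assumptions, not the thesis's actual algorithm.

```python
import numpy as np

def fit_local_models(x, y, centers, width):
    """Fit one weighted least-squares line per cluster centre;
    points near a centre dominate that centre's local model."""
    models = []
    X = np.column_stack([x, np.ones_like(x)])
    for c in centers:
        w = np.exp(-((x - c) / width) ** 2)   # Gaussian membership
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        models.append(beta)
    return models

def predict(x, models, centers, width):
    """Blend the local linear outputs by normalised membership."""
    x = np.atleast_1d(x)
    w = np.exp(-((x[:, None] - np.asarray(centers)[None, :]) / width) ** 2)
    w /= w.sum(axis=1, keepdims=True)
    local = np.stack([b[0] * x + b[1] for b in models], axis=1)
    return (w * local).sum(axis=1)

# Sanity check: on globally linear data every local model recovers
# the same line, so the blended prediction is exact.
x = np.linspace(0.0, 10.0, 51)
y = 2.0 * x + 1.0
models = fit_local_models(x, y, centers=[2.5, 7.5], width=2.0)
pred = predict(np.array([3.0]), models, [2.5, 7.5], 2.0)
```

    With genuinely piecewise behaviour, each local fit captures its own region and the membership weights decide which model dominates where, which is the "local behaviours" idea the abstract describes.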

    Robust portfolio management with multiple financial analysts

    Portfolio selection theory, developed by Markowitz (1952), is one of the best known and most widely applied methods for allocating funds among possible investment choices, where investment decision making is a trade-off between the expected return and the risk of the portfolio. Many portfolio selection models have been developed on the basis of Markowitz’s theory. Most of them assume that complete investment information is available and that it can be accurately extracted from the historical data. However, such complete information never exists in reality. There are many kinds of ambiguity and vagueness which cannot be captured in the historical data but still need to be considered in portfolio selection. For example, to address the issue of uncertainty caused by estimation errors, the robust counterpart approach of Ben-Tal and Nemirovski (1998) has been employed frequently in recent years. Robustification, however, often leads to a more conservative solution. As a consequence, one of the most common critiques of the robust counterpart approach is the excessively pessimistic character of the robust asset allocation. This thesis attempts to develop new approaches that improve the performance of the robust counterpart approach by incorporating additional sources of investment information, so that the optimal portfolio can be more reliable and, at the same time, achieve a greater return. [Continues.]
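    A minimal sketch of the robust counterpart idea the thesis starts from: under a box uncertainty set |mu_true − mu| ≤ delta on the expected returns, the worst case shrinks each (non-negatively weighted) asset's return by delta, and the conservative allocation follows. The closed-form KKT solution below assumes only a full-investment constraint, and all numbers are invented.

```python
import numpy as np

def robust_weights(mu, Sigma, delta, lam=1.0):
    """Worst-case mean-variance weights under box uncertainty.

    The adversary shrinks each expected return by delta, so we
    optimise c = mu - delta (exact for non-negative weights):
        max_w  c'w - lam * w' Sigma w   s.t.   sum(w) = 1
    """
    c = mu - delta
    Si = np.linalg.inv(Sigma)
    one = np.ones(len(mu))
    # Lagrange multiplier chosen so the weights sum to one (KKT).
    gamma = (one @ Si @ c - 2.0 * lam) / (one @ Si @ one)
    return Si @ (c - gamma * one) / (2.0 * lam)

# Two assets with equal nominal returns; asset 2's estimate is less reliable.
mu = np.array([0.10, 0.10])
Sigma = 0.04 * np.eye(2)
w = robust_weights(mu, Sigma, delta=np.array([0.00, 0.05]))
```

    Giving asset 2 a larger delta models a less reliable estimate, so the robust solution tilts towards asset 1 even though the nominal returns are equal; this is exactly the conservatism the thesis aims to temper with extra information sources.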

    New methods for discovering local behaviour in mixed databases

    Clustering techniques are widely used: in many applications it is desirable to automatically find groups or hidden information in a data set. One such application is building a model of a system by integrating several local models. A local model can have many structures; however, a linear structure is the most common one, due to its simplicity. This work aims at improvements in several fields, all of which are applied to finding a set of local models in a database. On the one hand, a way of codifying categorical information into numerical values has been designed, in order to apply a numerical algorithm to the whole data set. On the other hand, a cost index has been developed, which is optimised globally to find the parameters of the local clusters that best define the output of the process. Each of the techniques has been applied to several experiments, and the results show improvements over existing techniques.
    Barceló Rico, F. (2009). New methods for discovering local behaviour in mixed databases. http://hdl.handle.net/10251/12739
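    The categorical-to-numerical codification step can be sketched with a simple target-mean encoding. This is one possible scheme, not necessarily the one designed in the thesis: each category label is replaced by the mean of a numeric target over its rows, so a purely numerical clustering algorithm can consume the mixed data.

```python
import numpy as np

def target_encode(categories, target):
    """Replace each category label with the mean of a numeric target
    over the rows carrying that label."""
    sums, counts = {}, {}
    for c, t in zip(categories, target):
        sums[c] = sums.get(c, 0.0) + t
        counts[c] = counts.get(c, 0) + 1
    means = {c: sums[c] / counts[c] for c in sums}
    return np.array([means[c] for c in categories])

# 'red' rows have targets 1 and 3 (mean 2); 'blue' rows 3 and 5 (mean 4).
encoded = target_encode(['red', 'blue', 'red', 'blue'], [1.0, 3.0, 3.0, 5.0])
```

    The encoded column now lives on the same numeric scale as the output, so distances between mixed-type samples become meaningful to a standard clustering algorithm.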

    Fuzzy Mathematics

    This book provides a timely overview of topics in fuzzy mathematics. It lays the foundation for further research and applications in a broad range of areas. It contains breakthrough analysis of how results from the many variations and extensions of fuzzy set theory can be obtained from known results of traditional fuzzy set theory. The book contains not only theoretical results but a wide range of applications in areas such as decision analysis, optimal allocation in possibilistic and mixed models, pattern classification, credibility measures, algorithms for modelling uncertain data, and numerical methods for solving fuzzy linear systems. The book offers an excellent reference for advanced undergraduate and graduate students in applied and theoretical fuzzy mathematics. Researchers and referees in fuzzy set theory will find the book to be of great value.
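    The standard operations of traditional fuzzy set theory that such extensions build on can be illustrated in a few lines; the membership grades below are invented for illustration.

```python
# Membership grades of two fuzzy sets over the same five-element universe.
a = [0.25, 0.5, 1.0, 0.75, 0.0]
b = [0.5, 0.25, 0.75, 0.75, 0.125]

union = [max(x, y) for x, y in zip(a, b)]          # standard fuzzy union
intersection = [min(x, y) for x, y in zip(a, b)]   # standard fuzzy intersection
complement_a = [1.0 - x for x in a]                # standard fuzzy complement
```

    The max/min/1−x choices are the classical (Zadeh) operators; the book's "variations and extensions" replace them with other t-norms and t-conorms.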

    Characterization and uncertainty analysis of siliciclastic aquifer-fault system

    The complex siliciclastic aquifer system underneath the Baton Rouge area, Louisiana, USA, is fluvial in origin. The east-west trending Baton Rouge fault and Denham Springs-Scotlandville fault cut across East Baton Rouge Parish and play an important role in groundwater flow and aquifer salinization. To better understand the salinization underneath Baton Rouge, it is imperative to study the hydrofacies architecture and the groundwater flow field of the Baton Rouge aquifer-fault system. This is done by developing multiple detailed hydrofacies architecture models and multiple groundwater flow models of the aquifer-fault system, representing various uncertain model propositions. The hydrofacies architecture models focus on the Miocene-Pliocene depth interval that consists of the “1,200-foot” sand, “1,500-foot” sand, “1,700-foot” sand and the “2,000-foot” sand, as these aquifer units are classified and named by their approximate depth below ground level. The groundwater flow models focus only on the “2,000-foot” sand. The study reveals the complexity of the Baton Rouge aquifer-fault system, where the sand deposition is non-uniform, different sand units are interconnected, the sand unit displacement on the faults is significant, and the spatial distribution of flow pathways through the faults is sporadic. The identified locations of flow pathways through the Baton Rouge fault provide useful information on possible windows for saltwater intrusion from the south. From the results we learn that the “1,200-foot” sand, “1,500-foot” sand and the “1,700-foot” sand should not be modeled separately, since they are very well connected near the Baton Rouge fault, while the “2,000-foot” sand between the two faults is a separate unit. Results suggest that at the “2,000-foot” sand the Denham Springs-Scotlandville fault has much lower permeability than the Baton Rouge fault, and that the Baton Rouge fault plays an important role in the aquifer salinization.