152 research outputs found

    Essays on Investment Fluctuation and Market Volatility

    This dissertation comprises two essays, one in macroeconomics and one in financial economics. In macroeconomics, aggregate investment fluctuation and its relation to individual firms' behavior have been studied extensively for the past three decades. Most studies of firms' interdependent investment behavior focus on the key issue of separating a firm's reaction to other firms' behavior from its reaction to common shocks. However, few researchers have addressed how to isolate this endogenous effect with statistical and econometric tools. The first essay starts with a comprehensive review of investment fluctuation and firms' interdependent behavior, followed by an econometric model of lumpy investment and an analysis of the binary choice behavior of firms' investments. The last part of the first essay investigates the unique characteristics of the Italian economy and discusses the economic policy implications of our findings. We ask a similar question in financial economics: where does stock market volatility come from? The literature on the sources of such volatility is abundant, and with the availability of high-frequency financial data, attention has increasingly turned to modeling the intraday volatility of asset prices and returns. However, no empirical intraday volatility analysis has been carried out at both the single-stock and industry level for the food industry. The second essay aims to fill this gap by modeling and testing the intraday volatility of asset prices and returns. It starts with a modified High Frequency Multiplicative Components GARCH (Generalized Autoregressive Conditional Heteroscedasticity) model, which decomposes volatility into three parts: a daily component, a deterministic intraday pattern, and a stochastic intraday component. We then apply this econometric model to a single firm as well as the whole food industry, using Trade and Quote data and Center for Research in Security Prices data. The study finds little connection between intraday returns and overnight returns. There is, however, strong evidence that food recall announcements have negative impacts on the asset returns of the associated publicly traded firms.
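
    As a rough illustration of the multiplicative decomposition described above, the sketch below separates simulated intraday returns into a daily component, a deterministic diurnal pattern, and a stochastic residual. It is a minimal sketch, not the essay's exact estimator: the data are simulated stand-ins for TAQ returns, and realized daily variance is assumed as the proxy for the daily component.

```python
# Minimal sketch of the multiplicative decomposition
# r_{t,i} = sqrt(h_t * s_i) * e_{t,i}: daily variance h_t, deterministic
# diurnal pattern s_i, stochastic intraday component e_{t,i}.
import numpy as np

rng = np.random.default_rng(0)
n_days, n_bins = 250, 78                # e.g. 5-minute bins in a 6.5h session

# Simulated intraday returns standing in for TAQ data (assumption).
true_diurnal = 1 + 0.8 * np.cos(np.linspace(0, 2 * np.pi, n_bins))
daily_var = 1e-4 * np.exp(rng.normal(0, 0.3, n_days))
r = rng.normal(size=(n_days, n_bins)) * np.sqrt(daily_var[:, None] * true_diurnal)

# Step 1: proxy for the daily component h_t (here: realized daily variance).
h = (r ** 2).sum(axis=1) / n_bins

# Step 2: deterministic diurnal pattern s_i, averaging deflated squared returns.
s = (r ** 2 / h[:, None]).mean(axis=0)
s /= s.mean()                           # normalize so the pattern averages to 1

# Step 3: the deflated residuals e_{t,i} carry the stochastic intraday
# component; a GARCH(1,1) could then be fit to them (e.g. with the `arch` package).
e = r / np.sqrt(h[:, None] * s[None, :])
print("diurnal pattern recovered, corr:", np.corrcoef(s, true_diurnal)[0, 1])
```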

    Data-Driven Methods and Applications for Optimization under Uncertainty and Rare-Event Simulation

    For most practical decisions or system designs, there is a chance of severe hazards or system failures that can be catastrophic. The occurrence of such hazards is usually uncertain, so it is important to measure and analyze the associated risks. Rare-event simulation techniques are a powerful tool for estimating such risks, improving the efficiency of estimation when the risk occurs with extremely small probability. Furthermore, one can use these risk measurements to reach better decisions or designs by modeling the task as a chance-constrained optimization problem, which optimizes an objective subject to a controlled risk level. However, practical problems have become increasingly data-driven, bringing new challenges to the existing literature in these two domains. In this dissertation, we discuss challenges and remedies in data-driven rare-event simulation and chance-constrained problems. We propose a robust-optimization-based framework for approaching chance-constrained optimization problems in a data-driven setting. We also analyze the impact of tail uncertainty in data-driven rare-event simulation tasks. Separately, recent breakthroughs in machine learning have spurred the development of intelligent physical systems such as autonomous vehicles. Since these systems can cause catastrophic harm to public safety, evaluating their machine learning components and overall system performance is crucial. This dissertation covers problems arising in the evaluation of such systems. We propose an importance sampling scheme for estimating rare events defined by machine learning predictors. Lastly, we discuss an application project evaluating the safety of autonomous vehicle driving algorithms.
    PHD, Industrial & Operations Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/163270/1/zhyhuang_1.pd
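
    The importance sampling idea behind such rare-event estimation can be illustrated on a toy problem. The sketch below estimates a small Gaussian tail probability by mean shifting (exponential tilting) and compares it with naive Monte Carlo; the Gaussian setting, threshold, and sample size are assumptions for illustration only, not the dissertation's machine-learning-defined events.

```python
# Minimal sketch of importance sampling for a rare event: P(X > a), X ~ N(0,1).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
a, n = 4.0, 100_000                       # rare threshold and sample size (assumed)

# Naive Monte Carlo: almost no samples land in the rare region.
x = rng.normal(size=n)
naive = (x > a).mean()

# Importance sampling: draw from the shifted proposal N(a, 1) and reweight by
# the likelihood ratio phi(y) / phi(y - a) = exp(-a*y + a^2/2).
y = rng.normal(loc=a, size=n)
weights = np.exp(-a * y + 0.5 * a ** 2)
is_est = (weights * (y > a)).mean()

print(f"true {norm.sf(a):.3e}  naive {naive:.3e}  importance sampling {is_est:.3e}")
```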

    Alternative portfolio methods

    Portfolio optimization in an uncertain environment has great practical value in the investment decision process, but the area is highly fragmented due to the fast evolution of market structure and changing investor behavior. In this dissertation, four methods are investigated or designed to explore their efficiency under different circumstances. The parametric portfolio decomposes weights through a set of factors whose coefficients are uniquely determined by maximizing a utility function; a robust bootstrap method is proposed to assist factor selection. If investors exhibit asymmetric aversion to tail risk, pessimistic models based on Choquet utility maximization and coherent risk measures become superior, and a new hybrid method is designed that inherits the advantages of parameterization and tail-risk minimization. Mean-variance optimization, which is optimal under elliptical return distributions, should be employed when allocating capital to trading strategies; nonparametric classifiers may enhance the homogeneity of inputs before they are fed to the optimizer. The traditional factor portfolio can be extended to a functional setting by applying functional principal component analysis (FPCA) to return curves sorted by factors, and diversification is achieved by mixing in the detected nonlinear components. This research contributes to the existing literature on portfolio choice in three ways: the strengths and weaknesses of each method are clarified; new models that outperform traditional approaches are developed; and empirical studies are used to facilitate comparison.
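
    A minimal sketch of the parametric portfolio idea, in the spirit of Brandt, Santa-Clara and Valkanov (2009), is given below: weights deviate from an equal-weight benchmark linearly in firm characteristics, and the coefficients are chosen by maximizing average CRRA utility. The simulated data, the CRRA utility, and the equal-weight benchmark are assumptions; the dissertation's bootstrap factor-selection step is not reproduced.

```python
# Minimal sketch of a parametric portfolio policy on simulated data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
T, N, K, gamma = 120, 50, 2, 5.0         # months, stocks, characteristics, risk aversion

X = rng.normal(size=(T, N, K))           # standardized firm characteristics (assumed)
beta_true = np.array([0.02, -0.01])
R = 0.01 + X @ beta_true + 0.05 * rng.normal(size=(T, N))   # simulated returns

def avg_utility(theta):
    # Weights: equal-weight benchmark plus a characteristic-driven tilt.
    w = 1.0 / N + (X @ theta) / N
    port = (w * R).sum(axis=1)
    # Negative average CRRA utility (to be minimized).
    return -np.mean((1 + port) ** (1 - gamma) / (1 - gamma))

res = minimize(avg_utility, np.zeros(K), method="Nelder-Mead")
print("estimated theta:", res.x)
```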

    ISIPTA'07: Proceedings of the Fifth International Symposium on Imprecise Probability: Theories and Applications


    Untangling hotel industry’s inefficiency: An SFA approach applied to a renowned Portuguese hotel chain

    This paper explores the technical efficiency of four hotels from the Teixeira Duarte Group, a renowned Portuguese hotel chain. An efficiency ranking of these four hotel units located in Portugal is established using Stochastic Frontier Analysis. This methodology makes it possible to discriminate between measurement error and systematic inefficiency in the estimation process, enabling investigation of the main causes of inefficiency. Several suggestions for improving efficiency are offered for each hotel studied.
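
    For readers unfamiliar with Stochastic Frontier Analysis, the sketch below fits a production frontier with half-normal inefficiency by maximizing the Aigner-Lovell-Schmidt log-likelihood on simulated data. It is a minimal sketch under assumed functional forms, not the paper's specification for the Teixeira Duarte hotels.

```python
# Minimal SFA sketch: y = frontier + v - u, with symmetric noise v and
# one-sided (half-normal) inefficiency u, estimated by maximum likelihood.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 200
x = rng.normal(size=n)
v = rng.normal(scale=0.2, size=n)            # symmetric measurement error
u = np.abs(rng.normal(scale=0.4, size=n))    # one-sided inefficiency
y = 1.0 + 0.7 * x + v - u

def negloglik(p):
    b0, b1, ls_v, ls_u = p
    sv, su = np.exp(ls_v), np.exp(ls_u)      # log-parametrized std devs
    eps = y - b0 - b1 * x
    sigma = np.hypot(sv, su)                 # sqrt(sv^2 + su^2)
    lam = su / sv
    # Aigner-Lovell-Schmidt density: f(eps) = (2/sigma) phi(eps/sigma) Phi(-eps*lam/sigma)
    ll = (np.log(2 / sigma) + norm.logpdf(eps / sigma)
          + norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

res = minimize(negloglik, np.array([0.0, 0.0, np.log(0.3), np.log(0.3)]),
               method="Nelder-Mead")
b0, b1, ls_v, ls_u = res.x
print("beta:", b0, b1, " sigma_v:", np.exp(ls_v), " sigma_u:", np.exp(ls_u))
```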

    Nonparametric Statistical Inference with an Emphasis on Information-Theoretic Methods

    This book addresses contemporary statistical inference issues when no or minimal assumptions are imposed on the nature of the studied phenomenon. Information-theoretic methods play an important role in such scenarios. The approaches discussed include various high-dimensional regression problems, time series, and dependence analyses.

    Dimensionality Reduction in Dynamic Optimization under Uncertainty

    Dynamic optimization problems affected by uncertainty are ubiquitous in many application domains. Decision makers typically model the uncertainty through random variables governed by a probability distribution. If the distribution is precisely known, then the emerging optimization problems constitute stochastic programs or chance constrained programs. On the other hand, if the distribution is at least partially unknown, then the emanating optimization problems represent robust or distributionally robust optimization problems. In this thesis, we leverage techniques from stochastic and distributionally robust optimization to address complex problems in finance, energy systems management and, more abstractly, applied probability. In particular, we seek to solve uncertain optimization problems where the prior distributional information includes only the first and second moments (and, sometimes, the support). The main objective of the thesis is to solve large instances of practical optimization problems. For this purpose, we develop complexity reduction and decomposition schemes, which exploit structural symmetries or multiscale properties of the problems at hand in order to break them down into smaller and more tractable components. In the first part of the thesis we study the growth-optimal portfolio, which maximizes the expected log-utility over a single investment period. In a classical stochastic setting, this portfolio is known to outperform any other portfolio with probability 1 in the long run. In the short run, however, it is notoriously volatile. Moreover, its performance suffers in the presence of distributional ambiguity. We design fixed-mix strategies that offer similar performance guarantees as the classical growth-optimal portfolio but for a finite investment horizon. Moreover, the proposed performance guarantee remains valid for any asset return distribution with the same mean and covariance matrix. These results rely on a Taylor approximation of the terminal logarithmic wealth that becomes more accurate as the rebalancing frequency is increased. In the second part of the thesis, we demonstrate that such a Taylor approximation is in fact not necessary. Specifically, we derive sharp probability bounds on the tails of a product of non-negative random variables. These generalized Chebyshev bounds can be computed numerically using semidefinite programming, and in some cases even analytically. Similar techniques can also be used to derive multivariate Chebyshev bounds for sums, maxima, and minima of random variables. In the final part of the thesis, we consider a multi-market reservoir management problem. The eroding peak/off-peak spreads on European electricity spot markets imply reduced profitability for hydropower producers and force them to participate in the balancing markets. This motivates us to propose a two-layer stochastic programming model for the optimal operation of a cascade of hydropower plants selling energy on both spot and balancing markets. The planning problem optimizes the reservoir management over a yearly horizon with weekly granularity, and the trading subproblems optimize the market transactions over a weekly horizon with hourly granularity. We solve both the planning and trading problems in linear decision rules, and we exploit the inherent parallelizability of the trading subproblems to achieve computational tractability.
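
    The classical growth-optimal portfolio that the first part of the thesis starts from can be sketched in a few lines: maximize the sample expected log-growth of wealth over return scenarios. The scenario data below are simulated assumptions, and the sketch implements only the classical stochastic version, not the thesis's distributionally robust fixed-mix strategies.

```python
# Minimal sketch of the classical growth-optimal (log-utility) portfolio.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(4)
T, n = 500, 5
# Simulated scenario returns (assumption standing in for market data).
R = rng.multivariate_normal(mean=0.01 * np.ones(n),
                            cov=0.002 * np.eye(n), size=T)

w = cp.Variable(n, nonneg=True)               # long-only weights
growth = cp.sum(cp.log(1 + R @ w)) / T        # sample expected log-growth
prob = cp.Problem(cp.Maximize(growth), [cp.sum(w) == 1])
prob.solve()
print("growth-optimal weights:", np.round(w.value, 3))
```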

    Enhancing robustness and sparsity via mathematical optimization

    This thesis focuses on deriving robust or sparse approaches, from an optimization perspective, for problems that have traditionally fallen within the fields of Operations Research or Statistics. In particular, the aim of this Ph.D. dissertation is to merge optimization techniques with statistical concepts, leading to novel methods that may outperform classic approaches and bridge theoretical mathematics with real-life problems. On one hand, the proposed robust approaches provide new insights into the modelling and interpretation of classic problems in Operations Research, yielding solutions that are resilient to uncertainty of various kinds. On the other hand, the sparse approaches derived to address some up-to-the-minute topics in Statistics take the form of Mixed Integer Non-Linear Programs (i.e., optimization problems with some integer or binary variables and a nonlinear objective function, denoted MINLPs hereafter). The proposed methods are shown to be computationally tractable and to enhance interpretability while attaining good predictive quality. More specifically, Chapter 1 focuses on discovering potential causalities in multivariate time series. This is undertaken by formulating the estimation problem as a MINLP in which the constraints model different aspects of sparsity, including constraints that prevent spurious relationships from appearing. The method performs well in terms of forecasting power and recovery of the original model. Analogously, Chapter 2 aims to discover the relevant predictors in a linear regression context without carrying out significance tests, since these may fail in the presence of strong collinearity. To this end, the preferred estimation method is tightened, deriving MINLPs in which the constraints measure the significance of the predictors and are designed to avoid collinearity issues. The tightened approaches attain a good trade-off between interpretability and accuracy. In Chapter 3, the classic newsvendor problem is generalized by assuming correlated demands. In particular, a robust inventory approach with distribution-free autoregressive demand is formulated as an optimization problem, using techniques that merge statistical concepts with uncertainty sets. The obtained solutions are robust against the presence of high-variability noise in the data while avoiding overconservativeness. In Chapter 4 this formulation is extended to multivariate time series in a more complex setting, where decisions on the location of facilities, the allocation of customers to them, and their production levels are sought. Empirically, we illustrate that, in order to design an efficient supply chain or improve an existing one, it is important to take into account the correlation and variability of the multivariate data, developing data-driven techniques that make use of robust forecasting methods. A closer examination of the specific characteristics of the problem and the uncertainty sets is undertaken in Chapter 5, where the portfolio selection problem with transaction costs is considered. In this chapter, theoretical results are derived that relate transaction costs to different forms of protection against uncertainty in the returns. As a consequence, the numerical experiments show that calibrating the transaction-cost term yields results that are resilient to estimation error.
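
    To give a concrete flavor of the distribution-free inventory idea in Chapter 3, the sketch below contrasts the classical newsvendor quantile rule under a normality assumption with Scarf's (1958) order quantity, which protects against all demand distributions sharing a given mean and variance. The prices and demand moments are assumptions, and the thesis's autoregressive-demand uncertainty sets are not reproduced.

```python
# Minimal sketch: classical vs. distribution-free (Scarf) newsvendor orders.
import numpy as np
from scipy.stats import norm

c, p = 4.0, 10.0                  # unit cost and selling price (assumed)
mu, sigma = 100.0, 20.0           # demand mean and standard deviation (assumed)

# Classical rule under normal demand: order the critical fractile (p - c) / p.
q_normal = norm.ppf((p - c) / p, loc=mu, scale=sigma)

# Scarf's distribution-free rule: worst case over all demand distributions
# with the given mean and variance.
ratio = (p - c) / c
q_scarf = mu + 0.5 * sigma * (np.sqrt(ratio) - 1.0 / np.sqrt(ratio))

print(f"normal-demand order: {q_normal:.1f}, Scarf robust order: {q_scarf:.1f}")
```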

    A NEW LEVY BASED SHORT-RATE MODEL FOR THE FIXED INCOME MARKET AND ITS ESTIMATION WITH PARTICLE FILTER

    This thesis makes two contributions to mathematical finance. First, to explain the non-trivial skewness and kurtosis observed in time series of constant maturity swap (CMS) rates, we employ pure-jump Lévy processes, in particular the Variance Gamma process, to model the variation of unobservable economic factors; this is the first model to include Lévy dynamics in short-rate modeling. Specifically, a Vasicek-type short-rate framework is adopted in which the short rate is an affine combination of three mean-reverting state variables. Zero-coupon bonds and several fixed-income derivatives are priced under the model using the transform method. The Lévy-based short-rate model is expected to explain yield curve movements more realistically than Gaussian-based models. Second, the model parameters are estimated with the particle filter (PF). The PF has not seen wide application in financial engineering, partly due to its stringent computational requirements; however, given today's low computing costs, the PF is a flexible yet powerful tool for estimating state-space models with non-Gaussian dynamics, such as Lévy-based models. To adapt the PF algorithm to our model, the continuous-time Lévy short-rate model is cast into discrete form by a first-order forward Euler approximation. The PF is used to retrieve the values of the unobservable factors sequentially from readily available market prices, and the optimal set of model parameters is obtained by quasi-maximum likelihood estimation.
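
    The estimation pipeline described above can be sketched end to end on a simplified model: a single mean-reverting factor driven by Variance Gamma jumps, discretized by a first-order Euler step and filtered with a bootstrap particle filter. The one-factor setting, noisy direct observation of the short rate, and all parameter values are assumptions for illustration; the thesis uses three factors and CMS market prices.

```python
# Minimal sketch: bootstrap particle filter for a Euler-discretized
# Vasicek-type state driven by Variance Gamma increments.
import numpy as np

rng = np.random.default_rng(5)
T, n_part, dt = 200, 2000, 1 / 252
kappa, theta, nu, sig_vg, obs_sd = 2.0, 0.03, 0.1, 0.02, 0.002  # assumed values

def vg_increment(size):
    # Variance Gamma increment: Brownian motion evaluated at a gamma time.
    g = rng.gamma(shape=dt / nu, scale=nu, size=size)
    return sig_vg * rng.normal(size=size) * np.sqrt(g)

# Simulate a "true" path and noisy observations (stand-in for market data).
x = np.empty(T); x[0] = theta
for k in range(1, T):
    x[k] = x[k-1] + kappa * (theta - x[k-1]) * dt + vg_increment(1)[0]
obs = x + obs_sd * rng.normal(size=T)

# Bootstrap particle filter: propagate, weight, resample.
parts = theta + 0.01 * rng.normal(size=n_part)
est = np.empty(T)
for k in range(T):
    if k > 0:   # propagate through the Euler-discretized dynamics
        parts = parts + kappa * (theta - parts) * dt + vg_increment(n_part)
    logw = -0.5 * ((obs[k] - parts) / obs_sd) ** 2      # Gaussian obs. density
    w = np.exp(logw - logw.max()); w /= w.sum()
    est[k] = w @ parts                                   # filtered mean
    parts = parts[rng.choice(n_part, n_part, p=w)]       # multinomial resampling

print("filter RMSE:", np.sqrt(np.mean((est - x) ** 2)))
```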