
    TCF periodogram's high sensitivity: A method for optimizing detection of small transiting planets

    We conduct a methodological study for statistically comparing the sensitivities of two periodograms for weak-signal planet detection in transit surveys: the widely used Box-Least Squares (BLS) algorithm following light curve detrending and the Transit Comb Filter (TCF) algorithm following autoregressive ARIMA modeling. Small-depth transits are injected into light curves with different simulated noise characteristics. Two measures of spectral peak significance are examined: the periodogram signal-to-noise ratio (SNR) and a False Alarm Probability (FAP) based on the generalized extreme value distribution. The relative performance of the BLS and TCF algorithms for small planet detection is examined for a range of light curve characteristics, including orbital period, transit duration, depth, number of transits, and type of noise. The TCF periodogram applied to ARIMA fit residuals with the SNR detection metric is preferred when short-memory autocorrelation is present in the detrended light curve, and even when the light curve noise is white and Gaussian. BLS is more sensitive to small planets only under limited circumstances with the FAP metric. BLS periodogram characteristics are inferior when autocorrelated noise is present. Application of these methods to TESS light curves with small exoplanets confirms our simulation results. The study ends with a decision tree that advises transit survey scientists on procedures to detect small planets most efficiently. The use of ARIMA detrending and TCF periodograms can significantly improve the sensitivity of any transit survey with regularly spaced cadence.
    Comment: 30 pages, 13 figures, submitted to AAS Journal
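
    A minimal sketch of the general workflow described above, ARIMA modeling of the light curve followed by a periodogram search on the residuals, using statsmodels and astropy; BoxLeastSquares stands in for the TCF periodogram, and the synthetic light curve, cadence, ARIMA order, and trial duration are illustrative assumptions rather than choices from the paper:

```python
# Sketch: ARIMA detrending followed by a box-fitting period search on the residuals.
# The synthetic light curve, cadence, ARIMA order, and trial duration are illustrative.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from astropy.timeseries import BoxLeastSquares

rng = np.random.default_rng(0)
time = np.arange(0, 27.4, 30 / 60 / 24)            # 30-minute cadence, ~one TESS sector
flux = 1 + 1e-4 * rng.standard_normal(time.size)   # stand-in light curve

# 1. Model autocorrelated noise and trends with ARIMA, keep the residuals.
resid = ARIMA(flux, order=(2, 0, 1)).fit().resid

# 2. Search the residuals for periodic box-shaped dips (BLS stands in for TCF here).
bls = BoxLeastSquares(time, resid)
result = bls.autopower(duration=0.1)               # 0.1-day trial transit duration

# 3. Simple SNR-style significance of the highest peak.
snr = (result.power.max() - result.power.mean()) / result.power.std()
best_period = result.period[np.argmax(result.power)]
print(f"best period {best_period:.3f} d, peak SNR {snr:.1f}")
```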

    The weather derivatives market: modelling and pricing temperature

    The main objective of the thesis is to find a pricing model for weather derivatives based on temperature. A general Ornstein-Uhlenbeck process with seasonal mean and volatility is proposed to model the time-dynamics of daily average temperatures. The model is fitted to almost 54 years of daily observations recorded in Chicago, Philadelphia, Portland and Tucson. The unequivocal evidence of fat tails and negative skewness observed for the city of Tucson is modelled by introducing Lévy processes. Since the weather derivatives market is incomplete, unique prices are derived using the market price of risk. Finally, an estimate of the market price of risk is provided by calibrating theoretical prices to the actual quoted market prices.
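
    A minimal sketch of the kind of temperature dynamics proposed above, an Ornstein-Uhlenbeck process reverting to a seasonal mean with seasonal volatility, simulated by Euler discretization; all parameter values are illustrative assumptions, not estimates from the thesis. Capturing the Tucson-style fat tails and skewness would amount to replacing the Gaussian increment with a Lévy-driven one.

```python
# Euler simulation of dT_t = ds(t) + kappa * (s(t) - T_t) dt + sigma(t) dW_t,
# an Ornstein-Uhlenbeck process with seasonal mean s(t) and seasonal volatility sigma(t).
# All parameter values below are illustrative, not estimates from the thesis.
import numpy as np

kappa = 0.2                       # speed of mean reversion (per day)
days = np.arange(365 * 5)         # five years of daily steps

def seasonal_mean(t):
    # s(t): annual cycle around 12 degrees C with a 10-degree amplitude
    return 12 + 10 * np.sin(2 * np.pi * (t - 100) / 365.25)

def seasonal_vol(t):
    # sigma(t): noisier winters than summers
    return 2 + np.cos(2 * np.pi * t / 365.25)

rng = np.random.default_rng(1)
temp = np.empty(days.size)
temp[0] = seasonal_mean(0)
for i in range(1, days.size):
    t = days[i - 1]
    drift = (seasonal_mean(t + 1) - seasonal_mean(t)) + kappa * (seasonal_mean(t) - temp[i - 1])
    temp[i] = temp[i - 1] + drift + seasonal_vol(t) * rng.standard_normal()
```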

    Improvement of the demand forecasting methods for vehicle parts at an international automotive company.

    This study aims to improve the forecasting accuracy for the monthly material flows of an area-forwarding-based inbound logistics network at an international automotive company. Due to human errors, short-term changes in material requirements, or database desynchronization, the Material Requirement Planning (MRP) cannot be derived directly from the Master Production Schedule (MPS); the inbound logistics flows are therefore forecast. The current research extends the scope of the forecasting methods already applied by the company (Naïve, ARIMA, Neural Networks, Exponential Smoothing, and an Ensemble Forecast averaging the first four methods) by implementing three new algorithms, the Prophet algorithm, Vector Autoregression (multivariate time series), and an automated Simple Moving Average, and two new data cleaning methods, Automated Outlier Detection and Linear Interpolation. All methods are implemented in software written in the programming language R. The results show that as of April 2018, 80.1% of all material flows have a Mean Absolute Percentage Error (MAPE) of 20% or less, compared with 58.6% of all material flows in the original software in February 2018. Furthermore, the three new algorithms now account for 29% of all forecasts. All analyses in this research were performed with actual company data, and the upgraded software was approved by the logistics analysts for all future material flow forecasts.
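
    Although the company's tool is written in R, a minimal Python sketch of two ingredients of the evaluation described above may clarify it: an ensemble forecast built as the average of individual methods, and the MAPE used to grade each material flow against the 20% target. The series, model orders, and library choices are illustrative assumptions, not the study's configuration.

```python
# Sketch: average several one-step forecasts into an ensemble and grade it with MAPE.
# `history` is an illustrative monthly material-flow series; the models stand in for
# the Naive / ARIMA / Exponential Smoothing methods named in the study.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import ExponentialSmoothing

history = np.array([120, 135, 128, 150, 160, 155, 170, 165, 180, 175, 190, 200], float)
actual_next = 205.0

naive = history[-1]
arima = ARIMA(history, order=(1, 1, 1)).fit().forecast(1)[0]
smooth = ExponentialSmoothing(history, trend="add").fit().forecast(1)[0]
ensemble = np.mean([naive, arima, smooth])

mape = abs(actual_next - ensemble) / abs(actual_next) * 100
print(f"ensemble forecast {ensemble:.1f}, MAPE {mape:.1f}% "
      f"({'within' if mape <= 20 else 'outside'} the 20% target)")
```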

    Statistical methods for scale-invariant and multifractal stochastic processes.

    This thesis focuses on stochastic modeling and statistical methods in finance and in climate science. Two financial markets, short-term interest rates and electricity prices, are analyzed. We find that the evidence of mean reversion in short-term interest rates is weak, while the “log-returns” of electricity prices have significant anti-correlations. More importantly, empirical analyses confirm the multifractal nature of these financial markets, and we propose multifractal models that incorporate the specific conditional mean reversion and level dependence. A second topic in the thesis is the analysis of regional (5° × 5° and 2° × 2° latitude-longitude) globally gridded surface temperature series for the time period 1900-2014, with respect to a linear trend and long-range dependence. We find statistically significant trends in most regions. However, we also demonstrate that the existence of a second scaling regime on decadal time scales will have an impact on trend detection. The last main result is an approximative maximum likelihood (ML) method for the log-normal multifractal random walk. It is shown that the ML method has applications beyond parameter estimation and can, for instance, be used to compute various risk measures in financial markets.
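
    A minimal sketch of a standard scale-invariance diagnostic underlying this kind of analysis: estimating the scaling exponents zeta(q) of the q-th order structure functions, where a nonlinear (concave) zeta(q) indicates multifractality. The synthetic Brownian series is an illustrative stand-in for the financial and temperature data analyzed in the thesis, and this generic diagnostic is not the thesis's approximative ML method.

```python
# Sketch: structure-function estimate of scaling exponents zeta(q),
# where S_q(lag) = mean(|x(t+lag) - x(t)|^q) ~ lag^zeta(q).
# For a monofractal process zeta(q) is linear in q; concavity signals multifractality.
import numpy as np

rng = np.random.default_rng(2)
x = np.cumsum(rng.standard_normal(2**14))    # stand-in series (Brownian motion: zeta(q) = q/2)

lags = np.unique(np.logspace(0, 3, 20).astype(int))
qs = np.array([1.0, 2.0, 3.0, 4.0])

zeta = []
for q in qs:
    s_q = [np.mean(np.abs(x[lag:] - x[:-lag]) ** q) for lag in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(s_q), 1)
    zeta.append(slope)

print(dict(zip(qs, np.round(zeta, 2))))      # roughly q/2 for this monofractal example
```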

    The long memory of the efficient market

    For the London Stock Exchange we demonstrate that the signs of orders obey a long-memory process. The autocorrelation function decays roughly as $\tau^{-\alpha}$ with $\alpha \approx 0.6$, corresponding to a Hurst exponent $H \approx 0.7$. This implies that the signs of future orders are quite predictable from the signs of past orders; all else being equal, this would suggest a very strong market inefficiency. We demonstrate, however, that fluctuations in order signs are compensated for by anti-correlated fluctuations in transaction size and liquidity, which are also long-memory processes. This tends to make the returns whiter. We show that some institutions display long-range memory and others do not.
    Comment: 19 pages, 12 figures
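
    For an autocorrelation decaying as $\tau^{-\alpha}$ with $0 < \alpha < 1$, the Hurst exponent satisfies $H = 1 - \alpha/2$, which is how $\alpha \approx 0.6$ corresponds to $H \approx 0.7$. A minimal sketch of one common estimator, aggregated-variance scaling of a sign series; the i.i.d. signs below are a stand-in for real order-sign data and should yield H near 0.5 rather than 0.7.

```python
# Sketch: aggregated-variance estimate of the Hurst exponent of a +/-1 sign series.
# The variance of block means scales as m^(2H - 2); the i.i.d. signs below are a
# stand-in for real order-sign data and should give H close to 0.5, not 0.7.
import numpy as np

rng = np.random.default_rng(3)
signs = rng.choice([-1, 1], size=2**16)

block_sizes = np.unique(np.logspace(1, 3.5, 15).astype(int))
variances = []
for m in block_sizes:
    n_blocks = signs.size // m
    block_means = signs[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
    variances.append(block_means.var())

slope, _ = np.polyfit(np.log(block_sizes), np.log(variances), 1)
hurst = 1 + slope / 2                        # slope = 2H - 2
print(f"estimated H = {hurst:.2f}")
```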

    Single equation models for inflation forecasting in Rwanda

    This study evaluates Phillips curve forecasts of inflation for Rwanda. The study relies on various single-equation prototype Phillips curve models, as described by Stock and Watson (2008). Pseudo out-of-sample comparison tests are used to evaluate the performance of these Phillips curve forecasts relative to the AR (autoregression) benchmark forecasts. In this regard, tests of equal forecast accuracy based on the mean square forecast error, and tests based on forecast encompassing as used by several scholars (for example, Clark and McCracken (2001, 2005) and Rapach and Weber (2004)), are reported. Furthermore, results from forecasts using inflation in levels and in differences as the dependent variable are reported, to check sensitivity to this specification issue. The study finds that the Phillips curve and augmented Phillips curve forecasts outperform the AR benchmark forecasts at one- and two-quarter horizons. The output gap, the exchange rate and money supply (M3) are found to be good predictors of inflation in Rwanda in the generalised Phillips curve context. It is therefore strongly recommended that Rwandan economic policymakers take these variables into consideration when forecasting inflation.
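
    A minimal sketch of the pseudo out-of-sample exercise described above: recursive one-step forecasts from an AR benchmark and from a simple backward-looking Phillips curve (lagged inflation plus a lagged activity variable), compared by mean square forecast error. The synthetic quarterly data are stand-ins for the Rwandan series, and the specification is illustrative rather than the study's.

```python
# Sketch: pseudo out-of-sample comparison of an AR benchmark with a simple
# Phillips-curve regression (lagged inflation plus a lag of the output gap).
# Synthetic quarterly data stand in for the Rwandan series.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
T = 120
gap = rng.standard_normal(T)                         # stand-in output gap
infl = np.zeros(T)
for t in range(1, T):
    infl[t] = 0.5 * infl[t - 1] + 0.3 * gap[t - 1] + rng.standard_normal()

def msfe(use_gap):
    """Recursive one-step forecasts; returns the mean square forecast error."""
    errs = []
    for t in range(80, T - 1):
        # regress infl[1..t] on infl[0..t-1] (plus gap[0..t-1] for the Phillips curve)
        cols = [infl[:t]] + ([gap[:t]] if use_gap else [])
        X = sm.add_constant(np.column_stack(cols))
        fit = sm.OLS(infl[1 : t + 1], X).fit()
        x_new = np.array([1.0, infl[t]] + ([gap[t]] if use_gap else []))
        errs.append(infl[t + 1] - fit.params @ x_new)
    return float(np.mean(np.square(errs)))

print("AR benchmark MSFE  :", round(msfe(False), 3))
print("Phillips curve MSFE:", round(msfe(True), 3))
```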

    Components and breakpoints in time series for the analysis of remote sensing images

    Advisor: Ricardo da Silva Torres. Master's dissertation, Universidade Estadual de Campinas, Instituto de Computação.
    Detecting and characterizing temporal changes are crucial indicators in the process of understanding how complex mechanisms work and evolve. Remote sensing images and techniques have been broadly employed over the past decades to detect and investigate temporal changes on the Earth's surface. Such change detection in time series data may be refined even further by isolating the additive long-term (trend) and cyclical (seasonal) components from the underlying noise. This work investigates the Breaks For Additive Season and Trend (BFAST) method for the analysis, decomposition, and breakpoint detection of time series associated with remote sensing data. The outputs derived from that method are then used in three distinct, but highly interconnected, research venues: a better comprehension of climatic phenomena; correlation with data on human-induced disturbances; and data classification problems using time series dissimilarity functions discovered by a Genetic-Programming (GP)-based evolutionary framework. The experiments performed show that the decomposition and breakpoints produced insightful and effective results when applied to the ecological data studies, but could not further improve the classification results when compared to their raw time series counterparts. The achievements in these three contexts also led to the creation of two open-source web-based time series analysis tools, one of which was so well received by the target community that it is currently integrated into a private cloud computing platform.
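
    BFAST itself is distributed as an R package. A minimal Python sketch of the same idea under stated assumptions: additive seasonal-trend decomposition (here via STL from statsmodels) followed by breakpoint detection on the extracted trend (here via the ruptures package), both standing in for BFAST's internal decomposition and break estimation. The synthetic NDVI-like series and penalty value are illustrative assumptions.

```python
# Sketch: seasonal-trend decomposition followed by breakpoint detection on the trend.
# STL + ruptures stand in for BFAST's additive decomposition and break estimation;
# the synthetic NDVI-like series and the penalty value are illustrative assumptions.
import numpy as np
import ruptures as rpt
from statsmodels.tsa.seasonal import STL

rng = np.random.default_rng(5)
months = np.arange(240)                                  # 20 years of monthly data
seasonal = 0.2 * np.sin(2 * np.pi * months / 12)
trend = np.where(months < 150, 0.7, 0.4)                 # abrupt disturbance at month 150
ndvi = trend + seasonal + 0.03 * rng.standard_normal(months.size)

# 1. Additive decomposition into trend + seasonal + remainder.
stl = STL(ndvi, period=12).fit()

# 2. Detect abrupt changes in the extracted trend component.
breaks = rpt.Pelt(model="l2").fit(stl.trend).predict(pen=1.0)
print("breakpoints at months:", breaks[:-1])             # last index is the series end
```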