34 research outputs found

    Stochastic Volatility Filtering with Intractable Likelihoods

    This paper is concerned with particle filtering for α-stable stochastic volatility models. The α-stable distribution provides a flexible framework for modeling asymmetry and heavy tails, which is useful when modeling financial returns. An issue with this distributional assumption is the lack of a closed form for the probability density function. To estimate the volatility of financial returns in this setting, we develop a novel auxiliary particle filter. The algorithm we develop can be easily applied to any hidden Markov model for which the likelihood function is intractable or computationally expensive. The approximate target distribution of our auxiliary filter is based on the idea of approximate Bayesian computation (ABC). ABC methods allow for inference on posterior quantities in situations when the likelihood of the underlying model is not available in closed form, but simulating samples from it is possible. The ABC auxiliary particle filter (ABC-APF) that we propose provides not only a good alternative to state estimation in stochastic volatility models, but it also improves on the existing ABC literature. It allows for more flexibility in state estimation while improving accuracy through better proposal distributions in cases when the optimal importance density of the filter is unavailable in closed form. We assess the performance of the ABC-APF on a simulated dataset from the α-stable stochastic volatility model and compare it to other existing ABC filters.
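
    To make the ABC filtering idea concrete, the sketch below implements a bootstrap-style ABC particle filter for a simple stochastic volatility model; it is not the auxiliary variant proposed in the paper. The model form, the parameter values (mu, phi, sigma, alpha, beta), the Gaussian ABC kernel, and the tolerance eps are all assumptions made for the example.

```python
# Minimal sketch of an ABC particle filter for a stochastic volatility (SV) model.
#
#   state:       x_t = mu + phi * (x_{t-1} - mu) + sigma * v_t,  v_t ~ N(0, 1)
#   observation: y_t = exp(x_t / 2) * e_t,  e_t alpha-stable (heavy-tailed)
#
# The observation density has no closed form, but sampling from it is easy,
# which is all ABC needs: pseudo-observations are simulated per particle and
# weighted by how close they fall to the actual observation.

import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)

def abc_particle_filter(y, n_particles=1000, mu=-1.0, phi=0.95, sigma=0.2,
                        alpha=1.8, beta=0.0, eps=0.1):
    """Return filtered means E[x_t | y_{1:t}] under an ABC approximation."""
    T = len(y)
    # initialize from the stationary distribution of the log-volatility
    x = rng.normal(mu, sigma / np.sqrt(1 - phi**2), size=n_particles)
    filtered = np.empty(T)
    for t in range(T):
        # propagate particles through the state transition
        x = mu + phi * (x - mu) + sigma * rng.standard_normal(n_particles)
        # simulate pseudo-observations with alpha-stable noise (cheap to sample,
        # intractable density)
        e = levy_stable.rvs(alpha, beta, size=n_particles, random_state=rng)
        y_sim = np.exp(x / 2) * e
        # ABC weights: Gaussian kernel on the distance to the real observation
        logw = -0.5 * ((y_sim - y[t]) / eps) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        filtered[t] = np.sum(w * x)
        # multinomial resampling
        x = x[rng.choice(n_particles, size=n_particles, p=w)]
    return filtered
```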

    Multivariate Modeling of Natural Gas Spot Trading Hubs Incorporating Futures Market Realized Volatility

    Financial markets for Liquefied Natural Gas (LNG) are an important and rapidly growing segment of commodities markets. Like other commodities markets, there is an inherent spatial structure to LNG markets, with different price dynamics at different delivery hubs. Certain hubs support highly liquid markets, allowing efficient and robust price discovery, while others are highly illiquid, limiting the effectiveness of standard risk management techniques. We propose a joint modeling strategy, which uses high-frequency information from thickly traded hubs to improve volatility estimation and risk management at thinly traded hubs. The resulting model has superior in- and out-of-sample predictive performance, particularly for several commonly used risk management metrics, demonstrating that joint modeling is indeed possible and useful. To improve estimation, a Bayesian approach is employed and data-driven, weakly informative priors are suggested. Our model is robust to sparse data and can be effectively used in any market with similar irregular patterns of data availability.
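
    To illustrate the borrowing-of-information idea, here is a minimal GARCH-X-style sketch in which realized volatility computed from a thickly traded hub's intraday returns enters the conditional-variance recursion of a thinly traded hub. This is an illustration only, not the joint Bayesian model estimated in the paper; the function names and parameter values are placeholders.

```python
# Illustrative sketch: letting realized volatility (RV) from a thickly traded hub
# enter the conditional-variance equation of a thinly traded hub, GARCH-X style.
# Parameter values below are arbitrary placeholders, not estimates from the paper.

import numpy as np

def realized_volatility(intraday_returns):
    """Daily realized variance: sum of squared intraday returns (one row per day)."""
    return np.sum(np.asarray(intraday_returns) ** 2, axis=1)

def garch_x_variance(r_thin, rv_liquid, omega=0.01, alpha=0.05, beta=0.90, gamma=0.10):
    """Conditional variance for the thin hub, augmented with the liquid hub's RV."""
    T = len(r_thin)
    h = np.empty(T)
    h[0] = np.var(r_thin)  # simple initialization
    for t in range(1, T):
        h[t] = (omega + alpha * r_thin[t - 1] ** 2
                + beta * h[t - 1] + gamma * rv_liquid[t - 1])
    return h
```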

    Climate Change Meets the Law of the Horse

    The climate change policy debate has only recently turned its full attention to adaptation - how to address the impacts of climate change we have already begun to experience and that will likely increase over time. Legal scholars have in turn begun to explore how the many different fields of law will and should respond. During this nascent period, one overarching question has gone unexamined: how will the legal system as a whole organize around climate change adaptation? Will a new distinct field of climate change adaptation law and policy emerge, or will legal institutions simply work away at the problem through unrelated, duly self-contained fields, as in the famous Law of the Horse? This Article is the first to examine that question comprehensively, to move beyond thinking about the law and climate change adaptation to consider the law of climate change adaptation. Part I of the Article lays out our methodological premises and approach. Using a model we call Stationarity Assessment, Part I explores how legal fields are structured and sustained based on assumptions about the variability of natural, social, and economic conditions, and how disruptions to that regime of variability can lead to the emergence of new fields of law and policy. Case studies of environmental law and environmental justice demonstrate the model’s predictive power for the formation of new distinct legal regimes. Part II applies the Stationarity Assessment model to the topic of climate change adaptation, using a case study of a hypothetical coastal region and the potential for climate change impacts to disrupt relevant legal doctrines and institutions. We find that most fields of law appear capable of adapting effectively to climate change. In other words, without some active intervention, we expect the law and policy of climate change adaptation to follow the path of the Law of the Horse - a collection of fields independently adapting to climate change - rather than organically coalescing into a new distinct field. Part III explores why, notwithstanding this conclusion, it may still be desirable to seek a different trajectory. Focusing on the likelihood of systemic adaptation decisions with perverse, harmful results, we identify the potential benefits offered by intervening to shape a new and distinct field of climate change adaptation law and policy. Part IV then identifies the contours of such a field, exploring the distinct purposes of reducing vulnerability, ensuring resiliency, and safeguarding equity. These features provide the normative policy components for a law of climate change adaptation that would be more than just a Law of the Horse. This new field would not replace or supplant any existing field, however, as environmental law did with regard to nuisance law, and it would not be dominated by substantive doctrine. Rather, like the field of environmental justice, this new legal regime would serve as a holistic overlay across other fields to ensure more efficient, effective, and just climate change adaptation solutions.

    Erratum to: Methods for evaluating medical tests and biomarkers

    [This corrects the article DOI: 10.1186/s41512-016-0001-y.]

    Evidence synthesis to inform model-based cost-effectiveness evaluations of diagnostic tests: a methodological systematic review of health technology assessments

    Background: Evaluations of diagnostic tests are challenging because of the indirect nature of their impact on patient outcomes. Model-based health economic evaluations of tests allow different types of evidence from various sources to be incorporated and enable cost-effectiveness estimates to be made beyond the duration of available study data. To parameterize a health-economic model fully, all the ways a test impacts on patient health must be quantified, including but not limited to diagnostic test accuracy.
    Methods: We assessed all UK NIHR HTA reports published May 2009-July 2015. Reports were included if they evaluated a diagnostic test, included a model-based health economic evaluation and included a systematic review and meta-analysis of test accuracy. From each eligible report we extracted information on the following topics: 1) what evidence aside from test accuracy was searched for and synthesised, 2) which methods were used to synthesise test accuracy evidence and how the results informed the economic model, 3) how/whether threshold effects were explored, 4) how the potential dependency between multiple tests in a pathway was accounted for, and 5) for evaluations of tests targeted at the primary care setting, how evidence from differing healthcare settings was incorporated.
    Results: The bivariate or hierarchical summary ROC (HSROC) model was implemented in 20/22 reports that met all inclusion criteria. Test accuracy data for health economic modelling were obtained from meta-analyses completely in four reports, partially in fourteen reports and not at all in four reports. Only 2/7 reports that used a quantitative test gave clear threshold recommendations. All 22 reports explored the effect of uncertainty in accuracy parameters, but most of those that used multiple tests did not allow for dependence between test results. 7/22 tests were potentially suitable for primary care, but most of these reports found limited evidence on test accuracy in primary care settings.
    Conclusions: The uptake of appropriate meta-analysis methods for synthesising evidence on diagnostic test accuracy in UK NIHR HTAs has improved in recent years. Future research should focus on other evidence requirements for cost-effectiveness assessment, threshold effects for quantitative tests and the impact of multiple diagnostic tests.
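
    As a minimal sketch of how pooled test accuracy feeds an economic model, the snippet below pushes pooled sensitivity and specificity (as might come from a bivariate meta-analysis) through a one-step decision tree to obtain expected costs and QALYs. All numbers are invented placeholders, not figures from any HTA report discussed above.

```python
# Minimal sketch: feeding pooled test accuracy from a meta-analysis into a
# one-step decision-tree cost-effectiveness calculation.  Prevalence,
# sensitivity, specificity, costs and QALYs below are invented placeholders.

def expected_cost_and_qaly(prev, sens, spec, costs, qalys):
    """Expected cost and QALYs per patient for a test-and-treat strategy.

    costs / qalys are dicts keyed by outcome: 'TP', 'FP', 'TN', 'FN'.
    """
    p = {
        "TP": prev * sens,             # diseased, correctly detected
        "FN": prev * (1 - sens),       # diseased, missed
        "TN": (1 - prev) * spec,       # healthy, correctly ruled out
        "FP": (1 - prev) * (1 - spec)  # healthy, falsely positive
    }
    cost = sum(p[k] * costs[k] for k in p)
    qaly = sum(p[k] * qalys[k] for k in p)
    return cost, qaly

cost, qaly = expected_cost_and_qaly(
    prev=0.10, sens=0.85, spec=0.90,
    costs={"TP": 1200.0, "FP": 400.0, "TN": 50.0, "FN": 3000.0},
    qalys={"TP": 0.95, "FP": 0.88, "TN": 0.92, "FN": 0.70},
)
print(f"expected cost per patient: {cost:.2f}, expected QALYs: {qaly:.3f}")
```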

    Association of Out-of-Hospital Cardiac Arrest with Exposure to Fine Particulate and Ozone Ambient Air Pollution from Case-Crossover Analysis Results: Are the Standards Protective?

    About 300,000 cardiac arrests occur outside of hospitals in the United States each year; most are fatal. Studies have shown that a small but significant percentage of cardiac arrests appear to be triggered by exposure to increased levels of one of two air pollutants: fine particulate matter and ozone. We analyzed seven key studies to determine if Environmental Protection Agency (EPA) standards protect the public from out-of-hospital cardiac arrests (OHCA) triggered by exposure to fine particulate matter and ozone. Using Houston, Texas, data, we found evidence of an increased risk of cardiac arrest on the order of 2% to 9% due to an increase in fine particulate levels (a daily average increase of 10 µg/m3) on the day of, or the day before, the cardiac arrest. The EPA fine particulate standard of 35 µg/m3 (35 micrograms per cubic meter of air) therefore does not effectively protect the public from OHCA triggered by exposure to fine particulates. However, the EPA’s ozone standard does appear to adequately protect public health from OHCA triggered by exposure to ozone.
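
    As a rough illustration of how estimates of this kind scale, the snippet below converts a hypothetical excess risk per 10 µg/m3 (chosen within the 2% to 9% range quoted above) into the relative risk implied by a larger increase, assuming a log-linear concentration-response; the baseline level is invented.

```python
# Back-of-the-envelope risk scaling under an assumed log-linear
# concentration-response.  The 5% excess risk per 10 µg/m3 is a placeholder
# within the 2%-9% range reported above; the baseline level is hypothetical.

def relative_risk(delta_pm25, rr_per_10=1.05):
    """Relative risk for an increase of delta_pm25 µg/m3 in daily PM2.5."""
    return rr_per_10 ** (delta_pm25 / 10.0)

baseline, standard = 10.0, 35.0   # hypothetical baseline vs. the daily standard
rr = relative_risk(standard - baseline)
print(f"relative risk at {standard} vs {baseline} µg/m3: {rr:.3f} "
      f"({(rr - 1) * 100:.1f}% excess risk)")
```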

    Grid-Based Simulation And The Method Of Conditional Least Squares

    This paper is concerned with the use of simulation to compute the conditional expectations that arise in the method of conditional least squares. Our approach involves performing simulations at each point on a discrete grid embedded within a statistical parameter space. Our main result concerns the number of grid points and the amount of simulation necessary to obtain a degree of accuracy comparable to that in the case in which the conditional expectations are available in closed form.
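
    The sketch below shows the basic mechanics of conditional least squares with grid-based simulation: the criterion sum_t (X_{t+1} - E_theta[X_{t+1} | X_t])^2 is evaluated at each grid point with Monte Carlo estimates of the conditional expectation, and the minimizing grid point is taken as the estimate. An AR(1) model is used purely to keep the example short; the grid range, grid size, and number of replicates are arbitrary choices, and nothing here reproduces the paper's accuracy analysis.

```python
# Illustrative sketch of conditional least squares (CLS) with grid-based
# simulation.  The AR(1) conditional expectation is of course available in
# closed form; here it is deliberately estimated by simulation at each grid
# point, as one would do for a model with no closed form.

import numpy as np

rng = np.random.default_rng(1)

def simulate_step(x_prev, theta, n_reps):
    """Simulate n_reps draws of X_{t+1} given X_t = x_prev under parameter theta."""
    return theta * x_prev + rng.standard_normal(n_reps)

def cls_criterion(x, theta, n_reps=200):
    """Monte Carlo estimate of the CLS objective sum_t (X_{t+1} - E[X_{t+1}|X_t])^2."""
    cond_means = np.array([simulate_step(xt, theta, n_reps).mean() for xt in x[:-1]])
    return np.sum((x[1:] - cond_means) ** 2)

# simulate data from theta = 0.6
true_theta = 0.6
x = np.empty(300)
x[0] = 0.0
for t in range(1, len(x)):
    x[t] = true_theta * x[t - 1] + rng.standard_normal()

# evaluate the criterion on a parameter grid and take the minimizer
grid = np.linspace(-0.95, 0.95, 39)
theta_hat = grid[np.argmin([cls_criterion(x, th) for th in grid])]
print(f"grid-based CLS estimate: {theta_hat:.3f} (true value {true_theta})")
```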

    Point source influence on observed extreme pollution levels in a monitoring network

    This paper presents a strategy to quantify the influence that major point sources in a region have on extreme pollution values observed at each of the monitors in the network. We focus on the number of hours in a day that the levels at a monitor exceed a specified health threshold. The number of daily exceedances is modeled using observation-driven negative binomial time series regression models, allowing for a zero-inflation component to characterize the probability of no exceedances on a particular day. The spatial nature of the problem is addressed through the use of a Gaussian plume model for atmospheric dispersion computed at locations of known emissions, creating covariates that impact exceedances. In order to isolate the influence of emitters at individual monitors, we fit separate regression models to the series of counts from each monitor. We apply a final model-clustering step to group monitor series that exhibit similar behavior with respect to mean, variability, and common contributors, to support policy decision making. The methodology is applied to eight benzene pollution series measured at air quality monitors around the Houston Ship Channel, a major industrial port.
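
    To show how the dispersion-based covariates can be constructed, the sketch below evaluates a standard Gaussian plume formula at a monitor location for a known emitter; a column of such values per emitter would then enter a zero-inflated negative binomial regression for that monitor's daily exceedance counts. The power-law dispersion coefficients and all numeric values are illustrative placeholders, not the paper's calibration.

```python
# Sketch of a Gaussian plume covariate: relative concentration at a monitor
# attributable to a known emitter, to be used as a regressor in a zero-inflated
# negative binomial model of daily exceedance counts.  All values are
# placeholders chosen for illustration.

import numpy as np

def plume_concentration(q, u, x, y, z, h, a=0.22, b=0.9, c=0.20, d=0.85):
    """Gaussian plume concentration at a receptor.

    q: emission rate (g/s), u: wind speed (m/s),
    x: downwind distance (m), y: crosswind offset (m), z: receptor height (m),
    h: effective stack height (m); a, b, c, d parameterize sigma_y and sigma_z.
    """
    sigma_y = a * x ** b
    sigma_z = c * x ** d
    crosswind = np.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (np.exp(-(z - h) ** 2 / (2 * sigma_z ** 2))
                + np.exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))
    return q / (2 * np.pi * u * sigma_y * sigma_z) * crosswind * vertical

# daily covariate for one monitor/emitter pair (placeholder wind and geometry)
daily_cov = plume_concentration(q=50.0, u=3.0, x=2500.0, y=300.0, z=2.0, h=40.0)
print(f"plume covariate: {daily_cov:.3e}")
```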