
    Revisiting variance gamma pricing: an application to S&P500 index options

    We reformulate the Lévy-Khintchine formula to make it suitable for modelling the stochastic time-changing effects of Lévy processes. Using the Variance-Gamma (VG) process as an example, the paper illustrates the dynamic properties of a Lévy process and revisits the earlier work of Geman (2002). It also shows how the model can be calibrated to price options under a Lévy VG process, and calibrates the model on recent S&P500 index options data. It then compares the pricing performance of the Fast Fourier Transform (FFT) and Fractional Fourier Transform (FRFT) approaches to model calibration and investigates the trade-off between calibration performance and required calculation time.
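    As a rough sketch of the FFT pricing route referenced above, the snippet below prices European calls under a Variance-Gamma process with the Carr-Madan damped-FFT representation. The VG parameters, damping factor alpha, and grid settings are illustrative assumptions, not the paper's calibrated values.

```python
# Sketch: Carr-Madan FFT pricing of European calls under Variance-Gamma.
# All parameter values are illustrative assumptions.
import numpy as np

def vg_cf(u, S0, T, r, sigma, nu, theta):
    """Risk-neutral characteristic function of log S_T under Variance-Gamma."""
    omega = np.log(1.0 - theta * nu - 0.5 * sigma**2 * nu) / nu  # martingale correction
    return np.exp(1j * u * (np.log(S0) + (r + omega) * T)) * \
           (1.0 - 1j * u * theta * nu + 0.5 * sigma**2 * nu * u**2) ** (-T / nu)

def vg_call_fft(S0, K, T, r, sigma, nu, theta, alpha=1.5, N=4096, eta=0.25):
    """European call price via the Carr-Madan damped FFT representation."""
    lam = 2.0 * np.pi / (N * eta)        # log-strike spacing (tied to eta and N)
    b = 0.5 * N * lam                    # half-width of the log-strike grid
    v = np.arange(N) * eta               # Fourier-space integration grid
    psi = np.exp(-r * T) * vg_cf(v - 1j * (alpha + 1.0), S0, T, r, sigma, nu, theta) \
          / (alpha**2 + alpha - v**2 + 1j * (2.0 * alpha + 1.0) * v)
    w = eta * (3.0 - (-1.0) ** np.arange(N)) / 3.0   # Simpson quadrature weights
    w[0] = eta / 3.0
    k = -b + lam * np.arange(N)          # log-strike grid
    calls = np.exp(-alpha * k) / np.pi * np.real(np.fft.fft(np.exp(1j * v * b) * psi * w))
    return np.interp(np.log(K), k, calls)

# Example: half-year at-the-money call under assumed VG parameters
print(vg_call_fft(S0=100.0, K=100.0, T=0.5, r=0.02, sigma=0.2, nu=0.2, theta=-0.14))
```

    The FRFT variant relaxes the coupling lam * eta = 2*pi/N used above, letting the strike spacing be chosen independently of the integration grid, which is the speed/accuracy trade-off the paper investigates.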

    Local volume fraction distributions of axons, astrocytes, and myelin in deep subcortical white matter

    This study aims to statistically describe histologically stained white matter brain sections to subsequently inform and validate diffusion MRI techniques. For the first time, we characterise volume fraction distributions of three of the main structures in deep subcortical white matter (axons, astrocytes, and myelinated axons) in a representative cohort of an ageing population for which well-characterised neuropathology data are available. We analysed a set of samples from 90 subjects of the Cognitive Function and Ageing Study (CFAS), stratified into three groups of 30 subjects each in relation to the presence of age-associated deep subcortical lesions. This provides volume fraction distributions in different scenarios relevant to brain diffusion MRI in dementia. We also assess the statistically significant differences found between these groups. In agreement with previous literature, our results indicate that white matter lesions are associated with a decrease in the myelinated axon fraction and an increase in the astrocytic fraction, while no statistically significant change occurs in the mean axonal fraction. In addition, we introduce a framework to quantify volume fraction distributions from 2D immunohistochemistry images, which is validated against in silico simulations. Since a trade-off between precision and resolution emerged, we also assessed the optimal scale for computing such distributions.
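    A minimal sketch of the core quantity, assuming a pre-segmented binary stain mask and non-overlapping square windows; the window size stands in for the "scale" whose optimum the paper assesses, and the actual pipeline is more involved.

```python
# Sketch: local area-fraction distribution from a binary 2D stain mask.
# Window size and uniform tiling are illustrative assumptions.
import numpy as np

def local_fractions(mask, window):
    """Stained-pixel fraction in each non-overlapping window x window patch."""
    h, w = mask.shape
    h, w = h - h % window, w - w % window            # crop to whole windows
    patches = mask[:h, :w].reshape(h // window, window, w // window, window)
    return patches.mean(axis=(1, 3)).ravel()          # one fraction per patch

# Example: synthetic 'stain' occupying ~30% of pixels
rng = np.random.default_rng(0)
mask = (rng.random((1024, 1024)) < 0.3).astype(float)
fractions = local_fractions(mask, window=64)
print(fractions.mean(), fractions.std())              # distribution summary
```

    Shrinking the window raises spatial resolution but widens the spread of each estimated fraction, which is the precision/resolution trade-off noted in the abstract.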

    Population-based Bayesian regularization for microstructural diffusion MRI with NODDIDA.

    PURPOSE: Information on the brain microstructure can be probed by Diffusion Magnetic Resonance Imaging (dMRI). Neurite Orientation Dispersion and Density Imaging with Diffusivities Assessment (NODDIDA) is one of the simplest microstructural models proposed. However, the estimation of the NODDIDA parameters from clinically plausible dMRI acquisitions is ill-posed, and different parameter sets can describe the same measurements equally well. A few approaches to resolve this problem focused on developing better optimization strategies for this non-convex optimization. However, this fundamentally does not resolve the ill-posedness. This article introduces a Bayesian estimation framework, which is regularized through knowledge from an extensive dMRI measurement set on a population of healthy adults (henceforth the population-based prior). METHODS: We reformulate the problem as Bayesian maximum a posteriori estimation, which includes as a special case the previous approach using non-informative uniform priors. A population-based prior is estimated from 35 subjects of the MGH Adult Diffusion data (Human Connectome Project), acquired with an extensive acquisition protocol including high b-values. The accuracy and robustness of different approaches with and without the population-based prior are tested on subsets of the MGH dataset, and on an independent dataset from a clinically comparable scanner with only clinically plausible dMRI measurements. RESULTS: The population-based prior produced substantially more accurate and robust parameter estimates than the conventional uniform priors for clinically feasible protocols, without introducing any evident bias. CONCLUSIONS: The use of the proposed Bayesian population-based prior can lead to clinically feasible and robust estimation of NODDIDA parameters without changing the acquisition protocol.
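    A minimal sketch of MAP estimation with a population-based Gaussian prior. The forward model below is a toy mono-exponential stand-in, not the NODDIDA signal model; the prior mean, covariance, and noise level are assumptions.

```python
# Sketch: maximum a posteriori fit with a population-based Gaussian prior.
# Toy forward model and prior parameters are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

def map_fit(signal, forward, mu_pop, cov_pop, sigma_noise, x0):
    """Minimise the negative log-posterior: data misfit + prior penalty."""
    cov_inv = np.linalg.inv(cov_pop)
    def neg_log_post(x):
        resid = signal - forward(x)
        data_term = 0.5 * np.sum(resid**2) / sigma_noise**2
        prior_term = 0.5 * (x - mu_pop) @ cov_inv @ (x - mu_pop)
        return data_term + prior_term  # dropping prior_term recovers the uniform-prior fit
    return minimize(neg_log_post, x0, method="Nelder-Mead").x

# Toy usage: recover (amplitude, decay) from noisy mono-exponential data
b = np.linspace(0.0, 3.0, 20)                      # stand-in 'b-values'
forward = lambda x: x[0] * np.exp(-b * x[1])
truth = np.array([1.0, 0.8])
rng = np.random.default_rng(1)
signal = forward(truth) + 0.02 * rng.standard_normal(b.size)
est = map_fit(signal, forward,
              mu_pop=np.array([1.0, 1.0]),          # assumed population mean
              cov_pop=np.diag([0.25, 0.25]),        # assumed population covariance
              sigma_noise=0.02, x0=np.array([0.5, 0.5]))
print(est)
```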

    Ecological-economic assessment of the effects of freshwater flow in the Florida Everglades on recreational fisheries

    This research develops an integrated methodology to determine the economic value to anglers of recreational fishery ecosystem services in Everglades National Park that could result from different water management scenarios. The study first used bio-hydrological models to link managed freshwater inflows to indicators of fishery productivity and ecosystem health, then linked those models to anglers' willingness-to-pay for various attributes of the recreational fishing experience and monthly fishing effort. This approach allowed us to estimate the foregone economic benefits of failing to meet monthly freshwater delivery targets. The study found that the managed freshwater delivery to the Park had declined substantially over the years and had fallen short of management targets. This shortage in the flow resulted in a decline in the biological productivity of recreational fisheries in downstream coastal areas, which in turn contributed to reductions in the overall economic value of recreational ecosystem services enjoyed by anglers. The study estimated the annual value of lost recreational services at $68.81 million. The losses were greater in the months of the dry season, when the water shortage was higher and the number of anglers fishing was also higher than in the wet season. The study also developed conservative estimates of the implicit price of water for recreation, which ranged from $11.88 per acre-foot (AF) in November to $112.11 per AF in April. The annual average price was $41.54 per AF. Linking anglers' recreational preferences directly to a decision variable such as water delivery is a powerful and effective way to make management decisions.
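    As a hedged illustration of what an implicit price of water means here: foregone recreational value per acre-foot of delivery shortfall. The inputs below are hypothetical, chosen only to reproduce the study's reported November figure; they are not study data.

```python
# Hypothetical arithmetic: implicit price = foregone value / shortfall.
lost_value = 950_000.0       # hypothetical foregone angler value in November ($)
shortfall_af = 80_000.0      # hypothetical delivery shortfall (acre-feet)
print(f"${lost_value / shortfall_af:.2f} per AF")   # -> $11.88 per AF
```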

    Approximate Marginalization of Absorption and Scattering in Fluorescence Diffuse Optical Tomography

    In fluorescence diffuse optical tomography (fDOT), the reconstruction of the fluorophore concentration inside the target body is usually carried out using a normalized Born approximation model, in which the measured fluorescent emission data are scaled by the measured excitation data. One of the benefits of the model is that it can, to some extent, tolerate inaccuracy in the absorption and scattering distributions used in the construction of the forward model. In this paper, we employ the recently proposed Bayesian approximation error approach in fDOT, in combination with the normalized Born approximation model, to compensate for the modeling errors caused by the inaccurately known optical properties of the target. The approach is evaluated using a simulated test case with different amounts of error in the optical properties. The results show that the Bayesian approximation error approach improves the tolerance of fDOT imaging against modeling errors caused by inaccurately known absorption and scattering of the target.
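    A generic sketch of the Bayesian approximation error idea, using a linear toy model rather than an actual fDOT solver: sample the discrepancy between an "accurate" and a "reduced" forward model over the prior, then absorb its mean and covariance into the noise model used for reconstruction. All dimensions and noise levels are assumptions.

```python
# Sketch: Bayesian approximation error with linear stand-in forward models.
import numpy as np

rng = np.random.default_rng(0)
n_x, n_y, n_samples = 50, 40, 500

A_accurate = rng.standard_normal((n_y, n_x))                     # stand-in forward map
A_reduced = A_accurate + 0.05 * rng.standard_normal((n_y, n_x))  # wrong optical properties

# Approximation error eps = (A_accurate - A_reduced) x, sampled over the prior
X = rng.standard_normal((n_x, n_samples))
eps = (A_accurate - A_reduced) @ X
m_eps = eps.mean(axis=1)           # error mean
G_eps = np.cov(eps)                # error covariance

# Enhanced error model: measurement noise plus approximation-error statistics
sigma2 = 1e-3
G_total = sigma2 * np.eye(n_y) + G_eps

# MAP estimate with a unit Gaussian prior, reduced model, and error statistics
x_true = rng.standard_normal(n_x)
y = A_accurate @ x_true + np.sqrt(sigma2) * rng.standard_normal(n_y)
G_inv = np.linalg.inv(G_total)
x_map = np.linalg.solve(A_reduced.T @ G_inv @ A_reduced + np.eye(n_x),
                        A_reduced.T @ G_inv @ (y - m_eps))
print(np.corrcoef(x_true, x_map)[0, 1])   # rough reconstruction-quality check
```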

    Megacity pumping and preferential flow threaten groundwater quality

    Many of the world’s megacities depend on groundwater from geologically complex aquifers that are over-exploited and threatened by contamination. Here, using the example of Dhaka, Bangladesh, we illustrate how interactions between aquifer heterogeneity and groundwater exploitation jeopardize groundwater resources regionally. Groundwater pumping in Dhaka has caused large-scale drawdown that extends into outlying areas where arsenic-contaminated shallow groundwater is pervasive and has the potential to migrate downward. We evaluate the vulnerability of deep, low-arsenic groundwater with groundwater models that incorporate geostatistical simulations of aquifer heterogeneity. Simulations show that preferential flow through stratigraphy typical of fluvio-deltaic aquifers could contaminate deep (>150 m) groundwater within a decade, nearly a century faster than predicted by homogeneous models calibrated to the same data. The most critical fast flowpaths cannot be predicted by simplified models or identified by standard measurements. Such complex vulnerability beyond city limits could become a limiting factor for megacity groundwater supplies in aquifers worldwide.
    Funding: National Institute of Environmental Health Sciences, Superfund Research Program (Grant P42 ES010349); National Science Foundation (U.S.) (Grant EAR-115173)
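    The decade-versus-century contrast can be illustrated with a deliberately crude two-facies travel-time calculation: advection along a connected high-conductivity sand channel versus a homogeneous model whose effective conductivity is the geometric mean. All values are assumptions for illustration, not parameters from the Dhaka models.

```python
# Crude illustration (not the paper's model): advective travel time to 150 m
# along a connected sand channel vs. a homogeneous-equivalent aquifer.
K_sand, K_clay, f_sand = 1e-4, 1e-8, 0.5          # assumed conductivities (m/s)
K_homog = K_sand**f_sand * K_clay**(1 - f_sand)   # geometric-mean equivalent
depth, gradient, porosity = 150.0, 0.01, 0.3      # assumed depth (m), gradient, porosity
year = 365.25 * 24 * 3600

t_channel = depth * porosity / (K_sand * gradient) / year
t_homog = depth * porosity / (K_homog * gradient) / year
print(f"sand channel: {t_channel:.1f} yr vs homogeneous: {t_homog:.0f} yr")
# -> roughly a year-scale arrival vs. a century-scale prediction
```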

    Vulnerability of low-arsenic aquifers to municipal pumping in Bangladesh

    Sandy aquifers deposited >12,000 years ago, some as shallow as 30 m, have provided a reliable supply of low-arsenic (As) drinking water in rural Bangladesh. This study concerns the potential risk of contaminating these aquifers in areas surrounding the city of Dhaka, where hydraulic heads in aquifers >150 m deep have dropped by 70 m in a few decades due to municipal pumping. Water levels measured continuously from 2012 to 2014 in 12 deep (>150 m), 3 intermediate (90-150 m) and 6 shallow (<90 m) community wells, 1 shallow private well, and 1 river piezometer show that the resulting drawdown cone extends 15-35 km east of Dhaka. Water levels in 4 low-As community wells within the 62-147 m depth range closest to Dhaka were inaccessible by suction for up to a third of the year. Lateral hydraulic gradients in the deep aquifer system ranged from 1.7 × 10⁻⁴ to 3.7 × 10⁻⁴, indicating flow towards Dhaka throughout 2012-2014. Vertical recharge on the edge of the drawdown cone was estimated at 0.21 ± 0.06 m/yr. The data suggest that continued municipal pumping in Dhaka could eventually contaminate some relatively shallow community wells.
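    A quick arithmetic sketch of what these lateral gradients mean (head difference per horizontal distance); the head drop and well separation below are hypothetical, chosen only to land inside the reported range.

```python
# Hypothetical check: lateral hydraulic gradient = head difference / distance.
dh = 7.0         # assumed head difference between two deep wells (m)
dx = 25_000.0    # assumed horizontal separation (m)
print(dh / dx)   # 2.8e-4, within the reported 1.7e-4 to 3.7e-4 range
```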

    Iba-1-/CD68+ microglia are a prominent feature of age-associated deep subcortical white matter lesions.

    Deep subcortical lesions (DSCL) of the brain are present in ~60% of the ageing population and are linked to cognitive decline and depression. DSCL are associated with demyelination, blood-brain barrier (BBB) dysfunction, and microgliosis. Microglia are the main immune cell of the brain. Under physiological conditions, microglia have a ramified morphology and react to pathology with a change to a more rounded morphology as well as alterations in protein expression. This study builds on previous characterisations of DSCL and radiologically 'normal-appearing' white matter (NAWM) by performing a detailed characterisation of a range of microglial markers in addition to markers of vascular integrity. The Cognitive Function and Ageing Study (CFAS) provided control white matter (WM), NAWM and DSCL human post mortem tissue for immunohistochemistry using microglial markers (Iba-1, CD68 and MHCII), a vascular basement membrane marker (collagen IV) and markers of BBB integrity (fibrinogen and aquaporin 4). The immunoreactive profile of CD68 increased in a stepwise manner from control WM to NAWM to DSCL. This correlated with a shift from small, ramified cells to larger, more rounded microglia. While there was greater Iba-1 immunoreactivity in NAWM compared to controls, in DSCL Iba-1 levels were reduced to control levels. A prominent feature of these DSCL was a population of Iba-1-/CD68+ microglia. There were increases in collagen IV, but no change in BBB integrity. Overall, the study shows significant differences in the immunoreactive profiles of microglial markers. Whether this is a cause or an effect of lesion development remains to be elucidated. Identifying microglial subpopulations based on their morphology and molecular markers may ultimately help decipher their function and role in neurodegeneration. Furthermore, this study demonstrates that Iba-1 is not a pan-microglial marker, and that a combination of several microglial markers is required to fully characterise the microglial phenotype.

    Minimum sample size calculations for external validation of a clinical prediction model with a time-to-event outcome.

    Previous articles in Statistics in Medicine describe how to calculate the sample size required for external validation of prediction models with continuous and binary outcomes. The minimum sample size criteria aim to ensure precise estimation of key measures of a model's predictive performance, including measures of calibration, discrimination, and net benefit. Here, we extend the sample size guidance to prediction models with a time-to-event (survival) outcome, to cover external validation in datasets containing censoring. A simulation-based framework is proposed, which calculates the sample size required to target a particular confidence interval width for the calibration slope, measuring the agreement between predicted risks (from the model) and observed risks (derived using pseudo-observations to account for censoring) on the log cumulative hazard scale. Precise estimation of calibration curves, discrimination, and net benefit can also be checked in this framework. The process requires assumptions about the validation population in terms of (i) the distribution of the model's linear predictor and (ii) the event and censoring distributions. Existing information can inform this; in particular, the linear predictor distribution can be approximated using the C-index or Royston's D statistic from the model development article, together with the overall event risk. We demonstrate how the approach can be used to calculate the sample size required to validate a prediction model for recurrent venous thromboembolism. Ideally, the sample size should ensure precise calibration across the entire range of predicted risks, but it must at least ensure adequate precision in regions important for clinical decision-making. Stata and R code are provided.
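    A hedged sketch of the simulation-based idea: for a candidate sample size, simulate validation data from an assumed linear-predictor distribution, fit the calibration slope, and track the expected confidence-interval width. A binary outcome with plain logistic regression stands in for the pseudo-observation survival machinery the paper describes; all settings below are assumptions.

```python
# Sketch: expected CI width for the calibration slope at a given sample size.
import numpy as np

def expected_ci_width(n, lp_mean, lp_sd, n_sim=500, seed=0):
    """Mean 95% CI width for the calibration slope at sample size n."""
    rng = np.random.default_rng(seed)
    widths = []
    for _ in range(n_sim):
        lp = rng.normal(lp_mean, lp_sd, n)             # assumed linear predictor
        y = rng.random(n) < 1.0 / (1.0 + np.exp(-lp))  # outcomes under perfect calibration
        X = np.column_stack([np.ones(n), lp])
        beta = np.zeros(2)
        for _ in range(25):                            # Newton-Raphson logistic fit
            p = 1.0 / (1.0 + np.exp(-X @ beta))
            W = p * (1.0 - p)
            beta += np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (y - p))
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        info = X.T @ (X * (p * (1.0 - p))[:, None])    # Fisher information
        se = np.sqrt(np.linalg.inv(info)[1, 1])        # SE of the calibration slope
        widths.append(2 * 1.96 * se)
    return float(np.mean(widths))

# Increase n until the expected CI width reaches a chosen target (e.g. 0.4)
for n in (200, 500, 1000, 2000):
    print(n, round(expected_ci_width(n, lp_mean=-1.5, lp_sd=1.0), 3))
```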