
    Quantile estimation with adaptive importance sampling

    We introduce new quantile estimators with adaptive importance sampling. The adaptive estimators are based on weighted samples that are neither independent nor identically distributed. Using a new law of the iterated logarithm for martingales, we prove the convergence of the adaptive quantile estimators for general distributions with nonunique quantiles, thereby extending the work of Feldman and Tucker [Ann. Math. Statist. 37 (1966) 451--457]. We illustrate the algorithm with an example from credit portfolio risk analysis. Published in the Annals of Statistics (http://dx.doi.org/10.1214/09-AOS745) by the Institute of Mathematical Statistics (http://www.imstat.org/aos/).
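    The basic weighted-sample quantile estimator underlying such importance-sampling schemes can be sketched as follows. This is a minimal illustration, not the paper's adaptive algorithm: the function name and the toy normal example are ours.

    ```python
    import numpy as np

    def weighted_quantile(samples, weights, p):
        """Estimate the p-quantile from a weighted (importance) sample.

        The weights play the role of likelihood ratios; the estimate is the
        smallest sample value at which the cumulative normalized weight
        reaches p. With uniform weights this reduces to an ordinary
        sample quantile.
        """
        order = np.argsort(samples)
        x = np.asarray(samples, dtype=float)[order]
        w = np.asarray(weights, dtype=float)[order]
        cum = np.cumsum(w) / w.sum()
        idx = np.searchsorted(cum, p)
        return x[min(idx, len(x) - 1)]

    # Toy check on an unweighted standard normal sample: the 0.95 quantile
    # should come out near 1.645.
    rng = np.random.default_rng(0)
    xs = rng.normal(size=10_000)
    q = weighted_quantile(xs, np.ones_like(xs), 0.95)
    ```

    The adaptive estimators in the paper additionally update the sampling distribution as the simulation proceeds, which is what breaks independence and motivates the martingale analysis.
    
    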

    A dynamic look-ahead Monte Carlo algorithm for pricing Bermudan options

    Under the assumption of no-arbitrage, the pricing of American and Bermudan options can be cast as optimal stopping problems. We propose a new adaptive simulation-based algorithm for the numerical solution of optimal stopping problems in discrete time. Our approach is to recursively compute the so-called continuation values, defined as regression functions of the cash flow that would occur over a series of subsequent time periods if the approximated optimal exercise strategy were applied. We use nonparametric least squares regression estimates to approximate the continuation values from a set of sample paths which we simulate from the underlying stochastic process. The parameters of the regression estimates and the regression problems are chosen in a data-dependent manner. We present results concerning the consistency and rate of convergence of the new algorithm. Finally, we illustrate its performance by pricing high-dimensional Bermudan basket options with strangle-spread payoff based on the average of the underlying assets. Published in the Annals of Applied Probability (http://dx.doi.org/10.1214/105051607000000249) by the Institute of Mathematical Statistics (http://www.imstat.org/aap/).
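    A minimal sketch of regression-based backward induction for a one-dimensional Bermudan put may help fix ideas. Note that the paper uses nonparametric least squares with data-dependently chosen parameters; this sketch substitutes a simple cubic polynomial basis in the Longstaff-Schwartz style, and all parameter values are illustrative.

    ```python
    import numpy as np

    def bermudan_put_lsm(S0=100.0, K=100.0, r=0.05, sigma=0.2,
                         T=1.0, n_steps=10, n_paths=50_000, seed=1):
        """Price a Bermudan put by regressing continuation values on a
        polynomial basis of the asset price (a Longstaff-Schwartz-style
        simplification of the nonparametric scheme in the paper)."""
        rng = np.random.default_rng(seed)
        dt = T / n_steps
        # Simulate geometric Brownian motion paths at the exercise dates.
        z = rng.standard_normal((n_paths, n_steps))
        S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                                  + sigma * np.sqrt(dt) * z, axis=1))
        cash = np.maximum(K - S[:, -1], 0.0)       # payoff at maturity
        for t in range(n_steps - 2, -1, -1):
            cash *= np.exp(-r * dt)                # discount one step back
            itm = (K - S[:, t]) > 0                # regress on in-the-money paths
            if itm.any():
                basis = np.vander(S[itm, t], 4)    # cubic polynomial basis
                coef, *_ = np.linalg.lstsq(basis, cash[itm], rcond=None)
                cont = basis @ coef                # estimated continuation value
                exercise = np.maximum(K - S[itm, t], 0.0)
                cash[itm] = np.where(exercise > cont, exercise, cash[itm])
        return np.exp(-r * dt) * cash.mean()

    price = bermudan_put_lsm()
    ```

    The continuation-value regression at each date is exactly the quantity the paper approximates; the data-dependent choice of the regression estimate is the part this sketch omits.
    
    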

    The Term Structure of Variance Swap Rates and Optimal Variance Swap Investments

    This paper performs specification analysis on the term structure of variance swap rates on the S&P 500 index and studies the optimal investment decision on the variance swaps and the stock index. The analysis identifies two stochastic variance risk factors, which govern the short and long end of the variance swap term structure variation, respectively. The highly negative estimate for the market price of variance risk makes it optimal for an investor to take short positions in a short-term variance swap contract, long positions in a long-term variance swap contract, and short positions in the stock index.

    Assessing within-subject rates of change of placental MRI diffusion metrics in normal pregnancy

    Purpose: Studying placental development informs when development is abnormal. Most placental MRI studies are cross-sectional and do not study the extent of individual variability throughout pregnancy. We aimed to explore how diffusion MRI measures of placental function and microstructure vary in individual healthy pregnancies throughout gestation. Methods: Seventy-nine pregnant, low-risk participants (17 scanned twice and 62 scanned once) were included. T2-weighted anatomical imaging and a combined multi-echo spin-echo diffusion-weighted sequence were acquired at 3 T. Combined diffusion-relaxometry model fits were performed using both a T2*-ADC model and a bicompartmental T2*-intravoxel-incoherent-motion (T2*-IVIM) model. Results: There was a significant decline in placental T2* and ADC (both P < 0.01) over gestation. These declines were consistent in individuals for T2* (covariance = -0.47), but not for ADC (covariance = -1.04). The T2*-IVIM model identified a consistent decline in individuals over gestation in T2* from both the perfusing and diffusing placental compartments, but not in ADC values from either. The placental perfusing compartment fraction increased over gestation (P = 0.0017), but this increase was not consistent in individuals (covariance = 2.57). Conclusion: Whole placental T2* and ADC values decrease over gestation, although only T2* values showed consistent trends within subjects. There was minimal individual variation in rates of change of T2* values from the perfusing and diffusing placental compartments, whereas trends in ADC values from these compartments were less consistent. These findings probably relate to the increased complexity of the bicompartmental model and to differences in how different placental regions evolve at a microstructural level. These placental MRI metrics from low-risk pregnancies provide a useful benchmark for clinical cohorts.
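    The bicompartmental intravoxel-incoherent-motion signal model mentioned above is a standard two-exponential decay over b-values. A hedged sketch of fitting it to synthetic, noise-free data follows; the b-values and parameter values are illustrative, not the study's actual acquisition or pipeline, and the relaxometry (T2*) component is omitted for brevity.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def ivim(b, f, d_star, d):
        """Bicompartmental IVIM signal (S0 normalized to 1): a perfusing
        fraction f with fast pseudo-diffusion d_star plus a tissue
        compartment with diffusion coefficient d."""
        return f * np.exp(-b * d_star) + (1 - f) * np.exp(-b * d)

    # Synthetic acquisition: b-values in s/mm^2, illustrative parameters.
    b = np.array([0, 50, 100, 200, 400, 600], dtype=float)
    true = (0.3, 0.05, 0.002)   # f, D* (mm^2/s), D (mm^2/s)
    signal = ivim(b, *true)

    # Bounds keep the fast (D*) and slow (D) compartments separated.
    popt, _ = curve_fit(ivim, b, signal, p0=(0.2, 0.02, 0.001),
                        bounds=([0, 0.003, 0], [1, 1, 0.003]))
    ```

    In practice the fit is run voxel-wise (or region-wise) on noisy data, which is what makes the bicompartmental parameters less stable than a monoexponential ADC, as the conclusion notes.
    
    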

    On Optimal Stopping and Impulse Control with Constraint

    The optimal stopping and impulse control problems for a Markov-Feller process are considered when the controls are allowed only when a signal arrives. These are referred to as control problems with constraint. In [28, 29, 30], the HJB equation was solved and an optimal control (for the optimal stopping problem, the discounted impulse control problem and the ergodic impulse control problem, respectively) was obtained, under suitable conditions, including a setting on a compact metric state space. In this work, we extend most of the results to the situation where the state space of the Markov process is locally compact.

    Association of Interprofessional Discharge Planning Using an Electronic Health Record Tool With Hospital Length of Stay Among Patients with Multimorbidity: A Nonrandomized Controlled Trial

    Whether interprofessional collaboration is effective and safe in decreasing hospital length of stay remains controversial. We evaluated the outcomes and safety associated with an electronic interprofessional-led discharge planning tool vs standard discharge planning to safely reduce length of stay among medical inpatients with multimorbidity. This multicenter prospective nonrandomized controlled trial used interrupted time series analysis to examine medical acute hospitalizations at 82 hospitals in Switzerland. It was conducted from February 2017 through January 2019; data analysis was conducted from March 2021 to July 2022. After a 12-month preintervention phase (February 2017 through January 2018), an electronic interprofessional-led discharge planning tool was implemented in February 2018 in 7 intervention hospitals in addition to standard discharge planning. Mixed-effects segmented regression analyses were used to compare monthly changes in trends of length of stay, hospital readmission, in-hospital mortality, and facility discharge after the implementation of the tool with changes in trends among control hospitals. There were 54 695 hospitalizations at intervention hospitals, with 27 219 in the preintervention period (median [IQR] age, 72 [59-82] years; 14 400 [52.9%] men) and 27 476 in the intervention phase (median [IQR] age, 72 [59-82] years; 14 448 [52.6%] men), and 438 791 at control hospitals, with 216 261 in the preintervention period (median [IQR] age, 74 [60-83] years; 109 770 [50.8%] men) and 222 530 in the intervention phase (median [IQR] age, 74 [60-83] years; 113 053 [50.8%] men). The mean (SD) length of stay in the preintervention phase was 7.6 (7.1) days for intervention hospitals and 7.5 (7.4) days for control hospitals.
    During the preintervention phase, population-averaged length of stay decreased in control hospitals (slope, -0.344 hr/mo; 95% CI, -0.599 to -0.090 hr/mo), whereas no change in trend was observed among intervention hospitals (slope, -0.034 hr/mo; 95% CI, -0.646 to 0.714 hr/mo; difference in slopes, P = .09). Over the intervention phase (February 2018 through January 2019), length of stay remained unchanged in control hospitals (slope, -0.011 hr/mo; 95% CI, -0.281 to 0.260 hr/mo; change in slope, P = .03), but decreased steadily among intervention hospitals (slope, -0.879 hr/mo; 95% CI, -1.607 to -0.150 hr/mo; change in slope, P = .04; difference in slopes, P = .03). Safety analyses showed no change in trends of hospital readmission, in-hospital mortality, or facility discharge over the whole study period. In this nonrandomized controlled trial, the implementation of an electronic interprofessional-led discharge planning tool was associated with a decline in length of stay without an increase in hospital readmission, in-hospital mortality, or facility discharge. Trial registration: isrctn.org Identifier: ISRCTN83274049.
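    The mixed-effects segmented regression used in the trial is more elaborate than can be shown briefly, but the core interrupted-time-series design matrix (baseline trend, level change after the intervention, slope change after the intervention) can be sketched on illustrative data; the numbers below are made up for the example, not the trial's data.

    ```python
    import numpy as np

    # Illustrative monthly mean length-of-stay series (days) with a trend
    # break after an intervention at month 12.
    months = np.arange(24)
    post = (months >= 12).astype(float)
    rng = np.random.default_rng(42)
    los = (7.6 - 0.005 * months              # baseline trend
           - 0.10 * post * (months - 12)     # extra post-intervention slope
           + rng.normal(0, 0.02, size=24))   # noise

    # Design matrix: intercept, baseline trend, level change, slope change.
    X = np.column_stack([np.ones_like(months, dtype=float), months,
                         post, post * (months - 12)])
    coef, *_ = np.linalg.lstsq(X, los, rcond=None)
    slope_change = coef[3]   # change in slope after the intervention
    ```

    The trial's analysis additionally includes hospital-level random effects and a concurrent control series, which is what lets it compare changes in trends between intervention and control hospitals.
    
    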

    The Bari Manifesto : An interoperability framework for essential biodiversity variables

    Essential Biodiversity Variables (EBV) are fundamental variables that can be used for assessing biodiversity change over time, for determining adherence to biodiversity policy, for monitoring progress towards sustainable development goals, and for tracking biodiversity responses to disturbances and management interventions. Data from observations or models that provide measured or estimated EBV values, which we refer to as EBV data products, can help to capture the above processes and trends and can serve as a coherent framework for documenting trends in biodiversity. Using primary biodiversity records and other raw data as sources to produce EBV data products depends on cooperation and interoperability among multiple stakeholders, including those collecting and mobilising data for EBVs and those producing, publishing and preserving EBV data products. Here, we encapsulate ten principles for the current best practice in EBV-focused biodiversity informatics as 'The Bari Manifesto', serving as implementation guidelines for data and research infrastructure providers to support the emerging EBV operational framework based on trans-national and cross-infrastructure scientific workflows. The principles provide guidance on how to contribute towards the production of EBV data products that are globally oriented, while remaining appropriate to the producer's own mission, vision and goals. These ten principles cover: data management planning; data structure; metadata; services; data quality; workflows; provenance; ontologies/vocabularies; data preservation; and accessibility. For each principle, desired outcomes and goals have been formulated. Some specific actions related to fulfilling the Bari Manifesto principles are highlighted in the context of each of four groups of organizations contributing to enabling data interoperability - data standards bodies, research data infrastructures, the pertinent research communities, and funders. 
    The Bari Manifesto provides a roadmap enabling support for routine generation of EBV data products, and increases the likelihood of success for a global EBV framework. Peer reviewed.

    Building essential biodiversity variables (EBVs) of species distribution and abundance at a global scale

    Much biodiversity data is collected worldwide, but it remains challenging to assemble the scattered knowledge for assessing biodiversity status and trends. The concept of Essential Biodiversity Variables (EBVs) was introduced to structure biodiversity monitoring globally, and to harmonize and standardize biodiversity data from disparate sources to capture a minimum set of critical variables required to study, report and manage biodiversity change. Here, we assess the challenges of a 'Big Data' approach to building global EBV data products across taxa and spatiotemporal scales, focusing on species distribution and abundance. The majority of currently available data on species distributions derives from incidentally reported observations or from surveys where presence-only or presence-absence data are sampled repeatedly with standardized protocols. Most abundance data come from opportunistic population counts or from population time series using standardized protocols (e.g. repeated surveys of the same population from single or multiple sites). Enormous complexity exists in integrating these heterogeneous, multi-source data sets across space, time, taxa and different sampling methods. Integration of such data into global EBV data products requires correcting biases introduced by imperfect detection and varying sampling effort, dealing with different spatial resolution and extents, harmonizing measurement units from different data sources or sampling methods, applying statistical tools and models for spatial inter- or extrapolation, and quantifying sources of uncertainty and errors in data and models. To support the development of EBVs by the Group on Earth Observations Biodiversity Observation Network (GEO BON), we identify 11 key workflow steps that will operationalize the process of building EBV data products within and across research infrastructures worldwide. 
    These workflow steps take multiple sequential activities into account, including identification and aggregation of various raw data sources, data quality control, taxonomic name matching and statistical modelling of integrated data. We illustrate these steps with concrete examples from existing citizen science and professional monitoring projects, including eBird, the Tropical Ecology Assessment and Monitoring network, the Living Planet Index and the Baltic Sea zooplankton monitoring. The identified workflow steps are applicable to both terrestrial and aquatic systems and a broad range of spatial, temporal and taxonomic scales. They depend on clear, findable and accessible metadata, and we provide an overview of current data and metadata standards. Several challenges remain to be solved for building global EBV data products: (i) developing tools and models for combining heterogeneous, multi-source data sets and filling data gaps in geographic, temporal and taxonomic coverage, (ii) integrating emerging methods and technologies for data collection such as citizen science, sensor networks, DNA-based techniques and satellite remote sensing, (iii) solving major technical issues related to data product structure, data storage, execution of workflows and the production process/cycle as well as approaching technical interoperability among research infrastructures, (iv) allowing semantic interoperability by developing and adopting standards and tools for capturing consistent data and metadata, and (v) ensuring legal interoperability by endorsing open data or data that are free from restrictions on use, modification and sharing. Addressing these challenges is critical for biodiversity research and for assessing progress towards conservation policy targets and sustainable development goals.