Forecasts for the Attainment of Major Research Milestones in Parkinson's Disease.
BACKGROUND: Projections about when research milestones will be attained are often of interest to patients and can help inform decisions about research funding and health system planning. OBJECTIVE: To collect aggregated expert forecasts on the attainment of 11 major research milestones in Parkinson's disease (PD). METHODS: Experts were asked to provide predictions about the attainment of 11 milestones in PD research in an online survey. PD experts were identified from: 1) the Michael J. Fox Foundation for Parkinson's Research database, 2) doctors specializing in PD at top-ranked neurology centers in the US and Canada, and 3) corresponding authors of articles on PD in top medical journals. Judgments were aggregated using coherence weighting. We tested the relationship between demographic variables and individual judgments using linear regression. RESULTS: 249 PD experts completed the survey. In the aggregate, experts believed that new treatments such as gene therapy for monogenic PD, immunotherapy, and cell therapy had 56.1%, 59.7%, and 66.6% probability, respectively, of progressing in the clinical approval process within the next 10 years. Milestones involving existing management approaches, such as the approval of a deep brain stimulation device or a body-worn sensor, had 78.4% and 82.2% probability of occurring within the next 10 years. Demographic factors were unable to explain deviations from the aggregate forecast (R² = 0.029). CONCLUSIONS: Aggregated expert opinion suggests that milestones for the advancement of new treatment options for PD are still many years away. However, other improvements in PD diagnosis and management are believed to be near at hand.
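The aggregation step described above can be sketched as a weighted average of expert probability judgments. Note this is only an illustrative sketch: the study used coherence weighting, which derives each expert's weight from the internal consistency of their probability judgments, whereas the weights below are generic placeholders.

```python
# Hypothetical sketch of aggregating expert probability forecasts.
# The study used coherence weighting; the weighting here is a generic
# placeholder, not the paper's actual method.

def aggregate_forecasts(probabilities, weights=None):
    """Weighted average of expert probability judgments (each in 0-1)."""
    if weights is None:
        weights = [1.0] * len(probabilities)  # fall back to unweighted mean
    total = sum(weights)
    return sum(p * w for p, w in zip(probabilities, weights)) / total

# Five hypothetical experts judging one milestone:
judgments = [0.50, 0.62, 0.55, 0.70, 0.58]
print(round(aggregate_forecasts(judgments), 3))
```

With equal weights this reduces to the simple mean; coherence weighting would instead upweight experts whose probabilities are internally consistent (e.g. complementary events summing to 1).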
Forecast combinations: an over 50-year review
Forecast combinations have flourished remarkably in the forecasting community
and, in recent years, have become part of the mainstream of forecasting
research and activities. Combining multiple forecasts produced from single
(target) series is now widely used to improve accuracy through the integration
of information gleaned from different sources, thereby mitigating the risk of
identifying a single "best" forecast. Combination schemes have evolved from
simple combination methods without estimation, to sophisticated methods
involving time-varying weights, nonlinear combinations, correlations among
components, and cross-learning. They include combining point forecasts and
combining probabilistic forecasts. This paper provides an up-to-date review of
the extensive literature on forecast combinations, together with reference to
available open-source software implementations. We discuss the potential and
limitations of various methods and highlight how these ideas have developed
over time. Some important issues concerning the utility of forecast
combinations are also surveyed. Finally, we conclude with current research gaps
and potential insights for future research.
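The contrast the review draws between simple combination methods without estimation and weighted schemes can be illustrated with a minimal sketch (an assumed example, not drawn from any particular combination scheme in the literature): an equal-weight average of point forecasts versus weights estimated from each model's historical error.

```python
# Minimal sketch of point-forecast combination: equal weights versus
# inverse-error weighting. Model forecasts and error values are
# illustrative assumptions.

def simple_average(forecasts):
    """Equal-weight combination: no estimation required."""
    return sum(forecasts) / len(forecasts)

def inverse_error_weighted(forecasts, past_errors):
    """Weight each component by the inverse of its historical error
    (e.g. past MAE), so more accurate models contribute more."""
    weights = [1.0 / e for e in past_errors]
    total = sum(weights)
    return sum(f * w for f, w in zip(forecasts, weights)) / total

models = [102.0, 98.0, 105.0]   # point forecasts from three models
errors = [2.0, 1.0, 4.0]        # hypothetical past MAE of each model
print(simple_average(models))
print(inverse_error_weighted(models, errors))
```

The equal-weight average is the classic hard-to-beat baseline; weighted schemes like the second function only pay off when the error estimates are stable enough to be informative.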
Global Portfolio Optimization Revisited: A Least Discrimination Alternative to Black-Litterman
Global portfolio optimization models rank among the proudest achievements of modern finance theory, but practitioners are still struggling to put them to work. In 1992, Black and Litterman put the problem down to difficulties portfolio managers have in extrapolating views about some expected asset returns into full probabilistic forecasts about all asset returns and proposed a method to alleviate this problem. We propose a more general method based on a least discrimination (LD) principle. It produces a probabilistic forecast that remains true to personal views but is otherwise as close as possible to the forecast implied by a reference portfolio. The LD method produces optimal portfolios for a variety of views, including views on volatility and correlation, in which case optimal portfolios include option-like pay-offs. It also justifies a simple linear interpolation between market and personal forecasts, should a compromise be reached.
Keywords: global portfolio optimization, Black-Litterman model, least discrimination, utility theory, mean-variance analysis, relative entropy, generalized relative entropy, non-linear pay-offs
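The linear-interpolation special case mentioned in the abstract can be sketched as a convex combination of market-implied and personal expected returns. The variable names, return values, and blending parameter below are illustrative assumptions, not the paper's notation.

```python
# Hedged sketch of blending a market-implied forecast with personal views
# via linear interpolation. All numbers are made-up illustrations.

def blend_forecasts(market, views, tau):
    """Convex combination of expected-return vectors:
    tau = 0 keeps the market-implied forecast,
    tau = 1 adopts the personal views entirely."""
    return [(1 - tau) * m + tau * v for m, v in zip(market, views)]

market_mu = [0.04, 0.06, 0.05]   # equilibrium (reference) expected returns
my_views  = [0.07, 0.06, 0.03]   # personal expected returns
print(blend_forecasts(market_mu, my_views, 0.5))
```

The full LD method goes well beyond this: it minimizes relative entropy against the reference forecast subject to the stated views, which only collapses to a linear blend in the simple compromise case the abstract mentions.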
Correcting Judgment Correctives in National Security Intelligence
Intelligence analysts, like other professionals, form norms that define standards of tradecraft excellence. These norms, however, have evolved in an idiosyncratic manner that reflects the influence of prominent insiders who had keen psychological insights but little appreciation for how to translate those insights into testable hypotheses. The net result is that the prevailing tradecraft norms of best practice are only loosely grounded in the science of judgment and decision-making. The "common sense" of prestigious opinion leaders inside the intelligence community has pre-empted systematic validity testing of the training techniques and judgment aids endorsed by those opinion leaders. Drawing on the scientific literature, we advance hypotheses about how current best practices could well be reducing rather than increasing the quality of analytic products. One set of hypotheses pertains to the failure of tradecraft training to recognize the most basic threat to accuracy: measurement error in the interpretation of the same data and in the communication of interpretations. Another set of hypotheses focuses on the insensitivity of tradecraft training to the risk that issuing broad-brush, one-directional warnings against bias (e.g., over-confidence) will be less likely to encourage self-critical, deliberative cognition than simple response-threshold shifting that yields the mirror-image bias (e.g., under-confidence). Given the magnitude of the consequences of better and worse intelligence analysis flowing to policy-makers, we see a compelling case for greater funding of efforts to test what actually works.
Data-Centric Epidemic Forecasting: A Survey
The COVID-19 pandemic has brought forth the importance of epidemic
forecasting for decision makers in multiple domains, ranging from public health
to the economy as a whole. While forecasting epidemic progression is frequently
conceptualized as analogous to weather forecasting, it has some
key differences and remains a non-trivial task. The spread of diseases is
subject to multiple confounding factors spanning human behavior, pathogen
dynamics, weather and environmental conditions. Research interest has been
fueled by the increased availability of rich data sources capturing previously
unobservable facets and also due to initiatives from government public health
and funding agencies. This has resulted, in particular, in a spate of work on
'data-centered' solutions which have shown potential in enhancing our
forecasting capabilities by leveraging non-traditional data sources as well as
recent innovations in AI and machine learning. This survey delves into various
data-driven methodological and practical advancements and introduces a
conceptual framework to navigate through them. First, we enumerate the large
number of epidemiological datasets and novel data streams that are relevant to
epidemic forecasting, capturing various factors like symptomatic online
surveys, retail and commerce, mobility, genomics data and more. Next, we
discuss methods and modeling paradigms focusing on the recent data-driven
statistical and deep-learning based methods as well as on the novel class of
hybrid models that combine domain knowledge of mechanistic models with the
effectiveness and flexibility of statistical approaches. We also discuss
experiences and challenges that arise in real-world deployment of these
forecasting systems including decision-making informed by forecasts. Finally,
we highlight some challenges and open problems found across the forecasting
pipeline.
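The mechanistic component that the hybrid models above combine with statistical approaches is typically a compartmental model. A minimal sketch, assuming a basic discrete-time SIR model with made-up parameter values (not taken from the survey):

```python
# Illustrative sketch of the mechanistic core a hybrid epidemic model
# might embed: one discrete-time (Euler) step of an SIR model.
# beta, gamma, and the population sizes are assumed toy values.

def sir_step(s, i, r, beta, gamma, n):
    """Advance susceptible/infectious/recovered counts by one day."""
    new_infections = beta * s * i / n   # transmission term
    new_recoveries = gamma * i          # recovery term
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

s, i, r = 990.0, 10.0, 0.0              # toy population of 1000
for _ in range(30):
    s, i, r = sir_step(s, i, r, beta=0.3, gamma=0.1, n=1000.0)
print(round(i, 1))                      # infectious count after 30 days
```

In a hybrid setup, a statistical or deep-learning layer would then correct this trajectory against observed surveillance data rather than trusting the mechanistic dynamics alone.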
Calibrating Expert Assessments of Advanced Aerospace Technology Adoption Impact
This dissertation describes the development of expert judgment calibration methodology as part of elicitation of the expert judgments to assist in the task of quantifying parameter uncertainty for proposed new aerospace vehicles. From previous work, it has been shown that experts in the field of aerospace systems design and development can provide valuable input into the sizing and conceptual design of future space launch vehicles employing advanced technology. In particular (and of specific interest in this case), assessment of the operations and support cost implications of adopting proposed new technology is frequently asked of the experts. Often this input, consisting of estimates and opinions, is imprecise and may be offered with less than a high degree of confidence in its efficacy. Since the sizing and design of advanced space or launch vehicles must ultimately have costs attached to them (for subsequent program advocacy and tradeoff studies), the lack of precision in parameter estimates will be detrimental to the development of viable cost models to support the advocacy and tradeoffs. It is postulated that a system which could accurately apply a measure of calibration to the imprecise and/or low-confidence estimates of the surveyed experts would greatly enhance the derived parametric data. The development of such a calibration aid has been the thrust of this effort. Bayesian network methodology, augmented by uncertainty modeling and aggregation techniques, among others, was employed in the tool construction. Appropriate survey questionnaire instruments were compiled for use in acquiring the experts' input; the responses served as input to a test case for validation of the resulting calibration model. The derived techniques were applied as part of a larger expert assessment elicitation and aggregation study. Results of this research show that calibration of expert judgments, particularly for far-term events, appears to be possible.
Suggestions for refinement and extension of the development are presented.
The Competing Hypotheses Analytical Model and Human Intelligence Single-Source Analysis
Abstract
The article examines some particular aspects of the analytical process within the intelligence cycle, with strategic intelligence as its frame of reference. Starting from a proposed model of analysis of competing hypotheses using phase-tailored tools, intended to improve the quality of all-source intelligence analysis and its final products, we further assess its applicability in HUMINT (Human Intelligence) analysis. The model of intelligence analysis as a problem-solving method, with a focus on predictive analysis, serves to clarify what is expected of single-source collection disciplines (in our case, HUMINT) in data gathering and reporting, in connection with the roles of HUMINT analysts in the specialized branches.
Machine Learning for Load Profile Data Analytics and Short-term Load Forecasting
Short-term load forecasting (STLF) is a key issue for the operation and dispatch of the day-ahead energy market. It is a prerequisite for the economic operation of power systems and the basis for dispatching and for making startup-shutdown plans, playing a key role in the automatic control of power systems. Accurate load forecasting not only helps users choose a more appropriate electricity consumption scheme and reduce electricity costs, but also helps optimize power system resources, improve equipment utilization, reduce production costs, increase economic benefit, and improve power supply capability, ultimately serving the aim of an efficient demand response program. This thesis outlines machine learning based, data-driven models for STLF in the smart grid. It also presents relevant policies, the current status of the field, and future research directions for developing new STLF models. The thesis covers three projects on load profile data analytics and machine learning based STLF models. The first project is load profile classification and determination of load demand variability, with the aim of estimating the load demand of a customer. Load profile data collected from smart meters are classified using the recently developed extended nearest neighbor (ENN) algorithm. Generalized class-wise statistics are calculated to characterize the load demand variability of a customer, and the load demand of a particular customer is then estimated from the generalized class-wise statistics together with the maximum and minimum load demand. In the second project, a composite ENN model is proposed for STLF, intended to improve the performance of k-nearest neighbor (kNN) based STLF models. Three individual models are developed to process weather data (i.e., temperature), social variables, and load demand data.
The load demand is predicted separately for each input variable, and the final forecast is the weighted average of the three models, with weights determined by the change in generalized class-wise statistics. This project provides a significant improvement in load forecasting accuracy compared to kNN-based models. In the third project, an advanced data-driven model is developed: a novel hybrid load forecasting model based on signal decomposition and correlation analysis. The hybrid model combines improved empirical mode decomposition with T-Copula based correlation analysis, and a deep belief network is employed to produce the load demand forecast. The results are compared with previous studies and show a significant improvement in mean absolute percentage error (MAPE) and root mean square error (RMSE).
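The two accuracy metrics the thesis reports, MAPE and RMSE, can be sketched directly from their definitions. The toy load values below are assumptions for illustration, not results from the thesis.

```python
# Sketch of the two forecast-accuracy metrics mentioned above.
# Actual and forecast values are made-up toy data.
import math

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs(a - f) / abs(a)
                       for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    """Root mean square error, in the units of the data."""
    return math.sqrt(sum((a - f) ** 2
                         for a, f in zip(actual, forecast)) / len(actual))

actual   = [100.0, 110.0, 95.0, 120.0]   # observed load (e.g. MW)
forecast = [ 98.0, 113.0, 97.0, 118.0]   # model output (e.g. MW)
print(round(mape(actual, forecast), 2))
print(round(rmse(actual, forecast), 2))
```

MAPE is scale-free, which makes it convenient for comparing customers with different demand levels, while RMSE penalizes large errors more heavily; reporting both, as the thesis does, covers both perspectives.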