
    Statistical modelling under epistemic data imprecision: some results on estimating multinomial distributions and logistic regression for coarse categorical data

    Paper presented at 9th International Symposium on Imprecise Probability: Theories and Applications, Pescara, Italy, 2015. Abstract: The paper deals with parameter estimation for categorical data under epistemic data imprecision, where for a part of the data only coarse(ned) versions of the true values are observable. For different observation models formalizing the information available on the coarsening process, we derive the (typically set-valued) maximum likelihood estimators of the underlying distributions. We discuss the homogeneous case of independent and identically distributed variables as well as logistic regression under a categorical covariate. We start with the imprecise point estimator under an observation model describing the coarsening process without any further assumptions. We then determine several sensitivity parameters that allow the refinement of the estimators in the presence of auxiliary information.
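    The set-valued estimation idea in the abstract above can be illustrated with a minimal sketch (not the paper's exact estimator): with two categories and some observations recorded only coarsely, every allocation of the coarse counts is compatible with the data, so the relative-frequency estimate becomes an interval. All names and counts below are hypothetical.

```python
def set_valued_phat(n_a, n_b, n_ab):
    """Set-valued relative-frequency estimate of p_a when n_ab
    observations are known only to lie in {a, b} and no assumption
    is made about the coarsening process."""
    n = n_a + n_b + n_ab
    lower = n_a / n            # all coarse observations were actually "b"
    upper = (n_a + n_ab) / n   # all coarse observations were actually "a"
    return lower, upper

lo, hi = set_valued_phat(30, 50, 20)  # -> (0.3, 0.5)
```

    Auxiliary information about the coarsening (the "sensitivity parameters" mentioned above) would tighten this interval toward a point estimate.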

    Effect of Beloved Person's Voice on Chest Tube Removal Pain in Patients undergoing Open Heart Surgery: Fuzzy Logistic Regression Model

    Chest tube removal pain is one of the important complications after open heart surgery. The removal of a chest tube is a painful and frightening experience and should be managed with as little pain and distress as possible. The aim of this study is to assess the effect of a beloved person's voice on chest tube removal pain in patients undergoing open heart surgery. 128 patients were randomly assigned to two groups: one group listened to a beloved person's voice during the procedure, and the other did not. Since pain was measured in linguistic terms, a fuzzy logistic regression was applied for modeling. After controlling for the potential confounders, the fuzzy logistic regression showed that the beloved person's voice reduced the risk of pain. Therefore, a beloved person's voice could be an effective, inexpensive, and safe means of distraction and pain reduction.
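    Linguistic pain ratings of the kind used in the study above are commonly encoded as triangular fuzzy numbers before fitting a fuzzy logistic regression. A minimal sketch follows; the term and support values are illustrative, not from the study.

```python
def triangular_mf(x, a, b, c):
    """Membership degree of x in the triangular fuzzy number (a, b, c),
    rising linearly from a to the peak b, then falling to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical encoding of "moderate pain" on a 0-10 scale,
# peaking at 5 with support (3, 7).
triangular_mf(5.0, 3.0, 5.0, 7.0)  # -> 1.0
triangular_mf(4.0, 3.0, 5.0, 7.0)  # -> 0.5
```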

    Vol. 15, No. 1 (Full Issue)


    Study of Discrete Choice Models and Adaptive Neuro-Fuzzy Inference System in the Prediction of Economic Crisis Periods in USA

    In this study two approaches are applied for the prediction of economic recession or expansion periods in the USA. The first approach comprises Logit and Probit models and the second is an Adaptive Neuro-Fuzzy Inference System (ANFIS) with Gaussian and Generalized Bell membership functions. The in-sample period 1950-2006 is examined and the forecasting performance of the two approaches is evaluated over the out-of-sample period 2007-2010. The estimation results show that the ANFIS model outperforms the Logit and Probit models. This indicates that the neuro-fuzzy model provides a better and more reliable signal on whether or not a financial crisis will take place.
    Keywords: ANFIS, Discrete Choice Models, Error Back-propagation, Financial Crisis, Fuzzy Logic, US Economy
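    The two membership-function families named in the abstract above have standard closed forms, sketched here with illustrative parameter values (not estimated from the study's data).

```python
import math

def gaussian_mf(x, c, sigma):
    """Gaussian membership function centered at c with width sigma,
    one of the two fuzzification choices compared in the study."""
    return math.exp(-((x - c) ** 2) / (2 * sigma ** 2))

def gbell_mf(x, a, b, c):
    """Generalized Bell membership function, the other family compared:
    a controls width, b the slope of the shoulders, c the center."""
    return 1.0 / (1.0 + abs((x - c) / a) ** (2 * b))

# Both functions reach their maximum of 1.0 at the center.
gaussian_mf(0.0, 0.0, 1.0)   # -> 1.0
gbell_mf(4.0, 2.0, 4.0, 4.0) # -> 1.0
```

    In an ANFIS, the parameters of these functions are tuned jointly with the rule consequents, typically by error back-propagation as the keywords indicate.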

    Applications and Modeling Techniques of Wind Turbine Power Curve for Wind Farms - A Review

    In the wind energy industry, the power curve represents the relationship between the "wind speed" at the hub height and the corresponding "active power" to be generated. It is the most versatile condition indicator and of vital importance in several key applications, such as wind turbine selection, capacity factor estimation, wind energy assessment and forecasting, and condition monitoring, among others. Ensuring an effective implementation of the aforementioned applications mostly requires a modeling technique that best approximates the normal properties of optimal wind turbine operation in a particular wind farm. This challenge has drawn the attention of wind farm operators and researchers towards the "state of the art" in wind energy technology. This paper provides an exhaustive and updated review of power curve based applications, the most common anomaly and fault types including their root causes, along with data preprocessing and correction schemes (i.e., filtering, clustering, isolation, and others), and modeling techniques (i.e., parametric and non-parametric) which cover a wide range of algorithms. More than 100 references, for the most part selected from recently published journal articles, were carefully compiled to properly assess the past, present, and future research directions in this active domain.
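    A common parametric power-curve form of the kind such reviews survey is a logistic sigmoid mapping wind speed to active power. This is a sketch of that one family, not the review's recommended model; all parameter values are illustrative.

```python
import math

def logistic_power_curve(v, p_rated=2000.0, k=0.7, v_mid=9.0):
    """Modeled active power (kW) at hub-height wind speed v (m/s):
    output rises sigmoidally from near zero to the rated power
    p_rated, with inflection at v_mid and steepness k."""
    return p_rated / (1.0 + math.exp(-k * (v - v_mid)))

# At the inflection speed the modeled output is half of rated power.
logistic_power_curve(9.0)  # -> 1000.0
```

    In practice the parameters would be fitted to filtered SCADA data, after the preprocessing steps (filtering, clustering, outlier isolation) the review describes; non-parametric alternatives such as spline or kernel smoothers make no fixed-shape assumption.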

    Recent Developments in the Econometrics of Program Evaluation

    Many empirical questions in economics and other social sciences depend on causal effects of programs or policies. In the last two decades much research has been done on the econometric and statistical analysis of the effects of such programs or treatments. This recent theoretical literature has built on, and combined features of, earlier work in both the statistics and econometrics literatures. It has by now reached a level of maturity that makes it an important tool in many areas of empirical research in economics, including labor economics, public finance, development economics, industrial organization, and other areas of empirical micro-economics. In this review we discuss some of the recent developments. We focus primarily on practical issues for empirical researchers, provide a historical overview of the area, and give references to more technical research.
    Keywords: program evaluation, causality, unconfoundedness, Rubin Causal Model, potential outcomes, instrumental variables

    Proactive Assessment of Accident Risk to Improve Safety on a System of Freeways, Research Report 11-15

    This report describes the development and evaluation of real-time crash risk-assessment models for four freeway corridors: U.S. Route 101 NB (northbound) and SB (southbound) and Interstate 880 NB and SB. Crash data for these freeway segments for the 16-month period from January 2010 through April 2011 are used to link historical crash occurrences with real-time traffic patterns observed through loop-detector data. The crash risk-assessment models are based on a binary classification approach (crash and non-crash outcomes), with traffic parameters measured at surrounding vehicle detection station (VDS) locations as the independent variables. The analysis techniques used in this study are logistic regression and classification trees. Prior to developing the models, data-related issues such as data cleaning and aggregation were addressed. The modeling efforts revealed that the turbulence resulting from speed variation is significantly associated with crash risk on the U.S. 101 NB corridor. The models estimated with data from U.S. 101 NB were evaluated on the basis of their classification performance, not only on U.S. 101 NB, but also on the other three freeway segments for transferability assessment. It was found that the predictive model derived from one freeway can be readily applied to other freeways, although the classification performance decreases. The models that transfer best to other roadways were determined to be those that use the fewest VDSs; that is, those that use one upstream or downstream station rather than two or three. The classification accuracy of the models is discussed in terms of how the models can be used for real-time crash risk assessment. The models can be applied to developing and testing variable speed limits (VSLs) and ramp-metering strategies that proactively attempt to reduce crash risk.
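    The binary crash/non-crash logistic classifier described above can be sketched as follows. The feature names and coefficient values are hypothetical placeholders; the report's actual models are estimated from the loop-detector data.

```python
import math

def crash_risk(speed_variance, occupancy, b0=-4.0, b1=0.05, b2=0.02):
    """Logistic-regression sketch of real-time crash probability from
    traffic parameters at a nearby VDS (coefficients hypothetical)."""
    z = b0 + b1 * speed_variance + b2 * occupancy
    return 1.0 / (1.0 + math.exp(-z))

# Higher speed variation raises the estimated crash risk, mirroring
# the turbulence finding on the U.S. 101 NB corridor.
assert crash_risk(80.0, 10.0) > crash_risk(10.0, 10.0)
```

    For real-time use, a risk threshold on this probability would trigger interventions such as the variable speed limits or ramp metering the report discusses.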
    • ā€¦
    corecore