
    Threshold regression with threg

    In this presentation, I introduce a new Stata command called threg. The command estimates the regression coefficients of a threshold regression model based on the first hitting time of a boundary by the sample path of a Wiener diffusion process. The regression methodology is well suited to applications involving survival and time-to-event data. The new command uses the maximum likelihood estimation (MLE) routine in Stata to calculate regression coefficient estimates, asymptotic standard errors, and p-values, and an initialization option is allowed, as in the conventional MLE routine. The threg command can be run on either a calendar or an analytical time scale. Hazard ratios at selected time points for specified scenarios (based on given categories or value settings of covariates) can also be calculated by the command, and curves of the estimated hazard functions, survival functions, and probability distribution functions of the first hitting time can be plotted. Function curves corresponding to different scenarios can be overlaid in the same plot for a comparative analysis that gives added research insight.
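
    For readers unfamiliar with the underlying model, here is a minimal sketch of the Wiener first-hitting-time likelihood such a command maximizes, assuming the common links ln(y0) = Xβ for the initial level and an identity link for the drift; the function names and toy data are illustrative, not threg's internals.

    ```python
    # Minimal sketch of the Wiener first-hitting-time (FHT) likelihood that a
    # command like threg maximizes; the parameter links follow the common
    # convention ln(y0) = X @ beta, mu = X @ gamma (illustrative only).
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def fht_logpdf(t, y0, mu, sigma2=1.0):
        """Log density of the first time a Wiener process started at y0 > 0
        with drift mu (and variance sigma2) hits the boundary at 0."""
        return (np.log(y0) - 0.5 * np.log(2 * np.pi * sigma2 * t**3)
                - (y0 + mu * t) ** 2 / (2 * sigma2 * t))

    def fht_logsurv(t, y0, mu, sigma2=1.0):
        """Log survival function P(T > t) of the same first hitting time."""
        s = np.sqrt(sigma2 * t)
        surv = (norm.cdf((y0 + mu * t) / s)
                - np.exp(-2 * mu * y0 / sigma2) * norm.cdf((mu * t - y0) / s))
        return np.log(np.clip(surv, 1e-300, 1.0))

    def negloglik(params, X, t, event):
        """Negative log likelihood with right censoring (event = 1 if observed)."""
        p = X.shape[1]
        beta, gamma = params[:p], params[p:]
        y0 = np.exp(X @ beta)   # initial level, kept positive via the log link
        mu = X @ gamma          # drift, identity link
        return -np.sum(np.where(event == 1,
                                fht_logpdf(t, y0, mu),
                                fht_logsurv(t, y0, mu)))

    # Toy usage: X includes an intercept column; t are times, event indicators.
    rng = np.random.default_rng(0)
    X = np.column_stack([np.ones(200), rng.normal(size=200)])
    t = rng.gamma(2.0, 1.0, size=200)
    event = rng.integers(0, 2, size=200)
    fit = minimize(negloglik, x0=np.zeros(4), args=(X, t, event), method="BFGS")
    ```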

    Semiparametric Threshold Regression Analysis for Time-to-Event Data

    Threshold regression is a relatively new alternative to the Cox proportional hazards model when the proportional hazards assumption is violated. It is based on first-hitting-time models, in which time-to-event data are modeled as the time at which a stochastic process of interest first hits a boundary or threshold state. In this dissertation, we develop a semiparametric threshold regression model with flexible covariate effects. Specifically, we propose a B-spline approximation method to estimate nonlinear covariate effects on both the initial state and the rate of the process. We show that the spline-based estimators are consistent and achieve the optimal rate of convergence under a smoothness assumption. Simulation studies are conducted for practical situations, and the methodology is applied to the study of osteoporotic fractures that motivated this investigation. To check the validity of threshold regression models with parametric link functions, we propose two supremum-type test processes: one based on cumulative sums of martingale residuals, the other based on censoring-consistent residuals. Realizations of these test processes under the assumed model can be easily generated by computer simulation. We show that both tests are consistent against model misspecification. Both model-checking methods are applied to a kidney dialysis data set.
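
    A minimal sketch of the B-spline approximation idea, assuming a cubic basis with illustrative knots; the dissertation's actual estimator embeds such a basis in the first-hitting-time likelihood.

    ```python
    # Sketch: represent a nonlinear covariate effect f(z) by a cubic B-spline
    # expansion f(z) ~ B(z) @ alpha, the kind of approximation used for the
    # initial-state and drift links (knots and degree are illustrative).
    import numpy as np
    from scipy.interpolate import BSpline

    def bspline_design(z, knots, degree=3):
        """Evaluate all B-spline basis functions at the points z."""
        # Repeat boundary knots so the basis spans the whole data range.
        t = np.concatenate([[knots[0]] * degree, knots, [knots[-1]] * degree])
        n_basis = len(t) - degree - 1
        B = np.empty((len(z), n_basis))
        for j in range(n_basis):
            coef = np.zeros(n_basis)
            coef[j] = 1.0                  # isolate the j-th basis function
            B[:, j] = BSpline(t, coef, degree)(z)
        return B

    z = np.linspace(0.0, 1.0, 100)
    B = bspline_design(z, knots=np.linspace(0.0, 1.0, 6))
    # A spline-based estimator then replaces a linear term X @ beta by B @ alpha
    # inside the first-hitting-time likelihood and maximizes over alpha as well.
    ```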

    Threshold Regression for Survival Analysis: Modeling Event Times by a Stochastic Process Reaching a Boundary

    Many researchers have investigated first hitting times as models for survival data. First hitting times arise naturally in many types of stochastic processes, ranging from Wiener processes to Markov chains. In a survival context, the state of the underlying process represents the strength of an item or the health of an individual. The item fails or the individual experiences a clinical endpoint when the process reaches an adverse threshold state for the first time. The time scale can be calendar time or some other operational measure of degradation or disease progression. In many applications, the process is latent (i.e., unobservable). Threshold regression refers to first-hitting-time models with regression structures that accommodate covariate data. The parameters of the process, threshold state and time scale may depend on the covariates. This paper reviews aspects of this topic and discusses fruitful avenues for future research. Comment: Published at http://dx.doi.org/10.1214/088342306000000330 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
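
    As a worked example of the canonical case reviewed here, the standard Wiener first-hitting-time formulas can be stated as follows (these are textbook results, not claims specific to this paper):

    ```latex
    % First hitting time of a boundary set B by a stochastic process X(t):
    %   T = \inf\{ t \ge 0 : X(t) \in B \}.
    % Wiener special case: X(t) has initial value y_0 > 0, drift \mu, and
    % variance \sigma^2, with the boundary at 0. Then T follows an inverse
    % Gaussian law with density
    \[
      f(t \mid y_0, \mu, \sigma^2)
        = \frac{y_0}{\sqrt{2\pi\sigma^2 t^{3}}}
          \exp\!\left\{-\frac{(y_0 + \mu t)^2}{2\sigma^2 t}\right\},
      \qquad t > 0,
    \]
    % which is a proper density when \mu \le 0; when \mu > 0 the process
    % escapes the boundary forever with probability
    \[
      P(T = \infty) = 1 - \exp\!\left(-\frac{2 y_0 \mu}{\sigma^2}\right).
    \]
    ```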

    The Likelihood of Mixed Hitting Times

    We present a method for computing the likelihood of a mixed hitting-time model that specifies durations as the first time a latent Lévy process crosses a heterogeneous threshold. This likelihood is not generally known in closed form, but its Laplace transform is. Our approach to its computation relies on numerical methods for inverting Laplace transforms that exploit special properties of the first passage times of Lévy processes. We use our method to implement a maximum likelihood estimator of the mixed hitting-time model in MATLAB. We illustrate the application of this estimator with an analysis of Kennan's (1985) strike data. Comment: 35 pages.
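
    A minimal sketch of the computational idea in the simplest Lévy case, where a closed-form density is available as a check; the transform pair below is a standard result, and the inversion routine (mpmath's Talbot method) stands in for the authors' MATLAB implementation.

    ```python
    # Sketch: for Brownian motion with drift (the simplest Levy process), the
    # Laplace transform of the first passage time T of a threshold y0 is known
    # in closed form, and the density can be recovered by numerical inversion.
    import mpmath as mp

    y0, mu, sigma2 = 1.0, 0.5, 1.0   # threshold distance, drift, variance

    def fpt_laplace(s):
        """E[exp(-s T)] for the first passage of drifted Brownian motion."""
        return mp.exp(-(y0 / sigma2) * (mp.sqrt(mu**2 + 2 * sigma2 * s) - mu))

    def fpt_density(t):
        """Closed-form inverse Gaussian density, used as a correctness check."""
        return (y0 / mp.sqrt(2 * mp.pi * sigma2 * t**3)
                * mp.exp(-(y0 - mu * t) ** 2 / (2 * sigma2 * t)))

    t = 1.3
    inverted = mp.invertlaplace(fpt_laplace, t, method="talbot")
    print(inverted, fpt_density(t))   # the two values should agree closely
    ```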

    A non-Gaussian continuous state space model for asset degradation

    The degradation model plays an essential role in asset life prediction and condition-based maintenance, and various degradation models have been proposed. Among these, the state space model has the ability to combine degradation data and failure event data, and it is an effective approach for dealing with multiple observations and missing data. In a state space degradation model, the deterioration process of an asset is represented by a system state process that is revealed through a sequence of observations. Current research largely assumes that the underlying system development process is discrete in time or state. Although some models have been developed to consider continuous time and space, these state space models are based on the Wiener process with its Gaussian assumption. This paper proposes a Gamma-based state space degradation model that removes the Gaussian assumption. Both condition monitoring observations and failure events are considered in the model so as to improve the accuracy of asset life prediction. A simulation study is carried out to illustrate the application procedure of the proposed model.
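
    A minimal sketch of what a Gamma-based (non-Gaussian, monotone) degradation process looks like in simulation; the rates, observation noise, and failure threshold below are illustrative choices, not the paper's estimation procedure.

    ```python
    # Sketch: simulate a Gamma-process degradation path (independent Gamma
    # increments, so the latent state is monotone non-decreasing and
    # non-Gaussian) with noisy condition-monitoring observations and
    # failure at a fixed threshold.
    import numpy as np

    rng = np.random.default_rng(1)
    dt, horizon = 0.1, 50.0
    alpha, beta = 1.2, 0.4      # shape rate (per unit time) and scale
    threshold = 20.0             # failure when degradation first exceeds this

    times = np.arange(0.0, horizon, dt)
    increments = rng.gamma(shape=alpha * dt, scale=beta, size=len(times))
    state = np.cumsum(increments)                    # latent degradation path
    observations = state + rng.normal(0.0, 0.5, len(times))  # noisy monitoring

    crossed = np.nonzero(state >= threshold)[0]
    failure_time = times[crossed[0]] if crossed.size else np.inf
    ```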

    Can power-law scaling and neuronal avalanches arise from stochastic dynamics?

    The presence of self-organized criticality in biology is often evidenced by a power-law scaling of event size distributions, which can be measured by linear regression on logarithmic axes. We show here that such a procedure does not necessarily mean that the system exhibits self-organized criticality. We first provide an analysis of multisite local field potential (LFP) recordings of brain activity and show that event size distributions defined as negative LFP peaks can be close to power-law distributions. However, this result is not robust to changes in detection threshold, or when tested using more rigorous statistical analyses such as the Kolmogorov-Smirnov test. Similar power-law scaling is observed for surrogate signals, suggesting that power-law scaling may be a generic property of thresholded stochastic processes. We next investigate this problem analytically and show that, indeed, stochastic processes can produce spurious power-law scaling without underlying self-organized criticality. However, this power-law scaling is apparent only in logarithmic representations and does not survive more rigorous analysis such as the Kolmogorov-Smirnov test. The same analysis was also performed on an artificial network known to display self-organized criticality. In this case, both the graphical representations and the rigorous statistical analysis reveal unambiguously that the avalanche sizes are distributed as a power law. We conclude that logarithmic representations can lead to spurious power-law scaling induced by the stochastic nature of the phenomenon. This apparent power-law scaling does not constitute a proof of self-organized criticality, which should be demonstrated by more stringent statistical tests. Comment: 14 pages, 10 figures; PLoS One, in press (2010).
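
    A minimal sketch of the paper's central caution, using a lognormal surrogate in place of LFP data: a log-log regression can look convincingly linear while a Kolmogorov-Smirnov test against the fitted power law rejects. All cutoffs below are illustrative.

    ```python
    # Sketch: event sizes from a simple stochastic surrogate can look linear
    # on log-log axes, yet a KS test against a fitted power law rejects.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    sizes = rng.lognormal(mean=1.0, sigma=1.0, size=5000)  # surrogate sizes
    xmin = np.quantile(sizes, 0.5)
    tail = sizes[sizes >= xmin]

    # Log-log linear regression on a binned histogram: often looks convincing.
    counts, edges = np.histogram(tail, bins=np.geomspace(xmin, tail.max(), 30))
    centers = np.sqrt(edges[:-1] * edges[1:])
    mask = counts > 0
    slope, _, r, _, _ = stats.linregress(np.log(centers[mask]),
                                         np.log(counts[mask]))
    print(f"log-log slope {slope:.2f}, r^2 {r**2:.3f}")

    # Maximum likelihood power-law exponent (Hill estimator) and KS test:
    alpha = 1.0 + tail.size / np.sum(np.log(tail / xmin))
    ks = stats.kstest(tail, "pareto", args=(alpha - 1.0, 0.0, xmin))
    print(f"alpha {alpha:.2f}, KS p-value {ks.pvalue:.3g}")  # typically rejects
    ```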

    Learning Parameterized Skills

    We introduce a method for constructing skills capable of solving tasks drawn from a distribution of parameterized reinforcement learning problems. The method draws example tasks from a distribution of interest and uses the corresponding learned policies to estimate the topology of the lower-dimensional piecewise-smooth manifold on which the skill policies lie. This manifold models how policy parameters change as task parameters vary. The method identifies the number of charts that compose the manifold and then applies non-linear regression in each chart to construct a parameterized skill by predicting policy parameters from task parameters. We evaluate our method on an underactuated simulated robotic arm tasked with learning to accurately throw darts at a parameterized target location. Comment: Appears in Proceedings of the 29th International Conference on Machine Learning (ICML 2012).
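
    A schematic of the final regression step under strong simplifications: charts are identified here by naive clustering of policy parameters rather than the paper's manifold-topology estimation, and a per-chart nearest-neighbor regressor stands in for the paper's non-linear regression.

    ```python
    # Schematic: identify charts (crudely, by clustering policy parameters),
    # then fit one regressor per chart mapping task params to policy params.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.neighbors import KNeighborsClassifier, KNeighborsRegressor

    rng = np.random.default_rng(3)
    tasks = rng.uniform(-1.0, 1.0, size=(200, 2))        # task parameters
    # Toy "learned policy parameters" with two regimes standing in for charts.
    policies = np.where(tasks[:, :1] > 0, np.sin(3 * tasks), np.cos(3 * tasks))

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(policies)
    chart_of_task = KNeighborsClassifier(n_neighbors=5).fit(tasks, labels)
    models = {c: KNeighborsRegressor(n_neighbors=5).fit(tasks[labels == c],
                                                        policies[labels == c])
              for c in (0, 1)}

    def parameterized_skill(task):
        """Predict policy parameters for a new task via its chart's regressor."""
        task = np.asarray(task, dtype=float).reshape(1, -1)
        chart = int(chart_of_task.predict(task)[0])
        return models[chart].predict(task)[0]

    print(parameterized_skill([0.4, -0.2]))
    ```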

    Detection Algorithms for Molecular MIMO

    In this paper, we propose a novel design for molecular communication in which both the transmitter and the receiver have multiple bulges (the analogue of antennas in RF communication) in a 3-dimensional environment. The proposed system consists of a fluid medium, information molecules, a transmitter, and a receiver. We simulate the system with a one-shot signal to obtain the channel's finite impulse response, and we then incorporate this result into our mathematical analysis to determine interference. Molecular communication has a great need for low complexity, so the receiver may have incomplete information regarding the system and the channel state. For such cases of limited information at the receiver, we propose three detection algorithms: adaptive thresholding, practical zero forcing, and Genie-aided zero forcing. Comment: 6 pages, 6 figures; accepted to the 2015 IEEE ICC.
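
    A minimal sketch of zero-forcing detection for an ISI channel of the kind described, assuming a known finite impulse response; the FIR taps, noise model, and fixed threshold are illustrative and do not reproduce the paper's "practical" or "Genie-aided" variants.

    ```python
    # Sketch: stack the channel's finite impulse response into a block-Toeplitz
    # matrix H, pseudo-invert it (zero forcing), and threshold the estimates.
    import numpy as np

    rng = np.random.default_rng(4)
    n_tx, n_sym = 2, 6
    # Illustrative FIR: h[k][i, j] = fraction of molecules reaching receiver
    # bulge i from transmitter bulge j, k symbol slots after release.
    h = [np.array([[0.6, 0.1], [0.1, 0.6]]),
         np.array([[0.3, 0.05], [0.05, 0.3]]),
         np.array([[0.1, 0.02], [0.02, 0.1]])]

    bits = rng.integers(0, 2, size=(n_sym, n_tx))        # transmitted bits
    y = np.zeros((n_sym, n_tx))
    for k, hk in enumerate(h):                           # convolve: ISI channel
        if k:
            y[k:] += bits[:-k] @ hk.T
        else:
            y += bits @ hk.T
    y += rng.normal(0.0, 0.05, y.shape)                  # toy counting noise

    # Zero forcing: build the block-Toeplitz channel matrix, then pseudo-invert.
    H = np.zeros((n_sym * n_tx, n_sym * n_tx))
    for k, hk in enumerate(h):
        for m in range(n_sym - k):
            H[(m + k) * n_tx:(m + k + 1) * n_tx, m * n_tx:(m + 1) * n_tx] = hk
    s_hat = np.linalg.pinv(H) @ y.reshape(-1)
    detected = (s_hat.reshape(n_sym, n_tx) > 0.5).astype(int)  # fixed threshold
    ```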