32 research outputs found

    On the Type-I Half-logistic Distribution and Related Contributions: A Review

    The half-logistic (HL) distribution is a widely used statistical model for studying lifetime phenomena arising in science, engineering, finance, and the biomedical sciences. One of its weaknesses is that it has only a decreasing probability density function and an increasing hazard rate function. For that reason, researchers have been modifying the HL distribution to make it more functionally flexible. This article provides an extensive overview of the HL distribution and its generalizations (or extensions). Recent advances regarding the HL distribution have led to numerous results in modern theory and statistical computing techniques across science and engineering. This work summarizes the literature to clarify the state of knowledge, the potential, and the important roles played by the HL distribution and related models in probability theory and statistical studies across various areas and applications. In particular, at least sixty-seven flexible extensions of the HL distribution have been proposed in the past few years. We give a brief introduction to these distributions, emphasizing the model parameters, the properties derived, and the estimation methods. In conclusion, there is no doubt that this summary can build consensus among various related results in both the theory and applications of HL-related models and stimulate interest in future studies.
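
    As a minimal illustration of the limitation described above (a sketch, not drawn from any of the surveyed papers), the standard half-logistic density 2e^{-x}/(1+e^{-x})^2 can be checked numerically to be decreasing while its hazard rate 1/(1+e^{-x}) is increasing:

```python
import math

def hl_pdf(x: float) -> float:
    """Density of the standard half-logistic distribution, x >= 0."""
    e = math.exp(-x)
    return 2.0 * e / (1.0 + e) ** 2

def hl_sf(x: float) -> float:
    """Survival function S(x) = 1 - F(x) = 2e^{-x} / (1 + e^{-x})."""
    e = math.exp(-x)
    return 2.0 * e / (1.0 + e)

def hl_hazard(x: float) -> float:
    """Hazard rate h(x) = f(x) / S(x), which simplifies to 1 / (1 + e^{-x})."""
    return hl_pdf(x) / hl_sf(x)

# On a grid of positive points the density only decreases and the hazard
# only increases -- the rigidity that motivates the extensions reviewed.
xs = [0.1 * i for i in range(1, 50)]
pdf_vals = [hl_pdf(x) for x in xs]
haz_vals = [hl_hazard(x) for x in xs]
```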

    Statistical inference for dependent competing risks data under adaptive Type-II progressive hybrid censoring

    In this article, we consider statistical inference based on dependent competing risks data from the Marshall-Olkin bivariate Weibull distribution. The maximum likelihood estimates of the unknown model parameters are computed using the Newton-Raphson method under adaptive Type-II progressive hybrid censoring with partially observed failure causes. The existence and uniqueness of the maximum likelihood estimates are derived. Approximate confidence intervals are constructed via the observed Fisher information matrix, using the asymptotic normality of the maximum likelihood estimates. Bayes estimates and highest posterior density credible intervals are calculated under a gamma-Dirichlet prior distribution using the Markov chain Monte Carlo technique, and convergence of the Markov chain Monte Carlo samples is tested. In addition, a Monte Carlo simulation is carried out to compare the effectiveness of the proposed methods. Further, three different optimality criteria are considered to obtain the most effective censoring plans. Finally, a real-life data set is analyzed to illustrate the operability and applicability of the proposed methods.

    Log-Kumaraswamy distribution: its features and applications

    This article presents a new continuous probability density function for a non-negative random variable that serves as an alternative to some bounded-domain distributions. The new distribution, termed the log-Kumaraswamy distribution, can compete with both bounded and unbounded random processes. Some essential features of this distribution are studied, and its parameter estimates are obtained by the maximum product of spacings, least squares, and weighted least squares procedures. The new distribution is shown to outperform traditional models in flexibility and applicability to real-life data sets.
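
    The maximum product of spacings method mentioned above maximizes the sum of log-spacings between consecutive ordered CDF values. A minimal sketch, using an exponential model and invented data rather than the log-Kumaraswamy distribution itself:

```python
import math

def mps_objective(rate: float, data) -> float:
    """Log maximum-product-of-spacings objective for an Exponential(rate)
    model: sum of log differences of consecutive ordered CDF values,
    with F = 0 and F = 1 appended at the ends."""
    xs = sorted(data)
    cdf = [0.0] + [1.0 - math.exp(-rate * x) for x in xs] + [1.0]
    return sum(math.log(max(b - a, 1e-300)) for a, b in zip(cdf, cdf[1:]))

def mps_fit(data, lo: float = 0.01, hi: float = 10.0, steps: int = 2000) -> float:
    """Crude grid search for the rate maximizing the spacings objective."""
    grid = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    return max(grid, key=lambda r: mps_objective(r, data))

# Hypothetical observations (illustrative only).
data = [0.3, 1.2, 0.7, 2.5, 0.9, 1.8, 0.4, 3.2]
rate_hat = mps_fit(data)
```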

    Bell-Touchard-G Family of Distributions: Applications to Quality Control and Actuarial Data

    In this article, we develop a new statistical model named the generalized complementary exponentiated Bell-Touchard model. The exponential model is taken as a special baseline model with a configurable failure rate function. The proposed model builds on several features of zero-truncated Bell numbers and Touchard polynomials that can address complementary risk problems. A linear representation of the density of the proposed model is provided, from which numerous important properties of the special model can be obtained. The well-known actuarial metrics value at risk and expected shortfall are formulated, computed, and graphically illustrated for the submodel. The maximum likelihood approach is used to estimate the parameters. Furthermore, we design a group acceptance sampling plan for the proposed model using the median as a quality parameter for truncated life tests. Three real data applications are offered for the complementary exponentiated Bell-Touchard exponential model. The analysis of the two failure-times data sets and a comparative study yielded optimized results for the group acceptance sampling plan under the proposed model. The application to insurance claim data also provided the best results and showed that the proposed model has a heavier tail.
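
    Empirical versions of the two actuarial metrics named above are easy to state: value at risk is a quantile of the loss distribution, and expected shortfall averages the tail beyond it. A sketch on hypothetical claim amounts (not the paper's parametric formulas):

```python
def value_at_risk(losses, level: float) -> float:
    """Empirical VaR: the order statistic at the given confidence level."""
    xs = sorted(losses)
    idx = min(int(level * len(xs)), len(xs) - 1)
    return xs[idx]

def expected_shortfall(losses, level: float) -> float:
    """Empirical ES: average of the losses at or beyond the VaR threshold."""
    var = value_at_risk(losses, level)
    tail = [x for x in losses if x >= var]
    return sum(tail) / len(tail)

# Hypothetical insurance claim amounts (illustrative only).
claims = [120, 80, 300, 45, 560, 150, 90, 1000, 220, 75]
var80 = value_at_risk(claims, 0.80)
es80 = expected_shortfall(claims, 0.80)
```

Expected shortfall is never below the VaR at the same level, since it averages only losses at least as large as the threshold.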

    Statistical analysis of progressively first-failure-censored data via beta-binomial removals

    Progressive first-failure censoring has been widely used in practice when the experimenter wishes to remove some groups of test units before the first failure is observed in all groups. In practice, some test groups may leave the experiment at each progressive stage for reasons that cannot be determined in advance. We therefore propose progressively first-failure censored sampling with random removals, which allows the removal of surviving group(s) during the life test with uncertain probability governed by the beta-binomial probability law. The generalized extreme value lifetime model has been widely used to analyze a variety of extreme value data, including flood flows, wind speeds, radioactive emissions, and others. Accordingly, when the sample observations are gathered using the suggested censoring plan, the Bayes and maximum likelihood approaches are used to estimate the generalized extreme value distribution parameters. Furthermore, Bayes estimates are produced under balanced symmetric and asymmetric loss functions. A hybrid Gibbs-within-Metropolis-Hastings sampler is suggested to draw samples from the joint posterior distribution, and highest posterior density intervals are also provided. To understand how the suggested inferential approaches work in the long run, extensive Monte Carlo simulation experiments are carried out. Two real-world datasets from clinical trials are examined to show the applicability and feasibility of the suggested methodology. The numerical results show that the proposed sampling mechanism is flexible enough to support either a classical or a Bayesian inferential approach to estimating any lifetime parameter.
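
    The beta-binomial removal mechanism can be simulated directly: at each stage, draw a removal probability from a beta distribution, then remove each surviving group independently with that probability. A sketch with hypothetical hyperparameters (the specific values below are not from the article):

```python
import random

def beta_binomial_removal(n_surviving: int, a: float, b: float,
                          rng: random.Random) -> int:
    """Number of surviving groups removed at one progressive stage:
    draw p ~ Beta(a, b), then R ~ Binomial(n_surviving, p)."""
    p = rng.betavariate(a, b)
    return sum(1 for _ in range(n_surviving) if rng.random() < p)

rng = random.Random(42)
# Repeated draws with 20 surviving groups and hypothetical Beta(2, 5) law.
removals = [beta_binomial_removal(20, 2.0, 5.0, rng) for _ in range(1000)]
mean_removal = sum(removals) / len(removals)
# The theoretical mean is n * a / (a + b) = 20 * 2/7, about 5.7.
```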

    Bayesian Inference for Cure Rate Models

    Survival analysis consists of a set of statistical methods in the field of biostatistics whose main aim is to study the time until the occurrence of a specified event, such as death. The majority of these methods assume that all individuals taking part in the study are subject to the event of interest. However, there are situations where this assumption is unrealistic, since some observations are not susceptible to the event of interest, i.e., are cured. For this reason, survival models have been developed that allow for patients who may never experience the event, usually called long-term survivors. These models, called cure rate models, assume that, as time increases, the survival function tends to a value p ∈ (0,1) representing the cure rate, instead of tending to zero as in standard survival analysis. Recently, Rocha (2016) proposed a new approach to modelling situations in which there are long-term survivors in survival studies. His methodology was based on the use of defective distributions to model cure rates. In contrast to standard distributions, defective ones are characterized by probability density functions that integrate to values less than one for certain choices of the domain of some of their parameters. The aim of the present thesis is to provide new Bayesian estimates for the parameters of the defective models used for cure rate modelling under the assumption of right censoring. We will develop Markov chain Monte Carlo (MCMC) algorithms for inferring the parameters of a broad class of defective models, both for the baseline distributions (Gompertz and inverse Gaussian) and for their extensions under the Marshall-Olkin family of distributions. The Bayesian estimates of the distributions' parameters, as well as their associated credible intervals, will be obtained from samples drawn from the joint posterior distribution. In addition, the behaviour of the Bayesian estimates will be evaluated and compared with that of the maximum likelihood estimates obtained by Rocha (2016) through simulation experiments. Finally, we will apply the competing models and approaches to real datasets and compare them through various statistical measures. This work is the first attempt to explore the advantages of the Bayesian approach to inference for defective cure rate models under a right censoring mechanism, as well as the first presentation of new Bayesian estimates for several defective distributions, albeit without incorporating covariate information.
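
    The defining property of a defective distribution can be seen concretely in the Gompertz case: with a negative shape parameter, the survival function tends to exp(b/a) ∈ (0,1) rather than zero, and that limit plays the role of the cure fraction. A small numerical check with hypothetical parameter values:

```python
import math

def gompertz_survival(t: float, a: float, b: float) -> float:
    """Gompertz survival function S(t) = exp(-(b/a) * (exp(a*t) - 1))."""
    return math.exp(-(b / a) * (math.exp(a * t) - 1.0))

# Hypothetical parameters; a < 0 makes the distribution defective.
a, b = -0.5, 0.3
cure_fraction = math.exp(b / a)           # limit of S(t) as t -> infinity
s_large = gompertz_survival(50.0, a, b)   # survival at a large time
```

At t = 50 the survival curve has already flattened onto the cure fraction exp(0.3 / -0.5) ≈ 0.549, instead of decaying to zero.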

    Random Number Generators

    The quasi-negative-binomial distribution was applied in queueing theory to determine the distribution of the total number of customers served before the queue vanishes, under certain assumptions. Some structural properties of the quasi-negative-binomial distribution (probability generating function, convolution, mode, and recurrence relations for the moments) are discussed. The distribution's characterization and its relation to other distributions are investigated. A computer program was developed in R to obtain maximum likelihood estimates, and the distribution was fitted to several observed data sets to test its goodness of fit.
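
    As a simpler stand-in for the quasi-negative-binomial (whose pgf is not reproduced here), the ordinary negative binomial illustrates how moments are recovered from a probability generating function: the mean is G'(1), which can be checked against a numerical derivative:

```python
def nb_pgf(s: float, r: float, p: float) -> float:
    """PGF of the negative binomial (failures before the r-th success):
       G(s) = (p / (1 - (1-p)*s))**r."""
    return (p / (1.0 - (1.0 - p) * s)) ** r

# Hypothetical parameter values for illustration.
r, p = 3.0, 0.4
h = 1e-6
# Central-difference estimate of G'(1) versus the closed form r(1-p)/p.
mean_numeric = (nb_pgf(1.0 + h, r, p) - nb_pgf(1.0 - h, r, p)) / (2.0 * h)
mean_formula = r * (1.0 - p) / p
```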

    Current Topics on Risk Analysis: ICRA6 and RISK2015 Conference

    Papers presented at the International Conference on Risk Analysis ICRA 6/RISK 2015, held in Barcelona, 26-29 May 2015.
