4,852 research outputs found

    The Beta-Gompertz Distribution

    In this paper, we introduce a new four-parameter generalized version of the Gompertz model, called the Beta-Gompertz (BG) distribution. It includes some well-known lifetime distributions, such as the beta-exponential and generalized Gompertz distributions, as special sub-models. The new distribution is quite flexible and can be used effectively in modeling survival data and reliability problems. Its failure rate function can be decreasing, increasing, or bathtub-shaped, depending on its parameters. Some mathematical properties of the new distribution are provided, including closed-form expressions for the density, cumulative distribution, and hazard rate functions, the kth-order moment, the moment generating function, the Shannon entropy, and the quantile function. We discuss maximum likelihood estimation of the BG parameters from one observed sample and derive the observed Fisher information matrix. A simulation study is performed to investigate the behavior of the proposed estimators. Finally, to illustrate the flexibility of the BG distribution, an application to a real data set is presented.
    Comment: http://www.emis.de/journals/RCE/ingles/v37_1.htm
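
    As a rough illustration (a minimal sketch, not the paper's exact formulas), the Python snippet below evaluates a Beta-Gompertz density and CDF assuming the usual beta-generated construction applied to a Gompertz baseline with scale theta and shape gamma; the function names and the parameterization are assumptions made here for illustration.

        # Hypothetical sketch of a Beta-Gompertz (BG) density and CDF, assuming the
        # standard beta-generated construction with a Gompertz baseline.
        import numpy as np
        from scipy.special import beta as beta_fn, betainc

        def gompertz_cdf(x, theta, gamma):
            # Baseline Gompertz CDF: G(x) = 1 - exp(-(theta/gamma) * (exp(gamma*x) - 1))
            return 1.0 - np.exp(-(theta / gamma) * np.expm1(gamma * x))

        def gompertz_pdf(x, theta, gamma):
            # Baseline Gompertz density
            return theta * np.exp(gamma * x) * np.exp(-(theta / gamma) * np.expm1(gamma * x))

        def bg_cdf(x, theta, gamma, a, b):
            # Beta-G CDF: regularized incomplete beta function evaluated at the baseline CDF
            return betainc(a, b, gompertz_cdf(x, theta, gamma))

        def bg_pdf(x, theta, gamma, a, b):
            # Beta-G density: g(x) * G(x)**(a-1) * (1 - G(x))**(b-1) / B(a, b)
            G = gompertz_cdf(x, theta, gamma)
            return gompertz_pdf(x, theta, gamma) * G**(a - 1) * (1.0 - G)**(b - 1) / beta_fn(a, b)

        # Example: evaluate the density on a small grid
        x = np.linspace(0.01, 3.0, 5)
        print(bg_pdf(x, theta=1.0, gamma=0.5, a=2.0, b=1.5))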

    On choosing mixture components via non-local priors

    Choosing the number of mixture components remains an elusive challenge. Model selection criteria can be either overly liberal or conservative and return poorly separated components of limited practical use. We formalize non-local priors (NLPs) for mixtures and show how they lead to well-separated components with non-negligible weight, interpretable as distinct subpopulations. We also propose an estimator for posterior model probabilities under local and non-local priors, showing that Bayes factors are ratios of posterior to prior empty-cluster probabilities. The estimator is widely applicable and helps set thresholds to drop unoccupied components in overfitted mixtures. We suggest default prior parameters based on multi-modality for Normal/T mixtures and on minimal informativeness for categorical outcomes. We characterise the NLP-induced sparsity theoretically and derive tractable expressions and algorithms. We fully develop Normal, Binomial, and product Binomial mixtures, but the theory, computation, and principles hold more generally. We observed a serious lack of sensitivity of the Bayesian information criterion (BIC), insufficient parsimony of the AIC and of a local prior, and mixed behavior of the singular BIC. We also considered overfitted mixtures; their performance was competitive but depended on tuning parameters. Under our default prior elicitation, NLPs offered a good compromise between sparsity and the power to detect meaningfully separated components.
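
    The empty-cluster characterization above can be illustrated with a toy Monte Carlo computation. The sketch below is a loose illustration only and not the paper's estimator: the symmetric Dirichlet prior on the weights, the placeholder posterior allocation draws, and all names are assumptions.

        # Toy illustration: a Bayes-factor-style ratio of posterior to prior
        # probabilities that at least one of k mixture components is empty.
        import numpy as np

        rng = np.random.default_rng(0)

        def prob_empty(allocations, k):
            # allocations: (num_draws, n) integer labels in {0, ..., k-1}
            counts = np.stack([(allocations == j).sum(axis=1) for j in range(k)], axis=1)
            return float(np.mean((counts == 0).any(axis=1)))

        def prior_empty_prob(k, n, alpha=1.0, num_draws=5000):
            # Monte Carlo estimate under a symmetric Dirichlet(alpha) prior on the weights
            weights = rng.dirichlet(alpha * np.ones(k), size=num_draws)
            allocations = np.stack([rng.choice(k, size=n, p=w) for w in weights])
            return prob_empty(allocations, k)

        # Placeholder for posterior allocation draws from an MCMC run of a k-component mixture
        n, k = 8, 3
        posterior_alloc = rng.integers(0, k, size=(2000, n))

        # Ratio of posterior to prior empty-cluster probabilities
        print(prob_empty(posterior_alloc, k) / prior_empty_prob(k, n))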

    Risk Attitudes and Internet Search Engines: Theory and Experimental Evidence

    This paper analyzes the impact of the size and biases of price-comparison search engines on consumer prices. We develop several theoretical predictions, in the context of a model related to Burdett and Judd (1983) and Varian (1980), and test them experimentally. The data support the model's predictions regarding the impact of the number of firms and of the type of bias of the search engine. The data do not support the model's predictions regarding the impact of the size of the search engine. We identified several data patterns and developed an econometric model for the price distributions. Variables accounting for risk attitudes significantly improved the explanatory power of the econometric model.

    Quadratic Hedging of Basis Risk

    This paper examines a simple basis risk model based on correlated geometric Brownian motions. We apply quadratic criteria to minimize basis risk and hedge in an optimal manner. Initially, we derive the Föllmer-Schweizer decomposition of a European claim. This allows pricing and hedging under the minimal martingale measure, corresponding to the local risk-minimizing strategy. Furthermore, since the mean-variance tradeoff process is deterministic in our setup, the minimal martingale measure and the variance-optimal martingale measure coincide. Consequently, the mean-variance optimal strategy is easily constructed. Simple closed-form pricing and hedging formulae for put and call options are derived. Due to market incompleteness, these formulae depend on the drift parameters of the underlying processes. By making a further equilibrium assumption, we derive an approximate hedging formula which does not require knowledge of these parameters. The hedging strategies are tested using Monte Carlo experiments and are compared with recent results achieved using a utility maximization approach.
    Keywords: option hedging; incomplete markets; basis risk; local risk minimization; mean-variance hedging
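
    As a hedged sketch of what a closed-form price under the minimal martingale measure can look like in this kind of setup, the snippet below prices a European call on the non-traded asset, assuming the drift of that asset is shifted to mu_S - rho * sig_S * (mu_H - r) / sig_H under the minimal martingale measure; the paper's exact formulae and notation may differ, and all parameter values and names are illustrative assumptions.

        # Sketch: Black-style call price on a non-traded asset S, hedging asset H,
        # under an assumed minimal-martingale-measure drift adjustment for S.
        from math import exp, log, sqrt
        from statistics import NormalDist

        def mmm_call_price(S0, K, T, r, mu_S, sig_S, mu_H, sig_H, rho):
            N = NormalDist().cdf
            theta = (mu_H - r) / sig_H              # market price of risk of the traded asset
            drift = mu_S - rho * sig_S * theta      # assumed drift of S under the minimal martingale measure
            F = S0 * exp(drift * T)                 # mean of S_T under that measure
            d1 = (log(F / K) + 0.5 * sig_S ** 2 * T) / (sig_S * sqrt(T))
            d2 = d1 - sig_S * sqrt(T)
            return exp(-r * T) * (F * N(d1) - K * N(d2))

        # Toy example; all parameter values are illustrative
        print(mmm_call_price(S0=100, K=100, T=1.0, r=0.03,
                             mu_S=0.08, sig_S=0.3, mu_H=0.07, sig_H=0.25, rho=0.9))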

    Multiple-Play Bandits in the Position-Based Model

    Sequentially learning to place items in multi-position displays or lists is a task that can be cast into the multiple-play semi-bandit setting. However, a major concern in this context is that the system cannot always decide whether the user feedback for each item is actually exploitable; indeed, much of the content may simply have been ignored by the user. The present work proposes to exploit available information regarding the display position bias under the so-called Position-based click model (PBM). We first discuss how this model differs from the Cascade model and its variants considered in several recent works on multiple-play bandits. We then provide a novel regret lower bound for this model, as well as computationally efficient algorithms that display good empirical and theoretical performance.
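
    To make the setting concrete, the sketch below simulates the PBM, in which a click on item k displayed at position l occurs with probability theta_k * kappa_l, and runs a simple UCB-style strategy with known position examination probabilities kappa. This is an illustration of the model only, under assumptions made here (known kappa, a generic confidence bonus); it is not the algorithm proposed in the paper.

        # Toy PBM simulation with a simple UCB-style multiple-play strategy.
        import numpy as np

        rng = np.random.default_rng(1)
        K, L, T = 10, 3, 5000                     # items, display positions, rounds
        theta = rng.uniform(0.05, 0.6, size=K)    # unknown item attractiveness (simulated)
        kappa = np.array([1.0, 0.6, 0.3])         # known position examination probabilities

        clicks = np.zeros(K)                      # cumulated clicks per item
        exposure = np.zeros(K)                    # cumulated sum of kappa over displays per item

        for t in range(1, T + 1):
            # Optimistic estimate of each item's attractiveness
            theta_hat = clicks / np.maximum(exposure, 1e-9)
            bonus = np.sqrt(1.5 * np.log(t) / np.maximum(exposure, 1e-9))
            ucb = np.where(exposure > 0, theta_hat + bonus, np.inf)
            chosen = np.argsort(-ucb)[:L]         # place the L most optimistic items

            # Simulated user feedback under the PBM
            clicked = rng.random(L) < theta[chosen] * kappa
            clicks[chosen] += clicked
            exposure[chosen] += kappa

        print("true best items:", np.argsort(-theta)[:L], "chosen in the last round:", chosen)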

    Proximity Operators of Discrete Information Divergences

    Information divergences allow one to assess how close two distributions are to each other. Among the large panel of available measures, special attention has been paid to convex φ-divergences, such as the Kullback-Leibler, Jeffreys-Kullback, Hellinger, chi-square, Rényi, and Iα divergences. While φ-divergences have been extensively studied in convex analysis, their use in optimization problems often remains challenging. In this regard, one of the main shortcomings of existing methods is that the minimization of φ-divergences is usually performed with respect to only one of their arguments, possibly within alternating optimization techniques. In this paper, we overcome this limitation by deriving new closed-form expressions for the proximity operator of such two-variable functions. This makes it possible to employ standard proximal methods to efficiently solve a wide range of convex optimization problems involving φ-divergences. In addition, we show that these proximity operators are useful for computing the epigraphical projection of several functions of practical interest. The proposed proximal tools are numerically validated in the context of optimal query execution within database management systems, where the problem of selectivity estimation plays a central role. Experiments are carried out on small- to large-scale scenarios.
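
    For context, the sketch below spells out the two-variable proximity operator that such results concern, computed numerically for the scalar Kullback-Leibler case D(x, y) = x log(x/y) - x + y. The paper derives closed-form expressions, whereas this illustration only solves the defining minimization with a generic solver; the names and the choice of divergence are assumptions made here.

        # Numerical illustration of prox_{gamma*D}(xb, yb) for the scalar KL divergence:
        #   argmin_{x,y>0} D(x, y) + ((x - xb)**2 + (y - yb)**2) / (2 * gamma)
        import numpy as np
        from scipy.optimize import minimize

        def kl(x, y):
            # Scalar Kullback-Leibler divergence, jointly convex on (0, inf)^2
            return x * np.log(x / y) - x + y

        def prox_kl(xb, yb, gamma=1.0):
            obj = lambda z: kl(z[0], z[1]) + ((z[0] - xb) ** 2 + (z[1] - yb) ** 2) / (2 * gamma)
            res = minimize(obj, x0=[max(xb, 1e-3), max(yb, 1e-3)],
                           bounds=[(1e-9, None), (1e-9, None)], method="L-BFGS-B")
            return res.x

        print(prox_kl(0.2, 1.5, gamma=0.5))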