
    Market concentration and the likelihood of financial crises

    According to theory, market concentration affects the likelihood of a financial crisis in different ways. The “concentration-stability” and the “concentration-fragility” hypotheses suggest opposing effects operating through specific channels. Using data on 160 countries for the period 1970-2007, this paper empirically tests these indirect effects of financial market structure. We set up a simultaneous system in order to jointly estimate financial stability and the relevant channel variables as endogenous variables. Our findings provide support for the assumption of channel effects in general and for both the concentration-stability and the concentration-fragility hypotheses in particular. The effects are found to vary between high- and low-income countries. Keywords: Market Concentration, Financial Crisis, Systemic Crisis
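    To make the estimation idea concrete, the sketch below shows a generic two-stage least squares step for a simultaneous system in which an endogenous channel variable transmits the effect of concentration onto a crisis outcome. It is an illustration only, with simulated data and placeholder names (concentration, channel, crisis, instrument); the paper's actual system, variables and estimator may differ.

        # Hypothetical 2SLS sketch for one equation of a simultaneous system;
        # all variables and coefficients below are simulated placeholders.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 500
        concentration = rng.normal(size=n)
        instrument = rng.normal(size=n)                      # exogenous shifter of the channel
        u = rng.normal(size=n)                               # shock making the channel endogenous
        channel = 0.8 * concentration + 0.5 * instrument + u
        crisis = 0.6 * channel - 0.3 * concentration + 0.4 * u + rng.normal(size=n)

        def two_sls(y, x_endog, x_exog, z):
            """Project the endogenous regressor on instruments, then run OLS."""
            W = np.column_stack([x_exog, z, np.ones(len(y))])          # first stage
            x_hat = W @ np.linalg.lstsq(W, x_endog, rcond=None)[0]
            X2 = np.column_stack([x_hat, x_exog, np.ones(len(y))])     # second stage
            return np.linalg.lstsq(X2, y, rcond=None)[0]

        beta = two_sls(crisis, channel, concentration, instrument)
        print("2SLS estimates (channel, concentration, const):", beta)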

    Competition and innovative intentions: A study of Dutch SMEs

    This paper explores the complex relationship between competition and innovation. Traditional measures of competition based on industry statistics are often challenged and found wanting. This paper distinguishes between three types of competitive forces: internal rivalry among incumbent firms in an industry, bargaining power of suppliers, and bargaining power of buyers. Using survey data from 2,281 Dutch firms, we apply new perception-based measures for these competitive forces to explore how competition relates to firms' innovative intentions. We also investigate the influence of innovation strategy as a contingency variable. Results show that specific innovative intentions, i.e. to invest in product and process innovation, are related to different competitive forces. Process innovation is correlated with the bargaining power of suppliers, while intentions to invest in product innovation are associated with buyer power. Finally, intended product innovation is related to internal rivalry, but only when firms have no innovation strategy.

    The Simulator: Understanding Adaptive Sampling in the Moderate-Confidence Regime

    We propose a novel technique for analyzing adaptive sampling called the Simulator. Our approach differs from existing methods by considering not how much information could be gathered by any fixed sampling strategy, but how difficult it is to distinguish a good sampling strategy from a bad one given the limited amount of data collected up to any given time. This change of perspective allows us to match the strength of both Fano and change-of-measure techniques, without succumbing to the limitations of either method. For concreteness, we apply our techniques to a structured multi-armed bandit problem in the fixed-confidence pure exploration setting, where we show that the constraints on the means imply a substantial gap between the moderate-confidence sample complexity and the asymptotic sample complexity as δ → 0 found in the literature. We also prove the first instance-based lower bounds for the top-k problem which incorporate the appropriate log factors. Moreover, our lower bounds zero in on the number of times each individual arm needs to be pulled, uncovering new phenomena which are drowned out in the aggregate sample complexity. Our new analysis inspires a simple and near-optimal algorithm for best-arm and top-k identification, the first practical algorithm of its kind for the latter problem which removes extraneous log factors and outperforms the state of the art in experiments.
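    For readers unfamiliar with the setting, the sketch below is a generic successive-elimination routine for fixed-confidence best-arm identification. It is not the paper's Simulator-based analysis or its near-optimal algorithm, only an illustration of adaptively sampling arms until a best arm can be declared with confidence 1 - δ; the arm means and Gaussian noise model are made up.

        # Generic successive elimination for best-arm identification (illustrative only).
        import numpy as np

        def successive_elimination(means, delta=0.05, seed=0):
            rng = np.random.default_rng(seed)
            k = len(means)
            active = list(range(k))
            sums = np.zeros(k)
            t = 0
            while len(active) > 1:
                t += 1
                for i in active:                            # one pull per surviving arm per round
                    sums[i] += rng.normal(means[i], 1.0)
                mu_hat = sums / t
                # Anytime confidence radius from a union bound over arms and rounds.
                radius = np.sqrt(2 * np.log(4 * k * t * t / delta) / t)
                best = max(active, key=lambda i: mu_hat[i])
                active = [i for i in active if mu_hat[best] - mu_hat[i] <= 2 * radius]
            return active[0], t

        arm, rounds = successive_elimination([0.5, 0.4, 0.3, 0.1])
        print(f"declared arm {arm} best after {rounds} rounds")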

    Mixture Martingales Revisited with Applications to Sequential Tests and Confidence Intervals

    This paper presents new deviation inequalities that are valid uniformly in time under adaptive sampling in a multi-armed bandit model. The deviations are measured using the Kullback-Leibler divergence in a given one-dimensional exponential family, and may take into account several arms at a time. They are obtained by constructing for each arm a mixture martingale based on a hierarchical prior, and by multiplying those martingales. Our deviation inequalities allow us to analyze stopping rules based on generalized likelihood ratios for a large class of sequential identification problems, and to construct tight confidence intervals for some functions of the means of the arms.
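    As a rough illustration of the kind of object such inequalities calibrate, the sketch below computes a KL-divergence-based upper confidence bound for a Bernoulli mean by bisection. The threshold used is a common heuristic choice, not the paper's exact constant, and the numbers are made up.

        # Illustrative KL-based upper confidence bound for a Bernoulli arm.
        import numpy as np

        def kl_bernoulli(p, q, eps=1e-12):
            p = min(max(p, eps), 1 - eps)
            q = min(max(q, eps), 1 - eps)
            return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

        def kl_upper_bound(mean_hat, n, threshold, tol=1e-9):
            """Largest q >= mean_hat with n * KL(mean_hat, q) <= threshold (bisection)."""
            lo, hi = mean_hat, 1.0
            while hi - lo > tol:
                mid = (lo + hi) / 2
                if n * kl_bernoulli(mean_hat, mid) <= threshold:
                    lo = mid
                else:
                    hi = mid
            return lo

        n, mean_hat, delta = 200, 0.55, 0.05
        threshold = np.log(1 / delta) + np.log(np.log(n))    # heuristic threshold, an assumption
        print("KL upper confidence bound:", kl_upper_bound(mean_hat, n, threshold))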

    The Jeffreys-Lindley Paradox and Discovery Criteria in High Energy Physics

    The Jeffreys-Lindley paradox displays how the use of a p-value (or number of standard deviations z) in a frequentist hypothesis test can lead to an inference that is radically different from that of a Bayesian hypothesis test in the form advocated by Harold Jeffreys in the 1930s and common today. The setting is the test of a well-specified null hypothesis (such as the Standard Model of elementary particle physics, possibly with "nuisance parameters") versus a composite alternative (such as the Standard Model plus a new force of nature of unknown strength). The p-value, as well as the ratio of the likelihood under the null hypothesis to the maximized likelihood under the alternative, can strongly disfavor the null hypothesis, while the Bayesian posterior probability for the null hypothesis can be arbitrarily large. The academic statistics literature contains many impassioned comments on this paradox, yet there is no consensus either on its relevance to scientific communication or on its correct resolution. The paradox is quite relevant to frontier research in high energy physics. This paper is an attempt to explain the situation to both physicists and statisticians, in the hope that further progress can be made.
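    The numerical sketch below reproduces the textbook form of the effect under simple assumptions (not the paper's physics example): testing H0: μ = 0 against H1: μ ~ N(0, τ²) with unit-variance observations and a fixed z = 3, the two-sided p-value stays near 0.0027 while the Bayes factor in favour of the null grows with the sample size.

        # Jeffreys-Lindley illustration: fixed z, growing n (assumed normal model, unit prior width).
        import math

        def p_value_two_sided(z):
            return math.erfc(abs(z) / math.sqrt(2.0))        # 2 * (1 - Phi(|z|))

        def bayes_factor_01(z, n, sigma=1.0, tau=1.0):
            r = n * tau**2 / sigma**2                        # ratio of prior to sampling variance
            return math.sqrt(1.0 + r) * math.exp(-0.5 * z**2 * r / (1.0 + r))

        z = 3.0
        for n in (10, 1_000, 100_000, 10_000_000):
            print(f"n={n:>9}  p={p_value_two_sided(z):.4f}  B01={bayes_factor_01(z, n):.2f}")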

    Recent advances in directional statistics

    Mainstream statistical methodology is generally applicable to data observed in Euclidean space. There are, however, numerous contexts of considerable scientific interest in which the natural supports for the data under consideration are Riemannian manifolds like the unit circle, torus, sphere and their extensions. Typically, such data can be represented using one or more directions, and directional statistics is the branch of statistics that deals with their analysis. In this paper we provide a review of the many recent developments in the field since the publication of Mardia and Jupp (1999), still the most comprehensive text on directional statistics. Many of those developments have been stimulated by interesting applications in fields as diverse as astronomy, medicine, genetics, neurology, aeronautics, acoustics, image analysis, text mining, environmetrics, and machine learning. We begin by considering developments for the exploratory analysis of directional data before progressing to distributional models, general approaches to inference, hypothesis testing, regression, nonparametric curve estimation, methods for dimension reduction, classification and clustering, and the modelling of time series, spatial and spatio-temporal data. An overview of currently available software for analysing directional data is also provided, and potential future developments are discussed.
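    As a minimal example of the exploratory analysis mentioned above, the sketch below computes the circular mean and mean resultant length of a handful of made-up angles clustered around 0 degrees; a naive arithmetic mean of the same values would be badly misleading.

        # Circular mean and mean resultant length for angles on the unit circle (toy data).
        import numpy as np

        angles = np.deg2rad([350.0, 5.0, 10.0, 355.0, 15.0])    # clustered near 0 degrees
        C, S = np.cos(angles).mean(), np.sin(angles).mean()
        mean_direction = np.degrees(np.arctan2(S, C))            # circular mean, near 0 degrees
        resultant_length = np.hypot(C, S)                        # concentration measure in [0, 1]
        naive_mean = np.degrees(angles).mean()                   # about 147 degrees: misleading

        print(f"circular mean {mean_direction:.1f} deg, R = {resultant_length:.3f}, "
              f"naive mean {naive_mean:.1f} deg")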

    Concentration of personal and household crimes in England and Wales

    Crime is disproportionately concentrated in a few areas. Though this is long established, there remains uncertainty about the reasons for variation in the concentration of similar crimes (repeats) or different crimes (multiples). Wholly neglected have been composite crimes, in which more than one crime type coincides as parts of a single event. The research reported here disentangles area crime concentration into repeat, multiple and composite crimes. The results are based on estimated bivariate zero-inflated Poisson regression models with a covariance structure which explicitly account for crime rarity and crime concentration. The implications of the results for criminological theorizing, and as a possible basis for more equitable police funding, are discussed.
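    To illustrate the building block of the model class named above, the sketch below fits a univariate zero-inflated Poisson by maximum likelihood to a toy vector of area crime counts. The bivariate extension with a covariance structure used in the paper is not shown; all data and starting values here are invented.

        # Univariate zero-inflated Poisson (ZIP) maximum likelihood on toy counts.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import gammaln

        def zip_negloglik(params, y):
            """params = (logit of zero-inflation probability, log Poisson rate)."""
            pi = 1.0 / (1.0 + np.exp(-params[0]))
            lam = np.exp(params[1])
            pois_logpmf = -lam + y * np.log(lam) - gammaln(y + 1)
            ll_zero = np.log(pi + (1.0 - pi) * np.exp(-lam))     # structural or sampling zero
            ll_pos = np.log(1.0 - pi) + pois_logpmf
            return -np.sum(np.where(y == 0, ll_zero, ll_pos))

        y = np.array([0, 0, 0, 0, 1, 0, 2, 0, 0, 3, 0, 1, 0, 0, 5])   # toy counts per area
        fit = minimize(zip_negloglik, x0=np.zeros(2), args=(y,))
        pi_hat, lam_hat = 1.0 / (1.0 + np.exp(-fit.x[0])), np.exp(fit.x[1])
        print(f"zero-inflation {pi_hat:.2f}, Poisson rate {lam_hat:.2f}")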