
    Real and strongly real classes in PGLn(q) and quasi-simple covers of PSLn(q)

    We classify the real and strongly real conjugacy classes in PGLn(q), PSLn(q) and all quasi-simple covers of PSLn(q). In each case we give a formula for the number of real, and the number of strongly real, conjugacy classes. This is a companion paper to [Gill and Singh, J. Group Theory, May 2011, 14:3, pp.437-459] in which we classified the real and strongly real conjugacy classes in GLn(q) and SLn(q).

    Is Economic Recovery a Myth? Robust Estimation of Impulse Responses

    There is a lively debate on the persistence of the current banking crisis' impact on GDP. Impulse Response Functions (IRF) estimated by Cerra and Saxena (2008) suggest that the effects of earlier crises were long-lasting. We show that standard estimates of IRFs are highly sensitive to misspecification of the underlying data generation process. Direct estimation of IRFs by a methodology similar to Jorda's (2005) local projection method is robust to misspecifications of the data generation process but yields biased estimates when country fixed effects are added. We propose a simple method to deal with this bias, which we apply to panel data from 99 countries for the period 1974-2001. Our estimates suggest that an average banking crisis leads to an output loss of around 10 percent with little sign of recovery. GDP losses from banking crises are more severe for African countries and economies in transition.
    Keywords: banking crisis, impulse response, panel data
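The local-projection idea can be illustrated with a toy sketch. Note the assumptions: this is a single time series, not the paper's panel setting with fixed effects and a bias correction, and the data-generating process and function names below are illustrative. For each horizon h, one regresses y at time t+h on the shock indicator at time t; the slope coefficient is the impulse response at h.

```python
import numpy as np

def local_projection_irf(y, shock, max_h):
    """Jorda-style local projection: for each horizon h, regress y_{t+h}
    on shock_t plus a constant; the slope coefficient is the IRF at h."""
    irf = []
    for h in range(max_h + 1):
        yh = y[h:]                       # y_{t+h}
        sh = shock[:len(shock) - h]      # shock_t
        X = np.column_stack([np.ones_like(sh), sh])
        beta, *_ = np.linalg.lstsq(X, yh, rcond=None)
        irf.append(beta[1])
    return np.array(irf)

# Toy AR(1) process in which a "crisis" lowers output by 1, decaying at
# rate 0.5 per period, so the true IRF at horizon h is -0.5**h.
rng = np.random.default_rng(0)
T = 2000
shock = (rng.random(T) < 0.05).astype(float)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] - shock[t] + 0.05 * rng.standard_normal()

irf = local_projection_irf(y, shock, max_h=5)
```

Because each horizon is estimated by a separate regression, the IRF shape is not constrained by an assumed dynamic model, which is the robustness property the abstract refers to.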

    The Bayesian sampler: generic Bayesian inference causes incoherence in human probability judgments

    Human probability judgments are systematically biased, in apparent tension with Bayesian models of cognition. But perhaps the brain does not represent probabilities explicitly, but approximates probabilistic calculations through a process of sampling, as used in computational probabilistic models in statistics. Naïve probability estimates can be obtained by calculating the relative frequency of an event within a sample, but these estimates tend to be extreme when the sample size is small. We propose instead that people use a generic prior to improve the accuracy of their probability estimates based on samples, and we call this model the Bayesian sampler. The Bayesian sampler trades off the coherence of probabilistic judgments for improved accuracy, and provides a single framework for explaining phenomena associated with diverse biases and heuristics such as conservatism and the conjunction fallacy. The approach turns out to provide a rational reinterpretation of “noise” in an important recent model of probability judgment, the probability theory plus noise model (Costello & Watts, 2014, 2016a, 2017; Costello & Watts, 2019; Costello, Watts, & Fisher, 2018), making equivalent average predictions for simple events, conjunctions, and disjunctions. The Bayesian sampler does, however, make distinct predictions for conditional probabilities and distributions of probability estimates. We show in 2 new experiments that this model better captures these mean judgments both qualitatively and quantitatively; which model best fits individual distributions of responses depends on the assumed size of the cognitive sample.
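A minimal sketch of the core mechanism, under the simplifying assumption that the generic prior is a symmetric Beta(β, β) distribution and the reported estimate is the posterior mean; the parameter values and function names are illustrative, not the paper's fitted model:

```python
import random

def naive_estimate(p_true, n, rng):
    """Relative frequency of the event in a mental sample of size n."""
    k = sum(rng.random() < p_true for _ in range(n))
    return k / n

def bayesian_sampler_estimate(p_true, n, beta, rng):
    """Posterior mean under a symmetric Beta(beta, beta) 'generic' prior:
    (k + beta) / (n + 2*beta) shrinks the raw frequency toward 0.5,
    more strongly for small samples -- producing conservatism."""
    k = sum(rng.random() < p_true for _ in range(n))
    return (k + beta) / (n + 2 * beta)

# With p_true = 0.9 and small samples (n = 5, beta = 1), the average
# estimate is pulled toward 0.5: E = (n*p + beta) / (n + 2*beta) ~ 0.786.
rng = random.Random(0)
avg = sum(bayesian_sampler_estimate(0.9, 5, 1.0, rng)
          for _ in range(20000)) / 20000
```

The shrinkage improves average accuracy of noisy small-sample estimates, but because every judgment is pulled toward 0.5, sets of related judgments (e.g. an event and its complement, or a conjunction and its conjuncts) need no longer cohere as probability theory demands.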

    Mitigating the effects of atmospheric distortion using DT-CWT fusion

    This paper describes a new method for mitigating the effects of atmospheric distortion on observed images, particularly airborne turbulence which degrades a region of interest (ROI). In order to provide accurate detail from objects behind the distorting layer, a simple and efficient frame selection method is proposed to pick informative ROIs from only good-quality frames. We solve the space-variant distortion problem using region-based fusion based on the Dual Tree Complex Wavelet Transform (DT-CWT). We also propose an object alignment method for pre-processing the ROI since this can exhibit significant offsets and distortions between frames. Simple haze removal is used as the final step. The proposed method performs very well with atmospherically distorted videos and outperforms other existing methods. Index Terms: Image restoration, fusion, DT-CWT
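The abstract does not specify the frame-selection metric, and the fusion step requires a DT-CWT implementation, so the sketch below only illustrates the quality-based frame-selection idea, using Laplacian variance as a stand-in sharpness score (an assumption, not the paper's method):

```python
import numpy as np

def sharpness(frame):
    """Variance of a discrete Laplacian, a common focus/quality proxy
    (a stand-in: the paper's actual selection metric is not specified)."""
    lap = (-4.0 * frame[1:-1, 1:-1]
           + frame[:-2, 1:-1] + frame[2:, 1:-1]
           + frame[1:-1, :-2] + frame[1:-1, 2:])
    return float(lap.var())

def select_frames(frames, keep=0.5):
    """Return indices of the top `keep` fraction of frames by sharpness."""
    scores = [sharpness(f) for f in frames]
    order = np.argsort(scores)[::-1]          # best first
    n_keep = max(1, int(len(frames) * keep))
    return sorted(int(i) for i in order[:n_keep])

# Two smooth (blurred/degraded) frames and two high-detail frames.
rng = np.random.default_rng(0)
smooth = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))
frames = [smooth, rng.random((32, 32)), smooth, rng.random((32, 32))]
selected = select_frames(frames)
```

In the paper's pipeline the surviving ROIs would then be aligned and fused region-by-region in the DT-CWT domain; that step is omitted here.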

    The Breakdown of Morale

    This paper studies how morale in teams can break down. It interprets high morale as team members working together productively, either because of a sense of fairness or because of implicit incentives from repeated interactions. Team members learn that lay-offs will occur at a fixed future date, which will eventually cause morale to break down. The paper shows that the breakdown of morale can vary in size and the equilibrium outcomes can be Pareto ranked. A firm's measures to encourage cooperation may actually hurt morale, by convincing opportunistic team members to imitate and later take advantage of cooperative colleagues.

    Imprint of DESI fiber assignment on the anisotropic power spectrum of emission line galaxies

    The Dark Energy Spectroscopic Instrument (DESI), a multiplexed fiber-fed spectrograph, is a Stage-IV ground-based dark energy experiment aiming to measure redshifts for 29 million Emission-Line Galaxies (ELG), 4 million Luminous Red Galaxies (LRG), and 2 million Quasi-Stellar Objects (QSO). The survey design includes a pattern of tiling on the sky and the locations of the fiber positioners in the focal plane of the telescope, with the observation strategy determined by a fiber assignment algorithm that optimizes the allocation of fibers to targets. This strategy allows a given region to be covered on average five times for a five-year survey, but with coverage varying between zero and twelve, which imprints a spatially-dependent pattern on the galaxy clustering. We investigate the systematic effects of the fiber assignment coverage on the anisotropic galaxy clustering of ELGs and show that, in the absence of any corrections, it leads to discrepancies of order ten percent on large scales for the power spectrum multipoles. We introduce a method where objects in a random catalog are assigned a coverage, and the mean density is separately computed for each coverage factor. We show that this method reduces, but does not eliminate the effect. We next investigate the angular dependence of the contaminated signal, arguing that it is mostly localized to purely transverse modes. We demonstrate that the cleanest way to remove the contaminating signal is to perform an analysis of the anisotropic power spectrum P(k,μ) and remove the lowest μ bin, leaving μ>0 modes accurate at the few-percent level. Here, μ is the cosine of the angle between the line-of-sight and the direction of k. We also investigate two alternative definitions of the random catalog and show they are comparable but less effective than the coverage randoms method.
    Comment: Submitted to JCA
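As a rough illustration of the proposed cut (the binning scheme and function names here are illustrative, not the DESI analysis code), one can bin mode powers in (k, μ), with μ = k_z/|k| for a line of sight along z, and discard the lowest-μ bin, where the fiber-assignment contamination is argued to concentrate:

```python
import numpy as np

def pk_mu_binned(kvec, power, k_edges, mu_edges, drop_lowest_mu=True):
    """Average mode powers into (k, mu) cells, mu = |k_z| / |k|
    (line of sight along z), optionally discarding the lowest-mu bin,
    i.e. the nearly transverse modes."""
    k = np.linalg.norm(kvec, axis=1)
    mu = np.abs(kvec[:, 2]) / np.where(k > 0, k, 1.0)
    ik = np.digitize(k, k_edges) - 1
    imu = np.digitize(mu, mu_edges) - 1
    nk, nmu = len(k_edges) - 1, len(mu_edges) - 1
    pk = np.full((nk, nmu), np.nan)
    for i in range(nk):
        for j in range(nmu):
            sel = (ik == i) & (imu == j)
            if sel.any():
                pk[i, j] = power[sel].mean()
    if drop_lowest_mu:
        pk = pk[:, 1:]   # remove the mu bin nearest mu = 0
    return pk

# Demo: isotropic unit-power modes; every surviving (k, mu) cell averages to 1.
rng = np.random.default_rng(1)
kvec = rng.standard_normal((2000, 3))
pk = pk_mu_binned(kvec, np.ones(2000),
                  np.array([0.0, 1.0, 2.0, 3.0]), np.linspace(0.0, 1.0, 5))
```

Because the contamination is mostly confined to purely transverse (μ ≈ 0) modes, dropping that single bin removes most of it while keeping the μ>0 information used for redshift-space distortion analyses.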

    Capacity Constraints and Beliefs about Demand

    This paper examines how a firm can strategically choose its capacity to manipulate consumer beliefs about aggregate demand. It looks at a market with social effects where consumers want to do what is popular, to buy what they believe others want to buy. By imposing a capacity constraint and setting a price just low enough for it to bind, the firm can fool certain naive consumers into believing that demand is greater than it actually is. This will in turn increase the willingness to pay of all consumers through social effects. In equilibrium, the firm will impose a capacity constraint whenever demand is lower than expected, even when the number of naive consumers is arbitrarily small.