    Consistency of the Shannon entropy in quantum experiments

    The consistency of the Shannon entropy, when applied to outcomes of quantum experiments, is analysed. It is shown that the Shannon entropy is fully consistent and its properties are never violated in quantum settings, but attention must be paid to logical and experimental contexts. This last remark is shown to apply regardless of the quantum or classical nature of the experiments. Comment: 12 pages, LaTeX2e/REVTeX4. V5: slightly different from the published version.
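
    For reference, the quantity under discussion is the Shannon entropy of an experiment's outcome distribution. Below is the textbook definition, together with the grouping (recursivity) property as one example of the properties whose status in quantum contexts has been debated; this is standard material, not quoted from the paper.

```latex
% Shannon entropy of an experiment with outcome probabilities p_1, ..., p_n
H(p_1,\dots,p_n) = -\sum_{i=1}^{n} p_i \log_2 p_i,
\qquad p_i \ge 0, \qquad \sum_{i=1}^{n} p_i = 1.

% Grouping (recursivity): an illustrative example of the consistency
% properties at issue when outcomes of different contexts are combined
H(p_1,\dots,p_n) = H(p_1 + p_2,\, p_3,\dots,p_n)
  + (p_1 + p_2)\, H\!\left(\frac{p_1}{p_1+p_2},\, \frac{p_2}{p_1+p_2}\right).
```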

    Drivers and outcomes of work alienation: reviving a concept

    This article sheds new light on an understudied construct in mainstream management theory, namely, work alienation. This is an important area of study because previous research indicates that work alienation is associated with important individual and organizational outcomes. We tested four antecedents of work alienation: decision-making autonomy, task variety, task identity, and social support. Moreover, we examined two outcomes of alienation: deviance and performance, the former measured one year after the independent variables and the latter rated by supervisors. We present evidence from a sample of 283 employees of a construction and consultancy organization in the United Kingdom. The results supported the majority of our hypotheses, indicating that alienation is a concept worthy of exploration in the management sciences.

    On the Computational Complexity of Measuring Global Stability of Banking Networks

    Threats to the stability of a financial system may severely affect the functioning of the entire economy, and thus considerable emphasis is placed on analyzing the cause and effect of such threats. The financial crises of the current and past decades have shown that one important cause of instability in global markets is so-called financial contagion, namely the spreading of instabilities or failures of individual components of the network to other, perhaps healthier, components. This leads to a natural question of whether the regulatory authorities could have predicted, and perhaps mitigated, the current economic crisis by effectively computing some stability measure of the banking networks. Motivated by such observations, we consider the problem of defining and evaluating the stability of both homogeneous and heterogeneous banking networks against the propagation of synchronous idiosyncratic shocks given to a subset of banks. We formalize the homogeneous banking network model of Nier et al. and its corresponding heterogeneous version, formalize the synchronous shock propagation procedures, define two appropriate stability measures, and investigate the computational complexity of evaluating these measures for various network topologies and parameters of interest. Our results and proofs also shed some light on the properties of topologies and parameters of the network that may lead to higher or lower stability. Comment: to appear in Algorithmica.
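
    The abstract does not spell out the propagation rule, so the sketch below is only a rough, self-contained illustration of synchronous shock propagation in the spirit of such models; the function name, the parameters, and the rule that a defaulting bank passes its interbank debts on to its creditors in full are simplifying assumptions, not the paper's formalization.

```python
# Illustrative synchronous shock propagation on a small banking network.
# liabilities[i][j] = amount bank i owes bank j; capital[i] = loss buffer.

def propagate_shock(liabilities, capital, shocked, max_rounds=100):
    """Return the set of defaulted banks after synchronous propagation
    of an initial idiosyncratic shock to the banks in `shocked`."""
    losses = {i: 0.0 for i in capital}
    for i in shocked:
        losses[i] = capital[i]              # shock exhausts the buffer
    defaulted = set()
    for _ in range(max_rounds):
        newly = {i for i in capital
                 if i not in defaulted and losses[i] >= capital[i]}
        if not newly:
            break                           # fixed point: cascade is over
        defaulted |= newly
        for i in newly:                     # defaulting banks transmit
            for j, amount in liabilities[i].items():   # losses to creditors
                if j not in defaulted:
                    losses[j] += amount
    return defaulted

# Tiny ring network: each bank owes 2.0 to its clockwise neighbour and
# holds a buffer of 1.5, so a single failure cascades around the ring.
banks = range(4)
liabilities = {i: {(i + 1) % 4: 2.0} for i in banks}
capital = {i: 1.5 for i in banks}
dead = propagate_shock(liabilities, capital, shocked={0})
print(f"defaulted: {sorted(dead)}, surviving fraction: {1 - len(dead)/4:.2f}")
```

    A stability measure of the kind the paper studies could then be, for instance, the surviving fraction minimized or averaged over the choice of shocked banks; the two measures actually defined in the paper are not reproduced here.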

    Facts, Values and Quanta

    Quantum mechanics is a fundamentally probabilistic theory (at least so far as its empirical predictions are concerned). It follows that, if one wants to properly understand quantum mechanics, it is essential to clearly understand the meaning of probability statements. The interpretation of probability has excited nearly as much philosophical controversy as the interpretation of quantum mechanics. 20th-century physicists have mostly adopted a frequentist conception. In this paper it is argued that we ought, instead, to adopt a logical or Bayesian conception. The paper includes a comparison of the orthodox and Bayesian theories of statistical inference. It concludes with a few remarks concerning the implications for the concept of physical reality. Comment: 30 pages, AMS LaTeX.
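
    For reference, the logical/Bayesian conception argued for here treats probability as a degree of rational belief updated by evidence via Bayes' theorem, shown below in its standard form (not quoted from the paper); the frequentist conception instead identifies probabilities with limiting relative frequencies in repeated trials.

```latex
% Posterior degree of belief in hypothesis H_k given data D (Bayes' theorem)
P(H_k \mid D) = \frac{P(D \mid H_k)\, P(H_k)}{\sum_i P(D \mid H_i)\, P(H_i)}.
```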

    Credibility and adjustment: gold standards versus currency boards

    It is often maintained that currency boards (CBs) and gold standards (GSs) are alike in that they are stringent monetary rules, the two basic features of which are high credibility of the monetary authorities and the existence of an automatic (non-discretionary) adjustment mechanism. This article offers a comparative analysis of these two types of regime, both from the perspective of the sources and mechanisms that generate confidence and credibility, and in terms of how the automatic adjustment mechanism operates. Confidence under the GS is endogenously driven, whereas it is exogenously determined under the CB. The CB is a much more asymmetric regime than the GS (the adjustment works largely to the detriment of peripheral countries), although asymmetry is a typical feature of any monetary regime. A lack of credibility is typical of peripheral countries and cannot be overcome completely even by “hard” monetary regimes.
    http://deepblue.lib.umich.edu/bitstream/2027.42/40078/3/wp692.pd

    Come back Marshall, all is forgiven? : Complexity, evolution, mathematics and Marshallian exceptionalism

    Marshall was the great synthesiser of neoclassical economics. Yet with his qualified assumption of self-interest, his emphasis on variation in economic evolution and his cautious attitude to the use of mathematics, Marshall differs fundamentally from other leading neoclassical contemporaries. Metaphors inspire more specific analogies and ontological assumptions, and Marshall used the guiding metaphor of Spencerian evolution. But unfortunately, the further development of a Marshallian evolutionary approach was undermined in part by theoretical problems within Spencer's theory. Yet some things can be salvaged from the Marshallian evolutionary vision. They may even be placed in a more viable Darwinian framework.Peer reviewedFinal Accepted Versio

    Crises and collective socio-economic phenomena: simple models and challenges

    Financial and economic history is strewn with bubbles and crashes, booms and busts, crises and upheavals of all sorts. Understanding the origin of these events is arguably one of the most important problems in economic theory. In this paper, we review recent efforts to include heterogeneities and interactions in models of decision-making. We argue that the Random Field Ising Model (RFIM) provides a unifying framework to account for many collective socio-economic phenomena that lead to sudden ruptures and crises. We discuss different models that can capture potentially destabilising self-referential feedback loops, induced either by herding, i.e. reference to peers, or trending, i.e. reference to the past, and that account for some of the phenomenology missing from the standard models. We discuss some empirically testable predictions of these models, for example robust signatures of RFIM-like herding effects, or the logarithmic decay of spatial correlations of voting patterns. One of the most striking results, inspired by statistical-physics methods, is that Adam Smith's invisible hand can fail badly at solving simple coordination problems. We also stress the issue of time-scales, which can be extremely long in some cases and prevent socially optimal equilibria from being reached. As a theoretical challenge, the study of decision rules that violate so-called detailed balance is needed to decide whether conclusions based on current models (which all assume detailed balance) are indeed robust and generic. Comment: Review paper accepted for a special issue of J Stat Phys; several minor improvements along the reviewers' comments.
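
    As a concrete illustration of the RFIM-style herding mechanism described above, the following is a minimal mean-field sketch; the deterministic best-response update, the parameter values, and the hysteresis experiment are assumptions chosen for brevity, not the exact specification of any model in the review.

```python
import random

# Mean-field Random Field Ising Model of binary decisions: agent i chooses
# s_i = +1 or -1 given a public incentive F, a private preference f_i, and
# imitation (herding) of the average opinion m = <s>, with strength J.

def equilibrate(fields, F, J, m, sweeps=500):
    """Iterate synchronous best-response updates from initial opinion m."""
    for _ in range(sweeps):
        s = [1 if F + f + J * m > 0 else -1 for f in fields]
        m_new = sum(s) / len(s)
        if m_new == m:
            break                       # exact fixed point reached
        m = m_new
    return m

random.seed(0)
n, J = 10_000, 1.5                      # J above the critical herding strength
fields = [random.gauss(0.0, 1.0) for _ in range(n)]

# Sweep the public incentive up, then back down, always starting from the
# previously reached state: the collective opinion m jumps abruptly, and at
# different F on the two sweeps (hysteresis), the RFIM picture of a crisis.
m = -1.0
for label, Fs in [("up", range(-10, 11, 2)), ("down", range(10, -11, -2))]:
    print(f"sweep {label}:")
    for F10 in Fs:
        F = F10 / 10
        m = equilibrate(fields, F, J, m)
        print(f"  F={F:+.1f}  m={m:+.3f}")
```

    With these (illustrative) values the jump occurs at a slightly positive F on the way up and a slightly negative F on the way down, so the same incentive can sustain two very different collective states, one version of the sudden ruptures discussed in the review.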

    Bayesian Probability and Statistics in Management Research: A New Horizon

    This special issue focuses on how a Bayesian approach to estimation, inference, and reasoning in organizational research might supplement—and in some cases supplant—traditional frequentist approaches. Bayesian methods are well suited to address the increasingly complex phenomena and problems faced by 21st-century researchers and organizations, where very complex data abound and the validity of knowledge and methods is often seen as contextually driven and constructed. Traditional modeling techniques and a frequentist view of probability and method are challenged by this new reality.
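
    As a small, self-contained example of the kind of Bayesian estimation such a special issue discusses, here is a conjugate beta-binomial update; the survey scenario, the prior, and the counts are invented purely for illustration.

```python
from math import sqrt

# Beta-Binomial updating: estimate the proportion of employees endorsing a
# survey item. Prior Beta(a, b) + binomial data -> posterior Beta(a+s, b+f).

def beta_posterior(a_prior, b_prior, successes, failures):
    """Conjugate update: return the posterior Beta parameters."""
    return a_prior + successes, b_prior + failures

# Hypothetical data: 47 of 70 respondents endorse the item; a weak Beta(2, 2)
# prior expresses a mild expectation that the proportion is near 0.5.
a, b = beta_posterior(a_prior=2, b_prior=2, successes=47, failures=23)
mean = a / (a + b)
sd = sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
print(f"posterior Beta({a}, {b}): mean={mean:.3f}, sd={sd:.3f}")
```

    Unlike a frequentist point estimate with a confidence interval, the result is a full posterior distribution over the parameter, which can be summarized, compared across contexts, or fed directly into a decision analysis.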