
    The private finance initiative (PFI) and finance capital: A note on gaps in the "accountability" debate

    During recent years, a wide spectrum of research has questioned whether public services/infrastructure procurement through private finance, as exemplified by the UK Private Finance Initiative (PFI), meets minimum standards of democratic accountability. While broadly agreeing with some of these arguments, this paper suggests that this debate is flawed on two grounds. Firstly, PFI is not about effective procurement, or even about a pragmatic choice of procurement mechanisms which can potentially compromise public involvement and input; rather, it is about a process whereby the state creates new profit opportunities at a time when the international financial system is increasingly lacking in safe investment opportunities. Secondly, because of its primary function as an investment opportunity, PFI, by its very nature, prioritises the risk-return criteria of private finance over the needs of the public sector client and its stakeholders. Using two case studies of recent PFI projects, the paper illustrates some of the mechanisms through which finance capital exercises control over the PFI procurement process. The paper concludes that recent proposals aimed at “reforming” or “democratising” PFI fail to recognise the objective constraints which this type of state-finance capital nexus imposes on the political process.

    Cosmological Parameters from CMB Maps without Likelihood Approximation

    We propose an efficient Bayesian MCMC algorithm for estimating cosmological parameters from CMB data without the use of likelihood approximations. It builds on a previously developed Gibbs sampling framework that allows for exploration of the joint CMB sky signal and power spectrum posterior, P(s, C_l | d), and addresses a long-standing problem of efficient parameter estimation simultaneously in high and low signal-to-noise regimes. To achieve this, our new algorithm introduces a joint Markov chain move in which both the signal map and power spectrum are synchronously modified, by rescaling the map according to the proposed power spectrum before evaluating the Metropolis-Hastings acceptance probability. Such a move was already introduced by Jewell et al. (2009), who used it to explore low signal-to-noise posteriors. However, they also found that the same algorithm is inefficient in the high signal-to-noise regime, since a brute-force rescaling operation does not account for phase information. This problem is mitigated in the new algorithm by subtracting the Wiener filter mean field from the proposed map prior to rescaling, leaving high signal-to-noise information invariant in the joint step, and effectively only rescaling the low signal-to-noise component. To explore the full posterior, the new joint move is then interleaved with a standard conditional Gibbs sky map move. We apply our new algorithm to simplified simulations for which we can evaluate the exact posterior, to study both its accuracy and performance, and find good agreement with the exact posterior; marginal means agree to less than 0.006 sigma, and standard deviations to better than 3%. The Markov chain correlation length is of the same order of magnitude as those obtained by other standard samplers in the field.
    Comment: 9 pages, 3 figures, Published in Ap
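
    To make the joint move concrete, the sketch below (Python, written for this listing rather than taken from the paper) boils the idea down to a toy model: n independent Gaussian modes with a single signal variance C standing in for the power spectrum, a known noise variance N, and a flat prior on C. It interleaves the rescaling move – with the Wiener-filter mean field subtracted, as the abstract describes – with a conditional Gibbs draw of the map. All names, numbers and the one-parameter model are assumptions made for illustration.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy data model (illustrative assumptions, not the paper's setup):
        # n independent "sky" modes, a single signal variance C in place of the
        # power spectrum, known noise variance N, and data d = s + noise.
        n, C_true, N = 500, 4.0, 1.0
        s_true = rng.normal(0.0, np.sqrt(C_true), n)
        d = s_true + rng.normal(0.0, np.sqrt(N), n)

        def log_posterior(s, C):
            # log P(s, C | d) up to a constant, with a flat prior on C > 0.
            if C <= 0:
                return -np.inf
            log_like = -0.5 * np.sum((d - s) ** 2) / N                   # P(d | s)
            log_prior = -0.5 * np.sum(s ** 2) / C - 0.5 * n * np.log(C)  # P(s | C)
            return log_like + log_prior

        def wiener_mean(C):
            # Wiener-filter mean field of d for signal variance C.
            return C / (C + N) * d

        def gibbs_sky_move(C):
            # Conditional draw s ~ P(s | C, d); exact for this Gaussian toy.
            var = 1.0 / (1.0 / C + 1.0 / N)
            return wiener_mean(C) + np.sqrt(var) * rng.normal(size=n)

        def joint_move(s, C, step=0.3):
            # Joint (s, C) Metropolis-Hastings move: propose C', then rescale only
            # the fluctuation about the Wiener-filter mean field, so the
            # data-dominated (high signal-to-noise) part of the map barely changes.
            C_prop = C + step * rng.normal()           # symmetric proposal on C
            if C_prop <= 0:
                return s, C
            mu, mu_prop = wiener_mean(C), wiener_mean(C_prop)
            s_prop = mu_prop + np.sqrt(C_prop / C) * (s - mu)
            # The rescaling is a deterministic map of s, so its log-Jacobian,
            # (n/2) * log(C'/C), enters the acceptance probability.
            log_accept = (log_posterior(s_prop, C_prop) - log_posterior(s, C)
                          + 0.5 * n * np.log(C_prop / C))
            if np.log(rng.uniform()) < log_accept:
                return s_prop, C_prop
            return s, C

        # Interleave the joint move with the conditional Gibbs sky-map move.
        s, C, chain = d.copy(), 1.0, []
        for _ in range(5000):
            s = gibbs_sky_move(C)
            s, C = joint_move(s, C)
            chain.append(C)

        print(f"posterior mean of C ~ {np.mean(chain[500:]):.2f} (true value {C_true})")

    In the paper itself the role of C is played by the full power spectrum (or the cosmological parameters that determine it) and the moves operate on harmonic-space sky maps; the toy printout above only checks the chain against its own known input.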

    An Experiential Comparative Tool for Board Games

    In the field of game studies, contemporary board games have until now remained relatively unexplored. Recent years have seen the emergence of occasional academic texts focusing on board games – such as Eurogames (Woods, 2012), Characteristics of Games (Elias et al. 2012), and most recently Game Play: Paratextuality in Contemporary Board Games (Booth, 2015). These authors all explore board games from diverse viewpoints, but none of them presents a viable and practical analytical tool with which to examine and differentiate one board game from another. In this vein, this paper seeks to present an analytical comparative tool intended specifically for board games. The tool builds upon previous works (Aarseth et al. 2003; Elias et al. 2012; and Woods 2012) to show how four categories – rules, luck, interaction and theme – can interact on different levels to generate diverse gameplay experiences. Such a tool allows games to be scored objectively and separately in each of the categories, creating a combined gameplay experience profile for each board game. The paper then presents numerous practical examples drawn from contemporary board games and shows how the tool can be used from a design perspective and an analytical perspective alike.
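
    As a small illustration of the kind of comparative tool the abstract describes, the Python sketch below (not from the paper) stores one score per category – rules, luck, interaction and theme – and compares two games by the distance between their profiles. The 0-10 scale, the distance measure, and the example games and scores are all assumptions made purely for illustration.

        from dataclasses import dataclass, astuple

        @dataclass
        class GameplayProfile:
            # One score per category; the 0-10 scale is an assumption for this sketch.
            rules: float        # weight and complexity of the rules
            luck: float         # influence of chance on outcomes
            interaction: float  # degree of player-to-player interaction
            theme: float        # prominence of theme in the play experience

        def profile_distance(a: GameplayProfile, b: GameplayProfile) -> float:
            # Euclidean distance between two profiles: one simple way to quantify
            # how different two gameplay experiences are.
            return sum((x - y) ** 2 for x, y in zip(astuple(a), astuple(b))) ** 0.5

        # Hypothetical scores for two well-known games, for illustration only.
        catan = GameplayProfile(rules=4, luck=6, interaction=7, theme=5)
        agricola = GameplayProfile(rules=8, luck=3, interaction=4, theme=6)
        print(f"profile distance: {profile_distance(catan, agricola):.1f}")

    A designer might use such profiles to position a new game relative to existing ones, while an analyst could group games whose experience profiles are similar.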

    Harmonising Basel III and the Dodd Frank Act through international accounting standards: reasons why international accounting standards should serve as “thermostats”

    Why should differences between regulatory and accounting policies be mitigated? Because mitigating such differences could facilitate convergence – as well as financial stability. The paper “Fair Value Accounting and Procyclicality: Mitigating Regulatory and Accounting Policy Differences through Regulatory Structure Reforms and Enforced Self Regulation” illustrates how the implementation of accounting standards and policies has, in certain instances, contrasted with Basel Committee initiatives aimed at mitigating procyclicality and facilitating forward-looking provisioning. The paper also highlights how and why differences between regulatory and accounting policies could (and should) be mitigated. This paper focuses on how recent regulatory reforms – with particular reference to the Dodd Frank Act – impact fair value measurements. Other potential implications for accounting measurements and valuation will also be considered. Given the tendency for discrepancies to arise between regulatory and accounting policies, and owing to discrepancies between Basel III and the Dodd Frank Act, would a more imposing and commanding role for international standards not serve as a powerful weapon in harmonising Basel III and Dodd Frank, whilst mitigating regulatory and accounting policy differences?
    Keywords: financial stability; OTC derivatives markets; counterparty risks; disclosure; information asymmetry; transparency; living wills; Volcker Rule; Basel III; Basel II; procyclicality; international auditing standards; Dodd Frank Act; fair values