
    Editors' Welcome

    Welcome! We are excited to present the first issue of Pennsylvania Libraries: Research & Practice. Also known as PaLRaP, this new publication is for anyone who works in or cares about libraries in Pennsylvania.

    Here Comes the Sunburst: Measuring and Visualizing Scholarly Impact

    Our ARL institution partnered with a new service (PlumX) to track, measure, and visualize faculty scholarly impact. In a pilot project, both traditional and emerging measures of scholarly impact were collected for 32 researchers. The presenters will chronicle the data management and enhancements applied, including utilizing content from our institutional repository, importing and enriching metadata, and using an intranet to organize work and collaborate with colleagues. Results will assist faculty and those who work with them in identifying strengths and weaknesses of scholarly impact and where to focus efforts to increase research visibility.
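
    As a purely hypothetical illustration of the kind of visualization the title refers to (not the presenters' actual pipeline or data), the sketch below builds a sunburst chart of per-researcher impact metrics from a small made-up dataset using plotly; every name, category, and count is a placeholder.

        # Hypothetical sketch: a sunburst of impact metrics per researcher,
        # built from made-up records. Assumes pandas and plotly are installed.
        import pandas as pd
        import plotly.express as px

        # Placeholder records: researcher -> metric category -> metric -> count.
        records = pd.DataFrame([
            {"researcher": "Researcher A", "category": "Citations", "metric": "Scopus", "count": 120},
            {"researcher": "Researcher A", "category": "Usage", "metric": "Downloads", "count": 940},
            {"researcher": "Researcher A", "category": "Social Media", "metric": "Tweets", "count": 35},
            {"researcher": "Researcher B", "category": "Citations", "metric": "Scopus", "count": 60},
            {"researcher": "Researcher B", "category": "Usage", "metric": "Downloads", "count": 310},
        ])

        # Rings: researcher at the center, then metric category, then individual metric.
        fig = px.sunburst(records, path=["researcher", "category", "metric"], values="count")
        fig.write_html("impact_sunburst.html")  # open in a browser to explore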

    Shipboard Crisis Management: A Case Study.

    The loss of the "Green Lily" in 1997 is used as a case study to highlight the characteristics of escalating crises. As in similar safety critical industries, these situations are unpredictable events that may require co-ordinated but flexible and creative responses from individuals and teams working in stressful conditions. Fundamental skill requirements for crisis management are situational awareness and decision making. This paper reviews the naturalistic decision making (NDM) model for insights into the nature of these skills and considers the optimal training regimes to cultivate them. The paper concludes with a review of the issues regarding the assessment of crisis management skills and current research into the determination of behavioural markers for measuring competence

    Local Retrodiction Models for Photon-Noise-Limited Images

    Imaging technologies working at very low light levels acquire data by attempting to count the number of photons impinging on each pixel. Especially in cases with, on average, less than one photocount per pixel, the resulting images are heavily corrupted by Poissonian noise, and a host of successful algorithms has been developed to reconstruct the original image from this noisy data. Here we review a recently proposed scheme that complements these algorithms by calculating the full probability distribution for the local intensity behind the noisy photocount measurements. Such a probabilistic treatment opens the way to hypothesis testing and confidence levels for conclusions drawn from image analysis.
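
    As a concrete, minimal sketch of such a probabilistic treatment (an assumed toy model, not the reviewed scheme itself), the snippet below retrodicts the mean intensity behind a single pixel's Poisson photocount using a conjugate Gamma prior, so the full posterior distribution comes out in closed form.

        # Minimal sketch: Bayesian retrodiction of a pixel's mean intensity
        # from its Poisson photocount, under an assumed conjugate Gamma prior.
        from scipy.stats import gamma

        def intensity_posterior(n_counts, prior_shape=1.0, prior_rate=1.0):
            """Gamma posterior over the mean intensity given n_counts photocounts.

            Poisson likelihood + Gamma(prior_shape, prior_rate) prior
            => Gamma(prior_shape + n_counts, prior_rate + 1) posterior.
            """
            return gamma(a=prior_shape + n_counts, scale=1.0 / (prior_rate + 1.0))

        # Example: a pixel that registered a single photocount.
        post = intensity_posterior(n_counts=1)
        print("posterior mean intensity:", post.mean())
        print("95% credible interval:", post.interval(0.95))

    With a full posterior in hand rather than a point estimate, credible intervals like the one printed above are the kind of confidence statement the abstract refers to.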

    Rethinking the Liquidity Puzzle: Application of a New Measure of the Economic Money Stock

    Historically, attempts to solve the liquidity puzzle have focused on narrowly defined monetary aggregates, such as non-borrowed reserves, the monetary base, or M1. Many of these efforts have failed to find a short-term negative correlation between interest rates and monetary policy innovations. More recent research uses sophisticated macroeconomic and econometric modeling. However, little research has investigated the role measurement error plays in the liquidity puzzle, since in nearly every case, work investigating the liquidity puzzle has used one of the official monetary aggregates, which have been shown to exhibit significant measurement error. This paper examines the role that measurement error plays in the liquidity puzzle by (i) providing a theoretical framework explaining how the official simple-sum methodology can lead to a liquidity puzzle, and (ii) testing for the liquidity effect by estimating an unrestricted VAR.

    Keywords: Liquidity Puzzle, Monetary Policy, Monetary Aggregation, Money Stock, Divisia Index Numbers
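
    The snippet below sketches the kind of unrestricted-VAR exercise described in (ii), using simulated placeholder series rather than the paper's data or variable choices: fit a two-variable VAR of money growth and a short-term interest rate, then inspect the orthogonalized impulse response of the rate to a money innovation, whose short-run sign is the liquidity-effect check.

        # Illustrative sketch with simulated data (not the paper's dataset):
        # estimate an unrestricted VAR and look at the interest-rate response
        # to a money-growth innovation.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.api import VAR

        rng = np.random.default_rng(0)
        data = pd.DataFrame({
            "money_growth": rng.normal(0.4, 0.2, 240),    # % per month (simulated)
            "interest_rate": rng.normal(5.0, 0.5, 240),   # % per annum (simulated)
        })

        results = VAR(data).fit(maxlags=4)  # unrestricted VAR, 4 lags for illustration

        # A liquidity effect would appear as a negative short-run response of
        # the interest rate to a positive money-growth shock.
        irf = results.irf(24)
        resp = irf.orth_irfs[:, data.columns.get_loc("interest_rate"),
                                data.columns.get_loc("money_growth")]
        print("interest-rate response to a money shock (first 6 periods):", resp[:6])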

    Forecast Design in Monetary Capital Stock Measurement

    We design a procedure for measuring the United States capital stock of money implied by the Divisia monetary aggregate service flow, in a manner consistent with the present-value model of economic capital stock. We permit non-martingale expectations and time varying discount rates. Based on Barnett’s (1991) definition of the economic stock of money, we compute the U.S. economic stock of money by discounting to present value the flow of expected expenditure on the services of monetary assets, where expenditure on monetary services is evaluated at the user costs of the monetary components. As a theoretically consistent measure of money stock, our economic stock of money nests Rotemberg, Driscoll, and Poterba’s (1995) currency equivalent index as a special case, under the assumption of martingale expectations. To compute the economic stock of money without imposing martingale expectations, we define a procedure for producing the necessary forecasts based on an asymmetric vector autoregressive model and a Bayesian vector autoregressive model. In application of this proposed procedure, Barnett, Chae, and Keating (2005) find the resulting capital-stock growth-rate index to be surprisingly robust to the modeling of expectations. Similarly, the primary conclusions of this supporting paper regard robustness. We believe that further experiments with other forecasting models would further confirm our robustness conclusion. Different forecasting models can produce substantial differences in forecasts into the distant future. But since the distant future is heavily discounted in our stock formula, and since alternative forecasting formulas rarely produce dramatic differences in short-term forecasts, we believe that our robustness result obviates prior concerns about the dependency of theoretical monetary capital stock computations upon forecasts of future expected flows. Even the simple martingale forecast, which has no unknown parameters and is easily computed with current period data, produces a discounted stock measure that is adequate for most purposes. Determining an easily measured extended index that can remove the small bias that we identify under the martingale forecast remains a subject for our future research.

    At the time that Milton Friedman (1969) was at the University of Chicago, the “Chicago School” view on the monetary transmission mechanism was based upon the wealth effect, called the “real balance effect” or “Pigou (1943) effect,” of open market operations. Our research identifies very large errors in the wealth effects computed from the conventional simple sum monetary aggregates and makes substantial progress in the direction of accurate measurement of monetary-policy wealth effects.

    Keywords: Monetary aggregation, Divisia money aggregate, economic stock of money, user cost of money, currency equivalent index, Bayesian vector autoregression, asymmetric vector autoregression.
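
    To make the present-value construction concrete, the following sketch (simplified, assumed notation rather than the authors' code) discounts a truncated stream of expected expenditure on monetary services, evaluated at user costs, and checks numerically that under martingale expectations the result approaches the currency equivalent index.

        # Stylized sketch of a discounted economic stock of money and the
        # currency-equivalent (CE) index as its martingale-expectations limit.
        # Notation and numbers are illustrative assumptions.
        import numpy as np

        def economic_stock(exp_quantities, exp_own_rates, exp_benchmark, horizon):
            """Truncated present value of expected expenditure on monetary services.

            exp_quantities[s, i] : expected balances of component i in period t+s
            exp_own_rates[s, i]  : expected own rate of return of component i
            exp_benchmark[s]     : expected benchmark rate in period t+s
            """
            value, discount = 0.0, 1.0
            for s in range(horizon):
                discount *= 1.0 + exp_benchmark[s]
                user_cost = exp_benchmark[s] - exp_own_rates[s]   # simplified user cost
                value += np.sum(user_cost * exp_quantities[s]) / discount
            return value

        def ce_index(quantities, own_rates, benchmark):
            """Currency-equivalent index: the martingale-expectations special case."""
            return np.sum((benchmark - own_rates) / benchmark * quantities)

        m = np.array([800.0, 400.0, 250.0])   # hypothetical component balances
        r = np.array([0.00, 0.02, 0.03])      # hypothetical own rates
        R = 0.05                              # hypothetical benchmark rate
        H = 2000                              # long horizon approximating the infinite sum
        pv = economic_stock(np.tile(m, (H, 1)), np.tile(r, (H, 1)), np.full(H, R), H)
        print(pv, ce_index(m, r, R))          # the two values should nearly agree

    Because each period's expenditure is divided by the cumulated benchmark-rate discount factor, distant periods contribute little, which is the mechanism behind the robustness claim: forecasting models that disagree mainly about the distant future produce similar stock measures.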

    The Discounted Economic Stock of Money with VAR Forecasting

    We measure the United States capital stock of money implied by the Divisia monetary aggregate service flow, in a manner consistent with the present-value model of economic capital stock. We permit non-martingale expectations and time varying discount rates. Based on Barnett’s (1991) definition of the economic stock of money, we compute the U.S. economic stock of money by discounting to present value the flow of expected expenditure on the services of monetary assets, where expenditure on monetary services is evaluated at the user costs of the monetary components. As a theoretically consistent measure of money stock, our economic stock of money nests Rotemberg, Driscoll, and Poterba’s (1995) currency equivalent index as a special case, under the assumption of martingale expectations. To compute the economic stock of money without imposing martingale expectations, we use forecasts based on the asymmetric vector autoregressive model and the Bayesian vector autoregressive model. We find the resulting capital-stock growth-rate index to be surprisingly robust to the modeling of expectations.

    Keywords: monetary aggregation, economic stock of money, aggregation theory, index number theory, VAR forecasting, wealth effect, Pigou effect, real balance effect, measurement
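
    A companion sketch (placeholder data and variable names, with an ordinary VAR standing in for the paper's asymmetric and Bayesian specifications): forecast a monetary-services expenditure series and the benchmark rate with a VAR, then discount the forecast expenditures to a truncated present value instead of imposing martingale expectations.

        # Illustrative sketch: VAR forecasts feeding a truncated present-value
        # stock measure. Data are simulated placeholders.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.api import VAR

        rng = np.random.default_rng(1)
        hist = pd.DataFrame({
            "service_expenditure": 50.0 + rng.normal(0.0, 1.0, 200),                # simulated
            "benchmark_rate": np.clip(0.05 + rng.normal(0, 0.002, 200), 0.01, None),
        })

        horizon = 120
        fit = VAR(hist).fit(maxlags=4)
        forecast = fit.forecast(hist.values[-fit.k_ar:], steps=horizon)  # shape (horizon, 2)

        # Discount each period's expected expenditure by the cumulated expected
        # benchmark rate and sum: a truncated present value of the service flow.
        exp_expenditure = forecast[:, 0]
        exp_rate = np.clip(forecast[:, 1], 1e-4, None)
        stock = np.sum(exp_expenditure / np.cumprod(1.0 + exp_rate))
        print("VAR-forecast economic stock (truncated):", stock)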

    Quantum retrodiction in open systems

    Quantum retrodiction involves finding the probabilities for various preparation events given a measurement event. This theory has been studied for some time, but mainly as an interesting concept associated with time asymmetry in quantum mechanics. Recent interest in quantum communications and cryptography, however, has provided retrodiction with a potential practical application. For this purpose quantum retrodiction in open systems should be more relevant than in closed systems isolated from the environment. In this paper we study retrodiction in open systems and develop a general master equation for the backward time evolution of the measured state, which can be used for calculating preparation probabilities. We solve the master equation, by way of example, for the driven two-level atom coupled to the electromagnetic field.

    Comment: 12 pages, no figure
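
    The simplest setting in which such preparation probabilities can be computed is a closed system with a known set of possible preparations; the toy sketch below illustrates that baseline with the Born rule and Bayes' rule for a single qubit. It is an assumed illustration only, not the open-system master-equation treatment developed in the paper.

        # Toy sketch: retrodictive preparation probabilities for a qubit,
        # P(preparation i | outcome j) proportional to p_i * Tr(rho_i Pi_j).
        import numpy as np

        # Equiprobable preparations: |0>, |1>, and |+>, as density matrices.
        ket0 = np.array([[1.0], [0.0]])
        ket1 = np.array([[0.0], [1.0]])
        ketp = (ket0 + ket1) / np.sqrt(2.0)
        preps = [k @ k.T for k in (ket0, ket1, ketp)]
        priors = np.array([1 / 3, 1 / 3, 1 / 3])

        # Projective measurement in the computational basis.
        povm = [ket0 @ ket0.T, ket1 @ ket1.T]

        def retrodict(outcome):
            """Posterior probability of each preparation given the outcome."""
            joint = np.array([p * np.trace(rho @ povm[outcome])
                              for p, rho in zip(priors, preps)])
            return joint / joint.sum()

        print("P(preparation | outcome 0):", retrodict(0))  # |1> is ruled out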