
    Unspeakable Suffering; Eloquent Explanations: National Civil War Medicine Museum’s 26th Annual Conference

    On Friday, October 12th, 2018, the National Civil War Medicine Museum kicked off its 26th annual conference, beginning the three-day event with a series of lectures on topics ranging from Confederate medical practice to cultural understandings of cowardice. Unique lectures given by a professionally diverse cast of presenters illuminated the often-peripheral field of Civil War medicine. [excerpt]

    Making Photographs Speak

    It has often been said that “a picture is worth a thousand words.” Making that picture spit out those mythical thousand words, as we can all attest, is no easy task. Over the course of the first half of the fall semester, the three of us were tasked with developing brief interpretive captions for two Civil War photographs each, with the end goal of displaying our work at the Civil War Institute’s 2019 Summer Conference. What initially appeared to be a simple project quickly revealed itself to be a difficult, yet rewarding, challenge that taught us all important lessons concerning history, photography, and writing that we will not soon forget. Producing the photography exhibit enhanced our skills as historical writers, introduced us to the challenge of writing for a popular audience, and deepened our understanding of Civil War photography. [excerpt]

    Characterizing the Brazilian Term Structure of Interest Rates

    This paper studies the Brazilian term structure of interest rates and characterizes how term premia have changed over time. We employ a Kalman filter approach, which is extended to take into account regime switches and overlapping forecast errors. Empirical evidence suggests that term premia depend on international global liquidity and on domestic factors such as the composition of public debt and inflation volatility. These results provide guidance for the formulation of fiscal and monetary policies.
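    The abstract's core tool, a Kalman filter tracking a latent component of observed yields, can be sketched minimally. This is an illustrative scalar local-level filter on simulated data, not the authors' regime-switching specification; all variable names and noise variances here are assumptions for the demo.

    ```python
    import numpy as np

    def kalman_filter(y, q, r, x0=0.0, p0=1.0):
        """Scalar local-level Kalman filter: x_t = x_{t-1} + w_t, y_t = x_t + v_t,
        with state-noise variance q and observation-noise variance r."""
        x, p = x0, p0
        estimates = []
        for obs in y:
            p = p + q                 # predict: state variance grows
            k = p / (p + r)           # Kalman gain
            x = x + k * (obs - x)     # update with the new observation
            p = (1 - k) * p
            estimates.append(x)
        return np.array(estimates)

    rng = np.random.default_rng(0)
    true_premium = np.cumsum(rng.normal(0, 0.05, 200))  # slowly drifting latent premium
    yields = true_premium + rng.normal(0, 0.5, 200)     # noisy observed yields
    filtered = kalman_filter(yields, q=0.05**2, r=0.5**2)
    ```

    The filtered series tracks the latent drift far more closely than the raw noisy yields, which is what makes the approach usable for extracting term premia from observed rates.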

    Measurement based entanglement under conditions of extreme photon loss

    The act of measuring optical emissions from two remote qubits can entangle them. By demanding that a photon from each qubit reaches the detectors, one can ensure that no photon was lost, but the failure rate then rises quadratically with the loss probability. In [1] this resulted in 30 successes per billion attempts. We describe a means to exploit the low-grade entanglement heralded by the detection of a lone photon: a subsequent perfect operation is quickly achieved by consuming this noisy resource. We require only two qubits per node and can tolerate both path-length variation and loss asymmetry. The impact of photon loss upon the failure rate is then linear; realistic high-loss devices can gain orders of magnitude in performance and thus support QIP.
    Comment: Contains an extension of the protocol that makes it robust against asymmetries in path length and photon loss.
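    The quadratic-versus-linear scaling the abstract describes can be checked with arithmetic. The sketch below (not the authors' protocol, just the stated scaling) compares heralding on two detected photons, success ~ eta^2, against heralding on a single photon, success ~ eta, where eta is an assumed per-photon survival probability; eta ~ 1.7e-4 reproduces the quoted ~30 successes per billion attempts.

    ```python
    def success_rates(eta):
        """Per-attempt success probability: (two-photon herald, single-photon herald)."""
        return eta ** 2, eta

    for eta in (1e-1, 1e-2, 1.7e-4):
        two, one = success_rates(eta)
        print(f"eta={eta:.1e}: two-photon {two:.2e}, single-photon {one:.2e}, gain x{one / two:.0f}")
    ```

    The gain factor is 1/eta, which is why high-loss devices stand to recover orders of magnitude in performance when the failure rate becomes linear in the loss.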

    Efficient growth of complex graph states via imperfect path erasure

    Given a suitably large and well connected (complex) graph state, any quantum algorithm can be implemented purely through local measurements on the individual qubits. Measurements can also be used to create the graph state: path erasure techniques allow one to entangle multiple qubits by determining only global properties of the qubits. Here, this powerful approach is extended by demonstrating that even imperfect path erasure can produce the required graph states with high efficiency. By characterizing the degree of error in each path erasure attempt, one can subsume the resulting imperfect entanglement into an extended graph state formalism. The subsequent growth of the improper graph state can be guided, through a series of strategic decisions, in such a way as to bound the growth of the error and eventually yield a high-fidelity graph state. As an implementation of these techniques, we develop an analytic model for atom (or atom-like) qubits in mismatched cavities, under the double-heralding entanglement procedure of Barrett and Kok [Phys. Rev. A 71, 060310 (2005)]. Compared to straightforward postselection techniques, our protocol offers a dramatic improvement in growing complex high-fidelity graph states.
    Comment: 15 pages, 10 figures (which print to better quality than when viewed as an on-screen PDF).
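    For readers unfamiliar with the graph state formalism the abstract relies on, a minimal sketch: a graph state is built by preparing every qubit in |+> and applying a CZ gate across each edge; the defining property is that each stabilizer K_a = X_a applied with Z on the neighbors of a leaves the state unchanged. The toy below verifies this for a 3-qubit linear graph by direct matrix arithmetic (it illustrates the formalism only, not the paper's path-erasure growth protocol).

    ```python
    import numpy as np

    I2 = np.eye(2)
    X = np.array([[0.0, 1.0], [1.0, 0.0]])
    Z = np.diag([1.0, -1.0])
    CZ = np.diag([1.0, 1.0, 1.0, -1.0])  # controlled-Z on two adjacent qubits

    def kron(*ops):
        """Tensor product of a sequence of operators (qubit 0 leftmost)."""
        out = np.array([[1.0]])
        for op in ops:
            out = np.kron(out, op)
        return out

    # Linear graph 0-1-2: |G> = CZ_(0,1) CZ_(1,2) |+++>
    plus = np.ones(2) / np.sqrt(2)
    state = np.kron(plus, np.kron(plus, plus))
    state = kron(CZ, I2) @ state   # edge (0,1)
    state = kron(I2, CZ) @ state   # edge (1,2)

    # Stabilizers: X on a vertex, Z on its neighbors
    stabilizers = [kron(X, Z, I2), kron(Z, X, Z), kron(I2, Z, X)]
    for K in stabilizers:
        print(np.allclose(K @ state, state))   # each prints True
    ```

    The extended formalism in the paper tracks how imperfect path erasure perturbs these stabilizer relations, so that the error can be bounded as the graph grows.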

    Evaluation of Adult Cottonwood Leaf Beetle, Chrysomela scripta (Coleoptera: Chrysomelidae), Feeding Preference for Hybrid Poplars

    Foliage from the Leuce section of Populus was rejected by Chrysomela scripta adults in a choice test involving 12 hybrid poplar clones. Adults showed a feeding preference for foliage from the Tacamahaca clones over the Aigeiros clones.

    Evaluation of Mobility Modes on Lunar Exploration Traverses - Marius Hills, Copernicus Peaks, and Hadley Apennines Missions

    Energy and time costs of lunar walking or riding traverses, scientific tasks on J-type missions, and capabilities of A7L suits and the life support system.

    Financial Stability and Monetary Policy - The case of Brazil

    This paper investigates the effects of monetary policy on banks' loan growth and non-performing loans in the recent period in Brazil. We contribute to the literature on the bank lending and risk-taking channels by showing that during periods of loosening/tightening monetary policy, banks increase/decrease their loans. Moreover, our results illustrate that large, well-capitalized, and liquid banks better absorb the effects of monetary policy shocks. We also find that low interest rates lead to an increase in credit risk exposure, supporting the existence of a risk-taking channel. Finally, we show that the impact of monetary policy differs across state-owned, foreign, and private domestic banks. These results are important for developing and conducting monetary policy.

    Pileup Mitigation with Machine Learning (PUMML)

    Pileup involves the contamination of the energy distribution arising from the primary collision of interest (leading vertex) by radiation from soft collisions (pileup). We develop a new technique for removing this contamination using machine learning and convolutional neural networks. The network takes as input the energy distributions of charged leading-vertex particles, charged pileup particles, and all neutral particles, and outputs the energy distribution of particles coming from the leading vertex alone. The PUMML algorithm performs remarkably well at eliminating pileup distortion on a wide range of simple and complex jet observables. We test the robustness of the algorithm in a number of ways and discuss how the network can be trained directly on data.
    Comment: 20 pages, 8 figures, 2 tables. Updated to JHEP version.
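    The input/output structure the abstract describes (three energy-map channels in, one map out) can be sketched with a single hand-rolled convolutional layer. This is a toy with random weights, not the trained PUMML network; the grid size, kernel size, and channel names are assumptions for illustration.

    ```python
    import numpy as np

    def conv2d(image, kernel):
        """'Valid' 2D cross-correlation of one channel with one kernel."""
        kh, kw = kernel.shape
        h, w = image.shape
        out = np.zeros((h - kh + 1, w - kw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
        return out

    # Three input channels on a calorimeter-like grid, as in the abstract:
    # charged leading-vertex energy, charged pileup energy, total neutral energy.
    rng = np.random.default_rng(1)
    charged_lv, charged_pu, neutral = rng.random((3, 8, 8))

    # One 3x3 kernel per input channel; a real network learns these weights.
    kernels = rng.normal(0, 0.1, (3, 3, 3))
    output = sum(conv2d(ch, k) for ch, k in zip((charged_lv, charged_pu, neutral), kernels))
    output = np.maximum(output, 0)   # ReLU: predicted neutral leading-vertex energy map

    print(output.shape)   # (6, 6)
    ```

    Stacking several such layers (with learned weights and padding to preserve the grid size) gives the kind of image-to-image regression network the paper uses.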