
    Rules, Discretion, and Central Bank Independence: The German Experience 1880 - 1989

    Theories of rules and discretion have become a cornerstone of macroeconomic policy formulation. They suggest that monetary policy rules are first best in terms of social welfare. However, if commitment is not feasible, delegating monetary policy to an independent and conservative central bank can be second best. Monetary policy in Germany during the past one hundred years provides an excellent case for assessing the empirical evidence on the use of rules and central bank independence in monetary policy making. Since the creation of a central monetary authority in 1876, Germany has participated in four monetary regimes: the pre-war gold standard, the inter-war gold standard, the Bretton Woods system, and the floating exchange rate regime. With the exception of the two world war periods, German monetary policy was geared primarily towards maintaining price stability and characterized by a high degree of formal and practical central bank independence.

    Performance of the EUDET-type beam telescopes

    Test beam measurements at the test beam facilities of DESY have been conducted to characterise the performance of the EUDET-type beam telescopes originally developed within the EUDET project. The beam telescopes are equipped with six sensor planes using MIMOSA26 monolithic active pixel devices. A programmable Trigger Logic Unit provides trigger logic and time stamp information on particle passage. Both data acquisition framework and offline reconstruction software packages are available. User devices are easily integrable into the data acquisition framework via predefined interfaces. The biased residual distribution is studied as a function of the beam energy, plane spacing and sensor threshold. Its standard deviation at the two centre pixel planes, using all six planes for tracking in a 6 GeV electron/positron beam, is measured to be (2.88 ± 0.08) ÎŒm. Iterative track fits using the formalism of General Broken Lines are performed to estimate the intrinsic resolution of the individual pixel planes. The mean intrinsic resolution over the six sensors used is found to be (3.24 ± 0.09) ÎŒm. With a 5 GeV electron/positron beam, the track resolution halfway between the two inner pixel planes, using an equidistant plane spacing of 20 mm, is estimated to be (1.83 ± 0.03) ÎŒm assuming the measured intrinsic resolution. Towards lower beam energies the track resolution deteriorates due to increasing multiple scattering. Threshold studies show an optimal working point of the MIMOSA26 sensors at a sensor threshold of between five and six times their RMS noise. Measurements at different plane spacings are used to calibrate the amount of multiple scattering in the material traversed and allow for corrections to the predicted angular scattering for electron beams.
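The scattering-free part of a telescope's track resolution can be estimated by propagating the hit resolution through a straight-line fit. The sketch below (illustrative only: it ignores the multiple scattering that the General Broken Lines fit accounts for, so it necessarily differs from the measured 1.83 ÎŒm) evaluates the pointing uncertainty of an unweighted least-squares line at any point along the telescope:

```python
import numpy as np

def track_resolution(z_planes, sigma_hit, z_query):
    """Pointing resolution of an unweighted straight-line fit y = a + b*z
    through planes at z_planes (common hit resolution sigma_hit), evaluated
    at z_query. Multiple scattering is neglected in this sketch."""
    z = np.asarray(z_planes, dtype=float)
    n = len(z)
    z_mean = z.mean()
    s_zz = np.sum((z - z_mean) ** 2)
    # Standard extrapolation variance of a least-squares line fit.
    var = sigma_hit ** 2 * (1.0 / n + (z_query - z_mean) ** 2 / s_zz)
    return float(np.sqrt(var))

# Six planes with 20 mm spacing; hit resolution 3.24 um (value quoted above).
z = [0.0, 20.0, 40.0, 60.0, 80.0, 100.0]  # mm
print(track_resolution(z, 3.24, 50.0))    # resolution at the telescope centre, in um
```

At the telescope centre the lever-arm term vanishes and the result reduces to sigma_hit/√6; the resolution degrades quadratically towards the ends of the telescope.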

    Rules, Discretion, and Central Bank Independence: The German Experience 1880-1989

    Theories of rules and discretion suggest that monetary policy rules are first best in terms of social welfare. However, if commitment is not feasible, delegating monetary policy to an independent and conservative central bank can be second best. Monetary policy in Germany during the past one hundred years provides an excellent case for assessing the empirical evidence on the use of rules and central bank independence in monetary policy making. Since the creation of a central monetary authority in 1876, Germany has participated in four monetary regimes: the pre-war gold standard, the inter-war gold standard, the Bretton Woods system, and the floating exchange rate regime. The bottom line of our analysis is that monetary policy in Germany was always geared toward maintaining price stability, with the exception of the two world war periods. Germany relied both on rules and on discretion with central bank independence to achieve the goal of price stability. A comparison of the Classical Gold Standard regime with the floating exchange rate regime suggests that society under the floating exchange rate regime with central bank independence was better off. However, this comparison ignores the historical difference in output shocks and the possibility that society became more inflation averse over time.

    Symmetry-Directed Self-Assembly of a Tetrahedral Protein Cage Mediated by de Novo-Designed Coiled Coils

    The organization of proteins into new hierarchical forms is an important challenge in synthetic biology. However, engineering new interactions between protein subunits is technically challenging and typically requires extensive redesign of protein-protein interfaces. We have developed a conceptually simple approach, based on symmetry principles, that uses short coiled-coil domains to assemble proteins into higher-order structures. Here, we demonstrate the assembly of a trimeric enzyme into a well-defined tetrahedral cage. This was achieved by genetically fusing a trimeric coiled-coil domain to its C terminus through a flexible polyglycine linker sequence. The linker length and coiled-coil strength were the only parameters that needed to be optimized to obtain a high yield of correctly assembled protein cages. Geometry lesson: A modular approach for assembling proteins into large-scale geometric structures was developed in which coiled-coil domains acted as "twist ties" to facilitate assembly. The geometry of the cage was specified primarily by the rotational symmetries of the coiled coil and building block protein and was largely independent of protein structural details.

    Equal Opportunity through Education – Equal Opportunity in Education (Excerpt)

    This text, reprinted here with the kind permission of the AWO Bundesverband, is an excerpt from the brochure: Arbeiterwohlfahrt Bundesverband (ed.): Standpunkte 2006. Chancengerechtigkeit durch Bildung – Chancengerechtigkeit in der Bildung, Bonn 2006. Our education system for children aged 6 to 16 does not meet the challenges of the future. A change of course is urgently needed, since without education the transition to the knowledge society cannot be managed. Education, qualifications, and competencies, together with learning the capacity for discourse and conflict resolution, determine every person's professional and social opportunities, and with them their prospects for the future. Education means developing the personality, the identity. But education also means shaping a personality capable of living in community. Thus, precisely in the life phase of 6- to 16-year-olds, education takes on an emancipatory character beyond its more traditional dimension. If education plays such a decisive role for the individual, then public responsibility for the education system becomes centrally important. (DIPF/Orig.)

    Virtual reality surgery simulation: A survey on patient specific solution

    For surgeons, precise anatomical structure and its dynamics are important in surgical interaction, which is critical for generating an immersive experience in VR-based surgical training applications. At present, a standard therapeutic scheme cannot always be applied directly to a specific patient, because diagnostic guidelines are based on population averages and therefore yield only an approximate solution. Patient Specific Modeling (PSM), using patient-specific medical image data (e.g. CT, MRI, or ultrasound), can deliver a computational anatomical model. It offers surgeons the potential to practice the operative procedure for a particular patient, which improves the accuracy of diagnosis and treatment, enhances the predictive ability of the VR simulation framework, and raises the quality of patient care. This paper presents a general review of the existing literature on patient-specific surgical simulation, covering data acquisition, medical image segmentation, computational mesh generation, and real-time soft tissue simulation.
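Real-time soft tissue behaviour of the kind surveyed here is commonly approximated with mass-spring systems integrated explicitly. The following minimal sketch (all parameters are illustrative and not taken from any particular surveyed system) advances one damped spring between a fixed node and a free node with a semi-implicit Euler step, the usual choice for stability at interactive rates:

```python
import numpy as np

def simulate(steps=5000, dt=0.001, k=50.0, c=2.0, m=0.01, rest=1.0):
    """Semi-implicit (symplectic) Euler integration of a single damped
    spring: node 0 is fixed (e.g. attached anatomy), node 1 is free.
    Returns the final spring length, which should settle near `rest`."""
    x = np.array([[0.0, 0.0], [1.5, 0.0]])  # spring starts stretched
    v = np.zeros_like(x)
    for _ in range(steps):
        d = x[1] - x[0]
        length = np.linalg.norm(d)
        direction = d / length
        # Hooke force plus damping projected onto the spring axis.
        f = -k * (length - rest) * direction - c * np.dot(v[1], direction) * direction
        v[1] += dt * f / m   # update velocity first (semi-implicit)
        x[1] += dt * v[1]    # then position, using the new velocity
    return float(np.linalg.norm(x[1] - x[0]))

print(simulate())  # settles near the rest length
```

Updating velocity before position (rather than plain explicit Euler) keeps the oscillation from gaining energy, which is why mass-spring real-time simulators typically prefer it.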

    Challenges in QCD matter physics - The Compressed Baryonic Matter experiment at FAIR

    Substantial experimental and theoretical efforts worldwide are devoted to exploring the phase diagram of strongly interacting matter. At LHC and top RHIC energies, QCD matter is studied at very high temperatures and nearly vanishing net-baryon densities. There is evidence that a Quark-Gluon Plasma (QGP) was created in experiments at RHIC and LHC. The transition from the QGP back to the hadron gas is found to be a smooth crossover. For larger net-baryon densities and lower temperatures, it is expected that the QCD phase diagram exhibits a rich structure, such as a first-order phase transition between hadronic and partonic matter which terminates in a critical point, or exotic phases like quarkyonic matter. The discovery of these landmarks would be a breakthrough in our understanding of the strong interaction and is therefore a focus of various high-energy heavy-ion research programs. The Compressed Baryonic Matter (CBM) experiment at FAIR will play a unique role in the exploration of the QCD phase diagram in the region of high net-baryon densities, because it is designed to run at unprecedented interaction rates. High-rate operation is the key prerequisite for high-precision measurements of multi-differential observables and of rare diagnostic probes which are sensitive to the dense phase of the nuclear fireball. The goal of the CBM experiment at SIS100 (√s_NN = 2.7–4.9 GeV) is to discover fundamental properties of QCD matter: the phase structure at large baryon-chemical potentials (ÎŒ_B > 500 MeV), effects of chiral symmetry, and the equation of state at high density as it is expected to occur in the core of neutron stars. In this article, we review the motivation for and the physics programme of CBM, including activities before the start of data taking in 2022, in the context of the worldwide efforts to explore high-density QCD matter. Comment: 15 pages, 11 figures. Published in European Physical Journal

    Multiplicity dependence of jet-like two-particle correlations in p-Pb collisions at √s_NN = 5.02 TeV

    Two-particle angular correlations between unidentified charged trigger and associated particles are measured by the ALICE detector in p-Pb collisions at a nucleon-nucleon centre-of-mass energy of 5.02 TeV. The transverse-momentum range 0.7 < p_T,assoc < p_T,trig < 5.0 GeV/c is examined, to include correlations induced by jets originating from low momentum-transfer scatterings (minijets). The correlations expressed as associated yield per trigger particle are obtained in the pseudorapidity range |η| < 0.9. The near-side long-range pseudorapidity correlations observed in high-multiplicity p-Pb collisions are subtracted from both near-side short-range and away-side correlations in order to remove the non-jet-like components. The yields in the jet-like peaks are found to be invariant with event multiplicity with the exception of events with low multiplicity. This invariance is consistent with the particles being produced via the incoherent fragmentation of multiple parton-parton scatterings, while the yield related to the previously observed ridge structures is not jet-related. The number of uncorrelated sources of particle production is found to increase linearly with multiplicity, suggesting no saturation of the number of multi-parton interactions even in the highest multiplicity p-Pb collisions. Further, the number scales in the intermediate multiplicity region with the number of binary nucleon-nucleon collisions estimated with a Glauber Monte Carlo simulation. Comment: 23 pages, 6 captioned figures, 1 table, authors from page 17, published version, figures at http://aliceinfo.cern.ch/ArtSubmission/node/161

    Multi-particle azimuthal correlations in p-Pb and Pb-Pb collisions at the CERN Large Hadron Collider

    Measurements of multi-particle azimuthal correlations (cumulants) for charged particles in p-Pb and Pb-Pb collisions are presented. They help address the question of whether there is evidence for global, flow-like, azimuthal correlations in the p-Pb system. Comparisons are made to measurements from the larger Pb-Pb system, where such evidence is established. In particular, the second harmonic two-particle cumulants are found to decrease with multiplicity, characteristic of a dominance of few-particle correlations in p-Pb collisions. However, when a |Δη| gap is placed to suppress such correlations, the two-particle cumulants begin to rise at high multiplicity, indicating the presence of global azimuthal correlations. The Pb-Pb values are higher than the p-Pb values at similar multiplicities. In both systems, the second harmonic four-particle cumulants exhibit a transition from positive to negative values when the multiplicity increases. The negative values allow for a measurement of v_2{4} to be made, which is found to be higher in Pb-Pb collisions at similar multiplicities. The second harmonic six-particle cumulants are also found to be higher in Pb-Pb collisions. In Pb-Pb collisions, we generally find v_2{4} ≃ v_2{6} ≠ 0, which is indicative of a Bessel-Gaussian function for the v_2 distribution. For very high-multiplicity Pb-Pb collisions, we observe that the four- and six-particle cumulants become consistent with 0. Finally, third harmonic two-particle cumulants in p-Pb and Pb-Pb are measured. These are found to be similar for overlapping multiplicities, when a |Δη| > 1.4 gap is placed. Comment: 25 pages, 11 captioned figures, 3 tables, authors from page 20, published version, figures at http://aliceinfo.cern.ch/ArtSubmission/node/87

    EUDAQ - A data acquisition software framework for common beam telescopes

    EUDAQ is a generic data acquisition software developed for use in conjunction with common beam telescopes at charged particle beam lines. Providing high-precision reference tracks for performance studies of new sensors, beam telescopes are essential for the research and development towards future detectors for high-energy physics. As beam time is a highly limited resource, EUDAQ has been designed with reliability and ease of use in mind. It enables flexible integration of different independent devices under test via their specific data acquisition systems into a top-level framework. EUDAQ controls all components globally, handles the data flow centrally and synchronises and records the data streams. Over the past decade, EUDAQ has been deployed as part of a wide range of successful test beam campaigns and detector development applications.
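The central synchronisation task described above, merging independent device streams into complete events, can be illustrated with a toy event builder keyed on a shared trigger number (a hypothetical sketch, not the actual EUDAQ API; device names and payloads are invented for illustration):

```python
from collections import defaultdict

def build_events(streams):
    """streams: dict device_name -> list of (trigger_id, payload).
    Returns complete events {trigger_id: {device: payload}}, keeping only
    triggers for which every device delivered a fragment."""
    events = defaultdict(dict)
    for device, fragments in streams.items():
        for trigger_id, payload in fragments:
            events[trigger_id][device] = payload
    n_devices = len(streams)
    # Incomplete events (a device missed the trigger) are dropped here;
    # a real framework would flag or buffer them instead.
    return {t: frag for t, frag in events.items() if len(frag) == n_devices}

streams = {
    "telescope": [(1, "t1"), (2, "t2"), (3, "t3")],
    "dut":       [(1, "d1"), (3, "d3")],  # trigger 2 lost by the device under test
}
print(sorted(build_events(streams)))  # only triggers 1 and 3 are complete
```

Keying on a common trigger (or timestamp) identifier is what allows otherwise independent data acquisition systems to be combined into one coherent event stream.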
    • 
