
    Neutron and proton tests of different technologies for the upgrade of the cold readout electronics of the ATLAS Hadronic End-cap Calorimeter

    The expected increase of the total integrated luminosity by a factor of ten at the HL-LHC, compared to the LHC design goals, essentially eliminates the radiation-hardness safety factor of the current cold amplifiers of the ATLAS Hadronic End-cap Calorimeter (HEC). New, more radiation-hard technologies have been studied: SiGe bipolar, Si CMOS FET, and GaAs FET transistors were irradiated with neutrons up to an integrated fluence of 2.2 x 10^{16} n/cm^2 and with 200 MeV protons up to an integrated fluence of 2.6 x 10^{14} p/cm^2. Comparisons of transistor parameters, such as the gain, for both types of irradiation are presented. Comment: 6 pages, CALOR2012 Conference Proceedings

    Irradiation Tests and Expected Performance of Readout Electronics of the ATLAS Hadronic Endcap Calorimeter for the HL-LHC

    The readout electronics of the ATLAS Hadronic Endcap Calorimeter (HEC) will have to withstand a radiation environment at the future High-Luminosity LHC (HL-LHC) that is about 3-5 times harsher than their design values. The preamplifier and summing boards (PSBs), which are equipped with GaAs ASICs and form the heart of the readout electronics, were irradiated with neutrons and protons at fluences several times those expected for ten years of HL-LHC operation. Neutron tests were performed at the NPI in Rez, Czech Republic, where a 36 MeV proton beam was directed onto a thick heavy-water target to produce neutrons. The proton irradiation was done with 200 MeV protons at the PROSCAN area of the Proton Irradiation Facility at PSI in Villigen, Switzerland. In-situ measurements of S-parameters in both tests allow the evaluation of frequency-dependent performance parameters, such as gain and input impedance, as a function of fluence. The linearity of the ASIC response was measured directly in the neutron tests with a triangular input pulse of varying amplitude. The results allow an estimate of the expected performance degradation of the HEC. For a possible replacement of the PSB chips, alternative technologies were investigated and exposed to similar neutron radiation levels. In particular, the IHP 250 nm Si CMOS technology has shown good performance and meets the required specifications. The performance measurements of the current PSB devices, the expected performance degradation under HL-LHC conditions, and results from alternative technologies are presented. Comment: 5 pages, 4 figures, CHEF2013 Conference Proceedings. arXiv admin note: text overlap with arXiv:1301.375
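    The S-parameter-based gain evaluation described in the abstract can be illustrated with a minimal sketch; the dB readings and function names below are hypothetical, not measured HEC data:

```python
# Illustrative sketch (not the actual HEC analysis code): convert a measured
# forward-transmission coefficient S21 from dB to linear gain and track the
# relative gain of an irradiated device against a reference measurement.

def s21_db_to_linear(s21_db):
    return 10.0 ** (s21_db / 20.0)   # amplitude (voltage) gain

def relative_gain(s21_db_irradiated, s21_db_reference):
    return s21_db_to_linear(s21_db_irradiated) / s21_db_to_linear(s21_db_reference)

# Hypothetical readings: a 0.5 dB gain loss after irradiation
print(round(relative_gain(19.5, 20.0), 3))  # -> 0.944
```

    Dividing the linear gains of the irradiated and reference measurements gives the relative degradation at a given fluence, one point on a gain-versus-fluence curve.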

    The dynamics of iterated transportation simulations

    Iterating between a router and a traffic micro-simulation is an increasingly accepted method for traffic assignment. This paper, after pointing out that the analytical theory of simulation-based assignment to date is insufficient for some practical cases, presents simulation results from a real-world study. Specifically, we look into the issues of uniqueness, variability, robustness, and validation. Regarding uniqueness, despite some cautionary notes from a theoretical point of view, we find no indication of ``meta-stable'' states in the iterations. Variability, however, is considerable. By variability we mean the variation in the simulation of a given plan set when only the random seed is changed. We then show results from three different micro-simulations under the same iteration scenario in order to test the robustness of the results across implementations. We find the results encouraging, both in comparison with reality and with a traditional assignment result. Keywords: dynamic traffic assignment (DTA); traffic micro-simulation; TRANSIMS; large-scale simulations; urban planning. Comment: 24 pages, 7 figures
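    The router/micro-simulation feedback loop can be sketched as a toy two-route example. The congestion function, capacities, and re-planning fraction here are assumptions for illustration, not TRANSIMS internals:

```python
import random

# Toy sketch of iterated assignment: in each iteration the "router" picks the
# currently fastest route, and a fraction of agents adopts that plan.

def travel_time(load, free_time, capacity):
    # simple congestion function: travel time grows with the square of relative load
    return free_time * (1.0 + (load / capacity) ** 2)

def iterate(n_agents=1000, n_iter=50, replan_frac=0.1, seed=42):
    rng = random.Random(seed)
    plans = [0] * n_agents  # every agent starts on route 0
    for _ in range(n_iter):
        loads = [plans.count(0), plans.count(1)]
        times = [travel_time(loads[0], 10.0, 600.0),
                 travel_time(loads[1], 12.0, 600.0)]
        best = times.index(min(times))
        for i in range(n_agents):
            if rng.random() < replan_frac:
                plans[i] = best   # a fraction of agents re-plans each iteration
    return plans.count(0), plans.count(1)

print(iterate())
```

    Re-running with a different seed illustrates the variability discussed above: the final route split fluctuates around the equilibrium rather than settling on one exact value.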

    Price Floors and Competition

    A potential source of instability in many economic models is that agents have little incentive to stick with the equilibrium. We show experimentally that this may matter in price competition. The control variable is a price floor, which increases the cost of deviating from equilibrium. Theoretically, the floor allows competitors to obtain higher profits, as low prices are excluded. Behaviorally, however, the opposite is observed: with a floor, competitors receive lower joint profits. An error model (logit equilibrium) captures some, but not all, of the important features of the data. We provide statistical support for a complementary explanation, which refers to how "threatening" an equilibrium is. We discuss the economic import of these findings for matters like resale price maintenance and auction design. Keywords: price competition; price floors; Bertrand model; experiment; salience; logit equilibrium; threats
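    A logit (quantal-response) equilibrium of the kind used as the error model can be sketched for a discretized Bertrand game with a price floor. The price grid, noise parameter lam, and damping scheme are illustrative assumptions, not the paper's parametrization:

```python
import math

# Sketch of a symmetric logit equilibrium for a discretized Bertrand duopoly
# with unit demand: the lower price wins the market, ties split it, and each
# player's mixed strategy is a logit best response to the opponent's mix.

def expected_profit(p, probs, prices):
    profit = 0.0
    for q, pr in zip(prices, probs):
        if p < q:
            profit += pr * p            # undercut the opponent: win the market
        elif p == q:
            profit += pr * p / 2.0      # tie: split the market
    return profit

def logit_equilibrium(prices, lam=1.0, n_iter=1000, damp=0.5):
    probs = [1.0 / len(prices)] * len(prices)
    for _ in range(n_iter):
        utils = [expected_profit(p, probs, prices) for p in prices]
        weights = [math.exp(lam * u) for u in utils]
        total = sum(weights)
        new = [w / total for w in weights]
        # damped fixed-point update to avoid oscillation
        probs = [damp * a + (1.0 - damp) * b for a, b in zip(probs, new)]
    return probs

floor = 3
probs = logit_equilibrium(list(range(floor, 11)))  # prices below the floor are excluded
```

    Raising the floor shrinks the price grid from below, which is how the treatment variable enters this kind of model.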

    Market efficiency in the age of big data

    Modern investors face a high-dimensional prediction problem: thousands of observable variables are potentially relevant for forecasting. We reassess the conventional wisdom on market efficiency in light of this fact. In our equilibrium model, N assets have cash flows that are linear in J characteristics, with unknown coefficients. Risk-neutral Bayesian investors learn these coefficients and determine market prices. If J and N are comparable in size, returns are cross-sectionally predictable ex post. In-sample tests of market efficiency reject the no-predictability null with high probability, even though investors use information optimally in real time. In contrast, out-of-sample tests retain their economic meaning.
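    The core mechanism, that in-sample fit is mechanically high when the number of characteristics J is comparable to the number of assets N, can be illustrated with a small simulation (a sketch, not the paper's model; returns here are pure noise by construction):

```python
import numpy as np

rng = np.random.default_rng(0)
N, J = 100, 80                      # assets and characteristics of similar size

X = rng.standard_normal((N, J))     # observable characteristics
r = rng.standard_normal(N)          # returns: pure noise, no true predictability

# in-sample cross-sectional regression of returns on characteristics
beta, *_ = np.linalg.lstsq(X, r, rcond=None)
r_hat = X @ beta
r2 = 1.0 - np.sum((r - r_hat) ** 2) / np.sum((r - r.mean()) ** 2)
print(round(float(r2), 2))          # spuriously high in-sample fit, roughly J/N
```

    An out-of-sample test on fresh draws of X and r would show no such fit, which is the contrast the abstract draws.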

    Reverse-engineering of the rule-of-half in order to retrofit an assessment procedure based on resource consumption

    The German evaluation procedure for the Federal Transport Infrastructure Plan (‘Bundesverkehrswegeplan’) is a large-scale and comprehensive modeling, simulation, and evaluation effort. An important component of the evaluation procedure is a cost-benefit analysis based on the concept of resource consumption. This concept means that new transport infrastructure causes changes in the consumption of time, money, safety, environment, etc. In this paper, we show that — assuming elastic demand for the facility under consideration — the current approach is not in line with basic consumer theory. This stems from inconsistencies between the behavioral model and the evaluation method: ignoring unobserved attributes of the different transport modes in the evaluation can lead to quite different economic gains than when these attributes are considered. Current practice in other EU countries typically avoids this problem by applying the so-called rule-of-half, or by directly deriving the logsum term from the underlying logit model. However, a change in the German assessment procedure towards one of these best-practice approaches for the upcoming Federal Transport Infrastructure Plan in 2015 seems politically not feasible. We therefore propose an easily applicable procedure to include the logic of the rule-of-half in the existing evaluation approach. We show that the resulting calculation yields the same result as the rule-of-half while maintaining the rest of the former evaluation method. Finally, we discuss how another German assessment scheme for urban public transit projects, which is currently under revision, fits into the proposed procedure. Funding: BMVBS, 960974/2011, Review and further development of the assessment methodology with a focus on the benefits components of the benefit-cost-analysis of the German national transport assessment exercise; DFG, 92485222, Detaillierte Evaluation verkehrlicher Maßnahmen mit Hilfe von Mikrosimulatio
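    The rule-of-half itself is a simple trapezoidal approximation of the consumer-surplus change under elastic demand; a minimal sketch, with hypothetical numbers:

```python
def rule_of_half(q0, q1, c0, c1):
    """Consumer-surplus change for one mode or OD pair under the rule-of-half.

    q0, q1: demand before / after the infrastructure change
    c0, c1: generalized cost before / after
    """
    return 0.5 * (q0 + q1) * (c0 - c1)

# Hypothetical example: demand rises from 1000 to 1200 trips per day
# as the generalized cost falls from 10 to 8 cost units.
print(rule_of_half(1000, 1200, 10, 8))  # -> 2200.0
```

    The half weight on the induced demand (q1 - q0) is what a fixed-demand resource-consumption calculation misses, which is the inconsistency the paper addresses.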

    Spatial competition and price formation

    We look at price formation in a retail setting: companies set prices, and consumers either accept those prices or go somewhere else. In contrast to most other models in this context, we use a two-dimensional spatial structure for information transmission, that is, consumers can only learn from their nearest neighbors. Many aspects of this can be understood in terms of generalized evolutionary dynamics. We therefore first look at spatial competition and cluster formation without price. This leads to establishment size distributions, which we compare to reality. After some theoretical considerations, which at least heuristically explain our simulation results, we finally return to price formation, where we demonstrate that our simple model, with nearly no organized planning or rationality on the part of any of the agents, indeed leads to an economically plausible price. Comment: Minor change
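    The nearest-neighbor information transmission can be sketched as a diffusion process on a grid (a toy illustration, not the paper's full price-formation model):

```python
import random

# Toy sketch: consumers sit on a 2D torus grid and update their reference
# price to the lowest price known in their four-neighbor neighborhood, so
# price information spreads only through local contact.

def diffuse(ref, steps):
    n = len(ref)
    for _ in range(steps):
        new = [row[:] for row in ref]
        for i in range(n):
            for j in range(n):
                new[i][j] = min(ref[i][j],
                                ref[(i - 1) % n][j], ref[(i + 1) % n][j],
                                ref[i][(j - 1) % n], ref[i][(j + 1) % n])
        ref = new
    return ref

n = 20
rng = random.Random(1)
grid = [[rng.uniform(5.0, 10.0) for _ in range(n)] for _ in range(n)]
result = diffuse(grid, 25)  # 25 steps exceed the torus diameter of 20
```

    After enough steps the lowest price has spread to every consumer; with fewer steps only a local patch knows it, which is what makes spatially heterogeneous prices possible in such models.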