Operational experience, improvements, and performance of the CDF Run II silicon vertex detector
The Collider Detector at Fermilab (CDF) pursues a broad physics program at
Fermilab's Tevatron collider. Between Run II commissioning in early 2001 and
the end of operations in September 2011, the Tevatron delivered 12 fb-1 of
integrated luminosity of p-pbar collisions at sqrt(s)=1.96 TeV. Many physics
analyses undertaken by CDF require heavy flavor tagging with large charged
particle tracking acceptance. To realize these goals, in 2001 CDF installed
eight layers of silicon microstrip detectors around its interaction region.
These detectors were designed for 2--5 years of operation and radiation doses up
to 2 Mrad (20 kGy), and were expected to be replaced in 2004. The sensors were
not replaced, and the Tevatron run was extended for several years beyond its
design, exposing the sensors and electronics to much higher radiation doses
than anticipated. In this paper we describe the operational challenges
encountered over the past 10 years of running the CDF silicon detectors, the
preventive measures undertaken, and the improvements made along the way to
ensure their optimal performance for collecting high quality physics data. In
addition, we describe the quantities and methods used to monitor radiation
damage in the sensors for optimal performance and summarize the detector
performance quantities important to CDF's physics program, including vertex
resolution, heavy flavor tagging, and silicon vertex trigger performance.

Comment: Preprint accepted for publication in Nuclear Instruments and Methods A (07/31/2013)
Dynamic Partial Reconfiguration for Dependable Systems
Moore’s law has served as goal and motivation for consumer electronics manufacturers in recent decades. The increase in processing power of consumer electronics devices has been achieved mainly through cost reduction and technology shrinking. However, shrinking physical geometries affects the dependability of electronic devices, making them more sensitive to soft errors such as Single Event Transients (SET) or Single Event Upsets (SEU), and to hard (permanent) faults, e.g. due to aging effects.
Accordingly, safety-critical systems often rely on old technology nodes, even though these entail longer design times w.r.t. consumer electronics. In fact, functional safety requirements are increasingly pushing industry to develop innovative methodologies for designing highly dependable systems with the required diagnostic coverage. On the other hand, the adoption of commercial off-the-shelf (COTS) devices has begun to be considered for safety-related systems due to real-time requirements, the need to implement computationally hungry algorithms, and lower design costs. In this field the FPGA market share has constantly increased, thanks to FPGAs' flexibility and low non-recurring engineering costs, which make them suitable for a set of safety-critical applications with low production volumes.
The work presented in this thesis tries to face new dependability issues in modern reconfigurable systems, exploiting one of their special features, namely Dynamic Partial Reconfiguration, to take proper countermeasures with low impact on performance.
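The abstract above does not spell out a concrete countermeasure, but a standard soft-error mitigation in reconfigurable systems, often combined with partial-reconfiguration scrubbing, is triple modular redundancy (TMR). A minimal sketch of the majority voter (illustrative only, not the thesis's implementation):

```python
def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority vote over three redundant copies of a word.

    If an SEU flips bits in any single copy, the other two copies
    outvote it, so the voted output is unaffected.
    """
    return (a & b) | (a & c) | (b & c)

# One copy corrupted by an upset; the majority vote masks it.
voted = tmr_vote(0b1010, 0b1010, 0b0110)  # -> 0b1010
```

In an FPGA, the voter output also identifies which copy disagrees, which can trigger a partial reconfiguration of only that faulty region.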
Thermal Management for Dependable On-Chip Systems
This thesis addresses dependability issues in on-chip systems from a thermal perspective. It includes an explanation and analysis of models that show the relationship between dependability and temperature. Additionally, multiple novel methods for on-chip thermal management are introduced, aiming to optimize thermal properties. The methods are evaluated through simulation and through infrared thermal camera measurements.
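The temperature-dependability relationship mentioned above is commonly modeled with an Arrhenius acceleration factor for wear-out mechanisms. A sketch under assumed values (the 0.7 eV activation energy is illustrative and mechanism-dependent, not taken from the thesis):

```python
import math

def arrhenius_acceleration(t_use_c: float, t_stress_c: float,
                           ea_ev: float = 0.7) -> float:
    """Arrhenius acceleration factor between a use and a stress temperature.

    ea_ev: activation energy in eV; 0.7 eV is an assumed, illustrative
    value -- real values depend on the failure mechanism.
    """
    k_b = 8.617e-5  # Boltzmann constant in eV/K
    t_use = t_use_c + 273.15      # convert degC to K
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / k_b) * (1.0 / t_use - 1.0 / t_stress))

# Lowering a hotspot from 85 degC to 60 degC slows aging severalfold.
af = arrhenius_acceleration(60.0, 85.0)  # roughly 5.5x
```

This exponential sensitivity is why on-chip thermal management pays off disproportionately in lifetime.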
NEGATIVE BIAS TEMPERATURE INSTABILITY STUDIES FOR ANALOG SOC CIRCUITS
Negative Bias Temperature Instability (NBTI) is one of the recent reliability issues in
sub-micrometer CMOS circuits. In analog circuits, which require matched device
pairs, NBTI-induced mismatch can cause circuit failure. This work assesses the
NBTI effect considering the voltage and the temperature variations. It also provides a
working knowledge of NBTI awareness to the circuit design community for reliable
design of SOC analog circuits. There have been numerous studies to date on the
NBTI effect in analog circuits. However, prior work has not studied the
implications of NBTI stress on analog circuits utilizing a bandgap reference circuit. The
reliability performance of all matched pair circuits, particularly the bandgap reference,
is at the mercy of aging differential. Reliability simulation is mandatory to obtain
realistic risk evaluation for circuit design reliability qualification. It is applicable to all
circuit aging problems covering both analog and digital. Failure rate varies as a
function of voltage and temperature. It is shown that the PMOS device is the most
reliability-susceptible device and that NBTI is the most important failure mechanism
for analog circuits in sub-micrometer CMOS technology. This study provides a
complete reliability simulation analysis of the on-die Thermal Sensor and
Digital-to-Analog Converter (DAC) circuits and analyzes the effect of NBTI
using a reliability simulation tool. In
order to verify the robustness of the SOC circuit design against NBTI, a burn-in
experiment was conducted on the DAC circuits. The NBTI degradation observed in
the reliability simulation analysis indicated that, under a severe stress condition,
a threshold voltage mismatch beyond the 2 mV limit is recorded. The burn-in
experimental results on the DAC confirm the reliability sensitivity of the DAC
circuitry to NBTI.
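The abstract does not state the degradation law used in the simulations; a commonly assumed empirical form for the NBTI threshold-voltage shift is a power law in stress time, dVth = A * t^n, with n near 1/6 from reaction-diffusion models. A sketch with purely illustrative coefficients (not values from this work):

```python
def nbti_delta_vth(t_seconds: float, a: float = 1.0e-4,
                   n: float = 1.0 / 6.0) -> float:
    """Empirical NBTI threshold-voltage shift in volts: dVth = A * t^n.

    a lumps the voltage and temperature dependence and n ~ 1/6 follows
    reaction-diffusion models; both values here are assumed for
    illustration only.
    """
    return a * t_seconds ** n

# Shift after ten years of stress; compare against a 2 mV mismatch
# budget like the one cited above (here the budget is exceeded).
shift = nbti_delta_vth(10 * 365 * 24 * 3600)  # a few millivolts
```

The sub-linear exponent explains why short burn-in stress at elevated voltage and temperature is used to reveal degradation that would otherwise take years.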
JUNO Conceptual Design Report
The Jiangmen Underground Neutrino Observatory (JUNO) is proposed to determine
the neutrino mass hierarchy using an underground liquid scintillator detector.
It is located 53 km away from both Yangjiang and Taishan Nuclear Power Plants
in Guangdong, China. The experimental hall, spanning more than 50 meters, is
under a granite mountain of over 700 m overburden. Within six years of running,
the detection of reactor antineutrinos can resolve the neutrino mass hierarchy
at a significance of 3--4 sigma, and determine the neutrino oscillation
parameters sin^2θ_12, Δm^2_21, and |Δm^2_ee| to
an accuracy of better than 1%. The JUNO detector can also be used to study
terrestrial and extra-terrestrial neutrinos and new physics beyond the Standard
Model. The central detector contains 20,000 tons of liquid scintillator in an
acrylic sphere of 35 m in diameter. 17,000 PMTs of 508 mm diameter with high
quantum efficiency provide 75% optical coverage. The current choice of
the liquid scintillator is linear alkyl benzene (LAB) as the solvent, with PPO
as the scintillation fluor and bis-MSB as the wavelength shifter. The number of
detected photoelectrons per MeV is larger than 1,100 and the energy resolution
is expected to be 3% at 1 MeV. The calibration system is designed to deploy
multiple sources to cover the entire energy range of reactor antineutrinos, and
to achieve a full-volume position coverage inside the detector. The veto system
is used for muon detection and for the study and reduction of muon-induced
backgrounds. It consists of a water Cherenkov detector and a Top Tracker
system. The readout system, the detector control system, and the offline system
ensure efficient and stable data acquisition and processing.

Comment: 328 pages, 211 figures
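The quoted 3% resolution at 1 MeV follows directly from Poisson statistics on the more than 1,100 detected photoelectrons per MeV. A back-of-envelope check of the stochastic term only (ignoring noise and non-uniformity terms, which the full detector model would include):

```python
import math

def photostatistics_resolution(energy_mev: float,
                               npe_per_mev: float = 1100.0) -> float:
    """Stochastic energy resolution from Poisson photoelectron counting:
    sigma_E / E = 1 / sqrt(N_pe), with N_pe = npe_per_mev * E.
    """
    return 1.0 / math.sqrt(npe_per_mev * energy_mev)

res_at_1mev = photostatistics_resolution(1.0)  # about 0.03, i.e. ~3%
```

The 1/sqrt(E) scaling is why the light yield (and hence the 75% photocathode coverage and high quantum efficiency) directly drives the mass-hierarchy sensitivity.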
Cross-Layer Approaches for an Aging-Aware Design of Nanoscale Microprocessors
Thanks to aggressive scaling of transistor dimensions, computers have revolutionized our lives. However, the increasing unreliability of devices fabricated in nanoscale technologies has emerged as a major threat to the future success of computers. In particular, accelerated transistor aging is of great importance, as it reduces the lifetime of digital systems. This thesis addresses this challenge by proposing new methods to model, analyze, and mitigate aging at the microarchitecture level and above.