
    Sensor/ROIC Integration using Oxide Bonding

    We explore the Ziptronix Direct Bond Interconnect technology for the integration of sensors and readout integrated circuits (ROICs) for high energy physics. The technology uses an oxide bond to form a robust mechanical connection between layers, which in turn assists the formation of metallic interlayer connections. We report on testing results for sample sensors bonded to ROICs and thinned to 100 microns. Comment: Talk given at the 2008 International Linear Collider Workshop (LCWS08 and ILC08), Chicago, Illinois, November 16-20, 2008. 4 pages, 1 figure.

    Implementation, modeling, and exploration of precision visual servo systems


    Deeply Virtual Compton Scattering at Future Electron-Ion Colliders

    The study of hadronic structure has been carried out for many years. Generalized parton distributions (GPDs) encode broad information on the internal structure of hadrons. By combining GPDs with high-energy scattering experiments, we expect to extract three-dimensional physical quantities from data. The Deeply Virtual Compton Scattering (DVCS) process is a powerful tool for studying GPDs, and it is one of the key measurements planned at the future Electron-Ion Collider (EIC) and the Electron-Ion Collider in China (EicC). In its initial stage, the proposed EicC will collide $3 \sim 5$ GeV polarized electrons with $12 \sim 25$ GeV polarized protons, with luminosity up to $1 \sim 2 \times 10^{33}$ cm$^{-2}$s$^{-1}$. The EIC, to be constructed in the coming years, will cover variable c.m. energies from 30 to 50 GeV, with luminosity of about $10^{33} \sim 10^{34}$ cm$^{-2}$s$^{-1}$. In this work we present a detailed simulation of DVCS to study the feasibility of such experiments at EicC and EIC. Following the method used by the HERMES Collaboration, and comparing model calculations with pseudo-data for the asymmetries attributed to DVCS, we obtain a model-dependent constraint on the total angular momentum of up and down quarks in the proton. Comment: 12 pages, 18 figures, 3 tables.
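    For context, the quark total angular momentum that such asymmetry fits aim to constrain is related to the GPDs H and E through Ji's sum rule (standard notation, not quoted from the abstract):

        J_q \;=\; \lim_{t \to 0} \frac{1}{2} \int_{-1}^{1} dx \, x \left[ H_q(x, \xi, t) + E_q(x, \xi, t) \right]

    The DVCS asymmetries are sensitive to these GPDs, so fitting model calculations to (pseudo-)data on the asymmetries translates into a model-dependent constraint on J_u and J_d.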

    Initialization Matters: Privacy-Utility Analysis of Overparameterized Neural Networks

    We analytically investigate how over-parameterization of models in randomized machine learning algorithms affects the information leakage about their training data. Specifically, we prove a privacy bound on the KL divergence between model distributions trained on worst-case neighboring datasets, and explore its dependence on the initialization, width, and depth of fully connected neural networks. We find that this KL privacy bound is largely determined by the expected squared gradient norm with respect to the model parameters during training. Notably, for the special setting of a linearized network, our analysis indicates that the squared gradient norm (and therefore the escalation of privacy loss) is tied directly to the per-layer variance of the initialization distribution. Using this analysis, we show that the privacy bound improves with increasing depth under certain initializations (LeCun and Xavier), while it degrades with increasing depth under others (He and NTK). Our work reveals a complex interplay between privacy and depth that depends on the chosen initialization distribution. We further prove excess empirical risk bounds under a fixed KL privacy budget, and show that the interplay between the privacy-utility trade-off and depth is similarly affected by the initialization.
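    The sketch below illustrates the kind of quantity such a bound controls: for plain noisy (Langevin-style) gradient descent, the KL divergence between the update distributions on two neighboring datasets accumulates a per-step term governed by the gradient difference, and hence by the gradient norm. This is a minimal illustration on a toy linear model, not the paper's theorem for fully connected networks; the datasets, step size, and noise scale are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def grad(w, X, y):
            """Full-batch gradient of mean squared error for a linear model."""
            return 2.0 * X.T @ (X @ w - y) / len(y)

        def kl_privacy_bound(X, y, Xp, yp, sigma=1.0, lr=0.1, steps=50):
            """Accumulate the per-step KL between the Gaussian update
            distributions of noisy GD on D = (X, y) and on a neighboring
            dataset D' = (Xp, yp) that differs in a single example."""
            w = np.zeros(X.shape[1])  # shared (deterministic) initialization
            kl = 0.0
            for _ in range(steps):
                g, gp = grad(w, X, y), grad(w, Xp, yp)
                # KL( N(w - lr*g, (lr*sigma)^2 I) || N(w - lr*gp, (lr*sigma)^2 I) )
                kl += np.sum((g - gp) ** 2) / (2.0 * sigma ** 2)
                w = w - lr * g + lr * sigma * rng.standard_normal(w.shape)
            return kl

        # Toy neighboring datasets: D' replaces the last example of D with an outlier.
        X = rng.standard_normal((100, 5))
        y = X @ rng.standard_normal(5) + 0.1 * rng.standard_normal(100)
        Xp, yp = X.copy(), y.copy()
        Xp[-1], yp[-1] = rng.standard_normal(5), 10.0

        print(f"accumulated KL privacy bound: {kl_privacy_bound(X, y, Xp, yp):.4f}")

    In this toy setting the bound grows with the squared norm of the gradient difference across training, which is the role the expected squared gradient norm plays in the depth- and initialization-dependent analysis described above.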