Development of the Technology for Intermediate Energy Electron Cooling
This research was sponsored by the National Science Foundation Grant NSF PHY-931478
Study of the Feasibility of Decreasing the Emittance of the SSC Beam Through the Use of Electron Cooling in the SSC Medium Energy Booster
This research was sponsored by the National Science Foundation Grant NSF PHY-931478
Design of a 6 MeV Electron Cooling System for the SSC Medium Energy Booster
This research was sponsored by the National Science Foundation Grant NSF PHY-931478
Efficient Bit-Decomposition and Modulus-Conversion Protocols with an Honest Majority
We propose secret-sharing-based bit-decomposition and modulus-conversion protocols for a prime-order ring with an honest majority: the adversary can corrupt only a minority of the parties. Our protocols are secure against passive or active adversaries, depending on the components used. We assume the secret is a fixed-bit-length element and that the prime modulus is sufficiently larger than the secret (the exact bound differs between the passive and active settings). The outputs of our bit-decomposition and modulus-conversion protocols are a tuple of bitwise shares and a share under the target modulus, respectively. When the bit length of the secret and the number of parties are small, the communication complexity of our passively secure bit-decomposition and modulus-conversion protocols is correspondingly small. Our key observation is that the quotient of a sum of additive shares can be computed from the least significant bits alone: if a secret is "shifted" and additively shared, the least significant bits of the sum of the shares determine the quotient, since the modulus is an odd prime and the least significant bits of the shifted secret are zeros.
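The key observation above can be illustrated with a toy computation (a minimal sketch with assumed parameters p, n, d, and s, not the paper's actual protocol): a secret shifted left by d bits has zero low bits, so the low d bits of the integer sum of its additive shares modulo an odd prime p depend only on the quotient q, which is then recovered from the low bits of the shares alone.

```python
import random

random.seed(1)

p = 10007          # odd prime modulus (toy size, not from the paper)
n = 3              # number of parties
d = 4              # shift amount; need n <= 2**d and (s << d) < p
s = 37             # toy secret satisfying (s << d) < p

shifted = s << d                     # the d least significant bits are zeros
# additively share `shifted` in Z_p
shares = [random.randrange(p) for _ in range(n - 1)]
shares.append((shifted - sum(shares)) % p)

q = sum(shares) // p                 # true integer quotient (0 <= q < n)

# each party contributes only the d least significant bits of its share
low = sum(x % (1 << d) for x in shares) % (1 << d)
p_inv = pow(p, -1, 1 << d)           # p is odd, hence invertible mod 2**d
q_rec = (low * p_inv) % (1 << d)     # low = q*p (mod 2**d), so this is q

assert q_rec == q
```

Because the shifted secret contributes nothing to the low d bits, `low` equals `q * p` modulo `2**d`; multiplying by the inverse of the odd prime recovers `q` exactly whenever `q < 2**d`.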
Results from the intercalibration of optical low light calibration sources 2011
Following the 38th Annual European Meeting on Atmospheric Studies by Optical Methods in Siuntio in Finland, an intercalibration workshop for optical low light calibration sources was held in Sodankylä, Finland. The main purpose of this workshop was to provide a comparable scale for absolute measurements of aurora and airglow. All sources brought to the intercalibration workshop were compared to the Fritz Peak reference source using the Lindau Calibration Photometer built by Wilhelm Barke and Hans Lauche in 1984. The results were compared to several earlier intercalibration workshops. It was found that most sources were fairly stable over time, with errors in the range of 5–25%. To further validate the results, two sources were also intercalibrated at UNIS, Longyearbyen, Svalbard. Preliminary analysis indicates agreement with the intercalibration in Sodankylä within about 15–25%.
Amortised Memory Analysis Using the Depth of Data Structures
Abstract. Hofmann and Jost have presented a heap space analysis [1] that finds linear space bounds for many functional programs. It uses an amortised analysis: assigning hypothetical amounts of free space (called potential) to data structures in proportion to their sizes using type annotations. Constraints on these annotations in the type system ensure that the total potential assigned to the input is an upper bound on the total memory required to satisfy all allocations. We describe a related system for bounding the stack space requirements which uses the depth of data structures, by expressing potential in terms of maxima as well as sums. This is achieved by adding extra structure to typing contexts (inspired by O’Hearn’s bunched typing [2]) to describe the form of the bounds. We will also present the extra steps that must be taken to construct a typing during the analysis. Obtaining bounds on the resource requirements of programs can be crucial for ensuring that they enjoy reliability and security properties, particularly for use i
Testing nowcasts of the ionospheric convection from the expanding and contracting polar cap model
The expanding/contracting polar cap (ECPC) model, or the time-dependent Dungey cycle, provides a theoretical framework for understanding solar wind-magnetosphere-ionosphere coupling. The ECPC describes the relationship between magnetopause reconnection and substorm growth phase, magnetotail reconnection and substorm expansion phase, associated changes in auroral morphology, and ionospheric convective motions. Despite the many successes of the model, there has yet to be a rigorous test of the predictions or nowcasts made regarding ionospheric convection, which remains a final hurdle for the validation of the ECPC. In this study we undertake a comparison of ionospheric convection, as measured in situ by ion drift meters on board DMSP (Defense Meteorological Satellite Program) satellites and from the ground by SuperDARN (Super Dual Auroral Radar Network), with motions nowcast by a theoretical model. The model is coupled to measurements of changes in the size of the polar cap made using global auroral imagery from the IMAGE FUV (Imager for Magnetopause to Aurora Global Exploration Far Ultraviolet) instrument, as well as the dayside reconnection rate, estimated using the OMNI data set. The results show that we can largely nowcast the magnitudes of ionospheric convection flows using the context of our understanding of magnetic reconnection at the magnetopause and in the magnetotail.
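The flux balance underlying the ECPC model can be sketched numerically (the values below are assumed, order-of-magnitude numbers, not those of the study): the open polar-cap flux grows with the dayside reconnection voltage and shrinks with the nightside one.

```python
def step_flux(f_pc: float, phi_d: float, phi_n: float, dt: float) -> float:
    """Advance the open polar-cap flux by one time step.

    f_pc  : open magnetic flux in the polar cap, Wb
    phi_d : dayside reconnection voltage (flux opening rate), V = Wb/s
    phi_n : nightside reconnection voltage (flux closing rate), V = Wb/s
    """
    return f_pc + (phi_d - phi_n) * dt

f = 0.4e9                    # assumed initial open flux, Wb
phi_d, phi_n = 50e3, 30e3    # assumed reconnection voltages, V
for _ in range(60):          # one minute of net flux opening, 1 s steps
    f = step_flux(f, phi_d, phi_n, 1.0)
# net change after 60 s: (50e3 - 30e3) * 60 = 1.2e6 Wb
```

An expanding polar cap (dayside voltage exceeding nightside) drives the excitation of convection the model nowcasts; reversing the imbalance contracts it.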
The Acceleration and Storage of Radioactive Ions for a Beta-Beam Facility
The term beta-beam has been coined for the production of a pure beam of
electron neutrinos or their antiparticles through the decay of radioactive ions
circulating in a storage ring. This concept requires radioactive ions to be
accelerated to a Lorentz gamma as high as 150. The neutrino source itself
consists of a storage ring for this energy range, with long straight sections
in line with the experiment(s). Such a decay ring does not exist at CERN today,
nor does a high-intensity proton source for the production of the radioactive
ions. Nevertheless, the existing CERN accelerator infrastructure could be used
as this would still represent an important saving for a beta-beam facility.
Comment: beta-beam working group website at http://cern.ch/beta-bea
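For context on the Lorentz gamma of 150 quoted above, a short sketch of the relativistic boost a decay neutrino receives (the ~1.9 MeV centre-of-mass energy below is an assumed, order-of-magnitude value for a typical beta-beam ion, not taken from this abstract):

```python
import math

gamma = 150.0                        # Lorentz factor quoted for the beta-beam
beta = math.sqrt(1.0 - 1.0 / gamma**2)

def lab_energy(e_cm: float, cos_theta_cm: float) -> float:
    """Boost a neutrino of centre-of-mass energy e_cm (MeV) to the lab frame."""
    return gamma * e_cm * (1.0 + beta * cos_theta_cm)

# a ~1.9 MeV centre-of-mass neutrino emitted forward gains roughly a
# factor of 2*gamma, landing in the few-hundred-MeV range useful for
# oscillation experiments
e_fwd = lab_energy(1.9, 1.0)
```

At gamma = 150 the boost factor for forward emission is essentially 2 * gamma, which is what turns MeV-scale beta-decay neutrinos into a collimated beam of hundreds of MeV.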