Optimization of Magnetized Electron Cooling with JSPEC
The Electron-Ion Collider (EIC) will be a next-generation facility located at Brookhaven National Laboratory (BNL), built with the goal of accelerating heavy ions up to 275 GeV. During the acceleration phase, intra-beam scattering drives growth of the ion beam size, so cooling techniques will be required to keep the beam size under control. JSPEC (the JLab Simulation Package for Electron Cooling) is a tool designed to numerically model magnetized and unmagnetized cooling through the friction forces between co-propagating electron and ion bunches. Here we describe a feature that has been added to the JSPEC package: a Nelder-Mead simplex optimization algorithm that allows a user to optimize selected beam parameters in order to achieve a target cooling time.
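The Nelder-Mead simplex method evaluates the objective at the vertices of a simplex and repeatedly reflects, expands, or contracts it toward the minimum, requiring no derivatives. A minimal sketch of such an optimization loop, using SciPy's implementation with a smooth stand-in objective (the parameter names and functional form are illustrative, not JSPEC's actual cooling-time model):

```python
from scipy.optimize import minimize

# Stand-in objective: JSPEC's real figure of merit is the simulated cooling
# time; here a smooth surrogate in two hypothetical knobs (electron beam
# radius, bunch charge) illustrates the optimizer call.
def cooling_time(params):
    radius_mm, charge_nC = params
    return (radius_mm - 1.5)**2 + 0.5 * (charge_nC - 2.0)**2 + 10.0

# Nelder-Mead needs only function evaluations, which suits objectives that
# come from running a simulation rather than from an analytic formula.
result = minimize(cooling_time, x0=[3.0, 0.5], method="Nelder-Mead",
                  options={"xatol": 1e-6, "fatol": 1e-6})
```

In practice each evaluation of the objective would run a JSPEC cooling simulation, so derivative-free methods like Nelder-Mead are a natural fit.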
Electron beam manipulation, injection and acceleration in plasma wakefield accelerators by optically generated plasma density spikes
We discuss considerations regarding a novel and robust scheme for optically triggered electron bunch generation in plasma wakefield accelerators [1]. In this technique, a transversely propagating focused laser pulse ignites a quasi-stationary plasma column before the arrival of the plasma wake. This localized plasma density enhancement, or optical "plasma torch", distorts the blowout during the arrival of the electron drive bunch and modifies the electron trajectories, resulting in controlled injection. By changing the gas density and the laser pulse parameters, such as beam waist and intensity, and by moving the focal point of the laser pulse, the shape of the plasma torch, and therefore the generated trailing beam, can be tuned easily. The proposed method is much more flexible and faster at generating gas density transitions than hydrodynamics-based methods, and it accommodates experimentalists' needs: it is a purely optical process and straightforward to implement.
Plasma-photonic spatiotemporal synchronization of relativistic electron and laser beams
Modern particle accelerators and their applications increasingly rely on precisely coordinated interactions of intense charged particle and laser beams. Femtosecond-scale synchronization alongside micrometre-scale spatial precision is essential, e.g., for pump-probe experiments, for seeding and diagnostics of advanced light sources, and for plasma-based accelerators. State-of-the-art temporal or spatial diagnostics typically operate with low-intensity beams to avoid material damage at high intensity. Here we present a plasma-based approach that allows measurement of both the temporal and the spatial overlap of high-intensity beams directly at their interaction point. It exploits the amplification of plasma afterglow arising from the passage of an electron beam through a laser-generated plasma filament. The corresponding photon yield carries the spatiotemporal signature of the femtosecond-scale dynamics, yet can be observed as a visible light signal on microsecond-millimetre scales.
Development of high gradient laser wakefield accelerators towards nuclear detection applications at LBNL
Compact high-energy linacs are important to applications including monochromatic gamma sources for nuclear material security. Recent laser wakefield accelerator experiments at LBNL demonstrated narrow-energy-spread beams, now with energies of up to 1 GeV in 3 cm using a plasma channel at low density. This demonstrates the production of GeV beams from devices much smaller than conventional linacs, and confirms the anticipated scaling of laser-driven accelerators to GeV energies. Stable performance at 0.5 GeV was demonstrated. Experiments and simulations are in progress to control the injection of particles into the wake and hence to improve beam quality and stability. Using plasma density gradients to control injection, stable 1 MeV beams have been demonstrated over days of operation, with an order of magnitude lower absolute momentum spread than previously observed. New experiments are post-accelerating the beams from controlled injection experiments to increase beam quality and stability. Thomson scattering from such beams is being developed to provide collimated multi-MeV monoenergetic gamma sources for security applications from compact devices. Such sources can reduce dose to target and increase accuracy for applications including photofission and nuclear resonance fluorescence.
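The "1 GeV in 3 cm at low density" result reflects the textbook scaling of the dephasing length, the distance over which trapped electrons outrun the accelerating phase of the wake: in the 1D linear regime it grows as (ω₀/ωₚ)² λₚ, i.e. as n_e^(-3/2), so lowering the density stretches the usable acceleration length to centimetre scale. A rough sketch of that scaling (assuming a 0.8 µm Ti:sapphire drive laser, which the abstract does not state; this is the generic linear-regime estimate, not the experiment's exact design):

```python
import math

def dephasing_length_cm(n_e_cm3, lambda0_um=0.8):
    """1D linear-regime dephasing-length estimate, L_d ~ (w0/wp)^2 * lambda_p."""
    lambda_p_um = 3.34e10 / math.sqrt(n_e_cm3)       # plasma wavelength, um
    n_crit_cm3 = 1.115e21 / lambda0_um**2            # critical density, cm^-3
    return (n_crit_cm3 / n_e_cm3) * lambda_p_um * 1e-4   # convert um -> cm

# At n_e ~ 1e18 cm^-3 this gives a few cm, consistent with a 3 cm channel;
# at the ~1e19 cm^-3 typical of gas jets it shrinks to sub-mm scale.
L_d = dephasing_length_cm(1e18)
```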
The global methane budget 2000–2017
Understanding and quantifying the global methane (CH4) budget is important for assessing realistic pathways to mitigate climate change. Atmospheric emissions and concentrations of CH4 continue to increase, making CH4 the second most important human-influenced greenhouse gas in terms of climate forcing, after carbon dioxide (CO2). The relative importance of CH4 compared to CO2 depends on its shorter atmospheric lifetime, stronger warming potential, and variations in atmospheric growth rate over the past decade, the causes of which are still debated. Two major challenges in reducing uncertainties in the atmospheric growth rate arise from the variety of geographically overlapping CH4 sources and from the destruction of CH4 by short-lived hydroxyl radicals (OH). To address these challenges, we have established a consortium of multidisciplinary scientists under the umbrella of the Global Carbon Project to synthesize and stimulate new research aimed at improving and regularly updating the global methane budget. Following Saunois et al. (2016), we present here the second version of the living review paper dedicated to the decadal methane budget, integrating results of top-down studies (atmospheric observations within an atmospheric inverse-modelling framework) and bottom-up estimates (including process-based models for estimating land surface emissions and atmospheric chemistry, inventories of anthropogenic emissions, and data-driven extrapolations).
For the 2008–2017 decade, global methane emissions are estimated by atmospheric inversions (a top-down approach) to be 576 Tg CH4 yr−1 (range 550–594, corresponding to the minimum and maximum estimates of the model ensemble). Of this total, 359 Tg CH4 yr−1, or ∼ 60 %, is attributed to anthropogenic sources, i.e. emissions caused by direct human activity (range 336–376 Tg CH4 yr−1, or 50 %–65 %). The mean annual total emission for the new decade (2008–2017) is 29 Tg CH4 yr−1 larger than our estimate for the previous decade (2000–2009), and 24 Tg CH4 yr−1 larger than the one reported in the previous budget for 2003–2012 (Saunois et al., 2016). Since 2012, global CH4 emissions have been tracking the warmest scenarios assessed by the Intergovernmental Panel on Climate Change. Bottom-up methods suggest almost 30 % larger global emissions (737 Tg CH4 yr−1, range 594–881) than top-down inversion methods. Indeed, bottom-up estimates for natural sources such as natural wetlands, other inland water systems, and geological sources are higher than top-down estimates. The atmospheric constraints on the top-down budget suggest that at least some of these bottom-up emissions are overestimated. The latitudinal distribution of atmospheric observation-based emissions indicates a predominance of tropical emissions (∼ 65 % of the global budget, < 30° N) compared to mid-latitudes (∼ 30 %, 30–60° N) and high northern latitudes (∼ 4 %, 60–90° N). The most important source of uncertainty in the methane budget is attributable to natural emissions, especially those from wetlands and other inland waters.
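The headline fractions quoted above follow directly from the totals; a quick check of the arithmetic (values in Tg CH4 yr−1, taken from the abstract):

```python
# Top-down (inversion) totals and the bottom-up total from the abstract.
top_down_total = 576
anthropogenic = 359
bottom_up_total = 737

anthro_frac = anthropogenic / top_down_total      # ~0.62, quoted as ~60 %
bu_excess = bottom_up_total / top_down_total - 1  # ~0.28, quoted as almost 30 %
```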
Some of our global source estimates are smaller than those in previously published budgets (Saunois et al., 2016; Kirschke et al., 2013). In particular, wetland emissions are about 35 Tg CH4 yr−1 lower due to an improved partitioning between wetlands and other inland waters. Emissions from geological sources and wild animals are also found to be smaller, by 7 and 8 Tg CH4 yr−1, respectively. However, the overall discrepancy between bottom-up and top-down estimates has been reduced by only 5 % compared to Saunois et al. (2016), due to a higher estimate of emissions from inland waters, highlighting the need for more detailed research on emission factors. Priorities for improving the methane budget include (i) a global, high-resolution map of water-saturated soils and inundated areas emitting methane based on a robust classification of different types of emitting habitats; (ii) further development of process-based models for inland-water emissions; (iii) intensification of methane observations at local scales (e.g., FLUXNET-CH4 measurements) and urban-scale monitoring to constrain bottom-up land surface models, and at regional scales (surface networks and satellites) to constrain atmospheric inversions; (iv) improvements of transport models and the representation of photochemical sinks in top-down inversions; and (v) development of a 3D variational inversion system using isotopic and/or co-emitted species such as ethane to improve source partitioning.
Computational studies and optimization of wakefield accelerators
Laser- and particle-beam-driven plasma wakefield accelerators produce accelerating fields thousands of times higher than radio-frequency accelerators, offering compactness and ultrafast bunches to extend the frontiers of high energy physics and to enable laboratory-scale radiation sources. Large-scale kinetic simulations provide essential understanding of accelerator physics to advance beam performance and stability, and they explain and predict the physics behind the recent demonstration of narrow-energy-spread bunches. Benchmarking between codes is establishing the validity of the models used and, by testing new reduced models, is extending the reach of simulations to cover upcoming meter-scale multi-GeV experiments. This includes new models that exploit Lorentz-boosted simulation frames to speed calculations. Simulations of experiments showed that the recently demonstrated plasma gradient injection of electrons can be used as an injector to increase beam quality by orders of magnitude. Simulations are now also modeling accelerator stages of tens of GeV, staging of modules, and new positron sources to design next-generation experiments and to use in applications in high energy physics and light sources.
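The Lorentz-boosted-frame technique mentioned above gains its speed from relativistic kinematics: in a frame moving with Lorentz factor γ toward the incoming laser, the plasma length contracts by γ while the laser wavelength dilates by (1+β)γ, so the ratio of the longest to the shortest scale that must be resolved shrinks by roughly (1+β)γ². A back-of-the-envelope sketch of that commonly quoted scaling (the exact gain depends on the code, the diagnostics, and instability mitigation, so treat this as an order-of-magnitude estimate):

```python
import math

def boost_speedup(gamma):
    """Rough scale-separation reduction in a boosted frame, ~(1+beta)*gamma^2."""
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    return (1.0 + beta) * gamma**2

# A modest boost of gamma = 10 already suggests ~200x fewer cells x steps.
speedup = boost_speedup(10.0)
```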
Proof-of-Principle Experiment for FEL-Based Coherent Electron Cooling
Coherent electron cooling (CeC) has the potential to significantly boost the luminosity of high-energy, high-intensity hadron-hadron and electron-hadron colliders. In a CeC system, a hadron beam interacts with a cooling electron beam. A perturbation of the electron density caused by the ions is amplified and fed back to the ions to reduce the energy spread and the emittance of the ion beam. To demonstrate the feasibility of CeC, we propose a proof-of-principle experiment at RHIC using an SRF linac. In this paper, we describe the CeC setup installed in one of RHIC's interaction regions, and we present analytical estimates and results of initial simulations of cooling a gold-ion beam at 40 GeV/u via CeC.
Final Report for "ParSEC: Parallel Simulation of Electron Cooling"
The Department of Energy has plans, during the next two or three years, to design an electron cooling section for the collider ring at RHIC (the Relativistic Heavy Ion Collider) [1]. Located at Brookhaven National Laboratory (BNL), RHIC is the premier nuclear physics facility. The new cooling section would be part of a proposed luminosity upgrade [2] for RHIC. This electron cooling section will differ from previous electron cooling facilities in three fundamental ways. First, the electron energy will be 50 MeV, as opposed to hundreds of keV (or 4 MeV for the electron cooling system now operating at Fermilab [3]). Second, both the electron beam and the ion beam will be bunched, rather than essentially continuous. Third, the cooling will take place in a collider rather than in a storage ring. Analytical work, in combination with the use and further development of the semi-analytical codes BETACOOL [4,5] and SimCool [6,7], is being pursued at BNL [8] and at other laboratories around the world. However, there is a growing consensus in the field that high-fidelity 3-D particle simulations are required to fully understand the critical cooling physics issues in this new regime. Simulations of the friction coefficient, using the VORPAL code [9], for single gold ions passing once through the interaction region have been compared with theoretical calculations [10,11], and the results have been presented in conference proceedings papers [8,12,13,14] and presentations [15,16,17]. Charged particles are advanced using a fourth-order Hermite predictor-corrector algorithm [18]. The fields in the beam frame are obtained from direct calculation of Coulomb's law, which is more efficient than multipole-type algorithms for fewer than ≈10^6 particles.
Because the interaction time is so short, it is necessary to suppress the diffusive aspect of the ion dynamics through the careful use of positrons in the simulations, and to run hundreds of simulations with the same physical parameters but with different "seeds" for the particle loading. VORPAL can now be used to simulate other electron cooling facilities around the world, and it is also suitable for other accelerator modeling applications of direct interest to the Department of Energy, for example: (a) the Boersch effect in the transport of strongly magnetized electron beams for electron cooling sections, (b) the intra-beam scattering (IBS) effect in heavy-ion accelerators, (c) the formation of crystalline beams, and (d) target physics for heavy-ion fusion (HIF).
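The fourth-order Hermite predictor-corrector cited above is the standard scheme from collisional N-body dynamics: each particle is predicted forward with a Taylor expansion through the jerk (time derivative of acceleration), the force and jerk are re-evaluated at the predicted state, and a corrector combines old and new values. A minimal sketch for a single electron in the Coulomb field of a fixed ion, in normalized units (a toy stand-in to show the scheme, not the VORPAL implementation):

```python
import numpy as np

def acc_jerk(x, v):
    # Coulomb attraction toward a fixed ion at the origin, normalized so
    # that a = -x/|x|^3; the jerk is its exact time derivative:
    # j = -v/r^3 + 3 (x.v) x / r^5
    r2 = np.dot(x, x)
    r3 = r2 * np.sqrt(r2)
    a = -x / r3
    j = -v / r3 + 3.0 * np.dot(x, v) * x / (r3 * r2)
    return a, j

def hermite_step(x, v, dt):
    a, j = acc_jerk(x, v)
    # Predictor: Taylor expansion through the jerk term.
    xp = x + v*dt + a*dt**2/2 + j*dt**3/6
    vp = v + a*dt + j*dt**2/2
    ap, jp = acc_jerk(xp, vp)
    # Corrector: standard 4th-order Hermite combination.
    vc = v + (a + ap)*dt/2 + (j - jp)*dt**2/12
    xc = x + (v + vc)*dt/2 + (a - ap)*dt**2/12
    return xc, vc

# Sanity check on a circular orbit (r = 1, v = 1): the total energy
# E = v^2/2 - 1/r = -0.5 should be conserved to high accuracy.
x = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
for _ in range(5000):
    x, v = hermite_step(x, v, 0.01)
energy = 0.5*np.dot(v, v) - 1.0/np.linalg.norm(x)
```

The scheme's appeal for friction-force studies is that it achieves fourth-order accuracy from only two force evaluations per step, which matters when each evaluation is a direct pairwise Coulomb sum.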
Simulating laser pulse propagation and low-frequency wave emission in capillary plasma channel systems with a ponderomotive guiding center model
Capillary channels of ≈3 cm in length and with plasma densities ≈10^18 cm^-3 are a promising alternative to the much shorter, higher-density gas jets for GeV-scale laser wakefield acceleration of electrons. However, the large discrepancy between the length scales of the plasma and the laser presents a major computational challenge for particle-in-cell (PIC) simulations. Methods are therefore sought that relax the need to concurrently resolve both length scales. For example, the commonly used "moving window" algorithm enables a reduction of the computational domain to a few plasma wavelengths, which is orders of magnitude smaller than the full length of the laser-plasma interaction. In addition, ponderomotive guiding center methods relax the constraint to resolve the laser wavelength. These averaging methods split the laser-induced current into a rapidly varying part and a slowly varying envelope. The average over fast time scales is performed in a semianalytic way, leaving the evolution of the laser envelope and the plasma response to be modeled numerically. Here, we present a ponderomotive guiding center algorithm and demonstrate its applicability to model capillary channels by comparing it with fully kinetic PIC simulations.
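The two reductions described above are multiplicative, and their rough size follows from the quoted parameters. A back-of-the-envelope sketch (assuming a 0.8 µm drive laser and a ~10 λₚ moving window, neither of which is stated in the abstract; both are typical choices, and the exact savings depend on resolution requirements):

```python
# Order-of-magnitude bookkeeping for the scale-separation problem:
# 3 cm channel at n_e ~ 1e18 cm^-3, assumed 0.8 um drive laser.
plasma_wavelength_um = 3.34e10 / (1e18)**0.5        # ~33 um

window_um = 10 * plasma_wavelength_um               # assumed moving-window length
domain_reduction = 3e4 / window_um                  # 3 cm -> window, ~90x fewer cells

# The envelope model no longer resolves the 0.8 um laser oscillation,
# only the ~33 um plasma wavelength, allowing a ~40x coarser grid along z.
envelope_reduction = plasma_wavelength_um / 0.8
```

Together these two factors account for the "orders of magnitude" savings that make centimetre-scale capillary simulations tractable.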