
    Cold electron beams from cryo-cooled, alkali antimonide photocathodes

    In this letter we report on the generation of cold electron beams using a Cs3Sb photocathode grown by co-deposition of Sb and Cs. By cooling the photocathode to 90 K we demonstrate a significant reduction in the mean transverse energy, validating the long-standing speculation that the lattice temperature contribution limits the mean transverse energy (or thermal emittance) near the photoemission threshold and opening new frontiers in generating ultra-bright beams. At 90 K, we achieve a record-low thermal emittance of 0.2 μm (rms) per mm of laser spot diameter from an ultrafast (sub-picosecond) photocathode with quantum efficiency greater than 7×10⁻⁵ at a visible laser wavelength of 690 nm.
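
    For orientation, the standard relation between mean transverse energy (MTE) and thermal emittance, together with the lattice-temperature floor the abstract refers to, is (a textbook relation in photoinjector physics, not a result quoted from this letter; how the stated 0.2 μm per mm figure maps onto the rms spot size depends on the spot-size convention used):

        \epsilon_{n,\mathrm{th}} = \sigma_x \sqrt{\frac{\mathrm{MTE}}{m_e c^2}}, \qquad
        \mathrm{MTE} \gtrsim k_B T \approx 7.8\ \mathrm{meV\ at\ 90\ K}\ (\approx 25.9\ \mathrm{meV\ at\ 300\ K}),

    where \sigma_x is the rms laser spot size and m_e c^2 = 511 keV.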

    Development of a 3-D energy-momentum analyzer for meV-scale energy electrons.

    In this article, we report on the development of a time-of-flight based electron energy analyzer capable of measuring the 3-D momentum and energy distributions of very low energy (millielectronvolt-scale) photoemitted electrons. The analyzer can measure the energy and 3-D momentum distributions of electrons with energies down to 1 meV with sub-millielectronvolt energy resolution, making it an ideal tool for studying photoemission processes very close to the photoemission threshold and the physics of photoemission-based electron sources.
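
    As a minimal kinematic sketch (assuming a field-free drift of length L from the emission point to a position-sensitive detector; the actual analyzer geometry and any guiding fields are not described in the abstract), the measured flight time t and arrival position (x, y) give

        p_x = \frac{m_e x}{t}, \quad p_y = \frac{m_e y}{t}, \quad p_z = \frac{m_e L}{t}, \qquad
        E = \frac{p_x^2 + p_y^2 + p_z^2}{2 m_e},

    non-relativistically. A 1 meV electron moves at roughly 19 km/s, so a drift of a few centimetres corresponds to flight times of a few microseconds, which is what makes sub-meV resolution primarily a timing problem.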

    Liveness-Based Garbage Collection for Lazy Languages

    We consider the problem of reducing the memory required to run lazy first-order functional programs. Our approach is to analyze programs for liveness of heap-allocated data and to use the results of the analysis to preserve only live data (a subset of reachable data) during garbage collection. This increases the amount of garbage reclaimed and reduces the peak memory requirement of programs. While this technique has already been shown to yield benefits for eager first-order languages, the lack of a statically determinable execution order and the presence of closures pose new challenges for lazy languages. These require changes both in the liveness analysis itself and in the design of the garbage collector. To show the effectiveness of our method, we implemented a copying collector that uses the results of the liveness analysis to preserve live objects, both evaluated objects (i.e., in WHNF) and closures. Our experiments confirm that programs running with a liveness-based garbage collector show a significant decrease in peak memory requirements. In addition, a sizable reduction in the number of collections ensures that, in spite of using a more complex garbage collector, the execution times of programs running with liveness- and reachability-based collectors remain comparable.
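
    The core idea can be illustrated with a small, purely expository sketch in Python (the paper's collector operates inside a lazy functional runtime, and its liveness analysis and heap representation are considerably more involved than this): a copying collector that consults a liveness map and evacuates only the fields that may still be demanded, rather than everything reachable.

        # Expository sketch only: the Cell representation and the liveness-map
        # interface are illustrative assumptions, not the paper's implementation.

        class Cell:
            """A heap object: an evaluated constructor (e.g. a pair) or an unevaluated thunk."""
            def __init__(self, tag, fields):
                self.tag = tag          # e.g. 'pair' or 'thunk'
                self.fields = fields    # references to other Cells (or None)
                self.forward = None     # forwarding pointer, set while copying

        def copy_live(root, liveness):
            """Copy `root` into to-space, following only the field indices that
            liveness(cell) reports as possibly demanded later.  A plain
            reachability-based collector is the special case in which liveness
            returns every field index of every cell."""
            to_space = []

            def copy(cell):
                if cell is None:
                    return None
                if cell.forward is not None:         # already evacuated
                    return cell.forward
                new = Cell(cell.tag, list(cell.fields))
                cell.forward = new
                to_space.append(new)
                for i, child in enumerate(new.fields):
                    if i in liveness(cell):
                        new.fields[i] = copy(child)  # live: evacuate the child too
                    else:
                        new.fields[i] = None         # dead: reclaim without copying
                return new

            return copy(root), to_space

        # A pair whose second component is never demanded afterwards: reachability
        # would keep all three cells, liveness keeps only two.
        big_unused = Cell('thunk', [])
        pair = Cell('pair', [Cell('thunk', []), big_unused])
        root, heap = copy_live(pair, lambda c: {0} if c is pair else set(range(len(c.fields))))
        assert root.fields[1] is None
        print(len(heap))   # 2

    In a real liveness-based collector the dead slot would typically be overwritten with a distinguished value rather than a bare null, so that an unexpected later demand can be detected; the sketch elides that.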

    A Forecast for Large Scale Structure Constraints on Horndeski Gravity with Line Intensity Mapping

    We consider the potential for line intensity mapping (LIM) of the rotational CO(1-0), CO(2-1) and CO(3-2) transitions to detect deviations from General Relativity over 0 < z < 3 within the framework of a very general class of modified gravity models, called Horndeski theories. Our forecast assumes a multi-tracer analysis separately obtaining information from the matter power spectrum and the first two multipoles of the redshift-space distortion power spectrum. To achieve ±0.1 level constraints on the slope of the kinetic gravity braiding and Planck mass evolution parameters, a mm-wave LIM experiment would need to accumulate ≈ 10⁸–10⁹ spectrometer hours, feasible with instruments that could be deployed in the 2030s. Such a measurement would constrain large portions of the remaining parameter space available to Scalar-Tensor modified gravity theories. Our modeling code is publicly available. (13 pages, 5 figures; to be submitted to Monthly Notices of the Royal Astronomical Society.)
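
    For reference, in the linear (Kaiser) limit the first two multipoles of the redshift-space power spectrum used in such forecasts are (the paper's actual model will additionally carry line-intensity bias factors, shot noise, and instrumental effects not shown here):

        P_0(k,z) = \left(b^2 + \tfrac{2}{3} b f + \tfrac{1}{5} f^2\right) P_m(k,z), \qquad
        P_2(k,z) = \left(\tfrac{4}{3} b f + \tfrac{4}{7} f^2\right) P_m(k,z),

    where b is the tracer bias, f = d\ln D / d\ln a is the linear growth rate (whose evolution departs from its GR value in Horndeski theories), and P_m is the matter power spectrum.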

    Comparison of artifacts between paste and collodion method of electrode application in pediatric EEG

    © 2019 International Federation of Clinical Neurophysiology. Objectives: Children pose challenges to obtaining quality EEG data due to excessive artifact. Collodion is used for EEG electrodes because of its water resistance and strong adhesive qualities. This study was done to evaluate differences in artifacts between the collodion and paste methods. Methods: 115 subjects (children aged >3 years) were randomized into paste and collodion groups, and artifacts were evaluated at baseline and every hour over 30 s increments. Age, sleep state, and number of electrodes with artifact were also documented. A t-test was performed to determine differences in the various parameters between the two groups. Results: 61 subjects were in the paste group and 54 in the collodion group. The mean total seconds of artifact from 0 to 24 h was 41.8 s in the paste group versus 30.3 s in the collodion group (P = 0.02). Children >11 years old had less artifact than younger children from 0 to 24 h (24.3 versus 41.2 s, P = 0.03) and from 24 to 48 h (33.1 versus 43.1 s, P = 0.03). There was a significant effect of sleep vs. awake state recordings on artifact from 0 to 24 h (30.3 versus 50.2 s, P = 0.01). Conclusion: Electrode problems are common with both collodion and paste in prolonged AEEG monitoring; however, for studies shorter than 24 h, collodion may be a better alternative. Significance: Our study provides evidence that in some cases collodion may be a better alternative to paste in terms of decreased artifacts.
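
    A minimal sketch of the reported group comparison, an independent two-sample t-test on per-recording artifact durations (the arrays below are illustrative placeholders, not the study's data):

        # Illustrative only: hypothetical artifact durations (s) per subject over 0-24 h.
        from scipy import stats

        paste_s     = [52, 38, 45, 60, 30, 41, 27, 44]   # paste group
        collodion_s = [28, 35, 24, 40, 22, 31, 26, 33]   # collodion group

        t_stat, p_value = stats.ttest_ind(paste_s, collodion_s)
        print(f"t = {t_stat:.2f}, p = {p_value:.3f}")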

    Analysis of Bent Microstrip Resonator Using Finite Element Method


    The simulation and design of an on-chip superconducting millimetre filter-bank spectrometer

    Superconducting on-chip filter banks provide a scalable, space-saving way to build imaging spectrometers at millimetre and submillimetre wavelengths. We present an easy-to-realise, lithographed superconducting filter design with a high tolerance to fabrication error. Using a capacitively coupled λ/2 microstrip resonator to define a narrow (λ/Δλ = 300) spectral pass band, the filtered output of a given spectrometer channel connects directly to a lumped-element kinetic inductance detector. We show a tolerance analysis of the design, demonstrating a <11% change in filter quality factor for any single realistic fabrication error, and forecast the full filter-bank efficiency to be 50% after accounting for fabrication errors and dielectric loss tangent.
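
    In the standard coupled-resonator picture (a general relation rather than numbers taken from this work), the resolving power of each channel is its loaded quality factor, which combines feedline coupling, detector coupling, and internal loss:

        R = \frac{\nu_0}{\Delta\nu_\mathrm{FWHM}} = Q_\mathrm{filt}, \qquad
        \frac{1}{Q_\mathrm{filt}} = \frac{1}{Q_\mathrm{feed}} + \frac{1}{Q_\mathrm{det}} + \frac{1}{Q_i},

    so, for example, Q_feed = Q_det ≈ 600 with negligible internal loss would give the quoted λ/Δλ = 300; finite dielectric loss (1/Q_i) both broadens the channel and reduces its peak efficiency.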

    Freeze-Dried Ham Promotes Azoxymethane-Induced Mucin-Depleted Foci and Aberrant Crypt Foci in Rat Colon

    Processed and red meat consumption is associated with the risk of colorectal cancer, and meta-analyses have suggested that the risk associated with processed meat is higher. Most processed meats are cured and cooked, which leads to the formation of free nitrosyl heme; we speculated that free nitrosyl heme is more toxic than native myoglobin. The promoting effect of a freeze-dried, cooked, cured ham diet was investigated in a 100-day study, with aberrant crypt foci and mucin-depleted foci (MDF) as the colon carcinogenesis endpoints. A second study (14 days) was designed 1) to compare the effects of ham, hemoglobin, and hemin, and 2) to test the effects of sodium chloride, nitrite, and phosphate in the diet on early biomarkers associated with heme-induced promotion. In the 100-day study, control and ham-fed rats had 3.5 and 8.5 MDF/colon, respectively (P < 0.0001). Promotion was associated with cytotoxicity and lipid peroxidation. In the short-term study, cytotoxicity and lipid peroxidation of fecal water, and the urinary marker of lipid peroxidation, increased dramatically in ham- and hemin-fed rats. In contrast, the hemoglobin, sodium chloride, nitrite, and phosphate diets had no effect. Freeze-dried cooked ham can thus promote colon carcinogenesis in a rodent model, and hemin, but not hemoglobin, mimicked the effect of ham on early biochemical markers associated with carcinogenesis.

    The design and characterization of a 300 channel, optimized full-band millimeter filterbank for science with SuperSpec

    SuperSpec is an integrated, on-chip spectrometer for millimeter and sub-millimeter astronomy. We report the approach, design optimization, and partial characterization of a 300-channel filterbank covering the 185-315 GHz band, targeting a resolving power R ~ 310 and fitting on a 3.5×5.5 cm chip. SuperSpec uses a lens and broadband antenna to couple radiation into a niobium microstrip that feeds a bank of niobium microstrip half-wave resonators for frequency selectivity. Each half-wave resonator is coupled to the inductor of a titanium nitride lumped-element kinetic inductance detector (LEKID) that detects the incident radiation. The device was designed for use in a demonstration instrument at the Large Millimeter Telescope (LMT).
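
    A back-of-the-envelope check of the channel count, assuming log-uniform channel placement (the oversampling factor below is an assumption for illustration, not a number taken from the paper):

        # Sketch: number of log-spaced channels covering 185-315 GHz at R ~ 310.
        import numpy as np

        nu_min, nu_max = 185e9, 315e9     # band edges [Hz]
        R = 310                           # resolving power nu / delta_nu
        oversample = 1.8                  # channels per resolution width (assumed)

        # Adjacent channels spaced by nu_{i+1} = nu_i * (1 + 1/(R * oversample))
        step = 1.0 + 1.0 / (R * oversample)
        n_channels = int(np.floor(np.log(nu_max / nu_min) / np.log(step))) + 1
        print(n_channels)                 # ~298 under these assumptions, close to 300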