A microrod-resonator Brillouin laser with 240 Hz absolute linewidth
We demonstrate an ultralow-noise microrod-resonator based laser that
oscillates on the gain supplied by the stimulated Brillouin scattering optical
nonlinearity. Microresonator Brillouin lasers are known to offer an outstanding
frequency noise floor, which is limited by fundamental thermal fluctuations.
Here, we show experimental evidence that thermal effects also dominate the
close-to-carrier frequency fluctuations. The 6-mm diameter microrod resonator
used in our experiments has a large optical mode area of ~100 μm², and
hence its 10 ms thermal time constant filters the close-to-carrier optical
frequency noise. The result is an absolute laser linewidth of 240 Hz with a
corresponding white-frequency noise floor of 0.1 Hz²/Hz. We explain the
steady-state performance of this laser by measurements of its operation state
and of its mode detuning and lineshape. Our results highlight a mechanism for
noise that is common to many microresonator devices due to the inherent
coupling between intracavity power and mode frequency. We demonstrate the
ability to reduce this noise through a feedback loop that stabilizes the
intracavity power. (11 pages, 5 figures)
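As a quick sanity check on the numbers above (a sketch using the standard white-frequency-noise relation Δν = πS₀ and a first-order low-pass corner 1/(2πτ); these calculations are illustrative, not taken from the paper):

```python
import math

def lorentzian_fwhm(s0_hz2_per_hz):
    """FWHM of the Lorentzian lineshape produced by white frequency noise
    with one-sided PSD S0 (Hz^2/Hz): delta_nu = pi * S0."""
    return math.pi * s0_hz2_per_hz

def thermal_cutoff_hz(tau_s):
    """-3 dB corner of a first-order low-pass with time constant tau."""
    return 1.0 / (2.0 * math.pi * tau_s)

# A 0.1 Hz^2/Hz floor implies a ~0.31 Hz fundamental linewidth; the 240 Hz
# absolute linewidth is instead set by close-to-carrier thermal fluctuations.
print(lorentzian_fwhm(0.1))
# A 10 ms thermal time constant gives a ~16 Hz corner for the noise filtering.
print(thermal_cutoff_hz(10e-3))
```

The gap between the ~0.3 Hz fundamental floor and the 240 Hz absolute linewidth is exactly the close-to-carrier excess the abstract attributes to thermal effects.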
Crop Production Management in South Dakota: LISA Farmers Compared to Farmers in General
This paper summarizes (1) five main contrasts in crop production management between LISA and conventional farmers in South Dakota and (2) reactions of panels of LISA farmers, conventional farmers, and other key informants to the existence of, and explanations for, apparent contrasts between LISA farmers and farmers in general in their respective crop production practices.
Mean Robust Optimization
Robust optimization is a tractable and expressive technique for
decision-making under uncertainty, but it can lead to overly conservative
decisions when pessimistic assumptions are made on the uncertain parameters.
Wasserstein distributionally robust optimization can reduce conservatism by
being data-driven, but it often leads to very large problems with prohibitive
solution times. We introduce mean robust optimization, a general framework that
combines the best of both worlds by providing a trade-off between computational
effort and conservatism. We propose uncertainty sets constructed from
clustered data rather than directly from observed data points, thereby
significantly reducing problem size. By varying the number of clusters, our
method bridges between robust and Wasserstein distributionally robust
optimization. We show finite-sample performance guarantees and explicitly
control the potential additional pessimism introduced by any clustering
procedure. In addition, we prove conditions for which, when the uncertainty
enters linearly in the constraints, clustering does not affect the optimal
solution. We illustrate the efficiency and performance preservation of our
method on several numerical examples, obtaining multiple orders of magnitude
speedups in solution time with little-to-no effect on the solution quality.
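The clustering idea can be sketched in a few lines (a toy illustration with synthetic data and a hand-rolled k-means; the set construction and guarantees in the paper are more involved, and all names and numbers here are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical uncertainty samples (e.g., constraint coefficients), 200 points in 2-D.
data = rng.normal(loc=[1.0, 2.0], scale=0.3, size=(200, 2))

def kmeans(points, k, iters=50):
    """Plain Lloyd's algorithm; returns cluster centroids and empirical weights."""
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute centroids, keeping the old one if a cluster goes empty.
        centroids = np.array([
            points[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
    weights = np.bincount(labels, minlength=k) / len(points)
    return centroids, weights

centroids, weights = kmeans(data, k=5)

# A robust constraint u^T x <= b over the clustered data can be approximated by
# enforcing it at each centroid: 5 constraints instead of 200.
x = np.array([0.5, 0.5])
b = 2.0
worst = max(c @ x for c in centroids)
print(f"worst-case over centroids: {worst:.2f} (budget {b})")
```

Enforcing the constraint at 5 centroids rather than 200 samples is what shrinks the problem; the paper additionally bounds the extra pessimism that any such clustering can introduce.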
Learning for Robust Optimization
We propose a data-driven technique to automatically learn the uncertainty
sets in robust optimization. Our method reshapes the uncertainty sets by
minimizing the expected performance across a family of problems while
guaranteeing constraint satisfaction. We learn the uncertainty sets using a
novel stochastic augmented Lagrangian method that relies on differentiating the
solutions of the robust optimization problems with respect to the parameters of
the uncertainty set. We show sublinear convergence to stationary points under
mild assumptions, and finite-sample probabilistic guarantees of constraint
satisfaction using empirical process theory. Our approach is very flexible and
can learn a wide variety of uncertainty sets while preserving tractability.
Numerical experiments show that our method outperforms traditional approaches
in robust and distributionally robust optimization in terms of out-of-sample
performance and constraint satisfaction guarantees. We implemented our method
in the open-source package LROPT.
Low Risk Monitoring in Neurocritical Care
Background/Rationale: Patients are admitted to Intensive care units (ICUs) either because they need close monitoring despite a low risk of hospital mortality (LRM group) or to receive ICU specific active treatments (AT group). The characteristics and differential outcomes of LRM patients vs. AT patients in Neurocritical Care Units are poorly understood. Methods: We classified 1,702 patients admitted to our tertiary and quaternary care center Neuroscience-ICU in 2016 and 2017 into LRM vs. AT groups. We compared demographics, admission diagnosis, goal of care status, readmission rates and managing attending specialty extracted from the medical record between groups. Acute Physiology, Age and Chronic Health Evaluation (APACHE) IVa risk predictive modeling was used to assess comparative risks for ICU and hospital mortality and length of stay between groups. Results: 56.9% of patients admitted to our Neuroscience-ICU in 2016 and 2017 were classified as LRM, whereas 43.1% of patients were classified as AT. While demographically similar, the groups differed significantly in all risk predictive outcome measures [APACHE IVa scores, actual and predicted ICU and hospital mortality (p < 0.0001 for all metrics)]. The most common admitting diagnosis overall, cerebrovascular accident/stroke, was represented in the LRM and AT groups with similar frequency [24.3 vs. 21.3%, respectively (p = 0.15)], illustrating that further differentiating factors like symptom duration, neurologic status and its dynamic changes and neuro-imaging characteristics determine the indication for active treatment vs. observation. Patients with intracranial hemorrhage/hematoma were significantly more likely to receive active treatments as opposed to having a primary focus on monitoring [13.6 vs. 9.8%, respectively (p = 0.017)].
Conclusion: The majority of patients admitted to our Neuroscience ICU (56.9%) had <10% hospital mortality risk and a focus on monitoring, whereas the remaining 43.1% of patients received active treatments in their first ICU day. LRM patients exhibited significantly lower APACHE IVa scores and ICU and hospital mortality rates compared to AT patients. Observed-over-expected ICU and hospital mortality ratios were better than predicted by APACHE IVa for low risk monitored patients and close to prediction for actively treated patients, suggesting that at least a subset of LRM patients may safely and more cost-effectively be cared for in intermediate level care settings.
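For reference, the observed-over-expected mortality ratio used above compares actual deaths to the sum of model-predicted risks (a minimal sketch with hypothetical numbers, not data from the study):

```python
def oe_ratio(observed_deaths, predicted_probs):
    """Observed-over-expected ratio: < 1 means fewer deaths than the model predicts."""
    return observed_deaths / sum(predicted_probs)

# Hypothetical cohort: 8 observed deaths among 100 patients whose
# APACHE-style predicted mortality risks average 12%.
print(oe_ratio(8, [0.12] * 100))  # ~0.67, i.e. outcomes better than predicted
```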
Cosmological Feedback from High-Redshift Dwarf Galaxies
We model how repeated supernova explosions in high-redshift dwarf starburst
galaxies drive superbubbles and winds out of the galaxies. We compute the
efficiencies of metal and mass ejection and energy transport from the galactic
potentials, including the effect of cosmological infall of external gas. The
starburst bubbles quickly blow out of small, high-redshift, galactic disks, but
must compete with the ram pressure of the infalling gas to escape into
intergalactic space. We show that the assumed efficiency of the star formation
rate dominates the bubble evolution and the metal, mass, and energy feedback
efficiencies. With star formation efficiency f*=0.01, the ram pressure of
infall can confine the bubbles around high-redshift dwarf galaxies with
circular velocities v_c>52 km/s. We can expect high metal and mass ejection
efficiencies, and moderate energy transport efficiencies in halos with
v_c~30-50 km/s and f*~0.01 as well as in halos with v_c~100 km/s and f*>>0.01.
Such halos collapse successively from 1-2 sigma peaks in LambdaCDM Gaussian
density perturbations as time progresses. These dwarf galaxies can probably
enrich low and high-density regions of intergalactic space with metals to
10^-3-10^-2 Zsun as they collapse at z~8 and z<5 respectively. They also may be
able to provide adequate turbulent energy to prevent the collapse of other
nearby halos, as well as to significantly broaden Lyman-alpha absorption lines
to v_rms~20-40 km/s. We compute the timescales for the next starbursts if gas
freely falls back after a starburst, and find that, for star formation
efficiencies as low as f*<0.01, the next starburst should occur in less than
half the Hubble time at the collapse redshift. This suggests that episodic star
formation may be ubiquitous in dwarf galaxies. (Accepted for ApJ v613; 60 pages, 15 figures)
Derivation of the Blackbody Radiation Spectrum from a Natural Maximum-Entropy Principle Involving Casimir Energies and Zero-Point Radiation
By numerical calculation, the Planck spectrum with zero-point radiation is
shown to satisfy a natural maximum-entropy principle whereas alternative
choices of spectra do not. Specifically, if we consider a set of
conducting-walled boxes, each with a partition placed at a different location
in the box, so that across the collection of boxes the partitions are uniformly
spaced across the volume, then the Planck spectrum corresponds to that spectrum
of random radiation (having constant energy kT per normal mode at low
frequencies and zero-point energy (1/2)ℏω per normal mode at high frequencies)
which gives maximum uniformity across the collection of boxes for the radiation
energy per box. The analysis involves Casimir energies and zero-point radiation
which do not usually appear in thermodynamic analyses. For simplicity, the
analysis is presented for waves in one space dimension. (11 pages)
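The spectrum singled out by this maximum-entropy principle can be written per normal mode of frequency ω in its standard form (consistent with the two limits quoted above, and stated here only as background):

```latex
U(\omega, T)
  = \frac{\hbar\omega}{2}\coth\!\left(\frac{\hbar\omega}{2 k_B T}\right)
  = \frac{\hbar\omega}{e^{\hbar\omega/k_B T} - 1} + \frac{\hbar\omega}{2},
```

which reduces to U → kT for ℏω ≪ kT and to the zero-point energy U → ℏω/2 for ℏω ≫ kT.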
Quantitative Imaging of Protein-Protein Interactions by Multiphoton Fluorescence Lifetime Imaging Microscopy using a Streak camera
Fluorescence Lifetime Imaging Microscopy (FLIM) using multiphoton excitation
techniques is now finding an important place in quantitative imaging of
protein-protein interactions and intracellular physiology. We review here the
recent developments in multiphoton FLIM methods and also present a description
of a novel multiphoton FLIM system using a streak camera that was developed in
our laboratory. We provide an example of a typical application of the system in
which we measure the fluorescence resonance energy transfer between a
donor/acceptor pair of fluorescent proteins within a cellular specimen. (Overview of FLIM techniques, the StreakFLIM instrument, and a FRET application)
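Lifetime-based FRET readout rests on the standard relation E = 1 − τ_DA/τ_D, where τ_D is the donor lifetime alone and τ_DA its lifetime in the presence of the acceptor (a minimal sketch with hypothetical lifetimes, not measurements from the paper):

```python
def fret_efficiency(tau_donor_ns, tau_donor_acceptor_ns):
    """FRET efficiency from the donor lifetime without (tau_D) and with
    (tau_DA) the acceptor: E = 1 - tau_DA / tau_D."""
    return 1.0 - tau_donor_acceptor_ns / tau_donor_ns

# Hypothetical CFP-like donor: 2.5 ns alone, quenched to 1.5 ns by the acceptor.
print(fret_efficiency(2.5, 1.5))  # -> 0.4
```

Because lifetimes are intensity-independent, this readout avoids the concentration and bleaching artifacts that complicate intensity-based FRET measurements.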