Discovery of the 2010 Eruption and the Pre-Eruption Light Curve for Recurrent Nova U Scorpii
We report the discovery by B. G. Harris and S. Dvorak on JD 2455224.9385
(2010 Jan 28.4385 UT) of the predicted eruption of the recurrent nova U Scorpii
(U Sco). We also report on 815 magnitudes (and 16 useful limits) on the
pre-eruption light curve in the UBVRI and Sloan r' and i' bands from 2000.4 up
to 9 hours before the peak of the January 2010 eruption. We found no
significant long-term variations, though we did find frequent fast variations
(flickering) with amplitudes up to 0.4 mag. We show that U Sco did not have any
rises or dips with amplitude greater than 0.2 mag on timescales from one day to
one year before the eruption. We find that the peak of this eruption occurred
at JD 2455224.69+-0.07 and the start of the rise was at JD 2455224.32+-0.12.
From our analysis of the average B-band flux between eruptions, we find that
the total mass accreted between eruptions is consistent with being a constant,
in agreement with a strong prediction of nova trigger theory. The date of the
next eruption can be anticipated with an accuracy of +-5 months by following
the average B-band magnitudes for the next ~10 years, although at this time we
can only predict that the next eruption will be in the year 2020+-2.
Comment: Astronomical Journal submitted, 36 pages, 3 figures, full table
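The constant-accreted-mass idea tested in this abstract lends itself to a toy calculation: if the white dwarf must accrete a fixed trigger mass between eruptions, the next inter-eruption interval scales inversely with the current mean accretion rate. The sketch below uses invented numbers for illustration; the paper's actual method tracks the average B-band magnitude.

```python
# Toy illustration of the constant-accreted-mass prediction scheme.
# All numeric values are invented; they are not the paper's measurements.

def predict_next_eruption(last_eruption_year, mdot_prev, mdot_now,
                          prev_interval_years):
    """If a fixed trigger mass must be accreted between eruptions,
    interval x mean accretion rate is conserved, so the next interval
    scales inversely with the current mean accretion rate."""
    trigger_mass = mdot_prev * prev_interval_years   # toy units
    next_interval = trigger_mass / mdot_now
    return last_eruption_year + next_interval

# Illustrative inputs: 2010.07 eruption, ~11 yr prior interval, and a
# slightly lower mean accretion rate inferred from the B-band flux.
year = predict_next_eruption(2010.07, mdot_prev=2.5e-7, mdot_now=2.4e-7,
                             prev_interval_years=11.0)
print(round(year, 1))  # → 2021.5
```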
Gamma-Ray Burster Counterparts: HST Blue and Ultraviolet Data
The surest solution of the Gamma Ray Burst (GRB) mystery is to find an
unambiguous low-energy quiescent counterpart. However, to date no reasonable
candidates have been identified in the x-ray, optical, infrared, or radio
ranges. The Hubble Space Telescope (HST) has now allowed for the first deep
ultraviolet searches for quiescent counterparts. This paper reports on
multiepoch ultraviolet searches of five GRB positions with HST. We found no
sources with significant ultraviolet excesses, variability, parallax, or proper
motion in any of the burst error regions. In particular, we see no sources
similar to that proposed as a counterpart to the GRB970228. While this negative
result is disappointing, it still has good utility for its strict limits on the
no-host-galaxy problem in cosmological models of GRBs. For most cosmological
models (with peak luminosity 6X10^50 erg/s), the absolute B magnitude of any
possible host galaxy must be fainter than -15.5 to -17.4. These smallest boxes
for some of the brightest bursts provide the most critical test, and our limits
are a severe problem for all published cosmological burst models.
Comment: 15 pages, 2 ps figures, accepted for publication in the Astrophysical Journal
Assessing the Value of Coordinated Sire Genetics in a Synchronized AI Program
Synchronized artificial insemination was used to breed cows to different types of sire genetics, including low-accuracy, calving-ease, and high-accuracy sires. These three calf sire groups were compared to calves born to cows bred by natural service. We found substantial production efficiency gains, carcass merit improvement, and economic value for calves born to cows following a synchronized artificial insemination program that included high-accuracy semen. The economic advantage of the high-accuracy calf sire group was computed to be in the neighborhood of $80/head relative to the natural service calf sire group.
Keywords: artificial insemination, beef, cow, carcass, feed-out, genetics, pre-conditioning, sire synchronization; Agricultural Finance
Handling Qualities Evaluations of Low Complexity Model Reference Adaptive Controllers for Reduced Pitch and Roll Damping Scenarios
National Aeronautics and Space Administration (NASA) researchers have conducted a series of flight experiments designed to study the effects of varying levels of adaptive controller complexity on the performance and handling qualities of an aircraft under various simulated failure or damage conditions. A baseline nonlinear dynamic inversion controller was augmented with three variations of a model reference adaptive control design. The simplest design consisted of a single adaptive parameter in each of the pitch and roll axes, computed using a basic gradient-based update law. A second design built upon the first by increasing the complexity of the update law. The third and most complex design added an additional adaptive parameter to each axis. Flight tests were conducted using NASA's Full-scale Advanced Systems Testbed, a highly modified F-18 aircraft that contains a research flight control system capable of hosting advanced flight controls experiments. Each controller was evaluated against a suite of simulated failures and damage ranging from destabilization of the pitch and roll axes to significant coupling between the axes. Two pilots evaluated the three adaptive controllers as well as the non-adaptive baseline controller in a variety of dynamic maneuvers and precision flying tasks designed to uncover potential deficiencies in the handling qualities of the aircraft and adverse interactions between the pilot and the adaptive controllers. The work was completed as part of the Integrated Resilient Aircraft Control Project under NASA's Aviation Safety Program.
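The simplest design described above, a single adaptive parameter per axis driven by a gradient-based update law, can be sketched as a toy one-axis simulation. The plant, reference model, and gains below are illustrative assumptions (a first-order roll-rate model with an MIT-rule feedforward gain), not the flight experiment's actual dynamics or control law.

```python
# Toy single-parameter model reference adaptive control (MRAC) sketch
# with a gradient-based (MIT-rule) update. Values are illustrative only.

dt, T, gamma = 0.01, 60.0, 0.5
a_ref, b_ref = 2.0, 2.0   # reference model: xm' = -a_ref*xm + b_ref*r
a_p, b_p = 0.8, 1.5       # "damaged" plant:  x'  = -a_p*x  + b_p*u
r = 1.0                   # pilot command (step input)

x = xm = theta = 0.0
for _ in range(int(T / dt)):
    u = theta * r                     # single adaptive feedforward gain
    e = x - xm                        # tracking error vs. reference model
    x += dt * (-a_p * x + b_p * u)
    xm += dt * (-a_ref * xm + b_ref * r)
    theta += dt * (-gamma * e * xm)   # gradient-based update law

print(round(theta, 2))  # settles near a_p/b_p ≈ 0.53 for this setup
```

With the adaptation gain `gamma` kept modest, the tracking error decays and the adaptive gain converges to the value that matches the reference model in steady state.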
Complexity and Pilot Workload Metrics for the Evaluation of Adaptive Flight Controls on a Full Scale Piloted Aircraft
Flight research has shown the effectiveness of adaptive flight controls for improving aircraft safety and performance in the presence of uncertainties. The National Aeronautics and Space Administration's (NASA) Integrated Resilient Aircraft Control (IRAC) project designed and conducted a series of flight experiments to study the impact of variations in adaptive controller design complexity on performance and handling qualities. A novel complexity metric was devised to compare the degrees of simplicity achieved in three variations of a model reference adaptive controller (MRAC) for NASA's F-18 (McDonnell Douglas, now The Boeing Company, Chicago, Illinois) Full-Scale Advanced Systems Testbed (Gen-2A) aircraft. The complexity measures of these controllers are also compared to that of an earlier MRAC designed for NASA's Intelligent Flight Control System (IFCS) project and flown on a highly modified F-15 aircraft (McDonnell Douglas, now The Boeing Company, Chicago, Illinois). Pilot comments during the IRAC research flights pointed to the importance of workload in handling qualities ratings for failure and damage scenarios. Modifications to existing pilot aggressiveness and duty cycle metrics are presented and applied to the IRAC controllers. Finally, while adaptive controllers may alleviate the effects of failures or damage on an aircraft's handling qualities, they also have the potential to introduce annoying changes to the flight dynamics or to the operation of aircraft systems. A nuisance rating scale is presented for the categorization of nuisance side effects of adaptive controllers.
CASTER - a concept for a Black Hole Finder Probe based on the use of new scintillator technologies
The primary scientific mission of the Black Hole Finder Probe (BHFP), part of
the NASA Beyond Einstein program, is to survey the local Universe for black
holes over a wide range of mass and accretion rate. One approach to such a
survey is a hard X-ray coded-aperture imaging mission operating in the 10--600
keV energy band, a spectral range that is considered to be especially useful in
the detection of black hole sources. The development of new inorganic
scintillator materials provides improved performance (for example, with regards
to energy resolution and timing) that is well suited to the BHFP science
requirements. Detection planes formed with these materials coupled with a new
generation of readout devices represent a major advancement in the performance
capabilities of scintillator-based gamma cameras. Here, we discuss the Coded
Aperture Survey Telescope for Energetic Radiation (CASTER), a concept that
represents a BHFP based on the use of the latest scintillator technology.Comment: 12 pages; conference paper presented at the SPIE conference "UV,
X-Ray, and Gamma-Ray Space Instrumentation for Astronomy XIV." To be
published in SPIE Conference Proceedings, vol. 589
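The coded-aperture imaging principle behind such a hard X-ray survey instrument can be sketched in one dimension: a pseudo-random mask casts a shifted shadow (shadowgram) on the detector plane, and cross-correlating the detector counts with the mask pattern recovers the source direction. The mask, its size, and the source position below are invented for the demonstration.

```python
import numpy as np

# Toy 1D coded-aperture imaging: shadowgram formation and
# correlation decoding. Geometry and mask are illustrative only.

rng = np.random.default_rng(0)
n = 64
mask = rng.integers(0, 2, n)                 # open (1) / closed (0) elements

source_pos = 20                              # true source offset, in mask pixels
shadowgram = np.roll(mask, source_pos).astype(float)

# Balanced correlation decode: subtract the mask mean so a uniform
# background correlates to zero, then scan all trial shifts.
decode = mask - mask.mean()
sky = np.array([np.dot(np.roll(decode, s), shadowgram) for s in range(n)])
print(int(np.argmax(sky)))
```

The decoded "sky" peaks at the trial shift matching the true source offset, which is why pseudo-random masks with flat autocorrelation sidelobes are favored for instruments of this class.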
Application of a genome-based predictive CHO model for increased mAb production and glycosylation control
Monoclonal antibody therapeutics continue to grow in both number and market share, with recent forecasts of global sales reaching ~$125MM by 2020. Most mAb products currently on the market are produced using cultured mammalian cells, typically Chinese Hamster Ovary (CHO) cells, which provide the post-translational modifications necessary to make the antibody efficacious. Many post-translational modifications, such as the oligosaccharide profile, are considered critical quality attributes (CQAs) that must be tightly controlled throughout the manufacturing process to ensure product safety and effectiveness. Therefore, the ability to predict how cell culture media components, including potential contaminants such as trace metals, will affect product formation and glycosylation is important from both a process development and a process control viewpoint. A detailed genome-based, predictive CHO model from the Insilico Cells™ library was adapted with the reconstruction software Insilico Discovery™ for a representative fed-batch process through a collaborative effort leveraging the computational and experimental expertise of two companies. The final, compartmentalized network model contained 1,900 reactions (including transport reactions) and 1,300 compounds, and included stoichiometric descriptions of the anabolic pathways for amino acids, lipids, and carbohydrate species. The genome-scale model was constrained using several assumptions about the cell physiology and then used to compute time-resolved flux distributions with the software module Insilico Inspector™. The Insilico Designer™ module was then used to reduce the large model to a computationally manageable form able to describe all flux distributions using five flux modes, of which four combined several metabolic functions and one was independently responsible for product synthesis.
Using Insilico Designer™, the kinetic parameters of the reduced model were estimated by fitting the model-predicted metabolite concentrations to the experimentally determined values. The calibrated model properly described the time-dependent trajectories of biomass, product, and most metabolites. Simulations using the reduced model were run, and a media composition predicted to improve mAb production was identified and experimentally verified. Furthermore, experiments probing the effects of trace metals on product glycosylation were used to extend the model's glycosylation predictability. The ability to identify both metabolic signatures and media components that correlate with specific glycan profiles will allow fine-tuning of desired CQAs and enable more robust control strategies in upstream processes.
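The calibration step described above, fitting kinetic parameters so that a reduced model reproduces measured concentration trajectories, can be sketched with a toy one-parameter fit. The exponential growth model, the fabricated "data", and the grid-search fit below are illustrative stand-ins, not the actual Insilico Designer™ equations or algorithm.

```python
import numpy as np

# Toy calibration sketch: recover a single specific growth rate mu by
# least-squares fitting a predicted biomass trajectory to "measured"
# data. Model and numbers are illustrative stand-ins only.

t = np.linspace(0.0, 96.0, 9)             # fed-batch sampling times, h
true_mu = 0.035                           # 1/h, used only to fabricate data
data = 0.5 * np.exp(true_mu * t)          # "measured" biomass, g/L

def predict(mu):
    """Reduced-model biomass trajectory for a candidate growth rate."""
    return 0.5 * np.exp(mu * t)

# Least-squares calibration over a coarse parameter grid
grid = np.linspace(0.01, 0.06, 501)
sse = [np.sum((predict(mu) - data) ** 2) for mu in grid]
mu_fit = float(grid[int(np.argmin(sse))])
print(round(mu_fit, 3))  # recovers ~0.035
```

In practice the fit would span many kinetic parameters and metabolite trajectories simultaneously, but the principle, minimizing the misfit between model-predicted and measured concentrations, is the same.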