Shuttle electrical environment
Part of an AFGL payload flown on the STS-4 mission consisted of experiments to measure in-situ electric fields, electron densities, and vehicle charging. During this flight some 11 hours of data were acquired, ranging from 5-minute snapshots up to continuous half-orbits. These experiments are described and results presented for such vehicle-induced events as a main engine burn, thruster firings, and water dumps, in addition to undisturbed periods. The main characteristic of all the vehicle-induced events is shown to be an enhancement in the low-frequency noise (less than 2 kHz) in both the electrostatic and electron irregularity (delta N/N) spectra. The non-event results indicate that the electrostatic broadband emissions show a white-noise characteristic in the low-frequency range up to 2 kHz at an amplitude of 10 dB above the shuttle design specification limit, falling below that limit above 10 kHz. The vehicle potential remained within the range of -3 to +1 V throughout the flight, which is normal behavior for a satellite in a low equatorial orbit.
Recertification of the air and methane storage vessels at the Langley 8-foot high-temperature structures tunnel
Langley Research Center operates a number of sophisticated wind tunnels to fulfill the needs of its researchers. Compressed air, which is kept in steel storage vessels, is used to power many of these tunnels. Some of these vessels have been in use for many years, and Langley is currently recertifying them to ensure their continued structural integrity. One of the first facilities to be recertified under this program was the Langley 8-foot high-temperature structures tunnel. This recertification involved (1) modification, hydrotesting, and inspection of the vessels; (2) repair of all relevant defects; (3) comparison of the original design of the vessels with the current design criteria of Section VIII, Division 2, of the 1974 ASME Boiler and Pressure Vessel Code; (4) fracture-mechanics, thermal, and wind-induced vibration analyses of the vessels; and (5) development of operating envelopes and a future inspection plan for the vessels. Following these modifications, analyses, and tests, the vessels were recertified for operation at full design pressure (41.4 MPa (6000 psi)) within the operating envelope developed.
Presearch Data Conditioning in the Kepler Science Operations Center Pipeline
We describe the Presearch Data Conditioning (PDC) software component and its context in the Kepler Science Operations Center (SOC) pipeline. The primary tasks of this component are to correct systematic and other errors, remove excess flux due to aperture crowding, and condition the raw flux light curves for over 160,000 long-cadence (~thirty-minute) and 512 short-cadence (~one-minute) targets across the focal plane array. Long-cadence corrected flux light curves are subjected to a transiting planet search in a subsequent pipeline module. We discuss the science algorithms for long- and short-cadence PDC: identification and correction of unexplained (i.e., unrelated to known anomalies) discontinuities; systematic error correction; and excess flux removal. We discuss the propagation of uncertainties from raw to corrected flux. Finally, we present examples of raw and corrected flux time series for flight data to illustrate PDC performance. Corrected flux light curves produced by PDC are exported to the Multi-mission Archive at Space Telescope (MAST) and will be made available to the general public in accordance with the NASA/Kepler data release policy.
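As a rough illustration of the excess flux removal step, the Python sketch below subtracts the constant contamination level implied by a crowding metric (the fraction of aperture flux assumed to belong to the target). The constant-offset formulation and the function name are illustrative assumptions, not the pipeline's actual algorithm:

```python
import numpy as np

def remove_excess_flux(raw_flux, crowding_metric):
    """Remove flux contributed by neighboring stars in the aperture.

    Illustrative crowding-metric formulation: a fraction
    (1 - crowding_metric) of the median flux level is attributed to
    contaminating sources and subtracted as a constant offset.
    """
    excess = (1.0 - crowding_metric) * np.nanmedian(raw_flux)
    return raw_flux - excess

# Example: a light curve in which only 92% of the aperture flux
# belongs to the target (hypothetical numbers).
rng = np.random.default_rng(0)
raw = 1.0e5 + 50.0 * rng.standard_normal(4000)
corrected = remove_excess_flux(raw, crowding_metric=0.92)
```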
Kepler Presearch Data Conditioning II - A Bayesian Approach to Systematic Error Correction
With the unprecedented photometric precision of the Kepler spacecraft, significant systematic and stochastic errors on transit signal levels are observable in the Kepler photometric data. These errors, which include discontinuities, outliers, systematic trends, and other instrumental signatures, obscure astrophysical signals. The Presearch Data Conditioning (PDC) module of the Kepler data analysis pipeline tries to remove these errors while preserving planet transits and other astrophysically interesting signals. The completely new noise and stellar variability regime observed in Kepler data poses a significant problem for standard cotrending methods such as SYSREM and TFA. Variable stars are often of particular astrophysical interest, so the preservation of their signals is of significant importance to the astrophysical community. We present a Bayesian Maximum A Posteriori (MAP) approach in which a subset of highly correlated and quiet stars is used to generate a cotrending basis vector set, which is in turn used to establish a range of "reasonable" robust-fit parameters. These robust-fit parameters are then used to generate a Bayesian prior and a Bayesian posterior probability distribution function (PDF) which, when maximized, finds the best fit that simultaneously removes systematic effects while reducing the signal distortion and noise injection that commonly afflict simple least-squares (LS) fitting. A numerical and empirical approach is taken in which the Bayesian prior PDFs are generated from fits to the light curve distributions themselves. (43 pages, 21 figures. Submitted for publication in PASP. See also the companion paper "Kepler Presearch Data Conditioning I - Architecture and Algorithms for Error Correction in Kepler Light Curves" by Martin C. Stumpe et al.)
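A compact way to see the method is as regularized cotrending. The Python sketch below is an illustrative reduction of the approach, not the pipeline code: it builds cotrending basis vectors from quiet-star light curves via an SVD and maximizes a posterior in which Gaussian forms stand in for the empirical prior PDFs described above (all names and distributional choices are assumptions):

```python
import numpy as np
from scipy.optimize import minimize

def cotrending_basis_vectors(quiet_lcs, n_cbv=4):
    """SVD of median-normalized quiet-star light curves.

    quiet_lcs: (n_stars, n_cadences) array; returns (n_cbv, n_cadences).
    """
    normed = quiet_lcs / np.median(quiet_lcs, axis=1, keepdims=True) - 1.0
    _, _, vt = np.linalg.svd(normed, full_matrices=False)
    return vt[:n_cbv]

def map_fit(target, cbvs, prior_mean, prior_sigma, noise_sigma):
    """Maximize a posterior: Gaussian likelihood of the residuals times a
    Gaussian prior on the CBV coefficients (a stand-in for the empirical
    priors derived from ensemble robust fits in the actual method)."""
    def neg_log_posterior(theta):
        resid = target - theta @ cbvs
        nll = 0.5 * np.sum((resid / noise_sigma) ** 2)   # likelihood term
        nlp = 0.5 * np.sum(((theta - prior_mean) / prior_sigma) ** 2)
        return nll + nlp
    res = minimize(neg_log_posterior, x0=prior_mean, method="Nelder-Mead")
    systematics = res.x @ cbvs
    return target - systematics, res.x
```

The prior term pulls the fit coefficients toward the ensemble-typical values, which is what suppresses the overfitting (signal distortion and noise injection) that an unconstrained least-squares fit would produce.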
Photometric Analysis in the Kepler Science Operations Center Pipeline
We describe the Photometric Analysis (PA) software component and its context in the Kepler Science Operations Center (SOC) pipeline. The primary tasks of this module are to compute the photometric flux and photocenters (centroids) for over 160,000 long-cadence (~thirty-minute) and 512 short-cadence (~one-minute) stellar targets from the calibrated pixels in their respective apertures. We discuss the science algorithms for long- and short-cadence PA: cosmic ray cleaning; background estimation and removal; aperture photometry; and flux-weighted centroiding. We discuss the end-to-end propagation of uncertainties for the science algorithms. Finally, we present examples of photometric apertures, raw flux light curves, and centroid time series from Kepler flight data. PA light curves, centroid time series, and barycentric timestamp corrections are exported to the Multi-mission Archive at Space Telescope (MAST) and are made available to the general public in accordance with the NASA/Kepler data release policy.
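The two core operations, simple aperture photometry and flux-weighted centroiding, can be stated compactly. The following Python sketch is a minimal rendering of those definitions for a calibrated pixel time series; the array layout and function names are assumptions, not the SOC implementation:

```python
import numpy as np

def aperture_photometry(pixels, in_aperture):
    """Sum calibrated pixel values over the aperture, per cadence.

    pixels: (n_cadences, n_rows, n_cols); in_aperture: boolean mask
    of shape (n_rows, n_cols).
    """
    return pixels[:, in_aperture].sum(axis=1)

def flux_weighted_centroids(pixels, in_aperture):
    """Flux-weighted mean row/column position per cadence."""
    rows, cols = np.nonzero(in_aperture)
    flux = pixels[:, rows, cols]            # (n_cadences, n_aperture_pix)
    total = flux.sum(axis=1)
    row_c = (flux * rows).sum(axis=1) / total
    col_c = (flux * cols).sum(axis=1) / total
    return row_c, col_c
```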
Kepler Presearch Data Conditioning I - Architecture and Algorithms for Error Correction in Kepler Light Curves
Kepler provides light curves of 156,000 stars with unprecedented precision. However, the raw data as they come from the spacecraft contain significant systematic and stochastic errors. These errors, which include discontinuities, systematic trends, and outliers, obscure the astrophysical signals in the light curves. Correcting these errors is the task of the Presearch Data Conditioning (PDC) module of the Kepler data analysis pipeline. The original version of PDC in Kepler did not meet the extremely high performance requirements for the detection of minuscule planet transits or highly accurate analysis of stellar activity and rotation. One particular deficiency was that astrophysical features were often removed as a side effect of the removal of errors. In this paper we introduce the completely new and significantly improved version of PDC, implemented in Kepler SOC 8.0. This new PDC version, which utilizes a Bayesian approach for the removal of systematics, reliably corrects errors in the light curves while preserving planet transits and other astrophysically interesting signals. We describe the architecture and the algorithms of this new PDC module, show typical errors encountered in Kepler data, and illustrate the corrections using real light curve examples. (Submitted to PASP. See also the companion paper "Kepler Presearch Data Conditioning II - A Bayesian Approach to Systematic Error Correction" by Jeff C. Smith et al.)
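As a rough illustration of the kinds of corrections involved, the following Python sketch flags isolated outliers against a running median and tests a cadence for a flux discontinuity; the window sizes and thresholds are illustrative choices, not those of the actual PDC module:

```python
import numpy as np
from scipy.ndimage import median_filter

def flag_outliers(flux, window=25, n_sigma=5.0):
    """Flag isolated outliers: deviation from a running median compared
    with a robust, MAD-based noise estimate."""
    trend = median_filter(flux, size=window, mode="nearest")
    resid = flux - trend
    mad = np.median(np.abs(resid - np.median(resid)))
    sigma = 1.4826 * mad                      # MAD -> Gaussian sigma
    return np.abs(resid) > n_sigma * sigma

def has_discontinuity(flux, i, half=50, n_sigma=4.0):
    """Test cadence i for a step: compare the medians on either side
    against the local scatter (thresholds are illustrative)."""
    left = flux[max(0, i - half):i]
    right = flux[i:i + half]
    step = np.median(right) - np.median(left)
    scatter = 1.4826 * np.median(np.abs(left - np.median(left)))
    return abs(step) > n_sigma * scatter
```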
Lessons learned from the introduction of autonomous monitoring to the EUVE science operations center
The University of California at Berkeley's (UCB) Center for Extreme Ultraviolet Astrophysics (CEA), in conjunction with NASA's Ames Research Center (ARC), has implemented an autonomous monitoring system in the Extreme Ultraviolet Explorer (EUVE) science operations center (ESOC). The implementation was driven by a need to reduce operations costs and has allowed the ESOC to move from continuous, three-shift, human-tended monitoring of the science payload to a one-shift operation in which the off shifts are monitored by an autonomous anomaly detection system. This system includes Eworks, an artificial intelligence (AI) payload telemetry monitoring package based on RTworks, and Epage, an automatic paging system that notifies ESOC personnel of detected anomalies. In this age of shrinking NASA budgets, the lessons learned on the EUVE project are useful to other NASA missions looking for ways to reduce their operations budgets. The process of knowledge capture from the payload controllers, for implementation in an expert system, is directly applicable to any mission considering a transition to autonomous monitoring in its control center. The collaboration with ARC demonstrates how a project with limited programming resources can expand the breadth of its goals without incurring the high cost of hiring additional, dedicated programmers. This dispersal of expertise across NASA centers allows future missions to easily access experts for collaborative efforts of their own. Even the criteria used to choose an expert system have widespread impacts on the implementation, including the completion time and the final cost. In this paper we discuss, from inception to completion, the areas where our experiences in moving from three shifts to one shift may offer insights for other NASA missions.
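At its core, the off-shift monitoring concept reduces to limit checking with automated notification. The Python sketch below illustrates that generic pattern only; the abstract does not describe the internals of Eworks or Epage, so the mnemonic names, limit table, and paging hook are all hypothetical:

```python
from typing import Callable, Dict, Tuple

# Hypothetical red-line limit table: mnemonic -> (low, high).
LIMITS: Dict[str, Tuple[float, float]] = {
    "detector_hv_volts": (0.0, 28.0),
    "detector_temp_c":   (-10.0, 35.0),
}

def check_frame(frame: Dict[str, float], page: Callable[[str], None]) -> None:
    """Compare one telemetry frame against the limit table and page the
    on-call operator for any out-of-limit mnemonic."""
    for mnemonic, value in frame.items():
        lo, hi = LIMITS.get(mnemonic, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            page(f"ANOMALY: {mnemonic}={value} outside [{lo}, {hi}]")

# Example: an out-of-limit sample triggers a (printed) page.
check_frame({"detector_hv_volts": 31.2, "detector_temp_c": 21.0}, print)
```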
Verification of the Kepler Input Catalog from Asteroseismology of Solar-type Stars
We calculate precise stellar radii and surface gravities from the asteroseismic analysis of over 500 solar-type pulsating stars observed by the Kepler space telescope. These physical stellar properties are compared with those given in the Kepler Input Catalog (KIC), determined from ground-based multi-color photometry. For the stars in our sample we find general agreement, but we detect an average overestimation bias of 0.23 dex in the KIC determination of log g for stars with log g_KIC > 4.0 dex, and a resultant underestimation bias of up to 50% in the KIC radius estimates for stars with R_KIC < 2 R_sun. Part of the difference may arise from selection bias in the asteroseismic sample; nevertheless, this result implies there may be fewer stars characterized in the KIC with R ~ 1 R_sun than is suggested by the physical properties in the KIC. Furthermore, if the radius estimates for these affected stars are taken from the KIC and then used to calculate the sizes of transiting planets, a similar underestimation bias may be applied to the planetary radii. (Published in The Astrophysical Journal Letters.)
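Asteroseismic log g and radius determinations of this kind are conventionally obtained from the standard scaling relations in nu_max (the frequency of maximum oscillation power) and Delta nu (the large frequency separation). The Python sketch below uses widely quoted solar reference values; the specific constants and function names are assumptions for illustration, not values taken from the paper:

```python
import numpy as np

# Commonly used solar reference values (assumed here for illustration).
NU_MAX_SUN = 3090.0   # muHz
DNU_SUN    = 135.1    # muHz
TEFF_SUN   = 5777.0   # K
LOGG_SUN   = 4.438    # log10(cm/s^2)

def seismic_logg(nu_max, teff):
    """Surface gravity from the nu_max scaling relation:
    g/g_sun = (nu_max/nu_max_sun) * sqrt(Teff/Teff_sun)."""
    return LOGG_SUN + np.log10((nu_max / NU_MAX_SUN)
                               * np.sqrt(teff / TEFF_SUN))

def seismic_radius(nu_max, dnu, teff):
    """Radius in R_sun from the standard scaling relations:
    R/R_sun = (nu_max/nu_max_sun) * (dnu/dnu_sun)^-2 * sqrt(Teff/Teff_sun)."""
    return ((nu_max / NU_MAX_SUN) * (DNU_SUN / dnu) ** 2
            * np.sqrt(teff / TEFF_SUN))

# Sanity check: solar-like inputs recover ~1 R_sun and log g ~ 4.44.
print(seismic_logg(3090.0, 5777.0), seismic_radius(3090.0, 135.1, 5777.0))
```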