383 research outputs found
Rightsizing LISA
The LISA science requirements and conceptual design have been fairly stable
for over a decade. In the interest of reducing costs, the LISA Project at NASA
has looked at simplifications of the architecture, downsizing of subsystems,
and descopes of the entire mission. This is a natural activity
of the formulation phase, and one that is particularly timely in the current
NASA budgetary context. There is, and will continue to be, enormous pressure
for cost reduction from ESA and NASA, from reviewers, and from the broader
research community. Here, the rationale for the baseline architecture is reviewed, and
recent efforts to find simplifications and other reductions that might lead to
savings are reported. A few possible simplifications have been found in the
LISA baseline architecture. In the interest of exploring cost sensitivity, one
moderate and one aggressive descope have been evaluated; the cost savings are
modest and the loss of science is not.
Comment: To be published in Classical and Quantum Gravity; Proceedings of the
Seventh International LISA Symposium, Barcelona, Spain, 16-20 Jun. 2008; 10
pages, 1 figure, 3 tables
A Tight Excess Risk Bound via a Unified PAC-Bayesian-Rademacher-Shtarkov-MDL Complexity
We present a novel notion of complexity that interpolates between and
generalizes some classic existing complexity notions in learning theory: for
estimators like empirical risk minimization (ERM) with arbitrary bounded
losses, it is upper bounded in terms of data-independent Rademacher complexity;
for generalized Bayesian estimators, it is upper bounded by the data-dependent
information complexity (also known as stochastic or PAC-Bayesian complexity). For
(penalized) ERM, the new complexity reduces to (generalized) normalized maximum
likelihood (NML) complexity, i.e. a minimax log-loss individual-sequence
regret. Our first main result bounds excess risk in terms of the new
complexity. Our second main result links the new complexity via Rademacher
complexity to entropy, thereby generalizing earlier results of Opper,
Haussler, Lugosi, and Cesa-Bianchi who treated the log-loss case.
Together, these results recover optimal bounds for VC- and large (polynomial
entropy) classes, replacing localized Rademacher complexity by a simpler
analysis which almost completely separates the two aspects that determine the
achievable rates: 'easiness' (Bernstein) conditions and model complexity.
Comment: 38 pages
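As background for the NML terminology in this abstract (the standard Shtarkov definitions, not taken from the paper itself): for a model class $\{p_\theta : \theta \in \Theta\}$ over a countable sample space $\mathcal{X}^n$, the normalized maximum likelihood distribution and its complexity are

```latex
% Standard (Shtarkov) NML definitions; background only, not from the paper.
\[
  p_{\mathrm{NML}}(x^n) \;=\;
    \frac{\sup_{\theta \in \Theta} p_\theta(x^n)}
         {\sum_{z^n \in \mathcal{X}^n} \sup_{\theta \in \Theta} p_\theta(z^n)},
  \qquad
  \mathrm{COMP}_n(\Theta) \;=\;
    \log \sum_{z^n \in \mathcal{X}^n} \sup_{\theta \in \Theta} p_\theta(z^n),
\]
% The worst-case log-loss regret of p_NML,
%   max_{x^n} [ log sup_theta p_theta(x^n) - log p_NML(x^n) ],
% equals COMP_n(Theta), i.e. the minimax individual-sequence regret.
```

so the "minimax log-loss individual-sequence regret" mentioned above is exactly $\mathrm{COMP}_n(\Theta)$, attained by $p_{\mathrm{NML}}$.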
An Optimal Mitigation Strategy Against the Asteroid Impact Threat with Short Warning Time
This paper presents the results of a NASA Innovative Advanced Concept (NIAC) Phase 2 study entitled "An Innovative Solution to NASA's Near-Earth Object (NEO) Impact Threat Mitigation Grand Challenge and Flight Validation Mission Architecture Development." This NIAC Phase 2 study was conducted at the Asteroid Deflection Research Center (ADRC) of Iowa State University in 2012-2014. The study objective was to develop an innovative yet practically implementable mitigation strategy for the most probable impact threat of an asteroid or comet with short warning time (less than 5 years). The mitigation strategy described in this paper is intended to optimally reduce the severity and catastrophic damage of the NEO impact event, especially when we don't have sufficient warning time for non-disruptive deflection of a hazardous NEO. This paper provides an executive summary of the NIAC Phase 2 study results.
Study of tooling concepts for manufacturing operations in space Final report
Mechanical linkage device for manufacturing operations with orbital workshop
F-15B Quiet Spike(TradeMark) Aeroservoelastic Flight-Test Data Analysis
System identification is utilized in the aerospace community for development of simulation models for robust control law design. These models are often described as linear, time-invariant processes and assumed to be uniform throughout the flight envelope. Nevertheless, it is well known that the underlying process is inherently nonlinear. Over the past several decades the controls and biomedical communities have made great advances in developing tools for the identification of nonlinear systems. In this report, we show the application of one such nonlinear system identification technique, structure detection, for the analysis of Quiet Spike(TradeMark) (Gulfstream Aerospace Corporation, Savannah, Georgia) aeroservoelastic flight-test data. Structure detection is concerned with the selection of a subset of candidate terms that best describe the observed output. Structure computation as a tool for black-box modeling may be of critical importance for the development of robust, parsimonious models for the flight-test community. The objectives of this study are to demonstrate, via analysis of Quiet Spike(TradeMark) aeroservoelastic flight-test data for several flight conditions, that: linear models are inefficient for modelling aeroservoelastic data; nonlinear identification provides a parsimonious model description whilst providing a high percent fit for cross-validated data; and the model structure and parameters vary as the flight condition is altered.
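Structure detection, as the abstract above describes it, amounts to selecting a small subset of candidate regressor terms that best explains the measured output. A minimal, generic sketch of the idea, using greedy forward selection with least squares (not the authors' actual algorithm or data; the signal, candidate terms, and tolerances below are invented for illustration):

```python
import numpy as np

def forward_structure_detection(candidates, y, max_terms=3, tol=1e-8):
    """Greedy forward selection: at each step, add the candidate term whose
    inclusion most reduces the least-squares residual on the output y."""
    selected, remaining = [], list(candidates)
    best_err = float(np.sum(y ** 2))
    for _ in range(max_terms):
        step_best = None
        for name in remaining:
            X = np.column_stack([candidates[k] for k in selected + [name]])
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            err = float(np.sum((y - X @ coef) ** 2))
            if err < best_err - tol:
                step_best, best_err = name, err
        if step_best is None:   # no candidate improves the fit: stop
            break
        selected.append(step_best)
        remaining.remove(step_best)
    return selected

# Invented example: a cubic nonlinearity hidden among polynomial candidates.
u = np.linspace(-1.0, 1.0, 50)
y = 2.0 * u + 0.5 * u ** 3
candidates = {"u": u, "u^2": u ** 2, "u^3": u ** 3}
print(forward_structure_detection(candidates, y))  # picks {'u', 'u^3'}, not 'u^2'
```

The payoff the abstract points to is parsimony: the procedure returns only the terms the data actually support, rather than a dense linear model refit at every flight condition.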
Variability, negative evidence, and the acquisition of verb argument constructions
We present a hierarchical Bayesian framework for modeling the acquisition of verb argument constructions. It embodies a domain-general approach to learning higher-level knowledge in the form of inductive constraints (or overhypotheses), and has been used to explain other aspects of language development such as the shape bias in learning object names. Here, we demonstrate that the same model captures several phenomena in the acquisition of verb constructions. Our model, like adults in a series of artificial language learning experiments, makes inferences about the distributional statistics of verbs on several levels of abstraction simultaneously. It also produces the qualitative learning patterns displayed by children over the time course of acquisition. These results suggest that the patterns of generalization observed in both children and adults could emerge from basic assumptions about the nature of learning. They also provide an example of a broad class of computational approaches that can resolve Baker's Paradox.
An Innovative Solution to NASA's NEO Impact Threat Mitigation Grand Challenge and Flight Validation Mission Architecture Development
This paper presents the results of a NASA Innovative Advanced Concept (NIAC) Phase 2 study entitled "An Innovative Solution to NASA's Near-Earth Object (NEO) Impact Threat Mitigation Grand Challenge and Flight Validation Mission Architecture Development." This NIAC Phase 2 study was conducted at the Asteroid Deflection Research Center (ADRC) of Iowa State University in 2012-2014. The study objective was to develop an innovative yet practically implementable mitigation strategy for the most probable impact threat of an asteroid or comet with short warning time (< 5 years). The mitigation strategy described in this paper is intended to optimally reduce the severity and catastrophic damage of the NEO impact event, especially when we don't have sufficient warning time for non-disruptive deflection of a hazardous NEO. This paper provides an executive summary of the NIAC Phase 2 study results. Detailed technical descriptions of the study results are provided in a separate final technical report, which can be downloaded from the ADRC website (www.adrc.iastate.edu).
