Robust Control Design for Large Space Structures
The control design problem for the class of future spacecraft referred to as large space structures (LSS) is by now well known: reduced-order control of a very high order, lightly damped system with uncertain system parameters, particularly in the high-frequency modes. A design methodology that incorporates robustness considerations as part of the design process is presented. Combining pertinent results from multivariable systems theory with optimal control and estimation, LQG eigenstructure assignment and LQG frequency shaping are used to improve singular-value robustness measures in the presence of control and observation spillover.
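The singular-value robustness measure mentioned above can be made concrete. Below is a minimal numerical sketch, not the paper's design: it assumes an illustrative two-mode lightly damped structure, computes a full-state LQR gain, and evaluates the return-difference measure min over omega of sigma_min(I + L(j*omega)) with the loop broken at the plant input. All numbers (w1, w2, z, B) are hypothetical.

import numpy as np
from scipy.linalg import solve_continuous_are

# Illustrative lightly damped two-mode structure (numbers are made up)
w1, w2, z = 1.0, 3.3, 0.005
A = np.zeros((4, 4))
A[0, 1] = A[2, 3] = 1.0
A[1, 0], A[1, 1] = -w1**2, -2 * z * w1
A[3, 2], A[3, 3] = -w2**2, -2 * z * w2
B = np.array([[0.0], [1.0], [0.0], [0.7]])

# Full-state LQR gain, u = -K x
Q, R = np.eye(4), np.eye(1)
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# Return-difference robustness measure with the loop broken at the
# plant input: L(s) = K (sI - A)^{-1} B
sig = []
for w in np.logspace(-2, 2, 400):
    L = K @ np.linalg.solve(1j * w * np.eye(4) - A, B)
    sig.append(np.linalg.svd(np.eye(1) + L, compute_uv=False).min())
print(min(sig))   # >= 1: the classical full-state LQR guarantee

The guarantee evaluated here holds only for full-state feedback; once an observer is introduced, as in LQG, these margins can vanish, which is precisely why robustness-oriented techniques such as the frequency shaping above are needed.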
A simplified procedure for correcting both errors and erasures of a Reed-Solomon code using the Euclidean algorithm
It is well known that the Euclidean algorithm, or its equivalent, continued fractions, can be used to find the error locator polynomial and the error evaluator polynomial in Berlekamp's key equation needed to decode a Reed-Solomon (RS) code. A simplified procedure is developed and proved to correct erasures as well as errors by replacing the initial conditions of the Euclidean algorithm with the erasure locator polynomial and the Forney syndrome polynomial. By this means, the errata locator polynomial and the errata evaluator polynomial can be obtained simultaneously and simply, by the Euclidean algorithm alone. With this improved technique, the complexity of time-domain RS decoders for correcting both errors and erasures is reduced substantially compared with previous approaches. As a consequence, decoders for correcting both errors and erasures of RS codes can be made more modular, regular, and simple, and are naturally suitable for both VLSI and software implementation. An example illustrating this modified decoding procedure is given for a (15, 9) RS code.
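To make the modified initialization concrete, here is a compact sketch of the key-equation step for the (15, 9) RS code over GF(2^4) with primitive polynomial x^4 + x + 1. The structure follows the procedure described above (Euclid started from the erasure locator and the Forney syndrome polynomial), but all names and the demo errata values are illustrative, not taken from the paper.

# GF(2^4) log/antilog tables, primitive polynomial x^4 + x + 1
EXP, LOG = [0] * 30, [0] * 16
v = 1
for i in range(15):
    EXP[i], LOG[v] = v, i
    v <<= 1
    if v & 0x10:
        v ^= 0b10011                     # reduce modulo x^4 + x + 1
for i in range(15, 30):
    EXP[i] = EXP[i - 15]

def gmul(a, b):                          # multiplication in GF(16)
    return 0 if 0 in (a, b) else EXP[LOG[a] + LOG[b]]

def deg(p):                              # degree; -1 for the zero polynomial
    return max((i for i, c in enumerate(p) if c), default=-1)

def padd(p, q):                          # addition = coefficientwise XOR
    n = max(len(p), len(q))
    return [(p[i] if i < len(p) else 0) ^ (q[i] if i < len(q) else 0)
            for i in range(n)]

def pmul(p, q):                          # polynomial product over GF(16)
    out = [0] * (len(p) + len(q) - 1)
    for i, x in enumerate(p):
        for j, y in enumerate(q):
            out[i + j] ^= gmul(x, y)
    return out

def key_equation(T, gamma, d, nu):
    # Euclid on (x^(d-1), T(x)) with the multiplier initialized to the
    # erasure locator gamma(x); stops when deg r < (d - 1 + nu) / 2.
    # Returns (errata locator, errata evaluator), up to a common scalar.
    r_prev, r = [0] * (d - 1) + [1], T[:d - 1]
    u_prev, u = [0], gamma[:]
    while deg(r) >= (d - 1 + nu) // 2:
        q = [0] * (deg(r_prev) - deg(r) + 1)     # quotient of r_prev / r
        while deg(r_prev) >= deg(r):
            k = deg(r_prev) - deg(r)
            c = gmul(r_prev[deg(r_prev)], EXP[15 - LOG[r[deg(r)]]])
            q[k] ^= c
            r_prev = padd(r_prev, pmul([0] * k + [c], r))
        r_prev, r = r, r_prev                    # remainder becomes new r
        u_prev, u = u, padd(u_prev, pmul(q, u))
    return u, r

# Demo for the (15, 9) code (d = 7, t = 3): one error at position 2 and one
# erasure at position 5, with hypothetical values alpha^7 and alpha^3.
d, t = 7, 3
S = [gmul(EXP[7], EXP[2 * k % 15]) ^ gmul(EXP[3], EXP[5 * k % 15])
     for k in range(1, 2 * t + 1)]              # syndromes S_1..S_6
gamma = [1, EXP[5]]                             # erasure locator 1 + alpha^5 x
T = pmul(gamma, S)[:2 * t]                      # Forney syndrome polynomial
loc, ev = key_equation(T, gamma, d, nu=1)
print(loc)   # [7, 14, 4]: proportional to (1 + alpha^2 x)(1 + alpha^5 x)

Because the Euclidean algorithm determines its outputs only up to a common scalar, the returned locator is a constant multiple of (1 + alpha^2 x)(1 + alpha^5 x), matching the one error and one erasure injected in the demo; Chien search and Forney's formula, which tolerate that scaling, would finish the decode.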
The Mass of the Black Hole in the Quasar PG 2130+099
We present the results of a recent reverberation-mapping campaign undertaken to improve measurements of the radius of the broad-line region and the central black hole mass of the quasar PG 2130+099. Cross-correlation of the 5100 angstrom continuum and H-beta emission-line light curves yields a time lag of 22.9 (+4.4/-4.3) days, corresponding to a central black hole mass MBH = 3.8 (+/- 1.5) x 10^7 Msun. This value supports the notion that previous measurements yielded an incorrect lag. We re-analyzed previous datasets to investigate the possible sources of the discrepancy and conclude that previous measurement errors were apparently caused by a combination of undersampling of the light curves and long-term secular changes in the H-beta emission-line equivalent width. With our new measurements, PG 2130+099 is no longer an outlier in either the R-L or the MBH-sigma relationships.
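For orientation, the mass in such campaigns follows from the virial relation M_BH = f c tau (Delta V)^2 / G, where tau is the measured lag and Delta V is the H-beta line velocity dispersion. Below is a minimal sketch of that arithmetic; the values of f and Delta V are illustrative assumptions (the abstract quotes neither), chosen so the result lands near the quoted mass.

# Virial black-hole mass from a reverberation lag: M = f * c * tau * dV^2 / G
# tau is from the abstract; f = 5.5 and dV = 1240 km/s are assumed values.
G, c, Msun, day = 6.674e-11, 2.998e8, 1.989e30, 86400.0

tau = 22.9 * day          # H-beta lag, in seconds
dV = 1.24e6               # line velocity dispersion (m/s), assumed
f = 5.5                   # virial factor, assumed

M = f * c * tau * dV**2 / G
print(f"M_BH ~ {M / Msun:.1e} Msun")   # ~3.8e+07, matching the quoted mass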
Managing digital coordination of design: emerging hybrid practices in an institutionalized project setting
What happens when digital coordination practices are introduced into the institutionalized setting of an engineering project? This question is addressed through an interpretive study that examines how a shared digital model comes to be used in the late design stages of a major station refurbishment project. The paper contributes by mobilizing the idea of ‘hybrid practices’ to understand the diverse patterns of activity that emerge to manage digital coordination of design. It articulates how the engineering and architecture professions develop different relationships with the shared model; how the design team negotiates paper-based practices across organizational boundaries; and how diverse practitioners probe the potential and limitations of the digital infrastructure. While different software packages and tools have been linked together into an integrated digital infrastructure, these emerging hybrid practices contrast with the interactions anticipated in practice and policy guidance, presenting new opportunities and challenges for managing project delivery. The study has implications for researchers in the growing field of empirical work on engineering project organizations, as it shows the importance of considering, and suggests new ways to theorise, the introduction of digital coordination practices into these institutionalized settings.
Decadal Trends in Abundance, Size and Condition of Antarctic Toothfish in McMurdo Sound, Antarctica, 1972-2010
We report analyses of a dataset spanning 38 years of near-annual fishing for Antarctic toothfish Dissostichus mawsoni, using a vertical setline through the fast ice of McMurdo Sound, Antarctica, 1972-2010. This constitutes one of the longest biological time series in the Southern Ocean, and certainly the longest for any fish. Fish total length, condition, and catch per unit effort (CPUE) were derived from the more than 5500 fish caught. Contrary to expectation, the length-frequency distribution was dominated by fish in the upper half of the size range of the industrial catch. The discrepancy may be due to biases in the sampling capabilities of vertical (this study) versus benthic, horizontal fishing gear (industry longlines), related to the fact that only large Antarctic toothfish (more than 100 cm TL) are neutrally buoyant and occur in the water column. Fish length and condition increased from the early 1970s to the early 1990s and then decreased, a pattern related to sea-ice cover with lags of 8 months to 5 years; it may ultimately be related to the fishery (which targets large fish) and to changes in the Southern Annular Mode, through effects on the toothfish's main prey, Antarctic silverfish Pleuragramma antarcticum. CPUE was constant through 2001 and then decreased dramatically, likely owing to the industrial fishery, which began in 1996 and concentrates effort over the Ross Sea slope, where tagged McMurdo Sound fish have been recovered. Given the limited prey choices and, therefore, close coupling among the mesopredators of the Ross Sea, Antarctic toothfish included, the fishery may be altering the trophic structure of the Ross Sea.
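CPUE in a study like this is simply catch standardized by fishing effort. A minimal sketch of the bookkeeping follows; the haul records and the hook-day effort unit are hypothetical illustrations, not data from the study.

# Catch per unit effort: fish caught per hook-day, averaged by year.
# The records below are made-up examples, not observations from the paper.
hauls = [
    {"year": 1985, "fish": 42, "hooks": 60, "days": 1.0},
    {"year": 1985, "fish": 35, "hooks": 60, "days": 1.5},
    {"year": 2005, "fish": 6,  "hooks": 60, "days": 2.0},
]

cpue = {}
for h in hauls:
    effort = h["hooks"] * h["days"]           # hook-days of soak time
    cpue.setdefault(h["year"], []).append(h["fish"] / effort)

for year in sorted(cpue):
    vals = cpue[year]
    print(year, sum(vals) / len(vals))        # mean annual CPUE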
A New Algorithm for Supernova Neutrino Transport and Some Applications
We have developed an implicit, multi-group, time-dependent, spherical neutrino transport code based on the Feautrier variables, the tangent-ray method, and accelerated iteration. The code achieves high angular resolution, is good to O(v/c), is equivalent to a Boltzmann solver (without gravitational redshifts), and solves the transport equation at all optical depths with precision. In this paper, we present our formulation of the relevant numerics and microphysics and explore protoneutron star atmospheres for snapshot post-bounce models. Our major focus is on spectra, neutrino-matter heating rates, Eddington factors, angular distributions, and phase-space occupancies. In addition, we investigate the influence on neutrino spectra and heating of final-state electron blocking, stimulated absorption, velocity terms in the transport equation, neutrino-nucleon scattering asymmetry, and weak magnetism and recoil effects. Furthermore, we compare the emergent spectra and heating rates obtained using full transport with those obtained using representative flux-limited transport formulations to gauge their accuracy and viability. Finally, we derive useful formulae for the neutrino source strength due to nucleon-nucleon bremsstrahlung and determine bremsstrahlung's influence on the emergent neutrino spectra.
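The Feautrier variables mentioned above combine the specific intensities along a ray into the symmetric average u = (I+ + I-)/2, which turns the transfer equation on each tangent ray into a second-order two-point boundary-value problem solvable with one tridiagonal sweep. Below is a toy illustration, gray, static, and single-ray, with simplified first-order boundary conditions; it is not the paper's multi-group code.

import numpy as np

def feautrier_ray(S, dtau, mu):
    # Solve mu^2 u'' = u - S on a uniform optical-depth grid with a
    # tridiagonal (Thomas) sweep; u is the Feautrier variable (I+ + I-)/2.
    n = len(S)
    a, b, c, rhs = (np.zeros(n) for _ in range(4))
    b[0], c[0] = -(mu / dtau + 1.0), mu / dtau   # surface: no incoming beam
    for i in range(1, n - 1):                    # interior: central differences
        a[i] = c[i] = mu**2 / dtau**2
        b[i] = -2.0 * mu**2 / dtau**2 - 1.0
        rhs[i] = -S[i]
    b[-1], rhs[-1] = 1.0, S[-1]                  # depth: thermalized, u = S
    for i in range(1, n):                        # forward elimination
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        rhs[i] -= w * rhs[i - 1]
    u = np.zeros(n)
    u[-1] = rhs[-1] / b[-1]
    for i in range(n - 2, -1, -1):               # back substitution
        u[i] = (rhs[i] - c[i] * u[i + 1]) / b[i]
    return u

u = feautrier_ray(S=np.ones(200), dtau=0.05, mu=1.0)
print(u[0])   # close to the classic surface value S/2 = 0.5

For a constant source function the analytic solution is u(tau) = S(1 - e^(-tau/mu)/2), so the printed surface value near 0.5 is a quick correctness check on the sweep.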
Tackling Exascale Software Challenges in Molecular Dynamics Simulations with GROMACS
GROMACS is a widely used package for biomolecular simulation, and over the last two decades it has evolved from small-scale efficiency to advanced heterogeneous acceleration and multi-level parallelism targeting some of the largest supercomputers in the world. Here, we describe some of the ways we have been able to realize this through parallelization on all levels, combined with a constant focus on absolute performance. Release 4.6 of GROMACS uses SIMD acceleration on a wide range of architectures, GPU offloading, and OpenMP and MPI parallelism within and between nodes, respectively. The recent work on acceleration made it necessary to revisit the fundamental algorithms of molecular simulation, including the concept of neighbor searching, and we discuss the present and future challenges we see for exascale simulation, in particular very fine-grained task parallelism. We also discuss the software management, code peer review, and continuous-integration testing required for a project of this complexity.
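To illustrate the neighbor-searching concept the abstract refers to, here is the textbook cell-list pair search, which reduces the all-pairs O(N^2) scan to roughly O(N) by binning particles into cells at least as large as the cutoff. This is a toy in the classic style; GROMACS itself uses a more elaborate cluster-pair scheme with Verlet buffering, which this sketch does not attempt to reproduce.

import numpy as np

def cell_list_pairs(coords, box, rcut):
    # Bin particles into cells no smaller than rcut, then test only the 27
    # surrounding cells of each particle (periodic boundaries throughout).
    ncell = np.maximum((box // rcut).astype(int), 1)
    cell_size = box / ncell
    cell_of = (coords // cell_size).astype(int) % ncell
    cells = {}
    for idx, c in enumerate(map(tuple, cell_of)):
        cells.setdefault(c, []).append(idx)
    pairs = []
    for (cx, cy, cz), members in cells.items():
        nbs = {((cx + dx) % ncell[0], (cy + dy) % ncell[1], (cz + dz) % ncell[2])
               for dx in (-1, 0, 1) for dy in (-1, 0, 1) for dz in (-1, 0, 1)}
        for nb in nbs:
            for i in members:
                for j in cells.get(nb, ()):
                    if i < j:
                        d = coords[i] - coords[j]
                        d -= box * np.round(d / box)    # minimum image
                        if d @ d < rcut * rcut:
                            pairs.append((i, j))
    return pairs

rng = np.random.default_rng(0)
box = np.array([5.0, 5.0, 5.0])
coords = rng.uniform(0.0, 5.0, size=(200, 3))
print(len(cell_list_pairs(coords, box, rcut=1.0)))

Because the cell width never falls below rcut, every pair within the cutoff is guaranteed to lie in adjacent cells, so the 27-cell stencil enumerates exactly the candidates that the minimum-image distance test then confirms.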