The new space and Earth science information systems at NASA's archive
The on-line interactive systems of the National Space Science Data Center (NSSDC) are examined. The worldwide computer network connections through which users access the NSSDC are outlined. The services offered by the NSSDC's new-technology on-line systems are presented, including the IUE request system, Total Ozone Mapping Spectrometer (TOMS) data, and data sets in astrophysics, atmospheric science, land sciences, and space plasma physics. Plans for future increases in the NSSDC data holdings are considered.
Strategic voting and nomination
Using computer simulations based on three separate data generating processes, I estimate the fraction of elections in which sincere voting will be a core equilibrium given each of eight single-winner voting rules. Additionally, I determine how often each voting rule is vulnerable to simple voting strategies such as 'burying' and 'compromising', and how often each voting rule gives an incentive for non-winning candidates to enter or leave races. I find that Hare is least vulnerable to strategic voting in general, whereas Borda, Coombs, approval, and range are most vulnerable. I find that plurality is most vulnerable to compromising and strategic exit (which can both reinforce two-party systems), and that Borda is most vulnerable to strategic entry. I support my key results with analytical proofs.
Keywords: strategic voting; tactical voting; strategic nomination; Condorcet; alternative vote; Borda count; approval voting
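The 'compromising' strategy measured in the abstract can be made concrete with a small simulation. The sketch below (a hypothetical illustration, not the paper's code or its data generating processes) checks, under plurality, whether a bloc of voters who prefer a non-winning candidate to the sincere winner can change the outcome by insincerely ranking that candidate first:

```python
import random

random.seed(0)
CANDS = "ABC"

def plurality_winner(ballots):
    # ballots: list of ranking strings, e.g. "BAC" means B > A > C
    tally = {c: 0 for c in CANDS}
    for b in ballots:
        tally[b[0]] += 1
    # ties broken deterministically (later letter wins)
    return max(CANDS, key=lambda c: (tally[c], c))

def compromising_works(ballots):
    """True if some bloc of voters can elect a candidate they prefer to
    the sincere winner by insincerely ranking that candidate first."""
    sincere = plurality_winner(ballots)
    for target in CANDS:
        if target == sincere:
            continue
        # potential compromisers: prefer `target` to the sincere winner
        # but sincerely ranked someone else first
        movers = [i for i, b in enumerate(ballots)
                  if b.index(target) < b.index(sincere) and b[0] != target]
        if not movers:
            continue
        manipulated = list(ballots)
        for i in movers:
            manipulated[i] = target + manipulated[i].replace(target, "")
        if plurality_winner(manipulated) == target:
            return True
    return False

# impartial-culture trials: every voter draws a uniform random strict ranking
trials = 1000
hits = sum(
    compromising_works([''.join(random.sample(CANDS, 3)) for _ in range(25)])
    for _ in range(trials))
print(f"compromising changed the plurality outcome in {hits}/{trials} elections")
```

Repeating such trials for each voting rule, and for each strategy type, yields vulnerability frequencies of the kind the abstract reports.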
Managing Ada development
The Ada programming language was developed under the sponsorship of the Department of Defense to address the soaring costs associated with software development and maintenance. Ada is powerful, yet to take full advantage of that power it is sufficiently complex, and sufficiently different from current programming approaches, that there is considerable risk in committing a program to Ada. There are also few programs of any substantial size implemented in Ada that can be studied to determine which management methods result in a successful Ada project. The items presented are the author's opinions, formed as a result of going through an Ada software development effort. The difficulties faced, risks assumed, management methods applied, lessons learned and, most importantly, the techniques that proved successful are all valuable sources of management information for those managers ready to undertake major Ada development projects.
Space data management at the NSSDC (National Space Sciences Data Center): Applications for data compression
The National Space Science Data Center (NSSDC), established in 1966, is the largest archive for processed data from NASA's space and Earth science missions. The NSSDC manages over 120,000 data tapes containing over 4,000 data sets. The digital archive holds approximately 6,000 gigabytes, all of it in its original uncompressed form, and by 1995 it is expected to more than quadruple in size, reaching over 28,000 gigabytes. The NSSDC is beginning several thrusts that will allow it to better serve the scientific community and keep up with managing ever-increasing volumes of data. These thrusts involve managing larger and larger amounts of information and data online, employing mass-storage techniques, and using low-rate communications networks to move requested data to remote sites in the United States, Europe, and Canada. The success of these thrusts, combined with the tremendous volume of data expected to be archived at the NSSDC, clearly indicates that innovative storage and data management solutions must be sought and implemented. Although not presently used, data compression techniques may become a very important tool for managing a large fraction or all of the NSSDC archive. Future applications could include compressing online data so that more data are readily available, compressing requested data that must be moved over low-rate ground networks, and compressing all the digital data in the NSSDC archive for a cost-effective backup to be used only in the event of a disaster.
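The network-transfer application mentioned above amounts to a lossless round trip: compress before the low-rate link, decompress at the remote site, and recover the archive bytes exactly. A minimal sketch (zlib is only a stand-in here for whatever codec the archive would actually adopt, and the record is synthetic):

```python
import zlib

def pack_for_link(payload: bytes, level: int = 9) -> bytes:
    """Compress a data block before moving it over a low-rate ground
    network; archival data must survive the round trip losslessly."""
    return zlib.compress(payload, level)

def unpack(blob: bytes) -> bytes:
    return zlib.decompress(blob)

# stand-in for a highly redundant science record
record = b"TOMS ozone granule, orbit 01234\n" * 2048
packed = pack_for_link(record)
assert unpack(packed) == record  # lossless: bytes are restored exactly
print(f"{len(record)} -> {len(packed)} bytes "
      f"({len(packed) / len(record):.1%} of original)")
```

The same pattern applies to the disaster-backup scenario: compression cost is paid once at write time, and decompression is needed only if the backup is ever read.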
Interview with James Leary, October 18, 2008
James Leary was interviewed on October 18, 2008, by Sierra Green about his experiences during World War II.
Course Information: Course Title: HIST 300: Historical Method; Academic Term: Fall 2008; Course Instructor: Dr. Michael J. Birkner
Collection Note: This oral history was selected from the Oral History Collection maintained by Special Collections & College Archives. Transcripts are available for browsing in the Special Collections Reading Room, 4th floor, Musselman Library. GettDigital contains the complete listing of oral histories done from 1978 to the present. To view this list and to access selected digital versions please visit -- http://gettysburg.cdmhost.com/cdm/landingpage/collection/p16274coll
MLAPM - a C code for cosmological simulations
We present a computer code written in C that is designed to simulate
structure formation from collisionless matter. The code is purely grid-based
and uses a recursively refined Cartesian grid to solve Poisson's equation for
the potential, rather than obtaining the potential from a Green's function.
Refinements can have arbitrary shapes and in practice closely follow the
complex morphology of the density field that evolves. The timestep shortens by
a factor two with each successive refinement. It is argued that an appropriate
choice of softening length is of great importance and that the softening should
be at all points an appropriate multiple of the local inter-particle
separation. Unlike tree and P3M codes, multigrid codes automatically satisfy
this requirement. We show that at early times and low densities in cosmological
simulations, the softening needs to be significantly smaller relative to the
inter-particle separation than in virialized regions. Tests of the ability of
the code's Poisson solver to recover the gravitational fields of both
virialized halos and Zel'dovich waves are presented, as are tests of the code's
ability to reproduce analytic solutions for plane-wave evolution. The times
required to conduct a LCDM cosmological simulation for various configurations
are compared with the times required to complete the same simulation with the
ART, AP3M and GADGET codes. The power spectra, halo mass functions and
halo-halo correlation functions of simulations conducted with different codes
are compared. Comment: 20 pages, 20 figures, MNRAS in press; the code can be
downloaded at http://www-thphys.physics.ox.ac.uk/users/MLAPM
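The grid-based Poisson solve described above can be illustrated, in one dimension and on a single level, by plain Gauss-Seidel relaxation. This is only a toy sketch of the smoothing sweep such codes apply on every refinement level; MLAPM itself solves the 3-D problem on adaptively refined meshes:

```python
def solve_poisson_1d(rho, sweeps=20000):
    """Solve phi'' = rho on [0, 1] with phi(0) = phi(1) = 0 by
    Gauss-Seidel relaxation on a uniform grid."""
    n = len(rho)
    h = 1.0 / (n - 1)
    phi = [0.0] * n
    for _ in range(sweeps):
        for i in range(1, n - 1):
            # centered second difference: (phi[i-1] - 2 phi[i] + phi[i+1])/h^2 = rho[i]
            phi[i] = 0.5 * (phi[i - 1] + phi[i + 1] - h * h * rho[i])
    return phi

# with rho = -2 everywhere the exact solution is phi(x) = x(1 - x)
n = 33
phi = solve_poisson_1d([-2.0] * n)
x = [i / (n - 1) for i in range(n)]
err = max(abs(p - xi * (1 - xi)) for p, xi in zip(phi, x))
print(f"max error vs analytic solution: {err:.2e}")
```

A multigrid code accelerates exactly this kind of sweep by correcting the solution on coarser grids, which is what lets refinements of arbitrary shape follow the density field without a Green's-function convolution.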
Variations of Hodge Structure Considered as an Exterior Differential System: Old and New Results
This paper is a survey of the subject of variations of Hodge structure (VHS)
considered as exterior differential systems (EDS). We review developments over
the last twenty-six years, with an emphasis on some key examples. In the
penultimate section we present some new results on the characteristic
cohomology of a homogeneous Pfaffian system. In the last section we discuss how
the integrability conditions of an EDS affect the expected dimension of an
integral submanifold. The paper ends with some speculation on EDS and the
Hodge conjecture for Calabi-Yau manifolds.
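The exterior differential system in question is the one cut out by Griffiths' infinitesimal period relation. As a reminder (a standard statement from the general theory, not a result of this survey), a variation of Hodge structure over a base S is an integral manifold of the condition

```latex
% Griffiths transversality (the infinitesimal period relation):
% the Hodge filtration F^\bullet varies holomorphically and satisfies
\nabla F^{p} \subseteq F^{p-1} \otimes \Omega^{1}_{S}, \qquad p = 1, \dots, n,
```

where \nabla is the Gauss-Manin connection.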
Vagueness as Cost Reduction : An Empirical Test
This work was funded in part by an EPSRC Platform Grant awarded to the NLG group at Aberdeen.