264 research outputs found
The influence of protocol choice on network performance
Bibliography: leaves 100-102.
Computer communication networks are a vital link in providing many of the services we use daily, and our reliance on these networks continues to grow. This growing use is driving network design toward greater performance, and the increasing need for connectivity and performance makes the study of network performance constraints important. Networks consist of both hardware and software components. Great advances are currently being made in network hardware, increasing the available raw network performance. In this thesis, I show through measurement that it is difficult to harness all of this raw performance and make it available to carry network services. I also identify some of the factors limiting the full utilization of a high-speed network.
Benchmarking the two-dimensional finite difference synthetic seismogram code
During the past six months, the two-dimensional finite difference synthetic seismogram code was installed and run on a number of different computer systems. The results were compared for timing, accuracy, and the ease with which the code was adapted to each system. This report documents the software modifications and the method used to implement the finite difference code on each computer, and presents the results of the benchmark survey.
Funding was provided by the Office of Naval Research under Contract No. N00014-89-J-1012
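The seismogram code itself is not reproduced in the report summary above. As a purely hypothetical sketch (the grid sizes, velocity, time step, and function name are all assumptions, not taken from the report), the kind of second-order 2D acoustic finite-difference time loop being timed on each system might look like:

```python
import time
import numpy as np

def fd2d_acoustic(nx=200, nz=200, nt=300, dx=10.0, dt=1e-3, c=2000.0):
    """Second-order 2D acoustic finite-difference time stepping (illustrative only)."""
    p_prev = np.zeros((nz, nx))
    p_curr = np.zeros((nz, nx))
    p_curr[nz // 2, nx // 2] = 1.0  # impulsive source at the grid centre
    r2 = (c * dt / dx) ** 2         # squared Courant number; here 0.04, well inside the 2D stability limit
    for _ in range(nt):
        # Five-point Laplacian on the interior of the grid
        lap = (p_curr[:-2, 1:-1] + p_curr[2:, 1:-1]
               + p_curr[1:-1, :-2] + p_curr[1:-1, 2:]
               - 4.0 * p_curr[1:-1, 1:-1])
        p_next = np.zeros_like(p_curr)
        p_next[1:-1, 1:-1] = (2.0 * p_curr[1:-1, 1:-1]
                              - p_prev[1:-1, 1:-1] + r2 * lap)
        p_prev, p_curr = p_curr, p_next
    return p_curr

start = time.perf_counter()
wavefield = fd2d_acoustic()
elapsed = time.perf_counter() - start
print(f"wall time: {elapsed:.3f} s")
```

Timing a fixed workload like this across machines, together with a check of the final wavefield against a reference run, is the usual basis for the kind of timing-and-accuracy comparison the report describes.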
Functional requirements document for the Earth Observing System Data and Information System (EOSDIS) Scientific Computing Facilities (SCF) of the NASA/MSFC Earth Science and Applications Division, 1992
Five scientists at MSFC/ESAD have EOS SCF investigator status. Each SCF has unique tasks that require the establishment of a computing facility dedicated to accomplishing those tasks. An SCF Working Group was established at ESAD with the charter of defining the computing requirements of the individual SCFs and recommending options for meeting these requirements. The primary goal of the working group was to determine which computing needs can be satisfied using either shared resources or separate but compatible resources, and which needs require unique individual resources. The requirements investigated included CPU-intensive vector and scalar processing, visualization, data storage, connectivity, and I/O peripherals. A review of computer industry directions and a market survey of computing hardware provided information regarding important industry standards and candidate computing platforms. It was determined that the total SCF computing requirements might be most effectively met using a hierarchy consisting of shared and individual resources. This hierarchy is composed of five major system types: (1) a supercomputer-class vector processor; (2) a high-end scalar multiprocessor workstation; (3) a file server; (4) a few medium- to high-end visualization workstations; and (5) several low- to medium-range personal graphics workstations. Specific recommendations for meeting the needs of each of these types are presented.
MEGA
This research was sponsored by the National Science Foundation Grant NSF PHY-931478
Localization length in a random magnetic field
The Kubo formula is used to obtain the d.c. conductance of a statistical ensemble of two-dimensional clusters of the square lattice in the presence of random magnetic fluxes. The fluxes traversing the lattice plaquettes are distributed uniformly between minus one half and plus one half of the flux quantum. The localization length is obtained from the exponential decay of the averaged conductance as a function of the cluster side. Standard results are recovered when this numerical approach is applied to the Anderson model of diagonal disorder. The localization length of the complex non-diagonal model of disorder remains well below 10 000 (in units of the lattice constant) in the main part of the band, in spite of its exponential increase near the band edges.
Comment: 12 two-column pages including 10 figures (epsfig), revtex, to appear in PR
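The extraction step the abstract describes, fitting the exponential decay of the averaged conductance against the cluster side L, can be sketched as follows. This assumes the simplest decay convention, ⟨g⟩ ∝ exp(−L/ξ); the conductance values below are synthetic and illustrative, not data from the paper:

```python
import numpy as np

# Synthetic averaged conductances for a set of cluster sides L
# (illustrative values only; generated with a known xi to check the fit).
L = np.array([20.0, 40.0, 60.0, 80.0, 100.0])
xi_true = 35.0
g_avg = 0.8 * np.exp(-L / xi_true)  # assumed convention: <g> ∝ exp(-L/xi)

# Linear fit of ln<g> versus L: the slope is -1/xi.
slope, intercept = np.polyfit(L, np.log(g_avg), 1)
xi_est = -1.0 / slope
print(f"estimated localization length: {xi_est:.1f}")  # ≈ 35.0 for this noiseless data
```

With real ensemble-averaged data the fit would be done over the noisy ⟨ln g⟩ values, but the slope-to-ξ relation is the same.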
Mega: A Search for the Decay μ → e γ
This research was sponsored by the National Science Foundation Grant NSF PHY-931478
Bootstrapping the general linear hypothesis test
We discuss the use of bootstrap methodology in hypothesis testing, focusing on the classical F-test for linear hypotheses in the linear model. A modification of the F-statistic which allows for resampling under the null hypothesis is proposed. This approach is specifically considered in the one-way analysis of variance model. A simulation study illustrating the behaviour of our proposal is presented.
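The abstract does not give the paper's specific modified statistic, so the following is only a generic sketch of the underlying idea for the one-way ANOVA case: group-centre the observations so that the null hypothesis of equal means holds in the resampling population, then compare the observed F-statistic with its bootstrap distribution. All function names and sample sizes here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def f_stat(groups):
    """Classical one-way ANOVA F-statistic."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def bootstrap_f_test(groups, n_boot=2000):
    """Bootstrap p-value obtained by resampling group-centred residuals,
    so that resampling takes place under the null of equal group means."""
    f_obs = f_stat(groups)
    pooled = np.concatenate([g - g.mean() for g in groups])  # centring imposes the null
    count = 0
    for _ in range(n_boot):
        boot = [rng.choice(pooled, size=len(g), replace=True) for g in groups]
        if f_stat(boot) >= f_obs:
            count += 1
    return f_obs, (count + 1) / (n_boot + 1)

# Example: three groups drawn with equal means, so a large p-value is typical.
groups = [rng.normal(0.0, 1.0, 15) for _ in range(3)]
f_obs, p = bootstrap_f_test(groups)
print(f"F = {f_obs:.2f}, bootstrap p = {p:.3f}")
```

The centring step is the key design choice: without it, resampling would reproduce any true mean differences and the bootstrap distribution would not approximate the null distribution of F.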