20 research outputs found
Detecting a stochastic gravitational wave background with the Laser Interferometer Space Antenna
The random superposition of many weak sources will produce a stochastic
background of gravitational waves that may dominate the response of the LISA
(Laser Interferometer Space Antenna) gravitational wave observatory. Unless
something can be done to distinguish between a stochastic background and
detector noise, the two will combine to form an effective noise floor for the
detector. Two methods have been proposed to solve this problem. The first is to
cross-correlate the output of two independent interferometers. The second is an
ingenious scheme for monitoring the instrument noise by operating LISA as a
Sagnac interferometer. Here we derive the optimal orbital alignment for
cross-correlating a pair of LISA detectors, and provide the first analytic
derivation of the Sagnac sensitivity curve.
Comment: 9 pages, 11 figures. Significant changes to the noise estimate
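The cross-correlation idea above can be illustrated with a toy sketch: a weak stochastic background common to two detectors survives a cross-correlation while independent instrument noise averages away. The amplitudes and the white-noise model below are illustrative assumptions, not LISA values.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Toy model: a weak common stochastic background buried in
# independent detector noise (amplitudes are illustrative).
background = 0.1 * rng.standard_normal(N)   # common to both detectors
noise_a = rng.standard_normal(N)            # independent noise, detector A
noise_b = rng.standard_normal(N)            # independent noise, detector B

s_a = background + noise_a
s_b = background + noise_b

# In a single detector the background is indistinguishable from noise:
# the auto-power is the sum of noise and background variances.
auto_power = np.mean(s_a * s_a)

# Cross-correlating the two outputs averages the independent noise away,
# leaving an estimate of the background power alone (~0.01 here).
cross_power = np.mean(s_a * s_b)

print(f"auto power  : {auto_power:.3f}")
print(f"cross power : {cross_power:.4f}")
```

The residual scatter of the cross-power estimate falls as 1/sqrt(N), which is why long integration times matter for this method.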
All-particle cosmic ray energy spectrum measured with 26 IceTop stations
We report on a measurement of the cosmic ray energy spectrum with the IceTop
air shower array, the surface component of the IceCube Neutrino Observatory at
the South Pole. The data used in this analysis were taken between June and
October, 2007, with 26 surface stations operational at that time, corresponding
to about one third of the final array. The fiducial area used in this analysis
was 0.122 km^2. The analysis investigated the energy spectrum from 1 to 100 PeV
measured for three different zenith angle ranges between 0° and 46°.
Because of the isotropy of cosmic rays in this energy range, the spectra from
all zenith angle intervals must agree. The cosmic-ray energy spectrum was
determined under different assumptions on the primary mass composition. Good
agreement of spectra in the three zenith angle ranges was found for the
assumptions of a pure-proton composition and of a simple two-component model. For zenith angles
θ < 30°, where the mass dependence is smallest, the knee in the
cosmic ray energy spectrum was observed between 3.5 and 4.32 PeV, depending on
composition assumption. Spectral indices above the knee range from -3.08 to
-3.11, depending on the primary mass composition assumption. Moreover, an
indication of a flattening of the spectrum above 22 PeV was observed.
Comment: 38 pages, 17 figures
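Extracting a spectral index like the values quoted above amounts to a straight-line fit in log-log space. The sketch below fits a toy broken power law with a knee near 4 PeV; the flux normalization and the exact indices are illustrative assumptions, not the IceTop data.

```python
import numpy as np

# Hypothetical toy spectrum: a broken power law with a knee at 4 PeV,
# index -2.7 below and -3.1 above (values chosen for illustration).
def flux(E, knee=4.0, g1=-2.7, g2=-3.1):
    # The knee**(g1 - g2) factor keeps the flux continuous at the knee.
    return np.where(E < knee, E**g1, knee**(g1 - g2) * E**g2)

E = np.logspace(0, 2, 40)   # 1 to 100 PeV
F = flux(E)

# Fit the spectral index above the knee: a power law is a straight
# line in log-log space, so a degree-1 polynomial fit recovers it.
above = E > 5.0
index, _ = np.polyfit(np.log10(E[above]), np.log10(F[above]), 1)
print(f"fitted index above the knee: {index:.2f}")
```

In a real analysis the fit would of course weight the binned flux by its statistical and systematic uncertainties rather than fit noise-free points.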
An improved method for measuring muon energy using the truncated mean of dE/dx
The measurement of muon energy is critical for many analyses in large
Cherenkov detectors, particularly those that involve separating
extraterrestrial neutrinos from the atmospheric neutrino background. Muon
energy has traditionally been determined by measuring the specific energy loss
(dE/dx) along the muon's path and relating the dE/dx to the muon energy.
Because high-energy muons (E_mu > 1 TeV) lose energy randomly, the spread in
dE/dx values is quite large, leading to a typical energy resolution of 0.29 in
log10(E_mu) for a muon observed over a 1 km path length in the IceCube
detector. In this paper, we present an improved method that uses a truncated
mean and other techniques to determine the muon energy. The muon track is
divided into separate segments with individual dE/dx values. The elimination of
segments with the highest dE/dx results in an overall dE/dx that is more
closely correlated to the muon energy. This method results in an energy
resolution of 0.22 in log10(E_mu), which gives a 26% improvement. This
technique is applicable to any large water or ice detector and potentially to
large scintillator or liquid argon detectors.
Comment: 12 pages, 16 figures
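The core of the method, sorting the per-segment dE/dx values and discarding the largest before averaging, can be sketched as below. The heavy-tailed toy loss model and the 40% truncation fraction are illustrative assumptions; the paper tunes the actual cut on IceCube simulation.

```python
import numpy as np

rng = np.random.default_rng(1)

def truncated_mean(dedx_segments, cut=0.4):
    """Discard the highest-dE/dx segments and average the rest.

    `cut` is the fraction of segments to drop -- a hypothetical
    choice here, tuned on simulation in the real analysis.
    """
    x = np.sort(np.asarray(dedx_segments))
    keep = int(len(x) * (1.0 - cut))
    return x[:keep].mean()

# Toy model: 20 track segments, each with a baseline ionization loss
# plus occasional large stochastic (radiative) losses -- a heavy tail.
n_segments, baseline, trials = 20, 1.0, 2000
plain_means = np.empty(trials)
trunc_means = np.empty(trials)
for i in range(trials):
    dedx = baseline + rng.pareto(2.5, size=n_segments)
    plain_means[i] = dedx.mean()
    trunc_means[i] = truncated_mean(dedx)

# Relative spread: truncation removes the radiative tail, so the
# truncated estimator fluctuates less from muon to muon.
rel_plain = plain_means.std() / plain_means.mean()
rel_trunc = trunc_means.std() / trunc_means.mean()
print(f"relative spread, plain mean     : {rel_plain:.3f}")
print(f"relative spread, truncated mean : {rel_trunc:.3f}")
```

The narrower spread of the truncated estimator is what translates into the improved log10(E_mu) resolution reported above, once the estimator is calibrated against muon energy.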
Execution Time Estimation for Workflow Scheduling
Estimation of the execution time is an important part of the workflow scheduling problem. The aim of this paper is to highlight common problems in estimating the workflow execution time and to propose a solution that takes into account the complexity and the randomness of the workflow components and their runtime. The solution proposed in this paper addresses the problems at different levels, from task to workflow, including the error measurement and the theory behind the estimation algorithm. The proposed estimation algorithm can be integrated easily into a wide class of schedulers as a separate module. We use a dual stochastic representation, characteristic/distribution functions, in order to combine task estimates into the overall workflow makespan. Additionally, we propose workflow reductions: operations on a workflow graph that do not decrease the accuracy of the estimates but simplify the graph structure, hence increasing the performance of the algorithm.
Execution time estimation for workflow scheduling
Estimation of the execution time is an important part of the workflow scheduling problem. The aim of this paper is to highlight common problems in estimating the workflow execution time and to propose a solution that takes into account the complexity and the stochastic aspects of the workflow components as well as their runtime. The solution proposed in this paper addresses the problems at different levels, from a single task to the whole workflow, including the error measurement and the theory behind the estimation algorithm. The proposed makespan estimation algorithm can be integrated easily into a wide class of schedulers as a separate module. We use a dual stochastic representation, characteristic/distribution functions, in order to combine task estimates into the overall workflow makespan. Additionally, we propose workflow reductions: operations on a workflow graph that do not decrease the accuracy of the estimates but simplify the graph structure, hence increasing the performance of the algorithm. Another important feature of our work is that we integrate the described estimation scheme into the earlier developed scheduling algorithm GAHEFT and experimentally evaluate the performance of the enhanced solution in a real environment using the CLAVIRE platform.
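The composition rules behind such a makespan estimator can be sketched with a Monte Carlo stand-in for the paper's characteristic/distribution-function machinery: tasks in series add their runtimes (a convolution of distributions), while a parallel join waits for the slower branch (the product of the branch CDFs, i.e. a sample-wise maximum). The task runtime distributions below are hypothetical, chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Hypothetical task runtime samples (seconds); the gamma shapes and
# scales are illustrative, not fitted to any real workload.
task_a = rng.gamma(shape=4.0, scale=2.0, size=n)   # entry task
task_b = rng.gamma(shape=3.0, scale=3.0, size=n)   # parallel branch 1
task_c = rng.gamma(shape=5.0, scale=1.5, size=n)   # parallel branch 2

# Series composition: runtimes add, so the makespan distribution is
# the convolution of the task distributions (realised sample-wise).
# Parallel composition: the join waits for the slower branch, so the
# makespan CDF is the product of the branch CDFs -- equivalently,
# the sample-wise maximum of the branch samples.
makespan = task_a + np.maximum(task_b, task_c)

# A scheduler would typically reserve a high quantile rather than the
# mean, to bound the risk of under-estimating the makespan.
p50, p95 = np.quantile(makespan, [0.5, 0.95])
print(f"median makespan : {p50:.1f} s")
print(f"95th percentile : {p95:.1f} s")
```

Series-parallel reductions of the kind the abstract mentions repeatedly collapse such sum/max pairs into single nodes, shrinking the graph without changing the resulting makespan distribution.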