
    Assessing an intuitive condition for stability under a range of traffic conditions via a generalised Lu-Kumar network

    We argue the importance both of developing simple sufficient conditions for the stability of general multiclass queueing networks and of assessing such conditions under a range of assumptions on the weight of the traffic flowing between service stations. To achieve the former, we review a peak-rate stability condition and extend its range of application; for the latter, we introduce a generalisation of the Lu-Kumar network on which the stability condition may be tested for a range of traffic configurations. The peak-rate condition is close to exact when the between-station traffic is light, but degrades as this traffic increases.
    Keywords: Multiclass queueing networks, stability, fluid model, Lu-Kumar network
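    (Background, standard in the queueing literature rather than stated in the abstract: the usual traffic condition requires a nominal load below one at every station, ρ_j = Σ_{k ∈ C(j)} λ_k m_k < 1, where C(j) is the set of customer classes served at station j, λ_k the arrival rate and m_k the mean service time of class k. The original Lu-Kumar network is the classic example showing that this condition alone does not guarantee stability under some priority disciplines, which is why simple sufficient conditions such as the peak-rate condition are of interest.)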

    Modeling and analysis of uncertain time-critical tasking problems

    Naval Research Logistics, 53, No. 6 (Sept. 2006), 588-599. This paper describes the modeling and operational analysis of a generic asymmetric service-system situation in which (a) Red agents, which are potentially threatening but in another important interpretation are isolated friendlies, such as downed pilots requiring assistance, "arrive" according to some partially known and potentially changing pattern in time and space; and (b) Reds have effectively limited but unknown deadlines or times of availability for Blue service, i.e., detection, classification, and attack in a military setting, or emergency assistance in other settings. We discuss various service options for Blue service agents and devise several approximations that allow one to compute efficiently the proportions of tasks of different classes that are successfully serviced or, more generally, if different rewards are associated with different classes of tasks, the percentage of the possible reward gained. We suggest heuristic policies for a Blue server to select the next task to perform and to decide how much time to allocate to that service, and we discuss these for a number of specific examples.
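    The approximations themselves are not given in the abstract; as a rough illustration of the kind of quantity they target, the following minimal simulation (hypothetical parameters and a first-come-first-served single server, not the paper's model) estimates the proportion of tasks whose service completes before their availability window closes.

```python
import random

def simulate(lam=1.0, mu=1.2, nu=0.5, n_tasks=100_000, seed=1):
    """Fraction of tasks completed before an exponential deadline expires,
    for Poisson arrivals at a single FCFS server.  All parameters are
    illustrative assumptions, not values from the paper.  The server is
    assumed to skip any task it could not finish before its deadline."""
    rng = random.Random(seed)
    t_arrival = 0.0     # arrival time of the current task
    server_free = 0.0   # time at which the server next becomes free
    served = 0
    for _ in range(n_tasks):
        t_arrival += rng.expovariate(lam)            # next arrival
        deadline = t_arrival + rng.expovariate(nu)   # end of availability
        start = max(t_arrival, server_free)
        finish = start + rng.expovariate(mu)
        if finish <= deadline:   # task can be serviced in time: serve it
            served += 1
            server_free = finish
        # otherwise the task is skipped and the server stays on schedule
    return served / n_tasks

if __name__ == "__main__":
    print(f"estimated proportion serviced in time: {simulate():.3f}")
```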

    Benefits of hybrid lateral transshipments in multi-item inventory systems under periodic replenishment

    Lateral transshipments are a method of responding to shortages of stock in a network of inventory-holding locations. Conventional reactive approaches seek only to meet immediate shortages. This study proposes hybrid transshipments, which exploit economies of scale by moving additional stock between locations to prevent future shortages in addition to meeting immediate ones. The setting considered is motivated by retailers who operate networks of outlets supplying car parts via a system of periodic replenishment. It is novel in allowing non-stationary stochastic demand and general patterns of dependence between multiple item types, which makes our work widely applicable. We develop an easy-to-compute quasi-myopic heuristic for determining how hybrid transshipments should be made. We obtain simple characterizations of the heuristic and demonstrate its strong cost performance in both small and large networks in an extensive numerical study.
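    The quasi-myopic heuristic itself is not reproduced in the abstract; the sketch below illustrates a simple myopic transfer rule of the same flavour (single item type, Poisson demand, hypothetical costs), not the authors' heuristic: keep shipping single units while the expected shortage cost avoided at the receiver exceeds the expected cost incurred at the sender plus the per-unit transfer cost.

```python
from math import exp
from functools import lru_cache

@lru_cache(maxsize=None)
def poisson_sf(k, lam):
    """P(D > k) for Poisson demand with mean lam."""
    p, cdf = exp(-lam), 0.0
    for i in range(k + 1):
        cdf += p
        p *= lam / (i + 1)
    return 1.0 - cdf

def marginal_value(x, lam, penalty):
    """Expected shortage cost avoided by one extra unit at a location
    holding x units facing Poisson(lam) demand until replenishment."""
    return penalty * poisson_sf(x, lam)

def myopic_transfer(x_from, x_to, lam_from, lam_to, penalty, unit_cost):
    """Illustrative myopic rule (not the paper's heuristic): move single
    units while the receiver's gain exceeds the sender's loss plus the
    per-unit transfer cost; returns the transshipment quantity."""
    q = 0
    while x_from > 0:
        gain = marginal_value(x_to, lam_to, penalty)
        loss = marginal_value(x_from - 1, lam_from, penalty)
        if gain - loss <= unit_cost:
            break
        x_from, x_to, q = x_from - 1, x_to + 1, q + 1
    return q

if __name__ == "__main__":
    print(myopic_transfer(x_from=8, x_to=1, lam_from=2.0, lam_to=5.0,
                          penalty=10.0, unit_cost=1.0))
```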

    A stochastic game framework for patrolling a border

    In this paper we consider a stochastic game for modelling the interactions between smugglers and a patroller along a border. The problem we examine involves a group of cooperating smugglers making regular attempts to bring small amounts of illicit goods across a border. A single patroller has the goal of preventing the smugglers from doing so, but must pay a cost to travel from one location to another. We model the problem as a two-player stochastic game and seek its Nash equilibria to gain insight into real-world problems. Our framework extends the literature by assuming that the smugglers choose a continuous quantity of contraband, which complicates the analysis of the game. We discuss a number of properties of Nash equilibria, including the aggregation of the smugglers, the effect of the players' discount factors, and the equivalence to a zero-sum game. Additionally, we present algorithms for finding Nash equilibria that are more computationally efficient than existing methods. We also consider certain assumptions on the parameters of the model that give interesting equilibrium strategies for the players.
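    The paper's algorithms are not described in the abstract; for readers unfamiliar with the zero-sum connection mentioned above, the sketch below shows standard Shapley-style value iteration for a finite zero-sum stochastic game (discrete action sets and generic inputs, so it does not handle the continuous contraband quantities considered in the paper).

```python
import numpy as np
from scipy.optimize import linprog

def matrix_game_value(A):
    """Value of the zero-sum matrix game A (row player maximises),
    solved as a linear programme: max v s.t. x^T A >= v, sum(x)=1, x>=0."""
    m, n = A.shape
    c = np.zeros(m + 1); c[-1] = -1.0              # minimise -v
    A_ub = np.hstack([-A.T, np.ones((n, 1))])      # v - (x^T A)_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.ones((1, m + 1)); A_eq[0, -1] = 0.0  # probabilities sum to 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * m + [(None, None)])
    return res.x[-1]

def shapley_iteration(R, P, gamma=0.9, tol=1e-8):
    """Value iteration for a finite zero-sum stochastic game.
    R[s] is the |A| x |B| payoff matrix (to the maximiser) in state s;
    P[s, a, b] is a probability vector over successor states."""
    V = np.zeros(R.shape[0])
    while True:
        Q = R + gamma * np.einsum('sabt,t->sab', P, V)
        V_new = np.array([matrix_game_value(Q[s]) for s in range(len(V))])
        if np.max(np.abs(V_new - V)) < tol:
            return V_new
        V = V_new
```

    Shapley (1953) showed that this iteration converges to the value of the discounted game, and equilibrium stationary strategies can be read off from the optimal mixed strategies of the final matrix games.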

    FORTIS: Pathfinder to the Lyman Continuum

    Shull et al. (1999) have asserted that the contribution of stars, relative to quasars, to the metagalactic background radiation that ionizes most of the baryons in the universe remains almost completely unknown at all epochs. Directly quantifying this contribution at low redshift has recently become possible with the identification by GALEX of large numbers of sparsely distributed faint ultraviolet galaxies. Neither STIS nor FUSE nor GALEX has the ability to efficiently survey these sparse fields and directly measure the Lyman continuum radiation that may leak into the low-redshift (z < 0.4) intergalactic medium. We present here a design for a new type of far-ultraviolet spectrograph, the Far-ultraviolet Off Rowland-circle Telescope for Imaging and Spectroscopy (FORTIS), which is more sensitive, covers wider fields, and can provide spectra and images of a large number of objects simultaneously. We intend to use a sounding rocket flight to validate the new instrument with a simple long-slit observation of the starburst populations in the galaxy M83. If, however, the long slit were replaced with a microshutter array, this design could isolate the chains of blue galaxies found by GALEX over an ~30' diameter field of view and directly address the Lyman continuum problem in a long-duration orbital mission. Thus, our development of the sounding rocket instrument is a pathfinder to a new wide-field spectroscopic technology for enabling the potential discovery of the long-hypothesized but elusive Lyman continuum radiation that is thought to leak from low-redshift galaxies and contribute to the ionization of the universe.
    Comment: 10 pages, to appear in Proceedings of SPIE Vol. 5488, UV to Gamma Ray Space Telescope System

    The WiggleZ Dark Energy Survey: Star-formation in UV-luminous galaxies from their luminosity functions

    We present the ultraviolet (UV) luminosity function of galaxies from the GALEX Medium Imaging Survey with measured spectroscopic redshifts from the first data release of the WiggleZ Dark Energy Survey. This sample selects galaxies with high star formation rates: at 0.6 < z < 0.9 the median star formation rate is at the upper 95th percentile of optically selected (r < 22.5) galaxies, and the sample contains about 50 per cent of all NUV < 22.8, 0.6 < z < 0.9 starburst galaxies within the volume sampled. The most luminous galaxies in our sample (-21.0 > M_NUV > -22.5) evolve very rapidly, with a number density declining as (1+z)^{5±1} from redshift z = 0.9 to z = 0.6. These starburst galaxies (M_NUV < -21 corresponds approximately to a star formation rate of 30 M_sun/yr) contribute about 1 per cent of cosmic star formation over the redshift range z = 0.6 to z = 0.9. The star formation rate density of these very luminous galaxies evolves rapidly, as (1+z)^{4±1}. Such rapid evolution implies that the majority of star formation in these large galaxies must have occurred before z = 0.9. We measure the UV luminosity function in 0.05 redshift intervals spanning 0.1 < z < 0.9, and provide analytic fits to the results. At all redshifts greater than z = 0.55 we find that the bright end of the luminosity function is not well described by a pure Schechter function, due to an excess of very luminous (M_NUV < -22) galaxies. These luminosity functions can be used to create a radial selection function for the WiggleZ survey or to test models of galaxy formation and evolution. Here we test the AGN feedback model of Scannapieco et al. (2005), and find that it requires the AGN feedback efficiency to vary with one or more of the following: stellar mass, star formation rate, and redshift.
    Comment: 27 pages; 13 pages without appendices. 22 figures; 11 figures in the main text
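    (For reference, the standard Schechter form in absolute magnitudes, not reproduced in the abstract, is φ(M) dM = 0.4 ln(10) φ* [10^{0.4(M*−M)}]^{α+1} exp(−10^{0.4(M*−M)}) dM; an excess of very luminous (M_NUV < −22) galaxies means the observed counts lie above this exponentially declining bright-end tail.)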

    The WiggleZ Dark Energy Survey: improved distance measurements to z = 1 with reconstruction of the baryonic acoustic feature

    We present significant improvements in cosmic distance measurements from the WiggleZ Dark Energy Survey, achieved by applying reconstruction of the baryonic acoustic feature. We show, using both data and simulations, that the reconstruction technique can often be effective despite the patchiness of the survey, significant edge effects and shot noise. We investigate three redshift bins in the range 0.2 < z < 1, and in all three find improvement after reconstruction in the detection of the baryonic acoustic feature and its use as a standard ruler. We measure model-independent distance measures D_V(r_s^{fid}/r_s) of 1716 ± 83, 2221 ± 101 and 2516 ± 86 Mpc (68 per cent CL) at effective redshifts z = 0.44, 0.6 and 0.73, respectively, where D_V is the volume-averaged distance and r_s is the sound horizon at the end of the baryon drag epoch. These significantly improved measurements, with 4.8, 4.5 and 3.4 per cent accuracy, are equivalent to those expected from surveys with up to 2.5 times the volume of WiggleZ without reconstruction applied. They are fully consistent with cosmologies allowed by the analyses of the Planck Collaboration and the Sloan Digital Sky Survey. We provide the D_V(r_s^{fid}/r_s) posterior probability distributions and their covariances. When combining these measurements with the temperature fluctuation measurements of Planck, the polarization measurements of the Wilkinson Microwave Anisotropy Probe 9-year release, and the 6dF Galaxy Survey baryonic acoustic feature, we do not detect deviations from a flat Λ cold dark matter (ΛCDM) model. Assuming this model, we constrain the current expansion rate to H₀ = 67.15 ± 0.98 km s⁻¹ Mpc⁻¹. Allowing the equation of state of dark energy to vary, we obtain w_DE = −1.080 ± 0.135. When assuming a curved ΛCDM model we obtain a curvature value of Ω_K = −0.0043 ± 0.0047
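    (For context, using standard definitions not given in the abstract: the volume-averaged distance is D_V(z) = [(1+z)² D_A(z)² cz/H(z)]^{1/3}, with D_A the angular-diameter distance, so the quoted D_V(r_s^{fid}/r_s) values are distances rescaled by the ratio of the fiducial to the measured sound horizon; the 4.8, 4.5 and 3.4 per cent figures are simply the fractional uncertainties 83/1716, 101/2221 and 86/2516.)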

    The WiggleZ Dark Energy Survey: measuring the cosmic expansion history using the Alcock-Paczynski test and distant supernovae

    Astronomical observations suggest that today's Universe is dominated by a dark energy of unknown physical origin. One of the most notable consequences in many models is that dark energy should cause the expansion of the Universe to accelerate; but the expansion rate as a function of time has proven very difficult to measure directly. We present a new determination of the cosmic expansion history by combining distant supernova observations with a geometrical analysis of large-scale galaxy clustering within the WiggleZ Dark Energy Survey, using the Alcock-Paczynski test to measure the distortion of standard spheres. Our result constitutes a robust and non-parametric measurement of the Hubble expansion rate as a function of time, which we measure with 10-15% precision in four bins within the redshift range 0.1 < z < 0.9. We demonstrate that the cosmic expansion is accelerating, in a manner independent of the parameterization of the cosmological model (although assuming cosmic homogeneity in our data analysis). Furthermore, we find that this expansion history is consistent with a cosmological-constant dark energy.
    Comment: 13 pages, 7 figures, accepted for publication by MNRAS
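    (For context, the standard form of the test rather than anything stated in the abstract: the Alcock-Paczynski test compares the radial and transverse extents of structures assumed to be statistically isotropic, which constrains the combination F(z) = (1+z) D_A(z) H(z)/c; supplying the relative distance scale D_A(z) from the supernovae then isolates the expansion rate H(z) without a parametric dark-energy model.)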

    The WiggleZ Dark Energy Survey: Survey Design and First Data Release

    The WiggleZ Dark Energy Survey is a survey of 240,000 emission-line galaxies in the distant universe, measured with the AAOmega spectrograph on the 3.9-m Anglo-Australian Telescope (AAT). The target galaxies are selected using ultraviolet photometry from the GALEX satellite, with a flux limit of NUV < 22.8 mag. The redshift range containing 90% of the galaxies is 0.2 < z < 1.0. The primary aim of the survey is to precisely measure the scale of baryon acoustic oscillations (BAO) imprinted on the spatial distribution of these galaxies at look-back times of 4-8 Gyr. Detailed forecasts indicate the survey will measure the BAO scale to better than 2% and the tangential and radial acoustic wave scales to approximately 3% and 5%, respectively. This paper provides a detailed description of the survey and its design, as well as the spectroscopic observations, data reduction, and redshift measurement techniques employed. It also presents an analysis of the properties of the target galaxies, including emission-line diagnostics, which show that they are mostly extreme starburst galaxies, and Hubble Space Telescope images, which show that they contain a high fraction of interacting or distorted systems. In conjunction with this paper, we make a public release of the data for the first 100,000 galaxies measured for the project.
    Comment: Accepted by MNRAS; this version has some figures in low-resolution format. A full-resolution PDF version (7MB) is available at http://www.physics.uq.edu.au/people/mjd/pub/wigglez1.pdf The WiggleZ home page is at http://wigglez.swin.edu.au
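    (For scale, a standard figure not taken from the abstract: the comoving sound horizon that sets the BAO standard ruler is roughly 150 Mpc, so measuring the BAO scale to better than 2 per cent corresponds to locating that ruler to about 3 Mpc in comoving separation.)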
