193 research outputs found

    Measuring and Calculating Queue Length Distributions

    Get PDF

    In-Stream Monitoring of Sediments and Water in the Lower Ouachita River for Site Impact to Aquatic Biota

    Get PDF
    Reported reductions in sportfish densities in the main channel of the Ouachita River prompted an investigation, beginning in 1990, into potential causes of ongoing impairment to aquatic biota. In-stream monitoring incorporating toxicity testing of sediments and water was conducted to discern potential sources of contaminants that might be related to the suboptimal fishery populations. Organisms selected to evaluate chronic impairment included larval fish, clams, midges, and water fleas. The fathead minnow (Pimephales promelas) and the cladoceran (Ceriodaphnia dubia) were used to estimate patterns of toxicity associated with water from seven designated reaches and selected tributaries of the Ouachita River. Larval survival and growth tests were conducted using the fathead minnow, while survival and reproduction were assessed for the cladoceran. An enzyme assay using the Asian clam (Corbicula fluminea) and growth and survival tests with Chironomus tentans were used to evaluate ambient sediment toxicity within these same reaches and tributaries. Ambient toxicity was rarely observed in the mainstem of the river and, when it did occur, represented intermittent events. However, impaired growth in larval fish, poor reproduction in cladocera, and reduced enzyme activity in clams were evident for several tributaries. Results of 10-day whole-sediment tests showed significant growth reductions in C. tentans exposed to sediments collected from West and East Two bayous and from Smackover and Coffee creeks. These results suggest intermittent impairment in tributaries of the Ouachita River due to ambient water and sediment conditions, apart from the current concerns about mercury contamination.
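
    The abstract above reports "significant growth reductions" from 10-day whole-sediment tests. As a rough illustration only, the sketch below shows how such a comparison between a reference site and a test site might be made with a one-sided two-sample t-test; the growth values, site labels, and significance level are hypothetical and are not taken from the study.

```python
# Hedged sketch: a two-sample comparison of the kind that might underlie
# "significant growth reductions" in a 10-day whole-sediment test.
# The data and alpha level are hypothetical; the study's actual analysis
# is not described in the abstract.
from scipy import stats

# Hypothetical C. tentans growth measurements (mg dry weight) after 10 days
reference_growth = [1.21, 1.35, 1.18, 1.29, 1.33, 1.26, 1.30, 1.24]
test_site_growth = [0.92, 0.88, 1.01, 0.85, 0.97, 0.90, 0.94, 0.89]

# One-sided test: is growth at the test site lower than at the reference site?
t_stat, p_value = stats.ttest_ind(test_site_growth, reference_growth,
                                  alternative="less")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Growth reduction is statistically significant at alpha = 0.05")
```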

    A Stochastic Broadcast Pi-Calculus

    Get PDF
    In this paper we propose a stochastic broadcast pi-calculus that can be used to model server-client systems in which synchronization is always governed by a single participant, so there is no need to determine joint synchronization rates. We also take immediate transitions into account, which is useful for modeling behavior that has no impact on the temporal properties of a system. Since immediate transitions may introduce non-determinism, we show how this non-determinism can be resolved so that a valid CTMC is ultimately obtained. Some practical examples are given to show the application of the calculus.
    Comment: In Proceedings QAPL 2011, arXiv:1107.074
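
    As a rough illustration of how immediate transitions can be resolved to obtain a valid CTMC, the sketch below applies the standard idea of eliminating "vanishing" states, those reached by immediate transitions, by folding their branching probabilities into the exponential rates between tangible states. The states, rates, and probabilities are invented, and the paper's actual construction for the broadcast calculus may differ; this is only a generic illustration of the technique.

```python
# Hedged sketch: eliminating immediate ("vanishing") states to obtain a CTMC.
# This shows the generic GSPN-style construction, not the paper's semantics;
# all matrices below are invented for illustration.
import numpy as np

# Tangible states T = {t0, t1}; vanishing states V = {v0}.
R_TT = np.array([[0.0, 2.0],     # exponential rates between tangible states
                 [1.0, 0.0]])
R_TV = np.array([[3.0],          # exponential rates from tangible into vanishing states
                 [0.0]])

# Immediate-transition probabilities, with any non-determinism among
# immediate transitions already resolved into a distribution (e.g. by weights).
P_VV = np.array([[0.0]])         # vanishing -> vanishing
P_VT = np.array([[0.4, 0.6]])    # vanishing -> tangible

# Redistribute rates that pass through vanishing states:
#   R_eff = R_TT + R_TV (I - P_VV)^{-1} P_VT
R_eff = R_TT + R_TV @ np.linalg.inv(np.eye(P_VV.shape[0]) - P_VV) @ P_VT

# Build the CTMC generator: self-loops contribute nothing and are dropped,
# and the diagonal is set to minus the row sums.
Q = R_eff.copy()
np.fill_diagonal(Q, 0.0)
np.fill_diagonal(Q, -Q.sum(axis=1))
print(Q)
```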

    A line-balancing strategy for designing flexible assembly systems

    Full text link
    We present a rough-cut analysis tool that quickly determines a few potentially cost-effective designs at the initial design stage of flexible assembly systems (FASs), prior to a detailed analysis such as simulation. It uses quantitative methods for selecting and configuring the components of an FAS suitable for medium to high volumes of several similar products. The system is organized as a series of assembly stations linked by an automated material-handling system moving parts in a unidirectional flow. Each station consists of a single machine or of identical parallel machines. The methods exploit the ability of flexible hardware to switch almost instantaneously from product to product. Our approach is particularly suitable where the product mix is expected to be stable, since we combine the hardware-configuration phase with the task-allocation phase.
    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/45513/1/10696_2004_Article_BF00167513.pd
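
    To make the station and parallel-machine structure concrete, here is a minimal sketch, not the authors' procedure, of a rough-cut balancing pass: tasks are grouped in order into sequential stations against a target cycle time, and any station whose workload exceeds the cycle time is given identical parallel machines. The task times, cycle time, and greedy grouping rule are all assumptions for illustration.

```python
# Hedged sketch of a rough-cut line-balancing pass for a flow-line FAS.
# Task data and the greedy rule are invented; this is not the paper's method.
import math

task_times = {"t1": 20, "t2": 35, "t3": 15, "t4": 50, "t5": 25}  # sec/unit
cycle_time = 40  # sec/unit, implied by the target production rate

stations = []            # each station: list of tasks assigned to it
current, load = [], 0
for task, t in task_times.items():   # tasks assumed listed in precedence order
    if current and load + t > cycle_time:
        stations.append(current)
        current, load = [], 0
    current.append(task)
    load += t
if current:
    stations.append(current)

for i, tasks in enumerate(stations, 1):
    work = sum(task_times[t] for t in tasks)
    machines = math.ceil(work / cycle_time)   # identical parallel machines
    print(f"Station {i}: tasks={tasks}, work={work}s, machines={machines}")
```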

    The Response Times of Priority Classes under Preemptive Resume in M/M/m Queues

    Get PDF

    Rethinking Randomness, An Interview with Jeff Buzen, Part I

    Get PDF
    Editor's Introduction: For more than 40 years, Jeffrey Buzen has been a leader in performance prediction of computer systems and networks. His first major contribution was an algorithm, now known as Buzen's algorithm, that calculated the throughput and response time of any practical network of servers in a few seconds. Prior algorithms were useless because they would have taken months or years for the same calculations. Buzen's breakthrough opened a new industry of companies providing performance evaluation services and laid scientific foundations for designing systems that meet performance objectives. Along the way, he became troubled by the fact that the real systems he was evaluating seriously violated his model's assumptions, and yet the faulty models predicted throughput to within 5 percent of the true value and response time to within 25 percent. He began puzzling over this anomaly and invented a new framework for building computer performance models, which he called operational analysis. Operational analysis produced the same formulas, but with assumptions that hold in most systems. As he continued to explore this puzzle, he formulated a more complete theory of randomness, which he calls observational stochastics, and he wrote a book, Rethinking Randomness, laying out the new theory. We talked with Jeff Buzen about his work. (Peter J. Denning, Editor in Chief)
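
    For context on the algorithm mentioned in the introduction, below is a minimal textbook-style sketch of the convolution method generally credited to Buzen for closed product-form queueing networks with load-independent stations: it computes the normalization constants G(0), ..., G(N) and from them the system throughput. The service demands and population are hypothetical; this is background material, not part of the interview.

```python
# Hedged sketch of Buzen's convolution algorithm for a closed, product-form
# queueing network with load-independent (single-server) stations.
# Service demands and population are invented for illustration.

def buzen_G(demands, N):
    """Return the normalization constants G(0), ..., G(N)."""
    g = [1.0] + [0.0] * N          # with zero stations: G(0)=1, G(n>0)=0
    for D in demands:              # fold in one station at a time
        for n in range(1, N + 1):
            g[n] = g[n] + D * g[n - 1]
    return g

demands = [0.10, 0.08, 0.05]       # hypothetical service demands (seconds)
N = 20                             # hypothetical number of circulating jobs

G = buzen_G(demands, N)
throughput = G[N - 1] / G[N]       # system throughput X(N) = G(N-1)/G(N)
utilization = [D * throughput for D in demands]   # U_k = D_k * X(N)
print(f"X({N}) = {throughput:.3f} jobs/sec, utilizations = "
      + ", ".join(f"{u:.2f}" for u in utilization))
```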

    Surveyor's Forum: A Predictable Problem

    No full text