587 research outputs found
Observations On Dissolved Oxygen Conditions In Three Virginia Estuaries After Tropical Storm Agnes (Summer 1972)
Dissolved oxygen (DO) and salinity levels in the James, York, and Rappahannock estuaries were monitored for approximately two months (June 24-August 31, 1972) following Tropical Storm Agnes. DO depressions developed more rapidly and were more severe in the deep waters of the York and Rappahannock than in the James. Depressions that developed immediately after the storm were followed by recoveries and subsequent, more severe depressions. In late July, bottom-water DO concentrations below 1 mg/l were found at stations covering 15 miles of the York and 25 miles of the Rappahannock. Comparison of river data with Chesapeake Bay data suggests that the rivers contributed oxygen-poor water to the Bay during the post-Agnes period. Comparison of 1972 river data with data from other years suggests that the post-Agnes oxygen depressions were more severe than those that occur in normal years.
How Much is Location Information Worth? A Competitive Analysis of the Online Traveling Salesman Problem with Two Disclosure Dates
In this paper we derive the worst-case ratio of an online algorithm for the Traveling Salesman Problem (TSP) with two disclosure dates. This problem, a variant of the online TSP with release dates, is characterized by the disclosure of a job's location at one point in time, followed by the disclosure of that job's release date at a later point in time. We present an online algorithm for this problem restricted to the positive real number line. We then derive the worst-case ratio of our algorithm and show that it is best possible in two contexts: the first, in which the amount of time between the disclosure events and the release time is fixed and equal for all jobs; and the second, in which the time between disclosure events varies
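To make the setting concrete, the following short Python sketch (purely illustrative, not the paper's algorithm; all names and numbers are invented) models a job whose location and release date are disclosed at two different times, and evaluates a naive unit-speed tour on the positive real number line.

# Illustrative model of the online TSP with two disclosure dates
# (assumption: not taken from the paper). A job's location is revealed at
# loc_disclosed, its release date at the later rel_disclosed, and the job
# may only be served at or after release.
from dataclasses import dataclass

@dataclass
class Job:
    location: float        # point on the positive real line
    loc_disclosed: float   # time at which the location becomes known
    rel_disclosed: float   # later time at which the release date becomes known
    release: float         # earliest time the job may be served

def tour_length(jobs):
    """Completion time of a unit-speed server that starts at the origin,
    serves jobs in order of location, waits for unreleased jobs, and
    returns to the origin. A naive baseline for illustration only."""
    t, pos = 0.0, 0.0
    for job in sorted(jobs, key=lambda j: j.location):
        t += abs(job.location - pos)   # travel at unit speed
        t = max(t, job.release)        # wait until the job is released
        pos = job.location
    return t + pos                     # travel back to the origin

print(tour_length([Job(3.0, 0.0, 1.0, 5.0), Job(1.0, 0.5, 1.5, 2.0)]))  # 8.0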
Multi Agent Systems in Logistics: A Literature and State-of-the-art Review
Based on a literature survey, we aim to answer our main question: "How should we plan and execute logistics in supply chains that aim to meet today's requirements, and how can we support such planning and execution using IT?" Today's requirements in supply chains include inter-organizational collaboration and more responsive, tailored supply to meet specific demand. Enterprise systems fall short of meeting these requirements: the focus of planning and execution systems should move towards an inter-enterprise, event-driven mode. Inter-organizational systems may support planning at several levels, from information exchange that enables synchronized planning within each organization up to network planning based on information available throughout the network. We provide a framework for planning systems that spans a rich landscape of possible configurations, with the centralized and the fully decentralized approaches as the two extremes. We define and discuss agent-based systems and in particular multi-agent systems (MAS), emphasize the role of MAS coordination architectures, and explain that transportation is, next to production, an important domain in which MAS can be and actually are applied. Implementation, however, is not widespread, and we explore some implementation issues. We conclude that planning problems in transportation have characteristics that match the specific capabilities of agent systems: in particular, these systems can deal with inter-organizational and event-driven planning settings, hence meeting today's requirements in supply chain planning and execution
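As a toy illustration of one common MAS coordination pattern in transportation (a contract-net style auction; the cost model and all names below are assumptions, not mechanisms described in the surveyed literature), consider this Python sketch.

# Minimal contract-net style task allocation among carrier agents
# (illustrative assumption, not taken from the survey).
from dataclasses import dataclass

@dataclass
class TransportTask:
    origin: float
    destination: float

@dataclass
class CarrierAgent:
    name: str
    position: float

    def bid(self, task):
        # Bid = empty travel to the pickup point + loaded travel to the destination.
        return abs(self.position - task.origin) + abs(task.destination - task.origin)

def allocate(task, agents):
    """A broker awards the task to the lowest bidder; the winner repositions."""
    winner = min(agents, key=lambda a: a.bid(task))
    winner.position = task.destination
    return winner.name

fleet = [CarrierAgent("carrier_A", 0.0), CarrierAgent("carrier_B", 8.0)]
print(allocate(TransportTask(origin=10.0, destination=4.0), fleet))  # carrier_B wins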
Particle Size Characterization of Polydispersed Biomass by the Mechanical Sieving Method
One of the main features of polydispersed biomass is variation in particle dimensions and size. Mean-diameter determination by screening is widely used because of its relative simplicity and low cost; however, when applied to heterogeneous materials such as polydispersed biomass, it can lead to misleading results. We therefore evaluated three methods for determining the mean diameter of polydispersed biomass particles, aiming at the most accurate results. The methods are based on size fractionation by mechanical sieving: one analytical (Sauter diameter) and two based on graphical analysis of particle distribution functions (Mass Distribution Density, MDD, and Cumulative/Augmentative Distribution, CAD). The method based on graphical analysis of the mass distribution density (MDD) was the most effective for sugarcane bagasse analysis, since it enabled detection of distinct particle populations and clear identification of their respective characteristic dimensions. It is therefore a powerful tool for particle size analysis of polydispersed biomass.
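For the analytical route mentioned above, the Sauter mean diameter can be computed from sieve data as d32 = 1 / sum(x_i / d_i), with x_i the mass fraction retained between adjacent sieves and d_i a representative (here geometric-mean) opening for that interval. The Python sketch below uses an invented sieve series and masses, not data from the study.

# Sauter mean diameter from mechanical sieving data (illustrative numbers only).
import math

openings_mm = [1.70, 0.85, 0.60, 0.425, 0.25]   # adjacent sieve openings, coarse to fine
retained_g  = [12.0, 30.0, 25.0, 18.0]          # mass retained between each pair of sieves

total_mass = sum(retained_g)
inv_d32 = 0.0
for i, mass in enumerate(retained_g):
    x_i = mass / total_mass                                # mass fraction of the interval
    d_i = math.sqrt(openings_mm[i] * openings_mm[i + 1])   # geometric-mean opening (mm)
    inv_d32 += x_i / d_i

print(f"Sauter mean diameter d32 = {1.0 / inv_d32:.3f} mm")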
3D reconstruction of the fundus of a phantom eye through stereo imaging of slit lamp images
In the detection of glaucoma, the second leading cause of blindness worldwide, the alteration of the optic disc's morphology is a key clinical indicator. The current gold standard test, stereo funduscopy using stereo fundus cameras, is subjective. Quantitative devices exist but are prohibitively expensive. Work carried out elsewhere has demonstrated quantitative results from stereo matching fundus camera images. Building on this idea, the slit lamp microscope (a mainstay of eye diagnostics, present in practically all ophthalmology and optometry practices) has the potential to be used as a quantitative device. This study explored the feasibility of uncalibrated 3D reconstructions of retinal structures of a phantom eye's fundus using a slit lamp
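The core computational step in such an approach is stereo matching of the two views. The OpenCV sketch below (file names are placeholders and this is not the authors' pipeline) computes a disparity map from a rectified image pair; depth then follows from disparity once the viewing geometry is known or estimated.

# Disparity from a rectified stereo pair of fundus images (illustrative only;
# the input file names are placeholders).
import cv2

left  = cv2.imread("fundus_left.png",  cv2.IMREAD_GRAYSCALE)
right = cv2.imread("fundus_right.png", cv2.IMREAD_GRAYSCALE)

matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
disparity = matcher.compute(left, right).astype("float32") / 16.0  # SGBM returns 16x fixed-point values

vis = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype("uint8")
cv2.imwrite("disparity.png", vis)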
Non-zero entropy density in the XY chain out of equilibrium
The von Neumann entropy density of a block of n spins is proved to be non-zero for large n in the non-equilibrium steady state of the XY chain constructed by coupling a finite cutout of the chain to the two infinite parts to its left and right, which act as thermal reservoirs at different temperatures. Moreover, the non-equilibrium density is shown to be strictly greater than the density in thermal equilibrium
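For reference, the quantities in this abstract are the standard ones (conventions may differ slightly from the paper's): writing $\rho_n$ for the restriction of the steady state to a block of $n$ neighbouring spins,
\[
  S(\rho_n) = -\operatorname{Tr}\,\rho_n \log \rho_n ,
  \qquad
  s = \lim_{n\to\infty} \frac{S(\rho_n)}{n},
\]
and the result stated above is that $s > 0$ in the non-equilibrium steady state and that it strictly exceeds the corresponding entropy density in thermal equilibrium.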
Light-Front Quantisation as an Initial-Boundary Value Problem
In the light front quantisation scheme initial conditions are usually provided on a single lightlike hyperplane. This, however, is insufficient to yield a unique solution of the field equations. We investigate under which additional conditions the problem of solving the field equations becomes well posed. The consequences for quantisation are studied within a Hamiltonian formulation by using the method of Faddeev and Jackiw for dealing with first-order Lagrangians. For the prototype field theory of massive scalar fields in 1+1 dimensions, we find that initial conditions for fixed light cone time and boundary conditions in the spatial variable are sufficient to yield a consistent commutator algebra. Data on a second lightlike hyperplane are not necessary. Hamiltonian and Euler-Lagrange equations of motion become equivalent; the description of the dynamics remains canonical and simple. In this way we justify the approach of discretised light cone quantisation
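For background (standard conventions, possibly differing from the paper's): in light cone coordinates $x^\pm = (x^0 \pm x^1)/\sqrt{2}$, the free massive scalar field in 1+1 dimensions has
\[
  \mathcal{L} = \partial_+\phi\,\partial_-\phi - \tfrac{1}{2} m^2 \phi^2 ,
  \qquad
  2\,\partial_+\partial_-\phi + m^2\phi = 0 .
\]
The Lagrangian is linear in the light cone time derivative $\partial_+\phi$, i.e. it is first order in the sense required by the Faddeev-Jackiw method, and solving the field equation for the $x^+$ evolution requires inverting $\partial_-$, which is where the boundary conditions in the spatial variable $x^-$ enter.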
- …