
    The scalar complex potential and the Aharonov-Bohm effect

    The Aharonov-Bohm effect is traditionally attributed to the effect of the electromagnetic 4-potential A, even in regions where both the electric field E and the magnetic field B are zero. The AB effect reveals that multiple-valued functions play a crucial role in the description of an electromagnetic field. We argue that the quantity measured by AB experiments is a difference in values of a multiple-valued complex function, which we call a complex potential or pre-potential. We show that any electromagnetic field can be described by this pre-potential, and give an explicit expression for the electromagnetic field tensor in terms of it. The pre-potential is a modification of the two scalar potential functions. Comment: 10 pages, 2 figures

    Estimating seasonal abundance of a central place forager using counts and telemetry data

    R.J.S. was supported by a Natural Environment Research Council studentship.

    Obtaining population estimates of species that are not easily observed directly can be problematic. However, central place foragers can often be observed some of the time, e.g. when seals are hauled out. In these instances, population estimates can be derived from counts, combined with information on the proportion of time that animals can be observed. We present a modelling framework to estimate seasonal absolute abundance using counts and information from satellite telemetry data. The method was tested on a harbour seal population in an area of southeast Scotland. Counts were made monthly, between November 2001 and June 2003, when seals were hauled out on land, and were corrected for the proportion of time the seals were at sea using satellite telemetry. Harbour seals (n = 25) were tagged with satellite relay data loggers between November 2001 and March 2003. To estimate the proportion of time spent hauled out, time at sea on foraging trips was modelled separately from haul-out behaviour close to haul-out sites because of the different factors affecting these processes. A generalised linear mixed model framework was developed to capture the longitudinal nature of the data and the repeated measures across individuals. Despite seasonal variability in the number of seals counted at haul-out sites, the model generated estimates of abundance, with an overall mean of 846 (95% CI: 767 to 979). The methodology shows the value of using count and telemetry data collected concurrently for estimating absolute abundance, information that is essential to assess interactions between predators, fish stocks and fisheries.
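    The count-correction step at the heart of this abstract can be sketched as follows. This is a minimal illustration, not the paper's generalised linear mixed model, and the numbers are hypothetical rather than the study's estimates.

    ```python
    # Minimal sketch: scale an observed haul-out count by the
    # telemetry-estimated probability that a seal is hauled out
    # (and thus countable) at survey time.

    def estimate_abundance(count, p_hauled_out):
        """Corrected abundance = observed count / Pr(animal is hauled out)."""
        if not 0 < p_hauled_out <= 1:
            raise ValueError("haul-out probability must be in (0, 1]")
        return count / p_hauled_out

    # Hypothetical example: 500 seals counted while telemetry suggests
    # animals are hauled out 60% of the time at the survey hour.
    print(round(estimate_abundance(500, 0.6)))  # 833
    ```

    In the paper, the haul-out probability itself is modelled from the telemetry data with seasonal and individual effects; the division above is only the final scaling.
    
    
    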

    A Scalable Middleware Solution for Advanced Wide Area Web Services

    To alleviate scalability problems in the Web, many researchers concentrate on how to incorporate advanced caching and replication techniques. Many solutions incorporate object-based techniques. In particular, Web resources are considered as distributed objects offering a well-defined interface. We argue that most proposals ignore two important aspects. First, there is little discussion of what kind of coherence should be provided. Proposing specific caching or replication solutions makes sense only if we know what coherence model they should implement. Second, most proposals treat all Web resources alike. Such a one-size-fits-all approach will never work in a wide-area system. We propose a solution in which Web resources are encapsulated in physically distributed shared objects. Each object should encapsulate not only state and operations, but also the policy by which its state is distributed, cached, replicated, migrated, etc.
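    The per-object policy idea can be sketched as follows; the class and method names are illustrative assumptions, not the paper's API.

    ```python
    # Sketch of a "distributed shared object" that bundles state,
    # operations, and its own coherence policy, so different resources
    # can be handled differently rather than one-size-fits-all.

    class DistributedSharedObject:
        def __init__(self, state, policy):
            self.state = state
            self.policy = policy      # chosen per object, not system-wide

        def read(self):
            return self.policy.read(self)

    class CacheAggressively:
        """Weak coherence: acceptable for rarely-updated resources."""
        def read(self, obj):
            return obj.state          # serve the local copy, no validation

    class AlwaysValidate:
        """Strong coherence: for frequently-updated resources."""
        def read(self, obj):
            # a real implementation would revalidate with the origin here
            return obj.state

    page = DistributedSharedObject("<html>...</html>", CacheAggressively())
    print(page.read())
    ```

    The point of the design is that swapping the policy object changes the coherence behaviour without touching the resource's interface.
    
    
    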

    Autonomous Replication in Wide-Area Internetworks

    The number of users connected to the Internet has been growing at an exponential rate, resulting in similar increases in network traffic and Internet server load. Advances in microprocessors and network technologies have kept up with growth so far, but we are reaching the limits of hardware solutions. In order for the Internet's growth to continue, we must efficiently distribute server load and reduce the network traffic generated by its various services. Traditional wide-area caching schemes are client initiated. Decisions on where and when to cache information are made without the benefit of the server's global knowledge of the situation. We introduce a technique, push-caching, that is server initiated; it leaves caching decisions to the server. The server uses its knowledge of network topology, geography, and access patterns to minimize network traffic and server load. The World Wide Web is an example of a large-scale distributed information system that will benefit from this ge..

    World Wide Web Cache Consistency

    The bandwidth demands of the World Wide Web continue to grow at a hyper-exponential rate. Given this rocketing growth, caching of web objects as a means to reduce network bandwidth consumption is likely to be a necessity in the very near future. Unfortunately, many Web caches do not satisfactorily maintain cache consistency. This paper presents a survey of contemporary cache consistency mechanisms in use on the Internet today and examines recent research in Web cache consistency. Using trace-driven simulation, we show that a weak cache consistency protocol (the one used in the Alex ftp cache) reduces network bandwidth consumption and server load more than either time-to-live fields or an invalidation protocol, and can be tuned to return stale data less than 5% of the time.
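    The weak-consistency idea the survey favours can be sketched as an adaptive time-to-live in the style of the Alex ftp cache: an object that has not changed for a long time is assumed unlikely to change soon. The parameter names, the 10% factor, and the one-day cap below are illustrative assumptions, not values from the paper.

    ```python
    # Adaptive-TTL sketch (Alex-style weak consistency, assumed parameters).

    def alex_ttl(last_modified, fetched_at, factor=0.1, max_ttl=86400):
        """Cache an object for a fraction of its age at fetch time:
        long-unchanged objects get long TTLs, up to a fixed cap."""
        age = max(fetched_at - last_modified, 0)
        return min(age * factor, max_ttl)

    def is_fresh(fetched_at, last_modified, now, **kw):
        """Serve the cached copy without revalidation while its adaptive
        TTL has not expired; staleness is possible but bounded by tuning."""
        return (now - fetched_at) <= alex_ttl(last_modified, fetched_at, **kw)

    day = 86400
    # An object already 10 days old when fetched hits the 1-day cap:
    print(alex_ttl(0, 10 * day))  # 86400
    ```

    Tuning `factor` and `max_ttl` trades bandwidth savings against the fraction of stale responses, which is how a protocol of this kind can be tuned toward the under-5% staleness figure the paper reports.
    
    
    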

    An Analysis of Geographical Push-Caching

    Most caching schemes in wide-area, distributed systems are client-initiated. Decisions of when and where to cache information are made without the benefit of the server's global knowledge of the usage patterns. In this paper, we present a new caching strategy: geographical push-caching. Using the server's global knowledge and a derived network topology, we distribute data to cooperating servers. The World Wide Web is an example of a wide-area system that will benefit from distance-sensitive caching, and we present an architecture that allows a Web server to autonomously replicate HTML pages. We use a trace-driven simulation to evaluate several competing caching strategies. Our results show that geographical push-caching reduces bandwidth consumption and server load by the same amount as web proxy caching, but with a savings in global cache space of almost two orders of magnitude. More importantly, servers that wish to reduce Internet bandwidth consumption and their load can do so without..
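    The server-initiated placement decision behind push-caching can be sketched as follows: using its global view of access patterns and network distance, the origin server replicates a page where that removes the most network cost. Region names, hop counts, and the request-hops cost metric here are hypothetical, not the paper's trace data or exact model.

    ```python
    # Sketch of a push-caching placement decision (invented example data).

    def best_replica_site(requests_by_region, hops_from_origin, hops_between):
        """Return (site, saving): the region whose replica removes the most
        network cost, measured as requests x hops travelled."""
        def replica_cost(site):
            return sum(n * hops_between[(region, site)]
                       for region, n in requests_by_region.items())
        origin_cost = sum(n * hops_from_origin[region]
                          for region, n in requests_by_region.items())
        site = min(requests_by_region, key=replica_cost)
        return site, origin_cost - replica_cost(site)

    requests = {"eu": 900, "us": 100}        # requests per region
    from_origin = {"eu": 10, "us": 2}        # hops from the origin server
    between = {("eu", "eu"): 1, ("eu", "us"): 9,
               ("us", "eu"): 9, ("us", "us"): 1}
    print(best_replica_site(requests, from_origin, between))  # ('eu', 7400)
    ```

    Only the server has the global request counts needed to make this decision, which is the contrast with client-initiated caching drawn above.
    
    
    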