
    Web Caching and Prefetching with Cyclic Model Analysis of Web Object Sequences

    Web caching is the process in which web objects are temporarily stored to reduce bandwidth consumption, server load and latency. Web prefetching is the process of fetching web objects from the server before they are actually requested by the client. Integrating caching and prefetching can be very beneficial, as the two techniques support each other. By implementing this integrated scheme in a client-side proxy, the perceived latency can be reduced not just for one user but for many. In this paper, we propose a new integrated caching and prefetching policy called WCP-CMA, which makes use of a profit-driven caching policy that takes into account the periodicity and cyclic behaviour of web access sequences for deriving prefetching rules. Our experimental results show a 10%-15% increase in the hit ratios of cached objects and a 5%-10% decrease in delay compared to the existing schemes.
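
    The abstract gives only a high-level view of the policy. As a rough illustration of the general idea (not the actual WCP-CMA algorithm), the sketch below combines an LRU cache with a simple periodicity check over per-object access timestamps to pick prefetch candidates; all class and parameter names are hypothetical.

```python
import time
from collections import defaultdict, OrderedDict

class PeriodicPrefetchCache:
    """Illustrative client-side proxy cache: keeps recently used objects and
    flags objects whose accesses recur at a roughly constant interval as
    prefetch candidates. A sketch of the general idea, not WCP-CMA itself."""

    def __init__(self, capacity=100, tolerance=0.2):
        self.capacity = capacity
        self.tolerance = tolerance          # allowed jitter relative to the mean period
        self.cache = OrderedDict()          # url -> object, in LRU order
        self.history = defaultdict(list)    # url -> access timestamps

    def access(self, url, fetch):
        """Return the object for `url`, fetching and caching it on a miss."""
        self.history[url].append(time.time())
        if url in self.cache:
            self.cache.move_to_end(url)     # refresh LRU position
            return self.cache[url]
        obj = fetch(url)
        self._insert(url, obj)
        return obj

    def _insert(self, url, obj):
        if len(self.cache) >= self.capacity:
            self.cache.popitem(last=False)  # evict the least recently used object
        self.cache[url] = obj

    def prefetch_candidates(self):
        """URLs whose past accesses look periodic and whose next expected access
        is due; a real profit-driven policy would also weigh object size and
        fetch cost before prefetching."""
        now, due = time.time(), []
        for url, ts in self.history.items():
            if len(ts) < 3:
                continue
            gaps = [b - a for a, b in zip(ts, ts[1:])]
            mean = sum(gaps) / len(gaps)
            jitter = max(abs(g - mean) for g in gaps)
            if mean > 0 and jitter <= self.tolerance * mean and now >= ts[-1] + mean:
                due.append(url)
        return due
```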

    Defining interoperability standards: A case study of public health observatory websites

    The Association of Public Health Observatories (APHO) is a group of region-based health-information providers. Each PHO publishes health-related data for its specific region. Each observatory has taken a national lead in one or more key health areas, such as 'cancer' or 'obesity'. In 2003, a project was initiated to develop 'interoperability' between public health observatory websites, so that the national resources published by one lead observatory could be found on the websites of every other PHO. The APHO interoperability project defined a set of requirements for each PHO: websites should comply with the current government data standards and provide web services to allow data to be searched in real time between different PHOs. This thesis describes the production of an interoperable website for the North East Public Health Observatory (NEPHO) and the problems faced during implementation in complying with the APHO interoperability requirements. The areas of interoperability, e-Government and metadata were investigated specifically for their suitability for NEPHO, and an action list of tasks necessary to achieve the project aims was drawn up. This project has resulted in the successful introduction of a new NEPHO website that complies with the APHO and e-Government requirements; however, interoperability with other organisations has been difficult to achieve. This thesis describes how other organisations approached the same APHO interoperability criteria and questions whether the national project governance could be improved.
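
    The real-time, cross-PHO search requirement can be pictured as a small federated query against each observatory's search web service. The sketch below is purely illustrative: the endpoint URLs, the query parameter and the Dublin Core-style response fields are assumptions, not the actual APHO/NEPHO interfaces.

```python
import json
from urllib.request import urlopen
from urllib.parse import urlencode

# Hypothetical endpoints: the real APHO/NEPHO web service URLs and response
# schema are not given in the abstract.
PHO_ENDPOINTS = {
    "NEPHO": "https://example.org/nepho/search",
    "SEPHO": "https://example.org/sepho/search",
}

def federated_search(term):
    """Query each PHO's search web service in turn and merge the results,
    mimicking the real-time cross-observatory search the project required."""
    results = []
    for name, base in PHO_ENDPOINTS.items():
        url = f"{base}?{urlencode({'q': term})}"
        with urlopen(url, timeout=10) as resp:
            for item in json.load(resp):
                # Assume each item carries Dublin Core-style metadata fields.
                results.append({"source": name,
                                "title": item.get("title"),
                                "subject": item.get("subject"),
                                "link": item.get("identifier")})
    return results
```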

    Intellectual property rights in a knowledge-based economy

    Intellectual property rights (IPR) have been created as economic mechanisms to facilitate ongoing innovation by granting inventors a temporary monopoly in return for disclosure of technical know-how. Since the beginning of the 1980s, IPR have come under scrutiny as new technological paradigms appeared with the emergence of knowledge-based industries. Knowledge-based products are intangible, non-excludable and non-rivalrous goods. Consequently, it is difficult for their creators to control their dissemination and use. In particular, many information goods are based on network externalities and on the creation of market standards. At the same time, information technologies are generic in the sense of being useful in many places in the economy. Hence, policy makers often describe current IPR regimes in the context of new technologies as both over- and under-protective. They are over-protective in the sense that they prevent the dissemination of information that has a very high social value; they are under-protective in the sense that they do not give inventors strong control over the appropriation of rents from their inventions and thus may not provide strong incentives to innovate. During the 1980s, attempts to assess the role of IPR in the process of technological learning found that even though firms in high-tech sectors do use patents as part of their strategy for intellectual property protection, the reliance of these sectors on patents as an information source for innovation is lower than in traditional industries. Intellectual property rights are based mainly on patents for technical inventions and on copyrights for artistic works. Patents are granted only if inventions display minimal levels of utility, novelty and non-obviousness of technical know-how. By contrast, copyrights protect only final works and their derivatives, but guarantee protection for longer periods, according to the Berne Convention. Licensing is a legal instrument that allows the use of patented technology by other firms in return for royalty fees paid to the inventor. Licensing can be contracted on an exclusive or non-exclusive basis, but in most countries patented knowledge can be exclusively held by its inventors, as legal provisions for compulsory licensing of technologies do not exist. The fair-use doctrine aims to prevent the formation of perfect monopolies over technological fields and copyrighted artefacts as a result of IPR application. Hence, the use of patented and copyrighted works is permissible in academic research, education and the development of technologies that are complementary to core technologies. Trade secrecy is meant to prevent inadvertent technology transfer to rival firms and is based on contracts between companies and employees. However, as trade secrets prohibit the transfer of knowledge within industries, regulators have attempted to foster disclosure of technical know-how by the institutional means of patents, copyrights and sui generis laws. Indeed, following the provisions introduced by IPR regulation, firms have shifted from trade secrecy towards patenting strategies to achieve improved protection of intellectual property, as well as to acquire competitive advantages in the market through the monopolization of technological advances.

    A Holistic Approach to Lowering Latency in Geo-distributed Web Applications

    User-perceived end-to-end latency of web applications has a huge impact on revenue for many businesses. The end-to-end latency of web applications is impacted by: (i) user-to-application-server (front-end) latency, which includes downloading and parsing web pages and retrieving further objects requested by JavaScript execution; and (ii) application- and storage-server (back-end) latency, which includes retrieving the metadata required for an initial rendering and subsequent content based on user actions. Improving the user-perceived performance of web applications is challenging, given their complex operating environments involving user-facing web servers, content distribution network (CDN) servers, multi-tiered application servers, and storage servers. Further, the application and storage servers are often deployed on multi-tenant cloud platforms that show high performance variability. While many novel approaches like SPDY and geo-replicated datastores have been developed to improve their performance, many of these solutions are specific to certain layers and may have a different impact on user-perceived performance. The primary goal of this thesis is to address the above challenges in a holistic manner, focusing specifically on improving the end-to-end latency of geo-distributed multi-tiered web applications. This thesis makes the following contributions: (i) First, it reduces user-facing latency by helping CDNs identify and map objects that are more critical for page-load latency to the faster CDN cache layers. Through controlled experiments on real-world web pages, we show the potential of our approach to reduce latency by hundreds of milliseconds without affecting overall CDN miss rates. (ii) Next, it reduces back-end latency by optimally adapting the datastore replication policies (including the number and location of replicas) to the heterogeneity in workloads. We show the benefits of our replication models using real-world traces of Twitter, Wikipedia and Gowalla on an 8-datacenter Cassandra cluster deployed on EC2. (iii) Finally, it makes multi-tier applications resilient to the inherent performance variability in the cloud through fine-grained request redirection. We highlight the benefits of our approach by deploying three real-world applications on commercial cloud platforms.
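
    Of the three contributions, the fine-grained request redirection lends itself to a compact illustration. The sketch below is a simplified stand-in rather than the thesis's actual design: it keeps an exponentially weighted moving average (EWMA) of observed latency per backend replica and routes each request to the currently fastest one; replica names and parameters are made up.

```python
import random

class LatencyAwareRedirector:
    """Illustrative redirector: track per-replica latency with an EWMA and
    send each request to the replica with the lowest current estimate."""

    def __init__(self, replicas, alpha=0.2):
        self.alpha = alpha
        self.ewma = {r: None for r in replicas}   # replica -> latency estimate (ms)

    def choose(self):
        # Try replicas with no estimate yet, then pick the lowest EWMA.
        unknown = [r for r, v in self.ewma.items() if v is None]
        if unknown:
            return random.choice(unknown)
        return min(self.ewma, key=self.ewma.get)

    def record(self, replica, latency_ms):
        prev = self.ewma[replica]
        self.ewma[replica] = (latency_ms if prev is None
                              else self.alpha * latency_ms + (1 - self.alpha) * prev)

# Example: three hypothetical cloud replicas with variable performance.
redirector = LatencyAwareRedirector(["us-east", "eu-west", "ap-south"])
for _ in range(100):
    target = redirector.choose()
    observed = max(random.gauss(50, 15), 1.0)      # stand-in for a measured RTT in ms
    redirector.record(target, observed)
print("current fastest replica:", redirector.choose())
```

    In a real deployment the same decision would be made per request class and per tier, so that only the requests hurt by a slow tenant or replica are redirected.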

    Evaluating and Mapping Internet Connectivity in the United States

    We evaluated Internet connectivity in the United States, drawing on different definitions of connectivity and different methods of analysis. Using DNS cache manipulation, traceroutes, and a crowdsourced “site ping” method, we identify patterns in connectivity that correspond to higher-population or coastal regions of the US. We analyze the strengths and shortcomings of the data and produce connectivity heatmaps, state rankings, and statistical measures. We give comparative analyses of the three methods and present suggestions for future work building on this report.
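
    The crowdsourced "site ping" measurement can be pictured as clients timing HTTP round trips to a fixed list of sites and reporting the results for aggregation. The sketch below is an assumption-laden illustration, not the report's actual client; the target list and sampling parameters are placeholders.

```python
import time
import statistics
from urllib.request import urlopen

# Placeholder targets: the report's actual site list and vantage points are
# not reproduced here.
TARGET_SITES = ["https://example.com", "https://example.org"]

def site_ping(url, attempts=3):
    """Measure HTTP round-trip time to a site, roughly what a crowdsourced
    'site ping' client might report back for aggregation into heatmaps."""
    samples = []
    for _ in range(attempts):
        start = time.monotonic()
        try:
            with urlopen(url, timeout=5):
                pass
            samples.append((time.monotonic() - start) * 1000.0)   # milliseconds
        except OSError:
            continue   # treat failures as missing samples, not zero latency
    return statistics.median(samples) if samples else None

if __name__ == "__main__":
    for site in TARGET_SITES:
        rtt = site_ping(site)
        print(site, "unreachable" if rtt is None else f"{rtt:.1f} ms median RTT")
```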