89 research outputs found

    Web Caching and Prefetching with Cyclic Model Analysis of Web Object Sequences

    Web caching is the process in which web objects are temporarily stored to reduce bandwidth consumption, server load and latency. Web prefetching is the process of fetching web objects from the server before they are actually requested by the client. Integrating caching and prefetching can be very beneficial, as the two techniques support each other. By implementing this integrated scheme in a client-side proxy, the perceived latency can be reduced for many users rather than just one. In this paper, we propose a new integrated caching and prefetching policy called WCP-CMA, which uses a profit-driven caching policy that takes into account the periodicity and cyclic behaviour of web access sequences to derive prefetching rules. Our experimental results show a 10%-15% increase in the hit ratios of the cached objects and a 5%-10% decrease in delay compared to existing schemes.
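
As a rough illustration of the kind of policy the abstract describes (not the actual WCP-CMA algorithm), the sketch below combines a toy profit-driven eviction rule with a simple periodicity-based prefetch predictor; the profit formula, class names and parameters are assumptions made for illustration only.

from collections import defaultdict

class ProfitCache:
    """Toy profit-driven cache: evict the object whose estimated profit
    (access frequency x fetch latency / size) is lowest."""

    def __init__(self, capacity):
        self.capacity = capacity          # max number of cached objects
        self.store = {}                   # url -> (latency, size)
        self.hits = defaultdict(int)      # url -> observed request count

    def profit(self, url):
        latency, size = self.store[url]
        return self.hits[url] * latency / size

    def request(self, url, latency=1.0, size=1.0):
        self.hits[url] += 1
        if url in self.store:
            return True                   # cache hit
        if len(self.store) >= self.capacity:
            victim = min(self.store, key=self.profit)
            del self.store[victim]        # evict the lowest-profit object
        self.store[url] = (latency, size)
        return False                      # cache miss, object now cached


def cyclic_prefetch_candidates(history, period):
    """If objects tend to reappear every `period` requests, predict the
    objects seen at the next slot in the cycle as prefetch candidates."""
    slots = defaultdict(set)
    for i, url in enumerate(history):
        slots[i % period].add(url)
    next_slot = len(history) % period
    return slots[next_slot]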

    Supporting Distributed Geo-Processing: A Framework for Managing Multi-Accuracy Spatial Data

    In recent years many countries have developed a Spatial Data Infrastructure (SDI) to manage their geographical information. Large SDIs require new effective techniques to continuously integrate spatial data coming from different sources and characterized by different quality levels. This need is recognized in the scientific literature and is known as the data integration or information fusion problem. A specific aspect of spatial data integration concerns the matching and alignment of object geometries. Existing methods mainly perform the integration by simply aligning the less accurate database with the more accurate one, assuming that the latter always contains a better representation of the relevant geometries. Following this approach, spatial entities are merged together in a sub-optimal manner, causing distortions that potentially reduce the overall database quality. This thesis deals with the problem of spatial data integration in a highly-coupled SDI where members have already adhered to a common global schema; hence it focuses on the geometric integration problem, assuming that some schema matching operations have already been performed.
In particular, the thesis initially proposes a model for representing spatial data together with their quality characteristics, producing a multi-accuracy spatial database, and then defines a novel integration process that takes into account the different positional accuracies of the involved source databases. The main goal of this process is to preserve the coherence and consistency of the integrated data and, where possible, enhance its accuracy. The proposed multi-accuracy spatial data model and the related integration technique represent the basis for a framework able to support distributed geo-processing in an SDI context. The problem of implementing such long-running distributed computations is also treated from a practical perspective by evaluating the applicability of existing workflow technologies. This evaluation leads to the definition of an ideal software solution, whose characteristics are discussed in the last chapters by considering the design of the proposed integration process as a motivating example.
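
The abstract does not give the thesis' actual integration formulas, so the following is only a minimal sketch of the general idea of accuracy-aware merging: each geometry carries a positional accuracy, and two representations of the same point are combined by inverse-variance weighting rather than snapping the less accurate one onto the more accurate one. The MeasuredPoint type and the weighting scheme are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class MeasuredPoint:
    """A point with its positional accuracy (standard deviation in metres)."""
    x: float
    y: float
    accuracy: float  # smaller value = more accurate position

def merge_points(a: MeasuredPoint, b: MeasuredPoint) -> MeasuredPoint:
    """Combine two representations of the same point with inverse-variance
    weighting, so the less accurate source still contributes instead of
    being discarded or simply aligned onto the more accurate one."""
    wa, wb = 1.0 / a.accuracy**2, 1.0 / b.accuracy**2
    x = (wa * a.x + wb * b.x) / (wa + wb)
    y = (wa * a.y + wb * b.y) / (wa + wb)
    merged_accuracy = (1.0 / (wa + wb)) ** 0.5
    return MeasuredPoint(x, y, merged_accuracy)

# Example: a survey-grade point (0.1 m) merged with a digitised one (1.0 m)
print(merge_points(MeasuredPoint(10.0, 5.0, 0.1), MeasuredPoint(10.4, 5.2, 1.0)))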

    Internet search techniques: using word count, links and directory structure as internet search tools

    A thesis submitted for the degree of Doctor of Philosophy of the University of Luton. As the Web grows in size it becomes increasingly important that ways are developed to maximise the efficiency of the search process and index its contents with minimal human intervention. An evaluation is undertaken of current popular search engines which use a centralised index approach. Using a number of search terms and metrics that measure similarity between sets of results, it was found that there is very little commonality between the outcomes of the same search performed using different search engines. A semi-automated system for searching the web is presented, the Internet Search Agent (ISA), which employs a method for indexing based upon the idea of "fingerprint types". These fingerprint types are based upon the text and links contained in the web pages being indexed. Three examples of fingerprint type are developed: the first concentrates upon the textual content of the indexed files, while the other two augment this with the use of links to and from those files. By looking at the results returned as a search progresses, in terms of the number and content of results for effort expended, comparisons can be made between the three fingerprint types. The ISA model allows the searcher to be presented with results in context and potentially allows for distributed searching to be implemented.
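
As a rough sketch of what a text-based fingerprint and a link-augmented fingerprint might look like (the actual ISA fingerprint types are defined in the thesis, not here), the functions below are illustrative assumptions:

import re
from collections import Counter

def text_fingerprint(html, top_n=20):
    """Text-only fingerprint: the most frequent terms of a page."""
    words = re.findall(r"[a-z]{3,}", html.lower())
    return dict(Counter(words).most_common(top_n))

def link_fingerprint(html):
    """Link-augmented fingerprints add the URLs the page points to."""
    return set(re.findall(r'href="([^"]+)"', html))

def similarity(fp_a, fp_b):
    """Overlap between two fingerprints, usable to compare result sets
    returned by different search strategies."""
    shared = set(fp_a) & set(fp_b)
    total = set(fp_a) | set(fp_b)
    return len(shared) / len(total) if total else 0.0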

    Lightweight Federation of Non-Cooperating Digital Libraries

    This dissertation studies the challenges and issues faced in federating heterogeneous digital libraries (DLs). The objective of this research is to demonstrate the feasibility of interoperability among non-cooperating DLs by presenting a lightweight, data-driven approach, or Data Centered Interoperability (DCI). We build a Lightweight Federated Digital Library (LFDL) system to provide a federated search service for existing digital libraries with no prior coordination. We describe the motivation, architecture, design and implementation of the LFDL. We develop, deploy, and evaluate key services of the federation. The major difference from existing DL interoperability approaches is that we do not insist on cooperation among DLs; that is, they do not have to change anything in their systems or processes. The underlying approach is to have a dynamic federation where digital libraries can be added to (or removed from) the federation in real time. This is made possible by describing the behavior of participating DLs in an XML-based language that the federation engine understands. The major contributions of this work are: (1) This dissertation addresses the interoperability issues among non-cooperating DLs and presents a practical and efficient approach toward providing a federated search service for those DLs. The DL itself remains autonomous and does not need to change its structure, data format, protocol or other internal features when it is added to the federation. (2) The implementation of the LFDL is based on a lightweight, dynamic, data-centered and rule-driven architecture. To add a DL to the federation, all that is needed is observing the DL's interaction with the user and storing the interaction specification in a human-readable and highly maintainable format. The federation engine provides the federated service based on the specification of a DL. A registration service allows dynamic DL registration, removal, or modification. No code needs to be rewritten or recompiled to add or change a DL. These notions are achieved by designing a new specification language in XML format and a powerful processing engine that enforces and implements the rules specified using the language. (3) In this thesis we explore an alternate approach where searches are distributed to participating DLs in real time. We have addressed the performance and reliability problems associated with other distributed search approaches. This is achieved by a locally maintained metadata repository extracted from the DLs, as well as an efficient caching system based on the repository.
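
To make the data-centred idea concrete, the sketch below uses a plain Python registry as a stand-in for the XML behaviour specification: adding a digital library only means registering a small description of how to query it, with no code changes to the federation engine. The field names and URLs are hypothetical, not taken from the LFDL.

import urllib.parse

# Hypothetical, simplified stand-in for the XML behaviour specification:
# each entry tells the federation engine how to turn a user query into a
# request against one participating digital library.
REGISTRY = {
    "dl_alpha": {"search_url": "https://dl-alpha.example.org/search", "query_param": "q"},
    "dl_beta": {"search_url": "https://dl-beta.example.org/find", "query_param": "terms"},
}

def register(name, spec):
    """Dynamic registration: adding a DL needs only a specification, no code."""
    REGISTRY[name] = spec

def build_requests(query):
    """Federated search: translate one user query into per-DL request URLs."""
    return {
        name: f"{spec['search_url']}?{urllib.parse.urlencode({spec['query_param']: query})}"
        for name, spec in REGISTRY.items()
    }

print(build_requests("digital preservation"))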

    Life Cycle Sustainability Assessment of the Hydrogen Fuel Cell Buses in the European Context. Evaluation of relevant measures to support low-carbon mobility in the public transport sector

    Goal and Background. Transport represents 27% of Europe's Greenhouse Gas (GHG) emissions and is the main cause of air pollution in cities. With the global shift towards a low-carbon economy, the EU set forth a low-emission mobility strategy with the aim of reducing the overall emissions in the transport sector. The High V.LO.-City project is part of this overarching strategy and addresses the integration of hydrogen fuel cell (H2FC) buses in public transport. Methods. In this thesis, the environmental assessment of one H2FC bus and the related refuelling station is carried out using the Life Cycle Assessment (LCA) methodology, taking into account the following phases: (1) bus production, (2) hydrogen production pathways (water electrolysis, chlor-alkali electrolysis, and steam methane reforming), (3) hydrogen consumption during bus operation, and (4) the vehicles' end of life. The potential impacts are evaluated for magnitude and significance in the life cycle impact assessment (LCIA) phase, using the Environmental Footprint (EF) method, which is part of the Product Environmental Footprint (PEF) method established by the European Union (EU) in 2013. The calculated fuel economy is around 10.54 kg H2/100 km, and the energy demand of a refuelling infrastructure may vary between 6 and 9 kWh/kg H2. Results. The results show that H2FC buses have the potential to reduce emissions during the use phase if renewable resources are used. The expected Global Warming Potential (GWP) benefit is about 85% in comparison to a diesel bus. Additionally, the emissions of the selected patterns of hydrogen production depend on how electricity is produced and on the chemical-based or fossil-based feedstocks used to drive the production process. Conclusions and Outlook. Improving the environmental profile of hydrogen production requires promoting clean electricity sources to supply low-carbon hydrogen, sharpening the policy focus with regard to life cycle management, and countering potential setbacks, in particular those related to problem-shifting and to grid improvement.
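
Taking the two figures quoted above at face value (about 10.54 kg H2 per 100 km and 6-9 kWh per kg of dispensed hydrogen), a back-of-the-envelope calculation can relate annual mileage to hydrogen and refuelling-station electricity demand; the 60,000 km example mileage is an assumption, not a figure from the study.

# Back-of-the-envelope use of the figures quoted in the abstract.
FUEL_ECONOMY_KG_PER_100KM = 10.54        # kg H2 per 100 km (from the abstract)
STATION_ENERGY_KWH_PER_KG = (6.0, 9.0)   # kWh per kg H2 (reported range)

def yearly_hydrogen_demand(km_per_year):
    """Hydrogen needed by one bus for the given annual mileage (kg)."""
    return km_per_year / 100.0 * FUEL_ECONOMY_KG_PER_100KM

def yearly_station_energy(km_per_year):
    """Electricity the refuelling station may need for that hydrogen (kWh)."""
    h2 = yearly_hydrogen_demand(km_per_year)
    return tuple(h2 * e for e in STATION_ENERGY_KWH_PER_KG)

# Example: a bus driving 60,000 km per year (assumed mileage)
print(yearly_hydrogen_demand(60_000))    # ~6,324 kg H2
print(yearly_station_energy(60_000))     # ~37,900 to 56,900 kWh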

    Supporting integrated care pathways with workflow technology

    Modern healthcare has moved to a focus on providing patient-centric care rather than disease-centred care. This new approach is provided by a unique care team which is formed to treat a patient. At the start of the treatment, the care team decide on the treatment pathway for the patient. This is a series of treatment stages where, at the end of each stage, the care team use the patient’s current condition to decide whether the treatment moves to the next stage, continues in the current stage, or moves to an unanticipated stage. The initial treatment pathway for each patient is based on the clinical guidelines in an Integrated Care Pathway (ICP) [1] modified to suit the patient’s state. This research mapped a patient ICP decided by the healthcare providers into a Workflow Management System (WFMS) [2]. The clinical guidelines reflect the patient-centric flow to create an IT system supporting the care team. In the initial stage of the research, the IT development team at Velindre Hospital identified that team communication and care coordination were obstacles hindering the implementation of a patient-centric delivery model. This was investigated to determine the causes, which were identified as difficulty in accessing the medical information held in dispersed legacy systems. Moreover, a major constraint in the domain is the need to keep legacy systems in operation, so there is a need to investigate approaches to enhance their functionalities. These information systems cannot be changed across all healthcare organisations, and their complete autonomy needs to be retained as they are in constant use at the sites. Using workflow technology, an independent application representing an ICP was implemented. This was used to construct an independent layer in the software architecture to interact with legacy Clinical Information Systems (CISs) and so evolve their offered functionalities to support the teams. This was used to build a Virtual Organisation (VO) [3, 4] around a patient which facilitates patient-centric care. Moreover, the VO virtually integrates the data from legacy systems and ensures its availability (as needed) at the different treatment stages along the care pathway. Implications of the proposal include: formalising the treatment process, filtering and gathering the patient’s information, ensuring care continuity, and reacting proactively to change. Evaluation of the proposal involved three stages: first, usefulness evaluation by the healthcare providers representing the users; second, setup evaluation by developers of CISs; and finally, technical evaluation by the technology community. The evaluation demonstrated the healthcare providers’ need for an adaptive and proactive system, the possibility of adopting the proposed system, and the novelty and innovation of the proposed approach. The research proposes a patient-centric system achieved by creating a version of an ICP in the system for each patient. It also provides focussed support for team communication and care coordination by identifying the treatment stages and providing the care team’s requirements at each stage. It utilises the data within the legacy systems to be proactive. Moreover, it makes the data required for these actions available from the running legacy systems, which is required for patient-centred care. In the future the work could be extended by mapping other ICPs into the system. This work has been published in four full papers.
It found acceptance in the health informatics community [5, 6, 7] as well as the BPM community [8, 9]. It is also the winner of the 2011 “Global Award of Excellence in Adaptive Case Management (ACM)” in “Medical and Healthcare” [10] of the Workflow Management Coalition (WFMC) [11].
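
A per-patient pathway instance of the kind described above can be sketched as a small state machine in which each stage ends with the care team’s decision to advance, continue, or move to an unanticipated stage; the stage names and method names below are illustrative assumptions, not the actual ICP or WFMS implementation.

# Minimal sketch of a per-patient care-pathway instance. Stage names are
# illustrative only, not taken from any real ICP.
PATHWAY = ["assessment", "surgery", "chemotherapy", "follow_up"]

class PatientPathway:
    def __init__(self, pathway=PATHWAY):
        self.pathway = list(pathway)
        self.index = 0

    @property
    def stage(self):
        return self.pathway[self.index]

    def decide(self, decision, target=None):
        """Apply the care team's end-of-stage decision."""
        if decision == "advance" and self.index + 1 < len(self.pathway):
            self.index += 1
        elif decision == "jump":                  # move to an unanticipated stage
            self.pathway.insert(self.index + 1, target)
            self.index += 1
        # "continue" keeps the patient in the current stage
        return self.stage

p = PatientPathway()
print(p.decide("advance"))            # -> "surgery"
print(p.decide("jump", "radiology"))  # -> "radiology"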