17 research outputs found

    Internet Web Replication and Caching Taxonomy

    Full text link

    FILTER SITUS KONTEN NEGATIF PADA PROXY SERVER MENGGUNAKAN LOGIKA FUZZY (Filtering Negative-Content Sites on a Proxy Server Using Fuzzy Logic)

    Get PDF
    …with the effect that users can freely access content of a negative nature. A proxy, acting as an intermediary between client and server, functions as a cache and a filter that can restrict access to negative content and thereby promote healthy internet use. The work starts with a communication experiment between server and client using socket programming; site-address data are then collected and, using the Term Frequency method, 100 sites indicated to contain negative content are obtained, hereafter referred to as the log data. In this process, 8 phrases containing adult content were identified, with occurrence frequencies ranging from 4 to 23 and a total of 105 phrase occurrences. Testing used 20 site addresses with varying phrases; applying fuzzy logic together with the Inverse Document Frequency method, 13 sites were classified as not negative and 7 were indicated to contain negative content
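    The filtering idea described here, counting flagged phrases in requested URLs and letting a fuzzy membership score rather than a hard yes/no rule decide the category, can be pictured with a short sketch. The phrase list, the membership breakpoints and the 0.5 cut-off below are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch: term-frequency count of flagged phrases in a URL,
# mapped to a fuzzy membership score in the "negative content" set.

FLAGGED_PHRASES = ["phrase1", "phrase2"]  # hypothetical; the paper reports 8 adult-content phrases


def term_frequency(url: str) -> int:
    """Total number of flagged-phrase occurrences in the URL."""
    url = url.lower()
    return sum(url.count(p) for p in FLAGGED_PHRASES)


def negative_membership(tf: int) -> float:
    """Triangular-style fuzzy membership: 0 at no hits, 1 at the assumed saturation point."""
    if tf == 0:
        return 0.0
    if tf >= 4:          # assumed saturation point, not from the paper
        return 1.0
    return tf / 4.0      # linear ramp between the two breakpoints


def classify(url: str) -> str:
    score = negative_membership(term_frequency(url))
    return "negative" if score >= 0.5 else "not negative"


if __name__ == "__main__":
    for u in ["http://example.com/news", "http://example.com/phrase1/phrase1"]:
        print(u, "->", classify(u))
```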

    A Scalable Cluster-based Infrastructure for Edge-computing Services

    Get PDF
    In this paper we present a scalable and dynamic intermediary infrastructure, SEcS (acronym of "Scalable Edge computing Services"), for developing and deploying advanced Edge computing services, by using a cluster of heterogeneous machines. Our goal is to address the challenges of the next-generation Internet services: scalability, high availability, fault-tolerance and robustness, as well as programmability and quick prototyping. The system is written in Java and is based on IBM's Web Based Intermediaries (WBI) [71], developed at IBM Almaden Research Center.
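    As a rough illustration of the intermediary concept this abstract builds on, a process sitting between clients and origin servers where edge services can inspect and transform traffic, a minimal sketch follows. It is written in plain Python rather than the Java/WBI plugin model the paper actually uses, and the added response header is purely illustrative.

```python
# Minimal HTTP intermediary sketch: fetch the origin resource, apply an
# "edge service" transformation hook, and forward the result to the client.
from http.server import BaseHTTPRequestHandler, HTTPServer
import urllib.request


class EdgeIntermediary(BaseHTTPRequestHandler):
    def do_GET(self):
        # When a client is configured to use this process as a proxy,
        # the request line carries the full origin URL in self.path.
        with urllib.request.urlopen(self.path) as upstream:
            body = upstream.read()
        # Edge-service hook: transform or annotate the response here.
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.send_header("X-Edge-Service", "demo")  # illustrative marker
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), EdgeIntermediary).serve_forever()
```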

    Evaluation, Analysis and adaptation of web prefetching techniques in current web

    Full text link
    This dissertation is focused on the study of the prefetching technique applied to the World Wide Web. This technique lies in processing (e.g., downloading) a Web request before the user actually makes it. By doing so, the waiting time perceived by the user can be reduced, which is the main goal of Web prefetching techniques. The study of the state of the art about Web prefetching showed the heterogeneity that exists in its performance evaluation. This heterogeneity is mainly focused on four issues: i) there was no open framework to simulate and evaluate the already proposed prefetching techniques; ii) no uniform selection of the performance indexes to be maximized, or even their definition; iii) no comparative studies of prediction algorithms taking into account the costs and benefits of web prefetching at the same time; and iv) the evaluation of techniques under very different or few significant workloads. During the research work, we have contributed to homogenizing the evaluation of prefetching performance by developing an open simulation framework that reproduces in detail all the aspects that impact on prefetching performance. In addition, prefetching performance metrics have been analyzed in order to clarify their definition and detect the most meaningful from the user's point of view. We also proposed an evaluation methodology to consider the cost and the benefit of prefetching at the same time. Finally, the importance of using current workloads to evaluate prefetching techniques has been highlighted; otherwise wrong conclusions could be reached. The potential benefits of each web prefetching architecture were analyzed, finding that collaborative predictors could reduce almost all the latency perceived by users. The first step to develop a collaborative predictor is to make predictions at the server, so this thesis is focused on an architecture with a server-located predictor. The environment conditions that can be found in the web are als…
    Doménech I De Soria, J. (2007). Evaluation, Analysis and adaptation of web prefetching techniques in current web [Unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/1841
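    One way to picture the server-located predictor discussed above is a simple first-order Markov model trained on past user sessions: the server learns which objects usually follow the current one and hints them to the client so they can be fetched ahead of time. This is a generic sketch, not any of the specific prediction algorithms evaluated in the thesis; the sample sessions and the confidence threshold are made up for illustration.

```python
# Sketch of a server-located prefetch predictor: first-order Markov
# transitions learned from request sequences, used to emit prefetch hints.
from collections import defaultdict, Counter


class MarkovPredictor:
    def __init__(self, min_confidence: float = 0.3):
        self.transitions = defaultdict(Counter)  # previous page -> Counter of next pages
        self.min_confidence = min_confidence     # assumed hint threshold

    def train(self, sessions):
        """sessions: iterable of per-user request sequences (lists of URLs)."""
        for seq in sessions:
            for prev, nxt in zip(seq, seq[1:]):
                self.transitions[prev][nxt] += 1

    def predict(self, current: str, max_hints: int = 2):
        """Return next-page hints whose estimated probability clears the threshold."""
        counts = self.transitions[current]
        total = sum(counts.values())
        if total == 0:
            return []
        return [page for page, c in counts.most_common(max_hints)
                if c / total >= self.min_confidence]


if __name__ == "__main__":
    sessions = [["/index", "/news", "/sports"],
                ["/index", "/news", "/weather"],
                ["/index", "/about"]]
    predictor = MarkovPredictor()
    predictor.train(sessions)
    print(predictor.predict("/index"))  # likely ['/news']
```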

    Content Distribution in P2P Systems

    Get PDF
    The report provides a literature review of the state of the art in content distribution. Its contributions are threefold. First, it gives insight into traditional Content Distribution Networks (CDN), their requirements and open issues. Second, it discusses Peer-to-Peer (P2P) systems as a cheap and scalable alternative to CDN and extracts their design challenges. Finally, it evaluates the existing P2P systems dedicated to content distribution according to the identified requirements and challenges.