
    Stochastic Dynamic Cache Partitioning for Encrypted Content Delivery

    In-network caching is an appealing solution to cope with the increasing bandwidth demand of video, audio and data transfer over the Internet. Nonetheless, an increasing share of content delivery services adopt encryption through HTTPS, which is not compatible with traditional ISP-managed approaches like transparent and proxy caching. This raises the need for solutions involving both Internet Service Providers (ISPs) and Content Providers (CPs): by design, the solution should preserve business-critical CP information (e.g., content popularity, user preferences) on the one hand, while allowing for a deeper integration of caches in the ISP architecture (e.g., in 5G femto-cells) on the other. In this paper we address this issue by considering a content-oblivious ISP-operated cache. The ISP allocates the cache storage to the various content providers so as to maximize the bandwidth savings provided by the cache: the main novelty lies in the fact that, to protect business-critical information, the ISP only needs to measure the aggregated miss rates of the individual CPs, and does not need to be aware of which objects are requested, as in classic caching. We propose a cache allocation algorithm based on a perturbed stochastic subgradient method, and prove that the algorithm converges close to the allocation that maximizes the overall cache hit rate. We use extensive simulations to validate the algorithm and to assess its convergence rate under stationary and non-stationary content popularity. Our results (i) attest to the feasibility of content-oblivious caches and (ii) show that the proposed algorithm achieves within 10% of the global optimum in our evaluation.
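    A minimal sketch of the allocation loop this abstract describes, under stated assumptions: `measure_miss_rate(cp, alloc)` stands in for the aggregated per-CP miss-rate measurement available to the ISP, the gradient is estimated from perturbed allocations, and the step-size and perturbation choices are illustrative rather than the paper's exact scheme.

```python
import numpy as np

def project_simplex(x, budget):
    """Euclidean projection of x onto {a >= 0, sum(a) = budget}."""
    u = np.sort(x)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(x) + 1) > (css - budget))[0][-1]
    theta = (css[rho] - budget) / (rho + 1.0)
    return np.maximum(x - theta, 0.0)

def allocate_cache(measure_miss_rate, n_cps, budget,
                   n_iters=1000, step=0.5, delta=0.05):
    """Perturbed stochastic subgradient descent on aggregated miss rates.

    measure_miss_rate(cp, alloc) returns a noisy miss-rate sample for one CP;
    the ISP never observes which individual objects were requested.
    """
    alloc = np.full(n_cps, budget / n_cps)   # start from an even split
    for t in range(1, n_iters + 1):
        grad = np.empty(n_cps)
        for cp in range(n_cps):
            # Two-point finite-difference gradient estimate from perturbed
            # allocations (a hedged stand-in for the paper's perturbation).
            up = alloc.copy(); up[cp] += delta
            down = alloc.copy(); down[cp] = max(down[cp] - delta, 0.0)
            grad[cp] = (measure_miss_rate(cp, up) -
                        measure_miss_rate(cp, down)) / (2 * delta)
        # Diminishing step, then project back onto the storage budget.
        alloc = project_simplex(alloc - (step / np.sqrt(t)) * grad, budget)
    return alloc
```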

    The Amateur Sky Survey Mark III Project

    The Amateur Sky Survey (TASS) is a loose confederation of amateur and professional astronomers. We describe the design and construction of our Mark III system, a set of wide-field drift-scan CCD cameras which monitor the celestial equator down to thirteenth magnitude in several passbands. We explain the methods by which images are gathered, processed, and reduced into lists of stellar positions and magnitudes. Over the period October 1996 to November 1998, we compiled a large database of photometric measurements. One of our results is the "tenxcat" catalog, which contains measurements on the standard Johnson-Cousins system for 367,241 stars; it contains links to the light curves of these stars as well.
    Comment: 20 pages, including 4 figures; additional JPEG files for Figures 1, 2. Submitted to PAS
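    The reduction step summarized above amounts to converting background-subtracted CCD counts into instrumental magnitudes and tying them to the standard Johnson-Cousins system. A rough sketch under assumed inputs (the counts, exposure time, and standard-star magnitudes are invented, not TASS data formats):

```python
import numpy as np

def instrumental_mag(counts, exptime):
    """Instrumental magnitude from background-subtracted counts per second."""
    return -2.5 * np.log10(counts / exptime)

def calibrate(inst_mags, standard_mags):
    """Least-squares fit of standard = slope * instrumental + zero_point."""
    slope, zero_point = np.polyfit(inst_mags, standard_mags, 1)
    return slope, zero_point

# Example: three stars with assumed known Johnson-Cousins V magnitudes.
counts = np.array([120000.0, 30000.0, 7500.0])
inst = instrumental_mag(counts, exptime=60.0)
slope, zp = calibrate(inst, standard_mags=np.array([10.0, 11.5, 13.0]))
print(f"V ~ {slope:.3f} * m_inst + {zp:.3f}")
```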

    E-tailers versus Retailers: Which Factors Determine Consumer Preferences

    The growth of Internet technology and electronic commerce has not been matched by theoretically guided social science research. Clear and well-designed consumer research is needed to describe, explain, and predict what will happen to this changing landscape. The primary purpose of this study is to investigate the structure of consumer preferences for making product purchases through three available retail formats: store, catalog, and the Internet. Conjoint analysis was used to assess the structure of the decision and the importance of the attributes in the decision-making process. The results clearly show that the consumer decision-making process is primarily one of choosing the desired retail format (store, catalog, or Internet) and product price (set at low, medium, or high). The strength of the retail store format suggests that fears that the Internet will take over the retail arena seem, at least at this point in time, exaggerated. However, there appears to be an identifiable segment of customers with a preference for the Internet as a retail shopping alternative.
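    Conjoint analysis, as used here, decomposes each respondent's preference ratings into part-worth utilities for attribute levels; relative attribute importance is then the range of part-worths per attribute, normalized over all attributes. A hedged sketch with made-up data (the attribute names mirror the study's format and price factors, the ratings are invented):

```python
import numpy as np
import pandas as pd

# Hypothetical profiles rated by one respondent (1 = worst, 9 = best).
profiles = pd.DataFrame({
    "format": ["store", "catalog", "internet"] * 3,
    "price":  ["low"] * 3 + ["medium"] * 3 + ["high"] * 3,
    "rating": [9, 6, 7, 8, 5, 6, 5, 3, 4],
})

# Dummy-code attribute levels; the least-squares coefficients are the
# part-worth utilities of each level.
X = pd.get_dummies(profiles[["format", "price"]], dtype=float)
coefs, *_ = np.linalg.lstsq(X.to_numpy(),
                            profiles["rating"].to_numpy(dtype=float),
                            rcond=None)
part_worths = pd.Series(coefs, index=X.columns)

# Attribute importance = range of part-worths per attribute, normalized.
attr = part_worths.index.str.split("_").str[0]
ranges = part_worths.groupby(attr).agg(lambda s: s.max() - s.min())
print(part_worths.round(2), (ranges / ranges.sum()).round(2), sep="\n\n")
```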

    Big Data Meets Telcos: A Proactive Caching Perspective

    Mobile cellular networks are becoming increasingly complex to manage, while classical deployment/optimization techniques and current solutions (i.e., cell densification, acquiring more spectrum, etc.) are cost-ineffective and thus seen as stopgaps. This calls for the development of novel approaches that leverage recent advances in storage/memory, context-awareness, and edge/cloud computing, and that fall into the framework of big data. However, big data is itself yet another complex phenomenon to handle, and comes with its notorious 4Vs: velocity, veracity, volume and variety. In this work, we address these issues in the optimization of 5G wireless networks via the notion of proactive caching at the base stations. In particular, we investigate the gains of proactive caching in terms of backhaul offloading and request satisfaction, while tackling the large amount of data available for content popularity estimation. In order to estimate the content popularity, we first collect users' mobile traffic data from several base stations of a Turkish telecom operator over a time interval of several hours. Then, an analysis is carried out locally on a big data platform, and the gains of proactive caching at the base stations are investigated via numerical simulations. It turns out that several gains are possible depending on the level of available information and the storage size. For instance, with 10% of content ratings and 15.4 GByte of storage (87% of the total catalog size), proactive caching achieves 100% request satisfaction and offloads 98% of the backhaul when considering 16 base stations.
    Comment: 8 pages, 5 figures
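    A toy version of the trade-off this abstract quantifies: estimate content popularity from a partial sample of requests, fill the cache greedily by estimated popularity, and measure request satisfaction and backhaul offload against the full trace. The Zipf popularity, content sizes, cache budget, and 10% sampling rate are illustrative stand-ins for the operator data described above, not the paper's dataset.

```python
import numpy as np

rng = np.random.default_rng(42)
n_contents, n_requests = 1000, 50_000
sizes = rng.uniform(1, 100, n_contents)        # content sizes in MB (made up)

# Zipf-like popularity and a synthetic request trace.
popularity = 1.0 / np.arange(1, n_contents + 1) ** 0.8
popularity /= popularity.sum()
trace = rng.choice(n_contents, size=n_requests, p=popularity)

# Popularity estimation from a 10% sample of requests (a stand-in for the
# "10% of content ratings" scenario in the abstract).
sample = trace[rng.random(n_requests) < 0.10]
est_counts = np.bincount(sample, minlength=n_contents)

# Greedy proactive caching: fill the budget with the most-requested items.
cache_budget = 0.15 * sizes.sum()              # 15% of catalog size (illustrative)
cached = np.zeros(n_contents, dtype=bool)
used = 0.0
for c in np.argsort(est_counts)[::-1]:
    if used + sizes[c] <= cache_budget:
        cached[c] = True
        used += sizes[c]

hits = cached[trace]                           # requests served from the cache
print(f"request satisfaction: {hits.mean():.1%}")
print(f"backhaul offload:     {sizes[trace][hits].sum() / sizes[trace].sum():.1%}")
```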