62,365 research outputs found

    Characterizing and Improving the Reliability of Broadband Internet Access

    Full text link
    In this paper, we empirically demonstrate the growing importance of reliability by measuring its effect on user behavior. We present an approach for broadband reliability characterization using data collected by the many emerging national initiatives to study broadband, and apply it to data gathered by the Federal Communications Commission's Measuring Broadband America project. Motivated by our findings, we present the design, implementation, and evaluation of a practical approach for improving the reliability of broadband Internet access with multihoming. Comment: 15 pages, 14 figures, 6 tables
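    As a rough illustration of the characterization step, the sketch below computes two standard reliability metrics, availability and mean time between failures, from a series of periodic reachability probes; the probe schema and metric definitions are our assumptions, not necessarily the paper's.

```python
# Minimal sketch (assumed schema): summarizing access-link reliability
# from periodic reachability probes, MBA-style measurement data.
from dataclasses import dataclass

@dataclass
class Probe:
    timestamp: float   # seconds since epoch
    reachable: bool    # did the probe succeed?

def reliability_summary(probes: list[Probe]) -> dict:
    """Availability and MTBF from a time-ordered probe series."""
    probes = sorted(probes, key=lambda p: p.timestamp)
    availability = sum(p.reachable for p in probes) / len(probes)
    # Count a failure as an up -> down transition between consecutive probes.
    failures = sum(1 for prev, cur in zip(probes, probes[1:])
                   if prev.reachable and not cur.reachable)
    span = probes[-1].timestamp - probes[0].timestamp
    mtbf = span / failures if failures else float("inf")
    return {"availability": availability, "mtbf_seconds": mtbf}
```

    The multihoming intuition follows from basic reliability arithmetic: two independent links with availabilities a1 and a2 give a combined availability of roughly 1 - (1 - a1)(1 - a2), so even a modest second link can mask most outages of the first.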

    Characterization of ISP Traffic: Trends, User Habits, and Access Technology Impact

    Get PDF
    In recent years, the research community has increased its focus on network monitoring, which is seen as a key tool for understanding the Internet and its users. Several studies have presented a deep characterization of a particular application or a particular network, from the point of view of either the ISP or the Internet user. In this paper, we take a different perspective. We focus on three European countries where we have been collecting traffic for more than a year and a half through 5 vantage points with different access technologies. This enormous amount of information allows us not only to provide precise, multiple, and quantitative measurements of what users do with the Internet in each country, but also to identify common and uncommon patterns and habits across countries. Considering different time scales, we first present trends in application popularity; we then focus on a one-month-long period, and finally drill down into a typical daily characterization of user activity. The results depict an evolving scenario, driven by the consolidation of new services such as Video Streaming and File Hosting and by the adoption of new P2P technologies. Despite the heterogeneity of the users, some common tendencies emerge that ISPs can leverage to improve their services.
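    As an illustration of the kind of aggregation behind these popularity trends, the sketch below turns per-flow records into daily per-application traffic shares; the column names and sample values are invented for the example, not the study's actual schema.

```python
# Minimal sketch (assumed schema): daily per-application traffic shares
# from flow records, the aggregation behind "application popularity".
import pandas as pd

flows = pd.DataFrame({
    "day":   ["2010-01-01", "2010-01-01", "2010-01-02"],
    "app":   ["video_streaming", "p2p", "video_streaming"],
    "bytes": [5_000_000, 2_000_000, 7_000_000],
})

flows["share"] = flows["bytes"] / flows.groupby("day")["bytes"].transform("sum")
popularity = flows.pivot_table(index="day", columns="app",
                               values="share", fill_value=0.0)
print(popularity)  # fraction of each day's traffic per application
```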

    Analysis of Software Aging in a Web Server

    Get PDF
    A number of recent studies have reported the phenomenon of "software aging", characterized by progressive performance degradation and/or an increased occurrence rate of hang/crash failures of a software system, due to the exhaustion of operating system resources or the accumulation of errors. To counteract this phenomenon, a proactive technique called "software rejuvenation" has been proposed. It essentially involves stopping the running software, cleaning its internal state and/or its environment, and then restarting it. Software rejuvenation, being preventive in nature, raises the question of when to schedule it. Periodic rejuvenation, while straightforward to implement, may not yield the best results, because the rate at which software ages is not constant but depends on the time-varying system workload. Software rejuvenation should therefore be planned and initiated in response to the actual system behavior, which requires the measurement, analysis, and prediction of system resource usage. In this paper, we study the evolution of resource usage in a web server while subjecting it to an artificial workload. We first collect data on several system resource usage and activity parameters. Non-parametric statistical methods are then applied to detect and estimate trends in the data sets. Finally, we fit time series models to the collected data. Unlike the models previously used in software aging research, these time series models allow for seasonal patterns, and we show how exploiting the seasonal variation helps in adequately predicting future resource usage. Based on the models employed here, proactive management techniques such as software rejuvenation triggered by actual measurements can be built.
    Keywords: software aging, software rejuvenation, Linux, Apache, web server, performance monitoring, prediction of resource utilization, non-parametric trend analysis, time series analysis
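    To make the trend-detection step concrete, the sketch below implements a basic Mann-Kendall test, a standard non-parametric monotonic-trend test of the kind applied here; the simplification of omitting tie corrections is ours.

```python
# Minimal sketch: Mann-Kendall monotonic-trend test for a resource-usage
# series (e.g., free memory sampled over time). No tie correction.
import math

def mann_kendall(x: list[float]) -> tuple[int, float]:
    """Return (S statistic, approximate z-score) for series x."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18   # variance of S under H0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z  # |z| > 1.96 suggests a trend at roughly the 5% level
```

    A significantly negative trend in, say, free memory is the kind of aging signal that could trigger measurement-based rejuvenation.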

    Internet banking acceptance model: Cross-market examination

    Get PDF
    This article proposes a revised technology acceptance model to measure consumers' acceptance of Internet banking, the Internet Banking Acceptance Model (IBAM). Data were collected from 618 university students in the United Kingdom and Saudi Arabia. The results suggest the importance of attitude, such that attitude and behavioral intentions emerge as a single factor, denoted "attitudinal intentions" (AI). Structural equation modeling confirms the fit of the model, in which perceived usefulness and trust fully mediate the impact of subjective norms and perceived manageability on AI. The invariance analysis demonstrates the psychometric equivalence of the IBAM measurements between the two country groups. At the structural level, the influence of trust and system usefulness on AI varies between the two countries, emphasizing the potential role of culture in IS adoption. The IBAM is robust and parsimonious, explaining over 80% of the variance in AI.
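    The full-mediation claim can be written schematically as a set of structural equations (notation ours, not the article's): subjective norms (SN) and perceived manageability (PM) act on attitudinal intentions (AI) only through perceived usefulness (PU) and trust (T).

```latex
\begin{align}
  PU &= \gamma_{1}\,SN + \gamma_{2}\,PM + \zeta_{1} \\
  T  &= \gamma_{3}\,SN + \gamma_{4}\,PM + \zeta_{2} \\
  AI &= \beta_{1}\,PU + \beta_{2}\,T + \zeta_{3}
  \qquad\text{(no direct $SN$ or $PM$ paths: full mediation)}
\end{align}
```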

    Big Data Meets Telcos: A Proactive Caching Perspective

    Full text link
    Mobile cellular networks are becoming increasingly complex to manage, while classical deployment/optimization techniques and current solutions (i.e., cell densification, acquiring more spectrum, etc.) are cost-ineffective and thus seen as stopgaps. This calls for the development of novel approaches that leverage recent advances in storage/memory, context-awareness, and edge/cloud computing, and fall within the framework of big data. However, big data is itself a complex phenomenon to handle and comes with its notorious 4Vs: velocity, veracity, volume, and variety. In this work, we address these issues in the optimization of 5G wireless networks via the notion of proactive caching at the base stations. In particular, we investigate the gains of proactive caching in terms of backhaul offloading and request satisfaction, while tackling the large amount of available data for content popularity estimation. In order to estimate the content popularity, we first collect users' mobile traffic data from several base stations of a Turkish telecom operator, over time intervals of hours. Then, an analysis is carried out locally on a big data platform, and the gains of proactive caching at the base stations are investigated via numerical simulations. It turns out that several gains are possible depending on the level of available information and the storage size. For instance, with 10% of content ratings and 15.4 GByte of storage size (87% of total catalog size), proactive caching achieves 100% request satisfaction and offloads 98% of the backhaul when considering 16 base stations. Comment: 8 pages, 5 figures
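    The evaluation logic can be sketched in a few lines: fill the cache with the most popular contents that fit the storage budget, then replay a request trace. The metric definitions below (satisfaction as the fraction of requests served from cache, offload as the fraction of bytes kept off the backhaul) are our reading of the abstract, not necessarily the paper's exact formulation.

```python
# Minimal sketch (assumed metric definitions): popularity-based proactive
# caching at a base station under a storage budget.
def greedy_cache(popularity: dict[str, int], sizes: dict[str, float],
                 budget: float) -> set[str]:
    """Cache contents in decreasing popularity order until storage is full."""
    cached, used = set(), 0.0
    for item in sorted(popularity, key=popularity.get, reverse=True):
        if used + sizes[item] <= budget:
            cached.add(item)
            used += sizes[item]
    return cached

def evaluate(requests: list[str], sizes: dict[str, float],
             cached: set[str]) -> tuple[float, float]:
    """Return (request satisfaction ratio, backhaul offload ratio)."""
    satisfied = sum(r in cached for r in requests)
    offloaded = sum(sizes[r] for r in requests if r in cached)
    total = sum(sizes[r] for r in requests)
    return satisfied / len(requests), offloaded / total
```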

    Web Acceptance and Usage Model: A Comparison between Goal-directed and Experiential Web Users

    Get PDF
    In this paper we analyse Web acceptance and usage among goal-directed and experiential users, incorporating intrinsic motives to improve the predictive and explanatory value of TAM, which has traditionally focused on extrinsic motives. A field study was conducted to validate the measures used to operationalize the model variables and to test the hypothesised network of relationships. The data analysis method used was Partial Least Squares (PLS). The empirical results provide strong support for the hypotheses, highlighting the roles of flow, ease of use, and usefulness in determining actual use of the Web among experiential and goal-directed users. In contrast with previous research suggesting that flow is more likely to occur during experiential activities than goal-directed ones, we found clear evidence of flow in goal-directed activities. In particular, the study findings indicate that flow might play a powerful role in determining the attitude towards usage, the intention to use and, in turn, actual Web use among experiential and goal-directed users.
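    PLS path modeling proper requires dedicated software; as a rough stand-in, scikit-learn's PLS regression can illustrate the underlying latent-projection idea on synthetic survey scores. Everything below (scores, loadings, effect sizes) is fabricated purely for illustration.

```python
# Rough stand-in only: PLS regression on synthetic survey scores relating
# flow, ease of use, and usefulness to reported Web use. PLS-SEM as used
# in the paper estimates a full path model, which this does not.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))      # columns: flow, ease of use, usefulness
web_use = X @ np.array([0.5, 0.3, 0.4]) + rng.normal(scale=0.5, size=200)

pls = PLSRegression(n_components=2).fit(X, web_use)
print(pls.score(X, web_use))       # R^2 of the fitted projection
```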