
    A cross-layer approach to enhance QoS for multimedia applications over satellite

    The need for on-demand QoS support for communications over satellite is of primary importance for distributed multimedia applications. This is particularly true for the return link, which is often a bottleneck because a large set of end-users accesses a very limited uplink resource. Facing this need, Demand Assignment Multiple Access (DAMA) is a classical technique that allows satellite operators to offer various types of services while managing the resources of the satellite system efficiently. Tackling the quality degradation and delay accumulation issues that can result from the use of these techniques, this paper proposes an instantiation of the Application Layer Framing (ALF) approach using a cross-layer interpreter (xQoS-Interpreter). The information provided by this interpreter is used to manage the resource provided to a terminal by the satellite system in order to improve the quality of multimedia presentations from the end user's point of view. Several experiments are carried out for different loads on the return link, and their impact on QoS is measured through application-level as well as network-level metrics.
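The demand-assignment idea behind DAMA can be caricatured in a few lines: terminals request return-link capacity, and the allocator grants requests in full when the shared uplink can carry the total demand, scaling all grants proportionally otherwise. The function name and the proportional fallback policy are illustrative assumptions, not the scheme used in the paper.

```python
# Minimal sketch of Demand Assignment Multiple Access (DAMA) allocation
# on a satellite return link. The proportional-scaling fallback is an
# assumption for illustration; real DAMA schedulers are more elaborate.

def dama_allocate(requests, total_capacity):
    """Grant each terminal its requested capacity if the sum fits the
    uplink, otherwise scale all grants proportionally to the demand."""
    demand = sum(requests.values())
    if demand <= total_capacity:
        return dict(requests)  # every request can be satisfied as-is
    scale = total_capacity / demand
    return {tid: req * scale for tid, req in requests.items()}

# Three terminals contend for a 100-unit uplink with 200 units of demand.
grants = dama_allocate({"t1": 40, "t2": 60, "t3": 100}, total_capacity=100)
```

Under overload every terminal receives half its request, so the grants always sum to the uplink capacity.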

    Service management for multi-domain Active Networks

    The Internet is an example of a multi-agent system. In our context, an agent is synonymous with network operators, Internet service providers (ISPs) and content providers. ISPs mutually interact for connectivity's sake, but the fact remains that two peering agents are inevitably self-interested. Egoistic behaviour manifests itself in two ways. Firstly, ISPs act in an environment where different ISPs have different spheres of influence, in the sense that they have control and management responsibilities over different parts of the environment. Secondly, contention occurs when an ISP intends to sell resources to another, which gives rise to at least two of its customers sharing (hence contending for) a common transport medium. The multi-agent interaction was analysed by simulating a game-theoretic approach, and the alignment of dominant strategies adopted by agents with evolving traits was abstracted. In particular, the contention for network resources is arbitrated such that a self-policing environment may emerge from a congested bottleneck. Over the past five years, larger ISPs have simply pedalled as fast as they could to meet the growing demand for bandwidth, throwing bandwidth at congestion problems. Today, the dire financial positions of Worldcom and Global Crossing illustrate, to a certain degree, the fallacies of over-provisioning network resources. The framework proposed in this thesis enables subscribers of an ISP to monitor and police each other's traffic in order to establish a well-behaved norm in utilising limited resources. This framework can be expanded to other inter-domain bottlenecks within the Internet. One of the main objectives of this thesis is also to investigate the impact on multi-domain service management in the future Internet, where active nodes could potentially be located amongst traditional passive routers.
The advent of Active Networking technology necessitates node-level computational resource allocations, in addition to prevailing resource reservation approaches for communication bandwidth. Our motivation is to ensure that a service negotiation protocol takes account of these resources so that the response to a specific service deployment request from the end-user is consistent and predictable. To promote the acceleration of service deployment by means of Active Networking technology, a pricing model is also evaluated for computational resources (e.g., CPU time and memory). Previous work in these areas of research concentrated only on bandwidth-related (i.e., communication) resources. Our pricing approach takes account of both guaranteed and best-effort service by adapting the arbitrage theorem from financial theory. The central tenet of our approach is to synthesise insights from different disciplines to address problems in data networks. The greater part of the research experience was obtained during direct and indirect participation in the IST-10561 project known as FAIN (Future Active IP Networks) and the ACTS-AC338 project called MIAMI (Mobile Intelligent Agent for Managing the Information Infrastructure). The Inter-domain Manager (IDM) component was integrated as an integral part of the FAIN policy-based network management system (PBNM). Its monitoring component (developed during the MIAMI project) learns about routing changes that occur within a domain so that the management system and the managed nodes have the same topological view of the network. This enabled our reservation mechanism to reserve resources along the existing route set up by whichever underlying routing protocol is in place.
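The no-arbitrage pricing of guaranteed versus best-effort service can be caricatured with a two-outcome model: if a best-effort request is served with some probability, its arbitrage-free price is that probability times the price of the guaranteed unit, since any other price would let a reseller lock in a risk-free profit. This toy model and its names are our simplification for illustration, not the thesis's actual pricing model.

```python
# Toy two-outcome no-arbitrage price for best-effort computational
# resources (e.g., CPU time). Assumption: a best-effort unit either
# delivers the same value as a guaranteed unit (with probability
# p_served) or delivers nothing.

def best_effort_price(guaranteed_price, p_served):
    """Arbitrage-free price of one best-effort unit. Charging more would
    be dominated by buying guaranteed service; charging less would let a
    reseller buy best-effort capacity and profit risk-free on average."""
    if not 0.0 <= p_served <= 1.0:
        raise ValueError("p_served must be a probability")
    return guaranteed_price * p_served

# A guaranteed CPU-time unit costs 10.0; best-effort is served 80% of the time.
price = best_effort_price(guaranteed_price=10.0, p_served=0.8)
```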

    A reduced-reference perceptual image and video quality metric based on edge preservation

    In image and video compression and transmission, it is important to rely on an objective image/video quality metric which accurately represents the subjective quality of processed images and video sequences. In some scenarios, it is also important to evaluate the quality of the received video sequence with minimal reference to the transmitted one. For instance, for quality improvement of video transmission through closed-loop optimisation, the video quality measure can be evaluated at the receiver and provided as feedback information to the system controller. The original image/video sequence, prior to compression and transmission, is not usually available at the receiver side, so it is important to rely there on an objective video quality metric that needs no reference, or only minimal reference, to the original video sequence. The observation that the human eye is very sensitive to edge and contour information of an image underpins the proposal of our reduced-reference (RR) quality metric, which compares edge information between the distorted and the original image. Results highlight that the metric correlates well with subjective observations, also in comparison with commonly used full-reference metrics and with a state-of-the-art RR metric. © 2012 Martini et al.
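The edge-comparison idea can be sketched in a few lines: extract a binary edge map from each image with a simple gradient threshold, then score how well the reference edges survive in the distorted image. The first-difference gradient and the Dice-style overlap score are our assumptions for illustration, not the metric proposed in the paper.

```python
# Illustrative reduced-reference, edge-based quality score. Images are
# lists of lists of grayscale values; the gradient operator and the
# overlap score are simplified stand-ins for a real edge detector.

def edge_map(img, thresh=30):
    """Mark a pixel as an edge when its horizontal + vertical
    first-difference gradient magnitude exceeds the threshold."""
    h, w = len(img), len(img[0])
    edges = [[False] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            gx = img[y][x + 1] - img[y][x]
            gy = img[y + 1][x] - img[y][x]
            if abs(gx) + abs(gy) >= thresh:
                edges[y][x] = True
    return edges

def edge_preservation(ref, dist):
    """Dice overlap between reference and distorted edge maps
    (1.0 = edges perfectly preserved, 0.0 = all edges lost)."""
    a, b = edge_map(ref), edge_map(dist)
    both = sum(p and q for ra, rb in zip(a, b) for p, q in zip(ra, rb))
    total = sum(sum(r) for r in a) + sum(sum(r) for r in b)
    return 2 * both / total if total else 1.0

ref = [[0, 0, 255, 255]] * 4          # one sharp vertical edge
score = edge_preservation(ref, ref)   # identical images preserve all edges
```

In a reduced-reference setting, only the (compact) reference edge map would travel to the receiver, not the original image itself.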

    Multimedia content description framework

    A framework is provided for describing multimedia content, and a system in which a plurality of multimedia storage devices employing the content description methods of the present invention can interoperate. In accordance with one form of the present invention, the content description framework is a description scheme (DS) for describing streams or aggregations of multimedia objects, which may comprise audio, images, video, text, time series, and various other modalities. This description scheme can accommodate an essentially limitless number of descriptors in terms of features, semantics or metadata, and facilitates content-based search, indexing, and retrieval, among other capabilities, for both streamed and aggregated multimedia objects.
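A toy rendering of such a description scheme: a collection of multimedia objects, each carrying an open-ended set of descriptors (features, semantics, or metadata) that content-based search can query. The class and field names here are hypothetical illustrations, not taken from the patent.

```python
# Hypothetical sketch of a description scheme (DS): each multimedia
# object has a modality and an open-ended descriptor dictionary, and the
# scheme supports content-based lookup over any descriptor key.

from dataclasses import dataclass, field

@dataclass
class MultimediaObject:
    modality: str                      # "audio", "image", "video", "text", ...
    descriptors: dict = field(default_factory=dict)

@dataclass
class DescriptionScheme:
    objects: list = field(default_factory=list)

    def search(self, key, value):
        """Content-based retrieval: objects whose descriptor matches."""
        return [o for o in self.objects if o.descriptors.get(key) == value]

ds = DescriptionScheme()
ds.objects.append(MultimediaObject("image", {"dominant_colour": "red"}))
ds.objects.append(MultimediaObject("video", {"genre": "news"}))
hits = ds.search("dominant_colour", "red")
```

Because descriptors live in an open dictionary, new feature, semantic, or metadata descriptors can be added without changing the scheme itself.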

    Optimization in Web Caching: Cache Management, Capacity Planning, and Content Naming

    Caching is fundamental to performance in distributed information retrieval systems such as the World Wide Web. This thesis introduces novel techniques for optimizing performance and cost-effectiveness in Web cache hierarchies. When requests are served by nearby caches rather than distant servers, server loads and network traffic decrease and transactions are faster. Cache system design and management, however, face extraordinary challenges in loosely-organized environments like the Web, where the many components involved in content creation, transport, and consumption are owned and administered by different entities. Such environments call for decentralized algorithms in which stakeholders act on local information and private preferences. In this thesis I consider problems of optimally designing new Web cache hierarchies and optimizing existing ones. The methods I introduce span the Web from point of content creation to point of consumption: I quantify the impact of content-naming practices on cache performance; present techniques for variable-quality-of-service cache management; describe how a decentralized algorithm can compute economically-optimal cache sizes in a branching two-level cache hierarchy; and introduce a new protocol extension that eliminates redundant data transfers and allows “dynamic” content to be cached consistently. To evaluate several of my new methods, I conducted trace-driven simulations on an unprecedented scale. This in turn required novel workload measurement methods and efficient new characterization and simulation techniques. The performance benefits of my proposed protocol extension are evaluated using two extraordinarily large and detailed workload traces collected in a traditional corporate network environment and an unconventional thin-client system. 
My empirical research follows a simple but powerful paradigm: measure on a large scale an important production environment’s exogenous workload; identify performance bounds inherent in the workload, independent of the system currently serving it; identify gaps between actual and potential performance in the environment under study; and finally devise ways to close these gaps through component modifications or through improved inter-component integration. This approach may be applicable to a wide range of Web services as they mature.
Ph.D. Computer Science and Engineering, University of Michigan
http://deepblue.lib.umich.edu/bitstream/2027.42/90029/1/kelly-optimization_web_caching.pdf
http://deepblue.lib.umich.edu/bitstream/2027.42/90029/2/kelly-optimization_web_caching.ps.bz
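One way to read the "eliminates redundant data transfers" idea is as digest-keyed caching: payloads are identified by a content digest, so repeated content is served locally even when URLs differ or pages are nominally "dynamic". This digest-keyed reading, and the class and method names below, are our illustration; the thesis's actual protocol extension may differ.

```python
# Sketch of redundant-transfer elimination via content digests. The
# origin is assumed to announce a digest first; the body only crosses
# the network when the digest has not been seen before.

import hashlib

class DigestCache:
    def __init__(self):
        self.store = {}          # digest -> payload
        self.body_transfers = 0  # payloads actually fetched over the network

    def get(self, digest, fetch_body):
        """Return the payload for a digest, fetching the body only once."""
        if digest in self.store:
            return self.store[digest]     # redundant transfer avoided
        body = fetch_body()               # full body crosses the network
        self.body_transfers += 1
        self.store[digest] = body
        return body

cache = DigestCache()
page = b"<html>today's front page</html>"
digest = hashlib.sha256(page).hexdigest()
first = cache.get(digest, lambda: page)
second = cache.get(digest, lambda: page)  # served from cache, no new transfer
```

Keying on content rather than on names also makes "dynamic" pages cacheable consistently: two URLs that yield byte-identical bodies share one cached copy.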

    Study of customers' perceptions toward informative service setting items in U.S. hotel industry

    Scope and Method of Study: The purpose of this study was to uncover how hotel ISSI (informative service setting items) were being perceived by hotel customers, and how these perceptions affect customers' satisfaction and their hotel-selection decision. The research design was therefore a cross-sectional descriptive study of customers' perceptions of ISSI, conducted through an online survey. Data analysis included the following techniques: frequency, compared means, independent t-test, Analysis of Variance (ANOVA), Importance-Performance Analysis (IPA), factor analysis, and multiple regression analysis.
Findings and Conclusions: Importance-Performance Analysis (IPA) was employed to compare the overall perceived ISSI. Hospitality educators and hotel customers differed in their importance rating of ISSI attributes (t = -2.012, df = 254, p < 0.05) and also in their performance rating (t = -1.65, df = 254, p < 0.05). Western customers and Asian customers also differed in both importance rating (t = -3.462, df = 254, p < 0.01) and performance rating (t = -3.665, df = 254, p < 0.01) of ISSI attributes. This study also identified six underlying ISSI factors that influence customers' satisfaction level, their future recommendations, and their hotel-selection decision. The regression model revealed that the six factors appeared as significant independent variables for the three major dependent variables.
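The independent t-tests reported above compare group means; the pooled (equal-variance) two-sample t statistic, with df = n1 + n2 - 2, can be computed as below. The sample ratings are made-up illustrative data, not the study's dataset.

```python
# Pooled two-sample t statistic, the equal-variance form commonly
# reported as an "independent t-test". Sample data are invented for
# illustration; they are not the study's survey responses.

from statistics import mean, variance

def t_statistic(a, b):
    """Return (t, df) for the pooled two-sample t-test."""
    n1, n2 = len(a), len(b)
    # Pooled sample variance weights each group's variance by its df.
    sp2 = ((n1 - 1) * variance(a) + (n2 - 1) * variance(b)) / (n1 + n2 - 2)
    se = (sp2 * (1 / n1 + 1 / n2)) ** 0.5
    return (mean(a) - mean(b)) / se, n1 + n2 - 2

# Hypothetical importance ratings from two respondent groups.
t, df = t_statistic([4.1, 3.8, 4.4, 4.0], [3.2, 3.5, 3.1, 3.6])
```

The resulting t would then be compared against the t distribution with the given df to obtain the p-values reported in the abstract.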

    The Localization Industry: A Profile of DNA Media

    Since the mid-1990s the e-commerce industry has experienced dramatic growth that was only the start of a business revolution. With the rapid expansion of Internet-related infrastructure equipment and services that allowed low-cost global communications, the beginnings of a truly global economy began to take shape. Riding on the coattails of this wave were software and content localization services, a necessary component in selling products and services to different countries and across many cultures. The challenges of operating in a diverse, multicultural market are great, filled with cultural subtleties that can be a minefield for the uninformed. DNA Media, based in Vancouver, Canada, is a software localization company specializing in language, software application and content localization (Web-based technologies, application design, CD-ROM, DVD and multimedia versioning). The company enjoyed strong growth in its services over the last two years and, by the year 2000, it was in a position to expand rapidly. This case provides insight into how the managers of a small but growing information technology company managed its growth, established its market in the software localization industry, and planned for the next phase of expansion.

    Improving web interaction in small screen displays

    Soon many people will retrieve information from the Web using handheld, palm-sized or even smaller computers. Although these computers have dramatically increased in sophistication, their display size is, and will remain, much smaller than that of their conventional, desktop counterparts. Currently, browsers for these devices present web pages without taking account of the very different display capabilities. As part of a collaborative project with Reuters, we carried out a study into the usability impact of small displays for retrieval tasks. Users of the small screen were 50% less effective in completing tasks than the large-screen subjects, and performed a very substantial number of scroll actions in attempting to complete the tasks. Our study also provided interesting insights into the shifts in approach users seem to make when using a small-screen device for retrieval. These results suggest that the metaphors useful in a full-screen desktop environment are not the most appropriate for the new devices. Design guidelines are discussed here, proposing directed access methods for effective small-screen interaction. In our ongoing work, we are developing such “meta-interfaces”, which will sit between the small-screen user and the “conventional” web page.