227 research outputs found

    QoS adaptation in multimedia multicast conference applications for e-learning services

    The evolution of the World Wide Web (WWW) has brought distributed multimedia conference applications, powering a new generation of e-learning development and allowing improved interactivity and pro-human relations. Groupware applications are increasingly representative in the Internet home applications market; however, the Quality of Service (QoS) provided by the network is still a limitation that impairs their performance. Such applications have found in multicast technology an ally that contributes to their efficient implementation and scalability. In addition, considering QoS as a design goal at the application level becomes crucial for groupware development, enabling applications to be proactive about QoS. An application's ability to adapt dynamically to resource availability can be considered a quality factor. Tolerant real-time applications, such as videoconferences, are in the front line to benefit from QoS adaptation, yet not all of them include adaptive technology able to provide both end-system and network quality awareness. In these cases, adaptation can be achieved by introducing a multiplatform middleware layer responsible for managing the applications' resources (allocating or limiting them) based on the available processing and networking capabilities. Combining these technological contributions, an adaptive platform has been developed that integrates public-domain multicast tools and is applied to a web-based distance learning system. The system is user-centered (e-student), aiming at good pedagogical practices and proactive usability of multimedia and network resources. The services provided, including QoS-adapted interactive multimedia multicast conferences (MMC), are fully integrated and transparent to end users. When treated systematically in tolerant real-time applications, QoS adaptation yields advantages in group scalability and QoS sustainability in heterogeneous and unpredictable environments such as the Internet.
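    As an illustration of the adaptation idea described above, the following is a minimal, hypothetical sketch of the kind of control loop such a middleware layer could run: it probes end-system and network conditions and scales a conference tool's media bitrate accordingly. The probe and control functions are placeholders, not the platform's actual interfaces, and the thresholds are invented for illustration.

```python
# Hypothetical QoS adaptation loop: back off when the end system or the
# network is saturated, otherwise probe gently upwards (additive increase).
import time

TARGET_CPU = 0.75            # assumed upper bound on CPU utilisation
MIN_KBPS, MAX_KBPS = 128, 2048

def probe_cpu_load() -> float:
    """Placeholder: return current CPU utilisation in [0, 1]."""
    return 0.5

def probe_available_bandwidth_kbps() -> int:
    """Placeholder: return an estimate of available bandwidth."""
    return 1024

def set_video_bitrate(kbps: int) -> None:
    """Placeholder: push the new bitrate to the multicast video tool."""
    print(f"adapting video bitrate to {kbps} kbps")

def adaptation_step(current_kbps: int) -> int:
    cpu = probe_cpu_load()
    bw = probe_available_bandwidth_kbps()
    if cpu > TARGET_CPU or bw < current_kbps:
        new_kbps = max(MIN_KBPS, current_kbps // 2)   # multiplicative back-off
    else:
        new_kbps = min(MAX_KBPS, current_kbps + 64)   # gentle increase
    if new_kbps != current_kbps:
        set_video_bitrate(new_kbps)
    return new_kbps

if __name__ == "__main__":
    kbps = 512
    for _ in range(3):          # a few iterations for illustration
        kbps = adaptation_step(kbps)
        time.sleep(1)
```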

    Using Process Technology to Control and Coordinate Software Adaptation

    We have developed an infrastructure for end-to-end run-time monitoring, behavior/performance analysis, and dynamic adaptation of distributed software. This infrastructure is primarily targeted at pre-existing systems and thus operates outside the target application, without making assumptions about the target's implementation, internal communication/computation mechanisms, source code availability, etc. This paper assumes the existence of the monitoring and analysis components, presented elsewhere, and focuses on the mechanisms used to control and coordinate possibly complex repairs/reconfigurations to the target system. These mechanisms require lower-level effectors somehow attached to the target system, so we briefly sketch one such facility (elaborated elsewhere). Our main contribution is the model, architecture, and implementation of Workflakes, the decentralized process engine we use to tailor, control, and coordinate a cohort of such effectors. We have validated the Workflakes approach with case studies in several application domains. Due to space restrictions, we concentrate primarily on one case study, briefly discuss a second, and only sketch the others.
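    As a rough illustration of the coordination idea (not Workflakes' actual model or API), the sketch below shows a tiny coordinator that, given a condition diagnosed by the monitoring/analysis layer, walks through a repair plan and invokes low-level effectors attached to the target system; all names and actions are hypothetical.

```python
# Hypothetical repair coordination: a diagnosed condition maps to an ordered
# plan of steps, each executed through an effector attached to the target.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class RepairStep:
    name: str
    effector: Callable[[], bool]   # returns True on success

def reroute_client_traffic() -> bool:
    print("effector: rerouting client traffic")   # placeholder action
    return True

def restart_overloaded_replica() -> bool:
    print("effector: restarting replica")         # placeholder action
    return True

REPAIR_PLANS: dict[str, List[RepairStep]] = {
    "replica_overload": [
        RepairStep("reroute", reroute_client_traffic),
        RepairStep("restart", restart_overloaded_replica),
    ],
}

def run_repair(condition: str) -> bool:
    """Execute the plan for a diagnosed condition, stopping on failure."""
    for step in REPAIR_PLANS.get(condition, []):
        print(f"coordinator: executing step '{step.name}'")
        if not step.effector():
            print(f"coordinator: step '{step.name}' failed, aborting plan")
            return False
    return True

if __name__ == "__main__":
    run_repair("replica_overload")
```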

    Multi-Quality Auto-Tuning by Contract Negotiation

    A characteristic challenge of software development is the management of omnipresent change. Classically, this constant change is driven by customers changing their requirements. The wish to optimally leverage available resources opens another source of change: the software system's environment. Software is tailored to specific platforms (e.g., hardware architectures), resulting in many variants of the same software optimized for different environments. If the environment changes, a different variant is to be used, i.e., the system has to reconfigure to the variant optimized for the new situation. The automation of such adjustments is the subject of research on self-adaptive systems. The basic principle is a control loop, as known from control theory: the system (and environment) is continuously monitored, the collected data is analyzed, and decisions for or against a reconfiguration are computed and realized. Central problems in this field, which are addressed in this thesis, are the management of interdependencies between non-functional properties of the system, the handling of multiple decision criteria, and scalability. In this thesis, a novel approach to self-adaptive software, Multi-Quality Auto-Tuning (MQuAT), is presented, which provides design and operation principles for software systems that automatically provide the best possible utility to the user at the least possible cost. For this purpose, a component model has been developed, enabling the software developer to design and implement self-optimizing software systems in a model-driven way. This component model allows for the specification of the structure as well as the behavior of the system, and at runtime it covers the runtime state of the system. The notion of quality contracts is utilized to cover the non-functional behavior and, especially, the dependencies between non-functional properties of the system. This runtime model is used in combination with the contracts to generate optimization problems in different formalisms (Integer Linear Programming (ILP), Pseudo-Boolean Optimization (PBO), Ant Colony Optimization (ACO), and Multi-Objective Integer Linear Programming (MOILP)). Standard solvers are applied to derive solutions to these problems, which represent reconfiguration decisions if the identified configuration differs from the current one. Each approach is empirically evaluated in terms of scalability, showing the feasibility of all approaches except ACO, the superiority of ILP over PBO, and the limits of each approach: 100 component types for ILP, 30 for PBO, 10 for ACO, and 30 for two-objective MOILP. In the presence of more than two objective functions, the MOILP approach is shown to be infeasible.
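    To make the reconfiguration step concrete, the following is a minimal, hypothetical sketch (not MQuAT's actual contract language or problem generator) of how a variant-selection decision can be posed as an ILP and handed to a standard solver, here via the PuLP library; the components, variants, utilities, and CPU budget are invented for illustration.

```python
# Hypothetical variant selection as an ILP: pick one implementation variant
# per component so that total utility is maximized under a CPU budget.
from pulp import LpProblem, LpVariable, LpMaximize, LpBinary, lpSum, value

# (component, variant) -> (utility, cpu_cost), all values invented
variants = {
    ("encoder", "fast"):    (3, 20),
    ("encoder", "quality"): (8, 60),
    ("storage", "local"):   (2, 10),
    ("storage", "cloud"):   (6, 40),
}
CPU_BUDGET = 80  # assumed resource limit reported by the runtime model

prob = LpProblem("reconfiguration", LpMaximize)
x = {k: LpVariable(f"use_{k[0]}_{k[1]}", cat=LpBinary) for k in variants}

# Objective: maximize the total utility of the selected variants.
prob += lpSum(variants[k][0] * x[k] for k in variants)

# Exactly one variant per component (the configuration is well-formed).
for comp in {c for c, _ in variants}:
    prob += lpSum(x[k] for k in variants if k[0] == comp) == 1

# Resource constraint standing in for the quality contracts.
prob += lpSum(variants[k][1] * x[k] for k in variants) <= CPU_BUDGET

prob.solve()
chosen = [k for k in variants if value(x[k]) == 1]
print("reconfigure to:", chosen)
```

    In MQuAT the corresponding problem instances are generated automatically from the quality contracts and the runtime model rather than written by hand as above.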

    Overview of UMTS network evolution through radio and transmission feature validation

    This project is based on the validation of several Universal Mobile Telecommunications System (UMTS) network features, with the aim of providing an end-to-end, in-depth overview of the knowledge gained in parallel in the areas of radio network mobility processes (cell camping and inter-system handover), Quality of Service improvement for High Speed Packet Access (HSPA) data users, and transport network evolution towards the All-IP era. Hardware and software validation is a key step in the relationship between a mobile network operator and its vendor. Through this verification process, while a functionality is exercised or a specific piece of hardware is tested, the difference between the actual and the expected result can be better understood, and this in-depth knowledge translates into a tailored use of the product in the operator's live network. As a result, validation helps build a better product according to the customer's requirements and satisfy their needs, which has a positive impact on the future evolution of the vendor's product roadmap for that customer.

    Fifth ERCIM workshop on e-mobility
