
    Energy Efficiency in the ICT - Profiling Power Consumption in Desktop Computer Systems

    Energy awareness in ICT has become an important issue. Focusing on software, recent work suggested the existence of a relationship between power consumption, software configuration and usage patterns in computer systems. The aim of this work was to collect and analyse power consumption data of general-purpose computer systems, simulating common usage scenarios, in order to extract a power consumption profile for each scenario. We selected two desktop systems of different generations as test machines, developed 11 usage scenarios, and conducted several test runs of them, collecting power consumption data by means of a power meter. Our analysis produced an estimated power consumption value for each scenario and software application used, showing that each single scenario introduced an overhead of 2 to 11 Watts, which corresponds to a percentage increase of up to 20% on recent and more powerful systems. We determined that software and its usage patterns consistently impact the power consumption of computer systems. Further work will be devoted to evaluating how power consumption is affected by the usage of specific system resources.
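
    A minimal sketch of the arithmetic behind the reported figures (per-scenario overhead in Watts relative to an idle baseline, and the corresponding percentage increase); the baseline and scenario readings below are illustrative assumptions, not the paper's measurements:

    # Hypothetical example: per-scenario power overhead and percentage increase
    # over an idle baseline, mirroring how the abstract's "2 to 11 W, up to 20%"
    # figures would be derived. All values are assumed for illustration.

    idle_power_w = 45.0  # assumed baseline (idle) power of the test machine, in Watts

    # assumed mean power-meter readings per usage scenario, in Watts
    scenario_power_w = {
        "web_browsing": 52.0,
        "office_suite": 49.5,
        "video_playback": 56.0,
    }

    for scenario, power in scenario_power_w.items():
        overhead_w = power - idle_power_w                  # absolute overhead in Watts
        increase_pct = 100.0 * overhead_w / idle_power_w   # relative increase over idle
        print(f"{scenario}: +{overhead_w:.1f} W ({increase_pct:.1f}% over idle)")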

    Data Analysis in Multimedia Quality Assessment: Revisiting the Statistical Tests

    Assessment of multimedia quality relies heavily on subjective assessment, and is typically done by human subjects in the form of preferences or continuous ratings. Such data is crucial for the analysis of different multimedia processing algorithms as well as for the validation of objective (computational) methods for the said purpose. To that end, statistical testing provides a theoretical framework for drawing meaningful inferences and making well-grounded conclusions and recommendations. While parametric tests (such as the t-test, ANOVA, and error estimates like confidence intervals) are popular and widely used in the community, there appears to be a certain degree of confusion in the application of such tests. Specifically, the assumptions of normality and homogeneity of variance are often not well understood. Therefore, the main goal of this paper is to revisit these assumptions from a theoretical perspective and, in the process, provide useful insights into their practical implications. Experimental results on both simulated and real data are presented to support the arguments made. Software implementing the said recommendations is also made publicly available, in order to achieve the goal of reproducible research.
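
    The following sketch (Python with SciPy, not the paper's published software) illustrates the assumption checks the abstract refers to: testing normality and homogeneity of variance before choosing between a parametric t-test and a non-parametric alternative. The simulated ratings and threshold are assumptions for illustration only:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    ratings_a = rng.normal(loc=3.5, scale=0.6, size=30)  # simulated ratings, condition A
    ratings_b = rng.normal(loc=3.9, scale=0.9, size=30)  # simulated ratings, condition B

    # Shapiro-Wilk tests the null hypothesis that each sample is normally distributed.
    norm_a = stats.shapiro(ratings_a).pvalue
    norm_b = stats.shapiro(ratings_b).pvalue

    # Levene's test checks homogeneity of variance between the two groups.
    equal_var_p = stats.levene(ratings_a, ratings_b).pvalue

    alpha = 0.05
    if norm_a > alpha and norm_b > alpha:
        # Normality plausible: Student's t-test if variances look equal, else Welch's.
        result = stats.ttest_ind(ratings_a, ratings_b, equal_var=equal_var_p > alpha)
    else:
        # Normality doubtful: fall back to a non-parametric alternative.
        result = stats.mannwhitneyu(ratings_a, ratings_b)

    print(result)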

    Creating telecommunication services based on object-oriented frameworks and SDL

    This paper describes the tools and techniques being applied in the TINA Open Service Creation Architecture (TOSCA) project to develop object-oriented models of distributed telecommunication services in SDL. The paper also describes the way in which Tree and Tabular Combined Notation (TTCN) test cases are derived from these models and subsequently executed against the CORBA-based implementations of these services through a TTCN/CORBA gateway.

    Experiences modelling and using object-oriented telecommunication service frameworks in SDL

    This paper describes experiences in using SDL and its associated tools to create telecommunication services by producing and specialising object-oriented frameworks. The chosen approach recognises the need for the rapid creation of validated telecommunication services. It introduces two stages to service creation: first, a software expert produces a service framework; second, a telecommunications ‘business consultant’ specialises the framework by means of graphical tools to rapidly produce services. Here the focus is on the underlying technology required. In particular, the advantages and disadvantages of SDL and its tools for this purpose are highlighted.

    Preventing Distributed Denial-of-Service Attacks on the IMS Emergency Services Support through Adaptive Firewall Pinholing

    Emergency services are vital services that Next Generation Networks (NGNs) have to provide. As the IP Multimedia Subsystem (IMS) is at the heart of NGNs, 3GPP has carried the burden of specifying a standardized IMS-based emergency services framework. Unfortunately, like any other IP-based standard, the IMS-based emergency services framework is prone to Distributed Denial of Service (DDoS) attacks. In this work we propose a simple but efficient solution that can prevent certain types of such attacks by creating firewall pinholes that regular clients will be able to pass, in contrast to attackers' clients. Our solution was implemented, tested in an appropriate testbed, and its efficiency was proven.
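
    A hypothetical sketch of the pinholing idea, not the paper's implementation: once a client completes a signalling step that regular clients can satisfy (for example a successful registration), a narrow, short-lived pinhole is opened for its source address and port, and all other traffic is dropped. The addresses, port and TTL below are illustrative assumptions:

    import time

    PINHOLE_TTL = 30.0  # seconds a pinhole stays open (assumed value)
    pinholes = {}       # (src_ip, src_port) -> expiry timestamp

    def open_pinhole(src_ip: str, src_port: int) -> None:
        """Called once a client passes the check that regular clients can satisfy."""
        pinholes[(src_ip, src_port)] = time.time() + PINHOLE_TTL

    def allow_packet(src_ip: str, src_port: int) -> bool:
        """Firewall decision: permit only traffic matching an unexpired pinhole."""
        expiry = pinholes.get((src_ip, src_port))
        if expiry is None or expiry < time.time():
            pinholes.pop((src_ip, src_port), None)  # drop stale entries
            return False
        return True

    # Example: a registered client is admitted, an unknown source is not.
    open_pinhole("192.0.2.10", 5060)
    print(allow_packet("192.0.2.10", 5060))    # True
    print(allow_packet("203.0.113.99", 5060))  # False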

    E-Science in the classroom - Towards viability

    E-Science has the potential to transform school science by enabling learners, teachers and research scientists to engage together in authentic scientific enquiry, collaboration and learning. However, if we are to reap the benefits of this potential as part of everyday teaching and learning, we need to explicitly think about and support the work required to set up and run e-Science experiences within any particular educational context. In this paper, we present a framework for identifying and describing the resources, tools and services necessary to move e-Science into the classroom, together with examples of these. This framework is derived from previous experiences conducting educational e-Science projects and from systematic analysis of the categories of ‘hidden work’ needed to run these projects (Smith, Underwood, Fitzpatrick, & Luckin, forthcoming). The articulation of resources, tools and services based on these categories provides a starting point for more methodical design and deployment of future educational e-Science projects, reflection on which can also help further develop the framework. It also points to the technological infrastructure from which such tools and services could be built. As such, it provides an agenda of work to develop both the processes and technologies that would make it practical for teachers to deliver active and collaborative e-Science learning experiences on a larger scale within and across schools. Routine school e-Science will only be possible if such support is specified, implemented and made available to teachers within their work contexts in an appropriate and usable form.