    Data mining technology for the evaluation of learning content interaction

    Interactivity is central to the success of learning. In e-learning and other educational multimedia environments, the evaluation of interaction and behaviour is particularly crucial. Data mining – a non-intrusive, objective analysis technology – is proposed as the central evaluation technology for analysing the usage of computer-based educational environments and, in particular, the interaction with educational content. Basic mining techniques are reviewed and their application in a Web-based third-level course environment is illustrated. Analytic models capturing interaction aspects from the application domain (learning) and the software infrastructure (interactive multimedia) are required for the meaningful interpretation of mining results.
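
    A minimal sketch of the kind of usage mining the abstract describes, under assumed inputs (a simple CSV event log with learner_id, timestamp, content_id and action columns, which is not the paper's actual data format): count per-learner interactions and overall content popularity from logged events.

```python
# Minimal usage-mining sketch. Assumes a CSV log with columns
# learner_id, timestamp, content_id, action (hypothetical format).
import csv
from collections import Counter, defaultdict

def mine_interactions(log_path):
    per_learner = defaultdict(Counter)   # learner_id -> Counter of content_ids
    content_totals = Counter()           # overall popularity of each content item

    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            per_learner[row["learner_id"]][row["content_id"]] += 1
            content_totals[row["content_id"]] += 1

    return per_learner, content_totals

if __name__ == "__main__":
    learners, totals = mine_interactions("interaction_log.csv")
    print("Most accessed content:", totals.most_common(5))
    for learner, counts in learners.items():
        print(learner, "interacted with", sum(counts.values()), "content items")
```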

    A conceptual architecture for interactive educational multimedia

    Learning is more than knowledge acquisition; it often involves the active participation of the learner in a variety of knowledge- and skills-based learning and training activities. Interactive multimedia technology can support the variety of interaction channels and languages required to facilitate interactive learning and teaching. A conceptual architecture for interactive educational multimedia can support the development of such multimedia systems. Such an architecture needs to embed multimedia technology into a coherent educational context. A framework based on an integrated interaction model is needed to capture learning and training activities in an online setting from an educational perspective, to describe them in the human-computer context, and to integrate them with mechanisms and principles of multimedia interaction
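
    Purely as an illustration of what an integrated interaction model might look like in code (the class names and fields below are assumptions, not the paper's actual architecture): a learning activity described from the educational perspective is linked to the human-computer interactions that realise it and to the multimedia channels that deliver them.

```python
# Illustrative sketch of an integrated interaction model (hypothetical classes).
from dataclasses import dataclass, field
from typing import List

@dataclass
class MultimediaChannel:
    name: str            # e.g. "video", "audio", "text", "animation"

@dataclass
class Interaction:
    description: str     # HCI-level description, e.g. "select answer option"
    channel: MultimediaChannel

@dataclass
class LearningActivity:
    objective: str       # educational perspective, e.g. "recall key definitions"
    interactions: List[Interaction] = field(default_factory=list)

quiz = LearningActivity(
    objective="recall key definitions",
    interactions=[Interaction("select answer option", MultimediaChannel("text"))],
)
print(quiz)
```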

    Managing Uncertainty: A Case for Probabilistic Grid Scheduling

    Grid technology is evolving into a global, service-oriented architecture, a universal platform for delivering future high-demand computational services. Strong adoption of the Grid and the utility computing concept is leading to an increasing number of Grid installations running a wide range of applications of different size and complexity. In this paper we address the problem of delivering deadline/economy-based scheduling in a heterogeneous application environment using statistical properties of historical job executions and their associated meta-data. This approach is motivated by a study of six months of computational load generated by Grid applications in a multi-purpose Grid cluster serving a community of twenty e-Science projects. The observed job statistics, resource utilisation and user behaviour are discussed in the context of the management approaches and models most suitable for supporting a probabilistic and autonomous scheduling architecture.
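
    A minimal sketch of the general idea behind probabilistic deadline scheduling (not the paper's actual scheduler; resource names and runtimes below are made up): use the empirical distribution of historical runtimes per resource to estimate the probability that a new job meets its deadline, and pick the resource with the highest estimate.

```python
# Empirical deadline-probability sketch over historical runtimes (hypothetical data).
from bisect import bisect_right

def p_meet_deadline(history_seconds, deadline_seconds):
    """Empirical P(runtime <= deadline) from past executions on a resource."""
    runs = sorted(history_seconds)
    if not runs:
        return 0.0
    return bisect_right(runs, deadline_seconds) / len(runs)

def choose_resource(histories, deadline_seconds):
    """histories: dict resource_name -> list of past runtimes in seconds."""
    return max(histories, key=lambda r: p_meet_deadline(histories[r], deadline_seconds))

histories = {
    "cluster-a": [320, 350, 400, 610, 290],
    "cluster-b": [500, 520, 480, 495, 510],
}
print(choose_resource(histories, deadline_seconds=450))   # -> "cluster-a" (P = 0.8 vs 0.0)
```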

    Built to Last or Built Too Fast? Evaluating Prediction Models for Build Times

    Automated builds are integral to the Continuous Integration (CI) software development practice. In CI, developers are encouraged to integrate early and often. However, long build times can be an issue when integrations are frequent. This research focuses on finding a balance between integrating often and keeping developers productive. We propose and analyze models that can predict the build time of a job. Such models can help developers better manage their time and tasks. Project managers can also explore different factors to determine the best setup for a build job that will keep the build wait time at an acceptable level. Software organizations transitioning to CI practices can use the predictive models to anticipate build times before CI is implemented. The research community can modify our predictive models to further understand the factors and relationships affecting build times.
    Comment: 4-page version published in the Proceedings of the IEEE/ACM 14th International Conference on Mining Software Repositories (MSR), pages 487-490. MSR 201
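
    A minimal sketch of build-time prediction in the spirit of the abstract (the feature names and numbers are hypothetical, not the paper's dataset or chosen model): fit an off-the-shelf regressor on per-build features and use it to anticipate the duration of a new build.

```python
# Build-time prediction sketch with synthetic data (hypothetical features).
from sklearn.linear_model import LinearRegression

# Hypothetical features per build: [lines_changed, files_changed, test_count]
X = [
    [120, 4, 300],
    [15, 1, 300],
    [800, 30, 450],
    [60, 2, 310],
]
y = [540, 180, 1900, 350]          # observed build durations in seconds

model = LinearRegression().fit(X, y)
predicted = model.predict([[200, 8, 320]])[0]
print(f"Predicted build time: {predicted:.0f} s")
```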

    A grid-based approach for processing group activity log files

    The information collected regarding group activity in a collaborative learning environment requires classifying, structuring and processing. The aim is to process this information in order to extract, reveal and provide students and tutors with valuable knowledge, awareness and feedback so that the collaborative learning activity can be performed successfully. However, the large amount of information generated during online group activity may be time-consuming to process and can therefore hinder the real-time delivery of the information. In this study we show how a Grid-based paradigm can be used to effectively process and present the information regarding group activity gathered in the log files of a collaborative environment. The computational power of the Grid makes it possible to process a huge amount of event information, compute statistical results and present them, when needed, to the members of the online group and to the tutors, who are geographically distributed.
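
    A minimal sketch of the split-process-merge pattern the abstract relies on, with local worker processes standing in for Grid nodes (the "user,event,..." log format and the chosen statistic are assumptions, not the study's actual ones): each worker computes partial counts over its share of the log files, and the partial results are merged into group-activity statistics.

```python
# Distributed log-processing sketch using local processes as stand-ins for Grid nodes.
from collections import Counter
from multiprocessing import Pool

def count_events(log_path):
    """Partial statistic computed on one 'node': events per user in one file."""
    counts = Counter()
    with open(log_path) as f:
        for line in f:
            user, event = line.strip().split(",")[:2]   # assumed "user,event,..." lines
            counts[user] += 1
    return counts

def process_logs(log_paths, workers=4):
    with Pool(workers) as pool:
        partials = pool.map(count_events, log_paths)
    total = Counter()
    for partial in partials:
        total.update(partial)      # merge per-file counts into group-wide statistics
    return total

if __name__ == "__main__":
    print(process_logs(["group1.log", "group2.log"]))
```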

    Automated design analysis, assembly planning and motion study analysis using immersive virtual reality

    Previous research work at Heriot-Watt University using immersive virtual reality (VR) for cable harness design showed that VR provided substantial productivity gains over traditional computer-aided design (CAD) systems. This follow-on work aimed to understand the degree to which aspects of this technology contributed to these benefits and to determine whether engineering design and planning processes could be analysed in detail by non-intrusively monitoring and logging engineering tasks. This involved using a CAD-equivalent VR system for cable harness routing design, harness assembly and installation planning that could be functionally evaluated using a set of creative design tasks to measure the system's and users' performance. A novel design task categorisation scheme was created and formalised which broke down the cable harness design process and its associated activities. The system was also used to demonstrate the automatic generation of usable bulkhead connector, cable harness assembly and cable harness installation plans from non-intrusive user logging. Finally, the data generated from the user logging allowed the automated categorisation of user actions and the automated generation of process flow diagrams and chronocyclegraphs.
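
    An illustrative sketch of the automated activity categorisation step described above (the categories, keywords and log entries are assumptions, not the study's actual categorisation scheme): map non-intrusively logged user actions onto design-task categories and total the time spent in each.

```python
# Activity-categorisation sketch over a hypothetical VR action log.
CATEGORY_KEYWORDS = {
    "routing": ["route", "path", "bend"],
    "assembly_planning": ["assemble", "fasten", "clip"],
    "navigation": ["fly", "zoom", "rotate_view"],
}

def categorise(action_name):
    """Assign a logged action to the first design-task category whose keyword it contains."""
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(keyword in action_name for keyword in keywords):
            return category
    return "other"

# Each logged entry: (action_name, duration_seconds)
log = [("route_cable_segment", 42.0), ("rotate_view", 3.5), ("clip_harness", 12.0)]

time_per_category = {}
for action, duration in log:
    category = categorise(action)
    time_per_category[category] = time_per_category.get(category, 0.0) + duration
print(time_per_category)
```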

    An approach to software maintenance support using a syntactic source code analyser data base : this thesis is presented in partial fulfillment of the requirements for the degree of Master of Arts in Computer Science at Massey University

    In this thesis, the development of a software maintenance tool called a syntactic source code analyser (SSCA) is summarised. An SSCA supports other maintenance tools which interact with source code by creating a data base of source information with links to a formatted version of the program source code. The particular SSCA presented handles programs written in a version of COBOL. Before developing an SSCA system, aspects of software maintenance need to be considered. Hence, the scope, definitions and problems of maintenance activities are briefly reviewed, and maintenance support through environments, software metrics, and specific tools and techniques is examined. A complete maintenance support environment for an application is found to overlap considerably with the application's documentation system and to share some tools with development environments. Program source code is also identified as the fundamental documentation of an application, and interaction with this source code is a requirement of many maintenance support tools.
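
    A rough sketch of the general idea of a source-information data base with links back to the source (a naive regex scan over generic source text, not the thesis's COBOL-aware analyser; file names and identifiers are hypothetical): record where each identifier occurs so other maintenance tools can navigate from the data base to the corresponding file and line.

```python
# Source-index sketch: identifier occurrences stored in SQLite with file/line links.
import re
import sqlite3

IDENT = re.compile(r"[A-Za-z_][A-Za-z0-9_-]*")   # allows COBOL-style hyphenated names

def build_index(db_path, source_files):
    db = sqlite3.connect(db_path)
    db.execute("CREATE TABLE IF NOT EXISTS occurrence (name TEXT, file TEXT, line INTEGER)")
    for path in source_files:
        with open(path) as f:
            for lineno, text in enumerate(f, start=1):
                for name in IDENT.findall(text):
                    db.execute("INSERT INTO occurrence VALUES (?, ?, ?)", (name, path, lineno))
    db.commit()
    return db

if __name__ == "__main__":
    db = build_index("ssca.db", ["payroll.cbl"])
    for row in db.execute("SELECT file, line FROM occurrence WHERE name = ?", ("EMPLOYEE-RECORD",)):
        print(row)
```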