326 research outputs found

    Integrating serious games in adaptive hypermedia applications for personalised learning experiences

    Game-based approaches to learning are increasingly recognized for their potential to stimulate intrinsic motivation amongst learners. While a range of examples of effective serious games exist, creating high-fidelity content with which to populate games is a resource-intensive task. To reduce this resource requirement, research is increasingly exploring means to reuse and repurpose existing games. Education has proven a popular application area for Adaptive Hypermedia (AH), as adaptation can offer enriched learning experiences. Whilst content has mainly been in the form of rich text, various efforts have been made to integrate serious games into AH. However, there is little in the way of effective integrated authoring and user modeling support. This paper explores avenues for effectively integrating serious games into AH. In particular, we consider authoring and user modeling aspects in addition to integration into run-time adaptation engines, thereby enabling authors to create AH that includes an adaptive game, thus going beyond mere selection of a suitable game and towards an approach with the capability to adapt and respond to the needs of learners and educators.

    Touching artefacts in an ancient world on a browser-based platform

    Innovations in the teaching and learning process are influenced by the rapid emergence of a knowledge society and tremendous growth in demand for highly informed and educated individuals. Various kinds of computer-based learning systems have already been integrated into conventional teaching methods. However, there is a pressing need to provide a more accessible and immersive learning environment in order to increase learners' receptiveness towards the learning process. Complete involvement of learners in their learning environment will promote better absorption of knowledge via experiential and exploratory pedagogies. In tandem with such pedagogic approaches, this paper discusses the deployment of tactile perception to complement virtual artefacts within the domain of cultural heritage. By stimulating visual and tactile perceptions, the learners' engagement and interest can be sustained. Towards enhancing accessibility to a wider demographic in a more cost-effective manner, web technologies provide a platform that is widely available for mass consumption. The development capitalises on the fact that the majority of UK households have access to computers and the internet.

    Technical evaluation of the mEducator 3.0 linked data-based environment for sharing medical educational resources

    mEducator 3.0 is a content sharing approach for medical education, based on Linked Data principles. Through standardization, it enables sharing and discovery of medical information. Overall, the mEducator project addresses two different approaches: mEducator 2.0, based on Web 2.0 and ad-hoc Application Programming Interfaces (APIs), and mEducator 3.0, which builds upon a collection of Semantic Web Services that federate existing sources of medical and Technology Enhanced Learning (TEL) data. The semantic mEducator 3.0 approach has a number of different instantiations, allowing flexibility and choice. At present these comprise a standalone social web-based instantiation (MetaMorphosis+) and instantiations integrated with the Drupal, Moodle and OpenLabyrinth systems. This paper presents the evaluation results of the mEducator 3.0 Linked Data based environment for sharing medical educational resources and focuses on metadata enrichment, conformance to the requirements and technical performance (of the MetaMorphosis+ and Drupal instantiations).

    A Genome-wide screen identifies frequently methylated genes in haematological and epithelial cancers

    Background: Genetic as well as epigenetic alterations are a hallmark of both epithelial and haematological malignancies. High-throughput screens are required to identify epigenetic markers that can be useful for diagnostic and prognostic purposes across malignancies. Results: Here we report for the first time the use of the MIRA assay (methylated CpG island recovery assay) in combination with genome-wide CpG island arrays to identify epigenetic molecular markers in childhood acute lymphoblastic leukemia (ALL) on a genome-wide scale. We identified 30 genes demonstrating methylation frequencies of ≥25% in childhood ALL; nine genes showed significantly different methylation frequencies in B- vs T-ALL. For the majority of the genes, expression could be restored in methylated leukemia lines after treatment with 5-azaDC. Forty-four percent of the genes represent targets of the polycomb complex. In chronic myeloid leukemia (CML), two of the genes (TFAP2A and EBF2) demonstrated increased methylation in blast crisis compared to chronic phase (P < 0.05). Furthermore, hypermethylation of an autophagy-related gene, ATG16L2, was associated with poorer prognosis in terms of molecular response to imatinib treatment. Lastly, we demonstrated that ten of these genes were also frequently methylated in common epithelial cancers. Conclusion: In summary, we have identified a large number of genes showing frequent methylation in childhood ALL; the methylation status of two of these genes is associated with advanced disease in CML, and the methylation status of another gene is associated with prognosis. In addition, a subset of these genes may act as epigenetic markers across haematological malignancies as well as common epithelial cancers.

    Towards a global participatory platform: Democratising open data, complexity science and collective intelligence

    The FuturICT project seeks to use the power of big data, analytic models grounded in complexity science, and the collective intelligence they yield for societal benefit. Accordingly, this paper argues that these new tools should not remain the preserve of restricted government, scientific or corporate élites, but be opened up for societal engagement and critique. Democratising such assets as a public good requires a sustainable ecosystem enabling different kinds of stakeholder in society, including, but not limited to, citizens and advocacy groups, school and university students, policy analysts, scientists, software developers, journalists and politicians. Our working name for envisioning a sociotechnical infrastructure capable of engaging such a wide constituency is the Global Participatory Platform (GPP). We consider what it means to develop a GPP at the different levels of data, models and deliberation, motivating a framework for different stakeholders to find their ecological niches at different levels within the system, serving the functions of (i) sensing the environment in order to pool data, (ii) mining the resulting data for patterns in order to model the past/present/future, and (iii) sharing and contesting possible interpretations of what those models might mean and, in a policy context, possible decisions. A research objective is also to apply the concepts and tools of complexity science and social science to the project's own work. We therefore conceive the global participatory platform as a resilient, epistemic ecosystem, whose design will make it capable of self-organization and adaptation to a dynamic environment, and whose structure and contributions are themselves networks of stakeholders, challenges, issues, ideas and arguments whose structure and dynamics can be modelled and analysed.