    Serious Gaming Analytics: What Students' Log Files Tell Us about Gaming and Learning

    In this paper we explore existing log files of the VIBOA environmental policy game. Our aim is to identify relevant player behaviours and performance patterns. The VIBOA game is a 50-hour, master-level serious game that supports inquiry-based learning: students adopt the role of an environmental consultant in the (fictitious) consultancy agency VIBOA and have to deal with complex, multi-faceted environmental problems in an academically and methodologically sound way. A sample of 118 master students played the game. We used learning analytics to extract relevant data from the logs and to find meaningful patterns and relationships. We observed substantial behavioural variability across students. Correlation analyses suggest a behavioural trait that reflects the rate of “switching” between different game objects or activities. We were able to establish a model that uses switching indicators as predictors of learning efficiency. We also found slight evidence that students who display increased switching behaviour need more time to complete the game. We conclude the paper by critically evaluating our findings, making explicit the limitations of our study and suggesting future research that links together learning analytics and serious gaming.
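A switching indicator of the kind the abstract describes can be sketched in a few lines. This is an illustrative example only: the VIBOA logging schema and the paper's actual metric definition are not given in the abstract, so the event format (a sequence of game-object labels) and the rate formula below are assumptions.

```python
def switching_rate(events):
    """Fraction of consecutive log events in which the student switches
    to a different game object or activity (0.0 for fewer than 2 events)."""
    if len(events) < 2:
        return 0.0
    switches = sum(1 for a, b in zip(events, events[1:]) if a != b)
    return switches / (len(events) - 1)

# A hypothetical log: 3 switches over 5 transitions -> 0.6
log = ["map", "map", "report", "interview", "interview", "map"]
print(switching_rate(log))  # 0.6
```

Per-student rates computed this way could then be correlated with completion time or learning-efficiency measures, in the spirit of the analysis the abstract reports.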

    Provincial Data-linkage to Address Complex Policy Challenges

    Introduction: The Province of British Columbia, Canada has established a Data Innovation Program (DI Program) and a Data Science Partnerships Program (DSP Program) to use integrated public-sector data to drive insights into complex policy challenges and support the public good. These programs are part of the province's new Integrated Data Office (IDO). Objectives and Approach: The DI Program was built to enable policy decisions based on a more complete picture of the citizen journey across and throughout government programs. It provides a privacy and security framework for corporate data analytics and a cross-government secure research environment. The DSP Program provides analytics and/or project support for high-priority cross-government projects. The opportunity afforded by this approach to policy decision-making is that valuable data and evidence from multiple sectors can be utilized to make positive changes in the lives of citizens. Results: The IDO has partnered with cross-government experts on a series of pilot projects that used linked data spanning social services, families and households, education, and health and clinical records. Research topics ranged from predicting the risk of long-term unemployment, to the impact of the foreign home buyers tax, to the effectiveness of labour market programs. Throughout our presentation we will use these projects as case examples to address the benefits and opportunities provided by our citizen-centred, integrated approach. Conclusion/Implications: The future of policy decision-making in terms of service delivery relies on mutually beneficial collaboration and the evidence-based insight available through integrated data. Moving forward, it is essential that researchers across government make the most of integrated population-level data to solve pressing issues affecting the lives of citizens.

    Revealing spatiotemporal transmission patterns and stages of COVID-19 in China using individual patients’ trajectory data

    Gauging viral transmission through human mobility in order to contain the COVID-19 pandemic has been a prominent topic in academic studies and evidence-based policy-making. Although it is widely accepted that there is a strong positive correlation between the transmission of the coronavirus and the mobility of the general public, existing studies on this topic have limitations. For example, digital proxies from mobile devices/apps may only partially reflect the movement of individuals; using the mobility of the general public rather than COVID-19 patients in particular, or only using the places where patients were diagnosed to study the spread of the virus, may not be accurate; existing studies have focused on either the regional or the national spread of COVID-19, not the spread at the city level; and there are no systematic approaches for understanding the stages of transmission that would facilitate policy-making to contain the spread. To address these issues, we have developed a new methodological framework for COVID-19 transmission analysis based upon individual patients' trajectory data. By using innovative space–time analytics, this framework reveals the spatiotemporal patterns of patients' mobility and the transmission stages of COVID-19 from Wuhan to the rest of China at finer spatial and temporal scales. It can improve our understanding of the interaction of mobility and transmission, identifying the risk of spread in small and medium-sized cities that have been neglected in existing studies. This demonstrates the effectiveness of the proposed framework and its policy implications for containing the COVID-19 pandemic.
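The basic building block of trajectory-based transmission analysis is extracting directed city-to-city moves from individual patient trajectories. This is a minimal sketch under assumed inputs (ordered lists of visited cities per patient); the paper's space–time analytics are considerably richer than this toy counter.

```python
from collections import Counter

def transmission_links(trajectories):
    """Count directed city-to-city moves across all patient trajectories.
    Consecutive stays in the same city are not counted as moves."""
    links = Counter()
    for traj in trajectories:
        for origin, dest in zip(traj, traj[1:]):
            if origin != dest:
                links[(origin, dest)] += 1
    return links

# Hypothetical trajectories for three patients
patients = [
    ["Wuhan", "Changsha", "Changsha"],
    ["Wuhan", "Xiaogan"],
    ["Wuhan", "Changsha", "Shenzhen"],
]
print(transmission_links(patients).most_common(1))
# [(('Wuhan', 'Changsha'), 2)]
```

Aggregating such links within successive time windows is one way to delineate the transmission stages the abstract refers to.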

    Using a Model-driven Approach in Building a Provenance Framework for Tracking Policy-making Processes in Smart Cities

    The significance of provenance in various settings has emphasised its potential for the policy-making process in Smart City analytics. At present, no framework exists that can capture provenance in a policy-making setting. This research therefore aims at defining a novel framework, namely the Policy Cycle Provenance (PCP) Framework, to capture the provenance of the policy-making process. However, designing the provenance framework is not straightforward due to a number of associated policy design challenges. These design challenges revealed the need for an adaptive system for tracking policies; therefore, a model-driven approach has been considered in designing the PCP framework. The suitability of a networking approach is also proposed for designing workflows that track the policy-making process. (Comment: 15 pages, 5 figures, 2 tables, Proc. of the 21st International Database Engineering & Applications Symposium, IDEAS 2017.)

    Synthetic Biology: Mapping the Scientific Landscape

    This article uses data from Thomson Reuters Web of Science to map and analyse the scientific landscape for synthetic biology. The article draws on recent advances in data visualisation and analytics with the aim of informing upcoming international policy debates on the governance of synthetic biology by the Subsidiary Body on Scientific, Technical and Technological Advice (SBSTTA) of the United Nations Convention on Biological Diversity. We use mapping techniques to identify how synthetic biology can best be understood and the range of institutions, researchers and funding agencies involved. Debates under the Convention are likely to focus on a possible moratorium on the field release of synthetic organisms, cells or genomes. Based on the empirical evidence, we propose that guidance could be provided to funding agencies to respect the letter and spirit of the Convention on Biological Diversity in making research investments. Building on the recommendations of the United States Presidential Commission for the Study of Bioethical Issues, we demonstrate that it is possible to promote independent and transparent monitoring of developments in synthetic biology using modern information tools. In particular, public and policy understanding of, and engagement with, synthetic biology can be enhanced through the use of online interactive tools. As a step forward in this process, we make existing data on the scientific literature on synthetic biology available in an online interactive workbook so that researchers, policy makers and civil society can explore the data and draw conclusions for themselves.
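A common statistic behind science-mapping visualisations of the kind this article describes is keyword co-occurrence. The sketch below is only illustrative: the article works with Web of Science records and dedicated visualisation tooling, whereas this example assumes a simplified input of keyword lists per bibliographic record.

```python
from collections import Counter
from itertools import combinations

def keyword_cooccurrence(records):
    """Count keyword pairs appearing together in the same record.
    Pairs are stored in sorted order so (a, b) and (b, a) merge."""
    pairs = Counter()
    for keywords in records:
        for a, b in combinations(sorted(set(keywords)), 2):
            pairs[(a, b)] += 1
    return pairs

# Hypothetical records with author keywords
records = [
    ["synthetic biology", "genome engineering", "ethics"],
    ["synthetic biology", "genome engineering"],
]
cooc = keyword_cooccurrence(records)
print(cooc[("genome engineering", "synthetic biology")])  # 2
```

The resulting pair counts form the edge weights of the co-word network that landscape maps are typically laid out from.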

    Data analytics and algorithms in policing in England and Wales: Towards a new policy framework

    RUSI was commissioned by the Centre for Data Ethics and Innovation (CDEI) to conduct an independent study into the use of data analytics by police forces in England and Wales, with a focus on algorithmic bias. The primary purpose of the project is to inform CDEI's review of bias in algorithmic decision-making, which is focusing on four sectors, including policing, and working towards a draft framework for the ethical development and deployment of data analytics tools for policing. This paper focuses on advanced algorithms used by the police to derive insights, inform operational decision-making or make predictions. Biometric technologies, including live facial recognition, DNA analysis and fingerprint matching, are outside the direct scope of this study, as are covert surveillance capabilities and digital forensics technology, such as mobile phone data extraction and computer forensics. However, because many of the policy issues discussed in this paper stem from general underlying data protection and human rights frameworks, these issues will also be relevant to other police technologies, and their use must be considered in parallel to the tools examined in this paper. The project involved engaging closely with senior police officers, government officials, academics, legal experts, regulatory and oversight bodies and civil society organisations. Sixty-nine participants took part in the research through semi-structured interviews, focus groups and roundtable discussions. The project has revealed widespread concern across the UK law enforcement community regarding the lack of official national guidance for the use of algorithms in policing, with respondents suggesting that this gap should be addressed as a matter of urgency. Any future policy framework should be principles-based and complement existing police guidance in a 'tech-agnostic' way. Rather than establishing prescriptive rules and standards for different data technologies, the framework should establish standardised processes to ensure that data analytics projects follow recommended routes for the empirical evaluation of algorithms within their operational context and are evaluated against legal requirements and ethical standards. The new guidance should focus on ensuring multi-disciplinary legal, ethical and operational input from the outset of a police technology project; a standard process for model development, testing and evaluation; a clear focus on the human–machine interaction and the ultimate interventions a data-driven process may inform; and ongoing tracking and mitigation of discrimination risk.

    The Evidence Hub: harnessing the collective intelligence of communities to build evidence-based knowledge

    Conventional document and discussion websites provide users with no help in assessing the quality or quantity of evidence behind any given idea. Moreover, the very meaning of what evidence is may not be unequivocally defined within a community, and may require deep understanding, common ground and debate. An Evidence Hub is a tool to pool a community's collective intelligence on what counts as evidence for an idea. It provides an infrastructure for debating and building evidence-based knowledge and practice. An Evidence Hub is best thought of as a filter onto other websites: a map that distills the most important issues, ideas and evidence from the noise by making clear why ideas and web resources may be worth further investigation. This paper describes the Evidence Hub concept and rationale, the breadth of user engagement, and the evolution of specific features derived from our work with different community groups in the healthcare and educational sectors.