
    Student-Centered Learning: Functional Requirements for Integrated Systems to Optimize Learning

    The realities of the 21st-century learner require that schools and educators fundamentally change their practice. "Educators must produce college- and career-ready graduates that reflect the future these students will face. And, they must facilitate learning through means that align with the defining attributes of this generation of learners." Today, we know more than ever about how students learn, acknowledging that the process isn't the same for every student and doesn't remain the same for each individual, depending upon maturation and the content being learned. We know that students want to progress at a pace that allows them to master new concepts and skills, to access a variety of resources, to receive timely feedback on their progress, to demonstrate their knowledge in multiple ways and to get direction, support and feedback from—as well as collaborate with—experts, teachers, tutors and other students. The result is a growing demand for student-centered, transformative digital learning using competency education as an underpinning. iNACOL released this paper to illustrate the technical requirements and functionalities that learning management systems need to shift toward student-centered instructional models. This comprehensive framework will help districts and schools determine which systems to use and integrate as they begin their journey toward student-centered learning, as well as how systems integration aligns with their organizational vision, educational goals and strategic plans. Educators can use this report to optimize student learning and promote innovation in their own student-centered learning environments. The report will help school leaders understand the complex technologies needed to optimize personalized learning and how to use data and analytics to improve practices, and can assist technology leaders in re-engineering systems to support the key nuances of student-centered learning.

    Delving into instructor‐led feedback interventions informed by learning analytics in massive open online courses

    Background: Providing feedback in massive open online courses (MOOCs) is challenging due to the massiveness and heterogeneity of the learner population. Learning analytics (LA) solutions aim to scale up feedback interventions and support instructors in this endeavour. Objectives: This paper focuses on instructor-led feedback mediated by LA tools in MOOCs. Our goal is to answer how, and to what extent, data-driven feedback is provided to learners, and what its impact is. Methods: We conducted a systematic literature review of state-of-the-art LA-informed instructor-led feedback in MOOCs. From a pool of 227 publications, we selected 38 articles that address the topic of LA-informed feedback in MOOCs mediated by instructors. We applied etic content analysis to the collected data. Results and Conclusions: The results revealed a lack of empirical studies exploring LA to deliver feedback, and limited attention to pedagogy to inform feedback practices. Our findings suggest the need for systematization and evaluation of feedback. Additionally, there is a need for conceptual tools to guide instructors in the design of LA-based feedback. Takeaways: We point out the need for systematization and evaluation of feedback. We envision that this research can support the design of LA-based feedback, thus contributing to bridging the gap between pedagogy and data-driven practice in MOOCs. Funding: Estonian Research Council (PSG286); Ministerio de Ciencia e Innovación - European Regional Development Fund and Agencia Nacional de Investigación (grants PID2020-112584RB-C32 and TIN2017-85179-C3-2-R); Junta de Castilla y León - European Social Fund and the Regional Council of Education (grant E-47-2018-0108488).

    Mining Social Media for Newsgathering: A Review

    Social media is becoming an increasingly important data source for learning about breaking news and for following the latest developments of ongoing news stories. This is in part made possible by mobile devices, which allow anyone with Internet access to post updates from anywhere, leading in turn to a growing presence of citizen journalism. Consequently, social media has become a go-to resource for journalists during the process of newsgathering. Using social media for newsgathering is, however, challenging, and suitable tools are needed to facilitate access to useful information for reporting. In this paper, we provide an overview of research in data mining and natural language processing for mining social media for newsgathering. We discuss five different areas that researchers have worked on to mitigate the challenges inherent to social media newsgathering: news discovery, curation of news, validation and verification of content, newsgathering dashboards, and other tasks. We outline the progress made so far in the field, summarise the current challenges, and discuss future directions in the use of computational journalism to assist with social media newsgathering. This review is relevant to computer scientists researching news in social media as well as to interdisciplinary researchers interested in the intersection of computer science and journalism. Comment: Accepted for publication in Online Social Networks and Media.
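    As a rough illustration of the "news discovery" area surveyed in this review (not a method taken from the paper itself), event discovery is often approached as burst detection over the volume of posts matching a keyword. The minimal sketch below assumes only Unix timestamps as input; the window size and threshold are arbitrary choices for demonstration.

```python
# Minimal burst-detection sketch for news discovery over a keyword's post stream.
# Window size and z-score threshold are illustrative assumptions, not tuned values.
from collections import Counter
from statistics import mean, stdev

def detect_bursts(timestamps, window_seconds=300, z_threshold=3.0):
    """Return start times of windows whose post volume spikes well above average."""
    counts = Counter(int(t) // window_seconds for t in timestamps)
    if len(counts) < 3:
        return []
    windows = sorted(counts)
    volumes = [counts[w] for w in windows]
    mu, sigma = mean(volumes), stdev(volumes)
    return [w * window_seconds for w, v in zip(windows, volumes)
            if sigma > 0 and (v - mu) / sigma >= z_threshold]

# Toy usage: a sudden spike of posts in the last five-minute window is flagged.
posts = [0, 10, 320, 330, 650, 660, 900, 901, 902, 903, 904, 905, 906, 907]
print(detect_bursts(posts, window_seconds=300, z_threshold=1.5))  # -> [900]
```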

    A Systematic Review of Teacher-Facing Dashboards for Collaborative Learning Activities and Tools in Online Higher Education

    Dashboards for online higher education support monitoring and evaluation of students’ interactions, but they are mostly limited to interactions occurring within learning management systems. In this study, we sought to find out which collaborative learning activities and tools in online higher education are covered by teaching dashboards. Following Kitchenham’s procedure for systematic reviews, 36 papers were identified according to this focus and analysed. The results identify dashboards supporting collaborative tools, both synchronous and asynchronous, across categories such as learning management systems, communication tools, social media, computer programming code management platforms, project management platforms, and collaborative writing tools. Dashboard support was also found for collaborative activities, grouped under four categories of forum discussion activities, three categories of communication activities, and four categories of collaborative editing/sharing activities, though most of the analysed dashboards support no more than two or three collaborative tools. This points to a need for further research on how to develop dashboards that combine data from a more diverse set of collaborative activities and tools. This work was supported by the TRIO project funded by the European Union’s Erasmus+ KA220-ADU – Cooperation partnerships in adult education programme under grant agreement no. KA220-ADU-1B9975F8.

    A DESIGN STUDY TO ENHANCE PERFORMANCE DASHBOARDS TO IMPROVE THE DECISION-MAKING PROCESS

    Performance dashboards are tools that can be used to improve decision making in an organisation (Henke et al., 2016). Nevertheless, organisations have trouble finding the right person to integrate and analyse their data (Henke et al., 2016). This is not solely because the data analyst lacks the capabilities, but also because there is an information imbalance between the management board and the data engineer. We live in a digital era, and data plays an important role for organisations (McGee, Prusak and Pyburn, 1993). This thesis aims to address this problem by creating an artefact that enhances performance dashboards with explanatory business diagnoses, reducing the imbalance between the management board and the data engineers and improving decisions in the organisation.
    The first chapter starts with the practical and scientific relevance and gives reasons why an artefact is needed. The research question and sub-questions are formulated, and the scope of the thesis is described. The second chapter focuses on the history of business intelligence (BI) and the role of BI in performance dashboards; the two are closely related. Furthermore, the characteristics of performance dashboards and different performance dashboard tools are discussed. Multiple articles are combined to form four important characteristics for performance dashboards:
    1. Flexibility: a performance dashboard needs to be easy to modify, usable by multiple users, and able to personalise the overview page.
    2. Interactive: a performance dashboard needs to support drill-down, monitor KPIs, and show more than just graphs.
    3. Visual: a performance dashboard needs to give a visual overview of accurate data from the past up to the present day.
    4. External benchmarking: a performance dashboard needs to be able to compare results with competitors and make prescriptive and predictive analyses based on the data.
    The chapter ends with a comparison of different performance dashboard tools to find the most suitable one for this research; Power BI is chosen because it is easy to use and free. The third chapter focuses on the decision-making process, building on Mintzberg (1970), Endsley and Garland (2000), and Eppler and Mengis (2004). Information influences the decision-making process, but it can also lead to information overload (Eppler and Mengis, 2004). The chapter gives an overview of important factors in the decision-making process, which are then used to improve the performance dashboard. Chapter four covers business diagnoses and explains the model of the artefact, which is based on an article by Daniels and Feelders (2001). That article states that a good business diagnosis follows six steps (sketched in code after this abstract):
    1. determine the actual data (normalised/absolute, scaled or not scaled);
    2. determine the reference data (normalised/absolute, scaled or not scaled);
    3. get the model relations from the star scheme;
    4. compute the influence of the reference data to determine causes;
    5. filter causes to avoid information overload;
    6. build a visual explanation tree of the causes.
    These steps are used to create the artefact. Chapter five analyses a new business diagnosis tool in Power BI, the decomposition tree, and assesses whether it is useful for automated business diagnosis. The artefact is described in chapter six, where different graphs and outcomes are displayed. The research ends with a conclusion on the advantages of the artefact, its limitations, and future research.
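    The thesis implements its artefact in Power BI; purely as an illustration of the six diagnosis steps listed above, a hedged Python sketch could be structured as follows. The column names, the toy influence measure (simple deviation from reference), and the data are illustrative assumptions, not the thesis's implementation.

```python
# Hypothetical sketch of the six-step business-diagnosis flow described above.
def diagnose(actual, reference, model_relations, top_n=3):
    """Compare actual vs. reference figures and report the largest deviations.

    actual / reference: dicts mapping a metric (e.g. a cost category) to its value.
    model_relations: parent metric -> list of child metrics (a simplified star scheme).
    """
    # Steps 1-2: the caller supplies actual and reference data (already scaled/normalised).
    # Step 3: the model relations say which child metrics explain a parent metric.
    # Step 4: compute each child's influence as its deviation from the reference value.
    influences = {}
    for parent, children in model_relations.items():
        for child in children:
            influences[child] = actual.get(child, 0.0) - reference.get(child, 0.0)
    # Step 5: filter causes to avoid information overload (keep only the top_n deviations).
    causes = sorted(influences.items(), key=lambda kv: abs(kv[1]), reverse=True)[:top_n]
    # Step 6: a textual stand-in for the visual explanation tree.
    for metric, delta in causes:
        print(f"profit <- {metric}: deviation {delta:+.1f}")
    return causes

# Toy usage with invented figures.
actual = {"revenue": 95.0, "cost_of_sales": 60.0, "overhead": 20.0}
reference = {"revenue": 100.0, "cost_of_sales": 55.0, "overhead": 20.0}
diagnose(actual, reference, {"profit": ["revenue", "cost_of_sales", "overhead"]})
```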

    Assisting Forensic Identification through Unsupervised Information Extraction of Free Text Autopsy Reports: The Disappearances Cases during the Brazilian Military Dictatorship

    Anthropological, archaeological, and forensic studies situate enforced disappearance as a strategy associated with the Brazilian military dictatorship (1964–1985), leaving hundreds of persons whose identities and causes of death were never established. Their forensic reports are the only existing clue for identifying these people and detecting possible crimes associated with them. The exchange of information among institutions about the identities of disappeared people was not a common practice. Thus, analysing the reports requires unsupervised techniques, mainly because contextual annotation is extremely time-consuming, difficult to obtain, and highly dependent on the annotator. These techniques allow researchers to assist identification and analysis in four areas: common causes of death, relevant body locations, personal belongings terminology, and correlations between actors such as doctors and police officers involved in the disappearances. This paper analyzes almost 3,000 textual reports of missing persons in the city of São Paulo during the Brazilian dictatorship through unsupervised information extraction algorithms for Portuguese, identifying named entities and relevant terminology associated with these four criteria. The analysis allowed us to observe terminological patterns relevant for identification (e.g., the presence of rings or similar personal belongings) and to automate the study of correlations between actors. The proposed system acts as a first classificatory and indexing middleware for the reports and represents a feasible system that can assist researchers searching for patterns among autopsy reports. This research was partially funded by the Spanish Ministry of Economy, Industry and Competitiveness under its Competitive Juan de la Cierva Postdoctoral Research Programme (grant FJCI-2016-28032) and by the European Union through the Marie Skłodowska-Curie Innovative Training Network ‘CHEurope: Critical Heritage Studies and the Future of Europe’, H2020 Marie Skłodowska-Curie Actions (grant 722416).
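    The abstract does not specify the paper's pipeline; as one plausible illustration of named-entity and terminology extraction over Portuguese text, the sketch below uses spaCy's off-the-shelf `pt_core_news_sm` model, which is an assumption rather than the authors' toolkit, and a fabricated example sentence rather than real report data.

```python
# Illustrative only: spaCy's Portuguese model is one possible tool for this kind of
# extraction; the paper's own pipeline may differ. Requires:
#   pip install spacy && python -m spacy download pt_core_news_sm
from collections import Counter
import spacy

nlp = spacy.load("pt_core_news_sm")

# A fabricated, non-sensitive sentence standing in for a report fragment.
text = "O corpo foi encontrado no bairro da Vila Formosa, portando um anel de prata."
doc = nlp(text)

# Named entities (people, places, organisations) support cross-report correlation of actors.
for ent in doc.ents:
    print(ent.label_, ent.text)

# Frequent noun lemmas give a crude approximation of domain terminology
# (e.g. personal belongings such as "anel" / ring).
terms = Counter(tok.lemma_.lower() for tok in doc if tok.pos_ == "NOUN")
print(terms.most_common(5))
```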

    Dashboard Framework. A Tool for Threat Monitoring on the Example of Covid-19

    The aim of the study is to create a dashboard framework to monitor the spread of the Covid-19 pandemic based on quantitative and qualitative data processing. The theoretical part sets out the basic assumptions underlying the concept of the dashboard framework. The paper presents the most important functions of the dashboard framework and examples of its adoption. The limitations related to developing the dashboard framework are also indicated. As part of the empirical research, an original model of the Dash-Cov framework was designed, enabling the acquisition and processing of quantitative and qualitative data on the spread of the SARS-CoV-2 virus. The developed model was pre-validated: over 25,000 records and around 100,000 tweets were analyzed. The adopted research methods included statistical analysis and text analysis, in particular sentiment analysis and topic modeling.
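    The abstract does not detail how Dash-Cov implements its qualitative processing; as a generic illustration of one of the two techniques it names, topic modeling, the sketch below runs scikit-learn's LDA over a handful of stand-in tweets. The texts, topic count, and vectoriser settings are assumptions for demonstration; sentiment analysis would typically be layered on with a lexicon- or model-based scorer.

```python
# Generic topic-modeling illustration on short texts; not the Dash-Cov pipeline.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [  # stand-in examples, not data from the study
    "hospital beds are full and icu capacity is critical",
    "new lockdown rules announced for the capital region",
    "vaccine rollout reaches care homes this week",
    "schools closed again under the new lockdown rules",
    "second vaccine dose scheduled for health workers",
]

vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(tweets)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

# Print the top terms per topic as a quick, human-readable summary.
vocab = vectorizer.get_feature_names_out()
for idx, topic in enumerate(lda.components_):
    top_terms = [vocab[i] for i in topic.argsort()[-5:][::-1]]
    print(f"topic {idx}: {', '.join(top_terms)}")
```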