18,839 research outputs found

    Big data for monitoring educational systems

    This report considers “how advances in big data are likely to transform the context and methodology of monitoring educational systems within a long-term perspective (10-30 years) and impact the evidence based policy development in the sector”. Big data are defined as “large amounts of different types of data produced with high velocity from a high number of various types of sources”. Five independent experts were commissioned by Ecorys to respond to the themes of student privacy, educational equity and efficiency, student tracking, assessment, and skills. The experts were asked to consider the “macro perspective on governance on educational systems at all levels from primary, secondary education and tertiary – the latter covering all aspects of tertiary from further, to higher, and to VET”, prioritising the primary and secondary levels of education.

    Student Privacy in Learning Analytics: An Information Ethics Perspective

    In recent years, educational institutions have started using the tools of commercial data analytics in higher education. By gathering information about students as they navigate campus information systems, learning analytics “uses analytic techniques to help target instructional, curricular, and support resources” to examine student learning behaviors and change students’ learning environments. As a result, the information educators and educational institutions have at their disposal is no longer demarcated by course content and assessments, and old boundaries between information used for assessment and information about how students live and work are blurring. Our goal in this paper is to provide a systematic discussion of the ways in which privacy and learning analytics conflict and to provide a framework for understanding those conflicts. We argue that there are five crucial issues about student privacy that we must address in order to ensure that whatever the laudable goals and gains of learning analytics, they are commensurate with respecting students’ privacy and associated rights, including (but not limited to) autonomy interests. First, we argue that we must distinguish among different entities with respect to whom students have, or lack, privacy. Second, we argue that we need clear criteria for what information may justifiably be collected in the name of learning analytics. Third, we need to address whether purported consequences of learning analytics (e.g., better learning outcomes) are justified and what the distributions of those consequences are. Fourth, we argue that regardless of how robust the benefits of learning analytics turn out to be, students have important autonomy interests in how information about them is collected. Finally, we argue that it is an open question whether the goods that justify higher education are advanced by learning analytics, or whether collection of information actually runs counter to those goods.

    ‘A double-edged sword. This is powerful but it could be used destructively’: Perspectives of early career education researchers on learning analytics

    Learning analytics has been increasingly outlined as a powerful tool for measuring, analysing, and predicting learning experiences and behaviours. The rising use of learning analytics means that many educational researchers now require new ranges of technical analytical skills to contribute to an increasingly data-heavy field. However, it has been argued that educational data scientists are a ‘scarce breed’ (Buckingham Shum et al., 2013) and that more resources are needed to support the next generation of early career researchers in the education field. At the same time, little is known about how early career education researchers feel about learning analytics and whether it is important to their current and future research practices. Using a thematic analysis of discussions from a participatory learning analytics workshop with 25 early career education researchers, we outline in this article their ambitions, challenges, and anxieties regarding learning analytics. In doing so, we provide a roadmap for how the learning analytics field might evolve, as well as practical implications for supporting early career researchers’ development.

    Insights from Learning Analytics for Hands-On Cloud Computing Labs in AWS

    [EN] Cloud computing instruction requires hands-on experience with a myriad of distributed computing services from a public cloud provider. Tracking the progress of students, especially in online courses, requires automatically gathering evidence and producing learning analytics in order to further determine the behavior and performance of students. With this aim, this paper describes the experience, from an online course in cloud computing with Amazon Web Services, of creating an open-source data processing tool to systematically obtain learning analytics related to the hands-on activities carried out throughout the course. These data, combined with the data obtained from the learning management system, have allowed a better characterization of the behavior of students in the course. Insights from a population of more than 420 online students across three academic years have been assessed, and the dataset has been released for increased reproducibility. The results corroborate that course length has an impact on online student dropout. In addition, a gender analysis pointed out that there are no statistically significant differences in final marks between genders, but women show a greater degree of commitment to the activities planned in the course.

    This research was funded by the Spanish “Ministerio de Economía, Industria y Competitividad” through grant number TIN2016-79951-R (BigCLOE), the “Vicerrectorado de Estudios, Calidad y Acreditación” of the Universitat Politècnica de València (UPV) to develop PIME B29 and PIME/19-20/166, and by the Conselleria d'Innovació, Universitat, Ciència i Societat Digital for the project “CloudSTEM”, reference number AICO/2019/313. Moltó, G.; Naranjo-Delgado, D. M.; Segrelles Quilis, J. D. (2020). Insights from Learning Analytics for Hands-On Cloud Computing Labs in AWS. Applied Sciences, 10(24), 1-13. https://doi.org/10.3390/app10249148
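
    The paper's open-source tool itself is not reproduced here, but the core idea of deriving per-student activity evidence from AWS CloudTrail logs can be sketched. The snippet below is a minimal hypothetical illustration, assuming only CloudTrail's documented JSON layout (a top-level `Records` list whose entries carry `eventName` and `userIdentity.userName`); the field choices and the `count_student_actions` helper are this sketch's own, not the authors' implementation.

```python
import json
from collections import Counter

def count_student_actions(cloudtrail_json: str) -> Counter:
    """Tally AWS API calls per IAM user from a CloudTrail log file body."""
    log = json.loads(cloudtrail_json)
    actions = Counter()
    for record in log.get("Records", []):
        # Each CloudTrail record identifies the caller under userIdentity.
        user = record.get("userIdentity", {}).get("userName", "unknown")
        actions[user] += 1
    return actions

# Toy log with three API calls by two hypothetical student IAM users.
sample = json.dumps({"Records": [
    {"eventName": "RunInstances", "userIdentity": {"userName": "student01"}},
    {"eventName": "CreateBucket", "userIdentity": {"userName": "student01"}},
    {"eventName": "RunInstances", "userIdentity": {"userName": "student02"}},
]})
print(count_student_actions(sample))  # Counter({'student01': 2, 'student02': 1})
```

    Per-user tallies like these, joined with LMS records, are the kind of evidence the study aggregates before applying its statistical tests.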

    Stability and sensitivity of Learning Analytics based prediction models

    Learning analytics seeks to enhance learning processes through systematic measurement of learning-related data and to provide informative feedback to learners and educators. Track data from Learning Management Systems (LMS) constitute a main data source for learning analytics. This empirical contribution provides an application of Buckingham Shum and Deakin Crick’s theoretical framework of dispositional learning analytics: an infrastructure that combines learning dispositions data with data extracted from computer-assisted, formative assessments and LMSs. In two cohorts of a large introductory quantitative methods module, 2049 students were enrolled in a module based on principles of blended learning, combining face-to-face Problem-Based Learning sessions with e-tutorials. We investigated the predictive power of learning dispositions, outcomes of continuous formative assessments, and other system-generated data in modelling student performance, as well as their potential to generate informative feedback. From a dynamic, longitudinal perspective, computer-assisted formative assessments appear to be the best predictor of academic performance and of underperforming students, while basic LMS data did not substantially predict learning. If timely feedback is crucial, both use-intensity-related track data from e-tutorial systems and learning dispositions are valuable sources for feedback generation.
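
    The prediction task described above (formative-assessment scores as predictors of student performance) is commonly operationalised as a binary classifier that flags students at risk of underperforming. The sketch below is a hypothetical, self-contained illustration using logistic regression trained by gradient descent on invented toy scores; it is not the study's model, and the data, learning rate, and `at_risk` threshold are this example's own assumptions.

```python
import math

def train_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit w, b for P(pass) = sigmoid(w*x + b) on 1-D formative scores."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (p - y) * x   # gradient of log-loss w.r.t. w
            gb += (p - y)       # gradient of log-loss w.r.t. b
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

# Invented toy data: average formative score (0-10) and pass (1) / fail (0).
scores = [2, 3, 4, 5, 6, 7, 8, 9]
passed = [0, 0, 0, 1, 1, 1, 1, 1]
w, b = train_logistic(scores, passed)

def at_risk(score, threshold=0.5):
    """Flag a student whose predicted pass probability falls below threshold."""
    return 1.0 / (1.0 + math.exp(-(w * score + b))) < threshold

print(at_risk(2), at_risk(9))
```

    In a dispositional-analytics setting, the single score here would be replaced by a feature vector combining disposition questionnaire scales, formative-assessment outcomes, and LMS track data.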

    Big data in higher education: an action research on managing student engagement with business intelligence

    This research aims to explore the value of Big Data in student engagement management. It presents an action research study on applying business intelligence (BI) in a UK higher education institution that has developed and implemented a student engagement tracking system (SES) for better student engagement management. The SES collects data from various sources, including RFID tracking devices across many locations on campus and student online activities. This publicly funded research project has enhanced the current SES with BI solutions and raised awareness of the value of Big Data in improving student experience. The action research concerns the organisation-wide development and deployment of an Intelligent Student Engagement System involving a diverse range of stakeholders. The activities undertaken to date have revealed interesting findings and implications for advancing our understanding of, and research into, leveraging the benefits of Big Data in Higher Education from a socio-technical perspective.
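
    A first BI step over RFID swipe data of the kind described is aggregating raw events into a per-student engagement signal. The snippet below is a minimal hypothetical sketch, assuming swipe events arrive as `(student_id, location, ISO timestamp)` tuples and using ISO calendar weeks as the reporting period; the data shape and the `weekly_engagement` helper are illustrative assumptions, not the institution's actual SES schema.

```python
from collections import defaultdict
from datetime import datetime

def weekly_engagement(events):
    """Aggregate (student_id, location, ISO timestamp) RFID swipes into
    per-student swipe counts per ISO calendar week, a crude engagement proxy."""
    counts = defaultdict(lambda: defaultdict(int))
    for student, _location, ts in events:
        # isocalendar() gives (year, week, weekday); keep (year, week) as the bucket.
        week = datetime.fromisoformat(ts).isocalendar()[:2]
        counts[student][week] += 1
    return {s: dict(weeks) for s, weeks in counts.items()}

# Invented swipe events for two hypothetical students.
swipes = [
    ("s001", "library",   "2016-10-03T09:15:00"),
    ("s001", "lab-2",     "2016-10-04T14:00:00"),
    ("s001", "library",   "2016-10-12T10:30:00"),
    ("s002", "lecture-a", "2016-10-03T09:00:00"),
]
print(weekly_engagement(swipes))
```

    A BI dashboard would typically trend these weekly counts per student and highlight sharp drops as early-warning signals for engagement staff.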

    Student-Centered Learning: Functional Requirements for Integrated Systems to Optimize Learning

    The realities of the 21st-century learner require that schools and educators fundamentally change their practice. "Educators must produce college- and career-ready graduates that reflect the future these students will face. And, they must facilitate learning through means that align with the defining attributes of this generation of learners." Today, we know more than ever about how students learn, acknowledging that the process isn't the same for every student and doesn't remain the same for each individual, depending upon maturation and the content being learned. We know that students want to progress at a pace that allows them to master new concepts and skills, to access a variety of resources, to receive timely feedback on their progress, to demonstrate their knowledge in multiple ways and to get direction, support and feedback from—as well as collaborate with—experts, teachers, tutors and other students. The result is a growing demand for student-centered, transformative digital learning using competency education as an underpinning. iNACOL released this paper to illustrate the technical requirements and functionalities that learning management systems need to shift toward student-centered instructional models. This comprehensive framework will help districts and schools determine what systems to use and integrate as they begin their journey toward student-centered learning, as well as how systems integration aligns with their organizational vision, educational goals and strategic plans. Educators can use this report to optimize student learning and promote innovation in their own student-centered learning environments. The report will help school leaders understand the complex technologies needed to optimize personalized learning and how to use data and analytics to improve practices, and can assist technology leaders in re-engineering systems to support the key nuances of student-centered learning.

    Data analytics 2016: proceedings of the fifth international conference on data analytics
