1,089 research outputs found

    A Visual Dashboard to Track Learning Analytics for Educational Cloud Computing

    Cloud providers such as Amazon Web Services (AWS) stand out as useful platforms for teaching distributed computing concepts as well as the development of Cloud-native, scalable application architectures on real-world infrastructures. Instructors can benefit from high-level tools to track the progress of students along their learning paths on the Cloud, and this information can be disclosed via educational dashboards so that students understand their progress through the practical activities. To this end, this paper introduces CloudTrail-Tracker, an open-source platform to obtain enhanced usage analytics from a shared AWS account. The tool provides the instructor with a visual dashboard that depicts both the aggregated usage of resources by all the students during a given time frame and the specific usage of AWS by an individual student. To facilitate self-regulation, the dashboard also shows each student's percentage of progress for each lab session and the actions still pending. The dashboard has been integrated into four Cloud subjects that use different learning methodologies (from face-to-face to online learning), and the students positively highlight the usefulness of the tool for Cloud instruction in AWS. This automated collection of evidence of student activity on the Cloud results in near real-time learning analytics, useful both for semi-automated assessment and for students' self-awareness of their own training progress.
    This research was funded by the Spanish Ministerio de Economia, Industria y Competitividad, grant number TIN2016-79951-R (BigCLOE), and by the Vicerrectorado de Estudios, Calidad y Acreditacion of the Universitat Politecnica de Valencia (UPV) to develop the PIME B29.
    Naranjo, D. M.; Prieto, J. R.; Moltó, G.; Calatrava Arroyo, A. (2019). A Visual Dashboard to Track Learning Analytics for Educational Cloud Computing. Sensors, 19(13), 1-15. https://doi.org/10.3390/s19132952
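
    CloudTrail-Tracker builds its analytics from the AWS CloudTrail event log of the shared account. As a rough illustration of the kind of query involved (not the tool's actual implementation; the IAM user name below is hypothetical), per-student activity can be aggregated directly from the CloudTrail API:

```python
# Minimal sketch: count AWS API calls per IAM user from CloudTrail events.
# Assumes a shared AWS account where each student owns one IAM user; the
# aggregation shown here is illustrative, not CloudTrail-Tracker's own logic.
from collections import Counter
from datetime import datetime, timedelta

import boto3

cloudtrail = boto3.client("cloudtrail")

def student_activity(username: str, days: int = 7) -> Counter:
    """Count CloudTrail events recorded for one IAM user over the last `days` days."""
    counts = Counter()
    paginator = cloudtrail.get_paginator("lookup_events")
    pages = paginator.paginate(
        LookupAttributes=[{"AttributeKey": "Username", "AttributeValue": username}],
        StartTime=datetime.utcnow() - timedelta(days=days),
        EndTime=datetime.utcnow(),
    )
    for page in pages:
        for event in page["Events"]:
            counts[event["EventName"]] += 1
    return counts

if __name__ == "__main__":
    # e.g. Counter({'RunInstances': 3, 'CreateBucket': 1, ...})
    print(student_activity("student-01"))
```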

    Beyond Failure: The 2nd LAK Failathon Poster

    This poster will be a chance for a wider LAK audience to engage with the 2nd LAK Failathon workshop. Both of these will build on the successful Failathon event in 2016 and extend beyond discussing individual experiences of failure to exploring how the field can improve, particularly regarding the creation and use of evidence. Failure in research is an increasingly hot topic, with high-profile crises of confidence in the published research literature in medicine and psychology. Among the major factors in this research crisis are the many incentives to report and publish only positive findings. These incentives prevent the field in general from learning from negative findings, and almost entirely preclude the publication of mistakes and errors. Thus, providing an alternative forum for practitioners and researchers to learn from each other's failures can be very productive. The first LAK Failathon, held in 2016, provided just such an opportunity for researchers and practitioners to share their failures and negative findings in a lower-stakes environment, to help participants learn from each other's mistakes. It was very successful, and there was strong support for running it as an annual event. The 2nd LAK Failathon workshop will build on that success, with the twin objectives of providing an environment for individuals to learn from each other's failures and of co-developing plans for how we as a field can better build and deploy our evidence base. This poster is an opportunity for wider feedback on the plans developed in the workshop, with interactive use of sticky notes to add new ideas and coloured dots to illustrate prioritisation. This broadens the participant base in this important work, which should improve the quality of the plans and the commitment of the community to delivering them.

    Attitudes expressed in online comments about environmental factors in the tourism sector: an exploratory study

    The object of this exploratory study is to identify the positive, neutral and negative environmental factors that affect users who visit Spanish hotels, in order to help hotel managers decide how to improve the quality of the services provided. To carry out the research, a sentiment analysis was first performed, grouping the sample of tweets (n = 14,459) according to the feelings expressed, and then a textual analysis was used to identify the key environmental factors behind these feelings, using the qualitative analysis software NVivo (QSR International, Melbourne, Australia). The results of the exploratory study present the key environmental factors that affect the users' experience when visiting hotels in Spain, such as actions that support local traditions and products, the maintenance of rural areas in a way that respects the local environment and nature, or respect for air quality in the areas where hotels have facilities and offer services. The conclusions of the research can help hotels improve their services and their impact on the environment, as well as improve the visitors' experience based on the positive, neutral and negative environmental factors that the visitors themselves identified.
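
    The first step of this pipeline, splitting the tweet sample into positive, neutral and negative groups, can be reproduced with any off-the-shelf polarity classifier. The sketch below uses NLTK's VADER lexicon purely for illustration: the study's own tooling beyond NVivo is not specified, and VADER targets English text, so Spanish tweets would need a Spanish-language lexicon instead.

```python
# Illustrative sketch: split a tweet sample into positive, neutral and negative
# groups with an off-the-shelf lexicon-based classifier (not the study's tool).
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

def group_by_sentiment(tweets):
    """Bucket each tweet by its VADER compound score (standard +/-0.05 thresholds)."""
    groups = {"positive": [], "neutral": [], "negative": []}
    for text in tweets:
        score = sia.polarity_scores(text)["compound"]
        if score >= 0.05:
            groups["positive"].append(text)
        elif score <= -0.05:
            groups["negative"].append(text)
        else:
            groups["neutral"].append(text)
    return groups

sample = ["Lovely rural hotel that respects the local environment",
          "The air quality around the resort was terrible"]
print({label: len(tweets) for label, tweets in group_by_sentiment(sample).items()})
```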

    Harnessing Transparent Learning Analytics for Individualized Support through Auto-detection of Engagement in Face-to-Face Collaborative Learning

    Using learning analytics to investigate and support collaborative learning has been explored for many years. Recently, automated approaches based on various artificial intelligence techniques have provided promising results for modelling and predicting student engagement and performance in collaborative learning tasks. However, due to the lack of transparency and interpretability caused by the use of "black box" approaches in learning analytics design and implementation, deriving guidance for teaching and learning practice may become a challenge. On the one hand, the black box created by machine learning algorithms and models prevents users from obtaining educationally meaningful learning and teaching suggestions. On the other hand, focusing only on group- and cohort-level analysis can make it difficult to provide specific support for individual students working in collaborative groups. This paper proposes a transparent approach to automatically detect students' individual engagement in the process of collaboration. The results show that the proposed approach can reflect students' individual engagement and can be used as an indicator to distinguish students with different collaborative learning challenges (cognitive, behavioural and emotional) and learning outcomes. The potential of the proposed collaboration analytics approach for scaffolding collaborative learning practice in face-to-face contexts is discussed, and future research suggestions are provided.
    Comment: 12 pages, 5 figures
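
    The abstract does not spell out the underlying features, but the "transparent" requirement essentially asks for an engagement indicator whose every term a teacher can inspect. The sketch below is purely hypothetical: the feature names, caps and weights are invented for illustration and are not the paper's model; it only shows what a white-box indicator of this kind could look like.

```python
# Purely illustrative: an interpretable (white-box) engagement indicator built
# from observable per-student counts; not the feature set used in the paper.
from dataclasses import dataclass

@dataclass
class StudentLog:
    speaking_turns: int      # verbal contributions during the group task
    artefact_edits: int      # edits to the shared artefact
    off_task_minutes: float  # time spent off task

def engagement_score(log: StudentLog, session_minutes: float) -> float:
    """Weighted score in [0, 1]; every term and weight is visible and inspectable."""
    verbal = min(log.speaking_turns / 20, 1.0)        # cap contribution at 20 turns
    behavioural = min(log.artefact_edits / 15, 1.0)   # cap contribution at 15 edits
    on_task = 1.0 - min(log.off_task_minutes / session_minutes, 1.0)
    return round(0.4 * verbal + 0.3 * behavioural + 0.3 * on_task, 2)

print(engagement_score(StudentLog(12, 6, 5.0), session_minutes=45.0))  # 0.63
```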

    Sentiment Analysis for Fake News Detection

    In recent years, we have witnessed a rise in fake news, i.e., provably false pieces of information created with the intention to deceive. The dissemination of this type of news poses a serious threat to cohesion and social well-being, since it fosters political polarization and people's distrust of their leaders. The huge amount of news disseminated through social media makes manual verification unfeasible, which has promoted the design and implementation of automatic systems for fake news detection. The creators of fake news use various stylistic tricks to promote the success of their creations, one of them being to excite the sentiments of the recipients. This has led to sentiment analysis, the part of text analytics in charge of determining the polarity and strength of the sentiments expressed in a text, being used in fake news detection approaches, either as the basis of the system or as a complementary element. In this article, we study the different uses of sentiment analysis in the detection of fake news, with a discussion of the most relevant elements and shortcomings, and the requirements that should be met in the near future, such as multilingualism, explainability, mitigation of biases, or the treatment of multimedia elements.
    Xunta de Galicia; ED431G 2019/01
    Xunta de Galicia; ED431C 2020/11
    This work has been funded by FEDER/Ministerio de Ciencia, Innovación y Universidades — Agencia Estatal de Investigación through the ANSWERASAP project (TIN2017-85160-C2-1-R), and by Xunta de Galicia through a Competitive Reference Group grant (ED431C 2020/11). CITIC, as a Research Center of the Galician University System, is funded by the Consellería de Educación, Universidade e Formación Profesional of the Xunta de Galicia through the European Regional Development Fund (ERDF/FEDER) with 80% through the Galicia ERDF 2014-20 Operational Programme, and the remaining 20% from the Secretaría Xeral de Universidades (ref. ED431G 2019/01). David Vilares is also supported by a 2020 Leonardo Grant for Researchers and Cultural Creators from the BBVA Foundation. Carlos Gómez-Rodríguez has also received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (FASTPARSE, grant No. 714150).
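
    As a concrete illustration of the "complementary element" role described above, polarity and strength scores from a lexicon-based analyser can simply be appended to conventional word features before training a classifier. The following sketch uses NLTK's VADER and scikit-learn on toy placeholder texts and labels; it is a generic example, not any particular system surveyed in the article.

```python
# Sketch: sentiment polarity/strength appended as extra features next to word
# n-grams in a fake-news classifier. Texts and labels below are placeholders.
import nltk
import numpy as np
from nltk.sentiment import SentimentIntensityAnalyzer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

nltk.download("vader_lexicon", quiet=True)

texts = ["Shocking! Leader caught in outrageous scandal, share now!!!",
         "The ministry published the annual budget report on Tuesday."]
labels = [1, 0]  # 1 = fake, 0 = legitimate (toy example)

sia = SentimentIntensityAnalyzer()
tfidf = TfidfVectorizer(ngram_range=(1, 2))

word_features = tfidf.fit_transform(texts).toarray()
sentiment_features = np.array(
    [[sia.polarity_scores(t)["compound"],   # overall polarity
      sia.polarity_scores(t)["neg"],        # strength of negative sentiment
      sia.polarity_scores(t)["pos"]]        # strength of positive sentiment
     for t in texts]
)

# Concatenate lexical and sentiment features, then fit a simple linear model.
X = np.hstack([word_features, sentiment_features])
clf = LogisticRegression().fit(X, labels)
print(clf.predict(X))
```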

    Affordances and limitations of learning analytics for computer-assisted language learning: a case study of the VITAL project

    Learning analytics (LA) has emerged as a field that offers promising new ways to support failing or weaker students, prevent drop-out and aid retention. Research also suggests that large datasets of learner activity can be used to understand online learning behaviour and improve pedagogy. While the use of LA in language learning has received little attention to date, available research suggests that understanding language learner behaviour could provide valuable insights into task design for instructors and materials designers, as well as help students adopt effective learning strategies and personalised learning pathways. This paper first discusses previous research in the field of language learning and teaching based on learner tracking, the specific affordances of LA for CALL, and its inherent limitations and challenges. The second part of the paper analyses data arising from the European Commission (EC) funded VITAL project, which adopted a bottom-up pedagogical approach to LA and implemented learner activity tracking in different blended or distance learning settings. Drawing on data from 285 undergraduate students on a Business French course at Hasselt University that used a flipped classroom design, statistical and process-mining techniques were applied to map and visualise actual uses of online learning resources over the course of one semester. Results suggested that most students planned their self-study sessions in accordance with the flipped classroom design, both in the timing of their online activity and in their selection of content. Other metrics measuring active online engagement, a crucial component of successful flipped learning, indicated significant differences between successful and non-successful students. Meaningful learner patterns were revealed in the data, visualising students' paths through the online learning environment and their uses of the different activity types. The research indicated that valuable insights for instructors, course designers and students can be acquired through the tracking and analysis of language learner data and the use of visualisation and process-mining tools.
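
    The process-mining step described above boils down to reconstructing, per student, which activity type directly follows which in the event log. A small sketch of such a directly-follows count with pandas is given below; the column names and activity labels are assumptions for illustration, not the VITAL data schema.

```python
# Sketch of a process-mining-style "directly follows" count over an activity log.
# Column names and activity labels are assumptions, not the VITAL data schema.
import pandas as pd

log = pd.DataFrame({
    "student": ["s1", "s1", "s1", "s2", "s2", "s2"],
    "timestamp": pd.to_datetime(["2017-02-01 10:00", "2017-02-01 10:05",
                                 "2017-02-01 10:20", "2017-02-02 09:00",
                                 "2017-02-02 09:10", "2017-02-02 09:30"]),
    "activity": ["video", "vocabulary_exercise", "grammar_exercise",
                 "video", "video", "quiz"],
})

# Order each student's events in time, then pair every event with its successor.
log = log.sort_values(["student", "timestamp"])
log["next_activity"] = log.groupby("student")["activity"].shift(-1)

# Frequency of each directly-follows transition: the basic building block of
# the process maps produced by dedicated tools such as Disco or pm4py.
transitions = (log.dropna(subset=["next_activity"])
                  .groupby(["activity", "next_activity"]).size()
                  .sort_values(ascending=False))
print(transitions)
```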

    Reflective Writing Analytics - Empirically Determined Keywords of Written Reflection

    Despite their importance for educational practice, reflective writings are still manually analysed and assessed, which constrains the use of this educational technique. Recently, research has started to investigate automated approaches for analysing reflective writing. Foundational to many automated approaches is knowledge of the words that are important for the genre. This research presents keywords that are specific to several categories of a reflective writing model. These keywords have been derived, using the log-likelihood method, from eight datasets containing several thousand instances. Both performance measures, accuracy and Cohen's κ, were estimated for these keywords with ten-fold cross-validation. The results reached an accuracy of 0.78 on average across all eight categories and a fair to good inter-rater reliability for most categories, even though the approach did not make use of any sophisticated rule-based mechanisms or machine learning techniques. This research contributes to the development of automated reflective writing analytics that are based on data-driven empirical foundations.
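
    The log-likelihood (keyness) statistic referred to above compares how often a word occurs in a target corpus (e.g. sentences annotated with one reflection category) with its frequency in a reference corpus. A compact sketch of this scoring, on toy placeholder corpora, might look as follows:

```python
# Sketch of log-likelihood (keyness) keyword scoring between a target corpus
# (e.g. texts of one reflective-writing category) and a reference corpus.
import math
from collections import Counter

def log_likelihood(a, b, c, d):
    """Dunning log-likelihood for a word occurring a times in a target corpus of
    c tokens and b times in a reference corpus of d tokens."""
    e1 = c * (a + b) / (c + d)  # expected frequency in the target corpus
    e2 = d * (a + b) / (c + d)  # expected frequency in the reference corpus
    ll = 0.0
    if a > 0:
        ll += a * math.log(a / e1)
    if b > 0:
        ll += b * math.log(b / e2)
    return 2 * ll

def keywords(target_tokens, reference_tokens, top_n=5):
    """Rank target-corpus words by how strongly they are over-represented."""
    tf, rf = Counter(target_tokens), Counter(reference_tokens)
    c, d = sum(tf.values()), sum(rf.values())
    scores = {w: log_likelihood(tf[w], rf.get(w, 0), c, d) for w in tf}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

target = "i realised i felt unsure and i learned to plan better".split()
reference = "the report describes the system and the results".split()
print(keywords(target, reference))
```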

    Insights from Learning Analytics for Hands-On Cloud Computing Labs in AWS

    Cloud computing instruction requires hands-on experience with a myriad of distributed computing services from a public cloud provider. Tracking the progress of the students, especially in online courses, requires automatically gathering evidence and producing learning analytics in order to further determine the behavior and performance of students. With this aim, this paper describes the experience, in an online course on cloud computing with Amazon Web Services, of creating an open-source data processing tool to systematically obtain learning analytics related to the hands-on activities carried out throughout the course. These data, combined with the data obtained from the learning management system, have allowed a better characterization of the behavior of students in the course. Insights from a population of more than 420 online students across three academic years have been assessed, and the dataset has been released for increased reproducibility. The results corroborate that course length has an impact on online student dropout. In addition, a gender analysis pointed out that there are no statistically significant differences in the final marks between genders, but women show a higher degree of commitment to the activities planned in the course.
    This research was funded by the Spanish Ministerio de Economia, Industria y Competitividad through grant number TIN2016-79951-R (BigCLOE), by the Vicerrectorado de Estudios, Calidad y Acreditacion of the Universitat Politecnica de Valencia (UPV) to develop PIME B29 and PIME/19-20/166, and by the Conselleria d'Innovacio, Universitat, Ciencia i Societat Digital for the project "CloudSTEM" with reference number AICO/2019/313.
    Moltó, G.; Naranjo-Delgado, D. M.; Segrelles Quilis, J. D. (2020). Insights from Learning Analytics for Hands-On Cloud Computing Labs in AWS. Applied Sciences, 10(24), 1-13. https://doi.org/10.3390/app10249148
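
    The gender comparison of final marks mentioned above relies on non-parametric tests that tolerate unequal group sizes and variances (Mann-Whitney U, Kruskal-Wallis). A minimal sketch of the two-group comparison with SciPy, on placeholder marks rather than the released dataset, is:

```python
# Sketch of a non-parametric comparison of final marks between two groups of
# unequal size (Mann-Whitney U). The marks below are placeholders, not the dataset.
from scipy.stats import mannwhitneyu

marks_women = [7.5, 8.0, 9.1, 6.8, 8.7, 7.9]
marks_men   = [6.9, 8.2, 7.4, 8.8, 7.1, 6.5, 9.0, 7.7]

stat, p_value = mannwhitneyu(marks_women, marks_men, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
# A p-value above the conventional 0.05 threshold would indicate no statistically
# significant difference between the two distributions of marks.
```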