Practitioner Track Proceedings of the 6th International Learning Analytics & Knowledge Conference (LAK16)
Practitioners spearhead a significant portion of learning analytics, relying on implementation and experimentation rather than on traditional academic research. Both approaches help to improve the state of the art. The LAK conference has created a practitioner track for submissions, which first ran in 2015 as an alternative to the researcher track.
The primary goal of the practitioner track is to share thoughts and findings that stem from learning analytics project implementations. While both large and small implementations are considered, all practitioner track submissions are required to relate to initiatives that are designed for large-scale and/or long-term use (as opposed to research-focused initiatives). Other guidelines include:
• Implementation track record: The project should have been used by an institution or have been deployed on a learning site. There are no hard guidelines about user numbers or how long the project has been running.
• Learning/education related: Submissions have to describe work that addresses learning/academic analytics, either at an educational institution or in an area (such as corporate training, health care or informal learning) where the goal is to improve the learning environment or learning outcomes.
• Institutional involvement: Neither submissions nor presentations have to include a named person from an academic institution. However, all submissions have to include information collected from people who have used the tool or initiative in a learning environment (such as faculty, students, administrators and trainees).
• No sales pitches: While submissions from commercial suppliers are welcome, reviewers do not accept overt (or covert) sales pitches. Reviewers look for evidence that a presentation will take into account challenges faced, problems that have arisen, and/or user feedback that needs to be addressed.
Submissions are limited to 1,200 words, including an abstract, a summary of deployment with end users, and a full description. Most papers in the proceedings are therefore short, and often informal, although some authors chose to extend their papers once they had been accepted.
Papers accepted in 2016 fell into two categories.
• Practitioner Presentations Presentation sessions are designed to focus on deployment of a single learning analytics tool or initiative.
• Technology Showcase The Technology Showcase event enables practitioners to demonstrate new and emerging learning analytics technologies that they are piloting or deploying.
Both types of paper are included in these proceedings.
Big data in higher education: an action research on managing student engagement with business intelligence
This research aims to explore the value of Big Data in student engagement management. It presents an action research on applying BI in a UK higher education institution that has developed and implemented a student engagement tracking system (SES) for better student engagement management. The SES collects data from various sources, including RFID tracking devices across many locations on the campus and student online activities. This publicly funded research project has enhanced the current SES with BI solutions and raised awareness of the value of Big Data in improving student experience. The action research concerns the organisation-wide development and deployment of an Intelligent Student Engagement System involving a diverse range of stakeholders. The activities undertaken to date have revealed interesting findings and implications for advancing our understanding and research in leveraging the benefits of Big Data in Higher Education from a socio-technical perspective.
Intelligent student engagement management: applying business intelligence in higher education
Advances in emerging ICT have enabled organisations to develop innovative ways to intelligently collect data that were not possible before. However, this leads to an explosion of data and unprecedented challenges in making strategic and effective use of available data. This research-in-progress paper presents an action research focusing on applying business intelligence (BI) in a UK higher education institution that has developed a student engagement tracking system (SES) for student engagement management. The current system serves merely as a data collection and processing system, which needs significant enhancement for better decision support. This action research aims to enhance the current SES with BI solutions and explore its strategic use. The research follows a socio-technical approach in its effort to make the BI application a success. Progress and experience so far have revealed interesting findings on advancing our understanding and research in organisation-wide BI for better decision-making.
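The two abstracts above describe a system that fuses heterogeneous engagement signals (RFID swipes across campus and online learning activity) into decision-support data. The papers do not publish their schema or weighting scheme, so the following is only an illustrative sketch with hypothetical field names, showing one common pattern: weighting events by source and aggregating them per student per week.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical event records; real SES sources and fields are assumptions.
events = [
    {"student": "s01", "source": "rfid", "ts": "2016-03-01T09:02"},
    {"student": "s01", "source": "vle",  "ts": "2016-03-01T20:15"},
    {"student": "s02", "source": "rfid", "ts": "2016-03-01T09:05"},
]

def weekly_engagement(events, weights=None):
    """Sum weighted event counts per (student, ISO week).

    The weights are illustrative, not taken from the papers.
    """
    if weights is None:
        weights = {"rfid": 1.0, "vle": 0.5}
    scores = defaultdict(float)
    for e in events:
        week = datetime.fromisoformat(e["ts"]).isocalendar()[1]
        scores[(e["student"], week)] += weights.get(e["source"], 0.0)
    return dict(scores)

print(weekly_engagement(events))
# {('s01', 9): 1.5, ('s02', 9): 1.0}
```

A BI layer would then sit on top of such aggregates (dashboards, alerts for low-engagement students); the sketch only shows the aggregation step.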
Piloting Multimodal Learning Analytics using Mobile Mixed Reality in Health Education
© 2019 IEEE. Mobile mixed reality has been shown to increase achievement and lower cognitive load within spatial disciplines. However, traditional methods of assessment restrict examiners' ability to holistically assess spatial understanding. Multimodal learning analytics seeks to investigate how combinations of data types, such as spatial data and traditional assessment, can be combined to better understand both the learner and the learning environment. This paper explores the pedagogical possibilities of a smartphone-enabled mixed reality multimodal learning analytics case study for health education, focused on learning the anatomy of the heart. The context for this study is the first loop of a design-based research study exploring the acquisition and retention of knowledge by piloting the proposed system with practicing health experts. Outcomes from the pilot study showed engagement and enthusiasm for the method among the experts, but also demonstrated problems to overcome in the pedagogical method before deployment with learners.
Assessing collaborative learning: big data, analytics and university futures
Traditionally, assessment in higher education has focused on the performance of individual students. This focus has been a practical as well as an epistemic one: methods of assessment are constrained by the technology of the day, and in the past they required the completion, by individuals under controlled conditions, of set-piece academic exercises. Recent advances in learning analytics, drawing upon vast sets of digitally stored student activity data, open new practical and epistemic possibilities for assessment and carry the potential to transform higher education. It is becoming practicable to assess the individual and collective performance of team members working on complex projects that closely simulate the professional contexts that graduates will encounter. In addition to academic knowledge, this authentic assessment can include a diverse range of personal qualities and dispositions that are key to the computer-supported cooperative working of professionals in the knowledge economy. This paper explores the implications of such opportunities for the purpose and practices of assessment in higher education, as universities adapt their institutional missions to address 21st Century needs. The paper concludes with a strong recommendation for university leaders to deploy analytics to support and evaluate the collaborative learning of students working in realistic contexts.
Student-Centered Learning: Functional Requirements for Integrated Systems to Optimize Learning
The realities of the 21st-century learner require that schools and educators fundamentally change their practice. "Educators must produce college- and career-ready graduates that reflect the future these students will face. And, they must facilitate learning through means that align with the defining attributes of this generation of learners." Today, we know more than ever about how students learn, acknowledging that the process isn't the same for every student and doesn't remain the same for each individual, depending upon maturation and the content being learned. We know that students want to progress at a pace that allows them to master new concepts and skills, to access a variety of resources, to receive timely feedback on their progress, to demonstrate their knowledge in multiple ways and to get direction, support and feedback from, as well as collaborate with, experts, teachers, tutors and other students. The result is a growing demand for student-centered, transformative digital learning using competency education as an underpinning. iNACOL released this paper to illustrate the technical requirements and functionalities that learning management systems need to shift toward student-centered instructional models. This comprehensive framework will help districts and schools determine what systems to use and integrate as they begin their journey toward student-centered learning, as well as how systems integration aligns with their organizational vision, educational goals and strategic plans. Educators can use this report to optimize student learning and promote innovation in their own student-centered learning environments. The report will help school leaders understand the complex technologies needed to optimize personalized learning and how to use data and analytics to improve practices, and can assist technology leaders in re-engineering systems to support the key nuances of student-centered learning.
Fostering medical students' lifelong learning skills with a dashboard, coaching and learning planning.
Introduction: To develop lifelong learning skills, students need feedback, access to performance data, and coaching. A new medical curriculum incorporated infrastructural supports based on self-regulated learning theory and the Master Adaptive Learner framework to engage students in reflection and learning planning. This study examines students' experience with a performance dashboard, longitudinal coaching, and structured time for goal-setting. Methods: Focus groups with first-year medical students explored performance dashboard usage, coaching and learning planning. We analyzed findings using thematic analysis. Results informed development of a 29-item survey rated strongly disagree (1) to strongly agree (5) to investigate experience with the dashboard, coaching and learning goals program. The survey was distributed to one first-year medical student class. We performed descriptive statistics and factor analysis. Results: In three focus groups with 21 participants, students endorsed using the dashboard to access performance information but had trouble interpreting and integrating information. They valued coaches as sources of advice but varied in their perceptions of the value of discussing learning planning. Of 152 students, 114 (75%) completed the survey. Exploratory factor analysis yielded 5 factors explaining 57% of the variance: learning goals development (α = 0.88; mean 3.25 (standard deviation 0.91)), dashboard usage (α = 0.82; 3.36 (0.64)), coaching (α = 0.71; 3.72 (0.64)), employment of learning strategies (α = 0.81; 3.67 (0.79)), and reflection (α = 0.63; 3.68 (0.64)). Discussion: The student performance dashboard provides efficient feedback access, yet students' use of this information to guide learning is variable. These results can inform other programs seeking to foster lifelong learning skills.
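The reliability figures reported in the abstract above (e.g. α = 0.88) are Cronbach's alpha, an internal-consistency statistic computed over a respondents × items rating matrix. For readers unfamiliar with it, a minimal computation sketch (the example data is invented, not the study's survey data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) array of ratings."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Four perfectly consistent items (every respondent rates all items identically)
perfect = np.tile(np.array([[1], [2], [3], [4], [5]]), (1, 4))
print(round(cronbach_alpha(perfect), 2))  # 1.0
```

Values near 1 indicate the items in a survey factor measure a common construct; the 0.63–0.88 range reported in the study is typical for short sub-scales.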