Learning Analytics Community Exchange: Evidence Hub
This poster sets out the background and development of the LACE Evidence Hub, a site that gathers evidence about learning analytics in an accessible form. The poster also describes the functionality of the site, summarises its quantitative and thematic content to date and the state of the evidence. In addition, it encourages people to add to and make use of the Hub.
Learning at Scale: Using an Evidence Hub To Make Sense of What We Know
The large datasets produced by learning at scale, and the need for ways of dealing with high learner/educator ratios, mean that MOOCs and related environments are frequently used for the deployment and development of learning analytics. Despite the current proliferation of analytics, there is as yet relatively little hard evidence of their effectiveness. The Evidence Hub developed by the Learning Analytics Community Exchange (LACE) provides a way of collating and filtering the available evidence in order to support the use of analytics and to target future studies to fill the gaps in our knowledge.
Practitioner Track Proceedings of the 6th International Learning Analytics & Knowledge Conference (LAK16)
Practitioners spearhead a significant portion of learning analytics, relying on implementation and experimentation rather than on traditional academic research. Both approaches help to improve the state of the art. The LAK conference has created a practitioner track for submissions, which first ran in 2015 as an alternative to the researcher track.
The primary goal of the practitioner track is to share thoughts and findings that stem from learning analytics project implementations. While both large and small implementations are considered, all practitioner track submissions are required to relate to initiatives that are designed for large-scale and/or long-term use (as opposed to research-focused initiatives). Other guidelines include:
• Implementation track record The project should have been used by an institution or have been deployed on a learning site. There are no hard guidelines about user numbers or how long the project has been running.
• Learning/education related Submissions have to describe work that addresses learning/academic analytics, either at an educational institution or in an area (such as corporate training, health care or informal learning) where the goal is to improve the learning environment or learning outcomes.
• Institutional involvement Neither submissions nor presentations have to include a named person from an academic institution. However, all submissions have to include information collected from people who have used the tool or initiative in a learning environment (such as faculty, students, administrators and trainees).
• No sales pitches While submissions from commercial suppliers are welcome, reviewers do not accept overt (or covert) sales pitches. Reviewers look for evidence that a presentation will take into account challenges faced, problems that have arisen, and/or user feedback that needs to be addressed.
Submissions are limited to 1,200 words, including an abstract, a summary of deployment with end users, and a full description. Most papers in the proceedings are therefore short, and often informal, although some authors chose to extend their papers once they had been accepted.
Papers accepted in 2016 fell into two categories.
• Practitioner Presentations Presentation sessions are designed to focus on deployment of a single learning analytics tool or initiative.
• Technology Showcase The Technology Showcase event enables practitioners to demonstrate new and emerging learning analytics technologies that they are piloting or deploying.
Both types of paper are included in these proceedings.
‘A double-edged sword. This is powerful but it could be used destructively’: Perspectives of early career education researchers on learning analytics
Learning analytics has been increasingly outlined as a powerful tool for measuring, analysing, and predicting learning experiences and behaviours. The rising use of learning analytics means that many educational researchers now require new ranges of technical analytical skills to contribute to an increasingly data-heavy field. However, it has been argued that educational data scientists are a ‘scarce breed’ (Buckingham Shum et al., 2013) and that more resources are needed to support the next generation of early career researchers in the education field. At the same time, little is known about how early career education researchers feel towards learning analytics and whether it is important to their current and future research practices. Using a thematic analysis of discussions from a participatory learning analytics workshop with 25 early career education researchers, we outline in this article their ambitions, challenges and anxieties towards learning analytics. In doing so, we have provided a roadmap for how the learning analytics field might evolve and practical implications for supporting early career researchers’ development.
Understanding Evidence-Based Interventions for Cross-Cultural Group Work: A Learning Analytics Perspective
As the numbers of international students worldwide continue to rise, one common challenge is how best to socially integrate diverse groups of students. Indeed, research demonstrates that many students form social and learning relationships with those from the same cultural background, despite the benefits of cross-cultural communication. This lack of social cohesion negatively affects students, particularly when it comes to their perceptions of collaborative group work. However, few studies have analysed measurable student behaviours in group work, such as with learning analytics, to determine how culture and existing social networks influence measurable differences in contributions. Similarly, little is known about what evidence-based interventions lead to more equal participation between diverse students. In this research, learning analytics is combined with social network analysis to determine the role of social connections in group work participation, and to highlight replicable interventions that can help promote social cohesion in diverse classrooms.
Guest Editorial: Ethics and Privacy in Learning Analytics
The European Learning Analytics Community Exchange (LACE) project is responsible for an ongoing series of workshops on ethics and privacy in learning analytics (EP4LA), which have been responsible for driving and transforming activity in these areas. Some of this activity has been brought together with other work in the papers that make up this special issue. These papers cover the creation and development of ethical frameworks, as well as tools and approaches that can be used to address issues of ethics and privacy. This editorial suggests that it is worth taking time to consider the often intertwined issues of ethics, data protection and privacy separately. The challenges mentioned within the special issue are summarised in a table of 22 challenges that are used to identify the values that underpin work in this area. Nine ethical goals are suggested as the editors’ interpretation of the unstated values that lie behind the challenges raised in this paper.
Situating multimodal learning analytics
The digital age has introduced a host of new challenges and opportunities for the learning sciences community. These challenges and opportunities are particularly abundant in multimodal learning analytics (MMLA), a research methodology that aims to extend work from Educational Data Mining (EDM) and Learning Analytics (LA) to multimodal learning environments by incorporating multimodal data. Recognizing the short-term opportunities and long-term challenges will help develop proof cases and identify grand challenges that can propel the field forward. To support the field's growth, we use this paper to describe several ways that MMLA can potentially advance learning sciences research and touch upon key challenges that researchers who utilize MMLA have encountered over the past few years.
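A recurring practical step in work with multimodal data is aligning time-stamped streams from different modalities. As a hedged sketch (the streams, labels, and tolerance below are invented for illustration, not part of the paper), pairing clickstream events with the nearest sample from a second modality might look like:

```python
# Hypothetical time-stamped streams from two modalities (seconds, value)
clicks = [(1.0, "open_task"), (4.2, "submit"), (9.5, "open_hint")]
audio = [(0.9, 0.31), (4.0, 0.72), (9.9, 0.55)]  # e.g. speech intensity

def align(events, features, tolerance=0.5):
    """Pair each event with the nearest feature sample within tolerance."""
    aligned = []
    for t_ev, label in events:
        nearest = min(features, key=lambda f: abs(f[0] - t_ev))
        if abs(nearest[0] - t_ev) <= tolerance:
            aligned.append((label, nearest[1]))
    return aligned

pairs = align(clicks, audio)
print(pairs)  # each click paired with the closest in-tolerance audio sample
```

Real MMLA pipelines contend with clock drift, differing sampling rates, and missing data; nearest-neighbour matching within a tolerance is only the simplest form of the alignment problem.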
Sample descriptors linked to metagenomic sequencing data from human and animal enteric samples from Vietnam.
There is still limited information on the diversity of viruses co-circulating in humans and animals. Here, we report data obtained from a large field collection of enteric samples taken from humans, pigs, rodents and other mammal hosts in Vietnam between 2012 and 2016. Each of 2100 stool or rectal swab samples was subjected to virally-enriched agnostic metagenomic sequencing; the short read sequence data are accessible from the European Nucleotide Archive (ENA). We link the sequence data to metadata on host type, demography and geographic location, distinguishing hospital patients, members of a cohort identified as being at high risk of zoonotic infection (e.g. abattoir workers, rat traders) and animals. These data are suitable for further studies of virus diversity and virus discovery in humans and animals from Vietnam and to identify viruses found in multiple hosts that are potentially zoonotic.
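To illustrate how such sample descriptors can be used to subset the collection before downstream analysis (the accessions, field names, and values below are invented placeholders, not actual ENA records), a minimal filtering step might look like:

```python
from collections import Counter

# Invented sample descriptors mirroring the kind of metadata described:
# host type, cohort, and province for each ENA-linked sample accession.
samples = [
    {"accession": "ERS0001", "host": "human", "cohort": "hospital", "province": "Dong Thap"},
    {"accession": "ERS0002", "host": "human", "cohort": "high-risk", "province": "Dong Thap"},
    {"accession": "ERS0003", "host": "pig", "cohort": "animal", "province": "Dak Lak"},
    {"accession": "ERS0004", "host": "rodent", "cohort": "animal", "province": "Dong Thap"},
    {"accession": "ERS0005", "host": "human", "cohort": "high-risk", "province": "Dak Lak"},
]

# Select the high-risk human cohort (e.g. abattoir workers, rat traders)
high_risk = [s for s in samples if s["host"] == "human" and s["cohort"] == "high-risk"]

# Count samples per province to see where that cohort was sampled
per_province = Counter(s["province"] for s in high_risk)
print(per_province)
```

Real use would retrieve the metadata alongside the deposited reads from ENA; the point here is only that host, cohort, and location fields make the collection straightforwardly filterable.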