Evaluating Mobile Learning: Reflections on Current Practice
The field of mobile learning is at present characterised by a proliferation of pilots and trials that allow mobile technologies to be tested out in a variety of learning contexts. The sustained deployment of mobile learning will depend on the quality of these pilots and trials, which includes evaluation methodology and reporting. The paper examines current evaluation practice, based on evidence drawn from conference publications, published case studies, and other accounts from the literature. The authors also draw on their work in collecting case studies of mobile learning from a range of recent projects. Issues deserving examination include the apparent objectives of the pilots or trials, the nature of the evaluations, the instruments and techniques used, and the analysis and presentation of findings. The paper reflects on the quality of evaluation in mobile learning pilots and trials, in the broader context of evolving practices in the evaluation of educational technologies.
The evaluation of next generation learning technologies: the case of mobile learning
Mobile learning is at the leading edge of learning technologies and is at present characterised by pilots and trials that allow mobile technologies to be tested in a variety of learning contexts. The sustained deployment of mobile learning will depend on these pilots and trials, especially their evaluation methodology and reporting. The paper examines a sample of current evaluation practice, based on evidence drawn from conference proceedings, published case studies, and other accounts from the literature, and draws on the authors' work in collecting case studies of mobile learning from a range of recent projects. The issues discussed include the apparent objectives of the documented pilots or trials, the nature of the evaluations, the instruments and techniques used, and the presentation of findings. The paper reflects on the quality of evaluation in mobile learning pilots and trials, in the broader context of evolving practices in the evaluation of educational technologies.
CAHRS Partners' Implementation of Artificial Intelligence
[Excerpt] The ideas and uses for Artificial Intelligence (AI) are abundant, and each business is seemingly ripe for disruption, including HR. As the hype surrounding AI continues to be championed by the popular press, we began our research in order to determine whether the press's view that AI was here and ready to implement was accurate. We found that in reality, AI programs were far behind the progress discussed: the software was slower and more expensive than reported, and there was a general lack of integration throughout the industry. From there, we asked CAHRS partners to tell us where AI was used in their company, and how it helped them deliver HR differently. Our research focused on how AI technology will disrupt, change, or bolster the HR function, specifically in the Talent Acquisition and Learning and Development (L&D) spaces.
We found our CAHRS partners dove into AI, and represented three key points along a spectrum of AI implementation. Of the 59 participants at 32 companies, 26% are Observers, 48% are Explorers, and 26% are Implementers. Observers are companies that do not believe AI fits with their strategy, and therefore do not intend to implement AI right now. Explorers are companies that have begun to actively explore AI through industry research, vendor exploration, and piloting AI and machine learning (ML) technologies. Implementers are companies that have either built in house or worked with an external vendor to implement an AI or machine learning technology. The CAHRS partners represented such a wide range along this spectrum because there are no best practices for AI implementation. However, each of our partners that leveraged AI understood the tool, while also understanding their business needs, people, and technology, which allowed them to utilize AI technology effectively.
Bridging the Gap
School districts across the country are increasingly seeking out digital tools to support the work of educators, in the hopes of improving students' academic achievement. With the rapid emergence of this new market, many districts have been challenged by the task of identifying and procuring educational technology (ed-tech) products that match the needs of their educators and students. The NYC Department of Education's "Innovate NYC Schools" division, supported by a U.S. DOE Investing in Innovation (i3) grant, aims to address this problem, in part by promoting "user-centered design," an approach that puts the needs and preferences of products' intended users (in this case, teachers, students, and parents) front and center in the development and procurement of new technology. Bridging the Gap describes the design and implementation of three Innovate NYC Schools initiatives grounded in user-centered design theory:
- School Choice Design Challenge (SCDC), an effort to develop apps that would help students explore and narrow down their choices of high school.
- #SharkTankEDU events, during which ed-tech developers present a product to a panel of educators who provide feedback on the tool.
- Short-Cycle Evaluation Challenges (SCEC), a classroom-based, semester-long pilot of ed-tech tools intended to inform product development, as well as the ultimate procurement decisions of school staff.
The report focuses on four phases of work involved in bringing ed-tech companies and the users of their products together: defining a problem; selecting users and ed-tech companies; implementing pilot-based initiatives; and evaluating products. It describes strategies used and challenges faced, and offers practical lessons gleaned from the experiences of the individuals who designed and participated in these efforts.
OULDI-JISC Project Evaluation Report: the impact of new curriculum design tools and approaches on institutional process and design cultures
This report presents research and evaluation undertaken by the OULDI-JISC Project (Open University Learning Design Initiative JISC Project) between 2008 and 2012. In particular, it considers the impact of new curriculum design tools and approaches piloted by the project on institutional processes and design cultures. These tools and approaches include tools for sharing learning design expertise (Cloudworks), for visualising designs (CompendiumLD, Module Map, Activity Profile), and for supporting design and reflection in workshops (Facilitation Cards, workshop activities, etc.). The project has adopted a learning design approach so as to help foreground pedagogy and learner experience. Nine pilots have been completed across six UK universities.
Aerospace Medicine and Biology: A continuing bibliography with indexes (supplement 291)
This bibliography lists 131 reports, articles, and other documents introduced into the NASA scientific and technical information system in November 1986.
Evolving a software development methodology for commercial ICTD projects
This article discusses the evolution of a "DistRibuted Agile Methodology Addressing Technical Ictd in Commercial Settings" (DRAMATICS) that was developed in a global software corporation to support ICTD projects from initial team setup through ICT system design, development, and prototyping, to scaling up and transitioning to sustainable commercial models. We developed the methodology using an iterative Action Research approach in a series of commercial ICTD projects over a period of more than six years. Our learning is reflected in distinctive methodology features that support the development of contextually adapted ICT systems, collaboration with local partners, involvement of end users in design, and the transition from research prototypes to scalable, long-term solutions. We offer DRAMATICS as an approach that others can appropriate and adapt to their particular project contexts. We report on the methodology's evolution and provide evidence of its effectiveness in the projects where it has been used.