2,035 research outputs found

    Student-Centered Learning: Functional Requirements for Integrated Systems to Optimize Learning

    Get PDF
    The realities of the 21st-century learner require that schools and educators fundamentally change their practice. "Educators must produce college- and career-ready graduates that reflect the future these students will face. And, they must facilitate learning through means that align with the defining attributes of this generation of learners." Today, we know more than ever about how students learn, acknowledging that the process isn't the same for every student and doesn't remain the same for each individual, depending upon maturation and the content being learned. We know that students want to progress at a pace that allows them to master new concepts and skills, to access a variety of resources, to receive timely feedback on their progress, to demonstrate their knowledge in multiple ways and to get direction, support and feedback from—as well as collaborate with—experts, teachers, tutors and other students. The result is a growing demand for student-centered, transformative digital learning using competency education as an underpinning. iNACOL released this paper to illustrate the technical requirements and functionalities that learning management systems need to shift toward student-centered instructional models. This comprehensive framework will help districts and schools determine which systems to use and integrate as they begin their journey toward student-centered learning, as well as how systems integration aligns with their organizational vision, educational goals and strategic plans. Educators can use this report to optimize student learning and promote innovation in their own student-centered learning environments. The report will help school leaders understand the complex technologies needed to optimize personalized learning and how to use data and analytics to improve practices, and can assist technology leaders in re-engineering systems to support the key nuances of student-centered learning

    Automatic generation of software interfaces for supporting decision-making processes. An application of domain engineering & machine learning

    Get PDF
    Data analysis is a key process to foster knowledge generation in particular domains or fields of study. With a strong informative foundation derived from the analysis of collected data, decision-makers can make strategic choices with the aim of obtaining valuable benefits in their specific areas of action. However, given the steady growth of data volumes, data analysis needs to rely on powerful tools to enable knowledge extraction. Information dashboards offer a software solution to analyze large volumes of data visually to identify patterns and relations and make decisions according to the presented information. But decision-makers may have different goals and, consequently, different necessities regarding their dashboards. Moreover, the variety of data sources, structures, and domains can hamper the design and implementation of these tools. This Ph.D. Thesis tackles the challenge of improving the development process of information dashboards and data visualizations while enhancing their quality and features in terms of personalization, usability, and flexibility, among others. Several research activities have been carried out to support this thesis. First, a systematic literature mapping and review was performed to analyze different methodologies and solutions related to the automatic generation of tailored information dashboards. The outcomes of the review led to the selection of a model-driven approach in combination with the software product line paradigm to deal with the automatic generation of information dashboards. In this context, a meta-model was developed following a domain engineering approach. This meta-model represents the skeleton of information dashboards and data visualizations through the abstraction of their components and features and has been the backbone of the subsequent generative pipeline of these tools. 
The meta-model and generative pipeline have been tested through their integration in different scenarios, both theoretical and practical. Regarding the theoretical dimension of the research, the meta-model has been successfully integrated with another meta-model to support knowledge generation in learning ecosystems, and as a framework to conceptualize and instantiate information dashboards in different domains. In terms of the practical applications, the focus has been put on how to transform the meta-model into an instance adapted to a specific context, and how to finally transform this latter model into code, i.e., the final, functional product. These practical scenarios involved the automatic generation of dashboards in the context of a Ph.D. Programme, the application of Artificial Intelligence algorithms in the process, and the development of a graphical instantiation platform that combines the meta-model and the generative pipeline into a visual generation system. Finally, different case studies have been conducted in the employment and employability, health, and education domains. The number of applications of the meta-model in theoretical and practical dimensions and domains is itself a result. Every outcome associated with this thesis is driven by the dashboard meta-model, which also proves its versatility and flexibility when it comes to conceptualizing, generating, and capturing knowledge related to dashboards and data visualizations
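The model-to-code transformation described in this abstract can be illustrated with a highly simplified sketch. The class names, fields, and the HTML target below are hypothetical assumptions for illustration only; they are not the actual meta-model or pipeline from the thesis:

```python
from dataclasses import dataclass, field

# Hypothetical, minimal "meta-model": a dashboard is an abstraction
# composed of visualization components, each binding a visual mark
# to fields of some data source.
@dataclass
class Visualization:
    title: str
    mark: str      # e.g. "bar", "line", "scatter"
    x_field: str
    y_field: str

@dataclass
class Dashboard:
    name: str
    components: list = field(default_factory=list)

def generate_html(dashboard: Dashboard) -> str:
    """Toy model-to-text transformation: emit an HTML skeleton that a
    rendering layer could later hydrate with real charts."""
    body = "\n".join(
        f'  <div class="viz" data-mark="{v.mark}" '
        f'data-x="{v.x_field}" data-y="{v.y_field}">{v.title}</div>'
        for v in dashboard.components
    )
    return f"<h1>{dashboard.name}</h1>\n{body}"

# Instantiating the meta-model for a specific context (here, an
# invented Ph.D.-programme example) and generating the final artifact:
dash = Dashboard("PhD Programme KPIs", [
    Visualization("Theses per year", "bar", "year", "count"),
])
print(generate_html(dash))
```

The point of the sketch is the separation of concerns: the dataclasses play the role of the abstract model, while `generate_html` stands in for the generative pipeline that turns a context-specific model instance into a functional product.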

    Representing Data Visualization Goals and Tasks through Meta-Modeling to Tailor Information Dashboards

    Get PDF
    Information dashboards are everywhere. They support knowledge discovery in a huge variety of contexts and domains. Although powerful, these tools can be complex, not only for the end-users but also for developers and designers. Information dashboards encode complex datasets into different visual marks to ease knowledge discovery. Choosing the wrong design could compromise the entire dashboard’s effectiveness, so selecting the appropriate encoding or configuration for each potential context, user, or data domain is a crucial task. For these reasons, there is a need to automate the recommendation of visualizations and dashboard configurations to deliver tools adapted to their context. Recommendations can be based on different aspects, such as user characteristics, the data domain, or the goals and tasks that will be achieved or carried out through the visualizations. This work presents a dashboard meta-model that abstracts all these factors and the integration of a visualization task taxonomy to account for the different actions that can be performed with information dashboards. This meta-model has been used to design a domain-specific language to specify dashboard requirements in a structured way. The ultimate goal is to obtain a dashboard generation pipeline to deliver dashboards adapted to any context, such as the educational context, in which large amounts of data are generated, and there are several actors involved (students, teachers, managers, etc.) who want to reach different insights regarding their learning performance or learning methodologies
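The idea of resolving a structured requirement (user, data, task) into a concrete visualization configuration can be sketched as follows. The task names, mapping table, and `recommend` function are invented for illustration, in the spirit of abstract visualization task taxonomies; they do not reproduce the paper's actual taxonomy or DSL:

```python
# Hypothetical mapping from abstract visualization tasks to candidate
# visual encodings. Real task taxonomies are far richer; this table
# exists only to make the recommendation step concrete.
TASK_TO_MARK = {
    "compare": "bar",
    "trend": "line",
    "correlate": "scatter",
    "part-to-whole": "stacked-bar",
}

def recommend(spec: dict) -> dict:
    """Resolve a declarative dashboard requirement into a concrete
    configuration, falling back to a plain table for unknown tasks."""
    return {
        "user_role": spec["role"],
        "data_source": spec["source"],
        "mark": TASK_TO_MARK.get(spec["task"], "table"),
    }

# A structured requirement, standing in for a DSL specification:
req = {"role": "teacher", "source": "lms_grades", "task": "trend"}
print(recommend(req))
# → {'user_role': 'teacher', 'data_source': 'lms_grades', 'mark': 'line'}
```

A real pipeline would also weigh user characteristics and data-domain properties, but the core pattern is the same: a declarative specification in, a tailored dashboard configuration out.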

    A framework for strategic planning of data analytics in the educational sector

    Get PDF
    The field of big data and data analysis is not a new one. Big data systems have been investigated with respect to the volume of the data and how it is stored, the data velocity and how it is subject to change, the variety of data to be analysed, and data veracity, referring to integrity and quality. Higher Education Institutions (HEIs) have a significant range of data sources across their operations and increasingly invest in collecting, analysing and reporting on their data in order to improve their efficiency. Data analytics and Business Intelligence (BI) are two terms that have become increasingly popular over the past few years in the relevant literature, with emphasis on their impact in the education sector. There is a significant volume of literature discussing the benefits of data analytics in higher education and even more papers discussing specific case studies of institutions adopting BI by deploying various data analytics practices. Nevertheless, there is a lack of an integrated framework that supports HEIs in using learning analytics both at strategic and operational level. This research study was driven by the need to offer a point of reference for universities wishing to make good use of the plethora of data they can access. Increasingly, institutions need to become ‘smart universities’ by supporting their decisions with findings from the analysis of their operations. The Business Intelligence strategies of many universities seem to focus mostly on identifying how to collect data but fail to address the most important issues: how to analyse the data, what to do with the findings and how to create the means for a scalable use of learning analytics at institutional level. The scope of this research is to investigate the different factors that affect the successful deployment of data analytics in educational contexts, focusing both on strategic and operational aspects of academia. 
The research study attempts to identify those elements necessary for introducing data analytics practices across an institution. The main contribution of the research is a framework that models the data collection, analysis and visualisation in higher education. The specific contribution to the field comes in the form of generic guidelines for strategic planning of HEI data analytics projects, combined with specific guidelines for staff involved in the deployment of data analytics to support certain institutional operations. The research is based on a mixed method approach that combines grounded theory in the form of extensive literature review, state-of-the-art investigation and case study analysis, as well as a combination of qualitative and quantitative data collection. The study commences with an extensive literature review that identifies the key factors affecting the use of learning analytics. The research then collected more information from an analysis of a wide range of case studies showing how learning analytics are used across HEIs. The primary data collection concluded with a series of focus groups and interviews assessing the role of learning analytics in universities. Next, the research focused on a synthesis of guidelines for using learning analytics both at strategic and operational levels, leading to the production of generic and specific guidelines intended for different university stakeholders. The proposed framework was revised twice to create an integrated point of reference for HEIs that offers support across institutions in a scalable and applicable way that can accommodate the varying needs met at different HEIs. The proposed framework was evaluated by the same participants in the earlier focus groups and interviews, providing a qualitative approach to evaluating the contributions made during this research study. 
The research resulted in the creation of an integrated framework that offers HEIs a reference for setting up a learning analytics strategy, adapting institutional policies and revising operations across faculties and departments. The proposed C.A.V. framework consists of three phases: Collect, Analysis and Visualisation. The framework determines the key features of data sources and resulting dashboards, as well as a list of functions for the data collection, analysis and visualisation stages. At strategic level, the C.A.V. framework enables institutions to assess their learning analytics maturity, determine the learning analytics stages that they are involved in, identify the different learning analytics themes and use a checklist as a reference point for their learning analytics deployment. Finally, the framework ensures that institutional operations can become more effective by determining how learning analytics provide added value across different operations, while assessing the impact of learning analytics on stakeholders. The framework also supports the adoption of learning analytics processes, the planning of dashboard contents and the identification of factors affecting the implementation of learning analytics

    Learn Smarter, Not Harder – Exploring the Development of Learning Analytics Use Cases to Create Tailor-Made Online Learning Experiences

    Get PDF
    Our world is significantly shaped by digitalization, fostering new opportunities for technology-mediated learning. As a result, massive amounts of knowledge have become available online. However, these formats entail less interaction and guidance from lecturers. Thus, learners need to be supported by intelligent learning tools that provide suitable knowledge in a tailored way. In this context, the use of learning analytics in its multifaceted forms is essential. Existing literature shows a proliferation of learning analytics use cases without a systematic structure. Based on a structured literature review of 42 papers, we organized existing literature contributions systematically and derived four use cases: learning dashboards, individualized content, tutoring systems, and adaptable learning processes based on personality. Our use cases will serve as a basis for a targeted scientific discourse and offer valuable orientation for the development of future learning analytics use cases to give rise to the new form of Learning Experience Platforms

    Biodiversity and ecosystem services dashboards to inform landscape and urban planning: a systematic analysis of current practices

    Get PDF
    Guiding the transformation of cities and regions towards more sustainable pathways requires a deep understanding of the complexities of socio-ecological systems. This entails gaining insights into the status and trends of biodiversity, ecosystems and their services (BES), as well as navigating complex governance and power structures, particularly in contested spaces. Digital dashboards, understood as visual representations of key information, could effectively communicate complex BES information to decision makers and planners in landscape and urban planning, enabling more informed decisions. While dashboards are increasingly being used in spatial applications, the lack of scientific understanding regarding the emerging applications of BES information in dashboards underscores the pressing need for research and review in this area. This study aims to identify and analyze contemporary case studies of BES dashboard applications to explore how they can effectively support decision-making in landscape and urban planning. We develop a conceptual framework of interlinkages between BES dashboards and landscape planning processes and apply this framework to analyze 12 state-of-the-art BES dashboard applications from Asia, Australia, Europe, North and South America. Our results reflect emerging practices of dashboards visualizing BES information, which varied in purposes, content, functionalities, visual design, and output features. The dashboards covered a total of 66 BES indicators, including tree health, forest status and functionality, green and blue space connectivity, and specific components of biodiversity. Further research on user demands and real-world impacts is necessary to enhance the effectiveness of BES dashboards in informing landscape and urban planning for people and nature

    Tailored information dashboards: A systematic mapping of the literature

    Get PDF
    Information dashboards are extremely useful tools to exploit knowledge. Dashboards enable users to reach insights and to identify patterns within data at a glance. However, dashboards present a series of characteristics and configurations that may not be optimal for every user, thus requiring the modification or variation of their features to fulfill specific user requirements. This variation process is usually referred to as customization, personalization or adaptation, depending on how it is achieved. Given the great number of users and the exponential growth of data sources, tailoring an information dashboard is not a trivial task, as several solutions and configurations could arise. To analyze and understand the current state-of-the-art regarding tailored information dashboards, a systematic mapping has been performed. This mapping focuses on answering questions regarding how existing dashboard solutions in the literature manage the customization, personalization and/or adaptation of their elements to produce tailored displays

    Identifying and addressing adaptability and information system requirements for tactical management

    Get PDF

    What Makes Learning Analytics Research Matter

    Full text link
    The ongoing changes and challenges brought on by the COVID-19 pandemic have exacerbated long-standing inequities in education, leading many to question basic assumptions about how learning can best benefit all students. Thirst for data about learning is at an all-time high, sometimes without commensurate attention to ensuring principles this community has long valued: privacy, transparency, openness, accountability, and fairness. How we navigate this dynamic context is critical for the future of learning analytics. Thinking about the issue through the lens of JLA publications over the last eight years, we highlight the important contributions of “problem-centric” rather than “tool-centric” research. We also value attention (proximal or distal) to the eventual goal of closing the loop, connecting the results of our analyses back to improve the learning from which they were drawn. Finally, we recognize the power of cycles of maturation: using information generated about real-world uses and impacts of a learning analytics tool to guide new iterations of data, analysis, and intervention design. A critical element of context for such work is that the learning problems we identify and choose to work on are never blank slates; they embed societal structures, reflect the influence of past technologies, and have previous enablers, barriers and social mediation acting on them. In that context, we must ask the hard questions: What parts of existing systems is our work challenging? What parts is it reinforcing? Do these effects, intentional or not, align with our values and beliefs? In the end, what makes learning analytics matter is our ability to contribute to progress on both immediate and long-standing challenges in learning, not only improving current systems, but also considering alternatives for what is and what could be. 
This requires including stakeholder voices in tackling important problems of learning with rigorous analytic approaches to promote equitable learning across contexts. This journal provides a central space for the discussion of such issues, acting as a venue for the whole community to share research, practice, data and tools across the learning analytics cycle in pursuit of these goals.
