Profiling and understanding student information behaviour: Methodologies and meaning
This paper draws on work conducted under the Joint Information Systems Committee (JISC) User Behaviour Monitoring and Evaluation Framework to identify a range of issues associated with research design that can form a platform for enquiry about knowledge creation in the arena of user behaviour. The Framework has developed a multidimensional set of tools for profiling, monitoring and evaluating user behaviour. It takes two main approaches: a broad-based survey which generates both a qualitative and a quantitative profile of user behaviour, and a longitudinal qualitative study of user behaviour that (in addition to providing in-depth insights) is the basis for the development of the EIS (Electronic Information Services) Diagnostic Toolkit. The strengths and weaknesses of the Framework approach are evaluated. In the context of profiling user behaviour, key methodological concerns relate to representativeness, sampling and access, the selection of appropriate measures, and the interpretation of those measures. Qualitative approaches are used to generate detailed insights, including detailed narratives, case study analysis and gap analysis. The messages from this qualitative analysis do not lend themselves to simple summarization. One approach that has been employed to capture and interpret these messages is the development of the EIS Diagnostic Toolkit, which can be used to assess and monitor an institution's progress with embedding EIS into learning processes. Finally, consideration must be given to the integration of insights generated through different strands within the Framework.
Exploring fuzzy cognitive mapping for IS evaluation: A research note
Existing IS Evaluation (ISE) techniques tend to focus on modeling individuals, teams, organizations, or systems in relation to process and environmental boundaries. Whilst such approaches are noteworthy and of merit, they do not necessarily provide insights into the causal interdependencies that are inherent within the decision-making task. As the extant literature in the field has noted, the ISE task depends upon many factors, the resulting outputs of which may be tangible or intangible. The implicit level of uncertainty associated with modeling such decision-making tasks and behaviors is therefore difficult to comprehend and impart via wholly quantitative and/or qualitative analyses. The authors therefore present and propose supporting, ongoing research into the application of fuzzy logic, in the guise of Fuzzy Cognitive Mapping (FCM) simulations, as a means to model tangible and intangible aspects of the ISE decision-making task. Such a Fuzzy Information Systems Evaluation (F-ISE) is demonstrated via the application of the FCM technique to three models of investment appraisal that are aligned to an ISE task within a UK manufacturing organization. In doing so, it is anticipated that such a technique may be a useful addition to the plethora of ISE techniques available to researcher and practitioner alike.
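As a rough illustration of the mechanics behind an FCM simulation (the concepts and weights below are hypothetical, chosen for illustration; they are not the models from the paper's UK manufacturing case study), the map is a signed weight matrix over concepts, iterated through a squashing function until the activations settle:

```python
import numpy as np

# Hypothetical concepts for an IS-evaluation FCM (illustrative only).
concepts = ["investment cost", "user acceptance", "system quality", "perceived benefit"]

# W[i, j]: causal influence of concept i on concept j, in [-1, 1].
W = np.array([
    [0.0, -0.4, 0.0, -0.3],   # higher cost suppresses acceptance and benefit
    [0.0,  0.0, 0.3,  0.5],   # acceptance raises quality and benefit
    [0.0,  0.6, 0.0,  0.4],   # quality feeds back into acceptance
    [0.0,  0.2, 0.0,  0.0],   # benefit reinforces acceptance
])

def simulate(state, W, steps=50, lam=1.0):
    """Iterate the FCM update A(t+1) = f(A(t) + A(t) @ W) with a sigmoid squash."""
    f = lambda x: 1.0 / (1.0 + np.exp(-lam * x))
    for _ in range(steps):
        state = f(state + state @ W)
    return state

initial = np.array([0.8, 0.5, 0.5, 0.5])  # scenario: high investment cost
final = simulate(initial, W)
print(dict(zip(concepts, final.round(3))))
```

The appeal for ISE is that intangible factors ("user acceptance") sit in the same model as tangible ones ("investment cost"), with fuzzy weights standing in for precise measurements.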
EXPLORATORY VISUALIZATION OF GRAPHS BASED ON COMMUNITY STRUCTURE
Communities, also called clusters or modules, are groups of nodes which probably share common properties and/or play similar roles within a graph. They widely exist in real networks such as biological, social, and information networks. Allowing users to interactively browse and explore the community structure, which is essential for understanding complex systems, is a challenging yet important research topic. My work has been focused on visualization approaches to exploring the community structure in graphs based on automatic community detection results.
In this dissertation, we first report a formal user study that investigated the essential influence factors, benefits, and constraints of a community-based graph visualization system in a background application of seeking information from text corpora. A general evaluation methodology for exploratory visualization systems has been proposed and practiced. The evaluation methodology integrates detailed cognitive load analysis and users' prior knowledge evaluation with quantitative and qualitative measures, so that in-depth insights can be gained. The study revealed that visual exploration based on the community structure benefits the understanding of real networks. A literature review and a set of interviews were then conducted to learn the tasks involved in such graph exploration and the state of the art. This work led to a taxonomy of community-related graph visualization tasks. Our examination of existing graph visualization systems revealed that a large number of community-related graph visualization tasks are poorly supported by existing approaches. To bridge the gap, several novel visualization techniques are proposed. In these approaches, graph topology information is mapped to a multidimensional space where the relationships between the communities and the nodes can be explicitly explored. Several user studies and case studies have been conducted to demonstrate the usefulness of these systems in real-world applications.
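As a toy sketch of the automatic community detection step such visualizations build on (the graph and the choice of label propagation here are illustrative, not the dissertation's systems), a deterministic label-propagation pass over two loosely connected cliques recovers the two groups:

```python
from collections import Counter

# Two 4-cliques joined by a single edge (3, 4): a toy graph with an
# obvious two-community structure. Adjacency as node -> neighbour list.
adj = {
    0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2, 4],
    4: [3, 5, 6, 7], 5: [4, 6, 7], 6: [4, 5, 7], 7: [4, 5, 6],
}

def label_propagation(adj, max_iters=20):
    """Each node repeatedly adopts the most common label among its
    neighbours; ties break toward the largest label for determinism."""
    labels = {n: n for n in adj}
    for _ in range(max_iters):
        changed = False
        for n in sorted(adj):
            counts = Counter(labels[m] for m in adj[n])
            best = max(counts.values())
            new = max(l for l, c in counts.items() if c == best)
            if new != labels[n]:
                labels[n], changed = new, True
        if not changed:
            break
    return labels

labels = label_propagation(adj)
# Group nodes by community label, ready to drive colouring or layout.
communities = {}
for n, l in sorted(labels.items()):
    communities.setdefault(l, []).append(n)
print(communities)  # two communities, one per clique
```

A visualization system would then map each node to, e.g., (community index, within-community position), making the community structure explicit in the layout.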
A multidimensional evaluation framework for personal learning environments
Evaluating highly dynamic and heterogeneous Personal Learning Environments (PLEs) is extremely challenging. Components of PLEs are selected and configured by individual users based on their personal preferences, needs, and goals. Moreover, the systems usually evolve over time based on contextual opportunities and constraints. As such dynamic systems have no predefined configurations and user interfaces, traditional evaluation methods often fall short or are even inappropriate. Obviously, a host of factors influence the extent to which a PLE successfully supports a learner in achieving specific learning outcomes. We categorize such factors along four major dimensions: technological, organizational, psycho-pedagogical, and social. Each dimension is informed by relevant theoretical models (e.g., the Information System Success Model, Community of Practice, self-regulated learning) and subsumes a set of metrics that can be assessed with a range of approaches. Among others, usability and user experience play an indispensable role in the acceptance and diffusion of innovative technologies exemplified by PLEs. Traditional quantitative and qualitative methods such as questionnaires and interviews should be deployed alongside emergent ones such as learning analytics (e.g., context-aware metadata) and narrative-based methods. Crucial for maximal validity of the evaluation is the triangulation of empirical findings with multi-perspective (end-users, developers, and researchers), mixed-method (qualitative, quantitative) data sources. The framework utilizes a cyclic process to integrate findings across cases with a cross-case analysis in order to gain deeper insights into the intriguing questions of how and why PLEs work.
Culturally-Responsive Canadian Postsecondary Performance Measurement
Student success has multiple meanings; however, the postpositivist bias prevalent in Canadian postsecondary education restricts how student success is defined and measured. When we standardize measures of student success we assume that the student experience is homogeneous and risk implementing policies and programs based on insufficient information. Unless new evaluation approaches are adopted, it is unlikely postsecondary institutions will generate the knowledge and wisdom needed to serve their regional, national, and international learners and communities. Postsecondary education leaders must be cognizant of the legacy of colonialism and consider cultural congruency between performance measurement systems and local context. This organizational improvement plan proposes a theory of action model for culturally-responsive postsecondary performance measurement that leverages shared governance through participatory, emergent, and appreciative processes and qualitative evaluation methodologies. Perception and socially constructed norms play a pivotal role in addressing the postsecondary education sector's quantitative bias; therefore, an interpretivist lens is used to critically examine the cultural appropriateness of quality assurance and measurement processes at a Canadian university. Culturally-responsive performance measurement requires consideration of diverse worldviews and methodologies. Qualitative evaluation can amplify the lived experiences of students and inform complex policy issues through examination of phenomena and local variability. The next generation of quality assurance requires inclusive decision-making structures to generate collective wisdom and cultivate an ethic of community by engaging community members, faculty, staff, and students as change agents.
Applying network analysis to assess coastal risk planning
Adequate response to risks affecting coasts requires an integrated and coordinated multi-risk governance system, with ongoing evaluation of statutory planning documents and responsible stakeholders. Traditionally, such analyses have been carried out using mainly qualitative approaches. This paper adopts a more systemic and quantitative perspective on assessing planning systems and stakeholder relationships in connection with coastal risk. We apply network analysis to the Catalan coast (Northwestern Mediterranean Basin), paying special attention to the level of climate change integration in the planning system as an aggravating factor of current risk dynamics. Our results demonstrate and quantify the complexity of Catalan coastal risk planning, which requires dealing with multi-level legal and administrative frameworks. Also highlighted are dissimilar management traditions according to risk type: the perspective on flooding risk is more unified and multi-risk focused, whereas coastal erosion (a significant issue for the Catalan coast) is managed more sectorially from a centralized administrative level. Climate change, moreover, is weakly accounted for in current statutory planning. We also acknowledge the relevance of using qualitative information as an important complement in interpreting results and making policy recommendations. Peer reviewed. Postprint (author's final draft).
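To make the approach concrete, here is a minimal sketch of how planning instruments and stakeholders can be treated as a directed network and queried for structural roles. The documents, agencies, and edges below are hypothetical placeholders, not the Catalan case-study data:

```python
from collections import defaultdict

# Hypothetical planning network: an edge (u, v) means
# "u refers to / constrains v" in the planning system.
edges = [
    ("EU_flood_directive", "national_coastal_law"),
    ("national_coastal_law", "regional_plan"),
    ("regional_plan", "municipal_plan"),
    ("civil_protection_agency", "regional_plan"),
    ("civil_protection_agency", "municipal_plan"),
    ("erosion_strategy", "municipal_plan"),
]

indeg, outdeg = defaultdict(int), defaultdict(int)
nodes = set()
for u, v in edges:
    outdeg[u] += 1
    indeg[v] += 1
    nodes |= {u, v}

# High in-degree marks an instrument that many others act upon;
# high out-degree marks an actor that shapes many documents.
hub = max(nodes, key=lambda n: indeg[n])
print(f"most constrained instrument: {hub} (in-degree {indeg[hub]})")
```

The same bookkeeping extends naturally to attributes on edges (e.g., whether a link mentions climate change), which is how integration levels can be quantified rather than judged qualitatively.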
An assessment of data quality in routine health information systems in Oyo State, Nigeria
Magister Public Health - MPH
Ensuring that routine health information systems provide good-quality information for informed decision making and planning remains a major priority in many countries and health systems. The lack of use of health information, or the use of poor-quality data, results in inadequate assessments and evaluations of health care and in weak, poorly functioning health systems. The Nigerian health system, like those of many developing countries, faces challenges with the building blocks of the health system, including a weak health information system. Although the quality of data in the Nigerian routine health information system has been deemed poor in some reports and studies, there is little research-based evidence on the current state of data quality in the country or on the factors that may influence data quality in routine health information systems. This study explored the quality of routine health information generated from health facilities in Oyo State, Nigeria, providing a picture of the state of routine data quality. It was a cross-sectional descriptive study taking a retrospective look at paper-based and electronic data records in the National Health Management Information System in Nigeria. A mixed-methods approach was used: quantitative methods to assess the quality of data within the health information system, and qualitative methods to identify factors influencing the quality of health information at the health facilities in the district. Assessment of the quality of information was done using a structured evaluation tool examining the completeness, accuracy and consistency of routine health statistics generated at these health facilities. A multistage sampling method was used in the quantitative component of the research. For the qualitative component, purposive sampling was done to select respondents from each health facility to describe the factors influencing data quality. The study found incomplete and inaccurate data in facility paper summaries as well as in the electronic databases storing aggregate information from the facility data.
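The completeness and consistency dimensions mentioned above reduce to simple ratios once records are in hand. A minimal sketch, using made-up facility records rather than the Oyo State data:

```python
# Hypothetical facility records (illustrative; not the study's dataset).
records = [
    {"facility": "A", "month": "2020-01", "opd_visits": 120, "deliveries": 8},
    {"facility": "A", "month": "2020-02", "opd_visits": None, "deliveries": 6},
    {"facility": "B", "month": "2020-01", "opd_visits": 95,  "deliveries": None},
]
fields = ["opd_visits", "deliveries"]

def completeness(records, fields):
    """Share of expected data elements that are actually filled in."""
    filled = sum(r[f] is not None for r in records for f in fields)
    return filled / (len(records) * len(fields))

def consistency(paper, electronic):
    """Share of paper-register values matching the electronic database."""
    matches = sum(p == e for p, e in zip(paper, electronic))
    return matches / len(paper)

comp = completeness(records, fields)
cons = consistency([120, 95, 8], [120, 90, 8])
print(f"completeness: {comp:.0%}")  # 4 of 6 expected elements are filled
print(f"consistency:  {cons:.0%}")  # 2 of 3 values agree across sources
```

Accuracy is assessed the same way, but against a recount of the source registers rather than a second database.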
Automatic Transformation of Natural to Unified Modeling Language: A Systematic Review
Context: Manually processing Software Requirement Specifications (SRS) takes requirement analysts in software engineering a long time, and researchers have been working on automatic approaches to ease this task. Most existing approaches either require some intervention from an analyst or are challenging to use. Some automatic and semi-automatic approaches have been developed based on heuristic rules or machine learning algorithms. However, existing approaches to UML generation face various constraints, such as restrictions on ambiguity, length or structure, anaphora, incompleteness, atomicity of the input text, requirements for a domain ontology, etc. Objective: This study aims to better understand the effectiveness of existing systems and to provide a conceptual framework with further improvement guidelines. Method: We performed a systematic literature review (SLR). We conducted our study selection in two phases and selected 70 papers. We conducted quantitative and qualitative analyses by manually extracting information, cross-checking, and validating our findings. Result: We described the existing approaches and revealed the issues observed in these works. We identified and clustered both the limitations and the benefits of the selected articles. Conclusion: This research upholds the necessity of a common dataset and evaluation framework to extend the research consistently. It also describes the significance of the natural language processing obstacles researchers face. In addition, it creates a path forward for future research.