
    A knowledge development lifecycle for reflective practice

    Reflective practice is valuable because of its potential for continuous improvement through feedback and learning. Conventional models of knowledge practice, however, do not explicitly include reflection as part of the practice, nor locate it in a developmental cycle. They focus on modelling in a knowledge plane which is itself contextualised by active knowing processes, and they ignore the influence of power in their activity models. Further, many models focus on either an artefact or a process view, reflecting a conceptual disconnect between knowledge and knowing and a failure to relate passive to active views. Using the idea of higher-order loops that govern knowledge development processes, in this paper we propose a conceptualisation of a reflective Knowledge Development Life Cycle (KDLC). This explicitly includes the investigator and the organisation itself as dynamic components of a systemic process and is suited to either a constructivist or a realist epistemological stance. We describe the stages required in the KDLC and discuss their significance. Finally, we show how incorporating reflection into the process enables dynamic interplay between the knowing and the knowledge in the organisation.

    The cat's cradle network

    In this paper we argue that context in knowledge management is best represented by recording knowledge networks in an historicised form. Characterising context as essentially external to any particular knowledge representation, we argue that it should be modelled as an additional dimension rather than by elaborating a form in its own terms. We present the formalism of the cat's cradle network and show how it can be represented by an extension of the Pathfinder associative network that includes this temporal dimension and allows evolutions of understanding to be traced. Grounding its semantics in communities of practice ensures utility and cohesiveness, which are lost when mere externalities of a representation are communicated in fully fledged forms. The scheme is general and subsumes other formalisms for knowledge representation. The cat's cradle network enables us to model such community-based social constructs as pattern languages, shared memory, and patterns of trust and reliance, by placing their establishment in a structure that shows their essential temporality.
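
    As a rough illustration of the kind of structure described (the paper's formalism is not reproduced here), the sketch below adds a temporal dimension to a weighted associative network so that earlier states of a community's understanding can be recovered. All names and values are illustrative assumptions, not the authors' notation.

    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class Association:
        source: str
        target: str
        weight: float
        valid_from: int  # time step at which the community formed this link

    @dataclass
    class TemporalNetwork:
        edges: list = field(default_factory=list)

        def add(self, source, target, weight, valid_from):
            self.edges.append(Association(source, target, weight, valid_from))

        def snapshot(self, t):
            # Recover the network as it stood at time t, tracing how
            # understandings evolved rather than keeping only the latest form.
            return [e for e in self.edges if e.valid_from <= t]

    net = TemporalNetwork()
    net.add("pattern language", "shared memory", 0.8, valid_from=1)
    net.add("shared memory", "trust", 0.6, valid_from=3)
    print(len(net.snapshot(2)))  # 1: only the first association existed at t = 2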

    The Noetic Prism

    Definitions of ‘knowledge’ and its relationships with ‘data’ and ‘information’ are varied, inconsistent and often contradictory. In particular, the traditional hierarchy of data-information-knowledge and its various revisions do not stand up to close scrutiny. We suggest that the problem lies in a flawed analysis that sees data, information and knowledge as separable concepts that are transformed into one another through processing. We propose instead that we can describe collectively all of the materials of computation as ‘noetica’, and that the terms data, information and knowledge can be reconceptualised as late-binding, purpose-determined aspects of the same body of material. Changes in complexity of noetica occur due to value-adding through the imposition of three different principles: increase in aggregation (granularity), increase in set relatedness (shape), and increase in contextualisation through the formation of networks (scope). We present a new model in which granularity, shape and scope are seen as the three vertices of a triangular prism, and show that all value-adding through computation can be seen as movement within the prism space. We show how the conceptual framework of the noetic prism provides a new and comprehensive analysis of the foundations of computing and information systems, and how it can provide a fresh analysis of many of the common problems in the management of intellectual resources.
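
    A toy rendering, under stated assumptions: the prism's three vertices can be read as coordinates, and value-adding computation as movement in that space. The NoeticState name and numeric scales are invented for illustration; the paper's model is conceptual, not numerical.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class NoeticState:
        granularity: float  # degree of aggregation
        shape: float        # degree of set relatedness
        scope: float        # degree of contextualisation through networks

        def moved(self, d_gran=0.0, d_shape=0.0, d_scope=0.0):
            # Value-adding computation modelled as movement within the prism.
            return NoeticState(self.granularity + d_gran,
                               self.shape + d_shape,
                               self.scope + d_scope)

    raw = NoeticState(granularity=0.1, shape=0.0, scope=0.0)
    aggregated = raw.moved(d_gran=0.5)         # e.g. summarising records
    networked = aggregated.moved(d_scope=0.4)  # e.g. linking into context
    print(networked)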

    Just below the surface: developing knowledge management systems using the paradigm of the noetic prism

    In this paper we examine how the principles embodied in the paradigm of the noetic prism can illuminate the construction of knowledge management systems. We draw on the formalism of the prism to examine three successful tools: frames, spreadsheets and databases. We show how their power, and also their shortcomings, arise from their domain representation, and how any organisational system based on integration of these tools and conversion between them is inevitably lossy. We suggest how a late-binding, hybrid knowledge-based management system (KBMS) could be designed that draws on the lessons learnt from these tools, by maintaining noetica at an atomic level and storing the combinatory processes necessary to create higher-level structure as the need arises. We outline the “just-below-the-surface” systems design and describe its implementation in an enterprise-wide knowledge-based system that has all of the conventional office automation features.
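
    A minimal sketch of the late-binding idea, assuming a simple in-memory key-value store: noetica are kept at an atomic level and only the combinatory processes are stored, so higher-level structure is materialised on demand. The names (atoms, views, materialise) are hypothetical, not the system's actual design.

    # Atomic noetica are stored as-is; nothing is pre-combined.
    atoms = {
        "a1": "Q3 revenue figure",
        "a2": "Q3 cost figure",
    }

    # The combinatory processes, not their results, are what the store keeps.
    views = {
        "q3_margin": lambda get: f"{get('a1')} minus {get('a2')}",
    }

    def materialise(view_id):
        # Bind atoms into a higher-level structure only when it is needed,
        # avoiding the lossy conversions that pre-integrated tools require.
        return views[view_id](atoms.__getitem__)

    print(materialise("q3_margin"))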

    The Role of Theory in Quantitative Data Analysis

    If you take theory and models seriously, then (a) you need to elaborate clearly for yourself ‘what counts’ and how things supposedly fit together, and (b) you must hold yourself accountable to data. From my perspective, theory is – or should be – the lifeblood of the empirical scientist. (Schoenfeld, 2010, p. 105)

    The process of theorizing, collecting evidence, testing theory, revising theory, and then working through the cycle again is the basis of science. Theory is no less important when conducting a data analysis using quantitative methods. Every statistical textbook devotes significant space to the assumptions of statistical tests, the data requirements of a given estimation procedure, and the boundaries around the conclusions that can be drawn from results. Theory about the phenomenon being examined through quantitative data analysis drives not only the methods used to collect evidence but also, most importantly, the decisions made about how to model and test that theory.

    The Use of Meta-Analytic Statistical Significance Testing

    The literature on meta-analytic multiplicity, the practice of conducting multiple tests of statistical significance within one study, is underdeveloped (Tendal, Nüesch, Higgins, Jüni, & Gøtzsche, 2011). We address this issue by considering how Type I errors can affect meta-analytic results, suggesting how statistical power may be affected by multiplicity corrections, and proposing how meta-analysts should analyze multiple tests of statistical significance. The context for this study is a meta-review of meta-analyses published in two leading review journals in education and psychology. Our review of 130 meta-analyses revealed a strong reliance on statistical significance testing without consideration of Type I errors or the use of multiplicity corrections. To provide valid conclusions, meta-analysts must address these issues before conducting a study.
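
    As a hedged sketch of the kind of correction at issue (the review itself prescribes no single procedure), the snippet below applies Holm's step-down adjustment to a family of meta-analytic p-values; the values are invented placeholders, not data from the 130 reviewed meta-analyses.

    from statsmodels.stats.multitest import multipletests

    # Placeholder p-values, one per effect tested within a single meta-analysis.
    p_values = [0.001, 0.020, 0.030, 0.045, 0.300]

    # Holm's procedure controls the family-wise Type I error rate while
    # retaining more power than a plain Bonferroni correction.
    reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method="holm")

    for p, p_adj, sig in zip(p_values, p_adjusted, reject):
        print(f"raw p = {p:.3f}, adjusted p = {p_adj:.3f}, significant: {sig}")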

    Validation of the Employment Hope Scale: Measuring Psychological Self-Sufficiency Among Low-Income Jobseekers

    The Employment Hope Scale (EHS) was designed to measure the empowerment-based self-sufficiency (SS) outcome among low-income job-seeking clients. This measure captures the psychological dimension of SS, as opposed to the economic SS measures more commonly used in workforce development and employment support practice. The study validates the EHS and reports its psychometric properties. Method: An exploratory factor analysis (EFA) was conducted using agency data from the Cara Program in Chicago, United States. Principal axis factor extraction was employed to identify the factor structure. Results: The EFA yielded a 13-item, two-factor structure, with Factor 1 representing “Psychological Empowerment” and Factor 2 representing “Goal-Oriented Pathways.” Both factors had high internal consistency reliability and construct validity. Conclusions: Although the findings are preliminary, this study found the EHS to be a reliable and valid measure, demonstrating its utility in assessing psychological SS as an empowerment outcome among low-income jobseekers.
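
    For readers who want to reproduce the reliability step, a minimal sketch of Cronbach's alpha on an items-by-respondents matrix follows. The simulated responses and item count are placeholders, not the Cara Program data.

    import numpy as np

    def cronbach_alpha(items):
        # items: array of shape (n_respondents, n_items) for one factor.
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances / total_variance)

    rng = np.random.default_rng(0)
    shared = rng.normal(size=(200, 1))                      # common factor
    items = shared + rng.normal(scale=0.8, size=(200, 7))   # 7 correlated items
    print(f"alpha = {cronbach_alpha(items):.2f}")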

    Quality of Research Evidence in Education: How Do We Know?

    The persistence of inequitable education is the fundamental fact facing education researchers as we reflect on the quality and value of the evidence we produce (American Educational Research Association & National Academy of Education, 2020; Educational Opportunity Monitoring Project, 2020). As a field, we must critically examine what it means for us to develop increasingly sophisticated research tools and research design models while disparate outcomes along familiar lines of race and class continue apace. This issue’s importance has been laid bare by the COVID-19 pandemic and the global protests for racial justice in the wake of George Floyd’s murder. If our research endeavors are not effectively combating racism in education, providing help as our schools refashion themselves for remote and hybrid teaching, or supporting schools in other ways to address the myriad equity gaps they face, then what are we doing? What are we generating evidence of and for?

    Late results of surgical and medical therapy for patients with coronary artery disease and depressed left ventricular function

    Late survival and freedom from myocardial infarction were determined for 192 patients with coronary artery disease and depressed left ventricular ejection fraction at rest (<35%), determined by biplane angiography, who were evaluated between 1970 and 1977. Seventy-seven patients had coronary artery bypass grafting and 115 patients were treated medically but were considered surgical candidates. The medical and surgical groups were comparable in all baseline characteristics examined except the frequency of three-vessel disease and angina pectoris, both of which occurred in a significantly greater percentage of the surgically treated patients (p < 0.01). Only three medically treated patients (2.6%) underwent coronary bypass grafting during the follow-up period. Seven-year actuarial survival was 63% in the surgical group and 34% in the medical group (p < 0.001). Ninety-three percent of patients in the surgical group and 81% of those in the medical group were free of nonfatal myocardial infarction (p = 0.01), and 62% and 33%, respectively, were alive and free of myocardial infarction (p < 0.001) at 7 years. Significant differences in survival favoring surgical treatment were observed for the subsets of patients with an ejection fraction of 25% or less (p = 0.0002) and 26 to 35% (p = 0.01), and for the subsets with three-vessel coronary disease (p < 0.001), normal left ventricular end-diastolic volume (<100 ml/m²) (p = 0.005), and elevated end-diastolic volume (>100 ml/m²) (p = 0.001). After adjustment for other important prognostic variables, the type of treatment remained significant in predicting the relative risk (medical to surgical) of mortality at 5 and 7 years (2.58 and 2.12, respectively). These data corroborate the trends observed in several randomized trials of medical and surgical therapy in patients with abnormal left ventricular function. If hospital mortality for coronary artery bypass grafting is less than 5%, substantial benefit can be anticipated for the majority of patients with depressed ventricular function.
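
    A hedged sketch of the kind of group comparison reported above: the paper uses actuarial (life-table) estimates, while Kaplan-Meier estimation with a log-rank test is the closest standard routine in the lifelines library. All follow-up times below are invented, not the study's data.

    import numpy as np
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    # Invented follow-up times in years, censored at the 7-year window.
    rng = np.random.default_rng(1)
    t_surg = np.minimum(rng.exponential(12.0, size=77), 7.0)
    t_med = np.minimum(rng.exponential(5.0, size=115), 7.0)
    e_surg = t_surg < 7.0   # death observed within 7 years
    e_med = t_med < 7.0

    kmf = KaplanMeierFitter()
    kmf.fit(t_surg, event_observed=e_surg, label="surgical")
    print(kmf.survival_function_at_times(7.0))  # estimated 7-year survival

    result = logrank_test(t_surg, t_med,
                          event_observed_A=e_surg, event_observed_B=e_med)
    print(f"log-rank p = {result.p_value:.4f}")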