55 research outputs found

    Cognición y aprendizaje (Cognition and Learning)


    Math and Science Education

    Resnick provides an excellent brief account of current work in cognitive psychology and its important implications for math and science education. As she indicates, most cognitive psychologists view knowledge as consisting of highly organized schemata into which new experiences are assimilated, and view the learner as actively constructing new knowledge. This view is consistent with the ideas that Piagetian theorists and educators have been propounding for many years, although Resnick’s discussion is rooted in the more detailed analysis of specific knowledge and learning in specific content areas that typifies the information-processing paradigm of modern cognitive science.

    Beyond summative evaluation: The Instructional Quality Assessment as a professional development tool

    In order to improve students' opportunities to learn, educators need tools that can help them reflect on and analyze their own and others' teaching practice. Many available observation tools and protocols for studying student work are inadequate because they do not directly engage educators in core issues about rigorous content and pedagogy. In this conceptual paper, we argue that the Instructional Quality Assessment (IQA)--a formal toolkit for rating instructional quality that is based primarily on classroom observations and student assignments--has strong potential to support professional development within schools at multiple levels. We argue that the IQA could be useful to "teachers" for analyzing their own and their colleagues' practice; additionally, the IQA could aid the efforts of "principals" in their work as instructional leaders, identifying effective practitioners to help lead professional development within a school and targeting professional development needs that would require external support. Although the IQA was designed for summative, external evaluation, we argue that the steps taken to improve the reliability of the instrument--particularly the efforts to make the rubric descriptors for gradations of instructional quality as transparent as possible--also serve to make the tool a resource for professional growth among educators. The following are appended: (1) Abridged Version of the Principles of Learning; (2) Relationship between Checklist Ratings and Rubric Scores; and (3) Accountable Talk Function Checklist. (Contains 3 notes, 1 table, and 1 figure.)

    Using the Instructional Quality Assessment toolkit to investigate the quality of reading comprehension assignments and student work (CSE Report 669)

    This study presents preliminary findings from research developing an instructional quality assessment (IQA) toolkit that could be used to monitor the influence of reform initiatives on students' learning environments and to guide professional development efforts within a school or district. This report focuses specifically on the portion of the IQA used to evaluate the quality of teachers' reading comprehension assignments and student work. Results are limited by a very small sample of participating teachers (N = 13; 52 assignments); they indicate a poor to moderate level of inter-rater agreement and a good degree of consistency for the dimensions measuring academic rigor, but not for the clarity of teachers' expectations. The rigor of the assignments collected from teachers was also associated with the rigor of observed instruction. Collecting four assignments (two challenging and two recent) from teachers did not yield a stable estimate of quality. Additional analyses looking separately at the two different assignment types indicate, however, that focusing on one assignment type would yield a stable estimate of quality. This suggests that the way in which assignments are collected from teachers should be revised. Implications for professional development are also discussed. The 2003 Draft Observation and Assignment Rubrics for Reading Comprehension is appended. (Contains 6 tables, 4 figures, and 4 footnotes.)

    Proceedings of the 3rd Biennial Conference of the Society for Implementation Research Collaboration (SIRC) 2015: advancing efficient methodologies through community partnerships and team science

    It is well documented that the majority of adults, children and families in need of evidence-based behavioral health interventions do not receive them [1, 2] and that few robust, empirically supported methods for implementing evidence-based practices (EBPs) exist. The Society for Implementation Research Collaboration (SIRC) represents a burgeoning effort to advance the innovation and rigor of implementation research and is uniquely focused on bringing together researchers and stakeholders committed to evaluating the implementation of complex evidence-based behavioral health interventions. Through its diverse activities and membership, SIRC aims to foster the promise of implementation research to better serve the behavioral health needs of the population by identifying rigorous, relevant, and efficient strategies that successfully transfer scientific evidence to clinical knowledge for use in real world settings [3]. SIRC began as a National Institute of Mental Health (NIMH)-funded conference series in 2010 (previously titled the “Seattle Implementation Research Conference”; $150,000 USD for 3 conferences in 2011, 2013, and 2015) with the recognition that there were multiple researchers and stakeholders working in parallel on innovative implementation science projects in behavioral health, but that formal channels for communicating and collaborating with one another were relatively unavailable. There was a significant need for a forum within which implementation researchers and stakeholders could learn from one another, refine approaches to science and practice, and develop an implementation research agenda using common measures, methods, and research principles to improve both the frequency and quality with which behavioral health treatment implementation is evaluated.
    SIRC’s membership growth is a testament to this identified need, with more than 1000 members from 2011 to the present. SIRC’s primary objectives are to: (1) foster communication and collaboration across diverse groups, including implementation researchers, intermediaries, and community stakeholders (SIRC uses the term “EBP champions” for these groups), and to do so across multiple career levels (e.g., students, early career faculty, established investigators); and (2) enhance and disseminate rigorous measures and methodologies for implementing EBPs and evaluating EBP implementation efforts. These objectives are well aligned with Glasgow and colleagues’ [4] five core tenets deemed critical for advancing implementation science: collaboration, efficiency and speed, rigor and relevance, improved capacity, and cumulative knowledge. SIRC advances these objectives and tenets through in-person conferences, which bring together multidisciplinary implementation researchers and those implementing evidence-based behavioral health interventions in the community to share their work and create professional connections and collaborations.