
    A Process-Based Approach to ABET Accreditation: A Case Study of a Cybersecurity and Digital Forensics Program

    ABET accreditation has become a well-known standard for academic programs, not only in the U.S. but across the globe. Instantiating the processes needed to systematically improve program quality is a daunting task for higher education institutions. In this contribution, we provide a detailed process-based framework that can help aspiring institutions embed quality in their processes on the path to ABET accreditation. Our contribution is a novel framework for a process-based approach to quality assurance; most of the published literature is concerned primarily with the ABET accreditation experience of a single program, whereas this paper presents a generic framework that aspiring programs can instantiate in their preparation for ABET accreditation. We validated these processes in our successful ABET accreditation application for the Bachelor of Science in Cybersecurity and Digital Forensics program. Our existing ABET-accredited programs followed the old ABET criteria, while the Bachelor of Science in Cybersecurity and Digital Forensics program had to apply under the new criteria proposed by ABET. A further novelty of our contribution is that it is based on our work in the first application cycle for ABET cybersecurity-related programs, so our findings may help other aspiring cybersecurity-related academic programs prepare well for their ABET accreditation pursuit.

    Assessing Information Systems and Computer Information Systems Programs from a Balanced Scorecard Perspective

    Assessment of educational programs is one of the important means used in academia for accountability, accreditation, and improvement of program quality. Assessment practices, guidelines, and requirements are very broad and vary widely among academic programs and from one institution to another. In this paper, through the theoretical lens of a strategic planning and management methodology, the Balanced Scorecard, we integrate various perspectives into a performance assessment framework for the educational assessment of computing and information systems. In particular, based on actual accreditation experience, we propose two assessment models: a conceptual model and a process model. This modeling approach addresses the critical conceptual elements required for educational assessment and provides practical guidelines to follow for a complete, smooth, and successful assessment process. In addition, we present a set of robust tools and techniques incorporated into the process steps, teamwork, and task-driven management process. Using these assessment methods, we succeeded in our accreditation efforts and improved the quality of our computing and information systems programs. We share our views and thoughts in the form of lessons learned and suggested best practices so as to streamline program assessment and simplify its procedures and steps.

    CSU industrial hygiene program training needs analysis

    Includes bibliographical references. 2022 Fall. Graduates of industrial hygiene (IH) programs must be able to meet continuously evolving health and safety needs in a wide variety of occupational settings. Therefore, academic IH graduate programs must regularly evaluate their curricula and solicit input from industry professionals in order to make curricular changes that better prepare their students for professional roles in industry. The purpose of this study was to identify the training gaps that exist between industry needs, accreditation criteria, and the current curriculum of the CSU Industrial Hygiene graduate program. In Phase 1 of this study, a set of curriculum maps was developed to evaluate training gaps between accreditation criteria, course learning objectives, and course assessment mechanisms for each core course in the IH curriculum. In Phase 2, the research team facilitated two group interviews with the Colorado State University (CSU) IH Advisory Board, collected alumni survey data, and performed a qualitative analysis to identify skills gaps and needs for CSU IH Program graduates. In Phase 1, only one gap was identified between accreditation criteria and IH program course objectives in the nine core departmental courses of the CSU IH graduate curriculum. No gaps were found between IH Program course objectives and course assessment mechanisms. In Phase 2, the research team identified three themes (technical, applied, and soft skills) and selected several skills within each theme that interview participants considered necessary proficiencies for young IH professionals. The curriculum mapping exercise generally validated the satisfactory accreditation status of the CSU IH Program graduate curriculum. The development of the curriculum mapping tool and evaluation method can provide evidence for the re-accreditation process of the CSU IH Program, as well as other occupational health programs.
The skills identified in the qualitative interview and survey data can be incorporated into the curriculum to improve the training of IH students. Additionally, by using qualitative analysis, the researchers uncovered soft skills previously unidentified in IH needs assessments, providing valuable information for all IH graduate programs.
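The Phase 1 gap analysis described above amounts to checking two mappings for unmapped entries: accreditation criteria to course learning objectives, and objectives to assessment mechanisms. A minimal sketch of that check, using entirely illustrative criterion and objective identifiers (not the study's actual data), might look like:

```python
# Hypothetical sketch of a curriculum-map gap check: a "gap" is any
# criterion (or objective) with no downstream mapping. All identifiers
# here are made up for illustration; they are not from the CSU study.

criteria_to_objectives = {
    "IH-CRIT-1": ["ERHS510-LO1", "ERHS510-LO2"],
    "IH-CRIT-2": ["ERHS520-LO1"],
    "IH-CRIT-3": [],  # unmapped criterion -> training gap
}

objectives_to_assessments = {
    "ERHS510-LO1": ["midterm exam"],
    "ERHS510-LO2": ["lab report"],
    "ERHS520-LO1": ["final project"],
}

def find_gaps(mapping):
    """Return keys that map to nothing downstream, sorted for stable output."""
    return sorted(key for key, targets in mapping.items() if not targets)

criterion_gaps = find_gaps(criteria_to_objectives)
objective_gaps = find_gaps(objectives_to_assessments)
print("Criteria without objectives:", criterion_gaps)
print("Objectives without assessments:", objective_gaps)
```

The same check runs in both directions of the map, which mirrors the study's two results: one criterion-to-objective gap, and none from objectives to assessments.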

    Teaching ethics to engineering undergraduates - lessons learned and a guide for lecturers: perspective from an English University

    The issue of ethics within the engineering profession has been gaining more and more importance due to globalisation, increasing awareness of sustainability, and the fast-changing business culture within engineering organisations. As a direct result of such factors, accrediting bodies such as the IMechE and ABET are very vocal about explicit ethics content in relevant undergraduate engineering programmes. However, delivering the topic effectively is a very challenging exercise, for a number of reasons. First and foremost is the general reluctance of today’s lecturers, who were themselves not taught such topics and hence are largely not keen to take such ‘softer’ topics very seriously. It is also difficult to accommodate the content within an engineering curriculum already filled with technical subjects. At the same time, a significant proportion of students find it difficult, through inexperience, to relate ethics to real-life working environments, and hence consider ‘ethics’ to be ‘not so rigorous’ a subject, resulting in poor engagement. The present paper discusses the complete journey of how engineering ethics has been incorporated into an accredited BEng programme in Mechanical Engineering. The three steps in course design, i.e., breadth and depth of content, detailed planning for effective delivery, and assessment and feedback, are all critically discussed with reference to the available literature. The author also provides more than one pathway, so that the experience may prove useful to the wider community.

    International ABET Accreditation: From The Perspective Of A South African Information Systems Department

    The undergraduate Information Systems program of the Department of Informatics, University of Pretoria, recently obtained ABET accreditation. Having an accredited program, and keeping it accredited, has quite a few implications for the processes and structure of the department. However, this paper focuses mainly on the difficulties surrounding getting a non-US IS program accredited by a US-based accreditation body. We hope that the insights we gained from this experience will assist other hosts of non-US programs in their preparations for ABET accreditation.

    Report from the STEM 2026 Workshop on Assessment, Evaluation, and Accreditation

    A gathering of science, technology, engineering, and math (STEM) higher education stakeholders met in November 2018 to consider the relationship between innovation in education and assessment. When we talk about assessment in higher education, it is inextricably linked to both evaluation and accreditation, so all three were considered. The first question we asked was: can we build a nation of learners? This starts with considering the student, first and foremost. As educators, this is a foundation of our exploration and makes our values transparent. As educators, how do we know we are having an impact? As members and implementers of institutions, programs, and professional societies, how do we know students are learning and that what they are learning has value? The focus of this conversation was on undergraduate learning, although we acknowledge that the topic is closely tied to successful primary and secondary learning as well as graduate education. Within the realm of undergraduate education, students can experience four-year institutions and two-year institutions, with many students learning at both, at different times. Thirty-seven participants spent two days considering cases of innovation in STEM education, learning about best practices in assessment, and then discussing the relationship of innovation and assessment at multiple levels within the context of higher education. Six working groups looked at course-level, program-level, and institution-level assessment, as well as cross-disciplinary programs, large-scale policy issues, and the difficult-to-name “non-content/cross-content” group that looked at assessment of transferable skills and attributes like professional skills, scientific thinking, mindset, and identity, all of which are related to post-baccalaureate success.
These conversations addressed issues that cut across multiple levels, disciplines, and course topics, or are otherwise seen as tangential or perpendicular to the perhaps “required” assessment at institutional, programmatic, or course levels. This report presents the context, recommendations, and “wicked” challenges from the meeting participants and their working groups. Along with the recommendations of workshop participants, these intricate challenges weave a complex web of issues that collectively need to be addressed by our community. They generated a great deal of interest and engagement from workshop participants, and act as a call to continue these conversations and seek answers that will improve STEM education through innovation and improved assessment. This material is based upon work supported by the National Science Foundation under Grant No. DUE-1843775. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

    A web-based course assessment tool with direct mapping to student outcomes

    The assessment of curriculum outcomes is an essential element of continuous academic improvement. However, the collection, aggregation, and analysis of assessment data are notoriously complex and time-consuming processes. At the same time, only a few supporting electronic processes and tools for continuous academic program assessment and curriculum performance feedback have emerged. In this paper, we introduce a novel course assessment process supported by a Web-based interface that articulates and streamlines assessment data collection, performance evaluation, and the tracking of remedial recommendations. To close the assessment loop, the Web interface also provides a mechanism to follow up on the implementation of remedial recommendations and analyzes their associated reflective actions during the subsequent course assessment cycle. The proposed tool advocates a guide for mapping assessment instruments to course and overall program outcomes, propagating the course assessment results toward higher educational objectives (e.g., student outcomes) in a dashboard-like assessment interface. This approach streamlines improvements in education by reflecting the achievement of course outcomes in the achievement of higher educational objectives. In addition, the tool maps the course outcomes to the corresponding course outlines to facilitate the detection of areas where revisions to instruction and content are needed, and to best respond to recommendations and remedial actions. We provide a methodical approach as well as a Web-based automation of the assessment process, which we evaluate in the context of our regular academic assessment cycles, eventually leading to a successful international accreditation experience. The collected assessment data show a significant improvement in the achievement rate of the student outcomes after deploying the tool.
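The propagation this abstract describes, from assessment instruments up through course outcomes to student outcomes, can be sketched as two successive aggregation steps. The outcome names, instruments, and simple-average aggregation below are illustrative assumptions, not the tool's actual data model:

```python
# Minimal sketch of propagating instrument-level achievement rates up to
# student outcomes. All names and numbers are hypothetical; the real tool
# may weight instruments differently rather than taking plain averages.

# Achievement rate per assessment instrument
# (fraction of students meeting the performance target).
instrument_scores = {"quiz1": 0.80, "project": 0.70, "final": 0.90}

# Each course outcome (CO) aggregates the instruments mapped to it.
course_outcome_map = {
    "CO1": ["quiz1", "final"],
    "CO2": ["project"],
}

# Each student outcome (SO) aggregates the course outcomes mapped to it.
student_outcome_map = {
    "SO-a": ["CO1", "CO2"],
}

def aggregate(mapping, scores):
    """Average the mapped lower-level scores for each higher-level outcome."""
    return {
        outcome: sum(scores[key] for key in keys) / len(keys)
        for outcome, keys in mapping.items()
    }

co_rates = aggregate(course_outcome_map, instrument_scores)
so_rates = aggregate(student_outcome_map, co_rates)
print(co_rates)  # CO1 averages quiz1 and final; CO2 is just the project
print(so_rates)  # SO-a averages CO1 and CO2
```

Because the same `aggregate` step applies at each level, adding a program-objective layer above student outcomes, as the paper's dashboard does, is just one more pass over another mapping.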

    Invited Paper: Ingredients of a High-Quality Information Systems Program in a Changing IS Landscape

    This paper describes James Madison University’s undergraduate major in Computer Information Systems as an example of a high-quality Information Systems (IS) program and discusses our planned evolution in the context of rapid changes in technological, business, and social factors. We have determined what we consider to be the five essential ingredients that make JMU’s program a high-quality IS major: (1) building an integrated, rigorous curriculum with a strong technical foundation; (2) developing a vibrant community of faculty, students, alumni, employers, and community service organizations; (3) respecting and supporting pedagogical scholarship; (4) committing to continuous improvement and assessment; and (5) accreditation. We believe these ingredients will continue to be highly relevant as the IS discipline moves forward, but also that curriculum content will need to adjust to meet changing demand. We discuss the increasing relevance of topics such as analytics, security, and the cloud to the IS curriculum, and their implications for pedagogy, accreditation, and scholarship. We hope that sharing JMU’s experience, insights, and future directions will be useful to JISE’s readership.