49,965 research outputs found

    An Equitable Approach to Academic Progression Advising: To Use an Automated Tool or Not?

    This Organizational Improvement Plan (OIP) addresses the problem of inequitable access to an automated degree audit tool at a large metropolitan university in central Canada, which affects year-to-year progression and timely graduation. Although the tool was implemented 17 years ago, it has never been fully adopted, leaving the campus community to navigate degree progression inconsistently and in isolation, with nearly half of students underserved; of particular concern are first-generation students. Stakeholder theory underpins the organizational change management plan, which espouses authentic, transformational, and shared leadership approaches. Following internal and external analyses, it was determined that the institution has the capacity and readiness for change. Using an interpretive paradigm to consider multiple perspectives, four possible solutions are considered. After assessing time, finances, human resources, and impact, the preferred solution is to implement an incremental working group. An inclusive and comprehensive change model and an iterative review cycle, comprising immediate and ongoing monitoring and evaluation, are proposed for effective implementation. In addition, strategic communication supporting equitable access to the information needed for degree progression will enhance the overall student experience and improve administrative efficiencies. The execution of a five-stage change plan will see all academic advisors using the automated degree audit tool by 2025. Successful implementation of this OIP will allow for the exploration of other initiatives to improve the student experience using technology and a social justice lens in leadership.

    Using Counts as Heuristics for the Analysis of Static Models

    The upstream activities of software development are often viewed as both the most important, in terms of cost, and yet the least understood and most problematic, particularly in terms of satisfying customer requirements. Business process modelling is one solution that is increasingly used in conjunction with traditional software development, often feeding into requirements and analysis activities. In addition, research in Systems Engineering for Business Process Change highlights the importance of modelling business processes when evolving and maintaining the legacy systems that support those processes. However, the major use of business process modelling is to restructure the business process in order to improve some given aspect, e.g., cost or time. This restructuring may be seen either as a separate activity or as a precursor to the development of systems that support the new or improved process. Hence, the analysis of these business models is vital to the improvement of the process and, as a consequence, to the development of supporting software systems. Supporting this analysis is the focus of this paper. Business processes are typically described with static (diagrammatic) models. This paper proposes the use of measures (counts) to aid the analysis and comparison of these static process descriptions. The proposition is illustrated by showing how measures can be applied to a commonly used process-modelling notation, Role Activity Diagrams (RADs). Heuristics for RADs are described and measures suggested which support those heuristics. An example process is used to show how a coupling measure can highlight features in RADs that are useful to the process modeller. To fully illustrate the proposition, the paper describes and applies a framework for the theoretical validation of the coupling measure. An empirical evaluation follows, illustrated by two case studies: the first based on the bidding process of a large telecommunications systems supplier, and the second a study of ten prototyping processes across a number of organisations. These studies found that roles of the same type exhibited similar levels of coupling across processes. Where roles did not adhere to tentative threshold values, further investigation revealed unusual circumstances or hidden behaviour. Notably, the study of the prototyping roles, which exhibited the greatest variation in coupling, found that coupling was highly correlated with the size of the development team. This suggests that prototyping in large projects followed a different process from that in small projects, using more mechanisms for communication. Hence, the empirical studies support the view that counts (measures) may be useful in the analysis of static process models.
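
    The paper's formal coupling measure for RADs is not given in this abstract, so the sketch below is only a rough, assumed illustration of the general idea of counting over a static model: for each role in a toy process description, it counts the interactions the role participates in. All role names and the data structure are invented for illustration.

        # Hypothetical sketch only: this is not the paper's RAD coupling measure.
        # It merely illustrates counting interactions per role in a static model.
        from collections import defaultdict

        # Each entry is a pair of roles that interact in the (toy) process model.
        interactions = [
            ("Bid Manager", "Sales"),
            ("Bid Manager", "Engineering"),
            ("Sales", "Customer"),
            ("Bid Manager", "Customer"),
        ]

        def coupling_counts(pairs):
            """Count, for every role, how many interactions it takes part in."""
            counts = defaultdict(int)
            for a, b in pairs:
                counts[a] += 1
                counts[b] += 1
            return dict(counts)

        for role, count in sorted(coupling_counts(interactions).items()):
            print(f"{role}: {count}")

    Roles whose counts exceed some tentative threshold would then be candidates for closer inspection, mirroring the way the study uses threshold values to flag unusual circumstances or hidden behaviour.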

    Can Who-Edits-What Predict Edit Survival?

    Get PDF
    As the number of contributors to online peer-production systems grows, it becomes increasingly important to predict whether the edits that users make will eventually be beneficial to the project. Existing solutions either rely on a user reputation system or consist of a highly specialized predictor that is tailored to a specific peer-production system. In this work, we explore a different point in the solution space that goes beyond user reputation but does not involve any content-based feature of the edits. We view each edit as a game between the editor and the component of the project that it affects. We posit that the probability that an edit is accepted is a function of the editor's skill, of the difficulty of editing the component, and of a user-component interaction term. Our model is broadly applicable, as it only requires observing who makes an edit, what the edit affects, and whether the edit survives. We apply our model to Wikipedia and the Linux kernel, two examples of large-scale peer-production systems, and we seek to understand whether it can effectively predict edit survival: in both cases, we provide a positive answer. Our approach significantly outperforms approaches based solely on user reputation and bridges the gap with specialized predictors that use content-based features. It is simple to implement, computationally inexpensive, and it additionally enables us to discover interesting structure in the data. Comment: Accepted at KDD 201
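
    The abstract specifies the acceptance probability only as a function of editor skill, component difficulty and a user-component interaction term. One plausible reading (an assumption here, not necessarily the paper's exact formulation) is a logistic model over those three quantities; the sketch below uses made-up parameters to show that form.

        # Assumed logistic form (not taken verbatim from the paper):
        # P(edit survives) = sigmoid(skill_u - difficulty_c + <x_u, y_c>)
        import numpy as np

        rng = np.random.default_rng(0)
        n_users, n_components, dim = 5, 4, 3

        skill = rng.normal(size=n_users)                      # per-editor skill
        difficulty = rng.normal(size=n_components)            # per-component difficulty
        x = rng.normal(scale=0.1, size=(n_users, dim))        # latent user factors
        y = rng.normal(scale=0.1, size=(n_components, dim))   # latent component factors

        def survival_probability(u, c):
            """Probability that an edit by user u on component c survives."""
            logit = skill[u] - difficulty[c] + x[u] @ y[c]
            return 1.0 / (1.0 + np.exp(-logit))

        print(survival_probability(0, 2))

    Parameters of this kind would be learned from (user, component, survived) observations, consistent with the abstract's claim that the model only needs to see who makes an edit, what it affects and whether it survives.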

    MITK-ModelFit: A generic open-source framework for model fits and their exploration in medical imaging -- design, implementation and application on the example of DCE-MRI

    Many medical imaging techniques use fitting approaches for quantitative parameter estimation and analysis. Common examples are pharmacokinetic modeling in DCE MRI/CT, ADC calculation and IVIM modeling in diffusion-weighted MRI, and Z-spectra analysis in chemical exchange saturation transfer MRI. Most available software tools are limited to a special purpose and do not allow for custom developments and extensions. Furthermore, they are mostly designed as stand-alone solutions using external frameworks and thus cannot easily be incorporated natively into the analysis workflow. We present a framework for medical image fitting tasks that is included in MITK, following a rigorous open-source, well-integrated and operating-system-independent policy. From a software engineering perspective, the local models, the fitting infrastructure and the results representation are abstracted and can therefore be easily adapted to any model-fitting task on image data, independent of image modality or model. Several ready-to-use libraries for model fitting and use cases, including fit evaluation and visualization, were implemented. Their embedding in MITK allows for easy data loading, pre- and post-processing, and thus a natural inclusion of model fitting into an overarching workflow. As an example, we present a comprehensive set of plug-ins for the analysis of DCE MRI data, which we validated on existing and novel digital phantoms, yielding competitive deviations between fit and ground truth. Providing a very flexible environment, our software mainly addresses developers of medical imaging software that includes model-fitting algorithms and tools. Additionally, the framework is of high interest to users in the domain of perfusion MRI, as it offers feature-rich, freely available, validated tools to perform pharmacokinetic analysis on DCE MRI data, with both interactive and automated batch-processing workflows. Comment: 31 pages, 11 figures. URL: http://mitk.org/wiki/MITK-ModelFi
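
    The snippet below does not use MITK-ModelFit's actual API (which the abstract does not show); it only illustrates the underlying idea of voxel-wise fitting of a parametric signal model, here an assumed mono-exponential decay fitted with SciPy. The model, the names and the data are illustrative.

        # Illustration only, not MITK-ModelFit code: fit an assumed mono-exponential
        # signal model S(t) = S0 * exp(-k * t) independently to each voxel's curve.
        import numpy as np
        from scipy.optimize import curve_fit

        def model(t, s0, k):
            return s0 * np.exp(-k * t)

        t = np.linspace(0.0, 5.0, 20)          # acquisition time points
        true_s0, true_k = 100.0, 0.7
        clean = model(t, true_s0, true_k)      # noiseless signal curve, shape (20,)
        voxels = clean + np.random.normal(0.0, 1.0, size=(10, t.size))  # 10 toy voxels

        # Fit the model per voxel and collect the estimated (S0, k) parameters.
        params = np.array([curve_fit(model, t, v, p0=(50.0, 0.5))[0] for v in voxels])
        print(params.mean(axis=0))             # average estimates over voxels

    In a framework of the kind described, the model definition, the per-voxel fitting loop and the handling of the resulting parameter maps are the pieces that would be abstracted so they can be exchanged for other modalities and models.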