The need for a risk management framework for data science projects: a systematic literature review
Many data science endeavors encounter failure, which can surface at any project phase. Even after successful deployments, data science projects grapple with ethical dilemmas, such as bias and discrimination. Current project management methodologies prioritize efficiency and cost savings over risk management, and they largely overlook the diverse risks of sociotechnical systems and the risk articulation inherent in data science lifecycles. Conversely, while established risk management frameworks (RMFs), such as those from NIST and McKinsey, aim to manage AI risks, they rely heavily on normative definitions of risk, neglecting the multifaceted subjectivities of data science project failures. This paper reports on a systematic literature review that identifies three main themes: Big Data Execution Issues, Demand for a Risk Management Framework tailored for Large-Scale Data Science Projects, and the need for a General Risk Management Framework for all Data Science Endeavors. Another overarching focus is on how risk is articulated by institutions and practitioners. The paper discusses a novel and adaptive data science risk management framework, "DS EthiCo RMF", which merges project management, ethics, and risk management for diverse data science projects into one holistic framework. This agile risk management framework, DS EthiCo RMF, can bridge the current divide between normative risk standards and the multitude of data science requirements, offering a human-centric method to navigate the intertwined sociotechnical risks of failure in data science projects.
Thoughts on Current and Future Research on Agile and Lean: Ensuring Relevance and Rigor
Over the past two decades, research in the area of agile and lean software development has mirrored the strong growth in the use of agile and lean methodologies. However, while these research streams have made a significant contribution to the use of agile and lean methodologies, much of the recent research lacks the rigor and relevance to make an impact in research and practice. For example, many of the studies have not measured the actual use of agile or lean methods, nor have they had a significant theoretical grounding. Furthermore, agile research has not expanded to fully cover emerging opportunities and challenges. A deeper theoretical motivation for agile and lean software development can help demonstrate how the principles of, for example, agile software development may be transferred to these other areas, and hence broaden the research's relevance. This paper provides commentary intended to help push the agile and lean research agenda forward, and outlines three key criteria that future researchers should consider when conducting research on the phenomenon of agile. The paper also provides an example of the use of the criteria, and presents several initial, open research questions that could help increase the use of agile, including the use of agile and lean concepts in other IT and non-IT contexts.
The GET Immersion Experience: A New Model for Leveraging the Synergies between Industry and Academia
This article describes a new and innovative open co-op program for MIS/IS students. The program, Global Enterprise Technology Immersion Experience (GET IE), has a global enterprise focus that is integrated with hands-on experiential work-based learning to provide a context in which students are stimulated to utilize their classroom knowledge. The program includes a two-semester internship component that can be seamlessly incorporated into an existing MIS curriculum. The internship's unique pedagogical innovation is to deliver academic coursework on global enterprise technology to the students just in time, that is, while they are participating in an extended internship. The program, in effect, creates a domain-specific, next-generation co-op program that complements traditional information systems curricula with a skillset that is required for creating and running very large global enterprise applications. The guiding GET consortium consists of four universities and a number of large companies, and the consortium is open to future expansion. The continued growth of the consortium would enrich student choices and foster cross-fertilization of curriculum activities.
Coordination in OSS 2.0: ANT Approach
Open source software projects are increasingly driven by a combination of independent and professional developers, the former volunteers and the latter hired by a company to contribute to the project in support of commercial product development. This mix of developers has been referred to as OSS 2.0. However, we do not fully understand the multi-layered coordination spanning individuals, teams, and organizations. Using Actor-Network Theory (ANT), we describe how coordination and power dynamics unfold among developers and how different tools and artifacts both display activities and mediate coordination efforts. Internal communication within an organization was reported to cause broken links in the community, duplication of work, and political tensions. ANT shows how tools and code can exercise agency and alter a software development process as equally active actors in the scene. We discuss the theoretical and practical implications of the changing nature of open source software development.
Helping Data Science Students Develop Task Modularity
This paper explores the skills needed to be a data scientist. Specifically, we report on a mixed-method study of a project-based data science class, in which we evaluated student effectiveness with respect to dividing a project into appropriately sized modular tasks, which we termed task modularity. Our results suggest that while data science students can appreciate the value of task modularity, they struggle to achieve effective task modularity. As a first step, based on our study, we identified six task decomposition best practices. However, these best practices do not fully address the gap of how to enable data science students to use task modularity effectively. We note that while computer science/information systems programs typically teach modularity (e.g., the decomposition process and abstraction), there remains a need to identify a corresponding model for teaching modularity to data science students.
Imaging and Neuro-Oncology Clinical Trials of the National Clinical Trials Network (NCTN)
Imaging in neuro-oncology clinical trials can be used to validate patient eligibility, stage at presentation, response to therapy, and radiation therapy. A number of National Clinical Trials Network trials illustrating this are presented. Through the Imaging and Radiation Oncology Core's quality assurance processes for data acquisition and review, there are uniform data and imaging sets for review. Once the trial endpoints have been analyzed and published, the clinical trial information, including pathology, imaging, and radiation therapy objects, can be moved to a public archive for use by investigators interested in translational science and the application of new informatics tools for trial analysis.