
    Modeling functional requirements using tacit knowledge: a design science research methodology informed approach

    The research in this paper adds to the discussion of the challenge of capturing and modeling tacit knowledge throughout software development projects. The issue emerged when modeling functional requirements during a project for a client. Applying the design science research methodology at that point in the project helped to create an artifact, a functional requirements modeling technique, that resolved the issue with tacit knowledge. Accordingly, this paper reports research based upon the stages of the design science research methodology to design and test the artifact in an observable situation, empirically grounding the research undertaken. An integral component of the design science research methodology, the knowledge base, assimilated structuration and semiotic theories so that other researchers can test the validity of the artifact created. First, structuration theory helped to identify how tacit knowledge is communicated and can be understood when modeling functional requirements for new software. Second, structuration theory prescribed the application of semiotics, which facilitated the development of the artifact. Additionally, following the stages of the design science research methodology and the associated tasks allows the research to be reproduced in other software development contexts. As a positive outcome, using the functional requirements modeling technique created specifically for obtaining tacit knowledge on the software development project indicates that using such knowledge increases the likelihood of deploying software successfully.

    Customising software products in distributed software development: a model for allocating customisation requirements across organisational boundaries

    Requirements engineering plays a vital role in the software development process. While it is difficult to manage requirements locally, it is even more difficult to communicate them across organisational boundaries and to convey them to multiple distribution customers. This paper discusses the requirements of multiple distribution customers empirically in the context of customised software products. The main purpose is to understand the challenges of communicating and allocating customisation requirements across distributed organisational boundaries. We conducted an empirical survey with 19 practitioners, which confirmed that communicating customisation requirements in a distributed software development (DSD) context is a significant challenge. We therefore propose a model for allocating customisation requirements between a local, customer-based agile team and a distributed development team that uses a traditional development approach. Our conjecture is that the model would reduce the challenge of communicating requirements across organisational boundaries, address customers' requirements and provide a focus for future empirical studies.
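    The abstract does not detail the internals of the proposed allocation model. As a purely illustrative sketch, and not the authors' model, the Python snippet below shows one way customisation requirements could be represented and routed between a local customer-facing agile team and a distributed development team; the attributes and the classification rule are assumptions.

```python
# Hypothetical sketch: representing customisation requirements and routing them
# between a local customer-facing agile team and a distributed development team.
# The attributes and classification rule are illustrative assumptions, not the paper's model.
from dataclasses import dataclass

@dataclass
class CustomisationRequirement:
    req_id: str
    description: str
    customer: str
    touches_core_product: bool   # assumed attribute: does it change the shared code base?
    customer_specific: bool      # assumed attribute: valid for a single customer only?

def allocate(requirements):
    """Split requirements between the local agile team and the distributed team."""
    local_team, distributed_team = [], []
    for req in requirements:
        # Assumption: customer-specific changes that leave the core untouched stay
        # with the local team; anything affecting the shared product is escalated.
        if req.customer_specific and not req.touches_core_product:
            local_team.append(req)
        else:
            distributed_team.append(req)
    return local_team, distributed_team

reqs = [
    CustomisationRequirement("R1", "Custom invoice layout", "Acme", False, True),
    CustomisationRequirement("R2", "New pricing engine hook", "Acme", True, True),
]
local, remote = allocate(reqs)
print(local, remote)
```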

    SCU Courses

    Registering for classes is a nightmare that students at Santa Clara University undergo three or more times a year while juggling midterm exams. It's hard to find a schedule that works well for you, balancing the need to take classes that satisfy degree progress with the need to work around obligations outside of class and to avoid getting stuck in an 8am lecture. SCU Courses is a web app where students input their current degree progress and receive a list of possible schedules for next quarter, collapsing the time-consuming process of carefully crafting a schedule into just one step: choose your favorite.
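    As a rough illustration of the core idea behind such a schedule generator, and not the app's actual implementation, the sketch below enumerates combinations of course sections and keeps only those with no meeting-time conflicts; the course names, data model and time slots are invented.

```python
# Illustrative sketch of conflict-free schedule generation: enumerate combinations
# of course sections and keep only those whose meeting slots do not overlap.
# Names and the data model are assumptions; SCU Courses itself may work differently.
from itertools import product

# Each course maps to candidate sections; a section is a set of (day, hour) meeting slots.
courses = {
    "MATH 53": [{("Mon", 9), ("Wed", 9)}, {("Tue", 14), ("Thu", 14)}],
    "CSCI 61": [{("Mon", 9), ("Wed", 9)}, {("Mon", 11), ("Wed", 11)}],
    "ENGL 2":  [{("Fri", 10)}],
}

def conflict_free_schedules(courses):
    names = list(courses)
    for combo in product(*(courses[n] for n in names)):
        slots = [slot for section in combo for slot in section]
        if len(slots) == len(set(slots)):          # no two sections share a time slot
            yield dict(zip(names, combo))

for schedule in conflict_free_schedules(courses):
    print(schedule)
```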

    Estimating, planning and managing Agile Web development projects under a value-based perspective

    Context: The processes of estimating, planning and managing are crucial for software development projects, since their results must be related to several business strategies. The broad expansion of the Internet and the global, interconnected economy mean that Web development projects are often characterized by expressions like delivering as soon as possible, reducing time to market and adapting to undefined requirements. In this kind of environment, traditional methodologies based on predictive techniques sometimes do not offer very satisfactory results. The rise of Agile methodologies and practices has provided some useful tools that, combined with Web Engineering techniques, can help to establish a framework to estimate, manage and plan Web development projects. Objective: This paper presents a proposal for estimating, planning and managing Web projects by combining existing Agile techniques with Web Engineering principles, presenting them as a unified framework that uses business value to guide the delivery of features. Method: The proposal is analyzed by means of a case study, including a real-life project, in order to obtain relevant conclusions. Results: The results achieved after using the framework in a development project are presented, including results on project planning and estimation, as well as on team productivity throughout the project. Conclusion: It is concluded that the framework can be useful for managing Web-based projects through a continuous value-based estimation and management process. Ministerio de Economía y Competitividad TIN2013-46928-C3-3-
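    The abstract does not specify how business value drives planning. The sketch below is one minimal, assumed interpretation of value-based ordering: rank features by estimated value per unit of effort and fill an iteration up to a fixed capacity. The scoring rule and the numbers are illustrative, not the paper's framework.

```python
# Minimal sketch of value-based feature ordering: rank candidate features by the
# ratio of estimated business value to estimated effort, then greedily plan one
# iteration against a fixed capacity. All values are invented for illustration.
features = [
    {"name": "checkout",       "value": 80, "effort": 13},
    {"name": "product search", "value": 60, "effort": 8},
    {"name": "wish list",      "value": 20, "effort": 5},
]

def plan_iteration(features, capacity):
    """Greedily fill one iteration with the highest value-per-effort features."""
    ranked = sorted(features, key=lambda f: f["value"] / f["effort"], reverse=True)
    selected, used = [], 0
    for f in ranked:
        if used + f["effort"] <= capacity:
            selected.append(f["name"])
            used += f["effort"]
    return selected, used

print(plan_iteration(features, capacity=20))
```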

    Software Engineers' Information Seeking Behavior in Change Impact Analysis - An Interview Study

    Software engineers working in large projects must navigate complex information landscapes. Change Impact Analysis (CIA) is a task that relies on engineers' successful information seeking in databases storing, e.g., source code, requirements, design descriptions, and test case specifications. Several previous approaches to supporting information seeking are task-specific, thus understanding engineers' seeking behavior in specific tasks is fundamental. We present an industrial case study on how engineers seek information during CIA, with a particular focus on traceability and on development artifacts other than source code. We show that engineers differ in their information seeking behavior, and that some do not consider traceability particularly useful when conducting CIA. Furthermore, we observe a tendency for engineers to prefer less rigid types of support over formal approaches, i.e., engineers value support that allows flexibility in how to practically conduct CIA. Finally, due to this diverse information seeking behavior, we argue that future CIA support should embrace individual preferences in identifying change impact by enabling several seeking alternatives, including searching, browsing, and tracing. Comment: Accepted for publication in the proceedings of the 25th International Conference on Program Comprehension
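    As a hedged illustration of the "tracing" seeking alternative mentioned above, and not a method proposed in the study, the snippet below follows explicit traceability links outward from a changed artifact to estimate an impact set; the artifact identifiers, link types and graph structure are invented.

```python
# Sketch of tracing-based impact estimation: breadth-first traversal over
# traceability links from a changed artifact. The artifacts and links are
# invented; real projects store such links in requirements/test databases.
from collections import deque

# artifact -> artifacts it is linked to (requirements, design, code, test cases)
trace_links = {
    "REQ-12": ["DES-4"],
    "DES-4": ["SRC-module_a", "SRC-module_b"],
    "SRC-module_a": ["TC-101", "TC-102"],
    "SRC-module_b": ["TC-103"],
}

def impact_set(changed_artifact, links):
    """Return all artifacts reachable from the changed artifact via trace links."""
    impacted, queue = set(), deque([changed_artifact])
    while queue:
        current = queue.popleft()
        for neighbour in links.get(current, []):
            if neighbour not in impacted:
                impacted.add(neighbour)
                queue.append(neighbour)
    return impacted

print(impact_set("REQ-12", trace_links))
```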

    Applying the proto-theory of design to explain and modify the parameter analysis method of conceptual design

    This article reports on the outcomes of applying the notions provided by the reconstructed proto-theory of design, based on Aristotle's remarks, to the parameter analysis (PA) method of conceptual design. Two research questions are addressed: (1) What further clarification and explanation of the PA approach does the proto-theory provide? (2) What conclusions about the usefulness, validity and range of the proto-theory can be drawn from studying an empirically derived design approach through it? An overview of PA and an application example illustrate its present model and unique characteristics. Then, seven features of the proto-theory are explained and demonstrated through geometrical problem solving, and analogies are drawn between these features and corresponding ideas in modern design thinking. Historical and current uses of the terms analysis and synthesis in design are also outlined and contrasted, showing that caution should be exercised when applying them. Consequences regarding the design moves, process and strategy of PA allow modifications to its model to be proposed, while demonstrating how the ancient method of analysis can contribute to a better understanding of contemporary design-theoretic issues.

    New Methods, Current Trends and Software Infrastructure for NLP

    The increasing use of 'new methods' in NLP, which the NeMLaP conference series exemplifies, occurs in the context of a wider shift in the nature and concerns of the discipline. This paper begins with a short review of this context and of significant trends in the field. The review motivates and leads to a set of requirements for support software of general utility for NLP research and development workers. A freely-available system designed to meet these requirements, called GATE (a General Architecture for Text Engineering), is described. Information Extraction (IE), in the sense defined by the Message Understanding Conferences (ARPA [Arp95]), is an NLP application in which many of the new methods have found a home (Hobbs [Hob93]; Jacobs, ed. [Jac92]). An IE system based on GATE is also available for research purposes, and this is described. Lastly, we review related work. Comment: 12 pages, LaTeX, uses nemlap.sty (included)
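    As a generic illustration of the pipeline style of NLP support software the abstract alludes to, and explicitly not GATE's actual API, the sketch below chains independent components over a shared document object that accumulates annotations; the class, component names and annotation format are all assumptions.

```python
# Generic illustration of a pipeline-style architecture for NLP support software:
# independent components chained over a shared document that accumulates annotations.
# This is NOT GATE's actual API; all names and structures here are assumptions.
ORG_NAMES = {"ARPA", "GATE"}       # toy gazetteer of organisation names

class Document:
    def __init__(self, text):
        self.text = text
        self.annotations = []      # (start, end, label) spans added by components

def tokenizer(doc):
    # annotate whitespace-separated tokens with their character offsets
    pos = 0
    for token in doc.text.split():
        start = doc.text.index(token, pos)
        doc.annotations.append((start, start + len(token), "Token"))
        pos = start + len(token)
    return doc

def gazetteer(doc):
    # crude lookup-based entity spotting, standing in for a real IE component
    for start, end, label in list(doc.annotations):
        if label == "Token" and doc.text[start:end] in ORG_NAMES:
            doc.annotations.append((start, end, "Organization"))
    return doc

def run_pipeline(text, components):
    doc = Document(text)
    for component in components:
        doc = component(doc)
    return doc

doc = run_pipeline("GATE supports IE tasks defined by ARPA", [tokenizer, gazetteer])
print(doc.annotations)
```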

    A comparison of processing techniques for producing prototype injection moulding inserts

    This project investigates processing techniques for producing low-cost moulding inserts used in the particulate injection moulding (PIM) process. Prototype moulds were made by both additive and subtractive processes, as well as by a combination of the two. The general motivation was to reduce the entry cost for users considering PIM. PIM cavity inserts were first made by conventional machining from a polymer block using the Pocket NC desktop mill. PIM cavity inserts were also made by fused filament deposition modelling using the Tiertime UP Plus 3D printer. The injection moulding trials revealed surface finish and part removal defects. The feedstock was a titanium metal blend, which is brittle in comparison to commodity polymers; this, in combination with the mesoscale features, small cross-sections and complex geometries, was considered the main source of the problems. For both processing methods, fixes were identified and applied to test this explanation. These consisted of a blended approach in which the additive and subtractive processes were combined. The parts produced by the three processing methods are examined, and their respective merits and issues are discussed.
