8 research outputs found

    A Method for Analyzing the Dynamics of Change in the Mission Requirements of Multifunctional Software and Hardware Systems

    Get PDF
    The paper presents a model of the system of mission requirements as a modifiable hierarchy of functional components and introduces an indicator of the dynamics of change in the mission requirements of multifunctional systems.

    REQUIREMENTS ENGINEERING (RE) EFFECTIVENESS IN OPEN SOURCE SOFTWARE: THE ROLE OF SOCIAL NETWORK CONFIGURATIONS AND REQUIREMENTS PROPERTIES

    Get PDF
    As open source software (OSS) development projects have become popular, requirements engineering (RE) practices in OSS development have come under scrutiny for their marked differences. The current study views OSS RE as knowledge propagation expressed in coordinated sequences of action, interrupted and shaped by the demands of an ever-changing environment and resulting in multiple social network configurations. The attributes of the environment are manifested in the changing nature of the requirements that the projects are subjected to, captured by the 6-V requirements model, which involves six properties of requirements knowledge such as volume and volatility. The diverse network configurations in OSS projects, manifested in communication centrality responding to those demands, will in turn have varying effects on OSS project effectiveness, measured by the rate of task completion. We hypothesize that the volume of requirements and their velocity of change negatively moderate the positive effect of communication centrality on the project's task completion, whereas the variance (diversity) of requirements knowledge positively moderates that effect. The hypotheses are tested using a sample of GitHub OSS projects, and we find support for most of them. Several implications for OSS governance are drawn.
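The moderation test described above can be sketched as an ordinary-least-squares regression with an interaction term. The data and variable names below are synthetic illustrations under assumed effect sizes, not the study's GitHub sample or its actual model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200  # synthetic "projects"

# Standardized predictors: communication centrality and requirements volume
centrality = rng.normal(0, 1, n)
volume = rng.normal(0, 1, n)

# Simulate a positive main effect of centrality that volume weakens
completion = 0.6 * centrality - 0.3 * centrality * volume + rng.normal(0, 0.5, n)

# OLS with an interaction term: intercept, centrality, volume, centrality*volume
X = np.column_stack([np.ones(n), centrality, volume, centrality * volume])
beta, *_ = np.linalg.lstsq(X, completion, rcond=None)

print("centrality main effect:     ", round(beta[1], 2))
print("centrality x volume effect: ", round(beta[3], 2))
```

A positive main-effect coefficient together with a negative interaction coefficient is the pattern the hypothesis predicts: centrality helps task completion, but less so as requirements volume grows.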

    A Case Study with Use Case 2.0

    Get PDF
    Undergraduate thesis (TCC) - Universidade Federal de Santa Catarina. Centro Tecnológico. Ciências da Computação. Agile methodology is gaining ever more ground in software development. A recent survey by VersionOne revealed massive growth in the adoption of agile methodologies. However, a report by the Standish Group revealed that only 29% of projects in 2015 were delivered successfully. Among the success factors presented in the report, the main ones relate to failures in requirements gathering and a lack of customer participation. Despite the importance of requirements elicitation for software development success, agile methodologies regard these activities as bureaucratic, making the project less agile. To address this problem, the Use Case 2.0 approach emerged in 2011 as an evolution of the use case technique, offered as an agile and scalable practice for capturing requirements and supporting development management. Against this background, this work aims to deploy and evaluate the Use Case 2.0 approach in an organizational unit. The research is conducted through a case study applied in the Bridge laboratory of the Universidade Federal de Santa Catarina. To that end, the fundamental concepts related to the topic are first reviewed through a literature review. Building on this, the requirements definition method used by an organizational unit of the laboratory's SISMOB project is evaluated. The Use Case 2.0 techniques are then applied in a SISMOB project team, and the results obtained are assessed.

    Applying model-based systems engineering in search of quality by design

    Get PDF
    2022 Spring. Includes bibliographical references. Model-Based Systems Engineering (MBSE) and Model-Based Engineering (MBE) techniques have been successfully introduced into the design process of many different types of systems. The application of these techniques can be seen in the modeling of requirements, functions, behavior, and many other aspects. The modeled design provides a digital representation of a system, together with the supporting development data architecture and the functional requirements associated with that architecture. Various levels of system and data architecture fidelity can be represented within MBSE environment tools. Typically, the level of fidelity is driven by crucial systems engineering constraints such as cost, schedule, performance, and quality. Systems engineering uses many methods to develop system and data architectures that provide a representative system meeting cost and schedule targets with sufficient quality while maintaining the customer's performance needs. The most complex and elusive constraint on systems engineering is defining system requirements with a focus on quality: given a certain set of system-level requirements, the likelihood that those requirements will be correctly and accurately realized in the final system design. This research specifically investigates the Department of Defense Architecture Framework (DoDAF) in use today, to establish and then assess the relationship between the system, data architecture, and requirements in terms of Quality by Design (QbD). The term QbD was coined in 1992 in Quality by Design: The New Steps for Planning Quality into Goods and Services [1]. 
This research investigates and proposes a means to: contextualize high-level quality terms within the MBSE functional area; provide an outline for a conceptual but functional quality framework as it pertains to the MBSE DoDAF; provide tailored quality metrics with improved definitions; and test this improved quality framework by assessing two corresponding case studies within the MBSE functional area to interrogate model architectures and assess the quality of system designs. Developed in the early 2000s, the DoDAF is still in use today, and its system description methodologies continue to influence subsequent system description approaches [2]. Two case studies were analyzed to show how the proposed QbD evaluation can assess DoDAF CONOP architecture quality. The first case study addresses the DoDAF CONOP of the National Aeronautics and Space Administration (NASA) Joint Polar Satellite System (JPSS) ground system for the National Oceanic and Atmospheric Administration (NOAA) satellite system, with particular focus on the Stored Mission Data (SMD) mission thread. The second case study addresses the DoDAF CONOP of a Search and Rescue (SAR) naval rescue operation network System of Systems (SoS), with particular focus on the Command and Control signaling mission thread. The case studies help to demonstrate a new DoDAF Quality Conceptual Framework (DQCF) as a means to investigate the quality of a DoDAF architecture in depth, including the application of the DoDAF standard and the UML/SysML standards, requirement architecture instantiation, and modularity as a lens on architecture reusability and complexity. By providing a renewed focus on a quality-based systems engineering process when applying the DoDAF, improved trust in the system and data architecture of the completed models can be achieved. 
The results of the case study analyses reveal how a quality-focused systems engineering process can be used during development to provide a product design that better meets the customer's intent and ultimately offers the potential for the best quality product.

    A study of IT project management methodology with agile development

    Get PDF

    Using requirements and design information to predict volatility in software development

    Get PDF
    We hypothesise that data about the requirements and design stages of a software development project can be used to predict the subsequent number of development changes that software components will experience. This would allow managers to concentrate time-consuming efforts (such as traceability and staff training) on a few at-risk, cost-effective areas, and may also allow predictions to be made at an earlier stage than is possible using traditional metrics such as lines of code. Previous researchers have studied links between change-proneness and metrics such as measures of inheritance, size, and code coupling. We extend these studies by also including measures of requirements and design activity. First, we develop structures to model the requirements and design processes, and then propose some new metrics based on these models. The structures are populated using data from a case study project and analysed alongside existing complexity metrics to ascertain whether change-proneness can be predicted. Finally, we examine whether combining these metrics with existing metrics improves our ability to predict change-proneness. First results show that our metrics can be linked to the quantity of change experienced by components in a software development project (potentially allowing predictions to take place earlier than before), but that the best results are obtained by combining our newer metrics with existing complexity metrics such as size. EThOS - Electronic Theses Online Service. BAE Systems; Engineering and Physical Sciences Research Council. United Kingdom.
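The comparison this abstract describes, early-stage requirements/design metrics alone versus those metrics combined with a traditional size metric, can be sketched with an ordinary-least-squares fit. All metric names and values below are hypothetical synthetic data, not the thesis's case study measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40  # synthetic components

# Early-stage metrics (hypothetical): requirements activity, design coupling
req_activity = rng.poisson(5, n).astype(float)
design_links = rng.poisson(8, n).astype(float)
# Traditional late-stage metric: lines of code
loc = rng.integers(100, 2000, n).astype(float)

# Synthetic change counts influenced by all three metrics
changes = 0.8 * req_activity + 0.4 * design_links + 0.005 * loc + rng.normal(0, 1, n)

def r_squared(X, y):
    """In-sample R^2 of an ordinary least squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

early = np.column_stack([req_activity, design_links])
combined = np.column_stack([req_activity, design_links, loc])
print(f"early-stage metrics only: R^2 = {r_squared(early, changes):.2f}")
print(f"combined with LOC:        R^2 = {r_squared(combined, changes):.2f}")
```

The early-stage model is available before code exists; the combined model mirrors the abstract's finding that adding existing complexity metrics such as size improves the fit.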

    An industrial case study on requirements volatility measures

    No full text
    Requirements volatility is an important risk factor for software projects. Software measures can help in quantifying and predicting this risk. In this paper, we present an industrial case study that investigated measures of volatility for a medium-sized software project. The goal of the study was twofold: 1) to empirically validate a set of measures associated with the volatility of use case models (UCMs); 2) to investigate the correlation between subjective and objective volatility. Measurement data was collected in retrospect for all use case models of the software project. In addition, we determined subjective volatility by interviewing stakeholders of the project. Our data analysis showed a high correlation between our measures of UCM size and the total number of changes, indicating that measures of UCM size are good indicators of requirements volatility. No correlation was found between subjective and objective volatility. These results suggest that project managers at this company should measure their projects, because of the risk of making wrong decisions based on their own and the developers' perceptions.
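The size-versus-change correlation this study validates can be illustrated with a Pearson correlation on toy numbers; the measurements below are invented for the sketch, not taken from the project:

```python
import numpy as np

# Toy measurements for six use case models (hypothetical, not the study's data)
ucm_size = np.array([12, 30, 7, 45, 22, 16], dtype=float)  # e.g. number of steps
changes = np.array([3, 9, 2, 14, 6, 5], dtype=float)       # total changes logged

r = np.corrcoef(ucm_size, changes)[0, 1]
print(f"Pearson r between UCM size and total changes: {r:.3f}")
```

An r close to 1 is the pattern the study reports for size measures and total changes; the same computation between subjective ratings and measured changes would, per the abstract, show no such correlation.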