
    Management of quality requirements in agile and rapid software development: A systematic mapping study

    Context: Quality requirements (QRs) describe the desired quality of software, and they play an important role in the success of software projects. In agile software development (ASD), QRs are often ill-defined and not well addressed because the focus is on quickly delivering functionality. Rapid software development (RSD) approaches (e.g., continuous delivery and continuous deployment), which shorten delivery times further, are even more prone to neglecting QRs. Despite the significance of QRs in both ASD and RSD, there is limited synthesized knowledge on their management in these approaches. Objective: This study aims to synthesize state-of-the-art knowledge about QR management in ASD and RSD, focusing on three aspects: bibliometrics, strategies, and challenges. Research method: Using a systematic mapping study with a snowballing search strategy, we identified and structured the literature on QR management in ASD and RSD. Results: We found 156 primary studies: 106 empirical studies, 16 experience reports, and 34 theoretical studies. Security and performance were the most commonly reported QR types. We identified various QR management strategies: 74 practices, 43 methods, 13 models, 12 frameworks, 11 pieces of advice, 10 tools, and 7 guidelines. Additionally, we identified 18 categories of challenges and 4 non-recurring challenges in managing QRs. The limited ability of ASD to handle QRs, time constraints due to short iteration cycles, limitations in testing QRs, and neglect of QRs were the top categories of challenges. Conclusion: Management of QRs is significant in ASD and is becoming important in RSD. This study identified research gaps, such as the need for more tools and guidelines, lightweight QR management strategies that fit short iteration cycles, investigation of the link between QR challenges and technical debt, and extension of the empirical validation of existing strategies to a wider context. It also synthesizes QR management strategies and challenges, which may be useful for practitioners.

    FIN-DM: A Data Mining Process Model for Financial Services

    Data mining is a set of rules, processes, and algorithms that allow companies to increase revenues, reduce costs, optimize products and customer relationships, and achieve other business goals by extracting actionable insights from the data they collect on a day-to-day basis. Data mining and analytics projects require a well-defined methodology and processes. Several standard process models for conducting data mining and analytics projects are available. Among them, the most notable and widely adopted standard model is CRISP-DM. It is industry-agnostic and is often adapted to meet sector-specific requirements. Industry-specific adaptations of CRISP-DM have been proposed across several domains, including healthcare, education, industrial and software engineering, and logistics. However, until now, there has been no adaptation of CRISP-DM for the financial services industry, which has its own set of domain-specific requirements. This PhD thesis addresses this gap by designing, developing, and evaluating a sector-specific data mining process for financial services (FIN-DM). The thesis investigates how standard data mining processes are used across various industry sectors and in financial services. The examination identified a number of adaptation scenarios of traditional frameworks. It also suggested that these approaches do not pay sufficient attention to turning data mining models into software products integrated into organizations' IT architectures and business processes. In the financial services domain, the main adaptation scenarios discovered concerned technology-centric aspects (scalability), business-centric aspects (actionability), and human-centric aspects (mitigating discriminatory effects) of data mining. Next, a case study in an actual financial services organization revealed 18 perceived gaps in the CRISP-DM process. Using the data and results from these studies, the thesis outlines an adaptation of CRISP-DM for the financial sector, named the Financial Industry Process for Data Mining (FIN-DM). FIN-DM extends CRISP-DM to support privacy-compliant data mining, to tackle AI ethics risks, to fulfill risk management requirements, and to embed quality assurance as part of the data mining life cycle. https://www.ester.ee/record=b547227
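    As an illustration of how a sector-specific adaptation can wrap the standard lifecycle, the sketch below models CRISP-DM's six phases and attaches the four FIN-DM concerns named above as cross-cutting checks on every phase. It is a minimal reading of the abstract, not the thesis' actual specification; everything beyond CRISP-DM's own phase names is an assumption.

        # Illustrative sketch only: CRISP-DM's six standard phases, with the
        # four FIN-DM extension concerns named in the abstract treated as
        # cross-cutting checks. The FIN-DM details here are assumptions.
        from dataclasses import dataclass, field

        CRISP_DM_PHASES = [
            "Business Understanding",
            "Data Understanding",
            "Data Preparation",
            "Modeling",
            "Evaluation",
            "Deployment",
        ]

        # Cross-cutting concerns FIN-DM adds, per the abstract.
        FIN_DM_CONCERNS = [
            "privacy-compliant data mining",
            "AI ethics risk management",
            "risk management requirements",
            "quality assurance",
        ]

        @dataclass
        class FinDmLifecycle:
            """Walks the CRISP-DM phases, recording the checks applied."""
            completed: list = field(default_factory=list)

            def run_phase(self, phase: str) -> None:
                # In a real process each concern would trigger concrete gate
                # checks; here we only record that the phase was reviewed.
                for concern in FIN_DM_CONCERNS:
                    print(f"[{phase}] checked against: {concern}")
                self.completed.append(phase)

        if __name__ == "__main__":
            lifecycle = FinDmLifecycle()
            for phase in CRISP_DM_PHASES:
                lifecycle.run_phase(phase)

    Running the sketch simply walks the six phases in order and logs which of the four concerns each phase was reviewed against.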

    INTEROCC: Occupational exposure assessment for electromagnetic fields and risk of brain tumours. Development of a new source-based approach

    Introduction: To improve exposure assessment methods for electromagnetic fields (EMF), we developed a methodology based on sources of exposure rather than job titles. Methods: Measurements collected from the literature were assessed and summarized into a source-exposure matrix (SEM). The SEM and personal determinants of exposure were combined to obtain individual indices of cumulative exposure, which were used to assess the risk of brain tumours, glioma and meningioma. Results: Over 3,000 records were obtained and judged useful, yielding a SEM with exposure estimates for 312 EMF sources. Overall, the analysis showed no association between glioma or meningioma risk and cumulative exposure to radiofrequency (RF) or intermediate frequency (IF) EMF. However, some positive associations were identified in the highest exposed groups: for RF and IF EMF in the 1- to 4-year exposure window for glioma, and for RF only in all exposure windows for meningioma. These results might reflect a possible role of high frequency EMF in the later stages of carcinogenesis (promotion and progression).
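    The abstract does not spell out how the individual cumulative exposure indices are computed; a standard formulation in occupational exposure assessment, offered here only as an illustrative assumption, sums each source's SEM exposure estimate weighted by the subject's duration of exposure to it:

        CE_i = \sum_{s \in S_i} E_s \times t_{i,s}

    where CE_i is the cumulative exposure index for subject i, S_i is the set of EMF sources in the subject's history, E_s is the SEM exposure estimate for source s, and t_{i,s} is the time the subject spent exposed to that source.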

    An investigation of the acquisition and sharing of tacit knowledge in software development teams

    Knowledge in general, and tacit knowledge in particular, has been hailed as an important factor in the successful performance of knowledge-worker teams. Despite claims of the importance of tacit knowledge, few researchers have studied the concept empirically, due in part to the confusion surrounding its conceptualisation. The present study examined the acquisition and sharing of tacit knowledge, and the consequent effect on team performance, through social interaction and the development of a transactive memory system (TMS). TMSs are important for the acquisition and sharing of tacit knowledge, since they enact the 'collective minds' of teams and are also a factor in successful team performance. To conduct this research, a team-level operational definition of tacit knowledge was proposed, and a direct measure of tacit knowledge for software development teams, the Team Tacit Knowledge Measure (TTKM), was developed and validated. To investigate the main premise of this research, an empirical survey study was conducted involving 48 software development teams (n = 181 individuals) from Ireland and the UK. Software developers were chosen as the example of knowledge-worker teams because they work with intangible cognitive processes. It was concluded that tacit knowledge was acquired and shared directly through good quality social interactions and through the development of a TMS. Quality of social interaction was found to be a more important route through which teams can learn and share tacit knowledge than transactive memory. However, transactive memory was not a mediator between social interaction and team tacit knowledge, indicating that each made a separate contribution. Team tacit knowledge was found to predict team performance above and beyond transactive memory, though both were significant predictors. Based on these findings, recommendations were made for the management of software development teams and for future research directions.

    A systematic approach for integrated product, materials, and design-process design

    Designers are challenged to manage customer, technology, and socio-economic uncertainty, which creates dynamic, unquenchable demands on limited resources. In this context, increased concept flexibility, referring to a designer's ability to generate concepts, is crucial. Concept flexibility can be significantly increased through the integrated design of product and material concepts. Hence, the challenge is to leverage knowledge of material structure-property relations that significantly affect system concepts for the function-based, systematic design of product and material concepts in an integrated fashion. However, once an integrated product and material system concept has been selected, managing complexity in embodiment design-processes becomes important. Facing a complex network of decisions and evolving analysis models, a designer needs the flexibility to systematically generate and evaluate embodiment design-process alternatives. To address these challenges and respond to the primary research question of how to increase a designer's concept and design-process flexibility to enhance product creation in the conceptual and early embodiment design phases, the primary hypothesis of this dissertation is embodied as a systematic approach for integrated product, materials, and design-process design. The systematic approach consists of two components: i) a function-based, systematic approach to the integrated design of product and material concepts from a systems perspective, and ii) a systematic strategy for design-process generation and selection based on a decision-centric perspective and a value-of-information-based Process Performance Indicator. The systematic approach is validated using the validation-square approach, which consists of theoretical and empirical validation. Empirical validation of the framework is carried out using various examples, including: i) design of a reactive material containment system, and ii) design of an optoelectronic communication system.

    Modelling spatio-temporal human behaviour with mobile phone data : a data analytical approach


    An integration framework for managing rich organisational process knowledge

    The problem we have addressed in this dissertation is that of designing a pragmatic framework for integrating the synthesis and management of organisational process knowledge, based on domain-independent AI planning and plan representations. Our solution has focused on a set of framework components which provide methods, tools and representations to accomplish this task. In the framework we address a lifecycle of this knowledge which begins with a methodological approach to acquiring information about the process domain. We show that this initial domain specification can be translated into a common constraint-based model of activity (based on the work of Tate, 1996c and 1996d) which can then be operationalised for use in an AI planner. This model of activity is ontologically underpinned and may be expressed in a flexible and extensible language based on a sorted first-order logic. The model combines perspectives covering both the space of behaviour and the space of decisions. Synthesised or modified processes/plans can be translated to and from the common representation in order to support knowledge sharing, visualisation and mixed-initiative interaction. This work united past and present Edinburgh research on planning and infused it with perspectives from design rationale, requirements engineering, and process knowledge sharing. The implementation has been applied to a portfolio of scenarios which include process examples from business, manufacturing, construction and military operations. An archive of this work is available at: http://www.aiai.ed.ac.uk/~oplan/cpf
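    As a rough illustration of what a constraint-based model of activity can look like in code, the sketch below holds a plan as sets of activity nodes, ordering constraints, and open issues, loosely in the spirit of Tate's model; the field names and the consistency check are illustrative assumptions, not the framework's actual representation.

        # Minimal sketch of a plan held as a set of constraints, loosely in
        # the spirit of Tate's constraint-based model of activity. Names and
        # structure are illustrative assumptions, not the framework's API.
        import graphlib
        from dataclasses import dataclass, field

        @dataclass(frozen=True)
        class Node:
            """An activity in the plan."""
            name: str

        @dataclass
        class Plan:
            nodes: set = field(default_factory=set)
            orderings: set = field(default_factory=set)  # (before, after)
            issues: list = field(default_factory=list)   # open decisions

            def add_ordering(self, before: Node, after: Node) -> None:
                self.orderings.add((before.name, after.name))

            def is_consistent(self) -> bool:
                # Ordering-consistent iff the precedence relation is acyclic.
                ts = graphlib.TopologicalSorter()
                for before, after in self.orderings:
                    ts.add(after, before)
                try:
                    ts.prepare()
                    return True
                except graphlib.CycleError:
                    return False

        pour = Node("pour-foundation")
        frame = Node("erect-frame")
        plan = Plan(nodes={pour, frame})
        plan.add_ordering(pour, frame)
        assert plan.is_consistent()

    Keeping orderings as explicit constraints, rather than baking them into a fixed sequence, is what lets such a representation support translation to and from other plan forms and mixed-initiative editing.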

    Embracing Analytics in the Drinking Water Industry

    Analytics can support numerous aspects of water industry planning, management, and operations. Given this wide range of touchpoints and applications, it is becoming increasingly imperative that championship of, and capability in, broad-based analytics be developed and practically integrated to address the current and transitional challenges facing the drinking water industry. Analytics will contribute substantially to future efforts to provide innovative solutions that make the water industry more sustainable and resilient. The purpose of this book is to introduce analytics to practicing water engineers so they can deploy the covered subjects, approaches, and detailed techniques in their daily operations, management, and decision-making processes. Undergraduate students, as well as early graduate students in water concentrations, will also be exposed to established analytical techniques, along with many methods that are currently considered new or emerging/maturing. This book covers a broad spectrum of water industry analytics topics in an easy-to-follow manner. The overall background and contexts are motivated by (and directly drawn from) actual water utility projects that the authors have worked on in recent years. The authors strongly believe that the water industry should embrace and integrate data-driven fundamentals and methods into its daily operations and decision-making processes to replace established "rule-of-thumb" and weak heuristic approaches, and that an analytics viewpoint, approach, and culture is key to this industry transformation.