
    Evaluation of Traceability Link Construction Approaches in Software Evolution (Evaluasi Pendekatan Pembangunan Traceability Link dalam Evolusi Perangkat Lunak)

    Traceability is important in software projects, especially large-scale ones. It serves to trace relationships between artifacts produced in different phases (requirements analysis, design analysis, and implementation analysis), as well as between artifacts and the developers involved. An automated traceability system is needed to establish such links between artifacts. This study surveys recent literature on approaches for building traceability links. The survey is organized around a software-evolution-based taxonomy covering change-characterization mechanisms and the factors that influence them. The results can be used to identify how these approaches support software evolution and to outline the criteria needed to build better traceability methods. The study concludes that the factors vary little from one approach to another, except where the approaches differ in the temporal factor.

    WFIRST Coronagraph Technology Requirements: Status Update and Systems Engineering Approach

    The coronagraphic instrument (CGI) on the Wide-Field Infrared Survey Telescope (WFIRST) will demonstrate technologies and methods for high-contrast direct imaging and spectroscopy of exoplanet systems in reflected light, including polarimetry of circumstellar disks. The WFIRST management and CGI engineering and science investigation teams have developed requirements for the instrument, motivated by the objectives and technology development needs of potential future flagship exoplanet characterization missions such as the NASA Habitable Exoplanet Imaging Mission (HabEx) and the Large UV/Optical/IR Surveyor (LUVOIR). The requirements have been refined to support recommendations from the WFIRST Independent External Technical/Management/Cost Review (WIETR) that the WFIRST CGI be classified as a technology demonstration instrument instead of a science instrument. This paper provides a description of how the CGI requirements flow from the top of the overall WFIRST mission structure through the Level 2 requirements, with the focus here on capturing the detailed context and rationales for the CGI Level 2 requirements. The WFIRST requirements flow starts with the top Program Level Requirements Appendix (PLRA), which contains both high-level mission objectives and the CGI-specific baseline technical and data requirements (BTR and BDR, respectively)... We also present the process and collaborative tools used in the L2 requirements development and management, including the collection and organization of science inputs, an open-source approach to managing the requirements database, and automating documentation. The tools created for the CGI L2 requirements have the potential to improve the design and planning of other projects, streamlining requirement management and maintenance. [Abstract Abbreviated] Comment: 16 pages, 4 figures
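
    As a purely hypothetical illustration of the open-source requirements-database and documentation-automation idea mentioned above (not the actual CGI toolchain or schema), Level 2 requirements kept in a version-controlled CSV file could be rendered into readable documentation by a short script; the file layout and example entries below are invented.

```python
# Hypothetical sketch: a requirements "database" as version-controlled CSV, rendered to Markdown.
# The columns and example rows are assumptions, not the actual CGI requirements content.
import csv, io

requirements_csv = """id,level,text,rationale
CGI-L2-001,L2,Demonstrate raw contrast better than a stated threshold at small working angles,Flows from the PLRA baseline technical requirement
CGI-L2-002,L2,Acquire and maintain wavefront control on a bright reference star,Needed for the high-contrast imaging demonstration
"""

def render_markdown(csv_text: str) -> str:
    # Parse the CSV and emit one documentation section per requirement.
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    lines = ["# Level 2 Requirements", ""]
    for r in rows:
        lines += [f"## {r['id']} ({r['level']})", "", r["text"], "",
                  f"*Rationale:* {r['rationale']}", ""]
    return "\n".join(lines)

print(render_markdown(requirements_csv))
```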

    Leveraging Intermediate Artifacts to Improve Automated Trace Link Retrieval

    Software traceability establishes a network of connections between diverse artifacts such as requirements, design, and code. However, given the cost and effort of creating and maintaining trace links manually, researchers have proposed automated approaches using information retrieval techniques. Current approaches focus almost entirely upon generating links between pairs of artifacts and have not leveraged the broader network of interconnected artifacts. In this paper we investigate the use of intermediate artifacts to enhance the accuracy of the generated trace links – focusing on paths consisting of source, target, and intermediate artifacts. We propose and evaluate combinations of techniques for computing semantic similarity, scaling scores across multiple paths, and aggregating results from multiple paths. We report results from five projects, including one large industrial project. We find that leveraging intermediate artifacts improves the accuracy of end-to-end trace retrieval across all datasets and accuracy metrics. After further analysis, we discover that leveraging intermediate artifacts is only helpful when a project’s artifacts share a common vocabulary, which tends to occur in refinement and decomposition hierarchies of artifacts. Given our hybrid approach that integrates both direct and transitive links, we observed little to no loss of accuracy when intermediate artifacts lacked a shared vocabulary with source or target artifacts.
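
    To make the path-based idea concrete, the following minimal sketch (not the authors' implementation) scores source-intermediate-target paths with TF-IDF cosine similarity, combines each path by multiplying its two edge scores, aggregates over intermediates with a max, and blends the result with the direct source-target score; the example artifact texts, the product/max combination, and the 0.5 blending weight are all assumptions.

```python
# Minimal illustration of transitive trace retrieval via intermediate artifacts (illustrative only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

source = "The system shall encrypt patient records before transmission"
intermediates = [
    "Design: encryption module wraps record payloads prior to network transfer",
    "Design: user interface displays patient demographics",
]
target = "class RecordEncryptor applies AES to the record payload before send"

# Build TF-IDF vectors for all artifacts and compute pairwise cosine similarities.
tfidf = TfidfVectorizer().fit_transform([source, target] + intermediates)
sims = cosine_similarity(tfidf)

direct = sims[0, 1]                                # source -> target
path_scores = [
    sims[0, 2 + i] * sims[2 + i, 1]                # source -> intermediate -> target
    for i in range(len(intermediates))
]
transitive = max(path_scores)                      # aggregate over intermediates
hybrid = 0.5 * direct + 0.5 * transitive           # blend direct and transitive evidence

print(f"direct={direct:.3f} transitive={transitive:.3f} hybrid={hybrid:.3f}")
```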

    User Review-Based Change File Localization for Mobile Applications

    In current mobile app development, novel and emerging DevOps practices (e.g., Continuous Delivery and Integration, and user feedback analysis) and tools are becoming more widespread. For instance, the integration of user feedback (provided in the form of user reviews) in the software release cycle represents a valuable asset for the maintenance and evolution of mobile apps. To fully make use of these assets, it is highly desirable for developers to establish semantic links between the user reviews and the software artefacts to be changed (e.g., source code and documentation), and thus to localize the potential files to change for addressing the user feedback. In this paper, we propose RISING (Review Integration via claSsification, clusterIng, and linkiNG), an automated approach to support the continuous integration of user feedback via classification, clustering, and linking of user reviews. RISING leverages domain-specific constraint information and semi-supervised learning to group user reviews into multiple fine-grained clusters concerning similar users' requests. Then, by combining the textual information from both commit messages and source code, it automatically localizes potential change files to accommodate the users' requests. Our empirical studies demonstrate that the proposed approach outperforms the state-of-the-art baseline work in terms of clustering and localization accuracy, and thus produces more reliable results. Comment: 15 pages, 3 figures, 8 tables
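
    The sketch below is a rough, hypothetical approximation of the cluster-and-link part of such a pipeline (it is not the RISING tool): user reviews are clustered with TF-IDF and k-means, and each cluster is matched against per-file pseudo-documents built from commit messages and source identifiers; the example reviews, file corpus, and choice of k are invented.

```python
# Toy review clustering and change-file localization (illustrative only; not RISING itself).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import cosine_similarity

reviews = [
    "App crash when uploading a photo",
    "Crash every time I attach a photo",
    "Please add a dark mode option",
    "Dark theme would be great at night",
]
# One pseudo-document per candidate file: commit messages plus identifiers from its source.
files = {
    "PhotoUploader.java": "fix photo upload crash bitmap attach image exception",
    "SettingsActivity.java": "add settings screen theme preference night dark mode toggle",
}

# Fit the vocabulary on reviews and file documents so both sides share a feature space.
vectorizer = TfidfVectorizer().fit(reviews + list(files.values()))
review_matrix = vectorizer.transform(reviews)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(review_matrix)

file_names = list(files)
file_matrix = vectorizer.transform(list(files.values()))
for c in sorted(set(labels)):
    # Concatenate the reviews of this cluster and rank candidate files by similarity.
    cluster_text = " ".join(r for r, label in zip(reviews, labels) if label == c)
    scores = cosine_similarity(vectorizer.transform([cluster_text]), file_matrix)[0]
    ranked = sorted(zip(file_names, scores), key=lambda x: -x[1])
    print(f"cluster {c}: ranked files = {ranked}")
```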

    A synthesis of logic and bio-inspired techniques in the design of dependable systems

    Much of the development of model-based design and dependability analysis in the design of dependable systems, including software intensive systems, can be attributed to the application of advances in formal logic and its application to fault forecasting and verification of systems. In parallel, work on bio-inspired technologies has shown potential for the evolutionary design of engineering systems via automated exploration of potentially large design spaces. We have not yet seen the emergence of a design paradigm that effectively combines these two techniques, schematically founded on the two pillars of formal logic and biology, from the early stages of, and throughout, the design lifecycle. Such a design paradigm would apply these techniques synergistically and systematically to enable optimal refinement of new designs which can be driven effectively by dependability requirements. The paper sketches such a model-centric paradigm for the design of dependable systems, presented in the scope of the HiP-HOPS tool and technique, that brings these technologies together to realise their combined potential benefits. The paper begins by identifying current challenges in model-based safety assessment and then overviews the use of meta-heuristics at various stages of the design lifecycle covering topics that span from allocation of dependability requirements, through dependability analysis, to multi-objective optimisation of system architectures and maintenance schedules
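
    As a simplified, scalarised stand-in for the multi-objective architecture optimisation described above (this is not HiP-HOPS), the sketch below uses a small genetic algorithm to choose redundancy levels for three components, trading system unavailability against cost; the failure probabilities, unit costs, and weighting are assumptions.

```python
# Toy genetic algorithm for dependability-driven design-space exploration (illustrative only).
import random

FAIL_PROB = [1e-3, 5e-3, 2e-3]      # assumed per-replica failure probability of each component
UNIT_COST = [4.0, 1.0, 2.0]         # assumed cost of one replica of each component
MAX_REPLICAS = 3

def unavailability(design):
    # Series system of parallel-redundant components: it fails if all replicas of any component fail.
    p_ok = 1.0
    for n, p in zip(design, FAIL_PROB):
        p_ok *= 1.0 - p ** n
    return 1.0 - p_ok

def cost(design):
    return sum(n * c for n, c in zip(design, UNIT_COST))

def fitness(design):
    # Scalarised objective: heavily penalise unavailability, lightly penalise cost.
    return -(1e4 * unavailability(design) + cost(design))

def evolve(pop_size=20, generations=50):
    pop = [[random.randint(1, MAX_REPLICAS) for _ in FAIL_PROB] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [random.choice(pair) for pair in zip(a, b)]   # uniform crossover
            if random.random() < 0.2:                             # occasional mutation
                i = random.randrange(len(child))
                child[i] = random.randint(1, MAX_REPLICAS)
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    return best, cost(best), unavailability(best)

print(evolve())
```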

    A Life Cycle Software Quality Model Using Bayesian Belief Networks

    Software practitioners lack a consistent approach to assessing and predicting quality within their products. This research proposes a software quality model that accounts for the influences of development team skill/experience, process maturity, and problem complexity throughout the software engineering life cycle. The model is structured using Bayesian Belief Networks and, unlike previous efforts, uses widely-accepted software engineering standards and in-use industry techniques to quantify the indicators and measures of software quality. Data from 28 software engineering projects was acquired for this study and used for validation and comparison of the presented software quality models. Three Bayesian model structures are explored, and the structure with the highest performance in terms of accuracy of fit and predictive validity is reported. In addition, the Bayesian Belief Networks are compared to both Least Squares Regression and Neural Networks in order to identify which technique is best suited to modeling software product quality. The results indicate that Bayesian Belief Networks outperform both Least Squares Regression and Neural Networks in terms of producing modeled software quality variables that fit the distribution of actual software quality values, and in accurately forecasting 25 different indicators of software quality. Among the Bayesian model structures, the simplest structure, which relates software quality variables to their correlated causal factors, was found to be the most effective in modeling software quality. In addition, the results reveal that the collective skill and experience of the development team, more than process maturity or problem complexity, has the most significant impact on the quality of software products.
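
    The following minimal sketch illustrates the "simplest structure" described above, in which a quality variable is conditioned directly on its causal factors, using the pgmpy library; the node names, binary state spaces, and all probability values are invented for illustration and are not the study's calibrated model.

```python
# Minimal Bayesian Belief Network sketch (illustrative structure and probabilities only).
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([
    ("TeamSkill", "Quality"),
    ("ProcessMaturity", "Quality"),
    ("ProblemComplexity", "Quality"),
])

# Binary root nodes: state 0 = low, state 1 = high (uniform priors for illustration).
priors = [TabularCPD(v, 2, [[0.5], [0.5]])
          for v in ("TeamSkill", "ProcessMaturity", "ProblemComplexity")]

# P(Quality | TeamSkill, ProcessMaturity, ProblemComplexity): 2 states x 8 parent combinations.
quality = TabularCPD(
    "Quality", 2,
    values=[[0.9, 0.8, 0.7, 0.5, 0.6, 0.5, 0.4, 0.2],   # P(Quality = low  | parents)
            [0.1, 0.2, 0.3, 0.5, 0.4, 0.5, 0.6, 0.8]],  # P(Quality = high | parents)
    evidence=["TeamSkill", "ProcessMaturity", "ProblemComplexity"],
    evidence_card=[2, 2, 2],
)

model.add_cpds(*priors, quality)
assert model.check_model()

# Posterior belief in product quality for a highly skilled team facing a complex problem.
posterior = VariableElimination(model).query(["Quality"],
                                             evidence={"TeamSkill": 1, "ProblemComplexity": 1})
print(posterior)
```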

    Hybrid Deep Learning Algorithm for Insulin Dosage Prediction Using Blockchain and IOT

    This paper addresses the problem of predicting insulin dosage in diabetes patients using the PSO-LSTM, COA-LSTM, and LOA-LSTM algorithms. Accurate insulin dosage prediction is crucial in effectively managing diabetes and maintaining blood glucose levels within the desired range. The study proposes a novel approach that combines particle swarm optimization (PSO) with the long short-term memory (LSTM) model. PSO is used to optimize the LSTM's parameters, enhancing its prediction capabilities specifically for insulin dosage. Additionally, two other techniques, COA-LSTM and LOA-LSTM, are introduced for comparison purposes. The algorithms utilize a dataset comprising relevant features such as past insulin dosages, blood glucose levels, carbohydrate intake, and physical activity. These features are fed into the PSO-LSTM, COA-LSTM, and LOA-LSTM models to predict the appropriate insulin dosage for future time points. The results demonstrate the effectiveness of the proposed PSO-LSTM algorithm in accurately predicting insulin dosage, surpassing the performance of COA-LSTM and LOA-LSTM. The PSO-LSTM model achieves a high level of accuracy, aiding in personalized and precise insulin administration for diabetes patients. By leveraging the power of PSO optimization and LSTM modeling, this research improves the accuracy and reliability of insulin dosage prediction. The findings highlight the potential of the PSO-LSTM algorithm as a valuable tool for healthcare professionals in optimizing diabetes management and enhancing patient outcomes.
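
    A minimal sketch of the PSO-over-LSTM-hyperparameters idea is shown below; the fitness function is a placeholder that, in the real pipeline, would train an LSTM on the dosage dataset and return a validation error, and the search bounds, swarm size, and PSO constants are illustrative assumptions rather than the paper's settings.

```python
# Sketch of PSO searching over two LSTM hyperparameters: (hidden units, learning rate).
import random

BOUNDS = [(8, 128), (1e-4, 1e-1)]   # assumed search ranges for hidden units and learning rate

def fitness(position):
    hidden, lr = position
    # Placeholder objective with a known optimum near hidden=64, lr=0.01;
    # in practice this would train an LSTM and return its validation RMSE.
    return (hidden - 64) ** 2 / 1e3 + (lr - 0.01) ** 2 * 1e3

def pso(n_particles=15, iterations=40, w=0.7, c1=1.5, c2=1.5):
    dims = len(BOUNDS)
    pos = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(n_particles)]
    vel = [[0.0] * dims for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [fitness(p) for p in pos]
    gbest = min(zip(pbest_val, pbest))[1][:]

    for _ in range(iterations):
        for i in range(n_particles):
            for d in range(dims):
                r1, r2 = random.random(), random.random()
                # Standard velocity update: inertia + cognitive pull + social pull.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = BOUNDS[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = fitness(pos[i])
            if val < pbest_val[i]:
                pbest_val[i], pbest[i] = val, pos[i][:]
                if val < fitness(gbest):
                    gbest = pos[i][:]
    return gbest, fitness(gbest)

print(pso())   # converges near (64, 0.01) for the placeholder objective
```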

    Automatic evaluation of measurement data

    Due to recent advances in computer technology and network infrastructure, databases and data-processing programs are becoming increasingly important in metrology. This thesis introduces a new methodology for the automatic evaluation of measurement data from the calibration of electronic instruments. The development of a software-based analysis procedure is described: the requirements for such a procedure are discussed, and the necessary statistical methods are implemented in order to evaluate the measured data against historical data. A hierarchical Bayesian method is chosen because it meets the analysis requirements. A first version of the analysis procedure has been implemented and tested. The intended range of use is electronic instruments, although there is nothing to suggest that the same approach cannot be used for other instruments. The reader of this thesis would benefit from some basic familiarity with both statistical methods and information technology (IT); however, the basic concepts are provided in the text.
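
    A toy two-level (empirical Bayes) normal model conveys the general idea of checking new calibration data against history; the sketch below is only an illustration of that idea, and the numbers, assumed repeatability, tolerance, and decision rule are all invented.

```python
# Toy empirical-Bayes check of new calibration data against historical instruments (illustrative only).
import statistics as st

# Level 2 (history): mean calibration error of previous instruments of the same type (e.g., in volts).
historical_means = [0.012, -0.004, 0.008, 0.001, 0.006]
mu0 = st.mean(historical_means)        # prior mean for an instrument's true error
tau2 = st.variance(historical_means)   # between-instrument variance

# Level 1 (new data): repeated measurements on the instrument under calibration,
# with an assumed known repeatability standard deviation.
new_measurements = [0.021, 0.018, 0.024, 0.020]
sigma = 0.005
n = len(new_measurements)
ybar = st.mean(new_measurements)

# Conjugate normal-normal posterior for this instrument's true error.
post_prec = 1.0 / tau2 + n / sigma**2
post_mean = (mu0 / tau2 + n * ybar / sigma**2) / post_prec
post_sd = post_prec ** -0.5

lo, hi = post_mean - 1.96 * post_sd, post_mean + 1.96 * post_sd
tolerance = 0.015                      # invented acceptance limit
verdict = "within tolerance" if -tolerance < lo and hi < tolerance else "flag for manual review"
print(f"posterior error = {post_mean:.4f} +/- {1.96 * post_sd:.4f} -> {verdict}")
```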