
    A Life Cycle Software Quality Model Using Bayesian Belief Networks

    Software practitioners lack a consistent approach to assessing and predicting quality within their products. This research proposes a software quality model that accounts for the influences of development team skill/experience, process maturity, and problem complexity throughout the software engineering life cycle. The model is structured using Bayesian Belief Networks and, unlike previous efforts, uses widely accepted software engineering standards and in-use industry techniques to quantify the indicators and measures of software quality. Data from 28 software engineering projects were acquired for this study and used for validation and comparison of the presented software quality models. Three Bayesian model structures are explored, and the structure with the highest performance in terms of accuracy of fit and predictive validity is reported. In addition, the Bayesian Belief Networks are compared to both Least Squares Regression and Neural Networks in order to identify the technique best suited to modeling software product quality. The results indicate that Bayesian Belief Networks outperform both Least Squares Regression and Neural Networks in terms of producing modeled software quality variables that fit the distribution of actual software quality values, and in accurately forecasting 25 different indicators of software quality. Among the Bayesian model structures, the simplest structure, which relates software quality variables directly to their correlated causal factors, was found to be the most effective in modeling software quality. The results also reveal that the collective skill and experience of the development team, more than process maturity or problem complexity, has the most significant impact on the quality of software products.
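
    As a rough illustration of the simplest structure described above (quality conditioned directly on its correlated causal factors), the sketch below builds a three-factor Bayesian Belief Network. It is a minimal example only: the paper does not name a toolkit, and the pgmpy library, the variable names, and every probability used here are assumptions made for demonstration.

        # Illustrative sketch only: pgmpy, the node names and all probabilities
        # below are assumptions, not the paper's model.
        from pgmpy.models import BayesianNetwork
        from pgmpy.factors.discrete import TabularCPD
        from pgmpy.inference import VariableElimination

        # Simplest structure: quality depends directly on its causal factors.
        model = BayesianNetwork([
            ("TeamSkill", "Quality"),
            ("ProcessMaturity", "Quality"),
            ("ProblemComplexity", "Quality"),
        ])

        # Binary states (0 = low, 1 = high); priors are invented for the example.
        cpd_skill = TabularCPD("TeamSkill", 2, [[0.4], [0.6]])
        cpd_maturity = TabularCPD("ProcessMaturity", 2, [[0.5], [0.5]])
        cpd_complexity = TabularCPD("ProblemComplexity", 2, [[0.7], [0.3]])

        # P(Quality | TeamSkill, ProcessMaturity, ProblemComplexity): 2^3 = 8 columns.
        cpd_quality = TabularCPD(
            "Quality", 2,
            [[0.9, 0.7, 0.8, 0.5, 0.6, 0.4, 0.5, 0.2],   # P(Quality = low  | ...)
             [0.1, 0.3, 0.2, 0.5, 0.4, 0.6, 0.5, 0.8]],  # P(Quality = high | ...)
            evidence=["TeamSkill", "ProcessMaturity", "ProblemComplexity"],
            evidence_card=[2, 2, 2],
        )

        model.add_cpds(cpd_skill, cpd_maturity, cpd_complexity, cpd_quality)
        assert model.check_model()

        # Predict quality for a highly skilled team working on a complex problem.
        posterior = VariableElimination(model).query(
            variables=["Quality"],
            evidence={"TeamSkill": 1, "ProblemComplexity": 1},
        )
        print(posterior)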

    Terrestrial Laser Scanner and Close Range Photogrammetry point clouds accuracy assessment for the structure deformations monitoring

    In this paper we present the results of several tests carried out using methods and instrumentation typical of an architectural survey, together with a set of metrological instrumentation, on a Reinforced Concrete (RC) beam subjected to increasing loads. The goal was to assess the accuracy of the displacements estimated by a medium-quality terrestrial laser scanner (TLS), a Focus 3D from Faro Technologies, and by a low-cost digital camera, a Canon PowerShot S110, used in a Close Range Photogrammetry (CRP) survey. The scan data and point clouds were processed with Reconstructor JRC Software v. 3.1.0, maintained by Gexcel Ltd, while the images were processed with Agisoft Photoscan, which implements the Structure from Motion (SfM) approach. Two strategies were used in the point cloud comparison: a mesh2mesh comparison, and modelling the beam behaviour by fitting the contours of the beam with second-order polynomials. Comparisons between the TLS and CRP techniques and the metrological equipment used in parallel highlighted the limits and potential of the two geomatic techniques. Modelling the behaviour of the beam was shown to lead to significantly better results than the mesh2mesh comparison: for CRP the increase in accuracy was on the order of 40%, while for TLS it was about 50%.
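
    The contour-fitting strategy can be pictured with a short sketch: fit a second-order polynomial to the beam edge extracted from each point cloud and read the deflection as the difference between the fitted curves of two load steps. This is a hypothetical illustration only; the coordinates, noise level and span below are invented, and NumPy stands in for the actual processing chain.

        # Hypothetical sketch of the contour-fitting idea: fit z = a*x^2 + b*x + c
        # to beam-edge points from each point cloud and take the deflection as the
        # change between load steps. Sample data are invented for illustration.
        import numpy as np

        def fit_contour(x_mm, z_mm):
            """Fit a second-order polynomial to the extracted beam contour (mm)."""
            return np.polyfit(x_mm, z_mm, deg=2)

        def deflection_at(coeffs_ref, coeffs_loaded, x_mm):
            """Vertical displacement between the two fitted contours at position x."""
            return np.polyval(coeffs_ref, x_mm) - np.polyval(coeffs_loaded, x_mm)

        # Invented contour samples (x along the beam axis, z vertical), 3 m span.
        x = np.linspace(0.0, 3000.0, 50)
        z_ref = 0.05 + 0.001 * np.random.randn(x.size)      # unloaded scan, noisy
        z_loaded = z_ref - 2.0e-6 * x * (3000.0 - x)        # loaded scan: sagging

        ref_fit = fit_contour(x, z_ref)
        loaded_fit = fit_contour(x, z_loaded)
        print(f"estimated midspan deflection: "
              f"{deflection_at(ref_fit, loaded_fit, 1500.0):.2f} mm")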

    UML models consistency management: guidelines for software quality manager

    Unified Modeling Language (UML) has become the de facto standard for designing today's large object-oriented systems. However, maintaining multiple UML diagrams is a major source of consistency problems, which ultimately reduce the overall quality of the software model. Consistency management techniques are widely used to ensure model consistency through correct model-to-model and model-to-code transformations, and consistency management has become a promising research area, especially for model-driven architecture. In this paper, we extensively review UML consistency management techniques. The proposed techniques are classified according to parameters identified from the research literature. Moreover, we perform a qualitative comparison of consistency management techniques in order to identify current research trends, challenges and research gaps in this field of study. Based on the results, we conclude that researchers have paid little attention to exploring inter-model and semantic consistency problems. Furthermore, state-of-the-art consistency management techniques mostly focus on only three UML diagrams (class, sequence and statechart), while the remaining UML diagrams have been overlooked. Consequently, due to this incomplete body of knowledge, researchers are unable to take full advantage of the overlooked UML diagrams, which may otherwise be useful for handling the consistency management challenge in an efficient manner.
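
    To make the notion of an inter-model consistency rule concrete, the toy sketch below checks that every message in a sequence diagram corresponds to an operation declared on the receiving class. The data structures and names are assumptions for illustration and do not reproduce any specific technique surveyed in the paper.

        # Toy inter-model consistency rule: each sequence-diagram message should
        # match an operation on the receiver's class. The dictionaries stand in
        # for a real UML model repository and are invented for this sketch.
        from typing import Dict, List, Set, Tuple

        class_operations: Dict[str, Set[str]] = {
            "Order": {"addItem", "total"},
            "Invoice": {"issue"},
        }

        # Sequence-diagram messages as (receiver_class, operation_name).
        sequence_messages: List[Tuple[str, str]] = [
            ("Order", "addItem"),
            ("Order", "cancel"),      # not declared on Order -> inconsistency
            ("Invoice", "issue"),
        ]

        def check_consistency(ops, messages):
            """Return messages with no matching operation in the class model."""
            return [(cls, op) for cls, op in messages if op not in ops.get(cls, set())]

        for cls, op in check_consistency(class_operations, sequence_messages):
            print(f"Inconsistency: message '{op}' has no operation on class '{cls}'")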

    A Comparative Analysis Of Conventional Software Development Approaches Vs. Formal Methods In Call Distribution Systems

    When we think about formal methods, the first thing that comes to mind is a mathematical approach. Formalization is a mathematically based process used to elaborate the properties of systems (hardware and software). Formal methods thus provide a framework within which large and complex systems can be specified, analyzed, designed, and verified in a systematic way, rather than by conventional approaches. Formal verification applies the fundamentals of theoretical computer science to solve complex and difficult problems in large hardware and software systems, ensuring that such systems will not fail with run-time errors. Conventional approaches to software verification in call distribution systems rely on quality assurance to verify system behavior and robustness. Software testing, however, can only show the presence of errors in a software system, never their absence [1]. In contrast, mathematically based verification techniques use formal methods to prove specific software attributes, for example that the software does or does not contain run-time errors such as overflow, divide-by-zero, access violations, invalid memory accesses, and stack/heap corruption [1]. Later in this paper we present a comparative analysis of formal methods versus conventional software development approaches in call distribution systems, and use this comparison to identify the methodologies and approaches better suited to the SDLC for call distribution systems.
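
    A minimal sketch of the kind of run-time-error proof mentioned above is shown below, assuming the Z3 SMT solver's Python bindings as a stand-in back end (the paper does not prescribe a tool). It checks that a division used to compute an average waiting time cannot divide by zero when a routing precondition holds; the variable names are invented.

        # Illustrative only: Z3 is used here as an assumed stand-in for a
        # formal-verification back end. We ask whether an erroneous state
        # (division by zero) is reachable under the stated precondition.
        from z3 import Int, Solver, And, unsat

        total_wait, calls = Int("total_wait"), Int("calls")

        precondition = And(total_wait >= 0, calls >= 1)   # routing invariant
        error_condition = calls == 0                      # run-time error to rule out

        s = Solver()
        s.add(precondition, error_condition)

        if s.check() == unsat:
            print("Proved: no divide-by-zero is possible under the precondition.")
        else:
            print("Counterexample:", s.model())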

    An assessment of quality management system indicators for the ISO 9001:2008 certified work organisations in Kuwait

    The purpose of this research is to assess the performance of quality management systems in Kuwaiti work organizations certified to ISO 9001:2008, from the customers' (end users') perspective, based on auditing practices and quality implementations. The research methodology ranges from the development and customization of two different survey questionnaires to data analysis with several statistical software packages, such as SPSS and Minitab. Most of the data analysis used non-parametric statistical techniques, except for the modeling part, where more advanced statistical techniques were applied. One survey was directed at all business types and the other only at manufacturing organizations. The target respondents for both surveys were drawn from a list of ISO 9001 certified work organizations provided by a Kuwaiti government agency, the Public Authority for Industry (PAFI). The reliability and validity of both surveys were statistically sound enough for the author to proceed with (1) a comparison against Swedish certified work organizations, and (2) building a statistical model from each survey. The comparison between the Kuwaiti and Swedish work organizations showed many significant differences in auditing practices and quality implementations. Moreover, the resulting differences between the two culturally distinct work environments shed light on existing gaps in ISO implementation and auditing practices in the two countries, and helped the author analyze these gaps in order to suggest prospective quality improvements. Aside from descriptive and inferential analysis of the survey data, model building was the final objective of this research. The main model was built on 10 interrelated factors extracted from the survey questionnaire using the LISREL structural equation modeling software, and it is capable of predicting the total and direct effects from one factor to another. The modeling also showed, statistically, that the ISO certified manufacturing organizations outperformed the ISO certified service organizations in Kuwait.
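
    The distinction between direct and total effects that the structural equation model exploits can be illustrated with a small path-model sketch. This is not the thesis's LISREL model: the three factors and all coefficients below are invented, and NumPy is used only to show that, in a recursive linear path model, the total effects follow from the matrix of direct effects as T = (I - B)^-1 - I.

        # Hypothetical path-model sketch (not the thesis's LISREL model).
        # B[i, j] is the direct effect of factor j on factor i; summing all
        # directed paths gives the total effects T = (I - B)^-1 - I.
        import numpy as np

        factors = ["AuditPractice", "QualityImplementation", "CustomerSatisfaction"]

        B = np.array([
            [0.0, 0.0, 0.0],   # AuditPractice is exogenous here
            [0.6, 0.0, 0.0],   # AuditPractice -> QualityImplementation
            [0.2, 0.5, 0.0],   # both factors -> CustomerSatisfaction
        ])

        I = np.eye(len(factors))
        total = np.linalg.inv(I - B) - I   # direct + indirect effects

        i = factors.index("CustomerSatisfaction")
        j = factors.index("AuditPractice")
        print("direct effect:", B[i, j])                   # 0.2
        print("total effect :", round(total[i, j], 2))     # 0.2 + 0.6*0.5 = 0.5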

    Quality prediction for component-based software development: techniques and a generic environment.

    Cai Xia. Thesis (M.Phil.)--Chinese University of Hong Kong, 2002. Includes bibliographical references (leaves 105-110). Abstracts in English and Chinese.
    Chapter 1  Introduction
        1.1  Component-Based Software Development and Quality Assurance Issues
        1.2  Our Main Contributions
        1.3  Outline of This Thesis
    Chapter 2  Technical Background and Related Work
        2.1  Development Framework for Component-based Software
            2.1.1  Common Object Request Broker Architecture (CORBA)
            2.1.2  Component Object Model (COM) and Distributed COM (DCOM)
            2.1.3  Sun Microsystems's JavaBeans and Enterprise JavaBeans
            2.1.4  Comparison among Different Frameworks
        2.2  Quality Assurance for Component-Based Systems
            2.2.1  Traditional Quality Assurance Issues
            2.2.2  The Life Cycle of Component-based Software Systems
            2.2.3  Differences between Components and Objects
            2.2.4  Quality Characteristics of Components
        2.3  Quality Prediction Techniques
            2.3.1  ARMOR: A Software Risk Analysis Tool
    Chapter 3  A Quality Assurance Model for CBSD
        3.1  Component Requirement Analysis
        3.2  Component Development
        3.3  Component Certification
        3.4  Component Customization
        3.5  System Architecture Design
        3.6  System Integration
        3.7  System Testing
        3.8  System Maintenance
    Chapter 4  A Generic Quality Assessment Environment: ComPARE
        4.1  Objective
        4.2  Metrics Used in ComPARE
            4.2.1  Metamata Metrics
            4.2.2  JProbe Metrics
            4.2.3  Application of Metamata and JProbe Metrics
        4.3  Models Definition
            4.3.1  Summation Model
            4.3.2  Product Model
            4.3.3  Classification Tree Model
            4.3.4  Case-Based Reasoning Model
            4.3.5  Bayesian Network Model
        4.4  Operations in ComPARE
        4.5  ComPARE Prototype
    Chapter 5  Experiments and Discussions
        5.1  Data Description
        5.2  Experiment Procedures
        5.3  Modeling Methodology
            5.3.1  Classification Tree Modeling
            5.3.2  Bayesian Belief Network Modeling
        5.4  Experiment Results
            5.4.1  Classification Tree Results Using CART
            5.4.2  BBN Results Using Hugin
        5.5  Comparison and Discussion
    Chapter 6  Conclusion
    Appendix A  Classification Tree Report of CART
    Appendix B  Publication List
    Bibliography

    Modeling the object-oriented software process: OPEN and the unified process

    A short introduction to software process modeling is presented, with particular attention to object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is reviewed.

    Using of modern technologies for visualization of cultural heritage

    This paper explores the historical evolution and contemporary applications of photogrammetry and laser scanning in cultural heritage preservation, focusing on the restoration of the Shush synagogue in Iraqi Kurdistan. It traces the development of documentation techniques, highlighting photogrammetry's pivotal role and the impact of the digital revolution. The case study of Project Shush illustrates the practical use of geomatics techniques, advanced 3D modeling, and collaboration with NGOs and authorities. The methodology outlines the use of technologies such as terrestrial laser scanners (BLK360, Zeb-Revo) and UAVs, emphasizing their mobility and accuracy. The results detail the project stages, showcasing the creation of a detailed 3D model and the use of Unreal Engine for visualization. The conclusion emphasizes the importance of 3D documentation in cultural heritage and celebrates the success of the Shush synagogue restoration as a testament to technological advancements in preservation. Our research has shown that combining different 3D object documentation technologies significantly improves quality and speeds up the workflow. A comparison of partial point clouds in the software CloudCompare, on a case study of a smaller historic building, showed differences in the internal structure at the centimeter level, while for the external parts covered with vegetation the differences reached up to decimeters.
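
    A cloud-to-cloud comparison of the kind performed in CloudCompare can be sketched as follows, using the Open3D library as an assumed stand-in; the file names are placeholders rather than the project's data.

        # Sketch of a cloud-to-cloud comparison similar in spirit to the
        # CloudCompare check described above, with Open3D as a stand-in library.
        # The file names are placeholders, not the project's data.
        import numpy as np
        import open3d as o3d

        cloud_tls = o3d.io.read_point_cloud("scan_tls.ply")            # laser-scan cloud
        cloud_photo = o3d.io.read_point_cloud("scan_photogrammetry.ply")

        # Nearest-neighbour distance from each photogrammetry point to the TLS cloud.
        distances = np.asarray(cloud_photo.compute_point_cloud_distance(cloud_tls))

        print(f"mean deviation : {distances.mean() * 100:.1f} cm")
        print(f"95th percentile: {np.percentile(distances, 95) * 100:.1f} cm")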