
    DEVELOPMENT OF A QUALITY MANAGEMENT ASSESSMENT TOOL TO EVALUATE SOFTWARE USING SOFTWARE QUALITY MANAGEMENT BEST PRACTICES

    Organizations are constantly in search of competitive advantages in today’s complex global marketplace through improvement of quality, better affordability, and quicker delivery of products and services. This is especially true for software as a product and service. Other things being equal, the quality of software will impact consumers, organizations, and nations. A poor-quality or inefficient process for creating and deploying software can result in cost and schedule overruns, cancelled projects, loss of revenue, loss of market share, and loss of consumer confidence. Hence, it behooves us to constantly explore quality management strategies to deliver high-quality software quickly at an affordable price. This research identifies software quality management best practices derived from scholarly literature using bibliometric techniques in conjunction with literature review, synthesizes these best practices into an assessment tool for industrial practitioners, refines the assessment tool based on academic expert review, further refines the assessment tool based on a pilot test with industry experts, and undertakes industry expert validation. Key elements of this software quality assessment tool include issues dealing with people, organizational environment, process, and technology best practices. Additionally, weights were assigned to issues of people, organizational environment, process, and technology best practices based on their relative importance, to calculate an overall weighted score for organizations to evaluate where they stand with respect to their peers in pursuing the business of producing quality software. This research study indicates that people best practices carry 40% of the overall weight, organizational environment best practices carry 30%, process best practices carry 15%, and technology best practices carry 15%.
The assessment tool that is developed will be valuable to organizations that seek to take advantage of rapid innovations in pursuing higher software quality. These organizations can use the assessment tool for implementing best practices based on the latest cutting-edge management strategies that can lead to improved software quality and other competitive advantages in the global marketplace. This research contributed to the current academic literature in software quality by presenting a quality assessment tool based on software quality management best practices, contributed to the body of knowledge on software quality management, and expanded the knowledge base on quality management practices. This research also contributed to current professional practice by incorporating software quality management best practices into a quality management assessment tool to evaluate software.
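The category weights above (people 40%, organizational environment 30%, process 15%, technology 15%) imply a simple weighted-sum calculation. A minimal sketch of that scoring, assuming illustrative per-category scores on a 0–100 scale (the weights come from the study; everything else here is invented for illustration):

```python
# Category weights from the study; per-category scores are hypothetical inputs
# an organization would obtain from the assessment tool's questionnaire.
WEIGHTS = {
    "people": 0.40,
    "organization": 0.30,
    "process": 0.15,
    "technology": 0.15,
}

def overall_score(category_scores: dict) -> float:
    """Combine per-category scores (0-100) into one overall weighted score."""
    return sum(WEIGHTS[cat] * score for cat, score in category_scores.items())

# Example: strong on people and process, weaker on technology.
scores = {"people": 80.0, "organization": 70.0, "process": 90.0, "technology": 60.0}
print(overall_score(scores))  # 0.4*80 + 0.3*70 + 0.15*90 + 0.15*60 = 75.5
```

Because the weights sum to 1, the overall score stays on the same 0–100 scale as the inputs, which makes peer comparison straightforward.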

    Teaching and Collecting Technical Standards: A Handbook for Librarians and Educators

    Technical standards are a vital source of information for providing guidelines during the design, manufacture, testing, and use of whole products, materials, and components. To prepare students—especially engineering students—for the workforce, universities are increasing the use of standards within the curriculum. Employers believe it is important for recent university graduates to be familiar with standards. Despite the critical role standards play within academia and the workforce, little information is available on the development of standards information literacy, which includes the ability to understand the standardization process; identify types of standards; and locate, evaluate, and use standards effectively. Libraries and librarians are a critical part of standards education, and much of the discussion has been focused on the curation of standards within libraries. However, librarians also have substantial experience in developing and teaching a standards information literacy curriculum. With the need for universities to develop a workforce that is well educated on the use of standards, librarians and course instructors can apply their experiences in information literacy toward teaching students the knowledge and skills regarding standards that they will need to be successful in their field. This title provides background information for librarians on technical standards as well as collection development best practices. It also creates a model for librarians and course instructors to use when building a standards information literacy curriculum.

    A model-driven engineering approach for the unique identity reconciliation of heterogeneous data sources

    The objectives to be achieved with this doctoral thesis are: 1. Perform a study of the state of the art of the different existing solutions for the entity reconciliation of heterogeneous data sources, checking whether they are being used in real environments. 2. Define and develop a framework for designing entity reconciliation models in a systematic way for the requirements, analysis, and testing phases of a software methodology. For this purpose, this objective has been divided into three sub-objectives: a. Define a set of activities, represented as a process, which can be added to any software development methodology to carry out the activities related to entity reconciliation in the requirements, analysis, and testing phases of any software development life cycle. b. Define a metamodel that allows us to represent an abstract view of our model-based approach. c. Define a set of derivation mechanisms that establish the basis for automating the testing of the solutions in which the framework proposed in this doctoral thesis has been used. Considering that the process is applied in the early stages of development, this proposal can be said to apply Early Testing. 3. Provide a support tool for the framework. The support tool will allow a software engineer to define the analysis model of an entity reconciliation problem between different, heterogeneous data sources. The tool will be realized as a Domain-Specific Language (DSL). 4. Evaluate the results obtained from applying the proposal to a real-world case study.
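The core problem the framework addresses, matching records that refer to the same real-world entity across heterogeneous sources, can be illustrated with a toy sketch. This is not the thesis's metamodel or DSL; the normalization rule and record fields below are invented for illustration:

```python
def normalize(name: str) -> str:
    """Crude key normalization so that 'ACME Corp.' and 'acme corp' reconcile."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

def reconcile(source_a: list, source_b: list, key: str) -> list:
    """Pair records from two sources whose normalized key values match."""
    index = {normalize(rec[key]): rec for rec in source_b}
    return [(rec, index[normalize(rec[key])])
            for rec in source_a if normalize(rec[key]) in index]

# Two heterogeneous sources describing the same company differently.
crm = [{"name": "ACME Corp.", "city": "Seville"}]
erp = [{"name": "acme corp", "vat": "ES-123"}]
print(reconcile(crm, erp, "name"))  # one matched pair across the two sources
```

A model-driven framework would generate such matching logic from an analysis model rather than hand-coding it, which is what makes the derived test cases systematic.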

    Innovative Tools and Methods Using BIM for an Efficient Renovation in Buildings

    This open access book describes a BIM-based toolkit that has been developed according to the latest research activities on building information modelling and semantic interoperability to optimize the building process. It highlights the impacts of using such new tools to speed up renovation activities, from the decision-making and design stages to construction site management, with the possibility to monitor occupants' and owners’ feedback during the realization process. In this process, a framework has been developed and implemented to allow stakeholders involved in a renovation project to efficiently compile, maintain, and add data about (i) building elements, (ii) building services systems, (iii) tenants, operators, and owners of the building, and (iv) current and predicted performance of the building from the various data sources available. The framework applies and specializes the existing practices in the Semantic Web, Linked Data, and ontology domain to the management of renovation projects. It has been designed to be open so that any system which implements the required functions and uses the specified conventions will be able to achieve semantic interoperability with other framework-compliant systems in the renovation domain. Finally, this book presents the validation process of the toolkit that has been held in three demo sites: a social housing building in Italy and two private residential buildings in Poland and Finland. The outcome shows that the toolkit facilitates the renovation process with relevant reductions of time, costs, and energy consumption, and that the inhabitants can take advantage of the increase in building performance, quality, and comfort.
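The framework's four data groups (building elements, services systems, people, and performance data) are held together in the Linked Data style as subject–predicate–object statements. A minimal sketch of that idea, assuming an RDF-like triple representation; all identifiers and predicate names here are invented, not the toolkit's actual ontology:

```python
# A tiny in-memory triple store: data from heterogeneous renovation sources
# merged into one queryable graph.
triples = set()

def add(subject: str, predicate: str, obj: str) -> None:
    triples.add((subject, predicate, obj))

add("building:demo-IT", "hasElement", "wall:W1")          # building element
add("wall:W1", "hasUValue", "0.35 W/m2K")                 # element property
add("building:demo-IT", "hasTenant", "tenant:T1")         # people data
add("building:demo-IT", "predictedEnergyUse", "95 kWh/m2a")  # performance data

def query(subject=None, predicate=None):
    """Return all triples matching the given subject/predicate pattern."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)]

print(query(predicate="hasUValue"))  # look up thermal data regardless of source
```

Because every system writes and reads the same triple conventions, a design tool, a site-management tool, and a monitoring dashboard can exchange data without pairwise converters, which is the interoperability property the framework aims for.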

    Methodology for Specifying and Testing Traffic Rule Compliance for Automated Driving

    The introduction of highly automated driving functions promises to increase safety and comfort, but safety validation remains an unsolved challenge. Here, the requirement is that the introduction does not reduce safety on public roads. This dissertation addresses one major aspect of road safety: traffic rule compliance. Even an automated vehicle must comply with existing traffic rules. The developed method enables automated testing of the traffic rule compliance of automated driving functions. In the first part of the thesis, the state of the art for describing and formalizing behavioral rules is analyzed. A special challenge is posed by the traffic rules differing between traffic regions. With existing approaches, a separate description and formalization of the behavioral rules is necessary for each traffic region or even for individual traffic areas. This shows the necessity of developing new approaches for the abstraction and transferability of the behavioral rules in order to reduce the effort of testing and ensuring traffic rule compliance. The rule compliance criteria are to be integrated into the behavior specification within the functional specification. The objective of this thesis is to develop a method to formalize the limits of traffic rule compliance, based on which fail criteria for system testing are defined and applied. For this purpose, existing traffic rules are analyzed as a basis to identify which behavior constraints are imposed by the static traffic environment. Based on this, a semantic description is developed that is transferable between traffic domains and that links the boundaries of traffic rule compliance to the static traffic environment. The method involves deriving behavioral attributes from which the semantic behavior description is constructed. These behavioral attributes span the behavior space that describes the boundaries of legally allowed behavior.
Furthermore, methods for the automated derivation of behavioral attributes from high-definition maps are developed, thus extracting the behavioral requirements from an operational design domain. It is investigated which functionalities an automated vehicle has to provide to comply with the behavioral attributes. The attributes are then formalized to obtain quantifiable failure criteria of traffic rule compliance that can be used in automated testing. Building on the state of the art, a test strategy for validating traffic rule conformance is presented. The explicit availability of the behavioral limits results in an advantage in the influence analysis of possible parameters for these tests. Finally, the developed method is applied to existing map material and to test drives with an automated vehicle prototype in order to investigate the practical applicability of the approach as well as the resulting gain in knowledge about traffic rule compliance testing. The developed approach makes it possible to derive the behavioral specification with respect to traffic rule conformance as an essential part of the functional specification, independent of the application domain. It is shown that the approach is able to test the traffic rule conformance of an automated vehicle in different test scenarios within an application domain. By applying the developed methodology, it was possible to identify defects in the investigated test vehicle with respect to rule understanding and compliance.
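The pipeline described above, map-derived behavioral attributes turned into quantifiable failure criteria, can be illustrated with a deliberately simplified sketch. This is not the dissertation's formalization; the single attribute (a per-segment speed limit) and all values are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class BehavioralAttribute:
    """One behavior-space bound, e.g. derived from a high-definition map."""
    segment_id: str
    speed_limit_mps: float

@dataclass
class Sample:
    """One measured state from a recorded test drive."""
    segment_id: str
    speed_mps: float

def failed_samples(attrs: dict, trace: list) -> list:
    """Quantifiable fail criterion: measured speed exceeds the segment's limit."""
    return [s for s in trace if s.speed_mps > attrs[s.segment_id].speed_limit_mps]

attrs = {"seg1": BehavioralAttribute("seg1", 13.9)}  # ~50 km/h urban limit
trace = [Sample("seg1", 12.0), Sample("seg1", 15.0)]
print(failed_samples(attrs, trace))  # the 15.0 m/s sample violates the limit
```

The point of the explicit attribute is that the same check runs unchanged in any traffic domain: only the map-derived limits change, not the test logic.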