
    Development and Application of New Quality Model for Software Projects

    The IT industry employs a number of models to identify defects in the construction of software projects. In this paper, we present COQUALMO and its limitations, and aim to increase quality without increasing cost and time. Because the computation time, cost, and effort needed to predict residual defects are very high, we developed a new quality model, the software testing defect corrective model (STDCM), to overcome this. The STDCM estimates the number of residual defects remaining in the software product; its assumptions and detailed steps are highlighted, and its application to software projects is explored. The implementation of the model is validated using statistical inference, which shows a significant improvement in the quality of the software projects.
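The abstract does not give the STDCM's internals, but the residual-defect bookkeeping it describes can be sketched as a simple phase-by-phase subtraction. This is an illustrative sketch only, with invented numbers; it is not the STDCM itself.

```python
# Illustrative sketch only: phase-by-phase residual-defect bookkeeping.
# This is NOT the STDCM; its internals are not given in the abstract.
def residual_defects(estimated_total: int, detected_per_phase: list[int]) -> int:
    """Defects remaining after each test phase removes the ones it detected."""
    remaining = estimated_total
    for detected in detected_per_phase:
        remaining = max(remaining - detected, 0)  # cannot drop below zero
    return remaining

print(residual_defects(120, [50, 30, 25]))  # → 15
```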

    The relationship between quality management and competitive advantage: an empirical study of the Egyptian hotel industry

    According to the resource-based view (RBV) of the firm, quality management (QM) is one of the resources a firm can use to generate competitive advantage (CA). Although QM and CA have widely attracted the attention of both academics and practitioners, the link between these two concepts has rarely been examined in the literature, especially in the service industry. Additionally, among the few studies that investigated the relationship between QM and CA, there is contradictory evidence on which QM practices generate CA. Thus, this study examines the impact of QM on CA in the hotel industry, in order to identify which QM practices generate CA. Based on an extensive review of the literature on QM and CA, valid and reliable definitions were formulated for both concepts, and a conceptual framework was then developed to illustrate the relationships between the research variables. Data obtained from a survey of 384 four- and five-star hotels in Egypt were used to test the impact of QM on CA. A total of 300 responses were obtained. Twelve incomplete questionnaires were removed, leaving 288 usable questionnaires and yielding a response rate of 75%. All questionnaires were completed by the hotel general managers. Three main data analysis techniques were employed: exploratory factor analysis (EFA), confirmatory factor analysis (CFA), and structural equation modelling (SEM). Three models were employed in CFA to test the dimensional structure of QM: a model that allows all factors to be freely correlated (oblique factor model), a model where all factors are correlated because they all measure one higher-order factor (higher-order factor model), and a model where all indicators are employed to test whether they measure only one construct. The results of CFA provide solid statistical evidence that affirms the multidimensionality of the QM construct and contradicts other studies that employed QM as a unidimensional construct.
These results assist in resolving the problems that might arise from the lack of clarity in the literature concerning the dimensional structure of QM. The SEM results affirm that soft QM practices such as top management leadership (TML), employee management (EM), customer focus (CF), and supplier management (SM) directly improve hotel financial performance, but hard QM practices such as process management (PM) and quality data and reporting (QD&R) do not. However, two quality management practices, TML and SM, were found to distinguish hotels that have CA from those that do not. Therefore, these results can help hotel managers reallocate hotel resources to implement those QM practices that can improve financial performance and generate CA. Finally, this study would benefit if these models were tested with an alternative data set. This study also suffers from a limitation common to survey research and SEM: due to time and money constraints, the survey is a cross-sectional sample at one specific point in time. As a result, while causal relationships can be inferred, they cannot be strictly proven; causal inferences are stronger with longitudinal studies.

    LEAN MANUFACTURING - AN INTEGRATED SOCIO-TECHNICAL SYSTEMS APPROACH TO WORK DESIGN

    Over the years, the manufacturing industry has witnessed a number of work design practices, based on different principles, which have significantly shaped the nature of work and have affected employees' behavior and performance. This study compares socio-technical systems (STS) principles and lean production (LP) principles to explore the potential for synergistic integration between the two. The principles are categorized according to their common overarching goals, and through a process of theoretical rationalization, these categories are operationalized into the work design practices of middle management support, social practices usage, and technical practices usage. A model of work design is proposed to test the relationships between these work practices and to understand their effect on employees' quality of work life and performance. The effect of task interdependence is also examined, since teams are the basic unit of analysis in STS and LP approaches to work design. This model is tested with cross-sectional survey research in which team leaders in manufacturing plants in the United States were the key respondents. Statistical analyses of the survey data yielded three key findings. Middle management support has a positive direct and indirect effect on improved employee performance, a positive direct effect on social practices usage, and a positive indirect effect on technical practices usage and on employees' quality of work life. Social practices usage has a positive direct effect on technical practices usage, and a positive indirect effect on employees' quality of work life and their performance. Technical practices usage has a direct effect on both quality of work life and employee performance. This study provides empirical support for the definition of lean production posited by Shah and Ward (2007).
Results indicate that middle management is crucial for the implementation and sustainability of a lean system because it offers the support necessary for the usage of social and technical practices. Applications for manufacturing organizations and suggestions for future research are presented.

    Data-driven discovery of materials properties

    The rapid pace of industrial evolution today is creating an urgent need to design new cost-efficient materials that can satisfy both current and future demands. However, with the increasing structural and functional complexity of materials, the ability to rationally design new materials with a precise set of properties has become increasingly challenging. This basic observation has triggered the idea of applying machine learning techniques in the field, further encouraged by the launch of the Materials Genome Initiative (MGI) by the US government in 2011. In this work, we present a novel approach to applying machine learning techniques to materials science applications. Guided by knowledge from domain experts, our approach uses machine learning to accelerate data-driven discovery of materials properties. Our objectives are twofold: (i) identify the optimal set of features that best describes a given predicted variable, and (ii) boost prediction accuracy by applying various regression algorithms. Ordinary least squares, partial least squares, and Lasso regressions, combined with carefully tuned feature selection techniques, are applied and tested to predict key properties of semiconductors for two types of applications. First, we propose to build a more robust prediction model for the band-gap energy (BG-E) of chalcopyrites, commonly used in the solar cell industry. Compared to the results reported in [1-3], our approach shows that learning from and using only a subset of relevant features can improve the prediction accuracy by about 40%. For the second application, we propose to determine the underlying factors responsible for Defect-Induced Magnetism (DIM) in Dilute Magnetic Semiconductors (DMS) through the analysis of a set of 30 features for different DMS systems. We show that 8 of these features are more likely to contribute to this property.
Using only these features to predict the total magnetic moment of new candidate DMSs reduced the mean square error by about 90% compared to models trained using the whole set of features. Given the scarcity of available data sets for similar applications, this work aims not only to build robust models but also to establish a collaborative platform for future research.
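The feature-selection-then-regression workflow described above can be sketched as a Lasso pass that keeps only features with non-negligible coefficients, followed by an ordinary least squares refit on the survivors. The data here is synthetic and the coefficient threshold is an assumption, not a value from the study.

```python
# Hedged sketch of "select a relevant feature subset, then regress":
# Lasso picks features, OLS refits on the survivors. Synthetic data; the
# 0.1 coefficient threshold is an assumption, not a value from the study.
import numpy as np
from sklearn.linear_model import LassoCV, LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))                    # 30 candidate features
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.1 * rng.normal(size=200)

lasso = LassoCV(cv=5).fit(X, y)                   # cross-validated L1 penalty
selected = np.flatnonzero(np.abs(lasso.coef_) > 0.1)

ols = LinearRegression().fit(X[:, selected], y)   # refit on the reduced set
print(sorted(selected.tolist()))                  # the informative features
```

Refitting OLS on the selected subset removes the shrinkage bias Lasso places on the retained coefficients.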

    Integrated circuit outlier identification by multiple parameter correlation

    Semiconductor manufacturers must ensure that chips conform to their specifications before they are shipped to customers. This is achieved by testing various parameters of a chip to determine whether it is defective or not. Separating defective chips from fault-free ones is relatively straightforward for functional or other Boolean tests that produce a go/no-go type of result. However, making this distinction is extremely challenging for parametric tests. Owing to continuous distributions of parameters, any pass/fail threshold results in yield loss and/or test escapes. The continuous advances in process technology, increased process variations and inaccurate fault models all make this even worse. The pass/fail thresholds for such tests are usually set using prior experience or by a combination of visual inspection and engineering judgment. Many chips have parameters that exceed certain thresholds but pass Boolean tests. Owing to the imperfect nature of tests, to determine whether these chips (called "outliers") are indeed defective is nontrivial. To avoid wasted investment in packaging or further testing it is important to screen defective chips early in a test flow. Moreover, if seemingly strange behavior of outlier chips can be explained with the help of certain process parameters or by correlating additional test data, such chips can be retained in the test flow before they are proved to be fatally flawed. In this research, we investigate several methods to identify true outliers (defective chips, or chips that lead to functional failure) from apparent outliers (seemingly defective, but fault-free chips). The outlier identification methods in this research primarily rely on wafer-level spatial correlation, but also use additional test parameters. These methods are evaluated and validated using industrial test data. The potential of these methods to reduce burn-in is discussed.
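The wafer-level spatial-correlation idea can be sketched as flagging a die whose parametric value deviates strongly from the robust statistics of its neighbourhood on the wafer map. The neighbourhood size and threshold below are illustrative assumptions, not values from this research.

```python
# Hedged sketch of wafer-level spatial outlier screening: a die whose value
# deviates strongly from its 3x3 neighbourhood's median (in MAD units) is a
# true-outlier candidate. Window size and k are illustrative assumptions.
import numpy as np

def spatial_outliers(wafer: np.ndarray, k: float = 5.0) -> np.ndarray:
    """Flag dies deviating from the local median by more than k local MADs."""
    padded = np.pad(wafer, 1, mode="edge")
    rows, cols = wafer.shape
    flags = np.zeros_like(wafer, dtype=bool)
    for r in range(rows):
        for c in range(cols):
            neigh = padded[r:r + 3, c:c + 3].copy()
            neigh[1, 1] = np.nan                      # exclude the die itself
            med = np.nanmedian(neigh)
            mad = np.nanmedian(np.abs(neigh - med)) + 1e-9
            flags[r, c] = abs(wafer[r, c] - med) > k * mad
    return flags

wafer = np.full((5, 5), 1.0)
wafer[2, 2] = 10.0                                    # one grossly deviant die
print(spatial_outliers(wafer)[2, 2])                  # → True
```

The median/MAD pair keeps a single extreme neighbour from masking the comparison, which a mean/standard-deviation pair would not.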

    Product assurance technology for custom LSI/VLSI electronics

    The technology for obtaining custom integrated circuits from CMOS-bulk silicon foundries using a universal set of layout rules is presented. The technical efforts were guided by the requirement to develop a 3 micron CMOS test chip for the Combined Release and Radiation Effects Satellite (CRRES). This chip contains both analog and digital circuits. The development employed all the elements required to obtain custom circuits from silicon foundries, including circuit design, foundry interfacing, circuit test, and circuit qualification.

    Exploring variability in medical imaging

    Although recent successes of deep learning and novel machine learning techniques have improved the performance of classification and (anomaly) detection in computer vision problems, the application of these methods in the medical imaging pipeline remains a very challenging task. One of the main reasons for this is the amount of variability that is encountered and encapsulated in human anatomy and subsequently reflected in medical images. This fundamental factor impacts most stages of modern medical imaging processing pipelines. The variability of human anatomy makes it virtually impossible to build large datasets for each disease with labels and annotations for fully supervised machine learning. An efficient way to cope with this is to learn only from normal samples, since such data are much easier to collect. A case study of such an automatic anomaly detection system based on normative learning is presented in this work: a framework for detecting fetal cardiac anomalies during ultrasound screening using generative models trained only on normal/healthy subjects. However, despite the significant improvement in automatic abnormality detection systems, clinical routine continues to rely exclusively on the contribution of overburdened medical experts to diagnose and localise abnormalities. Integrating human expert knowledge into the medical imaging processing pipeline entails uncertainty, which is mainly correlated with inter-observer variability. From the perspective of building an automated medical imaging system, it is still an open issue to what extent this kind of variability and the resulting uncertainty are introduced during the training of a model and how they affect the final performance of the task. Consequently, it is very important to explore the effect of inter-observer variability both on the reliable estimation of a model's uncertainty and on the model's performance in a specific machine learning task.
A thorough investigation of this issue is presented in this work by leveraging automated estimates of machine learning model uncertainty, inter-observer variability, and segmentation task performance in lung CT scan images. Finally, an overview of existing anomaly detection methods in medical imaging is presented. This state-of-the-art survey includes both conventional pattern recognition methods and deep learning based methods, and is one of the first literature surveys attempted in this specific research area.
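The normative-learning principle described above (train only on healthy samples, flag anything the model reconstructs poorly) can be sketched with PCA standing in for the thesis's generative models. All data, dimensions, and the shift magnitude below are illustrative assumptions.

```python
# Hedged analogue of normative anomaly detection: PCA reconstruction error
# stands in for the thesis's generative models. All data are synthetic.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
normals = rng.normal(0.0, 1.0, size=(500, 20))    # "healthy" training samples
pca = PCA(n_components=5).fit(normals)            # learn the normative subspace

def anomaly_score(x: np.ndarray) -> float:
    """Reconstruction error: large when x leaves the normal subspace."""
    recon = pca.inverse_transform(pca.transform(x.reshape(1, -1)))[0]
    return float(np.linalg.norm(x - recon))

normal_score = anomaly_score(rng.normal(size=20))
abnormal_score = anomaly_score(rng.normal(size=20) + 8.0)  # shifted sample
print(normal_score < abnormal_score)
```

In practice a threshold on the score would be calibrated on held-out normal samples, since only the normal class is available at training time.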

    An integrated total quality management model for the Ghanaian construction industry

    D.Phil. (Engineering Management). Abstract: This research project investigated and modelled Total Quality Management (TQM) for the Ghanaian construction industry. The primary aim of the research was to model the extent to which Leadership/Top Management features, Company Supplier Quality Management features, Client Focus and Involvement features, Company Quality System Evaluation features, Company Vision and Plan Statement features, Product Design Management features, Product Selection Management features, Construction Process Management and Improvement features, and Construction Employees’ Involvement and Motivation features predict TQM for the construction industry, these factors being classified as the exogenous variables. Mixed-methods research, involving both qualitative and quantitative approaches, was adopted for the study. Empirical data were collected through a Delphi study and a field questionnaire survey. Analysis of results from the Delphi study was done with Microsoft Excel to produce descriptive statistics. A conceptual integrated TQM model for the Ghanaian construction industry was based on the theory developed from the literature review findings and the Delphi study. A questionnaire survey was conducted among top management working in the construction industry in Ghana. Of the 641 sample questionnaires, 536 were returned, which represents 83.62 per cent. An exploratory factor analysis (EFA) was conducted on the initial eight-factor constructs and their variables to determine their reliability for inclusion in the confirmatory factor analysis (CFA). Nine factor constructs were realized after the EFA factor loading test. Further, CFA was conducted on these nine factor constructs using the EQS (Equations) version 6.2 structural equation modelling (SEM) software programme to validate and determine their reliability and inclusion in the final model.
Findings from the literature on TQM studies revealed the theory that TQM implementations and practices and the latent variables lead to TQM in the construction industry. Findings from the Delphi study revealed that several factors (Leadership/Top Management features, Company Supplier Quality Management features, Client Focus and Involvement features, Company Quality System Evaluation features, Company Vision and Plan Statement features, Product Selection and Design Management features, Construction Process Management and Improvement features, and Construction Employees’ Involvement and Motivation features) were considered the most important determinants of TQM in the Ghanaian construction industry. Both findings revealed that TQM could be considered an eight-factor model defined by the influence of TQM practices and experts in construction.

    Empirical Analysis of Socio-Cognitive Factors Affecting Security Behaviors and Practices of Smartphone Users

    The overall security posture of information systems (IS) depends on the behaviors of IS users. Several studies have shown that users are the greatest vulnerability to IS security. The proliferation of smartphones is introducing an entirely new set of risks, threats, and vulnerabilities. Smartphone devices amplify this data exposure problem by enabling instantaneous transmission and storage of personally identifiable information (PII) by smartphone users, which is becoming a major security risk. Moreover, companies are capitalizing on the availability and powerful computing capabilities of these devices by developing bring-your-own-device (BYOD) programs, which make companies susceptible to divulgence of organizational proprietary information and sensitive customer information. In addition to users being the greatest risk to IS security, several studies have shown that many people do not implement even the most basic security countermeasures on their smartphones. The lack of implemented security countermeasures, risky user behavior, and the amount of sensitive information stored and transmitted on smartphones is becoming an ever-increasing problem. A literature review revealed a significant gap pertaining to smartphone security. This study identified six socio-cognitive factors from the domain of traditional computer security that have been shown to affect user security behaviors and practices: mobile information security self-efficacy, institutional trust, party trust, and awareness of smartphone risks, threats, and vulnerabilities. It analyzed their influence on smartphone security practices and behaviors. The analysis done in this research was confirmatory factor analysis (CFA) and structural equation modeling (SEM).
The goal of this study was to cross-validate factors previously validated within the context of traditional computer security and assess their applicability in the context of smartphone security. Additionally, this study assessed the influential significance of these factors on the security behaviors and practices of smartphone users. A Web-based survey was distributed to approximately 539 users through Facebook® and LinkedIn® social media outlets, which resulted in 275 responses, a 51% response rate. After pre-analysis data screening was completed, a total of 19 responses had to be eliminated due to unengaged responses and outliers, leaving 256 responses to analyze. The results of the analysis found that vulnerability awareness, threat awareness, and risk awareness are interrelated, and all in turn had significance in predicting self-efficacy, security practices, and behaviors. This intricate relationship indicates that users must have increased awareness in all three categories before they can fully understand how to protect themselves: increased awareness in one category alone does not impact the user's overall security posture, and risk, threat, and vulnerability awareness all work together. Another interesting finding was that as risk awareness increased, smartphone users protected themselves less. This finding warrants additional research to investigate why users become more willing to accept risk despite their increased awareness. Finally, institutional trust and party trust were found to have no significance for any of the factors. These findings should give smartphone users and organizations insight into specific areas to focus on in minimizing inappropriate security behaviors and practices of smartphone users.
More specifically, users and organizations need to focus on educating users on all three factors of threats, risks, and vulnerabilities in order to have any impact on increasing self-efficacy and reducing inappropriate security behaviors and practices.
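The pre-analysis screening step the abstract mentions, dropping unengaged (straight-lining) respondents before SEM, can be sketched as a per-row variance filter on the survey matrix. The Likert data and cutoff below are illustrative assumptions, not the study's actual screening rules.

```python
# Hedged sketch of pre-analysis data screening: drop "unengaged" survey
# respondents who straight-line the same answer (near-zero row variance).
# The Likert data and the 0.25 cutoff are illustrative assumptions.
import numpy as np

def screen_unengaged(responses: np.ndarray, min_std: float = 0.25) -> np.ndarray:
    """Keep respondents whose per-row answer variation exceeds min_std."""
    keep = responses.std(axis=1) >= min_std
    return responses[keep]

survey = np.array([
    [4, 5, 3, 4, 2, 5],   # engaged respondent, kept
    [3, 3, 3, 3, 3, 3],   # straight-liner: zero variance, dropped
    [1, 2, 5, 4, 3, 1],   # engaged respondent, kept
])
print(screen_unengaged(survey).shape[0])  # → 2
```

Outlier removal (the other screening step mentioned) would typically follow, e.g. via Mahalanobis distance on the retained rows.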