
    CoFeD: A visualisation framework for comparative quality evaluation

    Evaluation for the purpose of selection can be a challenging task, particularly when there is a plethora of choices available. Short-listing, comparison and eventual choice can be aided by visualisation techniques. In this paper we use Feature Analysis, Tabular and Tree Representations and Composite Features Diagrams (CFDs) for profiling user requirements and for top-down profiling and evaluation of the items (methods, tools, techniques, processes and so on) under evaluation. The resulting framework, CoFeD, enables efficient visual comparison and initial short-listing. The second phase uses bottom-up quantitative evaluation, which aids the elimination of the weakest items and hence the effective selection of the most appropriate item. The versatility of the framework is illustrated by a case-study comparison and evaluation of two agile methodologies. The paper concludes with limitations and indications of further work.
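    The bottom-up quantitative phase described above lends itself to a simple weighted-scoring illustration. The sketch below is a hypothetical, minimal example of scoring candidate items against weighted requirement features and ranking them; the feature names, weights and scores are invented and are not taken from the paper.

```python
# Hypothetical sketch of a bottom-up weighted-score evaluation, loosely in the
# spirit of CoFeD's second phase; feature names, weights and scores are invented.

requirements = {          # feature -> weight reflecting its importance in the user profile
    "iteration_support": 5,
    "documentation": 3,
    "tool_support": 4,
}

candidates = {            # item under evaluation -> feature -> score on a 0-10 scale
    "Method A": {"iteration_support": 8, "documentation": 4, "tool_support": 7},
    "Method B": {"iteration_support": 6, "documentation": 9, "tool_support": 5},
}

def weighted_score(scores: dict, weights: dict) -> float:
    """Sum of feature scores weighted by requirement importance."""
    return sum(weights[f] * scores.get(f, 0) for f in weights)

# Rank the candidates; the weakest items would be eliminated from the short-list.
ranked = sorted(candidates,
                key=lambda c: weighted_score(candidates[c], requirements),
                reverse=True)
print("Ranking (best first):", ranked)
```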

    A framework to evaluate the impact of ICT usage on collaborative product development performance in manufacturing firms : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Engineering at Massey University, Auckland, New Zealand

    Manufacturers are increasingly adopting collaborative product development (CPD) to achieve competitive advantage through joint synergies. Information and communication technology (ICT) is the major enabler of the communication, collaboration, product design, development, knowledge and information management, project management, and market research activities involved in CPD. Most ICT implementations incur a significant cost for firms, so a deeper understanding of the impact of ICT usage on CPD performance would be immensely useful for managing ICT resources effectively in innovation programmes. However, existing evidence for the direct relationships between ICT usage and performance dimensions is counterintuitive (negative or insignificant). Failing to consider the different aspects of ICT usage was identified as a key reason for the lack of strong empirical evidence. Furthermore, the impact of ICT usage on collaboration-based product development performance, its indirect impact through this collaboration performance on new product performance, and the moderating effects of project characteristics on the direct and indirect ICT impact have largely been ignored in the literature. Therefore, drawing on the relational resource-based view and organizational information processing theory, this study develops and uses a model comprising multidimensional ICT usage and CPD performance measurements, together with possible moderating project characteristics, to better evaluate the impact of ICT usage on CPD performance. Initially, product development professionals from manufacturing firms and knowledgeable managers from ICT vendor firms were interviewed for a preliminary qualitative evaluation of the suggested model from industry perspectives. In addition, a quantitative investigation of secondary data obtained from the PDMA’s (Product Development and Management Association) 2012 comparative performance assessment study was conducted prior to the main survey in order to assess the significance of the proposed model with a different source of data. In the final main quantitative study, data collected from 244 CPD projects via an online global survey were used to test the research hypotheses. The study contributes to the current body of knowledge by revealing a positive direct impact of ICT usage both on new product performance, in terms of quality, commercial success, and time performance, and on collaboration performance, which in turn also increases new product performance. In addition, the moderating effects of project characteristics (complexity and uncertainty) on these associations have been explored. The study implies that manufacturers need to value not only the direct project benefits of ICT use but also the collaboration-related outcomes that significantly increase the likelihood of achieving higher performance in their present and future CPD projects. Adequate attention must also be paid to individual ICT usage dimensions. In particular, beyond the frequency of ICT use, manufacturing firms need to improve the utilization of the available features and functionalities of their tools (intensity) and the ICT proficiency of R&D staff to gain the desired results in CPD projects.
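    The model described above combines a mediated effect (ICT usage acting on new product performance through collaboration performance) with moderation by project characteristics. The sketch below is a hypothetical illustration of that style of analysis using ordinary least squares on simulated data; the variable names, effect sizes and data are invented and do not reproduce the thesis' model or results.

```python
# Hypothetical sketch of a mediation/moderation analysis in the spirit of the model
# described above, run on simulated data with ordinary least squares; all variable
# names, effect sizes and data are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 244                                   # same order of magnitude as the surveyed projects
ict = rng.normal(size=n)                  # ICT usage (e.g. frequency/intensity/proficiency)
complexity = rng.normal(size=n)           # moderating project characteristic
collab = 0.5 * ict + rng.normal(size=n)   # collaboration performance (mediator)
npp = 0.3 * ict + 0.4 * collab + 0.2 * ict * complexity + rng.normal(size=n)  # new product performance

df = pd.DataFrame({"ict": ict, "complexity": complexity, "collab": collab, "npp": npp})

# Mediation: ICT usage -> collaboration performance -> new product performance
m1 = smf.ols("collab ~ ict", data=df).fit()
m2 = smf.ols("npp ~ ict + collab", data=df).fit()
# Moderation: interaction between ICT usage and project complexity
m3 = smf.ols("npp ~ ict * complexity + collab", data=df).fit()

print("ICT -> collaboration:", round(m1.params["ict"], 2))
print("collaboration -> new product performance:", round(m2.params["collab"], 2))
print("ICT x complexity moderation:", round(m3.params["ict:complexity"], 2))
```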

    Quantification of left ventricular longitudinal strain, strain rate, velocity and displacement in healthy horses by 2-dimensional speckle tracking

    Background: The quantification of equine left ventricular (LV) function is generally limited to short-axis M-mode measurements. However, LV deformation is 3-dimensional (3D) and consists of longitudinal shortening, circumferential shortening, and radial thickening. In human medicine, longitudinal motion is the best marker of subtle myocardial dysfunction. Objectives: To evaluate the feasibility and reliability of 2-dimensional speckle tracking (2DST) for quantifying equine LV longitudinal function. Animals: Ten healthy untrained trotter horses; 9.6 ± 4.4 years; 509 ± 58 kg. Methods: Prospective study. Repeated echocardiographic examinations were performed by 2 observers from a modified 4-chamber view. Global, segmental, and averaged peak values and timing of longitudinal strain (SL), strain rate (SrL), velocity (VL), and displacement (DL) were measured in 4 LV wall segments. The inter- and intraobserver within- and between-day variability was assessed by calculating the coefficients of variation for repeated measurements. Results: 2DST analysis was feasible in each exam. The variability of peak systolic values and peak timing was low to moderate, whereas peak diastolic values showed a higher variability. Significant segmental differences were demonstrated. DL and VL presented a prominent base-to-midwall gradient. SL and SrL values were similar in all segments except the basal septal segment, which showed a significantly lower peak SL occurring about 60 ms later compared with the other segments. Conclusions and Clinical Importance: 2DST is a reliable technique for measuring systolic LV longitudinal motion in healthy horses. This study provides preliminary reference values, which can be used when evaluating the technique in a clinical setting.
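    Variability in the study is summarised with coefficients of variation for repeated measurements. The sketch below is a minimal illustration of that computation; the example strain values are invented placeholders, not measurements from the study.

```python
# Minimal sketch of the coefficient-of-variation computation used to quantify
# inter-/intra-observer and within-/between-day variability; the example strain
# values are invented placeholders.
import numpy as np

def coefficient_of_variation(repeated) -> float:
    """CV (%) of repeated measurements of the same quantity."""
    x = np.asarray(repeated, dtype=float)
    return 100.0 * x.std(ddof=1) / abs(x.mean())

# e.g. peak systolic longitudinal strain (%) of one segment measured three times
peak_SL = [-20.1, -21.3, -19.8]
print(f"CV = {coefficient_of_variation(peak_SL):.1f}%")
```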

    Discovering, quantifying, and displaying attacks

    In the design of software and cyber-physical systems, security is often perceived as a qualitative need, but can only be attained quantitatively. Especially when distributed components are involved, it is hard to predict and confront all possible attacks. A main challenge in the development of complex systems is therefore to discover attacks, quantify them to comprehend their likelihood, and communicate them to non-experts for facilitating the decision process. To address this three-sided challenge we propose a protection analysis over the Quality Calculus that (i) computes all the sets of data required by an attacker to reach a given location in a system, (ii) determines the cheapest set of such attacks for a given notion of cost, and (iii) derives an attack tree that displays the attacks graphically. The protection analysis is first developed in a qualitative setting, and then extended to quantitative settings following an approach applicable to a great many contexts. The quantitative formulation is implemented as an optimisation problem encoded into Satisfiability Modulo Theories, allowing us to deal with complex cost structures. The usefulness of the framework is demonstrated on a national-scale authentication system, studied through a Java implementation of the framework.
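    To illustrate how "the cheapest set of attacks" can be cast as an optimisation problem over Satisfiability Modulo Theories, the sketch below uses the Z3 solver on a tiny invented example; the attack structure and costs are hypothetical and unrelated to the Quality Calculus analysis or the authentication system studied in the paper.

```python
# Hypothetical sketch: finding the cheapest set of data an attacker must obtain,
# encoded as an optimisation problem over SMT with the Z3 solver. The attack
# structure and costs are invented for illustration.
from z3 import And, Bool, If, Optimize, Or, is_true, sat

# Data items the attacker may acquire, each with an assumed acquisition cost.
password, token, phish = Bool("password"), Bool("token"), Bool("phish")
cost = {password: 10, token: 5, phish: 3}

opt = Optimize()
# Invented attack structure: the target location is reachable with the password,
# or with both the token and a successful phishing step.
opt.add(Or(password, And(token, phish)))
# Minimise the total cost of the acquired data.
opt.minimize(sum(If(item, c, 0) for item, c in cost.items()))

if opt.check() == sat:
    model = opt.model()
    chosen = [str(item) for item in cost if is_true(model.evaluate(item))]
    print("Cheapest attack uses:", chosen)   # expected: ['token', 'phish']
```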

    Quantitative geometric analysis of rib, costal cartilage and sternum from childhood to teenagehood

    Better understanding of the effects of growth on children’s bones and cartilage is necessary for clinical and biomechanical purposes. The aim of this study is to define the 3D geometry of children’s rib cages, including the sternum, ribs and costal cartilage. Three-dimensional reconstructions of 960 ribs, 518 costal cartilages and 113 sternebrae were performed on thoracic CT scans of 48 children aged four months to 15 years. The geometry of the sternum was detailed and nine parameters were used to describe the ribs and rib cages. A "costal index" was defined as the ratio between cartilage length and whole rib length to evaluate the cartilage ratio for each rib level. For all children, the costal index decreased from rib level one to three and increased from level three to seven. For all levels, the cartilage accounted for 45 to 60% of the rib length, and was longer in the first years of life. The mean costal index decreased by 21% for subjects over three years old compared to those under three (p < 10⁻⁴). The volume of the sternebrae was found to be highly age dependent. Such data could be useful to define the standard geometry of the paediatric thorax and help to detect clinical abnormalities. This work was funded by a grant from the ANR (SECUR_ENFANT 06_0385) and supported by the GDR 2610 “Biomécanique des chocs” (CNRS/INRETS/GIE PSA Renault).
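    As a minimal illustration of the costal index defined above (cartilage length divided by whole rib length at each rib level), the sketch below computes it for invented placeholder lengths chosen only to mimic the decrease-then-increase pattern reported across levels one to seven.

```python
# Minimal sketch of the "costal index" (cartilage length / whole rib length) per
# rib level; the lengths below are invented placeholders.

def costal_index(cartilage_length_mm: float, whole_rib_length_mm: float) -> float:
    """Fraction of the whole rib length contributed by the costal cartilage."""
    return cartilage_length_mm / whole_rib_length_mm

cartilage = [44, 45, 46, 53, 60, 67, 75]       # hypothetical cartilage lengths (mm), levels 1-7
whole_rib = [80, 90, 100, 110, 115, 120, 125]  # hypothetical whole rib lengths (mm)

for level, (c, r) in enumerate(zip(cartilage, whole_rib), start=1):
    print(f"level {level}: costal index = {costal_index(c, r):.2f}")
```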

    A robust digital image watermarking using repetition codes against common attacks

    Digital watermarking hides information inside digital media to protect such documents against malicious attempts to alter them or to claim ownership of them. Currently, the capability of repetition codes to withstand various attacks has not been sufficiently studied. In this project, a robust frequency-domain watermarking scheme has been implemented using the Discrete Cosine Transform (DCT). The idea of this scheme is to embed a watermark encoded with a (3, 1) repetition code inside the cover image pixels using a DCT-based embedding technique. The proposed methods have undergone several simulated attack tests in order to verify and compare their robustness against various attacks, such as salt-and-pepper noise, speckle noise, compression, Gaussian noise, contrast adjustment, resizing and cropping. The robustness of the watermarking scheme has been measured using Peak Signal-to-Noise Ratio (PSNR), Mean Squared Error (MSE) and Normalized Correlation (NC). In our experiments, the results show that the robustness of a watermark with repetition codes is much better than without repetition codes.
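    The sketch below is a hypothetical illustration of the (3, 1) repetition coding and of the quality metrics named above (MSE, PSNR, NC); the DCT embedding step itself is omitted and all values are invented.

```python
# Hypothetical sketch of (3, 1) repetition coding for a binary watermark, plus the
# MSE, PSNR and NC metrics named above; the DCT embedding itself is omitted and
# all values are invented.
import numpy as np

def repetition_encode(bits: np.ndarray, n: int = 3) -> np.ndarray:
    """Repeat each watermark bit n times, e.g. 1 0 -> 1 1 1 0 0 0."""
    return np.repeat(bits, n)

def repetition_decode(coded: np.ndarray, n: int = 3) -> np.ndarray:
    """Majority vote over each group of n received bits."""
    return (coded.reshape(-1, n).sum(axis=1) > n // 2).astype(int)

def mse(a, b) -> float:
    return float(np.mean((np.asarray(a, float) - np.asarray(b, float)) ** 2))

def psnr(a, b, peak: float = 255.0) -> float:
    return 10.0 * np.log10(peak ** 2 / mse(a, b))

def normalized_correlation(w, w_extracted) -> float:
    w, w_extracted = np.asarray(w, float), np.asarray(w_extracted, float)
    return float(np.sum(w * w_extracted) / np.sum(w * w))

watermark = np.array([1, 0, 1, 1, 0])
coded = repetition_encode(watermark)
attacked = coded.copy()
attacked[0] = 0                                   # a single bit flipped by an attack
recovered = repetition_decode(attacked)
print("recovered:", recovered, "NC =", normalized_correlation(watermark, recovered))
```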

    Practical applications of probabilistic model checking to communication protocols

    Probabilistic model checking is a formal verification technique for the analysis of systems that exhibit stochastic behaviour. It has been successfully employed in an extremely wide array of application domains including, for example, communication and multimedia protocols, security and power management. In this chapter we focus on the applicability of these techniques to the analysis of communication protocols. An analysis of the performance of such systems must successfully incorporate several crucial aspects, including concurrency between multiple components, real-time constraints and randomisation. Probabilistic model checking, in particular using probabilistic timed automata, is well suited to such an analysis. We provide an overview of this area, with emphasis on an industrially relevant case study: the IEEE 802.3 (CSMA/CD) protocol. We also discuss two contrasting approaches to the implementation of probabilistic model checking, namely those based on numerical computation and those based on discrete-event simulation. Using results from the two tools PRISM and APMC, we summarise the advantages, disadvantages and trade-offs associated with these techniques.
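    As a toy illustration of the simulation-based approach (as opposed to numerical computation), the sketch below estimates by Monte Carlo simulation the probability that two stations resolve a CSMA/CD-style collision within a deadline. The model is a drastically simplified, invented abstraction of binary exponential backoff, not the PRISM or APMC models discussed in the chapter.

```python
# Toy Monte Carlo illustration of the simulation-based approach to probabilistic
# verification: estimate P("collision resolved within DEADLINE time units") for an
# invented, highly simplified abstraction of binary exponential backoff.
import random

SLOT = 1            # backoff slot time (abstract time units)
DEADLINE = 20       # property: collision resolved within 20 time units

def collision_resolved_by(deadline: int, rng: random.Random) -> bool:
    t, attempt = 0, 0
    while t <= deadline:
        attempt += 1
        # Each station picks a delay uniformly from 0 .. 2**attempt - 1 slots.
        d1 = rng.randrange(2 ** attempt) * SLOT
        d2 = rng.randrange(2 ** attempt) * SLOT
        t += max(d1, d2)
        if d1 != d2:                 # different delays -> no further collision
            return t <= deadline
    return False

def estimate(samples: int = 100_000, seed: int = 1) -> float:
    rng = random.Random(seed)
    hits = sum(collision_resolved_by(DEADLINE, rng) for _ in range(samples))
    return hits / samples

print(f"P(resolved within {DEADLINE}) ≈ {estimate():.3f}")
```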