
    Digital voice-of-customer processing by topic modelling algorithms: insights to validate empirical results

    Purpose: Digital voice-of-customer (digital VoC) analysis is gaining much attention in the field of quality management. Digital VoC can be a great source of knowledge about customer needs, habits and expectations. To this end, the most popular approach is based on the application of text mining algorithms known as topic modelling. These algorithms can identify latent topics discussed within digital VoC and categorise each source (e.g. each review) based on its content. This paper aims to propose a structured procedure for validating the results produced by topic modelling algorithms. Design/methodology/approach: The proposed procedure compares, on random samples, the results produced by topic modelling algorithms with those generated by human evaluators. The use of specific metrics allows a comparison between the two approaches and provides a preliminary empirical validation. Findings: The proposed procedure can guide users of topic modelling algorithms in validating the obtained results. An application case study related to some car-sharing services supports the description. Originality/value: Despite the vast success of topic modelling-based approaches, metrics and procedures to validate the obtained results are still lacking. This paper provides a first practical and structured validation procedure specifically intended for quality-related applications.
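    The abstract does not name the specific metrics used; as an illustration of the underlying idea only, a chance-corrected agreement measure such as Cohen's kappa can quantify how well algorithmic topic assignments match human labels on a random sample (the review labels below are hypothetical):

    ```python
    from collections import Counter

    def cohens_kappa(alg_labels, human_labels):
        """Agreement between algorithmic and human topic assignments,
        corrected for the agreement expected by chance."""
        n = len(alg_labels)
        po = sum(a == h for a, h in zip(alg_labels, human_labels)) / n
        ca, ch = Counter(alg_labels), Counter(human_labels)
        pe = sum(ca[t] * ch[t] for t in ca) / n**2
        return (po - pe) / (1 - pe)

    # Hypothetical topics assigned to 8 car-sharing reviews
    alg   = ["price", "comfort", "price", "app", "app",     "comfort", "price", "app"]
    human = ["price", "comfort", "price", "app", "comfort", "comfort", "price", "app"]
    print(round(cohens_kappa(alg, human), 3))  # → 0.814
    ```

    Values close to 1 indicate that the algorithm's categorisation is consistent with the human evaluators'; values near 0 indicate no better than chance agreement.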

    On the strong connectivity of the 2-Engel graphs of almost simple groups

    The Engel graph of a finite group G is a directed graph encoding the pairs of elements in G satisfying some Engel word. Recent work of Lucchini and the third author shows that, except for a few well-understood cases, the Engel graphs of almost simple groups are strongly connected. In this paper, we give a refinement of this analysis.

    Cooperative diagnostics for distributed LSDM systems based on triangulation

    In the field of large-scale dimensional metrology (LSDM), new distributed systems based on different technologies have blossomed over the last decade. They generally include (i) some targets to be localized and (ii) a network of portable devices, distributed around the object to be measured, which is often bulky and difficult to handle. The objective of this paper is to present some diagnostic tests for those distributed LSDM systems that perform target localization by triangulation. Three tests are presented: two global tests to detect the presence of potential anomalies in the system during measurements, and one local test aimed at isolating any faulty network device(s). This kind of diagnostics is based on the cooperation of different network devices that merge their local observations, not only for target localization, but also for detecting potential measurement anomalies. The tests can be implemented in real time, without interrupting or slowing down the measurement process. After a detailed description of the tests, we present some practical applications on MScMS-II, a distributed LSDM system based on infrared photogrammetric technology, recently developed at DIGEP-Politecnico di Torino.
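    As a minimal sketch of the localization step only (not the paper's actual diagnostic tests), the target position can be recovered as the least-squares intersection of the bearing lines measured by the network devices; the residuals of such a fit are the kind of quantity a global anomaly test could monitor. The device layout and angles below are hypothetical:

    ```python
    import math
    import numpy as np

    def triangulate(positions, bearings):
        """Least-squares intersection of 2D bearing lines from network devices.
        positions: (n, 2) device coordinates; bearings: n angles (rad) to the target."""
        A = np.zeros((2, 2))
        b = np.zeros(2)
        for p, th in zip(np.asarray(positions, float), bearings):
            d = np.array([np.cos(th), np.sin(th)])
            M = np.eye(2) - np.outer(d, d)   # projector onto the line's normal space
            A += M
            b += M @ p
        return np.linalg.solve(A, b)

    # Hypothetical layout: three devices sighting a target placed at (2, 1)
    devs = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
    angles = [math.atan2(1 - y, 2 - x) for x, y in devs]
    print(np.round(triangulate(devs, angles), 6))  # → [2. 1.]
    ```

    With noise-free bearings the lines meet in a single point; with real measurements the residual distances from the estimated point to each line provide a natural statistic for detecting a faulty device.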

    Flexible aggregation operators to support hierarchization of Engineering Characteristics in QFD

    Quality Function Deployment (QFD) is a management tool for organizing and conducting design activities for new products and/or services together with their relevant production and/or supply processes, starting from the requirements directly expressed by the end-users. It is organized in a series of operative steps which lead from the collection of the customer needs to the definition of the technical characteristics of the production/supply processes. The first step entails the construction of the House of Quality (HoQ), a planning matrix translating the Customer Requirements (CRs) into measurable product/service technical characteristics (Engineering Characteristics – ECs). One of the main goals of this step is to transform CR importance ratings into an EC prioritization. A robust evaluation method should consider the relationships between CRs and ECs while determining the importance levels of ECs in the HoQ. In traditional approaches, such as the Independent Scoring Method, ordinal information is arbitrarily converted into cardinal information, introducing a series of controversial assumptions. The current scientific literature presents a number of possible solutions to this problem, but the question of attributing scalar properties to information collected on ordinal scales is far from being settled. This paper proposes a method based on ME-MCDM techniques (Multi Expert / Multiple Criteria Decision Making), which is able to compute the EC prioritization without operating an artificial numerical codification of the information contained in the HoQ. After a general description of the theoretical principles of the method, a series of application examples are presented and discussed.
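    For context, the traditional Independent Scoring Method criticized above can be sketched in a few lines: CR importance ratings are multiplied by the usual 9-3-1 relationship coefficients and summed per column, which is precisely the artificial numerical codification of ordinal data that the paper argues against. All numbers below are hypothetical:

    ```python
    # Hypothetical HoQ: 3 CRs x 4 ECs, relationship strengths on the 9-3-1 scale
    cr_importance = [5, 3, 4]          # customer ratings on a 1-5 ordinal scale
    relations = [[9, 3, 0, 1],         # CR1 vs EC1..EC4
                 [0, 9, 3, 0],         # CR2
                 [3, 0, 9, 3]]         # CR3

    # Independent Scoring Method: EC importance = sum over CRs of rating x relation
    ec_importance = [sum(d * row[j] for d, row in zip(cr_importance, relations))
                     for j in range(len(relations[0]))]
    ranking = sorted(range(len(ec_importance)),
                     key=ec_importance.__getitem__, reverse=True)
    print(ec_importance, ranking)  # → [57, 42, 45, 17] [0, 2, 1, 3]
    ```

    The multiplication treats the 1-5 ratings and 9-3-1 coefficients as if they had cardinal meaning; the ME-MCDM method proposed in the paper derives the EC ranking without this assumption.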

    MScMS-II: an innovative IR-based indoor coordinate measuring system for large-scale metrology applications

    Owing to the great current interest in large-scale metrology applications in many different fields of the manufacturing industry, technologies and techniques for dimensional measurement have recently shown substantial improvement. Ease of use, logistic and economic issues, as well as metrological performance, are assuming a more and more important role among system requirements. This paper describes the architecture and the working principles of a novel infrared (IR) optical-based system, designed to perform low-cost and easy indoor coordinate measurements of large-size objects. The system consists of a distributed network-based layout, whose modularity allows fitting differently sized and shaped working volumes by adequately increasing the number of sensing units. Differently from existing spatially distributed metrological instruments, the remote sensor devices are intended to provide embedded data elaboration capabilities, in order to share the overall computational load. The overall system functionalities, including distributed layout configuration, network self-calibration, 3D point localization, and measurement data elaboration, are discussed. A preliminary metrological characterization of system performance, based on experimental testing, is also presented.

    Biodegradation of porous calcium phosphate scaffolds in an ectopic bone formation model studied by X-ray computed microtomography

    Three types of ceramic scaffolds with different composition and structure [namely synthetic 100% hydroxyapatite (HA; Engipore), synthetic calcium phosphate multiphase biomaterial containing 67% silicon-stabilized tricalcium phosphate (Si-TCP; Skelite™) and natural bone mineral derived scaffolds (Bio-oss®)] were seeded with mesenchymal stem cells (MSC) and ectopically implanted for 8 and 16 weeks in immunodeficient mice. X-ray synchrotron radiation microtomography was used to derive 3D structural information on the same scaffolds both before and after implantation. Meaningful images and morphometric parameters, such as scaffold and bone volume fraction, mean thickness and thickness distribution of the different phases as a function of the implantation time, were obtained. The imaging algorithms used allowed a direct comparison and registration of the 3D structure before and after implantation of the same sub-volume of a given scaffold. In this way it was possible to directly monitor the tissue-engineered bone growth and the complete or partial degradation of the scaffold. Further, detailed kinetics studies on Skelite™ scaffolds implanted for different lengths of time, from 3 days to 24 weeks, revealed in the X-ray absorption histograms two separate peaks associated with HA and TCP. It was therefore possible to observe that the progressive degradation of the Skelite™ scaffolds was mainly due to the resorption of TCP. The different saturation times in the tissue-engineered bone growth and in the TCP resorption confirmed that the bone growth was not limited to the scaffold regions that were resorbed but continued in the inward direction with respect to the pore surface.

    The success-index: an alternative approach to the h-index for evaluating an individual's research output

    Among the most recent bibliometric indicators for normalizing the differences among fields of science in terms of citation behaviour, Kosmulski (J Informetr 5(3):481-485, 2011) proposed the NSP (number of successful papers) index. According to the authors, NSP deserves much attention for its great simplicity and immediate meaning, equivalent to those of the h-index, while it has the disadvantage of being prone to manipulation and not very efficient in terms of statistical significance. In the first part of the paper, we introduce the success-index, aimed at reducing the NSP-index's limitations, although requiring more computing effort. Next, we present a detailed analysis of the success-index from the point of view of its operational properties and a comparison with those of the h-index. Particularly interesting is the examination of the success-index scale of measurement, which is much richer than that of the h-index. This makes the success-index much more versatile for different types of analysis, e.g. (cross-field) comparisons of the scientific output of (1) individual researchers, (2) researchers with different seniority, (3) research institutions of different size, (4) scientific journals, etc.
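    A rough sketch of the two kinds of indicator being compared, assuming (as a simplification) that a "successful" paper is one whose citation count reaches a paper-specific comparison threshold; the citation counts and threshold values below are hypothetical:

    ```python
    def h_index(citations):
        """Largest h such that h papers each have at least h citations."""
        ranked = sorted(citations, reverse=True)
        return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

    def success_index(citations, thresholds):
        """Number of papers whose citation count meets its own comparison
        threshold (e.g. a field-normalized expected citation count)."""
        return sum(c >= t for c, t in zip(citations, thresholds))

    cites = [25, 12, 8, 5, 3, 0]
    thresholds = [10, 10, 10, 4, 4, 4]   # hypothetical field-dependent thresholds
    print(h_index(cites), success_index(cites, thresholds))  # → 4 3
    ```

    Unlike the h-index, which depends only on the rank-ordered citation counts, the threshold-based count can incorporate field-dependent citation behaviour paper by paper, which is what makes cross-field comparisons possible.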

    Analysing uncertainty contributions in dimensional measurements of large-size objects by ultrasound sensors

    Owing to the ever-increasing interest in metrological systems for dimensional measurements of large-size objects in a wide range of industrial sectors, several solutions based on different technologies, working principles, architectures and functionalities have recently been designed. Among these, a distributed flexible system based on a network of low-cost ultrasound (US) sensors - the Mobile Spatial coordinate Measuring System (MScMS) - has been developed. This article presents a possible approach to assess the system uncertainty referring to the measured point coordinates in the 3D space, focusing on the sources of measurement uncertainty and the related propagation law.
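    As an illustrative sketch only, not the article's actual uncertainty model: one common way to propagate range-measurement noise to the 3D point coordinates is a Monte Carlo simulation over a linearized multilateration solver. The device layout and the 5 mm range noise level below are assumptions:

    ```python
    import numpy as np

    def locate(beacons, dists):
        """Linearized least-squares multilateration: subtracting the first
        range equation from the others yields a linear system in the point."""
        B = np.asarray(beacons, float)
        r = np.asarray(dists, float)
        A = 2 * (B[1:] - B[0])
        y = (np.sum(B[1:]**2, axis=1) - r[1:]**2) - (np.sum(B[0]**2) - r[0]**2)
        return np.linalg.lstsq(A, y, rcond=None)[0]

    # Monte Carlo propagation of range noise (assumed sigma = 5 mm, units in m)
    rng = np.random.default_rng(0)
    beacons = [(0, 0, 3), (5, 0, 3), (0, 5, 3), (5, 5, 0)]
    true_pt = np.array([2.0, 3.0, 1.0])
    true_d = np.linalg.norm(np.asarray(beacons, float) - true_pt, axis=1)
    pts = [locate(beacons, true_d + rng.normal(0, 0.005, 4)) for _ in range(2000)]
    print(np.round(np.std(pts, axis=0), 4))   # per-axis standard uncertainty
    ```

    The spread of the simulated solutions shows how the same range noise maps into different uncertainties along the three axes, depending on the network geometry.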

    A wireless sensor network-based approach to large-scale dimensional metrology

    In many branches of industry, dimensional measurements have become an important part of the production cycle, in order to check product compliance with specifications. This task is not trivial, especially when dealing with large-scale dimensional measurements: the bigger the measurement dimensions are, the harder it is to achieve high accuracies. Nowadays, the problem can be handled using many metrological systems, based on different technologies (e.g. optical, mechanical, electromagnetic). Each of these systems is more or less adequate, depending upon measuring conditions, user's experience and skill, or other factors such as time, cost, accuracy and portability. This article focuses on a new possible approach to large-scale dimensional metrology based on wireless sensor networks. Advantages and drawbacks of such an approach are analysed and discussed in depth. Then, the article briefly presents a recent prototype system - the Mobile Spatial Coordinate-Measuring System (MScMS-II) - which has been developed at the Industrial Metrology and Quality Laboratory of DISPEA - Politecnico di Torino. The system seems to be suitable for performing dimensional measurements of large-size objects (sizes on the order of several meters). Owing to its distributed nature, the system - based on a wireless network of optical devices - is portable, fully scalable with respect to dimensions and shapes, and easily adaptable to different working environments. Preliminary results of experimental tests, aimed at evaluating system performance, as well as research perspectives for further improvements, are discussed.