
    Information Quality Measurement Using Quality Function Deployment – A Korean Case Study

    Contemporary business organizations understand the criticality of data quality, yet they struggle to enhance it. Establishing, enhancing, and maintaining data quality in organizational information systems has been a focus of research and industry for the last two decades. It is undeniable, however, that improving information quality requires objective assessment. Objectivity is necessary because information quality spans a range of dimensions, and a deficiency in one dimension affects the others. Such an assessment therefore provides actionable learning that facilitates tangible improvements in systems and practice. This paper presents a case study on the assessment of information quality in a manufacturing organization. It applies the product perspective of information to the Six Sigma methodology, uses the analytic hierarchy process to relate the various information quality dimensions, and uses quality function deployment to develop critical-to-quality trees.
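    As an illustration of the analytic hierarchy process step mentioned in the abstract, the minimal sketch below derives priority weights for a handful of hypothetical information quality dimensions from a pairwise comparison matrix and checks Saaty's consistency ratio. The dimension names and all judgement values are assumptions for illustration, not data from the case study.

```python
# A minimal sketch of the AHP weighting step, assuming a hypothetical 4x4
# pairwise comparison matrix over illustrative information quality dimensions.
import numpy as np

dimensions = ["accuracy", "completeness", "timeliness", "consistency"]

# Saaty-scale pairwise judgements (illustrative values, not case-study data).
A = np.array([
    [1.0, 3.0, 5.0, 3.0],
    [1/3, 1.0, 3.0, 1.0],
    [1/5, 1/3, 1.0, 1/2],
    [1/3, 1.0, 2.0, 1.0],
])

# Priority weights from the principal eigenvector of the comparison matrix.
eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio; RI = 0.90 is the standard random index for n = 4.
n = A.shape[0]
consistency_index = (eigvals.real[k] - n) / (n - 1)
consistency_ratio = consistency_index / 0.90

for name, w in zip(dimensions, weights):
    print(f"{name:12s} weight = {w:.3f}")
print(f"consistency ratio = {consistency_ratio:.3f} (acceptable if < 0.10)")
```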

    PICES Press, Vol. 9, No. 2, July 2001

    Cover [pdf, 0.2 Mb]
    Climate, biodiversity and ecosystems of the North Pacific [pp. 1-2] [pdf, 0.2 Mb]
    The state of the western North Pacific in the second half of 2000 [pp. 3-5] [pdf, 0.8 Mb]
    The status of the Bering Sea: June – December 2000 [pp. 6-7] [pdf, 1.5 Mb]
    The state of the eastern North Pacific since autumn 2000 [p. 8] [pdf, 0.3 Mb]
    Korean Yellow Sea Large Marine Ecosystem Program [pp. 9-12] [pdf, 0.5 Mb]
    Past and ongoing Mexican ecosystem research in the northeast Pacific Ocean [pp. 13-15] [pdf, 0.3 Mb]
    Vera Alexander [pp. 16-19] [pdf, 1.0 Mb]
    North Pacific CO2 data for the new millennium [pp. 20-21] [pdf, 0.3 Mb]
    PICES Higher Trophic Level Modelling Workshop [pp. 22-23] [pdf, 0.4 Mb]
    Argo Science Team 3rd Meeting (AST-3) [pp. 24-25] [pdf, 0.3 Mb]
    2001 coast ocean / salmon ecosystem event [pp. 26-27] [pdf, 0.3 Mb]
    Shifts in zooplankton abundance and species composition off central Oregon and southwestern British Columbia [pp. 28-29] [pdf, 0.3 Mb]
    The CLIVAR - Pacific Workshop [p. 30] [pdf, 0.2 Mb]
    PICES dialogue with Mexican scientists [p. 31] [pdf, 0.2 Mb]
    Announcements [p. 32] [pdf, 0.2 Mb]

    Agile management and interoperability testing of SDN/NFV-enriched 5G core networks

    In the fifth generation (5G) era, the radio internet protocol capacity is expected to reach 20 Gb/s per sector, and ultralarge content traffic will travel across a faster wireless/wireline access network and packet core network. Moreover, the massive and mission-critical Internet of Things is the main differentiator of 5G services. These real-time, bandwidth-hungry services require a radio latency of less than 1 ms and an end-to-end latency of no more than a few milliseconds. By distributing 5G core nodes closer to cell sites, backhaul traffic volume and latency can be significantly reduced, because mobile devices download content directly from a nearby content server. In this paper, we propose a novel solution based on software-defined networking (SDN) and network function virtualization (NFV) technologies to achieve agile management of 5G core network functionalities, present a proof-of-concept implementation targeted at the PyeongChang Winter Olympics, and describe the results of interoperability testing between two core networks.
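    As a rough illustration of why moving core functions and content toward the cell site matters, the sketch below compares one-way latency budgets for content served from a distant central core versus a nearby edge core. The distances, hop counts, and per-hop delays are assumptions for illustration; only the roughly 1 ms radio-latency target comes from the abstract.

```python
# Back-of-the-envelope latency comparison: content served from a distant
# central core versus an edge core co-located near the cell site.
# Distances, hop counts, and per-hop delays are illustrative assumptions.

def one_way_latency_ms(distance_km, hops, per_hop_ms=0.05,
                       propagation_ms_per_km=0.005):
    """Fibre propagation delay (~5 us/km) plus fixed per-hop processing."""
    return distance_km * propagation_ms_per_km + hops * per_hop_ms

RADIO_MS = 1.0  # radio-access latency target cited in the abstract

central = RADIO_MS + one_way_latency_ms(distance_km=400, hops=8)
edge = RADIO_MS + one_way_latency_ms(distance_km=20, hops=3)

print(f"content from a central core: {central:.2f} ms one-way")
print(f"content from an edge core:   {edge:.2f} ms one-way")
```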

    Highly efficient Bayesian joint inversion for receiver-based data and its application to lithospheric structure beneath the southern Korean Peninsula

    With the deployment of extensive seismic arrays, systematic and efficient parameter and uncertainty estimation is of increasing importance and can provide reliable, regional models for crustal and upper-mantle structure. We present an efficient Bayesian method for the joint inversion of surface-wave dispersion and receiver-function data that combines trans-dimensional (trans-D) model selection in an optimization phase with subsequent rigorous parameter uncertainty estimation. Parameter and uncertainty estimation depend strongly on the chosen parametrization, such that meaningful regional comparison requires quantitative model selection that can be carried out efficiently at several sites. While significant progress has been made for model selection (e.g. trans-D inference) at individual sites, the lack of efficiency can prohibit application to large data volumes or cause questionable results due to lack of convergence. Studies that address large numbers of data sets have mostly ignored model selection in favour of more efficient/simple estimation techniques (i.e. focusing on uncertainty estimation but employing ad hoc model choices). Our approach consists of a two-phase inversion that combines trans-D optimization to select the most probable parametrization with subsequent Bayesian sampling for uncertainty estimation given that parametrization. The trans-D optimization is implemented here by replacing the likelihood function with the Bayesian information criterion (BIC). The BIC provides constraints on model complexity that facilitate the search for an optimal parametrization. Parallel tempering (PT) is applied as the optimization algorithm. After optimization, the optimal model choice is identified by the minimum BIC value across all PT chains. Uncertainty estimation is then carried out in fixed dimension. Data errors are estimated as part of the inference problem by a combination of empirical and hierarchical estimation. Data covariance matrices are estimated from data residuals (the difference between prediction and observation) and periodically updated. In addition, a scaling factor for the covariance matrix magnitude is estimated as part of the inversion. The inversion is applied to both simulated and observed data that consist of phase- and group-velocity dispersion curves (Rayleigh wave) and receiver functions. The simulation results show that model complexity and important features are well estimated by the fixed-dimensional posterior probability density. Observed data for stations in different tectonic regions of the southern Korean Peninsula are considered. The results are consistent with published results, but important features are better constrained than in previous regularized inversions and are more consistent across the stations. For example, the resolution of crustal and Moho interfaces, and the absolute values and gradients of velocities in the lower crust and upper mantle, are better constrained.
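    The model-selection step described above ranks candidate parametrizations by the Bayesian information criterion rather than by the raw likelihood, and picks the minimum-BIC model. The minimal sketch below illustrates that criterion on a synthetic curve-fitting problem, where polynomial order stands in for the number of layers in a velocity parametrization; the data, noise level, and model family are assumptions for illustration only, not the paper's inversion.

```python
# Toy model-selection loop using the Bayesian information criterion (BIC).
# Polynomial order stands in for the number of layers in a velocity
# parametrization; the synthetic data and noise level are assumptions.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 60)
sigma = 0.05  # known data noise standard deviation
y = 1.0 + 0.5 * x - 1.2 * x**2 + rng.normal(0.0, sigma, x.size)

def bic(y_obs, y_pred, n_params, noise_std):
    """BIC = k ln(n) - 2 ln(L) under a Gaussian likelihood with known noise."""
    n = y_obs.size
    log_like = (-0.5 * np.sum(((y_obs - y_pred) / noise_std) ** 2)
                - n * np.log(noise_std * np.sqrt(2.0 * np.pi)))
    return n_params * np.log(n) - 2.0 * log_like

scores = {}
for order in range(6):  # candidate model complexities
    coeffs = np.polyfit(x, y, order)
    scores[order] = bic(y, np.polyval(coeffs, x), order + 1, sigma)

best = min(scores, key=scores.get)
for order, score in scores.items():
    print(f"order {order}: BIC = {score:8.2f}")
print(f"minimum-BIC model: polynomial order {best}")
```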

    Improving the Service Quality of a Logistics Service Provider Based on the Integration of SERVQUAL and QFD

    This study aims to find the causes of customer dissatisfaction and their implications for logistics service provider (LSP) operations management, with the goal of improving the quality of customer service. The research is a case study on Pusat Logistik Berikat (PLB), a logistics service provider based in Cilincing, Indonesia. Qualitative and quantitative methods were used, with expert validation based on the Delphi method. Data were collected during the COVID-19 pandemic through a questionnaire distributed via Google Form to a limited sample of 50 PLB customers and through semi-structured interviews that deepened the information gathered from site observations. The five-dimension service quality (SERVQUAL) instrument was validated using the 40 fully completed questionnaires as primary data, yielding 33 valid attributes. Gap analysis found 30 attributes with low performance; reliability, assurance, tangibles, responsiveness, and empathy all contribute to customer satisfaction, with the empathy dimension the priority for improvement. Based on importance-performance analysis (IPA), six attributes were identified as the voice of the customer (VOC). The QFD evaluation found that, for improving the quality of customer service, the most important factor is professional performance that adds value to the logistics process, and the least important is a fast response from the customer service team; the highest-ranked technical requirement is cycle time and the lowest is stock out. As a technical implication, the SERVQUAL and QFD approaches can be used to increase customer satisfaction by improving the quality of human resources and procurement management.
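    The gap-analysis and importance-performance step described above reduces to a simple per-attribute calculation: the gap is perception minus expectation, and attributes with above-average importance but below-average performance become improvement priorities (VOC candidates). The sketch below illustrates this with invented attribute names and scores; in the study, the 33 validated attributes from the 40 completed questionnaires feed the same kind of computation.

```python
# SERVQUAL gap scores plus a simple importance-performance screen.
# Attribute names and all scores are invented for illustration.
attributes = [
    # (name, importance, expectation, perception) on a 1-5 Likert scale
    ("on-time delivery",        4.8, 4.7, 3.9),
    ("accurate documentation",  4.5, 4.6, 4.4),
    ("fast complaint handling", 4.6, 4.5, 3.6),
    ("staff empathy",           4.2, 4.4, 3.5),
]

mean_importance = sum(a[1] for a in attributes) / len(attributes)
mean_performance = sum(a[3] for a in attributes) / len(attributes)

for name, importance, expectation, perception in attributes:
    gap = perception - expectation  # negative gap signals dissatisfaction
    # IPA screen: high importance but low performance -> improvement priority
    priority = importance > mean_importance and perception < mean_performance
    flag = "PRIORITY (VOC candidate)" if priority else ""
    print(f"{name:24s} gap = {gap:+.1f}  {flag}")
```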

    Arguments for use of ABC in TQM environment

    In the current context of management analysis methods, quality-centered management is of particular relevance. TQM is a management approach with great potential, one that invites new and deeper analysis and research. The many techniques and methods of TQM call for correspondingly expanded accounting solutions. The application presented here is a model for tracking quality costs under TQM. Tracking quality costs using the ABC method is the solution obtained after a thorough analysis of TQM and of ABC. To show the relevance of the solution, we apply it in an enterprise setting. The material presented is a step toward the successful implementation of TQM using ABC-based analysis of quality costs.
    Keywords: quality costs, quality management
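    The costing model argued for above assigns quality-related activity costs to cost objects through activity drivers. The minimal sketch below illustrates that two-step ABC allocation applied to quality costs; the activities, driver volumes, and cost figures are illustrative assumptions, not data from the article.

```python
# Two-step ABC allocation of quality costs: activity costs -> driver rates,
# then driver rates x driver consumption -> cost per product line.
# Activities, driver volumes, and cost figures are illustrative assumptions.
activities = {
    # activity: (total quality cost, cost driver, total driver volume)
    "incoming inspection": (12_000.0, "inspection hours", 300),
    "rework":              (18_000.0, "rework orders",    150),
    "warranty handling":   ( 9_000.0, "warranty claims",   60),
}

# Driver consumption per product line.
consumption = {
    "product A": {"inspection hours": 180, "rework orders": 100, "warranty claims": 45},
    "product B": {"inspection hours": 120, "rework orders":  50, "warranty claims": 15},
}

# Step 1: cost per unit of driver for each quality-related activity.
rates = {driver: cost / volume for cost, driver, volume in activities.values()}

# Step 2: assign quality costs to products according to driver usage.
for product, usage in consumption.items():
    allocated = sum(rates[driver] * qty for driver, qty in usage.items())
    print(f"{product}: allocated quality cost = {allocated:,.0f}")
```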