23 research outputs found

    An Open-Source Deep Learning Primitive Library with a cuDNN-like Interface

    ํ•™์œ„๋…ผ๋ฌธ (์„์‚ฌ)-- ์„œ์šธ๋Œ€ํ•™๊ต ๋Œ€ํ•™์› : ๊ณต๊ณผ๋Œ€ํ•™ ์ปดํ“จํ„ฐ๊ณตํ•™๋ถ€, 2019. 2. ์ด์žฌ์šฑ.Deep neural networks (DNNs) are a key enabler of today's intelligent applications and services. cuDNN is the de-facto standard library of deep learning primitives, which makes it easy to develop sophisticated DNN models. However, cuDNN is a propriatary software from NVIDIA, and thus does not allow the user to customize the library based on her needs. Furthermore, it only targets NVIDIA GPUs and cannot support other hardware devices such as manycore CPUs and FPGAs. In this thesis we propose OpenDNN, an open-source, cuDNN-like DNN primitive library that can flexibly support multiple hardware devices. In particular, we demonstrate the portability and flexibility of OpenDNN by porting it to multiple popular DNN frameworks and hardware devices, including GPUs, CPUs, and FPGAs.์‹ฌ์ธต ์‹ ๊ฒฝ๋ง์€ ์˜ค๋Š˜๋‚ ์˜ ์ง€๋Šฅํ˜• ์–ดํ”Œ๋ฆฌ์ผ€์ด์…˜๊ณผ ์„œ๋น„์Šค์˜ ํ•ต์‹ฌ ์š”์†Œ๋กœ ๊ฐ๊ด‘๋ฐ›๊ณ  ์žˆ๋‹ค. NVIDIA์—์„œ ๊ฐœ๋ฐœํ•œ cuDNN์€ ๋”ฅ ๋Ÿฌ๋‹ ํ”„๋ฆฌ๋ฏธํ‹ฐ๋ธŒ ๋ผ์ด๋ธŒ๋Ÿฌ๋ฆฌ์˜ ํ‘œ์ค€์œผ๋กœ, ์ •๊ตํ•œ ์‹ฌ์ธต ์‹ ๊ฒฝ๋ง ๋ชจ๋ธ์„ ์‰ฝ๊ฒŒ ๊ฐœ๋ฐœํ•˜๋„๋ก ๋•๋Š”๋‹ค. ๊ทธ๋Ÿฌ๋‚˜, cuDNN์€ NVIDIA์˜ ํŠนํ—ˆ ์†Œํ”„ํŠธ์›จ์–ด๋กœ ์œ ์ €๋“ค์ด ์ž์‹ ๋“ค์˜ ์š”๊ตฌ์— ๋งž๊ฒŒ ์ œ์ž‘ํ•˜๋Š” ๊ฒƒ์„ ํ—ˆ์šฉํ•˜์ง€ ์•Š๋Š”๋‹ค. ๊ฒŒ๋‹ค๊ฐ€ NVIDIA GPU๋งŒ์„ ์ง€์›ํ•˜๊ธฐ ๋•Œ๋ฌธ์— ๋ฉ€ํ‹ฐ์ฝ”์–ด CPU๋‚˜ ํƒ€ FPGA๋ฅผ ์ง€์›ํ•˜์ง€ ์•Š๋Š”๋‹ค. ์ด ๋…ผ๋ฌธ์—์„œ๋Š” ๋‹ค์–‘ํ•œ ํ•˜๋“œ์›จ์–ด ์žฅ์น˜๋ฅผ ์œ ์—ฐํ•˜๊ฒŒ ์ง€์›ํ•˜๊ณ , cuDNN๊ณผ ์œ ์‚ฌํ•œ ์ธํ„ฐํŽ˜์ด์Šค๋ฅผ ๊ฐ€์ง„ ๋”ฅ ๋Ÿฌ๋‹ ํ”„๋ฆฌ๋ฏธํ‹ฐ๋ธŒ ๋ผ์ด๋ธŒ๋Ÿฌ๋ฆฌ์ธ OpenDNN์„ ์†Œ๊ฐœํ•œ๋‹ค. ํŠนํžˆ, ๋‹ค์–‘ํ•œ ์‹ฌ์ธต ์‹ ๊ฒฝ๋ง ํ”„๋ ˆ์ž„์›Œํฌ์™€ CPU, GPU, ๊ทธ๋ฆฌ๊ณ  FPGA์™€ ๊ฐ™์€ ํ•˜๋“œ์›จ์–ด ์žฅ์น˜๋“ค์— ์—ฐ๋™ํ•˜์—ฌ OpenDNN์˜ ์ด์‹์„ฑ๊ณผ ์œ ์—ฐ์„ฑ์„ ์ž…์ฆํ•œ๋‹ค.Abstract Contents List of Tables List of Figures Chapter 1 Introduction Chapter 2 Background 2.1 Deep Neural Network 2.2 Heterogeneous Computer Chapter 3 OpenDNN API 3.1 Overview 3.2 Context Manager 3.3 Descriptor Manager 3.4 Computation Functions 3.5 Summary Chapter 4 Backend Devices 4.1 CPU 4.2 GPU 4.3 FPGA Chapter 5 OpenDNN-enabled DNN Frameworks 5.1 Caffe 5.2 TensorFlow 5.3 DarkNet Chapter 6 Evaluation 6.1 Programmable Effort 6.2 Performance Chapter 7 Related Work Chapter 8 Conclusion Bibliography ๊ตญ๋ฌธ์ดˆ๋ก AcknowledgementsMaste

    ๋ณ€๋ณ„ ์›จ์ด๋ธ”๋ › ํŒจํ‚ท ๊ธฐ๋ฐ˜ ํ‘œ๋ฉด ํ’ˆ์งˆ ํŒ๋ณ„ ๋ฐฉ๋ฒ•๋ก 

    No full text
    The successful implementation of image processing technology has made it possible to replace human vision with automated machine vision. Among the many applications of machine vision, product grading is considered the most challenging problem in the image analysis field. Recently, image data can be acquired from a product surface in real time by image sensor systems in chemical plants. For quality determination based on these image data sets, an effective texture classification methodology is essential to handle high-dimensional images and to extract quality-related information from product surface images. In this thesis, a surface quality determination methodology based on texture analysis is proposed. The proposed methodology has four main stages: image acquisition, feature extraction, feature selection, and determination. Feature extraction is a crucial step in pattern recognition problems as well as in methods for characterizing the quality of a product surface. Different types of wavelet transforms, namely the wavelet packet transform and the discrete wavelet transform, are compared in the feature extraction step for classifying the surface quality of rolled steel sheets. A feature selection approach is also proposed for selecting discriminative bases for surface quality determination, which had not previously been performed in the wavelet packet domain. Wavelet texture analysis is useful for reducing dimensionality and extracting textural information from images. Although wavelet texture analysis extracts only textural characteristics, the extracted features still contain information that is unnecessary for classification. The texture analysis method can therefore be improved by retaining only class-dependent features and removing features that hinder classification. In previous work, best bases and local discriminant bases were the most popular techniques for selecting important bases from the wavelet packet bases. Because these methods were designed for wavelet coefficients used directly as features, they perform poorly in wavelet texture analysis; nevertheless, feature selection based on wavelet texture analysis had not been studied for texture classification. The proposed methodology is validated through quality determination for industrial steel surfaces. Using this real-world industrial example, it is shown experimentally which transform method (wavelet packet transform versus discrete wavelet transform) is appropriate for characterizing industrial product surfaces. In the feature selection stage, it is found that the orthonormality constraint, long used in wavelet basis selection, is unnecessary for surface classification when wavelet coefficients are summarized by wavelet texture analysis. Based on this finding, a feature selection strategy for selecting important bases is proposed for product surface quality determination. The experimental results show that the proposed method has lower classification errors than previous methods. This work supports the development of automated determination systems by providing a surface quality determination methodology, and it can serve as a basic methodology for other multi-class surface image classification problems.
    Contents:
    Chapter 1 Introduction: 1.1 Research motivation; 1.2 Research objectives; 1.3 Outline of the thesis
    Chapter 2 Image Texture Analysis Techniques: 2.1 Introduction; 2.2 Direct PLS-DA approach; 2.3 GLCM (gray-level co-occurrence matrix) features; 2.4 Multivariate image analysis; 2.5 Wavelet texture analysis; 2.6 Effective image analysis techniques for extracting textural features
    Chapter 3 Application of Texture Analysis Methodology for Surface Quality Determination: 3.1 Description of steel sheets data; 3.2 Preparation of images; 3.3 Surface determination framework based on texture classification method (3.3.1 Image acquisition; 3.3.2 Feature extraction; 3.3.3 Feature selection; 3.3.4 Quality determination)
    Chapter 4 The Use of Wavelet Packet Transform in Characterization of Surface Quality: 4.1 Introduction; 4.2 Methodology for surface quality characterization (4.2.1 Discrete wavelet transform; 4.2.2 Wavelet packet transform); 4.3 Implementation to steel quality classification (4.3.1 Determination of decomposition level; 4.3.2 Surface quality classification); 4.4 Results and analysis (4.4.1 KNN classification performances of DWT and WPT; 4.4.2 Linear classification performances of DWT and WPT; 4.4.3 Bandwidth characteristics); 4.5 Conclusions
    Chapter 5 Bases Selection for Wavelet Texture Analysis Based on Wavelet Packets: 5.1 Introduction; 5.2 Bases selection for WPT; 5.3 BB approach (5.3.1 Selection criterion for discriminant measure; 5.3.2 Non-orthonormal bases selection; 5.3.3 Difference between non-orthonormal bases and BB); 5.4 Classification performance of different bases selection (5.4.1 KNN classification results; 5.4.2 Linear classification results of different bases); 5.5 Classification performance of different features; 5.6 Conclusions
    Chapter 6 Feature Selection Approach for Discriminating Packets: 6.1 Introduction; 6.2 Discriminative packet selection for wavelet analysis (6.2.1 Feature selection for WTA; 6.2.2 Feature selection problem for discrimination); 6.3 Feature selection algorithms for discrimination (6.3.1 Individual selection; 6.3.2 Sequential forward selection; 6.3.3 Sequential backward selection; 6.3.4 Sequential forward floating selection); 6.4 Classification performance of different feature selection (6.4.1 KNN classification performance; 6.4.2 Linear classification performance); 6.5 Comparison of feature selection with all methods; 6.6 Conclusions
    Chapter 7 Concluding Remarks: 7.1 Conclusions; 7.2 Future works
    Nomenclature; Literature cited; Abstract in Korean
    List of Figures: Figure 2-1 Schematic for PLS-DA approach for image analysis; Figure 2-2 GLCM features for analysis of (a) intensity image array with 3 gray levels, (b) GLCM of intensity array; Figure 2-3 Multivariate image by spatial shifting and stacking; Figure 3-1 Sample images of steel surfaces, with quality determined as (a) excellent, (b) good, (c) medium, and (d) bad (all original images were divided as shown in (a)); Figure 3-2 Framework for surface quality determination; Figure 4-1 A separable structure for 2-D DWT at the j-th decomposition stage; Figure 4-2 Filter-bank structures of (a) pyramid-structured wavelet decomposition and (b) tree-structured wavelet decomposition, where A is the approximation and D is the detail section; Figure 4-3 KNN classification comparison of DWT and WPT; Figure 4-4 Comparison of linear classification results of DWT and WPT; Figure 4-5 Time (or space)-frequency tiling of (a) an octave-band tree-structured filter bank (wavelet transform) and (b) a full-tree-structured filter bank (wavelet packet transform); Figure 5-1 Example wavelet packet decomposition tree structures of (a) orthonormal bases and (b) non-orthonormal bases; Figure 5-2 Comparison of KNN classification of different bases selections; Figure 5-3 Comparison of linear classification for different bases selections; Figure 5-4 KNN classification errors of different features; Figure 5-5 Linear classification errors of different features; Figure 6-1 Comparison of feature selection approaches in KNN classification; Figure 6-2 Comparison of different feature selections in linear classification; Figure 6-3 Comparison of KNN classification errors for all methods; Figure 6-4 Linear classification errors for all methods
    List of Tables: Table 4-1 Leave-one-out KNN classification errors for DWT; Table 4-2 Leave-one-out KNN classification errors for WPT; Table 4-3 Linear classification results for DWT and WPT; Table 5-1 KNN classification for Best Bases (BB), Local Discriminant Bases (LDB), and non-orthonormal bases; Table 5-2 Linear classification results for different bases; Table 6-1 KNN classification results for feature selection approach; Table 6-2 Linear classification for feature selection approach
    Doctoral thesis.
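
    As a rough illustration of the pipeline the abstract describes (wavelet-packet energy features, forward feature selection, and KNN classification), the following sketch runs on synthetic texture images using PyWavelets and scikit-learn. The wavelet, decomposition level, toy textures, and selector settings are illustrative assumptions, not the settings or data used in the thesis.

```python
# Sketch of the abstract's pipeline on synthetic data: wavelet-packet energy features,
# sequential forward selection, and KNN classification. Wavelet, level, and the toy
# textures are illustrative assumptions, not the thesis settings.
import numpy as np
import pywt
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier


def wpt_energy_features(img, wavelet="db2", level=2):
    """Energy of every wavelet-packet node at `level` (wavelet texture analysis)."""
    wp = pywt.WaveletPacket2D(data=img, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level)                      # all packets, natural order
    return np.array([np.mean(node.data ** 2) for node in nodes])


def make_texture(freq, rng, size=32):
    """Toy 'surface' image: a sinusoidal grating plus noise, standing in for steel images."""
    y, x = np.mgrid[0:size, 0:size]
    return np.sin(freq * x) * np.sin(freq * y) + 0.3 * rng.standard_normal((size, size))


rng = np.random.default_rng(0)
X = np.array([wpt_energy_features(make_texture(freq, rng))
              for freq in (0.3, 0.9) for _ in range(30)])
y = np.repeat([0, 1], 30)                            # two surface-quality classes

knn = KNeighborsClassifier(n_neighbors=3)
selector = SequentialFeatureSelector(knn, n_features_to_select=4, direction="forward")
X_sel = selector.fit_transform(X, y)                 # keep only discriminative packets
print("selected packets:", selector.get_support(indices=True))
print("CV accuracy:", cross_val_score(knn, X_sel, y, cv=5).mean())
```

    The forward selection step plays the role of the thesis's discriminative packet selection: because the energies summarize whole packets, no orthonormality constraint is imposed on which packets may be kept together.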

    ์›จ์ด๋ธ”๋ › ํŒจํ‚ท ๋ณ€ํ™˜์„ ์ด์šฉํ•œ ์ฒ ํŒ ํ’ˆ์งˆ ๋ชจ๋‹ˆํ„ฐ๋ง

    No full text
    ํ•™์œ„๋…ผ๋ฌธ(์„์‚ฌ) --์„œ์šธ๋Œ€ํ•™๊ต ๋Œ€ํ•™์› :ํ™”ํ•™์ƒ๋ฌผ๊ณตํ•™๋ถ€,2007.Maste

    ๊ด‘์ „์ž ์†Œ์ž ์ œ์ž‘์„ ์œ„ํ•œ InGaAsP์˜ MOCVD์— ์˜ํ•œ ์„ฑ์žฅ ๋ฐ ๋ถ„์„

    No full text
    ํ•™์œ„๋…ผ๋ฌธ(์„์‚ฌ)--์„œ์šธ๋Œ€ํ•™๊ต ๋Œ€ํ•™์› :๋ฌด๊ธฐ์žฌ๋ฃŒ๊ณตํ•™๊ณผ,1997.Maste

    The Study for Residual Strength of Corroded Gas Pipe

    No full text
    Master's thesis.

    Routing Algorithm and Routability Analysis in Very-large-scale Integration

    No full text
    Master๋ณธ ํ•™์œ„๋…ผ๋ฌธ์€ ์ดˆ์ง‘์ ํšŒ๋กœ ๊ตฌํ˜„์— ํ•„์š”ํ•œ ๋ผ์šฐํŒ… ์•Œ๊ณ ๋ฆฌ์ฆ˜๊ณผ ๋ผ์šฐํŒ… ๊ฐ€๋Šฅ์„ฑ๋ถ„์„ ๋ชจ๋ธ์— ๋Œ€ํ•˜์—ฌ ๋‹ค๋ฃฌ๋‹ค. Heterogeneous ์‹œ์Šคํ…œ ๊ตฌํ˜„์„ ์œ„ํ•œ 2.5D ์ธํ„ฐํฌ์ € ํŒจํ‚ค์ง• ๋‹จ๊ณ„์—์„œ์˜ ๋ฒ„์Šค๋ฅผ ํšจ์œจ์ ์œผ๋กœ ๋ผ์šฐํŒ… ํ•˜๋Š” ์•Œ๊ณ ๋ฆฌ์ฆ˜๊ณผ ๋ฐ์ดํ„ฐ ๊ตฌ์กฐ๋ฅผ ์ œ์‹œํ•˜๊ณ , ๋จธ์‹ ๋Ÿฌ๋‹ ๊ธฐ๋ฐ˜ ์ดˆ๊ธฐ ๋‹จ๊ณ„ ๊ฒฐ๊ณผ ์˜ˆ์ธก ๋ชจ๋ธ์„ ๊ตฌ์ถ•ํ•˜๊ธฐ์œ„ํ•œ ์ธ๊ณต ๋„ท๋ฆฌ์ŠคํŠธ ์ƒ์„ฑ๊ธฐ๋ฅผ ์ œ์•ˆํ•˜์—ฌ ๋ผ์šฐํŒ… ๊ฐ€๋Šฅ์„ฑ ๋ถ„์„ ๋ชจ๋ธ์„ ๊ตฌํ˜„ํ•˜์˜€๋‹ค. ๋ฒ„์Šค ๋ผ์šฐํŒ…์—์„œ, ๋ฒ„์Šค ๊ตฌ์กฐ์˜ ์‹ ํ˜ธ ๋น„ํŠธ๊ฐ€ ๊ณตํ†ต ๋ผ์šฐํŒ… ํ† ํด๋กœ์ง€๋ฅผ ๊ณต์œ ํ•˜๋Š” ๊ฒฝ์šฐ, ๋’คํ‹€๋ฆฐ ํŒจํ„ด๊ณผ ๋ณ€๋™ ๋‚ด์„ฑ์„ ํ”ผํ•˜์—ฌ ๋ผ์šฐํŒ…์„ฑ์ด ์ฆ๊ฐ€ํ•œ๋‹ค. ๊ณ ๊ธ‰ ๊ธฐ์ˆ ์—์„œ ๋‹ค์ค‘ ์นฉ ๋ชจ๋“ˆ, I/O ํ•€ ๋˜๋Š” ์˜จ ์นฉ ๋ฉ”๋ชจ๋ฆฌ์˜ ๋ฒ„์Šค ๊ตฌ์กฐ๊ฐ€ ์ ์  ๋ณต์žกํ•ด์ง์— ๋”ฐ๋ผ ๋ฒ„์Šค ๋ผ์šฐํŒ… ๋ฌธ์ œ๊ฐ€ ์ƒ๋‹นํžˆ ์ค‘์š”ํ•ด์กŒ๋‹ค. ๋ณธ ๋…ผ๋ฌธ ์ฑ•ํ„ฐ 2 ์—์„œ๋Š” ๋ฒ„์Šค ๋ฐ€๋„๊ฐ€ ๋†’๊ณ  ์„ ๋กœ ํ™œ์šฉ๋„๊ฐ€ ๋†’์€ ์„ค๊ณ„์—์„œ๋„ ๋ฒ„์Šค์˜ ๋ผ์šฐํŒ… ํ† ํด๋กœ์ง€๋ฅผ ์ฝคํŒฉํŠธํ•˜๊ฒŒ ํ•ฉ์„ฑํ•˜๊ณ  ์„ค๊ณ„ ๊ทœ์น™ ์œ„๋ฐ˜์„ ์ตœ์†Œํ™”ํ•  ์ˆ˜ ์žˆ๋Š” ์ฝคํŒฉํŠธ ์œ„์ƒ ์ธ์‹ ๋ฒ„์Šค ๋ผ์šฐํŒ… ๋ฐฉ๋ฒ•์„ ์ œ์‹œํ•˜๊ณ  ํ‰๊ฐ€ํ•œ๋‹ค. ์ œ์•ˆ๋œ ๋ฐฉ๋ฒ•์€ ICCAD-2018 ๋Œ€ํšŒ์˜ ๋Ÿฐํƒ€์ž„ ์ œํ•œ์—์„œ ๋ฒ„์Šค ๋ผ์šฐํŒ…์„ ์™„๋ฃŒํ•˜๊ณ , ๊ทธ ๋Œ€ํšŒ์˜ ์šฐ์Šน์ž์™€ ๋น„๊ตํ•˜์—ฌ ์ด ๋น„์šฉ์„ 66% ์ ˆ๊ฐํ–ˆ๋‹ค. ์ฒจ๋‹จ ๊ธฐ์ˆ  ๋…ธ๋“œ์—์„œ ์ดˆ๊ธฐ ๋‹จ๊ณ„์˜ ๊ฒฐ๊ณผ(QoR) ์˜ˆ์ธก์€ ๋ฌผ๋ฆฌ์  ์„ค๊ณ„์˜ ๋Ÿฐํƒ€์ž„์„ ์ค„์ด๋Š” ์ฃผ์š” ์š”์ธ ์ค‘ ํ•˜๋‚˜๊ฐ€ ๋˜์—ˆ๋‹ค. ๊ทธ๋Ÿฌ๋‚˜ ๋งŽ์€ ๋‹ค๋ฅธ ์„ค๊ณ„ ์˜ต์…˜์—์„œ๋Š” ์ •ํ™•ํ•œ ์˜ˆ์ธก์ด ์–ด๋ ต๋‹ค. ์ด๋Ÿฌํ•œ ๊ณผ์ œ๋ฅผ ๊ทน๋ณตํ•˜๊ธฐ ์œ„ํ•ด ๊ธฐ๊ณ„ํ•™์Šต ์ ‘๊ทผ๋ฐฉ์‹์€ ๋งค์šฐ ์ •ํ™•ํ•œ ์กฐ๊ธฐ ์˜ˆ์ธก ๋ชจ๋ธ๋กœ ๋“ฑ์žฅํ•˜๊ณ  ์žˆ๋‹ค. ๋ณธ ๋…ผ๋ฌธ ์ฑ•ํ„ฐ 2์—์„œ๋Š” ์ดˆ๊ธฐ ๋‹จ๊ณ„์˜ QoR ์˜ˆ์ธก์„ ์œ„ํ•œ ๊ธฐ๊ณ„ ํ•™์Šต ํ”„๋ ˆ์ž„์›Œํฌ๋ฅผ ์ œ์‹œํ•˜๋ฉฐ, ์ด๋ฅผ ํ†ตํ•ด ๋ฌผ๋ฆฌ์  ์„ค๊ณ„์— โ€™์ตœ์ƒ์˜โ€™ ๊ตฌํ˜„ ์˜ต์…˜์„ ๊ถŒ์žฅํ•  ์ˆ˜ ์žˆ๋‹ค. ๋Œ€์ƒ ํšŒ๋กœ์˜ ๋‹ค์–‘ํ•œ ๋ฌผ๋ฆฌ์  ํŠน์„ฑ์„ ๊ต์œกํ•˜๊ธฐ ์œ„ํ•ด ํ˜„์‹ค์ ์ธ ์ธ๊ณต ๋„ท๋ฆฌ์ŠคํŠธ๋ฅผ ์ƒ์„ฑํ•˜๊ณ  ์ถฉ๋ถ„ํ•œ ๊ธฐ๊ณ„ ํ•™์Šต ๋ฐ์ดํ„ฐ ์„ธํŠธ๋ฅผ ๊ตฌ์ถ•ํ•˜๋Š” ๋ฐฉ๋ฒ•์„ ์ œ์•ˆํ•œ๋‹ค. ์šฐ๋ฆฌ์˜ ์ธ๊ณต ๋ฐ์ดํ„ฐ ์„ธํŠธ์— ์˜ํ•ด ํ›ˆ๋ จ๋œ ๊นŠ์€ ์‹ ๊ฒฝ ๋„คํŠธ์›Œํฌ๋Š” ์‹ค์ œ ์—ด๋ฆฐ ๋ฒค์น˜๋งˆํฌ๋กœ ๊ฒ€์ฆ๋˜์—ˆ๋‹ค. Top1๊ณผ Top2์˜ ํ›ˆ๋ จ ์ •ํ™•๋„๋Š” ๊ฐ๊ฐ 65.1%์™€ 85.9%์ด๋‹ค. ์šฐ๋ฆฌ์˜ ์ตœ์ข… ํ›ˆ๋ จ ๋ชจ๋ธ์€ 55.6%์˜ ๋งค์นญ๊ณผ 100%์˜ ํ•„ํ„ฐ๋ง ํ™•๋ฅ ์„ ๊ฐ€์ง„ ์ƒ์œ„1 PDN์„ ํšจ๊ณผ์ ์œผ๋กœ ์˜ˆ์ธกํ•œ

    ์ž๊ถ๊ฒฝ๋ถ€ ์„ ์•”์—์„œ ์ธ์œ ๋‘์ข… ๋ฐ”์ด๋Ÿฌ์Šค์˜ ๊ฐ์—ผ ์–‘์ƒ๊ณผ p53 ๋‹จ๋ฐฑ์˜ ๋ฐœํ˜„์— ๊ด€ํ•œ ์—ฐ๊ตฌ

    No full text
    ํ•™์œ„๋…ผ๋ฌธ(์„์‚ฌ)--์„œ์šธ๋Œ€ํ•™๊ต ๋Œ€ํ•™์› :์˜ํ•™๊ณผ ์‚ฐ๋ถ€์ธ๊ณผํ•™์ „๊ณต,1998.Maste

    ์‹ ์ƒ์•„ ์žฅ ์‹ ๊ฒฝ์ ˆ์„ธํฌ์—์„œ cathepsin D ๋ฐœํ˜„

    No full text
    ํ•™์œ„๋…ผ๋ฌธ(์„์‚ฌ)--์„œ์šธ๋Œ€ํ•™๊ต ๋Œ€ํ•™์› :์˜ํ•™๊ณผ ์™ธ๊ณผํ•™์ „๊ณต,1999.Maste

    ๋ฐ•์šฉ ์›์ž๋กœ์˜ ์•ˆ์ „์„ฑ์— ๊ด€ํ•œ ์—ฐ๊ตฌ

    No full text

    A fast and scalable qubit-mapping method for noisy intermediate-scale quantum computers

    No full text
    This paper presents an efficient qubit-mapping method that redesigns a quantum circuit to overcome the limitations of qubit connectivity. We propose a recursive graph-isomorphism search to generate the scalable initial mapping. In the main mapping, we use an adaptive look-ahead window search to resolve the connectivity constraint within a short runtime. Compared with the state-of-the-art method [15], our proposed method reduced the number of additional gates by 23% on average and the runtime by 68% for the three largest benchmark circuits. Furthermore, our method improved circuit stability by reducing the circuit depth and thus can be a step forward towards fault tolerance.
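
    A minimal sketch of the initial-mapping idea, assuming a hypothetical 5-qubit linear device and a toy circuit: embed the circuit's two-qubit interaction graph into the device coupling graph via subgraph matching, and relax the interaction graph when no embedding exists. The relaxation rule used here (drop the least-used interaction edge) is an illustrative stand-in for the paper's recursive graph-isomorphism search, not its exact algorithm.

```python
# Sketch of an isomorphism-based initial qubit mapping: embed the circuit's two-qubit
# interaction graph into the device coupling graph; if no embedding exists, drop the
# least-used interaction edge and retry. Device, circuit, and the relaxation rule are
# illustrative assumptions.
import networkx as nx
from networkx.algorithms.isomorphism import GraphMatcher

# Hypothetical 5-qubit device with linear connectivity: 0-1-2-3-4.
coupling = nx.path_graph(5)

# Two-qubit gates of a toy circuit on logical qubits q0..q3.
gates = [("q0", "q1"), ("q0", "q1"), ("q1", "q2"), ("q1", "q2"),
         ("q2", "q3"), ("q2", "q3"), ("q0", "q2")]
interaction = nx.Graph()
for a, b in gates:
    w = interaction.get_edge_data(a, b, default={"weight": 0})["weight"]
    interaction.add_edge(a, b, weight=w + 1)


def initial_mapping(coupling, interaction):
    """Return {logical: physical}; relax the interaction graph until an embedding exists."""
    g = interaction.copy()
    while g.number_of_edges() > 0:
        matcher = GraphMatcher(coupling, g)
        for mapping in matcher.subgraph_monomorphisms_iter():  # keys: physical, values: logical
            return {logical: physical for physical, logical in mapping.items()}
        # No embedding: remove the least-frequently used interaction edge and retry.
        a, b = min(g.edges(data="weight"), key=lambda e: e[2])[:2]
        g.remove_edge(a, b)
    return {}


# The (q0, q2) edge makes a triangle that a linear device cannot host, so it is relaxed
# first and its gate would later be served by SWAP insertion in the main mapping phase.
print(initial_mapping(coupling, interaction))
```

    The paper's main mapping phase (the adaptive look-ahead window that inserts SWAPs for the remaining constraint violations) is not sketched here.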