51 research outputs found

    ๊ทธ๋ผ๋””์–ธํŠธ ๊ฐœ์„  ๋ฐ ๋ช…์‹œ์  ์ •๊ทœํ™”๋ฅผ ํ†ตํ•œ ์‹ฌ์ธต ๋ชจ๋ธ ์••์ถ•์— ๊ด€ํ•œ ์—ฐ๊ตฌ

    Get PDF
    ํ•™์œ„๋…ผ๋ฌธ(๋ฐ•์‚ฌ) -- ์„œ์šธ๋Œ€ํ•™๊ต๋Œ€ํ•™์› : ์œตํ•ฉ๊ณผํ•™๊ธฐ์ˆ ๋Œ€ํ•™์› ์œตํ•™๊ณผํ•™๋ถ€, 2022.2. ๊น€์žฅํ˜ธDeep Neural Network (DNN)์€ ๋น ๋ฅด๊ฒŒ ๋ฐœ์ „ํ•˜์—ฌ ์ปดํ“จํ„ฐ ๋น„์ „, ์ž์—ฐ์–ด ์ฒ˜๋ฆฌ ๋ฐ ์Œ์„ฑ ์ฒ˜๋ฆฌ๋ฅผ ํฌํ•จํ•œ ๋งŽ์€ ์˜์—ญ์—์„œ ๋†€๋ผ์šด ์„ฑ๋Šฅ์„ ๋ณด์—ฌ ์™”๋‹ค. ์ด๋Ÿฌํ•œ DNN์˜ ๋ฐœ์ „์— ๋”ฐ๋ผ edge IoT ์žฅ์น˜์™€ ์Šค๋งˆํŠธํฐ์— DNN์„ ๊ตฌ๋™ํ•˜๋Š” ์˜จ๋””๋ฐ”์ด์Šค DNN์— ๋Œ€ํ•œ ์ˆ˜์š”๊ฐ€ ์ฆ๊ฐ€ํ•˜๊ณ  ์žˆ๋‹ค. ๊ทธ๋Ÿฌ๋‚˜ DNN์˜ ์„ฑ์žฅ๊ณผ ํ•จ๊ป˜ DNN ๋งค๊ฐœ๋ณ€์ˆ˜์˜ ์ˆ˜๊ฐ€ ๊ธ‰๊ฒฉํžˆ ์ฆ๊ฐ€ํ–ˆ๋‹ค. ์ด๋กœ ์ธํ•ด DNN ๋ชจ๋ธ์„ ๋ฆฌ์†Œ์Šค ์ œ์•ฝ์ด ์žˆ๋Š” ์—์ง€ ์žฅ์น˜์— ๊ตฌ๋™ํ•˜๊ธฐ๊ฐ€ ์–ด๋ ต๋‹ค. ๋˜ ๋‹ค๋ฅธ ๋ฌธ์ œ๋Š” ์—์ง€ ์žฅ์น˜์—์„œ DNN์˜ ์ „๋ ฅ ์†Œ๋น„๋Ÿ‰์ด๋‹ค ์™œ๋ƒํ•˜๋ฉด ์—์ง€ ์žฅ์น˜์˜ ์ „๋ ฅ์šฉ ๋ฐฐํ„ฐ๋ฆฌ๊ฐ€ ์ œํ•œ๋˜์–ด ์žˆ๊ธฐ ๋•Œ๋ฌธ์ด๋‹ค. ์œ„์˜ ๋ฌธ์ œ๋ฅผ ํ•ด๊ฒฐํ•˜๊ธฐ ์œ„ํ•ด์„œ๋Š” ๋ชจ๋ธ ์••์ถ•์ด ๋งค์šฐ ์ค‘์š”ํ•˜๋‹ค. ์ด ๋…ผ๋ฌธ์—์„œ ์šฐ๋ฆฌ๋Š” ์ง€์‹ ์ฆ๋ฅ˜, ์–‘์žํ™” ๋ฐ ๊ฐ€์ง€์น˜๊ธฐ๋ฅผ ํฌํ•จํ•œ ๋ชจ๋ธ ์••์ถ•์˜ ์„ธ ๊ฐ€์ง€ ์ƒˆ๋กœ์šด ๋ฐฉ๋ฒ•์„ ์ œ์•ˆํ•œ๋‹ค. ๋จผ์ €, ์ง€์‹ ์ฆ๋ฅ˜๋ผ๊ณ  ๋ถˆ๋ฆฌ๋Š” ๋ฐฉ๋ฒ•์œผ๋กœ์จ, ๊ต์‚ฌ ๋„คํŠธ์›Œํฌ์˜ ์ถ”๊ฐ€ ์ •๋ณด๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ ํ•™์ƒ ๋ชจ๋ธ์„ ํ•™์Šต์‹œํ‚ค๋Š” ๊ฒƒ์„ ๋ชฉํ‘œ๋กœ ํ•œ๋‹ค. ์ด ํ”„๋ ˆ์ž„์›Œํฌ๋ฅผ ์‚ฌ์šฉํ•˜๋ฉด ์ฃผ์–ด์ง„ ๋งค๊ฐœ๋ณ€์ˆ˜๋ฅผ ์ตœ๋Œ€ํ•œ ํ™œ์šฉํ•  ์ˆ˜ ์žˆ์œผ๋ฉฐ ์ด๋Š” ์žฅ์น˜์˜ ๋ฆฌ์†Œ์Šค๊ฐ€ ์ œํ•œ๋œ ์ƒํ™ฉ์—์„œ ์ค‘์š”ํ•˜๋‹ค. ๊ธฐ์กด ์ง€์‹ ์ฆ๋ฅ˜ ํ”„๋ ˆ์ž„์›Œํฌ์™€ ๋‹ฌ๋ฆฌ ๋„คํŠธ์›Œํฌ ๊ตฌ์กฐ, ๋ฐฐ์น˜ ๋ฌด์ž‘์œ„์„ฑ ๋ฐ ์ดˆ๊ธฐ ์กฐ๊ฑด๊ณผ ๊ฐ™์€ ๊ต์‚ฌ์™€ ํ•™์ƒ ๊ฐ„์˜ ๊ณ ์œ ํ•œ ์ฐจ์ด๊ฐ€ ์ ์ ˆํ•œ ์ง€์‹์„ ์ „๋‹ฌํ•˜๋Š” ๋ฐ ๋ฐฉํ•ด๊ฐ€ ๋  ์ˆ˜ ์žˆ์œผ๋ฏ€๋กœ ํ”ผ์ณ์—์„œ ์š”์†Œ๋ฅผ ์ถ”์ถœํ•˜์—ฌ ์ง€์‹์„ ๊ฐ„์ ‘์ ์œผ๋กœ ์ฆ๋ฅ˜ํ•˜๋Š” ๋ฐ ์ค‘์ ์„ ๋‘”๋‹ค. ๋‘˜์งธ, ์–‘์žํ™”๋ฅผ ์œ„ํ•œ ์ •๊ทœํ™” ๋ฐฉ๋ฒ•์„ ์ œ์•ˆํ•œ๋‹ค. ์–‘์žํ™”๋œ ๋ชจ๋ธ์€ ์ž์›์ด ์ œํ•œ๋œ ์—์ง€ ์žฅ์น˜์— ์ค‘์š”ํ•œ ์ „๋ ฅ ์†Œ๋ชจ์™€ ๋ฉ”๋ชจ๋ฆฌ์— ์ด์ ์ด ์žˆ๋‹ค. ํŒŒ๋ผ๋ฏธํ„ฐ ๋ถ„ํฌ๋ฅผ ์–‘์žํ™” ์นœํ™”์ ์œผ๋กœ ๋งŒ๋“ค๊ธฐ ์œ„ํ•ด ํ›ˆ๋ จ ์‹œ๊ฐ„์— ๋ชจ๋ธ์˜ ๊ธฐ์šธ๊ธฐ๋ฅผ ๋ถˆ๊ท ์ผํ•˜๊ฒŒ ์žฌ์กฐ์ •ํ•œ๋‹ค. 
์šฐ๋ฆฌ๋Š” ๊ทธ๋ผ๋””์–ธํŠธ์˜ ํฌ๊ธฐ๋ฅผ ์žฌ์กฐ์ •ํ•˜๊ธฐ ์œ„ํ•ด position-based scaled gradient (PSG)๋ฅผ ์‚ฌ์šฉํ•œ๋‹ค. Stochastic gradient descent (SGD) ์™€ ๋น„๊ตํ•˜์—ฌ, ์šฐ๋ฆฌ์˜ position-based scaled gradient descent (PSGD)๋Š” ๋ชจ๋ธ์˜ ์–‘์žํ™” ์นœํ™”์ ์ธ ๊ฐ€์ค‘์น˜ ๋ถ„ํฌ๋ฅผ ๋งŒ๋“ค๊ธฐ ๋•Œ๋ฌธ์— ์–‘์žํ™” ํ›„ ์„ฑ๋Šฅ ์ €ํ•˜๋ฅผ ์™„ํ™”ํ•œ๋‹ค. ์…‹์งธ, ์ค‘์š”ํ•˜์ง€ ์•Š์€ ๊ณผ์ž‰ ๋งค๊ฐœ ๋ณ€์ˆ˜ํ™” ๋ชจ๋ธ์„ ์ œ๊ฑฐํ•˜๊ธฐ ์œ„ํ•ด, ๊ฐ€์ง€์น˜๊ธฐ๋œ ๊ฐ€์ค‘์น˜์˜ ๋Œ€๋žต์ ์ธ ๊ธฐ์šธ๊ธฐ์— Straight-Through-Estimator (STE)๋ฅผ ํ™œ์šฉํ•˜์—ฌ ํ›ˆ๋ จ ์ค‘์— ๋‹ค์–‘ํ•œ ํฌ์†Œ์„ฑ ํŒจํ„ด์„ ์ฐพ์œผ๋ ค๊ณ  ํ•˜๋Š” ๋™์  ๊ฐ€์ง€์น˜๊ธฐ ๋ฐฉ๋ฒ•์ด ๋“ฑ์žฅํ–ˆ๋‹ค. STE๋Š” ๋™์  ํฌ์†Œ์„ฑ ํŒจํ„ด์„ ์ฐพ๋Š” ๊ณผ์ •์—์„œ ์ œ๊ฑฐ๋œ ํŒŒ๋ผ๋ฏธํ„ฐ๊ฐ€ ๋˜์‚ด์•„๋‚˜๋„๋ก ๋„์šธ ์ˆ˜ ์žˆ๋‹ค. ๊ทธ๋Ÿฌ๋‚˜ ์ด๋Ÿฌํ•œ ๊ฑฐ์นœ ๊ธฐ์šธ๊ธฐ (coarse gradient)๋ฅผ ์‚ฌ์šฉํ•˜๋ฉด STE ๊ทผ์‚ฌ์˜ ์‹ ๋ขฐํ•  ์ˆ˜ ์—†๋Š” ๊ธฐ์šธ๊ธฐ ๋ฐฉํ–ฅ์œผ๋กœ ์ธํ•ด ํ›ˆ๋ จ์ด ๋ถˆ์•ˆ์ •ํ•ด์ง€๊ณ  ์„ฑ๋Šฅ์ด ์ €ํ•˜๋œ๋‹ค. ์ด ๋ฌธ์ œ๋ฅผ ํ•ด๊ฒฐํ•˜๊ธฐ ์œ„ํ•ด ์šฐ๋ฆฌ๋Š” ์ด์ค‘ ์ „๋‹ฌ ๊ฒฝ๋กœ๋ฅผ ํ˜•์„ฑํ•˜์—ฌ ์ œ๊ฑฐ๋œ ํŒŒ๋ผ๋ฏธํ„ฐ (pruned weights)๋ฅผ ์—…๋ฐ์ดํŠธํ•˜๊ธฐ ์œ„ํ•ด ์ •์ œ๋œ ๊ทธ๋ผ๋””์–ธํŠธ๋ฅผ ์ œ์•ˆํ•œ๋‹ค. ๊ฐ€์ง€์น˜๊ธฐ์— ๊ฑฐ์นœ ๊ธฐ์šธ๊ธฐ๋ฅผ ์‚ฌ์šฉํ•˜์ง€ ์•Š๊ธฐ ์œ„ํ•ด Dynamic Collective Intelligence Learning (DCIL)์„ ์ œ์•ˆํ•œ๋‹ค. ๋งˆ์ง€๋ง‰์œผ๋กœ ์ œ์•ˆ๋œ ๋ฐฉ๋ฒ•๋“ค์„ ์ด์šฉํ•˜์—ฌ ํ†ตํ•ฉ ๋ชจ๋ธ ์••์ถ• ํ›ˆ๋ จ ํ”„๋ ˆ์ž„์›Œํฌ๋กœ์„œ ๊ฒฐํ•ฉํ•œ๋‹ค. ์ด ๋ฐฉ๋ฒ•์€ ๊ทน๋„๋กœ ํฌ์†Œํ•˜๊ณ  ์–‘์žํ™” ์นœํ™”์ ์ธ ๋ชจ๋ธ์„ ํ›ˆ๋ จํ•  ์ˆ˜ ์žˆ๋‹ค.Deep neural network (DNN) has been developed rapidly and has shown remarkable performance in many domains including computer vision, natural language processing and speech processing. The demand for on-device DNN, i.e., deploying DNN on the edge IoT device and smartphone in line with this development of DNN has increased. However, with the growth of DNN, the number of DNN parameters has risen drastically. This makes DNN models hard to be deployed on resource-constraint edge devices. 
Another challenge is the power consumption of DNNs on edge devices, because edge devices have limited battery power. To resolve these issues, model compression is very important. In this dissertation, we propose three novel methods in model compression, covering knowledge distillation, quantization, and pruning. First, we aim to train the student model with additional information from the teacher network, an approach known as knowledge distillation. This framework makes the most of a given parameter budget, which is essential in situations where the device's resources are limited. Unlike previous knowledge distillation frameworks, we focus on distilling the knowledge indirectly, by extracting a factor from the features, because inherent differences between the teacher and the student, such as network structure, batch randomness, and initial conditions, can hinder the transfer of appropriate knowledge. Second, we propose a regularization method for quantization. A quantized model has advantages in power consumption and memory, which are essential on resource-constrained edge devices. We non-uniformly rescale the gradients of the model during training to make the weight distribution quantization-friendly, using the position-based scaled gradient (PSG). Compared with stochastic gradient descent (SGD), our position-based scaled gradient descent (PSGD) mitigates the performance degradation after quantization because it produces a quantization-friendly weight distribution. Third, to prune unimportant weights of overparameterized models, dynamic pruning methods have emerged; they try to find diverse sparsity patterns during training by utilizing the Straight-Through Estimator (STE) to approximate the gradients of pruned weights. The STE can help pruned weights revive in the process of finding dynamic sparsity patterns.
However, using these coarse gradients causes training instability and performance degradation, owing to the unreliable gradient signal of the STE approximation. To tackle this issue, we propose refined gradients that update the pruned weights by forming dual forwarding paths, and we propose Dynamic Collective Intelligence Learning (DCIL) to avoid using coarse gradients for pruning. Lastly, we combine the proposed methods into a unified model compression training framework, which can train a drastically sparse yet quantization-friendly model.
Contents: Abstract; Contents; List of Tables; List of Figures
1 Introduction (1.1 Motivation; 1.2 Tasks; 1.3 Contributions and Outline)
2 Related Work (2.1 Knowledge Distillation; 2.2 Quantization; 2.2.1 Sparse Training; 2.3 Pruning)
3 Factor Transfer (FT) for Knowledge Distillation (3.1 Introduction; 3.2 Proposed Method: Teacher Factor Extraction with Paraphraser, Factor Transfer with Translator; 3.3 Experiments: CIFAR-10, CIFAR-100, Ablation Study, ImageNet, Object Detection, Discussion; 3.4 Conclusion)
4 Position-based Scaled Gradients (PSG) for Quantization (4.1 Introduction; 4.2 Proposed Method: Optimization in Warped Space, Position-based Scaled Gradient, Target Points, PSGD for Deep Networks, Geometry of the Warped Space; 4.3 Experiments: Implementation Details, Pruning, Quantization, Knowledge Distillation, Various Architectures with PSGD, Adam Optimizer with PSG; 4.4 Discussion: Toy Example, Weight Distributions, Quantization-Aware Training vs. PSGD, Post-Training with a PSGD-Trained Model; 4.5 Conclusion)
5 Dynamic Collective Intelligence Learning (DCIL) for Pruning (5.1 Introduction; 5.2 Proposed Method: Backgrounds, Dynamic Collective Intelligence Learning, Convergence Analysis; 5.3 Experiments: Experiment Setting, Experiment Results, Differences between Dense and Pruned Models, Analysis of Stability, Cost of Training, Fast Convergence of DCIL, Tendency of Warm-Up, CIFAR-10, ImageNet, Analysis of Training and Inference Overheads; 5.4 Conclusion)
6 Deep Model Compression via KD, Quantization and Pruning (KQP) (6.1 Method; 6.2 Experiment; 6.3 Conclusion)
7 Conclusion (7.1 Summary; 7.2 Limitations and Future Directions)
Abstract (in Korean); Acknowledgments
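The factor-transfer idea in the abstract above, distilling knowledge indirectly by comparing extracted "factors" rather than raw features, can be illustrated as a loss between normalized factor vectors. This is a minimal NumPy sketch, not the dissertation's paraphraser/translator networks; the L2 normalization and the `p=1` distance are assumptions chosen for the illustration.

```python
import numpy as np

def normalized_factor(f):
    """L2-normalize a flattened feature factor; factor transfer compares
    the direction of factors rather than their raw magnitude."""
    f = np.asarray(f, dtype=float).ravel()
    return f / (np.linalg.norm(f) + 1e-8)

def factor_transfer_loss(teacher_factor, student_factor, p=1):
    """L_p distance between the normalized teacher and student factors."""
    t = normalized_factor(teacher_factor)
    s = normalized_factor(student_factor)
    return float(np.sum(np.abs(t - s) ** p))
```

Identical factors give a loss near zero; dissimilar factors give a large loss, which is the training signal that pulls the student's (translated) features toward the teacher's.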
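The position-based scaling of gradients can be illustrated with a toy rule: scale each weight's gradient by its distance to the nearest point on a uniform quantization grid, so weights already sitting near a grid point barely move and the trained distribution clusters around quantization targets. This is a hedged sketch of the intuition only, assuming a uniform grid with spacing `step`; the dissertation's PSGD derives its scaling from a warped parameter space, which is not reproduced here.

```python
import numpy as np

def nearest_grid_point(w, step=0.25):
    """Nearest point on a uniform quantization grid with the given step."""
    return np.round(w / step) * step

def psg_scale(w, grad, step=0.25, eps=1e-8):
    """Scale each gradient entry by the weight's (normalized) distance to its
    nearest quantization target: zero scaling on the grid, full scaling at the
    midpoint between two grid points."""
    dist = np.abs(w - nearest_grid_point(w, step))
    scale = dist / (step / 2 + eps)  # in [0, 1]
    return grad * scale
```

Under this rule a weight exactly on the grid receives a zero update, while a weight midway between grid points is updated at full strength, which over many steps biases the weight distribution toward the grid.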
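The dual-forwarding-path idea for pruning, in which surviving weights learn from the sparse path while pruned weights receive a refined gradient from a dense path instead of a coarse STE gradient, can be sketched for a single linear layer. Everything here (the magnitude-based mask, the squared-error loss, and the simple per-weight gradient mix) is an illustrative assumption, not DCIL's exact formulation.

```python
import numpy as np

def magnitude_mask(w, sparsity):
    """Binary mask keeping the largest-|w| fraction (1 - sparsity) of weights."""
    k = int(np.ceil((1 - sparsity) * w.size))
    thresh = np.sort(np.abs(w).ravel())[::-1][k - 1]
    return (np.abs(w) >= thresh).astype(w.dtype)

def dual_path_step(w, x, y, sparsity=0.5, lr=0.1):
    """One training step with dual forwarding for a linear model w @ x.
    The sparse (pruned) path and the dense path each produce a residual;
    kept weights are updated from the sparse path, while pruned weights
    get the refined gradient from the dense path instead of an STE gradient."""
    m = magnitude_mask(w, sparsity)
    err_sparse = (w * m) @ x - y      # sparse-path residual
    err_dense = w @ x - y             # dense-path residual
    grad_sparse = err_sparse * x      # gradient of 0.5*err^2, sparse path (STE-style)
    grad_dense = err_dense * x        # gradient of 0.5*err^2, dense path
    grad = m * grad_sparse + (1 - m) * grad_dense
    return w - lr * grad, m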

    Humanistic Corporate Paradigm and Comprehensive Learning Society

    Get PDF
    โ… . ์„œ๋ก  / 95 โ…ก. ๅ‹žๅ‹•์˜ ่ฝ‰ๆ›: ๆ„ๅ‘ณ์™€ ็คบๅ”†้ปž / 99 1. ํฌ๋“œ์ฃผ์˜ ๋…ธ๋™ํŽธ์„ฑ์›๋ฆฌ์˜ ๋ถ•๊ดด / 99 2. ์ง€์‹๊ธฐ๋ฐ˜๊ฒฝ์ œ์˜ ๋Œ€๋‘ / 101 3. ๋น„์ •ํ˜• ๊ณ ์šฉํ˜•ํƒœ์˜ ํ™•๋Œ€ / 103 โ…ข. ่ณ‡ๆœฌไธป็พฉ ็™ผๅฑ•๊ณผ ไผๆฅญํŒจ๋Ÿฌ๋‹ค์ž„์˜ ่ฎŠๅŒ– / 104 โ…ฃ. ไบบๆœฌไธป็พฉ ไผๆฅญํŒจ๋Ÿฌ๋‹ค์ž„๊ณผ ็ธฝ้ซ”็š„ ๅญธ็ฟ’็คพๆœƒ์˜ ๅ…ท็พ / 108 1. ์ธ๋ณธ์ฃผ์˜ ๊ธฐ์—…ํŒจ๋Ÿฌ๋‹ค์ž„ / 108 2. ่„ซํฌ๋“œ์ฃผ์˜ ์ž‘์—…์กฐ์ง ๋ฐ ๋…ธ์‚ฌ๊ด€๊ณ„ / 110 3. ์ธ๋ณธ์ฃผ์˜ ๋…ธ๋™๊ฒฝ์ œ์˜ ๋น„์ „ / 112 4. ์ด์ฒด์  ํ•™์Šต์‚ฌํšŒ์˜ ๊ตฌํ˜„ / 114 โ…ค. ๆŒ็บŒ็š„ ้›‡ๅ‚ญ๊ณผ ๅญธ็ฟ’์˜ ๅ–„ๅพช็’ฐ ๆจกๅž‹: ์œ ํ•œํ‚ด๋ฒŒ๋ฆฌ ๆˆๅŠŸไบ‹ไพ‹ / 115 โ…ฅ. ๊ฒฐ๋ก  / 119 ์ฐธ๊ณ ๋ฌธํ—Œ / 121 Abstract / 122This paper examined the significance and meaning of characteristical transformation of work that has come into the limelight recently and suggested the humanistic corporate paradigm, vision of work system principle and basic policy direction required in the 21st century knowledge-based society. Recently, the change in the technology paradigm of informationalization globalization and the change in social economic conditions are transforming the characteristic of labor, thus requiring the establishment of a new corporate paradigm and work system principle. This paper emphasized that the basic ideological and philosophical condition of the production system that actively conforms to the new social economic condition has to be human oriented, and that the post-Fordism work system principle that backs this is needed not only to overcome labor alienation and to raise equity, but also in order to secure economic efficiency in a long term point of view. This paper also pointed out that such a vision is not only something that is desirable, but is a reality of historical trend, considering the changes expected in the 21st century including the transition in technical condition, corporate environment and labor market conditions. 
It also stressed that, in order to realize the humanistic corporate paradigm and the post-Fordist work-system principle, the whole society needs to outgrow the existing labor-centered society and become a comprehensive learning society in which learning keeps pace with labor in all units of organization. This paper asserted that, to speed up the realization of such a comprehensive learning society, the nation, the market, businesses, and labor unions should all be approached with a new principle and way of thinking. The business-reengineering success case of Yuhan-Kimberly, Inc., which is growing strong through consistent employment security and lifelong learning, is an important case that proves the potential of humanistic learning corporations. It hints that the employment crisis we now face could be overcome through a paradigm shift.

    ์กฐ์ง๊ณตํ•™์  ์ธ์กฐ๊ณ ๋ง‰์˜ ๊ฐœ๋ฐœ

    No full text
    Thesis(masters)--์„œ์šธ๋Œ€ํ•™๊ต ๋Œ€ํ•™์› :๋ฐ”์ด์˜ค์‹œ์Šคํ…œยท์†Œ์žฌํ•™๋ถ€(๋ฐ”์ด์˜ค์‹œ์Šคํ…œ๊ณตํ•™),2009.2.Thesis(masters) -

    Visions and Strategies for a Human Resources-Centered Society

    No full text
    ์‚ฐ์—…ํ™” ์‹œ๋Œ€์— ๊ฐœ์ธ์˜ ๊ต์œก์—ด์— ์˜์กดํ–ˆ๋˜ ์ธ์ ์ž์› ํˆฌ์ž๋Š” ์ง€์‹๊ธฐ๋ฐ˜๊ฒฝ์ œ์—์„œ๋Š” ๊ตญ๊ฐ€์  ์ฐจ์›์—์„œ ์ฒด๊ณ„์ ์ด๊ณ  ์ „๋žต์ ์ธ ํˆฌ์ž๋กœ ์ „ํ™˜๋˜์–ด์•ผ ํ•จ์„ ์ „์ œ๋กœ ํ•˜์—ฌ ๋ณธ์„œ์—์„œ๋Š” ์ธ์ ์ž์› ์ค‘์‹ฌ์˜ ์ƒˆ๋กœ์šด ๊ตญ๊ฐ€๋ฐœ์ „ํŒจ๋Ÿฌ๋‹ค์ž„์„ โ€˜์ธ์ ์ž์›์ž…๊ตญโ€™์œผ๋กœ ๋ช…๋ช…ํ•˜๊ณ , ์ƒˆ๋กœ์šด ๋ฐœ์ „ ํŒจ๋Ÿฌ๋‹ค์ž„์˜ ๋ฐฐ๊ฒฝ๊ณผ ํ•„์š”์„ฑ์— ๋Œ€ํ•œ ์ง„์ง€ํ•œ ์„ฑ์ฐฐ์„ ํ†ตํ•ด ๊ทธ ๋…ผ๋ฆฌ๋ฅผ ๊ตฌ์„ฑํ•˜๋ฉฐ ์ด๋ฅผ ๋ฐ”ํƒ•์œผ๋กœ ์ธ์ ์ž์›์ž…๊ตญ์˜ ๋น„์ „๊ณผ ์‹ค์ฒœ์ „๋žต์„ ์ง„์ง€ํ•˜๊ฒŒ ๋ชจ์ƒ‰ํ•œ ๊ฒฐ๊ณผ๋ฌผ์ด๋‹ค. ๋ณธ ์—ฐ๊ตฌ์—์„œ๋Š” ๊ตญ๊ฐ€๋ฐœ์ „ํŒจ๋Ÿฌ๋‹ค์ž„์˜ ์ „ ์˜์—ญ์„ ๋ง๋ผํ•˜๊ธฐ ๋ณด๋‹ค๋Š” ํ•™๊ต๊ต์œก, ๊ธฐ์—…์กฐ์ง๊ณผ ์ง€์—ญ ์ฐจ์› ์˜ ํ‰์ƒํ•™์Šต, ๊ต์œก๊ณผ ๋…ธ๋™์‹œ์žฅ์˜ ์—ฐ๊ณ„, ๊ทธ๋ฆฌ๊ณ  ๊ตญ๊ฐ€์ธ์ ์ž์›๊ด€๋ฆฌ์˜ ๋„ค ์˜์—ญ์„ ์ธ์ ์ž์›์ž…๊ตญ์˜ ํ•ต์‹ฌ์ ์ธ ์˜์—ญ์œผ๋กœ ์„ค์ •ํ•˜๊ณ , ๊ฐ ์˜์—ญ์„ ์ค‘์‹ฌ์œผ๋กœ ํ˜„ํ™ฉ๊ณผ ๋ฌธ์ œ์ ์„ ์‚ดํŽด๋ณธ ๋‹ค์Œ ์ค‘์š”ํ•œ ์ •์ฑ…์  ์ด์Šˆ๋“ค์„ ๋‹ค๋ฃจ๋Š” ์„ ํƒ๊ณผ ์ง‘์ค‘์˜ ์ „๋žต์ ์ธ ์ ‘๊ทผ๋ฐฉ์‹์„ ์ทจํ•˜์˜€๋‹ค.์š” ์•ฝ ์ œ1์žฅ ์„œ๋ก  ์ œ2์žฅ ์ธ์ ์ž์›์ž…๊ตญ์˜ ๋น„์ „๊ณผ ์ „๋žต ์ œ1์ ˆ ์™œ ์ธ์ ์ž์›์ž…๊ตญ์ธ๊ฐ€? 7 1. ๊ฒฉ๋ž‘์†์˜ ํ•œ๊ตญํ˜ธ: ์„ฑ์žฅ๊ณผ ํ†ตํ•ฉ์˜ ์œ„๊ธฐ 7 2. ์„ฑ์žฅ๊ณผ ํ†ตํ•ฉ์˜ ์œ„๊ธฐ ๊ทน๋ณต ๋ฐฉํ–ฅ: ์ธ์ ์ž์›์— ์žˆ๋‹ค 25 3. ์ด๋Œ€๋กœ๋Š” ์•ˆ ๋œ๋‹ค: ์ˆ˜์ถœ์ž…๊ตญ์—์„œ ์ธ์ ์ž์›์ž…๊ตญ์œผ๋กœ ํŒจ๋Ÿฌ๋‹ค์ž„ ์ „ํ™˜ํ•„์š” 26 4. ์„ธ๊ณ„๋Š” ์ง€๊ธˆ ์ธ์ ์ž์›ํ˜๋ช… ์ค‘ 30 ์ œ2์ ˆ ์ธ์ ์ž์›์ž…๊ตญ์ด๋ž€ ๋ฌด์—‡์ธ๊ฐ€? 34 1. ์ƒˆ๋กœ์šด ๋ฐœ์ „ํŒจ๋Ÿฌ๋‹ค์ž„: ์ธ์ ์ž์›์ž…๊ตญ 34 2. ์ธ์ ์ž์›์ž…๊ตญ์˜ ๋น„์ „: โ€˜ํ•™์Šต, ํ˜์‹ , ํ†ตํ•ฉ์˜ ์—ญ๋™์  ์„ ์ง„๊ตญ๊ฐ€โ€™37 3. ์ธ์ ์ž์›์ž…๊ตญ์˜ ๋ชฉํ‘œ: ์ด์ฒด์  ํ•™์Šต์‚ฌํšŒ ๊ตฌ์ถ• 38 4. ์ธ์ ์ž์›์ž…๊ตญ์˜ ์ •์ฑ…์  ์‹œ์‚ฌ์ : ๊ตญ๊ฐ€๋ฐœ์ „์ •์ฑ…์˜ ํ•ต์‹ฌ์€ ์ธ์ ์ž์›์ •์ฑ… 41 ์ œ3์ ˆ ๋ฌด์—‡์ด ์ธ์ ์ž์›์ž…๊ตญ์„ ๊ฐ€๋กœ๋ง‰๋Š”๊ฐ€? 42 1. ๊ณต๊ธ‰์ž ์ค‘์‹ฌ์˜ ํš์ผํ™”๋œ ํ•™๊ต๊ต์œก 42 2. ์‹ค์† ์—†๋Š” ํ‰์ƒํ•™์Šต 45 3. ๊ต์œก๊ณผ ์ง์—…์„ธ๊ณ„์˜ ์œ ๋ฆฌ 47 4. 
๋น„์ฒด๊ณ„์ , ๋น„ํšจ์œจ์ ์ธ ๊ตญ๊ฐ€์ธ์ ์ž์›์ •์ฑ… 49 ์ œ4์ ˆ ์šฐ๋ฆฌ๋Š” ๋ฌด์—‡์„ ํ•ด์•ผ ํ•˜๋Š”๊ฐ€? 50 1. ์ •์ฑ…๊ธฐ์กฐ์˜ ์ „ํ™˜: ์‚ฌ๋žŒ ์ค‘์‹ฌ์˜ ์ •์ฑ… ํŒจ๋Ÿฌ๋‹ค์ž„์˜ ๊ธฐ์กฐ 51 2. ์ธ์ ์ž์›์ž…๊ตญ์„ ์œ„ํ•œ ์ธ์ ์ž์›์ •์ฑ… ์˜์—ญ 54 3. ์ธ์ ์ž์›์ •์ฑ…์˜ 7๋Œ€ ์ •์ฑ… ๋ฐฉํ–ฅ 56 ์ œ3์žฅ ํ•™๊ต๊ต์œก ์ œ1์ ˆ ๊ฐœ๊ด€ 61 ์ œ2์ ˆ ํ•™๊ต์˜ ํ˜์‹  63 1. ์ฐฝ์˜์ ์ธ ๊ต์œก๊ณผ์ • ๊ฐœ๋ฐœ 69 2. ๊ต์›์˜ ์ „๋ฌธ์„ฑ ์ œ๊ณ ์™€ ์‹ ๋ถ„๋ณด์žฅ 73 3. ๊ณ ๋“ฑ๊ต์œกํ‰๊ฐ€์˜ ์ƒˆ๋กœ์šด ๋ฐฉํ–ฅ 78 4. ์ด๊ณต๊ณ„ ์œ„๊ธฐ์™€ ๋Œ€ํ•™๊ต์œก์˜ ํ˜์‹  ๋ฐฉ์•ˆ 82 5. ๋Œ€ํ•™์—์„œ์˜ ์„ฑ์ธ๊ต์œก ํ™œ์„ฑํ™” 85 6. ๋Œ€ํ•™์˜ ์—ฐ๊ตฌ๊ฐœ๋ฐœ ๊ธฐ๋Šฅํ˜์‹  89 7. ์ง์—…๊ต์œก ์ œ์ž๋ฆฌ ์ฐพ๊ธฐ 93 ์ œ3์ ˆ ๊ต์œก์ œ๋„์™€ ์ •์ฑ…์˜ ๊ฐœํ˜ 99 1. ํ‰์ค€ํ™”์ œ๋„ ๋ฐ ๋Œ€ํ•™์ž…์‹œ์ œ๋„ ๊ฐœ์„  105 2. ์ •๋ถ€์˜ ๊ต์œก๊ด€๋ฆฌ ์ฒด์ œ ํ˜์‹  110 3. ๊ต์œก์†Œ์™ธ๊ณ„์ธต์˜ ๊ต์œก๋ณต์ง€ ๋ฐœ์ „๋ฐฉ์•ˆ 115 4. ์‚ฐ์—… ์š”๊ตฌ๋ฅผ ๋ฐ˜์˜ํ•œ ์ง์—…์ „๋ฌธํ•™์œ„์˜ ์‹ ์„คใ†๊ด€๋ฆฌ์ฒด์ œ ๊ตฌ์ถ• 120 5. ๋Œ€ํ•™๊ตญ์ œํ™”์˜ ํ˜„ํ™ฉ๊ณผ ์ „๋ง, ๋Œ€์ฑ… 125 6. ์ง€์‹๊ธฐ๋ฐ˜์‚ฌํšŒ์—์„œ์˜ ํ•™์ œ ๋ฐœ์ „ ๋ฐฉ์•ˆ 130 ์ œ4์žฅ ํ‰์ƒํ•™์Šต ์ œ1์ ˆ ๊ฐœ๊ด€ 137 ์ œ2์ ˆ ํ‰์ƒํ•™์Šต์˜ ํ˜„ํ™ฉ ๋ฐ ๋ฌธ์ œ์  138 1. ํ‰์ƒํ•™์Šต์— ๋Œ€ํ•œ ์ด๋ถ„๋ฒ•์  ์ ‘๊ทผ 138 2. ํ•™๋ น๊ธฐ ๊ณผ๋‹ค ํˆฌ์ž / ๋…ธ๋™์‹œ์žฅ ์ง„์ž… ์ดํ›„ HRD ๊ณผ์†Œํˆฌ์ž 140 3. ์„ฑ์ธ์˜ ํ‰์ƒํ•™์Šต ์ฐธ์—ฌ์œจ ์ €์กฐ 141 4. ๊ธฐ์—…์˜ ๊ต์œกํ›ˆ๋ จ ๊ฐ์†Œ ์ถ”์„ธ 144 5. ํ‰์ƒํ•™์Šต ๊ธฐํšŒ์˜ ๋ถˆ๊ท ๋“ฑ ํ˜„์ƒ ์‹ฌํ™” 145 ์ œ3์ ˆ ์ผ๊ณผ ํ•™์Šต์˜ ๋ณ‘ํ–‰ 147 1. ์ง์žฅ๋‚ด ํ‰์ƒํ•™์Šต์ฒด์ œ ๊ตฌ์ถ• 149 2. ๋…ธ์‚ฌ์ฐธ์—ฌ์— ๊ธฐ์ดˆํ•œ ํ˜‘๋ ฅ์  ์ธ์ ์ž์›๊ฐœ๋ฐœ 152 3. ํ˜„์žฅํ•™์Šต(OJT) ์ด‰์ง„ 155 4. ์ค‘์†Œ๊ธฐ์—… ์ธ์ ์ž์›๊ฐœ๋ฐœ ํ™œ์„ฑํ™” 159 5. ์ธ์ ์ž์›๊ฐœ๋ฐœ ์ธ์ฆ์ œ 161 6. ๊ฐœ์ธ์ฃผ๋„ ํ•™์Šต์˜ ํ™œ์„ฑํ™” 164 ์ œ4์ ˆ ํ•™์Šต ์ค‘์‹ฌ์˜ ์ง€์—ญ์‚ฌํšŒ : ํ•™์Šต์„ ํ†ตํ•œ ํ˜์‹ , ํ†ตํ•ฉ, ์„ฑ์žฅ 169 1. ์ง€์—ญ HRD ์—ญ๋Ÿ‰์˜ ๊ฐ•ํ™” 173 2. ์ง€์—ญ์˜ ํ•™์Šต ํŒŒํŠธ๋„ˆ์‹ญ ๊ตฌ์ถ• 177 3. ๊ณต๊ณต๊ธฐ๊ด€์˜ ํ•™์Šต ๊ฑฐ์ ํ™” 182 4. 
์ธ์ ์ž์›๊ฐœ๋ฐœ๊ณผ ์ง€์—ญ๋ฐœ์ „์˜ ์—ฐ๊ฒฐ๊ณ ๋ฆฌ๋กœ์„œ์˜ ํ•™์Šต๋™์•„๋ฆฌ 186 5. ํ•™์Šตํ˜• ์‚ฌํšŒ์  ์ผ์ž๋ฆฌ ์ฐฝ์ถœ 190 6. ํ‰์ƒํ•™์Šต ๊ณต๋™์ฒด ๋งŒ๋“ค๊ธฐ 194 ์ œ5์žฅ ํ†ตํ•ฉ์  ์ธ์ ์ž์›๊ฐœ๋ฐœ ์ œ1์ ˆ ๊ฐœ๊ด€ 199 ์ œ2์ ˆ ๊ต์œก-๋…ธ๋™ ์—ฐ๊ณ„ 204 1. ์ž๊ฒฉ์ œ๋„์˜ ์žฌ์ •๋ฆฝ 206 2. ์ง„๋กœ ๋ฐ ๊ฒฝ๋ ฅ๊ฐœ๋ฐœ ์ฒด์ œ์˜ ์žฌํ™•๋ฆฝ 210 3. ํ•™์Šต๊ฒฐ๊ณผ ์ธ์ฆ์ฒด์ œ ์žฌ๊ตฌ์ถ• 214 4. HRD ์ค‘์‹ฌ์˜ ์‚ฐํ•™์—ฐ๊ณ„ ํ™œ์„ฑํ™” 217 ์ œ3์ ˆ ์ •๋ณดใ†์„œ๋น„์Šค์˜ ์ „๋‹ฌ 221 1. ์ธ์ ์ž์› ์ •๋ณด์ธํ”„๋ผ ๊ตฌ์ถ• 222 2. ํ•™์Šต-๊ณ ์šฉ-๋ณต์ง€ ์—ฐ๊ณ„์ฒด๊ณ„ 227 ์ œ4์ ˆ ์ธ์ ์ž์›์˜ ์œ ์ถœ์ž… 232 1. ๊ณ ๊ธ‰๋‘๋‡Œ ํ™œ์šฉ์˜ ๊ตญ์ œํ™” 234 2. ์™ธ๊ตญ์ธ ์ธ๋ ฅ์˜ ๊ด€๋ฆฌ์ฒด๊ณ„ 237 ์ œ5์ ˆ ์ธ์ ์ž์›์ •์ฑ… ์—ญ๋Ÿ‰ 242 1. ๊ณต๋ฌด์› ์ž„์šฉ์ œ๋„์˜ ํ˜์‹  244 2. ์ธ์ ์ž์›์ •์ฑ… ์ถ”์ง„์ฒด์ œ ํ˜์‹  247 ์ œ6์žฅ ๊ฒฐ๋ก : ์ธ์ ์ž์›์ž…๊ตญ ์ œ์•ˆ 2005 ์ œ1์ ˆ ์š”์•ฝ 254 ์ œ2์ ˆ ์ œ์–ธ: ์ธ์ ์ž์›์ž…๊ตญ ์ œ์•ˆ 2005 257 1. ์ธ์ ์ž์›์ž…๊ตญ์˜ ๋ฐฉํ–ฅ 258 2. ๊ต์œก์˜ ๋‚ด์šฉ๊ณผ ๋ฐฉ๋ฒ•์˜ ํ˜์‹  259 3. ํ•ต์‹ฌ์ธ์žฌ ์–‘์„ฑ๊ณผ ๊ต์œก ๊ธ€๋กœ๋ฒŒํ™” 261 4. ๊ต์œก์˜ ์งˆ ๋ณด์žฅ ์ฒด์ œ ๊ตฌ์ถ• 262 5. ํ•™์Šต์ค‘์‹ฌใ†๋Šฅ๋ ฅ์ค‘์‹ฌ์˜ ์‚ฌํšŒ๋กœ ์žฌ๊ตฌ์กฐํ™” 264 6. ์ •๋ณด ๋ฐ ํ•™์Šต์ง€์› ์ธํ”„๋ผ ๊ตฌ์ถ• 266 7. ์žฌ์ •์ธํ”„๋ผ ๊ตฌ์ถ•๊ณผ ๊ฑฐ๋ฒ„๋„Œ์Šค ํšจ์œจํ™” 267 ์ฐธ๊ณ ๋ฌธํ—Œ 26

    [ํŠน๋ณ„๊ธฐ๊ณ ] ๋™๋ฐ˜์„ฑ์žฅ์„ ์œ„ํ•œ ํ‰์ƒ์ง์—…๋Šฅ๋ ฅ๊ฐœ๋ฐœ ์ฒด์ œ ํ˜์‹ 

    No full text
    ์ง์—…๋Šฅ๋ ฅ๊ฐœ๋ฐœ์ •์ฑ…์˜ ์ „๊ฐœ๊ณผ์ • ํ‰์ƒ์ง์—…๋Šฅ๋ ฅ๊ฐœ๋ฐœ ๊ฐœ๋…๊ณผ ํ•„์š”์„ฑ ํ‰์ƒ์ง์—…๋Šฅ๋ ฅ๊ฐœ๋ฐœ์˜ ํ˜„์ฃผ์†Œ ๋น„์ „ ยท ๋ชฉํ‘œ ์ •์ฑ…๊ณผ์ œ ๋ณดํŽธ์  ๊ถŒ๋ฆฌ๋กœ์„œ์˜ ์ง์—…๋Šฅ๋ ฅ๊ฐœ๋ฐœ ๊ฒฝ์Ÿ๋ ฅ ์žˆ๋Š” ์ง€์‹๊ทผ๋กœ์ž ์œก์„ฑ ์‹œ์žฅ์ˆ˜์š”๋ฅผ ๋ฐ˜์˜ํ•˜๋Š” ์ง€์›์ฒด๊ณ„ ๊ตฌ์ถ• ํŒŒํŠธ๋„ˆ์‹ญ๊ณผ ๋ฏผ๊ฐ„์ฐธ์—ฌ์— ์˜ํ•œ ์ถ”์ง„์ฒด์ œ ๋Šฅ๋ ฅ์ค‘์‹ฌ ๋ฌธํ™” ํ™•

    A new paradigm for Human Resources Development : a theory of the comprehensive learning society

    No full text
    I. ์„œ๋ก  โ…ก. ํ•™์Šต์‚ฌํšŒ ๊ฐœ๋…์˜ ์ง„ํ™” 1.โ€˜ํ‰์ƒํ•™์Šตโ€™๊ฐœ๋…์˜ ๋ฐœ์ „ 2.โ€˜ํ•™์Šต์‚ฌํšŒโ€™์˜ ๊ฐœ๋…๊ณผ ์ด๋ก  3.โ€˜ํ•™์Šต์‚ฌํšŒ๋ก โ€™์˜ ์œ ํ˜•๊ณผ ๋ฐœ์ „ 4.โ€˜์ด์ฒด์  ํ•™์Šต์‚ฌํšŒโ€™๊ฐœ๋…๊ณผ ํŠน์ง• โ…ข. ๊ฒฝ์ œ์‚ฌํšŒ ํŒจ๋Ÿฌ๋‹ค์ž„์˜ ๋ณ€ํ™”์™€ ์ด์ฒด์  ํ•™์Šต์‚ฌํšŒ 1. ๊ฒฝ์ œ์‚ฌํšŒ ํŒจ๋Ÿฌ๋‹ค์ž„์˜ ์ „ํ™˜ 2. ํ•œ๊ตญ ๊ฒฝ์ œ์‚ฌํšŒ ํŒจ๋Ÿฌ๋‹ค์ž„์˜ ์ „ํ™˜ 3. ์ด์ฒด์  ํ•™์Šต์‚ฌํšŒ์—์„œ์˜ ๊ต์œก๊ณผ ํ•™์Šต โ…ฃ. ์ธ์ ์ž์›๊ฐœ๋ฐœ์˜ ์ƒˆ๋กœ์šด ํŒจ๋Ÿฌ๋‹ค์ž„ 1. ์ธ์ ์ž์›๊ฐœ๋ฐœ ๊ฐœ๋…์˜ ํ™•์žฅ 2. ์ด์ฒด์  ํ•™์Šต์‚ฌํšŒ์˜ ์ธ์ ์ž์›๊ฐœ๋ฐœ ํŒจ๋Ÿฌ๋‹ค์ž„ 3. ์ด์ฒด์  ํ•™์Šต์‚ฌํšŒ์˜ ์ธ์ ์ž์›๊ฐœ๋ฐœ ์ „๋žต โ…ค. ๋งบ์Œ๋ง ์ฐธ๊ณ ๋ฌธํ—Œ abstractIn today's highly networked society, a new learning environment is created at the individual, organizational and societal levels and this demands us to re-focus on the concept of the learning society. A 'comprehensive learning society' upholds the existing notion of the learning society but also makes a conceptional effort at overcoming its limitations. The new socio-economic paradigm is characterized by de-industrialization, informatization, flexible production system, knowledge-based economy, networked organization, and cooperative labor-management relations. Changes in the socio-economic paradigm demands a new paradigm for human resources development. The new paradigm for human resources development is one that is demand-driven and is characterized by cooperative governance. It should also be epitomized by decentralization, localization, competency-based certification system, workplace learning, continuing education and training and lifelong learning, learner-centered learning method, and fostering of knowledge-labor and innovative capabilities. To realize a comprehensive learning society, multi-dimensional strategies need to be designed not only at the individual level but also at the corporate level as well as various organizational and governmental levels
    • โ€ฆ
    corecore