
    Real-Time Assembly Operation Recognition with Fog Computing and Transfer Learning for Human-Centered Intelligent Manufacturing

    In a human-centered intelligent manufacturing system, every element is designed to assist the operator in achieving optimal operational performance. The primary task in developing such a human-centered system is to accurately understand human behavior. In this paper, we propose a fog computing framework for assembly operation recognition, which brings computing power close to the data source in order to achieve real-time recognition. For data collection, the operator's activity is captured by visual cameras from different perspectives. For operation recognition, instead of building and training a deep learning model from scratch, which requires a huge amount of data, transfer learning is applied to transfer previously learned abilities to our application. A worker assembly operation dataset is established, which at present contains 10 sequential operations in an assembly task of installing a desktop CNC machine. The developed transfer learning model is evaluated on this dataset and achieves a recognition accuracy of 95% in the testing experiments.
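The transfer-learning pattern the abstract relies on (reuse a pretrained feature extractor, train only a small new classifier on the target task) can be sketched as follows. This is a minimal illustration under stated assumptions: the frozen "backbone" is stood in by a fixed random projection and the labelled samples are synthetic, whereas the paper fine-tunes a deep vision model on camera frames of the 10 assembly operations.

```python
import numpy as np

# Transfer-learning sketch: a frozen feature extractor plus a small
# trainable classifier head. The backbone and data are illustrative
# stand-ins, not the paper's actual model.

rng = np.random.default_rng(0)
n_classes = 10                            # the 10 sequential assembly operations

backbone_W = rng.normal(size=(64, 32))    # frozen "pretrained" weights

def extract_features(x):
    """Frozen backbone: never updated while adapting to the new task."""
    return np.tanh(x @ backbone_W)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Synthetic labelled samples standing in for camera frames
X = rng.normal(size=(500, 64))
y = rng.integers(0, n_classes, size=500)
feats = extract_features(X)
onehot = np.eye(n_classes)[y]

# Trainable head: multinomial logistic regression, plain gradient descent;
# only head_W is updated, mirroring how fine-tuning avoids retraining
# the full network from scratch.
head_W = np.zeros((32, n_classes))
for _ in range(300):
    p = softmax(feats @ head_W)
    head_W -= 0.05 * feats.T @ (p - onehot) / len(X)

p = softmax(feats @ head_W)
loss = -np.mean(np.log(p[np.arange(len(X)), y] + 1e-12))
print(f"cross-entropy after head training: {loss:.3f} (chance level {np.log(10):.3f})")
```

Training only the head needs far less data than training the whole network, which is the motivation the abstract gives for using transfer learning.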

    Action Recognition in Manufacturing Assembly using Multimodal Sensor Fusion

    Production innovations are occurring faster than ever. Manufacturing workers thus need to frequently learn new methods and skills. In fast-changing, largely uncertain production systems, manufacturers able to comprehend workers' behavior and assess their operation performance in near real time will outperform their peers. Action recognition can serve this purpose. Although human action recognition has been an active field of study in machine learning, limited work has been done on recognizing worker actions in manufacturing tasks that involve complex, intricate operations. Using data captured by one sensor, or a single type of sensor, to recognize those actions lacks reliability. This limitation can be overcome by sensor fusion at the data, feature, and decision levels. This paper presents a study that developed a multimodal sensor system and used sensor fusion methods to enhance the reliability of action recognition. One step in assembling a Bukito 3D printer, which is composed of a sequence of 7 actions, was used to illustrate and assess the proposed method. Two wearable Myo armband sensors captured both inertial measurement unit (IMU) and electromyography (EMG) signals of assembly workers. Microsoft Kinect, a vision-based sensor, simultaneously tracked their predefined skeleton joints. The collected IMU, EMG, and skeleton data were respectively used to train five individual convolutional neural network (CNN) models. Then, various fusion methods were implemented to integrate the prediction results of the independent models to yield the final prediction. Reasons for achieving better performance using sensor fusion were identified from this study.
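Decision-level fusion of the kind described, integrating the prediction results of independently trained models, can be sketched with two common schemes: probability averaging and majority voting. The class-probability values below are made-up illustrative numbers; the 7 classes mirror the 7 assembly actions mentioned in the abstract.

```python
import numpy as np

# Decision-level fusion sketch: combine class-probability outputs of
# modality-specific models (e.g. IMU, EMG, and skeleton CNNs).

def average_fusion(prob_list):
    """Fuse by averaging per-class probabilities across models."""
    return np.mean(prob_list, axis=0)

def majority_vote(prob_list):
    """Fuse by letting each model cast one vote for its top class."""
    votes = [int(np.argmax(p)) for p in prob_list]
    return max(set(votes), key=votes.count)

# Hypothetical softmax outputs over 7 action classes from three models
imu  = np.array([0.10, 0.60, 0.05, 0.05, 0.05, 0.10, 0.05])
emg  = np.array([0.05, 0.55, 0.10, 0.10, 0.05, 0.10, 0.05])
skel = np.array([0.40, 0.35, 0.05, 0.05, 0.05, 0.05, 0.05])

fused = average_fusion([imu, emg, skel])
print(int(np.argmax(fused)))              # class chosen after averaging: 1
print(majority_vote([imu, emg, skel]))    # class chosen by voting: 1
```

Averaging retains each model's confidence, while voting only counts top-1 decisions; the paper compares several such schemes (at data, feature, and decision levels) rather than committing to one.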

    A CRY-BIC negative-feedback circuitry regulating blue light sensitivity of Arabidopsis.

    Cryptochromes are blue light receptors that regulate various light responses in plants. Arabidopsis cryptochrome 1 (CRY1) and cryptochrome 2 (CRY2) mediate blue light inhibition of hypocotyl elongation and long-day (LD) promotion of floral initiation. It has been reported recently that two negative regulators of Arabidopsis cryptochromes, Blue light Inhibitors of Cryptochromes 1 and 2 (BIC1 and BIC2), inhibit cryptochrome function by blocking blue light-dependent cryptochrome dimerization. However, it remained unclear how cryptochromes regulate BIC gene activity. Here we show that cryptochromes mediate light activation of BIC transcription by suppressing the activity of CONSTITUTIVE PHOTOMORPHOGENIC 1 (COP1), resulting in activation of the transcription activator ELONGATED HYPOCOTYL 5 (HY5), which associates with chromatin at the BIC promoters. These results demonstrate a CRY-BIC negative-feedback circuitry in which each component regulates the activity of the other. Surprisingly, phytochromes also mediate light activation of BIC transcription, suggesting a novel photoreceptor co-action mechanism that sustains blue light sensitivity of plants under the broad spectra of solar radiation in nature.

    Emission Reduction Decisions in Blockchain-Enabled Low-Carbon Supply Chains under Different Power Structures

    With the global climate problem becoming increasingly severe, governments have adopted policies to encourage enterprises to invest in low-carbon technologies. However, the opacity of the carbon emission reduction process leads to incomplete consumer trust in low-carbon products as well as higher supply chain transaction costs. Against this background, this paper constructs Stackelberg game models with and without blockchain under different power structures and compares their impact on low-carbon emission reduction decisions. The results show that: (1) blockchain does not necessarily improve enterprise profits, and it helps enterprises maintain optimal profits only within a certain range, when the carbon emission cost is low; (2) when consumers' environmental awareness is high, blockchain can incentivize manufacturers to enhance carbon emission reduction, and it noticeably promotes retailers' profits; and (3) the power structure widens the profit gap between enterprises in the supply chain, and implementing blockchain can coordinate profit distribution and narrow that gap. Compared with the manufacturer-dominated model, product emission reduction is maximized under the retailer-dominated model. Our study provides theoretical support for government regulation of greenhouse gas emissions as well as for the optimization of enterprises' decision-making supported by blockchain.
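The backward-induction logic behind a manufacturer-led Stackelberg game can be sketched as follows. This is an illustrative textbook model with linear demand q = a - b*p and made-up parameter values; it is not the paper's blockchain model, which additionally includes carbon-emission costs, consumer environmental awareness, and trust terms.

```python
# Manufacturer-led Stackelberg sketch: solve the follower's problem
# first, then the leader's, by backward induction.
# Parameters are illustrative assumptions, not taken from the paper.

a, b, c = 100.0, 1.0, 20.0   # demand intercept, price sensitivity, unit cost

# Stage 2 (follower): retailer picks retail price p to maximize
# (p - w)*(a - b*p); the first-order condition gives
# the best response p*(w) = (a + b*w) / (2*b).
def retailer_price(w):
    return (a + b * w) / (2 * b)

# Stage 1 (leader): manufacturer anticipates p*(w) and maximizes
# (w - c)*(a - b*p*(w)) = (w - c)*(a - b*w)/2, giving w* = (a + b*c)/(2*b).
w_star = (a + b * c) / (2 * b)
p_star = retailer_price(w_star)
q_star = a - b * p_star

profit_m = (w_star - c) * q_star       # leader's profit
profit_r = (p_star - w_star) * q_star  # follower's profit

print(w_star, p_star, q_star)          # 60.0 80.0 20.0
print(profit_m, profit_r)              # 800.0 400.0
```

Note how the leader earns twice the follower's profit in this simple version; that asymmetry is the "profit gap under different power structures" the abstract says blockchain can help narrow.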

    A Self-Aware and Active-Guiding Training & Assistant System for Worker-Centered Intelligent Manufacturing

    Training and on-site assistance are critical to help workers master required skills, improve worker productivity, and guarantee product quality. Traditional training methods lack the worker-centered considerations that are particularly needed when workers face ever-changing demands. In this study, we propose a worker-centered training & assistant system for intelligent manufacturing, featuring self-awareness and active guidance. Multi-modal sensing techniques are applied to perceive each individual worker, and a deep learning approach is developed to understand the worker's behavior and intention. Moreover, an object detection algorithm is implemented to identify the parts and tools the worker is interacting with. The worker's current state is then inferred and used to quantify and assess worker performance, from which the worker's potential guidance demands are analyzed. Furthermore, on-site guidance with multi-modal augmented reality is provided actively and continuously during the operational process. Two case studies demonstrate the feasibility and great potential of our proposed approach and system for application to frontline workers in the manufacturing industry.
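The state-inference step described above, combining a recognized action with the detected part or tool to decide whether guidance is needed, can be sketched as a simple rule check. The action names, part names, and expected sequence below are hypothetical placeholders; the paper infers state from deep-learning outputs rather than a hand-written table.

```python
# Illustrative state inference: compare the (action, object) pair from
# the recognition and detection modules against an expected sequence,
# and flag deviations so AR guidance can be triggered actively.
# All names here are hypothetical.

EXPECTED = [("pick", "baseplate"), ("fasten", "screwdriver"), ("attach", "spindle")]

def infer_state(step, action, obj):
    """Return (state, needs_guidance) for the worker's current step."""
    if step >= len(EXPECTED):
        return "done", False
    if (action, obj) == EXPECTED[step]:
        return "on_track", False
    return "deviating", True   # trigger on-site AR guidance

print(infer_state(0, "pick", "baseplate"))   # ('on_track', False)
print(infer_state(1, "attach", "spindle"))   # ('deviating', True)
```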