5 research outputs found

    āļāļēāļĢāļ›āļĢāļ°āļĒāļļāļāļ•āđŒāđƒāļŠāđ‰āđ€āļ„āļĢāļ·āđˆāļ­āļ‡āļ§āļąāļ”āļ„āļĨāļ·āđˆāļ™āđ„āļŸāļŸāđ‰āļēāļŠāļĄāļ­āļ‡āđāļšāļšāļžāļāļžāļēāđƒāļ™āļāļēāļĢāļ§āļąāļ”āļ„āļ§āļēāļĄāļ„āļīāļ”āļŠāļĢāđ‰āļēāļ‡āļŠāļĢāļĢāļ„āđŒāļ‚āļ­āļ‡āļœāļđāđ‰āđ€āļĢāļĩāļĒāļ™āļ—āļĩāđˆāđ€āļĢāļĩāļĒāļ™āļĢāļđāđ‰āļ”āđ‰āļ§āļĒāļāļīāļˆāļāļĢāļĢāļĄāļŠāļ°āđ€āļ•āđ‡āļĄāđāļšāļšāđ€āļ›āļīāļ”āđāļĨāļ°āđāļšāļšāļĄāļĩāđ‚āļ„āļĢāļ‡āļŠāļĢāđ‰āļēāļ‡

    Suthida Chamrat
    Received: 28 February 2020; Revised: 24 May 2020; Accepted for publication: 29 May 2020
    DOI: http://doi.org/10.14456/jstel.2020.2
    Abstract: This research explored the use of neuroscience technology, in the form of portable electroencephalography (EEG) devices, to measure the state of students’ creative thinking while they performed structured and open STEM activities. After Institutional Review Board approval was obtained, 12 students who volunteered for the study took part in two STEM activities consisting of listening to lectures, watching YouTube videos, and doing hands-on work. Creative thinking was measured with the Muse EEG headband while the participants performed the tasks. The headband transmitted brain signals collected by four electrodes at positions AF7, AF8, TP9, and TP10 to receiving devices over Bluetooth in two ways: 1) streaming data to a computer via the Muse Direct application as files with the .Muse extension, displayed in real time by the Neuro visual program, and 2) streaming data to a mobile phone via the Mind Monitor application, which connects to Dropbox cloud storage and saves the data there automatically in CSV format. The results indicated that the second method is more convenient and yields stable, ready-to-analyze data, whereas files in the .Muse format from the first method must be converted before they can be analyzed in MATLAB, which is more complicated and requires expertise in writing commands. Online analysis with Mind Monitor Graphing showed that brain-wave activity varied with the task: in the structured STEM activity, mean alpha-band power (8–12 Hz), used as an indicator of creative thinking, was higher than in the open STEM activity and was lowest during the lecture sessions (71.637). When the symmetry of the alpha waves in the right and left hemispheres was considered, the open STEM activity produced clearly more alpha activity in the right hemisphere than in the left (difference, or asymmetry, = 11.160).
    Keywords: STEM activity, electroencephalography, neuroscience
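As a concrete illustration of the second workflow, here is a minimal sketch (Python with pandas, not code from the paper) that reads a Mind Monitor CSV export and computes the mean alpha-band power and the right-minus-left alpha asymmetry. The column names (Alpha_TP9, Alpha_AF7, Alpha_AF8, Alpha_TP10) and the file name are assumptions about the export format and may need adjusting to the actual header.

```python
import pandas as pd

# Column names as typically exported by Mind Monitor; these and the file
# name below are assumptions, not values taken from the paper.
ALPHA_COLS = ["Alpha_TP9", "Alpha_AF7", "Alpha_AF8", "Alpha_TP10"]
LEFT_COLS = ["Alpha_TP9", "Alpha_AF7"]    # left-hemisphere electrodes
RIGHT_COLS = ["Alpha_AF8", "Alpha_TP10"]  # right-hemisphere electrodes

def alpha_summary(csv_path: str) -> dict:
    """Mean alpha-band power and right-minus-left asymmetry for one session."""
    df = pd.read_csv(csv_path)
    alpha = df[ALPHA_COLS].dropna()           # keep rows that carry band-power samples
    mean_alpha = alpha.mean(axis=1).mean()    # grand mean over electrodes and time
    left = alpha[LEFT_COLS].mean(axis=1).mean()
    right = alpha[RIGHT_COLS].mean(axis=1).mean()
    return {"mean_alpha": mean_alpha, "asymmetry_right_minus_left": right - left}

if __name__ == "__main__":
    # Hypothetical file name for one recorded session.
    print(alpha_summary("open_stem_session.csv"))
```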

    A mixed-perception approach for safe human–robot collaboration in industrial automation

    Digital-enabled manufacturing systems require a high level of automation for fast and low-cost production, but they should also be flexible and adaptive to varying and dynamic conditions in their environment, including the presence of human beings. However, the presence of workers in a workspace shared with robots reduces productivity, because the robot is not aware of the human’s position and intention, which raises concerns about human safety. This work addresses the issue by designing a reliable safety-monitoring system for collaborative robots (cobots). The main idea is to significantly enhance safety by combining recognition of human actions through visual perception with interpretation of physical human–robot contact through tactile perception. Two datasets containing contact and vision data are collected from different volunteers. The action-recognition system classifies human actions from a skeleton representation of the person entering the shared workspace, and the contact-detection system distinguishes between intentional and incidental interactions when physical contact between human and cobot takes place. Two different deep learning networks are used for human action recognition and contact detection; in combination, they are expected to enhance human safety and increase the cobot’s awareness of human intentions. The results show a promising path for future AI-driven solutions for safe and productive human–robot collaboration (HRC) in industrial automation.
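The abstract describes fusing a vision-based action-recognition network with a tactile contact-classification network. The sketch below is a hypothetical, rule-based illustration in Python of how their outputs might be combined into a cobot safety decision; it is not the authors’ system, and the label names, modes, and decision rules are assumptions for illustration only.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class CobotMode(Enum):
    NORMAL = "normal speed"
    REDUCED = "reduced speed"
    STOP = "safety stop"

@dataclass
class Perception:
    """Combined outputs of the two perception modules (labels are illustrative)."""
    human_in_workspace: bool   # from the vision/skeleton action-recognition network
    action: str                # e.g. "approaching", "working_nearby", "idle"
    contact: Optional[str]     # None, "intentional", or "incidental" (tactile network)

def decide_mode(p: Perception) -> CobotMode:
    """Rule-based fusion of visual and tactile perception into a safety decision."""
    if p.contact == "incidental":
        return CobotMode.STOP      # unexpected collision: stop immediately
    if p.contact == "intentional":
        return CobotMode.REDUCED   # deliberate touch, e.g. hand guiding
    if p.human_in_workspace and p.action == "approaching":
        return CobotMode.REDUCED   # slow down while the human closes in
    return CobotMode.NORMAL

if __name__ == "__main__":
    print(decide_mode(Perception(True, "approaching", None)))  # CobotMode.REDUCED
```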