Anomalous Increase in Specific Heat of Binary Molten Salt-Based Graphite Nanofluids for Thermal Energy Storage
An anomalous increase of the specific heat was experimentally observed in molten salt nanofluids using a differential scanning calorimeter. Binary carbonate molten salt mixtures were used as the base fluid, and the base salts were doped with graphite nanoparticles. Specific heat measurements of the nanofluids were performed to examine the effects of the composition of the two salts constituting the base fluid. In addition, the effect of nanoparticle concentration was investigated as the concentration of the graphite nanoparticles was varied from 0.025 to 1.0 wt %. Moreover, the dispersion homogeneity of the nanoparticles was explored by increasing the amount of surfactant in the synthesis process of the molten salt nanofluids. The results showed that the specific heat of the nanofluid was enhanced by more than 30% in the liquid phase and by more than 36% in the solid phase at a nanoparticle concentration of 1 wt %. It was also observed that the concentration and the dispersion homogeneity of the nanoparticles favorably affected the specific heat enhancement of the molten salt nanofluids. The dispersion status of the graphite nanoparticles in the salt mixtures was visualized via scanning electron microscopy. The experimental results were explained in terms of the nanoparticle-induced compressed liquid layer structure of the molten salts.
A Deep Neural Network-Based Pain Classifier Using a Photoplethysmography Signal
Side effects occur when the dose of analgesic administered during surgery is excessive or insufficient relative to the amount required to mediate the induced pain. It is therefore important to accurately assess the pain level of the patient during surgery. We propose a pain classifier based on a deep belief network (DBN) using photoplethysmography (PPG). Our DBN learns the complex nonlinear relationship between extracted PPG features and pain status based on the numeric rating scale (NRS). A bagging ensemble model was used to improve classification performance. The DBN classifier showed better classification results than multilayer perceptron neural network (MLPNN) and support vector machine (SVM) models. In addition, classification performance improved when the selective bagging model was applied, compared with the use of each single-model classifier. A pain classifier based on a DBN with a selective bagging model can be helpful in developing a pain classification system.
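The bagging-ensemble idea in this abstract (bootstrap-resample the training set, train one classifier per resample, combine by majority vote) can be sketched in a few lines. This is a minimal, hypothetical illustration: it uses trivial nearest-centroid base learners and synthetic "PPG feature" points as stand-ins for the paper's DBN members and real features; none of the names or data below come from the paper.

```python
import random
from collections import Counter

def train_centroid(X, y):
    """Nearest-centroid base learner: mean feature vector per class."""
    cents = {}
    for label in set(y):
        rows = [x for x, l in zip(X, y) if l == label]
        cents[label] = [sum(col) / len(rows) for col in zip(*rows)]
    return cents

def predict_centroid(model, x):
    """Predict the class whose centroid is closest (squared distance)."""
    return min(model, key=lambda l: sum((a - b) ** 2 for a, b in zip(x, model[l])))

def bagging_fit(X, y, n_models=5, seed=0):
    """Train n_models base learners, each on a bootstrap resample."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        while True:  # resample until both classes are represented
            idx = [rng.randrange(len(X)) for _ in range(len(X))]
            if len(set(y[i] for i in idx)) > 1:
                break
        models.append(train_centroid([X[i] for i in idx], [y[i] for i in idx]))
    return models

def bagging_predict(models, x):
    """Majority vote over the ensemble members."""
    votes = Counter(predict_centroid(m, x) for m in models)
    return votes.most_common(1)[0][0]

# Toy, synthetic "PPG feature" data: two well-separated clusters
# (0 = no/low pain, 1 = pain), purely for illustration.
X = [[0.1, 0.2], [0.2, 0.1], [0.15, 0.15], [0.9, 1.0], [1.0, 0.9], [0.95, 0.95]]
y = [0, 0, 0, 1, 1, 1]
ens = bagging_fit(X, y)
print(bagging_predict(ens, [0.12, 0.18]))  # → 0
print(bagging_predict(ens, [0.97, 0.93]))  # → 1
```

The paper's "selective" bagging additionally chooses which trained members enter the vote; the sketch above shows only the plain bootstrap-and-vote core.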
Non-invasive transmission of sensorimotor information in humans using an EEG/focused ultrasound brain-to-brain interface
We present a non-invasive means of detecting unilateral hand motor brain activity from one individual and subsequently stimulating the somatosensory area of another individual, thus enabling a remote link between individual brain hemispheres in humans. Healthy participants were paired as a sender and a receiver. The sender performed a motor imagery task of either the right or left hand, and the associated changes in the electroencephalogram (EEG) mu rhythm (8–10 Hz) originating from either hemisphere were programmed to move a computer cursor toward a target that appeared on either the left or right of the computer screen. When the cursor reached its target, the outcome was transmitted to another computer over the internet and actuated focused ultrasound (FUS) devices that selectively and non-invasively stimulated either the right or left hand somatosensory area of the receiver. Small FUS transducers allowed for the independent administration of stimulatory ultrasonic waves to the somatosensory areas. The stimulation elicited unilateral tactile sensation of the receiver's hand, thus establishing a hemispheric brain-to-brain interface (BBI). Although there was a degree of variability in task accuracy, six pairs of volunteers performed the BBI task with high accuracy, transferring approximately eight commands per minute. Linkage between the hemispheric brain activities of individuals suggests the possibility of expanding the information bandwidth in the context of BBI.
Schematics and exemplar view of the brain-to-brain interface (BBI) system.
<p>On the left panels, the EEG-based BCI procedure is shown, including the placement of EEG electrodes over the motor cortices. Upon the presentation of the left/right target on a computer display, the sender’s motor imagery of the left/right hand modulates the EEG signal through the reduction of the <i>mu</i> rhythm in the corresponding hemisphere (<i>i</i>.<i>e</i>., illustrated at the C3 and C4 EEG locations), and moves the computer cursor (displayed as a ball) through BCI signal processing. When the cursor reaches its target, the computer generates a trigger signal that is transmitted to the receiving location (~30 km away) through the internet (<i>via</i> the TCP/IP protocol), whereby the FUS-based CBI (illustrated on the right panels) stimulates either the left or right SI. The right panel illustrates the implementation of two FUS transducers that independently stimulate the left/right SI of the receiving individual. The stimulation of the SI elicits tactile sensation in the contralateral hand area, and the receiver was instructed to indicate the hand in which she/he felt the elicited tactile sensation.</p>
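The BCI step described above hinges on detecting a reduction of mu-rhythm (8–10 Hz) power, i.e., event-related desynchronization, at C3/C4. A minimal sketch of that band-power measurement on synthetic signals is shown below; the sampling rate, epoch length, and signals are assumptions for illustration and are not taken from the paper's actual pipeline.

```python
import numpy as np

FS = 256  # assumed EEG sampling rate, Hz

def band_power(signal, fs, lo=8.0, hi=10.0):
    """Mean FFT power of a 1-D signal within the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

t = np.arange(0, 2.0, 1.0 / FS)              # 2-second epoch
rest = np.sin(2 * np.pi * 9.0 * t)           # strong 9 Hz mu rhythm at rest
imagery = 0.2 * np.sin(2 * np.pi * 9.0 * t)  # suppressed mu during motor imagery

# Event-related desynchronization: the imagery epoch has lower mu power,
# which a BCI can threshold to drive the cursor left or right.
print(band_power(rest, FS) > band_power(imagery, FS))  # True
```

In practice, a real pipeline would use spatially filtered C3/C4 channels, noisy data, and a calibrated threshold rather than clean sinusoids.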
Exemplar view of CBI segment responses during a BBI session.
<p>Exemplar capture screen of the FUS-mediated CBI segment, showing the BCI-originated trigger signals and the elicited responses from the CBI segments.</p>
Accuracy and processing time of the BCI segment, CBI segment, and overall BBI communication for three sessions.
<p>Accuracy and processing time of the BCI segment, CBI segment, and overall BBI communication for three sessions.</p>
Performance of the six subject pairs for the implemented brain-to-brain interface (BBI).
<p>The ROC curves from the six subject pairs are shown in terms of the EEG-based BCI task (left column), the FUS-mediated CBI task (middle column), and the overall BBI communication (right column). The rows indicate the three sessions (rows: session #1 to #3). Each BCI/CBI subject or BBI pair is labeled 1–6 around the data point circles.</p>
Schematics for the FUS-mediated CBI system in BBI communication.
<p>(<b>A</b>) <u>Left panel</u>: a rendering of the FUS setup for the CBI segment. The left and right SI were independently targeted by two single-element 210 kHz FUS transducers, where the locations of the FUS foci were tracked by using optical trackers (‘tracker 1’ and ‘tracker 2’) in reference to the head (tracked <i>via</i> the ‘helmet tracker’). Each tracker, having four infrared-reflective markers, was recognized by a motion capture camera. The gap between the transducer and the scalp was filled with a soft compressible hydrogel for acoustic coupling. The FUS transducers were actuated by separate sets of computer-controlled driving circuits. <u>Right panel</u>: the acoustic intensity mapping of the FUS transducer. The FWHM of the intensity profile is demarcated by red dotted lines. (<b>B</b>) Illustration of the FUS actuations triggered by the transmitted signals from the BCI segment, with the following acoustic parameters: SD = 500 ms; ISI (inter-stimulation interval) = 7 s; TBD = 1 ms; PRF = 500 Hz; I<sub>sppa</sub> = 35.0 W/cm<sup>2</sup>. (<b>C</b> and <b>D</b>) Exemplar views of the FUS targeting to the bilateral SI as guided by individual-specific neuroimage data. The sonication target is indicated as the intersection of the green orthogonal crosshairs shown in the triplanar views (<i>i</i>.<i>e</i>., axial, sagittal, and coronal slices), while the thick yellow line (connecting the green and red dots as the intended entry and target points) and the green line represent the orientation of the planned and guided sonication beam paths, respectively. In the lower right panel of the 3D-rendered view, the four colored dots show the locations of the anatomical markers used for the spatial co-registration between the subject and the neuroimage data. ‘R’ and ‘L’ denote right and left, respectively.</p>
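The acoustic parameters listed in panel B imply a few derived quantities that are useful sanity checks. Assuming rectangular tone bursts, the duty cycle is TBD × PRF, the spatial-peak time-average intensity within the stimulation duration is I<sub>sppa</sub> × duty cycle, and SD × PRF gives the number of bursts per stimulation. The short sketch below just carries out that arithmetic; it is a reader-side check, not part of the paper's methods.

```python
# Parameters as listed in the caption (rectangular-pulse assumption).
TBD = 1e-3      # tone-burst duration, s (1 ms)
PRF = 500.0     # pulse repetition frequency, Hz
ISPPA = 35.0    # spatial-peak pulse-average intensity, W/cm^2
SD = 0.5        # stimulation duration, s (500 ms)

duty_cycle = TBD * PRF      # fraction of time the burst is on
ispta = ISPPA * duty_cycle  # spatial-peak time-average intensity during SD
bursts_per_stim = SD * PRF  # tone bursts delivered per stimulation

print(duty_cycle)       # 0.5  (50% duty cycle)
print(ispta)            # 17.5 (W/cm^2)
print(bursts_per_stim)  # 250.0
```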
Performance of the BCI segment, CBI segments, and overall BBI sessions.
<p>Performance of the BCI segment, CBI segments, and overall BBI sessions.</p>
Flow charts of the visual displays used for the brain-computer interface (BCI) task trials.
<p>(<b>A</b>) The timing diagram of the BCI test sessions used for setting the BCI parameters. (<b>B</b>) The timing diagram of the BCI experiment session used for decoding motor intention during the BBI implementation.</p>