9 research outputs found

    Effect of Denosumab on Femoral Periprosthetic BMD and Early Femoral Stem Subsidence in Postmenopausal Women Undergoing Cementless Total Hip Arthroplasty

    Antiresorptive denosumab is known to improve the quality and strength of cortical bone in the proximal femurs of osteoporotic women, but its efficacy in preventing periprosthetic bone loss and reducing femoral stem migration has not been studied in women undergoing cementless total hip arthroplasty. We conducted a single-center, randomized, double-blinded, placebo-controlled trial of 65 postmenopausal women with primary hip osteoarthritis and Dorr type A or B proximal femur anatomy. The patients were randomized to receive subcutaneous injections of denosumab 60 mg or placebo once every 6 months for 12 months, starting 1 month before surgery. The primary endpoint was the change in bone mineral density (BMD) of the proximal femur (Gruen zone 7) at week 48, and the secondary endpoint was stem subsidence measured by radiostereometric analysis (RSA) at week 48. Exploratory endpoints included changes in BMD of the contralateral hip, lumbar spine, and distal radius, serum levels of bone turnover markers, walking speed, walking activity, patient-reported outcome measures, and radiographic assessment of stem osseointegration. The participants underwent vertebral-fracture assessment in an extension safety study at 3 years. Denosumab significantly decreased bone loss in the medial femoral neck (zone 7) and increased periprosthetic BMD in the greater trochanteric region (zone 1) and the lesser trochanteric region (zone 6). Denosumab did not, however, reduce early femoral stem migration, which occurred mainly during the settling period (0 to 12 weeks) after implantation of the prosthesis. All of the stems osseointegrated, as evaluated by RSA and radiographs. There were no intergroup differences in functional recovery. Discontinuation of denosumab did not lead to any adverse events. In conclusion, denosumab increased periprosthetic BMD in the clinically relevant regions of the proximal femur, but the treatment response was not associated with any reduction of initial stem migration.

    Decoding brain basis of laughter and crying in natural scenes

    Laughter and crying are universal signals of prosociality and distress, respectively. Here we investigated the functional brain basis of perceiving laughter and crying using a naturalistic functional magnetic resonance imaging (fMRI) approach. We measured hemodynamic brain activity evoked by laughter and crying in three experiments with 100 subjects in each. The subjects (i) viewed a 20-minute medley of short video clips, (ii) viewed 30 minutes of a full-length feature film, and (iii) listened to a 13.5-minute radio play, all of which contained bursts of laughter and crying. The intensity of laughing and crying in the videos and the radio play was annotated by independent observers, and the resulting time series were used to predict hemodynamic activity during laughter and crying episodes. Multivariate pattern analysis (MVPA) was used to test for regional selectivity in laughter- and crying-evoked activations. Laughter induced widespread activity in the ventral visual cortex and in the superior and middle temporal and motor cortices. Crying activated the thalamus, the cingulate cortex along the anterior-posterior axis, the insula, and the orbitofrontal cortex. Both laughter and crying could be decoded accurately (66–77%, depending on the experiment) from the BOLD signal, and the voxels contributing most significantly to classification were located in the superior temporal cortex. These results suggest that perceiving laughter and crying engages distinct neural networks whose activities suppress each other, supporting appropriate behavioral responses to others' bonding and distress signals.
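
    The episode-level decoding described above can be illustrated with a small, self-contained sketch. Everything below is an illustrative assumption rather than the study's pipeline: the data are simulated, the classifier (a cross-validated linear SVM over voxel patterns, one pattern per annotated laughter or crying episode) is a common MVPA choice, and the variable names are hypothetical. Chance level for the two-class problem is 50%, which is the baseline against which the reported 66–77% accuracies should be read.

    # Minimal MVPA sketch (assumed, not the authors' code): decode laughter vs.
    # crying episodes from voxel-wise BOLD patterns with a cross-validated
    # linear classifier. Data below are simulated purely for illustration.
    import numpy as np
    from sklearn.model_selection import StratifiedKFold, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import LinearSVC

    rng = np.random.default_rng(0)

    # One feature vector per annotated episode: n_episodes x n_voxels,
    # e.g. BOLD responses averaged within superior temporal cortex voxels.
    n_episodes, n_voxels = 120, 500
    X = rng.standard_normal((n_episodes, n_voxels))
    y = rng.integers(0, 2, n_episodes)  # 0 = crying episode, 1 = laughter episode

    # Standardize voxels, fit a linear SVM, and estimate accuracy with
    # stratified 10-fold cross-validation (chance level = 50%).
    clf = make_pipeline(StandardScaler(), LinearSVC(C=1.0))
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
    print(f"decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")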

    Predicting final ischemic stroke lesions from initial diffusion-weighted images using a deep neural network

    Background: For the prognosis of stroke, measurement of the diffusion-perfusion mismatch is common practice for estimating tissue at risk of infarction in the absence of timely reperfusion. However, perfusion-weighted imaging (PWI) adds time and expense to the acute stroke imaging workup. We explored whether a deep convolutional neural network (DCNN) model trained with diffusion-weighted imaging obtained at admission could predict final infarct volume and location in acute stroke patients. Methods: In 445 patients, we trained and validated an attention-gated (AG) DCNN to predict final infarcts as delineated on follow-up studies obtained 3 to 7 days after stroke. The input channels consisted of MR diffusion-weighted imaging (DWI), apparent diffusion coefficient (ADC) maps, and thresholded ADC maps with values less than 620 × 10⁻⁶ mm²/s, while the output was a voxel-by-voxel probability map of tissue infarction. We evaluated the performance of the model using the area under the receiver operating characteristic curve (AUC), the Dice similarity coefficient (DSC), the absolute lesion volume error, and the concordance correlation coefficient (ρc) between the predicted and true infarct volumes. Results: The model obtained a median AUC of 0.91 (IQR: 0.84–0.96). After thresholding at an infarction probability of 0.5, the median sensitivity and specificity were 0.60 (IQR: 0.16–0.84) and 0.97 (IQR: 0.93–0.99), respectively, while the median DSC and absolute volume error were 0.50 (IQR: 0.17–0.66) and 27 ml (IQR: 7–60 ml), respectively. The model's predicted lesion volumes showed high correlation with the ground-truth volumes (ρc = 0.73, p < 0.01). Conclusion: An AG-DCNN using diffusion information alone upon admission was able to predict infarct volumes at 3–7 days after stroke onset with accuracy comparable to models that consider both DWI and PWI. This may enable treatment decisions to be made with shorter stroke imaging protocols.
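
    As a companion to the evaluation metrics above, the sketch below shows how a Dice similarity coefficient, an absolute lesion volume error, and Lin's concordance correlation coefficient can be computed for a thresholded probability map. The arrays are simulated and the 1 mm³ voxel volume is an assumption made for illustration; only the 0.5 infarction-probability threshold is taken from the abstract.

    # Evaluation sketch (illustrative, not the study's pipeline): Dice similarity,
    # absolute lesion volume error, and Lin's concordance correlation coefficient
    # for a voxel-wise infarct probability map against a follow-up lesion mask.
    import numpy as np

    def dice(pred: np.ndarray, truth: np.ndarray) -> float:
        """Dice similarity coefficient between two binary masks."""
        intersection = np.logical_and(pred, truth).sum()
        denom = pred.sum() + truth.sum()
        return 2.0 * intersection / denom if denom > 0 else 1.0

    def concordance_ccc(x: np.ndarray, y: np.ndarray) -> float:
        """Lin's concordance correlation coefficient between paired measurements."""
        mx, my = x.mean(), y.mean()
        cov = ((x - mx) * (y - my)).mean()
        return 2.0 * cov / (x.var() + y.var() + (mx - my) ** 2)

    rng = np.random.default_rng(0)
    prob = rng.random((64, 64, 32))          # DCNN output: P(infarct) per voxel
    truth = rng.random((64, 64, 32)) > 0.97  # simulated follow-up infarct mask

    pred_mask = prob > 0.5                   # probability threshold from the abstract
    voxel_ml = 1e-3                          # assumed 1 mm^3 voxels -> 0.001 ml each
    vol_error_ml = abs(int(pred_mask.sum()) - int(truth.sum())) * voxel_ml
    print(f"Dice: {dice(pred_mask, truth):.2f}, |volume error|: {vol_error_ml:.1f} ml")

    # The concordance rho_c is computed across patients on predicted vs. true volumes.
    pred_vols = rng.random(50) * 100
    true_vols = pred_vols + rng.normal(0, 10, 50)
    print(f"concordance rho_c: {concordance_ccc(pred_vols, true_vols):.2f}")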

    Decoding Music-Evoked Emotions in the Auditory and Motor Cortex.

    Music can induce strong subjective experiences of emotion, but it is debated whether these responses engage the same neural circuits as emotions elicited by biologically significant events. We examined the functional neural basis of music-induced emotions in a large sample (n = 102) of subjects who listened to emotionally engaging (happy, sad, fearful, and tender) pieces of instrumental music while their hemodynamic brain activity was measured with functional magnetic resonance imaging (fMRI). Ratings of the four categorical emotions and of liking were used to predict hemodynamic responses in a general linear model (GLM) analysis of the fMRI data. Multivariate pattern analysis (MVPA) was used to reveal discrete neural signatures of the four categories of music-induced emotions. To map the neural circuits governing non-musical emotions, the subjects were also scanned while viewing short, emotionally evocative film clips. The GLM revealed that most emotions were associated with activity in the auditory, somatosensory, and motor cortices, the cingulate gyrus, the insula, and the precuneus. Fear and liking also engaged the amygdala. In contrast, the film clips strongly activated limbic and cortical regions implicated in emotional processing. MVPA revealed that activity in the auditory and primary motor cortices reliably discriminated the emotion categories. Our results indicate that different music-induced basic emotions have distinct representations in regions supporting auditory processing, motor control, and interoception, but do not strongly rely on the limbic and medial prefrontal regions critical for emotions with survival value.
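
    The rating-based GLM analysis mentioned above can be sketched as a mass-univariate regression in which the continuous emotion ratings, convolved with a hemodynamic response function, form the design matrix. The sketch below is only illustrative: the simulated data, the repetition time, and the simple difference-of-gammas HRF are assumptions, not the authors' preprocessing or model.

    # GLM sketch (assumed, illustrative only): regress each voxel's BOLD time
    # series on HRF-convolved emotion-rating regressors via ordinary least squares.
    import numpy as np
    from scipy.stats import gamma

    rng = np.random.default_rng(0)
    n_scans, tr = 300, 2.0  # assumed number of volumes and repetition time (s)

    def hrf(tr: float, duration: float = 30.0) -> np.ndarray:
        """Simple difference-of-gammas hemodynamic response function."""
        t = np.arange(0.0, duration, tr)
        return gamma.pdf(t, a=6) - 0.35 * gamma.pdf(t, a=16)

    # Five continuous rating time courses (happy, sad, fear, tender, liking),
    # convolved with the HRF and assembled into a design matrix with an intercept.
    labels = ["happy", "sad", "fear", "tender", "liking"]
    ratings = rng.random((n_scans, len(labels)))
    h = hrf(tr)
    X = np.column_stack([np.convolve(ratings[:, i], h)[:n_scans] for i in range(len(labels))])
    X = np.column_stack([X, np.ones(n_scans)])

    # Simulated voxel time series (scans x voxels); beta maps by least squares.
    Y = rng.standard_normal((n_scans, 2000))
    betas, *_ = np.linalg.lstsq(X, Y, rcond=None)
    print(betas.shape)  # (6, 2000): one beta per regressor per voxel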

    An updated review of nanofluids in various heat transfer devices
