
    Examining the Link Between Social Anxiety and Alcohol Use: The Role of Post-Event Processing

    Post-event processing (PEP) involves a negative, self-focused review of one’s performance in past social events. PEP is common in social anxiety (SA); by increasing anticipation of failure for future social events, PEP serves to maintain anxiety. Research is needed to clarify the association between SA and PEP in the context of alcohol use, given that although socially anxious individuals are at increased risk of drinking problems, this risk pathway is poorly understood. To help address this, the present two studies assessed the role of PEP in the link between SA and alcohol use in two samples of young adults who ranged in SA severity. Study 1 used a 3-week diary (via smartphone) to assess alcohol intoxication during, and PEP after, social events in a sample of individuals (N=92) high (n=40) and low (n=52) in SA. Of interest was the association between PEP and next-event intoxication, and the moderating effect of SA. Compared to the low SA group, those high in SA reported more PEP and similar intoxication, and showed a positive correlation between PEP and next-event intoxication. In the low SA group, PEP and intoxication were unrelated. Multilevel models supported an SA by PEP interaction in the high SA group only. Specifically, increases in PEP corresponded with increases in intoxication at the next event, but only for those moderately high in SA. Study 2 used a lab-based procedure in which participants (N=103) consumed alcoholic (n=52) or non-alcoholic (n=51) beverages, engaged in an anxiety-provoking interaction, and completed follow-up assessments of PEP about the interaction (sent via email). Regression models supported an SA by drinking status interaction in predicting PEP in the alcohol condition only. Specifically, for those who consumed alcohol before the interaction, elevated (baseline) SA was associated with increased PEP, but only for light drinkers. For heavy drinkers in the alcohol condition, SA was unrelated to PEP. These results underscore the importance of PEP, and variables that influence PEP, in understanding the link between SA and alcohol use. The results are discussed in terms of theoretical and clinical considerations.
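
    As a rough illustration of the kind of multilevel model described above (an SA by PEP cross-level interaction predicting next-event intoxication), the sketch below fits a random-intercept model with statsmodels. The dataset, column names (participant, sa, pep, next_intox), and effect sizes are all invented for illustration; this is not the authors' data or analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulate 92 participants x 10 social events each (purely illustrative numbers).
rows = []
for pid in range(92):
    sa = rng.normal()                      # trait social anxiety (person level)
    for _ in range(10):
        pep = rng.normal()                 # post-event processing after event t
        next_intox = 0.2 * pep + 0.3 * sa + 0.25 * pep * sa + rng.normal()
        rows.append({"participant": pid, "sa": sa, "pep": pep, "next_intox": next_intox})
df = pd.DataFrame(rows)

# Random-intercept model: does the within-person PEP -> intoxication slope
# depend on trait SA? The pep:sa coefficient carries the interaction.
model = smf.mixedlm("next_intox ~ pep * sa", df, groups=df["participant"])
print(model.fit().summary())
```

    In a model like this, the pep:sa term is the one of interest: a positive estimate means the within-person PEP slope steepens as trait SA increases.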

    An Evaluation of a Computerized Measure of Interpretation Bias in Generalized Anxiety Disorder (GAD)

    Theories suggest that individuals with generalized anxiety disorder (GAD) make threatening appraisals of ambiguous information related to health, finances, and relationships, among other domains. Accordingly, we recently developed two parallel word-sentence association paradigm (WSAP) computer tasks designed to assess threat and benign interpretation biases relating to GAD worry. It was hypothesized that the GAD analogue group (i.e., individuals meeting diagnostic criteria by questionnaire) would endorse more threatening interpretations and fewer benign interpretations of ambiguous sentences relative to the non-GAD group (i.e., individuals not meeting diagnostic criteria by questionnaire) in WSAP Sets A and B. In the current study, 97 university students and community volunteers were randomly assigned to Set A (n = 49) or B (n = 48), and completed self-report measures of anxiety, worry, and related symptomatology. Among those assigned to Set A, no differences were found between the GAD analogue group (n = 19) and the non-GAD group (n = 30) in the tendency to endorse threat interpretations. Among those assigned to Set B, the GAD analogue group (n = 17) was significantly more likely than the non-GAD group (n = 31) to show an overall threat interpretation bias and, specifically, to reject benign disambiguations. No differences were found between the groups in either Set in the tendency to accept threatening disambiguations. More research is needed on the specific role of biases in the etiology and treatment of GAD, and on why Set A did not distinguish between the groups. This study provides preliminary support for the use of word-sentence paradigms to assess, and possibly modify, threat interpretation biases in GAD.
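
    To make the scoring of a WSAP-style task concrete, the following sketch computes per-participant endorsement rates for threat and benign words and compares groups. The trial-level layout, column names, and data are assumptions made for illustration, not the study's materials or results.

```python
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(1)

# Fake trial-level data: each row is one word-sentence trial.
trials = pd.DataFrame({
    "participant": np.repeat(np.arange(48), 20),
    "group": np.repeat(rng.choice(["GAD", "non-GAD"], 48), 20),
    "word_type": np.tile(["threat", "benign"] * 10, 48),
    "endorsed": rng.integers(0, 2, 48 * 20),   # 1 = judged word related to sentence
})

# Per-participant endorsement rate for threat and benign words.
rates = (trials.groupby(["participant", "group", "word_type"])["endorsed"]
               .mean().unstack("word_type").reset_index())

# Group comparison on threat endorsement (benign rejection would be 1 - benign rate).
gad = rates.loc[rates["group"] == "GAD", "threat"]
non = rates.loc[rates["group"] == "non-GAD", "threat"]
print(stats.ttest_ind(gad, non, equal_var=False))
```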

    Measuring Cognitive Errors: Initial Development of the Cognitive Distortions Scale (CDS)

    The ability to assess and correct biases in thinking is central to cognitive-behavioral therapy. Although measures of cognitive distortions exist, no measure comprehensively assesses the cognitive errors that are typically cited in the literature. The development and initial validation of the Cognitive Distortions Scale (CDS), a questionnaire that measures the tendency to make 10 cognitive distortions (e.g., mindreading, catastrophizing, all-or-nothing thinking) as they occur in interpersonal and achievement domains, is described. Across two studies, undergraduate students (n = 318) completed the CDS and other clinically relevant measures. The CDS and its two subscales appear to exhibit good psychometric properties; however, a factor analysis supported the use of a one-factor solution. Additional analyses suggested that some errors occur more frequently in some domains than others and that some errors may have more clinical significance than others. Notwithstanding issues inherent in measuring cognitive errors, and study limitations, the CDS appears to be a promising new measure of cognitive distortion, with good research and clinical potential.
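
    For readers less familiar with the scale-development steps mentioned above, the sketch below shows two routine checks, internal consistency (Cronbach's alpha) and an informal eigenvalue look at dimensionality, on simulated item responses. It is a generic illustration, not the CDS validation analysis itself.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate 318 respondents x 20 items (10 distortions x 2 domains), 1-7 ratings,
# driven by a single latent factor so a one-factor structure should emerge.
latent = rng.normal(0, 1, (318, 1))
items = np.clip(np.rint(4 + latent + rng.normal(0, 1, (318, 20))), 1, 7)

def cronbach_alpha(x):
    """Cronbach's alpha for an items matrix (respondents x items)."""
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

print("alpha:", round(cronbach_alpha(items), 3))

# Eigenvalues of the item correlation matrix: one dominant eigenvalue is the
# usual informal signal that a single-factor solution is reasonable.
eigvals = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))[::-1]
print("top eigenvalues:", np.round(eigvals[:3], 2))
```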

    Characterizing Width Uniformity by Wave Propagation

    This work describes a novel image analysis approach to characterizing the uniformity of objects in agglomerates by using the propagation of normal wavefronts. The problem of width uniformity is discussed, and its importance for the characterization of composite structures normally found in physics and biology is highlighted. The methodology involves identifying each cluster (i.e. connected component) of interest, which can correspond to objects or voids, and estimating the respective medial axes by using a recently proposed wavefront propagation approach, which is briefly reviewed. The distance values along such axes are identified and their mean and standard deviation values are obtained. As illustrated with respect to synthetic and real objects (in vitro cultures of neuronal cells), the combined use of these two features provides a powerful description of the uniformity of the separation between the objects, presenting potential for several applications in materials science and biology.
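
    The following sketch illustrates the two summary features (mean and standard deviation of the distance values along the medial axes) on a toy binary image. It substitutes scikit-image's standard Euclidean medial axis transform for the wavefront-propagation estimator used in the paper, so it is a stand-in rather than a reproduction of the method.

```python
import numpy as np
from skimage.morphology import medial_axis

# A toy binary "object": a ribbon of deliberately non-uniform width.
img = np.zeros((100, 200), dtype=bool)
for x in range(200):
    half_width = 8 + int(4 * np.sin(x / 15.0))
    img[50 - half_width:50 + half_width, x] = True

# Medial axis plus the distance-to-background value at every skeleton pixel.
skel, dist = medial_axis(img, return_distance=True)
widths = dist[skel]

# Uniform objects give a small std relative to the mean; non-uniform ones do not.
print("mean half-width:", widths.mean())
print("std of half-width:", widths.std())
print("coefficient of variation:", widths.std() / widths.mean())
```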

    Coarse-to-fine skeleton extraction for high resolution 3D meshes

    This paper presents a novel algorithm for medial surface extraction based on the density-corrected Hamiltonian analysis of Torsello and Hancock [1]. In order to cope with the exponential growth of the number of voxels, we compute a first coarse discretization of the mesh, which is iteratively refined until a desired resolution is achieved. The refinement criterion relies on the analysis of the momentum field, where only the voxels with a suitable value of the divergence are exploded to a lower level of the hierarchy. In order to compensate for the discretization errors incurred at the coarser levels, a dilation procedure is added at the end of each iteration. Finally, we design a simple alignment procedure to correct the displacement of the extracted skeleton with respect to the true underlying medial surface. We evaluate the proposed approach with an extensive series of qualitative and quantitative experiments.
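
    The sketch below is a schematic of the coarse-to-fine loop described above: at each level, only voxels flagged by a divergence criterion are carried to the finer grid, and the selection is dilated to offset coarse-level discretization error. The divergence field, threshold, and grid sizes are placeholders; this is not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def refine(active, divergence, threshold=-0.5):
    """One coarse-to-fine step on a boolean voxel occupancy grid.

    active:     voxels still under consideration at the current level
    divergence: momentum-field divergence sampled on the same grid (placeholder)
    """
    # Keep only voxels where the field sinks strongly (near the medial surface).
    selected = active & (divergence < threshold)
    # Dilate to recover voxels lost to coarse discretization.
    selected = binary_dilation(selected, iterations=1)
    # "Explode" each selected voxel into 2x2x2 children at the next (finer) level.
    return selected.repeat(2, axis=0).repeat(2, axis=1).repeat(2, axis=2)

rng = np.random.default_rng(3)
active = np.ones((16, 16, 16), dtype=bool)       # coarse voxelization of the mesh
for level in range(3):                           # iterate to the desired resolution
    divergence = rng.normal(0, 1, active.shape)  # stand-in for the real field
    active = refine(active, divergence)
    print("level", level + 1, "grid:", active.shape, "active voxels:", int(active.sum()))
```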

    Principal component and Voronoi skeleton alternatives for curve reconstruction from noisy point sets

    Surface reconstruction from noisy point samples must take into consideration the stochastic nature of the sample. In other words, geometric algorithms reconstructing the surface or curve should not insist on following each sampled point literally. Instead, they must interpret the sample as a “point cloud” and try to build the surface as passing through the best possible (in the statistical sense) geometric locus that represents the sample. This work presents two new methods to find a piecewise linear approximation from a Nyquist-compliant stochastic sampling of a quasi-planar C^1 curve C(u): R → R^3 whose velocity vector never vanishes. One of the methods articulates, in an entirely new way, Principal Component Analysis (statistical) and Voronoi-Delaunay (deterministic) approaches: it uses them to calculate the best possible tape-shaped polygon covering the planarised point set, and then approximates the manifold by the medial axis of that polygon. The other method applies Principal Component Analysis to find a direct piecewise linear approximation of C(u). A complexity comparison of these two methods is presented, along with a qualitative comparison with previously developed ones. It turns out that the method solely based on Principal Component Analysis is simpler and more robust for non-self-intersecting curves. For self-intersecting curves, the Voronoi-Delaunay-based medial axis approach is more robust, at the price of higher computational complexity. An application is presented in the integration of meshes originating in range images of an art piece; this application reaches the point of complete reconstruction of a unified mesh.
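
    As a minimal sketch of the PCA-only route for a non-self-intersecting curve, the code below orders noisy samples along the first principal direction, bins them, and connects bin centroids into a piecewise linear approximation. The data and binning choices are invented, and the Nyquist-compliance check and the Voronoi-Delaunay alternative are omitted.

```python
import numpy as np

rng = np.random.default_rng(4)

# Noisy samples of a quasi-planar curve in R^3.
u = np.sort(rng.uniform(0, 2 * np.pi, 400))
curve = np.column_stack([u, np.sin(u), 0.05 * u])
points = curve + rng.normal(0, 0.05, curve.shape)

# PCA via SVD of the centered point cloud; project onto the first principal axis.
centered = points - points.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
t = centered @ vt[0]                     # 1-D ordering parameter along the curve

# Sort by t, split into consecutive bins, and take each bin's centroid.
order = np.argsort(t)
segments = np.array_split(points[order], 20)
vertices = np.array([seg.mean(axis=0) for seg in segments])

# 'vertices', taken in order, are the knots of the piecewise linear approximation.
print(vertices.shape)                    # (20, 3): a 19-segment polyline approximating C(u)
```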