
    Atomistic potential for graphene and other sp$^2$ carbon systems

    We introduce a torsional force field for sp$^2$ carbon to augment the in-plane atomistic potential of a previous work (Kalosakas et al., J. Appl. Phys. 113, 134307 (2013)), so that it becomes applicable to out-of-plane deformations of graphene and related carbon materials. The introduced force field is fitted to reproduce DFT calculations on appropriately chosen structures. The aim is to create a force field that is as simple as possible, so that it remains efficient for large-scale atomistic simulations of various sp$^2$ carbon structures without significant loss of accuracy. We show that the complete proposed potential reproduces characteristic properties of fullerenes and carbon nanotubes. In addition, it reproduces very accurately the out-of-plane ZA and ZO modes of graphene's phonon dispersion, as well as all phonons with frequencies up to 1000 cm$^{-1}$.
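
    The abstract does not spell out the functional form of the torsional term, so the following is only a minimal sketch of how a dihedral-based out-of-plane energy could be evaluated in an atomistic code; the twofold-cosine form and the force constant `k_tors` are illustrative assumptions, not the fitted parameters of the paper.

```python
import numpy as np

def dihedral_angle(p1, p2, p3, p4):
    """Dihedral angle (radians) defined by four atomic positions."""
    b1, b2, b3 = p2 - p1, p3 - p2, p4 - p3
    n1 = np.cross(b1, b2)                      # normal of plane (p1, p2, p3)
    n2 = np.cross(b2, b3)                      # normal of plane (p2, p3, p4)
    m1 = np.cross(n1, b2 / np.linalg.norm(b2))
    return np.arctan2(np.dot(m1, n2), np.dot(n1, n2))

def torsional_energy(phi, k_tors=0.3):
    """Illustrative sp2 torsional term: zero for a planar dihedral (phi = 0 or pi),
    maximal for a 90-degree twist. k_tors (eV) is a placeholder, not a fitted value."""
    return 0.5 * k_tors * (1.0 - np.cos(2.0 * phi))

# A flat four-atom fragment with 1.42 A bonds (graphene-like, all z = 0)
# should cost no torsional energy.
flat = [np.array(p, float) for p in
        [(0, 0, 0), (1.42, 0, 0), (2.13, 1.23, 0), (3.55, 1.23, 0)]]
print(torsional_energy(dihedral_angle(*flat)))   # ~0.0
```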

    Knowledge-aware Complementary Product Representation Learning

    Learning product representations that reflect complementary relationships plays a central role in e-commerce recommender systems. In the absence of a product-relationship graph, which existing methods rely on, complementary relationships must be detected directly from noisy and sparse customer purchase activities. Furthermore, unlike simple relationships such as similarity, complementariness is asymmetric and non-transitive. Standard representation learning uses only one set of embeddings, which is problematic for modelling such properties of complementariness. We propose knowledge-aware learning with dual product embeddings to address these challenges. We encode contextual knowledge into the product representations via multi-task learning to alleviate the sparsity issue. By explicitly modelling user bias terms, we separate the noise of customer-specific preferences from complementariness. Furthermore, we adopt a dual-embedding framework to capture the intrinsic properties of complementariness and provide a geometric interpretation motivated by classic separating-hyperplane theory. Finally, we propose a Bayesian network structure that unifies all the components and subsumes several popular models as special cases. The proposed method compares favourably to state-of-the-art methods in downstream classification and recommendation tasks. We also develop an implementation that scales efficiently to a dataset with millions of items and customers.
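
    A minimal sketch of the dual-embedding idea described above: each product gets a separate "source" and "target" embedding, so the score of (i complements j) need not equal that of (j complements i), and a per-user bias term absorbs customer-specific purchase propensity. The table sizes, the logistic link, and all variable names are assumptions for illustration, not the paper's exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)
n_items, n_users, dim = 1000, 500, 32

# Dual embeddings: one table for the anchor product, a separate table for the
# candidate complement, making the relation asymmetric by construction.
source_emb = rng.normal(scale=0.1, size=(n_items, dim))
target_emb = rng.normal(scale=0.1, size=(n_items, dim))
user_bias  = np.zeros(n_users)   # customer-specific preference/noise term

def complementary_score(i, j, u):
    """Probability that product j complements product i for user u
    (illustrative logistic link)."""
    logit = source_emb[i] @ target_emb[j] + user_bias[u]
    return 1.0 / (1.0 + np.exp(-logit))

# Asymmetry: a phone (i) may suggest a phone case (j), but not the reverse.
print(complementary_score(3, 7, 42), complementary_score(7, 3, 42))
```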

    Predicting Evoked Emotions in Conversations

    Understanding and predicting the emotional trajectory in multi-party, multi-turn conversations is of great significance. Such information can be used, for example, to generate empathetic responses in human-machine interaction or to inform models for pre-emptive toxicity detection. In this work, we introduce the novel problem of Predicting Emotions in Conversations (PEC) for the next turn (n+1), given combinations of textual and/or emotion input up to turn n. We systematically approach the problem by modeling three dimensions inherently connected to evoked emotions in dialogues: (i) sequence modeling, (ii) self-dependency modeling, and (iii) recency modeling. These modeling dimensions are then incorporated into two deep neural network architectures, a sequence model and a graph convolutional network model. The former is designed to capture the sequence of utterances in a dialogue, while the latter captures both the sequence of utterances and the network formation of multi-party dialogues. We perform a comprehensive empirical evaluation of the various proposed models for addressing the PEC problem. The results indicate (i) the importance of the self-dependency and recency modeling dimensions for the prediction task, (ii) the quality of simpler sequence models in short dialogues, and (iii) the importance of the graph neural models in improving predictions in long dialogues.
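
    As a rough illustration of the sequence-model variant for the PEC task, the sketch below runs a recurrent encoder over per-turn text and emotion features and predicts the emotion evoked at turn n+1. The layer sizes, feature dimensions, and the seven-class label set are placeholders assumed for the example, not the architecture reported in the paper.

```python
import torch
import torch.nn as nn

class PECSequenceModel(nn.Module):
    """Toy sequence model for Predicting Emotions in Conversations:
    encode turns 1..n, predict the emotion evoked at turn n+1."""
    def __init__(self, utter_dim=768, emo_dim=7, hidden=256, n_emotions=7):
        super().__init__()
        self.gru = nn.GRU(utter_dim + emo_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_emotions)

    def forward(self, utter_feats, emo_feats):
        # utter_feats: (batch, n_turns, utter_dim) text embeddings per turn
        # emo_feats:   (batch, n_turns, emo_dim)  emotion features per turn
        x = torch.cat([utter_feats, emo_feats], dim=-1)
        _, h_n = self.gru(x)               # hidden state after the last turn
        return self.head(h_n.squeeze(0))   # logits over emotions at turn n+1

model = PECSequenceModel()
logits = model(torch.randn(4, 10, 768), torch.randn(4, 10, 7))
print(logits.shape)   # torch.Size([4, 7])
```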

    Wrinkled few-layer graphene as highly efficient load bearer

    Multilayered graphitic materials are not suitable as load bearers due to their inherently weak interlayer bonding (graphite, for example, is used as a solid lubricant in certain applications). The situation is largely improved when two-dimensional (2-D) materials such as monolayer graphene (SLG) are employed. The downside in these cases is the presence of thermally or mechanically induced wrinkles, which are ubiquitous in 2-D materials. Here we set out to examine the effect of extensive large-wavelength/amplitude wrinkling on the stress-transfer capabilities of exfoliated, simply supported graphene flakes. Contrary to common belief, we present clear evidence that this type of "corrugation" enhances the load-bearing capacity of few-layer graphene compared to 'flat' specimens. This effect results from the significant increase, due to wrinkling, of the graphene/polymer interfacial shear stress per increment of applied strain, and it paves the way for designing affordable graphene composites with highly improved stress-transfer efficiency.

    The Role of Preprocessing for Word Representation Learning in Affective Tasks

    Affective tasks, including sentiment analysis, emotion classification, and sarcasm detection, have drawn a lot of attention in recent years due to a broad range of useful applications in various domains. The main goal of affect detection tasks is to recognize states such as mood, sentiment, and emotion from textual data (e.g., news articles or product reviews). Despite the importance of preprocessing at different stages of affect detection tasks (i.e., word representation learning and building a classification model), this topic has not been studied well. To that end, we explore whether applying various preprocessing methods (stemming, lemmatization, stopword removal, punctuation removal, and so on) and their combinations at different stages of the affect detection pipeline can improve model performance. There are many preprocessing approaches that can be utilized in affect detection tasks; however, their influence on the final performance depends on the type of preprocessing and the stage at which it is applied. Moreover, the impact of preprocessing varies across different affective tasks. Our analysis provides thorough insights into how preprocessing steps can be applied when building an affect detection pipeline, and into their respective influence on performance. A minimal sketch of the two-stage choice is given below.
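
    The following sketch only illustrates the pipeline structure the abstract distinguishes: preprocessing choices made when training word representations can differ from those made when training the affect classifier. The particular steps, their order, and the toy stopword list are assumptions for the example, not the configurations evaluated in the paper.

```python
import re
import string

# Candidate preprocessing steps; each is an independent on/off choice.
STEPS = {
    "lowercase":           lambda t: t.lower(),
    "remove_punctuation":  lambda t: t.translate(str.maketrans("", "", string.punctuation)),
    "remove_stopwords":    lambda t: " ".join(w for w in t.split()
                                              if w.lower() not in {"the", "a", "an", "is", "and"}),
    "collapse_whitespace": lambda t: re.sub(r"\s+", " ", t).strip(),
}

def preprocess(text, enabled):
    """Apply the chosen steps in a fixed order."""
    for name in ("lowercase", "remove_punctuation", "remove_stopwords", "collapse_whitespace"):
        if name in enabled:
            text = STEPS[name](text)
    return text

# Key point of the study: the two pipeline stages may use different choices.
embedding_stage_steps  = {"lowercase", "collapse_whitespace"}            # word representation learning
classifier_stage_steps = {"lowercase", "remove_punctuation",
                          "remove_stopwords", "collapse_whitespace"}     # affect classifier input

raw = "The movie was GREAT, and the ending was surprisingly emotional!"
print(preprocess(raw, embedding_stage_steps))
print(preprocess(raw, classifier_stage_steps))
```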