242 research outputs found

    Atomic Mechanism and Criterion for Hydrogen-Induced Transgranular to Intergranular Fracture Transition

    Get PDF
    Please click Additional Files below to see the full abstract.

    The Uncertainty of Roughness and Its Influence on Dynamic Response and Performance of Canal System

    Get PDF
    Source: ICHE Conference Archive - https://mdi-de.baw.de/icheArchiv

    Bis[(2-pyrid­yl)(2-pyridyl­amino)­methano­lato]manganese(III) nitrate

    Get PDF
    The MnIII atom in the title complex, [Mn(C11H10N3O)2]NO3, is coordinated by two tridentate (2-pyrid­yl)(2-pyridyl­amino)­methano­late ligands, forming a six-coordinate environment. The four pyridyl N atoms constitute the equatorial plane on which the manganese(III) ion lies; the coordination plane suffers a slight distortion, as indicated by the average plane deviation of 0.058 Å. The methano­late O atoms occupy the axial positions, so the coordination geometry is octa­hedral. In the title compound, the cations are linked by nitrate anions via N—H⋯O hydrogen bonds to form one-dimensional chains. Moreover, the one-dimensional structure is stabilized by inter­molecular edge-to-face aromatic π–π inter­actions about a center of inversion at a distance of ca. 4.634 Å.

    An extended Rice model for intergranular fracture

    Get PDF
    The plastic events occurring during intergranular fracture in metals are still not well understood, owing to the complexity of grain boundary (GB) structures and their interactions with crack-tip dislocation plasticity. By incorporating the local GB structural transformation after dislocation emission from a GB into the Peierls-type Rice-Beltz model, we establish a semi-analytical, transition-state-theory-based framework to predict the most probable Mode-I stress intensity factor (SIF) for dislocation emission from a cracked GB. Using large-scale molecular dynamics (MD) simulations, we studied the fracture behavior of bi-crystalline Fe samples containing 12 different symmetric tilt GBs. The MD results demonstrate that the presence of a GB can significantly change the SIF required to activate plastic events, confirming the theoretical prediction that attributes this to the energy change caused by the transformation of the GB structure. Both the atomistic simulations and the theoretical model consistently indicate that the critical dynamic SIF, at which the dynamic SIF KI(t) deviates from linearity with respect to the strain ε, increases with increasing loading rate. However, the classical Rice model underestimates this critical SIF because it fails to account for the effects of localized fields. The present theoretical model provides a mechanism-based framework for applying grain boundary engineering to the design and fabrication of nano-grained metals.
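The idea of correcting a classical Rice-type emission criterion with a GB-transformation energy term can be sketched as follows. This is a minimal illustration, not the paper's actual model: the resolved-shear form of the Rice criterion is standard, but the correction term `d_gamma_gb` and all numerical values here are hypothetical placeholders.

```python
import math

def rice_emission_KI(mu, nu, gamma_us, theta, d_gamma_gb=0.0):
    """Rice-type estimate of the Mode-I SIF for dislocation emission on a
    slip plane inclined at angle theta to the crack plane.
    d_gamma_gb is a hypothetical energy correction for the local GB
    structural transformation after emission (negative = easier emission)."""
    # resolved shear factor for an inclined slip plane under Mode-I loading
    f = math.cos(theta / 2.0) ** 2 * math.sin(theta / 2.0)
    g_eff = gamma_us + d_gamma_gb  # effective unstable stacking energy
    return math.sqrt(2.0 * mu * g_eff / (1.0 - nu)) / f

# illustrative bcc-Fe-like numbers (not from the paper)
mu, nu = 82e9, 0.29        # shear modulus [Pa], Poisson's ratio
gamma_us = 1.0             # unstable stacking energy [J/m^2]
theta = math.radians(70.5) # slip-plane inclination

K_bulk = rice_emission_KI(mu, nu, gamma_us, theta)
K_gb = rice_emission_KI(mu, nu, gamma_us, theta, d_gamma_gb=-0.2)
```

A GB whose transformation releases energy (negative correction) lowers the emission SIF relative to the bulk-crystal estimate, which is the qualitative trend the abstract describes.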

    Liquid layer generator for excellent icephobicity at extremely low temperature

    Get PDF
    Progress in icephobicity has been made in recent years. However, most reported icephobic surfaces rely on mechanisms of a static nature, and maintaining low ice adhesion on these surfaces at temperatures as low as -60 ℃ has been challenging. Dynamic anti-icing surfaces, which can melt ice or change the ice-substrate interface from the solid to the liquid phase after ice has formed, serve as a viable alternative. In this study, liquid layer generators (LLGs), which release ethanol to the ice-solid interface and convert the ice-substrate contact from a solid-solid to a solid-liquid-solid mode, are introduced. Both molecular dynamics simulations and experiments show that the excellent icephobicity of surfaces with an ethanol lubricating layer withstands extremely low temperature (-60 ℃). Two LLG prototypes, one packing ethanol inside and the other storing replenishable ethanol below the substrate, are fabricated. The LLGs can continuously release ethanol for up to 593 days without replenishing the source. Both prototypes demonstrate super-low ice adhesion strengths of 1.0~4.6 kPa and 2.2~2.8 kPa at -18 ℃. For selected samples, introducing the interfacial ethanol layer decreased the ice adhesion strength on the same surfaces from 709.2~760.9 kPa to an unprecedented 22.1~25.2 kPa at a low temperature of -60 ℃.
    © 2019. This is the authors' accepted and refereed manuscript. Locked until 1.7.2020 due to copyright restrictions. The final authenticated version is available online at: http://dx.doi.org/10.1039/C9MH00859

    A Multi-State Comparative Coarse-Grained Modeling of Semi-Crystalline Poly(vinyl alcohol)

    Get PDF
    Poly(vinyl alcohol) (PVA) is a promising material with exceptional mechanical properties, adhesion, and abrasion resistance. To accurately predict its mesoscopic properties, such as crystal size and morphology, while improving computational efficiency, novel coarse-grained (CG) potentials are developed using iterative Boltzmann inversion (IBI) coupled with density correction. These CG potentials are derived from various thermodynamic states based on two different mapping schemes to overcome the limitations of traditional CG potentials in predicting the glass transition, crystallization, and melting temperatures. By comparing the simulation results obtained from these CG potentials with atomistic molecular dynamics (MD) simulations and experimental data, we identify the most suitable CG model of semicrystalline PVA that effectively reproduces both atomistic structures and thermodynamic properties. In particular, X-ray diffraction (XRD) experiments are used to further validate the accuracy of the CG potentials. This multistate comparative CG strategy provides efficient and accurate CG models for deeper investigations of PVA and other semicrystalline polymers. Our study paves the way for establishing a systematic and comprehensive database of CG potentials, serving as a valuable resource for future research on semicrystalline polymers.

    POMP: Probability-driven Meta-graph Prompter for LLMs in Low-resource Unsupervised Neural Machine Translation

    Full text link
    Low-resource languages (LRLs) face challenges in supervised neural machine translation due to limited parallel data, prompting research into unsupervised methods. Unsupervised neural machine translation (UNMT) methods, including back-translation, transfer learning, and pivot-based translation, offer practical solutions for LRL translation, but they are hindered by issues like synthetic data noise, language bias, and error propagation, which can potentially be mitigated by Large Language Models (LLMs). LLMs have advanced NMT with in-context learning (ICL) and supervised fine-tuning methods, but insufficient training data results in poor performance in LRLs. We argue that LLMs can mitigate the linguistic noise with auxiliary languages to improve translations in LRLs. In this paper, we propose Probability-driven Meta-graph Prompter (POMP), a novel approach employing a dynamic, sampling-based graph of multiple auxiliary languages to enhance LLMs' translation capabilities for LRLs. POMP involves constructing a directed acyclic meta-graph for each source language, from which we dynamically sample multiple paths to prompt LLMs to mitigate the linguistic noise and improve translations during training. We use the BLEURT metric to evaluate the translations, and the scores are back-propagated as rewards to update the probabilities of the auxiliary languages in the paths. Our experiments show significant improvements in the translation quality of three LRLs, demonstrating the effectiveness of our approach.
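The sample-then-reward loop the abstract describes can be sketched as follows. This is a toy illustration only: the auxiliary languages, the multiplicative-weights update rule, and the fixed reward are placeholders, and the real POMP meta-graph construction and BLEURT-driven update may differ.

```python
import math
import random

random.seed(0)

# hypothetical auxiliary languages for one low-resource source language
aux_langs = ["es", "fr", "de", "pt"]
probs = {lang: 1.0 / len(aux_langs) for lang in aux_langs}

def sample_path(k=2):
    """Sample k auxiliary languages (a path through the meta-graph)
    in proportion to their current probabilities."""
    langs, weights = zip(*probs.items())
    return random.choices(langs, weights=weights, k=k)

def update_probs(path, reward, lr=0.1):
    """Multiplicative-weights style update from a BLEURT-like reward in [0, 1]:
    languages on a well-rewarded path gain probability; then renormalize."""
    for lang in path:
        probs[lang] *= math.exp(lr * reward)
    total = sum(probs.values())
    for lang in probs:
        probs[lang] /= total

path = sample_path()              # e.g. two auxiliary languages for the prompt
update_probs(path, reward=0.8)    # reward would come from BLEURT in practice
```

Over many iterations, auxiliary languages that consistently yield higher-scoring translations dominate the sampling distribution.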

    Self-Evolution Learning for Mixup: Enhance Data Augmentation on Few-Shot Text Classification Tasks

    Full text link
    Text classification tasks often encounter few-shot scenarios with limited labeled data, so addressing data scarcity is crucial. Data augmentation with mixup has proven effective on various text classification tasks. However, most mixup methods do not consider the varying degree of learning difficulty at different stages of training, and they generate new samples with one-hot labels, resulting in model overconfidence. In this paper, we propose a self-evolution learning (SE) based mixup approach for data augmentation in text classification, which can generate more adaptive and model-friendly pseudo samples for model training. SE focuses on the variation of the model's learning ability. To alleviate model overconfidence, we introduce a novel instance-specific label smoothing approach, which linearly interpolates the model's output and the one-hot labels of the original samples to generate new soft labels for mixing up. Through experimental analysis, we demonstrate that, in addition to improving classification accuracy, SE also enhances the model's generalization ability.
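The two ingredients named in the abstract, instance-specific label smoothing followed by standard mixup, can be sketched as below. The smoothing coefficient `alpha`, the toy model output, and the mixing on raw vectors (rather than text embeddings) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def instance_label_smoothing(model_probs, one_hot, alpha=0.3):
    """Instance-specific smoothing: linearly interpolate the one-hot label
    with the model's own predicted distribution for that instance."""
    return (1.0 - alpha) * one_hot + alpha * model_probs

def mixup(x1, y1, x2, y2, lam):
    """Standard mixup on inputs (e.g. embeddings) and their soft labels."""
    return lam * x1 + (1.0 - lam) * x2, lam * y1 + (1.0 - lam) * y2

# toy 3-class example
one_hot = np.array([1.0, 0.0, 0.0])
model_probs = np.array([0.7, 0.2, 0.1])  # hypothetical model output
soft = instance_label_smoothing(model_probs, one_hot)  # [0.91, 0.06, 0.03]

x_a, x_b = np.array([1.0, 2.0]), np.array([3.0, 4.0])
y_b = np.array([0.0, 0.5, 0.5])  # soft label of the second sample
x_mix, y_mix = mixup(x_a, soft, x_b, y_b, lam=0.6)
```

Because the soft label reflects what the model currently predicts, the resulting mixup targets are easier to fit early in training and less likely to push the model toward overconfident one-hot outputs.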

    Recursively Summarizing Enables Long-Term Dialogue Memory in Large Language Models

    Full text link
    Most open-domain dialogue systems suffer from forgetting important information, especially in long-term conversations. Existing works usually train a specific retriever or summarizer to obtain key information from the past, which is time-consuming and depends heavily on the quality of labeled data. To alleviate this problem, we propose to recursively generate summaries/memory using large language models (LLMs) to enhance long-term memory ability. Specifically, our method first stimulates LLMs to memorize small dialogue contexts and then recursively produce new memory from the previous memory and the following contexts. Finally, the LLM can easily generate a highly consistent response with the help of the latest memory. We evaluate our method using ChatGPT and text-davinci-003, and experiments on a widely used public dataset show that our method generates more consistent responses in long-context conversations. Notably, our method is a potential solution for enabling LLMs to model extremely long contexts. Code and scripts will be released later.
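The recursive fold over dialogue chunks can be sketched as below. The `summarize` function is a placeholder for the LLM call (here it just concatenates and truncates), and the chunk size and turns are toy values, so this shows only the control flow, not the paper's prompting.

```python
def summarize(memory, new_turns):
    """Placeholder for an LLM call that folds new dialogue turns into the
    running memory; a real implementation would prompt the model to produce
    a concise summary of (memory + new_turns)."""
    combined = (memory + " " + " ".join(new_turns)).strip()
    return combined[-200:]  # keep memory bounded, as a real summary would

def recursive_memory(dialogue, chunk=2):
    """Recursively update memory: each step sees only the previous memory
    and the next few turns, never the full conversation at once."""
    memory = ""
    for i in range(0, len(dialogue), chunk):
        memory = summarize(memory, dialogue[i:i + chunk])
    return memory

turns = ["A: I moved to Oslo.", "B: Nice!",
         "A: I work on ships.", "B: Cool."]
mem = recursive_memory(turns)
```

At response time, the model is conditioned on `mem` plus the most recent turns, so the context passed to the LLM stays short no matter how long the conversation grows.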