
    Thermal rheological analysis of cure process of epoxy prepreg

    The cure process of an epoxy prepreg used in composite pipe joints was studied by differential scanning calorimetry (DSC), a Bohlin rheometer, and other techniques. Isothermal DSC measurements were conducted between 110 and 220 °C at 10 °C intervals. The results show that the complete cure reaction could be achieved at °C. The isothermal cure process was simulated with the four-parameter autocatalytic model. Except in the late stage of the cure reaction, the model agrees well with the experimental data, especially at high temperatures. To account for the effect of diffusion on the cure rate, a diffusion factor was introduced into the model; the modified model greatly improves the predicted data in the late stage of the cure reaction. The dynamic cure process differs from the isothermal one in that it is composed of two cure reactions. For the dynamic cure process, a three-parameter autocatalytic model was used, and its parameters were determined by two methods. The first was based on the Kissinger and Ozawa approach, with the whole curing process modeled as two reactions. The second was based on the Borchardt and Daniels kinetic approach, with the whole curing process modeled as one reaction. The fitting results of the first and second methods agree well with the experimental values in the late and early cure stages, respectively. The rheological properties of the epoxy prepreg are closely related to the cure process: as the cure reaction develops, gelation occurs and the prepreg becomes difficult to process, and as temperature increases, the gel time decreases. Viscosity profiles were described by different models. In addition to the first-order and nth-order viscosity models, new viscosity models were proposed; the new models describe both the isothermal and dynamic cure processes better than the old ones. To graphically represent the phase changes of the cure process, isothermal time-temperature-transformation (TTT) and conversion-temperature-transformation (CTT) cure diagrams were constructed. Each region in the TTT and CTT diagrams corresponds to a phase state of the cure process, so the cure mechanism is clearly shown in the diagrams.
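    The abstract does not write out the four-parameter autocatalytic model or the diffusion factor, so the Python sketch below assumes the common Kamal-type form dα/dt = (k1 + k2·α^m)(1 − α)^n combined with a sigmoidal diffusion factor; the functional forms and all parameter values are illustrative assumptions, not fitted results from the study.

```python
# Hedged sketch: a Kamal-type four-parameter autocatalytic cure model with a
# sigmoidal diffusion factor. Parameter values are illustrative placeholders,
# not values fitted in the paper.
import numpy as np
from scipy.integrate import solve_ivp

def cure_rate(t, alpha, k1, k2, m, n, C, alpha_c):
    """d(alpha)/dt = f_d(alpha) * (k1 + k2*alpha**m) * (1 - alpha)**n"""
    a = np.clip(alpha[0], 0.0, 1.0)
    kinetic = (k1 + k2 * a**m) * (1.0 - a)**n        # four-parameter autocatalytic term
    f_d = 1.0 / (1.0 + np.exp(C * (a - alpha_c)))    # diffusion factor near vitrification
    return [f_d * kinetic]

# Illustrative isothermal run (placeholder parameters, time in seconds)
params = dict(k1=1e-3, k2=2e-2, m=0.5, n=1.5, C=30.0, alpha_c=0.85)
sol = solve_ivp(cure_rate, (0.0, 3600.0), [0.0], args=tuple(params.values()))
print(f"conversion after 1 h: {sol.y[0, -1]:.3f}")
```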

    Boost the Lead Conversion Efficiency for the Synthesis of Colloidal 2D PbS Nanosheets

    In the synthesis of colloidal PbS nanosheets, the supernatant of the reaction solution is reused to boost the lead conversion efficiency, doubling the conversion of the lead precursors to PbS nanosheets. The nanosheets synthesized by reusing the supernatant have morphology, thickness, and optical properties nearly identical to those of the original ones, as confirmed by transmission electron microscopy, X-ray diffraction, and photoluminescence spectroscopy. This method reduces the toxic Pb-containing waste generated during the synthesis, a step toward the green and scalable synthesis of colloidal 2D PbS nanosheets.

    ETCE2002/OT-29154 RHEOLOGICAL ANALYSIS OF CURING PROCESS OF EPOXY PREPREG USED AS COMPOSITE PIPE JOINTS

    The rheological properties of the curing process of an epoxy prepreg were measured with a Bohlin rheometer. The variations of storage modulus, loss modulus, and viscosity were monitored against cure time and temperature. Viscosity profiles were described by different models. In addition to the first-order viscosity models, new viscosity models based on the Boltzmann function were proposed. In the new models, a parameter called the critical time is introduced; the critical time is a function of temperature and also obeys an Arrhenius law. The activation energy calculated from the critical time is close to that obtained from the initial viscosity. The kinetic rate constants in the old and new models are comparable at each temperature, and the kinetic activation energies calculated from these rate constants are very close. The fitting results show that the proposed new viscosity models describe both the isothermal and dynamic cure processes better than the old models.
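    The Boltzmann-function viscosity model itself is not reproduced in the abstract, so the sketch below assumes a standard Boltzmann sigmoid in time with the critical time t_c as its midpoint; the model form, the synthetic data, and the starting parameters are illustrative assumptions only.

```python
# Hedged sketch: one plausible Boltzmann-sigmoid viscosity model with a
# "critical time" t_c, fitted per temperature. The data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def boltzmann_viscosity(t, eta0, eta_inf, t_c, tau):
    """eta(t) = eta_inf + (eta0 - eta_inf) / (1 + exp((t - t_c)/tau))"""
    return eta_inf + (eta0 - eta_inf) / (1.0 + np.exp((t - t_c) / tau))

# Hypothetical isothermal viscosity data (time in s, viscosity in Pa*s)
t_data = np.linspace(0.0, 1800.0, 50)
eta_data = boltzmann_viscosity(t_data, 5.0, 5e3, 900.0, 120.0)
eta_data *= 1.0 + 0.05 * np.random.default_rng(0).standard_normal(t_data.size)

popt, _ = curve_fit(boltzmann_viscosity, t_data, eta_data,
                    p0=[10.0, 1e3, 800.0, 100.0])
print(f"fitted critical time t_c ~ {popt[2]:.0f} s")
# Repeating the fit at several temperatures and plotting ln(t_c) vs 1/T would
# test the Arrhenius dependence of the critical time described above.
```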

    Visualizing the Structure of the Earth’s Lithosphere on the Google Earth Virtual-Globe Platform

    While many of the current methods for representing the existing global lithospheric models are suitable for academic investigators conducting professional geological and geophysical research, they are not suited to visualizing and disseminating lithospheric information to non-geological users (such as atmospheric scientists, educators, policy-makers, and even the general public), as they rely on dedicated computer programs or systems to read and work with the models. This shortcoming has become more obvious as more and more people from both academic and non-academic institutions struggle to understand the structure and composition of the Earth’s lithosphere. Google Earth and the concomitant Keyhole Markup Language (KML) provide a universal and user-friendly platform to represent, disseminate, and visualize the existing lithospheric models. We present a systematic framework to visualize and disseminate the structure of the Earth’s lithosphere on Google Earth. A KML generator is developed to convert lithospheric information derived from the global lithospheric model LITHO1.0 into KML-formatted models, and a web application is deployed to disseminate and visualize those models on the Internet. The presented framework and associated implementations can be readily adapted to support interactive integration and visualization of the internal structure of the Earth from a global perspective.
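    As a rough illustration of what the KML conversion step might look like, the following Python sketch writes a single placemark carrying a lithospheric attribute; the coordinates and the "moho_depth_km" value are placeholders rather than actual LITHO1.0 output, and the real generator presumably emits far richer geometry (polygons, colour-coded layers, folders per depth slice).

```python
# Hedged sketch: a minimal KML writer in the spirit of the described framework.
# Sample point and attribute value are placeholders, not LITHO1.0 data.
def point_to_kml(name, lon, lat, moho_depth_km):
    """Return a KML document containing one placemark with an ExtendedData field."""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>{name}</name>
      <ExtendedData>
        <Data name="moho_depth_km"><value>{moho_depth_km}</value></Data>
      </ExtendedData>
      <Point><coordinates>{lon},{lat},0</coordinates></Point>
    </Placemark>
  </Document>
</kml>"""

with open("litho_sample.kml", "w", encoding="utf-8") as fh:
    fh.write(point_to_kml("Sample grid node", 116.4, 39.9, 35.2))  # placeholder values
```

    The resulting .kml file can be opened directly in Google Earth or served by a web application, which is the dissemination route the abstract describes.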

    SPSQL: Step-by-step Parsing Based Framework for Text-to-SQL Generation

    Converting text into structured query language (Text2SQL) is a research hotspot in natural language processing (NLP) with broad application prospects. In the era of big data, databases have penetrated all walks of life; the collected data are large in scale, diverse in variety, and wide in scope, which makes data querying cumbersome and inefficient and places higher requirements on Text2SQL models. In practical applications, the current mainstream end-to-end Text2SQL models are not only difficult to build, owing to their complex structure and high training-data requirements, but also difficult to tune because of their massive number of parameters; in addition, it is hard for such models to reach the desired accuracy. This paper therefore proposes a pipelined Text2SQL method, SPSQL. The method decomposes the Text2SQL task into four subtasks: table selection, column selection, SQL generation, and value filling, which are cast as a text classification problem, a sequence labeling problem, and two text generation problems, respectively. We then construct data formats for the different subtasks from existing data and improve the accuracy of the overall model by improving the accuracy of each submodel. We also use a named entity recognition module and data augmentation to optimize the overall model. We construct the dataset from the marketing business data of the State Grid Corporation of China. Experiments demonstrate that our proposed method achieves the best performance compared with the end-to-end method and other pipeline methods. (Comment: 8 pages, 6 figures)
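    The following Python sketch illustrates how the four-stage pipeline described above could be composed; the submodel interfaces and the shape of their outputs are hypothetical stand-ins, not the actual SPSQL implementation.

```python
# Hedged sketch of a four-stage Text2SQL pipeline in the spirit of SPSQL.
# The SubModel interface and the context keys are hypothetical placeholders.
from dataclasses import dataclass
from typing import Protocol

class SubModel(Protocol):
    def predict(self, question: str, context: dict) -> dict: ...

@dataclass
class SPSQLPipeline:
    table_selector: SubModel    # text classification: which tables are relevant
    column_selector: SubModel   # sequence labeling: which columns are mentioned
    sql_generator: SubModel     # text generation: SQL skeleton
    value_filler: SubModel      # text generation: literal values for conditions

    def __call__(self, question: str, schema: dict) -> str:
        ctx = {"schema": schema}
        ctx.update(self.table_selector.predict(question, ctx))   # e.g. {"tables": [...]}
        ctx.update(self.column_selector.predict(question, ctx))  # e.g. {"columns": [...]}
        ctx.update(self.sql_generator.predict(question, ctx))    # e.g. {"sql_skeleton": "..."}
        ctx.update(self.value_filler.predict(question, ctx))     # e.g. {"sql": "..."}
        return ctx["sql"]
```

    The design choice the paper argues for is visible here: each stage is a small, separately trainable model whose output narrows the search space for the next stage, so overall accuracy can be improved by improving any single submodel.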

    High stability light-emitting electrochemical cells from cationic iridium complexes with bulky 5,5′ substituents

    We explore the photophysical, electrochemical, and electroluminescent properties of the ionic transition-metal complex [(ppy)₂Ir(bpy*)](PF₆), where ppyH is 2-phenylpyridine and bpy* is a 5,5′-diaryl-2,2′-bipyridine. Single-layer devices of the structure ITO/[(ppy)₂Ir(bpy*)](PF₆)/Au exhibited high stability, with half-lives on the order of 100 h at a bias of -4 V. Long lifetimes are achieved through the bulky nature of the aryl substituents, which serves to limit chromophore-chromophore self-quenching, and the 5,5′ positioning of these bulky groups is clearly advantageous for device performance.

    Bending behavior of diamane and twisted bilayer graphene: Insights from four-point bending deformation

    The intriguing physical properties of two-dimensional (2D) nanomaterials make them promising building blocks for flexible electronics. Using a four-point bending approach, this work establishes, through atomistic simulations, a comprehensive understanding of the bending behavior of diamane, a 2D diamond nanostructure, from elastic deformation to structural failure. The four-point bending method accurately reproduces pure bending of the sample, and the obtained force-displacement curve fits well with classical Euler beam theory. Structural failure is observed in diamane under bending when its thickness or number of layers increases. Atomic insights reveal that the crack initiates from the tension side of the sample, resulting in a tension-induced bending failure; specifically, the bending limit is found to be slightly larger than the fracture strain under tensile deformation. Additionally, the bending behavior of the diamane analogue, twisted bilayer graphene with interlayer bonding (TBGIB), has been investigated. Unlike diamane, TBGIB bends elastically at the initial stage and then experiences structural failures with increasing bending strain. A higher interlayer bonding density results in a higher bending stiffness. Meanwhile, significant interlayer shear strain is detected during bending, which leads to interlayer bond breakage, rippling, and buckling of the graphene layers. This work provides a full description of the pure bending behavior of diamane and its analogous structure, which could be beneficial for their applications in flexible electronics.
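    For reference, the classical Euler beam prediction that the simulated force-displacement curve is compared against can be sketched as below; the beam geometry and modulus are placeholder values, not the diamane parameters used in the simulations.

```python
# Hedged sketch: textbook Euler-Bernoulli mid-span deflection for four-point
# bending of a simply supported beam. All numerical values are illustrative.
def four_point_midspan_deflection(F_total, L, a, E, I):
    """Mid-span deflection when two loads of F_total/2 are applied at distance
    a from each support of a simply supported beam of span L."""
    return F_total * a * (3.0 * L**2 - 4.0 * a**2) / (48.0 * E * I)

# Placeholder rectangular cross-section: width b, thickness h -> I = b*h**3/12
b, h = 10e-9, 1e-9          # m (illustrative only)
L, a = 50e-9, 15e-9         # m (illustrative only)
E = 1.0e12                  # Pa (illustrative only)
I = b * h**3 / 12.0
print(f"deflection under 1 nN: {four_point_midspan_deflection(1e-9, L, a, E, I):.3e} m")
```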