12,615 research outputs found

    Dynamic response of elastic beam to a moving pulse: finite element analysis of critical velocity

    The dynamic behaviour of a semi-infinite elastic beam subjected to a moving single sinusoidal pulse was investigated using the finite element method combined with dimensionless analysis. The typical features of the equivalent stress and beam deflection are presented. It is found that the average value of the maximal equivalent stress in the beam reaches its maximum when the velocity of the moving pulse is close to a critical velocity, and that this critical velocity decreases as the pulse duration increases. The material, structural and load parameters influencing the critical velocity were analysed, and an empirical formula for the critical velocity in terms of the elastic wave speed, the radius of gyration of the cross-section and the pulse duration was obtained.
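    As a rough illustration of the workflow in this abstract (not the authors' model), the sketch below discretises an Euler-Bernoulli beam with Hermite finite elements, drives it with a half-sine point force moving at constant speed, integrates the response with the Newmark average-acceleration scheme, and sweeps the pulse speed to find where the peak bending stress is largest. The material, cross-section and pulse parameters are placeholder assumptions, and a long pinned beam under a moving point force only crudely approximates the paper's semi-infinite beam under a distributed pulse.

```python
import numpy as np

# Placeholder material, section and pulse parameters (assumptions, not from the paper).
E, rho = 210e9, 7800.0                 # Young's modulus [Pa], density [kg/m^3]
b, h = 0.02, 0.02                      # rectangular cross-section [m]
A, I = b * h, b * h**3 / 12.0
L_beam, n_el = 20.0, 200               # long pinned beam to mimic a semi-infinite one
Le = L_beam / n_el
F0, T_pulse = 1.0e3, 2.0e-3            # half-sine pulse amplitude [N] and duration [s]

# Euler-Bernoulli element stiffness and consistent mass matrices (Hermite cubics).
l2 = Le * Le
Ke = E * I / Le**3 * np.array([[ 12,   6*Le, -12,   6*Le],
                               [ 6*Le, 4*l2, -6*Le, 2*l2],
                               [-12,  -6*Le,  12,  -6*Le],
                               [ 6*Le, 2*l2, -6*Le, 4*l2]])
Me = rho * A * Le / 420 * np.array([[ 156,   22*Le,  54,  -13*Le],
                                    [ 22*Le,  4*l2,  13*Le, -3*l2],
                                    [ 54,    13*Le, 156,  -22*Le],
                                    [-13*Le, -3*l2, -22*Le,  4*l2]])

ndof = 2 * (n_el + 1)                  # deflection + rotation at each node
K, M = np.zeros((ndof, ndof)), np.zeros((ndof, ndof))
for e in range(n_el):
    d = np.arange(2 * e, 2 * e + 4)
    K[np.ix_(d, d)] += Ke
    M[np.ix_(d, d)] += Me
free = [i for i in range(ndof) if i not in (0, ndof - 2)]  # pin deflection at both ends
Kf, Mf = K[np.ix_(free, free)], M[np.ix_(free, free)]

def load_vector(x, f):
    """Distribute a point force f at position x to nodal DOFs via Hermite shapes."""
    F = np.zeros(ndof)
    e = min(int(x / Le), n_el - 1)
    xi = x / Le - e
    F[2*e:2*e + 4] = f * np.array([1 - 3*xi**2 + 2*xi**3, Le*(xi - 2*xi**2 + xi**3),
                                   3*xi**2 - 2*xi**3,     Le*(xi**3 - xi**2)])
    return F[free]

def peak_bending_stress(v, n_sub=400):
    """Newmark (beta=1/4, gamma=1/2) response to a half-sine pulse moving at speed v."""
    dt = T_pulse / n_sub
    a0, a2, a3, gamma = 4 / dt**2, 4 / dt, 1.0, 0.5
    Keff_inv = np.linalg.inv(Kf + a0 * Mf)
    u = np.zeros(len(free)); vel = np.zeros(len(free)); acc = np.zeros(len(free))
    sigma_max = 0.0
    for step in range(2 * n_sub):                       # simulate two pulse durations
        t = (step + 1) * dt
        f = F0 * np.sin(np.pi * t / T_pulse) if t <= T_pulse else 0.0
        rhs = load_vector(min(v * t, L_beam), f) + Mf @ (a0*u + a2*vel + a3*acc)
        u_new = Keff_inv @ rhs
        acc_new = a0 * (u_new - u) - a2 * vel - a3 * acc
        vel = vel + dt * ((1 - gamma) * acc + gamma * acc_new)
        u, acc = u_new, acc_new
        full = np.zeros(ndof); full[free] = u
        curvature = np.abs(np.diff(full[1::2])) / Le    # w'' at element midpoints
        sigma_max = max(sigma_max, E * (h / 2) * curvature.max())
    return sigma_max

# Sweep the pulse speed; the "critical velocity" is where the peak stress is largest.
speeds = np.linspace(100.0, 700.0, 7)
stresses = [peak_bending_stress(v) for v in speeds]
for v, s in zip(speeds, stresses):
    print(f"speed {v:6.1f} m/s -> peak bending stress {s/1e6:8.2f} MPa")
print("estimated critical velocity:", speeds[int(np.argmax(stresses))], "m/s")
```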

    Condensed Multiwalled Carbon Nanotubes as Super Fibers

    The ultra-low intershell shear strength of carbon nanotubes (CNTs) has been the primary obstacle to their application as mechanical reinforcements. In this paper we propose a new CNT system composed of coaxial cylindrical shells of sp2-bonded carbon with condensed intershell spacings. Our atomistic calculations show that such condensed multiwalled carbon nanotubes (CMWNTs) can enhance intershell shear strengths by several orders of magnitude while simultaneously achieving higher tensile strengths and moduli than ordinary CNTs. It is further shown that CMWNTs remain thermally stable up to 2,000 K. Drawing on the primary enhancement mechanism of CMWNTs, a method of producing them is tentatively proposed. CMWNTs with these properties are believed to be excellent candidates for super fibers for creating space elevators.

    Diffusion Language Models Can Perform Many Tasks with Scaling and Instruction-Finetuning

    The recent surge of generative AI has been fueled by the generative power of diffusion probabilistic models and the scalable capabilities of large language models. Despite this potential, it remains unclear whether diffusion language models can solve general language tasks at a level comparable to their autoregressive counterparts. This paper demonstrates that scaling diffusion models with respect to data, model size, and tasks can effectively make them strong language learners. We build competent diffusion language models at scale by first acquiring knowledge from massive data via masked language modeling pretraining, exploiting the intrinsic connection between the two objectives. We then reprogram the pretrained masked language models into diffusion language models via diffusive adaptation, in which task-specific finetuning and instruction finetuning are explored to unlock their versatility in solving general language tasks. Experiments show that scaling diffusion language models consistently improves performance across downstream language tasks. We further discover that instruction finetuning can elicit zero-shot and few-shot in-context learning abilities that help tackle many unseen tasks by following natural language instructions, and show promise in advanced and challenging abilities such as reasoning.
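    Below is a minimal sketch, not the paper's implementation, of the core training step of a masked (absorbing-state) diffusion language model: sample a corruption level, replace that fraction of tokens with a mask token, and train a bidirectional transformer to recover them, which is the masked-language-modeling-style objective such reprogramming builds on. The tiny model, vocabulary size, hyperparameters and the name TinyDenoiser are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, MASK_ID, SEQ_LEN, D = 1000, 0, 32, 128            # toy sizes (assumptions)

class TinyDenoiser(nn.Module):
    """A small bidirectional transformer that predicts the original tokens."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, D)
        self.pos = nn.Parameter(torch.zeros(SEQ_LEN, D))
        layer = nn.TransformerEncoderLayer(D, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(D, VOCAB)

    def forward(self, tokens):
        return self.head(self.encoder(self.emb(tokens) + self.pos))

def diffusion_lm_step(model, tokens, optimizer):
    """One training step: mask a sampled fraction of tokens and denoise them."""
    bsz, _ = tokens.shape
    t = torch.rand(bsz, 1).clamp(min=0.05)               # per-sequence corruption level
    mask = torch.rand(tokens.shape) < t                  # mask roughly a t-fraction
    corrupted = torch.where(mask, torch.full_like(tokens, MASK_ID), tokens)
    logits = model(corrupted)
    loss = F.cross_entropy(logits[mask], tokens[mask])   # loss only on masked positions
    optimizer.zero_grad(); loss.backward(); optimizer.step()
    return loss.item()

model = TinyDenoiser()
opt = torch.optim.AdamW(model.parameters(), lr=3e-4)
batch = torch.randint(1, VOCAB, (8, SEQ_LEN))            # random toy token ids
print(diffusion_lm_step(model, batch, opt))
```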

    Policy Optimization with Stochastic Mirror Descent

    Improving sample efficiency has been a longstanding goal in reinforcement learning. In this paper, we propose VRMPO, a sample-efficient policy gradient method based on stochastic mirror descent. A novel variance-reduced policy gradient estimator is the key for VRMPO to improve sample efficiency. VRMPO needs only $\mathcal{O}(\epsilon^{-3})$ sample trajectories to achieve an $\epsilon$-approximate first-order stationary point, which matches the best-known sample complexity. We conduct extensive experiments to show that our algorithm outperforms state-of-the-art policy gradient methods in various settings.
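    VRMPO itself couples a variance-reduced gradient estimator with stochastic mirror descent; the sketch below is not the authors' algorithm and only illustrates the mirror-descent building block for a tabular softmax policy with the negative-entropy mirror map, where one mirror-ascent step on estimated action values reduces to an exponentiated-gradient update. The step size, action values and policy here are toy assumptions.

```python
import numpy as np

def mirror_ascent_step(policy, q_estimates, step_size=0.5):
    """One negative-entropy mirror-ascent update of a categorical policy."""
    logits = np.log(policy) + step_size * q_estimates   # gradient step in the dual space
    new_policy = np.exp(logits - logits.max())          # map back to the probability simplex
    return new_policy / new_policy.sum()

policy = np.ones(4) / 4                                  # uniform policy over 4 actions
q_hat = np.array([1.0, 0.2, -0.5, 0.1])                  # (noisy) action-value estimates
for _ in range(10):
    policy = mirror_ascent_step(policy, q_hat)
print(np.round(policy, 3))                               # probability mass shifts to action 0
```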