
    Well-posedness and Robust Preconditioners for the Discretized Fluid-Structure Interaction Systems

    In this paper we develop a family of preconditioners for the linear algebraic systems arising from the arbitrary Lagrangian-Eulerian discretization of some fluid-structure interaction models. After the time discretization, we formulate the fluid-structure interaction equations as saddle point problems and prove their uniform well-posedness. We then discretize in space by finite element methods and prove uniform well-posedness of the discrete problems by two different approaches under appropriate assumptions. This uniform well-posedness makes it possible to design robust preconditioners for the discretized fluid-structure interaction systems. Numerical examples are presented to show the robustness and efficiency of these preconditioners.
    Comment: Added two preconditioners to the analysis and implementation; reran all numerical tests; changed the title and abstract, corrected numerous typos and inconsistencies, and added references.
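
    The abstract does not reproduce the discrete operators, but the setting is the standard one: after discretization, each time step requires the solution of a saddle point system, and uniform well-posedness yields block preconditioners whose performance does not degrade under mesh refinement. A generic sketch of such a system and a block-diagonal preconditioner (illustrative notation only, not the paper's specific FSI blocks):

        \[
        \mathcal{A}
        \begin{pmatrix} u \\ p \end{pmatrix}
        =
        \begin{pmatrix} A & B^{T} \\ B & 0 \end{pmatrix}
        \begin{pmatrix} u \\ p \end{pmatrix}
        =
        \begin{pmatrix} f \\ g \end{pmatrix},
        \qquad
        \mathcal{P} =
        \begin{pmatrix} A & 0 \\ 0 & S \end{pmatrix},
        \quad
        S = B A^{-1} B^{T}.
        \]

    In practice the Schur complement S is replaced by a spectrally equivalent, cheaply invertible approximation; uniform well-posedness is what bounds the condition number of the preconditioned operator independently of the discretization parameters.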

    Novel disilane chemistry: silyl radical catalyzed cyclo-trimerization of alkynes, synthesis of 1,4-disilacyclohexa-2,5-dienes and silicon hypercoordination studies

    This dissertation consists of four papers and a research proposal that extends the current research.

    Si2Cl6 and Si2(OMe)6 were found to be efficient in cyclo-trimerizing alkynes into the corresponding aromatic products. A proposed silyl radical pathway is supported by the addition of radical quenchers to the reaction system, by the observation of the catalytic trimerization of alkynes by hexachlorodisilane and hexamethoxydisilane, and by UV-visible irradiation experiments that afforded cyclo-trimerization products.

    1,4-Disilacyclohexa-2,5-dienes can be synthesized by reacting disilanes bearing multiple dimethylamino groups with alkynes. A proposed silylene pathway is supported by the identification of the by-products of the reactions, and by trapping an intermediate reaction product after the addition of 1,4-diphenyl-1,3-butadiene to the pentakis(dimethylamino)disilane/diphenylacetylene reaction system.

    Disilanes with electron-withdrawing groups react readily with 1,2-quinones and p-quinones to afford disilylated products in the absence of a transition metal catalyst, in contrast to earlier reports. A proposed pathway involving the formation of hypercoordinated silicon species is supported by adduct-forming reactions between Si2Cl6 and diamines and DMF, and by the reaction between acetamide salts and Si2Cl6; a hexacoordinated silicon complex was synthesized in one of these experiments.

    The observations reported in the papers of this thesis are related by the following properties of disilanes: the relative weakness of the Si-Si bond, especially when electron-withdrawing substituents are present on the silicon atoms; the relatively easy disproportionation of disilanes bearing multiple dimethylamino groups to afford silylene species, owing to the weakness of the Si-Si and Si-N bonds; and the enhanced hypercoordination tendency of disilanes bearing electron-withdrawing groups.

    Multi-Zone Unit for Recurrent Neural Networks

    Recurrent neural networks (RNNs) have been widely used to deal with sequence learning problems. The input-dependent transition function, which folds new observations into hidden states to sequentially construct fixed-length representations of arbitrary-length sequences, plays a critical role in RNNs. Because they compose representations in a single space, transition functions in existing RNNs often have difficulty capturing complicated long-range dependencies. In this paper, we introduce a new Multi-zone Unit (MZU) for RNNs. The key idea is to design a transition function that is capable of modeling composition in multiple spaces. The MZU consists of three components: zone generation, zone composition, and zone aggregation. Experimental results on multiple datasets for character-level language modeling and aspect-based sentiment analysis demonstrate the superiority of the MZU.
    Comment: Accepted at AAAI 2020.
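
    The abstract names the three MZU components but not their equations. The toy cell below is a hypothetical illustration of that structure, not the authors' formulation: it generates several candidate "zones" from the input and previous state, lets the zones attend over one another (composition across multiple spaces), and gates them into the next hidden state.

        import torch
        import torch.nn as nn

        class ToyMultiZoneCell(nn.Module):
            """Illustrative multi-zone RNN cell: generate K zones, compose
            them with attention, and aggregate them into the next state."""
            def __init__(self, input_size: int, hidden_size: int, num_zones: int = 4):
                super().__init__()
                self.num_zones = num_zones
                # Zone generation: one projection per zone.
                self.zone_proj = nn.Linear(input_size + hidden_size,
                                           num_zones * hidden_size)
                # Zone composition: scores for mixing zones with each other.
                self.compose = nn.Linear(hidden_size, hidden_size)
                # Zone aggregation: gate that folds zones into one state.
                self.gate = nn.Linear(hidden_size, 1)

            def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
                B, H = h.shape
                zones = self.zone_proj(torch.cat([x, h], dim=-1))
                zones = torch.tanh(zones).view(B, self.num_zones, H)
                # Compose: each zone attends over all zones.
                scores = zones @ self.compose(zones).transpose(1, 2) / H ** 0.5
                zones = torch.softmax(scores, dim=-1) @ zones
                # Aggregate: gated average of the composed zones.
                weights = torch.softmax(self.gate(zones), dim=1)
                return (weights * zones).sum(dim=1)

        # Usage: step over a sequence of embeddings.
        cell = ToyMultiZoneCell(input_size=32, hidden_size=64)
        h = torch.zeros(8, 64)
        for x_t in torch.randn(10, 8, 32):   # (time, batch, feature)
            h = cell(x_t, h)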

    Enhancing Context Modeling with a Query-Guided Capsule Network for Document-level Translation

    Context modeling is essential to generating coherent and consistent translations in document-level neural machine translation. The widely used approach compresses the context information into a single representation via hierarchical attention networks. However, it neither considers the relationships between context words nor distinguishes their roles. To address this problem, we propose a query-guided capsule network that clusters context information into different perspectives relevant to the target translation. Experimental results show that our method significantly outperforms strong baselines on multiple data sets from different domains.
    Comment: 11 pages, 7 figures; 2019 Conference on Empirical Methods in Natural Language Processing.
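
    The abstract describes the mechanism only at a high level. The sketch below is a speculative toy of query-guided capsule routing, assuming standard dynamic routing with an added query-similarity bias; the function name and the exact guidance term are illustrative, not the paper's.

        import torch
        import torch.nn.functional as F

        def query_guided_routing(ctx: torch.Tensor, query: torch.Tensor,
                                 num_caps: int = 4, iters: int = 3) -> torch.Tensor:
            """Toy query-guided capsule routing.
            ctx:   (N, D) context word vectors.
            query: (D,)   query vector summarizing the current source sentence.
            Returns (num_caps, D) output capsules, i.e. "perspectives" on the context.
            """
            N, D = ctx.shape
            logits = torch.zeros(N, num_caps)            # routing logits b_ij
            # Bias the routing toward words similar to the query
            # (hypothetical guidance term).
            guidance = (ctx @ query).unsqueeze(1)        # (N, 1)
            for _ in range(iters):
                c = F.softmax(logits + guidance, dim=1)  # coupling coefficients
                caps = torch.tanh(c.t() @ ctx)           # (num_caps, D) capsules
                logits = logits + ctx @ caps.t()         # agreement update
            return caps

        caps = query_guided_routing(torch.randn(12, 64), torch.randn(64))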

    Minimizing the Bag-of-Ngrams Difference for Non-Autoregressive Neural Machine Translation

    Non-Autoregressive Neural Machine Translation (NAT) achieves a significant decoding speedup by generating target words independently and simultaneously. However, in the non-autoregressive setting the word-level cross-entropy loss cannot properly model the target-side sequential dependency, so it correlates only weakly with translation quality. As a result, NAT tends to generate disfluent translations with over-translation and under-translation errors. In this paper, we propose to train NAT to minimize the Bag-of-Ngrams (BoN) difference between the model output and the reference sentence. The BoN training objective is differentiable and can be calculated efficiently; it encourages NAT to capture the target-side sequential dependency and correlates well with translation quality. We validate our approach on three translation tasks and show that it outperforms the NAT baseline by about 5.0 BLEU on WMT14 En↔De and by about 2.5 BLEU on WMT16 En↔Ro.
    Comment: Accepted at AAAI 2020.
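
    For bigrams the objective described here is directly computable: a NAT model factorizes over positions, so the expected count of each n-gram is a product of position-wise probabilities, and the L1 distance to the reference bag only requires iterating over the reference's own n-grams. A minimal sketch for one sentence with n = 2 (toy function name, not the authors' code):

        import torch
        from collections import Counter

        def bon_l1_loss(probs: torch.Tensor, ref: list) -> torch.Tensor:
            """Toy Bag-of-Bigrams L1 difference for one sentence.
            probs: (T, V) per-position word distributions of a NAT model;
                   positions are independent, so the expected count of a
                   bigram (a, b) is sum_t probs[t, a] * probs[t + 1, b].
            ref:   reference sentence as a list of token ids.
            """
            T, V = probs.shape
            ref_bigrams = Counter(zip(ref, ref[1:]))
            total_expected = torch.tensor(float(T - 1))  # expected counts sum to T - 1
            matched = torch.tensor(0.0)
            loss = torch.tensor(0.0)
            for (a, b), c in ref_bigrams.items():
                e = (probs[:-1, a] * probs[1:, b]).sum()  # expected count of (a, b)
                loss = loss + (e - c).abs()
                matched = matched + e
            # Bigrams absent from the reference contribute their expected
            # count, accounted for in closed form via total_expected.
            return loss + (total_expected - matched)

        probs = torch.softmax(torch.randn(6, 100), dim=-1)
        print(bon_l1_loss(probs, [3, 7, 7, 2, 9, 5]))

    The closed-form last line works because the expected counts over all possible bigrams sum to T - 1, so the loss never has to enumerate the full vocabulary.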