
    Natural products as sources of new fungicides (I): synthesis and antifungal activity of Kakuol derivatives against phytopathogenic fungi

    Get PDF
    In order to establish a more advanced structure-activity relationship (SAR) and to explore the feasibility of kakuol analogues with better antifungal activity, a series of 2-hydroxy-4,5-methylenedioxyaryl ketones was conveniently synthesized via Friedel-Crafts acylation, etherification, reduction, and oximation reactions. Their structures were characterized by 1H and 13C NMR and HRMS, and their in vitro antifungal activities were assayed. Most of the derivatives showed remarkable in vitro activity, and some were significantly more effective than the commercial fungicide hymexazol used as a positive control. In particular, compounds 2h and 2i were active against Glomerella cingulata with IC50 values of 3.1 mg/ml and 2.9 mg/ml, respectively, suggesting that 2-hydroxy-4,5-methylenedioxyaryl ketones might be promising candidates in the development of new antifungal compounds. Compounds 2e, 5 and 6 also exhibited high antifungal activity against a wide range of organisms and might be considered lead compounds for new antifungal agents. DOI: http://dx.doi.org/10.5564/mjc.v15i0.331 Mongolian Journal of Chemistry 15 (41), 2014, p94-10

    Segatron: Segment-Aware Transformer for Language Modeling and Understanding

    Full text link
    Transformers are powerful for sequence modeling. Nearly all state-of-the-art language models and pre-trained language models are based on the Transformer architecture. However, the Transformer distinguishes sequential tokens only by their token position index. We hypothesize that better contextual representations can be generated from the Transformer with richer positional information. To verify this, we propose a segment-aware Transformer (Segatron), replacing the original token position encoding with a combined position encoding of paragraph, sentence, and token. We first introduce the segment-aware mechanism to Transformer-XL, a popular Transformer-based language model with memory extension and relative position encoding. We find that our method further improves both the Transformer-XL base and large models, achieving 17.1 perplexity on the WikiText-103 dataset. We further investigate the pre-training masked language modeling task with Segatron. Experimental results show that BERT pre-trained with Segatron (SegaBERT) outperforms BERT with the vanilla Transformer on various NLP tasks, and outperforms RoBERTa on zero-shot sentence representation learning. Comment: Accepted by AAAI 202
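
    The core architectural change is small enough to sketch: instead of a single learned embedding indexed by token position, each token receives the sum of three learned embeddings indexed by its paragraph, sentence, and token position. Below is a minimal PyTorch sketch of this combined encoding; the class name, table sizes, and the use of absolute (rather than Transformer-XL-style relative) indices are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a segment-aware position encoding: the sum of learned
# paragraph, sentence, and token position embeddings replaces the single
# token-position embedding of the vanilla Transformer. Names and table
# sizes are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn

class SegmentAwarePositionEncoding(nn.Module):
    def __init__(self, d_model, max_paras=64, max_sents=128, max_tokens=512):
        super().__init__()
        self.para_emb = nn.Embedding(max_paras, d_model)
        self.sent_emb = nn.Embedding(max_sents, d_model)
        self.tok_emb = nn.Embedding(max_tokens, d_model)

    def forward(self, para_idx, sent_idx, tok_idx):
        # Each index tensor has shape (batch, seq_len); the result is added
        # to the word embeddings before the first Transformer layer.
        return self.para_emb(para_idx) + self.sent_emb(sent_idx) + self.tok_emb(tok_idx)

# Example: four tokens spanning two sentences of the first paragraph.
enc = SegmentAwarePositionEncoding(d_model=768)
para = torch.tensor([[0, 0, 0, 0]])
sent = torch.tensor([[0, 0, 1, 1]])
tok = torch.tensor([[0, 1, 0, 1]])  # token index restarts per sentence (an assumption)
print(enc(para, sent, tok).shape)  # torch.Size([1, 4, 768])
```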

    Seasonal variability does not impact in vitro fertilization success

    Get PDF
    Peer reviewed. Publisher PDF

    QR-CLIP: Introducing Explicit Open-World Knowledge for Location and Time Reasoning

    Full text link
    Daily images may convey abstract meanings that require us to memorize and infer profound information from them. To encourage such human-like reasoning, in this work we teach machines to predict where and when an image was taken, rather than performing basic tasks like traditional segmentation or classification. Inspired by Horn's QR theory, we design a novel QR-CLIP model consisting of two components: 1) the Quantity module first retrieves broad open-world knowledge as candidate language inputs; 2) the Relevance module carefully weighs the vision and language cues and infers the location and time. Experiments show QR-CLIP's effectiveness: it outperforms the previous SOTA by an average relative lift of about 10% on location reasoning and 130% on time reasoning. This study lays a technical foundation for location and time reasoning and suggests that effectively introducing open-world knowledge is key to these tasks. Comment: Technical Report. Github: https://github.com/Shi-Wm/QR-CLI
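
    As a rough illustration of the retrieve-then-rank idea, the sketch below scores a set of candidate location/time descriptions against an image with off-the-shelf CLIP (via Hugging Face Transformers) and keeps the most relevant one. The candidate strings, the image path, and the use of plain CLIP image-text similarity as the Relevance step are assumptions for illustration; the paper's actual Quantity and Relevance modules are more elaborate.

```python
# Hedged sketch of a retrieve-then-rank pipeline using plain CLIP image-text
# similarity as the "Relevance" scorer. Candidate strings, the image path,
# and this scoring choice are illustrative assumptions, not the paper's code.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("photo.jpg")  # hypothetical input image

# "Quantity" step: candidate descriptions, here hard-coded; in the paper
# they come from open-world knowledge retrieval.
candidates = [
    "a street in Paris in winter",
    "a beach in Sydney in summer",
    "a market in Marrakech in spring",
]

inputs = processor(text=candidates, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(**inputs).logits_per_image  # shape: (1, num_candidates)

# "Relevance" step: rank candidates by image-text similarity.
probs = logits.softmax(dim=-1)
print(candidates[probs.argmax().item()])
```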