305 research outputs found

    Engineering Metal-Ligand Interface for Selective Electrochemical Carbon Dioxide Reduction

    Unsustainable exploitation of fossil fuels and the resulting massive greenhouse gas emissions necessitate the development of alternative energy sources. Chemical fuels (CH3OH or C2H5OH) outperform other options, such as batteries, owing to their high energy densities, which are key to portability. Electrochemical reduction of CO2 can produce a wide range of valuable fuels (syngas, formic acid, methane, methanol, etc.). Converting CO2 into carbon-based fuels further closes the carbon-neutral cycle, contributing to the effort to reduce global CO2 emissions. Integrating organic ligands with transition metals shows great potential for developing selective electrochemical CO2 reduction catalysts. Thiols covalently bonded to Au exhibit moiety-dependent catalytic characteristics: a 6-fold enhancement in yield and a 2-fold increase in selectivity for CO evolution, accompanied by suppression of the competing hydrogen evolution reaction (HER), through ligand-induced surface reconstruction at specific sites; and a 20% increase in selectivity with a 3-fold increase in yield for the energy-dense liquid product (HCOOH), achieved through ligand-facilitated proton-coupled electron transfer by leveraging the dissociation constant (pKa) of the ligand's functional moiety. Building on these insights into ligated Au electrodes, a composite catalyst integrating a proton-donating ligand on a silica substrate with strongly CO-binding Pd nanoparticles was fabricated and showed up to a 6-fold increase in selectivity and a 2-fold increase in yield for CH3OH production.
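    For reference (these standard half-reactions are not spelled out in the abstract, but they are the ones underlying the CO-evolution vs. HER selectivity comparison above):

```latex
\begin{align*}
\mathrm{CO_2} + 2\mathrm{H}^+ + 2e^- &\rightarrow \mathrm{CO} + \mathrm{H_2O}    && \text{(CO evolution)} \\
\mathrm{CO_2} + 2\mathrm{H}^+ + 2e^- &\rightarrow \mathrm{HCOOH}                 && \text{(formic acid)} \\
\mathrm{CO_2} + 6\mathrm{H}^+ + 6e^- &\rightarrow \mathrm{CH_3OH} + \mathrm{H_2O} && \text{(methanol)} \\
2\mathrm{H}^+ + 2e^- &\rightarrow \mathrm{H_2}                                   && \text{(competing HER)}
\end{align*}
```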

    EVA-CLIP: Improved Training Techniques for CLIP at Scale

    Contrastive language-image pre-training, CLIP for short, has gained increasing attention for its potential in various scenarios. In this paper, we propose EVA-CLIP, a series of models that significantly improve the efficiency and effectiveness of CLIP training. Our approach incorporates new techniques for representation learning, optimization, and augmentation, enabling EVA-CLIP to achieve superior performance compared to previous CLIP models with the same number of parameters but significantly smaller training costs. Notably, our largest 5.0B-parameter EVA-02-CLIP-E/14+ with only 9 billion seen samples achieves 82.0 zero-shot top-1 accuracy on ImageNet-1K val. A smaller EVA-02-CLIP-L/14+ with only 430 million parameters and 6 billion seen samples achieves 80.4 zero-shot top-1 accuracy on ImageNet-1K val. To facilitate open access and open research, we release the complete suite of EVA-CLIP to the community at https://github.com/baaivision/EVA/tree/master/EVA-CLIP.
    Comment: To Rei and the moon. Code & Models: https://github.com/baaivision/EVA/tree/master/EVA-CLI
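    As background on the training objective (a minimal sketch of the standard CLIP-style symmetric contrastive loss, assuming PyTorch; EVA-CLIP's specific representation-learning, optimization, and augmentation improvements are described in the paper and repository, not reproduced here):

```python
import torch
import torch.nn.functional as F

def clip_contrastive_loss(image_features: torch.Tensor,
                          text_features: torch.Tensor,
                          logit_scale: torch.Tensor) -> torch.Tensor:
    """Symmetric InfoNCE loss over a batch of paired image/text embeddings."""
    # Cosine similarity: L2-normalize both sets of embeddings.
    image_features = F.normalize(image_features, dim=-1)
    text_features = F.normalize(text_features, dim=-1)

    # (batch, batch) similarity matrix scaled by a learnable temperature.
    logits_per_image = logit_scale * image_features @ text_features.t()
    logits_per_text = logits_per_image.t()

    # The i-th image is paired with the i-th text, so targets are the diagonal.
    targets = torch.arange(image_features.shape[0], device=image_features.device)

    loss_i = F.cross_entropy(logits_per_image, targets)
    loss_t = F.cross_entropy(logits_per_text, targets)
    return (loss_i + loss_t) / 2
```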

    Delving into Semantic Scale Imbalance

    Model bias triggered by long-tailed data has been widely studied. However, measures based on the number of samples cannot explain three phenomena simultaneously: (1) Given enough data, the classification performance gain from additional samples is marginal. (2) When data are insufficient, classification performance decays precipitously as the number of training samples decreases. (3) Models trained on sample-balanced datasets still exhibit different biases for different classes. In this work, we define and quantify the semantic scale of classes, which is used to measure the feature diversity of classes. It is exciting to find experimentally that semantic scale exhibits a marginal effect, which perfectly describes the first two phenomena. Further, we propose a quantitative measurement of semantic scale imbalance, which accurately reflects model bias on multiple datasets, even on sample-balanced data, revealing a novel perspective for the study of class imbalance. Due to the prevalence of semantic scale imbalance, we propose semantic-scale-balanced learning, including a general loss improvement scheme and a dynamic re-weighting training framework that overcomes the challenge of calculating semantic scales in real time during iterations. Comprehensive experiments show that dynamic semantic-scale-balanced learning consistently enables the model to perform superiorly on large-scale long-tailed and non-long-tailed natural and medical datasets, which is a good starting point for mitigating this prevalent but unnoticed model bias.
    Comment: 47 pages, 26 figures, 12 tables, published as a conference paper at ICLR 202
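    To make "feature diversity of a class" concrete, a hypothetical proxy (an illustrative assumption, not necessarily the paper's exact definition of semantic scale) is the log-determinant volume spanned by that class's embeddings:

```python
import torch

def feature_diversity(features: torch.Tensor, eps: float = 1e-2) -> torch.Tensor:
    """Log-det volume proxy for the diversity of one class's feature embeddings.

    features: (n_samples, dim) tensor of embeddings belonging to a single class.
    Returns a scalar that grows as the embeddings span a larger volume.
    """
    # Center the features and form the (dim, dim) covariance matrix.
    centered = features - features.mean(dim=0, keepdim=True)
    cov = centered.t() @ centered / centered.shape[0]
    dim = cov.shape[0]
    # logdet(I + cov/eps) stays finite even when cov is rank-deficient.
    return torch.logdet(torch.eye(dim, device=features.device) + cov / eps)
```

    Under a measure of this kind, classes with low feature diversity would receive larger weights in a re-weighting scheme of the sort the abstract describes.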

    An overview of biological research on hypoxia-inducible factors (HIFs)

    Hypoxia-inducible factors (HIFs), a family of transcription factors involved in the cellular response to hypoxia, are key regulators of an organism's response to low oxygen. A large number of studies have shown that HIFs are closely related to angiogenesis, erythropoiesis, cell metabolism, and autophagy, as well as to the occurrence and development of tumours. Therefore, further study of HIFs is of great significance for understanding and treating tumours and other related diseases. This paper summarises the structure, oxygen-dependent degradation mechanism, non-oxygen-dependent degradation mechanism, transcriptional activation mechanism, relevant signalling pathways, and inhibitors of HIFs, in order to provide new clues for the treatment of tumours, vascular diseases, and other related conditions.