
    Infrared perfect absorber based on nanowire metamaterial cavities

    An infrared perfect absorber based on an array of gold nanowire metamaterial cavities on a gold ground plane is designed. The metamaterial, made of gold nanowires embedded in an alumina host, exhibits an effective permittivity with strong anisotropy, which supports cavity resonant modes of both electric dipole and magnetic dipole character. The impedance of the cavity modes matches that of the incident plane wave in free space, leading to nearly perfect light absorption. The incident optical energy is efficiently converted into heat, so the local temperature of the absorber increases. Simulation results show that the designed metamaterial absorber is polarization-insensitive and nearly omnidirectional with respect to the incident angle. Comment: 3 pages, 4 figures
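    The impedance-matching condition behind the perfect absorption can be sketched numerically. The snippet below is a minimal illustration, not taken from the paper's simulations: at normal incidence, a ground-plane-backed surface with effective impedance Z facing free space (impedance Z0) reflects with coefficient r = (Z - Z0)/(Z + Z0), and since the gold ground plane blocks transmission, the absorbed fraction is A = 1 - |r|^2. The `absorption` helper and the sample impedance values are illustrative assumptions.

```python
# Minimal sketch of impedance-matched absorption for an opaque-backed surface.
# Assumption: normal incidence, transmission blocked by the ground plane.

Z0 = 376.73  # impedance of free space, in ohms

def absorption(Z, Z0=376.73):
    """Absorbed fraction A = 1 - |r|^2, with r = (Z - Z0)/(Z + Z0)."""
    r = (Z - Z0) / (Z + Z0)
    return 1.0 - abs(r) ** 2

# A cavity mode whose effective impedance matches free space absorbs fully:
print(absorption(376.73))  # -> 1.0
# A mismatched impedance reflects part of the wave:
print(round(absorption(188.4), 3))
```

    Perfect absorption thus corresponds to tuning the cavity geometry until the effective impedance equals Z0; the paper's anisotropic nanowire medium provides the degrees of freedom to reach that condition.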

    Learning Semantic Representations for the Phrase Translation Model

    This paper presents a novel semantic-based phrase translation model. A pair of source and target phrases is projected into continuous-valued vector representations in a low-dimensional latent semantic space, where their translation score is computed from the distance between the pair in this new space. The projection is performed by a multi-layer neural network whose weights are learned on parallel training data. The learning directly optimizes the quality of end-to-end machine translation results. Experimental evaluation has been performed on two Europarl translation tasks, English-French and German-English. The results show that the new semantic-based phrase translation model significantly improves the performance of a state-of-the-art phrase-based statistical machine translation system, leading to a gain of 0.7-1.0 BLEU points.
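    The scoring mechanism described above can be sketched as follows. This is a toy illustration of the architecture, not the authors' implementation: each phrase is fed as a bag-of-words vector through a small multi-layer network into a shared latent space, and the translation score is the negative distance between the two projections. The vocabularies, dimensions, and random weights are placeholder assumptions; in the paper the weights are learned on parallel data to optimize end-to-end translation quality.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabularies (assumptions for illustration only).
VOCAB_SRC = {"the": 0, "red": 1, "house": 2}
VOCAB_TGT = {"la": 0, "maison": 1, "rouge": 2}
DIM_HID, DIM_LAT = 8, 4

# One projection network per language; weights are random placeholders
# standing in for parameters learned on parallel training data.
W1_s = rng.normal(size=(DIM_HID, len(VOCAB_SRC)))
W2_s = rng.normal(size=(DIM_LAT, DIM_HID))
W1_t = rng.normal(size=(DIM_HID, len(VOCAB_TGT)))
W2_t = rng.normal(size=(DIM_LAT, DIM_HID))

def embed(phrase, vocab, W1, W2):
    """Project a phrase (bag of words) into the latent semantic space."""
    x = np.zeros(len(vocab))
    for w in phrase.split():
        x[vocab[w]] += 1.0
    h = np.tanh(W1 @ x)   # hidden layer
    return W2 @ h         # latent vector

def translation_score(src, tgt):
    """Score = negative Euclidean distance between the two projections."""
    ys = embed(src, VOCAB_SRC, W1_s, W2_s)
    yt = embed(tgt, VOCAB_TGT, W1_t, W2_t)
    return -float(np.linalg.norm(ys - yt))

print(translation_score("the red house", "la maison rouge"))
```

    With trained weights, phrase pairs that are translations of each other land close together in the latent space and therefore receive higher (less negative) scores.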

    Tensor Product Generation Networks for Deep NLP Modeling

    We present a new approach to the design of deep networks for natural language processing (NLP), based on the general technique of Tensor Product Representations (TPRs) for encoding and processing symbol structures in distributed neural networks. A network architecture, the Tensor Product Generation Network (TPGN), is proposed which is capable in principle of carrying out TPR computation, but which uses unconstrained deep learning to design its internal representations. Instantiated in a model for image-caption generation, TPGN outperforms LSTM baselines when evaluated on the COCO dataset. The TPR-capable structure enables interpretation of internal representations and operations, which prove to contain considerable grammatical content. Our caption-generation model can be interpreted as generating sequences of grammatical categories and retrieving words by their categories from a plan encoded as a distributed representation.
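    The TPR machinery underlying TPGN can be sketched in a few lines. In a Tensor Product Representation, each symbol (filler) is bound to a structural position (role) via the outer product of their vectors, and the whole structure is the sum of these bindings; with orthonormal role vectors, multiplying the structure tensor by a role vector exactly recovers the bound filler. The filler and role vectors below are random illustrative choices, not the representations TPGN learns.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 5  # vector dimensionality (illustrative)

# Orthonormal role vectors: QR decomposition of a random matrix gives an
# orthonormal basis, so unbinding recovers fillers exactly.
roles = np.linalg.qr(rng.normal(size=(d, d)))[0]
fillers = {"cat": rng.normal(size=d), "sat": rng.normal(size=d)}

# Bind each filler to a positional role via the outer product, then
# superpose the bindings into one structure tensor.
T = (np.outer(fillers["cat"], roles[:, 0])
     + np.outer(fillers["sat"], roles[:, 1]))

# Unbind: contracting the tensor with a role vector retrieves its filler.
recovered = T @ roles[:, 0]
print(np.allclose(recovered, fillers["cat"]))  # -> True
```

    TPGN does not hard-wire this scheme; it gives a deep network the capacity to discover TPR-like bindings, which is what makes its internal representations interpretable in terms of grammatical roles and word fillers.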