
    Character-Aware Neural Language Models

    We describe a simple neural language model that relies only on character-level inputs. Predictions are still made at the word level. Our model employs a convolutional neural network (CNN) and a highway network over characters, whose output is given to a long short-term memory (LSTM) recurrent neural network language model (RNN-LM). On the English Penn Treebank the model is on par with the existing state of the art despite having 60% fewer parameters. On languages with rich morphology (Arabic, Czech, French, German, Spanish, Russian), the model outperforms word-level/morpheme-level LSTM baselines, again with fewer parameters. The results suggest that on many languages, character inputs are sufficient for language modeling. Analysis of word representations obtained from the character-composition part of the model reveals that it is able to encode, from characters only, both semantic and orthographic information. Comment: AAAI 201
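    The abstract above describes a pipeline in which per-word character embeddings pass through a CNN with max-over-time pooling and then a highway network before reaching the LSTM. A minimal numpy sketch of those two stages is below; it is an illustration of the general technique, not the paper's implementation, and all dimensions, initializations, and function names are assumptions.

    ```python
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def char_cnn_features(char_embs, conv_filters, width):
        """1-D convolution over a word's character embeddings, then max-over-time pooling.
        char_embs: (word_len, emb_dim); conv_filters: (n_filters, width * emb_dim)."""
        word_len, emb_dim = char_embs.shape
        # Slide a window of `width` characters and flatten each window into one vector.
        windows = np.stack([char_embs[i:i + width].ravel()
                            for i in range(word_len - width + 1)])
        acts = np.tanh(windows @ conv_filters.T)   # (positions, n_filters)
        return acts.max(axis=0)                    # max over time -> (n_filters,)

    class HighwayLayer:
        """One highway layer: y = t * relu(W_H x + b_H) + (1 - t) * x,
        where t = sigmoid(W_T x + b_T) is the transform gate."""
        def __init__(self, dim, seed=0):
            rng = np.random.default_rng(seed)
            self.W_H = rng.normal(scale=0.1, size=(dim, dim))
            self.b_H = np.zeros(dim)
            self.W_T = rng.normal(scale=0.1, size=(dim, dim))
            # Negative gate bias makes the layer start close to the identity (carry) path.
            self.b_T = np.full(dim, -2.0)

        def forward(self, x):
            t = sigmoid(x @ self.W_T + self.b_T)          # transform gate in (0, 1)
            h = np.maximum(0.0, x @ self.W_H + self.b_H)  # candidate transform
            return t * h + (1.0 - t) * x                  # gated mix of transform and carry

    # Usage (illustrative sizes): encode a 7-character word into one vector.
    rng = np.random.default_rng(1)
    embs = rng.normal(size=(7, 15))         # 15-dim character embeddings
    filters = rng.normal(size=(25, 3 * 15)) # 25 filters of width 3
    feats = char_cnn_features(embs, filters, width=3)
    word_vec = HighwayLayer(25).forward(feats)  # this vector would feed the LSTM
    ```

    The carry connection `(1 - t) * x` is what lets the highway network pass character n-gram features through unchanged when transformation is unhelpful, which requires the input and output dimensions to match.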

    Thermostructural applications of heat pipes

    The feasibility of integrating heat pipes into high-temperature structures to reduce local hot-spot temperatures was evaluated for a variety of hypersonic aerospace vehicles. From an initial list of twenty-two potential applications, the single-stage-to-orbit wing leading edge showed the greatest promise and was selected for preliminary design of an integrated heat-pipe thermostructural system. The design consisted of a Hastelloy X assembly with sodium heat-pipe passages aligned normal to the wing leading edge. A D-shaped heat-pipe cross section was determined to be optimal from the standpoint of structural weight.