8,820 research outputs found

    Separability, Locality, and Higher Dimensions in Quantum Mechanics

    *A shortened version of this paper will appear in Current Controversies in Philosophy of Science, Dasgupta and Weslake, eds., Routledge.* This paper describes the case that can be made for a high-dimensional ontology in quantum mechanics, based on the virtues of avoiding both nonseparability and nonlocality.

    Photometric observations of recent comets

    Infrared observations of comets Bennett, Kohoutek, Bradfield, and Encke are analyzed, with emphasis on the detection of the silicate emission feature. Results are summarized.

    RETURNN as a Generic Flexible Neural Toolkit with Application to Translation and Speech Recognition

    We demonstrate the fast training and decoding speed of RETURNN for attention models in translation, owing to fast CUDA LSTM kernels and a fast pure TensorFlow beam search decoder. We show that a layer-wise pretraining scheme for recurrent attention models gives over 1% absolute BLEU improvement and allows training deeper recurrent encoder networks. Promising preliminary results on maximum expected BLEU training are presented. We are able to train state-of-the-art models for translation and end-to-end models for speech recognition, and we show results on WMT 2017 and Switchboard. The flexibility of RETURNN allows a fast research feedback loop for experimenting with alternative architectures, and its generality allows it to be used in a wide range of applications. Comment: accepted as a demo paper at ACL 2018.
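    A rough sketch of the layer-wise pretraining idea is given below: the encoder is grown one layer at a time, and the already-trained lower layers are reused at each stage before training resumes at the new depth. This is a minimal, framework-free Python illustration, not RETURNN's actual implementation; `build_encoder`, `train`, and `layerwise_pretrain` are hypothetical names.

```python
# Hypothetical sketch of layer-wise pretraining for a deep recurrent encoder:
# train a shallow encoder first, then grow it one layer at a time, reusing the
# already-trained lower layers. Names are illustrative, not RETURNN's API.

def build_encoder(num_layers, pretrained_layers=None):
    """Return a stack of encoder 'layers', reusing pretrained ones where available."""
    layers = list(pretrained_layers or [])
    while len(layers) < num_layers:
        layers.append({"id": len(layers), "weights": "randomly initialized"})
    return layers

def train(encoder, epochs):
    """Stand-in for the real training loop (forward/backward passes over the data)."""
    for layer in encoder:
        layer["weights"] = f"trained ({epochs} epochs at depth {len(encoder)})"
    return encoder

def layerwise_pretrain(max_depth, epochs_per_stage):
    encoder = None
    for depth in range(1, max_depth + 1):
        # Grow the encoder by one layer, keeping the trained lower layers.
        encoder = build_encoder(depth, pretrained_layers=encoder)
        encoder = train(encoder, epochs_per_stage)
    return encoder

deep_encoder = layerwise_pretrain(max_depth=6, epochs_per_stage=2)
print(f"final encoder depth: {len(deep_encoder)}")
```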

    Robust Beam Search for Encoder-Decoder Attention Based Speech Recognition without Length Bias

    As a popular modeling approach for end-to-end speech recognition, attention-based encoder-decoder models are known to suffer from length bias and the corresponding beam problem. Various approaches have been applied within simple beam search to ease the problem; most are heuristic-based and require considerable tuning. We show that such heuristics are not proper modeling refinements, and that they result in severe performance degradation as the beam size grows large. We propose a novel beam search derived from reinterpreting the sequence posterior with explicit length modeling. Applying the reinterpreted probability together with beam pruning yields a final probability that constitutes a robust model modification, allowing reliable comparison among output sequences of different lengths. Experimental verification on the LibriSpeech corpus shows that the proposed approach solves the length bias problem without heuristics or additional tuning effort. It provides robust decision making and consistently good performance under both small and very large beam sizes. Compared with the best results of the heuristic baseline, the proposed approach achieves the same WER on the 'clean' sets and a 4% relative improvement on the 'other' sets. We also show that it is more efficient, thanks to an additionally derived early-stopping criterion. Comment: accepted at INTERSPEECH 2020.
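    To make the length-modeling idea concrete, here is a toy Python sketch of beam search in which finished hypotheses are rescored with an explicit length model (a simple Poisson prior over output length), so that sequences of different lengths become directly comparable. This is an illustrative sketch under assumed stand-in distributions, not the paper's exact reinterpretation of the sequence posterior; `log_p_next` and `log_p_length` are hypothetical.

```python
import math

# Toy beam search with an explicit length model, in the spirit of the proposed
# reinterpretation (NOT the paper's exact derivation). The next-token
# distribution and length prior below are hypothetical stand-ins.

VOCAB = ["a", "b", "</s>"]

def log_p_next(prefix):
    """Hypothetical label posterior p(token | prefix); uniform for illustration."""
    return {tok: math.log(1.0 / len(VOCAB)) for tok in VOCAB}

def log_p_length(n, expected_len=3.0):
    """Explicit length model: log pmf of a Poisson prior over output length."""
    return n * math.log(expected_len) - expected_len - math.lgamma(n + 1)

def beam_search(beam_size=4, max_len=10):
    beams = [([], 0.0)]            # (tokens, accumulated label log-prob)
    finished = []
    for _ in range(max_len):
        candidates = []
        for tokens, score in beams:
            for tok, lp in log_p_next(tokens).items():
                candidates.append((tokens + [tok], score + lp))
        beams = []
        for tokens, score in sorted(candidates, key=lambda c: -c[1])[:beam_size]:
            if tokens[-1] == "</s>":
                # Rescore finished hypotheses with the length model, making
                # output sequences of different lengths directly comparable.
                finished.append((tokens, score + log_p_length(len(tokens))))
            else:
                beams.append((tokens, score))
        if not beams:
            break
    return max(finished, key=lambda f: f[1]) if finished else None

print(beam_search())
```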

    Beyond neighborhood revitalization

    After community groups rescue a troubled neighborhood, the area may attract many newcomers—and become expensive. One community shows how to help activists retain the diversity of the towns they work so hard to revitalize.
    Subjects: Community development - Massachusetts; Housing - Prices; Housing - Boston