
    Translating Phrases in Neural Machine Translation

    Phrases play an important role in natural language understanding and machine translation (Sag et al., 2002; Villavicencio et al., 2005). However, it is difficult to integrate them into current neural machine translation (NMT), which reads and generates sentences word by word. In this work, we propose a method to translate phrases in NMT by integrating a phrase memory, which stores target phrases from a phrase-based statistical machine translation (SMT) system, into the encoder-decoder architecture of NMT. At each decoding step, the phrase memory is first rewritten by the SMT model, which dynamically generates relevant target phrases using contextual information provided by the NMT model. The proposed model then reads the phrase memory and estimates probabilities for all phrases it contains. If phrase generation is chosen, the NMT decoder selects an appropriate phrase from the memory, performs the phrase translation, and updates its decoding state by consuming the words of the selected phrase. Otherwise, the NMT decoder generates a word from the vocabulary as a standard NMT decoder does. Experimental results on Chinese-to-English translation show that the proposed model achieves significant improvements over the baseline on various test sets. Comment: Accepted by EMNLP 201
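
    The abstract describes a two-branch decoding step: either emit one word from the vocabulary or copy a whole target phrase from the SMT-provided phrase memory and consume its words. The sketch below illustrates only this control flow; the scoring callables (score_vocab, score_phrases), the gate (balance_gate), the state updater (update_state) and the 0.5 threshold are hypothetical placeholders, not the paper's actual model.

```python
# Minimal sketch of the word-vs-phrase decoding step described in the abstract.
# All model components are passed in as callables; their names and the fixed 0.5
# gate threshold are illustrative assumptions, not the paper's implementation.
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def decode_step(state, vocab, phrase_memory,
                score_vocab, score_phrases, balance_gate, update_state):
    """Return the tokens emitted at this step and the updated decoder state."""
    gate = balance_gate(state)                       # P(do phrase generation)

    if gate > 0.5 and phrase_memory:
        # Phrase mode: score every phrase in the memory, pick the best one,
        # and advance the decoder state over each of its words.
        p_phrase = softmax(score_phrases(state, phrase_memory))
        phrase = phrase_memory[int(np.argmax(p_phrase))]
        for word in phrase:
            state = update_state(state, word)
        return list(phrase), state

    # Word mode: behave like a standard NMT decoder and emit a single word.
    p_word = softmax(score_vocab(state))             # distribution over the vocabulary
    word = vocab[int(np.argmax(p_word))]
    return [word], update_state(state, word)
```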

    Identity-Seeking Self-Supervised Representation Learning for Generalizable Person Re-identification

    This paper aims to learn a domain-generalizable (DG) person re-identification (ReID) representation from large-scale videos without any annotation. Prior DG ReID methods employ limited labeled data for training due to the high cost of annotation, which restricts further advances. To overcome the barriers of data and annotation, we propose to utilize large-scale unsupervised data for training. The key issue is how to mine identity information. To this end, we propose an Identity-seeking Self-supervised Representation learning (ISR) method. ISR constructs positive pairs from inter-frame images by modeling the instance association as a maximum-weight bipartite matching problem. A reliability-guided contrastive loss is further presented to suppress the adverse impact of noisy positive pairs, ensuring that reliable positive pairs dominate the learning process. The training cost of ISR scales approximately linearly with the data size, making it feasible to utilize large-scale data for training. The learned representation exhibits superior generalization ability: without human annotation or fine-tuning, ISR achieves 87.0% Rank-1 on Market-1501 and 56.4% Rank-1 on MSMT17, outperforming the best supervised domain-generalizable method by 5.0% and 19.5%, respectively. In the pre-training-then-fine-tuning scenario, ISR achieves state-of-the-art performance, with 88.4% Rank-1 on MSMT17. The code is at https://github.com/dcp15/ISR_ICCV2023_Oral. Comment: ICCV 2023 Oral
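
    A rough sketch of the pair-mining idea described above: instances from two frames are associated by solving a maximum-weight bipartite matching on their feature similarities, and each matched pair receives a reliability weight that scales a contrastive (InfoNCE-style) loss. Using cosine similarity as the reliability proxy, and this particular loss form, are illustrative assumptions rather than the authors' released implementation; only the matching formulation comes from the abstract.

```python
# Positive-pair mining via maximum-weight bipartite matching between the instance
# features of two frames, plus a reliability-weighted contrastive loss.
# Illustrative reading of the abstract, not the released ISR code.
import numpy as np
from scipy.optimize import linear_sum_assignment

def mine_positive_pairs(feats_a, feats_b):
    """feats_a: (Na, d), feats_b: (Nb, d); rows assumed L2-normalized."""
    sim = feats_a @ feats_b.T                               # cosine similarities
    rows, cols = linear_sum_assignment(sim, maximize=True)  # max-weight matching
    reliability = sim[rows, cols]                           # confidence proxy per pair
    return list(zip(rows, cols)), reliability

def weighted_contrastive_loss(feats_a, feats_b, pairs, reliability, tau=0.07):
    """InfoNCE over mined pairs, down-weighting low-reliability (likely noisy) ones."""
    logits = feats_a @ feats_b.T / tau
    losses = []
    for (i, j), w in zip(pairs, reliability):
        log_prob = logits[i, j] - np.log(np.exp(logits[i]).sum())
        losses.append(-w * log_prob)
    return float(np.mean(losses))
```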

    The Authenticity and Documentary Value of the Stone Inscriptions of Shangu's Running-Script and Dongpo's Cursive-Script Versions of the Ci 'Chibi Huaigu' (Recalling Antiquity at Red Cliff)

    Huang Tingjian's running-script copy of the Red Cliff ci, written in the dingmao year, the second year of the Yuanyou era of Emperor Zhezong of the Song (1087), has been transmitted since the Southern Song both as the original autograph and as stone-inscription rubbings, with an orderly and clearly traceable line of transmission. The extant stone inscription of Shangu's running-script Red Cliff ci and its rubbings, even if not necessarily in Shangu's own hand, constitute a credible text. The transmission and collection history of Dongpo's cursive-script Red Cliff ci cannot be traced as fully as that of Shangu's running-script version; it is known only that a stone inscription already existed before the Jiajing era of the Ming. Cross-checked against Shangu's running-script version, the text of Dongpo's cursive-script Red Cliff ci is credible: even if the characters are not necessarily Su Shi's authentic handwriting, the content should be Su Shi's own composition. The two Red Cliff ci written by Shangu and Dongpo are the earliest extant texts closest to Dongpo's original and can be used to adjudicate the variant readings of the later editions. In particular, “談笑間、强虜灰飛煙滅” should read “笑談間、檣櫓灰飛煙滅”; “强虜” (“the fierce caitiffs”) is an alteration made by Southern Song readers as an allusion to the Jin (Jurchen).

    Capacity sharing, product differentiation and welfare

    This article constructs a duopoly market with product differentiation and analyses profits, consumer surplus and social welfare under three conditions: (a) both enterprises have sufficient capacity; (b) one enterprise has insufficient capacity and the other has excess capacity that is not shared; and (c) one enterprise has insufficient capacity and the other has excess capacity that it shares. Comparing the three cases reveals the conditions under which capacity sharing can be implemented, its effects, and the role of product differentiation. The results show that capacity sharing helps increase producer surplus and social welfare. Capacity constraints reduce social welfare, but this loss can be remedied by capacity sharing. Capacity sharing is realised only when both enterprises profit from it, so the charge for capacity sharing should be neither too high nor too low. Product differentiation affects output, profit, consumer surplus and social welfare, and these effects are conditioned by the presence of capacity constraints and capacity sharing.
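
    An illustrative numeric sketch, not the paper's model: a textbook linear-demand Cournot duopoly with a product-differentiation parameter, comparing the three cases in the abstract. All functional forms and numbers (a, g, the capacity k, the sharing fee) are assumptions chosen only to mirror the qualitative comparison; the fee is treated as a pure transfer between the firms.

```python
# Differentiated Cournot duopoly with inverse demand p_i = a - q_i - g*q_j and
# zero marginal cost. Case (a): ample capacity; (b): firm 1 capped at k while
# firm 2 best-responds; (c): firm 2 shares spare capacity for a fee so that the
# unconstrained quantities are restored. Purely illustrative parameter choices.
a, g = 10.0, 0.5            # demand intercept and product-differentiation parameter

def outcomes(q1, q2, fee=0.0):
    """Profits, consumer surplus and welfare; the fee is paid by firm 1 to firm 2."""
    p1, p2 = a - q1 - g * q2, a - q2 - g * q1
    pi1, pi2 = p1 * q1 - fee, p2 * q2 + fee
    utility = a * (q1 + q2) - 0.5 * (q1**2 + q2**2 + 2 * g * q1 * q2)
    cs = utility - p1 * q1 - p2 * q2
    return {"pi1": round(pi1, 2), "pi2": round(pi2, 2),
            "cs": round(cs, 2), "welfare": round(pi1 + pi2 + cs, 2)}

q_star = a / (2 + g)        # symmetric Cournot quantity when capacity is ample
k = 1.5                     # firm 1's capacity, well below q_star
q2_br = (a - g * k) / 2     # firm 2's best reply when firm 1 is capped at k
fee = 6.0                   # sharing charge inside the band where both firms gain

print("(a) ample capacity:    ", outcomes(q_star, q_star))
print("(b) binding constraint:", outcomes(k, q2_br))
print("(c) capacity sharing:  ", outcomes(q_star, q_star, fee=fee))
```

    With these particular numbers, both firms, consumers and total welfare are better off under (c) than under (b); a fee much below about 5.4 or above about 6.7 would leave one firm worse off than in (b), echoing the "neither too high nor too low" condition, though the exact band is specific to these assumed parameters.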