
    Theory of optical imaging beyond the diffraction limit with a far-field superlens

    Recent theoretical and experimental studies have shown that imaging with resolution well beyond the diffraction limit can be achieved with so-called superlenses. Images formed by such superlenses, however, exist only in the near field, a fraction of a wavelength away from the lens. In this paper, we propose a far-field superlens (FSL) device composed of a planar superlens with periodic corrugation. We show theoretically that when an object is placed in close proximity to such an FSL, a unique image can be formed in the far field. As an example, we demonstrate numerically that images of 40 nm lines with a 30 nm gap can be obtained from far-field data with a properly designed FSL working at a 376 nm wavelength. Comment: 6 pages, 3 figures
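    To put these numbers in context, here is a back-of-the-envelope comparison with the conventional diffraction limit (a sketch only; the idealized numerical aperture NA = 1 is an assumption for illustration, not a value from the abstract):

        % Abbe diffraction limit at the quoted 376 nm working wavelength,
        % assuming an idealized free-space numerical aperture NA = 1.
        \[
          \Delta x_{\min} \approx \frac{\lambda}{2\,\mathrm{NA}}
                          = \frac{376\ \mathrm{nm}}{2 \times 1}
                          = 188\ \mathrm{nm}
        \]

    Under that assumption, the 40 nm lines and 30 nm gap in the numerical example are a factor of roughly four to six smaller than what conventional far-field optics at the same wavelength could resolve.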

    Direct photoluminescence probing of ferromagnetism in monolayer two-dimensional CrBr3

    Atomically thin magnets are a key element for building spintronics based on two-dimensional materials. The surface nature of two-dimensional ferromagnets opens up opportunities to improve device performance efficiently. Here, we report intrinsic ferromagnetism in atomically thin monolayer CrBr3, directly probed by polarization-resolved magneto-photoluminescence. The spontaneous magnetization persists in monolayer CrBr3 with a Curie temperature of 34 K. The development of magnons by thermal excitation is in line with spin-wave theory. We attribute the layer-number-dependent hysteresis loops in thick layers to magnetic domain structures. As a monolayer material that is stable in air, CrBr3 provides a convenient platform for fundamental physics and advances potential applications of two-dimensional ferromagnetism. Comment: 27 pages, 10 figures
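    As a minimal sketch of the spin-wave picture invoked above (the gapped quadratic magnon dispersion, the gap \Delta, and the stiffness D are generic assumptions for illustration, not values from the paper): thermally excited magnons reduce the spontaneous magnetization in proportion to the total magnon population, which in two dimensions evaluates to

        \[
          \frac{\Delta M(T)}{M(0)} \;\propto\;
          \int \frac{d^{2}k}{(2\pi)^{2}}\,
          \frac{1}{e^{\varepsilon_{k}/k_{B}T} - 1}
          \;=\;
          \frac{k_{B}T}{4\pi D}\,
          \ln\!\frac{1}{1 - e^{-\Delta/k_{B}T}},
          \qquad
          \varepsilon_{k} = \Delta + D k^{2}
        \]

    Here the gap \Delta (set by magnetic anisotropy) is what allows long-range order to survive at finite temperature in two dimensions; fitting the temperature dependence of the polarization-resolved photoluminescence to a form of this kind is one way the reported magnon development can be compared with spin-wave theory.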

    Diet Code Is Healthy: Simplifying Programs for Pre-trained Models of Code

    Pre-trained code representation models such as CodeBERT have demonstrated superior performance in a variety of software engineering tasks, yet they are computationally heavy, with complexity that grows quadratically with the length of the input sequence. Our empirical analysis of CodeBERT's attention reveals that CodeBERT pays more attention to certain types of tokens and statements, such as keywords and data-relevant statements. Based on these findings, we propose DietCode, which aims at lightweight use of large pre-trained models for source code. DietCode simplifies the input program of CodeBERT with three strategies, namely word dropout, frequency filtering, and an attention-based strategy that selects the statements and tokens receiving the most attention weights during pre-training. This yields a substantial reduction in computational cost without hampering model performance. Experimental results on two downstream tasks show that DietCodeBERT provides comparable results to CodeBERT with 40% less computational cost in fine-tuning and testing. Comment: Accepted to be published in ESEC/FSE 2022
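    The following is an illustrative sketch (not DietCode's actual implementation; the function name, the keep_ratio parameter, and the scoring details are hypothetical) of the attention-based strategy described above, keeping the tokens that receive the largest aggregate attention weight from a transformer encoder such as CodeBERT:

        import torch

        def select_tokens_by_attention(tokens, attentions, keep_ratio=0.6):
            """Keep the fraction of input tokens that receive the most attention.

            tokens     : list[str], the tokenized input program
            attentions : tuple of per-layer attention tensors, each of shape
                         (batch, num_heads, seq_len, seq_len), as returned by a
                         HuggingFace encoder called with output_attentions=True
            keep_ratio : fraction of tokens to retain (hypothetical default)
            """
            # Average attention over layers, heads, and query positions, so each
            # token gets a single score for how much attention it receives.
            att = torch.stack(attentions)                 # (layers, batch, heads, L, L)
            scores = att.mean(dim=(0, 2, 3)).squeeze(0)   # (L,) for batch size 1

            # Keep the top-scoring tokens, preserving their original order.
            k = max(1, int(len(tokens) * keep_ratio))
            keep = torch.topk(scores[: len(tokens)], k).indices.sort().values
            return [tokens[i] for i in keep.tolist()]

    The shortened token sequence would then be fed back to the model, so the quadratic attention cost shrinks roughly with the square of keep_ratio, at the price of discarding low-attention context.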