6,598 research outputs found

    Quasi-Whittaker modules for the Schrödinger algebra

    In this paper, we construct a new class of modules for the Schrödinger algebra \mS, called quasi-Whittaker modules. Unlike in \cite{[ZC]}, a quasi-Whittaker module is induced not from the Borel subalgebra of the Schrödinger algebra associated with the triangular decomposition, but from its Heisenberg subalgebra \mH. We prove that, for a simple \mS-module V, V is a quasi-Whittaker module if and only if V is a locally finite \mH-module. Furthermore, we classify the simple quasi-Whittaker modules by certain elements of U(\mS) that act like central elements, together with their quasi-Whittaker vectors. Finally, we characterize arbitrary quasi-Whittaker modules. Comment: 17 pages
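    For orientation, a standard presentation of the centrally extended Schrödinger algebra in (1+1) dimensions, which makes the Heisenberg subalgebra \mH used above explicit, is sketched below; sign and normalization conventions vary between references, so this is one common choice rather than necessarily the paper's.

        % basis: sl(2)-triple {e, h, f} and Heisenberg part \mH = span{p, q, z}
        \[
        \begin{aligned}
          &[h,e]=2e, \quad [h,f]=-2f, \quad [e,f]=h,\\
          &[h,p]=p, \quad [h,q]=-q, \quad [e,q]=p, \quad [f,p]=q,\\
          &[p,q]=z, \quad [e,p]=[f,q]=0, \quad z \text{ central.}
        \end{aligned}
        \]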

    How have TV dramas legitimised China's rural neoliberal transformation agenda?

    The Chinese state is leading a neoliberal transformation in China's rural areas, and a growing number of rural-themed TV dramas choose to follow its agenda. However, it is not clear why the TV drama industry gets involved in this rural transformation process, or how much these dramas can help the state carry out its policies. This study aims to address these issues. Drawing on in-depth interviews with government officials, drama professionals and peasants in two villages, supplemented by analyses of relevant literature and archives, this research reveals what China's rural neoliberal transformation process looks like when it intersects with China's media marketisation process. It concludes that the Chinese state increasingly collaborates with the market for the interpenetration of political-economic interests, and thereby joins the global discussion on how neoliberalism, as a way of governing, works in different socio-political contexts.

    Theoretical Exploration on the Magnetic Properties of Ferromagnetic Metallic Glass: An Ising Model on Random Recursive Lattice

    Ferromagnetic Ising spins are modeled on a recursive lattice constructed from random-angled rhombus units with stochastic configurations, in order to study the magnetic properties of bulk Fe-based metallic glass. Placing the spins on this structural glass model represents the magnetic moments in the glassy metal, and the model is solved exactly by a recursive calculation technique. The magnetization of the amorphous Ising spins, i.e. of the glassy metallic magnet, is investigated on a theoretical basis. The results show that the glassy metallic magnet has a lower Curie temperature, weaker magnetization, and higher entropy compared to a regular ferromagnet in crystalline form. These findings can be understood from the randomness of the amorphous system and agree well with experimental observations by others. Comment: 11 pages, 5 figures
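    The recursive (fixed-point) calculation the abstract refers to can be illustrated on the simplest recursive lattice, a Bethe lattice with fixed coordination number. The Python sketch below is a generic cavity-field iteration and does not reproduce the paper's random-angled rhombus units; the parameters J, h, and z_coord are illustrative assumptions.

        import math

        def cavity_field(T, J=1.0, h=1e-6, z_coord=4, n_iter=2000):
            """Iterate the Bethe-lattice cavity recursion to a fixed point."""
            beta = 1.0 / T
            u = 0.1  # small initial cavity field
            for _ in range(n_iter):
                u = math.atanh(math.tanh(beta * J)
                               * math.tanh(beta * (h + (z_coord - 1) * u))) / beta
            return u

        def magnetization(T, J=1.0, h=1e-6, z_coord=4):
            """Local magnetization from the converged cavity field."""
            beta = 1.0 / T
            u = cavity_field(T, J, h, z_coord)
            return math.tanh(beta * (h + z_coord * u))

        # Below the Bethe-lattice Curie point T_c = 2J / ln(z/(z-2)) (about 2.885
        # for z = 4) a spontaneous magnetization survives; above it, m drops to ~0.
        for T in (1.5, 2.5, 3.5):
            print(T, round(magnetization(T), 4))

    On the regular recursive lattice this iteration gives the classic Curie point; the paper's point is that introducing structural randomness lowers the Curie temperature and weakens the magnetization relative to this crystalline reference.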

    Direct reconstruction of dynamical dark energy from observational Hubble parameter data

    Reconstructing the evolution history of the dark energy equation-of-state parameter w(z) directly from observational data is highly valuable in cosmology, since it contains substantial clues for understanding the nature of the accelerated expansion of the Universe. Many works have focused on reconstructing w(z) from Type Ia supernova data; however, only a few studies pay attention to Hubble parameter data. In the present work, we explore the merit of Hubble parameter data and attempt to reconstruct w(z) from them through the principal component analysis approach. We find that current Hubble parameter data perform well in reconstructing w(z), even though, compared to supernova data, they are scarcer and of lower quality. Both ΛCDM and evolving-w(z) models can be constrained to within 10% at redshifts z ≲ 1.5, and even to 5% at redshifts 0.1 ≲ z ≲ 1, by using simulated H(z) data of observational quality. Comment: 25 pages, 11 figures
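    As a rough illustration of the principal component analysis step (not the paper's actual pipeline), the Python sketch below takes toy binned estimates of w(z) with a synthetic Fisher matrix, diagonalizes the Fisher matrix, and keeps only the best-determined modes; all numbers, bin choices, and the fiducial model are invented for the example.

        import numpy as np

        rng = np.random.default_rng(0)
        nbin = 20
        z = np.linspace(0.05, 1.5, nbin)              # redshift bin centres (toy)
        w_fid = -1.0 * np.ones(nbin)                  # fiducial LambdaCDM value
        w_true = -1.0 + 0.2 * z                       # toy evolving w(z)

        # Synthetic Fisher matrix for the binned w values: high-z bins are more
        # poorly constrained, plus mild correlations between bins.
        sigma = 0.05 * (1.0 + z) ** 2
        F = np.diag(1.0 / sigma**2) + 5.0
        cov = np.linalg.inv(F)
        w_hat = rng.multivariate_normal(w_true, cov)  # noisy binned estimates

        # Principal components = eigenvectors of F; the largest eigenvalues
        # correspond to the best-measured modes of w(z).
        eigval, eigvec = np.linalg.eigh(F)
        order = np.argsort(eigval)[::-1]
        eigvec = eigvec[:, order]

        n_keep = 3                                    # keep well-determined modes only
        coeff = eigvec[:, :n_keep].T @ (w_hat - w_fid)
        w_rec = w_fid + eigvec[:, :n_keep] @ coeff    # filtered reconstruction
        print(np.round(w_rec, 3))

    In the actual analysis the Fisher matrix would be built from real or simulated H(z) measurements propagated through the expansion history, which is where the quoted 10% and 5% constraints originate.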

    Distilling Word Embeddings: An Encoding Approach

    Distilling knowledge from a well-trained cumbersome network into a small one has recently become a new research topic, as lightweight neural networks with high performance are in particular demand in various resource-restricted systems. This paper addresses the problem of distilling word embeddings for NLP tasks. We propose an encoding approach to distill task-specific knowledge from a set of high-dimensional embeddings, which can reduce model complexity by a large margin while retaining high accuracy, showing a good compromise between efficiency and performance. Experiments on two tasks reveal that distilling knowledge from cumbersome embeddings is better than directly training neural networks with small embeddings. Comment: Accepted by CIKM-16 as a short paper, and by the Representation Learning for Natural Language Processing (RL4NLP) Workshop @ACL-16 for presentation
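    A minimal sketch of such an encoding approach, assuming frozen high-dimensional teacher embeddings compressed into a small task-specific space through a learned linear encoding layer; the dimensions, names, and bag-of-words classifier below are illustrative choices, not the paper's exact architecture.

        import torch
        import torch.nn as nn

        class DistilledClassifier(nn.Module):
            def __init__(self, big_embeddings, small_dim=50, n_classes=2):
                super().__init__()
                vocab, big_dim = big_embeddings.shape
                # Frozen cumbersome embeddings from the well-trained source model.
                self.big = nn.Embedding.from_pretrained(big_embeddings, freeze=True)
                # Encoding layer that compresses them to a small task-specific space.
                self.encode = nn.Linear(big_dim, small_dim)
                self.classifier = nn.Linear(small_dim, n_classes)

            def forward(self, token_ids):
                small = torch.tanh(self.encode(self.big(token_ids)))  # distilled embeddings
                sent = small.mean(dim=1)        # simple bag-of-words sentence vector
                return self.classifier(sent)

        # Usage with toy data: 1000-word vocabulary, 300-d teacher embeddings.
        teacher = torch.randn(1000, 300)
        model = DistilledClassifier(teacher)
        logits = model(torch.randint(0, 1000, (8, 12)))  # 8 sentences, 12 tokens each
        print(logits.shape)                              # torch.Size([8, 2])

    After training on the task, the small embeddings can be exported once by running the encoder over the whole vocabulary, so a deployed model need not carry the cumbersome high-dimensional table.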