Ascent and descent of Gorenstein homological properties
Let φ: R → A be a ring homomorphism, where R is a
commutative noetherian ring and A is a finite R-algebra. We give criteria
for detecting the ascent and descent of Gorenstein homological properties. As
an application, we get a result that supports a question of Avramov and Foxby.
We observe that the ascent and descent of Gorenstein homological properties can
detect the Gorenstein properties of rings along φ. Finally, we describe
when φ induces a triangle equivalence between the stable categories of
finitely generated Gorenstein projective modules. Comment: 21 pages, any comments are welcome
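For context, the modules in question admit a standard definition (due to Enochs and Jenda); this is background, not part of the abstract itself:

```latex
% Standard background definition; not taken from the abstract above.
A finitely generated module $G$ is \emph{Gorenstein projective} if there is
an exact complex of finitely generated projective modules
\[
  P_\bullet:\ \cdots \to P_1 \to P_0 \to P_{-1} \to P_{-2} \to \cdots
\]
such that $\operatorname{Hom}(P_\bullet, Q)$ remains exact for every
projective module $Q$, and $G \cong \operatorname{coker}(P_1 \to P_0)$.
```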
Singular equivalences induced by bimodules and quadratic monomial algebras
We investigate the problem of when the tensor functor given by a bimodule yields a
singular equivalence. It turns out that this problem is equivalent to the one
when the Hom functor given by the same bimodule induces a triangle equivalence
between the homotopy categories of acyclic complexes of injective modules. We
give conditions for when a bimodule appears in a pair of bimodules that defines
a singular equivalence with level. We construct an explicit bimodule, which
yields a singular equivalence between a quadratic monomial algebra and its
associated algebra with radical square zero. Under certain conditions which
include the Gorenstein cases, the bimodule does appear in a pair of bimodules
defining a singular equivalence with level. Comment: 20 pages, all comments are welcome
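For context, the singularity category underlying these equivalences is the standard Verdier quotient (Buchweitz, Orlov); this is background, not part of the abstract:

```latex
% Background definition; not taken from the abstract above.
For a noetherian algebra $A$, the singularity category is the Verdier quotient
\[
  D_{\mathrm{sg}}(A) \;=\; D^b(\operatorname{mod} A)\,/\,\operatorname{perf}(A),
\]
and a \emph{singular equivalence} between algebras $A$ and $B$ is a triangle
equivalence $D_{\mathrm{sg}}(A) \simeq D_{\mathrm{sg}}(B)$.
```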
Empower Sequence Labeling with Task-Aware Neural Language Model
Linguistic sequence labeling is a general modeling approach that encompasses
a variety of problems, such as part-of-speech tagging and named entity
recognition. Recent advances in neural networks (NNs) make it possible to build
reliable models without handcrafted features. However, in many cases, it is
hard to obtain sufficient annotations to train these models. In this study, we
develop a novel neural framework to extract abundant knowledge hidden in raw
texts to empower the sequence labeling task. Besides word-level knowledge
contained in pre-trained word embeddings, character-aware neural language
models are incorporated to extract character-level knowledge. Transfer learning
techniques are further adopted to mediate different components and guide the
language model towards the key knowledge. Compared to previous methods, this
task-specific knowledge allows us to adopt a more concise model and conduct
more efficient training. Unlike most transfer learning methods, the
proposed framework does not rely on any additional supervision. It extracts
knowledge from self-contained order information of training sequences.
Extensive experiments on benchmark datasets demonstrate the effectiveness of
leveraging character-level knowledge and the efficiency of co-training. For
example, on the CoNLL03 NER task, model training completes in about 6 hours on
a single GPU, reaching an F1 score of 91.71 ± 0.10 without using any extra
annotation. Comment: AAAI 2018
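The core idea of the abstract, combining word-level features from pre-trained embeddings with character-level features from a character-aware language model, can be illustrated with a minimal, framework-free sketch. Everything below (the tiny embedding table, dimensions, and the hashed character-trigram stand-in for the neural LM state) is an illustrative assumption, not the paper's actual model:

```python
# Minimal sketch of combining word-level and character-level features per
# token for sequence labeling. The hashed trigram featurizer below is a
# toy stand-in for the paper's character-aware neural language model.
import hashlib

WORD_DIM = 4   # toy dimensionality for word-level features (assumed)
CHAR_DIM = 4   # toy dimensionality for character-level features (assumed)

def word_features(token, vocab_vectors):
    """Look up a pre-trained word vector; fall back to zeros for OOV words."""
    return vocab_vectors.get(token.lower(), [0.0] * WORD_DIM)

def char_features(token):
    """Stand-in for a character-LM state: hashed character trigrams."""
    vec = [0.0] * CHAR_DIM
    padded = f"<{token}>"  # boundary markers so affixes are visible
    for i in range(len(padded) - 2):
        h = int(hashlib.md5(padded[i:i + 3].encode()).hexdigest(), 16)
        vec[h % CHAR_DIM] += 1.0
    return vec

def featurize(sentence, vocab_vectors):
    """Concatenate word- and character-level features for each token."""
    return [word_features(t, vocab_vectors) + char_features(t)
            for t in sentence]

toy_vectors = {"john": [0.1, 0.2, 0.3, 0.4]}  # assumed tiny embedding table
feats = featurize(["John", "lives", "here"], toy_vectors)
```

In the paper's actual architecture the concatenated representations feed a sequence labeler (e.g. a CRF layer), and the character-level component is trained jointly as a language model, which is where the "self-contained order information" of the training sequences comes in.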