Artinian level algebras of codimension 3
In this paper, we continue the study of which $h$-vectors can be the Hilbert function of a level algebra by
investigating Artinian level algebras of codimension 3 with the condition
, where is
the lex-segment ideal associated with an ideal . Our approach is to adopt a
homological method called {\it Cancellation Principle}: the minimal free
resolution of is obtained from that of by canceling some
adjacent terms of the same shift.
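For illustration (the shift $j$ and the free modules $F_i$ below are schematic, not taken from the paper): if the minimal free resolution of the lex-segment ideal contains adjacent free summands with the same shift,
\[
0 \to F_3 \oplus R(-j) \to F_2 \oplus R(-j) \to F_1 \to I^{\mathrm{lex}} \to 0,
\]
then the Cancellation Principle says the minimal free resolution of an ideal $I$ with the same Hilbert function may be obtained by deleting such a pair:
\[
0 \to F_3 \to F_2 \to F_1 \to I \to 0.
\]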
We prove that when ,
can be an Artinian level -algebra only if either
or holds. We also apply our results to show that for
, the Hilbert function of an Artinian
algebra of codimension 3 with the condition ,
(a) if , then the $h$-vector \H cannot be level, and
(b) if , then there is a level algebra with Hilbert function
\H for some value of .

Comment: 15 pages
Generic Initial Ideals And Graded Artinian Level Algebras Not Having The Weak-Lefschetz Property
We find a sufficient condition that \H is not level based on a reduction
number. In particular, we prove that a graded Artinian algebra of codimension 3
with Hilbert function cannot be level
if , and that there exists a level O-sequence of codimension 3 of
type \H for for . Furthermore, we show that \H is
not level if , and also
prove that any codimension 3 Artinian graded algebra cannot be level if
$\beta_{1,d+2}(\Gin(I))=\beta_{2,d+2}(\Gin(I))$. In this case, the Hilbert
function of does not have to satisfy the condition .
Moreover, we show that every codimension 3 graded Artinian level algebra
having the Weak-Lefschetz Property has the strictly unimodal Hilbert function
having a growth condition on for every
where
In particular, we find that if is of codimension 3, then for every and , and prove that
if  is a codimension 3 Artinian algebra with an $h$-vector
such that $h_{d-1}-h_d=2(h_d-h_{d+1})>0$ and $\soc(A)_{d-1}=0$
for some , then  is -regular and $\dim_k\soc(A)_d=h_d-h_{d+1}$.

Comment: 25 pages
Hierarchically Clustered Representation Learning
The joint optimization of representation learning and clustering in the
embedding space has experienced a breakthrough in recent years. In spite of the
advance, clustering with representation learning has been limited to flat-level
categories, which often involves cohesive clustering with a focus on instance
relations. To overcome the limitations of flat clustering, we introduce
hierarchically-clustered representation learning (HCRL), which simultaneously
optimizes representation learning and hierarchical clustering in the embedding
space. Compared with the few prior works, HCRL is the first to consider
generating deep embeddings from every component of the hierarchy, not just the
leaf components. In addition to obtaining hierarchically clustered embeddings,
we can reconstruct data at various abstraction levels, infer the intrinsic
hierarchical structure, and learn the level-proportion features. We conducted
evaluations on image and text domains, and our quantitative analyses showed
competitive likelihoods and the best accuracies among the baselines.

Comment: 10 pages, 7 figures, under review as a conference paper
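The abstract does not specify the HCRL model itself, but the core idea of clustering in an embedding space at more than one level of the hierarchy can be illustrated with a toy sketch. The following is a minimal, hypothetical example (not the authors' method): a two-level hierarchical k-means over fixed "embeddings", where every point receives a full (root, leaf) path rather than a single flat label.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means on rows of X; returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), k, replace=False)]  # random initial centroids
    for _ in range(iters):
        # squared distances of every point to every centroid
        d = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):  # recompute each centroid from its members
            pts = X[labels == j]
            if len(pts):
                C[j] = pts.mean(0)
    return C, labels

def hierarchical_kmeans(X, k_root=2, k_leaf=2):
    """Two-level clustering: split the data, then split each root cluster.

    Returns an (n, 2) array of (root_id, leaf_id) paths, so each point is
    assigned to an internal node AND a leaf of a depth-2 hierarchy.
    """
    _, root = kmeans(X, k_root)
    paths = np.empty((len(X), 2), dtype=int)
    paths[:, 0] = root
    for r in range(k_root):
        idx = np.where(root == r)[0]
        _, leaf = kmeans(X[idx], min(k_leaf, len(idx)), seed=r + 1)
        paths[idx, 1] = leaf
    return paths

# Toy "embeddings": four tight Gaussian blobs forming two super-groups.
rng = np.random.default_rng(0)
blobs = [rng.normal(c, 0.1, size=(50, 2))
         for c in [(0, 0), (0, 1), (5, 0), (5, 1)]]
X = np.vstack(blobs)
paths = hierarchical_kmeans(X)
print(paths.shape)  # one (root, leaf) path per point
```

Unlike HCRL, this sketch clusters fixed embeddings instead of jointly optimizing the representation, and it assigns hard labels rather than learning level proportions; it only conveys the shape of the output a hierarchical clustering produces.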