Classification of $p$-groups via their 2-nilpotent multipliers
For a $p$-group $G$ of order $p^n$, it is known that the order of the 2-nilpotent
multiplier is $|\mathcal{M}^{(2)}(G)|=p^{\frac{1}{2}n(n-1)(n-2)+3-s_2(G)}$
for an integer $s_2(G)$. In this article, we characterize all non-abelian
$p$-groups satisfying $s_2(G)\in\{1,2,3\}$.
Characterizing nilpotent Lie algebras that satisfy the converse of Schur's theorem
Let $L$ be a finite dimensional nilpotent Lie algebra and let $d$ be the
minimal number of generators of $L/Z(L)$. It is known that
$\dim(L/Z(L))=d\,\dim(L^2)-t(L)$ for a non-negative integer $t(L)$. In this paper, we classify all
finite dimensional nilpotent Lie algebras with small values of $t(L)$. We also give a construction, which shows that there exist Lie
algebras with $t(L)$ arbitrarily large.
On the Schur multipliers of Lie superalgebras of maximal class
Let $L$ be a non-abelian nilpotent Lie superalgebra of dimension $(m\mid n)$.
Nayak showed that there is a non-negative integer $s(L)$ such that
$\dim\mathcal{M}(L)=\frac{1}{2}[(m+n-2)(m+n+1)]+n+1-s(L)$. Here we intend to
classify all non-abelian nilpotent Lie superalgebras for small values of $s(L)$.
Moreover, we classify the structure of all Lie superalgebras of small
dimension satisfying this condition.
The exterior degree of a pair of finite groups
The exterior degree of a pair of finite groups $(G,N)$, which is a
generalization of the exterior degree of finite groups, is the probability that
two elements $g\in G$ and $n\in N$ satisfy $g\wedge n=1$. In the present paper,
we state some relations between this concept and the relative commutativity
degree, capability and the Schur multiplier of a pair of groups.
Comment: To appear in Mediterr. J. Math.
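As a rough illustration of the relative commutativity degree mentioned above, the following sketch computes it by brute force for a concrete pair of permutation groups. The exterior degree replaces commuting pairs with pairs whose exterior product $g\wedge n$ is trivial; that requires the nonabelian exterior square and is not shown here. The function names and the choice of the pair $(A_3, S_3)$ are my own assumptions, not from the paper.

```python
from itertools import permutations

def compose(p, q):
    # Composition of permutations stored as tuples: (p o q)(i) = p[q[i]].
    return tuple(p[q[i]] for i in range(len(q)))

def relative_commutativity_degree(N, G):
    # Probability that a uniformly random pair (n, g) in N x G commutes.
    commuting = sum(1 for n in N for g in G if compose(n, g) == compose(g, n))
    return commuting / (len(N) * len(G))

# S3 as all permutations of {0,1,2}; A3 as its cyclic subgroup of 3-cycles.
G = [tuple(p) for p in permutations(range(3))]
N = [(0, 1, 2), (1, 2, 0), (2, 0, 1)]
print(relative_commutativity_degree(N, G))  # 2/3 for the pair (A3, S3)
```

Only the identity commutes with all of $S_3$, while each 3-cycle commutes exactly with $A_3$, giving $12/18 = 2/3$.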
Physics-Inspired Interpretability Of Machine Learning Models
The ability to explain decisions made by machine learning models remains one
of the most significant hurdles towards widespread adoption of AI in highly
sensitive areas such as medicine, cybersecurity or autonomous driving. Great
interest exists in understanding which features of the input data prompt model
decision making. In this contribution, we propose a novel approach to identify
relevant features of the input data, inspired by methods from the energy
landscapes field, developed in the physical sciences. By identifying conserved
weights within groups of minima of the loss landscapes, we can identify the
drivers of model decision making. Analogues of this idea exist in the molecular
sciences, where coordinate invariants or order parameters are employed to
identify critical features of a molecule. However, no such approach exists for
machine learning loss landscapes. We will demonstrate the applicability of
energy landscape methods to machine learning models and give examples, both
synthetic and from the real world, for how these methods can help to make
models more interpretable.
Comment: 6 pages, 2 figures, ICLR 2023 Workshop on Physics for Machine Learning
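A minimal sketch of the core idea described in the abstract, identifying weights that are conserved across a group of loss-landscape minima. The representation of minima as rows of a matrix, the variance threshold, and all names here are my own assumptions, not the authors' implementation.

```python
import numpy as np

def conserved_weights(minima, tol=1e-2):
    # Each row is the flattened weight vector of one local minimum of the loss.
    minima = np.asarray(minima)          # shape: (num_minima, num_weights)
    spread = minima.std(axis=0)          # per-weight spread across the minima
    # Weights with near-zero spread are "conserved" across the group of minima
    # and serve as candidate drivers of the model's decisions.
    return np.flatnonzero(spread < tol)

rng = np.random.default_rng(0)
minima = rng.normal(size=(5, 4))         # five synthetic minima, four weights
minima[:, 2] = 0.75                      # weight 2 takes the same value in every minimum
print(conserved_weights(minima).tolist())
```

On this synthetic example only the artificially fixed weight (index 2) is reported as conserved; the randomly varying weights are filtered out by the threshold.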