A system of dual quaternion matrix equations with its applications
We employ the Moore-Penrose (M-P) inverses and ranks of quaternion matrices to
establish necessary and sufficient conditions for the solvability of a system
of dual quaternion matrix equations, along with an expression for its general
solution. As an application, we investigate the solutions to two related dual
quaternion matrix equations, including their Hermitian-type solutions. Lastly,
we design a numerical example to validate the main research findings of this
paper.
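The abstract's specific equations are not reproduced here, but the role of M-P inverses in such solvability results can be sketched with the classic single matrix equation AXB = C, a simpler, hypothetical stand-in for the elided system:

```latex
% Solvability of AXB = C via the Moore-Penrose inverse A^{\dagger}:
AXB = C \ \text{is consistent} \iff A A^{\dagger} C B^{\dagger} B = C .
% When consistent, the general solution is
X = A^{\dagger} C B^{\dagger} + Y - A^{\dagger} A Y B B^{\dagger},
\qquad Y \ \text{arbitrary, of conformable size}.
```

The paper's system and its rank conditions generalize this pattern to several coupled equations over the dual quaternions.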
Individual position diversity in dependence socioeconomic networks increases economic output
The availability of big data recorded from massively multiplayer online
role-playing games (MMORPGs) allows us to gain a deeper understanding of the
potential connection between individuals' network positions and their economic
outputs. We use a statistical filtering method to construct dependence networks
from weighted friendship networks of individuals. We investigate the 30
distinct motif positions in the 13 directed triadic motifs which represent
microscopic dependences among individuals. Based on the structural similarity
of motif positions, we further classify individuals into different groups. The
node position diversity of individuals is found to be positively correlated
with their economic outputs. We also find that the economic outputs of leaf
nodes are significantly lower than those of the other nodes in the same motif.
Our findings shed light on the influence of network structure on economic
activities and outputs in socioeconomic systems.
Comment: 19 pages, 5 figures
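As a concrete illustration of how a node's position diversity over the 30 motif positions could be scored, the sketch below uses Shannon entropy of the node's position-count vector; this is a hypothetical stand-in, since the abstract does not give the paper's exact measure:

```python
import math

def position_diversity(position_counts):
    """Shannon entropy of a node's occurrence counts over the
    30 triadic-motif positions (higher = more diverse roles)."""
    total = sum(position_counts)
    if total == 0:
        return 0.0
    probs = [c / total for c in position_counts if c > 0]
    return -sum(p * math.log(p) for p in probs)

# A node split evenly across two positions scores ln(2), while a node
# confined to a single position scores 0.
```

Under this stand-in, the paper's correlation claim would read: nodes with higher entropy over motif positions tend to have higher economic output.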
Segatron: Segment-Aware Transformer for Language Modeling and Understanding
Transformers are powerful for sequence modeling. Nearly all state-of-the-art
language models and pre-trained language models are based on the Transformer
architecture. However, it distinguishes sequential tokens only with the token
position index. We hypothesize that better contextual representations can be
generated from the Transformer with richer positional information. To verify
this, we propose a segment-aware Transformer (Segatron), by replacing the
original token position encoding with a combined position encoding of
paragraph, sentence, and token. We first introduce the segment-aware mechanism
to Transformer-XL, which is a popular Transformer-based language model with
memory extension and relative position encoding. We find that our method can
further improve the Transformer-XL base model and large model, achieving 17.1
perplexity on the WikiText-103 dataset. We further investigate the pre-training
masked language modeling task with Segatron. Experimental results show that
BERT pre-trained with Segatron (SegaBERT) can outperform BERT with vanilla
Transformer on various NLP tasks, and outperforms RoBERTa on zero-shot sentence
representation learning.
Comment: Accepted by AAAI 2021
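A minimal sketch of the combined position encoding described above, assuming simple additive paragraph-, sentence-, and token-level embedding tables; the names, sizes, and random initialization are illustrative, not the paper's implementation:

```python
import random

random.seed(0)
D = 8  # embedding dimension (illustrative)

def rand_vec():
    return [random.gauss(0.0, 1.0) for _ in range(D)]

# One table per granularity (randomly initialized stand-ins for
# learnable embeddings).
para_emb = [rand_vec() for _ in range(4)]    # paragraph positions
sent_emb = [rand_vec() for _ in range(16)]   # sentence positions
tok_emb = [rand_vec() for _ in range(64)]    # token positions

def segment_aware_position(p, s, t):
    """Combined position encoding for one token: the sum of its
    paragraph-, sentence-, and token-level position embeddings."""
    return [a + b + c
            for a, b, c in zip(para_emb[p], sent_emb[s], tok_emb[t])]
```

For example, the fourth token of the second sentence of the first paragraph would receive `segment_aware_position(0, 1, 3)` in place of a single token-index embedding.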
Quantifying immediate price impact of trades based on the k-shell decomposition of stock trading networks
Traders in a stock market exchange stock shares and form a stock trading
network. Trades at different positions of the stock trading network may contain
different information. We construct stock trading networks based on the limit
order book data and classify traders into classes using the k-shell
decomposition method. We investigate the influences of trading behaviors on the
price impact by comparing a closed national market (A-shares) with an
international market (B-shares), individuals and institutions, partially filled
and filled trades, buyer-initiated and seller-initiated trades, and trades at
different positions of a trading network. Institutional traders use trading
strategies that reduce their price impact, and individuals at the same
positions in the trading network have a higher price impact than institutions.
We also find that trades in the core have higher price impacts than those in
the peripheral shells.
Comment: 6 pages including 3 figures and 1 table
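The k-shell decomposition used above can be sketched by iterative peeling: repeatedly remove nodes whose remaining degree is at most k, assigning them shell index k. This is a minimal pure-Python sketch of the standard algorithm on an undirected network, not the authors' code:

```python
from collections import defaultdict

def k_shell_decomposition(edges):
    """Assign each node its k-shell index by iteratively peeling
    nodes of remaining degree <= k from the network."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    remaining = set(adj)
    shell, k = {}, 0
    while remaining:
        # Nodes whose degree within the remaining graph is <= k.
        peel = [n for n in remaining
                if len(adj[n] & remaining) <= k]
        if peel:
            for n in peel:
                shell[n] = k
                remaining.discard(n)
        else:
            k += 1  # no node peelable at this k: move to the next shell
    return shell
```

On a triangle with one pendant node, the pendant lands in shell 1 and the triangle's nodes in shell 2: the core of a trading network is exactly the set of nodes with the highest shell index.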