Edge-Fault Tolerance of Hypercube-like Networks
This paper considers a generalized measure of fault tolerance in hypercube-like
graphs, a class that contains several well-known interconnection networks such
as hypercubes, varietal hypercubes, twisted cubes, crossed cubes and Möbius
cubes, and establishes an exact value for this measure by induction and a new
technique. The result gives the minimum number of edges whose removal leaves a
disconnected graph in which every remaining vertex still has at least a
prescribed degree. Compared with previous results, this enhances the
fault-tolerant ability of the above-mentioned networks theoretically.
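The kind of quantity this abstract concerns can be illustrated on a small case. The sketch below brute-forces the classical edge connectivity of the hypercube Q_3 (the minimum number of edges whose removal disconnects the graph), which is the h = 0 instance of such fault-tolerance measures; the function names and the brute-force approach are illustrative assumptions, not the paper's method.

```python
from itertools import combinations, product

def hypercube_edges(n):
    # Vertices of Q_n are n-bit tuples; edges join vertices differing in one bit.
    verts = list(product((0, 1), repeat=n))
    edges = []
    for v in verts:
        for i in range(n):
            u = v[:i] + (1 - v[i],) + v[i + 1:]
            if v < u:  # record each edge once
                edges.append((v, u))
    return verts, edges

def connected(verts, edges):
    # Depth-first search from an arbitrary start vertex.
    adj = {v: set() for v in verts}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, stack = {verts[0]}, [verts[0]]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == len(verts)

def edge_connectivity(n):
    # Smallest k such that removing some k edges disconnects Q_n.
    verts, edges = hypercube_edges(n)
    for k in range(len(edges) + 1):
        for removed in combinations(edges, k):
            rest = [e for e in edges if e not in removed]
            if not connected(verts, rest):
                return k
    return len(edges)

print(edge_connectivity(3))  # classical result: Q_n is n-edge-connected, so 3
```

Removing the three edges incident to any single vertex isolates it, so the value cannot exceed n; the brute force confirms no smaller cut works for Q_3.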
Hashing based Answer Selection
Answer selection is an important subtask of question answering (QA), where
deep models usually achieve better performance. Most deep models adopt
question-answer interaction mechanisms, such as attention, to obtain vector
representations for answers. When these interaction-based deep models are
deployed for online prediction, the representations of all answers need to be
recalculated for each question. This procedure is time-consuming for deep
models with complex encoders like BERT which usually have better accuracy than
simple encoders. One possible solution is to store the matrix representation
(encoder output) of each answer in memory to avoid recalculation, but this
incurs a large memory cost. In this paper, we propose a novel method, called
hashing based answer selection (HAS), to tackle this problem. HAS adopts a
hashing strategy to learn a binary matrix representation for each answer, which
can dramatically reduce the memory cost for storing the matrix representations
of answers. Hence, HAS can adopt complex encoders like BERT in the model, but
the online prediction of HAS is still fast with a low memory cost. Experimental
results on three popular answer selection datasets show that HAS outperforms
existing models, achieving state-of-the-art performance.