Brain-inspired Hyperdimensional (HD) computing is an emerging technique for
cognitive tasks in the field of low-power design. As a fast-learning and
energy-efficient computational paradigm, HD computing has shown great success
in many real-world applications. However, an HD model incrementally trained on
multiple tasks suffers from catastrophic forgetting: it forgets the knowledge
learned from previous tasks and focuses only on the current one. To the best of
our knowledge, no study has been conducted to
investigate the feasibility of applying multi-task learning to HD computing. In
this paper, we propose Task-Projected Hyperdimensional Computing (TP-HDC) to
make the HD model simultaneously support multiple tasks by exploiting the
redundant dimensionality in the hyperspace. To mitigate interference
between tasks, we project each task into a separate subspace for
learning. Compared with the baseline method, our approach efficiently utilizes
the unused capacity in the hyperspace and achieves a 12.8% improvement in
average accuracy with negligible memory overhead.

Comment: To be published in the 16th International Conference on Artificial
Intelligence Applications and Innovations
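The task-projection idea described above can be sketched in a few lines of NumPy. This is a minimal illustration under stated assumptions, not the paper's implementation: it assumes a random-projection encoder, a fixed bipolar "key" hypervector per task used as an elementwise binding mask (so each task's vectors land in a quasi-orthogonal subspace of the shared hyperspace), and class prototypes formed by bundling (summing) bound vectors. All names (`encode`, `task_keys`, `prototypes`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000                       # hyperdimensionality
n_features, n_classes = 16, 3    # toy problem size

# random-projection encoder: map a real-valued sample to a bipolar hypervector
basis = rng.standard_normal((n_features, D))

def encode(X, basis):
    # sign of a random projection approximately preserves angles between inputs
    return np.sign(X @ basis)

# one fixed bipolar key per task; binding (elementwise multiply) by a key
# rotates vectors into a task-specific, quasi-orthogonal subspace
task_keys = {t: rng.choice([-1.0, 1.0], size=D) for t in range(2)}

# shared associative memory: prototypes of all tasks coexist in one hyperspace
prototypes = {}

def train(task, X, y):
    key = task_keys[task]
    for c in range(n_classes):
        bound = encode(X[y == c], basis) * key   # bind to the task subspace
        prototypes[(task, c)] = bound.sum(axis=0)  # bundle into a prototype

def predict(task, X):
    key = task_keys[task]
    q = encode(X, basis) * key
    # cosine similarity of each query to each class prototype of this task
    sims = np.stack([
        (q @ prototypes[(task, c)]) /
        (np.linalg.norm(q, axis=1) * np.linalg.norm(prototypes[(task, c)]) + 1e-9)
        for c in range(n_classes)
    ])
    return sims.argmax(axis=0)
```

Because each task's prototypes live behind a different binding key, training a new task adds new entries to the shared memory instead of overwriting the old ones, which is the sense in which the unused capacity of the hyperspace is exploited.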