Sound Event Detection Utilizing Graph Laplacian Regularization with Event Co-occurrence
Only a limited number of sound event types occur in a given acoustic scene, and
some sound events tend to co-occur in that scene; for example, the sound events
"dishes" and "glass jingling" are likely to co-occur in the acoustic scene
"cooking". In this paper, we propose a method of sound event detection using
graph Laplacian regularization with sound event co-occurrence taken into
account. In the proposed method, the occurrences of sound events are expressed
as a graph whose nodes indicate the frequencies of event occurrence and whose
edges indicate the sound event co-occurrences. This graph representation is
then utilized for the model training of sound event detection, which is
optimized under an objective function with a regularization term considering
the graph structure of sound event occurrence and co-occurrence. Evaluation
experiments using the TUT Sound Events 2016 and 2017 datasets, and the TUT
Acoustic Scenes 2016 dataset show that the proposed method improves the
performance of sound event detection by 7.9 percentage points compared with the
conventional CNN-BiGRU-based detection method in terms of the segment-based F1
score. In particular, the experimental results indicate that the proposed
method enables the detection of co-occurring sound events more accurately than
the conventional method.
Comment: Accepted to IEICE Transactions on Information and Systems