The timed pattern matching problem was formulated by Ulus et al. and has been
actively studied since, with evident applications in monitoring real-time
systems. The problem takes as input a timed word/signal and a timed pattern
(specified either by a timed regular expression or by a timed automaton); and
it returns the set of those intervals for which the given timed word, when
restricted to the interval, matches the given pattern. We contribute a
Boyer-Moore type optimization in timed pattern matching, relying on the classic
Boyer-Moore string matching algorithm and its extension to (untimed) pattern
matching by Watson and Watson. We assess its effect through experiments; for
some problem instances our Boyer-Moore type optimization achieves a twofold
speed-up, indicating its potential in real-world monitoring tasks, where data
sets tend to be massive.