Fostering Event Compression Using Gated Surprise
Our brain receives a dynamically changing stream of sensorimotor data. Yet
we perceive a rather organized world, which we segment into, and perceive as,
events. Computational theories of event-predictive cognition in cognitive
science suggest that our brain forms generative, event-predictive models by
segmenting sensorimotor data into suitable chunks of contextual experiences.
Here, we introduce a hierarchical, surprise-gated recurrent neural network
architecture, which models this process and develops compact compressions of
distinct event-like contexts. The architecture contains a contextual LSTM
layer, which develops generative compressions of ongoing and subsequent
contexts. These compressions are passed into a GRU-like layer, which uses
surprise signals to update its recurrent latent state. The latent state is
passed forward into another LSTM layer, which processes actual dynamic sensory
flow in the light of the provided latent, contextual compression signals. Our
model develops distinct event compressions and achieves the best performance
on multiple event-processing tasks. The architecture may prove useful for the
further development of resource-efficient learning, hierarchical model-based
reinforcement learning, and artificial event-predictive cognition and
intelligence.

Comment: submitted to ICANN 2020
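
For illustration, below is a minimal sketch of how such a three-layer,
surprise-gated architecture could be wired up, assuming PyTorch. The layer
sizes, the prediction-error surprise measure, and the sigmoid gating
threshold are illustrative assumptions, not the paper's exact implementation.

    # Hypothetical sketch: contextual LSTM -> surprise-gated GRU-like latent
    # state -> sensory LSTM, as described in the abstract. All dimensions and
    # the gating formula are assumptions for illustration only.
    import torch
    import torch.nn as nn

    class SurpriseGatedEventModel(nn.Module):
        def __init__(self, sensory_dim=16, context_dim=8, latent_dim=8,
                     hidden_dim=32):
            super().__init__()
            # Contextual LSTM: compresses ongoing and subsequent contexts.
            self.context_lstm = nn.LSTMCell(sensory_dim, context_dim)
            # GRU-like layer: updates its recurrent latent state under surprise.
            self.latent_gru = nn.GRUCell(context_dim, latent_dim)
            # Sensory LSTM: processes dynamic sensory flow given the latent
            # contextual compression.
            self.sensory_lstm = nn.LSTMCell(sensory_dim + latent_dim, hidden_dim)
            self.readout = nn.Linear(hidden_dim, sensory_dim)

        def forward(self, x_seq):
            # x_seq: (T, B, sensory_dim)
            B = x_seq.size(1)
            hc = torch.zeros(B, self.context_lstm.hidden_size)
            cc = torch.zeros_like(hc)
            z = torch.zeros(B, self.latent_gru.hidden_size)
            hs = torch.zeros(B, self.sensory_lstm.hidden_size)
            cs = torch.zeros_like(hs)
            pred = torch.zeros_like(x_seq[0])
            outputs = []
            for x in x_seq:
                # Surprise signal: prediction error of the sensory layer at
                # the previous step (assumed measure).
                surprise = (x - pred).pow(2).mean(dim=1, keepdim=True)
                # Assumed gating: high surprise opens the gate and lets the
                # latent state update; low surprise keeps it stable.
                gate = torch.sigmoid(surprise - 1.0)
                hc, cc = self.context_lstm(x, (hc, cc))
                z_new = self.latent_gru(hc, z)
                z = gate * z_new + (1.0 - gate) * z
                # Sensory layer predicts the next input given the latent
                # contextual compression.
                hs, cs = self.sensory_lstm(torch.cat([x, z], dim=1), (hs, cs))
                pred = self.readout(hs)
                outputs.append(pred)
            return torch.stack(outputs)  # (T, B, sensory_dim)

    # Example usage: 20 timesteps, batch of 4, 16-dimensional sensory input.
    model = SurpriseGatedEventModel()
    preds = model(torch.randn(20, 4, 16))

The key design choice mirrored here is that the latent state changes only
when the surprise gate opens, so between event boundaries the sensory layer
is conditioned on a stable, compressed context.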