Context Modulation Enables Multi-tasking and Resource Efficiency in Liquid State Machines
Memory storage and retrieval are context-sensitive in both humans and animals: memories are retrieved more accurately in the context where they were acquired, and similar stimuli can elicit different responses in different contexts. Researchers have suggested that such effects may be underpinned by mechanisms that modulate the dynamics of neural circuits in a context-dependent fashion. Based on this idea, we design a mechanism for context-dependent modulation of a liquid state machine, a recurrent spiking artificial neural network. We find that context modulation enables a single network to multitask, using fewer neurons than an ensemble of smaller networks each performing one task individually.
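The core idea can be illustrated with a minimal sketch: a fixed random spiking reservoir whose neurons receive a context-dependent multiplicative gain, so the same input stream produces different liquid states in different contexts. All names, sizes, and the gain-based gating below are illustrative assumptions, not the paper's actual mechanism or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100          # reservoir ("liquid") neurons -- hypothetical size
T = 200          # simulation steps
N_CONTEXTS = 2   # number of task contexts

# Fixed random recurrent and input weights (hypothetical scales).
W_rec = rng.normal(0.0, 0.1, (N, N))
W_in = rng.normal(0.0, 1.0, N)

# One simple way to realize context modulation: a per-context,
# per-neuron gain on the input drive. The paper's mechanism may differ.
context_gain = rng.uniform(0.5, 1.5, (N_CONTEXTS, N))

def run_liquid(inputs, context):
    """Simulate a leaky integrate-and-fire reservoir under one context.

    Returns the spike raster (T x N) elicited by the input stream."""
    v = np.zeros(N)               # membrane potentials
    spikes = np.zeros((T, N))
    for t in range(T):
        prev = spikes[t - 1] if t > 0 else np.zeros(N)
        drive = W_in * inputs[t] + W_rec @ prev
        v = 0.9 * v + context_gain[context] * drive  # leak + modulated drive
        fired = v > 1.0
        spikes[t] = fired
        v[fired] = 0.0            # reset after a spike
    return spikes

inputs = rng.uniform(0.0, 1.0, T)
r0 = run_liquid(inputs, context=0)
r1 = run_liquid(inputs, context=1)
# The same input yields different liquid states in different contexts,
# so separate linear readouts could decode a different task from each.
print(np.array_equal(r0, r1))
```

Because the contexts reshape the reservoir dynamics, a single network can serve several tasks by pairing each context with its own readout, rather than allocating a separate reservoir per task.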