This repository has been archived by the owner on Sep 1, 2023. It is now read-only.

Extended Temporal Memory

Nathanael Romano edited this page Jun 24, 2016 · 2 revisions

This document describes the algorithm in extended_temporal_memory.py.

Summary

Extended Temporal Memory (ETM) improves on regular Temporal Memory (TM) by enabling learning from external input, such as a motor efference copy, as well as from apical input, such as feedback from higher regions.

Learning

Learning from external input

External input is fed through basal distal dendrites, and is therefore treated the same way as lateral connections in regular Temporal Memory.
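Because external input arrives on the same basal segments as lateral input, the two can share one presynaptic index space. The sketch below illustrates this idea; the function name, the offset-based indexing scheme, and the parameter names are assumptions for illustration, not the actual ETM implementation.

```python
def basal_presynaptic_cells(prev_active_cells, external_active_cells,
                            num_internal_cells):
    """Combine lateral and external input into one basal candidate pool.

    External cells are offset past the internal cell range, so basal
    segments can treat both populations identically when matching
    presynaptic cells (hypothetical indexing scheme).
    """
    lateral = set(prev_active_cells)
    external = {num_internal_cells + c for c in external_active_cells}
    return lateral | external
```

With this scheme, segment matching and learning need no special case for external input: a basal synapse simply points at an index, whether it belongs to an internal cell or an external one.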

Learning from apical input

Learning through apical dendrites works the same way as learning on basal distal segments: Hebbian rules adjust synapse permanences, incrementing them when the presynaptic and postsynaptic cells fire together and decrementing them otherwise.
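A minimal sketch of that Hebbian update on one segment, assuming permanences live in [0, 1]; the function and parameter names (`permanence_increment`, `permanence_decrement`) follow common Temporal Memory conventions but are illustrative, not the actual ETM API:

```python
def adapt_segment(synapses, active_presynaptic_cells,
                  permanence_increment=0.1, permanence_decrement=0.05):
    """Reinforce synapses whose presynaptic cell was active; punish the rest.

    `synapses` maps presynaptic cell index -> permanence in [0, 1].
    """
    for presynaptic_cell, permanence in synapses.items():
        if presynaptic_cell in active_presynaptic_cells:
            permanence += permanence_increment   # cells fired together
        else:
            permanence -= permanence_decrement   # postsynaptic fired alone
        # Clamp to the valid permanence range.
        synapses[presynaptic_cell] = min(1.0, max(0.0, permanence))
    return synapses
```

The same routine serves both basal and apical segments; only the set of presynaptic cells passed in differs.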

The inhibitory mechanism is extended to apical activations with the following rules. A neuron will not become active if any of the following is true:

  • it is not receiving feed-forward input on its proximal segment
  • it is not receiving lateral input on a distal segment and another neuron in the same column is receiving lateral input
  • it is not receiving apical input on an apical segment and another neuron in the same column is receiving both lateral and apical input.

Otherwise the neuron will be activated.
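The three rules above can be sketched as a single predicate. This is a simplified illustration assuming each kind of input has already been resolved to the set of cells receiving it; the function and parameter names are hypothetical, not the actual ETM code.

```python
def becomes_active(cell, column_cells,
                   has_proximal_input, has_distal_input, has_apical_input):
    """Return True if `cell` fires under the extended inhibition rules.

    Each `has_*_input` argument is the set of cells currently receiving
    that kind of input; `column_cells` are the cells sharing a column.
    """
    siblings = [c for c in column_cells if c != cell]

    # Rule 1: feed-forward input on the proximal segment is required.
    if cell not in has_proximal_input:
        return False

    # Rule 2: a cell without lateral input is inhibited when a sibling
    # in the same column does receive lateral input.
    if cell not in has_distal_input and any(
            c in has_distal_input for c in siblings):
        return False

    # Rule 3: a cell without apical input is inhibited when a sibling
    # receives both lateral and apical input.
    if cell not in has_apical_input and any(
            c in has_distal_input and c in has_apical_input
            for c in siblings):
        return False

    return True
```

Note that rules 2 and 3 only inhibit relative to siblings: when no cell in the column is predicted, every cell with feed-forward input still fires, preserving the usual TM bursting behavior.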

Other Differences

  • ETM takes one extra parameter compared to regular TM, learnOnOneCell. When set to True, the winner cell for each column stays fixed between resets, enabling, for example, a more stable attention mechanism during sensorimotor inference.

  • When feeding an input record through ETM, an additional argument, formInternalConnections, can be specified. It allows the temporal memory to also form connections with its own internal cells.
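The learnOnOneCell behavior can be sketched as a small cache of winner cells that is cleared only on reset. This is a hypothetical illustration of the described semantics, not code from extended_temporal_memory.py; the class and method names are invented.

```python
import random


class WinnerCellTracker:
    """Illustrative sketch: fix each column's winner cell between resets."""

    def __init__(self, cells_per_column, learn_on_one_cell):
        self.cells_per_column = cells_per_column
        self.learn_on_one_cell = learn_on_one_cell
        self._chosen = {}  # column -> cached winner cell index

    def winner_cell(self, column):
        # With learnOnOneCell, reuse the cached winner for this column.
        if self.learn_on_one_cell and column in self._chosen:
            return self._chosen[column]
        # Otherwise pick a cell in the column (random here for simplicity;
        # the real algorithm prefers best-matching or least-used cells).
        cell = (column * self.cells_per_column
                + random.randrange(self.cells_per_column))
        if self.learn_on_one_cell:
            self._chosen[column] = cell
        return cell

    def reset(self):
        # Winner cells are re-chosen only after a reset.
        self._chosen.clear()
```

Keeping one winner cell per column between resets means the same cells keep learning throughout a sensorimotor sequence, which is what makes the attention mechanism more stable.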