Classifiers¶
SDR Classifier¶
Implementation of an SDR classifier.
The SDR classifier takes the form of a single-layer classification network that takes SDRs as input and outputs a predicted distribution of classes.
class nupic.algorithms.sdr_classifier.SDRClassifier(steps=(1,), alpha=0.001, actValueAlpha=0.3, verbosity=0)¶

The SDR Classifier accepts a binary input pattern from the level below (the “activationPattern”) and information from the sensor and encoders (the “classification”) describing the true (target) input.
The SDR classifier maps input patterns to class labels. There are as many output units as there are class labels or buckets (in the case of scalar encoders). The output is a probabilistic distribution over all class labels.

During inference, the output is calculated by first doing a weighted summation of all the inputs, and then applying a softmax nonlinearity to get the predicted distribution of class labels.

During learning, the connection weights between input units and output units are adjusted to maximize the likelihood of the model.
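To make those two steps concrete, here is a minimal numpy sketch of weighted-summation-plus-softmax inference and of one standard maximum-likelihood weight update for a softmax layer. The function name, array shapes, and exact update rule are illustrative assumptions, not the library's internals:

import numpy

def sdrSoftmaxStep(weights, patternNZ, targetBucket, alpha=0.001):
  """Illustrative only: weights is assumed to be (numInputBits, numBuckets)."""
  # Inference: weighted summation of the active input bits, then softmax.
  activation = weights[patternNZ].sum(axis=0)
  expActivation = numpy.exp(activation - activation.max())  # numerically stable softmax
  predicted = expActivation / expActivation.sum()
  # Learning: move the predicted distribution toward a one-hot target for the
  # observed bucket, scaled by alpha (a generic maximum-likelihood update).
  target = numpy.zeros(weights.shape[1])
  target[targetBucket] = 1.0
  weights[patternNZ] += alpha * (target - predicted)
  return predicted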
The SDR Classifier is a variation of the previous CLAClassifier, which was not based on the references below.
Example Usage:
c = SDRClassifier(steps=[1], alpha=0.1, actValueAlpha=0.1, verbosity=0)

# learning
c.compute(recordNum=0, patternNZ=[1, 5, 9],
          classification={"bucketIdx": 4, "actValue": 34.7},
          learn=True, infer=False)

# inference
result = c.compute(recordNum=1, patternNZ=[1, 5, 9],
                   classification={"bucketIdx": 4, "actValue": 34.7},
                   learn=False, infer=True)

# Print the top three predictions for 1 step out.
topPredictions = sorted(zip(result[1], result["actualValues"]), reverse=True)[:3]
for probability, value in topPredictions:
  print "Prediction of {} has probability of {}.".format(value, probability * 100.0)
References:
- Alex Graves. Supervised Sequence Labelling with Recurrent Neural Networks. PhD Thesis, 2008.
- J. S. Bridle. Probabilistic interpretation of feedforward classification network outputs, with relationships to statistical pattern recognition. In F. Fogelman-Soulié and J. Hérault, editors, Neurocomputing: Algorithms, Architectures and Applications, pp. 227-236. Springer-Verlag, 1990.
Parameters:
- steps – (list) Sequence of the different steps of multi-step predictions to learn.
- alpha – (float) The alpha used to adapt the weight matrix during learning. A larger alpha results in faster adaptation to the data.
- actValueAlpha – (float) Used to track the actual value within each bucket. A lower actValueAlpha results in longer-term memory.
- verbosity – (int) Verbosity level; can be 0, 1, or 2.
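As an illustration of the steps parameter (the specific values below are assumptions), a classifier configured for both 1-step and 5-step-ahead predictions would return one likelihood array per step:

c = SDRClassifier(steps=[1, 5], alpha=0.01, actValueAlpha=0.3, verbosity=0)
# After some learning, compute(..., infer=True) returns a dict with keys 1 and 5
# (one likelihood array per prediction horizon) plus an 'actualValues' entry.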
compute(recordNum, patternNZ, classification, learn, infer)¶

Process one input sample.

This method is called by outer loop code outside the nupic-engine. We use this instead of the nupic engine compute() because our inputs and outputs aren’t fixed-size vectors of reals.
Parameters:
- recordNum – Record number of this input pattern. Record numbers normally increase sequentially by 1 each time unless there are missing records in the dataset. Knowing this information ensures that we don’t get confused by missing records.
- patternNZ – List of the active indices from the output below. When the input is from TemporalMemory, this list should be the indices of the active cells.
- classification – Dict of the classification information, where:
  - bucketIdx: index of the encoder bucket
  - actValue: actual value going into the encoder
  Classification could be None for inference mode.
- learn – (bool) If true, learn this sample.
- infer – (bool) If true, perform inference.
Returns:
Dict containing inference results. There is one entry for each step in self.steps, where the key is the number of steps and the value is an array containing the relative likelihood for each bucketIdx, starting from bucketIdx 0. There is also an entry containing the average actual value to use for each bucket; its key is ‘actualValues’.

For example:

{1 : [0.1, 0.3, 0.2, 0.7],
 4 : [0.2, 0.4, 0.3, 0.5],
 'actualValues': [1.5, 3.5, 5.5, 7.6]}
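As a hedged sketch of reading that return value (the record values and variable names below are illustrative, not taken from the documentation), the most likely bucket for the 1-step prediction and its associated value can be recovered like this:

from nupic.algorithms.sdr_classifier import SDRClassifier
import numpy

c = SDRClassifier(steps=[1], alpha=0.1, actValueAlpha=0.1, verbosity=0)
result = c.compute(recordNum=0, patternNZ=[1, 5, 9],
                   classification={"bucketIdx": 4, "actValue": 34.7},
                   learn=True, infer=True)

bestBucket = numpy.argmax(result[1])                  # index of the most likely bucket, 1 step out
predictedValue = result["actualValues"][bestBucket]   # average actual value tracked for that bucket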
infer(patternNZ, classification)¶

Return the inference value from one input sample. The actual learning happens in compute().
Parameters:
- patternNZ – List of the active indices from the output below.
- classification – Dict of the classification information: bucketIdx (index of the encoder bucket) and actValue (actual value going into the encoder).

Returns:
Dict containing inference results, one entry for each step in self.steps. The key is the number of steps, the value is an array containing the relative likelihood for each bucketIdx, starting from bucketIdx 0.
For example:

{'actualValues': [0.0, 1.0, 2.0, 3.0],
 1 : [0.1, 0.3, 0.2, 0.7],
 4 : [0.2, 0.4, 0.3, 0.5]}
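Continuing with the classifier c from the compute() sketch above, a hedged usage sketch following the signature documented here might look like:

# Inference only; no weight update happens here (learning is done in compute()).
likelihoods = c.infer(patternNZ=[1, 5, 9],
                      classification={"bucketIdx": 4, "actValue": 34.7})
oneStepAhead = likelihoods[1]   # likelihood array for the 1-step prediction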
inferSingleStep(patternNZ, weightMatrix)¶

Perform inference for a single step. Given an SDR input and a weight matrix, return a predicted distribution.
Parameters:
- patternNZ – List of the active indices from the output below.
- weightMatrix – numpy array of the weight matrix.

Returns:
numpy array of the predicted class label distribution.
class nupic.algorithms.sdr_classifier_factory.SDRClassifierFactory¶

Factory for instantiating SDR classifiers.
static create(*args, **kwargs)¶

Create an SDR classifier. The implementation of the SDR Classifier can be specified with the “implementation” keyword argument. Otherwise, the SDRClassifierFactory uses the implementation as specified in the default NuPIC Configuration.
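A hedged usage sketch (the "py" value for the implementation keyword and the forwarding of the remaining keyword arguments to the classifier constructor are assumptions, not taken from the documentation above):

from nupic.algorithms.sdr_classifier_factory import SDRClassifierFactory

# Use whatever implementation the default NuPIC configuration specifies.
c = SDRClassifierFactory.create(steps=[1], alpha=0.001, actValueAlpha=0.3, verbosity=0)

# Explicitly request a particular implementation (keyword value assumed).
cPy = SDRClassifierFactory.create(implementation="py", steps=[1])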