public class BPNN extends AbstractNeuralNet
Modifier and Type | Field and Description
---|---
protected Jama.Matrix[] | dW_
protected Random | r
Jama.Matrix[] | W - Weight Matrix
Fields inherited from class AbstractNeuralNet: m_E, m_H, m_M, m_R, m_Seed
Fields inherited from class ProblemTransformationMethod: m_InstancesTemplate
Constructor and Description |
---|
BPNN() |
Modifier and Type | Method and Description
---|---
double | backPropagate(double[][] X_, double[][] Y_) - Do one round of back-propagation on the batch X_, Y_.
void | buildClassifier(weka.core.Instances D)
double[] | distributionForInstance(weka.core.Instance xy)
Jama.Matrix[] | forwardPass(double[][] X_) - Forward pass: given input X_, get the output of all layers Z[0] ...
void | initWeights(int d, int L, int[] H) - Initialize a BPNN of H.length hidden layers with H[0], H[1], etc. hidden units in each layer (W will be random, and of the corresponding dimensions).
static void | main(String[] args)
double[] | popy(double[] x_) - Forward pass: given input x_, get output y_.
double[][] | popY(double[][] X_) - Forward pass: given input X_, get output Y_.
void | presetWeights(Jama.Matrix[] W, int L) - Initialize a BPNN with (pre-trained) weight matrices W (which also determine the X dimensions).
double | train(double[][] X_, double[][] Y_)
double | train(double[][] X_, double[][] Y_, int I) - Train for I iterations.
double | update(double[][] X_, double[][] Y_) - A single training epoch.
Methods inherited from class AbstractNeuralNet: eTipText, getE, getH, getLearningRate, getMomentum, getOptions, getSeed, hTipText, learningRateTipText, listOptions, momentumTipText, seedTipText, setE, setH, setLearningRate, setMomentum, setOptions, setSeed, toString
Methods inherited from class ProblemTransformationMethod: defaultClassifierString, evaluation, getCapabilities, getModel, getRevision, getTemplate, globalInfo, makeCopies, runClassifier, testCapabilities
Methods inherited from class weka.classifiers.SingleClassifierEnhancer: classifierTipText, defaultClassifierOptions, getClassifier, getClassifierSpec, postExecution, preExecution, setClassifier
Methods inherited from class weka.classifiers.AbstractClassifier: batchSizeTipText, classifyInstance, debugTipText, distributionsForInstances, doNotCheckCapabilitiesTipText, forName, getBatchSize, getDebug, getDoNotCheckCapabilities, getNumDecimalPlaces, implementsMoreEfficientBatchPrediction, makeCopies, makeCopy, numDecimalPlacesTipText, run, runClassifier, setBatchSize, setDebug, setDoNotCheckCapabilities, setNumDecimalPlaces
Methods inherited from class java.lang.Object: clone, equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait
Methods inherited from interface meka.classifiers.multilabel.MultiLabelClassifier: debugTipText, getDebug, setDebug
public Jama.Matrix[] W
protected Random r
protected Jama.Matrix[] dW_
public void buildClassifier(weka.core.Instances D) throws Exception
Specified by: buildClassifier in interface weka.classifiers.Classifier
Overrides: buildClassifier in class ProblemTransformationMethod
Throws: Exception
public double[] distributionForInstance(weka.core.Instance xy) throws Exception
Specified by: distributionForInstance in interface weka.classifiers.Classifier
Overrides: distributionForInstance in class ProblemTransformationMethod
Throws: Exception
public void presetWeights(Jama.Matrix[] W, int L) throws Exception
Parameters:
W - pre-trained weight matrices (should include bias weights; assumes W[-1]-1 hidden units in the penultimate layer, not including the bias)
L - the number of labels (for making the final matrix)
Throws: Exception
public void initWeights(int d, int L, int[] H) throws Exception
Parameters:
d - number of visible units
L - number of labels (output units)
H - number of units in the hidden layers; H.length = number of hidden layers. CURRENTLY LIMITED TO 1.
Throws: Exception
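The dimensions implied above can be sketched as follows. This is an illustrative sketch using plain arrays, not MEKA's actual implementation (which uses Jama matrices): with d visible units, one hidden layer of h units, and L output units, the two weight matrices, each carrying an extra bias row, would be (d+1)×h and (h+1)×L.

```java
import java.util.Random;

// Illustrative sketch (not MEKA's code): build random weight matrices for a
// one-hidden-layer network with bias rows, matching the shapes initWeights implies.
public class InitWeightsSketch {

    // Returns { W0, W1 } where W0 is (d+1) x h and W1 is (h+1) x L.
    // The extra row in each matrix holds the bias weights.
    static double[][][] initWeights(int d, int L, int h, long seed) {
        Random r = new Random(seed);
        double[][] W0 = new double[d + 1][h];
        double[][] W1 = new double[h + 1][L];
        for (double[] row : W0)
            for (int j = 0; j < h; j++) row[j] = r.nextGaussian() * 0.1;
        for (double[] row : W1)
            for (int j = 0; j < L; j++) row[j] = r.nextGaussian() * 0.1;
        return new double[][][] { W0, W1 };
    }

    public static void main(String[] args) {
        double[][][] W = initWeights(10, 3, 5, 42);
        System.out.println(W[0].length + "x" + W[0][0].length); // 11x5
        System.out.println(W[1].length + "x" + W[1][0].length); // 6x3
    }
}
```

The small Gaussian scale (0.1) is an assumption for illustration; the actual distribution MEKA draws from is not specified here.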
public double train(double[][] X_, double[][] Y_, int I) throws Exception
Throws: Exception
public double update(double[][] X_, double[][] Y_) throws Exception
Throws: Exception
public double[] popy(double[] x_)
Parameters:
x_ - input
public double[][] popY(double[][] X_)
Parameters:
X_ - input
public Jama.Matrix[] forwardPass(double[][] X_)
Parameters:
X_ - input (no bias included)
public double backPropagate(double[][] X_, double[][] Y_) throws Exception
Parameters:
X_ - input
Y_ - teacher values
Throws: Exception
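To make the forward pass and one round of back-propagation concrete, here is a self-contained sketch for a single hidden layer, using plain double[][] arrays rather than Jama matrices. The method names mirror the API above, but the details (sigmoid activations, squared error, plain gradient descent with no momentum) are assumptions for illustration, not MEKA's exact implementation.

```java
import java.util.Random;

// Illustrative sketch (not MEKA's exact code): forward pass and one round of
// back-propagation for a single-hidden-layer network with sigmoid units.
public class BackPropSketch {
    double[][] W0, W1;   // (d+1) x h and (h+1) x L; the last row holds bias weights
    double rate = 0.1;   // learning rate (MEKA exposes this via setLearningRate)

    BackPropSketch(int d, int h, int L, long seed) {
        Random r = new Random(seed);
        W0 = new double[d + 1][h];
        W1 = new double[h + 1][L];
        for (double[] row : W0) for (int j = 0; j < h; j++) row[j] = r.nextGaussian() * 0.1;
        for (double[] row : W1) for (int j = 0; j < L; j++) row[j] = r.nextGaussian() * 0.1;
    }

    static double sigmoid(double a) { return 1.0 / (1.0 + Math.exp(-a)); }

    // One layer: implicit bias input of 1.0, multiply by W, squash.
    static double[] layer(double[] x, double[][] W) {
        double[] z = new double[W[0].length];
        for (int j = 0; j < z.length; j++) {
            double a = W[W.length - 1][j];            // bias weight
            for (int i = 0; i < x.length; i++) a += x[i] * W[i][j];
            z[j] = sigmoid(a);
        }
        return z;
    }

    // Forward pass: given input x_, get output y_ (cf. popy / forwardPass).
    double[] popy(double[] x_) { return layer(layer(x_, W0), W1); }

    // One round of back-propagation on a single example; returns the squared error.
    double backPropagate(double[] x_, double[] y_) {
        double[] z = layer(x_, W0);                   // hidden activations
        double[] o = layer(z, W1);                    // outputs
        int h = z.length, L = o.length;
        double err = 0;
        double[] dOut = new double[L];
        for (int k = 0; k < L; k++) {
            double e = o[k] - y_[k];
            err += e * e;
            dOut[k] = e * o[k] * (1 - o[k]);          // delta at output units
        }
        double[] dHid = new double[h];
        for (int j = 0; j < h; j++) {
            double s = 0;
            for (int k = 0; k < L; k++) s += dOut[k] * W1[j][k];
            dHid[j] = s * z[j] * (1 - z[j]);          // delta at hidden units
        }
        for (int k = 0; k < L; k++) {                 // update W1 (incl. bias row)
            for (int j = 0; j < h; j++) W1[j][k] -= rate * dOut[k] * z[j];
            W1[h][k] -= rate * dOut[k];
        }
        for (int j = 0; j < h; j++) {                 // update W0 (incl. bias row)
            for (int i = 0; i < x_.length; i++) W0[i][j] -= rate * dHid[j] * x_[i];
            W0[x_.length][j] -= rate * dHid[j];
        }
        return err;
    }

    public static void main(String[] args) {
        BackPropSketch net = new BackPropSketch(3, 4, 2, 7);
        double[] x = {0.2, 0.5, 0.9};
        double[] y = {1.0, 0.0};
        for (int i = 0; i < 5; i++)
            System.out.println("squared error: " + net.backPropagate(x, y));
    }
}
```

Under this reading, repeating backPropagate over a batch gives one update epoch, and train(X_, Y_, I) amounts to I such epochs.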
Copyright © 2017. All Rights Reserved.