public class BPNN extends AbstractNeuralNet
Modifier and Type | Field and Description
---|---
Jama.Matrix[] | W: Weight Matrix
Constructor and Description |
---|
BPNN() |
Modifier and Type | Method and Description
---|---
double | backPropagate(double[][] X_, double[][] Y_): do one round of back-propagation on batch X_, Y_.
void | buildClassifier(weka.core.Instances D)
double[] | distributionForInstance(weka.core.Instance xy)
Jama.Matrix[] | forwardPass(double[][] X_): given input X_, get the output of all layers Z[0]...
void | initWeights(int d, int L, int[] H): initialize a BPNN of H.length hidden layers with H[0], H[1], etc. hidden units in each layer (W will be random, and of the corresponding dimensions).
static void | main(java.lang.String[] args)
double[] | popy(double[] x_): forward pass; given input x_, get output y_.
double[][] | popY(double[][] X_): forward pass; given input X_, get output Y_.
void | setWeights(Jama.Matrix[] W, int L): initialize a BPNN with (pre-trained) weight matrices W (which also determines the dimensions of X).
double | train(double[][] X_, double[][] Y_)
double | train(double[][] X_, double[][] Y_, int I): train for I iterations.
double | update(double[][] X_, double[][] Y_): a single training epoch.
Methods inherited from class AbstractNeuralNet: getE, getH, getOptions, listOptions, setE, setH, setOptions, toString
Methods inherited from class MultilabelClassifier: evaluation, getCapabilities, getRevision, getTemplate, globalInfo, makeCopies, runClassifier, testCapabilities
Other inherited methods: classifierTipText, getClassifier, setClassifier
public void buildClassifier(weka.core.Instances D) throws java.lang.Exception
Specified by: buildClassifier in interface weka.classifiers.Classifier
Overrides: buildClassifier in class MultilabelClassifier
Throws: java.lang.Exception

public double[] distributionForInstance(weka.core.Instance xy) throws java.lang.Exception
Specified by: distributionForInstance in interface weka.classifiers.Classifier
Overrides: distributionForInstance in class MultilabelClassifier
Throws: java.lang.Exception
public void setWeights(Jama.Matrix[] W, int L) throws java.lang.Exception
Parameters:
W - pre-trained weight matrices (should include bias weights; assumes W[-1]-1 hidden units in the penultimate layer, not including the bias)
L - the number of labels (for making the final matrix)
Throws: java.lang.Exception
public void initWeights(int d, int L, int[] H) throws java.lang.Exception
Parameters:
d - number of visible units
L - number of labels (output units)
H - number of units in the hidden layers; H.length = number of hidden layers. CURRENTLY LIMITED TO 1.
Throws: java.lang.Exception
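The layer-sizing convention described above (d visible units, H[i] units per hidden layer, L output units, with an extra bias weight per unit) can be illustrated with a small self-contained sketch in plain Java. This is not MEKA's implementation, and `randomLayer`/`initWeights` here are hypothetical helper names; it only shows the dimensions that result.

```java
import java.util.Random;

// Sketch of weight initialization for a net with d inputs, hidden
// layers of sizes H[0], H[1], ..., and L outputs. Each weight matrix
// gets one extra row for the bias weights.
class InitWeightsSketch {

    // Returns an (in + 1) x out matrix of small random weights;
    // the extra row holds the bias weights.
    static double[][] randomLayer(int in, int out, Random r) {
        double[][] W = new double[in + 1][out];
        for (int i = 0; i < W.length; i++)
            for (int j = 0; j < out; j++)
                W[i][j] = 0.1 * r.nextGaussian();
        return W;
    }

    // W[0]: input -> first hidden layer, ..., W[H.length]: last hidden
    // layer -> output. (The docs above note H.length is currently
    // limited to 1 in BPNN itself.)
    static double[][][] initWeights(int d, int L, int[] H) {
        Random r = new Random(0);
        double[][][] W = new double[H.length + 1][][];
        int in = d;
        for (int i = 0; i < H.length; i++) {
            W[i] = randomLayer(in, H[i], r);
            in = H[i];
        }
        W[H.length] = randomLayer(in, L, r);
        return W;
    }

    public static void main(String[] args) {
        double[][][] W = initWeights(5, 3, new int[]{10});
        System.out.println(W[0].length + "x" + W[0][0].length); // 6x10
        System.out.println(W[1].length + "x" + W[1][0].length); // 11x3
    }
}
```

With d = 5, H = {10}, L = 3 this yields a 6x10 input-to-hidden matrix and an 11x3 hidden-to-output matrix, the "+1" rows being the biases.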
public double train(double[][] X_, double[][] Y_) throws java.lang.Exception
Throws: java.lang.Exception

public double train(double[][] X_, double[][] Y_, int I) throws java.lang.Exception
Throws: java.lang.Exception

public double update(double[][] X_, double[][] Y_) throws java.lang.Exception
Throws: java.lang.Exception
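The relationship between update (a single training epoch) and train(X_, Y_, I) (I iterations of it) can be sketched with a self-contained toy: one epoch of gradient descent on a linear model, repeated I times. The method names mirror the API above, but this is an illustrative stand-in, not MEKA's code.

```java
// Sketch: update(...) performs one epoch (a full pass over the batch),
// train(...) repeats it I times and returns the final error, loosely
// mirroring update(X_, Y_) and train(X_, Y_, I) above.
class TrainSketch {
    static double[] w = {0.0, 0.0};   // weights of a linear model y = x . w

    static double predict(double[] x) {
        return x[0] * w[0] + x[1] * w[1];
    }

    // One epoch of per-example gradient descent; returns mean squared error.
    static double update(double[][] X, double[] y) {
        double err = 0.0;
        for (int n = 0; n < X.length; n++) {
            double e = predict(X[n]) - y[n];
            err += e * e;
            for (int j = 0; j < w.length; j++)
                w[j] -= 0.1 * e * X[n][j];   // gradient step, learning rate 0.1
        }
        return err / X.length;
    }

    // Train for I iterations (epochs), as in train(X_, Y_, I).
    static double train(double[][] X, double[] y, int I) {
        double err = 0.0;
        for (int i = 0; i < I; i++) err = update(X, y);
        return err;
    }

    public static void main(String[] args) {
        double[][] X = {{1, 0}, {0, 1}, {1, 1}};
        double[] y = {1.0, 2.0, 3.0};          // exactly y = 1*x0 + 2*x1
        double before = update(X, y);          // one epoch
        double after = train(X, y, 200);       // many more epochs
        System.out.println(after < before);    // error shrinks: true
    }
}
```

The data is exactly linear, so repeated epochs drive the error toward zero; the returned value, like the double returned by train above, is the training error after the last epoch.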
public double[] popy(double[] x_)
Parameters:
x_ - input

public double[][] popY(double[][] X_)
Parameters:
X_ - input

public Jama.Matrix[] forwardPass(double[][] X_)
Parameters:
X_ - input (no bias included)

public double backPropagate(double[][] X_, double[][] Y_) throws java.lang.Exception
Parameters:
X_ - input
Y_ - teacher values
Throws: java.lang.Exception
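What forwardPass (output of every layer, Z[0]...) and backPropagate (one gradient round on a batch against teacher values) do can be sketched self-containedly for a single sigmoid hidden layer, using plain arrays instead of Jama.Matrix. This is a minimal illustration of the technique, not MEKA's actual code, and it omits bias units for brevity.

```java
import java.util.Random;

// One sigmoid hidden layer and sigmoid outputs. forwardPass returns the
// output of every layer (like Z[0]... above); backPropagate does one
// gradient round on batch (X, Y) and returns the summed squared error.
class BackpropSketch {
    double[][] W0, W1;        // input->hidden and hidden->output weights
    double rate = 0.5;        // learning rate (arbitrary for the sketch)

    BackpropSketch(int d, int h, int L) {
        Random r = new Random(1);
        W0 = new double[d][h];
        W1 = new double[h][L];
        for (double[] row : W0) for (int j = 0; j < row.length; j++) row[j] = r.nextGaussian();
        for (double[] row : W1) for (int j = 0; j < row.length; j++) row[j] = r.nextGaussian();
    }

    static double sig(double a) { return 1.0 / (1.0 + Math.exp(-a)); }

    static double[] layer(double[] x, double[][] W) {
        double[] z = new double[W[0].length];
        for (int j = 0; j < z.length; j++) {
            double a = 0;
            for (int i = 0; i < x.length; i++) a += x[i] * W[i][j];
            z[j] = sig(a);
        }
        return z;
    }

    // Z[0][n] = hidden activations, Z[1][n] = outputs, for example n.
    double[][][] forwardPass(double[][] X) {
        double[][][] Z = new double[2][X.length][];
        for (int n = 0; n < X.length; n++) {
            Z[0][n] = layer(X[n], W0);
            Z[1][n] = layer(Z[0][n], W1);
        }
        return Z;
    }

    // One round of back-propagation on the batch (Y = teacher values).
    double backPropagate(double[][] X, double[][] Y) {
        double[][][] Z = forwardPass(X);
        double err = 0;
        for (int n = 0; n < X.length; n++) {
            double[] h = Z[0][n], y = Z[1][n];
            double[] dOut = new double[y.length];
            for (int k = 0; k < y.length; k++) {
                double e = y[k] - Y[n][k];
                err += e * e;
                dOut[k] = e * y[k] * (1 - y[k]);      // sigmoid derivative
            }
            double[] dHid = new double[h.length];     // deltas via old W1
            for (int j = 0; j < h.length; j++) {
                double s = 0;
                for (int k = 0; k < dOut.length; k++) s += dOut[k] * W1[j][k];
                dHid[j] = s * h[j] * (1 - h[j]);
            }
            for (int j = 0; j < h.length; j++)        // update output layer
                for (int k = 0; k < dOut.length; k++)
                    W1[j][k] -= rate * dOut[k] * h[j];
            for (int i = 0; i < X[n].length; i++)     // update hidden layer
                for (int j = 0; j < h.length; j++)
                    W0[i][j] -= rate * dHid[j] * X[n][i];
        }
        return err;
    }

    public static void main(String[] args) {
        BackpropSketch net = new BackpropSketch(2, 4, 1);
        double[][] X = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
        double[][] Y = {{0}, {1}, {1}, {0}};          // XOR teacher values
        double first = net.backPropagate(X, Y);
        double last = 0;
        for (int i = 0; i < 500; i++) last = net.backPropagate(X, Y);
        System.out.println(last < first);             // error shrinks
    }
}
```

Note that the hidden deltas are computed from W1 before it is updated, which is the detail that makes the round a correct gradient step; popy/popY above correspond to reading off just the final layer of such a pass.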
public static void main(java.lang.String[] args) throws java.lang.Exception
Throws: java.lang.Exception