DaNNet
Class Hierarchy
This inheritance list is sorted roughly, but not completely, alphabetically:
 dnn::layer - Layer base class
   dnn::layer_act - Activation layer base class
     dnn::act_LReLU - Leaky ReLU activation class
     dnn::act_ReLU - ReLU activation class
     dnn::act_sigmoid - Sigmoid activation class
     dnn::act_softmax - Softmax activation class
     dnn::act_softplus - Softplus activation class
     dnn::act_tanh - Tanh activation class
   dnn::layer_conv - A convolution layer class
   dnn::layer_cost - Cost/output layer base class
     dnn::cost_CE - Cross Entropy cost/output layer class with linear activation
     dnn::cost_CE_sigmoid - Cross Entropy cost/output layer class with sigmoid activation
     dnn::cost_CE_softmax - Cross Entropy cost/output layer class with softmax activation
     dnn::cost_MSE - Mean Square Error cost/output layer class
     dnn::cost_MSE_sigmoid - Mean Square Error cost/output layer class with sigmoid activation
   dnn::layer_dense - A fully connected/dense layer class
   dnn::layer_drop - Dropout layer class
   dnn::layer_input - Input/data layer class
   dnn::layer_norm - Batch normalization layer class
   dnn::layer_pool - Pooling layer base class
     dnn::pool_average - Average pooling layer class
     dnn::pool_max - Max pooling layer class
 dnn::net - Neural network model class
 dnn::opt - Optimizer base class
   dnn::opt_adadelta - ADAdelta optimizer class
   dnn::opt_adagrad - ADAgrad optimizer class
   dnn::opt_adam - ADAM optimizer class
   dnn::opt_adamax - ADAMax optimizer class
   dnn::opt_rmsprop - RMSprop optimizer class
   dnn::opt_SGD - Stochastic Gradient Descent optimizer class
   dnn::opt_SGD_momentum - Stochastic Gradient Descent with momentum optimizer class
   dnn::opt_SGD_nesterov - Stochastic Gradient Descent with Nesterov momentum optimizer class
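The hierarchy suggests the intended composition: layer-derived objects are stacked inside a dnn::net model and trained with one of the dnn::opt optimizers. The sketch below is only an illustration of that structure; the header name, constructor arguments, and method names (add_layer, set_optimizer, train) are assumptions and may differ from the actual DaNNet API.

    #include "dnn.h"   // main DaNNet header (name assumed)

    int main()
    {
        // Assemble a small fully connected classifier from the layer classes above.
        // Constructor signatures and method names are assumptions for illustration.
        dnn::net model;
        model.add_layer(new dnn::layer_input(28, 28, 1));   // 28x28 single-channel input
        model.add_layer(new dnn::layer_dense(100));         // fully connected, 100 units
        model.add_layer(new dnn::act_ReLU());               // ReLU activation
        model.add_layer(new dnn::layer_dense(10));          // 10 output units
        model.add_layer(new dnn::cost_CE_softmax());        // softmax + cross entropy output

        // Pick an optimizer from the dnn::opt family.
        dnn::opt_adam optimizer;
        model.set_optimizer(optimizer);

        // Training would then iterate over data batches, e.g.
        // model.train(training_data, training_labels);      // hypothetical call
        return 0;
    }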