DaNNet

dnn::opt Class Reference

Optimizer base class.

#include <dnn_opt.h>
Public Member Functions

    opt ()
    ~opt ()
    virtual void apply (arma::Cube< DNN_Dtype > &W, arma::Mat< DNN_Dtype > &B, const arma::Cube< DNN_Dtype > &Wgrad, const arma::Mat< DNN_Dtype > &Bgrad)=0
        Apply the optimizer to the layer parameters.
    virtual std::string get_algorithm (void)
        Get the optimizer algorithm information.
    void set_learn_rate_alg (LR_ALG alg, DNN_Dtype a=0.0, DNN_Dtype b=10.0)
        Set learning rate algorithm.
    void update_learn_rate (void)
        Update learning rate.
    DNN_Dtype get_learn_rate (void)
        Get the learning rate.
Protected Attributes

    std::string alg
    DNN_Dtype lr
        Learning rate.
    DNN_Dtype reg_lambda
        Regularisation parameter lambda.
    DNN_Dtype reg_alpha
        Elastic net mix parameter: 0 = ridge (L2) .. 1 = LASSO (L1).
    LR_ALG lr_alg
        Learning rate schedule algorithm.
    DNN_Dtype lr_0
        Init value for lr.
    DNN_Dtype lr_a
        Internal parameter a.
    DNN_Dtype lr_b
        Internal parameter b.
    arma::uword it
        Iteration counter.
Detailed Description

Optimizer base class.

Implements the optimizer for finding the minimum of the cost function with respect to the layer's trainable parameters.
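The intended call pattern can be illustrated with a short sketch. The helper below is hypothetical, not DaNNet code: the function name train_step is an assumption, and only the apply() and update_learn_rate() members documented on this page are used.

    #include <armadillo>
    #include "dnn_opt.h"  // declares dnn::opt; DNN_Dtype is assumed visible through it

    // Hypothetical helper: one optimization step for a single layer.
    // A concrete subclass of dnn::opt (e.g. dnn::opt_SGD) is passed by reference.
    void train_step(dnn::opt &optimizer,
                    arma::Cube<DNN_Dtype> &W, arma::Mat<DNN_Dtype> &B,
                    const arma::Cube<DNN_Dtype> &Wgrad, const arma::Mat<DNN_Dtype> &Bgrad)
    {
        optimizer.apply(W, B, Wgrad, Bgrad);  // update the learnable parameters in place
        optimizer.update_learn_rate();        // advance the learning-rate schedule
    }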
Member Function Documentation

apply()

    virtual void dnn::opt::apply (arma::Cube< DNN_Dtype > &W, arma::Mat< DNN_Dtype > &B, const arma::Cube< DNN_Dtype > &Wgrad, const arma::Mat< DNN_Dtype > &Bgrad)  [pure virtual]

Apply the optimizer to the layer parameters.

Parameters
    [in,out]  W, B          Learnable parameters
    [in]      Wgrad, Bgrad  Gradient of the learnable parameters

Implemented in dnn::opt_rmsprop, dnn::opt_adagrad, dnn::opt_adadelta, dnn::opt_adamax, dnn::opt_adam, dnn::opt_SGD_nesterov, dnn::opt_SGD_momentum, and dnn::opt_SGD.
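To make the contract concrete, here is a minimal sketch of what an implementing subclass could look like. It is written from this page alone and is not the shipped dnn::opt_SGD: the class name is hypothetical, and it assumes only the protected members lr and alg plus a textbook gradient-descent update.

    #include <armadillo>
    #include "dnn_opt.h"

    namespace dnn {

    // Hypothetical minimal optimizer, not the library's opt_SGD.
    class opt_sgd_sketch : public opt
    {
    public:
        explicit opt_sgd_sketch(DNN_Dtype learn_rate)
        {
            lr  = learn_rate;     // protected member of dnn::opt
            alg = "SGD (sketch)"; // reported by get_algorithm()
        }

        // Plain gradient descent: step against the gradient, scaled by lr.
        void apply(arma::Cube<DNN_Dtype> &W, arma::Mat<DNN_Dtype> &B,
                   const arma::Cube<DNN_Dtype> &Wgrad, const arma::Mat<DNN_Dtype> &Bgrad) override
        {
            W -= lr * Wgrad;
            B -= lr * Bgrad;
        }

        std::string get_algorithm(void) override { return alg; }
    };

    } // namespace dnn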
get_algorithm()

    virtual std::string dnn::opt::get_algorithm (void)  [inline], [virtual]

Get the optimizer algorithm information.

Reimplemented in dnn::opt_rmsprop, dnn::opt_adagrad, dnn::opt_adadelta, dnn::opt_adamax, dnn::opt_adam, dnn::opt_SGD_nesterov, dnn::opt_SGD_momentum, and dnn::opt_SGD.
set_learn_rate_alg()

    void dnn::opt::set_learn_rate_alg (LR_ALG alg, DNN_Dtype a = 0.0, DNN_Dtype b = 10.0)  [inline]

Set learning rate algorithm.

Parameters
    [in]  alg  Algorithm
    [in]  a    Parameter a
    [in]  b    Parameter b

Sets the learning rate schedule algorithm and its parameters, where t is the iteration counter:

    CONST:       constant learning rate           lr = lr_0
    TIME_DECAY:  time-based decay                 lr = lr_0 / (1 + a*t)
    STEP_DECAY:  stepped decay                    lr = lr_0 * a^floor(t/b)
    EXP_DECAY:   exponentially decreasing decay   lr = lr_0 * exp(-a*t)
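The four schedules are simple enough to restate directly in code. The sketch below is a standalone paraphrase of the formulas above, not library source: the enumerator names are taken from this page, while DNN_Dtype = float and the function name scheduled_lr are assumptions.

    #include <cmath>

    enum LR_ALG { CONST, TIME_DECAY, STEP_DECAY, EXP_DECAY };  // names as listed above
    using DNN_Dtype = float;                                   // assumed precision

    // Learning rate after t iterations, given initial rate lr_0 and parameters a, b.
    DNN_Dtype scheduled_lr(LR_ALG alg, DNN_Dtype lr_0, DNN_Dtype a, DNN_Dtype b, unsigned t)
    {
        switch (alg) {
            case TIME_DECAY: return lr_0 / (1 + a * t);                     // time-based decay
            case STEP_DECAY: return lr_0 * std::pow(a, std::floor(t / b));  // drop every b iterations
            case EXP_DECAY:  return lr_0 * std::exp(-a * t);                // exponential decay
            case CONST:
            default:         return lr_0;                                   // constant rate
        }
    }

For example, scheduled_lr(STEP_DECAY, 0.1f, 0.5f, 10.0f, t) halves the rate every 10 iterations.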
update_learn_rate()

    void dnn::opt::update_learn_rate (void)  [inline]

Update learning rate.

get_learn_rate()

    DNN_Dtype dnn::opt::get_learn_rate (void)

Get the learning rate.
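get_algorithm() and get_learn_rate() are natural hooks for progress logging. A hypothetical fragment, using only members documented on this page:

    #include <armadillo>
    #include <iostream>
    #include "dnn_opt.h"

    // Hypothetical monitoring helper called from a training loop.
    void report_progress(dnn::opt &optimizer, arma::uword iter)
    {
        if (iter % 100 == 0)  // every 100 iterations (arbitrary choice)
            std::cout << optimizer.get_algorithm()
                      << ": lr = " << optimizer.get_learn_rate() << '\n';
    }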