ADAM optimizer class.
#include <dnn_opt.h>
Public Member Functions

opt_adam (DNN_Dtype s, DNN_Dtype l=0.0, DNN_Dtype a=0.0, DNN_Dtype b1=0.9, DNN_Dtype b2=0.999, DNN_Dtype e=1e-8)
    ADAM constructor.

~opt_adam ()

void apply (arma::Cube< DNN_Dtype > &W, arma::Mat< DNN_Dtype > &B, const arma::Cube< DNN_Dtype > &Wgrad, const arma::Mat< DNN_Dtype > &Bgrad)
    Apply the optimizer to the layer parameters.

std::string get_algorithm (void)
    Get the optimizer algorithm information.

Public Member Functions inherited from dnn::opt

opt ()

~opt ()

void set_learn_rate_alg (LR_ALG alg, DNN_Dtype a=0.0, DNN_Dtype b=10.0)
    Set the learning rate algorithm.

void update_learn_rate (void)
    Update the learning rate.

DNN_Dtype get_learn_rate (void)
    Get the learning rate.
|
Detailed Description

ADAM optimizer class. For an overview of the ADAM algorithm, see https://arxiv.org/pdf/1609.04747.pdf
Definition at line 356 of file dnn_opt.h.
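The update rule this class presumably implements is the standard bias-corrected ADAM step, written here with the constructor's parameters (s for the step size, b1/b2 for the moment decay rates, e for eps); this is a sketch of the textbook algorithm, not code taken from dnn_opt.h:

```latex
\begin{aligned}
m_t &= \beta_1\, m_{t-1} + (1-\beta_1)\, g_t \\
v_t &= \beta_2\, v_{t-1} + (1-\beta_2)\, g_t^2 \\
\hat{m}_t &= m_t / (1-\beta_1^t), \qquad \hat{v}_t = v_t / (1-\beta_2^t) \\
\theta_t &= \theta_{t-1} - s\, \hat{m}_t / \left(\sqrt{\hat{v}_t} + \epsilon\right)
\end{aligned}
```

Here \(g_t\) is the gradient at step \(t\), and \(m_t, v_t\) are the running first- and second-moment estimates.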
◆ opt_adam()
ADAM constructor.
Parameters

    [in]  s   Step size (learning rate)
    [in]  l   Regularisation parameter lambda
    [in]  a   Regularisation parameter alpha
    [in]  b1  beta1 parameter (first-moment decay rate)
    [in]  b2  beta2 parameter (second-moment decay rate)
    [in]  e   eps parameter (numerical-stability constant)
Definition at line 377 of file dnn_opt.h.
◆ ~opt_adam()
dnn::opt_adam::~opt_adam ( ) [inline]
◆ apply()
Apply the optimizer to the layer parameters.
Parameters

    [in,out]  W, B          Learnable parameters
    [in]      Wgrad, Bgrad  Gradients of the learnable parameters
Implements dnn::opt.
Definition at line 396 of file dnn_opt.h.
◆ get_algorithm()
std::string dnn::opt_adam::get_algorithm (void) [inline, virtual]
Get the optimizer algorithm information.
Returns
    Algorithm information string.
Reimplemented from dnn::opt.
Definition at line 438 of file dnn_opt.h.
Member Data

beta1
beta2
eps
mB
vB
The documentation for this class was generated from the following file: dnn_opt.h