AdaGrad optimizer class.
#include <dnn_opt.h>
AdaGrad optimizer class.
Implements the AdaGrad algorithm; see https://arxiv.org/pdf/1609.04747.pdf (Ruder, "An overview of gradient descent optimization algorithms") for a description.
Definition at line 648 of file dnn_opt.h.
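For reference, the standard AdaGrad update can be written per parameter as below, using the constructor's step size s and eps e, with v the running cache of squared gradients. (This is the textbook formulation; whether dnn_opt.h places e inside or outside the square root is not shown on this page.)

    v_t = v_{t-1} + g_t^2
    \theta_t = \theta_{t-1} - \frac{s}{\sqrt{v_t} + e} \, g_t

Because v_t only grows, the effective step size shrinks over time, which is the characteristic behaviour of AdaGrad.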
◆ opt_adagrad()
AdaGrad constructor.
- Parameters
  [in]  s  Step size (learning rate)
  [in]  l  Regularisation parameter lambda
  [in]  a  Regularisation parameter alpha
  [in]  e  eps parameter (numerical-stability constant)
Definition at line 663 of file dnn_opt.h.
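A minimal construction sketch, assuming the four-argument order listed above and float arguments (the exact parameter types are not shown on this page; the values are hypothetical):

    #include <dnn_opt.h>

    int main() {
        // Step size 0.01, lambda 0, alpha 0, eps 1e-8 (typical AdaGrad choices).
        dnn::opt_adagrad opt(0.01f, 0.0f, 0.0f, 1e-8f);
        return 0;
    }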
◆ ~opt_adagrad()
dnn::opt_adagrad::~opt_adagrad ( )  [inline]
Destructor.
◆ apply()
Apply the optimizer to the layer parameters.
- Parameters
  [in,out]  W,B          Learnable parameters
  [in]      Wgrad,Bgrad  Gradient of the learnable parameters
Implements dnn::opt.
Definition at line 680 of file dnn_opt.h.
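For illustration only, a sketch of the per-element work an AdaGrad apply() step typically performs. This is not the dnn_opt.h implementation: the function name, container types, and the omission of the lambda/alpha regularisation terms are all assumptions here.

    #include <cmath>
    #include <vector>

    // Hypothetical stand-alone AdaGrad step: 'v' accumulates squared
    // gradients across calls, 's' is the step size, 'eps' the stability term.
    void adagrad_step(std::vector<double>& param,
                      const std::vector<double>& grad,
                      std::vector<double>& v,  // same size as param, zero-initialised
                      double s, double eps)
    {
        for (std::size_t i = 0; i < param.size(); ++i) {
            v[i] += grad[i] * grad[i];                          // v_t = v_{t-1} + g^2
            param[i] -= s * grad[i] / (std::sqrt(v[i]) + eps);  // theta -= s*g/(sqrt(v)+eps)
        }
    }

In the class itself, apply() would run this update twice, once for (W, Wgrad) with a weight cache and once for (B, Bgrad) with a bias cache (plausibly the vB member listed below).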
◆ get_algorithm()
std::string dnn::opt_adagrad::get_algorithm ( void )  [inline, virtual]
Get the optimizer algorithm information.
- Returns
- Algorithm information string
Reimplemented from dnn::opt.
Definition at line 716 of file dnn_opt.h.
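A usage sketch: since get_algorithm() overrides a dnn::opt virtual, it can be called through the base interface. The constructor arguments are hypothetical and the exact return text is not documented on this page.

    #include <dnn_opt.h>
    #include <iostream>

    int main() {
        dnn::opt_adagrad adagrad(0.01f, 0.0f, 0.0f, 1e-8f);
        dnn::opt& base = adagrad;                   // use through the base interface
        std::cout << base.get_algorithm() << '\n';  // prints the algorithm info string
        return 0;
    }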
◆ eps
Numerical-stability term added to the denominator of the update; set from the constructor parameter e.
◆ vB
Per-element accumulator of squared bias gradients, presumably the AdaGrad cache for B updated on each apply() call.
The documentation for this class was generated from the following file:
- dnn_opt.h