TensorFlow Tutorial
- Recommendations for Neural Network Training
- Image Recognition using TensorFlow
- TensorFlow - Forming Graphs
- Gradient Descent Optimization
- TensorFlow - XOR Implementation
- TensorFlow - Optimizers
- Hidden Layers of Perceptron
- Multi-Layer Perceptron Learning
- TensorFlow - Exporting
- TensorFlow - Distributed Computing
- TensorFlow - Keras
- CNN and RNN Difference
- TFLearn and its installation
- TensorFlow - Linear Regression
- Single Layer Perceptron
- TensorFlow - Word Embedding
- TensorBoard Visualization
- Recurrent Neural Networks
- Convolutional Neural Networks
- TensorFlow - Basics
TensorFlow - Optimizers
Optimizers are extended classes that include additional information for training a specific model. The optimizer class is initialized with the given parameters, but it is important to remember that no Tensor is needed. Optimizers are used to improve the speed and performance of training a specific model.
The basic optimizer of TensorFlow is −
tf.train.Optimizer
This class is defined in tensorflow/python/training/optimizer.py.
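As a minimal sketch of how this class is typically used (assuming the TensorFlow 1.x API; the variable and loss names below are illustrative, not from this tutorial), a built-in subclass such as tf.train.GradientDescentOptimizer can minimize a loss directly −

import tensorflow as tf

# A toy quadratic loss over one trainable variable (illustrative example)
x = tf.Variable(5.0, name = "x")
loss = tf.square(x - 3.0)

# GradientDescentOptimizer is a built-in subclass of tf.train.Optimizer
optimizer = tf.train.GradientDescentOptimizer(learning_rate = 0.1)
train_op = optimizer.minimize(loss)

with tf.Session() as sess:
   sess.run(tf.global_variables_initializer())
   for _ in range(50):
      sess.run(train_op)
   print(sess.run(x))   # moves towards the minimum at 3.0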
Following are some optimizers in Tensorflow −
Stochastic Gradient Descent
Stochastic Gradient Descent with gradient clipping (a sketch follows this list)
Momentum
Nesterov momentum
Adagrad
Adadelta
RMSProp
Adam
Adamax
SMORMS3
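Note that gradient clipping is not a separate optimizer class; it is usually combined with an optimizer by hand. A minimal sketch, assuming the TensorFlow 1.x API (the variable and loss names below are illustrative assumptions) −

import tensorflow as tf

# Illustrative trainable variable and loss (assumed names)
w = tf.Variable(tf.random_normal([10]), name = "w")
loss = tf.reduce_sum(tf.square(w))

optimizer = tf.train.GradientDescentOptimizer(learning_rate = 0.01)

# Compute gradients, clip them by global norm, then apply the clipped updates
grads_and_vars = optimizer.compute_gradients(loss)
grads, variables = zip(*grads_and_vars)
clipped_grads, _ = tf.clip_by_global_norm(grads, clip_norm = 5.0)
train_op = optimizer.apply_gradients(list(zip(clipped_grads, variables)))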
We will focus on Stochastic Gradient Descent. An illustration of creating an optimizer for it is shown below −
import numpy as np
import tensorflow as tf

def sgd(cost, params, lr = np.float32(0.01)):
   # Compute the gradient of the cost with respect to each parameter
   g_params = tf.gradients(cost, params)
   
   updates = []
   # Build one assign op per parameter: param = param - lr * gradient
   for param, g_param in zip(params, g_params):
      updates.append(param.assign(param - lr*g_param))
   return updates
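A brief sketch of how the returned update operations might be run, assuming a TensorFlow 1.x session; the parameter and cost below are illustrative assumptions, not part of the tutorial −

# Illustrative parameter and cost (assumed names)
w = tf.Variable(np.float32(4.0))
cost = tf.square(w - 2.0)

updates = sgd(cost, [w])

with tf.Session() as sess:
   sess.run(tf.global_variables_initializer())
   for _ in range(100):
      sess.run(updates)   # applies one SGD step to every parameter
   print(sess.run(w))     # converges towards 2.0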
As seen above, the basic parameters are defined within the function itself. In the subsequent chapter, we will focus on Gradient Descent Optimization along with the implementation of optimizers.