Ahmet Taspinar walks us through creating a recurrent neural network topology using TensorFlow:
As we have also seen in the previous blog posts, our Neural Network consists of a `tf.Graph()` and a `tf.Session()`. The `tf.Graph()` contains all of the computational steps required for the Neural Network, and the `tf.Session` is used to execute these steps. The computational steps defined in the `tf.Graph` can be divided into four main parts:
- We initialize placeholders which are filled with batches of training data during the run.
- We define the RNN model and calculate the output values (logits).
- The logits are used to calculate a loss value.
- The loss value is used in an Optimizer to optimize the weights of the RNN.
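The four parts above can be sketched numerically. Here is a minimal NumPy illustration (all sizes and weight names are hypothetical, not from the original post); in TensorFlow the same structure would be declared in a `tf.Graph` and run via a `tf.Session`:

```python
import numpy as np

rng = np.random.default_rng(0)
batch, steps, n_input, n_hidden, n_classes = 4, 5, 3, 8, 2

# 1. "Placeholders": a batch of training inputs and one-hot labels.
x = rng.standard_normal((batch, steps, n_input))
y = np.eye(n_classes)[rng.integers(0, n_classes, size=batch)]

# 2. The RNN model: a simple tanh recurrence producing logits.
#    (Wx, Wh, Wo are hypothetical weight matrices for this sketch.)
Wx = rng.standard_normal((n_input, n_hidden)) * 0.1
Wh = rng.standard_normal((n_hidden, n_hidden)) * 0.1
Wo = rng.standard_normal((n_hidden, n_classes)) * 0.1
h = np.zeros((batch, n_hidden))
for t in range(steps):
    h = np.tanh(x[:, t, :] @ Wx + h @ Wh)
logits = h @ Wo  # shape (batch, n_classes)

# 3. The logits are used to calculate a loss value
#    (softmax cross-entropy, computed in a numerically stable way).
shifted = logits - logits.max(axis=1, keepdims=True)
probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)
loss = -np.mean(np.sum(y * np.log(probs + 1e-12), axis=1))

# 4. An Optimizer would now use the gradients of this loss to update
#    Wx, Wh, Wo; TensorFlow derives those gradients automatically.
print(logits.shape, float(loss))
```

This is only a forward pass; the point is the shape of the pipeline (data in, logits out, loss, then weight updates), which the graph/session machinery formalizes.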
As a lazy casual, I’ll probably stick with letting Keras do most of the heavy lifting.