Tensorflow Checkpoint Custom Map
Hire the world's top talent on demand or become one of them at Toptal: https://topt.al/25cXVn
and get $2,000 discount on your first invoice
--------------------------------------------------
Music by Eric Matyas
https://www.soundimage.org
Track title: Beneath the City Looping
--
Chapters
00:00 Tensorflow Checkpoint Custom Map
02:07 Answer 1 Score 1
02:53 Accepted Answer Score 5
03:14 Thank you
--
Full question
https://stackoverflow.com/questions/5972...
--
Content licensed under CC BY-SA
https://meta.stackexchange.com/help/lice...
--
Tags
#python #tensorflow
#avk47
ACCEPTED ANSWER
Score 5
The issue I was facing was that I was popping items from a dictionary passed into a layers.Layer:
    self.cell_name = params.pop('cell', 'GRU')
    self.num_layers = params.pop('num_layers', 1)
When a dictionary is passed into a layer, it must remain unchanged, because the layer tracks it for checkpointing.
My solution was to abstract the parameter parsing further and pass in a finalized dictionary.
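As a minimal sketch of that fix (the helper and class names here are illustrative, not from the original question), the defaults can be resolved up front so the dictionary the layer receives is final and is only ever read, never mutated:

    import tensorflow as tf

    def finalize_params(params):
        # Apply defaults on a copy, so the dict handed to the Layer
        # is final and never mutated afterwards.
        finalized = dict(params)
        finalized.setdefault('cell', 'GRU')
        finalized.setdefault('num_layers', 1)
        return finalized

    class RecurrentBlock(tf.keras.layers.Layer):  # hypothetical layer
        def __init__(self, params, **kwargs):
            super().__init__(**kwargs)
            # Only read from the tracked dict -- no pop(), no writes.
            self.params = params
            self.cell_name = params['cell']
            self.num_layers = params['num_layers']

    layer = RecurrentBlock(finalize_params({'cell': 'LSTM'}))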
ANSWER 2
Score 1
Your RecurrentConfig object should inherit from tf.keras.layers.Layer instead of BaseLayer. The TF documentation on checkpoints/delayed restorations covers why:
Layer objects in TensorFlow may delay the creation of variables to their first call, when input shapes are available. For example, the shape of a Dense layer's kernel depends on both the layer's input and output shapes, and so the output shape required as a constructor argument is not enough information to create the variable on its own. Since calling a Layer also reads the variable's value, a restore must happen between the variable's creation and its first use. To support this idiom, tf.train.Checkpoint queues restores which don't yet have a matching variable.
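Here is a small sketch of that queuing behavior (a plain Dense layer stands in for the question's RecurrentConfig): the restore is issued before the new layer has built any variables, and is applied automatically once the first call creates them:

    import tensorflow as tf

    dense = tf.keras.layers.Dense(3)
    ckpt = tf.train.Checkpoint(layer=dense)
    dense(tf.zeros([1, 4]))              # first call builds the variables
    path = ckpt.save('/tmp/demo_ckpt')   # save the built weights

    fresh = tf.keras.layers.Dense(3)     # unbuilt layer, no variables yet
    status = tf.train.Checkpoint(layer=fresh).restore(path)  # restore is queued
    fresh(tf.zeros([1, 4]))              # variables created; queued restore runs
    status.assert_consumed()             # confirms every saved value was matched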