Friday, May 17, 2019

TensorFlow - Training on condition

I am training a neural network with TensorFlow 1.12 in a supervised fashion. I'd like to train only on specific examples. The examples are created on the fly by cutting out subsequences, hence I want to do the conditioning within TensorFlow.

This is my original part of code:

train_step, gvs = minimize_clipped(optimizer, loss,
                                   clip_value=FLAGS.gradient_clip,
                                   return_gvs=True)
gradients = [g for (g, v) in gvs]
gradient_norm = tf.global_norm(gradients)
tf.summary.scalar('gradients/norm', gradient_norm)

eval_losses = {'loss1': loss1,
               'loss2': loss2}

The training step is later executed as:

batch_eval, _ = sess.run([eval_losses, train_step])

I was thinking about inserting something like

train_step_fake = ????
eval_losses_fake = tf.zeros_like(tensor)
# tf.cond expects callables for its branches, not tensors:
train_step_new = tf.cond(my_cond, lambda: train_step, lambda: train_step_fake)
eval_losses_new = tf.cond(my_cond, lambda: eval_losses, lambda: eval_losses_fake)

and then doing

batch_eval, _ = sess.run([eval_losses_new, train_step_new])

However, I am not sure how to create a fake train_step.
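One possible way to build such a "fake" branch is to create the update op *inside* the `tf.cond` branch function and pair it with a no-op branch. This is only a sketch on a toy graph (the variable `w`, the loss `(w - 3)^2`, and the learning rate are stand-ins, not the question's actual model); the compat shim is an assumption so the TF 1.12-style code also runs under TF 2.x graph mode. The key caveat is that ops created outside a branch and merely referenced inside it would run unconditionally, so `minimize` is called inside `train_fn`:

```python
import tensorflow as tf

# Compat shim: use tf.compat.v1 on TF >= 1.13 / 2.x, plain tf on TF 1.12.
tf1 = tf.compat.v1 if hasattr(tf.compat, "v1") else tf
if hasattr(tf1, "disable_eager_execution"):
    tf1.disable_eager_execution()

# Toy graph standing in for the real model: minimize (w - 3)^2.
w = tf1.get_variable("w", initializer=0.0)
loss = tf.square(w - 3.0)
my_cond = tf1.placeholder(tf.bool, shape=[])
optimizer = tf1.train.GradientDescentOptimizer(0.1)

def train_fn():
    # The update op must be created *inside* the branch function;
    # ops created outside and only referenced here would run unconditionally.
    train_op = optimizer.minimize(loss)
    with tf.control_dependencies([train_op]):
        return tf.identity(loss)

def skip_fn():
    # "Fake" train step: return the loss without updating any variable.
    return tf.identity(loss)

train_step_new = tf.cond(my_cond, train_fn, skip_fn)

with tf1.Session() as sess:
    sess.run(tf1.global_variables_initializer())
    sess.run(train_step_new, feed_dict={my_cond: False})
    w_after_skip = sess.run(w)    # w unchanged: 0.0
    sess.run(train_step_new, feed_dict={my_cond: True})
    w_after_train = sess.run(w)   # one SGD step: 0 - 0.1 * 2*(0 - 3) = 0.6
```

Since both branches return a tensor of the same type, `train_step_new` can be fetched unconditionally in `sess.run`, and the variable update only happens when `my_cond` is true.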

Also, is this a good idea in general, or is there a smoother way of doing this? I am using a tfrecords pipeline, but no other high-level modules (like Keras, tf.estimator, eager execution, etc.).
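Since the examples come from a tfrecords pipeline, one smoother alternative may be to drop unwanted subsequences before they ever reach the train step, using `tf.data.Dataset.filter`. The sketch below assumes the pipeline is (or can be made) `tf.data`-based; the predicate `keep_example` and the in-memory dataset are hypothetical stand-ins for the real condition and `TFRecordDataset`:

```python
import tensorflow as tf

# Compat shim: use tf.compat.v1 on TF >= 1.13 / 2.x, plain tf on TF 1.12.
tf1 = tf.compat.v1 if hasattr(tf.compat, "v1") else tf
if hasattr(tf1, "disable_eager_execution"):
    tf1.disable_eager_execution()

# Hypothetical predicate: keep only subsequences whose mean is positive.
def keep_example(seq):
    return tf.reduce_mean(seq) > 0.0

# Stand-in for the real TFRecordDataset of cut-out subsequences.
dataset = tf1.data.Dataset.from_tensor_slices(
    [[1.0, 2.0], [-3.0, -4.0], [5.0, -1.0]])
dataset = dataset.filter(keep_example)

next_elem = dataset.make_one_shot_iterator().get_next()
with tf1.Session() as sess:
    kept = []
    try:
        while True:
            kept.append(sess.run(next_elem))
    except tf.errors.OutOfRangeError:
        pass
# kept now holds only the subsequences with positive mean.
```

This keeps the training graph unconditional (no `tf.cond` needed) at the cost of moving the condition into the input pipeline, which only works if the condition can be evaluated per example before the forward pass.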

Any help is obviously greatly appreciated!
