This article introduces learning rate decay in TensorFlow; hopefully it serves as a useful reference for developers working on this problem.
import tensorflow as tf

# assign_add returns an op/tensor; running it mutates the underlying variable
x = tf.Variable(1.0)
y = x.assign_add(1)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(x))  # 1.0 -- x still holds its initial value
    print(sess.run(y))  # 2.0 -- evaluating y executes the assign_add
    print(sess.run(x))  # 2.0 -- x itself has been mutated
The output is 1.0, 2.0, 2.0. Note that x itself changes: evaluating y runs the assign_add op, which mutates x in place.
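For the same reason, every additional run of the assign op keeps mutating the variable. A minimal sketch (same TF1 API as the snippet above):

import tensorflow as tf

x = tf.Variable(1.0)
y = x.assign_add(1)
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(3):
        print(sess.run(y))  # 2.0, 3.0, 4.0 -- each run bumps x again

This is exactly the mechanism the learning-rate example below relies on: an assign_add op on global_step advances the step counter each time it is run.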
import tensorflow as tf

global_step = tf.Variable(0, trainable=False)
initial_learning_rate = 0.1  # initial learning rate
learning_rate = tf.train.exponential_decay(initial_learning_rate,
                                           global_step=global_step,
                                           decay_steps=10,
                                           decay_rate=0.9)
opt = tf.train.GradientDescentOptimizer(learning_rate)
add_global = global_step.assign_add(1)  # incrementing global_step drives the decay

with tf.Session() as sess:
    tf.global_variables_initializer().run()
    print(sess.run(learning_rate))  # 0.1 at global_step == 0
    for i in range(1):
        _, rate = sess.run([add_global, learning_rate])
        print(rate)  # learning rate after the step counter has advanced
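The printed values follow the formula that tf.train.exponential_decay implements: decayed_lr = initial_lr * decay_rate ** (global_step / decay_steps) (with staircase=True the exponent is truncated to an integer, producing a stepwise schedule). A quick pure-Python check of the numbers above:

# verify the exponential-decay formula by hand
initial_lr = 0.1
decay_rate = 0.9
decay_steps = 10
for step in (0, 1, 10):
    print(initial_lr * decay_rate ** (step / decay_steps))
# 0.1, ~0.09895, 0.09

In real training you normally do not run add_global by hand: passing the counter to opt.minimize(loss, global_step=global_step) makes the optimizer increment it automatically after each parameter update.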
Reference:
http://blog.csdn.net/u012436149/article/details/62058318
That wraps up this article on learning rate decay in TensorFlow; hopefully it is helpful to fellow developers.