Using exponential_decay in TensorFlow

tf.train.exponential_decay(
    learning_rate,       # initial learning rate
    global_step,         # variable counting the training steps taken so far
    decay_steps,         # number of steps over which one decay_rate factor is applied
    decay_rate,          # multiplicative decay factor
    staircase=False,     # if True, decay in discrete intervals instead of smoothly
    name=None            # optional name for the operation
)
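
The returned tensor evaluates to learning_rate * decay_rate ** (global_step / decay_steps); with staircase=True the exponent is truncated to an integer, so the rate drops in discrete jumps at whole multiples of decay_steps instead of decaying smoothly. A minimal plain-Python sketch of that arithmetic (the helper name and the sample numbers are illustrative, not part of TensorFlow):

def decayed_lr(learning_rate, global_step, decay_steps, decay_rate, staircase=False):
    # Same schedule that tf.train.exponential_decay computes as a graph op.
    exponent = global_step / float(decay_steps)
    if staircase:
        exponent = int(exponent)   # truncate: decay only at whole multiples of decay_steps
    return learning_rate * decay_rate ** exponent

print(decayed_lr(0.1, 500, 1000, 0.96))                  # ~0.098 (smooth decay)
print(decayed_lr(0.1, 500, 1000, 0.96, staircase=True))  # 0.1 (no decay until step 1000)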

Typical usage

learning_rate = tf.train.exponential_decay(
    LEARNING_RATE_BASE,                       # initial learning rate
    global_step,                              # step counter variable, incremented by the optimizer
    mnist.train.num_examples / BATCH_SIZE,    # decay_steps: number of batches in one epoch
    LEARNING_RATE_DECAY                       # decay factor applied once per epoch
)

decay_steps is set to mnist.train.num_examples / BATCH_SIZE because each training step consumes BATCH_SIZE examples, so that many steps make up one full pass over the training set; the learning rate is therefore multiplied by LEARNING_RATE_DECAY once per epoch.
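
For the schedule to actually advance, global_step must be a non-trainable variable that the optimizer increments on every training step via the global_step argument of minimize(). A minimal self-contained TF 1.x sketch (the hyperparameter values, the STEPS_PER_EPOCH constant, and the toy loss are placeholders chosen here for illustration):

import tensorflow as tf

LEARNING_RATE_BASE = 0.1     # illustrative initial learning rate
LEARNING_RATE_DECAY = 0.99   # illustrative decay factor
STEPS_PER_EPOCH = 550        # e.g. mnist.train.num_examples / BATCH_SIZE

# Non-trainable counter; minimize() below increments it once per training step.
global_step = tf.Variable(0, trainable=False)

learning_rate = tf.train.exponential_decay(
    LEARNING_RATE_BASE,
    global_step,
    STEPS_PER_EPOCH,
    LEARNING_RATE_DECAY
)

# Toy quadratic loss so the example runs end to end.
w = tf.Variable(5.0)
loss = tf.square(w)

train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(
    loss, global_step=global_step)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(3):
        sess.run(train_op)
        print(sess.run([global_step, learning_rate]))  # step count and current decayed rate

Passing staircase=True to exponential_decay would hold the rate constant within each epoch and drop it only at epoch boundaries.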
