# Using tensorflow exponential_decay

Posted 2018-01

The signature of `tf.train.exponential_decay`:

```python
exponential_decay(
    learning_rate,
    global_step,
    decay_steps,
    decay_rate,
    staircase=False,
    name=None
)
```

## General usage

```python
learning_rate = tf.train.exponential_decay(
    LEARNING_RATE_BASE,
    global_step,
    mnist.train.num_examples / BATCH_SIZE,
    LEARNING_RATE_DECAY)
```

Because each training step consumes `BATCH_SIZE` examples, setting `decay_steps` to `mnist.train.num_examples / BATCH_SIZE` makes the learning rate decay once per epoch.
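The decayed rate follows `learning_rate * decay_rate ^ (global_step / decay_steps)`; with `staircase=True` the exponent is truncated to an integer, so the rate drops in discrete jumps instead of continuously. Below is a minimal sketch of how the call above fits into a TF 1.x training loop. The constant values, the toy loss, and the use of `staircase=True` are assumptions for illustration, not from the original post.

```python
import tensorflow as tf

# Hypothetical constants standing in for the post's LEARNING_RATE_BASE etc.
LEARNING_RATE_BASE = 0.1
LEARNING_RATE_DECAY = 0.96
NUM_TRAIN_EXAMPLES = 55000   # MNIST training-set size (mnist.train.num_examples)
BATCH_SIZE = 100

# global_step is a non-trainable counter that the optimizer increments.
global_step = tf.Variable(0, trainable=False)

learning_rate = tf.train.exponential_decay(
    LEARNING_RATE_BASE,                 # initial learning rate
    global_step,                        # current training step
    NUM_TRAIN_EXAMPLES / BATCH_SIZE,    # decay_steps: one epoch's worth of steps
    LEARNING_RATE_DECAY,                # decay_rate
    staircase=True)                     # assumed: decay in discrete jumps per epoch

# Toy loss so the example runs end to end; a real model's loss goes here.
w = tf.Variable(5.0)
loss = tf.square(w - 3.0)

# Passing global_step makes the optimizer increment it on every update,
# which in turn drives the learning-rate schedule.
train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(
    loss, global_step=global_step)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(3):
        _, lr = sess.run([train_op, learning_rate])
        print(lr)   # stays at LEARNING_RATE_BASE within the first epoch
```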