Track learning rate

Is there an easy way to track the learning rate during training?

I tried to find which field of the AdamOptimizer object, the gradients, or the operation returned by apply_gradients() keeps track of the changing learning rate value, but had no luck. Any help?
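
For reference, here is a minimal sketch of the kind of setup I was hoping for. It assumes the learning rate is driven by a schedule such as tf.train.exponential_decay, so it becomes an explicit tensor that can be evaluated and logged, instead of something dug out of the optimizer's internals. The variable, loss, and decay numbers are toy placeholders just to make the graph complete:

```python
import tensorflow as tf

# Toy variable and loss, only so the graph is complete.
w = tf.Variable(0.0)
loss = tf.square(w - 3.0)

# Assumption: make the learning rate an explicit tensor via a schedule,
# so it can be fetched and logged directly.
global_step = tf.train.get_or_create_global_step()
learning_rate = tf.train.exponential_decay(
    learning_rate=1e-3,      # initial learning rate (placeholder value)
    global_step=global_step,
    decay_steps=10000,
    decay_rate=0.96)

optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate)
train_op = optimizer.minimize(loss, global_step=global_step)

# Log the current learning rate to TensorBoard.
tf.summary.scalar('learning_rate', learning_rate)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(3):
        _, lr = sess.run([train_op, learning_rate])
        print('learning rate:', lr)
```

Is something like this the intended way, or does the optimizer itself expose the value somewhere?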

Another related question:
When resuming training (from the last checkpoint, for example), does it restore the optimizer state and the last learning rate value that was reached, or does it initialize a new optimizer with the learning rate parsed from the flags?
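
To make that question concrete, here is the kind of check I had in mind, assuming the checkpoint was written by a plain tf.train.Saver; 'model.ckpt' is a hypothetical path used only for illustration:

```python
import tensorflow as tf

# List everything stored in the checkpoint. If the Adam slot variables
# (names ending in '/Adam' and '/Adam_1') and the global_step are present,
# the optimizer state is being saved alongside the model weights.
# 'model.ckpt' is a placeholder path, not the real checkpoint name.
for name, shape in tf.train.list_variables('model.ckpt'):
    print(name, shape)
```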