I am trying to use LearningRateScheduler to adjust the learning rate when running ImportanceTraining(model).fit. However, the actual learning rate seems to remain unchanged throughout training, staying equal to the lr set in optimizer=SGD(lr). Could you please show me how to correctly adjust the learning rate? I am using tf=1.4.0 and keras=2.2.0.
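For reference, this is the kind of schedule function passed to LearningRateScheduler: a plain mapping from epoch index to learning rate (the names and constants below are illustrative, not from the original post). If the callback's updates are being ignored by the importance-sampling wrapper, one possible workaround is to compute the rate with such a function and apply it manually via keras.backend.set_value(model.optimizer.lr, ...) before each epoch.

```python
def step_decay(epoch, base_lr=0.1, drop=0.5, epochs_per_drop=10):
    """Hypothetical step-decay schedule: halve the learning rate
    every `epochs_per_drop` epochs, starting from `base_lr`."""
    return base_lr * (drop ** (epoch // epochs_per_drop))

# With plain Keras this would be wired up as:
#   from keras.callbacks import LearningRateScheduler
#   model.fit(x, y, callbacks=[LearningRateScheduler(step_decay)])
# If the wrapped fit does not honor the callback, setting
#   keras.backend.set_value(model.optimizer.lr, step_decay(epoch))
# at the start of each epoch is a possible manual alternative.
```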