生活随笔
This article, collected and organized here, mainly introduces training on large-scale data with Keras. It is shared here for reference.
Reference 1
Reference 2
train_on_batch
n_epoch = 12
batch_size = 16
import numpy as np

for e in range(n_epoch):
    print("epoch", e)
    batch_num = 0
    loss_sum = np.array([0.0, 0.0])
    # outer generator reads chunks of samples from disk, shuffled each epoch
    for X_train, y_train in GET_DATASET_SHUFFLE(train_X, batch_size, True):
        # inner generator applies real-time augmentation to the current chunk
        for X_batch, y_batch in train_datagen.flow(X_train, y_train, batch_size=batch_size):
            loss = model.train_on_batch(X_batch, y_batch)
            loss_sum += loss
            batch_num += 1
            break  # manual break: flow() loops forever, so take one augmented batch per chunk
        if batch_num % 200 == 0:
            print("epoch %s, batch %s: train_loss = %.4f, train_acc = %.4f"
                  % (e, batch_num, loss_sum[0] / 200, loss_sum[1] / 200))
            loss_sum = np.array([0.0, 0.0])
    res = model.evaluate_generator(GET_DATASET_SHUFFLE(val_X, batch_size, False),
                                   int(len(val_X) / batch_size))
    print("val_loss = %.4f, val_acc = %.4f" % (res[0], res[1]))
    model.save("weight.h5")
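`GET_DATASET_SHUFFLE` in the snippet above is the author's own batch generator and its definition is not shown. A minimal sketch of what such a generator might look like follows; the name matches the snippet, but the in-memory `(features, label)` sample format is an assumption here — in the original large-data setting it would more likely take a list of image paths and load/decode each image inside the loop.

```python
import numpy as np

def GET_DATASET_SHUFFLE(samples, batch_size, shuffle):
    """Yield (X, y) batches from a list of samples.

    samples    : list of (features, label) pairs (assumed in-memory here;
                 for truly large data, store paths and load per batch)
    batch_size : number of samples per yielded batch
    shuffle    : if True, reshuffle the sample order on each call (each epoch)
    """
    indices = np.arange(len(samples))
    if shuffle:
        np.random.shuffle(indices)
    # yield only full batches, matching int(len(val_X)/batch_size) steps above
    for start in range(0, len(samples) - batch_size + 1, batch_size):
        batch_idx = indices[start:start + batch_size]
        X = np.stack([samples[i][0] for i in batch_idx])
        y = np.stack([samples[i][1] for i in batch_idx])
        yield X, y
```

Because it is a plain Python generator, it can be passed both to the manual `train_on_batch` loop (with `shuffle=True`) and to `evaluate_generator` (with `shuffle=False`), as in the code above.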
Summary
The above is the full content of "Training on large-scale data with Keras" as collected by 生活随笔; hopefully it helps you solve the problems you have encountered.