T11 TensorFlow Hands-On: An Optimizer Comparison Experiment

  • 🍨 This article is a learning log from the 🔗 365-day deep learning training camp
  • 🍖 Original author: K同學啊 | tutoring and custom projects available

I. Preliminaries

1. Import the data

# Import the required libraries
import pathlib
import matplotlib.pyplot as plt
import tensorflow as tf
from tensorflow.keras.layers import Dropout, Dense, BatchNormalization
from tensorflow.keras.models import Model
from matplotlib.ticker import MultipleLocator
from datetime import datetime

# Load the data
data_dir = './data/48-data/'
data_dir = pathlib.Path(data_dir)

data_paths  = list(data_dir.glob('*'))
classeNames = [str(path).split("\\")[2] for path in data_paths]  # "\\" assumes Windows path separators

image_count = len(list(data_dir.glob('*/*')))
print("Total number of images:", image_count)

II. Data Preprocessing

1. Load the data

# Data loading and preprocessing
batch_size = 16
img_height = 336
img_width  = 336

train_ds = tf.keras.preprocessing.image_dataset_from_directory(
    data_dir,
    validation_split=0.2,
    subset="training",
    seed=12,
    image_size=(img_height, img_width),
    batch_size=batch_size)

val_ds = tf.keras.preprocessing.image_dataset_from_directory(
    data_dir,
    validation_split=0.2,
    subset="validation",
    seed=12,
    image_size=(img_height, img_width),
    batch_size=batch_size)

class_names = train_ds.class_names
print(class_names)
[..., 'Nicole Kidman', 'Robert Downey Jr', 'Sandra Bullock', 'Scarlett Johansson', 'Tom Cruise', 'Tom Hanks', 'Will Smith']  (output truncated; 17 classes in total)
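Because both calls share the same seed and validation_split, image_dataset_from_directory carves out complementary, non-overlapping training and validation subsets.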

2. Inspect the data

# Check the shape of the data
for image_batch, labels_batch in train_ds:
    print(image_batch.shape)
    print(labels_batch.shape)
    break
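With batch_size = 16 and 336×336 RGB inputs, this prints (16, 336, 336, 3) for the image batch and (16,) for the label batch.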

3. Configure the dataset

AUTOTUNE = tf.data.AUTOTUNE

def train_preprocessing(image, label):
    return (image / 255.0, label)

train_ds = (
    train_ds.cache()
    .shuffle(1000)
    .map(train_preprocessing)    # a preprocessing function can be plugged in here
#     .batch(batch_size)         # batch_size was already set in image_dataset_from_directory
    .prefetch(buffer_size=AUTOTUNE)
)

val_ds = (
    val_ds.cache()
    .shuffle(1000)
    .map(train_preprocessing)    # a preprocessing function can be plugged in here
#     .batch(batch_size)         # batch_size was already set in image_dataset_from_directory
    .prefetch(buffer_size=AUTOTUNE)
)
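A note on the pipeline order: cache() stores decoded images after the first pass so later epochs skip image decoding, and prefetch(buffer_size=AUTOTUNE) overlaps data preparation with training. Shuffling the validation set is harmless but unnecessary, since evaluation metrics do not depend on sample order.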

4. Visualize the data

plt.rcParams['font.family'] = 'SimHei'       # use a font that supports Chinese characters
plt.rcParams['axes.unicode_minus'] = False   # render minus signs correctly

plt.figure(figsize=(10, 8))
plt.suptitle("Sample images")

for images, labels in train_ds.take(1):
    for i in range(15):
        plt.subplot(4, 5, i + 1)
        plt.xticks([])
        plt.yticks([])
        plt.grid(False)
        plt.imshow(images[i])                # show the image
        plt.xlabel(class_names[labels[i]])   # show the label (labels are 0-indexed, so no -1 offset)
plt.show()

III. Model Training

1. Build the model

def create_model(optimizer='adam'):
    # Load the pre-trained VGG16 backbone (ImageNet weights, no top classifier)
    vgg16_base_model = tf.keras.applications.vgg16.VGG16(
        weights='imagenet',
        include_top=False,
        input_shape=(img_width, img_height, 3),
        pooling='avg')

    # Freeze the backbone so only the new classification head is trained
    for layer in vgg16_base_model.layers:
        layer.trainable = False

    X = vgg16_base_model.output
    X = Dense(170, activation='relu')(X)
    X = BatchNormalization()(X)
    X = Dropout(0.5)(X)

    output = Dense(len(class_names), activation='softmax')(X)
    vgg16_model = Model(inputs=vgg16_base_model.input, outputs=output)

    vgg16_model.compile(optimizer=optimizer,
                        loss='sparse_categorical_crossentropy',
                        metrics=['accuracy'])
    return vgg16_model

model1 = create_model(optimizer=tf.keras.optimizers.Adam())
model2 = create_model(optimizer=tf.keras.optimizers.SGD())
model2.summary()
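Since create_model takes the optimizer as a parameter, the comparison extends naturally to other optimizers. A sketch (hypothetical, not part of the original experiment):

# Hypothetical extension: a third model using RMSprop with an explicit learning rate
model3 = create_model(optimizer=tf.keras.optimizers.RMSprop(learning_rate=1e-4))

It would be trained and plotted exactly like model1 and model2 below.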
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/vgg16/vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5
58889256/58889256 [==============================] - 5s 0us/step
Model: "model_1"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 input_2 (InputLayer)        [(None, 336, 336, 3)]     0         
 block1_conv1 (Conv2D)       (None, 336, 336, 64)      1792      
 block1_conv2 (Conv2D)       (None, 336, 336, 64)      36928     
 block1_pool (MaxPooling2D)  (None, 168, 168, 64)      0         
 block2_conv1 (Conv2D)       (None, 168, 168, 128)     73856     
 block2_conv2 (Conv2D)       (None, 168, 168, 128)     147584    
 block2_pool (MaxPooling2D)  (None, 84, 84, 128)       0         
 block3_conv1 (Conv2D)       (None, 84, 84, 256)       295168    
 block3_conv2 (Conv2D)       (None, 84, 84, 256)       590080    
 block3_conv3 (Conv2D)       (None, 84, 84, 256)       590080    
 block3_pool (MaxPooling2D)  (None, 42, 42, 256)       0         
 block4_conv1 (Conv2D)       (None, 42, 42, 512)       1180160   
 block4_conv2 (Conv2D)       (None, 42, 42, 512)       2359808   
 block4_conv3 (Conv2D)       (None, 42, 42, 512)       2359808   
 block4_pool (MaxPooling2D)  (None, 21, 21, 512)       0         
 block5_conv1 (Conv2D)       (None, 21, 21, 512)       2359808   
 block5_conv2 (Conv2D)       (None, 21, 21, 512)       2359808   
 block5_conv3 (Conv2D)       (None, 21, 21, 512)       2359808   
 block5_pool (MaxPooling2D)  (None, 10, 10, 512)       0         
 global_average_pooling2d_1  (None, 512)               0         
 (GlobalAveragePooling2D)                                        
 dense_2 (Dense)             (None, 170)               87210     
 batch_normalization_1 (Bat  (None, 170)               680       
 chNormalization)                                                
 dropout_1 (Dropout)         (None, 170)               0         
 dense_3 (Dense)             (None, 17)                2907      
=================================================================
Total params: 14805485 (56.48 MB)
Trainable params: 90457 (353.35 KB)
Non-trainable params: 14715028 (56.13 MB)
_________________________________________________________________

2. Train the model

# Train the model
NO_EPOCHS = 50

history_model1 = model1.fit(train_ds, epochs=NO_EPOCHS, verbose=1, validation_data=val_ds)
history_model2  = model2.fit(train_ds, epochs=NO_EPOCHS, verbose=1, validation_data=val_ds)
Epoch 1/50
90/90 [==============================] - 1202s 13s/step - loss: 0.3176 - accuracy: 0.8965 - val_loss: 7.7180 - val_accuracy: 0.0583
Epoch 2/50
90/90 [==============================] - 1090s 12s/step - loss: 0.2925 - accuracy: 0.9167 - val_loss: 7.4216 - val_accuracy: 0.0472
Epoch 3/50
90/90 [==============================] - 1296s 14s/step - loss: 0.3077 - accuracy: 0.9125 - val_loss: 8.2351 - val_accuracy: 0.0583
Epoch 4/50
90/90 [==============================] - 1302s 14s/step - loss: 0.2624 - accuracy: 0.9326 - val_loss: 8.9317 - val_accuracy: 0.0583
Epoch 5/50
90/90 [==============================] - 1040s 12s/step - loss: 0.2837 - accuracy: 0.9174 - val_loss: 9.0407 - val_accuracy: 0.0583
Epoch 6/50
90/90 [==============================] - 961s 11s/step - loss: 0.2769 - accuracy: 0.9139 - val_loss: 8.2484 - val_accuracy: 0.0583
Epoch 7/50
90/90 [==============================] - 950s 11s/step - loss: 0.2749 - accuracy: 0.9160 - val_loss: 8.8199 - val_accuracy: 0.0444
Epoch 8/50
90/90 [==============================] - 934s 10s/step - loss: 0.2525 - accuracy: 0.9292 - val_loss: 8.1721 - val_accuracy: 0.0722
Epoch 9/50
90/90 [==============================] - 1260s 14s/step - loss: 0.2306 - accuracy: 0.9361 - val_loss: 8.6387 - val_accuracy: 0.0583
Epoch 10/50
90/90 [==============================] - 1429s 16s/step - loss: 0.2448 - accuracy: 0.9208 - val_loss: 9.7182 - val_accuracy: 0.0583
Epoch 11/50
90/90 [==============================] - 1044s 12s/step - loss: 0.2269 - accuracy: 0.9299 - val_loss: 10.4608 - val_accuracy: 0.0583
Epoch 12/50
90/90 [==============================] - 1352s 15s/step - loss: 0.2121 - accuracy: 0.9333 - val_loss: 9.2537 - val_accuracy: 0.0472
Epoch 13/50
90/90 [==============================] - 1969s 22s/step - loss: 0.2014 - accuracy: 0.9368 - val_loss: 9.2780 - val_accuracy: 0.0722
Epoch 14/50
90/90 [==============================] - 1372s 15s/step - loss: 0.1803 - accuracy: 0.9486 - val_loss: 9.4223 - val_accuracy: 0.0583
Epoch 15/50
90/90 [==============================] - 1460s 16s/step - loss: 0.1795 - accuracy: 0.9535 - val_loss: 8.9366 - val_accuracy: 0.0583
Epoch 16/50
90/90 [==============================] - 1409s 16s/step - loss: 0.2325 - accuracy: 0.9215 - val_loss: 10.3105 - val_accuracy: 0.0472
Epoch 17/50
90/90 [==============================] - 1353s 15s/step - loss: 0.2212 - accuracy: 0.9271 - val_loss: 9.2342 - val_accuracy: 0.0583
Epoch 18/50
90/90 [==============================] - 1201s 13s/step - loss: 0.1793 - accuracy: 0.9500 - val_loss: 9.9170 - val_accuracy: 0.0472
Epoch 19/50
90/90 [==============================] - 929s 10s/step - loss: 0.1930 - accuracy: 0.9354 - val_loss: 9.9911 - val_accuracy: 0.0583
Epoch 20/50
90/90 [==============================] - 13115s 147s/step - loss: 0.2122 - accuracy: 0.9333 - val_loss: 9.5141 - val_accuracy: 0.0750
Epoch 21/50
90/90 [==============================] - 849s 9s/step - loss: 0.2142 - accuracy: 0.9319 - val_loss: 9.9998 - val_accuracy: 0.0472
Epoch 22/50
90/90 [==============================] - 806s 9s/step - loss: 0.1790 - accuracy: 0.9417 - val_loss: 9.0953 - val_accuracy: 0.0583
Epoch 23/50
90/90 [==============================] - 953s 11s/step - loss: 0.1722 - accuracy: 0.9486 - val_loss: 10.1111 - val_accuracy: 0.0583
Epoch 24/50
90/90 [==============================] - 1117s 12s/step - loss: 0.1824 - accuracy: 0.9368 - val_loss: 11.0077 - val_accuracy: 0.0472
Epoch 25/50
90/90 [==============================] - 1111s 12s/step - loss: 0.1613 - accuracy: 0.9514 - val_loss: 11.9721 - val_accuracy: 0.0472
Epoch 26/50
90/90 [==============================] - 1148s 13s/step - loss: 0.1641 - accuracy: 0.9556 - val_loss: 12.8058 - val_accuracy: 0.0472
Epoch 27/50
90/90 [==============================] - 1227s 14s/step - loss: 0.1286 - accuracy: 0.9590 - val_loss: 10.5750 - val_accuracy: 0.0472
Epoch 28/50
90/90 [==============================] - 1191s 13s/step - loss: 0.1791 - accuracy: 0.9493 - val_loss: 12.0891 - val_accuracy: 0.0472
Epoch 29/50
90/90 [==============================] - 1191s 13s/step - loss: 0.1629 - accuracy: 0.9493 - val_loss: 11.8981 - val_accuracy: 0.0472
Epoch 30/50
90/90 [==============================] - 1234s 14s/step - loss: 0.1545 - accuracy: 0.9479 - val_loss: 10.4402 - val_accuracy: 0.0472
Epoch 31/50
90/90 [==============================] - 956s 11s/step - loss: 0.1687 - accuracy: 0.9507 - val_loss: 8.6383 - val_accuracy: 0.0472
Epoch 32/50
90/90 [==============================] - 896s 10s/step - loss: 0.1470 - accuracy: 0.9528 - val_loss: 12.8927 - val_accuracy: 0.0472
Epoch 33/50
90/90 [==============================] - 901s 10s/step - loss: 0.1373 - accuracy: 0.9556 - val_loss: 10.4122 - val_accuracy: 0.0472
Epoch 34/50
90/90 [==============================] - 899s 10s/step - loss: 0.1428 - accuracy: 0.9521 - val_loss: 11.1399 - val_accuracy: 0.0750
Epoch 35/50
90/90 [==============================] - 878s 10s/step - loss: 0.1343 - accuracy: 0.9583 - val_loss: 12.0714 - val_accuracy: 0.0583
Epoch 36/50
90/90 [==============================] - 886s 10s/step - loss: 0.1432 - accuracy: 0.9535 - val_loss: 12.5365 - val_accuracy: 0.0583
Epoch 37/50
90/90 [==============================] - 863s 10s/step - loss: 0.1337 - accuracy: 0.9569 - val_loss: 10.0840 - val_accuracy: 0.0583
Epoch 38/50
90/90 [==============================] - 889s 10s/step - loss: 0.1632 - accuracy: 0.9514 - val_loss: 9.1576 - val_accuracy: 0.0722
Epoch 39/50
90/90 [==============================] - 881s 10s/step - loss: 0.1418 - accuracy: 0.9549 - val_loss: 14.8210 - val_accuracy: 0.0583
Epoch 40/50
90/90 [==============================] - 890s 10s/step - loss: 0.1690 - accuracy: 0.9514 - val_loss: 11.0727 - val_accuracy: 0.0472
Epoch 41/50
90/90 [==============================] - 870s 10s/step - loss: 0.1260 - accuracy: 0.9701 - val_loss: 10.9087 - val_accuracy: 0.0583
Epoch 42/50
90/90 [==============================] - 868s 10s/step - loss: 0.1620 - accuracy: 0.9417 - val_loss: 18.5777 - val_accuracy: 0.0583
Epoch 43/50
90/90 [==============================] - 885s 10s/step - loss: 0.1554 - accuracy: 0.9444 - val_loss: 16.1502 - val_accuracy: 0.0583
Epoch 44/50
90/90 [==============================] - 861s 10s/step - loss: 0.1444 - accuracy: 0.9472 - val_loss: 11.4246 - val_accuracy: 0.0583
Epoch 45/50
90/90 [==============================] - 891s 10s/step - loss: 0.1707 - accuracy: 0.9479 - val_loss: 9.7772 - val_accuracy: 0.0472
Epoch 46/50
90/90 [==============================] - 871s 10s/step - loss: 0.1733 - accuracy: 0.9368 - val_loss: 11.6579 - val_accuracy: 0.0472
Epoch 47/50
90/90 [==============================] - 867s 10s/step - loss: 0.1455 - accuracy: 0.9521 - val_loss: 10.5239 - val_accuracy: 0.0722
Epoch 48/50
90/90 [==============================] - 886s 10s/step - loss: 0.1527 - accuracy: 0.9472 - val_loss: 12.6337 - val_accuracy: 0.0583
Epoch 49/50
90/90 [==============================] - 894s 10s/step - loss: 0.1689 - accuracy: 0.9451 - val_loss: 13.6906 - val_accuracy: 0.0583
Epoch 50/50
90/90 [==============================] - 882s 10s/step - loss: 0.1434 - accuracy: 0.9458 - val_loss: 11.2179 - val_accuracy: 0.0583
Epoch 1/50
90/90 [==============================] - 914s 10s/step - loss: 3.0652 - accuracy: 0.1132 - val_loss: 2.8820 - val_accuracy: 0.0417
Epoch 2/50
90/90 [==============================] - 855s 10s/step - loss: 2.4852 - accuracy: 0.2215 - val_loss: 2.9252 - val_accuracy: 0.0444
Epoch 3/50
90/90 [==============================] - 856s 10s/step - loss: 2.2494 - accuracy: 0.2639 - val_loss: 3.0725 - val_accuracy: 0.0417
Epoch 4/50
90/90 [==============================] - 865s 10s/step - loss: 2.0995 - accuracy: 0.3368 - val_loss: 3.3332 - val_accuracy: 0.0417
Epoch 5/50
90/90 [==============================] - 859s 10s/step - loss: 1.9039 - accuracy: 0.3833 - val_loss: 3.5608 - val_accuracy: 0.0444
Epoch 6/50
90/90 [==============================] - 871s 10s/step - loss: 1.7996 - accuracy: 0.4236 - val_loss: 4.3610 - val_accuracy: 0.0417
Epoch 7/50
90/90 [==============================] - 868s 10s/step - loss: 1.6905 - accuracy: 0.4313 - val_loss: 4.8573 - val_accuracy: 0.0417
Epoch 8/50
90/90 [==============================] - 875s 10s/step - loss: 1.6161 - accuracy: 0.4750 - val_loss: 5.4109 - val_accuracy: 0.0417
Epoch 9/50
90/90 [==============================] - 855s 10s/step - loss: 1.5523 - accuracy: 0.4889 - val_loss: 5.2799 - val_accuracy: 0.0417
Epoch 10/50
90/90 [==============================] - 855s 10s/step - loss: 1.4717 - accuracy: 0.5312 - val_loss: 5.2821 - val_accuracy: 0.0417
Epoch 11/50
90/90 [==============================] - 888s 10s/step - loss: 1.4668 - accuracy: 0.5257 - val_loss: 5.5069 - val_accuracy: 0.0417
Epoch 12/50
90/90 [==============================] - 890s 10s/step - loss: 1.3670 - accuracy: 0.5639 - val_loss: 5.6636 - val_accuracy: 0.0417
Epoch 13/50
90/90 [==============================] - 861s 10s/step - loss: 1.3412 - accuracy: 0.5618 - val_loss: 5.5362 - val_accuracy: 0.0417
Epoch 14/50
90/90 [==============================] - 885s 10s/step - loss: 1.2694 - accuracy: 0.5931 - val_loss: 5.9473 - val_accuracy: 0.0417
Epoch 15/50
90/90 [==============================] - 882s 10s/step - loss: 1.2464 - accuracy: 0.6062 - val_loss: 6.1568 - val_accuracy: 0.0417
Epoch 16/50
90/90 [==============================] - 890s 10s/step - loss: 1.1958 - accuracy: 0.6306 - val_loss: 5.9811 - val_accuracy: 0.0417
Epoch 17/50
90/90 [==============================] - 881s 10s/step - loss: 1.1817 - accuracy: 0.6257 - val_loss: 5.8977 - val_accuracy: 0.0417
Epoch 18/50
90/90 [==============================] - 885s 10s/step - loss: 1.1527 - accuracy: 0.6354 - val_loss: 5.9559 - val_accuracy: 0.0472
Epoch 19/50
90/90 [==============================] - 870s 10s/step - loss: 1.0981 - accuracy: 0.6507 - val_loss: 6.1796 - val_accuracy: 0.0417
Epoch 20/50
90/90 [==============================] - 873s 10s/step - loss: 1.0697 - accuracy: 0.6667 - val_loss: 5.8840 - val_accuracy: 0.0417
Epoch 21/50
90/90 [==============================] - 901s 10s/step - loss: 1.0661 - accuracy: 0.6646 - val_loss: 6.1797 - val_accuracy: 0.0472
Epoch 22/50
90/90 [==============================] - 879s 10s/step - loss: 0.9922 - accuracy: 0.6903 - val_loss: 6.2074 - val_accuracy: 0.0417
Epoch 23/50
90/90 [==============================] - 876s 10s/step - loss: 0.9992 - accuracy: 0.6806 - val_loss: 5.4473 - val_accuracy: 0.0417
Epoch 24/50
90/90 [==============================] - 905s 10s/step - loss: 0.9279 - accuracy: 0.7069 - val_loss: 5.5743 - val_accuracy: 0.0417
Epoch 25/50
90/90 [==============================] - 894s 10s/step - loss: 0.9319 - accuracy: 0.7118 - val_loss: 6.1316 - val_accuracy: 0.0472
Epoch 26/50
90/90 [==============================] - 927s 10s/step - loss: 0.8869 - accuracy: 0.7222 - val_loss: 6.0186 - val_accuracy: 0.0472
Epoch 27/50
90/90 [==============================] - 893s 10s/step - loss: 0.9086 - accuracy: 0.7118 - val_loss: 6.8811 - val_accuracy: 0.0417
Epoch 28/50
90/90 [==============================] - 877s 10s/step - loss: 0.8965 - accuracy: 0.7118 - val_loss: 6.9371 - val_accuracy: 0.0472
Epoch 29/50
90/90 [==============================] - 912s 10s/step - loss: 0.9026 - accuracy: 0.7194 - val_loss: 6.2633 - val_accuracy: 0.0417
Epoch 30/50
90/90 [==============================] - 906s 10s/step - loss: 0.8067 - accuracy: 0.7535 - val_loss: 6.3067 - val_accuracy: 0.0472
Epoch 31/50
90/90 [==============================] - 900s 10s/step - loss: 0.7955 - accuracy: 0.7556 - val_loss: 6.1450 - val_accuracy: 0.0472
Epoch 32/50
90/90 [==============================] - 918s 10s/step - loss: 0.7941 - accuracy: 0.7486 - val_loss: 6.2223 - val_accuracy: 0.0472
Epoch 33/50
90/90 [==============================] - 1473s 16s/step - loss: 0.7692 - accuracy: 0.7667 - val_loss: 6.2006 - val_accuracy: 0.0528
Epoch 34/50
90/90 [==============================] - 1436s 16s/step - loss: 0.7648 - accuracy: 0.7514 - val_loss: 6.1662 - val_accuracy: 0.0472
Epoch 35/50
90/90 [==============================] - 1386s 15s/step - loss: 0.7358 - accuracy: 0.7722 - val_loss: 6.1199 - val_accuracy: 0.0417
Epoch 36/50
90/90 [==============================] - 1033s 11s/step - loss: 0.7337 - accuracy: 0.7604 - val_loss: 6.4092 - val_accuracy: 0.0472
Epoch 37/50
90/90 [==============================] - 897s 10s/step - loss: 0.7166 - accuracy: 0.7743 - val_loss: 7.1209 - val_accuracy: 0.0472
Epoch 38/50
90/90 [==============================] - 897s 10s/step - loss: 0.6971 - accuracy: 0.7910 - val_loss: 6.5154 - val_accuracy: 0.0417
Epoch 39/50
90/90 [==============================] - 874s 10s/step - loss: 0.6958 - accuracy: 0.7833 - val_loss: 6.9477 - val_accuracy: 0.0472
Epoch 40/50
90/90 [==============================] - 1045s 12s/step - loss: 0.6516 - accuracy: 0.8049 - val_loss: 6.6442 - val_accuracy: 0.0472
Epoch 41/50
90/90 [==============================] - 1187s 13s/step - loss: 0.6481 - accuracy: 0.7903 - val_loss: 6.5062 - val_accuracy: 0.0472
Epoch 42/50
90/90 [==============================] - 975s 11s/step - loss: 0.6312 - accuracy: 0.8021 - val_loss: 6.6628 - val_accuracy: 0.0583
Epoch 43/50
90/90 [==============================] - 887s 10s/step - loss: 0.6247 - accuracy: 0.8042 - val_loss: 6.5811 - val_accuracy: 0.0417
Epoch 44/50
90/90 [==============================] - 898s 10s/step - loss: 0.6188 - accuracy: 0.7951 - val_loss: 6.3517 - val_accuracy: 0.0583
Epoch 45/50
90/90 [==============================] - 894s 10s/step - loss: 0.6151 - accuracy: 0.8139 - val_loss: 7.5465 - val_accuracy: 0.0583
Epoch 46/50
90/90 [==============================] - 911s 10s/step - loss: 0.5698 - accuracy: 0.8271 - val_loss: 7.7967 - val_accuracy: 0.0583
Epoch 47/50
90/90 [==============================] - 904s 10s/step - loss: 0.5727 - accuracy: 0.8188 - val_loss: 7.2678 - val_accuracy: 0.0417
Epoch 48/50
90/90 [==============================] - 887s 10s/step - loss: 0.5595 - accuracy: 0.8167 - val_loss: 7.5204 - val_accuracy: 0.0583
Epoch 49/50
90/90 [==============================] - 874s 10s/step - loss: 0.5318 - accuracy: 0.8299 - val_loss: 7.6148 - val_accuracy: 0.0583
Epoch 50/50
90/90 [==============================] - 1299s 15s/step - loss: 0.5296 - accuracy: 0.8313 - val_loss: 6.7918 - val_accuracy: 0.0417
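In the logs above, the first 50 epochs belong to model1 (Adam) and the second 50 to model2 (SGD). Adam fits the training set much faster (about 95% training accuracy versus about 83% for SGD after 50 epochs), but in both runs validation accuracy stays near chance level for 17 classes (1/17 ≈ 5.9%) while validation loss keeps rising. The comparison therefore mainly measures fitting speed on the training data; the validation behavior suggests the model as configured is not generalizing to the validation split.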

IV. Model Evaluation

1. Loss and accuracy curves

plt.rcParams['savefig.dpi'] = 300   # saved-figure resolution
plt.rcParams['figure.dpi']  = 300   # display resolution

current_time = datetime.now()       # get the current time

acc1     = history_model1.history['accuracy']
acc2     = history_model2.history['accuracy']
val_acc1 = history_model1.history['val_accuracy']
val_acc2 = history_model2.history['val_accuracy']

loss1     = history_model1.history['loss']
loss2     = history_model2.history['loss']
val_loss1 = history_model1.history['val_loss']
val_loss2 = history_model2.history['val_loss']

epochs_range = range(len(acc1))

plt.figure(figsize=(16, 4))
plt.subplot(1, 2, 1)
plt.plot(epochs_range, acc1, label='Training Accuracy-Adam')
plt.plot(epochs_range, acc2, label='Training Accuracy-SGD')
plt.plot(epochs_range, val_acc1, label='Validation Accuracy-Adam')
plt.plot(epochs_range, val_acc2, label='Validation Accuracy-SGD')
plt.legend(loc='lower right')
plt.title('Training and Validation Accuracy')
plt.xlabel(current_time)  # include a timestamp when checking in, otherwise the code screenshot is invalid

# Put an x-axis tick at every epoch
ax = plt.gca()
ax.xaxis.set_major_locator(MultipleLocator(1))

plt.subplot(1, 2, 2)
plt.plot(epochs_range, loss1, label='Training Loss-Adam')
plt.plot(epochs_range, loss2, label='Training Loss-SGD')
plt.plot(epochs_range, val_loss1, label='Validation Loss-Adam')
plt.plot(epochs_range, val_loss2, label='Validation Loss-SGD')
plt.legend(loc='upper right')
plt.title('Training and Validation Loss')

# Put an x-axis tick at every epoch
ax = plt.gca()
ax.xaxis.set_major_locator(MultipleLocator(1))

plt.show()

2. Evaluate the model

def test_accuracy_report(model):
    score = model.evaluate(val_ds, verbose=0)
    print('Loss function: %s, accuracy:' % score[0], score[1])

test_accuracy_report(model2)
Loss function: 6.791763782501221, accuracy: 0.0416666679084301
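Only the SGD model is evaluated above. For a symmetric report, the same helper can be run on the Adam model too (a suggested addition, not in the original):

test_accuracy_report(model1)   # Adam-trained model
test_accuracy_report(model2)   # SGD-trained model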
