The Vehicle Type Classification Project¶
Summary¶
- Use Case: Vehicle Type
- Algorithm: MobileNetV2
- Number of training images: 603
- Number of classes: 7
- Batch Size: 64
- Optimizer: Adam
- Learning Rate: 0.0001
- Loss Type: CategoricalCrossentropy
- Transfer Learning: Yes | ImageNet
Labels¶
0: 'car-bus-alltypes',
1: 'car-sedan-alltypes',
2: 'car-suv-alltypes',
3: 'motocycle-bicycle-kids',
4: 'motocycle-bicycle-racing',
5: 'motocycle-motorbike-chopper',
6: 'motocycle-motorbike-sport'
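When decoding predictions later, it helps to have this mapping available in code. A minimal sketch (the dict simply mirrors the label list above; the helper name is ours, not part of the dataset):
In [ ]:
# Hypothetical helper: map a predicted class index back to its label string.
LABELS = {
    0: 'car-bus-alltypes',
    1: 'car-sedan-alltypes',
    2: 'car-suv-alltypes',
    3: 'motocycle-bicycle-kids',
    4: 'motocycle-bicycle-racing',
    5: 'motocycle-motorbike-chopper',
    6: 'motocycle-motorbike-sport',
}

def decode_prediction(probabilities):
    # probabilities: a length-7 vector of class scores (e.g. one row of model.predict)
    return LABELS[int(probabilities.argmax())]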
Import Libraries¶
In [1]:
# import the necessary packages
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.layers import AveragePooling2D, GlobalAveragePooling2D, BatchNormalization
#from tensorflow.keras.applications import ResNet50
#from tensorflow.keras.applications import Xception
from tensorflow.keras.applications import MobileNetV2
from tensorflow.keras.layers import Dropout
from tensorflow.keras.layers import Flatten
from tensorflow.keras.layers import Dense
from tensorflow.keras.layers import Input
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.optimizers import SGD
from tensorflow.keras.utils import to_categorical
from sklearn.preprocessing import LabelBinarizer
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report
from sklearn.metrics import confusion_matrix
from imutils import paths
import matplotlib.pyplot as plt
import numpy as np
import argparse
import cv2
import os
import sys
import tensorflow as tf
import h5py
In [2]:
print(tf.__version__)
2.9.2
Mount Google Drive¶
In [3]:
from google.colab import drive
drive.mount('/content/drive')
Mounted at /content/drive
Download the following dataset, unzip it, and put it in Google Drive.¶
https://drive.google.com/file/d/1oAW1QoG77-WLPkPa2tAM4Mnpt1cznnDl/view?usp=sharing
In [4]:
# Downloading from Google Drive did not work, so the download URL was changed
!wget --no-check-certificate https://block-yh-test1.s3.amazonaws.com/38_%E1%84%89%E1%85%B5%E1%86%AF%E1%84%89%E1%85%B3%E1%86%B8%E1%84%91%E1%85%A1%E1%84%8B%E1%85%B5%E1%86%AF.zip -O ./data.zip
--2023-01-02 05:26:09--  https://block-yh-test1.s3.amazonaws.com/38_%E1%84%89%E1%85%B5%E1%86%AF%E1%84%89%E1%85%B3%E1%86%B8%E1%84%91%E1%85%A1%E1%84%8B%E1%85%B5%E1%86%AF.zip
Resolving block-yh-test1.s3.amazonaws.com (block-yh-test1.s3.amazonaws.com)... 52.216.212.233, 52.217.89.76, 52.217.82.100, ...
Connecting to block-yh-test1.s3.amazonaws.com (block-yh-test1.s3.amazonaws.com)|52.216.212.233|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 326052257 (311M) [application/zip]
Saving to: ‘./data.zip’

./data.zip          100%[===================>] 310.95M  12.6MB/s    in 27s

2023-01-02 05:26:37 (11.5 MB/s) - ‘./data.zip’ saved [326052257/326052257]
In [5]:
! unzip data.zip
Archive: data.zip
   creating: test-images/
  inflating: __MACOSX/._test-images
  inflating: test-images/bicycle.png
  inflating: __MACOSX/test-images/._bicycle.png
  inflating: test-images/bus.jpg
  inflating: __MACOSX/test-images/._bus.jpg
   creating: trained-models/
  inflating: __MACOSX/._trained-models
   creating: trained-models/mobilenetv2/
  inflating: __MACOSX/trained-models/._mobilenetv2
  inflating: trained-models/mobilenetv2/vehicle-classification-by-type-mobilenetv2-anis-1.h5
  inflating: __MACOSX/trained-models/mobilenetv2/._vehicle-classification-by-type-mobilenetv2-anis-1.h5
  inflating: trained-models/mobilenetv2/vehicle-classification-by-type-mobilenetv2-anis-1-2.h5
  inflating: __MACOSX/trained-models/mobilenetv2/._vehicle-classification-by-type-mobilenetv2-anis-1-2.h5
  inflating: trained-models/mobilenetv2/vehicle-classification-by-type-mobilenetv2-anis-2.h5
  inflating: __MACOSX/trained-models/mobilenetv2/._vehicle-classification-by-type-mobilenetv2-anis-2.h5
   creating: vehicle_datasets/
  inflating: __MACOSX/._vehicle_datasets
  inflating: vehicle_datasets/vehicle-type-dataset-SIZE224.hdf5.csv
  inflating: __MACOSX/vehicle_datasets/._vehicle-type-dataset-SIZE224.hdf5.csv
  inflating: vehicle_datasets/vehicle-type-dataset-SIZE224.hdf5
  inflating: __MACOSX/vehicle_datasets/._vehicle-type-dataset-SIZE224.hdf5
  inflating: vehicle_datasets/vehicle-type-dataset-SIZE224-train-dev-test-v2.hdf5
  inflating: __MACOSX/vehicle_datasets/._vehicle-type-dataset-SIZE224-train-dev-test-v2.hdf5
Path Setup¶
In [6]:
TYPE='type'
model_type='mobilenetv2'
user='block'
iteration='1-2'
first_time_training=True
PROJECT_PATH= '/content/drive/MyDrive/Colab Notebooks/ml_plus' # path inside my Google Drive
HDF5_DATASET_PATH=PROJECT_PATH+'/vehicle_datasets/vehicle-type-dataset-SIZE224-train-dev-test-v2.hdf5'
TARGET_CLASSIFICATION_MODEL=PROJECT_PATH+'/trained-models/'+model_type+'/'+'vehicle-classification-by-'+TYPE+'-'+model_type+'-'+user+'-'+iteration+'.h5'
CHECKPOINT_PATH = PROJECT_PATH+'/checkpoints/'+model_type+'/'+'by-'+TYPE+'-'+model_type+'-'+user+'-'+iteration+'.h5' # ModelCheckpoint target // saved to Google Drive so it persists
LOGFILE_PATH=PROJECT_PATH+'/log/'+model_type+'/'+model_type+'-by-'+TYPE+'-training-log'+user+'-'+iteration+'.csv' # the training log is saved the same way
In [7]:
print('PROJECT_PATH: ',PROJECT_PATH)
print('HDF5_DATASET_PATH: ', HDF5_DATASET_PATH)
print('TARGET_CLASSIFICATION_MODEL: ',TARGET_CLASSIFICATION_MODEL)
print('CHECKPOINT_PATH: ',CHECKPOINT_PATH)
print('LOGFILE_PATH: ',LOGFILE_PATH)
PROJECT_PATH:  /content/drive/MyDrive/Colab Notebooks/ml_plus
HDF5_DATASET_PATH:  /content/drive/MyDrive/Colab Notebooks/ml_plus/vehicle_datasets/vehicle-type-dataset-SIZE224-train-dev-test-v2.hdf5
TARGET_CLASSIFICATION_MODEL:  /content/drive/MyDrive/Colab Notebooks/ml_plus/trained-models/mobilenetv2/vehicle-classification-by-type-mobilenetv2-block-1-2.h5
CHECKPOINT_PATH:  /content/drive/MyDrive/Colab Notebooks/ml_plus/checkpoints/mobilenetv2/by-type-mobilenetv2-block-1-2.h5
LOGFILE_PATH:  /content/drive/MyDrive/Colab Notebooks/ml_plus/log/mobilenetv2/mobilenetv2-by-type-training-logblock-1-2.csv
In [8]:
sys.path.append(PROJECT_PATH)
Load the Dataset¶
In [9]:
import os
os.chdir(PROJECT_PATH)
In [10]:
# This is the structure published by the person who created the dataset.
def load_dataset_from_hdf5_file(hdf_file_path):
    # open the HDF5 file and read each split into a NumPy array
    hf = h5py.File(hdf_file_path, 'r')
    X_train = np.array(hf['trainX'])
    y_train = np.array(hf['trainY'])
    train_label = np.array(hf['trainLabels'])
    X_test = np.array(hf['testX'])
    y_test = np.array(hf['testY'])
    test_label = np.array(hf['testLabels'])
    X_val = np.array(hf['devX'])
    y_val = np.array(hf['devY'])
    val_label = np.array(hf['devLabels'])
    return X_train, y_train, train_label, X_test, y_test, test_label, X_val, y_val, val_label
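Before relying on these key names, it can help to list what the HDF5 file actually contains. A quick sketch using the same h5py import:
In [ ]:
# Sanity check (sketch): list every dataset stored in the HDF5 file,
# along with its shape and dtype, before loading the splits.
with h5py.File(HDF5_DATASET_PATH, 'r') as hf:
    for key in hf.keys():
        print(key, hf[key].shape, hf[key].dtype)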
In [11]:
X_train, y_train, train_label, X_test, y_test, test_label, X_val, y_val, val_label=load_dataset_from_hdf5_file(HDF5_DATASET_PATH)
In [12]:
X_train.shape
Out[12]:
(603, 224, 224, 3)
In [13]:
plt.imshow(X_train[2])
plt.show()
In [14]:
y_train[2]
Out[14]:
array([0, 0, 0, 0, 1, 0, 0])
In [15]:
y_train[2].argmax()
Out[15]:
4
In [16]:
train_label[2]
Out[16]:
b'motocycle-bicycle-racing'
In [17]:
X_train[0].max() # check with min/max whether feature scaling has already been applied
Out[17]:
1.0
In [18]:
X_train[0].min()
Out[18]:
0.0
Inspect sample images¶
In [19]:
base_model = MobileNetV2(input_shape= (224,224,3), include_top=False ) # the allowed input_shape values are specified in the docs // the weights parameter says which dataset the network was pre-trained on (defaults to 'imagenet')
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/mobilenet_v2/mobilenet_v2_weights_tf_dim_ordering_tf_kernels_1.0_224_no_top.h5
9406464/9406464 [==============================] - 0s 0us/step
In [20]:
base_model.summary()
Model: "mobilenetv2_1.00_224" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_1 (InputLayer) [(None, 224, 224, 3 0 [] )] Conv1 (Conv2D) (None, 112, 112, 32 864 ['input_1[0][0]'] ) bn_Conv1 (BatchNormalization) (None, 112, 112, 32 128 ['Conv1[0][0]'] ) Conv1_relu (ReLU) (None, 112, 112, 32 0 ['bn_Conv1[0][0]'] ) expanded_conv_depthwise (Depth (None, 112, 112, 32 288 ['Conv1_relu[0][0]'] wiseConv2D) ) expanded_conv_depthwise_BN (Ba (None, 112, 112, 32 128 ['expanded_conv_depthwise[0][0]'] tchNormalization) ) expanded_conv_depthwise_relu ( (None, 112, 112, 32 0 ['expanded_conv_depthwise_BN[0][0 ReLU) ) ]'] expanded_conv_project (Conv2D) (None, 112, 112, 16 512 ['expanded_conv_depthwise_relu[0] ) [0]'] expanded_conv_project_BN (Batc (None, 112, 112, 16 64 ['expanded_conv_project[0][0]'] hNormalization) ) block_1_expand (Conv2D) (None, 112, 112, 96 1536 ['expanded_conv_project_BN[0][0]' ) ] block_1_expand_BN (BatchNormal (None, 112, 112, 96 384 ['block_1_expand[0][0]'] ization) ) block_1_expand_relu (ReLU) (None, 112, 112, 96 0 ['block_1_expand_BN[0][0]'] ) block_1_pad (ZeroPadding2D) (None, 113, 113, 96 0 ['block_1_expand_relu[0][0]'] ) block_1_depthwise (DepthwiseCo (None, 56, 56, 96) 864 ['block_1_pad[0][0]'] nv2D) block_1_depthwise_BN (BatchNor (None, 56, 56, 96) 384 ['block_1_depthwise[0][0]'] malization) block_1_depthwise_relu (ReLU) (None, 56, 56, 96) 0 ['block_1_depthwise_BN[0][0]'] block_1_project (Conv2D) (None, 56, 56, 24) 2304 ['block_1_depthwise_relu[0][0]'] block_1_project_BN (BatchNorma (None, 56, 56, 24) 96 ['block_1_project[0][0]'] lization) block_2_expand (Conv2D) (None, 56, 56, 144) 3456 ['block_1_project_BN[0][0]'] block_2_expand_BN (BatchNormal (None, 56, 56, 144) 576 ['block_2_expand[0][0]'] ization) block_2_expand_relu (ReLU) (None, 56, 56, 144) 0 ['block_2_expand_BN[0][0]'] block_2_depthwise (DepthwiseCo (None, 56, 56, 144) 1296 ['block_2_expand_relu[0][0]'] nv2D) block_2_depthwise_BN (BatchNor (None, 56, 56, 144) 576 ['block_2_depthwise[0][0]'] malization) block_2_depthwise_relu (ReLU) (None, 56, 56, 144) 0 ['block_2_depthwise_BN[0][0]'] block_2_project (Conv2D) (None, 56, 56, 24) 3456 ['block_2_depthwise_relu[0][0]'] block_2_project_BN (BatchNorma (None, 56, 56, 24) 96 ['block_2_project[0][0]'] lization) block_2_add (Add) (None, 56, 56, 24) 0 ['block_1_project_BN[0][0]', 'block_2_project_BN[0][0]'] block_3_expand (Conv2D) (None, 56, 56, 144) 3456 ['block_2_add[0][0]'] block_3_expand_BN (BatchNormal (None, 56, 56, 144) 576 ['block_3_expand[0][0]'] ization) block_3_expand_relu (ReLU) (None, 56, 56, 144) 0 ['block_3_expand_BN[0][0]'] block_3_pad (ZeroPadding2D) (None, 57, 57, 144) 0 ['block_3_expand_relu[0][0]'] block_3_depthwise (DepthwiseCo (None, 28, 28, 144) 1296 ['block_3_pad[0][0]'] nv2D) block_3_depthwise_BN (BatchNor (None, 28, 28, 144) 576 ['block_3_depthwise[0][0]'] malization) block_3_depthwise_relu (ReLU) (None, 28, 28, 144) 0 ['block_3_depthwise_BN[0][0]'] block_3_project (Conv2D) (None, 28, 28, 32) 4608 ['block_3_depthwise_relu[0][0]'] block_3_project_BN (BatchNorma (None, 28, 28, 32) 128 ['block_3_project[0][0]'] lization) block_4_expand (Conv2D) (None, 28, 28, 192) 6144 ['block_3_project_BN[0][0]'] block_4_expand_BN (BatchNormal (None, 28, 28, 192) 768 ['block_4_expand[0][0]'] ization) block_4_expand_relu (ReLU) (None, 28, 28, 192) 0 
['block_4_expand_BN[0][0]'] block_4_depthwise (DepthwiseCo (None, 28, 28, 192) 1728 ['block_4_expand_relu[0][0]'] nv2D) block_4_depthwise_BN (BatchNor (None, 28, 28, 192) 768 ['block_4_depthwise[0][0]'] malization) block_4_depthwise_relu (ReLU) (None, 28, 28, 192) 0 ['block_4_depthwise_BN[0][0]'] block_4_project (Conv2D) (None, 28, 28, 32) 6144 ['block_4_depthwise_relu[0][0]'] block_4_project_BN (BatchNorma (None, 28, 28, 32) 128 ['block_4_project[0][0]'] lization) block_4_add (Add) (None, 28, 28, 32) 0 ['block_3_project_BN[0][0]', 'block_4_project_BN[0][0]'] block_5_expand (Conv2D) (None, 28, 28, 192) 6144 ['block_4_add[0][0]'] block_5_expand_BN (BatchNormal (None, 28, 28, 192) 768 ['block_5_expand[0][0]'] ization) block_5_expand_relu (ReLU) (None, 28, 28, 192) 0 ['block_5_expand_BN[0][0]'] block_5_depthwise (DepthwiseCo (None, 28, 28, 192) 1728 ['block_5_expand_relu[0][0]'] nv2D) block_5_depthwise_BN (BatchNor (None, 28, 28, 192) 768 ['block_5_depthwise[0][0]'] malization) block_5_depthwise_relu (ReLU) (None, 28, 28, 192) 0 ['block_5_depthwise_BN[0][0]'] block_5_project (Conv2D) (None, 28, 28, 32) 6144 ['block_5_depthwise_relu[0][0]'] block_5_project_BN (BatchNorma (None, 28, 28, 32) 128 ['block_5_project[0][0]'] lization) block_5_add (Add) (None, 28, 28, 32) 0 ['block_4_add[0][0]', 'block_5_project_BN[0][0]'] block_6_expand (Conv2D) (None, 28, 28, 192) 6144 ['block_5_add[0][0]'] block_6_expand_BN (BatchNormal (None, 28, 28, 192) 768 ['block_6_expand[0][0]'] ization) block_6_expand_relu (ReLU) (None, 28, 28, 192) 0 ['block_6_expand_BN[0][0]'] block_6_pad (ZeroPadding2D) (None, 29, 29, 192) 0 ['block_6_expand_relu[0][0]'] block_6_depthwise (DepthwiseCo (None, 14, 14, 192) 1728 ['block_6_pad[0][0]'] nv2D) block_6_depthwise_BN (BatchNor (None, 14, 14, 192) 768 ['block_6_depthwise[0][0]'] malization) block_6_depthwise_relu (ReLU) (None, 14, 14, 192) 0 ['block_6_depthwise_BN[0][0]'] block_6_project (Conv2D) (None, 14, 14, 64) 12288 ['block_6_depthwise_relu[0][0]'] block_6_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_6_project[0][0]'] lization) block_7_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_6_project_BN[0][0]'] block_7_expand_BN (BatchNormal (None, 14, 14, 384) 1536 ['block_7_expand[0][0]'] ization) block_7_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_7_expand_BN[0][0]'] block_7_depthwise (DepthwiseCo (None, 14, 14, 384) 3456 ['block_7_expand_relu[0][0]'] nv2D) block_7_depthwise_BN (BatchNor (None, 14, 14, 384) 1536 ['block_7_depthwise[0][0]'] malization) block_7_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_7_depthwise_BN[0][0]'] block_7_project (Conv2D) (None, 14, 14, 64) 24576 ['block_7_depthwise_relu[0][0]'] block_7_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_7_project[0][0]'] lization) block_7_add (Add) (None, 14, 14, 64) 0 ['block_6_project_BN[0][0]', 'block_7_project_BN[0][0]'] block_8_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_7_add[0][0]'] block_8_expand_BN (BatchNormal (None, 14, 14, 384) 1536 ['block_8_expand[0][0]'] ization) block_8_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_8_expand_BN[0][0]'] block_8_depthwise (DepthwiseCo (None, 14, 14, 384) 3456 ['block_8_expand_relu[0][0]'] nv2D) block_8_depthwise_BN (BatchNor (None, 14, 14, 384) 1536 ['block_8_depthwise[0][0]'] malization) block_8_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_8_depthwise_BN[0][0]'] block_8_project (Conv2D) (None, 14, 14, 64) 24576 ['block_8_depthwise_relu[0][0]'] block_8_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_8_project[0][0]'] lization) 
block_8_add (Add) (None, 14, 14, 64) 0 ['block_7_add[0][0]', 'block_8_project_BN[0][0]'] block_9_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_8_add[0][0]'] block_9_expand_BN (BatchNormal (None, 14, 14, 384) 1536 ['block_9_expand[0][0]'] ization) block_9_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_9_expand_BN[0][0]'] block_9_depthwise (DepthwiseCo (None, 14, 14, 384) 3456 ['block_9_expand_relu[0][0]'] nv2D) block_9_depthwise_BN (BatchNor (None, 14, 14, 384) 1536 ['block_9_depthwise[0][0]'] malization) block_9_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_9_depthwise_BN[0][0]'] block_9_project (Conv2D) (None, 14, 14, 64) 24576 ['block_9_depthwise_relu[0][0]'] block_9_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_9_project[0][0]'] lization) block_9_add (Add) (None, 14, 14, 64) 0 ['block_8_add[0][0]', 'block_9_project_BN[0][0]'] block_10_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_9_add[0][0]'] block_10_expand_BN (BatchNorma (None, 14, 14, 384) 1536 ['block_10_expand[0][0]'] lization) block_10_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_10_expand_BN[0][0]'] block_10_depthwise (DepthwiseC (None, 14, 14, 384) 3456 ['block_10_expand_relu[0][0]'] onv2D) block_10_depthwise_BN (BatchNo (None, 14, 14, 384) 1536 ['block_10_depthwise[0][0]'] rmalization) block_10_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_10_depthwise_BN[0][0]'] block_10_project (Conv2D) (None, 14, 14, 96) 36864 ['block_10_depthwise_relu[0][0]'] block_10_project_BN (BatchNorm (None, 14, 14, 96) 384 ['block_10_project[0][0]'] alization) block_11_expand (Conv2D) (None, 14, 14, 576) 55296 ['block_10_project_BN[0][0]'] block_11_expand_BN (BatchNorma (None, 14, 14, 576) 2304 ['block_11_expand[0][0]'] lization) block_11_expand_relu (ReLU) (None, 14, 14, 576) 0 ['block_11_expand_BN[0][0]'] block_11_depthwise (DepthwiseC (None, 14, 14, 576) 5184 ['block_11_expand_relu[0][0]'] onv2D) block_11_depthwise_BN (BatchNo (None, 14, 14, 576) 2304 ['block_11_depthwise[0][0]'] rmalization) block_11_depthwise_relu (ReLU) (None, 14, 14, 576) 0 ['block_11_depthwise_BN[0][0]'] block_11_project (Conv2D) (None, 14, 14, 96) 55296 ['block_11_depthwise_relu[0][0]'] block_11_project_BN (BatchNorm (None, 14, 14, 96) 384 ['block_11_project[0][0]'] alization) block_11_add (Add) (None, 14, 14, 96) 0 ['block_10_project_BN[0][0]', 'block_11_project_BN[0][0]'] block_12_expand (Conv2D) (None, 14, 14, 576) 55296 ['block_11_add[0][0]'] block_12_expand_BN (BatchNorma (None, 14, 14, 576) 2304 ['block_12_expand[0][0]'] lization) block_12_expand_relu (ReLU) (None, 14, 14, 576) 0 ['block_12_expand_BN[0][0]'] block_12_depthwise (DepthwiseC (None, 14, 14, 576) 5184 ['block_12_expand_relu[0][0]'] onv2D) block_12_depthwise_BN (BatchNo (None, 14, 14, 576) 2304 ['block_12_depthwise[0][0]'] rmalization) block_12_depthwise_relu (ReLU) (None, 14, 14, 576) 0 ['block_12_depthwise_BN[0][0]'] block_12_project (Conv2D) (None, 14, 14, 96) 55296 ['block_12_depthwise_relu[0][0]'] block_12_project_BN (BatchNorm (None, 14, 14, 96) 384 ['block_12_project[0][0]'] alization) block_12_add (Add) (None, 14, 14, 96) 0 ['block_11_add[0][0]', 'block_12_project_BN[0][0]'] block_13_expand (Conv2D) (None, 14, 14, 576) 55296 ['block_12_add[0][0]'] block_13_expand_BN (BatchNorma (None, 14, 14, 576) 2304 ['block_13_expand[0][0]'] lization) block_13_expand_relu (ReLU) (None, 14, 14, 576) 0 ['block_13_expand_BN[0][0]'] block_13_pad (ZeroPadding2D) (None, 15, 15, 576) 0 ['block_13_expand_relu[0][0]'] block_13_depthwise (DepthwiseC (None, 7, 7, 576) 5184 
['block_13_pad[0][0]'] onv2D) block_13_depthwise_BN (BatchNo (None, 7, 7, 576) 2304 ['block_13_depthwise[0][0]'] rmalization) block_13_depthwise_relu (ReLU) (None, 7, 7, 576) 0 ['block_13_depthwise_BN[0][0]'] block_13_project (Conv2D) (None, 7, 7, 160) 92160 ['block_13_depthwise_relu[0][0]'] block_13_project_BN (BatchNorm (None, 7, 7, 160) 640 ['block_13_project[0][0]'] alization) block_14_expand (Conv2D) (None, 7, 7, 960) 153600 ['block_13_project_BN[0][0]'] block_14_expand_BN (BatchNorma (None, 7, 7, 960) 3840 ['block_14_expand[0][0]'] lization) block_14_expand_relu (ReLU) (None, 7, 7, 960) 0 ['block_14_expand_BN[0][0]'] block_14_depthwise (DepthwiseC (None, 7, 7, 960) 8640 ['block_14_expand_relu[0][0]'] onv2D) block_14_depthwise_BN (BatchNo (None, 7, 7, 960) 3840 ['block_14_depthwise[0][0]'] rmalization) block_14_depthwise_relu (ReLU) (None, 7, 7, 960) 0 ['block_14_depthwise_BN[0][0]'] block_14_project (Conv2D) (None, 7, 7, 160) 153600 ['block_14_depthwise_relu[0][0]'] block_14_project_BN (BatchNorm (None, 7, 7, 160) 640 ['block_14_project[0][0]'] alization) block_14_add (Add) (None, 7, 7, 160) 0 ['block_13_project_BN[0][0]', 'block_14_project_BN[0][0]'] block_15_expand (Conv2D) (None, 7, 7, 960) 153600 ['block_14_add[0][0]'] block_15_expand_BN (BatchNorma (None, 7, 7, 960) 3840 ['block_15_expand[0][0]'] lization) block_15_expand_relu (ReLU) (None, 7, 7, 960) 0 ['block_15_expand_BN[0][0]'] block_15_depthwise (DepthwiseC (None, 7, 7, 960) 8640 ['block_15_expand_relu[0][0]'] onv2D) block_15_depthwise_BN (BatchNo (None, 7, 7, 960) 3840 ['block_15_depthwise[0][0]'] rmalization) block_15_depthwise_relu (ReLU) (None, 7, 7, 960) 0 ['block_15_depthwise_BN[0][0]'] block_15_project (Conv2D) (None, 7, 7, 160) 153600 ['block_15_depthwise_relu[0][0]'] block_15_project_BN (BatchNorm (None, 7, 7, 160) 640 ['block_15_project[0][0]'] alization) block_15_add (Add) (None, 7, 7, 160) 0 ['block_14_add[0][0]', 'block_15_project_BN[0][0]'] block_16_expand (Conv2D) (None, 7, 7, 960) 153600 ['block_15_add[0][0]'] block_16_expand_BN (BatchNorma (None, 7, 7, 960) 3840 ['block_16_expand[0][0]'] lization) block_16_expand_relu (ReLU) (None, 7, 7, 960) 0 ['block_16_expand_BN[0][0]'] block_16_depthwise (DepthwiseC (None, 7, 7, 960) 8640 ['block_16_expand_relu[0][0]'] onv2D) block_16_depthwise_BN (BatchNo (None, 7, 7, 960) 3840 ['block_16_depthwise[0][0]'] rmalization) block_16_depthwise_relu (ReLU) (None, 7, 7, 960) 0 ['block_16_depthwise_BN[0][0]'] block_16_project (Conv2D) (None, 7, 7, 320) 307200 ['block_16_depthwise_relu[0][0]'] block_16_project_BN (BatchNorm (None, 7, 7, 320) 1280 ['block_16_project[0][0]'] alization) Conv_1 (Conv2D) (None, 7, 7, 1280) 409600 ['block_16_project_BN[0][0]'] Conv_1_bn (BatchNormalization) (None, 7, 7, 1280) 5120 ['Conv_1[0][0]'] out_relu (ReLU) (None, 7, 7, 1280) 0 ['Conv_1_bn[0][0]'] ================================================================================================== Total params: 2,257,984 Trainable params: 2,223,872 Non-trainable params: 34,112 __________________________________________________________________________________________________
In [21]:
base_model.trainable = False
In [22]:
base_model.summary()
Model: "mobilenetv2_1.00_224" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_1 (InputLayer) [(None, 224, 224, 3 0 [] )] Conv1 (Conv2D) (None, 112, 112, 32 864 ['input_1[0][0]'] ) bn_Conv1 (BatchNormalization) (None, 112, 112, 32 128 ['Conv1[0][0]'] ) Conv1_relu (ReLU) (None, 112, 112, 32 0 ['bn_Conv1[0][0]'] ) expanded_conv_depthwise (Depth (None, 112, 112, 32 288 ['Conv1_relu[0][0]'] wiseConv2D) ) expanded_conv_depthwise_BN (Ba (None, 112, 112, 32 128 ['expanded_conv_depthwise[0][0]'] tchNormalization) ) expanded_conv_depthwise_relu ( (None, 112, 112, 32 0 ['expanded_conv_depthwise_BN[0][0 ReLU) ) ]'] expanded_conv_project (Conv2D) (None, 112, 112, 16 512 ['expanded_conv_depthwise_relu[0] ) [0]'] expanded_conv_project_BN (Batc (None, 112, 112, 16 64 ['expanded_conv_project[0][0]'] hNormalization) ) block_1_expand (Conv2D) (None, 112, 112, 96 1536 ['expanded_conv_project_BN[0][0]' ) ] block_1_expand_BN (BatchNormal (None, 112, 112, 96 384 ['block_1_expand[0][0]'] ization) ) block_1_expand_relu (ReLU) (None, 112, 112, 96 0 ['block_1_expand_BN[0][0]'] ) block_1_pad (ZeroPadding2D) (None, 113, 113, 96 0 ['block_1_expand_relu[0][0]'] ) block_1_depthwise (DepthwiseCo (None, 56, 56, 96) 864 ['block_1_pad[0][0]'] nv2D) block_1_depthwise_BN (BatchNor (None, 56, 56, 96) 384 ['block_1_depthwise[0][0]'] malization) block_1_depthwise_relu (ReLU) (None, 56, 56, 96) 0 ['block_1_depthwise_BN[0][0]'] block_1_project (Conv2D) (None, 56, 56, 24) 2304 ['block_1_depthwise_relu[0][0]'] block_1_project_BN (BatchNorma (None, 56, 56, 24) 96 ['block_1_project[0][0]'] lization) block_2_expand (Conv2D) (None, 56, 56, 144) 3456 ['block_1_project_BN[0][0]'] block_2_expand_BN (BatchNormal (None, 56, 56, 144) 576 ['block_2_expand[0][0]'] ization) block_2_expand_relu (ReLU) (None, 56, 56, 144) 0 ['block_2_expand_BN[0][0]'] block_2_depthwise (DepthwiseCo (None, 56, 56, 144) 1296 ['block_2_expand_relu[0][0]'] nv2D) block_2_depthwise_BN (BatchNor (None, 56, 56, 144) 576 ['block_2_depthwise[0][0]'] malization) block_2_depthwise_relu (ReLU) (None, 56, 56, 144) 0 ['block_2_depthwise_BN[0][0]'] block_2_project (Conv2D) (None, 56, 56, 24) 3456 ['block_2_depthwise_relu[0][0]'] block_2_project_BN (BatchNorma (None, 56, 56, 24) 96 ['block_2_project[0][0]'] lization) block_2_add (Add) (None, 56, 56, 24) 0 ['block_1_project_BN[0][0]', 'block_2_project_BN[0][0]'] block_3_expand (Conv2D) (None, 56, 56, 144) 3456 ['block_2_add[0][0]'] block_3_expand_BN (BatchNormal (None, 56, 56, 144) 576 ['block_3_expand[0][0]'] ization) block_3_expand_relu (ReLU) (None, 56, 56, 144) 0 ['block_3_expand_BN[0][0]'] block_3_pad (ZeroPadding2D) (None, 57, 57, 144) 0 ['block_3_expand_relu[0][0]'] block_3_depthwise (DepthwiseCo (None, 28, 28, 144) 1296 ['block_3_pad[0][0]'] nv2D) block_3_depthwise_BN (BatchNor (None, 28, 28, 144) 576 ['block_3_depthwise[0][0]'] malization) block_3_depthwise_relu (ReLU) (None, 28, 28, 144) 0 ['block_3_depthwise_BN[0][0]'] block_3_project (Conv2D) (None, 28, 28, 32) 4608 ['block_3_depthwise_relu[0][0]'] block_3_project_BN (BatchNorma (None, 28, 28, 32) 128 ['block_3_project[0][0]'] lization) block_4_expand (Conv2D) (None, 28, 28, 192) 6144 ['block_3_project_BN[0][0]'] block_4_expand_BN (BatchNormal (None, 28, 28, 192) 768 ['block_4_expand[0][0]'] ization) block_4_expand_relu (ReLU) (None, 28, 28, 192) 0 
['block_4_expand_BN[0][0]'] block_4_depthwise (DepthwiseCo (None, 28, 28, 192) 1728 ['block_4_expand_relu[0][0]'] nv2D) block_4_depthwise_BN (BatchNor (None, 28, 28, 192) 768 ['block_4_depthwise[0][0]'] malization) block_4_depthwise_relu (ReLU) (None, 28, 28, 192) 0 ['block_4_depthwise_BN[0][0]'] block_4_project (Conv2D) (None, 28, 28, 32) 6144 ['block_4_depthwise_relu[0][0]'] block_4_project_BN (BatchNorma (None, 28, 28, 32) 128 ['block_4_project[0][0]'] lization) block_4_add (Add) (None, 28, 28, 32) 0 ['block_3_project_BN[0][0]', 'block_4_project_BN[0][0]'] block_5_expand (Conv2D) (None, 28, 28, 192) 6144 ['block_4_add[0][0]'] block_5_expand_BN (BatchNormal (None, 28, 28, 192) 768 ['block_5_expand[0][0]'] ization) block_5_expand_relu (ReLU) (None, 28, 28, 192) 0 ['block_5_expand_BN[0][0]'] block_5_depthwise (DepthwiseCo (None, 28, 28, 192) 1728 ['block_5_expand_relu[0][0]'] nv2D) block_5_depthwise_BN (BatchNor (None, 28, 28, 192) 768 ['block_5_depthwise[0][0]'] malization) block_5_depthwise_relu (ReLU) (None, 28, 28, 192) 0 ['block_5_depthwise_BN[0][0]'] block_5_project (Conv2D) (None, 28, 28, 32) 6144 ['block_5_depthwise_relu[0][0]'] block_5_project_BN (BatchNorma (None, 28, 28, 32) 128 ['block_5_project[0][0]'] lization) block_5_add (Add) (None, 28, 28, 32) 0 ['block_4_add[0][0]', 'block_5_project_BN[0][0]'] block_6_expand (Conv2D) (None, 28, 28, 192) 6144 ['block_5_add[0][0]'] block_6_expand_BN (BatchNormal (None, 28, 28, 192) 768 ['block_6_expand[0][0]'] ization) block_6_expand_relu (ReLU) (None, 28, 28, 192) 0 ['block_6_expand_BN[0][0]'] block_6_pad (ZeroPadding2D) (None, 29, 29, 192) 0 ['block_6_expand_relu[0][0]'] block_6_depthwise (DepthwiseCo (None, 14, 14, 192) 1728 ['block_6_pad[0][0]'] nv2D) block_6_depthwise_BN (BatchNor (None, 14, 14, 192) 768 ['block_6_depthwise[0][0]'] malization) block_6_depthwise_relu (ReLU) (None, 14, 14, 192) 0 ['block_6_depthwise_BN[0][0]'] block_6_project (Conv2D) (None, 14, 14, 64) 12288 ['block_6_depthwise_relu[0][0]'] block_6_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_6_project[0][0]'] lization) block_7_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_6_project_BN[0][0]'] block_7_expand_BN (BatchNormal (None, 14, 14, 384) 1536 ['block_7_expand[0][0]'] ization) block_7_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_7_expand_BN[0][0]'] block_7_depthwise (DepthwiseCo (None, 14, 14, 384) 3456 ['block_7_expand_relu[0][0]'] nv2D) block_7_depthwise_BN (BatchNor (None, 14, 14, 384) 1536 ['block_7_depthwise[0][0]'] malization) block_7_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_7_depthwise_BN[0][0]'] block_7_project (Conv2D) (None, 14, 14, 64) 24576 ['block_7_depthwise_relu[0][0]'] block_7_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_7_project[0][0]'] lization) block_7_add (Add) (None, 14, 14, 64) 0 ['block_6_project_BN[0][0]', 'block_7_project_BN[0][0]'] block_8_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_7_add[0][0]'] block_8_expand_BN (BatchNormal (None, 14, 14, 384) 1536 ['block_8_expand[0][0]'] ization) block_8_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_8_expand_BN[0][0]'] block_8_depthwise (DepthwiseCo (None, 14, 14, 384) 3456 ['block_8_expand_relu[0][0]'] nv2D) block_8_depthwise_BN (BatchNor (None, 14, 14, 384) 1536 ['block_8_depthwise[0][0]'] malization) block_8_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_8_depthwise_BN[0][0]'] block_8_project (Conv2D) (None, 14, 14, 64) 24576 ['block_8_depthwise_relu[0][0]'] block_8_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_8_project[0][0]'] lization) 
block_8_add (Add) (None, 14, 14, 64) 0 ['block_7_add[0][0]', 'block_8_project_BN[0][0]'] block_9_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_8_add[0][0]'] block_9_expand_BN (BatchNormal (None, 14, 14, 384) 1536 ['block_9_expand[0][0]'] ization) block_9_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_9_expand_BN[0][0]'] block_9_depthwise (DepthwiseCo (None, 14, 14, 384) 3456 ['block_9_expand_relu[0][0]'] nv2D) block_9_depthwise_BN (BatchNor (None, 14, 14, 384) 1536 ['block_9_depthwise[0][0]'] malization) block_9_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_9_depthwise_BN[0][0]'] block_9_project (Conv2D) (None, 14, 14, 64) 24576 ['block_9_depthwise_relu[0][0]'] block_9_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_9_project[0][0]'] lization) block_9_add (Add) (None, 14, 14, 64) 0 ['block_8_add[0][0]', 'block_9_project_BN[0][0]'] block_10_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_9_add[0][0]'] block_10_expand_BN (BatchNorma (None, 14, 14, 384) 1536 ['block_10_expand[0][0]'] lization) block_10_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_10_expand_BN[0][0]'] block_10_depthwise (DepthwiseC (None, 14, 14, 384) 3456 ['block_10_expand_relu[0][0]'] onv2D) block_10_depthwise_BN (BatchNo (None, 14, 14, 384) 1536 ['block_10_depthwise[0][0]'] rmalization) block_10_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_10_depthwise_BN[0][0]'] block_10_project (Conv2D) (None, 14, 14, 96) 36864 ['block_10_depthwise_relu[0][0]'] block_10_project_BN (BatchNorm (None, 14, 14, 96) 384 ['block_10_project[0][0]'] alization) block_11_expand (Conv2D) (None, 14, 14, 576) 55296 ['block_10_project_BN[0][0]'] block_11_expand_BN (BatchNorma (None, 14, 14, 576) 2304 ['block_11_expand[0][0]'] lization) block_11_expand_relu (ReLU) (None, 14, 14, 576) 0 ['block_11_expand_BN[0][0]'] block_11_depthwise (DepthwiseC (None, 14, 14, 576) 5184 ['block_11_expand_relu[0][0]'] onv2D) block_11_depthwise_BN (BatchNo (None, 14, 14, 576) 2304 ['block_11_depthwise[0][0]'] rmalization) block_11_depthwise_relu (ReLU) (None, 14, 14, 576) 0 ['block_11_depthwise_BN[0][0]'] block_11_project (Conv2D) (None, 14, 14, 96) 55296 ['block_11_depthwise_relu[0][0]'] block_11_project_BN (BatchNorm (None, 14, 14, 96) 384 ['block_11_project[0][0]'] alization) block_11_add (Add) (None, 14, 14, 96) 0 ['block_10_project_BN[0][0]', 'block_11_project_BN[0][0]'] block_12_expand (Conv2D) (None, 14, 14, 576) 55296 ['block_11_add[0][0]'] block_12_expand_BN (BatchNorma (None, 14, 14, 576) 2304 ['block_12_expand[0][0]'] lization) block_12_expand_relu (ReLU) (None, 14, 14, 576) 0 ['block_12_expand_BN[0][0]'] block_12_depthwise (DepthwiseC (None, 14, 14, 576) 5184 ['block_12_expand_relu[0][0]'] onv2D) block_12_depthwise_BN (BatchNo (None, 14, 14, 576) 2304 ['block_12_depthwise[0][0]'] rmalization) block_12_depthwise_relu (ReLU) (None, 14, 14, 576) 0 ['block_12_depthwise_BN[0][0]'] block_12_project (Conv2D) (None, 14, 14, 96) 55296 ['block_12_depthwise_relu[0][0]'] block_12_project_BN (BatchNorm (None, 14, 14, 96) 384 ['block_12_project[0][0]'] alization) block_12_add (Add) (None, 14, 14, 96) 0 ['block_11_add[0][0]', 'block_12_project_BN[0][0]'] block_13_expand (Conv2D) (None, 14, 14, 576) 55296 ['block_12_add[0][0]'] block_13_expand_BN (BatchNorma (None, 14, 14, 576) 2304 ['block_13_expand[0][0]'] lization) block_13_expand_relu (ReLU) (None, 14, 14, 576) 0 ['block_13_expand_BN[0][0]'] block_13_pad (ZeroPadding2D) (None, 15, 15, 576) 0 ['block_13_expand_relu[0][0]'] block_13_depthwise (DepthwiseC (None, 7, 7, 576) 5184 
['block_13_pad[0][0]'] onv2D) block_13_depthwise_BN (BatchNo (None, 7, 7, 576) 2304 ['block_13_depthwise[0][0]'] rmalization) block_13_depthwise_relu (ReLU) (None, 7, 7, 576) 0 ['block_13_depthwise_BN[0][0]'] block_13_project (Conv2D) (None, 7, 7, 160) 92160 ['block_13_depthwise_relu[0][0]'] block_13_project_BN (BatchNorm (None, 7, 7, 160) 640 ['block_13_project[0][0]'] alization) block_14_expand (Conv2D) (None, 7, 7, 960) 153600 ['block_13_project_BN[0][0]'] block_14_expand_BN (BatchNorma (None, 7, 7, 960) 3840 ['block_14_expand[0][0]'] lization) block_14_expand_relu (ReLU) (None, 7, 7, 960) 0 ['block_14_expand_BN[0][0]'] block_14_depthwise (DepthwiseC (None, 7, 7, 960) 8640 ['block_14_expand_relu[0][0]'] onv2D) block_14_depthwise_BN (BatchNo (None, 7, 7, 960) 3840 ['block_14_depthwise[0][0]'] rmalization) block_14_depthwise_relu (ReLU) (None, 7, 7, 960) 0 ['block_14_depthwise_BN[0][0]'] block_14_project (Conv2D) (None, 7, 7, 160) 153600 ['block_14_depthwise_relu[0][0]'] block_14_project_BN (BatchNorm (None, 7, 7, 160) 640 ['block_14_project[0][0]'] alization) block_14_add (Add) (None, 7, 7, 160) 0 ['block_13_project_BN[0][0]', 'block_14_project_BN[0][0]'] block_15_expand (Conv2D) (None, 7, 7, 960) 153600 ['block_14_add[0][0]'] block_15_expand_BN (BatchNorma (None, 7, 7, 960) 3840 ['block_15_expand[0][0]'] lization) block_15_expand_relu (ReLU) (None, 7, 7, 960) 0 ['block_15_expand_BN[0][0]'] block_15_depthwise (DepthwiseC (None, 7, 7, 960) 8640 ['block_15_expand_relu[0][0]'] onv2D) block_15_depthwise_BN (BatchNo (None, 7, 7, 960) 3840 ['block_15_depthwise[0][0]'] rmalization) block_15_depthwise_relu (ReLU) (None, 7, 7, 960) 0 ['block_15_depthwise_BN[0][0]'] block_15_project (Conv2D) (None, 7, 7, 160) 153600 ['block_15_depthwise_relu[0][0]'] block_15_project_BN (BatchNorm (None, 7, 7, 160) 640 ['block_15_project[0][0]'] alization) block_15_add (Add) (None, 7, 7, 160) 0 ['block_14_add[0][0]', 'block_15_project_BN[0][0]'] block_16_expand (Conv2D) (None, 7, 7, 960) 153600 ['block_15_add[0][0]'] block_16_expand_BN (BatchNorma (None, 7, 7, 960) 3840 ['block_16_expand[0][0]'] lization) block_16_expand_relu (ReLU) (None, 7, 7, 960) 0 ['block_16_expand_BN[0][0]'] block_16_depthwise (DepthwiseC (None, 7, 7, 960) 8640 ['block_16_expand_relu[0][0]'] onv2D) block_16_depthwise_BN (BatchNo (None, 7, 7, 960) 3840 ['block_16_depthwise[0][0]'] rmalization) block_16_depthwise_relu (ReLU) (None, 7, 7, 960) 0 ['block_16_depthwise_BN[0][0]'] block_16_project (Conv2D) (None, 7, 7, 320) 307200 ['block_16_depthwise_relu[0][0]'] block_16_project_BN (BatchNorm (None, 7, 7, 320) 1280 ['block_16_project[0][0]'] alization) Conv_1 (Conv2D) (None, 7, 7, 1280) 409600 ['block_16_project_BN[0][0]'] Conv_1_bn (BatchNormalization) (None, 7, 7, 1280) 5120 ['Conv_1[0][0]'] out_relu (ReLU) (None, 7, 7, 1280) 0 ['Conv_1_bn[0][0]'] ================================================================================================== Total params: 2,257,984 Trainable params: 0 Non-trainable params: 2,257,984 __________________________________________________________________________________________________
In [23]:
head_model = base_model.output
In [24]:
head_model = Flatten()(head_model) # flatten the 7x7x1280 feature map into a vector
In [25]:
head_model = Dense(128, activation='relu')(head_model) # create a hidden layer
In [26]:
head_model = Dropout(0.4)(head_model)
In [27]:
head_model = Dense(64, activation='relu')(head_model) # the number of hidden layers is up to the user
In [28]:
head_model = Dense(7, activation='softmax')(head_model) # one output per class
In [29]:
model = Model(inputs = base_model.input , outputs = head_model ) # combine the frozen base and the new head
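A quick way to confirm that only the new head will be trained (the MobileNetV2 base stays frozen) is to count trainable parameters; a sketch:
In [ ]:
# Sketch: with the base frozen, only the head's Dense weights should be trainable.
trainable_count = sum(tf.keras.backend.count_params(w) for w in model.trainable_weights)
frozen_count = sum(tf.keras.backend.count_params(w) for w in model.non_trainable_weights)
print('trainable params:', trainable_count)
print('frozen params   :', frozen_count)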
Create callbacks: automatically save the best model and log training to a file¶
In [30]:
# How to create a directory from Python code
In [31]:
if not os.path.exists(PROJECT_PATH+'/checkpoints/'+model_type+'/'): # if this path does not exist yet
    os.makedirs(PROJECT_PATH+'/checkpoints/'+model_type+'/') # create the directory
In [32]:
if not os.path.exists(PROJECT_PATH+'/log/'+model_type+'/'):
    os.makedirs(PROJECT_PATH+'/log/'+model_type+'/')
In [33]:
from keras.callbacks import ModelCheckpoint # the most commonly used callback
In [35]:
mcp = ModelCheckpoint(CHECKPOINT_PATH,
                      monitor= 'val_accuracy',
                      save_best_only=True,
                      verbose = 1) # save the model with the highest validation accuracy to this path // verbose prints a message for debugging
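Optionally (not used in this run), an EarlyStopping callback could stop training once val_accuracy stops improving; a sketch:
In [ ]:
from keras.callbacks import EarlyStopping

# Optional extra callback (not used below): stop when val_accuracy stops improving
# and roll back to the weights from the best epoch.
early_stop = EarlyStopping(monitor='val_accuracy', patience=5,
                           restore_best_weights=True, verbose=1)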
In [36]:
# At the end of every epoch the metrics are currently only shown on screen.
# By also saving this information to a file, it can be checked later
# by opening the file, even without watching the screen - i.e. keep a log.
In [47]:
from keras.callbacks import CSVLogger
csv_logger = CSVLogger(LOGFILE_PATH, append=True ) # append=True: if the file already has content, keep adding to it
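Because append=True, the CSV keeps growing across runs; a sketch (assuming pandas, which Colab provides) for reading the log back afterwards:
In [ ]:
import pandas as pd

# Read the accumulated training log; columns include epoch, accuracy, loss,
# val_accuracy and val_loss (one row per epoch).
log_df = pd.read_csv(LOGFILE_PATH)
print(log_df.tail())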
In [40]:
y_train
Out[40]:
array([[0, 0, 0, ..., 0, 1, 0],
       [0, 0, 0, ..., 0, 1, 0],
       [0, 0, 0, ..., 1, 0, 0],
       ...,
       [1, 0, 0, ..., 0, 0, 0],
       [0, 1, 0, ..., 0, 0, 0],
       [1, 0, 0, ..., 0, 0, 0]])
In [41]:
model.compile(optimizer=Adam(learning_rate=0.0001), loss='categorical_crossentropy', metrics=['accuracy'])
In [42]:
# Right before training, augment the image data and train on it.
# Inspecting X_train showed the values are already feature-scaled to the 0 ~ 1 range,
# so only augmentation is applied here (no rescaling).
In [43]:
X_train
Out[43]:
(verbose array repr omitted: 603 images of shape (224, 224, 3) with float pixel values already scaled to the 0.0 - 1.0 range)
In [44]:
train_datagen = ImageDataGenerator(rotation_range=10, horizontal_flip=True)
In [46]:
train_generator = train_datagen.flow(X_train, y_train, batch_size=64)
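To eyeball what the augmentation does, one augmented batch can be previewed before training; a sketch:
In [ ]:
# Sketch: draw one augmented batch (64 images) and show the first 8.
batch_X, batch_y = next(train_generator)
plt.figure(figsize=(10, 5))
for i in range(8):
    plt.subplot(2, 4, i + 1)
    plt.imshow(batch_X[i])
    plt.axis('off')
plt.show()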
In [49]:
epoch_history = model.fit(train_generator, epochs=40, validation_data=(X_val, y_val), batch_size=64, callbacks=[mcp, csv_logger])
Epoch 1/40
Epoch 1: val_accuracy improved from -inf to 0.89333, saving model to /content/drive/MyDrive/Colab Notebooks/ml_plus/checkpoints/mobilenetv2/by-type-mobilenetv2-block-1-2.h5
10/10 [==============================] - 18s 772ms/step - loss: 1.0159 - accuracy: 0.6136 - val_loss: 0.2230 - val_accuracy: 0.8933
...
Epoch 5/40
Epoch 5: val_accuracy improved from 0.97333 to 1.00000, saving model to /content/drive/MyDrive/Colab Notebooks/ml_plus/checkpoints/mobilenetv2/by-type-mobilenetv2-block-1-2.h5
10/10 [==============================] - 7s 682ms/step - loss: 0.1371 - accuracy: 0.9519 - val_loss: 0.0491 - val_accuracy: 1.0000
... (epochs 2-4: val_accuracy improves to 0.9200, 0.9467, 0.9733; epochs 6-39: training accuracy rises above 0.99 while val_accuracy fluctuates between 0.9333 and 1.0000 and never improves on the epoch-5 checkpoint) ...
Epoch 40/40
Epoch 40: val_accuracy did not improve from 1.00000
10/10 [==============================] - 6s 633ms/step - loss: 0.0114 - accuracy: 0.9967 - val_loss: 0.0500 - val_accuracy: 0.9600
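The ModelCheckpoint callback wrote the best-val_accuracy weights to CHECKPOINT_PATH during training. A sketch for saving the final model to the TARGET_CLASSIFICATION_MODEL path defined earlier, and for reloading the checkpointed best model:
In [ ]:
from tensorflow.keras.models import load_model

# Persist the final (last-epoch) model to Google Drive ...
model.save(TARGET_CLASSIFICATION_MODEL)

# ... and reload the best-performing weights saved by ModelCheckpoint.
best_model = load_model(CHECKPOINT_PATH)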
Compile and train: 40 epochs¶
Plot training and validation accuracy and loss¶
In [50]:
model.evaluate(X_test, y_test)
3/3 [==============================] - 1s 166ms/step - loss: 0.3704 - accuracy: 0.9605
Out[50]:
[0.37035828828811646, 0.9605262875556946]
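The sklearn imports at the top make it easy to go beyond overall accuracy; a sketch of per-class metrics on the test set:
In [ ]:
# Sketch: per-class precision/recall and the confusion matrix for the test set.
y_pred = model.predict(X_test).argmax(axis=1)
y_true = y_test.argmax(axis=1)
print(classification_report(y_true, y_pred))
print(confusion_matrix(y_true, y_pred))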
In [51]:
# check for overfitting
plt.plot(epoch_history.history['accuracy'])
plt.plot(epoch_history.history['val_accuracy'])
plt.legend( ['train','validation'])
plt.show()
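The same comparison can be made for the loss curves (sketch):
In [ ]:
# Training vs. validation loss for the same run.
plt.plot(epoch_history.history['loss'])
plt.plot(epoch_history.history['val_loss'])
plt.legend(['train', 'validation'])
plt.show()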
Model Evaluation¶
Try fine-tuning¶
In [52]:
# 1. First, make every layer of base_model trainable again.
In [53]:
base_model.trainable = True
In [54]:
base_model.summary()
Model: "mobilenetv2_1.00_224" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_1 (InputLayer) [(None, 224, 224, 3 0 [] )] Conv1 (Conv2D) (None, 112, 112, 32 864 ['input_1[0][0]'] ) bn_Conv1 (BatchNormalization) (None, 112, 112, 32 128 ['Conv1[0][0]'] ) Conv1_relu (ReLU) (None, 112, 112, 32 0 ['bn_Conv1[0][0]'] ) expanded_conv_depthwise (Depth (None, 112, 112, 32 288 ['Conv1_relu[0][0]'] wiseConv2D) ) expanded_conv_depthwise_BN (Ba (None, 112, 112, 32 128 ['expanded_conv_depthwise[0][0]'] tchNormalization) ) expanded_conv_depthwise_relu ( (None, 112, 112, 32 0 ['expanded_conv_depthwise_BN[0][0 ReLU) ) ]'] expanded_conv_project (Conv2D) (None, 112, 112, 16 512 ['expanded_conv_depthwise_relu[0] ) [0]'] expanded_conv_project_BN (Batc (None, 112, 112, 16 64 ['expanded_conv_project[0][0]'] hNormalization) ) block_1_expand (Conv2D) (None, 112, 112, 96 1536 ['expanded_conv_project_BN[0][0]' ) ] block_1_expand_BN (BatchNormal (None, 112, 112, 96 384 ['block_1_expand[0][0]'] ization) ) block_1_expand_relu (ReLU) (None, 112, 112, 96 0 ['block_1_expand_BN[0][0]'] ) block_1_pad (ZeroPadding2D) (None, 113, 113, 96 0 ['block_1_expand_relu[0][0]'] ) block_1_depthwise (DepthwiseCo (None, 56, 56, 96) 864 ['block_1_pad[0][0]'] nv2D) block_1_depthwise_BN (BatchNor (None, 56, 56, 96) 384 ['block_1_depthwise[0][0]'] malization) block_1_depthwise_relu (ReLU) (None, 56, 56, 96) 0 ['block_1_depthwise_BN[0][0]'] block_1_project (Conv2D) (None, 56, 56, 24) 2304 ['block_1_depthwise_relu[0][0]'] block_1_project_BN (BatchNorma (None, 56, 56, 24) 96 ['block_1_project[0][0]'] lization) block_2_expand (Conv2D) (None, 56, 56, 144) 3456 ['block_1_project_BN[0][0]'] block_2_expand_BN (BatchNormal (None, 56, 56, 144) 576 ['block_2_expand[0][0]'] ization) block_2_expand_relu (ReLU) (None, 56, 56, 144) 0 ['block_2_expand_BN[0][0]'] block_2_depthwise (DepthwiseCo (None, 56, 56, 144) 1296 ['block_2_expand_relu[0][0]'] nv2D) block_2_depthwise_BN (BatchNor (None, 56, 56, 144) 576 ['block_2_depthwise[0][0]'] malization) block_2_depthwise_relu (ReLU) (None, 56, 56, 144) 0 ['block_2_depthwise_BN[0][0]'] block_2_project (Conv2D) (None, 56, 56, 24) 3456 ['block_2_depthwise_relu[0][0]'] block_2_project_BN (BatchNorma (None, 56, 56, 24) 96 ['block_2_project[0][0]'] lization) block_2_add (Add) (None, 56, 56, 24) 0 ['block_1_project_BN[0][0]', 'block_2_project_BN[0][0]'] block_3_expand (Conv2D) (None, 56, 56, 144) 3456 ['block_2_add[0][0]'] block_3_expand_BN (BatchNormal (None, 56, 56, 144) 576 ['block_3_expand[0][0]'] ization) block_3_expand_relu (ReLU) (None, 56, 56, 144) 0 ['block_3_expand_BN[0][0]'] block_3_pad (ZeroPadding2D) (None, 57, 57, 144) 0 ['block_3_expand_relu[0][0]'] block_3_depthwise (DepthwiseCo (None, 28, 28, 144) 1296 ['block_3_pad[0][0]'] nv2D) block_3_depthwise_BN (BatchNor (None, 28, 28, 144) 576 ['block_3_depthwise[0][0]'] malization) block_3_depthwise_relu (ReLU) (None, 28, 28, 144) 0 ['block_3_depthwise_BN[0][0]'] block_3_project (Conv2D) (None, 28, 28, 32) 4608 ['block_3_depthwise_relu[0][0]'] block_3_project_BN (BatchNorma (None, 28, 28, 32) 128 ['block_3_project[0][0]'] lization) block_4_expand (Conv2D) (None, 28, 28, 192) 6144 ['block_3_project_BN[0][0]'] block_4_expand_BN (BatchNormal (None, 28, 28, 192) 768 ['block_4_expand[0][0]'] ization) block_4_expand_relu (ReLU) (None, 28, 28, 192) 0 
['block_4_expand_BN[0][0]'] block_4_depthwise (DepthwiseCo (None, 28, 28, 192) 1728 ['block_4_expand_relu[0][0]'] nv2D) block_4_depthwise_BN (BatchNor (None, 28, 28, 192) 768 ['block_4_depthwise[0][0]'] malization) block_4_depthwise_relu (ReLU) (None, 28, 28, 192) 0 ['block_4_depthwise_BN[0][0]'] block_4_project (Conv2D) (None, 28, 28, 32) 6144 ['block_4_depthwise_relu[0][0]'] block_4_project_BN (BatchNorma (None, 28, 28, 32) 128 ['block_4_project[0][0]'] lization) block_4_add (Add) (None, 28, 28, 32) 0 ['block_3_project_BN[0][0]', 'block_4_project_BN[0][0]'] block_5_expand (Conv2D) (None, 28, 28, 192) 6144 ['block_4_add[0][0]'] block_5_expand_BN (BatchNormal (None, 28, 28, 192) 768 ['block_5_expand[0][0]'] ization) block_5_expand_relu (ReLU) (None, 28, 28, 192) 0 ['block_5_expand_BN[0][0]'] block_5_depthwise (DepthwiseCo (None, 28, 28, 192) 1728 ['block_5_expand_relu[0][0]'] nv2D) block_5_depthwise_BN (BatchNor (None, 28, 28, 192) 768 ['block_5_depthwise[0][0]'] malization) block_5_depthwise_relu (ReLU) (None, 28, 28, 192) 0 ['block_5_depthwise_BN[0][0]'] block_5_project (Conv2D) (None, 28, 28, 32) 6144 ['block_5_depthwise_relu[0][0]'] block_5_project_BN (BatchNorma (None, 28, 28, 32) 128 ['block_5_project[0][0]'] lization) block_5_add (Add) (None, 28, 28, 32) 0 ['block_4_add[0][0]', 'block_5_project_BN[0][0]'] block_6_expand (Conv2D) (None, 28, 28, 192) 6144 ['block_5_add[0][0]'] block_6_expand_BN (BatchNormal (None, 28, 28, 192) 768 ['block_6_expand[0][0]'] ization) block_6_expand_relu (ReLU) (None, 28, 28, 192) 0 ['block_6_expand_BN[0][0]'] block_6_pad (ZeroPadding2D) (None, 29, 29, 192) 0 ['block_6_expand_relu[0][0]'] block_6_depthwise (DepthwiseCo (None, 14, 14, 192) 1728 ['block_6_pad[0][0]'] nv2D) block_6_depthwise_BN (BatchNor (None, 14, 14, 192) 768 ['block_6_depthwise[0][0]'] malization) block_6_depthwise_relu (ReLU) (None, 14, 14, 192) 0 ['block_6_depthwise_BN[0][0]'] block_6_project (Conv2D) (None, 14, 14, 64) 12288 ['block_6_depthwise_relu[0][0]'] block_6_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_6_project[0][0]'] lization) block_7_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_6_project_BN[0][0]'] block_7_expand_BN (BatchNormal (None, 14, 14, 384) 1536 ['block_7_expand[0][0]'] ization) block_7_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_7_expand_BN[0][0]'] block_7_depthwise (DepthwiseCo (None, 14, 14, 384) 3456 ['block_7_expand_relu[0][0]'] nv2D) block_7_depthwise_BN (BatchNor (None, 14, 14, 384) 1536 ['block_7_depthwise[0][0]'] malization) block_7_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_7_depthwise_BN[0][0]'] block_7_project (Conv2D) (None, 14, 14, 64) 24576 ['block_7_depthwise_relu[0][0]'] block_7_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_7_project[0][0]'] lization) block_7_add (Add) (None, 14, 14, 64) 0 ['block_6_project_BN[0][0]', 'block_7_project_BN[0][0]'] block_8_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_7_add[0][0]'] block_8_expand_BN (BatchNormal (None, 14, 14, 384) 1536 ['block_8_expand[0][0]'] ization) block_8_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_8_expand_BN[0][0]'] block_8_depthwise (DepthwiseCo (None, 14, 14, 384) 3456 ['block_8_expand_relu[0][0]'] nv2D) block_8_depthwise_BN (BatchNor (None, 14, 14, 384) 1536 ['block_8_depthwise[0][0]'] malization) block_8_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_8_depthwise_BN[0][0]'] block_8_project (Conv2D) (None, 14, 14, 64) 24576 ['block_8_depthwise_relu[0][0]'] block_8_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_8_project[0][0]'] lization) 
block_8_add (Add) (None, 14, 14, 64) 0 ['block_7_add[0][0]', 'block_8_project_BN[0][0]'] block_9_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_8_add[0][0]'] block_9_expand_BN (BatchNormal (None, 14, 14, 384) 1536 ['block_9_expand[0][0]'] ization) block_9_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_9_expand_BN[0][0]'] block_9_depthwise (DepthwiseCo (None, 14, 14, 384) 3456 ['block_9_expand_relu[0][0]'] nv2D) block_9_depthwise_BN (BatchNor (None, 14, 14, 384) 1536 ['block_9_depthwise[0][0]'] malization) block_9_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_9_depthwise_BN[0][0]'] block_9_project (Conv2D) (None, 14, 14, 64) 24576 ['block_9_depthwise_relu[0][0]'] block_9_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_9_project[0][0]'] lization) block_9_add (Add) (None, 14, 14, 64) 0 ['block_8_add[0][0]', 'block_9_project_BN[0][0]'] block_10_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_9_add[0][0]'] block_10_expand_BN (BatchNorma (None, 14, 14, 384) 1536 ['block_10_expand[0][0]'] lization) block_10_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_10_expand_BN[0][0]'] block_10_depthwise (DepthwiseC (None, 14, 14, 384) 3456 ['block_10_expand_relu[0][0]'] onv2D) block_10_depthwise_BN (BatchNo (None, 14, 14, 384) 1536 ['block_10_depthwise[0][0]'] rmalization) block_10_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_10_depthwise_BN[0][0]'] block_10_project (Conv2D) (None, 14, 14, 96) 36864 ['block_10_depthwise_relu[0][0]'] block_10_project_BN (BatchNorm (None, 14, 14, 96) 384 ['block_10_project[0][0]'] alization) block_11_expand (Conv2D) (None, 14, 14, 576) 55296 ['block_10_project_BN[0][0]'] block_11_expand_BN (BatchNorma (None, 14, 14, 576) 2304 ['block_11_expand[0][0]'] lization) block_11_expand_relu (ReLU) (None, 14, 14, 576) 0 ['block_11_expand_BN[0][0]'] block_11_depthwise (DepthwiseC (None, 14, 14, 576) 5184 ['block_11_expand_relu[0][0]'] onv2D) block_11_depthwise_BN (BatchNo (None, 14, 14, 576) 2304 ['block_11_depthwise[0][0]'] rmalization) block_11_depthwise_relu (ReLU) (None, 14, 14, 576) 0 ['block_11_depthwise_BN[0][0]'] block_11_project (Conv2D) (None, 14, 14, 96) 55296 ['block_11_depthwise_relu[0][0]'] block_11_project_BN (BatchNorm (None, 14, 14, 96) 384 ['block_11_project[0][0]'] alization) block_11_add (Add) (None, 14, 14, 96) 0 ['block_10_project_BN[0][0]', 'block_11_project_BN[0][0]'] block_12_expand (Conv2D) (None, 14, 14, 576) 55296 ['block_11_add[0][0]'] block_12_expand_BN (BatchNorma (None, 14, 14, 576) 2304 ['block_12_expand[0][0]'] lization) block_12_expand_relu (ReLU) (None, 14, 14, 576) 0 ['block_12_expand_BN[0][0]'] block_12_depthwise (DepthwiseC (None, 14, 14, 576) 5184 ['block_12_expand_relu[0][0]'] onv2D) block_12_depthwise_BN (BatchNo (None, 14, 14, 576) 2304 ['block_12_depthwise[0][0]'] rmalization) block_12_depthwise_relu (ReLU) (None, 14, 14, 576) 0 ['block_12_depthwise_BN[0][0]'] block_12_project (Conv2D) (None, 14, 14, 96) 55296 ['block_12_depthwise_relu[0][0]'] block_12_project_BN (BatchNorm (None, 14, 14, 96) 384 ['block_12_project[0][0]'] alization) block_12_add (Add) (None, 14, 14, 96) 0 ['block_11_add[0][0]', 'block_12_project_BN[0][0]'] block_13_expand (Conv2D) (None, 14, 14, 576) 55296 ['block_12_add[0][0]'] block_13_expand_BN (BatchNorma (None, 14, 14, 576) 2304 ['block_13_expand[0][0]'] lization) block_13_expand_relu (ReLU) (None, 14, 14, 576) 0 ['block_13_expand_BN[0][0]'] block_13_pad (ZeroPadding2D) (None, 15, 15, 576) 0 ['block_13_expand_relu[0][0]'] block_13_depthwise (DepthwiseC (None, 7, 7, 576) 5184 
['block_13_pad[0][0]'] onv2D) block_13_depthwise_BN (BatchNo (None, 7, 7, 576) 2304 ['block_13_depthwise[0][0]'] rmalization) block_13_depthwise_relu (ReLU) (None, 7, 7, 576) 0 ['block_13_depthwise_BN[0][0]'] block_13_project (Conv2D) (None, 7, 7, 160) 92160 ['block_13_depthwise_relu[0][0]'] block_13_project_BN (BatchNorm (None, 7, 7, 160) 640 ['block_13_project[0][0]'] alization) block_14_expand (Conv2D) (None, 7, 7, 960) 153600 ['block_13_project_BN[0][0]'] block_14_expand_BN (BatchNorma (None, 7, 7, 960) 3840 ['block_14_expand[0][0]'] lization) block_14_expand_relu (ReLU) (None, 7, 7, 960) 0 ['block_14_expand_BN[0][0]'] block_14_depthwise (DepthwiseC (None, 7, 7, 960) 8640 ['block_14_expand_relu[0][0]'] onv2D) block_14_depthwise_BN (BatchNo (None, 7, 7, 960) 3840 ['block_14_depthwise[0][0]'] rmalization) block_14_depthwise_relu (ReLU) (None, 7, 7, 960) 0 ['block_14_depthwise_BN[0][0]'] block_14_project (Conv2D) (None, 7, 7, 160) 153600 ['block_14_depthwise_relu[0][0]'] block_14_project_BN (BatchNorm (None, 7, 7, 160) 640 ['block_14_project[0][0]'] alization) block_14_add (Add) (None, 7, 7, 160) 0 ['block_13_project_BN[0][0]', 'block_14_project_BN[0][0]'] block_15_expand (Conv2D) (None, 7, 7, 960) 153600 ['block_14_add[0][0]'] block_15_expand_BN (BatchNorma (None, 7, 7, 960) 3840 ['block_15_expand[0][0]'] lization) block_15_expand_relu (ReLU) (None, 7, 7, 960) 0 ['block_15_expand_BN[0][0]'] block_15_depthwise (DepthwiseC (None, 7, 7, 960) 8640 ['block_15_expand_relu[0][0]'] onv2D) block_15_depthwise_BN (BatchNo (None, 7, 7, 960) 3840 ['block_15_depthwise[0][0]'] rmalization) block_15_depthwise_relu (ReLU) (None, 7, 7, 960) 0 ['block_15_depthwise_BN[0][0]'] block_15_project (Conv2D) (None, 7, 7, 160) 153600 ['block_15_depthwise_relu[0][0]'] block_15_project_BN (BatchNorm (None, 7, 7, 160) 640 ['block_15_project[0][0]'] alization) block_15_add (Add) (None, 7, 7, 160) 0 ['block_14_add[0][0]', 'block_15_project_BN[0][0]'] block_16_expand (Conv2D) (None, 7, 7, 960) 153600 ['block_15_add[0][0]'] block_16_expand_BN (BatchNorma (None, 7, 7, 960) 3840 ['block_16_expand[0][0]'] lization) block_16_expand_relu (ReLU) (None, 7, 7, 960) 0 ['block_16_expand_BN[0][0]'] block_16_depthwise (DepthwiseC (None, 7, 7, 960) 8640 ['block_16_expand_relu[0][0]'] onv2D) block_16_depthwise_BN (BatchNo (None, 7, 7, 960) 3840 ['block_16_depthwise[0][0]'] rmalization) block_16_depthwise_relu (ReLU) (None, 7, 7, 960) 0 ['block_16_depthwise_BN[0][0]'] block_16_project (Conv2D) (None, 7, 7, 320) 307200 ['block_16_depthwise_relu[0][0]'] block_16_project_BN (BatchNorm (None, 7, 7, 320) 1280 ['block_16_project[0][0]'] alization) Conv_1 (Conv2D) (None, 7, 7, 1280) 409600 ['block_16_project_BN[0][0]'] Conv_1_bn (BatchNormalization) (None, 7, 7, 1280) 5120 ['Conv_1[0][0]'] out_relu (ReLU) (None, 7, 7, 1280) 0 ['Conv_1_bn[0][0]'] ================================================================================================== Total params: 2,257,984 Trainable params: 2,223,872 Non-trainable params: 34,112 __________________________________________________________________________________________________
In [34]:
# 2. Check how many layers base_model has.
In [56]:
len( base_model.layers )
Out[56]:
154
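The freeze boundary below is specified by layer index, so it helps to see which index maps to which layer. A minimal inspection sketch (not part of the original run), assuming base_model is the MobileNetV2 backbone loaded above:

In [ ]:
# List every layer index and name; useful for deciding where the frozen section should end.
for idx, layer in enumerate(base_model.layers):
    print(idx, layer.name)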
In [57]:
# 3. Decide up to which layer the network should be frozen.
In [58]:
end_layer = 130
In [ ]:
# 4. Freeze the layers of base_model from the first layer up to end_layer
#    so that they are not updated during training.
In [59]:
for layer in base_model.layers[0 : end_layer + 1]:
    layer.trainable = False
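Before printing the full summary again, a quick sanity check that the freeze took effect. This is a sketch, not part of the original run; it only relies on base_model and the tf import above:

In [ ]:
# Count how many layers remain trainable and how many weights that represents.
trainable_layers = sum(1 for layer in base_model.layers if layer.trainable)
trainable_weights = sum(tf.keras.backend.count_params(w) for w in base_model.trainable_weights)
print(trainable_layers, 'trainable layers /', trainable_weights, 'trainable weights')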
In [60]:
base_model.summary()
Model: "mobilenetv2_1.00_224" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_1 (InputLayer) [(None, 224, 224, 3 0 [] )] Conv1 (Conv2D) (None, 112, 112, 32 864 ['input_1[0][0]'] ) bn_Conv1 (BatchNormalization) (None, 112, 112, 32 128 ['Conv1[0][0]'] ) Conv1_relu (ReLU) (None, 112, 112, 32 0 ['bn_Conv1[0][0]'] ) expanded_conv_depthwise (Depth (None, 112, 112, 32 288 ['Conv1_relu[0][0]'] wiseConv2D) ) expanded_conv_depthwise_BN (Ba (None, 112, 112, 32 128 ['expanded_conv_depthwise[0][0]'] tchNormalization) ) expanded_conv_depthwise_relu ( (None, 112, 112, 32 0 ['expanded_conv_depthwise_BN[0][0 ReLU) ) ]'] expanded_conv_project (Conv2D) (None, 112, 112, 16 512 ['expanded_conv_depthwise_relu[0] ) [0]'] expanded_conv_project_BN (Batc (None, 112, 112, 16 64 ['expanded_conv_project[0][0]'] hNormalization) ) block_1_expand (Conv2D) (None, 112, 112, 96 1536 ['expanded_conv_project_BN[0][0]' ) ] block_1_expand_BN (BatchNormal (None, 112, 112, 96 384 ['block_1_expand[0][0]'] ization) ) block_1_expand_relu (ReLU) (None, 112, 112, 96 0 ['block_1_expand_BN[0][0]'] ) block_1_pad (ZeroPadding2D) (None, 113, 113, 96 0 ['block_1_expand_relu[0][0]'] ) block_1_depthwise (DepthwiseCo (None, 56, 56, 96) 864 ['block_1_pad[0][0]'] nv2D) block_1_depthwise_BN (BatchNor (None, 56, 56, 96) 384 ['block_1_depthwise[0][0]'] malization) block_1_depthwise_relu (ReLU) (None, 56, 56, 96) 0 ['block_1_depthwise_BN[0][0]'] block_1_project (Conv2D) (None, 56, 56, 24) 2304 ['block_1_depthwise_relu[0][0]'] block_1_project_BN (BatchNorma (None, 56, 56, 24) 96 ['block_1_project[0][0]'] lization) block_2_expand (Conv2D) (None, 56, 56, 144) 3456 ['block_1_project_BN[0][0]'] block_2_expand_BN (BatchNormal (None, 56, 56, 144) 576 ['block_2_expand[0][0]'] ization) block_2_expand_relu (ReLU) (None, 56, 56, 144) 0 ['block_2_expand_BN[0][0]'] block_2_depthwise (DepthwiseCo (None, 56, 56, 144) 1296 ['block_2_expand_relu[0][0]'] nv2D) block_2_depthwise_BN (BatchNor (None, 56, 56, 144) 576 ['block_2_depthwise[0][0]'] malization) block_2_depthwise_relu (ReLU) (None, 56, 56, 144) 0 ['block_2_depthwise_BN[0][0]'] block_2_project (Conv2D) (None, 56, 56, 24) 3456 ['block_2_depthwise_relu[0][0]'] block_2_project_BN (BatchNorma (None, 56, 56, 24) 96 ['block_2_project[0][0]'] lization) block_2_add (Add) (None, 56, 56, 24) 0 ['block_1_project_BN[0][0]', 'block_2_project_BN[0][0]'] block_3_expand (Conv2D) (None, 56, 56, 144) 3456 ['block_2_add[0][0]'] block_3_expand_BN (BatchNormal (None, 56, 56, 144) 576 ['block_3_expand[0][0]'] ization) block_3_expand_relu (ReLU) (None, 56, 56, 144) 0 ['block_3_expand_BN[0][0]'] block_3_pad (ZeroPadding2D) (None, 57, 57, 144) 0 ['block_3_expand_relu[0][0]'] block_3_depthwise (DepthwiseCo (None, 28, 28, 144) 1296 ['block_3_pad[0][0]'] nv2D) block_3_depthwise_BN (BatchNor (None, 28, 28, 144) 576 ['block_3_depthwise[0][0]'] malization) block_3_depthwise_relu (ReLU) (None, 28, 28, 144) 0 ['block_3_depthwise_BN[0][0]'] block_3_project (Conv2D) (None, 28, 28, 32) 4608 ['block_3_depthwise_relu[0][0]'] block_3_project_BN (BatchNorma (None, 28, 28, 32) 128 ['block_3_project[0][0]'] lization) block_4_expand (Conv2D) (None, 28, 28, 192) 6144 ['block_3_project_BN[0][0]'] block_4_expand_BN (BatchNormal (None, 28, 28, 192) 768 ['block_4_expand[0][0]'] ization) block_4_expand_relu (ReLU) (None, 28, 28, 192) 0 
['block_4_expand_BN[0][0]'] block_4_depthwise (DepthwiseCo (None, 28, 28, 192) 1728 ['block_4_expand_relu[0][0]'] nv2D) block_4_depthwise_BN (BatchNor (None, 28, 28, 192) 768 ['block_4_depthwise[0][0]'] malization) block_4_depthwise_relu (ReLU) (None, 28, 28, 192) 0 ['block_4_depthwise_BN[0][0]'] block_4_project (Conv2D) (None, 28, 28, 32) 6144 ['block_4_depthwise_relu[0][0]'] block_4_project_BN (BatchNorma (None, 28, 28, 32) 128 ['block_4_project[0][0]'] lization) block_4_add (Add) (None, 28, 28, 32) 0 ['block_3_project_BN[0][0]', 'block_4_project_BN[0][0]'] block_5_expand (Conv2D) (None, 28, 28, 192) 6144 ['block_4_add[0][0]'] block_5_expand_BN (BatchNormal (None, 28, 28, 192) 768 ['block_5_expand[0][0]'] ization) block_5_expand_relu (ReLU) (None, 28, 28, 192) 0 ['block_5_expand_BN[0][0]'] block_5_depthwise (DepthwiseCo (None, 28, 28, 192) 1728 ['block_5_expand_relu[0][0]'] nv2D) block_5_depthwise_BN (BatchNor (None, 28, 28, 192) 768 ['block_5_depthwise[0][0]'] malization) block_5_depthwise_relu (ReLU) (None, 28, 28, 192) 0 ['block_5_depthwise_BN[0][0]'] block_5_project (Conv2D) (None, 28, 28, 32) 6144 ['block_5_depthwise_relu[0][0]'] block_5_project_BN (BatchNorma (None, 28, 28, 32) 128 ['block_5_project[0][0]'] lization) block_5_add (Add) (None, 28, 28, 32) 0 ['block_4_add[0][0]', 'block_5_project_BN[0][0]'] block_6_expand (Conv2D) (None, 28, 28, 192) 6144 ['block_5_add[0][0]'] block_6_expand_BN (BatchNormal (None, 28, 28, 192) 768 ['block_6_expand[0][0]'] ization) block_6_expand_relu (ReLU) (None, 28, 28, 192) 0 ['block_6_expand_BN[0][0]'] block_6_pad (ZeroPadding2D) (None, 29, 29, 192) 0 ['block_6_expand_relu[0][0]'] block_6_depthwise (DepthwiseCo (None, 14, 14, 192) 1728 ['block_6_pad[0][0]'] nv2D) block_6_depthwise_BN (BatchNor (None, 14, 14, 192) 768 ['block_6_depthwise[0][0]'] malization) block_6_depthwise_relu (ReLU) (None, 14, 14, 192) 0 ['block_6_depthwise_BN[0][0]'] block_6_project (Conv2D) (None, 14, 14, 64) 12288 ['block_6_depthwise_relu[0][0]'] block_6_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_6_project[0][0]'] lization) block_7_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_6_project_BN[0][0]'] block_7_expand_BN (BatchNormal (None, 14, 14, 384) 1536 ['block_7_expand[0][0]'] ization) block_7_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_7_expand_BN[0][0]'] block_7_depthwise (DepthwiseCo (None, 14, 14, 384) 3456 ['block_7_expand_relu[0][0]'] nv2D) block_7_depthwise_BN (BatchNor (None, 14, 14, 384) 1536 ['block_7_depthwise[0][0]'] malization) block_7_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_7_depthwise_BN[0][0]'] block_7_project (Conv2D) (None, 14, 14, 64) 24576 ['block_7_depthwise_relu[0][0]'] block_7_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_7_project[0][0]'] lization) block_7_add (Add) (None, 14, 14, 64) 0 ['block_6_project_BN[0][0]', 'block_7_project_BN[0][0]'] block_8_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_7_add[0][0]'] block_8_expand_BN (BatchNormal (None, 14, 14, 384) 1536 ['block_8_expand[0][0]'] ization) block_8_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_8_expand_BN[0][0]'] block_8_depthwise (DepthwiseCo (None, 14, 14, 384) 3456 ['block_8_expand_relu[0][0]'] nv2D) block_8_depthwise_BN (BatchNor (None, 14, 14, 384) 1536 ['block_8_depthwise[0][0]'] malization) block_8_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_8_depthwise_BN[0][0]'] block_8_project (Conv2D) (None, 14, 14, 64) 24576 ['block_8_depthwise_relu[0][0]'] block_8_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_8_project[0][0]'] lization) 
block_8_add (Add) (None, 14, 14, 64) 0 ['block_7_add[0][0]', 'block_8_project_BN[0][0]'] block_9_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_8_add[0][0]'] block_9_expand_BN (BatchNormal (None, 14, 14, 384) 1536 ['block_9_expand[0][0]'] ization) block_9_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_9_expand_BN[0][0]'] block_9_depthwise (DepthwiseCo (None, 14, 14, 384) 3456 ['block_9_expand_relu[0][0]'] nv2D) block_9_depthwise_BN (BatchNor (None, 14, 14, 384) 1536 ['block_9_depthwise[0][0]'] malization) block_9_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_9_depthwise_BN[0][0]'] block_9_project (Conv2D) (None, 14, 14, 64) 24576 ['block_9_depthwise_relu[0][0]'] block_9_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_9_project[0][0]'] lization) block_9_add (Add) (None, 14, 14, 64) 0 ['block_8_add[0][0]', 'block_9_project_BN[0][0]'] block_10_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_9_add[0][0]'] block_10_expand_BN (BatchNorma (None, 14, 14, 384) 1536 ['block_10_expand[0][0]'] lization) block_10_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_10_expand_BN[0][0]'] block_10_depthwise (DepthwiseC (None, 14, 14, 384) 3456 ['block_10_expand_relu[0][0]'] onv2D) block_10_depthwise_BN (BatchNo (None, 14, 14, 384) 1536 ['block_10_depthwise[0][0]'] rmalization) block_10_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_10_depthwise_BN[0][0]'] block_10_project (Conv2D) (None, 14, 14, 96) 36864 ['block_10_depthwise_relu[0][0]'] block_10_project_BN (BatchNorm (None, 14, 14, 96) 384 ['block_10_project[0][0]'] alization) block_11_expand (Conv2D) (None, 14, 14, 576) 55296 ['block_10_project_BN[0][0]'] block_11_expand_BN (BatchNorma (None, 14, 14, 576) 2304 ['block_11_expand[0][0]'] lization) block_11_expand_relu (ReLU) (None, 14, 14, 576) 0 ['block_11_expand_BN[0][0]'] block_11_depthwise (DepthwiseC (None, 14, 14, 576) 5184 ['block_11_expand_relu[0][0]'] onv2D) block_11_depthwise_BN (BatchNo (None, 14, 14, 576) 2304 ['block_11_depthwise[0][0]'] rmalization) block_11_depthwise_relu (ReLU) (None, 14, 14, 576) 0 ['block_11_depthwise_BN[0][0]'] block_11_project (Conv2D) (None, 14, 14, 96) 55296 ['block_11_depthwise_relu[0][0]'] block_11_project_BN (BatchNorm (None, 14, 14, 96) 384 ['block_11_project[0][0]'] alization) block_11_add (Add) (None, 14, 14, 96) 0 ['block_10_project_BN[0][0]', 'block_11_project_BN[0][0]'] block_12_expand (Conv2D) (None, 14, 14, 576) 55296 ['block_11_add[0][0]'] block_12_expand_BN (BatchNorma (None, 14, 14, 576) 2304 ['block_12_expand[0][0]'] lization) block_12_expand_relu (ReLU) (None, 14, 14, 576) 0 ['block_12_expand_BN[0][0]'] block_12_depthwise (DepthwiseC (None, 14, 14, 576) 5184 ['block_12_expand_relu[0][0]'] onv2D) block_12_depthwise_BN (BatchNo (None, 14, 14, 576) 2304 ['block_12_depthwise[0][0]'] rmalization) block_12_depthwise_relu (ReLU) (None, 14, 14, 576) 0 ['block_12_depthwise_BN[0][0]'] block_12_project (Conv2D) (None, 14, 14, 96) 55296 ['block_12_depthwise_relu[0][0]'] block_12_project_BN (BatchNorm (None, 14, 14, 96) 384 ['block_12_project[0][0]'] alization) block_12_add (Add) (None, 14, 14, 96) 0 ['block_11_add[0][0]', 'block_12_project_BN[0][0]'] block_13_expand (Conv2D) (None, 14, 14, 576) 55296 ['block_12_add[0][0]'] block_13_expand_BN (BatchNorma (None, 14, 14, 576) 2304 ['block_13_expand[0][0]'] lization) block_13_expand_relu (ReLU) (None, 14, 14, 576) 0 ['block_13_expand_BN[0][0]'] block_13_pad (ZeroPadding2D) (None, 15, 15, 576) 0 ['block_13_expand_relu[0][0]'] block_13_depthwise (DepthwiseC (None, 7, 7, 576) 5184 
['block_13_pad[0][0]'] onv2D) block_13_depthwise_BN (BatchNo (None, 7, 7, 576) 2304 ['block_13_depthwise[0][0]'] rmalization) block_13_depthwise_relu (ReLU) (None, 7, 7, 576) 0 ['block_13_depthwise_BN[0][0]'] block_13_project (Conv2D) (None, 7, 7, 160) 92160 ['block_13_depthwise_relu[0][0]'] block_13_project_BN (BatchNorm (None, 7, 7, 160) 640 ['block_13_project[0][0]'] alization) block_14_expand (Conv2D) (None, 7, 7, 960) 153600 ['block_13_project_BN[0][0]'] block_14_expand_BN (BatchNorma (None, 7, 7, 960) 3840 ['block_14_expand[0][0]'] lization) block_14_expand_relu (ReLU) (None, 7, 7, 960) 0 ['block_14_expand_BN[0][0]'] block_14_depthwise (DepthwiseC (None, 7, 7, 960) 8640 ['block_14_expand_relu[0][0]'] onv2D) block_14_depthwise_BN (BatchNo (None, 7, 7, 960) 3840 ['block_14_depthwise[0][0]'] rmalization) block_14_depthwise_relu (ReLU) (None, 7, 7, 960) 0 ['block_14_depthwise_BN[0][0]'] block_14_project (Conv2D) (None, 7, 7, 160) 153600 ['block_14_depthwise_relu[0][0]'] block_14_project_BN (BatchNorm (None, 7, 7, 160) 640 ['block_14_project[0][0]'] alization) block_14_add (Add) (None, 7, 7, 160) 0 ['block_13_project_BN[0][0]', 'block_14_project_BN[0][0]'] block_15_expand (Conv2D) (None, 7, 7, 960) 153600 ['block_14_add[0][0]'] block_15_expand_BN (BatchNorma (None, 7, 7, 960) 3840 ['block_15_expand[0][0]'] lization) block_15_expand_relu (ReLU) (None, 7, 7, 960) 0 ['block_15_expand_BN[0][0]'] block_15_depthwise (DepthwiseC (None, 7, 7, 960) 8640 ['block_15_expand_relu[0][0]'] onv2D) block_15_depthwise_BN (BatchNo (None, 7, 7, 960) 3840 ['block_15_depthwise[0][0]'] rmalization) block_15_depthwise_relu (ReLU) (None, 7, 7, 960) 0 ['block_15_depthwise_BN[0][0]'] block_15_project (Conv2D) (None, 7, 7, 160) 153600 ['block_15_depthwise_relu[0][0]'] block_15_project_BN (BatchNorm (None, 7, 7, 160) 640 ['block_15_project[0][0]'] alization) block_15_add (Add) (None, 7, 7, 160) 0 ['block_14_add[0][0]', 'block_15_project_BN[0][0]'] block_16_expand (Conv2D) (None, 7, 7, 960) 153600 ['block_15_add[0][0]'] block_16_expand_BN (BatchNorma (None, 7, 7, 960) 3840 ['block_16_expand[0][0]'] lization) block_16_expand_relu (ReLU) (None, 7, 7, 960) 0 ['block_16_expand_BN[0][0]'] block_16_depthwise (DepthwiseC (None, 7, 7, 960) 8640 ['block_16_expand_relu[0][0]'] onv2D) block_16_depthwise_BN (BatchNo (None, 7, 7, 960) 3840 ['block_16_depthwise[0][0]'] rmalization) block_16_depthwise_relu (ReLU) (None, 7, 7, 960) 0 ['block_16_depthwise_BN[0][0]'] block_16_project (Conv2D) (None, 7, 7, 320) 307200 ['block_16_depthwise_relu[0][0]'] block_16_project_BN (BatchNorm (None, 7, 7, 320) 1280 ['block_16_project[0][0]'] alization) Conv_1 (Conv2D) (None, 7, 7, 1280) 409600 ['block_16_project_BN[0][0]'] Conv_1_bn (BatchNormalization) (None, 7, 7, 1280) 5120 ['Conv_1[0][0]'] out_relu (ReLU) (None, 7, 7, 1280) 0 ['Conv_1_bn[0][0]'] ================================================================================================== Total params: 2,257,984 Trainable params: 1,360,000 Non-trainable params: 897,984 __________________________________________________________________________________________________
In [61]:
model.summary()
Model: "model" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_1 (InputLayer) [(None, 224, 224, 3 0 [] )] Conv1 (Conv2D) (None, 112, 112, 32 864 ['input_1[0][0]'] ) bn_Conv1 (BatchNormalization) (None, 112, 112, 32 128 ['Conv1[0][0]'] ) Conv1_relu (ReLU) (None, 112, 112, 32 0 ['bn_Conv1[0][0]'] ) expanded_conv_depthwise (Depth (None, 112, 112, 32 288 ['Conv1_relu[0][0]'] wiseConv2D) ) expanded_conv_depthwise_BN (Ba (None, 112, 112, 32 128 ['expanded_conv_depthwise[0][0]'] tchNormalization) ) expanded_conv_depthwise_relu ( (None, 112, 112, 32 0 ['expanded_conv_depthwise_BN[0][0 ReLU) ) ]'] expanded_conv_project (Conv2D) (None, 112, 112, 16 512 ['expanded_conv_depthwise_relu[0] ) [0]'] expanded_conv_project_BN (Batc (None, 112, 112, 16 64 ['expanded_conv_project[0][0]'] hNormalization) ) block_1_expand (Conv2D) (None, 112, 112, 96 1536 ['expanded_conv_project_BN[0][0]' ) ] block_1_expand_BN (BatchNormal (None, 112, 112, 96 384 ['block_1_expand[0][0]'] ization) ) block_1_expand_relu (ReLU) (None, 112, 112, 96 0 ['block_1_expand_BN[0][0]'] ) block_1_pad (ZeroPadding2D) (None, 113, 113, 96 0 ['block_1_expand_relu[0][0]'] ) block_1_depthwise (DepthwiseCo (None, 56, 56, 96) 864 ['block_1_pad[0][0]'] nv2D) block_1_depthwise_BN (BatchNor (None, 56, 56, 96) 384 ['block_1_depthwise[0][0]'] malization) block_1_depthwise_relu (ReLU) (None, 56, 56, 96) 0 ['block_1_depthwise_BN[0][0]'] block_1_project (Conv2D) (None, 56, 56, 24) 2304 ['block_1_depthwise_relu[0][0]'] block_1_project_BN (BatchNorma (None, 56, 56, 24) 96 ['block_1_project[0][0]'] lization) block_2_expand (Conv2D) (None, 56, 56, 144) 3456 ['block_1_project_BN[0][0]'] block_2_expand_BN (BatchNormal (None, 56, 56, 144) 576 ['block_2_expand[0][0]'] ization) block_2_expand_relu (ReLU) (None, 56, 56, 144) 0 ['block_2_expand_BN[0][0]'] block_2_depthwise (DepthwiseCo (None, 56, 56, 144) 1296 ['block_2_expand_relu[0][0]'] nv2D) block_2_depthwise_BN (BatchNor (None, 56, 56, 144) 576 ['block_2_depthwise[0][0]'] malization) block_2_depthwise_relu (ReLU) (None, 56, 56, 144) 0 ['block_2_depthwise_BN[0][0]'] block_2_project (Conv2D) (None, 56, 56, 24) 3456 ['block_2_depthwise_relu[0][0]'] block_2_project_BN (BatchNorma (None, 56, 56, 24) 96 ['block_2_project[0][0]'] lization) block_2_add (Add) (None, 56, 56, 24) 0 ['block_1_project_BN[0][0]', 'block_2_project_BN[0][0]'] block_3_expand (Conv2D) (None, 56, 56, 144) 3456 ['block_2_add[0][0]'] block_3_expand_BN (BatchNormal (None, 56, 56, 144) 576 ['block_3_expand[0][0]'] ization) block_3_expand_relu (ReLU) (None, 56, 56, 144) 0 ['block_3_expand_BN[0][0]'] block_3_pad (ZeroPadding2D) (None, 57, 57, 144) 0 ['block_3_expand_relu[0][0]'] block_3_depthwise (DepthwiseCo (None, 28, 28, 144) 1296 ['block_3_pad[0][0]'] nv2D) block_3_depthwise_BN (BatchNor (None, 28, 28, 144) 576 ['block_3_depthwise[0][0]'] malization) block_3_depthwise_relu (ReLU) (None, 28, 28, 144) 0 ['block_3_depthwise_BN[0][0]'] block_3_project (Conv2D) (None, 28, 28, 32) 4608 ['block_3_depthwise_relu[0][0]'] block_3_project_BN (BatchNorma (None, 28, 28, 32) 128 ['block_3_project[0][0]'] lization) block_4_expand (Conv2D) (None, 28, 28, 192) 6144 ['block_3_project_BN[0][0]'] block_4_expand_BN (BatchNormal (None, 28, 28, 192) 768 ['block_4_expand[0][0]'] ization) block_4_expand_relu (ReLU) (None, 28, 28, 192) 0 ['block_4_expand_BN[0][0]'] 
block_4_depthwise (DepthwiseCo (None, 28, 28, 192) 1728 ['block_4_expand_relu[0][0]'] nv2D) block_4_depthwise_BN (BatchNor (None, 28, 28, 192) 768 ['block_4_depthwise[0][0]'] malization) block_4_depthwise_relu (ReLU) (None, 28, 28, 192) 0 ['block_4_depthwise_BN[0][0]'] block_4_project (Conv2D) (None, 28, 28, 32) 6144 ['block_4_depthwise_relu[0][0]'] block_4_project_BN (BatchNorma (None, 28, 28, 32) 128 ['block_4_project[0][0]'] lization) block_4_add (Add) (None, 28, 28, 32) 0 ['block_3_project_BN[0][0]', 'block_4_project_BN[0][0]'] block_5_expand (Conv2D) (None, 28, 28, 192) 6144 ['block_4_add[0][0]'] block_5_expand_BN (BatchNormal (None, 28, 28, 192) 768 ['block_5_expand[0][0]'] ization) block_5_expand_relu (ReLU) (None, 28, 28, 192) 0 ['block_5_expand_BN[0][0]'] block_5_depthwise (DepthwiseCo (None, 28, 28, 192) 1728 ['block_5_expand_relu[0][0]'] nv2D) block_5_depthwise_BN (BatchNor (None, 28, 28, 192) 768 ['block_5_depthwise[0][0]'] malization) block_5_depthwise_relu (ReLU) (None, 28, 28, 192) 0 ['block_5_depthwise_BN[0][0]'] block_5_project (Conv2D) (None, 28, 28, 32) 6144 ['block_5_depthwise_relu[0][0]'] block_5_project_BN (BatchNorma (None, 28, 28, 32) 128 ['block_5_project[0][0]'] lization) block_5_add (Add) (None, 28, 28, 32) 0 ['block_4_add[0][0]', 'block_5_project_BN[0][0]'] block_6_expand (Conv2D) (None, 28, 28, 192) 6144 ['block_5_add[0][0]'] block_6_expand_BN (BatchNormal (None, 28, 28, 192) 768 ['block_6_expand[0][0]'] ization) block_6_expand_relu (ReLU) (None, 28, 28, 192) 0 ['block_6_expand_BN[0][0]'] block_6_pad (ZeroPadding2D) (None, 29, 29, 192) 0 ['block_6_expand_relu[0][0]'] block_6_depthwise (DepthwiseCo (None, 14, 14, 192) 1728 ['block_6_pad[0][0]'] nv2D) block_6_depthwise_BN (BatchNor (None, 14, 14, 192) 768 ['block_6_depthwise[0][0]'] malization) block_6_depthwise_relu (ReLU) (None, 14, 14, 192) 0 ['block_6_depthwise_BN[0][0]'] block_6_project (Conv2D) (None, 14, 14, 64) 12288 ['block_6_depthwise_relu[0][0]'] block_6_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_6_project[0][0]'] lization) block_7_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_6_project_BN[0][0]'] block_7_expand_BN (BatchNormal (None, 14, 14, 384) 1536 ['block_7_expand[0][0]'] ization) block_7_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_7_expand_BN[0][0]'] block_7_depthwise (DepthwiseCo (None, 14, 14, 384) 3456 ['block_7_expand_relu[0][0]'] nv2D) block_7_depthwise_BN (BatchNor (None, 14, 14, 384) 1536 ['block_7_depthwise[0][0]'] malization) block_7_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_7_depthwise_BN[0][0]'] block_7_project (Conv2D) (None, 14, 14, 64) 24576 ['block_7_depthwise_relu[0][0]'] block_7_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_7_project[0][0]'] lization) block_7_add (Add) (None, 14, 14, 64) 0 ['block_6_project_BN[0][0]', 'block_7_project_BN[0][0]'] block_8_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_7_add[0][0]'] block_8_expand_BN (BatchNormal (None, 14, 14, 384) 1536 ['block_8_expand[0][0]'] ization) block_8_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_8_expand_BN[0][0]'] block_8_depthwise (DepthwiseCo (None, 14, 14, 384) 3456 ['block_8_expand_relu[0][0]'] nv2D) block_8_depthwise_BN (BatchNor (None, 14, 14, 384) 1536 ['block_8_depthwise[0][0]'] malization) block_8_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_8_depthwise_BN[0][0]'] block_8_project (Conv2D) (None, 14, 14, 64) 24576 ['block_8_depthwise_relu[0][0]'] block_8_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_8_project[0][0]'] lization) block_8_add (Add) (None, 
14, 14, 64) 0 ['block_7_add[0][0]', 'block_8_project_BN[0][0]'] block_9_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_8_add[0][0]'] block_9_expand_BN (BatchNormal (None, 14, 14, 384) 1536 ['block_9_expand[0][0]'] ization) block_9_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_9_expand_BN[0][0]'] block_9_depthwise (DepthwiseCo (None, 14, 14, 384) 3456 ['block_9_expand_relu[0][0]'] nv2D) block_9_depthwise_BN (BatchNor (None, 14, 14, 384) 1536 ['block_9_depthwise[0][0]'] malization) block_9_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_9_depthwise_BN[0][0]'] block_9_project (Conv2D) (None, 14, 14, 64) 24576 ['block_9_depthwise_relu[0][0]'] block_9_project_BN (BatchNorma (None, 14, 14, 64) 256 ['block_9_project[0][0]'] lization) block_9_add (Add) (None, 14, 14, 64) 0 ['block_8_add[0][0]', 'block_9_project_BN[0][0]'] block_10_expand (Conv2D) (None, 14, 14, 384) 24576 ['block_9_add[0][0]'] block_10_expand_BN (BatchNorma (None, 14, 14, 384) 1536 ['block_10_expand[0][0]'] lization) block_10_expand_relu (ReLU) (None, 14, 14, 384) 0 ['block_10_expand_BN[0][0]'] block_10_depthwise (DepthwiseC (None, 14, 14, 384) 3456 ['block_10_expand_relu[0][0]'] onv2D) block_10_depthwise_BN (BatchNo (None, 14, 14, 384) 1536 ['block_10_depthwise[0][0]'] rmalization) block_10_depthwise_relu (ReLU) (None, 14, 14, 384) 0 ['block_10_depthwise_BN[0][0]'] block_10_project (Conv2D) (None, 14, 14, 96) 36864 ['block_10_depthwise_relu[0][0]'] block_10_project_BN (BatchNorm (None, 14, 14, 96) 384 ['block_10_project[0][0]'] alization) block_11_expand (Conv2D) (None, 14, 14, 576) 55296 ['block_10_project_BN[0][0]'] block_11_expand_BN (BatchNorma (None, 14, 14, 576) 2304 ['block_11_expand[0][0]'] lization) block_11_expand_relu (ReLU) (None, 14, 14, 576) 0 ['block_11_expand_BN[0][0]'] block_11_depthwise (DepthwiseC (None, 14, 14, 576) 5184 ['block_11_expand_relu[0][0]'] onv2D) block_11_depthwise_BN (BatchNo (None, 14, 14, 576) 2304 ['block_11_depthwise[0][0]'] rmalization) block_11_depthwise_relu (ReLU) (None, 14, 14, 576) 0 ['block_11_depthwise_BN[0][0]'] block_11_project (Conv2D) (None, 14, 14, 96) 55296 ['block_11_depthwise_relu[0][0]'] block_11_project_BN (BatchNorm (None, 14, 14, 96) 384 ['block_11_project[0][0]'] alization) block_11_add (Add) (None, 14, 14, 96) 0 ['block_10_project_BN[0][0]', 'block_11_project_BN[0][0]'] block_12_expand (Conv2D) (None, 14, 14, 576) 55296 ['block_11_add[0][0]'] block_12_expand_BN (BatchNorma (None, 14, 14, 576) 2304 ['block_12_expand[0][0]'] lization) block_12_expand_relu (ReLU) (None, 14, 14, 576) 0 ['block_12_expand_BN[0][0]'] block_12_depthwise (DepthwiseC (None, 14, 14, 576) 5184 ['block_12_expand_relu[0][0]'] onv2D) block_12_depthwise_BN (BatchNo (None, 14, 14, 576) 2304 ['block_12_depthwise[0][0]'] rmalization) block_12_depthwise_relu (ReLU) (None, 14, 14, 576) 0 ['block_12_depthwise_BN[0][0]'] block_12_project (Conv2D) (None, 14, 14, 96) 55296 ['block_12_depthwise_relu[0][0]'] block_12_project_BN (BatchNorm (None, 14, 14, 96) 384 ['block_12_project[0][0]'] alization) block_12_add (Add) (None, 14, 14, 96) 0 ['block_11_add[0][0]', 'block_12_project_BN[0][0]'] block_13_expand (Conv2D) (None, 14, 14, 576) 55296 ['block_12_add[0][0]'] block_13_expand_BN (BatchNorma (None, 14, 14, 576) 2304 ['block_13_expand[0][0]'] lization) block_13_expand_relu (ReLU) (None, 14, 14, 576) 0 ['block_13_expand_BN[0][0]'] block_13_pad (ZeroPadding2D) (None, 15, 15, 576) 0 ['block_13_expand_relu[0][0]'] block_13_depthwise (DepthwiseC (None, 7, 7, 576) 5184 ['block_13_pad[0][0]'] onv2D) 
block_13_depthwise_BN (BatchNo (None, 7, 7, 576) 2304 ['block_13_depthwise[0][0]'] rmalization) block_13_depthwise_relu (ReLU) (None, 7, 7, 576) 0 ['block_13_depthwise_BN[0][0]'] block_13_project (Conv2D) (None, 7, 7, 160) 92160 ['block_13_depthwise_relu[0][0]'] block_13_project_BN (BatchNorm (None, 7, 7, 160) 640 ['block_13_project[0][0]'] alization) block_14_expand (Conv2D) (None, 7, 7, 960) 153600 ['block_13_project_BN[0][0]'] block_14_expand_BN (BatchNorma (None, 7, 7, 960) 3840 ['block_14_expand[0][0]'] lization) block_14_expand_relu (ReLU) (None, 7, 7, 960) 0 ['block_14_expand_BN[0][0]'] block_14_depthwise (DepthwiseC (None, 7, 7, 960) 8640 ['block_14_expand_relu[0][0]'] onv2D) block_14_depthwise_BN (BatchNo (None, 7, 7, 960) 3840 ['block_14_depthwise[0][0]'] rmalization) block_14_depthwise_relu (ReLU) (None, 7, 7, 960) 0 ['block_14_depthwise_BN[0][0]'] block_14_project (Conv2D) (None, 7, 7, 160) 153600 ['block_14_depthwise_relu[0][0]'] block_14_project_BN (BatchNorm (None, 7, 7, 160) 640 ['block_14_project[0][0]'] alization) block_14_add (Add) (None, 7, 7, 160) 0 ['block_13_project_BN[0][0]', 'block_14_project_BN[0][0]'] block_15_expand (Conv2D) (None, 7, 7, 960) 153600 ['block_14_add[0][0]'] block_15_expand_BN (BatchNorma (None, 7, 7, 960) 3840 ['block_15_expand[0][0]'] lization) block_15_expand_relu (ReLU) (None, 7, 7, 960) 0 ['block_15_expand_BN[0][0]'] block_15_depthwise (DepthwiseC (None, 7, 7, 960) 8640 ['block_15_expand_relu[0][0]'] onv2D) block_15_depthwise_BN (BatchNo (None, 7, 7, 960) 3840 ['block_15_depthwise[0][0]'] rmalization) block_15_depthwise_relu (ReLU) (None, 7, 7, 960) 0 ['block_15_depthwise_BN[0][0]'] block_15_project (Conv2D) (None, 7, 7, 160) 153600 ['block_15_depthwise_relu[0][0]'] block_15_project_BN (BatchNorm (None, 7, 7, 160) 640 ['block_15_project[0][0]'] alization) block_15_add (Add) (None, 7, 7, 160) 0 ['block_14_add[0][0]', 'block_15_project_BN[0][0]'] block_16_expand (Conv2D) (None, 7, 7, 960) 153600 ['block_15_add[0][0]'] block_16_expand_BN (BatchNorma (None, 7, 7, 960) 3840 ['block_16_expand[0][0]'] lization) block_16_expand_relu (ReLU) (None, 7, 7, 960) 0 ['block_16_expand_BN[0][0]'] block_16_depthwise (DepthwiseC (None, 7, 7, 960) 8640 ['block_16_expand_relu[0][0]'] onv2D) block_16_depthwise_BN (BatchNo (None, 7, 7, 960) 3840 ['block_16_depthwise[0][0]'] rmalization) block_16_depthwise_relu (ReLU) (None, 7, 7, 960) 0 ['block_16_depthwise_BN[0][0]'] block_16_project (Conv2D) (None, 7, 7, 320) 307200 ['block_16_depthwise_relu[0][0]'] block_16_project_BN (BatchNorm (None, 7, 7, 320) 1280 ['block_16_project[0][0]'] alization) Conv_1 (Conv2D) (None, 7, 7, 1280) 409600 ['block_16_project_BN[0][0]'] Conv_1_bn (BatchNormalization) (None, 7, 7, 1280) 5120 ['Conv_1[0][0]'] out_relu (ReLU) (None, 7, 7, 1280) 0 ['Conv_1_bn[0][0]'] flatten (Flatten) (None, 62720) 0 ['out_relu[0][0]'] dense (Dense) (None, 128) 8028288 ['flatten[0][0]'] dropout (Dropout) (None, 128) 0 ['dense[0][0]'] dense_1 (Dense) (None, 64) 8256 ['dropout[0][0]'] dense_2 (Dense) (None, 7) 455 ['dense_1[0][0]'] ================================================================================================== Total params: 10,294,983 Trainable params: 9,396,999 Non-trainable params: 897,984 __________________________________________________________________________________________________
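The tail of this summary shows the classification head that was attached to the backbone earlier in the notebook. Reconstructed from the layer names and parameter counts above, it has roughly the following shape; the layer sizes are confirmed by the summary, but the activations and the dropout rate are assumptions:

In [ ]:
# Sketch of the head implied by the summary: Flatten -> Dense(128) -> Dropout -> Dense(64) -> Dense(7)
x = Flatten()(base_model.output)              # (None, 62720)
x = Dense(128, activation='relu')(x)          # 8,028,288 params, as in the summary
x = Dropout(0.5)(x)                           # dropout rate is an assumption
x = Dense(64, activation='relu')(x)           # 8,256 params
output = Dense(7, activation='softmax')(x)    # 455 params, one unit per class
sketch_model = Model(inputs=base_model.input, outputs=output)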
In [62]:
model.compile(optimizer=Adam(learning_rate=0.0001), loss='categorical_crossentropy', metrics=['accuracy'])
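The fit call below passes the callbacks mcp and csv_logger, which were defined earlier in the notebook. Judging from the "val_accuracy did not improve" messages in the log, they are a ModelCheckpoint monitoring val_accuracy plus a CSVLogger, roughly as sketched here; the exact arguments and the log file name are assumptions, while CHECKPOINT_PATH is the path printed in a later cell:

In [ ]:
# Sketch of the callbacks used below (the original definition cell appears earlier in the notebook).
mcp = tf.keras.callbacks.ModelCheckpoint(
    CHECKPOINT_PATH,            # defined earlier; printed in a later cell
    monitor='val_accuracy',     # matches the "val_accuracy did not improve" messages in the log
    save_best_only=True,
    verbose=1)
csv_logger = tf.keras.callbacks.CSVLogger('training-log.csv')  # file name is an assumption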
In [64]:
epoch_history = model.fit(train_generator, epochs=20, validation_data=(X_val, y_val), callbacks=[mcp, csv_logger], batch_size=64)
(10 steps per epoch; every epoch also printed "Epoch N: val_accuracy did not improve from 1.00000", so no new checkpoint was written during this run.)
Epoch  1/20 - 10s 625ms/step - loss: 0.1486 - accuracy: 0.9585 - val_loss: 0.0481 - val_accuracy: 0.9733
Epoch  2/20 -  8s 773ms/step - loss: 0.0621 - accuracy: 0.9784 - val_loss: 0.0526 - val_accuracy: 0.9600
Epoch  3/20 -  6s 590ms/step - loss: 0.0531 - accuracy: 0.9818 - val_loss: 0.0507 - val_accuracy: 0.9600
Epoch  4/20 -  6s 594ms/step - loss: 0.0241 - accuracy: 0.9950 - val_loss: 0.0411 - val_accuracy: 0.9733
Epoch  5/20 -  6s 588ms/step - loss: 0.0473 - accuracy: 0.9834 - val_loss: 0.0422 - val_accuracy: 0.9600
Epoch  6/20 -  6s 635ms/step - loss: 0.0382 - accuracy: 0.9867 - val_loss: 0.0457 - val_accuracy: 0.9600
Epoch  7/20 -  6s 595ms/step - loss: 0.0348 - accuracy: 0.9900 - val_loss: 0.0478 - val_accuracy: 0.9600
Epoch  8/20 -  7s 733ms/step - loss: 0.0165 - accuracy: 0.9967 - val_loss: 0.0373 - val_accuracy: 0.9733
Epoch  9/20 -  6s 594ms/step - loss: 0.0263 - accuracy: 0.9900 - val_loss: 0.0308 - val_accuracy: 0.9733
Epoch 10/20 -  6s 589ms/step - loss: 0.0246 - accuracy: 0.9950 - val_loss: 0.0464 - val_accuracy: 0.9600
Epoch 11/20 -  6s 598ms/step - loss: 0.0141 - accuracy: 0.9950 - val_loss: 0.0602 - val_accuracy: 0.9600
Epoch 12/20 -  6s 593ms/step - loss: 0.0071 - accuracy: 1.0000 - val_loss: 0.0604 - val_accuracy: 0.9600
Epoch 13/20 -  6s 593ms/step - loss: 0.0107 - accuracy: 0.9983 - val_loss: 0.0571 - val_accuracy: 0.9600
Epoch 14/20 -  6s 636ms/step - loss: 0.0088 - accuracy: 0.9967 - val_loss: 0.0625 - val_accuracy: 0.9600
Epoch 15/20 -  6s 588ms/step - loss: 0.0101 - accuracy: 0.9983 - val_loss: 0.0629 - val_accuracy: 0.9600
Epoch 16/20 -  6s 598ms/step - loss: 0.0183 - accuracy: 0.9934 - val_loss: 0.0409 - val_accuracy: 0.9733
Epoch 17/20 -  6s 592ms/step - loss: 0.0182 - accuracy: 0.9934 - val_loss: 0.0367 - val_accuracy: 0.9733
Epoch 18/20 -  6s 598ms/step - loss: 0.0198 - accuracy: 0.9934 - val_loss: 0.0374 - val_accuracy: 0.9867
Epoch 19/20 -  6s 578ms/step - loss: 0.0076 - accuracy: 1.0000 - val_loss: 0.0380 - val_accuracy: 0.9867
Epoch 20/20 - 10s 990ms/step - loss: 0.0166 - accuracy: 0.9934 - val_loss: 0.0397 - val_accuracy: 0.9867
In [ ]:
In [65]:
model.evaluate(X_test, y_test)
3/3 [==============================] - 0s 48ms/step - loss: 0.2733 - accuracy: 0.9737
Out[65]:
[0.27334487438201904, 0.9736841917037964]
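Accuracy alone does not show which of the seven classes get confused with each other. Since classification_report and confusion_matrix are already imported, a per-class breakdown on the test set could be produced as follows; this is a sketch that assumes y_test is one-hot encoded, as implied by the categorical cross-entropy loss:

In [ ]:
# Per-class precision/recall and the confusion matrix on the test set.
y_pred = model.predict(X_test)                 # (num_samples, 7) class probabilities
y_pred_labels = np.argmax(y_pred, axis=1)      # predicted class index
y_true_labels = np.argmax(y_test, axis=1)      # true class index from one-hot labels
print(classification_report(y_true_labels, y_pred_labels))
print(confusion_matrix(y_true_labels, y_pred_labels))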
In [ ]:
In [66]:
### Where is the best model saved? It is on Google Drive.
### Load the best model saved on Google Drive and test it.
In [71]:
CHECKPOINT_PATH
Out[71]:
'/content/drive/MyDrive/Colab Notebooks/ml_plus/checkpoints/mobilenetv2/by-type-mobilenetv2-block-1-2.h5'
In [73]:
best_model = tf.keras.models.load_model(CHECKPOINT_PATH)
In [74]:
best_model.evaluate(X_test, y_test)
3/3 [==============================] - 1s 42ms/step - loss: 0.1537 - accuracy: 0.9737
Out[74]:
[0.15367211401462555, 0.9736841917037964]
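To run the reloaded model on a single image, the image has to be resized to 224×224 and scaled the same way the training data was prepared. A sketch with OpenCV; the file path and the 1/255 scaling are assumptions (if the training pipeline used mobilenet_v2 preprocess_input instead, apply that):

In [ ]:
# Predict the class of a single image with the best saved model.
img = cv2.imread('test-images/bus.jpg')             # path is an assumption
img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)          # OpenCV loads BGR; the model expects RGB
img = cv2.resize(img, (224, 224))
img = img.astype('float32') / 255.0                 # scaling scheme is an assumption
pred = best_model.predict(np.expand_dims(img, axis=0))
print(np.argmax(pred[0]))                           # index into the label mapping at the top of the notebook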
In [ ]: