Data loading & preprocessing

import tensorflow as tf
import numpy as np
from tensorflow.keras.layers import Dense, Flatten
from tensorflow.keras.models import Sequential
from tensorflow.keras.callbacks import ModelCheckpoint
fashion_mnist = tf.keras.datasets.fashion_mnist

(x_train, y_train), (x_test, y_test) = fashion_mnist.load_data()

x_train = x_train / 255.0
x_test = x_test / 255.0
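
The effect of dividing by 255.0 can be checked on a tiny stand-in array (the batch below is hypothetical, not actual Fashion-MNIST data): the uint8 pixels are promoted to float and mapped into [0, 1].

```python
import numpy as np

# Hypothetical 8-bit image batch standing in for Fashion-MNIST pixels.
batch = np.array([[0, 127, 255]], dtype=np.uint8)

# Dividing by 255.0 promotes uint8 to float64 and scales values into [0, 1].
scaled = batch / 255.0
print(scaled.dtype)                  # float64
print(scaled.min(), scaled.max())    # 0.0 1.0
```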

A simple model

model = Sequential([
    Flatten(input_shape=(28, 28)),
    Dense(512, activation='relu'),
    Dense(256, activation='relu'),
    Dense(128, activation='relu'),
    Dense(64, activation='relu'),
    Dense(10, activation='softmax')
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['acc'])

Creating a TensorBoard callback

# Use the current time to build a per-run folder name
import datetime
# Directory where this run's training logs will be written
log_dir = "logs/my_board/" + datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
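
Timestamped folder names keep every training run separate, so TensorBoard can overlay runs for comparison. With a fixed (hypothetical) datetime, the format string produces:

```python
import datetime

# A fixed datetime makes the strftime output reproducible for illustration.
stamp = datetime.datetime(2024, 1, 2, 3, 4, 5).strftime("%Y%m%d-%H%M%S")
print("logs/my_board/" + stamp)  # logs/my_board/20240102-030405
```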

# Define the TensorBoard callback
tensorboard_callback = tf.keras.callbacks.TensorBoard(log_dir=log_dir, histogram_freq=1)
model.fit(x_train, y_train,
          validation_data=(x_test, y_test),
          epochs=100,
          callbacks=[tensorboard_callback])
Epoch 1/100
1875/1875 [==============================] - 8s 4ms/step - loss: 0.1707 - acc: 0.9347 - val_loss: 0.4105 - val_acc: 0.8926
Epoch 2/100
1875/1875 [==============================] - 8s 4ms/step - loss: 0.1650 - acc: 0.9370 - val_loss: 0.4031 - val_acc: 0.8901
Epoch 3/100
1875/1875 [==============================] - 8s 4ms/step - loss: 0.1628 - acc: 0.9386 - val_loss: 0.3677 - val_acc: 0.8975
...
Epoch 99/100
1875/1875 [==============================] - 7s 4ms/step - loss: 0.0615 - acc: 0.9815 - val_loss: 1.0190 - val_acc: 0.8900
Epoch 100/100
1875/1875 [==============================] - 8s 4ms/step - loss: 0.0442 - acc: 0.9846 - val_loss: 1.0824 - val_acc: 0.8958
<tensorflow.python.keras.callbacks.History at 0x7f5886e590f0>
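
Note that the validation loss climbs steadily (from about 0.41 to 1.08) while training accuracy keeps rising, a classic sign of overfitting. The ModelCheckpoint import at the top is never used; a common pattern is to pass it alongside the TensorBoard callback so the best-validation weights are retained. A minimal sketch, with a hypothetical filepath not taken from the original notebook:

```python
import tensorflow as tf

# Save only the weights of the epoch with the lowest validation loss.
# "best_model.weights.h5" is an illustrative path, not from the notebook.
checkpoint = tf.keras.callbacks.ModelCheckpoint(
    "best_model.weights.h5",
    monitor="val_loss",
    save_best_only=True,       # overwrite only when val_loss improves
    save_weights_only=True,
)
tensorboard = tf.keras.callbacks.TensorBoard(log_dir="logs/demo", histogram_freq=1)

# Both callbacks would then be passed together:
# model.fit(..., callbacks=[tensorboard, checkpoint])
```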
model.summary()
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
flatten_1 (Flatten)          (None, 784)               0         
_________________________________________________________________
dense_6 (Dense)              (None, 512)               401920    
_________________________________________________________________
dense_7 (Dense)              (None, 256)               131328    
_________________________________________________________________
dense_8 (Dense)              (None, 128)               32896     
_________________________________________________________________
dense_9 (Dense)              (None, 64)                8256      
_________________________________________________________________
dense_10 (Dense)             (None, 10)                650       
=================================================================
Total params: 575,050
Trainable params: 575,050
Non-trainable params: 0
_________________________________________________________________
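
The parameter counts in the summary can be verified by hand: each Dense layer has (inputs × units) weights plus one bias per unit, and Flatten adds none.

```python
# (input size, units) for each Dense layer in the model above.
layers = [(784, 512), (512, 256), (256, 128), (128, 64), (64, 10)]

# Weights plus biases per layer.
params = [i * u + u for i, u in layers]
print(params)       # [401920, 131328, 32896, 8256, 650]
print(sum(params))  # 575050
```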

Loading TensorBoard directly in Colab

%load_ext tensorboard
The tensorboard extension is already loaded. To reload it, use:
  %reload_ext tensorboard
%tensorboard --logdir {log_dir}
Reusing TensorBoard on port 6007 (pid 793), started 0:31:59 ago. (Use '!kill 793' to kill it.)