
batch normalization

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dropout, Flatten
from tensorflow.keras.layers import Dense, BatchNormalization, Activation

model = Sequential([
  Flatten(input_shape=(784,)),        # input is already a flat 784-dim vector (e.g. a flattened 28x28 image)
  Dense(units=32),                    # fully connected layer, no activation yet
  Activation('relu'),                 # activation applied as a separate layer
  Dropout(rate=0.25),                 # drop 25% of the units during training
  Dense(units=64, activation='relu')  # activation passed directly as an argument
])
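
Note that BatchNormalization is imported above but not used in this model. A minimal sketch of a common placement, inserting it between a Dense layer and its activation so the pre-activation outputs are normalized (layer sizes below are just illustrative):

bn_model = Sequential([
  Flatten(input_shape=(784,)),
  Dense(units=32),
  BatchNormalization(),               # normalize the pre-activation outputs of the Dense layer
  Activation('relu'),
  Dropout(rate=0.25),
  Dense(units=64, activation='relu')
])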

ModelCheckpoint is one of the Keras Callback features; it saves the model (or just its weights) during training.
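
A minimal sketch of how it might be wired into training, assuming a compiled model and placeholder x_train / y_train data ('best_model.h5' is a hypothetical output path):

from tensorflow.keras.callbacks import ModelCheckpoint

# Keep only the weights that achieve the lowest validation loss so far
checkpoint = ModelCheckpoint(
  filepath='best_model.h5',   # hypothetical output path
  monitor='val_loss',
  save_best_only=True)

model.fit(x_train, y_train,   # x_train / y_train are placeholder data
          validation_split=0.2,
          epochs=10,
          callbacks=[checkpoint])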
