Introduction to Keras¶
In [1]:
import tensorflow as tf
import keras
Keras for recognizing digits¶
Data¶
In [2]:
from keras.datasets import mnist
(X_train, y_train), (X_test, y_test) = mnist.load_data()
print('X_train: ', X_train.shape)
print('y_train: ', y_train.shape)
print('X_test: ', X_test.shape)
print('y_test: ', y_test.shape)
X_train: (60000, 28, 28)
y_train: (60000,)
X_test: (10000, 28, 28)
y_test: (10000,)
In [3]:
import matplotlib.pyplot as plt
%matplotlib inline
plt.imshow(X_train[100], cmap='binary')
print(y_train[100])
5
In [4]:
# print(X_train[100])
Preprocessing the data¶
We normalize X_train and X_test (pixel values go from 0-255 down to [0, 1])¶
In [5]:
X_train = X_train / 255
X_test = X_test / 255
We encode them as 32-bit floats¶
In [6]:
X_train = X_train.astype('float32')
X_test = X_test.astype('float32')
We flatten each image (a 28x28 matrix) into a vector of 784 values¶
In [7]:
nb_pixels = X_train.shape[1] * X_train.shape[2]
# print(nb_pixels)
X_train = X_train.reshape((X_train.shape[0], nb_pixels))
X_test = X_test.reshape((X_test.shape[0], nb_pixels))
print('X_train: ', X_train.shape)
X_train: (60000, 784)
One-hot encoding for the labels y¶
In [8]:
from keras.utils import to_categorical
y_train = to_categorical(y_train)
y_test = to_categorical(y_test)
print('y_train: ', y_train.shape)
print('y_train[100]: ', y_train[100])
y_train: (60000, 10)
y_train[100]: [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]
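As a sanity check, the original label is simply the position of the 1 in the one-hot vector; a minimal sketch using numpy (already a Keras dependency):

import numpy as np
# argmax recovers the original digit from the one-hot vector
print(np.argmax(y_train[100]))  # 5, matching the label printed earlier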
Networks for recognizing digits¶
A single-layer network¶
In [9]:
from keras.models import Sequential
from keras.layers import Dense
nb_classes = 10
model = Sequential()
# One layer of 10 neurons, each using softmax as its activation function
model.add(Dense(nb_classes, activation='softmax', kernel_initializer='normal'))
# We specify the loss function, the optimization algorithm, and the metrics we are interested in
model.compile(loss='categorical_crossentropy', optimizer='sgd', metrics=['accuracy'])
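For intuition, here is a minimal numpy sketch (not Keras' actual implementation) of what softmax and categorical cross-entropy compute for a single example; the logits z below are made-up values:

import numpy as np

def softmax(z):
    # Shift by the max for numerical stability before exponentiating
    e = np.exp(z - np.max(z))
    return e / e.sum()

z = np.array([1.0, 2.0, 0.5, 0.1, 3.0, 0.0, 0.0, 0.0, 0.0, 0.0])  # hypothetical raw outputs (logits)
p = softmax(z)                 # probabilities over the 10 classes, summing to 1
y = np.zeros(10)
y[4] = 1.0                     # one-hot encoding of the true class (here 4)
loss = -np.sum(y * np.log(p))  # categorical cross-entropy: -log of the probability of the true class
print(loss)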
We train the model¶
In [10]:
epochs = 10
batch_size = 32
model.fit(X_train, y_train, epochs=epochs, batch_size=batch_size)
Epoch 1/10
1875/1875 [==============================] - 1s 540us/step - loss: 0.7741 - accuracy: 0.8178
Epoch 2/10
1875/1875 [==============================] - 1s 529us/step - loss: 0.4548 - accuracy: 0.8816
Epoch 3/10
1875/1875 [==============================] - 1s 545us/step - loss: 0.4024 - accuracy: 0.8916
Epoch 4/10
1875/1875 [==============================] - 1s 525us/step - loss: 0.3763 - accuracy: 0.8969
Epoch 5/10
1875/1875 [==============================] - 1s 526us/step - loss: 0.3596 - accuracy: 0.9008
Epoch 6/10
1875/1875 [==============================] - 1s 531us/step - loss: 0.3477 - accuracy: 0.9040
Epoch 7/10
1875/1875 [==============================] - 1s 531us/step - loss: 0.3389 - accuracy: 0.9063
Epoch 8/10
1875/1875 [==============================] - 1s 529us/step - loss: 0.3317 - accuracy: 0.9084
Epoch 9/10
1875/1875 [==============================] - 1s 540us/step - loss: 0.3258 - accuracy: 0.9097
Epoch 10/10
1875/1875 [==============================] - 1s 526us/step - loss: 0.3210 - accuracy: 0.9111
Out[10]:
<keras.src.callbacks.History at 0xffff185ea190>
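The History object returned by fit stores the per-epoch metrics in its history attribute; a minimal sketch of how to plot them (note that calling fit again continues training the same model):

history = model.fit(X_train, y_train, epochs=epochs, batch_size=batch_size)
plt.plot(history.history['loss'], label='loss')          # training loss per epoch
plt.plot(history.history['accuracy'], label='accuracy')  # training accuracy per epoch
plt.xlabel('epoch')
plt.legend()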
We evaluate the quality of the model¶
In [13]:
score = model.evaluate(X_test, y_test)
# evaluate returns [loss, accuracy]: the accuracy is score[1], not score[0]
print('accuracy = {:2.2%}'.format(score[1]))
313/313 [==============================] - 0s 530us/step - loss: 0.3058 - accuracy: 0.9155
accuracy = 91.55%
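To look at an individual prediction, a minimal sketch with model.predict (the test index 0 is an arbitrary choice):

import numpy as np
probs = model.predict(X_test[:1])           # shape (1, 10): one probability per class
print('predicted digit:', np.argmax(probs[0]))
print('true digit:', np.argmax(y_test[0]))  # y_test is one-hot, so argmax recovers the label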
A two-layer network¶
In [14]:
# Careful: we reload the data, redo the formatting, etc.
(X_train, y_train), (X_test, y_test) = mnist.load_data()
X_train = X_train / 255
X_test = X_test / 255
X_train = X_train.astype('float32')
X_test = X_test.astype('float32')
nb_pixels = X_train.shape[1] * X_train.shape[2]
X_train = X_train.reshape((X_train.shape[0], nb_pixels))
X_test = X_test.reshape((X_test.shape[0], nb_pixels))
y_train = to_categorical(y_train)
y_test = to_categorical(y_test)
In [18]:
nb_classes = 10
model = Sequential()
# A hidden layer of 64 neurons; note that the activation here is softmax, whereas relu is the usual choice for a hidden layer (see the sketch after the evaluation below)
model.add(Dense(64, activation='softmax', kernel_initializer='normal'))
# An output layer of 10 neurons, each using softmax as its activation function
model.add(Dense(nb_classes, activation='softmax', kernel_initializer='normal'))
# We specify the loss function, the optimization algorithm, and the metrics we are interested in
model.compile(loss='categorical_crossentropy', optimizer='sgd', metrics=['accuracy'])
epochs = 100
batch_size = 32
model.fit(X_train, y_train, epochs=epochs, batch_size=batch_size)
Epoch 1/100
1875/1875 [==============================] - 1s 680us/step - loss: 2.3007 - accuracy: 0.1133
Epoch 2/100
1875/1875 [==============================] - 1s 669us/step - loss: 2.2978 - accuracy: 0.1124
Epoch 3/100
1875/1875 [==============================] - 1s 676us/step - loss: 2.2827 - accuracy: 0.1586
Epoch 4/100
1875/1875 [==============================] - 1s 684us/step - loss: 2.2013 - accuracy: 0.2139
Epoch 5/100
1875/1875 [==============================] - 1s 685us/step - loss: 2.0440 - accuracy: 0.3028
Epoch 6/100
1875/1875 [==============================] - 1s 676us/step - loss: 1.8709 - accuracy: 0.3168
Epoch 7/100
1875/1875 [==============================] - 1s 690us/step - loss: 1.7555 - accuracy: 0.3207
Epoch 8/100
1875/1875 [==============================] - 1s 682us/step - loss: 1.6594 - accuracy: 0.3214
Epoch 9/100
1875/1875 [==============================] - 1s 692us/step - loss: 1.5942 - accuracy: 0.3223
Epoch 10/100
1875/1875 [==============================] - 1s 689us/step - loss: 1.5599 - accuracy: 0.3232
Epoch 11/100
1875/1875 [==============================] - 1s 692us/step - loss: 1.5401 - accuracy: 0.3278
Epoch 12/100
1875/1875 [==============================] - 1s 693us/step - loss: 1.5271 - accuracy: 0.3245
Epoch 13/100
1875/1875 [==============================] - 1s 704us/step - loss: 1.5180 - accuracy: 0.3284
Epoch 14/100
1875/1875 [==============================] - 1s 694us/step - loss: 1.5110 - accuracy: 0.3298
Epoch 15/100
1875/1875 [==============================] - 1s 691us/step - loss: 1.5056 - accuracy: 0.3293
Epoch 16/100
1875/1875 [==============================] - 1s 693us/step - loss: 1.5010 - accuracy: 0.3303
Epoch 17/100
1875/1875 [==============================] - 1s 703us/step - loss: 1.4972 - accuracy: 0.3313
Epoch 18/100
1875/1875 [==============================] - 1s 703us/step - loss: 1.4938 - accuracy: 0.3365
Epoch 19/100
1875/1875 [==============================] - 1s 692us/step - loss: 1.4909 - accuracy: 0.3343
Epoch 20/100
1875/1875 [==============================] - 1s 714us/step - loss: 1.4881 - accuracy: 0.3382
Epoch 21/100
1875/1875 [==============================] - 1s 732us/step - loss: 1.4855 - accuracy: 0.3390
Epoch 22/100
1875/1875 [==============================] - 1s 695us/step - loss: 1.4832 - accuracy: 0.3419
Epoch 23/100
1875/1875 [==============================] - 1s 694us/step - loss: 1.4807 - accuracy: 0.3412
Epoch 24/100
1875/1875 [==============================] - 1s 695us/step - loss: 1.4782 - accuracy: 0.3431
Epoch 25/100
1875/1875 [==============================] - 1s 694us/step - loss: 1.4758 - accuracy: 0.3471
Epoch 26/100
1875/1875 [==============================] - 1s 704us/step - loss: 1.4732 - accuracy: 0.3501
Epoch 27/100
1875/1875 [==============================] - 1s 699us/step - loss: 1.4703 - accuracy: 0.3535
Epoch 28/100
1875/1875 [==============================] - 1s 700us/step - loss: 1.4669 - accuracy: 0.3571
Epoch 29/100
1875/1875 [==============================] - 1s 702us/step - loss: 1.4630 - accuracy: 0.3596
Epoch 30/100
1875/1875 [==============================] - 1s 698us/step - loss: 1.4578 - accuracy: 0.3656
Epoch 31/100
1875/1875 [==============================] - 1s 700us/step - loss: 1.4517 - accuracy: 0.3705
Epoch 32/100
1875/1875 [==============================] - 1s 717us/step - loss: 1.4441 - accuracy: 0.3760
Epoch 33/100
1875/1875 [==============================] - 1s 709us/step - loss: 1.4349 - accuracy: 0.3805
Epoch 34/100
1875/1875 [==============================] - 1s 724us/step - loss: 1.4247 - accuracy: 0.3860
Epoch 35/100
1875/1875 [==============================] - 1s 724us/step - loss: 1.4149 - accuracy: 0.3884
Epoch 36/100
1875/1875 [==============================] - 1s 689us/step - loss: 1.4058 - accuracy: 0.3925
Epoch 37/100
1875/1875 [==============================] - 1s 683us/step - loss: 1.3980 - accuracy: 0.3959
Epoch 38/100
1875/1875 [==============================] - 1s 683us/step - loss: 1.3908 - accuracy: 0.4005
Epoch 39/100
1875/1875 [==============================] - 1s 682us/step - loss: 1.3839 - accuracy: 0.4013
Epoch 40/100
1875/1875 [==============================] - 1s 692us/step - loss: 1.3769 - accuracy: 0.4060
Epoch 41/100
1875/1875 [==============================] - 1s 695us/step - loss: 1.3700 - accuracy: 0.4069
Epoch 42/100
1875/1875 [==============================] - 1s 690us/step - loss: 1.3627 - accuracy: 0.4114
Epoch 43/100
1875/1875 [==============================] - 1s 693us/step - loss: 1.3554 - accuracy: 0.4156
Epoch 44/100
1875/1875 [==============================] - 1s 697us/step - loss: 1.3480 - accuracy: 0.4186
Epoch 45/100
1875/1875 [==============================] - 1s 694us/step - loss: 1.3410 - accuracy: 0.4221
Epoch 46/100
1875/1875 [==============================] - 1s 697us/step - loss: 1.3338 - accuracy: 0.4241
Epoch 47/100
1875/1875 [==============================] - 1s 697us/step - loss: 1.3273 - accuracy: 0.4261
Epoch 48/100
1875/1875 [==============================] - 1s 713us/step - loss: 1.3211 - accuracy: 0.4313
Epoch 49/100
1875/1875 [==============================] - 1s 710us/step - loss: 1.3154 - accuracy: 0.4322
Epoch 50/100
1875/1875 [==============================] - 1s 702us/step - loss: 1.3103 - accuracy: 0.4335
Epoch 51/100
1875/1875 [==============================] - 1s 684us/step - loss: 1.3060 - accuracy: 0.4341
Epoch 52/100
1875/1875 [==============================] - 1s 692us/step - loss: 1.3017 - accuracy: 0.4364
Epoch 53/100
1875/1875 [==============================] - 1s 687us/step - loss: 1.2978 - accuracy: 0.4377
Epoch 54/100
1875/1875 [==============================] - 1s 681us/step - loss: 1.2940 - accuracy: 0.4381
Epoch 55/100
1875/1875 [==============================] - 1s 690us/step - loss: 1.2906 - accuracy: 0.4412
Epoch 56/100
1875/1875 [==============================] - 1s 704us/step - loss: 1.2874 - accuracy: 0.4408
Epoch 57/100
1875/1875 [==============================] - 1s 688us/step - loss: 1.2847 - accuracy: 0.4437
Epoch 58/100
1875/1875 [==============================] - 1s 693us/step - loss: 1.2817 - accuracy: 0.4446
Epoch 59/100
1875/1875 [==============================] - 1s 695us/step - loss: 1.2790 - accuracy: 0.4476
Epoch 60/100
1875/1875 [==============================] - 1s 746us/step - loss: 1.2763 - accuracy: 0.4487
Epoch 61/100
1875/1875 [==============================] - 1s 713us/step - loss: 1.2740 - accuracy: 0.4489
Epoch 62/100
1875/1875 [==============================] - 1s 703us/step - loss: 1.2714 - accuracy: 0.4528
Epoch 63/100
1875/1875 [==============================] - 1s 692us/step - loss: 1.2693 - accuracy: 0.4529
Epoch 64/100
1875/1875 [==============================] - 1s 689us/step - loss: 1.2669 - accuracy: 0.4554
Epoch 65/100
1875/1875 [==============================] - 1s 683us/step - loss: 1.2646 - accuracy: 0.4589
Epoch 66/100
1875/1875 [==============================] - 1s 680us/step - loss: 1.2624 - accuracy: 0.4600
Epoch 67/100
1875/1875 [==============================] - 1s 682us/step - loss: 1.2600 - accuracy: 0.4614
Epoch 68/100
1875/1875 [==============================] - 1s 677us/step - loss: 1.2576 - accuracy: 0.4674
Epoch 69/100
1875/1875 [==============================] - 1s 680us/step - loss: 1.2555 - accuracy: 0.4703
Epoch 70/100
1875/1875 [==============================] - 1s 716us/step - loss: 1.2534 - accuracy: 0.4701
Epoch 71/100
1875/1875 [==============================] - 1s 724us/step - loss: 1.2505 - accuracy: 0.4737
Epoch 72/100
1875/1875 [==============================] - 1s 692us/step - loss: 1.2478 - accuracy: 0.4768
Epoch 73/100
1875/1875 [==============================] - 1s 681us/step - loss: 1.2458 - accuracy: 0.4817
Epoch 74/100
1875/1875 [==============================] - 1s 679us/step - loss: 1.2428 - accuracy: 0.4840
Epoch 75/100
1875/1875 [==============================] - 1s 687us/step - loss: 1.2402 - accuracy: 0.4854
Epoch 76/100
1875/1875 [==============================] - 1s 688us/step - loss: 1.2375 - accuracy: 0.4884
Epoch 77/100
1875/1875 [==============================] - 1s 716us/step - loss: 1.2345 - accuracy: 0.4916
Epoch 78/100
1875/1875 [==============================] - 1s 751us/step - loss: 1.2321 - accuracy: 0.4925
Epoch 79/100
1875/1875 [==============================] - 1s 730us/step - loss: 1.2292 - accuracy: 0.4953
Epoch 80/100
1875/1875 [==============================] - 1s 705us/step - loss: 1.2267 - accuracy: 0.4968
Epoch 81/100
1875/1875 [==============================] - 1s 700us/step - loss: 1.2238 - accuracy: 0.5005
Epoch 82/100
1875/1875 [==============================] - 1s 702us/step - loss: 1.2210 - accuracy: 0.5022
Epoch 83/100
1875/1875 [==============================] - 1s 710us/step - loss: 1.2183 - accuracy: 0.5036
Epoch 84/100
1875/1875 [==============================] - 1s 705us/step - loss: 1.2157 - accuracy: 0.5048
Epoch 85/100
1875/1875 [==============================] - 1s 711us/step - loss: 1.2138 - accuracy: 0.5072
Epoch 86/100
1875/1875 [==============================] - 1s 712us/step - loss: 1.2109 - accuracy: 0.5089
Epoch 87/100
1875/1875 [==============================] - 1s 714us/step - loss: 1.2088 - accuracy: 0.5095
Epoch 88/100
1875/1875 [==============================] - 1s 718us/step - loss: 1.2068 - accuracy: 0.5129
Epoch 89/100
1875/1875 [==============================] - 1s 718us/step - loss: 1.2043 - accuracy: 0.5127
Epoch 90/100
1875/1875 [==============================] - 1s 717us/step - loss: 1.2016 - accuracy: 0.5170
Epoch 91/100
1875/1875 [==============================] - 1s 716us/step - loss: 1.1997 - accuracy: 0.5197
Epoch 92/100
1875/1875 [==============================] - 1s 708us/step - loss: 1.1979 - accuracy: 0.5219
Epoch 93/100
1875/1875 [==============================] - 1s 717us/step - loss: 1.1955 - accuracy: 0.5242
Epoch 94/100
1875/1875 [==============================] - 1s 725us/step - loss: 1.1937 - accuracy: 0.5262
Epoch 95/100
1875/1875 [==============================] - 1s 721us/step - loss: 1.1912 - accuracy: 0.5283
Epoch 96/100
1875/1875 [==============================] - 1s 725us/step - loss: 1.1896 - accuracy: 0.5299
Epoch 97/100
1875/1875 [==============================] - 1s 710us/step - loss: 1.1879 - accuracy: 0.5312
Epoch 98/100
1875/1875 [==============================] - 1s 715us/step - loss: 1.1862 - accuracy: 0.5312
Epoch 99/100
1875/1875 [==============================] - 1s 716us/step - loss: 1.1846 - accuracy: 0.5331
Epoch 100/100
1875/1875 [==============================] - 1s 717us/step - loss: 1.1828 - accuracy: 0.5347
Out[18]:
<keras.src.callbacks.History at 0xfffef933a250>
In [20]:
score = model.evaluate(X_test, y_test)
print('accuracy = {:2.2%}'.format(score[1]))
313/313 [==============================] - 0s 583us/step - loss: 1.1975 - accuracy: 0.5304
accuracy = 53.04%
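The modest accuracy is largely due to using softmax in the hidden layer. As a sketch (not run here), the architecture the comment above intended, with relu in the hidden layer, typically converges much faster on MNIST; model_relu is a hypothetical name:

model_relu = Sequential()
model_relu.add(Dense(64, activation='relu', kernel_initializer='normal'))  # hidden layer with relu
model_relu.add(Dense(nb_classes, activation='softmax', kernel_initializer='normal'))
model_relu.compile(loss='categorical_crossentropy', optimizer='sgd', metrics=['accuracy'])
# model_relu.fit(X_train, y_train, epochs=10, batch_size=32)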
And in fact, how many parameters does our network have?¶
In [21]:
print(model.summary())
Model: "sequential_3" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= dense_5 (Dense) (None, 64) 50240 dense_6 (Dense) (None, 10) 650 ================================================================= Total params: 50890 (198.79 KB) Trainable params: 50890 (198.79 KB) Non-trainable params: 0 (0.00 Byte) _________________________________________________________________ None