Training a Fashion-MNIST Model with Keras

2024-04-26 03:32

This article walks through training a Fashion-MNIST classification model with Keras, from downloading the data to building, training, and evaluating the network.

1. Download the Fashion-MNIST data

The Fashion-MNIST data can be downloaded with the following three lines of code:

from tensorflow import keras
fashion_mnist = keras.datasets.fashion_mnist
(train_images, train_labels), (test_images, test_labels) = fashion_mnist.load_data()

Inspect the data:
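A minimal sketch of inspecting the dataset; the printed shapes are the standard Fashion-MNIST dimensions (60,000 training and 10,000 test images, each 28×28 grayscale, with integer labels 0–9):

```python
from tensorflow import keras

fashion_mnist = keras.datasets.fashion_mnist
(train_images, train_labels), (test_images, test_labels) = fashion_mnist.load_data()

# 60,000 training and 10,000 test images, each 28x28 grayscale
print(train_images.shape)   # (60000, 28, 28)
print(test_images.shape)    # (10000, 28, 28)

# Labels are integers in 0..9, one per clothing category
print(train_labels.min(), train_labels.max())   # 0 9
```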

Build the network:

import tensorflow as tf
from tensorflow import keras
fashion_mnist = keras.datasets.fashion_mnist
(train_images, train_labels), (test_images, test_labels) = fashion_mnist.load_data()

model = keras.Sequential()
model.add(keras.layers.Flatten(input_shape=(28,28)))
model.add(keras.layers.Dense(128,activation=tf.nn.relu))
model.add(keras.layers.Dense(10,activation=tf.nn.softmax))
model.summary()

Model parameters

Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
flatten_1 (Flatten)          (None, 784)               0         
_________________________________________________________________
dense_2 (Dense)              (None, 128)               100480    
_________________________________________________________________
dense_3 (Dense)              (None, 10)                1290      
=================================================================
Total params: 101,770
Trainable params: 101,770
Non-trainable params: 0

This is a fully connected neural network with the following structure:

Every layer is fully connected.

The hidden layer's parameter count is computed as:

784 * 128 (weights) + 128 (biases) = 100480

The output layer's parameter count is computed as:

128 * 10 (weights) + 10 (biases) = 1290

These results match the counts reported by summary().
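The arithmetic above can be checked directly; a Dense layer with n inputs and m units holds n*m weights plus m biases:

```python
# Parameter count for a Dense layer: inputs * units (weights) + units (biases)
def dense_params(n_inputs, n_units):
    return n_inputs * n_units + n_units

hidden = dense_params(28 * 28, 128)   # 784*128 + 128
output = dense_params(128, 10)        # 128*10 + 10
print(hidden, output, hidden + output)  # 100480 1290 101770
```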

Train the model

model.compile(optimizer=tf.optimizers.Adam(),
              loss=tf.losses.sparse_categorical_crossentropy,
              metrics=['accuracy'])
model.fit(train_images, train_labels, epochs=50)

Epoch 1/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.5283 - accuracy: 0.8235
Epoch 2/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.5076 - accuracy: 0.8300
Epoch 3/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4954 - accuracy: 0.8316
Epoch 4/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4892 - accuracy: 0.8350
Epoch 5/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4853 - accuracy: 0.8364
Epoch 6/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4847 - accuracy: 0.8388
Epoch 7/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4769 - accuracy: 0.8414
Epoch 8/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4733 - accuracy: 0.8407
Epoch 9/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4796 - accuracy: 0.8410
Epoch 10/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4935 - accuracy: 0.8381
Epoch 11/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4756 - accuracy: 0.8408
Epoch 12/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4745 - accuracy: 0.8424
Epoch 13/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4707 - accuracy: 0.8434
Epoch 14/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4693 - accuracy: 0.8434
Epoch 15/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4614 - accuracy: 0.8463
Epoch 16/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4617 - accuracy: 0.8468
Epoch 17/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4481 - accuracy: 0.8492
Epoch 18/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4638 - accuracy: 0.8464
Epoch 19/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4624 - accuracy: 0.8481
Epoch 20/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4500 - accuracy: 0.8501
Epoch 21/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4546 - accuracy: 0.8479
Epoch 22/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4560 - accuracy: 0.8495
Epoch 23/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4552 - accuracy: 0.8490
Epoch 24/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4492 - accuracy: 0.8501
Epoch 25/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4497 - accuracy: 0.8525
Epoch 26/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4489 - accuracy: 0.8495
Epoch 27/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4464 - accuracy: 0.8515
Epoch 28/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4529 - accuracy: 0.8507
Epoch 29/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4495 - accuracy: 0.8526
Epoch 30/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4454 - accuracy: 0.8508
Epoch 31/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4578 - accuracy: 0.8513
Epoch 32/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4434 - accuracy: 0.8534
Epoch 33/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4387 - accuracy: 0.8541
Epoch 34/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4481 - accuracy: 0.8536
Epoch 35/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4448 - accuracy: 0.8536
Epoch 36/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4459 - accuracy: 0.8540
Epoch 37/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4397 - accuracy: 0.8537
Epoch 38/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4399 - accuracy: 0.8559
Epoch 39/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4528 - accuracy: 0.8530
Epoch 40/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4404 - accuracy: 0.8561
Epoch 41/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4433 - accuracy: 0.8540
Epoch 42/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4354 - accuracy: 0.8555
Epoch 43/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4319 - accuracy: 0.8572
Epoch 44/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4408 - accuracy: 0.8557
Epoch 45/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4708 - accuracy: 0.8440
Epoch 46/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4461 - accuracy: 0.8542
Epoch 47/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4337 - accuracy: 0.8584
Epoch 48/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4346 - accuracy: 0.8577
Epoch 49/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4386 - accuracy: 0.8569
Epoch 50/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4505 - accuracy: 0.8560
<keras.callbacks.History at 0x7f5c3f065430>

After 50 epochs of training, the accuracy hovers around 85%. Despite a few temporary setbacks along the way, the overall trend is upward, stabilizing at roughly 85.6%.

Add normalization to improve accuracy

Put simply, normalization here means scaling each pixel's grayscale value into the [0, 1] range:

train_images = train_images / 255
model.compile(optimizer=tf.optimizers.Adam(),
              loss=tf.losses.sparse_categorical_crossentropy,
              metrics=['accuracy'])
model.fit(train_images, train_labels, epochs=50)
Epoch 1/50
1875/1875 [==============================] - 2s 1ms/step - loss: 1.0806 - accuracy: 0.6693
Epoch 2/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.6354 - accuracy: 0.7707
Epoch 3/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.5593 - accuracy: 0.8002
Epoch 4/50
1875/1875 [==============================] - 3s 1ms/step - loss: 0.5169 - accuracy: 0.8167
Epoch 5/50
1875/1875 [==============================] - 3s 1ms/step - loss: 0.4904 - accuracy: 0.8270
Epoch 6/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4709 - accuracy: 0.8342
Epoch 7/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4553 - accuracy: 0.8400
Epoch 8/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4430 - accuracy: 0.8442
Epoch 9/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4333 - accuracy: 0.8471
Epoch 10/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4246 - accuracy: 0.8502
Epoch 11/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4172 - accuracy: 0.8526
Epoch 12/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4102 - accuracy: 0.8553
Epoch 13/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4045 - accuracy: 0.8586
Epoch 14/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3987 - accuracy: 0.8605
Epoch 15/50
1875/1875 [==============================] - 3s 1ms/step - loss: 0.3939 - accuracy: 0.8608
Epoch 16/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3904 - accuracy: 0.8626
Epoch 17/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3858 - accuracy: 0.8647
Epoch 18/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3818 - accuracy: 0.8654
Epoch 19/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3778 - accuracy: 0.8677
Epoch 20/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3745 - accuracy: 0.8683
Epoch 21/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3718 - accuracy: 0.8690
Epoch 22/50
1875/1875 [==============================] - 3s 1ms/step - loss: 0.3680 - accuracy: 0.8701
Epoch 23/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3653 - accuracy: 0.8717
Epoch 24/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3620 - accuracy: 0.8724
Epoch 25/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3594 - accuracy: 0.8736
Epoch 26/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3569 - accuracy: 0.8743
Epoch 27/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3543 - accuracy: 0.8750
Epoch 28/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3521 - accuracy: 0.8763
Epoch 29/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3492 - accuracy: 0.8766
Epoch 30/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3467 - accuracy: 0.8773
Epoch 31/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3445 - accuracy: 0.8788
Epoch 32/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3424 - accuracy: 0.8783
Epoch 33/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3403 - accuracy: 0.8801
Epoch 34/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3379 - accuracy: 0.8795
Epoch 35/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3360 - accuracy: 0.8816
Epoch 36/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3338 - accuracy: 0.8817
Epoch 37/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3317 - accuracy: 0.8824
Epoch 38/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3296 - accuracy: 0.8832
Epoch 39/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3271 - accuracy: 0.8837
Epoch 40/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3257 - accuracy: 0.8842
Epoch 41/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3236 - accuracy: 0.8846
Epoch 42/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3211 - accuracy: 0.8864
Epoch 43/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3196 - accuracy: 0.8859
Epoch 44/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3174 - accuracy: 0.8868
Epoch 45/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3162 - accuracy: 0.8874
Epoch 46/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3144 - accuracy: 0.8877
Epoch 47/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3124 - accuracy: 0.8884
Epoch 48/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3108 - accuracy: 0.8894
Epoch 49/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3086 - accuracy: 0.8895
Epoch 50/50
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3074 - accuracy: 0.8897
<keras.callbacks.History at 0x7f5c4011ddc0>

As you can see, the final accuracy is higher than it was before normalization.

Inference with the model

test_images_scaled = test_images / 255
model.evaluate(test_images_scaled, test_labels)

Verifying inference on a non-normalized input:

import numpy as np

demo = tf.reshape(test_images[0], (1, 28, 28))
print(np.argmax(model.predict(demo)))
print(test_labels[0])

If the inputs were normalized during training, they must be normalized for inference as well:
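A self-contained sketch of that scaling step. The model is rebuilt here untrained so the snippet runs on its own, so the predicted class is arbitrary; the point is that the test image is divided by 255 exactly as the training inputs were, and that predict returns one probability per class:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

fashion_mnist = keras.datasets.fashion_mnist
(train_images, train_labels), (test_images, test_labels) = fashion_mnist.load_data()

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation=tf.nn.relu),
    keras.layers.Dense(10, activation=tf.nn.softmax),
])
# (training omitted; in the article the model is fit on train_images / 255)

# Scale the test image exactly as the training inputs were scaled
demo = tf.reshape(test_images[0] / 255, (1, 28, 28))
probs = model.predict(demo)
print(probs.shape)       # (1, 10): one probability per class
print(np.argmax(probs))  # predicted class index
```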

From the accuracy differences across the training runs above, we can conclude that the same network, trained at different times, can reach quite different accuracies. This reflects the complexity and variability of the model itself. Why does this happen? The model's many parameters form a high-dimensional space, and when gradient descent searches that space for an optimum, the outcome depends on where the initial weights and biases happen to sit. Different training runs start from different initial positions, so they optimize in different directions and converge to different optima. Model training is therefore a complex numerical process, and the optimum it finds may well be a local one rather than the global optimum.

Debugging

Get each layer's name:
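A minimal sketch (the model is rebuilt here so the snippet is self-contained; Keras auto-generates names like flatten, dense, dense_1 unless you pass a name= argument):

```python
import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation=tf.nn.relu),
    keras.layers.Dense(10, activation=tf.nn.softmax),
])

# Each layer exposes its (auto-generated) name
for layer in model.layers:
    print(layer.name)
```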

Get the weights and biases:
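For a Dense layer, get_weights() returns the kernel matrix and bias vector as a two-element list. A self-contained sketch (model rebuilt as above; because input_shape is given, the weights exist even before training):

```python
import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation=tf.nn.relu),
    keras.layers.Dense(10, activation=tf.nn.softmax),
])

# get_weights() returns [weights, biases] for a Dense layer
w, b = model.layers[1].get_weights()
print(w.shape, b.shape)    # (784, 128) (128,)

w2, b2 = model.layers[2].get_weights()
print(w2.shape, b2.shape)  # (128, 10) (10,)
```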

As you can see, the weight dimensions match the calculations earlier in this article.

Troubleshooting reference:

ValueError: Input 0 of layer dense is incompatible with the layer: expected axis -1 of input shape_善良995的博客-CSDN博客


Done!


