TensorFlow (Keras) Introductory Course -- 04 Convolutional Neural Networks

Contents

  • 1 Introduction
  • 2 Improving Computer Vision Accuracy with Convolutions
  • 3 Visualizing Convolutions and Pooling

1 Introduction

In this section, we will learn how to use convolutional neural networks to improve an image classification model.

2 Improving Computer Vision Accuracy with Convolutions

In the previous lab, we used a three-layer deep neural network for fashion image recognition: an input layer (matching the shape of the input data), an output layer (matching the shape of the desired output), and one hidden layer.

For convenience, let's first run the DNN code and print the test accuracy.

import tensorflow as tf

fashion_mnist = tf.keras.datasets.fashion_mnist
(training_images, training_labels), (test_images, test_labels) = fashion_mnist.load_data()

# Normalize pixel values to the [0, 1] range
training_images = training_images / 255.0
test_images = test_images / 255.0

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax")
])

model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(training_images, training_labels, epochs=5)

test_loss, test_accuracy = model.evaluate(test_images, test_labels)
print('Test loss: {}, Test accuracy: {}'.format(test_loss, test_accuracy * 100))
Epoch 1/5
60000/60000 [==============================] - 4s 72us/sample - loss: 0.4982 - acc: 0.8257
Epoch 2/5
60000/60000 [==============================] - 4s 74us/sample - loss: 0.3746 - acc: 0.8649
Epoch 3/5
60000/60000 [==============================] - 5s 77us/sample - loss: 0.3388 - acc: 0.8765
Epoch 4/5
60000/60000 [==============================] - 4s 74us/sample - loss: 0.3133 - acc: 0.8858
Epoch 5/5
60000/60000 [==============================] - 4s 73us/sample - loss: 0.2991 - acc: 0.8905
10000/10000 [==============================] - 0s 28us/sample - loss: 0.3888 - acc: 0.8607
Test loss: 0.38882760289907453, Test accuracy: 86.0700011253357

The DNN reaches about 86% accuracy on the test set. Next, we place two Conv2D + MaxPooling2D blocks in front of the same dense layers and train again to see whether the convolutions improve accuracy.
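Before running the convolutional version, it can help to see what a 3x3 convolution and a 2x2 max-pooling layer do to the tensor shape. The short sketch below is an illustrative addition (it is not part of the original lesson); it pushes one dummy image of the Fashion MNIST size through the same two layer types used in the model that follows and prints the resulting shapes.

import tensorflow as tf

# One dummy 28x28 grayscale image with a batch dimension: (1, 28, 28, 1)
dummy = tf.random.uniform((1, 28, 28, 1))

conv = tf.keras.layers.Conv2D(64, (3, 3), activation="relu")  # 64 filters, 3x3 kernels, 'valid' padding
pool = tf.keras.layers.MaxPooling2D(2, 2)                     # keeps the max of each 2x2 block

x = conv(dummy)
print(x.shape)   # (1, 26, 26, 64): a 3x3 kernel cannot be centered on the 1-pixel border
x = pool(x)
print(x.shape)   # (1, 13, 13, 64): pooling halves the height and width

Stacking a second Conv2D/MaxPooling2D pair shrinks the feature maps further, to 11x11 and then 5x5, which is exactly what the model summary in the next listing shows.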

import tensorflow as tf
print(tf.__version__)

fashion_mnist = tf.keras.datasets.fashion_mnist
(training_images, training_labels), (test_images, test_labels) = fashion_mnist.load_data()

# Reshape to (batch, height, width, channels) and normalize to [0, 1]
training_images = training_images.reshape(60000, 28, 28, 1)
training_images = training_images / 255.0
test_images = test_images.reshape(10000, 28, 28, 1)
test_images = test_images / 255.0

model = tf.keras.models.Sequential([
    tf.keras.layers.Conv2D(64, (3, 3), activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Conv2D(64, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax")
])

model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
model.summary()
model.fit(training_images, training_labels, epochs=5)

test_loss, test_accuracy = model.evaluate(test_images, test_labels)
print('Test loss: {}, Test accuracy: {}'.format(test_loss, test_accuracy * 100))
1.13.1
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d_4 (Conv2D)            (None, 26, 26, 64)        640
_________________________________________________________________
max_pooling2d_4 (MaxPooling2 (None, 13, 13, 64)        0
_________________________________________________________________
conv2d_5 (Conv2D)            (None, 11, 11, 64)        36928
_________________________________________________________________
max_pooling2d_5 (MaxPooling2 (None, 5, 5, 64)          0
_________________________________________________________________
flatten_2 (Flatten)          (None, 1600)              0
_________________________________________________________________
dense_4 (Dense)              (None, 128)               204928
_________________________________________________________________
dense_5 (Dense)              (None, 10)                1290
=================================================================
Total params: 243,786
Trainable params: 243,786
Non-trainable params: 0
_________________________________________________________________
Epoch 1/5
60000/60000 [==============================] - 75s 1ms/sample - loss: 13.4033 - acc: 0.1681
Epoch 2/5
60000/60000 [==============================] - 74s 1ms/sample - loss: 14.5063 - acc: 0.1000
Epoch 3/5
60000/60000 [==============================] - 72s 1ms/sample - loss: 14.5063 - acc: 0.1000
Epoch 4/5
60000/60000 [==============================] - 73s 1ms/sample - loss: 14.5063 - acc: 0.1000
Epoch 5/5
60000/60000 [==============================] - 73s 1ms/sample - loss: 14.5063 - acc: 0.1000
10000/10000 [==============================] - 4s 364us/sample - loss: 8.4485 - acc: 0.1000
Test loss: 8.44854741897583, Test accuracy: 10.000000149011612

Note that in the run logged above, accuracy collapsed to 10%, which is random guessing over 10 classes. In the originally posted code the division by 255 was applied to a misspelled variable, so the model was actually trained on raw, un-normalized pixel values; the listing above fixes that typo, and with normalization in place the convolutional model should comfortably beat the 86% DNN baseline.
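The parameter counts in the summary can be reproduced by hand: a Conv2D layer learns kernel_height x kernel_width x input_channels weights per filter plus one bias per filter, and a Dense layer learns inputs x units weights plus one bias per unit. The snippet below is a small added check (not from the original lesson) that recomputes the numbers in the table above.

# Conv2D: (kernel_h * kernel_w * in_channels + 1) * filters
print((3 * 3 * 1 + 1) * 64)      # 640    -> conv2d_4
print((3 * 3 * 64 + 1) * 64)     # 36928  -> conv2d_5

# Flatten: 5 * 5 * 64 = 1600 values feed the first Dense layer
# Dense: (inputs + 1) * units
print((5 * 5 * 64 + 1) * 128)    # 204928 -> dense_4
print((128 + 1) * 10)            # 1290   -> dense_5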

3 Visualizing Convolutions and Pooling

import matplotlib.pyplot as plt

f, axarr = plt.subplots(3, 4)
FIRST_IMAGE = 0
SECOND_IMAGE = 23
THIRD_IMAGE = 28
CONVOLUTION_NUMBER = 6

# Build a model that returns the output of every layer for a given input
layer_outputs = [layer.output for layer in model.layers]
activation_model = tf.keras.models.Model(inputs=model.input, outputs=layer_outputs)

# Plot feature map CONVOLUTION_NUMBER for the first four layers
# (Conv2D, MaxPooling2D, Conv2D, MaxPooling2D) of three test images
for x in range(0, 4):
    f1 = activation_model.predict(test_images[FIRST_IMAGE].reshape(1, 28, 28, 1))[x]
    axarr[0, x].imshow(f1[0, :, :, CONVOLUTION_NUMBER], cmap='inferno')
    axarr[0, x].grid(False)
    f2 = activation_model.predict(test_images[SECOND_IMAGE].reshape(1, 28, 28, 1))[x]
    axarr[1, x].imshow(f2[0, :, :, CONVOLUTION_NUMBER], cmap='inferno')
    axarr[1, x].grid(False)
    f3 = activation_model.predict(test_images[THIRD_IMAGE].reshape(1, 28, 28, 1))[x]
    axarr[2, x].imshow(f3[0, :, :, CONVOLUTION_NUMBER], cmap='inferno')
    axarr[2, x].grid(False)

[Figure: activations of filter CONVOLUTION_NUMBER across the two Conv2D and two MaxPooling2D layers, one row per selected test image]
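If you want to check which class each of the three selected test images belongs to, or pick different indices, a short added snippet like the one below will print their labels; it only relies on the test_labels array loaded earlier and the standard Fashion MNIST class names, and is not part of the original lesson.

# Class names follow the standard Fashion MNIST label order (0-9)
class_names = ['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat',
               'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']
for idx in (FIRST_IMAGE, SECOND_IMAGE, THIRD_IMAGE):
    print(idx, test_labels[idx], class_names[test_labels[idx]])

Images that share a class tend to produce similar activation patterns for the same CONVOLUTION_NUMBER, which is what grids like the one above are typically used to show.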

Source: https://codingchaozhang.blog.csdn.net/article/details/90718102
