CNN Example

For the complete machine learning algorithm series, see the original link.
MNIST
1 Data
1.1 Get data
Use pandas to read the data from the .csv file:
import pandas as pd

df = pd.read_csv('../DATA/train.csv')
labels = df.as_matrix(columns=['label'])        # extract the label column as a matrix
dataset = df.drop('label', axis=1).as_matrix()  # drop the label column, keep only the pixel values
print('Got', len(dataset), 'training examples (with', len(labels), 'labels).')

Got 4200 training examples (with 4200 labels).
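Note: DataFrame.as_matrix was deprecated and later removed in newer pandas releases. On a recent pandas the same arrays can be obtained with to_numpy(); a minimal equivalent sketch:

import pandas as pd

df = pd.read_csv('../DATA/train.csv')
labels = df[['label']].to_numpy()                # same shape as as_matrix(columns=['label'])
dataset = df.drop('label', axis=1).to_numpy()    # pixel columns only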
Split the dataset into training and validation sets:

train_len = int(len(labels.ravel()) * 0.75)
train_dataset = dataset[:train_len]
train_labels = labels[:train_len]
valid_dataset = dataset[train_len:]
valid_labels = labels[train_len:]
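For reference, scikit-learn can produce the same 75/25 split in one call. This is only an alternative sketch, not used in the rest of the post; shuffle=False reproduces the deterministic "first 75% / last 25%" split above:

from sklearn.model_selection import train_test_split

train_dataset, valid_dataset, train_labels, valid_labels = train_test_split(
    dataset, labels, test_size=0.25, shuffle=False)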
1.2 Plot
We can use matplotlib to get a better understanding of the data:
import numpy as np
import matplotlib.pyplot as plt

def display_sample(dataset, rows=4, columns=5):
    index = 1
    for image in dataset[:rows * columns]:
        img = np.reshape(image, [28, 28])   # each row is a flattened 28x28 image
        plt.subplot(rows, columns, index)
        plt.axis('off')
        plt.imshow(img)
        index += 1
    plt.show()

display_sample(train_dataset)
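A small variation of this helper also shows each image's label as the subplot title. This is an illustrative sketch, not part of the original code:

def display_sample_with_labels(dataset, labels, rows=4, columns=5):
    for index, (image, label) in enumerate(zip(dataset[:rows * columns],
                                               labels[:rows * columns]), start=1):
        plt.subplot(rows, columns, index)
        plt.axis('off')
        plt.title(int(label[0]))                      # the digit class for this image
        plt.imshow(np.reshape(image, [28, 28]))
    plt.show()

display_sample_with_labels(train_dataset, train_labels)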
1.3 Data validation
def count_each_label(labels):
    counters = np.zeros(10, int)
    for i in range(len(labels)):
        counters[labels[i]] += 1
    for i in range(10):
        print(i, ':', counters[i])
    print('\nmin:\t%d' % np.min(counters))
    print('mean:\t%d' % np.mean(counters))
    print('max:\t%d' % np.max(counters))
    print('stddev:\t%.2f' % np.std(counters))

count_each_label(train_labels)
0 : 319
1 : 354
2 : 343
3 : 286
4 : 310
5 : 301
6 : 320
7 : 318
8 : 295
9 : 304
min:    286
mean:  315
max:    354
stddev: 19.84
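As an aside, the same per-class counts can be computed more compactly with NumPy; an alternative sketch, not part of the original code:

counters = np.bincount(train_labels.ravel(), minlength=10)   # occurrences of each digit 0-9
for digit, count in enumerate(counters):
    print(digit, ':', count)
print('min: %d  mean: %d  max: %d  stddev: %.2f'
      % (counters.min(), counters.mean(), counters.max(), counters.std()))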
2 Logistic Regression
First, use an off-the-shelf classifier to get a rough baseline result:
def off_the_shelf():
    from sklearn.linear_model import LogisticRegression
    logreg = LogisticRegression(solver='sag', max_iter=256)
    %time logreg.fit(train_dataset, train_labels.ravel())
    print('Acc on train dataset: {:.2%}'.format(logreg.score(train_dataset, train_labels)))
    print('Acc on valid dataset: {:.2%}'.format(logreg.score(valid_dataset, valid_labels)))

off_the_shelf()
C:\Users\htfenght\Anaconda3\lib\site-packages\sklearn\linear_model\sag.py:286: ConvergenceWarning: The max_iter was reached which means the coef_ did not converge
  "the coef_ did not converge", ConvergenceWarning)
Wall time: 1min 11s
Acc on train dataset: 99.30%
Acc on valid dataset: 86.57%
One-hot encoding
Convert the integer labels into one-hot vectors before training the network:
def one_hot_encode(labels, classes=10):
    one_hot = np.zeros([len(labels), classes])
    for i in range(len(labels)):
        one_hot[i, labels[i]] = 1.
    return one_hot

train_labels = one_hot_encode(train_labels)
valid_labels = one_hot_encode(valid_labels)
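The same encoding can also be written as a one-liner using NumPy indexing; an equivalent sketch, not from the original post:

def one_hot_encode_np(labels, classes=10):
    # index an identity matrix with the label values: row i is the one-hot vector for labels[i]
    return np.eye(classes)[labels.ravel()]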
3 Multilayer Convolutional Network
3.1 Model
import tensorflow as tf

class TFModel():
    def __init__(self):
        def weight_variable(shape):
            return tf.Variable(tf.truncated_normal(shape, stddev=0.1))

        def bias_variable(shape):
            return tf.Variable(tf.constant(0.1, shape=shape))

        def conv2d(x, W):
            return tf.nn.conv2d(x, W, strides=[1, 1, 1, 1], padding='SAME')

        def max_pool_2x2(x):
            return tf.nn.max_pool(x, ksize=[1, 2, 2, 1],
                                  strides=[1, 2, 2, 1], padding='SAME')

        # feed dictionary entries needed
        self.x = tf.placeholder(tf.float32, shape=[None, 784])
        self.t = tf.placeholder(tf.float32, shape=[None, 10])
        self.keep_prob = tf.placeholder(tf.float32)
        # reshape inputs
        x_img = tf.reshape(self.x, [-1, 28, 28, 1])
        # first convolutional layer
        W_conv1 = weight_variable([5, 5, 1, 32])
        b_conv1 = bias_variable([32])
        h_conv1 = tf.nn.relu(conv2d(x_img, W_conv1) + b_conv1)
        h_pool1 = max_pool_2x2(h_conv1)          # 28x28 -> 14x14
        # second convolutional layer
        W_conv2 = weight_variable([5, 5, 32, 64])
        b_conv2 = bias_variable([64])
        h_conv2 = tf.nn.relu(conv2d(h_pool1, W_conv2) + b_conv2)
        h_pool2 = max_pool_2x2(h_conv2)          # 14x14 -> 7x7, 64 feature maps
        # fully connected layer
        W_fc1 = weight_variable([7 * 7 * 64, 1024])
        b_fc1 = bias_variable([1024])
        h_pool2_flat = tf.reshape(h_pool2, [-1, 7 * 7 * 64])
        h_fc1 = tf.nn.relu(tf.matmul(h_pool2_flat, W_fc1) + b_fc1)
        # dropout layer
        h_fc1_drop = tf.nn.dropout(h_fc1, self.keep_prob)
        # readout layer
        W_fc2 = weight_variable([1024, 10])
        b_fc2 = bias_variable([10])
        logits = tf.matmul(h_fc1_drop, W_fc2) + b_fc2
        # output
        self.y = tf.nn.softmax(logits)
        cross_entropy = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=self.t)
        cost = tf.reduce_mean(cross_entropy)
        correct_prediction = tf.equal(tf.argmax(self.y, 1), tf.argmax(self.t, 1))
        self.accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
3.2 Training
def shuffle(a, b):
    p = np.random.permutation(len(a))
    return a[p], b[p]
# help(np.random.permutation)

import os

save_dir = 'save'
if not tf.gfile.Exists(save_dir):
    tf.gfile.MakeDirs(save_dir)
    print('save directory created')
else:
    print('save directory contains:', os.listdir(save_dir))
save_path = os.path.join(save_dir, 'model.ckpt')

save directory created
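The post stops before the actual training loop. The following is only a sketch of how the graph above could be trained, assuming TFModel is additionally given a train step inside __init__ (self.train_step = tf.train.AdamOptimizer(1e-4).minimize(cost), not shown in the code above); the batch size, epoch count and dropout keep probability here are assumptions, not values from the original:

# Sketch only: assumes TFModel also defines
#     self.train_step = tf.train.AdamOptimizer(1e-4).minimize(cost)
model = TFModel()
saver = tf.train.Saver()
batch_size = 50

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for epoch in range(5):
        X, Y = shuffle(train_dataset, train_labels)        # reshuffle each epoch
        for i in range(0, len(X), batch_size):
            sess.run(model.train_step, feed_dict={
                model.x: X[i:i + batch_size],
                model.t: Y[i:i + batch_size],
                model.keep_prob: 0.5})                     # dropout active during training
        acc = sess.run(model.accuracy, feed_dict={
            model.x: valid_dataset,
            model.t: valid_labels,
            model.keep_prob: 1.0})                         # no dropout at evaluation time
        print('epoch %d, validation accuracy %.4f' % (epoch, acc))
    saver.save(sess, save_path)                            # checkpoint to save/model.ckpt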
Results
