
TensorFlow Learning Notes

Result visualization (plot result)

Code

import tensorflow as tf
import matplotlib.pyplot as plt
import numpy as np

def add_layer(inputs, in_size, out_size, activation_function=None):
    # One fully connected layer: inputs * Weights + biases, with an optional activation.
    Weights = tf.Variable(tf.random_normal([in_size, out_size]))
    biases = tf.Variable(tf.zeros([1, out_size]) + 0.1)

    Wx_plus_b = tf.matmul(inputs, Weights) + biases
    if activation_function is None:
        outputs = Wx_plus_b
    else:
        outputs = activation_function(Wx_plus_b)
    return outputs


# Training data: y = x^2 - 0.5 plus Gaussian noise.
x_data = np.linspace(-1, 1, 300)[:, np.newaxis]
noise = np.random.normal(0, 0.05, x_data.shape)
y_data = np.square(x_data) - 0.5 + noise

xs = tf.placeholder(tf.float32, [None, 1])
ys = tf.placeholder(tf.float32, [None, 1])
l1 = add_layer(xs, 1, 10, activation_function=tf.nn.relu)
prediction = add_layer(l1, 10, 1, activation_function=None)


# Loss: mean squared error between prediction and the true values y_data.
loss = tf.reduce_mean(tf.reduce_sum(tf.square(ys - prediction),
                                    reduction_indices=[1]))
train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)
# Plot the raw data once, then switch on interactive mode so that
# plt.show() does not block the training loop below.
fig = plt.figure()
ax = fig.add_subplot(1, 1, 1)
ax.scatter(x_data, y_data)
plt.ion()
plt.show()

for i in range(1000):
    sess.run(train_step, feed_dict={xs: x_data, ys: y_data})
    if i % 50 == 0:
        # Remove the previous prediction curve before drawing the new one.
        try:
            ax.lines.remove(lines[0])
        except Exception:
            pass
        prediction_value = sess.run(prediction, feed_dict={xs: x_data})
        lines = ax.plot(x_data, prediction_value, 'r-', lw=5)
        plt.pause(0.1)

Code walkthrough

Here we use Python's matplotlib to plot the results: the scatter plot shows the noisy training data once, and the red prediction curve is removed and redrawn every 50 training steps, so you can watch the network gradually fit the quadratic data.
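The key trick is matplotlib's interactive mode. Below is a minimal, self-contained sketch of that dynamic-plotting pattern (assuming only numpy and matplotlib; the improving "prediction" here is faked with shrinking noise rather than coming from a real network):

import numpy as np
import matplotlib.pyplot as plt

# Draw the static scatter of the data once.
x = np.linspace(-1, 1, 300)[:, np.newaxis]
y = np.square(x) - 0.5
fig = plt.figure()
ax = fig.add_subplot(1, 1, 1)
ax.scatter(x, y)
plt.ion()    # interactive mode: plt.show() returns instead of blocking
plt.show()

lines = None
for step in range(20):
    # Hypothetical stand-in for the network's prediction, improving each step.
    y_pred = np.square(x) - 0.5 + np.random.normal(0, 0.2 / (step + 1), x.shape)
    if lines is not None:
        lines[0].remove()    # drop the previous red curve before redrawing
    lines = ax.plot(x, y_pred, 'r-', lw=5)
    plt.pause(0.1)           # let the GUI process the redraw

Without plt.ion(), plt.show() would block until the window is closed and the loop would never run; plt.pause() is what actually flushes each new frame to the screen.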
