
Notes on TensorFlow

2017-08-15 14:20

1. Saving and Restoring Models

Saving and restoring models:

http://cv-tricks.com/tensorflow-tutorial/save-restore-tensorflow-models-quick-complete-tutorial/
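A minimal sketch of the save/restore workflow described in that tutorial, assuming TF 1.x graph mode; the variable and the checkpoint path ./model.ckpt are only illustrative:

import tensorflow as tf

# Build a trivial graph with one variable to save.
w = tf.Variable(tf.zeros([2, 2]), name="w")
saver = tf.train.Saver()  # by default saves all saveable variables

# --- Saving ---
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    save_path = saver.save(sess, "./model.ckpt")  # illustrative path
    print("Model saved to", save_path)

# --- Restoring ---
with tf.Session() as sess:
    saver.restore(sess, "./model.ckpt")  # no initializer run needed after restore
    print(sess.run(w))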

2. Distinguishing Variables from Tensors

Make good use of the dir() function to inspect an object's attributes and methods.

tf.layers.dense and tf.layers.conv2d both return tensorflow.python.framework.ops.Tensor, i.e. a tensor, whereas type(tf.Variable([1])) returns tensorflow.python.ops.variables.Variable.
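A quick way to confirm this distinction (a sketch assuming TF 1.x; the placeholder shape is illustrative):

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 4])
dense_out = tf.layers.dense(x, units=3)  # returns a Tensor
var = tf.Variable([1])                   # returns a tf.Variable

print(type(dense_out))  # e.g. tensorflow.python.framework.ops.Tensor
print(type(var))        # e.g. tensorflow.python.ops.variables.Variable
print(dir(var))         # dir() lists the object's attributes and methods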

When a variable is created,

import tensorflow as tf

# Create two variables.
weights = tf.Variable(tf.random_normal([784, 200], stddev=0.35), name="weights")
biases = tf.Variable(tf.zeros([200]), name="biases")


several ops (corresponding to nodes in the TF graph) are actually created:

Calling tf.Variable() adds several ops to the graph:

- A variable op that holds the variable value.

- An initializer op that sets the variable to its initial value. This is actually a tf.assign op.

- The ops for the initial value, such as the zeros op for the biases variable in the example, are also added to the graph.

The value returned by tf.Variable() is an instance of the Python class tf.Variable.

In other words, what is ultimately returned is an instance of the tf.Variable class.
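A small sketch (TF 1.x) that lists the ops added to a fresh graph by the biases variable above, so the variable, initializer, and initial-value ops can be seen directly:

import tensorflow as tf

g = tf.Graph()
with g.as_default():
    biases = tf.Variable(tf.zeros([200]), name="biases")

# Creating this one variable adds several ops: the zeros op for the
# initial value, the op holding the variable state, the Assign
# (initializer) op, and a read op.
for op in g.get_operations():
    print(op.name, op.type)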

According to the documentation:

Just like any Tensor, variables created with Variable() can be used as inputs for other Ops in the graph. Additionally, all the operators overloaded for the Tensor class are carried over to variables, so you can also add nodes to the graph by just doing arithmetic on variables.

That is, just like a tensor, a variable (an object distinct from a tensor) can be used as the input to other ops, and all the operators overloaded for the Tensor class carry over to variables, so you can do arithmetic on variables directly, as if they were tensors, as follows:

import tensorflow as tf

# Create a variable (the initial value and name here are illustrative).
w = tf.Variable(tf.random_normal([2, 2]), name="w")

# Use the variable in the graph like any Tensor.
v = tf.Variable(tf.random_normal([2, 2]), name="v")
y = tf.matmul(w, v)

# The overloaded operators are available too.
z = tf.sigmoid(w + y)

# Assign a new value to the variable with `assign()` or a related method.
# In graph mode these return ops that must be run in a session to take effect.
w_update = w.assign(w + 1.0)
w_incr = w.assign_add(1.0)


3. Debugging TensorFlow

A Practical Guide for Debugging TensorFlow Codes

https://github.com/wookayin/tensorflow-talk-debugging



Be aware that carefully dissecting a neural network is an art, so proceed patiently; start with something simple, such as watching the accuracy and loss:

I recommend to take simple and essential scalar summaries only (e.g. train/validation loss, overall accuracy, etc.), and to include debugging stuffs only on demand
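A minimal sketch of logging only the essential scalars (the loss/accuracy placeholders and the ./logs directory are illustrative), assuming TF 1.x summaries:

import tensorflow as tf

loss = tf.placeholder(tf.float32, name="loss")          # illustrative scalar
accuracy = tf.placeholder(tf.float32, name="accuracy")  # illustrative scalar

tf.summary.scalar("train/loss", loss)
tf.summary.scalar("train/accuracy", accuracy)
merged = tf.summary.merge_all()

with tf.Session() as sess:
    writer = tf.summary.FileWriter("./logs", sess.graph)  # illustrative log dir
    summary = sess.run(merged, feed_dict={loss: 0.5, accuracy: 0.9})
    writer.add_summary(summary, global_step=0)
    writer.close()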

Some other suggestions:

Get into the habit of naming tensors and variables; see the sketch after the quote below.

Use proper names (prefixed or scoped) for tensors and variables (specifying name=… to tensor/variable declaration)
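For example (a sketch; the scope and layer names are illustrative):

import tensorflow as tf

with tf.variable_scope("encoder"):
    x = tf.placeholder(tf.float32, [None, 4], name="x")
    h = tf.layers.dense(x, 8, name="fc1")

print(x.name)  # e.g. encoder/x:0 -- names are prefixed by the scope
print(h.name)  # e.g. encoder/fc1/...:0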

Learn to use tf.Print(): whenever the specified op is evaluated, the requested information is printed. tf.Print is itself an identity op with the side effect of printing data when the op is evaluated.
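A sketch of wiring tf.Print into a graph (TF 1.x; the placeholder and printed values are illustrative):

import tensorflow as tf

inp = tf.placeholder(tf.float32, [None, 3], name="inp")
# tf.Print returns `inp` unchanged but prints the listed tensors every
# time this op is evaluated.
printed = tf.Print(inp, [tf.shape(inp), tf.reduce_mean(inp)], message="inp stats: ")
y = printed * 2.0

with tf.Session() as sess:
    sess.run(y, feed_dict={inp: [[1.0, 2.0, 3.0]]})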

Debugging tricks:

Getting the ops in the graph and their corresponding tensors; see https://stackoverflow.com/questions/35336648/list-of-tensor-names-in-graph-in-tensorflow

sess.graph.get_operations() gives you a list of operations. For an op, op.name gives you the name and op.values() gives you a list of tensors it produces (in the inception-v3 model, all tensor names are the op name with a “:0” appended to it, so pool_3:0 is the tensor produced by the final pooling op.)
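A sketch that walks the default graph this way (the placeholder and conv layer are illustrative; op and tensor names depend on the model):

import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 28, 28, 1], name="x")
conv = tf.layers.conv2d(x, filters=8, kernel_size=3, name="conv1")

for op in tf.get_default_graph().get_operations():
    # op.name is the op's name; op.values() is the tuple of tensors it
    # produces, whose names are usually the op name with ":0" appended.
    print(op.name, [t.name for t in op.values()])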

To get the model's parameters, you can directly access the tensors whose names contain the string kernel or bias; see https://stackoverflow.com/questions/43244446/how-to-get-cnn-kernel-values-in-tensorflow
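For instance (a sketch; the layer name "conv1" and the input shape are illustrative):

import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 28, 28, 1], name="x")
conv = tf.layers.conv2d(x, filters=8, kernel_size=3, name="conv1")

graph = tf.get_default_graph()
kernel = graph.get_tensor_by_name("conv1/kernel:0")  # conv weights
bias = graph.get_tensor_by_name("conv1/bias:0")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    k, b = sess.run([kernel, bias])
    print(k.shape, b.shape)  # e.g. (3, 3, 1, 8) (8,)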

Getting variables

All variables can be retrieved with tf.global_variables().

All trainable variables can be retrieved with tf.trainable_variables(). To separate out variables that should not be trained, set the trainable argument when defining them, as follows:

# global_step is excluded from training because trainable=False.
global_step = tf.Variable(0, dtype=tf.int32, trainable=False)
# `loss` is the model's loss tensor, defined elsewhere.
train_op = tf.train.AdamOptimizer(learning_rate=0.001) \
    .minimize(loss, global_step=global_step)
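A sketch showing the difference between the two collections (the weights variable is illustrative):

import tensorflow as tf

global_step = tf.Variable(0, dtype=tf.int32, trainable=False, name="global_step")
weights = tf.Variable(tf.random_normal([784, 200]), name="weights")

print([v.name for v in tf.global_variables()])
# ['global_step:0', 'weights:0']
print([v.name for v in tf.trainable_variables()])
# ['weights:0']  -- global_step is excluded because trainable=False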


For more variable-related helper functions, see:

https://www.tensorflow.org/versions/r0.12/api_docs/python/state_ops/variable_helper_functions

Combined with the figure from the original post (not reproduced here), the methods above for retrieving ops/tensors and variables give a fairly complete picture.



When the weights are not explicitly initialized, the default initialization scheme is described here:

https://stackoverflow.com/questions/43284047/what-is-the-default-kernel-initializer-in-tf-layers-conv2d-and-tf-layers-dense
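According to that answer, tf.layers.conv2d and tf.layers.dense fall back to the Glorot (Xavier) uniform initializer when kernel_initializer is None. A sketch of making the choice explicit instead of relying on the default (the layer and initializer values are illustrative):

import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 4])

# Make the initialization explicit rather than relying on the default.
h = tf.layers.dense(
    x, units=8,
    kernel_initializer=tf.truncated_normal_initializer(stddev=0.1),
    bias_initializer=tf.zeros_initializer())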