
Saving and Loading Models with sklearn, TensorFlow, and Keras

2017-08-05 17:29
I. Saving and Loading sklearn Models

1. Saving

from sklearn.externals import joblib  # in newer sklearn versions, use "import joblib" directly
from sklearn import svm

X = [[0, 0], [1, 1]]
y = [0, 1]
clf = svm.SVC()
clf.fit(X, y)
joblib.dump(clf, "train_model.m")  # serialize the fitted estimator to disk


2. Loading

clf = joblib.load("train_model.m")
clf.predict([[0, 0]])  # pass your test feature matrix here
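
As a quick sanity check, here is a minimal round-trip sketch (same toy data as above; the file name train_model.m is just the one used in this example) showing that the reloaded estimator reproduces the original predictions:

from sklearn import svm
from sklearn.externals import joblib  # or "import joblib" on newer sklearn versions

# Train a small classifier, persist it, and reload it
X = [[0, 0], [1, 1]]
y = [0, 1]
clf = svm.SVC().fit(X, y)
joblib.dump(clf, "train_model.m")

restored = joblib.load("train_model.m")
# The restored estimator should give the same predictions as the original
assert (restored.predict(X) == clf.predict(X)).all()
print(restored.predict([[0, 0], [1, 1]]))  # expected: [0 1]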


II. Saving and Loading TensorFlow Models (this approach saves only the variables, not the whole graph, so before restoring we have to redefine the network structure ourselves.)

1. Saving

import tensorflow as tf
import numpy as np

W = tf.Variable([[1, 1, 1], [2, 2, 2]], dtype=tf.float32, name='w')
b = tf.Variable([[0, 1, 2]], dtype=tf.float32, name='b')

init = tf.global_variables_initializer()  # replaces the deprecated tf.initialize_all_variables()
saver = tf.train.Saver()
with tf.Session() as sess:
    sess.run(init)
    save_path = saver.save(sess, "save/model.ckpt")
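
If you checkpoint repeatedly during training, saver.save() also accepts a global_step argument, which appends the step number to the checkpoint prefix; this is where file names like model.ckpt-1000 in section III come from. A minimal sketch, reusing the W and b defined above (the training loop is only a placeholder):

import tensorflow as tf

W = tf.Variable([[1, 1, 1], [2, 2, 2]], dtype=tf.float32, name='w')
b = tf.Variable([[0, 1, 2]], dtype=tf.float32, name='b')

saver = tf.train.Saver(max_to_keep=3)  # keep only the 3 most recent checkpoints
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(0, 3000, 1000):
        # ... run your training ops here ...
        saver.save(sess, "save/model.ckpt", global_step=step)  # writes model.ckpt-0, model.ckpt-1000, ...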


2. Loading

import tensorflow as tf
import numpy as np

# Redefine variables with the same shapes and names as when they were saved;
# the initial values here do not matter because restore() overwrites them.
W = tf.Variable(tf.truncated_normal(shape=(2, 3)), dtype=tf.float32, name='w')
b = tf.Variable(tf.truncated_normal(shape=(1, 3)), dtype=tf.float32, name='b')

saver = tf.train.Saver()
with tf.Session() as sess:
    saver.restore(sess, "save/model.ckpt")
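
To confirm that restore() overwrites the random initial values with the saved ones (no explicit initializer is needed for restored variables), you can print the variables inside the session. A small sketch, assuming the save/model.ckpt checkpoint written in the saving step:

import tensorflow as tf

# Same shapes and names as in the saving script
W = tf.Variable(tf.truncated_normal(shape=(2, 3)), dtype=tf.float32, name='w')
b = tf.Variable(tf.truncated_normal(shape=(1, 3)), dtype=tf.float32, name='b')

saver = tf.train.Saver()
with tf.Session() as sess:
    saver.restore(sess, "save/model.ckpt")
    # The printed values are the saved ones, not the random initial values
    print(sess.run(W))  # [[1. 1. 1.] [2. 2. 2.]]
    print(sess.run(b))  # [[0. 1. 2.]]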


III. Saving and Loading TensorFlow Models (this approach saves the whole graph)

Reposted from http://www.jianshu.com/p/8487db911d9a

1. Saving

import tensorflow as tf

# First, you define your mathematical operations
# We are in the default graph scope

# Let's define some variables
v1 = tf.Variable(1., name="v1")
v2 = tf.Variable(2., name="v2")
# Let's define an operation
a = tf.add(v1, v2)

# Let's create a Saver object
# By default, the Saver handles every variable related to the default graph
all_saver = tf.train.Saver()
# But you can specify which vars you want to save, and under which name
v2_saver = tf.train.Saver({"v2": v2})

# By default the Session handles the default graph and all its included variables
with tf.Session() as sess:
    # Init v1 and v2
    sess.run(tf.global_variables_initializer())
    # Now v1 holds the value 1.0 and v2 holds the value 2.0
    # We can now save all those values
    all_saver.save(sess, 'data.chkp')
    # or save only v2
    v2_saver.save(sess, 'data-v2.chkp')
The model's weights are saved in the .chkp files, while the model's graph is saved in the .chkp.meta file.
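
To check which variables actually ended up in a checkpoint, TensorFlow 1.x offers tf.train.NewCheckpointReader. A short sketch, assuming the data.chkp prefix written above:

import tensorflow as tf

# Inspect the variables stored in a checkpoint without rebuilding the graph
reader = tf.train.NewCheckpointReader('data.chkp')
for name, shape in reader.get_variable_to_shape_map().items():
    print(name, shape, reader.get_tensor(name))
# For data.chkp this should list v1 and v2; data-v2.chkp contains only v2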


2. Loading

import tensorflow as tf

# Let's load a previously saved meta graph into the graph currently in use (usually the default graph)
# This action returns a Saver
saver = tf.train.import_meta_graph('results/model.ckpt-1000.meta')

# We can now access the default graph, where all our metadata has been loaded
graph = tf.get_default_graph()

# Finally we can retrieve tensors, operations, collections, etc.
global_step_tensor = graph.get_tensor_by_name('loss/global_step:0')
train_op = graph.get_operation_by_name('loss/train_op')
hyperparameters = tf.get_collection('hyperparameters')

Restoring the weights

Keep in mind that, in practice, the actual weight values live only inside a session. In other words, the restore operation must be run within a session, which then loads the saved weights into the graph. The easiest way to understand restoring is to think of it simply as a kind of data initialization.
with tf.Session() as sess:
    # To initialize values with saved data
    saver.restore(sess, 'results/model.ckpt-1000-00000-of-00001')
    print(sess.run(global_step_tensor))  # returns 1000
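
When checkpoints are written with a global_step, the restore path does not have to be hard-coded; tf.train.latest_checkpoint() looks up the most recent prefix from the checkpoint index file. A sketch, assuming the checkpoints live under a results/ directory:

import tensorflow as tf

# Find the most recent checkpoint prefix in the directory
latest = tf.train.latest_checkpoint('results')  # e.g. 'results/model.ckpt-1000'

# Rebuild the graph from the matching .meta file, then restore the weights into it
saver = tf.train.import_meta_graph(latest + '.meta')
with tf.Session() as sess:
    saver.restore(sess, latest)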


IV. Saving and Loading Keras Models

from keras.models import load_model

model.save('my_model.h5')          # saves architecture, weights, and optimizer state in one HDF5 file
model = load_model('my_model.h5')  # rebuilds the same model, ready for predict() or further training
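
If the single HDF5 file is more than you need, Keras can also persist the architecture and the weights separately. A minimal sketch using the standard to_json/save_weights API (file names here are just examples):

from keras.models import model_from_json

# Save architecture (JSON) and weights (HDF5) as two separate artifacts
with open('my_model.json', 'w') as f:
    f.write(model.to_json())
model.save_weights('my_model_weights.h5')

# Rebuild the model from its architecture, then load the weights into it
with open('my_model.json') as f:
    model = model_from_json(f.read())
model.load_weights('my_model_weights.h5')
# Note: unlike model.save(), this does not preserve the optimizer state,
# so the model must be compiled again before further training.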