
ValueError: Variable E_conv0/w/Adam/ does not exist, or was not created with tf.get_variable().

Running tf.train.AdamOptimizer(), for example with the code below:

self.EG_optimizer = tf.train.AdamOptimizer(
    learning_rate=EG_learning_rate,
    beta1=beta1
).minimize(
    loss=self.loss_EG,
    global_step=self.EG_global_step,
    var_list=self.E_variables + self.G_variables
)

raises the following error:


ValueError: Variable E_conv0/w/Adam/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?
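The likely cause (the snippets in this post are TensorFlow 1.x graph code): tf.get_variable_scope().reuse_variables() is called while no tf.variable_scope is open, so it flips the reuse flag on the root scope for the rest of graph construction. When minimize() later tries to create Adam's slot variables (such as E_conv0/w/Adam) with tf.get_variable(), the reused root scope refuses to create new variables. A minimal standalone sketch that reproduces the error (the variable names are illustrative, not taken from the original project):

import tensorflow as tf

def build(reuse=False):
    if reuse:
        # Flips the reuse flag on the ROOT scope; it stays set after this call.
        tf.get_variable_scope().reuse_variables()
    with tf.variable_scope('E_conv0'):
        w = tf.get_variable('w', shape=[3, 3, 1, 8])
    return tf.reduce_sum(tf.square(w))

loss = build(reuse=False)   # first call creates E_conv0/w
_ = build(reuse=True)       # second call shares E_conv0/w but leaves the root scope reusing

# Adam must now create the slot variable E_conv0/w/Adam under the reused
# root scope, which raises the ValueError shown above.
train_op = tf.train.AdamOptimizer(learning_rate=1e-3).minimize(loss)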

Solution:

Add the following as the first line of the function (and indent the rest of the body under it):

with tf.variable_scope("encoder") as scope:

For example, the original encoder() function was:

def encoder(self, image, reuse_variables=False):
    if reuse_variables:
        tf.get_variable_scope().reuse_variables()
    num_layers = int(np.log2(self.size_image)) - int(self.size_kernel / 2)
    current = image
    # conv layers with stride 2
    for i in range(num_layers):
        name = 'E_conv' + str(i)
        current = conv2d(
            input_map=current,
            num_output_channels=self.num_encoder_channels * (2 ** i),
            size_kernel=self.size_kernel,
            name=name
        )
        current = tf.nn.relu(current)

    # fully connected layer
    name = 'E_fc'
    current = fc(
        input_vector=tf.reshape(current, [self.size_batch, -1]),
        num_output_length=self.num_z_channels,
        name=name
    )

    # output
    return tf.nn.tanh(current)

After the modification it becomes:
def encoder(self, image, reuse_variables=False):
    with tf.variable_scope("encoder") as scope:
        if reuse_variables:
            tf.get_variable_scope().reuse_variables()
        num_layers = int(np.log2(self.size_image)) - int(self.size_kernel / 2)
        current = image
        # conv layers with stride 2
        for i in range(num_layers):
            name = 'E_conv' + str(i)
            current = conv2d(
                input_map=current,
                num_output_channels=self.num_encoder_channels * (2 ** i),
                size_kernel=self.size_kernel,
                name=name
            )
            current = tf.nn.relu(current)

        # fully connected layer
        name = 'E_fc'
        current = fc(
            input_vector=tf.reshape(current, [self.size_batch, -1]),
            num_output_length=self.num_z_channels,
            name=name
        )

        # output
        return tf.nn.tanh(current)
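
With the fix, tf.get_variable_scope() inside the with block returns the "encoder" scope, so reuse_variables() only marks that scope as reusing; the root scope stays untouched and the optimizer can create its slot variables (now named like encoder/E_conv0/w/Adam). The same minimal sketch as above, rewritten with the wrapping scope (names again illustrative), builds without the error:

import tensorflow as tf

def build(reuse=False):
    with tf.variable_scope('encoder') as scope:
        if reuse:
            # Only the 'encoder' scope is marked as reusing; the root scope is untouched.
            tf.get_variable_scope().reuse_variables()
        with tf.variable_scope('E_conv0'):
            w = tf.get_variable('w', shape=[3, 3, 1, 8])
        return tf.reduce_sum(tf.square(w))

loss = build(reuse=False)   # creates encoder/E_conv0/w
_ = build(reuse=True)       # shares encoder/E_conv0/w

# Slot creation now happens under the non-reused root scope, so no error is raised.
train_op = tf.train.AdamOptimizer(learning_rate=1e-3).minimize(loss)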