ValueError when creating a siamese network with TensorFlow

I want to use a siamese network to find out whether two inputs are the same or different. Here is a brief summary of a siamese network:
A siamese network is a network consisting of two identical neural networks with tied weights (the weights of the two networks are the same). Given two inputs X_1 and X_2, X_1 is fed to the first network and X_2 to the second network. Then, the outputs from the two networks are combined and produce an answer to the question: are the two inputs similar or different?
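The tied-weights idea above can be sketched in plain NumPy (a hypothetical toy encoder, not the model from this question): one shared weight matrix encodes both inputs, and the distance between the two encodings answers "same or different".

```python
import numpy as np

# Toy siamese sketch: one weight matrix W is shared ("tied") by both branches.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))

def encode(x):
    # Both inputs go through the same encoder, so the weights are tied by construction.
    return np.tanh(x @ W)

def distance(a, b):
    return float(np.linalg.norm(encode(a) - encode(b)))

x1 = rng.normal(size=(1, 4))
x2 = x1.copy()                 # identical input
x3 = rng.normal(size=(1, 4))   # different input

print(distance(x1, x2))        # 0.0 -- identical inputs map to identical codes
print(distance(x1, x3) > 0)    # True
```

In a real model a learned threshold or a sigmoid layer on top of the distance decides "similar" vs. "different".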
I built the following network using TensorFlow, but I am getting an error:
graph = tf.Graph()
# Add nodes to the graph
with graph.as_default():
    with tf.variable_scope('siamese_network') as scope:
        labels = tf.placeholder(tf.int32, [None, None], name='labels')
        keep_prob = tf.placeholder(tf.float32, name='question1_keep_prob')

        question1_inputs = tf.placeholder(tf.int32, [None, None], name='question1_inputs')
        question1_embedding = tf.get_variable(name='embedding', initializer=tf.random_uniform((n_words, embed_size), -1, 1))
        question1_embed = tf.nn.embedding_lookup(question1_embedding, question1_inputs)
        question1_lstm = tf.contrib.rnn.BasicLSTMCell(lstm_size)
        question1_drop = tf.contrib.rnn.DropoutWrapper(question1_lstm, output_keep_prob=keep_prob)
        question1_multi_lstm = tf.contrib.rnn.MultiRNNCell([question1_drop] * lstm_layers)
        initial_state = question1_multi_lstm.zero_state(batch_size, tf.float32)
        question1_outputs, question1_final_state = tf.nn.dynamic_rnn(question1_multi_lstm, question1_embed, initial_state=initial_state, scope='question1_siamese')
        question1_predictions = tf.contrib.layers.fully_connected(question1_outputs[:, -1], 1, activation_fn=tf.sigmoid)

        scope.reuse_variables()

        question2_inputs = tf.placeholder(tf.int32, [None, None], name='question2_inputs')
        question2_embedding = tf.get_variable(name='embedding', initializer=tf.random_uniform((n_words, embed_size), -1, 1))
        question2_embed = tf.nn.embedding_lookup(question2_embedding, question2_inputs)
        question2_lstm = tf.contrib.rnn.BasicLSTMCell(lstm_size)
        question2_drop = tf.contrib.rnn.DropoutWrapper(question2_lstm, output_keep_prob=keep_prob)
        question2_multi_lstm = tf.contrib.rnn.MultiRNNCell([question2_drop] * lstm_layers)
        question2_outputs, question2_final_state = tf.nn.dynamic_rnn(question2_multi_lstm, question2_embed, initial_state=initial_state)
        question2_predictions = tf.contrib.layers.fully_connected(question2_outputs[:, -1], 1, activation_fn=tf.sigmoid)
I am getting the following error at the second tf.nn.dynamic_rnn call:

ValueError: Variable siamese_network/rnn/multi_rnn_cell/cell_0/basic_lstm_cell/weights does not exist,
or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?
Solution: the error is raised here:

question2_outputs, question2_final_state = tf.nn.dynamic_rnn(question2_multi_lstm, question2_embed, initial_state=initial_state)

but the actual problem was this line:

question1_outputs, question1_final_state = tf.nn.dynamic_rnn(question1_multi_lstm, question1_embed, initial_state=initial_state, scope='question1_siamese')

All I had to do was remove the scope argument, and it worked fine. With the scope.reuse_variables() call you are telling TensorFlow that the variables used afterwards have already been declared and should be reused. But because the first dynamic_rnn created its variables under the extra 'question1_siamese' scope, nothing existed under the default 'rnn' scope for the second call to reuse, hence the ValueError.
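The failure mode can be mimicked with a small pure-Python analogy of tf.get_variable's reuse semantics (the store and function here are hypothetical illustrations, not TensorFlow API):

```python
# Minimal analogy of variable-scope reuse (not TensorFlow code).
store = {}  # variable name -> value, like a graph's variable store

def get_variable(name, reuse):
    if reuse:
        if name not in store:
            # This mirrors the ValueError in the question: reuse asks for an
            # existing variable, but it was created under a different scope name.
            raise ValueError("Variable %s does not exist" % name)
        return store[name]
    store[name] = object()
    return store[name]

# The first branch created its weights under a custom scope name...
get_variable("siamese_network/question1_siamese/weights", reuse=False)

# ...so reusing under the default 'rnn' scope fails, just like in the question:
try:
    get_variable("siamese_network/rnn/weights", reuse=True)
    failed = False
except ValueError as e:
    failed = True
    print(e)

# Removing the custom scope makes both branches resolve to the same name:
w1 = get_variable("siamese_network/rnn/weights", reuse=False)
w2 = get_variable("siamese_network/rnn/weights", reuse=True)
print(w1 is w2)  # True -- the two branches now share ("tie") their weights
```

Reuse only ties weights when both branches create their variables under identical names; that is why dropping scope='question1_siamese' fixes the error.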
The reason I used reuse_variables() is that a siamese network requires the same weights in both networks. – Mithun
Yes, and for the embedding, writing question2_embedding = question1_embedding would do. – user1735003
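That last suggestion relies on plain Python object sharing: assigning one name to the other makes both refer to the same variable, so the embedding weights are tied automatically. A NumPy illustration of the same idea (stand-in arrays, not the TF variables above):

```python
import numpy as np

question1_embedding = np.zeros((5, 3))     # stand-in for the embedding variable
question2_embedding = question1_embedding  # same object, not a copy

question1_embedding[0, 0] = 1.0            # an update to the shared weights...
print(question2_embedding[0, 0])           # 1.0 -- ...is visible through both names
print(np.shares_memory(question1_embedding, question2_embedding))  # True
```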