Code:
import tensorflow as tf

sess = tf.Session()
check_point_path = 'variables'
saver = tf.train.import_meta_graph('variables/save_variables.ckpt.meta')
saver.restore(sess, tf.train.latest_checkpoint(check_point_path))

graph = tf.get_default_graph()
#print(graph.get_operations())
#with open('op.txt', 'a') as f:
#    f.write(str(graph.get_operations()))
op1 = graph.get_tensor_by_name('fully_connected/biases:0')
print(op1)
Use graph.get_operations() to list every operation in the graph saved in the ckpt.meta file; a tensor's name has the form 'op_name:0'.
Then use graph.get_tensor_by_name('op_name:0') to fetch the tensor itself.
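To make the 'op_name:0' convention concrete: a tensor name is the name of the operation that produces it, followed by a colon and the output index (':0' is the operation's first output). A minimal pure-Python sketch of this naming rule (the helper split_tensor_name is hypothetical, for illustration only, not a TensorFlow API):

```python
def split_tensor_name(tensor_name):
    # A tensor name has the form '<op_name>:<output_index>';
    # ':0' refers to the first output of the producing operation.
    op_name, _, index = tensor_name.rpartition(':')
    return op_name, int(index)

print(split_tensor_name('fully_connected/biases:0'))  # → ('fully_connected/biases', 0)
```

This is why get_tensor_by_name requires the ':0' suffix while graph.get_operation_by_name takes the bare operation name.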
The following code reads the saved variables (tensor names and values) from the ckpt files:
import os
import tensorflow as tf
from tensorflow.python import pywrap_tensorflow

check_point_path = 'variables'
#checkpoint_path = os.path.join(logs_train_dir, 'model.ckpt')
ckpt = tf.train.get_checkpoint_state(checkpoint_dir=check_point_path)
checkpoint_path = os.path.join('.', ckpt.model_checkpoint_path)
#print(ckpt.model_checkpoint_path)
reader = pywrap_tensorflow.NewCheckpointReader(checkpoint_path)
var_to_shape_map = reader.get_variable_to_shape_map()
for key in var_to_shape_map:
    print("tensor_name: ", key)
    #print(reader.get_tensor(key))
Method 2:
from tensorflow.python.tools.inspect_checkpoint import print_tensors_in_checkpoint_file

print_tensors_in_checkpoint_file("variables/save_variables.ckpt", tensor_name='',
                                 all_tensors=False, all_tensor_names=False)
Note: tf.train.latest_checkpoint(check_point_path) returns the path of the most recent checkpoint. It is equivalent to:

ckpt = tf.train.get_checkpoint_state(check_point_path)
ckpt.model_checkpoint_path

Do not confuse tf.train.latest_checkpoint with tf.train.get_checkpoint_state: the former returns the path string directly, while the latter returns a CheckpointState object whose model_checkpoint_path attribute holds that path.
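To see why the two calls are equivalent: both read the text-format CheckpointState file named 'checkpoint' that the Saver writes into the checkpoint directory, and latest_checkpoint essentially returns its model_checkpoint_path field. A simplified pure-Python sketch of that lookup (the real implementation parses a protobuf; the regex here is an approximation for illustration):

```python
import os
import re
import tempfile

def latest_checkpoint_sketch(check_point_path):
    # The 'checkpoint' file is a text-format CheckpointState proto whose
    # first line looks like: model_checkpoint_path: "save_variables.ckpt"
    state_file = os.path.join(check_point_path, 'checkpoint')
    with open(state_file) as f:
        match = re.search(r'model_checkpoint_path:\s*"([^"]+)"', f.read())
    return os.path.join(check_point_path, match.group(1)) if match else None

# Demo with a fake checkpoint directory.
demo_dir = tempfile.mkdtemp()
with open(os.path.join(demo_dir, 'checkpoint'), 'w') as f:
    f.write('model_checkpoint_path: "save_variables.ckpt"\n')
print(latest_checkpoint_sketch(demo_dir))  # ends with 'save_variables.ckpt'
```

This also explains why latest_checkpoint returns None when the directory contains checkpoint data but no 'checkpoint' state file.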
This concludes the article on reading graph information from a TensorFlow checkpoint. We hope it serves as a useful reference, and thank you for supporting 武林站長站.