
TensorFlow Basics

Date: 2018-09-27

TensorFlow is a numerical computation library built on data flow graphs and supporting automatic differentiation (AD). This article covers only the low-level API.
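As a quick illustration of the automatic differentiation mentioned above, here is a minimal sketch (not from the original article; the variable names are illustrative) that builds a scalar expression with the low-level API and lets TensorFlow derive its gradient:

import tensorflow as tf

x = tf.placeholder(tf.float32, name="x")
y = x * x + 3.0 * x                 # y = x^2 + 3x
dy_dx = tf.gradients(y, x)[0]       # autodiff builds the graph for dy/dx = 2x + 3

with tf.Session() as sess:
    print(sess.run(dy_dx, feed_dict={x: 2.0}))  # prints 7.0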

The computational-graph workflow in TensorFlow generally consists of two steps: build the computation graph, then run it in a Session. (Eager execution is not considered here.)
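The two steps are visible even in the smallest possible example (a hedged sketch, not code from the article): the Python statements only describe operations, and nothing is computed until Session.run is called.

import tensorflow as tf

# Step 1: build the graph; no computation happens here
a = tf.constant(5, name="a")
b = tf.constant(3, name="b")
c = tf.add(a, b, name="c")

# Step 2: run the graph inside a Session
with tf.Session() as sess:
    print(sess.run(c))  # prints 8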

To keep a model manageable, it is best to build it inside a dedicated Graph object, and to group modules that implement different functions under separate name_scopes. Below is a demo:

import tensorflow as tf

# Explicitly create a Graph object
graph = tf.Graph()

with graph.as_default():
    with tf.name_scope("variables"):
        # Variable to keep track of how many times the graph has been run
        global_step = tf.Variable(0, dtype=tf.int32, name="global_step")
        # Variable that keeps track of the sum of all output values over time
        total_output = tf.Variable(0.0, dtype=tf.float32, name="total_output")

    # Primary transformation Operations
    with tf.name_scope("transformation"):
        # Separate input layer
        with tf.name_scope("input"):
            # Create input placeholder - takes in a Vector
            a = tf.placeholder(tf.float32, shape=[None], name="input_placeholder_a")
        # Separate middle layer
        with tf.name_scope("intermediate_layer"):
            b = tf.reduce_prod(a, name="product_b")
            c = tf.reduce_sum(a, name="sum_c")
        # Separate output layer
        with tf.name_scope("output"):
            output = tf.add(b, c, name="output")

    with tf.name_scope("update"):
        # Increments the total_output Variable by the latest output
        update_total = total_output.assign_add(output)
        # Increments the above `global_step` Variable, should be run whenever the graph is run
        increment_step = global_step.assign_add(1)

    # Summary Operations
    with tf.name_scope("summaries"):
        avg = tf.div(update_total, tf.cast(increment_step, tf.float32), name="average")
        # Creates summaries for the output node
        tf.summary.scalar('Output', output)
        tf.summary.scalar('Sum of outputs over time', update_total)
        tf.summary.scalar('Average of outputs over time', avg)

    # Global Variables and Operations
    with tf.name_scope("global_ops"):
        # Initialization Op
        init = tf.global_variables_initializer()
        # Merge all summaries into one Operation
        merged_summaries = tf.summary.merge_all()


def run_graph(input_tensor):
    """
    Helper function; runs the graph with the given input tensor and saves summaries
    """
    feed_dict = {a: input_tensor}
    out, step, summary = sess.run([output, increment_step, merged_summaries], feed_dict=feed_dict)
    writer.add_summary(summary, global_step=step)


# Start a Session, using the explicitly created Graph
sess = tf.Session(graph=graph)

# Open a SummaryWriter to save summaries
writer = tf.summary.FileWriter('../graph/improved_graph', graph)

# Initialize Variables
sess.run(init)

# Run the graph with various inputs
run_graph([2, 8])
run_graph([3, 1, 3, 3])
run_graph([8])
run_graph([1, 2, 3])
run_graph([11, 4])
run_graph([4, 1])
run_graph([7, 3, 1])
run_graph([6, 3])
run_graph([0, 2])
run_graph([4, 5, 6])

# Write the summaries to disk
writer.flush()

# Close the SummaryWriter
writer.close()

# Close the session
sess.close()
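After the script finishes, the summaries it wrote can be inspected in TensorBoard by pointing it at the FileWriter's log directory (the path below is the one used in the demo):

tensorboard --logdir=../graph/improved_graph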
Original article: https://yq.aliyun.com/articles/646261