Tensors are the central unit of data in TensorFlow. Tensors are similar to NumPy arrays; conceptually, we can think of them as the n-dimensional generalization of matrices. A zero-dimensional tensor is a scalar (a single constant). A one-dimensional tensor is a list or vector. A two-dimensional tensor is the same as an n x m matrix, where n is the number of rows and m the number of columns. Beyond that, we simply speak of n-dimensional tensors. For example,
a = 3 (treated as a 0-dimensional tensor, or scalar)
a = [3,5] (treated as a 1-dimensional tensor, or vector)
a = [[3,5],[1,1]] (treated as a 2-dimensional tensor, or matrix)
These tensors are passed to operations that perform computations on them.
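As a quick sketch of these ranks in code (assuming the import tensorflow as tf used later in this post), we can wrap each example in tf.constant and inspect its shape:
# A minimal sketch: tensors of different ranks created with tf.constant
import tensorflow as tf

scalar = tf.constant(3)                  # 0-D tensor (scalar)
vector = tf.constant([3, 5])             # 1-D tensor (vector)
matrix = tf.constant([[3, 5], [1, 1]])   # 2-D tensor (matrix)

print(scalar.shape, vector.shape, matrix.shape)
# Output (), (2,), (2, 2)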
We can visualize a computational graph like this.

We’ll code it in the following section.
Constants and Running a Session:
import tensorflow as tf
print(tf.__version__)
We will define two constant tensors a and b with tf.constant, holding the values 5 and 3, and add them with tf.add, as shown in the computational graph.
a = tf.constant(5, name="a")
b = tf.constant(3, name="b")
result = tf.add(a, b, name='add_a_b')
result
# Output <tf.Tensor 'add_a_b:0' shape=() dtype=int32>
Unfortunately, our code has not produced the expected output. We can think of TensorFlow core programs as having two distinct phases: first we define a computational graph that specifies the computations we want to perform, and then we run that graph to get the actual results. Here we have defined the computational graph, but we have not run it yet.
To evaluate result and get the output, we have to run the code inside a session. A session takes a computational graph (or part of one) and executes it; it also holds the intermediate values and results of the computation. We can create an instance of a session object from the tf.Session class.
The following code creates a session and evaluates the output.
sess = tf.Session()
sess.run(result)
# Output
8
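Since a session holds resources, it is good practice to release it when we are done. As a small sketch using the same TensorFlow 1.x API as above, we can run the graph with tf.Session as a context manager so the session is closed automatically:
# Running the same graph with a context manager; the session
# is closed automatically when the block exits
with tf.Session() as sess:
    output = sess.run(result)   # executes the graph and fetches the value of result
    print(output)               # prints 8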