Key ideas in TensorFlow


Reading time: 15 minutes

TensorFlow is a popular Deep Learning library, backed by Google, that is used to build Deep Learning models. A few key ideas of TensorFlow are:

Tensor

The tensor is the unified data type. Each computing vertex takes zero or more tensors as input and outputs zero or more tensors.
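As a minimal sketch using the TensorFlow Python API (the concrete values are illustrative), tensors flow between ops and an op like matmul consumes tensors and produces a tensor:

```python
import tensorflow as tf

# Two constant tensors: the unified data type that flows between ops.
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[1.0, 1.0], [0.0, 1.0]])

# The matmul vertex takes two tensors as input and outputs one tensor.
c = tf.matmul(a, b)

print(c.shape, c.dtype)  # (2, 2) float32
```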

Computation Graph

Computation is modeled as a graph, hence the "Flow" in the name. Many computing vertices form a large computation graph, and the system is responsible for scheduling and distributing it. Because the graph can model general computation (allowing cycles and loops) and uses "variable" nodes instead of the dedicated parameter servers of DistBelief, it can go beyond feed-forward networks (FNNs) and easily express RNNs and other, more complex network architectures.
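The build-the-graph-first, run-it-later style can be sketched with the TensorFlow 1.x graph API (exposed as tf.compat.v1 in TensorFlow 2); the shapes and the variable name here are purely illustrative:

```python
import tensorflow as tf

# Describe the computation as a graph first, execute it later.
tf.compat.v1.disable_eager_execution()

x = tf.compat.v1.placeholder(tf.float32, shape=(None, 3))
w = tf.compat.v1.get_variable("w", shape=(3, 1))  # a "variable" vertex holds state
y = tf.matmul(x, w)  # this only adds a node; nothing runs yet

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    out = sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]})  # the runtime schedules the graph
```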

Common operation abstraction

A common abstraction of computation (ops, attributes, etc.) provides cross-language functionality: users can write computation in either Python or C++, with other languages possible down the road. The same abstractions also allow extensibility: users can define their own operations.
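This op/attribute abstraction is visible from Python; the sketch below (names are illustrative) inspects a graph and shows that every vertex is an operation with a language-independent type and attributes, the same description a C++ client would build:

```python
import tensorflow as tf

g = tf.Graph()
with g.as_default():
    a = tf.constant([1.0, 2.0])
    b = tf.constant([3.0, 4.0])
    c = tf.add(a, b)

# Each node is an op with a type and typed attributes.
for op in g.get_operations():
    print(op.name, op.type)   # e.g. Const, Const, AddV2

print(c.op.get_attr("T"))     # the dtype attribute of the add op: float32
```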

Kernel abstraction

A kernel abstraction provides cross-platform functionality: the same operation can be executed by different kernels on GPU, CPU, and other devices.
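A small sketch of how this surfaces in the Python API: tf.device requests a placement, and the runtime dispatches the same op to whichever registered kernel matches the device (the GPU branch only runs if a GPU is present):

```python
import tensorflow as tf

# Same matmul op; the runtime picks the kernel registered for the device.
with tf.device("/CPU:0"):
    a = tf.random.normal((256, 256))
    b = tf.random.normal((256, 256))
    c = tf.matmul(a, b)  # executed by the CPU kernel

if tf.config.list_physical_devices("GPU"):
    with tf.device("/GPU:0"):
        d = tf.matmul(a, b)  # same op, GPU kernel
```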

Automatic gradient computation

Given a graph and an output such as a loss, TensorFlow automatically extends the graph with nodes that compute the gradient of that output with respect to any tensor, so users do not have to derive and implement backpropagation by hand.
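A small illustration using tf.GradientTape, the gradient API in today's TensorFlow 2 (the graph API exposed the same idea as tf.gradients):

```python
import tensorflow as tf

x = tf.Variable(3.0)

# Record the forward computation, then differentiate it automatically.
with tf.GradientTape() as tape:
    y = x * x + 2.0 * x

dy_dx = tape.gradient(y, x)
print(dy_dx.numpy())  # 8.0, i.e. dy/dx = 2x + 2 evaluated at x = 3
```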

Distributed Computations

It supports distributed computation across machines, which is the killer feature of TensorFlow over platforms like Theano and Torch. Unfortunately, this part was not open sourced at the time of the initial release. Like DistBelief, it supports both data parallelism and model parallelism.
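As a rough sketch of the data-parallel style in today's open-source TensorFlow (the public tf.distribute API, which postdates and differs from the runtime described here), a strategy replicates the model across devices and splits each batch between the replicas; the tiny Keras model below is purely illustrative:

```python
import tensorflow as tf

# Data parallelism: each replica holds a copy of the model and processes
# a shard of every batch; gradients are combined across replicas.
strategy = tf.distribute.MirroredStrategy()  # multi-GPU on one machine;
# tf.distribute.MultiWorkerMirroredStrategy() spans multiple machines.

with strategy.scope():
    model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
    model.compile(optimizer="sgd", loss="mse")
```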
