Introduction to TensorFlow
TensorFlow is a distributed computing tool that allows enormous neural networks to be trained across distributed servers. TensorFlow is a product of the Google Brain team and was first used internally at Google; it powers products such as Google Photos, Google Search, and Google Cloud Speech.
What is TensorFlow?
- TensorFlow is an open-source software library. The library is used for data flow programming across a range of tasks.
- TensorFlow is a symbolic math library, used for machine learning applications such as neural networks.
- TensorFlow is a machine learning library, well suited to large-scale machine learning.
- TensorFlow uses computational (data flow) graphs for numerical computation.
- Google uses TensorFlow for both research and production.
History of TensorFlow
DistBelief is the predecessor of TensorFlow. DistBelief was based on deep learning neural networks and was released in 2011. TensorFlow came into existence as a redesign of DistBelief. It was developed by the Google Brain team for internal use at Google and was released publicly in 2015.
Characteristics of TensorFlow
The following are the characteristics of TensorFlow:
- TensorFlow has a highly efficient C++ implementation of its machine learning operations, and it supports custom C++ operations.
- TensorFlow runs on the major operating systems (Linux, Windows, and macOS) as well as on mobile operating systems such as Android and iOS.
- High-level APIs such as TF Layers, Pretty Tensor, and Keras run on top of TensorFlow.
- For simple training routines, TensorFlow provides a simple API, TF-Slim (tensorflow.contrib.slim).
- TensorFlow provides optimization nodes, which search for the parameters that minimize a cost function.
- TensorFlow provides autodiff (automatic differentiation), which automatically computes the gradients (slopes) of cost functions.
- TensorFlow provides a visualization tool called TensorBoard, through which the user can view learning curves and the computation graph.
- For training neural networks in just a few lines of code, TensorFlow provides a small Python API called TF.Learn.
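As a small illustration of autodiff, the gradient of a cost function can be computed automatically. This is a minimal sketch assuming the TF 1.x-style API is available via `tf.compat.v1`; the cost function x² and the starting value 3.0 are chosen for illustration:

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style API inside TF 2.x
tf.disable_eager_execution()

x = tf.Variable(3.0)
cost = x * x                        # cost function: x^2
grad = tf.gradients(cost, [x])[0]   # autodiff builds d(cost)/dx = 2x

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    slope = sess.run(grad)          # 2 * 3.0 = 6.0
```

The user never writes the derivative by hand; TensorFlow derives it from the graph.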
Types of APIs that TensorFlow Supports
TensorFlow provides two kinds of APIs:
- TensorFlow Core API
- Higher-level APIs
TensorFlow Core API
The TensorFlow Core API is a low-level API used for low-level machine learning development. It gives complete programming control and a fine level of control, which makes it suitable for machine learning researchers.
Higher-level APIs
The higher-level APIs, such as tf.layers and tf.contrib.learn, are more compact and sit on top of TensorFlow Core. They are easier to learn and use than the Core API, and they make repetitive tasks smoother and more consistent between different users. The high-level APIs manage datasets, estimators, training, and inference.
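For instance, `tf.layers` builds a fully connected layer (weight matrix, matrix multiply, bias, and activation) in a single call. This is a minimal sketch using the TF 1.x-compatible API; the input width of 4 features and 2 output units are illustrative:

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

# A batch of inputs with 4 features each; the batch size is left open (None).
x = tf.placeholder(tf.float32, shape=[None, 4])

# One high-level call creates the weights, matmul, bias add, and ReLU.
hidden = tf.layers.dense(x, units=2, activation=tf.nn.relu)
```

The same layer written with the Core API would require creating and wiring the weight and bias variables by hand.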
What is a Tensor?
A tensor is the central unit of data in TensorFlow. Tensors are a generalization of scalars, vectors, and matrices to an arbitrary number of indices: a matrix has exactly two indices, a vector has one index, and a scalar has no index.
Tensors are the inputs and outputs of TensorFlow operations; in practice, a tensor is a multidimensional data array.
Mathematically, a tensor represents a physical entity whose features are described by a magnitude and multiple directions.
Tensors usually contain float values, but they can also carry strings in the form of byte arrays. Tensors travel among the nodes of a computation graph.
NumPy, a Python library used for numerical computation, represents such data as n-dimensional arrays.
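The idea of rank (the number of indices) can be seen with NumPy's n-dimensional arrays, which is also how TensorFlow represents tensor values in Python; a minimal sketch with illustrative values:

```python
import numpy as np

scalar = np.array(5.0)                        # rank 0: no index
vector = np.array([1.0, 2.0, 3.0])            # rank 1: one index
matrix = np.array([[1.0, 2.0], [3.0, 4.0]])   # rank 2: two indices
tensor = np.zeros((2, 3, 4))                  # rank 3: three indices

print(scalar.ndim, vector.ndim, matrix.ndim, tensor.ndim)  # 0 1 2 3
```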
What is TensorBoard?
For visualization, TensorFlow provides TensorBoard. TensorBoard is used to visualize the graph, to plot quantitative metrics about the graph's execution, and to show additional data, such as images, that pass through it.
Computational Graph or Data Flow Graph
In a computational graph, TensorFlow operations are arranged into a graph. TensorFlow builds a graph of the program logic in memory; that graph is known as the computational graph. TensorFlow uses this data flow graph to represent its computations.
Computational graphs make it possible to create large-scale neural networks, because the computation can be distributed among several CPUs or GPUs (Graphics Processing Units).
Nodes in the graph represent mathematical operations; a node is a unit of computation. For example, the operator '+' (addition) is a mathematical operation.
Edges in the graph represent the multidimensional data arrays, called tensors, that are consumed or produced by the operations.
Example: Consider a data flow (computational) graph that adds two input tensors, X and Y, to produce an output tensor Z. The graph contains:
- A single node corresponding to the addition operator.
- Two incoming edges indicating the inputs to the operation.
- One outgoing edge indicating the output of the computation.
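This graph can be built directly. The following is a minimal sketch assuming a TF 1.x-style environment via `tf.compat.v1`; the constant values 2.0 and 3.0 are chosen only for illustration:

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

# Two input tensors flow along the incoming edges ...
X = tf.constant(2.0, name="X")
Y = tf.constant(3.0, name="Y")

# ... into a single addition node, whose outgoing edge carries Z.
Z = tf.add(X, Y, name="Z")

with tf.Session() as sess:
    result = sess.run(Z)   # 2.0 + 3.0 = 5.0
```

Note that building the graph does not compute anything; the addition only happens when the session runs the Z node.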
The data flow programming model has several advantages: compilation, portability, parallelism, and distributed execution. TensorFlow exploits these advantages while executing a program. Let us study them briefly:
Parallelism: Explicit edges represent the dependencies between operations, so the system can easily identify the operations that can execute in parallel.
Distributed execution: Edges indicate the values (tensors) that travel through the graph, and explicit edges show the values that flow between operations. TensorFlow can partition a single program (data flow graph) among multiple devices, such as CPUs, GPUs, and TPUs attached to different machines, and it provides the necessary communication and coordination between the devices.
Portability: Data flow graphs are language-independent, so the program code they represent is language-independent too. For example, you can build a data flow graph in Python, store it in a SavedModel, and restore it in a C++ program.
Compilation: TensorFlow includes the XLA compiler, which uses the information present in the data flow graph to generate faster code, for example by fusing adjacent operations.
What is the Default Computational Graph?
When experimenting, it is very common to run the same commands repeatedly, especially when working in Jupyter or a Python shell. This results in a default graph containing many duplicate nodes. The solution is to restart the Jupyter kernel (or Python shell) or to reset the default graph.
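In the TF 1.x-style API, resetting the default graph looks as follows (a sketch; the node name "a" is arbitrary):

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

tf.constant(1.0, name="a")   # first run adds one node to the default graph
tf.constant(1.0, name="a")   # re-running the same command adds a duplicate node
before = len(tf.get_default_graph().get_operations())   # 2 nodes

tf.reset_default_graph()     # discard the cluttered default graph
after = len(tf.get_default_graph().get_operations())    # 0 nodes
```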
TensorFlow Program Elements
Let us see the elements of a TensorFlow program.
- Constant: A value that does not change.
- Placeholder: A placeholder allows a value to be assigned later.
- Variable: A value that can change; it is not fixed and may vary during execution.
- Session: A session runs the graph to evaluate the nodes. The session is also called the TensorFlow runtime.
- feed_dict parameter: The feed_dict parameter passes actual values to the placeholders at run time.
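Putting these elements together, a minimal TF 1.x-style program looks like this (a sketch via `tf.compat.v1`; the values 3.0, 2.0, and 4.0 are illustrative):

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

a = tf.constant(3.0)              # Constant: value never changes
x = tf.placeholder(tf.float32)    # Placeholder: value supplied later
w = tf.Variable(2.0)              # Variable: value may change during training

y = a + w * x                     # construction phase: build the graph

with tf.Session() as sess:        # Session: the TensorFlow runtime
    sess.run(tf.global_variables_initializer())    # initialization phase
    result = sess.run(y, feed_dict={x: 4.0})       # feed_dict fills the placeholder
    # result = 3.0 + 2.0 * 4.0 = 11.0
```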
Phases of Computational Graph
There are four phases of a computational graph:
- Construction Phase
- Execution Phase
- Initialization Phase
- Initialization and Execution Phase
Products Built Using TensorFlow
The following are some products built using TensorFlow:
1. Teachable Machine
Teachable Machine was developed by Google Creative Lab and built using TensorFlow. It uses TensorFlow.js, which lets the user train a model live in the browser with the computer's camera.
2. NSynth Super
NSynth Super was developed by Google Creative Lab and is one of the best-known TensorFlow projects. It allows the user to create new music from entirely new sounds, which are generated using the NSynth algorithm.
3. Giorgio Cam
The Giorgio Cam project also uses TensorFlow. It allows the user to create new music simply by taking pictures with a camera.
4. RankBrain
RankBrain was developed by Google. It deploys large-scale deep neural networks for search ranking on www.google.com.
RankBrain is part of Google's search algorithm: it sorts through billions of pages and finds the ones most relevant to a query.
5. Deep Speech
The Deep Speech project was developed by Mozilla. The model learns speech from spectrograms. A related project generates strokes that match the user's handwriting style; these strokes are also produced using neural networks.
Advantages of TensorFlow
The following points indicate the advantages of TensorFlow:
- Responsive constructs: TensorFlow has responsive constructs, so the user can easily visualize the graph and each of its parts.
- Platform flexibility (cross-platform): TensorFlow runs on many platforms. It is a modular library: some parts are standalone, while others are consolidated.
- Easily trainable: TensorFlow is easily trainable on CPU and GPU for distributed computing.
- Auto-differentiation capability: Auto-differentiation helps gradient-based machine learning algorithms. The user can compute the derivatives of values with respect to other values, which results in a graph extension.
- Support for threads, queues, and asynchronous computation: TensorFlow supports asynchronous computation along with threads and queues.
- Open source and customizable: TensorFlow is an open-source library and is customizable, meaning it can be modified for any suite or separate task.
Disadvantages of TensorFlow
TensorFlow has some limitations:
- GPU memory conflicts: TensorFlow can run into GPU (Graphics Processing Unit) memory conflicts when Theano is imported in the same scope; when Theano is imported in a different scope, the conflicts do not occur.
- No support for OpenCL: TensorFlow does not support OpenCL.
- Prior knowledge of machine learning, advanced calculus, and linear algebra is needed: To understand TensorFlow, the user must have a good understanding of machine learning and be comfortable with the ideas of advanced calculus and linear algebra.
Applications of TensorFlow
The following are the applications of TensorFlow:
- Speech Recognition System
- Image/ Video Recognition
- Self-Driving Cars
- Text Summarization
- Sentiment Analysis
- Google Translate