Transfer Learning

“Storing the knowledge learned from the experience of solving problem A and applying it to solve problem B.”

Transfer Learning is one of the hottest areas of research in the field of Deep Learning. It focuses on developing general neural network models that can then be used to solve specific problems.

The whole concept of Transfer Learning is heavily inspired by human learning and adaptation: we take prior experience gained from some task A and unconsciously apply it while learning a new task B. The idea is to first train a model on more generic data, then take those pre-trained weights and build a model on top of them.

Reasons for using Transfer Learning

  1. Not having enough data for the problem.
  2. Not having enough computing power to train a model from scratch.
  3. To gain more accuracy.
  4. To speed up the whole process.

Using Transfer Learning

There are mainly two common approaches:

  1. Using a pre-trained model from a library.
  2. Developing your own model from scratch.

We will cover both approaches in this article.

Using a Pre-trained model

This is the more widely followed of the two approaches. Here we use general models designed and trained by organizations on very large datasets, such as the ImageNet dataset.

But before using any pre-designed model, you must understand the following points.

  1. Problem domain: The chosen model must belong to the same problem domain, i.e. an image classification model should be used for an X-ray image classification problem.
  2. Tuning the model: Before using a model, you must tune it according to your problem.
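A minimal sketch of these two points in Keras (assuming TensorFlow is installed): VGG16, one of the pre-trained models discussed later, is loaded without its ImageNet classification head, frozen, and given a new head for a hypothetical two-class problem. The head layers and sizes here are illustrative choices, not fixed rules.

```python
from tensorflow import keras

# 1. Load a model from the same problem domain, without its ImageNet
#    classification head. Pass weights="imagenet" to download the
#    pre-trained weights; weights=None builds the same architecture
#    with random weights (handy for offline experimentation).
base = keras.applications.VGG16(weights=None, include_top=False,
                                input_shape=(224, 224, 3))

# 2. Freeze the base so its learned features are kept as-is.
base.trainable = False

# Tune the model to the target problem by adding a new head,
# e.g. 2 classes for "fracture" vs. "no fracture" (hypothetical labels).
model = keras.Sequential([
    base,
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy")
```

With the base frozen, a call to `model.fit` would update only the new head's weights.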

Developing a model from scratch

This is not a very popular approach in transfer learning, but if you have a much deeper understanding of Deep Learning it is worth following. Before that, you must understand the following points.

  1. Select a problem domain: Build a source model for a problem for which an abundant amount of data is available.
  2. Tune the model: Tune the source model according to the target problem.

Let’s Consider Some Pre-trained Models

There are multiple problem domains in which Transfer Learning can be used. Some popular pre-trained models for popular problem domains are listed below.

Image Classification

Image classification is one of the most popular tasks in the field of Machine Learning, and a ConvNet trained on the ImageNet dataset with its 1000 class labels is a very good starting point. Some very popular pre-trained models are already on the market.

  • AlexNet
  • Oxford VGG
  • Google Inception-v3
  • Microsoft ResNet

These models are available in almost all popular machine learning libraries such as TensorFlow, Keras, and PyTorch, though the code may differ from library to library. Training these models from scratch requires weeks on powerful hardware, so building them from scratch on a local machine is next to impossible.

Text Recognition

Recognizing written text is also one of the hardest problems for Machine Learning experts, and years have been spent finding a good solution that can do both:

  • Preserve the semantic meaning of text
  • Preserve the syntactic relationships in text

In the pre-deep-learning era, solutions to this problem were not at all sophisticated, but with the advancement of Deep Learning we have become good at approximating a solution. Some of the best pre-designed models for text recognition are:

  • Google Word2Vec
  • Stanford GloVe
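Using such pre-trained word vectors mostly comes down to looking up each word's vector and comparing vectors, commonly with cosine similarity. The tiny 3-dimensional vectors below are made-up stand-ins for a real embedding file (real GloVe or Word2Vec vectors have 50-300 dimensions), used purely for illustration:

```python
import numpy as np

# Hypothetical stand-in for vectors loaded from a pre-trained
# embedding file such as GloVe; the values are invented.
embeddings = {
    "king":  np.array([0.90, 0.80, 0.10]),
    "queen": np.array([0.85, 0.82, 0.15]),
    "apple": np.array([0.10, 0.20, 0.95]),
}

def cosine(u, v):
    """Cosine similarity: 1.0 means identical direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Re-using pre-trained vectors: related words end up close together.
print(cosine(embeddings["king"], embeddings["queen"]))  # close to 1
print(cosine(embeddings["king"], embeddings["apple"]))  # much smaller
```

In a real project, these vectors would initialize the embedding layer of a downstream model instead of training it from scratch.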

How Transfer Learning works

Let’s consider a CNN trained on the ImageNet dataset with the following layers.

So if we want to use this model for fracture detection in X-ray images, all we do is drop the output layer and its features (i.e. its weights), add a new output layer with random weights, and train only those weights rather than the whole model from scratch, as in the image given below.
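The mechanics of "train only the new output layer" can be sketched in plain NumPy, with no deep learning library. The random "base" below is a hypothetical stand-in for real pre-trained convolutional layers, and the labels are synthetic; the point is that only the new head's weights receive gradient updates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "pre-trained" base: a fixed feature extractor standing in
# for layers trained on ImageNet (illustrative only, never updated).
W_base = rng.normal(size=(16, 8)) / 4.0

def features(x):
    return np.maximum(x @ W_base, 0.0)     # ReLU features from the base

# Toy target task (e.g. fracture vs. no fracture), synthetic data.
X = rng.normal(size=(200, 16))
y = (X[:, 0] > 0).astype(float)

# New output layer with random weights -- the ONLY trainable part.
w_out = rng.normal(size=8) * 0.01
b_out = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

F = features(X)                            # base runs once, stays fixed
for _ in range(500):                       # gradient descent on the head only
    p = sigmoid(F @ w_out + b_out)
    w_out -= 0.1 * (F.T @ (p - y)) / len(y)
    b_out -= 0.1 * float(np.mean(p - y))

accuracy = float(np.mean((sigmoid(F @ w_out + b_out) > 0.5) == (y == 1)))
```

Because `W_base` never changes, each training step only updates the small head, which is exactly why transfer learning is so much cheaper than training the whole network from scratch.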

Transfer learning is not yet a fully developed area of deep learning, and it remains one of the hottest areas in the field. We will be covering many more such interesting topics in Machine Learning, so stay tuned.

“Happy Machine Learning”
