Colorizing black and white images on Android using TFLite

Adesh Gautam
3 min read · Nov 19, 2020


In this story we’ll see how you can deploy your TensorFlow model on an Android device using TFLite. OK, but why would you want to run your models on Android? For fun? Or for testing your model on REAL data?
Well, for me it’s both.

Training a model on a laptop using Python is a no-brainer. Gather data and boom! Using PyTorch or TensorFlow you can train it pretty easily. But what about putting your model on an Android device?

You have to understand the limitations of putting your model on a phone. First, you don’t have NumPy there, so you have less flexibility. You can’t just read an image, reshape it, do a few operations in 5–6 lines and run inference. You have to do it the Android way. It’s a different experience for an ML engineer.

The code is available in my GitHub repo.

The Android Way

I have a basic workflow for the app:

  1. Create an ImageView in the app.
  2. Add a button and attach an OnClickListener to it.
  3. In the OnClickListener, write the code to load an image from the gallery or camera into the ImageView when the button is clicked.
  4. Also in the OnClickListener, add code to load the TFLite model, get the image from the ImageView into a Bitmap object, and apply preprocessing using TensorFlow’s provided functions or your own.
  5. Feed the Bitmap to the model, get the output, and do post-processing if required. Then feed this output back into the ImageView as a Bitmap (see the sketch below).
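
For reference, here is a minimal Kotlin sketch of steps 4–5, assuming the Interpreter has already been created (see the loading sketch further down). The 224×224 input size, RGB channel layout and [0, 1] normalization are assumptions for illustration, not the exact preprocessing from the repo:

```
import android.graphics.Bitmap
import org.tensorflow.lite.Interpreter
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Hypothetical input size; use whatever your model actually expects.
const val INPUT_SIZE = 224

// Convert the Bitmap from the ImageView into a float ByteBuffer the model can read.
fun bitmapToBuffer(bitmap: Bitmap): ByteBuffer {
    val scaled = Bitmap.createScaledBitmap(bitmap, INPUT_SIZE, INPUT_SIZE, true)
    val buffer = ByteBuffer.allocateDirect(4 * INPUT_SIZE * INPUT_SIZE * 3)
        .order(ByteOrder.nativeOrder())
    val pixels = IntArray(INPUT_SIZE * INPUT_SIZE)
    scaled.getPixels(pixels, 0, INPUT_SIZE, 0, 0, INPUT_SIZE, INPUT_SIZE)
    for (p in pixels) {
        // Normalize each RGB channel to [0, 1]; adjust to your model's preprocessing.
        buffer.putFloat(((p shr 16) and 0xFF) / 255f)
        buffer.putFloat(((p shr 8) and 0xFF) / 255f)
        buffer.putFloat((p and 0xFF) / 255f)
    }
    buffer.rewind()
    return buffer
}

// Run inference and return the raw output; post-process it into a Bitmap as your model requires.
fun runModel(interpreter: Interpreter, bitmap: Bitmap): Array<Array<Array<FloatArray>>> {
    val input = bitmapToBuffer(bitmap)
    val output = Array(1) { Array(INPUT_SIZE) { Array(INPUT_SIZE) { FloatArray(3) } } }
    interpreter.run(input, output)
    return output
}
```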

Things to keep in mind

This is the simplest workflow to do inference on an image.

But our app’s workflow differs in a couple of ways. First, we have two model inputs: one for a MobileNet and one for a custom model, so we have to preprocess the image differently for each. Second, the custom model takes both the MobileNet’s output and a separate input, so it effectively has two inputs of its own. It generates a colorized image as output.

You can refer to the image below for the model architecture.

Colornet Model Architecture
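
Feeding two inputs to the custom model is done with the Interpreter’s runForMultipleInputsOutputs call. Here is a hedged sketch; the input order and buffer shapes are assumptions, not the exact ones used in the repo:

```
import org.tensorflow.lite.Interpreter
import java.nio.ByteBuffer

// Run the custom colorization model with two inputs:
// the MobileNet features and the preprocessed image.
// The input order and output shape here are assumptions.
fun runColorizer(
    interpreter: Interpreter,
    mobilenetFeatures: ByteBuffer,
    imageInput: ByteBuffer,
    output: Array<Array<Array<FloatArray>>>
) {
    val inputs = arrayOf<Any>(mobilenetFeatures, imageInput)
    val outputs = mapOf<Int, Any>(0 to output)
    interpreter.runForMultipleInputsOutputs(inputs, outputs)
}
```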

Also, the custom model takes its input in the LAB color format while the MobileNet takes RGB, but this depends on your model.
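
If you need to build the LAB input yourself, a standard per-pixel sRGB-to-CIELAB conversion (D65 white point) looks roughly like the helper below; it is illustrative and not taken from the repo:

```
import kotlin.math.pow

// Standard sRGB (0–255 per channel) to CIELAB conversion under a D65 white point.
// Whether you feed only the L channel or full LAB depends on your model.
fun rgbToLab(r: Int, g: Int, b: Int): FloatArray {
    fun linearize(c: Int): Double {
        val v = c / 255.0
        return if (v <= 0.04045) v / 12.92 else ((v + 0.055) / 1.055).pow(2.4)
    }
    val rl = linearize(r)
    val gl = linearize(g)
    val bl = linearize(b)
    // sRGB -> XYZ, normalized by the D65 reference white.
    val x = (0.4124 * rl + 0.3576 * gl + 0.1805 * bl) / 0.95047
    val y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    val z = (0.0193 * rl + 0.1192 * gl + 0.9505 * bl) / 1.08883
    fun f(t: Double) = if (t > 0.008856) t.pow(1.0 / 3.0) else 7.787 * t + 16.0 / 116.0
    val lightness = 116.0 * f(y) - 16.0
    val a = 500.0 * (f(x) - f(y))
    val bStar = 200.0 * (f(y) - f(z))
    return floatArrayOf(lightness.toFloat(), a.toFloat(), bStar.toFloat())
}
```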

Also, you’ll encounter the following things on Android, which may be new at first but are just what’s required to do inference there:

  1. TFLite Interpreter
  2. MappedByteBuffer
  3. GpuDelegate
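
These pieces typically come together like this. The sketch memory-maps the model from the app’s assets and attaches the GPU delegate (the file name colorize_model.tflite is a placeholder, and the GPU delegate needs the tensorflow-lite-gpu dependency):

```
import android.content.Context
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.gpu.GpuDelegate
import java.io.FileInputStream
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel

// Memory-map a .tflite file from the assets folder into a MappedByteBuffer.
fun loadModelFile(context: Context, assetName: String): MappedByteBuffer {
    val fd = context.assets.openFd(assetName)
    FileInputStream(fd.fileDescriptor).use { stream ->
        return stream.channel.map(
            FileChannel.MapMode.READ_ONLY, fd.startOffset, fd.declaredLength
        )
    }
}

// Build an Interpreter, attaching the GPU delegate for faster inference.
fun createInterpreter(context: Context): Interpreter {
    val options = Interpreter.Options().addDelegate(GpuDelegate())
    return Interpreter(loadModelFile(context, "colorize_model.tflite"), options)
}
```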

I hope this helps you deploy your models on an Android device.

Please click on the 👏 button if you liked the post, and hold it to give more love.

If you wish to connect:

GitHub Twitter Instagram LinkedIn
