Use TensorFlow Lite on Nerves Livebook

Mix.install([
  {:tflite_elixir, "~> 0.3.0"},
  {:req, "~> 0.3.0"},
  {:progress_bar, "~> 2.0.0"},
  {:kino, "~> 0.9.0"}
])

Introduction

TensorFlow Lite is a stripped-down version of TensorFlow, a free and open-source software library for machine learning and artificial intelligence.

In Elixir, we can use TensorFlow Lite through the tflite_elixir package, which provides TensorFlow Lite Elixir bindings with optional Edge TPU support.

In this notebook, we will perform image classification with the pre-trained mobilenet_v2_1.0_224_inat_bird_quant.tflite model. The example code below is based on the instructions in the tflite_elixir README. For more information, check out the tflite_elixir API reference.

Prepare helper functions

defmodule Utils do
  # Downloads the file at `source_url` and returns the response body,
  # rendering a progress bar while the body is streamed in. Extra options
  # are forwarded to `Req.get!/2`.
  def download!(source_url, req_options \\ []) do
    Req.get!(source_url, [finch_request: &finch_request/4] ++ req_options).body
  end

  # Streams the response through Finch so each chunk can be observed as it
  # arrives, instead of waiting for the whole body.
  defp finch_request(req_request, finch_request, finch_name, finch_options) do
    acc = Req.Response.new()

    case Finch.stream(finch_request, finch_name, acc, &handle_message/2, finch_options) do
      {:ok, response} -> {req_request, response}
      {:error, exception} -> {req_request, exception}
    end
  end

  defp handle_message({:status, status}, response), do: %{response | status: status}

  # Reads the expected total size from the content-length header so the
  # progress bar knows what 100% means.
  defp handle_message({:headers, headers}, response) do
    {_, total_size} = Enum.find(headers, &match?({"content-length", _}, &1))

    response
    |> Map.put(:headers, headers)
    |> Map.put(:private, %{total_size: String.to_integer(total_size), downloaded_size: 0})
  end

  # Appends each chunk to the body and re-renders the progress bar.
  defp handle_message({:data, data}, response) do
    new_downloaded_size = response.private.downloaded_size + byte_size(data)
    ProgressBar.render(new_downloaded_size, response.private.total_size, suffix: :bytes)

    response
    |> Map.update!(:body, &(&1 <> data))
    |> Map.update!(:private, &%{&1 | downloaded_size: new_downloaded_size})
  end
end

Decide where downloaded files are saved

downloads_dir = "/data/tmp"
File.mkdir_p!(downloads_dir)
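
On a Nerves device, /data is the writable application data partition. If you try this notebook on a host machine instead, /data may not exist; a hedged fallback like the following (an addition, not part of the original sample) keeps the rest of the notebook working:

# Hypothetical fallback for non-Nerves hosts: if /data/tmp cannot be
# created, fall back to a directory under the system temp dir.
downloads_dir =
  case File.mkdir_p("/data/tmp") do
    :ok -> "/data/tmp"
    {:error, _posix} -> Path.join(System.tmp_dir!(), "tflite_downloads")
  end

File.mkdir_p!(downloads_dir)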

Download pre-trained model

model_source =
  "https://raw.githubusercontent.com/google-coral/test_data/master/mobilenet_v2_1.0_224_inat_bird_quant.tflite"

model_file = Path.join(downloads_dir, "mobilenet_v2_1.0_224_inat_bird_quant.tflite")
Utils.download!(model_source, output: model_file)
IO.puts("Model saved to #{model_file}")

Download labels

label_source =
  "https://raw.githubusercontent.com/google-coral/test_data/master/inat_bird_labels.txt"

labels = String.split(Utils.download!(label_source), "\n", trim: true)
Kino.DataTable.new(Enum.with_index(labels, &%{class_name: &1, class_id: &2}), name: "Labels")

Choose image to be classified

You can upload an input image here; otherwise, the default parrot image will be used.

image_input = Kino.Input.image("Image", size: {224, 224})
uploaded_image = Kino.Input.read(image_input)

default_input_image_url =
  "https://raw.githubusercontent.com/google-coral/test_data/master/parrot.jpg"

input_image =
  if uploaded_image do
    # Build a tensor from the raw pixel data
    uploaded_image.data
    |> Nx.from_binary(:u8)
    |> Nx.reshape({uploaded_image.height, uploaded_image.width, 3})
  else
    IO.puts("Loading default image from #{default_input_image_url}")

    Utils.download!(default_input_image_url)
    |> StbImage.read_binary!()
    |> StbImage.to_nx()
  end

Kino.Image.new(input_image)
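
Whichever branch produced it, input_image is now a u8 tensor of shape {height, width, channels}. A quick inspection cell (an addition to the original sample) can catch shape surprises before inference:

# Inspect the input tensor's shape and element type.
{Nx.shape(input_image), Nx.type(input_image)}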

Classify image

how_many_results = 3

# Convert the label list to a tuple for constant-time lookup by class id
labels = List.to_tuple(labels)

# Resize the image to the 224x224 input size the model expects
input_nx =
  input_image
  |> StbImage.from_nx()
  |> StbImage.resize(224, 224)
  |> StbImage.to_nx()

interpreter = TFLiteElixir.Interpreter.new!(model_file)

# Keep only the first three channels, dropping the alpha channel if present
[output_tensor_0] = TFLiteElixir.Interpreter.predict(interpreter, input_nx[[.., .., 0..2]])

# The output holds one quantized score per class; flatten it to a vector
scores_nx = Nx.flatten(output_tensor_0)

# Sort the scores in descending order and keep the top class ids
class_ids =
  scores_nx
  |> Nx.argsort(direction: :desc)
  |> Nx.take(Nx.iota({how_many_results}))
  |> Nx.to_flat_list()

class_ids
|> Enum.map(fn class_id -> %{class_id: class_id, class_name: elem(labels, class_id)} end)
|> Kino.DataTable.new(name: "Inference results")
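
Beyond the class ids, the raw scores are still available in scores_nx, so you can also show how strongly the model favors each of its top picks. A minimal sketch (an addition to the original sample; the quantized model emits u8 scores, where higher means more likely):

# Pair each top class id with its raw quantized score from the output.
top_scores = Nx.to_flat_list(Nx.take(scores_nx, Nx.tensor(class_ids)))

class_ids
|> Enum.zip(top_scores)
|> Enum.map(fn {class_id, score} ->
  %{class_id: class_id, class_name: elem(labels, class_id), score: score}
end)
|> Kino.DataTable.new(name: "Inference results with scores")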

Next steps

Run other models

You can find a variety of pre-trained open-source models on TensorFlow Hub. For Elixir code, check out the example notebooks in the tflite_elixir repository.

Some example notebooks require the evision package for OpenCV support; if so, add it to your Nerves project’s mix.exs file and rebuild your Nerves firmware.
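
The dependency entry could look roughly like this (the version requirement below is illustrative; check hex.pm for the release matching your setup):

# In your Nerves project's mix.exs; the version requirement is an assumption.
defp deps do
  [
    # ...existing deps...
    {:evision, "~> 0.1"}
  ]
end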

Run inference on Edge TPU

You can speed up model inference by running a TensorFlow Lite model on the Edge TPU. Check out tflite_elixir’s “Inference on TPU” example.
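
As a starting point, tflite_elixir ships a TFLiteElixir.Coral module for working with Coral Edge TPUs. A minimal, hedged sketch (the function name follows the tflite_elixir docs; defer to the linked example for the full TPU interpreter setup):

# List any Edge TPU devices visible to the runtime before attempting
# TPU-backed inference. Building the TPU-backed interpreter itself is
# covered in the "Inference on TPU" example.
TFLiteElixir.Coral.edge_tpu_devices()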