# Use TensorFlow Lite on Nerves Livebook
```elixir
Mix.install([
  {:tflite_elixir, "~> 0.3.0"},
  {:req, "~> 0.4"},
  {:kino, "~> 0.13"}
])
```
## Introduction
TensorFlow Lite is a stripped-down version of TensorFlow, a free and open-source software library for machine learning and artificial intelligence. In Elixir, we can use TensorFlow Lite through the `tflite_elixir` package, which provides TensorFlow Lite bindings for Elixir with optional Edge TPU support.

In this notebook, we will perform image classification with the pre-trained `mobilenet_v2_1.0_224_inat_bird_quant.tflite` model. The example code below is based on the instructions in the `tflite_elixir` README. For more information, check out the `tflite_elixir` API reference.
## Decide where downloaded files are saved
```elixir
downloads_dir = "/data/tmp"
File.mkdir_p!(downloads_dir)
```
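On a Nerves device, `/data` is writable storage that persists across reboots, so anything saved here survives a restart. If you re-run this notebook often, a small helper like this hypothetical `download_unless_exists` (a sketch, not part of any library) could replace the direct `Req.get!` calls below to skip files that already exist:

```elixir
# Hypothetical convenience: fetch `url` into `path` only when the file
# is missing, streaming the body straight to disk like the cells below.
download_unless_exists = fn url, path ->
  unless File.exists?(path) do
    Req.get!(url, into: File.stream!(path))
  end

  path
end
```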
## Download pre-trained model
```elixir
model_source =
  "https://raw.githubusercontent.com/google-coral/test_data/master/mobilenet_v2_1.0_224_inat_bird_quant.tflite"

model_file = Path.join(downloads_dir, "mobilenet_v2_1.0_224_inat_bird_quant.tflite")

Req.get!(model_source, into: File.stream!(model_file))

IO.puts("Model saved to #{model_file}")
```
## Download labels
```elixir
label_source =
  "https://raw.githubusercontent.com/google-coral/test_data/master/inat_bird_labels.txt"

labels = Req.get!(label_source).body |> String.split("\n", trim: true)

Kino.DataTable.new(Enum.with_index(labels, &%{class_name: &1, class_id: &2}), name: "Labels")
```
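Each class ID predicted by the model is simply an index into this label list, so a quick spot check shows the number of classes and the first label:

```elixir
# Class IDs index into the label list: class 0 is the first line
# of inat_bird_labels.txt.
{length(labels), Enum.at(labels, 0)}
```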
## Choose an image to be classified

An input image can be uploaded here, or the default parrot image will be used.
```elixir
image_input = Kino.Input.image("Image", size: {224, 224})
```

```elixir
uploaded_image = Kino.Input.read(image_input)
```
```elixir
default_input_image_url =
  "https://raw.githubusercontent.com/google-coral/test_data/master/parrot.jpg"

input_image =
  if uploaded_image do
    # Build a tensor from the raw RGB pixel data of the upload
    uploaded_image.file_ref
    |> Kino.Input.file_path()
    |> File.read!()
    |> Nx.from_binary(:u8)
    |> Nx.reshape({uploaded_image.height, uploaded_image.width, 3})
  else
    # Fall back to the bundled parrot test image
    IO.puts("Loading default image from #{default_input_image_url}")

    Req.get!(default_input_image_url).body
    |> StbImage.read_binary!()
    |> StbImage.to_nx()
  end
```
```elixir
Kino.Image.new(input_image)
```
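If the preview looks wrong, inspect the tensor directly. Both branches above should produce a `{height, width, channels}` tensor of type `{:u, 8}`, which is what both `Kino.Image` and the preprocessing below expect:

```elixir
# Expect a {height, width, channels} shape and {:u, 8} type
{Nx.shape(input_image), Nx.type(input_image)}
```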
## Classify image
```elixir
how_many_results = 3
labels = List.to_tuple(labels)

# Resize the image to the 224x224 input size the model expects
input_nx =
  input_image
  |> StbImage.from_nx()
  |> StbImage.resize(224, 224)
  |> StbImage.to_nx()

# Run inference on the first three (RGB) channels
interpreter = TFLiteElixir.Interpreter.new!(model_file)
[output_tensor_0] = TFLiteElixir.Interpreter.predict(interpreter, input_nx[[.., .., 0..2]])

# The output holds one score per class; higher means more likely
scores_nx = Nx.flatten(output_tensor_0)

# Pick the class IDs with the highest scores
class_ids =
  scores_nx
  |> Nx.argsort(direction: :desc)
  |> Nx.take(Nx.iota({how_many_results}))
  |> Nx.to_flat_list()

class_ids
|> Enum.map(fn class_id -> %{class_id: class_id, class_name: elem(labels, class_id)} end)
|> Kino.DataTable.new(name: "Inference results")
```
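The table above only ranks the classes. Because this model is quantized, the raw output scores are `u8` values from 0 to 255; as a small extension of the cell above (a sketch reusing `scores_nx` and `class_ids`), we can show each top class's raw score next to its name:

```elixir
# Pull the raw quantized (u8, 0..255) score for each top class ID
# and show it alongside the class name.
top_scores =
  scores_nx
  |> Nx.take(Nx.tensor(class_ids))
  |> Nx.to_flat_list()

class_ids
|> Enum.zip(top_scores)
|> Enum.map(fn {id, score} ->
  %{class_id: id, class_name: elem(labels, id), score: score}
end)
|> Kino.DataTable.new(name: "Inference results with scores")
```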
## Next steps
### Run other models
You can find a variety of pre-trained open-source models on TensorFlow Hub. For Elixir code, check out the example notebooks in the `tflite_elixir` repository.
Some example notebooks require the `evision` package for OpenCV support. In that case, add `evision` to your Nerves project's `mix.exs` file and rebuild the Nerves firmware.
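For reference, the dependency entry looks like this in `mix.exs` (the version constraint is illustrative; check the `evision` docs for the one matching your setup):

```elixir
# In your Nerves project's mix.exs; the version below is illustrative.
defp deps do
  [
    # ...your existing deps...
    {:evision, "~> 0.1"}
  ]
end
```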
### Run inference on Edge TPU
You can speed up model inference by running a TensorFlow Lite model on an Edge TPU. Check out `tflite_elixir`'s "Inference on TPU" example.