Coral Sign Language Tutorial

Overview

This tutorial builds on the Convolution Tutorial, showing how to export an image recognition model from PerceptiLabs and transfer it to a Coral Dev Board for inference at the edge. Coral is Google's platform for on-device AI.

It also demonstrates how quantization can be used to optimize models for edge devices. The model from the Convolution Tutorial classifies pictures of sign language hand gestures representing the digits 0 through 9.

Introduction to Quantized Models

TensorFlow provides post-training conversion of models to reduce their size and increase inference speed, at the expense of a small loss in accuracy. With full integer quantization, all of the 32-bit floating-point values in the model (typically the weights and activation outputs) are converted to the nearest 8-bit fixed-point numbers. Hardware accelerators designed for fast machine learning computations, such as Coral's EdgeTPU, only support fully-quantized models.
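
PerceptiLabs performs this conversion for you when exporting (described below), but for reference, the following sketch shows roughly what full integer quantization looks like when done directly with the TensorFlow Lite converter. The SavedModel path, calibration data file, and output file name are placeholders rather than files from this tutorial.

import numpy as np
import tensorflow as tf

# Placeholder paths -- substitute your own SavedModel and calibration data.
saved_model_dir = "path/to/saved_model"
calibration_images = np.load("path/to/calibration_images.npy")

def representative_dataset():
    # Yield a handful of real samples so the converter can calibrate
    # the quantization ranges for the activations.
    for image in calibration_images[:100]:
        yield [image[np.newaxis, ...].astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Restrict the model to 8-bit integer ops with integer input/output so it
# can run entirely on an accelerator such as the EdgeTPU.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

tflite_model = converter.convert()
with open("quantized_model.tflite", "wb") as f:
    f.write(tflite_model)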

Exporting a Quantized Model

Note

Before continuing:

  • Ensure that you have PerceptiLabs 0.11.4 or higher installed; support for exporting quantized models was added in that version.

  • Ensure you have built and trained the model in PerceptiLabs as described in the Convolution Tutorial.

  • Ensure you have the data from the Convolution Tutorial on hand. You can find that data in the Sign-Language GitHub repo.

Follow the steps below to export the model in PerceptiLabs to a trained TensorFlow Lite model:

  1. Navigate to File > Export.

  2. Enter a path.

  3. Set Export as to TensorFlow Model.

  4. Enable Quantized to set full integer quantization.

  5. Click Export.

  6. Locate the exported .tflite file in the path you specified in Step 2. This will be used in subsequent steps below.
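
Optionally, you can verify on your desktop that the exported file is fully quantized by loading it with the TensorFlow Lite interpreter and checking its tensor types. The sketch below assumes the exported file is named tflite_model.tflite, matching the command used later in this tutorial.

import tensorflow as tf

# Load the exported model and inspect its input/output tensor details.
interpreter = tf.lite.Interpreter(model_path="tflite_model.tflite")
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# A fully integer-quantized model reports an 8-bit integer dtype
# (e.g., uint8) for its input and output tensors rather than float32.
print("Input dtype: ", input_details[0]["dtype"])
print("Output dtype:", output_details[0]["dtype"])
print("Input shape: ", input_details[0]["shape"])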

Preparing the Coral Dev Board and Resources

The Coral Dev Board is a single-board computer with an EdgeTPU coprocessor. Before proceeding, set up the board by following the steps in Coral's Dev Board Setup Guide, which should take around 30 minutes.

Coral currently provides two APIs for performing inference with quantized models on EdgeTPU devices: the TF Lite API and the EdgeTPU API. This tutorial uses the latter (the EdgeTPU API).

To run the inference on the Dev Board, we need the following:

  • Quantized model (.tflite file) exported from PerceptiLabs

  • Input data to run the inference

  • Python code to run the inference

  • (Optional) Label data
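
Before copying anything to the board, it can be helpful to sanity-check the input data on your desktop with NumPy. The sketch below assumes the input file is data/X.npy, matching the command used later in this tutorial.

import numpy as np

# Load the image data that will be fed to the model on the Dev Board.
dataset = np.load("data/X.npy")

# Each entry is one hand-gesture image; the shape tells you how many samples
# there are and what size each sample is before reshaping for the model.
print("Number of samples:", dataset.shape[0])
print("Sample shape:     ", dataset.shape[1:])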

Preparing the Code

Below is a sample Python wrapper class (ClassificationEngine) that loads the image data and the quantized (trained) model, and runs inference using the EdgeTPU API. Also included is a simple main() function that instantiates that class and invokes its methods.

Copy and paste all of the code into a Python script (.py file) named hand_recognizer.py.

import argparse

import numpy as np
from edgetpu.basic.basic_engine import BasicEngine


class ClassificationEngine(BasicEngine):
    """Thin wrapper around Coral's BasicEngine for running classification."""

    def __init__(self, model_path, device_path=None):
        super().__init__(model_path, device_path)

    def load_data(self, dataset_path):
        # Load the image samples (e.g., data/X.npy) to run inference on.
        self.dataset = np.load(dataset_path)

    def load_labels(self, labels_path):
        # Optionally load the label data for the samples.
        self.labels = np.load(labels_path)

    def get_all_inputs(self):
        return self.get_all_input_tensors()

    def get_all_outputs(self):
        return self.get_all_output_tensors()

    def inference_step(self):
        # Generator: reshape one sample at a time to the model's required
        # input size and yield the inference result for that sample.
        for sample in self.dataset:
            input_tensor_shape = self.required_input_array_size()
            sample = np.reshape(sample, input_tensor_shape).astype(np.uint8)
            yield self.run_inference(sample)

    def get_inference_time(self):
        # Delegate to BasicEngine (calling self here would recurse forever).
        return super().get_inference_time()

    def get_loss(self, label, output, loss_fn):
        return loss_fn(label, output)

    def required_input_array_size(self):
        # Delegate to BasicEngine (calling self here would recurse forever).
        return super().required_input_array_size()


def main():
    parser = argparse.ArgumentParser(
        formatter_class=argparse.ArgumentDefaultsHelpFormatter)
    parser.add_argument(
        '-m', '--model', required=True, help='File path of .tflite file.')
    parser.add_argument(
        '-l', '--labels', help='File path of labels file.')
    parser.add_argument(
        '-i', '--input', required=True, help='File path of input data file.')
    args = parser.parse_args()

    engine = ClassificationEngine(args.model)
    engine.load_data(args.input)
    if args.labels:
        engine.load_labels(args.labels)

    inference_step = engine.inference_step()
    while True:
        # Any non-empty input runs the next sample; an empty line exits.
        if input("continue:"):
            try:
                output = next(inference_step)
            except StopIteration:
                print("No more samples.")
                break
            print(output)
        else:
            break


if __name__ == '__main__':
    main()

ClassificationEngine derives from Coral's BasicEngine, which provides many useful methods from the EdgeTPU API, such as calculating inference time and finding output shapes.

ClassificationEngine adds the following key methods:

  • load_data(): loads the image data.

  • required_input_array_size(): returns the required input shape for the model. Using this, the input can then be adjusted to match the required input shape and subsequently fed to the model.

  • inference_step(): returns a generator that runs inference on one input sample per step.

In main(), ClassificationEngine is used to run inference on one sample at a time, and the output of the network is printed to the console.
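
If you want the predicted digit rather than the raw output, you can post-process each result with NumPy. The helper below is a sketch, assuming run_inference() returns a (latency, scores) tuple as in the EdgeTPU BasicEngine API, where scores holds one value per digit.

import numpy as np

def predicted_digit(inference_result):
    # Assumes inference_step() yields a (latency_ms, scores) tuple,
    # where 'scores' contains one value per class (the digits 0-9).
    latency_ms, scores = inference_result
    return int(np.argmax(scores)), latency_ms

# Example usage inside main()'s loop:
#     digit, latency_ms = predicted_digit(next(inference_step))
#     print("Predicted digit:", digit, "latency (ms):", latency_ms)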

Running the Code at the Edge

Once you have all the files (Python script, quantized model, and data files), send them from your local computer to the Dev Board.

Follow the steps below to run the Python script at the edge (i.e., on the Dev Board):

1. SSH into the Dev Board as described in the Coral documentation here.

2. Copy the files to the Dev Board using Coral's mdt push command.

3. Run the script on the Dev Board using the following command:

python3 hand_recognizer.py --model tflite_model.tflite --input data/X.npy

This should output an array of 10 class probabilities for the current step, predicting which digit the current hand sign image represents.

4. Enter any non-empty value to have the script continue to the next sample; press Enter on an empty line to exit.

You have now run image classification at the edge on a Coral Dev Board.
