Run pycoral inside Docker. How to accelerate code in a containerised environment | by Filippo Valle | Jan, 2024

AI in a container (made with Fooocus)

How to accelerate code in a containerised environment

I wanted to run inference on Fashion MNIST using the new Google Coral accelerator.

In the realm of edge computing, the Coral Edge TPU stands out as a potent hardware accelerator for machine learning tasks. In this article, we delve into a Python script that harnesses the capabilities of Coral Edge TPU to make predictions on a test dataset. Not only that, but it also provides a visual representation of the model’s performance through a confusion matrix. Let’s break down the code and understand how Coral Edge TPU is seamlessly integrated into the workflow.


Google Coral is a TPU accelerator connected via USB.

Google coral (Image by Coral website)


The container uses Debian Buster as the base image, to ensure compatibility with all the necessary dependencies, and installs the required packages: the Coral Edge TPU packages, Python packages from a requirements.txt file, and a few key utilities. It copies two scripts to the /home/ directory; one of them is set as the default command when the container starts. The resulting image is configured for Coral Edge TPU development and application execution.

FROM debian:buster

RUN apt-get update && apt-get install --yes curl gpg
RUN echo "deb https://packages.cloud.google.com/apt coral-edgetpu-stable main" | tee /etc/apt/sources.list.d/coral-edgetpu.list
RUN curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | apt-key add -
RUN apt-get update
RUN apt-get install --yes libedgetpu1-std
RUN apt-get install --yes python3-pycoral
RUN apt-get install --yes edgetpu-compiler
RUN apt-get install --yes usbutils
RUN apt-get install --yes python3-pip
RUN python3 -m pip install --no-cache-dir -U pip

COPY requirements.txt /home/data/requirements.txt
RUN python3 -m pip install --no-cache-dir -r /home/data/requirements.txt
COPY /home/.
COPY /home/.

CMD ["/home/"]
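Assuming the Dockerfile is saved in the current directory, the image can be built and run along these lines (the image tag here is illustrative). The --privileged flag and the USB bus mount are what let the container reach the Coral USB accelerator:

```shell
# build the image (tag name is arbitrary)
docker build -t pycoral-edgetpu .

# run it, passing the host USB bus through so the Edge TPU is visible inside
docker run --rm -it --privileged -v /dev/bus/usb:/dev/bus/usb pycoral-edgetpu
```

Without the /dev/bus/usb mount, lsusb inside the container (usbutils is installed above for this purpose) will not show the Coral device.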


The training was done in a fairly standard fashion using Keras.
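The article does not show the training code or the architecture, but a minimal sketch of such a standard Keras setup might look like this (the CNN layout here is an illustrative stand-in, not the author's actual model):

```python
import tensorflow as tf

# a small CNN for 28x28 grayscale Fashion MNIST images (illustrative architecture)
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),  # 10 Fashion MNIST classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# with the real data this would be roughly:
# (X_train, Y_train), (X_test, Y_test) = tf.keras.datasets.fashion_mnist.load_data()
# model.fit(X_train.reshape(-1, 28, 28, 1) / 255.0, Y_train, epochs=5)
```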

The model was then converted to TFLite, with full integer quantisation, using the TensorFlow converter.

import tensorflow as tf

saved_keras_model = 'model.h5'

def representative_dataset():
    # assumes X_train holds the Fashion MNIST training images
    for data in tf.data.Dataset.from_tensor_slices(X_train.reshape((-1, 28, 28, 1))).batch(1).take(100):
        yield [tf.dtypes.cast(data, tf.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.target_spec.supported_types = [tf.int8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8
tflite_model = converter.convert()

with open('model.tflite', 'wb') as f:
    f.write(tflite_model)

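Before the model can run on the accelerator, model.tflite must be compiled for the Edge TPU. The edgetpu-compiler package installed in the Dockerfile above provides the CLI for this; it writes the model_edgetpu.tflite file used in the inference step below:

```shell
# maps supported ops to the Edge TPU; produces model_edgetpu.tflite
edgetpu_compiler model.tflite
```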
Run inference

The inference is performed using the tools available in the pycoral Python package.

import numpy as np

from pycoral.adapters import common
from pycoral.adapters import classify
from pycoral.utils.edgetpu import make_interpreter

X_test = np.loadtxt("X_test.txt")
Y_test = np.loadtxt("Y_test.txt")

# bind the compiled model to the first Edge TPU device
interpreter = make_interpreter('model_edgetpu.tflite', device=":0")
interpreter.allocate_tensors()
width, height = common.input_size(interpreter)

def pred(X_data):
    # load the image into the input tensor, run inference, return the top class id
    common.set_input(interpreter, X_data.reshape((width, height, 1)))
    interpreter.invoke()
    return classify.get_classes(interpreter, top_k=1)[0].id

y_pred = [pred(x_test) for x_test in X_test.reshape(-1, 28, 28, 1)]
y_real = Y_test
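With y_pred and y_real in hand, the confusion matrix mentioned in the introduction can be computed, for instance with scikit-learn (an assumed dependency here; the toy arrays below stand in for the real predictions and labels):

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# toy stand-ins for the y_real / y_pred arrays computed above
y_real = np.array([0, 1, 2, 2, 1])
y_pred = np.array([0, 1, 2, 1, 1])

# rows are true classes, columns are predicted classes
cm = confusion_matrix(y_real, y_pred)
print(cm)
```

The matrix can then be visualised, e.g. with sklearn.metrics.ConfusionMatrixDisplay, to get the plot described in the article.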


All the code to run this model is available at
