Onnx Machine Learning in Production
September 11, 2020 · Justin Mitchel · AI · Deep Learning · Deployment · Keras · Machine Learning · onnx · Production
I recently had a project where I needed to use PyInstaller along with a Keras-trained model. Unfortunately, PyInstaller and Keras only work together some of the time... as in, not a very reliable build.
Why is this important? Interoperability
Let's say I have a Flask web app in production serving an ML model. Doing so is easy enough. Let's say the model I originally created was in Keras. Then a new member joins my team who is a pro at PyTorch. How do I deploy these two models on the same project?

Well, you could package up both tensorflow and PyTorch for running inference, but that starts to make our simple web app a bulky one and, more importantly, one that's significantly more difficult to manage.
onnx simplifies this problem by providing a standardized way to run models in production. All you have to do is export your ML model to an onnx model. Once you have that, you can easily run it in production, as you'll see below.
A real-world example
Recently, I was working on a Python project that needed to be compiled into a single executable. For this, I used PyInstaller, since it's a very reliable way to turn Python into an executable.
Unfortunately, PyInstaller doesn't play nicely with all packages and package types, and containers (like Docker) cannot (as far as I know) be compiled into a single binary.

I was having all kinds of trouble getting PyInstaller and tensorflow to compile correctly, so I decided to give onnx a try. Not only did it work, it worked incredibly reliably.
The post below will show you exactly how to convert a keras model into an onnx one and then put it into production.
Step 1. Setup a Virtual Environment

This guide uses:

Python 3.7
venv (and not my preferred pipenv)

Create the virtual environment directly in your project folder:

$ cd path/to/your/dev/folder
$ python -m venv .

Activate

Mac/Linux

source bin/activate

Windows

.\Scripts\activate
Step 2. Installations
pip install tensorflow keras2onnx onnxruntime numpy pillow
tensorflow: our machine learning framework (but using tf.keras, which is now built in to tensorflow)
keras2onnx: our conversion package (see the sketch after this list)
onnxruntime: how we run inference on onnx models in production
numpy: numerical python; common for dealing with arrays and matrices in Python & ML projects
pillow: the Python Imaging Library (PIL) fork, which makes it easy to open images within Python
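With those installed, the conversion itself is only a few lines of keras2onnx. Here's a minimal sketch, assuming your trained tf.keras model is saved as model.h5 (adjust the paths to however you saved yours):

# convert.py (a sketch)
import keras2onnx
from tensorflow import keras

# load the trained tf.keras model (assumed to be saved as model.h5)
model = keras.models.load_model("model.h5")

# convert the in-memory keras model to an onnx graph
onnx_model = keras2onnx.convert_keras(model, model.name)

# write the onnx graph to disk so onnxruntime can load it later
keras2onnx.save_model(onnx_model, "model.onnx")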
For PyTorch you can easily export a model too by using PyTorch's
built-in method outlined here
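That built-in method is torch.onnx.export. A minimal sketch (the resnet18 model and the (1, 3, 224, 224) dummy input are placeholders for your own model and input shape):

# torch_export.py (a sketch)
import torch
import torchvision

# any trained torch.nn.Module works here; a pretrained resnet is just a placeholder
model = torchvision.models.resnet18(pretrained=True)
model.eval()

# tracing requires a dummy input with the shape the model expects
dummy_input = torch.randn(1, 3, 224, 224)

torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input"],
    output_names=["output"],
)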
At this point you should have two model files:

model.h5 (keras)
model.onnx (onnx)

I recommend keeping the keras-saved model for future resumable training. An onnx model can be converted, but I don't bother with the extra step if I don't need to.
Preprocessing
# preprocessing.py
import numpy as np
from PIL import Image
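The helper itself is short: open the image with pillow, resize it, and turn it into a float array with a batch dimension. Below is a minimal sketch; the preprocess_image name, the 224x224 size, the RGB mode, the 1/255 scaling, and the float32 dtype are assumptions here, so match whatever preprocessing your model was trained with:

# preprocessing.py (continued; a sketch)
def preprocess_image(image_path, target_size=(224, 224)):
    # open the image and force 3-channel RGB
    img = Image.open(image_path).convert("RGB")
    # resize to the input size the model was trained on (assumed 224x224)
    img = img.resize(target_size)
    # convert to a float32 array and scale pixel values to [0, 1] (assumed)
    arr = np.asarray(img, dtype=np.float32) / 255.0
    # add the batch dimension: (H, W, C) -> (1, H, W, C)
    return np.expand_dims(arr, axis=0)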
Response Encoding
Below is a json encoder that converts numpy data types.
# encoding.py
import numpy as np
import json

class NumpyEncoder(json.JSONEncoder):
    """ Special json encoder for numpy types """
    def default(self, obj):
        if isinstance(obj, np.integer):
            return int(obj)
        elif isinstance(obj, np.floating):
            return float(obj)
        elif isinstance(obj, np.ndarray):
            return obj.tolist()
        return json.JSONEncoder.default(self, obj)
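Used on its own with the standard library, it looks like this:

import json
import numpy as np
from encoding import NumpyEncoder

data = {"preds": np.array([0.87, 0.13])}
# cls= tells json.dumps to fall back to our encoder for types it doesn't know
print(json.dumps(data, cls=NumpyEncoder))  # {"preds": [0.87, 0.13]}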
In Flask
# assumes: from flask import Flask, jsonify; app = Flask(__name__)
# and app.json_encoder = NumpyEncoder, so jsonify can serialize numpy types
@app.route('/numpy')
def get_numpy_response():
    data = {"preds": np.array([0.87, 0.13])}
    return jsonify(data)
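Hitting that route returns the encoded predictions as plain JSON, something like:

$ curl http://localhost:5000/numpy
{"preds": [0.87, 0.13]}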
To run inference, load the onnx model into an onnxruntime session. Caching it in a module-level variable keeps the model from being reloaded on every request:

import pathlib
import onnxruntime

ONNX_SESSION = None

def get_session():
    global ONNX_SESSION
    if ONNX_SESSION is None:
        model_path = str(pathlib.Path("model.onnx"))
        sess = onnxruntime.InferenceSession(model_path)
        ONNX_SESSION = sess
    return ONNX_SESSION
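From there, getting a prediction is just a matter of feeding the preprocessed array to the session. A minimal sketch, assuming the preprocess_image helper sketched earlier, that get_session lives in the same module, and a single-input/single-output model:

import json

from encoding import NumpyEncoder
from preprocessing import preprocess_image  # the sketch from above

def predict(image_path):
    sess = get_session()
    x = preprocess_image(image_path)            # (1, H, W, C) float32 array
    input_name = sess.get_inputs()[0].name      # name of the model's input node
    preds = sess.run(None, {input_name: x})[0]  # None = return all outputs; take the first
    return json.dumps({"preds": preds}, cls=NumpyEncoder)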
Next Steps
Now, you just need to take all of the above information and turn it into a web app or add it to a local Python project. I'll leave that to you.
Good luck!