serveML

Flask app designed for serving pretrained ML models. To launch serveML (an example request follows the steps below):

  1. cd to the serveML directory
  2. run: docker build --tag serve-ml .
  3. run: docker run -p 5000:5000 serve-ml
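Once the container is running, the app listens on port 5000. Below is a minimal sketch of how a client might call the service; the /predict route and the JSON payload shape are assumptions made for illustration, not the repository's documented API.

```python
import requests

# Hypothetical example: the /predict route and payload format are assumptions,
# not part of serveML's documented interface.
response = requests.post(
    "http://localhost:5000/predict",
    json={"input": [1.0, 2.0, 3.0]},
)
print(response.status_code, response.text)
```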

Note: when dill serializes an object, the serialization appears to capture only the object's own definition; in particular, an object that relies on import statements or functions defined outside of its own definition will not behave correctly once deserialized.

One workaround is to import within the definition of the object, and to define helper functions within the object itself (or assign them to attributes, in the case of a class definition), as in the sketch below.
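The following is a minimal sketch of that workaround; the ModelWrapper class and its use of numpy are illustrative only and not part of this repository.

```python
import dill

class ModelWrapper:
    """Toy model whose dependencies live entirely inside its own definition."""

    def __init__(self, coef):
        self.coef = coef

    def predict(self, x):
        # Import inside the method, not at module level, so the dependency
        # travels with the serialized object.
        import numpy as np

        # Helper defined inside the method (it could also be assigned to an
        # attribute) rather than at module scope.
        def _as_array(v):
            return np.asarray(v, dtype=float)

        return float(np.dot(_as_array(x), _as_array(self.coef)))

model = ModelWrapper(coef=[0.5, 1.5])

# Serialize and restore with dill; the restored object still works because
# everything it needs was defined inside ModelWrapper.
with open("model.dill", "wb") as f:
    dill.dump(model, f)

with open("model.dill", "rb") as f:
    restored = dill.load(f)

print(restored.predict([2.0, 4.0]))  # 7.0
```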
