
I'm a bit of a novice in the space. If Keras isn't for inference, what's the intended workflow?

Train with Keras and then what?



You can absolutely serve with Keras if your inference server is in Python. For instance, if you're looking for a basic solution, you can just set up a Flask app that calls `predict()` on a Keras model.
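A minimal sketch of that basic approach, with a toy model built in place for illustration (in practice you'd `keras.models.load_model(...)` your trained model); the `/predict` route and the JSON request shape here are assumptions, not a standard:

```python
import numpy as np
from flask import Flask, jsonify, request
import keras

app = Flask(__name__)

# Toy stand-in; in a real service you'd load a trained model from disk,
# e.g. keras.models.load_model("model.keras").
model = keras.Sequential([keras.Input(shape=(4,)), keras.layers.Dense(1)])

@app.route("/predict", methods=["POST"])
def predict():
    # Expect JSON like {"inputs": [[0.1, 0.2, 0.3, 0.4], ...]}
    batch = np.asarray(request.get_json()["inputs"], dtype="float32")
    preds = model.predict(batch)
    return jsonify({"predictions": preds.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```

This is fine for low traffic; for anything serious you'd put it behind a WSGI server like gunicorn rather than Flask's dev server.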

If you're looking for a high-performance solution that is entirely Python-free, then you can simply export your Keras model as a TF SavedModel and serve it via TFServing. TFServing is C++ based and works on both CPU and GPU.
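A sketch of the export step, assuming a recent Keras/TF where `model.export()` writes a SavedModel (older versions use `tf.saved_model.save(model, path)` instead); the model and directory names are made up, and TFServing expects the numbered subdirectory (`1/`) as the model version:

```python
import keras

# Toy model for illustration; substitute your trained model.
model = keras.Sequential([keras.Input(shape=(4,)), keras.layers.Dense(1)])

# Write a TF SavedModel under a version directory that TFServing can pick up.
model.export("exported/my_model/1")
```

You'd then point `tensorflow_model_server` at the parent directory, roughly `--model_name=my_model --model_base_path=/abs/path/exported/my_model`, and it serves the latest version over gRPC/REST with no Python in the loop.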


Then there are standard formats like ONNX, which let you run inference on whatever platform, language, or hardware you prefer.



