Introduction
TensorFlow Serving is a flexible, high-performance serving system for machine learning models, designed for production environments. TensorFlow Serving makes it easy to deploy new algorithms and experiments, while keeping the same server architecture and APIs. TensorFlow Serving provides out-of-the-box integration with TensorFlow models, but can be easily extended to serve other types of models and data.
Setup
For detailed setup instructions, see the official documentation: https://tensorflow.github.io/serving/. A practical production walkthrough is also available at https://medium.com/zendesk-engineering/how-zendesk-serves-tensorflow-models-in-production-751ee22f0f4b
Step 1
```shell
mkdir test_dir
touch WORKSPACE
mkdir test_dir/data
# Copy the training files, test files, and vocabulary into the data directory
cp $train_files $test_files vocab test_dir/data
```
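Besides the data directory above, TensorFlow Serving expects the directory passed as --model_base_path to contain numeric version subdirectories (e.g. output/1/), each holding one exported model; the server watches this path and loads the highest version. A minimal sketch of preparing that layout (the helper name and the "output" path are illustrative, not from the original steps):

```python
import os

def make_version_dir(model_base_path, version):
    """Create the numeric version subdirectory that TensorFlow Serving
    watches under --model_base_path (e.g. output/1/)."""
    version_dir = os.path.join(model_base_path, str(version))
    os.makedirs(version_dir, exist_ok=True)
    return version_dir

# Prepare output/1/ so an exported model can be copied into it.
path = make_version_dir("output", 1)
```

The exported model files then go inside the version directory, and bumping the version number (output/2/, output/3/, ...) lets the server pick up new models without a restart.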
Example

```shell
git clone https://github.com/tensorflow/serving.git
```
Start the server
```shell
bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
  --port=9000 \
  --model_name=output \
  --model_base_path=$DIR/output
```
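The command above serves the model over gRPC on port 9000. Newer model server builds can additionally expose a REST API via --rest_api_port; a minimal client sketch against that API, assuming the server was also started with --rest_api_port=8501 and the model's default signature accepts a list of instances (the helper names here are illustrative):

```python
import json
from urllib import request

def build_predict_request(instances, signature_name="serving_default"):
    """Build the JSON body for TensorFlow Serving's REST predict endpoint."""
    return json.dumps({"signature_name": signature_name,
                       "instances": instances})

def predict(host, model_name, instances):
    """POST to /v1/models/<model_name>:predict and return the parsed response."""
    url = "http://%s/v1/models/%s:predict" % (host, model_name)
    body = build_predict_request(instances).encode("utf-8")
    req = request.Request(url, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.loads(resp.read())

# Usage (requires a running server started with --rest_api_port=8501):
# predictions = predict("localhost:8501", "output", [[1.0, 2.0, 3.0]])
```

For the gRPC port started above, a protobuf-based client is needed instead; the python_predict_client linked in the references shows that approach.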
References
- How Zendesk Serves TensorFlow Models in Production
- TensorFlow Serving (GitHub)
- TensorFlow Serving: model deployment and serving
- TensorFlow for Java
- Installing TensorFlow for Java
- https://github.com/tobegit3hub/tensorflow_template_application/tree/master/python_predict_client