TensorFlow in production for real-time predictions in a high-traffic application - how to use it?

Problem description

What is the right way to use TensorFlow for real-time predictions in a high-traffic application?

Ideally I would have a server/cluster running TensorFlow listening on a port (or ports), where I can connect from app servers and get predictions, similar to the way databases are used. Training should be done by cron jobs feeding the training data through the network to the same server/cluster.

How does one actually use TensorFlow in production? Should I build a setup where Python is running as a server and use Python scripts to get predictions? I'm still new to this, but I feel that such a script will need to open sessions etc., which is not scalable. (I'm talking about 100s of predictions/sec.)
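The per-request cost the question worries about can be avoided by loading the model once in a long-lived process and reusing it for every request. Below is a minimal sketch of that pattern (not TensorFlow Serving itself); the model path, input shape, JSON format and the use of Flask are assumptions for illustration only.

import numpy as np
import tensorflow as tf
from flask import Flask, jsonify, request

app = Flask(__name__)

# Load the model once when the server process starts, not per request
# (hypothetical path; any saved Keras model would do).
model = tf.keras.models.load_model("/models/my_model")

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body like {"features": [0.1, 0.2, ...]} (assumed format).
    features = np.asarray(request.get_json()["features"], dtype=np.float32)
    scores = model.predict(features[np.newaxis, :])  # add a batch dimension
    return jsonify({"scores": scores[0].tolist()})

if __name__ == "__main__":
    # In production this would sit behind a proper WSGI server and load
    # balancer; the point here is only the long-lived process that keeps
    # the model in memory across requests.
    app.run(host="0.0.0.0", port=5000)

App servers would then call this endpoint over HTTP, and the cron-driven retraining the question mentions would publish a new model file for the process to reload.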

Any pointer to relevant information will be highly appreciated. I could not find any.

Recommended answer

This morning, our colleagues released TensorFlow Serving on GitHub, which addresses some of the use cases that you mentioned. It is a distributed wrapper for TensorFlow that is designed to support high-performance serving of multiple models. It supports both bulk processing and interactive requests from app servers.
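For context, current versions of TensorFlow Serving load models exported in the SavedModel format, with each model version in a numbered subdirectory under the model's base path. A minimal export sketch, assuming a small Keras model and a hypothetical /models/my_model base path:

import tensorflow as tf

# A toy model standing in for whatever the cron-driven training job produces.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
# ... model.fit(...) would run here ...

# TensorFlow Serving watches the base path for new numeric version
# directories, so a retraining job can publish /models/my_model/2, /3, ...
tf.saved_model.save(model, "/models/my_model/1")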

For more information, see the basic and advanced tutorials.
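As a concrete illustration of the "interactive requests from app servers" part, here is a client-side sketch using the gRPC prediction API from the tensorflow-serving-api package. The model name, signature, input key, input shape and port are assumptions, and it presumes a tensorflow_model_server is already running and serving the model.

import grpc
import numpy as np
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

# Reuse the channel and stub across requests; each prediction is then a
# single lightweight RPC from the app server to the serving cluster.
channel = grpc.insecure_channel("localhost:8500")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

req = predict_pb2.PredictRequest()
req.model_spec.name = "my_model"                    # assumed model name
req.model_spec.signature_name = "serving_default"   # assumed signature
req.inputs["inputs"].CopyFrom(                      # assumed input key
    tf.make_tensor_proto(np.random.rand(1, 10).astype(np.float32))
)

response = stub.Predict(req, timeout=5.0)
print(response.outputs)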
