Multi-GPU inference with TF Serving or Keras

With TF Serving, the only apparent option is to run a separate Docker container per GPU: https://github.com/tensorflow/serving/issues/311#issuecomment-480176078
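A minimal sketch of the container-per-GPU setup, assuming four GPUs; the model name `mymodel`, the model directory, and the host port scheme are placeholders, not values from the linked issue:

```shell
#!/bin/sh
# Launch one tensorflow/serving container per GPU.
# Each container is pinned to a single device via --gpus "device=N"
# (Docker 19.03+) and exposes its own host ports; the client or a
# load balancer then spreads requests across the ports.
MODEL_DIR=/models/mymodel   # placeholder path to the SavedModel

for GPU in 0 1 2 3; do
  docker run -d \
    --name "tfserving_gpu${GPU}" \
    --gpus "device=${GPU}" \
    -p "$((8500 + GPU)):8500" \
    -p "$((9500 + GPU)):8501" \
    -v "${MODEL_DIR}:/models/mymodel" \
    -e MODEL_NAME=mymodel \
    tensorflow/serving:latest-gpu
done
```

Inside each container TF Serving still listens on its defaults (8500 for gRPC, 8501 for REST); only the host-side ports differ per GPU.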

For Keras, there is a GitHub project that uses multiple processes to handle multi-GPU inference: https://github.com/yuanyuanli85/Keras-Multiple-Process-Prediction
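The multi-process idea can be sketched as below. This is an assumption-laden skeleton, not the repo's actual code: each worker pins itself to one GPU via `CUDA_VISIBLE_DEVICES` before any TensorFlow import, and the real `model.predict` call is replaced by a stand-in so the skeleton runs without a GPU:

```python
import os
from multiprocessing import Process, Queue


def gpu_worker(gpu_id, task_queue, result_queue):
    # Pin this process to a single GPU *before* importing TensorFlow,
    # so the framework only sees (and allocates memory on) that device.
    os.environ["CUDA_VISIBLE_DEVICES"] = str(gpu_id)
    # In a real worker the Keras model would be loaded here, e.g.:
    #   from tensorflow import keras
    #   model = keras.models.load_model("model.h5")  # hypothetical path
    while True:
        item = task_queue.get()
        if item is None:               # poison pill: shut down
            break
        idx, batch = item
        # result = model.predict(batch)   # real inference call
        result = [x * 2 for x in batch]   # stand-in for model.predict
        result_queue.put((idx, result))


def predict_multi_gpu(batches, gpu_ids):
    """Scatter batches across one worker process per GPU, gather in order."""
    task_q, result_q = Queue(), Queue()
    workers = [Process(target=gpu_worker, args=(g, task_q, result_q))
               for g in gpu_ids]
    for w in workers:
        w.start()
    for idx, batch in enumerate(batches):
        task_q.put((idx, batch))
    for _ in workers:                  # one poison pill per worker
        task_q.put(None)
    results = [result_q.get() for _ in batches]
    for w in workers:
        w.join()
    return [r for _, r in sorted(results)]   # restore submission order
```

Loading the model inside the worker (rather than in the parent) matters: a CUDA context cannot be shared across a fork, so each process must initialize its own GPU state.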


Original article: https://www.cnblogs.com/573177885qq/p/11906056.html