58 Tech
Dec 21, 2021 · Artificial Intelligence
dl_inference: Open‑Source Deep Learning Inference Service with TensorRT and MKL Acceleration
dl_inference is an open‑source, production‑grade deep learning inference service that deploys TensorFlow, PyTorch, and Caffe models on both GPU and CPU. It provides TensorRT acceleration on GPU, Intel MKL acceleration on CPU, and multi‑node load balancing, along with an extensive Q&A covering model conversion, hardware requirements, INT8 quantization, and measured performance gains.
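The project's actual client API is not shown in this teaser. Purely as an illustration of the multi‑framework idea, here is a minimal Python dispatch sketch; every name below (the backend functions, the `infer` entry point) is hypothetical and not part of dl_inference:

```python
# Hypothetical sketch of framework-based dispatch, in the spirit of a
# multi-framework inference service. All names are invented; the real
# dl_inference API may look entirely different.

def tensorflow_predict(inputs):
    # Placeholder standing in for a TensorFlow SavedModel session run.
    return [x * 2 for x in inputs]

def pytorch_predict(inputs):
    # Placeholder standing in for a TorchScript forward pass.
    return [x + 1 for x in inputs]

# One backend callable per supported framework.
BACKENDS = {
    "tensorflow": tensorflow_predict,
    "pytorch": pytorch_predict,
}

def infer(framework, inputs):
    """Route an inference request to the backend registered for `framework`."""
    try:
        backend = BACKENDS[framework]
    except KeyError:
        raise ValueError(f"unsupported framework: {framework}")
    return backend(inputs)
```

The point of the sketch is only that a single entry point can hide per‑framework runtimes behind a common interface, which is the kind of abstraction a multi‑framework service needs.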
Deep Learning · GPU · Inference
8 min read