Tag: dl_inference

58 Tech
Dec 21, 2021 · Artificial Intelligence

dl_inference: Open‑Source Deep Learning Inference Service with TensorRT and MKL Acceleration

dl_inference is an open‑source, production‑grade deep learning inference platform that supports TensorFlow, PyTorch, and Caffe models, offering GPU and CPU deployment, TensorRT and MKL acceleration, and multi‑node load balancing. The article also answers common questions on model conversion, hardware requirements, INT8 quantization, and performance gains.
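
To give a feel for the INT8 quantization mentioned above, here is a minimal sketch of symmetric per‑tensor INT8 quantization, the scheme that TensorRT‑style INT8 inference is built on. This is illustrative only, not dl_inference's actual code; all function names are hypothetical.

```python
# Illustrative sketch of symmetric per-tensor INT8 quantization.
# Function names are hypothetical, not dl_inference's real API.

def calibrate_scale(values):
    """Simple max-abs calibration: map the largest magnitude to 127."""
    return max(abs(v) for v in values) / 127.0

def quantize_int8(values, scale):
    """Map float values to INT8 [-128, 127] using a per-tensor scale."""
    return [max(-128, min(127, round(v / scale))) for v in values]

def dequantize_int8(qvalues, scale):
    """Recover approximate float values from INT8 codes."""
    return [q * scale for q in qvalues]

weights = [0.5, -1.0, 0.25, 1.27]
scale = calibrate_scale(weights)      # largest magnitude (1.27) maps to 127
q = quantize_int8(weights, scale)
approx = dequantize_int8(q, scale)    # close to the original floats
```

Real INT8 deployments (e.g. TensorRT calibration) pick the scale from activation statistics rather than a simple max, but the quantize/dequantize round trip is the same idea.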

Deep LearningGPUInference
8 min read
58 Tech
Mar 27, 2020 · Artificial Intelligence

dl_inference: Open‑Source General Deep Learning Inference Service

dl_inference is an open‑source inference platform that simplifies deployment of TensorFlow and PyTorch models in production, offering unified gRPC access, load‑balanced multi‑node serving, GPU/CPU options, customizable pre‑ and post‑processing, and extensible architecture for future AI workloads.
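
As a rough illustration of the customizable pre‑ and post‑processing hooks described above, here is a hypothetical sketch of how an inference pipeline can wrap a model call between user‑supplied hooks. The class and function names are invented for illustration and are not dl_inference's real API; the model is a stand‑in for a real TensorFlow or PyTorch model.

```python
# Hypothetical sketch of customizable pre-/post-processing around a model
# call, as a unified inference service might expose it. All names here
# are illustrative, not dl_inference's actual API.

class InferencePipeline:
    def __init__(self, model_fn, preprocess, postprocess):
        self.model_fn = model_fn
        self.preprocess = preprocess
        self.postprocess = postprocess

    def predict(self, raw_input):
        tensor = self.preprocess(raw_input)   # user-defined input transform
        scores = self.model_fn(tensor)        # the deployed model
        return self.postprocess(scores)       # user-defined output transform

def scale_pixels(pixels):
    """Example preprocess hook: scale pixel bytes into [0, 1]."""
    return [p / 255.0 for p in pixels]

def top_label(scores, labels=("cat", "dog")):
    """Example postprocess hook: map the highest score to a label."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    return labels[best]

def toy_model(tensor):
    """Stand-in for a real model: a fixed two-class scorer."""
    total = sum(tensor)
    return [total, 1.0 - total / len(tensor)]

pipeline = InferencePipeline(toy_model, scale_pixels, top_label)
```

Keeping the hooks separate from the model call is what lets a serving layer swap in per‑model transforms without touching the load‑balanced gRPC path.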

AI inferenceDeep LearningPyTorch
11 min read