
deployment

Currently, we only support deployment via TensorRT, which is widely used in production. You are encouraged to read the source code: build_engine and inference.
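As a rough illustration of what build_engine does, the sketch below parses an ONNX model and serializes a TensorRT engine with the Python API. This is only a minimal, assumed example of the typical pattern (older TensorRT 7.x/8.x API: build_engine, max_workspace_size); the actual source code may use different settings such as FP16/INT8 flags or explicit optimization profiles.

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_trt_engine(onnx_path, engine_path, max_workspace_gb=1):
    """Parse an ONNX model and serialize a TensorRT engine to disk (sketch)."""
    explicit_batch = 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    with trt.Builder(TRT_LOGGER) as builder, \
         builder.create_network(explicit_batch) as network, \
         trt.OnnxParser(network, TRT_LOGGER) as parser:
        # parse the ONNX file into a TensorRT network definition
        with open(onnx_path, 'rb') as f:
            if not parser.parse(f.read()):
                for i in range(parser.num_errors):
                    print(parser.get_error(i))
                raise RuntimeError('failed to parse the ONNX model')
        # configure the builder and build the engine
        config = builder.create_builder_config()
        config.max_workspace_size = max_workspace_gb << 30
        engine = builder.build_engine(network, config)
        # serialize the engine so inference can deserialize it later
        with open(engine_path, 'wb') as f:
            f.write(engine.serialize())
```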

Use cases are provided for WIDERFACE and TT100K: WIDERFACE demo, TT100K demo.

Here, pre-processing and post-processing are not included; only the forward pass is performed by the inference engine. Pre-processing (mainly image normalization) can in fact be merged into the engine, but post-processing (mainly NMS) has to be implemented separately, as sketched below.
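Since NMS is not part of the engine's forward pass, it has to run on the raw outputs. The following is a minimal NumPy sketch of standard greedy NMS; it is an assumed stand-in for the post-processing in the demos, whose exact implementation may differ (e.g. score thresholding, per-class handling).

```python
import numpy as np

def nms(boxes, scores, iou_threshold=0.5):
    """Greedy NMS. boxes: (N, 4) as x1, y1, x2, y2; scores: (N,). Returns kept indices."""
    x1, y1, x2, y2 = boxes[:, 0], boxes[:, 1], boxes[:, 2], boxes[:, 3]
    areas = (x2 - x1 + 1) * (y2 - y1 + 1)
    order = scores.argsort()[::-1]  # indices sorted by descending score
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        # overlap of the highest-scoring remaining box with the rest
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        w = np.maximum(0.0, xx2 - xx1 + 1)
        h = np.maximum(0.0, yy2 - yy1 + 1)
        inter = w * h
        iou = inter / (areas[i] + areas[order[1:]] - inter)
        # discard boxes that overlap the chosen box too much
        order = order[1:][iou <= iou_threshold]
    return keep
```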
