The MIVisionX toolkit is a comprehensive set of computer vision and machine intelligence libraries, utilities, and applications bundled into a single package. AMD MIVisionX also delivers a highly optimized open-source implementation of the Khronos OpenVX™ standard and OpenVX™ Extensions.

MIVisionX Inference Server

This sample inference server supports the following command-line options:

  inference_server_app  [-p     <port>                           default:26262]
                        [-b     <batch size>                     default:64]
                        [-n     <model compiler path>            default:/opt/rocm/libexec/mivisionx/model_compiler/python]
                        [-fp16  <ON:1 or OFF:0>                  default:0]
                        [-w     <server working directory>       default:~/]
                        [-t     <num cpu decoder threads [2-64]> default:1]
                        [-gpu   <comma separated list of GPUs>]
                        [-q     <max pending batches>]
                        [-s     <local shadow folder full path>]
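As an illustration of the options above, the command below starts the server on port 28282 with a batch size of 64, two CPU decoder threads, FP16 disabled, and GPUs 0 and 1. All of the values are example choices, not recommendations; the command is echoed rather than executed so the full flag set is visible, so drop the leading echo to actually launch the server.

```shell
# Example invocation (values illustrative): port 28282, batch 64,
# 2 CPU decoder threads, FP16 off, GPUs 0 and 1. Remove "echo" to run.
echo inference_server_app -p 28282 -b 64 -t 2 -fp16 0 -gpu 0,1
```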

Make sure that all required executables and libraries are reachable through the PATH and LD_LIBRARY_PATH environment variables:

% export PATH=$PATH:/opt/rocm/bin
% export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/rocm/lib
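To confirm the exports took effect before launching the server, a quick sanity check can be run in the same shell. This is a minimal sketch; the paths assume the default ROCm install location used in the exports above.

```shell
# Append the default ROCm locations (as in the exports above), then
# verify PATH actually contains the bin directory before launching.
export PATH=$PATH:/opt/rocm/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/rocm/lib
case ":$PATH:" in
  *:/opt/rocm/bin:*) echo "PATH OK" ;;
  *)                 echo "PATH is missing /opt/rocm/bin" ;;
esac
```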

The inference_server_app works together with a companion client application that connects to it over the configured server port.