Support #23723

TensorRT inference server

Added by Michael Wang about 2 months ago. Updated 2 days ago.

Status: Assigned
Priority: Normal
Assignee:
Target version: -
Start date: 12/06/2019
Due date:
% Done: 80%
Estimated time: 16.00 h
Spent time:
Scope: Internal
Experiment: LArSoft
SSI Package:
Co-Assignees:
Duration:
Description

We'd like to request that the NVIDIA TensorRT client libraries be made available as a UPS product. We are implementing the "GPU as a service" feature in LArSoft, and we need these libraries to send requests to, and retrieve results from, a remote inference server. Kevin and Nhan (in the watchers list) can correct me, but I believe we only need the client libraries, and the version we need is r19.10 to be compatible with the server. The source code is available here:

https://github.com/NVIDIA/tensorrt-inference-server

History

#1 Updated by Kyle Knoepfel about 2 months ago

  • Status changed from New to Feedback

It will take some analysis to determine exactly how the client libraries can be built and packaged as a UPS product. We will set up a meeting to discuss this.

#2 Updated by Kevin Pedro about 2 months ago

In case it is helpful, here is the setup script we are currently using to install tensorrt-inference-server in CMSSW:
https://github.com/hls-fpga-machine-learning/SonicCMS/blob/db32a67483e5520944778f8027242a9f552b9c86/TensorRT/setup.sh#L5-L48

#3 Updated by Kyle Knoepfel about 1 month ago

  • Estimated time set to 16.00 h
  • Assignee set to Kyle Knoepfel
  • Status changed from Feedback to Assigned

#4 Updated by Kyle Knoepfel 16 days ago

  • Status changed from Assigned to Feedback

I have a test UPS installation of the required package. Do you have some sample code I can try to build with my test installation?

#5 Updated by Kevin Pedro 15 days ago

My uses of the client code also require CMSSW, so I'm not sure if that would be the easiest way to test it.

However, there are some example clients in tensorrt-inference-server itself:
https://github.com/NVIDIA/tensorrt-inference-server/tree/master/src/clients/c%2B%2B/perf_client
https://docs.nvidia.com/deeplearning/sdk/tensorrt-inference-server-guide/docs/perf_client.html

#6 Updated by Kyle Knoepfel 9 days ago

  • % Done changed from 0 to 80

A preliminary UPS package for testing has been given to Mike Wang. We await his feedback.

Providing this package required new UPS versions of protobuf and opencv.

#7 Updated by Michael Wang 8 days ago

Hi Kyle,

I set up a cmake file to try compiling the example client code using the new TRTIS UPS package. Here are the first few relevant lines of the CMakeLists.txt file:

cmake_minimum_required (VERSION 3.5)

project(trtclient_test)

find_package(TRTIS CONFIG REQUIRED)
message(STATUS "Using TRTIS client library ${TRTIS_VERSION}")

but when I run cmake, I get the following errors:

p1{mwang}1467% cmake ../src
-- The C compiler identification is GNU 8.2.0
-- The CXX compiler identification is GNU 8.2.0
-- Check for working C compiler: /products/gcc/v8_2_0/Linux64bit+3.10-2.17/bin/cc
-- Check for working C compiler: /products/gcc/v8_2_0/Linux64bit+3.10-2.17/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /products/gcc/v8_2_0/Linux64bit+3.10-2.17/bin/c++
-- Check for working CXX compiler: /products/gcc/v8_2_0/Linux64bit+3.10-2.17/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found ZLIB: /usr/lib64/libz.so (found version "1.2.7") 
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Looking for pthread_create
-- Looking for pthread_create - not found
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - found
-- Found Threads: TRUE  
-- Found Protobuf: /products/protobuf/v3_11_2/Linux64bit+3.10-2.17-e19/bin/protoc-3.11.2.0 (found suitable version "3.11.2.0", minimum required is "3.11.2.0") 
CMake Error at /products/trtis_clients/v19_11_0/Linux64bit+3.10-2.17-e19-prof/lib/cmake/TRTIS/TRTISConfig.cmake:35 (find_package):
  Could not find a package configuration file provided by "gRPC" (requested
  version 1.19.0) with any of the following names:

    gRPCConfig.cmake
    grpc-config.cmake

  Add the installation prefix of "gRPC" to CMAKE_PREFIX_PATH or set
  "gRPC_DIR" to a directory containing one of the above files.  If "gRPC" 
  provides a separate development package or SDK, be sure it has been
  installed.
Call Stack (most recent call first):
  CMakeLists.txt:5 (find_package)

-- Configuring incomplete, errors occurred!
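
As a side note, the error above means CMake cannot locate a gRPC package configuration file while processing TRTISConfig.cmake. A hypothetical workaround, under the assumption that gRPC is (or becomes) available as a UPS product that installs its CMake config files, would be to extend CMAKE_PREFIX_PATH before the find_package call (the GRPC_FQ_DIR variable name here is an assumption for illustration, not a variable known to exist in this installation):

```cmake
# Sketch only: point CMake at a directory tree containing gRPCConfig.cmake.
# $ENV{GRPC_FQ_DIR} is a hypothetical UPS environment variable.
list(APPEND CMAKE_PREFIX_PATH "$ENV{GRPC_FQ_DIR}")
find_package(TRTIS CONFIG REQUIRED)  # TRTISConfig.cmake can then find gRPC 1.19.0
```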

#8 Updated by Kyle Knoepfel 8 days ago

Mike, this should work in the context of a cetbuildtools/mrb build. See scisoftbuild01.fnal.gov:/home/knoepfel/trtis_clients_test for an example. The CMakeLists.txt file looks like:

cmake_minimum_required (VERSION 3.5)

project(trtclient_test)

find_package(cetbuildtools 7.13.01 REQUIRED)

include(CetCMakeEnv)
cet_cmake_env()

find_ups_product(trtis_clients)
message(STATUS "Using TRTIS client library ${TRTIS_CLIENTS_VERSION}")

I got a successful output:

knoepfel@scisoftbuild01 trtis_clients_test $ source /home/knoepfel/trtis_clients_test/setup_for_develop -p

...

knoepfel@scisoftbuild01 trtis_clients_test $ buildtool
INFO: Install prefix = /dev/null
INFO: CETPKG_TYPE = Prof

------------------------------------
INFO: Stage cmake.
------------------------------------

-- cetbuildtools_BINDIR = /products/cetbuildtools/v7_14_00/bin
-- full qual e19:prof reduced to e19
-- Product is trtis_clients_test v0_00_00 e19:prof
-- Module path is /products/cetbuildtools/v7_14_00/Modules
-- set_install_root: PACKAGE_TOP_DIRECTORY is /home/knoepfel/trtis_clients_test
-- cet dot version: 0.00.00
-- CET_REPORT: /products/cetbuildtools/v7_14_00/bin/cet_report
-- Building for Linux slf7 x86_64
-- Using TRTIS client library v19_11_0
-- Configuring done
-- Generating done
-- Build files have been written to: /home/knoepfel/scratch/builds/trtis_clients_test

------------------------------------
INFO: Stage cmake successful.
------------------------------------

------------------------------------
INFO: Stage build.
------------------------------------

0.01user 0.01system 0:00.03elapsed 100%CPU (0avgtext+0avgdata 5604maxresident)k
0inputs+0outputs (0major+4290minor)pagefaults 0swaps

------------------------------------
INFO: Stage build successful.
------------------------------------

Having said that, the error message you see may indicate an error in the installation that I need to resolve. Please let me know.

#9 Updated by Michael Wang 8 days ago

Hi Kyle,

I'm probably missing something here, but this is still not working for me.

Using the line you had in your CMakeLists.txt file:

find_ups_product(trtis_clients)

gives me access to variables like ${TRTIS_CLIENTS_VERSION} and ${TRTIS_CLIENTS_INC}, but not to others like ${TRTIS_CLIENTS_FQ_DIR} or ${TRTIS_CLIENTS_LIB}, so I've had to enter the path manually:

link_directories("/products/trtis_clients/v19_11_0/Linux64bit+3.10-2.17-e19-prof/lib")

I tried compiling one of the sample clients, "image_client.cc", provided by NVIDIA in ./src/clients/c++/examples, but I get errors like:

Scanning dependencies of target image_client
[ 50%] Building CXX object CMakeFiles/image_client.dir/image_client.cc.o
[100%] Linking CXX executable bin/image_client
/usr/bin/ld: /products/trtis_clients/v19_11_0/Linux64bit+3.10-2.17-e19-prof/lib/librequest_static.a(request_http.cc.o): undefined reference to symbol 'pthread_create@@GLIBC_2.2.5'
//usr/lib64/libpthread.so.0: error adding symbols: DSO missing from command line
collect2: error: ld returned 1 exit status

Could you try compiling this, or one of the other sample clients provided by NVIDIA in ./src/clients/c++/examples, to see if you run into similar problems?

Mike

#10 Updated by Kyle Knoepfel 8 days ago

  • Status changed from Feedback to Assigned

I'll take a look, Mike, and let you know what I find out. It may be helpful to have a working session together in the next week or two.

#11 Updated by Kyle Knoepfel 8 days ago

I have updated the scisoftbuild01.fnal.gov:/home/knoepfel/trtis_clients_test directory so that image_client.cc builds successfully. I would like to show members of the SciSoft team the result so they can give input on what the expected CMake interface should look like.
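
In CMake terms, a plausible shape for that fix is to link the executable against the Threads imported target, which supplies the pthread flags the static request archive needs. This is a sketch under the assumption that the executable links a request_static archive from the UPS product, not the exact contents of the updated test directory:

```cmake
# Sketch: ensure thread support is linked explicitly.
find_package(Threads REQUIRED)            # defines the Threads::Threads target

add_executable(image_client image_client.cc)
target_link_libraries(image_client
  request_static                          # TRTIS client archive from trtis_clients (assumed name)
  Threads::Threads)                       # adds -pthread/-lpthread to the link line
```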

#12 Updated by Michael Wang 7 days ago

Thanks Kyle,
I applied your modifications to the CMakeLists.txt file and was able to compile the code successfully. I also tested the executable against a TensorRT inference server running in a Docker container, and it seems to work properly:

p1{mwang}1199% image_client -m resnet50_netdef -s INCEPTION -c 5 mug.jpg 
Request 0, batch size 1
Image 'mug.jpg':
    504 (COFFEE MUG) = 0.723992
    968 (CUP) = 0.270953
    967 (ESPRESSO) = 0.00115996
    899 (WATER JUG) = 0.00102757
    505 (COFFEEPOT) = 0.00082841

#13 Updated by Michael Wang 2 days ago

We've implemented a standalone client that replicates the functionality of the emtrkmichelid module in ProtoDUNE. We compiled it using the UPS TensorRT client libraries, and with it we've been able to successfully run inference requests against a TensorRT inference server. We'd now like to integrate this functionality into LArSoft. I may be wrong, but it appears that the UPS product you provided uses newer versions of certain products, such as GCC and Python, that may not be compatible with existing releases of dunetpc/larsoft. Let me know if this is correct.


