This module adds x86 architecture support to TNN and integrates the OpenVINO framework, which allows a TNN model to run on an OpenVINO network.
Linux: CMake (>= 3.7.2)
Windows: Visual Studio (>= 2017), CMake (>= 3.7.2, or use the built-in CMake in Visual Studio Tools)
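Before building, it can help to verify that the installed CMake meets the minimum version above. This is a minimal sketch, not part of the TNN build scripts; `version_ge` is a hypothetical helper, and `3.18.4` stands in for whatever version `cmake --version` reports on your machine:

```shell
# version_ge A B: succeeds when version A >= version B
# (sort -V compares version strings component by component).
version_ge() {
    [ "$(printf '%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# Guard the build on the CMake minimum required by this module.
if version_ge "3.18.4" "3.7.2"; then
    echo "CMake version is new enough"
fi
```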
Linux:
$ cd scripts/
$ sh build_linux.sh
Windows:
cd scripts\
.\build_msvc.bat [VS2015/VS2017/VS2019]
Refer to the FAQ section below if the build fails.
Go to build_openvino/test/ and run TNNTest with a model, setting device_type to X86:
$ cd build_openvino/test/
$ ./TNNTest -mp PATH_TO_MODEL -dt X86 -ip PATH_TO_INPUT -op PATH_TO_OUTPUT
Refer to the API documentation: set config.device_type to DEVICE_X86 and config.network_type to NETWORK_TYPE_OPENVINO.
config.device_type = TNN_NS::DEVICE_X86;
// runs with native x86 optimized code if network_type is not set
config.network_type = TNN_NS::NETWORK_TYPE_OPENVINO;
Go to example/openvino/ and run build_openvino.sh to compile the demos for the x86 architecture. Then run demo_x86_linux_imageclassify or demo_x86_linux_facedetector. For details, refer to the demo documentation.
Q: CMake not found in Windows?
A: If CMake is installed, add the CMake path to the Windows environment variables. Alternatively, use the Visual Studio Command Prompt to run build_x86_msvc.bat, which includes a built-in CMake.
Q: Visual Studio not found in Windows?
A: Execute the script with the Visual Studio version, like:
.\build_x86_msvc.bat VS2019
Q: Error 0x4001 or 16385 with the message "Invalid Model Content"?
A: Set std::ios::binary when reading the model stream:
std::ifstream model_stream(model_path, std::ios::binary);
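The fix above can be sketched as a small helper that reads the whole model file in binary mode; `ReadFileBinary` and the path handling here are illustrative, not part of the TNN API:

```cpp
#include <fstream>
#include <sstream>
#include <string>

// Read an entire file into a string in binary mode.
// Without std::ios::binary, Windows translates line endings while
// reading, corrupting binary model content and triggering the
// "Invalid Model Content" error (0x4001 / 16385).
std::string ReadFileBinary(const std::string& path) {
    std::ifstream model_stream(path, std::ios::binary);
    std::ostringstream content;
    content << model_stream.rdbuf();
    return content.str();
}
```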