File Exchange

GPU Coder Interface for Deep Learning Libraries

Interface for Deep Learning Libraries from GPU Coder

33 Downloads

Updated 09 Jan 2020

GPU Coder™ generates optimized C++ code for deep learning, embedded vision, and autonomous systems. The generated code calls:
- optimized NVIDIA CUDA libraries and can be used for prototyping on all NVIDIA GPU platforms
- optimized Arm libraries and can be used for prototyping on Arm Mali GPU platforms

You can deploy a variety of trained deep learning networks from Deep Learning Toolbox™, such as YOLO, ResNet-50, SegNet, and MobileNet, to NVIDIA GPUs. You can also generate optimized code for preprocessing and postprocessing along with your trained networks to deploy complete algorithms.
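As a minimal sketch of this workflow (the entry-point name `myPredict` and the choice of ResNet-50 are illustrative assumptions, not from this page), generating a CUDA MEX function for a pretrained network might look like:

```matlab
% --- myPredict.m (hypothetical entry-point function) ---
function out = myPredict(in) %#codegen
% Load the pretrained network once into a persistent variable, then
% run inference on the input image.
persistent mynet;
if isempty(mynet)
    % coder.loadDeepLearningNetwork loads a network for code generation.
    mynet = coder.loadDeepLearningNetwork('resnet50');
end
out = predict(mynet, in);
end

% --- At the MATLAB command line (build script) ---
% cfg = coder.gpuConfig('mex');                               % CUDA MEX target
% cfg.DeepLearningConfig = coder.DeepLearningConfig('cudnn'); % call cuDNN
% codegen -config cfg myPredict -args {ones(224,224,3,'single')}
```

Calling the generated `myPredict_mex` with a 224x224x3 single-precision image then runs inference through the generated CUDA code.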

For the full list of supported networks and layers, see: https://www.mathworks.com/help/gpucoder/ug/gpucoder-supported-networks-layers.html

GPU Coder Interface for Deep Learning Libraries lets you customize the generated code by leveraging target-specific libraries on the embedded target. With this support package, you can integrate with deep learning libraries optimized for specific GPU targets, such as the TensorRT library for NVIDIA GPUs or the Arm Compute Library for Arm Mali GPUs.

GPU Coder Interface for Deep Learning integrates with the following deep learning accelerator libraries and the corresponding GPU architectures:
• cuDNN and TensorRT libraries for NVIDIA GPUs
• Arm Compute Library for Arm Mali GPUs
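A hedged sketch of selecting among these libraries through `coder.DeepLearningConfig` (the `'cudnn'` and `'tensorrt'` target identifiers are documented GPU Coder options; the `'arm-compute-mali'` identifier and the surrounding configuration are stated here as assumptions):

```matlab
% Choose which accelerator library the generated code will call.
cfg = coder.gpuConfig('lib');                                 % static library target
cfg.DeepLearningConfig = coder.DeepLearningConfig('cudnn');   % NVIDIA cuDNN

% Alternatively, target TensorRT on NVIDIA GPUs:
% cfg.DeepLearningConfig = coder.DeepLearningConfig('tensorrt');

% Or target the Arm Compute Library on Arm Mali GPUs (identifier assumed):
% cfg.DeepLearningConfig = coder.DeepLearningConfig('arm-compute-mali');
```

The same entry-point function can then be built against any of these configurations, so switching accelerator libraries does not require changing the MATLAB algorithm code.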

This hardware support package works with R2018b and later releases.

It requires GPU Coder, except when using Deep Learning Toolbox's predict function with the ("Acceleration","mex") name-value pair option.
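For that Deep Learning Toolbox path, a minimal sketch (the network and input are illustrative placeholders):

```matlab
% Run inference through a MEX-accelerated path; per the text above,
% this does not require GPU Coder, only this support package.
net = resnet50;                     % pretrained network (illustrative choice)
img = ones(224,224,3,'single');     % placeholder 224x224 RGB input
scores = predict(net, img, 'Acceleration', 'mex');
```

The first call compiles and caches the MEX function; subsequent calls reuse it, so repeated inference is faster.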

If you have download or installation problems, please contact Technical Support - https://www.mathworks.com/support/contact_us.html


MATLAB Release Compatibility
Created with R2018b
Compatible with R2018b to R2019b
Platform Compatibility
Windows macOS Linux