Deep Learning Toolbox Interface for TensorFlow Lite
Incorporate pre-trained TensorFlow Lite (TFLite) models into MATLAB and Simulink applications for simulation and deployment to hardware.
Updated 15 Mar 2023
The Deep Learning Toolbox Interface for TensorFlow Lite enables you to run cosimulations of MATLAB and Simulink applications with TensorFlow Lite models. This workflow lets you use pre-trained TensorFlow Lite (TFLite) models, such as classification and object detection networks, while implementing the rest of the application in MATLAB or Simulink for development and testing.
Inference of pre-trained TFLite models is executed by the TensorFlow Lite Interpreter while the rest of the application code is executed by MATLAB or Simulink. Data exchange between MATLAB or Simulink and TensorFlow Lite is handled automatically.
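As a minimal sketch of the simulation workflow, assuming the `loadTFLiteModel` and `predict` functions provided by this support package (the model file name is a placeholder):

```matlab
% Load a pretrained TFLite model (file name is a placeholder).
net = loadTFLiteModel('mobilenet_v1_1.0_224.tflite');

% Inspect the input size reported by the TFLite Interpreter.
disp(net.InputSize);

% Prepare an input image and run inference; the TFLite Interpreter
% executes the model while MATLAB handles the surrounding code.
img = imread('peppers.png');
img = imresize(single(img)./255, [224 224]);
scores = predict(net, img);

% Post-processing (e.g., picking the top class) runs in MATLAB.
[~, idx] = max(scores);
```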
When used with MATLAB Coder, you can generate C++ code for the complete application for deployment to target hardware. In the generated code, inference of the TFLite model is executed by the TensorFlow Lite Interpreter while C++ code is generated for the remainder of the MATLAB or Simulink application, including pre- and post-processing. Data exchange between the generated code and the TFLite Interpreter is again handled automatically.
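The code-generation workflow can be sketched as follows; the entry-point function, file names, and input size are illustrative, not taken from the package documentation:

```matlab
% Entry-point function: tflite_predict.m
function out = tflite_predict(in)
    % Load the model once; in the generated C++ code, inference is
    % delegated to the TFLite Interpreter.
    persistent net;
    if isempty(net)
        net = loadTFLiteModel('mobilenet_v1_1.0_224.tflite');
    end
    out = predict(net, in);
end
```

C++ library code could then be generated with MATLAB Coder, for example:

```matlab
cfg = coder.config('lib');
cfg.TargetLang = 'C++';
codegen -config cfg tflite_predict -args {ones(224, 224, 3, 'single')}
```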
The prerequisites for using this support package are listed below:
If you experience download or installation problems, please contact Technical Support:
MATLAB Release Compatibility
Created with R2022a
Compatible with R2022a to R2023a
Platform Compatibility: Windows, macOS, Linux