Best way to integrate GPU use in my code?

AlexRD on 18 May 2021
I've been doing a lot of work on a neural net implementation I built from scratch in MATLAB. Early on I switched from GPU to CPU-only execution, since the CPU path was easier to write and debug, and it let me defer the GPU side of the project until later.
I'm now working on the GPU implementation, but I'm struggling to get a well-optimized result. The GPU copes poorly with multiple layers: processing time is often directly proportional to the layer count. The CPU, by contrast, barely notices the number of layers (as long as the neuron counts aren't extreme), but it does struggle with the input layer, given how many weights and biases that layer carries.
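To make the comparison concrete, here is a minimal timing sketch along the lines of what I've been running. The layer count, the sizes, and the plain dense-plus-ReLU forwardPass below are placeholders, not my actual network:

% CPU-vs-GPU timing sketch (placeholder sizes, not the real net).
% Each layer is a dense matrix multiply followed by a ReLU.
nLayers = 8;
nUnits  = 512;
batch   = 256;

W = cell(1, nLayers);
for k = 1:nLayers
    W{k} = randn(nUnits, nUnits, 'single');
end
x = randn(nUnits, batch, 'single');

tCPU = timeit(@() forwardPass(W, x));

% Same weights and input, moved to the GPU once up front.
Wg = cellfun(@gpuArray, W, 'UniformOutput', false);
xg = gpuArray(x);
tGPU = gputimeit(@() forwardPass(Wg, xg));

fprintf('CPU: %.4f s, GPU: %.4f s\n', tCPU, tGPU);

function y = forwardPass(W, y)
    % One dense + ReLU step per layer. With small layers, the roughly
    % constant per-layer launch/sync overhead on the GPU is what makes
    % total time scale with the number of layers.
    for k = 1:numel(W)
        y = max(W{k} * y, 0);
    end
end

My working theory is that each layer costs the GPU a roughly fixed kernel-launch overhead, so with small layers the total time grows linearly with layer count, while the CPU simply amortizes everything into the matrix multiplies.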
I've also tried a hybrid approach, where the input layer and any convolutional layers are assigned to the GPU, and the GPU data is then fetched back to the host so the remaining layers can be processed on the CPU. But the fetch time often cancels out whatever the GPU gained.
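Here is the kind of check I use to decide whether the fetch pays off. The sizes, and the single dense step standing in for the CPU-side layers, are again placeholders:

% Is fetching the GPU layer's output worth it? (placeholder sizes)
A = randn(2048, 256, 'single', 'gpuArray');  % stand-in for a GPU layer's output
W = randn(2048, 2048, 'single');             % CPU weights for the next layer

tFetch = gputimeit(@() gather(A));           % device-to-host transfer alone

Ac   = gather(A);
tCPU = timeit(@() W * Ac);                   % the CPU step the fetch enables

Wg   = gpuArray(W);                          % alternative: keep it all on the GPU
tGPU = gputimeit(@() Wg * A);

fprintf('fetch %.4f s, CPU step %.4f s, GPU step %.4f s\n', tFetch, tCPU, tGPU);

Unless tFetch + tCPU comes in clearly below tGPU, the hybrid split isn't paying for itself.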
Some feedback would be very welcome. My project, fully documented, can be found here: https://github.com/AlexRDX/Neural-Net
or attached to this post. Any criticism at all is welcome.
Thank you for your time!

Answers (0)

Release

R2021a
