How do you change the output dimension of a NN layer?

I am attempting to implement the NeRF2 architecture from this paper. So far, I have the layout of the network working properly, as well as the attenuation network and the radiance network separately. At the moment, my network looks like this:
However, the outputs from relu_8 have shape CB, while the other two inputs to radianceConcat have shape SCB. The AttnReshape layer was supposed to fix this, but I can't figure out how to actually reshape the dimension labels on the data. I've attached the code I'm using below. I am completely at a loss here, any help would be greatly appreciated.

8 Comments

Your attached code returns errors:
  • Unrecognized function or variable 'ReshapeLayer'.
  • Unable to find file or directory 'reshapeLayer.mat'.
Sorry, uploaded the wrong file (I was trying different examples from the helpsite to see if they worked). The correct file is uploaded now.
The AttnReshape layer was supposed to fix this
What does it mean to "fix" it? What dimensions do you think they should have?
However, the outputs from relu_8 have shape CB, while the other two inputs to radianceConcat have shape SCB.
That alone is not the problem. When you concatenate arrays, they have to have the same size along every dimension except the one along which the concatenation is intended. You have configured radianceConcat to concatenate along the 3rd dimension, which means all three inputs need the same size(Input,1:2). However, the sizes of the input activations are
size(headingDirInputs_Activation)=[3,20,1]
size(rxPosInputs_Activation)=[2,8,1]
size(relu_8_Activation)=[1,1]
It is not at all clear how any reshaping is supposed to let you concatenate along the 3rd dimension, and particularly not relu_8 which is just a scalar.
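The concatenation rule above can be seen directly with plain cat calls, using the activation sizes listed in the question (the variable names here are illustrative, not from the attached file):

```matlab
% Concatenation along dim 3 requires dims 1 and 2 to match.
headingAct = rand(3,20,1);   % headingDirInputs activation
rxPosAct   = rand(2,8,1);    % rxPosInputs activation

try
    cat(3, headingAct, rxPosAct);   % fails: sizes differ in dims 1 and 2
catch err
    disp(err.message)
end

% This works, because only dim 3 differs:
C = cat(3, rand(3,20,1), rand(3,20,4));
size(C)   % [3 20 5]
```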
When the outputs from the first network get sent to AttnReshape, they have dimensions CB, while the other inputs to the concat layer that AttnReshape feeds into have dimensions SCB, so the concat layer throws an error saying the dimensions aren't consistent. I tried using the ReshapeLayer to cast the inputs to a dlarray with the proper size, but that gave me an error as well.
When I say fix all I mean is making the concat layer actually work.
so the concat layer throws an error saying the dimensions aren't consistent.
They are not consistent, but it is not just that some inputs are CB and some are SCB. You can see in the output of analyzeNetwork(nerfNet) that all the activations going into radianceConcat are of different sizes in all dimensions. You cannot freely concatenate arrays of arbitrary sizes even if all of them were SCB.
You need to tell us what dimensions the activations, both going into radianceConcat and coming out, are expected to have.
I tried using the ReshapeLayer to cast the inputs to a dlarray with the proper size, but that gave me an error as well.
The only place the ReshapeLayer is applied is the output of relu_8, but the output of relu_8 is just a scalar. A scalar cannot be reshaped.
In reply to your response - so what would be the proper way to implement this architecture then? Should I flatten the inputs and then concatenate them similar to this example?
Or would the proper way to do this be to define a custom loss function with multiple inputs and multiple outputs?
I wish this forum had a reply button.
To answer your question about inputs:
  • The attenuation network takes an encoded 3D position (of shape [3, 20, 1] (SCB)) and produces a scalar output, representing either the real or imaginary part of that cell's signal attenuation
  • The radiance network produces a single scalar output representing the real or imaginary part of a signal transmitted in direction omega, given three inputs:
      • An encoded receiver position (of shape [3, 20, 1] (SCB))
      • An encoded transmission direction (of shape [2, 8, 1] (SCB))
      • The scalar output from the attenuation network (of shape [1,1] with no dimension labels)
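Given those shapes, if each input were flattened into a channel vector before concatenating along C, the combined channel count would be (a quick back-of-the-envelope check, not taken from the attached code):

```matlab
% Element counts from the shapes listed above.
nPos  = 3*20;   % encoded receiver position, flattened
nDir  = 2*8;    % encoded transmission direction, flattened
nAttn = 1;      % scalar attenuation-network output

nTotal = nPos + nDir + nAttn   % 77 channels entering the radiance network
```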


Answers (1)

Should I flatten the inputs and then concatenate them similar to this example?
Probably. See if the attached network does what you want.
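The attached network isn't reproduced here, but a minimal sketch of the flatten-then-concatenate idea might look like the following. All layer names and hidden sizes are illustrative; flattenLayer collapses the spatial dimensions into channels, so every branch becomes CB and concatenationLayer(1, ...) can join them along C:

```matlab
% Two SCB-style inputs, flattened to CB, plus a scalar feature input.
layersPos = [imageInputLayer([3 20 1], Name="rxPos", Normalization="none")
             flattenLayer(Name="flatPos")];          % 60 channels
layersDir = [imageInputLayer([2 8 1], Name="heading", Normalization="none")
             flattenLayer(Name="flatDir")];          % 16 channels
layersAttn = featureInputLayer(1, Name="attn");      % already CB, 1 channel

lgraph = layerGraph(layersPos);
lgraph = addLayers(lgraph, layersDir);
lgraph = addLayers(lgraph, layersAttn);
lgraph = addLayers(lgraph, [ ...
    concatenationLayer(1, 3, Name="radianceConcat")  % 60+16+1 = 77 channels
    fullyConnectedLayer(128, Name="fc_radiance")
    reluLayer(Name="relu_radiance")
    fullyConnectedLayer(1, Name="radianceOut")]);

lgraph = connectLayers(lgraph, "flatPos", "radianceConcat/in1");
lgraph = connectLayers(lgraph, "flatDir", "radianceConcat/in2");
lgraph = connectLayers(lgraph, "attn",    "radianceConcat/in3");

net = dlnetwork(lgraph);
analyzeNetwork(net)   % confirm every concat input is now a 1-D channel vector
```

Because every input to the concatenation layer is a flat channel vector, the size-consistency requirement is trivially satisfied along the batch dimension.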

Release: R2024b
Asked: 5 Mar 2025
Answered: 5 Mar 2025
