Subtraction of matrices with different dimensions
My apologies for not defining the question in detail. Here is the problem I am trying to solve for neural network optimization. Dataset A has 500 observations with 9 feature variables, making a 500 x 9 matrix. For better optimization, a couple of parameters are calculated, defined as:
NumberofHiddenNeurons = 10
NumberofInputNeurons = 9    % number of features of dataset A
InputWeight = rand(NumberofHiddenNeurons, NumberofInputNeurons)*2 - 1;
NumberofTrainingData = 500  % A is 500 x 9
for j = 1:NumberofHiddenNeurons
    for M = 1:NumberofInputNeurons
        C(j,M) = -(InputWeight(j,M)/2*(InputWeight(j,NumberofInputNeurons)));
    end
end
%% subtraction of C from original A
for g = 1:NumberofHiddenNeurons
    for k = 1:NumberofTrainingData
        tempH(g,k) = norm(bsxfun(@minus, (A(g,k)), C(g,:))).^2;
    end
end
Now the problem: C is 10 x 9 (10 hidden nodes for 9 features) and the original dataset is 500 x 9. The output tempH should be 10 x 500, i.e. 10 outputs from the 10 hidden neurons for each of the 500 observations. But the calculation of tempH gives an error because C and A have different dimensions. Is there any way to subtract these two matrices C and A to get the desired 10 x 500 output tempH? Please help.
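One possible fix (a sketch based on the shapes described above, not the accepted answer): `A(g,k)` indexes a single scalar with a row index that only runs to 10, which is why the subtraction against the 1 x 9 row `C(g,:)` fails. If the goal is the squared Euclidean distance between each observation `A(k,:)` and each row of `C`, indexing A by observation makes both operands 1 x 9:

```matlab
% Squared Euclidean distance between every observation A(k,:) (1 x 9)
% and every hidden-node row C(g,:) (1 x 9), giving tempH of size 10 x 500.
tempH = zeros(NumberofHiddenNeurons, NumberofTrainingData);
for g = 1:NumberofHiddenNeurons
    for k = 1:NumberofTrainingData
        tempH(g,k) = norm(A(k,:) - C(g,:))^2;  % both operands are 1 x 9
    end
end

% Equivalent loop-free form, if the Statistics Toolbox is available:
% tempH = pdist2(C, A).^2;   % pdist2 returns a 10 x 500 distance matrix
```

This assumes the intended quantity is the squared distance per (neuron, observation) pair, which matches the requested 10 x 500 shape.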
2 Comments
Walter Roberson
on 13 Dec 2013
Reformatted, including reformatting your code. Indented code is a lot easier to read.
lavneet singh
on 13 Dec 2013