What will be the feature layer if I use different networks like MobileNet-v2, DenseNet-201, or VGG-16?

% Preprocess the training data and estimate anchor boxes
preprocessedTrainingData = transform(trainingData, @(data)preprocessData(data,inputSize));
numAnchors = 3;
anchorBoxes = estimateAnchorBoxes(preprocessedTrainingData, numAnchors);

% Build the Faster R-CNN layer graph
featureExtractionNetwork = mobilenetv2;
featureLayer = '';   % <-- which layer name should go here for each backbone?
numClasses = 1;
lgraph = fasterRCNNLayers(inputSize, numClasses, anchorBoxes, featureExtractionNetwork, featureLayer);
augmentedTrainingData = transform(trainingData, @augmentData);
I want to use different networks as the backbone of a Faster R-CNN model. What will be the feature layer for each network, e.g. VGG-16, DenseNet-201, MobileNet-v2, Inception, SqueezeNet, etc.? Can I know which layer should be used for which network?

Accepted Answer

Venu on 19 Mar 2024
The selection of feature layers is guided by a balance of theoretical considerations (such as the depth of the network and the type of operations performed) and empirical evidence (such as performance on benchmark datasets and computational efficiency).
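A practical way to choose is to inspect the pretrained network yourself and pick a layer late in the convolutional stack. As a minimal sketch (assuming the Deep Learning Toolbox support package for the chosen model is installed), you can list every layer name and browse the layer graph interactively:

```matlab
% Load a pretrained backbone (MobileNet-v2 used here as an example)
net = mobilenetv2;

% Interactive view of the layer graph, names, and activation sizes
analyzeNetwork(net);

% Or list all layer names programmatically
layerNames = {net.Layers.Name}'
```

Layer names can differ slightly between MATLAB releases, so verifying them this way is safer than copying names from elsewhere.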
1. VGG16
  • Feature Layer: 'relu5_3'
  • The last convolutional layer before the fully connected layers, providing a rich feature set.
2. DenseNet201
  • Feature Layer: 'relu5_blk'
  • In DenseNet, the feature maps grow in depth as they pass through blocks. This layer is deep in the network, providing detailed features.
3. MobileNetV2
  • Feature Layer: 'block_13_expand_relu' or 'out_relu'
  • For MobileNetV2, earlier layers like 'block_13_expand_relu' can be used for speed, while 'out_relu' provides more detailed features but is deeper in the network.
4. Inception (e.g., InceptionV3)
  • Feature Layer: 'mixed7'
  • Inception networks have multiple mixed layers; 'mixed7' is a commonly used layer for feature extraction before the network becomes too deep.
5. SqueezeNet
  • Feature Layer: 'fire9_concat'
  • SqueezeNet is designed to be efficient, with 'fire9_concat' being a layer late in the network that combines features from the fire modules.
6. ResNet (e.g., ResNet50)
  • Feature Layer: 'activation_40_relu' or 'activation_49_relu'
  • ResNet networks have several residual blocks; layers like 'activation_40_relu' or, closer to the output, 'activation_49_relu' are good choices for feature extraction.
7. Xception
  • Feature Layer: 'block14_sepconv2_act'
  • Near the end of the network, this layer provides a deep set of features after several separable convolutional layers.
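Putting this together with your snippet, a sketch for swapping backbones could look like the following. The input size and anchor boxes below are placeholder values (assumptions, not taken from your data), and each backbone/layer pair should be verified against your release with analyzeNetwork:

```matlab
% Example values -- replace with your own input size and estimated anchors
inputSize   = [224 224 3];
numClasses  = 1;
anchorBoxes = [64 64; 128 128; 256 256];

% Backbone networks paired with candidate feature layers from the list above
backbones = {
    mobilenetv2, 'block_13_expand_relu'
    resnet50,    'activation_40_relu'
    vgg16,       'relu5_3'
};

% Build a Faster R-CNN layer graph for each backbone
for k = 1:size(backbones,1)
    lgraph = fasterRCNNLayers(inputSize, numClasses, anchorBoxes, ...
        backbones{k,1}, backbones{k,2});
end
```

Each call returns a layer graph you can pass to trainFasterRCNNObjectDetector; deeper feature layers generally give richer features at the cost of coarser spatial resolution for the region proposals.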
