Solved – Common activation function in fully connected layer

conv-neural-network, neural-networks

I'm a newbie in deep learning. As far as I know, each neuron has a gain/weight and an offset/bias, followed by an activation function (e.g. sigmoid, tanh, ReLU, identity, etc.).
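
To make that concrete, here is a minimal NumPy sketch of a single neuron; the function name and the sample numbers are made up purely for illustration:

```python
import numpy as np

def neuron(x, w, b, activation=np.tanh):
    # A neuron computes a weighted sum of its inputs plus a bias,
    # then passes the result through an activation function.
    return activation(np.dot(w, x) + b)

x = np.array([0.5, -1.0, 2.0])   # inputs
w = np.array([0.1, 0.4, -0.2])   # weights (the "gain")
b = 0.3                          # bias (the "offset")
print(neuron(x, w, b))           # tanh of the weighted sum
```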

I've read that the convolution layers in a Convolutional Neural Network are usually followed by a ReLU activation function, but what happens in the fully connected layers?

What is the most common activation function in a fully connected layer in a deep CNN?

  • Fully connected input layer (flatten): takes the output of the previous layers, "flattens" it into a single vector that can serve as input to the next stage.
  • The first fully connected layer: takes the inputs from the feature analysis and applies weights to predict the correct label.
  • Fully connected output layer: gives the final probabilities for each label (see the sketch after this list).
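
As for the question itself: ReLU is by far the most common activation in the hidden fully connected layers of a deep CNN, just as in the convolutional layers, while the output layer of a classifier typically uses softmax to turn scores into label probabilities. Below is a minimal PyTorch sketch of this stack; all sizes (a 1×28×28 input, 16 filters, 128 hidden units, 10 classes) are made-up illustration values, not a recommendation:

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),  # convolution: 16 x 28 x 28
    nn.ReLU(),                                   # ReLU after the conv layer, as usual
    nn.MaxPool2d(2),                             # downsample to 16 x 14 x 14
    nn.Flatten(),                                # "fully connected input layer": 16*14*14 = 3136 values
    nn.Linear(16 * 14 * 14, 128),                # first fully connected layer
    nn.ReLU(),                                   # ReLU is the common choice here too
    nn.Linear(128, 10),                          # fully connected output layer (raw scores/logits)
)

x = torch.randn(1, 1, 28, 28)                    # one dummy grayscale image
logits = model(x)
probs = torch.softmax(logits, dim=1)             # softmax turns logits into label probabilities
```

Note that in practice the softmax is usually folded into the loss function (e.g. PyTorch's nn.CrossEntropyLoss takes the raw logits), so the output layer itself is often just linear during training.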