
ReLU Layer

The convolution operation is interleaved with the pooling and ReLU operations. The ReLU activation is applied much as it is in a traditional neural network: each of the $L_q × B_q × d_q$ values in a layer is passed through the ReLU activation function to create $L_q × B_q × d_q$ thresholded values, which are then passed on to the next layer. Applying the ReLU therefore does not change the dimensions of a layer, because it is a simple one-to-one mapping of activation values. In traditional neural networks, the activation function is combined with a linear transformation by a matrix of weights to create the next layer of activations. Similarly, a ReLU typically follows a convolution operation (the rough equivalent of the linear transformation in traditional neural networks), and the ReLU layer is often not explicitly shown in pictorial illustrations of convolutional neural network architectures. It is noteworthy that the use of the ReLU has largely displaced saturating activations such as sigmoid and tanh in modern convolutional architectures.
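
A minimal NumPy sketch of this point follows; the layer shape $(L_q, B_q, d_q) = (28, 28, 16)$ is an arbitrary illustrative choice, not taken from the text.

```python
import numpy as np

# Hypothetical output of a convolution layer with shape (L_q, B_q, d_q).
L_q, B_q, d_q = 28, 28, 16
activations = np.random.randn(L_q, B_q, d_q)

# ReLU is an elementwise thresholding: max(0, x) applied to every value.
relu_output = np.maximum(activations, 0.0)

# The one-to-one mapping leaves the spatial and depth dimensions unchanged.
assert relu_output.shape == (L_q, B_q, d_q)
```

Because the mapping is elementwise, the same sketch applies regardless of the layer's dimensions.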

Pooling

The pooling operation is quite different. It works on small grid regions of size $P_q × P_q$ in each layer and produces another layer with the same depth (unlike filters). For each square region of size $P_q × P_q$ in each of the $d_q$ activation maps, the maximum of these values is returned. This approach is referred to as max-pooling. If a stride of 1 is used, then this will produce a new layer of size $(L_q − P_q + 1) × (B_q − P_q + 1) × d_q$. However, it is more common to use a stride $S_q > 1$ in pooling. In such cases, the length of the new layer will be $(L_q − P_q)/S_q + 1$ and the breadth will be $(B_q − P_q)/S_q + 1$. Therefore, pooling drastically reduces the spatial dimensions of each activation map. Unlike convolution operations, pooling is done at the level of each activation map: whereas a convolution operation simultaneously uses all $d_q$ feature maps in combination with a filter to produce a single feature value, pooling operates independently on each of the $d_q$ activation maps.
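
The sketch below illustrates max-pooling with a stride, including the output-size formulas above. The helper name `max_pool` and the example shapes are assumptions made for illustration, not part of the text.

```python
import numpy as np

def max_pool(layer, P_q, S_q):
    """Max-pooling over P_q x P_q regions with stride S_q,
    applied independently to each of the d_q activation maps."""
    L_q, B_q, d_q = layer.shape
    out_L = (L_q - P_q) // S_q + 1   # (L_q - P_q)/S_q + 1
    out_B = (B_q - P_q) // S_q + 1   # (B_q - P_q)/S_q + 1
    pooled = np.empty((out_L, out_B, d_q))
    for i in range(out_L):
        for j in range(out_B):
            region = layer[i * S_q:i * S_q + P_q, j * S_q:j * S_q + P_q, :]
            # Maximum over the spatial region, taken separately per depth slice.
            pooled[i, j, :] = region.max(axis=(0, 1))
    return pooled

# Example: a 32 x 32 x 8 layer pooled with P_q = 2 and stride S_q = 2.
layer = np.random.randn(32, 32, 8)
out = max_pool(layer, P_q=2, S_q=2)
assert out.shape == (16, 16, 8)   # depth preserved, spatial dimensions halved
```

Note that the depth $d_q$ is carried through unchanged, while the spatial dimensions shrink according to the stride, matching the formulas given above.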