# (2014. 12) Fractional Max-pooling

* Submitted on 2014. 12
* Benjamin Graham

## Simple Summary

> Our version of max-pooling is stochastic as there are lots of different ways of constructing suitable pooling regions. We find that our form of fractional max-pooling reduces overfitting on a variety of datasets.

* Spatial pooling layers are building blocks for Convolutional Neural Networks (CNNs).
* **Max Pooling 2x2**
  * Pros:
    1. Fast.
    2. Quickly reduces the size of the hidden layer.
    3. Encodes a degree of invariance with respect to translations and elastic distortions.
  * Cons:
    1. Disjoint nature of pooling regions.
    2. Since the spatial size halves at each pooling layer, only a few pooling layers fit, so deep networks need stacks of back-to-back convolutional layers between them.
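To make the size reduction concrete, here is a minimal NumPy sketch of disjoint 2x2 max pooling (not the paper's code): each output value is the maximum over a 2x2 block, so a 4x4 input becomes 2x2, and a 32x32 input would shrink 32 → 16 → 8 → 4 → 2 after only five such layers.

```python
import numpy as np

def max_pool_2x2(x):
    """Standard disjoint 2x2 max pooling on a 2-D array: spatial size halves."""
    h, w = x.shape
    # Group pixels into 2x2 blocks, then take the max within each block.
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

x = np.arange(16, dtype=float).reshape(4, 4)
y = max_pool_2x2(x)   # 4x4 -> 2x2
```

Running this on the 4x4 ramp above gives `[[5, 7], [13, 15]]`: the top-left output, for instance, is the max of the block `{0, 1, 4, 5}`.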

![images](https://1712266326-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LMrEcS7cR9bGTHSnCnB%2F-LRazIqPSIKca1ujTQk5%2F-LRazKUWtuQRzqNERyRE%2Ffractional_max-pooling_1.png?generation=1542547404206557\&alt=media)

![images](https://1712266326-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LMrEcS7cR9bGTHSnCnB%2F-LRazIqPSIKca1ujTQk5%2F-LRazKUYGCPRZ_AtSuAe%2Ffractional_max-pooling_2.png?generation=1542547408790425\&alt=media)

* **Fractional Max-Pooling**
  * Reduces the spatial size of the image by a factor of α, where α ∈ (1, 2).
  * Introduces randomness in terms of choice of pooling region.
  * Pooling regions can be chosen in a random or pseudorandom manner.
  * Pooling regions can be disjoint or overlapping.
  * Random FMP is good on its own but may underfit when combined with dropout or training data augmentation.
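The core construction can be sketched as follows. With α = N_in/N_out ∈ (1, 2), the pooling boundaries form an increasing integer sequence whose increments are all 1 or 2; the *random* variant simply shuffles a fixed mix of 1s and 2s (the paper's *pseudorandom* variant instead sets a_i = ⌈α(i + u)⌉ for a random u ∈ (0, 1)). The sketch below, in NumPy, implements the random, disjoint case; the function names and the square-output restriction are assumptions for illustration, not the paper's code.

```python
import random
import numpy as np

def random_boundaries(n_in, n_out):
    """Random FMP boundary sequence: 0 = a_0 < a_1 < ... < a_{n_out} = n_in,
    with every increment a_i - a_{i-1} equal to 1 or 2."""
    # alpha = n_in / n_out in (1, 2) forces exactly (n_in - n_out) twos
    # and (2*n_out - n_in) ones; randomness comes from shuffling them.
    assert n_out < n_in < 2 * n_out
    incs = [2] * (n_in - n_out) + [1] * (2 * n_out - n_in)
    random.shuffle(incs)
    bounds = [0]
    for step in incs:
        bounds.append(bounds[-1] + step)
    return bounds

def fmp_2d(x, n_out):
    """Disjoint fractional max-pooling of a 2-D array down to n_out x n_out."""
    rows = random_boundaries(x.shape[0], n_out)
    cols = random_boundaries(x.shape[1], n_out)
    out = np.empty((n_out, n_out), dtype=x.dtype)
    for i in range(n_out):
        for j in range(n_out):
            out[i, j] = x[rows[i]:rows[i + 1], cols[j]:cols[j + 1]].max()
    return out

x = np.arange(25 * 25, dtype=float).reshape(25, 25)
y = fmp_2d(x, 18)   # 25 -> 18, i.e. alpha = 25/18, roughly 1.39
```

Because the boundary sequence is resampled per call, repeated forward passes pool over different regions, which is the source of the regularization effect the paper reports.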
