# (2014. 12) Fractional Max-pooling

* Submitted on 2014. 12
* Benjamin Graham

## Simple Summary

> Our version of max-pooling is stochastic as there are lots of different ways of constructing suitable pooling regions. We find that our form of fractional max-pooling reduces overfitting on a variety of datasets.

* Spatial pooling layers are building blocks for Convolutional Neural Networks (CNNs).
* **Max Pooling 2x2**
  * Pros:
    1. Fast.
    2. Quickly reduces the size of the hidden layer.
    3. Encodes a degree of invariance with respect to translations and elastic distortions.
  * Cons:
    1. Pooling regions are disjoint, which limits the choice of pooling geometry.
    2. Since the spatial size halves with every pooling layer, stacks of back-to-back convolutional layers are needed to build deep networks.
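The standard disjoint 2x2 max-pooling described above can be sketched in a few lines of NumPy (a minimal illustration, not the paper's code; it assumes even spatial dimensions):

```python
import numpy as np

def max_pool_2x2(x):
    """Disjoint 2x2 max pooling: halves each spatial dimension
    by taking the maximum over non-overlapping 2x2 blocks."""
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

x = np.arange(16).reshape(4, 4)
print(max_pool_2x2(x))  # [[ 5  7] [13 15]]
```

Note how aggressive the reduction is: each application discards three quarters of the activations, which is why only a few such layers fit in a network.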


* **Fractional Max-Pooling**
  * Reduces the spatial size of the image by a factor of α, where α ∈ (1, 2).
  * Introduces randomness in terms of choice of pooling region.
  * Pooling regions can be chosen in a random or pseudorandom manner.
  * Pooling regions can be disjoint or overlapping.
  * Random FMP is good on its own but may underfit when combined with dropout or training data augmentation; pseudorandom FMP holds up better in that setting.
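The two ways of generating pooling regions can be sketched as increment sequences of 1s and 2s that partition the input of size `n_in` into `n_out` regions, with `alpha = n_in / n_out` in (1, 2). This is a hedged sketch of the construction in the paper (function names are illustrative, not from the paper's code):

```python
import math
import random

def random_increments(n_in, n_out):
    """Random FMP: a shuffled mix of 1s and 2s summing to n_in,
    giving n_out disjoint pooling regions of width 1 or 2."""
    assert n_out < n_in < 2 * n_out  # alpha = n_in/n_out must lie in (1, 2)
    incs = [2] * (n_in - n_out) + [1] * (2 * n_out - n_in)
    random.shuffle(incs)
    return incs

def pseudorandom_increments(n_in, n_out, u=None):
    """Pseudorandom FMP: region boundaries a_i = ceil(alpha * (i + u)),
    with alpha = n_in/n_out and u drawn uniformly from (0, 1) per pass.
    Consecutive differences are then increments of 1 or 2."""
    alpha = n_in / n_out
    if u is None:
        u = random.random()
    bounds = [math.ceil(alpha * (i + u)) for i in range(n_out + 1)]
    return [bounds[i + 1] - bounds[i] for i in range(n_out)]

def fmp_1d(x, incs):
    """Max over the consecutive disjoint regions given by the increments."""
    out, pos = [], 0
    for w in incs:
        out.append(max(x[pos:pos + w]))
        pos += w
    return out
```

Both generators produce a valid partition (the increments always sum to `n_in`), but the pseudorandom version depends only on the single draw `u`, whereas the random version shuffles the region widths independently, which is the source of the extra regularization noise.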
