Definition

A regularization technique in neural networks where randomly selected neurons are temporarily removed (their activations set to zero) during each training step. Dropout prevents co-adaptation of neurons and reduces overfitting by forcing the network to learn more robust, distributed representations. At inference time all neurons remain active, with activations scaled so that expected outputs match those seen during training.
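The idea can be sketched as "inverted dropout", the common formulation in which surviving activations are scaled up by 1/(1-p) during training so that no rescaling is needed at inference. The function name and parameters below are illustrative, not from any particular library:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p during training,
    scaling survivors by 1/(1-p) so expected activations match inference."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)

x = np.ones((4, 8))
# Training: a random subset of activations is zeroed, the rest scaled to 2.0.
train_out = dropout(x, p=0.5, training=True, rng=np.random.default_rng(0))
# Inference: the input passes through unchanged.
eval_out = dropout(x, p=0.5, training=False)
```

Because the mask is resampled on every forward pass, each training step effectively trains a different thinned sub-network, which is what discourages any single neuron from relying on specific partners.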

Defined Term

Dropout