Dropout

Parametric Model Selection and Averaging

One challenge in neural network construction is the selection of a large number of hyperparameters, such as the depth of the network and the number of neurons in each layer. Furthermore, the choice of the activation function also affects performance, depending on the application at hand. The presence of a large number of parameters creates problems in model construction, because performance can be sensitive to the particular configuration used. One possibility is to hold out a portion of the training data and try different combinations of parameters and model choices. The configuration that provides the highest accuracy on the held-out portion of the training data is then used for prediction. This is, of course, the standard approach used for parameter tuning in all machine learning models, and it is also referred to as model selection. In a sense, model selection is inherently an ensemble-centric approach...
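As a concrete illustration of the hold-out procedure described above, the following is a minimal sketch of selecting the network depth, layer width, and activation function by accuracy on a held-out split. The synthetic dataset, the candidate grid, and the use of scikit-learn's MLPClassifier are assumptions made for illustration, not part of the original post.

```python
from itertools import product

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic data standing in for a real training set (assumption).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# Hold out a portion of the training data for model selection.
X_train, X_held, y_train, y_held = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Candidate configurations: the tuples encode depth and width,
# and the activation function is varied as well.
layer_options = [(64,), (64, 64), (128, 64, 32)]
activation_options = ["relu", "tanh"]

best_score, best_config = -1.0, None
for layers, act in product(layer_options, activation_options):
    model = MLPClassifier(
        hidden_layer_sizes=layers,
        activation=act,
        max_iter=500,
        random_state=0,
    )
    model.fit(X_train, y_train)
    score = model.score(X_held, y_held)  # accuracy on the held-out portion
    if score > best_score:
        best_score, best_config = score, (layers, act)

print("Selected configuration:", best_config)
print("Held-out accuracy:", best_score)
```

In practice the exhaustive loop is often replaced by cross-validation or randomized search, but the underlying idea is the same: the configuration that performs best on data not used for training is the one retained for prediction.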