What is batch normalization in deep learning?
Experience Level: Junior
Tags: Machine learning
Answer
Batch normalization is a technique used in deep learning to stabilize and speed up the training of neural networks. For each mini-batch, it normalizes a layer's inputs (the activations of the previous layer) to zero mean and unit variance using the batch statistics, then applies learnable scale and shift parameters so the network retains its representational power; it is typically applied before the activation function. Batch normalization can accelerate training, allow higher learning rates, act as a mild regularizer that helps reduce overfitting, and make the model more robust to shifts in the distribution of layer inputs during training.
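To make the computation concrete, here is a minimal NumPy sketch of the batch normalization forward pass in training mode; the function name `batch_norm` and the example shapes are illustrative, and the sketch omits the running statistics that a real layer tracks for use at inference time.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize a batch of activations, then scale and shift.

    x:     activations of shape (batch_size, num_features)
    gamma: learnable scale, shape (num_features,)
    beta:  learnable shift, shape (num_features,)
    """
    mean = x.mean(axis=0)                     # per-feature mean over the batch
    var = x.var(axis=0)                       # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)   # zero mean, unit variance
    return gamma * x_hat + beta               # learnable scale and shift

# Example: a batch of 4 samples with 3 features each
x = np.random.randn(4, 3)
out = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean(axis=0), out.var(axis=0))      # approximately 0 and 1 per feature
```

With `gamma` initialized to ones and `beta` to zeros the layer starts as a plain standardization; during training these parameters are learned by backpropagation along with the rest of the network.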
Related Machine learning job interview questions
What is a loss function in machine learning? (Machine learning, Junior)
What is a decision tree in machine learning? (Machine learning, Junior)
What is a kernel in machine learning? (Machine learning, Junior)
What is a hyperparameter in machine learning? (Machine learning, Junior)
What is backpropagation? (Machine learning, Junior)