
Which among the following may help to reduce overfitting demonstrated by a model?

  1. Change the loss function. 
  2. Reduce model complexity.
  3. Increase the training data.
  4. Increase the number of optimization routine steps.

 

  1. $\text{i and ii}$
  2. $\text{ii and iii}$
  3. $\text{i, ii, and iii}$
  4. $\text{i, ii, iii, and iv}$

6 Answers

2 votes

Ans: Choice B

Option 2 is an obvious one and hence included in every choice 😁

Option 3, training with more data, is one of the standard techniques to reduce overfitting.

Reference: https://www.ibm.com/topics/overfitting

1 votes
  1. Change the loss function:

    • Choosing a loss function that includes regularization terms (for example, an L1 or L2 penalty on the weights) penalizes overly complex solutions and can guide the model towards better generalization.
  2. Reduce model complexity:

    • Simplifying the model architecture, like using fewer layers or parameters, can make it less prone to memorizing the training data and more likely to generalize well to new, unseen data.
  3. Increase the training data:

    • More data can provide the model with a broader view of the underlying patterns in the data, making it less likely to overfit specific examples. It's like giving the model a richer experience to learn from.
  4. Increase the number of optimization routine steps:

    • Increasing the number of optimization steps can make the model more prone to overfitting the training data, especially if the model is already complex. It's generally better to focus on the other strategies mentioned.

So, the effective ways to combat overfitting from your list would be:

  • Change the loss function.
  • Reduce model complexity.
  • Increase the training data.

 

So the correct answer is C.

0 votes
  1. ii and iii –  Answer

Overfitting means the training error is low but the test error is very high. That is, the variance of the trained model is high while its bias is low on the given training data.

To reduce the variance of the model the following steps can be taken – 

  1. Reduce the model complexity (increase regularization).
  2. Train on larger dataset.
  3. Stop the training early (early stopping).
  4. Bagging.

Reference – https://www.youtube.com/watch?v=zUJbRO0Wavo&list=PLl8OlHZGYOQ7bkVbuRthEsaLr7bONzbXS&index=19&pp=iAQB

https://www.youtube.com/watch?v=65UJPA10dW8&list=PLl8OlHZGYOQ7bkVbuRthEsaLr7bONzbXS&index=20&pp=iAQB
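One of the variance-reduction steps listed above, early stopping, can be sketched in a few lines of NumPy (everything here, data and constants alike, is invented for the demo): with plain gradient descent on a least-squares problem, the held-out error typically falls, bottoms out, and then rises as the model starts fitting noise, so we would keep the weights from the best validation step rather than the last one:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 20, 15                      # few samples, many features -> overfitting regime
w_true = np.concatenate([[2.0, -1.0, 0.5], np.zeros(d - 3)])

X = rng.normal(size=(n, d))
y = X @ w_true + rng.normal(0.0, 1.0, n)
X_val = rng.normal(size=(500, d))
y_val = X_val @ w_true + rng.normal(0.0, 1.0, 500)

w = np.zeros(d)
lr = 0.1
val_mse = []
for step in range(2000):
    grad = (2.0 / n) * X.T @ (X @ w - y)   # gradient of the mean squared error
    w -= lr * grad
    val_mse.append(float(np.mean((X_val @ w - y_val) ** 2)))

best_step = int(np.argmin(val_mse))        # where early stopping would halt
```

Running to full convergence drives the training error toward zero while the validation error climbs back up; stopping at `best_step` keeps the better-generalizing intermediate model.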

0 votes
Answer: C

Statement 1 is also true: modifying the loss function can indeed help reduce overfitting. For example, adding regularization terms to the loss function (like an L1 or L2 penalty) penalizes overly complex models.
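As a hedged sketch of that point (the function name and signature are mine, not from any particular library), a squared-error loss with an L1 or L2 penalty bolted on looks like this:

```python
import numpy as np

def regularized_loss(w, X, y, lam=0.1, penalty="l2"):
    """Squared-error data term plus an optional penalty on the weights."""
    data_term = np.mean((X @ w - y) ** 2)
    if penalty == "l2":
        reg_term = lam * np.sum(w ** 2)       # ridge: discourages large weights
    elif penalty == "l1":
        reg_term = lam * np.sum(np.abs(w))    # lasso: also pushes weights to zero
    else:
        reg_term = 0.0                        # plain, unregularized loss
    return float(data_term + reg_term)
```

Minimizing this instead of the plain squared error trades a little training accuracy for smaller weights, which is exactly the overfitting control described above.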
