Suppose we want to compute the $10$-fold cross-validation error on $100$ training examples. We need to compute the error $N_1$ times, and the cross-validation error is the average of those errors. To compute each error, we build a model on data of size $N_2$ and test it on data of size $N_3$. What are the appropriate values of $N_1$, $N_2$, and $N_3$?

(a) $N_1 = 10, N_2 = 90, N_3 = 10$

(b) $N_1 = 1, N_2 = 90, N_3 = 10$

(c) $N_1 = 10, N_2 = 100, N_3 = 10$

(d) $N_1 = 10, N_2 = 100, N_3 = 10$

1 Answer


$N_1 = 10$:

  • In 10-fold cross-validation, the data is divided into 10 equal folds.
  • Each fold is used as a test set once, while the remaining 9 folds are used for training.
  • Therefore, we need to compute the error 10 times (once for each fold).

$N_2 = 90$:

  • In each of the 10 iterations, 9 of the 10 folds are used for training.
  • With 100 examples split into folds of 10, that gives $9 \times 10 = 90$ training examples (90% of the data).

$N_3 = 10$:

  • The remaining fold, the 10 examples (10% of the data) not used for training in that iteration, serves as the test set.

Hence the correct option is (a): $N_1 = 10, N_2 = 90, N_3 = 10$.
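As an illustration, here is a minimal sketch of the procedure in Python. The synthetic dataset, the logistic-regression model, and the misclassification error metric are placeholder assumptions (the question doesn't specify a model or metric); any estimator and error measure would do.

```python
# Minimal 10-fold cross-validation sketch.
# Assumed setup: a synthetic 100-example dataset and logistic regression.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold

X, y = make_classification(n_samples=100, random_state=0)  # 100 training examples

kf = KFold(n_splits=10, shuffle=True, random_state=0)  # 10 folds of 10 examples each
errors = []

for train_idx, test_idx in kf.split(X):            # N1 = 10 iterations
    model = LogisticRegression(max_iter=1000)
    model.fit(X[train_idx], y[train_idx])          # N2 = 90 training examples
    acc = model.score(X[test_idx], y[test_idx])    # N3 = 10 test examples
    errors.append(1.0 - acc)                       # misclassification error on this fold

cv_error = np.mean(errors)                         # CV error = average of the 10 fold errors
print(f"10-fold CV error: {cv_error:.3f}")
```

The loop runs exactly 10 times, each pass training on 90 examples and testing on the held-out 10, which is precisely the $N_1 = 10$, $N_2 = 90$, $N_3 = 10$ accounting from option (a).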
