In fitting some data using radial basis functions with kernel width $σ$, we compute a training error of $345$ and a testing error of $390$.

(a) increasing $σ$ will most likely reduce test set error

(b) decreasing $σ$ will most likely reduce test set error

(c) not enough information is provided to determine how $σ$ should be changed

1 Answer


I think option (a) is correct.

The training error ($345$) is lower than the test error ($390$), which suggests the model is overfitting the training data.

A small $σ$ makes each radial basis function narrow, so the fitted curve captures the noise and peculiarities of the training data and is more likely to overfit. Increasing $σ$ makes the basis functions broader, producing a smoother fit that is less prone to overfitting. That is why increasing $σ$ will most likely reduce the test set error.
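This effect is easy to demonstrate with a small experiment. Below is a minimal sketch (not from the question itself; the data, the ridge term, and the specific $σ$ values are my own assumptions for illustration) that fits kernel ridge regression with a Gaussian RBF to noisy samples of $\sin(x)$ and compares a very narrow kernel against a broader one:

```python
import numpy as np

def rbf_kernel(a, b, sigma):
    # Gaussian RBF: k(x, x') = exp(-(x - x')^2 / (2 * sigma^2))
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * sigma ** 2))

def fit_predict(x_train, y_train, x_query, sigma, ridge=1e-6):
    # Kernel ridge regression; the tiny ridge term is only for
    # numerical stability when inverting the kernel matrix.
    K = rbf_kernel(x_train, x_train, sigma)
    alpha = np.linalg.solve(K + ridge * np.eye(len(x_train)), y_train)
    return rbf_kernel(x_query, x_train, sigma) @ alpha

rng = np.random.default_rng(0)
x_train = np.sort(rng.uniform(0, 10, 40))
x_test = np.sort(rng.uniform(0, 10, 40))
true_f = np.sin
y_train = true_f(x_train) + 0.3 * rng.normal(size=x_train.size)
y_test = true_f(x_test) + 0.3 * rng.normal(size=x_test.size)

for sigma in (0.05, 1.0):
    mse_train = np.mean((fit_predict(x_train, y_train, x_train, sigma) - y_train) ** 2)
    mse_test = np.mean((fit_predict(x_train, y_train, x_test, sigma) - y_test) ** 2)
    print(f"sigma={sigma}: train MSE={mse_train:.3f}, test MSE={mse_test:.3f}")
```

With $σ = 0.05$ the kernel matrix is nearly the identity, so the model essentially interpolates the noisy training points: training error is near zero but test error is large. With $σ = 1.0$ the training error rises while the test error drops, which is exactly the regime the question describes.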
