NPTEL Introduction To Machine Learning – IITKGP Week 7 Assignment Answers 2024

By Sanket


1. Which of the following options is / are correct regarding the benefits of ensemble models?

  1. Better performance
  2. More generalized model
  3. Better interpretability

A) 1 and 3
B) 2 and 3
C) 1 and 2
D) 1, 2 and 3

Answer :- C) 1 and 2. Ensembles typically improve performance and generalization, but combining many base models makes the result harder to interpret, not easier.

2. In AdaBoost, we give more weight to points that were misclassified in previous iterations. Suppose we introduce a cap on the weight any point can take (for example, no point's weight may exceed a value of 10). Which of the following would be the effect of such a modification?

A) It will have no effect on the performance of the Adaboost method.
B) It makes the final classifier robust to outliers.
C) It may result in lower overall performance.
D) None of these.

Answer :- B) It makes the final classifier robust to outliers. Outliers tend to be misclassified repeatedly, so their weights grow without bound; capping the weights prevents a few noisy points from dominating training.

3. Identify whether the following statement is true or false:
“Boosting is easy to parallelize whereas bagging is inherently a sequential process.”

A) True
B) False

Answer :- B) False. It is the other way around: bagging trains its base models on independent bootstrap samples and is easy to parallelize, while boosting is inherently sequential because each stage depends on the weights produced by the previous one.

4. Considering the AdaBoost algorithm, which among the following statements is true?

A) In each stage, we try to train a classifier which makes accurate predictions on a subset of the data points where the subset contains more of the data points which were misclassified in earlier stages.
B) The weight assigned to an individual classifier depends upon the weighted sum error of misclassified points for that classifier.
C) Both option A and B are true
D) None of them are true

Answer :- C) Both option A and B are true.
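For reference, the two AdaBoost mechanisms that questions 2 and 4 rely on can be sketched in a few lines. This is a minimal illustration with made-up function names, not code from the course: a classifier's coefficient depends on its weighted error (option B), and reweighting pushes later stages toward previously misclassified points (option A).

```python
import math

def classifier_weight(weighted_error: float) -> float:
    """AdaBoost coefficient for a base classifier:
    alpha = 0.5 * ln((1 - e) / e), so a lower weighted error
    yields a larger vote in the final ensemble."""
    return 0.5 * math.log((1.0 - weighted_error) / weighted_error)

def update_point_weights(weights, correct, alpha):
    """Multiply each point's weight by exp(-alpha) if it was classified
    correctly and exp(+alpha) if it was misclassified, then renormalize
    so the weights sum to 1 again."""
    raw = [w * math.exp(-alpha if ok else alpha)
           for w, ok in zip(weights, correct)]
    total = sum(raw)
    return [w / total for w in raw]

# A classifier with 25% weighted error gets alpha = 0.5 * ln(3) ~ 0.55;
# the one misclassified point out of four then absorbs half the total weight.
alpha = classifier_weight(0.25)
new_weights = update_point_weights([0.25] * 4, [True, True, True, False], alpha)
```

Note how an uncapped update lets a persistently misclassified outlier keep doubling its share of the weight, which is exactly what the cap in question 2 prevents.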

5. Which of the following is FALSE about bagging?

A) Bagging increases the variance of the classifier
B) Bagging can help make robust classifiers from unstable classifiers.
C) Majority Voting is one way of combining outputs from various classifiers which are being bagged.

Answer :- A) Bagging increases the variance of the classifier. This statement is false: bagging reduces variance by averaging over bootstrap replicates, which is why it stabilizes unstable classifiers.
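The two ingredients of bagging mentioned in this question, bootstrap sampling and majority voting, can be sketched as follows (a toy sketch using the standard library, with illustrative function names):

```python
import random
from collections import Counter

def bootstrap_sample(data, rng):
    """Draw a sample of the same size as `data`, with replacement.
    Each bagged classifier is trained on one such sample."""
    return [rng.choice(data) for _ in data]

def majority_vote(predictions):
    """Combine one prediction per bagged classifier by majority vote."""
    return Counter(predictions).most_common(1)[0][0]

# Three bagged classifiers vote on one test point:
# two predict class 1, one predicts class -1, so the ensemble returns 1.
ensemble_prediction = majority_vote([1, 1, -1])
```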

6. Suppose the VC dimension of a hypothesis space is 6. Which of the following are true?

A) At least one set of 6 points can be shattered by the hypothesis space.
B) Two sets of 6 points can be shattered by the hypothesis space.
C) All sets of 6 points can be shattered by the hypothesis space.
D) No set of 7 points can be shattered by the hypothesis space.

Answer :- A and D. A VC dimension of 6 means some set of 6 points can be shattered (A) and no set of 7 points can be shattered (D); it does not require that every set of 6 points be shatterable.

7. Identify whether the following statement is true or false:
“Ensembles will yield bad results when there is a significant diversity among the models.”

A) True
B) False

Answer :- B) False. Diversity among the base models is precisely what makes an ensemble useful; if all models made the same errors, combining them would gain nothing.

8. Which of the following algorithms is not an ensemble learning algorithm?

A) Random Forest
B) Adaboost
C) Decision Trees

Answer :- C) Decision Trees. A single decision tree is a base learner; Random Forest and AdaBoost are ensembles built from many such learners.

9. Suppose you have run AdaBoost on a training set for three boosting iterations. The results are classifiers h1, h2, and h3, with coefficients α1 = 0.2, α2 = -0.3, and α3 = -0.2. For a given test input x, you find that the classifiers' outputs are h1(x) = 1, h2(x) = 1, and h3(x) = -1. What is the class returned by the AdaBoost ensemble classifier H on test example x?

A) 1
B) -1

Answer :- A) 1. H(x) = sign(0.2·1 + (-0.3)·1 + (-0.2)·(-1)) = sign(0.2 - 0.3 + 0.2) = sign(0.1) = 1.
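The weighted vote above is a one-liner to check (a small sketch of the standard AdaBoost decision rule, with an illustrative function name):

```python
def adaboost_predict(alphas, preds):
    """H(x) = sign(sum_i alpha_i * h_i(x)), returning +1 or -1."""
    score = sum(a * h for a, h in zip(alphas, preds))
    return 1 if score >= 0 else -1

# 0.2*1 + (-0.3)*1 + (-0.2)*(-1) = 0.1 > 0, so the ensemble outputs 1.
result = adaboost_predict([0.2, -0.3, -0.2], [1, 1, -1])
```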

10. Generally, an ensemble method works better if the individual base models have ____________. (Note: the individual models have accuracy greater than 50%.)

A) Less correlation among predictions
B) High correlation among predictions
C) Correlation does not have an impact on the ensemble output
D) None of the above.

Answer :- A) Less correlation among predictions. When each base model is better than chance and their errors are weakly correlated, the errors tend to cancel out when the predictions are combined.
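The effect of correlation can be seen in a toy simulation. Under the simplifying assumption that each base model's error is a unit-variance Gaussian sharing a common component with correlation rho, the variance of the averaged error is rho + (1 - rho)/n, so lower correlation means a more stable ensemble (the construction and function name below are illustrative, not from the course):

```python
import math
import random
import statistics

def ensemble_error_variance(rho, n_models=5, trials=20000, seed=0):
    """Monte Carlo estimate of Var(mean error) for n_models base models
    whose unit-variance Gaussian errors have pairwise correlation rho.
    Theory predicts rho + (1 - rho) / n_models."""
    rng = random.Random(seed)
    means = []
    for _ in range(trials):
        shared = rng.gauss(0, 1)  # error component common to all models
        errs = [math.sqrt(rho) * shared + math.sqrt(1 - rho) * rng.gauss(0, 1)
                for _ in range(n_models)]
        means.append(sum(errs) / n_models)
    return statistics.variance(means)

# Weakly correlated models average their errors away (variance near 0.2 for
# five models); strongly correlated models barely improve on a single model.
low = ensemble_error_variance(0.0)
high = ensemble_error_variance(0.9)
```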