STUDENT
AI - EVALUATION
Total Q: 31
Time: 30 Mins
Q 1.
Which one of the following scenarios results in a high false-negative cost?
Viral outbreak
Mining
Copyright detection
Spam filter
Q 2.
In spam email detection, which of the following would be considered a "False Negative"?
When a legitimate email is accurately identified as not spam.
When a spam email is mistakenly identified as legitimate.
When an email is accurately recognised as spam.
When an email is inaccurately labelled as important.
Q 3.
Prediction and Reality can be easily mapped together with the help of:
Prediction
Reality
Accuracy
Confusion Matrix
Q 4.
Which of the following statements is not true about overfitting models?
This model learns the patterns and noise in the data to such an extent that it harms its performance on new data
Training result is very good and the test result is poor
It interprets noise as patterns in the data
The training accuracy and test accuracy both are low
Q 5.
Which evaluation parameter takes into account the True Positives and False Positives?
Precision
Recall
F1 Score
Accuracy
Q 6.
Which of the following indicates how true a model's predictions are?
Accuracy
Reliability
Recall
F1 score
Q 7.
____ is known as the perfect value for the F1 Score.
1
2
0
100%
Q 8.
Statement 1 : Confusion matrix is an evaluation metric.
Statement 2 : Confusion Matrix is a record which helps in evaluation.
Both Statement 1 and Statement 2 are correct.
Both Statement 1 and Statement 2 are incorrect.
Statement 1 is correct and Statement 2 is incorrect.
Statement 2 is correct and Statement 1 is incorrect.
Q 9.
Which of the following is defined as the measure of balance between precision and recall?
Accuracy
F1 Score
Reliability
Punctuality
Q 10.
Sarthak made a face mask detector system, for which he collected a dataset and used all of it to train the model. Then he used a different dataset to evaluate the model, which gave the correct answer every time. Name the concept.
Perfect Fit
Underfitting
Overfitting
Correct Fit
Q 11.
Which of the following statements is true for the term Evaluation?
Helps in classifying the type and genre of a document.
It helps in predicting the topic for a corpus.
Helps in understanding the reliability of any AI model
Process to extract the important information out of a corpus.
Q 12.
When the prediction matches the reality, the condition is termed as ______.
True Positive or True Negative
True Positive or False Negative
True Positive or False Positive
False Positive and False Negative
Q 13.
Statement 1: The output given by the AI model is known as reality.
Statement 2: The real scenario is known as prediction.
Both Statement 1 and Statement 2 are correct
Both Statement 1 and Statement 2 are incorrect
Statement 1 is correct but Statement 2 is incorrect
Statement 2 is correct but Statement 1 is incorrect
Q 14.
Which one of the following scenarios results in a high false-positive cost?
Viral outbreak
Forest fire
Flood
Spam filter
Q 15.
What will be the outcome, if the Prediction is "Yes" and it matches with the Reality?
What will be the outcome, if the Prediction is "Yes" and it does not match the Reality?
True Positive, True Negative
True Negative, False Negative
True Negative, False Positive
True Positive, False Positive
Q 16.
The two conditions when the prediction matches the reality are true positive and _______.
True Negative
False Positive
False Negative
Negative False
Q 17.
What is the primary need for evaluating an AI model's performance in the AI Model Development process?
To increase the complexity of the model.
To visualize the data.
To assess how well the chosen model will work in future.
To reduce the amount of data used for training.
Q 18.
Differentiate between Prediction and Reality.
Prediction is the input given to the machine to receive the expected result of the reality.
Prediction is the output given to match the reality.
The prediction is the output which is given by the machine and the reality is the real scenario in which the prediction has been made.
Prediction and reality both can be used interchangeably.
Q 19.
____________ is used to record the result of comparison between the prediction and reality. It is not an evaluation metric but a record which can help in evaluation.
Confusion Matrix
F1 Score
Precision
Accuracy
Q 20.
While evaluating a model's performance, the recall parameter considers:
(i) False positive
(ii) True positive
(iii) False negative
(iv) True negative
Choose the correct option :
only (i)
(ii) and (iii)
(iii) and (iv)
(i) and (iv)
Q 21.
F1 Score is the measure of the balance between
Accuracy and Precision
Precision and Recall
Recall and Accuracy
Recall and Reality
Q 22.
Raunak was learning the conditions that make up the confusion matrix. He came across a scenario in which the machine that was supposed to predict an animal was always predicting not an animal. What is this condition called?
False Positive
True Positive
False Negative
True Negative
Q 23.
Statement 1: To evaluate a model's performance, we need either precision or recall.
Statement 2 : When the value of both Precision and Recall is 1, the F1 score is 0.
Both statement 1 and statement 2 are correct.
Both statement 1 and statement 2 are incorrect.
Statement 1 is correct, but statement 2 is incorrect.
Statement 1 is incorrect, but statement 2 is correct.
Q 24.
______ is one of the parameters for evaluating a model's performance and is defined as the fraction of positive cases that are correctly identified.
Precision
Recall
Accuracy
F1
Q 25.
Sarthak made a face mask detector system, for which he collected a dataset and used all of it to train the model. Then he used the same data to evaluate the model, which gave the correct answer every time but could not perform on an unknown dataset. Name the concept.
Underfitting
Perfect Fit
Overfitting
True Positive
Q 26.
________ helps to find the best model that represents our data and how well the chosen model will work in future.
Problem Scoping
Data Acquisition
Data Exploration
Evaluation
Q 27.
Which two evaluation methods are used to calculate F1 Score?
Precision and Accuracy
Precision and Recall
Accuracy and Recall
Precision, F1 score
Q 28.
The output given by the AI machine is known as ________
Prediction
Reality
True
False
Q 29.
Which evaluation parameter takes into consideration all the correct predictions?
Accuracy
Precision
Recall
F1 Score
Q 30.
The Recall evaluation method is
defined as the fraction of positive cases that are correctly identified.
defined as the percentage of true positive cases versus all the cases where the prediction is true.
defined as the percentage of correct predictions out of all the observations.
comparison between the prediction and reality
Q 31.
Priya was confused by the terms used in the evaluation stage. Suggest to her the term used for the percentage of correct predictions out of all the observations.
Accuracy
Precision
Recall
F1 Score