
NPTEL | DATA MINING | WEEK-4

1. Which of the following statements is true about the Bayes classifier?
   - It always provides zero error when class distributions are known
   - It always provides the lowest possible error when class distributions are known
   - It may not always provide the lowest possible error when class distributions are known
   - It always provides the lowest possible error when class distributions are estimated

2. Let A be an example, and C be a class. The probability P(C|A) is known as:
   - Apriori probability
   - Aposteriori probability
   - Class conditional probability
   - None of the above

3. Let A be an example, and C be a class. The probability P(C) is known as:
   - Apriori probability
   - Aposteriori probability
   - Class conditional probability
   - None of the above

4. Consider a binary classification problem with two classes C1 and C2. The class labels of ten other training set instances, sorted in increasing order of their distance to an instance x, are: {C1, C2, C1, C2, C2, C2, C1, C2, C1, C2}. How will a K=3 nea...
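The K-nearest-neighbour question above can be answered by a simple majority vote over the K closest labels. A minimal sketch, assuming the ten labels are already sorted nearest-first (the function name `knn_predict` is mine, not from the quiz):

```python
from collections import Counter

# Labels of the ten training instances, sorted by increasing distance to x.
labels_by_distance = ["C1", "C2", "C1", "C2", "C2", "C2", "C1", "C2", "C1", "C2"]

def knn_predict(sorted_labels, k):
    """Majority class among the k nearest labels."""
    votes = Counter(sorted_labels[:k])
    return votes.most_common(1)[0][0]

print(knn_predict(labels_by_distance, 3))  # C1 (the 3 nearest are C1, C2, C1)
```

With K=3 the neighbourhood is {C1, C2, C1}, so the vote goes to C1; note that K=5 would flip the answer to C2, which is why the choice of K matters.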

NPTEL | DATA MINING | WEEK-3

1. Internal nodes of a decision tree correspond to:
   - Attributes
   - Classes
   - Data instances
   - None of the above

2. In a multiclass classification problem, the Bayes classifier assigns an instance to the class corresponding to the:
   - Highest aposteriori probability
   - Highest apriori probability
   - Lowest aposteriori probability
   - Lowest apriori probability

3. Three identical bags contain blue and yellow balls. The first bag contains 3 blue and 2 yellow balls, the second bag has 4 blue and 5 yellow balls, and the third bag has 2 blue and 4 yellow balls. A bag is chosen randomly and a ball is drawn from it. If the ball drawn is blue, what is the probability that the second bag was chosen?
   - 15/62
   - 27/62
   - 10/31
   - None of these

4. What is the entropy of the dataset?
   - 0.5
   - 0.94
   - 1
   - 0

5. Which attribute would information gain choose as the root of the tree?
   - Age
   - Income
   - Student
   - Credit rating

6. Will a person buy if {Age=35, Student=No, Income=Low, Credit_rating=Fair}?
   - Yes
   - No

The exampl...
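The bag question is a direct application of Bayes' theorem, and the entropy questions use the standard binary-entropy formula. A sketch of both computations; the 9-yes/5-no split in the entropy example is a hypothetical illustration, since the quiz's actual dataset is not reproduced above:

```python
from fractions import Fraction
import math

# Bayes' theorem for the three-bag question: P(bag2 | blue).
prior = Fraction(1, 3)                                      # each bag equally likely
p_blue = [Fraction(3, 5), Fraction(4, 9), Fraction(2, 6)]   # P(blue | bag_i)
posterior_bag2 = (prior * p_blue[1]) / sum(prior * p for p in p_blue)
print(posterior_bag2)  # 10/31

# Binary entropy of a dataset with p positives out of n examples.
def entropy(p, n):
    q = Fraction(p, n)
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

# Hypothetical 9-yes / 5-no split (not the quiz's table): entropy ≈ 0.94
print(round(entropy(9, 14), 2))  # 0.94
```

Because the priors are equal they cancel, leaving 4/9 divided by (3/5 + 4/9 + 1/3) = 20/62 = 10/31.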

NPTEL | DATA MINING | WEEK-2

1. If a store has N items, the number of possible itemsets is:
   - 2N - 1
   - 2^N - 1
   - N/2
   - N - 1

2. An association rule is valid if it satisfies:
   - Support criteria
   - Confidence criteria
   - Both support and confidence criteria
   - None of these

3. An itemset is frequent if it satisfies the:
   - Support criteria
   - Confidence criteria
   - Both support and confidence criteria
   - None of these

4. Which of the following properties is used by the Apriori algorithm?
   - Positive definiteness property of support
   - Positive semidefiniteness property of support
   - Monotone property of support
   - Antimonotone property of support

5. Consider three itemsets I1={bat, ball, wicket}, I2={bat, ball}, I3={bat}. Which of the following statements are correct?
   - A. support(I1) > support(I2)
   - B. support(I2) > support(I3)
   - Both statements A and B
   - None of statements A and B

For questions 6-10, consider the following small database of four transactions. The minimum support is 60% and the minimum confidence is 80%.

Trans_id          ...
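The itemset question above hinges on the antimonotone property of support that Apriori exploits: adding items to an itemset can never increase its support, so for I3 ⊆ I2 ⊆ I1 we get support(I1) ≤ support(I2) ≤ support(I3). A sketch on a hypothetical toy database (the transactions below are mine, not the quiz's truncated table):

```python
def support(itemset, transactions):
    """Fraction of transactions containing every item in `itemset`."""
    hits = sum(1 for t in transactions if itemset <= t)
    return hits / len(transactions)

# Hypothetical transaction database for illustration.
transactions = [
    {"bat", "ball", "wicket"},
    {"bat", "ball"},
    {"bat"},
    {"ball", "wicket"},
]

I1 = {"bat", "ball", "wicket"}
I2 = {"bat", "ball"}
I3 = {"bat"}

# Supersets can never be more frequent than their subsets.
assert support(I1, transactions) <= support(I2, transactions) <= support(I3, transactions)
print(support(I1, transactions), support(I2, transactions), support(I3, transactions))
# 0.25 0.5 0.75
```

This is why Apriori can prune: once an itemset falls below minimum support, every superset of it can be discarded without counting.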