MDA 5203 MACHINE LEARNING - KCA Past Paper

UNIVERSITY EXAMINATIONS: 2016/2017
EXAMINATION FOR THE DEGREE OF MASTER OF SCIENCE IN
DATA ANALYTICS
MDA 5203 MACHINE LEARNING
ORDINARY EXAMINATIONS
DATE: AUGUST 2017    TIME: 2 HOURS
INSTRUCTIONS: Answer Question One & ANY OTHER TWO questions.

QUESTION ONE (COMPULSORY)
a) Logical operators (e.g. NOT, AND, OR, XOR) are the building blocks of any
computational device. They return one of two possible values, true or false, based on the
truth values of their operands. An AND operator returns true only when all of its
arguments are true; otherwise it returns false. If we denote true by 1 and false by 0, then
the AND function can be represented by the following table:

x1  x2  |  AND(x1, x2)
0   0   |  0
0   1   |  0
1   0   |  0
1   1   |  1

The function can be implemented by a single unit with two inputs:

 

If the weights are W1 = 1 and W2 = 1, and the activation function is:

Test whether the neural network correctly computes the AND function.
(6 Marks)
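The activation function is not reproduced in this copy. As a minimal sketch in Python, assuming a hypothetical hard-threshold activation that fires when the weighted sum reaches 1.5, the unit can be tested on all four input pairs:

# Sketch of the single AND unit with W1 = W2 = 1, assuming a hypothetical
# hard-threshold activation at 1.5 (the paper's activation is not shown).
def activation(net, threshold=1.5):
    return 1 if net >= threshold else 0

def and_unit(x1, x2, w1=1, w2=1):
    net = w1 * x1 + w2 * x2              # weighted sum of the two inputs
    return activation(net)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, and_unit(x1, x2))  # outputs 0, 0, 0, 1 -- matches AND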
b) Explain any four application areas of machine learning.
(4 Marks)
c) State Bayes' theorem and explain all of its component probabilities.
(4 Marks)
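For reference, Bayes' theorem is usually written (here in LaTeX) as:

P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}

where P(H | E) is the posterior, P(E | H) the likelihood, P(H) the prior, and P(E) the evidence (the marginal probability of the data).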
d) Evolutionary computation methods are based on the Darwinian concept of "survival of the
fittest". Using a suitable illustrative diagram and an example, discuss a typical evolutionary
algorithm.
(6 Marks)
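As a rough sketch of the kind of evolutionary algorithm the question asks about, the following Python fragment shows the usual select-crossover-mutate loop. The fitness function (counting 1-bits in a bit string) and all parameters are hypothetical, not taken from the paper:

import random

# Basic genetic algorithm maximising a hypothetical fitness function:
# the number of 1s in a fixed-length bit string (OneMax).
GENES, POP, GENERATIONS, MUTATION = 10, 20, 50, 0.05

def fitness(ind):
    return sum(ind)                      # count of 1-bits

def select(pop):                         # tournament selection of size 2
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):                   # single-point crossover
    cut = random.randint(1, GENES - 1)
    return p1[:cut] + p2[cut:]

def mutate(ind):                         # bit-flip mutation
    return [1 - g if random.random() < MUTATION else g for g in ind]

population = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
for _ in range(GENERATIONS):
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP)]

print(max(population, key=fitness))      # fittest individual found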
e) By first outlining the procedure for K-Nearest Neighbours (KNN), decide the class of the
new instance given in the table below. Use K = 3.
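The data table for this part is not reproduced in this copy. As an illustration only, the KNN procedure over a made-up two-feature data set could be sketched in Python as:

from collections import Counter
import math

# KNN sketch on hypothetical data (the paper's table is not reproduced).
# Each row is (feature1, feature2, class_label).
train = [(1.0, 1.0, 'A'), (1.5, 2.0, 'A'), (3.0, 4.0, 'B'), (5.0, 7.0, 'B')]

def knn_predict(x, y, k=3):
    # 1. Compute the Euclidean distance from the query to every training row.
    nearest = sorted(train, key=lambda r: math.hypot(r[0] - x, r[1] - y))
    # 2. Take the k nearest rows and 3. vote on the majority class.
    votes = Counter(r[2] for r in nearest[:k])
    return votes.most_common(1)[0][0]

print(knn_predict(1.2, 1.5))   # 'A' for this made-up data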

f) Explain the concept of K-Means and consequently determine the resulting clusters for the
dummy data given below. Use K = 2 and let A and B be the initial points chosen. Use
Euclidean distance as the distance metric.
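The dummy data is also missing from this copy, so the sketch below uses hypothetical 2-D points A-F, takes A and B as the initial centroids (mirroring the question's instruction), and iterates assignment and centroid updates with Euclidean distance:

import math

# K-Means sketch (K = 2) on hypothetical points; the paper's data is not
# reproduced. Points A and B are used as the initial centroids.
points = {'A': (1, 1), 'B': (5, 7), 'C': (1.5, 2), 'D': (3, 4),
          'E': (4, 7), 'F': (3.5, 5)}
centroids = [points['A'], points['B']]

for _ in range(10):                            # a few assignment/update rounds
    clusters = {0: [], 1: []}
    for name, p in points.items():
        d = [math.dist(p, c) for c in centroids]
        clusters[d.index(min(d))].append(name) # assign to nearest centroid
    centroids = [tuple(sum(points[n][i] for n in clusters[k]) / len(clusters[k])
                       for i in (0, 1))
                 for k in (0, 1)]              # recompute centroid means

print(clusters)                                # final cluster memberships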

QUESTION TWO
Consider a two-class decision problem represented by the input-target pairs below. Train a
perceptron network to solve this problem using the perceptron learning rule.
Let the initial weights and bias be W = [0, 0] and b = 0. Use the hardlim function as your
activation function, defined as hardlim(n) = 1 if n >= 0, and hardlim(n) = 0 otherwise.
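The input-target pairs are not reproduced in this copy. As a hedged illustration, the perceptron learning rule (W <- W + e*p, b <- b + e, with error e = t - hardlim(W.p + b)) can be run on a hypothetical linearly separable set like this:

# Perceptron learning rule sketch on hypothetical input-target pairs
# (the paper's pairs are not reproduced). Starts from W = [0, 0], b = 0.
def hardlim(n):
    return 1 if n >= 0 else 0

# Hypothetical two-class, linearly separable training data: (p1, p2), target.
pairs = [((2, 2), 1), ((1, -2), 1), ((-2, 2), 0), ((-1, 1), 0)]

w, b = [0.0, 0.0], 0.0
for _ in range(10):                                   # epochs
    for (p1, p2), t in pairs:
        a = hardlim(w[0] * p1 + w[1] * p2 + b)        # network output
        e = t - a                                     # error
        w = [w[0] + e * p1, w[1] + e * p2]            # W <- W + e*p
        b = b + e                                     # b <- b + e

print(w, b)   # converged weights and bias for this made-up data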

QUESTION THREE
a) Decision trees are one of the main methods that use induction as a learning algorithm.
Explain the main concept behind induction.
(2 Marks)
b) Consider the following data set for a binary class problem.

i. Which attribute would the decision tree induction algorithm choose as the root
attribute, based on entropy?
(2 Marks)
ii. Develop the entire decision tree model without pruning.
(6 Marks)
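The binary-class data set itself is missing from this copy. The entropy-based attribute choice can nonetheless be sketched in Python, using a hypothetical set of records with two categorical attributes A and B:

import math
from collections import Counter

# Information-gain sketch on hypothetical records (the paper's data set is
# not reproduced). Each record maps attribute names to values plus a class.
records = [
    {'A': 'T', 'B': 'F', 'class': '+'}, {'A': 'T', 'B': 'T', 'class': '+'},
    {'A': 'F', 'B': 'T', 'class': '-'}, {'A': 'F', 'B': 'F', 'class': '-'},
    {'A': 'T', 'B': 'F', 'class': '-'}, {'A': 'F', 'B': 'F', 'class': '+'},
]

def entropy(rows):
    counts = Counter(r['class'] for r in rows)
    total = len(rows)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def info_gain(rows, attr):
    # Gain(attr) = Entropy(rows) - weighted entropy of each value's subset.
    remainder = 0.0
    for value in {r[attr] for r in rows}:
        subset = [r for r in rows if r[attr] == value]
        remainder += len(subset) / len(rows) * entropy(subset)
    return entropy(rows) - remainder

# The attribute with the highest information gain becomes the root split.
print(max(('A', 'B'), key=lambda a: info_gain(records, a)))   # 'A' here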
QUESTION FOUR
Using Naive Bayes, determine the class of the new instance

based on the table given in Question 3(b).
(10 Marks)
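Since the Question 3(b) table is not reproduced here, the sketch below applies Naive Bayes to the same hypothetical records used in the decision-tree sketch, classifying a made-up new instance (A = 'T', B = 'T'):

from collections import Counter

# Naive Bayes sketch reusing the hypothetical records from the decision-tree
# example above; the new instance (A='T', B='T') is also made up.
records = [
    {'A': 'T', 'B': 'F', 'class': '+'}, {'A': 'T', 'B': 'T', 'class': '+'},
    {'A': 'F', 'B': 'T', 'class': '-'}, {'A': 'F', 'B': 'F', 'class': '-'},
    {'A': 'T', 'B': 'F', 'class': '-'}, {'A': 'F', 'B': 'F', 'class': '+'},
]
new_instance = {'A': 'T', 'B': 'T'}

priors = Counter(r['class'] for r in records)          # class counts
scores = {}
for cls, prior in priors.items():
    # P(class) * product over attributes of P(attribute=value | class)
    score = prior / len(records)
    for attr, value in new_instance.items():
        in_class = [r for r in records if r['class'] == cls]
        score *= sum(r[attr] == value for r in in_class) / len(in_class)
    scores[cls] = score

print(max(scores, key=scores.get), scores)   # predicted class and raw scores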
