SOFT COMPUTING (SC) [ELECTIVE], Semester 7,
B.E. Computer Science (CS), December 2011.
Con.6873-11
MP- 5602
(3 Hours)
[Total Marks: 100]
N.B. 1. Question No. 1 is compulsory.
2. Attempt any four of the remaining questions.
3. Assume suitable data if necessary and justify the assumptions.
4. Figures to the right indicate full marks.
1. (a) Explain the McCulloch-Pitts neuron model with the help of an example. --- (5 Marks)
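As background for Q1(a), a minimal sketch of a McCulloch-Pitts unit; the AND-gate weights and threshold below are an assumed illustration, not part of the question:

```python
# Minimal McCulloch-Pitts neuron sketch (illustrative example only).
def mcp_neuron(inputs, weights, threshold):
    # Output 1 if the weighted sum of inputs reaches the threshold, else 0.
    net = sum(w * x for w, x in zip(weights, inputs))
    return 1 if net >= threshold else 0

# Two excitatory inputs with weight 1 and threshold 2 realize logical AND.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, mcp_neuron([x1, x2], [1, 1], 2))
```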
(b) Define support, core, normality, crossover point and α-cut for a fuzzy set. --- (5 Marks)
(c) A neuron with 4 inputs has the weight vector w = [1 2 3 4]^T. The activation function is linear, given by f(net) = 2 · net. If the input vector is x = [5 6 7 8]^T, find the output of the neuron. --- (5 Marks)
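For Q1(c) the output follows directly from the definitions in the question; a quick sketch to verify the arithmetic:

```python
# Worked check for Q1(c): linear activation f(net) = 2 * net.
w = [1, 2, 3, 4]
x = [5, 6, 7, 8]
net = sum(wi * xi for wi, xi in zip(w, x))  # w^T x = 5 + 12 + 21 + 32 = 70
output = 2 * net                            # f(net) = 2 * 70 = 140
print(net, output)
```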
(d) Explain the fuzzy extension principle with an example. --- (5 Marks)
2. (a) High-speed rail monitoring devices sometimes make use of sensitive sensors to measure the deflection of the earth when a rail car passes. These deflections are measured with respect to some distance from the rail car and hence are actually very small angles measured in microradians. Let a universe of deflections be A = [1, 2, 3, 4], where A is the angle in microradians, and let a universe of distances be D = [1, 2, 5, 7], where D is the distance in feet. Suppose a relation R between these two parameters has been determined as follows: --- (10 Marks)
Now let a universe of rail car weights be W = [1, 2], where W is the weight in units of 100,000 pounds. Suppose the fuzzy relation S of W to A is given by
Using these two relations, find the composition T = R ∘ S:
- using max-min composition
- using max-product composition
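Both compositions can be checked mechanically; the sketch below uses small hypothetical membership matrices, since the question's own relation matrices are not reproduced here:

```python
# Sketch of max-min and max-product composition of fuzzy relations.
# The matrices R and S below are hypothetical stand-ins for the
# relations given in the question.
def compose(R, S, combine):
    # T[i][j] = max over k of combine(R[i][k], S[k][j]).
    rows, inner, cols = len(R), len(S), len(S[0])
    return [[max(combine(R[i][k], S[k][j]) for k in range(inner))
             for j in range(cols)] for i in range(rows)]

R = [[0.3, 0.8], [0.6, 0.5]]  # assumed membership values
S = [[0.2, 0.9], [0.7, 0.4]]  # assumed membership values

T_maxmin = compose(R, S, min)                   # max-min composition
T_maxprod = compose(R, S, lambda a, b: a * b)   # max-product composition
```

The same `compose` helper answers both bullets; only the pairwise combiner changes.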
3. (a) Explain the error back-propagation training algorithm with the help of a flowchart. --- (10 Marks)
(b) Explain the genetic algorithm with the help of an example. --- (10 Marks)
4. (a) Explain the random search method with an example. --- (10 Marks)
(b) A single-neuron network using f(net) = sgn(net) has been trained using the pairs (Xi, di) given below: --- (10 Marks)
X1 = [1 -2 3 -1]^T, d1 = -1
X2 = [0 -1 2 -1]^T, d2 = 1
X3 = [-2 0 -3 -1]^T, d3 = -1
The final weights obtained using the perceptron rule are
W4 = [3 2 6 1]^T
Knowing that a correction has been performed at each step with c = 1, determine the following weights:
(a) W3, W2, W1, by backtracking the training.
(b) W5, W6, W7, obtained for steps 4, 5, 6 of training by reusing the sequence (X1, d1), (X2, d2), (X3, d3).
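A sketch for Q4(b), under the assumption that each training step applies the perceptron correction w_next = w + c · d · x (so backtracking simply subtracts the same term); the pairs and W4 are taken from the question:

```python
# Perceptron-rule stepping sketch, assuming the update
# w_{k+1} = w_k + c * d_k * x_k at every step (c = 1).
def step(w, x, d, c=1):
    # Forward training step with a correction applied.
    return [wi + c * d * xi for wi, xi in zip(w, x)]

def backstep(w_next, x, d, c=1):
    # Invert a training step to recover the earlier weight vector.
    return [wi - c * d * xi for wi, xi in zip(w_next, x)]

pairs = [([1, -2, 3, -1], -1),
         ([0, -1, 2, -1], 1),
         ([-2, 0, -3, -1], -1)]

w4 = [3, 2, 6, 1]
# (a) Backtrack: steps 3, 2, 1 used (X3,d3), (X2,d2), (X1,d1).
w3 = backstep(w4, *pairs[2])
w2 = backstep(w3, *pairs[1])
w1 = backstep(w2, *pairs[0])
# (b) Roll forward for steps 4, 5, 6, reusing the same sequence.
w5 = step(w4, *pairs[0])
w6 = step(w5, *pairs[1])
w7 = step(w6, *pairs[2])
```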
5. (a) Explain the perceptron learning rule with an example. ---- (10 Marks)
(b) Explain a gradient-based optimization technique with an example. ---- (10 Marks)
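For Q5(b), a minimal sketch of steepest descent; the objective f(x) = (x - 3)^2 and the step size are assumed for illustration:

```python
# Steepest-descent sketch: repeatedly move against the gradient.
def grad_descent(grad, x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # update rule: x <- x - lr * f'(x)
    return x

# Minimize the assumed test function f(x) = (x - 3)^2, f'(x) = 2(x - 3).
x_min = grad_descent(lambda x: 2 * (x - 3), x0=0.0)
```

With this objective the iterates converge to the minimizer x = 3.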
6. Design a fuzzy logic controller for a train approaching or leaving a station. The inputs are the distance from the station and the speed of the train; the output is the amount of brake power applied. Use four descriptors for each variable and use the Mamdani fuzzy model. ---- (20 Marks)
7. Write short notes on any two of the following: --- (20 Marks)
(a) TSP using simulated annealing
(b) Kohonen's self-organizing network
(c) Character Recognition using neural network
(d) RBF network