IBM Deep Learning Fundamentals with Keras EDX week-2 MCQ quiz

This is the answer key to the Week 2 MCQ quiz of the IBM edX course DL0101EN: Deep Learning Fundamentals with Keras.


Question-1 The weights and biases in a neural network are optimized using:

Solution= Gradient Descent

Question-2 For a cost function, J = ∑_{i=1}^{m} (z_i − w·x_i − b)², that we would like to minimize, which of the following expressions represents updating the parameter, w, using gradient descent?

Solution= w → w − η · ∂J/∂w
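A minimal sketch of this update rule for the cost J = ∑(z_i − w·x_i − b)², applied to both w and b. The data, the learning rate `eta` (the η above), and the step count are illustrative assumptions, not part of the quiz:

```python
def grad_step(w, b, x, z, eta=0.01):
    # dJ/dw = sum_i 2*(z_i - w*x_i - b) * (-x_i)
    dJ_dw = sum(2 * (zi - w * xi - b) * (-xi) for xi, zi in zip(x, z))
    # dJ/db = sum_i 2*(z_i - w*x_i - b) * (-1)
    dJ_db = sum(2 * (zi - w * xi - b) * (-1) for xi, zi in zip(x, z))
    # Gradient descent update: w -> w - eta * dJ/dw (and likewise for b)
    return w - eta * dJ_dw, b - eta * dJ_db

# Toy data generated from z = 2x + 1; repeated updates drive w toward 2 and b toward 1.
x = [0.0, 1.0, 2.0, 3.0]
z = [1.0, 3.0, 5.0, 7.0]
w, b = 0.0, 0.0
for _ in range(2000):
    w, b = grad_step(w, b, x, z)
```

Note that the update subtracts the gradient scaled by η; the stray-looking options in the quiz (e.g. those multiplying by b) do not follow from differentiating J with respect to w.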

Question-3 What type of activation function is this?

[Plot of the activation function referred to in Question-3]

Solution= ReLU
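For reference, ReLU (the rectified linear unit) outputs max(0, x): zero for negative inputs and the identity for positive ones, which gives its plot the characteristic hinge shape. A quick sketch in plain Python:

```python
def relu(x):
    # ReLU: passes positive inputs through unchanged, zeroes out negatives.
    return max(0.0, x)

values = [relu(v) for v in [-2.0, -0.5, 0.0, 1.5]]  # [0.0, 0.0, 0.0, 1.5]
```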

Question-4 What type of activation function is this?

[Plot of the activation function referred to in Question-4]

Solution= Hyperbolic Tangent Function
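The hyperbolic tangent squashes its input into the range (−1, 1) and is centered at 0, which distinguishes its plot from the sigmoid's (0, 1) range. A sketch from the definition, checked against the standard library:

```python
import math

def tanh(x):
    # Hyperbolic tangent: (e^x - e^-x) / (e^x + e^-x), output in (-1, 1).
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))
```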

Question-5 True or False: the softmax activation function is most commonly used in hidden layers.

Solution= FALSE
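Softmax is most commonly used in the output layer of a multi-class classifier, where it turns raw class scores into a probability distribution; hidden layers typically use ReLU or tanh instead. A minimal sketch (the input scores are made-up example values):

```python
import math

def softmax(scores):
    # Subtract the max score for numerical stability, then normalize the
    # exponentials so the outputs sum to 1 (a probability distribution).
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Typical use: output-layer class scores -> class probabilities.
probs = softmax([2.0, 1.0, 0.1])
```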
