Answer key for the IBM course DL0101EN, Deep Learning Fundamentals with Keras (edX), Week 2 MCQ quiz.
Question-1 The weights and biases in a neural network are optimized using:
Solution= Gradient Descent
Question-2 For a cost function J = Σ_{i=1}^{m} (z_i − w·x_i − b)², which we would like to minimize, which of the following expressions represents updating the parameter w using gradient descent?
Solution= w ← w + 2·η·Σ_{i=1}^{m} x_i·(z_i − w·x_i − b), which is the standard update w ← w − η·∂J/∂w with ∂J/∂w = −2·Σ_{i=1}^{m} x_i·(z_i − w·x_i − b)
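The update in Question-2 can be sketched numerically. The following is a minimal illustration, not course code: it applies w ← w − η·∂J/∂w (and the analogous update for b) to the cost J = Σ_i (z_i − w·x_i − b)², with an illustrative learning rate η and made-up data generated from z = 2x + 1.

```python
# Minimal gradient descent sketch for J = sum_i (z_i - w*x_i - b)^2.
# The learning rate eta and the sample data are illustrative assumptions.
import numpy as np

def gradient_step(w, b, x, z, eta=0.02):
    """One update: w <- w - eta*dJ/dw, b <- b - eta*dJ/db."""
    residual = z - w * x - b              # (z_i - w*x_i - b) per sample
    dJ_dw = -2.0 * np.sum(x * residual)   # derivative of J w.r.t. w
    dJ_db = -2.0 * np.sum(residual)       # derivative of J w.r.t. b
    return w - eta * dJ_dw, b - eta * dJ_db

# Noise-free data from z = 2x + 1; gradient descent should recover w, b.
x = np.array([0.0, 1.0, 2.0, 3.0])
z = 2.0 * x + 1.0
w, b = 0.0, 0.0
for _ in range(2000):
    w, b = gradient_step(w, b, x, z)
print(round(w, 2), round(b, 2))  # approaches w = 2, b = 1
```

Because the data are noise-free and the cost is quadratic, the iterates converge to the true parameters for any sufficiently small η.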
Question-3 What type of activation function is this?
Question-4 What type of activation function is this?
Solution= Hyperbolic Tangent Function
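The hyperbolic tangent answer in Question-4 can be verified directly from its definition, tanh(x) = (eˣ − e⁻ˣ)/(eˣ + e⁻ˣ). A small sketch (not from the course materials) showing that it is zero-centered and squashes inputs into (−1, 1):

```python
# Hyperbolic tangent from its definition; outputs lie in (-1, 1)
# and the function is zero-centered, unlike the sigmoid.
import math

def tanh(x):
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

print(tanh(0.0))    # exactly 0 (zero-centered)
print(tanh(10.0))   # saturates toward +1
print(tanh(-10.0))  # saturates toward -1
```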
Question-5 Is the softmax activation function most commonly used in hidden layers?
Solution= No. Softmax is typically used in the output layer of a multi-class classifier, because it converts raw scores into a probability distribution over the classes.
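The reason softmax belongs in the output layer rather than hidden layers is that it normalizes a vector of scores into class probabilities. A brief sketch (illustrative, not course code) with made-up scores:

```python
# Softmax maps scores to a probability distribution: all outputs are
# positive and sum to 1, which is what an output layer of a multi-class
# classifier needs.
import numpy as np

def softmax(scores):
    exps = np.exp(scores - np.max(scores))  # shift for numerical stability
    return exps / exps.sum()

p = softmax(np.array([2.0, 1.0, 0.1]))
print(p)        # largest probability goes to the largest score
print(p.sum())  # probabilities sum to 1
```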