Quiz 8: Neural Networks Basics
Test your knowledge of Perceptrons, Activation Functions, and Multilayer Perceptrons (MLPs).
1. What is a perceptron?
A type of regression model
A simple artificial neural network
A clustering algorithm
A data preprocessing tool
2. Which function is applied in a perceptron to produce the output?
ReLU
Step function
Sigmoid
Tanh
3. What does the bias term in a perceptron do?
Multiplies the inputs
Adjusts the weighted sum to fine-tune behavior
Normalizes the data
Controls the activation function
4. What is the primary purpose of an activation function in a neural network?
To reduce dimensionality
To introduce non-linearity
To calculate the weighted sum
To normalize inputs
5. Which activation function outputs values between 0 and 1?
ReLU
Sigmoid
Tanh
Leaky ReLU
6. What does the ReLU activation function do for negative inputs?
Outputs 0
Outputs -1
Outputs the input value
Outputs a small gradient
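For quick reference alongside questions 5, 6, and 10, the following small Python/NumPy sketch evaluates sigmoid, ReLU, and tanh on a handful of made-up sample inputs; the values are illustrative assumptions, not part of the quiz.

import numpy as np

v = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])  # made-up sample inputs

sigmoid = 1.0 / (1.0 + np.exp(-v))  # always strictly between 0 and 1
relu = np.maximum(0.0, v)           # 0 for negative inputs, the input itself otherwise
tanh = np.tanh(v)                   # between -1 and 1, and zero-centered

print(sigmoid)
print(relu)
print(tanh)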
7. What is a Multilayer Perceptron (MLP)?
A single-layer neural network
A clustering algorithm
A neural network with an input, one or more hidden layers, and an output layer
A data normalization tool
8. What is the role of hidden layers in an MLP?
To accept input features
To apply activation functions
To perform computations using weights, biases, and activation functions
To store the final predictions
9. What is the mathematical representation of a neuron in an MLP?
z = w * x
z = Σ(w * x) + b
z = x / w
z = b - x
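As a worked illustration of the formula in question 9, z = Σ(w * x) + b, here is a minimal NumPy sketch of a single neuron followed by a step activation; the input, weight, and bias values are made-up assumptions.

import numpy as np

x = np.array([0.5, -1.2, 3.0])  # made-up input features
w = np.array([0.8, 0.1, -0.4])  # made-up weights
b = 0.25                        # made-up bias

z = np.dot(w, x) + b            # weighted sum plus bias: z = sum(w_i * x_i) + b
y = 1 if z >= 0 else 0          # step activation, as in a classic perceptron
print(z, y)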
10. Which of the following activation functions is zero-centered?
Sigmoid
Tanh
ReLU
Softmax
11. Why is the Softmax activation function used in classification problems?
To normalize the data
To calculate weighted sums
To convert raw scores into probabilities
To introduce non-linearity
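To make question 11 concrete, here is a small softmax sketch that converts made-up raw class scores into probabilities summing to 1; the numbers are illustrative only.

import numpy as np

def softmax(scores):
    # Subtracting the max improves numerical stability without changing the result.
    exp_scores = np.exp(scores - np.max(scores))
    return exp_scores / exp_scores.sum()

logits = np.array([2.0, 1.0, 0.1])  # made-up raw scores for three classes
probs = softmax(logits)
print(probs, probs.sum())           # probabilities that sum to 1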
12. What happens when no activation function is used in a neural network?
The network becomes non-linear
The network behaves as a linear model
The network becomes unstable
The gradients vanish
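The point behind question 12 is that composing layers without activation functions collapses to a single linear map. A quick numerical check, using made-up random weight matrices:

import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)  # made-up layer 1 parameters
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)  # made-up layer 2 parameters
x = rng.normal(size=3)

h = W1 @ x + b1                 # "hidden layer" with no activation function
y_two_layers = W2 @ h + b2

W, b = W2 @ W1, W2 @ b1 + b2    # the same computation as one linear layer
y_one_layer = W @ x + b

print(np.allclose(y_two_layers, y_one_layer))  # True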
13. What is the primary benefit of using multiple hidden layers in an MLP?
To increase the number of input features
To enable the network to learn more complex patterns
To improve the efficiency of the activation function
To reduce the number of weights
14. Which parameter adjusts how fast a neural network learns during training?
Activation function
Learning rate
Weight initialization
Bias
15. What is the primary goal of backpropagation in neural networks?
To initialize weights and biases
To propagate input values through the network
To calculate and update gradients to minimize error
To normalize output values
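Questions 14 and 15 touch on how training proceeds: backpropagation computes gradients of the error, and the learning rate scales the resulting weight updates. Below is a minimal hand-derived sketch for a single sigmoid neuron with squared-error loss; the training example, initial weights, and learning rate are all made-up assumptions.

import numpy as np

x = np.array([0.5, -1.0])            # made-up training example
t = 1.0                              # made-up target
w, b = np.array([0.1, 0.2]), 0.0     # made-up initial weights and bias
lr = 0.5                             # learning rate: controls how fast learning proceeds

for _ in range(100):
    z = w @ x + b
    y = 1.0 / (1.0 + np.exp(-z))     # sigmoid activation
    dz = (y - t) * y * (1.0 - y)     # dL/dz for L = 0.5 * (y - t)^2, via the chain rule
    w -= lr * dz * x                 # gradient step on the weights
    b -= lr * dz                     # gradient step on the bias

print(y)                             # prediction has moved toward the target of 1.0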