In the binary classification model, 3 linear layers were used along with a ReLU in the forward method.
How does this work? Would we have 3 lines (linear layers), each with a bend in them (from the ReLUs)?
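For context, a minimal sketch of what a model like that might look like in PyTorch. The class name and layer sizes here are assumptions for illustration, not the course's exact code:

```python
from torch import nn

class BinaryClassifier(nn.Module):
    """Hypothetical 3-linear-layer binary classifier of the kind described above."""
    def __init__(self):
        super().__init__()
        # Layer sizes (2 -> 10 -> 10 -> 1) are assumed for illustration only
        self.layer_1 = nn.Linear(in_features=2, out_features=10)
        self.layer_2 = nn.Linear(in_features=10, out_features=10)
        self.layer_3 = nn.Linear(in_features=10, out_features=1)
        self.relu = nn.ReLU()  # non-linear activation applied between the linear layers

    def forward(self, x):
        # Linear -> ReLU -> Linear -> ReLU -> Linear (returns a raw logit)
        return self.layer_3(self.relu(self.layer_2(self.relu(self.layer_1(x)))))
```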
Answered by LuluW8071 (Jul 25, 2024):
Yes, we would have 3 linear layers.
ReLU works as the activation function that introduces non-linearity between the layers: if the weighted sum flowing into a neuron is greater than 0, ReLU passes it through unchanged; otherwise it outputs 0, i.e. ReLU(x) = max(0, x). That zeroing below 0 is what puts a "bend" in each otherwise straight (linear) piece.
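A quick way to see this behaviour (a minimal PyTorch snippet, not from the original thread): negative inputs are zeroed out, positive inputs pass through unchanged, which is what creates the bend at zero:

```python
import torch

relu = torch.nn.ReLU()
x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))  # tensor([0.0000, 0.0000, 0.0000, 0.5000, 2.0000])
```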
Answer selected by mrdbourke