Let $\textbf{W}_{ij}$ represent the weight between node $i$ at layer $k$ and node $j$ at layer $(k-1)$ of a given multilayer perceptron. The weight update using the gradient descent method is given by

  1. $\textbf{W}_{ij}(t+1) = \textbf{W}_{ij}(t)+ \alpha \dfrac{\partial \textbf{E}}{\partial \textbf{W}_{ij}}, 0 \leq \alpha \leq 1$
  2. $\textbf{W}_{ij}(t+1) = \textbf{W}_{ij}(t)- \alpha \dfrac{\partial \textbf{E}}{\partial \textbf{W}_{ij}}, 0 \leq \alpha \leq 1$
  3. $\textbf{W}_{ij}(t+1) =  \alpha \dfrac{\partial \textbf{E}}{\partial \textbf{W}_{ij}}, 0 \leq \alpha \leq 1$
  4. $\textbf{W}_{ij}(t+1) = -  \alpha \dfrac{\partial \textbf{E}}{\partial \textbf{W}_{ij}}, 0 \leq \alpha \leq 1$

where $\alpha$ and $E$ represent the learning rate and the error in the output, respectively.
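Gradient descent moves the weights *against* the gradient, i.e. $W_{ij}(t+1) = W_{ij}(t) - \alpha \frac{\partial E}{\partial W_{ij}}$, since stepping opposite the slope decreases the error. A minimal sketch of this update on a toy one-dimensional error surface $E(W) = (W-3)^2$ (an assumed example, not part of the question):

```python
# Sketch of the gradient-descent weight update
#   W(t+1) = W(t) - alpha * dE/dW,   0 <= alpha <= 1
# on the toy error surface E(W) = (W - 3)^2, whose minimum is at W = 3.

def update(W, grad, alpha=0.1):
    # Subtracting the gradient moves W downhill on the error surface.
    return W - alpha * grad

W = 0.0
for _ in range(100):
    grad = 2.0 * (W - 3.0)   # dE/dW for E(W) = (W - 3)^2
    W = update(W, grad)

print(round(W, 4))  # converges toward the minimum at W = 3
```

Had the gradient been *added* instead of subtracted, each step would increase $E$ and the iteration would diverge from the minimum.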


