0 votes

In a single perceptron, the update rule for the weight vector is given by

(A) $w(n+1) = w(n) + \eta\,[d(n) - y(n)]$
(B) $w(n+1) = w(n) - \eta\,[d(n) - y(n)]$
(C) $w(n+1) = w(n) + \eta\,[d(n) - y(n)]\,x(n)$
(D) $w(n+1) = w(n) - \eta\,[d(n) - y(n)]\,x(n)$

Tags: Artificial Intelligence, ugcnetcse-sep2013-paper3, neural-network, machine-learning

asked Jul 24, 2016 by go_editor (recategorized Oct 19, 2018 by Pooja Khatri) · 3.2k views
2 votes

The answer is (C).

The perceptron learning algorithm proceeds as follows:

1. Initialize the weights and the threshold. Weights may be initialized to 0 or to a small random value (the example below uses 0).
2. For each example $j$ in the training set $D$, perform the following steps over the input $\mathbf{x}_j$ and desired output $d_j$:
   - Calculate the actual output:
     $y_j(t) = f[\mathbf{w}(t) \cdot \mathbf{x}_j] = f[w_0(t)\,x_{j,0} + w_1(t)\,x_{j,1} + w_2(t)\,x_{j,2} + \dotsb + w_n(t)\,x_{j,n}]$
   - Update the weights:
     $w_i(t+1) = w_i(t) + \eta\,(d_j - y_j(t))\,x_{j,i}$, for all features $0 \le i \le n$,
     which matches option (C): the error $(d - y)$ is scaled by the learning rate $\eta$ and the input $x$.

For more details, refer to https://en.wikipedia.org/wiki/Perceptron

answered Jul 24, 2016 by Sanjay Sharma

anchitjindal07 commented Aug 24, 2018: From which subject is this?
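The update rule in option (C) can be sketched in Python. This is a minimal illustration (function and variable names are my own, not from the question); it trains a perceptron on the logical AND function, which is linearly separable, so the rule converges:

```python
# Minimal perceptron training sketch for update rule (C):
#   w(n+1) = w(n) + eta * (d(n) - y(n)) * x(n)
# Each input vector carries a bias component x[0] = 1, so the
# threshold is learned as the weight w[0].

def step(z):
    """Threshold activation: 1 if z >= 0, else 0."""
    return 1 if z >= 0 else 0

def train_perceptron(samples, eta=1.0, epochs=10):
    """samples: list of (x, d) pairs; weights initialized to 0."""
    n = len(samples[0][0])
    w = [0.0] * n
    for _ in range(epochs):
        for x, d in samples:
            # Actual output: y = f(w . x)
            y = step(sum(wi * xi for wi, xi in zip(w, x)))
            # Weight update, rule (C)
            w = [wi + eta * (d - y) * xi for wi, xi in zip(w, x)]
    return w

# AND function; first component of each x is the bias input 1
data = [([1, 0, 0], 0), ([1, 0, 1], 0), ([1, 1, 0], 0), ([1, 1, 1], 1)]
w = train_perceptron(data)
print(w)
```

Note that the update is nonzero only when the prediction is wrong ($d \ne y$), and the correction is proportional to the input, which is why the factor $x(n)$ in option (C) is essential.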