In a single perceptron, the update rule for the weight vector is given by

  1. $w(n+1) = w(n) + \eta [d(n)-y(n)]$
  2. $w(n+1) = w(n) - \eta [d(n)-y(n)]$
  3. $w(n+1) = w(n) + \eta [d(n)-y(n)]*x(n)$
  4. $w(n+1) = w(n) - \eta [d(n)-y(n)]*x(n)$

1 Answer


The answer is option 3: $w(n+1) = w(n) + \eta [d(n)-y(n)]*x(n)$ — the weight correction is the error $d(n)-y(n)$ scaled by the learning rate $\eta$ and multiplied by the input $x(n)$.

Steps

  1. Initialize the weights and the threshold. Weights may be initialized to 0 or to a small random value. In the example below, we use 0.
  2. For each example $j$ in our training set $D$, perform the following steps over the input $\mathbf{x}_j$ and desired output $d_j$:
    1. Calculate the actual output:
       $$y_j(t) = f[\mathbf{w}(t)\cdot\mathbf{x}_j] = f[w_0(t)x_{j,0} + w_1(t)x_{j,1} + w_2(t)x_{j,2} + \dotsb + w_n(t)x_{j,n}]$$
    2. Update the weights:
       $$w_i(t+1) = w_i(t) + (d_j - y_j(t))\,x_{j,i}, \quad \text{for all features } 0 \leq i \leq n$$
       
       
For more details, refer to https://en.wikipedia.org/wiki/Perceptron
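The steps above can be sketched in a few lines of Python. This is a minimal illustration, not a library implementation: the step activation, learning rate `eta`, epoch count, and the AND-gate training data are all assumptions chosen to keep the example self-contained; the bias is folded in as a constant leading feature $x_{j,0}=1$.

```python
def step(z):
    """Heaviside step activation: 1 if z >= 0, else 0."""
    return 1 if z >= 0 else 0

def train_perceptron(X, d, eta=0.1, epochs=20):
    """Train a single perceptron with the rule
    w_i <- w_i + eta * (d_j - y_j) * x_{j,i}.
    Each row of X carries a leading bias feature x_{j,0} = 1."""
    n_features = len(X[0])
    w = [0.0] * n_features              # step 1: initialize weights to 0
    for _ in range(epochs):
        for x_j, d_j in zip(X, d):      # step 2: loop over training examples
            # step 2.1: actual output y_j = f(w . x_j)
            y_j = step(sum(w_i * x_i for w_i, x_i in zip(w, x_j)))
            # step 2.2: update every weight by eta * error * input
            for i in range(n_features):
                w[i] += eta * (d_j - y_j) * x_j[i]
    return w

# Usage: learn the linearly separable AND function (hypothetical example data).
X = [[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]]  # leading 1 is the bias input
d = [0, 0, 0, 1]
w = train_perceptron(X, d)
predictions = [step(sum(wi * xi for wi, xi in zip(w, x))) for x in X]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop reaches a separating weight vector; the multiplication by $x_{j,i}$ in step 2.2 is exactly what distinguishes the correct option 3 from options 1 and 2.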