The Gateway to Computer Science Excellence
0 votes

In a single perceptron, the weight-vector update rule is given by

  1. $w(n+1) = w(n) + \eta [d(n)-y(n)]$
  2. $w(n+1) = w(n) - \eta [d(n)-y(n)]$
  3. $w(n+1) = w(n) + \eta [d(n)-y(n)]*x(n)$
  4. $w(n+1) = w(n) - \eta [d(n)-y(n)]*x(n)$
in Machine Learning by Veteran (105k points)
recategorized by | 813 views

1 Answer

+2 votes

Answer is C.


  1. Initialize the weights and the threshold. Weights may be initialized to 0 or to a small random value. In the example below, we use 0.
  2. For each example $j$ in our training set $D$, perform the following steps over the input $\mathbf{x}_j$ and desired output $d_j$:
    1. Calculate the actual output:
       $$y_j(t) = f[\mathbf{w}(t) \cdot \mathbf{x}_j] = f[w_0(t)x_{j,0} + w_1(t)x_{j,1} + w_2(t)x_{j,2} + \dotsb + w_n(t)x_{j,n}]$$
    2. Update the weights:
       $$w_i(t+1) = w_i(t) + \eta\,[d_j - y_j(t)]\,x_{j,i}, \quad \text{for all features } 0 \leq i \leq n$$
       which matches option C, $w(n+1) = w(n) + \eta [d(n)-y(n)]*x(n)$.
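The two steps above can be sketched in a few lines of plain Python. This is a minimal illustration, not a library implementation; the names (`train_perceptron`, `eta`, the AND dataset) are mine, and the bias is folded in as $w_0$ by prepending $x_{j,0} = 1$ to every input.

```python
def step(z):
    """Heaviside activation f: 1 if z >= 0 else 0."""
    return 1 if z >= 0 else 0

def train_perceptron(X, d, eta=0.1, epochs=20):
    """Train a single perceptron; w[0] is the bias (assumes x[0] == 1)."""
    n_features = len(X[0])
    w = [0.0] * n_features          # step 1: initialize weights to 0
    for _ in range(epochs):
        for x_j, d_j in zip(X, d):  # step 2: loop over training examples
            # step 2.1: actual output y_j = f(w . x_j)
            y_j = step(sum(w_i * x_i for w_i, x_i in zip(w, x_j)))
            # step 2.2: w_i <- w_i + eta * (d_j - y_j) * x_{j,i}   (option C)
            for i in range(n_features):
                w[i] += eta * (d_j - y_j) * x_j[i]
    return w

# Example: learning logical AND; each input is prepended with the bias x_0 = 1
X = [(1, 0, 0), (1, 0, 1), (1, 1, 0), (1, 1, 1)]
d = [0, 0, 0, 1]
w = train_perceptron(X, d)
preds = [step(sum(wi * xi for wi, xi in zip(w, x))) for x in X]
```

Since AND is linearly separable, the perceptron convergence theorem guarantees the loop settles on weights that classify all four inputs correctly.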
by Boss (48.8k points)
From which subject is this?
