
Consider a single perceptron with sign activation function. The perceptron is represented by weight vector $\begin{bmatrix} 0.4 & -0.3 & 0.1 \end{bmatrix}^t$ and a bias $\theta=0$. If the input vector to the perceptron is $X=\begin{bmatrix} 0.2 & 0.6 & 0.5 \end{bmatrix}$, then the output of the perceptron is

  1. $1$
  2. $0$
  3. $-0.05$
  4. $-1$

2 Answers


Ans: (3) -0.05

Explanation: The input vector $X=\begin{bmatrix} 0.2 & 0.6 & 0.5 \end{bmatrix}$ and the weight vector $W=\begin{bmatrix} 0.4 & -0.3 & 0.1 \end{bmatrix}^T$ are given.

The output of the perceptron is the weighted sum of the inputs,

$\sum_{i} X_i W_i$          (where $i = 1, 2, 3$, since $X$ and $W$ each have three components)

$= (0.2)(0.4) + (0.6)(-0.3) + (0.5)(0.1) = 0.08 - 0.18 + 0.05 = -0.05$
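
A quick way to verify the arithmetic is a small Python sketch (the variable names below are mine, not part of the question):

```python
# Weighted sum of a single perceptron with bias theta = 0
W = [0.4, -0.3, 0.1]   # weight vector
X = [0.2, 0.6, 0.5]    # input vector
theta = 0

weighted_sum = sum(w * x for w, x in zip(W, X)) + theta
print(weighted_sum)    # about -0.05 (floating-point rounding gives -0.049999...)
```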



Ans: Option (3), −0.05

Output = AX + B, where A = weight vector, X = input vector, B = bias (here B = θ = 0)
Output = [0.4 −0.3 0.1] · [0.2 0.6 0.5] + 0
Output = 0.4 × 0.2 + (−0.3) × 0.6 + 0.1 × 0.5
Output = 0.08 − 0.18 + 0.05
Output = −0.05
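
The same Output = AX + B calculation can be checked with NumPy (just a sketch for verification; NumPy is not part of the original answer):

```python
import numpy as np

A = np.array([0.4, -0.3, 0.1])   # weight vector
X = np.array([0.2, 0.6, 0.5])    # input vector
B = 0.0                          # bias (theta = 0)

output = A @ X + B               # inner product plus bias
print(round(float(output), 2))   # -0.05
```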