41 votes

For any discrete random variable $X$, with probability mass function

$P(X=j)=p_j$, $p_j \geq 0$, $j \in \{0, \dots , N \}$, and $\sum_{j=0}^N p_j =1$, define the polynomial function $g_x(z) = \sum_{j=0}^N p_j z^j$. For a certain discrete random variable $Y$, there exists a scalar $\beta \in [0,1]$ such that $g_y(z) =(1- \beta+\beta z)^N$. The expectation of $Y$ is

  1. $N \beta(1-\beta)$
  2. $N \beta$
  3. $N (1-\beta)$
  4. Not expressible in terms of $N$ and $\beta$ alone

4 Comments

@rupesh17

How did you take $P(Y = 1) = \beta$? And why does $Y$ take only the two values $0$ and $1$?

The polynomial function $g(z)$ is defined like that:

$g_{x}(z)=\sum_{j=0}^{N}p_{j}z^{j}$ for $P(X=j)=p_{j}$, $j\in\{0,\dots,N\}$

Meaning:

suppose the variable $X$ has probabilities $p_0, p_1, p_2, \dots$ at $x=0$, $x=1$, $x=2$, and so on respectively; then

$g_{x}(z)=p_{0}+p_{1}z+p_{2}z^{2}+\dots$ by the definition of the given function.

=================================================================

Now we are given

$g_{y}(z)=(1-\beta+\beta z)^{N}$. Assume $N=1$;

then

$g_{y}(z)=1-\beta+\beta z$

so

the coefficient of $z^{0}$ gives the probability at $y=0$, i.e. $p_0 = 1-\beta$, and

the coefficient of $z$ gives the probability at $y=1$, i.e. $p_1 = \beta$.

Hence $E(Y) = 0\cdot(1-\beta) + 1\cdot\beta = \beta$ at $N=1$; only option B matches.

=====================================================================

We can verify this by taking any other value, e.g. $N=2$.
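As an illustration of that verification (the value of $\beta$ below is an arbitrary choice, not part of the question), the $N=2$ case can be checked in a few lines of Python: expand $(1-\beta+\beta z)^2$ by hand, read the pmf of $Y$ off the coefficients, and compute the expectation.

```python
# Hand-expanded N = 2 case: (1-b + b*z)^2 = (1-b)^2 + 2b(1-b)z + b^2 z^2.
# b (beta) is an arbitrary illustrative value.
b = 0.3
p = [(1 - b) ** 2, 2 * b * (1 - b), b ** 2]   # p[j] = P(Y = j)

assert abs(sum(p) - 1.0) < 1e-12              # coefficients form a valid pmf
expectation = sum(j * pj for j, pj in enumerate(p))
assert abs(expectation - 2 * b) < 1e-12       # E(Y) = N*beta with N = 2
```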


10
rupesh17

Very nice. Thanks for the explanation
1

6 Answers

51 votes
Best answer

Notice that the derivative of $g_x(z)$ evaluated at $z=1$ gives the expectation $E(X)$:

$g'_x(z)|_{z=1}= \sum_{j=1}^N\,j\, p_j\, z^{j-1} |_{z=1} = \sum_{j=1}^N\,j\, p_j = \sum_{j=0}^N\,j\, p_j = E(X)$

Therefore, take the derivative of $g_y(z)$ with respect to $z$ and plug in $z=1$:

$E(Y) = g'_y(z)|_{z=1}= ((1-\beta + \beta\,z)^N )'  |_{z=1}  = N\beta(1-\beta + \beta\,z)^{N-1}|_{z=1} = N\beta(1-\beta + \beta)^{N-1} = N\beta$



So, the answer is option (B).
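A quick numeric sanity check of this derivative trick, in plain Python (the values of $N$ and $\beta$ are illustrative, not from the question): approximate $g'_y(1)$ by a central finite difference and compare it with $N\beta$.

```python
# Finite-difference check that g_y'(1) = N*beta (illustrative N, beta).
N, beta = 5, 0.4

def g_y(z):
    # the given probability generating function of Y
    return (1 - beta + beta * z) ** N

h = 1e-6
deriv_at_1 = (g_y(1 + h) - g_y(1 - h)) / (2 * h)   # central difference
assert abs(deriv_at_1 - N * beta) < 1e-6           # E(Y) = N*beta
```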


4 Comments

How does the derivative give us the mean?

How do we get this result?
0
Any book with questions that combine calculus and random variables will have questions like these. If you have some calculus intuition, you will quickly see that differentiating the polynomial function gives you something close to the expression for the expectation/mean.

$g_{x}(z) = p_{0} + p_{1}z + p_{2}z^{2} + \dots$ by definition of the polynomial function. Differentiate this to get:

$g'_{x}(z) = 0 + p_{1} + 2p_{2}z + 3p_{3}z^{2} + ...$

Now recall that the expectation of a discrete random variable is defined as:

$E[X] = \sum_{x = 0}^{N}xp(x) = 0 + p_{1} + 2p_{2} + 3p_{3} + ...$

So if we could just remove the $z$ terms from our differentiated polynomial function, we would have $E[X]$. To do that, simply evaluate at $z=1$: $g'_{x}(1) = p_{1} + 2p_{2} + \dots$

We are given a closed-form expression for $Y$'s polynomial function. Follow the same logic: differentiate it and then substitute $z = 1$ to get the answer.
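The argument in this comment can be mirrored in code (the pmf below is an arbitrary illustrative choice): differentiate the coefficient list of $g_x$, evaluate at $z=1$, and compare with $E[X]$ computed from the definition.

```python
# p[j] = P(X = j) for an arbitrary illustrative pmf on {0, 1, 2, 3}
p = [0.1, 0.2, 0.3, 0.4]

# Coefficients of g_x'(z): d/dz of sum p_j z^j is sum j*p_j z^(j-1)
dp = [j * pj for j, pj in enumerate(p)][1:]

g_prime_at_1 = sum(dp)                        # evaluate g_x'(z) at z = 1
mean = sum(j * pj for j, pj in enumerate(p))  # E[X] = sum j * P(X = j)
assert abs(g_prime_at_1 - mean) < 1e-12       # the two quantities agree
```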
15
Brilliant… you'll do something awesome in GATE 2021.
1
5 votes

(B)

If you expand $g_y(z)$, you get the pmf of a binomial distribution, and the mean of a binomial distribution is $Np$.
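To make this claim concrete (the values of $N$ and $p$ are illustrative, not from the question): build the Binomial$(N, p)$ pmf with `math.comb` and check that its mean equals $Np$.

```python
import math

N, p = 6, 0.25   # arbitrary illustrative parameters
# Binomial pmf: P(Y = j) = C(N, j) p^j (1-p)^(N-j)
pmf = [math.comb(N, j) * p**j * (1 - p) ** (N - j) for j in range(N + 1)]

assert abs(sum(pmf) - 1.0) < 1e-12              # valid pmf
mean = sum(j * pj for j, pj in enumerate(pmf))
assert abs(mean - N * p) < 1e-12                # mean of Binomial is N*p
```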

1 vote

For a discrete random variable $X$, the polynomial function given is $g_x(z) = \sum_{j=0}^{N} p_j z^j$.
Given $g_y(z) = (1 - \beta + \beta z)^N$ — this has the form of a binomial expansion $(x+y)^n$, so $Y$ follows a binomial distribution $b(j; n, p)$ with probability mass function $\binom{n}{j} p^j (1-p)^{n-j}$, where $n = N$ and $p = \beta$.
The expectation (i.e., mean) of a binomial distribution is $np$, so $E(Y) = N\beta$.

Hope this helps...

1 vote

The following solution is based on the idea shared by @Sachin Mittal sir.


Given,

$g_y(z) = (1 - \beta + \beta z)^N = ((1 - \beta) + \beta z)^N$

Using the binomial expansion,

$g_y(z) = \sum_{k=0}^{N}{\binom{N}{k}(1 - \beta)^{N-k}(\beta z)^k} = \sum_{k=0}^{N}{\binom{N}{k}(1 - \beta)^{N-k}\beta^k z^k}$

Let $a = (1 - \beta)$ and $b = \beta$:

$g_y(z) = \sum_{k=0}^{N}{\binom{N}{k}a^{N-k}b^k z^k}$

Comparing $g_x(z) = \sum_{j=0}^{N}p_jz^j$ to the above equation, each term in the summation gives $P(Y = k) = p_k$, where $0 \leq k \leq N$.

And since $E(Y) = \sum_{k=0}^{N}k\,p_k$,

$E(Y) = \sum_{k=0}^{N}{\binom{N}{k}a^{N-k}b^{k}\,k} = Nb(a + b)^{N-1}$

(https://www.wolframalpha.com/input/?i=sum+ncr%28n%2C+k%29+*+x%5E%28n-k%29*y%5Ek*k%2Ck%3D0+to+n)

$E(Y) = N\beta(1 - \beta + \beta)^{N-1} = N\beta$

$\therefore$ The answer is (B).
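The closed form taken from WolframAlpha above, $\sum_{k=0}^{N}\binom{N}{k}a^{N-k}b^k\,k = Nb(a+b)^{N-1}$, can be spot-checked numerically (the integer values below are arbitrary illustrative choices, so the comparison is exact):

```python
import math

# Spot-check of the summation identity with arbitrary integer a, b
N, a, b = 3, 2, 3

lhs = sum(math.comb(N, k) * a ** (N - k) * b**k * k for k in range(N + 1))
rhs = N * b * (a + b) ** (N - 1)

assert lhs == rhs   # both sides agree (integers, so the check is exact)
```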

 

1 comment

In the question E[Y]  is to be answered but here it’s mentioned E[X]. Is this a mistake or am I missing something? @Soumya Saurav
