
For any discrete random variable $X$, with probability mass function

$P(X=j)=p_j$, $p_j \geq 0$, $j \in \{0, \dots, N\}$, and $\sum_{j=0}^N p_j = 1$, define the polynomial function $g_x(z) = \sum_{j=0}^N p_j\, z^j$. For a certain discrete random variable $Y$, there exists a scalar $\beta \in [0,1]$ such that $g_y(z) = (1-\beta+\beta z)^N$. The expectation of $Y$ is

1. $N \beta(1-\beta)$
2. $N \beta$
3. $N (1-\beta)$
4. Not expressible in terms of $N$ and $\beta$ alone

Can someone explain this question better?
Small typo in this question. Please correct it.
Any book to practice these types of questions?
Sheldon Ross.

verification

Here is one way to solve the question:

https://gateoverflow.in/?qa=blob&qa_blobid=14447251432178239340

(Not putting this as an answer since I have not solved it completely and have only written the steps.)

@rupesh17

How did you take $P(Y = 1) = \beta$? And why does $Y$ take only two values, 0 and 1?

See, the polynomial function $g_x(z)$ is defined like this:

$g_{x}(z)=\sum_{j=0}^{N}p_{j}z^{j}$, where $P(X=j)=p_{j}$, $j\in\{0,\dots,N\}$

Meaning:

suppose variable $X$ has probabilities $p_0, p_1, p_2, \dots$ at $x=0$, $x=1$, $x=2$, ... respectively. Then

$g_{x}(z)=p_{0}+p_{1}z+p_{2}z^{2}+\dots$  by the definition of the given function.

=================================================================

Now we are given that

$g_{y}(z)=(1-\beta+\beta z)^{N}$. Assume $N=1$;

then

$g_{y}(z)=1-\beta+\beta z$,

so

the coefficient of $z^{0}$ gives the probability at $y=0$, i.e. $p_0 = 1-\beta$, and

the coefficient of $z$ gives the probability at $y=1$, i.e. $p_1 = \beta$.

So the expectation of $Y$ is $0\cdot(1-\beta) + 1\cdot\beta = \beta$ at $N=1$; only option B matches.

=====================================================================

We can verify by taking any other value, e.g. $N=2$.
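The verification for small $N$ can also be sketched in code (a throwaway Python sketch; the function name is my own): expand $(1-\beta+\beta z)^N$ into its coefficient list, read off the probabilities $p_j$, and compute $E[Y]=\sum_j j\,p_j$ directly.

```python
# Expand g_y(z) = (1 - b + b*z)^N by multiplying out the N linear factors,
# collect the coefficient of z^j (these are the probabilities p_j),
# then compute E[Y] = sum of j * p_j and compare with N*b.
def expectation_from_pgf(N, b):
    coeffs = [1.0]                      # start with the polynomial "1"
    for _ in range(N):                  # multiply by (1 - b + b*z), N times
        new = [0.0] * (len(coeffs) + 1)
        for j, c in enumerate(coeffs):
            new[j]     += c * (1 - b)   # constant term of the factor
            new[j + 1] += c * b         # z term of the factor
        coeffs = new
    return sum(j * p for j, p in enumerate(coeffs))

print(round(expectation_from_pgf(2, 0.3), 12))  # 0.6, which is N*beta
print(round(expectation_from_pgf(5, 0.7), 12))  # 3.5, again N*beta
```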


Very nice. Thanks for the explanation


Notice that the derivative of $\large g_{x}(z)$ evaluated at $z=1$ gives the expectation $E(X)$:

$g'_x(z)\big|_{z=1}= \sum_{j=1}^N\,j\, p_j\, z^{j-1} \big|_{z=1} = \sum_{j=1}^N\,j\, p_j = \sum_{j=0}^N\,j\, p_j = E(X)$

Therefore, take the derivative of $\large g_{y}(z)$ with respect to $z$ and plug in $z=1$:

$E(Y) = g'_y(z)\big|_{z=1}= \left((1-\beta + \beta\,z)^N \right)' \big|_{z=1} = N\beta(1-\beta + \beta\,z)^{N-1}\big|_{z=1} = N\beta(1-\beta + \beta)^{N-1} = N\beta$
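The derivative-at-$z=1$ trick is easy to spot-check numerically (a quick sketch, not part of the answer; the helper names are mine): approximate $g_y'(1)$ with a central difference and compare it with $N\beta$.

```python
# PGF of Y as given in the question: g_y(z) = (1 - beta + beta*z)^N
def g_y(z, N, beta):
    return (1 - beta + beta * z) ** N

# Central-difference approximation of g_y'(1); the error is O(h^2)
def derivative_at_1(N, beta, h=1e-6):
    return (g_y(1 + h, N, beta) - g_y(1 - h, N, beta)) / (2 * h)

for N, beta in [(1, 0.5), (4, 0.3), (10, 0.9)]:
    print(derivative_at_1(N, beta), N * beta)  # the two columns agree
```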

I think this answer needs some correction.

What is the significance of the range of $\beta$?

How are you calculating the derivative of $g_x(z)$?

The edit by Pavan Singh was incorrect, so I re-edited it back to my original answer with some more derivation.

By finding the derivative of $g_x(z)$, we found that it equals $E(X)$.

But why does the same logic apply to $g_y(z)$? How do we know that the derivative of $g_y(z)$ at $z=1$ is $E(Y)$?

"Notice that the derivative of $g_x(z)$ evaluated at $z=1$ gives expectation $E(X)$"

Can you please explain this statement? And also tell when and where we can use this logic.

I don't understand the relation between these two functions.

Any book to practice these types of questions?
How does the derivative give us the mean?

How do we get this result?
Any book that has questions dealing with calculus and random variables will have questions like these. If you have calculus intuition, you will quickly see that differentiating the polynomial function gives you something close to the expression for the expectation/mean.

$g_{x}(z) = p_{0} + p_{1}z + p_{2}z^{2} + \dots$ by definition of the polynomial function. Differentiate this to get:

$g'_{x}(z) = 0 + p_{1} + 2p_{2}z + 3p_{3}z^{2} + ...$

Now recall that the expectation of a discrete random variable is defined as:

$E[X] = \sum_{x = 0}^{N}xp(x) = 0 + p_{1} + 2p_{2} + 3p_{3} + ...$

So if we could just remove the $z$ terms from our differentiated polynomial function, we would have $E[X]$. To do that, just calculate $g'_{x}(1) = p_{1} + 2p_{2} + ...$

We are given a closed-form expression for $Y$'s polynomial function. Follow the same logic: differentiate it and then plug in $z = 1$ to get the answer.
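The "remove the $z$ terms by plugging in $z=1$" step collapses to a simple sum in code (a small illustrative sketch; the function and example distribution are mine, not from the answer): given the coefficient list $[p_0, p_1, \dots]$, the derivative at $z=1$ is just $\sum_j j\,p_j$.

```python
# g'(1) = p_1 + 2*p_2 + 3*p_3 + ... = sum of j * p_j, which is E[X]
def pgf_derivative_at_1(coeffs):
    return sum(j * p for j, p in enumerate(coeffs))

# Example: X uniform on {0, 1, 2, 3}, each value with probability 1/4
coeffs = [0.25, 0.25, 0.25, 0.25]
print(pgf_derivative_at_1(coeffs))  # 1.5, the mean of {0, 1, 2, 3}
```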
Brilliant…
You'll do something awesome in GATE 2021.

B)

If you expand $g_y(z)$, you get a binomial distribution, and the mean of a binomial distribution is $Np$.


The following solution is based on the idea shared by @Sachin Mittal sir.

Given,

$g_y(z) = (1 - \beta + \beta z)^N$

$\Rightarrow g_y(z) = ((1 - \beta) + \beta z)^N$

Using the binomial expansion,

$\Rightarrow g_y(z) = \sum_{k=0}^{N}{\binom{N}{k}(1 - \beta)^{N-k}(\beta z)^k}$

$\Rightarrow g_y(z) = \sum_{k=0}^{N}{\binom{N}{k}(1 - \beta)^{N-k}\beta^k z^k}$

Let $a = (1 - \beta)$ and $b = \beta$.

$\Rightarrow g_y(z) = \sum_{k=0}^{N}{\binom{N}{k}a^{N-k}b^k z^k}$

Comparing with $g_x(z) = \sum_{j=0}^{N}p_jz^j$, the coefficient of $z^k$ in the above summation gives $P(Y = k) = p_k$, where $0 \leq k \leq N$.

And since $E(Y) = \sum_{k=0}^{N}k\,p_k$,

$\Rightarrow E(Y) = \sum_{k=0}^{N}{k\binom{N}{k}a^{N-k}b^{k}}$

Using the identity $k\binom{N}{k} = N\binom{N-1}{k-1}$, the sum reduces to a binomial expansion of $(a+b)^{N-1}$:

$\Rightarrow E(Y) = Nb(a + b)^{N-1}$

$\Rightarrow E(Y) = N\beta(1 - \beta + \beta)^{N-1}$

$\Rightarrow E(Y) = N\beta$

$\therefore$ Answer is B.
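The key summation step, $\sum_k k\binom{N}{k}a^{N-k}b^k = Nb(a+b)^{N-1}$, can be spot-checked numerically (a throwaway sketch; the names are mine) for $a = 1-\beta$, $b = \beta$:

```python
from math import comb

# Left-hand side of the identity: sum over k of k * C(N,k) * a^(N-k) * b^k
def lhs(N, a, b):
    return sum(k * comb(N, k) * a ** (N - k) * b ** k for k in range(N + 1))

N, beta = 6, 0.4
a, b = 1 - beta, beta
print(lhs(N, a, b), N * b * (a + b) ** (N - 1))  # both approximately N*beta = 2.4
```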

For a discrete random variable $X$:
Given $g_y(z) = (1 - \beta + \beta z)^N$, this has the form of a binomial expansion $(x+y)^n$, so $Y$ follows a binomial distribution.
The expectation (i.e., mean) of a binomial distribution $b(x_j; n, p)$ is $np$, which here gives $E(Y) = N\beta$.

Hope this helps...

Option B only: the mean of a binomial distribution is $np$.

Can you please provide an explanation for this?
Can someone explain this in an easy way?