I can model the given scenario as follows.
P finishes carrier sensing and begins transmitting at t=0. Q senses the medium for 5 units of time, from t=0 through t=4, and will begin transmitting at t=5.
If P's first bit reaches Q within this window, Q will refrain from sending its own data. If it does not reach Q by t=4, then at t=5 Q will start sending its own data.
This means we can wait at most 5 units of time for the first bit of P's signal to reach Q, and this waiting time is exactly the one-way propagation delay ($T_p$).
Given the distance is d meters and the velocity of the signal is 10 meters per time unit, $T_p=\frac{d}{10}$ units of time.
As the image shows, in the worst case we can wait for P's signal for up to 5 units of time.
So, setting $T_p=5$ units of time:
$$5=\frac{d}{10} \implies d=50\,\mathrm{m}$$
Hence the answer is $d=50\,\mathrm{m}$.
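The arithmetic above can be sketched in a few lines of Python. This is a minimal illustration, assuming only the values from the problem statement (a 5-unit sensing window and a signal speed of 10 m per time unit); the function name `max_distance` is my own.

```python
def max_distance(sensing_window, signal_speed):
    """Worst-case one-way propagation delay T_p equals the sensing
    window, so the maximum separation is d = T_p * v."""
    t_p = sensing_window           # T_p = 5 time units in this scenario
    return t_p * signal_speed      # d = T_p * v

d = max_distance(sensing_window=5, signal_speed=10)
print(d)  # 50, i.e. d = 50 m
```

This just restates the closed-form relation $d = T_p \cdot v$; plugging in $T_p = 5$ and $v = 10$ reproduces the 50 m result.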