10 votes
A tiger is $50$ leaps of its own behind a deer. The tiger takes $5$ leaps per minute to the deer's $4$. If the tiger and the deer cover $8$ meters and $5$ meters per leap respectively, what distance in meters will the tiger have to run before it catches the deer?
in Quantitative Aptitude

3 Answers

12 votes
Best answer
The tiger covers $5 \times 8 = 40$ meters/minute.
The deer covers $4 \times 5 = 20$ meters/minute.
Relative speed of the tiger with respect to the deer $= 40 - 20 = 20$ meters/minute.

The deer is ahead of the tiger by $50 \times 8 = 400$ meters ($50$ of the tiger's own leaps, $8$ meters each).
Time taken $= 400/20 = 20$ minutes.
In $20$ minutes the tiger covers $20 \times 40 = 800$ meters.

Correct Answer: $800$
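
As a quick sanity check, here is a minimal Python sketch of the same relative-speed arithmetic (the variable names are my own, not from the answer):

```python
tiger_speed = 5 * 8   # 5 leaps/min * 8 m/leap = 40 m/min
deer_speed = 4 * 5    # 4 leaps/min * 5 m/leap = 20 m/min
head_start = 50 * 8   # gap of 50 tiger-leaps, 8 m each = 400 m

closing_speed = tiger_speed - deer_speed      # 20 m/min
time_to_catch = head_start / closing_speed    # 20 minutes
distance_run = tiger_speed * time_to_catch    # 800 m

print(time_to_catch, distance_run)            # 20.0 800.0
```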

1 comment

The third line of the solution should say 'relative speed'. Please correct it.
3 votes

Tiger (runs) -->                         Deer (runs) -->

Tiger start ----------------------------- Deer start --------------- tiger catches the deer (here)
|-------------- 50 leaps ----------------|----------- x m ----------|
              50 * 8 = 400 m
|------------------------------- 400 + x m ------------------------|

Tiger's speed $= 8 \times 5 = 40$ m/minute

Deer's speed $= 5 \times 4 = 20$ m/minute

After time $t$ (in minutes) the tiger catches the deer:

$t = (400 + x)/40$ (tiger)

$t = x/20$ (deer)

$(400 + x)/40 = x/20$

$x = 400$

Total distance the tiger runs $= 400 + x = 400 + 400 = 800$ meters.
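
A small symbolic check of the equation above, assuming SymPy is available (the name `x` matches the diagram):

```python
from sympy import symbols, Eq, solve

x = symbols('x', positive=True)  # distance the deer runs before being caught (m)
# Equal chase times: (400 + x)/40 for the tiger, x/20 for the deer
caught_at = solve(Eq((400 + x) / 40, x / 20), x)[0]
print(caught_at, 400 + caught_at)  # 400 800
```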

0 votes
Tiger's speed $= 8 \text{ m/leap} \times 5 \text{ leaps/minute} = 40$ m/minute

Deer's speed $= 5 \text{ m/leap} \times 4 \text{ leaps/minute} = 20$ m/minute

The tiger must close the deer's $50 \times 8 = 400$ m head start: $40t = 400 + 20t \rightarrow t = 20$ minutes.

Since tiger's speed $\times\ t =$ distance covered by the tiger,

distance covered by the tiger $= 40 \text{ m/min} \times 20 \text{ min} = 800$ m.
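
For anyone who prefers a brute-force check, here is a rough minute-by-minute simulation of the chase (my own sketch; the starting positions and one-minute step are assumptions based on the numbers above):

```python
# Advance both animals one minute at a time until the tiger draws level.
tiger_pos, deer_pos = 0, 400   # deer starts 400 m ahead
t = 0
while tiger_pos < deer_pos:
    tiger_pos += 40            # tiger: 5 leaps * 8 m each minute
    deer_pos += 20             # deer: 4 leaps * 5 m each minute
    t += 1
print(t, tiger_pos)            # 20 800
```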