
0 votes

Suppose the functions F and G can be computed in 8 and 3 nanoseconds by functional units UF and UG, respectively. Given three instances of UF and three instances of UG, it is required to implement the computation F(G(Xi)) for 1 ≤ i ≤ 13. A control unit selects the next task(s) and allocates them to the currently free resource(s); allocations can be done in parallel, independent of each other. Each such allocation takes 1 nanosecond, and the control unit waits for the resources to be freed before deciding the next round of allocation. Ignoring all other delays, the minimum time required to complete this computation (in nanoseconds) is: (A) 28 (B) 33 (C) 43 (D) 49

My answer is 43, but GATEBOOK's answer is 49.

0 votes

The answer will be 49.

Given: each allocation takes 1 nanosecond.

The first set of 3 instructions takes 13 ns. How?

1 ns for UG allocation + 3 ns for UG + 1 ns for UF allocation + 8 ns for UF = 13 ns.

After that, every set of 3 instructions takes only 9 ns, because while UF is busy with the current set (1 ns allocation + 8 ns compute = 9 ns), the three UG units can process the next set (1 ns allocation + 3 ns compute = 4 ns) in parallel. The same thing happens 3 more times, and the last set (the single 13th instruction) also takes 9 ns.

So the total is 13 + 9 + 9 + 9 + 9 = 49 ns.
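As a sanity check, the schedule above can be reproduced with a short calculation. This is just a sketch of the batching argument; `min_time` and its parameter names are my own, not part of the question:

```python
import math

def min_time(n=13, g=3, f=8, alloc=1, units=3):
    """Minimum time to compute F(G(Xi)) for n inputs with `units`
    copies each of UG and UF, assuming the G stage of each batch
    overlaps with the F stage of the previous batch
    (1 + 3 = 4 ns fits inside 1 + 8 = 9 ns)."""
    batches = math.ceil(n / units)      # ceil(13 / 3) = 5 batches
    first = alloc + g + alloc + f       # 1 + 3 + 1 + 8 = 13 ns
    rest = (batches - 1) * (alloc + f)  # 4 * 9 = 36 ns
    return first + rest

print(min_time())  # 49
```

With the default parameters this gives 13 + 4 × 9 = 49 ns, matching GATEBOOK's answer; the 43 ns figure comes from forgetting the 1 ns allocation cost on each round.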
