Suppose the functions F and G can be computed in 8 and 3 nanoseconds by functional
units UF and UG, respectively. Given three instances of UF and three instances of UG,
it is required to implement the computation F(G(Xi)) for 1 ≤ i ≤ 13.
A control unit selects the next task(s) and allocates them to the currently free
resource(s); allocations can be done in parallel, independently of each other.
Each allocation takes 1 nanosecond, and the control unit waits for the resources to
be freed before deciding the next round of allocations. Ignoring all other delays,
the minimum time required to complete this computation is (in nanoseconds):
(A) 28 
(B) 33 
(C) 43 
(D) 49

My answer is 43, but the gatebook answer is 49.


2 Answers

2 votes

The answer will be 49.

Given: each allocation takes 1 nanosecond.

The first set of 3 tasks takes 13 ns. How?

1 ns for the UG allocation + 3 ns for UG + 1 ns for the UF allocation + 8 ns for UF = 13 ns.

Note that in that second allocation round the control unit also hands the next 3 inputs to the now-free UG units, so the G step of each following batch overlaps with the F step of the current one. After that, every set of 3 tasks therefore takes only 9 ns (1 ns allocation + 8 ns for UF, since its G step already finished during the previous F computation). The same thing happens 3 more times, and the last task (the 13th) also takes 9 ns.

So the total is 13 + 9 + 9 + 9 + 9 = 49 ns.
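A quick way to check this timing is a small simulation, a sketch assuming the control unit allocates in synchronous rounds and waits for every unit it started before the next 1 ns allocation step (variable names like g_pending are just illustrative):

```python
G_TIME, F_TIME = 3, 8   # ns per G / F computation
N_UNITS = 3             # three UG units and three UF units
N_TASKS = 13            # compute F(G(Xi)) for 1 <= i <= 13

g_pending = N_TASKS     # inputs still waiting for their G step
f_pending = 0           # G-results waiting for their F step
t = 0                   # elapsed time in ns

while g_pending or f_pending:
    t += 1                             # the allocation round itself costs 1 ns
    n_f = min(f_pending, N_UNITS)      # F steps whose G-result is ready
    n_g = min(g_pending, N_UNITS)      # next batch of G steps
    f_pending -= n_f
    g_pending -= n_g
    # All units started this round run in parallel; the control unit
    # waits for the slowest of them before the next allocation round.
    t += max(F_TIME if n_f else 0, G_TIME if n_g else 0)
    f_pending += n_g                   # finished G-results now queue for F

print(t)  # -> 49
```

Under this round-synchronous assumption the simulation reproduces 49 ns; allowing allocations the moment any single unit frees up is what would lead to the smaller 43 ns figure.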
