Suppose the functions F and G can be computed in 8 and 3 nanoseconds by functional 
units UF and UG, respectively. Given three instances of UF and three instances of UG, 
it is required to implement the computation F(G(Xi)) for 1 ≤ i ≤ 13. 
A control unit selects the next task(s) and allocates them to currently free resources;
allocations can be done in parallel, independently of each other.
Each such allocation takes 1 nanosecond, and the control unit waits for the resources to
be freed before deciding the next round of allocation. Ignoring all other delays,
the minimum time required to complete this computation is (in nanoseconds):
(A) 28 
(B) 33 
(C) 43 
(D) 49

My answer is 43, but the Gatebook answer is 49.

Same here.. 43 :/

1 Answer


The answer will be 49.

Given: each such allocation takes 1 nanosecond.

The first set of 3 instructions takes 13 ns. How?

1 ns for UG allocation, 3 ns for UG, 1 ns for UF allocation, 8 ns for UF = 13 ns.

After that, every set of 3 instructions takes 9 ns (the same thing happens 3 more times),

and the last set also takes 9 ns.

So the total is 13 + 9 + 9 + 9 + 9 = 49 ns.
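
To make the timing concrete, here is a minimal Python sketch of one way to read that schedule: the UG stage (1 ns allocation + 3 ns compute) of each later batch is assumed to finish while the previous batch is still occupying UF (8 ns), so every round after the first only adds the 1 ns UF allocation plus 8 ns of UF work. The constant names and the overlap assumption are mine, not part of the original answer.

import math

N_TASKS = 13      # compute F(G(Xi)) for 1 <= i <= 13
BATCH = 3         # 3 instances of UF and 3 instances of UG
ALLOC = 1         # each allocation takes 1 ns
T_G, T_F = 3, 8   # UG takes 3 ns, UF takes 8 ns

rounds = math.ceil(N_TASKS / BATCH)    # 5 rounds of at most 3 tasks each

# Round 1: nothing to overlap with, so it pays for every stage.
total = ALLOC + T_G + ALLOC + T_F      # 1 + 3 + 1 + 8 = 13 ns

# Rounds 2..5: assume the UG stage of the next batch completes while the
# previous batch is still running on UF, so each later round only adds the
# UF allocation plus the UF computation.
total += (rounds - 1) * (ALLOC + T_F)  # 4 * (1 + 8) = 36 ns

print(total)                           # 49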

"After that, every set of 3 instructions takes 9 ns"

Could you explain how?
