- $1000$ consecutive records
- Size of $1$ record $= 3200$ Bytes
- Access Time of device $D = 10$ms
- Data Transfer Rate of device $D = 800\ast 10^3$ Bytes per second.
- CPU time to process Each record $= 3$ms.
- Time to transfer $1$ record $(3200 \;\text{Bytes})=\frac{3200\;\text{Bytes}}{800\ast 10^3} = 4$ms
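As a quick check, the per-record transfer time above can be computed directly from the given parameters (a minimal Python sketch; the variable names are my own):

```python
# Per-record transfer time from the given device parameters.
record_size = 3200             # bytes per record
transfer_rate = 800 * 10**3    # bytes per second
transfer_ms = record_size / transfer_rate * 1000  # seconds -> ms
print(transfer_ms)  # 4.0
```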
(A) Unblocked records with no buffer. Each record must be fetched in its entirety before the CPU can process it, and there is no overlap between fetching and processing.
Time to fetch one record $=$ access time for $D$ (incurred on every device access; also known as device latency) $+$ data transfer time
$=10\text{ms} + 4\text{ms} = 14\text{ms}$
Total time per record $=$ fetch $+$ process $=14\text{ms} + 3\text{ms} = 17\text{ms}$
Total time for program $P=1000\ast 17\text{ms} = 17\text{sec}$
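The arithmetic for part (A) can be sketched as follows (an illustrative Python snippet, not part of the original problem):

```python
# Part (A): no buffering -- fetch and CPU processing are strictly sequential.
access_ms = 10     # device access time per fetch
transfer_ms = 4    # time to transfer one 3200-byte record
cpu_ms = 3         # CPU time per record
records = 1000

per_record_ms = access_ms + transfer_ms + cpu_ms   # 17 ms per record
total_s = records * per_record_ms / 1000           # total elapsed time
print(total_s)  # 17.0
```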
(B) Unblocked records and $1$ buffer. Records are fetched one by one into the buffer, and the device delay is incurred for each record; however, the device can fetch the next record while the CPU processes the current one.
Time to bring one record into buffer $=10 + 4 = 14$ms.
Now let us see how the program goes.
- At $t=0$ms, the program starts and the buffer is empty.
- At $t=14$ms, $R_1$ fetched into the buffer and CPU starts processing it.
- At $t=17$ms, the CPU has processed $R_1$ and waits for the next record.
- At $t=28$ms, buffer gets filled with $R_2$ and CPU starts processing it.
To get the total time of the program, we think in terms of the last record: when it has been processed, all the others have already been processed too.
Last record $R_{1000}$ is fetched at $t=14\ast 1000=14000$ms, and the CPU takes $3$ms more to process it.
So, total elapsed time of program $P=14000+3=14003\text{ms} = 14.003\text{sec}$
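The timeline above can be replayed with a small event loop (a sketch under the same assumption that fetches proceed back-to-back, since a $14$ms fetch is longer than $3$ms of processing):

```python
# Part (B): one buffer -- the device fetches records back-to-back while the
# CPU overlaps its processing with the next fetch.
fetch_ms = 10 + 4   # access + transfer per record
cpu_ms = 3          # CPU time per record
records = 1000

fetch_done = 0   # time the current record lands in the buffer
cpu_free = 0     # time the CPU finishes its current record
for _ in range(records):
    fetch_done += fetch_ms                        # next record arrives 14 ms later
    cpu_free = max(fetch_done, cpu_free) + cpu_ms # CPU starts when record is ready
print(cpu_free)  # 14003
```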
(C) Each disk block contains $2$ records, and we assume the buffer can hold $1$ disk block at a time.
So, $1$ Block Size $=2\ast 3200=6400$ Bytes
Time to read a block $=\frac{6400}{800\ast 10^3} = 8$ms.
Each block read incurs the device access cost.
So, the total time to fetch one block and bring it into the buffer $= 10 + 8 = 18$ms.
We have $1000$ records, so we need to read $500$ blocks.
Each block holds two records, so the CPU time per block $=2\ast 3=6$ms.
Again, to compute the total time of program $P$, we think in terms of the last block.
Last block would be fetched at $t=0+(18\ast 500)=9000$ms.
After this, $6$ms more are needed to process the $2$ records in the $500^{\text{th}}$ block.
So, program time $P=9000+6=9006\text{ms} = 9.006\text{sec}$.
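Part (C) can be checked with the same style of arithmetic (an illustrative sketch; as in part (B), it assumes block fetches proceed back-to-back and overlap with processing):

```python
# Part (C): blocking factor 2, one block-sized buffer.
access_ms = 10
rate_bps = 800 * 10**3     # bytes per second
record_bytes = 3200
records = 1000
per_block = 2              # records per disk block
cpu_ms = 3                 # CPU time per record

block_bytes = per_block * record_bytes               # 6400 bytes
block_transfer_ms = block_bytes / rate_bps * 1000    # 8.0 ms
block_fetch_ms = access_ms + block_transfer_ms       # 18.0 ms
blocks = records // per_block                        # 500 blocks
total_ms = blocks * block_fetch_ms + per_block * cpu_ms  # last fetch + CPU on last block
print(total_ms)  # 9006.0
```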