Consider a system running 10 I/O-bound tasks and 1 CPU-bound task. Assume that each I/O-bound task issues an I/O operation after every 1 ms of CPU computation and that each I/O operation takes 10 ms to complete. Also assume that the context switching overhead is 0.1 ms and that all processes are long-running tasks. What is the CPU utilization (in %) for a round-robin scheduler when the time quantum is 10 ms?
Assume the CPU-bound process runs first, followed by the 10 I/O-bound processes in sequence.
Only the CPU-bound process runs for the full 10 ms time quantum. Each of the other 10 processes runs for only 1 ms before blocking on I/O, so there is a context switch after every 1 ms burst (the context switching overhead being 0.1 ms).
The time distribution in order to start execution of each process will look something like this:
p1, cs, p2, cs, p3, cs, ..., p11, ...
10, 0.1, 1, 0.1, 1, 0.1, ..., 1, ...
Time taken by p1 = 10 ms
Time taken by each of the processes p2, p3, ..., p11 = 1 ms
Time taken by context switch = 0.1 ms
In one cycle of execution: p1 uses the CPU for 10 ms; the other 10 processes (p2, p3, ..., p11) use it for 1 * 10 ms = 10 ms; and the 10 context switches (during which the CPU executes no process) waste 0.1 * 10 ms = 1 ms of CPU time.
Total time for one cycle, T = (10 + 10 + 1) ms = 21 ms
Total time the CPU was busy executing processes, t = (10 + 10) ms = 20 ms
Therefore, CPU utilization = t/T * 100 = 20/21 * 100 ≈ 95.24%
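The arithmetic above can be checked with a short sketch. The constants below come straight from the problem statement; the count of 10 context switches per cycle matches the accounting used in the solution.

```python
# Check of the round-robin CPU utilization calculation.
# Assumptions (from the problem): 1 CPU-bound task uses its full
# 10 ms quantum; each of the 10 I/O-bound tasks runs 1 ms before
# blocking; each context switch costs 0.1 ms; 10 switches are
# counted per cycle, as in the worked solution above.

cpu_bound_run = 10.0    # ms: CPU-bound task uses the whole quantum
io_bound_run = 1.0      # ms: I/O-bound task blocks after 1 ms
num_io_tasks = 10
switch_cost = 0.1       # ms per context switch
num_switches = 10       # context switches counted in one cycle

busy = cpu_bound_run + num_io_tasks * io_bound_run   # t = 20 ms
cycle = busy + num_switches * switch_cost            # T = 21 ms
utilization = busy / cycle * 100

print(f"t = {busy} ms, T = {cycle} ms, utilization = {utilization:.2f}%")
# → t = 20.0 ms, T = 21.0 ms, utilization = 95.24%
```

Note that the wasted fraction is just the switching overhead divided by the cycle length (1/21), so shrinking the quantum would increase the number of switches per unit time and lower utilization further.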