Suppose each instruction is 1 byte, so if the DMA controller never interrupts the CPU, the CPU fetches $10^6$ instructions per second (one fetch per microsecond).
Time to transfer 1 byte $= \frac{1}{16\text{K}}\ \text{s} = 62.5\ \mu\text{s}$
So the DMA controller interrupts (steals a cycle from) the CPU $\frac{1\ \text{s}}{62.5\ \mu\text{s}} = 16000$ times per second.
Each stolen cycle costs the CPU one instruction fetch ($1\ \mu\text{s}$), so instead of $10^6$ instructions per second it now fetches only $10^6 - 16000 = 984000$ instructions.
Hence the CPU is slowed down by $\frac{16000}{10^6} = 0.016 = 1.6\%$
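The arithmetic above can be sketched as a small function. This is an illustrative helper (the name and parameters are not from the original), assuming one stolen CPU cycle, i.e. one lost instruction slot, per byte transferred:

```python
def dma_slowdown(cpu_ips: float, transfer_rate_bytes_per_sec: float) -> float:
    """Fraction of CPU instruction slots lost to cycle-stealing DMA.

    Assumes each byte transferred steals exactly one CPU cycle,
    and each cycle would otherwise fetch one instruction.
    """
    steals_per_sec = transfer_rate_bytes_per_sec  # one steal per byte
    return steals_per_sec / cpu_ips

# Values from the problem: 10^6 instructions/s, 16K bytes/s.
slowdown = dma_slowdown(cpu_ips=1e6, transfer_rate_bytes_per_sec=16_000)
print(f"{slowdown:.1%}")  # 1.6%
```

Plugging in the problem's numbers reproduces the $1.6\%$ slowdown, and the same helper answers the question for any transfer rate or CPU speed.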