Three processes $A$, $B$, and $C$ each execute a loop of $100$ iterations. In each iteration of the loop, a process performs a single computation that requires $t_c$ CPU milliseconds and then initiates a single I/O operation that lasts for $t_{io}$ milliseconds. It is assumed that the computer on which the processes execute has a sufficient number of I/O devices, and that the OS assigns a different I/O device to each process. The scheduling overhead of the OS is negligible. The processes have the following characteristics:
$$\begin{array}{|c|c|c|} \hline \textbf{Process id} & \textbf{$t_c$} & \textbf{$t_{io}$} \\\hline \textbf{A} & \text{100 ms} & \text{500 ms} \\\hline \textbf{B} & \text{350 ms} & \text{500 ms}\\\hline \textbf{C} & \text{200 ms} & \text{500 ms} \\\hline \end{array}$$
The processes $A$, $B$, and $C$ are started at times $0$, $5$, and $10$ milliseconds respectively, in a pure time-sharing system (round-robin scheduling) that uses a time slice of $50$ milliseconds. The time in milliseconds at which process $C$ would complete its first I/O operation is ___________.
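The schedule can be checked with a short simulation. The sketch below (not part of the original question; `first_io_completion` is a hypothetical helper name) replays round-robin scheduling with a 50 ms slice for the first CPU burst of each process, and reports when the target process's first I/O finishes. It ignores processes returning from I/O, which is safe here because no first I/O completes before $C$ finishes its first CPU burst.

```python
from collections import deque

def first_io_completion(target):
    """Simulate round-robin (50 ms slice) over the FIRST CPU burst of each
    process and return the time the target's first I/O completes.
    Simplification (valid for this instance): processes that finish their
    first CPU burst leave for I/O and are not re-queued, since no I/O
    completes early enough to affect the target's first burst."""
    slice_ms, io_ms = 50, 500
    procs = {'A': {'arrival': 0,  'tc': 100},
             'B': {'arrival': 5,  'tc': 350},
             'C': {'arrival': 10, 'tc': 200}}
    remaining = {p: info['tc'] for p, info in procs.items()}
    order = sorted(procs, key=lambda p: procs[p]['arrival'])
    queue = deque()
    t, i = 0, 0  # current time, index of next arrival
    while True:
        # admit every process that has arrived by time t
        while i < len(order) and procs[order[i]]['arrival'] <= t:
            queue.append(order[i]); i += 1
        if not queue:                      # CPU idle: jump to next arrival
            t = procs[order[i]]['arrival']
            continue
        p = queue.popleft()
        run = min(slice_ms, remaining[p])
        end = t + run
        # arrivals during this slice join the queue before p is re-queued
        while i < len(order) and procs[order[i]]['arrival'] <= end:
            queue.append(order[i]); i += 1
        t = end
        remaining[p] -= run
        if remaining[p] == 0:
            if p == target:
                return t + io_ms           # CPU burst done; I/O starts now
            # non-target process departs for I/O (not re-queued, see docstring)
        else:
            queue.append(p)                # slice expired: back of the queue

print(first_io_completion('C'))  # → 1000
```

Tracing by hand gives the same sequence: A runs 0-50, B 50-100, C 100-150, A 150-200 (A's burst done), B 200-250, C 250-300, B 300-350, C 350-400, B 400-450, C 450-500 (C's burst done), so C's I/O spans 500-1000 ms.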