The thing here is that a char is 1 byte (sizeof(char) is always 1 in C; what is implementation-defined is whether plain char is signed or unsigned).
An 8-bit signed char can represent only the range -128 to 127.
Now, the binary representation of 200 is 11001000, which is clearly out of range for a signed char. Since the MSB is 1, the stored bit pattern is interpreted as a negative number in two's complement. To find its magnitude, take the two's complement of 11001000: invert the bits to get 00110111, then add 1 to get 00111000, which is 56. Attach the minus sign to get the true value, so -56 is the answer.
Refer: https://gateoverflow.in/164498/c-programming