The American Standard Code for Information Interchange (ASCII) has 128 binary-coded characters. If a certain computer generates data at 600,000 characters per second, determine the following. (a) The number of bits (binary digits) required per character. (b) The number of bits per second required to transmit the computer output, and the minimum bandwidth required to transmit this signal. (c) For single-error-detection capability, an additional bit (parity bit) is added to the code of each character. Modify your answers in parts (a) and (b) in view of this information. (d) Show how many DS1 carriers would be required to transmit the signal of part (c) in the North American digital hierarchy (Sec. 5.4.2).
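
The arithmetic behind parts (a) through (d) can be sketched as below. This is a minimal illustration under stated assumptions, not the textbook's worked solution: it takes the minimum bandwidth for binary transmission as the Nyquist value of half the bit rate, and it takes the DS1 line rate as the standard 1.544 Mb/s with the required number of carriers rounded up to a whole number.

```python
import math

chars_per_sec = 600_000

# Part (a): bits per character for a 128-symbol code.
bits_per_char = math.ceil(math.log2(128))                 # 7 bits

# Part (b): bit rate, and minimum bandwidth assuming the Nyquist
# relation B_min = R_b / 2 for binary transmission.
bit_rate = chars_per_sec * bits_per_char                   # 4.2 Mb/s
min_bandwidth = bit_rate / 2                               # 2.1 MHz

# Part (c): one parity bit per character for single-error detection.
bits_per_char_parity = bits_per_char + 1                   # 8 bits
bit_rate_parity = chars_per_sec * bits_per_char_parity     # 4.8 Mb/s
min_bandwidth_parity = bit_rate_parity / 2                 # 2.4 MHz

# Part (d): DS1 carriers needed, assuming the standard DS1 rate of
# 1.544 Mb/s and rounding up to a whole number of carriers.
DS1_RATE = 1_544_000                                       # b/s
ds1_carriers = math.ceil(bit_rate_parity / DS1_RATE)       # 4 carriers

print(f"(a) {bits_per_char} bits/character")
print(f"(b) {bit_rate:.0f} b/s, minimum bandwidth {min_bandwidth:.0f} Hz")
print(f"(c) {bits_per_char_parity} bits/character, {bit_rate_parity:.0f} b/s, "
      f"minimum bandwidth {min_bandwidth_parity:.0f} Hz")
print(f"(d) {ds1_carriers} DS1 carriers")
```

Note that part (d) as computed here divides the total bit rate by the DS1 line rate; an answer that instead maps the stream onto DS1 payload channels may differ slightly depending on the framing convention adopted in Sec. 5.4.2.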