5.4-1 The American Standard Code for Information Interchange (ASCII) has 128 binary-coded characters. If a certain computer generates data at 600,000 characters per second, determine the following.
(a) The number of bits (binary digits) required per character.
(b) The number of bits per second required to transmit the computer output, and the minimum bandwidth required to transmit this signal.
(c) For single-error-detection capability, an additional bit (parity bit) is added to the code of each character. Modify your answers in parts (a) and (b) in view of this information.
(d) Show how many DS1 carriers would be required to transmit the signal of part (c) in the North American digital hierarchy (Sec. 5.4.2).

Answer:

A bit (binary digit) is a single switch that can be turned on or off: zero or one. Since 2⁷ = 128, ASCII is a 7-bit code, so (a) 7 bits are required per character. (b) The bit rate is 600,000 × 7 = 4.2 Mbit/s, and the minimum (Nyquist) bandwidth is half the bit rate, i.e., 2.1 MHz. (c) With a parity bit added, each character takes 8 bits, giving 4.8 Mbit/s and a minimum bandwidth of 2.4 MHz. (d) A DS1 carrier runs at 1.544 Mbit/s, so ⌈4.8 / 1.544⌉ = 4 DS1 carriers are required.
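The arithmetic for all four parts can be sketched as follows. This is a minimal check, assuming the minimum bandwidth is the Nyquist limit of half the bit rate and the DS1 line rate of 1.544 Mbit/s; the constant names are my own.

```python
import math

CHAR_RATE = 600_000   # characters per second (given)
CODE_SIZE = 128       # number of ASCII characters (given)
DS1_RATE = 1.544e6    # DS1 line rate in bit/s (North American hierarchy)

# (a) bits per character: smallest n with 2**n >= 128
bits_per_char = math.ceil(math.log2(CODE_SIZE))       # 7 bits

# (b) bit rate, and minimum (Nyquist) bandwidth = bit rate / 2
bit_rate = CHAR_RATE * bits_per_char                  # 4.2e6 bit/s
min_bw = bit_rate / 2                                 # 2.1e6 Hz

# (c) add one parity bit per character for single error detection
bit_rate_parity = CHAR_RATE * (bits_per_char + 1)     # 4.8e6 bit/s
min_bw_parity = bit_rate_parity / 2                   # 2.4e6 Hz

# (d) number of DS1 carriers needed (round up to a whole carrier)
ds1_count = math.ceil(bit_rate_parity / DS1_RATE)     # 4 carriers
```

Rounding up in part (d) reflects that a fraction of a DS1 carrier cannot be leased: 4.8 / 1.544 ≈ 3.1, so four carriers are needed.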

What is a bit?

  • The smallest unit of data that a computer can process and store is the bit (binary digit).
  • Like a light switch, a bit is always in one of two physical states, represented by the binary digits 0 and 1. The two states can equivalently be described as yes/no, on/off, or true/false.
  • In dynamic memory, bits are stored as electrical charge on capacitors; the presence or absence of charge determines each bit's value.
  • Although a computer can examine and change data at the bit level, most systems process and store data in bytes.
  • A byte is an eight-bit string treated as a single unit. Computer memory and storage are conventionally described in bytes.
  • For instance, a storage device might hold 1 terabyte (TB), i.e., 1,000,000 megabytes. Since 1 MB is 1 million bytes, or 8 million bits, a 1 TB drive can hold 8 trillion bits of data.

To learn more about bits (binary digits), refer to:

https://brainly.com/question/16612919
