
1.9. Binary Codes


Binary codes are used to translate human symbols into ones and zeros. The most important set of symbols is the alphabet used for human communication, so every key and character must have a unique binary code. The minimum number of bits required to uniquely identify all the keys on a keyboard must satisfy the following condition:


2^(Number of bits) ≥ Number of keys

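As a quick sketch of this condition (the function name `min_bits` is illustrative, not from the text), the smallest bit count for a given key count can be computed by taking the ceiling of the base-2 logarithm:

```python
import math

def min_bits(num_keys: int) -> int:
    """Smallest number of bits satisfying 2**bits >= num_keys."""
    return math.ceil(math.log2(num_keys))

# A standard 104-key keyboard needs ceil(log2(104)) = 7 bits,
# since 2**6 = 64 is too few and 2**7 = 128 is enough.
print(min_bits(104))  # 7
```

Note that 7 bits cover up to 128 distinct keys; a 129th key would push the requirement to 8 bits.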

ASCII Code
Initially, IBM’s scheme for representing alphanumeric and control characters was the most commonly used coding method. This scheme was referred to as the Extended Binary-Coded Decimal Interchange Code (EBCDIC). Its dominance was driven by IBM’s near-monopoly position in the computer industry until the early 1980s.


Most other manufacturers wanted a non-proprietary coding scheme, which led to the American Standard Code for Information Interchange (ASCII). ASCII was adopted by the majority of vendors and quickly overtook EBCDIC as the most commonly used coding scheme.

ASCII represents alphanumeric and control characters with 7 bits, typically stored in an 8-bit byte. The ASCII code
table is shown below:

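As an illustrative sketch (using Python's built-in `ord` and `chr` functions, which are not part of the original text), the numeric ASCII code of a character and its 7-bit binary form can be inspected directly:

```python
# ord() maps a character to its numeric ASCII code;
# chr() maps a code back to the character.
for ch in ['A', 'a', '0', ' ']:
    code = ord(ch)                         # numeric ASCII code
    print(ch, code, format(code, '07b'))   # character, decimal, 7-bit binary

# 'A' is code 65, or 1000001 in 7-bit binary.
print(chr(65))  # A
```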
In the early 1990s, the need for a code capable of representing Asian languages with large character sets led to the development of the Unicode standard.