
ASCII codes and their importance

Computers only understand binary; humans understand languages. To translate data into something humans can read, you need a character set. There have been many over the years, but ASCII has become the standard for almost all operating systems today.
It stands for American Standard Code for Information Interchange.
When a computer pulls up data to display, it looks at the number of each character, for example 65 decimal, and translates it into the character corresponding to that number. In American English ASCII that means a capital A.
So 1000001 1000010 1000011 stored on a computer gets turned into ABC when read to the screen.
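As a quick illustration, the same mapping in Python (chr and ord use the same code points as ASCII for values 0-127):

    # 65 decimal is capital A, and back again
    print(ord('A'))   # 65
    print(chr(65))    # 'A'

    # The three 7-bit values above, decoded back into characters
    bits = ['1000001', '1000010', '1000011']
    print(''.join(chr(int(b, 2)) for b in bits))   # ABC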
Where ASCII is most often used directly is by programmers. Programmers will trap keystrokes by their character code, then decide what to do when certain keys are pressed. Not many people would know what to do with a hex dump any more, but some folks in the IT industry still use them, especially when doing low-level packet sniffing or reverse engineering binary files.
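Here is a rough sketch of both ideas in Python: branching on a character's code, and rendering bytes the way a hex dump does. Checking characters with ord stands in for real keystroke hooking, which is platform-specific:

    def classify(ch):
        # Decide what to do with a character based on its ASCII code
        code = ord(ch)
        if 48 <= code <= 57:
            return "digit %s (code %d)" % (ch, code)
        if 65 <= code <= 90 or 97 <= code <= 122:
            return "letter %s (code %d)" % (ch, code)
        return "other character (code %d)" % code

    def hex_dump(data):
        # Hex bytes on the left, printable ASCII (or '.') on the right,
        # the way packet sniffers and binary viewers usually show data
        hex_part = ' '.join('%02x' % b for b in data)
        text_part = ''.join(chr(b) if 32 <= b <= 126 else '.' for b in data)
        return '%-24s %s' % (hex_part, text_part)

    print(classify('A'))          # letter A (code 65)
    print(hex_dump(b'ABC\r\n'))   # 41 42 43 0d 0a           ABC..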
I've included a couple of links with an ASCII chart and a decent explanation of character sets.
It is a way to encode information. It is a 7-bit encoding (for transmission protocol reasons); the first codes were for controlling the transmission (and are called control codes); the rest are "printable" characters, i.e. each code maps to a graphic form (a glyph), like a letter, a digit (not binary encoded, of course), punctuation and so on.
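A small sketch of that layout in Python: codes 0-31 (plus 127, DEL) are the control codes, and 32-126 are the printable glyphs:

    for code in (9, 10, 13, 32, 48, 65, 97, 126, 127):
        kind = "control" if code < 32 or code == 127 else "printable"
        shown = repr(chr(code)) if kind == "printable" else "control #%d" % code
        print("%3d  %-9s  %s" % (code, kind, shown))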
It's not the only standard for such a task; other standards existed and exist, but ASCII is surely the most common, and it is the "base" of a lot of 8-bit encodings. Since ASCII was born to transmit English, it has no accented letters, no foreign diacritics and so on, so extensions are needed to represent and transmit other languages.
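For example, in Python you can see that the first 128 code points of Latin-1 and UTF-8 are identical to ASCII, and anything outside plain English needs one of those extensions:

    print('abc'.encode('ascii'))     # b'abc' -- plain English fits in 7 bits
    print('é'.encode('latin-1'))     # b'\xe9' -- one 8-bit extension
    print('é'.encode('utf-8'))       # b'\xc3\xa9' -- multi-byte, but ASCII-compatible
    try:
        'é'.encode('ascii')
    except UnicodeEncodeError as err:
        print('no accented letters in plain ASCII:', err)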
Nowadays ASCII is almost everywhere: HTML tags are ASCII (packed into bytes), and a lot of computer languages expect their keywords to be ASCII encoded.
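You can check that claim for the markup itself in Python; every byte of a tag falls in the ASCII range, even if the text between tags later needs a richer encoding:

    tag = '<p class="note">'
    print(tag.encode('ascii'))               # encodes without error
    print(all(ord(c) < 128 for c in tag))    # True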
