
What are the similarities and differences between 7-bit and 8-bit ASCII?



6 Answers

Travis Casey, Co-founder and Lead Writer


Answered November 14, 2019
Originally Answered: What are the similarities between 7 IT and 8 bit ASCII?
I’m going to assume that what you’re trying to ask is “What are the similarities between 7 bit and 8 bit ASCII?”

ASCII is a 7 bit code. Note that ASCII was not originally created for computers — it was created for teletypes,
which are an older technology. Since it was convenient to connect teletypes to computers for input and
output, many computer systems came to support ASCII for input and output.

At first, the size of a “byte” varied between different computers, but eventually, the industry settled on bytes
of 8 bits. Now, you could “pack” ASCII in, so that 8 ASCII characters would fit into 7 bytes. This, however,
would be extremely inconvenient to handle, so most companies chose to put one ASCII character per byte
when storing ASCII.
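To make the packing idea concrete, here's a quick Python sketch (the function names are just illustrative) that squeezes 8 seven-bit ASCII characters into 7 bytes and back:

```python
def pack7(text):
    """Pack 8 seven-bit ASCII characters into 7 bytes by
    concatenating their 7-bit codes into one 56-bit integer."""
    assert len(text) == 8 and all(ord(c) < 128 for c in text)
    bits = 0
    for c in text:
        bits = (bits << 7) | ord(c)   # append 7 bits per character
    return bits.to_bytes(7, "big")    # 8 * 7 = 56 bits = 7 bytes

def unpack7(data):
    """Recover the 8 characters from 7 packed bytes."""
    bits = int.from_bytes(data, "big")
    return "".join(chr((bits >> (7 * i)) & 0x7F) for i in range(7, -1, -1))

packed = pack7("ASCIIfun")
print(len(packed))      # 7 bytes instead of 8
print(unpack7(packed))  # ASCIIfun
```

All that bit shifting is exactly the kind of bookkeeping that made packed storage inconvenient compared to simply using one character per byte.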

This left the problem of what to do with the additional bit. Some systems used this to indicate a character
attribute, such as bold or underlined text. Other companies, however, chose to extend the ASCII character set,
adding another 128 characters. However, not all companies that did this added the same characters, or did so
with the same characters having the same addresses. Thus, there are many different “extended ASCII”
character sets that use 8 bits per character. One of the most widely used became the IBM Extended ASCII set,
used by the IBM PC.

Unfortunately, without context, when someone says “8 bit ASCII”, you can’t tell whether they mean “standard
7 bit ASCII, but with the high bit ignored”, “IBM extended ASCII”, “Commodore extended ASCII”, etc.

We can say that all of these are similar in that they agree on what the numbers 0 to 127 stand for, though. So,
that’s the similarity between 7 bit and 8 bit ASCII — they agree on what 0 to 127 stand for. In 7 bit ASCII,
higher numbered character codes simply don’t exist. In 8 bit ASCII, they can mean different things, depending
on which extended ASCII set is in use.
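You can see this agreement below 128 and disagreement above 127 directly in Python, which ships codecs for several of these sets (cp437 is the IBM PC code page, latin-1 is ISO 8859-1):

```python
data = bytes([0x41, 0xE9])     # 'A' plus one byte with the high bit set

print(data.decode("cp437"))    # 'AΘ' under the IBM PC code page
print(data.decode("latin-1"))  # 'Aé' under ISO 8859-1

# Both code pages agree on every value below 128
assert bytes(range(128)).decode("cp437") == bytes(range(128)).decode("latin-1")
```

The same byte 0xE9 means a Greek theta on one system and an accented e on another, while 0x41 is 'A' everywhere.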



Dan Higdon, works at Retro Studios


Answered July 7, 2019
Originally Answered: What are the differences between ASCII-7 and ASCII-8?
Both are stored in a byte (8 bits), but the original ASCII standard only defined the values 0–127 (i.e., the lower 7 bits). The high bit, and thus the values 128–255, were left undefined.

Why would anyone make a standard like that?

In data transmission of the day, each bit took noticeable time to send, so transmitting 7 bits per character instead of 8 saved one bit in eight — a 12.5% speed improvement.

Since bytes with the high bit set were not valid ASCII data, some programs of the “8 bit era” used the high bit
as a marker of some kind, so that a letter A with the high bit set might be considered a bolded A, or the first
letter of a sentence, etc.
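Here's a hypothetical sketch of that trick in Python (the BOLD flag is just an example attribute, not any particular program's convention):

```python
BOLD = 0x80                  # the otherwise-unused eighth bit

def mark_bold(ch):
    return ord(ch) | BOLD    # set the high bit to flag the character

def plain_char(code):
    return chr(code & 0x7F)  # mask the flag off to recover plain ASCII

code = mark_bold("A")
print(hex(code))             # 0xc1: no longer a valid 7-bit ASCII value
print(bool(code & BOLD))     # True, so render this one bold
print(plain_char(code))      # A
```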

But unfortunately, 7 bit ASCII didn’t contain characters useful for languages other than English, and by the ’80s, most computer manufacturers had added their own set of characters in the “high number” space — ATASCII for Atari, PETSCII for Commodore, etc. In many cases, these character sets included things like block graphics and playing-card symbols (hearts, clubs, etc.), but there was no standard above 127.

Eventually, the ISO group stepped in (see Wikipedia’s Extended_ASCII article for a good discussion), and in time we wound up with UTF-8 Unicode, which is a superset of ASCII-7 and a very popular encoding scheme (to say the least).
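The superset relationship is easy to demonstrate: any pure 7-bit ASCII text is byte-for-byte identical in UTF-8, while characters beyond 127 become multi-byte sequences:

```python
s = "plain ASCII text"
print(s.encode("utf-8") == s.encode("ascii"))  # True: identical bytes

# Non-ASCII characters get multi-byte UTF-8 sequences instead
print("é".encode("utf-8"))                     # b'\xc3\xa9'
```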



Brant Merryman, 25 years of app development for Mac, Windows, and Linux.
Answered October 20, 2018
Originally Answered: What is the difference between 7 bit ASCII and 8 bit ASCII?
7 bit ASCII is a subset of the characters available in 8-bit ASCII. The reason the 7-bit form was typically used is that the extra bit could serve as a parity bit, a method of error detection. This used to be important back when people used acoustic-coupler modems, because line noise was more likely to garble characters. You would enable parity as either “even” or “odd”, and the sender would add up all of the 1 bits in the character and set the parity bit accordingly. This allowed the integrity of each character to be checked.

Nowadays we mostly don’t use parity, so most serial connections are set up with 8 data bits. If you look up an ASCII chart you may see a section called “extended ASCII”, which covers the values from 128–255. 7 bit ASCII just uses the values from 0–127.
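Here’s a small Python sketch of even parity over a 7-bit character (the function names are just illustrative):

```python
def even_parity_byte(ch):
    """Put an even-parity bit in the high bit of a 7-bit ASCII code,
    so the total number of 1 bits in the byte comes out even."""
    code = ord(ch)
    assert code < 128
    parity = bin(code).count("1") & 1   # 1 if the 7-bit count is odd
    return code | (parity << 7)

def parity_ok(byte):
    return bin(byte).count("1") % 2 == 0

b = even_parity_byte("C")   # 'C' = 0x43 has three 1 bits, so the high bit is set
print(hex(b))               # 0xc3
print(parity_ok(b))         # True
print(parity_ok(b ^ 0x01))  # False: a single flipped bit is detected
```

Note that a two-bit error still passes the check, which is why parity only ever gave single-bit error detection.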
