
Bandwidth is defined as a range within a band of frequencies or wavelengths. Bandwidth is also defined as the amount of data that can be transmitted in a fixed amount of time.

For digital devices, the bandwidth is usually expressed in bits per second (bps) or bytes per second. For analog devices, the bandwidth is expressed in cycles per second, or Hertz (Hz).

The bandwidth is particularly important for I/O devices. For example, a fast disk drive can be hampered by a bus with a low bandwidth. This is the main reason that new buses, such as AGP, have been developed for the PC.

by Tim Fisher
Updated August 14, 2017

The term bandwidth has a number of technical meanings but since the popularization of the
internet, it has generally referred to the volume of information per unit of time that a
transmission medium (like an internet connection) can handle.

An internet connection with a larger bandwidth can move a set amount of data (say, a video file)
much faster than an internet connection with a lower bandwidth.

Bandwidth is typically expressed in bits per second, like 60 Mbps or 60 Mb/s, to describe a data
transfer rate of 60 million bits (megabits) every second.

How Much Bandwidth Do You Have? (& How Much Do You Need?)

See How to Test Your Internet Speed for help on how to accurately determine how much
bandwidth you have available to you. Internet speed test sites are often, but not always, the
best way to do that.

How much bandwidth you need depends on what you plan on doing with your internet
connection. For the most part, more is better, constrained of course by your budget.

In general, if you plan on doing nothing but Facebook and the occasional video watching, a low-
end high-speed plan is probably just fine.
If you have a few TVs that will be streaming Netflix, and more than a few computers and
devices that might be doing who-knows-what, I'd go with as much as you can afford. You won't
be sorry.

Bandwidth Is a Lot Like Plumbing

Plumbing provides a great analogy for bandwidth... seriously!

Data is to available bandwidth as water is to the size of the pipe.

In other words, as the bandwidth increases so does the amount of data that can flow through in
a given amount of time, just like as the diameter of the pipe increases, so does the amount of
water that can flow through during a period of time.

Say you're streaming a movie, someone else is playing an online multiplayer video game, and a
couple others on your same network are downloading files or using their phones to watch online
videos. It's likely that everyone will feel that things are a bit sluggish if not constantly starting
and stopping. This has to do with bandwidth.

To return to the plumbing analogy, assuming the water pipe to a home (the bandwidth) remains
the same size, as the home's faucets and showers are turned on (data downloads to the
devices being used), the water pressure at each point (the perceived "speed" at each device)
will reduce - again, because there's only so much water (bandwidth) available to the home (your
network).

Put another way: the bandwidth is a fixed amount based on what you pay for, so, while one
person may be able to stream a high-def video without any lag whatsoever, the moment you
begin adding other download requests to the network, each one will get just their portion of the
full capacity.

For example, if a speed test identifies my download speed as 7.85 Mbps, it means that given no
interruptions or other bandwidth-hogging applications, I could download a 7.85 megabit (or 0.98
megabytes) file in one second.

A little math would tell you that at this allowed bandwidth, I could download about 60 MB of
information in one minute, or 3,528 MB in an hour, which is equivalent to a 3.5 GB file... pretty
close to a full-length, DVD-quality movie.
So while I could theoretically download a 3.5 GB video file in an hour, if someone else on my
network tries to download a similar file at the same time, it would now take two hours to
complete the download. Again, the network only permits a fixed amount of data to flow at any
given time, so it now has to let the other download use some of that bandwidth too.

Technically, the network would now see 3.5 GB + 3.5 GB, for 7 GB of total data that needs to be
downloaded. The bandwidth capacity doesn't change because that's the level you pay
your ISP for, so the same concept applies: a 7.85 Mbps connection will take two hours
to download 7 GB of data, just as it takes one hour to download half that amount.
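
For a quick sanity check of that arithmetic, here is a minimal Python sketch using the same example numbers (7.85 Mbps and a 3.5 GB file); the variable names are just illustrative:

```python
# Rough sketch of the arithmetic above, using the article's example numbers.
bandwidth_mbps = 7.85        # download speed from the speed test, in megabits per second
file_size_gb = 3.5           # approximate size of a DVD-quality movie

megabytes_per_sec = bandwidth_mbps / 8       # megabits/s -> megabytes/s (~0.98 MB/s)
file_size_mb = file_size_gb * 1000           # gigabytes -> megabytes (decimal units)

minutes = file_size_mb / megabytes_per_sec / 60
print(f"One download by itself: about {minutes:.0f} minutes")            # roughly an hour
print(f"Two downloads sharing the line: about {2 * minutes:.0f} minutes each")
```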

The Difference Between Mbps and MBps

It's important to understand that bandwidth can be expressed in any unit (bytes, kilobytes,
megabytes, gigabits, etc.). Your ISP might use one term, a testing service another, and a video
streaming service yet another. You'll need to understand how these terms are all related and
how to convert between them if you want to avoid paying for too much internet service or,
maybe worse, ordering too little for what you want to do with it.

For example, 15 MBps is not the same as 15 Mbps (note the lowercase b). The first reads as 15
megaBYTES per second while the second is 15 megaBITS per second. These two values differ by a
factor of 8 since there are 8 bits in a byte.

If these two bandwidth readings were written in megabytes per second (MBps), they'd be 15 MBps
and 1.875 MBps (since 15/8 is 1.875). Written in megabits per second (Mbps), the first would be
120 Mbps (15x8 is 120) and the second 15 Mbps.

Tip: This same concept applies to any data unit you might encounter. You can use an online
conversion calculator if you'd rather not do the math manually. See Terabytes,
Gigabytes, & Petabytes: How Big are They? for more information.
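
If you'd rather script the conversion than use a calculator, here's a minimal sketch of the bits-to-bytes math (the function names are just illustrative):

```python
BITS_PER_BYTE = 8

def mbps_to_megabytes_per_sec(mbps: float) -> float:
    """Convert a rate in megabits per second (Mbps) to megabytes per second (MBps)."""
    return mbps / BITS_PER_BYTE

def megabytes_per_sec_to_mbps(mbytes_per_sec: float) -> float:
    """Convert a rate in megabytes per second (MBps) to megabits per second (Mbps)."""
    return mbytes_per_sec * BITS_PER_BYTE

print(mbps_to_megabytes_per_sec(15))   # 15 Mbps -> 1.875 MBps
print(megabytes_per_sec_to_mbps(15))   # 15 MBps -> 120 Mbps
```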

More Information on Bandwidth

Some software lets you limit the amount of bandwidth that the program is allowed to use, which
is really helpful if you still want the program to function but don't need it running at full
speed. This intentional bandwidth limitation is often called bandwidth control.
Some download managers, like Free Download Manager, for example, support bandwidth
control, as do numerous online backup services and some cloud storage services. These are all
services and programs that tend to use massive amounts of bandwidth, so it makes sense to
have options that limit their access.

As an example, say you want to download a really large 10 GB file. Instead of having it
download for hours, sucking away all the available bandwidth, you could use a download
manager and instruct the program to limit the download to use only 10% of the available
bandwidth. This would of course drastically increase the total download time, but it would also
free up a lot more bandwidth for other time-sensitive activities like live video streams.
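
As a rough illustration of how such a limit can work under the hood (this is a simplified sketch, not how any particular download manager is actually implemented), a throttled download loop can simply pause whenever it gets ahead of its allowed rate:

```python
import time
import urllib.request

def throttled_download(url: str, dest: str, limit_bytes_per_sec: int) -> None:
    """Download url to dest, sleeping as needed so the average rate stays under the limit."""
    start = time.monotonic()
    downloaded = 0
    with urllib.request.urlopen(url) as response, open(dest, "wb") as out:
        while True:
            chunk = response.read(64 * 1024)   # read in 64 KB pieces
            if not chunk:
                break
            out.write(chunk)
            downloaded += len(chunk)
            # If we're ahead of the allowed pace, sleep until we're back on schedule.
            expected_elapsed = downloaded / limit_bytes_per_sec
            actual_elapsed = time.monotonic() - start
            if expected_elapsed > actual_elapsed:
                time.sleep(expected_elapsed - actual_elapsed)

# Example: cap a download at roughly 10% of a 7.85 Mbps connection (~98 KB/s).
# throttled_download("https://example.com/big-file.bin", "big-file.bin", 98_000)
```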

Something similar to bandwidth control is bandwidth throttling, a deliberate restriction of
bandwidth that's sometimes imposed by internet service providers to either limit certain types
of traffic (like Netflix streaming or file sharing) or to limit all traffic during particular
times of day in order to reduce congestion.

Network performance is determined by more than just how much bandwidth you have available.
There are also factors like latency, jitter, and packet loss that could be contributing to less-than-
desirable performance in any given network.

There are three frequently used definitions of bandwidth in the context of Information
Technology (IT) and general business.

1) In computer networks, bandwidth is used as a synonym for data transfer rate, the
amount of data that can be carried from one point to another in a given time period
(usually a second). Network bandwidth is usually expressed in bits per second (bps);
modern networks typically have speeds measured in the millions of bits per second
(megabits per second, or Mbps) or billions of bits per second (gigabits per second,
or Gbps).

Note that bandwidth is not the only factor that affects network performance: there are
also packet loss, latency, and jitter, all of which degrade network throughput and
make a link perform like one with lower bandwidth. A network path usually consists
of a succession of links, each with its own bandwidth, so the end-to-end bandwidth
is limited to the bandwidth of the lowest speed link (the bottleneck).

Different applications require different bandwidths. An instant messaging
conversation might take less than 1,000 bits per second (bps); a voice over
IP (VoIP) conversation requires 56 kilobits per second (Kbps) to sound smooth and
clear. Standard definition video (480p) works at 1 megabit per second (Mbps), but
HD video (720p) wants around 4 Mbps, and HDX (1080p), more than 7 Mbps.
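
As a rough back-of-the-envelope check using the figures above (the household mix and the 10 Mbps plan are just assumptions for illustration), you can add up what concurrent activities need and compare it with what the connection provides:

```python
# Per-activity bandwidth needs quoted above, in Mbps.
requirements_mbps = {
    "voip_call": 0.056,   # 56 Kbps
    "sd_video": 1.0,      # 480p stream
    "hd_video": 4.0,      # 720p stream
    "hdx_video": 7.0,     # 1080p stream
}

# Hypothetical household mix: two HD streams, one SD stream, one VoIP call.
in_use = ["hd_video", "hd_video", "sd_video", "voip_call"]
needed_mbps = sum(requirements_mbps[activity] for activity in in_use)

available_mbps = 10   # assumed plan speed
print(f"Need about {needed_mbps:.1f} Mbps of the {available_mbps} Mbps available")
```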

Effective bandwidth -- the highest reliable transmission rate a path can provide -- is
measured with a bandwidth test. This rate can be determined by repeatedly
measuring the time required for a specific file to leave its point of origin and
successfully download at its destination.
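
A minimal sketch of that idea in Python (the URL is a placeholder, and a real bandwidth test repeats the measurement against servers chosen for the purpose):

```python
import time
import urllib.request

def measure_download_mbps(url: str) -> float:
    """Time the download of a test file and return the observed rate in Mbps."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as response:
        data = response.read()
    elapsed = time.monotonic() - start
    return len(data) * 8 / elapsed / 1_000_000   # bytes -> bits, then to megabits/s

# rate = measure_download_mbps("https://example.com/test-file.bin")   # placeholder URL
# print(f"Effective bandwidth: about {rate:.2f} Mbps")
```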

2) Bandwidth is the range of frequencies -- the difference between the highest-frequency
signal component and the lowest-frequency signal component -- an
electronic signal uses on a given transmission medium. Like the frequency of a
signal, bandwidth is measured in hertz (cycles per second). This is the original
meaning of bandwidth, although it is now used primarily in discussions about cellular
networks and the spectrum of frequencies that operators license from various
governments for use in mobile services.

3) In business, bandwidth is sometimes used as a synonym for capacity or ability. In
this sense, bandwidth usually refers to having time or staffing available to tackle
something, e.g. "We just don't have the bandwidth to take on mobile app
development, we're already short-staffed on developers."

THROUGHPUT

Throughput is a measure of how many units of information a system can process in
a given amount of time. It is applied broadly to systems ranging from various
aspects of computer and network systems to organizations. Related measures of
system productivity include the speed with which some specific workload can be
completed, and response time, the amount of time between a single interactive user
request and receipt of the response.

Historically, throughput has been a measure of the comparative effectiveness of
large commercial computers that run many programs concurrently. An early
throughput measure was the number of batch jobs completed in a day. More recent
measures assume either a more complicated mixture of work or focus on some
particular aspect of computer operation. Units like "trillion floating-point operations
per second (TeraFLOPs or TFLOPS)" provide a metric for comparing the cost of raw
computing over time or by manufacturer. A benchmark can be used to measure
throughput. In data transmission, network throughput is the amount of data moved
successfully from one place to another in a given time period, and is typically
measured in bits per second (bps), as in megabits per second (Mbps) or gigabits per
second (Gbps).

Likewise, in storage systems, throughput refers to either the amount of data that can
be received and written to the storage medium or read from media and returned to
the requesting system, typically measured in bytes per second (Bps). It can also
refer to the number of discrete input or output (I/O) operations responded to in a
second (IOPS).
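
As a simple sketch of how those two storage figures relate (the numbers below are made up for illustration):

```python
# Hypothetical measurement window for a storage device.
bytes_transferred = 512 * 1024 * 1024   # 512 MiB read during the window
io_operations = 40_000                  # discrete read/write operations completed
elapsed_seconds = 10

throughput_mb_per_sec = bytes_transferred / elapsed_seconds / 1_000_000
iops = io_operations / elapsed_seconds

print(f"Throughput: {throughput_mb_per_sec:.1f} MB/s")   # about 53.7 MB/s
print(f"IOPS: {iops:.0f}")                                # 4,000 operations per second
```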

Throughput applies at higher levels of the IT infrastructure as well. Databases or
other middleware can be discussed in terms of "transactions per second" (TPS);
Web servers can be discussed in terms of page-views per minute.

Throughput also applies to the people and organizations using these systems:
Independent of the TPS rating of its help desk software, for example, a help
desk has its own throughput rate that includes the time staff spend on developing
responses to requests.
