
DISCUSSION

 
Science and technology have seen many developments during this century, and they continue to advance. The following are some of the remarkable inventions that have made an impact on human beings.
 
The Airplane
An airplane or aeroplane was invented by the Wright brothers, Wilbur and Orville. It is a
powered, fixed-wing aircraft that is propelled forward by thrust from a jet engine or propeller.
Their work led them to make the first controlled, sustained, powered flight on December 17,
1903, in Kitty Hawk, North Carolina. On January 1, 1914, the St. Petersburg-Tampa Airboat Line
became the world's first scheduled passenger airline service, operating between St. Petersburg and
Tampa, Florida. It was a short-lived undertaking, but it paved the way for today's daily
transcontinental flights.
 
The extensive uses of airplanes include recreation, transportation of goods and people,
military, and research. Commercial aviation is a massive industry involving the flying of tens of
thousands of passengers daily on airliners. Most airplanes are flown by a pilot on board the
aircraft, but some are designed to be remotely or computer-controlled.
 
Airplanes had a presence in all the major battles of World War II. The first jet aircraft
was the German Heinkel He 178 in 1939. The first jet airliner, the de Havilland Comet, was
introduced in 1952. The Boeing 707, the first widely successful commercial jet, was in
commercial service for more than 50 years, from 1958 to at least 2013.
 
Computers
A computer is an electronic machine that accepts information, stores it, processes it
according to the instructions provided by a user and then returns the result. Today, computers
have become part of our everyday activities. While computers as we know them today are
relatively recent, the concepts and ideas behind computers have quite a bit of history.
 
Charles Babbage, referred to as "the father of computers," conceived an analytical engine in
the 1830s which could be programmed with punched cards to carry out calculations. It was different
from its predecessors because it was able to make decisions based on its own computations,
using sequential control, branching and looping. Konrad Zuse built some of the earliest
programmable computers in Germany in the period 1935 to 1941. The Z3 was the first working,
programmable and fully automatic digital computer. Zuse is often regarded as the "inventor of the computer."
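
To make the idea of a program concrete, the short sketch below (a modern illustration, not anything Babbage or Zuse wrote) shows the same three ideas in Python: statements run in sequence, a branch chooses between alternatives based on a computed value, and a loop repeats steps.

# A modern illustration of sequential control, branching and looping.
# Counts how many numbers in a list are even and how many are odd.
def classify_numbers(values):
    even_count = 0
    odd_count = 0
    for value in values:            # looping: repeat the steps for each value
        if value % 2 == 0:          # branching: decide based on a computed result
            even_count += 1
        else:
            odd_count += 1
    return even_count, odd_count    # sequential control: statements run in order

print(classify_numbers([3, 8, 15, 22]))   # prints (2, 2)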
 
The British built the Colossus and the Americans built the Electronic Numerical
Integrator Analyzer and Computer, or ENIAC, between 1943 and 1945. Both Colossus and
ENIAC relied heavily on vacuum tubes, which can act as an electronic switch that can be turned
on or off much faster than mechanical switches. Computer systems using vacuum tubes are
considered the first generation of computers.
 
The first semiconductor transistor was invented in 1926, but only in 1947 was it
developed into a solid-state, reliable transistor for use in computers. Similar to a vacuum
tube, a transistor controls the flow of electricity, but it was only a few millimeters in size and
generated little heat. Computer systems using transistors are considered the second generation of
computers.
 
In 1954, IBM introduced the first mass-produced computer. By 1958 it became possible
to combine several components, including transistors, and the circuitry connecting them on a
single piece of silicon. This was the first integrated circuit. Computer systems using integrated
circuits are considered the third generation of computers. Integrated circuits led to the computer
processors we use today. Computers quickly became more powerful. By 1970 it became possible
to squeeze all the integrated circuits that are part of a single computer on a single chip called a
microprocessor. Computer systems using microprocessors are considered the fourth generation
of computers.
 
In the early 1970s computers were still mostly used by larger corporations, government
agencies and universities. The first device that could be called a personal computer was
introduced in 1975.
 
The following are some highlights in the development of computers:

• Steve Jobs and Steve Wozniak introduce Apple Computer on April Fool's Day and roll out the Apple I, the first computer with a single circuit board, in 1976.
• The first IBM personal computer, code-named "Acorn," is introduced in 1981. It uses Microsoft's MS-DOS operating system and has an Intel chip, two floppy disks and an optional color monitor.
• The first dot-com domain name is registered on March 15, 1985, years before the World Wide Web would mark the formal beginning of Internet history.
• Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva, develops HyperText Markup Language (HTML), giving rise to the World Wide Web (WWW), in 1990.
• The Pentium microprocessor advances the use of graphics and music on PCs in 1993.
• PCs become gaming machines as "Command & Conquer," "Alone in the Dark 2," "Theme Park," "Magic Carpet," "Descent" and "Little Big Adventure" are among the games to hit the market in 1994.
• The term Wi-Fi becomes part of the computing language and users begin connecting to the Internet without wires in 1999.
• Apple unveils the Mac OS X operating system, which provides protected memory architecture and pre-emptive multi-tasking, among other benefits, in 2001.
• Mozilla's Firefox 1.0 challenges Microsoft's Internet Explorer, the dominant web browser, and Facebook, a social networking site, launches in 2004.
• YouTube, a video-sharing service, is founded, and Google acquires Android, a Linux-based mobile phone operating system, in 2005.
• Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile computer, as well as an Intel-based iMac, and Nintendo's Wii game console hits the market in 2006.
• The iPhone brings many computer functions to the smartphone in 2007.
• Google releases the Chromebook, a laptop that runs the Google Chrome OS, in 2011.
• Facebook gains 1 billion users on October 4, 2012.
• The first reprogrammable quantum computer is created in 2016.
• The Defense Advanced Research Projects Agency (DARPA) begins developing a "Molecular Informatics" program that uses molecules as computers in 2017.
 
Magnetic Resonance Imaging
Magnetic resonance imaging (MRI) is a non-invasive medical test that physicians use to
diagnose medical conditions. Magnetic resonance imaging (MRI) of the body uses a powerful
magnetic field, radio waves or pulses, and a computer to produce detailed pictures of the inside of
your body, such as organs, soft tissues, bone and virtually all other internal body structures. It may
be used to help diagnose the presence of certain diseases and abnormalities or monitor treatment
for a variety of conditions within the body.
 
Physicians use an MR examination to help diagnose or monitor treatment for conditions
such as: tumors of the chest, abdomen or pelvis; diseases of the liver, such as cirrhosis, and
abnormalities of the bile ducts and pancreas; inflammatory bowel disease such as Crohn’s
disease and ulcerative colitis; heart problems such as congenital heart disease, malformations of
the blood vessels and inflammation of the vessels (vasculitis); a fetus in the womb of a pregnant
woman.
 
The Internet
The Internet was the work of dozens of pioneering scientists, programmers and engineers
who each developed new features and technologies that eventually merged to become the
“information superhighway” we know today.
 
It started in the early 1900s, when Nikola Tesla toyed with the idea of a "world wireless
system." Paul Otlet and Vannevar Bush conceived of mechanized, searchable storage systems of
books and media in the 1930s and 1940s. J.C.R. Licklider popularized the idea of an
"Intergalactic Network" of computers. These groundbreaking ideas landed him a position in the
early 1960s as director at the U.S. Department of Defense Advanced Research Projects Agency
(ARPA), the government agency responsible for creating a time-sharing network of computers
known as ARPANET, the precursor to today's Internet. Leonard Kleinrock invented packet
switching, a method for effectively transmitting electronic data that would later become one of
the major building blocks of the Internet. ARPANET used packet switching to allow multiple
computers to communicate on a single network. In the 1970s, Robert Kahn and Vinton Cerf developed
Transmission Control Protocol and Internet Protocol, or TCP/IP, a communications model that set
standards for how data could be transmitted between multiple networks. In 1972, Ray Tomlinson
introduced network email. ARPANET adopted TCP/IP on January 1, 1983, and from there
researchers began to assemble the "network of networks" that became the modern Internet. Tim
Berners-Lee invented the World Wide Web in 1990. The web became the most common means of
accessing data online in the form of websites and hyperlinks. The web served as a crucial step in
developing the vast trove of information that most of us now access on a daily basis.
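
As a rough modern illustration (not part of the original ARPANET work), the short Python sketch below shows how an application today hands data to TCP/IP through a socket; the protocol stack splits the bytes into packets, routes them across networks, and reassembles them in order at the destination. The host name and port are assumptions made only for this example.

import socket

HOST = "example.com"   # assumed public web server, used purely for illustration
PORT = 80              # standard HTTP port

# Open a TCP connection; TCP/IP handles packetizing, routing and reassembly.
with socket.create_connection((HOST, PORT), timeout=5) as connection:
    connection.sendall(b"HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    reply = connection.recv(1024)          # read the first chunk of the response
    print(reply.decode(errors="replace"))  # typically begins with an HTTP status line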
 
During the 1980s, the National Science Foundation started to build a nationwide computer
network that included its own supercomputers, called NSFNET. ARPANET had grown well
beyond the needs of the Department of Defense, and so the NSF took control of the "civilian
nodes." In 1990, ARPANET was officially decommissioned. Ultimately, the NSF aimed to build
a network that was independent of government funding. The NSF lifted all restrictions on
commercial use of its network in 1991, and in 1995 the Internet was officially privatized. At the
time, the Internet was 50,000 networks strong, spanned seven continents, and reached into space.
 
Optical Fiber
In 1880 Alexander Graham Bell created a very early precursor to fiber-optic
communications, the world’s first wireless telephone (Photophone). Bell considered it his most
important invention. The device allowed for the transmission of sound on a beam of light. Due to
its use of an atmospheric transmission medium, the Photophone would not prove practical until
advances in laser and optical fiber technologies permitted the secure
transport of light. The Photophone’s first practical use came in military communication systems
many decades later.
 
In 1952, UK-based physicist Narinder Singh Kapany invented the first actual fiber
optical cable, based on John Tyndall's experiments decades earlier. Jun-ichi Nishizawa, a
Japanese scientist, proposed the use of optical fibers for communications in 1963. Optical fiber
was successfully developed in 1970 by Corning Glass Works (Robert Maurer, Donald Keck,
Peter Schultz, and Frank Zimar), with attenuation low enough for communication purposes;
at the same time, GaAs semiconductor lasers were developed that were compact and therefore
suitable for transmitting light through fiber optic cables over long distances. By
the early 1990s, as the Internet was becoming popularized in the public realm, fiber optic cables
started to be laid around the world with a major push to wire the world in order to provide
communication infrastructure.
 
Fiber-optic cabling is preferred over electrical cabling when high bandwidth, long distance, or
immunity to electromagnetic interference is required. Due to much lower attenuation and
interference, optical fiber has large advantages over existing copper wire in long-distance, high-
demand applications.
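
To give a sense of what "low attenuation" means in practice, the small sketch below works out how much optical power survives a long run; the 0.2 dB/km figure is an assumed, typical value for modern single-mode fiber, not one taken from the text.

# Attenuation in decibels accumulates linearly with distance; converting the
# total loss back to a power ratio shows what fraction of the light remains.
def remaining_power_fraction(attenuation_db_per_km, distance_km):
    total_loss_db = attenuation_db_per_km * distance_km
    return 10 ** (-total_loss_db / 10)

# Assuming roughly 0.2 dB/km (a typical value for modern single-mode fiber),
# about 1% of the launched power is left after 100 km:
print(remaining_power_fraction(0.2, 100))   # ~0.01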
 
Optical fiber is used by many telecommunications companies to transmit telephone
signals, Internet communication, and cable television signals. The prices of fiber-optic
communications have dropped considerably since 2000. Today, fiber is present in virtually every
nation on Earth, forming the backbone of modern communications infrastructure.
 
 
Air Conditioning System
Primitive air-conditioning systems have existed since ancient times. Attempts to control
indoor temperatures began in ancient Rome, where wealthy citizens took advantage of the
remarkable aqueduct system to circulate cool water through the walls of their homes. The
emperor Elagabalus, in the third century, built a mountain of snow, imported from the mountains
via donkey trains, and put it in the garden next to his villa to keep cool during the summer, but
this was costly and inefficient. Such luxuries disappeared during the Dark Ages, and large-
scale air-conditioning efforts didn't resurface in the West for centuries.
 
In the intervening centuries, fans were the coolant of choice. Hand fans were used in
China as early as 3,000 years ago, and a second-century Chinese inventor has been credited with
building the first room-sized rotary fan. Architecture also played a major role in pre-modern
temperature control. In traditional Middle Eastern construction, windows faced away from the
sun, and larger buildings featured "wind towers" designed to catch and circulate the prevailing
breezes.
 
In the late 19th century, American engineers picked up where the Romans had left off. In 1881,
a dying President James Garfield got a respite from Washington, D.C.'s oppressive summer
swelter, thanks to an awkward device involving air blown through cotton sheets doused in ice
water.
 
Nikola Tesla's development of alternating current motors made possible the invention of
electric oscillating fans in the early 20th century. And in 1902, a 25-year-old engineer
from New York named Willis Carrier invented the first modern air-conditioning system. The
mechanical unit, which sent air through water-cooled coils, was not aimed at human comfort,
however; it was designed to control humidity in the printing plant where he worked. In 1922, he
followed up with the invention of the centrifugal chiller, which added a central compressor to
reduce the unit’s size. For years afterward, people piled into air-conditioned movie theaters on
hot summer days, giving rise to the summer blockbuster.
 
Carrier’s innovation shaped 20th-century America. In the 1930s, air conditioning spread
to department stores, rail cars, and offices, sending workers’ summer productivity soaring. As
late as 1965, just 10 percent of U.S. homes had it, according to the Carrier Corporation. By 2007,
air conditioning had spread across most of the country, and today many Americans turn to their
air conditioners during every heat wave, even though these artificial breezes are a relatively recent innovation.
 
Gene Therapy
Gene therapy attempts to treat genetic diseases at the molecular level by correcting what
is wrong with defective genes. The first gene therapy was approved in the European Union in
2012, after two decades of dashed expectations. This approval boosted the investment in
developing gene therapies.
 
Gene therapy is successful if it works by preventing a protein from doing
something that causes harm, restoring the normal function of a protein, giving proteins a new
function, or enhancing the existing functions of proteins.
 
Gene therapy relies on finding a dependable delivery system to carry the correct gene to
the affected cells. The gene must be delivered inside the target cells and work properly without
causing adverse effects. Delivering genes that will work correctly for the long term is the greatest
challenge of gene therapy. Viruses are often used by researchers to deliver the correct gene to
cells. In gene therapy, the DNA for the desired gene is inserted into the genetic material of the
virus, which then delivers that new genetic material, containing the desired DNA, to the target
cells. Fatty molecules known as liposomes may also be used, as can micropipettes, sometimes
called "gene guns," to insert genes into cells physically.
 
ADA: The First Gene Therapy Trial. A four-year-old girl became the first gene therapy patient
on September 14, 1990, at the NIH Clinical Center. She had adenosine deaminase (ADA)
deficiency, a genetic disease which left her defenseless against infections. White blood cells
were taken from her, and the normal genes for making adenosine deaminase were inserted into
them. The corrected cells were reinjected into her. Dr. W. French Anderson helped develop this
landmark clinical trial when he worked at the National Heart, Lung, and Blood Institute.
 
3D Metal Printing
3D metal printing is one of the technological advances that provides on-demand metal
fabrication. This innovation makes it possible to create large, intricate metal structures on
demand and could therefore revolutionize manufacturing. It gives manufacturers the ability
to make a single part, or a small number of metal parts, much more cheaply than existing mass-
production techniques.
 
Artificial Embryos
Artificial embryos are made from stem cells alone, without using egg or sperm cells. It is
a breakthrough that will open new possibilities for understanding how life comes into existence
— but clearly also raises vital ethical and even philosophical problems. Embryologists working
at the University of Cambridge in the UK have grown realistic-looking mouse embryos using
only stem cells. No egg. No sperm. Just cells plucked from another embryo. The researchers
placed the cells carefully in a three-dimensional scaffold and watched, fascinated, as they started
communicating and lining up into the distinctive bullet shape of a mouse embryo several days
old. Synthetic human embryos would be a boon to scientists, letting them tease apart events early
in development. And since such embryos start with easily manipulated stem cells, labs will be
able to employ a full range of tools, such as gene editing, to investigate them as they grow.
 
 
Cell-free Fetal DNA Testing
Pregnant women sometimes need to have cells of their fetus tested for chromosomal
defects such as Edwards syndrome and Down syndrome. These tests traditionally required
acquiring fetal cells through procedures that are quite invasive for the unborn baby, bringing a
risk of miscarriage and increased stress for pregnant mothers. With medical advances, it is now
possible for doctors to test cell-free fetal DNA by using the mother's blood. This advance has
become more widely used and accepted internationally in recent years.
 
Cancer Nanotherapy
Nano devices and technology are already in wide use, and as the years pass, the
technology in pharmaceuticals and medicine will only continue to improve. One example is an
emerging cancer treatment technology that uses nanomaterials in a more aggressive way. For
example, researchers at Israel's Bar-Ilan University have developed nanobots to
target and deliver drugs to defective cells, while leaving healthy ones unharmed.
 
The 25-35 nm devices are made from single strands of DNA folded into a desired shape:
for instance, a clamshell-shaped package that protects a drug while en route to the desired site
but opens up to release it upon arrival.
 
As the years pass, technology in pharmaceuticals and medicine will continue to improve.
People are living longer and fewer diseases are deemed incurable. Jobs in the pharmaceutical
industry are in higher demand now than ever.
