
UNIT NINE

FUTURE OF COMPUTING (FINAL TOPIC)


WEEK 12

As we have seen, from the humble beginnings of the first generation of computers to their
current capabilities, development has been a crucial part of creating an ever-evolving
environment. This journey of growth shows no signs of stopping as we continuously
push forward with breakthroughs and technological advances. In time, our machines will move
past being mere tools and evolve into something that can help humans make decisions, thereby
helping across many spheres of life, including medicine, education, and business.

One can boldly say that, with continuous innovation through research and science, we are all
(you and I included) eager to see what comes next. The question, therefore, is: what does the
future hold for us in terms of computing?

In answering that question, we may be surprised by how much will have changed from today.
There may be no physical machines as we know them; instead, advanced computing systems
might fit within our bodies or on a piece of jewelry. Humans will continue to control these
technologies, though, and that is something unlikely to change in what lies ahead.

The future is here. Robotic systems are rapidly evolving, and computers have become an integral
part of our lives, providing us with a more luxurious lifestyle. Breakthroughs in many domains
such as science, technology, and biotechnology continue to revolutionize the capabilities of
computer-based technologies, making futuristic dreams a reality.

Imagine a future where your home is brimming with possibilities thanks to the Fifth Generation
of Computers. We are already seeing glimpses of these devices in action today, providing us
with sleek and integrated systems that allow for unprecedented control over our electronic
equipment. As technology continues to evolve, humanity can unlock countless potential
opportunities through this remarkable generation of computers.

To further the discussion in our topic context, let us look at some of the future computers that
are already with us or are likely to come to light soon:

1. Quantum Computing

Quantum computing will revolutionize computers in a truly remarkable way. It applies
principles of quantum mechanics and measures data not in bits but in qubits. Quantum computing
will redefine computers and offer them an enormous scope.

Imagine a scientist experimenting virtually instead of using any solid physical materials, or an
engineer designing a car model without actually using any tools. A computer powered by
quantum computing technology will use subatomic particles such as photons and electrons to
carry out complex processing. Because data can exist in multiple states at once, a quantum
computer can, in effect, explore many versions of a computation simultaneously. Quantum
computing is the future of computers and will offer vast possibilities for all.

Further, quantum computing has the potential to revolutionize computing by leveraging the
principles of quantum mechanics. It promises to solve complex problems much faster than
classical computers. While practical, scalable quantum computers are still in the early stages of
development, they could have a significant impact on fields like cryptography, optimization, and
simulation.

What is a quantum?
A quantum (plural: quanta) is the smallest discrete unit of a phenomenon. For example,
a quantum of light is a photon, and a quantum of electricity is
an electron. Quantum comes from Latin, meaning "an amount" or "how much?" If
something is quantifiable, then it can be measured.

What is quantum in physics?
The modern use of quantum in physics was coined by Max Planck in 1901. He was
trying to explain black-body radiation and how objects changed color after being heated.
Instead of assuming that the energy was emitted in a constant wave, he proposed that the
energy was emitted in discrete packets, or bundles. These were termed quanta of energy;
each quantum carries energy E = hf, where f is the radiation's frequency and h is Planck's constant.
What is quantum in computing?
Quantum computing uses the nature of subatomic particles to perform calculations
instead of using electrical signals as in classical computing. Quantum computers
use qubits (quantum bits) instead of binary bits. By programming the initial conditions of
the qubit, quantum computing can solve a problem when the superposition state
collapses. The forefront of quantum computer research is in linking greater numbers of
qubits together to solve larger and more complex problems.
Quantum computers can perform certain calculations much faster than classical
computers. To find an answer to some problems, a classical computer needs to go through
each option one at a time, which can take a very long time. A quantum computer does not
need to try each option in sequence; by exploiting superposition and interference, it can
arrive at answers to certain problems dramatically faster.
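
To make the idea of superposition slightly more concrete, here is a minimal sketch in Python (using the NumPy library) that simulates a single qubit as a two-element state vector and applies a Hadamard gate to place it in an equal superposition of 0 and 1. This is only a classical simulation for illustration, not a real quantum computation.

# A minimal sketch: simulate one qubit as a 2-element state vector.
import numpy as np

# Represent |0> as the column vector [1, 0]
qubit = np.array([1.0, 0.0])

# The Hadamard gate sends |0> to an equal superposition of |0> and |1>
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

state = H @ qubit

# Measurement probabilities are the squared amplitudes of the state
probabilities = np.abs(state) ** 2
print(probabilities)   # [0.5 0.5]: equal chance of reading 0 or 1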

2. Artificial Intelligence (AI)

Artificial Intelligence, or AI for short, is already transforming the way computers think and act.
But its potential goes far beyond what we have seen so far: this rapidly evolving technology
promises to take computer intelligence to new heights in the years ahead. From hospitals relying
on automated diagnoses, to understanding customer preferences more quickly than ever before,
to faster manufacturing processes that boost productivity, Artificial Intelligence has
transformative possibilities across a range of industries, including healthcare services, education,
and even farming.

We are on the brink of a new era, one where robots will not only clean our cars, serve us food,
and make our homes more secure, but also completely revolutionise society through Artificial
Intelligence-enabled computers. Automation is just scratching the surface. Soon we will be able
to shop with ease, pay quickly without tedious manual entries, and manufacture products faster
than ever before. AI technology stands at the heart of it all. With its help, we can unlock
unprecedented levels of efficiency in almost every aspect of what life has to offer.

What Is Artificial Intelligence?


Artificial intelligence (AI) is a wide-ranging branch of computer science concerned with building
smart machines capable of performing tasks that typically require human intelligence. While AI
is an interdisciplinary science with multiple approaches, advancements in machine
learning and deep learning, in particular, are creating a paradigm shift in virtually every sector of
the tech industry.
Artificial intelligence allows machines to model, or even improve upon, the capabilities of the
human mind. And from the development of self-driving cars to the proliferation of generative
AI tools like ChatGPT and Google’s Bard, AI is increasingly becoming part of everyday life —
and an area companies across every industry are investing in.
Some areas of application of AI
Natural language processing, computer vision, machine learning, robotics, agriculture,
education, health, and business.
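
One of these areas, machine learning, can be given a small concrete flavor. Below is a minimal sketch, assuming the scikit-learn library is installed: it trains a simple decision-tree classifier on the library's bundled iris flower dataset, a toy example of the pattern learning that underlies larger AI systems.

# A minimal machine-learning sketch, assuming scikit-learn is installed.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)          # measurements and species labels

# Hold back a quarter of the data to test on examples the model never saw
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = DecisionTreeClassifier(random_state=0)
model.fit(X_train, y_train)                # learn patterns from examples
print(model.score(X_test, y_test))         # fraction classified correctly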

3. Internet of Things

Imagine a future where you can command your appliances from anywhere in the world, with no
need to be at home. This radical notion is made possible by Internet of Things (IoT) technology,
which makes devices smarter and more connected than ever before. With IoT, computers can
communicate autonomously with each other as never before.

Imagine living in a world where cars start on their own, ovens know when to turn off and
bicycles greet you with an automated “hello” when they sense your watch. The future of
computers has much more potential than we ever imagined.

This technology will make homes smarter, cities better managed, schools safer and hospitals run
like well-oiled machines. With this mind-boggling connectivity between devices powered by
computer software, it’s easy to see just how efficient our day-to-day lives can become due to the
power of ‘smart’ communication.

The internet of things, or IoT, is a network of interrelated devices that connect and exchange
data with other IoT devices and the cloud.
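
To give a flavor of how such devices talk to each other, here is a sketch using MQTT, a lightweight publish/subscribe messaging protocol widely used in IoT, via the paho-mqtt Python library. The broker address and topic name below are hypothetical placeholders, not real endpoints.

# A sketch of an IoT sensor reporting a temperature reading over MQTT.
# Assumes the paho-mqtt library is installed; the broker address and
# topic below are hypothetical placeholders.
import paho.mqtt.publish as publish

# Publish one reading; any device subscribed to this topic receives it
publish.single(
    "home/livingroom/temperature",    # hypothetical topic name
    payload="21.5",                   # degrees Celsius, as a string
    hostname="broker.example.com",    # hypothetical MQTT broker
)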

4. Edge Computing
As the number of connected devices continues to grow, there is an increasing need to process
data closer to the source. Edge computing involves processing data at or near the device instead
of relying on a centralized cloud server. This approach reduces latency, enhances privacy, and
can lead to more efficient use of bandwidth. (Network latency is the delay in network
communication: the time that data takes to transfer across the network. Networks with a longer
delay or lag have high latency, while those with fast response times have low latency. Bandwidth
refers to the capacity at which a network can transmit data; for example, if a network's bandwidth
is 40 Mbps, it cannot transmit data faster than 40 Mbps.)
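
To make this concrete, here is a small sketch of the edge-computing idea in Python: summarize sensor readings locally and send only a compact summary upstream, rather than streaming every raw reading to the cloud. The read_sensor_batch and send_to_cloud functions are hypothetical placeholders for real device and network calls.

# A sketch of the edge-computing idea: process sensor data locally and
# send only a compact summary upstream. read_sensor_batch() and
# send_to_cloud() are hypothetical placeholders.
import statistics

def read_sensor_batch():
    # Placeholder: a real device would sample its hardware here
    return [21.4, 21.5, 21.7, 21.6, 21.5]

def send_to_cloud(summary):
    # Placeholder: a real system would make a network call here
    print("uploading:", summary)

readings = read_sensor_batch()
summary = {
    "mean": statistics.mean(readings),   # one number instead of many
    "min": min(readings),
    "max": max(readings),
}
send_to_cloud(summary)   # far less bandwidth than sending every raw reading
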
5. 5G Technology

The rollout of 5G networks will significantly impact computing by providing faster and more
reliable wireless connectivity. This will enable new applications, such as augmented reality
(AR), virtual reality (VR), and Internet of Things (IoT) devices that require high-speed,
low-latency connections.
6. Biocomputing and DNA Data Storage
Researchers are exploring biological systems and DNA for computing purposes. DNA storage, in
particular, could offer a highly dense and durable means of storing vast amounts of data.
Biocomputing involves using biological components to perform computations, presenting an
alternative approach to traditional silicon-based computing.
What Is Biocomputing?
Biocomputing — a cutting-edge field of technology — operates at the intersection of biology,
engineering, and computer science. It seeks to use cells or their sub-component molecules (such
as DNA or RNA) to perform functions traditionally performed by an electronic computer.
The ultimate goal of biocomputing is to mimic some of the biological ‘hardware’ of bodies like
ours, and to use it for our computing needs. From less to more complex, this could include:
1. Using DNA or RNA as a medium of information storage and data processing
2. Connecting neurons to one another, similar to how they are connected in our brains
3. Designing computational hardware from the genome level up
Cells Already Compute
Cells are remarkably powerful information processors; in some respects they outperform our
best computers. For example, cells:
1. Store data in DNA
2. Receive chemical inputs via RNA (data input)
3. Perform complex logic operations using ribosomes
4. Produce outputs by synthesizing proteins
Biocomputing’s engineering challenge is to gain a granular level of control of the reactions
between organic compounds like DNA or RNA.
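
The idea of DNA as an information-storage medium, mentioned above, is often illustrated with a toy encoding: each pair of bits maps to one of the four DNA bases, so one byte becomes four bases. The Python sketch below shows this idea; real DNA storage schemes are far more elaborate, with error correction and synthesis constraints.

# A toy illustration of DNA data storage: map each pair of bits to one
# of the four bases, so a byte becomes four bases. This is a classroom
# sketch, not a production encoding scheme.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Turn bytes into a string of DNA bases."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(dna: str) -> bytes:
    """Recover the original bytes from a string of bases."""
    bits = "".join(BASE_TO_BITS[base] for base in dna)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"Hi")   # 2 bytes become 8 bases: "CAGACGGC"
print(strand)
print(decode(strand))    # b'Hi'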

7. Bio-Computers

Make way for the newest medical marvel: Bio-Computers. Imagine taking a computer not just
as small as an aspirin, but actually swallowing it like one. Or getting a chip implanted in your
hand to constantly monitor any unexpected changes in your DNA. Believe it or not, this is
no longer science fiction: these new technologies are closer than ever and will revolutionise
healthcare by providing cutting-edge solutions for the biotechnology field.

Imagine a computer far smaller than anything that exists today, one that not only offers
tremendous processing power but can even learn by itself. This will be possible with the
bio-computers of the future, which use biological and organic elements to run processes and
store data. With such technology available in the near future, there are endless possibilities for
how we could use this new form of computing, from detecting abnormalities or badly structured
DNA to delivering large benefits both economically and socially.

Conclusion
In conclusion, class, we can boldly say that, from the humble beginnings of computers to their
current capabilities, development has been a crucial part of creating an ever-evolving
environment. This journey of growth shows no signs of stopping as we continuously
push forward with breakthroughs and technological advances. In time, our beloved machines will
move past being mere tools and evolve into something that can make decisions for us, helping
across many spheres, including medicine, education, and business. With continuous innovation
through research and science, we are all eager to see what comes next.

Unit Ten
Microsoft Office
Microsoft Word
Microsoft Word (MS Word) is the most popular word-processing software in use today. It is a
word-processing program you can use to write letters, resumes, reports, memos, circulars, and
more. Anything you can create with a typewriter, you can create with Word. You can make your
documents more appealing and easier to read by applying formatting to the text. One major
benefit of MS Word is that it helps users type and produce documents faster and more
accurately.

MS Word has been released in various versions, ranging from older ones such as 2003 and 2007
to more recent ones such as 2016 and 2019. Note that these versions, too, will become dated in a
few years' time, depending on future business needs.

To close our lecture, students, we will look at some shortcuts frequently used in MS Word.

Keyboard Shortcuts

Action                      Keystrokes
Create a new document       CTRL+N
Save a document             CTRL+S
Open an existing document   CTRL+O
Print a document            CTRL+P
Close a document            CTRL+W
Select an entire document   CTRL+A
Highlight a word            Double-click within the word
Highlight a paragraph       Triple-click within the paragraph
Highlight a sentence        CTRL+click anywhere within the sentence
Copy text                   CTRL+C
Cut text                    CTRL+X
Paste text                  CTRL+V
Undo                        CTRL+Z
Redo                        CTRL+Y
Bold                        CTRL+B
Italics                     CTRL+I
Underline                   CTRL+U
Align left                  CTRL+L
Center                      CTRL+E
Align right                 CTRL+R
Justify                     CTRL+J
