LITE MATTERS 2021
SECOND EDITION
TOPICS
1. Recent and Emerging Technologies
2. Grand Challenges of ICT
LESSON 5.1
RECENT AND EMERGING
TECHNOLOGIES
INTRODUCTION
Today, technology is advancing at a rate the world has never seen before. New innovations emerge every day: ideas are patented, and creations by brilliant minds are shown to the world, many of which used to be possible only in the imagination. With technology serving as the skeleton of nearly every system built nowadays, it is only right for inventors to look ahead and embrace this marvel. Along with it come trends that people must embrace in order to fully utilize the ever-growing state of technology. Let us look at some examples of recent and emerging technologies and see how they affect our way of living.
“IoT is the future towards artificial Intelligence and complete automation of the physically
existing world.”
In Chapter 1, we use the term Internet of Things (IoT) to refer to devices – sensors
(devices that measure or sense their surroundings) and actuators (devices that change or
control their surroundings) – that can be accessed and controlled over the Internet. Now we
will delve deeper into what IoT really is.
Figure 1 -- Illustration of a home using a centralized wireless router to connect a set of wireless IoT devices.
Figure 2 -- Illustration of a large home using a repeater plus a centralized wireless router to span a large
distance.
Figure 3 -- Illustration of a mesh network in which each IoT device agrees to forward packets between the wireless
router and other devices.
Short History
The term “Internet of Things” (IoT) was first used in 1999 by British technology pioneer
Kevin Ashton to describe a system in which objects in the physical world could be connected
to the Internet by sensors. Ashton coined the term to illustrate the power of connecting Radio-
Frequency Identification (RFID) tags used in corporate supply chains to the Internet in order
to count and track goods without the need for human intervention. Today, the Internet of
Things has become a popular term for describing scenarios in which Internet connectivity and
computing capability extend to a variety of objects, devices, sensors, and everyday items.
IoT Disadvantages
a. Compatibility – As of now, there is no standard for tagging and monitoring with
sensors. A uniform standard, like USB or Bluetooth, is required, which should not be
that difficult to achieve.
b. Privacy/Security – Privacy is a big issue with IoT. All the data must be encrypted so
that data about your financial status or how much milk you consume is not common
knowledge at the workplace or among your friends.
c. Safety – There is a chance that the software can be hacked and your personal
information misused. The possibilities are endless: your prescription being changed
or your account details being hacked could put you at risk. Hence, all the safety risks
become the consumer’s responsibility.
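The encryption point above can be illustrated with a minimal sketch. This is not how production IoT stacks protect traffic (they use TLS or AES); it is a one-time-pad toy with an invented sensor reading, meant only to show that an intercepted ciphertext reveals nothing without the key:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR the payload with an equal-length, single-use random key
    (a one-time pad). Applying the same key twice restores the data."""
    return bytes(d ^ k for d, k in zip(data, key))

# A hypothetical reading an IoT device might transmit.
reading = b"milk_consumed_litres=1.5"
key = secrets.token_bytes(len(reading))  # shared secret, used only once

ciphertext = xor_cipher(reading, key)
recovered = xor_cipher(ciphertext, key)
print(recovered == reading)  # True: only a key holder can read the data
```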
IoT Applications
a. Human – Devices (wearables and ingestibles) to monitor and maintain human health
and wellness; disease management, increased fitness, higher productivity.
b. Factories – Places with repetitive work routines, including hospitals and farms;
operating efficiencies, optimizing equipment use and inventory.
c. Vehicles – Vehicles including cars, trucks, ships, aircraft, and trains; condition-based
maintenance, usage-based design, presales analytics.
a. Smart Cities and Homes
These cities will be highly sensitive to even minimal data changes. They will generate
alarms and warnings upon the slightest detection of problems.
With the help of mobile apps, we can connect the small IoT-based programs of smart
home projects to the owner’s mobile phone. This not only makes tasks more
comfortable but, with the help of advanced development techniques, these systems
can become controllable globally by entering just a few details on your mobile phone.
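The idea of controlling home devices from a phone app can be sketched as a toy command dispatcher. The device names and the two-word command format are invented for illustration; a real deployment would carry these commands over a protocol such as MQTT or HTTP to actual hardware:

```python
class SmartHome:
    """A toy registry of switchable devices, keyed by name."""

    def __init__(self):
        self.devices = {"living_room_light": False, "thermostat": False}

    def handle(self, command: str) -> str:
        # Commands look like "on living_room_light" or "off thermostat".
        action, _, device = command.partition(" ")
        if device not in self.devices:
            return f"unknown device: {device}"
        self.devices[device] = (action == "on")
        state = "on" if self.devices[device] else "off"
        return f"{device} is now {state}"

home = SmartHome()
print(home.handle("on living_room_light"))  # living_room_light is now on
print(home.handle("off thermostat"))        # thermostat is now off
```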
b. Better Healthcare
With the availability of an extensive range of data, healthcare outcomes will improve.
Recent workshops have revealed that companies are already working on IoT
technologies that identify and collect data on cancer (image readings and changes in
body parameters at different stages of the disease) to develop a technology that will
detect the development of cancerous cells in the body at very early stages. This will
not only help with new medication but will also help in the prevention of disease and
ensure a longer life for the patient. Many more such technologies will emerge and
thrive in the medical market in the coming years.
“Our technology has kept improving and becomes more innovative as time passes by.
RPA will grow more advanced as A.I. and Machine Learning become more powerful,
making our lives easier and more efficient.”
The Institute for Robotic Process Automation and Artificial Intelligence (IRPAAI)
defined robotic process automation (RPA) as “the application of technology that allows
employees in a company to configure computer software or a ‘robot’ to capture and interpret
existing applications for processing a transaction, manipulating data, triggering responses and
communicating with other digital systems” (Institute for Robotic Process Automation & Artificial
Intelligence, 2018, para. 3).
It is a technology used in software tools that partially or fully automate human
activities that are manual, rule-based, and repetitive. It works by replicating the actions of an
actual human interacting with one or more software applications to perform tasks such as
entering data, processing standard transactions, or responding to simple customer service
queries.
All of this said, in practice there are severe limitations on what a robotic process
automation tool can do. It must be scripted or programmed to perform a repetitive task, and
to do that, a subject-matter expert who really understands how the work is done manually
must be employed to map out those steps.
In addition, the data sources and destinations need to be highly structured and
unchanging – robotic process automation tools do not deal well with quirks, errors, exceptions,
or the normal mess of human interactions.
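The rule-based, structured-data nature of RPA described above can be sketched as follows. The invoice fields and the split between posted records and an exception queue are invented for illustration; commercial RPA suites work at the user-interface level rather than in plain Python:

```python
import csv
import io

# Hypothetical records as they might be exported by a source application.
SOURCE = """invoice_id,amount,currency
INV-001,150.00,USD
INV-002,99.50,USD
INV-003,oops,USD
"""

def process(rows):
    """Replay the rule-based steps a clerk would perform: validate each
    row and post well-formed ones. Anything breaking the rules goes to
    a human -- exactly the limitation described above."""
    posted, exceptions = [], []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            exceptions.append(row["invoice_id"])  # the bot cannot guess intent
            continue
        posted.append({"id": row["invoice_id"], "amount": amount})
    return posted, exceptions

posted, exceptions = process(csv.DictReader(io.StringIO(SOURCE)))
print(len(posted), exceptions)  # 2 ['INV-003']
```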
Short History
One of the first steps toward the innovation that would eventually lead to the creation
of RPA was machine learning. The term is widely credited as first coined in 1959 by Arthur
Samuel, a pioneer in the field of artificial intelligence who at the time was working for the
computer company IBM. Machine learning started as a scientific endeavour aimed at creating
artificial intelligence.
In the 1960s, researchers combined artificial intelligence with the study of interactions
between computers and human languages. The goal was to help computers understand and
process human language more accurately, since computers do not have the same
understanding of natural language that humans do.
Then, in the 1990s, there were a few more key developments. One of those was
screen-scraping software.
By the early 2000s, simple RPA had been developed; however, it remained relatively
unknown for some time. It was not until around 2015 that RPA began to enter the mainstream.
RPA tools can, for example, log into any application, connect to system APIs, and
copy and paste data.
RPA Uses
a. Help Desk – RPA can help diminish the workload of the human help desk by taking
care of straightforward, repetitive issues.
b. Pulling Data from Multiple Sites for Best Deals – RPA tech can help make it happen
by scraping data off websites, comparing it, and showing you the best deal.
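Once the prices have been scraped, the best-deal use case amounts to a single comparison. The site names and prices below are invented for illustration:

```python
# Prices already scraped from several hypothetical retail sites.
scraped_prices = {
    "site-a.example": 499.00,
    "site-b.example": 479.99,
    "site-c.example": 512.50,
}

# Pick the site with the lowest price in one pass.
best_site = min(scraped_prices, key=scraped_prices.get)
print(best_site, scraped_prices[best_site])  # site-b.example 479.99
```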
BLOCKCHAIN TECHNOLOGY
The blockchain has gained so much admiration because:
a. It is not owned by a single entity; hence it is decentralized.
b. The data is cryptographically stored inside.
c. The blockchain is immutable, so no one can tamper with the data that is inside the
blockchain.
d. The blockchain is transparent so one can track the data if they want to.
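The hash-chaining behind the immutability property can be sketched in a few lines. This is a deliberately minimal model (no network, no consensus, invented transaction strings), meant only to show why tampering with one block invalidates everything after it:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash the block's contents, including the previous block's hash,
    # so every block is cryptographically chained to its predecessor.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_chain(transactions):
    chain, prev = [], "0" * 64  # genesis blocks point at an all-zero hash
    for tx in transactions:
        block = {"tx": tx, "prev_hash": prev}
        prev = block_hash(block)
        chain.append(block)
    return chain

def is_valid(chain) -> bool:
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        prev = block_hash(block)
    return True

chain = make_chain(["alice->bob:5", "bob->carol:2"])
print(is_valid(chain))             # True
chain[0]["tx"] = "alice->bob:500"  # tamper with an early block...
print(is_valid(chain))             # False: later links no longer match
```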
Some companies that have already incorporated blockchain include Walmart, Pfizer,
AIG, Siemens, Unilever, and a host of others. For example, IBM has created its Food Trust
blockchain to trace the journey that food products take to reach their destinations.
First proposed as a research project in 1991, Blockchain is comfortably settling into its
late twenties. Like most millennials its age, blockchain has seen its fair share of public scrutiny
over the last two decades, with businesses around the world speculating about what the
technology is capable of and where it’s headed in the years to come.
With many practical applications for the technology already being implemented and
explored, blockchain is finally making a name for itself at age twenty-seven, in no small part
because of bitcoin and cryptocurrency. As a buzzword on the tongue of every investor in the
nation, blockchain stands to make business and government operations more accurate,
efficient, secure, and cheap with fewer middlemen.
“I don’t need a hard disk in my computer if I can get to the server faster… carrying
around these non-connected computers is byzantine by comparison.”
Edge Computing
Edge computing brings data storage and computing power closer to the device or data
source where it is most needed. Instead of information being processed in the cloud and
filtered through distant data centers, the cloud comes to you. This distribution reduces latency
and saves bandwidth. Edge processes include computing, storage, and networking.
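The bandwidth saving can be illustrated by aggregating readings at the edge before uploading. The sample values, field names, and alert threshold below are invented:

```python
# One minute of raw temperature samples from a hypothetical sensor.
raw_samples = [21.4, 21.5, 21.4, 21.6, 35.0, 21.5]

def edge_summarise(samples, alert_above=30.0):
    """Aggregate locally: one small summary payload replaces
    len(samples) separate uploads to a distant data center."""
    return {
        "count": len(samples),
        "mean": round(sum(samples) / len(samples), 2),
        "alerts": [s for s in samples if s > alert_above],
    }

summary = edge_summarise(raw_samples)
print(summary)
```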
Cloud Computing
Cloud computing revolves around large, centralized servers stored in data centers.
After data is created on an end device, that data travels to that central server for processing.
Cloud computing refers to the use of various services, such as software development
platforms, storage, servers, and other software, through internet connectivity.
Service Models
Cloud computing services can be deployed in terms of business models, which can
differ depending on specific requirements. Some of the conventional service models employed
are described in brief below.
a. Software as a Service or SaaS – Ready-to-use applications delivered over the
internet, with the vendor managing everything beneath them.
b. Platform as a Service or PaaS – A managed platform on which consumers build and
deploy their own applications without managing the underlying servers or operating
systems.
c. Infrastructure as a Service or IaaS – Here, consumers can control and manage the
operating systems, applications, network connectivity, and storage, without managing
the underlying cloud infrastructure themselves.
Deployment Models
Just like the service models, cloud computing deployment models also depend on
requirements. There are four main deployment models: public, private, community, and
hybrid cloud. Each has its own characteristics.
Despite the many challenges cloud computing faces, the cloud offers many benefits
as well.
5G TECHNOLOGY
5G is the 5th generation mobile network. It is a new global wireless standard after 1G,
2G, 3G, and 4G networks. 5G enables a new kind of network that is designed to connect
virtually everyone and everything together including machines, objects, and devices.
5G wireless technology is meant to deliver higher multi-Gbps peak data speeds, ultra-
low latency, more reliability, massive network capacity, increased availability, and a more
uniform user experience to more users. Higher performance and improved efficiency empower
new user experiences and connect new industries.
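A back-of-the-envelope calculation shows what the jump in throughput means in practice. The assumed rates (roughly 50 Mbps for a typical 4G link and 2 Gbps for a 5G peak) are illustrative only:

```python
def download_seconds(size_gigabytes: float, throughput_mbps: float) -> float:
    """Transfer time, assuming the link sustains the given throughput."""
    megabits = size_gigabytes * 8 * 1000  # 1 GB = 8000 megabits (decimal)
    return megabits / throughput_mbps

movie_gb = 4.0
print(download_seconds(movie_gb, 50))    # 640.0 s on an assumed 4G link
print(download_seconds(movie_gb, 2000))  # 16.0 s at an assumed 5G peak rate
```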
The earlier generations (1G, 2G, 3G, and 4G) all led to 5G, which is designed to
provide more connectivity than was ever available before.
With high speeds, superior reliability, and negligible latency, 5G will expand the mobile
ecosystem into new realms. 5G will impact every industry, making safer transportation, remote
healthcare, precision agriculture, digitized logistics — and more — a reality.
5G is already here today, and global operators started launching new 5G networks in
early 2019. Also, all major phone manufacturers are commercializing 5G phones. And soon,
even more people may be able to access 5G.
The COVID-19 pandemic resulted in school closures across the world. Globally, over
1.2 billion children are out of the classroom. As a result, education has changed dramatically,
with the distinctive rise of e-learning, whereby teaching is undertaken remotely on digital
platforms.
OTHERS
a. Mobile Apps
Mobile applications have only grown in popularity over the past few years, and this
year they are surfacing in bigger and better ways. Brands and industries all over the
world are trying to find ways to improve their work using mobile apps and through the
implementation of new resources that can make working on the go more efficient.
b. Virtual Reality
The gaming industry has always experienced growth alongside the field of information
technology, and virtual reality has taken this one step further, giving customers the
very epitome of digital experience. Virtual reality gaming has already started to
become popular due to new technology, which improves the way the industry can
grow.
c. Augmented Reality
Augmented reality is another approach to ‘artificial experiences’ that individuals are
now being given access to. This has improved the way the field has been able to
develop. AR is seeing a lot more applicability outside the gaming industry as well and
is seeing more implementation as compared to virtual reality.
d. Cyber Security
With the growth of digital mediums and technology, the potential threats that people
can face are only rising. Because of this, cybersecurity has had to grow extensively
over the past few years simply to keep pace with that growth. Industries all over the
world also realize the importance of investing in cybersecurity, which is why the field
is growing at such a rapid pace.
e. Open-Source Solutions
Open-source programs give users access to some of the main files and framework of
a particular program, enabling them to modify it with ease. As more and more users
become technologically proficient, giving them the option to work with applications
themselves is proving to be incredibly beneficial.
LESSON 5.2
GRAND CHALLENGES OF ICT
There are a number of problems that IT has been facing over the past couple of years.
1. IoT Security
A whopping 82 percent of organizations feel challenged in their efforts to fully identify and
secure devices connected to a company’s network via the Internet of Things (IoT), according
to research cited by CIO.
Gone are the days when the chief issues facing IT were end users struggling to
understand their systems or wrestling with their passwords. Now the concern lies much more
with potential outsiders who shouldn’t be inside.
More than half of IT executives (54 percent) are anxious about the security of their
systems. Cybersecurity threats are very real, and IT heads feel it. Cyber threats to an IoT
system have the potential to disable company systems and impact business performance.
2. Retraining of IT Personnel
Back in the day, many firms expected their people to pursue retraining on their own. Now,
that’s less expected, simply because platform complexity has increased. IT professionals
accept that retraining should be a company responsibility.
But how to find the time for retraining? IT heads feel challenged in this as well. As a result,
it sometimes doesn’t happen enough. In fact, 40 percent of IT professionals believe they’re
not getting the training they should be.
3. Data Overload
The massive quantity of data now available has led to data overload everywhere, and
nowhere is this more true than in IT departments.
Data overload is likely to always be with us, at least for the foreseeable future. Why?
Because data streams are superabundant and likely to be only increasing from here.
Since data overload itself can’t be fought, the key is to manage it well. How? Know your
company’s key performance metrics, and follow the data that pertain to them, which can drive
improvement on the metrics. In other words, if improving cybersecurity is a goal, identify and
utilize the data that can augment cybersecurity.
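The advice above, to follow only the data that pertains to your key metrics, can be sketched as a simple filter. The KPI names and data-stream names are invented for illustration:

```python
# Map each key performance indicator to the data streams that feed it.
KPI_STREAMS = {
    "cybersecurity": {"failed_logins", "firewall_drops"},
    "uptime": {"service_restarts"},
}

def relevant(streams, kpi):
    """Keep only the incoming streams that pertain to the chosen KPI."""
    wanted = KPI_STREAMS.get(kpi, set())
    return [s for s in streams if s in wanted]

incoming = ["failed_logins", "cpu_temp", "firewall_drops", "ad_clicks"]
print(relevant(incoming, "cybersecurity"))  # ['failed_logins', 'firewall_drops']
```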
4. Workload
Both IT staff and decision-makers are overwhelmed with work demands. It’s sort of a
chicken and egg scenario—decision-makers are using increasing workloads as an excuse not
to authorize training, and staff are struggling to complete assignments because they lack the
proper skills. Either way, time that was previously designated toward skills development is
now being used to catch up on an increasing backlog of work.
Workload concerns are the highest they’ve been in the history of our IT Skills and Salary
Report. It’s the number one training inhibitor, as IT professionals believe mounting Workloads
limit the amount of time they can spend out of office or in a training course.
Better management oversight and strategy are required to address this issue.
Automation may also be a solution, as a means to reduce time-consuming tasks that are not
high priority.
5. Cybersecurity
Organizations cannot take IT security lightly. An analysis of worldwide identity and access
management by the International Data Corporation (IDC) revealed that 55% of consumers
would switch platforms or providers due to the threat of a data breach, and 78% would switch
if a breach impacted them directly. Customers aren’t willing to put their data at risk.
The problem is there aren’t enough IT professionals with cybersecurity expertise. Forty
percent of IT decision-makers say they have cybersecurity skills gaps on their teams. It’s also
identified as the most challenging hiring area in IT.
There isn’t an immediate solution to this problem, but a long-term fix is to build your cyber
workforce from the inside. Invest in cybersecurity training and upskill your current staff. Hiring
and outsourcing isn’t always a viable (or cheap) solution. Current IT professionals who know
the industry are well positioned to transition into successful cybersecurity roles.
6. Skills Gaps
Over 80% of North American IT departments have skills gaps. Globally, IT skills gaps have
increased by 155% in three years. They can no longer be ignored, especially as a lack of
necessary skills can be credited for increased employee stress, development and deployment
delays, and increased operating costs.
According to IT decision-makers, skills gaps will cost employers up to 416 hours and over
$22,000 per employee, per year. You would think those numbers would motivate organizations
to increase skill development opportunities for employees, but that isn’t always the case. Less
than 60% of decision-makers say their organizations offer formal training for technical
employees, down one percent from the previous year. This tells us that organizations aren’t
serious enough about skill development.
The time to act is now—skills gaps will only grow and further debilitate IT departments
unless actions are taken. Strategic and continual training is the antidote. That’s the good news.
The uphill battle is conveying the value to management and securing budget to ensure
employees receive continual training. IT professionals need better support. If organizations do
not invest in their employees’ skills now, they will pay for it down the road.
7. Digital Transformation
Digital transformation is a top priority for organizations, but it’s not that simple. As
discussed above, IT departments are suffering from gaps in critical skills areas such as
cybersecurity, cloud computing, and DevOps. Even IT professionals who are offered
professional development opportunities are struggling to keep up. The rate of technological
change is outpacing training.
IT professionals and departments are falling behind—they are failing to meet business
objectives and seize market opportunities. While continual training is part of the equation,
prioritizing skill needs is even more of a priority. That’s why we created the Skills Development
Index™ to help IT professionals rank their most critical skill needs and determine which type
of training to pursue. Informal training has its merits, especially when on-the-fly knowledge
must be acquired, but when a high-value project is on the line, more formal learning is the
better option.
8. Cloud Computing
Cloud is the top investment area worldwide for IT departments. Organizations require an
infusion of cloud skills to match their monetary investment in cloud platforms. Much like
cybersecurity, cloud professionals are in high demand and short supply. According to IT
decision-makers, cloud computing is the second most challenging hiring area in the world.
The opportunities of cloud computing are impossible to ignore. Cloud is the ultimate
enabler, opening new channels of revenue by leveraging technologies like artificial intelligence
(AI) and the Internet of Things (IoT). But professionals are needed to capitalize on this
technology, and currently, there aren’t enough of them.
Despite the worker shortage, organizations are all-in on cloud solutions. In fact, more than
50% of organizations use more than one cloud provider. It’s not unusual for an organization to
require cloud skills in AWS, Microsoft Azure, and Google Cloud. And generic cloud computing
expertise isn’t enough, especially if you’re an engineer or architect. It’s imperative that cloud
professionals have current skill sets and train on the platforms they engage with regularly.
9. Data Management
Aside from cybersecurity and cloud computing, this is the biggest skill gap area for IT
departments. Organizations are struggling to manage a wealth of new data. By 2025, IDC
estimates the world will create and replicate 163 zettabytes (ZB) of data, 10 times the amount
created in 2016. New data is constantly accumulating, creating a host of storage and security
risks that must be addressed. IT professionals are desperately needed to manage this data
growth, but the problem has been exacerbated because qualified individuals are difficult to
come by.
It’s not enough to accumulate this data. Organizations need analysts and critical thinkers
to create a culture of information, enabling data-driven decisions to inform almost all business
activities.
The good news is most cloud platforms, such as AWS and GCP, allow you to capture,
process, store and analyze data all in one place. The key now is to upskill and certify
professionals on the technologies and services associated with these platforms.
10. Automation
Since workload is the biggest challenge for IT professionals, finding ways to automate
more mundane and time-consuming tasks such as email sends and social media posting is
crucial.
But companies are now looking to automate larger and more business-critical tasks, such
as cyberattack response, log monitoring and ERP integration.
Automation’s role in cybersecurity is certainly growing. It’s a tool that should be used to
predict cyber threats and implement responses more quickly than can be accomplished
manually.
Hackers are using automation to execute their attacks, so it’s time to bring the fight back
to them. Automation allows attackers to move quickly, so organizations need faster detection
and response times.
Automation is also useful in cloud migration. For organizations moving to the cloud, many
of the migration tasks, such as manual configuration, can be automated, which reduces
migration time from days to minutes.
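A log-monitoring automation of the kind described here can be sketched as a pattern scan. The patterns and log lines are invented, and a real deployment would use a SIEM platform rather than a short script:

```python
import re

# Hypothetical signatures of suspicious activity.
ATTACK_PATTERNS = [re.compile(p) for p in (r"failed password", r"sql.*(--|;)")]

def triage(log_lines):
    """Flag lines matching any known attack pattern, faster than a
    human reviewer could scan them."""
    flagged = []
    for line in log_lines:
        if any(p.search(line.lower()) for p in ATTACK_PATTERNS):
            flagged.append(line)
    return flagged

logs = [
    "2024-01-01 ok GET /index.html",
    "2024-01-01 Failed password for root from 10.0.0.9",
]
print(triage(logs))  # only the failed-login line is flagged
```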
REFERENCES
Comer, D.E. (2019) The Internet Book: Everything You Need to Know about Computer
Networking and How the Internet Works (5th ed.). CRC Press Taylor & Francis Group
https://www.aiim.org/What-is-Robotic-Process-Automation
https://ayehu.com/pros-cons-robotic-process-automation/
https://www.infoworld.com/article/3230383/intro-to-robotic-process-automation.html
https://searchcio.techtarget.com/definition/RPA#:~:text=Robotic%20process%20automation%20(RPA)%20is,maintenance%20of%20records%20and%20transactions.
https://www.uipath.com/rpa/robotic-process-automation
https://phoenixnap.com/blog/edge-computing-vs-cloud-computing
https://www.educba.com/example-of-cloud-computing//
https://www.netcov.com/what-is-cloud-computing/
https://www.investopedia.com/terms/b/blockchain.asp
https://www.qualcomm.com/5g/what-is-5g#:~:text=A%3A%205G%20is%20the%205th,machines%2C%20objects%2C%20and%20devices.
https://linchpinseo.com/trends-in-the-information-technology-industry/
https://www.bmc.com/blogs/saas-vs-paas-vs-iaas-whats-the-difference-and-how-to-choose/
https://nescoresource.com/blogs/details/the-biggest-issues-facing-it-today-/180/
https://www.globalknowledge.com/us-en/resources/resource-library/articles/12-challenges-facing-it-professionals/