Internal Verifier
Date
signature
Programme Leader
Date
signature (if required)
LO2 Research state-of-the-art Emerging Technologies and choose one you believe will have significant
impact in the future
Pass, Merit & Distinction P3 P4 M3 M4
Descriptors
LO3 Discuss the current state and future impact of your chosen Emerging Technology
LO4 Evaluate the political, economic and social factors which play a role in the competition between
Emerging Technologies and their success or failure in the future
Resubmission Feedback:
* Please note that grade decisions are provisional. They are only confirmed once internal and external moderation has taken place and grade decisions have been agreed at the
assessment board.
Assignment Feedback
Formative Feedback: Assessor to Student
Action Plan
1. A cover page or title page – You should always attach a title page to your assignment. Use the previous page as your cover sheet and
make sure all the details are accurately filled in.
2. Attach this brief as the first section of your assignment.
3. All assignments should be prepared using word processing software.
4. All assignments should be printed on A4 sized paper. Use single-sided printing.
5. Allow 1” for the top, bottom and right margins and 1.25” for the left margin of each page.
6. The font size should be 12 point, and the style should be Times New Roman.
7. Use 1.5 line spacing. Left justify all paragraphs.
8. Ensure that all headings are consistent in terms of font size and font style.
9. Use the footer function in the word processor to insert your name, subject, assignment number, and page number on each page. This
is useful if individual sheets become detached for any reason.
10. Use the word processor’s spell check and grammar check functions to help edit your assignment.
Important Points:
1. It is strictly prohibited to use text boxes to add text to the assignments, except for compulsory information, e.g. figures, tables
of comparison, etc. Adding text boxes to the body except for the aforementioned compulsory information will result in rejection
of your work.
2. Avoid using page borders in your assignment body.
I hereby declare that I know what plagiarism entails, namely to use another’s work and to present it as my own without attributing the
sources in the correct form. I further understand what it means to copy another’s work.
Issue Date
Submission format
For the final report and the research report, you are expected to make use of appropriate
structure – including headings, paragraphs, subsections and illustrations as appropriate and all
work must be supported with research and referenced using Harvard referencing system.
LO1 - Assess what Emerging Technologies are necessary and appropriate when designing
software applications for the future.
LO2 - Research state-of-the-art Emerging Technologies and choose one you believe will have
significant impact in the future.
LO3 - Discuss the current state and future impact of your chosen Emerging Technology.
LO4 - Evaluate the political, economic and social factors, which play a role in the competition
between emerging technologies and their success or failure in the future.
Scenario
‘Dex Consulting’ is a leading research and consultancy firm researching new market trends
and Emerging Technologies for corporate clients and the consumer market. You currently
work as a trainee technology analyst for ‘Dex Consulting’. As part of your role, your manager
has tasked you with researching an Emerging Technology suitable for a potential client. You
are required to identify a specific user group you believe will be most influenced by this
Emerging Technology.
As part of this assignment, you must develop a report using research data gathered about
your chosen Emerging Technology, industry and end user, and present your findings in a
15-minute presentation.
You may attach supporting evidence and material such as user personas, hype cycles, etc. to
the report.
Activity 01
• Assess the formats, characteristics and trends of Emerging Technologies and evaluate how
they can challenge the status quo of markets, established practices and end user
experiences. Your answer should be supported with valid and relevant examples.
Activity 02
Select and research a specific emerging technology that is likely to impact the software
development industry. Organize your research findings and produce a small research report
with the following:
• Select a specific emerging technology as stated in the scenario and relate it to
existing technologies to demonstrate how the selected ET is likely to merge with or replace
an existing technology in the industry. Defend your choice of emerging technology by
evaluating why you believe it would have the most impact on the future of software
development.
• Critically evaluate the above findings while justifying the selected ET and its impact on
its end users and software development industry as a whole.
Activity 03
Demonstrate your research findings in a 15-minute presentation to the client to whom you
are recommending the emerging technology. The presentation should cover the
following:
• Gather feedback from the audience and answer the questions they raise about your
research. Document the feedback received and questions raised by the end users and
attach to your report.
It is indeed a great pleasure to have been given this opportunity to do this project work and
complete it within the specified period. Although it was a Herculean task at the beginning, I
took it as a challenge, and it was a good experience for me. I am very glad that I benefitted
greatly from doing this project. There are a few people who were really involved in helping me
with this project. It is with great enthusiasm that I extend my gratitude to ESoft Metropolitan
Campus for giving me this opportunity.
My gratitude is also extended to my lecturer, Mr. Yoshiharaan, for his encouragement and
guidance at all times.
Secondly, I thank my colleagues for their support and cooperation in completing this project.
A big thank you to all of you.
Table of Tables
Table 1: ET table ....................................................................................................................... 30
The term "emerging technology" usually refers to a new technology, but it can also refer to
the continued evolution of an existing technology. It can have slightly different connotations
depending on the context, such as in the media, business, science, or education. The
expression is frequently reserved for innovations that have, or are expected to have,
significant societal or economic ramifications. It is used to describe technologies that are
currently under development or will be available within the next five to ten years.
Emerging digital technologies have introduced new legal concerns, particularly those
affecting copyright, trademarks, patents, royalties, and licensing, as well as new
opportunities. For example, the development of new digital communication technologies and
media has resulted in the emergence of new problems around the digital reproduction and
dissemination of copyrighted works. To give adequate protections and legal clarity to
copyright owners, digital technology enterprises, the public, and other interested parties, the
federal government, concerned industry, and non-governmental organizations (NGOs)
working for the public good have taken (and continue to take) action.
Emerging technologies differ from established technologies in various ways. Here are some
of the fundamental characteristics of emerging technologies:
Data is considered as the new oil and a strategic asset that drives or perhaps controls the
future of science, technology, the economy, and potentially everything in our world today and
tomorrow because we live in the age of big data. Data has not only created a lot of excitement
and noise, but it also provides significant challenges that stimulate amazing creativity and
financial opportunities. Not only is data evolving and moving paradigms, but so are all the
other aspects that can be generated, altered, or improved through comprehension, research,
and application of data. The trend and its implications have prompted a new debate about
data-intensive scientific methodologies. There is some skepticism about the capacity of data
science and analytics, as a growing field, to support data-driven theory. Nevertheless, this
so-called "fourth industrial revolution" is becoming increasingly well known across the
economy and in professional development. It encompasses not just fundamental areas like
computing, informatics, and statistics, but also broader domains like economics, social
science, and medical/health science.
Assess formats
The fact that we are still learning about the consequences of developing technologies and
practices for learning, teaching, and education, as well as for students, instructors, and
institutions, makes them exceptional. Another area where we lack understanding is the
contextual, negotiated, and symbiotic relationship that exists between practices and
technologies.
Emerging technologies and processes are not fully understood since they have not been
thoroughly researched. Early assessments of emerging technologies frequently reach
conclusions that are either evangelical, overly gloomy, or dystopian, and merely list benefits
and drawbacks without contemplating how they will affect online education. A case study is
commonly utilized in new technology and practice research.
Innovation:
Emerging technologies are innovative and novel. They frequently represent a major departure
from conventional technology, providing new capabilities or addressing issues in novel ways.
Rapid Development:
Emerging technologies are characterized by rapid development and evolution, with new
versions, features, and capabilities arriving on a regular basis.
Disruptiveness:
Emerging technologies have the potential to disrupt existing industries and business
structures, posing challenges to established players while also generating opportunities for
new entrants.
High-risk/high-reward:
Emerging technologies frequently raise concerns about human health and safety as well as
financial considerations. However, they also have the potential to provide considerable
benefits such as increased productivity, efficacy, and quality of life.
Experimental:
Researchers and innovators routinely test new methods and technologies in the creation of
developing technologies to discover which are the most effective. This experimental
approach could lead to discoveries and technology that disrupt entire industries.
Complexity:
Emerging technologies may entail complex systems and technologies that require specific
knowledge and experience to create and implement. They can also be very complicated. This
intricacy can make evaluating the viability and potential ramifications of emerging
technology challenging for those in charge of making decisions.
Definition of AI
Businesses and solution providers are increasingly turning to artificial intelligence as a tool of
choice. AI, when paired with machine learning, deep learning, and neural networks, can be a
potent combo, as seen frequently in social media. Businesses can utilize AI to save costs,
optimize corporate operations, improve customer experience, enable more efficient
communication via chatbots, raise customer happiness, and provide insight into purchase
behaviour to inform decision-making.
Furthermore, machine learning can analyse enormous datasets and provide scalable insight.
We're only scratching the surface of how machine learning and artificial intelligence can work
together to help businesses. In fact, the worldwide AI industry is expected to reach $407
billion by 2027 at a CAGR of 36.2%. Current applications offer enormous growth potential.
Figure 3: Blockchain
Previous discussions about blockchain have frequently focused on bitcoin, but the true value
of blockchain lies in its immutability and transparency. Blockchains employ distributed
ledger technology, which results in a permanent and highly visible record of activity with
significant business application possibilities.
The blockchain is an information technology that has the potential to improve supply chain
management by allowing transparency into the movement of goods from origin to end product.
Blockchain technology will also improve record management by delivering a snapshot of any
record dating back to its creation. This could be used to validate orders, purchases, refunds,
product receipt, and so on.
Smart contracts are yet another blockchain use that ensures that requirements are met. When
both parties have met the terms of an agreement, smart contracts release data. They have
limitless opportunities for ensuring frameworks are followed and, as a result, can help
establish you as a reliable solution provider.
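The immutability described above comes from hash-linking: each block's hash covers the previous block's hash, so editing any past record breaks every link after it. The sketch below is a toy illustration only (real blockchains add consensus, signatures, and timestamps), and the helper names `append_block` and `is_valid` are invented for this example.

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents together with the previous block's hash,
    # so any later edit changes every hash downstream.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, data):
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"data": data, "prev_hash": prev}
    block["hash"] = block_hash({"data": data, "prev_hash": prev})
    chain.append(block)
    return chain

def is_valid(chain):
    # Recompute each hash and check the links; tampering breaks the chain.
    for i, block in enumerate(chain):
        expected = block_hash({"data": block["data"], "prev_hash": block["prev_hash"]})
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
append_block(chain, "order #1001 placed")
append_block(chain, "order #1001 shipped")
print(is_valid(chain))                      # True
chain[0]["data"] = "order #1001 cancelled"  # tamper with history
print(is_valid(chain))                      # False
```

This is why a blockchain record is "permanent and highly visible": the tampered entry is detected the moment anyone re-verifies the chain.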
IoT reduces the need for human-to-human or human-to-computer interaction to convey data
over a network.
The term "thing" refers to any natural or artificial object that can be assigned an Internet
Protocol address and can transfer data over a network, such as people with implanted heart
monitors, farm animals with biochip transponders, cars with built-in tire pressure monitors,
and other examples.
This field enables computers to interpret visual input and then act or make judgments based
on that information. To train algorithms to detect small distinctions and distinguish diverse
visual inputs, computer vision requires massive amounts of data.
Computer vision prototypes have considerable business potential since they can evaluate
items and processes as part of quality control to detect practically undetectable discrepancies
and flaws. Using Google Translate to convert signage into a native language and making sense
of traffic signs in self-driving cars are two business uses.
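A very crude version of the quality-control idea can be sketched by comparing a sample image against a known-good reference and flagging pixels that differ beyond a threshold. Real computer vision systems use trained models rather than this simple differencing; `defect_score` and the toy 8×8 "images" here are invented for illustration.

```python
import numpy as np

def defect_score(reference, sample, threshold=30):
    # Mean fraction of pixels whose grey level differs from the reference
    # by more than `threshold` -- a stand-in for real defect detection.
    diff = np.abs(reference.astype(int) - sample.astype(int))
    defect_mask = diff > threshold
    return defect_mask.mean()

# Toy 8x8 "grayscale images": a good part and one with a scratch.
reference = np.full((8, 8), 200, dtype=np.uint8)
good = reference.copy()
scratched = reference.copy()
scratched[3, 2:6] = 40  # a dark scratch across four pixels

print(defect_score(reference, good))       # 0.0
print(defect_score(reference, scratched))  # 0.0625 (4 of 64 pixels flagged)
```

The point of the sketch is the workflow, not the method: an inspection system evaluates every item against a reference and surfaces discrepancies too small for a human to spot reliably.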
If data is the new gold, consumer data platforms are the new banks. Businesses gain from
knowing as much about their customers as possible so that they may hyper-personalize
experiences and know how to connect with and engage prospects and customers. However,
this knowledge is frequently dispersed across multiple systems or platforms with no
single source of truth. Customer data platforms combine this information into a single source
to present a complete picture of customers and prevent the possibility of unclean data.
If COVID-19 left a lasting impact, it is in digital health. As patients became unable to visit
doctors' offices and hospitals, digital health stepped in to fill the void. However, now that
wounded or sick people are aware that they can receive adequate healthcare from their
doctors without in-person visits, they are taking advantage of that possibility. This trend is
undoubtedly here to stay, and in the coming years it is expected to give rise to associated new
technologies that exploit advances such as biometrics to create smart or connected medical
equipment that will permit ongoing remote medical checks.
With the appropriate coding, almost anything is feasible, and we can now digitally replicate a
human. A person's virtual manifestation is a real-time simulation of what happens in the
physical body. Digital twins are extremely effective for predicting outcomes and measuring
performance. There are numerous business and healthcare applications that enable us to
conduct testing and use data analytics that were previously difficult and time consuming with
living individuals. Advances like genome mapping and gene therapy may become
increasingly feasible soon. Furthermore, as we try to address future medical concerns, digital
twin technology may eventually replace the need for human clinical trials.
Edge Computing
Edge computing is a decentralized approach in which computational nodes are placed closer
to the point of interaction. Edge computing, according to Gartner, is a model in which
"information processing, content collection, and delivery are placed closer to the sources,
repositories, and consumers of this information." To enable more effective and real-time data
consumption, this model optimizes technical interactions and lowers latency at the point of
origin. For localized interactions, edge computing is increasingly becoming the most
effective option.
As the world becomes more computerized, informed business is essential, and the internet of
behaviours, or IoB, provides better insight into customer behaviour. The IoB offers
opportunities for data collection and analysis regarding consumer interactions, preferences,
and purchase behaviour to enterprises looking to maintain a competitive advantage.
IoB, like the internet of things (IoT), provides significantly more insight into how consumers
participate in the purchasing journey by evaluating data from IoT and online sources from a
psychological standpoint. Finally, this technology is intended to assist businesses in
improving the user experience and engaging with consumers more meaningfully.
Low-code technology makes software creation accessible to persons who lack advanced
technical skills. Traditional software development is a time-consuming and labour-intensive
process that necessitates a high degree of programming skills as well as a large time
investment. Software can be created using a drag-and-drop interface and no heavy backend
coding with low-code technologies. This enables business users to address a wide range of
individual problems without requiring the assistance of a highly technical resource.
Quantum computing provides unprecedented prospects for predictive analysis that go beyond
the capabilities of traditional computers. To process information on an exponential scale,
quantum computers use the principles of superposition and entanglement. While Google
promised the world's largest quantum computer in 2017, IBM has actually made it viable for
businesses to use this technology.
Businesses can use quantum computing as a strong tool for predictive analytics and big data
analytics. As new issues arise, quantum computing will aid in the prediction of possible
solutions.
Robotic process automation (RPA) is a relatively new term. However, the name is somewhat
misleading, because there are no physical robots participating in these operations. RPA is the
automation of processes that formerly required human labour utilizing bots that follow a
repeated pattern to do more of these computer-based jobs with increased efficiency. Many
firms are using RPA to help them achieve more effective workflows for rule-based tasks.
Spatial Computing
When we watch futuristic movies, we commonly see spatial computing. Rather than dealing
with a stationary computer on your desk, we now engage with computing as we go about our
daily lives. Spatial computing includes virtual reality (VR) and augmented reality (AR) but
goes well beyond those categories. Spatial computing is the interaction of digital elements
with the actual environment. Consider smart houses, where you can use voice commands to
achieve real-world goals, or smart glasses that you can wear as you travel around the world,
informing real-world experiences with digital resources and interfaces.
Overall Experience
5G in Daily Life
The connectivity speeds achieved by 5G vastly outperform those found with prior networks.
5G networks provide the platform for organizations to embrace numerous impending
disruptive technologies. However, 5G technology has not gained the traction that was
expected. It is expected that infrastructure will finally reach a practical stage, and gadgets will
become sufficiently affordable, allowing us to fully utilize the power of 5G.
Cloud computing, app development, data centers, and even eCommerce were formerly
considered emergent technology. Looking at new top technological trends has become a need
as we approach the age of the metaverse. If business models are to continue reaching new
heights in the coming years, they must innovate.
Which IT industry trend will have the greatest influence on our business?
❖ Enhanced Efficiency:
Emerging technologies frequently simplify procedures and tasks, resulting in increased
efficiency and production. Automation, artificial intelligence, and the Internet of Things, for
example, can help eliminate manual intervention and optimize operations.
❖ Improved Connectivity:
5G and IoT technologies offer seamless connectivity between devices, resulting in quicker
data transmission and real-time communication. This connectivity is essential for a variety of
applications, ranging from remote surgery to smart cities.
❖ Sustainability:
Renewable energy solutions and IoT-based resource management help with sustainability
efforts. They aid in the reduction of carbon footprints and the promotion of responsible
resource utilization.
❖ Healthcare Innovations:
Through telemedicine, remote monitoring, AI-assisted diagnostics, and individualized
treatment regimens, emerging technologies offer better healthcare, ultimately increasing
patient outcomes.
❖ Global Interconnection:
Communication technology and internet access have brought individuals from all over the
world together, fostering cross-cultural understanding, collaboration, and idea sharing.
❖ Economic Development:
Emerging technology innovation and adoption drive economic growth by establishing new
markets, industries, and job opportunities.
❖ Enhancements to Security:
While emerging technologies pose security difficulties, advances in cybersecurity are being
made to protect sensitive data and systems from cyber attacks.
❖ Ethical Concerns:
Emerging technologies frequently create ethical issues, such as AI bias, autonomous car
decision-making in life-threatening scenarios, and the ethical application of biotechnology
such as gene editing.
❖ Environmental Implications:
Electronic waste is generated by the manufacture and disposal of electronic gadgets.
Furthermore, the energy requirements of data centers and high-tech companies can put a
burden on energy supplies and add to carbon emissions.
❖ Security Concerns:
Cyber risks evolve alongside technological advancements. Hackers and malevolent actors can
launch cyberattacks and disrupt key systems by exploiting weaknesses in new technology.
❖ Technology Dependence:
Overdependence on technology can expose civilizations to disruptions in the event of
technical failures, cyberattacks, or natural disasters.
❖ Regulatory Obstacles:
Emerging technologies can outrun regulatory development, resulting in legal and ethical
ambiguity. Regulating technologies such as AI, biotechnology, and self-driving cars creates
difficulties.
❖ Technological Difficulties:
Some new technologies encounter technical obstacles that prevent mainstream adoption, such
as the susceptibility of quantum computing to external variables and the limited battery life of
some IoT devices.
Platforms for Low-Code and No-Code Development: These platforms enable both
developers and non-developers to construct software applications with minimal coding. They
aim to accelerate development and make it more accessible to a wider audience.
AI and Machine Learning in Software Development: AI and machine learning are being
used to automate processes in software development such as code generation, problem
identification, and code review.
Automated Testing and Test Automation: Automation tools and frameworks improve the
efficiency and efficacy of software testing, resulting in higher-quality software releases.
Edge Computing: Edge computing includes processing data closer to the source rather than
in a centralized data center, which is particularly useful for applications that demand minimal
latency or deal with massive amounts of data.
Secure Development methods: As cybersecurity concerns grow, secure coding methods and
technologies are becoming increasingly important for preventing vulnerabilities and breaches
in software systems.
Natural Language Processing (NLP) and Chatbots: NLP is being used to create intelligent
chatbots and virtual assistants that can understand and respond to human language, hence
improving user interactions.
These are only a few examples of emerging technologies in the field of software engineering.
The landscape is constantly changing, and new technologies may gain importance at any
time. Staying current with industry trends and improvements is critical for software
developers to remain competitive in their business.
For this study, I have chosen Artificial Intelligence (AI) as an Emerging Technology.
The simulation of human intelligence processes by machines, specifically computer systems,
is referred to as artificial intelligence (AI). It entails the creation of algorithms and models
that allow computers to execute tasks that would normally need human intelligence, such as
problem solving, decision making, learning, natural language understanding, pattern
recognition, and others.
Narrow or Weak AI: This sort of AI is built and trained to do a single task. It excels at one
activity but lacks the ability to apply what it knows to other fields. Virtual personal assistants
(such as Siri or Alexa), recommendation systems (such as those used by streaming
platforms), and certain AI-powered games are all examples.
General or Strong AI: This is a level of AI that has human-like cognitive capacities, such as
the ability to comprehend, learn, and apply information across a wide range of tasks. Such AI
systems would be able to think and reason like humans. However, truly general AI is still a
theoretical idea that has yet to be realized.
AI technologies are divided into several subfields, including machine learning (a subset of AI
focused on developing algorithms that improve their performance based on data), deep
learning (a subset of machine learning that employs neural networks with multiple layers to
model complex patterns), natural language processing (NLP, which enables machines to
understand and generate human language), and computer vision (which enables computers to
interpret and understand visual information from the environment).
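The defining idea of machine learning above, an algorithm that improves its performance based on data, can be illustrated with a minimal example: fitting a line to noisy observations by gradient descent. The data, learning rate, and true parameters here are invented for illustration; real systems apply the same principle at vastly larger scale.

```python
import numpy as np

# "Learning from data": fit y ≈ w*x + b by gradient descent and watch
# the parameters converge toward the pattern hidden in the data.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 3.0 * x + 2.0 + rng.normal(0, 0.5, size=100)  # noisy line: slope 3, intercept 2

w, b, lr = 0.0, 0.0, 0.01
for step in range(2000):
    pred = w * x + b
    err = pred - y
    # Gradients of the mean squared error with respect to w and b.
    w -= lr * 2 * np.mean(err * x)
    b -= lr * 2 * np.mean(err)

print(w, b)  # close to the true slope 3.0 and intercept 2.0
```

Nothing was programmed about the specific relationship; the model recovered it purely from examples, which is the core distinction between machine learning and hand-coded rules.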
When most people hear the word artificial intelligence, they immediately think of robots.
This is because big-budget films and novels weave storylines about human-like machines
wreaking havoc on Earth. However, this could not be further from the truth.
Artificial intelligence is built on the premise that human intelligence may be characterized in
such a way that a machine can simply duplicate it and complete tasks ranging from the
simplest to the most complicated. One of the goals of artificial intelligence is to emulate
human cognitive processes. Researchers and developers in the field are making very rapid
progress in simulating activities like learning, reasoning, and perception, to the point that
these may be concretely characterized. Some predict that inventors may soon be able to
create systems that outperform humans' ability to learn or reason on any subject. Others,
however, are sceptical since all cognitive activity is riddled with value judgments that are
vulnerable to human experience.
As technology progresses, prior criteria that defined artificial intelligence become obsolete.
Machines that perform basic calculations or recognize text using optical character
recognition, for example, are no longer considered to represent artificial intelligence, because
this functionality is now assumed to be an intrinsic computer function.
AI is always expanding to benefit many different businesses. Machines are built using a
multidisciplinary approach that includes mathematics, computer science, linguistics,
psychology, and other disciplines.
The possibilities for artificial intelligence are limitless. The technique can be used in a wide
range of sectors and businesses. AI is being tested and employed in the healthcare business to
recommend medicine dosages, discover treatments, and assist with surgical procedures in the
operating room.
Computers that play chess and self-driving automobiles are further instances of machines
with artificial intelligence. Each of these machines must consider the ramifications of
whatever action they perform, as each action has an impact on the eventual product. The goal
of chess is to win the game. For self-driving cars, the computer system must account for all
external data and compute it to behave in a way that avoids a collision.
Artificial intelligence is also utilized in the financial industry to detect and highlight activities
in banking and finance, such as odd debit card usage and significant account deposits—all of
which help a bank's fraud department. AI applications are also being utilized to assist
expedite and simplify trading. This is accomplished by making it easier to estimate the
supply, demand, and pricing of securities.
Artificial intelligence is classified into two types: weak and strong. Weak artificial
intelligence embodies a system meant to perform a single task.
video games like the chess example from before, as well as personal assistants like Amazon's
Alexa and Apple's Siri. You ask the assistant a question, and it responds.
Strong artificial intelligence systems are those that can perform tasks regarded as
human-like. These are typically more sophisticated and difficult systems. They are
programmed to handle circumstances in which they may be required to solve problems
without the intervention of a human. These systems are used in applications such as
self-driving cars and hospital operating rooms.
Artificial intelligence has been scrutinized by scientists and the public alike since its
inception. One recurring motif is the notion that robots will become so advanced that humans
would be unable to keep up, and they will take off on their own, recreating themselves at an
exponential rate.
Another issue is that machines can invade people's privacy and potentially be armed. Other
debates center on the ethics of artificial intelligence and whether intelligent systems such as
robots should be granted the same rights as people.
Self-driving cars have been a source of contention because their machines are typically
developed with the least amount of risk and casualties in mind. When confronted with the
option of colliding with one person or another at the same moment, these autos would
calculate the option that would do the least amount of damage.
Another problematic issue that many people have with artificial intelligence is how it will
affect human employment. With many businesses trying to automate some tasks using
intelligent machinery, there is concern that people will be forced out of the workforce. Self-
driving cars may eliminate the need for taxis and car-sharing programs, while manufacturers
may easily replace human labour with robots, rendering people's skills obsolete.
❖ Reactive AI uses algorithms to optimize outputs based on a set of inputs. AIs that play
chess, for example, are reactive systems that compute the optimal strategy for victory.
Reactive AI is typically static, incapable of learning or adapting to new situations. As a
result, given the same inputs, it will produce the same output.
❖ Theory-of-mind AI is fully adaptive and has a vast capacity to learn and remember
previous experiences. Advanced chatbots that can pass the Turing Test, deceiving a
person into thinking the AI is a human being, are examples of this sort of AI. These AIs
are not self-aware, despite being advanced and remarkable.
❖ As the name suggests, self-aware AI becomes sentient and aware of its own existence.
Such AI remains in the realm of science fiction, and some scientists believe an AI will
never become conscious or "alive."
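The reactive type described in the first bullet can be sketched in a few lines: a fixed rule table maps each state to an action, with no memory, so identical inputs always yield identical outputs. The states and actions below are invented for illustration; real reactive systems such as chess engines use far richer evaluation functions.

```python
# A minimal reactive agent: it maps the current state to an action with a
# fixed rule set and keeps no memory, so the same input always produces
# the same output. (States and actions are illustrative assumptions.)

RULES = {
    "opponent_attacks": "defend",
    "opponent_retreats": "advance",
    "board_neutral": "develop_pieces",
}

def reactive_agent(state: str) -> str:
    """Pick an action purely from the current state, with no history."""
    return RULES.get(state, "wait")

print(reactive_agent("opponent_attacks"))  # defend
print(reactive_agent("opponent_attacks"))  # defend (identical input, identical output)
```

Because the agent has no internal state, calling it twice with the same input is guaranteed to give the same answer, which is exactly the static behaviour the bullet describes.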
We've all done something because we had to, not because we wanted to, regarding the
task as uninteresting or tedious. With a computer, however, you never have to be bored.
Take, for example, Dialogflow, a Google subsidiary credited with developing
the Google Assistant. We give this assistant many commands in a single day! The
assistant can do everything from "Ok Google, call mom" to "Ok Google, order sandwiches."
At the same time, we can use this assistant to send out many calendar invitations to people.
All we must do is select a time for the event and enter the guest list. The assistant does
the rest of the work.
Everyone on the guest list will receive an invitation. This is far more convenient than
contacting, texting, or visiting them to invite them to your event.
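The command-routing idea above can be illustrated with a toy intent matcher. Dialogflow's real natural-language understanding is based on trained machine-learning models; the intent names and keyword lists below are purely hypothetical stand-ins.

```python
# Toy sketch of how an assistant might route spoken commands to intents.
# (Keyword matching only; real assistants use ML-based language models.)

INTENTS = {
    "call": ["call", "dial", "phone"],
    "order_food": ["order", "sandwich", "pizza"],
    "schedule_event": ["invite", "calendar", "meeting"],
}

def match_intent(utterance: str) -> str:
    """Return the first intent whose keywords appear in the utterance."""
    words = utterance.lower().split()
    for intent, keywords in INTENTS.items():
        if any(k in words for k in keywords):
            return intent
    return "fallback"

print(match_intent("Ok Google, call mom"))          # call
print(match_intent("please order sandwiches"))      # order_food
```

Anything the matcher cannot place falls through to a "fallback" intent, which is roughly where a production assistant would ask a clarifying question.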
2) Ingestion of Data
One of the most crucial aspects of artificial intelligence is data ingestion. Artificial
intelligence systems must deal with massive volumes of data. Even a tiny company of
approximately 50 employees has massive amounts of data to examine; we can scarcely
imagine the amount of data handled by organizations like Facebook.
Elucify, which is essentially a database of business contacts, is one such application of
artificial intelligence. Elucify operates on a simple premise: give to receive.
The user must first create an account and sign in before the system can access and share
the information of the user's contacts. In exchange, the user receives relevant company
contacts who may become future customers. In other words, Elucify is crowdsourcing this data.
This explains a lot about coaching centers that call your friend first, then you, and then other
friends from the same batch.
An AI, like humans, studies its surroundings, draws inferences, and then interacts with it
appropriately.
However, this is not yet fully achievable; developers and scientists are still working on
systems that realize theory of mind and self-awareness in artificially intelligent systems.
This raises the prospect of an AI system being able to fully mimic the human mind and
behave exactly like a human. To be honest, I'm looking forward to the day when my own AI
system will engage with people in my place, and an asocial me will binge watch some web
series at home.
4) Futuristic
AI-powered systems are programmed to observe and respond to their surroundings. They not
only observe the surroundings and act accordingly, but also keep in mind (isn't it
strange that I just used the word 'mind' for artificial intelligence?) potential future
circumstances.
A self-driving car is a classic example of this trait. Using neural networks, it observes
the speed of the cars on the road and attempts to apply similar speed patterns in traffic.
It feeds data into machine learning algorithms while also observing when and how a car
changes lanes. It attempts to reach a specific target or goal by examining multiple situations.
It is important to remember that such decisions are significantly reliant on the Automated
Driving System (ADS), which is essential to the operation of a self-driving automobile.
In basic terms, ADS is a system that combines multiple systems that a driver employs to
control the car. An automobile can travel with the assistance of ADS without the presence of
a driver inside.
We are all in favour of using AI for our businesses, gaming accounts, and other similar
things. It is now our responsibility to advance AI and perfect it so that governments may
employ it in disaster management.
When fed data from thousands of previous disasters, artificially intelligent systems may
reliably anticipate the future of disasters that may occur.
Today, using artificial intelligence techniques such as these, scientists are studying records
of past earthquakes and related calamities, such as tremors and volcanic eruptions, to
develop a neural network.
The network's mechanism was evaluated on approximately 30,000 occurrences, and the
system's predictions were shown to be more accurate than traditional methodologies.
In May 2019, Cyclone Fani struck Bangladesh and the eastern Indian state of Odisha. The
Indian Meteorological Department forecasted and tracked this storm ahead of time,
evacuating over 1.2 million Odisha citizens and transporting them to cyclone shelters
established on higher ground. This saved many lives, limiting the cyclone's death toll to
72. One can only imagine how many lives AI could save if such forecasts prove accurate
and are properly acted upon.
Chatbots are software programs that converse with a user, through either voice or text, to
address whatever issues they are having. These programs mimic human behaviour while
conversing with a human via an application. Many organizations, such as Swiggy and
Nykaa, have begun to use chatbots for customer assistance.
Swiggy's chatbots handle the issues customers face when their meal is stalled in traffic or
the products they requested are unavailable. The Nykaa chatbot exists to make product
recommendations to users.
I was searching for a conditioner for my curly, frizzy hair yesterday, and this chatbot gave
me some fantastic options that were also within my price range.
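A minimal sketch of such a support chatbot might look like the following. The triggers and canned replies are invented; production bots at companies like Swiggy or Nykaa use trained language models rather than plain string matching.

```python
# Toy rule-based support chatbot: match a trigger phrase in the user's
# message and return a canned reply, falling back to a human agent.
# (Triggers and replies are illustrative assumptions.)

RESPONSES = {
    "order late": "Sorry for the delay! Your rider is stuck in traffic; here is a voucher.",
    "item unavailable": "That product is out of stock. May I suggest an alternative?",
    "recommend": "Based on your profile, here are three options in your price range.",
}

def reply(message: str) -> str:
    msg = message.lower()
    for trigger, answer in RESPONSES.items():
        if trigger in msg:
            return answer
    return "Let me connect you to a human agent."

print(reply("My order late again!"))
```

The fallback branch is the important design choice: a bot that cannot resolve an issue should hand off to a person rather than loop, which matches how these customer-service bots are actually deployed.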
Summary
AI is no longer a sci-fi concept. We are now living alongside it. It has the scope of a child's
imagination.
There are numerous factors that distinguish AI, and people are hard at work improving these
technologies. We now live and breathe artificial intelligence as technology advances (Team,
2019).
Machines with AI skills are no longer a sci-fi concept; they can now achieve things that were
previously only possible in the human brain.
Artificial intelligence, in essence, enables a computer to learn and think (nearly) on its own.
While AI may be taught to process and interpret data, it lacks the cognitive powers of the
human brain.
It all begins with building the AI to achieve a specific purpose. It is then taught using
available data to understand how to best achieve the stated goal. When the AI has learned
enough, it can handle the data on its own. Then, after processing all the data, AI generates
predictions based on what it discovers.
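The learn-then-predict loop described above can be illustrated with a deliberately simple model that refines its estimate as each new observation arrives. This is not how real AI systems are built, only a sketch of the cycle of training, predicting, and improving:

```python
# Sketch of the learn/predict cycle: the model updates itself on each
# observation, and its prediction improves as more data arrives.

class RunningMeanModel:
    """Predicts the mean of everything it has seen so far."""
    def __init__(self):
        self.total = 0.0
        self.count = 0

    def learn(self, value: float) -> None:
        """Fold one new observation into the model."""
        self.total += value
        self.count += 1

    def predict(self) -> float:
        """Best guess for the next value given current knowledge."""
        return self.total / self.count if self.count else 0.0

model = RunningMeanModel()
for observation in [10, 12, 11, 13]:
    model.learn(observation)
print(model.predict())  # 11.5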
It will apply what it has learned to improve its strategy in the future. In other words, it
becomes smarter and smarter. This might prove beneficial or disastrous down the road. For
now, however, it presents a set of benefits and drawbacks that you and your business must
be aware of.
Humans can make mistakes, but machines, if properly programmed, will make fewer
mistakes in certain areas.
Because AI judgments are based on compiled data and created algorithms, errors are
decreased, accuracy is raised, and precision is possible.
Saving time by making quicker selections is always beneficial. This is something AI can
accomplish for you. AI works in conjunction with other technologies to let machines make
choices faster than many human workers. The more decisions AI makes, the more data it has
to draw on for future decisions, which improves the process.
AI (nearly) never sleeps or needs to rest, whereas the human body and mind require rest to
function well. I say "nearly" because, while AI systems can run indefinitely, they still
require regular updates and scheduled downtime for maintenance and optimization.
However, this near-constant availability might have a significant impact on productivity
growth in your firm.
One significant advantage of AI is that it can perform risky tasks that would be exceedingly
dangerous for humans, reducing the hazards of such pursuits.
AI robots, for example, can mine coal, explore the ocean's depths, disarm bombs, and
even enter volcanoes.
Repetitive work and duties continue to be a part of many jobs today, frequently not utilizing
the full potential of human workers. AI can automate repetition in a variety of ways, from
factory operations to email responses.
Simply put, by automating monotonous tasks, you may focus on becoming more productive,
freeing up your time to focus on creativity or other areas that require distinctively human
abilities.
❖ Recognizes Patterns
AI efficiently discovers trends in your data, allowing you to make more accurate forecasts.
When it comes to spotting patterns in words, numbers, or images, artificial intelligence
already outperforms humans. All of this will not only improve your marketing analytics
capabilities, but it will also help you when developing your next digital marketing strategy.
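As a toy illustration of the pattern spotting described above, the sketch below classifies a series of metrics as trending up, down, or flat. Real marketing-analytics models are of course far more sophisticated; the series and thresholds here are invented.

```python
# Sketch of pattern recognition: classify a metric series as trending
# upward, downward, or flat by counting pairwise differences.

def detect_trend(series):
    diffs = [b - a for a, b in zip(series, series[1:])]
    ups = sum(d > 0 for d in diffs)
    downs = sum(d < 0 for d in diffs)
    if ups > downs:
        return "upward"
    if downs > ups:
        return "downward"
    return "flat"

print(detect_trend([100, 104, 103, 110, 115]))  # upward
```

A forecast built on a detected trend is only as good as the pattern behind it, which is why the text pairs pattern recognition with "more accurate forecasts" rather than guarantees.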
Developing better human workflows is one approach to work more efficiently and boost
production, and thus money. In other words, it enables AI and humans to collaborate to their
full potential and has the potential to positively revolutionize the way we operate in the
future.
The greater the amount of data available, the greater the need for AI to make sense of it all in
less time. Artificial intelligence is extremely useful in making sense of the massive amounts
of data that are now available. It could rapidly capture and extract data, but that's not all.
After that, AI interprets and transforms the data.
❖ Reduces employment.
While replacing repetitious occupations and other sorts of work with AI is advantageous to a
corporation, it will surely have an impact on employment. Traditional job responsibilities will
be phased out, resulting in job losses. While this may be perceived as a sign of progress,
workers will be excluded from many previously available career options.
One disadvantage of adopting AI, especially when it comes to content marketing, is its
inability to be creative and imaginative. Current AI systems are excellent at pattern
recognition and creating new material from existing data, but they lack true creativity and the
capacity to think outside the box. Many believe that AI will become more sophisticated in the
coming years, far surpassing human capabilities. However, this assertion is speculative and is
not uniformly acknowledged by specialists. Human wisdom and creative thinking are still
required and highly valued in many businesses around the world.
While AI-enhanced robots can operate quicker and more continuously, they cannot consider
emotion while making decisions. At all times, AI remains exceedingly reasonable and
practical. As a result, it is unable to form ties with humans or build genuine human
connections. Emotions play an important role throughout the buyer's journey, which is why
implementing AI into your digital marketing plan might be difficult. However, certain AI
systems are being created to identify and respond to human emotions to some level.
❖ Ethical Issues
One barrier to employing AI is its inability to incorporate ethics and morality, which are
crucial human characteristics. AI can only draw conclusions and make predictions based on
data and algorithms. In turn, bias may be inherent in the data in some way, whether
conscious or unconscious, and may result in discriminatory output, because AI can focus
only on logical conclusions. Ethical considerations in AI development and deployment are a hot topic of
research and debate, and efforts are underway to create AI systems that may include ethical
standards.
Task automation and the use of additional digital assistants may lead to growing machine
dependency and even human laziness. When we rely on AI, we may use our brains less to
memorize, strategize, and solve problems on our own. If this is not understood, the
consequences for future generations could be devastating. Artificial intelligence has the
potential to be extremely beneficial to everyone in the future if caution is exercised to prevent
it from becoming dangerously sophisticated. At the end of the day, while AI can assist in
decision-making and automate certain jobs, it is up to individuals and businesses to decide
how they use AI technology and whether it leads to greater laziness or greater productivity.
To perform properly, AI systems frequently rely on massive volumes of data. This raises
worries regarding data security and privacy. Unauthorized access, data breaches, and
potential abuse of sensitive information are all risks associated with the large collecting and
analysis of personal data. When AI technologies are engaged, maintaining data privacy
becomes critical.
AI algorithms, particularly deep learning and neural network models, can be complex and
difficult to grasp. Because of this lack of transparency and explainability, it can be
difficult to discern how AI systems reach conclusions or predictions. The "black box" aspect
of AI raises questions of accountability, justice, and bias, since it becomes difficult to
discover and fix potential algorithmic biases or errors.
As AI gets more integrated into numerous systems and processes, there is a rising reliance on
its capabilities.
This reliance can be hazardous if AI systems fail or malfunction. When AI is heavily used in
crucial jobs and decision-making processes, reliability becomes an issue.
Ensuring the resilience and dependability of AI systems, as well as maintaining human
oversight, becomes critical for avoiding potential disruptions and mitigating the risks
associated with reliance on AI technologies.
Transportation, the industry that deals with the movement of goods and people from one
location to another, has gone through several studies, research, trials, and modifications to get
to where it is now. The invention of the steamboat in 1787 was a watershed moment in
transportation history. Previously, people had to rely on animal-drawn carts to go around.
Following that, important achievements that aided the growth of the transportation business
included the invention of bicycles in the early nineteenth century, motor cars in the 1890s,
railroads in the nineteenth century, and airplanes in 1903. Today, the transportation industry
has advanced to the point where cars can navigate and travel without the assistance of
humans. Technological breakthroughs have aided the transportation sector's journey of
innovation and evolution. AI is one such cutting-edge technology that has aided the sector.
Using AI in transportation helps the industry boost passenger safety, reduce traffic congestion
and accidents, reduce carbon emissions, and lower overall financial costs.
AI has long since exceeded its theoretical presence in research labs to become prevalent in
our daily lives, and to a significant extent the technology has succeeded in doing so.
AI use cases in transportation demonstrate why the sector is expanding and why businesses
should utilize the technology. Consider the following use cases:
❖ Autonomous vehicles
The logistics industry in the United States is embracing self-driving trucks to gain
multiple benefits. According to McKinsey research, vehicles deliver 65 percent of all
commodities worldwide.
❖ Transportation management
Congestion is another transportation issue that people experience daily. AI is now poised to
fix this issue as well.
Sensors and cameras placed in the road collect a vast volume of traffic data. This data is then
transferred to the cloud, where it will be analyzed and traffic patterns discovered using big
data analytics and an AI-powered system. Data processing can provide valuable insights such
as traffic forecasting. Important information such as traffic projections, accidents, or road
closures can be communicated to commuters. Furthermore, users can be alerted of the
shortest path to their destination, allowing them to travel without the inconveniences of
traffic. In this way, artificial intelligence can be utilized to not only reduce undesired traffic,
but also to improve road safety and reduce wait times.
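The sensor-to-forecast pipeline above can be sketched with a simple moving average standing in for the big-data analytics. The readings, interval, and capacity threshold below are invented assumptions, not real traffic parameters.

```python
# Sketch: smooth roadside sensor counts with a moving average and raise
# a congestion alert when the forecast crosses a capacity threshold.
# (Readings and threshold are illustrative; production systems use
# cloud-scale analytics and learned models.)

def forecast_next(counts, window=3):
    """Forecast the next interval's vehicle count as a moving average."""
    recent = counts[-window:]
    return sum(recent) / len(recent)

def congestion_alert(counts, capacity=120):
    """True when forecast demand exceeds the road's assumed capacity."""
    return forecast_next(counts) > capacity

readings = [80, 95, 110, 130, 140]   # vehicles per 5-minute interval
print(forecast_next(readings))       # about 126.7
print(congestion_alert(readings))    # True
```

An alert like this is the hook for the commuter-facing features the text mentions: once a congested interval is forecast, the system can push the shortest alternate route before the jam forms.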
❖ Predictions of delay
Flight delays are another major issue confronting air travel today. According to a study
undertaken by experts at the University of California, Berkeley, the estimated cost of aircraft
delays in the United States is 39 billion dollars. Flight delays, in addition to financial loss,
have a detrimental impact on passengers' traveling experiences. Negative flying experiences
can diminish the value of a transportation company, leading to greater client attrition. To
address these difficulties, AI comes to the aid of the aviation sector.
Using data lake technologies and computer vision, the sector can provide great service to
passengers by reducing wait times and improving their journey experience. Because
everything from inclement weather to a technological malfunction can cause aircraft delays,
it is critical to update flight details to passengers ahead of time to avoid excessive wait
times.
❖ Drone taxis
Drone taxis are one of the most intriguing and inventive AI uses in transportation. Pilotless
helicopters provide a novel approach to reducing carbon emissions, alleviating traffic
congestion, and reducing the need for costly infrastructure investment plans. Furthermore,
drone taxis will let passengers arrive at their destinations much faster, reducing commuting
time.
Furthermore, expanding populations have put city planners under intense pressure to ensure
smart urban planning and infrastructure construction while conserving scarce resources.
Drone taxis may be the true solution to all the issues that these municipal officials are
attempting to address. The recent demonstration of an autonomous aerial vehicle in China, in
which 17 passengers experienced smart air mobility for the first time, is an excellent
predictor of such future uses.
Indeed, artificial intelligence has been one of the most remarkable technological discoveries
in human history. Despite all the wonderful inventions we've seen thus far, it's crucial to
remember that we've only scratched the surface of AI, and there's a lot more to come. The
examples of AI applications in transportation presented above are only a sampling of the
possibilities and opportunities that the technology may provide. Consider how incredible and
thrilling an AI-driven future might be.
In recent years, the ever-changing world of technology has significantly transformed how we
generate, distribute, and use energy. While we strive for greater efficiency and renewable
alternatives, the rapid advancement of technology necessitates an increase in energy usage.
Artificial Intelligence (AI), which has found increasing utility acceptance, is one important
player revolutionizing the energy market. AI has the potential to increase efficiency and
reduce energy usage by transforming different elements of operations, management, and
decision-making. Below, we look at the important uses of AI in utilities and the
ramifications for the energy sector.
Let's look at some of the important applications of artificial intelligence in utilities and how
they might help us boost efficiency and offset some of our energy consumption:
The grid is changing, particularly with the introduction of distributed energy resources
(DERs). By evaluating real-time and historical data from modern sensors, communication
technologies, weather patterns, and trends, AI can optimize operations. AI can help scale
from the tens or hundreds of judgments humans make to millions of switching decisions in
shorter time frames, hence ensuring grid stability. For example, AI models trained on
historical data, weather patterns, and pertinent factors can improve energy demand
predictions, offering vital insights for planning and managing resources properly.
Furthermore, AI-powered predictive maintenance solutions can detect equipment failures,
schedule maintenance in advance, and improve energy infrastructure reliability.
To optimize energy use, many systems use AI-driven energy management in buildings. AI
can analyze sensor data, weather forecasts, and occupancy patterns to optimize heating,
cooling, and lighting systems for optimal energy efficiency—for example, only cooling or
heating rooms when they are occupied.
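The occupancy-based optimization described above reduces, in its simplest form, to a control rule like the following. The comfort band and action names are illustrative assumptions; real building-management systems also fold in weather forecasts and learned occupancy patterns.

```python
# Sketch of occupancy-aware climate control: condition a room only when
# it is occupied and outside the comfort band.

COMFORT_LOW, COMFORT_HIGH = 20.0, 24.0   # degrees Celsius (assumed band)

def hvac_action(temp_c: float, occupied: bool) -> str:
    if not occupied:
        return "standby"          # never condition an empty room
    if temp_c < COMFORT_LOW:
        return "heat"
    if temp_c > COMFORT_HIGH:
        return "cool"
    return "hold"

print(hvac_action(18.0, occupied=True))    # heat
print(hvac_action(27.0, occupied=False))   # standby
```

The energy saving comes almost entirely from the first branch: the AI's job in a real system is to predict occupancy well enough that "standby" is chosen as often as possible without sacrificing comfort.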
For many years, advanced algorithms have been used in market trading. AI can evaluate data
and optimize trading tactics in energy trading and pricing markets based on more precise
projections of bulk energy supply. This enables utilities to make informed real-time decisions
while maximizing profitability.
Grid Security and Resilience: sophisticated algorithms are also being employed in several
industries for cybersecurity. Image data from substations and other key equipment can help
AI monitor the grid for potential cybersecurity attacks and anomalies. AI-powered systems
can detect and respond to security breaches in real time, improving grid resilience.
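A minimal stand-in for such monitoring is a statistical outlier test: flag readings that sit far from the historical mean. Real grid-security systems use trained models over image and telemetry data; the z-score rule, threshold, and sample voltages below are illustrative assumptions.

```python
# Sketch of anomaly detection on substation telemetry: flag any reading
# more than `threshold` sample standard deviations from the mean.

from statistics import mean, stdev

def anomalies(readings, threshold=2.0):
    mu, sigma = mean(readings), stdev(readings)
    return [x for x in readings if abs(x - mu) > threshold * sigma]

voltages = [230, 231, 229, 232, 230, 231, 310]   # one suspicious spike
print(anomalies(voltages))   # [310]
```

In a deployed system, a flagged reading would trigger the real-time response the text describes, such as isolating the affected equipment or alerting an operator, rather than merely being printed.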
Chatbots are not going away. Chatbots assist customer care departments in practically every
business by assisting and engaging with customers to either totally resolve their issue or help
find the problem so that a customer service human can swiftly assist.
The integration of AI with utilities and the energy business has the potential to optimize
energy usage, improve grid dependability, and generate a more sustainable and efficient
energy ecosystem. As technology advances, artificial intelligence (AI) is projected to play an
increasingly important role in determining the future of the energy sector. We developed the
AI Data Engine at Awesense to process all forms of utility data using AI. The Awesense AI
Data Engine consumes data sets from several sources and arranges them using our Energy
Data Model (EDM). The EDM provides the foundation for AI deployment in utilities,
allowing for simple and safe access via APIs to easily incorporate other AI technologies.
If you want to apply this strategy, you must understand how AI affects software development
and analyze the changes. These are the features that AI may include into software
development to give highly tailored products to your clients.
AI has had a major and extensive impact on software engineering, and several tools and
approaches are now available to help engineers produce high-quality outcomes. AI enables
software developers to create more effective, dependable, and user-friendly software than
ever before, from design to development to deployment and, eventually, maintenance. It is
apparent that the impact of AI technology on software engineering will only grow in the
future.
AI tools can help engineers write better code by automatically detecting and addressing
defects and difficulties. DeepCode, an AI-powered tool that helps developers detect bugs in
their code before releasing it, is one example; it uses machine learning to analyze code.
Developers can speed up the coding process by automating time-consuming tasks such as
testing, debugging, and even code generation. Google's AutoML technology, for example,
can automatically generate machine-learning models for applications such as image
recognition, natural language processing, and others. According to a Google study, AutoML
can build models that outperform human-designed models in some tasks.
AI can improve the user experience by personalizing programs based on user behaviour and
preferences. Netflix uses artificial intelligence to tailor suggestions, helping viewers
choose relevant titles based on their viewing behaviour. According to a Deloitte
study, personalized recommendations can increase user engagement.
AI can foresee and avoid software failures for your firm by analyzing data from sensors, logs,
and other sources, which is a genuinely fantastic benefit. Predictive maintenance can help
save time and money by recognizing problems before they cause downtime.
If the same software tests are performed each time the source code is amended, repeating
them by hand becomes time-consuming and costly. Crucially, AI improves software testing
here as well: it can be used to generate test cases and perform regression testing across a
variety of technologies.
With the help of these AI tools, you can automate testing and ensure error-free testing.
Appvance and Functionize are two testing systems that use AI and machine learning.
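The regression-testing idea can be sketched as follows: generate a grid of inputs, record a baseline of outputs, and re-check the unit under test after every change. Tools like Appvance and Functionize apply machine learning to generate such cases; the simple grid and the `discount` function here are invented stand-ins.

```python
# Sketch of automated regression testing: generated inputs are run once
# to record a baseline, then every future run is compared against it.

def discount(price: float, qty: int) -> float:
    """Unit under test (hypothetical): 10% off for bulk orders."""
    return price * qty * (0.9 if qty >= 10 else 1.0)

def generate_cases():
    """Stand-in for ML-based case generation: a simple input grid."""
    return [(p, q) for p in (1.0, 9.99, 100.0) for q in (1, 10, 50)]

# Recorded once against a known-good build.
BASELINE = {case: discount(*case) for case in generate_cases()}

def regression_check() -> bool:
    """True while the unit's behaviour matches the recorded baseline."""
    return all(discount(*case) == expected for case, expected in BASELINE.items())

print(regression_check())   # True while behaviour is unchanged
```

The payoff is exactly the one the text names: once the baseline exists, re-running the whole suite after an edit is cheap, so regressions are caught the moment they are introduced.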
Wi-Fi is an excellent example. Wi-Fi was accessible in the late 1990s, but a Wi-Fi router cost
thousands of dollars and could often only be set up by an IT professional.
Wi-Fi is now omnipresent. Wi-Fi is built into nearly every consumer electronic gadget.
Furthermore, prices are much lower than they were previously, and Wi-Fi routers have been
considerably simplified to the point where a non-technical person can set them up. These
advantages are directly related to technological convergence.
For businesses, technological convergence implies that it is easier to engage with clients and
learn more about their purchasing habits. In some circumstances, technology convergence
allows a company to influence a customer's purchase decisions. Some retailers track their
customers' smartphone locations. If a consumer stands in a specific area of the store for a
given amount of time, the retailer may offer the customer a coupon by text message or pop-
up notice for the item they're looking at, further persuading the customer to make a purchase.
People who aren't computer literate are more inclined to adopt the internet and video on
demand if they can access these technologies through their television. TV is comfortable
and nonthreatening: TVs have huge displays, are simple to control, and require almost no
training to use for web access.
The Internet of Things (IoT) and artificial intelligence (AI) are two of the most essential
technological developments of our day. The IoT is a network of physical devices, such as
computers, phones, automobiles, and other appliances, that can interact and collect data.
Artificial intelligence (AI), on the other hand, is the building of machines that can mimic
human intelligence and decision-making abilities. This article looks at the rapid convergence
of IoT and AI, the benefits of combining the two technologies, IoT and AI applications,
challenges, and ethical considerations.
The consequences for the industrial and business-to-business sectors' financial health have
been extensively examined in the literature. Furthermore, as information becomes
increasingly integrated into the goods we buy and use daily, it will eventually have an impact
on all of us in a variety of ways.
In artificial intelligence and machine learning, computer systems classify data and make
real-world decisions; robotics involves manipulating real-world objects. Combining robots
with machine learning thus enables control of the physical world.
Although there is concern that these types of robots may replace workers in the
manufacturing business, these robots could coexist with humans as "cobots," partnering with
them rather than replacing their employment.
Data is distributed across multiple computers and encrypted, allowing the creation of
extremely dependable, impenetrable databases that can only be accessed and edited by
authorized personnel.
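The tamper evidence described above comes from chaining cryptographic hashes: each block stores the hash of its predecessor, so editing any record invalidates everything after it. This sketch omits the consensus, signatures, and distribution that real blockchains add on top.

```python
# Minimal hash chain: each block's hash covers its data plus the previous
# block's hash, so any edit to a stored record breaks validation.

import hashlib

def make_block(data: str, prev_hash: str) -> dict:
    digest = hashlib.sha256((prev_hash + data).encode()).hexdigest()
    return {"data": data, "prev_hash": prev_hash, "hash": digest}

def chain_valid(chain) -> bool:
    """Check every link: stored hashes must match recomputed ones."""
    for prev, block in zip(chain, chain[1:]):
        recomputed = hashlib.sha256(
            (block["prev_hash"] + block["data"]).encode()
        ).hexdigest()
        if block["prev_hash"] != prev["hash"] or block["hash"] != recomputed:
            return False
    return True

genesis = make_block("genesis", "0" * 64)
chain = [genesis, make_block("record A", genesis["hash"])]
print(chain_valid(chain))        # True
chain[1]["data"] = "record B"    # tamper with a stored record
print(chain_valid(chain))        # False
```

This is why the text can describe such databases as "impenetrable" in the edit-detection sense: an attacker cannot quietly alter one record without recomputing, and therefore visibly changing, every later block.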
Humans may find it difficult to comprehend AI decisions on occasion. This is due to their
ability to examine a wide range of parameters on their own and "learn" which ones are
critical to the current goal-oriented task.
AI algorithms, for example, are projected to be used more frequently to detect fraudulent
financial transactions.
As a result, blockchains are ideal for storing incredibly confidential and sensitive data that,
when handled carefully, has the potential to substantially improve and simplify our lives.
Consider current healthcare systems that deliver precise diagnoses based on our scans and
medical records, or even Netflix or Amazon's recommendation algorithms.
Entertainment
Virtual reality headsets allow thrill seekers to experience the potentially fatal speeds and
heights of roller coasters, for example, by immersing them in virtual environments that
create the sense of really being there. Though virtual reality is already a cutting-edge
element of the entertainment business, the addition of artificial intelligence will make it
much more engaging.
AI-driven characters would be able to react to gamers in real time, taking gaming to a
whole new level by increasing its excitement and engagement.
Airlines, hotels, resorts, amusement parks, and well-known tourist destinations use
technology to give potential customers a sample of their experiences. Virtual reality (VR), for
example, allows travellers to experience what it's like to travel or stay at a resort. It allows
visitors to visually tour a resort's hotel rooms, restaurants, and swimming pools before
arriving, bringing images and descriptions to life. Customers will be able to have a more
dynamic vacation experience if AI is applied.
1) Emerging technologies
2) Political influences on AI
3) Economic influences on AI
4) AI's social impact
5) Political considerations affecting blockchain
6) Economic variables that influence blockchain
7) Social considerations affecting blockchain
8) Political influences on cloud computing
9) Economic influences on cloud computing
10) Social influences on cloud computing
11) Emerging technology demonstration
12) Conclusion
❖ Emerging Technologies
Emerging technologies are new and imaginative breakthroughs in a range of sectors, most
often in the scientific and technological fields, that have the potential to dramatically impact
many industries, societies, and our way of life. These technologies, which are still in the
research and development stages, can bring about substantial improvements by tackling
complicated issues, improving present methods, or opening entirely new possibilities.
Emerging technologies push the boundaries of what is now known and feasible, with the
potential to dramatically alter economies, cultures, and daily life.
It is well known that both lobbyists and extremists have utilized Facebook to spread false
information. One of the best examples of this is the rise in false accounts and fake news on
Facebook following President Rodrigo Duterte's election in the Philippines in 2016. Another
example is Facebook advertising efforts, which, given the massive quantity of data collected
on Facebook's network, are almost aggressively targeted to "suit" the personalities of the
people they are targeting.
AI will affect politics in the coming years. To increase user protection on platforms that
incorporate AI technology, further safeguards should be added to existing laws and
guidelines.
Artificial intelligence's influence on jobs and the global economy is now "inarguable."
According to a survey published in London, artificial intelligence (AI) could replace 10
to 30% of occupations in industrialized countries by the middle of this century. According to
the authors, this is corroborated by employment patterns in the Silicon Valley region. Last
year, the US lost 57,000 jobs overall, but Silicon Valley added 155,000. This is an 80%
increase.
Social considerations influence the development, adoption, and effect of artificial intelligence
(AI) technology. These factors are inextricably linked to how artificial intelligence is
perceived, accepted, and integrated into society.
Political Aspects:
Governments and regulatory organizations use laws and regulations to affect blockchain
adoption. Regulations, depending on how they are implemented, can either assist or inhibit
blockchain innovation. Legal framework clarity is critical for encouraging blockchain
development.
Regulation of Cryptocurrencies:
Many blockchains are linked to digital currency. Political decisions about cryptocurrency
legality, taxes, and classification can have a big impact on the blockchain ecosystem.
Economic Considerations:
❖ Cost effectiveness:
Cost savings are a major economic driver of cloud adoption. Companies weigh the
economic benefits of migrating from traditional on-premises infrastructure to cloud-based
services, considering factors such as hardware costs, maintenance, and scalability.
❖ Market Competition:
Economic competition among cloud service providers shapes service offerings, pricing
models, and innovation. Organizations select providers based on considerations such as
pricing, dependability, performance, and service availability.
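As a rough illustration of the cost comparison above, the sketch below contrasts an assumed on-premises outlay with an assumed cloud subscription over a three-year horizon. All figures and function names are invented for the example, not real prices.

```python
# Illustrative sketch: comparing on-premises vs. cloud costs over a planning
# horizon. All figures below are assumed example values, not real prices.

def on_prem_cost(hardware: float, yearly_maintenance: float, years: int) -> float:
    """Total cost of owning hardware: upfront purchase plus maintenance."""
    return hardware + yearly_maintenance * years

def cloud_cost(monthly_fee: float, years: int) -> float:
    """Total cost of a pay-as-you-go cloud subscription."""
    return monthly_fee * 12 * years

if __name__ == "__main__":
    years = 3
    on_prem = on_prem_cost(hardware=50_000, yearly_maintenance=8_000, years=years)
    cloud = cloud_cost(monthly_fee=1_500, years=years)
    print(f"On-premises over {years} years: ${on_prem:,.0f}")
    print(f"Cloud over {years} years: ${cloud:,.0f}")
```

With these assumed numbers the subscription comes out cheaper, but in practice the outcome depends heavily on usage patterns and scalability needs.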
Robotics is the emerging technology that I chose, since it may help us in a variety of
situations in the future.
The focus of robotics engineering is the development, conceptualization, building, use, and
application of robots. A robot is described as a mechanical device that is operated
automatically and can do a variety of tasks that are usually performed by humans.
Artificial intelligence (AI) developments have resulted in the development of a new class of
robots. These devices can observe, detect, process, and react to their surroundings, unlike
traditional robots that carry out preprogrammed tasks.
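The observe-process-react behaviour described above can be sketched as a minimal control loop. Everything here is simulated, and the distance thresholds and action names are assumptions for illustration; a real robot would read hardware sensors and drive actuators instead.

```python
# Minimal sketch of a sense-decide-act cycle. Sensor readings are simulated
# as a list of distances (in metres); thresholds are assumed example values.

def sense(distance_readings):
    """Return the next simulated distance-to-obstacle reading."""
    return distance_readings.pop(0)

def decide(distance: float) -> str:
    """Choose an action from the perceived distance to the nearest obstacle."""
    if distance < 0.5:
        return "stop"
    if distance < 2.0:
        return "slow_down"
    return "move_forward"

def run_cycle(readings):
    """Run one sense-decide-act pass per reading and log the chosen actions."""
    actions = []
    while readings:
        actions.append(decide(sense(readings)))
    return actions

print(run_cycle([5.0, 1.2, 0.3]))  # ['move_forward', 'slow_down', 'stop']
```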
Even if we do not see or interact with robots on a regular basis in the next five years, their
presence will eventually be essential for the proper operation of our homes and workplaces.
Though robotics and AI have long been studied, both research and real-world application
have advanced rapidly over the last 20 years. The fear that these machines will eventually
replace us in the workforce has subsided.
Instead, robots and AI collaborate with humans to complete tasks more efficiently and
accurately than we could alone. Thanks to advances in AI, we can now design computer
systems that learn and adapt over time. The adoption of tools such as Alexa, Siri, Google
Assistant, and others has greatly altered how we engage with computers and AI in our daily
lives.
❖ Benefits
❖ Savings on expenses:
While developing and implementing AI robots may involve some upfront expenses, doing so
may result in long-term cost benefits by cutting labour costs, reducing errors, and optimizing
resource consumption.
❖ Security:
Artificial intelligence robots can be utilized in high-risk or dangerous environments, reducing
the risk to human workers. They can work in disaster-affected areas such as chemical plants
or nuclear reactors.
❖ Consistency:
AI robots execute tasks uniformly, eliminating the variation that can occur with human
labour. This consistency is critical in areas such as pharmaceuticals, where precise
formulations are required.
Features
❖ Sensory Awareness:
Many AI robots have sensors like cameras, microphones, and touch sensors that allow them
to detect and interact with their surroundings. Object recognition, speech recognition, and
navigation are all possible with these sensors.
❖ Autonomy:
AI robots can operate independently, meaning they can make judgments and take actions
without direct human intervention. This autonomy is especially important for applications
such as self-driving cars and drones.
❖ Task Flexibility:
AI robots are frequently designed to handle a wide variety of jobs. They can be programmed
or trained to perform anything from simple repetitive actions to complicated problem
solving.
❖ Problem-Solving Skills:
AI robots can use advanced algorithms to solve problems, make decisions, and plan. This
enables them to do complex tasks including route planning, optimization, and decision
support.
❖ Connectivity:
Many artificial intelligence (AI) robots are built to link to networks and other devices,
allowing them to access and exchange data, get updates, and coordinate with other robots or
systems. This connection is required for tasks such as remote monitoring and control.
❖ Safety precautions:
Safety characteristics are essential for AI robots, especially when they interact with humans.
Collision avoidance, emergency stop methods, and fail-safe protocols are examples of such
features.
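The collision-avoidance and emergency-stop behaviour described above can be illustrated as a simple command filter that overrides unsafe requests. The threshold, command names, and function signature are assumptions made for this sketch.

```python
# Illustrative fail-safe filter combining collision avoidance with an
# emergency stop. The threshold and command names are assumed example values.

SAFE_DISTANCE_M = 0.5  # assumed minimum clearance before forcing a halt

def next_command(requested: str, obstacle_distance: float,
                 estop_pressed: bool) -> str:
    """Override the requested command whenever a safety condition triggers."""
    if estop_pressed:
        return "emergency_stop"  # human override always wins
    if obstacle_distance < SAFE_DISTANCE_M:
        return "halt"            # collision avoidance takes priority
    return requested             # safe to proceed

print(next_command("move_forward", 3.0, False))  # move_forward
print(next_command("move_forward", 0.2, False))  # halt
print(next_command("move_forward", 3.0, True))   # emergency_stop
```

Ordering the checks so the emergency stop is evaluated first mirrors the fail-safe principle: the most conservative action always wins.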
❖ Energy Conservation:
AI robots are frequently intended to be energy-efficient, as long battery life or efficient power
utilization is critical to their operation, particularly in applications where they must run for
lengthy periods of time.
❖ Why risk human life when a robot can do the job? Consider how much more effective
a robot would be at fighting a fire or maintaining a nuclear reactor core.
Disadvantages and Future Outlook
As sensor technology improves and machine learning and artificial intelligence continue
their remarkable advances, robots will keep evolving from rote machines into collaborators
with cognitive abilities. Progress in these and allied fields will have a large positive impact
on robotics.
Although the number of people required to physically weld automotive frames will fall, the
number of qualified professionals required to program, maintain, and repair the machinery
will increase. This often means staff receive beneficial internal training and upskilling,
giving them skills, such as programming and maintaining robots, that they can also apply in
professions and sectors beyond robotics.
Robotics will boost output and economic expansion while creating new work opportunities
for a huge number of people worldwide. There are still warnings about huge job losses, such
as projections that 20 million manufacturing jobs could be lost by 2030 or that 30% of all
employment could be automated by then.
Because of their very high levels of precision, robots will certainly take on more repetitive,
difficult manual labour activities, which will improve transportation, healthcare, and
individuals' potential to better themselves. Of course, how everything plays out will only
become obvious over time.
Emerging technologies are fast changing the world we live in; for decades, the idea of
sentient robots has captivated and stimulated our imaginations. Ideas regarding robots that
were once deemed science fiction are now being applied in a variety of enterprises.
Businesses are utilizing AI-powered robots to bridge the gap between humans and
technology, solve problems, and adjust business models to shifting customer expectations.
The first step is to evaluate the reliability of the source from which you obtained the
research findings. Is it a well-known university, a recognized research group, or a
trustworthy news outlet? Peer-reviewed academic papers are generally reliable sources of
scientific research. Check whether the research has been peer reviewed: experts in the field
assess the quality and validity of peer-reviewed studies, and this process helps ensure the
accuracy of the results. Analyze the research methodology used in the study. Is it sound
and well-supported? Sample size, data-gathering techniques, statistical analysis, and any
potential biases should all be considered.
Analyze the research data and the conclusions drawn from it. Are the results supported by the
data and consistent with the study's goals and assumptions?
Pay attention to the date of publishing. Because AI and robots are ever-changing topics, older
studies may not adequately reflect current discoveries and trends.
Cross-Reference: Compare the findings to those of other trustworthy sources and relevant
research studies. Are the results from different sources consistent, or do they differ?
Author expertise: Consider the qualifications of the study's authors and researchers. Do
they have experience in the field being examined, such as robotics or AI?
Conflicts of Interest and Bias: Be aware of any potential biases or conflicts of interest that
may influence the research's outcomes. Open disclosure of any conflicts of interest
demonstrates research integrity.
Discussion and conclusion: Review the study's discussion and conclusion sections. Do the
authors provide a fair appraisal of the data, considering any potential implications and
promising directions for further research? Evaluating study findings is a critical step in
ensuring that you are using correct and relevant information to make decisions or draw
conclusions. It is also important to keep up with the most recent discoveries and
breakthroughs in the rapidly evolving fields of AI and robotics.
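To make the evaluation checklist above concrete, here is a small sketch that scores a study against it. The criteria names and equal weighting are my own illustration for this example, not a standard methodology.

```python
# Sketch: scoring a study against the source-evaluation checklist.
# Criteria names and equal weighting are illustrative assumptions.

CRITERIA = [
    "peer_reviewed",
    "sound_methodology",
    "recent_publication",
    "cross_referenced",
    "qualified_authors",
    "conflicts_disclosed",
]

def credibility_score(checks: dict) -> float:
    """Fraction of checklist criteria the study satisfies (0.0 to 1.0)."""
    return sum(1 for c in CRITERIA if checks.get(c, False)) / len(CRITERIA)

# Example: a hypothetical study satisfying four of the six criteria.
study = {
    "peer_reviewed": True,
    "sound_methodology": True,
    "recent_publication": True,
    "cross_referenced": False,
    "qualified_authors": True,
    "conflicts_disclosed": False,
}
print(f"Credibility score: {credibility_score(study):.2f}")  # 0.67
```

A real assessment is of course qualitative; the point of the sketch is only that each checklist item can be applied systematically rather than ad hoc.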
Emerging Technology Benefits
❖ Enhanced effectiveness
❖ Better connectivity
❖ Creativity and invention
❖ Better data insights
❖ Personalization
❖ Sustainability
❖ Medical advances
❖ Global connectivity
❖ Smart homes, industrial IoT, and wearable gadgets are just a few examples.
❖ Benefits include faster data rates, lower latency, and IoT support.
Emerging Technology Drawbacks
❖ Data privacy: The collection and use of large amounts of personal data raises
privacy concerns.