
1. Briefly explain the mapping of human thinking to artificial intelligence components.

Explanation:
The structure of the brain, the connectome, does not translate directly into human reasoning; no single physical component of the brain maps neatly onto a reasoning function, much as the software running on a virtual machine does not map one-to-one onto the hardware beneath it. So, for now, an exact mapping is impossible.

Current computer hardware also cannot support the human way of thinking. Still, because we are on a clear path of scientific discovery, we can expect to gain this knowledge at some point in the future; it may take another 50 to 200 years to fully master it. These two points are directly connected: although we can build some cybernetic brains, we cannot make computers think like humans before we fully understand how the human mind works. In practice, though, we do very well at taking advantage of things we do not fully understand.
2. What’s the difference between augmented, virtual and mixed reality? How are augmented,
virtual and mixed reality achieved?
I have come across this question in various meetings and conversations. Although similar in many
ways, the so-called immersive technologies, represented by augmented reality (AR), virtual reality
(VR) and mixed reality (MR), have important differences in what they offer.

The most famous is virtual reality, largely because of the strong symbolism of the special glasses
used to access it. The most accessible and scalable, and the one expected to become increasingly
popular, is augmented reality: it requires only a smartphone, currently available to 60% of Brazilians
and 80% of Europeans. Mixed reality, on the other hand, is the least known; it can be considered a
blend of AR and VR and is accessed through special equipment such as Microsoft’s HoloLens glasses.

Let’s look at these three technologies in more detail.

Virtual Reality (VR)

It provides a totally immersive virtual experience, moving the user from the real world into one
entirely simulated by computer. It is possible, for example, to see yourself inside a game or take
sightseeing tours around the globe without physically leaving home. In the beginning, its use was
concentrated in the games industry; over time it has also been adopted in several other sectors such
as health, education, engineering, events and tourism.

The experience takes place through special glasses that either work standalone, like Facebook’s
Oculus Quest, or pair with a compatible smartphone, like the Samsung Gear VR. A quick web search
turns up several VR glasses options. For those looking for more affordable equipment, Google
Cardboard can be found for just US$15 online: the user assembles the cardboard glasses by hand
and then attaches a cell phone compatible with virtual reality applications.

VR applications are already used in different sectors and for the most diverse purposes. To name a
few examples: doctors preparing for surgical procedures, soldiers in combat simulations, athletes in
competition training, and children, youth and adults in learning processes, among others.

Augmented Reality (AR)

Augmented reality became popular all over the world thanks to the Pokémon GO phenomenon, the
best-known AR application. Unlike VR, augmented reality inserts virtual elements, such as images,
videos, 3D objects, games and external links, into real environments. It is accessible through a
smartphone or tablet compatible with AR applications, or through special glasses, which should start
reaching the market in greater numbers next year. Apple is expected to announce the 2021 launch of
Apple Glass by the end of this year, and technology giants like Google and Facebook should make
announcements soon too. Other players will surely emerge, offering not only glasses but also AR
contact lenses. It has been speculated that Apple’s glasses will hit the market at prices starting at
US$500; however, the imminent increase in competition should drive prices down and so broaden
people’s access to this equipment.

Augmented reality goes far beyond entertainment. It is a very useful resource that generates
important positive impacts on strategic indicators. The cosmetics brand L’Oreal launched
its TryOn campaign allowing consumers to try different products virtually; the result was an 80%
conversion into sales. The retail chain IKEA has also had great success with its IKEA Place app,
which lets customers virtually view store products placed in their homes with 98% accuracy.
Companies from diverse sectors already use AR as an effective tool in sales, production, training
and more.

Mixed reality (MR)

A mixed reality environment goes a step beyond augmented reality as users can interact with virtual
objects as if they were part of the real world, eliminating the need for a smartphone or tablet screen.

To experience mixed reality, it is necessary to use an MR headset that allows interaction with 3D
holograms and recognizes gestures, gaze and voice through motion controllers. The processing
power required for this technology is much greater than that used for virtual or augmented reality,
due to its greater operational complexity.

As the youngest member of the immersive technology family, it is still little known in the market.
However, some companies of different sectors and sizes have already invested in mixed reality
projects. Ford, for example, already uses MR to create prototypes of future vehicles, instead of
building physical prototypes, which are much more expensive and time-consuming to make.

Simply put, the difference between virtual, augmented and mixed reality is:

• Virtual reality (VR): a totally immersive experience in which the user enters a fully digital
environment through special VR glasses.

• Augmented reality (AR): an experience in which virtual objects are superimposed on the real world
environment and can be viewed and accessed using smartphones, tablets or special AR glasses.

• Mixed reality (MR): allows interaction with 3D holograms in real environments and thus provides
users with an immersive experience that mixes the real and the virtual in a fully integrated way.

3. Briefly explain the following terms

A. Blockchain technology
Blockchain technology is an advanced database mechanism that allows transparent information
sharing within a business network. A blockchain database stores data in blocks that are linked
together in a chain. The data is chronologically consistent because you cannot delete or modify
the chain without consensus from the network. As a result, you can use blockchain technology to
create an unalterable or immutable ledger for tracking orders, payments, accounts, and other
transactions. The system has built-in mechanisms that prevent unauthorized transaction entries
and create consistency in the shared view of these transactions.

Why is blockchain important?

Traditional database technologies present several challenges for recording financial transactions.
For instance, consider the sale of a property. Once the money is exchanged, ownership of the
property is transferred to the buyer. Individually, both the buyer and the seller can record the
monetary transactions, but neither source can be trusted. The seller can easily claim they have
not received the money even though they have, and the buyer can equally argue that they have
paid the money even if they haven’t.

To avoid potential legal issues, a trusted third party has to supervise and validate transactions.
The presence of this central authority not only complicates the transaction but also creates a
single point of vulnerability. If the central database was compromised, both parties could suffer.

Blockchain mitigates such issues by creating a decentralized, tamper-proof system to record


transactions. In the property transaction scenario, blockchain creates one ledger each for the
buyer and the seller. All transactions must be approved by both parties and are automatically
updated in both of their ledgers in real time. Any corruption in historical transactions will corrupt
the entire ledger. These properties of blockchain technology have led to its use in various sectors,
including the creation of digital currency like Bitcoin.
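The idea of blocks linked together by hashes, where altering any historical record invalidates everything after it, can be sketched in a few lines of Python. This is a toy illustration, not a real blockchain: the block fields and function names are invented for this example, and there is no network or consensus mechanism.

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents deterministically.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def build_chain(records):
    """Link each record to the hash of the previous block."""
    chain = []
    prev = "0" * 64  # placeholder hash for the genesis block
    for rec in records:
        block = {"data": rec, "prev_hash": prev}
        chain.append(block)
        prev = block_hash(block)
    return chain

def is_valid(chain):
    """Re-derive every hash; any tampering breaks the links."""
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        prev = block_hash(block)
    return True
```

Modifying any earlier block changes its hash, so the next block's `prev_hash` no longer matches and `is_valid` reports the corruption, which is exactly the "corrupt one record, corrupt the ledger" property described above.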

B. Nanotechnology

Nanotechnology is the term given to those areas of science and engineering where phenomena
that take place at dimensions in the nanometre scale are utilised in the design, characterisation,
production and application of materials, structures, devices and systems. Although in the natural
world there are many examples of structures that exist with nanometre dimensions (hereafter
referred to as the nanoscale), including essential molecules within the human body and
components of foods, and although many technologies have incidentally involved nanoscale
structures for many years, it has only been in the last quarter of a century that it has been possible
to actively and intentionally modify molecules and structures within this size range. It is this
control at the nanometre scale that distinguishes nanotechnology from other areas of technology.

Clearly the various forms of nanotechnology have the potential to make a very significant
impact on society. In general it may be assumed that the application of nanotechnology will be
very beneficial to individuals and organisations. Many of these applications involve new
materials which provide radically different properties through functioning at the nanoscale,
where new phenomena are associated with the very large surface area to volume ratios
experienced at these dimensions and with quantum effects that are not seen with larger sizes.
These include materials in the form of very thin films used in catalysis and electronics, one-
dimensional nanotubes and nanowires for optical and magnetic systems, and nanoparticles used in
cosmetics, pharmaceuticals and coatings. The industrial sectors most
readily embracing nanotechnology are the information and communications sector, including
electronic and optoelectronic fields, food technology, energy technology and the medical
products sector, including many different facets of pharmaceuticals and drug delivery systems,
diagnostics and medical technology, where the terms nanomedicine and bionanotechnology are
already commonplace. Nanotechnology products may also offer novel opportunities for the
reduction of environmental pollution.

C. Cloud and quantum computing

Cloud computing:- Cloud computing is a means of networking remote servers that are hosted
on the Internet. Rather than storing and processing data on a local server, or a PC’s hard drive,
one of the following three types of cloud infrastructure is used.

The first type is a public cloud. Here a third-party provider manages the servers, applications,
and storage much like a public utility. Anyone can subscribe to the provider’s cloud service,
which is usually operated through their own data center.
A business or organization would typically use a private cloud. This might be hosted on their
onsite data center, although some companies host through a third-party provider instead. Either
way, the computing infrastructure exists as a private network accessible over the Internet.
The third option is a hybrid cloud. Here private clouds are connected to public clouds, allowing
data and applications to be shared between them. Private clouds existing alone can be very
limiting, and a hybrid offers a business more flexibility.
Cloud computing services can focus on infrastructure, web development or a cloud-based app.
These are often regarded as a stack; all are on-demand, pay-as-you-go. Infrastructure as a Service
(IaaS) gives you management of the whole deal: servers, web development tools, applications.
Platform as a Service (PaaS) offers a complete web development environment, without the worry
of the hardware that runs it. Finally, Software as a Service (SaaS) allows access to cloud-based
apps, usually through a web browser interface. SaaS is the top of the stack. Cloud computing has
been around since the early 2000s.
Quantum computing:- Quantum computers truly represent the next generation of computing.
Unlike classical computers, they derive their computing power by harnessing quantum physics.
Because of the rather nebulous science behind it, a practical, large-scale quantum computer still
remains out of reach.
Give clients access to a quantum computer over the internet, and you have quantum cloud
computing. Currently, the only organization which provides a quantum computer in the cloud is
IBM. They allow free access to anyone who wishes to use their 5-qubit machine. Earlier this year
they installed a 17-qubit machine. So far over 40,000 users have taken advantage of their online
service to run experiments.
Not to be outdone, Google demonstrated the fastest quantum computer to date: a 53-qubit machine
that completed in 200 seconds a computation estimated to take a classical supercomputer 10,000 years.
So, what is qubit and how many do you need?
Qubit is short for quantum bit. With a classical computer, data is stored in tiny
transistors that hold a single bit of information, either the binary value of 1 or 0. With a quantum
computer, the data is stored in qubits. Thanks to the mechanics of quantum physics, where
subatomic particles obey their own laws, a qubit can exist in two states at the same time. This
phenomenon is called superposition.
So, a qubit can represent 1, 0, or a superposition of both, and two qubits together can hold even
more values.
Before long, you are building yourself an exponentially more powerful computer the more qubits
you add.
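The superposition idea can be illustrated with a small state-vector simulation in NumPy. This is a sketch of the underlying math, not a real quantum device: a qubit is a two-component complex vector, the Hadamard gate creates an equal superposition, and combining qubits multiplies the number of amplitudes.

```python
import numpy as np

# A single qubit is a 2-component complex state vector.
zero = np.array([1, 0], dtype=complex)   # the |0> state

# The Hadamard gate puts a qubit into an equal superposition.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ zero                 # (|0> + |1>) / sqrt(2)
probs = np.abs(state) ** 2       # measurement probabilities: 50% / 50%

# Two qubits: the joint state has 2**2 = 4 amplitudes,
# which is why capacity grows exponentially with qubit count.
two_qubits = np.kron(state, state)
```

Each added qubit doubles the number of amplitudes in the state vector, which is the exponential growth the paragraph above refers to.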
Quantum computer theory was first rooted in the 1980s and only now are the first rudimentary
machines being constructed. Quantum computers are big machines, reminiscent of the old
mainframe computers of the 1960s. One serious logistical problem is the need to deep-freeze the
superconducting circuits: only at temperatures near absolute zero can the qubits maintain a
constant, predictable superposition, and heating them up results in calculation errors.

D. Autonomic computing and some of its characteristics


Autonomic computing is a computer’s ability to manage itself automatically through adaptive
technologies that extend computing capabilities and cut down on the time computer professionals
must spend resolving system difficulties and other maintenance such as software updates.
The move toward autonomic computing is driven by a desire for cost reduction and the need to
lift the obstacles presented by computer system complexities to allow for more advanced
computing technology.

Autonomic computing (AC) refers to distributed computing resources with self-managing
characteristics, adapting to unpredictable changes while hiding intrinsic complexity from
operators and users. Initiated by IBM in 2001, this initiative ultimately aimed to develop
computer systems capable of self-management, to overcome the rapidly growing complexity of
computing systems management, and to reduce the barrier that complexity poses to further
growth.
The AC system concept is designed to make adaptive decisions, using high-level policies. It will
constantly check and optimize its status and automatically adapt itself to changing conditions. An
autonomic computing framework is composed of autonomic components (AC) interacting with
each other. An AC can be modeled in terms of two main control schemes (local and global)
with sensors (for self-monitoring), effectors (for self-adjustment), knowledge and
planner/adapter for exploiting policies based on self- and environment awareness. This
architecture is sometimes referred to as Monitor-Analyze-Plan-Execute (MAPE).
There are four areas or characteristics of Autonomic Computing as defined by IBM. These
are as follows:

Self-Configuration
The system must be able to configure itself automatically according to changes in its environment.
It should be able to add new components, configure existing ones, remove old or faulty components,
and reconfigure itself on its own, without human intervention (or with very little human assistance).

Self-Healing
IBM notes that an autonomic system must be able to repair itself after errors and route functions
away from trouble whenever problems are encountered. It should be able to identify faulty
components, diagnose them using a corrective mechanism, and heal itself without harming any
other components of the system.

Self-Optimization
According to IBM, an autonomic system must be able to perform in an optimized manner and
ensure that it follows efficient algorithms for all computing operations. This capability is also
known as the self-adjusting or self-tuning property of an autonomous system. Resource
utilization and workload management are both aspects of this characteristic.

Self-Protection
IBM states that an autonomic system must be able to perform detection, identification, and
protection from security and system attacks so that systems’ security and integrity remain intact.
Autonomic systems have the ability to detect hostile behaviours and then take several types of
corrective actions and measures to protect themselves from any sort of attack.
E. Computer vision and some real-world applications
Humans have eyes that see and a brain that, through abstract knowledge built up from
interactions and personal experience, can make sense of much of what they see. Before the
introduction of computer vision, computers lacked this capability.
Computer vision is a subfield of Artificial Intelligence that trains machines to understand the
visual world. With the help of machine learning and deep learning models, computers are taught
to observe an image, identify and classify the objects in it, and react appropriately.
How does Computer Vision work?
As we have said before, computer vision allows computers to “see” things and understand them,
that is to derive meaningful information from them. Computer vision-enabled devices may
evaluate visual data and make judgments based on it or comprehend their surroundings and
situations.

Using cameras, data, and algorithms rather than retinas, optic nerves, and a visual brain,
computer vision teaches computers to carry out these tasks in considerably less time. A system
trained to check items or monitor a production asset can swiftly outperform humans since it can
examine hundreds of products or processes per minute while spotting undetectable flaws or
problems.
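The core pattern, turning pixel values into a decision, can be sketched without any deep learning at all. This is a deliberately simplified example with an invented `find_bright_defects` helper; real inspection systems use trained models and libraries such as OpenCV, but the flow from image data to judgment is the same.

```python
import numpy as np

def find_bright_defects(image, threshold=200):
    """Flag pixels brighter than `threshold` in a grayscale image.

    `image` is a 2-D array of 0-255 intensities; the threshold
    value here is an assumption chosen for the demo.
    """
    mask = image > threshold
    return int(mask.sum()), np.argwhere(mask)

# A tiny 4x4 "image" with two unusually bright pixels,
# standing in for flaws on a production line.
frame = np.array([
    [10,  12,  11, 13],
    [12, 250,  11, 12],
    [10,  11, 240, 13],
    [13,  12,  11, 10],
])
count, locations = find_bright_defects(frame)
```

Scaling this decision up to hundreds of frames per minute is what lets a trained vision system outperform a human inspector, as the paragraph above describes.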

Applications of Computer Vision:

1. Healthcare:
 MRI scanning
 X-ray analysis
 Cancer detection
 Blood loss measurement
 Movement analysis

2. Transportation:
 Self-driving cars
 Obstruction detection
 Traffic flow analysis

3. Face detection
4. Screen Reading
5. Manufacturing industry
 Defect detection
 Barcode reading

6. Agriculture
7. Machine vision
8. Construction
 Predictive maintenance
 PPE detection

F. Embedded system and its components


As its name suggests, Embedded means something that is attached to another thing. An
embedded system can be thought of as a computer hardware system having software embedded
in it. An embedded system can be an independent system or it can be a part of a large system. An
embedded system is a microcontroller- or microprocessor-based system designed to perform a
specific task. For example, a fire alarm is an embedded system: it senses only smoke.
An embedded system has three components −
 It has hardware.
 It has application software.
 It has a Real-Time Operating System (RTOS) that supervises the application software and
provides a mechanism to let the processor run processes as scheduled, following a plan to
control latencies. The RTOS defines the way the system works and sets the rules during
execution of the application program. A small-scale embedded system may not have an RTOS.
So we can define an embedded system as a microcontroller-based, software-driven, reliable,
real-time control system.
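The sense-decide-act cycle of an embedded system like the fire alarm above can be sketched as follows. This is a Python simulation purely for illustration; real firmware would be written in C against the microcontroller's registers, and the sensor threshold is an invented value.

```python
SMOKE_THRESHOLD = 0.5  # assumed sensor units, for illustration only

def read_smoke_sensor(samples):
    # In real firmware this would read an ADC register;
    # here we pop simulated readings from a list.
    return samples.pop(0)

def control_loop(samples):
    """Run the sense-decide-act cycle once per reading."""
    events = []
    while samples:
        level = read_smoke_sensor(samples)        # sense
        if level > SMOKE_THRESHOLD:               # decide
            events.append("ALARM")                # act: drive the buzzer
        else:
            events.append("ok")
    return events
```

The single-purpose loop, dedicated sensor input, and fixed actuator response are what make such a system "single-functioned" and "reactive and real-time" in the characteristics listed below.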

Characteristics of an Embedded System


 Single-functioned
 Tightly constrained
 Reactive and real-time
 Microprocessor-based
 Memory
 Connected
 HW-SW systems

4. Briefly explain how IoT can be used in the agriculture and business sectors. Give one
detailed example for each.

What is IoT in agriculture?

Broadly speaking, IoT in agriculture refers to the use of smart devices and sensors to monitor the
farming process, from planting to growing to harvesting and finally to distribution.

By using IoT sensors, farmers are able to collect environmental data (e.g. rainfall, humidity,
pollution, etc.) to make data-driven decisions to improve the different aspects of the farming
process.

One example is sensors monitoring the state of soil and enabling farmers to know the exact
amount of pesticides and fertilizers that they have to inject into it in order to reach optimal
growth efficiency.

What are the examples of IoT in agriculture?

IoT has the potential to transform agriculture in many aspects. In this section, we will look at five
use cases:

1. Data collection

Smart agriculture sensors can monitor weather conditions, soil quality, crop growth progress, or
cattle’s health.

The farmers can then use this data to track the status of crops, gain insight into performance
gaps, equipment efficiency, and future growth projections.


2. Risk mitigation

As we said, data collection can help farmers to make projections about their productions. This
ability to foresee the output amount will make for enhanced product distribution.

If the farmer knows a certain crop’s harvest date, they can schedule the next shipment of seeds
and grains so that once the finished product is out for distribution, the next batch is ready to be
planted immediately.

This lowers production risks, as it helps farmers avoid shortages in production and, consequently,
disruptions in income.

3. Cost management

Sensors also monitor the use of inputs. This means farmers have precise knowledge of their
water consumption, for example.

Knowing scientifically how much water a plant needs for growth, versus the actual amount it is
being fed, teaches farmers to cut down unnecessary irrigation and conserve water.

4. Automation

The use of automated robots in the production cycle makes for increased efficiency and
decreased operational costs, because robots take care of repetitive tasks such as irrigation,
fertilization, pest control, seed planting, and more.

5. Enhanced product quality

Quality control is already an aspect of IoT in manufacturing, where thermal and other related
sensors constantly monitor the quality metrics of produced goods and compare them against
programmed tolerances to trigger immediate quality control.

The same technology could be adopted in agriculture, where embedded sensors in the crops’
environment could scan them and make sure their quality is constant across the board. This can
be done by monitoring leaves’ color or root strength.

What are some IoT devices in agriculture?

Some of the IoT devices used in agriculture are:

1. Irrigation sensors

Irrigation sensors are smart agriculture devices that control the sprinklers’ functionality. These
sensors monitor the dryness of the soil and operate the sprinklers accordingly, minimizing the
risk of under- or over-watering.
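The decision logic such a sensor might use can be sketched as follows. This is a simplified illustration, not any vendor's actual algorithm: the moisture thresholds and the hysteresis scheme are assumptions made for the example.

```python
DRY = 30   # assumed moisture % below which soil is too dry
WET = 60   # assumed moisture % above which soil is wet enough

def sprinkler_command(moisture, running):
    """Decide the sprinkler state from a soil-moisture reading.

    The gap between DRY and WET (hysteresis) avoids rapid
    on/off cycling and guards against both under- and
    over-watering.
    """
    if moisture < DRY:
        return True          # turn/keep sprinklers on
    if moisture > WET:
        return False         # turn/keep sprinklers off
    return running           # in between: keep current state
```

Polling the soil sensor and feeding each reading through this function is the whole control loop: the device waters only when the soil actually needs it.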

2. Soil sensors

Soil sensors can be placed in the field where the crops are growing to provide the farmer with
specific data, from temperature and precipitation to overall crop health. This frees farmers to
dedicate their time to more intricate tasks in the production cycle, while the smart devices handle
menial, time-consuming tasks such as monitoring the health of leaves.

3. Drones

Drones are not only for grocery delivery and aerial photography. Some drones are programmed
to spread seeds in deforested areas. These unmanned aerial vehicles can cover expansive areas
faster and more efficiently than a group of farmers on foot.

How can IoT be used in business?


 IoT is all about the enhancement of business processes and solutions with sensors,
devices, gateways and platforms. By gathering all their data in one place, manufacturers
can make intelligent decisions and design more efficient processes.

Top 8 IoT applications and examples in business


1. Connected vehicles
Autonomous vehicles are one of the most notable examples of IoT in action. Self-driving cars
and trucks use a slew of connected devices to safely navigate roadways in all sorts of traffic and
weather conditions. The technologies in use include artificial intelligence (AI)-enabled cameras,
motion sensors and onboard computers.

2. Traffic management
Roadway infrastructure has become more connected in the past decade as well, with cameras,
sensors, traffic light controls, parking meters and even smartphone traffic apps transmitting data
that's then used to help avert traffic jams, prevent accidents and ensure smooth travel.

For example, cameras detect and transmit data about traffic volume to central management
groups that can then analyze the information to determine whether, what and when mitigation
steps must be taken.

3. Smart grids
Utilities are also using IoT to bring efficiency and resiliency to their energy grids.

Historically, energy flowed one way along the grid: from the generation site to the customer.
However, connected devices now enable two-way communication along the entire energy supply
chain: from generation through distribution to use, thereby improving the utilities' ability to
move and manage it.

Utilities can take and analyze real-time data transmitted by connected devices to detect blackouts
and redirect distribution, as well as respond to changes in energy demand and load.

Meanwhile, smart meters installed at individual homes and businesses provide information about
both real-time use and historical usage patterns that customers and the utilities can analyze to
identify ways to improve efficiency.

4. Environmental monitoring
Connected devices can collect IoT data that indicates the health and quality of air, water and soil,
as well as fisheries, forests and other natural habitats. They can also collect weather and other
environmental data.

As such, IoT delivers the ability to not only access significantly more real-time data about the
environment at any given time and place, but it also enables a range of organizations in various
industries to use that data to glean actionable insights.

5. Smart buildings and smart homes


Property owners are using the power of IoT to make all sorts of buildings smarter, meaning
they're more energy-efficient, comfortable and convenient, as well as healthier and possibly
safer, too.

An IoT ecosystem in a commercial building could include monitoring of the HVAC
infrastructure that uses real-time data and automation technologies to constantly measure and
adjust the temperature for optimum energy efficiency and comfort. Meanwhile, cameras using AI
could aid in crowd management to ensure public safety at events such as sold-out concerts.

6. Smart cities
Smart cities are consolidating IoT deployments across many facets to give them a holistic view
of what's happening in their jurisdictions.

As such, smart cities generally incorporate connected traffic management systems and their own
smart buildings. They might incorporate private smart buildings, too. Smart cities might also tie

into smart grids and use environmental monitoring to create an even larger IoT ecosystem that
provides real-time views of the various elements that affect life in their municipalities.

7. Supply chain management


Supply chain management has been undergoing a modernization, thanks to low-power sensors,
GPS and other tracking technologies that pinpoint assets as they move along a supply chain.
Such information lets managers both more effectively plan and more confidently reassure
stakeholders about the location of items shipped or received.

8. Industrial, agricultural and commercial management


IoT has numerous applications in industrial and commercial settings, enabling everything from
predictive maintenance to improved security at facilities to smart agriculture. These wide-
ranging use cases employ an equally expansive list of IoT technologies.

5. How do we ingest streaming data into a Hadoop cluster?


Explanation: Flume, a Java-based ingestion tool, is used when input data streams in faster than it
can be consumed. Typically Flume is used to ingest streaming data into HDFS or Kafka topics,
where it can act as a Kafka producer. Multiple Flume agents can also be used to collect data from
multiple sources into a Flume collector.
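A typical way to wire this up is a Flume agent configuration that tails an application log into HDFS. The sketch below uses standard Flume property names, but the agent name, log file path and HDFS URL are placeholders chosen for this example:

```properties
# Hypothetical Flume agent "a1": tail a log file into HDFS.
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# Source: stream new lines from an application log (path assumed).
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /var/log/app/events.log
a1.sources.r1.channels = c1

# Channel: buffer events in memory between source and sink.
a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000

# Sink: write events into HDFS, partitioned by date.
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://namenode:8020/flume/events/%Y-%m-%d
a1.sinks.k1.hdfs.useLocalTimeStamp = true
a1.sinks.k1.channel = c1
```

Swapping the `hdfs` sink for a Kafka sink is what lets the same agent act as a Kafka producer, as mentioned above.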
Hadoop can be described as an open-source framework for handling big data. Using the
MapReduce model, Hadoop distributes data across its nodes, a network of connected computers.
The data is shared across these systems, which lets Hadoop handle and process very large
datasets in parallel. This collection of networked nodes is called the Hadoop cluster.
Hadoop leverages the power of its distributed file system (HDFS), which handles the reading and
writing of large files from log files, databases, raw files and other forms of streaming data,
ingested using tools such as Apache Flume, Striim and so on. HDFS divides the ingested data
into smaller blocks and distributes them across the systems in the cluster.
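The MapReduce idea, mapping records to key-value pairs and then reducing per key, can be sketched in plain Python with the classic word-count example. This is a single-machine illustration of what the cluster does in parallel across its nodes; the function names are invented for the sketch.

```python
from collections import Counter
from itertools import chain

def map_phase(line):
    # Map: emit a (word, 1) pair for each word in a line.
    return [(word.lower(), 1) for word in line.split()]

def reduce_phase(pairs):
    # Reduce: sum counts per key, as the cluster would
    # after shuffling pairs to the reducer nodes.
    totals = Counter()
    for word, count in pairs:
        totals[word] += count
    return dict(totals)

lines = ["big data big cluster", "data node"]
mapped = chain.from_iterable(map_phase(line) for line in lines)
counts = reduce_phase(mapped)
```

On a real cluster, each node runs `map_phase` over its local HDFS blocks and the framework shuffles the pairs to reducers, which is how the same logic scales to very large datasets.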

6. Discuss cybersecurity and its applications. How do you see Ethiopian technology usage
and security levels?

What is cybersecurity?

Cyber security is the practice of defending computers, servers, mobile devices, electronic
systems, networks, and data from malicious attacks. It's also known as information technology

security or electronic information security. The term applies in a variety of contexts, from
business to mobile computing, and can be divided into a few common categories.

· Network security is the practice of securing a computer network from intruders, whether
targeted attackers or opportunistic malware.
· Application security focuses on keeping software and devices free of threats. A compromised
application could provide access to the data it is designed to protect. Successful security begins
in the design stage, well before a program or device is deployed.
· Information security protects the integrity and privacy of data, both in storage and in transit.
· Operational security includes the processes and decisions for handling and protecting data
assets. The permissions users have when accessing a network and the procedures that determine
how and where data may be stored or shared all fall under this umbrella.
· Disaster recovery and business continuity define how an organization responds to a cyber-
security incident or any other event that causes the loss of operations or data. Disaster recovery
policies dictate how the organization restores its operations and information to return to the same
operating capacity as before the event. Business continuity is the plan the organization falls back
on while trying to operate without certain resources.
· End-user education addresses the most unpredictable cyber-security factor: people. Anyone can
accidentally introduce a virus to an otherwise secure system by failing to follow good security
practices. Teaching users to delete suspicious email attachments, not plug in unidentified USB
drives, and various other important lessons is vital for the security of any organization
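As one concrete application-security practice, stored credentials should be salted and hashed rather than kept in plain text. A minimal sketch using Python's standard library (the function names are invented; the iteration count is an assumed parameter, not a universal recommendation):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None, rounds=100_000):
    """Derive a salted hash so stored credentials resist
    dictionary and rainbow-table attacks."""
    salt = salt or os.urandom(16)   # random per-user salt
    digest = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), salt, rounds
    )
    return salt, digest

def verify_password(password, salt, digest, rounds=100_000):
    candidate = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), salt, rounds
    )
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(candidate, digest)
```

Only the salt and digest are stored; even if the database leaks, an attacker cannot read the original passwords directly, which is the kind of design-stage decision the application security bullet above describes.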

Applications of Cybersecurity:

 DDoS security: DDoS stands for Distributed Denial of Service attack.
 Web Firewall
 Bots
 Antivirus and Antimalware
 Threat management systems
 Critical systems
 Rules and regulations

How do you see Ethiopian technology usage and security levels?


According to the 2021 National Bank of Ethiopia annual report, by the end of 2020/21 there
were 54.3 million mobile phone users (a 21.9% annual increase), nearly 1 million fixed lines (a
62% decrease), and 24.5 million Internet service subscribers (a 4.3% rise) in a country of over
100 million people.
