Mark Lumiti
POWER9 IC922 Level 2 Quiz
Started on Monday, February 21, 2022, 2:52 AM
State Finished
Completed on Monday, February 21, 2022, 7:38 AM
Time taken 4 hours 46 mins
Grade 22.00 out of 25.00 (88%)
Feedback A minimum of 19 correct answers is required to pass.
Congratulations, you passed the quiz for the POWER9 IC922 Level 2!
Question 1
Correct
1.00 points out of 1.00
Elastic Distributed Inference (EDI) is a component of Watson Machine Learning Accelerator
(either in Technical Preview or GA depending on when you're taking this quiz). What is a
benefit of EDI?
Select one:
EDI is meant as a high-availability feature that allows a server in the public cloud to act as a hot
standby for an on-premises inference server.
EDI enables you to publish inference models as services across a scalable cluster of servers, from
which clients can consume the services.
EDI takes a pre-trained model and optimizes it for inferencing for the specific hardware it is
running on.
EDI allows a single inference request to be partitioned and distributed across GPUs running on
multiple servers, speeding up execution of that one request.
Question 2
Correct
1.00 points out of 1.00
Select one:
1 TB
4 TB
2 TB
512 GB
Question 3
Correct
1.00 points out of 1.00
What value-based framework puts emphasis on methods that make AI effective and helps guide
how AI models are created and applied to real-life problems?
Select one:
Discover-Derive-Deploy
Ingest-Train-Score
Develop-Deploy-Infer
Data-Train-Inference
Question 4
Correct
1.00 points out of 1.00
As of the initial GA (February 2020), what is the maximum number of GPUs that can be
configured in the IC922 server?
Select one:
12
10
Question 5
Incorrect
0.00 points out of 1.00
In Nvidia’s testing of the T4 GPU versus the V100 GPU, what was the difference in power
utilization? (Choose the closest number.)
Select one:
Question 6
Correct
1.00 points out of 1.00
Select one:
Enterprise security
Fast insights
Future-ready
Question 7
Correct
1.00 points out of 1.00
Select one:
Intel Xe
Nvidia T4
Intel V100
Question 8
Correct
1.00 points out of 1.00
Which of the following is *NOT* a type of accelerator for machine learning and deep learning
workloads?
Select one:
ASIC
ESLC
FPGA
GPU
Question 9
Correct
1.00 points out of 1.00
Select one:
19” rack 1U
19” rack 2U
24” rack 2U
24” rack 4U
Question 10
Incorrect
0.00 points out of 1.00
Select one:
Question 11
Correct
1.00 points out of 1.00
According to a May 2018 report by Forrester Research Inc., what is the fastest growing workload
type?
Select one:
Question 12
Correct
1.00 points out of 1.00
What are the core configurations (per CPU) available for the IC922 server?
Select one:
10, 20 or 30 cores
12, 20 or 24 cores
10, 12 or 16 cores
12, 16 or 20 cores
Question 13
Correct
1.00 points out of 1.00
What is “Quantization”?
Select one:
It is a compression technique that allows for fast and lossless transfer of neural networks
between AI software running on different server architectures.
During model training, it is the act of analyzing the distribution of data values within the dataset
to determine how many categories to break the dataset into.
It is a machine learning algorithm popular with data scientists that is commonly used in
classification problems.
It is the reduction of the precision of numeric data values in a trained model, making it smaller
and more efficient for inferencing.
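The correct definition above (reducing the precision of numeric values in a trained model) can be illustrated with a minimal sketch of symmetric int8 weight quantization. The weight values below are hypothetical, not taken from any real model:

```python
import numpy as np

# Hypothetical trained-model weights stored as 32-bit floats.
weights = np.array([-0.82, 0.11, 0.47, -0.05, 0.93], dtype=np.float32)

# Symmetric linear quantization: map [-max|w|, +max|w|] onto [-127, 127].
scale = np.abs(weights).max() / 127.0
q_weights = np.round(weights / scale).astype(np.int8)

# Dequantize to recover approximate values at inference time.
deq = q_weights.astype(np.float32) * scale

print(q_weights)                       # 1 byte per weight instead of 4
print(float(np.abs(weights - deq).max()))  # quantization error <= scale/2
```

Each weight now occupies a quarter of the memory, at the cost of a small, bounded rounding error, which is why quantized models are smaller and more efficient for inferencing.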
Question 14
Correct
1.00 points out of 1.00
In addition to the IC922 being a great server for inference workloads, its storage characteristics
make it a great fit for data and cloud workloads as well. In a test involving MongoDB running on
Red Hat OpenShift, how well did the IC922 outperform the similarly configured Intel-based
system?
Select one:
Question 15
Correct
1.00 points out of 1.00
Within the AI workflow, what does the Inference stage represent?
Select one:
Inference is a method of interpreting and understanding the decision-making process within a
complex machine learning model.
Inference is the earliest stage where data sources are discovered and cataloged, and the meaning
of the data is inferred from the column names within those data sources.
Inference is where the model is deployed into production and new, never-before-seen data is
passed into it for the purpose of making a prediction.
Inference is where the model learns from historic business data, adjusting parameter values
within the model to make it as accurate as possible.
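The training/inference split described in these options can be sketched with a toy model. The data and the least-squares fit below are illustrative assumptions, not part of the quiz material:

```python
import numpy as np

# Training: learn a parameter from historic data (least-squares fit).
X_train = np.array([[1.0], [2.0], [3.0], [4.0]])
y_train = np.array([2.0, 4.0, 6.0, 8.0])
w, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)  # learned weight

# Inference: apply the trained model to new, never-before-seen data.
X_new = np.array([[5.0]])
prediction = X_new @ w
print(prediction)  # -> [10.]
```

Training adjusts `w` against historic data; inference only applies the already-learned `w` to new inputs, which is why it is far cheaper per request and suits different hardware.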
Question 16
Correct
1.00 points out of 1.00
Select one:
Minutes
Hours
Sub-second
Days
Question 17
Correct
1.00 points out of 1.00
Security is always top of mind for clients. IC922 has various security features built right into the
hardware and software stack. Which of the following is *NOT* a security feature or capability of
IC922?
Select one:
Trusted Boot
Secure Boot
Question 18
Correct
1.00 points out of 1.00
On which server(s) is IBM Visual Insights (formerly PowerAI Vision) software supported?
Select one:
Question 19
Correct
1.00 points out of 1.00
Select one:
Question 20
Correct
1.00 points out of 1.00
Select one:
Inference Cloud
Integrated Cloud
Inferencing Cognition
IBM Cognitive
Question 21
Correct
1.00 points out of 1.00
The IC922 server is storage-dense with strong I/O characteristics, making it an ideal server for
data and cloud needs. How many drives can be supported in the server (through the local drive
bays)?
Select one:
24
18
30
12
Question 22
Correct
1.00 points out of 1.00
While the IC922 is not intended to be a direct replacement for the LC922, it does share some
similar characteristics. However, the IC922 does have advantages over the LC922. Which of the
following statements about these advantages is *NOT* correct?
Select one:
Question 23
Correct
1.00 points out of 1.00
Clients who already own or are considering purchasing AC922 servers for training may ask why
these servers can’t also be used for inference. What should you tell them?
Select one:
The IC922 is just a rebranding of the AC922 server, meaning that the specifications are identical
between them and the client can in fact run inference workloads equally on either of the AC922
or IC922 servers.
Inference software is not supported on the GPUs that are available in the AC922 server, which
means that they must purchase a separate IC922 server for inference purposes.
This is possible, but training and inference workloads have different characteristics and the
AC922 might not be the most energy efficient and cost-effective option for inferencing.
The AC922 can be used for inference, but not by default. It must be configured at manufacturing
time with the inference-specific GPUs that are shipped with the IC922 server.
Question 24
Correct
1.00 points out of 1.00
Select one:
Training is the stage of the machine learning workflow where auditors are educated on the
internal workings of a “black box” model.
Training is the building of a model by learning from the vast amounts of input data presented to
it.
Training is where an existing model is used to make predictions against data it has never seen
before.
Training is the act of choosing an appropriate algorithm to use based on the type of problem
being solved and inspecting a sample of the input data.
Question 25
Incorrect
0.00 points out of 1.00
According to analysts, how large is the accelerated inferencing market expected to be by 2023?
Select one:
$7 billion USD
$3 billion USD