Professional Cloud Security Engineer Journey
Course Workbook
Certification Exam Guide Sections
1 Configuring access within a cloud solution environment

Courses
● Security in Google Cloud: M2 Securing Access to Google Cloud
● Managing Security in Google Cloud: M2 Securing Access to Google Cloud

Documentation
● Active Directory user account provisioning | Identity and access management | Google Cloud
● What is Configuration Manager? - Google Workspace Admin Help
● Manage membership automatically with dynamic groups - Google Workspace Admin Help
● Creating and updating a dynamic group | Cloud Identity
● Create and manage groups using APIs - Google Workspace Admin Help
1.2 Diagnostic Question 03
Cymbal Bank leverages Google Cloud storage services, an on-premises Apache Spark Cluster, and a web application hosted on a third-party cloud. The Spark cluster and web application require limited access to Cloud Storage buckets and a Cloud SQL instance for only a few hours per day. You have been tasked with sharing credentials while minimizing the risk that the credentials will be compromised.

What should you do?

A. Create a service account with appropriate permissions. Authenticate the Spark Cluster and the web application as direct requests and share the service account key.

B. Create a service account with appropriate permissions. Have the Spark Cluster and the web application authenticate as delegated requests, and share the short-lived service account credential as a JWT.

C. Create a service account with appropriate permissions. Authenticate the Spark Cluster and the web application as a delegated request, and share the service account key.

D. Create a service account with appropriate permissions. Have the Spark Cluster and the web application authenticate as a direct request, and share the short-lived service account credentials as XML tokens.
1.2 Diagnostic Question 04
Cymbal Bank recently discovered service account key misuse in one of the teams during a security audit. As a precaution, going forward you do not want any team in your organization to generate new external service account keys. You also want to restrict every new service account’s usage to its associated Project.

What should you do?

A. Navigate to Organizational policies in the Google Cloud Console. Select your organization. Select iam.disableServiceAccountKeyCreation. Customize the applied to property, and set Enforcement to ‘On’. Click Save. Repeat the process for iam.disableCrossProjectServiceAccountUsage.

B. Run the gcloud resource-manager org-policies enable-enforce command with the constraints iam.disableServiceAccountKeyCreation and iam.disableCrossProjectServiceAccountUsage and the Project IDs you want the constraints to apply to.

C. Navigate to Organizational policies in the Google Cloud Console. Select your organization. Select iam.disableServiceAccountKeyCreation. Under Policy Enforcement, select Merge with parent. Click Save. Repeat the process for iam.disableCrossProjectServiceAccountLienRemoval.

D. Run the gcloud resource-manager org-policies allow command with the boolean constraints iam.disableServiceAccountKeyCreation and iam.disableCrossProjectServiceAccountUsage with Organization ID.
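For reference, the legacy gcloud org-policy commands mentioned in these options take the following general shape. This is a sketch only; the organization ID below is a placeholder, not a value from the workbook.

```shell
# Enforce the boolean constraint that blocks creation of new external
# service account keys, organization-wide.
# 123456789012 is a placeholder organization ID.
gcloud resource-manager org-policies enable-enforce \
    iam.disableServiceAccountKeyCreation \
    --organization=123456789012

# Enforce the constraint that restricts each service account's usage
# to its associated project.
gcloud resource-manager org-policies enable-enforce \
    iam.disableCrossProjectServiceAccountUsage \
    --organization=123456789012
```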
Proprietary + Confidential
Courses
● Security in Google Cloud: M3 Identity and Access Management (IAM)
● Managing Security in Google Cloud: M3 Identity and Access Management (IAM)

Skill Badges
● Ensure Access and Identity in Google Cloud Quest

Documentation
● Access control for projects with IAM | Resource Manager Documentation | Google Cloud
● Access control for organizations with IAM | Resource Manager Documentation | Google Cloud
● Access control for folders with IAM | Resource Manager Documentation | Google Cloud
● Understanding roles | IAM Documentation
1.5 Diagnostic Question 09
Cymbal Bank is divided into separate departments. Each department is divided into teams. Each team works on a distinct product that requires Google Cloud resources for development.

How would you design a Google Cloud organization hierarchy to best match Cymbal Bank’s organization structure and needs?

A. Create an Organization node. Under the Organization node, create Department folders. Under each Department, create Product folders. Under each Product, create Teams folders. In the Teams folder, add Projects.

B. Create an Organization node. Under the Organization node, create Department folders. Under each Department, create Product folders. Add Projects to the Product folders.

C. Create an Organization node. Under the Organization node, create Department folders. Under each Department, create Teams folders. Add Projects to the Teams folders.

D. Create an Organization node. Under the Organization node, create Department folders. Under each Department, create a Teams folder. Under each Team, create Product folders. Add Projects to the Product folders.
1.5 Diagnostic Question 10
Cymbal Bank has a team of developers and administrators working on different sets of Google Cloud resources. The Bank’s administrators should be able to access the serial ports on Compute Engine instances and create service accounts. Developers should only be able to access serial ports.

How would you design the organization hierarchy to provide the required access?

A. Deny Serial Port Access and Service Account Creation at the Organization level. Create an ‘admin’ folder and set enforced: false for constraints/compute.disableSerialPortAccess. Create a new ‘dev’ folder inside the ‘admin’ folder, and set enforced: false for constraints/iam.disableServiceAccountCreation. Give developers access to the ‘dev’ folder, and administrators access to the ‘admin’ folder.

B. Deny Serial Port Access and Service Account Creation at the organization level. Create a ‘dev’ folder and set enforced: false for constraints/compute.disableSerialPortAccess. Create a new ‘admin’ folder inside the ‘dev’ folder, and set enforced: false for constraints/iam.disableServiceAccountCreation. Give developers access to the ‘dev’ folder, and administrators access to the ‘admin’ folder.

C. Deny Serial Port Access and Service Account Creation at the organization level. Create a ‘dev’ folder and set enforced: true for constraints/compute.disableSerialPortAccess and enforced: true for constraints/iam.disableServiceAccountCreation. Create a new ‘admin’ folder inside the ‘dev’ folder, and set enforced: false for constraints/iam.disableServiceAccountCreation. Give developers access to the ‘dev’ folder, and administrators access to the ‘admin’ folder.

D. Allow Serial Port Access and Service Account Creation at the organization level. Create a ‘dev’ folder and set enforced: true for constraints/iam.disableServiceAccountCreation. Create another ‘admin’ folder that inherits from the parent inside the organization node. Give developers access to the ‘dev’ folder, and administrators access to the ‘admin’ folder.
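For reference, the enforced: true / enforced: false pattern in these options corresponds to the legacy gcloud org-policy commands below. This is a sketch only; the organization and folder IDs are placeholders.

```shell
# Deny both capabilities at the organization level.
# 123456789012 and 456 are placeholder organization/folder IDs.
gcloud resource-manager org-policies enable-enforce \
    compute.disableSerialPortAccess --organization=123456789012
gcloud resource-manager org-policies enable-enforce \
    iam.disableServiceAccountCreation --organization=123456789012

# Folders inherit the organization policy unless overridden; setting
# enforced: false on a folder re-allows serial-port access there.
gcloud resource-manager org-policies disable-enforce \
    compute.disableSerialPortAccess --folder=456
```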
Courses
● Security in Google Cloud: M3 Identity and Access Management (IAM)
● Managing Security in Google Cloud: M3 Identity and Access Management (IAM)

Documentation
● Understanding hierarchy evaluation | Resource Manager Documentation | Google Cloud
● Creating and managing organizations | Resource Manager Documentation | Google Cloud
● Best practices for enterprise organizations | Documentation | Google Cloud
Section 2: Configuring perimeter and boundary security
2.1 Diagnostic Question 01
Cymbal Bank has published an API that internal teams will use through the HTTPS load balancer. You need to limit the API usage to 200 calls every hour. Any exceeding usage should inform the users that servers are busy.

Which gcloud command would you run to throttle the load balancing for the given specification?

A. gcloud compute security-policies rules create priority
--security-policy sec-policy
--src-ip-ranges=source-range
--action=throttle
--rate-limit-threshold-count=200
--rate-limit-threshold-interval-sec=3600
--conform-action=allow
--exceed-action=deny-429
--enforce-on-key=HTTP-HEADER

B. gcloud compute security-policies rules create priority
--security-policy sec-policy
--src-ip-ranges=source-range
--action=throttle
--rate-limit-threshold-count=200
--rate-limit-threshold-interval-sec=60
--conform-action=deny
--exceed-action=deny-404
--enforce-on-key=HTTP-HEADER

C. gcloud compute security-policies rules create priority
--security-policy sec-policy
--src-ip-ranges=source-range
--action=rate-based-ban
--rate-limit-threshold-count=200
--rate-limit-threshold-interval-sec=3600
--conform-action=deny
--exceed-action=deny-403
--enforce-on-key=HTTP-HEADER

D. gcloud compute security-policies rules create priority
--security-policy sec-policy
--src-ip-ranges="<source range>"
--action=rate-based-ban
--rate-limit-threshold-count=200
--rate-limit-threshold-interval-sec=3600
--conform-action=allow
--exceed-action=deny-500
--enforce-on-key=IP
2.1 Diagnostic Question 02
Cymbal Bank is releasing a new loan management application using a Compute Engine managed instance group. External users will connect to the application using a domain name or IP address protected with TLS 1.2. A load balancer already hosts this application and preserves the source IP address. You are tasked with setting up the SSL certificate for this load balancer.

What should you do?

A. Create a Google-managed SSL certificate. Attach a global dynamic external IP address to the internal HTTPS load balancer. Validate that an existing URL map will route the incoming service to your managed instance group backend. Load your certificate and create an HTTPS proxy routing to your URL map. Create a global forwarding rule that routes incoming requests to the proxy.

B. Create a Google-managed SSL certificate. Attach a global static external IP address to the external HTTPS load balancer. Validate that an existing URL map will route the incoming service to your managed instance group backend. Load your certificate and create an HTTPS proxy routing to your URL map. Create a global forwarding rule that routes incoming requests to the proxy.

C. Import a self-managed SSL certificate. Attach a global static external IP address to the TCP Proxy load balancer. Validate that an existing URL map will route the incoming service to your managed instance group backend. Load your certificate and create a TCP proxy routing to your URL map. Create a global forwarding rule that routes incoming requests to the proxy.

D. Import a self-managed SSL certificate. Attach a global static external IP address to the SSL Proxy load balancer. Validate that an existing URL map will route the incoming service to your managed instance group backend. Load your certificate and create an SSL proxy routing to your URL map. Create a global forwarding rule that routes incoming requests to the proxy.
2.1 Diagnostic Question 03
Your organization has a website running on Compute Engine. This instance only has a private IP address. You need to provide SSH access to an on-premises developer who will debug the website from the authorized on-premises location only.

How do you enable this?

A. Set up Cloud VPN. Set up an unencrypted tunnel to one of the hosts in the network. Create outbound or egress firewall rules. Use the private IP address to log in using a gcloud ssh command.

B. Use SOCKS proxy over SSH. Set up an SSH tunnel to one of the hosts in the network. Create the SOCKS proxy on the client side.

C. Use the default VPC’s firewall. Open port 22 for TCP protocol using the Google Cloud Console.

D. Use Identity-Aware Proxy (IAP). Set up IAP TCP forwarding by creating ingress firewall rules on port 22 for TCP using the gcloud command.
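For reference, IAP TCP forwarding (mentioned in option D) is typically configured along these lines. This is a sketch only; the firewall rule, network, VM name, and zone below are placeholders.

```shell
# Allow ingress from IAP's published TCP-forwarding source range
# (35.235.240.0/20) to SSH on port 22.
# "allow-ssh-from-iap" and "my-network" are placeholder names.
gcloud compute firewall-rules create allow-ssh-from-iap \
    --network=my-network \
    --direction=INGRESS \
    --action=ALLOW \
    --rules=tcp:22 \
    --source-ranges=35.235.240.0/20

# SSH to the private-IP-only VM through an IAP tunnel.
gcloud compute ssh dev-vm --zone=us-central1-a --tunnel-through-iap
```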
D. Create user accounts for the application and database. Create a firewall rule using:
gcloud compute firewall-rules create ALLOW_MONGO_DB
--network network-name
--deny UDP:27017
--source-service-accounts web-application-user-account
--target-service-accounts database-admin-user-account
2.2 Diagnostic Question 06
Cymbal Bank has designed an application to detect credit card fraud that will analyze sensitive information. The application that’s running on a Compute Engine instance is hosted in a new subnet on an existing VPC. Multiple teams who have access to other VMs in the same VPC must access the VM. You want to configure the access so that unauthorized VMs or users from the internet can’t access the fraud detection VM.

What should you do?

A. Use subnet isolation. Create a service account for the fraud detection VM. Create one service account for all the teams’ Compute Engine instances that will access the fraud detection VM. Create a new firewall rule using:
gcloud compute firewall-rules create ACCESS_FRAUD_ENGINE
--network <network name>
--allow TCP:80
--source-service-accounts <one service account for all teams>
--target-service-accounts <fraud detection engine’s service account>

B. Use target filtering. Create two tags called ‘app’ and ‘data’. Assign the ‘app’ tag to the Compute Engine instance hosting the Fraud Detection App (source), and assign the ‘data’ tag to the other Compute Engine instances (target). Create a firewall rule to allow all ingress communication on this tag.

C. Use subnet isolation. Create a service account for the fraud detection engine. Create service accounts for each of the teams’ Compute Engine instances that will access the engine. Add a firewall rule using:
gcloud compute firewall-rules create ACCESS_FRAUD_ENGINE
--network <network name>
--allow TCP:80
--source-service-accounts <list of service accounts>
--target-service-accounts <fraud detection engine’s service account>

D. Use target filtering. Create a tag called ‘app’, and assign the tag to both the source and the target. Create a firewall rule to allow all ingress communication on this tag.
2.2 Configuring boundary segmentation
What should you do?

C. Add ingress firewall rules to allow NAT and Health Check ranges for the App Engine standard environment in the Shared VPC network. Create a client-side connector in the Service Project using the Shared VPC Project ID. Verify that the connector is in a READY state. Create an ingress rule on the Shared VPC network to allow the connector using Network Tags or IP ranges.

D. Add ingress firewall rules to allow NAT and Health Check ranges for the App Engine standard environment in the Shared VPC network. Create a server-side connector in the Host Project using the Shared VPC Project ID. Verify that the connector is in a READY state. Create an ingress rule on the Shared VPC network to allow the connector using Network Tags or IP ranges.
2.3 Diagnostic Question 08
Cymbal Bank’s Customer Details API runs on a Compute Engine instance with only an internal IP address. Cymbal Bank’s new branch is co-located outside the Google Cloud points-of-presence (PoPs) and requires a low-latency way for its on-premises apps to consume the API without exposing the requests to the public internet.

Which solution would you recommend?

A. Use a Content Delivery Network (CDN). Establish direct peering with one of Google’s nearby edge-enabled PoPs.

B. Use Carrier Peering. Use a service provider to access their enterprise grade infrastructure to connect to the Google Cloud environment.

C. Use Partner Interconnect. Use a service provider to access their enterprise grade infrastructure to connect to the Google Cloud environment.

D. Use Dedicated Interconnect. Establish direct peering with one of Google’s nearby edge-enabled PoPs.
2.3 Diagnostic Question 09
An external audit agency needs to perform a one-time review of Cymbal Bank’s Google Cloud usage. The auditors should be able to access a Default VPC containing BigQuery, Cloud Storage, and Compute Engine instances where all the usage information is stored. You have been tasked with enabling the access from their on-premises environment, which already has a configured VPN.

What should you do?

A. Use a Cloud VPN tunnel. Use your DNS provider to create DNS zones and records for private.googleapis.com. Connect the DNS provider to your on-premises network. Broadcast the request from the on-premises environment. Use a software-defined firewall to manage incoming and outgoing requests.

B. Use Partner Interconnect. Configure an encrypted tunnel in the auditor's on-premises environment. Use Cloud DNS to create DNS zones and A records for private.googleapis.com.

C. Use a Cloud VPN tunnel. Use Cloud DNS to create DNS zones and records for *.googleapis.com. Set up on-premises routing with Cloud Router. Use Cloud Router custom route advertisements to announce routes for Google Cloud destinations.

D. Use Dedicated Interconnect. Configure a VLAN in the auditor's on-premises environment. Use Cloud DNS to create DNS zones and records for restricted.googleapis.com and private.googleapis.com. Set up on-premises routing with Cloud Router. Add custom static routes in the VPC to connect individually to BigQuery, Cloud Storage, and Compute Engine instances.
2.3 Diagnostic Question 10
An ecommerce portal uses Google Kubernetes Engine to deploy its recommendation engine in Docker containers. This cluster instance does not have an external IP address. You need to provide internet access to the pods in the Kubernetes cluster.

What configuration would you add?

A. Cloud DNS, subnet primary IP address range for nodes, and subnet secondary IP address range for pods and services in the cluster

B. Cloud VPN, subnet secondary IP address range for nodes, and subnet secondary IP address range for pods and services in the cluster

C. Nginx load balancer, subnet secondary IP address range for nodes, and subnet secondary IP address range for pods and services in the cluster

D. Cloud NAT gateway, subnet primary IP address range for nodes, and subnet secondary IP address range for pods and services in the cluster
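For reference, a Cloud NAT gateway (option D) for a private cluster's outbound traffic is typically created as follows. This is a sketch only; the router name, NAT config name, network, and region are placeholders.

```shell
# Create a Cloud Router in the cluster's VPC and region.
# "nat-router" and "my-vpc" are placeholder names.
gcloud compute routers create nat-router \
    --network=my-vpc --region=us-central1

# Attach a NAT configuration that covers all subnet ranges (including
# the secondary ranges used by GKE pods), with auto-allocated external IPs.
gcloud compute routers nats create nat-config \
    --router=nat-router --region=us-central1 \
    --auto-allocate-nat-external-ips \
    --nat-all-subnet-ip-ranges
```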
2.3 Establishing private connectivity
Courses
● Networking in Google Cloud: M3 Sharing Networks Across Projects; M5 Hybrid Connectivity; M6 Private Connection Options
● Security in Google Cloud: M4 Configuring VPC for Isolation and Security; M5 Securing Compute Engine: Techniques and Best Practices
● Networking in Google Cloud: Defining and implementing networks: M3 Sharing Networks Across Projects
● Networking in Google Cloud: Hybrid Connectivity and Network Management: M1 Hybrid Connectivity; M2 Private Connection Options
● Managing Security in Google Cloud: M4 Configuring VPC for Isolation and Security
● Security Best Practices in Google Cloud: M1 Securing Compute Engine: Techniques and Best Practices

Skill Badges
● Build and Secure Networks in Google Cloud Quest
● Ensure Access and Identity in Google Cloud Quest

Documentation
● Configuring Serverless VPC Access | Google Cloud
● Overview of VPC Service Controls | Google Cloud
● Choosing a Network Connectivity product | Google Cloud
● Private Google Access | VPC
● Manage zones | Cloud DNS
● Private Google Access for on-premises hosts | VPC | Google Cloud
● Simplifying cloud networking for enterprises: announcing Cloud NAT and more | Google Cloud Blog
● Example GKE setup | Cloud NAT
● Cloud NAT overview
Section 3: Ensuring data protection
3.1 Diagnostic Question 01 Discussion
Cymbal Bank has hired a data analyst team to analyze scanned copies of loan applications. Because this is an external team, Cymbal Bank does not want to share the name, gender, phone number, or credit card numbers listed in the scanned copies. You have been tasked with hiding this PII information while minimizing latency.

What should you do?

A. Use the Cloud Data Loss Prevention (DLP) API to make redact image requests. Provide your project ID, built-in infoTypes, and the scanned copies when you make the requests.

B. Use the Cloud Vision API to perform optical character recognition (OCR) from scanned images. Redact the text using the Cloud Natural Language API with regular expressions.

C. Use the Cloud Vision API to perform optical character recognition (OCR) from scanned images. Redact the text using the Cloud Data Loss Prevention (DLP) API with regular expressions.

D. Use the Cloud Vision API to perform text extraction from scanned images. Redact the text using the Cloud Natural Language API with regular expressions.
3.1 Diagnostic Question 02 Discussion
Cymbal Bank needs to statistically predict the days customers delay the payments for loan repayments and credit card repayments. Cymbal Bank does not want to share the exact dates a customer has defaulted or made a payment with data analysts. Additionally, you need to hide the customer name and the customer type, which could be corporate or retail.

How do you provide the appropriate information to the data analysts?

A. Generalize all dates to year and month with bucketing. Use the built-in infoType for customer name. Use a custom infoType for customer type with a custom dictionary.

B. Generalize all dates to year and month with bucketing. Use the built-in infoType for customer name. Use a custom infoType for customer type with regular expression.

C. Generalize all dates to year and month with date shifting. Use a predefined infoType for customer name. Use a custom infoType for customer type with a custom dictionary.

D. Generalize all dates to year and month with date shifting. Use a predefined infoType for customer name. Use a custom infoType for customer type with regular expression.
3.1 Diagnostic Question 03 Discussion
Cymbal Bank stores customer information in a BigQuery table called ‘Information,’ which belongs to the dataset ‘Customers.’ Various departments of Cymbal Bank, including loan, credit card, and trading, access the information table. Although the data source remains the same, each department needs to read and analyze separate customers and customer-attributes. You want a cost-effective way to configure departmental access to BigQuery to provide optimal performance.

What should you do?

A. Create separate datasets for each department. Create views for each dataset separately. Authorize these views to access the source dataset. Share the datasets with departments. Provide the bigquery.dataViewer role to each department’s required users.

B. Create an authorized dataset in BigQuery’s Explorer panel. Write Customers’ table metadata into a JSON file, and edit the file to add each department’s Project ID and Dataset ID. Provide the bigquery.user role to each department’s required users.

C. Secure data with classification. Open the Data Catalog Taxonomies page in the Google Cloud Console. Create policy tags for required columns and rows. Provide the bigquery.user role to each department’s required users. Provide policy tags access to each department separately.

D. Create separate datasets for each department. Create authorized functions in each dataset to perform required aggregations. Write transformed data to new tables for each department separately. Provide the bigquery.dataViewer role to each department’s required users.
3.1 Diagnostic Question 04 Discussion
Your colleague at Cymbal Bank is a cloud security engineer. She sketches out the following solution to manage her team’s access to application security keys:

1 - Create 2 projects
● Project A: Cloud Storage to store secrets
● Project B: Cloud KMS to manage encryption keys
2 - Store each secret individually in Cloud Storage
3 - Rotate secrets and encryption keys regularly
4 - Protect each bucket by using encryption with Cloud KMS

A. Your colleague should have created one project for the Cloud Storage bucket and one to store the KMS encryption keys. Two projects create an unnecessary burden of IAM management.

B. Your colleague should cluster her secrets together in Cloud Storage. That way they can be easily accessed by applications.

C. It is not recommended to use Cloud KMS keys to encrypt buckets. Default management is safer and more reliable.

D. Your colleague’s proposal follows Google Cloud’s best practices.
3.1 Diagnostic Question 05 Discussion

Cymbal Bank has a Cloud SQL instance that must be shared with an external agency. The agency’s developers will be assigned roles and permissions through a Google Group in Identity and Access Management (IAM). The external agency is on an annual contract and will require a connection string, username, and password to connect to the database.

How would you configure the group’s access?

A. Use Secret Manager. Use the duration attribute to set the expiry period to one year. Add the secretmanager.secretAccessor role for the group that contains external developers.

B. Use Cloud Key Management Service. Use the destination IP address and Port attributes to provide access for developers at the external agency. Remove the IAM access after one year and rotate the shared keys. Add cloudkms.cryptoKeyEncryptorDecryptor role for the group that contains the external developers.

C. Use Secret Manager. Use the resource attribute to set a key-value pair with key as duration and values as expiry period one year from now. Add secretmanager.viewer role for the group that contains external developers.

D. Use Secret Manager for the connection string and username, and use Cloud Key Management Service for the password. Use tags to set the expiry period to the timestamp one year from now. Add secretmanager.secretVersionManager and secretmanager.secretAccessor roles for the group that contains external developers.
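For reference, granting a Google Group access to a Secret Manager secret (as in option A) generally looks like this. This is a sketch only; the secret name, group address, and expiry timestamp are placeholders.

```shell
# Grant the external developers' group read access to one secret,
# with an IAM condition that expires the binding after the contract.
# "cloudsql-connection", the group address, and the timestamp are
# placeholders, not values from the workbook.
gcloud secrets add-iam-policy-binding cloudsql-connection \
    --member="group:external-devs@example.com" \
    --role="roles/secretmanager.secretAccessor" \
    --condition='expression=request.time < timestamp("2026-01-01T00:00:00Z"),title=contract-expiry'
```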
3.1 Diagnostic Question 06 Discussion
Cymbal Bank wants to deploy an n-tier web application. The frontend must be supported by an App Engine deployment, an API with a Compute Engine instance, and Cloud SQL for a MySQL database. This application is only supported during working hours; outside of them, App Engine is disabled and Compute Engine is stopped.

How would you enable the infrastructure to access the database?

A. Use VM metadata to read the current machine’s IP address, and use a gcloud command to add access to Cloud SQL. Store Cloud SQL’s connection string and password in Cloud Key Management Service. Store the Username in Project metadata.

B. Use Project metadata to read the current machine’s IP address, and use a startup script to add access to Cloud SQL. Store Cloud SQL’s connection string in Cloud Key Management Service, and store the password in Secret Manager. Store the Username in Project metadata.

C. Use Project metadata to read the current machine’s IP address and use a gcloud command to add access to Cloud SQL. Store Cloud SQL’s connection string and username in Cloud Key Management Service, and store the password in Secret Manager.

D. Use VM metadata to read the current machine’s IP address and use a startup script to add access to Cloud SQL. Store Cloud SQL’s connection string, username, and password in Secret Manager.
3.1 Protecting sensitive data and preventing data loss

Courses
● Security in Google Cloud: M5 Securing Compute Engine: Techniques and Best Practices; M6 Securing Cloud Data: Techniques and Best Practices; M7 Application Security: Techniques and Best Practices; M10 Content-Related Vulnerabilities: Techniques and Best Practices
● Security Best Practices in Google Cloud: M1 Securing Compute Engine: Techniques and Best Practices; M2 Securing Cloud Data: Techniques and Best Practices; M3 Application Security: Techniques and Best Practices
● Mitigating Security Vulnerabilities in Google Cloud: M2 Content-Related Vulnerabilities: Techniques and Best Practices

Documentation
● Image inspection and redaction | Data Loss Prevention Documentation | Google Cloud
● Redacting sensitive data from images | Data Loss Prevention Documentation | Google Cloud
● InfoType detector reference | Data Loss Prevention Documentation | Google Cloud
● Pseudonymization | Data Loss Prevention Documentation | Google Cloud
● Authorized views | BigQuery | Google Cloud
● Authorized datasets | BigQuery | Google Cloud
● Sharing across perimeters with bridges | VPC Service Controls | Google Cloud
● Creating a perimeter bridge | VPC Service Controls | Google Cloud
● Context-aware access with ingress rules | VPC Service Controls | Google Cloud
● Frequently asked questions | IAM Documentation
● Access control with IAM | Secret Manager Documentation | Google Cloud
● About VM metadata | Compute Engine Documentation | Google Cloud
3.2 Diagnostic Question 07 Discussion
Cymbal Bank calculates employee incentives on a monthly basis for the sales department and on a quarterly basis for the marketing department. The incentives are released with the next month’s salary. Employees’ performance documents are stored as spreadsheets, which are retained for at least one year for audit. You want to configure the most cost-effective storage for this scenario.

What should you do?

A. Import the spreadsheets to BigQuery, and create separate tables for Sales and Marketing. Set table expiry rules to 365 days for both tables. Create jobs scheduled to run every quarter for Marketing and every month for Sales.

B. Upload the spreadsheets to Cloud Storage. Select the Nearline storage class for the sales department and Coldline storage for the marketing department. Use object lifecycle management rules to set the storage class to Archival after 365 days. Process the data on BigQuery using jobs that run monthly for Sales and quarterly for Marketing.

C. Import the spreadsheets to Cloud SQL, and create separate tables for Sales and Marketing. For Table Expiration, set 365 days for both tables. Use stored procedures to calculate incentives. Use App Engine cron jobs to run stored procedures monthly for Sales and quarterly for Marketing.

D. Import the spreadsheets into Cloud Storage and create NoSQL tables. Use App Engine cron jobs to run monthly for Sales and quarterly for Marketing. Use a separate job to delete the data after 1 year.
3.2 Diagnostic Question 08 Discussion
Cymbal Bank uses Google Kubernetes Engine (GKE) to deploy its Docker containers. You want to encrypt the boot disk for a cluster running a custom image so that the key rotation is controlled by the Bank. GKE clusters will also generate up to 1024 randomized characters that will be used with the keys with Docker containers.

What steps would you take to apply the encryption settings with a dedicated hardware security layer?

A. In the Google Cloud console, navigate to Google Kubernetes Engine. Select your cluster and the boot node inside the cluster. Enable customer-managed encryption. Use Cloud HSM to generate random bytes and provide an additional layer of security.

B. Create a new GKE cluster with customer-managed encryption and HSM enabled. Deploy the containers to this cluster. Delete the old GKE cluster. Use Cloud HSM to generate random bytes and provide an additional layer of security.

C. Create a new key ring using Cloud Key Management Service. Extract this key to a certificate. Use the kubectl command to update the Kubernetes configuration. Validate using MAC digital signatures, and use a startup script to generate random bytes.

D. Create a new key ring using Cloud Key Management Service. Extract this key to a certificate. Use the Google Cloud Console to update the Kubernetes configuration. Validate using MAC digital signatures, and use a startup script to generate random bytes.
3.2 Diagnostic Question 09 Discussion
You have recently joined Cymbal Bank as a cloud security engineer. You want to encrypt a connection from a user on the internet to a VM in your development project. This is at the layer 3/4 (network/transport) level, and you want to set up user-configurable encryption for the in-transit network traffic.

What architecture choice best suits this use case?

A. Set up an IPsec tunnel. This will allow you to create L3/L4 encryption between a user and a VM instance in your project.

B. Set up transport layer security (TLS). This will encrypt data sent to the Google Front End, and in turn, your VM. This is not set up by default.

C. Set up a managed SSL certificate by configuring a load balancer. By default, this will encrypt at the L3/L4 layer.

D. Set up Application Layer Transport Security (ALTS). This is a mutual authentication and transport encryption system developed by Google. This is configured for L3/L4 network connections.
3.2 Diagnostic Question 10 Discussion

Cymbal Bank needs to migrate existing loan processing applications to Google Cloud. These applications transform confidential financial information. All the data should be encrypted at all stages, including sharing between sockets and RAM. An integrity test should also be performed every time these instances boot. You need to use Cymbal Bank’s encryption keys to configure the Compute Engine instances.

A. Create a Confidential VM instance with Customer-Supplied Encryption Keys. In Cloud Logging, collect all logs for sevLaunchAttestationReportEvent.

B. Create a Shielded VM instance with Customer-Supplied Encryption Keys. In Cloud Logging, collect all logs for earlyBootReportEvent.

C. Create a Confidential VM instance with Customer-Managed Encryption Keys. In Cloud Logging, collect all logs for earlyBootReportEvent.

D. Create a Shielded VM instance with Customer-Managed Encryption Keys. In Cloud Logging, collect all logs for sevLaunchAttestationReportEvent.
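The scenario above hinges on the difference between Shielded VM (boot-integrity attestation) and Confidential VM (AMD SEV memory encryption). As a hedged illustration only, a Confidential VM whose boot disk uses a customer-supplied key could be created roughly like this from the CLI; the instance name, zone, image, and key file are placeholders, not values from the workbook:

```shell
# Sketch: create a Confidential VM (requires an N2D machine type and
# --maintenance-policy=TERMINATE) with a customer-supplied disk key.
# All names below are illustrative placeholders.
gcloud compute instances create loan-app-vm-1 \
  --zone=us-central1-a \
  --machine-type=n2d-standard-2 \
  --confidential-compute \
  --maintenance-policy=TERMINATE \
  --csek-key-file=csek-keys.json

# Launch attestation events for the instance can then be reviewed
# in Cloud Logging (filter on the SEV launch attestation report events).
```

This is a fragment to anchor the terminology, not a complete runbook; consult the Confidential VM documentation for supported images and regions.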
Courses
● Managing Security in Google Cloud: M4 Configuring Virtual Private Cloud for Isolation and Security
● Security Best Practices in Google Cloud

Documentation
● Using Cloud KMS with other products
● Rotating keys | Cloud KMS Documentation
● Confidential VM and Compute Engine | Google Cloud
Cymbal Bank has received Docker source files from its third-party developers in an Artifact Registry repository. These Docker files will be part of a CI/CD pipeline to update Cymbal Bank’s personal loan offering. The bank wants to prevent the possibility of remote users arbitrarily using the Docker files to run any code. You have been tasked with using Container Analysis’ On-Demand scanning to scan the images for a one-time update.

What should you do?

A. Prepare a cloudbuild.yaml file. In this file, add four steps in order—build, scan, severity check, and push—specifying the location of the Artifact Registry repository. Specify severity level as CRITICAL. Start the build with the command gcloud builds submit.

B. Prepare a cloudbuild.yaml file. In this file, add four steps in order—scan, build, severity check, and push—specifying the location of the Artifact Registry repository. Specify severity level as HIGH. Start the build with the command gcloud builds submit.

C. Prepare a cloudbuild.yaml file. In this file, add four steps in order—scan, severity check, build, and push—specifying the location of the Artifact Registry repository. Specify severity level as HIGH. Start the build with the command gcloud builds submit.

D. Prepare a cloudbuild.yaml file. In this file, add four steps in order—build, severity check, scan, and push—specifying the location of the Artifact Registry repository. Specify severity level as CRITICAL. Start the build with the command gcloud builds submit.
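For orientation, a build → scan → severity check → push pipeline like the one the options describe can be sketched as a cloudbuild.yaml along the lines of Google's documented On-Demand scanning pattern. The repository path and image name below are placeholders, and the exact step wiring should be checked against the current Artifact Analysis documentation:

```yaml
# Sketch only: on-demand scan inside a Cloud Build pipeline.
# Image path is a placeholder.
steps:
- id: build
  name: gcr.io/cloud-builders/docker
  args: ['build', '-t', 'us-central1-docker.pkg.dev/$PROJECT_ID/loan-repo/loan-app', '.']
- id: scan
  name: gcr.io/google.com/cloudsdktool/cloud-sdk
  entrypoint: /bin/bash
  args:
  - -c
  - |
    gcloud artifacts docker images scan \
      us-central1-docker.pkg.dev/$PROJECT_ID/loan-repo/loan-app \
      --format='value(response.scan)' > /workspace/scan_id.txt
- id: severity-check
  name: gcr.io/google.com/cloudsdktool/cloud-sdk
  entrypoint: /bin/bash
  args:
  - -c
  - |
    gcloud artifacts docker images list-vulnerabilities \
      $(cat /workspace/scan_id.txt) \
      --format='value(vulnerability.effectiveSeverity)' | \
      grep -Fxq CRITICAL && echo 'Failed vulnerability check' && exit 1 || exit 0
- id: push
  name: gcr.io/cloud-builders/docker
  args: ['push', 'us-central1-docker.pkg.dev/$PROJECT_ID/loan-repo/loan-app']
```

The build is then started with `gcloud builds submit`, as the options state.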
4.1 Diagnostic Question 02 Discussion

Cymbal Bank’s management is concerned about virtual machines being compromised by bad actors. More specifically, they want to receive immediate alerts if there have been changes to the boot sequence of any of their Compute Engine instances.

What should you do?

A. Set an organization-level policy that requires all Compute Engine VMs to be configured as Shielded VMs. Use Secure Boot enabled with Unified Extensible Firmware Interface (UEFI). Validate integrity events in Cloud Monitoring and place alerts on launch attestation events.

B. Set Cloud Logging measurement policies on the VMs. Use Cloud Logging to place alerts whenever actualMeasurements and policyMeasurements don’t match.

C. Set an organization-level policy that requires all Compute Engine VMs to be configured as Shielded VMs. Use Measured Boot enabled with Virtual Trusted Platform Module (vTPM). Validate integrity events in Cloud Monitoring and place alerts on late boot validation events.

D. Set project-level policies that require all Compute Engine VMs to be configured as Shielded VMs. Use Measured Boot enabled with Virtual Trusted Platform Module (vTPM). Validate integrity events in Cloud Monitoring and place alerts on late boot validation events.
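The enforcement half of these options maps onto two real commands. The sketch below shows an organization-level Shielded VM constraint and an instance created with Measured Boot; the organization ID, instance name, and zone are placeholders:

```shell
# Sketch: enforce Shielded VMs org-wide (ORG_ID is a placeholder).
gcloud resource-manager org-policies enable-enforce \
  compute.requireShieldedVm --organization=ORG_ID

# Create an instance with Secure Boot, vTPM (Measured Boot), and
# integrity monitoring enabled.
gcloud compute instances create hardened-vm-1 \
  --zone=us-central1-a \
  --shielded-secure-boot \
  --shielded-vtpm \
  --shielded-integrity-monitoring
```

Integrity-validation events for such instances can then be alerted on in Cloud Monitoring, as the options describe.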
4.1 Diagnostic Question 03 Discussion

Cymbal Bank runs a Node.js application on a Compute Engine instance. Cymbal Bank needs to share this base image with a ‘development’ Google Group. This base image should support secure boot for the Compute Engine instances deployed from this image.

How would you automate the image creation?

A. Prepare a shell script. Add the command gcloud compute instances stop with the Node.js instance name. Set up certificates for secure boot. Add gcloud compute images create, and specify the Compute Engine instance’s persistent disk and zone and the certificate files. Add gcloud compute images add-iam-policy-binding and specify the ‘development’ group.

B. Start the Compute Engine instance. Set up certificates for secure boot. Prepare a cloudbuild.yaml configuration file. Specify the persistent disk location of the Compute Engine instance and the ‘development’ group. Use the command gcloud builds submit --tag, and specify the configuration file path and the certificates.

C. Prepare a shell script. Add the command gcloud compute instances start to the script to start the Node.js Compute Engine instance. Set up Measured Boot for secure boot. Add gcloud compute images create, and specify the persistent disk and zone of the Compute Engine instance.

D. Stop the Compute Engine instance. Set up Measured Boot for secure boot. Prepare a cloudbuild.yaml configuration file. Specify the persistent disk location of the Compute Engine instance and the ‘development’ group. Use the command gcloud builds submit --tag, and specify the configuration file path.
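A script following option A's shape might look like the sketch below. Instance, image, zone, certificate file, and group names are all illustrative placeholders; the Secure Boot certificate flags (`--platform-key-file`, `--key-exchange-key-file`, `--signature-database-file`) are the documented way to attach certificates to a custom Shielded image:

```shell
# Sketch of option A's script; all names are placeholders.
gcloud compute instances stop nodejs-app --zone=us-central1-a

gcloud compute images create nodejs-base-image \
  --source-disk=nodejs-app \
  --source-disk-zone=us-central1-a \
  --platform-key-file=pk.der \
  --key-exchange-key-file=kek.der \
  --signature-database-file=db.der \
  --guest-os-features=UEFI_COMPATIBLE

gcloud compute images add-iam-policy-binding nodejs-base-image \
  --member='group:development@example.com' \
  --role='roles/compute.imageUser'
```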
4.1 Diagnostic Question 04 Discussion

Cymbal Bank uses Docker containers to interact with APIs for its personal banking application. These APIs are under PCI-DSS compliance. The Kubernetes environment running the containers will not have internet access to download required packages.

How would you automate the pipeline that is building these containers?

A. Build a foundation image. Define a job on Jenkins for the Docker image. Upload the deployment configuration and container definition Packer template to a Git repository. In the template, include a pre-processor attribute to tag the image with Git repository and Container Registry. Use Jenkins to build the container, and deploy it in Google Kubernetes Engine. Use Container Registry to distribute the image.

B. Build an immutable image. Define a job on Jenkins for the Docker image. Upload the deployment configuration and container definition Packer template to a Git repository. In the template, include a post-processor attribute to tag the image with Git repository and Container Registry. Use Jenkins to build the container, and deploy it in Google Kubernetes Engine. Use Container Registry to distribute the image.

C. Build a foundation image. Store all artifacts and a Packer definition template in a Git repository. Use Container Registry to build the artifacts and Packer definition. Use Cloud Build to extract the built container and deploy it to a Google Kubernetes Engine (GKE) cluster. Add the required users and groups to the GKE project.

D. Build an immutable image. Store all artifacts and a Packer definition template in a Git repository. Use Container Registry to build the artifacts and Packer definition. Use Cloud Build to extract the built container and deploy it to a Google Kubernetes Engine (GKE) cluster. Add the required users and groups to the GKE project.
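The "post-processor attribute" the options mention refers to Packer's post-processors block. As a hypothetical fragment only (the builder image and repository names are invented for illustration), a Docker template that tags and pushes the built image could look like:

```json
{
  "builders": [
    { "type": "docker", "image": "gcr.io/example-project/base:latest", "commit": true }
  ],
  "post-processors": [
    [
      { "type": "docker-tag",
        "repository": "gcr.io/example-project/banking-api",
        "tags": ["{{timestamp}}"] },
      { "type": "docker-push" }
    ]
  ]
}
```

`docker-tag` and `docker-push` are standard Packer post-processors; check the Packer documentation for the current template syntax before relying on this shape.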
Proprietary + Confidential
Cymbal Bank wants to use Cloud Storage and BigQuery to store safe deposit usage data. Cymbal Bank needs a cost-effective approach to auditing only Cloud Storage and BigQuery data access activities.

How would you use Cloud Audit Logs to enable this analysis?

A. Enable Data Access Logs for ADMIN_READ, DATA_READ, and DATA_WRITE at the service level for BigQuery and Cloud Storage.

B. Enable Data Access Logs for ADMIN_READ, DATA_READ, and DATA_WRITE at the organization level.

C. Enable Data Access Logs for ADMIN_READ, DATA_READ, and DATA_WRITE for Cloud Storage. All Data Access Logs are enabled for BigQuery by default.

D. Enable Data Access Logs for ADMIN_READ, DATA_READ, and DATA_WRITE for BigQuery. All Data Access Logs are enabled for Cloud Storage by default.
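Service-level Data Access audit logs are enabled through the auditConfigs section of the project's IAM policy (edited with `gcloud projects get-iam-policy` / `set-iam-policy`). A sketch covering both services in the question looks like this; which services actually need explicit configuration is the point the options test:

```yaml
# Fragment of an IAM policy file: enable Data Access audit logs
# for Cloud Storage and BigQuery at the service level.
auditConfigs:
- service: storage.googleapis.com
  auditLogConfigs:
  - logType: ADMIN_READ
  - logType: DATA_READ
  - logType: DATA_WRITE
- service: bigquery.googleapis.com
  auditLogConfigs:
  - logType: ADMIN_READ
  - logType: DATA_READ
  - logType: DATA_WRITE
```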
4.2 Diagnostic Question 08 Discussion

Cymbal Bank has suffered a remote botnet attack on Compute Engine instances in an isolated project. The affected project now requires investigation by an external agency, which requests that you provide all admin and system events to analyze in its local forensics tool. You want to use the most cost-effective solution to enable the external analysis.

What should you do?

A. Use Event Threat Detection. Trigger the IAM Anomalous Grant detector to detect all admins and users with admin or system permissions. Export these logs to the Security Command Center. Give the external agency access to the Security Command Center.

B. Use Cloud Audit Logs. Filter Admin Activity audit logs for only the affected project. Use a Pub/Sub topic to stream the logs from Cloud Audit Logs to the external agency’s forensics tool.

C. Use the Security Command Center. Select Cloud Logging as the source, and filter by category: Admin Activity and category: System Activity. View the Source property of the Finding Details section. Use Pub/Sub topics to export the findings to the external agency’s forensics tool.

D. Use Cloud Monitoring and Cloud Logging. Filter Cloud Monitoring to view only system and admin logs. Expand the system and admin logs in Cloud Logging. Use Pub/Sub to export the findings from Cloud Logging to the external agency’s forensics tool or storage.
4.2 Diagnostic Question 09 Discussion

The loan application from Cymbal Bank’s lending department collects credit reports that contain credit payment information from customers. According to bank policy, the PDF reports are stored for six months in Cloud Storage, and access logs for the reports are stored for three years. You need to configure a cost-effective storage solution for the access logs.

A. Set up a logging export dataset in BigQuery to collect data from Cloud Logging and Cloud Monitoring. Create table expiry rules to delete logs after three years.

B. Set up a logging export dataset in BigQuery to collect data from Cloud Logging and the Security Command Center. Create table expiry rules to delete logs after three years.

C. Set up a logging export bucket in Cloud Storage to collect data from the Security Command Center. Configure object lifecycle management rules to delete logs after three years.

D. Set up a logging export bucket in Cloud Storage to collect data from Cloud Audit Logs. Configure object lifecycle management rules to delete logs after three years.
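A Cloud Storage export with a three-year lifecycle rule, as described in the options, can be sketched with two standard commands. The sink name, bucket name, and log filter below are placeholders:

```shell
# Sketch: route Data Access audit logs to a Cloud Storage bucket.
gcloud logging sinks create access-log-sink \
  storage.googleapis.com/example-access-logs \
  --log-filter='logName:"cloudaudit.googleapis.com%2Fdata_access"'

# Expire exported objects after ~3 years (1095 days).
cat > lifecycle.json <<'EOF'
{ "rule": [ { "action": { "type": "Delete" },
              "condition": { "age": 1095 } } ] }
EOF
gsutil lifecycle set lifecycle.json gs://example-access-logs
```

Remember that the sink's service account also needs write access to the destination bucket.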
Cymbal Bank uses Compute Engine instances for its APIs, and recently discovered bitcoin mining activities on some instances. The bank wants to detect all future mining attempts and notify the security team. The security team can view the Security Command Center and Cloud Audit Logs.

How should you configure the detection and notification?

A. Use Event Threat Detection’s threat detectors. Export findings from ‘Suspicious account activity’ and ‘Anomalous IAM behavior’ detectors and publish them to a Pub/Sub topic. Create a Cloud Function to send notifications of suspect activities. Use Pub/Sub notifications to invoke the Cloud Function.

B. Enable the VM Manager tools suite in the Security Command Center. Perform a scan of Compute Engine instances. Publish results to Cloud Audit Logging. Create an alert in Cloud Monitoring to send notifications of suspect activities.

C. Enable Anomaly Detection in the Security Command Center. Create and configure a Pub/Sub topic and an email service. Create a Cloud Function to send email notifications for suspect activities. Export findings to a Pub/Sub topic, and use them to invoke the Cloud Function.

D. Enable the Web Security Scanner in the Security Command Center. Perform a scan of Compute Engine instances. Publish results to Cloud Audit Logging. Create an alert in Cloud Monitoring to send notifications for suspect activities.
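Several of these options rely on a Pub/Sub-triggered Cloud Function that turns a Security Command Center finding notification into an alert. A minimal sketch of such a function is below; the finding fields shown (`category`, `resourceName`) are typical of SCC notifications, but treat the exact payload schema as an assumption, and the email/chat delivery step is deliberately left as a comment:

```python
import base64
import json

def notify_security_team(event, context=None):
    """Pub/Sub-triggered entry point: decode an SCC finding
    notification and build the alert text for the security team."""
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    finding = payload.get("finding", {})
    message = (
        f"Suspect activity: {finding.get('category', 'UNKNOWN')} "
        f"on {finding.get('resourceName', 'unknown resource')}"
    )
    # In a real deployment, this is where an email or chat API
    # would be called to deliver `message`.
    return message

# Local example with a synthetic notification payload:
sample = {"finding": {"category": "Execution: Cryptomining",
                      "resourceName": "//compute.googleapis.com/projects/p/zones/z/instances/vm-1"}}
event = {"data": base64.b64encode(json.dumps(sample).encode()).decode()}
alert = notify_security_team(event)
```

Wiring the function to a notification topic (via SCC's Pub/Sub notification configs) is what connects detection to alerting in the options above.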
4.2 Configuring logging, monitoring, and detection

Courses
● Security in Google Cloud: M11 Monitoring, Logging, Auditing, and Scanning
● Mitigating Security Vulnerabilities in Google Cloud: M3 Monitoring, Logging, Auditing, and Scanning

Documentation
● Security controls and forensic analysis for GKE apps | Cloud Architecture Center
● Scenarios for exporting logging data: Security and access analytics | Cloud Architecture Center | Google Cloud
● Cloud Audit Logs overview
● Cloud Audit Logs with Cloud Storage | Google Cloud
● Configure Data Access audit logs
● Scenarios for exporting Cloud Logging: Compliance requirements | Cloud Architecture Center | Google Cloud
● Security sources for vulnerabilities and threats | Security Command Center | Google Cloud
● Configuring Security Command Center
● Enabling real-time email and chat notifications
Section 5:
Supporting compliance
requirements
5.1 Diagnostic Question 01

Cymbal Bank’s lending department stores sensitive information, such as its customers’ credit history, address, and phone number, in parquet files. You need to upload this personally identifiable information (PII) to Cloud Storage so that it’s secure and compliant with ISO 27018.

How should you protect this sensitive information using Cymbal Bank’s encryption keys and using the least amount of computational resources?

A. Generate an AES-256 key as a 32-byte bytestring. Decode it as a base-64 string. Upload the blob to the bucket using this key.

B. Generate an RSA key as a 32-byte bytestring. Decode it as a base-64 string. Upload the blob to the bucket using this key.

C. Generate a customer-managed encryption key (CMEK) using RSA or AES256 encryption. Decode it as a base-64 string. Upload the blob to the bucket using this key.

D. Generate a customer-managed encryption key (CMEK) using Cloud KMS. Decode it as a base-64 string. Upload the blob to the bucket using this key.
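The key-generation step in these options can be sketched in Python using only the standard library. The upload itself is shown as comments: the google-cloud-storage client accepts the raw key through its `encryption_key` parameter, and the JSON API equivalently takes the `x-goog-encryption-*` headers. The bucket and file names are placeholders:

```python
import base64
import hashlib
import os

# Generate a 32-byte (256-bit) AES key and base64-encode it, as a
# customer-supplied encryption key (CSEK) for Cloud Storage.
raw_key = os.urandom(32)
encoded_key = base64.b64encode(raw_key).decode("utf-8")
key_hash = base64.b64encode(hashlib.sha256(raw_key).digest()).decode("utf-8")

# With the google-cloud-storage client, the upload would look like:
#   blob = bucket.blob("credit_history.parquet", encryption_key=raw_key)
#   blob.upload_from_filename("credit_history.parquet")
# Over the JSON API, the equivalent request headers are:
#   x-goog-encryption-algorithm: AES256
#   x-goog-encryption-key: <encoded_key>
#   x-goog-encryption-key-sha256: <key_hash>
```

Because the key never enters Cloud KMS, Google stores only a hash of it; losing the key means losing access to the object.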
5.1 Diagnostic Question 02

You are designing a web application for Cymbal Bank so that customers who have credit card issues can contact dedicated support agents. Customers may enter their complete credit card number when chatting with or emailing support agents. You want to ensure compliance with PCI-DSS and prevent support agents from viewing this information in the most cost-effective way.

A. Use customer-supplied encryption keys (CSEK) and Cloud Key Management Service (KMS) to detect and encrypt sensitive information.

B. Detect sensitive information with Cloud Natural Language API.

C. Use customer-managed encryption keys (CMEK) and Cloud Key Management Service (KMS) to detect and encrypt sensitive information.

D. Implement Cloud Data Loss Prevention using its REST API.
Documentation
● Upload an object by using CSEK | Cloud Storage
● Customer-managed encryption keys (CMEK) | Cloud KMS Documentation
● Customer-supplied encryption keys | Cloud Storage
● Data encryption options | Cloud Storage
● ISO/IEC 27018 Certified Compliant | Google Cloud
● Automating the Classification of Data Uploaded to Cloud Storage | Cloud Architecture Center | Google Cloud
● PCI Data Security Standard compliance | Cloud Architecture Center
● Cloud DLP client libraries | Data Loss Prevention Documentation
● Data Loss Prevention Demo
● Overview of VPC Service Controls | Google Cloud
● Getting to know the Google Cloud Healthcare API: Part 1
● Sharing and collaboration | Cloud Storage
● Google Cloud Platform HIPAA overview guide
● Setting up a HIPAA-aligned project | Cloud Architecture Center
When will you take the exam?

Suggested preparation path:
1. Google Cloud Fundamentals: Core Infrastructure
2. Networking in Google Cloud: Defining and implementing networks
3. Networking in Google Cloud: Hybrid connectivity and network management
4. Build and secure networks in Google Cloud Skill Badge
5. Managing Security in Google Cloud
6. Security Best Practices in Google Cloud
7. Mitigating Security Vulnerabilities on Google Cloud
8. Ensure Access & Identity in Google Cloud Skill Badge
9. Google Kubernetes Engine Best Practices: Security Skill Badge
10. Review documentation
11. Sample questions
12. Take the certification exam
Weekly study plan
Now, consider what you’ve learned about your knowledge and skills
through the diagnostic questions in this course. You should have a
better understanding of what areas you need to focus on and what
resources are available.
Use the template that follows to plan your study goals for each week.
Consider:
● What exam guide section(s) or topic area(s) will you focus on?
● What courses (or specific modules) will help you learn more?
● What Skill Badges or labs will you work on for hands-on practice?
● What documentation links will you review?
● What additional resources will you use, such as sample questions?
You may do some or all of these study activities each week.
Area(s) of focus:
Courses/modules
to complete:
Skill Badges/labs
to complete:
Documentation
to review:
Additional study: