

Professional-Cloud-Security-Engineer Google Cloud Certified - Professional Cloud Security Engineer Free Practice Exam Questions (2025 Updated)

Prepare effectively for the Google Cloud Certified - Professional Cloud Security Engineer (Professional-Cloud-Security-Engineer) certification with our extensive collection of free, high-quality practice questions. Each question is designed to mirror the actual exam format and objectives, complete with comprehensive answers and detailed explanations. Our materials are regularly updated for 2025, ensuring you have the most current resources to build confidence and succeed on your first attempt.

Your team wants to centrally manage GCP IAM permissions from its on-premises Active Directory service and to manage permissions by AD group membership.

What should your team do to meet these requirements?

A.

Set up Cloud Directory Sync to sync groups, and set IAM permissions on the groups.

B.

Set up SAML 2.0 Single Sign-On (SSO), and assign IAM permissions to the groups.

C.

Use the Cloud Identity and Access Management API to create groups and IAM permissions from Active Directory.

D.

Use the Admin SDK to create groups and assign IAM permissions from Active Directory.

You want to prevent users from accidentally deleting a Shared VPC host project. Which organization-level policy constraint should you enable?

A.

compute.restrictSharedVpcHostProjects

B.

compute.restrictXpnProjectLienRemoval

C.

compute.restrictSharedVpcSubnetworks

D.

compute.sharedReservationsOwnerProjects
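For context on how such a constraint is applied: compute.restrictXpnProjectLienRemoval is a boolean organization policy constraint, and boolean constraints are enforced with a single enforce=True rule. A minimal sketch using the google-cloud-org-policy Python client, where the organization ID is a placeholder:

from google.cloud import orgpolicy_v2

client = orgpolicy_v2.OrgPolicyClient()
# Boolean constraints take a single rule with enforce=True.
policy = orgpolicy_v2.Policy(
    name="organizations/123456789/policies/compute.restrictXpnProjectLienRemoval",
    spec=orgpolicy_v2.PolicySpec(
        rules=[orgpolicy_v2.PolicySpec.PolicyRule(enforce=True)],
    ),
)
client.create_policy(parent="organizations/123456789", policy=policy)

Once enforced, the lien that Shared VPC places on host projects can no longer be removed by project owners, which blocks accidental deletion.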

You are a consultant for an organization that is considering migrating their data from its private cloud to Google Cloud. The organization’s compliance team is not familiar with Google Cloud and needs guidance on how compliance requirements will be met on Google Cloud. One specific compliance requirement is for customer data at rest to reside within specific geographic boundaries. Which option should you recommend for the organization to meet their data residency requirements on Google Cloud?

A.

Organization Policy Service constraints

B.

Shielded VM instances

C.

Access control lists

D.

Geolocation access controls

E.

Google Cloud Armor
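To make option A concrete: data residency on Google Cloud is commonly enforced with the gcp.resourceLocations list constraint of the Organization Policy Service. A minimal sketch with the same google-cloud-org-policy client; the organization ID is a placeholder and in:eu-locations is one of Google's curated value groups:

from google.cloud import orgpolicy_v2

client = orgpolicy_v2.OrgPolicyClient()
# List constraints take allowed or denied values; value groups such as
# "in:eu-locations" cover whole sets of regions.
policy = orgpolicy_v2.Policy(
    name="organizations/123456789/policies/gcp.resourceLocations",
    spec=orgpolicy_v2.PolicySpec(
        rules=[
            orgpolicy_v2.PolicySpec.PolicyRule(
                values=orgpolicy_v2.PolicySpec.PolicyRule.StringValues(
                    allowed_values=["in:eu-locations"],
                ),
            ),
        ],
    ),
)
client.create_policy(parent="organizations/123456789", policy=policy)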

Your company requires the security and network engineering teams to identify all network anomalies and be able to capture payloads within VPCs. Which method should you use?

A.

Define an organization policy constraint.

B.

Configure packet mirroring policies.

C.

Enable VPC Flow Logs on the subnet.

D.

Monitor and analyze Cloud Audit Logs.
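The distinction the question is probing: VPC Flow Logs record connection metadata only, while Packet Mirroring clones full packets, payloads included, to collector instances behind an internal load balancer. A rough sketch of a mirroring policy with the google-cloud-compute client; all project, network, subnet, and forwarding-rule names are hypothetical:

from google.cloud import compute_v1

client = compute_v1.PacketMirroringsClient()
mirroring = compute_v1.PacketMirroring(
    name="vpc-anomaly-capture",
    network=compute_v1.PacketMirroringNetworkInfo(
        url="projects/sec-proj/global/networks/prod-vpc",
    ),
    # The collector ILB fronts the appliances that receive the cloned packets.
    collector_ilb=compute_v1.PacketMirroringForwardingRuleInfo(
        url="projects/sec-proj/regions/us-central1/forwardingRules/collector-fr",
    ),
    mirrored_resources=compute_v1.PacketMirroringMirroredResourceInfo(
        subnetworks=[
            compute_v1.PacketMirroringMirroredResourceInfoSubnetInfo(
                url="projects/sec-proj/regions/us-central1/subnetworks/prod-subnet",
            ),
        ],
    ),
)
client.insert(project="sec-proj", region="us-central1",
              packet_mirroring_resource=mirroring)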

Your company's users access data in a BigQuery table. You want to ensure they can only access the data during working hours.

What should you do?

A.

Assign a BigQuery Data Viewer role along with an IAM condition that limits the access to specified working hours.

B.

Configure Cloud Scheduler so that it triggers a Cloud Functions instance that modifies the organizational policy constraints for BigQuery during the specified working hours.

C.

Assign a BigQuery Data Viewer role to a service account that adds and removes the users daily during the specified working hours.

D.

Run a gsutil script that assigns a BigQuery Data Viewer role, and remove it only during the specified working hours.
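Option A rests on IAM Conditions: a CEL expression evaluated per request, and request.time.getHours() accepts a time zone. A sketch of attaching such a conditional binding with the google-cloud-resourcemanager client; the project, group address, and 09:00-17:00 window are assumptions:

from google.cloud import resourcemanager_v3
from google.iam.v1 import iam_policy_pb2, options_pb2
from google.type import expr_pb2

client = resourcemanager_v3.ProjectsClient()
resource = "projects/analytics-proj"  # hypothetical project

# Conditional bindings require IAM policy version 3.
policy = client.get_iam_policy(
    request=iam_policy_pb2.GetIamPolicyRequest(
        resource=resource,
        options=options_pb2.GetPolicyOptions(requested_policy_version=3),
    )
)
policy.version = 3
policy.bindings.add(
    role="roles/bigquery.dataViewer",
    members=["group:analysts@example.com"],
    condition=expr_pb2.Expr(
        title="working-hours-only",
        expression=(
            'request.time.getHours("Europe/Berlin") >= 9 && '
            'request.time.getHours("Europe/Berlin") < 17'
        ),
    ),
)
client.set_iam_policy(
    request=iam_policy_pb2.SetIamPolicyRequest(resource=resource, policy=policy)
)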

A customer needs to launch a 3-tier internal web application on Google Cloud Platform (GCP). The customer’s internal compliance requirements dictate that end-user access may only be allowed if the traffic seems to originate from a specific known good CIDR. The customer accepts the risk that their application will only have SYN flood DDoS protection. They want to use GCP’s native SYN flood protection.

Which product should be used to meet these requirements?

A.

Cloud Armor

B.

VPC Firewall Rules

C.

Cloud Identity and Access Management

D.

Cloud CDN
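For comparison across the options, here is what CIDR allowlisting looks like as a VPC firewall rule (option B), using the google-cloud-compute client; project, network, CIDR, and tag are placeholders:

from google.cloud import compute_v1

client = compute_v1.FirewallsClient()
rule = compute_v1.Firewall(
    name="allow-known-good-cidr",
    network="projects/app-proj/global/networks/app-vpc",  # hypothetical
    direction="INGRESS",
    source_ranges=["203.0.113.0/24"],  # the known good CIDR
    allowed=[compute_v1.Allowed(I_p_protocol="tcp", ports=["443"])],
    target_tags=["web-tier"],  # applies only to the web-tier VMs
)
client.insert(project="app-proj", firewall_resource=rule)

Note that Cloud Armor policies attach to external load balancers, while firewall rules apply inside the VPC, a distinction that matters for an internal application.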

Your organization uses a microservices architecture based on Google Kubernetes Engine (GKE). Security reviews recommend tighter controls around deployed container images to reduce potential vulnerabilities and maintain compliance. You need to implement an automated system by using managed services to ensure that only approved container images are deployed to the GKE clusters. What should you do?

A.

Enforce Binary Authorization in your GKE clusters. Integrate container image vulnerability scanning into the CI/CD pipeline and require vulnerability scan results to be used for Binary Authorization policy decisions.

B.

Develop custom organization policies that restrict GKE cluster deployments to container images hosted within a specific Artifact Registry project where your approved images reside.

C.

Build a system using third-party vulnerability databases and custom scripts to identify potential Common Vulnerabilities and Exposures (CVEs) in your container images. Prevent image deployment if the CVE impact score is beyond a specified threshold.

D.

Automatically deploy new container images upon successful CI/CD builds by using Cloud Build triggers. Set up firewall rules to limit and control access to instances to mitigate malware injection.
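Option A hinges on a Binary Authorization policy that requires attestations, which the CI/CD pipeline creates only after a clean vulnerability scan. A rough sketch with the google-cloud-binary-authorization client, assuming its current surface; the project and the vuln-scan-passed attestor are hypothetical:

from google.cloud import binaryauthorization_v1

client = binaryauthorization_v1.BinauthzManagementServiceV1Client()
policy = client.get_policy(name="projects/my-project/policy")

# Require an attestation (signed after a clean scan) before any image is
# admitted, and block plus audit-log anything unattested.
rule = policy.default_admission_rule
rule.evaluation_mode = (
    binaryauthorization_v1.AdmissionRule.EvaluationMode.REQUIRE_ATTESTATION
)
rule.enforcement_mode = (
    binaryauthorization_v1.AdmissionRule.EnforcementMode.ENFORCED_BLOCK_AND_AUDIT_LOG
)
rule.require_attestations_by.append(
    "projects/my-project/attestors/vuln-scan-passed"
)
client.update_policy(policy=policy)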

A customer has 300 engineers. The company wants to grant different levels of access and efficiently manage IAM permissions between users in the development and production environment projects.

Which two steps should the company take to meet these requirements? (Choose two.)

A.

Create a project with multiple VPC networks for each environment.

B.

Create a folder for each development and production environment.

C.

Create a Google Group for the Engineering team, and assign permissions at the folder level.

D.

Create an Organizational Policy constraint for each folder environment.

E.

Create projects for each environment, and grant IAM rights to each engineering user.

You are a security engineer at a finance company. Your organization plans to store data on Google Cloud, but your leadership team is worried about the security of their highly sensitive data. Specifically, your company is concerned about internal Google employees' ability to access your company's data on Google Cloud. What solution should you propose?

A.

Use customer-managed encryption keys.

B.

Use Google's Identity and Access Management (IAM) service to manage access controls on Google Cloud.

C.

Enable Admin activity logs to monitor access to resources.

D.

Enable Access Transparency logs with Access Approval requests for Google employees.

You control network traffic for a folder in your Google Cloud environment. Your folder includes multiple projects and Virtual Private Cloud (VPC) networks. You want to enforce, at the folder level, that egress connections are limited to IP range 10.58.5.0/24 and only from the VPC network "dev-vpc". You want to minimize implementation and maintenance effort.

What should you do?

A.

1. Attach external IP addresses to the VMs in scope.
2. Configure a VPC firewall rule in "dev-vpc" that allows egress connectivity to IP range 10.58.5.0/24 for all source addresses in this network.

B.

1. Attach external IP addresses to the VMs in scope.
2. Define and apply a hierarchical firewall policy at the folder level to deny all egress connections and to allow egress to IP range 10.58.5.0/24 from network "dev-vpc".

C.

1. Leave the network configuration of the VMs in scope unchanged.
2. Create a new project including a new VPC network "new-vpc".
3. Deploy a network appliance in "new-vpc" to filter access requests and only allow egress connections from "dev-vpc" to 10.58.5.0/24.

D.

1. Leave the network configuration of the VMs in scope unchanged.
2. Enable Cloud NAT for "dev-vpc" and restrict the target range in Cloud NAT to 10.58.5.0/24.
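Option B can be sketched with the google-cloud-compute client. Hierarchical firewall policy rules support target_resources, which is what scopes the allow rule to "dev-vpc" alone; the folder ID, project, and network self-link are placeholders:

from google.cloud import compute_v1

client = compute_v1.FirewallPoliciesClient()
all_protocols = [compute_v1.FirewallPolicyRuleMatcherLayer4Config(ip_protocol="all")]

client.insert(
    parent_id="folders/123456789",  # the folder in scope
    firewall_policy_resource=compute_v1.FirewallPolicy(
        short_name="egress-allowlist",
        rules=[
            # Allow egress to 10.58.5.0/24, but only from dev-vpc.
            compute_v1.FirewallPolicyRule(
                priority=1000,
                direction="EGRESS",
                action="allow",
                match=compute_v1.FirewallPolicyRuleMatcher(
                    dest_ip_ranges=["10.58.5.0/24"],
                    layer4_configs=all_protocols,
                ),
                target_resources=[
                    "https://www.googleapis.com/compute/v1/projects/dev-proj/global/networks/dev-vpc",
                ],
            ),
            # Deny all other egress for every network under the folder.
            compute_v1.FirewallPolicyRule(
                priority=2000,
                direction="EGRESS",
                action="deny",
                match=compute_v1.FirewallPolicyRuleMatcher(
                    dest_ip_ranges=["0.0.0.0/0"],
                    layer4_configs=all_protocols,
                ),
            ),
        ],
    ),
)
# The policy takes effect only after add_association() attaches it to the folder.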

You need to create a VPC that enables your security team to control network resources such as firewall rules. How should you configure the network to allow for separation of duties for network resources?

A.

Set up multiple VPC networks, and set up multi-NIC virtual appliances to connect the networks.

B.

Set up VPC Network Peering, and allow developers to peer their network with a Shared VPC.

C.

Set up a VPC in a project. Assign the Compute Network Admin role to the security team, and assign the Compute Admin role to the developers.

D.

Set up a Shared VPC where the security team manages the firewall rules, and share the network with developers via service projects.
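Option D's separation of duties comes from Shared VPC: the security team administers the network (including firewall rules) in the host project, and developers deploy into attached service projects. Designating the projects can be sketched like this; the project IDs are hypothetical:

from google.cloud import compute_v1

projects = compute_v1.ProjectsClient()

# Make the security team's project a Shared VPC host.
projects.enable_xpn_host(project="network-host-proj")

# Attach a developer project as a service project of that host.
projects.enable_xpn_resource(
    project="network-host-proj",
    projects_enable_xpn_resource_request_resource=(
        compute_v1.ProjectsEnableXpnResourceRequest(
            xpn_resource=compute_v1.XpnResourceId(
                id="dev-service-proj", type_="PROJECT",
            ),
        )
    ),
)

Granting roles/compute.networkAdmin on the host project to the security team and subnet-level roles/compute.networkUser to developers completes the split.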

A customer wants to run a batch processing system on VMs and store the output files in a Cloud Storage bucket. The networking and security teams have decided that no VMs may reach the public internet.

How should this be accomplished?

A.

Create a firewall rule to block internet traffic from the VM.

B.

Provision a NAT Gateway to access the Cloud Storage API endpoint.

C.

Enable Private Google Access on the VPC.

D.

Mount a Cloud Storage bucket as a local filesystem on every VM.
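Private Google Access (option C) is a per-subnet switch that lets VMs without external IP addresses reach Google APIs such as the Cloud Storage endpoint. A minimal sketch; project, region, and subnet names are assumptions:

from google.cloud import compute_v1

subnets = compute_v1.SubnetworksClient()
op = subnets.set_private_ip_google_access(
    project="batch-proj",
    region="us-central1",
    subnetwork="batch-subnet",
    subnetworks_set_private_ip_google_access_request_resource=(
        compute_v1.SubnetworksSetPrivateIpGoogleAccessRequest(
            private_ip_google_access=True,
        )
    ),
)
op.result()  # block until the operation finishes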

A company is using Google Kubernetes Engine (GKE) with container images of a mission-critical application. The company wants to scan the images for known security issues and securely share the report with the security team without exposing it outside Google Cloud.

What should you do?

A.

1. Enable Container Threat Detection in the Security Command Center Premium tier.
2. Upgrade all clusters that are not on a supported version of GKE to the latest possible GKE version.
3. View and share the results from the Security Command Center.

B.

1. Use an open source tool in Cloud Build to scan the images.
2. Upload reports to publicly accessible buckets in Cloud Storage by using gsutil.
3. Share the scan report link with your security department.

C.

1. Enable vulnerability scanning in the Artifact Registry settings.
2. Use Cloud Build to build the images.
3. Push the images to the Artifact Registry for automatic scanning.
4. View the reports in the Artifact Registry.

D.

1. Get a GitHub subscription.
2. Build the images in Cloud Build and store them in GitHub for automatic scanning.
3. Download the report from GitHub and share it with the security team.

You define central security controls in your Google Cloud environment. For one of the folders in your organization, you set an organizational policy to deny the assignment of external IP addresses to VMs. Two days later, you receive an alert about a new VM with an external IP address under that folder.

What could have caused this alert?

A.

The VM was created with a static external IP address that was reserved in the project before the organizational policy rule was set.

B.

The organizational policy constraint wasn't properly enforced and is running in "dry run" mode.

C.

At the project level, the organizational policy control has been overridden with an "allow" value.

D.

The policy constraint at the folder level has no effect because of an "allow" value for that constraint at the organization level.

Your company has deployed an artificial intelligence model in a central project. As this model has a lot of sensitive intellectual property and must be kept strictly isolated from the internet, you must expose the model endpoint only to a defined list of projects in your organization. What should you do?

A.

Create a central project to host Shared VPC networks that are provided to all other projects. Centrally administer all firewall rules in this project to grant access to the model.

B.

Activate Private Google Access in both the model project as well as in each project that needs to connect to the model. Create a firewall policy to allow connectivity to Private Google Access addresses.

C.

Within the model project, create an internal Application Load Balancer that points to the model endpoint. Expose this load balancer with Private Service Connect to a configured list of projects.

D.

Within the model project, create an external Application Load Balancer that points to the model endpoint. Create a Cloud Armor policy to restrict IP addresses to Google Cloud.
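Option C is the Private Service Connect publishing pattern: the producer exposes a service attachment that points at the internal load balancer's forwarding rule and explicitly accepts consumer projects. A rough sketch; every name below is hypothetical:

from google.cloud import compute_v1

client = compute_v1.ServiceAttachmentsClient()
attachment = compute_v1.ServiceAttachment(
    name="model-endpoint-psc",
    # Forwarding rule of the internal Application Load Balancer.
    target_service=(
        "projects/model-proj/regions/us-central1/forwardingRules/model-ilb-fr"
    ),
    connection_preference="ACCEPT_MANUAL",  # only listed projects may connect
    nat_subnets=[
        "projects/model-proj/regions/us-central1/subnetworks/psc-nat-subnet",
    ],
    consumer_accept_lists=[
        compute_v1.ServiceAttachmentConsumerProjectLimit(
            project_id_or_num="consumer-proj-1", connection_limit=10,
        ),
    ],
)
client.insert(project="model-proj", region="us-central1",
              service_attachment_resource=attachment)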

A customer is collaborating with another company to build an application on Compute Engine. The customer is building the application tier in their GCP Organization, and the other company is building the storage tier in a different GCP Organization. This is a 3-tier web application. Communication between portions of the application must not traverse the public internet by any means.

Which connectivity option should be implemented?

A.

VPC peering

B.

Cloud VPN

C.

Cloud Interconnect

D.

Shared VPC
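VPC Network Peering (option A) exchanges routes over Google's internal network and works between networks in different organizations, so peered traffic never touches the public internet. Peering is created from both sides; one side looks roughly like this, with all names hypothetical:

from google.cloud import compute_v1

client = compute_v1.NetworksClient()
client.add_peering(
    project="app-tier-proj",
    network="app-vpc",
    networks_add_peering_request_resource=compute_v1.NetworksAddPeeringRequest(
        network_peering=compute_v1.NetworkPeering(
            name="app-to-storage",
            # The peer network lives in the other company's organization.
            network="projects/storage-tier-proj/global/networks/storage-vpc",
            exchange_subnet_routes=True,
        ),
    ),
)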

You must ensure that the keys used for at-rest encryption of your data are compliant with your organization's security controls. One security control mandates that keys get rotated every 90 days. You must implement an effective detection strategy to validate if keys are rotated as required. What should you do?

A.

Analyze the crypto key versions of the keys by using data from Cloud Asset Inventory. If an active key is older than 90 days, send an alert message through your incident notification channel.

B.

Identify keys that have not been rotated by using Security Health Analytics. If a key is not rotated after 90 days, a finding in Security Command Center is raised.

C.

Assess the keys in the Cloud Key Management Service by implementing code in Cloud Run. If a key is not rotated after 90 days, raise a finding in Security Command Center.

D.

Define a metric that checks for timely key updates by using Cloud Logging. If a key is not rotated after 90 days, send an alert message through your incident notification channel.
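To make the detection idea concrete, here is a small sketch in the spirit of options A and C: list keys with the google-cloud-kms client and flag any whose primary version is older than 90 days. The project, location, and key ring are placeholders:

from datetime import datetime, timedelta, timezone

from google.cloud import kms

client = kms.KeyManagementServiceClient()
key_ring = client.key_ring_path("my-project", "us-central1", "my-ring")
cutoff = datetime.now(timezone.utc) - timedelta(days=90)

for key in client.list_crypto_keys(request={"parent": key_ring}):
    primary = key.primary  # only populated for ENCRYPT_DECRYPT keys
    if primary.name and primary.create_time < cutoff:
        print(f"ALERT: primary version of {key.name} is older than 90 days")

Security Health Analytics (option B) ships an equivalent built-in detector (KMS_KEY_NOT_ROTATED), which is why a managed finding is usually preferred over custom code.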

Your financial services company is migrating its operations to Google Cloud. You are implementing a centralized logging strategy to meet strict regulatory compliance requirements. Your company's Google Cloud organization has a dedicated folder for all production projects. All audit logs, including Data Access logs from all current and future projects within this production folder, must be securely collected and stored in a central BigQuery dataset for long-term retention and analysis. To prevent duplicate log storage and to enforce centralized control, you need to implement a logging solution that intercepts and overrides any project-level log sinks for these audit logs, to ensure that logs are not inadvertently routed elsewhere. What should you do?

A.

Create an aggregated log sink at the production folder level with a destination of the central BigQuery dataset. Configure an inclusion filter for all audit and Data Access logs. Grant the Logs Bucket Writer role to the sink's service account on the production folder.

B.

Create a log sink in each production project to route audit logs to the central BigQuery dataset. Set the writer_identity field of each sink to a service account with BigQuery Data Editor permissions on the central dataset.

C.

Create an aggregated log sink at the organization level with a destination of the central BigQuery dataset and a filter for all audit logs. Use the --include-children flag and configure a log view for the production folder.

D.

Create an intercepting aggregated log sink at the production folder level with the central BigQuery dataset as the destination. Configure an inclusion filter for the necessary audit logs. Grant the appropriate IAM permissions to the sink's writer_identity on the BigQuery dataset.
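Option D combines two sink features: include_children aggregates logs from everything under the folder, and the newer intercept_children flag stops matching entries from also flowing to child-level sinks. A sketch with the google-cloud-logging client; treating intercept_children as available is an assumption about your library version, and the folder ID, dataset, and filter are placeholders:

from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
from google.cloud.logging_v2.types import CreateSinkRequest, LogSink

client = ConfigServiceV2Client()
sink = LogSink(
    name="prod-audit-sink",
    destination="bigquery.googleapis.com/projects/central-logs/datasets/prod_audit",
    filter='logName:"cloudaudit.googleapis.com"',  # audit logs, incl. Data Access
    include_children=True,    # pull from all current and future child projects
    intercept_children=True,  # assumption: intercepting-sink field in your client
)
created = client.create_sink(
    request=CreateSinkRequest(
        parent="folders/123456789",
        sink=sink,
        unique_writer_identity=True,  # dedicated service account for the sink
    )
)
# Grant this identity BigQuery write access on the central dataset.
print(created.writer_identity)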

An employer wants to track how bonus compensation has changed over time to identify employee outliers and correct earnings disparities. This task must be performed without exposing any individual's sensitive compensation data, and it must be reversible so that the outliers can be identified.

Which Cloud Data Loss Prevention API technique should you use to accomplish this?

A.

Generalization

B.

Redaction

C.

CryptoHashConfig

D.

CryptoReplaceFfxFpeConfig
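CryptoReplaceFfxFpeConfig (option D) is format-preserving encryption: the token keeps the original length and alphabet and can be re-identified with the same key, which is also the property the next question asks about. A sketch that tokenizes a numeric bonus column with the google-cloud-dlp client; the project is a placeholder and the hard-coded demo key stands in for the KMS-wrapped key you would use in practice:

from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()

item = {
    "table": {
        "headers": [{"name": "employee"}, {"name": "bonus"}],
        "rows": [
            {"values": [{"string_value": "emp-1001"},
                        {"string_value": "84000"}]},
        ],
    }
}
deidentify_config = {
    "record_transformations": {
        "field_transformations": [{
            "fields": [{"name": "bonus"}],
            "primitive_transformation": {
                "crypto_replace_ffx_fpe_config": {
                    # 16-byte demo key; use kms_wrapped in production.
                    "crypto_key": {"unwrapped": {"key": b"0123456789abcdef"}},
                    "common_alphabet": "NUMERIC",
                },
            },
        }],
    }
}
response = dlp.deidentify_content(request={
    "parent": "projects/my-project",
    "deidentify_config": deidentify_config,
    "item": item,
})
print(response.item.table)  # bonus values replaced by same-length digit tokens

Running reidentify_content with the same configuration reverses the tokens, which is what lets the outlier be identified later.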

Your financial services company needs to process customer personally identifiable information (PII) for analytics while adhering to strict privacy regulations. You must transform this data to protect individual privacy while ensuring that the data retains its original format and consistency for analytical integrity. Your solution must avoid full, irreversible deletion. What should you do?

A.

Configure Sensitive Data Protection (SDP) to de-identify PII using format-preserving encryption (FPE).

B.

Use Cloud Key Management Service (Cloud KMS) to encrypt the entire dataset with a customer-managed encryption key (CMEK).

C.

Implement a custom BigQuery user-defined function (UDF) by using JavaScript to hash all sensitive fields before they are loaded into the analytical tables.

D.

Set up VPC Service Controls around the BigQuery project. Implement row-level encryption.
