
ARA-C01 Snowflake SnowPro Advanced: Architect Certification Exam Free Practice Exam Questions (2025 Updated)

Prepare effectively for your Snowflake ARA-C01 SnowPro Advanced: Architect Certification Exam certification with our extensive collection of free, high-quality practice questions. Each question is designed to mirror the actual exam format and objectives, complete with comprehensive answers and detailed explanations. Our materials are regularly updated for 2025, ensuring you have the most current resources to build confidence and succeed on your first attempt.


Is it possible for a data provider account with a Snowflake Business Critical edition to share data with an Enterprise edition data consumer account?

A.

A Business Critical account cannot be a data sharing provider to an Enterprise consumer. Any consumer accounts must also be Business Critical.

B.

If a user in the provider account with a role authorized to create or alter a share adds an Enterprise account as a consumer, that account can import the share.

C.

If a user in the provider account with a share-owning role sets SHARE_RESTRICTIONS to FALSE when adding an Enterprise consumer account, that account can import the share.

D.

If a user in the provider account with a share-owning role that also has the OVERRIDE SHARE RESTRICTIONS privilege sets SHARE_RESTRICTIONS to FALSE when adding an Enterprise consumer account, that account can import the share.
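
For reference, a minimal sketch of how a provider adds a consumer account to a share while relaxing share restrictions; the share and account names are hypothetical:

-- Hypothetical names; run by a role that owns the share and holds the
-- OVERRIDE SHARE RESTRICTIONS privilege in a Business Critical account.
ALTER SHARE sales_share
  ADD ACCOUNTS = enterprise_consumer_acct
  SHARE_RESTRICTIONS = FALSE;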

A Snowflake Architect created a new data share and would like to verify that only specific records in secure views are visible within the data share by the consumers.

What is the recommended way to validate data accessibility by the consumers?

A.

Create reader accounts as shown below and impersonate the consumers by logging in with their credentials.

CREATE MANAGED ACCOUNT reader_acct1 ADMIN_NAME = user1, ADMIN_PASSWORD = 'Sdfed43da!44', TYPE = READER;

B.

Create a row access policy as shown below and assign it to the data share.

CREATE OR REPLACE ROW ACCESS POLICY rap_acct AS (acct_id VARCHAR) RETURNS BOOLEAN ->
CASE WHEN 'acct1_role' = CURRENT_ROLE() THEN TRUE ELSE FALSE END;

C.

Set the session parameter SIMULATED_DATA_SHARING_CONSUMER as shown below in order to impersonate the consumer accounts.

ALTER SESSION SET SIMULATED_DATA_SHARING_CONSUMER = 'Consumer_Acct1';

D.

Alter the share settings as shown below in order to impersonate a specific consumer account.

ALTER SHARE sales_share SET ACCOUNTS = 'Consumer1' SHARE_RESTRICTIONS = TRUE;
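
For reference, a minimal sketch of validating consumer visibility from the provider account with the SIMULATED_DATA_SHARING_CONSUMER session parameter; the object and account names are hypothetical:

-- Impersonate a hypothetical consumer account while querying from the provider side.
ALTER SESSION SET SIMULATED_DATA_SHARING_CONSUMER = 'consumer_acct1';

-- Queries against the secure view now return only the rows that consumer would see.
SELECT * FROM shared_db.public.secure_sales_v;

-- Clear the parameter when validation is complete.
ALTER SESSION UNSET SIMULATED_DATA_SHARING_CONSUMER;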

A company wants to integrate its main enterprise identity provider with Snowflake using federated authentication.

The authentication integration has been configured and roles have been created in Snowflake. However, users are not automatically appearing in Snowflake when created, and their group membership is not reflected in their assigned roles.

How can the missing functionality be enabled with the LEAST amount of operational overhead?

A.

OAuth must be configured between the identity provider and Snowflake. Then the authorization server must be configured with the right mapping of users and roles.

B.

OAuth must be configured between the identity provider and Snowflake. Then the authorization server must be configured with the right mapping of users, and the resource server must be configured with the right mapping of role assignment.

C.

SCIM must be enabled between the identity provider and Snowflake. Once both are synchronized through SCIM, their groups will get created as group accounts in Snowflake and the proper roles can be granted.

D.

SCIM must be enabled between the identity provider and Snowflake. Once both are synchronized through SCIM, users will automatically get created and their group membership will be reflected as roles in Snowflake.
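
For reference, a rough sketch of enabling SCIM provisioning from an identity provider; Azure AD is assumed here, and the integration and role names are illustrative:

USE ROLE ACCOUNTADMIN;

-- Role that the SCIM integration runs as; it needs to create users and roles.
CREATE ROLE IF NOT EXISTS aad_provisioner;
GRANT CREATE USER ON ACCOUNT TO ROLE aad_provisioner;
GRANT CREATE ROLE ON ACCOUNT TO ROLE aad_provisioner;
GRANT ROLE aad_provisioner TO ROLE ACCOUNTADMIN;

CREATE SECURITY INTEGRATION aad_scim_provisioning
  TYPE = SCIM
  SCIM_CLIENT = 'AZURE'
  RUN_AS_ROLE = 'AAD_PROVISIONER';

-- The generated token is configured on the identity provider side.
SELECT SYSTEM$GENERATE_SCIM_ACCESS_TOKEN('AAD_SCIM_PROVISIONING');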

A company’s daily Snowflake workload consists of a very large number of concurrent queries triggered between 9pm and 11pm. Individually, these queries are small statements that complete within a short period of time.

What configuration can the company’s Architect implement to enhance the performance of this workload? (Choose two.)

A.

Enable a multi-clustered virtual warehouse in maximized mode during the workload duration.

B.

Set the MAX_CONCURRENCY_LEVEL to a higher value than its default value of 8 at the virtual warehouse level.

C.

Increase the size of the virtual warehouse to size X-Large.

D.

Reduce the amount of data that is being processed through this workload.

E.

Set the connection timeout to a higher value than its default.
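
For reference, sketches of the two warehouse settings the options describe, using a hypothetical warehouse name:

-- Maximized mode: minimum and maximum cluster counts are equal and greater than 1,
-- so all clusters run for the duration of the workload.
ALTER WAREHOUSE night_batch_wh SET
  MIN_CLUSTER_COUNT = 4
  MAX_CLUSTER_COUNT = 4;

-- Raise the per-cluster concurrency limit above its default of 8.
ALTER WAREHOUSE night_batch_wh SET MAX_CONCURRENCY_LEVEL = 16;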

An Architect needs to allow a user to create a database from an inbound share.

To meet this requirement, the user’s role must have which privileges? (Choose two.)

A.

IMPORT SHARE;

B.

IMPORT PRIVILEGES;

C.

CREATE DATABASE;

D.

CREATE SHARE;

E.

IMPORT DATABASE;
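
For reference, a minimal sketch of the grants involved and the resulting statement, using hypothetical role, share, and database names:

USE ROLE ACCOUNTADMIN;
GRANT IMPORT SHARE ON ACCOUNT TO ROLE data_consumer_role;
GRANT CREATE DATABASE ON ACCOUNT TO ROLE data_consumer_role;

-- The grantee can then create a database from the inbound share.
USE ROLE data_consumer_role;
CREATE DATABASE sales_db FROM SHARE provider_acct.sales_share;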

What considerations need to be taken when using database cloning as a tool for data lifecycle management in a development environment? (Select TWO).

A.

Any pipes in the source are not cloned.

B.

Any pipes in the source referring to internal stages are not cloned.

C.

Any pipes in the source referring to external stages are not cloned.

D.

The clone inherits all granted privileges of all child objects in the source object, including the database.

E.

The clone inherits all granted privileges of all child objects in the source object, excluding the database.
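
For reference, a short sketch of cloning a production database for development, with hypothetical names:

-- Clone the production database; this is a zero-copy operation.
CREATE DATABASE dev_db CLONE prod_db;

-- Verify which pipes carried over; pipes that reference internal stages are not cloned.
SHOW PIPES IN DATABASE dev_db;

-- Privileges granted on the source database itself are not inherited by the clone,
-- so re-grant access explicitly.
GRANT USAGE ON DATABASE dev_db TO ROLE dev_role;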

A company has a Snowflake environment running in AWS us-west-2 (Oregon). The company needs to share data privately with a customer who is running their Snowflake environment in Azure East US 2 (Virginia).

What is the recommended sequence of operations that must be followed to meet this requirement?

A.

1. Create a share and add the database privileges to the share
2. Create a new listing on the Snowflake Marketplace
3. Alter the listing and add the share
4. Instruct the customer to subscribe to the listing on the Snowflake Marketplace

B.

1. Ask the customer to create a new Snowflake account in Azure East US 2 (Virginia)
2. Create a share and add the database privileges to the share
3. Alter the share and add the customer's Snowflake account to the share

C.

1. Create a new Snowflake account in Azure East US 2 (Virginia)
2. Set up replication between AWS us-west-2 (Oregon) and Azure East US 2 (Virginia) for the database objects to be shared
3. Create a share and add the database privileges to the share
4. Alter the share and add the customer's Snowflake account to the share

D.

1. Create a reader account in Azure East US 2 (Virginia)
2. Create a share and add the database privileges to the share
3. Add the reader account to the share
4. Share the reader account's URL and credentials with the customer
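
For reference, a rough sketch of the replicate-then-share sequence referenced in the options above, using hypothetical organization, account, and object names:

-- On the primary account in AWS us-west-2: allow replication to the Azure account.
ALTER DATABASE sales_db ENABLE REPLICATION TO ACCOUNTS myorg.azure_eastus2_acct;

-- On the provider's new account in Azure East US 2: create and refresh the secondary.
CREATE DATABASE sales_db AS REPLICA OF myorg.aws_uswest2_acct.sales_db;
ALTER DATABASE sales_db REFRESH;

-- Still on the Azure account: create the share and add the customer's account.
CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
GRANT SELECT ON ALL TABLES IN SCHEMA sales_db.public TO SHARE sales_share;
ALTER SHARE sales_share ADD ACCOUNTS = customer_azure_acct;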

The Business Intelligence team reports that when some team members run queries for their dashboards in parallel with others, the query response time gets significantly slower. What can a Snowflake Architect do to identify what is occurring and troubleshoot this issue?

[The four answer options for this question are presented as images in the original exam and are not reproduced here.]

A.

Option A

B.

Option B

C.

Option C

D.

Option D

A Snowflake Architect is setting up database replication to support a disaster recovery plan. The primary database has external tables.

How should the database be replicated?

A.

Create a clone of the primary database then replicate the database.

B.

Move the external tables to a database that is not replicated, then replicate the primary database.

C.

Replicate the database ensuring the replicated database is in the same region as the external tables.

D.

Share the primary database with an account in the same region that the database will be replicated to.

The following table exists in the production database:

A regulatory requirement states that the company must mask the username for events that are older than six months based on the current date when the data is queried.

How can the requirement be met without duplicating the event data and making sure it is applied when creating views using the table or cloning the table?

A.

Use a masking policy on the username column using an entitlement table with valid dates.

B.

Use a row level policy on the user_events table using an entitlement table with valid dates.

C.

Use a masking policy on the username column with event_timestamp as a conditional column.

D.

Use a secure view on the user_events table using a case statement on the username column.
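
For reference, a minimal sketch of a masking policy that uses a conditional column, assuming a hypothetical user_events(username, event_timestamp) table:

-- Mask usernames for events older than six months, based on the event timestamp.
CREATE OR REPLACE MASKING POLICY mask_old_usernames
  AS (username STRING, event_timestamp TIMESTAMP_NTZ)
  RETURNS STRING ->
  CASE
    WHEN event_timestamp < DATEADD(month, -6, CURRENT_DATE()) THEN '***MASKED***'
    ELSE username
  END;

-- event_timestamp is passed as the conditional column that drives the masking decision.
ALTER TABLE user_events MODIFY COLUMN username
  SET MASKING POLICY mask_old_usernames USING (username, event_timestamp);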

When using the Snowflake Connector for Kafka, what data formats are supported for the messages? (Choose two.)

A.

CSV

B.

XML

C.

Avro

D.

JSON

E.

Parquet

An Architect uses COPY INTO with the ON_ERROR=SKIP_FILE option to bulk load CSV files into a table called TABLEA, using its table stage. One file named file5.csv fails to load. The Architect fixes the file and re-loads it to the stage with the exact same file name it had previously.

Which commands should the Architect use to load only file5.csv file from the stage? (Choose two.)

A.

COPY INTO tablea FROM @%tablea RETURN_FAILED_ONLY = TRUE;

B.

COPY INTO tablea FROM @%tablea;

C.

COPY INTO tablea FROM @%tablea FILES = ('file5.csv');

D.

COPY INTO tablea FROM @%tablea FORCE = TRUE;

E.

COPY INTO tablea FROM @%tablea NEW_FILES_ONLY = TRUE;

F.

COPY INTO tablea FROM @%tablea MERGE = TRUE;
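
For reference, a sketch of restricting the COPY command to the named file in the table stage; FORCE is included only on the assumption that the load metadata still records the file, and may not be needed:

COPY INTO tablea
  FROM @%tablea
  FILES = ('file5.csv')  -- load only the fixed file
  FORCE = TRUE;          -- reload even if load history already lists it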

The IT Security team has identified that there is an ongoing credential stuffing attack on many of the organization’s systems.

What is the BEST way to find recent and ongoing login attempts to Snowflake?

A.

Call the LOGIN_HISTORY Information Schema table function.

B.

Query the LOGIN_HISTORY view in the ACCOUNT_USAGE schema in the SNOWFLAKE database.

C.

View the History tab in the Snowflake UI and set up a filter for SQL text that contains the text "LOGIN".

D.

View the Users section in the Account tab in the Snowflake UI and review the last login column.
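
For reference, two query sketches against the login history sources named in the options; column names are as documented for LOGIN_HISTORY:

-- Near-real-time view of the last 24 hours via the Information Schema table function
-- (retains roughly 7 days of history).
SELECT event_timestamp, user_name, client_ip, is_success, error_message
FROM TABLE(INFORMATION_SCHEMA.LOGIN_HISTORY(
       TIME_RANGE_START => DATEADD(hour, -24, CURRENT_TIMESTAMP()),
       TIME_RANGE_END   => CURRENT_TIMESTAMP()))
ORDER BY event_timestamp DESC;

-- Account-wide history via ACCOUNT_USAGE (up to 365 days of retention, with some latency).
SELECT event_timestamp, user_name, client_ip, is_success, error_message
FROM SNOWFLAKE.ACCOUNT_USAGE.LOGIN_HISTORY
WHERE is_success = 'NO'
  AND event_timestamp > DATEADD(day, -7, CURRENT_TIMESTAMP());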

Consider the following scenario where a masking policy is applied on the CREDITCARDNO column of the CREDITCARDINFO table. The masking policy definition is as follows:

Sample data for the CREDITCARDINFO table is as follows:

NAME EXPIRYDATE CREDITCARDNO

JOHN DOE 2022-07-23 4321 5678 9012 1234

If the Snowflake system roles have not been granted any additional roles, what will be the result?

A.

The SYSADMIN role can see the CREDITCARDNO column data in clear text.

B.

The owner of the table will see the CREDITCARDNO column data in clear text.

C.

Anyone with the PI_ANALYTICS role will see the last 4 characters of the CREDITCARDNO column data in clear text.

D.

Anyone with the PI_ANALYTICS role will see the CREDITCARDNO column as '***MASKED***'.

How do Snowflake databases that are created from shares differ from standard databases that are not created from shares? (Choose three.)

A.

Shared databases are read-only.

B.

Shared databases must be refreshed in order for new data to be visible.

C.

Shared databases cannot be cloned.

D.

Shared databases are not supported by Time Travel.

E.

Shared databases will have the PUBLIC or INFORMATION_SCHEMA schemas without explicitly granting these schemas to the share.

F.

Shared databases can also be created as transient databases.

Which of the following objects can be cloned in Snowflake?

A.

Permanent table

B.

Transient table

C.

Temporary table

D.

External tables

E.

Internal stages

A group of Data Analysts have been granted the role ANALYST_ROLE. They need a Snowflake database where they can create and modify tables, views, and other objects and load them with their own data. The Analysts should not have the ability to give other Snowflake users outside of their role access to this data.

How should these requirements be met?

A.

Grant ANALYST_ROLE ownership on the database, but make sure that ANALYST_ROLE does not have the MANAGE GRANTS privilege on the account.

B.

Grant SYSADMIN ownership of the database, but grant the create schema privilege on the database to the ANALYST_ROLE.

C.

Make every schema in the database a managed access schema, owned by SYSADMIN, and grant create privileges on each schema to the ANALYST_ROLE for each type of object that needs to be created.

D.

Grant ANALYST_ROLE ownership on the database, but grant the OWNERSHIP ON FUTURE [object type]S IN DATABASE privilege to SYSADMIN.
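
For reference, short sketches of the two access-control mechanisms the options rely on, with hypothetical object names:

-- Transfer ownership of the database to the analysts' role (without granting MANAGE GRANTS).
GRANT OWNERSHIP ON DATABASE analyst_db TO ROLE analyst_role COPY CURRENT GRANTS;

-- Alternatively, a managed access schema: only the schema owner or a role with
-- MANAGE GRANTS can grant privileges on objects contained in it.
CREATE SCHEMA analyst_db.sandbox WITH MANAGED ACCESS;
GRANT CREATE TABLE, CREATE VIEW ON SCHEMA analyst_db.sandbox TO ROLE analyst_role;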

The following chart represents the performance of a virtual warehouse over time:

A Data Engineer notices that the warehouse is queueing queries. The warehouse is size X-Small, the minimum and maximum cluster counts are set to 1, the scaling policy is set to Standard, and auto-suspend is set to 10 minutes.

How can the performance be improved?

A.

Change the cluster settings.

B.

Increase the size of the warehouse.

C.

Change the scaling policy to economy.

D.

Change auto-suspend to a longer time frame.

An Architect needs to design a data unloading strategy for Snowflake that will be used with the COPY INTO <location> command.

Which configuration is valid?

A.

Location of files: Snowflake internal location. File formats: CSV, XML. File encoding: UTF-8. Encryption: 128-bit

B.

Location of files: Amazon S3. File formats: CSV, JSON. File encoding: Latin-1 (ISO-8859). Encryption: 128-bit

C.

Location of files: Google Cloud Storage. File formats: Parquet. File encoding: UTF-8. Compression: gzip

D.

Location of files: Azure ADLS. File formats: JSON, XML, Avro, Parquet, ORC. Compression: bzip2. Encryption: User-supplied key
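
For reference, a sketch of a valid unload configuration (Snowflake unloads delimited files, JSON, and Parquet); the stage and table names below are hypothetical:

-- Unload gzip-compressed CSV files with a header row to an external stage path.
COPY INTO @my_ext_stage/unload/sales_
  FROM sales
  FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP FIELD_OPTIONALLY_ENCLOSED_BY = '"')
  HEADER = TRUE
  OVERWRITE = TRUE;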

An Architect needs to improve the performance of reports that pull data from multiple Snowflake tables, join, and then aggregate the data. Users access the reports using several dashboards. There are performance issues on Monday mornings between 9:00am and 11:00am, when many users check the sales reports.

The size of the group has increased from 4 to 8 users. Waiting times to refresh the dashboards have increased significantly. Currently this workload is being served by a virtual warehouse with the following parameters:

AUTO_RESUME = TRUE, AUTO_SUSPEND = 60, SIZE = Medium

What is the MOST cost-effective way to increase the availability of the reports?

A.

Use materialized views and pre-calculate the data.

B.

Increase the warehouse to size Large and set auto_suspend = 600.

C.

Use a multi-cluster warehouse in maximized mode with 2 size Medium clusters.

D.

Use a multi-cluster warehouse in auto-scale mode with 1 size Medium cluster, and set min_cluster_count = 1 and max_cluster_count = 4.
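
For reference, a sketch of the auto-scale configuration described in the last option, using a hypothetical warehouse name:

-- Auto-scale mode: additional Medium clusters start only while queries are queueing,
-- then shut down again, which suits a short Monday-morning spike.
ALTER WAREHOUSE reporting_wh SET
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY = 'STANDARD'
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;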
