A financial services firm is architecting a multi-tenant data platform on Snowflake. They will host data for several independent hedge funds. A critical requirement is that no hedge fund can ever see another's data, and network traffic for each must be isolated to a specific set of IP addresses. Additionally, the firm wants to manage all accounts centrally under a single master agreement. Which combination of Snowflake features is required to meet these stringent isolation and management requirements?
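For context on the network-isolation half of this requirement, a minimal sketch of how an IP allow-list is typically attached at the account level; the policy name and CIDR range are hypothetical, and this is not presented as the full answer.

```sql
-- Hypothetical per-tenant network restriction: name and CIDR range are placeholders.
CREATE NETWORK POLICY fund_alpha_policy
  ALLOWED_IP_LIST = ('203.0.113.0/24')
  COMMENT = 'Only Fund Alpha approved egress IPs may connect';

-- Applied account-wide (run in that tenant account by an administrator):
ALTER ACCOUNT SET NETWORK_POLICY = fund_alpha_policy;
```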
Q2
A data architect is designing a data vault model in Snowflake. To improve query performance for the business vault, they are considering applying constraints to the link and satellite tables. They want the query optimizer to use this metadata, but they do not want Snowflake to expend resources validating the constraints during data loading. What is the correct syntax to achieve this?
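For context, a sketch of the kind of DDL the question refers to, with hypothetical data vault object names; the constraint property clause being asked about would attach to statements like these.

```sql
-- Hypothetical hub and link objects for a data vault model.
CREATE TABLE hub_customer (
    customer_hk  BINARY(20)    NOT NULL,
    customer_bk  VARCHAR       NOT NULL,
    load_dts     TIMESTAMP_NTZ,
    record_src   VARCHAR,
    CONSTRAINT pk_hub_customer PRIMARY KEY (customer_hk)
);

-- The property clause in question would follow the REFERENCES column list.
ALTER TABLE link_customer_order
  ADD CONSTRAINT fk_link_customer
  FOREIGN KEY (customer_hk) REFERENCES hub_customer (customer_hk);
```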
Q3
A data engineering team is building an ELT pipeline. A stream object has been created on a raw data table to capture changes (inserts, updates, deletes). A downstream task merges these changes into a dimension table. After a successful merge operation, the team notices that the stream is not empty and still contains the same change records, so the next task run processes those records again and introduces duplicates into the dimension table. What is the most likely cause of this behavior?
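A minimal sketch of the pipeline described, with hypothetical object names, to make the moving parts concrete.

```sql
-- Change capture on the raw table.
CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;

-- Downstream task that merges captured changes into the dimension table.
CREATE OR REPLACE TASK merge_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE  = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw_orders_stream')
AS
  MERGE INTO dim_orders d
  USING raw_orders_stream s   -- a stream's offset advances only when it is read by DML in a committed transaction
    ON d.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET d.order_status = s.order_status
  WHEN NOT MATCHED THEN INSERT (order_id, order_status) VALUES (s.order_id, s.order_status);
```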
Q4
During a performance review of a large data warehouse, an architect analyzes a query profile for a frequently executed report. The profile reveals that a significant portion of the execution time is spent on a 'TableScan' operation, and the 'Partitions scanned' is nearly equal to the 'Partitions total', despite the query having a highly selective `WHERE` clause on a `TIMESTAMP_NTZ` column. The table's data is naturally ordered by the timestamp of insertion. What is the most effective and cost-efficient first step to optimize this query?
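As context for reasoning about pruning, one way to check how the table's micro-partitions relate to the filter column before choosing a fix; the table and column names are hypothetical.

```sql
-- Reports clustering depth and overlap statistics for the named column(s).
SELECT SYSTEM$CLUSTERING_INFORMATION('sales_fact', '(event_ts)');
```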
Q5 (Multiple answers)
A healthcare organization must implement a security model where data analysts can query patient data for statistical research but must NEVER see patient names or social security numbers. However, a separate group of 'Auditors' must be able to view the original, unmasked data for compliance checks. The solution must be centrally managed and automatically applied to any user with the `ANALYST` role. Which Snowflake security features should be combined to meet these requirements? (Select TWO)
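To make the requirement concrete, a minimal sketch of role-aware column masking, assuming hypothetical role, table, and column names; it illustrates only part of the scenario, not the complete two-feature answer.

```sql
-- Role-aware masking: auditors see original values, everyone else sees a mask.
CREATE OR REPLACE MASKING POLICY mask_ssn AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() = 'AUDITOR' THEN val   -- auditors see the original value
    ELSE '***MASKED***'                        -- everyone else, including ANALYST
  END;

ALTER TABLE patients MODIFY COLUMN ssn SET MASKING POLICY mask_ssn;
```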
Q6
True or False: When sharing a table with a consumer account via a standard Secure Share, the data is physically copied to the consumer's account, and the consumer is responsible for the storage costs of the shared data.
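For reference when evaluating the statement, a sketch of how a standard secure share is typically set up on the provider side; all object and account names are illustrative.

```sql
-- Provider-side setup of a standard share.
CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
GRANT SELECT ON TABLE sales_db.public.orders TO SHARE sales_share;

-- Make the share visible to the consumer account.
ALTER SHARE sales_share ADD ACCOUNTS = myorg.consumer_acct;
```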
Q7
A media company is building a data pipeline to process video metadata files (JSON format) arriving in an S3 bucket. The JSON files have a deeply nested structure. The goal is to load this raw JSON into a staging table with a single VARIANT column and then transform and flatten the nested arrays into a structured analytical table. What is the most appropriate Snowflake function to use for un-nesting the JSON arrays during the transformation step?
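A sketch of the raw-load half of the pipeline, assuming a hypothetical external stage and object names; the un-nesting function for the transform step is the part the question asks about.

```sql
-- Staging table with a single VARIANT column for the raw JSON documents.
CREATE OR REPLACE TABLE video_meta_raw (v VARIANT);

CREATE OR REPLACE FILE FORMAT json_ff TYPE = JSON;

-- Load the nested JSON files from the external stage as-is.
COPY INTO video_meta_raw
  FROM @video_stage/metadata/
  FILE_FORMAT = (FORMAT_NAME = json_ff);
```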
Q8
An e-commerce company has a multi-cluster warehouse configured with a scaling policy set to `ECONOMY`. During the peak holiday season, they observe that user queries are frequently getting queued, leading to slow dashboard performance. The monitoring dashboard shows that while the warehouse has scaled out to its maximum cluster count, the average CPU utilization across all clusters remains low, around 30-40%. What is the most likely reason for this behavior?
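For context, roughly how the warehouse in the scenario would be defined; the size and cluster counts are hypothetical.

```sql
-- Multi-cluster warehouse with the ECONOMY scaling policy described in the scenario.
CREATE OR REPLACE WAREHOUSE bi_wh
  WAREHOUSE_SIZE    = 'LARGE'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 6
  SCALING_POLICY    = 'ECONOMY'
  AUTO_SUSPEND      = 300
  AUTO_RESUME       = TRUE;
```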
Q9
**Case Study: Global Retailer's Data Mesh Architecture**

A large retail corporation with headquarters in North America is implementing a decentralized Data Mesh architecture using Snowflake. They have business units in EMEA and APAC, each responsible for their own data products (e.g., Sales, Marketing, Supply Chain). Each business unit will have its own Snowflake account in its respective cloud region (AWS us-east-1, Azure West Europe, GCP asia-southeast1) to maintain data sovereignty and autonomy.

**Current Situation & Technical Requirements:**

1. The central BI team in North America needs to build consolidated global sales dashboards. This requires joining sales data from all three regional accounts.
2. The solution must be real-time; as soon as a regional sales table is updated, the change should be reflected in the central account.
3. The central BI team must not incur storage costs for the regional data. They should only pay for the compute they use to query it.
4. The architecture must be resilient. If the primary cloud region for the central BI team (AWS us-east-1) becomes unavailable, they must be able to fail over to a secondary account in AWS us-west-2 with minimal data loss (RPO < 5 minutes) and be operational within an hour (RTO < 1 hour).

Which architectural design best satisfies all the requirements of the global retailer?
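A sketch addressing only the resiliency requirement (4), assuming hypothetical object and account names; it is not the full multi-region design the question asks for.

```sql
-- Failover group replicating the BI database to a secondary account on a schedule
-- chosen to stay within the stated RPO.
CREATE FAILOVER GROUP bi_failover_group
  OBJECT_TYPES = DATABASES
  ALLOWED_DATABASES = bi_analytics_db
  ALLOWED_ACCOUNTS = myorg.bi_dr_uswest2
  REPLICATION_SCHEDULE = '5 MINUTE';
```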
Q10
A DevOps team is automating the deployment of a Snowflake environment using CI/CD. As part of the process, they need to programmatically check if a specific Row Access Policy is attached to a given table before proceeding with other changes. Which information source should they query to get this information reliably?
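For context, one way policy attachments can be inspected programmatically, with hypothetical database, schema, and table names; whether it is the intended source depends on the answer options.

```sql
-- List policies currently attached to a specific table and filter for row access policies.
SELECT policy_name, policy_kind
FROM TABLE(
  my_db.INFORMATION_SCHEMA.POLICY_REFERENCES(
    REF_ENTITY_NAME   => 'my_db.my_schema.customers',
    REF_ENTITY_DOMAIN => 'TABLE'
  )
)
WHERE policy_kind = 'ROW_ACCESS_POLICY';
```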