Q1

A financial services company uses an Azure SQL Managed Instance that is part of a failover group spanning two Azure regions for disaster recovery. During a DR test, a planned failover is initiated. After the failover, applications report that previously fast queries now intermittently run long. Analysis shows the issue is caused by parameter-sensitive plans (PSP) that were optimal on the original primary but are inefficient given the data distribution on the former secondary replica, which is now the primary. Which action should be taken to resolve this performance issue with minimal service disruption?
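For context, a minimal sketch of how a regressed parameter-sensitive plan can be investigated and, if appropriate, pinned through Query Store on the new primary; the query_id and plan_id values are placeholders, not from the scenario.

```sql
-- Illustrative only: find queries whose plans have become slow after the
-- failover. The query_id/plan_id values below are placeholders.
SELECT TOP (20)
       q.query_id,
       p.plan_id,
       rs.avg_duration,
       rs.count_executions
FROM sys.query_store_query AS q
JOIN sys.query_store_plan AS p
    ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats AS rs
    ON rs.plan_id = p.plan_id
ORDER BY rs.avg_duration DESC;

-- If a previously good plan exists, it can be pinned without code changes:
EXEC sys.sp_query_store_force_plan @query_id = 42, @plan_id = 17;
```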

Q2 (Multiple answers)

A manufacturing company uses SQL Server 2022 on an Azure VM for its inventory management system. To comply with internal security policies, the database administrator must ensure that all database users are authenticated exclusively through Microsoft Entra ID and that SQL logins are disabled at the server level. Which TWO actions must be performed to enforce this policy? (Select TWO)
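For illustration, a minimal sketch of creating a Microsoft Entra-based login and database user on a SQL Server 2022 instance where Entra authentication has already been enabled; the principal and database names are placeholders.

```sql
-- Illustrative sketch: with Microsoft Entra authentication enabled on the
-- SQL Server 2022 instance, principals can be created from the external
-- provider. The names below are placeholders.
CREATE LOGIN [dba-group@contoso.com] FROM EXTERNAL PROVIDER;

-- Map the login to a user in the inventory database.
USE InventoryDB;
CREATE USER [dba-group@contoso.com] FOR LOGIN [dba-group@contoso.com];
```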

Q3

You manage a fleet of Azure SQL Databases for a SaaS application using an elastic pool. You need to schedule a script that archives data older than 90 days from a specific table across all databases in the pool, running every Sunday at 2:00 AM UTC. Which Azure service should you use to create, schedule, and manage this recurring task with the least operational overhead?
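For illustration, a minimal sketch of the archive step such a scheduled job might run in each pooled database; the table and column names (dbo.OrderHistory, dbo.OrderHistoryArchive, OrderDate) are assumptions.

```sql
-- Illustrative archive step run in each database: copy rows older than 90
-- days into an archive table, then remove them from the active table.
BEGIN TRANSACTION;

INSERT INTO dbo.OrderHistoryArchive
SELECT *
FROM dbo.OrderHistory
WHERE OrderDate < DATEADD(DAY, -90, SYSUTCDATETIME());

DELETE FROM dbo.OrderHistory
WHERE OrderDate < DATEADD(DAY, -90, SYSUTCDATETIME());

COMMIT TRANSACTION;
```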

Q4

A database administrator is configuring a high availability solution for a critical SQL Server 2019 instance running on an Azure Virtual Machine. The requirements are to have an RPO of zero for databases within the same Azure region and an RTO of less than 15 minutes. The solution must also provide a readable secondary replica for offloading reporting queries. Which configuration meets all these requirements?
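For illustration, a minimal sketch of an availability group definition with a synchronous-commit, readable secondary, the kind of configuration the scenario revolves around; all server, endpoint, and database names are placeholders.

```sql
-- Illustrative only: a two-replica availability group with synchronous commit
-- (zero data loss within the region) and a secondary that accepts read-only
-- connections for reporting. Names are placeholders.
CREATE AVAILABILITY GROUP [AG_Critical]
FOR DATABASE [SalesDB]
REPLICA ON
    N'SQLVM1' WITH (
        ENDPOINT_URL = N'TCP://sqlvm1.contoso.local:5022',
        AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE = AUTOMATIC,
        SECONDARY_ROLE (ALLOW_CONNECTIONS = ALL)
    ),
    N'SQLVM2' WITH (
        ENDPOINT_URL = N'TCP://sqlvm2.contoso.local:5022',
        AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE = AUTOMATIC,
        SECONDARY_ROLE (ALLOW_CONNECTIONS = ALL)
    );
```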

Q5

A retail company is migrating its on-premises SQL Server 2014 database to a General Purpose Azure SQL Database. The lead DBA wants to establish a performance baseline before the migration. Because the server runs SQL Server 2014, Query Store is not available. The goal is to capture key performance metrics such as CPU usage, IOPS, and query execution statistics over a representative one-week period. Which tool is most appropriate for collecting this comprehensive performance data from the on-premises server for migration planning?
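For context, a minimal sketch of one slice of such a baseline pulled from the plan cache DMVs; these counters reset on service restart, which is one reason a dedicated collection tool is usually preferred for a week-long baseline.

```sql
-- Illustrative only: cumulative query statistics from the plan cache. DMV
-- counters reset on restart, so this is a snapshot, not a week-long baseline.
SELECT TOP (25)
       qs.total_worker_time / qs.execution_count AS avg_cpu_time,
       qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
       qs.execution_count,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
           ((CASE qs.statement_end_offset
                 WHEN -1 THEN DATALENGTH(st.text)
                 ELSE qs.statement_end_offset END
             - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;
```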

Q6

A database contains sensitive employee salary information in a column named 'Salary'. A new data analyst needs to query the employee table for statistical analysis but must not be able to see the actual salary values. The analyst should see a masked value, such as '0.00', for all employees except those in their own department, where they can see the actual salary. Which combination of security features should be implemented to meet this requirement?
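For illustration, a minimal sketch of applying a dynamic data mask to the Salary column; the table and role names are placeholders, and the per-department exception in the scenario is the part that requires combining masking with another feature.

```sql
-- Illustrative only: mask the Salary column so that, by default, users see 0
-- instead of the real value. Table and role names are placeholders.
ALTER TABLE dbo.Employee
    ALTER COLUMN Salary ADD MASKED WITH (FUNCTION = 'default()');

-- Principals that should see real values can be granted UNMASK; the
-- department-level exception in the scenario is what calls for a second
-- feature on top of masking.
GRANT UNMASK TO DataAnalystRole;
```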

Q7

You are investigating a blocking chain in an Azure SQL Database. You have identified the head blocker session ID as 72. You need to find the specific T-SQL statement that session 72 is currently executing. Which Dynamic Management View (DMV) and function should you query to retrieve this information?
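For context, a minimal sketch of the common pattern for retrieving the statement a session is currently running, joining sys.dm_exec_requests to sys.dm_exec_sql_text; session 72 is taken from the scenario.

```sql
-- Resolve the currently executing statement for session 72:
-- sys.dm_exec_requests exposes the sql_handle and statement offsets, and
-- sys.dm_exec_sql_text() turns the handle into the batch text.
SELECT r.session_id,
       r.status,
       r.wait_type,
       SUBSTRING(t.text, (r.statement_start_offset / 2) + 1,
           ((CASE r.statement_end_offset
                 WHEN -1 THEN DATALENGTH(t.text)
                 ELSE r.statement_end_offset END
             - r.statement_start_offset) / 2) + 1) AS current_statement
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id = 72;
```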

Q8

True or False: When configuring an Always On availability group for SQL Server on Azure Virtual Machines, a load balancer is required to redirect client connections to the primary replica after a failover.

Q9

A database administrator needs to deploy a new Azure SQL Managed Instance using an ARM template. The deployment must be idempotent, meaning running the template multiple times should result in the same state without errors. The administrator must specify the name of the instance in the template. Which ARM template function should be used to ensure the managed instance name is globally unique to avoid deployment failures?

Q10

An e-commerce company is using Azure SQL Database Hyperscale. During peak sales events, the database experiences significant write activity, leading to transaction log generation rates that approach the 100 MBps limit. The company wants to avoid performance degradation or throttling due to this high log generation. What is the most effective way to scale the database to accommodate this workload?
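For context, a minimal sketch of how the log generation pressure described here can be confirmed from within the database using sys.dm_db_resource_stats, whose avg_log_write_percent column reports consumption against the log-rate limit.

```sql
-- Illustrative monitoring query: each row covers roughly a 15-second interval;
-- avg_log_write_percent shows how close the workload is to the log-rate limit.
SELECT TOP (40)
       end_time,
       avg_log_write_percent,
       avg_cpu_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;
```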