The following is a list of SQL interview questions covering different aspects of database design, drive space, and error logs.
SQL Interview Questions 1: What is your approach to the AUTO_GROWTH option while creating a new database?

The Auto Growth setting plays a vital role from a database performance perspective. There are two settings we should handle:

- File Growth (in percent / in MB)
- Maximum File Size (restricted / unrestricted)

Remember the following points while setting the Auto Growth option:

Auto grow (allocating additional space to the database files) happens when a database file runs out of space. Most likely the new space will not be physically adjacent to the existing space; SQL Server allocates it wherever space is available on the disk, which leads to physical fragmentation.
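As a minimal sketch, these settings can be configured with ALTER DATABASE ... MODIFY FILE (the database and logical file names here are hypothetical):

```sql
-- Set a fixed 256 MB growth increment and a 50 GB cap for the data file
-- of a hypothetical database "SalesDB" (logical file name "SalesDB_Data").
ALTER DATABASE SalesDB
MODIFY FILE (NAME = SalesDB_Data, FILEGROWTH = 256MB, MAXSIZE = 50GB);

-- Inspect the current auto-growth settings:
SELECT name, growth, is_percent_growth, max_size
FROM sys.master_files
WHERE database_id = DB_ID('SalesDB');
```

A fixed growth size in MB is generally preferred over percentage growth, since percentage increments keep getting larger as the file grows.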
SQL Interview Questions 2: We know we cannot use ORDER BY inside a view. Can you tell me why?

Even though I did a lot of research on the question "why?", I could not find a specific answer on MSDN blogs. But one author's answer is more convincing:

"A view is just like a table. As per the RDBMS rules we must take table data and produce a result set; we can apply as many transformations as we like while producing the result set, but not at the source (in the table/view)."

Let's say we had the facility to use an ORDER BY clause inside a view. In most scenarios we join view data with another dataset (a view or table), which produces a new result set. In that case the extra work of sorting the data at the initial level would be wasted.

If we need to view data in sorted order, we can simply apply ORDER BY to the query on the view, but not inside the view.
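A short sketch of that point (table, column and view names are hypothetical): the sort is applied by the query on the view, never inside the view definition.

```sql
-- The view exposes the data; no ORDER BY inside the definition.
CREATE VIEW dbo.vw_ActiveEmployees AS
SELECT EmpID, EmpName, HireDate
FROM dbo.Employees
WHERE IsActive = 1;
GO

-- Sorting is applied by the outer query on the view.
SELECT EmpID, EmpName, HireDate
FROM dbo.vw_ActiveEmployees
ORDER BY HireDate DESC;
```

Note that T-SQL does accept ORDER BY inside a view when combined with TOP, but even then the order of rows returned from the view is not guaranteed.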
SQL Interview Questions 3: On a production server TEMPDB is getting full. How do you fix it?

We usually don't try to fix issues on production TEMPDB without a maintenance window. When we see TEMPDB getting full, we identify the active queries, determine which query/SPID is causing TEMPDB to fill up, and then kill that SPID.

When we need to apply a permanent fix to TEMPDB, we take a maintenance window and then apply the required fix, which may include increasing the TEMPDB data file size, creating a new data file of the same size, or creating filtered indexes based on the problematic queries to reduce the load on TEMPDB.
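One way to identify the offending sessions is the space-usage DMVs; a sketch (run in the context of tempdb):

```sql
USE tempdb;
GO
-- Sessions that allocated the most tempdb pages (user + internal objects),
-- joined to the running request so the offending SPID can be identified.
SELECT su.session_id,
       su.user_objects_alloc_page_count,
       su.internal_objects_alloc_page_count,
       r.status,
       r.command
FROM sys.dm_db_session_space_usage AS su
LEFT JOIN sys.dm_exec_requests AS r
       ON r.session_id = su.session_id
WHERE su.session_id > 50   -- skip most system sessions
ORDER BY su.user_objects_alloc_page_count
       + su.internal_objects_alloc_page_count DESC;
```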
SQL Interview Questions 4: What are the different phases in database testing?

- UNIT TESTING: Test that all parts of your object are working fine
- FUNCTIONAL TESTING: Test the code to make sure it is processing data as per the (domain acceptance) requirement document
- INTEGRITY TESTING: Test that data insertions/updates/deletions all follow domain, referential and entity integrity
- UI TESTING: Check that data values match between source and destination using the UI/application
- LOAD/STRESS TESTING: Test that your code can handle huge data loads and concurrent access
- PERFORMANCE TESTING: Test that performance is not degraded
- SECURITY TESTING: Check that the code is compatible with the standard security policies

SQL Interview Questions 5: One of the disk drives became 95% full within a short period of time. Now you are getting disk-full alerts and you need to free up space. What are the different options you try to clear space? And what are the possible reasons that cause sudden disk-full issues?

Ans: This is also a very common alert/issue in a DBA's life. Let me explain the different options we'll try to handle the disk space issues.
Possible reasons:

- A huge data load happened and increased the data file size, as auto growth is enabled
- An open transaction caused a database log file to grow, as its auto-grow option is on
- A big transaction log/differential backup was generated
- TEMPDB filled the disk due to a huge query sort or maintenance operation
- A huge number of SQL mini-dump files were created in the log folder
- It may not always be SQL Server causing the DISK FULL issue; check the large files on the disk manually and find out whether another application or the OS is causing it
- Disk usage grew gradually as part of normal processing, but the alert was disabled; it just got enabled and started sending alert messages

Resolutions:

- Quickly identify whether there are any files that can be moved to another drive
- Remove any old/unnecessary backups, SQL Server logs or crash logs you find
- If it is a log-file-full issue, handle it properly and shrink the log file
- If you find any low-risk or small databases located on that drive, try to move those database files to another drive using the detach and attach method
- If you identify an open transaction that is causing the disk-full condition, collect all possible details of that transaction and kill it
- Talk to the server owner and ask for more space on that drive, or request a new drive; take a maintenance window and increase the drive space or attach the new drive

SQL Interview Questions 6: What are the most common issues a SQL DBA should deal with as part of the daily DBA job?

- Backup failures
- Restore failures
- Log full issues
- Blocking alerts
- Deadlock alerts
- TEMPDB full issues
- Disk full issues
- SQL connectivity issues
- Access issues
- Installation and upgrade failures
- SQL Agent job failures
- Performance issues
- Resource (memory/IO/CPU etc.) utilization alerts
- High-availability and disaster-recovery related issues

SQL Interview Questions 7: You got an alert saying that a SQL Agent job has failed on one of the servers, and you need to find the root cause. What is the approach?

There are certain things we need to quickly check when a job fails:
- Job history
- SQL Server error log
- Log file, if you configured a log file in the job step's advanced properties
- Windows event log
- Job execution time delta: the time difference between the current and last execution

The above checklist will give you the maximum information about what caused the job failure. To do further RCA, note down the job execution time and capture the details below for that time window:

- CPU usage
- Memory usage
- I/O usage
- Blocking and deadlocks, if any
- Any dump file created
- Log full issues, if any
- Any other suspicious errors

SQL Interview Questions 8: Is it possible to configure Database Mirroring/Log Shipping when the servers have different drive structures?

Yes! It is possible. But Microsoft strongly suggests never configuring any topology when the primary and secondary have different drive structures. Some issues/manual interventions are required if you configure mirroring with the principal and mirror having different drive structures.
When we are restoring the backup from the principal/primary onto the mirror/secondary we should use the WITH MOVE option. Once mirroring/log shipping is configured successfully, if we later need to add a file (.NDF) on the principal/primary, we need to follow a specific process to keep the topology functioning.
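A sketch of the WITH MOVE restore (database, logical file and path names here are hypothetical):

```sql
-- Restore the principal's backup on the mirror, relocating the files
-- to match the mirror's drive layout.
RESTORE DATABASE SalesDB
FROM DISK = N'\\BackupShare\SalesDB_full.bak'
WITH MOVE N'SalesDB_Data' TO N'E:\SQLData\SalesDB.mdf',
     MOVE N'SalesDB_Log'  TO N'F:\SQLLogs\SalesDB.ldf',
     NORECOVERY;   -- mirroring/log shipping needs the database left restoring
```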
SQL Interview Questions 9: What is the THROW option in exception handling?

- THROW is simpler to use than RAISERROR
- When using THROW, once the error is thrown the next statements will not be executed, whereas with RAISERROR the next statements will be executed
- With THROW we can get the correct error number and line number
- We can simply throw user-defined error messages using THROW, whereas to use a user-defined message with RAISERROR the message first has to be added to sys.messages using sp_addmessage
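A minimal sketch of both uses (the table is hypothetical):

```sql
BEGIN TRY
    INSERT INTO dbo.Emp (EmpID) VALUES (1);
    INSERT INTO dbo.Emp (EmpID) VALUES (1);  -- assume a PK violation here
END TRY
BEGIN CATCH
    -- A parameterless THROW re-raises the caught error with the
    -- original error number and line number intact.
    THROW;
END CATCH;

-- A user-defined error does not need to exist in sys.messages:
-- THROW 50001, 'Custom error message', 1;
```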
SQL Interview Questions 10: What are the different ways available to insert data from a file into a SQL Server database table?

These are the different ways:

- BCP
- BULK INSERT
- OPENROWSET
- OPENDATASOURCE
- OPENQUERY
- LINKED SERVER
- IMPORT/EXPORT WIZARD
- SSIS

Now let me explain each in detail:
BCP:

- BCP stands for Bulk Copy Program
- Mostly used for importing/exporting text files
- Can be used from the Windows command prompt, or from SSMS using XP_CMDSHELL
- It can also generate the file format specification in XML format

Example: Import data from C:\Temp\emp.txt into the table dbo.emp:

C:\>bcp dbo.emp in C:\Temp\emp.txt -T -S serverName\instanceName -c

BULK INSERT: A T-SQL statement that imports data directly from a data file into a database table or non-partitioned view.

- It can only import data from files into SQL Server
- It doesn't support data export from SQL Server to a file
- We can provide the file format along with the data file, so it can handle the data conversion part
Example:

BULK INSERT dbo.emp FROM 'C:\Temp\emp.txt' WITH (FIELDTERMINATOR = ',', FIRSTROW = 2);
OPENROWSET: A T-SQL command that allows you to query data from other data sources directly from within SQL Server.

- It can import data from other data sources, e.g. an Excel file
- The OPENROWSET function can also be referenced as the target table of an INSERT, UPDATE, or DELETE statement
- We can provide the file format along with the data file, so it can handle the data conversion part
- We can use SQL statements to pull the data, so there is more flexibility in applying filters and selecting the required data

OPENDATASOURCE: A T-SQL command that allows you to query data from other data sources directly from within SQL Server. It is similar to the OPENROWSET command.

- The "Ad Hoc Distributed Queries" advanced configuration option should be enabled
- It can be used in SELECT, INSERT, UPDATE and DELETE statements
- We can execute a remote stored procedure using OPENDATASOURCE

Example: We can directly query the table "Employee" on another SQL Server instance "SQL2012EP":

SELECT * FROM OPENDATASOURCE('SQLNCLI', 'Data Source=SQL2012EP;Integrated Security=SSPI').AdventureWorks2012.HumanResources.Employee;

OPENQUERY:

- OPENQUERY can be referenced in the FROM clause of a query as if it were a table name
- OPENQUERY can also be referenced as the target table of an INSERT, UPDATE, or DELETE statement
- OPENQUERY requires a linked server to be created to the target data source

Example:
A linked server "OracleSvr" is created to an Oracle instance; insert data into a table in Oracle from SQL Server:

INSERT OPENQUERY (OracleSvr, 'SELECT name FROM joe.titles') VALUES ('Sr Engineer');

Update a table in Oracle using OPENQUERY:

UPDATE OPENQUERY (OracleSvr, 'SELECT name FROM joe.titles WHERE id = 101') SET name = 'Sr.Engineer';

LINKED SERVER: Links to another data source so remote objects can be referenced directly in local queries.

- We can directly use remote objects by referring to the four-part name "LinkedServer.Database.Schema.Object"
- We can create a linked server to an ODBC data source, MS Excel, MS Access, the file system, an OLEDB data source, another SQL Server, Oracle 8.0 and above, IBM DB2, or Azure SQL Database
Example:
EXEC sp_addlinkedserver 'EmpExcel', 'Jet 4.0', 'Microsoft.Jet.OLEDB.4.0', 'C:\Temp\Emp.xls', NULL, 'Excel 8.0';
GO
INSERT INTO dbo.Emp SELECT * FROM EmpExcel...[Sheet2$];

IMPORT/EXPORT WIZARD: A native tool for data imports/exports.
SSIS PACKAGE: We can build an ETL package for data transformation between data sources using SQL Server Integration Services.
SQL Interview Questions 11: How does Change Data Capture work?

The transaction log serves as the source for capturing the change data.

- First, CDC should be enabled on the database: sys.sp_cdc_enable_db
- Then CDC must be enabled for each required table: sys.sp_cdc_enable_table
- Details of all CDC-enabled tables/indexes can be retrieved using sys.sp_cdc_help_change_data_capture

All CDC objects are stored in the database once you enable CDC; they are created under the cdc schema.

All change data is retrieved and stored in change tables. Each CDC-enabled table has a corresponding change table. From the change table we can find the change data using columns like __$start_lsn (identifies the transaction that made the change), __$seqval (order of the operations within a transaction) and __$operation (1 = delete, 2 = insert, 3 = update before image, 4 = update after image).

Two jobs are created when you enable CDC on a database: a capture job and a cleanup job.

The CDC capture job is created when the first table is enabled in the database, provided transactional replication is not enabled and running for that database. If transactional replication is enabled, the replication log reader agent takes care of populating the change tables instead.

The capture job simply calls sp_replcmds to populate the change tables and runs continuously (it can scan up to 1000 transactions every 5 seconds), just like transactional replication.
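The enablement steps above can be sketched as follows (database and table names are hypothetical):

```sql
-- Enable CDC at the database level (requires sysadmin).
USE SalesDB;   -- hypothetical database
EXEC sys.sp_cdc_enable_db;

-- Enable CDC for a specific table; this creates the change table
-- cdc.dbo_Employees_CT and, for the first table, the capture/cleanup jobs.
EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'Employees',   -- hypothetical table
     @role_name     = NULL;           -- no gating role

-- Review what is being captured.
EXEC sys.sp_cdc_help_change_data_capture;
```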
SQL Interview Questions 12: In your environment how do you handle old history and backups?

We have traditionally designed maintenance plans to take care of history and maintenance cleanup, using the History Cleanup and Maintenance Cleanup tasks from maintenance plans.

- Maintenance plan log files: delete all files older than 4 weeks
- History cleanup: delete all history for backup & restore, Agent jobs & maintenance plans older than 4 weeks
- Database full backups (weekly): delete full backups older than 2 weeks
- Database diff and log backups (daily/hourly): delete all backups older than 2 weeks
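The same history cleanup can also be scripted directly against msdb, as a sketch:

```sql
-- Remove backup/restore history older than 4 weeks from msdb.
DECLARE @oldest_date DATETIME = DATEADD(WEEK, -4, GETDATE());
EXEC msdb.dbo.sp_delete_backuphistory @oldest_date = @oldest_date;

-- Remove Agent job history older than 4 weeks (all jobs).
EXEC msdb.dbo.sp_purge_jobhistory @oldest_date = @oldest_date;
```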