
Troubleshooting SQL Server Memory Consumption

Recently, one of our staging servers had a memory consumption issue: even 32 GB of RAM looked insufficient, and application performance was being impaired. Initially I thought the cause would be poorly designed queries, but a thorough investigation showed that this was not the case. I noticed that whenever the Re-indexing and Update Statistics jobs executed against the database, memory consumption increased considerably.

The database was 155 GB in size, and the server had 32 GB of RAM and 8 CPUs.

While investigating the system, I decided to test the performance of the Re-indexing and Update Statistics tasks separately.
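
Before changing anything, it helps to confirm how much memory the instance is actually using and which database is holding it. The following is a minimal sketch, assuming SQL Server 2008 or later (where these DMVs are available); the thresholds and column selection are only illustrative:

-- Illustrative memory check using sys.dm_os_process_memory and sys.dm_os_buffer_descriptors.
SELECT physical_memory_in_use_kb / 1024 AS physical_memory_in_use_mb,
       memory_utilization_percentage
FROM sys.dm_os_process_memory;

-- Buffer pool usage per database (8 KB pages converted to MB).
SELECT DB_NAME(database_id) AS database_name,
       COUNT(*) * 8 / 1024  AS buffer_pool_mb
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY buffer_pool_mb DESC;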

The T-SQL used for performing the Re-indexing of the database is shown below:

-- Original Re-indexing logic: loop over every user database and run
-- DBCC DBREINDEX against every table via sp_MSForEachtable.
DECLARE
@dbname varchar(1000),
@parentname varchar(255),
@SQLSTR VARCHAR (1000),
@ctrl CHAR (2),
@command varchar(1000)
SET @ctrl = CHAR (13) + CHAR (10)
DECLARE DBCUR CURSOR FOR
select [name]
from sysdatabases where name not in
(
'master',
'model',
'msdb',
'tempdb'
)
order by 1
OPEN DBCUR
FETCH NEXT FROM DBCUR INTO @dbname
WHILE @@FETCH_STATUS = 0
BEGIN
-- Build and run the DBCC DBREINDEX command for every table in the current database.
select @command =
'
use ['+@dbname+']
Exec sp_MSForEachtable ''DBCC DBREINDEX ("?")''
'
exec (@command)
FETCH NEXT FROM DBCUR INTO @dbname
END
CLOSE DBCUR
DEALLOCATE DBCUR
GO

Note that the above T-SQL uses the system stored procedure sp_MSForEachtable together with the DBCC DBREINDEX ("?") command. It was taking around 54 minutes to re-index the entire database, and as soon as it started executing, memory utilization on the server shot up rapidly. Since DBCC DBREINDEX is deprecated and will be removed in a future release of SQL Server, and sp_MSForEachtable is an undocumented stored procedure, I decided to change the Re-indexing logic as follows:

-- Ensure a USE statement has been executed first.
SET NOCOUNT ON;
DECLARE @objectid int;
DECLARE @indexid int;
DECLARE @partitioncount bigint;
DECLARE @schemaname nvarchar(130);
DECLARE @objectname nvarchar(130);
DECLARE @indexname nvarchar(130);
DECLARE @partitionnum bigint;
DECLARE @partitions bigint;
DECLARE @frag float;
DECLARE @command nvarchar(4000);
-- Conditionally select tables and indexes from the sys.dm_db_index_physical_stats function
-- and convert object and index IDs to names.
SELECT
object_id AS objectid,
index_id AS indexid,
partition_number AS partitionnum,
avg_fragmentation_in_percent AS frag
INTO #work_to_do
FROM sys.dm_db_index_physical_stats (DB_ID(), NULL, NULL , NULL, 'LIMITED')
WHERE avg_fragmentation_in_percent > 10.0 AND index_id > 0;
-- Declare the cursor for the list of partitions to be processed.
DECLARE partitions CURSOR FOR SELECT * FROM #work_to_do;
-- Open the cursor.
OPEN partitions;
-- Loop through the partitions.
WHILE (1=1)
BEGIN;
FETCH NEXT
FROM partitions
INTO @objectid, @indexid, @partitionnum, @frag;
IF @@FETCH_STATUS < 0 BREAK;
SELECT @objectname = QUOTENAME(o.name), @schemaname = QUOTENAME(s.name)
FROM sys.objects AS o
JOIN sys.schemas as s ON s.schema_id = o.schema_id
WHERE o.object_id = @objectid;
SELECT @indexname = QUOTENAME(name)
FROM sys.indexes
WHERE object_id = @objectid AND index_id = @indexid;
SELECT @partitioncount = count (*)
FROM sys.partitions
WHERE object_id = @objectid AND index_id = @indexid;
-- 30 is an arbitrary decision point at which to switch between reorganizing and rebuilding.
IF @frag < 30.0
SET @command = N'ALTER INDEX ' + @indexname + N' ON ' + @schemaname + N'.' + @objectname + N' REORGANIZE';
IF @frag >= 30.0
SET @command = N'ALTER INDEX ' + @indexname + N' ON ' + @schemaname + N'.' + @objectname + N' REBUILD';
IF @partitioncount > 1
SET @command = @command + N' PARTITION=' + CAST(@partitionnum AS nvarchar(10));
EXEC (@command);
--PRINT N'Executed: ' + @command;
END;
-- Close and deallocate the cursor.
CLOSE partitions;
DEALLOCATE partitions;
-- Drop the temporary table.
DROP TABLE #work_to_do;
GO

As seen from the above T-SQL, only indexes with more than 10% fragmentation are considered at all: those whose fragmentation level is below 30% are reorganized, while those at 30% or above are rebuilt. The script also eliminates the undocumented sp_MSForEachtable stored procedure as well as the DBCC DBREINDEX command.

Once this logic was implemented, the Re-indexing task completed in just 18 minutes and memory consumption on the server was a mere 5.29 GB.
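
To confirm that the new approach actually left the indexes in good shape, fragmentation can be re-checked after the run. A minimal sketch, assuming it is executed in the context of the target database (the 10% threshold here is only illustrative):

-- Illustrative check: list indexes that are still fragmented after maintenance.
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name                     AS index_name,
       ips.avg_fragmentation_in_percent,
       ips.page_count
FROM sys.dm_db_index_physical_stats (DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
    ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 10.0
  AND ips.index_id > 0
ORDER BY ips.avg_fragmentation_in_percent DESC;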

The logic initially used for performing the Update Statistics task is as follows:

-- Original Update Statistics logic: loop over every user database and run
-- UPDATE STATISTICS ... WITH FULLSCAN on every table via sp_MSForEachtable.
DECLARE
@dbname varchar(1000),
@parentname varchar(255),
@SQLSTR VARCHAR (1000),
@ctrl CHAR (2),
@command varchar(1000)
SET @ctrl = CHAR (13) + CHAR (10)
DECLARE DBCUR CURSOR FOR
select [name]
from sysdatabases where name not in
(
'master',
'model',
'msdb',
'tempdb'
)
order by 1
OPEN DBCUR
FETCH NEXT FROM DBCUR INTO @dbname
WHILE @@FETCH_STATUS = 0
BEGIN
select @command =
'
use ['+@dbname+']
Exec sp_MSForEachtable ''update statistics ? with fullscan''
'
exec (@command)
FETCH NEXT FROM DBCUR INTO @dbname
END
CLOSE DBCUR
DEALLOCATE DBCUR
GO

The majority of the tables in the database had columns of the NTEXT and IMAGE data types.
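
To see which tables are affected, the legacy LOB and binary columns can be listed from the catalog views. A minimal sketch; the list of type names is an assumption based on the data types discussed in this post:

-- Illustrative query: user tables that contain legacy LOB / binary columns.
SELECT OBJECT_SCHEMA_NAME(c.object_id) AS schema_name,
       OBJECT_NAME(c.object_id)        AS table_name,
       c.name                          AS column_name,
       t.name                          AS data_type
FROM sys.columns AS c
JOIN sys.types   AS t ON t.user_type_id = c.user_type_id
WHERE t.name IN ('text', 'ntext', 'image', 'varbinary')
  AND OBJECTPROPERTY(c.object_id, 'IsUserTable') = 1
ORDER BY table_name, column_name;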

Note from the above T-SQL that UPDATE STATISTICS WITH FULLSCAN is performed against each and every table in the database, even those with columns of data types such as NTEXT, IMAGE, and VARBINARY, which is very inefficient and can kill the performance of your server.

If a column has an NTEXT data type, its data is stored as a LOB structure. For a LOB column, when statistics are created, the first 100 bytes and the last 100 bytes of the value are read to generate the statistics. If we do not specify a sample rate for the UPDATE STATISTICS command, SQL Server calculates one based on the number of rows and the number of pages used by the table; the number of BLOB pages is not considered. As a result, when we run UPDATE STATISTICS it is possible that a large volume of data is sampled and 200 bytes of every sampled BLOB row is read, which generates large I/O requests and drives up memory consumption. Therefore, if a table has columns of data types TEXT, NTEXT, IMAGE, or VARBINARY, it is best to perform the UPDATE STATISTICS operation without a FULLSCAN.
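
For example, statistics on such a table can be refreshed with default or explicit sampling instead of a full scan. In the sketch below, dbo.Documents is a hypothetical table name used only for illustration:

-- Hypothetical examples: dbo.Documents stands in for a table with NTEXT/IMAGE columns.
UPDATE STATISTICS dbo.Documents;                        -- let SQL Server choose the sample rate
UPDATE STATISTICS dbo.Documents WITH SAMPLE 25 PERCENT; -- or specify an explicit sample rate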

I changed the T-SQL used to perform the UPDATE STATISTICS operation as follows:

SET NOCOUNT ON
DECLARE @tablename NVARCHAR(1000)
DECLARE @statsname SYSNAME
DECLARE @sql NVARCHAR(4000)
create table #temp
(
tablename varchar(1000),
statsname varchar(1000),
columnname varchar(1000)
)
-- Collect the statistics whose columns use LOB / legacy data types.
insert #temp(tablename,statsname,columnname)
SELECT DISTINCT
QUOTENAME(OBJECT_SCHEMA_NAME(s.[object_id])) + '.' + QUOTENAME(OBJECT_NAME(s.[object_id])),
s.name AS StatName,
c.name
FROM sys.stats s
JOIN sys.stats_columns sc ON sc.[object_id] = s.[object_id] AND sc.stats_id = s.stats_id
JOIN sys.columns c ON c.[object_id] = sc.[object_id] AND c.column_id = sc.column_id
JOIN INFORMATION_SCHEMA.COLUMNS D ON D.[COLUMN_NAME] = c.[name]
AND D.[TABLE_NAME] = OBJECT_NAME(s.[object_id])
AND D.[TABLE_SCHEMA] = OBJECT_SCHEMA_NAME(s.[object_id])
WHERE D.DATA_TYPE IN ('text', 'ntext', 'image', 'varbinary')
-- The source article is truncated at this point; the remainder below is a sketch of the
-- apparent intent: update these statistics with sampled (non-FULLSCAN) reads.
DECLARE STATCUR CURSOR FOR
SELECT DISTINCT tablename, statsname FROM #temp
OPEN STATCUR
FETCH NEXT FROM STATCUR INTO @tablename, @statsname
WHILE @@FETCH_STATUS = 0
BEGIN
SET @sql = N'UPDATE STATISTICS ' + @tablename + N' ' + QUOTENAME(@statsname)
EXEC (@sql)
FETCH NEXT FROM STATCUR INTO @tablename, @statsname
END
CLOSE STATCUR
DEALLOCATE STATCUR
DROP TABLE #temp
GO
