Does a TempDB spill mean statistics are out of date?

When you see a yellow bang in an execution plan, and you hover over it to discover that SQL Server spilled out to TempDB, your first reaction is to wail in pain and rend your clothes.


[Screenshot: Crying over spilled memory]

After that, though, it seems like people always ask the same question:

“Why doesn’t updating stats fix this?”

In the example here, I’m using Cade Roux’s rank-and-percentile query against the Stack Overflow data dump. It’s not that statistics are out of date: look at the estimated number of rows versus the actual number of rows.

They’re identical. The stats are fine.
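
I’m not reproducing Cade’s exact query here, but the general shape is a windowed rank-and-percentile calculation over one of the big Stack Overflow tables, which forces SQL Server to sort the whole thing in one pass. A rough, illustrative sketch using the stock Stack Overflow schema (this is my stand-in, not Cade’s actual code):

SELECT u.Id,
       u.Reputation,
       RANK()         OVER (ORDER BY u.Reputation DESC) AS ReputationRank,
       PERCENT_RANK() OVER (ORDER BY u.Reputation DESC) AS ReputationPercentile
FROM dbo.Users AS u; -- the sort feeding these window functions is what spills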

And no matter how many times I update statistics, I’ll still get a ~400MB spill to disk.
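
And I do mean really updating them, not a quick sample. If you want to try it yourself, this is the kind of thing I’m running between attempts (swap in whichever table your query actually hits; dbo.Users is just my example):

UPDATE STATISTICS dbo.Users WITH FULLSCAN; -- rebuild the stats from every row, and the spill still comes back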

I’m not using a small server, either: my virtual machine has 32GB RAM, and I’ve allocated 28GB of that to SQL Server. There are no other queries running at the same time; it’s just one lonely query, spilling to disk for the lack of 400MB of RAM.
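
For reference, that 28GB allocation is just the ordinary max server memory setting, 28,672MB in this case. If you want to mirror my setup, it’s something along these lines:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 28672; -- 28GB for SQL Server, the rest for the OS
RECONFIGURE;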

When I right-click on the SELECT operator and click Properties, I get the fancy properties window in SSMS, which shows me more details about the query’s memory grants:


[Screenshot: Maximum bakery]

Note the rigorous devotion to alphabetical order in the properties window. It’s up to you, dear reader, to figure out what you’re supposed to read, in which order. Here, I’ll give you a peek at my decoder ring:

DesiredMemory: look here first. It’s how much memory SQL Server wanted.
RequestedMemory: sometimes, this is less than desired. (I’d really like a Porsche 911R, but I know better than to ask my wife for the budget to buy one.)
GrantedMemory: how much I actually got.

SQL Server wasn’t off by too much: this operator only needed 7,643 more 8KB pages, around 60MB. Close but no cigar.
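
You don’t have to hover over plan operators to see these numbers, either: while the query is running, the same desired/requested/granted breakdown shows up (in KB) in sys.dm_exec_query_memory_grants. A quick sketch:

SELECT session_id,
       ideal_memory_kb,     -- enough memory to do everything in memory
       requested_memory_kb, -- what the query actually asked for
       granted_memory_kb,   -- what it was given
       max_used_memory_kb   -- the most it ended up using
FROM sys.dm_exec_query_memory_grants
ORDER BY requested_memory_kb DESC;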

Fixing it with the claw hammer: MIN_GRANT_PERCENT

Starting with SQL Server 2012 SP3, 2014 SP2, and 2016, I have a nifty new query option:

OPTION (min_grant_percent = 50)
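
The hint just goes on the end of the query. Tacked onto my illustrative rank-and-percentile sketch from earlier, it looks like this:

SELECT u.Id,
       u.Reputation,
       RANK()         OVER (ORDER BY u.Reputation DESC) AS ReputationRank,
       PERCENT_RANK() OVER (ORDER BY u.Reputation DESC) AS ReputationPercentile
FROM dbo.Users AS u
OPTION (MIN_GRANT_PERCENT = 50); -- ask for at least 50% of the maximum allowed grant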

That bullies SQL Server into giving me all the memory I need, and then some. The spill is gone from the execution plan, and my memory grants are way higher:


[Screenshot: Sort without spill]

But this has a terrible drawback: I’m claiming memory that no one else can use while my query runs. And how do I know if I even got that grant percent right, anyway? It’ll change over time as my data changes.
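
If you go down this road on a busy server, at least keep an eye on RESOURCE_SEMAPHORE waits, which is what everyone else racks up while they queue behind oversized memory grants. A quick check:

SELECT wait_type, waiting_tasks_count, wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type = 'RESOURCE_SEMAPHORE'; -- time queries have spent waiting for memory grants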

Fixing it with the scalpel: CREATE INDEX

Whenever you’re tuning queries, look at the big picture: if the data was already sorted in the order you needed, you wouldn’t be having this problem with the sort operator. And SQL Server’s trying to give you a hint in the form of a missing index recommendation:


[Screenshot: Missing index hint]

That impact number might seem low at just 14%, but remember, that’s a suggestion SQL Server came up with before it even executed the query. Before the query ran, it didn’t know about the TempDB spill, so the real impact is going to be even larger.
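
I’m not pasting the exact recommendation here, but conceptually it’s an index that hands the sort its data pre-ordered. Against my illustrative query, it would be something in this spirit (the name and shape are mine, not SQL Server’s actual suggestion):

CREATE NONCLUSTERED INDEX IX_Users_Reputation
    ON dbo.Users (Reputation DESC);
-- the clustered key (Id) rides along for free, so the windowed query
-- can read this index in order and skip the giant sort entirely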

Can you fix every spill? Of course not: sometimes you’re dealing with data that’s larger than the physical box’s memory. But even when you can fix it, just make sure you’re fixing it with the right tool.

