More Annoyances With Local Variables And Optimize For Unknown Hints In SQL Server





Video Summary

In this video, I delve into the frustrations and challenges that local variables and the `optimize for unknown` hint can bring to SQL Server performance tuning. While these techniques are often touted as best practices, they can significantly complicate the process of diagnosing and resolving performance issues, especially when trying to reproduce them in a testing environment or during code reviews. I demonstrate how using local variables and `optimize for unknown` hints can obscure crucial information from execution plans, making it nearly impossible to pinpoint the exact conditions that led to poor query performance. By highlighting these annoyances, I hope to encourage database professionals to reconsider their use of these practices in favor of more transparent and maintainable alternatives.

Full Transcript

Erik Darling here, with Darling Data. Don’t you just love it? Don’t you just love every minute of it? I sure do. Except the odd-numbered minutes, and some of the even-numbered minutes. A few of the seconds are great, though. Milliseconds really stand out. In today’s video, I’m going to talk about some annoyances that I have with local variables and optimize for unknown hints. Now, you’re probably used to hearing me complain about these for the problems that they cause with cardinality estimation: not using the smart part of the histogram to figure out how many rows might qualify for a WHERE clause. We’re going to look at something a little bit different in this video, because it’s something that I run into quite a bit when I’m trying to help clients, and it makes it really, really difficult to figure out… well, if I say it here, it’ll give away the whole video. So, I’m going to wait and say it later. Let’s just say I’m sufficiently annoyed with local variables and unknown hints that we need to talk about them from a completely different angle this time around.

But first, let’s talk a little bit about money. Like, if you want to give me $4 a month, you totally can. There’s a link in the video description where you can join the channel with a membership subscription thing, and it’s cheap and it’s worth it, because you get a lot of free content out of it. If $4 is beyond your means, maybe your rates are a little too reasonable, and you can’t afford $4. That’s totally cool. I like likes, I like comments, and I like new subscribers. I like new faces showing up in the comments saying, Hey Erik, how you doing? That’s about all it takes to make me happy. All the other stuff just, you know, buys me wine, I guess.

If you need help with your SQL Server, if something I talk about in one of these videos strikes you as something I’d be particularly valuable in assisting you with, my rates are reasonable. If you want some high quality, low cost training, for life, forever, you can get about 24 hours of it for about $150 US with the discount code that’s also in the video description.

So the more things you click on there, the better off we all are, right? In reality, just clicking on the links in the description will improve all of our lives greatly. If you would like to see me live and in person, you can do that: November 4th and 5th in Seattle, Washington. I don’t know if there are any other Seattles. I’m unaware of them. But I’ll be there for my birthday.

And I’ll be hanging out with Kendra Little for two full days of performance tuning magic and wonder. It’s going to be like you were a kid again, right? Watching someone do like one of those finger tricks where they, ah, where’d it go? I wasn’t flipping anyone off, I promise. I can’t blur this later, so screw it.

Anyway, if there’s an event near you that you think I would make a good speaker at, a good addition to the lineup, let me know what it is. If they’re accepting pre-cons and I can, you know, cover the cost of travel to get there, I’ll happily show up and teach people live and in person. That’s my favorite. So with that out of the way, let us begin our festivities. Let us start the party.

We will commence with the fun. So let’s go over to Management Studio, where I’ve got a stored procedure, just to make sure that we are thorough in our investigation.

I’ve got a stored procedure here that does three things. It’s got one query that just takes some regular old parameters right there, right? That’s just the normal parameter way of doing things.

And then I have a second query where I declare a couple variables and I use those instead of the parameters here. And then I’ve got a third query where I use the variables, but I add this optimize for unknown hint at the very end. Now what I’m going to do is I’m going to run each of these.
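A minimal sketch of that three-query pattern, using the Stack Overflow database’s Users table as the example schema (the procedure name, parameter names, and column choices here are my assumptions, not taken from the video):

```sql
CREATE OR ALTER PROCEDURE dbo.ThreeWays
    @Reputation integer,
    @UpVotes integer
AS
BEGIN
    SET NOCOUNT ON;

    /* 1. Plain parameters: the compile-time values land in the
          plan's parameter list, and the histogram gets used. */
    SELECT COUNT_BIG(*) AS records
    FROM dbo.Users AS u
    WHERE u.Reputation = @Reputation
    AND   u.UpVotes    = @UpVotes;

    /* 2. Local variables: estimates come from the density vector,
          and the compile values vanish from the plan. */
    DECLARE @r integer = @Reputation,
            @u integer = @UpVotes;

    SELECT COUNT_BIG(*) AS records
    FROM dbo.Users AS u
    WHERE u.Reputation = @r
    AND   u.UpVotes    = @u;

    /* 3. Local variables plus the hint: same density-vector guess,
          same missing compile values. */
    SELECT COUNT_BIG(*) AS records
    FROM dbo.Users AS u
    WHERE u.Reputation = @r
    AND   u.UpVotes    = @u
    OPTION (OPTIMIZE FOR UNKNOWN);
END;
GO

EXECUTE dbo.ThreeWays @Reputation = 2, @UpVotes = 0;
```

The values 2 and 0 match the runtime values mentioned later in the video, but any values will demonstrate the same behavior.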

And when I run each of these, we’re going to get results back, you know, fairly quickly. This doesn’t have to be like miraculously fast. What I want to show you here is really not really the cardinality estimation thing.

We can see the cardinality estimation thing here: for the first set of compiled parameters, we got a good estimate and everything was cool, right? Well, really, up here is where it matters, where we seek into the index. Where it falls off is when we look down here: oh, there’s only 123 rows. And that’s the same for both of these.

And that becomes a thing for the other queries, where, you know, the second time we compile with the actual parameters, we get the cardinality estimate from the first one. And that also is true of… oh, stop moving. Gosh, darn it.

You silly thing. That’s also true of the two optimize for unknown things, where we get the actual number of rows, but a much worse guess, right? So this guess is just stupid for everyone.

It doesn’t make any sense. But, you know, the parameter sensitivity aspect of the first one is why a lot of people end up doing the local variables or the optimize for unknown thing. I’ve had a number of people tell me that they thought it was a best practice in SQL Server, at which point I feel this hand get real slappy.

Not in a way that would gratify anyone. Let me just clear things out a little bit so I don’t have six plans staring at me. What really annoys me is what you’ll see when you run a stored procedure like this one, get the actual execution plans, and go to the properties.

Now, let’s say you’re in a situation like me, where you’re a young, handsome consultant and someone has hired you to help them tune their queries and their stored procedures and all that other good stuff. And you’re trying to reproduce a performance problem with a stored procedure. If you use actual parameters in your stored procedure, what you’ll end up with over here are the compile and runtime values for both of these.

Right. So there’s the parameter compile value. There’s a parameter runtime value. We see that repeat for both of the parameters that were passed in. So if I wanted to, I could very easily take these out and I could take the compile values out, run the procedure and see if it’s slow, see if we can make any improvements. Where that changes a bit and I’m going to highlight this so it doesn’t disappear on us.

Where that changes a bit is if we look at the local variable version of this, where some things start to disappear on us. All of a sudden, all we have are the runtime values. Now, if you’re able to execute a stored procedure, the runtime values should be obvious to you, because you just put them in.

You just told SQL Server what to do there. Right. And that’s going to be the same for down here as well, where all of a sudden all we have is, well, the runtime value. Right. Just zero and two there. So that’s OK.

But this isn’t really my problem. My problem is if we look in Query Store or the plan cache and we’re like, wow, that stored procedure sure was slow. We should probably do something about it. If we look at what happened in there, I’m going to show you two things. Actually, I’m going to show you all three things. The first query up top should be the optimize for unknown hint.

And if we look at the query plan for this and we say, well, this was the slowest thing in there. Let’s see what we passed in there. And let’s try to figure out how we can reproduce this performance issue so we can start fixing it. Well, we don’t have a parameter list over here.

Right. For a little bit of… what do you call it? A little bit of contrast on that, let’s look at the query plan for the third one, which has nothing on it.

If we look at the query plan for the third one, we have a parameter list here. And we can see what the compile time values were. Right. So we can get the compile values for both of these parameters.

And we could plug these into the stored procedure and we could say, oh, this is why you are slow. We can fix that. But optimize for unknown and the local variables strip that stuff out.

They completely hide that information from you. So you have absolutely no idea what happened in there. This is the one with the local variables.

And we don’t have that parameter piece in here so that we can reproduce this. Now, on the one hand, I think this is partially Microsoft’s fault. Even though these are local variables and optimize for unknown hints, you still know what values the parameters were compiled with, even if those parameters didn’t really come into play for cardinality estimation.
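One way to see the difference outside of SSMS is to dig the compiled values out of cached plan XML. This is a hedged sketch: the procedure name in the filter is a placeholder for whatever procedure you’re investigating, and plans compiled with local variables or optimize for unknown will simply come back empty here, which is the whole complaint.

```sql
/* Pull ParameterCompiledValue out of the cached plan for one procedure.
   The Showplan XML namespace is required for the XQuery to match. */
WITH XMLNAMESPACES
    (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT
    procedure_name = OBJECT_NAME(deps.object_id),
    parameter      = p.c.value('@Column', 'sysname'),
    compiled_value = p.c.value('@ParameterCompiledValue', 'nvarchar(256)')
FROM sys.dm_exec_procedure_stats AS deps
CROSS APPLY sys.dm_exec_query_plan(deps.plan_handle) AS deqp
CROSS APPLY deqp.query_plan.nodes
    ('//ParameterList/ColumnReference') AS p(c)
WHERE deps.database_id = DB_ID()
AND   deps.object_id   = OBJECT_ID(N'dbo.YourProcedure'); /* placeholder */
```

For the plain-parameter query, each parameter shows up with its compile value; for the other two patterns, the ParameterList nodes just aren’t there to find.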

Remember, when you use local variables or the optimize for unknown hint, SQL Server looks at a different part of the statistics object, the density vector, which tells it how unique it thinks the column you’re looking at is, and how many rows are in the table. And it multiplies those together.
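You can see where that guess comes from yourself. A sketch, with an assumed index name and illustrative numbers, not real output:

```sql
/* The density vector section of the statistics object holds "All density";
   the index name here is an assumption for illustration. */
DBCC SHOW_STATISTICS (N'dbo.Users', N'IX_Reputation')
WITH DENSITY_VECTOR;

/* If "All density" comes back as, say, 0.0001 and the table holds
   2,465,713 rows, the local-variable/unknown estimate is roughly:

       0.0001 * 2465713 = ~246.57 rows

   no matter which value you actually passed in. */
```

That one fixed guess is why the estimate in the demo is equally wrong for common and rare values alike.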

So even though the values weren’t used for cardinality estimation, the compile values should still be in the query plan, because they might be useful. Amazing, I know. Right? Why wouldn’t they be there?

But this is just another crappy side effect of people following bad advice, or people following awful, non-existent advice, that optimize for unknown and using local variables as parameter substitutes is a best practice. You end up in situations where it’s impossible to reproduce anything, to pull it out and test it.

And then you’re stuck trying to figure out, like: OK, is there application logging for what people did? What are common values for things? For me, with the Stack Overflow database, it’s very easy to look at the Reputation column and the UpVotes column and just come up with a couple of values.

In real life, if you’re trying to find like customer IDs and order IDs and dates and all sorts of other things that could really make or break you being able to reproduce a performance issue, it really screws you badly. So please, please, please, please stop doing this. You know, recompile hints are almost the opposite because recompile hints will cache the actual values, but not like in the parameter section.

You have to look at where you touch the index to see what actual values are applied in there. So recompile has its own sort of issues. But typically, once something has a recompile hint on it, it’s usually fast anyway.
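A quick sketch of that recompile behavior, again with assumed names. With OPTION (RECOMPILE), the parameter embedding optimization can fold the variable’s value into the plan, so the value is visible, just at the seek predicate rather than in a tidy parameter list:

```sql
/* Variable and column names are assumptions for illustration. */
DECLARE @r integer = 2;

SELECT COUNT_BIG(*) AS records
FROM dbo.Users AS u
WHERE u.Reputation = @r
OPTION (RECOMPILE);

/* In the resulting plan, look at the index seek's predicate to find
   the value that was actually used, instead of a ParameterList node. */
```

So it’s recoverable, but you have to know to go hunting through the plan operators for it.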

So, I’m kidding. Recompile doesn’t fix everything. It just fixes crap like this. So that’s nice, too.

Yeah. Anyway, please stop using local variables as parameter substitutes. And please stop using optimize for unknown hints because they make my job harder. My job’s already hard.

I have to fix your code, your indexes, and your databases on your servers. It’s not easy, but it’s OK. My rates are reasonable.

All right. Thank you for watching.

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. Blog readers get 25% off the Everything Bundle — over 100 hours of performance tuning content. Need hands-on help? I offer consulting engagements from targeted investigations to ongoing retainers. Want a quick sanity check before committing to a full engagement? Schedule a call — no commitment required.