Comment Contest: To Celebrate Groundhog Day, Let’s Celebrate The Groundhog DBA (Or Whatever)

Facing Up


When you’ve been working with the same technology for long enough, especially if you’ve found your niche, you’re bound to find yourself having to solve the same problems over and over again with it.

Even if you move to a new job, they probably hired you because of your track record with, and skillset in, whatever you were doing before.

One thing I do for clients is review resumes and interview applicants for developer/DBA roles. There are some laughable ones, where someone claims to be an expert in every technology they’ve ever touched, and some ridiculous ones that are obviously copy/paste jobs. I even came across one resume where the job description and qualifications were posted under experience.

As a performance tuning consultant, even I end up fixing a lot of the same issues day to day. There are, of course, weird and cool problems I get to solve, but most folks struggle with grasping the fundamentals so badly that I have to ~do the needful~ and take care of really basic stuff.

There’s something comforting in that, because I know someone out there will always need my help, and I love seeing a client happy with my work, even if it wasn’t the most challenging work I’ve ever done.

Anyway, to celebrate the times where we’ve gotten to come out of our groundhog holes (I’m sure they have a real name, but I’m not a groundhogologist), leave a comment below with a cool problem you’ve gotten to solve recently.

The best comment (judged by me) will win free access to my SQL Server Performance Tuning Training Library.

In the case of a tie, there will be a drinking contest to determine the winner.

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve database performance problems quickly. You can also get a quick, low-cost health check with no phone time required.



22 thoughts on “Comment Contest: To Celebrate Groundhog Day, Let’s Celebrate The Groundhog DBA (Or Whatever)”

  1. I wanted to model the behavior of a workload when throttled by Resource Governor (as part of a project to consolidate servers).
    We had collected the workload’s read operations per second from the past week, and it was very spiky, because it was a reporting workload.

    On the assumption that the reads still needed to happen even when throttled, we wanted to know how long a stretch the reads would be saturated for.
    i.e., if we throttle at 2,000 IOPS, would we see the IO saturated for longer than 10 minutes?

    Using Excel, I pasted the data into a table with columns collected_timestamp and readops.
    I added three new calculated columns: work_to_be_done, throttled_work_done, and work_left_to_do. If there was any work left to do, it fed into the work to be done of the next row.

    After charting that, we could see what a throttled workload would look like.
    The model wasn’t perfect, and it was inaccurate because we made a ton of assumptions, but it was useful.
    It helped us identify servers to consolidate and resource governor levels to use.

    After applying the RG settings, the resulting behavior was (surprisingly) close to the model.

    So that was fun.
    (I may blog about this)
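
    A minimal T-SQL sketch of that carryover model (the original lived in Excel; the table name, the 60-second sample interval, and treating readops as work arriving per interval are all my own illustrative assumptions):

        -- Assumes dbo.workload_samples(collected_timestamp, readops), where
        -- readops is the read work arriving during each 60-second sample.
        DECLARE @capacity bigint = 2000 * 60; -- 2,000 IOPS of throttle over one interval

        WITH ordered AS
        (
            SELECT rn = ROW_NUMBER() OVER (ORDER BY collected_timestamp),
                   collected_timestamp,
                   readops
            FROM dbo.workload_samples
        ),
        model AS
        (
            -- anchor row: the first interval starts with no backlog
            SELECT o.rn, o.collected_timestamp,
                   work_to_be_done = CAST(o.readops AS bigint),
                   throttled_work_done = CAST(CASE WHEN o.readops <= @capacity THEN o.readops ELSE @capacity END AS bigint),
                   work_left_to_do = CAST(CASE WHEN o.readops > @capacity THEN o.readops - @capacity ELSE 0 END AS bigint)
            FROM ordered AS o
            WHERE o.rn = 1

            UNION ALL

            -- each later interval inherits the previous interval's leftover work
            SELECT o.rn, o.collected_timestamp,
                   work_to_be_done = o.readops + m.work_left_to_do,
                   throttled_work_done = CASE WHEN o.readops + m.work_left_to_do <= @capacity THEN o.readops + m.work_left_to_do ELSE @capacity END,
                   work_left_to_do = CASE WHEN o.readops + m.work_left_to_do > @capacity THEN o.readops + m.work_left_to_do - @capacity ELSE 0 END
            FROM ordered AS o
            JOIN model AS m
              ON o.rn = m.rn + 1
        )
        SELECT collected_timestamp, work_to_be_done, throttled_work_done, work_left_to_do
        FROM model
        ORDER BY rn
        OPTION (MAXRECURSION 0);

    Saturation stretches are then just runs of consecutive rows where work_left_to_do stays above zero.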

  2. Corrupt Filegroup blocking Full backups.

    We’re moving the code and data into a new physical DB, one data file, one log file, and off Enterprise Edition, since they don’t use the features.

    Not sexy, but something I’ve never needed to do before.
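
    For anyone facing the same move, there’s a handy DMV that lists anything persisted in a database that would block a restore onto a lower edition; a quick sanity check, run per database:

        -- Lists edition-specific features in use in the current database.
        SELECT feature_name, feature_id
        FROM sys.dm_db_persisted_sku_features;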

  3. I trust this will be the shortest, most succinct entry, although not inspirational…

    Every morning, with every employer, after waking up to the cheesy strains of “Babe… I got you Babe”, I open a procedure and remove WITH (NOLOCK). Then, later on, another.
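
    If you ever want to find tomorrow’s candidates in one go, a sketch (this only finds mentions of the hint; each removal still deserves its own look):

        -- Lists modules whose definitions contain the NOLOCK hint.
        SELECT OBJECT_SCHEMA_NAME(m.object_id) AS schema_name,
               OBJECT_NAME(m.object_id) AS object_name
        FROM sys.sql_modules AS m
        WHERE m.definition LIKE '%NOLOCK%';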

  4. Someone brought me an interesting issue where attempts to attach databases in Azure storage threw errors about drive geometry. Very interesting to see how different products implement the fundamental things we take for granted each day that just “work”.

  5. The request (problem?) I used to get asked ALL THE TIME was:

    Client: How do we make our imports faster?
    Me: Drop some of the indexes on that huge table
    Client: Ok, but will that affect our read performance?
    Me: Yes, but it needs to be a balance, can’t have super fast reads and writes.
    Client: Ah, but we need both, at the same time.
    Me: …….
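
    The usual compromise, sketched (the index and table names here are made up): disable nonclustered indexes for the load window, then rebuild them afterwards.

        -- Before the import: a disabled index is no longer maintained.
        ALTER INDEX ix_huge_table_lookup ON dbo.huge_table DISABLE;

        -- ...run the import...

        -- After the import: rebuilding re-enables the index.
        ALTER INDEX ix_huge_table_lookup ON dbo.huge_table REBUILD;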

  6. I don’t know whether to be proud or ashamed of this one…
    The product I’m supporting is a CRM/ERP (third party) product backed by SQL Server, with a web front end. They helpfully provide a webservice API so developers can interact with their product. They also designed it so that you can put code in the Custom schema and their API will pick it up and expose it through their webservice.

    So, certain departments have contracted with some developers to make a product that needs to be able to push notes into the CRM to attach to customers, and also push in attachments (like pictures and PDFs) and have them save to the document repository for the customer. Obviously, all these important servers are protected in our datacenter, with only the API exposed for this external tool.

    Guess who had to write a stored procedure that accepts an attachment name, a VARCHAR parameter for Base64 encoded attachment data, and then uses SQL Server to spit those bytes out to a file on the document server and link it into the CRM….
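
    The Base64 half of that, sketched (the variable name is made up, and actually writing the bytes out to a file would take something like OLE Automation or CLR on top of this):

        DECLARE @attachment_b64 varchar(max) = 'SGVsbG8sIHdvcmxkIQ==';

        -- The XML value() method does the Base64-to-varbinary decode.
        SELECT decoded_bytes =
            CAST(N'' AS xml).value(
                'xs:base64Binary(sql:variable("@attachment_b64"))',
                'varbinary(max)');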

  7. I recently changed our INSERT/UPDATE processing for our warehouse to utilize CDC. Our load/update times went from roughly 45 minutes to seconds. I templated the code and used Ctrl-M to build all of the CDC modules in a decent amount of time.

    Pretty fun.
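
    For reference, the CDC setup itself is only a couple of calls per table (the schema and table names here are illustrative):

        -- Enable CDC for the database, then for each source table.
        EXEC sys.sp_cdc_enable_db;

        EXEC sys.sp_cdc_enable_table
             @source_schema = N'dbo',
             @source_name = N'fact_orders',
             @role_name = NULL;

        -- Each load can then read just the changed rows through the
        -- generated cdc.fn_cdc_get_all_changes_dbo_fact_orders function.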

  8. Most recently, but not necessarily cool other than “didn’t take forever”, I was helping someone import ~ 200 files into a database. They were pipe-delimited flat files. We got the (flawed) DDL script to create the database tables – no keys/indexes and some bad definitions like “varchar(0)”. I fixed the flaws to get us by, then looked at the many files to see what we could do to make the task of importing them easier.

    At this point, I have to express gratitude to the community for the ages-old experience working with BCP to move data around before we had all of these fancy bacpacs and such. I was able to write up some code to pull that data in, make a batch file out of it, pipe errors to some text files, and get the majority of the data imported quickly. After that, we had 3 files error out due to some odd type mismatches between the DDL and what was in the files. Made a couple of adjustments to the database and pulled those over again without any further issues.

    I don’t often get a chance to use BCP, or a need for it, but I was quite grateful for having used it enough in the past to know how to use it again. Trying to do an “import/export” wizard or write up custom powershell to do all of the mappings for files with no headers would have taken quite some time. (doable, but tedious)
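
    The batch file generation, sketched (the server name, database, and paths are made up; -c is character mode, -t sets the pipe delimiter, and -e writes rejected rows to an error file):

        -- Build one bcp import command per table from the metadata.
        SELECT 'bcp ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name)
             + ' in "C:\imports\' + t.name + '.txt"'
             + ' -S MyServer -d MyDatabase -T -c -t "|"'
             + ' -e "C:\imports\errors\' + t.name + '.err"'
        FROM sys.tables AS t
        JOIN sys.schemas AS s
          ON s.schema_id = t.schema_id;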

  9. I once worked for a company that used Redgate SQL Compare to create scripts as part of the release process. No issues there, but one particular Developer had a local copy of the Schema that didn’t have Data Compression settings that matched the (extremely large) tables on Production.

    Cue the sounds of CPU straining and Log files filling up as the scripts attempted to deliver the release by expanding Tables that were compressed on Drives that didn’t have space for them not to be compressed!

    Luckily, the issue was identified quickly, the script cancelled, and the Developer “educated”, returning harmony to the situation.

    Unfortunately, it took a few times for them to get the message and apply compression to their Dev environment! Until then, there was a daily “whirr” as the de-compression started and Log files started throbbing!
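
    A quick way to catch that kind of mismatch before a release, sketched: run this in both environments and diff the output.

        -- Lists the compression setting for every table, index, and partition.
        SELECT OBJECT_SCHEMA_NAME(p.object_id) AS schema_name,
               OBJECT_NAME(p.object_id) AS table_name,
               i.name AS index_name,
               p.partition_number,
               p.data_compression_desc
        FROM sys.partitions AS p
        JOIN sys.indexes AS i
          ON i.object_id = p.object_id
         AND i.index_id = p.index_id
        ORDER BY schema_name, table_name, index_name;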

      1. That song came out when I was 10 – oh, the joy at school!
        Gordon breathed a sigh of relief at that point.
        Thank you for regressing me 45 years 🙂
