Tip #5 – SQL Server Maximum Memory (Tips for SQL Server Performance and Resiliency)

Posted: March 4, 2015 in Top 10 Tips for SQL Server Performance and Resiliency

This article is part 5 in a series on the top 10 most common mistakes that I have seen impacting SQL Server Performance and Resiliency. This post is not all-inclusive.

Most common mistake #5: Not Setting a Maximum Memory

Default configurations are not always the best choice for your server, and the maximum memory allocated to SQL Server is a great example. If this setting is left at its default, there is a good chance your SQL Server is starving the operating system of memory. When SQL Server starves the operating system, the OS cannot provide a stable platform; a struggling OS will have a difficult time retrieving data from disk, swapping processes, and completing standard OS functions, to name a few issues.
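To see where an instance stands, a minimal sketch (assuming you have sysadmin or ALTER SETTINGS permission) is to ask sp_configure for the current value; the default of 2,147,483,647 MB effectively means SQL Server can take everything the OS has:

  -- 'max server memory (MB)' is an advanced option, so expose it first
  EXEC sp_configure 'show advanced options', 1;
  RECONFIGURE;

  -- The default config_value of 2147483647 MB effectively means "unlimited"
  EXEC sp_configure 'max server memory (MB)';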

There are a number of ways to see if the operating system is running into memory issues. One of the fastest is to simply open Task Manager and see how much memory is available. If you review the available memory when the server is not performing well and the amount of available memory is low, there is a good chance the OS is being starved. With tasks like file copies, backups, and virus scans, the operating system will look for memory to help complete the process. If SQL Server has that memory reserved, the OS will not have the memory it needs to work efficiently.
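Task Manager works, but you can also ask SQL Server how the OS is doing. A quick sketch against the sys.dm_os_sys_memory DMV (available in SQL Server 2008 and later) shows the physical memory the OS reports and whether Windows considers itself under pressure:

  -- How much physical memory is available, and is the OS signaling memory pressure?
  SELECT total_physical_memory_kb / 1024     AS total_mb,
         available_physical_memory_kb / 1024 AS available_mb,
         system_memory_state_desc
  FROM sys.dm_os_sys_memory;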

Proper SQL Server memory configuration can take some fine tuning, as a number of variables need to be taken into consideration. For example, if there are two SQL Server instances installed, you will need to set the maximum memory for each instance low enough that the operating system still has enough to work with. And don’t forget processes like copying large files to different media, such as a file copy for a backup. Finding the balance can often leave DBAs going in circles. When you add virtual servers into the mix, the variables increase even more, and sometimes the memory configuration goes against SQL Server best practices (VMware recommends that minimum and maximum memory be set to the same value). There are many formulas for recommending an initial maximum memory setting.

So where do you start? A general rule of thumb is to leave the operating system 20% of the memory. With that being said, I hesitate to leave the operating system anything less than 2 GB. After making a memory adjustment, pay close attention to the memory counters in Performance Monitor and the DMVs in SQL Server to ensure the operating system has enough memory to perform well without fighting with SQL Server.
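As a hedged example, assume a dedicated server with 32 GB of RAM: the 20% rule leaves the OS roughly 6.5 GB, which caps SQL Server at about 25.6 GB (26,214 MB). The numbers below are illustrative only; adjust them for your own hardware and workload:

  -- Sketch: cap SQL Server at ~25.6 GB on a hypothetical dedicated 32 GB server
  EXEC sp_configure 'show advanced options', 1;
  RECONFIGURE;
  EXEC sp_configure 'max server memory (MB)', 26214;
  RECONFIGURE;

The setting is dynamic, so it takes effect without a restart, though SQL Server may take a little while to release memory back to the operating system.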

Top 10 Tips for SQL Server Performance and Resiliency

  1. Improper Backups
  2. Improper Security
  3. Improper Maintenance
  4. Not having a Baseline
Comments
  1. Thomas Franz says:

    20% of 8 GB is too little (only 1.6 GB), but 20% of 512 GB is too much (~100 GB just for some file copies), so I tend to leave something between 2 and 4 GB to the OS (plus application services, plus some parts of SQL Server that don’t count toward its max memory setting)

  2. Rudy Panigas says:

    Here is a formula that works, from “Troubleshooting SQL Server: A Guide for the Accidental DBA”
    by Jonathan Kehayias and Ted Krueger (ISBN: 978-1-90643477-9):

    1. Reserve 1 gigabyte (GB) of RAM for the operating system (OS)
    2. Reserve 1 GB of RAM for each 4 GB of RAM installed from 4–16 GB
    3. Add 1 GB of RAM for every 8 GB of RAM above 16 GB
    4. Set min memory to half of the max memory, as a starting point

    Rudy
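To make that arithmetic concrete, here is a rough T-SQL sketch applying the starting formula above to the physical memory the instance actually reports (it reads sys.dm_os_sys_memory; treat the result only as a starting point to validate with monitoring):

  -- Rough starting point per the formula above: 1 GB for the OS,
  -- plus 1 GB per 4 GB installed between 4 and 16 GB,
  -- plus 1 GB per 8 GB installed above 16 GB
  DECLARE @total_gb decimal(10,2), @os_gb decimal(10,2), @max_mb int;

  SELECT @total_gb = total_physical_memory_kb / 1024.0 / 1024.0
  FROM sys.dm_os_sys_memory;

  SET @os_gb = 1
      + CASE WHEN @total_gb > 4
             THEN (CASE WHEN @total_gb > 16 THEN 12 ELSE @total_gb - 4 END) / 4.0
             ELSE 0 END
      + CASE WHEN @total_gb > 16 THEN (@total_gb - 16) / 8.0 ELSE 0 END;

  SET @max_mb = CAST((@total_gb - @os_gb) * 1024 AS int);

  SELECT @total_gb AS total_ram_gb,
         @os_gb    AS reserve_for_os_gb,
         @max_mb   AS suggested_max_server_memory_mb;

For example, on a 64 GB server this works out to 1 + 3 + 6 = 10 GB reserved for the OS, or a max server memory setting of roughly 54 GB (55,296 MB).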
