Never Stop Learning

Posted: May 20, 2009 in Uncategorized

I went to the Colorado Springs SQL Server User Group tonight and heard one of the most thought-provoking presentations I have heard in years. The session was called SQL Server 2008 OLTP Database Consolidation Best Practices, presented by Maurice De Vidts of Hewlett Packard Company, and it was based on a white paper he wrote and on how he measured performance. The session got me thinking in a direction I have not gone in recently, but one I believe in. My thought is that many SQL Servers out there already have the power they need to perform well; the problem lies in the development of the whole architecture around the database, including the database code and the application code. As hardware gets faster and storage gets larger, it is easy to take the route of buying bigger. Buying bigger can be a huge help, but only to a point. Eventually many realize that the bottleneck is not the hardware.

Example: Last Christmas I did a consulting gig for a company that had an online web store. The company had been in business for about 10 years. One of the nice features of the web site was that if you searched on an item, let's say shoes, it would query the database across the past 10 years' worth of sales, add a GROUP BY on item type, and then count. Over the course of those 10 years the table had grown to close to 100 million records, and not all entries were sales; some were tax, shipping charges, wrapping charges and so on. The feature was embedded in the ASP code rather than a stored procedure, and with thousands of different items to search on, the data could not be cached well. To make the problem worse, this code was in the header of about 1,300 ASP pages.
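To give a feel for it, here is a rough sketch of that kind of per-page query. The table and column names (OrderDetail, ItemType, LineType) are hypothetical stand-ins I am using for illustration, not the client's actual schema:

    -- Hypothetical illustration of the aggregate that ran in every page header.
    -- OrderDetail, ItemType and LineType are made-up names, not the real schema.
    SELECT TOP 5
           ItemType,
           COUNT(*) AS TimesSold
    FROM   dbo.OrderDetail      -- roughly 100 million rows covering 10 years of history
    WHERE  LineType = 'Sale'    -- skip tax, shipping, wrapping and other non-sale rows
    GROUP  BY ItemType
    ORDER  BY COUNT(*) DESC;

Run once, that is a reasonable report query. Run on every page view across 1,300 pages, it is a full aggregation of the largest table in the database for every single click.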

The end result was that every time someone navigated from one page to another on the website, the database would re-group every item and re-count to find the most popular ones. When the Christmas season came along and the database slowed to a snail's pace, the database got the blame. Not a month before, they had upgraded the hardware, but they had not spent any time upgrading the code.

This problem could have been easily corrected by:

  • Putting in a stored procedure that every night, or even every week, recalculates the top five selling items and puts them in their own table (a sketch follows this list).
  • Adjusting the current code to look only at the new table for the top-selling items.
  • Maybe even putting older items in a history table so as not to impact the results. (Let's say that when the new "Wiffle" shoes came out 8 years ago they sold 10,000 pairs, but a year later no one would buy them. The company stopped carrying them, yet they would still appear in the results, so anyone who wanted a pair of the "Wiffle" would see that they were top sellers even though the site had none to sell.)
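Here is a minimal sketch of the kind of fix I am describing, again using the same made-up object names rather than anything from the client's real database:

    -- One small table holds the precomputed answer.
    CREATE TABLE dbo.TopSellingItems
    (
        ItemType    varchar(50) NOT NULL PRIMARY KEY,
        TimesSold   int         NOT NULL,
        RefreshedOn datetime    NOT NULL DEFAULT (GETDATE())
    );
    GO

    -- Nightly (or weekly) refresh, scheduled with a SQL Server Agent job.
    CREATE PROCEDURE dbo.usp_RefreshTopSellingItems
    AS
    BEGIN
        SET NOCOUNT ON;

        BEGIN TRANSACTION;

        -- Throw away the old results and recompute the top 5 exactly once.
        DELETE FROM dbo.TopSellingItems;

        INSERT INTO dbo.TopSellingItems (ItemType, TimesSold)
        SELECT TOP 5
               ItemType,
               COUNT(*)
        FROM   dbo.OrderDetail
        WHERE  LineType = 'Sale'     -- still ignoring tax, shipping, wrapping rows
        GROUP  BY ItemType
        ORDER  BY COUNT(*) DESC;

        COMMIT TRANSACTION;
    END
    GO

The page header then only has to do a SELECT against dbo.TopSellingItems, which touches five rows instead of re-aggregating the entire sales table on every click.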

Back to the user group meeting: my eyes were opened to something I did not know, and I am not sure if it is true at this point, so I would like to look into it more. I started to research the subject and ran across something I did not know had been added to SQL Server 2005. The point is that you should always keep learning. Logic can really pay off.
