Archive for August, 2010

I am not sure why I have never done this before. I can think of many reasons why one would want to track the changes that are being made to their databases. There are the obvious reasons to do this:

  • Auditing (make sure the changes that were supposed to happen really did)
  • Security (who is making changes to your database?)
  • Troubleshooting (what changed right before the database performance took a dive?)

In the script below (I have to admit I have only been using it for a day or two, so there may be some things I need to work out) I insert any changes made to tables, procedures, triggers, and functions into a table in the database I use as a management database. The script not only grabs the change, but also who made the change, where they were when they made it, and when it was made. If I make changes to this I will update this post. If you have any suggestions, please feel free to let me know. The trigger can be altered to catch other DDL events.

-- This set of steps will allow you to:
--   Track all changes made to the schema of a database
--   Notify off of those changes
--
-- Steps to implement:
--   1) Create the destination table
--   2) Create the DDL trigger
--   3) Create a procedure that looks at the last 24 hours for a change
--   4) Create a job that executes the procedure in step 3

-- Step 1 (run this in the management database, DBStats in my case)
IF OBJECT_ID('dbo.schema_tracking') IS NOT NULL
    DROP TABLE schema_tracking
GO

CREATE TABLE schema_tracking
(
    ChangeDate     DATETIME       NOT NULL,
    ChangeUser     VARCHAR(20)    NOT NULL,
    ChangeHost     VARCHAR(40)    NOT NULL,
    ObjectDatabase NVARCHAR(25)   NOT NULL,
    ObjectChanged  NVARCHAR(1250) NOT NULL
)
GO

-- Step 2
-- Create this trigger in each database, including model
CREATE TRIGGER Audit
ON DATABASE
FOR DROP_TABLE, ALTER_TABLE, CREATE_TABLE,
    CREATE_PROCEDURE, ALTER_PROCEDURE, DROP_PROCEDURE,
    CREATE_TRIGGER, ALTER_TRIGGER, DROP_TRIGGER,
    CREATE_FUNCTION, ALTER_FUNCTION, DROP_FUNCTION
AS
INSERT INTO DBStats.dbo.schema_tracking
VALUES (
    GETDATE(),
    SYSTEM_USER,
    HOST_NAME(),
    DB_NAME(),
    EVENTDATA().value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(max)')
);
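Steps 3 and 4 from the outline are not written up yet; here is a minimal sketch of the shape I have in mind. The proc name CheckSchemaChanges is mine, and the job setup is just the bare msdb calls, so adjust both to taste:

```sql
-- Step 3 (sketch): report any schema changes from the last 24 hours
CREATE PROCEDURE dbo.CheckSchemaChanges
AS
SELECT ChangeDate, ChangeUser, ChangeHost, ObjectDatabase, ObjectChanged
FROM DBStats.dbo.schema_tracking
WHERE ChangeDate >= DATEADD(HOUR, -24, GETDATE())  -- last 24 hours only
ORDER BY ChangeDate DESC
GO

-- Step 4 (sketch): a job that executes the proc from Step 3;
-- add a daily schedule and an operator notification however your shop prefers
EXEC msdb.dbo.sp_add_job @job_name = N'Schema Change Report'
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Schema Change Report',
    @step_name = N'Run CheckSchemaChanges',
    @subsystem = N'TSQL',
    @command   = N'EXEC DBStats.dbo.CheckSchemaChanges'
```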



Do you know how long your procs run on average? What about a trend on how long they have been executing?

OK, so a while ago I posted a DMV query that Glenn Berry used in a presentation. It was a great piece of code that I am using a lot, but I found myself wanting to know, over an extended period of time, how the performance of the procs was changing: were things getting better or worse? There are a number of developers where I am working right now, and each of them is working on new aspects of the applications we have, so how is what they are doing now going to impact what we already have in the database?

A note about Glenn: I have had a couple of discussions with him, and each time I contact him he has been an awesome help. Tonight, when I was grabbing the link for his blog, I noticed yet another set of DMVs there that I haven't had a chance to look at. If you have not seen his blog I highly recommend it. You can find his blog here. Thanks, Glenn.

I made a few changes and here is what I did.

 

First I created a table in my DBStats database that I keep out there:

 

CREATE TABLE [dbo].[PROC_Exe_Times](
    [dbname] [varchar](200) NOT NULL,
    [spname] [varchar](2000) NULL,
    [ExeCount] [bigint] NULL,
    [ExePerSec] [bigint] NULL,
    [AvgWorkerTime] [bigint] NULL,
    [TotalWorkerTime] [bigint] NULL,
    [AvgElapsedTime] [bigint] NULL,
    [MaxLogicalReads] [bigint] NULL,
    [MaxLogicalWrites] [bigint] NULL,
    [TotalPhysicalReads] [bigint] NULL,
    [DateRecorded] [datetime] NULL
) ON [PRIMARY]
GO

 

Then I created a proc that cursors through each of the databases, other than the ones I don't care about for this effort, and records that information in the table. This will give me a good idea of how the performance trends over time.

 

CREATE PROCEDURE [dbo].[GetProcExe]
AS
DECLARE @DBNAME VARCHAR(20)

DECLARE db_Cursor CURSOR
    FOR SELECT name FROM master..sysdatabases
        WHERE name NOT IN ('Master', 'Model', 'MSDB', 'TempDB', 'distribution', 'LiteSpeedLocal')

OPEN db_Cursor
FETCH NEXT FROM db_Cursor INTO @DBNAME

WHILE @@FETCH_STATUS = 0
    BEGIN
    INSERT INTO DBStats..PROC_Exe_Times
    (dbname, SPname, ExeCount, ExePerSec, AvgWorkerTime, TotalWorkerTime, AvgElapsedTime, MaxLogicalReads, MaxLogicalWrites, TotalPhysicalReads, DateRecorded)
        SELECT
            @DBNAME AS dbname,
            OBJECT_SCHEMA_NAME(qt.objectid, qt.dbid) + '.' + OBJECT_NAME(qt.objectid, qt.dbid) AS spname,
            qs.execution_count AS 'Execution Count',
            -- NULLIF guards against divide-by-zero when a plan was cached this very second
            qs.execution_count / NULLIF(DATEDIFF(SECOND, qs.creation_time, GETDATE()), 0) AS 'Calls/Second',
            qs.total_worker_time / qs.execution_count AS 'AvgWorkerTime',
            qs.total_worker_time AS 'TotalWorkerTime',
            qs.total_elapsed_time / qs.execution_count AS 'AvgElapsedTime',
            qs.max_logical_reads,
            qs.max_logical_writes,
            qs.total_physical_reads,
            GETDATE() AS DateRecorded
        FROM sys.dm_exec_query_stats AS qs
            CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt
        WHERE qt.dbid = DB_ID(@DBNAME) -- Filter by current database

    FETCH NEXT FROM db_Cursor INTO @DBNAME
    END

CLOSE db_Cursor
DEALLOCATE db_Cursor

-- Clear the plan cache so the counters start fresh for the next collection
DBCC FREEPROCCACHE
GO
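Once the job has been collecting for a while, trending is just a query against PROC_Exe_Times. A sketch, assuming the table above (the proc name in the filter is hypothetical):

```sql
-- How has one proc's average elapsed time and call volume moved over time?
SELECT dbname, spname, DateRecorded, ExeCount, AvgElapsedTime, AvgWorkerTime
FROM DBStats..PROC_Exe_Times
WHERE spname = 'dbo.MyProc'   -- substitute the proc you are watching
ORDER BY DateRecorded
```

Compare the rows release over release and you can see whether a deployment made things better or worse.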

 

I want to make sure that I thank Glenn for sharing this DMV query with me; this is going to give me a lot of information that I have not had in the past.

Roll Back Scripts

Posted: August 9, 2010 in SQLServerPedia Syndication

I have to ask… Do you know what a roll back script is?

All too often I see database changes made to production systems with the change scripts all nice and pretty, but when I ask for roll back scripts, people look at me like a deer in the headlights. Then all of a sudden it's like the light comes on and they turn the tables back on me: the roll back script is the backup, you fool of a DBA... Well, let's think again.

Many databases are required to be online 24 hours a day, so when a change is rolled to production, it has to be done after hours, or during whatever the slowest time of day is for that database. What worries me is when this situation occurs:

Take a database that accepts updates around the clock, with changes being delivered into the system by a web service or even a simple web page. After the release, the testing team is going crazy to approve it. A couple of hours later a bug is discovered. Your backup does not have the most recent two hours of transactions, so your only choice is to restore the backup and try to merge the data between the two databases. With larger databases this is going to take some time.

The other reason I am not a fan of backups as roll back plans is 100% motivated by sleep. It sounds selfish, but the older I get, the more I realize that when I wake up in the middle of the night, it is a miracle I can still find the restroom in my house. The middle of the night is not always the best time, for me at least, to make decisions. If I have a script that will let me load and roll, the decision making is gone; the decisions were made when the team was all there and awake. We do have a plan for when I am required to be at the top of my game in the middle of the night, and it includes a lot of candy and soda.

So the next time you are scripting the changes you are going to apply to the database, take some time and decide if you need roll back scripts. Go the extra steps and develop the plans that you need to remove changes from the database.
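As a simple illustration of what I mean, the roll back script is written and tested at the same time as the change script and simply reverses it (the table and column here are made up for the example):

```sql
-- Change script (deployed during the release window)
ALTER TABLE dbo.Customer ADD PreferredName VARCHAR(50) NULL
GO

-- Roll back script (sits next to the change script, ready to run
-- the moment the testing team finds a problem)
ALTER TABLE dbo.Customer DROP COLUMN PreferredName
GO
```

No restore, no merging two hours of lost transactions, and no 3 a.m. decision making: you just run the script the team already agreed on.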