Giving away FREE Access to my SQL Server High Availability and Disaster Recovery Deep Dive Course – Round 2



Last year, as part of launching my very first online course, I gave away FREE access to my SQL Server High Availability and Disaster Recovery Deep Dive Course. I’m doing it again this year, but for a totally different reason. Here’s why.

I’ve been very active in the SQL Server community in one way or another, and a lot of people ask me why I do what I do. It all started in late 1999 when, fresh out of college with no one wanting to hire me, a potential customer asked me to write an inventory application for their small business. That might sound really exciting for somebody landing their very first consulting opportunity straight out of college, especially since my potential customer was willing to pay whatever price I charged. But not for me. You see, I didn’t have…



SQL Server proxy account and local policy

Taking control of the ‘sa’ account for SQL Server 2012

As a DBA, it is perfectly normal to forget a password from time to time, so there is no need to hold your breath or panic. If you ever need to recover the ‘sa’ password in an emergency, you can get back in through the back door. These steps should get you going so you can move on with your administration tasks.

  • From SQL Server Configuration Manager, stop the SQL Server services.
  • Open a cmd window as an administrator or an account with elevated rights.
  • From the command prompt, locate the folder where your SQL Server binary files are installed. Go to that path and type: sqlservr.exe -m
  • Once the SQL Server service successfully starts in single-user mode, open another cmd window as an administrator or an account with elevated rights.
  • In the command prompt, type: sqlcmd -S <servername>, or (local) if your SQL Server is on the same machine.
  • Then you can reset the ‘sa’ password and also enable or unlock it at the same time:

USE [master]
GO
-- Replace the placeholder with your own strong password
ALTER LOGIN [sa] WITH PASSWORD = N'YourNewStrongPassword';
ALTER LOGIN [sa] ENABLE;
GO

There you have it. You have successfully taken control of the ‘sa’ account. From this point, close all the previous command windows and go back to SQL Server Configuration Manager. Start the SQL Server services (without the -m flag this time). Try to connect to your SQL Server and log in with the ‘sa’ account. 🙂

Now breathe.

A quicker way to know when DBCC CHECKDB last ran in your database.

Problem: Surely you have scheduled jobs that run DBCC CHECKDB across your SQL Servers to check database integrity. But sometimes you miss checking the reports on whether those scheduled jobs ran at all, or ran but failed. So how do you find out when DBCC CHECKDB last ran in your database?

Solution: There are many ways to do this. There is also a very good in-depth post from Paul Randal about DBCC CHECKDB which was a great help for me.

Generally, to check the last run of DBCC CHECKDB you can always use the command

DBCC DBINFO ('yourdatabasenamehere') WITH TABLERESULTS

and look for the dbi_dbccLastKnownGood field, which contains the date-time stamp of the last clean DBCC CHECKDB. But this is good only if DBCC actually ran against the database itself and not against a snapshot of the database. Where mirrored databases are involved, you would usually run DBCC CHECKDB against a snapshot of the mirrored database.
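If you would rather not scan the full TABLERESULTS output by eye, one possible sketch (the temp table and database names here are placeholders, not from the original post) is to capture the output into a temp table and filter on the field name:

-- Capture DBCC DBINFO output and pull out dbi_dbccLastKnownGood
CREATE TABLE #DbInfo (
    [ParentObject] VARCHAR(255),
    [Object] VARCHAR(255),
    [Field] VARCHAR(255),
    [Value] VARCHAR(255)
)

INSERT INTO #DbInfo
EXEC ('DBCC DBINFO (''yourdatabasenamehere'') WITH TABLERESULTS')

SELECT [Value] AS [LastKnownGoodCheckDb]
FROM #DbInfo
WHERE [Field] = 'dbi_dbccLastKnownGood'

DROP TABLE #DbInfo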

To check the last DBCC CHECKDB run against these mirrored databases, you may use a query I run frequently across all the SQL Servers I monitor.

CREATE TABLE #DBCheckInfo (
    [LogDate] VARCHAR(25),
    [ProcessInfo] VARCHAR(10),
    [Text] VARCHAR(1000)
)

INSERT INTO #DBCheckInfo EXECUTE xp_ReadErrorLog 0, 1, 'dbcc checkdb'

SELECT * FROM #DBCheckInfo

In the method above, I check the SQL Server logs for any DBCC CHECKDB activity. This way I get the correct date whether DBCC ran against the database itself or against a snapshot of it, as long as the DBCC CHECKDB command was actually invoked.

Next Step: You can add this query as part of your daily health check monitoring reports.
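As one possible way to automate the daily check (the job, step, and schedule names below are placeholders of my own, not part of the original post), you could wrap the error-log query in a SQL Server Agent job that runs each morning:

-- A minimal sketch: schedule the error-log check as a daily SQL Agent job
USE [msdb]
GO
EXEC dbo.sp_add_job @job_name = N'Daily DBCC CHECKDB Report'

EXEC dbo.sp_add_jobstep
    @job_name = N'Daily DBCC CHECKDB Report',
    @step_name = N'Scan error log for CHECKDB runs',
    @subsystem = N'TSQL',
    @command = N'...your #DBCheckInfo query here...'

EXEC dbo.sp_add_schedule
    @schedule_name = N'Daily 7am',
    @freq_type = 4,            -- daily
    @freq_interval = 1,
    @active_start_time = 070000

EXEC dbo.sp_attach_schedule
    @job_name = N'Daily DBCC CHECKDB Report',
    @schedule_name = N'Daily 7am'

EXEC dbo.sp_add_jobserver @job_name = N'Daily DBCC CHECKDB Report'

From here the job output can feed the same monitoring reports described above.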


Setting Up Your Custom Data Collector Set


From my previous post, we learned how to set up Data Collection in SQL Server. This is most suitable for monitoring and for reporting on disk usage and server activity. But what if you want to monitor other areas of your database server aside from the default data collection sets?


Aside from the default data collection sets, you can also set up your own customised data collection sets. Suppose, for example, that your requirement is to monitor all failed SQL Agent jobs on a daily basis across all your SQL database servers. Let’s use this as our example.

Creating a customised data collection set is composed of three parts:

1. Defining the data collection container – this contains header parameters such as the name of the data collection set, its description, logging, and the schedule to run. You will need the stored procedure sp_syscollector_create_collection_set. See the example below:

DECLARE @collection_set_id_1 int
DECLARE @collection_set_uid_2 uniqueidentifier
EXEC [msdb].[dbo].[sp_syscollector_create_collection_set]
    @name=N'Failed SQL Jobs',
    @description=N'Collects data about failed jobs for all servers.',
    @collection_set_id=@collection_set_id_1 OUTPUT,
    @collection_set_uid=@collection_set_uid_2 OUTPUT

2. Defining the data collector type – this is the part where you define the collector type for your custom data collection. There are predefined data collector types already set up for use; for most common cases we will use the Generic T-SQL Query Collector Type. To retrieve its UID you may use the statement below:

DECLARE @collector_type_uid_3 uniqueidentifier
SELECT @collector_type_uid_3 = collector_type_uid
FROM [msdb].[dbo].[syscollector_collector_types]
WHERE name = N'Generic T-SQL Query Collector Type';

3. Defining the data collection item – this is the part where you define the parameters for your data collection set. It also contains the actual query that retrieves all failed SQL jobs. As an example, see the code below:

DECLARE @collection_item_id_4 int
EXEC [msdb].[dbo].[sp_syscollector_create_collection_item]
    @collection_set_id=@collection_set_id_1,
    @collector_type_uid=@collector_type_uid_3,
    @name=N'Failed SQL Jobs Item',
    @parameters=N'<ns:TSQLQueryCollector xmlns:ns="DataCollectorType"><Query><Value>
        SELECT @@SERVERNAME AS [ServerName],
               [sJOB].[name] AS [JobName],
               [sJOBH].[run_date] AS [LastRunDateTime],
               CAST([sJOBH].[run_duration] AS VARCHAR(6)) AS [LastRunDuration (HH:MM:SS)],
               [sJOBH].[message] AS [LastRunStatusMessage],
               [sJOBSCH].[NextRunDate] AS [NextRunDateTime]
        FROM [msdb].[dbo].[sysjobs] AS [sJOB]
        LEFT JOIN (SELECT [job_id],
                          MIN([next_run_date]) AS [NextRunDate],
                          MIN([next_run_time]) AS [NextRunTime]
                   FROM [msdb].[dbo].[sysjobschedules]
                   GROUP BY [job_id]) AS [sJOBSCH]
               ON [sJOB].[job_id] = [sJOBSCH].[job_id]
        LEFT JOIN (SELECT [job_id], [run_date], [run_time], [run_status], [run_duration], [message],
                          ROW_NUMBER() OVER (PARTITION BY [job_id] ORDER BY [run_date] DESC, [run_time] DESC) AS RowNumber
                   FROM [msdb].[dbo].[sysjobhistory]
                   WHERE [step_id] = 0) AS [sJOBH]
               ON [sJOB].[job_id] = [sJOBH].[job_id] AND [sJOBH].[RowNumber] = 1
        WHERE [sJOBH].[run_status] = 0
        ORDER BY [LastRunDateTime] DESC
    </Value><OutputTable>FailedJobs</OutputTable></Query><Databases UseSystemDatabases="true" UseUserDatabases="true" /></ns:TSQLQueryCollector>',
    @collection_item_id=@collection_item_id_4 OUTPUT

The <OutputTable> tag contains the destination table for your query results. This table will be created under the default schema custom_snapshots inside your Management Data Warehouse database.

After successfully creating your custom data collection set, you must start the data collection manually. You can do this either via the GUI from the Data Collection menu or by executing the statement below in the msdb database:

EXEC sp_syscollector_start_collection_set @collection_set_id = <yourcollectionsetid>;

The output table for your data collection set is created after the initial upload of data. Now that you have the data for all failed SQL Agent jobs across your database servers on a daily basis, you can create a report for it via Reporting Services and add it to your monitoring reports.
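To have a quick look at the collected rows before building the report (the MDW database name below is a placeholder for your own Management Data Warehouse), you could query the output table directly:

-- Query the collected failed-job rows in the Management Data Warehouse
-- (replace MDW with the name of your own data warehouse database)
USE [MDW]
GO
SELECT [ServerName], [JobName], [LastRunDateTime], [LastRunStatusMessage]
FROM [custom_snapshots].[FailedJobs]
ORDER BY [LastRunDateTime] DESC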

There you have it, your first customised data collection set. From here on you can create more monitoring reports for your dashboard and impress your team lead or boss. 🙂


