Sysprocesses Tracker

In developing some Server/Application baselines it is good to know some details about  who and what connects to the database server. The sysprocesses system table contains some good data for tracking down what is up on your SQL Servers.

I set out to create a VBScript that would grab results from sysprocesses from all my SQL Servers, on demand or on a schedule.

The first step was to create a table to collect the data, so I hopped on my test server and created the following table:

CREATE TABLE [dbo].[sysprocessTracker](
[spid] [int] NULL,
[status] [varchar](250) NULL,
[runid] [int] NULL,
[nt_username] [varchar](250) NULL,
[hostname] [varchar](250) NULL,
[blocked] [varchar](50) NULL,
[dbid] [varchar](50) NULL,
[cmd] [varchar](200) NULL,
[cpu] [int] NULL,
[physical_io] [int] NULL,
[memusage] [int] NULL,
[last_batch] [varchar](100) NULL,
[program_name] [varchar](200) NULL,
[nt_domain] [varchar](200) NULL,
[net_library] [varchar](200) NULL,
[login_time] [datetime] NULL,
[lastwaittype] [varchar](200) NULL,
[servername] [varchar](200) NULL,
[capturetime] [datetime] NULL,
[loginame] [varchar](100) NULL
) ON [PRIMARY]

The table represents the basic output from the master.dbo.sysprocesses table with a few tweaks.

  1. I added a column called servername. The script sets this at runtime in the server loop, and I store it in the table so I can later query by server name.
  2. I added a capturetime column so I know when the master.dbo.sysprocesses results were captured (see the example query below).
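
Once a few captures have run, those two columns make it easy to see when each server was last polled. A quick sketch against the table above:

select servername, max(capturetime) as LastCapture
from sysprocessTracker
group by servername
order by servername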

I referenced a server loop in step 1. To run the script against all servers I set up a little text file with a list of my servers, one per line. When I add or remove a server I update the list.

Format of the Text File:

SERVER1
SERVER2
TESTSERVER1

After the setup was complete I wrote the script to go out and execute the master.dbo.sysprocesses query.

set fso = CreateObject("Scripting.FileSystemObject")
set f = fso.OpenTextFile("C:\scripts\sysprocessTracker\servers+.txt",1,True)
set counterFile = fso.OpenTextFile("C:\scripts\sysprocessTracker\runcounter.txt",1,True)
set logfile = fso.OpenTextFile("C:\Scripts\sysprocessTracker\logfile.txt",2,True)

counter = counterFile.ReadLine

counterFile.Close

dbs="server=servername;DRIVER={SQL Server};database=dbname;Trusted_Connection=yes"
Set db = createobject("adodb.connection")
db.ConnectionString = dbs
db.Open

while not f.AtEndOfStream

lineArray = Split(f.ReadLine, "|", -1)
serverName = lineArray(0)

ssql = "insert into sysprocessTracker (servername,capturetime,runid,spid,status,nt_username,hostname,blocked," & _
"dbid,cmd,cpu,physical_io,memusage,last_batch,program_name,nt_domain,net_library,login_time,lastwaittype,loginame) " & _
"SELECT '" & serverName & "',GETDATE()," & counter & ",spid,RTRIM(status),RTRIM(nt_username),RTRIM(hostname),RTRIM(blocked)," & _
"dbid,RTRIM(cmd),cpu,physical_io,memusage,last_batch,RTRIM(program_name),RTRIM(nt_domain),RTRIM(net_library)," & _
"login_time,RTRIM(lastwaittype),RTRIM(loginame) " & _
"FROM OPENROWSET('SQLOLEDB','Server=" & serverName & ";Trusted_Connection=Yes;Database=Master'," & _
"'select spid,sp.status,nt_username,hostname,blocked,sd.name as dbid,cmd,cpu,physical_io,memusage,last_batch," & _
"program_name,nt_domain,net_library,login_time,lastwaittype,loginame " & _
"from master..sysprocesses sp, master..sysdatabases sd where sp.dbid = sd.dbid')"
On Error Resume Next
db.Execute(ssql)
If Err.Number <> 0 Then
logfile.WriteLine("error connecting to: " & serverName & " -- " & Now)
Err.Clear

End if
wend

set counterFile = fso.OpenTextFile("C:\scripts\sysprocessTracker\runcounter.txt",2,True)
newcount = counter + 1
counterFile.WriteLine(newcount)
counterFile.Close

ssql = "select count(*) from sysprocessTracker where capturetime < GETDATE() - 2"
set rs = db.Execute(ssql)
if not rs.EOF then
logfile.WriteLine("Rows Deleted: " & rs(0))
end if

'put this in here to keep the table cleaned up. You can remove this if you want to view more than 2 days worth of history
ssql = "delete from sysprocessTracker where capturetime < GETDATE() - 2"
db.Execute(ssql)

f.Close
db.Close
logfile.Close


Update:

The script loops through my server text file and inserts the results of the sysprocesses call into the table. The most difficult part was working out the INSERT command on top of the OPENROWSET results. That took a few rounds but worked out well. Initially I tried creating a temp table, selecting from the temp table into a recordset, parsing the recordset, and building the insert. I had some problems with temp table scope, so I went this route after some Google searching. The single insert seems a lot more efficient and works well.
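
For readability, here is roughly what the statement built by the script ends up looking like for a single server, once the VBScript variables are filled in (SERVER1 and a runid of 1 are just placeholder values):

insert into sysprocessTracker (servername,capturetime,runid,spid,status,nt_username,hostname,blocked,dbid,cmd,
cpu,physical_io,memusage,last_batch,program_name,nt_domain,net_library,login_time,lastwaittype,loginame)
select 'SERVER1', GETDATE(), 1, spid, RTRIM(status), RTRIM(nt_username), RTRIM(hostname), RTRIM(blocked), dbid,
RTRIM(cmd), cpu, physical_io, memusage, last_batch, RTRIM(program_name), RTRIM(nt_domain),
RTRIM(net_library), login_time, RTRIM(lastwaittype), RTRIM(loginame)
from OPENROWSET('SQLOLEDB',
'Server=SERVER1;Trusted_Connection=Yes;Database=Master',
'select spid, sp.status, nt_username, hostname, blocked, sd.name as dbid, cmd, cpu, physical_io, memusage,
last_batch, program_name, nt_domain, net_library, login_time, lastwaittype, loginame
from master..sysprocesses sp, master..sysdatabases sd
where sp.dbid = sd.dbid')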

Update 2:

I added a run counter to the script to make it easier to track specific runs. I started off by using the timestamp, but soon realized that it resolves down to the millisecond, and at times the sysprocesses query spanned a few milliseconds. That made it tough to group the captures, so now I just group on the runid column.
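
For example, a quick sketch of grouping on runid, which gives one row per capture run no matter how long the run took:

select runid, servername, count(spid) as ConnectedUsers, min(capturetime) as CaptureStart
from sysprocessTracker
group by runid, servername
order by runid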

What to do with the info? Some sample queries:

--How many users connected, avg CPU and Diskio based on each capture
select count(spid) ConnectedUsers, avg(cpu) AvgCPU,avg(physical_io) AVGDiskIO, capturetime from sysprocessTracker where servername = 'SERVER1' group by servername, capturetime

--Which command is executed most often, which db is used the most, who are the users logged in, what programs are used the most, etc.
select avg(spid) ConnectedUsers, avg(cpu) AvgCPU,avg(physical_io) AVGDiskIO from sysprocessTracker where servername = 'SERVER1' group by servername
select count(cmd), cmd from sysprocessTracker where servername = 'SERVER1' group by cmd
select count(dbid), dbid from sysprocessTracker where servername = 'SERVER1' group by dbid
select count(nt_username),nt_username,capturetime from sysprocessTracker where servername = 'SERVER1' group by capturetime,nt_username
select count(program_name),program_name,capturetime from sysprocessTracker where servername = 'SERVER1' group by capturetime,program_name
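
Since the blocked column is captured as well, something along these lines (just a sketch) shows which sessions were blocked, and by which spid, in each capture:

--Blocked sessions per capture (blocked holds the spid doing the blocking)
select capturetime, spid, blocked, nt_username, hostname, cmd
from sysprocessTracker
where servername = 'SERVER1' and blocked <> '0'
order by capturetime, spid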

SQL Job Tracker

I am not big on buying off the shelf software to monitor SQL for two main reasons:

  1. Not much flexibility.
  2. You can learn much more by developing scripts, websites, and such yourself.

Now I understand that in some circumstances it is nice to buy off-the-shelf products. It is nice to have a piece of software that someone else supports, and sometimes buying is necessary due to time constraints and project loads.

One custom process I have running is a SQL Job Checker. The job checker reports on all the SQL jobs that have failed. Even if you only have a handful of servers, it is nice to have a single place to monitor all your jobs across multiple servers.

Now I am still old skool when it comes to DTS vs. SSIS. DTS is simple and works. I am sure SSIS would get the job done just as well, if not better, but I am slow to make the jump. I created a DTS package on one server that points to all the other servers, and I created the following table:

CREATE TABLE [dbo].[SQLJobTracker](
[servername] [varchar](20) NULL,
[jobname] [varchar](75) NULL,
[message] [varchar](200) NULL,
[rundate] [datetime] NULL,
[stepid] [int] NULL,
[capturedate] [datetime] NULL CONSTRAINT [DF_SQLJobTracker_capturedate] DEFAULT (getdate())
) ON [PRIMARY]

To keep the job tracker table clean, the first step in the DTS package is a delete command that clears out the old rows:

delete from SQLJOBTracker where capturedate < Getdate() - 2

Then on each source in the DTS package I use the following query to extract the data and dump it into the table above:

-- run against msdb on each source server (sysjobs and sysjobhistory live there)
select @@SERVERNAME ServerName,sj.name, sjh.message,
-- Add Run Duration Seconds
cast(dateadd(ss,
cast(substring(cast(run_duration + 1000000 as char(7)),6,2) as int),
-- Add Run Duration Minutes
dateadd(mi,
cast(substring(cast(run_duration + 1000000 as char(7)),4,2) as int),
-- Add Run Duration Hours
dateadd(hh,
cast(substring(cast(run_duration + 1000000 as char(7)),2,2) as int),
-- Add Start Time Seconds
dateadd(ss,
cast(substring(cast(run_time + 1000000 as char(7)),6,2) as int),
-- Add Start Time Minutes
dateadd(mi,
cast(substring(cast(run_time + 1000000 as char(7)),4,2) as int),
-- Add Start Time Hours
dateadd(hh,
cast(substring(cast(run_time + 1000000 as char(7)),2,2) as int),
convert(datetime,cast (run_date as char(8))))
))))) as char(19)) rundate,sjh.step_id
from sysjobhistory sjh, sysjobs sj
where sj.job_id = sjh.job_id
and sjh.run_status = 0 -- 0 = failed
and convert(datetime,cast (run_date as char(8))) > Getdate() - 2
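
The nesting looks worse than it is: run_date, run_time, and run_duration are stored as integers (run_time and run_duration as HHMMSS), so the query pads each value out to a fixed width and peels off the hours, minutes, and seconds. A quick sketch against made-up literal values shows the idea (93045 is 9:30:45 AM and 130 is 1 minute 30 seconds):

select substring(cast(93045 + 1000000 as char(7)),2,2) as StartHours,
substring(cast(93045 + 1000000 as char(7)),4,2) as StartMinutes,
substring(cast(93045 + 1000000 as char(7)),6,2) as StartSeconds,
substring(cast(130 + 1000000 as char(7)),2,2) as DurationHours,
substring(cast(130 + 1000000 as char(7)),4,2) as DurationMinutes,
substring(cast(130 + 1000000 as char(7)),6,2) as DurationSeconds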

I like to keep 2 days' worth of logs in the table to show a job failing one day but succeeding the next. Just how my mind works. Now one of the drawbacks of DTS is the lack of control over step precedence, i.e. which steps happen in which order; SSIS does a much better job there. This limitation was glaring when I tried to make the generate-report step run last, after all the extracts. So as a workaround I just created another DTS package to extract the data into a CSV and send me the report in an email:

select * from sqljobtracker order by rundate

So the end result is a daily email with a list of the jobs that failed. Each day you get a quick snapshot of the problem jobs. I am sure there are more elegant ways to make this happen, but this method is about 10 minutes of work and works out pretty well.

Deny Domain Admin access to SQL

My first thought was that this was not really possible; it seemed like Domain Admins (DAs) could access the SQL Server regardless of what I did to lock them out. I did find one approach that will satisfy an auditor.

  1. Create a Domain group and add the Domain Admins you wish to lock out.
  2. Add the Domain group as a login on your SQL Server.
  3. On the General tab under Security Access click Deny access (a T-SQL version of this is sketched below).
  4. Make sure your
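
For reference, the same deny can be done from Query Analyzer. This is only a sketch, and MYDOMAIN\DeniedDBAdmins is a made-up group name:

-- add the Windows group as a login, then deny it access to the instance
EXEC sp_grantlogin 'MYDOMAIN\DeniedDBAdmins'
EXEC sp_denylogin 'MYDOMAIN\DeniedDBAdmins'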

Yeah, I agree those steps seem too simple. One major flaw is that a DA could just remove himself/herself from the group you specified. Another flaw is that the DA could still access the server itself and stop services, delete databases, rain down terror, etc.

I searched and found some sites that suggested removing the BUILTIN\Administrators group from SQL. NOTE: if you do this make sure you know what you are doing. If you do this in the wrong order you can remove all Sysadmin access from your instance of SQL. Make sure you have an account that has sysadmin privs before you drop the BUILTIN\Administrators group from your SQL.
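
If you do go down that road, something like this (again just a sketch) lets you confirm who holds sysadmin before pulling the group:

-- list current sysadmin members first, and make sure another account is in there
EXEC sp_helpsrvrolemember 'sysadmin'
-- then remove the Windows group login
EXEC sp_revokelogin 'BUILTIN\Administrators'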

During testing I did not see a need to remove the BUILTIN\Administrators group.

Testing:

  1. Add Domain Admins to Domain Group
  2. Deny the Group access to SQL using steps listed above
  3. Have a Domain Admin try connecting with Enterprise Manager/Query Analyzer/ODBC from their workstation using Windows Auth. Make sure the login fails.
  4. Have a Domain Admin try connecting with Enterprise Manager/Query Analyzer/ODBC from the server itself.

If you have a better plan than this, then by all means please share it. I am probably missing something here.