Thursday, November 19, 2009

Reporting Services Email Subscriptions SSRS 2005

This assumes a user does not have Content Manager as their SSRS role and SSRS is set up more or less as it comes out of the box.

The Problem

If a user hits Subscribe on a report and then chooses email as the delivery option, they will see that the To field has been pre-populated with their Windows login. They will not have the permissions needed to change this setting. This is fine if you are running Exchange, as it will resolve the login against Active Directory and work out the real email address.

If you are not running Exchange it is not so fine.

The fix

You need to make a change to the rsreportserver.config file.

This should be under C:\Program Files\Microsoft SQL Server\MSSQL.3\Reporting Services\ReportServer by default – if you installed SQL somewhere else then the first part of this path will be wrong, and the MSSQL.3 part depends on which instance Reporting Services was installed against.

Open up this file and look for the <SendEmailToUserAlias> value under <RSEmailDPConfiguration>. If this is set to True, as it is by default, you will get the problem above. If it is set to False the To field will be blank and the user can enter an email address.
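The relevant fragment of rsreportserver.config looks something like this (element names as in a default SSRS 2005 install; the SMTPServer value shown here is just a placeholder):

```xml
<RSEmailDPConfiguration>
    <SMTPServer>smtp.example.local</SMTPServer>
    <!-- Change True to False so the To field is left blank
         and the subscriber can type a real email address -->
    <SendEmailToUserAlias>False</SendEmailToUserAlias>
</RSEmailDPConfiguration>
```

Restart the Reporting Services service after editing the file so the change is picked up.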


Sunday, November 8, 2009

An error occurred while executing batch

I ran a query which resulted in this:
An error occurred while executing batch. Error message is: The directory name is invalid.

The reason for this was an invalid temp directory. The system was pointing at c:\temp but the directory did not exist. I re-pointed all the temp and tmp environment variables at the real temp directory and all was well again.


Monday, November 2, 2009

MCITP: Database Administrator.

I passed the 70-444 exam today which is the final exam in the MCITP: Database Administrator trilogy.
This exam was 52 multiple choice questions – an old-style MCP type exam.
The 70-443 was a set of 6 case studies with 12 questions on each.

I find that after revising for a while I begin to get what they are testing and can normally answer the question based on the answer options - although reading the question helps!

For the 70-443 exam I skimmed through the case studies and then went back to look at the specifics as they related to the individual questions.

I still have to finish off the 2000 MCDBA. I have the two SQL exams; I just need to do the Windows admin exams.

Tuesday, October 27, 2009

Step by Step guide to log shipping on SQL Server 2005

What it is and how it works.
Log shipping is the process of transferring transaction logs from one server to another – shipping them. It is used as a form of disaster recovery for a database. It does not have any automated failover – transferring to the new database will involve human intervention.

Two servers are involved, the Primary server and the Secondary Server. The Primary server is the server with the database that is being protected. The secondary server holds the backup copy of the database.

The database to be protected needs to be in Full or Bulk-Logged recovery mode. This is to stop the transaction logs being truncated. A full backup should then be taken and applied to the secondary server. The No Recovery option is used when restoring on the secondary server. This leaves the database in a restoring state. Now log shipping is set up and transaction log backups are copied from the primary server to the secondary server and then applied to the restoring database. This has the effect of keeping the database in step with the primary version.

As an extra process, login accounts need to be transferred to the secondary server. This way, if the database ever has to be brought online and used, access will be the same as it is on the primary server.

How to set up Log Shipping
In this step by step guide I will set up log shipping on the DataImports database from serverA to serverB.

Create a directory on the primary server
On the primary server create a directory for the transaction log backups. The backups will be made into this directory and then copied across to a receiving directory on the secondary server.
I create this under the main maintenance or sql backup directories.
If there is not one there already create a folder called LS. Then create a sub-folder with the name of your database: \\serverA\m$\Backups\LS\dataImports

Create a directory on the secondary server
On the secondary server create a shared directory for receiving the log shipped files.
Again I create this under the main maintenance or SQL backup directories.
If there is not one there already create a folder called LS. Then create a sub-folder with the name of your database: \\serverB\f$\Maint\LS\dataImports

Exclude the database from TLog backups
If you have any maintenance plans which carry out transaction log backups for this database you will need to exclude it from them.

Set recovery model of the database to Full
If the recovery model of the database is Simple, set it to Full. You can leave it if it is set to Bulk-Logged.
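A quick way to check and change the recovery model in T-SQL (using the example dataImports database from this guide):

```sql
-- Check the current recovery model
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'dataImports';

-- Switch to Full recovery if it is currently Simple
ALTER DATABASE dataImports SET RECOVERY FULL;
```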

Backup the database
Take a full backup of the database in to the folder you created on the primary server and copy it to the new folder on the secondary server.

Create a database on the secondary server and then restore the backup
Create a new database with the same name as the one you are going to log ship.
Now restore the database using the backup you have just taken – make sure you set the file paths to restore over the files you have just created, rather than using the paths stored in the backup set.
You must select the option to restore with no recovery.

Click OK.
The database name will now have a green arrow next to it and say restoring …
You can also delete the backup file you used for the restore.
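The backup and restore steps above can also be done in T-SQL. This is a sketch using the example database from this guide – the file paths and the logical file names in the MOVE clauses are assumptions, so adjust them to match your own layout:

```sql
-- On the primary server: take a full backup
BACKUP DATABASE dataImports
TO DISK = 'M:\Backups\LS\dataImports\dataImports_full.bak'
WITH INIT, CHECKSUM;

-- On the secondary server: restore it, leaving the database
-- in a restoring state so log backups can be applied later
RESTORE DATABASE dataImports
FROM DISK = 'F:\Maint\LS\dataImports\dataImports_full.bak'
WITH NORECOVERY,
     MOVE 'dataImports'     TO 'F:\Data\dataImports.mdf',
     MOVE 'dataImports_log' TO 'F:\Data\dataImports_log.ldf';
```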

Configure Log Shipping
In SSMS right click on the database name and select properties.
Click on Transaction Log Shipping. Now tick the box which says “Enable this as a primary database in a log shipping configuration”
The button ‘Backup Settings’ will now be enabled. Click this and fill out the options.


You should enter the full path to the folder on the primary server. Underneath this you should also enter the local path. In my example the dataimports data rarely changes so I have set different values from the default. I am deleting files older than 24 hours – 72 is the default. I am alerting on 2 hours where 1 is the default. Next click on the schedule button. As the database I am using does not change very often I have scheduled my backups to take place once an hour. The default is every 15 minutes.


Click OK.
You are now on the configuration screen again. There is a white box where it says Secondary server instances and databases. Underneath the box click Add. Click the Connect button at the top right and enter the secondary server details. The database should be selected by default.

Now you are back on the secondary database settings, make sure the option 'No, the secondary database is initialised' has been selected. Now click the Copy Files tab. In this screen you need to enter the path for the directory on the secondary server: \\serverB\f$\Maint\LS\dataImports
I change the delete copied files option to 24 hours again. Now click Schedule. I change this to run every hour again – the default is 15 minutes. This time I will also change the start time to 00:15. There is no point in it trying to copy the files on the hour as the actual backup is being taken on the hour. Click OK and that task will be scheduled.

Now click on the Restore Transaction Log tab. I change the delay option to 5 minutes and the alert option to 2 hours. This is because I have a slow rate of change and so only copy every hour. Click Schedule again. This time I will set the task to run at 30 minutes past the hour, hourly. Click OK, and then click OK again on the next screen.

You are now on the properties page again.
Click ok again and the server will create the log shipping jobs. Once the two items have gone green you can click close.

Run the Log Shipping Jobs.
On the primary server there will be a job named like LSBackup_DatabaseName.
Run the LSBackup_DatabaseName job and check that a trn backup file has appeared in the folder you created on the primary server.

On the secondary server look for a job like LSCopy_PrimaryServerName_DatabaseName and run it. You should see the trn file appear in the folder you created on serverB. Now run LSRestore_PrimaryServerName_DatabaseName.

Finally Transfer Logins
You should use SSIS to create a Transfer Logins task to copy the logins from serverA to serverB. This way, if you have to fail the database over to serverB, you can enable the logins and won't have any permissions problems.
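If you ever do have to fail over, the secondary database is brought online roughly like this (a sketch – the tail-log file name is a placeholder, and it assumes all earlier log backups have already been applied by the restore job):

```sql
-- Apply the final log backup (take a tail-log backup on the
-- primary first, if it is still accessible)
RESTORE LOG dataImports
FROM DISK = 'F:\Maint\LS\dataImports\dataImports_tail.trn'
WITH NORECOVERY;

-- Bring the secondary copy online for use
RESTORE DATABASE dataImports WITH RECOVERY;
```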

Friday, October 23, 2009

Change the Default Backup Directory


I have used this on most of my servers so that when I click Restore or Backup Database it doesn't go off to a directory under Program Files on the C drive, but instead goes to the root of the backup directory I use.

SQL 2005 version

This part of the script will display the current default backup directory. Note that the MSSQL.1 part of the registry key refers to the first installed instance – adjust it if your instance ID differs.
DECLARE @BackupDirectory VARCHAR(100)
EXEC master..xp_regread @rootkey='HKEY_LOCAL_MACHINE',
@key='SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.1\MSSQLServer',
@value_name='BackupDirectory',
@BackupDirectory=@BackupDirectory OUTPUT
SELECT @BackupDirectory as BackupDirectory


This script will set G:\Maintenance\userData as the new default directory.
EXEC master..xp_regwrite
@rootkey='HKEY_LOCAL_MACHINE',
@key='SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.1\MSSQLServer',
@value_name='BackupDirectory',
@type='REG_SZ',
@value='G:\Maintenance\userData'


SQL 2000 version

DECLARE @BackupDirectory VARCHAR(100)
EXEC master..xp_regread @rootkey='HKEY_LOCAL_MACHINE',
@key='SOFTWARE\Microsoft\MSSQLServer\MSSQLServer',
@value_name='BackupDirectory',
@BackupDirectory=@BackupDirectory OUTPUT
SELECT @BackupDirectory as BackupDirectory


EXEC master..xp_regwrite
@rootkey='HKEY_LOCAL_MACHINE',
@key='SOFTWARE\Microsoft\MSSQLServer\MSSQLServer',
@value_name='BackupDirectory',
@type='REG_SZ',
@value='G:\Maintenance\userData'




Friday, October 16, 2009

Primary Keys – Compare the Column Name with the Table Name


This script looks at all the primary key indexes in a database and pulls back the name of the column the key is on. It only checks keys built on a single column. It then recommends a name for the column of tableNameID – columns which already match this are filtered out.
I have written this because I have a bad habit of naming surrogate primary key columns PKID.
In most cases where a column doesn't match the name formula it is because it already had a good name – like serverName in my server audit table.

SELECT si.name AS IndexName,
       so.name AS TableName,
       sc.name AS columnName,
       so.name + 'Id' AS recommendName
FROM sys.indexes AS si
INNER JOIN sys.objects AS so
    ON si.object_id = so.object_id
INNER JOIN sys.index_columns AS ic
    ON si.object_id = ic.object_id AND si.index_id = ic.index_id
INNER JOIN sys.columns AS sc
    ON ic.object_id = sc.object_id AND ic.index_column_id = sc.column_id
WHERE si.is_primary_key = 1
AND 1 = (SELECT COUNT(object_id)
         FROM sys.index_columns icInner
         WHERE si.object_id = icInner.object_id
           AND si.index_id = icInner.index_id)
AND so.name + 'Id' <> sc.name;






Tuesday, October 6, 2009

Error connecting to SSIS remotely

Nearly all of this post comes from the blog listed in the reference below. I have duplicated most of it here as I found it hard to find and I know I am going to come across this error again.

As well as the notes below the user will also need to be added to the DCOM users group on the server.

When a user tries to connect to SSIS remotely they get an "access is denied" error. They can connect locally with no problems. Administrators on the remote machine can connect remotely and locally.

For Windows 2003 Server or Windows XP

1. If the user is running under a non-admin account it needs to be added to the Distributed COM Users group
2. Go to Start - Run and type %windir%\system32\Com\comexp.msc to launch Component Services
3. Expand Component Services\Computers\My Computer\DCOM Config
4. Right click on the MsDtsServer node and choose Properties
5. In the MsDtsServer Properties dialog go to the Security page
6. Configure your settings as described below
7. Restart the SSIS service

In the Security page we are interested in the “Launch and Activation Permissions” section. Click the Edit button to see the “Launch Permissions” dialog.

The “Launch Permissions” dialog allows you to configure SSIS server access per user/group. At the bottom of the dialog you can select:

• Local / Remote Launch permissions if you want to allow a user/group to start the service locally or remotely
• Local / Remote Activation permissions if you want to allow a user/group to connect to the SSIS server locally or remotely.

Remote Access:
By default, low-privileged users can only connect to the SSIS server on the local machine, and only when the service has already started. This is shown by the fact that only Local Activation is checked for the Machine\Users group. To grant a user permission to connect to the running server remotely you need to check Remote Activation.

Reference:
http://deepakrangarajan.blogspot.com/2008/03/connecting-to-integration-service-on.html

Saturday, October 3, 2009

SQL Server 2005 Maintenance Plan – non-system databases

I separate system database backups and user database backups – although the code is essentially the same.

On some servers extra databases are added to the NOT IN list – UPDATE STATISTICS takes a long time to run on a large database so I will run that separately once a week.

These steps are run nightly.

Each step is set to proceed even if it fails – I don't want to stop all backups just because one db failed something. I have emails sent to me and I also read the job history log each working day so I pick up these failures.

Step 1: Shrink log file if recovery is simple
This only shrinks the log if the database is in simple recovery mode. You shouldn’t really have to do this but I have had some problems which this cleared up for me.

DECLARE @dbname sysname, @cmd varchar(1000);

DECLARE db_recovery_cursor CURSOR FOR
    SELECT [name]
    FROM sys.databases
    WHERE [name] NOT IN ('master','tempdb','model','distribution','msdb')
      AND is_read_only = 0
      AND recovery_model_desc = 'Simple';

OPEN db_recovery_cursor;
FETCH NEXT FROM db_recovery_cursor INTO @dbname;
WHILE @@FETCH_STATUS = 0
BEGIN
    DECLARE @databaseId as int;
    SELECT @databaseId = database_id
    FROM sys.databases
    WHERE [name] = @dbname;

    DECLARE @LogName as varchar(250);
    SELECT @LogName = name
    FROM sys.master_files
    WHERE type_desc = 'log'
      AND database_id = @databaseId;

    DECLARE @sql as varchar(2000);
    SET @sql = 'USE [' + @dbname + '];';
    SET @sql = @sql + ' CHECKPOINT;';
    SET @sql = @sql + ' DBCC SHRINKFILE([' + @LogName + '], 30);';

    EXEC(@sql);
    FETCH NEXT FROM db_recovery_cursor INTO @dbname;
END;
CLOSE db_recovery_cursor;
DEALLOCATE db_recovery_cursor;

Step 2: Update Usage and Checkpoint and turn off autoshrink
Update usage helps fix problems in space reporting on the database.
Some of our databases were created with auto-shrink on – a big no-no!

USE master;
DECLARE @dbName varchar(255);
DECLARE @SQL varchar(255);

DECLARE dbCursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT [name]
    FROM master.sys.databases
    WHERE [name] NOT IN ('master','msdb','tempdb','model','Distribution','Resource')
      AND is_read_only = 0
      AND state_desc = 'ONLINE'
    ORDER BY [name];

OPEN dbCursor;
FETCH NEXT FROM dbCursor INTO @dbName;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = 'DBCC UPDATEUSAGE ([' + @dbName + ']) WITH NO_INFOMSGS;';
    EXEC (@sql);
    SET @sql = 'USE ' + QUOTENAME(@dbName) + '; CHECKPOINT;';
    EXEC (@sql);
    SET @sql = 'EXEC sp_dboption ''' + @dbName + ''', ''autoshrink'', ''False'';';
    EXEC (@sql);
    USE master;
    FETCH NEXT FROM dbCursor INTO @dbName;
END;
CLOSE dbCursor;
DEALLOCATE dbCursor;

Step 3: Integrity Check
Here you may not want to run this against every database every night, but if you can you should. I use the DATA_PURITY option as we have a lot of databases that were imported from SQL 2000.

USE master;
DECLARE @dbName varchar(255);
DECLARE @SQL varchar(255);

DECLARE dbCursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT [name]
    FROM master.sys.databases
    WHERE [name] NOT IN ('master','msdb','tempdb','model','Distribution','Resource')
      AND is_read_only = 0
      AND state_desc = 'ONLINE'
    ORDER BY [name];

OPEN dbCursor;
FETCH NEXT FROM dbCursor INTO @dbName;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = 'DBCC CHECKDB ([' + @dbName + ']) WITH NO_INFOMSGS, DATA_PURITY;';
    EXEC (@sql);
    FETCH NEXT FROM dbCursor INTO @dbName;
END;
CLOSE dbCursor;
DEALLOCATE dbCursor;

Step 4: Update Stats
sp_updatestats will only update the stats SQL thinks need updating. UPDATE STATISTICS will update stats whether they need it or not. Updating stats will cause stored procs to recompile – which is OK if the stats were out of date, but not if they weren't.
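The difference in a couple of lines (the table name here is just an example):

```sql
-- Only updates statistics the engine thinks are stale
EXEC sp_updatestats;

-- Forces an update on one table regardless of whether it is needed
UPDATE STATISTICS dbo.DataImports WITH FULLSCAN;
```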

USE master;
DECLARE @dbName varchar(255);

DECLARE dbCursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT [name]
    FROM master.sys.databases
    WHERE [name] NOT IN ('master','msdb','tempdb','model','Distribution','Resource')
      AND is_read_only = 0
      AND state_desc = 'ONLINE'
    ORDER BY [name];

OPEN dbCursor;
FETCH NEXT FROM dbCursor INTO @dbName;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC ('USE [' + @dbName + ']; EXEC sp_updatestats;');
    FETCH NEXT FROM dbCursor INTO @dbName;
END;
CLOSE dbCursor;
DEALLOCATE dbCursor;

Step 5: Backup and remove old backups
I delete files older than 96 hours – more accurate than x days. On some servers where space is more of a constraint this time is reduced. On our live servers backups are also copied to tape, but this process runs outside of SQL. Backups will only be removed if there is a corresponding live database. This means if a database is dropped its backups will stay on the server until separately deleted – I like having this in case someone wants to restore it. I'll normally manually delete them after 6 months or so.

If a database is over a certain size I stripe the backup across multiple files (the script below uses two).

USE Master;

Declare @delDate as VARCHAR(50);
Set @delDate = CAST(DATEADD(hh, -96, GETDATE()) AS VARCHAR(50));

DECLARE @rootDir as Varchar(255);
SET @rootDir = 'M:\Backups\User\'; /*Enter path to user backups here*/

DECLARE @dbName varchar(255);
DECLARE @sqlCreateDir varchar(255);

Declare @y as char(4);
Set @y = Cast(Year(getDate()) as char(4));
Declare @m as char(2);
Set @m = Right('0' + Cast(Month(getDate()) as varchar(2)),2);
Declare @d as char(2);
Set @d = Right('0' + Cast(Day(getDate()) as varchar(2)),2);
Declare @date as char(10);
Set @date = @y + '_' + @m + '_' + @d;

Declare @h as char(2);
Set @h = Right('0' + Cast(DatePart(hh,getDate()) as varchar(2)),2);
Declare @mi as char(2);
Set @mi = Right('0' + Cast(DatePart(mi,getDate()) as varchar(2)),2);
Declare @time as char(5);
Set @time = @h + '_' + @mi;

DECLARE dbCursor CURSOR Local Fast_Forward FOR
select [name] From master.sys.databases
WHERE state_desc = 'ONLINE'
AND Name not in ('master','msdb','tempdb','model','Distribution','Resource')
Order by [name];
OPEN dbCursor;
FETCH NEXT FROM dbCursor INTO @dbName;
WHILE @@FETCH_STATUS = 0
BEGIN
DECLARE @dirPath as varchar(250);
Set @dirPath = @rootDir + @dbName;

Set @sqlCreateDir = 'master.dbo.xp_create_subdir N''' + @dirPath + '''';
Exec (@sqlCreateDir);

EXECUTE master.dbo.xp_delete_file 0, @dirPath,N'bak',@delDate;

DECLARE @size as numeric(25,0);

SELECT @size = b.backup_size
FROM master..sysdatabases d with (nolock)
left join msdb..backupset b with (nolock)
on d.name = b.database_name
AND b.backup_start_date = (select max(backup_start_date)
from msdb..backupset b2
where b.database_name = b2.database_name
and b2.type = 'D')
WHERE d.name = @dbName;

Declare @BakName as Varchar(250);
If @size < 2147483648
BEGIN
    Set @BakName = @dbName + '_' + @date + '_' + @time + '.bak';
    --Now run a backup command
    DECLARE @BakCommand as varchar(1024);
    SET @BakCommand = 'BACKUP DATABASE ' + QuoteName(@dbName);
    SET @BakCommand = @BakCommand + ' TO DISK=''' + @dirPath + '\' + @BakName + '''';
    SET @BakCommand = @BakCommand + ' WITH INIT, Checksum;';
    exec (@BakCommand);
END
ELSE
BEGIN
    Set @BakName = @dbName + '_' + @date + '_' + @time;
    --Now run a backup command, striped across two files
    SET @BakCommand = 'BACKUP DATABASE ' + QuoteName(@dbName);
    SET @BakCommand = @BakCommand + ' TO DISK=''' + @dirPath + '\' + @BakName + 'a.bak''';
    SET @BakCommand = @BakCommand + ' ,DISK=''' + @dirPath + '\' + @BakName + 'b.bak''';
    SET @BakCommand = @BakCommand + ' WITH INIT, Checksum;';
    exec (@BakCommand);
END;

FETCH NEXT FROM dbCursor INTO @dbName;
END;
CLOSE dbCursor;
DEALLOCATE dbCursor;

References:
Update Usage: http://msdn.microsoft.com/en-us/library/ms188414.aspx
Check Db with Data_purity: http://www.sqlskills.com/BLOGS/PAUL/post/CHECKDB-From-Every-Angle-How-to-tell-if-data-purity-checks-will-be-run.aspx

Saturday, September 26, 2009

Linked Tables from access failing if they have a primary key.


We had someone trying to connect to a table in a database using Access and linked tables. They kept getting "ODBC – call failed" when they tried to access the data. If a table didn't have a primary key then it worked!
I ran a trace and saw it was calling a select statement against the primary key in that table. There were no errors but an attention warning appeared. I then spread the trace to all databases and got this error:
The EXECUTE permission was denied on the object 'sp_execute', database 'mssqlsystemresource', schema 'sys'.
I had locked down this server so now had to open it up a little.
I created a role in master called AccessUsers. I granted the role Execute on 'sp_execute'.
Now they can connect using Access.
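
The role setup, sketched in T-SQL (the role name is from this post; the user name is a placeholder):

```sql
USE master;
CREATE ROLE AccessUsers;
GRANT EXECUTE ON sys.sp_execute TO AccessUsers;
-- then add each affected login's user to the role
EXEC sp_addrolemember 'AccessUsers', 'DOMAIN\SomeUser';
```
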
I'll show the lock-down scripts in another entry.

Monday, September 14, 2009

Space related commands

Database Files
This script runs against the database you are connected to. This will show the total size of a database file along with the available space left in the file. This is for all the files that make up the database including the log file.

SELECT name AS 'File Name' , physical_name AS 'Physical Name', size/128 AS 'Total Size in MB',
size/128.0 - CAST(FILEPROPERTY(name, 'SpaceUsed') AS int)/128.0 AS 'Available Space In MB'
FROM sys.database_files;

Log Files
This runs against every log file in the database giving the database name, Log Size in MB, Log Space Used (%), and Status.

DBCC SQLPERF (LOGSPACE);

DBCC SQLPERF: http://msdn.microsoft.com/en-us/library/ms189768.aspx

Free Drive Space
Shows the name of the drives and how many MB of free space each has. It does not show how large the drive is – or if there is any compression on it, etc. Good to use if you want a quick check to see if there is space to run a backup.

EXEC master..xp_fixeddrives

Database Space Used
Replace tempDB with the name of the database you want to query. Shows the database name (although you know that already), the size of the database in MB, and the percentage of unallocated space. It also shows: total amount of space allocated by objects in the database, total amount of space used by data, total amount of space used by indexes, and total amount of space reserved for objects in the database but not yet used.

EXEC tempDB.dbo.sp_spaceused;

You can also use this to find the size of an object:
EXEC sp_spaceused N'NameofObject';

sp_spaceUsed: http://msdn.microsoft.com/en-us/library/ms188776.aspx

SSIS Script to show disk data on all servers populated from a table list of servers:
I have a management server with a list of servers – for this I use columns isAccessible and isSSPI. I do the isSSPI bit because this normally means I can run the WMI query due to the way the servers have been set up.

Imports System
Imports System.Data
Imports System.Data.SqlClient
Imports Microsoft.SqlServer.Dts.Runtime
Imports System.Collections
Imports System.Management
Imports System.IO

Public Class ScriptMain
    Dim sConn As String = "server=.;database=ServerAudit;Integrated Security=SSPI;max pool size=100;min pool size=1;"
    Dim serverNameString As String = String.Empty
    Dim serverIPString As String = String.Empty
    Public Sub Main()
        Call getServerRecordSet()
        Dim serverNameArray As String() = Split(serverNameString, ",")
        Dim serverIPArray As String() = Split(serverIPString, ",")
        Dim serverHDD As String = String.Empty
        Dim x As Integer = 0
        For Each Server As String In serverNameArray
            serverHDD = getDataDiskDrives(Server, serverIPArray(x))
            x += 1
            Dim aResultSet As String() = Split(serverHDD, "/*/")
            For Each sDisk As String In aResultSet
                Try
                    If Len(sDisk) > 1 Then
                       ' MsgBox("sending to db: " & sDisk)
                        sendDatatoDB(sDisk)
                    End If
                Catch ex As Exception
                    ' MsgBox("DB Failed: " & ex.Message)
                End Try
            Next
        Next
        Dts.TaskResult = Dts.Results.Success
    End Sub

    Public Function getDataDiskDrives(ByVal server As String, ByVal ipAddress As String) As String
        Dim query1 As ManagementObjectSearcher
        Dim queryCollection1 As ManagementObjectCollection
        Dim HDDSize As String = String.Empty
        Dim thisDrive As String = String.Empty
        Try
            query1 = New ManagementObjectSearcher("\\" + ipAddress + "\root\cimv2", "SELECT * FROM Win32_DiskDrive")
            queryCollection1 = query1.Get()
            For Each mo As ManagementObject In queryCollection1
                Dim HddKB As Long = 0
                Dim Hddletter As String = String.Empty
                HddKB = Convert.ToInt64(mo("size"))
                For Each b As ManagementObject In mo.GetRelated("Win32_DiskPartition")
                    For Each c As ManagementBaseObject In b.GetRelated("Win32_LogicalDisk")
                        Hddletter = Left(Convert.ToString(c("Name")), 1) & "," & (Convert.ToInt64(c("Size")) / 1073741824).ToString & "," & (Convert.ToInt64(c("FreeSpace")) / 1073741824).ToString
                    Next
                Next
                If Len(Hddletter) > 1 Then
                    thisDrive = server & "," & Hddletter & "/*/"
                    HDDSize += thisDrive
                End If
            Next
        Catch ex As Exception
            'MsgBox(ex.Message)
            HDDSize = String.Empty
        End Try
        getDataDiskDrives = HDDSize
    End Function

    Sub sendDatatoDB(ByVal dataArray As String)
        Dim SqlCommand As SqlCommand = New SqlCommand
        SqlCommand.CommandText = "[uspAddHDDItem]"
        SqlCommand.CommandType = CommandType.StoredProcedure
        SqlCommand.Connection = New SqlConnection(sConn)
        Dim DriveLetter As String
        Dim DriveSize As String
        Dim DriveSpace As String
        Dim serverName As String
        Dim resultSet As String() = Split(dataArray, ",")
        serverName = Trim(resultSet(0))
        DriveLetter = Trim(resultSet(1))
        DriveSize = Left(Trim(resultSet(2)), 7)
        DriveSpace = Left(Trim(resultSet(3)), 7)
        If InStr(DriveSize, ".") > 1 Then
            DriveSize = Left(DriveSize, InStr(DriveSize, ".") + 2)
        End If
        If InStr(DriveSpace, ".") > 1 Then
            DriveSpace = Left(DriveSpace, InStr(DriveSpace, ".") + 2)
        End If
        Dim server As SqlParameter = New SqlParameter("@server ", SqlDbType.VarChar, 25)
        Dim letter As SqlParameter = New SqlParameter("@letter", SqlDbType.Char, 1)
        Dim size As SqlParameter = New SqlParameter("@size", SqlDbType.VarChar, 7)
        Dim space As SqlParameter = New SqlParameter("@free", SqlDbType.VarChar, 7)
        server.Value = serverName
        letter.Value = DriveLetter
        size.Value = DriveSize
        space.Value = DriveSpace
        server.Direction = ParameterDirection.Input
        letter.Direction = ParameterDirection.Input
        size.Direction = ParameterDirection.Input
        space.Direction = ParameterDirection.Input
        SqlCommand.Parameters.Add(server)
        SqlCommand.Parameters.Add(letter)
        SqlCommand.Parameters.Add(size)
        SqlCommand.Parameters.Add(space)
        SqlCommand.Connection.Open()
        Try
            SqlCommand.ExecuteNonQuery()
        Catch ex As Exception
            MsgBox("DriveSpace: " & DriveSpace & " DriveSize " & DriveSize)
        End Try
        SqlCommand.Connection.Close()
        SqlCommand.Connection.Dispose()
    End Sub

    Private Sub getServerRecordSet()
        Dim sql As String = "SELECT DISTINCT ipAddress, serverName FROM servers WHERE (isAccessible = 1) AND ipAddress is not null and domainID = 1"
        Dim conn As SqlConnection = New SqlConnection(sConn)
        Dim cmd As SqlCommand = New SqlCommand(sql, conn)
        cmd.Connection.Open()
        Dim r As SqlDataReader = cmd.ExecuteReader(CommandBehavior.CloseConnection)
        While r.Read()
            serverNameString += r("serverName").ToString() & ","
            serverIPString += r("ipAddress").ToString() & ","
        End While
        r.Close()
        conn.Close()
        conn.Dispose()
        serverNameString = Left(serverNameString, Len(serverNameString) - 1)
        serverIPString = Left(serverIPString, Len(serverIPString) - 1)
    End Sub
End Class

This is the stored proc for importing the data. From this you can work out the table schema – I delete the data in the table first and use server as a primary key. You could keep the data and add a date-updated column – and change the primary key.
CREATE PROCEDURE [dbo].[uspAddHDDItem]
    @server as Varchar(25),
    @letter as char(1),
    @size as Varchar(7),
    @free as Varchar(7)
AS
BEGIN
SET NOCOUNT ON;
    INSERT INTO HDDAudit(diskLetter,DiskSize,FreeSpace,ServerName)
    VALUES(@letter,Cast(@size as Decimal(7,2)),Cast(@free as Decimal(7,2)),@server);
END;

Friday, September 11, 2009

Upgrading SQL Server 2005

Upgrading SQL Server Standard to Developer Edition on Dev and Test servers – Stand alone servers. I have also used this to upgrade back again - Developer to Standard Edition.

Make sure you have backups of your System and user databases before you begin. Take a backup of your system databases before you apply the service pack. Take another backup of them when you have finished the hot fix install.

You should probably reboot the server before you begin to make sure you have a clean slate to work with.

Create a folder called sqlAdmin on the C drive.
Create a sub folder called SQL2005
Create two sub folders called Servers and Tools.
Copy the contents of CD1 into the Servers directory and CD2 into the Tools Directory.

You will also need to copy the latest service pack and hotfixes on to the server ready for installing after the upgrade: an upgrade takes the product level down to RTM.

In the sqlAdmin directory create a text file called upgrade.cmd and add this text to it:

@echo off

Set UPGRADE=SQL_Engine,SQL_Data_Files,SQL_Tools90,SQL_Replication

Set INSTANCENAME=MSSQLSERVER

Start /wait c:\sqlAdmin\SQL2005\servers\setup.exe /qb UPGRADE=%UPGRADE% SKUUPGRADE=1 INSTANCENAME=%INSTANCENAME%
:: END OF CMD TEXT FILE

This assumes no Reporting Services or Analysis Services. If you have them you may need to add RS_Server, RS_Web_Interface, Analysis_Server, AnalysisDataFiles to the list in Set UPGRADE=

After the upgrade you need to install the service pack and hot fixes.
Reboot the server once you have finished.

Problems

I had a problem with the upgrade failing when Reporting Services was on the server: “sql bpa command line has encountered a problem and needs to close”. To fix this I created an extra directory called BPAClient under c:\Program Files\Microsoft SQL Server\90\Setup Bootstrap\BPA\ and into it I copied the file BPAClient.dll from the c:\Program Files\Microsoft SQL Server\90\Setup Bootstrap\BPA\bin\ directory.

BPA failed: http://social.msdn.microsoft.com/Forums/en-US/sqlsetupandupgrade/thread/2a5f460b-8684-4510-8fcb-1e9d9786baff
 
I have not yet managed to upgrade SSAS – SSRS was fine but SSAS maintains the same version. If I ever work this out I will update this post. What I do instead is back up the databases, then uninstall the old version and install the new edition.

Thursday, September 10, 2009

SQL 2005 Maintenance: Re-Index Plan

This goes through all databases, finds any indexes which have fragmentation greater than 15% and reindexes them. The loop near the beginning is there because sys.dm_db_index_physical_stats needs a database ID. If you do not pass a database ID and leave it as NULL it will go through every database – which will cause a problem for any that is not writeable (read-only, offline, in recovery).

To use the NULL option the account running the command needs the VIEW SERVER STATE permission.
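If the account is missing that permission, a sysadmin can grant it; a sketch, where [DOMAIN\MaintenanceAccount] is a placeholder for the login that runs the job:

```sql
-- [DOMAIN\MaintenanceAccount] is a placeholder: substitute the real login
GRANT VIEW SERVER STATE TO [DOMAIN\MaintenanceAccount];
```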

This task will re-index an index where fragmentation is greater than 15%, provided that more than 50 pages are used in the index. If there are fewer than that, the chances are the index isn’t being used and the query optimizer will perform a table scan instead.

I took the initial idea of using this function to populate a table for reindexing from a blog but can’t remember which. I have adapted it to suit the tasks I wanted to perform.

I find STATISTICS_NORECOMPUTE = OFF a little unintuitive. It means do not not recompute stats – or in English: recompute the stats.

On most servers this task takes between 5 and 20 minutes to run, so I can run it every night. There are a few servers, though, where it runs on a weekly basis instead.

I also use an extra line after AND ps.index_type_desc <> 'heap' to exclude some databases: AND [Name] NOT IN ('LIST OF DBNAMES TO EXCLUDE'); – the names should be quoted and comma separated, like this: (‘dbname’,’dbname2’). I use this where a database does a lot of ETL work during the night, so it is best to run a separate re-index job for it once that work has finished.

If speed is a problem you can replace
FROM sys.dm_db_index_physical_stats (@dbCountID, NULL, NULL, NULL, 'detailed')
With
FROM sys.dm_db_index_physical_stats (@dbCountID, NULL, NULL, NULL, NULL)
This will then use the default LIMITED scan mode to gather statistics on the indexes – of course the faster version does not return such accurate data.

use tempdb;

IF OBJECT_ID('indexData') IS NOT NULL
  DROP TABLE [indexData];

CREATE TABLE [dbo].[indexData](
[dbName] varchar(250) Not NULL,
[di] int Not NULL,
[oi] varchar(150) NOT NULL,
[Frag] int not null,
[pages] int not null,
[iType] varchar(250) not null,
[oName] varchar(250) NULL,
[schemaName] varchar(250) NULL);

DECLARE @dbCountID as int;
SET @dbCountID = 0;
DECLARE @topID as int;
SELECT @topID = Max(database_id) FROM master.sys.databases
WHERE is_Read_Only = 0 and state_desc = 'ONLINE'
AND [Name] NOT IN ('LIST OF DBNAMES TO EXCLUDE');
-- restricting @topID to qualifying databases stops the loop spinning forever
-- when the highest database_id belongs to an excluded, read-only or offline database
While @dbCountID < @topID
BEGIN
   SELECT TOP (1) @dbCountID = database_id FROM master.sys.databases
   WHERE database_id > @dbCountID
   AND is_Read_Only = 0 and state_desc = 'ONLINE'
   AND [Name] NOT IN ('LIST OF DBNAMES TO EXCLUDE');

   INSERT INTO [indexData]([dbName],[di],[oi],[Frag],[pages],[iType])
   SELECT Distinct DB_NAME ( @dbCountID ), @dbCountID,    ps.object_id,ps.avg_fragmentation_in_percent,ps.Page_Count,ps.index_type_desc
   FROM sys.dm_db_index_physical_stats (@dbCountID, NULL, NULL, NULL, 'detailed') ps
   LEFT OUTER JOIN master.sys.databases AS sd
   ON ps.database_id = sd.database_id
   Where (ps.database_id > 4) AND sd.is_Read_only = 0 AND sd.state_desc = 'online'
   AND ps.Page_Count > 50
   AND ps.avg_fragmentation_in_percent > 15
   AND ps.index_type_desc <> 'heap'
   AND sd.database_id = @dbCountID;
END
DECLARE @dbName varchar(250);
DECLARE dbCursor CURSOR Local Fast_Forward FOR
SELECT Distinct [dbName]
FROM [indexData];

OPEN dbCursor;
FETCH NEXT FROM dbCursor INTO @dbName;
WHILE @@FETCH_STATUS = 0
BEGIN
    DECLARE @sql as Varchar(500)
    SET @sql = 'Use [' + @dbName + ']; '
    Set @sql = @sql + 'SET ARITHABORT ON; SET QUOTED_IDENTIFIER ON;'
    Set @sql = @sql + ' Update tempdb..[indexData] '
    Set @sql = @sql + ' SET oName = object_Name(oi), '
    Set @sql = @sql + ' schemaName = (select schema_Name(sobj.schema_id) from sys.objects sobj where sobj.object_id = oi) '
   Set @sql = @sql + ' WHERE dbName = ''' + @dbName + ''';';
   Exec (@sql);
   Use tempdb;
   FETCH NEXT FROM dbCursor INTO @dbName;
END;
CLOSE dbCursor;
DEALLOCATE dbCursor;
DECLARE @dbName2 varchar(5000);
DECLARE dbCursor2 CURSOR Local Fast_Forward FOR
SELECT Distinct '[' + dbName + '].[' + schemaName + '].[' + oName + ']' as databaseObject
FROM [indexData]
WHERE oName not like 'perfValuesForProcess%';

OPEN dbCursor2;
FETCH NEXT FROM dbCursor2 INTO @dbName2;
WHILE @@FETCH_STATUS = 0
BEGIN
     DECLARE @sql2 as Varchar(8000);
     Set @sql2 = 'ALTER INDEX ALL ON ' + @dbName2
     Set @sql2 = @sql2 + ' REBUILD WITH (SORT_IN_TEMPDB = ON, STATISTICS_NORECOMPUTE = OFF);';
     EXEC(@sql2);
     FETCH NEXT FROM dbCursor2 INTO @dbName2;
END;
CLOSE dbCursor2;
DEALLOCATE dbCursor2;
Use tempdb;
Drop Table [indexData];

Reference
MSDN sys.dm_db_index_physical_stats (http://msdn.microsoft.com/en-us/library/ms188917(SQL.90).aspx)

Friday, September 4, 2009

Cluster install: SQL Service keeps auto-restarting

The set up was a SQL Server 2005 cluster – the service was not failing over, it was simply restarting. The system event log would read something like: the service responded to a stop command from the cluster manager. There was no record of any problem in the SQL error log.

The first time it occurred whilst I was working here (it had happened before then) it looked like this:
At 00:29 the cluster manager issued a stop command to the SQL service, which duly stopped. A start command followed and the service came back up again. The server was down for around 5 minutes.

There was no clue as to why it had happened except in the Windows Event Viewer which had a lot of lost communication and 19019 errors. These event errors corresponded with SQL jobs failing with server timeout errors – lost communication with the server - on tasks such as update stats in my maintenance plans (even though these are run directly on the server).

It carried on happening while I thought it was network cards and CPU affinity:
http://support.microsoft.com/kb/174812
http://www.sqlservercentral.com/Forums/Topic399630-149-1.aspx
http://support.microsoft.com/kb/892100

After a lot of tinkering and Googling it was resolved by setting Priority Boost to 0 – Microsoft recommends this for all cluster installs. After changing the setting you need to restart the SQL service. It is now several months since this all happened and I no longer get any of these errors.
Even if you are not encountering any problems you ought to set Priority Boost to 0 (if it isn't already) as these restarts can happen at any time.
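Priority Boost can be changed with sp_configure; a sketch – it is an advanced option, and the change only takes effect once the SQL service has been restarted:

```sql
-- 'priority boost' is an advanced option, so expose advanced options first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- set Priority Boost to the recommended value of 0
EXEC sp_configure 'priority boost', 0;
RECONFIGURE;
-- restart the SQL service for the change to take effect
```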

SQL Server 2000 Transaction Log Backups

This is really one reason why I don't like the built-in maintenance plans. Why can't they simply ignore the databases which are in simple recovery mode? Why do they instead fill the error logs with messages saying they have tried to back up the transaction log of the master database?

I wanted to be able to choose which databases to exclude – and then only attempt a transaction log backup on a database which has Full or Bulk-Logged recovery set.
Remember you can only start transaction log backups after you have taken a full backup.

DECLARE @name VARCHAR(50); -- database name  
DECLARE @fileName VARCHAR(250); -- filename for backup
DECLARE @BakCommand Varchar(550);
DECLARE @rootDir Varchar(20);
DECLARE @fileDate Varchar(50);
/*@rootDir is the root directory where user db backups go*/
SET @rootDir = 'F:\SQLBackups\user';  

SELECT @fileDate = CONVERT(VARCHAR(20),GETDATE(),112)  
   + '_'  
   + REPLACE(CONVERT(VARCHAR(20),GETDATE(),108),':','') ;

DECLARE db_cursor CURSOR Local Fast_Forward FOR  
SELECT [name]  
FROM master.dbo.sysdatabases  
WHERE [name] NOT IN ('master','model','msdb','tempdb')  
   AND DATABASEPROPERTYEX(name, 'Recovery') IN ('FULL','BULK_LOGGED') ;

OPEN db_cursor    
        FETCH NEXT FROM db_cursor INTO @name    
        WHILE @@FETCH_STATUS = 0    
        BEGIN
                /*A sub directory is created for each database if not already there
                Backups are kept for 3 days*/
                SET @BakCommand = 'use [' + @name + ']; Checkpoint;';
                EXEC (@BakCommand);
                use Master;
                SET @BakCommand = 'master.dbo.xp_sqlmaint N''-S . -D ' + @name + ' -BkUpLog "'+ @rootDir + '" -BkUpMedia DISK  -CrBkSubDir -DelBkUps 3DAYS -BkExt "TRN"''';
                --Print @BakCommand
                EXEC (@BakCommand);                
       
                FETCH NEXT FROM db_cursor INTO @name;    
        END;
CLOSE db_cursor;
DEALLOCATE db_cursor;

SQL Server 2000 Maintenance Plan – non-system databases

I don’t like the system-generated maintenance plans so use this instead.
I normally go from one step to the next with total disregard as to its outcome. This is because I check job histories and logs each morning, so if update stats or integrity checks failed I will know about it and fix it then. I also want my backups to run regardless – they stay on disk for 3 days but are also copied to tape. So if a problem with integrity creeps in and isn’t noticed for more than 3 days (say I am on holiday), and I can’t fix it, I can restore from the tape backups almost as easily as from disk. If this doesn’t suit you, you could add an extra step which takes a backup but does not delete old backups – you just need the space for this if you can’t check for a while.
The most unsatisfying thing with this is detecting read-only databases – it is easy in 2005 but I haven’t found a satisfactory method for 2000.
I have a separate job for system databases where NOT IN is replaced with IN. It runs at a different time of the evening.
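One possible approach for 2000 that avoids the status bitmask check used in step 3 below: DATABASEPROPERTYEX is available in SQL Server 2000 (the transaction log script above already uses it for recovery mode) and reports updateability directly. A sketch:

```sql
-- 'Updateability' returns READ_ONLY or READ_WRITE, so this lists only
-- the writeable user databases without decoding the sysdatabases status bits
SELECT [name]
FROM master..sysdatabases
WHERE [name] NOT IN ('master','msdb','tempdb','model','Distribution','Resource')
  AND DATABASEPROPERTYEX([name], 'Updateability') = 'READ_WRITE'
ORDER BY [name];
```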

Step 1: Update Usage and Checkpoint
USE Master;
DECLARE @dbName varchar(255);
DECLARE @SQL varchar(255);
DECLARE dbCursor CURSOR Local Fast_Forward FOR
select [name]
From master..sysdatabases
WHERE [Name] not in ('master','msdb','tempdb','model','Distribution','Resource')
Order by [name];
OPEN dbCursor
FETCH NEXT FROM dbCursor INTO @dbName
WHILE @@FETCH_STATUS = 0
BEGIN
Set @sql = 'dbcc updateusage ([' + @dbName + ']) WITH NO_INFOMSGS;';
Exec (@sql);
Set @sql = 'Use ' + QuoteName(@dbName) + ' Checkpoint;';
Exec (@sql);
Use Master;
FETCH NEXT FROM dbCursor INTO @dbName;
END;
CLOSE dbCursor;
DEALLOCATE dbCursor;

Step 2: Integrity Check
USE Master;
DECLARE @dbName varchar(255);
DECLARE @SQL varchar(255);
DECLARE dbCursor CURSOR Local Fast_Forward FOR
select [name]
From master..sysdatabases
WHERE [Name] not in ('master','msdb','tempdb','model','Distribution','Resource')
Order by [name];
OPEN dbCursor
FETCH NEXT FROM dbCursor INTO @dbName;
WHILE @@FETCH_STATUS = 0
BEGIN
Set @sql = 'dbcc CHECKDB ([' + @dbName + ']) WITH NO_INFOMSGS;';
Exec (@sql);
FETCH NEXT FROM dbCursor INTO @dbName;
END;
CLOSE dbCursor;
DEALLOCATE dbCursor;

Step 3: Update Stats
USE Master;
DECLARE @dbName varchar(255);
DECLARE dbCursor CURSOR Local Fast_Forward FOR
/* 1048 represents read-only databases but it is not the only status value that means read-only –
you will need to check your own databases */
select [name]
From master..sysdatabases
WHERE Name not in ('master','msdb','tempdb','model','Distribution','Resource')
AND status <> 1048
Order by [name];
OPEN dbCursor
FETCH NEXT FROM dbCursor INTO @dbName
WHILE @@FETCH_STATUS = 0
BEGIN
Exec ('Use [' + @dbName + '] EXEC sp_MSforeachtable ''UPDATE Statistics ? With Sample 50 Percent''');
FETCH NEXT FROM dbCursor INTO @dbName;
END;
CLOSE dbCursor;
DEALLOCATE dbCursor;

Step 4: Backup databases
USE Master;
DECLARE @rootDir as Varchar(255);
SET @rootDir = 'E:\Backup'; /*Enter path to user backups here*/
DECLARE @dbName varchar(255);
DECLARE dbCursor CURSOR Local Fast_Forward FOR
select [name] From master..sysdatabases
WHERE Name not in ('master','msdb','tempdb','model','Distribution','Resource')
Order by [name];
OPEN dbCursor
FETCH NEXT FROM dbCursor INTO @dbName;
WHILE @@FETCH_STATUS = 0
BEGIN
DECLARE @BakCommand as varchar(1024)
SET @BakCommand = 'master.dbo.xp_sqlmaint N''-S . -D ' + @dbName + ' -BkUpDB "'+ @rootDir + '" -BkUpMedia DISK -CrBkSubDir -DelBkUps 3DAYS -BkExt "BAK"''';
--Print @BakCommand
EXEC (@BakCommand);
FETCH NEXT FROM dbCursor INTO @dbName;
END;
CLOSE dbCursor;
DEALLOCATE dbCursor;