Deploying Changes to Multiple SQL Server Environments

I am a bit old school when it comes to working with databases. I know it’s easy to just jump into SSMS or Query Analyzer and get the job done quickly, but I like to have scripts handy for everything I do, as well as logs on disk for each operation I perform.

To satisfy this requirement I keep two separate folders for managing my scripts: “c:\sqlscripts\”, where I store automation and utility scripts, and “c:\sqlsource\”, where I keep the scripts that I will actually run.
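For completeness, this layout can be created in one go from a command prompt (the logs subfolder is where the deployment template below will write its output):

mkdir c:\sqlscripts
mkdir c:\sqlsource
mkdir c:\sqlsource\logs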

The first step is to create a batch file to serve as a deployment template:

osql -S server_name -d database_name -i "c:\sqlsource\%1" -o "c:\sqlsource\logs\%1.environment_name.log" -U username -P password -n

I then save this file as “run_script_on_environment_name_database_name.bat” in a convenient folder of my choosing, for the purposes of this example “c:\sqlscripts\”. Replace environment_name and database_name with values suited to your specific need.
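As a concrete illustration (the server name and credentials here are made up), a copy for a staging environment and a database named Northwind, saved as “c:\sqlscripts\run_script_on_staging_northwind.bat”, would contain:

osql -S stagingserver -d Northwind -i "c:\sqlsource\%1" -o "c:\sqlsource\logs\%1.staging.log" -U deployuser -P secret -n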

When it’s time to run scripts on an environment, I assemble a list of files and choose one of the following methods to run them from the command line.

:: run a single file on a specific database
run_script_on_environment_name_database_name.bat sourcescriptname.sql

:: run all non-hidden files in the directory (run this from within c:\sqlsource)
FOR /F "delims=" %x IN ('dir /A:-H /B') DO run_script_on_environment_name_database_name.bat %x

:: run a list of files
FOR %x IN ( file1.sql file2.sql file3.sql ) DO run_script_on_environment_name_database_name.bat %x
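One caveat: the %x syntax above only works when the loop is typed directly at the command prompt. If you paste any of these loops into a batch file of their own, cmd requires the FOR variable to be doubled:

:: same list-of-files loop, but inside a .bat file - note the doubled percent signs
FOR %%x IN ( file1.sql file2.sql file3.sql ) DO run_script_on_environment_name_database_name.bat %%x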

 

There are several tweaks you can make to customize this process, but these are entirely a matter of personal preference.

I can well imagine people ranting about having separate files for each environment. This was simply a choice I made for my own convenience. You could easily parameterize the entire batch file, producing a script such as the one below, and pass all your arguments in at run time.

osql -S %1 -d %2 -i "%5\%6" -o "%7\%6.%1.%2.log" -U %3 -P %4 -n

%1 => servername
%2 => databasename
%3 => username
%4 => password
%5 => sourcefolder
%6 => sourcefile
%7 => logfolder
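If that generic version were saved as, say, “c:\sqlscripts\run_script.bat” (a file name I am making up here, along with the server and credentials), a deployment would then look like this:

run_script.bat stagingserver Northwind deployuser secret c:\sqlsource file1.sql c:\sqlsource\logs

which would write its output to “c:\sqlsource\logs\file1.sql.stagingserver.Northwind.log”.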

If you choose to do so, go right ahead; however, I like to have everything set up in the file itself and keep my parameters to a minimum.

Do what suits you. In the near future I am going to recreate this process in VBScript, because I would like to:

1. Timestamp the log files, so that a new log file is created on each run and nothing is overwritten if a script errors out. This would also provide neatly timestamped logs of the deployment and/or rollback process, which would be sweet for debugging on a testing environment.

2. Introduce error checking. If my scripts contain no PRINT statements and use SET NOCOUNT ON, then I could theoretically say that if the log file size is greater than zero, the script failed, and I should automatically run the rollback script. (Both ideas are sketched below.)
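For what it is worth, both ideas can be roughed out in plain batch before committing to a scripting language. Here is a minimal, hedged sketch: the %DATE%/%TIME% string surgery assumes a particular locale format, and the rollback_%1 naming convention is something I am inventing purely for illustration.

:: build a crude timestamp such as 15-01-2008_09-30-00 (format is locale-dependent)
set ts=%DATE:/=-%_%TIME::=-%
set ts=%ts: =0%
set ts=%ts:.=-%

:: run the script, logging to a fresh timestamped file each time
osql -S server_name -d database_name -i "c:\sqlsource\%1" -o "c:\sqlsource\logs\%1.%ts%.log" -U username -P password -n

:: with SET NOCOUNT ON and no PRINT statements, a non-empty log means failure
FOR %%A IN ("c:\sqlsource\logs\%1.%ts%.log") DO IF %%~zA GTR 0 (
    echo Script failed - running rollback
    osql -S server_name -d database_name -i "c:\sqlsource\rollback_%1" -U username -P password -n
)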

I am wondering whether I should use PowerShell or VBScript for this. I have not looked into PowerShell enough at this point to determine whether it is a good option; any comments would definitely be appreciated.

 
