In his day-long session, Rob will talk about a variety of super interesting subjects including: dbachecks, PowerShell module-making, GitHub, VSTS, and dbatools. Rob is a vibrant, knowledgeable speaker and I can’t recommend this precon enough! I learn a ton every time that Rob and I present together.
DBAs are seeing the benefit of using PowerShell to automate away the mundane. A modern data professional needs to interact with multiple technologies, and learning PowerShell increases both your ability to do that and your usefulness to your company.
At the end of this fun-filled day with Rob, a former SQL Server DBA turned professional automator, you will be much more confident approaching any task with PowerShell, and you will leave with all of the code and demos. You can even follow along if you bring a laptop with an instance of SQL Server installed.
How to learn to interact with any technology using PowerShell
Understanding the syntax
The importance of Get-Help and how PowerShell enables you to help yourself
Why you should write your own modules and how to make them available to your whole team
A quick, automated method for creating your module framework
Unit testing and debugging your code
How to continuously deliver changes to your PowerShell modules using GitHub and VSTS
Tips and tricks for script writing with the popular open-source community dbatools module
How to validate your SQL Server estate with PowerShell
Advanced SQL Server Agent and PowerShell management
We will have a lot of fun along the way and you will return to work with a lot of ideas, samples and better habits to become a PowerShell ninja and save yourself and your organisation time and effort.
Last week I performed a server migration from SQL Server 2014 to SQL Server 2016 using dbatools. A lot of work is done up front to prepare and plan for these migrations to have the least impact on our customers. So I always do my best to improve the process each time we do it.
This is the second product server we have migrated to SQL 2016. The first migration was automated up to a point:
Configure the servers using some dbatools commands
Take a full backup the day before the migration
Take and apply a differential backup the night of the migration
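In dbatools terms, the backup steps above look roughly like this. The instance name, database name, and share path are hypothetical placeholders, not our actual environment:

```powershell
# Hypothetical instance, database, and share names for illustration
$source     = 'SQL2014PROD'
$backupPath = '\\fileshare\MigrationBackups'

# Full backup the day before the migration
Backup-DbaDatabase -SqlInstance $source -Database ProductDB -Type Full -Path $backupPath

# Differential backup the night of the migration (applying it on the
# new server was still a manual/T-SQL step in that first migration)
Backup-DbaDatabase -SqlInstance $source -Database ProductDB -Type Differential -Path $backupPath
```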
When we did that migration last year, my comfort level with dbatools was low, so the migration was a mix of T-SQL and manual processes.
The goal this time around was to enhance the process with more automation. With that in mind, I decided to use the Invoke-DbaLogShipping command to build log shipping for the primary database in the migration. The current size of the database is 650GB, which is not too outrageous, but it grows about 30GB a day, and by Wednesday my differential backup would have been 90GB. Taking and restoring that backup would have eaten too much of the migration window.
Why Log Shipping?
The reason I decided to go with log shipping is the ease of setup, made even easier by the dbatools commands, and, as I pointed out earlier, its ability to minimize downtime even for large databases.
Log shipping also supports mixed versions, from SQL Server 2005 up to current versions, and mixed editions such as Standard, Workgroup, or Enterprise, which other migration options lack. See this great article on using log shipping to minimize downtime for more detail.
So we used our standard build configuration process to get the servers configured to our standards. That process has not been converted to use dbatools like I had hoped, but time has been my enemy on that project. Currently we have 12 (yes, 12) environments for each of our products, which means this exercise is done 12 times before all is said and done.
However, this time we did things a little differently. Our development teams now follow the Agile life cycle and work in two-week sprints. For the first phase we migrated the primary Continuous Integration environment and the main Integration environment using the automated process. Log shipping does not come into play for those, given their size, but they gave us practice and a test of the migration process. Then on the next sprint we migrated the Regression and Production environments. This is where the fun starts.
Last week I decided to use Invoke-DbaLogShipping to get the production database built on the new server and keep it in sync with the current server.
Steps To Build Log Shipping
To get the secondary going, I restored the latest full backup and left the database in NORECOVERY
Then I ran the Invoke-DbaLogShipping command with the parameters needed for my environment
Once everything was in place, I monitored the backup, copy, and restore SQL Agent jobs
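Put together, the steps look something like the sketch below. All instance names, the database name, and the share paths are hypothetical; run Get-Help Invoke-DbaLogShipping for the full parameter list:

```powershell
# Hypothetical names throughout -- adjust for your environment

# 1. Restore the latest full backup on the secondary, leaving it in NORECOVERY
Restore-DbaDatabase -SqlInstance SQL2016PROD -DatabaseName ProductDB `
    -Path '\\fileshare\MigrationBackups' -NoRecovery

# 2. Build log shipping, reusing the full backup that is already in place
$params = @{
    SourceSqlInstance      = 'SQL2014PROD'
    DestinationSqlInstance = 'SQL2016PROD'
    Database               = 'ProductDB'
    BackupNetworkPath      = '\\fileshare\LogShipping'
    UseExistingFullBackup  = $true
}
Invoke-DbaLogShipping @params

# 3. Keep an eye on the backup, copy, and restore SQL Agent jobs
#    (log shipping jobs are typically prefixed with "LS")
Get-DbaAgentJob -SqlInstance SQL2014PROD, SQL2016PROD |
    Where-Object Name -like 'LS*'
```

Splatting the parameters into a hashtable keeps the call readable as the parameter count grows, which matters for a command with as many options as Invoke-DbaLogShipping.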
And just like that, we are done: 17 total minutes of downtime for our product and we are back, with only replication and Availability Groups to be set up after the fact.
Video: Invoke-LogShipping Quick View (YouTube)
Considering the number of parameters available in Invoke-DbaLogShipping, there are several ways to set up log shipping. Two useful options are the -UseExistingFullBackup and -GenerateFullBackup parameters, which help automate the initial backup/restore process.
If the database is not yet initialized on the secondary instance, the command can select an existing full backup and restore it for you.
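The two initialization options might be used like this; again, the instance and database names and the share path are made up for illustration:

```powershell
# Option A: let Invoke-DbaLogShipping take a fresh full backup
# and restore it on the secondary
Invoke-DbaLogShipping -SourceSqlInstance SQL2014PROD -DestinationSqlInstance SQL2016PROD `
    -Database ProductDB -BackupNetworkPath '\\fileshare\LogShipping' `
    -GenerateFullBackup

# Option B: reuse a full backup that already exists in the backup path,
# skipping a fresh (and, for a 650GB database, slow) full backup
Invoke-DbaLogShipping -SourceSqlInstance SQL2014PROD -DestinationSqlInstance SQL2016PROD `
    -Database ProductDB -BackupNetworkPath '\\fileshare\LogShipping' `
    -UseExistingFullBackup
```

For a large database that is already being backed up regularly, option B is usually the faster path, which is why I leaned on it for this migration.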
I chose the GNU GPL for nostalgic reasons, and I also wanted companies who used dbatools to contribute back to the project. What I didn’t realize was that licensing dbatools as GPL would hurt integration, which ultimately impacts adoption. Permissive licenses like the MIT license don’t subject users to such restrictions.
Considering this and our lessons-learned from dbatools, we released dbachecks under the MIT license. The reception at SQL Bits was amazing. A number of comments were made expressing thanks and optimism about our selection of the MIT license.
And, after speaking with a number of community members about the restrictive GPL license, it became clear that dbatools must also switch to MIT.
As of version 0.9.300, dbatools is now MIT licensed! If you would like to continue using the GPL version, please fork v0.9.211 and take it from there.
If your project appreciates this switch, please comment in the section below to let us know!