dbatools by Chrissy Lemaire

Congrats to our teammate Rob Sewell! Rob was invited by PASS to present about PowerShell at PASS Summit 2018 on Tuesday, November 6, 2018.

In his day-long session, Rob will talk about a variety of super interesting subjects including: dbachecks, PowerShell module-making, GitHub, VSTS, and dbatools. Rob is a vibrant, knowledgeable speaker and I can’t recommend this precon enough! I learn a ton every time that Rob and I present together.

Professional and Proficient PowerShell: From Writing Scripts to Developing Solutions

DBAs are seeing the benefit of using PowerShell to automate away the mundane. A modern data professional needs to interact with multiple technologies, and learning PowerShell increases your ability to do that and your usefulness to your company.

At the end of this fun-filled day with Rob, a former SQL Server DBA turned professional automator, you will be much more confident approaching any task with PowerShell, and you will leave with all of the code and demos. You can even follow along if you bring a laptop with an instance of SQL Server installed.

  • How to learn to interact with any technology using PowerShell
  • Understanding the syntax
  • The importance of Get-Help and how PowerShell enables you to help yourself
  • Why to write your own Modules and how to make them available to all of your team
  • A quick, automated method for creating your module framework
  • Unit testing and debugging your code
  • How to continuously deliver changes to your PowerShell modules using GitHub and VSTS
  • Tips and tricks for script writing with the popular open-source community dbatools module
  • How to validate your SQL Server estate with PowerShell
  • Advanced SQL Server Agent and PowerShell management

We will have a lot of fun along the way and you will return to work with a lot of ideas, samples and better habits to become a PowerShell ninja and save yourself and your organisation time and effort.
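
As a small taste of the Get-Help topic above, here is how PowerShell lets you help yourself from the console; a minimal sketch using standard cmdlets, with one dbatools command picked purely as an example:

# Update the local help files (may require an elevated session)
Update-Help -Module dbatools -ErrorAction SilentlyContinue

# List every command the module exports
Get-Command -Module dbatools

# Read just the usage examples for a specific command
Get-Help Get-DbaDatabase -Examples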

Register now

To register, visit the shortlink sqlps.io/precon

- Chrissy


Last week I performed a server migration from SQL Server 2014 to SQL Server 2016 using dbatools. A lot of work is done up front to prepare and plan for these migrations to have the least impact on our customers. So I always do my best to improve the process each time we do it.

Backstory

This is the second product server we have migrated to SQL 2016. The first migration was automated only up to a point:

  1. Configure the servers using some dbatools commands
  2. Take a full backup the day before the migration
  3. Take and apply a differential backup the night of the migration

My comfort level with dbatools was low when we did that migration last year, so the process was a mix of T-SQL and manual steps.

Goal

The goal this time around was to enhance the process with more automation. With that in mind, I decided to use the Invoke-DbaLogShipping command to build log shipping for the primary database in the migration. The current size of the database is 650GB, which is not too outrageous, but it grows by 30GB a day, so by Wednesday my differential backup would have been 90GB. Taking and restoring that backup would have eaten a large chunk of the migration window.

Why Log Shipping?

The reason I decided to go with log shipping is the ease of setup, made even easier by the dbatools commands, along with the ability to minimize downtime, as I pointed out earlier, even for large databases.

Log shipping also supports mixed versions, from SQL Server 2005 up to current versions, and mixed editions like Standard, Workgroup or Enterprise, which other migration options lack. See this great article on using Log Shipping to minimize downtime for more detail.
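
For example, before committing to log shipping between mixed versions, you could confirm what is running on each side. A quick sketch, with hypothetical instance names:

# Compare major versions of the source and destination instances
$source = Connect-DbaInstance -SqlInstance 'oldserver\sql2014'
$destination = Connect-DbaInstance -SqlInstance 'newserver\sql2016'
$source.VersionMajor, $destination.VersionMajor    # e.g. 12 and 13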

Migration Steps

So we used our standard build configuration process to get the servers configured to our standards. This process has not been converted to use dbatools like I had hoped, but time has been my enemy on that project. Currently we have 12, yes 12 environments for each of our products. That means this exercise is done 12 times before it is all said and done.

However, this time we did things a little differently. Our development teams now follow the Agile life cycle and work in two-week sprints. In the first phase we migrated the primary Continuous Integration environment and the main Integration environment using the automated process. Log shipping does not come into play there due to the size of those environments, but it gives us practice and a test of the migration process. Then on the next sprint we migrated the Regression and Production environments. This is where the fun starts.

Last week I decided to use Invoke-DbaLogShipping to get the production database built on the new server and keep it in sync with the current server.

Steps To Build Log Shipping

  1. To get the secondary going, I restored the latest full backup and left the database in NORECOVERY
  2. Then I ran the Invoke-DbaLogShipping command with the parameters that were needed for my environment
  3. Once everything was in place, I monitored the Backup, Copy and Restore SQL Agent jobs

# Restore Latest Full
$restoreDbaDatabaseSplat = @{
    SqlInstance = 'localhost\sql2017'
    DatabaseName = 'WideWorldImporters'
    Path = 'C:\SQLData\Backups\WideWorldImporters_201804011846.bak'
    NoRecovery = $true
    WithReplace = $true
}
Restore-DbaDatabase @restoreDbaDatabaseSplat
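
Before wiring up log shipping, it is worth confirming that the restore left the database in the Restoring state; a minimal check along these lines:

# The secondary copy should report a Status of Restoring
Get-DbaDatabase -SqlInstance 'localhost\sql2017' -Database 'WideWorldImporters' |
    Select-Object Name, Status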


# Configure Log Shipping 
$invokeDbaLogShippingSplat = @{
    SourceSqlInstance = 'localhost\sql2016'
    DestinationSqlInstance = 'localhost\sql2017'
    Database = 'WideWorldImporters'
    BackupNetworkPath = '\\LXDW17181\C$\SQLData\LSBackups'
    BackupLocalPath = 'C:\SQLData\LSBackups'
    BackupScheduleFrequencySubdayType = 'Minutes'
    BackupScheduleFrequencySubdayInterval = 5
    CopyDestinationFolder = 'C:\SQLData\LS'
    CopyScheduleFrequencySubdayType = 'Minutes'
    CopyScheduleFrequencySubdayInterval = 5
    RestoreScheduleFrequencySubdayType = 'Minutes'
    RestoreScheduleFrequencySubdayInterval = 5
    NoInitialization = $true
    NoRecovery = $true
    Force = $true
}
Invoke-DbaLogShipping @invokeDbaLogShippingSplat

Once all the jobs are verified to be working from the proper locations, the built-in Transaction Log Shipping Status report shows that we are current on our log shipping.
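
The same check can be scripted. SQL Server names the log shipping jobs LSBackup_, LSCopy_ and LSRestore_ followed by the database or server name, so one filter on Get-DbaAgentJob covers all three; a sketch assuming the instances used above:

# Review the last outcome of each log shipping agent job
Get-DbaAgentJob -SqlInstance 'localhost\sql2016', 'localhost\sql2017' |
    Where-Object Name -like 'LS*' |
    Select-Object SqlInstance, Name, LastRunDate, LastRunOutcome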

Migration Day

11:00 PM on Wednesday was go time.

  1. Our product site was put into maintenance mode so no new traffic was coming in
  2. I worked through my preparation steps for the migration
  3. Time to verify that the latest log backup was taken, copied and restored on the secondary (see the status check after this list)
  4. Now it was time to bring that secondary copy online
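
For step 3, dbatools includes a status check that compares the most recent backup, copy and restore activity against the configured thresholds; a sketch of what that verification could look like:

# Confirm the log shipping chain is current before cutting over
Test-DbaLogShippingStatus -SqlInstance 'localhost\sql2017' -Database 'WideWorldImporters'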

I did not use the command below, as I had not had time to do any testing, so I used T-SQL. Having tested it since, it would have been even easier: Invoke-DbaLogShippingRecovery disables the log shipping jobs and then brings the database online.

# Recover Log Shipping 
$invokeDbaLogShippingRecoverySplat = @{
    SqlInstance = 'localhost\sql2017'
    Database = 'WideWorldImporters'
    Force = $true
}
Invoke-DbaLogShippingRecovery @invokeDbaLogShippingRecoverySplat
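
Afterwards, the same Get-DbaDatabase check from earlier can confirm the cutover; assuming the names above:

# After recovery, the database should report a Status of Normal
Get-DbaDatabase -SqlInstance 'localhost\sql2017' -Database 'WideWorldImporters' |
    Select-Object Name, Status, IsAccessible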

And just like that, we are done: 17 total minutes of downtime for our product and we are back, with only replication and the AG left to set up after the fact.

Video: Invoke-DbaLogShipping Quick View (YouTube)

Alternative configurations

Considering the number of parameters available in Invoke-DbaLogShipping, there are a number of ways to set up log shipping. Here are a few available configurations using the parameters -UseExistingFullBackup, -GenerateFullBackup and -UseBackupFolder. These help automate the initial backup/restore process.

UseExistingFullBackup

If the database is not yet initialized on the secondary instance, the command can do that for you by selecting an existing full backup and restoring it.

$params = @{
    SourceSqlInstance = 'sql1'
    DestinationSqlInstance = 'sql2'
    Database = 'db1'
    UseExistingFullBackup = $true
    BackupNetworkPath = '\\sql1\logshipping'
    BackupLocalPath = 'D:\Data\logshipping'
    CompressBackup = $true
    Force = $true
}

Invoke-DbaLogShipping @params

GenerateFullBackup

If the database is not yet initialized on the secondary instance, you can also tell the command to take a new full backup and restore it for you.

$params = @{
    SourceSqlInstance = 'sql1'
    DestinationSqlInstance = 'sql2'
    Database = 'db1'
    GenerateFullBackup = $true
    BackupNetworkPath = '\\sql1\logshipping'
    BackupLocalPath = 'D:\Data\logshipping'
    CompressBackup = $true
    Force = $true
}

Invoke-DbaLogShipping @params

UseBackupFolder

Or, if you already have all of your backups in a folder, you can use that as well.

$params = @{
    SourceSqlInstance = 'sql1'
    DestinationSqlInstance = 'sql2'
    Database = 'db1'
    UseBackupFolder "C:\SQL\Backup"
    BackupNetworkPath = '\\sql1\logshipping'
    BackupLocalPath = 'D:\Data\logshipping'
    CompressBackup = $true
    Force = $true
}

Invoke-DbaLogShipping @params

If you have any migration in your future, or want to learn more, please take a look at the Log Shipping commands that dbatools has to offer. You can use the command help for more detail.

Get-Help Invoke-DbaLogShipping -Detailed
Get-Help Invoke-DbaLogShippingRecovery -Detailed
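
Standard command discovery also works for finding the whole family of log shipping commands:

Get-Command -Module dbatools -Name '*LogShipping*'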

Additional Information

Check out the multi-part series on the log shipping commands by the author of most of these awesome commands, Sander Stad (b|t).

Thanks for reading,
Garry (@gbargsley)


Back around the time we did SQL Bits, Rob Sewell spoke to Richard Campbell at RunAs Radio.

The show, Managing SQL Server using dbatools with Rob Sewell, is a fun and fantastic listen. Not only does Rob talk at length about the migration capabilities in dbatools, but also other cool things like CI/CD, containers and more. Check it out!

dbatools by Chrissy Lemaire

Nearly two years ago, I wrote that as an open source, GPL-licensed project, dbatools was free as in speech and free as in beer.

I chose the GNU GPL for nostalgic reasons, and I also wanted companies who used dbatools to contribute back to the project. What I didn't realize was that licensing dbatools as GPL would hurt integration, which ultimately impacts adoption. Permissive licenses like the MIT license don't subject users to such restrictions.

Many of the tools we use on a daily basis, like VS Code and even .NET, are MIT licensed, and MIT is the most popular license on GitHub.

Considering this and our lessons-learned from dbatools, we released dbachecks under the MIT license. The reception at SQL Bits was amazing. A number of comments were made expressing thanks and optimism about our selection of the MIT license.

And, after speaking with a number of community members about the restrictive GPL license, it became clear that dbatools must also switch to MIT.

As of version 0.9.300, dbatools is now MIT licensed! If you would like to continue using the GPL version, please fork v0.9.211 and take it from there.

If your project benefits from this switch, please comment in the section below to let us know!
