Through online courses, blogging, articles, whatever medium it takes, I really enjoy teaching and helping others. Like a lot of people, I've gathered plenty of wisdom and knowledge throughout my years in IT, and in life, really. I like to give back and help other people do the same. What I've found is that there have been times when I wanted to learn something new. For example, Python; I didn't know a lot about Python, but one day I decided I wanted to learn it. Instead of opening up a book, watching some big online course, or reading articles, I decided I was going to learn by teaching.

I have a lot of clients that I do freelance writing for, so I approached Ipswitch (a blog that I write for) and said I wanted to do some articles on Python; they said sure. I had no idea what I was getting into, but I committed myself to writing about Python. Through that, I found that the knowledge I acquired sank in a lot more than it would have if I'd tried to learn it on my own.

Having to absorb the knowledge you've gained and then write it down, talk about it, describe it, that act of explaining what you learned to someone else, for some reason it clicks in your brain. It stays in there so much longer than it would have if you'd just tried to learn it on your own and never tried to teach it.

Teaching somehow has this way of solidifying the knowledge in your head, along with the training you've received through lots of different mediums. I say that because there are so many great things about training and teaching, not only for learning, but for being able to evolve in your career.

A Little Honesty

To be honest with you, just a few years ago I was a typical new IT pro, a sysadmin: go to work at 8, come home at 5, and just kind of do my own thing, not really getting involved with the community. I thought that was all great, not doing any training or anything like that. Once I started to get involved with more people and tried to learn and teach these things, it not only helped me learn more, it helped me learn so much more, and so much faster, than I normally would have.

It also allowed me to learn more because so many opportunities popped up during that time. That was when I started blogging and doing courses, and things like that opened up so many more opportunities for me. I was eventually able to get to where I am now, where I quit my six-figure job and I'm loving what I'm doing. I really want to stress that to people: just by sharing your knowledge, you'll realize you have so much knowledge in your head that you don't even know about.

Even if you're only a couple of years out of college, maybe in a helpdesk position or working as a computer tech, you have knowledge that nobody else in the world has, or that others haven't thought about in a long time. That knowledge is valuable, and I really, really want people to know this: you have so much knowledge in your head, and it helps in so many different ways. You don't have to be an expert.

It's like I said with the Python stuff: I don't know much about Python; all I know is that I'm learning. I have a process I can go through to learn something: learn and teach, learn and teach. At this point it's automatic. Whenever I learn something, I automatically think, "Oh, I have to write a blog post," or "I have to do a TechSnip on my new TechSnips.io service." I have to do something like that; I have to give back and share it in some way.

That not only lets me learn things so much faster and so much better, and really solidifies that learning, but it also helps others. And, in the right places (TechSnips.io, for example), you can also get paid for those sorts of things. Your knowledge is valuable; you just have to get it out of your head and into some kind of medium, some kind of format where you can start sharing it, and then enjoy all the other great things that happen as a result.

If this concept intrigues you, there are a lot of different ways to start teaching. You can start blogging or writing articles; there are a lot of different mediums you can use. You can do online courses through Udemy, Pluralsight, or LinkedIn Learning; there are a lot of different platforms. You can teach people through podcasts, although podcasts aren't the best way to teach something like code or IT, because people need to follow along.

There are a lot of different ways to get started. You can do video, audio, or written text, and you can do things for free or get paid. It's completely up to you; there are a lot of options out there. This is a topic I'm really passionate about, and it's one of the reasons I started my TechSnips.io service: to help people get started through screencasting.

Definitely, if you want to take your career to the next level, and you're in technology in any shape, form, or fashion, I highly encourage you to not only do your job, obviously, but to take a few hours every week and give back in some shape, form, or fashion. If you do, it's going to help your career in so many different ways and open up so many opportunities that you may not have even fathomed before. Thanks!

This post was brought to you by yet another #CarTalks YouTube video. Be sure to check out all of the other #CarTalks videos and other video content on the Adam the Automator YouTube channel!

The post Learning by Teaching – The Shortcut to a More Successful Career appeared first on Adam the Automator - DevOps, Automation, PowerShell.


If you were here Wednesday, then perhaps you already read Get the Total Parameter Count. If not, I'll quickly tell you about it. I wanted to know how many parameters a function had, whether or not those parameters were used when the function was invoked. I did this by using Get-Help against the function, from within the function. Yeah, I thought it was clever too, but the best part is that it gave me the results I was after. If a function had three parameters, that was indicated in the function's output. Same goes for when it had two parameters, but of course in that instance, it indicated two.

In going along with that post, I had an idea. I wanted to create a way to dynamically build a function with a random number of parameters between one and ten. Then, I could prove my code was working. Sure, I could've used Pester, but I was after making this work: this whole "create a function with a random number of parameters" thing. I needed to figure out how to get the code for a function, stored inside a variable, into an actual PowerShell function. That might've been confusing, but the example code in this post will probably prove helpful.

I'll show you what I started experimenting with for discovery purposes, and then we'll jump into the code that actually creates my function with a random number of parameters in a soon-to-be-released second part to this post.

First, we'll start with a here-string variable assignment. Using a here-string allows us to include multiple line breaks within our variable. As you should be able to see, the value of $FunctionCode below is the code that makes up a simple function. It includes the function keyword, a function name, an open curly brace, the function's actual code (there are three lines of it), and a closing curly brace as well.

$FunctionCode = @'
Function Show-Info {
    '*****'
    'This is function Show-Info.'
    '-----'
}
'@

As expected, when I echo my variable’s value, I get back exactly what I put into it.

PS > $FunctionCode
Function Show-Info {
    '*****'
    'This is function Show-Info.'
    '-----'
}

Now, for the fun part. I’ll include all my code and then we can discuss it line by line below.

$FunctionCode = $FunctionCode -split '\n'            # Split the here-string into an array of lines
$FunctionCodeCount = $FunctionCode.Count - 2         # Index of the last line of the function body
$FunctionCode = $FunctionCode[1..$FunctionCodeCount] # Keep only the body lines
$FunctionCode = $FunctionCode | Out-String           # Join the lines back into one string with line breaks

The first of the above four lines reassigns $FunctionCode with its own value split on each new line. The second line creates a $FunctionCodeCount variable, assigning it the number of lines in $FunctionCode minus two. See if you can figure out why it's two…

The third line reassigns $FunctionCode again. This time we keep only the contents of the function: it drops the first line, which includes the function keyword, the function name, and the open curly brace, and it also drops the closing curly brace at the bottom. Our final line reassigns $FunctionCode for a third time, piping its current value to Out-String. This ensures we add our line breaks back in.

Before we create our Show-Info function, let’s take a look at the $FunctionCode variable value now.

PS > $FunctionCode
    '*****'
    'This is function Show-Info.'
    '-----'

Using Set-Item, we'll create our function called Show-Info regardless of whether or not it already exists. That's the difference between New-Item and Set-Item (and most New vs. Set commands): New-Item will only work if Show-Info doesn't already exist, while Set-Item will work even if the function already exists. If it doesn't, Set-Item acts like New-Item and creates the function.

Set-Item -Path Function:\Show-Info -Value $FunctionCode

And finally, entering Show-Info invokes the newly created function.

PS > Show-Info
*****
This is function Show-Info.
-----
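
If you'd like to see the New-Item versus Set-Item difference for yourself, here's a minimal sketch. It assumes Show-Info already exists in the session, as it does at this point in the post.

PS > New-Item -Path Function:\Show-Info -Value $FunctionCode # Errors: the function already exists
PS > Set-Item -Path Function:\Show-Info -Value $FunctionCode # Succeeds: overwrites the existing function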

Okay, with all that out of the way, we're going to hit the pause button. Be sure you get what's happening here, because we'll pick up on this post in a very soon-to-be-released follow-up that includes the rest of what we're after: creating a function with a random number of parameters and testing that we can calculate that number correctly. If you see an area in which I can improve this, please let me know!

Before we sign off though, let me include all of the above code in a single code block below. That may end up being helpful to one of us.

Remove-Variable -Name FunctionCode,FunctionCodeCount -ErrorAction SilentlyContinue

$FunctionCode = @'
Function Show-Info {
    '*****'
    'This is function Show-Info.'
    '-----'
}
'@

$FunctionCode = $FunctionCode -split '\n'
$FunctionCodeCount = $FunctionCode.Count - 2
$FunctionCode = $FunctionCode[1..$FunctionCodeCount]
$FunctionCode = $FunctionCode | Out-String

Set-Item -Path Function:\Show-Info -Value $FunctionCode

Show-Info


There are several scenarios where you might need to assign an Office 365 license to a user. The specific scenario in this blog article is that you're migrating an Exchange Server 2010 on-premises environment to Office 365. The Exchange Server is already in hybrid mode, and users have been created in Office 365 automatically by synchronizing them from your on-premises Active Directory environment using Azure AD Connect. Users who haven't already had their mailbox moved to Office 365 will first need an Office 365 license assigned to them, and before a license can be assigned, a usage location must be set on their individual account.

This blog article is written using Windows 10 Enterprise Edition version 1803 and Windows PowerShell version 5.1. The examples shown in this blog article will not work with PowerShell Core. Your mileage may vary with other operating systems and other versions of PowerShell.

First, you'll need the cmdlets to perform these actions. Find the MSOnline module in the PowerShell Gallery and install it from there. Store your Office 365 credentials, with sufficient access to perform these tasks, in a variable, then connect to your Office 365 account. This is the part that will generate an error if you're using PowerShell Core.

Check to see if you have more than one Office 365 subscription, then store the specific account SKU with the licenses for the Office 365 subscription to assign to users in a variable. Next, find the users to assign licenses to and store them in a variable. I found it useful to narrow these results down by filtering left with the UserPrincipalName and/or Department parameters of Get-MsolUser. In my environment, John Doe does not currently have a license assigned. Assign a usage location, then assign an Office 365 license. A license has now been assigned to John Doe.

Although a single user was assigned a license here, with the exception of that one command, the code as written in this blog article can be used to assign licenses to multiple users.
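
The code blocks from the original article didn't survive the feed, so here's a minimal sketch of the sequence described above using the MSOnline module. The SKU filter, department, and usage location values are hypothetical placeholders; substitute your own.

# Find and install the MSOnline module from the PowerShell Gallery
Find-Module -Name MSOnline
Install-Module -Name MSOnline

# Store credentials with sufficient access in a variable, then connect.
# Connect-MsolService is the part that errors on PowerShell Core.
$Credential = Get-Credential
Connect-MsolService -Credential $Credential

# Check whether more than one subscription exists, then store the SKU to assign
Get-MsolAccountSku
$SKU = Get-MsolAccountSku | Where-Object -Property AccountSkuId -Like '*ENTERPRISEPACK'

# Filter left to the users who still need a license (hypothetical department)
$Users = Get-MsolUser -Department 'Accounting' -UnlicensedUsersOnly

foreach ($User in $Users) {
    # A usage location must be set before a license can be assigned
    Set-MsolUser -UserPrincipalName $User.UserPrincipalName -UsageLocation 'US'
    Set-MsolUserLicense -UserPrincipalName $User.UserPrincipalName -AddLicenses $SKU.AccountSkuId
}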

PowerShell Join Operator

In this tutorial we will go through the PowerShell -join operator. The -join operator combines multiple strings into one. Strings are combined in the order in which they appear....
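
As a quick taste of what the tutorial covers, here's a minimal sketch of both forms of the operator:

PS > -join ('a','b','c')      # Unary form, no delimiter: abc
PS > ('a','b','c') -join ', ' # Binary form with a delimiter: a, b, c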

Do you work with PowerShell? In this blog you will find a lot of information to help you out with it.

It's been a little longer than normal since I last wrote. My entire week last week didn't include a post; it's so weird. Well, I'm taking a quick minute (seriously) to discuss something I wanted to do recently.

In a new project I'm on, I wanted to know how many parameters a function has from inside the function itself. I didn't want to know how many parameters were used when the function was invoked. Had I, I would've used $PSBoundParameters. Again, I wanted to know how many parameters the function had, whether they were used or not.

I'll tell you what I opted to use. On that note, do let me know if you come up with a better idea; I didn't give this a ton of time. That said, it doesn't even have to be better; I'd still like to hear about it. I opted to execute a Get-Help command against the function, from inside the function. I'm making this a quick post, so let's jump to some code.

Function Check-ParameterCount {
    [CmdletBinding()]
    Param (
        [Parameter()]
        [string]$Param01,

        [Parameter()]
        [string]$Param02,

        [Parameter()]
        [ValidateSet('Source','Destination')]
        [string]$Param03
    )

    $ParameterCount = (Get-Help -Name Check-ParameterCount).Parameters.Parameter.Count
    "There are $ParameterCount possible parameter(s) in this function."
}
PS > Check-ParameterCount
There are 3 possible parameter(s) in this function.

The next example is the same as the one above; however, we've removed the third parameter. Sure enough, the value is now two. As you may have noticed, Get-Help gets its parameter count from the actual parameters themselves. Neither function has any comment-based help, so we can determine that it doesn't rely on any static help we might include in regard to the parameters.

Function Check-ParameterCount {

    [CmdletBinding()]
    Param (
        [Parameter()]
        [string]$Param01,

        [Parameter()]
        [string]$Param02
    )

    $ParameterCount = (Get-Help -Name Check-ParameterCount).Parameters.Parameter.Count
    "There are $ParameterCount possible parameter(s) in this function."
}
PS > Check-ParameterCount
There are 2 possible parameter(s) in this function.

That was it. Again, let me know if there’s a better way.
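
If you're after a starting point for that better way, one idea (a rough, untested sketch) is to read the function's parameter metadata directly and filter out the common parameters that [CmdletBinding()] adds:

$Common = [System.Management.Automation.PSCmdlet]::CommonParameters
$ParameterCount = ($MyInvocation.MyCommand.Parameters.Keys |
    Where-Object {$_ -notin $Common}).Count

This avoids the help system entirely, though I haven't compared its behavior against the Get-Help approach here.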


Regular visitors to this blog are used to seeing PowerShell and DevOps content, so this is a bit of a divergence: it's written in C#, and it's a .NET Core MVC Azure Web App. But if it found itself on my plate, maybe it will find itself on yours. I was tasked with writing an Azure Web App that users would visit and sign into using their Azure Active Directory (i.e., "Work or School") account, to test whether their Conditional Access and MFA were configured properly. Once logged in, a little information about the user is displayed.

Here’s how to pop all the claim information for an authenticated user into a Razor Page.

I decided to put the whole thing into an HTML table to make it a bit more readable. It's kind of a challenge to differentiate between the claim name and the value if they aren't aligned nicely. From there, make sure you're using System.Security.Claims, and you can write yourself this foreach loop.

<table>
    @foreach (var claim in ((ClaimsIdentity)User.Identity).Claims)
    {
        <tr>
            <td>@claim.Type</td>
            <td>@claim.Value</td>
        </tr>
    }
</table>

It's not a big mind blower. This is a .cshtml document, so we can write HTML and mix in some inline C#. Using the ClaimsIdentity class, we can write a foreach loop over each claim in the identity of the currently logged-in user. This assumes that the user isn't logged in more than once (i.e., Facebook and Twitter and Azure AD).

Then I'm making a new row in my table for each claim, with separate cells for the claim type, which is the name of the claim, and the claim value.

Nice and concise!


I gave a presentation at SQL Day in Poland last week on dbachecks, and one of the questions I was asked was: will you write a command to put the results of the checks into a database for historical reporting?

The answer is no, and here is the reasoning. The capability is already there. Most good PowerShell commands will only return an object, and the beauty of an object is that you can do anything you like with it. Your only limit is your imagination; I have written about this before here. The other reason is that it would be very difficult to write something easily configurable for the different requirements people will have. But here is one way of doing it.

Create a configuration and save it

Let’s define a configuration and call it production. This is something that I do all of the time so that I can easily run a set of checks with the configuration that I want.

# The computername we will be testing
Set-DbcConfig -Name app.computername -Value $sql0,$SQl1
# The Instances we want to test
Set-DbcConfig -Name app.sqlinstance -Value $sql0,$SQl1
# The database owner we expect
Set-DbcConfig -Name policy.validdbowner.name -Value 'THEBEARD\EnterpriseAdmin'
# the database owner we do NOT expect
Set-DbcConfig -Name policy.invaliddbowner.name -Value 'sa'
# Should backups be compressed by default?
Set-DbcConfig -Name policy.backup.defaultbackupcompression -Value $true
# Do we allow DAC connections?
Set-DbcConfig -Name policy.dacallowed -Value $true
# What recovery model should we have?
Set-DbcConfig -Name policy.recoverymodel.type -value FULL
# What should our database growth type be?
Set-DbcConfig -Name policy.database.filegrowthtype -Value kb
# What authentication scheme are we expecting?
Set-DbcConfig -Name policy.connection.authscheme -Value 'KERBEROS'
# Which Agent Operator should be defined?
Set-DbcConfig -Name agent.dbaoperatorname -Value 'The DBA Team'
# Which Agent Operator email should be defined?
Set-DbcConfig -Name agent.dbaoperatoremail -Value 'TheDBATeam@TheBeard.Local'
# Which failsafe operator should be defined?
Set-DbcConfig -Name agent.failsafeoperator -Value 'The DBA Team'
## Set the database mail profile name
Set-DbcConfig -Name agent.databasemailprofile -Value 'DbaTeam'
# Where is the whoisactive stored procedure?
Set-DbcConfig -Name policy.whoisactive.database -Value master
# What is the maximum time since I took a Full backup?
Set-DbcConfig -Name policy.backup.fullmaxdays -Value 7
# What is the maximum time since I took a DIFF backup (in hours) ?
Set-DbcConfig -Name policy.backup.diffmaxhours -Value 26
# What is the maximum time since I took a log backup (in minutes)?
Set-DbcConfig -Name policy.backup.logmaxminutes -Value 30
# What is my domain name?
Set-DbcConfig -Name domain.name -Value 'TheBeard.Local'
# Where is my Ola database?
Set-DbcConfig -Name policy.ola.database -Value master
# Which database should not be checked for recovery model
Set-DbcConfig -Name policy.recoverymodel.excludedb -Value 'master','msdb','tempdb'
# Should I skip the check for temp files on c?
Set-DbcConfig -Name skip.tempdbfilesonc -Value $true
# Should I skip the check for temp files count?
Set-DbcConfig -Name skip.tempdbfilecount -Value $true
# Which Checks should be excluded?
Set-DbcConfig -Name command.invokedbccheck.excludecheck -Value LogShipping,ExtendedEvent,PseudoSimple,SPN,TestLastBackupVerifyOnly,IdentityUsage,SaRenamed
# How many months before a build is unsupported do I want to fail the test?
Set-DbcConfig -Name policy.build.warningwindow -Value 6
## I need to set the app.cluster configuration to one of the nodes for the HADR check
## and I need to set the domain.name value
Set-DbcConfig -Name app.cluster -Value $SQL0
Set-DbcConfig -Name domain.name -Value 'TheBeard.Local'
## I also skip the ping check for the listener as we are in Azure
Set-DbcConfig -Name skip.hadr.listener.pingcheck -Value $true
Now I can export that configuration to a JSON file and store it on a file share or in source control using the code below. This makes it easy to embed the checks into an automation solution.
Export-DbcConfig -Path Git:\Production.Json
and then I can use it with:
Import-DbcConfig -Path Git:\Production.Json
Invoke-DbcCheck
I would use one of the Show parameter values here if I were running it at the command line, probably Fails, to make reading the information easier.
Add results to a database
This only gets us the test results on the screen, so if we want to save them to a database we have to use the PassThru parameter of Invoke-DbcCheck. I will run the checks again and save the results to a variable:
$Testresults = Invoke-DbcCheck -PassThru -Show Fails

Then I can use the dbatools Write-DbaDataTable command to write the results to a table in a database. I need to do this twice: once for the summary and once for the test results.

$Testresults | Write-DbaDataTable -SqlInstance $sql0 -Database tempdb -Table Prod_dbachecks_summary -AutoCreateTable
$Testresults.TestResult | Write-DbaDataTable -SqlInstance $sql0 -Database tempdb -Table Prod_dbachecks_detail -AutoCreateTable

and I get two tables: one for the summary and one for the details.

This works absolutely fine, and I could continue to add test results in this fashion, but it has no date property, so it is not so useful for reporting.
Create tables and triggers

This is one way of doing it. I am not sure it is the best way, but it works! I always look forward to seeing how people take ideas and move them forward, so if you have a better or different solution, please blog about it and reference it in the comments below.

First I created a staging table for the summary results

CREATE TABLE [dbachecks].[Prod_dbachecks_summary_stage](
	[TagFilter] [nvarchar](max) NULL,
	[ExcludeTagFilter] [nvarchar](max) NULL,
	[TestNameFilter] [nvarchar](max) NULL,
	[TotalCount] [int] NULL,
	[PassedCount] [int] NULL,
	[FailedCount] [int] NULL,
	[SkippedCount] [int] NULL,
	[PendingCount] [int] NULL,
	[InconclusiveCount] [int] NULL,
	[Time] [bigint] NULL,
	[TestResult] [nvarchar](max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

and a destination table with a primary key and a date column which defaults to today's date

CREATE TABLE [dbachecks].[Prod_dbachecks_summary](
	[SummaryID] [int] IDENTITY(1,1) NOT NULL,
	[TestDate] [date] NOT NULL,
	[TagFilter] [nvarchar](max) NULL,
	[ExcludeTagFilter] [nvarchar](max) NULL,
	[TestNameFilter] [nvarchar](max) NULL,
	[TotalCount] [int] NULL,
	[PassedCount] [int] NULL,
	[FailedCount] [int] NULL,
	[SkippedCount] [int] NULL,
	[PendingCount] [int] NULL,
	[InconclusiveCount] [int] NULL,
	[Time] [bigint] NULL,
	[TestResult] [nvarchar](max) NULL,
 CONSTRAINT [PK_Prod_dbachecks_summary] PRIMARY KEY CLUSTERED 
(
	[SummaryID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_summary] ADD  CONSTRAINT [DF_Prod_dbachecks_summary_TestDate]  DEFAULT (getdate()) FOR [TestDate]
GO

and added an INSERT trigger to the staging table

CREATE TRIGGER [dbachecks].[Load_Prod_Summary] 
   ON   [dbachecks].[Prod_dbachecks_summary_stage]
   AFTER INSERT
AS 
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

    INSERT INTO [dbachecks].[Prod_dbachecks_summary] 
	([TagFilter], [ExcludeTagFilter], [TestNameFilter], [TotalCount], [PassedCount], [FailedCount], [SkippedCount], [PendingCount], [InconclusiveCount], [Time], [TestResult])
	SELECT [TagFilter], [ExcludeTagFilter], [TestNameFilter], [TotalCount], [PassedCount], [FailedCount], [SkippedCount], [PendingCount], [InconclusiveCount], [Time], [TestResult] FROM [dbachecks].[Prod_dbachecks_summary_stage]

END
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_summary_stage] ENABLE TRIGGER [Load_Prod_Summary]
GO

and for the details I do the same thing: a details table
CREATE TABLE [dbachecks].[Prod_dbachecks_detail](
	[DetailID] [int] IDENTITY(1,1) NOT NULL,
	[SummaryID] [int] NOT NULL,
	[ErrorRecord] [nvarchar](max) NULL,
	[ParameterizedSuiteName] [nvarchar](max) NULL,
	[Describe] [nvarchar](max) NULL,
	[Parameters] [nvarchar](max) NULL,
	[Passed] [bit] NULL,
	[Show] [nvarchar](max) NULL,
	[FailureMessage] [nvarchar](max) NULL,
	[Time] [bigint] NULL,
	[Name] [nvarchar](max) NULL,
	[Result] [nvarchar](max) NULL,
	[Context] [nvarchar](max) NULL,
	[StackTrace] [nvarchar](max) NULL,
 CONSTRAINT [PK_Prod_dbachecks_detail] PRIMARY KEY CLUSTERED 
(
	[DetailID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail]  WITH CHECK ADD  CONSTRAINT [FK_Prod_dbachecks_detail_Prod_dbachecks_summary] FOREIGN KEY([SummaryID])
REFERENCES [dbachecks].[Prod_dbachecks_summary] ([SummaryID])
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail] CHECK CONSTRAINT [FK_Prod_dbachecks_detail_Prod_dbachecks_summary]
GO

A staging table

CREATE TABLE [dbachecks].[Prod_dbachecks_detail_stage](
	[ErrorRecord] [nvarchar](max) NULL,
	[ParameterizedSuiteName] [nvarchar](max) NULL,
	[Describe] [nvarchar](max) NULL,
	[Parameters] [nvarchar](max) NULL,
	[Passed] [bit] NULL,
	[Show] [nvarchar](max) NULL,
	[FailureMessage] [nvarchar](max) NULL,
	[Time] [bigint] NULL,
	[Name] [nvarchar](max) NULL,
	[Result] [nvarchar](max) NULL,
	[Context] [nvarchar](max) NULL,
	[StackTrace] [nvarchar](max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

with a trigger

CREATE TRIGGER [dbachecks].[Load_Prod_Detail] 
   ON   [dbachecks].[Prod_dbachecks_detail_stage]
   AFTER INSERT
AS 
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

    INSERT INTO [dbachecks].[Prod_dbachecks_detail] 
([SummaryID],[ErrorRecord], [ParameterizedSuiteName], [Describe], [Parameters], [Passed], [Show], [FailureMessage], [Time], [Name], [Result], [Context], [StackTrace])
	SELECT 
	(SELECT MAX(SummaryID) From [dbachecks].[Prod_dbachecks_summary]),[ErrorRecord], [ParameterizedSuiteName], [Describe], [Parameters], [Passed], [Show], [FailureMessage], [Time], [Name], [Result], [Context], [StackTrace]
	FROM [dbachecks].[Prod_dbachecks_detail_stage]

END
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail_stage] ENABLE TRIGGER [Load_Prod_Detail]
GO

Then I can use Write-DbaDataTable with a couple of extra parameters: FireTriggers to run the trigger, plus Truncate and Confirm:$false to avoid any confirmation, because I want this to run without any interaction. With that, I can get the results into the database.

$Testresults | Write-DbaDataTable -SqlInstance $Instance -Database $Database -Schema dbachecks -Table Prod_dbachecks_summary_stage -FireTriggers -Truncate -Confirm:$False
$Testresults.TestResult | Write-DbaDataTable -SqlInstance $Instance -Database $Database -Schema dbachecks -Table Prod_dbachecks_detail_stage -FireTriggers -Truncate -Confirm:$False

This means that I can now query some of this data and also create Power BI reports from it.
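
For example, a quick look at pass and failure counts over time might look something like this (a sketch; the table and column names match the DDL above, but adjust the instance, database, and schema to suit your environment):

$query = "
SELECT TestDate, SUM(PassedCount) AS Passes, SUM(FailedCount) AS Failures
FROM [dbachecks].[Prod_dbachecks_summary]
GROUP BY TestDate
ORDER BY TestDate;"
Invoke-DbaSqlQuery -SqlInstance $SQL0 -Database ValidationResults -Query $query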

To enable me to have results for the groups in dbachecks, I have to do a little bit of extra manipulation. I can add all of the checks to the database using:

Get-DbcCheck | Write-DbaDataTable -SqlInstance $sql0 -Database ValidationResults -Schema dbachecks -Table Checks -Truncate -Confirm:$False -AutoCreateTable

But because the Ola Hallengren job names are configuration items, I need to update the values for those checks, which I can do as follows:

$query = "
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.systemfull) + "' WHERE [Describe] = 'Ola - `$SysFullJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserFull) + "' WHERE [Describe] = 'Ola - `$UserFullJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserDiff) + "' WHERE [Describe] = 'Ola - `$UserDiffJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserLog) + "' WHERE [Describe] = 'Ola - `$UserLogJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.CommandLogCleanup) + "' WHERE [Describe] = 'Ola - `$CommandLogJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.SystemIntegrity) + "' WHERE [Describe] = 'Ola - `$SysIntegrityJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserIntegrity) + "' WHERE [Describe] = 'Ola - `$UserIntegrityJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserIndex) + "' WHERE [Describe] = 'Ola - `$UserIndexJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.OutputFileCleanup) + "' WHERE [Describe] = 'Ola - `$OutputFileJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.DeleteBackupHistory) + "' WHERE [Describe] = 'Ola - `$DeleteBackupJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.PurgeBackupHistory) + "' WHERE [Describe] = 'Ola - `$PurgeBackupJobName'
"
Invoke-DbaSqlQuery -SqlInstance $SQL0 -Database ValidationResults -Query $query

You can get a sample Power BI report in my GitHub repository, which also has the code from this blog post.

Then you just need to open it in Power BI Desktop and:

Click Edit Queries
Click Data Source Settings
Click Change Source
Change the Instance and Database names

Then you'll have an interactive report like this one. Feel free to click around and see how it works. Use the arrows at the bottom right to go full-screen.

This enables me to filter the results and see what has happened in the past so I can filter by one..

PowerShell Arithmetic Operators

In this tutorial we will see how PowerShell deals with arithmetic operators. Some of the arithmetic operators also work on...
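
As a quick preview of what the tutorial covers, here's a minimal sketch (the last line shows an operator working on a non-numeric type):

PS > 2 + 3      # 5
PS > 7 % 3      # 1 (modulus)
PS > 2 / 4      # 0.5 (division widens to a double when needed)
PS > 'Na' * 2   # NaNa (multiplication also works on strings)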

Do you work with PowerShell? In this blog you will find a lot of information to help you out with it.

Have you ever run across a PowerShell cmdlet that produces really nice default console output in table format that you would like to save as a .csv file? You might think it’s as simple as piping the output to Export-Csv. And sometimes it is. But the cmdlet author determines what is output to the console. If that default output contains properties from nested objects returned by the cmdlet, you will have to access those nested objects to create a useful export to .csv.

That means you need to be comfortable with PowerShell objects. Since objects can themselves contain collections of objects, it's important to understand how to inspect object properties from the PowerShell console. While a .NET app developer might be quite comfortable with objects-within-objects, DevOps folks, administrators, and cloud architects (the target audience for PowerShell) might not be.

Let’s walk through an example using Get-AzureRmVm. Follow this and you’ll get a “two-fer”. You’ll understand PowerShell objects better and you’ll be able to create a script to inventory your Azure VMs with any set of details you wish.

The Get-AzureRmVm Azure PowerShell cmdlet’s author decided to include information on the OS type and the sizes of the VMs. Here’s an example of the default response.

Get-AzureRmVm default cmdlet output

If you just want to save the console output, you can pipe the output of Get-AzureRmVm to Out-File, as shown here.

Pipe PowerShell default output to disk via Out-File

But if you pipe the output to Export-Csv, something else happens:

Exporting cmdlet output to .csv

You end up with the name of an embedded object in the .csv file, not the actual data. What’s going on?

Before answering that question, allow me to introduce the most useful cmdlet in PowerShell: Get-Member. The doc simply says that this cmdlet “Gets the properties and methods of objects.” That’s quite an understatement. In my PowerShell work, I use gm (the alias for Get-Member) more than any other cmdlet, bar none. That’s because it’s a quick and handy way of discovering what objects are contained within higher level objects. Let’s use Get-Member with Get-AzureRmVm.
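
For instance, piping the cmdlet's output straight to Get-Member lists everything the returned objects expose (a minimal sketch of the inspection shown below):

PS > Get-AzureRmVM | Get-Member -MemberType Property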

Piping the output of Get-AzureRmVm to Get-Member retrieves a collection of one or more objects of type Microsoft.Azure.Commands.Compute.Models.PSVirtualMachineList. This object contains properties that are themselves objects of varying types. Note in this screenshot that the properties HardwareProfile and StorageProfile are nested objects contained in the PSVirtualMachineList object.

Using PowerShell Get-Member to find the properties of an object

Now we can see why simply piping Get-AzureRmVm output to Export-Csv produces object type names instead of the properties of those objects: we need to be specific about which properties of the nested objects we want Export-Csv to use.

To work down the list of objects, we can simply pipe the nested object to Get-Member exactly the same way to see what its properties and methods are. I call this “PowerShell object sleuthing” and, while you can find the definitions of objects’ properties online, it’s just simpler and faster to use Get-Member. Here’s an example of discovering the properties of the StorageProfile object returned by Get-AzureRmVm. Notice that the object Microsoft.Azure.Management.Compute.Models.StorageProfile contains yet another object of type Microsoft.Azure.Management.Compute.Models.OSDisk. It’s this last object that contains a property (OsType) that we liked in the default console output and which we want to store in a .csv.
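
That drill-down looks something like this (a sketch, assuming at least one VM in the current subscription):

PS > $vm = (Get-AzureRmVM)[0]
PS > $vm.StorageProfile | Get-Member -MemberType Property
PS > $vm.StorageProfile.OsDisk.OsType   # The property we want in the .csv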

Finding the operating system of an Azure VM

Now that we have discovered how to navigate the properties of nested PowerShell objects, we can write a simple script to retrieve those properties for each Azure VM and create a nicely formatted .csv.

Here's an example PowerShell script that mimics the console output of Get-AzureRmVm. In this script, all VMs in a subscription are passed to the pipeline, where we retrieve and format the properties we are interested in, add them to an [ordered] hash table, and use that table as the properties of a custom PowerShell object, which represents a row of output in our .csv. An array containing our custom objects is piped to Export-Csv, producing precisely the output we want.

<#
    .SYNOPSIS
        Lists all VMs in an Azure subscription and creates an output .csv with selected properties of the VMs
    
    .EXAMPLE
        .\ListAzureVms.ps1
    
    .NOTES
        Assumes pwsh session is logged in (Login-AzureRmAccount) and has selected the proper subscription (Select-AzureRmSubscription)

        Alex Neihaus 2018-05-18
        (c) 2018 Air11 Technology LLC -- licensed under the Apache OpenSource 2.0 license, https://opensource.org/licenses/Apache-2.0
        Licensed under the Apache License, Version 2.0 (the "License");
        you may not use this file except in compliance with the License.
        You may obtain a copy of the License at
        http://www.apache.org/licenses/LICENSE-2.0
        
        Unless required by applicable law or agreed to in writing, software
        distributed under the License is distributed on an "AS IS" BASIS,
        WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
        See the License for the specific language governing permissions and
        limitations under the License.
        
        Author's blog: https://www.yobyot.com
#>
$objarray = @() # Create array to hold objects to be written to .csv
Get-AzureRmVM | `
ForEach-Object -Process { `
    $os = $($_.StorageProfile.OsDisk.OsType) # Retrieve the OS the VM is running
    $vmsize = $($_.HardwareProfile.VmSize)
    $tags = $_.Tags | ForEach-Object -Process { [string]::Join(";", $_) } # "Flatten" the tags in the Tags object into a single column for export
    $hash = [ordered]@{
        "VmName"   = $_.Name;
        "VmSize"   = $vmsize;
        "ResourceGroup" = $_.ResourceGroupName;
        "Region"   = $_.Location;
        "OperatingSystem" = $os;
        "Tags"     = $tags;
    }
    $obj = New-Object -TypeName System.Management.Automation.PSObject -Property $hash # A PSObject cannot be [ordered], so create an ordered hash table then make a collection of them
    $objarray += $obj
}
$objarray | Sort-Object -Property VmName | Export-Csv "$HOME\Desktop\VMsAsOf-$(Get-Date -format FileDateTime).csv" -NoTypeInformation

Now you know how to find the properties you might want for an inventory of your Azure VMs and how to automatically produce that inventory.

I hope you find this useful. As always, comments are welcome.


With the latest release of dbachecks, we have added a new check for testing that foreign keys and constraints are trusted, thanks to Cláudio Silva (b | t).

To get the latest release you will need to run

Update-Module dbachecks

You should do this regularly as we release new improvements frequently.

We have also added better descriptions for the checks, which was suggested by the same person who inspired the previous improvement I blogged about here.

Instead of the description just being the name of the check, it is now more of a, well, a description, really.

This has the added effect that just running Get-DbcCheck at the command line will not fit all of the information on a normal screen.

You can use the Format-Table command (or its alias ft at the command line) and select the properties to display using:

Get-DbcCheck | ft -Property UniqueTag, Description -Wrap

or you can use Format-List (or its alias fl at the command line)

Get-DbcCheck | fl
Or you can use Out-GridView (or its alias ogv at the command line). (Incidentally, could you also give a thumbs-up to this issue on GitHub to get Out-GridView functionality in PowerShell 6?)
Get-DbcCheck | ogv
Happy Validating!