To demonstrate this process, I will show you how I export all the PowerShell help and convert it to PDF files that I use instead of reading the help in the PowerShell console. I realize you can view the PowerShell help files in the console and online; the purpose of this script is to show how I solved something I considered a problem for me. You may be perfectly happy reading the help in the console or online, and as with any PowerShell script anyone hacks together, your mileage may vary.

I have split the code into three functions: one to export the help files to text files, one to convert those text files to PDFs, and one to merge all the PDFs in a given folder into a single PDF. This process requires the iTextSharp .NET library, which you can download here (https://github.com/itext/itextsharp). This is probably one of the more basic implementations of the iTextSharp library; it has a lot of functionality that I didn't need to accomplish my mission. You can find more technical information on iTextSharp here (https://afterlogic.com/mailbee-net/docs-itextsharp/).

The Problem

I hate looking through the help files in the console… for a few reasons.

  1. By default, the font is too small for me to see well. Old-people problems, I know; it is what it is. I find reading a PDF file much easier on the eyes. I dump all the help files on a network-accessible share that I can quickly reach from any machine on my network.
  2. I can highlight, comment on, and bookmark the PDF files as needed. This may require the full version of Acrobat Pro or an equivalent; I'm not sure. My organization has Acrobat Pro, so this isn't an issue for me. Using highlighting, comments, and bookmarks, I can turn the help files into more of a quick reference. It just works for me.
  3. I frequently work on servers that are not accessible to the larger internet, meaning I can't update help files on those machines unless I do it manually. Instead, I keep these PDF files on an external hard drive that I carry around anyway.

I know the help files are updated occasionally, but not often enough to make much of a difference; I usually update mine a couple of times a year. All that said, below is the code I use to create all the text files and convert them to PDFs. I have also combined the help for each cmdlet in a module into a single PDF, available here on signalwarrant.com. Hit the downloads link and use them if you think you can get some benefit from them.
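
Here is how the three functions fit together once they are loaded. This is a minimal sketch; the paths are example values, and it assumes you have dot-sourced the three functions below and downloaded itextsharp.dll to C:\scripts.

# A minimal sketch; paths are examples, and the three functions below
# must already be dot-sourced into your session
Export-PShelp -filePath 'C:\help'
ConvertTo-PDF -filePath 'C:\help' -filetype 'txt' -dllPath 'C:\scripts\itextsharp.dll'
Merge-PDFs -filePath 'C:\help' -dllPath 'C:\scripts\itextsharp.dll'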

Convert text files to PDF then merge PDFs in bulk with PowerShell and iTextSharp - YouTube

This script exports all the help for each cmdlet to the specified file path.

Function Export-PShelp {
  <#
    .SYNOPSIS
    Exports the -full help for each CMDlet available on your computer.
    
    .DESCRIPTION
    Gets all the modules available on the computer, creates a folder for each module under $filePath, then loops through each cmdlet in each module and exports its full help to $filePath\<module name>.

    .PARAMETER filePath
    -filePath: The folder that you're exporting all the help files to on your local hard drive.

    .EXAMPLE
    Export-PShelp -filePath 'c:\helpfiles'

    .NOTES

    .LINK
    
  #>

Param(
  [Parameter(
    Mandatory=$True,HelpMessage='Path where you want the helpfiles to be exported',
    Position=1
  )][string]$filePath
)
    
If(!(Test-Path -Path $filePath)){
  New-Item -Path $filePath -ItemType Directory
}

# You get some errors if a module has no help so I just 
# turned error reporting off.    
$ErrorActionPreference = 'silentlycontinue'

# Get each module name and loop through each to retrieve cmdlet names
$modules = Get-Module -ListAvailable | 
    Select-Object -ExpandProperty Name

ForEach ($module in $modules){
  # Creates a folder for each Module
  If (!(Test-Path -Path "$filePath\$($module)")){
    New-Item -ItemType Directory -Path "$filePath\$($module)"
  }

  # Get the name of each cmdlet in the module
  $modulecmdlets = Get-Command -Module $module | 
    Select-Object -ExpandProperty name
        
  ForEach ($modulecmdlet in $modulecmdlets){
    Get-Help -Name $modulecmdlet -Full |
      Out-File -FilePath "$filePath\$($module)\$($modulecmdlet).txt"
  }
}
}



The script below will convert the text files to PDFs.

Function ConvertTo-PDF {
  <#
    .DESCRIPTION
    Convert 1 or many files to PDFs
    
    .PARAMETER filePath
    -filePath: The path to the folder that contains all your text files

    .PARAMETER dllPath
    -dllPath: The Path to the iTextSharp.DLL file

    .EXAMPLE
    ConvertTo-PDF -filePath 'C:\help' -filetype 'txt' -dllPath 'C:\itextsharp.dll'

    .NOTES
    - Requires the iTextSharp 5.5.10 .NET library
    - You may have to set the execution policy to a less restrictive policy

    .LINK
    iTextSharp .NET library: https://github.com/itext/itextsharp/releases/tag/5.5.11
  #>


Param(
  [Parameter(
    Mandatory=$True,
    HelpMessage='The folder containing the files to convert EX. c:\help'
    )][string]$filePath,

  [Parameter(
    Mandatory=$True,HelpMessage='What file type to convert'
    )][string]$filetype,

  [Parameter(
    Mandatory=$True,HelpMessage='path to the itextsharp.dll file EX. c:\itextsharp.dll'
    )][string]$dllPath
)

Begin{
  Try{
    Add-Type -Path $dllPath -ErrorAction Stop
  }
  Catch{
    Throw "Could not load iTextSharp DLL from $($dllPath).`nPlease check that the dll is located at that path."
  }
}

Process{
  $txtFiles = Get-ChildItem -Path $filePath -Recurse -Filter "*.$filetype"

  ForEach ($txtFile in $txtFiles){
    $path = "$($txtFile.DirectoryName)\$($txtFile.BaseName).pdf"
    $doc = New-Object -TypeName iTextSharp.text.Document
    $fileStream = New-Object -TypeName IO.FileStream -ArgumentList ($path, [System.IO.FileMode]::Create)
    [void][iTextSharp.text.pdf.PdfWriter]::GetInstance($doc, $fileStream)
    [iTextSharp.text.FontFactory]::RegisterDirectories()

    $paragraph = New-Object -TypeName iTextSharp.text.Paragraph
    $paragraph.add(( Get-Content -Path $($txtFile.FullName) |
        ForEach-Object {
            "$_`n"
        })) | Out-Null
    $doc.open()
    $doc.add($paragraph) | Out-Null
    $doc.close()
  }
}
}

This script will merge all the PDFs in a given folder into one PDF. You would think it would take a long time with many PDF files, but it built a 20,000-page PDF in a second or so.

Function Merge-PDFs{
  <#
    .SYNOPSIS
    Merges PDF files into 1 PDF file.

    .PARAMETER filePath
    -filePath: Any PDF file in the filepath will be combined into 1 PDF file named All_PowerShell_Help.pdf.

    .PARAMETER dllPath
    -dllPath: The Path to the iTextSharp.DLL file

    .EXAMPLE
    Merge-PDFs -filePath 'c:\scripts\help' -dllPath 'c:\scripts\itextsharp.dll'
    Merges every PDF under c:\scripts\help into a single All_PowerShell_Help.pdf file

    .NOTES
    Modified Code from here
    http://geekswithblogs.net/burncsharp/archive/2007/04/13/111629.aspx

  #>


  Param(
  [Parameter(
    Mandatory=$True,
    HelpMessage='The folder containing the PDF files to merge EX. c:\help'
  )][string]$filePath,
  
  [Parameter(
    ValueFromPipelinebyPropertyName=$true
    )][string]$dllPath
  )

  Begin{
    Try{
      Add-Type -Path $dllPath -ErrorAction Stop
    }
    Catch{
      Throw "Could not load iTextSharp DLL from $($dllPath).`nPlease check that the dll is located at that path."
    }
  }

  Process{ 

##############################################################
# Merges all the PDF files for each Module in to 1 PDF file 
# called All_PowerShell_Help.pdf in $filepath
# Roughly 20,000 pages in about a second; iTextSharp is fast.
##############################################################

    $pdfs = Get-ChildItem -Path $filePath -Recurse -Filter '*.pdf'
    $ErrorActionPreference = 'silentlycontinue'
    # itextsharp.dll is already loaded by Add-Type in the Begin block
    $output = [System.IO.Path]::Combine($filePath, 'All_PowerShell_Help.pdf')
    $fileStream = New-Object -TypeName System.IO.FileStream -ArgumentList ($output,[System.IO.FileMode]::OpenOrCreate)
    $document = New-Object -TypeName iTextSharp.text.Document
    $pdfCopy = New-Object -TypeName iTextSharp.text.pdf.PdfCopy -ArgumentList ($document, $fileStream)
    $document.Open()
    
    foreach ($pdf in $pdfs) {
        $reader = New-Object -TypeName iTextSharp.text.pdf.PdfReader -ArgumentList ($pdf.FullName)
        $pdfCopy.AddDocument($reader)
        $reader.Dispose()
    }
    $pdfCopy.Dispose()
    $document.Dispose()
    $fileStream.Dispose()

#############################################################
# Find all directories in $filepath that are empty and delete
# them. Some directories created for DSC Resources will be empty
#############################################################
    (Get-ChildItem -Path $filePath -Recurse | 
    Where-Object {$_.PSIsContainer -eq $True}) | 
    Where-Object {$_.GetFiles().Count -eq 0} | 
    Remove-Item
}

}

This script is not part of any of the functions above, but I figured I would include it. It loops through each module folder and merges all the PDF files in each folder into one PDF per module. Set the $filePath and $dllPath variables (example values shown) before running it.

##############################################################
# Merges all the PDF files for each Module in to 1 PDF file per
# module called Help_<moduleName>.pdf in $filepath
##############################################################

# Example values; point these at your help folder and at itextsharp.dll
$filePath = 'C:\help'
$dllPath = 'C:\scripts\itextsharp.dll'

# Load the iTextSharp library once, before the loop
Add-Type -Path $dllPath
$ErrorActionPreference = 'silentlycontinue'

$folders = Get-ChildItem -Path $filePath -Directory
foreach ($folder in $folders){
    $pdfs = Get-ChildItem -Path $folder.FullName -Recurse -Filter '*.pdf'

    $output = [System.IO.Path]::Combine($filePath, "Help_$($folder.Name).pdf")
    $fileStream = New-Object -TypeName System.IO.FileStream -ArgumentList ($output, [System.IO.FileMode]::OpenOrCreate)
    $document = New-Object -TypeName iTextSharp.text.Document
    $pdfCopy = New-Object -TypeName iTextSharp.text.pdf.PdfCopy -ArgumentList ($document, $fileStream)
    $document.Open()

    foreach ($pdf in $pdfs) {
        $reader = New-Object -TypeName iTextSharp.text.pdf.PdfReader -ArgumentList ($pdf.FullName)
        $pdfCopy.AddDocument($reader)
        $reader.Dispose()
    }
    $pdfCopy.Dispose()
    $document.Dispose()
    $fileStream.Dispose()
}

As you may or may not know, I recently decommissioned my old Dell PowerEdge 1950 server that I used for a few lab virtual machines. While experimenting with PowerShell on those virtual machines, I often found it easier to delete a VM and re-create it than to troubleshoot something I had fouled up in the Registry. After the second time rebuilding the lab VMs through the Azure website, I decided to script the process.

The script below takes input from a CSV file and creates a virtual machine in your Azure subscription for each row in the file. My example creates two virtual machines, but you can obviously add as many as you need.

For a production environment in Azure, I would suggest snapshotting the virtual machines. There is a good write-up of the process here: http://www.coreazure.com/snapshot-vms-in-azure-resource-manager/. In my case, for a lab, snapshots use more storage, which costs more $$.

Automate Creating Lab Virtual Machines in Azure with PowerShell - YouTube

Function New-AzureLab {
  <#
    .SYNOPSIS
    New-AzureLab will create 1 or multiple VMs in Azure based on input parameters from a CSV

    .DESCRIPTION
    Create a CSV file like below:
    VMName,Location,InterfaceName,ResourceGroupName,VMSize,ComputerName
    SP,EastUS,SP_Int,SignalWarrant_RG,Basic_A2,SP

    The function will read the input from the CSV file and create VMs in an Azure Resource Group
    
    .PARAMETER csvpath
    The full path to your CSV file (eg c:\scripts\VMs.csv)

    .EXAMPLE
    New-AzureLab -csvpath c:\scripts\VMs.csv
    Imports the applicable values from the CSV file

    .NOTES
    1. I already had a Resource Group in Azure therefore I put all the VMs in the same group.
    2. I already had a VM network created, all my VMs are in the same network.

    .LINK
    URLs to related sites
    A good writeup on the process - https://docs.microsoft.com/en-us/azure/virtual-machines/windows/quick-create-powershell
    Azure VM size values - https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sizes-general
    Azure VM Publisher, Offer, SKUs, Version info for various VM types - https://docs.microsoft.com/en-us/azure/virtual-machines/windows/cli-ps-findimage

    .INPUTS
    CSV file path

    .OUTPUTS
    None

  #>

    Param(
        [Parameter(Mandatory=$True,HelpMessage='Enter the Path to your CSV')]
        [string]$csvpath
        )
    # Lets make sure the CSV file is actually there
    $testpath = Test-Path -Path $csvpath
    If (!$testpath){
        Clear-Host
        Throw '***** Invalid CSV Path *****'
    } else {

        # This will be the local username and password for each VM
        $Credential = Get-Credential

        # Import the information from my CSV
        Import-Csv -Path "$csvPath" | ForEach-Object {
        
        # Get the Storage Account information
        $StorageAccount = Get-AzureRmStorageAccount

        # This is the naming convention for the OS Disk
        $OSDiskName = $_.'VMName' + '_OSDisk'

        # Network Information
        $PublicIP = New-AzureRmPublicIpAddress -Name $_.'InterfaceName' -ResourceGroupName $_.'ResourceGroupName' -Location $_.'Location' -AllocationMethod Dynamic
        $VMNetwork = Get-AzureRmVirtualNetwork
        $Interface = New-AzureRmNetworkInterface -Name $_.'InterfaceName' -ResourceGroupName $_.'ResourceGroupName' -Location $_.'Location' -SubnetId $VMNetwork.Subnets[0].Id -PublicIpAddressId $PublicIP.Id

        ## Setup local VM object
        $VirtualMachine = New-AzureRmVMConfig -VMName $_.'VMName' -VMSize $_.'VMSize'
        $VirtualMachine = Set-AzureRmVMOperatingSystem -VM $VirtualMachine -Windows -ComputerName $_.'ComputerName' -Credential $Credential -ProvisionVMAgent -EnableAutoUpdate
        $VirtualMachine = Set-AzureRmVMSourceImage -VM $VirtualMachine -PublisherName MicrosoftWindowsServer -Offer WindowsServer -Skus 2016-Datacenter -Version 'latest'
        $VirtualMachine = Add-AzureRmVMNetworkInterface -VM $VirtualMachine -Id $Interface.Id
        $OSDiskUri = $StorageAccount.PrimaryEndpoints.Blob.ToString() + 'vhds/' + $OSDiskName + '.vhd'
        $VirtualMachine = Set-AzureRmVMOSDisk -VM $VirtualMachine -Name $OSDiskName -VhdUri $OSDiskUri -CreateOption FromImage

        ## Create the VM in Azure
        New-AzureRmVM -ResourceGroupName $_.'ResourceGroupName' -Location $_.'Location' -VM $VirtualMachine -Verbose

        }
    }
}
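
To use it, build a CSV that matches the header shown in the help and call the function after logging in to Azure with the AzureRM module. This is a sketch; the second CSV row below is a hypothetical addition for illustration.

# VMs.csv (header from the .DESCRIPTION above; second row is hypothetical)
# VMName,Location,InterfaceName,ResourceGroupName,VMSize,ComputerName
# SP,EastUS,SP_Int,SignalWarrant_RG,Basic_A2,SP
# EX,EastUS,EX_Int,SignalWarrant_RG,Basic_A2,EX

Login-AzureRmAccount   # authenticate to your Azure subscription first
New-AzureLab -csvpath 'C:\scripts\VMs.csv'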

This is why I Love PowerShell… It’s simple, yet functional.

From an administrative perspective, I think we can all agree that any change to your Domain Admins group without your knowledge would be of interest to you. If you're in a large organization with access to enterprise management tools, you probably have some widget that fires off a message to you or a group of people when a change is detected… or maybe you don't.

If you're an admin at a small business, and maybe even some medium-sized ones, you may not have access to those enterprise management tools and widgets. It turns out we can use PowerShell to monitor any group for us and notify us when a change occurs. It's actually pretty simple.

You can even have PowerShell send you a text message… which is pretty cool.

I'm using the script to keep an eye on my Domain Admins group, but you could easily adapt it to monitor services or processes. For example, you might watch your Exchange Server's Transport service and, if it stops for whatever reason, have the script send you an email and a text message.

Hey PowerShell... Text me if my Domain Admins Group changes - YouTube

First, we have to get all the members of the Domain Admins group and export them to an XML file.

# Run this once to get the Domain Admins group baseline
Get-ADGroupMember -Server signalwarrant.local -Identity "Domain Admins" |
    Select-Object -ExpandProperty samaccountname | 
    Export-Clixml -Path 'C:\scripts\CurrentDomainAdmins.xml'

This is the script we’ll run on a schedule.

# This is the script we'll run on a regular basis

# Get the filehash of the CurrentDomainAdmins.xml
    $CurrentAdminsHash = Get-FileHash -Path 'C:\scripts\CurrentDomainAdmins.xml' | 
      Select-Object -expandProperty Hash
# Get the current date
    $Date = Get-Date
# This is the file we're testing the CurrentDomainAdmins.xml file against
    $newAdmins = 'c:\scripts\NewAdmins.xml'
# A variable we will use in the if statement below
    $Change = ''

# As we run the test we're going to get the contents of the Domain Admins Group
Get-ADGroupMember -Server signalwarrant.local -Identity 'Domain Admins' |
    Select-Object -ExpandProperty samaccountname | 
    Export-Clixml -Path $newAdmins -Force

# Get the filehash of the new file 
$NewAdminsHash = Get-FileHash -Path $newAdmins | Select-Object -expandProperty Hash

# If the CurrentDomainAdmins.xml (our baseline file) and NewAdmins.xml do not match
If ($NewAdminsHash -ne $CurrentAdminsHash){
    
    # Do all of this if a change is detected
    $Change = 'Yes'
    $ChangesDetected = 'Domain Admins group change detected on: ' + $date
    $ChangesDetected | Out-File -FilePath 'C:\scripts\DA_Changes.txt' -Append -Force
} else {

    # If no change detected just write when the script last ran
    $Change = 'No'
    $NoChangesDetected = 'No Changes detected on: ' + $Date
    $NoChangesdetected | Out-File -FilePath 'C:\scripts\DA_NoChanges.txt' -Append -Force
}
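
The hash comparison only tells you that something changed. If you also want to see which accounts were added or removed, a quick Compare-Object between the baseline and the new export will show that; this is an optional addition, not part of the original script.

# Optional: list exactly which accounts differ between the two exports.
# '=>' marks accounts only in the new export, '<=' only in the baseline.
Compare-Object -ReferenceObject (Import-Clixml -Path 'C:\scripts\CurrentDomainAdmins.xml') `
    -DifferenceObject (Import-Clixml -Path $newAdmins)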

# Credentials for the email account
# Do not store cleartext passwords in scripts
# https://powershell.org/forums/topic/powershell-specifiy-a-literal-encrypted-standard-string/
# The above link will tell you why I had to do it.

# If your Email account is on the same domain as the machine you're running the script from
# I would suggest using this function to create your encrypted Password file.
# https://gist.github.com/davefunkel/415a4a09165b8a6027a297085bf812c5
$username = 'your email here'
$password = 'password for the above email address'
$secureStringPwd = $password | ConvertTo-SecureString -AsPlainText -Force 
$creds = New-Object System.Management.Automation.PSCredential -ArgumentList $username, $secureStringPwd

# If the test above fails and the $change = "yes" then send me an email and text message
# and attach the NewAdmins.xml
If ($Change -eq 'Yes') {
    # Code to send the email notification
    $From = 'your email here'
    $To = 'your email here'
    $Cc = 'your email here'
    $Attachment = $newAdmins
    $Subject = '----Domain Admin Members has changed----'
    $Body = 'Your awesome PowerShell script has detected a change in your Domain Admin members'
    $SMTPServer = 'your smtp server address'
    $SMTPPort = '587'
    Send-MailMessage -From $From -to $To -Cc $Cc -Subject $Subject `
    -Body $Body -SmtpServer $SMTPServer -port $SMTPPort `
    -Credential $creds -Attachments $Attachment
}
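
About the text message: the post doesn't spell out the mechanics, but one common approach is to send the same mail to your carrier's email-to-SMS gateway, so the same Send-MailMessage call can text you. The gateway address below is a hypothetical Verizon example; the format varies by carrier.

# Hypothetical email-to-SMS example (Verizon gateway shown; varies by carrier)
Send-MailMessage -From $From -To '5555551234@vtext.com' -Subject 'DA group changed' `
    -Body 'Change detected in Domain Admins' -SmtpServer $SMTPServer -Port $SMTPPort `
    -Credential $creds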

These are the Action arguments for the scheduled task.
-NoLogo -NonInteractive -WindowStyle Hidden -NoProfile -ExecutionPolicy Bypass -File "C:\scripts\AD_Audit.ps1"
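
If you would rather build the scheduled task from PowerShell too, here is a minimal sketch using the ScheduledTasks module; the task name and the daily 06:00 trigger are my example choices, not part of the original post.

# A minimal sketch; the task name and trigger time are example values
$action = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoLogo -NonInteractive -WindowStyle Hidden -NoProfile -ExecutionPolicy Bypass -File "C:\scripts\AD_Audit.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName 'Monitor Domain Admins' -Action $action -Trigger $trigger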


For whatever reason, I seem to be migrating most of my coding activity from the PowerShell ISE to Visual Studio Code. I'm not really sure why, other than that the look and feel seems more pleasing than the ISE… to me anyway. Anywho, I have wanted to hack out some sort of code backup and version control solution for all of my code for a while now. Moving everything to Visual Studio Code seemed like the prime time to get that done as well.

I knew nothing about Git or Visual Studio Team Services; I had zero experience with either. So, in order to get this set up, I found myself searching YouTube for a tutorial… I didn't find much. After a couple of hours of googling and video watching, I finally got it to work. To save you the time and effort, I thought it would be good to make a video myself.

Visual Studio Code and Visual Studio Team Services Integration - YouTube


I recently started an Azure subscription in order to move all the servers I use to test PowerShell code to the cloud. Right now I have only a couple of virtual machines: one running Windows Server 2016, which is my Domain Controller, and one running Windows Server 2012 R2 with Exchange 2013 installed. Obviously, both of these VMs are in the same domain.

For testing purposes, I wanted to be able to remote into the cloud VMs using PowerShell. The problem is that my local machine is not in the same domain as the VMs, so I couldn't get authenticated. Now, you can stand up an Azure Active Directory, put the local machine in that domain, and you're good to go, but I'm trying to keep costs as low as possible and wasn't willing to pay the extra expense for Azure AD. I think you can also use a certificate in an Azure key store, but again, extra expense, plus I would have to figure out how to make it work… I'm an Azure n00b.

After some quality time consulting Professor Google, I concluded that creating a certificate on each VM and then importing that certificate on my local laptop was the easiest way to make this work. Obviously, this is not a good enterprise solution, although you could probably do it a little more efficiently at scale using Certificate Services. Anywho… this is how I did it.

If you have a better method, please let me know in the comments.

Enable PowerShell Remoting to an Azure Virtual Machine, without Domain Membership - YouTube

# Enable Remoting to an Azure VM
Enable-PSRemoting

# Make sure to set the Public IP address to static or make sure you track the change of the public IP

# Create Network Security Group Rule to allow winrm
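
# A sketch using the AzureRM module from your admin workstation, not the VM;
# the NSG and resource group names here are example values, adjust to yours.
$nsg = Get-AzureRmNetworkSecurityGroup -Name 'MyNSG' -ResourceGroupName 'My_RG'
Add-AzureRmNetworkSecurityRuleConfig -NetworkSecurityGroup $nsg -Name 'WinRM-HTTPS' `
    -Protocol Tcp -Direction Inbound -Priority 1001 -SourceAddressPrefix '*' `
    -SourcePortRange '*' -DestinationAddressPrefix '*' -DestinationPortRange 5986 -Access Allow
Set-AzureRmNetworkSecurityGroup -NetworkSecurityGroup $nsg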

# Create a Selfsigned cert on the Azure VM
$Cert = New-SelfSignedCertificate -CertstoreLocation Cert:\LocalMachine\My -DnsName PC1.mydomain.local
Export-Certificate -Cert $Cert -FilePath '<filepath>\exch.cer'

# Create a firewall rule inside the Azure VM 
New-Item -Path WSMan:\LocalHost\Listener -Transport HTTPS -Address * -CertificateThumbPrint $Cert.Thumbprint -Force
New-NetFirewallRule -DisplayName 'WinRM HTTPS-In' -Name 'WinRM HTTPS-In' -Profile Any -LocalPort 5986 -Protocol TCP

# Install the Cert on the client
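
# On the client, import the exported certificate into Trusted Root so the
# listener's self-signed cert is trusted ('<filepath>' is wherever you copied exch.cer)
Import-Certificate -FilePath '<filepath>\exch.cer' -CertStoreLocation 'Cert:\LocalMachine\Root'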

# Run this on the remote client
$cred = Get-Credential
Enter-PSSession -ConnectionUri https://xx.xx.xx.xx:5986 -Credential $cred -SessionOption `
(New-PSSessionOption -SkipCACheck -SkipCNCheck -SkipRevocationCheck) -Authentication Negotiate
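
# Optional sanity check on the VM: confirm the HTTPS listener was created
Get-ChildItem -Path WSMan:\LocalHost\Listener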

This is a very minimal introduction to filtering pipeline objects using Where-Object and comparison operators. The video gives you the syntax and a simple example. To unleash the full capabilities of comparison operators, take a look at the help file.

Basic and Advanced Filtering of PowerShell Objects using Where-Object - YouTube

# Basic Syntax example
Get-Service | Where-Object Status -eq Running

# Advanced Syntax example
Get-Service | Where-Object {$PSItem.Status -eq 'Running' -and $PSItem.StartType -eq 'Automatic'}
# Same as above
Get-Service | Where-Object {$_.Status -eq 'Running' -and $_.StartType -eq 'Automatic'}
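
# For the full list of comparison operators, see the help topic mentioned above
Get-Help about_Comparison_Operators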

Something a little different with this video: no PowerShell goodness. I'm an Azure rookie, and I've had a few challenges getting VMs up and running. I'll have a few more videos documenting what I've learned, so hopefully they will let you get spun up quicker than I did.

Azure VM Connect Button is Greyed out, lets fix it. - YouTube


A practical example of why you should filter as far left in your scripts as possible. It might matter.

PowerShell Quicktip #2: Filter-Left - YouTube

#1 - Filter late: every service object crosses the pipeline before filtering
Get-Service | Where-Object { $_.Status -eq 'Running' -and $_.Name -like 's*' }
#2 - Filter left: Get-Service itself limits the output
Get-Service -Name s* | Where-Object { $_.Status -eq 'Running' }

# Time both approaches to see the difference
#1
Measure-Command -Expression { Get-Service | Where-Object { $_.Status -eq 'Running' -and $_.Name -like 's*' } }
#2
Measure-Command -Expression { Get-Service -Name s* | Where-Object { $_.Status -eq 'Running' } }

PowerShell Basics: Sorting and Selecting Objects with Sort-Object and Select-Object - YouTube

# Selecting
#Default
Get-Process

# All Properties
Get-Process | Select-Object -Property * | Out-GridView

# Sorting
# Changes the default sorting order for Get-Process
Get-Process | Sort-Object CPU

# Minimize the data and sort
Get-Process | Select-Object ProcessName, CPU | Sort-Object CPU -Descending

## Caution 
# Mission: Get the top 10 processes by CPU usage (which 10 processes have the most CPU usage)

# Wrong: -First 10 grabs the first 10 processes BEFORE anything is sorted,
# then sorts only those 10
Get-Process | 
    Select-Object ProcessName, CPU -First 10 | 
    Sort-Object CPU -Descending

# Right: sort the whole set first, then take the top 10
Get-Process | 
    Sort-Object CPU -Descending | 
    Select-Object ProcessName, CPU -First 10
