As previously discussed in my The PaaS/SaaS DSC Paradigm article, the continuous monitoring of Software-as-a-Service configurations is something that keeps me up at night. To this day, when a customer asks me what the recommended way to run Office365DSC configurations and ensure continuous compliance is, I don’t have a good answer to provide. For on-premises systems, my answer has always been to use Azure DevOPS to manage the configuration as code and to automate the deployments via Azure Pipelines; that really should be a no-brainer for organizations. However, Azure DevOPS doesn’t play that nicely with SaaS configurations. I can use an Azure DevOPS agent to compile and deploy the configuration remotely against my Office 365 tenant, but then that agent gets destroyed and never performs the frequent consistency checks that DSC is known for.
Things are about to change, however. As I was sitting in a plane on my way to sunny (way too sunny this week, actually) Las Vegas, I decided to try out a new approach to get around this limitation. Azure DevOPS allows you to schedule the execution of Release Pipelines. If we could somehow create a pipeline whose sole purpose is to test a configuration against a remote Office 365 tenant and return an error if the configuration is not in the Desired State, we should be able to mimic something similar to what Azure Automation DSC is doing. When you think about it, with PowerShell 5.0 and above, the Test-DscConfiguration cmdlet offers a -ReferenceConfiguration parameter that accepts a path to a compiled .MOF file and analyzes a machine against it to check whether it is in the Desired State. The good news for us is that the .MOF file doesn’t have to be currently applied to the machine in order for us to do the consistency check. We can therefore have a Release Pipeline that runs on a regular basis on an ephemeral Azure DevOPS agent, tests the configuration using Test-DscConfiguration -ReferenceConfiguration, and reports any configuration drifts.
This article describes the steps one should take to create two Azure DevOPS pipelines to “simulate” an ApplyAndMonitor scenario. If you are looking to achieve ApplyAndAutocorrect, then it becomes even simpler, since all you have to do is schedule a pipeline that executes the configuration on a regular basis without even bothering to check whether the environment is in the Desired State or not. After all, the concept of idempotency tells us that running the same thing over and over again will always produce the same result, doesn’t it?
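For that ApplyAndAutocorrect variant, the scheduled pipeline’s inline script could be as simple as the following sketch; the module install and the build artifact path are assumptions based on the build layout used later in this article:

```powershell
# Allow the ephemeral agent to perform DSC operations
winrm quickconfig -Force

# Install the Office365DSC module on the agent
Install-Module Office365DSC -Force

# Re-apply the compiled configuration on every scheduled run;
# idempotency guarantees this is safe even when nothing has drifted
Start-DscConfiguration -Path "D:\a\r1\a\_Build\Package\BuildFiles\TenantConfig" -Wait -Verbose -Force
```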
For the purpose of this demo, I will use the below configuration. Please note that it is “twice parameterized”: the .ps1 file accepts a Username and Password for the Global Admin account, creates a PSCredential object out of these two pieces of information, and then feeds that PSCredential object into the actual configuration. The Username and Password are fed in by the Build Pipeline at compilation time via Pipeline Variables.
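Since the full listing is not reproduced here, the following is a minimal sketch of what such a “twice parameterized” script could look like; the resource used (TeamsTeam) and all names are illustrative assumptions, not the actual demo configuration:

```powershell
param
(
    [Parameter(Mandatory = $true)]
    [System.String]
    $Username,

    [Parameter(Mandatory = $true)]
    [System.String]
    $Password
)

Configuration O365TenantConfig
{
    param
    (
        [Parameter(Mandatory = $true)]
        [System.Management.Automation.PSCredential]
        $GlobalAdminAccount
    )
    Import-DscResource -ModuleName Office365DSC

    Node localhost
    {
        # Illustrative resource; replace with your actual tenant settings
        TeamsTeam ProjectTeam
        {
            DisplayName        = "Project X"
            Ensure             = "Present"
            GlobalAdminAccount = $GlobalAdminAccount
        }
    }
}

# Build the PSCredential from the two pipeline variables, then compile the .MOF
$securePassword = ConvertTo-SecureString $Password -AsPlainText -Force
$credential     = New-Object System.Management.Automation.PSCredential ($Username, $securePassword)
O365TenantConfig -GlobalAdminAccount $credential
```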
While I will not be covering the creation of a build pipeline in this article, you should know that your build pipeline should do three things: compile your .MOF file, copy the files into the build artifacts, and publish the artifacts so that your Release pipelines can consume them. If you require additional guidance, you can always refer to my DevOPS for Office 365 article.
Push Release Pipeline
The push pipeline gets triggered via Continuous Integration every time a successful build completes. Once it does, all this pipeline does is run Start-DscConfiguration against your configuration so that the Azure DevOPS pipeline can start configuring your tenant remotely. Again, I would refer you to the DevOPS for Office 365 article if you require additional guidance.
Alright, let’s get down to business.
Create a New Release Pipeline in your project.
Select an Empty job template.
Leave Stage 1 as the default name and click on Add an artifact.
Select the latest Build, leave the default values and click Add.
Click on the task link to add new tasks to the pipeline.
Click on the + button beside the Agent. When the task panel opens, search for PowerShell and finally, click on the Add button beside the PowerShell task to add a new one.
Select the newly added task, rename it to Monitoring Tenant, select Inline as the type of script, and paste in the following script. This will assess the configuration of your tenant against localhost (the DevOPS agent). The path here is the default build path. If you followed all steps, that path to the .mof file should be the same for you as well.
winrm quickconfig -Force # Allow the agent to perform DSC operations
Install-Module Office365DSC -Force # Install the Office365DSC module on the ephemeral agent
$results = Test-DscConfiguration -ReferenceConfiguration "D:\a\r1\a\_Build\Package\BuildFiles\TenantConfig\localhost.mof" -ComputerName localhost
if (-not $results.InDesiredState)
{
    throw "Some resources are not in the Desired State."
}
Go back to the Pipeline tab and click on the Schedule Icon under the artifact.
This is the step where you define the frequency at which the Monitoring pipeline will automatically get triggered. Think of this as our ConfigurationModeFrequencyMins property for a Local Configuration Manager that doesn’t really exist. In my case, I will create 2 schedules, once at 6AM and once at 6PM, every day of the week. Click Save once your schedules have been defined.
Our pipelines are finally created, and everything is in place for our monitoring to take place. To showcase the monitoring process, I have manually triggered the new Azure DevOPS Release Pipeline. In my case, the tenant was already in the Desired State, so the pipeline returns successful.
Let us now force a configuration drift and see what happens. Go to your tenant’s list of Teams, and delete one of the two teams that are controlled by our Office365DSC configuration.
If I am to re-run the monitoring pipeline, it will now error out and if I take a look at the actual error, I will get information regarding what resources are no longer in their desired state.
I am sitting in the speakers room at the European Collaboration Summit as I am writing this blog article, having just finished giving a talk on DevOPS for SharePoint Admins here in Wiesbaden, Germany. The session I delivered focused on building a completely automated Continuous Integration and Continuous Delivery pipeline to automate the deployment of a SharePoint 2019 environment in Azure Infrastructure-as-a-Service, and to integrate changes to it. All the files used in my demos are available in my personal Conferences repository on GitHub at AKA.MS/CollabSummit2019. The main idea behind the demos was to deploy a five-server environment (1 AD, 1 SQL and 3 SP) to Azure IaaS via ARM templates, and to configure Active Directory, SQL 2017 and SharePoint 2019 on them using PowerShell Desired State Configuration. This blog article describes in detail the process involved in setting up the Azure DevOPS environment to support my demo.
In my case, my Organization was created as NikBlogPost.
Create a New Project
The next step is to create a new project within our new Organization. For my demo, I will be creating a new project named SP2019 Farm and will keep its visibility set to Private.
Create a New Build Pipeline
The first step in automating our delivery process is to create a new Build Pipeline.
From the left menu, select Pipelines > Builds;
Click on the New pipeline button;
As mentioned earlier on, all of our code will be hosted on GitHub. Therefore, when you get to the screen where you need to select the source for your code, we need to pick GitHub.
The next screen will ask you to select the repository to import the code from. I recommend you fork my repository into your own account before reaching this point. In my case, I will select NikCharlebois/Conferences as the source.
On the next screen, select Existing Azure Pipelines YAML file.
A new panel will pop up from the right-hand side of the screen, allowing you to select an existing pipeline. Make sure you select the proper branch (in my case Master), and enter the path to the Build Pipeline from your repo. It most likely will be the same as mine: /2019 – CollabSummit – Europe/AzureDevOPSPipeline.yaml. Click Continue. The imported YAML template contains a PowerShell task that will download the Azure Stack Tooling onto the Azure DevOPS Build Agent to validate your ARM templates. This is an important step in ensuring the quality of our build before attempting to deploy anything to our environment. Invalid ARM templates can result in… well, undesired results. Once the template has been validated, it will copy all the files and make them available to our Release Pipelines.
Once the template has been imported, make sure you update line 24 to reflect your own Azure Subscription, and then click on the Run button to test it out and make sure everything is configured properly.
There is a very good chance that running the Build for the first time will result in an error being thrown. This is because we haven’t authorized Azure DevOPS to deploy to our Azure Subscription yet. To correct this, simply click on the Authorize resources button. You will need to queue a new Build once you have corrected the issue for it to succeed.
Go back to the screen where you can edit your Build Pipeline’s YAML. Click on the ellipses and pick Triggers.
Make sure that Continuous Integration is enabled for your project. This will allow for a new Build to be triggered every time new code is committed to our GitHub repository.
Create a New Release Pipeline
The final step in our demo is to create a Release Pipeline to actually go and deploy our code to our environment. This is the piece of the puzzle that makes sure the magic happens.
In the left navigation Panel, select Build > Release pipeline
Click on the New pipeline button.
When prompted, select the option to start with an Empty job
Rename Stage 1 to something meaningful like Deploy ARM and close the panel.
Click on the Add an artifact bubble.
Select Build as the source, select the Build pipeline from the drop down, rename the Alias to Demo and click on Add
Click on the lightning icon beside the Artifact bubble to launch the trigger panel. Make sure you enable Continuous Deployment and close the panel. This will make sure that every time we have a successful build, that a new release gets initiated so that our changes can be automatically pushed to our environment.
Click on the task link to add tasks to our stage
Beside the Agent Job label, click on the + sign. In the action panel, search for Azure PowerShell and select it from the list.
Select the newly added task. Pick your Azure Subscription from the Azure Subscription dropdown.
In the Script Path picker, select Demo/Demo/CollabSummit/Deploy.ps1 and click OK.
Enter -TargetEnvironment "Dev" -TemplatePath $(System.DefaultWorkingDirectory) -ResourceGroupNamePrefix "CS" as Script Arguments, and type 6.7.0 as the PowerShell Version to use. Click Save, and then OK when prompted.
Go back to the Pipeline Overview tab, and select Add > New stage.
Folks, I decided to finally bite the bullet after many years of debating on the topic, and to produce my own webcast series. The idea is to produce short, consumable webcast episodes that will focus on any automation topics (not just Microsoft focused). If you are interested in learning more about some of the projects I am working on, if you wish to keep skills up-to-date, or if you just want to hear my sexy French accent on a weekly basis, I recommend you follow my new YouTube channel at: https://www.youtube.com/channel/UCyhC_JPnw5P0WoV4iY0Upsw
Unless you’ve been living under a rock for the past 6 months, you’ve all heard me talk about the Office365DSC project that the team and I have been working on lately. The idea behind this project is to allow System Administrators to define their Office 365 Tenants’ configuration as code. By exposing all of their settings (e.g. SharePoint Search Managed Properties, Teams Channels, Exchange Online Mailboxes, OneDrive settings, etc.) as code, organizations can now integrate their Office 365 environments with their DevOPS practices. What we are trying to promote with this project is that we want to make sure changes to Office 365 Tenants are done in a consistent and managed fashion.
Let’s explain this further by taking a typical scenario where your organization wants to add a new Search Managed property to their Office 365 Tenants. The System Admin would receive the request, and go in their Source Control tool to add to the existing Office365DSC configuration script the few lines of code that represent that new Managed Property. When they save the changes, automated processes would take care of compiling the Office365DSC script and would execute a remote configuration process that will go and automatically create that new managed property on the tenant. This reduces the risk of deployment errors, and ensures that this configuration script in our source control is always the “Source of Truth”, meaning that what is defined in it, is sure to be exactly how the Office 365 tenant is configured.
Now let us take the example where you have multiple tenants, say Dev, QA and Production. The automated deployment process would first start by deploying the change to the Dev tenant, and then assign a review task to the accountable team. Upon reviewing the change and approving it, the change would then be promoted to QA, and another approval task would be sent. Then whenever the change is approved in QA, it will then be promoted to Production. All of this completely automated.
Back to our scenario, the following lines of code are what we will be starting with in our case. Now, very important: for the purpose of the demo, I am passing plaintext credentials in my configuration script (even worse, I am hardcoding the password in the actual configuration). Do not attempt this at home! This is strictly to keep my demo as simple as possible while demonstrating the capabilities.
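Since the original listing is not reproduced here, the sketch below captures the spirit of such a configuration; the resource, tenant name, and account are illustrative assumptions, and the hardcoded plaintext password is exactly the bad practice called out above:

```powershell
Configuration O365Demo
{
    Import-DscResource -ModuleName Office365DSC

    # DO NOT do this in a real environment; hardcoded plaintext password for demo only
    $password   = ConvertTo-SecureString "Pass@word1!" -AsPlainText -Force
    $credential = New-Object System.Management.Automation.PSCredential ("admin@contoso.onmicrosoft.com", $password)

    Node localhost
    {
        # Illustrative resource; replace with your actual tenant settings
        O365User JohnSmith
        {
            UserPrincipalName  = "john.smith@contoso.onmicrosoft.com"
            DisplayName        = "John Smith"
            Ensure             = "Present"
            GlobalAdminAccount = $credential
        }
    }
}

# Compile the .MOF and push it against the tenant
O365Demo
Start-DscConfiguration -Path .\O365Demo -Wait -Verbose -Force
```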
This configuration is stored in my Azure DevOPS project’s Repos as O365.ps1. The following screen capture shows the content of my Repos for this demo.
The second file in my Repos (other than the Readme.md), named Deploy.ps1, is located under the Release folder. It is a PowerShell script that will be executed on my DevOPS Build Agent; it takes care of downloading the Office365DSC module onto the agent and then executes the Office 365 configuration remotely against your tenant. The script itself is fairly straightforward:
winrm quickconfig -force # Configure the Agent to run DSC configurations
Install-Module Office365DSC -Force -AllowPrerelease # Install the Office365DSC module
&"D:\a\r1\a\_Showcase-CI\DSC\O365.ps1" # Execute the DSC Configuration script from the built artefacts
Now that we have our files created and available in our Repos, the next step is to define a Build Definition. While we will not be “compiling” anything during our Build phase, we still need to have our files available as Artefacts for our Release definitions. Our Build Definition will consist of two very small steps:
Copy all the files from our Repos into the Artefacts folder
Publish the Artefacts, making them available for our Releases Pipelines to consume
The first phase involves a Copy Files step configured as follows:
The second phase consists of a Publish Build Artefacts step, which takes on the following values:
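In classic builds these values are set through the designer UI; the equivalent YAML sketch below illustrates both phases. The task versions and folder values are assumptions, not the exact settings from my pipeline:

```yaml
steps:
# Phase 1: copy all the files from the Repo into the artefacts staging folder
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)'
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'

# Phase 2: publish the artefacts so the Release pipelines can consume them
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
```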
The last step is to define automatic triggers for our Build Definition, which will allow a build to be automatically triggered each time a new change is committed to our Repo. To do so, navigate to the Triggers tab and enable Continuous Integration.
You may have read from some of my previous blog posts that I am now heavily focusing my efforts and attention on the Office365DSC project. This project lets you define the configuration of your Office 365 tenants as code. This makes it easy for you to integrate your configurations with existing CI/CD DevOPS pipelines, and allows you to replicate configuration settings across multiple tenants.
Not only is this project very ambitious and promising, it is also one of the first of its kind. Typical DSC modules aim at configuring a given physical server or component. You normally write the DSC configuration you wish to execute and then have it assigned to the Local Configuration Manager (LCM) of the machine you wish to manage so that it can go and perform local operations. DSC is therefore typically thought of as being related to Infrastructure-as-a-Service (IaaS) to some degree. Office365DSC, however, doesn’t work that way; it configures a remote Office 365 tenant, which is considered to be Software-as-a-Service (SaaS). Under the covers it somehow still works the same: you need to assign the configuration to an LCM, which then performs the configuration steps. These steps all involve making remote calls to the various Office 365 APIs. The challenge at hand is that for SaaS and PaaS, DSC needs to monopolize the LCM of a machine somewhere, even though that machine itself will not have anything applied to it. This creates the need for “middle-man” agents and adds complexity to the architecture of our environments. Of course, we can simply have that machine’s LCM execute the configuration once to bring the remote tenant into the Desired State and then shut the machine down, but then we lose 50% of what DSC is all about: the monitoring and consistency check aspect.
In my opinion, configuration as code (DSC script) makes even more sense when integrated with DevOPS pipelines. System administrators make a code change and commit it back to Git or TFVC, the Continuous Integration (CI) pipelines copy these changes onto the server and the Continuous Delivery (CD) pipelines automatically apply the configuration change to the environment. In most demos I do at conferences that cover this topic, I normally have my Release CD pipeline upload the modified configurations onto Azure Automation DSC, and initiate a compilation job. Any machines that are connected to my Azure Automation DSC account and which use the affected configuration then automatically obtain the new bits and update their configurations. In the case of SaaS, this would also be feasible, but would still require a VM to be assigned to my Azure Automation DSC account, meaning we’d still have that “middle-man” agent effect.
With Azure DevOPS, we could always have the Release CD pipeline execute the configuration from a Build Agent directly. This would be the equivalent of DSC Push mode, in a certain way. But Build Agents being stateless in nature, the moment the configuration has been applied, the agent shuts down, which means we also lose all monitoring/consistency check capabilities. That is almost the same thing I described earlier, where we use a machine’s LCM to execute the configuration once, wait for the remote tenant to be in the Desired State, and then free that LCM of the responsibility of keeping that tenant in the Desired State. We might as well use traditional sequential PowerShell scripts to configure the environment at that point.
Probably the most viable option for our scenario is the use of containers to execute the configuration and ensure the monitoring/consistency checks. Until SharePointPnP adds support for PowerShell Core, we are still very limited in the container images we can choose from; we need an image that runs WMF 4.0+. My recommendation is to run a Windows Server 2019 image (mcr.microsoft.com/windows:1809). As you can tell by the previous link, I personally use Docker for Windows to run my containers. With this approach, you can simply spin up a different container for each tenant you are trying to configure. You could also leverage Azure Container Instances to achieve the same thing in a completely cloud-hosted fashion.
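As a rough sketch, spinning up such a container with Docker for Windows could look like the following; the container name and the .mof path are illustrative assumptions:

```powershell
# Pull the Windows Server 2019 image and start an interactive PowerShell session
docker pull mcr.microsoft.com/windows:1809
docker run -it --name o365-monitor mcr.microsoft.com/windows:1809 powershell

# Inside the container: install the module and run the consistency check
Install-Module Office365DSC -Force
Test-DscConfiguration -ReferenceConfiguration "C:\DSC\localhost.mof" -ComputerName localhost
```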
As you can tell by the present article, we are still working on how to best position Office365DSC for customers out there, keeping the focus on reducing both the complexity requirements for the environment and the cost for users and their organizations. More details will be provided on this blog as we evolve our strategy. In the meantime, I would like to encourage the community to use the comments section below to initiate a constructive discussion thread around the topic!
Office365DSC is an Open-Sourced PowerShell module that allows you to define the configuration of your Office 365 Tenants as code. This makes it easier to integrate Office 365 deployments with your existing Continuous Integration and Delivery pipelines. Since the module relies on PowerShell Desired State Configuration, it also automatically monitors your Office 365 tenants for configuration drifts. As if this was not enough, Office365DSC is also the very first DSC module out-there to have native ReverseDSC support in it. Users can therefore extract full-fidelity configuration scripts out of their existing tenants.
The present article describes the steps to install and run Office365DSC on a brand new machine. At the time of writing this article, the module is still in an alpha version. The plan is to have the module “RTM” during the 2019 SharePoint North America conference at the end of May. Because most machines don’t have the latest version of the PowerShellGet module which allows you to automatically grab PreRelease versions of modules from the PowerShell Gallery, we will need to start by updating this module.
This section describes the steps that need to be performed in order to install the Office365DSC module onto a machine or server. Please note that these steps require PowerShell version 5.0 or greater.
Open a new PowerShell session as an administrator and run the following lines of PowerShell to update your module:
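A minimal sketch of those lines, assuming the modules are pulled from the PowerShell Gallery:

```powershell
# Update PowerShellGet so that the -AllowPrerelease switch becomes available
Install-Module PowerShellGet -Force -AllowClobber

# Restart your PowerShell session, then install the alpha build of Office365DSC
Install-Module Office365DSC -AllowPrerelease -Force
```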
Once the module is installed, you can now execute your configuration locally which will automatically trigger a connection to your Office 365 tenant. You are now also able to trigger the ReverseDSC console by simply running the following PowerShell command:
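Assuming the module’s Export-O365Configuration entry point (its public ReverseDSC function), the command is simply:

```powershell
# Launches the ReverseDSC graphical extraction interface
Export-O365Configuration
```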
Running the above command will launch the Office365DSC ReverseDSC Graphical Interface as shown below. Simply select the components you wish to extract, enter the credentials of a Global Administrator account for your tenant in the top right section, and click the Start Extraction button.
This will automatically initiate an extraction of the existing configuration from your existing Office 365 tenant. Once the extraction completes, you will be prompted to provide a path to store the extracted configuration. If the path you provided doesn’t exist, the module will automatically create it.
This past week I had the honor of presenting a session on Azure DevOPS for SharePoint Admins at the North American Collaboration Summit in Branson, Missouri, alongside some of the best and smartest SharePoint people there are. Huge thanks to my friend Mark Rackley (@MRackley) for putting on such a great show. In my session, I demoed how enterprises can now manage their entire SharePoint configurations as code (Office 365 and on-premises) using SharePointDSC and Office365DSC, and how they can automate the deployment of these changes using Azure DevOPS pipelines. In our main demo, we wanted to make sure that .JSON files were not being blocked in our SharePoint environment, to allow SharePoint Framework webparts to run. In order to do this, we simply modified our main configuration script, which was located in our Azure DevOPS repository:
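The exact change is not reproduced here, but assuming SharePointDsc’s SPWebAppBlockedFileTypes resource, the modification would look something like this sketch (the web application URL and credential variable are hypothetical):

```powershell
# Ensure the json extension is never in the web application's blocked list
SPWebAppBlockedFileTypes AllowJSONFiles
{
    WebAppUrl            = "http://sites.contoso.com"  # illustrative URL
    EnsureAllowed        = @("json")
    PsDscRunAsCredential = $SetupAccount
}
```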
By simply checking in this code change back into Azure DevOPS, an automatic build was triggered inside of Azure DevOPS pipelines, and upon completing this build then triggered a Release Pipeline which simply updated the latest code changes to an Azure Automation Account. Our servers then automatically picked up this new configuration and applied it.
To add to the magic, the PowerShell script that gets executed as part of our release process included logic to automatically connect our servers to configurations that matched their names. This means that if we ever decided to add an additional server to our farm, the ONLY thing we would have to do would be to modify our associated Configuration Data file (we have one for dev, one for QA, and one for Prod) to list the additional server, et voilà! That server would automatically get added to the farm and configured. All the material used in my session is available on GitHub at: https://Github.com/NikCharlebois/Conferences/2019%20-%20CollabSummit%20-%20Branson
If you have any questions regarding the content of the session, please contact me on Twitter at @NikCharlebois.
A few weeks ago, I posted what I consider to be a “tease” post regarding a new project I am involved with, Office365DSC. Today, I would like to share additional details regarding our plans for this module. Throughout the development process, the team and I will be completely transparent about our progress, road-map and vision. I will be using my personal blog to publish examples and keep the community informed about upcoming releases and changes.
The focus for today’s blog post will be on the ReverseDSC capabilities that are directly built into the module. For over two years now, the PowerShell team and I have been discussing possible approaches to have the ReverseDSC capabilities baked-in within each module, but could never figure out what the best approach to enforce this would be for existing modules. Since Office365DSC is a brand new resource, we decided to use it as our very first Proof-of-Concept, and have it included directly within it. There is currently an active Request for Change that proposes the introduction of a 4th function in each resource which would allow for the extraction of DSC configuration scripts for it.
The Office365DSC Approach
Our approach for Office365DSC was to introduce that 4th method, named Export-TargetResource, in each of our resources as a standard. The logic for this function is kept very simple: it calls into the Get-TargetResource function and converts the returned hashtable into a consumable DSC block, represented as a string. The function only receives the required parameters (primary and secondary keys), and it is a one-to-one mapping, meaning it receives the keys of a specific instance of the given resource and returns the string block representing that one instance.
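A simplified sketch of what such an Export-TargetResource function could look like; the resource name, parameter set, and string-building logic are illustrative assumptions, not the module's actual implementation:

```powershell
function Export-TargetResource
{
    [CmdletBinding()]
    [OutputType([System.String])]
    param
    (
        [Parameter(Mandatory = $true)]
        [System.String]
        $DisplayName,

        [Parameter(Mandatory = $true)]
        [System.Management.Automation.PSCredential]
        $GlobalAdminAccount
    )

    # Retrieve the current values for this one instance of the resource
    $result = Get-TargetResource @PSBoundParameters

    # Convert the returned hashtable into a DSC string block
    $content  = "        O365Group " + (New-Guid).ToString() + "`r`n"
    $content += "        {`r`n"
    foreach ($key in $result.Keys)
    {
        $content += "            $key = `"$($result[$key])`"`r`n"
    }
    $content += "        }`r`n"
    return $content
}
```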
In order to collect information about every instance of every resource supported by the module, we introduced a new public function, defined in the Office365DSCUtil.psm1 utility module, named Export-O365Configuration. This function is responsible for retrieving the keys of all instances of each resource, calling the resource-specific Export-TargetResource function, and collecting and merging the retrieved DSC string blocks. To reduce the complexity of this function, we leverage the existing logic from the ReverseDSC module; Office365DSC is therefore dependent on that module.
How to Use It?
As of this morning, a new alpha release of the Office365DSC module has been published to the gallery. This alpha release contains the mechanisms to automate the ReverseDSC extraction of existing Office 365 tenants. In order to download it and initiate a ReverseDSC extract, you can run the following lines of PowerShell. Note: in order to initiate the extract, you will need to provide credentials for a user who is assigned the Global Admin role.
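Those lines amount to the following sketch; note you may first need to update PowerShellGet for the -AllowPrerelease switch to be available:

```powershell
# Grab the alpha bits from the PowerShell Gallery
Install-Module Office365DSC -AllowPrerelease -Force

# Initiate the ReverseDSC extraction; you will be prompted for Global Admin credentials
Export-O365Configuration
```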
The extraction flow, as shown in the accompanying screenshots: inputting the Office 365 Global Admin credentials; installing Office365DSC with PowerShell (the Install-Module cmdlet automatically downloads and installs the dependencies); importing the Office365DSC module, which displays a warning about the SharePoint Online dependencies; initiating the Office365DSC extraction process; being prompted to select a destination to save the files; and finally, confirmation that the Office 365 configuration was successfully extracted.
We hope that this article properly articulated our plans for Office365DSC. By making it as easy as possible for organizations to reverse engineer the configuration of their existing Office 365 tenant, we hope to open the door to brand-new adoption scenarios (e.g., easy replication across tenants) that weren’t possible up to now. As always, we encourage the community to submit feedback using the Issues tab in the GitHub repository for the project, and to contribute to the initiative by forking the repository and submitting Pull Requests.
Over the past few weeks, a team of Microsoft Premier Field Engineers from around the globe and I have been kicking the tires on a crazy idea we’ve had for a while: a PowerShell DSC module that would allow us to manage Office 365 configurations. This is an ambitious task, and one that is a little disruptive in itself, since DSC is traditionally seen as focusing on managed software and not Platform-as-a-Service. The idea was to create a module that could run on any machine (or agent) that has connectivity to Office 365 and remotely perform configuration changes and monitoring of drifts. The plan is to have the module focus on the following workloads to begin with, but we plan to expand its reach to all Office 365 workloads:
Today, the team and I are proud to release a very early preview of the Office365DSC PowerShell DSC module. As stated, the module is in its very early stages and, at the time of writing this article, only officially supports:
O365Group: Office 365 Groups (Security, Distribution List, Mail enabled and Office 365)
I am writing this blog post knowing that the module is very light at the moment. My goal with this article, however, is to make the community aware that the effort is currently underway, and that people who want to contribute are encouraged to report issues, share comments and feedback, or fork the repository and submit Pull Requests to help out with the code base.
As it currently stands, the module allows you to deploy new SharePoint site collections, assign them resources and storage quotas, create Office 365 Groups and assign members to them, and create new users and assign or remove licenses. Because the module is a DSC one, it also allows you to monitor the status of the Office 365 tenant. For example, if the user John Smith is supposed to be assigned a Tenant Admin license, but someone modified John’s properties in Office 365 and only granted him Billing rights, DSC would automatically detect that the remote Office 365 environment is not in the desired state and report it or attempt to correct it. What’s more, with the help of ReverseDSC, you will be able to reverse engineer an existing Office 365 tenant and replicate its configuration across other tenants. We are thrilled about the possibilities that will open up with the advent of Office365DSC. Stay tuned for more details on upcoming releases!