Inside Technologies offers solutions and services for the IT world. Its most established areas of expertise are Private and Public Cloud, Virtualization, System Management, Backup and Disaster Recovery, Security & Identity Management, Collaboration and Mentoring. Founded and directed by a visionary expert, Silvio Di Benedetto, Inside Technologies meets the needs of companies of any size.
In recent days, several users have reported a critical error when using Veeam Backup & Replication: they receive the message Failed to check certificate expiration date.
Figure 1 – Error Message
The bug affects every computer where Update 3 is installed. The self-signed certificate has a validity period of one year, and since Update 3 was released in 2017, the certificate has simply expired. Fortunately, there is a way to fix this in four steps:
Open the General Options menu
Click on the Security tab
Click the Install button
Start the wizard, select the Generate New option – figure 2 – and enter a new name – figure 3.
Figure 2 – New Certificate
Figure 3 – Enter Name
On the summary page, check the new expiration date, as shown in figure 4.
Figure 4 – Summary Page
Restart the machine just to be sure.
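Before (or after) running the wizard, you can check the expiration date of the current self-signed certificate from PowerShell. This is a minimal sketch, assuming the certificate sits in the local machine's personal store and has "Veeam" in its subject:

```powershell
# List certificates whose subject mentions Veeam and show when each one expires
Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Subject -like '*Veeam*' } |
    Select-Object Subject, NotAfter,
        @{ Name = 'Expired'; Expression = { $_.NotAfter -lt (Get-Date) } }
```

If the Expired column shows True, generate a new certificate as described above.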
This bug should be resolved in the upcoming Update 4, where the expiration date will probably be set much further in the future to avoid similar behavior.
The System Center team has released a new version of Microsoft Azure Backup Server (MABS). For those unfamiliar with it, MABS is a product that protects your on-premises environment, with the ability to keep short-term retention locally and long-term retention in Azure through a Backup Vault. MABS is free for all Azure Backup users, which is no small benefit. Behind the scenes it is based on Data Protection Manager but, to simplify the model, there are some trade-offs: on the one hand you can use SQL Express (instead of SQL Standard); on the other hand there is no way to use a tape library. v3 introduces many new features:
Backup Storage Migration: By using volume-to-volume migration in MABS v3, you can move your backups on-premises for more optimized storage management. You can also move your data source to a different volume if a volume is near capacity and cannot be extended.
Prevent Unexpected Data Loss: Assigning wrong volumes for backup storage can cause critical data loss. MABS v3 lets you disable such volumes by using Windows PowerShell. All volumes and mount points, except System Volumes, are available for MABS storage.
Custom Size Allocation: Modern Backup Storage (MBS) optimizes storage consumption by using storage only when it is necessary. After you configure MABS for protection, it calculates the size of the data that is being backed up. When many data sources are backed up together, this calculation can take a while. By using MABS v3, you can configure MABS to accept the volume size as a default, instead of calculating the size of each file and folder that is being backed up.
Optimized CC for RCT VMs: MABS v3 optimizes network and storage consumption by transferring only the changed data during any Consistency Checks for RCT (the native change tracking in Hyper-V) VMs.
VMware VM Protection Improvements: VMware VM backup was in preview mode on Windows Server 2016 with MABS v2 installed. With MABS v3, it is supported in production. MABS v3 includes support for vCenter/vSphere and ESX/ESXi versions 6.5, 6.0, and 5.5.
TLS 1.2 Support: TLS 1.2 is a secure method of communication that is recommended by Microsoft. For MABS, TLS 1.2 is applicable for protecting workloads to cloud.
SQL Server 2017 Support: MABS v3 can be installed together with Microsoft SQL Server 2017. You can either upgrade your SQL Server from SQL Server 2016 to SQL Server 2017 or use a fresh instance of SQL Server 2017 as an MABS database. By using v3, you can seamlessly back up the SQL Server 2017 workload.
Windows Server 2019 Support: MABS v3 can be installed on Windows Server 2019. You can upgrade your system to Windows Server 2019 either before or after you install MABS v3 or a later version.
This update fixes the following issues:
No billing for protected collocated VMware data sources in MABS v2.
MABS server crashes while sending billing information for a clustered data source.
Indefinite looping of RCT VM consistency check jobs because of I/O errors.
When you create a protection group, MABS displays a warning message about exceeding data limits. This issue occurs even for small data sources when you are protecting Hyper-V virtual machines.
The MABS UI stops responding after it starts if there are many recovery points for data sources.
Hyper-V backups fail because of checkpoint time-outs on the Hyper-V server.
MABS console crashes when the nightly pruning job runs for a long time.
MABS status report does not display client/laptop recovery points.
Consistency checks for Hyper-V VMs transfer more data than the size of the VMs.
When you try to recover files and folders from Azure, some items that have a path length longer than 256 characters in the online recovery point are not visible in the UI.
The UI stops responding for several minutes when a consistency check job is triggered.
Severity is lowered from “Critical” to “Informational” for the alert that is raised when an online backup fails because new disk backups are present since the last online backup.
BMR and System State data source backups consume more storage when you use Modern Backup Storage.
Alternate synchronization jobs for files and folders on a deduped volume raise failed file alerts incorrectly. Failed files log links in failed file alerts do not show the failed files.
System Center Configuration Manager allows you, among many other things, to centrally manage Universal Windows Apps such as Mail, OneNote, Edge and more. From an advanced management perspective, it is undoubtedly essential to have a company store, with software approved by the IT department, distributed to users without any manual intervention.
In this article we will see how to configure the integration between SCCM and the Microsoft Store for Business, the portal dedicated to managing the tenant from the Windows point of view. MSfB has evolved over time: it started as a simple portal for managing business applications, then added management of the corporate wallet, and finally gained the Autopilot functionality.
The integration between the two worlds once again passes through Microsoft Azure, as you need to create a point of contact between your Azure Active Directory and Configuration Manager. The first step is to register a new Application within Azure Active Directory.
Assign a name and a URL (which does not need to resolve to anything), and make sure the Web app/API entry is selected, as shown in figure 1.
Figure 1 – Creating Apps
Once the application is created, open the Settings blade to generate a new Key – figure 2; this will secure the connection between on-premises and cloud. Set a description and set the expiration period to Never Expires.
Figure 2 – Key Creation
As you can see, the value of the key is not generated until you press the Save button. It is important to remember that this value is shown only once for security reasons, so make sure you copy it somewhere safe.
Once the key is created, it is time to configure the Store for Business. Those who have never activated it can do so free of charge at https://businessstore.microsoft.com/. Go to Settings – Distribute and add a new Management Tool, as shown in Figure 3.
Figure 3 – New Management Tool
A window will open where you can search for the application created within the Azure portal – figure 4.
Figure 4 – Application Search
Configure Configuration Manager
To configure SCCM with the Store you must have at least build 1706, even though it is close to end-of-support. You also need to enable the Service Connection Point role, which handles communication with Azure.
Within the Azure Services section (Administration area), create a new connector by selecting the Microsoft Store for Business entry, as shown in figure 5.
Figure 5 – New Connector
The next screen lets you choose between creating or importing the app; having already created it at the beginning, choose Import Apps. Enter the values required by the wizard:
Azure AD Tenant Name: available inside the Azure AD tenant
Azure AD Tenant ID: available inside the Azure AD tenant
Application Name: the name of the application you created earlier
Client ID: Application ID of the app you created earlier
Secret Key: The key generated by the system at the time of application creation
App ID URI: The URL of the application you created earlier
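For reference, the same app registration can also be scripted. This is a sketch using the AzureAD PowerShell module; the display name and identifier URI are placeholder values, not the ones you must use:

```powershell
# Requires the AzureAD module: Install-Module AzureAD
Connect-AzureAD

# Create the app registration (name and URI are examples)
$app = New-AzureADApplication -DisplayName 'SCCM-MSfB' `
    -IdentifierUris 'https://contoso.com/sccm-msfb'

# Generate a client secret with a far-future expiration,
# matching the Never Expires choice made in the portal
$key = New-AzureADApplicationPasswordCredential -ObjectId $app.ObjectId `
    -EndDate (Get-Date).AddYears(99)

$app.AppId           # Client ID
$app.IdentifierUris  # App ID URI
$key.Value           # Secret Key – shown only once, store it safely
```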
Validate the settings and, if everything goes well (figure 6), you can continue, closing the window and making sure the app name is reported in the App Properties pane – figure 7.
Figure 6 – Import Apps
Figure 7 – Imported Application
Your configuration is almost complete; you just have to validate it and finish the wizard – figure 8. From this point forward, you can approve applications from the Store and distribute them in Configuration Manager, just as you already do today with Win32 applications.
Figure 8 – End Wizard
Windows 10 and Store
Windows 10 automatically shows the public store, which can be turned off via GPO or limited to show only the business section. To do this you can follow the directions featured in this article: Windows 10: Advanced Management.
The integration of SCCM with the Microsoft Store allows you to extend the management features and simplify the life of IT administrators, both for maintenance and for everything concerning the centralized distribution of software approved by the company.
In a future article, we will see how to manage your apps directly with SCCM.
Many companies deliver their applications remotely, to avoid installing the software on every single workstation and to optimize resources, without neglecting the possibility of accessing the applications even from outside the company (perhaps by hosting them in the cloud).
The integration of this software with the Office world is much tighter than you might think, and in some cases the main protagonist is Outlook, which serves as the engine for sending e-mail.
Sometimes it can happen that, although the virtual machine where the software runs is joined to Active Directory (or Azure Active Directory), the integration with Office 365 does not work correctly. The most frequent scenario that triggers these issues is when the user's e-mail address is different from his or her username.
Figure 1 – Account Office 365
Trying to configure Outlook returns an error stating that the account cannot be configured – Figure 2.
Figure 2 – Account Error
Even the Microsoft Support and Recovery Assistant (https://diagnostics.outlook.com) does not get you out of the problem; however, the resolution is much simpler than it seems. Log back into the Office 365 admin center and add a new alias, using the xxxx.onmicrosoft.com domain (instead of your public domain), as shown in Figure 3.
Figure 3 – New Alias
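If you prefer PowerShell over the portal, the alias can also be added with the Exchange Online module. This is a sketch where the user and tenant names are examples:

```powershell
# Requires: Install-Module ExchangeOnlineManagement
Connect-ExchangeOnline

# Add an onmicrosoft.com alias to the affected mailbox
Set-Mailbox -Identity 'user@contoso.com' `
    -EmailAddresses @{ add = 'user@contoso.onmicrosoft.com' }

# Verify the alias is present
(Get-Mailbox -Identity 'user@contoso.com').EmailAddresses
```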
Once done, you can try to configure Outlook again, this time entering the alias – Figure 4.
Figure 4 – Outlook configuration with Alias
Because the alias is different from the user name, you will be prompted for authentication – Figure 5.
Figure 5 – Authentication
Once the correct values are entered, you will receive the go-ahead on the configuration – Figure 6.
Figure 6 – Completed configuration
Easy, no? Well done Franky! We are now able to use Outlook in an RDS environment.
One month after pulling the release due to isolated reports of users missing files after updating to the latest Windows 10 feature update, Microsoft has re-released Windows Server 2019 and Windows 10 v1809 in General Availability.
There are many new features in Windows 10 1809: Sets, Your Phone, Windows Security Center, Windows Autopilot enhancements and much more. The new build bundles many improvements that increase security and performance, but also the integration with the cloud and other platforms.
It's not easy to describe what's new in Windows Server 2019, because the product team has done an amazing job. From my point of view, I really love the great improvements to Hyper-Converged Infrastructure (HCI), with performance up to 10x, and the new functionality in ReFS, without forgetting native support for Persistent Memory for physical and virtual machines. The other two cool features are System Insights and Storage Migration Services. Windows Server 2019 is also the first server operating system with full support for Windows Defender Advanced Threat Protection.
Windows Server 2019 continues the support for containers, and the first big news is native support for Linux containers on Windows. The first edition of this integration was introduced in Windows Server 1803, but now it is available to all users (the Semi-Annual Channel is available only to Software Assurance customers). This enables you to have a heterogeneous container host environment while providing flexibility to application developers. Windows Server 2019 also continues the improvements to compute, networking and storage from the semi-annual channel releases needed to support Kubernetes on Windows.
The management platform of Windows Server 2019 is Windows Admin Center. For those who still don't know what WAC is, it's an evolution of the Windows Server in-box management tools; a single pane of glass that consolidates all aspects of local and remote server management. As a locally deployed, browser-based management experience, an Internet connection and Azure aren't required. Windows Admin Center gives you full control of all aspects of your deployment, including private networks that aren't Internet-connected.
Windows Admin Center is not only a console; in some cases it is the only way to manage certain features of Windows Server 2019, such as Storage Migration Services, System Insights, SDN, Azure integration and much more. Check my articles to learn more about WAC:
Windows 10 quality updates are cumulative, containing all previously released fixes to ensure consistency and simplicity, and are released monthly. Available only for Windows 10 1809 and Windows Server 2019, the goal is to emulate the full update, with the difference that the package should be very small, thanks to a new compression method. If you want more details, check this article: https://www.insidetechnologies.eu/en/blog/windows-10-and-windows-server-quality-updates/
Not Only Gold….Also Pain!
That's the good news, but the bad news is just around the corner: there is a big bug where Windows File Explorer indicates that mapped network drives are broken. This is the description of the known issue from the Windows 10 update history:
Mapped drives may fail to reconnect after starting and logging onto a Windows device. Symptoms include:
In File Explorer, a red “X” appears on the mapped network drives.
Mapped network drives show as “Unavailable“ when you run the net use command from a command prompt.
In the notification area, a notification displays, “Could not reconnect all network drives.”
The issue affects Windows 10 1809, Windows Server 2019, and Windows Server, version 1809. Microsoft says it is working on a resolution but warns admins not to expect a fix until “the 2019 timeframe“.
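Until the fix arrives, a common workaround is a small script run at logon (for example via a scheduled task) that re-establishes the broken mappings. This is a sketch, assuming the drives are standard SMB mappings:

```powershell
# Re-connect mapped drives that did not come back after logon
Get-SmbMapping | Where-Object { $_.Status -ne 'OK' } | ForEach-Object {
    # Remove the stale mapping and create it again persistently
    net use $_.LocalPath /delete
    net use $_.LocalPath $_.RemotePath /persistent:yes
}
```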
This is not the only issue present in Windows 10 1809; there are several problems, and users have kept finding bugs since the first release of the build. This could force IT admins to delay the upgrade and wait for the next release, 1903, to avoid blocks and critical failures.
Making Microsoft Intune similar to System Center Configuration Manager is one of the main goals the product team has been working toward for a long time. During the last Ignite, a new feature was announced that makes Win32 app deployment much easier than in the past.
Until last month, the only way to deploy an application to our remote clients was to use a PowerShell script. The system worked fine, but it wasn't very simple because it required scripting knowledge and a remote repository to host the source files (like Azure Blob storage).
In this article I will show you how to use the new method to deploy an application via Intune.
Once the tool is on your PC, you can convert the setup file with this syntax: .\IntuneWinAppUtil.exe -c "C:\Users\Oem\Downloads\7Zip\" -s "C:\Users\Oem\Downloads\7Zip\7z1805.msi" -o "C:\Users\Oem\Downloads\7Zip\"
The parameters accepted are:
-c – the folder with all source files
-s – the setup file
-o – the folder with output file
If everything works, you will receive output similar to figure 1.
Figure 1 – Setup Convert
NB: make sure the source folder contains only the files necessary for the deployment, to avoid failures or errors during conversion. The output folder can be a single folder shared by all your packages.
After you have converted all of your applications, it's time to start the wizard inside Microsoft Intune; select the Windows app (Win32) type, as shown in figure 2.
Figure 2 – New Deployment
Select the converted file – figure 3.
Figure 3 – Source File
The App Information area, figure 4, shows the install and uninstall commands for the package. Check the parameters, because it can happen that some information is not reported correctly.
Figure 4 – Setup Command
Select the Requirements (minimum version of Windows and architecture) and move to the Detection Rules area, figure 5, where you can manage the methods used to verify whether the software is already installed on the target machine. You can be more or less specific, for example detecting only the MSI product code or a specific build version.
Figure 5 – Detection Rule
Close the wizard and select the deployment target (System or User). Don't forget that the upload time depends on your Internet speed as well as on the source file size. Only when the upload has finished can the package be deployed.
If the software was deployed as Available, the end user will see it inside the Company Portal (another step toward making it similar to the SCCM Software Center).
Figure 6 – Company Portal
It doesn't matter if the end user is not a local admin, because the software will be installed without problems. This increases security and resolves many critical limitations for remote users. As with the upload, Internet bandwidth is critical to download the package. Once the task has finished, the software will be available in the Start Menu – figure 7.
Figure 7 – Software Installed
This new method simplifies all software deployment tasks and makes Intune the main point for everything, thanks also to the Company Portal. It's very important to analyze the log files inside the folder C:\ProgramData\Microsoft\IntuneManagementExtension\Logs, to verify whether there are errors or failures (in most cases generated by wrong syntax or a missing exit code); in case of problems, the resolution can often be applied from the Intune portal without touching the source file – avoiding a new upload.
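To inspect those logs quickly, something like the following works; the main log file name is the one normally written by the Intune Management Extension:

```powershell
$logDir = Join-Path $env:ProgramData 'Microsoft\IntuneManagementExtension\Logs'

# Follow the main log in real time during a deployment
Get-Content (Join-Path $logDir 'IntuneManagementExtension.log') -Tail 50 -Wait

# Or search every log in the folder for errors and failures
Get-ChildItem $logDir -Filter *.log | Select-String -Pattern 'error|failed'
```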
During a test in our lab environment, I found a strange issue with RDP sessions on Windows 10 1809. The Event Viewer showed me this error message.
There is no difference between a classic RDP session and an Enhanced Session on Hyper-V (which is RDP behind the scenes), so I investigated to understand the cause, because a machine outside the GPO worked fine.
I found an old setting that improves the graphics mode via RDP, under the Remote Desktop Services area, called Prioritize H.264/AVC 444 graphics mode for Remote Desktop Connections. To fix the issue, I set the GPO to Not Configured.
It's not clear whether there's a bug in the codec or whether Windows 10 no longer supports AVC:444. In any case, after a gpupdate /force, the machine was usable again.
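If you want to verify or revert the setting locally without waiting for the GPO refresh, the policy is written under the Terminal Services policy key. This is a sketch; the value name AVC444ModePreferred is my assumption based on the policy's ADMX mapping, so double-check it on your build:

```powershell
$key = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services'

# Check whether the policy is currently enforced
Get-ItemProperty -Path $key -Name 'AVC444ModePreferred' -ErrorAction SilentlyContinue

# Not Configured means the value is absent; the local equivalent is removing it
Remove-ItemProperty -Path $key -Name 'AVC444ModePreferred' -ErrorAction SilentlyContinue
gpupdate /force
```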
A new chapter about Windows Admin Center and the integration with Microsoft Azure. After seeing how to integrate the platform with Azure AD, extend server protection with Azure Backup, and protect entire virtual machines with Azure Site Recovery, it's time to see how to keep our servers updated thanks to Update Management.
I will not explain what Update Management is and why you should use it; if you are interested, check this article (Microsoft Azure: implement Update Management). Here we will see how to integrate WAC with the cloud solution. As usual, the requirements are the Azure AD integration in WAC and an Azure subscription.
The configuration must be done on each machine where you want to extend the management. Go to the Updates area; if the integration is not yet enabled, you will see a notification bar as shown in figure 1.
Figure 1 – Enable Update Management
The classic wizard, figure 2, will help us select the right Resource Group, Log Analytics workspace and Azure Automation account.
Figure 2 – Wizard
After a few seconds, a new section will appear at the bottom of the Updates area, indicating the integration with Update Management – figure 3.
Figure 3 – Server Managed via UM
Inside the Azure portal, in the Update Management section, the server should be present and marked as compliant or not, depending on the updates installed before the integration.
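Compliance can also be checked by querying the Log Analytics workspace directly. This is a sketch using the Az PowerShell modules; the workspace ID and computer name are placeholders:

```powershell
# Requires the Az modules: Install-Module Az
Connect-AzAccount

# Count the updates still needed by a server, grouped by classification
$query = @"
Update
| where Computer == 'SRV01' and UpdateState == 'Needed'
| summarize count() by Classification
"@
Invoke-AzOperationalInsightsQuery -WorkspaceId '<workspace-guid>' -Query $query
```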
Figure 4 – Server in Update Management
You're done! The last task is to add the machine to each Scheduled Update Deployment you have created, or to a global group, in case you have a dynamic group in Log Analytics or the integration with Active Directory.
The full integration between Windows Admin Center and Azure Update Management is a great feature that allows IT admins to stay compliant with security policies and forget about the patching life-cycle. The easy configuration lets you be up and running in a few minutes.
A new chapter dedicated to Windows Admin Center. In the first part, I showed how to integrate WAC with Microsoft Azure and how to use the Backup extension to protect the content of a single virtual machine, thanks to the integration with Azure Backup.
In this article we will see how to extend the protection to an entire Hyper-V host with Azure Site Recovery.
The integration between Windows Admin Center and Microsoft Azure gives you the possibility to protect your datacenter and extend Hyper-V with Azure Site Recovery. ASR is an Azure service where your on-premises VMs are replicated to the cloud, providing a disaster recovery site outside your company; the result is reduced downtime in case of a major failure. Obviously, planning an ASR model is not easy and, unless you have a standalone machine, many prerequisites are needed, but the idea is amazing.
To set up your host with ASR, you need to work on the Hyper-V machine: open the Virtual Machines area, click on the object and use the More menu – figure 1.
Figure 1 – Set up VM Protection
As with Azure Backup, ASR allows you to use an existing Resource Group and Recovery Services vault, as shown in figure 2. The procedure installs the MARS agent and configures the object inside your Azure subscription.
Figure 2 – Set up ASR
Once the agent is deployed, the Hyper-V host will be able to protect each single VM. Bear in mind that, at the moment, there are a couple of "limitations" in the console:
There’s no way to select the Azure VM plan
There’s no way to select Managed Disks
There’s no way to select the Hyper-V Site
In a few minutes your host will be ready; to be sure, you can check the new Hyper-V Site in ASR.
Remember that the policy called smepolicy, figure 3, cannot be replaced with another one. Even if you have already created a different policy, it cannot be assigned to this Hyper-V Site, otherwise the replication will fail.
Figure 3 – Retention Policies
Protect Virtual Machine
When the host is ready, click on the virtual machine you want to protect and use the menu to start the wizard – figure 4.
Figure 4 – Protect VM
After a couple of hours, depending on the Internet speed and the VHDX size, your virtual machine will be replicated to Azure. In the Hyper-V console you can view the replica status, figure 5, and in the Recovery Services vault you can see the Replicated Items, as shown in figure 6.
Figure 5 – Hyper-V Replica Status
Figure 6 – ASR Status
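The same status is visible from PowerShell on the host, using the standard Hyper-V replication cmdlets:

```powershell
# List replication state and health for all replicated VMs on this host
Get-VMReplication

# Detailed replication statistics for a single VM (the name is an example)
Measure-VMReplication -VMName 'SRV01'
```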
To complete the procedure and make failover possible, don't forget to configure the VM on Azure, as shown in figure 7. Without this step, the replica will fail!
Figure 7 – Configure ASR
By default, some settings are missing, as shown in figure 8:
Figure 8 – Configure Replica VM
If you want to run a Test Failover, a Planned Failover or a Failover, you must use the Azure portal. It's not clear whether there is a plan to change this, but in the end I don't think it's a critical point: if you need to run the protected VM, it's probably because the entire infrastructure is down, and that would be true for the WAC console as well.
The integration with Azure Site Recovery is another brick in the wall that extends our infrastructure to the cloud and reduces the effort for IT departments. The product team is working to improve the extension and make our life easier.