FormatterServices.GetUninitializedObject Is Obsolete: What Can We Use Instead?
Chaminda's Blog
6d ago
FormatterServices.GetUninitializedObject is obsolete and produces a warning, as shown below, if we try to use it in our code. The whole FormatterServices class is obsolete as per the documentation. What is the alternative? Let's find out with an example. Below is a simple interface. namespace getuntnitializeobject { public interface IMessage { string GetEventHubName(); } } Using this interface we have two classes defined as below. namespace getuntnitializeobject { internal class NewInvoiceMess ..read more
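A hedged sketch of the likely direction the post takes: the documented non-obsolete equivalent is System.Runtime.CompilerServices.RuntimeHelpers.GetUninitializedObject, which allocates an instance without running any constructor. The NewInvoiceMessage class below is a stand-in for the classes defined in the post.

```csharp
using System;
using System.Runtime.CompilerServices;

namespace getuntnitializeobject
{
    public interface IMessage
    {
        string GetEventHubName();
    }

    // Stand-in for the post's class; the real post defines its own classes.
    internal class NewInvoiceMessage : IMessage
    {
        public NewInvoiceMessage() => Console.WriteLine("ctor ran");
        public string GetEventHubName() => "new-invoice";
    }

    internal static class Program
    {
        private static void Main()
        {
            // RuntimeHelpers.GetUninitializedObject replaces the obsolete
            // FormatterServices call: it allocates the object without
            // invoking any constructor ("ctor ran" is never printed).
            var message = (IMessage)RuntimeHelpers.GetUninitializedObject(typeof(NewInvoiceMessage));
            Console.WriteLine(message.GetEventHubName());
        }
    }
}
```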
Visit website
Installing Missing Fonts in the windowsservercore-ltsc2022 Docker Image Using Azure Pipelines with az acr build
Chaminda's Blog
3w ago
The missing fonts in windowsservercore-ltsc2022 Docker images can be installed as described in the blog post here. However, when we are not using hosted agents and instead use Kubernetes-based self-hosted build agents, we do not have access to the host machine to perform all the steps described in the post "Adding optional font packages to Windows containers". Since Docker is not supported on a self-hosted build agent running as a container in AKS, we have to use az acr build to build the Docker images in such cases. To set up fonts in this kind of situation in an Azure DevOps pipel ..read more
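A sketch of the kind of step the post builds toward: an AzureCLI@2 task running az acr build, so the image is built inside ACR and no local Docker daemon is needed on the agent. The service connection and registry names below are placeholders, and the font-installing Dockerfile is assumed to be in the build context.

```yaml
- task: AzureCLI@2
  displayName: 'Build image with az acr build'
  inputs:
    azureSubscription: 'azure-rm-connection'   # placeholder service connection
    scriptType: pscore
    scriptLocation: inlineScript
    inlineScript: |
      # The build runs inside Azure Container Registry, not on the agent
      az acr build `
        --registry myregistry `
        --image fonts-demo:$(Build.BuildId) `
        --file Dockerfile .
```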
Visit website
Dynamically Control Matrix of Jobs in GitHub Actions Based on Input Parameter Value
Chaminda's Blog
1M ago
We can use a matrix in GitHub Actions to create multiple jobs from a single job definition, as described here in the documentation. Let's say we pass the list of application names we want to build as an input parameter to the action workflow, and need the ability to remove items from the app list when triggering it manually (run workflow). For example, we have 4 apps by default. However, when needed, we should be able to build only one or two of them using the same action workflow without having to change the workflow definition. In this post let's explore ho ..read more
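One common way to get this behavior (a sketch, not necessarily the post's exact approach) is to take the app list as a JSON-array input and feed it to the matrix with fromJSON; the default list of four apps below is illustrative.

```yaml
on:
  workflow_dispatch:
    inputs:
      apps:
        description: 'JSON array of apps to build'
        required: true
        default: '["app1","app2","app3","app4"]'

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # fromJSON turns the edited input string into a real list,
        # so removing an entry at trigger time removes that job
        app: ${{ fromJSON(inputs.apps) }}
    steps:
      - run: echo "Building ${{ matrix.app }}"
```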
Visit website
Loop Jobs Based on Parameter Value in Azure DevOps Pipelines
Chaminda's Blog
1M ago
Consider a situation where we want to perform the same set of steps in a pipeline multiple times. A good example would be building or deploying multiple apps using the same set of steps. Let's explore this example to understand how we can loop through a set of pipeline steps to build multiple apps, using a list of app names provided as a parameter in the pipeline. The app name list can be provided as a parameter as shown below. parameters:   - name: apps     displayName: 'Apps to deploy'     type: object    ..read more
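The looping pattern being described can be sketched with an `${{ each }}` expression over the object parameter; the job contents and default app names here are placeholders.

```yaml
parameters:
  - name: apps
    displayName: 'Apps to deploy'
    type: object
    default: [app1, app2, app3, app4]

jobs:
  # One job is generated per entry in the apps parameter at compile time
  - ${{ each app in parameters.apps }}:
      - job: build_${{ app }}
        displayName: 'Build ${{ app }}'
        steps:
          - script: echo "Building ${{ app }}"
```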
Visit website
Update Azure Pipeline Library Group Variable Value in Azure Pipeline using CLI
Chaminda's Blog
1M ago
We can set a variable value in Azure Pipelines using task.setvariable. This only sets a variable in the pipeline, not in a variable group. If we want to set a variable in a library variable group in Azure DevOps, we have to use the azure-devops extension for the Azure CLI. Let's explore how to update a library variable group variable value using an Azure pipeline step. We can use a task similar to the one below to update an existing variable in a variable group. - task: PowerShell@2 name: setup_blue_green_controls_vars displayName: 'Setup blue-green control ..read more
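A sketch of the CLI call such a task would run, assuming the azure-devops extension is installed on the agent; the group id and variable name (3, myVar) are placeholders, and the pipeline's System.AccessToken is passed via the AZURE_DEVOPS_EXT_PAT environment variable to authenticate.

```yaml
- task: PowerShell@2
  name: update_vg_var
  displayName: 'Update variable group variable'
  env:
    # Lets az devops commands authenticate with the pipeline's token
    AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
  inputs:
    targetType: inline
    script: |
      az pipelines variable-group variable update `
        --organization "$(System.CollectionUri)" `
        --project "$(System.TeamProject)" `
        --group-id 3 `
        --name myVar `
        --value "newValue"
```

Note the build service account needs Administrator permission on the variable group for the update to succeed.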
Visit website
Deploying Kubernetes Event Driven Autoscaling (KEDA) with Azure Pipelines Using Helm
Chaminda's Blog
1M ago
We have discussed how to deploy KEDA using Helm in the post "Setting Up Kubernetes Event Driven Autoscaling (KEDA) in AKS with Workload Identity". Instead of deploying KEDA manually, it is better to automate the deployment. Let's look at the steps to get KEDA deployed using Azure Pipelines. As the first step we need kubectl and helm installed on the pipeline agent.       - task: KubectlInstaller@0         displayName: 'Install Kubectl latest'             - task: HelmInstaller@0         d ..read more
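Continuing the sketch: after the installer tasks, a script step can add the public kedacore chart repo and install KEDA (the namespace name is an assumption).

```yaml
- task: KubectlInstaller@0
  displayName: 'Install Kubectl latest'

- task: HelmInstaller@0
  displayName: 'Install Helm latest'

- script: |
    helm repo add kedacore https://kedacore.github.io/charts
    helm repo update
    # upgrade --install makes the step idempotent across pipeline runs
    helm upgrade --install keda kedacore/keda --namespace keda --create-namespace
  displayName: 'Deploy KEDA with helm'
```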
Visit website
Scale Pods in AKS with Kubernetes Event Driven Autoscaling (KEDA) ScaledJob Based on Azure Service Bus Queue as a Trigger
Chaminda's Blog
1M ago
In previous posts we discussed "Setting Up Kubernetes Event Driven Autoscaling (KEDA) in AKS with Workload Identity" and how to "Set Up (KEDA) Authentication Trigger for Azure Storage Queue/Service Bus in AKS". With that, we can now proceed to set up a Kubernetes scaled job in AKS to run a pod when the Azure Service Bus queue receives a message. Using a scaled job, we are going to start a job (pod) once a message is received in the queue, then receive the message in the pod's container app, process and complete the message, and finish the job execution with a pod complete. So, there will be ..read more
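The flow described above maps to a KEDA ScaledJob roughly like the sketch below; the image, queue, and Service Bus namespace values are placeholders, and sb-trigger-auth is an assumed TriggerAuthentication name set up as in the earlier post.

```yaml
apiVersion: keda.sh/v1alpha1
kind: ScaledJob
metadata:
  name: queue-processor
spec:
  jobTargetRef:
    template:
      spec:
        containers:
          - name: processor
            image: myacr.azurecr.io/processor:latest   # placeholder image
        restartPolicy: Never
  maxReplicaCount: 10
  triggers:
    - type: azure-servicebus
      metadata:
        queueName: invoices              # placeholder queue name
        namespace: my-sb-namespace       # placeholder Service Bus namespace
        messageCount: "1"                # start a job per pending message
      authenticationRef:
        name: sb-trigger-auth
```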
Visit website
Setting Up (KEDA) Authentication Trigger for Azure Storage Queue/Service Bus in AKS
Chaminda's Blog
1M ago
We have discussed setting up Kubernetes Event Driven Autoscaling (KEDA) with AKS workload identity in the post "Setting Up Kubernetes Event Driven Autoscaling (KEDA) in AKS with Workload Identity". The purpose of KEDA is to scale a scaledjob/deployment in Kubernetes once we receive messages in a queue, such as an Azure storage queue or Azure service bus queue. For KEDA to communicate with and monitor such a queue to scale a job or deployment, it needs authentication to access the queue. We can set up the required authentication using a connection string ..read more
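With workload identity in place, the TriggerAuthentication can avoid connection strings entirely; a minimal sketch (the resource name is a placeholder):

```yaml
apiVersion: keda.sh/v1alpha1
kind: TriggerAuthentication
metadata:
  name: sb-trigger-auth
spec:
  podIdentity:
    # Tells KEDA to authenticate to the queue via the
    # federated workload identity instead of a connection string
    provider: azure-workload
```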
Visit website
Setting Up Kubernetes Event Driven Autoscaling (KEDA) in AKS with Workload Identity
Chaminda's Blog
1M ago
We have discussed setting up workload identity in AKS to be used with the application containers we deploy to AKS in the post "Setting Up Azure Workload Identity for Containers in Azure Kubernetes Services (AKS) Using Terra- Improved Security for Containers in AKS". Kubernetes Event Driven Autoscaling (KEDA) is the mechanism we need when we want to scale our deployments, or especially Kubernetes jobs (a pod that runs to completion). In this post let's look at how to set up KEDA with workload identity, so that we can use KEDA in later posts to run a Kubernetes job, autoscaled based ..read more
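A sketch of installing KEDA via Helm with workload identity enabled; the client and tenant id values are placeholders, and the podIdentity.azureWorkload chart values should be checked against the KEDA chart version in use.

```shell
helm repo add kedacore https://kedacore.github.io/charts
helm repo update
helm upgrade --install keda kedacore/keda \
  --namespace keda --create-namespace \
  --set podIdentity.azureWorkload.enabled=true \
  --set podIdentity.azureWorkload.clientId=<keda-identity-client-id> \
  --set podIdentity.azureWorkload.tenantId=<tenant-id>
```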
Visit website
Azure File Share with DefaultAzureCredential in .NET with Azure.Storage.Files.Shares - Is it possible?
Chaminda's Blog
1M ago
Using DefaultAzureCredential is straightforward with most Azure resources via the relevant Azure .NET SDKs (we can use the NuGet packages Azure.Storage.Blobs and Azure.Identity). For example, with a storage blob we can easily use DefaultAzureCredential as shown in the code below.     private static BlobServiceClient GetBlobServiceClient(string accountName)     {         return new(new Uri($"https://{accountName}.blob.core.windows.net"),             new De ..read more
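For file shares the answer is more nuanced: recent Azure.Storage.Files.Shares versions do accept a TokenCredential, but only when a ShareTokenIntent is set on the client options. The sketch below assumes a recent SDK version; the exact option name should be verified against the version in use.

```csharp
using System;
using Azure.Identity;
using Azure.Storage.Files.Shares;
using Azure.Storage.Files.Shares.Models;

internal static class ShareClientFactory
{
    // Sketch: token-credential auth for Azure Files requires stating the
    // token intent (Backup here); older SDK versions reject TokenCredential.
    public static ShareServiceClient GetShareServiceClient(string accountName)
    {
        var options = new ShareClientOptions
        {
            ShareTokenIntent = ShareTokenIntent.Backup
        };
        return new ShareServiceClient(
            new Uri($"https://{accountName}.file.core.windows.net"),
            new DefaultAzureCredential(),
            options);
    }
}
```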
Visit website
