Local Docker Container Run with DefaultAzureCredentials
Chaminda's Blog
2w ago
 We discuss how to enable workload identity for containers running in AKS in the post "Setting Up Azure Workload Identity for Containers in Azure Kubernetes Services (AKS) Using Terraform - Improved Security for Containers in AKS". However, when we use DefaultAzureCredential and run Docker containers locally from a development machine, we do not have workload identity support. With Visual Studio we can run with the Azure AD user and run applications successfully. But if we use a docker run command and run the container locally, we will have to use an app registration ..read more
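As a sketch of the app-registration approach: DefaultAzureCredential includes an EnvironmentCredential fallback that reads the AZURE_TENANT_ID, AZURE_CLIENT_ID and AZURE_CLIENT_SECRET environment variables, so passing those values at docker run time lets the locally run container authenticate. The image name myapp:local and the placeholder values are illustrative, not from the post.

```shell
# Pass app registration credentials so DefaultAzureCredential
# (via its EnvironmentCredential fallback) can authenticate locally.
docker run --rm \
  -e AZURE_TENANT_ID="<your-tenant-id>" \
  -e AZURE_CLIENT_ID="<app-registration-client-id>" \
  -e AZURE_CLIENT_SECRET="<app-registration-client-secret>" \
  myapp:local
```

Prefer sourcing the secret from a local, git-ignored env file rather than typing it on the command line, so it stays out of shell history.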
Jump Into a Container Deployed in AKS (kubernetes)
1M ago
 We may sometimes want to jump into a container deployed in a Kubernetes pod to investigate the contents of the container, such as the files in it, or even to run commands and see how they behave inside a deployed container. For that purpose we need to jump into the container and obtain a command shell in it. Let's look at how we can jump into both Linux and Windows containers. To jump into a Linux container in AKS (Kubernetes), use the command syntax below kubectl exec -it <yourpodname> -n <k8snamespace> --container <containername> -- sh For Windows container u ..read more
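The excerpt is cut off before the Windows syntax; a commonly used form (not necessarily the exact command from the post) keeps the same kubectl exec shape and swaps the shell:

```shell
# Linux container: open a shell with sh (or bash where available)
kubectl exec -it <yourpodname> -n <k8snamespace> --container <containername> -- sh

# Windows container: cmd and powershell are the usual entry points
kubectl exec -it <yourpodname> -n <k8snamespace> --container <containername> -- cmd
kubectl exec -it <yourpodname> -n <k8snamespace> --container <containername> -- powershell
```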
Multiple KEDA Triggers for a Scaled Job with Event Hubs in AKS
1M ago
 A Kubernetes scaled job helps us run one job per event/message we receive from the queue/event hub. We can have an event handler job which can handle more than one type of event message or queue message. Let's look at what we need to consider when defining more than one trigger with Kubernetes Event-driven Autoscaler (KEDA) for a scaled job. The trigger definition can be as shown below. triggers: - type: azure-eventhub metadata: consumerGroup: orderhandler unprocessedEventThresho ..read more
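A minimal sketch of a ScaledJob carrying two azure-eventhub triggers, assuming the commonly used KEDA metadata fields (connectionFromEnv for the Event Hub connection and storageConnectionFromEnv for the checkpoint store); all names, images and env var keys below are illustrative, not from the post:

```yaml
apiVersion: keda.sh/v1alpha1
kind: ScaledJob
metadata:
  name: order-event-handler
spec:
  jobTargetRef:
    template:
      spec:
        containers:
          - name: handler
            image: myacr.azurecr.io/order-handler:latest   # placeholder image
        restartPolicy: Never
  triggers:
    # One trigger per event hub the handler consumes from.
    - type: azure-eventhub
      metadata:
        consumerGroup: orderhandler
        unprocessedEventThreshold: "10"
        connectionFromEnv: ORDERS_EVENTHUB_CONNECTION
        storageConnectionFromEnv: CHECKPOINT_STORAGE_CONNECTION
    - type: azure-eventhub
      metadata:
        consumerGroup: invoicehandler
        unprocessedEventThreshold: "10"
        connectionFromEnv: INVOICES_EVENTHUB_CONNECTION
        storageConnectionFromEnv: CHECKPOINT_STORAGE_CONNECTION
```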
Mount Azure Storage Fileshare Created with Terraform on AKS
2M ago
 We can mount an Azure file share to containers in AKS as explained in the documentation here, and we can use static volume mounting to use an existing Azure file share. The documentation only explains how to set up the static volume mount with the Azure CLI. In this post let's look at the steps for using Terraform-provisioned Azure file share storage as a static volume mount in AKS, using Kubernetes YAML. First we can create a storage account and a file share in Terraform as shown below. Code is available here in GitHub. resource "azurerm_storage_account" "fs" { name = "${v ..read more
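A hedged Terraform sketch of the storage account and file share the truncated resource above points at; all names and the resource group reference are placeholders, and note that newer azurerm provider versions may expect storage_account_id instead of storage_account_name on the share:

```hcl
resource "azurerm_storage_account" "fs" {
  name                     = "aksfilesharestore"   # must be globally unique, lowercase
  resource_group_name      = azurerm_resource_group.rg.name
  location                 = azurerm_resource_group.rg.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
}

resource "azurerm_storage_share" "share" {
  name                 = "aksshare"
  storage_account_name = azurerm_storage_account.fs.name
  quota                = 5   # size in GiB
}
```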
FormatterServices.GetUninitializedObject Is Obsolete: What Can We Use Instead?
2M ago
The FormatterServices.GetUninitializedObject method is obsolete and shows a warning as below if we try to use it in our code. The FormatterServices class is obsolete as per the documentation. What is the alternative? Let's find out with an example. Below is a simple interface. namespace getuntnitializeobject { public interface IMessage { string GetEventHubName(); } } Using this interface we have two classes defined as below. namespace getuntnitializeobject { internal class NewInvoiceMess ..read more
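The documented replacement is RuntimeHelpers.GetUninitializedObject from System.Runtime.CompilerServices, which likewise allocates an instance without running any constructor. A minimal sketch (the Invoice class is illustrative, not the post's NewInvoiceMessage):

```csharp
using System;
using System.Runtime.CompilerServices;

public class Invoice
{
    public Invoice() => Console.WriteLine("ctor ran");
}

public static class Program
{
    public static void Main()
    {
        // Allocates an Invoice without invoking its constructor,
        // so "ctor ran" is never printed.
        var invoice = (Invoice)RuntimeHelpers.GetUninitializedObject(typeof(Invoice));
        Console.WriteLine(invoice is Invoice);
    }
}
```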
Installing Missing Fonts in windowsservercore-ltsc2022 Docker Image Using Azure Pipelines with az acr build
3M ago
 The missing fonts in windowsservercore-ltsc2022 Docker images can be installed as described in the blog post here. However, when we are not using hosted agents, and instead use Kubernetes-based self-hosted build agents, we do not have access to the host machine to perform all the steps described in the post "Adding optional font packages to Windows containers". Since Docker is not supported on a self-hosted build agent running as a container in AKS, we have to use az acr build to build the Docker images in such cases. To set up fonts in this kind of situation in an Azure DevOps pipel ..read more
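A hedged sketch of such a pipeline step, assuming an AzureCLI@2 task with placeholder registry, image and service connection names. Because az acr build runs the build inside ACR, no local Docker daemon is needed on the agent, and --platform windows targets Windows images:

```yaml
- task: AzureCLI@2
  displayName: 'Build Windows image with az acr build'
  inputs:
    azureSubscription: 'my-service-connection'   # placeholder connection name
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az acr build \
        --registry myregistry \
        --image myapp/windows-fonts:$(Build.BuildId) \
        --platform windows \
        --file Dockerfile .
```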
Dynamically Control Matrix of Jobs in GitHub Actions Based on Input Parameter Value
3M ago
 We can use a matrix in GitHub Actions to use a single definition of a job to create multiple jobs, as described here in the documentation. Let's say we input the list of application names we want to build as an input parameter to the action workflow, and need the ability to remove items from the app list at the time of triggering it manually (run workflow). For example, we have 4 apps by default. However, when needed we should be able to build only one or two of them using the same action workflow, without having to change the workflow definition. In this post let's explore ho ..read more
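One common way to achieve this (a sketch with illustrative app names, not necessarily the post's exact approach) is a workflow_dispatch input holding a JSON list, fed into the matrix with fromJSON; removing names from the JSON at trigger time shrinks the set of jobs:

```yaml
on:
  workflow_dispatch:
    inputs:
      apps:
        description: 'JSON list of apps to build'
        default: '["web","api","worker","scheduler"]'
        required: true

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # One job per entry in the edited input list.
        app: ${{ fromJSON(inputs.apps) }}
    steps:
      - run: echo "Building ${{ matrix.app }}"
```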
Loop Jobs based on Parameter Value in Azure DevOps Pipelines
3M ago
 Consider a situation where we want to perform the same set of steps in a pipeline multiple times. A good example would be building or deploying multiple apps using the same set of steps. Let's explore this example to understand how we can loop through a set of pipeline steps to build multiple apps, using a list of app names provided as a parameter to the pipeline. The app name list can be provided as a parameter as shown below. parameters: - name: apps displayName: 'Apps to deploy' type: object ..read more
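The loop itself is typically written with the compile-time `${{ each }}` template expression; a minimal sketch with placeholder app names, creating one job per list entry:

```yaml
parameters:
  - name: apps
    displayName: 'Apps to deploy'
    type: object
    default:
      - web
      - api

jobs:
  # Expanded at template-compile time: one job per app in the parameter.
  - ${{ each app in parameters.apps }}:
      - job: build_${{ app }}   # app names must form valid job identifiers
        steps:
          - script: echo "Building ${{ app }}"
```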
Update Azure Pipeline Library Group Variable Value in Azure Pipeline using CLI
4M ago
We can set a variable value in Azure Pipelines using task.setvariable. This only sets a variable in the pipeline, not in a variable group. If we want to set a variable in a library variable group in Azure DevOps, we have to use the azure-devops extension for the Azure CLI. Let's explore how to update a library variable group variable value using an Azure pipeline step. We can use a task similar to the one below to update an existing variable in a variable group. - task: PowerShell@2 name: setup_blue_green_controls_vars displayName: 'Setup blue-green control ..read more
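A hedged sketch of such a step, assuming a placeholder variable group id (42) and variable name, and using the pipeline's System.AccessToken to authenticate the CLI:

```yaml
- task: PowerShell@2
  displayName: 'Update variable group value'
  env:
    # The azure-devops CLI extension reads this for authentication.
    AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
  inputs:
    targetType: 'inline'
    script: |
      az devops configure --defaults `
        organization=$(System.CollectionUri) project="$(System.TeamProject)"
      az pipelines variable-group variable update `
        --group-id 42 --name activeSlot --value green
```

Note that the pipeline's build service identity needs Administrator access on the variable group for the update to succeed.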
Deploying Kubernetes Event-Driven Autoscaling (KEDA) with Azure Pipelines Using Helm
4M ago
 We have discussed how to deploy KEDA using Helm in the post "Setting Up Kubernetes Event-Driven Autoscaling (KEDA) in AKS with Workload Identity". Instead of deploying KEDA manually, it is better to automate the deployment. Let's look at the steps to get KEDA deployed using Azure Pipelines. As the first step we need to have kubectl and helm installed on the pipeline agent. - task: KubectlInstaller@0 displayName: 'Install Kubectl latest' - task: HelmInstaller@0 d ..read more
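A sketch of the installer tasks followed by the Helm deployment, with placeholder cluster, resource group and service connection names (the kedacore chart repository URL is the official one):

```yaml
- task: KubectlInstaller@0
  displayName: 'Install Kubectl latest'

- task: HelmInstaller@0
  displayName: 'Install Helm latest'

- task: AzureCLI@2
  displayName: 'Deploy KEDA via Helm'
  inputs:
    azureSubscription: 'my-service-connection'   # placeholder connection name
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      # Point kubectl/helm at the target cluster, then install/upgrade KEDA.
      az aks get-credentials -g my-rg -n my-aks --overwrite-existing
      helm repo add kedacore https://kedacore.github.io/charts
      helm repo update
      helm upgrade --install keda kedacore/keda \
        --namespace keda --create-namespace
```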
