Connecting to Azure Red Hat OpenShift using Service Accounts from Azure DevOps

In this quick post, we will see how to connect to OpenShift from Azure DevOps with the help of Service Accounts and Service Connections.

Utkarsh Shigihalli
7 min read · Dec 29, 2022

Azure DevOps needs no introduction. It consists of powerful services for requirements gathering, building and deploying workloads using automated pipelines, support for multiple version control systems (TFVC and Git), and testing and publishing artefacts.

We heavily use Azure Pipelines for our build/release activities and currently use it to deploy our microservices to Azure Red Hat OpenShift. Azure Red Hat OpenShift, in turn, is an enterprise-grade Kubernetes solution offered jointly by Red Hat and Microsoft.

Because there is good integration between Red Hat and Azure, connecting the two is easily accomplished in a few simple steps.

In this post, we will see how we connect Azure DevOps to OpenShift.

Preparing Azure DevOps to connect to OpenShift

Azure DevOps uses the concept of service connections to connect to external or remote services.

For more information on Azure DevOps service connections, refer to the documentation

Azure DevOps provides many out-of-the-box service connections which help you quickly connect your pipelines to services like Azure, GitHub etc., but extension developers can also provide other service connection types in the form of Azure DevOps extensions, which you can install from the Visual Studio Marketplace.

Out-of-the-box Azure DevOps service connections

To connect Azure DevOps to Azure Red Hat OpenShift, you will need to install the OpenShift extension maintained by Red Hat from the Visual Studio Marketplace. Once installed into Azure DevOps, this extension adds a new service connection type to the Azure DevOps service connections dialog.

If you select the OpenShift service connection and click Next, you will see that it supports different authentication modes.

In this post, we will be using the Token Based Authentication method, as Basic Authentication requires us to provide user credentials, which is considered bad practice.

Creating Service Account in Azure Red Hat OpenShift

Service Accounts in OpenShift are objects scoped to a namespace (project) and provide a useful way to connect to OpenShift without the need for user credentials. Each service account in OpenShift automatically generates two secrets when you create it:

  • An API token
  • Credentials for the OpenShift Container Registry

By design, neither the API token nor the credentials for the OpenShift Container Registry expire, which makes Service Accounts a perfect candidate for connecting Azure DevOps to OpenShift.

Creating a service account in Azure Red Hat OpenShift via the web console is easy (which is what we are going to do in this post); however, you can also use the oc CLI.

Azure Red Hat OpenShift Web Console

So, log in to Azure Red Hat OpenShift and

  1. Click ServiceAccounts under User Management
  2. Then, click the Create ServiceAccount button
  3. In the window that opens next, you will see the manifest for the service account
  4. Give the service account a name and click Create

You can then see the service account created.
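If you prefer the CLI, a minimal equivalent might look like the following sketch. The project name my-project and the service account name azure-devops are just placeholders, not names from this post.

# Create the service account in the target project (names are hypothetical)
oc create serviceaccount azure-devops -n my-project

# Inspect it; the generated token and registry secrets are listed in the output
oc describe serviceaccount azure-devops -n my-project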

Assigning Role and RoleBinding to Service Account

The service account we created currently does not have any permissions in the cluster. RBAC in OpenShift is managed using Role/RoleBinding or ClusterRole/ClusterRoleBinding. An explanation of how they work is outside the scope of this post — but if you are interested there is good documentation on it.

What your ServiceAccount should and should not be able to do depends on your requirements, but for the sake of simplicity we intend to use this service account to manage anything on the cluster, so we will make it a cluster administrator.

As a best practice, you can create multiple service accounts with different permissions (one for managing the cluster, one for only deploying workloads, etc.) and associate them with multiple service connections in Azure DevOps, which gives greater flexibility — for example, one service connection for cluster administrators and another for developers which only lets them deploy workloads.
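As a rough sketch of that pattern, a hypothetical, less-privileged service account could be limited to the built-in edit role in a single project. The names below are placeholders.

# Hypothetical service account intended only for deployments
oc create serviceaccount deployer -n my-project

# Grant the built-in 'edit' role, scoped to that project only
oc policy add-role-to-user edit -z deployer -n my-project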

For our service account to manage every aspect of the cluster from Azure DevOps, we will give it the cluster-admin permission. To do that, go to the RoleBindings menu and create a ClusterRoleBinding (a CLI equivalent is sketched after the steps below).

  1. We select ClusterRoleBinding and give a name for the role binding
  2. We would like our service account to be a cluster administrator, so we pick the cluster-admin role from the dropdown.
  3. Finally, we select the subject as ServiceAccount and provide the namespace and name of the service account. Click Create
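If you would rather do this from the CLI, something like the following should be equivalent (again, the service account and project names are placeholders):

# Bind the built-in cluster-admin ClusterRole to the service account
oc adm policy add-cluster-role-to-user cluster-admin -z azure-devops -n my-project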

Finish creating the service connection in Azure DevOps

Now it is time to create the service connection in Azure DevOps. Go to Service Connections and create an OpenShift service connection. Select Token Based Authentication.

For the Server URL field, you need to provide the API server URL for your Azure Red Hat OpenShift cluster. You can get it from the Overview blade for the service in the Azure portal.
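If you prefer the command line, the Azure CLI can also return this URL. The resource group and cluster names below are placeholders.

# Print the API server URL of the ARO cluster (resource group/cluster names are hypothetical)
az aro show --resource-group my-rg --name my-cluster --query apiserverProfile.url -o tsv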

Next, we need to get the API Token — as mentioned above, the service account automatically generates secrets, one of which contains an API token. We can get the token easily from the web console, but you can also use the CLI if you prefer (refer to the docs for the commands).

From the web console, head over to ServiceAccounts and select the service account you just created. Then scroll down to Secrets and select the secret of type kubernetes.io/service-account-token.

This will take you to the specific secret, and you will be able to scroll down to find the token. Click on the Copy to clipboard icon to copy it.
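A rough CLI equivalent, assuming the placeholder names used earlier, could look like this:

# Find the service-account-token secret generated for the service account
oc get secrets -n my-project | grep azure-devops-token

# Decode and print the token (replace <secret-name> with the secret found above)
oc get secret <secret-name> -n my-project -o jsonpath='{.data.token}' | base64 -d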

You can now paste the token in the Azure DevOps service connection dialog box under the API Token field, give the service connection a name and click Save.

You should now see the service connection under your service connections list.

Creating an Azure Pipeline to test the connectivity

Let us create a simple pipeline to use the service connection and test the connectivity to our OpenShift cluster. We will use a YAML pipeline and an Azure Repos Git repository to store the YAML.

So in the YAML editor, if you click Show assistant and search for openshift, you will see all the tasks exposed by the OpenShift extension. Since we are only testing connectivity using the oc CLI, we select the Execute oc command task.

You will be shown a dialog for the task; select the service connection to use from the dropdown and provide the command to run.

I have filled it in with the whoami command, which should tell us the context we are in (in our case, the service account). I am also downloading a specific version of the CLI by providing the URL for the oc CLI. See the note below for more info.

Since the oc CLI depends on the API version of your cluster, it is better to download the version specific to your cluster. You can get the URL for the oc CLI from your OpenShift console.
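If you want to double-check that the client you downloaded matches the cluster, oc can report both sides:

# Prints the client (oc) version and, when logged in, the cluster's server version
oc version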

Next, using the same task as above, I add another step to see what permissions our service account has. This can be tested using the command below.

# Check to see if I can do everything in all namespaces ("*" means all)
oc auth can-i '*' '*' --all-namespaces
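If you go with a less-privileged service account as suggested earlier, the same approach can verify narrower permissions, for example (project name is a placeholder):

# Check whether the current service account can create deployments in a specific project
oc auth can-i create deployments -n my-project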

So the complete YAML for the pipeline should look like the one below.

trigger: none

pool:
  vmImage: ubuntu-latest

steps:
- task: oc-cmd@3
  displayName: whoami
  inputs:
    connectionType: 'OpenShift Connection Service'
    openshiftService: 'openshift-cluster'
    version: 'https://downloads-openshift-console.apps.mycluster.uksouth.aroapp.io/amd64/linux/oc.tar'
    cmd: 'whoami'

- task: oc-cmd@3
  displayName: what can i do
  inputs:
    connectionType: 'OpenShift Connection Service'
    openshiftService: 'openshift-cluster'
    cmd: 'auth can-i ''*'' ''*'' --all-namespaces'
    version: 'https://downloads-openshift-console.apps.mycluster.uksouth.aroapp.io/amd64/linux/oc.tar'

Save the pipeline and trigger a run. You should see the pipeline run successfully.

Pipeline connected to OpenShift under service account context
Check to see if I can do everything in all namespaces (“*” means all); the “yes” confirms we have the correct permissions

That is it! Hopefully you found this post useful. If yes, give a clap and share :-)

