DEV Community

Deepika Banoth


Managing integration of CI/CD server to DevOps Environment

We all know that a CI/CD server needs to be tightly integrated with the rest of your DevOps environment.

But the problem with most CI servers today is that you have to repeat the same configuration for every job.
For example, if you are deploying to Kubernetes from Jenkins, you configure the connection at the job level, and you can only verify it by actually running a build.
If you need the same KubeConfig in a few other jobs, your only option is to add the same config file to each of those jobs as well.
Now imagine you need to change something in that config file: you have to go to every job and update it, which is not effective.
The same applies to deploying to Google Cloud, AWS, or any other cloud provider, which may also require a plugin to make it work.

To tackle this problem, JFrog Pipelines extracts these details into a central system and gives you a pointer you can use in your steps; the platform injects the actual values at run time. We call these Integrations.

Integrations connect your Pipelines CI/CD workflows to third-party platforms or services and manage the secrets, such as keys, tokens, and passwords, that steps in a pipeline need in order to interact with those services.
All credential information is encrypted, maintained separately from the pipeline definition, and held in secure storage.

The biggest advantages of our design for integrations are:

  1. Integrations are created in the UI and used with friendly names in your YAML config. This means you do not have to touch your automation scripts when an integration is updated.
  2. Integrations are securely stored in a Vault store and encrypted at rest and in-flight.
  3. Integration values are not revealed in logs unless you choose to print them out as part of your workflow. You can also scope each integration to allow or deny access for specific pipeline sources.

Let me show an example of adding an integration and using it in one of your steps.

Add New Integration

To add a new integration, go to the Pipelines -> Integrations tab and click Add an Integration.
Give it a friendly name and select the type of integration you want to create.


The available integration types for JFrog Pipelines are:
Airbrake, Artifactory, AWS Keys, Azure Keys, Bitbucket, Digital Ocean, Distribution, Docker Registry, External Webhook, File Server, Generic, GitHub Enterprise, GitHub, GitLab, Google Cloud, Internal Webhook, Jira, Kubernetes, NewRelic, PEM Key, Slack, SMTP Credentials, SSH Key

Once you have added all the required fields, save the integration.

You can also set which pipeline sources are allowed to use this integration.

After creating, you can add the integration directly to a step:

   steps:          
      - name: sample_bash
        type: Bash
        configuration:
          integrations:
            - name: myArtifactory
        execution:
          onExecute:
            - echo "executing bash step."
            - echo $int_myArtifactory_url

When you do that, a set of environment variables carrying the integration's values is automatically made available to the step.
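As a sketch, the injected variables follow the pattern int_<integrationName>_<fieldName>, so an Artifactory integration named myArtifactory would expose its fields (here I'm assuming it includes url and user fields) like this:

   steps:
      - name: print_integration_vars
        type: Bash
        configuration:
          integrations:
            - name: myArtifactory
        execution:
          onExecute:
            # These values are injected at run time, never hard-coded in the YAML
            - echo "Artifactory URL is $int_myArtifactory_url"
            - echo "Artifactory user is $int_myArtifactory_user"

Because the values live in the integration, updating the URL or credentials in the UI changes what every referencing step sees, with no edits to the YAML.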

You can also send a notification whenever your step succeeds or fails, using the utility function send_notification together with one of the notification integrations: Airbrake, Slack, Jira, NewRelic, Outgoing Webhook, or SMTP Credentials (email).

   steps:
      - name: sample_bash
        type: Bash
        configuration:
          integrations:
            - name: myArtifactory
            - name: mySlack
            - name: myJira
        execution:
          onExecute: 
            - echo "executing bash step."
          onSuccess:
            - send_notification mySlack
          onFailure:
            - send_notification myJira --project-id myProjectOnJira --type Bug --summary "sample_bash step: ${step_id} failed"
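Steps are not the only place integrations appear: resources reference them by the same friendly name. Here is a minimal sketch, assuming a GitHub integration named myGithub and a hypothetical repository path:

   resources:
      - name: my_app_repo
        type: GitRepo
        configuration:
          gitProvider: myGithub    # friendly name of the GitHub integration
          path: myorg/my-app       # hypothetical owner/repo
          branches:
            include: master

Again, if the GitHub token behind myGithub is rotated, only the integration needs updating; the resource definition stays untouched.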

Hope this helps! :)
