Sharing information across environments has become even easier and more secure with the introduction of Environment Output Variables. This new feature simplifies sharing outputs of one environment with another in the same project or workflow, storing them securely on the env0 platform.
Environment outputs enhance our existing Workflows capability, making it even easier to define and manage complex dependencies, enabling you to:
- Pipe outputs to dependent environments
- Securely share sensitive values as outputs
- Avoid complex scripting or data sources
In this post, we'll dig into what options you had before Environment Output Variables, how the new feature works, and how you can get started today.
What’s New?
A common practice with Infrastructure-as-Code is to break large configurations into smaller, easier-to-manage environments. However, there will be dependencies between the environments and a need to share information across environments.
For example, an application deployment might require a subnet ID from a network environment and a database connection string from a database environment.
Prior to the introduction of Environment Output Variables, you could share the outputs of one environment with another through a few options. Each option has some potential downsides to consider.
The Import Variable Plugin in particular was the preferred solution on env0 for accessing the outputs of other environments.
However, using the plugin requires the creation and maintenance of an API key. It also requires writing a custom flow to add the step into the automation for an environment. Finally, and perhaps most importantly, the plugin couldn't support sensitive data outputs, meaning you had to choose a different solution for sensitive values or mark them as insensitive.
Environment outputs offer a better and more elegant solution, native to env0, which does not require the addition of a custom flow or provisioning an API key.
Moreover, here the output values are stored securely using encryption and secrets management, making support for sensitive data values possible.
Let's dig into how Environment Output Variables work on env0.
How it works
As an example, let's say we have two environments. One deploys a Virtual Network and the other deploys an AKS cluster.
The AKS cluster will need a subnet ID from the network environment. To share this information, we would first define an output in the Virtual Network configuration that exposes the subnet ID for the AKS cluster to use:
output "aks_subnet_id" {
value = module.network.vnet_subnets_name_id["aks"]
}
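For context, here is a minimal sketch of what the network environment might look like, assuming it uses a registry module such as Azure/vnet/azurerm, which exposes the vnet_subnets_name_id output map referenced above. The input values are illustrative and the exact arguments vary by module version:

module "network" {
  source = "Azure/vnet/azurerm"

  # Illustrative values only -- these inputs are assumptions, not taken from the original post
  resource_group_name = "app-network-rg"
  vnet_name           = "app-vnet"
  address_space       = ["10.0.0.0/16"]
  subnet_names        = ["aks"]
  subnet_prefixes     = ["10.0.1.0/24"]
  use_for_each        = true
}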
In the AKS environment configuration, there would be an input variable that accepts the subnet ID:
variable "vnet_subnet_id" {
type = string
}
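To give a sense of how that input might be consumed, here is a hedged sketch of the AKS configuration using var.vnet_subnet_id in its default node pool. The resource names, region, and sizes are illustrative, not taken from the original post:

resource "azurerm_kubernetes_cluster" "aks" {
  name                = "app-aks"
  location            = "eastus"
  resource_group_name = "app-aks-rg"
  dns_prefix          = "appaks"

  default_node_pool {
    name       = "default"
    node_count = 2
    vm_size    = "Standard_D2_v2"

    # The subnet ID arrives through the input variable defined above
    vnet_subnet_id = var.vnet_subnet_id
  }

  identity {
    type = "SystemAssigned"
  }
}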
When setting up the variables for the AKS environment, we can define the value for the vnet_subnet_id input variable as the Environment Output aks_subnet_id from the network environment.
Once the Network environment has been provisioned, the output values will be available, and the AKS environment can be deployed without having to update any variable values.
The output values from the source environment are stored on the env0 platform and secured using AWS Secrets Manager with KMS encryption. This makes Environment Output Variables a safe and secure option even for sensitive values.
Using Outputs with Workflows
Environment Output Variables can be used to share configuration data across environments in a project, but they are especially useful for Workflows. To understand why, let's quickly review what Workflows are.
What are env0 Workflows?
Workflows in env0 are a declarative approach to describing the relationships between different environments. Each sub-environment in the workflow can reference an existing template or VCS repository housing a configuration.
The workflow itself is stored as a template that projects can use as a golden path to streamline platform creation.
Workflows vastly simplify expressing the sequence, creation, and maintenance of each sub-environment.
Using the declarative nature of Workflows, you can describe the complex relationships between deployments and initiate partial or full runs to create and update sub-environments.
Inside the workflow, the relationship between sub-environments is described using a needs block, referencing the sub-environments that the current entry is dependent on for deployment.
Not only does this help with the initial deployment of resources in the proper sequence, but subsequent changes to any sub-environment can initiate a workflow run to update any dependent components.
Now, let’s get back to our earlier example of a Virtual Network and AKS cluster.
Rather than having them as two separate environments in a project, workflows enable you to describe the relationship between each environment and the order in which they are deployed.
environments:
  network:
    name: 'AKS Network'
    templateName: 'AppNetwork'
  aks:
    name: 'AKS Cluster'
    templateName: 'AKSCluster'
    needs:
      - network
When provisioning the Workflow, env0 creates a graph showing the relationship between sub-environments in the Workflow.
Workflows are truly a massive step forward in describing and managing complex deployments, but until now they did not share configuration data across the sub-environments. That is where Environment Output Variables come in!
Enter: Environment Output Variables
Workflows define the relationship between sub-environments, codifying what you would have previously expressed through custom scripts or tribal knowledge.
Before Environment Output Variables, passing outputs from one sub-environment needed to be handled by another solution.
This was often accomplished through the Import Variable Plugin or with the terraform_remote_state data source. Environment outputs replace the need for either of those solutions with a native approach that is simpler and more secure.
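For comparison, here is a rough sketch of the terraform_remote_state approach this replaces, assuming the network state lives in an Azure Storage backend. All of the names below are illustrative:

data "terraform_remote_state" "network" {
  backend = "azurerm"

  config = {
    resource_group_name  = "tfstate-rg"
    storage_account_name = "tfstatestore"
    container_name       = "tfstate"
    key                  = "network.tfstate"
  }
}

# The subnet ID would then be read as:
# data.terraform_remote_state.network.outputs.aks_subnet_id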
So how might Environment Output Variables be used in a workflow?
Here are a few examples:
- Sharing subnet and environment information from a network deployment to an AKS deployment
- Passing a database connection string from a database deployment to an application deployment (sketched after this list)
- Passing the Kubernetes connection information and credentials to a Helm configuration
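As a sketch of the database example, the database environment might expose its connection string as a sensitive output. The resource names here are hypothetical:

output "db_connection_string" {
  # Built from hypothetical azurerm_mssql_server and azurerm_mssql_database resources
  value     = "Server=${azurerm_mssql_server.db.fully_qualified_domain_name};Database=${azurerm_mssql_database.app.name};"
  sensitive = true
}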
One especially tricky deployment is the bootstrapping of a Kubernetes cluster with a tool like ArgoCD or Flux after the cluster has been created. Workflows can assist with sequencing tasks and handling dependencies, and output variables can be used to pass the Kubernetes connection information.
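For the bootstrapping case, the cluster environment could expose its connection details as sensitive outputs for a downstream Helm or GitOps deployment. A minimal sketch, assuming an azurerm_kubernetes_cluster resource named aks:

output "kube_host" {
  value     = azurerm_kubernetes_cluster.aks.kube_config[0].host
  sensitive = true
}

output "kube_cluster_ca_certificate" {
  value     = azurerm_kubernetes_cluster.aks.kube_config[0].cluster_ca_certificate
  sensitive = true
}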
What’s next
Output variables are a great way to pass configuration data between environments in the same project, but there's more goodness to come.
Importantly, the feature is not limited to OpenTofu and Terraform. Terragrunt, Pulumi, and CloudFormation are all supported as output sources, and all IaC frameworks on env0 can use Environment Output Variables as inputs. Support for Helm and Kubernetes as output sources is coming soon.
For shared infrastructure scenarios, you may want to share output values with environments in different projects or Workflows. For instance, sharing Kubernetes cluster information with multiple applications in separate projects.
Coming soon, we will add the ability for Environment Output Variables to reference outputs in other projects.
We're excited for you to try out Environment Output Variables, and we'd love to hear your feedback!
We invite you to schedule a demo to get a better idea of how Workflows and Environment Output Variables - and env0 in general - could streamline and improve your IaC journey.