Getting Started

The following guidance is language independent, i.e. it does not require a development environment, and simply uses scripts to represent a deliverable.

Seed Solution

Continuous Integration (CI) is a critical prerequisite of Continuous Delivery/Deployment (CD).

Create a Release Package

To allow the build and package (CI) process to execute on the DevOps Engineer's machine, CDAF is used for both loose coupling and standardisation. CDAF provides a variety of features delivering consistency, which becomes especially important as the number of pipelines grows and team members move between squads and value streams.

Install on Windows

To install to the current directory (placing in your home directory is recommended), run the following, or download the latest zip.

. { iwr -useb https://cdaf.io/static/app/downloads/cdaf.ps1 } | iex
.\automation\provisioning\addPath.ps1 "$(pwd)\automation"

Exit your session and re-open to reload the path.

Install on Linux

To install for the local user (placing in your home directory is recommended), run the following, or download the latest tarball or zip.

curl -s https://cdaf.io/static/app/downloads/cdaf.sh | bash -
./automation/provisioning/addPath.sh "$(pwd)/automation"

Exit your session and re-open to reload the path.

Entry Points

CDAF provides four entry scripts for different purposes.

  • ci : Build and Package only, i.e. Continuous Integration, mandatory argument is BUILDNUMBER
  • cd : Release, i.e. Continuous Delivery or Deploy (depending on gating or not), mandatory argument is ENVIRONMENT
  • cdEmulate : Executes ci and then cd, generates BUILDNUMBER if not supplied and uses configurable ENVIRONMENT
  • entry : Executes ci and then cd, generates BUILDNUMBER if not supplied and uses configurable ENVIRONMENT(s)

Release Package Creation

With the focus being delivery, not development, the creation of a consistent, self-contained release package is a core CDAF feature, used for both component delivery and stand-up/tear-down capabilities. The output of the CDAF CI process is a single release.ps1 (Windows) or release.sh (Linux) file. See the Self-extracting release article.
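As an aside, the self-extracting concept can be sketched in a few lines of shell. This is an illustrative sketch only, not the generated release.sh itself: a shell header followed by an embedded tarball payload.

```shell
#!/usr/bin/env bash
# Illustrative self-extracting script concept only, not the CDAF release.sh:
# a shell header, a payload marker, then an appended tarball.
cat > selfextract.sh <<'EOF'
#!/usr/bin/env bash
# Find the line after the payload marker and unpack everything below it
payload_line=$(awk '/^__PAYLOAD__$/{print NR + 1; exit 0}' "$0")
tail -n +"$payload_line" "$0" | tar -xz
exit 0
__PAYLOAD__
EOF
mkdir -p demo && echo "hello" > demo/file.txt
tar -czf - demo >> selfextract.sh   # append the payload after the marker
chmod +x selfextract.sh
rm -r demo                          # remove the source to prove extraction
./selfextract.sh
cat demo/file.txt
```

Running the script extracts the embedded `demo` directory, so the package travels as a single executable file.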

Property Translation

The transformation process converts the human readable .cm files into computer-friendly properties files for use in the CD process, i.e. by release.ps1. See the Configuration Management, tokenisation and detokenisation documentation.
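As an illustrative sketch of this translation (assumed logic, not the actual CDAF transform), a header row and a target row of a .cm table can be flattened into key=value pairs:

```shell
#!/usr/bin/env bash
# Illustrative sketch of the .cm translation only, not the CDAF transform
# (the real transform also handles quoted values, contexts and overrides).
header="context target property"
row="local TEST test.server.domain"
keys=($header)   # column names from the header row
vals=($row)      # values for one target
# Skip the mandatory context and target columns; emit key=value pairs
for i in "${!keys[@]}"; do
  [ "$i" -gt 1 ] && echo "${keys[$i]}=${vals[$i]}"
done > TEST
cat TEST
```

The resulting `TEST` file contains `property=test.server.domain`, ready for consumption at deploy time.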

A key principle of the Continuous Delivery Automation Framework is loose coupling. This gives the automation developer the ability to run the automation process on their workstation, well before executing in the pipeline tooling. This principle should be retained where possible so that troubleshooting and feature development can be brought closer to the developer.

A loosely coupled solution can also allow migrating from one pipeline tool to another with minimal effort.

Seed your solution

To seed a new solution, the minimal requirement is a directory containing a solution file, CDAF.solution.

mkdir .cdaf

Linux

echo "solutionName=mycoolproduct" > .cdaf/CDAF.solution
echo "artifactPrefix=0.1" >> .cdaf/CDAF.solution

Windows

Set-Content .\.cdaf\CDAF.solution "solutionName=mycoolproduct"
Add-Content .\.cdaf\CDAF.solution "artifactPrefix=0.1"

The minimum properties are the name of your solution, and the versioning prefix. The resulting artefact will have the build number appended to the release package, e.g. the first build will be 0.1.1, then 0.1.2 and so on.

solutionName=mycoolproduct
artifactPrefix=0.1
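The version composition can be sketched as the concatenation of the prefix and build number (illustrative; the variable names here are assumptions, not CDAF internals):

```shell
#!/usr/bin/env bash
# Sketch of the assumed version composition: the pipeline supplies (or CDAF
# generates) BUILDNUMBER, which is appended to artifactPrefix.
artifactPrefix="0.1"
BUILDNUMBER="1"   # first build
version="${artifactPrefix}.${BUILDNUMBER}"
echo "release version: ${version}"
```

For the first build this yields 0.1.1, for the second 0.1.2, and so on.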

Continuous Integration (CI)

With CDAF installed on your path, you can now test the solution by running the Continuous Integration entry point

Linux

ci.sh

Windows

ci

Many things will happen; the key observation is that a file called release.sh (Linux) or release.ps1 (Windows) will be produced. This is the build artefact that can be consumed by the Continuous Delivery (CD) stages. See the minimal sample for an executed example.

Shift-Left & Fail-Fast

Now that you have the bare minimum, apply it to your CI/CD toolset immediately. We want to have a green pipeline from the start to trap any problems we may introduce in subsequent steps.

Pipeline

CDAF provides a loose coupling for core CI & CD objectives. The intention is that the CI & CD processing is performed on the developer's desktop, and then executed in the same way in the pipeline tool. By establishing a healthy pipeline as soon as possible, any pipeline failures can be quickly and incrementally identified. See Entering Sprint-0 for elaboration.

Pipeline Orchestration and Capabilities

The CI process gathers files from source control, then uses the CDAF CI entry point to produce the release package. The pipeline tool then stores the release package for reuse in subsequent deploy processes.

graph LR
  subgraph CI
    git[("Source Control")]
    bp["Build & Package"]
    registry[("Artefact Store")]
  end
  qa["βœ“ qa"]
  pp["βœ… pp"]
  pr["βœ… pr"]

  git -->
  bp -->
  registry -->
  qa -->
  pp -->
  pr

classDef dashed stroke-dasharray: 2
class CI dashed

After the CI process, the pipeline tool may perform additional value add processes that are not directly related to delivery, i.e. publishing test results or code coverage.

The pipeline then retrieves the release package, and then triggers one or more deployments to promote a release to production. This is the CD process.

graph LR
  git[("Source Control")]
  bp["Build & Package"]
  subgraph CD
    registry[("Artefact Store")]
    qa["βœ“ qa"]
    pp["βœ… pp"]
    pr["βœ… pr"]
  end

  git -->
  bp -->
  registry -->
  qa -->
  pp -->
  pr

classDef dashed stroke-dasharray: 2
class CD dashed

The triggering of each stage of the promotion can be immediate (indicated with βœ“ in the diagram above) or require a manual approval (βœ…), but it is expected the deployment process itself is fully automated once it has been triggered.

Using the seeded solution from the previous material, it is recommended that this is executed in your pipeline as a do-nothing verification. See the orchestration examples in GitHub for guidance.

Build

Continuous Integration

Continuous Integration (CI) is the objective of bringing code branches together and building them to produce a consolidated artefact. This shift-left approach ensures the efforts of multiple contributors are combined and tested regularly. The testing within CI typically starts with unit testing, and that should be included in the build task. For some ecosystems this is an implicit or parameterised part of the build command; for others, it is a separate command.

How does it work

CDAF will process all build.tsk files in the solution root, then all the build.tsk files found in one level of sub-directories.

The build.tsk files are processed line by line; each line is logged and then executed, with errors and exceptions trapped and logged. In the case of Linux the error processing is based on the exit code and standard error, while Windows traps a broader range of conditions, such as halt and exception conditions.
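The Linux behaviour described above can be sketched as a simple loop (an assumed sketch, not the actual CDAF execution engine): log each line, run it, and stop on the first non-zero exit code.

```shell
#!/usr/bin/env bash
# Sketch of the assumed task execution loop (Linux error handling as
# described above), not the actual CDAF execution engine.
run_tasks() {
  local taskFile="$1"
  while IFS= read -r line; do
    [ -z "$line" ] && continue   # skip blank lines
    echo "[task] $line"          # log before executing
    if ! eval "$line"; then
      echo "[error] task failed: $line" >&2
      return 1
    fi
  done < "$taskFile"
}
printf 'echo hello\ntrue\n' > demo.tsk
run_tasks demo.tsk
```

A task file containing a failing command would stop the loop at that line, mirroring the fail-fast behaviour of the build process.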

For this material, the build output is a simple script, for some language specific examples see:

Extend the Seeded Solution

Add a build.tsk file to the solution root

Linux

echo 'echo \"hash!/usr/bin/env bash\" > runtime.sh' > build.tsk
echo 'echo \"echo Deploy %integer%, property set to : %property%\" >> runtime.sh' >> build.tsk
echo 'hash=$(printf \"\\u0023\")' >> build.tsk
echo 'REPLAC runtime.sh hash $hash' >> build.tsk
echo 'REFRSH runtime.sh output' >> build.tsk
echo 'chmod +x output/runtime.sh' >> build.tsk

Windows

Set-Content build.tsk 'Set-Content runtime.ps1 "Write-Host `"Deploy %integer%, property set to : %property%`""'
Add-Content build.tsk 'REFRSH runtime.ps1 output'

Continuous Integration (CI)

The build.tsk is a CI task, so you only need to execute the CI entry point

ci.sh

or for Windows

ci

The build process will now be triggered; each build.tsk step is logged as it executes. The build produces a directory called output in the solution root; however, this will not yet be included in the release file, which is covered in the next step.

Package

Now that the build artefact has been created, create a deployable package.

Build-Once/Deploy-Many

An objective of Continuous Delivery is to have a predictable, repeatable, deployment process. A fundamental principle of CDAF to achieve this is the production of an immutable release package. This decouples the deployment process from the source management process. The release package is a self-contained deployment asset and should be executable anywhere, i.e. on the automation developer's desktop, within the pipeline, or even manually file-transferred to a remote server.

Artefact Retention

In the Configuration Management step, a default release package was created which contained properties files. The following step defines the solution specific artefacts which need to be available at deploy time. These are typically compiled binaries, but can be any set of files and/or directories.

Retain the output from the previous build task.

Linux

echo 'output' > .cdaf/storeForLocal

Windows

Set-Content .\.cdaf\storeForLocal 'output'

Build & Deploy

Use the continuous deployment emulation entry point.

  • cdEmulate : Executes ci and then cd, generates BUILDNUMBER if not supplied and uses configurable ENVIRONMENT

Linux

cdEmulate.sh

Windows

cdEmulate

Inspect the TasksLocal directory; it will now contain the output directory produced by the build task. Test the artefact:

Linux

./TasksLocal/output/runtime.sh

Windows

.\TasksLocal\output\runtime.ps1

This should output the following:

Deploy %integer%, property set to : %property%

Other File Locations

There are three artefact definitions file names, depending on context, local, remote or both:

  • storeFor
  • storeForLocal
  • storeForRemote

Other directories within your solution directory will also be automatically included in the root of your deployment directory. Based on the suffix, these will be placed in a local context, remote context or both. See the following sections for how these contexts differ.

  • crypt
  • cryptLocal
  • cryptRemote
  • custom
  • customLocal
  • customRemote

The local and remote contexts are explained in the following sections.

Continuous Delivery/Deployment

Deploy the artefact using the created package, along with Configuration Management.

Continuous Delivery

Continuous Integration (CI) is a critical prerequisite of production-like stand-up/tear-down, i.e. if it can't be built on the engineer's machine, it can't be deployed from the engineer's machine.

Configuration Management and Automated Deployment

Configuration Management

CDAF's origin was ensuring consistent configuration of servers across environments, based on a source of truth. The partner construct to this approach is tokenisation, i.e. a way of abstracting environment variations away from the syntax of the consuming application.

Tabular Properties

To provide a human readable, single pane-of-glass view of the multiple environment configurations, a tabular approach is used; an example follows. The first two columns, context and target, are mandatory; all others can be any values needed for your solution.

context  target  property
local    TEST    test.server.domain
local    PROD    production.server.domain

Configuration Management files should never contain sensitive data or secrets. These are supplied as variables; see more on sensitive data strategies.

The configuration management tables can be any file name with a .cm extension, in your solution root. All .cm files are processed prior to the build task in the CI process.

Extend the Seeded Solution

Based on the seeded solution, add a properties.cm file to the solution root.

Linux

echo 'context  target  property               integer' > .cdaf/properties.cm
echo 'local    LINUX   "Local Context"              1' >> .cdaf/properties.cm
echo 'local    TEST    "Test Property"              2' >> .cdaf/properties.cm

Windows

Set-Content .\.cdaf\properties.cm 'context  target     property               integer'
Add-Content .\.cdaf\properties.cm 'local    WINDOWS    "Local Context"              1'
Add-Content .\.cdaf\properties.cm 'local    WORKGROUP  "Local Context"              1'
Add-Content .\.cdaf\properties.cm 'local    TEST       "Test Property"              2'

Automated Deployment

Retest your solution, but this time, execute the end-to-end process

Linux

cdEmulate.sh

Windows

cdEmulate

The resulting CD process will not perform any action; however, the release package will now be extracted into a directory TasksLocal, containing a sub-directory based on the property context, propertiesForLocalTasks. In this directory will be one properties file per target, compiled from the properties.cm file, e.g. the TEST file:

property=Test Property
integer=2

Tokenisation

The partner files in source control are in whatever syntax is required by the application, with tokens only for values that vary between environments. By default, tokens are in the form %name%. The following examples highlight how configuration management is intended to provide an abstraction from the complexities of the application configuration files.

ASP.NET

  <connectionStrings>
    <add name="aspdotnetEntities"
      connectionString="metadata=res://*/Models.aspdotnet.csdl|res://*/Models.aspdotnet.ssdl|res://*/Models.aspdotnet.msl;provider=System.Data.SqlClient;provider connection string=&quot;data source=%databaseFQDN%;initial catalog=aspdotnetapp;integrated security=True;multipleactiveresultsets=True;application name=EntityFramework&quot;" providerName="System.Data.EntityClient"
      xdt:Transform="SetAttributes" xdt:Locator="Match(name)"/>
  </connectionStrings>
dotnet core

{
  "ConnectionStrings": {
    "appDB": "Server=%databaseFQDN%;Database=dotnetcoreapp;Trusted_Connection=True;"
  }
}

Python

database:
  dbopt:
    host: %databaseFQDN%
    dbname: pythonapp
    user: pythonappdbuser
    password: @dBpassword@

Java

jdbcConnection=jdbc:mysql://%databaseFQDN%/javaapp
jdbcDriver=com.mysql.jdbc.Driver

Ansible

---
spring_fqdn: "%spring_fqdn%"
rails_fqdn: "%rails_fqdn%"

Helm

env:
  - name: QUEUE_TRANSPORT
    value: "%QUEUE_TRANSPORT%"
  - name: ORM_CONNECTION
    value: "%ORM_CONNECTION%"

Deployment Tasks

With the properties for the application defined, now it is time to execute the deployment.

Local Deployment Tasks

Local Tasks

Local Tasks use the same execution engine as build tasks, but at deploy time rather than build time. Local Tasks are executed in the local context of the host/server. They are suited to situations where the agent is installed on the server where tasks are to be performed, or where the server on which the agent/runner is installed has the tools required to perform tasks on a remote target, i.e. a service offering with a command line interface, such as Kubernetes, Azure or AWS.

The CDAF container capabilities cater for more sophisticated uses in the local context, along with the alternative container tasks execution approach.

Example Task

The default tasks that are run in the local context are tasksRun.tsk and tasksRunLocal.tsk. These are placed in your solution root.

Linux

echo 'DETOKN ./output/runtime.sh' > .cdaf/tasksRunLocal.tsk
echo '' >> .cdaf/tasksRunLocal.tsk
echo './output/runtime.sh' >> .cdaf/tasksRunLocal.tsk

Windows

Set-Content .\.cdaf\tasksRunLocal.tsk 'DETOKN .\output\runtime.ps1'
Add-Content .\.cdaf\tasksRunLocal.tsk ''
Add-Content .\.cdaf\tasksRunLocal.tsk '.\output\runtime.ps1'

Continuous Delivery Emulation (CD)

Execute the CD emulation

Linux

cdEmulate.sh

Windows

cdEmulate

Two steps are performed. First, the deployable artefact is detokenised:

Found %property%, replacing with Local Context
Found %integer%, replacing with 1

Then it is executed to verify the environment specific properties.

Deploy 1, property set to : Local Context
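The detokenisation behaviour above can be sketched with sed (illustrative only, not the DETOKN implementation; it assumes %name% tokens and a simple key=value properties file):

```shell
#!/usr/bin/env bash
# Illustrative detokenisation sketch only, not the CDAF DETOKN implementation:
# replace %key% tokens in a file using key=value pairs from a properties file.
detokenise() {
  local target="$1" props="$2"
  while IFS='=' read -r key value; do
    [ -z "$key" ] && continue
    echo "Found %${key}%, replacing with ${value}"
    sed -i "s|%${key}%|${value}|g" "$target"   # GNU sed in-place edit
  done < "$props"
}
printf 'Deploy %%integer%%, property set to : %%property%%\n' > runtime.txt
printf 'property=Local Context\ninteger=1\n' > props
detokenise runtime.txt props
cat runtime.txt
```

Each property replaces its matching token, leaving a file ready for the target environment.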

This now completes an end-to-end example of CDAF, from configuration management, build & package through to deployment. Following are some common additional configuration elements, and the final step covers the increasingly less common pattern of Remote tasks.

Alternate Tasks

If you require a variety of tasks, you can explicitly define them, which will cause any tasksRun.tsk and tasksRunLocal.tsk in your solution root to be ignored. Place your task files in a directory named either custom or customLocal in your solution root.

To map your configuration to the alternate tasks, you must use the column name deployTaskOverride.

context  target  deployTaskOverride     databaseFQDN        dBpassword
local    TEST    simple-db-deploy.tsk   db1.nonprod.local   $db1Pass
local    UAT     simple-db-deploy.tsk   $db2Pass
local    PROD    cluster-db-deploy.tsk  $prodPass

Remote Tasks

Tasks run in a remote context. This approach has become less common due to the license barriers to installing deployment agents, and the client-oriented nature of modern agents, which reduces the need for push deployments.

Remote Deployment Tasks

Remote Tasks

Like Local Tasks, Remote Tasks use the same execution engine as build tasks, but at deploy time rather than build time. Remote Tasks are executed in the local context of a remote host/server. They are suited to situations where the agent is not installed on the server where tasks are to be performed and the deployment is instead pushed, i.e. to an application server in the DMZ which can only be accessed by Remote PowerShell or SSH.

The Remote Task is executed in a local context, so all the processes described in Local Tasks apply; however, how the deployment package is made available to the execution engine differs, along with the pre-execution steps required to make execution on the remote host possible.

SSH/SCP or Remote PowerShell with custom file transfer

Remote PowerShell for Windows or SSH/SCP for Linux are the protocols used to transfer the Remote Task package to the remote host for execution. PowerShell does not have a file transfer protocol (Windows is typically reliant on SMB), so a CDAF feature has been provided to allow a file transfer mechanism similar to SCP on Linux.

Nested Package

When using Remote Tasks, a reduced set of CDAF helper scripts are packed into a nested compressed file. This file is transferred to the remote host and then unpacked. Once unpacked, the properties for the current release environment are transferred to the remote host, and then the deployment is executed.
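The Linux flow can be sketched with standard OpenSSH tooling. This is a dry-run sketch under stated assumptions: the package name, properties path and deploy.sh entry point are hypothetical, not the actual CDAF scripts.

```shell
#!/usr/bin/env bash
# Dry-run sketch of the assumed Linux push flow, not the actual CDAF scripts.
# File names (package.tar.gz, deploy.sh) and property paths are hypothetical.
RUN=echo   # set RUN= (empty) to execute for real
deployHost="linux.local"   # values as per the .cm example below
remoteUser="adminuser"
environment="VAGRANT"
# Transfer the nested package and the target environment properties
$RUN scp package.tar.gz "${remoteUser}@${deployHost}:~/"
$RUN scp "propertiesForRemoteTasks/${environment}" "${remoteUser}@${deployHost}:~/"
# Unpack and execute the deployment on the remote host
$RUN ssh "${remoteUser}@${deployHost}" "tar -xzf package.tar.gz && ./deploy.sh ${environment}"
```

With `RUN=echo` the commands are printed rather than executed, which is useful for verifying the flow before pointing it at a real host.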

Remote Task Configuration

The default authentication for transferring the remote files is pre-shared keys for Linux and a domain service principal for Windows; however, alternative authentication methods are supported.

context  target   deployHost   remoteUser
remote   VAGRANT  linux.local  adminuser

Windows PowerShell Authentication Options

The simplest authentication option is to use a username and password; do not store the password in source control, instead use a variable.

Environment variables are the recommended approach because this allows execution on a desktop or in a pipeline.

context  target   deployHost          remoteUser           remotePass
remote   VAGRANT  windows.mshome.net  windows-1\adminuser  $env:CDAF_PS_USERPASS

Release Approaches

Now that there is an automated deployment mechanism, a variety of release and deployment strategies can be considered.