“Security is 1% technology plus 99% following the procedures correctly” — Tom Limoncelli
Having dealt with GPG last week at work,
I remembered that I had intended to write a blog post
about how we used GPG, Blackbox, and Paperkey to store secrets in Git
at my previous job.
We used Blackbox to manage secrets that were needed
during development, build, deployment, and runtime.
These secrets included AWS credentials, Docker registry credentials,
our private PyPI credentials, database credentials, and certificates.
We wanted these secrets to be under version control,
but also to be secure.
For example, we had a credentials.sh that exported environment variables,
which was managed by Blackbox:
# Save current value
…continue.
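A minimal sketch of what such a file looks like (all variable names and values here are placeholders, not from the original post):

```shell
#!/bin/sh
# credentials.sh -- kept encrypted at rest in Git by Blackbox;
# decrypted before being sourced by build and deploy scripts.
# Values below are placeholders, not real credentials.
export AWS_ACCESS_KEY_ID="AKIAEXAMPLEKEY"
export AWS_SECRET_ACCESS_KEY="example-secret"
export DOCKER_REGISTRY_PASSWORD="example-docker-password"
export PYPI_PASSWORD="example-pypi-password"
```

Scripts then `source credentials.sh` to pick up the secrets as environment variables.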
[Previously published at the now defunct MetaBrite Dev Blog.]
A collection of miscellaneous tips on using Pipelines in Jenkins 2.0.
#6 in a series on Jenkins Pipelines
Environment Variables
Use the withEnv step to set environment variables.
Don't manipulate the env global variable.
The confusing example that you see in the documentation,
PATH+WHATEVER=/something,
simply means to prepend /something to $PATH.
The +WHATEVER label has no other effect.
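For example (the MAVEN label and the path are made up here), prepending a directory to PATH for the steps inside a block looks like:

```groovy
// The text after "PATH+" (here MAVEN) is an arbitrary label;
// /opt/maven/bin is prepended to $PATH for every step in the block.
withEnv(['PATH+MAVEN=/opt/maven/bin', 'BUILD_FLAVOR=release']) {
    sh 'echo $PATH'
}
```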
Credentials
The withEnv step should not be used to introduce secrets into the build environment.
Use the withCredentials step, provided by the Credentials Binding plugin, instead.
withCredentials([
    [$class: 'StringBinding', credentialsId: 'GPG_SECRET', variable: 'GPG_SECRET'],
    [$class: 'AmazonWebServicesCredentialsBinding',
     credentialsId: '0defaced-cafe-f00d-badd-0000000ff1ce',
     accessKeyVariable: 'AWS_ACCESS_KEY_ID',
…continue.
[Previously published at the now defunct MetaBrite Dev Blog.]
Jenkins Pipelines are written in a Groovy DSL.
This is a good choice but there are surprises.
#5 in a series on Jenkins Pipelines
Groovy as a DSL
Groovy lends itself to writing DSLs (Domain-Specific Languages)
with a minimum of syntactic overhead.
You can frequently omit the parentheses, commas, and semicolons that litter other languages.
Groovy has interpolated GStrings, lists, maps, functions, and closures.
Closures
Closures are anonymous functions where state can be captured at declaration time
to be executed later.
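A tiny illustration of that capture, in plain Groovy rather than Pipeline code:

```groovy
def greeting = 'Hello'                                 // state captured at declaration
def greet = { String name -> "${greeting}, ${name}!" } // closure, not yet executed
assert greet('Jenkins') == 'Hello, Jenkins!'           // executed later
```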
The blocks that follow many Pipeline steps (node, stage, etc.) are closures.
Here's an example of a Closure called acceptance_integration_tests,
where the release_level parameter …continue.
[Previously published at the now defunct MetaBrite Dev Blog.]
If there isn't a built-in Pipeline step to accomplish something,
you'll almost certainly use the sh step.
#4 in a series on Jenkins Pipelines
The sh step runs the Bourne shell—/bin/sh, not Bash aka the Bourne-again shell—with the -x (xtrace) and -e (errexit) options.
The xtrace option means that every step in the sh block is echoed to the Jenkins log,
after commands have been expanded by the shell.
This is useful but you could echo the contents of passwords or secret keys inadvertently.
Use set +x in your sh block to control this.
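For example (the registry name and variables are illustrative, not from the original post):

```groovy
sh '''
    set +x   # stop echoing: the login line must not appear in the Jenkins log
    docker login -u "$REGISTRY_USER" -p "$REGISTRY_PASS" registry.example.com
    set -x   # resume tracing for the rest of the block
    docker push registry.example.com/myapp:latest
'''
```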
The errexit option means that the …continue.
[Previously published at the now defunct MetaBrite Dev Blog.]
Much of our code is in one large GitHub repository,
from which several different applications are built.
When changes are pushed to the master branch,
we want only the applications in affected directories to be built.
This was not easy to get right with “Pipeline script from SCM” builds.
#3 in a series on Jenkins Pipelines
Configuration
To get builds to trigger upon a push to GitHub,
you need to configure a webhook pointing to your Jenkins Master.
Create an SSH key for Jenkins/GitHub.
A passphrase is recommended.
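Generating such a key might look like this (the file name, comment, and passphrase are your choice; shown non-interactively here, but you can omit -N to be prompted):

```shell
# Create a dedicated key pair for Jenkins to authenticate to GitHub.
# Replace the -N argument with a real passphrase of your own.
ssh-keygen -t ed25519 -N "choose-a-passphrase" -C "jenkins@example.com" \
    -f ./jenkins_github -q
# Public half (./jenkins_github.pub): add to GitHub as a deploy key
# or to a machine user's SSH keys.
# Private half (./jenkins_github): add to the Jenkins credentials store.
```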
[Previously published at the now defunct MetaBrite Dev Blog.]
The “slave” terminology is unfortunate,
but the utility of running a Jenkins build on a node that you've configured
at Amazon's EC2 is undeniable.
#2 in a series on Jenkins Pipelines
We needed to install system packages on our build nodes,
such as Docker or Postgres.
For obvious reasons,
CloudBees—our Jenkins hosting provider—won't let you do that on their systems.
You must provide your own build nodes,
where you are free to install whatever you like.
We already use Amazon Web Services,
so we chose to configure our CloudBees account with EC2 slaves.
We had a long and fruitless detour through On-Premise Executors,
which I …continue.
[Previously published at the now defunct MetaBrite Dev Blog.]
The MetaBrite dev team migrated most of their builds
from Atlassian's Bamboo Cloud to Jenkins Pipelines in late 2016/early 2017.
This is a series of blog posts about that experience.
Jenkins Pipeline Series
The series so far:
Eviction
For three years, we used Atlassian's hosted Bamboo Cloud service
to build and deploy most of our code.
In the summer of 2016,
Atlassian announced that they were
going to discontinue Bamboo Cloud on January 31st, 2017.
We looked around for a suitable replacement.
We did not find …continue.
I've been using Jenkins lately,
setting up Pipeline builds.
I have mixed feelings about that,
but I'm quite liking Groovy.
Here's an example of a Closure called acceptance_integration_tests,
where the release_level parameter is a String
which must be either "dev" or "prod".
def acceptance_integration_tests = { String release_level ->
    assert release_level =~ /^(dev|prod)$/
    String arg = "--${release_level}"
    def branches = [
        "${release_level}_acceptance_tests": {
            run_tests("ci_acceptance_test", arg, '**/*nosetests.xml')
        },
…continue.