Using helm-secrets

helm-secrets is a great plugin for keeping secrets out of your source code.

Here, I am using HashiCorp Vault to store secrets and retrieve them safely into Helm values files while installing Helm charts.

Installation

$ helm plugin install https://github.com/jkroepke/helm-secrets

Setup

  $ export VAULT_TOKEN="s.VAULT_TOKENEXAMPLEASLDKASKDASDA" 
  $ export VAULT_ADDR="https://vault.example.com" 
  $ export HELM_SECRETS_DRIVER=vault

In Vault, add the secrets:
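For example, the database credentials could be stored under the secret/misp path as follows (a sketch, assuming a KV version 1 secrets engine mounted at secret/; the values shown are placeholders):

```shell
# Store the MISP database credentials in Vault (placeholder values).
vault kv put secret/misp \
  db_database=misp \
  db_username=misp \
  db_password='changeme' \
  db_rootpassword='changeme-root'
```

This requires a running Vault server reachable at VAULT_ADDR with a valid token, so it won't work offline.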

In your Helm values file, refer to the secrets as follows:

db:
  database:     !vault secret/misp#db_database
  username:     !vault secret/misp#db_username
  password:     !vault secret/misp#db_password
  rootpassword: !vault secret/misp#db_rootpassword

Now change the helm upgrade command as follows:

$ helm secrets upgrade misp ./helm/misp --install --wait --atomic  --namespace=misp --create-namespace  --values=./helm/misp/values.yaml

The secrets plugin will fetch the secrets and substitute the Vault references in the values file before invoking the upgrade command on Helm.

Note:

To check the result of decoding, you can use:

$ helm secrets dec helm/misp/values.yaml

This will produce values.yaml.dec with the actual decoded values from HashiCorp Vault.

How to update Vault with an ADCS-issued Intermediate Certificate Authority

Start Vault server

vault server -dev
export VAULT_ADDR='https://vault.example.com'
export VAULT_TOKEN="s.q3M0FGIdtVu60hLJnwrU1JC2"
export VAULT_SKIP_VERIFY=1   # dev/testing only: skips TLS certificate verification
vault status

Enable Engine

vault secrets enable -path=pki_intermediate_ca_core pki
vault secrets tune -max-lease-ttl=87600h pki_intermediate_ca_core # 10 Years

Generate CSR

vault write pki_intermediate_ca_core/intermediate/generate/internal common_name="Example Company" ttl=87600h country="United Arab Emirates" locality="Dubai" organization="Example Company" ou="Technology Department"

 -OR-

vault write -format=json pki_intermediate_ca_core/intermediate/generate/internal common_name="Example Company" ttl=87600h country="United Arab Emirates" locality="Dubai" organization="Example Company" ou="Technology Department" | jq -r '.data.csr' > pki_intermediate.csr

Key    Value

---    -----

csr    -----BEGIN CERTIFICATE REQUEST-----
MIICWTCCAUECAQAwFDESMBAGA1UEAxMJR2FuZCBCYW5rMIIBIjANBgkqhkiG9w0B
AQEFAAOCAQ8AMIIBCgKCAQEA7e1qj67LeZCnDPKa+14YCWp8XtbG4soRs544lIJW
YipBB5eCaiRazfA5kxWUv3fOklP7/pjCkeCNhryjS5DB1GK1EdgZNFpS8odqxXwY
t4CPECGVRzSK4Cce4OKBXFMKRuTuKgWH9i9Nt+eGaxD2gOkGTruuWyTiLUnr6/mx
PyenoHzMqyeUifTv0M651KUztqPJPvSz0SSO4+jpEIrGPNYEIET1Ce/1Opkf0kCq
vtCOFzIcVqzq/bYUjtkBvKgg7kyUG/EXAMPLEKJLyVA2ij3wC5LXD2Z8OMcr
iGeSqmrOKAeAJeOwfnULIhdsXABXouWlQwi+yhS5cS7QAQIDAQABoAAwDQYJKoZI
hvcNAQELBQADggEBAMPAABu9I+ezwm//CjDiIPhhQQQSsgmXPR9SdQMDkM94hGOQ
WkWFL66RDBZp/kC+OwNDC1lj7hPLGzhhZCQY3xtzcCVhRS8C1LZYiKlZ5HyY+9GG
KwBrOsBVNTyiLTDkpuGNhmUfJbIoM2fLbKoTQ7lWjaH+Ryyd7Ud8eB6L5FLXPpQm
QjdnhXqtQ7Z1u8Q66UzR7wXHhKTZn0ZBxS0C2m85pwgVdVQepL8KyGMx6zRAveyJ
wcZ4L+Ni7op7fO6nb78cfnMSE6Ja5X0KgIU0VPbVbwFAACHkNA9fP5DNvfa5DCWq
7RxQqJ7sQflnVulQ4qUnN1Y1seqFl8W36G3V8uM=
-----END CERTIFICATE REQUEST-----

* Sign the CSR with ADCS (export in x509 Base64 format = PEM)

Note: Run the following on the ADCS server, passing the CSR generated above:

    > certreq -submit -attrib "certificatetemplate:SubCA" pki_intermediate.csr

-----BEGIN CERTIFICATE-----
MIIE+zCCAuOgAwIBAgITUAAAAARrstY4ahm6yQAAAAAABDANBgkqhkiG9w0BAQsF
ADAcMRowGAYDVQQDExFHYW5kIEJhbmsgUm9vdCBDQTAeFw0yMjA0MDIwMTQ5MTJa
Fw0yMzA0MDIwMTU5MTJaMBQxEjAQBgNVBAMTCUdhbmQgQmFuazCCASIwDQYJKoZI
hvcNAQEBBQADggEPADCCAQoCggEBAO3tao+uy3mQpwzymvteGAlqfF7WxuLKEbOe
OJSCVmIqQQeXgmokWs3wOZMVlL93zpJT+/6YwpHgjYa8o0uQwdRitRHYGTRaUvKH
asV8GLeAjxAhlUc0iuAnHuDigVxTCkbk7ioFh/YvTbfnhmsQ9oDpBk67rlsk4i1J
6+v5sT8np6B8zKsnlIn079DOudSlM7ajyT70s9EkjuPo6RCKxjzWBCBE9Qnv9TqZ
H9JAqr7QjhcyHFas6v22FI7ZAbyoIO5MlBv2qEANd2Eq/8iiS8lQNoo98AuS1w9m
fDjHK4hnkqpqzigHgCXjsH51CyIXbFwAV6LlpUMIvsoUuXEu0AECAwEAAaOCATww
ggE4MB0GA1UdDgQWBBR6lWYqP/8bOMicwdXJFJ7AnqIUuzAfBgNVHSMEGDAWgBQF
vqWonZOoah+8S0tku+yRQYOkyTBQBgNVHR8ESTBHMEWgQ6BBhj9maWxlOi8vLy9X
SU4tMUlCQzRJRjlKVUYvQ2VydEVucm9sbC9HYW5kJTIwQmFuayUyMFJvb3QlMjBD
QS5jcmwwawYIKwYBBQUHAQEEXzBdMFsGCCsGAQUFBzAChk9maWxlOi8vLy9XSU4t
MUlCQzRJRjlKVUYvQ2VydEVucm9sbC9XSU4tMUlCQzRJRjlKVUZfR2FuZCUyMEJh
bmslMjBSb290JTIwQ0EuY3J0MBkGCSsGAQQBgjcUAgQMHgoAUwB1AGIAQwBBMA8G
A1UdEwEB/wQFMAMBAf8wCwYDVR0PBAQDAgGGMA0GCSqGSIb3DQEBCwUAA4ICAQA4
3TviPyTXM6H+G3WCzdNMhcjauoEXAMPLEGI8JfdBsZayeEtw0ZHLbiEWDvylX
CN5FOoKImfcUNDXMzQY9PiokGKo69WtIUx5V+AZwMxDFoW9tkvrtO5AVtHJLlL5l
MDqD92dDAnojHGn8BDjlrVIxvMomMRXi5p6sksSwDijgnpIJtiml+Ss5nyI7JjID
X2x5fvhRP2kqQHisdpCWyz+l8jqj3dCFsECSHkoGJjhkj8sywJK2kK9h5sXMyj0K
VNRLLli1BaWFYk0++GVK72CnzaTBXw389Pv1a+B3yOYzd+QEoprSs7RUajHPbRmF
iepFIISHGdWrtWxH9W+9R4iWWHzQ7fUNAFjtVBo7inTEtlHYH+EFCv3sgnWW+mkr
AVU/dZV+XsLzBhbd0tm21cn3hWaGMujxswGNHvKw2uvo5KL277VKrgDwEWTIMx/L
LkqCEg23eN7n5oefbULFhJVn9RBFvvjdDs2q81mp/LgeXkJVesdR1Fe83TzXRiR9
gBLRw6RRqWWuRibsGhMl16LthQMFRBbucnBwQfLCxKdV9mv+s5nUrTbUBXSQDFcE
Dyk0Z/BmEPtiRWcQzyzYR4TwWLO3ejPexfZz1rAZdfZMKSuYnz0LqXQ6l2Kjs7b3
nbjn1W7s0CSzE4HomHwKRCqlBJUb/XapqilsQ5kTpQ==
-----END CERTIFICATE-----

Set Intermediate CA in Vault

cat From_ADCS_signed_certificate.pem > full_chain.pem
cat ADCS_root.pem >> full_chain.pem
vault write pki_intermediate_ca_core/intermediate/set-signed certificate=@full_chain.pem 
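The assembled chain can be sanity-checked locally with openssl (a sketch; assumes openssl is installed and uses the file names from the commands above):

```shell
# Confirm the ADCS-signed intermediate chains up to the root.
openssl verify -CAfile ADCS_root.pem From_ADCS_signed_certificate.pem

# Print subject/issuer of each certificate to confirm the order in full_chain.pem.
openssl crl2pkcs7 -nocrl -certfile full_chain.pem | openssl pkcs7 -print_certs -noout
```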

# The following fixes the issuing certificate and CRL URLs:

vault write pki_intermediate_ca_core/config/urls \
            issuing_certificates="https://vault.example.com/v1/pki/ca" \
            crl_distribution_points="https://vault.example.com/v1/pki/crl"

Test

# Issue a cert:

vault write pki_intermediate_ca_core/roles/generic_server_cert allowed_domains="example.com" max_ttl="43830h" allow_subdomains=true #5 years
vault write pki_intermediate_ca_core/issue/generic_server_cert common_name="testserver01.example.com" ttl="24h" > testserver01
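To read the issued certificate's subject and validity window directly, the JSON output can be piped through jq and openssl (a sketch; assumes both tools are on the PATH and a live Vault server):

```shell
# Issue a certificate and inspect it without saving intermediate files.
vault write -format=json pki_intermediate_ca_core/issue/generic_server_cert \
    common_name="testserver01.example.com" ttl="24h" \
  | jq -r '.data.certificate' \
  | openssl x509 -noout -subject -dates
```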

Ref:

[1] https://www.vaultproject.io/api-docs/secret/pki#set-signed-intermediate
[2] https://www.vaultproject.io/docs/secrets/pki 

Can a CodePipeline GitHub source provide more than just one branch

A lot of customers ask:

Can a CodePipeline GitHub source provide more than just one branch?

CodePipeline can currently only run on a single branch per source action, which is defined in the action configuration. You need to specify a Git repository and branch name when you create a pipeline, so if the branch name is not known in advance, as in the GitFlow branching model, it is not possible to create the pipeline ahead of time.

CodePipeline being tied to a single Git branch is more of a feature than a limitation, as the design leans towards trunk-based development. Also, as per the designers of this service, CodePipeline is designed for post-merge/release validation. That is, once your change is ready to be released to production and is merged into your master/main branch, CodePipeline takes over and automatically tests and releases the final merged set of changes. CodePipeline has a lot of features like stage locking, superseding versions, etc. which don't work well for the case where you want to test a change in isolation before it's merged (e.g. feature branch testing or pull request testing). Therefore there currently isn't a recommended way to do this in CodePipeline.

Pipeline Topology Best Practices

Multiple Pipelines

  • 1 pipeline per environment sounds sensible as it respects account boundaries
  • Configuration drift is an issue due to multiple templates
  • Different pipelines = different way of managing change = different results!
  • Different build artifacts per environment means you aren't building confidence in the artifact you promote

What we often see is customers setting up one pipeline per environment. This sounds sensible, as it respects the boundaries which an account is designed for! Who has a design like this?

The issue is that it also means that you get drift in the definition of these pipelines. Using CloudFormation is a good practice for defining your pipeline in code, but we still see customers creating different templates for different environments. The problem with this is that when you define the change process differently for each environment, you’re going to get different results! As you move across environments, you’re not building confidence as you promote code changes through the environments. You’re using different build artifacts for different environments, and you’re getting slowed down by the need to process this once for each environment.

We’re starting to consider this a bit of an anti-pattern as we see customers struggling with this.

Single Pipeline

  • 1 pipeline is better: 1 build artifact being promoted; 1 place to see all changes being made
  • Risky if it’s in Dev
  • Hard to change if it’s in Prod

A better way to set things up is to have a pipeline that resides in your Dev or Prod environment, with different stages for different environments. Who has this sort of design?

This means that any change being made to your app/service/microservice is visible in one place – you can visualize the entire SDLC. Of course you want to store it in code, and you want to ensure that the right people have the right permissions to change it.

Setting things up this way will increase your release velocity, as everyone can easily see what’s happening to code changes as they pass through the environments. This means that you don’t need to coordinate across team members and don’t need to manage multiple pipelines.

The downside of this approach is the lack of environmental isolation. If your pipeline is defined in a Dev account, those accounts tend to be a little lighter on security, so you run the risk of a bad actor introducing malicious code into the pipeline that can be pushed into Production. And if the pipeline is defined in Prod, it is hard to get access to change the earlier stages of the pipeline! So although this pattern will increase your release velocity, it isn't ideal from a security perspective.

So what is the best practice?

Pipeline in a "Tools" or "Shared" Account

  • 1 pipeline in its own account using cross-account roles
  • Good for security
  • Fast to make changes

Well, we’re seeing customers have a lot of success with sticking DevOps tools like CD pipelines in their own account. The pipelines in this account can assume cross-account roles to access resources in the Dev, Test and Production accounts. This imposes best security practices as well as operational practices in how you define your pipelines in code and how you manage the end-to-end flow of your changes to your application.
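A rough sketch of how that cross-account access is exercised (the account ID and role name here are made up for illustration): the pipeline's role in the tools account assumes a deployment role that each target account exposes.

```shell
# Hypothetical example: from the tools account, assume the deployment role in
# the target (e.g. Prod) account. 111111111111 and the role name are placeholders.
aws sts assume-role \
  --role-arn arn:aws:iam::111111111111:role/PipelineDeploymentRole \
  --role-session-name codepipeline-deploy
```

For this call to succeed, the target account's role must list the tools account (or the pipeline's role) as a principal in its trust policy.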

AWS Certified Solutions Architect - Professional today - YAY

I have 4 out of 5 AWS Certifications today.

AWS Certified Solutions Architect - Professional (English) Completed
Hello Shariq Mustaquim,
Congratulations! You have successfully completed the AWS Certified Solutions Architect - Professional exam and you are now AWS Certified. You can now use the AWS Certified Solutions Architect - Professional credential to gain recognition and visibility for your proven experience with AWS services.

...

Overall Score: 90%
Topic Level Scoring:
1.0  High Availability and Business Continuity: 90%
2.0  Costing: 100%
3.0  Deployment Management: 100%
4.0  Network Design: 85%
5.0  Data Storage: 90%
6.0  Security: 92%
7.0  Scalability & Elasticity: 81%
8.0  Cloud Migration & Hybrid Architecture: 85%