The more effectively you cache, the less often your files are downloaded from the S3 bucket. Bandwidth is money, so you can save money by setting a long max-age on your objects.
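The 31536000 used in the dpl command below is simply one year expressed in seconds, which you can verify with a quick shell arithmetic check:

```shell
# Cache-Control max-age is expressed in seconds; one year is a common "long" value:
echo $((60 * 60 * 24 * 365))   # prints 31536000
```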
1- Set the Cache-Control property:
Edit the .gitlab-ci.yml file and add the property to your S3 command, or pass it to the dpl command as below:
$ dpl --skip_cleanup --provider=s3 --bucket=$S3_BUCKET --region=$AWS_REGION \
--cache-control='public, max-age=31536000' --local-dir=public/
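To see this in context, here is a sketch of how the deploy job could look inside .gitlab-ci.yml (the job name `deploy_s3` and the stage name are assumptions, not from the original pipeline):

```yaml
# Hypothetical deploy job -- adapt the job and stage names to your pipeline.
deploy_s3:
  stage: deploy
  script:
    - dpl --skip_cleanup --provider=s3 --bucket=$S3_BUCKET --region=$AWS_REGION
      --cache-control='public, max-age=31536000' --local-dir=public/
```

$S3_BUCKET and $AWS_REGION would be defined as CI/CD variables in your GitLab project settings.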
2- Clear the CDN cache
You need a tool to clear the cache on the CDN side, otherwise your updates will never be seen :(
The Cloudflare Go library includes a tool named flarectl. You can use it to purge the cache of your zone.
flarectl requires two environment variables:
| Name | Value |
|---|---|
| CF_API_EMAIL | Your Cloudflare login (email) |
| CF_API_KEY | Your API key (you can find it on your profile page) |
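To try flarectl locally, export both variables first. The values below are placeholders, not real credentials; in GitLab CI you would normally define them as CI/CD variables in the project settings instead:

```shell
# Placeholder credentials -- replace with your real Cloudflare login and API key.
export CF_API_EMAIL='you@example.com'
export CF_API_KEY='0000000000000000000000000000000000000'
# With both variables set, flarectl can authenticate, e.g.:
#   flarectl zone list
```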
Add a new stage to your GitLab pipeline, as in this example:
stages:
  - build
  - deploy
  - clear-cache

...

purge_cache:
  stage: clear-cache
  image: golang
  script:
    - go get -u github.com/cloudflare/cloudflare-go/cmd/flarectl
    - flarectl z purge --zone my-blog.com --everything
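Once the pipeline has run, you can check what the CDN actually serves. In practice you would pipe `curl -sI https://my-blog.com/` into the grep below; here the response headers are simulated so the filtering can be shown offline (`cf-cache-status` is the header Cloudflare uses to report cache hits):

```shell
# Real check: curl -sI https://my-blog.com/ | grep -iE 'cache-control|cf-cache-status'
# Simulated response headers, so the filter works without network access:
printf 'HTTP/2 200\ncache-control: public, max-age=31536000\ncf-cache-status: HIT\n' \
  | grep -iE 'cache-control|cf-cache-status'
```

A `cf-cache-status: HIT` means Cloudflare served the object from its cache, i.e. S3 was not hit at all.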
Now let’s try it!