EvaporateJS is a JavaScript library for uploading files to an S3 bucket using multipart uploads. You can pause, resume, or cancel an upload.
I’ve tested it with Minio before moving to AWS or Scaleway.
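If you manage your front-end dependencies with npm, one way to pull in the library is from the npm registry, where EvaporateJS is published under the package name `evaporate`:

```shell
# Add EvaporateJS to a Node/npm-based project.
npm install evaporate --save
```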
Prerequisites

Install dependencies

You have to install the following applications:
- Minio
- Node.js and npm

You can use brew or any other package manager, depending on whether you're running macOS or Linux.
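On macOS with Homebrew, for example, the two dependencies can be installed like this (on Linux, use your distribution's package manager instead):

```shell
# Install the Minio server from its Homebrew tap.
brew install minio/stable/minio

# Install Node.js, which ships with npm.
brew install node
```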
Start Minio

Once Minio is installed, you can start it with the following command:
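A minimal way to launch a local Minio server, assuming `~/data` as the storage directory (any writable path works):

```shell
# Start Minio using ~/data as its storage directory.
# On startup it prints the endpoint URL plus the access and secret keys
# to configure in your S3 client.
minio server ~/data
```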
The more you use caching, the less your files are downloaded from the S3 bucket. Bandwidth is money, so you can save money by setting a long max-age for your objects.
1- Set the Cache-Control property

Edit the .gitlab-ci.yml file and add the property to your S3 command, or to the dpl command as below:
$ dpl --skip_cleanup --provider=s3 --bucket=$S3_BUCKET --region=$AWS_REGION \
  --cache-control='public, max-age=31536000' --local-dir=public/

2- Clear the cache of the CDN

You need some tool to clear the cache on the CDN side, otherwise your updates will never be seen :(
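With Cloudflare as the CDN, for instance, the cache can be purged through its API. A sketch, assuming `$CF_ZONE_ID` and `$CF_API_TOKEN` hold your zone ID and an API token with cache-purge permission:

```shell
# Purge the entire Cloudflare cache for one zone.
# $CF_ZONE_ID and $CF_API_TOKEN are placeholders for your own credentials.
curl -X POST "https://api.cloudflare.com/client/v4/zones/$CF_ZONE_ID/purge_cache" \
  -H "Authorization: Bearer $CF_API_TOKEN" \
  -H "Content-Type: application/json" \
  --data '{"purge_everything":true}'
```

This step could also run as a final job in `.gitlab-ci.yml`, so every deployment invalidates the CDN cache automatically.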
This article will show you how to automate the deployment of your Hugo site to your S3 bucket.
1- Create a new repository on GitLab

I chose GitLab because of its CI/CD features. Use your GitLab account and create a new project. You can name the project after your domain name.
Once you have created your project on GitLab, you can add Git support to your local project:
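A typical sequence looks like this; the remote URL is a placeholder, so substitute your own GitLab username and project name:

```shell
# Initialize the local repository and link it to the GitLab project.
# <username> and <project> are placeholders.
cd my-blog
git init
git remote add origin git@gitlab.com:<username>/<project>.git

# First commit and push.
git add .
git commit -m "Initial commit"
git push -u origin master
```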
When I was searching for a solution to host my tech blog, I heard that the S3 + Cloudflare combo was the cheapest.
I bought my domain name on Gandi. I use Hugo to manage the content of the website.
The architecture
The setup steps

1- Create a new site with Hugo

Install Hugo and run the following command:
$ hugo new site my-blog

Note: replace "my-blog" with the name of your site.
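Before pushing anything to S3, you can preview the site locally with Hugo's built-in development server (the `-D` flag also renders draft posts):

```shell
# Serve the site at http://localhost:1313 with live reload, including drafts.
cd my-blog
hugo server -D
```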