I finally managed to automate deployment of parsiya.io with Travis CI. Not having done this before, I encountered some pitfalls. I also had two extra problems:

- The structure of the blog is different from most Hugo deployments. `Parsia-Clone` only contains the `content` directory. The parent (theme and everything else) is in the parsiya.io repo. So while we push to `Parsia-Clone`, we need to clone `parsiya.io` and build the website there.
- I am hosting it out of an S3 bucket. All other examples were using GitHub Pages.
Update November 2020: As of late November 2020, I have switched to GitHub Actions for both parsiya.net and parsiya.io. Please see `deploy.ymlOLD`.

Update February 2021: parsiya.io is now hosted on GitHub Pages with a custom domain instead of an S3 bucket. Please see the workflow file at `gh-pages.yml`.
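For reference, a minimal GitHub Actions workflow that builds a Hugo site (with submodules) and publishes it to GitHub Pages could look like the sketch below. The `peaceiris/actions-hugo` and `peaceiris/actions-gh-pages` community actions are assumptions for illustration, not necessarily what `gh-pages.yml` actually uses:

```yaml
name: deploy
on:
  push:
    branches: [master]

jobs:
  build-deploy:
    runs-on: ubuntu-latest
    steps:
      # fetch the repo along with the theme/content submodules
      - uses: actions/checkout@v2
        with:
          submodules: recursive
      # assumption: community action that installs Hugo
      - uses: peaceiris/actions-hugo@v2
        with:
          hugo-version: 'latest'
      # build the site into the public directory
      - run: hugo
      # assumption: community action that pushes public/ to the gh-pages branch
      - uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./public
```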
TL;DR
- Sign into travis-ci.org with your GitHub account.
- Alternatively, create an access token. All my repositories are public, so I do not care.
- Add the repository containing the content. In this case, `Parsia-Clone`.
- Enable `Build pushed branches`.
- Create the destination S3 bucket (e.g. `BUCKET_NAME`).
- Create the following Amazon IAM policy and substitute `BUCKET_NAME`. This policy only gives read/write access to `BUCKET_NAME`.
`travis-write-policy`

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": ["arn:aws:s3:::BUCKET_NAME"]
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject",
        "s3:AbortMultipartUpload",
        "s3:GetObjectAcl",
        "s3:PutObjectAcl"
      ],
      "Resource": ["arn:aws:s3:::BUCKET_NAME/*"]
    }
  ]
}
```
- Create a group with the previous policy (e.g. `travis-writers`).
- Create a user and add it to the `travis-writers` group. Copy the AWS access/secret keys.
- Create `.travis.yml` in `Parsia-Clone`.

`.travis.yml`

```yaml
# safelist - only build on pushes to these branches
branches:
  only:
  - master
  - travis

# we needed this if we wanted to build Hugo manually
# language: go
# go:
#   - 1.10

install:
  # change this version as it goes up
  # get and install Hugo
  - wget https://github.com/gohugoio/hugo/releases/download/v0.40/hugo_0.40_Linux-64bit.deb
  - sudo dpkg -i hugo*.deb
  # clone the parent repository, note this is different from Parsia-Clone
  - git clone https://github.com/parsiya/parsiya.io
  - cd parsiya.io
  # update and fetch submodules
  - git submodule init
  - git submodule update --recursive --remote

script:
  # build the website with Hugo, output will be in public directory
  - hugo

# deploy public directory to the bucket
deploy:
  provider: s3
  access_key_id: $AWS_ACCESS_KEY
  secret_access_key: $AWS_SECRET_KEY
  bucket: BUCKET_NAME
  region: us-east-1
  local-dir: public
  skip_cleanup: true
  acl: public_read
  on:
    # make it work on branch other than master
    # change this to master or any other branch if needed
    branch: travis
```
- Add the AWS keys in `Settings > Environment Variables` (do not include `$`): `AWS_ACCESS_KEY` and `AWS_SECRET_KEY`.
- Push a commit and enjoy the deployed blog in your bucket.
Now for the longer version.
Setup
My git structure is unnecessarily complex. I will explain it in detail in a different blog post, but I wanted to keep `Parsia-Clone` intact and did not want to add the modified Hugo-Octopress theme to it. `Parsia-Clone` is in the `content` directory. The parent is the parsiya.io repository, which contains the theme and the clone as submodules.
After every push to the `Parsia-Clone` repository, Travis CI will:
- Create a new default container with Go.
- Install Hugo.
- Clone the `parsiya.io` repository.
- Update and fetch submodules (theme and `Parsia-Clone`).
- Build the website and deploy it to S3.
- ???
- Profit.
We can see all of this in the `.travis.yml` file above. Let's talk about its sections a bit:
safelist
Safelist tells Travis CI to only build on certain branches. In this case, I am pushing to `master` and `travis`.
```yaml
# safelist - only build on pushes to these branches
branches:
  only:
  - master
  - travis
```
language
If you want to build Hugo manually instead of downloading a deb, you can install Go and configure it.
```yaml
language: go
go:
  - 1.10
```
install
`install` runs commands after the push and sets up the environment:

- `wget https://github.com/gohugoio/hugo/releases/download/v0.40/hugo_0.40_Linux-64bit.deb`: Download the Hugo `deb`. At the time of writing, version `0.40` is out.
- `sudo dpkg -i hugo*.deb`: Install the `deb` file.
- `git clone https://github.com/parsiya/parsiya.io`: Clone the parent repository.
- `git submodule init` and `git submodule update --recursive --remote`: Update and fetch submodules. The submodules might have been updated since the last commit to `parsiya.io`, so they must be updated.
```yaml
install:
  # change this version as it goes up
  # get and install Hugo
  - wget https://github.com/gohugoio/hugo/releases/download/v0.40/hugo_0.40_Linux-64bit.deb
  - sudo dpkg -i hugo*.deb
  # clone the parent repository, note this is different from Parsia-Clone
  - git clone https://github.com/parsiya/parsiya.io
  - cd parsiya.io
  # update and fetch submodules
  - git submodule init
  - git submodule update --recursive --remote
```
script
Now we build the blog by running `hugo`. Without any parameters, this command builds the current website and puts the output in the `public` directory.
deploy
- Deploy to S3 every time the `travis` branch is updated. See `on` at the bottom of the file.
- Access and secret keys are set in environment variables above.
- Region is `us-east-1`. This is the default region and does not need to be provided; if your bucket is in a different region, be sure to change this.
- `local-dir: public`: Copy the `public` directory to the bucket.
- `bucket: BUCKET_NAME`: Destination bucket, change this.
- `skip_cleanup: true`: Do not delete the build artifacts.
- `acl: public_read`: Grant everyone read access to the bucket objects. This is only needed if you want to serve the bucket via HTTP. If you want to serve it over TLS via CloudFront, remove this and configure your bucket permissions for CloudFront properly.
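As a sketch of that last point, a CloudFront-backed deployment would simply drop the `acl` line; the bucket name and region below are placeholders, and the exact bucket policy for CloudFront is out of scope here:

```yaml
deploy:
  provider: s3
  access_key_id: $AWS_ACCESS_KEY
  secret_access_key: $AWS_SECRET_KEY
  bucket: BUCKET_NAME   # placeholder, use your own bucket
  region: us-east-1
  local-dir: public
  skip_cleanup: true
  # no acl here; grant CloudFront read access via the bucket policy instead
  on:
    branch: travis
```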
Pitfalls
This was the first time I used Travis CI, so I encountered some errors. I am documenting them here because, inevitably, I and some other people will run into these errors again. You're welcome, future me.
Add Environmental Variables with $ in Travis CI Web UI
Initially, when adding the environment variables, I added them as they appear in the `.travis.yml` file, meaning they started with `$`. I am not sure why I added them with the prefix. The error will be similar to:
The previous command failed, possibly due to a malformed secure environment variable.
Solution: Don't add your environment variables with the `$` prefix.
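In other words, `.travis.yml` references the variables with `$`, but the names stored in the Travis CI web UI are bare. A small sketch of the mapping:

```yaml
# .travis.yml uses the $ prefix to reference the variables:
deploy:
  access_key_id: $AWS_ACCESS_KEY     # web UI setting name: AWS_ACCESS_KEY (no $)
  secret_access_key: $AWS_SECRET_KEY # web UI setting name: AWS_SECRET_KEY (no $)
```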
Error with go get Hugo
To build Hugo from source, I initially set up the container with Go and then used `go get` to download and build Hugo. I got the error `imports context: unrecognized import path "context"`. This error usually means the Go version in use is older than 1.7, which is when the `context` package entered the standard library.
```
$ go get github.com/gohugoio/hugo
package github.com/gohugoio/hugo
	imports context: unrecognized import path "context"
The command "go get github.com/gohugoio/hugo" failed and exited with 1 during .
```
Solution: I decided not to build Hugo from source and instead installed the latest `deb` release.
Repository Name not Matching the Condition in Deploy
I am building inside the `parsiya.io` repo, but I had originally added `Parsia-Clone` as a condition in the deploy section of `.travis.yml` like this:
```yaml
deploy:
  provider: s3
  # ...
  acl: public_read
  on:
    # Make it work on branch other than master.
    branch: travis
    repo: parsiya/parsia-clone
```
I got this error: `this repo's name does not match one specified in .travis.yml's deploy.on.repo`.
Solution: Remove the `repo` condition. It's not needed.
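With the condition removed, the `on` block only keys off the branch, matching the working file above:

```yaml
deploy:
  provider: s3
  # ...
  acl: public_read
  on:
    # no repo condition; the branch check is enough
    branch: travis
```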
No DeleteObject Permission in AWS User Policy
If you do not grant permission to delete the previous objects in the bucket (overwriting them), the build will fail. The error message is a bit vague:
Oops, It looks like you tried to write to a bucket that isn't yours or doesn't exist yet. Please create the bucket before trying to write to it.
In general, this error message means you do not have enough access. I had initially only granted `PutObject` and `GetObject`.
Solution: Add the following permissions:

- `s3:PutObject`
- `s3:GetObject`
- `s3:DeleteObject`
- `s3:AbortMultipartUpload`
- `s3:GetObjectAcl`
- `s3:PutObjectAcl`