Hugo on AWS

My last post, Moving to Hugo, talked about my selection of a publishing platform. I took input from a few people and decided on a simple static site in Hugo. This post is the follow-up on how I did it, based on Nate’s post.

Hugo, done and decided. Now I had to figure out a better way to store my work. I have a few machines and I am not exclusive to any of them (not even my work machine), so some type of source control was in order. Sure, I had a few options; based on Nate’s post, Amazon has CodeCommit. I decided to keep this in GitHub instead. Most of my other projects are there, and its ease of use was well worth the choice. I am also trying to keep my options open on the platform I run this site on. So I created a new Git repository and added my new Hugo site to it.

Next, DNS, following what Nate did. Amazon’s Route 53 makes it stupid easy to buy a domain name. It took about five minutes to buy the domain, and twice as long for me to pick it.

Now for setting up the place to host the site: Amazon Simple Storage Service, or S3. Creating a bucket is super easy, but what Amazon tells you without making it clear is what to call the bucket. If your domain is ILikeTurtles.org, name your bucket “iliketurtles.org”. Trust me, this is a must.
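To make the naming rule concrete, here is a small sketch: bucket names must be lowercase, and for static hosting the name must match the domain exactly. The domain is the example from above, and the AWS CLI commands (shown as comments, since they need credentials) are one way to create the bucket and turn on website hosting:

```shell
# Bucket names must be lowercase and match the domain exactly.
domain="ILikeTurtles.org"
bucket=$(echo "$domain" | tr '[:upper:]' '[:lower:]')
echo "$bucket"   # iliketurtles.org

# With the AWS CLI configured, you would then run (needs credentials):
#   aws s3 mb "s3://$bucket"
#   aws s3 website "s3://$bucket/" --index-document index.html
```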

So far we have set up the following:

- A Git repository on GitHub holding the Hugo site
- A domain name bought through Route 53
- An S3 bucket named to match that domain

Now, my laziness is unparalleled, to say the least. I needed a way to make publishing to S3 easy. That came via AWS CodeBuild. This was kind of tricky to set up, but most of that was on me. The documentation on the buildspec file is extensive, almost too much, but before diving in you need to know what is going on here.

Understanding the phases matters: CodeBuild spins up a machine, builds your code, deploys it, then deletes the machine. Remember this. The first phase is install, and mine looked something like this:

install:
    runtime-versions:
        golang: ${golang_version}
    commands:
        - echo "Installing Pygments"
        - pip install Pygments
        - echo "Getting Hugo"
        - wget https://github.com/gohugoio/hugo/releases/download/v${hugo_version}/hugo_${hugo_version}_Linux-64bit.deb
        - echo "Installing Hugo"
        - dpkg -i hugo_${hugo_version}_Linux-64bit.deb
    finally:
        - hugo version

What is going on here? First, anything wrapped in ${} is a variable; you don’t have to use variables, but I am. The runtime-versions section declares the runtime we need installed so Hugo can run; Hugo is a Go application, so the Go runtime is a must. Next is the commands section, which does the actual installation of Pygments and Hugo. The finally section just prints the Hugo version. It isn’t needed, so feel free to omit it.
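For context, the install phase above lives inside a larger buildspec file. Here is a minimal sketch of the overall shape; the version number is the standard buildspec format version, and the placeholder commands are stand-ins, not my real settings:

```yaml
version: 0.2

phases:
  install:
    runtime-versions:
      golang: latest        # stands in for ${golang_version}
    commands:
      - echo "install steps go here"
  build:
    commands:
      - echo "build steps go here"

artifacts:
  files:
    - '**/*'
```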

Building is the next step. The build section is pretty straightforward. That being said, I did have a chance to call Nate to double check my sanity during this part. My build section looked like this:

build:
    commands:
        - (cd MyDir && hugo -v)
        - echo "S3 upload starting"
        - aws s3 sync MyDir/public/ s3://${s3_output}/ --region ${region} --delete
        - echo "S3 upload complete"

That isn’t the last part of the buildspec file; we also need to set up where the artifacts will live. This was simple enough and looked as follows:

artifacts:
  files:
     - '**/*'

Kick off a build and you will see your code residing in S3! Well, not yet; there was something we had to change in IAM first.

IAM is Amazon’s Identity and Access Management, and I had to make a small change here. Under the Roles tab there is a role associated with your build, and it needs elevated privileges: make sure it has permission to access the S3 bucket.
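To sketch what “permission to access the S3 bucket” could look like, here is a hypothetical IAM policy statement you might attach to the CodeBuild service role. The bucket name is the example domain from earlier; substitute your own, and scope the actions down further if you like:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject"
      ],
      "Resource": [
        "arn:aws:s3:::iliketurtles.org",
        "arn:aws:s3:::iliketurtles.org/*"
      ]
    }
  ]
}
```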

So can I build it and it will deploy my code?!? Yep, that is what it took to set that up. Kind of simple, but there were some gotchas in there. Note that we didn’t set up any webhooks to kick off builds when my branch changes. I am notorious for unnecessary Git commits, and CodeBuild is only free for the first 100 minutes of builds a month. Not worth the risk.

In come Build Triggers: there is a tab where you can set up a trigger for a build. In my case, daily builds are fine with me. My average build time, thanks to Hugo, is 20 seconds, so I am sure I will stay under that 100-minute-a-month threshold. If I am writing on the road, I can commit my post and wait for the daily build to publish it.
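The back-of-envelope math on the free tier, assuming one 20-second build per day:

```shell
# One daily build at ~20 seconds each over a 31-day month.
seconds_per_build=20
builds_per_month=31
minutes_used=$(( seconds_per_build * builds_per_month / 60 ))
echo "$minutes_used"   # about 10 minutes, well under the 100-minute free tier
```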

Last, we need to tie the DNS to the S3 bucket. This was the tricky part. Remember when I said earlier how to name your bucket? This is why. When you create a new record set, the drop-down will list your S3 bucket by name for you to select. Bind your domain to your S3 bucket and boom, done.
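Under the hood, that record set is an alias A record pointing at the S3 website endpoint. If you were doing it from the CLI instead of the console, a hypothetical Route 53 change batch could look like this; note the HostedZoneId here is the S3 website endpoint zone for us-east-1, which is region-specific, so look up the one for your region:

```json
{
  "Changes": [
    {
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "iliketurtles.org.",
        "Type": "A",
        "AliasTarget": {
          "HostedZoneId": "Z3AQBSTGFYJSTF",
          "DNSName": "s3-website-us-east-1.amazonaws.com.",
          "EvaluateTargetHealth": false
        }
      }
    }
  ]
}
```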

Overall this was a fun process. I had a few moments of frustration, but it was well worth it. Some possible applications of this setup? Think single-page applications like Angular.