AWS

Deploy a Hugo website from GitHub to S3 using GitHub Webhooks, API Gateway and Lambda

If you followed my previous posts on auto deploying a Hugo site from GitHub to S3 (Part 1, Part 2), you may have noticed that GitHub is deprecating the GitHub Services Integration mechanism. This mechanism is critical to the auto deployment, so we'll need an alternative. To add to my woes, I've found that the Node deployment package and all of its dependencies involve more maintenance than they deserve. I also noticed that the original Node package only added files to the target S3 bucket; it did not perform a sync or anything equivalent.
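For context on the "add only" problem: a proper sync has to upload the freshly generated files and also remove objects that no longer exist in the source. The sketch below illustrates that idea in Python with boto3; it is not the deployment code from the posts, and the bucket and directory names are placeholders.

    # A minimal sketch (not the post's actual Lambda) of sync-like behaviour:
    # upload everything under local_dir, then delete any S3 object that no
    # longer exists locally. Bucket and directory names are placeholders.
    import os
    import boto3

    s3 = boto3.client("s3")

    def sync_to_s3(local_dir: str, bucket: str) -> None:
        # Upload every file, keyed by its path relative to local_dir.
        local_keys = set()
        for root, _, files in os.walk(local_dir):
            for name in files:
                path = os.path.join(root, name)
                key = os.path.relpath(path, local_dir).replace(os.sep, "/")
                local_keys.add(key)
                s3.upload_file(path, bucket, key)

        # Remove objects that are no longer part of the generated site.
        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=bucket):
            for obj in page.get("Contents", []):
                if obj["Key"] not in local_keys:
                    s3.delete_object(Bucket=bucket, Key=obj["Key"])

    if __name__ == "__main__":
        sync_to_s3("public", "example-bucket")  # placeholder values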

Auto deploy a Hugo website from GitHub to S3 - Part 2

This post is part of a series. You're reading Part 2; Part 1 is Auto deploy a Hugo website from GitHub to S3 - Part 1. Preparing for the AWS Lambda function: now that GitHub notifies AWS of changes, we need to create the "doing" part of our project. We're going to use AWS' Lambda service to perform the work. Lambda will execute the function every time a notification is published to the SNS topic.
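To show the shape of that trigger, here is a minimal sketch of an SNS-driven Lambda entry point. It assumes the SNS message body is the GitHub push payload as JSON; the handler name and the deployment step are placeholders, not the function built in the post.

    # Minimal sketch of a Lambda handler invoked by an SNS notification.
    # The real function in the series does the fetch/build/deploy work
    # where the placeholder comment sits.
    import json

    def lambda_handler(event, context):
        for record in event["Records"]:
            # SNS delivers the notification as a JSON string in the message body.
            message = json.loads(record["Sns"]["Message"])
            repo = message.get("repository", {}).get("full_name", "unknown")
            print(f"Received push notification for {repo}")
            # ...fetch the repo, run Hugo, and push the output to S3 here...
        return {"status": "ok"}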

Using Docker as an ad hoc NodeJS package manager

During my GitHub -> Lambda -> S3 series, I needed to download and install NodeJS modules into my project's working directory. I didn't really want to install NodeJS and NPM on my Mac, as my machine is a daily driver (SysAdmin/Ops) and not really a front-end or back-end dev machine. I could always use a virtual machine, but that's too resource-intensive just to download some NPM modules. What's smaller than a VM?

Auto deploy a Hugo website from GitHub to S3 - Part 1

This post is part of a series. You're reading Part 1; Part 2 is Auto deploy a Hugo website from GitHub to S3 - Part 2. Introduction: for those who don't know what Hugo is, it's a static website generator. Its source material is a Hugo template plus your content in Markdown. I've used it for a while; in fact, this blog is generated using Hugo. Hugo produces an entire site in HTML/CSS and any required JavaScript that you can then place anywhere on the web.

Updating a Route53 domain's Name Servers

I recently needed to update an AWS Route53 domain's NS configuration so that it could be protected by CloudFlare. This domain was purchased via Route53. I had updated the hosted zone's NS records with the new name servers, but the domain continued to point to AWS, and it drove me nuts. It turns out there is another section in Route53 where you update the name servers for a Route53-managed domain.
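The same change can be made programmatically. As a rough sketch (the post itself works through the console), the boto3 route53domains client can update the name servers on the domain registration rather than on the hosted zone; the domain and name server values below are placeholders.

    # Minimal sketch, assuming boto3's "route53domains" client, of updating the
    # name servers on the *registered domain* (not the hosted zone's NS record set).
    # Domain and name server values are placeholders.
    import boto3

    # The Route53 Domains API is served from us-east-1.
    client = boto3.client("route53domains", region_name="us-east-1")

    response = client.update_domain_nameservers(
        DomainName="example.com",
        Nameservers=[
            {"Name": "ns1.example-dns.com"},
            {"Name": "ns2.example-dns.com"},
        ],
    )
    print(response["OperationId"])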