feat: add article on deployment
parent 0abbf73c91
commit e331fc0a9f
10 changed files with 241 additions and 28 deletions
@@ -1,5 +1,133 @@
---
-title: Deploy aaronjy.me on a Google Storage bucket
-desc: TBD
+title: Deploying aaronjy.me on a Google Storage bucket
+desc: This site is just a bunch of static assets hosted on a Google Storage
+  bucket! Here's how it works...
---

Google actually has [documentation](https://cloud.google.com/storage/docs/hosting-static-website) on how to deploy a static site to a storage bucket, but I wanted to talk about how I handle deployments, as Google doesn't cover that!

## Networking

This site is just a collection of static assets (HTML, JS, CSS and images) that live inside a Google Cloud Storage bucket. When you load the site, the route below is taken once your request reaches GCP.



As you can see, you:

1. Hit a load balancer, which then
2. Directs you to a backend service, which then
3. Decides either to a) serve content directly from the storage bucket, or b) serve it from the cache (if available)

The setup is pretty simple, and doesn't really deviate from Google's suggested configuration for static sites hosted from a bucket.
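
For reference, this kind of load balancer + backend bucket + Cloud CDN setup can be created with a handful of gcloud commands. The sketch below follows Google's static-website guide rather than being a record of my exact configuration; apart from the hosting bucket and the `lb-aaronjy-www` URL map (which appears in the deployment script later), the resource names are placeholders:

```
# Rough sketch only, based on Google's static-website guide. The names
# web-backend, http-proxy and http-rule are placeholders.

# Backend bucket pointing at the hosting bucket, with Cloud CDN enabled
gcloud compute backend-buckets create web-backend \
  --gcs-bucket-name=aaronjy-www \
  --enable-cdn

# URL map that routes every request to that backend
gcloud compute url-maps create lb-aaronjy-www \
  --default-backend-bucket=web-backend

# HTTP proxy + global forwarding rule to make the load balancer reachable
gcloud compute target-http-proxies create http-proxy \
  --url-map=lb-aaronjy-www

gcloud compute forwarding-rules create http-rule \
  --global \
  --target-http-proxy=http-proxy \
  --ports=80
```

An HTTPS setup additionally needs a certificate and a target HTTPS proxy, but the shape is the same.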

## Deploying

Setting up a seamless deployment strategy gets a little trickier, however. I opted for a manual deployment strategy, which involves calling `npm run deploy` to kick off the deployment. This in turn calls a bash script that handles the deployment.
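
The wiring itself is nothing fancy. As a rough illustration (the file names and script entries here are assumptions, not lifted from the repo), the npm script just delegates to the shell script:

```
# Hypothetical wiring, not taken from this repo.
# package.json could map the npm script to the shell script like so:
#   "scripts": { "deploy": "next build && bash ./deploy.sh" }
#
# A sensible first line for such a script is to bail out on the first
# failed command, so a broken build or upload never reaches later steps:
set -euo pipefail
```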

The script consists of 4 deployment steps:

1. Back up existing bucket files to a backup bucket
2. Remove sensitive files before deploying (e.g. `admin/index.html` for Decap CMS)
3. Upload the latest files to the hosting bucket
4. Invalidate Google's cache, so users receive the latest version of the site

### Step 1 - Backing up existing files

Before we do anything, we need to back up what we have already. I created a storage bucket specifically for holding backup files for this purpose, and use the gcloud CLI to copy the live files across to the backup bucket.

```
BUCKET_URL="gs://aaronjy-www"
BACKUP_BUCKET_URL="gs://aaronjy-www-backup"

echo "------------------------------"
echo "BACKUP CURRENT SITE FILES"
echo "------------------------------"

TIMESTAMP=$(date +%Y-%m-%d_%H:%M:%S)
gcloud transfer jobs create $BUCKET_URL $BACKUP_BUCKET_URL/$TIMESTAMP/ --no-async --delete-from=source-after-transfer;
```

The backed-up files are copied into a dated folder, and the `--delete-from=source-after-transfer` flag ensures the live website's files are deleted from the hosting bucket once they've been backed up.
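
Having dated folders also means a rollback is just the reverse copy. A minimal sketch, assuming you know which timestamped folder you want (the script doesn't automate this, and the timestamp below is a placeholder):

```
# Hypothetical rollback, not part of the deploy script. Copies one dated
# backup folder back into the hosting bucket; the timestamp is a placeholder.
BACKUP_BUCKET_URL="gs://aaronjy-www-backup"
BUCKET_URL="gs://aaronjy-www"

gcloud storage cp --recursive "$BACKUP_BUCKET_URL/2024-03-13_12:00:00/*" $BUCKET_URL
```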

### Step 2 - Removing sensitive files

Because I'm using Decap CMS for content management locally, I need to manually remove the `admin/` folder where Decap lives, as I don't want that to be available on the live site.

```
echo "------------------------------"
echo "REMOVE SENSITIVE FILES"
echo "------------------------------"

rm -rfv ./out/admin/
```

### Step 3 - Upload files to hosting bucket

Now we come to actually uploading the new files to the live site. I take everything from the `/out` directory (where Next.js throws its build output) and upload it directly to the hosting bucket.

```
echo "------------------------------"
echo "UPLOADING NEW SITE FILES"
echo "------------------------------"

gcloud storage cp --recursive ./out/* $BUCKET_URL --gzip-in-flight-all
```

The `--gzip-in-flight-all` flag is a handy addition: the CLI applies gzip compression locally, and Google decompresses the files before dumping them in the bucket on the other end, resulting in a smaller upload and a quicker deployment.
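
Compression is the only per-object tweak the script makes at upload time. Caching behaviour can also be set here if you want to; the sketch below assumes the `--cache-control` metadata flag available in current gcloud releases, and it isn't something my script does:

```
# Hypothetical variation, not part of my deploy script. Attaches
# Cache-Control metadata while uploading, so the CDN and browsers
# revalidate content after five minutes instead of using default caching.
gcloud storage cp --recursive ./out/* gs://aaronjy-www \
  --gzip-in-flight-all \
  --cache-control="public, max-age=300"
```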

### Step 4 - Invalidate the global cache

As Google uses a global cache for bucket files, we must invalidate it to ensure users get the latest website version.

```
echo "------------------------------"
echo "INVALIDATING GLOBAL CACHE"
echo "------------------------------"

echo "WARNING: This is an async operation that can take upwards of 10 minutes depending on how fast Google Cloud CDN invalidates its cache. It does take around 10 minutes on average."

gcloud compute url-maps invalidate-cdn-cache lb-aaronjy-www --path "/*" --async
```

This can take anywhere between 7 and 10 minutes, so the `--async` flag has been applied; we don't need to sit around waiting for it to finish.
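
If you do want to check on the invalidation afterwards, it shows up as a global compute operation. A rough sketch, with the filter value being an assumption on my part, so adjust or drop it based on what `operations list` actually shows in your project:

```
# Hypothetical follow-up, not part of the deploy script. Lists recent
# global operations so you can eyeball the cache invalidation's status.
# The operationType filter is an assumption; remove it if it doesn't match.
gcloud compute operations list \
  --global \
  --filter="operationType~invalidateCache" \
  --sort-by="~insertTime" \
  --limit=5
```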

### Full deployment script

Here's the deployment script in full:

```
BUCKET_URL="gs://aaronjy-www"
BACKUP_BUCKET_URL="gs://aaronjy-www-backup"

echo "------------------------------"
echo "BACKUP CURRENT SITE FILES"
echo "------------------------------"

TIMESTAMP=$(date +%Y-%m-%d_%H:%M:%S)
gcloud transfer jobs create $BUCKET_URL $BACKUP_BUCKET_URL/$TIMESTAMP/ --no-async --delete-from=source-after-transfer;

echo "------------------------------"
echo "REMOVE SENSITIVE FILES"
echo "------------------------------"

rm -rfv ./out/admin/

echo "Removed all sensitive files."

echo "------------------------------"
echo "UPLOADING NEW SITE FILES"
echo "------------------------------"

gcloud storage cp --recursive ./out/* $BUCKET_URL --gzip-in-flight-all

echo "------------------------------"
echo "INVALIDATING GLOBAL CACHE"
echo "------------------------------"

echo "WARNING: This is an async operation that can take upwards of 10 minutes depending on how fast Google Cloud CDN invalidates its cache. It does take around 10 minutes on average."

gcloud compute url-maps invalidate-cdn-cache lb-aaronjy-www --path "/*" --async

echo "------------------------------"
echo "DONE!"
echo "------------------------------"
```

@@ -16,6 +16,15 @@ collections:
      - {label: "You Will Need", name: "you-will-need", widget: "markdown" }
      - {label: "Recipe", name: "body", widget: "markdown" }

+  - name: writing
+    label: Writing
+    folder: content/writing
+    create: true
+    fields:
+      - {label: Title, name: title, widget: string}
+      - {label: Description, name: desc, widget: text}
+      - {label: Body, name: body, widget: markdown }
+
  - name: fun
    label: Fun
    folder: content/fun
BIN  public/img/screenshot-2024-03-13-at-11.58.55.png  Normal file (132 KiB)

16  src/components/Article/Article.jsx  Normal file
@@ -0,0 +1,16 @@
import React from 'react'

function Article ({ attributes, html }) {
  return (
    <section>
      <article>
        <h1>{attributes.title}</h1>
        <p>{attributes.desc}</p>
        <hr />
        <div dangerouslySetInnerHTML={{ __html: html }} />
      </article>
    </section>
  )
}

export default Article
@@ -8,7 +8,7 @@ function Header () {
    <header className={styles.header}>
      <nav>
        <Link href='/'>Home</Link>
-       {/* <Link href='/writing'>Writing</Link> */}
+       <Link href='/writing'>Writing</Link>
        <Link href='/cv'>CV</Link>
        <Link href='/fun'>Fun</Link>
      </nav>
@@ -3,18 +3,12 @@ import React from 'react'
import fs from 'fs'
import DefaultLayout from '@/layouts/DefaultLayout/DefaultLayout'
import { getMarkdownEntry } from '@/lib/content'
+import Article from '@/components/Article/Article'

-function FunSingle ({ attributes, html, slug }) {
+function FunSingle ({ attributes, html }) {
  return (
    <DefaultLayout>
-      <section>
-        <div />
-        <h1>{attributes.title}</h1>
-        <p>{attributes.desc}</p>
-      </section>
-      <section>
-        <div dangerouslySetInnerHTML={{ __html: html }} />
-      </section>
+      <Article attributes={attributes} html={html} />
    </DefaultLayout>
  )
}
38  src/pages/writing/[slug].js  Normal file
@@ -0,0 +1,38 @@
import { toSlug } from '@/lib/helpers'
import React from 'react'
import fs from 'fs'
import DefaultLayout from '@/layouts/DefaultLayout/DefaultLayout'
import { getMarkdownEntry } from '@/lib/content'
import Article from '@/components/Article/Article'

function FunSingle ({ attributes, html }) {
  return (
    <DefaultLayout>
      <Article attributes={attributes} html={html} />
    </DefaultLayout>
  )
}

export function getStaticPaths () {
  const fun = fs.readdirSync('./content/writing', { withFileTypes: true })

  const paths = fun.map((dirent) => ({
    params: {
      slug: toSlug(dirent.name)
    }
  }))

  return {
    fallback: false,
    paths
  }
}

export function getStaticProps ({ params }) {
  const path = `./content/writing/${params.slug}.md`

  const entry = getMarkdownEntry(path)
  return { props: { ...entry } }
}

export default FunSingle
40  src/pages/writing/index.js  Normal file
@@ -0,0 +1,40 @@
import DefaultLayout from '@/layouts/DefaultLayout/DefaultLayout'
import React from 'react'
import fs from 'fs'
import Link from 'next/link'
import { getMarkdownEntry } from '@/lib/content'

function Fun ({ entries }) {
  return (
    <DefaultLayout>
      <section>
        <h1>Writing</h1>
        <p>Hobby projects, helpful scripts, and other fun bits and bobs!</p>
      </section>

      <section>
        {entries.map((e) => (
          <div key={e.attributes.title}>
            <h2>
              <Link href={'/writing/' + e.slug}>{e.attributes.title}</Link>
            </h2>
            <p>{e.attributes.desc}</p>
            <Link href={'/writing/' + e.slug}>Read more</Link>
          </div>
        ))}
      </section>
    </DefaultLayout>
  )
}

export function getStaticProps () {
  const fun = fs.readdirSync('./content/writing', { withFileTypes: true })

  const entries = fun.map((dirent) =>
    getMarkdownEntry(`${dirent.path}/${dirent.name}`)
  )

  return { props: { entries } }
}

export default Fun
@@ -1,16 +0,0 @@
import DefaultLayout from '@/layouts/DefaultLayout/DefaultLayout'
import { NextSeo } from 'next-seo'

export default function Writing () {
  return (
    <DefaultLayout>
      <NextSeo title='Writing' />
      <section>
        <h1>Writing</h1>
      </section>
      <section>
        <i>Nothing to see here yet!</i>
      </section>
    </DefaultLayout>
  )
}
@@ -15,4 +15,8 @@ ul {

ul li:not(:last-child) {
  margin-bottom: 10px;
}
+
+article {
+  border: none;
+}