This blog has now moved to AWS Amplify. It's connected to a Bitbucket git repository, and Amplify pulls and rebuilds the site the moment a commit is pushed. I used to pull the repository manually on a VPS, so this is much quicker.
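
This blog is built with Hugo, so the Amplify build spec (amplify.yml) is tiny. The sketch below is roughly what it amounts to; Amplify can generate it automatically, so the exact commands and paths here are assumptions, not my actual file.

version: 1
frontend:
  phases:
    build:
      commands:
        - hugo
  artifacts:
    baseDirectory: public
    files:
      - '**/*'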

  • Setting a custom domain on Amplify requires (a) adding a CNAME record to prove ownership; then (b) you point the ALIAS record for @ and the CNAME record for www at a CloudFront URL Amplify gives you, and the site automatically becomes HTTPS. My domains are on Namecheap, but AWS advertises that this is much simpler with Route 53.
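
On Namecheap the records end up roughly like this (all values are placeholders; the real host names and targets are shown in the Amplify console):

CNAME   <verification-host-from-amplify>   <verification-target-from-amplify>
ALIAS   @                                  <distribution-id>.cloudfront.net
CNAME   www                                <distribution-id>.cloudfront.net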

  • I wrote a script to convert Pelican front matter to the YAML front matter Hugo uses. It takes a filename argument (e.g. a/index.md), converts the header to YAML and backs up the old file as a/index.md.old. I didn't need it for this blog, but for my Turkish blog it converted about 400 posts quickly. Hugo throws an error when a YAML header contains quotation marks (the script wraps the title in double quotes), but those posts were few and I corrected them manually. An example conversion is shown after the script below.

#!/usr/bin/python3

import sys
from dateutil import parser
import datetime
import os

fn = sys.argv[1]

# Read the whole post into memory
with open(fn, encoding="utf-8") as fread:
    lines = fread.readlines()


fmorig = []

# Pelican front matter is everything up to the first blank line
for i, l in enumerate(lines):
    if len(l.strip()) == 0:
        blank_line_i = i
        break
    else:
        fmorig.append(l)

print(fmorig)

fmtarget = {}

# Map each "Key: value" Pelican header line to a Hugo front matter field
for fm in fmorig:
    if fm.startswith('Date'):
        dt = parser.parse(fm[5:].strip())
        fmtarget['date'] = dt
        # timedelta has no 'years' argument, so approximate a year as 365 days
        expiryDate = dt + datetime.timedelta(days=365)
        fmtarget['expiryDate'] = expiryDate
    elif fm.startswith('Title'):
        fmtarget['title'] = fm[6:].strip()
    elif fm.startswith('Author'):
        fmtarget['author'] = fm[7:].strip()
    elif fm.startswith('Image'):
        # assumes a path like /images/foo.png; keep only the file name
        imageName = fm[6:].strip().split('/')[2]
        fmtarget['image'] = "/images/{}".format(imageName)
    elif fm.startswith('Status'):
        fmtarget['status'] = fm[7:].strip()
    elif fm.startswith('Dp'):
        fmtarget['dp'] = fm[3:].strip()
    elif fm.startswith('Tags'):
        fmtarget['tags'] = fm[5:].strip().split(',')
    else:
        # unrecognized header lines are stashed but never written out
        fmtarget[fm] = fm


# Build the new Hugo front matter. Title, Date, Image, Status and Dp must be
# present in the header; Tags is optional and Author is read but never written.
newfrontmatter = """---
title: "{title}"
date: {dt}
expiryDate: {expiryDate}
dp: {dp}
featured_image: "{image}"
images: ["{image}"]
published: {status}
tags: [{tags}]
---
""".format(title=fmtarget['title'],
           dt=fmtarget['date'].strftime("%F %H:%M:%S"),
           expiryDate = fmtarget['expiryDate'].strftime("%F %H:%M:%S"),
           dp = fmtarget['dp'],
           image = fmtarget['image'],
           status = "true" if fmtarget['status'] == "published" else "false",
           tags = ",".join(fmtarget['tags']) if 'tags' in fmtarget else '')


# Reattach the post body, which starts at the first blank line
newcontent = """{}

{}
""".format(newfrontmatter, "".join(lines[blank_line_i:]))

print(newfrontmatter)

# Keep the original as .old and write the converted file in its place
backup_filename = fn + ".old"
os.rename(fn, backup_filename)

with open(fn, mode="w", encoding="utf-8") as fwrite:
    fwrite.write(newcontent)
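
To show what the conversion does (the values below are made up; Dp is a custom field in my posts), a Pelican header like

Title: An example post
Date: 2020-05-17 10:30
Image: /images/example.png
Status: published
Dp: 1
Tags: aws, hugo

becomes

---
title: "An example post"
date: 2020-05-17 10:30:00
expiryDate: 2021-05-17 10:30:00
dp: 1
featured_image: "/images/example.png"
images: ["/images/example.png"]
published: true
tags: [aws, hugo]
---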


To have Hugo generate the RSS feed as rss.xml instead of the default index.xml, you need to add the following to config.toml:

[outputs]
home = ["RSS", "HTML"]

[outputFormats]
[outputFormats.RSS]
mediatype = "application/rss"
baseName = "rss"

  • I created two Firefox keyword searches similar to DuckDuckGo bangs: !g in the address bar does a Google search and !pb searches my Pinboard bookmarks. This avoids a round trip through DDG when I'm already in the browser.
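
These are just keyword bookmarks: the keyword (e.g. !g) goes into the bookmark's Keyword field, and %s in the bookmark's location is replaced with whatever you type after the keyword. Something like the following; the Pinboard search URL is from memory and USERNAME is a placeholder:

!g    https://www.google.com/search?q=%s
!pb   https://pinboard.in/search/u:USERNAME?query=%s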

To make an S3 bucket public, you need to add

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::subdomain.example.com/*"
        }
    ]
}

to its bucket policy (under Permissions → Bucket Policy in the S3 console).

You can make individual folders public by right-clicking them in the console, but to cover the entire bucket you need to write a policy like the one above.
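
The same policy can be applied from the AWS CLI, assuming the JSON above is saved as policy.json:

aws s3api put-bucket-policy --bucket subdomain.example.com --policy file://policy.json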


  • Python's datetime.timedelta doesn't have a years parameter. The largest unit it accepts is weeks, because a week is the longest unambiguous period (always 7 days), whereas months and years vary in length.
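
A quick illustration, and the reason the conversion script above uses timedelta(days=365) for the expiry date:

import datetime

datetime.timedelta(weeks=52)     # fine: weeks is the largest unit accepted
datetime.timedelta(days=365)     # what the script above uses as roughly one year
# datetime.timedelta(years=1)    # raises TypeError: there is no 'years' keyword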