remoteStorage is an open standard to enable unhosted web apps where users are in full control of their data and where it is stored, while app developers are freed of the burden of hosting, maintaining and protecting a central database.
Run ./release.sh to move next_release/ to released/ and released/ to prev_release/
Run ./release.sh --rollback to revert to prev_release/
Time-delay safety feature: ./release.sh will no-op until next_release/ is an hour old.
This can then be run from a cron job, and the delay leaves a window to catch issues before they become publicly visible.
This will also effectively batch RSS/JSON feed updates to be kind to readers.
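The rotation and safety window described above could be sketched like this in Python. The directory names come from the notes; everything else, including the function names and the exact hour check, is an assumption about how release.sh behaves:

```python
"""Sketch of the release.sh rotation logic; a hypothetical reconstruction,
not the actual script."""
import os
import shutil
import time

SAFETY_WINDOW = 60 * 60  # no-op until next_release/ is an hour old


def release(root="."):
    """Rotate next_release/ -> released/ -> prev_release/.

    Returns False (doing nothing) if next_release/ is too fresh.
    """
    nxt = os.path.join(root, "next_release")
    if time.time() - os.path.getmtime(nxt) < SAFETY_WINDOW:
        return False  # too fresh: leave a window to catch issues
    shutil.rmtree(os.path.join(root, "prev_release"), ignore_errors=True)
    os.rename(os.path.join(root, "released"), os.path.join(root, "prev_release"))
    os.rename(nxt, os.path.join(root, "released"))
    return True


def rollback(root="."):
    """--rollback: promote prev_release/ back to released/."""
    released = os.path.join(root, "released")
    shutil.rmtree(released, ignore_errors=True)
    os.rename(os.path.join(root, "prev_release"), released)
```

Because `release()` keys the safety window off the directory's mtime, running it every few minutes from cron is harmless: it simply returns early until the hour has passed.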
Using AWS Lambda is just more complicated than it should be, and I'd rather just get this thing working. I reverted a bunch of the complicated changes I had made so far in my static generation script, and that was before I'd even figured out how to handle the Mako templates, bundle the whole thing for deployment, or hook up the API Gateway... Using Zappa seemed like a possible solution, but even that seems like overkill.
So the new process is:

- content lives in a GitHub repo, initially as a single JSON Feed file
- the bot I post to has a checkout of that repo; it updates the file and pushes it to GitHub
- a generate.py script pulls the latest version and processes the feed file into static HTML
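The feed-to-HTML step might look something like this. The feed structure follows the JSON Feed spec, but the page template and URL scheme are invented for illustration; the real generate.py (and its `git pull` step) isn't shown in these notes:

```python
"""Hypothetical sketch of the generate.py rendering step: turn a JSON
Feed document into a set of static HTML pages."""
import json

# Placeholder page template -- the real script uses Mako templates.
PAGE = "<html><head><title>{title}</title></head><body>{body}</body></html>"


def render_feed(feed_json):
    """Return a {url_path: html} mapping for every item in a JSON Feed."""
    feed = json.loads(feed_json)
    pages = {}
    for item in feed.get("items", []):
        title = item.get("title", item["id"])
        pages[f"/{item['id']}.html"] = PAGE.format(
            title=title, body=item.get("content_html", ""))
    return pages
```

In the full pipeline this output would be written into next_release/, which release.sh later promotes.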
I wanted to rewrite the $uri to allow links to omit index.html from directories and also to omit .html extensions, and came up with the following (this goes in the server block, before the location block below):

1. If $uri has a known extension, rewrite it as itself and break out of the rewrite block.
2. Add index.html to directories and break out of the rewrite block.
3. If we've made it this far, the URI has no extension and isn't a directory, so assume a .html extension.
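A reconstruction of those three rules might look like this; the extension list here is a placeholder, not the original:

```nginx
# 1. Known extension: rewrite the URI to itself and stop rewriting.
rewrite ^(.+\.(html|css|js|png|jpg|gif|ico|xml|json|txt))$ $1 break;

# 2. Directory request: append index.html and stop rewriting.
rewrite ^(.*)/$ $1/index.html break;

# 3. Anything else: assume a .html extension.
rewrite ^(.+)$ $1.html break;
```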
Note that the list of extensions in the first rule needs to include everything you serve from S3; otherwise a .html extension will be added and the request will 404.
Once that's done, you can pass any request to /somepath/ through to S3 (I have the bucket permissions set to public-read; you may need a different proxy_pass URL):
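The location block might then be as simple as the following; the bucket URL is a placeholder:

```nginx
location / {
    # Bucket objects are public-read, so plain HTTPS proxying works.
    proxy_pass https://my-bucket.s3.amazonaws.com;
}
```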
After implementing my bot in Python, I thought I'd try porting it to other languages as an opportunity to learn a bit of Go and Rust.
Both Go and Rust are statically typed and compile to a single self-contained binary, which is appealing as there is no need to maintain a virtualenv and install dependencies on the server.
The parsing code for "rain" messages is a good starting point for comparing the bots. Each uses a "parser combinator" library to parse the messages.
Quick thoughts: Python & Go are both pretty sane. There's almost certainly a better way in Rust, but I found it considerably harder.
The tests live in different places in each version, so the total line counts are not comparable. The Rust version also includes "ignore case" code that is built into Python's parser combinators, and that I added into a fork of the Go parser library.
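To show the style without reproducing any of the bots, here is a minimal hand-rolled parser-combinator sketch in Python. The real bots use existing combinator libraries, and the actual "rain" message grammar isn't shown in these notes, so the message format below is entirely invented; note the `re.IGNORECASE` flag standing in for the "ignore case" behaviour mentioned above:

```python
"""Tiny hand-rolled parser combinators: each parser is a function taking
(text, pos) and returning (value, new_pos) on success or None on failure."""
import re


def regex(pattern):
    """Parser matching a regex at the current position (case-insensitive)."""
    rx = re.compile(pattern, re.IGNORECASE)
    def parse(text, pos):
        m = rx.match(text, pos)
        return (m.group(0), m.end()) if m else None
    return parse


def seq(*parsers):
    """Run parsers one after another, collecting their results."""
    def parse(text, pos):
        results = []
        for p in parsers:
            r = p(text, pos)
            if r is None:
                return None
            value, pos = r
            results.append(value)
        return results, pos
    return parse


def alt(*parsers):
    """Try parsers in order, returning the first that matches."""
    def parse(text, pos):
        for p in parsers:
            r = p(text, pos)
            if r is not None:
                return r
        return None
    return parse


# Hypothetical grammar: "rain in <N> min" or "rain at <HH:MM>".
number = regex(r"\d+")
rain_in = seq(regex(r"rain in "), number, regex(r" min"))
rain_at = seq(regex(r"rain at "), regex(r"\d\d:\d\d"))
rain_message = alt(rain_in, rain_at)
```

The appeal of the style is the same in all three languages: small parsers compose into bigger ones, and the grammar reads off the code.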