-
Back up your digital life
Our digital lives increasingly exist in the cloud: documents, photos, emails, passwords. And be it OneDrive, iCloud, Google Drive, or Dropbox - you don’t really own any of it.
We trust these companies with our digital lives and take their reliability for granted, but it’s worth remembering that nothing in this world is guaranteed. The likelihood of an outright failure of these services is relatively low; Google, for example, stores copies of all data in data centers across three geographic locations (often across multiple regions). Microsoft, Amazon, Apple, and other giants follow equivalent policies. The real threat with this storage is bureaucracy. Your account can be erroneously flagged and banned: the automated systems that constantly scan for policy violations aren’t perfect and can misfire. Your account can get hacked, even with a strong password and two-factor authentication. Navigating the account recovery process and getting access back can take weeks, months, or be altogether impossible.
Because of this, local backups are critical if you care about your data - which you probably do.
3-2-1 backup strategy
A common, straightforward, and widely used rule of data backups is referred to as the “3-2-1 rule”:
- Keep three copies of your data.
- Use two types of media for storage.
- Keep one copy off-site.
In fact, you’ll be hard-pressed to find a cloud service provider who doesn’t subscribe to some (likely more complex) variation of this rule.
We’ll satisfy the 3-2-1 backup rule with a dedicated backup drive and a home computer you likely already have:
- ✔️ Keep three copies of your data: (1) in the cloud, (2) on a dedicated backup drive, and (3) on a home device you already have.
- ✔️ Use two types of media for storage: (1) in the cloud, and (2) on our backup drive.
- ✔️ Keep one copy off-site: you’ll keep your backup drive at home, which is a different location from the cloud data centers.
Use an external HDD for backups
Solid state drives - SSDs - are all the rage today: they’re blazingly fast and have become relatively affordable. But you don’t want an SSD for a backup: SSD reliability isn’t great when the drive is left unpowered, with failures starting to occur at merely the one-year mark. And since it’s a backup, you’ll want to leave it unpowered.
No, for this you’ll want a hard drive - an HDD. We’ll be trading read/write speed for reliability. External hard drives are affordable, don’t need to be powered to retain data, have been around for ages, and degrade more slowly. A quality hard drive should remain mostly reliable for 5-7 years, and can be repurposed as tertiary backup storage after that. Set yourself a reminder to do so when the time comes.
Finally, some data can be recovered from a failed HDD, while failed SSDs are largely unrecoverable.
Survey your space needs, and use a hard drive a few times that size to leave room for growth. I use a 4 TB Seagate portable HDD, and it’s working just fine.
Use an existing device for tertiary storage
You likely already have some devices you could use at home. Maybe a laptop you’re currently using, or an old desktop tower you haven’t plugged in for years. Using this device will help ensure reliability and longevity of your data.
If you don’t have anything you can use, or your existing storage is too small - violating the 3-2-1 backup rule is better than having no backup at all. Use an external HDD, which you can downgrade to tertiary storage once you replace it in 5-7 years.
On encryption
Encrypting or not encrypting your backups is a personal choice.
You’ll likely be backing up important documents, which makes encryption critical for security. If the backup gets stolen, your whole life can be turned upside down (although this possibility still exists today if someone hacks into your cloud account).
However, because backups tend to live a long time, encryption can have downsides: tools can change, and most importantly you can forget your password. You also can’t decrypt a partially recovered backup: it’s all or nothing.
If you choose to encrypt, consider using established and mature open-source encryption tooling like `gpg` (I wrote about how to use GPG all the way back in 2012). It’s not all or nothing, either: you can choose to encrypt only sensitive documents, and leave less sensitive media like photos, videos, or music unencrypted.
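As a sketch of what that can look like (archive name and paths here are illustrative, not a prescription): bundle the sensitive directory into an archive, then encrypt it with gpg’s passphrase-based symmetric mode, so recovery later needs no key files - only the passphrase.

```shell
# Bundle the sensitive documents into a single archive
# (~/Documents is a placeholder path).
tar -czf documents.tar.gz -C ~ Documents

# Encrypt it symmetrically: gpg prompts for a passphrase, and
# recovery later requires only that passphrase - no key files.
gpg --symmetric --cipher-algo AES256 documents.tar.gz

# documents.tar.gz.gpg is what goes onto the backup drive.
# To restore: gpg --decrypt documents.tar.gz.gpg > documents.tar.gz
```

Remember to delete the unencrypted archive afterwards, or the encryption buys you nothing.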
I do not encrypt my backups because I worry about forgetting the password by the time I need to recover them. I have a tendency to get in my own way: I couldn’t recover some writing I backed up in 2012 because I couldn’t figure out what the password was. How fun.
Extracting data from the cloud
Internet giants allow you to download all your data in a fairly convenient manner. Google has Google Takeout, which lets you download data across services (Google Drive, Photos, email, etc.). Apple allows you to request a copy of your data, and Microsoft allows you to submit a privacy request.
Don’t forget about other service providers who store your data like email providers or password managers.
Back up regularly
Set up a routine you’ll follow. For me, it’s every year: I know I won’t follow a more rigorous backup schedule, and the trade-off of losing up to a year’s worth of data is worth the convenience of infrequent backups.
As our lives become more intertwined with the digital world, protecting your data is essential. By following the 3-2-1 backup strategy and using reliable storage, you can safeguard your data against unexpected mishaps. Regular backups and smart encryption choices will help keep your digital life secure and accessible. So, take a moment to set up your backups today - you’ll thank yourself later for the peace of mind that comes with knowing your data is safe.
-
Migrating from Octopress to Jekyll
Back in 2014 I abandoned WordPress for Octopress. It’s been especially amazing for page load speeds, and I also enjoyed the fact that GitHub Pages are completely free - and I only need to pay for a domain name. Hosting a website can get expensive.
Octopress was a short-lived framework built on top of Jekyll, focused on blogging and designed to run on GitHub Pages. Unfortunately, development stopped in 2015, and now, 10 years later, I couldn’t set it up on a new machine because most of its dependencies are dangerously out of date.
I chose to migrate to vanilla Jekyll: a static site generator built on top of simple Markdown and HTML files. Jekyll’s been around for some time, and I’m hoping Microsoft won’t be shutting down GitHub Pages any time soon.
The whole process only took a couple of hours, and I’d like to document some highlights and lowlights. You might find it useful if you’re setting up a new Jekyll blog, or, like me, still have an Octopress blog that needs migrating.
Fresh setup
I went with a fresh Jekyll setup, installing Jekyll and running `jekyll new blog`. I successfully copied over old `_posts` and `images`, and ported the relevant parts of `_config.yml` from Octopress to vanilla Jekyll.

Octopress uses Liquid
`{% img %}` tags, which aren’t natively supported in Jekyll. I took the opportunity to convert those to markdown-style syntax. I only have a few hundred posts, and I used a Vim macro to convert each `{% img /foo/bar.png baz %}` to `![baz](/foo/bar.png)`.

By default Jekyll comes installed with the
`minima` theme, which I found to be mostly sufficient for my needs. I was able to override specific theme files by copying them from the gem installation location into my blog directory and modifying them. This turned out to be straightforward and customizable. For example, I transferred the way Octopress pagination looks by modifying `_layouts/home.html`.

For backward compatibility, I also had to move the RSS feed to `/atom.xml` by modifying `_config.yml`:

```yaml
feed:
  path: /atom.xml
```
I could immediately run the site locally with `bundle exec jekyll serve --baseurl=""`.

Missing functionality
Two major things were missing straight out of the box: archive and category pages.
I grew attached to my archive page, and recreating it only took a couple of minutes. All I had to do was add an `archive.markdown` page to the site’s root directory:

```liquid
---
layout: page
title: Archive
navbar: Archive
permalink: /blog/archive/
---

{%- assign date_format = site.minima.date_format | default: "%b %-d, %Y" -%}
<div>
  <ul>
    {% for post in site.posts %}
      {% capture this_year %}{{ post.date | date: "%Y" }}{% endcapture %}
      {% unless year == this_year %}
        {% assign year = this_year %}
        <h2 style="margin-top: 1em;">{{ year }}</h2>
      {% endunless %}
      <li>
        <a href="{{ root_url }}{{ post.url }}" itemprop="url">{{ post.title }}</a>
        <span class="text-muted">| 📅 {{ post.date | date: date_format }}</span>
      </li>
    {% endfor %}
  </ul>
</div>
```
Building category support turned out to be messier and more complicated. I didn’t want to write up a custom solution, and ended up with some technical debt I’ll probably have to address in the future (wink-wink, this will never happen).
I used the `jekyll-category-pages` gem, which worked okay-ish. The instructions on field-theory/jekyll-category-pages are extensive and aren’t too difficult to follow - I appreciated not having to write my own category pages, but I had to:

- Stop category pages from being automatically added to the navigation bar.
- Disable pagination on category pages, because for some reason it really didn’t work with `jekyll-category-pages`.
I also added my own basic category index page by creating `categories.markdown`:

```liquid
---
layout: page
title: Categories
navbar: Categories
permalink: /blog/categories/
---

{% assign category_names = "" | split: "" %}
{% for category in site.categories %}
  {% assign category_names = category_names | push: category[0] %}
{% endfor %}
{% assign category_names = category_names | sort %}
<div>
  <ul>
    {% for category in category_names %}
      <li>
        <a href="{{ root_url }}/{{ site.category_path }}/{{ category | slugify }}">{{ category }}</a>
      </li>
    {% endfor %}
  </ul>
</div>
```
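For context, the category pages are driven by the categories listed in each post’s front matter - something like the following (the title and category names here are made up for illustration):

```yaml
---
layout: post
title: "A hypothetical post"
categories: [blogging, jekyll]
---
```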
GitHub Pages
While GitHub Pages documentation is extensive, getting Jekyll to work with GitHub Pages took longer than I’d like to admit. Specifically, the `Gemfile` generated by running `jekyll new blog` misleadingly tells you to comment away the latest version of the `jekyll` gem and instead use the `github-pages` gem:

```ruby
# Happy Jekylling!
gem "jekyll", "~> 4.4.1"
# If you want to use GitHub Pages, remove the "gem "jekyll"" above and
# uncomment the line below. To upgrade, run `bundle update github-pages`.
# gem "github-pages", group: :jekyll_plugins
```
You don’t want to do that, oh no. The default GitHub Pages gem is stuck in the past on the 3rd version of Jekyll (and at the time of writing we’re on version 4), which caused all kinds of hidden problems - including the fact that my URL slugs silently weren’t getting generated right. I switched back to the `jekyll` gem and set up a custom GitHub Action to deploy the site:

```yaml
name: Deploy Jekyll site to Pages

on:
  push:
    branches: ["master"]
  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

permissions:
  contents: read
  pages: write
  id-token: write

concurrency:
  group: "pages"
  cancel-in-progress: false

jobs:
  # Build job
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Ruby
        # https://github.com/ruby/setup-ruby/releases/tag/v1.207.0
        uses: ruby/setup-ruby@4a9ddd6f338a97768b8006bf671dfbad383215f4
        with:
          ruby-version: '3.1' # Not needed with a .ruby-version file
          bundler-cache: true # runs 'bundle install' and caches installed gems automatically
          cache-version: 0 # Increment this number if you need to re-download cached gems
      - name: Setup Pages
        id: pages
        uses: actions/configure-pages@v5
      - name: Build with Jekyll
        run: bundle exec jekyll build --baseurl "${{ steps.pages.outputs.base_path }}"
        env:
          JEKYLL_ENV: production
      - name: Upload artifact
        uses: actions/upload-pages-artifact@v3

  # Deployment job
  deploy:
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}
    runs-on: ubuntu-latest
    needs: build
    steps:
      - name: Deploy to GitHub Pages
        id: deployment
        uses: actions/deploy-pages@v4
```
Don’t forget to set “Build and deployment source” to “GitHub Actions” in the repository settings to actually use the action.
My Octopress blog was set up in a `source` Git branch, and content was generated into the `master` branch. I wanted to change that to have the source in the `master` branch (the action above won’t work without that), and I was able to replace my `master` with `source` with the following set of commands:

```shell
git checkout master
git pull
git checkout source
git merge -s ours master --allow-unrelated-histories
git checkout master
git merge source
```
We merge the `master` branch into `source` using the `ours` merge strategy (effectively ignoring the `master` branch history), and then merge that back into `master`.

Positive experience
All in all migrating to Jekyll has been a great experience, which is a testament to Jekyll community’s dedication to thorough documentation. Knowing that Jekyll is a mature, maintained, and documented project, and that GitHub Pages infrastructure is reliable and supported, provides a sense of stability. I hope this results in Jekyll and GitHub Pages becoming a (reasonably) future-proof platform for my blog. But let’s check back in in 10 years - see you in 2035?
-
Essentialism: A Practical Guide to Less
I’ve thoroughly enjoyed Essentialism, a book that encapsulates the simple yet powerful notion of “do fewer things, do them well.” There’s not much else to it. While this philosophy is straightforward, it’s the way Greg McKeown presents and reinforces this message that makes the book truly compelling.
Having Essentialism in physical form proved invaluable. I filled the margins with notes, worked through exercises alongside the text, and took the time to fully absorb the material as I progressed.
Essentialism is not a new concept, but the key takeaway is the author’s focus on truly internalizing the message. “Focus on things that matter, trim the excess” is a simple motto to remember, yet challenging to implement. Throughout my life, I’ve adopted many essentialist practices in one form or another, from guarding my calendar, to learning to say “no”, to prioritizing essential projects. However, over time, clutter inevitably creeps in.
McKeown wisely focuses on routines that support the essentialist lifestyle, emphasizing the importance of dedicated time for reevaluation and recentering. He suggests establishing routines that prevent slipping into the frantic “onto the next thing” mentality so prevalent in the modern corporate world.
An analogy that particularly resonated with me is the closet metaphor. While you can declutter your closet once, it will eventually refill with clothes you don’t need. To keep your closet tidy, you need a regular time to reevaluate your outfits, and to know where the nearest donation center is, how to get there, and what hours it’s open. Similarly, McKeown provides methodologies to regularly reevaluate our priorities, supporting the rigorous process of continually discarding the non-essential.
Essentialism extensively focuses on routines, practices, and exercises. The edition I read includes a “21-day Essentialism Challenge,” a helpful list of concrete activities corresponding to each chapter. While some prompts, like “take a nap” or “play with a child for 10 minutes” are a bit silly (where am I supposed to find a child on a Tuesday, Greg?), many steps effectively reinforce and integrate the material into your daily life, such as “design your ideal calendar,” “practice saying no gracefully,” or “schedule a personal offsite.”
The latter suggestion, scheduling a personal offsite, left a significant impression on me. It’s time dedicated to strategizing around your personal and professional goals. While I occasionally reflect on my career and life, McKeown elevates this practice into a ritual – a full day focused on self-reflection, planning, and deliberate action.
Essentialism is a helpful book that prompts the reader to think about the routines one can put in place to change the way we approach life. It’s a reminder that less can indeed be more, and that by focusing on what truly matters, we can create a life of greater purpose, meaning, and fulfillment.
-
Static websites rule!
I hope you’ve noticed that navigating to this page was quick (let’s hope the Internet Gods are kind to me, and nothing stood in the way of you accessing this page). In fact, most pages on my blog - hopefully including this one - should render in under a second. I didn’t put any work into optimizing this site, and that’s not a boast, nor is the site a technological marvel - this is just a good old-fashioned static website.
If this is new to you - a static website is just what it sounds like: static HTML and CSS files, sometimes with some light JavaScript sprinkled throughout. There’s no server-side processing - the only bottlenecks are the host server’s speed, the recipient’s connection speed, and the browser’s rendering speed. A page is stored as is, and is sent over as soon as it’s requested. This is how the Internet used to be in the late 90s and early 2000s (with eclectic web design to boot, of course).
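To illustrate just how little is involved, here’s a complete - if minimal - static “website”: a handful of files on disk (names here are arbitrary):

```shell
# The entire "site" is just files; no server-side code anywhere.
mkdir -p site
cat > site/index.html <<'EOF'
<!DOCTYPE html>
<html>
  <head><title>My static site</title></head>
  <body><h1>Hello, static web!</h1></body>
</html>
EOF

# Any file server can host it; for a local preview, for example:
# python3 -m http.server 8000 --directory site
```

The host’s only job is to send those bytes back on request - which is why there’s so little to slow down or break.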
I think static websites are cool and aren’t used nearly enough, especially for websites that are, well, static. Think back to the last website you visited to read something - maybe a news site, or maybe a blog. Did it take at least a couple of seconds to load? Likely. Did its server have to waste unnecessary cycles putting together the page for you? Most definitely. Now, contrast this with your experience on a static website like this one. Here’s the result from pagespeed.web.dev for this page:
Every render completes in under a second, and I didn’t have to put any work into optimizing my website.
This site is built on the (now unsupported) Octopress, which is itself built on top of Jekyll. You write pages in Markdown, generate web pages using a pre-made template, and deploy the resulting pages to a hosting provider. In fact, GitHub Pages lets you host your static website for free, and a third-party platform like Disqus can provide comment support.
Static websites work great for portfolios, blogs, and other sites that don’t rely on extensive content manipulation. They’re more secure (no backend to hack), simple to build and maintain, very fast even without optimization, and natively SEO-friendly (search engines are great at understanding static pages). Static websites are also cheap to run - I only pay for a domain name for this site (under $20 a year).
If you have a blog or a portfolio and you’re using an overly complicated content management system to write - consider slimming down. Jekyll (or many of its alternatives) offers a number of pre-made off-ramps for major CMS users, is easy to set up, and is straightforward to work with. Can’t recommend enough - static websites rule!
-
Sifu and a state of flow
It’s not a hot take that I really like Soulsborne video games: titles like Dark Souls, Sekiro, and Elden Ring. There are many reasons behind my appreciation for them: fascinating lore delivery mechanisms, selfless multiplayer cooperation, and thought-provoking art direction.
But one facet of these games ties the experience together: combat. Tight, responsive, and unforgivingly difficult. And the mainstream critical success of these games is what allows publishers to take more risks with difficult titles.
And nothing makes this more apparent than a game stripped down to its core: Sifu. Okay, bear with me - Sifu has nothing to do with the Dark Souls franchise, and the small French studio Sloclap is as far away as you can get from FromSoftware, the now-behemoth of the industry.
But Sifu is just that: a responsive, fluid, and unforgiving combat experience masquerading as a game. Sifu is a revenge story set in modern-day China that sees you play through five levels, each of which culminates in a boss fight.
There’s a twist: every time the character dies, they get older - increasing their damage but reducing their health. Once you go past 80, it’s game over. This forces the player to master each level, since you’re incentivized to finish each level at the youngest possible age.
The game’s final boss - a man who killed the protagonist’s teacher - is hands down one of the hardest boss fights I have experienced in video games. He’s immune to every cheap trick you might have up your sleeve, attacks relentlessly, and leaves nearly no room for error.
And it’s this boss fight that really made me aware of the state of flow difficult combat forces the player into.
The state of flow, or the feeling of “being in the zone”, is a state of intense concentration: a perfect balance of difficulty and skill. Flow is immensely satisfying, and time flies by in an instant while you’re in the zone. Difficult games force the player into this state to progress.
The state of flow has many benefits: increased concentration, creativity, and problem-solving ability, even a boost to self-esteem. Most notably, flow helps reduce stress and anxiety: it’s an inherently relaxing and enjoyable experience, and it’s one of the main reasons I like playing difficult games even when I’m exhausted after a long day. I can rely on an induced state of flow to help myself relax.
Back to Sifu’s final boss.
The only way I was able to defeat the final boss in Sifu is by sitting back, relaxing, and letting the built up muscle memory take over. It’s a beautiful experience akin to playing an instrument - something that requires the right amount of concentration: too much focus would make you get in your own way, too little would allow you to get distracted. It’s a state of flow, which I immensely enjoy, and which games like this help me enter.
But it isn’t until a second playthrough, which the game encourages you to take to experience the full story, that you’re able to appreciate all the progress you made, and reenter the state of flow for the duration of the full playthrough. Because you’ve already completed the game, there’s confidence in mastery of the game’s systems, but the difficulty and unforgiving nature of encounters still keep the game a challenge.
I think this applies to many games built around high difficulty - be it platformers like Celeste, or even tactics games like XCOM. In fact, I’d be really curious to see how more cerebral video games get the player into the state of flow - I might dig into that someday.