
Vibecoding With Gemini-CLI: How I Built a Git Backup Utility

I built GiterDone from scratch in about five hours using Gemini CLI, Google’s new terminal-based coding AI tool, all without having written a line of Rust before. I prompted it to scaffold the project, wire up SSH-only Git sync, schedule cron jobs, handle divergence with force-push logic, and statically compile for musl. What started as a weekend tinkering session turned into a working Rust binary faster than I could brew my morning tea: proof that even a non-coder can “vibecode” a solution end-to-end when armed with the right AI sidekick. Enter Gemini-CLI.

The result? giterdone, a stubborn CLI tool that backs up your config files to a GitHub repo on a schedule. No cloud APIs, no OAuth nonsense, just SSH, Git, and the cold comfort of cron. The sort of tool you forget about until your NAS turns your carefully crafted .bashrc into digital confetti.

The Problem That Sparked It

My homelab (Proxmox LXCs for Jellyfin and Open WebUI, Docker stacks behind Traefik) usually hums along nicely. Then suddenly, it doesn’t. Maybe a config file evaporates. Maybe an update resets everything like it’s 2007 and your MySpace top friends just got reshuffled.

I had rclone for bulk data and Filebrowser for tinkering, but I needed something paranoid: a silent, low-maintenance brute that would shove my configs into version control while I slept. No babysitting, no bash scripts that break if you sneeze near them, and absolutely no trusting my future self to remember manual backups.

Why Rust? Because Go Was Too Sane

I started with Go, because Go is easy and I enjoy functioning programs. Then I thought: What if I did something stupid instead? Rust had always been that language people evangelized like it was a CrossFit cult, so I dove in blind: no idea what a borrow checker was, no clue about lifetimes, just a vague awareness that unsafe exists and should probably be avoided outside of dire circumstances.

Turns out, Rust’s ecosystem is well-stocked: clap for argument parsing, serde for config handling, dirs for not breaking on different OS paths. I opted to shell out to git directly instead of wrestling with libgit2, because sometimes the best abstraction is just duct-taping system commands together and hoping for the best.
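For a sense of how those crates slot together, here’s a minimal sketch. The struct and flag names are my own illustration rather than giterdone’s actual code, and it assumes clap 4 with the derive feature, serde with derive, and dirs in Cargo.toml:

```rust
use std::path::PathBuf;

use clap::Parser;
use serde::{Deserialize, Serialize};

/// Back up tracked config files to a Git remote on a schedule.
#[derive(Parser)]
#[command(name = "giterdone")]
struct Cli {
    /// Run one backup immediately instead of waiting for cron.
    #[arg(long)]
    run_once: bool,
}

/// Roughly what ends up on disk under ~/.config/giterdone.
#[derive(Serialize, Deserialize)]
struct Config {
    repo_url: String,            // SSH remote, e.g. git@github.com:you/dotfile-backups.git
    tracked_files: Vec<PathBuf>, // absolute paths to the files being backed up
    cron_schedule: String,       // e.g. "0 3 * * *"
}

fn config_dir() -> PathBuf {
    // dirs::config_dir() resolves to ~/.config on Linux and the platform
    // equivalent elsewhere, so paths don't break across operating systems.
    dirs::config_dir()
        .expect("no config directory on this platform")
        .join("giterdone")
}

fn main() {
    let cli = Cli::parse();
    println!("config lives in {}, run_once = {}", config_dir().display(), cli.run_once);
}
```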

The setup was simple: the first run prompts for a GitHub SSH URL, the files to track, and a cron schedule. It writes its config to ~/.config/giterdone, and every scheduled run does the usual git add, commit, push dance. No surprises, no wizardry, just Git’s natural state of barely-contained chaos, automated.
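That dance is really just std::process::Command in a trench coat. A sketch of the scheduled run, assuming the backup repo is already cloned to a local repo_dir (names and structure are mine, not necessarily the real source):

```rust
use std::io;
use std::path::Path;
use std::process::Command;

/// Run a git subcommand inside the backup repo and report whether it exited 0.
fn git(repo_dir: &Path, args: &[&str]) -> io::Result<bool> {
    let status = Command::new("git")
        .arg("-C")
        .arg(repo_dir)
        .args(args)
        .status()?;
    Ok(status.success())
}

/// The scheduled run: stage everything, commit only if something changed, push.
fn backup(repo_dir: &Path) -> io::Result<()> {
    git(repo_dir, &["add", "--all"])?;
    // `git diff --cached --quiet` exits non-zero when something is staged,
    // so a "failed" status here actually means there is work to commit.
    let dirty = !git(repo_dir, &["diff", "--cached", "--quiet"])?;
    if dirty {
        git(repo_dir, &["commit", "-m", "giterdone: scheduled backup"])?;
        git(repo_dir, &["push", "origin", "HEAD"])?;
    }
    Ok(())
}
```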

The Git Betrayal

Git’s quirks became apparent fast. My first few runs seemed fine until I checked the logs and realized git push had failed, silently, because of a non-fast-forward error. Turns out, when cron runs your script at 3 a.m., Git gets opinionated about sync states and refuses to push like a toddler refusing vegetables.

Solution? Detect divergence and use --force when needed. I also started logging everything to ~/.config/giterdone/logs so I wouldn’t have to spelunk through strace output like some kind of digital archaeologist.
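The fix itself can stay dumb: try the polite push first, and only reach for --force once the histories have provably diverged. A sketch of that logic, assuming the branch is main (giterdone logs everything either way):

```rust
use std::io;
use std::path::Path;
use std::process::Command;

/// Build a `git -C <repo_dir> ...` command ready for extra arguments.
fn git_cmd(repo_dir: &Path) -> Command {
    let mut cmd = Command::new("git");
    cmd.arg("-C").arg(repo_dir);
    cmd
}

/// Push politely; fall back to --force only when histories have diverged.
fn push_with_fallback(repo_dir: &Path) -> io::Result<()> {
    // Happy path: a plain push succeeds and we're done.
    if git_cmd(repo_dir).args(["push", "origin", "main"]).status()?.success() {
        return Ok(());
    }
    // Refresh remote refs, then count commits unique to each side of
    // origin/main...main. Non-zero on both sides means divergence, which is
    // exactly the non-fast-forward case cron kept tripping over.
    git_cmd(repo_dir).args(["fetch", "origin"]).status()?;
    let out = git_cmd(repo_dir)
        .args(["rev-list", "--left-right", "--count", "origin/main...main"])
        .output()?;
    let text = String::from_utf8_lossy(&out.stdout);
    let mut counts = text.split_whitespace();
    let behind: u64 = counts.next().and_then(|s| s.parse().ok()).unwrap_or(0);
    let ahead: u64 = counts.next().and_then(|s| s.parse().ok()).unwrap_or(0);
    if behind > 0 && ahead > 0 {
        git_cmd(repo_dir).args(["push", "--force", "origin", "main"]).status()?;
    }
    Ok(())
}
```

Force-pushing is usually a sin, but for a one-way backup mirror the local machine is the source of truth, so local wins.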

Then there was the time I forgot to .gitignore debug logs and almost uploaded 20MB of my own frustrated console dumps to GitHub. That was a fun discovery.

SSH or Bust

Early on, I tried supporting both SSH and GitHub’s personal access tokens. Tokens introduced too much nonsense (credential storage, HTTPS URL parsing, Git’s credential helpers), so I nuked that idea. Now, giterdone checks for a working SSH key and, if you don’t have one, tells you to go generate one like an adult.
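“Checks for a working SSH key” is less magical than it sounds: look for a key file, then prove it can actually reach the remote. My sketch below leans on the dirs crate again and uses git ls-remote as a cheap, read-only probe; the function names are mine:

```rust
use std::path::PathBuf;
use std::process::Command;

/// Is there at least one of the usual private keys sitting in ~/.ssh?
fn has_ssh_key() -> bool {
    let ssh_dir: PathBuf = match dirs::home_dir() {
        Some(home) => home.join(".ssh"),
        None => return false,
    };
    ["id_ed25519", "id_rsa", "id_ecdsa"]
        .iter()
        .any(|name| ssh_dir.join(name).exists())
}

/// Can that key actually talk to the configured remote? `git ls-remote` does a
/// read-only round trip, so it proves authentication without touching anything.
fn can_reach_remote(ssh_url: &str) -> bool {
    Command::new("git")
        .args(["ls-remote", ssh_url])
        .status()
        .map(|s| s.success())
        .unwrap_or(false)
}
```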

Cutting out tokens made it simpler, and let’s be honest: if you’re pasting API keys into CLI tools in 2024, you deserve what happens next.

The Aftermath

After wrestling with musl static builds to shrink the binary (down to ~7MB) and ensure it ran on Alpine, I had a tool that just worked. Not gracefully, not beautifully, but reliably.
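For anyone reproducing the musl part, it boils down to a couple of commands. The target triple and binary path below reflect my assumptions about an x86_64 Linux build, so adjust to taste:

```sh
rustup target add x86_64-unknown-linux-musl
cargo build --release --target x86_64-unknown-linux-musl
# strip debug symbols to claw back the last few hundred kilobytes
strip target/x86_64-unknown-linux-musl/release/giterdone
```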

I also, accidentally, learned a lot: Rust’s ecosystem, Git’s quiet failures, and how satisfying it is to automate something so thoroughly you forget it exists. Which, in homelab terms, is the highest praise.

Regrets and Next Steps

Abstracting the Git logic into a proper library would make testing easier (I test nothing, of course).

A curses-style TUI for setup might make it feel more polished, but it also sounds like yak shaving.

GitHub repo creation via API might be a good addition for users who want this to be more plug-and-play.

Here’s where GiterDone lives.

Vibecoding at its finest. Built, deployed, forgotten, like all good tools should be.
