Thanks - this is exactly what I needed.
Yes - we’re “I’ll let you use my electricity for your computer thing” friends, not “I’m okay with seeing your printer on my home network” friends.
Kavita is for ebooks - it’s not perfect, and has some occasional weirdness with series because of its manga heritage.
For me, AudioBookShelf is the clear standout for audio books, and I ended up going with Kavita for ebooks.
I have it in a git repo, broken down by node and VPS name. Each of those folders holds a mixture of Ansible playbooks, Docker Compose files, or just markdown files with descriptions. Some of it is random stuff - my VPS provider allows exporting the cloud firewall rules as JSON, for instance. All the secrets needed by Ansible are in an Ansible vault, and the rest are in KeePass.
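To give an idea, a hypothetical layout (not my exact repo, all names made up for illustration):

```
infra/
├── node1/                    # Proxmox host
│   ├── playbook.yml          # Ansible playbook for this host
│   ├── docker-compose.yml
│   └── notes.md              # markdown description of what runs here
├── vps1/
│   ├── docker-compose.yml
│   ├── cloud-firewall.json   # exported from the VPS provider
│   └── notes.md
└── group_vars/
    └── all/
        └── vault.yml         # secrets, encrypted with ansible-vault
```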
I like data, I like tech, I like investing large amounts of time and energy to self-host things that muggles would not bother with.
I mean, yes, I could. But I’m committed to the #selfhosted life where I spend hours building unnecessarily complicated systems to make my life easier in small ways.
The process for this is to get an ESP32 with Bluetooth and WiFi, pair it to the scale over Bluetooth, keep it powered on within range of the scale, and then the data flows into HA?
I have the opposite experience of this. Each of my local services is a single Docker container inside its own LXC. I don’t like that it’s conceptually messy, but in practice it’s easy to manage. What I love about it is the simplicity of backing up or moving an entire LXC between servers.
I’ve not had any drama with things breaking across Proxmox updates. The only non-gui thing I need to do during the process is adding two lines to the LXC conf to have Tailscale work correctly.
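(For reference, the two lines are the standard ones from Tailscale’s LXC guidance, added to /etc/pve/lxc/<ID>.conf on the Proxmox host - this assumes cgroup v2, i.e. Proxmox 7 or later:)

```
lxc.cgroup2.devices.allow: c 10:200 rwm
lxc.mount.entry: /dev/net/tun dev/net/tun none bind,create=file
```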
No one’s mentioned Forgejo yet? Solid git and artifact repository.
Two good points here OP. Type `docker image ls` to see all the images you currently have locally - you’ll possibly be surprised how many. All the ones tagged `<none>` are old versions.
If you’re already using GitHub, it includes a package registry you could push retagged images to, or for something more self-hosty, a local instance of Forgejo would be a good option.
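A rough sketch of the retag-and-push flow, assuming a Forgejo instance at git.example.home with its built-in container registry enabled (hostname and username are placeholders):

```sh
# log in to the self-hosted registry
docker login git.example.home

# retag a local image against that registry and push it
docker tag nginx:1.27 git.example.home/myuser/nginx:1.27
docker push git.example.home/myuser/nginx:1.27

# later, pull it back from your own registry
docker pull git.example.home/myuser/nginx:1.27

# and clean up the dangling <none> images when you're done
docker image prune
```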
Guide to Self Hosting LLMs with Ollama.
`ollama run llama3.2`
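Beyond the interactive CLI, Ollama also exposes a local HTTP API (default port 11434) that other self-hosted apps can call - something like:

```sh
# make sure the model is downloaded
ollama pull llama3.2

# query the local API (stream disabled so you get a single JSON response)
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```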
If it’s an M1, you def can and it will work great. With Ollama.
Shoutout to Magic Earth, the (weirdly named) iOS app that uses OpenStreetMap data. Works on CarPlay, has reliable routing, and I get a buzz out of updating a changed speed limit or something on OSM and then seeing the change reflected a few weeks later when I’m driving through there again.
My step-up from the Pi was to eBay HP 800 G1 minis, then G2s. They are really well made, there are full repair manuals available, and they are just a pleasure to swap bits in and out of. I’ve heard good things about the 1-litre Lenovos and expect similar build quality from them.
I agree that RAM, rather than the processor, is the likely constraint for self-hosting workloads. Particularly in my case, as I’m on Proxmox and run all my Docker containers in separate LXCs. I run 32GB in the G2s, which was a straightforward upgrade (they take laptop-style memory). On some of them I’ve upgraded the SSDs, or, if not, I’ve added M.2 NVMe drives (the G2s have a slot for them).
Wish by Peter Goldsworthy. J.J. has always been more at home in Sign language than in spoken English. Recently divorced, he returns to school to teach Sign. His pupils include the foster parents of a beautiful and highly intelligent ape named Eliza.
Greta Tintin Thunberg
I run two local physical servers, one production and one dev (plus a third, prod2, kept in case of a prod1 failure), and two remote production/backup servers, all running Proxmox, plus two VPSs. Most apps are dockerised, either inside LXC containers (on Proxmox) or just with Docker on Ubuntu (on the VPSs). Each of the three locations runs a Synology NAS in addition to the server.
Backups run automatically, and I manually run apt updates on everything each weekend with a single Ansible playbook. Every host runs a little Go program that exposes memory and disk use percentages as a JSON endpoint, and I use two instances of Uptime Kuma (one local, and one on fly.io) to monitor all of those with keyword checks.
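Not my actual code, but a minimal sketch of that kind of endpoint in Go (Linux-only, stdlib, reads /proc/meminfo and statfs on the root filesystem; the /status path and port are made up). Uptime Kuma’s HTTP keyword monitor can then watch the JSON it returns:

```go
package main

import (
	"bufio"
	"encoding/json"
	"log"
	"net/http"
	"os"
	"strconv"
	"strings"
	"syscall"
)

// memUsedPercent parses /proc/meminfo and returns used memory as a percentage.
func memUsedPercent() float64 {
	f, err := os.Open("/proc/meminfo")
	if err != nil {
		return -1
	}
	defer f.Close()

	vals := map[string]float64{}
	s := bufio.NewScanner(f)
	for s.Scan() {
		fields := strings.Fields(s.Text())
		if len(fields) < 2 {
			continue
		}
		key := strings.TrimSuffix(fields[0], ":")
		if v, err := strconv.ParseFloat(fields[1], 64); err == nil {
			vals[key] = v
		}
	}
	total, avail := vals["MemTotal"], vals["MemAvailable"]
	if total == 0 {
		return -1
	}
	return (total - avail) / total * 100
}

// diskUsedPercent returns used space on the root filesystem as a percentage.
func diskUsedPercent() float64 {
	var st syscall.Statfs_t
	if err := syscall.Statfs("/", &st); err != nil {
		return -1
	}
	total := float64(st.Blocks) * float64(st.Bsize)
	free := float64(st.Bavail) * float64(st.Bsize)
	if total == 0 {
		return -1
	}
	return (total - free) / total * 100
}

func main() {
	// GET /status -> {"mem_used_percent": ..., "disk_used_percent": ...}
	http.HandleFunc("/status", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(map[string]float64{
			"mem_used_percent":  memUsedPercent(),
			"disk_used_percent": diskUsedPercent(),
		})
	})
	log.Fatal(http.ListenAndServe(":9000", nil))
}
```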
So -
I’m on board with original punctuation going inside the quote, but then, to be consistent, capitalization has to follow the original as well. So instead of “This comment…” it should be “this comment…”, since in the original quote that was just a clause separated by a comma, not its own sentence.
Yes, this.