

Note that using Headscale transfers the anxiety of control from Tailscale as a company to whatever VPS you host Headscale on
I like sysadmin, scripting, manga and football.
I used to run a mastodon bot in termux on a galaxy s3 mini many years ago.
Maybe when they merge transcoded downloads into the official clients. Right now I'm depending on Streamyfin
4-2-1-1 for me I guess 🫣 or 4-2-2?
Two copies at home, synced daily, one of them on an external drive that I like to refer to as the emergency grab-and-run copy lol
One at a family member synced weekly and manually every time I visit.
All three of those copies are always within a 10 kilometer radius in a valley overlooked by a volcano so…
One partial copy of the so-critical-would-cry-if-lost data is synced every few days to a Backblaze bucket.
I did set up Uptime Kuma for notifications on this. Let's see if it works out when the expiry arrives in a month
I don't think there's anything that seamlessly integrates with keyboards or social/chat apps, but you could try to self-host a booru app and then share the hotlinks to those gifs.
They are mostly known for anime and weeb stuff, but for memes and gifs I think the tagging system would work the best, so you can quickly search for what you want.
Lemmy itself, and then run any of the importers. I guess it would be really straightforward
If you expect your IT cousin/uncle/brother hosting the family Immich/Nextcloud not to be a trusted person in regards to bad actors, your issue is not exclusive to self-hosting.
That’s like saying a farmer will put cheese on a piece of cardboard for the mice to eat.
They might eat it, yes, but that wasn't the reason for the whole interaction to start. The glue around the cheese was.
It’s supposedly faster/snappier loading on large rooms. But if you are self-hosting a single user instance, you might not notice much improvement.
I was also running Dendrite, but I gave up because it seemed like development had stalled, so I moved over to Synapse.
Note that most WireGuard clients won't re-resolve when the DNS entry changes and will silently keep a failed tunnel up, so you would have to put some measure in place to periodically restart the tunnel.
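A minimal sketch of such a measure, run from cron every few minutes: if the newest handshake on the interface is older than ~3 minutes, bounce the tunnel so the endpoint hostname gets re-resolved. The interface name `wg0` and the use of wg-quick are assumptions about your setup.

```shell
#!/bin/sh
# Restart a WireGuard tunnel when the last handshake is stale.
# Assumes the interface is managed by wg-quick and is named wg0.
IFACE=wg0

# `wg show <iface> latest-handshakes` prints "<peer-pubkey>\t<unix-timestamp>"
# per peer; take the most recent timestamp (0 if there was never a handshake).
LAST=$(wg show "$IFACE" latest-handshakes | awk '{print $2}' | sort -n | tail -1)
LAST=${LAST:-0}
NOW=$(date +%s)

# No handshake for over 3 minutes: tear the tunnel down and bring it back up,
# which forces a fresh DNS lookup of the peer's endpoint.
if [ $((NOW - LAST)) -gt 180 ]; then
    wg-quick down "$IFACE" && wg-quick up "$IFACE"
fi
```

Dropping it in `/etc/cron.d/` (or a systemd timer) every five minutes is usually enough, since a healthy tunnel with keepalives handshakes far more often than that.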
Is x266 actually taking off? With all the AOMedia members that control graphics hardware (AMD, Intel, Nvidia) together, it feels like MPEG will need to gain a big partner to stay relevant.
For an old Nvidia card it might be too much of an energy drain.
I was also using the integrated Intel for video re-encodes, and I got an Arc A310 for 80 bucks, which is the cheapest you will get a new card with AV1 support.
Is it for security? I think it's mostly recommended because your home router is likely to have a dynamic address.
This is in regards to opening a port for WG vs a tunnel to a VPS. Of course directly exposing nginx on your router is bad.
remnant, partially because it's a Frankenstein of second-hand parts from Wallapop and dusted pieces from my old computers, partially as a weeb reference to the world of RWBY lol
lol same I like to know exactly where the data is
I always do this for any service, but I also do a dump of the DB in the top directory to keep a clean copy that could be version independent. They won't take much more space honestly
It doesn't cover permissions unless you are willing to set up HTTP auth on your webserver, but I really enjoy mdBook. It looks clean and is still just Markdown.
I guess that's fair for single-service composes, but I don't really trust composes with multiple services to gracefully handle recreating only one of the containers
The easiest way by far is downloading an existing dump from Kiwix
For example, wikipedia_en_all_nopic_2024-06.zim is only 54GB since it only contains text. Then via Docker you could use this compose file, where you have your .zim files in the /wikis volume:
```yaml
services:
  kiwix:
    image: ghcr.io/kiwix/kiwix-serve
    container_name: kiwix_app
    command: '*'
    ports:
      - '8080:8080'
    volumes:
      - "/wikis:/data"
    restart: always
```
Theoretically you can actually use one of the Wikipedia database dumps with MediaWiki, but I don't know of any easy plug-and-play guide