

Oh, I think I tried at one point, and when the guide started talking about inventory, playbooks and hosts in the first step it broke me a little xd
Got any decent guides on how to do it? I guess a Docker Compose file can do most of the work there, but I'm not sure about volume backups and other dependencies in the OS.
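For reference, this is the kind of minimal docker-compose.yml I have in mind (the service, image and paths are just placeholders, not a setup I've tested):

```yaml
# hypothetical sketch, not a tested setup
services:
  jellyfin:
    image: jellyfin/jellyfin        # example service
    restart: unless-stopped
    ports:
      - "8096:8096"
    volumes:
      - ./config:/config            # bind mount: config lives next to this file
      - ./media:/media:ro           # media folder, read-only inside the container
```

With bind mounts like this, I'm guessing "volume backups" mostly reduce to backing up the ./config and ./media folders alongside the compose file itself?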
Hmm, I bought a used laptop that I wanted to tinker with Linux and docker services on, but I kinda wanted to split the NAS off into a separate device to avoid the "all eggs in one basket" situation (also I can't really connect that many hard drives to it unless I buy some externally powered USB disk hubs or something, if those exist and are any good?)
However I do see the merit in your suggestion considering some of the suggestions here are driving me into temptation to get a $500 NAS and that’s even without the drives… that’s practically more than what my desktop is worth atm.
Could be a regional thing, but Synology HDDs are around 30% more expensive than "normal" WD/Seagate/Toshiba drives that I'm seeing at first glance. Maybe they make up for it in quality and longevity, but afaik HDDs are pretty durable if they're maintained well, and I imagine having them in RAID1 should be redundancy enough?
Considering the price of the DiskStation itself, it's all quickly adding up to the price of a standalone PC, so I'm trying to keep it simple since it's a relatively low-performance environment.
gummibando@mastodon.social
Sorry, by "docker drives" I meant "docker volumes or bind mounts". I don't have a lot of experience with them yet, so I'm not sure if I'll run into problems by mapping them directly to the NAS, or if I should keep local copies of the data and then rsync / Syncthing them over to the NAS. I've heard you can theoretically even run docker on the NAS itself, but I'm not sure if that's a good idea in terms of its longevity or performance.
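If mapping straight to the NAS turns out to be viable, my understanding is that Compose can declare an NFS-backed named volume roughly like this (the NAS address and export path are made-up placeholders):

```yaml
# hypothetical sketch: named volume backed by an NFS share on the NAS
volumes:
  media:
    driver: local
    driver_opts:
      type: nfs
      o: "addr=192.168.1.50,nfsvers=4,rw"   # placeholder NAS IP
      device: ":/volume1/media"             # placeholder export path

services:
  jellyfin:
    image: jellyfin/jellyfin
    volumes:
      - media:/media
```

I'd assume the local-copy + rsync route avoids containers stalling when the NAS is down, at the cost of storing everything twice.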
Is the list of “approved HDDs” just a marketing/support thing or does it actually affect performance?
Thanks for the answers! The DS2xx series looks like something I could start with. DS223 is a bit cheaper and has 3 USB ports so that could be useful, I’d guess I don’t need to focus on performance since it’s mostly just for personal data storage and not some intensive professional work.
Logseq
I feel old, I don’t understand 90% of words in this thread lol.
I just have Kodi on LibreELEC with a Jellyfin plugin on my rpi4, and even that struggled with overheating at times. So I run most stuff on my PC instead. I'm tempted to try Portainer to get some experience with docker tho.
I see, thanks. I'm still stuck in the mentality of "the more parts there are, the easier it is for something to break" :P. Or that it'd affect the speed in some way
Any specific reason why?
Isn’t it more likely that paths are used to reference resources like images rather than a db fk?
Damn, that’s extensive. How long did it take to set it all up and to maintain it continually?
Tbh I just run Pi-hole as a background service on my PC, since I can't easily access the router to change the DNS for the whole network anyway. It never seemed like that much work; what devops issues were you running into with it and Blocky that justify paying for a service instead?
My understanding is that instances have worker threads that continually pull new data from linked communities (the ones at least 1 person is subscribed to). It should be almost instant, but recently it's sometimes delayed due to a huge influx of traffic.
Yep, you can search by name specifically on https://kbin.social/search (or your kbin instance) and subscribe to it, then it starts getting synced.
You can if someone else subscribed to it in the past. If nobody ever did, then that community is unknown to kbin and you won’t find any data on it whether you’re logged in or not.
Honestly I've never used docker properly, and the one time I tried it for the *arr stack I ran into a lot of issues with access to the storage drives and connectivity between the different services. Does it actually help with anything on an rpi? I thought it was good enough to just install the rpi OS and then install the other services normally on it?
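For what it's worth, the storage-access issues I hit might have been the classic UID/GID mismatch between the container and the host; the linuxserver.io images expose PUID/PGID environment variables for exactly that (the values below are just the common defaults, not something I've verified for my setup):

```yaml
# hypothetical sketch using a linuxserver.io image
services:
  sonarr:
    image: lscr.io/linuxserver/sonarr
    environment:
      - PUID=1000   # run as this user id inside the container
      - PGID=1000   # ...and this group id, so files it writes stay accessible on the host
      - TZ=Etc/UTC
    volumes:
      - ./config:/config
      - /mnt/media:/data   # placeholder host path for the media drives
```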
I really wish there were an easy way to export and import a list of communities@domain that you could then transfer to another account. It would make it very easy to add a bunch of content to a freshly created instance like you're talking about: once you subscribe to communities, your instance starts getting updates from them and your "All" tab gets populated.
For now you have to do it manually unfortunately, afaik.
Does Fluent Reader count? It doesn't have an amazing interface, but it's free and simple to use.