• 1 Post
  • 16 Comments
Joined 2 years ago
Cake day: August 19th, 2023


  • I use Inoreader. It is not selfhosted, but that is the only thing I have at that company, and the articles are public anyway, so I don’t care that much.

    My workflow is to maximize the article and hide everything else, like the menu bar and the list of sources. I use j/k to scroll and v/space to open an article. With the Vimium extension, d closes the article’s tab. Articles are automatically marked as read as I scroll. It takes 5 minutes per day to go through everything. If I were to selfhost, I would try Tiny Tiny RSS.

    I don’t use it on my phone. No need; I am at my computer most of the time.




  • Mio@feddit.nu to Selfhosted@lemmy.world · Anyone running ZFS? (edited, 7 months ago)

    Data loss is never fun. File systems in general need a long time to iron out all the bugs. I hope it is in a better state today. I remember when ext4 was new and crashed on a laptop; Ubuntu adopted it too early, or I was not using LTS.

    But as always, make sure to have a proper backup in a different physical location.





  • I use very simple software for this. My firewall can do route monitoring with failover and policy-based routing, so on failure I just send all traffic to another machine that handles the diagnosis part. It pings through the firewall and fetches some info from it. The page itself is not pretty, but it says what is wrong, which is enough for my parents to read the error. I also send DNS traffic to a special DNS server that answers every query with the same static IP address. That is enough for the browser to continue with an HTTP GET, which the firewall forwards to my landing page. Sadly, I have not had any more problems since I changed ISP.

    I once had a scenario where the page said the gateway was reachable but nothing more: an ISP issue, and the DHCP lease slowly ran out. There was a fiber cut between our town and the next, so there was not much I could do about it. I just configured the IP statically and could reach some friends in the same city through IRC, so we could talk about it.

    The webpage itself was written in PHP; it read ICMP logs and showed the relevant up/down entries. Very simple.
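    The catch-all DNS trick described above can be sketched with dnsmasq; the file path and the 192.168.1.10 landing-page address are assumptions, not from the original setup.

```
# /etc/dnsmasq.d/captive.conf on the fallback resolver (hypothetical path
# and address): reply to every DNS query with the landing page's IP, so
# whatever hostname the browser asks for resolves to the diagnosis page.
address=/#/192.168.1.10
```

    With this in place, the browser resolves any domain to the landing page, issues its HTTP GET there, and the diagnosis page is shown without any client-side configuration.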



  • I use it with Kubuntu. Doing apt update is now much faster. I did some testing and found a good public mirror, so I could max out my connection (100 Mbit) with about 15 ms latency to the server. But I think the problem was the sheer number of small files; running nala to fetch the files in parallel helps, of course. With apt-cacher-ng I don’t need nala at all: the low latency and the gigabit connection to my server make access fast. I just need to find a good way to fill the cache with new updates.
    A second problem is figuring out whether apt upgrade itself can be sped up, which I guess is not possible. A workaround with snapshots and sending diffs does not sound efficient either, especially on older hardware.

    apt update: 4 seconds vs 16 seconds.

    apt upgrade --download-only: 10 seconds vs 84 seconds.
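    Pointing apt at a local cache like the one above is a one-line config file. A minimal sketch, assuming the cache runs on a host called apt-cache.lan and listens on apt-cacher-ng's default port 3142 (both names are assumptions for illustration):

```
# /etc/apt/apt.conf.d/01proxy (hypothetical host; 3142 is apt-cacher-ng's
# default port): route all HTTP mirror fetches through the LAN cache.
Acquire::http::Proxy "http://apt-cache.lan:3142";
```

    Every client on the LAN with this file fetches packages through the cache, so only the first download of each package hits the public mirror.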


  • First off: if the Internet goes down, I have an HTTP captive portal that does some diagnosis and shows where the problem is: link on the network interface, gateway reachable, DNS working, and DHCP lease. Second, once it is down, it shows the timestamp of when it went down. Third, it shows the phone numbers of the ISP and the city fiber network owner.

    Fourth: I check my local RSS feed and email folder. I also keep something from YouTube or Twitch downloaded locally to watch.
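    The four checks the portal runs can be sketched as a small shell script. This is a hypothetical sketch, not the actual portal code: the interface name, the test hostname, and the dhclient lease path are all assumptions.

```shell
#!/bin/sh
# Hypothetical sketch of the portal's four checks: interface link,
# gateway reachability, DNS resolution, and DHCP lease presence.
IFACE=eth0  # assumed interface name
GW=$(ip route | awk '/^default/ {print $3; exit}')  # current default gateway

ip link show "$IFACE" 2>/dev/null | grep -q 'state UP' \
  && echo "link: up" || echo "link: DOWN"

[ -n "$GW" ] && ping -c 1 -W 2 "$GW" >/dev/null 2>&1 \
  && echo "gateway: reachable" || echo "gateway: UNREACHABLE"

getent hosts example.com >/dev/null 2>&1 \
  && echo "dns: working" || echo "dns: FAILING"

# DHCP lease check is distro-specific; dhclient's lease file is one option.
[ -s /var/lib/dhcp/dhclient.leases ] \
  && echo "dhcp: lease file present" || echo "dhcp: no lease recorded"
```

    A web page only has to run this periodically and render the lines, which matches the "not pretty, but says what is wrong" approach described above.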