• 1 Post
  • 38 Comments
Joined 2 years ago
Cake day: July 1st, 2023


  • That’s a good callout.

    There are a few things I do right now:

    1. All of my public DNS entries for the certs point at Cloudflare, not my IP.
    2. My internal network DNS resolver resolves those domains to an internal address, so I don’t rely on NAT reflection.
    3. I drop all connections to those domains in Cloudflare with rules.
    4. In Caddy, I drop all connections that come from a non-internal IP range for all internal services. I also drop all connections from subnets that shouldn’t be allowed to access those services, since the network is segmented into VLANs (see the sketch after this list).
    5. I use Tailscale so I don’t need routes from the internet into my internal services when I’m not at home.
    6. For externally accessible routes, I have entirely separate configurations that proxy access to them, and their external DNS still points to Cloudflare, which has very restrictive rules on allowable connections.
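
    For item 4, a minimal Caddyfile sketch of that kind of IP-range gating; the domain, upstream address, and subnets below are placeholders, not my actual network:

    ```
    internal-app.example.com {
        # Match any request that does NOT come from the listed internal ranges
        # (hypothetical VLAN subnets)
        @not_internal not remote_ip 192.168.10.0/24 192.168.20.0/24

        # Close the connection for those requests without sending a response
        abort @not_internal

        # Hypothetical internal service
        reverse_proxy 192.168.10.5:8080
    }
    ```

    The `remote_ip` matcher and `abort` directive are standard Caddy v2; the allowed subnets would be adjusted per service to enforce the VLAN segmentation.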

    Hopefully this information helps someone else who’s also trying to do this.


  • I just:

    1. Have my router set up with DNS entries for the domains I want to resolve locally, pointing them at:
    2. A reverse proxy with automatic certbot-like behavior (Caddy), connected to the Cloudflare API. Any time I add a new domain or subdomain to reverse proxy to a particular device on my network, a valid certificate is automatically generated for me, and certificates are renewed automatically (see the sketch below the list).
    3. Navigation within my local network to these domains gets real certificates, and my traffic never goes to the internet.
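
    A minimal Caddyfile sketch of that setup, assuming a Caddy build that includes the caddy-dns/cloudflare plugin; the service name, domain, upstream, and token variable are placeholders:

    ```
    jellyfin.home.example.com {
        tls {
            # DNS-01 challenge via the Cloudflare API, so no inbound
            # HTTP reachability from the internet is required
            dns cloudflare {env.CLOUDFLARE_API_TOKEN}
        }

        # Hypothetical device on the local network
        reverse_proxy 192.168.1.50:8096
    }
    ```

    Because validation happens over DNS rather than HTTP, the certificate is issued and renewed even though the name only resolves to a private address.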












  • Yeah I had literally no idea what you were talking about until you mentioned the actual name in the comments.

    NPM almost universally refers to the Node package manager in any developer or development-adjacent conversation, in my experience. The fact that the site, the command, the logo, and the binaries are all “npm” makes that the more appropriate reading.

    Nginx Proxy Manager is far too niche to be referred to universally by its acronym; it’s only ever abbreviated once the context for its usage has already been established (i.e. in its own documentation).

    This becomes much more clear when you Google the acronym.


  • It is, but it’s also worrisome, since it means support is harder, which means the risk of abandonment is higher and community contributions are lower. That makes “buying in” riskier for the time investment.

    Not really criticizing; 10/10 points for making something and putting it out there, nothing wrong with that. I’m just a user who’s seen too many projects go stale or get abandoned, and who has noticed that the trend correlates somewhat with the technology choices those projects made.





  • I might be crazy, but I have a 20TB WD Red Pro in a padded, waterproof, locking case that I take a full backup onto and then drive over to a family member’s place 30m away once a month or so.

    It’s a fully encrypted backup of all my important stuff in a separate geographic location.

    All of my VM data also backs up hourly to my NAS, which then gets backed up onto the large drive monthly.

    Monthly granularity isn’t great, to be fair, but it’s better than nothing. I should probably back up the more important, rapidly changing stuff online daily.