  • Yes, I did read the OP.

    Edit: I see this was downvoted without a response, but I'll put this out there anyway.

    If you host a public site that you expect anyone to be able to access, there is very little you can do to exclude an AI scraper specifically.

    Hosting your own site for personal use? IP blocks etc. will prevent scraping.

    But how do you tell legitimate users from scrapers? It's very difficult.

    They will use up your traffic either way. Don't want that? You can waste their time (a tarpit) or take your hosting away from public access.
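
    If you just want to turn away the scrapers that announce themselves, here is a minimal Caddyfile sketch (the bot list and example.com are placeholder assumptions, and it only catches crawlers that send an honest User-Agent):

    example.com {
        # Hypothetical matcher for some well-known AI crawler user agents (incomplete list)
        @aibots header_regexp User-Agent (?i)(GPTBot|ClaudeBot|CCBot|Bytespider)
        # Drop matching requests without a response
        abort @aibots
        # Everything else goes to your app (placeholder upstream)
        reverse_proxy localhost:8080
    }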

    Downvoter, what's your alternative?



  • Some things to consider:

    1. Do you want whatever you're hosting to be accessible from the internet, or local network only? That determines whether you can access it from anywhere.
    2. If it's something you want to be able to use full time, you'll eventually want a dedicated machine. I have an *arr stack etc., which I share with a private group. Eventually downtime became an actual consideration and interrupted my normal usage of my main machine. I picked up a secondhand EliteDesk mini PC for £150 and it's doing really well.
    3. Potentially make use of tools like ZeroTier for private networking (others will hopefully chime in with alternatives, but I have had good success with ZeroTier).
    4. How do you want to host it, and are you willing to learn? Get a bit of knowledge of Docker or Podman (though Podman is hard mode, as most examples are Docker-specific). Using containers will make things simpler; there's a sketch after this list. The most complicated part IMO is networking, but even then it's more Docker networking stuff than general TCP/IP (like you mentioned in your post).
    5. OS - you mentioned using Linux? I personally use Ubuntu, just out of defaulting to what I've previously used, but I'm currently running Manjaro on my non-host PC, which I'm liking.
    6. Keep things secure - the more you expose to the internet, the more risk. Keep exposure as small as possible, and use Let's Encrypt or alternatives for anything you want to access over the internet.
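
    Tying into point 4, a minimal sketch of running one *arr service as a container (the image name, paths, and port are assumptions based on the common linuxserver.io Sonarr image - adjust for your setup):

    $ docker run -d --name sonarr \
        -p 8989:8989 \
        -e PUID=1000 -e PGID=1000 -e TZ=Europe/London \
        -v $PWD/sonarr-config:/config \
        -v /path/to/media:/tv \
        -v /path/to/downloads:/downloads \
        lscr.io/linuxserver/sonarr:latest

    8989 is Sonarr's default web UI port; PUID/PGID keep file ownership matching your user, and the -v mounts map the config, media library, and download folders into the container.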

    Good luck, have fun!



  • It depends on whether a whole-season torrent exists or not. If Sonarr can identify a release that's a whole season, it should download that when you search at the season level. If you've searched one episode at a time, you'll get single episodes.

    You can do an interactive search and, IIRC, specify the full season during that search.
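
    If you ever want to trigger that outside the UI, a sketch using Sonarr's command API (the API key, IDs, and port are placeholders, and I'm assuming the v3 command endpoint and SeasonSearch command name from memory - check the API docs):

    $ curl -X POST http://localhost:8989/api/v3/command \
        -H "X-Api-Key: YOUR_API_KEY" \
        -H "Content-Type: application/json" \
        -d '{"name": "SeasonSearch", "seriesId": 1, "seasonNumber": 2}'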



  • When you tried Caddy and received that error, it looks like you were using the wrong image name.

    Then you mentioned deleting the Caddyfile because the configuration didn't work. But if I'm following correctly, the Caddyfile wouldn't yet be relevant if the Caddy container hadn't actually run.

    Pulling from Caddy's docs, you should just need to run:

    $ docker run -d -p 80:80 \
        -v $PWD/Caddyfile:/etc/caddy/Caddyfile \
        -v caddy_data:/data \
        caddy
    

    where $PWD expands to the directory your terminal is currently in.

    Further docs on then configuring HTTPS can be found under "Automatic TLS with the Caddy image" here:

    https://hub.docker.com/_/caddy
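
    For when you get there, a minimal sketch (the domain is a placeholder; per those docs, Caddy provisions certificates automatically once your domain resolves to the server and ports 80 and 443 are published):

    example.com {
        respond "Hello from Caddy over HTTPS"
    }

    $ docker run -d -p 80:80 -p 443:443 \
        -v $PWD/Caddyfile:/etc/caddy/Caddyfile \
        -v caddy_data:/data \
        caddy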