I understand that people enter the world of self-hosting for various reasons. I am dipping my toes into this ocean to get away from privacy-offending centralised services such as Google, Cloudflare, and AWS.

As I spend more time here, I realise that it is practically impossible, especially for a newcomer, to set up any usable self-hosted web service without relying on these corporate behemoths.

I wanted to have my own little static website and alongside that run Immich, but I find that without Cloudflare, Google, and AWS, I run the risk of getting DDoSed or hacked. Also, since the physical server will be hosted at my home (to avoid AWS), there is a serious risk of infecting all devices at home as well (I am currently reading about VLANs to avoid this).

Am I correct in thinking that avoiding these corporations is impossible (and that I should make peace with this situation), or are there ways to circumvent these giants and still have a good experience self-hosting and using web services, even as a newcomer (all without draining my pockets too much)?

Edit: I was working under a lot of misconceptions and still have a lot to learn. Thank you all for your answers.

      • Guadin@k.fe.derate.me · 2 years ago

        I don’t get why they say that. Sure, maybe the attackers don’t know that I’m on Ubuntu 21.2, but if they come across https://paperless.myproxy.com and the Paperless-NGX website opens, I’m pretty sure they know they just visited a Paperless install and can try the exploits they know. Yes, that last part was a bit snarky, but I am truly curious how it can help. I’ve looked at proxies multiple times for my self-hosted stuff, but I never saw really practical examples of what to do and how to set one up to add a safety/security layer, so I always fall back to my VPN and leave it at that.

    • d_ohlin@lemmy.world · 2 years ago

      It may not add security in and of itself, but it certainly adds the ability to have a little extra security. Put your reverse proxy in a DMZ, so that only it directly faces the intergoogles. Use a firewall to expose only certain ports, and restrict those to your origin servers. Install a single wildcard cert and easily cover any subdomains you set up. There are even nginx configuration files out there that will block URLs based on regex matches for suspicious strings. All of this (and probably a lot more I’m missing) adds some level of layered security.
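
      A minimal sketch of what that nginx layering could look like. The hostname, certificate paths, internal address, and regex here are all made-up placeholders for illustration, not a hardened config:

      ```nginx
      server {
          listen 443 ssl;
          server_name paperless.example.com;

          # One wildcard cert covers every *.example.com subdomain.
          ssl_certificate     /etc/ssl/wildcard.example.com.fullchain.pem;
          ssl_certificate_key /etc/ssl/wildcard.example.com.key.pem;

          # Crude regex filter: drop requests for paths that mostly
          # only scanners ask for (this app serves none of these).
          location ~* \.(php|asp|aspx|cgi)$ {
              return 444;  # close the connection without responding
          }

          location / {
              # Everything else is forwarded to the app on the internal network;
              # only the proxy host itself is exposed to the internet.
              proxy_pass http://10.0.10.5:8000;
              proxy_set_header Host $host;
              proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
          }
      }
      ```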

      • atzanteol@sh.itjust.works · 2 years ago

        Put your reverse proxy in a DMZ, so that only it is directly facing the intergoogles

        So what? I can still access your application through the rproxy. You’re not protecting the application by doing that.

        Install a single wildcard cert and easily cover any subdomains you set up

        This is a way to do it but not a necessary way to do it. The rproxy has not improved security here. It’s just convenient to have a single SSL endpoint.

        There’s even nginx configuration files out there that will block URL’s based on regex pattern matches for suspicious strings. All of this (probably a lot more I’m missing) adds some level of layered security.

        If you do that, sure. But that’s not the advice given in this forum, is it? It’s “install an rproxy!” as though that alone does anything useful.

        For the most part, people in this forum seem to think that “direct access to my server” is unsafe, but that if you simply put a second hop in the chain you can sleep easily at night. And bonus points if that rproxy is a VPS or in a separate subnet!

        The web browser doesn’t care if the application is behind one, two or three rproxies. If I can still get to your application and guess your password or exploit a known vulnerability in your application then it’s game over.

        • zingo@lemmy.ca · 2 years ago

          The web browser doesn’t care if the application is behind one, two or three rproxies. If I can still get to your application and guess your password or exploit a known vulnerability in your application then it’s game over.

          Right!?

          Your castle can have many walls of protection but if you leave the doors/ports open, people/traffic just passes through.

        • Auli@lemmy.ca · 2 years ago

          So I’ve always wondered this: how does a Cloudflare tunnel offer protection from the same thing?

          • atzanteol@sh.itjust.works · 2 years ago

            They may offer some sort of WAF (web application firewall) that inspects traffic for potentially malicious intent. Things like SQL injection. That’s more than just a proxy though.

            Otherwise, they really don’t.

    • Oisteink@feddit.nl · 2 years ago

      A reverse proxy is used to expose services that don’t run on exposed hosts. It does not add security but it keeps you from adding attack vectors.
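
      As a sketch of that idea: the proxy host is the only machine reachable from outside, and it forwards traffic to a service that lives on the LAN. The hostname and addresses below are invented for illustration:

      ```nginx
      server {
          listen 80;
          server_name immich.example.com;

          location / {
              # The Immich host sits on the internal network and is never
              # exposed directly; only this proxy accepts outside connections.
              proxy_pass http://192.168.20.10:2283;
              proxy_set_header Host $host;
          }
      }
      ```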

      They usually provide load balancing too, also not a security feature.

      Edit: in other words, what he’s saying is true, and equivalent to “RAID isn’t backup”.