

If they ever use biometrics for this, mine are for sale. Cheap… I’d like to know what happens when I can’t use who I am as my ID because it’s public.
In 2025 they exploited a 2023 vuln. What did our telcos expect to happen?
It doesn’t check all your boxes, but Pingvin is what I use to share large files.
Looks like you got it! Congrats.
The router gets the public IP. Log in to it and find the port forwarding option. You pick a public port, e.g. 443, and forward it to a local IP:port combo, e.g. 192.168.0.101:443.
Then you can pick another public port and forward it to a different private IP:port combo.
If you want subdomains, you forward one port to one host and have it do the work, e.g. configure Nginx to route by hostname however you want (rough sketch below).
EDIT: or you use IPv6, where every device gets a public IP.
Even then, lots of other options…
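For example, a bare-bones Nginx config for the subdomain route might look something like this (the hostnames, IPs, and cert paths are placeholders you’d swap for your own; 8096 is just Jellyfin’s default port):

```nginx
# Route requests by subdomain to different internal hosts.
# Hostnames, IPs, and cert paths below are placeholders.
server {
    listen 443 ssl;
    server_name jellyfin.example.com;
    ssl_certificate     /etc/ssl/example/fullchain.pem;
    ssl_certificate_key /etc/ssl/example/privkey.pem;

    location / {
        proxy_pass http://192.168.0.101:8096;   # Jellyfin's default HTTP port
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $remote_addr;
    }
}

server {
    listen 443 ssl;
    server_name cloud.example.com;
    ssl_certificate     /etc/ssl/example/fullchain.pem;
    ssl_certificate_key /etc/ssl/example/privkey.pem;

    location / {
        proxy_pass http://192.168.0.102:8080;   # whatever other service
        proxy_set_header Host $host;
    }
}
```

Point both subdomains’ DNS at your public IP, forward 443 to the box running Nginx, and it sorts the rest out by hostname.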
Hang on, why not open the Jellyfin port to the internet?
I have a lifetime Plex pass so it’s not urgent, but I have containers running Emby and Jellyfin to check them out. Whichever one I decide on, I plan to open it up and give people logins.
And a monthly or weekly subscription.
https://docs.docker.com/engine/swarm/
Yeah, so you have more than one PC and they will talk to each other and decide who hosts what.
For example, you host Nextcloud and the cluster will decide (unless you tell it differently) that it goes to PC1. Then you host Minecraft and the cluster will put it on PC2.
Now say PC2 dies, you unplug it, or something else bad happens. The cluster will see that Minecraft isn’t running and PC2 is down, and it will start Minecraft on PC1. The best part: just keep adding cheap computers as you need more compute power. A single container (Plex, Emby, etc.) can’t run on two or more computers at once, so if you need to transcode you’ll want one node with a GPU or a more powerful CPU, depending on how many people will use the service.
This all assumes you’re not relying on local data. Meaning: if the Minecraft save and config files live on PC2 and it dies, starting the container on PC1 will either not work or start 100% fresh. There’s other self-hosted software to replicate that data across computers, or you can use a NAS of some sort.
It’s a bit more advanced, but a lot of fun if you enjoy that kind of thing, and it lets you work on your stuff with minimal downtime.
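A stack file for that kind of setup might look roughly like this; it’s just an illustration, the images, ports, and node name are placeholders, not anything specific to your setup:

```yaml
# Rough sketch of a Docker Swarm stack file, not a drop-in config.
version: "3.8"

services:
  nextcloud:
    image: nextcloud:latest
    ports:
      - "8080:80"
    deploy:
      replicas: 1              # one copy; the swarm picks which node runs it
      restart_policy:
        condition: on-failure  # reschedule if the container or node dies

  minecraft:
    image: itzg/minecraft-server:latest   # a commonly used community image
    environment:
      EULA: "TRUE"
    ports:
      - "25565:25565"
    deploy:
      replicas: 1
      # If the save data lives on one node's local disk, pin the service there
      # (or better, put the data on a NAS/NFS volume so any node can run it):
      # placement:
      #   constraints: ["node.hostname == pc2"]
```

Deploy it with `docker stack deploy -c stack.yml home` and the swarm spreads the services across your nodes; if a node goes down, anything whose data isn’t stuck on that node gets restarted elsewhere.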
I have 3 Raspberry Pis and 4 various Lenovo Tiny PCs, all in a Kubernetes cluster, and it seems I need more RAM than CPU. Storage is on a DIY NAS with 8×8 TB disks in RAID 6.
I run BookStack, Nextcloud, 2007scape, Gitea, Synapse, the *arr stack, Plex, and a bunch of other things.
If I was just starting out I’d grab a used Lenovo Tiny or two, set up a Docker cluster, and play with that. There’s software to replicate local storage across nodes that I’ve never touched, but I’d also try a few of those out if you don’t want to use a NAS. Worst case, just use local storage and the containers will be locked to that host.
I think Proxmox lets you run VMs and containers too, if you prefer that route.
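If you go the Docker cluster route, getting a basic swarm going is only a couple of commands, roughly like this (the IP is a placeholder for whatever your first box uses):

```sh
# On the first machine (it becomes the manager):
docker swarm init --advertise-addr 192.168.0.10

# It prints a "docker swarm join --token ..." command; run that on each extra box.

# Back on the manager, check that every node shows up:
docker node ls

# Then deploy a stack file and let the swarm place the services:
docker stack deploy -c stack.yml home
```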