37802 readers
78 users here now

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.


  1. Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.

  2. No spam posting.

  3. Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around self-hosting, please include details to make it clear.

  4. Don't duplicate the full text of your blog or GitHub here. Just post the link for folks to click.

  5. Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).

  6. No trolling.


Any issues with the community? Report them using the report flag.

Questions? DM the mods!

founded 1 year ago

Hi! Could someone tell me if the current (real OS) UmbrelOS supports multiple internal drives?


Looking for a good FOSS pastebin service I can easily host with Docker. Requirements:

  • Can put a password/account login on paste uploading
  • FOSS
  • Will auto-delete pastes after some time
  • Needs a raw-text capability so I can wget things
  • Preferably language highlighting
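
Not an answer from the thread, but one FOSS option that covers Docker hosting, expiry, and syntax highlighting is PrivateBin (note its client-side encryption makes raw-text retrieval for wget awkward, so verify that requirement first). A minimal compose sketch, assuming the official privatebin/nginx-fpm-alpine image; the host port and data path are placeholders:

    services:
      privatebin:
        image: privatebin/nginx-fpm-alpine
        ports:
          - "8080:8080"
        volumes:
          - ./privatebin-data:/srv/data   # paste storage; expiry options live in conf.php
        restart: unless-stopped

Password-protected uploading isn't built in, so you'd put basic auth on the upload path via a reverse proxy, or look at alternatives like microbin if the wget requirement is a hard one.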


Might be related to this other post:

crosspost content below:

Firstly, this post is not to celebrate somebody losing their job, nor to poke fun at a company struggling in today's market.

However, it might go some way to explaining why Portainer is tightening up the free Business plan from five to three nodes.

Sean O'Dell

My time at Portainer came to an end in May due to restructuring/layoffs. I am proud of the work the team and I put in. Being the Head of Marketing is challenging but I am thankful for the personal growth and all that we accomplished. Monday starts the search for my next role!


Hi everyone,

I've started pushing backups of media important to me (family pictures, video etc) to backblaze with client-side encryption.

However, are they a reliable storage provider? I can't help but compare them to something like Amazon, which likely has a better chance of maintaining my files but is so expensive that I don't even bother.

What do you think? Yes, I've heard of 3-2-1, however for now I only have backblaze and a local backup. I'm trying not to spend too much on this.



Hi Folks,

I host a nextcloud instance, a NAS, and a few content portals for things like ebooks and music (internal only). I'll be migrating Smartthings to Home Assistant eventually. We're going to be upgrading to fiber soon and I have the opportunity to rebuild my wife's network with a long term outlook (we'll likely be here for years). Currently we have an older eero mesh system over cable internet. My desk is right where the cable currently comes in so all my Ethernet devices can live near the router.

My question is this:

What am I missing out on as a self-hoster by using whatever equipment Metronet gives me?

What am I missing out on as a regular internet user by using the default equipment?

Am I likely to be annoyed about where the fiber comes into the house?

If it makes sense to buy my own router or access point(s), what is a reasonable balance between "daddy Bezos please read all my emails" and "you'll never be secure until you build a router from custom circuit boards you custom ordered and hand assembled in a secure area".

I'd like to avoid complex configuration, but if I can surface advanced options when needed, that would be great.

My Linux knowledge is intermediate. My networking knowledge is begintermediate.

Old microserver bad idea?
submitted 1 week ago* (last edited 1 week ago) by [email protected] to c/[email protected]

I'm thinking of picking up an old HP Microserver (gen8) and was wondering if it is a bad idea from a security standpoint.

I mean it's only 10 years old - is there any exploit or something like that?

What about a N36L Microserver?

I'd probably run Debian headless on it.

I'd only use it for Syncthing and as a backup NAS.


Everybody made really good arguments against the microserver and I won't be getting one. Thank you for your inputs


Just got an email thanking me for being a 5-node/free user, but Portainer isn't free and I need to stop being a cheap-ass and pay them because blah blah economic times enshittification blah blah blah.

I moved off them a while ago, but figured I'd see if they emailed EVERYONE about this?

A good time to ditch them if you haven't, I suppose.


I'm in desperate need of setting up borgmatic for Borg backup. I would like to encrypt my backups. (I suppose an unencrypted backup is better than none in my case, so I should get it done today regardless.)

How do I save those keys? Is there a directory structure I follow? Do you backup the keys as well? Are there keys that I need to write down by hand? Should I use a cloud service like bitwarden secrets manager? Could I host something?

I'm ignorant on this matter. The most I've done is add SSH keys to git forges using ssh-copy-id. But I've always been able to access what I need without keeping those keys around (I log in to the web interface). Can you share best practices, or what you do to manage non-password secrets?
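
For Borg specifically, the repository key can be exported and stored out of band. A command sketch with hypothetical paths (in the default repokey mode the key lives inside the repo itself, but an export protects you if the repo header is lost or corrupted):

    # Export the repository key to a file you back up separately from the repo
    borg key export /path/to/repo ~/keys/myrepo.key

    # Or produce a printable version you can write down / keep on paper
    borg key export --paper /path/to/repo

The passphrase protecting the key should live somewhere else again, e.g. a password manager.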


Hello! I have Jellyfin + qBittorrent + Radarr on my home server, but I can't make it work with hardlinks. When a download finishes, it just copies the file to the /movies folder, doubling the disk space. At least, I think it's just a copy, because disk usage doubles and "find ./downloads -samefile ./movies/path/to/file.mkv" returns no result, meaning (if I understand correctly) that file.mkv is not hardlinked to any file in the downloads folder (but it should be).

this is the docker compose:

    services:
      radarr:
        container_name: radarr
        network_mode: container:gluetun
        environment:
          - PUID=1000
          - PGID=1000
          - TZ=Europe/Rome
        volumes:
          - ./radarr-config:/config
          - /media/HDD1/movies:/movies
          - /media/HDD1/downloads:/downloads
        restart: unless-stopped

The HDD1 drive is formatted ext4, which supports hardlinks (in fact I can create them manually), and in the Radarr settings the "use hardlinks instead of copy" checkbox is checked.

Ideally I'd prefer symlinks instead of hardlinks, but I don't think there's a way to do that natively; I think I'd need an external script.

Any tips? Thanks in advance!
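
A note on diagnosing this (an assumption about the setup, not a confirmed fix): when /movies and /downloads are mapped into the container as two separate bind mounts, link() across them fails with EXDEV even though both sit on the same ext4 disk, so tools silently fall back to copying. The common workaround is mapping one parent path (e.g. /media/HDD1:/data) and keeping both folders inside it. Whether two paths really are hardlinked is easy to verify by inode:

```shell
# Demo with temporary files (not the poster's paths): hardlinked files
# share one device:inode pair and have a link count of 2.
tmp=$(mktemp -d)
echo "data" > "$tmp/original.mkv"
ln "$tmp/original.mkv" "$tmp/linked.mkv"   # hardlink on the same filesystem
stat -c '%d:%i links=%h %n' "$tmp/original.mkv" "$tmp/linked.mkv"
rm -rf "$tmp"
```

If the two `stat` lines show the same device:inode and links=2, the files are hardlinked; different inodes mean a copy.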


I’ve been using the CarFAX Car Care app/website for a long time but I’m looking for something better.

It would be nice to have something I can enter my car make/model into and have it suggest maintenance but also keep track of repairs. I like uploading PDF scans of receipts too; one thing that always bothered me about Car Care is the horrible, weird compression it does on those files.


cross-posted from:

I have many ebooks from scouring the Internet, in two formats: EPUB and PDF. I want something server-like that lets me read them from any device on my local network, remembers where I left off in a book on one device, and lets me continue on another. I want the client app to have Android and Linux support, while the server should run on Linux. Is there anything out there? Bonus points if it automatically grabs metadata from the internet and organises books by topic, author, DDC, etc.

TLDR: An ebook library running on a Linux server with Android and Linux client software.

submitted 1 week ago* (last edited 1 week ago) by [email protected] to c/[email protected]

Should be easy to use, remember what I bought before and propose things that are probably running out (based on my personal buying frequency), and allow sharing the list between multiple people. Ideally also allow adding recipes for meals that I cook often.

unattended upgrades with caddy
submitted 1 week ago* (last edited 1 week ago) by [email protected] to c/[email protected]

Edit: credit to [email protected]

Assuming you installed caddy via Debian, Ubuntu, Raspbian method

add "cloudsmith/caddy/stable:any-version"; to the Unattended-Upgrade::Allowed-Origins list in /etc/apt/apt.conf.d/50unattended-upgrades:


// Automatically upgrade packages from these (origin:archive) pairs
// Note that in Ubuntu security updates may pull in new dependencies
// from non-security sources (e.g. chromium). By allowing the release
// pocket these get automatically pulled in.
Unattended-Upgrade::Allowed-Origins {
        "${distro_id}:${distro_codename}";
        "${distro_id}:${distro_codename}-security";
        // Extended Security Maintenance; doesn't necessarily exist for
        // every release and this system may not have it installed, but if
        // available, the policy for updates is such that unattended-upgrades
        // should also install from here by default.
        "${distro_id}ESMApps:${distro_codename}-apps-security";
        "${distro_id}ESM:${distro_codename}-infra-security";
        "cloudsmith/caddy/stable:any-version";
//      "${distro_id}:${distro_codename}-proposed";
//      "${distro_id}:${distro_codename}-backports";
};

Link to comment chain (not sure how to add links in a federated way)

Original post:

Hi guys, does anyone know how to use unattended-upgrades with Caddy?

I have Ubuntu Server 22.04.

The part that stumps me is that Caddy uses an external repository (Cloudsmith), making it difficult to set up.

I installed Caddy via the Debian/Ubuntu/Raspbian method.

The closest example I could find of unattended upgrades with an external repo was this example using Docker.



I'm not sure if it's as simple as




One more question: what effect would adding

APT::Unattended-Upgrade::Package-Blacklist "";

have?


I just removed this line; I only found it via Google Gemini (which probably isn't the best source of info):

APT::Unattended-Upgrade::Package-Blacklist "";

cross-posted from:

A really nice project that provides charts to display Linux server status, plus tools to manage the server.

I was using DaRemote (only available on the Google Play Store) to do that. Recently there was an option to download it and pay for it directly from the dev.

ServerBox is really awesome; it convinced me in 3 minutes: open source, secure access with biometrics, selectable fonts, etc.


I had everything working fine for a while, and suddenly all my indexers have stopped working. I get the error: "unable to connect to indexer. Connection refused"

That address is not where my CasaOS is; I don't know why it wants to connect to that one, or whether it has something to do with the error. As I said, it was working for about 2 months and I didn't change anything in the settings.


I want to host a search engine, but I don't know which. I know about SearXNG and 4get, for example, but there are a lot of other search engines out there. My question: how do I pick one, and by what criteria?

submitted 1 week ago* (last edited 1 week ago) by [email protected] to c/[email protected]

I've been ripping my anime Blu-ray collection and wanted an easier way to sort it for Jellyfin, so I wanted to try Shoko Server, but it's not recognizing any of my anime. It sees the actual files but categorizes them all as Unrecognized, making the entire idea of using it for automated sorting pointless. I'm struggling to find guides on this and the documentation is quite lacking. I don't know what I'm doing wrong. Are there certain rules I need to follow for Shoko to hash correctly? Does it hash the name? The actual ripped files?

My folder structure is setup in a way that Jellyfin properly recognizes it (without using the Shoko plugin yet), so like so for example:

- Fate/stay night: ubw (2014)
---- Season 01
---------- <episode> S01E01
- Fate/stay night: ubw (2015)
---- Season 01
---------- you get the idea

Since multi-season anime are often separate entries, each season is usually its own main folder (which is one of the reasons I wanted to try Shoko: to see if I could combine them into one so that I don't have multiple entries for what is really only one anime series).

Anyone here who uses Shoko and has some tips?

EDIT: thanks for the information and tips everyone. Seems like Shoko might not be what I'm actually looking for.


Hello! I was wondering if periodically running a script to automatically pull new images for all my containers is a good or a bad idea. I'd run it every day at 5:00 AM to avoid interruptions. Any tips?

EDIT: Thanks to everyone for the help! I'll install Watchtower to manage the updates
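
For reference, a minimal Watchtower compose sketch matching the 5 AM plan. The image and environment variables come from Watchtower's documentation, but verify them against the current README; the schedule uses Watchtower's six-field cron syntax (seconds first):

    services:
      watchtower:
        image: containrrr/watchtower
        volumes:
          - /var/run/docker.sock:/var/run/docker.sock   # lets Watchtower restart other containers
        environment:
          - WATCHTOWER_SCHEDULE=0 0 5 * * *   # every day at 05:00
          - WATCHTOWER_CLEANUP=true           # prune superseded images
        restart: unless-stopped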



I've been playing with a Dell mini PC (OptiPlex 7070) that I set up with Proxmox and a single Debian virtual machine that hosts a bunch of containers (mostly an *arr stack).

All the data resides on the single SSD that came with the machine, but I'm now satisfied with the whole ordeal and would like to migrate my storage from my PC to this solution.

What's the best approach, software side? I have a bunch of HDDs of varying size and age (and therefore expected reliability), and I'd initially dedicate this storage to data I can 100% afford to lose (basically media).

I read I should avoid USB for reliability reasons (even though my mini PC exposes a USB-C port), but on the other hand I'm not sure what other options I have that don't force me to buy a NAS or properly sized drives to install inside the machine...

Also, what's a good filesystem for my usecase?

Thanks for any tips.


Currently, I have two VPN clients on most of my devices:

  • One for connecting to a LAN
  • One commercial VPN for privacy reasons

I usually stay connected to the commercial VPN on all my devices, unless I need to access something on that LAN.

This setup has a few drawbacks:

  • Most commercial VPN providers have a limit on the number of simultaneously connected clients
  • I can either obfuscate my IP or access resources on that LAN (including my Pi-hole for custom DNS-based blocking), but not both

One possible solution would be to route all internet traffic through a VPN client on the router in the LAN and figure out how to still keep at least one port open for the VPN docker container allowing access to the LAN. But then the ability to split tunnel around that would be pretty hard to achieve.

I want to be able to connect to a VPN host container on the LAN, which in turn routes all internet traffic through another VPN client container while allowing LAN traffic, but still be able to split tunnel specific applications on my Android/Linux/iOS devices.

Basically this:

   +---------------------+ internet traffic   +--------------------+           
   |                     | remote LAN traffic |                    |           
   | Client              |------------------->|VPN Host Container  |           
   | (Android/iOS/Linux) |                    |in remote LAN       |           
   |                     |                    |                    |           
   +---------------------+                    +--------------------+           
                      |                         |     |                        
                      |       remote LAN traffic|     | internet traffic       
split tunneled traffic|                 |--------     |                        
                      |                 |             v                        
                      v                 |         +---------------------------+
  +---------------------+               v         |                           |
  | regular LAN or      |     +-----------+       | VPN Client Container      |
  | internet connection |     |remote LAN |       | connects to commercial VPN|
  +---------------------+     +-----------+       |                           |
                                                  |                           |

Any recommendations on how to achieve this, especially considering client apps for Android and iOS with the ability to split tunnel per application?
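
One hedged sketch of the diagram above using gluetun (assuming Docker on the remote LAN; the images and environment variables are gluetun's and linuxserver.io's, but treat the provider, subnet, and port as placeholders): run the commercial-VPN client as a gluetun container, then run the WireGuard host inside its network namespace so everything it forwards exits via the commercial VPN, while FIREWALL_OUTBOUND_SUBNETS keeps LAN traffic local. Per-app split tunneling then happens client-side; the official WireGuard Android app supports excluding applications from the tunnel.

    services:
      gluetun:                                  # "VPN Client Container" (commercial VPN)
        image: qmcgaw/gluetun
        cap_add: [NET_ADMIN]
        environment:
          - VPN_SERVICE_PROVIDER=mullvad        # placeholder provider
          - FIREWALL_OUTBOUND_SUBNETS=192.168.1.0/24  # keep LAN traffic out of the tunnel
          - FIREWALL_INPUT_PORTS=51820          # allow inbound WireGuard on the LAN side
        ports:
          - "51820:51820/udp"                   # inbound WireGuard from your devices

      wireguard:                                # "VPN Host Container" your devices connect to
        image: lscr.io/linuxserver/wireguard
        network_mode: "service:gluetun"         # all forwarded traffic exits via gluetun
        cap_add: [NET_ADMIN]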


~~Got it by following this guide.~~

Ended up modifying this setup to have better control over potential IP leakage.


This is a part from an IBM server dated 2008 that I want to reuse in my new computer. It essentially converts one SAS port to four SATA ports. I'll use the RAID card to connect to it via SAS, but I don't know what the power port is, or what the connector on the top is either.

submitted 1 week ago* (last edited 1 week ago) by [email protected] to c/[email protected]

ServerBox GitHub link

Looking for a convenient overview of your servers?

Randomly found this app on F-Droid and I am blown away.

It fetches server stats, even drive usage, and makes it super easy to open an SFTP browser or even an SSH console if you quickly need one.

Highly recommended.


I've been using the Firefox docker container through the gluetun docker container (works well with Proton and Mullvad) and it's been really great.

To me it's kind of like a less restricted tor browser, for when you need something stronger in terms of speed or IP blocking. And maybe something more persistent.

And it always stays open even when you close your connection.

Some of my use cases are:

  • Anonymously downloading larger files through the clearnet.

  • Anonymous ChatGPT usage.

  • Manually looking for torrent magnet links (though I usually do that with the tor browser)

  • Accessing shadow libraries
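
For anyone wanting to replicate the setup above, a minimal compose sketch. The poster doesn't name their Firefox image, so jlesage/firefox (a common choice whose web UI listens on port 5800) stands in here, and the gluetun provider is a placeholder:

    services:
      gluetun:
        image: qmcgaw/gluetun
        cap_add: [NET_ADMIN]
        environment:
          - VPN_SERVICE_PROVIDER=protonvpn   # or mullvad
        ports:
          - "5800:5800"                      # Firefox web UI, published via gluetun's namespace

      firefox:
        image: jlesage/firefox
        network_mode: "service:gluetun"      # all browser traffic rides the VPN

Since the Firefox container shares gluetun's network namespace, its ports must be published on the gluetun service, and the browser keeps running even when you close the viewer.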


Normally my *arr -> Plex setup is quite painless, but lately I've had a bunch of imports failing; they appear to be multi-part releases with an .mkv file in a "sample" subdirectory.

Does anyone know how I can sort it so these files import properly? Or how to filter them before downloading? I'd rather have a fix if possible, because certain torrents don't have many options.


It's for open-source AI-generated speech.

That thing is like an 11 out of 10 install difficulty. I hate github projects that are really difficult to get working.

I'm using Debian. So many fucking issues. It probably can't work on Debian, but I'd really like to know what system the people who have been successful in getting this to actually work are using.
