As a Go dev, I'd argue its simplicity is taken too far. For example, there are no union types or proper enums
gibson
It usually isn't super hard to tell randomized junk like this apart from real human patterns. That is why Tor Browser, for example, tries its best to make everyone look the same instead of randomizing everything.
That said, for the mere purpose of throwing off the ISP's profiling algorithms, you could write a relatively simple Python program to solve this. A naive solution would just do an HTTP GET to each site, but a better solution would mimic human web browsing:
- Get a list of various news sites and political forum sites
- Set up headless Firefox or Chromium
- Use Selenium or similar to crawl links on each site. Make sure the pages fully load, and wait a random, human-like amount of time before going to the next page.
- https://realpython.com/modern-web-automation-with-python-and-selenium/#test-driving-a-headless-browser
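The steps above could be sketched roughly like this (a minimal sketch assuming the `selenium` package is installed; `SEED_SITES`, the dwell times, and the crawl depth are placeholder assumptions, not anything the ISP-specific answer depends on):

```python
import random
import time
from urllib.parse import urljoin, urlparse

# Hypothetical seed list -- substitute real news and forum sites.
SEED_SITES = ["https://example.com/news", "https://example.org/forum"]

def human_delay(min_s=15, max_s=120):
    """Pick a dwell time in seconds, roughly like a human reading a page."""
    return random.uniform(min_s, max_s)

def same_site_links(base_url, hrefs):
    """Resolve hrefs to absolute URLs and keep only those on the same host."""
    base_host = urlparse(base_url).netloc
    absolute = (urljoin(base_url, h) for h in hrefs if h)
    return [u for u in absolute if urlparse(u).netloc == base_host]

def crawl(driver, seed, depth=3):
    """Visit `seed`, then follow random same-site links `depth` times."""
    url = seed
    for _ in range(depth):
        driver.get(url)            # Selenium blocks until the page loads
        time.sleep(human_delay())  # linger like a reader would
        hrefs = [a.get_attribute("href")
                 for a in driver.find_elements("css selector", "a[href]")]
        links = same_site_links(url, hrefs)
        if not links:
            break
        url = random.choice(links)
```

To run it, you would build a headless driver (e.g. `webdriver.Firefox()` with the `--headless` option set) and call `crawl(driver, site)` for each seed site, then `driver.quit()`.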
If you have no programming capability this will be rough. If you have at least a little you can follow tutorials and use an LLM to help you.
The main issue with this goal is that it isn't possible to tell how advanced your ISP's profiling is, so you have no way to know if your solution is effective.
Feel free to DM me if you go this route.
Hello, open source dev here (though with no very popular projects). Even relatively small donations (e.g., $5, $10) are morale boosts, which is kind of sad, but yeah. If you can't do that, at least put effort into bug reports and use good manners.
Just because you can't stop all the leaks in your plumbing doesn't mean you shouldn't fix the ones you can.
It's best to have some defence in depth. Ideally you would have a firewall on your network AND your local machine. If you are running a laptop, definitely have a local firewall on it, as you cannot trust the random networks you connect to when out and about in the world.
firewalld is sufficient. I suggest learning its CLI, as it is not super complicated. ufw is OK if you are allergic to the command line.
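For what it's worth, a handful of `firewall-cmd` invocations cover most day-to-day use (a sketch; zone and service names depend on your setup):

```
$ sudo firewall-cmd --state                   # is the daemon running?
$ sudo firewall-cmd --get-active-zones        # which zones apply to which interfaces
$ sudo firewall-cmd --zone=public --list-all  # what the zone currently allows
$ sudo firewall-cmd --zone=public --add-service=ssh --permanent
$ sudo firewall-cmd --reload                  # apply --permanent changes
```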
I believe he does extend it to JavaScript, however, so if he were required to run unfree JavaScript on a webpage relating to his treatment, that could be a problem.
I don't think tar is actually hard; we are just in an era where we externalize more information into resources such as Google. It's the same reason why younger people don't remember routes by name or cardinal direction as much anymore.
Side note: tldr is much better than man for just getting common stuff done.
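For reference, the handful of tar invocations below cover most everyday use (c=create, x=extract, t=list, z=gzip, f=archive file):

```shell
mkdir -p demo && echo "hello" > demo/file.txt
tar -czf demo.tar.gz demo   # create a gzipped archive of the directory
tar -tzf demo.tar.gz        # list the archive's contents
rm -r demo
tar -xzf demo.tar.gz        # extract it back out
cat demo/file.txt
```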
Yes and no. Decentralization is great for a lot of reasons, but it does come with downsides. I don't know about you, but I've had my family and friends using Signal for years now, and I don't think I would have had such luck with Matrix/Element, let alone a P2P app.
I'm glad decentralized options exist and think they deserve more funding and love, however.
The Two Generals Problem, handled poorly by Lemmy. The user probably saw an error of some sort when trying to post, but the post actually went through.
Depending on what you're doing, a local LLM can help a bit. For example, if I want a recipe for an apple pie, I could ask LLaMA-2 even without an internet connection.
Not saying it's a replacement for a search engine, I just think it's worth mentioning.
(edit for grammar)
Thanks, will look into it
There is already Gridcoin, a cryptocurrency that rewards BOINC work, so I'd say this concern has already been addressed.