jlh

joined 1 year ago
[–] [email protected] 8 points 1 week ago

ENEMY STAND: ORGAN DISLOCATOR

[–] [email protected] 23 points 2 weeks ago (1 children)

Who is Sandy Loam?

[–] [email protected] 5 points 2 weeks ago

Awful. This is the same argument that the conservatives in Sweden have used to justify their massive increase in CO2 emissions after gaining power in 2022. Tech won't save us, and it's easier, cheaper, and more effective to fix your own heavily polluting grid than it is to plant some trees in Africa, or give away some solar panels and call it a day.

[–] [email protected] 3 points 2 weeks ago

Oh, you might have one of the newer ones that use interferometry to detect soulless entities.

[–] [email protected] 2 points 2 weeks ago (2 children)

Yes. This is how motion detectors work. Normally, motion detectors have an IR emitter that acts as a particle, but when someone walks by, the IR emitter works as a wave, triggering the motion detector.

Notably, this doesn't work with dogs, as they have no souls.

[–] [email protected] 1 points 1 month ago* (last edited 1 month ago)

Sure, but there are likely 4% more cars on the road now than there were in 2019. One graph I see shows about a 1% YoY growth of the car population in the US. EVs might have saved us from a 4% increase in car emissions, but car emissions are still increasing. I am really not convinced that EVs are the solution to the US's massive car emissions. Ban production of all gas cars in 2024 and then maybe there's a solution in sight.
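
As a quick back-of-the-envelope check of how those two numbers line up (both are this comment's own estimates, not measured data):

```python
# ~1% yearly growth in the car population, compounded since 2019:
for years in (4, 5):
    print(f"after {years} years: {1.01 ** years - 1:.1%} more cars")
# after 4 years: 4.1% more cars
# after 5 years: 5.1% more cars
```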

[–] [email protected] 4 points 1 month ago (1 children)

Pretty useful article! I'm fairly sure I've seen data shifting happen a few times on Jerboa.

I wonder if UUIDv7-based pagination solves these issues. Precise enough that you don't overlap/over-fetch data as with time-based pagination, while still being essentially time-ordered, so you have a consistent cursor. Definitely important to limit the size of requests, though.
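
A rough sketch of what keyset pagination over v7 ids could look like; the `comments` table, column names, and hand-rolled `uuid7()` helper are all illustrative here (your Python's stdlib `uuid` module may not have a `uuid7` yet, so this rolls a minimal one):

```python
import os
import time
import uuid

def uuid7() -> uuid.UUID:
    """Minimal UUIDv7: a 48-bit unix-ms timestamp up front, then version,
    variant, and random bits, so ids sort by creation time."""
    ts_ms = time.time_ns() // 1_000_000
    b = bytearray(ts_ms.to_bytes(6, "big") + os.urandom(10))
    b[6] = (b[6] & 0x0F) | 0x70  # version 7
    b[8] = (b[8] & 0x3F) | 0x80  # RFC 4122 variant
    return uuid.UUID(bytes=bytes(b))

def next_page(conn, after_id: str, limit: int = 50):
    # Keyset pagination: fetch rows strictly after the cursor, ordered by id.
    # v7 ids are time-ordered but unique, so rows sharing a timestamp can't
    # be skipped or fetched twice, and new inserts don't shift the pages.
    return conn.execute(
        "SELECT id, body FROM comments WHERE id > ? ORDER BY id LIMIT ?",
        (after_id, min(limit, 100)),  # also cap the request size
    ).fetchall()
```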

[–] [email protected] 1 points 1 month ago (2 children)

EVs make up a very small percentage of the market. The majority of cars are gas powered, and more people are buying more cars, so it's not like the number of gas cars is decreasing; it's just that some of the additional cars are EVs. It will be very hard for California to be climate neutral while it's still so dependent on cars.

[–] [email protected] 1 points 1 month ago (2 children)

Quantum computers are not advanced enough to break RSA/EC yet. The NSA might have some secret backdoors, but the recent focus on quantum-resistant encryption from both the public and private sectors (see TLS and Mullvad) has nothing to do with China.

What is a "one-time crypto", and how does it allow you to avoid quantum-vulnerable asymmetric encryption such as the algorithms used for HTTPS?
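
For reference, if "one-time crypto" means a one-time pad: that's a symmetric scheme (just XOR with a truly random, never-reused key as long as the message), so it's immune to quantum attacks on RSA/EC, but it doesn't answer how the two sides share that key in the first place, which is exactly what the asymmetric handshake in HTTPS is for. A minimal sketch:

```python
import secrets

def otp_encrypt(message: bytes) -> tuple[bytes, bytes]:
    # The key must be truly random, as long as the message, and never reused.
    key = secrets.token_bytes(len(message))
    ciphertext = bytes(m ^ k for m, k in zip(message, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

ct, key = otp_encrypt(b"attack at dawn")
assert otp_decrypt(ct, key) == b"attack at dawn"
```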

[–] [email protected] 1 points 1 month ago

It's obviously a scheduler/P-state bug in Windows; look at the Linux performance:

https://www.phoronix.com/review/ryzen-9600x-9700x

[–] [email protected] 1 points 1 month ago* (last edited 1 month ago) (1 children)

I guess that makes sense, but I wonder if it would be hard to get clean data out of the per-token confidence values. The LLM could be hallucinating, or it could just be generating bad grammar. It already seems hard enough to get LLMs to distinguish between "killing processes" and murder, but maybe some novel training and inference techniques will come out of this.
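
For reference, the "confidence" here would presumably be the per-token probabilities the model assigns (softmax over its logits). A minimal sketch of surfacing them with HuggingFace transformers; the model choice and the 0.1 threshold are arbitrary:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")  # any causal LM works the same way
model = AutoModelForCausalLM.from_pretrained("gpt2")

text = "The process was killed by the OOM killer."
ids = tok(text, return_tensors="pt").input_ids

with torch.no_grad():
    logits = model(ids).logits  # (1, seq_len, vocab_size)

# Logits at position i predict token i+1, so compare shifted sequences:
probs = torch.softmax(logits[0, :-1], dim=-1)
tok_probs = probs.gather(1, ids[0, 1:].unsqueeze(1)).squeeze(1)

for t, p in zip(tok.convert_ids_to_tokens(ids[0, 1:]), tok_probs):
    flag = "  <-- low confidence" if p < 0.1 else ""
    print(f"{t!r}: {p.item():.3f}{flag}")
```

As the comment says, though, a low value doesn't cleanly separate hallucination from grammatical freedom: plenty of fine tokens are low-probability simply because several continuations were plausible.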

[–] [email protected] 1 points 1 month ago (3 children)

I thought confidence levels were for image recognition? How do confidence levels work for transformer LLMs?

 

Seems like a really serious vulnerability; any container escape or malicious image could take over the container host if there's no hardening on the containers.
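
For illustration, "hardening" here could mean something like the following sketch using the docker Python SDK (the image name and specific limits are made up; the idea is just to give an untrusted container as little as possible):

```python
import docker  # pip install docker

client = docker.from_env()

# Run an untrusted image with reduced privileges: no capabilities,
# no privilege escalation, a non-root user, a read-only root
# filesystem, and bounded resources.
client.containers.run(
    "untrusted/image:latest",          # hypothetical image name
    detach=True,
    user="10001",                      # non-root UID
    cap_drop=["ALL"],                  # drop all Linux capabilities
    security_opt=["no-new-privileges"],
    read_only=True,
    pids_limit=256,
    mem_limit="256m",
)
```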

 

I wanted to share an observation I've made about the way the latest computer systems work. I swear this isn't an AI hype train post 😅

I'm seeing more and more computer systems these days use usage data or internal metrics to automatically adapt how they run, and I get the feeling that this is a new computing paradigm enabled by the increased modularity of modern computer systems.

First off, I would classify us as being in a sort of "second generation" of computing. Computers in the 80s and 90s were fairly basic: user programs were often written in C or assembly, and often ran directly in ring 0 of the CPU. Leading up to the year 2000, there was a wave of advancement and adoption in making computers more modular: microkernels, MMUs, higher-level languages with memory-managed runtimes, and the rise of modular programming in languages like Java and Python. This allowed computer systems to become much more advanced, as the new abstractions let programs reuse code and be a lot more ambitious. We are well into this era now, with VMs and Docker containers taking over computer infrastructure, and modern programming depending on software packages, like you see with NPM and Cargo.

So we're still in this "modularity" era of computing, where you can reuse code and even have microservices sharing data with each other, but often the amount of data individual computer systems have access to is relatively limited.

More recently, I think we're seeing the beginning of "data-driven" computing, which uses observability and control loops to run better and self-manage. (There's a toy sketch of such a loop after the examples below.)

I see a lot of recent examples of this:

  • Service orchestrators like systemd and Kubernetes, which monitor the status and performance of the services they own and use that data for self-healing and for optimizing how and where those services run.
  • Centralized data collection for microservices, which often includes automated alerts and control loops. You see a lot of new systems like this, including Splunk, OpenTelemetry, and Pyroscope, as well as the internal data collection systems of all the big cloud vendors. These systems all try to centralize as much data as possible about how services run: not just logs and metrics, but also lower-level data like execution traces and CPU/RAM profiling data.
  • Rich metrics in modern hardware. Before 2010, you were lucky if your hardware reported clock speeds and temperatures. Nowadays, hardware components are overflowing with data: every CPU core reports not just temperature but also power usage, you see similar things on GPUs, and tools like nvitop are critical for modern GPGPU operations. Even individual RAM DIMMs report temperature data now. The most impressive part is that CPUs now use their own internal metrics, like temperature, silicon quality, and power usage, to run more efficiently, as you see with AMD's CPPC system.
  • Of course, I said this wasn't an AI hype post, but I think the use of neural networks to enhance user interfaces is definitely part of this. The way social media uses neural networks to change what's shown to the user, the upcoming "AI search" in Windows, and the way all this usage data is fed back into neural networks make me think that even user-facing computer systems will start to adapt to changing conditions using data science.
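
As promised above, here's a toy reconciliation loop in the spirit of the Kubernetes horizontal pod autoscaler: observe a metric, compare it to a target, act, repeat. The metric source and actuator are stand-ins, and the target and iteration count are arbitrary:

```python
import math
import random
import time

def get_cpu_utilization() -> float:
    # Stand-in metric source; a real loop would scrape Prometheus,
    # cgroup stats, or the metrics-server instead.
    return random.uniform(0.2, 0.9)

def set_replicas(n: int) -> None:
    # Stand-in actuator; a real orchestrator would patch a Deployment.
    print(f"scaling to {n} replicas")

TARGET = 0.60  # desired average CPU utilization
replicas = 2
for _ in range(10):  # a real controller loops forever
    util = get_cpu_utilization()                           # observe
    desired = max(1, math.ceil(replicas * util / TARGET))  # decide
    if desired != replicas:
        set_replicas(desired)                              # act
        replicas = desired
    time.sleep(1)
```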

I have been kind of thinking about this "trend" for a while, but this announcement that ACPI is now adding hardware health telemetry inspired me to finally write up a bit of a description of this idea.

What do people think? Have other people seen this trend toward self-adapting systems? Is this an oversimplification of computer engineering?
