gedhrel

joined 1 year ago
[–] [email protected] 5 points 3 weeks ago

"Maybe our friend doesn't like monads."

[–] [email protected] 6 points 1 month ago (1 children)

Excel famously misidentifies all trees as dates.

[–] [email protected] 1 points 1 month ago (1 children)

You joke, but watch this:

https://archive.org/details/take-me-to-titanic

from 29 minutes in. A last-minute adjustment before launch plugged a thruster in backwards; there was no protocol to check the behaviour pre-launch. They discovered it when they got to the bottom.

[–] [email protected] 1 points 1 month ago

That depends entirely on the ability to execute change. CTO is the role that should be driving this.

[–] [email protected] 3 points 1 month ago (2 children)

Developers aren't the ones at fault here.

[–] [email protected] 5 points 1 month ago (4 children)

Possibly the thing that was intended to be deployed was tested. What got pushed out was 40kB of all zeroes. It could've been corrupted somewhere down the CI chain.

[–] [email protected] 22 points 1 month ago (6 children)

Check CrowdStrike's blurb about the 1-10-60 rule.

You can bet that they have a KPI that says they can deliver a patch in under 15m; that can preclude testing.

Although testing would have caught it, what happened here is that 40k of nuls got signed and delivered as config. Which means that unparseable config on the path from C&C to ring 0 could cause a crash and was never covered by a test.

It's a hell of a miss, even if you're prepared to accept the argument about testing on the critical path.

(There is an argument that in some cases you want security systems to fail closed; however that's an extreme case - PoS systems don't fall into that - and you want to opt into that explicitly, not due to a test omission.)
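A minimal sketch of the missing check (hypothetical names and structure, not CrowdStrike's actual code): validate a delivered config blob before handing it to the privileged parser, and fail safe by keeping the previous config rather than crashing.

```rust
// Hypothetical sketch: reject an unparseable (e.g. all-zero) config payload
// up front instead of letting it crash the kernel-side consumer.
#[derive(Debug)]
struct Config {
    rules: Vec<u8>,
}

fn parse_config(raw: &[u8]) -> Result<Config, &'static str> {
    if raw.is_empty() || raw.iter().all(|&b| b == 0) {
        return Err("unparseable: empty or all-zero payload");
    }
    Ok(Config { rules: raw.to_vec() })
}

fn main() {
    let bad = vec![0u8; 40 * 1024]; // 40kB of nuls, as in the incident
    match parse_config(&bad) {
        Ok(c) => println!("loaded {} rule bytes", c.rules.len()),
        // Fail safe: log and keep running on the previous config.
        Err(e) => eprintln!("rejected update: {}", e),
    }
}
```

The point isn't the validation logic itself; it's that a test feeding garbage through this path would have surfaced the crash before delivery.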

[–] [email protected] 1 points 2 months ago (1 children)

...unless it's running software that uses signed 32-bit timestamps, or stores data using that format.

The point about the "millennium bug" was that it was a category of problems that required (hundreds of) thousands of fixes. It didn't matter if your OS was immune, because the OS isn't where the value is.
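For concreteness, the signed 32-bit timestamp failure mode (the Y2038 analogue of the millennium bug) in a few lines:

```rust
fn main() {
    // Seconds since the Unix epoch stored in a signed 32-bit field.
    let t: i32 = i32::MAX; // 2038-01-19 03:14:07 UTC
    let next = t.wrapping_add(1); // one second later: wraps negative
    assert_eq!(next, i32::MIN); // i.e. back to December 1901
    println!("{} + 1s -> {}", t, next);
}
```

As with Y2K, the OS being immune doesn't help if application data formats and embedded firmware still carry the 32-bit field.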

[–] [email protected] 29 points 5 months ago (2 children)

Casey's video is interesting, but his example is framed as moving from 35 cycles/object to 24 cycles/object being a 1.5x speedup.

Another way to look at this is, it's an 11-cycle speedup per object.

If you're writing a shader or a physics sim this is a massive difference.

If you're building typical business software, it isn't; meanwhile, that 10,000-line monster method does crop up, and it's a maintenance disaster.

I think extracting "clean code principles lead to a 50% cost increase" is a message that needs taking with a degree of context.
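Spelling out the arithmetic (the 3 GHz clock and per-frame object count below are illustrative assumptions, not numbers from the video):

```rust
fn main() {
    let before = 35.0_f64; // cycles per object
    let after = 24.0_f64;
    let speedup = before / after; // ~1.46x
    let saved = before - after;   // 11 cycles per object
    // At an assumed 3 GHz, across a million objects per frame:
    let ms = saved * 1.0e6 / 3.0e9 * 1.0e3; // ~3.7 ms of a 16 ms frame budget
    println!("{:.2}x speedup, {} cycles saved, {:.1} ms/frame", speedup, saved, ms);
}
```

Which is why the same saving is enormous in a shader or physics loop and close to noise in a CRUD request handler.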

[–] [email protected] 1 points 5 months ago

The test case purported to be bad data, which you presumably want to test the correct behaviour of your dearchiver against.

Nothing this did looks to involve memory safety. It uses features like ifunc to hook behaviour.

The notion of reproducible CI is interesting, but there's nothing preventing this setup from repeatedly producing the same output in (say) a debian package build environment.

There are many signatures here that look "obvious" with hindsight, but ultimately this comes down to establishing trust. Technical sophistication aside, this was a very successful attack against that trust foundation.

It's definitely the case that the stack of C tooling for builds (CMakeLists.txt, autotools) makes obfuscating content easier. You might point at modern build tooling like cargo as an alternative - however, build.rs and proc macros are not typically sandboxed at present. I think it'd be possible to replicate the effects of this attack using that tooling.

[–] [email protected] 5 points 10 months ago

What are the permissions on the directory? What command are you running to edit the file? What command are you running to delete it? (Have you got SELinux turned on? What filesystem is this directory on?)

[–] [email protected] 2 points 10 months ago

It's all the files. Content-addressable storage means that they might not take up any more space. Smart checkout means they might not require disk operations. But it's the whole tree.
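A toy illustration of why content-addressable storage makes identical files essentially free (std's `DefaultHasher` stands in for git's content hash; this is a sketch, not git's implementation):

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::HashMap;
use std::hash::{Hash, Hasher};

// Content-addressable store: the key is derived from the content itself,
// so two files with identical bytes map to a single stored blob.
fn key(content: &[u8]) -> u64 {
    let mut h = DefaultHasher::new();
    content.hash(&mut h);
    h.finish()
}

fn main() {
    let mut store: HashMap<u64, Vec<u8>> = HashMap::new();
    // Two "files" in the tree with the same content...
    store.insert(key(b"hello"), b"hello".to_vec());
    store.insert(key(b"hello"), b"hello".to_vec());
    // ...occupy one entry: stored once, referenced many times.
    assert_eq!(store.len(), 1);
}
```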
