Bitrot

joined 1 year ago
[–] [email protected] 1 points 1 day ago

Harder on the corporate side, but this has been an issue in the warehouses.

[–] [email protected] 1 points 1 week ago (1 children)

I thought it was Surge that had that unfortunate side effect.

[–] [email protected] 3 points 2 weeks ago (1 children)

If they arrest someone to gain access to their key, they don't need this attack; they can just use the key directly.

[–] [email protected] 4 points 2 weeks ago* (last edited 2 weeks ago)

One thing the article doesn't make very clear is that for 2FA, the PIN requirement comes from the site itself. If the site requires User Verification, the PIN is required. If not, the PIN isn't prompted even if one is set, and this attack is possible. The response to the site just says the user verified it.
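Roughly what that looks like on the web side, as a minimal sketch assuming a standard browser WebAuthn flow (the `getAssertion` helper and the placeholder challenge are illustrative, not from the article):

```typescript
// Sketch: the site, not the key, decides whether a PIN (User Verification)
// is required when requesting a WebAuthn assertion.
async function getAssertion(requireUV: boolean): Promise<Credential | null> {
  const options: PublicKeyCredentialRequestOptions = {
    challenge: crypto.getRandomValues(new Uint8Array(32)), // normally sent by the server
    // "required"    -> the authenticator prompts for its PIN
    // "discouraged" -> no PIN prompt, even if a PIN is set on the key
    userVerification: requireUV ? "required" : "discouraged",
    timeout: 60_000,
  };
  return navigator.credentials.get({ publicKey: options });
}
```

That "just says" is literal: user verification comes back to the site as a single flag (UV) inside the signed authenticator data.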

It is different for Passkeys: those are stored on the device and physically locked behind the PIN. This attack only applies to 2FA where the username and password are already known. (Technically it's more than that, but for most people walking around with a Yubikey…)

It also seems limited in scope to the targeted site, not everything else protected by that specific Yubikey. That limits how useful the attack is in general, which is another reason it's essentially nation-state level or an extremely targeted attack. It's not something your local law enforcement is going to use.

I think the YubiHSM is a much more appealing target, but that isn’t so much a consumer device and has its own authentication methods.

[–] [email protected] 1 points 2 weeks ago

The bot demonstrated very well what this article is about. I don't know the internals, but I also can't imagine the bot was using the best and most expensive methods of analysis.

It was pretty bad at "getting the point" even when the point was obvious; a better system should be able to do that. Sometimes the point is more difficult to discern and takes some judgement. You can see this in comments sometimes, where people discuss what "the point" was and not just the data. I imagine an AI would have particular difficulty deciding what is worth summarizing in those situations.

[–] [email protected] 3 points 2 weeks ago

Different applications perform better on one versus the other. Google Cloud still offers a lot of Nvidia options.

[–] [email protected] 20 points 2 weeks ago

I was confused about how a resume or application would be much affected, but the article points out that software is now often used to screen applicants' social media as part of hiring (which is awful).

The bias it showed when determining guilt or weighing consequences for a crime is concerning as more law enforcement agencies integrate black-box algorithms into investigative work.

[–] [email protected] 18 points 3 weeks ago

Amazon is notorious for commingling stock; "the seller" often doesn't matter.

[–] [email protected] 24 points 3 weeks ago* (last edited 3 weeks ago)

I think this is the crux of the article. In the past, most people considered photographic evidence very convincing. Sure, you could be removed from a photo of Stalin, and later people could Photoshop images (with varying realism), but now it takes just a few words to make changes that many people will believe without hesitation. Soon it will happen to video too, very soon.

Most people are not ready for it. Even shitty AI photos on social media get huge reactions, with barely a handful of people calling them out.

[–] [email protected] 9 points 1 month ago (1 children)

I think you explained it fine; it just doesn't make sense to people who only ever go to the same place.

[–] [email protected] 5 points 1 month ago

Difficult to fix if exploited.

Can be patched beforehand.

[–] [email protected] 5 points 1 month ago* (last edited 1 month ago)

Because people learn how to write as young children, in places that typically have no experience with proper technique for left-handed people, and re-learning how to write is not as easy as just deciding that a different way would be better.

It’s probably good you don’t say the “you know” because that would be dumb as fuck.
