brick

joined 1 year ago
[–] [email protected] 4 points 6 months ago

FLACs/Qobuz via Roon. I spend the most time in my office so that’s where my favorite setup is. LS50 Metas + SVS SB-1000 Pro + Peachtree GaN stack.

I also love my HD660s with the Bottlehead Crack tube amp I built.

[–] [email protected] 1 point 6 months ago

In most cases.

And likewise, it’s not as if they can just boot you either. Y’all are stuck with each other, but I’m not sure how that’s related to the fact that everyone who lives in a house contributes to the mess and should also contribute to the cleaning.

[–] [email protected] -3 points 6 months ago

Yes, reality can be hard to deal with. Over here in the real world, you’re choosing between Biden and Trump. Pretty much everyone here agrees with you that it’s not ideal. On your ballot there will be several candidates, but they are all either Biden or Trump. Guess who Dr. Cornel West is?

[–] [email protected] 2 points 6 months ago

You’re out of your league intellectually. How dare you talk to someone with ALL THE ANSWERS like this!?

[–] [email protected] 30 points 7 months ago (1 children)

The selling point for M365 Copilot is that it is a turnkey AI platform that does not use data input by its enterprise customers to train generally available AI models. This keeps their internal data from being output to randos using ChatGPT. OpenAI definitely does use ChatGPT conversations to further train ChatGPT, so there is a major risk of data leakage.

Same situation with all other public LLMs. Microsoft’s investments in OpenAI aren’t really relevant in this situation.

[–] [email protected] 0 points 7 months ago* (last edited 7 months ago)

So sorry to interrupt your circlejerk about this guy’s opinion on 3d V-Cache technology with a tangentially related discussion about 3d V-Cache technology here on the technology community.

I fully understand the point you’re trying to make here, but just as you think my comments added nothing to the discussion, your replies to them added even less.

[–] [email protected] -3 points 7 months ago (2 children)

I was comparing the 7950X and the 7950X3D because those are the iterations available right now and, as I mentioned, the ones I have personally been comparing. I apologize if I wasn’t clear enough on that point.

My point was that the essence of the take, which I read to be “CPUs with lower clocks but way more cache only offer major advantages in specific situations,” is not particularly off base.

[–] [email protected] 7 points 7 months ago (1 children)

In your mind, do you really think that is the intention here? Seems more like a convenience for people who use both Linux and Windows.

I have to use both so I welcome it.

[–] [email protected] 5 points 7 months ago

You would want to look for an R730, which can be had for not too much more. The 20 series was the “end of an era” and the 30 series was the beginning of the next era. Most importantly for this application, R30s use DDR4 whereas R20s use DDR3.

RAM speed matters a lot for ML applications and DDR4 is about 2x as fast as DDR3 in all relevant measurements.

If you’re going to offload any part of these models to CPU, which you 99.99% will have to do for a model of this size with this class of hardware, skip the 20s and go to the 30s.
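
To make the offload point concrete, here’s a minimal sketch using llama-cpp-python; the model file, layer split, and thread count are placeholder assumptions, not a tested config for an R730. Whatever layers don’t fit in VRAM run on the CPU, and that’s exactly where the DDR3-vs-DDR4 bandwidth difference shows up, since every generated token streams those layers’ weights out of system RAM.

```python
# Minimal sketch of partial CPU/GPU offload with llama-cpp-python.
# Model path, layer count, and thread count are illustrative assumptions,
# not recommendations for any specific server build.
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-70b.Q4_K_M.gguf",  # hypothetical quantized model file
    n_gpu_layers=20,   # layers kept in VRAM; everything else runs on the CPU
    n_threads=16,      # CPU threads used for the offloaded layers
    n_ctx=4096,        # context window
)

out = llm("Why does memory bandwidth matter for CPU inference?", max_tokens=64)
print(out["choices"][0]["text"])
```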

[–] [email protected] 4 points 8 months ago (2 children)

The Ford Mach-E is excellent. I have also heard great things about Kia/Hyundai, VW, and Volvo EVs.

In 2016 I drove a Tesla Model S P85D and was surprised at how crappy the interior was considering it was a six-figure car. And I don’t mean minimalist, I mean poor quality.

Back then, Tesla was your only real option. Today, there’s a lot of great competition in the market.
