MajorasMaskForever

joined 1 year ago
[–] [email protected] 1 point 4 months ago

I'm not really feeling it

[–] [email protected] 43 points 4 months ago (1 children)

For graphics, the problem to be solved is that the N64 compiled code is expecting that if it puts value X at memory address Y it will draw a particular pixel in a particular way.

Emulators solve this problem by having a virtual CPU execute the game code (kinda difficult), and then emulator code reads the virtual memory space the game code is interacting with (easy), interprets those values (stupid crazy hard), and replicates the graphical effects using custom code/modern graphics API (kinda difficult).

This program decompiles the N64 code (easy), searches for known function calls that interact with the N64 GPU (easy), swaps them with known-valid modern graphics API calls (easy), then compiles for the local machine (easy). Knowing what function signatures to look for and what to replace them with in the general case is basically downright impossible, but because a lot of N64 games used common code, if you go through the laborious process for one game, you get a bunch extra for free or with way less effort.
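A minimal sketch of that swap step. The N64 macro names below are real ones from the SDK's gbi.h, but the "modern" replacement names are invented for illustration, and the real tooling works on decompiled C with proper signature matching, not naive text substitution:

```python
# Hypothetical sketch of the call-swapping idea. The gSP*/gDP* names are
# real N64 SDK display-list macros; the modern_* names are made up here.
KNOWN_SWAPS = {
    "gSPDisplayList": "modern_submit_display_list",
    "gDPSetTextureImage": "modern_bind_texture",
}

def swap_gpu_calls(decompiled_lines):
    """Replace recognized N64 GPU calls with modern-API shims."""
    out = []
    for line in decompiled_lines:
        for old, new in KNOWN_SWAPS.items():
            if old in line:
                line = line.replace(old, new)
        out.append(line)
    return out
```

The hard part the comment describes is building that mapping table in the first place; once it exists for one game's shared libraries, it transfers to other games built on the same code.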

As one of my favorite engineering phrases goes: the devil is in the details

[–] [email protected] 18 points 6 months ago (1 children)

I'm genuinely curious how saying that Linux GUI desktop has issues equates to gargling Microsoft's balls?

[–] [email protected] 20 points 6 months ago (14 children)

So many people forget that while they understand how to use a Linux terminal and how Linux on a high level works, not everyone does. Plus, learning all of that takes time, effort, and tenacity, which not everyone is willing to do. Linus's whole conclusion was that as long as that learning curve exists and as long as it's that easy to shoot yourself in the foot, Linux desktop just isn't viable for a lot of people.

But Linus has had a lot of public fuck-ups, therefore everything he says must be inherently wrong.

[–] [email protected] 1 point 7 months ago

I think part of the "what do I do with this" factor for the iPad was that Apple (and other companies still to this day) were so hell bent on making everything smaller and more compact that releasing a larger product was marketing whiplash. Not to mention that smartphones were being pitched as this "do everything device" so why would you need anything else?

After you get past that marketing sugarcoating, it becomes pretty obvious what you'd use an iPad for: internet and media consumption at a larger scale than your phone, easier on the eyes, while retaining at least some of the lightweight form factor that separates it from a regular laptop. Sure, you didn't have the stick-it-in-your-pocket advantage of a phone or the full keyboard and computational power of a laptop, but there was this in-between where, for a modest fee, you could have the conveniences if you could live with (or ignore) the sacrifices.

[–] [email protected] 9 points 7 months ago

I don't think the MacBook Air's launch is a good comparison.

Sure, there was an early adopter tax on being one of the first "thin and light" laptops, but people already knew what you could use a MacBook for, and there was already a large value proposition in having one; the extra cost was entirely for being more portable than its full-size counterparts. Everything you could do on a Mac, just way easier to take on the go.

I've read a few reviews of it and watched MKBHD's initial review, and outside of a few demo apps they point to the Vision Pro having no real point to it. If that's true, then it falls in line with existing VR headsets that are a fraction of its cost, and in a niche market, being three times the cost of your competitors is not a good position to be in.

[–] [email protected] 2 points 8 months ago

Oh yeah, I pulled 20 years out of my ass. I could see some manager there saying to plan for it even though all the engineers expect a much shorter lifetime.

[–] [email protected] 7 points 8 months ago (5 children)

The issue with an ongoing service is that the longer the service is used, the more it costs Kia. The larger the time boxes Kia uses, the bigger the number gets, and the more you're going to scare off customers.

Using Kia's online build-and-price tool, it looks like the most expensive Telluride you can get right now is $60k MSRP, and the cheapest is $30k.

Let's assume Kia estimates the average lifetime of a Telluride to be 20 years, so they create an option to purchase this service one time for the "lifetime" of the vehicle. Taking the pricing Kia has listed in good faith, using that $150 annual package, and assuming the price goes up 10% every year (what Netflix, YouTube, etc. have been doing), across those twenty years you're looking at around an $8.5k option. At the top trim that's still 14% extra, which is going to make some buyers hesitant; at the base model it's 28% more expensive.
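Sanity-checking that back-of-the-envelope math (the $150/year, 10% annual growth, and 20-year figures are the assumptions stated above, not anything Kia has published):

```python
# One-time "lifetime of the vehicle" price: $150/year growing 10% per year,
# summed over 20 years as a geometric series: 150 * (1.1^20 - 1) / 0.1
annual = 150.0
growth = 1.10
years = 20

lifetime = sum(annual * growth**k for k in range(years))

print(f"${lifetime:,.0f}")                              # $8,591 -- the ~$8.5k figure
print(f"{lifetime / 60_000:.1%} of a $60k top trim")    # 14.3%
print(f"{lifetime / 30_000:.1%} of a $30k base model")  # 28.6%
```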

Enough buyers will scoff at that, so Kia can either ditch the idea entirely (and eat the cost of the initial development without ever making it back), or find some way to repackage that cost so it looks like something buyers are willing to deal with.

To me the bigger issue is the cost of the service versus what you're getting. Server time + dev team + mobile data link cannot be costing Kia more than a few million annually (mid-to-high hundreds of thousands is more likely), so they must not be expecting that many people to actually pay for any of this.

[–] [email protected] 3 points 8 months ago (1 children)

It's IEEE misinterpreting the guy's original paper.

https://liuyang12.github.io/proj/privacy_dual_imaging/ (can't find the full paper, but here's the abstract at least)

The paper's author straight up says the light sensor is impractical to use as an attack vector, but when you use it in conjunction with other sensors you might be able to glean more information than most might think. It leaves me with the question: what other sensors can you combine to start getting behavioral information that is a security threat?

I'll say it worked for me. I read the IEEE headline, called bullshit, dug into it, and yeah, you can only get a tiny bit of information that you have to stretch pretty far to draw useful conclusions from... but it's more than the zero I initially thought. So props to the paper's author, he met his goal. IEEE wanted sensationalized clicks, which they unfortunately also got.

[–] [email protected] 8 points 8 months ago (1 children)

Even in P2P you'll still need someone to tell you what other IP addresses are in the group you're trying to join, and you have to know the IP address of that someone. You're not going to scan the entire internet to figure out who else is attempting to play the exact same game as you; that would take literal days every time (assuming you rule out anyone on IPv6; if you include them, that suddenly becomes millions of years).

Even in P2P you will need to hit a commonly known and trusted resource to tell you what other IP addresses you need to go talk to.
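For scale, a rough sketch of why brute-force peer discovery is hopeless. The 10,000 probes/second rate is my assumption (real scanning from one host would be slower, noisier, and would get you blocked):

```python
# How long to probe every address at an assumed 10k probes/second.
probes_per_sec = 10_000

ipv4 = 2**32    # ~4.3 billion addresses
ipv6 = 2**128   # ~3.4e38 addresses

ipv4_days = ipv4 / probes_per_sec / 86_400
ipv6_years = ipv6 / probes_per_sec / (86_400 * 365)

print(round(ipv4_days))       # ~5 days for all of IPv4
print(f"{ipv6_years:.1e}")    # ~1.1e+27 years -- "millions of years" undersells it
```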

[–] [email protected] 1 point 10 months ago

I'm also curious how many people in this thread have ever been involved in product development and are actual trained/professional software devs. Because not only are some of these comments absolutely ridiculous from a business perspective, they make zero sense from a technical perspective too.

Proprietary file formats show up because oftentimes the needs of the system don't line up with CSV, JSON, or raw text, or they hit some performance problem where you literally can't write that much data to disk, so you have to come up with something different.

There's also the fact that a computer program from the last 50 years is, except in extreme circumstances, never truly on its own. That microscope control software is completely dependent on how Win95 works, and is almost certainly reliant on some old DOS kernel behavior that was left over in early Windows, which Microsoft later completely ripped out starting with Vista (tossed back in for Win7 because so many people complained, then ripped back out in 8, which no one seemed to care about).

And it's not just Microsoft that pulls this; even Lemmy's darling Linux has deprecated things over the years, because even in open source projects it's unmaintainable to keep everything working forever.
