My concern is more that, if he gets elected, he might try to justify "emergency powers" by citing political violence, as history has shown with other authoritarians.
Dark_Arc
Sure, there's a cost to breaking things up; any multiprocessing or multithreading comes at a cost. That said, in my evaluation, single-file "unity builds" are garbage; sometimes a few unity files are used instead to get some multiprocessing back (... as the GitHub you mentioned references).
They're mostly a way to minimize the number of translation units, so you don't have the "I changed a central header that all my files include and now I need to rebuild the world" problem (with a world made of many, many small translation units). This is arguably worse on Windows, because process spawning is more expensive there.
Unity builds as a whole are very, very niche, and you're almost always better off doing a more targeted analysis of where your build (or, often more importantly, your incremental build) is expensive and making appropriate changes. Note that large C++ projects like LLVM, Chromium, etc. do NOT use unity builds (almost certainly because they are not more efficient in any sense).
I'm not even sure how they got started; presumably they were mostly a way to get LTO-style optimization without actual LTO. They're absolutely awful for incremental builds.
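For concreteness, the "few files" compromise above is directly supported by CMake (3.16+), which can generate batched unity translation units itself; the target name here is hypothetical:

```cmake
# Hypothetical target "my_sim": instead of one TU per .cpp (or one giant
# TU for everything), CMake combines sources into generated "unity" files
# of ~8 sources each, keeping some build parallelism.
set_target_properties(my_sim PROPERTIES
  UNITY_BUILD ON
  UNITY_BUILD_BATCH_SIZE 8
)
```

The batch size is the knob that trades per-TU overhead against parallelism and incremental-rebuild granularity.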
Slow compared to what exactly...?
The worst part about headers is needing to reprocess the whole header from scratch in every translation unit ... but precompiled headers largely solve that (or just using smaller, more targeted header files).
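As a concrete example of the precompiled-header route (assuming a CMake build; the target name and project header are hypothetical):

```cmake
# CMake 3.16+: generate a precompiled header containing these and reuse it
# for every TU in the target, so <vector> et al. aren't re-parsed per file.
target_precompile_headers(my_sim PRIVATE
  <vector>
  <string>
  "common.hpp"  # hypothetical project-wide header
)
```

The usual caveat applies: put only stable, widely-included headers in the PCH, or you reintroduce the "touch one header, rebuild the world" problem.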
Even in those cases there's something to be said for the extreme parallelism of a classic C++ build. You give some of that up with modules in exchange for better code organization; in some cases that helps build times, but I've heard in others it hurts them (a fair bit of that might just be inexperience with the feature/best practices and immature implementations, but alas).
There's no precompiler in C++. There's a preprocessor, but that's something entirely different. It's also typically not a slow portion of the compile process.
C++ is getting to the point where modules might work well enough to do something useful with them; they remove the need for #include preprocessor directives to share code.
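A minimal sketch of what that looks like (two files shown in one listing; the module name is made up, and compiler support and build flags still vary):

```cpp
// greeter.cppm -- module interface unit; replaces a header.
module;                 // global module fragment: old-style includes go here
#include <string>
export module greeter;  // no include guards, and macros don't leak out

export std::string greet(const std::string& name) {
    return "hello, " + name;
}

// main.cpp -- consumer: no #include of project code needed.
import greeter;

int main() {
    return greet("world").empty() ? 1 : 0;
}
```

Unlike a header, the interface is compiled once into a binary module interface and then imported, rather than re-parsed as text by every translation unit.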
That's a false equivalence. Building up society as a whole is better than trying to determine "the most relevant" voices.
It's not a new problem; I remember back in 2016 using reverse image search to look up one MAGA supporter on Twitter who seemed EXTREMELY enthusiastic.
I ended up finding a girl in Brazil who had presumably never set foot in "Nebraska," where this MAGA supporter was allegedly "born and raised."
That's when you use different exit codes: e.g., 1 for a failure during the simulation (the tool itself broke), 2 for the simulation running but failing.
Shame they wouldn't listen.
What did you hate about it? I mean, CentOS is fine other than IBM killing it.
By integrating everything into it, it has become a good enough medium of communication for almost everything.
Except that's not at all what we've done.
The only reason English dominates is that it's the dominant language of the world superpowers following World War II. It's not because of some special design, principle, or properties.
English isn't just "make up whatever rules and put them wherever", particularly formal English which is what we're talking about in the context of education.
Really, a better argument against changing the spelling is the classic "standards" xkcd: now you're just making another dialect of English that spells words differently, and it needs to be adopted, fracturing the language further.
Language will evolve with or without direction. We have the structure, in the form of schools, to evolve it with direction in the name of making things more consistent and intuitive. We should use it; that's all.
The old "why try to do anything because it will never be perfect" argument never holds water.
Yeah, this is either projection (because they didn't see the mainstream media doing it) or an attempt to drive a wedge and create controversy where there shouldn't be any.