The extinction of the VX scene is a really sad development. Current malware is boring, not even remotely innovative, and has reached a new low with the emergence of ransomware. Over a decade ago, 29A was already writing viruses more sophisticated than all the boring low-quality garbage distributed today.
Not that Mirai and WannaCry aren't effective, but technically they are at the lowest level.
The NSO Group is a happy exception, with its iMessage exploits.
Meh. You sound like one of those "the code's not elegant enough to ship yet" types. It's malware. If it achieves the goals intended, ship it. Get what you can while you can. It's not state level infiltration built for years of spying without being noticed. It's the level of script kiddies trying to get some street cred, maybe some scratch to spend, or whatever else for the lulz.
The old adage "know your audience" seems apropos here.
Compared to laptops running Windows, I feel a little better with MacBooks in regards to telemetry and data collection. Apple primarily makes money off physical hardware sales and software subscriptions; they're not in the business of selling your information.
The problem is the execution-time verification (which calls home), not the compile-time signing. Think about it: the exec(3) function of the OS connects to the internet! This is absolutely crazy stuff, yet folks don't seem to care.
Continuing with your analogy, it would be as if, every time you enter your home with your key, wearing different clothes, the lock sends your photo to Apple for "verification", which in turn has the power not to open your own door if it so decides. Would you find that acceptable?
You ought to have focused on the runtime issue instead of mentioning compile time signing.
But even then, your description isn't what it does. If there's no network or you block that domain it assumes everything's fine (and it doesn't even do this test on all launches), so returning to the analogy it's like moving into an apartment and knowing that previous tenants can't use a copy of their old key to let themselves in.
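For what it's worth, that fail-open behavior is why the commonly cited workaround is simply to null-route the OCSP host. A sketch (assuming the check still fails open on your macOS version, and that the hostname is still the one reported in the Catalina/Big Sur era):

```shell
# Hypothetical workaround sketch: null-route Apple's OCSP responder so
# the verification request can never complete. Because the check fails
# open, app launches then proceed as if verification had succeeded.
# Needs root; edits the system hosts file.
echo "0.0.0.0 ocsp.apple.com" | sudo tee -a /etc/hosts
```

Whether you consider this a fix or just a way to opt out of a check you never asked for is, of course, the whole debate.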
This basically describes the locks on modern hotel rooms. The main reason you don't want it in a residential rental is that the hardware breaks far too often.
I find it strange that you liken the OS to a hotel. Shouldn't it be your house?
But even in an Airbnb, the owner is not allowed to record you secretly; even overt recording would be very creepy, and they certainly cannot put cameras in the bathroom, over your bed, etc.
On macOS, however, it is way worse than that. You buy a laptop that you own. Then you write a program in C and compile it. Now, when you try to run your program on your computer, the operating system betrays you and tries to send data to Apple to check whether you are allowed to run your own program. Moreover, this is the default behavior. This is extremely creepy, outrageous and should be illegal if it isn't already. The programmers who coded this betrayal, and the managers who ordered them to do so, should be tried and put in jail for their shameful malpractice and violation of basic human rights.
EDIT: The technical inconvenience is a minor one; you only notice it, if at all, when you are on an unstable Wi-Fi connection, and then it is just a minor nuisance. Even with a flawless implementation, however, the main problem would remain: that new binaries on macOS "call home" to ask for permission to be run on your own computer. This is totally unacceptable as a matter of principle.
As I said, the main reason you don't want it in a residential rental is that the hardware breaks far too often — if it were reliable, this wouldn't be a problem for me to have as a door lock at home. Well, not by itself at any rate, though I would still expect the YouTuber known as The Lockpicking Lawyer to be able to exploit it.
The fundamental problem with computers is that they do exactly what they are told to do, not what we think we're asking them to do, so this kind of thing is pretty useful for preventing (or at least limiting) malicious apps. Without some signing and signature checking, I wouldn't be comfortable with an internet-connected machine these days, and the internet is sufficiently useful that I would rather not be forced to solve malware by not being online.
For all the imperfections in systems of this world, opining that the developers of this particular system "should be tried and put in jail for their shameful malpractice and violation of basic human rights" feels like as much of an overreaction as some of the graffiti near my apartment which says the police are legalised mafia.
In November 2020, Apple's OCSP responder became slow/unavailable for a bit. As a result, Mac apps were either not loading or taking a very long time to load, because trustd was trying to connect on app launch and getting a connection but no quick response. Since the connection itself succeeded, the fail-open path that handles a missing network never kicked in.