The problem is the execution-time verification (which calls home), not the compile-time signing. Think about it: the exec(3) family of functions effectively connects to the internet! This is absolutely crazy, yet folks don't seem to care.
Continuing with your analogy, it would be as if, every time you enter your home with your own key, wearing different clothes, the lock sends your photo to Apple for "verification", and Apple in turn has the power not to open your own door if it so decides. Would you find that acceptable?
You ought to have focused on the runtime issue instead of mentioning compile-time signing.
But even then, your description doesn't match what it actually does. If there's no network, or you block that domain, it assumes everything is fine (and it doesn't even run this check on every launch). So, returning to the analogy, it's like moving into an apartment and knowing that previous tenants can't use a copy of their old key to let themselves in.
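For reference, the domain in question is ocsp.apple.com, and "blocking that domain" has typically been done with a hosts-file entry. This is only a sketch: whether Apple's own services honor /etc/hosts has varied across macOS versions, so treat it as illustrative rather than a guaranteed off switch.

```
# /etc/hosts
0.0.0.0 ocsp.apple.com
```

With the responder unreachable, the check fails open, which is exactly the behavior described above.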
This basically describes the locks on modern hotel rooms. The main reason you don't want it in a residential rental is that the hardware breaks far too often.
I find it strange that you liken the OS to a hotel. Shouldn't it be your house?
But even in an Airbnb, the owner is not allowed to record you secretly; even disclosed recording is very creepy, and they certainly cannot put cameras in the toilet, over your bed, etc.
On macOS, however, it is way worse than that. You buy a laptop that you own. Then you write a program in C and you compile it. Now, when you try to run your program on your computer, the operating system betrays you: it tries to send data to Apple to check whether you are allowed to run your own program. Moreover, this is the default behavior. This is extremely creepy and outrageous, and it should be illegal if it isn't already. The programmers who coded this betrayal, and the managers who ordered them to do so, should be tried and put in jail for their shameful malpractice and violation of basic human rights.
EDIT: The technical inconvenience is a minor one; you only notice it, if at all, when you are on an unstable Wi-Fi connection, and then it is just a minor nuisance. Even with a flawless implementation, however, the main problem would remain: new binaries on macOS "call home" to ask for permission to run on your own computer. This is totally unacceptable as a matter of principle.
As I said, the main reason you don't want it in a residential rental is that the hardware breaks far too often. If it were reliable, I wouldn't mind having this as a door lock at home. Well, not by itself at any rate, though I would still expect the YouTuber known as The Lockpicking Lawyer to be able to exploit it.
The fundamental problem with computers is that they do exactly what they are told to do, not what we think we're asking them to do, so this kind of thing is pretty useful for preventing (or at least limiting) malicious apps. Without some signing and signature checking, I wouldn't be comfortable with an internet-connected machine these days, and the internet is sufficiently useful that I would rather not be forced to solve malware by not being online.
For all the imperfections in systems of this world, opining that the developers of this particular system "should be tried and put in jail for their shameful malpractice and violation of basic human rights" feels like as much of an overreaction as some of the graffiti near my apartment which says the police are legalised mafia.
In November 2020, Apple's OCSP responder became slow and partially unavailable for a while. As a result, Mac apps either failed to launch or took a very long time to launch, because trustd was connecting to the responder on app launch but not getting a timely response.
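The failure mode here is worth spelling out. The check is "soft-fail": a responder that can't be reached at all is treated as a pass, but a responder that accepts the connection and then answers slowly makes the launch wait. Here is a toy model of that logic (my own sketch, not Apple's actual code; the host, port, and return strings are illustrative):

```python
import socket

def revocation_check(host: str, port: int = 80, timeout: float = 5.0) -> str:
    """Toy model (NOT Apple's code) of a soft-fail revocation check:
    an unreachable responder is treated as a pass, while a reachable
    but slow responder makes the caller wait for a verdict."""
    try:
        conn = socket.create_connection((host, port), timeout=timeout)
    except OSError:
        # No network, domain blocked, or connection refused:
        # fail open, i.e. assume the binary is fine.
        return "allowed: responder unreachable, assumed fine"
    # The responder accepted the connection, so we wait for its verdict.
    # In the November 2020 incident this is where launches hung:
    # connections succeeded, but replies were very slow.
    conn.close()
    return "verdict pending from responder"

# A closed local port fails fast, so the "launch" proceeds immediately:
print(revocation_check("127.0.0.1", port=1, timeout=0.5))
# → allowed: responder unreachable, assumed fine
```

This is why both "no network" and "domain blocked" feel fine day to day, while a degraded-but-reachable responder is the one case that visibly hurts.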