The disagreement here seems to be "how relevant are the real-world costs and profitability borne by Ethereum node operators and validators to the overall 'costs' of running Ethereum?"
You seem to think that is the be-all and end-all for measuring the profitability of the network. And you aren't entirely wrong. If running a node required too much hardware, too much internet bandwidth, or too much skilled operator time relative to the value the folks running the nodes would gain, the number of participants would dwindle and the physical network would suffer.
So there are boundaries that real-world costs impose on the operation of the network. But at what level do those boundaries kick in? Well, that is why it's been an important value to the developers that a node can be run on "commodity hardware and internet." You can run a full node on a standard PC from the last 5 years with 16 GB of RAM (less in some configurations), a 2 TB SSD (or 1 TB if you don't mind some downtime every few months), and a modest internet connection, and it can be done by anyone with some basic command-line skills.
Because of those modest demands, I can and do run a non-validating node on old hardware I already owned, on an internet connection I would be paying for anyway. I make $0 from doing so, but it interests me as a hobby because I want non-intermediated access to the network. In contrast, some large node service providers have immense costs because they host on cloud services and hire expensive SREs to keep things running at a high reliability level. But they wouldn't be spending that money if they didn't see some kind of profit or value in it. Because of that variability, and the low baseline to get started, whether it is real-world profitable for any particular participant is irrelevant to "Ethereum" as a whole.
So, from my view, it becomes reasonable to look at "how much ether is created vs. how much is burned" to judge whether "Ethereum" as a whole is profitable.
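To make that "created vs. burned" framing concrete, here's a minimal sketch of the arithmetic. The numbers are made up purely for illustration, not real chain data; post-merge, issuance goes to validators and the burn comes from the EIP-1559 base fee.

```python
# Protocol-level "profitability" framed as net supply change:
# net change in ETH supply = ETH issued to validators - ETH burned in base fees.
# All figures below are hypothetical, chosen only to illustrate the idea.

def net_issuance(eth_issued: float, eth_burned: float) -> float:
    """Net change in ETH supply over some period (negative = deflationary)."""
    return eth_issued - eth_burned

# A hypothetical day where the burn exceeds issuance:
issued = 1700.0   # ETH issued to validators (illustrative)
burned = 2500.0   # ETH burned via base fees (illustrative)

delta = net_issuance(issued, burned)
print(f"Net supply change: {delta:+.1f} ETH")
```

On a high-fee day like the hypothetical one above, the burn outpaces issuance and the supply shrinks; on a quiet day the sign flips. That sign, aggregated over time, is the "is Ethereum profitable?" question in this framing.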
I don't know if it's relevant, but I think it's interesting, to me at least as an economist, to ask these questions. What you do with this information is up to you.