I understand the natural frustrations articulated here, especially given OP’s experience working with files, but the post seems to dismiss what is actually a core strength of current operating systems: they work. Take a program written for a 16-bit address space in the 1970s: you can load it into a modern x86 OS today and it works. That is an incredible feat, and one that deserves far more recognition than it gets here! Throughout an exponential explosion in the complexity of computing systems since the 70s, every rational effort has been made to preserve compatibility.
The system outlined here seems to purposefully avoid it! Some sort of ACID-compliant database analogy to a filesystem sounds nice until 20 years down the line, when ACIDXYZABC2.4 is released and you have to bend over backwards to remain compatible. Or until Windows has a bug in their OS-native YAML parser (as suggested here), so now your program doesn’t work on Windows until they patch it. But when they do, oh no, you can’t just tell your users to download a new binary. Now they have to upgrade their whole OS! Absolute chaos. And if you’re betting on the longevity of YAML/JSON over binary data, well, just look at XML.
Want to admire your fancy After Dark Win 3.1 screensaver? Just emulate the whole environment! We don't want to keep supporting the broken architectures and leaky abstractions of the past; they drag us down. Microsoft's dedication to backwards compatibility is admirable but IMO misguided and unsustainable in the long run. The IT industry has a huge problem with complexity. We need to simplify the whole computing stack in the interest of reliability, security and future innovation.
The proposed improvement, as I understood it, would be future-proof. It seems trivial to build a rock-solid YAML/XML/JSON/EDN parser at the OS level, and since it would be such a crucial part of the OS, mistakes would be caught and fixed quickly. It shouldn't even matter if the structured-data syntax is replaced or extended in the future, as long as it is versioned and nothing is ever removed. Rich Hickey's talk "Spec-ulation" has much wisdom about future-proofing data structures.
> The IT industry has a huge problem with complexity. We need to simplify the whole computing stack in the interest of reliability, security and future innovation.
Yes! I really hope to keep hearing more of this sentiment, and that eventually we collectively take action. What would be the first practical step? There's a lot of effort spent duplicating the same functionality across different languages and frameworks. Is reducing this duplication a good first goal? Should we start at the bottom and convince ARM/x86/AMD64 to use the same instruction set? After that, should we reduce the number of programming languages? It seems there's still a lot of innovation going on; would it be worth stifling that?
The actual non-snarky first step would be to admit that we are out of our depth and can no longer deliver software that is reliable, secure and maintainable. We can only guarantee that our software works for at least some users, on current versions of the OS/browser, and is hopefully secure against some of the weaker attackers.
Countless variants of programming languages and instruction sets are not the issue. The problem is the lack of well-defined, non-leaky interfaces at the boundaries between abstraction layers.
This is too big a topic to reliably cover in a comment (or ten), but standardising on strongly and strictly typed data formats like ASN.1 and EDN, and practically abandoning everything else (JSON, YAML, TOML, INI, XML) for configuration, might be a good first step.
You cannot innovate if you keep insisting on eternal backwards compatibility. That's just the facts of life. At some point a backwards compatibility breaking move must be made. It's absolutely unavoidable and we'll see such moves in the near future.
> Is reducing this duplication a good first goal? Should we start at the bottom and convince ARM/x86/AMD64 to use the same instruction set?
Not sure about the CPU architectures; it seems they have been stuck in a local maximum for decades, and only in the last few years have people finally started asking whether there are better ways to do things.
But as for some of the author's points: you can bake certain services directly into the OS (say, use SQL to access "files" and "directories" instead of having a filesystem), standardise that, and then just make sure you have a good FFI (native interface) to those OS services no matter the programming language you use -- akin to how everybody is able to offload work to C libraries, you know?
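Not a real OS service, but here's a toy sketch of the "SQL instead of a filesystem" idea using SQLite (table and column names invented for illustration). Each write is a transaction, so readers never observe a torn file:

```python
import sqlite3

# In-memory database standing in for the hypothetical OS-level store.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE files (path TEXT PRIMARY KEY, data BLOB)")

def write_file(path, data):
    # Upsert inside a transaction: replace atomically or create if absent.
    with db:
        db.execute(
            "INSERT INTO files(path, data) VALUES(?, ?) "
            "ON CONFLICT(path) DO UPDATE SET data = excluded.data",
            (path, data),
        )

def read_file(path):
    row = db.execute("SELECT data FROM files WHERE path = ?", (path,)).fetchone()
    if row is None:
        raise FileNotFoundError(path)
    return row[0]

write_file("/etc/app.conf", b"timeout=500")
print(read_file("/etc/app.conf"))  # b'timeout=500'
```

The FFI point is that once this is an OS service behind a stable C-style interface, any language that can call C gets transactional "files" for free.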
> After that, should we reduce the number of programming languages?
We absolutely should, even if that leads to street riots. We have too many of them. And practically 90% of all popular languages are held together by spit, duct tape and nostalgia -- let's not kid ourselves at least.
It cannot be that damned hard to identify several desirable traits, identify the languages that possess them, combine that with the knowledge of which runtimes/compilers do the best work (benchmarking the resulting machine code is a very good first step there), and then finally combine that with desirable runtime properties (like the fault tolerance and transparent parallelism of Erlang's BEAM VM). Yes, it sounds complicated. And yes, it's worth it.
> It seems there's still a lot of innovation going on, would it be worth stifling that?
Yes. Not all innovation should see production usage. I can think of at least 10 languages right now that should have remained hobby projects but became huge commercial hits due to misguided notions like "easy to use". And nowadays we no longer want "easy to use" -- we want guarantees after the program compiles, not the ability to spit out half-working code in 10 minutes. (I definitely can't speak for all of IT here, of course, but this is a sentiment/trend that seems to get stronger with time.)
Many languages and frameworks aren't much better than weekend garage tinkering projects and should have stayed that way -- Javascript is the prime example.
Most operating systems ship a general-purpose structured binary serialization format parser as an OS component: ASN.1. Over the years there have been a number of security-critical bugs in there, and everybody hates ASN.1 anyway.
ASN.1 has an amazing idea and an awful implementation. :(
I'd say standardise a subset of ASN.1's binary and text representations and introduce a completely different schema syntax -- LISP seems like a sane choice -- and just stop there.
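Purely as a hypothetical illustration of what a LISP-flavoured schema syntax could look like (none of this is real ASN.1 or EDN syntax; `defschema`, the type names and the options are all invented):

```
;; Hypothetical s-expression schema for a "user" record, in the spirit
;; of an ASN.1 module but with a LISP-like surface syntax.
(defschema user
  (field id     (int :bits 64))
  (field name   (string :encoding utf-8))
  (field email  (optional (string :encoding utf-8)))
  (field scores (list (float :bits 64))))
```

The schema would then compile down to the standardised subset of binary and text representations, independent of the surface syntax.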
ASN.1 suffers the same problems that many other technologies suffer: they have way too many things accumulated on top of one another. Somebody has to put their foot down and say: "NO! Floating-point numbers in these binary streams can be 32 bit, 64 bit and arbitrary precision but no more than 1024 bits! I don't care what you need, there's the rest of the world to consider, deal with it". And people will find a way (maybe introduce a composite type that has 2x 1024-bit floats).
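The "composite type out of two floats" workaround mentioned above is a real, well-known technique: double-double arithmetic represents a value as an unevaluated sum of two machine floats, roughly doubling the precision without any wider hardware type. The building block is an error-free transformation such as Knuth's two-sum:

```python
def two_sum(a, b):
    # Error-free transformation: returns (s, e) such that
    # s + e == a + b exactly, with s the rounded float sum
    # and e the rounding error a single float would discard.
    s = a + b
    t = s - a
    e = (a - (s - t)) + (b - t)
    return s, e

# 2**-60 is below the rounding granularity of 1.0, so a plain float
# addition would lose it entirely; the error term keeps it.
hi, lo = two_sum(1.0, 2.0 ** -60)
print(hi, lo)  # 1.0 8.673617379884035e-19
```

So even a capped standard (nothing past 1024 bits, say) leaves an escape hatch for the rare user who genuinely needs more.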
We need standards committees with a bit more courage and less corporate influence.
> Given a program supporting 16 bit address spaces from the 1970s, you can load it into a modern x86 OS today and it works.
Actually, it doesn't. It is extremely hard to properly return to 16-bit userspace code from a 64-bit kernel, so Windows removed support for it entirely, and it's not enabled by default on Linux.
Well, I don't want to say anything about the utility, longevity or appeal of yaml/json, but I somehow think a user is going to upgrade their entire operating system before they upgrade my little app.
And if they're inclined to upgrade my app, well, nothing stops me from using a third-party library to parse YAML. It sounds like we're talking about an app from three operating systems and 20 years ago, so it's likely I'm doing that anyway -- maybe not on the current Windows version, but on a recent enough version of some other operating system.