I was AWOL during the years when HTTP became a delivery mechanism for applications, but when I finally noticed I was very surprised.
What made it seem like a good idea to run transactional software over a stateless protocol designed for serving static hypertext over unreliable connections? Aren’t half the problems in web development caused by taking this shortcut, and isn’t it time someone designed a protocol that is appropriate for the job?
[Runs back to ‘Cave of Web Ignorance’ while people explain why I’m wrong]
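To make the complaint concrete, here is a minimal sketch (my own illustration, not from any particular framework): plain HTTP forgets the client between requests, so anything like a "session" has to be bolted on by handing the client a token it must echo back every time.

```python
# Sketch: HTTP carries no session state of its own; "logged in" is faked
# with a token (cookie) the client re-sends on every request.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from http.client import HTTPConnection

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The server can only tell requests apart by what the client
        # volunteers in its headers.
        cookie = self.headers.get("Cookie")
        body = f"you sent: {cookie}".encode()
        self.send_response(200)
        if cookie is None:
            # First visit: hand the client a token to echo back.
            self.send_header("Set-Cookie", "session=abc123")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

conn = HTTPConnection("127.0.0.1", port)
conn.request("GET", "/")                               # request 1: anonymous
r1 = conn.getresponse(); r1.read()
token = r1.getheader("Set-Cookie")
conn.request("GET", "/", headers={"Cookie": token})    # request 2: echo token
r2 = conn.getresponse()
print(r2.read().decode())  # the server only "remembers" what we re-sent
server.shutdown()
```

All the session middleware, cookies, and token schemes in web frameworks are essentially this trick, layered over a protocol that was never designed to track a conversation.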
For comparison: when I was working with developers in the mid-90s, OO was taking off and people were moving away from proprietary networking like DECnet to TCP/IP, and to architectures like DCE (the Distributed Computing Environment) and Object Request Brokers like CORBA, which seemed like a very good idea.
Does anyone know where that went wrong? Was it just that DCE wasn’t adopted by the VB programmers on Windows because Microsoft bribed them with something easier, to keep them Microsoft-only, or were distributed objects actually a bad idea? Is there anything current that works this way?
I asked WikiP too:
"The rise of the Internet, Java and web services stole much of DCE’s mindshare through the mid-to-late 1990s, and competing systems such as CORBA muddied the waters as well.
One of the major uses of DCE today is Microsoft’s DCOM and ODBC systems."
Typical! Open Systems competed themselves out of existence and Microsoft stole the ideas.