I like this far better than the first post; at least the author is proposing some concrete ideas that can be debated, which I think is worthwhile.
However, actually going and attempting to implement this idea from whole cloth is a fool's errand. I'm not really cynical by nature, so it pains me to be that guy, but in this case I just think it betrays a complete naivete about how software ecosystems evolve and thrive, and why the web in particular was successful.
The web you see today bears almost no resemblance to the original proposal. What got the ball rolling was a very simple document format and protocol that made it much easier to share documents. That was a unique value-add over Email/FTP/IRC/Usenet/Gopher that drove early adoption. The reason the web is so crufty is because all the application stuff was bolted on after the fact, but here's the rub—if they had designed it for applications from the beginning it would have been too complicated and wouldn't have taken off. There were tons of proper app development platforms that could have formed the basis for "proper" cross-platform GUI app development, but none of them could cross the chasm to true ubiquity the way the web has.
By the late 90s, the amount of effort being invested in the web was far greater than any single company or organization could ever bring to bear on a problem. 20 years later it is a deep sedimentary stack of more technologies than anyone can really keep track of, and yes it's been quite tortured and abused, and often feels extremely janky.
But tempting as it might be to think you can design something better, you really can't. You can't solve all the problems the web has solved. It will seem good when you start, but as you go you run up against the edge cases, the scope grows, you have to bring in more people to solve the new problems, you start to lose control of the design, and you end up with an entirely new mess that people write hand-wringing blog posts about. Either that, or you play benevolent dictator and try to control everything, but then adoption slows and it becomes more of a proprietary platform that never gains ubiquity.
The only tractable way to improve the web is to pick an area and work on improving it. It's not as satisfying, because you have to deal with so much cruft, but it is possible to slowly improve things. It takes a long time this way, but you are leveraging the millions (billions?) of man-hours that have gone into web technologies to date.
That's not to say the web is invincible. Yes, the web might be replaced, but it won't be replaced by someone building a better app platform (that will never gain traction, mark my words). Instead, it will be something unexpected: a simple use case that, in unforeseeable ways, over time, creates an entirely new ecosystem that simply makes the web irrelevant.
> There were tons of proper app development platforms that could have formed the basis for "proper" cross-platform GUI app development, but none of them could cross the chasm to true ubiquity the way the web has.
I think this is the heart of our disagreement.
App platforms that were actually designed, like iOS, Android, Java, etc., have all been very popular. Smartphones are ubiquitous, but the smartphone that bet most heavily on the web platform (the Palm Pre) was wiped out by the smartphones that bet on relatively cleanroom designs.
Heck, the Android API isn't going to win many awards for simplicity or elegance, but the BeOS and Danger people had OS design experience and pretty much knew what they were doing. Android gets far more right (in my view) than it gets wrong. The entire "app revolution" that followed the iPhone launch was more or less a huge slap in the face to the web platform. If it was really so hard to beat the web, why is all the innovation on smartphones in the native app space, with web devs getting the crumbs years later after various Apple/Google/Microsoft talking shops have finished finalising and shipping WebTiltSensor or whatever feature we have in mind?
The key difference between mobile and desktop, in my view, is deployment. Android and iOS handle deployment and upgrade for you. Desktop platforms only started trying to tackle that recently, and mostly screwed it up. The web has a great deployment story.
> If it was really so hard to beat the web, why is all the innovation on smartphones in the native app space, with web devs getting the crumbs years later after various Apple/Google/Microsoft talking shops have finished finalising and shipping WebTiltSensor or whatever feature we have in mind?
Because platforms have different strengths. It's obviously not hard to design a better platform than the web for apps. What I'm arguing is that you can't take one of those clean-room implementations and turn it into a truly cross-platform standard.
As popular as iOS, Android, Java, Flash, Qt and whatever else are or have been, they are still hamstrung by being single-vendor efforts. The web, on the other hand, is table stakes for any new computing device, on the manufacturer's dime—it's not a cost center or support burden for the platform "owner". This is a world of difference that's hard to overstate.
> The entire "app revolution" that followed the iPhone launch was more or less a huge slap in the face to the web platform.
Why? Again, they have different strengths. If you need high performance and access to hardware, then you have to go native. Of course all the innovation will happen in closed environments where the vendor can control the full stack and move quickly. There was no way a loose set of open standards like the web could compete with that; the results should surprise no one.
But does this mean apps are going to eclipse the web? No! Because there is still a high threshold of trust to install an app. As much as the web security model is a mess, it has also been reasonably successful at isolating the hardware environment from the on-demand functionality it delivers. Can you imagine a world where you only ever install apps for everything and never use the web? Even if you use Google's and Facebook's apps, there will always be a long tail that you aren't willing to install, and the web is there for that use case. Because of this, companies have to continue developing websites to serve the top of the funnel, and as they do so, they will demand more and more standards to tie into hardware, etc., so slowly web standards will creep in and commoditize functionality that today is only available via native APIs.
More fundamentally, I think you underestimate the sort of worse-is-better strength of the web being document-centric while supporting app-like functionality. There are far more documents than apps in the world, and many of the apps deal with things resembling documents. So the web has this incredibly low barrier to entry where you can throw some documents online and then slowly build functionality around them. If you're working on apps all day, and living in SV, it's easy to see the warts and lose sight of what a powerful dynamic this is for web adoption.
I hope you can prove me wrong and come up with an idea that can revolutionize the web; it's just that this runs counter to my observations about the forces that shape standards and technology ecosystems, which operate at a higher level than individual minds and platforms.