I don't think virus is a 4th-declension noun; I've seen it described in multiple sources as 2nd declension (meaning slime, poison, infection, mucus, or something thereabouts). It is unusual in being a -us noun despite being neuter, but irregularities happen (other examples of neuter 2nd-declension -us nouns being vulgus and pelagus). In classical Latin it is a mass/uncountable noun, and it only got the modern meaning (and the ability to form plurals) in recent times.
That said, your point about virulent vs. virolent is interesting, but I don't know enough about sound changes, or about how that word came to be and made its way to us, to know whether this is counter-evidence to it being 2nd declension, or whether it can be explained away otherwise.
All the above is correct. Additionally, it's important to understand that the W3C isn't wholly hosted at MIT. MIT is one of four co-hosts (the others being Keio University in Japan, Beihang University in China, and ERCIM in France), though MIT is the central hub of that relationship.
The documented reasoning and references don't make any sense. Older browsers are going to choke on newer syntax regardless. This is an ongoing issue in the JS domain that is managed reasonably well via polyfills, tooling, and rapid release cycles. Often with code bloat initially, sure, but the browsers catch up. The days of "oldIE" are long gone.
The point of the declaration is to tell the browsers that can handle the new syntax to opt into it. Another method could be to use a new linking mechanism. Either way, the language would still degrade gracefully, since only the newer syntax would require the strict rules to be in place.
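To make the graceful-degradation point concrete, the existing module/nomodule split already works this way (the file names here are made up for illustration):

    <!-- Browsers that understand ES modules run the first script and
         ignore the nomodule one; older browsers skip the unknown
         type="module" and run the fallback instead. -->
    <script type="module" src="app.modern.js"></script>
    <script nomodule src="app.legacy.js"></script>

Nothing breaks in old browsers; they simply never see the newer syntax.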
Please explain it to me if I am completely missing the point.
That country could have an interestingly named neighbor: if Ireland and Scotland band together, they could be the United Republic of Ireland and Northern Great Britain.
I like that Chinese one because I'm working on a project involving Chinese text right now, and because it's written in three scripts: English, Simplified Chinese, and Traditional Chinese.
Ah, that's why the Chinese appears twice. I didn't see a difference between them (not that I was looking hard).
A question for you: the Chinese characters seem much more complex, e.g. 但也包括雜誌, which makes me wonder whether they're meant to be read at the same font size as the English text or whether they'd typically be printed larger?
Also, full stops and commas seem to be used in Chinese text. Are these inherited from European script, or did they always exist in some form?
Chinese text is usually the same size as English text: I think people just get used to the vague shape of them and don't actually need to see every stroke precisely.
Only the period 。 and the enumeration comma 、 existed before European influences.
For the font size, it's kind of the same, kind of not. The numerical value of the font size used is typically the same, but Chinese characters fill the allocated space more thoroughly.
> The web was designed to publish documents, it was never intended to build applications.
I find that repeated a lot, but to me, it's somewhere between an overstatement and a misconception.
Sure, the early web wasn't anywhere near the versatility of what we have today, and interactivity was extremely limited (and, as you say, is still more complex than it ought to be). But the semantics of HTTP+HTML allowed for application development from the early days. If all we had had were <p>, <h1>, <blockquote> and GET, sure, documents it is. But we had forms and POST and everything you need to make a CRUD application from the early '90s. It was pretty basic, but so was support for documents (tell desktop publishing people that HTML in the '90s was good for documents, and they're not likely to agree much).
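To illustrate, here's roughly what the front end of such an application looked like (a hypothetical guestbook; the /cgi-bin/guestbook.pl endpoint and field names are invented, but every element here is plain HTML 2.0):

    <form method="POST" action="/cgi-bin/guestbook.pl">
      <p>Name: <input type="text" name="name"></p>
      <p>Comment: <textarea name="comment" rows="4" cols="40"></textarea></p>
      <p><input type="submit" value="Sign guestbook"></p>
    </form>

That covers create and read; update and delete are just more forms.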
I consider (paper) forms to be documents as well. In my opinion, it's a minor evolution of what is considered a "document" to go from a paper form to a dynamic, interactive form. A media application or video game in a browser is quite different. That doesn't mean it's not now a valid use of browsers, it's just not what the intended use of the web originally was.
Further, to quote Tim Berners-Lee directly on his intention when he created the first version of the WWW (emphasis mine):
"Creating the web was really an act of desperation, because the situation without it was very difficult when I was working at CERN later. Most of the technology involved in the web, like the hypertext, like the Internet, multifont text objects, had all been designed already. I just had to put them together. It was a step of generalising, going to a higher level of abstraction, thinking about all the documentation systems out there as being possibly part of a larger imaginary documentation system."
— Tim Berners-Lee
The early '90s only had GET in the HTTP protocol (0.9), so you couldn't actually build any CRUD applications. Quite far from it. POST came in '96 (1.0), and PUT and DELETE in '97 (1.1). REST was introduced in 2000 by Fielding, and the idea that the web could be used to build large, scalable applications slowly gained momentum as CRUD was mapped to POST/GET/PUT/DELETE; the proposal gathered steam in the '00s. In the meantime, a massive amount of engineering went into making JavaScript workable performance-wise (Chrome/V8). Everything came together only in the late 2000s. So, quite far from claiming the web was a viable mechanism for building applications in the early '90s.
HTTP POST was in common use well before 1996. It was standardized with HTTP/1.0 in 1996, but form POSTs were common well before then. People were developing "web apps" in the '90s using Perl CGIs, ASP pages (VBScript), and even JavaScript (Netscape's web server supported server-side JavaScript applications).
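For the record, a form POST of that era looked essentially the same on the wire as it did after RFC 1945 wrote it down (the path and field values here are invented):

    POST /cgi-bin/order.pl HTTP/1.0
    Content-Type: application/x-www-form-urlencoded
    Content-Length: 29

    item=widget&quantity=2&ship=1

The 1996 spec largely documented what browsers and CGI scripts had already been doing for years.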
It's funny that there's contention on this. Here's someone trying to answer the question in 2002:
> The purpose of this document is to answer the question "When did we start seeing web-forms that captured credit-card details, and passed them to CGI-scripts, thereby enabling interaction with server-side applications?"
In addition to icedchai's comment, there are two more points worth quibbling with in what's written here:
It's odd to say something like "REST was introduced in 2000 by Fielding", since it makes it sound like a feature being shipped. But REST is just a description of how the Web works. Fielding's dissertation is a retrospective, documenting what the real work of standardization at the IETF and among browser makers and server distributions had already enabled.
Secondly, JS was workable performance-wise in the early 2000s in Netscape 6, even without a JIT.
Yet there I was, building online shopping apps in '96 during my internship. GET was more than good enough for searching the catalog to find what you needed. Our system just sent an order email in, and we called you for payment. That was via POST.
You can definitely make a CRUD app with only GET. E.g., PHP used to have support for automatically adding session IDs to URLs in links, for people who didn't like cookies.
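Roughly like this (the session value and query parameters are invented; the link rewriting is what PHP's old session.use_trans_sid option did automatically):

    <!-- Every link carries the session ID, so state survives without
         cookies; "writes" are just GETs with query parameters. -->
    <a href="cart.php?PHPSESSID=abc123&item=42&action=add">Add to cart</a>
    <a href="cart.php?PHPSESSID=abc123&item=42&action=remove">Remove</a>

Abusing GET for writes, sure, but it worked.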
I think it's right on the money. The Web has long sucked even for presentation, let alone for application delivery. It's not that you can't do it now, but it's still a hack upon a hack.
I also don't think the longevity of Web sites will necessarily continue. The earliest sites will render today, but will the same be said of today's "single-page application" sites that rely on scripting?
And finally, I think the LaserDisc comparison is dumb. LaserDiscs from the '70s were playable through the format's demise in the 2000s. "Coding" for it... makes even less sense.
Sure, you could write apps using forms and POST, but how prevalent was it? One of the most popular frameworks for early-ish apps was Flash, for example, and support for it has been thoroughly discontinued.
Maybe my memory is fuzzy, but I don't remember Flash being used for apps. It got used for video playing and games mostly, as well as unnecessary animation on restaurant/hotel websites. The occasional enterprise app used it, but generally they were ActiveX.
What we had were server-side frameworks: generated HTML, and clicking a link that did a GET or POST. That's how my webmail at university worked, and that's how the control panel for my web hosting worked, etc.
I think my first exposure to proper client-side UI was Gmail (launched 2004; I think I signed up around 2006).
> Maybe my memory is fuzzy, but I don't remember Flash being used for apps
Flash was used for apps; I remember one webmail provider that had built their frontend in Flash (can't remember the name, sadly), and there were various enterprise apps deployed with Adobe Flex (I remember Paychex being one of them).
Adobe really wanted it to be used for serious stuff. There was AIR, a Flash desktop runtime. They started integrating Flash into their tools; I recall a few Illustrator palettes that would make my computer's fans come on just by opening them, because they would make the Flash runtime wake up and start busy-waiting for input on them.
They had the same vision for it that Sun had for Java, with the same flaws that would become apparent when that vision was finally manifested in Electron: write once, run anywhere, and be hideously inefficient compared to native code.
Up through 2008, we made very heavy use of POSTs in web apps where I worked; JS was more of an accelerant for things like autocomplete and client-side validation. Amazon retail was still building mostly POST-based experiences when they were doing the universal detail page project in 2015.
I never used Flash seriously from 1996 to 2015, when I switched to an Electron front end and then mostly to backend work.
"App" is short for application, which is just a fancy word for program. If a page does computations locally (using javascript), then it is a program; if the page does nothing locally but push requests to the server (get/post), then this is a user interface. By this distinction, the early Web even with POST had no application, it was just at best a remote interface.