elinks and lynx are completely different browsers. Lynx is actively developed and can link against recent versions of OpenSSL, GnuTLS, and LibreSSL. The latest release came out yesterday: http://lynx.invisible-island.net/release/
Lynx can render HTML5 web pages. Not sure what WebGL has to do with a web browser that is meant for terminal emulators. CSS is not supported in text browsers in general (not all of them are full-screen terminal applications) because it makes no sense - for decent text rendering, the browser needs to ignore style sheets and render in a way appropriate for text output.
Yes, you're right, it's unfair to lump elinks and lynx together. Before Browsh I would use elinks all the time, as it seemed the most modern.
Surely though the sheer fact of CSS being able to hide content has to be taken into account by any browser, text-based or not? I've come to the conclusion that there just simply isn't a subset of modern web standards that you can adhere to and still use the web today. Our best option is just to piggyback off one of the major browsers that does all the hard work of keeping up with the rapidly changing technologies.
I like the idea behind your wrapper but I think you’re mistaken regarding hidden. Things are hidden from browsers for a lot of reasons but a text browser should probably safely ignore them all as content and presentation are different.
I mean, for example, one of the big problems Browsh has is not rendering text that web developers have hidden in the DOM until some kind of mouse interaction. So Browsh has to ask JS to query the computed `display`, `visibility`, `z-index`, etc. values to ensure that it doesn't render text that isn't intended to be displayed until after an interaction.
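To make that concrete, here's a minimal sketch of the kind of check the JS side could run. This is a hypothetical helper, not Browsh's actual code: it takes an element's computed style values and decides whether the text should be rendered in the terminal.

```javascript
// Hypothetical helper: decide whether an element's text is effectively
// hidden, given its computed style. Not Browsh's real implementation.
function isEffectivelyHidden(style) {
  return (
    style.display === "none" ||        // removed from layout entirely
    style.visibility === "hidden" ||   // keeps its box, renders nothing
    parseFloat(style.opacity) === 0    // fully transparent
  );
}

// In a real page this would be fed the live computed style, e.g.:
//   isEffectivelyHidden(window.getComputedStyle(element))
```

Real pages need more than this (ancestors' styles, `z-index` stacking, zero-size boxes, off-screen positioning), which is exactly why piggybacking on a full browser engine is attractive.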
Displaying hidden text is one of the places where text browsers can "fix" broken web pages. There are broken web pages that hide the text you want to read until you click on some JavaScript modal, or some "Click to continue reading" link/button that breaks scrolling. w3m displays hidden text and greatly improves readability on many web pages.
You can argue that using the CSS display property is a bad hack around the poor performance of adding/removing DOM elements, but the general problem of trying to automatically adapt a GUI application to terminal display does not change (having the GUI be emulated in HTML that is meant to be displayed in the GUI of the web browser just adds a layer of indirection, difficulty, and inefficiency). On the other hand, web pages that are really hypertext documents wrapped in useless and inaccessible JavaScript and CSS can have all of the event handlers and CSS properties ignored, and become legible again. This is something that text browsers do well, and why text browsers continue to be relevant today.
> w3m displays hidden text and greatly improves readability on many web pages
Are you saying that ignoring CSS rules for hidden content improves readability? I find this very hard to understand. For example something like "Thanks for submitting this form" appearing on page load is not an improvement to readability.
Apart from SSL, there are slight differences: last time I checked, "lynx" makes an HTTP/1.0 request. Ubuntu's nginx-common config suggestions, if enabled without changes, do not serve gzipped content to clients below HTTP/1.1 (the relevant directive is `gzip_http_version 1.0`). "w3m" is the same. "links" does HTTP/1.1 without an explicit runtime argument. I need to measure whether this makes a difference on the general web (or with a day of my browsing history). HTTP/1.0 certainly can receive compressed content with "Accept-Encoding: gzip", but I'm curious whether the HTTP version will make a big difference. Firefox will give you HTTP/2.
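For reference, a server admin who wants HTTP/1.0 clients like lynx or w3m to get compressed responses could lower nginx's threshold; a minimal fragment (assuming gzip is otherwise configured as in Ubuntu's defaults):

```nginx
# Allow gzipped responses for HTTP/1.0 clients (lynx, w3m, ...).
# nginx's default for this directive is 1.1, which excludes them.
gzip on;
gzip_http_version 1.0;
```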
I stumbled on this detail when I was capped to 2 kbytes/s and tried different CLI browsers. I applaud the author's effort. The docs should mention that the bandwidth benefits are gained primarily if you have a remote machine to SSH into where you can set up a Firefox install.