Prefetching is possible using standard web technologies, and that improves the security footing since it follows the browser's normal origin policies.
> pre fetching is dangerous given you don't know if the origin can support it or not.
That's not true unless your site already has a huge exposed vulnerability. It's also irrelevant to the discussion of how a search engine could provide an opt-in feature which you could choose to enable.
> AMP is faster than the alternative.
Do you have any data to support that assertion? Remember, you're replying to a thread about possible standards-based alternatives which Google has never implemented, so all we can say is that AMP is faster than not doing anything at all. We don't have any data saying that the solution to web performance problems is a bunch of proprietary markup and JavaScript. I've certainly found AMP to be slower as often as it is faster, because the proprietary markup means you have a bunch of resource requests which are delayed until ~125KB of JavaScript loads and executes, whereas a standard <img> tag would have started the same transfer without that delay (see e.g. https://www.webpagetest.org/video/compare.php?tests=161129_X... where the non-AMP version starts rendering over a second faster on an iPhone 6 over LTE).
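To make the delay concrete, here's a rough sketch of the two approaches (the filenames are placeholders, and the exact AMP runtime URL and attributes are from memory, so treat this as illustrative rather than definitive):

```html
<!-- Standard HTML: the browser's preload scanner can start this
     request as soon as the tag is parsed, before any script runs. -->
<img src="photo.jpg" width="600" height="400" alt="photo">

<!-- AMP: <amp-img> is inert custom markup until the AMP runtime
     (~125KB of JavaScript) has loaded and upgraded the element,
     so the image request is queued behind that script. -->
<script async src="https://cdn.ampproject.org/v0.js"></script>
<amp-img src="photo.jpg" width="600" height="400" layout="responsive"></amp-img>
```

On a warm, fast connection the runtime may already be cached and the gap shrinks, but on a cold cache over a slow link the standard tag wins by exactly the kind of margin the WebPageTest comparison above shows.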
As an example of how this could be different, imagine if Google simply started aggressively using measured page-load times and transfer size with the same weight they currently give to AMP, giving all publishers the same incentive to reduce things like the massive amount of ad-related JavaScript they traditionally serve, and started serving rel=dns-prefetch/preconnect/preload hints for the top-n search results (or on mouse hover states, etc.). That might not be quite as fast as the best-case for AMP but I suspect it would rapidly get into the territory where the user benefit is diminishing compared to the benefits of not dictating a restricted tech stack.
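Concretely, the hints mentioned above are just standard `<link>` elements a results page could emit for the likely next navigations — a sketch with hypothetical placeholder hostnames and paths:

```html
<!-- Resolve DNS early for a likely next navigation -->
<link rel="dns-prefetch" href="//news.example.com">
<!-- Also perform the TCP/TLS handshake ahead of time -->
<link rel="preconnect" href="https://news.example.com">
<!-- Fetch a high-priority resource before the page that needs it loads -->
<link rel="preload" href="https://news.example.com/styles/main.css" as="style">
```

None of this requires the destination site to adopt any special markup; the search engine controls which hints it emits and can scope them to top results or hover states as described above.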
That last part is pretty important because AMP is not without cost: it breaks the sharing UX on mobile devices, desktop users get a mobile-optimized page which isn't as good for their devices, and most importantly it limits you to the subset of functionality which they choose to implement. It seems risky to push everyone towards a single company's view of how the web should work rather than allowing independent experimentation and optimization for different types of content and users.
The actual reason for AMP is that it gives Google control: the good side of that is that companies have an outside check on performance — think about how many companies exist where internal politics means that “this will hurt SEO” will be listened to but “this page is too heavy” coming from inside will be ignored — and the bad side is that anyone who adopts it is binding their technical capabilities closely to Google's decisions.