This is pretty fascinating and comes with some complicated AI-world incentives that I've been ruminating on lately. The better you document your work, the stronger contracts you define, the easier it is for someone to clone your work. I wouldn't be surprised if we end up seeing open source commercial work bend towards the SQLite model (open core, private tests). There's no way Cloudflare could have pulled this off without next's very own tests.
Speaking more about the framework itself, the only real conclusion I have here is that I feel server components are a misunderstood and under-utilized pattern and anyone attempting to simplify their DX is a win in my book.
Next is very complex, largely because it has grown incrementally while staying somewhat backwards compatible. A framework that starts from the current API surface can be more malleable and make some tough decisions at the outset.
Crazy to see it's already being run on a .gov domain[0]. TTFGOV as a new adoption metric?
> The better you document your work, the stronger contracts you define, the easier it is for someone to clone your work.
Well said; this is my thinking as well. One person or organization can do the hard work of testing multiple approaches to the API, establishing and revising best practices, and developing an ecosystem. Then once things are fairly stable and well-understood, another person can just yoink it.
I have little empathy for Vercel, and here they're kind of being hoist by their own petard of inducing frustration in people who don't use their hosting; but I'm concerned about how smaller-scale projects (including copyleft ones) will be laundered and extinguished.
> Then once things are fairly stable and well-understood, another person can just yoink it.
That transparency & availability for community contributions or forks is the point of open-source.
If you're only using open-source as marketing because you're bad at marketing, then you should probably go closed source & find a non-technical business partner.
Whoever "yoinks" the package runs into the same problem because they now have to build credibility somehow to actually profit from it.
Established corporations will be doing yoinking, with a pre-existing credibility. There's a huge incentive to offer these copied services for cents on the dollar, as a way to kill the competition.
It'll be interesting to see if this happens at a service level too. Like how lots of companies offer an S3 compatible API, will companies start offering similar services and building a compatibility layer over the top as an easy way to for customers to transition? You could use the existing service as a test suite to check your compatibility API behaves the same as the original product.
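The "existing service as a test suite" idea is basically differential testing. A minimal sketch of the shape of that harness, with entirely hypothetical stand-ins (not a real S3 client):

```typescript
// Differential testing: treat the original service as the oracle and
// check that a compatibility layer returns the same responses.
type ApiResponse = { status: number; body: string };
type Handler = (key: string) => ApiResponse;

// Stand-in for the original service's behavior.
const original: Handler = (key) =>
  key === "exists.txt"
    ? { status: 200, body: "hello" }
    : { status: 404, body: "NoSuchKey" };

// Stand-in for the compatibility layer under test.
const compatLayer: Handler = (key) =>
  key === "exists.txt"
    ? { status: 200, body: "hello" }
    : { status: 404, body: "NoSuchKey" };

// Feed the same inputs to both and report any keys where they diverge.
function diffTest(oracle: Handler, candidate: Handler, inputs: string[]): string[] {
  return inputs.filter((key) => {
    const a = oracle(key);
    const b = candidate(key);
    return a.status !== b.status || a.body !== b.body;
  });
}
```

In practice the "handlers" would be HTTP calls to the real service and the clone, and the input corpus would come from recorded production traffic rather than a hand-written list.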
> There's no way Cloudflare could have pulled this off without next's very own tests.
I'm very unconvinced. History has shown us very complex systems being reverse engineered without access to the source code. With access to the source code, coupled with the rapid iteration of AI, I don't see any real moat here; at best a slight delay.
There was a recent post on here where the creator of Ladybird (Andreas Kling) translated a chunk of his novel browser from C++ to Rust in two weeks -- a feat he estimated would take him months: https://ladybird.org/posts/adopting-rust/
I, in my own way, have discovered that recent versions of Claude are extremely (as in, super-humanly) good at rewriting or porting. Apparently if recently released coding agents have a predefined target and a good test suite then you can basically tell them that you want X (well-defined target w/ good suite of tests) written in Y (the language/framework you want X written in but it isn't) -- and a week or two later you have a working version.
I have spent the last month wrapping my head around the idea that there is a class of tasks in software engineering that is now solved for not very much money at all. More or less every single aspirational idea I have had over the last 20 years or so, I have begun embarking on within the last two months.
I am curious, have you attempted to do this with any binary packed with commercial obfuscation/"virtualization" schemes (e.g. Oreans' Themida/Code Virtualizer, or VMProtect)?
No, I would need to find a binary to test on. I suspect it would produce horrible code at the decompiler layer but ultimately I would expect that function signatures are still relatively clean?
It's scary - once you get the differential testing harness set up, it seems to be just a matter of time/tokens for it to stubbornly work through it.
Source code is one thing; tests covering the codebase are another.
And if you just copy the source code or translate it one-to-one into a new language, rather than make a behavioral copy, there will be copyright issues.
The tests are absolutely essential, otherwise there's no signal to guide the LLM towards correct behavior and hallucinations accumulate until any hope of forward progress collapses.
> I wouldn't be surprised if we end up seeing open source commercial work bend towards the SQLite model (open core, private tests).
Wouldn't this just mean that the actual open source is the tests? Or the spec? Or whatever artifact ends up acting as the seed for the program?
I'm not sure about this. LLMs can extract both documentation and tests from bare source code. That said I think you're correct that having an existing quality test suite to run against is a huge help.
Man, I love Next ... but I also love Vite ... and I hate the Next team, because they focus on fancy new features for 0.1% of their users, at the complete expense of the other 99.9% of the Next community (who they basically ignore).
This gives someone like me everything we want. Better performance is something the Next community has been begging for for years: the Next team ignored them, but not the Cloudflare team. Meanwhile Vite is a better core layer than the garbage the Next people use, but you still get the full Next functionality.
I wish Cloudflare the best of luck with this fork: I hope it succeeds and gets proven so I can use it at my company!
React was originally meant to be the 'V' in MVC. You can still use it that way and React becomes very simple when you only use it for UI. Why do data fetching in a React component?
Rails 8 is surprisingly good nowadays. It absolutely still has its share of problems (e.g. Bundler being slow, the frontend story being crappy without Inertia, lack of types which is a biggie, memory) but it is still a fantastic framework imo.
Why Inertia.js? I quite enjoy not using JS heavy frontends in Rails by leaning on Turbo and light Stimulus JS controllers where needed. My experience going hard into Vue+Rails was full of pain and I've rediscovered why server first makes everything easier to reason about instead of duplicating tons of logic + dealing with constant async issues (particularly around automated testing and complex data loading).
Inertia because it's a drop-in replacement for Ruby HTML templating, aka ERB. Try it out: it's basically the same stuff you get from ERB, without the need for Turbo's WebSockets. You get server-side rendering and all the great BE stuff like server-side validation, but no SPA headache.
I find the best DX with Adonis/nodejs and typescript.
I miss Rails so much when working with any of the top JS frameworks.
Every time I run into an issue that Rails had a standardized solution for a decade ago just proves that most of the JS world spends their days metaphorically digging holes with sharp sticks, rather than using the appropriate tool.
But the industry values overpaying stick-diggers over results, therefore I gotta play along…
I came at this from the opposite direction in that I had zero coding experience and started learning ~7-8 months ago. I picked Rails specifically because everyone kept saying "it's dead" but every actual builder I talked to said "it just works." They were right. Rails gave me sensible defaults for everything I needed; auth, background jobs, file storage, payments...without me having to evaluate 30 competing npm packages for each concern. I shipped a production app with real Stripe payments as a solo dev with no CS degree. I genuinely don't understand how someone starting from scratch today navigates the JS ecosystem without losing months just to decision fatigue on tooling.
Shilling a bit but maybe check out wasp.sh, we are conceptually very similar to "Rails for JS"! Just don't tell Claude to completely copy us hehe (pls)
Nothing personal and I wish you the best of luck with wasp.sh, but the constant churn of new libraries that will revolutionize JS development, but are never quite finished and are eventually abandoned in a semi functional state is exhausting and exactly the main issue I have with the JS ecosystem in general.
At this point, I'm convinced there's a secret global conspiracy to prank JS developers. For example 1 person maintaining 3 similar-but-distinct decimal libraries for Javascript, or the top 3 PDF processing libraries silently producing blank outputs.
Rails powers nearly 15 percent of US e-commerce. I love it. Any time I have to use another framework it feels like a huge downgrade. Rails has so many things that make it nice to use.
Specialized roles are rarer than Rails ones and usually want more experience (understandably), but most companies would be really open if you told them you have a Rails background but want to learn Elixir.
The pool is smaller, but the pay is higher (depending on your demographics, experience, etc.), in my personal experience.
But, honestly I chose it not for the market, it's just a better programming language to build stuff, period.
The basic premise of Next is good, but it definitely has more overhead than it should, has odd "middleware", and is very hard to optimize. I view this mostly as a React problem though, since any page requires full hydration and ships everything to the client. RSCs are... not my favorite, for sure.
I too have been very frustrated by this, and I made an "Astro for dynamic sites" TypeScript framework called Hyperspan ( https://www.hyperspan.dev ) that aims to fill the gap in the JS ecosystem for a modern fully dynamic option that, similar to Astro, makes dynamic islands easy. I have enjoyed using it in all my own projects. Check it out if you want.
RSC by design does not ship everything to the client. That's one of its basic premises. It ships markup, composed with client interactivity, but you can shed a lot of the code required to curate that markup.
I obviously meant traditional React components, not RSC. RSC can eliminate some client code, but they can be very awkward to use in practice, and the lines between server and client get blurry really fast. The mental model is difficult for many to fully grok. I say this as someone who has led engineering teams with folks of varying skill levels. RSCs are not worth the extra complexity and mental overhead they bring.
At my job we have some 7+ year old Next.js apps that don't receive new features but still do their jobs perfectly fine, and they keep changing random shit around for no reason; we've had to waste time on multiple refactors already for major Next.js version bumps once the older ones are no longer supported.
Is there any front end framework that doesn't do this? I dropped out of the front end years ago, and it seems to just get worse every year with a profusion of confusion. Doesn't anyone yearn for back when we didn't have to build the front end at all?? Just emit some HTML and serve up some JS files from the backend, and everything just flows from there?
Someone go make an AI rewrite of Apache+Mod-PHP and sell it to zoomers as the hip new thing already please
I know everyone loves to hate Angular but it is in a really good place at the moment. If you don't need SSR and just want to build an SPA, Angular is the way to go imho.
Is there any reason to keep upgrading if the apps keep doing their jobs perfectly fine? Pull in a stable version of the framework and the associated docs and stay there.
I'm moderately hopeful that LLMs will help here because they lack the human motivations to needlessly mess around with stuff and over-complicate things.
Was Linux owned by a large company? Was the maintainer getting paychecks from that company? Was it profit motivated? Was it released as an AI experiment?
If the similarity is "they are both open source projects" then so are about a million others. 99.99% of them don't get any traction beyond the first week.
What if we all move to vinext? I asked Claude to migrate us in a git worktree using a team of agents, installed the vinext skill to help with that, and it did it in 10 min.
also why do you need support? agents are the support
What is it you love about Next that isn’t tied to Vercel and isn’t available elsewhere? I love Next too but I find the value is inextricably linked to Vercel. I can’t imagine choosing to use Next if I’m not choosing it for Vercel’s fancy stuff.
It is hilarious to me that the industry has reinvented serving HTML to clients, but with many intermediate steps, and this is heralded as groundbreaking.
I still just prefer having a more clear separation of concerns with API routes instead of using server components. I want my frameworks to be way less fancy than what Next is pushing out these days. I get the feeling we're dealing with the consequences of Vercel employees needing to justify their promotions.
React and RSC are not dope they are a kludge and the only reason you’re blind to that fact is because you’re React brained and have no experience with modern alternatives that are actually good like SvelteKit or SolidStart.
Weird, I hate Next and I love Vite. We have a big (I mean _really_ big) production app that runs on Next.js at work and it's the slowest thing I've ever worked on. I had to upgrade my machine to an M4 Pro just to get local dev compile times down from 5-8 minutes to ~30-60 seconds per route. And my hot refreshes are down from ~15-20 seconds to 5-10. It's _bad_. All the Next.js team does is give you the run-around and link to their docs and say here, try these steps, you're probably doing something wrong, etc. Nope. The framework is just slow. They use simple toy apps to demo how fast it is, but nobody tells you how slow it is at scale.
If you are using webpack, see if you can make the switch to turbopack. It cut my build times from ~1 minute to 15 seconds, incremental builds are down from 10 seconds to 2. Memory usage is down a ton as well. But if you rely on webpack plugins this may not be an option for you.
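For reference, the switch is mostly just a CLI flag on the Next.js scripts (a sketch; the exact flag name depends on your Next.js version, since it started out as `--turbo` before being renamed):

```shell
# Opt into Turbopack for the dev server (was `--turbo` on older Next.js versions)
next dev --turbopack

# Turbopack production builds landed later; verify output before relying on them
next build --turbopack
```

This is a config fragment rather than something self-contained: it assumes a Next.js project with the `next` CLI available, and any custom webpack plugins will simply not run under Turbopack.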
I've been pushing our apps at work in this direction and it's definitely worth it.
If you're relying on webpack plugins heavily, I'd consider that a liability either way. It's going to seriously hamper your portability to other frameworks and build tools and even new versions of the current ones.
You can easily run turbopack for development / preview environments and webpack for production(-like) ones btw. as long as you don't rely on custom magic.
It may be sacrilege to bring it into this conversation, but I've spent the last year building a fairly large community site in Nuxt, and Vite has been wonderful, though I prefer Vue over React. I am a little annoyed I paid for NuxtUI Pro like 3 months before it became free, but whatever.
Yeah, Vercel should have done this with Next.js a while ago. There's a reason quite literally every other framework uses Vite: it's amazing, easy to use, and easy to extend.
I mean, you don't really want to use JavaScript for the backend anyway... What's the problem with just using Vite and any backend of your choosing?
I wonder to what extent you should say you "rebuilt" something when the most basic hello world example doesn't work. And I wonder to what extent it makes sense to call it "from scratch" if you inherit a battle tested extensive test suite from the thing you're rebuilding, and the thing you're rebuilding is part of the training data.
Here's the first paragraph of Harry Potter and the philosopher's stone. I rewrote it from scratch, apparently:
Mr. and Mrs. Dursley, of number four, Privet Drive, were proud to say that they were perfectly normal, thank you very much. They were the last people you’d expect to be involved in anything strange or mysterious, because they just didn’t hold with such nonsense. Mr. Dursley was the director of a firm called Grunnings, which made drills. He was a big, beefy man with hardly any neck, although he did have a very large mustache.
I find it interesting that they bought Astro (https://blog.cloudflare.com/astro-joins-cloudflare/), which from my definitely-not-a-frontend-person perspective seems to tackle a similar problem to Next. A month ago.
If it is so cheap to make something that they recommend using (rather than a proof of concept), why buy Astro (presumably it was more expensive than the token cost of this clone?).
One conclusion is that, at the organisational level, it still makes sense to hire the “vision” behind the framework, rather than just clone it. Alternatively, maybe AI has improved that much in 1 month!
I view it as a long-overdue exit ramp for maintainers of Next.js-based webapps to extricate themselves from its overly-opinionated and unnecessarily-tightly-coupled build tooling. Being stuck on webpack/rspack and unable to leverage vite has been a huge downside to Next.js. It's a symptom of Vercel's economic incentives. This project fixes it in one fell swoop. I predict it hurts Vercel but saves Next.js.
I'm very patient with the AI-led porting projects, since they're revealed with a big engagement splash on social media. Could it be durable? Sure, but I doubt anyone is in that much of a rush to migrate to a project built in a week either.
Astro is a different paradigm. Acquiring Astro gives Cloudflare influence over a very valuable class of website, in the same way Vercel has over a different class from their ownership of Next.js. Astro is a much better fit for Cloudflare. Next.js is very popular and god awful to run outside of Vercel, Cloudflare aren’t creating a better next.js, they’re just trying to make it so their customers can move Next.js websites from Vercel to Cloudflare. Realistically, anyone moving their next.js site to Cloudflare is going to end up migrating to Astro eventually.
Astro isn't solving the same surface as Next. Astro is great for static sites with some dynamic behavior. The same could be said about Next depending on how you write your code, but Next can also be used for highly dynamic websites. Using Astro for highly dynamic websites is like jamming a square peg into a round hole.
We use Astro for our internal dev documentation/design system and it’s awesome for that.
I think they just want to steer users/developers to CF products, maybe not? It is interesting to see the two platforms. I've moved to Svelte; never been a frontend person either, but kind of enjoying it actually.
Astro has "server islands" which rely on a backend server running somewhere. If 90% of the page is static but you need some interactivity for the remaining 10%, then Astro is a good fit, as that's what makes it different than other purely static site generators. Unlike Next.js, it's also not tied to React but framework-agnostic.
Anyways, that's why it's a good fit for Cloudflare: that backend needs to run somewhere, and Astro is big enough to have some sort of a userbase behind it that Cloudflare can advertise its service to. Think of it more as a targeted ad than a real acquisition made because they're super interested in the technology behind it. If that were the case, they could've just forked it instead of acquiring it.
From Astro's perspective, they're (presumably) getting more money than they ever did working on a completely open source tool with zero paywalls, so it's a win-win for both sides that Cloudflare couldn't get from their vibe-coded project nobody's using at the moment.
This is probably the most interesting AI experiment I've seen yet. Looking through the codebase has me wondering where all the code is. I don't know if anyone has had the displeasure of going through the Next.js codebase, but I estimate it's at least two orders of magnitude more code than this reimplementation. Which makes me wonder: does it actually handle the edge cases, or does it just pass the tests?
Like compare the two form implementations for example. Vinext is a completely different implementation compared to what the Next.js version does. Is their behaviour actually the same? The rewrite looks incredibly naive.
The behavior isn't entirely the same and reaching 100% parity is a non-goal, but there are a few things to note.
This is still a very early implementation and there are undoubtedly issues with the implementation that weren't covered in next's original test suite (and thus not inherited) while not being obvious enough to pop up with all the apps we've tried so far.
As for why it's so much smaller, by building on top of Vite and their react + rsc plugins there is a whole lot of code that we don't need to write. That's where a significant portion of the LOC difference comes from.
> The result, vinext (pronounced "vee-next"), is a drop-in replacement for Next.js
"Drop-in" in my mind means I can swap the next dependency for the vinext dependency and my app will function the same. If the reality is that I have to spend hours or days debugging obscure edge cases that appear in vinext, I wouldn't exactly call that a drop-in replacement. I understand that this is an early version and that it doesn't have parity yet, but why state that it is a non-goal? For many of us, that makes vinext a non-choice, unless we choose to develop for vinext from the beginning.
Furthermore, if you're making a tool that looks almost like a well-known and well-documented tool, but not quite, how is gen AI going to be able to deal with the edge cases and vinext-specific quirks?
Changing the definition of drop-in definitely has me concerned and makes me not take this any more seriously than other projects open-sourced by Cloudflare, particularly the ones focused on more critical parts of their systems, e.g. pingora and ecdysis.
Yeah I'm curious about all the routing edge cases, form actions, server functions etc, since that is where most of the complexity of the app router comes from. Does it encrypt captured values inside closures sent to the client? Stuff like that.
It is the most passive aggressive thing I’ve ever seen. Cloudflare team had issues with the Next team? And they responded with ‘we can do your whole product with an intern and AI’, lol.
Next.js had remote code execution vulnerabilities because of how they implemented React server-side. I am not touching an AI version without waiting for a while.
Thank you. This is the part that shocks me the most. I was always wary of Next.js for this exact reason (in fact, I refused to use it for personal projects before the RCE because I was scared that I would make a mistake and leak server-side data to the client).
Bugs like this happen easily and are even easier to miss if you're generating thousands of lines of code with AI.
It was a vulnerability that could only exist due to the incestuous relationship between React and Vercel. It was something Vercel has been trying to heavily push into React for years (which is why they hired previous React core team members).
I'm deeply skeptical of the "X reimplemented and it was super easy" thing.
The devil is in the detail.
So many edge cases unlikely to be there.
So many details or fine details unlikely to be there.
Years of bug fixes.
If it is literally a drop-in replacement and it passes all the tests, and you're replicating something with an extremely thorough test suite, then sure, I'll give you the benefit of the doubt.
Otherwise, I don't believe people "rebuilt X product in a week".
Godspeed to the poor souls who have to make it actually work in the long run:
"I can say that with some authority. Yes, I'm the one who wrote most of this project, but I'm also the director in charge of the entire Cloudflare Workers org, almost 80+ people at this point. I'm not just an IC engineer who has to find time to justify continuing to work on this. I can literally put people on it, and I've already been talking to the team about how to do exactly that."
I don't necessarily buy it either, but TFA talks about the test suite. They basically pulled 2k unit tests and 400 E2E tests from Next and made sure they all passed.
> Most abstractions in software exist because humans need help. We couldn't hold the whole system in our heads, so we built layers to manage the complexity for us.
Kind of a sloppy statement, but I don't think it's accurate to say abstraction or layering exists in software just because humans need help comprehending it. Abstractions often exist to capture the essence of some aspect of the real world, and to allow for software reuse. AIs will still find reusing software useful? Secondly, you equate "abstractions" with "layers" which aren't really the same thing. Layers are more about separation of concerns. Maybe it could be argued layering is a type of abstraction.
I remember multiple people on HN saying "show me ONE example where AI was used to produce commercial-grade software" like a month ago. Cloudflare alone has posted a couple of examples recently, and yesterday Ladybird was ported to Rust using AI.
The most interesting aspect I see in all these examples is that extensive test suites make the work very straightforward. Maybe AI will produce a comeback of test-driven development.
How is this production grade? The last few things CF posted with AI were outright lies or omitted large swaths of functionality.
If your stance is that, given an existing implementation of something in the training set, a robust test harness already in place, thousands of dollars to throw at tokens, and no great concern with whether it actually "works", THEN this is viable, then sure? But that doesn't seem to be what most boosters are saying.
> Something like 95% of vinext is pure Vite. The routing, the module shims, the SSR pipeline, the RSC integration: none of it is Cloudflare-specific.
The real achievement is human-built Vite (and it is an amazing project).
Since Next.js's API surface and capabilities are known, this is actually quite a good use of AI: re-implement some functionality using a different framework/language/approach. They work rather well with that.
> The real achievement is human-built Vite (and it is an amazing project).
From TFA:
> Vite is the build tool used by most of the front-end ecosystem outside of Next.js, powering frameworks like Astro, SvelteKit, Nuxt, and Remix
Are you saying those frameworks aren't impressive because they are also powered by Vite?
Also from TFA:
> A project like this would normally take a team of engineers months, if not years. Several teams at various companies have attempted it, and the scope is just enormous. We tried once at Cloudflare! Two routers, 33+ module shims, server rendering pipelines, RSC streaming, file-system routing, middleware, caching, static export. There's a reason nobody has pulled it off.
That's the most important result of this experiment. They achieved something that they'd wanted to do but couldn't pull it off. Do you think they are lying?
> Are you saying those frameworks aren't impressive because they are also powered by Vite?
That is not what I'm saying
> That's the most important result of this experiment. They achieved something that they'd wanted to do but couldn't pull it off. Do you think they are lying?
Once again, that is very explicitly and very clearly not what I'm saying or thinking.
You could try actually reading and understanding what I wrote instead of responding to words in your head.
The buried lede here is the Astro acquisition timing. Cloudflare bought Astro a month ago, and now they're showing they can replicate Next.js's API surface with AI in a week. The strategic play isn't vinext itself — it's signaling to the market that framework lock-in is dissolving.
If you're a Next.js shop stuck on Vercel because self-hosting is painful, Cloudflare just gave you two exit ramps: Astro (for new projects) and vinext (for existing ones). Whether vinext is production-ready today matters less than what it represents for Vercel's pricing power.
The real question nobody's asking: if your framework's value can be replicated by targeting its test suite, what exactly are you paying for with Vercel's premium tiers? The answer used to be "the only place Next.js runs well." That moat is eroding fast.
> We also want to acknowledge the Next.js team. They've spent years building a framework that raised the bar for what React development could look like. The fact that their API surface is so well-documented and their test suite so comprehensive is a big part of what made this project possible.
Hi Next.js devs, we'd like to acknowledge the effort you put into writing good tests so we were able to rip it off. You know Claude already has Next's entire source code in its training data?
"I'm also the director in charge of the entire Cloudflare Workers org, almost 80+ people at this point. I'm not just an IC engineer who has to find time to justify continuing to work on this. I can literally put people on it, and I've already been talking to the team about how to do exactly that." -Steve Faulkner
https://github.com/cloudflare/vinext/issues/21#issuecomment-...
It's wild. If I need to steal someone's code, I just need to let an ai crawl it and it'll regurgitate the same thing. I can't wait for it to recreate marvel movies.
I think this framing is wrong. We have to learn to accept that it's not stealing. It's a new world where it's fair use, and we don't know how to deal with it.
If we accept this then we can do something about it. Without it no one will heed us.
I understand your point, but at some point someone needs to think about morality.
If you or I copied and reimplemented Next.js in a better way, it wouldn't feel as wrong. But when a large company does that and then brags about it, it's in poor taste.
Especially pointing out one developer and $1,000 of tokens replacing the efforts of hundreds of talented developers. There are people on the other side of the screen.
This traffic guided on-demand build thing, can it be powered by a file which has a standard format?
That’d make it vendor-independent - any vendor could create tooling to generate the traffic file, and you could opt to fetch it at build time or check it into your repo.
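Something like a checked-in manifest could work. This is an entirely hypothetical format, just to illustrate the idea of a vendor-neutral traffic file that build tooling could consume:

```json
{
  "version": 1,
  "generatedAt": "2026-01-15T00:00:00Z",
  "source": "edge-logs",
  "routes": [
    { "path": "/", "hits": 120000, "prerender": true },
    { "path": "/blog/[slug]", "hits": 45000, "prerenderTop": 100 },
    { "path": "/admin", "hits": 12, "prerender": false }
  ]
}
```

Any vendor's tooling could emit something like this from its own analytics, and the build step would only need to agree on the schema, whether the file is fetched at build time or checked into the repo.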
The article says that "Next.js is well-specified." I... don't think this is actually true. It certainly has lots of documentation, but, as has come up time and time again, there are tons of undocumented or poorly documented behaviors that have been the cause of consternation.
So I kinda wonder, did they just create the framework that Next.js claims to be but never has been? And is Next.js without the hidden stuff actually a good framework? Who knows.
I have done a similar thing with my web app (codeinput.com) and honestly wouldn't touch this thing for even a fun project. My reasons to migrate were twofold: Next.js simply didn't work with Cloudflare Workers, and IBM Carbon required 'use client' for every page, which meant that no HTML was generated. Everything was client-side.
Google Gemini, at the time, created an SSG solution which I then spent 3-4 months fixing bugs for. Consequently, I had to understand the whole SSG build step and all the wrong design decisions the AI made that resulted in the site getting a horrible Core Web Vitals score. In the end, I just put the site behind a white "div" that disappears when the page finally loads. SSR is way more complex than it sounds.
This project (along with the quantum post) is quite concerning. It's not clear why Cloudflare has decided to take this direction. LLMs are completely unable to produce something even close to Next.js; a better solution would have been to ask the LLM to fix the opennext adapter rather than building a new framework from scratch.
fwiw, I just tried running the agent-skill they provide for fun to migrate an app-router based next 15 site and the end result is it entirely failed to start.
Vite just hangs when running vinext dev, with no output in logs whatsoever beyond printing `vinext dev (Vite 7.3.1)`.
i love how this disintermediates the next.js/vercel axis, which seems to be determined to make basically everything hard except for exactly what they want to do. as much as i love what vercel has done for open source in general (amazing stuff!) it is hard to interpret some of the stuff they do with next as anything other than vendor lock-in bs… the kind that i know is not in their hearts.
NextJS is bad enough; I cannot imagine an AI version.
Cloudflare also lost my support because their support is among the worst; a rep even sneered at me (I still cannot update my WHOIS, after months of emails). I strongly recommend avoiding their platform. You will find that you lose more time and money dealing with parity issues. God help you if you ever need support; almost every question in their Discord goes unanswered as well.
The former CTO commented a lot here and said numerous times about emailing him with issues that support couldn't figure out. Maybe try emailing the new CTO?
nah, they already lost my business. It seems cultural, which we know is a hard ship to steer in a new direction; I'm not interested in trying that again
if said CTO happens upon this, my handle should show up in your systems if you do as the parent commenter suggests
Would LOVE a more in depth article about the discussions with Claude and how you created the agents to accomplish this! Would be very interested to know what you found the agents were really good at and what they could improve on.
I especially am interested in the initial discussions with Claude about scope, process, and general gameplan. I find that this initial discussion requires a lot of shaping and a lot of input, would love to see more insights into that process from your end!
Someone spent over $1,000 to replicate the functionality of Next.js; somehow even $1 would seem too much. I suppose that's me being overly retributive.
...and (again), hello world does not work [1]. the ai slop pr [2] absolutely butchers the fix. anyone foolish enough to switch to this is in for a rough time. details matter!
Lots of hate for NextJS in here, so I'm wondering what people use as an alternative framework...
Gatsby? I used to use that one until the updates basically ceased to exist.
Vite with <insert your favorite here>? Looks good, but at first glance it seems to favor pure speed over other feature support like MDX, advanced SEO, etc.
Roll your own with React and webpack? Good luck, and you'll probably end up with something that looks like the others I've mentioned above.
I'm just surprised that many comments simply state complaints about Next without providing any counter-examples; it's very un-HN.
This stood out to me as well. The level of discourse is incredibly low compared to other topics. Huge amount of "next/vercel sucks" without any constructive or deeper argument made.
There is a similar vibe on the Next.js subreddit, just enormous amounts of shallow negativity. Very strange.
(I'm not saying there aren't valid reasons to dislike the framework or company but the way it's expressed is consistently incredibly juvenile for some reason)
Eh, it's been rehashed over and over again. People experienced with Next.js and other frameworks don't need to read the same constructive, deeper arguments being made ad infinitum. It's like when people say microslop or micro$oft, everyone knows what they're talking about.
Astro, Nuxt, SvelteKit, SolidStart, React Router (prev. Remix), among others
Vite has plugins for MDX, SSR, etc. You can easily build your own framework if you want in a few hundred lines of glue code.
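To give a sense of the starting point (a minimal sketch, assuming the community plugins @vitejs/plugin-react and @mdx-js/rollup are installed; swap in whatever fits your stack):

```typescript
// vite.config.ts — a tiny "framework" starting point: React + MDX on Vite.
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";
import mdx from "@mdx-js/rollup";

export default defineConfig({
  // mdx() runs first so .mdx files reach the React plugin as plain JSX
  plugins: [mdx(), react()],
});
```

From there, the "few hundred lines of glue code" is mostly routing and SSR wiring on top of Vite's dev server.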
The criticisms of Next / React are so ubiquitous you can metaphorically gesture in their direction and most people know what's up. It's like wondering why someone said "ai slop" and didn't provide an expose on the quality of AI writing.
I get the gist here but I hate the tone of these sorts of posts. Imagine being a NextJS developer, pouring your heart and soul into it day after day, knowing the codebase inside and out, and seeing some dude on the Cloudflare blog bragging about how he rewrote your project in a week using AI. It's tone deaf. It's not impressive.
The tool is hella useful. The messaging is ignorant. This should have been a "we built a tool to deploy NextJS on cloudflare natively" instead of this AI brag.
I partially agree with you, because Vercel put a ton of effort into this one, and the repo (and feature list) is massive.
On the other hand, I do believe they drifted too far and stopped listening. As someone else said, most of the fancy features are used by 1% of projects, and everything else is buried in hacks and workarounds.
I also agree the tone (and especially the AI brag) is a bit too much. And at the same time it's honest - it doesn't need to be THAT hard and complex. Nor slow :)
That's why I've written an open letter on fixing Next.js, addressed to whoever wants to do it (Vercel, Cloudflare, or anyone else). Because we have needs, and we cannot play this game anymore...
asdf was the hot shit for quite a while, with people (myself included) invoking all kinds of shell arcanum to make it faster - then mise (née rtx) came out, and it was game over. Compatible with asdf’s ecosystem, but infinitely faster.
Poetry was incredibly popular, along with various other competitors, and then uv came out.
I get what you’re saying about the AI angle, because it’s somewhat different when a human takes your crown by dint of pure skill, but it’s gotta sting either way.
Tone deaf? It's the reality. Developers shouldn't bury their heads in the sand. Chart your course accordingly.
> Rewrote your project
That project would die without users' adoption. Be appreciative. Next.js is an open source project. What is it with HN that it constantly praises the virtues of open source software, but downplays that fact the moment it doesn't like the outcome?
It's going to get increasingly difficult to sell software when there is no moat to replication. We're quickly reaching the point where you can just tell an agent "learn what this software does and then code it".
I love Cloudflare but am not a big fan of Next; I love Remix though :) But getting things to work on Cloudflare is a pain; hopefully they will make it easier with OpenNext. On the other hand, maybe they can do something better at the infrastructure level, like making it a lot easier to bring your own JS flavor.
Woahhh... This is a direct threat to Vercel's ecosystem-capture model: all those projects they hoovered up, then re-aligned and soft-locked into their hosting business.
> The [next.js] developer experience is top-notch.
let me add my own unqualified statement to that: no.
> Next.js has invested heavily in Turbopack but if you want to deploy it to Cloudflare, Netlify, or AWS Lambda, you have to take that build output and reshape it into something the target platform can actually run.
it's almost as if vercel had some kind of financial incentive to gear this towards their own platform.
> reimplemented the Next.js API surface on Vite directly
a clown car screeches to a halt; several burnt-out-bored oracle vs google lawyers climb out and, weirdly, i am there for it
all in all, it's definitely a good example of something we couldn't have done for $1100 pre-llms, but: should we have? did somebody consult the lava lamps?
The result of these heists is that no one will publish test suites on the Internet in the future.
The tone of the blog post is upbeat. What are the consequences? Is the new performance expectation at Clownflare to "port" one framework per week? Do you have to generate at least 20 kLOC per week? Aren't you redundant right now?
This is interesting to me on both a technical level and a social-political level. I wonder what impact "AI-washing" will have on licensing, for example.
The core network products seem to be having a run of downtime issues too. Maybe they should focus on their homework before going out to play with the AI kids.
That's cool and all, but them investing $1,100 and two weeks of one engineer's time doesn't yet give me confidence that they're in this for the long haul. It'll be interesting to see how long the long tail of remaining issues will be (it doesn't sound like in another two weeks, they'll have pre-rendering and cache components working), but I'm definitely not adopting this any time soon.
This is another example that good tests (e.g. Next.js's own test suite) are SO incredibly important to making the AI able to work on big projects autonomously with lower steering. So is a very domain-knowledgeable human in charge of steering.
I guess the thing here (which they admit in the post) is that they’re just porting it to Vite, which is the real champ of the story. The LLM basically worked as a translator instead of rebuilding the whole thing from scratch.
So maybe the project is sort of maintainable, as long as people maintain Vite.
>The only people that have trouble with this development are the gatekeepers who think that code should be sacred and revered by itself. That is a perversion of computing, and we got the wrong group of people there.
I'm not sure who the hell you're talking about, but I'd guess from your comment that you have a pretty high opinion of yourself.
> Surprised this didn't get a higher placement on the HN front page, only 34 points?
Vercel may be bad, but they have been a net positive for the web landscape; so many projects are alive because of them. And I truly respect the hard work the Next devs put into their code and test suites. I'm surprised any self-respecting dev even votes this up.
> Surprised this didn't get a higher placement on the HN front page, only 34 points?
Without spending the time on reading through all the details for the umpteenth “look what we built with AI!” article, I assume this is as valid as Anthropic’s claim about building a C++ compiler a few weeks ago where, when you looked under the hood, it was still relying on existing compilers.
Like OK, I really don’t believe the claims to begin with, but even if I do take them at face value, you just recreated something already existing and working for years?
Or just skip/migrate off of Next.js and the other JS SSR rat's nests to Elixir and Phoenix LiveView - Claude and Codex are both very good with Elixir now: https://elixirisallyouneed.dev
I see this sort of maximalism a lot, where people are just turned off by JS and say "f it, I'll use HTMX or LiveView or Alpine or whatever" that promises they won't have to write JS. And that's fine, as long as you're building generic dashboards and/or the same repetitive UI patterns. Even then, you're basically writing JS, just in a worse way.
I use LiveView and Elixir for 2-3 home-lab frontend services, but when I have to do something moderately complicated I have to reach for a darn JS library, hooks, and phx-commands. Try using native drag and drop, or even client-side markdown rendering. This also leads to memory leaks when you can't properly detach libraries.
I'd just say: think about your goals. These frameworks/platforms that promise to remove JS from your life, or to minimize it, do so by sacrificing something. There's no silver bullet for building on the web.
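To make the leak point concrete: mounted/destroyed is the real Phoenix LiveView hook lifecycle, but the renderer library below is hypothetical — the sketch just shows that forgetting the destroyed() teardown is exactly where the leaks come from.

```typescript
// Minimal sketch of a LiveView JS hook lifecycle. The `Renderer` library is
// invented; in practice it would be e.g. a markdown or drag-and-drop library.
type Renderer = { attached: boolean; teardown(): void };

const MarkdownHook = {
  renderer: null as Renderer | null,
  mounted() {
    // Attach the client-side library when LiveView mounts the element.
    this.renderer = {
      attached: true,
      teardown() {
        this.attached = false;
      },
    };
  },
  destroyed() {
    // Without this cleanup, the instance outlives the element and leaks
    // across LiveView patches and navigations.
    this.renderer?.teardown();
  },
};
```

In a real app this object would be passed to the LiveSocket via its `hooks` option and bound with `phx-hook` in the template.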
But whenever I talk to people debating frameworks, SvelteKit and SolidStart are the two I recommend. They're easy to host anywhere (unlike Next), and you can turn off SSR and just ship static files with very minor changes (exporting a variable in Svelte, for example). They're really quick, get the job done, are actively worked on, and have loads of resources, discussions, and thriving communities.
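For reference, the "exporting a variable" bit is SvelteKit's prerender page option (real API; pairing it with @sveltejs/adapter-static is the usual static-output setup):

```typescript
// src/routes/+layout.ts
// With @sveltejs/adapter-static configured in svelte.config.js, this one
// line prerenders every route to plain HTML/CSS/JS at build time — no SSR
// server needed, so the output can be hosted on any static file host.
export const prerender = true;
```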
[0] https://www.cio.gov/