Many comments are missing the point here (although the article doesn't properly explain it either); it's not about resolution, but about fixing imperfections in the original filming:
> The recent Cameron restorations were based on new 4K scans of the original negative, none of which needed extensive repair of that kind. [...] The A.I. can artificially refocus an out-of-focus image, as well as make other creative tweaks. “You don’t want to crank the knob all the way because then it’ll look like garbage,” Burdick said. “But if we can make it look a little better, we might as well.”
The only movies that would require upscaling to 4K are those released between roughly the mid-2000s and the mid-2010s, during the advent of native digital cinema, but finished in 2K. Everything before was shot on 35mm film, which can be scanned to 4K with information to spare; everything after is shot in native digital 4K or higher.
Moreover, upscaling that deals only with resolution has no need for AI at all. Any TV will decently upscale a non-4K movie in _real time_, and more sophisticated techniques can give basically indistinguishable results. 2017's _Alien: Covenant_ was deliberately finished in 2K but released in 4K through upscaling, and the image looks just great.
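To make the point concrete, here's a minimal sketch of that kind of classical, non-AI upscaling using plain Lanczos resampling via Pillow (the function name and the synthetic gray frame are illustrative, not from any real mastering pipeline):

```python
from PIL import Image

def upscale_classical(img: Image.Image, factor: int = 2) -> Image.Image:
    """Resolution-only upscaling: Lanczos interpolation, no AI involved."""
    return img.resize((img.width * factor, img.height * factor), Image.LANCZOS)

# Example: a synthetic DCI 2K frame (2048x1080) doubled to DCI 4K (4096x2160).
frame_2k = Image.new("RGB", (2048, 1080), "gray")
frame_4k = upscale_classical(frame_2k)
```

Filters like this (Lanczos, bicubic, etc.) are deterministic and cheap enough that TVs run hardware variants of them in real time; nothing about the process invents detail, which is exactly why it can't "refocus" anything the way the article's AI tools claim to.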
> The only movies that would require upscaling to 4K are those released between roughly the mid-2000s and the mid-2010s, during the advent of native digital cinema, but finished in 2K. Everything before was shot on 35mm film, which can be scanned to 4K with information to spare; everything after is shot in native digital 4K or higher.
Good to call this out, I think this is something that's really lost on people.
It really blows my mind that George Lucas, for all his apparent obsessive concern about his films looking dated, chose to shoot Star Wars Episode II in 1080p, whereas Episode I was shot on 35mm film.
I guess 1080p was the shiny cutting-edge thing at the time. 35mm can supposedly be scanned beyond 8K, so you could theoretically consider even 4K filming not good enough either.