It's clear that the quoted text was a lie from the start.
There's nothing you could do in an EC2 instance that I couldn't do on my quad core i7 at many times the speed and a fraction of the cost. Even if you matched users one-for-one with large EC2 instances, you'd be looking at hundreds of thousands of dollars an hour (and that many instances don't even exist).
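Back-of-the-envelope, with numbers that are purely my assumptions (nobody outside EA knows the real concurrency figures or what instance size would be needed):

    # Back-of-the-envelope: what "one large EC2 instance per player" would cost.
    # Both inputs below are assumptions for illustration, not published figures.
    concurrent_players = 500_000   # assumed peak concurrency at launch
    hourly_rate_usd    = 0.50      # assumed on-demand price of a "large" instance

    cost_per_hour = concurrent_players * hourly_rate_usd
    print(f"${cost_per_hour:,.0f} per hour")   # -> $250,000 per hour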
Approaching the question from the opposite direction, it's also clear if you have any familiarity with the kinds of per-player CPU budgets that server-side games allocate. The norms for games like WoW aren't anywhere near enough to run even something close to the last-gen SimCity simulation, and that's a 10-year-old game. Unless EA decided to allocate unprecedented levels of per-player server-side CPU, it's unlikely they're doing any significant percentage of the simulation on the server side.
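To put rough numbers on that kind of budget (the core count, clock, and players-per-host figures here are my assumptions, not anything Blizzard or EA has published):

    # Rough per-player CPU budget on a typical game server host.
    # All inputs are illustrative assumptions.
    cores_per_host   = 16      # assumed
    clock_ghz        = 2.5     # assumed
    players_per_host = 2_000   # assumed concurrent players served by one host

    total_mhz      = cores_per_host * clock_ghz * 1000
    mhz_per_player = total_mhz / players_per_host
    print(f"~{mhz_per_player:.0f} MHz per player")  # ~20 MHz: nowhere near a full city sim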
When Diablo 1 and 2 were launched, the single player was 100% client-side. As a result, there was rampant cheating. Jump online with your character, and you had an unfair advantage because of your infinite gold, stacked character and rare inventory.
Diablo 3 went always-online to help solve this problem. Loot discovery, inventory, and fighting outcomes are entirely controlled server-side. While they could have kept separate online-only and offline characters, it's reasonable that they decided to make all characters online-only rather than duplicate the logic and engineering. Not to mention DRM.
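For what it's worth, "controlled server-side" just means the authoritative roll and the authoritative inventory live on a machine the player can't touch. A minimal sketch of the idea, with an invented drop table (none of this is Blizzard's actual code or data):

    import random

    # Toy drop table; the real tables, rates, and item data are Blizzard's, not this.
    DROP_TABLE = [("gold pile", 0.70), ("rare sword", 0.05), ("nothing", 0.25)]

    def roll_loot(rng: random.Random) -> str:
        """Server-side roll: the client only ever sees the result, never the RNG."""
        r = rng.random()
        cumulative = 0.0
        for item, weight in DROP_TABLE:
            cumulative += weight
            if r < cumulative:
                return item
        return "nothing"

    # The server keeps the RNG and the character's inventory; the client just renders
    # whatever it is told, so editing local files changes nothing that matters.
    server_rng = random.Random()
    print(roll_loot(server_rng))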
With Sim City, it's conceivable that they went this way as well.
I'm sorry, but you're wrong here. While Diablo 2 had realms where you could bring your single-player character, most people chose to play on the closed Battle.net realms, where all character info was stored online.
The realms where you could bring your single-player character were just a showcase for people who used character/item editing programs to create insanely stacked gear and characters, and like I said, hardly anyone played on those.
The reason Blizzard went online-only with Diablo 3 and SC2 was that Diablo 2's Battle.net was reverse engineered, and there was an abundance of servers all over the world that you could play on with a fake CD key. I remember that in Eastern Europe specifically we had quite a few servers, and with the average monthly salary being less than $200, hardly anyone could afford to buy a PC game. Even LAN centers ran cracked versions of all the games and hosted their own servers.
Anyway, my point is that you cannot have locally stored game data that can be imported into an online realm and have a direct impact there; people will edit it to create whatever they want. But a game should only require you to be online if it is an online game, period. If a game can be played offline, there is no reason whatsoever to force online authentication just to play in a local environment.
> The reason Blizzard went online-only with Diablo 3 and SC2 was that Diablo 2's Battle.net was reverse engineered, and there was an abundance of servers all over the world that you could play on with a fake CD key.
I don't think that's the only reason. Diablo II is plagued with bots and duping. While the split of data between client and server on the closed Battle.net realms is much better than it was with Diablo (where you could just edit your character data locally to increase your gold or upgrade your inventory online), Diablo II still has the problems of 1) loading the entire level map into memory at once, giving bots the opportunity to path their way to POIs with no effort, and 2) reconciling local inventory with the server's inventory after lag spikes and server crashes, which is hypothesized to be the main method dupers use.
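To illustrate point 1: once the whole level is sitting in client memory, pathing to a point of interest is a textbook graph search, which is why map-hack bots are so cheap to write. A toy sketch (the grid and POI are invented; real D2 maps are obviously not little ASCII grids):

    from collections import deque

    # Toy map: '#' walls, '.' floor, 'S' bot start, 'P' point of interest.
    GRID = ["S..#....",
            ".#.#.##.",
            ".#...#P.",
            "....##.."]

    def bfs_path(grid, start, goal):
        """Plain breadth-first search; with the full map in memory, this is all a bot needs."""
        rows, cols = len(grid), len(grid[0])
        queue, seen = deque([(start, [start])]), {start}
        while queue:
            (r, c), path = queue.popleft()
            if (r, c) == goal:
                return path
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] != "#" and (nr, nc) not in seen:
                    seen.add((nr, nc))
                    queue.append(((nr, nc), path + [(nr, nc)]))
        return None

    start = next((r, c) for r, row in enumerate(GRID) for c, ch in enumerate(row) if ch == "S")
    goal  = next((r, c) for r, row in enumerate(GRID) for c, ch in enumerate(row) if ch == "P")
    print(bfs_path(GRID, start, goal))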
But, as could be expected, both botting and duping happen in D3 anyway. And, to your point, during the D3 beta period several devs were able to reverse-engineer the D3 protocol and stand up a local server. shrug
In addition to your point, Diablo 3 also introduced a real-money auction system, which required far deeper control over inventory, etc. Online-only is a fairly obvious choice for such a system.
I think that there were a lot of players that were frustrated when they made a character offline, and then got invited to play with a friend on battle.net and had to start from scratch. I assume that frustration was part of the reason (among others) that they made all characters online/battle.net characters.
Single-player cheating, a.k.a. "God Mode", was considered a feature in early versions of SimCity. Multiplayer requires a network connection either way. As long as multiplayer is optional, cheating issues should not be an impediment to single player.
There are gamers who prefer to play with unlimited resources and complete control of the situation. Many prefer a sandbox where other gamers cannot mess with their experience. Are you telling me Maxis is in the business of getting between the consumer and their game? That's a losing business proposition if it's true.
I assumed they were talking about the 'world' economy, and if that's the case it may be both true and irrelevant.
Ex: ~20,000 player cities are uploaded into a model. They do a simple calculation based on excess energy, pollution, etc. The results of that are fed back down, and then they run the model again, adjusting for new cities and client city updates. Now, even if 10 MHz per city is used, you're talking about 200 GHz worth of processing, which is far more than an i7, but it's shared and mostly irrelevant in single player since you could just as easily fake the global numbers.
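As a sanity check on that arithmetic, and to show how trivially such a "world model" tick could be structured, here's a sketch where the per-city fields and numbers are all invented for illustration:

    # Toy "world economy" tick: aggregate per-city summaries, feed a global adjustment back.
    # City data and the 10 MHz figure are assumptions for illustration only.
    cities = [{"excess_energy": 5.0, "pollution": 2.0} for _ in range(20_000)]

    global_energy    = sum(c["excess_energy"] for c in cities)
    global_pollution = sum(c["pollution"] for c in cities)
    adjustment = {"energy_price": 1.0 / (1.0 + global_energy / len(cities)),
                  "smog_factor":  global_pollution / len(cities)}

    # The claimed budget: 10 MHz of a core per city, across every uploaded city.
    print(20_000 * 10, "MHz total")   # 200,000 MHz = 200 GHz, shared across the whole fleet
    print(adjustment)                 # fed back down before the next tick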
While I agree that this is likely not truthful, I do wonder whether part of the strategy of offloading more work to the cloud is to ease a transition to mobile clients (iOS/Android), where local processing capacity is more of an issue, or to platform-neutral approaches, or to a "take your city anywhere" play model. (I doubt this is the case, because unless they have a motivation to keep it secret, it would make for a much better explanation of why they'd want to keep state in the cloud, given the trend towards more mobile gaming.)
EA's claim struck me as really odd too; this just isn't the way games are made [yet?], and if they had managed to get something like what they claimed up and running, there would be much more interesting technical aspects that they probably would have done press releases about. In this day and age, standing up a system like that is still a major accomplishment, and the details of how they got around things like processing power and network bandwidth on consumer connections would be really interesting to the tech community.
TL;DR: an easily verifiable claim by EA that stunk from the beginning, proven wrong by anyone who knows how to use Wireshark.
Forget Wireshark; the article implies that some of the critics tried the good old-fashioned "yank the Internet cord and see how long it takes to break" method and got 20 minutes of playability.
You can only offload processing that has barely any latency requirement. If you send loot or hit calculations (like D3) off to a cloud server and it takes minutes to get an answer back, the game becomes unplayable.
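A quick hedged budget shows why: a server round trip has to fit inside the interval between simulation ticks, and the tick rate and RTT below are typical assumptions rather than measurements from any particular game:

    # Latency budget for a server-side hit/loot calculation.
    # Tick rate and round-trip time are illustrative assumptions.
    tick_rate_hz    = 60
    frame_budget_ms = 1000 / tick_rate_hz   # ~16.7 ms per simulation tick
    round_trip_ms   = 80                    # assumed RTT to the data centre

    print(f"frame budget: {frame_budget_ms:.1f} ms, round trip: {round_trip_ms} ms")
    # The round trip alone blows through several frames, so anything latency-sensitive
    # (hit resolution, movement) either stays local or gets hidden behind prediction.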
Maybe not the case for SimCity, but for Diablo 3 a lot of game calculations are done server-side to prevent hacks of the item drop rate and item duplication. Because Diablo 3 has a real-money economy, it's crucial to ensure that items in the game are authentic and not acquired through mods/trainers/cheat programs.
Anyway, offloading processing to the server does have its benefits and uses, just maybe not in this case with SimCity (though I'm not sure, having no experience with the series).
It's clear that it's a lie because, if it were true, it would've been delivered in the form of an awesome tech demo and not as an excuse for a broken game.
That said however...
If there's a shared world, then running its simulation server-side makes sense. Not the game minutiae, but global state. Something like weather, simulated stock markets, etc. The environment, basically. That's not to say that SimCity has any of this, because it doesn't.
What reasonable company would justify spending tens of thousands an hour on server costs, when they could optimise their code a little more and run it for free?
So let's assume that the average person wanting to play SimCity is running a Core2Duo (released 7 years ago). It's hard to find benchmarks directly comparing a Core2 E6600 to something like an Ivy Bridge Xeon that you'd expect to find in a modern dual-socket 1U server, but even looking at a TomsHardware chart of x86 core performance can tell you that an i7-2600k is only about twice as powerful as a Pentium 4 HT660 (core-for-core) http://www.tomshardware.com/charts/x86-core-performance-comp...
Bottom line, the total cost of ownership doesn't at all make business sense to do right now. In 10 years, it very well might.
The core count makes a huge difference in the real world that doesn't show up in single-core benchmarks. The HT660 was a good chip in its day, but it's at a four-to-one disadvantage for code that's multithreaded and/or running on a busy PC.
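A rough illustration of that four-to-one point, using normalised per-core scores that are assumptions rather than real benchmark data:

    # Hypothetical normalised per-core scores (not real benchmark numbers).
    per_core = {"Pentium 4 HT660": 1.0,   # assumed baseline
                "i7-2600K":        2.0}   # "about twice as powerful core-for-core", per the chart
    cores    = {"Pentium 4 HT660": 1,     # one physical core (Hyper-Threading helps a bit, ignored here)
                "i7-2600K":        4}

    for chip in per_core:
        print(chip, "total throughput ~", per_core[chip] * cores[chip])
    # Single-threaded the gap is ~2x; on well-threaded code it opens up to roughly 8x.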
I think Sandy Bridge is my favorite CPU of all time. It does a truly massive amount of work without consuming significantly more power than the part it replaced.