1Gbit connection here, tested as topping out at 1130Mbps.
95% of the time I get 2.5-3Mbps in the sim. The servers are likely throttled to make sure all users get a fair shake. It’s always worse on a patch day, but more generally access is a bit like this for me:
Burbles along at a bare minimum, then you get a small burst of speed which peters out quite quickly.
Anyone with experience of this will be able to relate.
Let’s say you have 1GB of data to download. It’s a bunch of files, say 2000, and you have two choices:
- Download each file separately
- Bundle all the files up into a single archive
Option 1 is the least effort but the costliest, as each file needs to be accessed, downloaded, and checked before moving on to the next one. Option 2 should, in theory at least, take less time even with the necessary extraction process at the end, as only a single file has to be accessed, and the connection gets the chance to stretch its legs.
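Some back-of-the-envelope arithmetic makes the gap concrete. The figures below are assumptions for illustration only: a 3Mbps effective rate and a quarter-second of per-file request/verify overhead.

```python
# All figures here are assumptions for illustration, not measured values.
FILES = 2000
TOTAL_BYTES = 1_000_000_000        # the 1GB download, spread across the files
PER_FILE_OVERHEAD_S = 0.25         # assumed request + integrity-check cost per file
BANDWIDTH_BPS = 3_000_000 / 8      # 3Mbps expressed in bytes per second

transfer_s = TOTAL_BYTES / BANDWIDTH_BPS   # raw transfer time is the same either way

per_file_total = FILES * PER_FILE_OVERHEAD_S + transfer_s   # option 1
bundled_total = PER_FILE_OVERHEAD_S + transfer_s            # option 2

print(f"separate files: {per_file_total / 60:.1f} min")
print(f"single archive: {bundled_total / 60:.1f} min")
```

On a connection that bursts and stalls, the real gap is likely wider than this steady-state arithmetic suggests, since every per-file round trip can land in a slow phase.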
The sim takes the second route, but not in the most optimal way. They use multipart archives, and each part has to be downloaded in turn before the entire set can be expanded and the archives deleted. I’m not sure why they chose that method, but going by the file names they look to be incremental in some way: each one appears to be applied in turn so that your sim is updated from your current version to the latest, instead of a single archive that can upgrade any version to the latest.
A single cumulative bundle, containing everything needed to take any install, on any edition, up to the current version, would be the most costly option in disk space but more optimal from a transfer point of view: one large download instead of a chain. The catch is that a good chunk of that data would likely be wasted, depending on your edition and what you actually have installed.
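Going by the file names alone, the incremental scheme might work something like the sketch below. This is pure speculation: the version numbers, archive names, and step-by-step structure are all made up to show why a chain has to be fetched in order.

```python
# Hypothetical incremental-patch chain: each archive upgrades exactly one
# version step, so they must be downloaded and applied in sequence.
# Every name and version here is invented for illustration.
patches = {
    "1.0": ("1.1", "update_1.0_to_1.1.zip"),
    "1.1": ("1.2", "update_1.1_to_1.2.zip"),
    "1.2": ("1.3", "update_1.2_to_1.3.zip"),
}

def plan_downloads(current: str, latest: str) -> list[str]:
    """Return the ordered list of archives needed to get from current to latest."""
    needed = []
    while current != latest:
        current, archive = patches[current]
        needed.append(archive)
    return needed

print(plan_downloads("1.1", "1.3"))
```

A user two versions behind needs two archives; a cumulative bundle would replace the whole chain with one download, at the cost of shipping data some installs don’t need.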
Those who have been gaming since the ’80s or ’90s may remember how patches used to be distributed. Publishers packaged up the diffs and sent those, rather than sending whole new files to overwrite what you have. A true “patch” in that respect, and the file sizes were minimal.
Why replace a 2MB executable with another 2MB executable, when the diff might only be a few hundred KB?
That could be a fun little exercise for the weekend. Take two plane installations: my current one, and an update. Ensure both installs are extracted and available for comparison, then look through the full list of files. Files that are new in the update would be added to the archive in their entirety. Where a file exists in both, but a CRC check indicates it has changed, create a binary diff of the file and add that to the archive instead. I’d be interested to see how small an archive that might be.
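A minimal sketch of that comparison, using only Python’s standard library: `zlib.crc32` for the CRC check, and `difflib.SequenceMatcher` as a crude stand-in for a real binary-diff tool (purpose-built tools like bsdiff would produce far smaller output). The directory layout and labels are my own invention.

```python
import difflib
import os
import zlib

def crc(path: str) -> int:
    # Whole-file CRC32; fine for a sketch, streaming would suit large files better.
    with open(path, "rb") as f:
        return zlib.crc32(f.read())

def binary_diff(old: bytes, new: bytes) -> list:
    """Naive binary diff: 'copy' ops reference bytes the user already has,
    so only the 'data' chunks would need to go into the patch archive."""
    ops = []
    sm = difflib.SequenceMatcher(None, old, new, autojunk=False)
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == "equal":
            ops.append(("copy", i1, i2))       # reuse a range from the old file
        elif tag in ("replace", "insert"):
            ops.append(("data", new[j1:j2]))   # ship only the changed bytes
    return ops

def compare_installs(old_root: str, new_root: str) -> dict:
    """Walk the updated install and decide, per file, what the patch needs."""
    plan = {}
    for dirpath, _, files in os.walk(new_root):
        for name in files:
            new_path = os.path.join(dirpath, name)
            rel = os.path.relpath(new_path, new_root)
            old_path = os.path.join(old_root, rel)
            if not os.path.exists(old_path):
                plan[rel] = "add-whole-file"       # new file: archive it in full
            elif crc(old_path) != crc(new_path):
                plan[rel] = "binary-diff"          # changed file: archive the diff
            # identical files need nothing in the patch archive
    return plan
```

Applying the patch is the reverse walk: replay each `copy` from the old file and splice in each `data` chunk, which reconstructs the new file exactly.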