I am completely annoyed that during sim updates or Content Manager updates the process runs sequentially, in a loop of start download → stop download → decompress → start next download. First, this unnecessarily stops the downloading process. Second, it massively extends the already huge sim update time. I recently had to abort a livery update in the Content Manager because decompressing each livery took far longer than its download, and the download of the next livery was effectively halted while the previous one was being decompressed. As a result, a 300 MB download overall was taking ages to complete. This looks, umm… unprofessional. How can we talk about improving multithreading when something as simple as the updater is single-threaded “up to its eyeballs”? This desperately needs a fix: run the downloading and decompressing on parallel threads. There is no reason whatsoever for them to run sequentially.
Five votes in a day. That tells you all you need to know about the quality of the voting system on this forum. Sigh…
I posted a similar request in the bugs area (my bad - didn’t realise features were requested separately), my post is replicated here to support this request and provide additional info:
The download, save, decompress and install process for both the initial install and subsequent updates is very slow and could be greatly improved.
When updates reach several gigabytes in size, often approaching the size of a fresh install of many games, the process is made even slower by performing these tasks sequentially.
Most users have a high-spec PC for running MSFS 2020, often containing SSD disks, high-speed broadband, and large amounts of RAM, which is quite capable of performing these tasks asynchronously (in parallel).
This approach allows download throughput to remain high, since downloading is no longer paused while saving occurs (see the Steam download window as an example). It also gives the user better visual feedback (see the Visual Studio Installer as an example, which provides a progress bar for each process) and can drastically reduce the overall install time for every update.
Some applications even offload much of the processing to the GPU for manipulating and caching the streamed data (see Steam), thereby freeing up the main CPU for other tasks (such as saving the data to a file).
The end result is a greatly reduced “start” to “play” time when updates are needed, as well as an improved display of overall progress, letting users see real progress instead of progress bars pausing while unmonitored processes take place.
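To put a rough number on the benefit, here is a small sketch with entirely made-up per-package timings (all figures hypothetical): the sequential loop pays for every download *and* every decompress, while an overlapped pipeline is bounded by whichever stage dominates.

```python
# Hypothetical per-package timings, in seconds: (download, decompress).
packages = [(10, 25), (8, 20), (12, 30), (5, 15)]

# Sequential: each package fully downloads, then fully decompresses,
# before the next download even starts.
sequential = sum(d + x for d, x in packages)

# Pipelined (one download thread + one decompress thread): the decompress
# thread can only start a package once it is downloaded, and handles
# packages in order. Simulate the two-stage pipeline.
t_download = 0.0  # time at which the download thread becomes free
t_install = 0.0   # time at which the install thread becomes free
for d, x in packages:
    t_download += d                             # download finishes here
    t_install = max(t_install, t_download) + x  # install starts when both ready

print(f"sequential: {sequential}s, pipelined: {t_install}s")
# With these numbers: sequential 125s vs pipelined 100s, and the gap
# grows the more evenly the two stages are balanced.
```

The exact savings depend entirely on the real download/decompress ratios, but the shape of the argument holds regardless.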
In line with the pinned forum post:
- Check if the issue has been identified in our [Zendesk articles ](link removed) or [Known issues ](link removed)
^^ Both lead me to the same search (I’d love to be able to type “:” without being offered an emoji):
flightsimulator zendesk DOTCOM /hc/en-us/search?page=4&query=sequential+download&utf8=%E2%9C%93#results
And searching this thread :
/search?q=sequential%20download%20category%3A211
As you can see, all seem to be about the download loop issue, reset to 0% or constantly download the same package.
Do you have any add-ons in your Community folder? If yes, please remove and retest before posting.
- This is fundamental to initial setup as much as ongoing updates.
Are you using Developer Mode or made changes in it?
- No, the product behaves this way out of the box.
Brief description of the issue:
- The initial download is 243 elements; World Updates and others can be dozens. The download/update manager sequentially performs Download, Decompress/Install, Download, Decompress/Install…
Provide Screenshot(s)/video(s) of the issue encountered:
- Easily replicable on any clean install or by nuking packages and re-downloading (or awaiting next update). Screenshot seems overkill here and this forum interface is annoying enough.
I take it back, your forum handled a Ctrl+V kudos!
Here’s a cheeky screenshot of the issue:
Detail steps to reproduce the issue encountered:
- Install the base Product.
- Launch the base Product and select a Packages Destination
- Watch it make a thousand cups of tea sequentially.
PC specs for those who want to assist (if not entered in your profile)
- Pretty irrelevant here, but it’s an older i7 (Devil’s Canyon) at base clock, 24 GB DDR3, a GTX 1660 6 GB, a 1 TB SATA SSD for the OS and a 1 TB NVMe drive for games (including FS). I’m on a 500 Mbps line, and because of the “burst” or windowing behaviour of TCP, your download method means I rarely reach 150 Mbps. (To be clear, I am not confusing bits and bytes: my other platforms hit just shy of 60 MB/s, which is around 550 Mbps, and I can download another 100 GB game in around an hour, so this is demonstrably slower.)
(I know in this realm, MS servers will always top my broadband and give me max across Windows, MSDN, Xbox … you name it, I’ve 100% faith in you here.)
Build Version # when you first started experiencing this issue:
- This seems to be by design, the product always behaved this way.
Are you on the Steam or Microsoft Store version?
- Steam, but the behaviour occurs in the in-game download / update manager, so it’s safe to assume this impacts all users.
Did you submit this to Zendesk? If so, what is your ticket #?
It’s not a live technical issue, so a ticket doesn’t seem appropriate at this stage; I just suffer the behaviour. But it does make me wonder about your other netcode, and whether we’re pulling all that live map and traffic data efficiently.
Basically, it seems like a massive architectural oversight in the download manager at least, and I’m not seeing much chatter about it, probably because the game’s sheer size overwhelmed a lot of people’s broadband. But I do recall a massive uproar because the download time being forced in-game was burning through Steam players’ playtime, making refunds impossible.
Excellent post, and it would be great to see it discussed here on the forum. However, I do recommend opening a ticket with your observations, as sadly a lot of the bugs and “room for improvement” topics we discuss here never get brought to the attention of the developers.
Given that so much of the game is streamed this indeed is worthy of discussion.
It’s not an issue for me that the download isn’t completed in full before the decompressing/installing starts. But it also doesn’t sound wrong for the developers to think about it, and it could improve the update process. I assume there was a reason for the decision to split the download this way.
In general I had/have zero issues with downloading.
The Steam refund issue is a completely different thing. It’s a failure of how the game is integrated into Steam, and your suggestion won’t change that, simply because not everyone in the world has a 500 Mbit connection.
Thanks for your feedback! This convinced me to create:
Request #110618 In-Game Downloader / Updater runs sequentially instead of a constant download thread with decompression and installation on another thread.
Not sure of the spirit of the forum here; this was never a problem for me personally, but if it needs a solution, I figure somethingbrite has offered the smartest path.
If it’s left open for more discussion too, that’s cool.
With regards to the Steam refunds, I agree: I have a great rate where I am, and this would never have impacted me. But somewhere between “the worst rural broadband” and me lies a person, and a download rate, where the bottleneck might have made the difference within the two hours of playtime. I personally acknowledge that the download time for me is only 10–20% inefficient; that simply means making one more coffee, no biggie, and I’d still technically have qualified for a refund. (I’d never have asked for one: it’s a great piece of software, well worth the money!)
I do get the integration thing, but when I see Steam integrating Rockstar, EA, Epic and other vendors by installing 100% of the game data into the SteamLibrary folder solely from the Steam client, I wonder if that particular root cause lies on Microsoft’s shoulders as well.
As somethingbrite brings us back to: the overall concern about the netcode and the streamed nature of the software is where the mind wanders when you see a download manager exhibiting threading and netcode behaviour that is pre-2000.
Do you have any add-ons in your Community folder? If yes, please remove and retest before posting.
N/A
Are you using Developer Mode or made changes in it?
N/A
Brief description of the issue:
The installer seems to be extremely inefficient. It will download a small file, then stop downloading to decompress and install it, then re-establish a connection and attempt to download the next file; rinse and repeat. This takes far, far longer than it should. Why not download the entire package (so that the download speed can ramp up to its maximum) and only then start to decompress and install the files? Many of us have to wait entire days for the download and install, even with high-speed Internet connections.
Provide Screenshot(s)/video(s) of the issue encountered:
N/A
Detail steps to reproduce the issue encountered:
Download any update or install the sim from scratch.
PC specs for those who want to assist (if not entered in your profile)
N/A
Build Version # when you first started experiencing this issue:
From the sim’s release
Are you on the Steam or Microsoft Store version?
Microsoft store
Did you submit this to Zendesk? If so, what is your ticket #?
116350
I just noticed that MSFS2020 is using a multithreaded downloader when you are downloading multiple World Updates.
This is something that has annoyed me a lot since launch (to this day I can’t stand hearing the MSFS theme). The lack of parallelisation today is unforgivable.
But I suspect this practice comes from Windows Update, which seems to behave the same way. The problem is that Windows updates run in the background, so we don’t even notice they are happening… Which gives me an idea: why not integrate the sim’s updates into Windows Update, just as Office does?
The installer needs to be rebuilt so that:
- It can either download and decompress simultaneously, or download all the files at full speed and then decompress them in one go. The constant stopping and starting means it barely uses 50% of my connection on smaller files.
- It can verify individual files with MD5 hashes against a manifest, and replace any files that have been modified at all.
- It can detect when it is hung or a download is corrupted, and automatically re-download the affected files. Needing to manually delete files and restart the download is not something I have experienced with any other software, ever. A really, really poor and disappointing state of affairs.
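The manifest-verification point above needs nothing beyond the standard library. A minimal sketch follows; the manifest format (relative path → MD5 hex digest) and the function names are invented for illustration, not anything the real installer exposes:

```python
import hashlib
from pathlib import Path

def md5_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute the MD5 hex digest of a file, reading in chunks so
    large package files never need to fit in memory at once."""
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def files_to_redownload(manifest: dict[str, str], root: Path) -> list[str]:
    """Return relative paths whose on-disk MD5 is missing or differs
    from the manifest entry, i.e. files that need re-downloading."""
    bad = []
    for rel_path, expected in manifest.items():
        p = root / rel_path
        if not p.exists() or md5_of(p) != expected:
            bad.append(rel_path)
    return bad
```

With a manifest shipped alongside each update, the installer could repair only the damaged files instead of asking the user to delete and re-download whole packages.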
Given that reinstalling is recommended as part of troubleshooting, and how many bugs and stability issues keep cropping up with every update, I suggest dropping everything and building a new installer. I have spent 7.5 hours reinstalling tonight. I need to go to bed.
As it currently exists, the Installation Manager appears to operate in the following manner:
- Get list of packages
- Add each package to download/install queue
- Walk the queue:
- Download package
- Install package (your network connection is idle during this time)
- Remove package from queue
This creates periods of time when the installation manager is decompressing and installing the content, leaving the network connection idle. For small updates, this may be inconsequential in terms of install time. But for large updates (multiple GB), or fresh installs, this can add a lot of time to the overall installation.
I know, Asobo are probably trying to conserve disk space for machines that might not have enough to store the entire update both compressed and installed. But many machines do have the space to grab the entire update and install it. Asobo knows, for any given update, both the compressed and installed sizes, and could determine whether the target computer has enough space for this approach.
Here’s what I would love to see:
- Get list of packages
- Evaluate disk space requirements (downloaded + installed) against a target free space threshold
- If sufficient space does not exist, clear a feature flag for parallel download
- Create download queue
- Add all packages to download queue
- Create installation queue
- Start background download thread
- Get first download work item from queue
- Download package
- Add downloaded package to installation queue
- Start background installer thread if not already running
- Remove work item from queue
- If parallel download is disabled, or this is the last download in the queue, await the installer thread completion
- Repeat until download queue is exhausted
- Installer thread (started as needed by download thread)
- Get first install work item from queue
- Install package
- Remove work item from queue
- Repeat until install queue is exhausted
In this manner, downloading takes place constantly, rather than waiting for installation work to finish. It’s OK for installation to wait on downloading, because you don’t yet have the content. The reverse doesn’t hold, though - downloading doesn’t have to wait on the last package that downloaded.
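The two-thread scheme above, including the disk-space feature flag, can be sketched in a few lines of Python. Everything here is a hypothetical illustration of the queue/thread structure, not the real installer's code; `download` and `install` stand in for whatever the real work is:

```python
import queue
import threading

def run_pipeline(packages, download, install, parallel=True):
    """Download packages on the calling thread while a background
    installer thread drains a queue, per the scheme above. Pass
    parallel=False (e.g. when the disk-space check fails) to wait
    for each install to finish before starting the next download."""
    install_q = queue.Queue()
    done = object()  # sentinel marking the end of the work

    def installer():
        while True:
            pkg = install_q.get()
            if pkg is not done:
                install(pkg)
            install_q.task_done()
            if pkg is done:
                break

    worker = threading.Thread(target=installer)
    worker.start()
    for pkg in packages:
        download(pkg)
        install_q.put(pkg)
        if not parallel:
            # Not enough disk space to overlap: block until this
            # package has been installed before downloading the next.
            install_q.join()
    install_q.put(done)
    worker.join()
```

The feature flag collapses cleanly into the `parallel` argument: same loop, same queue, just an extra `join()` per package when overlapping is not allowed.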
But not all, and that’s the point. All players should be able to install MSFS, not only a small portion of them. And we already have lots of players reporting low disk space during installation. (Which doesn’t mean the mentioned strategy is bad per se.)
I’m not suggesting otherwise - what I’m saying is that Asobo can easily tell for a given payload to download/install whether this approach would be possible, and use it if that’s the case.
The mechanism I posted above would support that by way of a feature flag: if there’s not enough space, set a flag to disallow overlapped download, and then, after starting the background installer thread, await that thread instead of proceeding to the next download.
Single overall process, simple feature flag to control if it’s used.
I’ve edited my approach above to include this check.
Yes, as said, “not bad per se”. What I see is that it makes things more complex again, and some users may not know what to do with such a setting (if you meant a user setting).
But the developers can collect ideas, and maybe they can implement some improvements; I too see the benefit if the download runs constantly. I think it matters most for the first installation; the updates themselves are “okay” (just smaller).
Well, yeah, I agree that more user options don’t necessarily make things better; I think this could just be something the installer handles for you.
I’m in the midst of wiping and reinstalling after two+ years, and I’m watching the process on a machine with 1.5TB of free disk space, on gigabit internet, watching my low network interface utilization, and idly thinking things could be… optimized.
And of course, in deciding to reinstall, I’d already observed “why are the most beautiful things the most fragile?” Because all you need to do is lurk these forums to know that.
I notice that there is no parallel work:
- When one package is “decompressing”, the network does nothing. Instead, it could already be downloading the next package.
There could be a “download queue” and, at the same time, a “decompress queue”.
Our CPUs are multi-core and multi-threaded, aren’t they?
