Where do I vote, please?
Same problem here with an AMD Radeon 8500, but it worked perfectly at 30 fps before… I can only fly in clear skies lol… come on Asobo, move your body
Surely it can be resolved, as it was working fine before update 5. The question is: will they fix it? We are just 3 months from launch and we have this. So much for the "Set Your Experience" on the opening screen. What experience are they talking about?
At the top of the thread, next to the thread's title, click the "Vote" button right below the vote counter. Right now it's at 210 votes, so that or a similar number is what you should be looking for.
Anyway, I see you have a Radeon 8500; that GPU is from 2001, so it couldn't possibly work at all. Can you specify the exact model number? Maybe it's the newer Radeon HD 8570? This is the first time someone has reported this issue on an AMD GPU, so that would be helpful information for finding out exactly which architectures are affected.
Sorry, it's an AMD Radeon HD 6900 Series… clearly not in the minimum requirements, but it worked perfectly before.
Thanks haha, I didn't see it! Voted, hope we will find a way guys <3
Thanks a lot for your report. This probably means we can confirm that the TeraScale 3 architecture is affected, along with Kepler and Maxwell 1.0 from NVIDIA. Quite interesting. I will update my Zendesk report with this finding and see if I can find anything in common between those three architectures that would cause this issue. If they do plan to fix it eventually, more information will help them.
A bad and costly experience, apparently.
Since you guys are talking AMD, I pulled my NVIDIA card and tried an old Radeon R7 265. The clouds looked normal. Unfortunately, the frame rates were horrendous and unplayable.
Thanks for the report. The R7 265 uses the GCN 1.0 architecture, which first appeared in 2012, the same year as the Kepler architecture. This would be an interesting thing to add to a Zendesk report, if you have made one.
GCN 1.0 is much superior to Kepler when it comes to DirectX 12, Vulkan, and compute workloads, but Kepler is said to be superior in DirectX 11 games. Even if this cloud system change was some sort of preparation for DirectX 12, that couldn't matter yet, since the simulator still runs on the DirectX 11 API, so I'm not sure why it would result in a massive visual glitch rather than just reduced performance.
If I find some time, I should try my old HD 5770 as well, which is TeraScale 2. A YouTube video showed it to be barely playable at 640x360 with the lowest graphics settings, though that was before the patch that brought the cloud flickering.
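For anyone keeping track, the reports in this thread so far can be summarized in a quick Python sketch. The architecture labels are the ones mentioned in the posts above, from memory; please double-check them before putting them in a Zendesk report:

```python
# Architectures reported in this thread so far: name -> do clouds render correctly?
clouds_ok = {
    "Kepler":      False,  # GTX 770 / 780 / 780 Ti / Titan, plus laptop variants
    "Maxwell 1.0": False,  # e.g. the GM107-based laptop GPUs sold until 2017
    "TeraScale 3": False,  # Radeon HD 6900 series
    "GCN 1.0":     True,   # Radeon R7 265: clouds normal, frame rate unplayable
    "Maxwell 2.0": True,
}

# Collect the architectures with flickering clouds (dicts keep insertion order).
affected = [arch for arch, ok in clouds_ok.items() if not ok]
print("Affected:", affected)
```

If the HD 5850 test happens, TeraScale 2 can be added to the table as well.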
Good idea. I also have an HD 5850 I can try later today, just for curiosity's sake.
Thanks, it would be a bit of a hassle to bring back my HD 5770, so if you could run the test it would be much appreciated. Same architecture, but around 50% faster, so the game could be playable enough at around 720p.
I think you did not read the thread and the majority of its posts correctly. Nobody said the game should work on GPUs released in 2009 or so; these were merely tests on old leftover GPUs so that we can pinpoint the affected architectures, maybe find what is actually causing this issue, and report to Asobo to help toward a potential fix. The affected architectures are listed in the original post.
The fact is that GPUs like the GTX 770, GTX 780, GTX 780 Ti, GTX Titan, and GTX Titan Black all exceed the minimum requirements, and yet they are experiencing this issue because of a stealth change that was not communicated at all. There was nothing in the patch notes about it, and apart from more grain, there is no discernible difference between the old and the new cloud rendering systems. Would you even know the cloud system was updated if it weren't for these threads explaining the flickering problem?
Yes, we did read the minimum requirements before purchasing the game; in fact, the GTX 770 is still listed as the minimum requirement in the Microsoft Store and on Zendesk. And if you're going to point out that it was released in 2013, GPUs with the affected architectures continued to be released in laptop variants until 2017. Please check all the facts before posting, and ease up on the insults. If your GPU also becomes obsolete some day without any notice and without a given reason, I can guarantee you will not like it.
Voted - I do agree that it is unfortunate, and hopefully the devs can put in a fix for older cards that doesn’t handcuff visuals and performance on newer ones (especially with DX12 and ray tracing incoming).
But having taken the plunge into a current-gen video card, I can now see what I've been missing, especially since there is such a monumental difference in performance scaling just from increasing GPU power. And there is no comparison in visuals between Medium and Ultra… hopefully with AMD coming out swinging, the order-of-magnitude performance improvement may become a bit more palatable / affordable…
A 770 is the absolute minimum, yes, and even my 1070 needs an older driver than the current one to avoid in-game stutters, one released around update 4 or 5 (that driver version was fine; then the driver updated and it stutters consistently). But the AMD variants are not listed. And the clouds do look different to my eyes: since update 5 they seem more reactive to ground lighting and sunset lighting, with more vibrant responses to sunset changes, and clouds over heavily populated areas appear orange, akin to the night lighting of the ground. Pre-update 5 they appeared more white, to my recollection.
I just see so many posts in so many forums focusing on older hardware that people have held onto, expecting it to work with a modern title. I did read the initial post as well as a majority of the remaining posts, and it seriously sounded like a bunch of people complaining that their 6+ year old GPUs couldn't hack it anymore. Someone even complained that the game stopped launching at release, only to find out their GPU was below min spec; they upgraded their card and now it works fine. It just seemed like this thread was leaning toward "give me back the old feature because my old card can't handle the new one," while users with adequate hardware may well enjoy whatever the new cloud system develops into.
We will never move forward if people keep asking Asobo to rewind code to a prior state for the few people with significantly older hardware. The 770-and-up issue should be addressed, given that the listed minimum spec covers those cards, but mobile variants are not covered. A GTX 770 and a GTX 770M are NOT the same capability-wise, and the specs do not clearly say which mobile variants are supported; one can only assume a mobile GPU qualifies if it meets or exceeds the speed and memory amount of the listed minimum (there's more to a GPU than that, but few people understand the total integration of a part, and specs for mobile GPUs are hard for the common layperson to find).
I apologize for coming off gruff; I just don't want the forward momentum Asobo is trying to maintain to get bogged down for the few people with significantly older, lower-end hardware. In due time, yes, it needs to be remedied, but not at the cost of moving fully forward.
I completely agree that compatibility with older GPUs should not be a priority over progress, and personally I won’t mind if my GPU is not even able to start up the game when the DirectX 12 update arrives. But I don’t like how they’ve handled this one so far. If there was indeed a change in the cloud system, it should have been mentioned in the patch notes. And I understand that they cannot test patches with each GPU out there, but we’ve been providing detailed reports for almost two weeks now, and yet the minimum requirements haven’t been modified at all.
We've also been getting mixed responses from support: most of them suggest that our GPUs are now obsolete and that flying with clouds will no longer be an option, while a few say that a potential fix is in the works. The latter reads somewhat like a canned response, but it gives us hope. And then those who reported that the clouds work fine on their Intel iGPUs but not on their NVIDIA dGPUs receive a robotic response that makes no sense.
I also agree that a great GPU makes a big difference and that the simulator truly starts flexing its visual strength at High settings and above, but I've found that even with a mix of Low-Medium settings on my GPU, it still looked far better than the previous simulators, with the same mediocre performance. I particularly loved the gorgeous lighting and the way it interacted with the clouds, and that has now been taken away. I will continue to gather reports and hope that Asobo do something about it, but if they make it clear that supporting these GPUs will get in the way of progress, then I will stop.
Regarding my thoughts on maintaining compatibility with the GPUs close to the minimum specification, I’ve covered that in my reply just above.
Thank you for relaying your observations regarding the differences with the new cloud system. As you can probably tell, I cannot see them for myself, and whenever I do some research I only find people complaining about the increased grain, which somehow seems to be different for everyone, and it does truly look terrible in some people’s screenshots.
As for mobile variants not being covered, I can't say I completely agree with that. The GTX 770M uses the GK106 chip, compared to GK104 on the GTX 770, but apart from overall throughput, the two chips use the same Kepler architecture with the same feature set; the only difference between them should be performance and nothing else. Even if support insists otherwise, I haven't seen any other case of pure GPU speed causing visual artifacts like this. The other laptop variants I mentioned earlier use the GM107 chip, which is based on the newer Maxwell 1.0 architecture and is not missing any critical features for DirectX 11 rendering either.
To fully explain why we brought the HD 5770 and the HD 5850 into the discussion earlier: we are simply testing old leftover GPUs so that we can compare different architectures and maybe find something in common that could explain what causes this flickering cloud issue. For example, what are the differences between Maxwell 1.0 and Maxwell 2.0, and between TeraScale 3 and GCN 1.0? And then, what do Maxwell 2.0 and GCN 1.0 have in common that allows the new cloud system to be displayed without any issues? Of course we would not demand a fix for GPUs that were never part of the minimum requirements to begin with, but if we could find anything that would help Asobo make an easy fix for this issue, then it will be worth it.
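One concrete thing to check along those lines is each architecture's maximum Direct3D 11 feature level. I'm listing these from memory, so treat this as a hypothesis to verify rather than a confirmed cause, but the split does line up suspiciously well with the reports in this thread:

```python
# Maximum Direct3D 11 feature level per architecture (from memory -- please verify).
feature_level = {
    "TeraScale 3": "11_0",
    "Kepler":      "11_0",
    "Maxwell 1.0": "11_0",
    "GCN 1.0":     "11_1",
    "Maxwell 2.0": "11_1",  # supports up to 12_1, but 11_1 is what matters under DX11
}
broken  = {"TeraScale 3", "Kepler", "Maxwell 1.0"}   # flickering clouds
working = {"GCN 1.0", "Maxwell 2.0"}                 # clouds render fine

# If the new cloud shaders quietly started relying on an 11_1-only capability,
# that would match exactly which architectures flicker and which do not.
print(all(feature_level[a] == "11_0" for a in broken))    # True
print(all(feature_level[a] == "11_1" for a in working))   # True
```

If anyone can capture the game with a tool like RenderDoc on an affected card, that would be a much more direct way to confirm or rule this out.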
Exactly, ChaoticSplendid, great reply. Yes, it is just the "geek" in us that is curious about the impact on different (and older) chipsets: why it works on severely outdated cards but not on supposedly "supported" graphics cards. That does not excuse the fact that people purchased this based on published specs, and it is not working on some cards that meet those specs. Some folks can easily go out and purchase a new card. Others may be in situations where that is not currently an option, but they should be afforded the opportunity to use the sim after purchasing it and meeting the specs. I have no doubt that the title will continue to push forward and, at some point, necessitate upgrades, perhaps even tech that is currently not available. But this should not happen a couple of months into a release.
Man, that's not the problem. I run DayZ and other games as well; I have an i7 and 16 GB RAM, and normally it's okay!
I'm running a GTX 660 and found a quick fix that works for me:
ALT + ENTER (exit fullscreen)
Move the window to another monitor
ALT + ENTER (re-enter fullscreen)