With the power of AI these days (2023/24 AI, rather than 2020 AI), I'm surprised the autogen buildings in FS2020 still look so poor in many places.

Well, I don’t think they’ve ever told us for sure how they do what they do, but studying that image, it looks to me like every single home there is different. If that’s the case everywhere, that would be a huge number of models to download.

I know they do have ‘Procedural buildings’ as part of their updates, but I don’t know what those really are or do. It would make sense that they are things like these homes but, if so, I have to say the artists are not doing nearly as good a job as I think they could and should, IMO.


Actually, I think it is. MSFS calls them “procedural buildings” in the Content Manager, and there are so many different shapes and sizes that I highly doubt these are all fully pre-made models.


The Blackshark buildings are all procedurally generated from what I can see. If you muck about in dev mode building your own sceneries you can see them being procedurally made in real time. They just form out of thin air.


Even if the buildings are procedurally generated (which I’m not doubting), the system which produces them would have been created by a developer. For some reason it seems to have gone a bit over the top with those roof details.


I’ve never seen that. That’s really cool. Even in that case, I would expect the building parts are probably made by hand, then assembled procedurally by the algorithm.

We are overdue for a new version of Blackshark. I hope 2024 comes with a revised autogen world and not just a host of missions and challenges with the same scenery and rendering engine as 2020.


From what I gather, you could think of it as the ingredients of a recipe: basic building blocks such as textures, doors, windows, and roof tiles. Blackshark then generates the building within those constraints, following what it sees in the satellite photo.
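To make that recipe idea concrete, here is a minimal, purely illustrative Python sketch: a small library of hand-made parts combined into one building, constrained by a footprint taken from imagery. The part names, fields, and rules are all invented for illustration and are not Blackshark’s actual pipeline.

```python
import random
from dataclasses import dataclass

# Hypothetical part library: hand-made "ingredients" the generator can combine.
PART_LIBRARY = {
    "wall_texture": ["brick_01", "stucco_02", "siding_03"],
    "door": ["panel_door", "glass_door"],
    "roof": ["gable_tiles", "hip_shingles", "flat_gravel"],
}

@dataclass
class Footprint:
    width_m: float      # taken from the satellite image
    depth_m: float
    est_floors: int     # estimated from shadows / regional statistics

def generate_building(footprint: Footprint, seed: int) -> dict:
    """Assemble one building from the part library, constrained by the footprint."""
    rng = random.Random(seed)  # deterministic per-building seed keeps the world stable
    windows_per_floor = max(1, int(footprint.width_m // 3))  # rough spacing rule
    return {
        "wall": rng.choice(PART_LIBRARY["wall_texture"]),
        "door": rng.choice(PART_LIBRARY["door"]),
        "roof": rng.choice(PART_LIBRARY["roof"]),
        "floors": footprint.est_floors,
        "windows_per_floor": windows_per_floor,
    }

print(generate_building(Footprint(width_m=9.0, depth_m=7.5, est_floors=2), seed=42))
```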

I think it is a little more complex than that. For instance, in South Africa malls are all over the place, and many of them have pretty irregular shapes. The AI DOES get the shape of the building correct, but it seems to make them all warehouses. Similarly, if you pass over a tin-shack squatter camp, it turns that into a really funky-shaped warehouse. The general shape of the area is used, but for larger buildings it then seems to use textures to make up the building based on that shape. That doesn’t always seem to apply, though: it does for larger areas, but for things it designates as houses, they seem to be randomly selected house models. So it seems the AI tries to determine what type of structure it is, then determines the best way to handle it.


That could be down to either incorrect or misinterpreted metadata defining that area. I’ve seen some very odd-shaped buildings that look like the hangars you might see at a Boeing factory, where in fact they should be a shopping mall, as you point out. It knows how to build the building’s shape, but doesn’t know to use different parts to build it up.

Some may remember the Melbourne Monolith, where an incorrect entry in the data caused a giant skyscraper to be created.


It does indeed appear to have a bias towards warehouses; they are everywhere.
It seems that when it can't make out what something is, a warehouse is what it falls back to:

Is it a house? Yes >> place a house.
Is it an apartment building? Yes >> place an apartment building.
And so on…
If the answer is no to everything, then place a warehouse and move on.
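That kind of fallback cascade might look roughly like this toy sketch (the thresholds, inputs, and type names are invented for illustration; this is not the actual Blackshark classifier):

```python
from enum import Enum, auto

class BuildingType(Enum):
    HOUSE = auto()
    APARTMENT = auto()
    MALL = auto()
    WAREHOUSE = auto()  # the catch-all

def classify_footprint(footprint_area_m2: float, aspect_ratio: float,
                       in_residential_zone: bool) -> BuildingType:
    """Toy decision cascade: try each specific type, fall back to warehouse."""
    if in_residential_zone and footprint_area_m2 < 250:
        return BuildingType.HOUSE
    if in_residential_zone and footprint_area_m2 < 2000 and aspect_ratio < 3:
        return BuildingType.APARTMENT
    if footprint_area_m2 > 5000 and aspect_ratio < 2:
        return BuildingType.MALL
    # Nothing matched with enough confidence, so place a warehouse and move on.
    return BuildingType.WAREHOUSE

print(classify_footprint(180, 1.2, True))    # HOUSE
print(classify_footprint(3500, 4.0, False))  # WAREHOUSE (ambiguous shape)
```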

Even so, I say again, I think the end result is still pretty good; some cities don't look too bad at all…
But it is due for a facelift, yes.

On a slightly different but still AI-related note: I’ve been wondering whether AI could help with how distant scenery looks (melted buildings, structures missing entirely because of their size). The number of triangles in close-up PG is staggering sometimes, and often a surface could be represented by far fewer triangles. Instead of culling data points based on the distance to and size of any given PG building, maybe AI could be used to analyze how buildings look from a distance and create a lower-triangle mesh, optimized to produce rectangular buildings (as opposed to accidentally triangular ones), plus very simple geometric shapes (lines and points) to represent distant small buildings.
As it is right now, those small buildings get removed entirely and all you see is the underlying blurry aerial image, otherwise known as the apocalyptic look.
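As a rough illustration of the idea, here is a toy LOD chooser that degrades to ever-simpler geometry instead of culling small buildings outright. The thresholds and representation names are made up; this only shows the shape of the approach, not how MSFS actually does it.

```python
def choose_lod(footprint_area_m2: float, distance_m: float) -> str:
    """Pick a representation for a PG building instead of culling it at range."""
    if distance_m < 2_000:
        return "full_photogrammetry_mesh"     # the raw, triangle-heavy PG mesh
    if distance_m < 8_000:
        return "simplified_rectangular_mesh"  # decimated, nudged to right angles
    if footprint_area_m2 > 400 or distance_m < 20_000:
        return "textured_box"                 # a single box with baked facades
    return "flat_quad"                        # last resort: a simple shape, not nothing

for d in (500, 5_000, 15_000, 40_000):
    print(d, choose_lod(footprint_area_m2=150, distance_m=d))
```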

FYI, there is a wishlist topic for this, though it does not have a lot of votes:

One thing I have noticed recently about the autogen buildings is that they are all too large. They are huge. Relative to the texture beneath them, if you look closely you can see that they are way larger than they should be. In a city block where in real life there are 15 houses, it places 3 large ones. Curious.

Could you post a picture? My biggest complaint with just about all of the residential buildings (both autogen and PG) is that, in some cases, they are too short. In PG areas I’ve begun comparing the width of embedded vehicle images to the height of the nearby houses, and in a lot of cases the house looks about six feet tall. Some PG areas look good, but even the residential offline autogen suffers the ‘too short’ problem, and the buildings fade out of view when seen from no more than a couple thousand feet:


If I picture a person standing next to it, they look a bit on the large side to me.

Or… what’s actually more likely is that the ground textures are smaller than they should be.

I see what you mean. It looks like row houses on sloping terrain. I guess it’s quicker and easier to place a few large buildings than to try to place many small houses that follow the terrain. I’m actually becoming convinced that part of the reason for short residential houses is that the terrain altitude data is too generic/low-resolution, so small buildings end up partially buried and appear shorter (maybe). I’m not sure if that makes sense, though.
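For what it’s worth, here is a tiny worked example of that "buried by coarse terrain" effect and one naive mitigation (sample the footprint corners and use the highest one). It is purely illustrative; I have no idea how the sim actually places buildings on its terrain mesh.

```python
def base_elevation(corner_elevations_m: list[float]) -> float:
    """Place the building base at the highest sampled corner so none of it is buried.

    On a slope, a coarse terrain mesh effectively gives one averaged height for the
    lot; a house placed at that height has its uphill side swallowed by the ground,
    which reads as "too short" from the air.
    """
    return max(corner_elevations_m)

# A footprint on a slope: 2 m of elevation change across the lot.
corners = [101.0, 101.4, 102.6, 103.0]
print(base_elevation(corners))      # 103.0: the uphill wall stays above ground
print(sum(corners) / len(corners))  # 102.0: the averaged height buries ~1 m of wall
```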

I’m not too picky as it’s amazing to be able to simulate the world with as much accuracy as it does from 1000 feet.

But I would like to see the morphed tree-building-rock hybrid, post-apocalyptic monstrosities cleaned up. Some of the smaller airports are full of them.

Palm trees are also terrible. They get rendered as obelisks with palm branches painted on them.


The only way to “fix” PG trees and other small objects with complex shapes is to replace them during tidy-up with hand-modelled replacements. The same can be said for small buildings, or those with complex external geometry like catwalks, fire escapes, and the like.

My understanding of the intent with autogen here was to convert satellite images to 3D at scale, with enough accuracy to look believable when seen from the air. It would be a massive multi-year task if done manually.

Each level of AI accuracy requires GPU computing power, and even today that is very expensive. When AI models are developed, we stop at a certain level of output accuracy because beyond that it becomes too expensive to be financially viable.

Plus, AI tech has blown up by orders of magnitude since MSFS 2020 was developed.

I’m sure AI could correctly model a bridge now or figure out where taxi lights should and shouldn’t be.