Flight simulation in 15 years? (Horizon scanning)

This is a horizon scanning kind of question. Which do you think are going to be the main changes in the way we live a flight simulation experience in 15 years from now?

To give you an idea, there have been several ongoing improvements (which will continue forever), such as flight model, graphics etc. Here I am talking about big changes, for example since 2005 we have:

  • VR
  • satellite photogrammetry
  • AI applications to scenery and traffic
  • force feedback yokes
  • online ATC and multiplayer
  • consumer 3DOF/6DOF motion platforms starting to appear

What would, in your opinion, be the items on that list in 2035 when we look back at 2020?

Well, I wouldn’t say we have force feedback yokes in any meaningful way right now. Going forward I certainly hope force feedback returns, and preferably doesn’t take 15 years to do it, since we don’t really have it even in flight sticks except for boutique stuff right now.

I think (hope) hand tracking and AR features will become widespread in conjunction with VR. Other VR-related technologies like haptics will improve and that will have an impact on sims as well as the hardware for it will become more widespread.

We can expect simulation to continue to benefit from increases in computer and communications technology improvement: so much that we get is on the back of developments in the wider technology world. I would guess much improved AI driven 3d figures at airports (service and passenger) as standard. More realistic GA experience / control tower interaction with better reflection of local traffic rules. Animated co-pilots carrying out automated tasks in ā€œassistance modeā€ and much improved interaction between the aircraft model and the world model (friction/airflow etc).

On the down-side some things will get worse… we will have lost many of the super experienced pilots of the 60’s and 70’s who contribute so much to the development of classic aircraft in the sim. Will there be any left except the ā€œchildren of the magenta lineā€ by 2040? And it may be that the whole development ecosystem becomes SO complex that the large number of hobby contributors are no longer able to create for the sim. Note how few freeware third-party aircraft have been created for MSFS compared with FS9 and FSX: yes, I know it’s relatively early days, but I worry that the entry point for developers is now so high that only aircraft that can attract enough downloads to be a serious financial proposition will ever get developed.

1 Like

More realistic planes and more diversity. Better more believable weather. An AI ATC that can interact with spoken words from the pilot flying the plane. Better support for projectors and multiple screens. The ability to have either an ā€œall glass cockpitā€ or ā€œconventional instrumentsā€ on every plane. The cost of motion simulation comes down to the point where more people can experience it.
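An ATC that responds to the pilot’s spoken words would ultimately pair a speech-to-text engine with an intent parser. As a toy sketch of the parsing half only (the phrase patterns and intent names below are my own invention, not any sim’s actual API; a real system would use a trained language model, not regexes):

```python
import re

# Hypothetical phrase patterns for a few common pilot requests;
# a real system would use a trained language model instead.
PATTERNS = {
    "request_taxi": re.compile(r"\brequest taxi\b", re.I),
    "request_takeoff": re.compile(r"\bready for (?:departure|takeoff)\b", re.I),
    "request_landing": re.compile(r"\b(?:inbound for|request) (?:landing|full stop)\b", re.I),
}

def parse_pilot_call(transcript: str) -> dict:
    """Map a transcribed radio call to a rough intent."""
    for intent, pattern in PATTERNS.items():
        if pattern.search(transcript):
            return {"intent": intent}
    return {"intent": "unknown"}

print(parse_pilot_call("Cessna 123AB ready for departure runway 27"))
# → {'intent': 'request_takeoff'}
```

The hard parts a real AI ATC adds on top of this are robust speech recognition over simulated radio audio, readback checking, and generating a plausible controller response.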

It would be nice if somehow the programmers could condense the code so we don’t have to have petabyte drives just for the program and planes.

I think it will improve, but not by as much; it’s like CPU speeds, it is flattening off. If you look at the early Sublogic Flight Simulators, allowing for graphics and detail, they were a lot simpler in what they simulated and offered much smaller areas to fly in. The leap from those to FS2004 and FSX, roughly 20 years on, was massive.

Without decrying MSFS or X-Plane or Aerofly at all, the changes since then are more cosmetic: much better detail, which is more about machines being able to handle the data; you can uprate FSX quite significantly with better graphics and sound. Modelling of flight has improved, but not hugely.

There’s been a big explosion in other aircraft since then; less so for MSFS airliners, but that will come. Still, that can only go so far. There aren’t that many new aircraft, so once you’ve done the obvious you’re looking at high accuracy (which is there in many cases) and obscurity. Few people want absurd accuracy.

VR will grow hugely, and we’ll likely have some sort of projection system rather than a headset, which will probably make me fall over :slight_smile:

Expansion of things like VATSIM is limited by the difficulty of producing plausible AI ATC and players; there aren’t going to be that many people wanting to do flight sims.

Possibly more competitive and mission-based stuff; people often ask ā€œwhat do I actually doā€ with flight sims. One downgrade in MSFS is the tuition and missions side.

1 Like

I’m old. Old enough to have programmed Windows 2.0. This doesn’t happen. Improvements in CPU speed, GFX clout, SSD speed and size and the like are largely wasted IMO. Developing on a 386DX/20 with 4Mb RAM, a 32Mb hard drive and VGA was possible, and wasn’t significantly slower.

Given the ridiculous increases in CPU speed, memory, storage size and speed, and supporting graphics hardware, it’s not very impressive.

Unreal Engine 5’s Matrix demo (look for it on Youtube if you haven’t seen it) has shown everyone what a potential flight sim could look like if it began its development today. For an even more on-the-nose example, see its ā€˜Superman’ variant.

1 Like

There are hundreds of different videos with similar titles.

Do you happen to have a link to the actual video you are referring to?

I’m sure they all show the exact same thing, since it’s just a playable tech demo that lasts about 20 minutes. I didn’t actually know posting links was allowed; here’s the Superman one: This Unreal Engine 5 Superman Demo is MIND BLOWING [4K] - YouTube

Here for the original and full Matrix tech demo: The Matrix Awakens: An Unreal Engine 5 Experience - YouTube

We will CERTAINLY have a second and probably a third Fenix release. Aaaaw I wish it would be 2035 and have more Fenixes to choose from.

Everything else is… whatever. Graphics is good enough for me, the world is good enough for me, and everything else (ā€œI want this a little bit better, and that a little bit better, and ATC and whatnot better, blah blah blahā€) will probably be done anyway in the next sim or in upcoming updates.

But the most important thing is:
MORE triple-A, absolute study-level, superb quality airplanes :smiley: And redoing realism and failures: for example, it should not be possible to just land a full airliner at the excessive airspeed needed for touchdown without bursting the tires, and it should not be possible to brake an airliner that hard without bursting the tires or making the brakes glow.
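As a toy illustration of the kind of failure modelling being asked for (the speed limit, heat constants and thresholds below are invented for the example and are not real tire or brake certification figures):

```python
# Toy tire/brake failure model; all thresholds are invented for
# illustration, not real certification data.
TIRE_SPEED_LIMIT_KTS = 195   # assumed tire speed rating
BRAKE_FIRE_TEMP_C = 900      # assumed temperature where brakes glow/fail

def touchdown(groundspeed_kts: float) -> list:
    """Return failure events triggered at touchdown."""
    events = []
    if groundspeed_kts > TIRE_SPEED_LIMIT_KTS:
        events.append("tire_burst")
    return events

def braking(initial_temp_c: float, energy_mj: float):
    """Accumulate brake heat; assume 1 MJ of absorbed energy adds 12 degC."""
    temp = initial_temp_c + 12.0 * energy_mj
    events = ["brake_fire"] if temp > BRAKE_FIRE_TEMP_C else []
    return temp, events

print(touchdown(210))        # overweight, too-fast landing → ['tire_burst']
print(braking(100.0, 80.0))  # heavy braking from high speed → brakes overheat
```

A sim-quality model would of course track per-wheel load, tire wear and brake energy over the whole rollout, but the principle is the same: consequences keyed to exceeding physical limits, rather than "everything always survives".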

Mixed reality will become standard for home cockpit builders: Sitting in an affordable, modular and fully customizable cockpit with physical switches while the windows are green screens. Bringing together haptics and visuals, the immersion will be near perfect.

2 Likes

AR is a big thing in my opinion too. It becomes more and more important if we end up in physical cockpits. Haptics are also a massive thing for me, changing everything in my opinion. The question will be how they interact with force feedback.

I think some of the developments in AI art are incredible and relevant here - I’m talking about DALL-E, Midjourney, Stable Diffusion etc. This (and similar approaches) will have a massive impact on the procedural generation of landscape meshes and textures. There are already extensions which can generate e.g. infinitely tiled textures on any imaginable theme. The technically incredible thing about e.g. Stable Diffusion is that it works off a 4 GB downloadable model and can generate an image of anything in any style you like. There are limitations, but things like ā€œgive me a 0.1 m mesh and textures for a Swiss-looking alpine hillside meadow that matches up with this 10 m mesh I have and this blurry satellite imageryā€ will be no problem, very soon if not already.
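Generative models aside, the classic baseline for that kind of detail synthesis is fractal-noise upsampling of a coarse heightmap. A minimal sketch, with arbitrary grid sizes and an invented roughness constant (a diffusion model would replace the uniform noise with learned, imagery-conditioned detail):

```python
import random

def refine(grid, roughness=1.0, seed=42):
    """Upsample an n x n heightmap to (2n-1) x (2n-1) by averaging
    neighbouring samples, adding random displacement at new points."""
    rng = random.Random(seed)
    n = len(grid)
    m = 2 * n - 1
    out = [[0.0] * m for _ in range(m)]
    for i in range(m):
        for j in range(m):
            i0, j0 = i // 2, j // 2
            i1 = min(i0 + (i % 2), n - 1)
            j1 = min(j0 + (j % 2), n - 1)
            # average of the (up to 4) surrounding coarse samples
            base = (grid[i0][j0] + grid[i1][j0] + grid[i0][j1] + grid[i1][j1]) / 4
            # original lattice points keep their height exactly
            noise = 0.0 if (i % 2 == 0 and j % 2 == 0) else rng.uniform(-roughness, roughness)
            out[i][j] = base + noise
    return out

coarse = [[0.0, 0.0], [0.0, 4.0]]  # a tiny 10 m-style source mesh
fine = refine(coarse, roughness=0.5)
```

Calling `refine` repeatedly with a halving roughness gives the usual fractal terrain look; the point of the AI approaches is that the added detail can instead be plausible rocks, paths and meadow texture rather than statistically neutral noise.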

1 Like

This is what I was going to say. To me, that’s when VR will become a real viable option for a home cockpit.

You can already do this to some extent with the Varjo HMD. But it’s still quite pricey and beyond the reach of the average user.

1 Like

I think this is another point of value. I am convinced that AI will fill in scenarios with pseudodata more and more! While satellites may not go down to 0.1 m and lidar will not cover the entire globe, the actual mesh, photogrammetry and autogen will be AI based (even more than today), leading to sceneries filled with credible, although not literally accurate, representations of reality where data at that resolution is not available. A sort of AI-fuelled autogen.

In order to simulate the real world, the natural move is towards photoreal scenery plus either AI-managed live data (not just injecting it as-is, but processing it to get more realistic behaviour from AI entities) or a complete AI system based on machine learning - for instance, AI really handling vehicles in an autonomous way rather than following a predefined A → B path like bots with basic manoeuvring at each waypoint. Something similar is already used by drone formations, where the drones interact in real time to adapt to the changes created by the other drones and keep the formation perfect, no matter how complex the movements. A clear application for this is the handling of ground traffic at an airport, where each AI vehicle adapts to the current status of its vicinity with more logic than ā€œstop if the path is occupied, continue if it is clearā€.
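As a toy sketch of that vicinity-aware behaviour (the 1-D taxiway, spacing rule and priority scheme are all invented for the example; real ground traffic AI would plan over a taxiway graph):

```python
# Toy airport ground-traffic agents on a 1-D taxiway; the spacing
# rule and priority scheme are invented for illustration.
SAFE_GAP = 2  # minimum cells an agent keeps behind a vehicle ahead

class Vehicle:
    def __init__(self, name, position, priority):
        self.name, self.position, self.priority = name, position, priority

def step(vehicles):
    """Advance each vehicle one cell unless an equal-or-higher-priority
    vehicle ahead is within SAFE_GAP; lower priority yields."""
    for v in sorted(vehicles, key=lambda v: -v.priority):
        blocked = any(
            0 < other.position - v.position <= SAFE_GAP
            and other.priority >= v.priority
            for other in vehicles if other is not v
        )
        if not blocked:
            v.position += 1

tug = Vehicle("tug", 0, priority=1)
airliner = Vehicle("airliner", 1, priority=5)
step([tug, airliner])
print(tug.position, airliner.position)  # → 0 2 (airliner advances, tug holds)
```

Even this crude rule already produces the behaviour the post describes: each vehicle reacts to what is around it rather than blindly following a fixed path.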

Cloud computing is most likely another effect we will soon see, because industry (not only gaming) is moving fast in that direction. It allows most of the computing tasks to move to a server, while your client is basically just a front end with your graphics card and joystick/mouse/keyboard. So all those things that run now on your PC’s main thread could run on a server, with the resulting calculations simply sent to your client to draw on your screen.

A very interesting branch of this is the clients forming a computing network themselves (each one running a small % of the overall computation), which generates a lot of capacity with little effort from each single machine. That has already been used for some big-data astronomy calculations and processing, with users adding their own home computers to the network. In MSFS this could be used to manage more complex live AI traffic (including roads, ships and trains) or very precise weather, as those entities are global for all players. Weather prediction is indeed one of the most demanding features where this could be used, even if it is not really needed as MSFS uses real data instead.
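A minimal sketch of that split-the-work idea (the chunking scheme and the toy per-cell workload are my own; real volunteer-computing systems like BOINC add scheduling, validation and redundant computation on top):

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_chunk(cells):
    """Toy per-client workload: pretend each 'cell' of a global weather
    grid needs an expensive update; here it's just a simple formula."""
    return [c * 1.01 + 0.5 for c in cells]

def distributed_update(grid, n_clients=4):
    """Split the global grid into chunks, 'send' one to each client
    (modelled as a thread), and stitch the results back together."""
    size = (len(grid) + n_clients - 1) // n_clients
    chunks = [grid[i:i + size] for i in range(0, len(grid), size)]
    with ThreadPoolExecutor(max_workers=n_clients) as pool:
        results = pool.map(simulate_chunk, chunks)  # preserves chunk order
    return [cell for chunk in results for cell in chunk]

grid = [float(i) for i in range(10)]
updated = distributed_update(grid)
assert updated == simulate_chunk(grid)  # same answer as doing it all locally
```

The design point is that the split only pays off for work that is global and shared (weather, world AI traffic), which is exactly what the post suggests; per-player work like rendering still has to stay on the client.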

Cheers

It kind of makes me chuckle at how we’re coming full circle. Back in the day before PCs, that was the norm. You would connect to a mainframe via a dumb terminal. All the processing would be done on the server (in the cloud) and all you would have on the user end is the video feed and inputs. Over the decades, the world moved away from that and went to fully integrated desktop computers.

Now decades later after the PC revolution, the whole computer landscape seems to be moving back to the client-server model of the last century. A far more capable model, of course, but still reverting back to its roots. Even gaming - one of the biggest drivers of standalone PCs (and consoles), is moving in this direction. And I find that kind of hilarious.

1 Like

The answer is easy, I think: a game running on a server means the cost of the server is added to the development cost, so it lands on the seller’s shoulders. If the user already has a PC you don’t really need a server unless the game is online, so you just sell the software and that’s it. In industry it’s a bit different: the controlling devices installed in the field are normally very expensive, so for the vendor it’s cheaper to have just a front end in the field with a network interface to send and receive data (like diagnostics), plus one server to run the whole logic of all the field devices at once. Unless you need PLCs or things that physically act on moving parts in the field, cloud computing can reduce your development costs a lot and also ease maintenance, as the server is next to you rather than thousands of kilometres away in the real field, and you have one device to maintain instead of 100 smaller machines spread across the whole field.

Cheers

Sim update (SU)1043 is delayed again.

Cheers
PACO572

2 Likes

15 years?

Not a lot IMO.

FSX to MSFS is 15 years. I would expect more enhanced textures, and more detailed models and terrain, but overall framerates will still be as they are roughly today, as they’ll be pushing more detail.

VR/AR support should be consolidated and more mainstream, hopefully with much lighter and cheaper headsets/glasses.

Perhaps we’ll be sharing data peer to peer directly, rather than via a central server. The idea above about using the combined computing power to solve complex AI algorithms would be a boon to us all.