Touchscreen Support

I’m kind of surprised this one doesn’t exist here yet; it’s been discussed in other sections.

Currently, touchscreen monitors do not provide proper touch input, either directly in the sim or in the pop-out panels.
Proper touch support would make the G3000/G5000 systems, which have in-game touch screens, far more usable.

You can get partial support working by modifying the listener event in the JavaScript files, but this is not a very good solution: it changes the way mouse events are handled and requires a double tap to register the input.

Some of the other topics where this has been discussed:
  • Touch Screen issues with Pop out Panels (Includes details of the workaround)
  • Screen mirroring touch screen
  • https://forums.flightsimulator.com/t/touch-screen-on-2nd-monitor-becomes-non-functional/217958

It would be really nice for home cockpit building to be able to pop out one of the touch screens, like the TSC for the G3000 or the G3X PFD, to a secondary touch-screen monitor and use it from there with actual touch control instead of a mouse. It would seem like a no-brainer in this day and age of touch screens everywhere, yet the feature is missing from this sim.

Aah, we meet yet again… Thing is, with a tablet these pop-outs are brilliant to use. Try flight planning with the TBM on a tablet, then try your shiny touch screen monitor and compare. In fact I have to keep reminding myself that the touch screen even has a touch function, and when I do, I start swearing and cursing and blasph (edited due to lack of server space)

Has anyone been able to use the breakout instrument panel windows on a touch screen monitor?

I’m specifically interested in the TBM G3000 touch panel display. It is no problem to break it out and display it on a separate touch monitor, but it will not see any touch events from the monitor. I notice that there is an empty profile for the touch monitor in the Control Settings and wonder if something needs to be configured on that page?

It strikes me as strange that this does not work, as it seems to me to be a perfect use for a touch screen.

Same problem here!
TBM 930, popping out the touch panel to my touch screen.
When I touch the screen, the mouse pointer moves right behind it, but the “click” event is not accepted by FS2020. When I tap a button on the touch screen and then click the left mouse button, it works… but it would be great if the touch event triggered the “click” message!

I have a tool that remotes graphics and input over the network, and I used this approach to deal with the same problem. I think the situation is similar, but I don’t have any touch displays, so I can’t really offer you a fix right now. The pattern is:

  • Set focus to the pop out window
  • Issue input to the window
  • Set focus to the pop out window parent (main game)
  • Set active window to the parent (main game)

Does your touch screen have any driver configuration, or any way to choose whether it issues native Windows 10 touch input as opposed to legacy mouse input?

The touch screen has no vendor drivers; it uses the native Microsoft touch drivers. Settings can be changed at HKEY_CURRENT_USER\Software\Microsoft\Wisp\Touch, but the only tweaks possible are for double tap and press-and-hold. Not helpful for FS2020…
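
If you want to see exactly which knobs that key exposes, here is a hedged C# sketch that just dumps the values (assuming the Microsoft.Win32 registry API; per the post above, only the double-tap and press-and-hold tweaks should show up):

using System;
using Microsoft.Win32;

// Dump whatever values Windows exposes under the Wisp\Touch key mentioned above.
using (var key = Registry.CurrentUser.OpenSubKey(@"Software\Microsoft\Wisp\Touch"))
{
    if (key != null)
        foreach (var name in key.GetValueNames())
            Console.WriteLine($"{name} = {key.GetValue(name)}");
}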

I read in another thread that JavaScript callback functions are registered for the MouseUp event, which a touchscreen doesn’t produce, of course. In my opinion a tap sends a MouseClick event/message (in C++ / Win32 API terms). I also read that JS has touchstart and touchend events. Could we map the touchend event to the same callback function as the mouseup event?

Just trying to find a persistent and clean solution :wink:

Please let us know if you ever do find such a solution…

There’s a pretty straightforward solution whereby you put a transparent window on top of the game that handles the click and deals with sending input back to the main game, as sketched below. I think there’s a related problem here: the main window needs to keep focus, otherwise your joysticks won’t work anymore. This should be pretty easy; I would use WPF.
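
A minimal sketch of that idea, assuming WPF (OverlayWindow is a hypothetical name; the near-zero alpha background keeps the window hit-testable while looking transparent):

using System.Windows;
using System.Windows.Input;
using System.Windows.Media;

// Hypothetical sketch: a borderless, topmost, visually transparent window laid over the pop-out.
public class OverlayWindow : Window
{
    public OverlayWindow()
    {
        WindowStyle = WindowStyle.None;   // required for AllowsTransparency
        AllowsTransparency = true;
        Topmost = true;
        ShowInTaskbar = false;
        // An alpha of 1 (not 0) keeps the window hit-testable while looking transparent.
        Background = new SolidColorBrush(Color.FromArgb(1, 0, 0, 0));
        MouseUp += OnTap;
    }

    void OnTap(object sender, MouseButtonEventArgs e)
    {
        // A touch tap arrives here as a regular WPF mouse event; convert it to
        // screen coordinates and forward the click to the sim from here.
        Point screenPoint = PointToScreen(e.GetPosition(this));
    }
}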

The touch handling in JS is unlikely to be the problem; touch input will fall back to mouse handling. I would guess something isn’t hooked up, or something opts into touch input and then ignores it.

One can indeed build a window on top (out of process), but how do you want to send the click events to FS2020? Using SimConnect, or native Win32 APIs?
The other problem is indeed that when you click on your app, the FS2020 window loses focus, so joysticks and other devices won’t react any more :thinking:

I checked the .js that handles mouse events, and indeed it registers a callback for the MouseUp event. Touch taps in Windows send Click events (through the driver), never MouseUp events, so here lies the problem. No idea if Asobo will add support for touchscreens one day.

This weekend I will try checking for the Click event or the TouchEnd event (I read that JavaScript knows this event, but I’m not a JS developer; I only know C++ and C#).

I’ll keep you posted guys!

Ah, if you know C# then this is actually easy for you to put together. I can send you some more code if you want. For creating a transparent window and handling the click (which will come from your touch screen), you can use WPF events as normal, after making the window transparent but still hit-testable for input. You can fit your window over the existing windows by using FindWindow to find title=empty, class=AceApp, and then creating your own TopMost=true window.
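
For the window-finding part, a hedged sketch (the class name “AceApp” and the empty title come from the paragraph above):

using System;
using System.Runtime.InteropServices;

static class SimWindows
{
    [DllImport("user32.dll", CharSet = CharSet.Unicode)]
    static extern IntPtr FindWindow(string lpClassName, string lpWindowName);

    // Find a sim window: class "AceApp", empty title. Returns IntPtr.Zero if none exists.
    public static IntPtr FindPopout() => FindWindow("AceApp", "");
}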

After that, you can use mouse_event from User32 to take your ‘captured input’ (from your touch screen) and issue it to your flight sim windows. In my app (my input comes in remotely over the network), I use this pattern:

// Set focus to the sim pop-out window
User32.SetForegroundWindow(window.Window.Handle);

// Save the current cursor position, then move the cursor to the target
POINT savedCursorPos;
GetCursorPos(out savedCursorPos);
SetCursorPos(xpos, ypos);

// Issue the input; mouse_event clicks at the current cursor position,
// which SetCursorPos set above
mouse_event(MOUSEEVENTF_LEFTDOWN, xpos, ypos, 0, 0);
Thread.Sleep(50);
mouse_event(MOUSEEVENTF_LEFTUP, xpos, ypos, 0, 0);
Thread.Sleep(50);

// Move the cursor back over the sim
SetCursorPos(savedCursorPos.x, savedCursorPos.y);

// Set focus to the sim itself (owner of the pop-out); 4 = GW_OWNER
var owner = User32.GetWindow(window.Window.Handle, 4);
User32.SetForegroundWindow(owner);
User32.SetActiveWindow(owner);
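
For reference, a minimal sketch of the declarations the snippet assumes (the User32 class and POINT struct are my own naming; with “using static User32;” plus “using System.Threading;” the unqualified calls above resolve too):

using System;
using System.Runtime.InteropServices;

static class User32
{
    public const uint MOUSEEVENTF_LEFTDOWN = 0x0002;
    public const uint MOUSEEVENTF_LEFTUP = 0x0004;

    [DllImport("user32.dll")] public static extern bool SetForegroundWindow(IntPtr hWnd);
    [DllImport("user32.dll")] public static extern IntPtr SetActiveWindow(IntPtr hWnd);
    [DllImport("user32.dll")] public static extern IntPtr GetWindow(IntPtr hWnd, uint uCmd); // uCmd 4 = GW_OWNER
    [DllImport("user32.dll")] public static extern bool GetCursorPos(out POINT lpPoint);
    [DllImport("user32.dll")] public static extern bool SetCursorPos(int x, int y);
    [DllImport("user32.dll")] public static extern void mouse_event(uint dwFlags, int dx, int dy, uint dwData, uint dwExtraInfo);
}

[StructLayout(LayoutKind.Sequential)]
public struct POINT { public int x; public int y; }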

Your task is to do the one thing I forgot about: you must move your cursor back over the main sim window after dealing with your other windows. This is critical: I believe not only does the game need focus, but the cursor must also be over it, not over a 2D window any longer. In my case I ‘park the mouse’ over the corner of my game window; in your case you might need to explicitly set the cursor over your game window, because using your touch screen has probably just moved it. A sketch follows below.
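
A hedged sketch of the ‘park the mouse’ idea (GetWindowRect is standard Win32; ParkCursor is a hypothetical helper):

using System;
using System.Runtime.InteropServices;

static class CursorPark
{
    [DllImport("user32.dll")] static extern bool GetWindowRect(IntPtr hWnd, out RECT rect);
    [DllImport("user32.dll")] static extern bool SetCursorPos(int x, int y);

    [StructLayout(LayoutKind.Sequential)]
    struct RECT { public int left, top, right, bottom; }

    // Park the cursor just inside the main sim window so the game sees it
    // as hovering over the 3D view again.
    public static void ParkCursor(IntPtr simWindow)
    {
        if (GetWindowRect(simWindow, out RECT r))
            SetCursorPos(r.left + 20, r.top + 20);
    }
}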

Important notes:

  • Use a thread for mouse_event; do not do it from your UI thread (see the sketch after this list)
  • Those sleeps above are necessary for the game to see your input for long enough to respond to it (didn’t dive deeper, this is a little strange)
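
A minimal sketch of the first note above (IssueInput is a hypothetical wrapper around the focus/click/restore sequence shown earlier):

using System;
using System.Threading.Tasks;

void SendClick(IntPtr popoutHandle, int xpos, int ypos)
{
    // Off the UI thread, so the Thread.Sleep calls don't freeze the overlay window.
    Task.Run(() => IssueInput(popoutHandle, xpos, ypos));
}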

Can you point out which file you’re looking at? If so, we could just fix this with a Community folder mod, but I’m very doubtful it’s this simple.

For sure, no probs.
I’m talking about the NavSystemTouch.js file (there is only one, under \VCockpit\Instruments\navsystems\Shared).

In the code one finds:

class NavSystemTouch extends NavSystem {
    get IsGlassCockpit() { return true; }
    get isInteractive() { return true; }
    connectedCallback() {
        super.connectedCallback();
        this.selectionList = new NavSystemElementContainer("Selection List", "SelectionList", new NavSystemTouch_SelectionList());
        this.selectionList.setGPS(this);
    }
    makeButton(_button, _callback) {
        if (!_button) {
            console.warn("Trying to add an interaction on null element, ignoring");
            return;
        }
        _button.addEventListener("mouseup", this.onButtonPressed.bind(this, _callback));
    }
    // ...
}

I’m pointing at that addEventListener line. Will add “touchend” like this:

_button.addEventListener("touchend", this.onButtonPressed.bind(this, _callback));

No idea if this will work. I’ll let you know this weekend.

I’ll be super impressed if this is the fix. I’m expecting that something between the various parts of the Windows input stack and the game isn’t fully hooked up with Coherent. If it is hooked up and they’ve only forgotten to handle touch correctly, that’ll be very cool.

If you’re not already aware of this, it’s a must-have for debugging the gauges: GitHub - dga711/msfs-webui-devkit: This devkit is a mod and guidance for easier development of WebUI (Panels, MFD..) in Microsoft Flightsimulator 2020.

If the simple fix doesn’t work, you could probably skip messing with NavSystemTouch, create an element, and register for all the events to see if anything at all fires for any interaction. That would go a long way toward proving my theory that it’s not hooked up correctly.

Here’s the source project that the code above is from, if you end up using that snippet and need anything else: GitHub - davuxcom/P3DStreamer: A tool for remoting 2D panels

Ran some tests this weekend as promised.

First of all: the .js is indeed a piece of Coherent HTML UI code. It doesn’t have anything to do with the Windows messages sent to the native FS2020 executable, so forget about this leading to a final solution.

Second: I did some C++ tests with WM_ messages, the mouse, and the touchscreen. 99% of all touch drivers simulate a USB touch device or mouse. When you tap on a touch screen, the driver sends the position and a WM_LBUTTONDOWN message. When you press long, it sends WM_RBUTTONDOWN (right click). In my opinion the FS executable triggers its functions on a different mouse message than WM_LBUTTONDOWN. If that is the case, Asobo should switch to WM_LBUTTONDOWN or add native WM_TOUCH messages for touchscreens.
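
If anyone wants to test that theory, here is a hedged sketch (in C#, to stay consistent with the snippets above) that posts a synthetic left click to a window at given client coordinates; whether FS2020 reacts to posted messages at all is exactly the open question:

using System;
using System.Runtime.InteropServices;

static class ClickTester
{
    const uint WM_LBUTTONDOWN = 0x0201;
    const uint WM_LBUTTONUP   = 0x0202;
    const int  MK_LBUTTON     = 0x0001;

    [DllImport("user32.dll")]
    static extern bool PostMessage(IntPtr hWnd, uint msg, IntPtr wParam, IntPtr lParam);

    // Post a synthetic left click at client coordinates (x, y).
    // lParam packs x in the low word and y in the high word (MAKELPARAM).
    public static void PostClick(IntPtr hWnd, int x, int y)
    {
        IntPtr lParam = (IntPtr)((y << 16) | (x & 0xFFFF));
        PostMessage(hWnd, WM_LBUTTONDOWN, (IntPtr)MK_LBUTTON, lParam);
        PostMessage(hWnd, WM_LBUTTONUP, IntPtr.Zero, lParam);
    }
}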

I’m only a basic C++ programmer, but one could check with Spy++ (it ships with Visual Studio, by the way, not Sysinternals) which messages FS needs to activate the navigation functions. With that knowledge, make a global hook for the touch message and send the needed message to the application window. This would need knowledge of global hooks and Win32 mouse events; a rough sketch follows.
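
A minimal sketch of the hook part, assuming C# again (a WH_MOUSE_LL low-level hook needs no DLL injection, but the installing thread must pump a message loop):

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

static class MouseHook
{
    const int WH_MOUSE_LL = 14;

    delegate IntPtr HookProc(int nCode, IntPtr wParam, IntPtr lParam);

    [DllImport("user32.dll")]
    static extern IntPtr SetWindowsHookEx(int idHook, HookProc lpfn, IntPtr hMod, uint dwThreadId);
    [DllImport("user32.dll")]
    static extern IntPtr CallNextHookEx(IntPtr hhk, int nCode, IntPtr wParam, IntPtr lParam);
    [DllImport("kernel32.dll", CharSet = CharSet.Unicode)]
    static extern IntPtr GetModuleHandle(string lpModuleName);

    // Keep a reference to the delegate so the GC doesn't collect it while hooked.
    static readonly HookProc _proc = Callback;

    public static void Install()
    {
        using var module = Process.GetCurrentProcess().MainModule;
        SetWindowsHookEx(WH_MOUSE_LL, _proc, GetModuleHandle(module.ModuleName), 0);
    }

    static IntPtr Callback(int nCode, IntPtr wParam, IntPtr lParam)
    {
        // wParam is the message (e.g. 0x0201 = WM_LBUTTONDOWN). Here one could
        // detect taps and forward whatever message FS2020 actually listens for.
        return CallNextHookEx(IntPtr.Zero, nCode, wParam, lParam);
    }
}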

I’m curious whether more input or reactions will come to this thread. I find it somewhat strange that more people don’t want the pop-out navigation for the G3X/G5X to be touch-screen compatible, especially since third-party tools like Air Manager don’t support G3X panels…

[quote=“FrettenDotCom, post:13, topic:359527”]
I find it somewhat strange that more people don’t want the pop-out navigation for the G3X/G5X to be touch-screen compatible.
[/quote]

Me too :confused:

So did you create a package and test touchend?

CoherentGT runs inside FlightSimulator.exe. I think DirectX/DirectInput might be used, with input routed into Coherent and then into the HTML view hosting the gauge. DirectInput bypasses window messages, so the usual WM_LBUTTONDOWN/UP, and WM_INPUT for raw input, would be ignored.

Well, this is a late reply.

The thing is, I have tried with a tablet and with external touch screens (using Air Manager). Touch simply doesn’t work. Like AT ALL. All that touching one of these popped-out instruments does is give focus to that window and kill all controls of the plane (yoke, throttle, pedals).

I highly suspect this is an issue with screen resolution and aspect ratio scaling, because at one point, touch did somewhat work. It was awkward, but did at least do something. Now, it doesn’t do anything any more.

It would be great if I could use my touch-screen monitor as a pop-out screen for my EFB on the CRJ and A32NX. I am sure more EFBs will be coming down the pipe too.