So close to a mouse-free TBM experience

Unfortunately, it would appear MSFS doesn’t register touch screen commands on popped out windows. :frowning:

I thought I had a fantastic idea: install SpaceDesk on my old phone (Pixel 2 XL), pop out the TBM’s touch screen, and use it from there. The idea was sound in theory. I have the PFD/MFD/COM buttons on my Stream Deck, so I’m able to switch between the different TSC pages from there. It would have been perfect! Except…

It seems MSFS doesn’t support touch input on popped out windows. I confirmed this by firing up MSFS on my laptop which has a touch screen. I can interact with the cockpit using touch, albeit awkwardly, but once an instrument panel is popped out, touch support is lost.

BOOOO-URNS!!!

1 Like

Upon further investigation, it seems the sim does support touch, just not the way touch should work. A tap on the touch screen is registered as a mouse click event, but the sim doesn’t register the position where the screen was touched. If my mouse cursor is sitting over a control in the cockpit on my main screen and I tap in a popout window, the click activates whatever is under the cursor on the other screen instead of what I was actually tapping.
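For anyone curious, here’s roughly what that difference looks like in Win32 terms. This is purely my own speculative sketch of the failure mode, not MSFS’s actual code (ActivateControlAt is a made-up stand-in): a touch-aware window reads the tap position off the WM_POINTERDOWN message itself, while the behaviour described above looks like an app treating every tap as a generic click and simply asking where the cursor happens to be.

```cpp
// Speculative sketch only -- MSFS's code is not public.
#define _WIN32_WINNT 0x0602  // WM_POINTER messages need Windows 8+
#include <windows.h>
#include <windowsx.h>

// Hypothetical stand-in for "activate the cockpit control at this point".
static void ActivateControlAt(POINT pt) { (void)pt; }

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg) {
    case WM_POINTERDOWN: {
        // Correct handling: the tap position arrives with the message
        // itself (screen coordinates), so the hit lands where the
        // finger actually touched.
        POINT pt = { GET_X_LPARAM(lParam), GET_Y_LPARAM(lParam) };
        ScreenToClient(hwnd, &pt);
        ActivateControlAt(pt);
        return 0;
    }
    case WM_LBUTTONDOWN: {
        // The failure mode described above: treat the tap as a plain
        // click and ask where the *cursor* currently is -- which may be
        // parked over a completely different window.
        POINT cur;
        GetCursorPos(&cur);
        ActivateControlAt(cur);
        return 0;
    }
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}
```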

Seriously, this has to be the most terrible implementation of touch screen functionality ever.

1 Like

It’s soooooo many small things like this that ■■■■ me to tears about this game. It would be such a small thing for Asobo to implement, but I can guarantee that if they ever actually acknowledge the issue, it will get marked on their list as “under investigation” with a scheduled date of 2024.

3 Likes

I’m pretty sure this comes from the controller curse. You can only “click” what’s right behind the cursor. The mouse we have when flying in the desktop sim is just a backup. That’s why there is no right-click functionality either. If they really keep it that way, there will be massive issues with advanced 3rd party addons. A huge amount of existing code will have to be rewritten to get along with a simple left click and a bit of mouse wheel. And then people ask why the NG3 is assumed for Q4. It’s not only the SDK.

1 Like

It’s another example of the “consolitis” this sim suffers from in some areas, and it’s not even on console yet. But the groundwork for this was laid long ago, I expect. The example I have seen is the apparent negative mouse acceleration you feel when moving the cockpit free look camera around with the mouse.

If you move the mouse around very, very slowly, you’ll find you can traverse your entire mouse mat and your view will have hardly changed. Move it faster, and it will move as you expect. It’s more usual to see positive mouse acceleration, which shows up in a lot of console ports and is easy to detect.

Move your mouse up against a fixed point, like the side of your keyboard, and take note of what you are pointing at. Now move the mouse back away from the keyboard. Then repeat this, moving the mouse either faster or slower, back to the keyboard. If your view does not return each and every time to that same aim point, no matter how fast you move the mouse, then mouse acceleration is at work.
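If you want to see why that test works, here’s a tiny sketch with a made-up speed-dependent gain curve. Nothing here is measured from the sim; the constants are invented purely for illustration. With a constant gain, the view change depends only on how far the mouse travelled, so retracing your steps always brings the aim point back. Once the gain depends on speed, the same physical travel produces a different view change, and the aim point can’t return:

```cpp
#include <cstdio>

// Hypothetical negative-acceleration curve: gain shrinks at low speed.
// The constants are invented for illustration, not measured from MSFS.
float viewDeltaDeg(float counts, float countsPerSec)
{
    const float baseGain = 0.05f; // degrees per mouse count at high speed
    float gain = baseGain * countsPerSec / (countsPerSec + 400.0f);
    return counts * gain;
}

int main()
{
    // Same physical mouse travel (1000 counts), two different speeds:
    std::printf("slow pass: %.2f deg\n", viewDeltaDeg(1000.0f, 100.0f));  // ~10.0
    std::printf("fast pass: %.2f deg\n", viewDeltaDeg(1000.0f, 4000.0f)); // ~45.5
    // Because the gain depends on speed, retracing the same distance at a
    // different speed rotates the view by a different amount, so the aim
    // point never returns to the keyboard-edge reference.
}
```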

But it works well with a console controller joystick…absolute vs. relative.

Back to the screen tapping issue: on some laptops, you can choose how you want the touch screen to work. Either as an absolute device, where you tap the screen and the mouse jumps instantly to that spot, so your tap registers where your finger touched; or like a giant mouse pad, where you drag your finger around to move the mouse cursor, and your taps register where the mouse pointer is, not where your finger is.
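Those two modes map directly onto the two ways input can be injected on Windows. Here’s a minimal sketch of both, purely illustrative (I don’t know what SpaceDesk or any particular driver actually does internally):

```cpp
#include <windows.h>

// Absolute mode: warp the cursor straight to the tapped point and click.
// With MOUSEEVENTF_ABSOLUTE, dx/dy are normalized to 0..65535 across the
// primary display (add MOUSEEVENTF_VIRTUALDESK for the whole desktop).
void TapAbsolute(int x, int y, int screenW, int screenH)
{
    INPUT in = {};
    in.type = INPUT_MOUSE;
    in.mi.dx = (x * 65535) / screenW;
    in.mi.dy = (y * 65535) / screenH;
    in.mi.dwFlags = MOUSEEVENTF_MOVE | MOUSEEVENTF_ABSOLUTE
                  | MOUSEEVENTF_LEFTDOWN | MOUSEEVENTF_LEFTUP;
    SendInput(1, &in, sizeof(in));
}

// Relative mode: nudge the cursor from wherever it currently is --
// the "giant mouse pad" behaviour described above.
void MoveRelative(int dx, int dy)
{
    INPUT in = {};
    in.type = INPUT_MOUSE;
    in.mi.dx = dx;
    in.mi.dy = dy;
    in.mi.dwFlags = MOUSEEVENTF_MOVE;
    SendInput(1, &in, sizeof(in));
}
```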

The problem here is that the mouse cursor (or what looks like a mouse cursor) can’t leave the sim window, and so the sim won’t recognize the input. He can display the popout somewhere else, but he can’t really use it. Maybe this will change with the multi-screen support…

Indeed. That’s exactly it. Even though SpaceDesk is set to absolute mode, which SHOULD be correct and pick up taps at exactly the coordinates where they happen. With any other Windows app, I can move it to my SpaceDesk monitor on my tablet or phone and use it as if I were on my laptop’s touch screen. Just not MSFS. If the tap isn’t in the main window (the cockpit), it’s as if the input is forced into relative mode, and the taps register at the mouse cursor’s position rather than the coordinates I’m tapping on.

1 Like

I guess that’s also the reason why VR controllers are not supported. The sim’s cursor can only MOVE; it can’t suddenly jump to another location the way, e.g., a laser pointer or VR pointer would demand. It’s like how you can only move an Xbox cursor with your thumb stick. The mouse gives directions for where to move the cursor, but a VR device (or a tapping finger) would suddenly require a warp to another location.

3 Likes

OK, here I go again…
We are once again having a discussion about how a feature that is not yet implemented in the sim is not working. Controlling pop-out gauges on a separate display would require the implementation of multi-monitor support, a feature that MS/A has told us is coming but is not yet supported.

While I agree the use of touch tablets would be really nice and I wish it worked (I have three Android devices and an MS Surface all ready to go), we just have to be patient. It will come.

3 Likes

I have an IR touch frame installed on my monitor. It took some getting used to, but I now use it a lot, particularly for switches. I have even used it to operate the throttle quadrant when flying twins, grabbing and dragging individual levers. Works fine for practicing engine-out procedures.

I am still using my old Sidewinder, which only has the one throttle axis, plus legacy keyboard assignments for prop and mixture, so individual engine control is a pain without the touch screen.

1 Like

Umm, not yet implemented? It is implemented. You can pop out the displays, and you’ve been able to from day one. You just can’t use them, because it hasn’t been implemented properly, like so many other things in this game that were rushed.

I would also chalk this up to a half-assed implementation of touch control vs multi-monitor support.

Willis, you and I have different definitions of what multi-monitor support means.

You can already break out instruments. You just can’t interact with them using touch control once they’re broken out, even on the same monitor. I think these are two different issues we’re talking about.

1 Like

Please read my post again.
I was pretty clear…

Although you can pop stuff out, putting them on a different display causes slowdowns and lost functionality (no touch or mouse). It is this way because MULTI-SCREEN capability is not yet supported. Just because you can stretch a display to cover multiple screens does not mean the program is designed to accommodate it.
I never suggested that the ability to pop stuff out was not implemented. I said you can’t expect them to work across multiple screens. By the way, they work just fine as long as you keep them on the main screen.

Nope. Popping them out causes performance loss, not moving them to other screens.

If your only monitor is a touch screen and you break out a touch screen instrument, you can interact with it with your mouse. But not with touch. Touch “works” in the ■■■-backwards way I described in an earlier post. Moving said instrument to another monitor has zero impact on performance, nor does the wonky touch behaviour change.

It has nothing to do with multi-monitor support. It’s bad touch support.

I may be wrong on this, Crunch, but I expect the only reason they pop out at all is as a precursor to the upcoming multi-monitor support.
In my definition, I have my multiple touch screens arrayed for either the overhead or center consoles, so that things like my FMC are where they should be and I can operate them like the real thing. My panel is displayed without an outside view. The outside view is displayed on a curved monitor array, so it’s like looking past the panel and out the windscreen.

Lost me. When I attempted to move the display onto my MS Surface, it halved my frame rate. The gauge has no buttons to press; you only get the display portion. The FMC becomes unusable due to the delay.
I have not noticed any hit until I drag it onto another display.

For me, the frame rate hit happens the second an instrument of some kind is popped out, like a PFD, MFD, etc. Doesn’t matter if it’s sitting on the same monitor or moved to one of my other monitors. The act of breaking out the instrument is what causes that performance hit. That has nothing to do with multi-monitor support. That has everything to do with inefficient re-rendering of the instruments in the new “container” it creates vs mirroring the original.
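To put a picture on that guess: here’s a purely conceptual sketch of the difference between mirroring one render and rendering twice. MSFS’s renderer isn’t public, so every name here is an assumption, and the sleep just stands in for an expensive gauge render pass.

```cpp
// Conceptual sketch only -- none of this is MSFS's actual code.
#include <chrono>
#include <thread>

struct RenderTarget { /* stand-in for a GPU surface */ };

void RenderInstrument(RenderTarget&) {
    // Stand-in for a full gauge render pass (the expensive part).
    std::this_thread::sleep_for(std::chrono::milliseconds(4));
}

void CopyPixels(const RenderTarget&, RenderTarget&) {
    // Stand-in for a GPU blit (nearly free by comparison).
}

int main() {
    RenderTarget mainTarget, popoutTarget;

    // Mirroring: one expensive pass plus one cheap copy per frame.
    RenderInstrument(mainTarget);
    CopyPixels(mainTarget, popoutTarget);

    // Re-rendering: the gauge is drawn twice per frame. If the pop-out
    // works this way, the frame-rate hit appears the moment the
    // instrument is popped out, regardless of which monitor it sits on.
    RenderInstrument(mainTarget);
    RenderInstrument(popoutTarget);
}
```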

I have 3 monitors hard-wired to my system normally (2 atm - did some desk rearranging and waiting for another VESA arm to arrive). My main is a 34" 3440 x 1440, on the right I have a 27" 4K, on the left an old 1200p 24". The game only runs on my centre 34" monitor. But I can move the popped out instruments to any monitor, and that has no impact on performance at all.

And I also have 2 tablets connected over Wi-Fi using SpaceDesk. Again, I take the frame rate hit breaking out the instrument, but I can move it to either tablet without any additional performance loss.

In the case of the screenshot in my OP, I was running SpaceDesk on my old phone and moved the TBM’s touch screen to that. Again, no performance decrease from doing that. The performance hit came from breaking it out.

But back to the main issue: touch is only supported in the virtual cockpit itself, not on the popped-out instrument windows. The sim registers the touch as a mouse click, but not its position. The click is sent to whatever position your cursor is at on your main screen in the v-cockpit, instead of what you are actually tapping on. Even if it’s on the same monitor.

If you’re using SpaceDesk or something of the like on your Surface tablet, make sure it’s not set to use the default low frame rate. That could explain the rotten performance you’re seeing. It did it to me until I realized you could adjust the frame rate in the client software.

1 Like

In my understanding, multi-screen rather means that you place two (or n) screens next to each other and extend the view across them so you can more or less build a 180° view. Not necessarily a necessity for flight sims, but good for cockpit builders. Moving a GPS to a tablet and being able to actually use it, though, SHOULD be basic for today’s flight simulation. And it has been done, but again incompletely. We know this sim was rushed and that Asobo are probably the last to blame, but they need to figure out how to fix all these incomplete features before they can properly implement new ones that will naturally break again when other things get fixed. Otherwise we run in circles and the 3rd party devs run riot.

Close. You get to set a custom camera angle for each screen, rather than stretching a single view across multiple screens (like you can now with nVidia Surround). That gives you a more accurate view of the world around you for a home cockpit. You could literally box yourself in with three giant 60" TVs if you wanted and have realistic views all around you.

You could also set up something like an airliner’s overhead panel on a touch screen, an FMS, etc. Each screen gets its own custom camera angle that’s independent from your main screen.

That’s the multi-monitor setup Asobo are working on and that’s sorely lacking right now.

And agreed. Breaking out instruments onto a touch screen and being able to interact with them is something I call base functionality in 2020/21.

2 Likes