Hi all,
It's been a long time since I last wrote here on the forum. Maybe someone remembers my long thread about the construction of a force feedback yoke.
I have been using it for a while, although I admit it has been giving me a lot of problems due to the instability of the USB link, and I suspect also some limitation of x-force. I would need to redesign part of the electronics to make it simpler, but I'm really short of time lately, so I'm thinking of getting a Brunner (I suspect that, in the end, with all the prototyping work, I haven't spent much less).
I have just upgraded my VR setup and would like to start flying again, and I'd like to add a panel with physical controls (to be used "blindly", without seeing them) for a Cessna 172. Obviously I'm not interested in all the indicators; I mostly need the engine, trim and flap controls. The various switches would be nice to have but are not necessary.
I have been searching a bit for panels to 3D-print, but I can't really find anything suitable: either they are full cockpits, or they are just single bits here and there.
Is there anything available to be printed that would reproduce the real setup on the plane?
Otherwise, is there a plan of a 172 cockpit to design it by myself?
If I may, designing it yourself is a big part of the fun. Ready-to-use STLs rarely suit your exact needs anyway…
After building a full cockpit, I'm currently rebuilding a "blind" one for the C152. It works like a charm. If you can afford a 3D printer and know about coding, it's very interesting to do, and you learn a lot about the plane as you go. Fusion 360 is extremely powerful and free to use for personal projects.
About plans, I would say the most important thing in this context is not to match the real-world dimensions, but to have the controls exactly where your hand wants to find them while in the VR cockpit. Normally these should coincide, but in any case it's easier to use a stable reference point (for example the yoke) and then put the controls where your hands expect them while in the cockpit.
I did have a look; there are a few things, and I think I also found the link you mention, but I can't access it. It says that my account doesn't have access to those pages…
Nice! I have been flying a C152, so I wasn't sure whether to replicate that one instead of the C172. The C172 looks a lot better though…
I do have a printer, and I use CATIA for work, so that's not a big issue. I would recycle part of the code and electronics from my yoke to get all the inputs to the PC as a standard joystick (which is what I do now: I have a little panel that is read by the same electronics as the yoke, and I have modified the descriptor of the USB peripheral to include all the other inputs).
Hm, true, distances are not fundamental, but I think relative location is. I might learn how to move my hand, but if what I see coincides with the movement, so much the better.
So you would recommend taking an image of the MSFS cockpit and simply reproducing it?
I tried this, which is useful anyway for the global picture, but in the end I use an even more pragmatic approach for the location of each individual control as I add it:
I use my yoke as a reference point (more precisely the left branch of it, as its width doesn’t exactly match the real one).
I center the view very carefully, making sure that the location of the real yoke exactly matches the one in the cockpit (it’s useful for that matter to look through the small empty space around the nose, that I have with my Quest 3).
I let one finger naturally go to the desired location in the cockpit, and mark the physical location with a pencil.
I repeat the last step several times to adjust, although I noticed that the pointed location is surprisingly consistent from one try to the next.
I create and install the new component at that spot.
Until a few days ago I used to leverage OpenXR Toolkit's ability to track hands and display them in the virtual world, which was even more convenient for applying this procedure and actually using the control (I really prefer seeing my hands). Unfortunately this stopped working after a very recent update (of the Oculus tooling, I suspect), so I don't see my hands anymore. But as long as the controls are precisely located I can still find them naturally.
However, relying on muscle memory alone to find controls that are not at the exact place you "see" them in the virtual cockpit would be very disorienting. It's important that the visual match be almost perfect.
For that matter I highly recommend using Arduinos. You quickly end up with lots of buttons, complex controls such as rotary encoders are very difficult to implement as simple joystick buttons, and some are just not bindable in MSFS (e.g. carb heat).
For my setup I did the whole programming myself (Arduino, Middleware and protocol between the two), which gives total flexibility. But there are several ready-to-use software like Mobiflight that should be appropriate too (especially for a blind cockpit).
As an electronics engineer who has had to suffer a lot of people claiming they "do electronics" because they program an Arduino, I have a natural hatred towards it, but I'm open to using it if it makes sense.
What would I get with an Arduino that I don't get from my own platform, which can control just about anything (buttons, potentiometers, encoders, motors, etc.) as a USB HID? Is there some software on the PC side that could manage these things better?
Buy a used Saitek Yoke System (the one with the throttle quadrant on the circular plug).
Do what this guy has done.
Your tabletop needs to be 250mm above your chair seat. A padded dining chair is great.
The Throttle Quadrant is clamped to the tabletop so your hand falls to it when you are in VR.
On the Quadrant stalks:
Throttle knob: on a cranked beam, bringing it 12mm toward you (I used bent, sawn spoon handles) and making the stalk effectively 10mm longer.
Mixture: on the centre stalk.
Flaps axis: another bent spoon handle, moving the knob 35mm further away to the right. Try to keep the knob level with its original position on its stalk.
Put the RED mixture knob in the centre and use the BLUE prop knob as the flaps lever.
Knobs and spoon handles are held firmly on the stalks with plastic insulation tape.
Get VR running and fine-tune the controls positioning so your fingers fall onto each.
After a while, muscle memory will put your fingers straight onto the knobs.
CAUTION: immersion is terrific, and you might get 'height fright' or panic because you don't have a seat harness.
Best of luck,
PS
Elevator Trim Effectiveness in Flight_Model.cfg needs to be 0.265 instead of the original 1.51.
This makes the trim wheel turn more for the same trim-tab movement.
Use the mouse and its scroll wheel to move the trim wheel in VR.
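For reference, this tweak goes in the [FLIGHT_TUNING] section of the aircraft's flight_model.cfg; the parameter name below is the usual one in MSFS flight models, but check against your own file:

```ini
[FLIGHT_TUNING]
; Original C172 value is 1.51. Lowering it means more wheel
; travel is needed for the same trimming effect.
elevator_trim_effectiveness = 0.265
```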
I agree it’s not like “doing real electronics” to plug a few things into an Arduino (I’m a software person with little knowledge of electronics). But it’s simple and it just works, so why not? You wouldn’t happen to be French, would you? (Disclaimer: I am.)
And honestly we already have plenty of other things to deal with: modeling, printing, assembling, wiring, programming…
As for use cases where a plain USB HID (joystick or emulated) bound directly to MSFS controls won’t suit the need, or not easily: I’ve encountered several. A few examples:
Dealing with acceleration on a rotary encoder
Dealing with anything that cannot be bound in the MSFS configuration (e.g. carb heat is not bindable, the JPL Cessna magneto switch needs a specific event+LVAR, etc.)
And obviously anything that needs bidirectional communication (although for a blind cockpit there is little need for it)
“Le diable est dans les détails” (“the devil is in the details”), as we say, and while creating a full C172 cockpit I often stumbled on minor issues that would have been almost impossible to solve without full control over the code on both the Arduino AND the SimConnect middleware.
Aha, the famous SimConnect. I suspected we were headed there… In the past I asked why they don’t publish the interface protocol, so it could be used on any platform someone wants to develop for. As they don’t sell the hardware, it would just expand their market, but apparently (at least back then) they wanted to keep it closed. Or at least hard to retrieve without a good amount of reverse engineering.
As I have all the development tools (prototype boards, debuggers, etc.) for Microchip, I would have preferred to stay on that platform.
But OK, if SimConnect and Arduino can make things easier, I might try to have a look at them again (though I think they also had some silly limitations, or was that also in the past? Something like not being able to read potentiometers or encoders…?).
I admit SimConnect has a terrible developer experience, probably the worst I’ve seen in the last decade if not in my whole life. But I believe it’s quite robust after being used for so long by so many people.
But yes, I fail to understand how they never ended up opening the protocol or coming up with something more modern. On my side I encapsulated it in all the wrappers and abstraction layers I could, to hide the dust under the carpet. Recently I rewrote it in Rust, which has some bindings, within a generic message-oriented architecture. And now I can just forget about it and enjoy sending messages from every device into MSFS.
As for potentiometers, it’s really straightforward to read them with an Arduino; it couldn’t be simpler. For rotaries it depends on the quality of the actual hardware, but implementing the logic with basic debouncing is not that complicated (and I believe there are libraries for this).
But I understand very well that sticking with the Microchip platform and avoiding SimConnect would be very nice if you can succeed with just an HID interface or whatever. I’d be interested in your findings!