MSFS 2024 Beta 5, 1.7.16 Freezes

ISSUE DESCRIPTION

Description of the issue:

MSFS 2024 beta 6 freezes for a few seconds in any situation (in airports, in the settings menu…). I opted out of the MSFS 2024 beta and everything was OK.

FREQUENCY OF ISSUE

How often does this occur for you (Example: Just once, every time on sim load, intermittently)?

All the time

REPRODUCTION STEPS

Please list clear steps you took in order to help our test team reproduce the same issue:

  1. Started MSFS 2024 beta

  2. Free flight, any plane, any airport

  3. Ready to fly

  4. Freeze; can't open menu or settings

YOUR SETTINGS

If the issue still occurs with no mods and add-ons, please continue to report your issue. If not, please move this post to the User Support Hub.

What peripherals are you using:

[PC Only] Are you using Developer Mode or have you made any changes to it? no

[PC Only] What GPU (Graphics Card) do you use? RTX 5090

[PC Only] What other relevant PC specs can you share?

MEDIA

Please add a screenshot or video of the issue occurring.

Same issue and situation here.

Presume you meant SU5 BETA.

Same here, had to leave beta, back to SU4 :face_with_diagonal_mouth:

Hi folks,

Can those of you experiencing this issue please provide us with some more information, as the reports above don’t indicate too much for our Testing Team to try to replicate. Are there any particular aircraft or airports in use, do you have 3rd party mods enabled, what hardware are you using, are there any particular common steps you take to easily reproduce this consistently?

The more information you can provide, the more focused testing the team can do to try to quickly reproduce the issue!

Thanks
The MSFS Team

I did several flights on 1.7.16, both in VR and 2D, and saw the same behaviour in both modes. During the flight, the sim would literally lock up for up to 5 seconds - to the point I thought it would CTD. However, it carried on after the hang-up.

These aren't stutters, they're full-on hangs.

Could you please provide some further information as asked above? Things like your hardware (and even graphics driver versions), whether you are using mods, if there are any aircraft that seem to trigger this easily etc.?

Absolutely - apologies.

  • Hardware: 5800X3D, X570, RTX 4080 Super, 64GB DDR.
  • Flight with the Kodiak 100 Series II (2024 version).
  • Scenery: iniBuilds EGGW. I was doing circuits at EGGW and seeing the hangs previously mentioned.
  • Reverted to SU4, same scenario - no issue.

When I get some time this week, I’ll re-join the beta and revert the scenery to default and try again.

Hope this helps.

Was testing the new beta, had freezes again.

This is the average CPU usage when it freezes for 20 seconds:

This is how it looks post-freeze:

I have a feeling it has to be something to do with the CPU, because usage just drops to zero or near zero.

My forumsearch-fu is weak, so I found the wrong thread. Here’s how the bug affected me:

93% memory usage :astonished_face: OK, this explains the freezes very well. You have to reduce memory usage significantly.

I don't know if the other users suffer from the same problem, but for you it is the memory usage.

More data from this morning, around 7am CEST.

 #   Start     Duration  Interval
 1   06:56:04  15.1s
 2   06:56:56  15.1s     52.5s
 3   06:57:50  14.8s     53.5s
 4   06:58:45  15.1s     54.6s
 5   06:59:40  15.1s     55.7s
 6   07:00:37  15.1s     57.0s
 7   07:01:34  15.1s     57.1s
 8   07:02:33  15.1s     59.2s
 9   07:03:33  15.1s     59.4s
10   07:04:34  15.1s     61.4s
11   07:05:37  15.1s     62.5s
12   07:06:40  15.1s     63.7s
13   07:07:45  15.1s     64.4s
14   07:08:50  15.1s     65.3s
15   07:09:57  15.1s     66.9s
16   07:11:05  19.6s     67.2s
17   07:12:18  15.1s     73.0s
18   07:13:26  15.1s     68.2s
19   07:14:36  15.1s     69.5s
20   07:15:47  15.1s     71.0s
21   07:16:59  15.1s     72.0s
22   07:18:12  15.1s     73.2s
23   07:19:26  15.1s     74.2s
24   07:24:41  15.1s     75.3s
25   07:26:00  15.1s     79.0s
26   07:27:20  15.1s     79.6s
27   07:28:41  15.1s     81.0s
28   07:30:04  15.1s     82.7s
29   07:31:28  15.1s     84.0s
30   07:32:53  15.1s     85.0s
31   07:34:19  15.1s     86.3s
32   07:35:47  15.1s     87.8s
33   07:37:16  15.1s     88.8s
34   07:38:46  15.1s     89.6s
35   07:40:17  15.1s     91.2s
36   07:41:49  15.4s     92.0s
37   07:43:22  15.1s     93.4s
38   07:44:57  15.1s     94.6s
39   07:46:32  15.1s     95.5s
40   07:48:09  15.1s     97.2s
41   07:49:47  15.1s     97.8s
42   07:51:26  15.1s     99.2s
  • Duration: 15.1s (outlier: #16 at 19.6s)
  • Interval: increases linearly, 52.2 + 1.12 × cycle_number seconds (R²=0.998; fit sketched below)
  • >20% of time frozen
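
A fit like this is plain least squares; a minimal numpy sketch (intervals abbreviated to the first eight rows of the table, extend with the rest for the full fit):

# Least-squares fit of freeze interval vs. cycle number.
# First eight intervals from the table above; extend for the full fit.
import numpy as np

intervals = np.array([52.5, 53.5, 54.6, 55.7, 57.0, 57.1, 59.2, 59.4])  # seconds
cycles = np.arange(1, len(intervals) + 1)

slope, intercept = np.polyfit(cycles, intervals, 1)
pred = intercept + slope * cycles
ss_res = np.sum((intervals - pred) ** 2)
ss_tot = np.sum((intervals - intervals.mean()) ** 2)
print(f"interval(n) ~ {intercept:.1f} + {slope:.2f} * n seconds, R^2 = {1 - ss_res / ss_tot:.3f}")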

When the freezes happen, they appear to happen anywhere: in the menu, in cinematics, in flight.

I took snapshots of the TCP connections before, during and after each freeze. I could not identify a correlation to specific requests. I still believe that they’re triggered by something external, but since I can’t provide evidence, you have to have faith. :laughing:

Note that I was running sim sessions for hours before the freezes started, including two 1h+ flights without any freezes whatsoever. I restarted the sim a bunch of times, I even restarted my system once (for unrelated reasons).

Since memory exhaustion has been brought up as a likely culprit: my system has 64GB and a 4090, and only the sim is unresponsive during the freezes; the rest of the system appears completely unaffected. Nonetheless, I added some memory metrics to my script, so next time I encounter the issue, I'll find out whether it's related to resource usage on my system. I might even check whether the issue persists when I run the sim with an empty Community/.
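
For reference, counters like these can be sampled from outside the process with psutil; a minimal sketch of the idea (not my actual script), assuming the Steam build's process name:

# Sample MSFS memory counters once per second (Windows; pip install psutil).
# Sketch only; the fields used are psutil's Windows-specific memory_info() fields.
import time
import psutil

sim = next(p for p in psutil.process_iter(["name"])
           if p.info["name"] == "FlightSimulator2024.exe")

while True:
    mem = sim.memory_info()  # wset, private, num_page_faults on Windows
    avail = psutil.virtual_memory().available
    print(f"WS={mem.wset / 2**30:.2f}GB  Priv={mem.private / 2**30:.2f}GB  "
          f"Faults={mem.num_page_faults}  SysAvail={avail / 2**30:.2f}GB")
    time.sleep(1)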

Time passes.

============================================================
FREEZE SUMMARY — 7 events

# Start Dur Interval WS(pre) WS(post) WS delta Priv(pre) Priv(post) Priv delta Faults
1 16:41:02 15.0s 13.77GB 13.79GB +24.5MB 21.08GB 21.15GB +66.5MB +46153
2 16:42:04 15.0s 61.8s 13.72GB 13.72GB +76KB 21.18GB 21.06GB -129.4MB +434
3 16:43:06 15.0s 62.7s 13.68GB 13.68GB +492KB 20.96GB 20.96GB +204KB +633
4 16:44:10 15.0s 63.9s 13.69GB 13.69GB +396KB 21.00GB 20.94GB -64.1MB +147
5 16:45:15 15.1s 64.7s 13.74GB 13.74GB +564KB 21.07GB 21.07GB +452KB +209
6 16:46:21 15.0s 66.0s 13.76GB 13.76GB +760KB 21.07GB 21.08GB +844KB +313
7 16:47:28 15.1s 67.2s 13.83GB 13.83GB -260KB 21.14GB 21.14GB -1.7MB +483

Durations: 15.0s, 15.0s, 15.0s, 15.0s, 15.1s, 15.0s, 15.1s (avg 15.0s)
Intervals: 61.8s, 62.7s, 63.9s, 64.7s, 66.0s, 67.2s (avg 64.4s)
** Intervals are increasing (avg +1.1s per cycle) **

── Memory Growth ──
Session: 6.7 minutes (7 freezes)
Working set: 13.77GB → 13.83GB (+65.2MB over session)
Private mem: 21.08GB → 21.14GB (+58.1MB over session)
Page faults: 13459152 → 13943382 (+484230 over session)
System avail: 36.64GB → 36.25GB (-402.6MB)
Pagefile used: 36.45GB → 37.07GB (+631.3MB)

── Per-Freeze Deltas ──
WS avg=+3.8MB min=-260KB max=+24.5MB
Private avg=-18.2MB min=-129.4MB max=+66.5MB
Faults avg=+6910 min=+147 max=+46153

I asked Claude to analyze the raw data:

7 freezes, 16:41–16:47, all ~15.0s.

Intervals: 61.8, 62.7, 63.9, 64.7, 66.0, 67.2s (+1.07s/cycle, fits a quadratic with residuals <0.12s).
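
(To reconcile "linear" and "quadratic": the intervals grow linearly, so it is their cumulative sum, the start times, that fits a quadratic. If $I(k) = a + b\,k$, then $t(n) = t_0 + \sum_{k=1}^{n} I(k) = t_0 + a\,n + \frac{b}{2}\,n(n+1)$, which is quadratic in $n$.)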


Memory is NOT the cause

  • System has 36GB free RAM, pagefile at 29%. No memory pressure.

  • WS steady at ~13.7GB, Private oscillates around 21–21.8GB with no sustained growth.

  • No memory threshold is being hit.


What happens during vs after freezes

During freeze: Mostly inert. When something does happen, it’s discrete -64MB Private drops in the first 1–2s (freezes 2 and 4), then silence for 13s. The 64MB quantum suggests a specific allocator block size (large-page region or mapped file). Most freezes (3, 5, 6, 7) show zero Private change.

After freeze (the real action): Every freeze is followed by a burst of 100–600MB allocation and 5,000–25,000 page faults/s in the first 5 seconds. The sim is catching up on a backlog that accumulated while it was frozen.


Bottom line

Memory monitoring confirms the freeze is not memory-related. The culprit is an internal sim operation that blocks the main thread for a fixed 15s timeout. The memory consequences (post-freeze allocation burst) are a side effect, not the cause. This is almost certainly a backend service call or internal synchronization point in the sim engine.

Doesn’t look like a memory issue to Claude and me. Any ideas what else to check?

With only these add-ons and nothing else in Community/, here’s what I’ve found after encountering the issue again:

[screenshot: the add-on list]

The following text is AI slop, but the basic facts hold anyway:

Symptom

The simulator freezes completely (no frame updates via SimConnect, window becomes completely unresponsive) at regular intervals. Freezes begin approximately 1-2 minutes after loading and continue until the sim is closed. Restart the sim and the freezes might still happen, or everything might be fine. Couldn’t find a pattern.

Finding 1: Freeze Timing is Perfectly Deterministic

Property         Value
Duration         Fixed at 15.1–15.2s (zero variation across 51 freezes)
First interval   ~115s after load
Interval growth  +1.1s per cycle (perfectly linear)
Formula          $Interval(n) \approx 115 + 1.1 \times (n - 1)$ seconds

This is not load-dependent, random, or caused by resource pressure. It’s a scheduled task on a linearly-growing timer.

Finding 2: CPU Shows a Sawtooth Pattern

CPU time consumed during each freeze follows a repeating 4–5 freeze cycle:

  • Freeze 1: 750ms (reset)

  • Freeze 2: 6296ms

  • Freeze 3: 8609ms

  • Freeze 4: 14437ms (peak)

  • Freeze 5: 796ms (reset)

  • …repeats…

  • “Reset” freezes: ~300–800ms user CPU, ~155KB disk writes

  • “Peak” freezes: ~13,000–14,000ms user CPU, ~680KB disk writes

  • Write I/O volume correlates exactly with CPU time

  • A secondary accumulator builds up across freeze cycles and flushes every 4–5th freeze

Finding 3: Handle Leak — +156 Handles Per Freeze

Every freeze creates a net +156 handles. The creation follows an identical repeating sequence regardless of CPU load:

+26, +0, +0, +0, +13, +14, +0, +0, +0, +26, +0, +0, +0, +12 (loop)

This pattern is executed by a background thread, completely independent of the variable CPU work. Over a 2-hour session (sitting on the apron), handles grew from 15,105 to 23,069.
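
Net growth like this is easy to watch from outside the process; a minimal psutil sketch (Windows only, not the actual tooling):

# Track the sim's net handle count over time (Windows; pip install psutil).
import time
import psutil

sim = next(p for p in psutil.process_iter(["name"])
           if p.info["name"] == "FlightSimulator2024.exe")

prev = sim.num_handles()
while True:
    time.sleep(5)
    cur = sim.num_handles()
    print(f"handles={cur}  delta={cur - prev:+d}")
    prev = cur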

Finding 4: Not I/O, Not Memory, Not Network

Hypothesis           Result
Network connections  Ruled out. 68% of freezes had zero connection changes. No consistent endpoint pattern.
Memory pressure      Ruled out. 36GB free RAM, no sustained growth, most freezes showed zero private memory change.
Disk I/O             Ruled out. Only 1–2MB total read/write per freeze. I/O rates are lower during freeze than background.
Network I/O          Ruled out. “Other” ops (Winsock) are constant at ~12K ops per freeze regardless of CPU pattern.

Finding 5: Freeze Anatomy (Per-Sample Timeline)

From high-resolution sampling during freeze #52 and earlier sessions:

  1. t=0–1s: CPU burst (all threads). Multiple seconds of user+kernel CPU time consumed in the first wall-clock second (multi-threaded work).

  2. t=1–10s: Main thread blocked on synchronization primitive. Near-zero CPU. Handle creation loop continues on background thread.

  3. t=10–15s: Still blocked, zero activity.

  4. t=15.1s: Unblocks, SimConnect resumes.

The freeze is not the CPU being busy for 15 seconds. It’s ~1s of burst work followed by ~14s of the main thread waiting on something.

I noticed that you have FSUIPC installed. In some cases, it used to cause stuttering and freezes, so I still recommend testing MSFS without it. I also suggest editing your exe.xml so that FSUIPC doesn’t start automatically with MSFS; ideally, remove all lines related to FSUIPC.
Here’s an example:

<Launch.Addon>
    <Disabled>False</Disabled>
    <ManualLoad>False</ManualLoad>
    <Name>FSUIPC7</Name>
    <Path>D:\FSUIPC7\FSUIPC7.exe</Path>
    <CommandLine>-auto</CommandLine>
    <NewConsole>False</NewConsole>
</Launch.Addon>

Removing this block ensures that FSUIPC won't load automatically and allows you to test whether it's affecting performance.

Yes, please only fill out a Bug Report if you are able to reproduce a potential issue with a clean Community folder, and preferably with a clean exe.xml if you feel comfortable editing that. We ask this because it could well be a 3rd-party modification that is the cause of the issue you are facing, and our Testing Team will not try to reproduce said issue with any 3rd-party products installed.

Sorry, there must have been a misunderstanding. I am not part of the team. :grinning_face_with_smiling_eyes:

While I understand the team’s wish to get “clean” data, I prefer to spend my limited spare time either flying or working on content. I have no reason to start the sim with an empty Community folder, because then I can neither fly (because half of my controls wouldn’t be working), nor keep working on content.

For similar reasons, I have little interest in helping the team release an SU that only works properly with an empty Community/ folder because the release broke compatibility with existing tools or content. (If that is what happened.)

It was fun tweaking my existing debugging tools to try to pinpoint the cause of an issue I encountered randomly while using the sim, and which I saw for the first time after a specific beta release.

Trying to reproduce an issue that happens randomly maybe once a day (and I restart the sim frequently, because reasons) is not fun; that's why people get paid to do it. I'm sorry, but someone else will have to provide the information on whether they see the issue without FSUIPC or with a totally empty Community/ (if the report by the OP doesn't count and the information I provided isn't specific enough to warrant dev eyeballs).

What I can commit to, though: next time I encounter the freezes, I will quit FSUIPC and report whether the freezes stopped or continued.

My working theory is that it’s faulty telemetry code that only triggers in specific situations and that will hopefully be disabled in the release build.

No freezes in the past two days; now it's freezing again. I ran my “freezemon” tool, let it gather data for three freezes, then quit FSUIPC7 (using “Exit” from the menu, so it was definitely not running any longer) and let the tool continue gathering data for the next two freezes.

I find it somewhat unlikely that FSUIPC7 can cause the sim to freeze while it’s not running.

There’s a summary block at the end of the tool output:

$ scripts/freezemon/freezemon.exe -threshold 1 -log tmp/freezemon_08.csv
freezemon — MSFS freeze detector (I/O + CPU + handles)

Looking for FlightSimulator2024.exe... PID 11684
Baseline: IO: R=12.83GB/399233ops W=85.1MB/42228ops Oth=56.5MB/1490297ops | CPU: User=402828ms Kern=150016ms | Hdl=6073

SimConnect connected. Watching for freezes (threshold 1.0s)...

Receiving sim data (simTime=46.9).

[00:10:29] OK  simTime=47  3 frames  0 freezes  Hdl=6073
>>> FREEZE #1 at 00:10:36.156  (simTime=53.8, no data for 1.0s)
    Pre: IO: R=12.83GB/403626ops W=85.5MB/45804ops Oth=57.1MB/1494254ops | CPU: User=421375ms Kern=154391ms | Hdl=6071
    [+  1.0s] R +87.8KB/2477ops  W +158.3KB/1343ops  Oth +370.1KB/2669ops  User +7344ms  Kern +2047ms  Hdl +5
    [+  4.0s] R +196.7KB/5121ops  W +167.1KB/1426ops  Oth +736.7KB/4942ops  User +7422ms  Kern +2094ms  Hdl +40
    [+  6.1s] R +230.8KB/6854ops  W +220.3KB/1453ops  Oth +974.7KB/6358ops  User +7453ms  Kern +2266ms  Hdl +53
    [+  6.6s] R +287.0KB/7325ops  W +222.6KB/1475ops  Oth +1.0MB/7058ops  User +7484ms  Kern +2344ms  Hdl +67
    [+  8.6s] R +368.1KB/9115ops  W +230.8KB/1553ops  Oth +1.3MB/8674ops  User +7531ms  Kern +2406ms  Hdl +93
    [+ 10.6s] R +402.2KB/10846ops  W +284.0KB/1580ops  Oth +1.5MB/10089ops  User +7656ms  Kern +2578ms  Hdl +105
    [+ 11.1s] R +458.6KB/11336ops  W +286.3KB/1602ops  Oth +1.6MB/10788ops  User +7672ms  Kern +2641ms  Hdl +119
    [+ 13.2s] R +539.6KB/13105ops  W +294.4KB/1680ops  Oth +1.8MB/12406ops  User +7797ms  Kern +2703ms  Hdl +145
    [+ 13.7s] R +547.8KB/13541ops  W +294.4KB/1680ops  Oth +1.9MB/12708ops  User +7969ms  Kern +2719ms  Hdl +145
<<< FREEZE #1 ENDED at 00:10:51.194  duration=15.0s  simTime=53.8  [ACTIVE (user+kernel)]
    Delta: R +572.7KB/14722ops  W +365.0KB/2352ops  Oth +2.0MB/13611ops  User +8047ms  Kern +2781ms  Hdl +157

[00:10:59] OK  simTime=62  914 frames  1 freezes  Hdl=6223
[00:11:29] OK  simTime=91  1763 frames  1 freezes  Hdl=6248
>>> FREEZE #2 at 00:11:28.365  (simTime=91.4, no data for 1.0s)
    Pre: IO: R=12.88GB/449239ops W=89.9MB/72676ops Oth=63.3MB/1535704ops | CPU: User=551203ms Kern=189062ms | Hdl=6248
    [+  1.0s] R +94.6KB/3002ops  W +214.7KB/1778ops  Oth +430.4KB/2980ops  User +8781ms  Kern +3344ms  Hdl +5
    [+  4.1s] R +203.9KB/5672ops  W +223.4KB/1860ops  Oth +797.5KB/5257ops  User +8812ms  Kern +3406ms  Hdl +38
    [+  6.1s] R +237.7KB/7390ops  W +277.3KB/1887ops  Oth +1.0MB/6671ops  User +8828ms  Kern +3594ms  Hdl +51
    [+  6.6s] R +294.0KB/7872ops  W +279.6KB/1909ops  Oth +1.1MB/7370ops  User +8828ms  Kern +3672ms  Hdl +65
    [+  8.6s] R +375.3KB/9661ops  W +287.7KB/1987ops  Oth +1.3MB/8987ops  User +8828ms  Kern +3688ms  Hdl +89
    [+ 10.6s] R +409.2KB/11382ops  W +341.6KB/2014ops  Oth +1.6MB/10403ops  User +8922ms  Kern +3859ms  Hdl +100
    [+ 11.1s] R +465.4KB/11855ops  W +343.9KB/2036ops  Oth +1.6MB/11100ops  User +8938ms  Kern +3906ms  Hdl +114
    [+ 13.2s] R +546.3KB/13620ops  W +352.0KB/2114ops  Oth +1.9MB/12715ops  User +8984ms  Kern +3922ms  Hdl +140
<<< FREEZE #2 ENDED at 00:11:43.358  duration=15.0s  simTime=91.5  [ACTIVE (user+kernel)]
    Delta: R +576.1KB/15183ops  W +423.3KB/2785ops  Oth +2.1MB/13915ops  User +8984ms  Kern +3953ms  Hdl +152

[00:11:59] OK  simTime=108  977 frames  2 freezes  Hdl=6399
>>> FREEZE #3 at 00:12:21.648  (simTime=130.2, no data for 1.0s)
    Pre: IO: R=12.90GB/494504ops W=93.2MB/100039ops Oth=69.5MB/1576718ops | CPU: User=693266ms Kern=231391ms | Hdl=6396
    [+  1.0s] R +111.5KB/3921ops  W +305.4KB/2553ops  Oth +550.8KB/3749ops  User +15266ms  Kern +3125ms  Hdl +5
    [+  2.0s] R +139.6KB/4803ops  W +306.1KB/2558ops  Oth +670.8KB/4406ops  User +15375ms  Kern +3141ms  Hdl +17
    [+  3.5s] R +164.1KB/6095ops  W +306.1KB/2558ops  Oth +849.2KB/5309ops  User +15484ms  Kern +3203ms  Hdl +12
    [+  4.0s] R +220.6KB/6581ops  W +314.2KB/2636ops  Oth +917.9KB/6024ops  User +15500ms  Kern +3234ms  Hdl +38
    [+  6.1s] R +254.7KB/8324ops  W +368.5KB/2663ops  Oth +1.1MB/7439ops  User +15594ms  Kern +3406ms  Hdl +50
    [+  6.6s] R +311.0KB/8804ops  W +370.8KB/2685ops  Oth +1.2MB/8136ops  User +15656ms  Kern +3484ms  Hdl +64
    [+  8.6s] R +392.1KB/10588ops  W +379.0KB/2763ops  Oth +1.4MB/9756ops  User +15766ms  Kern +3547ms  Hdl +92
    [+ 10.6s] R +426.2KB/12321ops  W +433.2KB/2790ops  Oth +1.7MB/11171ops  User +15922ms  Kern +3734ms  Hdl +104
    [+ 11.1s] R +482.4KB/12800ops  W +435.5KB/2812ops  Oth +1.7MB/11869ops  User +15984ms  Kern +3797ms  Hdl +118
    [+ 12.6s] R +506.8KB/14076ops  W +435.5KB/2812ops  Oth +1.9MB/12767ops  User +16094ms  Kern +3844ms  Hdl +118
    [+ 13.1s] R +563.2KB/14562ops  W +443.7KB/2890ops  Oth +2.0MB/13482ops  User +16109ms  Kern +3844ms  Hdl +144
<<< FREEZE #3 ENDED at 00:12:36.676  duration=15.0s  simTime=130.3  [ACTIVE (user+kernel)]
    Delta: R +594.9KB/16180ops  W +515.0KB/3561ops  Oth +2.2MB/14702ops  User +16266ms  Kern +3891ms  Hdl +156

[00:12:36] OK  simTime=130  1362 frames  3 freezes  Hdl=6552
[00:13:06] OK  simTime=161  1831 frames  3 freezes  Hdl=6554
>>> FREEZE #4 at 00:13:16.079  (simTime=170.1, no data for 1.0s)
    Pre: IO: R=12.95GB/544459ops W=94.9MB/114377ops Oth=76.6MB/1631213ops | CPU: User=856703ms Kern=272703ms | Hdl=6556
    [+  1.0s] R +10.7KB/599ops  W +877B/11ops  Oth +60.1KB/364ops  User +16ms  Kern +0ms  Hdl +4
    [+  4.1s] R +115.2KB/3135ops  W +9.7KB/94ops  Oth +427.2KB/2641ops  User +31ms  Kern +16ms  Hdl +37
    [+  6.1s] R +145.9KB/4777ops  W +64.3KB/121ops  Oth +723.0KB/4263ops  User +125ms  Kern +188ms  Hdl +49
    [+  6.6s] R +201.5KB/5237ops  W +66.6KB/143ops  Oth +791.5KB/4963ops  User +250ms  Kern +281ms  Hdl +63
    [+  8.6s] R +279.2KB/6923ops  W +74.7KB/221ops  Oth +1.0MB/6580ops  User +312ms  Kern +312ms  Hdl +89
    [+ 10.7s] R +310.1KB/8581ops  W +129.4KB/248ops  Oth +1.2MB/7996ops  User +391ms  Kern +500ms  Hdl +101
    [+ 11.2s] R +365.7KB/9045ops  W +131.7KB/270ops  Oth +1.3MB/8695ops  User +406ms  Kern +547ms  Hdl +115
    [+ 12.2s] R +380.6KB/9875ops  W +131.7KB/270ops  Oth +1.4MB/9299ops  User +531ms  Kern +609ms  Hdl +115
    [+ 13.2s] R +443.7KB/10749ops  W +139.8KB/348ops  Oth +1.6MB/10316ops  User +547ms  Kern +625ms  Hdl +141
<<< FREEZE #4 ENDED at 00:13:31.088  duration=15.0s  simTime=170.2  [ACTIVE (user+kernel)]
    Delta: R +469.9KB/12206ops  W +194.0KB/369ops  Oth +1.7MB/11296ops  User +578ms  Kern +656ms  Hdl +153

[00:13:36] OK  simTime=176  917 frames  4 freezes  Hdl=6711
[00:14:06] OK  simTime=206  1829 frames  4 freezes  Hdl=6704
>>> FREEZE #5 at 00:14:11.453  (simTime=211.0, no data for 1.0s)
    Pre: IO: R=12.96GB/589004ops W=95.2MB/117195ops Oth=83.0MB/1672935ops | CPU: User=1006281ms Kern=314031ms | Hdl=6702
    [+  1.0s] R +61.1KB/1350ops  W +4.9KB/70ops  Oth +252.6KB/1844ops  User +2297ms  Kern +859ms  Hdl +7
    [+  3.0s] R +102.4KB/3008ops  W +5.6KB/75ops  Oth +491.2KB/3101ops  User +2422ms  Kern +875ms  Hdl +16
    [+  4.0s] R +165.5KB/3885ops  W +13.7KB/153ops  Oth +619.7KB/4119ops  User +2484ms  Kern +891ms  Hdl +42
    [+  6.1s] R +196.2KB/5537ops  W +68.7KB/180ops  Oth +857.6KB/5533ops  User +2625ms  Kern +1047ms  Hdl +55
    [+  6.6s] R +251.7KB/5994ops  W +71.0KB/202ops  Oth +926.0KB/6230ops  User +2672ms  Kern +1125ms  Hdl +69
    [+  8.6s] R +329.5KB/7701ops  W +79.1KB/280ops  Oth +1.1MB/7847ops  User +2750ms  Kern +1156ms  Hdl +95
    [+ 10.1s] R +351.7KB/8933ops  W +131.1KB/281ops  Oth +1.3MB/8763ops  User +3031ms  Kern +1250ms  Hdl +95
    [+ 10.6s] R +360.4KB/9350ops  W +134.1KB/307ops  Oth +1.4MB/9266ops  User +3078ms  Kern +1391ms  Hdl +107
    [+ 11.1s] R +416.0KB/9818ops  W +136.4KB/329ops  Oth +1.4MB/9964ops  User +3094ms  Kern +1469ms  Hdl +121
    [+ 13.2s] R +493.7KB/11509ops  W +144.5KB/407ops  Oth +1.7MB/11579ops  User +3094ms  Kern +1531ms  Hdl +147
<<< FREEZE #5 ENDED at 00:14:26.444  duration=15.0s  simTime=211.0  [ACTIVE (user+kernel)]
    Delta: R +520.4KB/12992ops  W +199.1KB/428ops  Oth +1.9MB/12774ops  User +3094ms  Kern +1531ms  Hdl +159


============================================================
  FREEZE SUMMARY — 5 events
============================================================

  #   Start     Dur   Interval  Read         Write        Other            User   Kernel  Hdl   Classification
------------------------------------------------------------------------------------------------------------------------
  1   00:10:36  15.0s      ---  572.7KB/14722 365.0KB/2352 2.0MB/13611   +8047ms   +2781ms  +157  ACTIVE (user+kernel)
  2   00:11:28  15.0s    52.2s  576.1KB/15183 423.3KB/2785 2.1MB/13915   +8984ms   +3953ms  +152  ACTIVE (user+kernel)
  3   00:12:21  15.0s    53.3s  594.9KB/16180 515.0KB/3561 2.2MB/14702  +16266ms   +3891ms  +156  ACTIVE (user+kernel)
  4   00:13:16  15.0s    54.4s  469.9KB/12206 194.0KB/369  1.7MB/11296    +578ms    +656ms  +153  ACTIVE (user+kernel)
  5   00:14:11  15.0s    55.4s  520.4KB/12992 199.1KB/428  1.9MB/12774   +3094ms   +1531ms  +159  ACTIVE (user+kernel)

Durations: 15.0s, 15.0s, 15.0s, 15.0s, 15.0s  (avg 15.0s)
Intervals: 52.2s, 53.3s, 54.4s, 55.4s  (avg 53.8s)
  ** Intervals are increasing (avg +1.1s per cycle) **

── I/O During Freezes ──
  Read:  2.7MB across 71283 ops
  Write: 1.7MB across 9495 ops
  Other: 10.0MB across 66298 ops

── CPU Time During Freezes ──
  User time:   36969ms (49.3% of freeze time)
  Kernel time: 12812ms (17.1% of freeze time)
  ** Freeze time is user-dominated — process is computing **

── Handle Trend ──
  Start: 6071  End: 6861  (+790 over 5 freezes)
  ** Handles growing: +158.0 per freeze cycle — possible handle leak **

── Background vs Freeze I/O Rate ──
  Background: R=635.8KB/s  W=47.7KB/s  Oth=122.6KB/s
  Frozen:     R=36.4KB/s  W=22.6KB/s  Oth=136.1KB/s
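
I'm not publishing the tool, but the detection idea is simple enough to sketch (hypothetical code, psutil-based; in the real tool, last_frame is updated by a SimConnect data callback rather than being a plain variable):

# Freeze detector sketch: a "freeze" is >1s without a sim frame; on entry
# and exit we snapshot process counters to get per-freeze deltas.
import time
import psutil

THRESHOLD = 1.0  # seconds without frames before declaring a freeze

def snapshot(proc):
    io, cpu = proc.io_counters(), proc.cpu_times()
    return {"read": io.read_bytes, "write": io.write_bytes,
            "other": io.other_bytes, "user_ms": int(cpu.user * 1000),
            "kern_ms": int(cpu.system * 1000), "handles": proc.num_handles()}

sim = next(p for p in psutil.process_iter(["name"])
           if p.info["name"] == "FlightSimulator2024.exe")

last_frame = time.monotonic()  # in reality: updated by the SimConnect callback
frozen, pre, start = False, None, 0.0
while True:
    time.sleep(0.25)
    stalled = time.monotonic() - last_frame > THRESHOLD
    if stalled and not frozen:
        frozen, pre, start = True, snapshot(sim), time.monotonic()
        print(">>> FREEZE detected")
    elif not stalled and frozen:
        frozen, post = False, snapshot(sim)
        deltas = {k: post[k] - pre[k] for k in pre}
        print(f"<<< FREEZE ended after {time.monotonic() - start:.1f}s  {deltas}")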

I was able to get it to trigger a few times again. I ran the game with a Task Manager window in the bottom right corner showing the game's CPU and RAM usage. When the freezes happen, I can see that CPU usage drops VERY VERY low (near 0%), and sometimes the RAM will stay the same. This can even happen on the main menu. I'll have to see if I can get a video or clip of it.

So far I’ve also tried turning off FSUIPC7 and removing it from the XML on launch. No effect.

I am well aware that AI bug reports are a scourge, but since I didn’t expect any help on this issue, and it hits me frequently, I started another round of investigations. I’m working on a mod for the 2020 Aerosoft Twotter, and this model triggers the problem quite often. My goal was to identify the subsystem that caused the hangs (audio, CoherentGT, whatever) to narrow down the possible culprit in the Twotter base model and fix it in my mod.

So I fed an ETL file to Claude, and it came up with a surprising analysis.

Each 15s freeze I see is actually the sim crashing (apparently from a null pointer exception) and then recovering from the crash. I’m not a Windows guy, I didn’t know about Structured Exception Handling, and I have no idea whether Claude is pulling this out of whatever orifice LLMs have.

I just drop it here and then I re-focus on trying to locate and fix the trigger of the crash in the Twotter to save my mod. The rest of this post is AI slop. *drops mic*

What the freeze actually is

AsoboReport-Crash.txt is written at the moment of every “freeze”:

[OnCrash] Msg="_handleSEH" Type="SEH" Code=0xC0000005  ← STATUS_ACCESS_VIOLATION
[LastKnownDump] Where="CGameFlight_G::TryLaunchFlight"

So MSFS is actually crashing — null-pointer / use-after-free deep in its game-flight state machine — and recovering via its Structured Exception Handler. The 15.0–15.2 s “freeze” is the SEH handler running:

  1. catch the AV
  2. snapshot whole engine state to AsoboReport-Crash.txt + AsoboReport-RunningSession.txt (~30 KB each, 4 writes/freeze ≈ 120 KB total)
  3. attempt cleanup
  4. resume main loop

That explains everything previously measured:

  • Fixed 15 s duration → fixed-cost SEH dump+recovery, not a timeout.
  • 49 % user / 17 % kernel CPU during the “freeze” → the recovery is doing real work (walking stacks, serializing engine state, file I/O).
  • +158 net handles per cycle → cleanup leaks handles whose owners died in the AV.
  • +1.1 s/cycle interval growth → handle population grows unboundedly between AVs; the next AV hits slightly later each cycle as accumulated state grows further.

The handle accumulation mechanism

MSFS has an internal verifier called AsynchCheckHandles. The dump shows its scoreboard:

[AsynchCheckHandles]
Status=“Idle”
EndedXFramesAgo=5178 ← last verifier pass ended 86 s ago
FreedHandles=225 ← that pass freed 225 handles
CreatedHandlesSinceEnded=27166 ← 27,166 handles created since
CreatedHandlesDuringACH=18

8 seconds later, the next snapshot shows +19,227 new handles, +88 freed. That's ~2,400 handles created per second, against ~11/s freed. The verifier is wildly outpaced; one of the dangling/half-freed handles is eventually what gets dereferenced and triggers the AV.

Pattern across history

5 prior crash dumps in AsoboReport-OldCrashes/, all on BuildVersion 1.7.17.0. When the Twin Otter is loaded, the ACH section is populated and shows runaway handle growth. With no aircraft loaded, the ACH section is empty (subsystem not active) but AVs still happen — there are at least two distinct failure modes here.

Other artifacts worth flagging

  • Stack frames are all in KittyHawkx64_Steam_PCSUB.pdb GUID BEB550BCB6A9EB47BA5757C50CDBD20F. With that PDB the 21-frame
    callstack would resolve to function names and point at the exact offending code.
  • Steam VR is initialized (vrclient_FlightSimulator2024.txt and vrserver.txt log writes during freeze) even though presumably not in VR. That’s overhead-level, probably not causal, but it loads extra Wwise/Renoir state.

Okay, one more thing: I kept auditing the Twotter base package for the culprit, based on Claude’s understanding of the issue. I found two classes of bugs that seemed likely candidates: typos in XML attribute names and assignments of sound events to non-existing interactive points.

I fixed both classes of issues in my mod and… I don’t want to jinx it, but… I haven’t seen the sustained crashing anymore.
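
For the attribute-name class, the audit is mechanical; a sketch of the idea (the allowlist below is invented for illustration, it is not the sim's actual schema):

# Flag XML attributes whose names aren't in a known-good set.
# KNOWN_ATTRS is a made-up example; a real audit would build it from
# the attribute names the sim's behavior schemas actually define.
import sys
import xml.etree.ElementTree as ET

KNOWN_ATTRS = {"Name", "ID", "WwiseEvent", "Node"}  # hypothetical allowlist

def audit(path):
    for _, elem in ET.iterparse(path, events=("start",)):
        for attr in elem.attrib:
            if attr not in KNOWN_ATTRS:
                print(f"{path}: <{elem.tag}> has unknown attribute '{attr}'")

for f in sys.argv[1:]:
    audit(f)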

My current hypothesis:

  • XML that looks correct (xmllint is happy) but contains invalid attribute names is parsed by the sim,
  • the invalid attributes lead to a corrupt data structure,
  • a sim-internal task that traverses this corrupt data structure about once per minute (some kind of clean-up job, garbage collection, whatever) stumbles over an invalid entry, throws an exception, and triggers SEH,
  • SEH does its thing, which includes logging, possibly phoning home (just an assumption, I have no evidence), and general cleanup, and that takes 15s,
  • when SEH is done, the sim is in exactly the state it was in before the exception. In most situations (which probably led to a CTD before SEH), the user experiences a 15s hang and can then continue the flight, which is wonderful and borders on black magic. But in rare situations (like mine), SEH can't resolve the underlying issue (it can't fix the corrupt data structure), so it is triggered over and over again.

It looks like the parsing of legacy content (or content in general, dunno) has become less robust in SU5. Considering the state some of our content is in (shiny on the outside, but when you look under the hood, it’s like Plinkett’s basement), it would not surprise me at all if more people ran into issues like this after release.

It’s a shame this can’t be escalated to the devs because you can’t report a bug about the handling of add-ons when you’re not allowed to use add-ons for a bug report. This passes the buck back to content developers, who might or might not have interest in fixing a legacy product. :smiley:

I think my Twotter mod is saved, so I’m gonna stop caring about this bug now.

[edit: Only after posting this did I realize that there was a .21 release overlapping with my attempts to find the problem in the Twotter. That makes it much more likely that Asobo fixed the issue in that release, and my fixed typos might have done nothing but restore a few missing sound effects in the Twotter.]