Hi, I have an AMP_B block that I'm driving into saturation, and I'm seeing roughly a 4 dB difference in the compressed stage's output power depending on whether I use the Time Domain Simulator (measured with PWR_MTR, verified with PWR_SPEC) or the RF Budget Analysis (measured with SPWR_node). A side-by-side comparison is shown below, where I sweep the input power: the left plot shows the input and output power of the compressed stage, and the right plot shows the cascaded signal power through the entire chain. The AMP_B block is set up with OP1dB = 19 dBm / Psat = 21 dBm, so the RF Budget Analysis appears to be the more realistic of the two.
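For reference, here's a quick Python sanity check (outside the simulator) of the compression curve I'd expect from those settings. It uses a generic Rapp-style AM/AM soft limiter, which is only an assumption about the nonlinearity, not necessarily what AMP_B implements internally, and the 13 dB small-signal gain is just a placeholder, not a value from my actual chain:

```python
# Sanity check (not VSS): generic Rapp-style AM/AM soft limiter with my
# block's numbers (OP1dB = 19 dBm, Psat = 21 dBm). The smoothness factor p
# is solved so the model's 1 dB compression point lands exactly at OP1dB.
# AMP_B's internal nonlinearity may differ -- this is just the shape I'd
# expect a behavioral amplifier model to follow into saturation.
import numpy as np
from scipy.optimize import brentq

PSAT_DBM = 21.0    # saturated output power
OP1DB_DBM = 19.0   # output 1 dB compression point
GAIN_DB = 13.0     # hypothetical small-signal gain, for illustration only

def pout_dbm(pin_dbm, p):
    """Rapp AM/AM: vout = g*vin / (1 + (g*vin/vsat)^(2p))^(1/(2p))."""
    vout_lin = 10 ** ((pin_dbm + GAIN_DB) / 20.0)   # uncompressed output, voltage-like
    vsat = 10 ** (PSAT_DBM / 20.0)                  # saturation level, same scale
    vout = vout_lin / (1.0 + (vout_lin / vsat) ** (2 * p)) ** (1.0 / (2 * p))
    return 20.0 * np.log10(vout)

def op1db_error(p):
    # Find the input where the gain has compressed by exactly 1 dB, then
    # compare the output power there against the target OP1dB.
    pin = np.linspace(-30.0, 30.0, 20001)
    comp = (pin + GAIN_DB) - pout_dbm(pin, p)       # compression in dB, monotonic in pin
    pin_1db = np.interp(1.0, comp, pin)
    return np.interp(pin_1db, pin, pout_dbm(pin, p)) - OP1DB_DBM

p = brentq(op1db_error, 0.5, 10.0)                  # smoothness that fits OP1dB

for pin in range(-10, 21, 5):
    print(f"Pin = {pin:+3d} dBm -> Pout = {pout_dbm(pin, p):6.2f} dBm")
```

The printed curve rolls off smoothly and saturates at 21 dBm, which is the behavior I'd expect either simulator to reproduce for this block.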
Is there something I am missing in my interpretation? Or is AMP_B simply not suitable for large-signal analysis with the Time Domain Simulator?