Hi there,
I am trying to understand the difference and the correlation between the time-averaging and sampled-jitter methods.
The sampled-jitter method calculates the noise at a particular threshold-crossing point. The time-averaging method splits the noise into AM and PM components, according to whether the noise varies the signal's amplitude or its phase.
So I simulated the jitter of a single inverter and obtained PSD_pm0 and PSD_pm1, the PM jitter results for the rising and falling edges respectively.
Then I added a limiter (the comparator cell from ahdlLib) at the inverter output and simulated the USB/LSB noise of the 1st harmonic in dBc/Hz using time averaging; with the limiter in place there is no AM noise.
Furthermore, I simulated the noise contributions of the PMOS and the NMOS separately using the analog option, obtaining PSD_pmos and PSD_nmos.
I expected PSD_pm0 = PSD_pmos and PSD_pm1 = PSD_nmos, but what I actually get is PSD_pm0 = PSD_pmos + 6 dB and PSD_pm1 = PSD_nmos + 6 dB.
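To quantify the discrepancy (my own arithmetic, just as a sanity check on the numbers): a 6 dB offset in a power spectral density is a factor of about 4 in power, i.e. a factor of 2 in RMS amplitude:

```latex
% 6 dB expressed as power and amplitude ratios:
\[
  6\,\mathrm{dB} \approx 10 \log_{10} 4 \;=\; 20 \log_{10} 2
\]
```

So the question is why each device's individually simulated PM noise comes out a factor of 4 (in power) below the per-edge PM jitter result.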
My working environment is ic618.
Best regards,
Lu Lv