Hello,
I am trying to find the effect of supply noise on the jitter of a delay line (a set of buffers in series). When I went through various documents, I found the following options:
1. PSS + PAC with the sampled option
2. PSS + PXF with the sampled option
In both cases the sampled option is selected, since only the jitter at the zero crossing matters (see http://www.designers-guide.org/Forum/YaBB.pl?num=1224609785).
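To make the zero-crossing picture concrete, here is a small Python sketch of the relation I have in mind: a small voltage disturbance at the switching threshold moves the crossing time by roughly the disturbance divided by the slew rate. The edge shape, swing, and disturbance below are made-up numbers, not from my circuit.

import numpy as np

# Synthetic buffer edge: a small voltage disturbance dv at the switching
# threshold moves the crossing time by roughly dt = dv / slew_rate, which is
# why only the behaviour sampled at the crossing matters for jitter.
t = np.linspace(0.0, 1e-9, 1_000_001)          # 1 ns window (placeholder)
edge = 0.6 * np.tanh((t - 0.5e-9) / 50e-12)    # made-up edge, +/-0.6 V swing
dv = 1e-3                                      # 1 mV disturbance at the threshold

# Slew rate at the zero crossing (numerical derivative at the threshold).
i_cross = np.argmin(np.abs(edge))
slew = np.gradient(edge, t)[i_cross]           # [V/s]

# Time shift predicted by the small-signal relation.
dt_pred = dv / slew

# Time shift measured by actually offsetting the waveform by dv.
i_cross_shifted = np.argmin(np.abs(edge + dv))
dt_meas = t[i_cross] - t[i_cross_shifted]

print(f"slew = {slew:.3e} V/s, dt predicted = {dt_pred:.3e} s, dt measured = {dt_meas:.3e} s")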
Let me start by explaining one setup. When I simulate PSS + sampled PAC, I set PAC magnitude = 1 on my power-supply voltage source, set the maximum sideband to 0, and sweep the PAC source frequency from 10 kHz to 10 GHz. The input clock to the delay line is 1 GHz.
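To show how I intend to post-process this sweep, here is a rough Python sketch. The arrays stand in for the exported sampled-PAC transfer function (supply disturbance to the output sampled at the zero crossing); the transfer function, slew rate, and supply-noise PSD are all placeholders, not simulator output.

import numpy as np

def band_integral(psd, f):
    # Trapezoidal integration of a one-sided PSD over the frequency grid.
    return np.sum(0.5 * (psd[1:] + psd[:-1]) * np.diff(f))

# Placeholder stand-ins for the exported sampled-PAC results: `freq` is the
# swept supply-disturbance frequency, `H` is the transfer function from the
# 1 V PAC source on the supply to the delay-line output sampled at the zero
# crossing, and `slew` is the output slew rate at that crossing.
freq = np.logspace(4, 10, 601)              # 10 kHz .. 10 GHz sweep [Hz]
H = 0.1 / (1.0 + 1j * freq / 100e6)         # placeholder transfer function [V/V]
slew = 1.2e10                               # placeholder slew rate [V/s]

# Supply disturbance at the crossing (V) -> time displacement (s).
jitter_sens = np.abs(H) / slew              # jitter sensitivity [s/V]

# With a supply-noise PSD (placeholder white 100 nV/sqrt(Hz)), this gives an
# absolute-jitter PSD and an rms value over the band of interest.
S_vdd = (100e-9) ** 2 * np.ones_like(freq)  # [V^2/Hz]
S_jabs = jitter_sens ** 2 * S_vdd           # [s^2/Hz]

fclk = 1e9
band = freq <= fclk / 2
print("absolute jitter rms up to Fclk/2:",
      np.sqrt(band_integral(S_jabs[band], freq[band])), "s")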
I am a little confused by the output spectrum that I see with the above simulation. Since it is a sampled PAC, I was expecting the spectrum to repeat every 1 GHz (the fundamental tone), as it does in a PSS + sampled PNOISE analysis. However, in the PSS + sampled PAC the spectrum does not repeat. Can someone explain what is wrong with my understanding?
Also, if I wanted period jitter (instead of absolute jitter) from a sampled PNOISE simulation, I would multiply the output spectrum by (1 - z^-1), divide by the slope of the output waveform at the transition point, and integrate from 0 to Fclk/2. How do I find the period jitter in a sampled PAC analysis?
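For reference, this is roughly how I would do that period-jitter post-processing on a sampled-pnoise spectrum in Python. The output-noise PSD, slew rate, and frequency grid below are placeholders, not simulation results.

import numpy as np

def band_integral(psd, f):
    # Trapezoidal integration of a one-sided PSD over the frequency grid.
    return np.sum(0.5 * (psd[1:] + psd[:-1]) * np.diff(f))

# Placeholders: sampled output-noise PSD at the zero crossing [V^2/Hz],
# output slew rate at that crossing [V/s], 1 GHz clock.
fclk = 1e9
freq = np.linspace(1e4, fclk / 2, 50_001)
S_vout = (50e-6) ** 2 / (1.0 + (freq / 200e6) ** 2)   # placeholder PSD [V^2/Hz]
slew = 1.2e10                                         # placeholder slew rate [V/s]

# Voltage noise at the crossing -> absolute-jitter PSD.
S_jabs = S_vout / slew ** 2                           # [s^2/Hz]

# Period jitter is the first difference of absolute jitter, so weight the PSD
# by |1 - z^-1|^2 with z = exp(j*2*pi*f/fclk), then integrate to Fclk/2.
w = np.abs(1.0 - np.exp(-1j * 2 * np.pi * freq / fclk)) ** 2
S_jper = w * S_jabs

print(f"absolute jitter rms = {np.sqrt(band_integral(S_jabs, freq)):.3e} s")
print(f"period jitter rms   = {np.sqrt(band_integral(S_jper, freq)):.3e} s")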
All my questions basically come from my confusion that the output spectrum is not repetitive, as I would expect for a sampled system.
Thanks for the help.