We present an analytic theory of timing jitter in dispersion-managed light-wave systems that is based on the moment method and the assumption of a chirped Gaussian pulse. We apply the theory to a soliton system and show that 50% postcompensation of the accumulated dispersion can reduce the jitter by a factor of 2. We also apply the theory to a low-power light-wave system employing the return-to-zero format and find that timing jitter can be minimized along the fiber link for an optimal choice of precompensation and postcompensation. (C) 2001 Optical Society of America.
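For orientation, a minimal sketch of the standard moment-method quantities referred to above (these are the generic textbook definitions, not equations reproduced from the paper itself): the pulse position is defined as the first temporal moment of the field envelope $A(z,t)$, and the timing jitter is its variance over the amplifier-noise ensemble,
\[
T(z) = \frac{1}{E}\int_{-\infty}^{\infty} t\,|A(z,t)|^{2}\,dt,
\qquad
E = \int_{-\infty}^{\infty} |A(z,t)|^{2}\,dt,
\qquad
\sigma_t^{2}(z) = \langle T^{2}(z)\rangle - \langle T(z)\rangle^{2}.
\]
Under the chirped-Gaussian-pulse assumption, $A(z,t)$ is taken proportional to $\exp\!\bigl[-\tfrac{1}{2}(1+iC)\,(t-T)^{2}/T_0^{2}\bigr]$ with chirp $C(z)$ and width $T_0(z)$, which closes the moment equations analytically along the dispersion-managed link.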