In modern transistor-based logic gates, the impact of noise on computation has become increasingly relevant: the voltage-scaling strategy aimed at decreasing dissipated power has increased the probability of error due to reduced switching threshold voltages. In this paper, we discuss the role of noise in a two-state model that mimics the dynamics of standard logic gates and show that the presence of noise sets a fundamental limit on computing speed. An optimal idle-time interval that minimizes the error probability is derived.