Noise in Biological Systems
Fundamental Limits on the Suppression of Randomness in Cells
Harvard Medical School, Boston, MA, USA
Negative feedback is common in all types of biological processes and can increase a system’s stability against internal and external perturbations. But at the molecular level, control loops always rely on finite rates for the random births and deaths of individual signal molecules, and they involve delays due to production or transport. By combining mathematical tools from information theory and physical chemistry, we show that seemingly mild constraints place severe limits on fluctuations that no control system could overcome, regardless of nonlinear or spatial effects. In this limit, noise suppression is set by the fourth root rather than the square root of the number of signal molecules made: halving the noise requires 16 times as many molecules, making a decent job 16 times harder than a half-decent job. We also show how limits and trade-offs become dramatically more restrictive in chemical cascades, where information is inevitably lost at each step. This may explain many subtle feedback phenomena observed in cells, such as creative strategies to minimize the length of signaling cascades, nested feedback loops that partially counteract each other, and molecular memory that reduces the randomness of each birth or death event. The theory is formulated in terms of biological observables that have been measured for many systems, and existing data suggest that cells use brute force when noise suppression is essential, for example expressing regulatory genes tens of thousands of times per cell cycle. We also develop new experimental methods to count the exact number of plasmid molecules in individual cells; the results suggest that plasmids operate relatively close to the fundamental limits.
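The "16 times harder" arithmetic above follows directly from the scaling exponents. A minimal sketch (illustrative only, not the paper's derivation; the function name and the simple power-law model are assumptions) comparing how many extra signal molecules are needed to halve relative noise under square-root versus fourth-root scaling:

```python
# Illustrative sketch: fold-increase in signal-molecule number N needed to
# halve relative noise, assuming a simple power law sigma/mean ~ N**(-e).
# e = 1/2 models Poisson-like birth-death statistics; e = 1/4 models the
# fourth-root scaling at the feedback-limited bound described in the text.

def fold_increase_to_halve_noise(exponent):
    """Solve (k * N)**(-exponent) = 0.5 * N**(-exponent) for the fold k."""
    return 0.5 ** (-1.0 / exponent)

poisson = fold_increase_to_halve_noise(0.5)    # square-root scaling
limited = fold_increase_to_halve_noise(0.25)   # fourth-root scaling

print(poisson)  # 4.0  -> 4x more molecules suffice under Poisson scaling
print(limited)  # 16.0 -> 16x more molecules at the fundamental limit
```

The same calculation explains the brute-force numbers quoted above: pushing noise down by a factor of 10 at the limit costs 10^4 more signaling events per cell cycle.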