Alex Nugent continues the series "AHaH Computing in a Nutshell" by briefly explaining why AHaH Computing can be so efficient, with a simple demonstration of adding numbers in the analog domain versus adding them digitally.
First he discusses biological synapses and neurons and compares them to silicon-based electronic components, posing the question: "If electronics are smaller and faster than biology, why is biology better at some computing tasks?" Next, Alex introduces the concept of capacitance and points out that it is the capacitance of the pathways between a digital processor and RAM that kills efficiency for large-scale machine learning problems. Finally, he steps through a basic example of how much energy is required to add numbers both digitally and in the analog domain, showing that only by eliminating the separation between processor and information and by reducing voltage dramatically can we ever hope to create processors that operate at the efficiency of biology.
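The energy argument above can be sketched with a back-of-the-envelope calculation. The sketch below is not from the talk itself; all component counts, capacitances, and voltages are illustrative assumptions. It uses only the standard relation that charging a capacitance C to a voltage V costs E = ½CV², so a digital add that toggles many high-voltage wires between processor and RAM dissipates far more than an analog add that merges charge on one short, low-voltage wire:

```python
# Back-of-the-envelope comparison of digital vs. analog addition energy.
# All numbers below (toggle count, capacitance, voltages) are illustrative
# assumptions, not measurements from the talk.

def switching_energy(capacitance_f, voltage_v):
    """Energy in joules to charge a capacitance to a voltage: E = 1/2 * C * V^2."""
    return 0.5 * capacitance_f * voltage_v ** 2

# Assumed digital case: a single add requires toggling on the order of
# 1000 wires/gates (fetch operands from RAM, drive the ALU, write back),
# each modeled as 1 fF switching at a 1 V logic level.
digital_e = 1000 * switching_energy(1e-15, 1.0)

# Assumed analog case: the addends meet as charge on one shared local
# wire (Kirchhoff's current law does the "add"), modeled as a single
# 1 fF node operating at 50 mV.
analog_e = 1 * switching_energy(1e-15, 0.05)

print(f"digital ~ {digital_e:.2e} J per add")
print(f"analog  ~ {analog_e:.2e} J per add")
print(f"ratio   ~ {digital_e / analog_e:.0f}x")
```

Because energy scales with the number of charged nodes times V², shrinking both the wiring (co-locating processor and information) and the operating voltage compounds multiplicatively, which is the core of the efficiency argument.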
The companion page for this series on our website can be found at knowm.org/ahah-computing-in-a-nutshell/.
For more information on AHaH Computing and AHaH Nodes, please visit: knowm.org/ahah-computing/