Is the complete connectome enough to model the brain in silico? If not, what else is needed?


You should take the following with a grain of salt, because it is not (yet) supported by experimental evidence; I state it because it is the only possibility I can see given the theoretical evidence.

What is likely needed is the complete RNA sequence content of all the cells of the brain, most importantly the synaptic RNA, because this is where the actual computation is happening. Not the RNA for making proteins, nor the RNA for regulating them, but RNA that is used purely for thinking.

This RNA must be linked to the electrochemical network, so that different sequences produce different pulse patterns, and so that incoming electrochemical pulses are read out and converted back into sequence. The result is that each cell is an RNA computer with gigabytes of RAM, linked by the equivalent of a 3000 baud modem to a few thousand other cells, which read out the sequence of inputs and link up their computations accordingly.
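To get a feel for the scales this picture implies, here is a back-of-envelope sketch using the figures above (1 gigabyte of RNA "RAM" per cell, 3000 baud links, "a few thousand" connections, which I take as 5000 for illustration). These are the text's own numbers, not measurements:

```python
# Illustrative arithmetic for the cell-as-RNA-computer picture.
# All figures come from the model's stated assumptions.

ram_bits_per_cell = 8 * 10**9    # ~1 gigabyte of RNA sequence state, in bits
link_baud = 3000                 # bits/s per cell-to-cell link
links_per_cell = 5000            # "a few thousand" connections (assumed: 5000)

# Aggregate I/O bandwidth of one cell across all of its links:
total_bandwidth = link_baud * links_per_cell        # bits/s

# Time for a cell to ship its entire internal state out over its links:
seconds_to_dump_state = ram_bits_per_cell / total_bandwidth

print(f"aggregate bandwidth per cell: {total_bandwidth} bits/s")
print(f"time to transmit full state: {seconds_to_dump_state:.0f} seconds")
```

The point of the arithmetic is that the internal state dwarfs the communication capacity: a cell would need minutes just to transmit what it holds, so the links can only carry a thin summary of the internal computation.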

This model is new; it is an original idea, and you won't read about it anywhere else.

In order to make this work, you need a protein-RNA complex which is sensitive to action potentials and transcribes neural signals directly into sequence. The model also requires intracellular RNA-RNA computation, but this is required for other reasons anyway.

The resulting mess means that the connections are only sufficient for simulating the communication overhead of the computation; they are missing the bulk of the computation itself. The amount of bulk state is on the order of 1 gigabyte per cell, and there are 300 billion cells, so the total is staggering, far, far beyond any current machine.
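Multiplying out the figures in the paragraph above (1 gigabyte per cell, 300 billion cells) gives the total scale directly:

```python
# Total hypothesized bulk state, using the figures stated in the text.

bytes_per_cell = 10**9          # ~1 gigabyte of RNA state per cell
n_cells = 300 * 10**9           # 300 billion brain cells

total_bytes = bytes_per_cell * n_cells
print(f"total state: {total_bytes:.1e} bytes")   # 3.0e+20 bytes, i.e. 300 exabytes
```

300 exabytes of active state is orders of magnitude beyond the storage of any existing computer system, which is the sense in which the simulation problem is "staggering".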

The reasons to predict this (it is a prediction; the hypothesis is not supported by direct evidence) are that brains are able to store memories, initiate coordinated action potential sequences, and their processing speeds are not consistent with the overhead processing speed of the communication between cells. The cellular-level models limit the brain's active memory to a number of bits equal to the number of cells, giving C. elegans a memory capacity of 300 bits, which is ludicrous. For Drosophila, it's 100,000 bits, still ludicrous. There is no way to explain why it is ludicrous without getting an intuition for what a 300 bit computer can do, so I can't go further, except to urge the reader to experiment with a 300 bit machine until a full understanding of its range of behavior comes (it doesn't take long), to see that it doesn't do anything; it's less than one cell nucleus.
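A quick way to see the "less than one cell nucleus" point is to compare the cell-level capacity bound for C. elegans (one on/off bit per neuron) against the information content of a single RNA molecule. The 1 kb transcript length here is a generic illustrative assumption, not a figure from the text:

```python
# Capacity comparison: cell-level network model vs one RNA molecule.

celegans_neurons = 300
cell_model_bits = celegans_neurons * 1     # one active bit per cell in such models

rna_length_nt = 1000                       # assumed: a typical ~1 kb transcript
bits_per_nt = 2                            # 4-letter alphabet -> log2(4) = 2 bits
one_rna_bits = rna_length_nt * bits_per_nt

print(cell_model_bits, one_rna_bits)       # 300 vs 2000
# A single transcript already out-stores the entire cell-level network model.
assert one_rna_bits > cell_model_bits
```

On this accounting, one modest RNA molecule carries several times the active memory that the cell-level model grants the whole worm, which is what makes the 300-bit bound look absurd.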