We’ve often mentioned, or alluded to, the limits of human understanding. Now we’ll tiptoe into the world of neuroscience:¹


         (1)  First, the neuron.

It’s a nerve cell. The main part, containing the nucleus, looks like a fried egg (the “yolk” being the nucleus); from it runs a long tail, like a kite’s tail, called an “axon.” There are also shorter branches called “dendrites.” Electrical messages usually run from the dendrites into the main part of the cell, then down the axon to a muscle or to another neuron.

   (2)  Second, the human brain, which weighs about 3 lb., has about 100,000,000,000 neurons in it.

That’s about 14 neurons for every person now on Earth.  Or think of it this way: over an average lifespan, you have about 33 times as many neurons as you will have heartbeats.
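Both comparisons are easy to sanity-check. Here is a quick back-of-envelope calculation; the world population (~7 billion, circa the 2011 article), the ~70 beats-per-minute heart rate, and the ~79-year lifespan are our assumptions, not figures from the article:

```python
# Back-of-envelope check of the two neuron comparisons.
# Assumed inputs (not from the article): world population ~7 billion (2011),
# average heart rate ~70 beats/min, average lifespan ~79 years.

NEURONS = 100_000_000_000      # ~100 billion neurons in the human brain
POPULATION = 7_000_000_000     # approximate world population in 2011

neurons_per_person = NEURONS / POPULATION
print(f"Neurons per person on Earth: {neurons_per_person:.0f}")   # prints 14

beats_per_year = 70 * 60 * 24 * 365     # heartbeats per year at 70 bpm
lifetime_beats = beats_per_year * 79    # ~2.9 billion heartbeats in a lifetime
print(f"Neurons per heartbeat: {NEURONS / lifetime_beats:.0f}")   # prints 34
```

With these round numbers the second ratio comes out near 34, in the same ballpark as the article’s 33; slightly different lifespan or heart-rate assumptions account for the gap.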







   One more detail to carry away:


    (3)  “In humans, the brain is the hungriest part of our body:

at 2 percent of our body weight, this greedy little tapeworm of an organ wolfs down 20 percent of the calories that we expend at rest. In newborns, it’s an astounding 65 percent.”
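The lopsidedness in that quote can be made concrete with a little arithmetic. The two percentages come from the quote itself; the ~1,600 kcal/day resting metabolic rate is our assumption, used only to translate the percentage into everyday units:

```python
# How lopsided is the brain's energy budget? The 2% and 20% figures are
# from the quote; the ~1,600 kcal/day resting metabolism is our assumption.

weight_share = 0.02   # brain's share of adult body weight
energy_share = 0.20   # brain's share of resting calorie expenditure

# Energy use per unit of weight, relative to the body's average
print(f"Disproportion: {energy_share / weight_share:.0f}x")        # prints 10

resting_kcal_per_day = 1600                     # assumed adult resting rate
brain_kcal = energy_share * resting_kcal_per_day
watts = brain_kcal * 4184 / 86_400              # kcal/day -> joules per second
print(f"Brain's resting draw: {brain_kcal:.0f} kcal/day (~{watts:.0f} W)")
```

In other words, pound for pound the brain burns roughly ten times its share, a continuous draw on the order of a dim light bulb.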


   Now for some observations–very general ones (from this article and elsewhere).

   (1)  There’s a lot of information in the world.

   (2)  The adult human brain has a volume of about 1.5 L (about 1.6 quarts).

   (3)  The brain can hold and manipulate lots of information; in fact, using its neurons, it collects and holds all the facts we have gathered, how we put them together, and how we use them (consciously, subconsciously, or unconsciously) to do things.

   (4)  One of the wonders of modern technology is miniaturization and digital storage of information. Storage devices can contain and manipulate incredible amounts of information. (Think of the contents of a small library in an object the size of a packet of gum.)

   (5)  Now “human intelligence may be close to its evolutionary² limit.” Let’s take a look at more of Scientific American’s summary comments [boldface in the original; coloring and brackets are ours]:


“…Various lines of research suggest that most of the tweaks that could make us smarter would hit limits set by the laws of physics.

   “Brain size, for instance, helps up to a point but carries diminishing returns: brains become energy-hungry and slow. Better ‘wiring’ across the brain also would consume energy and take up a disproportionate amount of space. Making wires thinner would hit thermodynamic limitations similar to those that affect transistors in computer chips: communication would get noisy. Humans, however, might still achieve higher intelligence collectively. And technology, from writing to the Internet, enables us to expand our mind outside the confines of our body.”

   (6)  Of course, collecting and “holding” information is not quite the same thing as “using” it. It’s interesting–but not very productive–to try to picture electrical signals scooting over, under, around, and through those billions of neurons [that Darwin and his followers assume accidentally formed from much simpler stuff] to observe, evaluate, and perform tasks, drawing from the stockpiled knowledge that’s already there.


   ¹ The quoted material and theme of this post are from Douglas Fox’s “The Limits of Intelligence,” Scientific American, Vol. 305, No. 1 (July 2011). The illustration is from the Internet, and the generalizations and commentary are our own.

   ² “Evolution” in this context is, of course, naturalistic Darwinian evolution (without any supernatural influence).