

I was always different.

I never had a casual thought in my life.

I pondered mysteries of life and

death when I was a child.

As far back as I remember I wondered about death.

I stared up at the huge

amber lanterns hanging overhead in church at Sunday services and wondered what would happen if one just

dropped on me.

I secretly almost wished it would simply to find out.

It seemed that nobody had the

answer. I went to confirmation class because I thought Sally Meneely was cute; by then other things were

beginning to occupy my mind.

Still, from about the age of nine, it became my personal life challenge.

What was it really? What actually happened? Could I find out before it happened? I reasoned if I started

at ten, I might have it worked out before I died. I had to start early and keep going. I cried myself to sleep

the night before my tenth birthday. I knew it was going to be a long haul.

My father was a Harvard wit who wrote light verse and managed the old family steel company,

handling management while Maurice Roses ran the engineering.

The firm began in 1857 as the stove

works, “McKinney & Mann”. It reincarnated as Albany Architectural Ironworks and won renown for cast

iron storefronts in the 1880s. It assumed its third life as James McKinney & Son when my grandfather

entered the firm. My father was born in 1891 and I arrived in his 54th year, the son of the son of the son of

McKinney & Son and his wife, a 28-year-old ex-prep school girl who dropped out of Swarthmore to attend

R.A.D.A. in prewar London. He wrote a Hasty Pudding show, was Albany’s major culture maven, never

made much money, and died at 77 of Hodgkin’s disease. They named the library at the Albany Institute of History and Art after him. Novels, including The Friends of Eddie Coyle, were researched at the McKinney Library. Dad got obits and editorials in the

Times Union for a week or so.

He was a hard act to follow, so I followed him to Harvard but settled in

Cambridge. As a child, every night when he came home from the “plant” (I once thought my father worked

with vegetables and not at an office) where he “made money” (from long strips of copper with a penny die-

cut stamp, I assumed), he’d answer any three questions we had. Anything at all.

“Where does paint get its color?”

“From pigments in a carrier base.”

I imagined colored pigs frolicking in pens amid aircraft

carriers at their naval base. He always had the answers.

Every spring the carnival came to town. James E. Strates Shows would arrive and pitch its tents in a

huge field at the bottom of the Menands hill. They set up a midway, erected a fun house, the side show, the

thrill rides, the coin tosses, cotton candy stands and rides that towered over our heads, each tethered to a

snorting diesel generator with some wild kid at the controls.

It was heaven to a ten-year-old with ten

dollars to spend. It was the yogi, though, that I will never forget. With a blowtorch he heated iron bars red-hot and

stepped on them. He blowtorched his own mustache and nothing singed. He stood on red-hot swords. The

whites of his eyes were yellow. Too much heat, I figured. My mother stayed after the show. She wanted to

know just how he did it. The yogi stepped forward. You could see he was weary. No, there was no trick; it

was the result of a great deal of training. Here, he was just being paid to do it. “Of course,”

he said, “it

will do me no good, the money. I have used my gifts for financial gain; this should never be done. There is

no hope for me.”

I looked into his tired eyes.

They were like black marbles, shiny and lifeless. A sudden chill gripped my mind.

This man was telling me a truth.

A gift like this was not bestowed for profit. Money made in this way is worse than no money at all.

I had met my first Eastern adept, and we communicated

just fine. He was working in the sideshow and faithful to a system which could both empower and undo. I

was still in cotton candy land, but I knew he knew something I dearly wanted to know.

If the mind could



protect this man from red-hot iron, how deep could I go and not get burned for sticking my nose in a little

too far? Were these secrets locked beyond me? Worse, would unlocking them leave me, like the yogi at the

sideshow, regretting a path of knowledge, condemned to travel from place to place like a carnival attraction

playing to eager audiences clapping without a clue to his utter isolation?

Twenty-five years and many lifetimes later the chill came back to me as the theoretical basis for

conscious chronology finally clicked together one day.

If what seemed to be the case were in fact true it

explained the perception of time.

A tool such as that would generate some real insights into the major

metaphysical rules underlying all the world religions.

It all made sense but the conclusions were nearly

frightening. I had stumbled onto some real knowledge. I knew for sure now what happened in death, and it

completely rearranged my understanding of life.

Was this a gift, or a curse? I wished my father were around to ask, but he had died when I was only 22.

My mother lived another twenty-six years. She was there when the firm went bankrupt and was sold

to Mark Larner for the price of a parking lot. She taught natural childbirth in the forties, natural foods in

the fifties, natural religion in the sixties; naturally always ahead of her time enough to be a natural amateur

savant without the patience to stay with anything long enough to win professional respect.

She was self-

taught in medical matters, with several thousand dollars worth of medical textbooks filled with underlines,

highlights and margin notes. Her last preoccupation was her eventual stroke, a subject which kept her both

stressed and stressful. At 75, she agreed to try some powerful meditative techniques I had learned directly

from the Dalai Lama, which included focused mental imagery. It worked, she said, and claimed her trusty

Holter blood pressure monitor even recorded it. A year later, she was gone.

After her first severe stroke, I read her CAT scans and was appalled at the devastation. Fully three-

fifths of her right hemisphere was gone for good.

The attending neurologist said to expect the worst: flattened emotional affect and a foggy mind at best. The best thing, he said, would be another stroke.

She was still having difficulty opening her eyes. One side of her body was limp as a rag. She was speaking sometimes in

French, memories of her vacations while a young woman in London, but she was coming back by the third

day. I bent over her when she seemed lucid and said “I checked your scans, Mom. You’ve lost a big chunk

in the middle of the right hemisphere but your prefrontal lobes are fine and the visual cortex is still there.”

With her eyes still closed, she whispered feebly “Middle cerebral artery.”

She was right, of course. Then she asked, “Should I do my vipassana now?”

I was floored.

I had

taught her to recall an image from memory and study it in the mind with the eyes closed. Even with such

destruction, the teaching was intact and so was her mind. I gave her hand a squeeze.

“Wonderful, Mom,

it’s great exercise for the visual cortex.

That’s just what you need now.”

Looking ahead, her eyes still

closed, she said gravely, “What I need now are prayers.” She had read earlier drafts of this book and found

what seemed to be simple scientific answers for a number of very basic human questions.

She had taken

the original chapter on death to the dying and had told me of tears, sometimes of relief, when someone

realized that the end, when it came, was eternal comfort no matter what. My mother was religious, but for

her the theories made sense and she shared them with those whom she knew needed some faith without the religion.

Now, in the anticipation of her own death, my mother was slowly returning to the faith she had

been born into.

She did not die of another stroke. She died a month later from bacterial and fungal infections which

had been diagnosed but not adequately treated.

It was as gentle a death as one could imagine as the

pathogens slowly turned her brain to Cool Whip™ one cc. at a time. At the very end, the last day I knew

she was there, she looked vacantly into my eyes. I looked deeply into hers. There she was, like a person at

the very bottom of a swimming pool. She was looking up, letting me know she was there, deep inside, but

very far away. It carried another message. “You were right; I’m in another place.”

Late that evening, I

could feel her soul sighing into the night with the sounds of the late night traffic traversing the long bridge



in the distance. The next day she was flatlined. Her pacemaker had Energizer bunny batteries, however, so

she stuck around for curtain calls.

She was an actress, and she had the whole stage to herself.

Like my

father, she lived a week on heart alone and died, like him, less than a month before her 77th birthday.

During that week she showed up in four different people’s dreams.

“She said she was satisfied with her life, and generally pleased with the way her sons were getting along,” said Prabha, an Indian neurologist who had become a close friend and confidant during her last four years.

“She said that there was one small disappointment, however; she was sorry that your book

wasn’t published.” Even at her last level of mental attachment to this world, she’d known I was trying to

cheer her up but she hadn’t let on, an actress to the end.

The next month, Mark Larner finally gave up

trying to stamp out pennies at the steel company he had bought for nickels and after 135 years, the doors at

James McKinney & Son closed forever. It was over.

The book has been published now or you wouldn’t be reading it. Like my father, I wanted to answer

a question for all the people and by the time my mother died, the answers were in hand.

At the end, she

was comforted in her simple Christian faith and went, as Judy said, “to the arms of the Savior she knew

and loved.” I may well too; those are my earliest memories at Sunday school, long before I was interested

in girls or metaphysics. I’m not going to try to modify them.

I know where I’m going, and whether it’s

Buddha’s endless lifetimes or Jesus’ life everlasting, it’s not a bad trip at all.

My big question was

answered as far as I was concerned, but the structure which had evolved to solve the problem had taken on

a life of its own. It was almost a software tool for the mind, a form of mental utility. It turns out this is what people call a philosophy, so I called up an expert to see what we had.

“Is it possible to describe a comprehensive metaphysical perspective on life that can answer the

major questions in only six pages?”

It was 1981, and I was still nervous about watching it all fall into

place. “Sure,” he said, “but you might need six hundred to explain just how you got there.” It turns out if



we can agree to accept the concept of our personal consciousness, our mind, as a virtual reality, it leads to

an entire systematic philosophy based on neuroscience.

That begins to explain the two hundred and fifty

pages and why The Last 10 Seconds of Eternity is a lot more than a probable explanation of what happens

when we die.

The last systematic philosophy that really influenced a people in the West was Thomas

Aquinas’ Thomism. On the other hand, almost all Asian philosophies are essentially systematic, but so are neuroscience and computer science.

We may be dealing here, then, with a time when a synthesis true to

both cultures is finally possible. In other words, a mind-based theology that works for Western Christians,

Muslims, and Jews could do double duty as a “neuro-dharma” in the East.

The final manuscript was nearly complete when I spotted psychologist B.F. Skinner ambling through

Harvard Square one day. I knew he was not well. It might be my last chance to ask him a good question

so I caught up with him.

“Dr. Skinner,” I started, “I was also an English major who got caught up in

brain science. You once considered writing as a career. What effect did it have on your later work?” He smiled, and there was a real twinkle in his eye. “I have lived a long and predominantly rewarding life,” he

said, his words flowing in precise intonation, “and I have always taken it for granted that a large measure

of my success was simply due to the fact that I could write a great deal better than most of my colleagues.”

I shared a big grin with him.

If you had a gift, the art was as important as the science.

He died a few

months later, so in honor of the craft of writing I wrote the whole thing over again just to polish it up. If I’d spent half my lifetime answering one question, there’s no reason not to be elegant about it and put on the best show possible. This is an easy-to-read deep book; it took every bit of my writing skill and nothing will

ever be that hard to do, or so rewarding to see completed. Just drop it into your mind and watch it unfold.

The Last 10 Seconds of Eternity will make you think about things you never thought about before in

ways you never thought you would think about them.

That is my first promise.

The second is that if

someone gets the idea that this book unveils nothing that wasn’t generally stated by the philosopher



Nagarjuna and further developed by Chandrakirti, Shantideva, and T’song Khapa, they’re right.

There is

nothing new here; all real truths are ancient.

Still, we always try to improve the explanations so that we

can believe a little better whenever it’s important to have a reason to believe.

In these times, it’s more

important than ever.

The night I met the yogi, I rode the Ferris wheel up into the night, and at the top, it stopped for a

moment to let on more riders.

We were suspended between heaven and earth, slowly swaying in a cool

evening breeze. If we looked down we could see the entire midway, sparkling and bustling, the games, the

tents, the support trucks and supply vans; and behind them the fields, the highway beyond, the Menands

hill, and the starry sky reaching over our heads.

It was very big and vast and then, suddenly, the diesel

gives a snort, the ride goes forward, and we’re back to cotton candy land again.

This is not a long book;

but for some it will provide a new perspective, a Ferris wheel for the mind.

At least that is my hope; and

then back to the lights, the action, and all the games of life.


The First 10 Seconds of Eternity

From Heaven to Earth

The ovum is pierced. The genetic traditions of all our ancestors pour into each other. Dancing chains

of DNA, the jeweled necklaces of life, embrace and entwine. Personal characteristics from both sides

extend greetings, meeting their destined partners. The eternal dance begins. Twining, twirling, they are

weaving into one. As they fuse, the past vanishes. Now we are. We know nothing. Yet we know everything

because we are all we know. We are the one and only, the only one, the one-cell dream of a future self.

We’ve arrived and we don’t even know it. It all begins here and we are very new.

In a time of timelessness the fertile cell divides and divides again. Patches of genes awake with

specific organizing powers. The entire composition is recorded in every cell, the plans as well. Here the

feet, here the eyes, and here the brain. In the eternal darkness our home is forming. We are forming, and

nothing is left to chance. We’ve been unalterably and completely ourselves and only ourselves from that

moment of creation. We’re woven into every strand. From this point on we simply locate our cells, find a

place to settle in, learn a specialty, and multiply. And where is the mind? Will it reside in the toes? Those



who lose their toes rarely lose their minds. Was a mindful spirit nestled in our budding heart? Many hearts

have been transplanted without any sharing of the soul. If our mind is perceived in the brain during our life

on earth it must exist in a very limited form for a while. The eyes are not complete, but there is nothing to

see. Nor is there a place for memory. These capabilities will all come much later. Now is the time of quiet.


We turn in an endless universe while currents and connections less thoughtful than thought and

many times more profound are becoming the exquisite networks that will help us perceive our life and

introduce us to the world. We have nearly nine months to go. Nine months to create, bit by bit, the

biological basis for a consciousness that will one day know our spirit, our mind, and our soul. Like all truly

beautiful expressions of nature, it takes time to come together, time to bring us alive, and time to come

apart again. It becomes over time, takes us into time, and it will go, ultimately, only after we have gone.

Our first being is oneness. There is no time to compare with this because without another

there is no comparison. The time of oneness is always forever. The cell divides and we start the

time of two. And then comes the time of four, and the time of eight. Soon, there is the beginning

of a neural ridge. As each new living neuron comes into being, the growing brain becomes more

complex. Three months after conception our brain is adding 250,000 new cells a minute. At birth

it contains between ten billion and ten trillion of the most complex cells in our body. It is more

elegantly specialized and balanced than anything in the universe known to man, for it must

perceive our universe and balance us within it. We remember none of it. We can’t remember when

we were all female. It is not until the third month that the male fetus produces hormones that

alter his body and brain and make him fully male. Males can be feminized and females

masculinized by abnormalities in a mother’s hormones during this crucial time in development.




Severe stress during pregnancy has been linked to this problem.

A mother requires emotional as well as

physical well-being to bear healthy children. She needs stability in the world around her. In another

universe, within her, a child is moving steadily towards a meeting with a world it could never imagine, a

world of time and space, the world inhabited by our human race.
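Out of curiosity, the growth figures a few paragraphs back can be sanity-checked with some quick arithmetic of my own (the 250,000-cells-per-minute rate is the text’s; the six-month window is my assumption):

```python
# Back-of-envelope check (my arithmetic, not the book's): 250,000 new cells
# a minute, sustained from roughly month three until birth.
MINUTES_PER_DAY = 24 * 60
growth_days = 6 * 30                 # months three through nine, roughly
cells = 250_000 * growth_days * MINUTES_PER_DAY
print(f"{cells:.2e}")                # prints 6.48e+10
```

That lands at about 65 billion, comfortably inside the ten-billion-to-ten-trillion range the text quotes for the newborn brain.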

Controversy continues as to when we are officially human. Some use the moment of conception. Others

wait until life can be sustained outside the mother’s womb. Still, all would agree no newborn is fully

developed. The passage down the birth canal is not the final event. It is simply a physical interruption in

our maturing process, transferring us outside our mother as soon as we could survive. Survival wasn’t

something we’d ever thought about up to that point. We never expect to be born. We all naturally assume

we’ll remain where we are forever. In fact at that point we can’t really expect anything at all.

This is our beginning, and it is also our ending. We all start in no time, no place, all time and all space. Our

name was simply “I am”. Soon we will have to leave this eternal place, and it will be a lifetime before we

return. Do we remember our birth? Of course we do. Every cultural myth of the creation of mankind is a

broad interpretation of birth from the viewpoint of an infant being born.

The first Incas emerged into the

sun from a dark cave. The first Navajos arose through a hollow reed to a “glittering place”. We are about

to become again. Now we are about to take human form and be transformed into an infant in its mother’s

arms. A blessed event for her, but a bewildering one for us.

The Creation Story

There is a dull redness during daylight. The fetus’s eyes open by the sixth month. Stretching and turning in

the darkness, we grow more aware. There are distant sounds, muffled murmurings of God, closer and

clearer. The fetus can distinguish words by the eighth month. In the eternal rhythm of our only universe, the



heartbeat of our mother fills our world with the pulse of life.

The rhythm was there before us, existing

before we were, beyond the beginnings of time. During our life we will continually seek, be calmed by, and

even sway to this same rhythm if we feel stressed or anxious. We will roll back to the beat of our very beginning, the wordless prayer we all know.

This day we awaken to changes. In our eternal darkness a new spirit moves over the waters.

Suddenly the world is jarred and jolted. There are great movements, voices becoming clearer and louder.

The creator is about to jump-start the world for us. The powerful contractions begin. In the beginning the

obstetrician said “Turn on the overhead light,” and there was light. They saw that you were good, and it

didn’t take seven days. Still it was such a chaotic experience it may have seemed that way as forever

suddenly ended. What a demotion in scale! A moment before we were the entire universe, the be-all, he-all,

she-all and end-all. Now we’re reborn helpless as an infant, alone among others. Our minds weren’t started

at birth but we must have been startled. Overdosed on natural endorphins, we were shoved down a dark

tunnel into a blinding light. It had been forever in stage one and suddenly we’re gasping and blinking and

kicking our way into our next stage. It’s center stage. They turn on the lights; the crowd cheers. It’s a

whole new ball game. Who asked for this? We call foul. We cry. We yell. Literally, figuratively,

metaphysically, and actually we are really put out. Newborns dream a lot about the old days. They spend

nearly half their time in REM sleep, the dream state, even with their eyes open. They just can’t believe it. It

was supposed to be forever and ever; and now this utter confusion? What happened?

We keep waking to a new reality and we cry a lot about it. You can’t remember, but neither can

anyone else. We talked in baby talk and we thought in baby thought. We can’t recall anything specifically

because a baby brain can’t recall specifically. Lower creatures, practically up to the reptiles, arrive ready-made. Just hatch them and they’re off and running. Aside from their size they are as smart as they’ll ever

be from their first days on earth. Here they come and off they go. More complex brains take time to fine



tune and we mature as our parts mature. We come onto this earth both unfinished and unorganized. We

can’t even eat solid food for a long time. No part of us is fully detailed or final. Every part is infantile.

Baby toes, baby nose, baby fingers and baby brain.

They were all working or we would not have been

born alive, but there is a long way between first appearance and final maturity.

Every part of us had years to go. Our brain, also, was far from being organized. It takes time to

become structured, articulate, differentiated and capable of consciousness as we know it. It was a baby’s

brain, as capable of reflective thought as baby legs are ready for running. It still had to develop and grow

further, all the time perceiving and understanding as best it could with what little it had. Given back our

baby legs, we would stumble and fall. We are not ready for gravity yet. With a baby brain, consciousness

is equally incapable of the sure and distinctive method of thought characterizing the adult mind. We are

never going to know how it was because we’re only mentally infantile once, and we hadn’t the sense to

appreciate it. Youth is wasted on the young, they say, but we may never again be as wise as when we were

living in pure infantile awareness. Not that we had any alternative, of course.

We are born with nearly all our neurons, our brain cells. These cells rarely reproduce. For reasons

that will become clear, it is impractical to have to deal with the constant appearance of blank, immature or

disconnected cells in the midst of things. Instead, there is enormous redundancy. With trillions of cells we

can afford to lose a couple of thousand a day all our lives. In fact we do, but we never run out during our

life. Between birth and the age of about three and a half, for each of us a little differently, consciousness is constantly on the run as our brain hooks itself up and trims itself down to size for a lifetime career in data processing.

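The redundancy claim is easy to sanity-check with rough arithmetic of my own (the couple-of-thousand-a-day loss figure is the text’s; the hundred-year lifespan and the low-end cell count are my assumptions):

```python
# Illustrative arithmetic (my own numbers, not the author's): losing a couple
# of thousand neurons a day barely dents a brain with tens of billions of cells.
lost_per_day = 2_000
days_in_century = 100 * 365          # a full hundred-year life
lifetime_loss = lost_per_day * days_in_century   # 73,000,000 cells
neurons_at_birth = 10_000_000_000    # low end of the range quoted earlier
fraction = lifetime_loss / neurons_at_birth
print(f"{fraction:.2%}")             # prints 0.73%
```

Even at the conservative end of the cell count, a century of daily losses consumes well under one percent of the stock.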
Each neuron communicates with others by sending electrical pulses down its main nerve fiber, the axon, and receiving them through numerous hair-like branching fibers, the dendrites. A neuron fully grown with all its dendrites is fully “arborized,” from the Latin word arbor, tree. Under the microscope it looks



exactly like a tree without leaves, dividing and sub-dividing from major branches to the tiniest twigs. In this

way each neuron can be in contact with thousands of others. With nearly all these cells in place at birth,

much of our next three years is spent in the gradual development of the axons and dendrites. Our chips are

in place but they aren’t wired up yet. We have to make our connections before we can make our computations.

At about a year and a half, our consciousness undergoes a very significant change. Until then the

brain has been using a lot of energy to push impulses down those innumerable pathways. Now glial cells

go to work. From the Latin word for “glue”, these cells were once thought to provide packing, “glue”, for

the neurons. They do far more. Specialized glia called oligodendrocytes wrap each axon in the brain in a fatty layer of insulation called myelin. This allows electrical impulses to race along as much as ten times faster using far less energy. The brain quickly adapts to the upgrade. Schwann cells do the same for the nerves of the body, preparing them for the complex micro-movements that will allow baby to take her first steps. It’s during this

time of myelinization, as the process is called, when malnutrition can cause mental retardation. The infant

brain is still very vulnerable. From our own perspective, however, things must have really done a flip as we

retro-fitted our mental operating system with the new high-speed networks. We completely alter the pace

and the perspective of perception and we take it for granted. In other words, no infant ever remarked on the

transition of reality from what we might call our “universal infant Jungian mythology” state to the “ancient

real memory” state. We are beginning to set the stage for adult reflective consciousness but we still can’t

discuss it with anyone because until the speech cortex is ready, we can’t talk.
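The tenfold speed-up the passage describes can be made concrete with ballpark figures of my own (the speeds and distance here are illustrative round numbers, not measured values):

```python
# Illustrative only: myelin speeds conduction roughly tenfold, so an impulse
# traveling about a meter -- brain to foot, say -- arrives in a tenth of the time.
distance_m = 1.0        # brain-to-foot pathway, very roughly
slow_speed = 1.0        # m/s, unmyelinated fiber (round illustrative number)
fast_speed = 10.0       # m/s, after the ~10x myelin upgrade
t_slow = distance_m / slow_speed     # 1.0 second
t_fast = distance_m / fast_speed     # 0.1 second
print(t_slow, t_fast)                # prints 1.0 0.1
```

Real conduction speeds vary widely by fiber type and are often much higher; the point is only the relative gain from insulation.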

The growth of tiny dendrites is abundant during this period. This creates yet another effect on

perception. No matter how memory is recalled, it must be stored in some way for it to be accurately

retrieved. Complex memories require a large storage space or enough small storage areas to hold the

necessary detail. Luckily, the complex arborization of human neurons makes this possible. In its limitless



interconnections the brain never runs out of complexity. At maturity, with trillions of cells hooked up to

thousands of others and each capable of a nearly infinite number of energy levels, there is more than

enough. But there are other tricks that the growing mind plays while we are still infants. For years after we

are born our neurons grow more complex. Dendrites continue to branch and grow, establishing their final

networks and settling in for long years of electrochemical exercises. The brain reaches its greatest internal

complexity at about the age of three and a half years. Then dendrites which are used less die off, leaving

our basic neural staging. This is our unique lens of consciousness. It will eventually enable us to perceive

our hopes, our thoughts, and our world for the rest of our lives. Infant activities strengthen and nurture

growing neural networks as we reinforce and repeat.

These tiny basic differences, through time and

repetition, will eventually become the foundation of specific personality and our entire underlying image of

the world.

By the time we are three-and-a-half, almost all our major structural upgrades are complete. Final

maturation progresses slowly until adolescence but the rapid growth phase is over and our brain structure

is stabilized. Memory is no longer distorted or transformed by physical growth.

As the brain’s prefrontal

cortex comes on line, chronological time finally becomes possible. “Then” becomes distinct from “now” as

time begins to register. Children can now consciously differentiate. They know they are little boys and little

girls. In Tibet they traditionally select young lamas at about this age. It is no coincidence. The fresh mind is

ready for training as we begin to learn from a clear memory of day-to-day living. We remember ourselves

in our past now, and see ourselves in a future. We become who we are.

All children, in all lands and in all families, gradually become self-conscious. Socialization begins.

We learn we are not center stage but one among others as we come into contact with the world around us.

We become more aware of ourselves and every month more out of touch with the eternal world that was

formerly ours, now so long ago. Before we came into context, we had been incomparable. For so long we



had been the total universe. Birth itself was just a major incident.

For three years the world turns in

sympathy with the churning activity of our budding baby brain as we weave our way to selfhood. We enter

this consciousness not all at once, but by degrees. This could account for the mystery we sense in our

earliest beginnings. In his poem Intimations of Immortality from Recollections of Early Childhood,

William Wordsworth wrote along similar lines over a century ago:

Our birth is but a sleep and a forgetting:

The Soul that rises with us, our life’s Star,

Hath had elsewhere its setting,

And cometh from afar,

Not in entire forgetfulness,

And not in utter nakedness,

But trailing clouds of glory do we come,

From God, who is our home,

Heaven lies about us in our infancy!

Those first affections,

Those shadowy recollections,

Which, be they what they may,

Are yet the fountain-light of all our day,

Are yet a master-light of all our seeing.




John Updike wrote in a 1991 essay, “The poet puts forward a considerably developed metaphysical explanation for the incomparable vividness and mysterious power of our first impressions.”

There is a bit of the poet in each of us and it has its beginnings in the fantastic never-ending world we

found ourselves in during our first three years of life. It was a different world, but our guardians were there

to offer security. The first word for God must have been “mama” and the first man to play the Heavenly

Father in our life was our own “abba”, “daddy” in the Aramaic of Jesus. It was our only world, and it was

only there for us. We spent many forevers playing Adam or Eve.

Heavenly Days: From the Garden to Our Back Yard

Although this information about the maturation of the brain has been available for some time there

has been little discussion about how a constantly changing mental environment is experienced by a growing

child. It seems clear that if the brain is growing more complex every day, so is our thinking. We are

all familiar with the concept of infant learning. Still, we cannot hope to recall the experience of thinking

with a brain which changed from month to month for three years following our birth. It’s a long way from

the simple mentality of an infantile brain to a fully developed adult consciousness that uses abstract

concepts, reads books, and understands words in sequence. If our brain needs four years just to get ready,

from conception to brain maturity, it provokes speculation as to the nature of our earlier mental states.

Three aspects of infant consciousness are typical of this rapid growth stage. First, because early brain structure is simpler, our earliest memories are of a simpler and more universal nature. Each day we add

connections and each day things become a little more specific and sophisticated. As toddlers we experience

an evolving, nearly improvisational consciousness as our awareness upgrades day by day. It’s like

powering up a computer with the most basic operating system possible, then adding new chips every week while at the same time revising and improving the system’s architecture. The programming language would

have to evolve to match the growing complexity of the circuitry. A good analogy is the language we speak.

No matter where we live, we know our native language has its roots in earlier tongues. Ultimately this all traces back to the original human languages. An American who spoke some German and studied Latin in college might guess the origins of half his English vocabulary but would be completely lost in Proto-Indo-European. Likewise, our earliest personal memories are hidden in simpler neural patterns, faded and

overgrown like cracks and colors in the Rosetta Stone. We can never translate them, but we all know they’re there.

Second, as additional dendrites grow out of the same cells for years, early memories will be

generalized even further. No matter how memory is stored, this rule still applies. If memory is a pattern of

electrochemical values, it changes as the physical structure holding it changes. If it is subtle currents in

chaotic flow patterns, the entire flow changes a tiny bit whenever any part of the brain changes. With all

those changes, original memories are never coming back. We all sat in chairs much taller than we are, at a

dining table the size of a garage. Most of these distorted visions are scattered into obscurity by later brain

growth, like ancient Asian temples crumbling under the vines that displace the stones and

topple their walls.

Our earliest past is past recollection but the ruins still remain to haunt our present

progress and our future dreams.

Third, it stands to reason that recently evolved modifications to our brain structure would be the last to

mature. They are based on earlier developments and appear as later improvements. The recently evolved

forebrain and prefrontal cortex mature quite late. They are associated with aspects of awareness

unavailable to young children until these advanced structures are ready. The word cortex is from the Latin

for tree bark, and refers to the convoluted and fissured surface of the brain. The very last parts of our brain

to log on are these recently evolved prefrontal structures, which retain some of their flexibility all the way



to adult physical maturity.

They’re barely operational before we are three, which is another reason we

can’t seem to locate early memories.

Our memory locator is one of those later applications.

It only runs

after the memory itself is operational, and that doesn’t happen for a couple of years at least.

The way our













consciousness we employ for common tasks. In 1991, neuroscientist Larry R. Squire, working with Dr. Marcus Raichle at Washington University in St. Louis, used a positron emission tomography (PET)

scanner to determine the order in which brain structures are used during recall. Students were asked to

match word fragments with a list of words they had been shown and asked to memorize. The subjects had

to use not only short-term recollection but also the ability to match word stems with likely candidates in

memory. A primitive brain structure, the hippocampus (from the Greek for “sea horse” because of its shape),

was involved with immediate recall. However, as the brain started serious word matching the more

sophisticated visual cortex lit up as if the subjects were visually scanning a list of words. Finally, when the

students started searching their deeper memory a “hot spot” appeared in the prefrontal cortex. This recently evolved structure seemed to be monitoring, or even directing, a detailed search through the entire

file of verbal memory. The hippocampus is ancient, the visual cortex more recently evolved, and our

prefrontal cortex has been doing sophisticated memory searching for less than a hundred thousand years.

From instant reaction to reflective thought we activate increasingly complex levels of conscious recall, each

level represented by a more recently evolved addition to our basic brain structures. New research with

magnetic resonance imaging (MRI) has confirmed and expanded on Squire’s work.

This leads to some provocative suggestions. Since human memory lacks temporal organization until

a child’s prefrontal cortex is mature, we can’t develop a sense of time until fairly late in our mental

growing-up process. Our earlier, more generalized perception blurs the distinctions between one day and

another while recall without chronology would eliminate any planning. Months could last for years; years

could be centuries. There were endless summers; eternal meditations on clouds, simple comforts and

anxious scenarios. There is no sense of time in a dream either, when the prefrontal cortex is asleep. As infants we lived day to day in a dreamtime where it is always present tense. Meanwhile, due to simple brain

maturation, memories of earlier images and experiences became more generalized every day. Imagine the

difficulty of forming any consistent images of a world remembered so differently from month to month

during years that felt like endless lifetimes. We have all the time in the world before we develop a sense of time. Not until certain advanced brain structures are nearly mature can daily events be precisely recalled or

even kept as a reference. We were the center of it all as long as our growing brain was flexible. We were

kissed because we were so lovable, not because mother just won twenty dollars in the lottery. What did we

know of lotteries? We were spanked because we were evil. What did we understand of family politics or

pre-menstrual syndrome? We were responsible for it all, since we were the main event in the only world we

knew since birth. Before then was eternity. Now we were here, in this place where things kept changing.

It was forever once, in such endless peace. Suddenly we were ejected and met the great powerful

gods and demons who alternately blessed us to dry-diaper-heaven or condemned us to centuries in too-hot-

bath hell. Sometimes it seemed like forever again, alone in the desolation of a dark, lonely room only to be

wakened and hugged back to paradise in a mother’s arms. All babies feel the same way. All over the world, the details are merely cultural. Infantile reality works identically in every little infant mind. We were all little

angels, sent down to earth. We all wandered in the fabled garden, naked and unafraid. Once upon a time

God really did speak to each of us, thundering from on high. Probably about six feet high, but who’s to

know for certain when we’re standing there at one and a half with a brain only half-way through hookup,

innocent of good or evil. Just because we fed the VCR a slice of pizza? Its mouth was open, right? But

finally the images don’t change and the sequencing becomes clear. Now we can remember clearly. We start seeing ourselves in our minds, in a past, and wonder about tomorrow. We become reflective and begin to find our place in the scheme of things. Don’t stick the vacuum cleaner hose onto the garden hose. It works, but the last time I did that Her Greatness Mom was not pleased. I got spanked. Better bring dandelions from the lawn. That gets cookies and hugs.

As our brains mature into memory and clear reflective thought, we begin to pick up and retain both

personal and cultural detail. It happens over a nearly endless time. The gods descend from heaven to be our

mothers and fathers, great saints and demons take off their halos and horns and become older brothers and

sisters, aunts and uncles. Bears and monsters become dogs and dump trucks as we graduate from the

collective unconscious of infancy, passing through a place of fable and mythology we can barely recall in

our deep and personal past. During three years of worldly time we are weaned from the world of our

oneness and rewoven into the collective fabric of our family and culture.

With the arrival of our mental

maturity we finally come into this world. The tree of our knowledge is now fully arborized and the mind is

ripe. We begin to notice the many differences between here and there, me and he and she, good and bad. As

we bloom into conscious chronological thought we are severed from eternity for the rest of our life. We are

no longer all and forever. We are quickly becoming one more lost soul in the here and now. Still, even as

we all come to grips with the grip of time every single one of us distantly remembers, in some general and

diffused manner, those days when the gods spoke. We remember the love they gave us, the same love that

we carry at the very base of our knowledge of this world. It was the earliest language we knew, the earliest

source code of our soul and all our sensibilities.

Our earliest memories are of our parents and their natural love. Babies are treasured everywhere. No

culture in the world condones cruelty to infants. The one thing we discovered in this awful world that made

the loss of eternity bearable was the love we found there. It is the only ration we can take with us when we

leave the garden because it is so simple. It becomes the compass we always use to find our way back again.

We know we must find our way back one day, back to our old eternal home. We can’t forget it just because

we are discovering mortality. But we do. We all forget our first eternity. We nearly forget the love as well.

But somehow we believe it will all come back some day. Back when days were months and months were

years lie the answers to why both Jewish patriarchs and Buddhist demigods, “devas”, had such

extraordinary lifespans. When we were very small there really were giants around. We find them in Genesis

and all creation stories. The years before reflective understanding are different because we experience them

so differently. All mythologies start with a golden age; or at least a time when the gods were making sure

everything was working right. It is to this earthly plane we descend simply by growing up. Heaven was our

infantile perception of our own infancy.

It is our common inheritance, shared by every human on this

planet. We were all there once, and we will all be there again.

If we try to think back to our earliest memories, we can almost scent the breeze of timelessness

beckoning over the dark threshold. This is the true time warp, the undertow of trying to remember thoughts

from another era. These are times so deep and so vaguely comprehended that they are more like ancient

fossils trapped in the strata of our past. We can hardly remember how long it was from age three back to

age two. From two back to one is even longer. There is plenty of time for any number of “previous lives” in

the collective and universal infantile mind. There is more time on the other side of birth than we will ever

remember. There is no time so endless, or so deep. The haunting memories of those earlier times are still

there, scattered and generalized through our waking perceptions but still alive in our dreams and our


Only if the mind simplifies can we ever re-experience that other universe, always there within us. If

the maturation of the human brain forces us to forget that timeless place in order to deal with present time

and space, no matter. We will rediscover it again at the right time. Whenever something makes our mind

simple again. It happens every time we are taken to the limits of our perception, those times when time



stands still. In sudden terror and in ecstasy the overburdened brain slips time for a moment. Instantly we

know things that we cannot express in words or even think about. It happens every time we undergo an

experience so powerful it blankets consciousness, forcing us mentally into a momentary timelessness. It

happens, but only momentarily. It keeps us aware that there is that place beyond human perception. It will happen with eternal finality during death, the only experience in this lifetime that can free us from the grip of time, in plenty of time to make it back to heaven forever, just before we die.


Religion and Mind Science

Old Questions, New Paradigms

“To study metaphysics as they have been studied appears to me to be like puzzling at astronomy without

mechanics. We must bring some stable foundation to argue from.”

— Charles Darwin

Our earliest mind knew forever and our final mind will know it again just before death. Our early

experience with the infinite may be a heavenly start to our beginnings and a preview of our endings, but

what about now? Let’s close the doors on our befores and afters for a moment, and reflect on what this

says about the present moment. This particular present moment, if we think about it, has been going on for

quite some time. Actually, it’s been going on forever. In fact, there is no time so timeless as the present

moment. Despite this obvious condition, our attention is often focused forward or backward. The past can’t

repeat and nobody knows the future until it happens but we’re always thinking about them. Living in the

present while visualizing both past and future is what makes human consciousness such a unique way of

thinking. We may be the only ones who can do it. It requires a huge amount of memory




and the ability to sequence patterns. In doing so we incidentally create the sense of time. Over time

most of us will find a religion. In much less time, we will discover it is probably because of our unique

human sense of time that we have any religions at all.

Our ongoing reflections on the past and the future give birth to some very deep questions.

Unfortunately, answers are available only in times that aren’t available. They are the same questions

science always has trouble with. We know them by heart. “Where did I come from?” “Why am I here?” “Where am I going?” In the broader sense they become “Where did it all come from?” “What is it all about?” “Where does it all go from here?” In India, the great saint Shankaracharya summed up the query in five terse questions: Kastwam? Ko ham? Kutah ayatah? Ka me janani? Ko me tatah? “Who are you? Who am I? Where did I come from and how? Who was my eternal mother? Who was my eternal father?”

Coming up with answers requires a comprehensive human raison d’être, some definable wherefrom and whereto of life.

This is what prophets and philosophers do for a living. If they ever agreed, a world

religion would have appeared ages ago. It hasn’t happened yet.

Local answers always mirror the

complexity, art, and wisdom of local culture. Each is a local response to some universal human need to

come to terms with these annoying mental puzzles. Our own conclusions, if we have arrived at them, form

the foundations of our personal metaphysics (Greek, “beyond physics”), our individual sense of our purpose

and our reality. We call them our religious or personal philosophical beliefs.

All holy books, including the Christian Bible, Jewish Torah, Muslim Koran, Hindu Vedas, Sikh

Granth Sahib, Taoist Tao Te Ching, and the Buddhist Sutras go to great lengths providing mutually

exclusive answers to these three simple questions. Their answers serve as the basic philosophical dividers

separating Shiite Muslims from Hasidic Jews, Billy Graham from Thich Nhat Hanh, and the Pope from

the Dalai Lama.


If we are to find some universally acceptable explanations, they will harmonize some rather disparate characters. Each religion has its own answers and philosophical or

theological structures to support them. Each traces its authority to a divine, or at least infallible, being and

they all disagree. It would be convenient to invite Jesus Christ, Mohammed, the Buddha, Moses, Lao Tsu,

Confucius, Guru Nanak and Hindu lawgiver Manu for dinner and ask if they might come up with

something like United Religions. Our dinner guests would probably think it was a great idea.

But as each

represents a higher power, they must report back to God, Allah, Tao or Dharma for a go-ahead. Things

might get stuck at their own metaphysical level. There is a good reason for this.

Western religions rely on mutually exclusive personal revelation to holy individuals such as Moses,

Jesus, Mohammed, or Mormon leader Joseph Smith from one all-encompassing God. They also tend to

build on each other. Christianity added Jesus to Moses. Islam added Mohammed to Moses and Jesus. In

America, Mormons added Mr. Smith and some Christian Scientists in Boston lobbied for Mary Baker

Eddy. Korean Sun Myung Moon says he actually is Jesus. Followers of late Texas ex-messiah David

Koresh disagree, insisting Jesus did return but ascended again in Waco. Still, most claiming conversation

with a deity these days are offered Prozac® more often than prayer.

Finding agreement among Asian

believers is no easier. Eastern religions replace mutually exclusive prophets of God with mutually exclusive versions of the dharma, the eternal universal system uniting physical and metaphysical experience. Hinduism is technically Sanatana Dharma, or the “traditional system.” Buddha

preached the Buddha Dharma, his own understanding of the way things worked. The parts are not really

interchangeable. Theory and practices differ. The universality of Brahman is not the emptiness of Nirvana,

and the Tao is neither of the two. All three have several major schools and dozens of sects.

Getting the original sources to sit down could be even harder. Yahweh and Allah might agree to the

same menu since neither enjoys pork, but Ram would have a beef with steak because Hindus don’t eat cows.

Getting served could be dicey protocol; God wants no other gods served before Him. Buddha would win



points for tolerance since his monks must eat anything put in the begging bowl. But only in the morning.

Still, once their divine tastes were accommodated, they would soon discover how similar their messages

were at the human level, the only level we humans are concerned about. Like the larger sects of a major

religion, all religions of the world today seem to lead their followers in the same basic directions. They only

disagree on who is to be the guide and which guidebook we are to use. The more we investigate the basic dogmas

of the world religions, the more depressing it becomes. Each originates in a different land, embodies local

traditions, and each is, to the devout, the only one there is. Furthermore, there hasn’t been a new world

religion since the Sikh Dharma, Guru Nanak’s alloy of Muslim and Hindu faiths. The Baha’is have tried

very hard, but like Esperanto, attempts at cultural combinations this far along lack a certain spark.

In all probability, the world is about due for a major religious event of some sort. In repeated

oscillations, periods of human reliance on technology and power seem to alternate with periods of religious

revival. The blooming Renaissance provided incentives for the stern Reformation. Later, the industrial

revolution promoted a working man’s Gospel and democratic sects such as the Baptists, Methodists, and

the Church of Christ. Whenever it seems mankind is becoming too fascinated with material power there is

a social migration back to religious faith. This often results in entirely new sects. With so many different

religions around the globe and so much technology making that world go round, one can only wonder if we

may be closer to new fusions than we ever imagined.

Christmas and Compassion

Since heavenly mergers remain unlikely, world affairs would be greatly improved if our earthly

religious leaders came to some general agreement on not only what constitutes naughty and nice, but why,

and not just because their particular scriptures say so. In a world of over six billion humans we ought to

have enough accumulated experience to derive some general guidelines for good human behavior. Ideally,

real rules for humans would transcend local tradition and national politics. The real problem, in fact, has

very little to do with this sort of wisdom. Nobody really disagrees about naughty and nice. The differences

come up only in answers to the metaphysical questions and most people are interested in answers to more

pressing religious and spiritual matters.

Who’s getting the Christmas tree?

Why can’t people live in

peace? Didn’t I give at the office? Most of us don’t often think about metaphysics.

In fact, every religion on earth today enjoying credibility, cultural acceptance and at least a half

million followers is defined by only three areas of thought and practice. We can call the first “Cultural

Ceremonies.” The second, “Applied Social Psychology.”

The third and smallest area is metaphysics, the

historical theology or philosophy behind it all. Realistically speaking, most religious activity in any part of

the world today is taken up by the first two categories. Calendars are dotted with regional, national, and

international observances of religious rites and holidays. Christmas is celebrated in Bombay and Tokyo.

Muslims air shuttle to Mecca from Morocco, Marseilles and Memphis, Tennessee. Every culture has

harvest festivals, saints’ days and local celebrations. If it doesn’t disrupt the local social fabric, nearly any

form of personal religious observance is respected. Cultural politics may clash as in Ireland or India but as

individuals we have no quarrel with another’s yearly cycle of faith and celebration provided they stay

within the cultural expectations of our region.

The second category of religious practice, “Applied Social Psychology,” is even less of a problem.

This is because the great lawgivers gained their followings based on their insight into the universals of

human behavior. They had the ability to break them down into simple rules and the charisma to convince

others to observe these rules for personal and social guidance. Any savior or system too specific for broad

and general acceptance ends up with a cult, not a cathedral. Mother Ann Lee’s Shakers were the first

“greens”. Their love of simplicity left us plenty of fine old Shaker furniture but their practice of celibacy



left no fine young Shakers. There are no Essenes in Judea, nor Kadam-pas in Tibet.

Twice as many

gathered at Woodstock in 1969 as practiced Christian Science in 1996 and we can fit all the remaining

Swedenborgian Christians in a small auditorium. Even the world’s largest and wealthiest organized

religion, the Roman Catholic Church, is fighting for

its intellectual survival.

The largest single

denomination in the United States is a group termed “lapsed Catholics” as millions of the once-faithful question

the concept of a God the Father transmitted by celibate men wearing unusual clothing. Only theologies

founded on a basic understanding of human nature, expressed in an intelligible and universal form, can last

more than a generation or two.

Tolerance towards others, for example, preached by all faiths, takes on a new dimension as interfaith

conferences demonstrate the basic unity of the major religions. The Pope and the Mullah, the Lama and the

Swami chat cheerfully. Each is technically pagan to the others, but their meetings seem so cordial and

reassuring. Imagine what must be going on in their minds as they smile and pray together.

This growing

openness towards another’s religious beliefs becomes a social necessity when so many traditions mingle in

the crossroads of our global society. We can’t convert them all, and religions getting pushy about specifics

simply lose out. Most at risk are those requiring a hereditary link for membership. This trend is especially

pernicious to religions which avoid conversion. Orthodox Hindus and Jews alike watch their numbers

shrink each generation. By 1995, more than half the Jews in America were marrying outside their faith.

Orthodox Parsees, who until recently required both parents to be Parsees, are an endangered species.

Descended from the original Zoroastrians, they represent the oldest continually practiced organized religion

on earth. Less than a hundred thousand survive, and there’s little any non-Parsee can do about it.

Despite the tumult in the religious marketplace, at the broader levels of human behavior there seem to

be no serious differences. Allah requires generosity, Jesus preaches humility, Moses and Buddha remind us

not to kill and Krishna asks us to open our hearts to devotion and love. All tell us to help the weak, support

the poor, heal the sick, and above all be kind, compassionate and honest with each other. Their rationales

may differ, but the results are the same. All world religions represent an inherent human wisdom universal

in nature yet specific to each. The morality tales are always told from a familiar viewpoint by teachers we

can identify with. We have no unbelievers here nor any reason to disagree. Despite the surface differences

between a Jew and a Muslim, between a Hindu and a Catholic, we agree on most of the basic ethical

questions of daily life and how we are to behave toward each other. Given the opportunity to sample, we

would enjoy most of each other’s feasts and ceremonies. If this is so, two-thirds of the world’s faith and

practice is nearly convertible from one religion to another.

So what remains to quibble about? The only

area religions really differ in is the third category, their metaphysics. Each religion has its unique set of

answers for questions that go beyond rational or scientific inquiry.

Once more, most who practice a religious faith do not spend a lot of time worrying about such

things. Heavy thinkers do. Bertrand Russell, the celebrated mathematician and agnostic, once made a list

of five favorite questions he was certain science couldn’t answer: “Is there survival after death?” “Does mind dominate matter or vice versa?” “Is there a purpose to the universe?” “Is there validity in the assumption of natural law?” And “What is the importance of life in the cosmic scheme?” Only God or

Dharma, we are told by religious teachers, has the answers. Furthermore, only individuals made

acceptable through rituals, rites, and specialized education are entrusted with the interpretation. Clearly it’s

not a dispute over the eternal questions keeping us separated.

It is the variety we seem to find in the

answers. That this nearly academic aspect of religious practice was the basis of so much suffering in the

twentieth century will be the source of wonder in the twenty-first.

Still, most religions continue to insist their specific answers are the one and only truth, despite how

unlikely they may seem to others. “Religion,” as Reinhold Niebuhr, the pre-eminent American theologian of

the century wrote, “is so frequently a source of confusion in political life, and so frequently dangerous in



democracy, precisely because it introduces absolutes into the realm of relative values.” If there were a path

to real wisdom that didn’t require a specific religious or cultural loyalty there should be a United Nations

task force on the lookout for it.

It could stop a lot of the wars. In a post-modernist world, we don’t need

any more grand narratives. We just need to get along with each other.

“Neurophenomenology”: More than a Pretty Name

Scoping out new answers to unanswerable questions seems beyond the range of any one person or

even a group of specialists. The wreckage of countless attempts to come up with explanations and schemas

already litters the shelves of the bookstores. At this time, the greatest amount of interest seems to be

focused on the area of the mind sciences and the recent investigations into the general phenomenon we call consciousness. There is a good reason for this. No matter what faith we follow, we all know by now

there are billions of people who believe otherwise. Like ourselves, they survive and even prosper. The very

variety of human religions leads to a simple line of reasoning. All human cultures have religions. Their

cultures differ widely but the rules of their religions are nearly identical. Is it possible that the roots of

human religion may extend beneath human culture, into the workings of consciousness itself?

The history of mankind demonstrates that cultures living in widely separated regions consistently develop religions with similar rules even if they disagree on the details. If this is the case, it suggests the source of religious inquiry may be internally generated. It may be a species-wide “need-to-know”, expressed through a variety of remarkably similar social structures, customs and belief systems

whenever human culture reaches a particular level of development. But the development of what? It would

have to be the development of the ability to pose Russell’s very questions. This line of thinking leads one

step further: Any conscious behavior common to all humans must be the result of something at a

neurological level. Only something based in the very way our human brain operates could affect all human

thought so that identical ideas would emerge nearly unchanged in every culture. The basic metaphysical

questions could be hard-wired right into the system. This is a difficult stretch because it inserts the physical

into the metaphysical. The classic dualism of René Descartes, a physical brain creating a non-physical

mind, gives Hindus hives and Christians the creeps. It even gives modern “consciousness” scholars the

shudders. Seeking clues to the soul or the spirit in what looks like a mass of wires and plumbing puts off

both Baptists and Buddhists. Still, it remains the most likely route to the source of religious experience as

we perceive it. The brain is, after all, the only organ capable of conscious perception. We can’t do it with

our toes or our tonsils.

This detail has been obvious for a long time. The primary importance of the brain in the perception

of consciousness in all its forms has been well known since the ancients. On the other hand, most of the

methods the brain uses to accomplish this task have only recently been revealed to us through computerized

medical technology. The rapidly growing interface between brain science and computer science has created

a powerful alliance; as a result, the architecture of the brain is finally being defined. We are beginning to

read the operating manual of the mind.

If this continues it is likely philosophers or theologians of the

twenty-first century will be required to show fluency in mind science just as modern medical doctors must

know their biochemistry. Things have changed that much.

As a natural result, we are drawing closer to new philosophical insights capable of finally

harmonizing scientific method and religious belief. The absolutely correct term for this is the cumbersome

“neurophenomenology,” literally “using neurological science to define the nature of reality.” For this is, in

fact, what seems to be emerging. “Neurotheology” has two syllables fewer, says basically the same thing,

and links it specifically to religious philosophy.

Thomas Aquinas developed his systematic philosophy, Thomism, using Aristotelian logic to order and anchor Christian theology. Likewise, modern



philosophers and religious thinkers are starting to use the structures of mind science to provide intellectually

universal, generally agreeable points to underscore their conclusions.

The reason to use mind science as the foundation for a modern metaphysics is a simple argument.

Since we experience only what we perceive, we should first study the structure and function of our major

organ of perception, the brain itself. By learning more about the unique way we perceive reality, we may

discover simple clues to believable explanations of otherwise traditionally unexplainable mysteries. There

are limits to our understanding, true, but this may be because of the way the brain arranges consciousness

rather than any lack of enlightenment, devotion, or grace on our part.

High-Tech Hybrids: If Ever the Time Were Ripe

If our metaphysical questions are specific to humans at large rather than any particular culture, they

must be byproducts of human consciousness itself. How else would these queries appear consistently in the

minds of human beings everywhere? Perhaps our elegant answers, unique to each culture, are just echoes of built-in queries, a necessary response of the human mind to something even more basic. We seem

susceptible to some common mental itch, a series of questions that make the same mischief in every human

mind. Looking at it logically, suppose simple generic answers were available. Would they be universally

accepted? If it is the nature of human consciousness to pose these questions, any explanations would have

to fit within both the cultural and personal reality of each person. Strictly individual answers to the

questions could become a religion of one, pretty well eliminating both the ceremonies and the good deeds.

Nobody’s going to shut down the banks for “Bob’s Day”. It makes sense, then, that even in a global society

most religions are not personally specific but culturally specific.

However, the merger of science into the

general belief systems of nearly all world cultures is finally providing us with a common language that goes beyond culture. As a result, boundaries between physics and metaphysics are shifting faster than ever before.


The concept of human consciousness as the result of a biochemical process is an example of a

systematic viewpoint.

In many areas, systematic ways of thinking seem to be overtaking the original

hierarchical philosophies of the West.

Top-down religions accept that an intelligent God is in charge,

providing purpose and motivation for the entirety of creation. According to a systematic perspective, there

is no need for divine motivation for things to happen if there is a natural system explaining them. The most

powerful systematic philosophy in the world today is rarely recognized for what it is, and yet it pervades

and supports every aspect of our lives. We call it “the laws of science.” From the ecology of the earth to












perspectives are emerging as basic tools in nearly every area of our understanding.

This is a very good time for efficient systems. World consciousness is finally moving from local to













organizations. Continuing advances in global communication urge us daily toward some reasonable

religious accommodation based on global issues and human compassion rather than local history and

politics. This time around, however, “reasonable religious” may no longer be an oxymoron like “jumbo

shrimp”. It seems inevitable that any new directions will embrace modern technology as a tool for discovery and

compassion rather than a cutting edge for business or war. So far, at least in this century, the major role of

technology in the service of religion has been to increase the reach of already established world faiths. The

closest contender for a new sect with a high-tech terminology would be L. Ron Hubbard’s neo-rationalist Scientology. Despite the utility of some of its simpler practices, Hubbard, an excellent writer of science fiction, never came up with a believable philosophical or ethical structure. It can never, therefore, attract

broad social support.

Even more interesting science/religion mutants have arisen due to the recent



popularity of techno-jargon in books and seminars based on pseudo “mind-science” philosophies. From

Deepak Chopra to Neuro-Linguistic Programming, each has claimed a “scientific” basis and language. It is precisely the manner in which medieval mountebanks adapted the phrase “hocus pocus” from the priest’s “hoc est corpus Christi,” Latin for “this is the body of Christ.”

It created a Latin-sounding magic charm

to fool village oafs into thinking they were listening to real authority.

Most of these new scientific-sounding philosophies are convincing only to seekers uneducated in the very science they quote to support

their beliefs.

The real problem has been the lack of a metaphysical structure based on legitimate scientific insight.

It would have to use real science accepted by real scientists if we want to tie together reality as we

experience it into some believable and meaningful pattern. The Roman Catholic catechism asks “What is

the purpose of man?” and provides an answer. Until recently when it came to questions such as these,

science drew a blank and religion stepped in. Most of us assume, like Russell, they are simply not

answerable through science. We were happy enough to get believable answers, even if they always dodged

some fairly obvious questions. For example, there is the time problem. Most answers provided by religion

appear to operate within an eternal system at odds with the laws of time and space. Science is here to help

us define and manipulate the laws of nature in discrete time frames. Our holy books discuss eternal truths

and concepts such as the life, or the lives, everlasting. If science located an everlasting anything it would be

difficult to describe or prove.

Any definitive studies could not be published until after the end of forever, which basically sums up the situation. When compared against each other, science and religion each tend to make the other seem trivial in the most fundamental enterprises of human life.

Virtual Heresy?

If consciousness is the result of a biochemical system, the underlying rules of this consciousness

must be flexible under extreme circumstances. Given the proper conditions, might we then perceive a



timeless eternity or a transcendental experience? The implications of research along this line of thought,

taken far enough, suggest we may be about to answer most of Russell’s questions. In the process of normal

brain death, for instance, we will all experience states of consciousness where the perception of both time and space is greatly altered. The real problem is this: if we use brain science to really answer the question “What really happens when we die?” and such a theory became generally accepted, then, just as with Darwinian evolution, every religion on earth would have to deny it or demonstrate how its holy scriptures could include it by a modern interpretation of ancient dogma.

In fact, as this book itself was being written there was a serious concern that any genuine

breakthrough in this delicate area could incite hostile, perhaps even violent, reactions among devout

followers of one religion or another. Copernicus waited nearly until his death to publish his treatise placing

the sun at the center of the solar system.

Author Salman Rushdie spent years in hiding for offending the

Muslim clergy. People get very emotional about their religious beliefs. Going to heaven without believing in

Jesus is impossible for a devout Christian and any suggestion to the contrary is heretical. In a 1997 Gallup

poll, 96% of the population of the United States believed in God and 74% in an afterlife. If heaven is in the

simple mind of pre-birth and infancy, reappearing at brain death, any sinner could deny God, Christ or

Prophet and theoretically if not theologically make it home free. Belief is belief after all.

If an answer

appears valid enough for wide acceptance, the believer is as assured of heaven by that means as by any

other. Would neurological answers be inherent heresy, repugnant to the sincerely religious of every faith?

There are, it turns out, few scriptural bars to contemporary explanations, so long as they don’t deny the religious event itself.

Finding the bones of Jesus Christ, for instance, is off limits. Such a discovery

would certainly prove His existence, but deny His bodily ascension, a basic item of Christian doctrine. It

would forever be contested and the discoverer marked for life as the source of a basic schism in the faith.

One cannot deny such basic dogma and hope to escape censure. Fortunately, finding a scientific



explanation for the experience of an eternal afterlife does not deny the event. It should, after all, be within

the power of God or the Dharma to transcend us properly to our heavens and afterlives without smoke or

mirrors. In the Indian holy city of Varanasi (Benares), for instance, devout Hindus believe Bhairab, the lord

of the dead, allows those fortunate enough to die within the sacred acreage to avoid the tedious rounds of reincarnation. He simply collapses their future lives into one amazing instant so that they can “see Shiva”

immediately. This endless lifetimes express has never been scientifically investigated but any solid insights

into how Bhairab accomplished his feat wouldn’t invalidate the event.

Describing the method never has to reject the miraculous. It can even reinforce and revitalize the faithful to realize that the divine beauty they see in a gorgeous sunset cannot in any way be diminished by an understanding of the biochemistry of visual perception. Does it require more of a stretch to suggest that if

consciousness generalizes enough during brain death we will all wind up in identical mental simplicity just

before we die? Could this be what the twentieth-century Swiss theologian Karl Barth had in

mind when he wrote “All shall be reconciled”? Or would such speculation simply become the source of

more misunderstanding and conflict?

Historically, in fact, religion is usually accommodating. Although a number of preachers had

problems with Charles Darwin’s theories, another Victorian clergyman, Charles Kingsley, read the recently

published Origin of Species and wrote to the author, “I have gradually learnt to see that it is just as noble a

conception of Deity, to believe He created primal forms capable of self development into all forms needful

pro tempore (for the time) and pro loco (for the place), as to believe that He required a fresh act of

intervention to supply the lacunae (spaces, literally “lakes”) which He Himself had made. I question

whether the former be not the loftier thought.”

Evolution was an idea whose time had come. Major

scientific advances, as dramatic as they seem, are inevitable. Philosopher Daniel Dennett has pointed out

they are not like the works of Shakespeare or Rembrandt. If Rembrandt hadn’t painted, there would be no Rembrandts,



but if Darwin hadn’t figured it out, someone else would have within a few years. When breakthroughs so

basic occur, everyone trades up to better science. Religious sects that don’t get with the program generally

die out or end up, like the Pennsylvania Amish, with horse and buggy lifestyles in a world that has passed

them by.

Paradigms and Progress

The most convincing aspect of Darwin’s theory of evolution shares a basic similarity to the

Copernican system.

Both provided simple and connected explanations of complex and previously

disconnected observations of the world around us. In the process both triggered a basic restructuring of

western scientific and philosophical thought. Such radical changes in perspective were identified and

described by philosopher Thomas Kuhn as “paradigm shifts.” In a paradigm shift, a new explanation

forces a fundamental restructuring of a popular viewpoint. This is exemplified by the worldwide switch to

the Copernican solar system.

Within a single generation the entire scientific establishment abandoned

theories taught as fact for centuries.

The “Copernican Revolution” was a fundamental philosophical event as well. Once man was no

longer the center of the universe all sorts of other assumptions began to cave in. As it happens, religion has

always maintained a connection with natural science and by the sixteenth century Christian theologians had

long embraced Ptolemaic astronomy. The nested crystal spheres of the Ptolemaic universe seemed a bit

lonely, so medieval Christian writers proceeded to populate them with all manner of heavenly winged creatures. Having deeded the holy real estate to cherubim, seraphim, archangels and so on, it was

embarrassing to evict them all. For a time it seemed easier to evict the Copernicans, but too many

telescopes confirmed their observations. No spheres, no angels. But they revealed a universe so vast and

grand that it humbled human imagination while making us forever a part of it all.



Paradigm shifts are not improvements in an old system but the unexpected introduction of a new

system. As such they always face opposition from institutions and individuals who represent the current

popular culture.

There were plenty of good universities at the time of Copernicus, but Ptolemaic

astronomers were naturally the only ones available.

Once they got their hands on the new treatise most

switched over easily. Others were dragged kicking and fussing into the new era.

Great minds of one age

sometimes simply can’t manage the mental stretch to get to the next. The great nineteenth-century scientist Louis

Agassiz discovered the ice age, explained the origin of glaciers, and collected fossils from ancient sea beds.

Still, he never accepted Darwin. In any event, if scientific observation continues to support the new view a

radical viewpoint becomes natural law through sheer attrition. Everyone knows the continental masses, or

plates, making up the earth’s surface drifted over millions of years to become the shapes we recognize

today. The man who figured it out couldn’t determine exactly what made them move and died in obscurity.

Still, the logic made such sense a mechanism was soon found.

Sea beds were spreading due to lava

upwelling from the earth’s interior. The opposition shifted and plate tectonics, a wild idea from an unknown

scientist, is as respected today as the law of gravity.

Paradigm shifts are usually characterized by two qualities.

First, they result from new yet

surprisingly simple changes in perspective. Second, the new idea seems basic enough, once explained, but

it is always grounded in the most recent science of the time. When the actual breakthrough occurs it often

happens so dramatically it’s a shock to the discoverer. In fact, “sudden” discoveries are nearly always

the result of years of tedious effort. “At first I was deeply alarmed,” wrote the German physicist Werner

Heisenberg, describing his initial insight into quantum theory. “I had the feeling that, through the surface of

atomic phenomena, I was looking at a strangely beautiful interior and felt almost giddy at the thought that

now I had to probe this wealth of mathematical structures nature had so generously spread before me. I was

far too excited to sleep.” Others report the same experience, the alarming discovery of a new and better



way to understand some basic phenomena.

The concept is always profound in implication and yet so

elegantly simple in description it simply must be right. Real paradigm shifts often spread so fast it can take

years for solid proof to catch up. The Copernican system, as originally published, was faulty. It required

the combined work of Johannes Kepler, Tycho Brahe and Isaac Newton to both prove and improve a system

so obvious it had already become widely accepted. First comes the vision, later come the details. The ideas

lead to the formulas, not the other way around.

“I am a physicist,” grumped Einstein, “I haven’t the

mathematics to prove my theories.” Still, it never stopped him from rearranging the physical universe to fit

the theories he could never prove mathematically.

In their simplicity, the insights behind paradigm shifts all support the example of “Ockham’s Razor,”

a philosophical observation made by the fourteenth century English cleric, William of Ockham. His

cutting-edge comment, “It is vain to attempt with more what may be accomplished with fewer,” has been

proven again and again throughout the history of science. It was itself shaved down to the three words

“nature abhors complexity.” Given two possible explanations for a phenomenon, the simpler is invariably

correct. The theories of Nicholas Copernicus, Isaac Newton, Charles Darwin, Albert Einstein, and Werner

Heisenberg all explain a wider range of physical events with a simpler overall system than previously available. Each of these new perspectives allowed new and unexpected observations to fit with accepted

science by introducing a radically different, but inherently simpler, overall structure.

The second aspect common to paradigm shifts is not obvious to a purely philosophical investigation.

Nearly all the insights that changed the way we view the world were catalyzed by specific advances in

technology. Copernicus, a gifted amateur astronomer, was using the best tools late Renaissance technology

could provide. They provided observations accurate enough to suggest a theory more daring than the

technology itself. Likewise, it wasn’t until the twentieth century we learned that Isaac Newton fudged some

experiments described in the proofs of his Principia. The structure was so elegant he refused to let the




limitations of his own instruments, far too crude to yield such accuracy, get in the way of his discovery. It

simply made too much sense to be wrong. So he ran with it even when he knew he might never be able to

prove it if anyone challenged him. Fortunately, nobody did.

Still, without the improved mechanics of the Renaissance, the fine lenses ground by craftsmen like Antonie van Leeuwenhoek would not have been there for Galileo or Newton. Without the improvements of Newtonian

physics, the nineteenth century Michelson-Morley speed of light experiment could not have posed new

questions for Einstein to answer. Like relay runners passing the baton, finer science creates finer theory, in

turn creating even finer science. It was only a matter of time before the tools of brain science provided the

necessary perspective for another major shift. Galileo’s telescope used lenses originally made for eyeglasses; tools borrowed to go exploring revealed more than new sights. They made stunning insights possible. Using the newest instruments of brain science, borrowed from modern medical technology, to probe for meaning and purpose in the mind, what we find is just as clear

and just as stunning. As expected, these new perspectives on the nature of consciousness are surprising but















monk, was from Ptolemy, the classic philosopher. As expected, unexpected new observations continue to

shift our perspective with proofs which are simple, obvious, and ultimately reassuring.

Get Real, But Where?

Most people would agree reality is what is really happening in the “real world”. We are taught

this “real world” is in fact real.

It has an absolute existence which we each interpret a little differently.

Likewise at any moment the universe occupies a fixed location in time and space and we are located in a

fixed time and space within that universe. This is the way current philosophy works. Based on recent

observations, this is as unlikely as Ptolemy’s spheres full of angels.

We all know nerve impulses don’t

travel at the speed of light. Not only that, none of the senses are directly linked to our brain, the only

organ which can provide our perception of that impulse. Every sound we hear, every scent we smell,

every object we see travels through dozens of slow-me-down nerve connections and crossovers before it

reaches our consciousness. Any “real world” we are experiencing actually happened at some other time.

It’s nearby, yes, but always a few milliseconds removed from what we perceive. All we can ever know

is a momentary mental image, an imperfect echo of a recently past event.

“Real time” has moved the

“real world” forward while its previous image still lingers in our mind.

Like the Copernican professors, we find it’s not easy to reverse gears in midstream, but we have to accept the

obvious. Before we were born, our consciousness was nearly all internally generated. From that point on,

we get our information in a series of steps from sources further and further away. It’s almost funny when

we think about the supposed eternal stability of the starry cosmos. What do we know? The stars are thousands

of light years away. The images we’re picking up are so old those stars could have all turned bright pink

years ago and nobody will know for centuries. Our sun could blow up right now and we won’t know for

eight minutes.
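The eight-minute figure is simple arithmetic: a light signal’s travel time is just distance divided by the speed of light. Here is a minimal sketch of that calculation; the distances are rounded textbook values, not precise astronomical data.

```python
# How long light takes to reach us from various sources.
# Distances are rounded approximations for illustration only.

C = 299_792_458            # speed of light, meters per second
AU = 1.496e11              # mean Earth-Sun distance, meters
LIGHT_YEAR = 9.461e15      # meters light travels in one year

def delay_seconds(distance_m: float) -> float:
    """Time for light to cover the given distance."""
    return distance_m / C

sun = delay_seconds(AU)
print(f"Sun to Earth: {sun / 60:.1f} minutes")   # roughly 8.3 minutes

star = delay_seconds(1000 * LIGHT_YEAR)
print(f"A star 1,000 light years away: {star / 3.156e7:.0f} years")
```

The same division explains why starlight is always old news: multiply the distance and the delay grows in exact proportion.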

You can make it to the end of the chapter. We are all so used to hand-me-down, passed-



along information we never think about it. Talk about jury-rigged scenarios hashed together from secondhand images collected all over the time spectrum! Real? We can’t get real from out there into here without

throwing real time into a tizzy, and even then we mess it up so badly in so many ways we’re lucky to

perceive it at all. It’s usually more static than sensible information.

The freshest, fastest information we can get is what we generate internally.

Not only that, unless

something rearranges our brain it’s going to be consistent over time as well. Try to get consistency from the

“real world out there” and see what it gets us.

Internal information is naturally more believable than

information from places where things change radically over time or even disappear. Everything “out there”

does that sooner or later.

This leads to the conclusion that not only is “reality” a world we lace together

ourselves, but the realest and most stable part of it is strictly personal. This may be reassuring but it also

means that as we are the only witness to our world we are also the only judge. It’s nearly impossible to

separate truth from opinion in our own mind. We can agree with others about a lot of things, most of the

laws of science among them, but “real”?

“Reality” for any creature is based on an internal state of

awareness. To this, we add a potpourri of thirdhand, passed-along perceptions from “out there” and they

combine to form an all-encompassing interface, the self-created virtual reality we call consciousness. This

consciousness is constantly synthesized from both external and internal senses. It is perceived and

understood to the extent of what’s available in any given brain at any given moment.

“Reality” cannot be

decreed because it can only be as it is perceived and each conscious creature perceives it a little differently.

There can be average perceptions, agreements, or a consensus among people, but there can’t be an absolute

reality because no two brains are ever going to put it all together precisely the same way.

They can’t. Our DNA is unique, each of us is unique, and our brains are likewise unique. No human

can ever know any ultimate reality. We only know what appears to our mind, our personal consciousness

and nobody else will see anything the way we see it. Everyone has a different view. “For what is a beautiful



woman?” said the philosopher Nagarjuna. “To a man? An attraction! To a monk? A distraction! To a tiger?

A good meal!” Even worse, depending on what else is happening in our individual lives, our minds are

easily deceived or distorted in a number of ways we can learn to understand and predict. One thing is certain, if anything can be certain:

the only “reality” we will ever know is personal, biased and slightly

behind the times as well. It would be nice to perceive the “real world”, but we can’t. Only our world. As

the world we experience is a virtual reality, it is the product of a process. As this process undergoes

predictable distortions, such as in extreme stress or the generalization of brain death, we will certainly

know other realities. They will be just as real to us and just as believable as the one we are experiencing now.


So this is what happens?

Skip to Chapter 10 if you’re curious.

This is only one example of the horizons which appear as discoveries in mind science lead us into an entirely new perspective.

It is

already changing the way we view ourselves and the world we live in.

Since we all perceive a slightly different world it’s important to learn some basics of how we

perceive. To do this, we turn to the brain itself. There is a good reason for this. Most people believe in a

non-material mind, spirit or soul. Still, for us to be aware of that soul it must be held in our consciousness

for examination and reflection. Our consciousness alone is making the entire universe known to us; and our

consciousness itself is made known to us moment to moment only by the healthy functions of our living

brain. It is here where the paradox of a physical brain and a non-physical mind, spirit, or soul may be

resolved. Whether consciousness itself, which we use to perceive everything, is created by the brain or is

simply perceived by the brain, it can only be as we perceive it. And we perceive it all through the structure

and the function of the most complexly and intricately arranged form of matter we can ever know or imagine.


Within the space of roughly fourteen hundred cubic centimeters moves the exquisite organic

instrument which determines our awareness of everything else. Damage or stress it and we are no



longer aware of anything in the same manner. Our universe will change around us. We may have a change

of heart, we can change our minds. The brain will accommodate our shifting realities without missing a

beat. But tamper with any basic function of the brain and we can distort or destroy the perception,

realization and projection of our entire universe for some time. Possibly for all time. Our world as we know

it is in our own hands. More precisely it’s in our heads. Our brain functions at a level beyond our

perception as it monitors anything we can imagine while running everything else at the same time. It is as close to the infinite as we will ever get, but it is not out there. It is in us. It is the part of us that is making us who and what we are, and it is alive and well, or you could not be reading this.

With those fourteen hundred cubic centimeters, the brain arranges the only world we know and

perceive. This constant physical activity allows us to know our days, our nights, our dreams, our faith, our

beliefs and any other thing we can know at all. We shape our world with the very lens we use to perceive it.

Whenever we search for meaning we always find it where we look longest. It will always be based on

whatever we believe in most. Why do we sense the universe is not really locked in time and space but

moves in a constant state of creation and change? Simple answer. Because we perceive it with a human

consciousness perceived through a living brain, itself in a constant state of creation and change.

Nothing is static in there. Every thought has its minute effect, every moment is a little different.

Jesus, Mohammed, Moses, Gautama, Shankaracharya: all experienced the world through this same organic

miracle, the same basic structures we each call our own. Nobody ever suggested our greatest teachers were

another species or their wisdom arose from their elbows or their eyebrows. If there is a link between

humanity and divinity, the interface is located here. We probably all have within us the inherent ability to

experience their spiritual insights and share in their assurance. That might convince deists of their touch of

divinity. At the least it provides us all a more enlightened understanding of our common humanity. Once

the earth was no longer the center of the universe, people searched everywhere to find a new center of it all.

Religion and Mind Science


The answers were right behind our eyes, right between our ears. Now we know. We are each the center and

the creator of our own virtual worlds and if these simple arguments make a little sense, William of Ockham

is smiling and your paradigm just started to shift.


As Real As We Can Make It

The Codes of Consciousness

One ought to know that on the one hand pleasure, joy, laughter, and games; and on the other grief, sorrow,

discontent and dissatisfaction arise only from the brain. It is especially by it that we think, comprehend,

distinguish the ugly from the beautiful, the bad from the good, the agreeable from the disagreeable

— Hippocrates

Our perception of the world and our thoughts about it take place in our brain at the same time.

There is a difference, however, between the world we perceive and the world as it actually exists. Take

sight, for instance. As early as 1709 George Berkeley, whose name adorns the famous California campus,

argued that it couldn’t all happen at once. Now we know why. First a photon has to bounce off whatever

and ricochet into our eye. This part happens instantaneously because all photons travel at the speed of light.

It zips through the eye’s transparent lens and hits the retina, slamming into a light-sensitive molecule called rhodopsin. Its retinal component acts like a spring, bending momentarily and snapping back. This is good because




















the molecule needs to snap back as soon as possible to be ready for another photon. But the 200 quadrillionths of a second it takes

for this twist-and-back is longer than it took the image to get to us. By the time the message makes it

through three layers of cells in the retina and gets routed via the optic nerve to the visual cortex, the

rhodopsin has already snagged a dozen new images. This is before we’ve “seen” the old one. Like a photo

shop developing machine, the brain develops these signals into “sight” as fast as we feed them, but it still

takes a little time in the processor.

As a result, at any given moment we experience a world just slightly

behind “real time”. It is this perception we react to and relate with, a magic mental carpet endlessly woven

on the loom of our own internal time and space. Moment-to-moment life appears seamless to us because we

can’t perceive the mechanics of perception. Still, just as the magic of Disney World is limited to the ranges

of the pumps and gears that move the magic kingdom, human consciousness also works within limits and

according to strict rules.

From Zero to One

Conscious awareness is a dynamic with two ingredients.

The first is perception: the ability to

extract useful information from the environment.

The second is reaction: the manner in which this information

improves our interaction with our environment.

Conscious creatures, even those with limited self-

awareness, exhibit both.

Limited perception naturally leads to limited reaction. Were there no need for complex reaction, we would likewise have no reason for complex perception. When a synthesis of conscious perception and memory is guided by abstraction and prediction, we call it reasoning. It is finer levels of reasoning which we call “intelligence.”

When it comes to judging consciousness in other creatures it is useful to remember these variables.

The honeybee does perceive and is aware of ultraviolet light. It sees a color we cannot imagine.

On the

other hand, a bee’s brain is so small that it cannot adapt or decide anything consciously. It cannot reason



at all. Moreover, its minimal insect awareness must act through a nervous system of great simplicity and economy.


Insects are practically hard-wired and entirely pre-programmed. If a bee heading in a beeline

meets a breeze, increased air pressure on one side of its body trips a muscle linkage that adjust wing angles,

like helicopter rotors, to compensate for sideways drift.

automatic transmission.

The bee doesn’t know it happened.

It’s an

Returning with nectar or pollen, it dances directions to the flowers, turning in patterns on the hive

wall as other bees brush up to get the latest travel reports.

It would be nice to imagine bees are

scrupulously honest since not once has a bad bee knowingly passed on false information. In fact, they can’t.

It’s the playback of a flight recorder operating the bee, turning the insect into a dancing marionette

mindlessly miming something it can never understand. Aside from lacking alternatives, insect brains have

little internal redundancy because insects don’t live long enough to need replacement parts. The life span of

a bee is only four years, the same time it takes a human brain to mature. Ultraviolet isn’t the only color

we are denied. Some birds can see “red approaching” as distinct from “red receding”. It’s a shame they

can’t tell us what it looks like.

Eagles have greater visual acuity than we do, but they never think to nest

over VFW halls and get fat.

Birds can’t think.

Descended from dinosaurs perhaps, but they’re still bird

brains. Great perception, but not doing a lot with the information. True, crows are very clever and parrots

are amazing mimics, but still it’s nothing to crow about.

Human neurons are more complex, more efficient, and interconnected at a level not available to most

other creatures.

To observe even a single neuron going about its solitary business is to witness

extraordinarily complex activity. Aside from life support functions such as metabolizing glucose and

oxygen and managing all sorts of chemicals and hormones, each cell is always communicating with

hundreds, even thousands of others. Each neuron has its own unique voltage threshold.

Any time its

internal voltage rises high enough, the neuron will “fire,” sending an electrical pulse downstream to every

neuron it touches. Constantly juggling pulses from other neurons, it adds its point or its pause many times

a second to a dense network of interconnected neighbors. It fills the brain with a dense and chattering static

of excitatory, inhibitory and modulatory messages. For any sudden activity, the time period between pulses

shortens, hustling signals along in fast staccato bursts to handle temporary information overloads.

It’s interesting to note the resemblance between each neuron and a tiny process computer. It has

internal instructions to fire if the internal energy passes a certain voltage level. Then rest and reset. Its life

is endless cycles of averaging, pulsing or not pulsing, and resting. It is a world of adding, subtracting, and

calculating like a tolerant little accountant living on air and Twinkies™. Several times a second it says

“something” or “nothing” and rests for a new cycle. This regrettably leaves us with an inevitable

conclusion. At some level every thought, feeling, and perception has been perceived, remembered, or

recalled as a complex pattern of voltage potentials and pulses. This is hardly poetic. For most it conjures

up images of analog to digital interfaces chopping up the harmonious ebb and flow of life into binary bits

like some virtual Vegematic™.

Once again we are confronted with the popular image of the brain as the

ultimate personal computer. Still, it makes sense the brain would by now have evolved to use the most efficient way of doing its work. For information processing this means simple pulse codes.
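The fire-and-reset cycle described earlier (accumulate incoming pulses, fire past a threshold, then rest) can be sketched as a toy leaky integrate-and-fire neuron. The threshold and decay numbers here are invented for illustration, not physiology:

```python
def run_neuron(inputs, threshold=1.0, leak=0.9):
    """Toy integrate-and-fire sketch: accumulate incoming pulse strengths,
    fire a 1 when the running voltage crosses the threshold, then reset
    and rest; otherwise report 0 for that cycle."""
    voltage = 0.0
    spikes = []
    for pulse in inputs:
        voltage = voltage * leak + pulse   # leaky integration of inputs
        if voltage >= threshold:
            spikes.append(1)               # "something": fire downstream
            voltage = 0.0                  # rest and reset
        else:
            spikes.append(0)               # "nothing" this cycle
    return spikes

print(run_neuron([0.5, 0.5, 0.2, 0.9, 0.1]))   # [0, 0, 1, 0, 0]
```

Two weak pulses arriving close together push the cell over its threshold where either alone would not: the averaging, pulsing, and resting of the tolerant little accountant in miniature.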

The underlying basis for digital codes in computers is that as long as one has sufficient speed, it is

immaterial whether the computer sees the number 357 as “357” or as a string of ones and zeroes. If the

mechanism, or the organism, has to distinguish between only two possible states it’s much easier to identify

a signal against the background “noise” present in an environment, silicon or cellular. Everything can be

simpler and more efficient. Since computers are so exceptionally quick, they don’t mind working in digital

and it’s been policy from the beginning.
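The indifference to representation is easy to demonstrate: whether 357 arrives as a decimal string or as a string of ones and zeroes, no information is lost, and the two-state form is the one easiest to tell from background noise:

```python
n = 357
bits = format(n, "b")          # the same value as a string of ones and zeroes
print(bits)                    # 101100101
assert int(bits, 2) == n       # nothing lost going there and back
```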

We had a small computer back in the dawn of such things, when the first personal computers were

ten years in the future. The Varisystems 1000 was dumber than most hand calculators. All it did was



convert typed code into punched tape read by a phototypesetter. It had a program with about 800 steps,

functionally lower than a sea snail but impressive for its time. The poor idiot was forced to go through its

entire program every time it wanted to do anything. Everything it knew was in 800 consecutive steps, a

one-way smorgasbord-on-a-track allowing no deviation.

It didn’t even have the sense to go looking for

anything in particular. We would type a “q,” and in one cycle it noted a “q” had been struck. It registered

“q” in a little cache memory and ran through all 800 steps again to find out what this “q” would look like in

Litton punch code. Finally, it located the code and jogged through the whole shebang a third time to tell the

punch to do “q.” IBM made the keyboard, Photon sold the computer, and Litton Industries made the tape

punch. If anything went wrong, which was not uncommon in those Neolithic times, repair wallahs from a

trillion dollars worth of corporations would show up and blame each other as old stupido ran around in

800-step circles.
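That one-way smorgasbord-on-a-track can be caricatured in a few lines. The table entries below are invented stand-ins, not actual Litton punch codes; the point is only that a purely sequential machine must march through every step to find anything:

```python
# Invented stand-in program: 800 consecutive steps, most of them empty.
PROGRAM = [("a", "0b001"), ("q", "0b101"), ("z", "0b111")] + [(None, None)] * 797

def lookup(key):
    """March through all 800 steps in order, with no index and no
    shortcut, noting the matching punch code along the way."""
    steps_taken = 0
    found = None
    for entry_key, code in PROGRAM:    # one full pass, start to finish
        steps_taken += 1
        if entry_key == key:
            found = code               # found it, but keep marching anyway
    return found, steps_taken

print(lookup("q"))   # ('0b101', 800)
```

Every query costs all 800 steps; old stupido has no way to stop early.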

Of course, silicon solid-state switches are very fast. So fast, in fact, our poky little computer ran

through all 800 steps about a thousand times a second. It was a manic whirring electronic conveyor belt,

hungry for digital bits. We’d hit the “q” on the keyboard, and zap, it was punched out on the tape faster

than we could think about it. A device wolfing down digits by the microsecond doesn’t mind if 357 comes

in threes, or three hundred ones and zeroes. It has all the time in the world; a relaxed sort of digital virtual


Human brain speed is not nearly as fast, rarely exceeding eighty miles per hour. Yet it more than

compensates with massive redundancy and complexity. Initially most computers used the model developed

by John Von Neumann. Computational steps occurred sequentially. From relics like our early computer to

multi-million dollar mainframes, programs executed commands consecutively. Later advances made

possible a new generation of computers built on another plan.

“Massively parallel” designs separate

computational tasks, sending them simultaneously to a collection of powerful little processors and

reassembling an answer at the end. Much faster. The human brain, composed of billions of data processors in a dense three-dimensional matrix, is massively parallel to a degree we have never even tried to explore.

“Unimaginably intricate” is both an accurate description and an understatement.
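The two designs can be contrasted in a sketch: one big task split into chunks, farmed out to a pool of workers simultaneously, and reassembled at the end. Python’s thread pool stands in here for the collection of powerful little processors, and the chunking scheme is an illustrative choice:

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_sum(data, workers=4):
    """Massively parallel in miniature: split the task, hand the pieces
    to a pool of workers, and reassemble one answer at the end."""
    chunk = max(1, len(data) // workers)
    pieces = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(sum, pieces)   # each worker sums its own piece
    return sum(partials)                   # the reassembly step

print(parallel_sum(list(range(1001))))     # 500500
```

A Von Neumann machine would walk the same list one element at a time; the answer is identical, only the division of labor differs.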

If we are to send information around in a complex structure, and the brain is infinitely more complex

than any computer, it would be helpful to use the simplest codes possible. The binary system is simply less

confusing. Pulses are “there” or “not there.” Ultimately, if these multiple, multiplexed, interwoven codes

are complex enough, we can express nearly anything. As a result, all of our senses, both internal and

external, send information to the brain coded into a string of pulses. From the taste buds on the tongue to

the tone receptor hairs in the inner ear, everything comes to mind originally as a pattern of pulses and

zeroes. It is the major business of the brain to integrate this information sequentially with any and all

pertinent information available in memory and react to it, incidentally creating the grand virtual reality we

call the conscious experience of life. It seems impossible a consciousness such as ours could be adequately

perceived through something as simple as a molecular Morse code, but it is not as hard as it appears. Our

own visual system is a good example of how neural codes can still make poetry.

Painting by Numbers

When it comes to scanning photoreceptors, the human eye is without question unique in the animal

kingdom. We share the gift of color sight with very few other creatures. Every dog has his days but they’re

all in murky greens, browns and grays. Aside from the ability to see over 30,000 shades of color, we are

one of the few species with true stereoscopic three-dimensional vision. More important, we do much more

with it than any other beast or bird. The wild turkey has a more accurate eye than even the eagle, but

they’re still turkeys when it comes to the thinking part.



One of the most amazing things about the human eye is its ability to handle gradations in color and

brightness from bright sunlight to shadow without altering color values. It never has to change film or rely

on filters. In the brain’s visual cortex, areas that interpret the right eye are physically interwoven with the

left eye in natural patterns that resemble a fingerprint or a zebra’s stripes. Between our optics and the

interpretive ability of the various layers of the visual cortex, the human 3-dimensional-all-color-correcting

sense of sight is the number one picture show on earth. Still, it’s all in pulse codes. The cone cells in the

retina register blue, blue-green, and red light. The rod cells, used mainly for low light vision, register only

black and white. By a complex process known as color subtraction, not so complex as to prevent Polaroid

from working it into instant color film, those three colors do the same job as the three basic printing colors

red, blue, and yellow. The retina contains several levels of cells allowing it to distinguish not only colors,

but edges and movement as well. With the delicate muscles and the lens of the human eye to direct and

focus images on our retina, we have a natural grid to scan any visual image between the infrared and

ultraviolet ranges.

How subtle should we get? As it happens, each eye has about 120 million retinal cells. It’s difficult

to imagine a scanner operating at twelve million lines to the inch, but it’s what we have at our disposal. It’s a

pity to waste it on black-and-white type. Looking closely at color printing most of us can make out the

color dots at 120 lines to the inch. National Geographic likes to be special and prints at 180 lines to the

inch on their own presses. When we pass 600 lines to the inch the eye cannot tell printing from

photography. At twelve million lines to the inch we can’t tell the visual mosaic from reality. We will never

detect the color dots; the brain erases them before we see them. The patterns appearing in the mental theater

of the mind’s eye are seamless and totally believable. We think we see with our eyes, but it’s so much


Depending on whether a photon hits the rhodopsin, about twelve times a second each rod or cone cell reports

a zero or a one. As a result, every moment our retinal grids are broadcasting billions of bits of information

that shower down through three layers of interconnected neural networks like a galactic Pachinko game,


defining shapes, shades, and shadows. This results in some signal compression, but the two optic nerves criss-crossing through the optic chiasma and lateral geniculate body, back to the twin screens of the visual cortex, carry eight million fibers, each chattering away in strings of pulses up to a hundred times a second. This all happens before we see a thing. We watch films at twenty-four frames a second; at thirty

frames a second we watch television.

They make pictures move because our own visual images are also

produced frame-by-frame at the very back of the brain, but at about twelve per second.

Neural activity

then washes forward, picking up meaning and context from other brain structures downstream.

A stroke

here and the victim might see perfectly well but can’t make sense of it. Human sight is much more than a

cellular camera. In 1992, a research team from Fuji Photo Film created the first synthetic retina: a sixty-four-pixel grid of synthetic rhodopsin, a tenth of an inch square, that can sense basic movement. We have a

long way to go.
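The retina’s reporting scheme can be toy-modeled as sampling one “frame” of light levels into a grid of bits, each cell saying one or zero depending on whether enough light arrived that cycle. The grid size and threshold below are arbitrary illustrations, nowhere near 120 million cells:

```python
def sample_frame(light_levels, threshold=0.5):
    """Each 'cell' reports 1 if enough light arrived this cycle, else 0;
    the whole image leaves the grid as nothing but a pattern of bits."""
    return [[1 if level >= threshold else 0 for level in row]
            for row in light_levels]

frame = [[0.9, 0.2, 0.8],
         [0.1, 0.7, 0.3],
         [0.6, 0.4, 0.95]]
print(sample_frame(frame))   # [[1, 0, 1], [0, 1, 0], [1, 0, 1]]
```

Repeat this a dozen times a second per cell and the galactic Pachinko game of the real retina is just this process, at a scale of hundreds of millions of cells.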

Virtually Real

The term “virtual reality” in computer language describes the scenario perceived within a computer-

generated environment. Computer-created interactive environments have by now progressed to the point

where cutting-edge entrepreneurs are already promoting new dimensions in entertainment. Customers don

stereo-vision helmets, put on interactive gloves, clip on body-movement sensors and become part of the

scene they’re watching. For a dollar a minute they can actually “be” an interactive part of a complex computer game. It’s a karaoke sort of reality, but the kids love it. The U.S. Defense Advanced Research Projects Agency (ARPA) went one further, creating complete databases by obtaining interviews with every



single participant of certain Persian Gulf battles. The result is interactive group exercises involving dozens

of trainees in video helmets blasting their way through unreal encounters of the virtual kind. It seems to

train them just as well and saves a lot of ammunition.

Needless to say, aside from the ARPA-level boy toys, these “virtual realities” are a lot less than

believable. The concept has already been exploited beyond all technological boundaries in Hollywood films

and television series. Still, bearing in mind the rapid advancements in other areas of computer science, it is

not unreasonable to expect within a dozen years or so we’ll enter a booth, activate a wrap-around screen,

put on a pair of transducer gloves, sink into a senso-lounger and find ourselves in a jungle, on a beach or

trekking the surface of the moon. For a dollar a minute, we could live in a computer generated “virtual

reality” hunting tigers or romancing a movie star.

There are clearly a number of levels of reality at work in such a scenario. The first level, the

interactive scene, is the only one the player can and should perceive. Supposing, though, the player is a

software consultant who worked on the program.

She might know her virtual Tom Cruise can say lovely

things but can’t hum.

He might be programmed to play an instrument or even harmonize with the player

but only in major and minor keys. Limits like these would be challenged so rarely few would notice.

However, there are deeper limitations at work.

What’s the platform?

How does the program itself work? Does Tom run in UNIX or MS-DOS? The rules for UNIX and MS-DOS differ, but they limit

and structure the program itself. If a player slashing his way through a virtual jungle were a UNIX expert

he might wonder what language the program was written in. Unless he saw the program itself, however, he

would have no way of knowing. Some things can’t be found even if we know they must be there.

Underlying the program’s language are the elementary computer instructions, the microcode. The

basic computational environment of a digital computer is a binary reality. It is a quantum world of zeroes

and ones, the everlasting search for “signal” versus “noise,” “there” or “not there” of minute electrical

pulses streaming in their frantic missions through the murky chaotic static of an electromagnetic universe.

No matter how complex the data or program instruction, it is ultimately known to the computer as a series

of ones and zeros. “001001100111,” says the microcode. “Multiply value in memory location x by 2 and

store in location y,” says UNIX. “Horizontal pixel generator, intensity double, all edge-reference shapes in

“cloud” image bank, next scan,” says the program. On the visi-screen, wisps of fractal clouds drifting in front of the virtual moon on the binary beach flicker softly. So our programmer asks virtual Tom to kiss her. She knows he will. She wrote the subroutine.

Finally, there are ultimate physical limitations in the

nature of all computers underlying everything else. A value is either zero or it is one. There is no half-way

or “maybe.” There must be constant voltage and a working memory. The silicon, metal, and plastic

environment cannot be baked, burned, broken, steamed, shocked or boiled. If anything like that happens,

the computer simply won’t work at all and probably won’t ever again. Good-bye Tom, good-bye beach.

This illustration describes a series of interdependent invisible rule structures and logical systems

limiting all forms of virtual perception. This nesting of systems within systems to create the perception of

reality is as close to the meaning of the Sanskrit word “Dharma” as we in the West can get. An honest

philosophy of the mind cannot speculate beyond perception, which is itself dependent on a system. There is no

God operating the virtual world inside a computer. It works by a system starting with a one or a zero. A

similar hierarchy of rules is at work in the human brain at any time. They limit and structure a vast

interconnected biological environment, the unimaginably complex system required for the perception of

human consciousness. By observing some systems underlying the perception of consciousness, we may

begin to determine some basic rules behind all the others.

The physical ground rules of our consciousness are short, simple and absolutely certain. Our brain

requires oxygen and glucose and must eliminate waste toxins. This requires a rich circulatory system.

Every part has specific requirements and limitations. It can’t survive ten minutes without oxygen. A single blood vessel can clog or rupture for any reason. Irreparable parts could be gone forever in minutes.

We might never speak again. A serious glucose irregularity can kill in two hours, a threat too well known

to diabetics. Minor electrochemical irregularities can be fatal and anything interrupting blood flow will stop

everything. The result is always coma followed by death. These are unalterable operating rules. Nobody

has ever recovered from brain death. There are other limitations, however, that are not so obvious.

Warps in the Enchanted Weave

The brilliant neurologist Charles Sherrington often referred to the brain as the “enchanted loom,” as

it seemed to create without effort the seamless tapestry of mental experience. We have progressed beyond

learning the basic needs of the brain. We have reached the point where we can begin to describe both how

the loom works, and why it doesn’t work quite so well in some cases. Many insights gained this way have

only limited use. It’s true that we cannot see ultraviolet or hear much above twenty kilohertz but these are

not limitations affecting us. The unheard and the unseen have little effect on everyday life.

At the most

basic level, however, nobody is suggesting the brain is operating with anything but neurons. Whatever

consciousness is, we perceive it with nerve cells and not muscle fibers. We also know these nerve cells,

when excited biochemically, “fire” minute electrical pulses. Information is relayed this way from cell to cell

during normal brain activity. This is not a theological point or a philosophical conjecture. It’s a known fact.

Since the electrochemical pulse from an activated brain cell is the equivalent of “1,” while the latency

period without a pulse is the equivalent of a “0,” the most basic underlying operating imperative of our

perception would be the ability to sense the difference between the two. Consciousness itself seems to arise

from a sophisticated form of chaotic pattern recognition, and a pattern can only be defined by the use of

contrast. To that extent, brain cells and computer chips share a common reliance on comparative functions

to get the job done.

It’s the same signal-to-noise ratio, pulse or no-pulse, dot or dash, “there” or “not

there.” A neuron that couldn’t tell the difference would be as useless as a binary circuit that couldn’t tell

zero from one. This delicate separation between “this” and “that”, the distinction between subject and

background, ma and mu in Chinese philosophy or the Tao’s yin and yang, might seem impossible in such

a mass of ongoing activity as the human brain. Still, it forms the foundations of all our conscious thought

and perception.
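Contrast as the basis of pattern can be shown in a line of code: a toy detector that marks a one only where adjacent samples differ. A flat field carries no pattern at all; the boundary between subject and background is the only thing that registers:

```python
def contrast_edges(signal):
    """Mark a 1 wherever two adjacent samples differ, a 0 where they
    do not: pattern exists only where there is contrast."""
    return [1 if a != b else 0 for a, b in zip(signal, signal[1:])]

print(contrast_edges([0, 0, 0, 1, 1, 0]))   # [0, 0, 1, 0, 1]
print(contrast_edges([7, 7, 7, 7]))         # [0, 0, 0] -- uniform, no pattern
```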

At the surface level of awareness, this basic functional operative is almost completely invisible. Like

the computer’s program, it does not affect the colors of the day or the thoughts of our mind. Only when we

try to think about something that a pulse-based consciousness cannot compute do we get into any trouble.

Usually when we try to do this it’s either difficult or disturbing, almost as if there were something wrong

with our mental focus button. Try, for example, to picture “forever.” Non-comparatives just won’t operate

in our comparative cognitive environment. We know what the word means, but we can’t access a mental

description for non-comparatives as we can with mental images we acquire from experience or conjecture.

In fact we can’t even “think” about anything we can’t compare. Just try it.

“Forever” is just one example. Human consciousness is perceived through neural communications in

which everything depends on the presence or absence of voltage potentials. Since our method of cognition is

comparison, we can’t communicate about any non-comparative states at all. We can’t really describe

“perfect” any better than we can paint “never”. It’s a basic problem with information being passed around

in a pulse form. This is not to say that we cannot have experiences or feelings in which such non-comparative mental states are momentarily present; just that we cannot describe or articulate them. The events in which non-comparative perceptions take place are generally intensely emotional personal or

religious experiences.

Nobody would suggest we’re thinking clearly when we’re overcome with emotion.

Emotions, true to their hormonal origin (see Chapter 7), feel to us like moving wave phenomena while



mental reflection and cognition behave more digitally. Our neural electrochemistry embraces both levels but

we “perceive” in ways we can discuss with other people and “experience” states that we cannot ever really

communicate in a rational or reflective manner.

Perhaps, then, we cannot know the nature of God simply because neuron-based brains can’t handle

“infinite” at all? Maybe in early infancy, but our brain grew up. From an evolutionary point of view this

makes very good sense as we don’t encounter many incomparable beings in our lifetime.

It also explains

why it has been so difficult to communicate with the Divine, or at least why it might be hard for normal

adults. Even if we could experience perfection, we couldn’t describe it to others without seeming irrational.

The incomparable can happen but it won’t compute, just as we can “know” and “experience” things we

can’t think about rationally or ever describe in words.

This is one example of the perspective required if

we are going to interrelate the truth of science and the truth of religion. There is no scientific problem with

saying “The perfection of God is hidden from the understanding of man” because, neurologically speaking,

the mature human brain can’t really mentally image a “perfect” state whatever it is. It is an inherent built-in

design limitation of our method of perceiving consciousness and we wouldn’t be humans without it. It

doesn’t really matter. We are making it through a complex world every day and it’s a blessing

consciousness does as well as it does even if we can’t see the ultraviolet or describe the transcendental.

Maintaining a flexible viewpoint, however, can provide space for both the “experience” of the divine from a

personal point of view for those who have had such experiences and can’t deny them, and for scientific

explanation as well. God alone could divine the basis and method of divine perception. The basis of human

perception remains locked into the neuron-dharma, systems within systems of brain cells that pulse or don’t

pulse, require sugar and oxygen to survive, and which cannot be damaged or starved or they will die, and

incidentally take us with them.

This perspective on the nature of consciousness goes beyond detailing the biological limits of the

human brain. It is equally provocative to both religion and science. On one hand, it may dim the attraction

of seeking perfection to realize that we can’t describe it in any detail using the human mental system.

Clearly, if nobody can adequately describe it, we certainly couldn’t tell anyone what to look for. It seems

we may have to “know” it when and if we find it because it may exist only in the realm of experience,

unrecognizable to anyone but ourselves and dependent on the time in our life as well as the space we were

in at the given moment. Still, science fares no better. If we deprive the brain of oxygen and witness our life

flash before us are we in another time and space, or are we in brain failure? How can we ever know

anything for certain unless our “knower” is standardized so we can be sure it’s working all right today?

We know all human brains differ slightly from each other.

Even worse, at the molecular level, every

thought affects our brain a tiny bit. Werner Heisenberg’s classic uncertainty principle states we can’t find

anything without pushing it a little with whatever we use to find it, even a photon from a flashlight. We

never really locate anything quite exactly because we just moved it by finding it. If we can’t think about

anything without modifying a few thousand neurons each time, what does that have to say, ultimately,

about any search for ultimate answers? Won’t the questions change as the questioner’s mind changes in

the process of working out answers? Or is our mental journey the answer itself?

Questions like this are bound to arise as we start to explore some of the operational aspects of our

biological lens of perception. Just as the incomparable is unthinkable, our sense of time itself is probably a

fairly recently evolved capability. If we could neither recall the past nor project the future in any detail we

might never wonder what happens after death until it was far too late. In fact it seems highly likely that

early humans couldn’t even conceptualize most, if not all, of the hard questions requiring religious answers.

Recent discoveries indicate that the ability to sequence time, generate abstractions, and understand speech



all require structures evolved within a very recent time frame. The first humans with larynxes like ours, for

instance, didn’t even appear until after 150,000 BCE.

For Adam to hear God’s commands, or speak with Eve, he needed a well-developed speech cortex.

Clearly, any historic Eden had to appear at least past that point in brain evolution. As far as religions are

concerned, not one is more than 5,000 years old. It is a rather recent phenomenon, just as we ourselves are.

In fact, without many recently evolved neurological capabilities, we would not have had the consciousness

to know either natural law or divine intent.

And even if we did, we could not have written it, read it nor

spoken about it to anyone else. If there were any religions on earth before we developed speech, certainly

no one ever mentioned it.

A Colorful Line of Thought: Synthesis and Sunsets

After much heavy reading, it’s time to unwind with our inner vision, our own imagination.


chapter started with sight. It ends with a sunset. We’re are seated on a bluff overlooking a rocky California

beach a little north of Santa Cruz, looking out over the Pacific. It’s a warm Sunday afternoon, the last part

of the day, with the sun low on the horizon. The day was hotter than we’d expected; we notice a slight sunburn on the neck. The warmth lingers, but the breeze is picking up. With the day cooling off it’s

time to just relax, sit on the grass and watch the sun go down.

There were showers in the afternoon and a last gathering of dark clouds is scudding slowly off to

the west. Blocking the sun, they let its dying rays pierce through, here and there, as it sinks toward the sea.

Then, for a moment, the lower edge of the sun begins to drop slowly from the bottom of the lowest cloud.

Glowing at the edge of the sea, it suddenly brightens, bathing the bluffs and the waving sea grasses in that

unique horizontal yellow light that we all have seen when the heavens are gray and the sun is blazing out

from the horizon. The world is suddenly magical and glowing.

For a moment the sun rests there, suspended, glowing in deep oranges, and then slowly sinks into the

sea. Waves hiss up the sand as twilight descends, the pink cotton candy clouds rolling to magenta and

fading in gentle deep purples. Shadows begin to wrap the rocks in deepening darkness, while the silver slice

of a crescent moon, shining against the cobalt blue sky, begins its climb towards an evening star. The

breeze is getting a little chilly now. It’s time to get up and head back to the house, the windows alight from

inside, glowing against the last twilight of a soft evening as the night slowly cloaks the shore.

The brain remains in silence and in darkness. Sixteen million fibers are pouring cataracts of information

over an infinite grid as our mind fills with the sunset, and we are surrounded by it in all ways. We can

never be aware of those billions and trillions of ones and zeroes; we can never hope to see them although

they outnumber the stars in the sky. It all happens so fast and so neatly that all we see is the sunset, and

only that sunset, in a depth and color possible only for our human eyes to perceive and a beauty only a

human mind could know.


Time and Memory

Evolution and Chronology

Have you ever noticed how,
The time is always now,
Whether you’re a brown-eyed cow,
Or a Mau-Mau,
Or an owl?
— Michael Bridge

“Where did we all come from?” “What is it all about?” “Where are we all going?” If our living

brain can fashion masterpieces like sunsets in total darkness with voltage potentials, why can’t it come up

with those answers?

If we must solve this puzzle ourselves, a good place to start would be noting any

possible similarities in the questions. From a neurological perspective there is a characteristic common to

all three questions. It stands out immediately if we think about it. They can be asked only by a creature

with a consciousness based on chronological, past-present-future sequential time.

This sort of thinking

doesn’t include humans until well after the age of two and would eliminate the rest of the world as well.

The concept of anything going anywhere in time didn’t figure into our own life for a couple of years at

least. For everything else on this planet, things just are or they aren’t.

We tend to think all conscious creatures think in roughly the same manner we do, but conscious chronology is something modern humans learned quite recently and share with no other species on the planet.

The final step in human mental evolution was learning to structure time, and only recently have we learned

how we do this amazing feat, the final gift that made us mindful.

In learning the mechanism of conscious

sequential memory, we also establish its evolutionary history.

Evidence is gathering which indicates that

our sense of chronological time originates with specific brain structures located in the prefrontal cortex.

This area of the brain is very recently evolved in humans and matures well after birth. Jean Piaget, a

pioneer in child psychology, was one of the earliest to observe and describe the stage in brain development

when a child, watching a toy train enter a tunnel, instinctively glances forward to await its exit from the

other end. Before that point, as soon as the train is out of sight, it’s out of mind. Here and gone. The child

immediately loses interest. The toy train’s re-appearance seconds later is unexpected and surprising.



The child’s train of thought derailed when the actual train disappeared from view. The

mature ability to imagine without seeing, in this case tracking an imaginary train traveling out of sight in

the tunnel, was called “object permanence” by Piaget and it appears by degrees.

The prefrontal cortex is the last brain structure to mature so we must slide into conscious

timekeeping past the age of speech. If we can’t chronologically sequence our memory, the typically

childlike perspective of constant novelty is simply unavoidable. By the age of four, however, we are clearly

experiencing time in a sequential, three-dimensional framework. Gradually we learn to take such a world

for granted, sequencing perception into memory and perceiving time as moving forward. This may be how it has to be. The brilliant mathematician Norbert Wiener, whose concepts made the modern computer possible, made this

point in his seminal work Cybernetics. He reasoned if time were to suddenly shift into reverse, causing the

planets to circle backwards in their orbits, a space traveler arriving on the scene could detect no difference

at a planetary level. Time might well run in both directions. However, it would be impossible for forward-

time people to perceive a backward-time universe. For one thing, any stars going backwards in time would

be pulling in light, not pouring it out. Wiener pointed out that it is impossible to see such stars, given the

way human eyes work. Any communication would likewise be impossible, since conclusions would appear first,

disassembling into meaningless parts as time receded. He concluded the only sure thing we can say about

any universe we observe is that it obeys the same laws of thermodynamics we do. If we can’t detect it, how

can we really know? Perhaps “black holes” are receding suns of other times illuminating planets traveling

backwards towards the past. With us, however, time seems to move forward even though we’re always in

the present.

Classical Greeks had two words for time, kairos and chronos. Kairos is “just in time” or “at the

right time” or “the time of your life”. It is personified by a little god in a moving chariot or even,

sometimes, a god on wheels. Chronos is sequential time, time we perceive as passing, the chronology we

use whenever we remember or predict. Once we sense chronos, we know it’s all going to be over some day.

Little wonder Chronos was conflated with the titan Kronos, depicted as a fearsome ancient Father Time, a huge, bearded giant who consumed his own children. In India, the Sanskrit kala, time, is reborn in Kali. The

fearsome black (kalo) goddess is garlanded with skulls, drinks blood, conquers all and, just like time, gets

everyone in the end. Indian parents name their little girls Durga, Lakshmi, Tara, all the great goddesses but

never Kali, most powerful of all. Only Shiva, timeless Shiva, could ever love Kali.

With the rest of the world remaining entirely in kairos, the last evolution of the human brain placed

us as a species on a chronological escalator traveling from the past to the future. When we began to



transform images from past presents into future predictions we found ourselves confronted with multiple

mental problems arising from this new form of conscious time keeping. If our perception of chronology

turns out to be a specific evolutionary step, it could expand our understanding of the mechanisms of human

perception. At some point we obviously acquired what may be a unique talent and with it the host of time-

related mental problems which plague us all. The moment we are able to sense the passage of time, we

never seem to have enough of it. Where did time appear? Where was the branching off that set us on the

road to the mind of mankind?

Clear Memories from Chaotic Patterns

It’s not easy to cook up chronological time with a human brain.

Norman Ramsey, Nobel Prize

winner in physics for his development of the atomic clock, points out that time is measured by periodicity. From

sunrise and seasons to the picosecond vibrations of cesium atoms in Ramsey’s timekeepers, it is regular

repetition that sets the foundation for the measurement of time. Unfortunately, measurement and perception

of time stay linked only if the observer has no periodicity of its own, or at least one that remains constant. For

example, the brain normally runs data-gathering at the same speed as interpretation. Still, a major problem

with using a biological basis for time perception is that neurons simply aren’t as tough as silicon. Things

happen. The body’s response to a very stressful event is always an immediate release of powerful

hormones. One effect is to speed up neural activity dramatically in certain higher brain areas. It makes

excellent sense to increase the intake speed of perception when something exciting or dangerous may be

happening. We need all the information we can get and we need it fast. Neural firing can increase by three

hundred percent.

This creates problems for consciousness. By gulping down information faster we make the world

seem to slow down. This effect, described in greater detail in Chapter Nine, is analogous to speeding up a



movie camera to create the illusion of slow motion. When parts of the brain get out of synchrony, all sorts

of weird things start happening. Since this indicates the perception of time must vary according to the

situation, it could give Einstein a headache. Is the speed of light really invariant if an observer’s

perception of time itself speeds up and slows down from time to time? If time and space are interdependent,

but time is variable relative to the observer, is space also then a variable? Could we experience infinite

space during a moment of timelessness?
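The camera analogy above can be made concrete with a toy calculation. This is a rough illustrative sketch, not a model from the text; the function name and the frame-rate figures are assumptions chosen only to show the ratio.

```python
# Toy sketch of the overcranked-camera analogy: if perception
# "samples" faster under stress but experience is interpreted at
# the normal rate, events seem stretched out. Numbers illustrative.
def perceived_slowdown(normal_rate_hz, stressed_rate_hz):
    """Stretch factor when intake speeds up but playback does not."""
    return stressed_rate_hz / normal_rate_hz

# The text notes neural firing can roughly triple under stress.
factor = perceived_slowdown(normal_rate_hz=24, stressed_rate_hz=72)
print(factor)  # 3.0: one real second feels like three
```

A camera cranked to three times normal speed and played back normally produces the same threefold stretch, which is why the slow-motion analogy works.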

Putting aside such speculation for the moment, it appears there are two fundamental processes basic

to our perception of time. The first has to do with serial storage of neural based patterns. The second has

to do with the level of detail we can recall. Once again, we can draw useful images from the field of

information sciences. The ways and means of memory are absolutely essential to the science of data

processing. In fact, the computer itself was originally defined as a device able to instruct itself from

an internally stored memory. The better the memory, the more complex the instructions and the functions of

the computer can be. As to how patterns are created in the brain, we simply cannot do anything in a

physical environment without leaving some sort of physical trace. Total and complete disappearing acts

happen only in imaginary places. Every time any neuron sums and fires, something complex happens in a

physical environment. There must be traces left behind, both in physical molecular changes and in the

electrical conductivity at each of the connections. This is covered in greater detail in Chapter 8.

Remember, the average brain has on the order of a hundred billion neurons. Using less than one percent of its capacity for anything from soup to nuts means at least a hundred million neurons in action. If an average neuron has about 200 dendrites and about 200 axonal outputs, a pulse down just one arborized axon tree would leave traces in 200 other neurons connected to that one cell. Adding the other 99,999,999 neurons gives us numbers like “the number of stars in a galaxy”.
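The arithmetic is worth making explicit. A minimal back-of-envelope sketch, using the rough figures given above (the figures are the text’s estimates, not measurements):

```python
# Back-of-envelope fan-out estimate using the rough figures above.
active_neurons = 100_000_000        # ~0.1% of the brain in action
connections_per_neuron = 200        # rough average from the text

# Each firing neuron leaves a trace at every connection it touches.
traces_per_moment = active_neurons * connections_per_neuron
print(f"{traces_per_moment:.0e} synaptic traces")  # prints "2e+10 synaptic traces"
```

Twenty billion traces per moment is indeed galaxy-of-stars territory.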

This is what happens the moment we wake up and smell

the coffee. Smelling a cup of coffee, in fact, calls into action a full chorus of cellular choreographies, all

Time and Memory


interpreted through our personal past experience. Neurobiologist Walter J. Freeman of the University of

California at Berkeley describes the process: “When an animal or a person sniffs an odorant, molecules of

the scent are captured by a few of the immense number of receptor neurons in the nasal passages. Cells that

become excited fire pulses through their axons to the olfactory bulb.”

“The bulb analyzes each input pattern and then synthesizes its own messages, which it transmits via

axons to the olfactory cortex. From there, new signals are sent to many parts of the brain, including the

entorhinal cortex where the signals are combined with those from the other senses. The result is a meaning-

laden perception, a gestalt, that is unique to each individual. For a dog, the scent of a fox may carry the

memory of food and the expectation of a meal. For a rabbit, the same scent may arouse memories of a

chase, and fear of attack.”

An original perception creates an instant temporary network of electrochemical trails as the pulses

proliferate outward through millions of interconnected neurons. Like an after-image of a brilliant fireworks

display, the remnants of the event remain in innumerable synaptic and intercellular changes created when

the energy came coursing through.

If we could recreate that electrochemical tapestry perfectly we would

not just remember experiences. We would relive them completely. It would be a perfect replay of the entire

virtual moment we once experienced. Normal recall, in comparison, recreates the image or thought with a

very incomplete pattern.

If simple microscopic dots on a plastic laser disc provide CD-quality music, we might wonder why we can’t simply play back our past. The reason is that the neurological “afterimage” is not only imprecise, it

serves only as a physical foundation for even more subtle patterns in a moving electrochemical presence

weaving and pulsing throughout the brain, invisible except as a mathematical event. The brain is alive and

thoroughly interconnected. The ongoing process of consciousness must be chaotic in the extreme. Still, as

a pole hammered into a stream will have an effect on the entire flow of water passing around it, any



physical change, no matter how small, will affect neural flow patterns in definable ways. This is why every

thought must change the flow of consciousness a tiny amount. Nothing remains unaffected. The only thing

that remains constant is the flow itself, which may be comforting to Taoists and Buddhists who regard the

pure undefined mind as the only thing that doesn’t change.

It is this chaotic activity which makes it so difficult to compare the brain and a computer. Not only is

everything alive, the working activity of the brain is simply too complex, on the grandest of scales, to be

comprehensible to us. Playing back a specific picture would be as hard as extracting the 9th Symphony from

Niagara Falls. With the right microphones and filters one might pull a little Beethoven from the totally

random “white noise”, but how can we get patterns good enough to use in any sort of comparative

chronology out of such confusion?

The closest we can come to pattern recognition in a chaotic

environment unfortunately requires capabilities that we ourselves cannot mentally image any more than we

can see “red receding”. Although chaos may blur signal and noise to otherwise indefinable levels of

confusion, patterns as pretty and defined as snowflakes can be located within these environments. The

problem is that these patterns can be detected only through a sophisticated mathematical structure called

“quantum time”. Since there is no way in the world a human brain can think in quantum time, there may be

no final Rosetta stone of consciousness. We may never witness the subtle electrochemical patterns rippling

and shifting through neural channels and biochemical bridges. The final language of the mind may be

indefinable because it may be undetectable. We may indeed create time with huge holographic chaotic

memory patterns, but we may never locate them regardless of how hard we try.

Nobel Laureate Francis Crick, co-discoverer of the structure of DNA, recently turned his genius to the mind. Among his conclusions was that consciousness is timed, parceled into discrete frames by regular neural pulses.

Still, neither he nor his colleagues at the Salk Institute have suggested any way we could comprehend human consciousness or discuss it from a critical perspective. To be conscious of consciousness and still discuss it at the same time would require a mind one step more complex than our

own. We can imagine only what the mental lens of our mind can resolve and this may be beyond mental

resolution. Anyone who could know it or explain it completely would be operating with the equivalent of a

neural upgrade and probably wouldn’t be normal enough to make sense to us. But just because we cannot

look at consciousness with science does not mean that it does not exist.

The Indian philosopher

Chandrakirti insisted on the distinction between “that which cannot be found” and “that which does not exist”. Consciousness certainly exists but we have to be flexible about locating it. Take angels, for

instance. Perhaps angels learned to avoid telescopes or maybe they just left town when Copernicus closed

Ptolemy’s crystal condos.

Who knows?

It doesn’t mean they absolutely don’t exist, just that there’s no place where

we can find them.

Likewise consciousness is there but we may never be able to understand it in our own terms.

Except in rare neurological or dream events, then, memory is rarely replayed. Furthermore, those

neural patterns not reinforced by repetition or remembrance lose their linkages eventually and dissolve into

the unconscious. Each moment was in sharp focus when it happened. It was all there once, senses on-line,

but unless we replay sequences over and over again in our memory the patterns fade and are soon lost to

conscious recall. Studies by James Krueger of the University of Tennessee demonstrated that the brain releases special substances, cytokines, during sleep. These induce special firing patterns among various neuron groups. Krueger suspects that, since even major connections are not used every day, this may exercise them

during sleep to help preserve important associations and connections for future use. Still, all but the most

extended and repeated networks lose definition over time, and fade from conscious memory. Every instant

left a true record once but depending on the intensity of the event, the attention we gave it, and the number

of times we recall it, stray currents soon wash the labels off most of our files. They fragment, and become

lost in our warehouse of forgetfulness where they fall apart and compost into the general unconscious



image archive we use for imagination and dreams.

Hindus say that over endless lifetimes, the effects of

karma, intentional activity, eventually vanish. In the endless unconscious of forgetfulness we forgive all our

debts, and there all our trespasses will likewise be forgiven and forgotten.
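One way to picture the fading described above is a simple decay-with-rehearsal model. This is purely illustrative: the function, the daily decay rate, and the recall boost are invented for the sketch, not measured values.

```python
# Purely illustrative decay-with-rehearsal model: traces weaken a
# little every day unless recall reinforces them. Rates are invented.
def trace_strength(initial, days, recalls, decay=0.95, boost=1.5):
    """Relative strength after `days` of fading and `recalls` replays."""
    strength = initial * decay ** days
    return min(1.0, strength * boost ** recalls)

never_recalled = trace_strength(1.0, days=90, recalls=0)
often_recalled = trace_strength(1.0, days=90, recalls=8)
print(f"{never_recalled:.3f} vs {often_recalled:.3f}")
```

Whatever the true rates, the qualitative point stands: a trace left alone for months all but vanishes, while one replayed even a handful of times stays well above the threshold of recall.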

This is actually a blessing. If our memory remained conscious, we’d be constantly distracted. We are

actually living in the present so it is better not to be dealing with too many after-images on the screen of our

conscious perception. Asked about his reputed power to recall his previous lives, the Dalai Lama answered

it was more important to be attentive to the present. His difficulty remembering previous lives didn’t bother

him. “I know people who cannot remember what they did last year,” he added with a chuckle, “so not remembering an entire lifetime ago is not such a concern to me.” In the United States there was a flurry of

court cases involving “recovered memories” as the basis for the prosecution of accused sex abusers.

Without a doubt such tragedies occur, but as there is no way a child before the age of three can recall anything in sequence, it is equally impossible that any memory that old could survive the normal distortions

of organic degradation. We don’t have hard drives in our head and there’s no “hard memory” either. If we

forget it, it’s never coming back clean. If it happened before we were three, we can’t possibly remember it

in any reflective context.

It might seem memory networks should interfere with the working of our brain cells, but it’s unlikely. The physical

complexity of such neural impressions could easily encode vast amounts of specific data without altering

the ongoing functioning of any neuron. Like decals on a racing car, tiny molecular changes won’t affect

speed or performance. Each neuron, depending on its past, carries modifications enabling it to function as

part of numerous interlinked patterns, each appropriate to the moment. No single cell has to do very much

or know anything at all. In a sports stadium display of rippling squares, each participant has only a few

pieces of colored cardboard, a tiny part of the whole design. The images are never visible to the people

actually creating them, just like neurons, each a little living voice helping sustain and modify the immense



energetic chorus of patterns flowing around them and through them and beyond them. Day and night, each

neuron does its digital duties. At the same time the remnants of experience, molecular reminders of our

past, are modifying complex and interconnected patterns throughout the brain. Some networks fade. Others

are reinforced and extended by repetition. Some interconnect to long-lost pattern trails, merging and

creating new combinations energetic enough to surface as sudden thoughts, ideas, insights and intuition.

There are unconscious and intricate muscle routines circulating in the cerebellum. Visual patterns lie

dormant in the visual cortex, emotional patterns in the limbic system and verbal patterns in our speech

centers. Once we wake to the awareness of our own memories, we take it for granted.

But clear

chronological memory itself was a hard blessing, and it came together over a very long time. For that, we

needed the largest brain on earth, and we needed one final tune up.

Banking on Brain Mass

There is no specific location for a complete memory in the brain. Conscious recall is as dynamic as

the brain that is making it. Although the ancient hippocampus seems to have a central organizing role and

the prefrontal areas store, scan and serialize, memory still remains predominantly a function of amount,

complexity and organization of available brain mass. Regard the caterpillar, for instance. It hatches, eats

leaves, spins a cocoon and turns into a moth if it’s lucky. Its nerve cells are much less complex than a

human’s, and it may have only a few dozen interconnections for each. But still, it has over 350,000 of

them. It needs every last one to operate the 200 muscles it uses to chew leaves, and that’s just for starters.

The computational power of a caterpillar “brain” soars above any supercomputers we have devised.

Computer-controlled assembly lines fabricate and weld automobiles with a dozen or so process computers.

The very idea of 350,000 little living processors packed into one part of an insect is awesome. If computers

reach such complexity they’ll probably be able to crawl about and turn into tiny helicopters. Maybe they’ll



lay eggs too. At this basic level of neural mass and complexity there is enough internal memory to run the

caterpillar, but it’s all used up operating the insect. The surest test we have for conscious memory is

learned alternatives and insects learn practically nothing. The longest memory observed for an insect so far

is about fifteen minutes for the scarab, or dung beetle. It actually remembers to feed its young. We can

employ operant conditioning using basic stimuli like electric shocks or chemicals and get responses from

flatworms, but reacting to pheromones is not learning.

If only they had the minds to appreciate it, insects

could be ideal Zen monks. Always in the now, and always in the flow, they expect nothing because nothing

happens more than once. A brain that can’t recall can’t predict. No crises, no surprises. Just processes. So

it goes, forever.

As life forms evolved into greater complexity their brains grew to handle additional tasks of

monitoring and control. As the brain grew, memory grew until it could store enough acquired information

to help guide an animal’s activity from one moment to the next. Fish and lizards are difficult to train but by

the time a brain reaches 20 cubic centimeters there is real learning ability. This is sufficient brain mass to

retain experience and initiate vague forms of purposeful repetition to improve interaction with the

environment. Complex learning appears after the reptiles. By the time the brain has grown to 150 cc’s, an

average dog, there is excellent perception and memory enough to learn, recall, anticipate, and dream. There

isn’t enough capacity in 150 cc’s, however, for intellectual discrimination, philosophical meaning, specific

self-consciousness, or even three-dimensional color vision. Memory must be reconstructed from complex

stored patterns.

Most creatures simply have neither the capacity to store much peripheral detail nor our

ability to sequence images into a chronology. Without a chronological consciousness, animals cannot make

any conscious plans. Even chimpanzees, at 300 cc’s the smartest of the smart, never planted a garden. The

immediate future is all they have in focus. If they could recall even a few seasons in sequence they could

remember the progression of seed to fruit and grow their own. They never have.



Self-consciousness is also limited by memory. We must be aware of the differences which help us

distinguish ourselves from each other. We know ourselves clearly only to the extent that we remember and

understand past experiences that affected us and formed our personality. Conscious and unconscious

memories underlie all our likes, dislikes, hopes and fears. As our sense of self is dependent on the detail and

subtlety of our recall, the better memory we have, the more self-conscious we can be. Animals, for all their

variegated plumage and behavior, are remarkably similar to each other. If dogs had personality traits as

complex as humans’ they wouldn’t need their noses to greet each other. Reptiles are so lacking in observable

personality their manners are truly reptilian. They never say they’re sorry. They can’t. Remorse takes a lot

of RAM and snake brains simply can’t run complex routines.

It is possible, however, to shame a spaniel and one can actually embarrass gorillas and other great

apes. Bigger brains do more than swell heads. They allow development of complex personal and social

structures. In comparison an insect has no hopes, biases, or conscious predispositions at all. It never

blames, never criticizes and it never complains. There is no self, no self-consciousness, no memory and no

meaning. It means more to nearly any observer than it can to itself.

Its parts are busy operating at full

capacity getting the job done with a mere cubic centimeter of brain matter. There is no recall. There is no

time for recall. Without recall there is no time, no beginnings and no endings. The silkworm mechanically

pulps some mulberry leaf. A bird overhead sees the silkworm and recalls a meal. A human notices the bird,

and predicting what birds will do to silk worms, shoos the bird away. The silkworm mechanically pulps

some more of the mulberry leaf, a living fiber manufacturing plant. No time for silly things. No time at all.

It pulps some more mulberry leaf. No time like the present; no memory of the past, no hope for a future.

Without a thought, the silkworm munches on.

Overall increase in brain mass provides room for better memory and a more refined consciousness.

However, it should be stressed evolution never equates sheer bulk with intrinsic value. If this were the case,



we would all be under the rule of blue whales. Elephants have much larger brains than humans but they’re

still using their noses for hoses and working for peanuts. In behavioral terms, a whale is essentially a very large seal, and a very big seal isn’t more complex than a small one, just more seal. There is a lot of whale to operate, which increases the mass of the brain since more body

cells need management. Whale learning has been observed and shown to be at “seal level”, the aquatic

equal of a smart dog.

This is enough memory for a sperm whale to dive down to where it expects to meet a giant squid for

takeout. The giant squid, with less memory than a paper clip, never expected anything in its life, far less a

large whale in the way. No matter how many pounds of neurons a giant squid was born with, if they’re

squid neurons it’s going to be squid smart and no more. Squid nerves are like cables, so big they’re visible.

Compared to that level of simple consciousness even fish are savants. In this world, it seems, any species

that can’t remember will sooner or later serve as dinner for the rest.

Upgrading to Primate

It’s been accepted since Darwin that survival often requires novel adaptation to a new environment.

The last great evolutionary surge in brain development began taking place about sixty million years ago

when some daredevil mammals got tired of being chased up trees and decided to stay there. If we are going

to spend a lot of time jumping from limb to limb with a small bobcat after our tail, things will definitely

need improvement to avoid becoming cat dinners. First, it’s important to shift the eyes to the front, like a

cat, for the three-dimensional depth perception necessary for judging where that limb really is. Second,

good color vision is also at a premium. It helps to get the live branch rather than the dead one. It’s a long

drop if we can’t tell murky brown from murky green. Both aid in finding fruit and catching insects,

increasing food supply as well as safety.



The brain’s visual area was pressured to expand as so much visual data had to be analyzed for

trajectories. Most animals rely on their noses, but we can’t smell our way across thin air. A little off the

jump, a little off on the grab, and it’s one more pre-tenderized tiger lunch on the jungle floor. Dealing with

gravity in high places is high risk for anything weighing more than a bug that doesn’t have wings to flap. It

must have rained animals until the forebrain evolved enough mass and complexity to transform kamikaze

marmosets into decent monkeys. In the process, the cerebellum doubled in size as well, accepting new

specific and more discriminating control from the higher forebrain areas and creating a much finer tuned

muscle response.

Luckily for lemurs and eventually for us, adding brain tissue is a simple genetic adjustment. At the

fetal stage human brain tissue is so undifferentiated it can be clipped and transferred to other patients like living tofu. It grows right in. “More brain tissue” is easy evolution. It’s much simpler than adding wings

or claws. It can also happen much faster than we once thought. Anthropologist Katharine Milton, of the University of California at Berkeley, studied two kinds of monkey and made a surprising observation.

Monkeys have a basic diet of fruit and leaves. Fruits provide high energy but are low in vitamins. Leaves

are high in vitamins but require long digestive tracts to process them. As a result, a monkey with a shorter

digestive tract must eat more fruit to make up for its inability to extract nutrients from leaves. Since most

trees bear fruit for only a part of the year, even in equatorial climates, a fruit-eating monkey needs a

complex feeding strategy full of searching and returning. On the other hand it doesn’t take any recollection

at all to locate leaves in a jungle. Spider and howler monkeys are about the same size but spider monkeys

are fruit hunters while howlers are leaf munchers.

Although the two species weigh about the same, spider monkeys are carrying around brains nearly twice as large as the howlers’. Not becoming food may have driven

us into the trees, but once we went arboreal, locating food was the next environmental pressure for a larger



brain. The extreme difference in brain size between two species of modern monkey shows how quickly the

brain can grow if certain conditions and diet are present.

One of our most distinctive evolutionary steps was the rapid development of the forebrain’s

specialized ability to recall and redirect complex muscle sequences. Although this originally evolved to

allow us to perform repetitive tasks without having to re-learn them each time, our enlarged visual

memories became adept at storing visual cues occurring during the learning experiences. If we find a good

fruit tree, we don’t want to forget the way back.

A simple mindless playback may be easy for bees, but their memory chip is so tiny it only holds the most recent trip, and it’s forgotten a few minutes later. Most primates, once mature, have much longer memories, and they all seem to focus them the same way we do, with the prefrontal cortex.

When a monkey starts to learn a task, the greater amount of brain activity takes place in structures

linked directly to the event, the visual and motor control areas. When the activity is repeated, however, the

forebrain becomes the more active area. Once a task is learned the primate forebrain seems to sequence and

trigger behavioral routines far more complex than in other species. Just like human subjects who activate

word searches from the forebrain when retrieving verbal information, this sequential searching ability may

be related to the necessity to recall, replay, and modify a repeated sequence of muscle patterns.

This is

exactly what comes into play when we learn how to play a piano, dance a tango, or take a flying leap to a

swaying branch in the treetops. It was essential for grabbing swinging vines. If we jump to where it is right

now, we die. If we jump to where it will be, we live. The brain must unconsciously calculate a future event

and sequentially fire off millions of muscle movements in order to get us there. From a computational

viewpoint, it’s calculating: “Vine there now, vine moving this fast, will be about there if I jump NOW! Got

it!” Prediction can be a life saver even at the unconscious level.
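The unconscious calculation in that inner monologue is simple linear prediction: aim at where the target will be, not where it is. A minimal sketch, with hypothetical names and numbers (none of which come from the text):

```python
# Toy sketch of the "jump to where the vine will be" calculation:
# linear prediction of a moving target. Values are illustrative.
def predicted_position(position, velocity, flight_time):
    """Where the vine will be after a leap lasting flight_time seconds."""
    return position + velocity * flight_time

vine_now = 3.0        # meters away
vine_speed = -0.8     # meters/second, swinging toward us
leap_time = 0.5       # seconds spent in the air

aim_point = predicted_position(vine_now, vine_speed, leap_time)
print(aim_point)      # aim about 2.6 m out, not the 3.0 m where it is now
```

Jumping to `vine_now` misses; jumping to `aim_point` connects. The brain does the equivalent, millions of muscle commands at a time, without a single conscious thought.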



By the time apes had their aerial acts perfected they were using the most complex muscle sequences

on earth. Since they hadn’t grown any new organs there were few changes in the operational parts of the

brain that manage body functions. It was the finely tuned higher brain areas which evolved. Sequential

pattern comparison in color and three dimensions requires giga-giga-terabytes of operational memory. In response, our visual and discriminatory areas added large amounts of new mass and specialization. The

brain of the porpoise may be as complex as man’s, but its complexities are more associated with hearing

than sight. Primate brains are primarily visual: monkey see, monkey do. And we are so smooth at it. No

other creatures are better at strong, controlled, and yet delicate movement. Cats couldn’t dance if they tried.

It took all of evolutionary history to reach the mass and complexity of the original primate brain forty

million years ago. Since then there have been dozens of diversions from the original line. Some adapted into

gorillas, chimpanzees, orangutans, monkeys and baboons; from ground dwellers to neo-arboreal apes.

Most of them get through life with a combination of wits, claws, fangs and muscle. Some lines opted for

miniaturization and speed, becoming old-world monkeys and remaining in the trees. Our branch of the

family, Homo, out on the African grasslands, bet on brains. To make it possible, they started to eat better.

To be precise, they began to eat more meat. It’s now well known that chimpanzees, given the opportunity, will

readily kill and eat small animals. It is more likely our earliest ancestors were not strategic hunters but

scavengers, grabbing the remains of kills brought down by the larger predators and pounding the bones

with stones to extract anything edible.

“It was not just meat, but fat and bone marrow,” explains Leslie Aiello of University College London. “Such easy-to-digest food requires smaller stomachs and intestines, which use up less energy.