What is the essential naiveté of realism in modern art? (essay question)

Scribd search: Brian McGuinness on "Wittgenstein's mystical
epistemological views".
November 2014

Are universes Boltzmann-brain specific? Can BBs and ancestor
simulations be reconciled?
Because of the simplicity and frugality which will be practiced in the
next century, the guide to the future will be the discontinued past.
Words and phrases invested with mock metaphysical power.
Thermodynamic macrostate as superset of "the appearances"
Are non-local correlations required as a precondition for certain coded
transactions between the brain and the external world, particularly in the
case of language use and interpretation?
Concepts must be intersubjectively defined. And it is clear that
subjective states cannot be intersubjectively analyzed. This is what
lies behind the degeneracy of the singular-vs.-multiple-minds paradox
of Erwin Schroedinger.
Investigate the concept of metasolipsism.
We must not merely consider error correction in the context of the
transmission of data, but also within the context of the origination of
information in creative mental processes. This is getting at the
ambiguity of the linguistic-perceptual reprocessing boundary. It shows
the intimate connection between the problem of other minds and the
problem of the external world - see Bertrand Russell, Our Knowledge of
the External World.
Mormon metaphysics and the multiverse
Recognizing that one is stupid makes one a genius among the stupid.
This provides additional support for the notion of the timelike angular
momentum of spin one-half particles.
The permanent timelike orientation of quantum intrinsic spins in the
case of spin-zero particles and the spacetime-like orientation of spin
one-half particles bespeak an absolute reference frame, such that the
counterintuitive behavior may ultimately be a simulated appearance.
It's true that the library of every possible book would include the
complete works of, for example, Plato and Shakespeare. However, if the
origin of those coherent-seeming works was not an actual mental
process, then the grammatical structure and the semantics represented in
these texts within this vast library of mostly gibberish would be
altogether absent - mind, after all, is the giver of meaning. This objection
may be amplified by relating it to Wittgenstein's private-language
argument against solipsism. This amplification must in turn be placed
within the context of Alvin Plantinga's book God and Other Minds.
The reprocessing potential represented by the possibility of ever-shifting
contexts and the characteristic open-endedness of any message or signal
opens up the way for the human mind at any juncture to leap outside of
any pre-established system of logic or reasoning. What provides this
potential is the capacity for metaphorical thought, which is of course
consciousness itself. It has already been said elsewhere on countless
occasions that consciousness is indeed the metaphor of all metaphors.
Given the vast multiplicity of human languages, it should be obvious that
the sum total of all conceivable books in a Borgesian library does not
represent even a tiny fraction of all conceivable thoughts which it would
typically take a book to articulate.
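The size of the library in question can be made concrete with a quick calculation, a sketch using Borges's own canonical parameters from "The Library of Babel": 25 orthographic symbols, and books of 410 pages with 40 lines of 80 characters each.

```python
import math

# Borges's canonical parameters from "The Library of Babel":
# 25 orthographic symbols; 410 pages x 40 lines x 80 characters per book.
symbols = 25
chars_per_book = 410 * 40 * 80   # 1,312,000 characters per book

# Number of distinct books = 25 ** 1,312,000; report its order of magnitude.
log10_books = chars_per_book * math.log10(symbols)
print(f"characters per book: {chars_per_book}")
print(f"distinct books: about 10^{log10_books:.0f}")
```

Even at roughly 10^1,834,097 distinct books, the point stands: the set is unimaginably vast yet still finite, and tied to the symbol inventory of a particular script.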
If coherent thought is just a kind of channeling of pre-existing signals
latent within the vacuum of space, then this has intriguing implications
along a variety of different paths. After all, what we call coherent may
not be an objective property, but instead an absolutely subjective one.
Data is a projection of information which does not contain any of the
dynamics by which the information is originally produced. Information
unlocks access to function specific processes which still have to be
explained even after the origin of the information itself has been
explained through evolutionary processes.
November 2014

If it truly requires nothing less than unadulterated chance in
order to adapt the evolving organism to its essentially unpredictably
changing environment, then an important consequence follows from this
when it comes to the concept of technological progress and exploration.
The texture is formed of known knowns, known unknowns, unknown
knowns, but also of unknown unknowns.
Under one interpretation transcendence occurs in two fundamentally
distinct ways: either one transcends and finds that one is alone or one
transcends and encounters the other. The growing popularity of Soylent
food substitute is an example of a disturbing and fascinating cultural
trend of liberation from out of every nook and cranny of slavery towards
Before transcendence: embodiment
Chinese is a two-hemisphere language: tonal and ideographic.
Bureaucratese is the language for unlocking tiny minds.
October 2014

Subject: Concerning dreams, aspirations . . . and of "making a living" : )
Dear Sam,
Hugo Gernsback's "Amazing Stories" was an immensely popular weekly
pulp science fiction series. My father used to read these in the 1930's. In
fact, I acquired most of my interest in the science fiction genre from
him. Anyway, I was looking at the ads from the first few pages of this
publication from December 1927. Several things: the wonders people
dreamed about for the future back then show no sign of coming about
and probably never will because the fundamental values that underlie
them are fast disintegrating, e.g., notions like "destiny of Mankind",
"future = progress", "the noble spirit of scientific inquiry", "the inherent
rationality of the Universe", "the pregnant inspiration of the maverick
genius" and so on. The producers of this cheaply thrown together
magazine knew all too well the constitutional weakness of their target
audience when they first began soliciting for advertisers, all of whom
you may notice, promise virtually instant success to the respondent,
inspired by the notion of seizing a too good to be true opportunity from
out of the blue to make a sudden break from the mediocrity of his past
and at once finally begin to carve out his own fate (in which he always
secretly believed, but which he had dared not articulate to himself !) I
look at these ads, not only coldly, dispassionately, but also with more
than a little empathy, armed as I am with the knowledge that the
publication, the ideas and underlying values of its inspired but struggling
contributors, the hapless souls pathetically taken in by the magazine’s
cynical advertisers with the lure of being the "grand exception", the
cynical advertisers themselves, the children of all of these, and a goodly
number of their grandchildren. . . are all dead and rotting in their graves,
remembered only in the sporadic conversational asides and scribbled
footnotes of their self-interested descendants or of an ungrateful
posterity, which is to say, if they are not indeed forgotten
utterly. This having been said, let me also say this: were it not for
the inspired creative persons be-speckling human history, we should all
not now be enjoying life in the Internet Age, because humankind would
have hardly advanced further than the Middle Ages, and its awesome
mysteries of wood-fired furnace and baker’s water wheel! Have you
ever wondered why the townie-locals always try to corrupt and drag
down the prodigal son who is briefly forced to return to the village of his
humble birth? People who have never been touched by creativity or
experienced genuine insight or a moment of genius - almost every last
one of them - instinctively hate the artist, the poet and the philosopher!
This is all by way of saying, dear brother, don’t lose belief in yourself
and for God’s sake don’t be afraid to realize your dreams for bigger and
better things for you and those whom you love!
Email signature: "By the very same logic proffered in support of the Anthropic
Cosmological Principle one could put forward an equally cogent
argument in favor of a so-called Solipsistic Cosmological Principle. But
any argument which cannot be communicated (because of the necessary
absence of an audience to hear and understand it) must be rejected out of
hand."
Version of the above, which appeared as a comment posted to the above link:
October 2014
fcbk= Hugo Gernsback's "Amazing
Stories" was an immensely popular weekly pulp science fiction series.
My father used to read these in the 1930's. In fact, I acquired most of my
interest in the science fiction genre from him. Anyway, I was looking at
the ads from the first few pages of this publication from December 1927.
Several things: the wonders people dreamed about for the future back
then show no sign of coming about and probably never will because the
fundamental values that underlie them are fast disintegrating, e.g.,
notions like "destiny of Mankind", "future = progress", "the noble spirit
of scientific inquiry", "the inherent rationality of the Universe", "the
pregnant inspiration of the maverick genius" and so on. The producers
of this cheaply thrown together magazine knew all too well the
constitutional weakness of their target audience when they first began
soliciting for advertisers, all of whom you may notice, promise virtually
instant success to the respondent, inspired by the notion of seizing a too

good to be true opportunity from out of the blue to make a sudden break
from the mediocrity of his past and at once finally begin to carve out his
own fate (in which he always secretly believed, but which he had dared
not articulate to himself - until now!) I am of two minds as I can look at
these ads, not only coldly, dispassionately, but also with more than a
little empathy, armed as I am with the knowledge that the publication
and others like it have long since moldered, the ideals and underlying
values of its inspired but struggling contributors are now more or less
forgotten or transformed beyond recognition, the hapless souls
pathetically taken in by the magazine’s cynical advertisers with the lure
of their being the "grand exception", the cynical advertisers themselves,
the offspring of all of these, and a goodly number of the grandchildren. .
. are all dead and rotting in their graves, remembered only in the
sporadic conversational asides and scribbled footnotes of their
self-interested descendants or of an ungrateful posterity, which is to say, if
they are not indeed blotted out utterly, and still more with the confidence
that, certainly in spirit, these same cynical purveyors of false hope in
one's human potential and personal growth are very much still around
today. This having been said, let me also say this: were it not for the
inspired creative persons be-speckling human history, we should all not
now be enjoying life in the Internet Age, because humankind would
have hardly advanced further than the Middle Ages, and its awesome
mysteries of wood-fired furnace and miller’s water wheel! Have you
ever wondered why the townie-locals always seem to try their hand at
corrupting and dragging down the protagonist of the prodigal son who,
by misfortune or strange turn of fate, is briefly forced to return to the
village of his humble birth? People who have never been touched by
creativity or experienced genuine insight or a moment of genius – almost
every last one of them, instinctively resents and distrusts the artist, the
poet and the philosopher! This is all by way of saying, dear creative
types out there...don’t let slip your belief in yourself, endeavor to ever
more court your inner creative voice and by so doing to better and ever
more clearly channel your unique individual spirit. Don't be afraid to
acknowledge and work to achieve your ambition to give to the world
your only truly unique insight, the one which no one has yet suspected,

but which all thinking persons and searchers after truth would perhaps
secretly gratefully receive and which by the way constitutes both the
very core and flair of your being. The moral of the story, I think, is that
inspiration, if cultivated, arises from within, not from a cynical ad in a
popular publication offering untutored persons instant success, my now
withered certificate in VCR Repair, signed by Sally Struthers herself.
August 2014 fcbk=

Lime Cat: I have always suspected that the strange behavior
of the quantum mechanical wavefunction in the presence of conscious
observation (similar to the oscillating fractal patterns, the transients, not
the stable ones, exhibited by a TV monitor on which its own TV camera
is trained) bespeaks a kind of mediated self-interference akin to
destructive interference, which is probably a component of Hawking's
dynamic "chronology protection" mechanism. Because human cognition
possesses so many "advanced wave", "presponsive" and, quite frankly,
paranormal properties, the RPG Master Programmers of this
"ancestor simulation" have probably provided an algorithm for partially
insulating the ground state vacua of "physical objects" and causal
processes from the "temporal accord violating" ground state vacua of the
quantum entanglement-mediated processes of human neural microtubule
tubulin dimer networks.
Lime Cat epi= "Dreaming is a phenomenon of the unlinking of
the self from the group mind" (or intersubjective reality).
October 2014
The assertions of dream characters are interpreted by the dreamer as
implications of the written word, which has just for the moment taken on
the superficial appearance of 3rd-person speech.
August 2014

Some persons in your collective dream are avatars, but most are just filler.
If you lock your attention onto a distant filler replica and approach it, it
transforms itself into an avatar. This necessitates a reconfiguration of the
self's linkage to the collective mind.

Dr. Vemuri is mistakenly reifying the metaphor of Self as "Ultimate
Creator". The great mystery is that what he is asserting *is true of each
and every one of us*. Each is the author of his or her own Universe, i.e.,
the only world he or she has ever known; however . . . this personal
creation is projected entirely within the infrastructure of necessary
conditions and restrictions set forth by each and every other co-creator
who has conspired along with one in doing this.

Lime Cat: Interrogation of the continuity of an avatar qua
phenomenal projection of alterity through time means an investigation of
that avatar's ground of manifestation. Is its ground of manifestation en
soi identical with its ground of manifestation pour vous?

“There is no thing in itself unless that thing is a self.”


An altogether different timeline with its own anthropic cosmological
principle (of filtering). This leads to a fundamental reinterpretation of
the "re-" in "reincarnation".
Can reference be purely abstract?
Language based on rhythmically spoken sequences of natural numbers.
Underlying causal process determines which parametric statistics to use.
Empirical and not theoretical science. Curve fitting based on trial and
error and judgment. Quantum probabilities are different...random causal
Objective existence and polysolipsistic projection are
Always embrace what makes you.
All you know is that there was this strange metaphysical message
waiting for you in the form of this myth of God becoming a man, which is
secretly an anthropic cosmological principle. Built into this anthropic
cosmological myth is the caveat against the desire to become as God is
with respect to His knowledge of good and evil by eating of the tree of
life. The curse for attempting to fulfill this ambition: "You will dig
vegetables from the ground rather than pluck fruit from the trees."
"Descent with modification" is an inversion of the evolutionary order as
well.
Boltzmann built into his statistical mechanical proof of the 2nd Law of
Thermodynamics the assumption that three-way collisions of particles
could be ruled out. But what if we include 3-way collisions and add
resonant quantum tunneling, e.g., the 7.65 MeV 3-way He-4
collision/Carbon-12 (Hoyle) resonance? Do we then get a modified 2nd Law
and novel predictions for the behavior of entropy in systems where
quantum tunneling has to be taken into account? A theory (regardless of
how fundamental) is only as good as its foundational assumptions. As
aging hackers still like to say, "garbage in, garbage out".
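The resonance alluded to here is presumably the famous Hoyle state of Carbon-12. Its relation to the triple-He-4 threshold can be checked with a back-of-the-envelope mass calculation; a sketch using standard published atomic masses:

```python
# Back-of-the-envelope check of the triple-alpha (Hoyle) resonance.
# Atomic masses in unified atomic mass units.
M_HE4 = 4.002602      # He-4 atomic mass, u
M_C12 = 12.000000     # C-12 atomic mass, u (exact by definition)
U_TO_MEV = 931.494    # energy equivalent of 1 u, in MeV

# Three alpha particles lie this far above the C-12 ground state:
threshold = (3 * M_HE4 - M_C12) * U_TO_MEV

hoyle_state = 7.654   # measured excitation energy of the Hoyle state, MeV
gap = hoyle_state - threshold  # how far above threshold the resonance sits

print(f"3-alpha threshold above C-12 ground state: {threshold:.2f} MeV")
print(f"Hoyle resonance sits {gap:.2f} MeV above that threshold")
```

The threshold comes out at about 7.27 MeV, with the resonance only ~0.38 MeV above it, which is what makes the triple collision so improbably effective.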
I believe differently. We are each separate from the other by virtue of
our respectively distinct metaphysical grounds of being, but we have
each made the foolhardy leap into the realm of limitation
(space-time-causality) in the hope of finding community with others who we suspect
have done the very same. We are not "secretly one", as this is but a
degenerate form of solipsism. We are, rather, secretly transcendently
other, one from the other. The cash value of objectivity is
intersubjectivity, whose scope can never hope to encompass the truly
subjective, which each of us indeed is. Plants, on the other hand, grow
and develop from a common ground, which to my way of thinking
supplies precisely the wrong metaphor.
How do you necessarily get increases in entropy from purely reversible
interactions? But isn't quantum tunneling irreversible? Quantum
correlations, i.e., quantum entanglements, are instantaneous and
enveloped within a seeming one-way membrane of Heisenberg
uncertainty, and so, within the context of classical, time-invariant
physics, further contribute to thermodynamic irreversibility.

<Skepticism about the past> <the past hypothesis>

Apoptosis..."extra normal" "I just need to know if three patients are
For you the reason for exile is strictly practical; for me there is an
underlying metaphysical basis.
I believe consciousness is more fundamental than a mere phenomenon.
The so-called irrelevant details suppressed at the beginning of the
formulation of a foundational theory loom on the horizon of this theory's
reigning scientific paradigm. This is particularly the case when the
theory has itself ushered in this paradigm.
The open-endedness of nature does not support the idea of physical
theoretic Platonism.
"All you know is that, after having been rudely ripped away from the
serenity of the Void or heroically rescued from the screaming Abyss
(take your pick), there was, waiting for your consideration, a strange
metaphysical message in the form of this myth of God assuming the
form of an advanced hominid ape. . . what is surely the preeminent
figurative expression of the "anthropic cosmological principle", the
proper literal expression of which, humankind only now stands just on
the edge of formulating."
Isn't it this spark of the divine which makes the hominid ape human?

Combining linguistic pragmatics on the one hand and etymology on the
other with an analytic philosophical approach would likely yield the best
simulation of a metaphysical system.
Alex, I propose two axioms upon which you shall found your
metaphysical system:
1) The meaning of objectivity is exhausted by that of intersubjectivity,
which comes from linguistic pragmatics.
2) Subjectivity






It follows that an irreducible residuum remains after objectivity has been
fully taken into account.
The appearance of irreversibility in thermodynamic systems is perhaps
easily enough explained in terms of the asymmetry connected with
quantum entanglement, which can be created but cannot be destroyed.
Two-dimensional time: the space of states accessible by human
consciousness changes with time, but this time axis is different from that
along which preexisting states are sequentially accessed.
Change in the space of possibilities necessitates a 2nd dimension of time.
Let us apply the anthropic cosmological principle to quantum
mechanical phenomena and laws so as to interpret them in terms of a
dynamic interface between mind and body, spirit and matter - an
anthropically fine-tuned fuzzy interface, in other words.
Quantum physics is transitional between classical physics and the realm
of the mind and consciousness.

The intent of the no-smoking rule in a restaurant, for example, and the
debate within the philosophy of jurisprudence: codified law versus
law determined by precedent.
The English word "Gist" is etymologically related to the German word
"Geist" (Meaning implying context again).
With each redrawing of the map, I end up that much closer to the
The Narcissist will surely exile you, once he realizes that he shall never
be the equal of himself in your fond imagination.
Dreaming is a phenomenon of the unlinking of the self from the group mind.
If you lock onto a filler replica and approach it, it transforms itself into
an avatar. This necessitates a reconfiguration of the self’s linkage to the
collective mind.
Some persons in your collective dream are avatars, but most are just filler.
Intersubjectivity is both the deconstruction of the objective, as well as
the unmasking of the collective. The objective beyond the intersubjective is
transcendent. The subjective particular is there, and the ground of
subjectivity, which is a many rather than a one.
A counterintuitive phenomenon discovered: The coexistence of
superconductivity with dissipation.
What type of information is represented by the quantity of surprise to the
quantum vacuum? Information density of changing boundary conditions
to the quantum vacuum outstrips the computational density of this
vacuum and so the future state of the system must be a “guess”.

Are there numbers that can be represented by qubits that cannot be
represented by bits?
Aren't there qubit strings that cannot be translated into bits without a
wavefunction collapse?
Scribd and Google search.
Hate is not the opposite of love, but indifference.
The quantum information passing between Alice and Bob depends on
what vacuum-contextualized information (data) their brains "contain".
If A can collapse Psi and B can collapse Psi, then are there two distinct
random processes by which wavefunction collapse is effected, rather than
a single one, and is this distinct from wavefunction collapse based upon
the sudden opening up of a potential avenue of obtaining knowledge about
the state of the system?
Why do dreams of being unwittingly in very high radiation fields
fascinate me so?
Are there numbers that can be represented by qubits that cannot be
represented by bits? It would seem so, because X qubits always
represent a larger number than X bits.
What to the power of X equals X qubits?
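The counting intuition here can be made concrete: X classical bits hold one of 2^X values at a time, while a general X-qubit state is a vector of 2^X complex amplitudes. A minimal sketch (the 64-bit-float precision per amplitude is an illustrative assumption, not anything fixed by quantum mechanics):

```python
# Classical cost of merely writing down an X-qubit state, versus X bits.
# Illustrative assumption: 64-bit floats for each real and imaginary part.
BITS_PER_AMPLITUDE = 2 * 64

def classical_bits_to_store_state(x_qubits: int) -> int:
    """Bits needed to record all 2**x complex amplitudes of a general state."""
    return (2 ** x_qubits) * BITS_PER_AMPLITUDE

for x in (1, 10, 30):
    print(x, "qubits ->", classical_bits_to_store_state(x), "classical bits")
```

The flip side bears directly on the "translation without collapse" question: a measurement of X qubits yields only X classical bits (Holevo's bound), so the exponential description is never fully extractable.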

Something became me. I did not come from nothing (provided that I am
something), and so whatever became me can do so again, according to
the solipsistic logic of the anthropic cosmological principle.
Divert potential political energy into a shooting barrel where it can't
challenge the interests of the power elite, e.g., abortion issue, gay
marriage, political scandals, etc.
Genetic overdetermination of amino acids and the predetermination of
chemical self-organization, for example, Arginine and Isoleucine.
This enables the accumulation of a series of so-called silent mutations
and builds up a great amount of genetic complexity potential which can
be exploited later on in the light of further mutation.
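The degeneracy in question can be counted directly from the standard genetic code. A sketch: the codon assignments below are the standard textbook table, listing only the two amino acids named in the note.

```python
# Codon degeneracy for the two amino acids named above,
# per the standard genetic code (RNA codons).
CODONS = {
    "Arg": ["CGU", "CGC", "CGA", "CGG", "AGA", "AGG"],  # six-fold degenerate
    "Ile": ["AUU", "AUC", "AUA"],                        # three-fold degenerate
}

for aa, codons in CODONS.items():
    print(f"{aa}: {len(codons)} synonymous codons -> {codons}")

# A third-position point mutation within a family is "silent":
# CGU -> CGC still encodes arginine, so sequence change accumulates
# with no immediate phenotypic cost.
```

Such silent substitutions are exactly the reservoir of "complexity potential" the note describes: neutral now, but raw material for later non-silent mutations.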
There is a creative side to error correction in genetic base pair
sequencing which is ordinarily thought of as a critical process.

Accumulation of context for correctly spelled-out genetic base pair
sequences provides the basis for future error correction. This is a
hypothesis concerning the deeper aspects of the error correction process.
Consciousness collapses the wave function; the quantum vacuum only
gradually degrades the wavefunction. The brain focuses the quantum
vacuum's proto-consciousness.
Concerning a scientist's view on the rediscovery of Zipf's Law: "The
paper's co-authors include biophysicists David Schwab of Princeton and
Pankaj Mehta of Boston University. "I don't think any one of us would
have made this insight alone," Nemenman says. "We were trying to
solve an unrelated problem when we hit upon it. It was serendipity and
the combination of all our varied experience and knowledge.""

This goes against Ayn Rand's dogmatic assertion that no significant
discovery was ever made by a group or collective.
Facebook has taught me one thing, if it has taught me anything, namely
that great intelligence and wisdom do not necessarily coincide. Humility
makes one artificially wise as pride renders one artificially stupid.
"Neti, neti" - Advaita Vedanta, concerning the nature of Brahman
"The fool who persists in his folly will become wise."
- William Blake
There is a scale of the processing and integration of quantum
information represented by the quantum decoherence limit, which
applies to the maximum quantity of information that may traverse the
vacuum via quantum teleportation alone. Does this quantum
teleportation bottleneck play an important role in the separateness of
conscious minds?
An empirically based decoherence theory proves that there is more to the
universe than just quantum information.
"I wouldn't describe myself as lacking in confidence, but I would just
say that - the ghosts you chase you never catch." – John Malkovich
Sam, is there really only one consciousness, and each brain merely
filters, structures and resonantly tunes to this consciousness in a different way?
Paradoxically, alterity is the "image" in which each of us is made, and so
the appropriate metaphysical ground for our ethical system must be a
kind of "polysolipsism". To refer to alterity as an "image" is to invoke a
kind of "transcendental metaphor".

Why is topology an important consideration in thinking about the
phenomenon of dissociation?
GW Researchers Disrupt Consciousness With Electrical Stimulation |
GW Today | The George Washington University
It never occurred to anyone in ancient Greece to advocate for the
abolition of slavery. – Robert Garland, Ph.D.
I'm reading "Christopher Lasch and the Moral Agony of the Left - Aidan
Rankin" on Scribd.
I'm reading "Language and Truth: A Study of the Sanskrit Language and
Its Relationship with Principles of Truth" on Scribd.
Consider the topology of the gaps within the computational state space
and the domains interlocking in the multiverse.
My coincidental domains interlocked with other people's non-coincidental
domains; the relationship of time and frequency is
analogous to the relationship of coincidence and incidents.
"Does it contain any abstract reasoning concerning quantity or number?
No. Does it contain any experimental reasoning concerning matter of
fact and existence? No. Commit it then to the flames, for it can contain
nothing but sophistry and illusion." – David Hume
The intersubjective realm is merely an arena of constructive and
destructive interference of multiple mental frequencies, but we should
here distinguish between carrier frequencies and frequency envelopes.
Evangelical atheists enjoy a concept of scientific progress which is
modeled in terms of nature as the asymptote, approached closer and closer
by an ever more gently arcing curve. But this model is disproven by the
marked tendency for scientific ignorance to grow at a rate that
increasingly outpaces the accumulation of scientific knowledge. And
this is fundamentally owing to the necessary structure of scientific
revolutions, as first compellingly demonstrated by Thomas Kuhn.
In deep philosophical discussions of a broadly hypothetical nature, it is
common to compare and contrast hypothetical cases which we could
never determine the difference between; for example, what if God never
existed, what if this epic historical event to which we owe most of our
cultural identity never happened, and so on.
Causality is pattern recognition and image enhancement of correlations.
Nature is no thing in itself (we've known this since Kant). Science
doesn't prove theories; it falsifies them.
“Human beings are free to the extent that they can replace one set of
behavioral determinisms with another set.”

As Fritjof Capra indicated in his book The Turning Point, visual
metaphors fail us when it comes to trying to understand quantum
mechanics in an intuitive way. Quine on the analytic-synthetic
distinction; the Whorf hypothesis.
If the domain of the unknown with respect to scientific investigation were
a finite, determinate space to be cumulatively and progressively filled in
with correct theories and facts, then the notion of a theory of everything
would be a coherent one. But this is not the case, and the future is not
indeed on a trajectory conceivable to the present.
After a paradigm shift occurs, not only do facts become obsolete, but
also questions, even fundamental ones.
Contrast thinking in terms of tools vs. thinking in terms of concepts.

Do we have a concept of a metaphor or only a metaphor for a metaphor?
The question frequently arises why the study of UFOs more or less began
in the 1940s in the American southwest, around and near military and
nuclear installations. This seems perhaps due to the unique neutrino
emission signature of an operating nuclear reactor, which would be
easily detectable in deep space with detector technology only a little
more advanced than what we possess today.
Because of topology and internality or inwardness of subjectivity, an
aborted fetus cannot possibly correspond to a hypothetical human
possessed of a determinate identity.
Most acts of creation are not creative acts at all, but are merely examples
of what I term "creaction".
Two-dimensional time is built into the universe; all one has to consider to
see the truth of this is the principle of quantum superposition in the
interaction of mutually exclusive temporal lines.
Freud was the first to observe the principle that human reactions are to
the contrasts in things and not to the things themselves, and I felt keenly
that this was true when I first read this remark by Freud.
"Lol" has distinctly different meanings; however, what it is an acronym for,
"laugh out loud", has an additionally distinct meaning or acceptation.
The appearance of persons in dreams is not merely the post hoc
reproduction of the ordinary experiences of waking life, but should be
reassessed in Kantian terms as at once necessary precondition and
epiphenomenon of egoically structured consciousness (as such or
personal?). Although it should be noted in this connection that
frequently the ego is absent in the 1st person singular perspective, but is
reconstituted in the 3rd person plural perspective (as opposed to a 3rd
person “multi-singular” perspective)
The metaphysics of smiling and laughter.
Contrast Kantian precondition of egoic consciousness with empirically
based sociolinguistic construction.
September 2014

Intersubjectivity is both the deconstruction of the objective,
as well as the unmasking of the collective. Objective beyond the
intersubjective is transcendent. The subjective particular is there and the
ground of subjectivity, which is a many rather than a one.
The unity of the human mind is only intelligible within the context of
transcendental mind, namely to this mind. This follows from the
non-existence of a concept of consciousness, with the attendant impossibility
of a theory of consciousness.
November 2014

It is then perhaps only by analyzing the differences of consciousness
between and amongst a wide variety of minds, and by developing ever
higher-level categories of these differences, that consciousness is best
investigated. Aristotle's approach of first determining what is in common
between individuals before investigating their differences may well apply
to the investigation of everything in the world except consciousness.
Lee Smolin claimed that only a cosmology invoking a model of
sequential Big Bangs, or sequential universes, rather than simultaneous
Big Bangs could be empirically testable. This may be a unique case
where the lack of falsifiability and empirical testability does not
necessarily stand against the promoting of a hypothesis concerning
physical reality. This is on account of a philosophical position, namely
that of the complex topology of the external world, and the recognition that
mankind does not necessarily have access, or even potential future access,
to the entirety of the domain of reality or being itself.
Scribd search terms: "turns out to be the opposite"
Physical theories are never born from empirical data sampled across an
infinite or absolute domain; consequently, why would we ever expect
such theories to validly apply to an infinite or absolute domain? To
expect this is to indulge in the magical thinking of our primitive forebears.
Can we draw a hard and fast distinction between Popper's falsifiability
with respect to theoretically possible experiments versus falsifiability
with respect to doable experiments?
In the same way that the underlying logic of the anthropic cosmological
principle is solipsistic so too is the underlying logic of the quantum
observer solipsistic.
"We learned some of the rules of the ribosome, that evolution can
change the ribosome as long as it does not mess with its core," Williams
said. "Evolution can add things on, but it can't change what was already…"
Combine the underlying logic of the anthropic cosmological principle
with that of the Copernican principle: if everyone has won the cosmic
lottery of having been born with a brain that precisely and resonantly tunes
to the quantum vacuum electromagnetic field, then this field must
possess a multidimensional, temporally partitioned structure. And it is
how all of these vacua or quantum vacuum partitions mutually interfere
which provides the universe with its transcendentally intersubjective, er,
objective structure.
Dasein and "daseinity" and the anthropic principle, multiverse, etc.
The Latin word "pectora" meant both mind and breast or chest.

“The philosophy of science is about as useful to scientists as ornithology
is to birds.” – Richard Feynman
“Heating and air conditioning are on at the same time, and the engineers
don't know, and it's costing twice as much energy.”
The frequency of internal thoughts: Gaussian distribution, relative
frequency, coincidences.
Note that virtually half of humankind is possessed of two-digit IQs.
Shannon information does not take into account the import of
information, which is context- and system-dependent; which is to say, it
does not differentiate the semantic-contextual from the causal-systemic
aspects of information.
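A minimal sketch of this point: Shannon's measure depends only on symbol statistics, so two messages with identical character frequencies, and hence identical entropy, can carry entirely different meanings. The `shannon_entropy` helper and the example strings are my own illustration, not anything from the source.

```python
from collections import Counter
from math import log2

def shannon_entropy(msg: str) -> float:
    """Character-level Shannon entropy in bits per symbol."""
    counts = Counter(msg)
    n = len(msg)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Two messages that are anagrams of one another: identical symbol
# statistics, hence identical Shannon entropy, yet different meanings.
a = "listen"
b = "silent"
assert shannon_entropy(a) == shannon_entropy(b)
```

The measure is blind to which string is which; the semantic difference lives entirely outside the entropy calculation.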
The coincidental is centered on the Gaussian centroid; the causal is located
on the left and right tails. Two forms of causality? Implying two time axes?
Multiple interacting but independent causal continua. Multiple centers of…
Classical physics: individual deterministic, collective probabilistic;
quantum physics: individual probabilistic, collective deterministic.
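The quantum half of this aphorism can be sketched numerically. Assuming only the Born rule, each individual measurement below is irreducibly random, yet the ensemble frequency converges, deterministically in the statistical limit, to the predicted value. The angle and sample size are arbitrary choices for illustration.

```python
import random
from math import cos, pi

random.seed(0)

# Born rule: a spin prepared at angle theta to the measurement axis
# yields "up" with probability p = cos^2(theta/2).
theta = pi / 3
p_up = cos(theta / 2) ** 2          # = 0.75 for theta = 60 degrees

# Each individual measurement outcome is probabilistic...
outcomes = [1 if random.random() < p_up else 0 for _ in range(100_000)]

# ...but the ensemble frequency converges to the Born value.
freq = sum(outcomes) / len(outcomes)
print(round(freq, 2))  # close to 0.75
```

Classical statistical mechanics runs the aphorism the other way: deterministic trajectories individually, probabilistic description collectively.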
Quantum computers, still largely in their developmental infancy and
supporting an equally fledgling science of virtual reality, may
inadvertently bypass brain filters which natural selection put in
place eons ago, rendering the newly created human brain-network hive
mind vulnerable to alien quantum entanglements from the vacuum.
There are myriad other kinds of systematic correlations produced by
quantum entanglement within large molecules and their nuclear magnetic
resonance spin states, beyond those that represent conventional logical
relations. Relate this to the indeterminacy of Shannon information.

That feeling of perceptual-behavioral "neural hard wiring" that you
sense in yourself when in the grip of the faux optical illusion represented
by the "four-eyed lady" can be experienced in relation to almost any
ordinary perceptual stimulus if thresholds are lowered.
Kurt Gödel: "troof is stronger than proof!"
What portion or percentage of the quantum vacuum fluctuation field is
mutually correlated via a propagating light ray even if only in the form
of a virtual photon?
Whatever tiny percentage of the quantum vacuum is constituted by a light
ray correlation, that is the percentage of the vacuum that gravitates.
Any group of quantum vacuum fluctuations that is not connected by a
propagating light ray does not belong to the same inertial reference frame.
It is true that there is relativity of simultaneity, but there should not be
relativity of causality.
If these vacuum fluctuations do not belong together on the same
propagating light ray then these fluctuations do not belong to the same
foliation of spacetime.
The Lorentz invariance of the vacuum is importantly connected to the
fact that the vacuum does not gravitate. It is the deviation from Lorentz
invariance which causes differences in vacuum density, which can have a
gravitational effect.
October 2014

One important step towards solving the cosmological constant
problem, that is, the profound mismatch between general relativity's
prediction and quantum mechanics’ prediction for the energy density of
the vacuum, may be simply to note that the vacuum may accommodate a
vast plurality of anthropic quantum observers.
Quantum entangled relationships provide the context for classical
information, which does not distinguish between the different qualities
of digital information.
Famous Feynman lectures put online with free access
Free will, decoherence, moral agency and the multiverse...
The multiple timelines of the multiverse have to be completely mutually
independent for free will to be preserved
There is the multiverse within the confines of the decoherence limit and
then there is the wider multiverse beyond the decoherence limit. One's
freely willed choices cannot really impact versions of oneself in
universes beyond the decoherence limit.
Fetish thinking is the dirty underside of the causal supervenience of the…

Solipsistic logic underlying every major philosophical principle enabling
a postmodern critical re-reading of modern philosophy, e.g., Kantian
categories and forms of intuition, Husserlian eidetic reduction, etc.
Considerations of the relative probabilities of various combinations and
permutations of pre-established sequences within a closed system of
course do not take into account the meaning of its context, which also
supplies the operational and causal efficacy of the data that is contained
provisionally within said permutational combinations.
It is a Leibnizian principle paraphrased to say that whatever provides for
the coherence and cohesiveness of a system also directs the assembly of
said system.
October 2014

Evidence for the absence of design on one level and at a certain
scale is suggestive (in the minds of some) for the presence of design at a
much larger scale and along a much higher level, e.g., the vast genomic
sequence space compared to the insuperably vaster space of stable,
biologically meaningful proteins that has heretofore been faithfully
addressed by way of the former. If nature is not really relying on the
particular structures of DNA to *produce* the information that gets
expressed as functional stereomorphic protein structures (such that
unique quantities of information are tied to unique physical realizations
of this information), but merely for enabling the expression of latent
information, then we should reasonably expect there to be myriad and
perhaps an unlimited number of ways to access this information, e.g., a
bank account accessible via multiple account numbers. (Contrast this to
a lottery where each winner accesses a different pot of winnings). There
is then no unique, “natural” and original physical or, rather,
“implementative” realization of biologically meaningful
information. This is suggestive of there being no unique, original or
implementative realization of mind as final interpreter of the data of
information. “Physical” as prn=“implementative infrastructure”.

Information is only projectively “present” within its implementative
context. The context for the implementative realization of
information is, of course, dual: there is the physical context of the
information, say, in terms of quantum vacua grounding the more or less
discrete interface for this information, and there is the observational-subjective
context for this information. However, note that at least two
of the quantum vacua providing these respective contextualizations for
this information are themselves mutually quantum entangled, i.e., that of
the quantum mechanical system “containing” the information and that of the
observer of this system, namely whoever accesses the system's information.
If there is design, but the design is not for us, but for someone else
altogether, then this design will likely go undetected.
As the mind expands, the universe of discourse also expands, which in
turn expands the possibilities of the future growth of mind and so on.
Roger Penrose's paradoxical Platonic triangle contains certain
assumptions about the relationship of mind, physical reality and
mathematical form that certainly bear further scrutiny.
Quantum decoherence is owing to the necessary topological structure of
the state space.
Kurt Gödel also proved that reason transcends logic.
The unity of humankind does not guarantee equality beyond the
theoretical. Transcendental alterity vouchsafes the veritable equality of
conscious entities. Each is not a variation on a single exemplary
archetype. Apply this in contradistinction to the concept of variation in
human physiognomy.
October 2014

Asymmetry of matter and antimatter, the black hole information
paradox, conservation of quantum information, the extremely low starting
entropy of the universe, the cosmological constant problem. A modeling of
the Hawking radiation mechanism in the vacuum for a three-dimensional
hypersurface, such as the suspected cosmological manifold, in which
there are two distinct but quantum entangled spectra of radiation
(perhaps timelike entanglement). Here the inflation hypersurface can be
modeled in terms of a +1 dimensional event horizon, which splits the
quantum entangled vacuum Cooper pairs (virtual fermion-antifermion
pairs) into a thermal black body spectrum (high entropy spectrum) and a
highly negentropic, low entropy spectrum of highly correlated
fluctuations. The high entropy branch radiates forward in time, while
the low entropy branch radiates backward in time. The matter half of
the Cooper pairs falls inside the hyperdimensional black hole, while the
antimatter half appears outside this hyperdimensional hole. October 2014 wiki=
“De Sitter space has a Killing horizon at r = √(3/Λ) which emits
thermal radiation at temperature T = (1/2π)√(Λ/3)”, c.f., TED Talk:
“In mathematics, a 3-sphere is a higher-dimensional analogue of
a sphere. It consists of the set of points equidistant from a fixed central
point in 4-dimensional Euclidean space. Just as an ordinary sphere (or
2-sphere) is a two-dimensional surface that forms the boundary of a ball in
three dimensions, a 3-sphere is an object with three dimensions that
forms the boundary of a ball in four dimensions.” Would the quantum
nonlocal connectivity of the vacuum imply that the De Sitter spacetime
is superimposed on a 4-d Euclidean space, implying an absolute
reference frame that is inaccessible to fluctuations of momentum-energy
smaller than the Heisenberg momentum-energy uncertainty of the
cosmological ground state?

The Anthropic Cosmological Principle and the Multiverse Theory may
be combined in reinterpretation, i.e., fused so as to handily solve the
cosmological constant problem. For example, if each anthropic
subspectrum of the total quantum vacuum is assigned its own universe
within the overarching mutually quantum entangled multiverse, then the
“anthropic cosmological constant” would be set low enough to square
with observation, if the ratio 1/(total # of entangled anthropic
universes) × [theoretically predicted vacuum energy density] ~ 10^-26
The brain's resonant tuning into Boltzmann brain vacuum states
definitely requires quantum nonlocality.
Boltzmann brains are not situated within any particular region of a
spacetime of any particular universe. In other words, Boltzmann brains
are non-local.
If there were underlying principles of coherence in the ground of being,
then the universe, being as vast as it is, would be literally rife with forms
of living things. Otherwise, one would expect not one, two or maybe
even a handful of incredibly improbable junctures of biological
causation leading to the emergence of perhaps a solitary intelligent
species in the vastness of the cosmos, but rather no intelligence whatever
throughout its entirety. This is the very subtle form of coherence which has
escaped almost everyone's notice.
Does the speed of light postulate imply that any representation of the
interval in one reference frame has the same spacetime length as in any
other reference frame?
I understand what adding an interval means, geometrically, but what
does adding this interval to *both sides* of the equation look like,
geometrically speaking, that is?
What is the difference between being created out of nothing and merely
existing against a background of nothing, once we understand the word
existence to be interpreted dynamically?
Time reversal reverses the sense of the magnetic field lines but leaves
the electric field lines unchanged.

So does time reversal amount to the same thing as a rotation by 360
degrees of coupled spins?
Discuss intersubjectivity in relation to metasuperposition within the
context of Bohm's quantum causality principle. The partitioning of the
quantum vacuum is in support of the polysolipsistic anthropic
cosmological principle. Would this solve the cosmological constant
problem if the multitude of partitions of the quantum vacuum were
sufficiently great, which is to say, if the number of potential observer
consciousnesses were sufficiently large?
Physical reality just has to appear logically self-consistent to each
individual observer and not to observers collectively, over and above
what can be intersubjectively agreed upon between and amongst said
observers.
The softer the science, the noisier the empirical environment it must
investigate and filter so as to bring about theoretical coherence.
Freudian epiphenomena.
CPT Theorem
Apply symmetry to unprovable configurations, e.g., theoretical chess
mating positions.
"I would have already fallen into abject solipsism were it not for keenly
felt peer pressure."
Grace and accountability.
The price of ease and relaxation is vigilance.
Guard the boundaries of your comfort zone.

Consciousness is intimately involved with the creation of new
information; think of the chessboard and Kurt Gödel's incompleteness
theorems.
The quantum decoherence limit set in relation to the critical number or
density of molecules after which interactions become fundamentally
irreversible with attendant increases in entropy.
Compare the Schrödinger wave equation with the heat equation from
thermodynamics.
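A minimal side-by-side of the two equations the note invites us to compare, assuming the free-particle case (potential term omitted):

```latex
% Heat (diffusion) equation for u(x,t) with diffusivity \alpha:
\frac{\partial u}{\partial t} = \alpha\,\nabla^{2} u

% Free-particle Schrödinger equation for \psi(x,t):
i\hbar\,\frac{\partial \psi}{\partial t} = -\frac{\hbar^{2}}{2m}\,\nabla^{2}\psi

% The Wick rotation t \to -i\tau maps the latter onto a heat equation
% with effective diffusivity \alpha = \hbar/(2m):
\frac{\partial \psi}{\partial \tau} = \frac{\hbar}{2m}\,\nabla^{2}\psi
```

The factor of i is the whole difference: it turns dissipative spreading into unitary, oscillatory evolution.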
A proper understanding of God is one in which God is not the solipsist;
it is in this sense that Christianity, as well as all other monotheisms, gets
it completely wrong.
Rhodan: "Können Sie einen Hubschrauber fliegen?" ("Can you fly a
helicopter?") Bully: "Aber natürlich." ("But of course.") How should we
interpret "natürlich" in this context?
Those who do not experience singularity or solitariness, but subjectively
perceive a multiplicity within themselves, are never bored with their own
company and are comfortable as outsiders, as perennial social
anthropologists of their home culture.
Not only can the exact same quantity of information represent an
unlimited number of intended messages, but the same message functions
differently in different contexts.

History is composed of changes in the course of history and doesn't
happen otherwise. Science progresses by paradigm shifts and
fundamentally by no other means.

<Sociology Thomas Schelling social paradigms quantitative>

The only effective defense against idolatry is abject and total
skepticism. If every trace of idolatry has been thoroughly removed from
one's life, then a great void is created within one's spirit, and if one waits
patiently this void shall be filled with the ground of suchness, the
quantum vacuum. So the counter to idolatry, i.e., “having no other
gods” before the one, true God, the I am, is to believe in and worship
“nothing”, which is the Void, the interface and intercessor for which is
undoubtedly the quantum vacuum of relativistic quantum field theory!
"Electrodynamic faith" and the ultraviolet catastrophe. It is the specially
correlated fluctuations in energy of the quantum vacuum that prop up the
electronic structure of not only the atom, but the nucleus, as well.
Similar to the inevitable melting away of a dream at the very moment
that the dreamer begins to become lucid, the specially encrypted
structure of quantum entanglement informing the otherwise random and
disjoint fluctuations in the vacuum’s stress-momentum-energy, which
sustains the ground state and hence the very stability of matter, would be
fatally disrupted were it to be reordered by the emergence of contextual…
The structure and function of the unknowable may change at any time or
at all times.
The entropy of a quantum system as a whole can be smaller than the
sum of the entropy of its component parts, c.f., Vlatko Vedral on arxiv
One should not want to solve any mysteries so much as desire to
participate in the thoughts of a smarter being.
The forebrain, which is the seat of reason, needs rest from an active day
of processing the data from short-term and even shorter-term retentive
memory, so that dreaming is the inevitable result of a partially disabled
biological neural network trying to make sense of everything which
happened during the day without the advantage of the operation of the
higher faculties of reason. Much of the purpose of the higher faculties
is to modulate and even inhibit the evolutionarily more primitive
mental processes of the limbic system and other more primitive
structures of the brain, such as the amygdala.
The basic drives operate in a more transparent manner in the absence of
a sublimating brain. But the neocortex, parietal lobes and speech centers
remain largely active. Intuition and insight are also still in operation
although the critical faculties are for the most part suspended.

The results of a quantum measurement are determined at the time of
measurement and not at the time of system preparation or before, which is
to say that, either, there is no possible exhaustive set of "hidden" or
"sub-quantum" variables (c.f., David Bohm, John Bell, etc.) such that the
choices and actions of the experimenters preparing the system are
predetermined, *or* . . . the state of any quantum mechanical system
possesses an intrinsic Heisenberg uncertainty. Some philosophers
suppose that, in the absence of human free will, all quantum systems
would remain in uncollapsed superposition; others, that the existence of
an omniscient observer would induce the collapse of all quantum
superpositions throughout the Universe, making, e.g., covalent bonding
and hence carbon-based life impossible. There is of course here the
presumption of the everlasting backwards compatibility of current-day
quantum mechanics with both the physical science of the near and far
futures alike. :p Note: if the quantum uncertainty is "intrinsic", i.e.,
observer-independent, then Bohr's "light microscope" thought
experiment explanation of quantum uncertainty becomes a mere
pedagogical device, such that quantum uncertainty arises as an
implication of Fourier analysis rather than on account of some necessary
physical interference between the observational apparatus and the
prepared quantum mechanical system that is under observation, c.f.,
October 2014
Which implies that changes to the quantum state cannot be understood
in terms of the operation of, e.g., Bohm’s “hidden variables”. Nonlocality
is precisely what we should expect for physical reality, if it were simulated
on a quantum computer. Doesn’t the distinction between reality as quantum
nonlocal and reality as quantum computer simulation then collapse? This is
because, phenomenologically, reality is local though, noumenally, reality is
nonlocal. Locality, as this presumed general feature of physical reality, is
a simulated feature at the very least.
Watch the Stanford University lecture, "3. Behavioral Evolution II" on
YouTube, c.f., – good for stripping off the audio from a
Youtube video and converting to mp3 for later listening in mono.

Coarse-graining, abstraction and paradigm shifts. Irrelevant microstates,
c.f., From Eternity to Here by Sean Carroll.
Electrons moving in a magnetic field exhibit strange quantum behavior
When you commit an act of espionage while you are a trusted employee of
the FBI, CIA, DIA or other trusted government agency, you not only destroy
your future, you destroy your past. I am contemplating this intelligence
community public service announcement, which used to be on posters
displayed at intelligence agencies around the country back in the 1980's.
What are we to think of multiple components of independent origin
which nonetheless interact rationally?

For time and temporality to be genuine and real, successive moments
in time must not only follow from preceding moments, but also from the
aboriginal ground of existence giving birth to the moment, wherein this
ground is independent of the preceding moment as it underlies all
moments. A moment is an integral unit of time possessing a topology
more complex than the merely linear time in which this moment is
naively supposed to pass.
There may be something unique about deconstructive philosophy
which allows a thinker to access these islands of postmodern truth from
the sky, as it were, without requiring the normal access by sea, which, in
the case of the more paradoxically true propositions of this intellectual
movement, are not at all accessible in the normal way, i.e., "by sea".
You see, truth has a topology, which means that certain true
propositions or theorems may not ever be reachable along certain
networks of paths which connect certain other truths. And this topology
is apparently more complex than a flat plane!
For the dissociated, nothing is immediate; instead, everything is second
nature, as in speaking a second language.
This follows from an analysis of subjectivity, intersubjectivity,
objectivity and mysticism set against a backdrop of combating
solipsism. And, of course, all of this is informed by the very latest
cosmological theories and cosmologically based metaphysical
speculations pertaining to many-worlds quantum theory, the multiverse,
Boltzmann brains, Nick Bostrom's ancestor simulation argument and so on.
Mysticism leads to solipsism. The only defense against solipsism is a
form of anti-mysticism, which I term "Polysolipsism". In the absence of
a God-creator, we must admit that reality is a collaborative effort, one to
which you and everyone has contributed and to which everyone

continues to contribute. See this YouTube clip from the 1974 science
fiction film, Dark Star:
August 2014 fcbk=

As "Chuck Norris of Theoretical Physics", you are in a
unique position to judge. (I have witnessed you deliver many a
devastating roundhouse to crackpots beginning in my short memory with
a cavalcade of Usenet kooks in the early 90's and zero-point energy
crackpots a little later on, e.g., Haisch, Rueda, Puthoff.)... I gotta laugh
thinking back upon those who you regularly skewered back in the
day...people like "Archimedes Plutonium", Alexander Abian, Euejin
Jeong, Todd Desiato et al. Gott sei dank I never aspired to be as
crackpotty as they! :) On the other hand, Doc, predicting stuff before
someone else predicts it and receives their Nobel Prize has gotten to be a
bad habit with you! I think I understand where some of the energy of
those roundhouse kicks comes from! (To Dr. Jack Sarfatti)
August 2014 fcbk=

"Lead, as I do, the flown-away virtue back to earth— yes,
back to body and life; that it may give the earth its meaning, a human
meaning! May your spirit and your virtue serve the meaning of the earth.
. . . Man and man's earth are still unexhausted and undiscovered.—
"This epigraph is chosen quite deliberately. I run the risk of its seeming
to lend itself to a certain Christian, idealist, and humanist tone, a tone
in which it is easy to recognize those well-meaning virtues and values
that have loosed upon the world all the things that have driven the
humanity of our century to despair over itself, where these values are
both blind to and complicit in this letting loose. In his own way,
Nietzsche himself would have undoubtedly participated in this dubious,
moralizing piety. At any rate, the word "meaning" rarely appears in his
work, and still more rarely in any positive sense. One would do well,
therefore, not to give any hasty interpretations of it here. The above
excerpt appeals to a "human meaning," but it does so by affirming that
the human (l'homme) remains to be discovered. In order for the human
to be discovered, and in order for the phrase "human meaning" to
acquire some meaning, everything that has ever laid claim to the truth
about the nature, essence, or end of "man" must be undone." Jean-Luc
Nancy, "Being Singular Plural". I resonate with the "frequency
envelope", but not the "carrier wave", verstehst du? ("you understand?")

Entanglement, superposition, complexity threshold, decoherence,
translatability, dialectical, transactional, maximum fluctuation energy,
kernel, reprocessing, context dependency, discourse, Quantum Theory,
Philosophy of Mind, Epistemology, Philosophy of Science, Sociology of
Science, Christian Apologetics, Epigenetics, Cosmology, Artificial
Intelligence, Evolutionary Biology, Biochemistry, Chess Theory, Music
Theory, Middle East Politics, American Foreign Policy, Linguistics,
Deconstruction/Postmodernism, Phenomenology, Alvin Plantinga

In the unlimited fullness of multidimensional time, once the little
knobs on the cosmological radio receiver are tweaked to juuuust the
right frequency, one's "signal" is pulled into one's idiosyncratically
unique and tiny corner of the infinite multiverse, which is to say "the…"
Entanglement in a multiverse with no common space-time, S. J.
Abstract: “Inter-universal entanglement may even exist in a multiverse
in which there is no common space-time among the universes. In
particular, the entanglement between the expanding and contracting

branches of the universe might have observable consequences in the
dynamical and thermodynamic properties of one single branch, making
therefore testable the whole multiverse proposal, at least in principle”,
I like to think in terms of organism and systems-based analogies for
political power in multi-level society. Game theory is also a good
framework in which to think about political power and how it inflects
within and across those levels. More on that later. For something a little
more concrete in response, let me say that, as horrible as US Foreign
policy appears to the Chomskian far left, I believe that, had the great
might of the United States been in the hands of any other nation, culture
or people, it would have undoubtedly been applied with less self-imposed
restraint and greater recklessness . . . with all that this entails. I
acknowledge that there shall always be decisions that will have to be
made at a level "above my pay grade" and that I will always have to
maintain a certain level of trust in those above me. Deconstructive
thinking is good...up to a point. As language has evolved over the
centuries so have basic concepts and concept maps and networks. These
changes are subtle and don't fit within any one generation's
sociolinguistic concept map, but are distributed over a higher linguistic
dimension - time. Societal and political self-organization therefore
exhibits a significant amount of "causal supervenience".
Especially the discovery that we all reside within an ancestor simulation
of a billion-year-old galactic civilization... a kind of crypto "City and the
Stars"... "Thinking inside the box" is not exactly a metaphor in this case.
This comes from a consideration of the presumed Gaussian distribution
of extraterrestrial civilization ages in the galaxy being tied to the average
age of the civilization's parent star, and then also taking into account
where a merely five- or six-thousand-year-old civilization would fit on
that normal distribution curve... a curve with a centroid of perhaps a
billion or more years!

This represents a rather peculiar, but I think valid application of the
anthropic principle to the question: "are you living within a computer
simulation?" :p
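A back-of-the-envelope sketch of where such a young civilization would sit on that curve. The mean and standard deviation below are placeholders I chose for illustration, not data; the note itself only suggests a centroid of "perhaps a billion or more years".

```python
from statistics import NormalDist

# Hypothetical numbers for illustration only: assume galactic civilization
# ages are normally distributed with a mean ("centroid") of one billion
# years and an assumed 250-million-year standard deviation.
ages = NormalDist(mu=1e9, sigma=2.5e8)

our_age = 6_000  # a roughly six-thousand-year-old civilization

# z-score: how many standard deviations we sit from the centroid.
z = (our_age - ages.mean) / ages.stdev
print(round(z, 2))        # about -4.0: far out on the left tail

# Fraction of civilizations younger than ours under these assumptions.
print(ages.cdf(our_age))  # a tiny fraction
```

Under these (assumed) parameters, a six-thousand-year-old civilization sits roughly four standard deviations below the centroid, which is the outlier status the note is gesturing at.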
That is true, but a knee-jerk rejection of dualism is itself a variety of
thinking inside the box, especially when quantum mechanics all by itself
exhibits so many dualities: wave-particle, real-virtual, local-nonlocal,
deterministic-probabilistic, fermion-boson, particle-field, quantum-classical,
coherent-decoherent, matter-antimatter, discrete-continuous,
real-imaginary (for tunneling particle momentum, that is),
eigenfunction-eigenvalue, etc. The above, combined with the work of Dr.
Stuart Hameroff and Sir Roger Penrose concerning general anesthetics'
effects on the quantum mechanics of brain neural microtubule function and
consciousness *and* the decades-long observed reverse effects of
conscious observation upon the quantum behavior of particles and their
wavefunctions, means that there is too much scientific evidence in favor
of it to just dismiss dualism out of hand. Oh, and have you noticed how
"the new atheists", Dawkins, Hitchens, Dennett, Shepherd, etc. all base
their arguments in 19th century physics and biology, while critics of
Darwinism tend to argue from the standpoint of 21st century science?
Matter provides the boundary conditions on the fundamental quantum
field which enable the encoding of structures of quantum entanglement,
possibly similar to how a scaffolding or template can make certain
reactions more energetically favorable. Vacuum entanglement patterns
can occupy the interstices of the scaffolding provided by neural
microtubule networks. Boundary conditions on the vacuum
electromagnetic fields are required to initiate vacuum entanglement
states mirroring the physical particle entanglement states of the matter
supplying the boundary conditions. This mirroring, however, cannot be
complete, due to the inequivalence of entanglement contexts.
There is a complexity threshold for conscious thought and perception,
but this is for thought and perception admitted to, registered in, as it were,
consciousness; not for consciousness as such, which is fundamental
like any other quantum field. There may be a threshold for the ego; there
may not be one for reified egoic consciousness. Is egoic consciousness
accessing a pre-existing subspectrum or an emergent subspectrum of
consciousness as such?
One-dimensional time is a shadow and a projection of multidimensional
time, kind of like a temporal hologram. On this view, consider so-called
“temporal entanglement”, for which there is recent experimental
confirmation; c.f. the following Quora physics forum question from an
intelligent layman: “When measuring the spin of an entangled particle
and finding that its counterpart instantly takes up the opposite spin, how
do we know that they didn't possess their spin directions before they
were actually measured?” Viktor Toth, IT pro, part-time physicist, answers:
“This is the question behind Bell's famous inequality (see Bell's theorem).
The question is usually framed this way: is it possible that particles constituting an
entangled pair carry all information needed to determine the outcome of a measurement
along with them, in the form of "local hidden variables", which are revealed by the
measurement but have been present all along? For example, suppose you travel abroad
and upon arrival, you notice that you only packed half of each pair of socks that you own.
Immediately you will know that your socks drawer at home contains the same number of
unmatched pairs of socks. Nothing mysterious about this, and we don't need quantum
mechanics to explain how you "instantaneously" acquired information about your socks
drawer thousands of miles away. The thing with things like spin or polarization is different, in
a very non-trivial way. Suppose you release, in opposite directions, a pair of entangled
photons that are polarized. Let's say they are polarized vertically. That means that if you put
a vertically oriented polarization filter in either photon's path, both photons will pass through.
And they will both fail to pass through when the filters are both oriented horizontally. But
what if the filters are oriented at 45 degrees? Each photon has a 50-50 chance of passing
or not passing through. However, when you perform this experiment, you will find that the
photons will remain correlated; they will either both pass through or neither will, no matter
how far from each other the two filters are. OK, perhaps it is still possible that the photons
carried more than just information about their initial polarization. Suppose the two photons
are part of a secret conspiracy, having discussed all possible orientations of polarization
filters that they may encounter, and agreeing in advance on how they will behave. But the
experimenter is tricky. He now sets the two polarization filters, one at the end of each

photon's path, at different angles. Say, one at 45 degrees, the other at 135 degrees. He
then calculates the probability of one of the photons passing through and the other failing,
using one very simple assumption: that the photon on one end has no way of knowing the
setting of the polarization filter on the other end, and vice versa. This assumption alone is
sufficient to calculate a maximum value for a correlation function between the two
observations. And this value is violated in actual experiments, including experiments in
which the polarization filters were set only after the photons were already well on their way.
Which means that the outcome cannot be explained using merely information that the
photons possessed all along. They also needed information that was somehow
instantaneously communicated from the other end, in order to exhibit their correlated
behavior. This is the essence of Bell's inequality. And this is how we know that photons (or
electrons, the same argument applies) could not exhibit the behavior that they actually
exhibit, using only information that they possessed before the measurement took place.”
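The bound the quoted answer describes can be made concrete with a small numerical sketch. It assumes the textbook quantum correlation for polarization-entangled photons, E = cos 2(α − β), and the standard CHSH angle settings; neither of these specifics appears in the quote, so take this as an illustration, not Toth's own calculation:

```python
import math

def E(a_deg, b_deg):
    # Quantum correlation for polarization-entangled photons:
    # E = cos(2 * (angle difference)), with angles in degrees.
    d = math.radians(a_deg - b_deg)
    return math.cos(2 * d)

# Standard CHSH settings (degrees): a = 0, a' = 45, b = 22.5, b' = 67.5
a, a2, b, b2 = 0.0, 45.0, 22.5, 67.5
S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S)  # ≈ 2.828 (= 2√2), exceeding the local-hidden-variable bound of 2
```

Any local-hidden-variable model, by contrast, is limited to S ≤ 2; that ceiling is the "very simple assumption" calculation Toth alludes to, and the experiments violate it.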

Consciousness is the intuition of time according to Kant.
"Plato's Cave" - Google Search

Reinterpreting cold dark matter as a Bose-Einstein condensate... This complements an earlier research paper on the derivation of the magnetic field permeability and electric field permittivity constants in terms of the magnetization and polarization of the quantum vacuum, cf. “Reinterpreting dark matter”,
September 2014
“Finally, we have to define what mass density is. There are two quantities that are used to determine whether a quantity is a mass density: the energy density and the pressure (which is a density). A mass density has energy density with no pressure.
There are two intrinsic quantities of space-time: the cosmological constant and the space-time curvature.
The cosmological constant has a pressure that is equal and opposite to the energy density.
The space-time curvature has a pressure that is minus one third the energy density.
The point is that both of these objects that are intrinsic to space-time have pressure and therefore are not a mass density”, cf. ?
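The two word-statements in the quoted passage correspond to the standard equations of state (writing $w \equiv p/\rho$; the symbols are supplied here, not in the quote):

```latex
p_{\Lambda} = -\,\rho_{\Lambda} \qquad (w = -1)
\qquad\text{and}\qquad
p_{\mathrm{curv}} = -\tfrac{1}{3}\,\rho_{\mathrm{curv}} \qquad \bigl(w = -\tfrac{1}{3}\bigr)
```

In both cases the pressure is nonzero, which is why neither quantity counts as a mass density on the quoted definition.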
It is interesting in this connection that the Universe possesses three times as much energy in the form of “dark energy” as it does in the form of “dark matter”. The roughly 4% of energy in the form of ordinary mass, originally extracted from the quantum vacuum (which originally contains the total mix of energy in all of its manifest forms), must have been pieced out of the vacuum in some manner that is now reflected in the precise relative densities of dark matter and dark energy. This is similar to the relationship of the pressure and energy density of
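The "three times as much" figure above can be illustrated with round density fractions. These are purely illustrative numbers chosen to match the note's claim, not precise measured values:

```python
# Illustrative round fractions of the critical energy density
# (assumptions chosen to match the "three times" claim above):
dark_energy, dark_matter, ordinary = 0.72, 0.24, 0.04

ratio = dark_energy / dark_matter
print(ratio)  # dark energy roughly three times dark matter
```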

Investigate the concept of genome as "interface data" qua "self-organizing system kernel".
"abils and skillities" is an example of a trivial redrawing of a distinction
within a linguistic concept map.
Inflation through Quantum Tunneling from a False vacuum:
Theory suggests the speed of light is SLOWER than we think @MailOnline

That is the mistaken notion of the neo-Darwinian synthesis. The genome is a cybernetic control system.
Proteins are an expression of information in DNA. The DNA's information is not itself an expression of any information, because it constitutes a blind filtering of merely random changes (mutations) to its structure. Epigenetics complicates this model by introducing controls, which are anything but random, at both ends: epigenetic information modulates mutation, as well as alters patterns of expression of genetic base pair sequences as amino acid sequences, which must be folded in only relatively very few of many trillions of possible ways in 3-dimensional space to function properly within the cell. By way of epigenetics, metainformation becomes involved, and although the path of information within the cybernetic control system of the genome-protein system may indeed be one-way in the absence of meta-levels of information, i.e., for information confined to a surface (a single level), with the emergence of multi-level information in the system, i.e., meta-information of the genome-epigenome-histone-ribosome-cytoplasm-epigenome-genome system, information originating at the genome is no longer confined in its path to unidirectional arcs along a surface, but is allowed to move across this surface. Topology here is perhaps metaphorical, but serves to illustrate that a paradigm change in evolution theory shall likely be required, one that a growing number of mainstream biologists believe cannot be successfully accommodated within the current paradigm of the now 80-year-old "Neodarwinian Synthesis".
Once I get my laptop back from "computer heaven", I will include
appropriate screenshots to further illustrate.
Before we sneer at the role of the metaphors employed by the theorist, we should realize that all concepts upon which theories are based are in reality only specialized metaphors, which are artificially restricted in their reference, application and context. But the ground of these three restrictions is dynamic, and shall inevitably back-react upon the theorist. In some cases this back-reaction can be stably maintained for a time, though never indefinitely, the great success of the quantum theory

Is the deck from which evolutionary mutations are dealt, stacked?
Google search
"The Human Brain Project" is a little bit like committing in 1920 to
landing a man on the moon by the end of the 1920's." See 130
Neuroscientists' "Open message to the European Commission
concerning the Human Brain Project". . .
I think theism with the appropriate amount of humility admixed must needs be an "agnostic theism". The theism that "knows that it knows the truth" shall always pose a danger to the advancement of human civilization . . . on myriad levels!
The most profound, or shall I say, profoundly satisfying, lyrics are, of
course misheard lyrics because the author's inspiration is forever secretly
infused with the subconscious' heart's desire.
The tension between reason (as opposed to rationality) and mysticism is
ultimately soluble. The catch is that the only proper context for this is
transcendental mind, which is essentially (nota bene) *transpersonal*.
"Managed compassion is the inevitable solution hit upon by the
metaguilty conscience, which feels keenly the chronic nagging guilt of
not feeling guilty."
Natural selection is hamstrung by the fact that incipient structures are of
absolutely no use to the evolving organism in its competition for limited
resources and mates.
If it is the most bloodthirsty tribes, races and subspecies which win at every turn in the evolutionary competition, then why have morality and ethics emerged at all in the course of man's evolution?

Without transcendental mind, and with only evolution and natural selection to guide us, there is nothing standing in the way of a thoroughgoing epistemological solipsism, even given the fact that members of a breeding population share the same physical environment and, in the specific case of human beings, a common language.
Wittgenstein's "family resemblance" and the concept of consciousness.
Only a minority of people possess a fate, but of those that do, it is
encoded within their very name.
Imagine that thousands of professors were originally led to academia because they believed that they possessed inspiration, maybe even genius, but in the workaday, on-the-ground reality of graduate student life they learned otherwise. Would there not then be a great motive to try to unseat the notion of genius? Isn't this really the true origin of the deconstructive impulse?
“What’s the best way to kill a lobster? Is there any mercy?
No mercy for the weak! Just kidding. Some experts claim lobsters die
within seconds of being submerged in boiling water (but oh how terrible
those few seconds), and others claim their nervous system is too
primitive [italics, mine] to allow for very much pain. The debate is
ongoing and best summed up by David Foster Wallace in this essay…”
June 2014

Russell Clark Dad began his service to this great country as a private in
the Army Air Corps while WWII still raged on, served with great
distinction as a Marine Corps Platoon Leader during the Korean War
(refusing to be medically evacuated due to combat injuries and
remaining on the battlefield to lead his men) and then with equally great
honor and distinction served as an Artillery Battalion Commander in
Vietnam. I am so proud to be his son!

April 2014

Paul Davies' argument, which is a restatement of Bertrand Russell's argument that all of us could have popped into existence 5 minutes ago, complete with false memories of an earthly life extending back many years, fails because of considerations of the context dependency of meaning, also in light of Wittgenstein's private language argument, and perhaps also in light of quantum decoherence theory. Duration in relation to the building up of complexity over time is very much dependent upon the Planck mass-energy limit of fluctuation size and the density of Planck-magnitude or smaller fluctuations, not just in terms of the scalar equation for the Heisenberg time-energy uncertainty principle.
May 2014
Of course, what Russell is making sport of is the absurdity of a perfect illusion of meaning in the utter absence of a temporal context. The “five minutes ago” just obscures this realization. A five-minute interval of human biography cannot just “hang in the void” context-free.
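The scalar uncertainty relation invoked above is the usual time-energy form, which ties the maximum lifetime of a fluctuation to its energy:

```latex
\Delta E \,\Delta t \;\gtrsim\; \frac{\hbar}{2}
\qquad\Longrightarrow\qquad
\Delta t \;\sim\; \frac{\hbar}{2\,\Delta E}
```

Larger-energy fluctuations are thus shorter-lived, which is part of why the density of small fluctuations, and not the scalar relation alone, matters for building up complexity.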
What is the connection between the private language argument and Penrose's one-graviton limit for quantum decoherence? Considerations of topology lead us to presume that the brain cannot bootstrap itself into self-awareness through use of a private linguistic structure of thought; the structure of thought must be sociolinguistic in order for this bootstrapping process to be successful. If one were a Boltzmann brain, one would not be aware of this fact, in other words. Investigate "Fine Tuning Argument Fail" on YouTube. Does this argument itself fail? The multiverse channels us from out of the void, anthropically fine-tuned
The state of the observer or experimenter does not become quantum entangled with the apparatus that he is seeking to adjust in order to perform a characteristically quantum-mechanical experiment, such as that of nonlocally connected or quantum-correlated detector clicks. In general the nonlocal connectivity exhibited by the mind of the quantum observer is not traceable to an aggregation of historical quantum entanglement between the observer and the external world. In other words, at least in part, the quantum entanglement of the observer's mental processes is internal and internally determined.
A generalized description of the freedom which the observer possesses
in determining the experimental conditions and the settings of his
quantum observational apparatus is his freedom to select a system of
basis vectors, i.e., a *basis* for expressing the observables measured in
the experiment.
Look at internal and external conditions for the breakdown of Bohm's causality principle vis-à-vis decoherence and wavefunction collapse.

Wigner's friend"" and "quantum solipsism".

If quantum experimenters did not possess human free will then they
would only be able to select bases for quantum measurement that they
themselves were embedded in. And this vector basis would be the only
basis possible - it would be the same quantum basis that every other
person was embedded in. Here we see the connection between the
consciousness of the ego and the question of other minds and the
external world.
"It seems that nature is somehow communicating under the surface but
not allowing us to use that communication to actually transmit
information and this is something that doesn't happen in so-called
classical theories and is one of the key features of quantum mechanics”,
"Physics Math Charlatan" on “Instead of the
communication of information we might speak of a jointly actualized
meaning”, cf. Cybersemiotics: Why Information is Not Enough.
Yet another connection between gravitation and consciousness in terms
of the basis vectors of a quantum measurement.
Human brains are always tapping into the reprocessed collective
Akashic record. Once in a while, however a brain taps into a copy of an
as yet unreprocessed experience fragment.

In the same way that God transcends the dual opposite categories of existence vs nonexistence, he also transcends the category of being, in the sense of not being a mere instantiation of being as an abstract category. After all, God is the ground of being and of abstract form.
Analogous to the self-existence of God is the self-existence of intuitively grasped concepts, for which we only have metaphorical justification. This brings up the puzzling and even paradoxical nature of the concept of the
Certain questions are so big that we can only hope to answer them by finding a more proper formulation of the question. Frequently we ask a question about one thing and formulate the answer in terms of a different kind of the same thing. Substance is always presupposed by form, but then the question of the one versus the many in relation to the concept of substance becomes problematic. The notion of there being a general concept of substance is itself very problematic, because if there are multiple substances, what is there which is more fundamental than the multiply distinct substances such that they are instantiations of this general substance? Does this imply that there must be something more general than substance in order to distinguish and categorize different
Multiple instantiations of consciousness in the absence of a concept of consciousness seems to invoke Leibniz's principle of pre-established harmony.
Google search: nonlocality and the principle of pre-established harmony
A trajectory of least energy would be decomposed in terms of a series of steps which are resonance points, and this would be consistent with Fourier analysis theory. This may explain the successful climbing up the hill by a complex system within a single local region of a rugged fitness landscape, but without nonlocality and quantum entanglement we cannot explain the climbing down from a local maximum in order to create the opportunity for ascending a neighboring greater local maximum.

group selection, sexual selection

October 2014

Teleology without intention requires two dimensional time and
self-organizing properties of particles and fields, which cannot be
captured within a linear deterministic causality, cf. Extended
Criticality, Phase Spaces and Enablement in Biology. Excerpt from
abstract: “We argue that one major aspect of biological evolution is the
continual change of the pertinent phase space and the unpredictability of
these changes. This analysis will be based on the theoretical symmetries
in biology and on their critical instability along evolution.”
Chance vs determinism... are they mutually exclusive, and/or is there a third option or way? wiki = “The problem can be expressed as follows. Suppose that a sea-battle will not be fought tomorrow (for example, because the ships are too far apart now). Then it was also true yesterday (and the week before, and last year) that it will not be fought, since any true statement about what will be the case was also true in the past. But all past truths are now necessary truths; therefore it is now necessarily true that the battle will not be fought, and thus the statement that it will be fought is necessarily false. Therefore it is not possible that the battle will be fought. In general, if something will not be the case, it is not possible for it to be the case. This conflicts with the idea of our own free will: that we have the power to determine the course of events in the future, which seems impossible if what happens, or does not happen, is necessarily going to happen.”
Imagine a rugged fitness landscape composed of generations which leave behind varying numbers of descendants. We might need to bring in a second time axis in order to construct an accurate rugged fitness landscape if we are going to take into account relative multi-generational success at reproduction. This could be modeled in terms of the strength of back-reaction signals from later generations, with the strength of the signal being a function of both the number of descendants in the proximate generation and the integration of distal generations in terms of number of descendants.
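The back-reaction model sketched above might be expressed, under wholly hypothetical assumptions (a geometric decay of signal strength with generational distance, and a simple weighted sum for the integration over distal generations), as:

```python
def backreaction_signal(descendants_per_generation, decay=0.5):
    """Hypothetical back-reaction signal strength for a founding genotype.

    The proximate generation contributes at full strength; distal
    generations are integrated with geometrically decaying weight.
    Both the weighting scheme and the decay rate are assumptions.
    """
    signal = 0.0
    for k, n in enumerate(descendants_per_generation):
        signal += n * decay ** k
    return signal

# A lineage with 3 children, 7 grandchildren, 15 great-grandchildren:
print(backreaction_signal([3, 7, 15]))  # 3 + 3.5 + 3.75 = 10.25
```

A second time axis would then enter through the generational index k itself: the fitness value at a point on the landscape depends on signals arriving "from" generations later in causal time.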
What component of Bohm's causal principle is represented by
differential equations versus integral equations versus some unknown
form of equations belonging to neither class?
Future contexts may be in competition with previous contexts as well as
competing with multiple parallel distinct contexts. We don't have to
invoke creationism or intelligent design to explain teleology, we merely
have to invoke a more sophisticated model of temporality.
“When he contemplates the perfidy of those who refuse to
believe, Dawkins can scarcely restrain his fury. "It is absolutely safe to
say that, if you meet somebody who claims not to believe in evolution,
that person is ignorant, stupid or insane (or wicked, but I'd rather not
consider that)." Dawkins went on to explain, by the way, that what he
dislikes particularly about creationists is that they are intolerant. We
must therefore believe in evolution or go to the madhouse, but what
precisely is it that we are required to believe? "Evolution" can mean
anything from the uncontroversial statement that bacteria "evolve"
resistance to antibiotics to the grand metaphysical claim that the
universe and mankind "evolved" entirely by purposeless, mechanical
forces. A word that elastic is likely to mislead, by implying that we
know as much about the grand claim as we do about the small one.” –
Phillip E. Johnson, Darwin on Trial
August 2014

I am not a Creationist, and I could only with considerable qualification be termed an “Intelligent Design” advocate. What I do believe is that, analogous to the String Theory of theoretical physics, which postulates the compactification of spatial dimensions, but only at the tiniest of scales, i.e., at the “Planck scale”, I postulate compactified temporal dimensions, which constitute the feedback paths between mutually exclusive branches of a quantum superposition of atomic or molecular states. The connection of this feedback is instantaneous within our single dimension of causal time, and is characterized as instantaneous quantum entanglement. But if this notion of “instantaneous connection” is somewhat disagreeable, then perhaps quantum entanglement is but a manifestation of a supracausal process taking place in a higher dimension of time, i.e., within a 2nd time dimension. The collapse of a quantum superposition is in certain cases effected along the lines of a quantum computation in which the new eigenstate constitutes the output of the computation.
The Origins of Order
Can multiverse physics and the implied multidimensional temporality
thereof simulate teleology?
The absurdity of Goldman's Quantum Jumping paradigm may actually
be implied by the inner logic of the Anthropic cosmological principle.
We cannot deny teleology, that is, its operation in the evolution of
greater biological complexity as well as in the more foundational
chemical evolution. This is because of the obvious self-organizing
properties of atoms and molecules.
Communications received from our closest, most intimate friends, soul mates, life partners, mentors: these have a distinctive quality, namely, we cannot imagine that these communications could somehow secretly be the invention of our own minds, whether subconscious or unconscious.
What is called reason necessarily transcends subjectivity and so points to the reality of being beyond the subjective, namely that of an objective world and other minds. The cash value of objectivity is intersubjectivity. Here again we see how reason points to the reality of being beyond the merely subjective.
Grammar relates states on a state-space that is of a manifoldly connected nature. The topology of temporality is here key.
The contingency of language (see Richard Rorty on this) is importantly
connected to the concept of grammar and Chomsky's nativist
understanding thereof. Somehow the gulf between distinct minds is
overcome and information may actually be passed between minds, but
this is necessarily facilitated by way of grammar. Semantics is to
context as syntax is to. . .
Grammar functions as filter and ultra-hard encryption along the same
lines as how a web page uses a captcha image.
“Our sense organs and our brain operate as an intricate kind of filter
which limits and directs the mind’s clairvoyant powers, so that under
normal conditions attention is concentrated on just those objects
or situations that are of biological importance for the survival of the
organism and its species . . . As a rule, it would seem, the mind rejects
ideas coming from another mind as the body rejects grafts coming from another body.”

—Cyril Burt (1883-1971), Professor of Psychology, University College, London

The moral of the story here is that the above is an instance of an "excluded middle" between chance and necessity, or randomness and determinism. The only "thing" that I can imagine that populates this logical excluded middle is consciousness. (I mean, think about it... a computer with randomly switching circuit elements couldn't be conscious, nor could a machine with circuit elements that only switched in a predictable pattern, according to some mathematical formula or equation.) I suspect this hints at a breakdown in the presumed strict analogy between the determinism of physics and the logical necessity of mathematics. Say, because what we take to be equations describing unchanging, law-like physical behavior are indeed only approximations. It has been said that there has to be something that "breathes fire into the equations of physics" to make them "about something" instead of their being entirely abstract. What breathes fire into the equations is something that we intentionally ignore or just fail to notice. But what we ignore is grounded in what we fail to notice.
Without Kant's
If the concept of being transcends that of existence, does this provide support to the ontological argument?
However, the 2nd Law of Thermodynamics is 1) only a statistical law, not fundamental like, say, the law embodied in Maxwell's Equations of Electromagnetism; 2) applicable only to *isolated systems*; and 3) premised on the assumption that any thermodynamic system is composed of discrete states such that combinations and permutations of the arrangement of these states (or elements) *exhaust* the possibilities for the system. Fluctuations in the quantum vacuum field are limited in size by what is called the Planck Mass Limit, given by: 1.2209×10^19 GeV/c² = 2.17651(13)×10^-8 kg (or 21.7651 µg). So a Boltzmann brain could only appear out of the quantum vacuum within a preexisting physical brain as a new configuration of the quantum states of that brain's tubulin dimer network of 21.7651 micrograms mass-energy equivalent or less. On this view,
there would be no actual "Boltzmann brains", merely Boltzmann brain
continuants of the integral quantum entangled global tubulin dimer
network states of an actual brain. These Boltzmann brain continuants
would correspond to temporal bubbles of consciousness, i.e., of the
"specious present" and each BB continuant would be informed by
previous BB continuants through the quantum vacuum filtering action
and resonant frequency tuning *an actual brain*. Only BB vacuum

fluctuations that are most compatible with the intact memory traces
within a given person's brain could likely become a subsequent BB
continuant of that person's consciousness. (In other words, the
experiential continuity of lived conscious experience would not be
merely some pernicious illusion) I think something along these lines can
tame the bizarreness of Boltzmann brains predicted by inflationary
cosmology and string theory and reconcile these predictions with human
intuition and common sense. Of course, there are still intriguing
implications of this slant on BB's, but they would not be of the
ridiculous sort that has appeared in recent popular scientific and
philosophical publications, etc. Somewhat along the lines of Descartes' "Cogito Ergo Sum" idea, we know that the brain of a conscious person is *not a closed or isolated system*, since, if it were, there would be no context for what is happening inside his brain, and "without context there is no meaning", and that person would not actually possess conscious states of awareness (necessarily implied by understanding or experiencing "meanings"). Also, if the brain were an isolated system, then two identical brains could not imply the presence of two distinct conscious minds. However, add the fact that each is embedded or grounded in a historically distinct context of embryological development, etc., and this seeming paradox is handily resolved.
Human creativity moreover implies that the system of the brain (or
whatever serves as the physical substrate for the conscious mind), cannot
be just a changing configuration of exclusively discrete, unchanging
states for otherwise the changes in the configuration of this underlying
system would be causally deterministic - a situation in stark
contradiction with human freedom and creativity. An objection here is
that human freedom and creativity may in fact be an illusion. But the
objector must acknowledge the disqualifying caveat here that, if human
freedom and creativity are illusory, then so is consciousness and again,
turning to Descartes' "Cogito", we can confidently assert that "we know
that ain't true!" c.f.,
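The Planck-mass figure quoted earlier can be checked with a one-line unit conversion; the GeV-to-kilogram factor below is the standard CODATA value, and the conversion shows the mass comes out in micrograms:

```python
# Check of the quoted Planck-mass figure.
GEV_PER_C2_IN_KG = 1.78266192e-27   # 1 GeV/c^2 expressed in kilograms (CODATA)
planck_mass_gev = 1.2209e19         # Planck mass in GeV/c^2

planck_mass_kg = planck_mass_gev * GEV_PER_C2_IN_KG
planck_mass_ug = planck_mass_kg * 1e9   # kilograms -> micrograms
print(planck_mass_kg)  # ≈ 2.1765e-8 kg
print(planck_mass_ug)  # ≈ 21.765 micrograms (micrograms, not nanograms)
```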

What breaks the degeneracy of real brains vs Boltzmann brains? May 2014
Well, according to what was just said concerning the Planck limit on
vacuum fluctuation size, it must be: an already broken scale invariance,
i.e., simulacra at different spatiotemporal scales are necessarily
functionally inequivalent. Instantiation (of a concept) is a concrete
“I do not fear death. I had been dead for billions and billions of years
before I was born, and had not suffered the slightest inconvenience from
it.” - Mark Twain
However, the fact that one is conscious, rather than being a
"philosophical zombie" perhaps implies that the Universe, Multiverse or
whatever one terms this realm of being has registered (and is in the
continual act of registering) one's temporal existence and so Lucretius'
"symmetry argument" may indeed fail, i.e., in favor of the proposition
that the nature of nonexistence *before* and *after* one's brief Earthly
existence are identical. Lucretius' "symmetry argument" carries the
assumption that no metaphysical work is performed by the act of human
existence - an assumption that I suspect is altogether unfounded in the
light of such modern physical principles as are embodied in the quantum "no-cloning" theorem, vacuum entanglement, decoherence theory, the two-slit experiment, "Wigner's friend", the Penrose-Hameroff "ORCH-OR theory", and so on. Lime Cat: Another possible breakage of Lucretius'
symmetry: if each person is "called forth" from a *distinctly different*
abyss, (this would certainly beef up the ethics of mutual respect, that is,

such a transcendent "alterity of the other"), then nonexistence would not
possess a simple structure and so likely would possess no such
"Lucretian symmetry" as Twain is glibly presuming.
You could say..."truth splits this degeneracy". But without a
transcendent observer, there's still this degeneracy of ontology vs
epistemology. Infinite regress.
God and reified individual consciousness are two sides of the same coin
qua projections from sociolinguistic constructs.
Boltzmann brains in the vacuum require bootstrap boundary conditions
on the quantum field.
Margenau says quantum superposed states are always of a given quantum mechanical system of matter and fields; vacuum boundary conditions must be present.
The first cause dilemma placed in a quantum mechanical context: the vacuum cannot provide boundary conditions for itself. It cannot bootstrap itself, in other words.
The back-reaction of complex, extended physical structures upon the quantum vacuum transcends the configuration complexity possibilities for fluctuations of this vacuum.
Mutual recognition of behavioral genetics.
Ziad asks, "Could god recreate same person twice? Let’s assume x exist
and then god destroyed it and recreated it again. Is that same x or a
similar x? Is it a duplicate x? No way has the original x come back after
Since a physical duplicate of your body, down to the identical individual
atoms is not the same as you, (by the famous "no-cloning" theorem of

Quantum Mechanics), which is to say, not truly identical to you, then
therefore the true underlying basis of your identity is not to be sought in
the configuration of some particular set of atoms and molecules. I ask
you, if God had managed, hypothetically speaking, of course, to create
you for the first time, then why would it be *more difficult* rather than
less so, for Him to create you once again? After all, He didn't have a
template to start with the first time, but He now has a template to work
from for a second time around. In a word, how does a feat which was
possible on the first occasion become impossible to reproduce on
subsequent occasions? It is our subconscious acknowledgement of the
essential irreversibility of the act of human existence which is secretly at
work here in this discussion. You and I, because of our distinctly
different philosophical prejudices relating to how we regard this
irreversibility of each act of individual human existence, from "opposite
ends of the binoculars", as it were, view the initial human incarnation as
either enabling or disabling the possibility of a second human existence.
Also we come to the problem with different views of temporality. You
see it's not an instantaneous configuration of spatially discrete elements
which provides the underlying ground of being and becoming
(sustainment of being over time) for one’s personal identity. Although
one's being possessed a beginning in time, the very ground of one's being
lies altogether outside of time. Via application of the Solipsistic
Cosmological Principle, one enters the realm of time and space at its
own good pleasure. The immortality of, or better said, *the eternity of*
the ground of your being is a much more certain proposition than is the
external world and the other minds which appear to populate
said world.
August 2014 fb=

Lime Cat Concerning the "re" in reincarnation. I think it's
high time (no pun intended) for philosophers to step out of the box on
this question. If there's anything that the anthropic cosmological
principle and multiverse theory (not to mention quantum mechanics and
relativity theory) have taught us about time, it's that we should not confine
ourselves to the notion of a unitary cosmic time/temporality. You got
here for the first time in *this* timeline. Since there are an unlimited

number of other times (completely distinct and disjoint) from this time
(i.e., the time that you are in now), it follows that you can always come
into being "for the first time" an unlimited number of times. . . and this
would not be to invoke the phenomenon of "reincarnation" in any way. .
. because all of these distinct times in the multiverse *have nothing
whatever to do with each other*. It's soooo hard to let go of the notion of
there only being a single metaphysical timeline, one that somehow
coincides at once with the temporality of the subjective egoic self, that
of other selves and with that of the cosmos as a whole, but once one
glimpses how to metaphorically "shuffle off" that philosophical "mortal
coil", as it were, by effecting the implied mental trick here, then one’s
doubts about the eternal nature of the self (read: the transpersonal as
opposed to the tiny, "myopic", egoic self, namely, the one that formed
the original infrastructural ground of one's being) utterly fall away. But
alas, this leap of insight is just too hard for most people and even many
geniuses are not capable of it. . . because it's not a question of sheer
intelligence so much as it is a question of there being a certain requisite
level of imaginative openness.
The philosophy of empiricism is the linear, zero-threshold extrapolation
of knowledge acquisition.
Flowers for Algernon and the essentially social nature of qualitative, as
opposed to quantitative knowledge acquisition.
Somewhat in the spirit of Frank J. Tipler's "Omega Point": The whole
"burning in hell for an eternity" concept is perhaps just a kind of
metaphor, though perhaps not an altogether groundless one. This
metaphor has undoubtedly functioned in myriad diverse cultures over
these many centuries as a strategic mechanism of social control and as a
scare tactic to compel compliance of the masses and individuals alike
with the disguised agendas of social elites. But the notion of hell is
perhaps also in part aboriginally the product of the fevered imagination
of some particularly gifted Bronze Age Levantine tribesman-poets’
interpretation of the horror of unending separation from God. Because I

don't really believe in the God of most rabid atheists’ impious
imaginations, there is really not that great of a difference of opinion
between them and me. In the spirit of an attempt to reconcile my semi-orthodox Christian beliefs with a scientific world view, I interpret the
whole “separation from God” trope for eternal damnation in light of the
following eminently quantum and therefore scientific notions: quantum
entanglement, quantum vacuum, quantum nonlocality, quantum
tunneling, cavity quantum electrodynamics, and I shall include one
additional concept derived from the late classical physics of Maxwell
and his famous equations of electromagnetism, namely that of the
"Gibbs Phenomenon". Applying the interesting and pervasive physical
phenomenon of Willard Gibbs to a quantum tunneling wavefunction in
which the barrier is reinterpreted as that of God-man separation in the
afterlife in which "eternity" equates with the infinite height of the barrier
- required to prevent penetration of the “quantum brain” wavefunctions
of unforgiven sinners into the “heavenly ground state vacuum”. The
eternity or *everlastingness* of the separation is basically needed to
prevent the wavefunctions, describing the quantum- vacuum-embedded
functioning of the brains of atheists and Satan's fallen angels from
quantum tunneling into heaven’s vacuum ground state and threatening to
disrupt Jesus' administration of life within heaven, effectively amounting
to an intrusion of sin nature. (A potential energy barrier of finite height,
however large, permits eventual probabilistic quantum tunneling of
wavefunctions through the barrier.) And if conscious mental
functioning, i.e., cognition and will are conditioned by a preexistent
ground of being within the quantum vacuum that is spontaneously
productive of virtual Boltzmann brains as a kind of necessary
percolation process within the brain of each hominid ape, so as to endow
it with self-awareness and a free will, then the only way to exclude
undesirable Boltzmann brain vacuum fluctuations, so as to protect the
new, “glorified brains” of those bodily resurrected in heaven, would be
to erect an infinite potential barrier, rather than a merely large, finite
one, thusly separating fluctuations of “good quality” or “fidelity” from
those of “bad fidelity”. Otherwise, the Gibbs Phenomenon (applied to
the quantum vacuum electromagnetic field) would permit eventual

probabilistic “leakage” or quantum tunneling of the undesirable quantum
information borne on the inwardly seeping nonlocally connected
vacuum field fluctuations from outside the barrier. On this view, heaven
is then an exercise in cavity quantum electrodynamics writ large. An
implication of this view is that those who have been resurrected on the
“dark side” of this potential barrier would possess brains that could only
resonantly tune to Boltzmann brain fluctuations within the global
quantum vacuum field that are of lesser stability and fidelity, i.e., that
are noisier or “more entropic”. The integrity of function of such brains,
resurrected on this dark side would, we imagine, be much more prone to
problems of internal regulation. It is easy to imagine that anxiety, fear
and hate, etc., and other negative emotions, as well as disruption and dis-coordination of other psychic faculties, might well be the inevitable
result for such unfortunate souls. Bodily resurrection on this view would
be a general physical phenomenon much like that underlying one’s
original though nonetheless equally “miraculous” earthly embodiment,
and would be in its universality akin to "the rain that falls on the just and
unjust" alike, as befits any physical phenomenon, e.g., hurricane,
tsunami, etc. A precondition to resurrection in a “glorified body” might
correspond to a kind of “scrubbing” of the physical interface between
the saved soul’s brain, or between his brain microtubule tubulin dimer
network and his unique subspectrum of vacuum electromagnetic field
fluctuations, performed while he is still alive, say, as a result of the
influence of a new subspectrum (corresponding to the action of the
“Holy Spirit”), such that a kind of preparatory frequency filtering is put
in place, providing the underlying basis for psychic continuity, through
conserved quantum entanglement encoded information, into the “up” or
“good” side of the infinite potential barrier partitioning the “eternity” of
the resurrection world. Since preexisting quantum vacuum correlations
between two points within spacetime are always necessary to support
any causal connectivity between these points, there will no longer be any
possibility for causal interaction between opposite sides of the infinite
potential barrier, i.e., the Gibbs Phenomena of quantum vacuum
wavefunctions would be altogether “squelched.” Namely, there can be
no correlations between fluctuations of the quantum vacuum on opposite

sides of the infinite potential barrier. This is reminiscent of the fact of
Schrodinger’s cat becoming so isolated in his black box in order to be
placed into a superposition state that the quantum processes underlying
the biochemical function of the cat in its “living branch” of this
superposition, e.g., the quantum superposition of wavefunctions that
supports all covalent bonding activity, would be effectively shut down.
And because it is the spontaneous energy fluctuations of the quantum
vacuum which are responsible for the Heisenberg energy uncertainty, ΔE,
which in turn produces all temporal change, i.e., Δt ~ h/ΔE, by virtue of
the complete separation of these fundamental field fluctuations into two
distinct partitions, it follows that time and temporality in “heaven” will
have become totally separate from time and temporality in “hell”. Such
a bifurcation of the time line would effectively be irreversible because
the event of bifurcation would not be contained in either timeline.
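The finite-versus-infinite barrier contrast invoked above is standard textbook quantum mechanics: for a rectangular barrier of finite height the transmission coefficient is strictly positive, however high the barrier, and vanishes only in the infinite limit. A minimal numerical sketch (Python; illustrative natural units ħ = m = 1, and the specific energies and widths are invented for demonstration):

```python
import math

def transmission(E, V, a, hbar=1.0, m=1.0):
    """Exact transmission coefficient for a particle of energy E < V
    incident on a rectangular barrier of height V and width a."""
    kappa = math.sqrt(2.0 * m * (V - E)) / hbar
    s = math.sinh(kappa * a)
    return 1.0 / (1.0 + (V * V * s * s) / (4.0 * E * (V - E)))

# Any finite barrier leaks: T > 0 no matter how large V is.
# Only an infinite barrier squelches tunneling entirely.
for V in (2.0, 10.0, 100.0):
    print(f"V = {V:>6}: T = {transmission(1.0, V, 1.0):.3e}")
```

The transmission falls roughly exponentially with barrier height and width but never reaches zero for any finite V, which is precisely the point of the parenthetical above.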

I cleaned this up and edited it some. See also, Frank J. Tipler's Tulane
University and personal webpages. I will probably end up emailing him
the final version of this as it contains some ideas that he might find
useful. If there are any questions, I will be happy to elaborate, since I
have glossed over some details. The guiding insight for me is the
consideration that our workaday world of conventional common sense is
underlain, if you will, by a fundamental fluctuation field, i.e., the
quantum vacuum electroweak nuclear field. So relating aspects of some
important area of cultural common sense, e.g., Christianity, or Organic
Gardening, American Football, competitive billiards, etc., to this
fundamental field is not so ridiculous a proposition as it at first seems. All "ordinary"
phenomena are emergent from the universal quantum field. As Alan
Watts used to like to say,
"You didn't come into the world, you came out of it!"

“In digital signal processing applications there is sometimes an effect
known as the Gibbs phenomenon, which is a characteristic ringing
associated with sharp edges and transients. Is this a function of
sampling, quantization or filtering in the system?”
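The quoted question has a well-known answer: Gibbs ringing is a consequence of filtering, i.e., of truncating the Fourier representation at a sharp cutoff, not of sampling or quantization as such. A minimal numerical sketch (Python with NumPy) showing that the overshoot of a truncated Fourier series of a square wave does not shrink as more terms are kept, converging to roughly 9% of the jump:

```python
import numpy as np

def square_partial_sum(x, n_terms):
    """Partial Fourier sum of the unit square wave sign(sin x):
    S_N(x) = (4/pi) * sum over odd k of sin(k x) / k,
    keeping n_terms odd harmonics."""
    s = np.zeros_like(x)
    for k in range(1, 2 * n_terms, 2):  # odd harmonics 1, 3, 5, ...
        s += np.sin(k * x) / k
    return 4.0 / np.pi * s

x = np.linspace(0.0, np.pi, 20001)
for n in (10, 100, 500):
    peak = float(square_partial_sum(x, n).max())
    # The peak overshoot persists as n grows, tending to
    # (2/pi) * Si(pi), about 1.179 times the half-jump.
    print(n, round(peak, 4))
```

Keeping more harmonics narrows the ringing but does not reduce its height; that persistence is the Gibbs phenomenon, and it appears whenever a discontinuity passes through an ideal (brick-wall) band limit.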
There is no "twice". The ground of one's being is eternally given only
once...but there are unlimited manifestations of time. The only reality is
the absence of limitation. The question becomes whether this absence is
a one or indeed a many. Each soul possesses its own eternity.
May 2014

Sam and I agreed long ago, that this absence is a plurality, which
in mutual collaboration has produced a common metaphorical ground.

Each metaphor begins as a creative leap of insight before it inevitably
cures, hardening into an impenetrable ball of a stock metaphor.
Humanity can perhaps be fundamentally divided into two groups, the
one, formed of individuals who bring new metaphors into being, and the
other, formed of those who consort and trade exclusively in stock
metaphors. It has been noted that the figurative and the literal take each
other's place in dreams. So is there then no real basis for comparison
between the two groups other than biology? (Fundamentally similar up
to the very point at which they are irreconcilably different?) The poet,
the artist and the philosopher indeed find themselves strangers within a
strange land.

“the information contained in a retroviral gene is used to generate the
corresponding protein via the sequence: RNA → DNA → RNA → protein.
This extends the fundamental process identified by Francis Crick, in
which the sequence is: DNA → RNA → protein.”

December 2012

By analyzing the DNA of both parents and that of the child, it
can be determined how many so-called germ line mutations are present
in the child’s DNA. By this means it was established that
approximately 40 such mutations typically occur between generations.
The vast majority of these mutations cause no changes in the types of
proteins that are produced, i.e., they are not “point mutations” and are
mostly harmless because their gene regulatory implications only
manifest themselves over the course of an extraordinarily long life span,
interacting as they would with patterns of gene expression not heretofore
selected for, which is to say manifesting the biochemical self-organizing
properties of matter and not the higher level properties of environmental
context sensitivity. So-called harmless or "silent" mutations are far more
efficaciously interoperative with gene regulatory structures that lie still
far in the future than indeed they are with respect to currently existing
structures, which points up the providential nature of the self-organizing
properties of atoms and organic molecules as a kind of preexisting

infrastructure that promotes and enables evolution. Quo vadis, Peter, er,
uh, Darwin?
April 2014

Research the difference between the hypothetical observation of
an observer versus the observation of a hypothetical observer versus the
hypothetical observation of a hypothetical observer in terms of
wavefunction collapse.
March 2014

The true sceptic possesses no agenda or preconceived notion of
what reality is and plays the devil’s advocate though always without
taking sides. The so-called sceptics of the late 20th and early 21st
centuries are almost uniformly materialist ideologues or evangelical
atheists.
The puzzles and paradoxes which result if there is no thing in itself
must be understood, as must the two distinct and interrelated senses of
this phrase, "thing in itself". It is clear that the thing in itself must be
indifferent to the passage of time and is therefore the originator of
temporality.
Voice-to-text is suddenly operating through the use of synonyms instead
of phonetic transcription... the ghost in the quantum computer.
Discuss the tertium datur of the self, "zweiselection", i.e., the self and
the other, but not just any other (another), for there is no other without
the transcendent other, e.g., me and Jesus, pursuant to the anthropic
cosmological principle and the fine tuning of immanent and transcendent
consciousness. How to distinguish another from an other, i.e., an other
who is not me from an other who is me – the me that is always just here
for the first time? Is the other of the other reducible in the final analysis
to merely the other, properly so-called?
A simulation is based on something outside of itself. A really existing
system is based on something outside of what can be known by any
inhabitant of the system.

March 2014

If there is no thing in itself, then there must be something
maintaining the order of things in existence apart from the mere
indifference of creation to the passage of time.
There is a special class of counterfactuals: those that we would never
have been in a position to observe. For example, if there had been an
advanced technological civilization prior to the current one on planet
Earth, we should have been able to observe that the earth’s crust is
strangely bereft of precious metals. But this would only be possible if
we were observers from another planet with its own precious metals to
support advanced technological civilization.
Discuss the general distinction between the operation of the laws of
physics and chemistry and the application of the laws of physics and
chemistry, in terms of their distinct levels of generality.
God is what is secretly pointed to by the unity of hidden assumptions of
successful splitting of the myriad linguistic degeneracies that human
communication is prey to.
Both general relativity and quantum mechanics have passed every
experimental test to many decimal places and to many standard
deviations, or sigma. And yet these two highly successful theories cannot
be merged harmoniously. Does the experimental success of both
theories imply that they must in fact be harmoniously blended, i.e., that
we must admit the possibility of a higher unification of fundamental
physics, or is this rather evidence of a hidden assumption of
metaphysical prejudice? Does a so-called Theory of Everything require data in
addition to what is available through intersubjective observation and
experiment? The concept of the “cosmic breadboard” as the basis for,
for example, on the one hand, a finite speed of light, time dilation,
gravitation and spacetime variations in the local velocity of light in
vacuum from the partitioning of CPU workload/bandwidth, as well as
providing the theoretical modeling basis for wavefunction collapse,
probabilistic quantum behavior (algorithms for “guessing” the next

computational state) and quantum decoherence (link rot), on the other
hand, evinces a possible deep though perhaps comprehensible
interrelationship between GR and QM. Perception as “cognitive tuning”,
i.e., as quantum mechanically mediated tuning of brain circuits to
resonate with selected vacuum frequencies and bandwidth spectra
instead of as portrayed in the naïve realistic models of sense perception
may point to a threshold, above which cognitive tuning becomes
independent of intrinsic quantum entanglement (and its self-reprocessing) in the vacuum, and below which these
two processes (which may indeed be on a par with one another) act in
mutual competition.
Must spontaneous quantum entanglement reprocessing be intentional
and object based?
Since there is no concept of consciousness, there are no
instantiations for quantification of consciousness. This means that there
is no set of conditions which excludes my being. I am always already
here for the first time. Nor can there be a theory of consciousness
(without a concept thereof). (One can always come back “for the first
time” as there is nothing to prevent it! It’s how one originally came to be here!)
One of the implications of eternal inflation theory is that each observer
within a given pocket universe does not have access to veritable cosmic
time and so in a very real sense exists within a simulation possessing its
own internal time.
There is a growing consensus in New Age circles that life transformative
events, even tragic ones, are directed somehow by the Higher Self of the
person who is subject to these events or incidents.
Because the system is radically reset upon death, if the evangelical
atheists are right, anything becomes possible, as a novel occurrence
since all of the restrictions that might have prevented novelty are

removed. Even God becomes possible, that is, from a starting zero-point
of altogether unconditioned being, which is the transcendent ground of
being. This is the true inevitable underlying logic of multiverse
metaphysics. It has been rightfully said that anything at all will indeed
happen that is not specifically forbidden by quantum mechanics. This is
consistent with the principle of the superabundance of unconditioned
being.
It is not merely that the time line frays at infinity.
September 2014
Unconditioned being makes no choice between the ontological
alternatives of the one and the many. Unconditioned being takes no
sides in the metaphysical debate between those who insist consciousness
is one (though manifold or pluriform) and those who assert it is a many
(topologically multipartite). @$In other words, there is no sufficient
condition for being to assume either a singular or a plural form, which is
to say, singularity and plurality are not distinct forms of being.
Investigate the concept of agnostic theism.
Analyze and deconstruct the naive tissue of assumptions lying behind
such statements as "the universe has been here forever".
Between the emergence and the originality of time, anything is possible
but not necessarily inevitable.
When you are dead you're no longer attached to the time line along

which you lived your human existence so you're free to start "again"
(English is not equipped to describe situations of multidimensional
temporal tenses – we need the concept of “again” in the sense of “multi-time”), as though for the first time, but it's not "again" because it is an
altogether new time. It's difficult for one to grasp because one thinks in
either/ or and black and white terms... either you live again or you won't,
but there's a third way analogous to getting here in "the first place".
One's consciousness simply possesses multiple instances, however in
the total and utter absence of instantiation of an immanent concept or
category, i.e., a concept graspable by a finite intellect. The instantiation
of consciousness is with respect to a transcendental concept of
consciousness. Multidimensional subjective temporality is implied by
the transcendental nature of consciousness. All concepts are metaphors
masquerading as such, except for transcendental consciousness and its
concepts, which are purely abstract, i.e., not taken from experience as
indeed metaphors are. @$Can the purely concrete be truly concrete and
un-abstract? Doesn’t the concept of pureness or purity imply
abstraction?
March 2014
The notion of the “purely concrete” is an abstraction and so mind as the
only source of abstraction becomes unavoidable.
May 2014
It occurs to me that the notion of “the purely
concrete” is an abstraction and so mind, as the only source of
abstraction, becomes an unavoidable precondition to an inevitable
phenomenon. This is a good definition of immanent necessity, i.e., the
instantiation of necessity as such: an unavoidable precondition to an
inevitable phenomenon. This is of course all contingent upon
"existence" and "mind" being well formulated or coherent notions.
March 2014

The Gödel Incompleteness Theorem combined with the finitude
of the human intellect points to the reality of universal mind, since
Platonic mathematical forms, qua sufficiently complex mathematical
theorems, may thusly subsist independently of any finite mind, cf. wik=
May 2014
Lime Cat We live
endless lives, but in each life we are here for the first time. A seemingly
paradoxical statement, but not if one realizes that eternity is the root of
all temporality.

"Time is infinite, but the things in time, the concrete bodies, are finite.
They may indeed disperse into the smallest particles; but these particles,
the atoms, have their determinate numbers, and the numbers of the
configurations which, all of themselves, are formed out of them is also
determinate. Now, however long a time may pass, according to the
eternal laws governing the combinations of this eternal play of
repetition, all configurations which have previously existed on this earth
must yet meet, attract, repulse, kiss, and corrupt each other again..."
~Heinrich Heine
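The combinatorial core of Heine's claim, that a finite number of configurations evolving deterministically through unbounded time must eventually repeat, is just the pigeonhole principle. A toy sketch in Python (the "universe" of three particles in ten cells and its update rule are invented purely for illustration):

```python
def first_recurrence(step, state):
    """Iterate a deterministic map until some earlier state recurs.
    With finitely many states, pigeonhole guarantees a repeat within
    (number of states + 1) steps."""
    seen = {}
    t = 0
    while state not in seen:
        seen[state] = t
        state = step(state)
        t += 1
    return seen[state], t  # (start of the cycle, time of first repeat)

# Toy "universe": 3 particles, each in one of 10 cells (10**3
# configurations), evolving under a fixed deterministic rule.
step = lambda s: tuple((3 * x + i + 1) % 10 for i, x in enumerate(s))
cycle_start, t = first_recurrence(step, (0, 0, 0))
print(cycle_start, t)  # recurrence is guaranteed within 10**3 + 1 steps
```

However the rule is chosen, determinate states plus determinate dynamics force eternal recurrence; what the argument does not fix is how long the "long minute of time" before the repeat is.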
From Aphorism 341 of The Gay Science: "Whoever thou mayest be,
beloved stranger, whom I meet here for the first time, avail thyself of
this happy hour and of the stillness around us, and above us, and let me
tell thee something of the thought which has suddenly risen before me
like a star which would fain shed down its rays upon thee and every one,
as befits the nature of light. - Fellow man! Your whole life, like a
sandglass, will always be reversed and will ever run out again, - a long
minute of time will elapse until all those conditions out of which you
were evolved return in the wheel of the cosmic process. And then you
will find every pain and every pleasure, every friend and every enemy,
every hope and every error, every blade of grass and every ray of
sunshine once more, and the whole fabric of things which make up your
life. This ring in which you are but a grain will glitter afresh forever.
And in every one of these cycles of human life there will be one hour
where, for the first time one man, and then many, will perceive the
mighty thought of the eternal recurrence of all things:- and for mankind
this is always the hour of Noon".[7]
Have you ever had an insight that is extremely significant, but you can't
find a way to put it into words so as to share it with others? I recently
had such an insight. The fact that I can't really describe this insight or
the details of its contents to anyone else, for some strange reason doesn't
cause me to believe that it is any less valid.

Being an alien doesn't mean that you're from another planet or star
system. It means that your brain acts as an interface between quantum
vacua within a partition that is categorized altogether differently from
the quantum vacua partitions which the brains of others, one’s so called
peers, resonate with.
There is no concept of consciousness or category thereof... This would
imply that there is no universal mind which is instantiated by multiple
individual consciousnesses. Only subjective metaphors and not
intersubjective concepts.

If upon death one loses the thread, then one was never here, so one
can come into being for the first time.
Either God exists or God does not exist; either the soul is eternal or it
is not; either reincarnation is true or it is false; either humans have free
will or they don't; and so on. In an open-ended multiverse, metaphysical
possibilities are contingent.
Either reincarnation is true or it is not. If it is not true, then the thread is
cut upon one's death, in which case one was never here, in which case
one can come to be for the first time.
Discourse is not just a structure within one's native language, but is a
novel language in its own right.
The illogical modus ponens versus modus tollens reversal or inversion is
necessary for the translation between languages, which is nonlinear in
the relationship of its concept maps. It is also necessary for the operation
of metaphor transitioning between different universes of discourse, or
for use with the idiosyncratic dialects of individual persons.

I thought it apt when the physicist Alan Lightman said that we are lucky
to be here, and said this within the context of discussing the multiverse.
Chaos is wholly degenerate.

February 2014

Origin of the assertion that "God is dead"? Not with Nietzsche or Sartre.
February 2014

One needs stability to get life started; one needs instability to
get a species to split into two subspecies, which then become
independent, separate species. We need a vast continuum of forbidden states
interwoven into the rugged fitness landscape of the multiverse for
otherwise nothing of value can ever be brought forth. The advancement
of science has, since the Enlightenment or perhaps even since the
Renaissance, gone hand in hand with the steady retreat of Man from the
position he had enjoyed since antiquity, at the center of creation. But the
underlying logic of the Anthropic Cosmological Principle, if properly
understood and this understanding disseminated, threatens a Copernican
revolution that shall boomerang with a vengeance. Reductio for the
multiverse, with 10^500 universes vastly outstripping the number of
possible distinct minds?
April 2014
@$One way to invert this disastrously
counterintuitive ratio and avoid all this is to discover that the mechanism
of human consciousness necessarily and regularly invokes what is well
beyond a mere astronomical number of these parallel universes. And the

quantum decoherence limit set at the Planck mass-energy combined with
Henry Margenau’s observation that quantum superposed states are
always states of some classically describable object aids us in properly
identifying these myriad “parallel universes”. They are distinct quantum
states of brain microtubule tubulin dimer networks of distinct energies
no greater than the Planck energy.
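For scale, the Planck mass invoked here as the decoherence limit can be computed directly from the fundamental constants and compared with the mass of a single tubulin dimer; a minimal sketch in Python (the ~110 kDa dimer mass is an assumed representative figure, not taken from the text):

```python
import math

# Fundamental constants (SI units, CODATA values).
hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2

# Planck mass: sqrt(hbar * c / G), about 2.18e-8 kg (~22 micrograms).
m_planck = math.sqrt(hbar * c / G)

# Assumed mass of one tubulin dimer: roughly 110 kDa.
dalton = 1.66053906660e-27  # kg
m_dimer = 110_000 * dalton

print(f"Planck mass:            {m_planck:.3e} kg")
print(f"Dimers per Planck mass: {m_planck / m_dimer:.2e}")
```

On these figures, on the order of 10^14 tubulin dimers would be needed to reach the Planck mass, which gives a sense of how large a coherent network would have to become before the proposed decoherence limit could bite.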
A principle of unity implies a principle of design: coherence,
cohesiveness, unity, stability; all of these are manifestations of the
principles of design. Meditate on what it means to possess a deeper
understanding of ideas. Most people live the majority of their lives
ensconced within a tissue of unchallenged assumptions nestled within a
paradigm which never shifts. The differential decoherence mapping
correlates with the rugged fitness landscape of evolutionary theory.
Keeping a loved one's stress levels low and giving her attention and love
will help her condition. What I call active, participatory prayer.
Similar to how consciousness does not possess a representation, so too
does time fail to have a representation. This is connected to
Wittgenstein's "private language" argument, one application of which is
a disproof of the skeptical hypothesis that I popped into existence a
moment ago complete with a set of false memories referring to a
fictitious earlier life's history. Wittgenstein's private language argument
is thought to be so important to 20th century philosophy because it is the
only respectable argument ever developed against solipsism. The
incorrigibility of the reality of memory rather than the incorrigibility of
memories themselves is importantly related to quantum decoherence in
thermodynamic irreversibility. Although Boltzmann brains are more
massive than the Planck mass and therefore cannot be generated in a
vacuum fluctuation, nonetheless Boltzmann "minds" *are possible*
states of the vacuum during its fluctuation. And “Boltzmann minds” may
indeed be the most important quantum neurological phenomenon that
can be supposed to result from the coupling of the brain to the
fluctuating quantum vacuum electromagnetic field at the tubulin dimer network.

In the youtube video, "The Private Language Argument and the Nature
of Consciousness", Cutoftheamateur states that the supervenience of
consciousness increased natural selective pressure in favor of increasing
neurological and brain physiological complexity in the evolution of the
human brain above the selective advantage afforded by mutations or
cultural innovations which merely had the effect of increasing the
efficiency of a classical neural network of stimulus-response by way of
improvements in logic, memory and data processing speed. It is no
accident that metaphors are frequently referred to through the use of
quotation marks enclosing a term or phrase which is being applied in an
altogether new way, that is, in a manner heretofore (note the
temporal inflection) unsuspected by the listener, who has necessarily
received a communication from the other. To form an abstract concept
or idea of a person, there must be more than one exemplar of "person".
Also, quotation marks are used when one is *referring* to one thing in
terms of a different thing. It is not that the mind transcends the
limitations to logic set by Gödel’s incompleteness theorem and its
implications, but that the brain has access to multiple Boltzmann minds
through the quantum superposition of brain contexts provided by the
embedding quantum vacuum fluctuation field (of nonlocally connected
tubulin dimer states). There are perhaps no Gödelian limitations in
philosophy and philosophizing as there are with logic and mathematics,
because of the blurring of the distinction between levels and metalevels
within the domain of philosophic activity.
February 2014

The philosophically naive error of intelligent design theorists
is their presumption that grammatical chauvinism on the part of speakers
of human languages possesses validity beyond the linguistic context, in
other words, that it is relevant to the description of objective being.
They should remember that the passive construction leaves the question
of a subject wholly indefinite. The adjective "intelligent" may only be a
property of a fundamental process, i.e., one having no beginning in
time. The noun "design" does not necessarily imply an activity. An
activity does not imply an agent. An agent does not imply an agency.
An agency does not imply a founding. A founding does not imply a
founder.
Thanks, Michael! An old chestnut, or shall I say "onion". : p. The motive
behind ID is indeed disingenuous, aka backdoor creationism. Believing
grammatical relationships to be as robust as logical or causal
implications is my biggest criticism of the purveyors of ID . . . not their
cynicism, which I take for granted. "You can't get something from
nothing . . . unless it's the quantum vacuum."
I.e., the quantum vacuum is "nothing". And "nothing" comes from
nothing... Heisenberg energy uncertainty is the causal basis of
temporality and so could have had no beginning in time. And
"everything" comes from "nothing" because the only difference between
real and virtual particles and fields is (and here's the
rub) *undifferentiated* energy.
There are no things, only "things”. In order to do away with dualism we
need to analyze the redundancy of the external world and other minds as
independent concepts or principles of being. Paradoxically, in order for
science to be truly positivistic it must, following Feyerabend, be
radically free of any determinate method of discovery and investigation.
Belief that science has a determinate method of discovery is to presume
that the final achievement of science shall be the establishment of a
theory of everything, which is a metaphysical presumption.
Positivistic science, which possesses no determinate method of
investigation or inquiry, cannot make any hard and fast distinction
between the natural and the supernatural.
What distinguishes a brain from a computational device or computer is
the greater independence of its modularity of design, which leads to the
marshalling of multiple independent contexts; there is also the stripping
away of any unified or integral design, perhaps owing to the Planck
limit, wherein the design functions in a manner which transcends the
look-ahead computational capacity of the quantum vacuum field.
All of the operations described by Alan Turing for his universal
computational device are susceptible to quantum superposition.
There is certainly an important distinction between 2 to the power of
400 parallel universes versus 2 to the power of 400 quantum computers
within a single universe. Does the natural decoherence limit play an
important role in this distinction?
Compare the degree of focus of attention versus the degree of resolution
of the appearances. Is there a degeneracy of case between discovery and
learning about an objective realm versus a self-interfering, self-limiting
cognitive system embedded in a chaotic medium?

God is the consciousness of Being as well as the will which brought
being into being.
Imaginary number mysticism.
Well it is true that long hair usually doesn't look aesthetically
appropriate framing the face of an "older woman" and could create a
perception problem, both personally and professionally. I never thought
that your long hair mismatched your face two years ago. Regulation of
gene expression is no longer under the legacy of a billion years of
natural selection once the child-bearing years are far behind. Regulation
of gene expression rolls over to being subject to the innate
self-organizing properties of atoms and molecules. In the second half of life,
therefore, one ceases to be a child of this little green earth, but comes
into her own as a child of the cosmos.
But what is really relevant is not the modern day, but what natural
selection had to work with over the previous million, 10 million,
hundred million, etc. years in the past. It's unlikely that natural selection
programmed gene regulation in hominids living much beyond 30, still
less, *producing offspring* beyond this age. So the relaxation of the
grip of our evolutionary legacy actually probably begins after 35 or so
and granted, maybe is not complete until 50 or 60. The moral here is
that one can look forward to old age instead of necessarily dreading it
because of the possibilities of transcendence and spiritual growth that
our latter years offer. : ) The concept is valid, I believe, in terms of the
combined action of Darwinian and thermodynamic principles.
Now the process of the great spirit of the cosmos strengthening its gentle
grip and loving influence on the genome may indeed extend into the
70's, 80's and beyond.
This principle relies upon a reasonable assumption, the one that levels of
description in gene regulation interpenetrate; in other words, modularity
is imperfect, so that it is not just on the lower level systems that natural
selection's grip relaxes, but that this applies to higher level function as
well.
According to Whittington, animals could have arisen at different places
and at different times. Instead of descent from an accidental first
common ancestor, we have evolutionary convergence guided by the
self-organizing properties of organic molecules from multiple points of
origin, nevertheless guided by a single underlying dynamic. In other
words, the common descent is not from a single accidental first ancestor,
but from a single dynamic medium informing and guiding the
evolutionary process as a whole.
Darwin's principle of common descent is false and his principle of
evolution by natural selection is tautologous: everything which
survives, survives.
Because life arose from multiple points of origin, but there is a single
genetic code or language to mediate the evolutionary process from these
multiple points of origin, it follows that the rationale of the genetic code
lies with the self-organizing dynamic origin of life rather than its having
been cobbled together by blind evolutionary processes.
Platonic archetypes offer themselves as a happy third alternative, both to
the Intelligent Design hypothesis and to the explanation of evolution in
terms of common descent.
Higher mammals such as humans share so ancient a common ancestor
with the octopus that the eye of the octopus and the eye of the human
originated from an ancestor so ancient that it did not itself possess a
recognizable, functional eye.

By combining Bohm's causal principle with the decoherence principle,
we see that the Planck limit applies not only to space and critical
densities, but also to time and time intervals. In this way we see that
temporality is necessary for the construction of an integrally whole and
open consciousness, one that is prepared to be an active participant
in a heavenly community.
The operation in the background of this tendency to commit informal
fallacies greatly reduces the time it takes to hit upon theoretical insights
which, had one confined oneself to correct logical thinking, one might
not have stumbled upon for orders-of-magnitude greater spans of time, if
at all; and so the fallacies of informal logic were likely selected for in the
evolutionary process because they had survival value in greatly reducing
the time spans required for hitting upon favorable solutions to otherwise
intractable problems. A ready general example of this is the confusion
of the principle of modus tollens with modus ponens.
The robustness and stability of life forms may be better served by
multiple points of origin rather than by common descent. With multiple
points of origin, the beginnings of life are much less likely to be purely
driven by chance, but partly driven by the self-organizing dynamics of a
fundamental ground.
Investigate the concept of the seeming magical nature of the general
effectiveness of mathematical integration.
Make no mistake, deconstructionist theorists: what led us to this
postmodern world of the indeterminate self and the illusory will were
myriad individuals bespeckling history, possessing a determinate will
and a highly determinate ego.
Darwin was embraced wholeheartedly and promoted to the status
of a scientific demigod only because there were myriad closet atheists
secretly awaiting the arrival of a godless messiah.
We can use evolutionary theory as a system of appropriate metaphors for
organizing our experience and understanding it better, even though
Darwin's theory is not, strictly speaking, correct. This brings up the whole
question of what the correctness of theories means within the context of
the philosophy of science. For example, applying Darwin's natural
selection concept to the second half of life and the unexpected benefits
of getting older.
Within the realm of limitation, angelic free will is considerably more
limited than that of human beings because of angels' much greater
intellectual capacity and ability to foresee negative outcomes of any
possible decision amongst alternatives.
<saturation mutagenesis> <developmental biology> <morphological
mutations> <genomic equivalence>
The deterministic chaos observed for algorithms possessing very simple
rules for operating a simulated world, such as cellular automata, suggests
that the physicists' success in pinning down the parameters of the
Standard Model of particle physics should not lead us to expect similar
successes in the future in the softer sciences.
Compose an essay on the hopefulness of the true and thoroughgoing
Include a discussion of the concept of time, eternity, everlastingness and
temporality in relation to the Black Adder concept of reincarnation. The
Anthropic Principle, the Multiverse, Simulation (and the problem of
Aetiology) and the theory of Eternal Inflation should figure prominently.
March 2014
Reincarnation means being brought forth, unbidden, into a new
universe within the multiverse that is unfathomably connected, or not
connected at all, to the universe that provided the nutritive yolk of one’s
“previous” incarnation. The consistency of one’s memories of a past life
or past lives is a function of cosmological anthropic consciousness tuning
along a continuous line or thread. Differentiation and integration are with
respect to a self-same mathematically real line.
edt=June 2014, fcbk= or fb= eml=

"Although the proper fine tuning of brain
microtubule tubulin dimer circuits could be managed by an underlying
naturalistic quantum vacuum, so that the majority of real human beings
could be expected to possess consciousness, in a so-called ancestor
simulation, the fine tuning would be at the opposite end of the one-many
(individual-collective) spectrum, i.e., within the ground of being at its
"world end" instead of within each tiny subspectrum of ground from
which the individual draws his or her being, such that only one distinct
individual would likely possess the correct, concretely dynamic "brain"
structure necessary to embed itself properly within the virtual quantum
vacuum field of the simulation so as to become a properly conscious
entity, rather than the hollow projection of an avatar or at most, a
"philosophical zombie"." That having been said, were multiverse theory
correct, then there really would be nothing to fundamentally distinguish
the universe one happens to be in from a simulation! It would be a case
of one universe per customer, instead of tiny fractions thereof, each!
Although the proper fine tuning of brain microtubule circuits could be
managed by an underlying naturalistic quantum vacuum so that the
majority of real human beings would be expected to possess
consciousness, in a so-called ancestor simulation the fine tuning would
be at the opposite end, i.e., of the world instead of the individual, such
that only one individual would typically possess the correct "brain"
structure to embed properly in the virtual quantum vacuum to be
conscious. This does not imply solipsism, but merely that, if Nick
Bostrom's simulation hypothesis is true, then each simulated human is
isolated within his own virtual world and perhaps receiving frequent
multiple data (rather than information) inputs from individuals
inhabiting distinct virtual worlds.
Both pure necessity and pure contingency are inconsistent with free will
and consciousness; there must be some chaotic boundary between the
two across which they interact to effect freely willed conscious states.
Drift wherever you will; learning something new should never be a
chore. We shall certainly meet again sometime.

Once we have rejected the naive empiricist view of perception, memory,
and imagination and how these are related one to another, and in turn
embrace a more sophisticated Kantian view of these three elements of
mental functioning, then we become open to some highly
counterintuitive results concerning the philosophy of mind, the
philosophy of time and space, as well as the philosophy of science.
The tripartite theory of perception is distinct both from Plato's projective
perception theory and from the traditional empirical theory of perception
of the external world, which has been in place for approximately the past
150 years.
There can be no mechanical theory of inertia.
The mental faculties of humankind can be divided into two
fundamentally distinct groups: those who, once the anthropic principle
is explained, can only understand this principle as a triviality or
tautology, and the minority of persons whose minds are constituted in
such a manner that they are capable of grasping the core insight or logic
underlying the anthropic principle so as to understand this principle in a
non-trivial way. This is perhaps not truly owing to a difference in the
intellectual ability of the two groups, but to deep and lasting differences
in intellectual bias.
A further division can be made within the minority group that
understands the anthropic principle in a non-trivial way: those who
believe the anthropic principle obviates the need for infrastructure or
design within the ground of being versus those who believe this
principle absolutely requires the element of design.
The logic of the anthropic principle applies to the multiverse:
humankind can only find themselves living within a bubble in which the
fundamental constants are appropriately fine-tuned. In turn, I can only
find myself within a bubble within the multiverse in which the
fundamental physical constants are even more appropriately fine-tuned.
We must remember that the only thing that has to be made consistent are
the appearances. The logic of the anthropic principle applied to the
multiverse has important implications for the question of personal
identity: for example, could I have come into the world as a lower
animal or as a mentally retarded individual? Consider the necessity of
fine-tuning of the cosmological constant, for instance, as well as of the
approximately 38 fundamental physical constants. Anthropic
cosmological fine tuning seems to require that personal identity be based
upon some form of necessity, such as a determinate essence of some
sort, meaning that one's identity must already be given prior to the
fine-tuning, i.e., as a fundamental component of the ground of being itself.
“While the standard model of cosmology has been largely confirmed by
experiment, a few curious anomalies have resisted explanation. One,
first seen by NASA’s COBE satellite and more recently confirmed by
WMAP9, is the so-called “low quadrupole”. The WMAP satellite has
mapped in great detail temperature fluctuations in the cosmic microwave
background (CMB) radiation that pervades the sky. Standard cosmology
predicts that we should see such temperature fluctuations at every scale.
Both COBE and WMAP, however, discovered that there are virtually no
fluctuations at angular scales larger than 60 degrees. One possible
interpretation of this perplexing
anomaly is that spacetime simply isn’t large enough to support such
large-scale fluctuations. If true, this result would set the upper bound on
the size of the universe at almost precisely the size of the observable
universe. In other words, the boundary of spacetime – the edge of reality
– would coincide with the boundary of a single observer’s reference
frame. According to the standard view, this is quite a coincidence!”, c.f.,
Creator or multiversal vacuum fluctuation are just two different
metaphors which relate to the same really existing first cause.

Nobody lives in the vast majority of multiverse bubbles.
Quantum mechanics predicts that anything which is not forbidden by
conservation laws must occur with some degree of probability.
One of the implications of the multiverse theory is that each of us is
immortal. It is time to address the metaphysics of "incarnating for the
first time" within an eternally inflating anthropic multiverse.
It's not so much that some part of us lives on after our death as it is that
what brought us forth here in the first place shall still be here after we
depart this life and can bring us forth once more just as though for the
first time.
“As a former research student of John Hick I accept his view that we are
living in an ambiguous Universe which can be interpreted theistically or
atheistically. Hence I am not attempting to ‘prove’ the truth of a liberal
modernist version of
So on account of this solipsistic implication, the multiverse principle
fails as an adequate basis of explanation for the observed density of
information (in the sense of specified complexity) in the universe.
And because there is better continuity across ensembles rather than
within a given ensemble, temporal evolution of the consciousness of any
given anthropic observer is such that causal continuity is psychically
projective rather than reactive. In other words, the psychic continuity of
the self and that of the other do not coincide.
The multiverse principle as an explanation for the radical fine tuning of
the dozen or more fundamental physical constants of this universe - this
principle flouts another principle, that of causal continuity. I say this
because continuity is more readily available in immediately nearby
alternate universes than it is in subsequent states of one's own universe.
Two-dimensional temporality to which we are attributing the majority of
all sudden changes or increases in biological complexity in evolution
would have likely affected the memories and consciousness of any long
lived observer who may have been in the area observing these changes.
<preadaptation in evolution> <mind in the poised realm> Kauffman
Reason and rationality are broader than science, broader than logic.
Logic is applied reason. Logic is to technique as reason is to science.
The former provides the basis for selecting the latter.
'Experimental rugged fitness landscape in protein sequence space.' U. P.
'News and views': 'Why genes in pieces?' Nature, volume 271.
C.f., YouTube video, The Origin of Genes (cdk007)
New order can arise from random shuffling of modular components
provided that each modular component arose more or less by the
self-organizing properties of atoms and molecules which graced the
process at its beginning, independently of any natural selection processes.

KEY WORDS for searching this document: @$, threshold, quantum
vacuum, vacuum electromagnetic field, zero-point, Lamb shift, Pauli
Exclusion, Pauli Inclusion, Heisenberg Uncertainty Principle, HUP, dark
matter, dark energy, pioneer anomaly, cosmological constant,
cosmological acceleration, matter-antimatter asymmetry, induced
gravity, induced inertia, Zitterbewegung, dimensional time, temporality,

specious present, dissociation, dissociative, unified, integrated, concept
of consciousness, Incompleteness Theorem, Benjamin Libet, Rupert
Sheldrake, Terence McKenna, Chomsky, Plantinga, El Greco, Kurt
Gödel, Wittgenstein, Bertrand Russell, William James, Julian Jaynes,
Philip K. Dick, Star trek, Perry Rhodan, science fiction, Eccles, Kant,
Santayana, Sartre, Descartes, Hume, Berkeley, Leibniz, Fichte,
Margenau, Penrose, Hameroff, microtubule, tubulin, synapse,
interconnection, future, time travel, Prokhovnik, Sarfatti, Fontana,
zeitschicht, graviton limit, gravity wave, binding energy, Hubble,
density gradient, gravitational constant, fine structure constant, mc2,
mc**2, relativistic mass velocity, rotation about the time axis, angular
momentum, intrinsic spin, imaginary momentum, conservation of four
momentum, hypersurface, hyperspherical, WKB approximation, Casimir
Effect, DeBroglie, Schrodinger, Dirac, Feynman, Einstein, Gamow,
Planck, virtual particle, spontaneous emission, stimulated emission,
fermion, boson, fermion-antifermion, spin 0, spin 1, spin 2, composite
spin 0, composite spin 2, scalar, vector, tensor, Pauli blocking, Cooper
pair, discrete redshift, quantized redshift, cosmological redshift,
gravitational redshift, vacuum energy density, fluctuation, correlation,
Van Gent, Brian Swift, Ziad, Greg, Sakharov, four angular momentum,
precession, red shift, complacency, joy, authenticity, sociolinguistic,
cultural, anthropological, hypocrisy, hypocrite, church, Christian, Paul,
Jesus, God, Divine, divinity, free will, creation, intelligent design,
ancestor simulation, Bostrom, Carroll, Elvidge, Matrix, Gaussian,
Boltzmann brain, solipsism, solipsistic principle, standard deviation,
mean, percentile, foreknowledge, determinism, underdetermined,
overdetermined, correlated, Christianity, Hinduism, Buddha, Buddhism,
Taoism, Tao, Ego, Dream, Dreaming, Lucid, epistemology,
epistemological, metaphysics, metaphysical, methodological, anthropic
cosmological principle, anthropic principle, wavefunction, phase
relations, Fourier, fringe, Gibbs phenomenon, collapse, decoherence,
resonant, resonant tuning, resonance, filter, reducing valve,
transcendence, immanence, evolution, Darwinian, epigenetics, chemical
evolution, mutation, natural selection, unit of heredity, DNA, RNA,
Jacques Monod, Neodarwinian, genetic base pair, information bearing,

infrastructure, providence, grace, theistic evolution, random mutation,
point mutation, regulatory gene, gene regulatory network, GRN,
superposition, tunneling, nonlocal, nonlocality, entanglement, qubit,
parallel, many worlds, Everett, transactional, clone, cloning, twin,
doppelganger, contradiction, contradictory, contrary, Nagel, Dummet,
Smart, Chalmers, philosophy of mind, metaphysics of mind, philosophy
of space and time, arrow of time, irreversibility, entropy, Von Neumann
entropy, 2nd Law, thermodynamic, discourse, free will, reductionism,
consciousness, quantum nonlocality and hidden variables,
self-referential, unity, integration, holism,

The biggest factor interfering with my fantasy that the world is my
oyster is the notably inadequate state of development of battery
technology.
January 2014

Write down every original kernel idea you have ever had and
then explore the connections between each and every one. This exercise
should succeed in generating myriad new insights that shall serve to
extend the network further in terms of expanding the total number of
nodes (kernels).
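The pairwise connection-hunting described above can be sketched in code. A minimal illustration, assuming the kernel ideas are kept as short text strings; the example list and the function name are hypothetical stand-ins, not entries from this notebook:

```python
from itertools import combinations

# Hypothetical kernel ideas (stand-ins for the notebook's own entries).
kernels = [
    "anthropic fine-tuning",
    "Boltzmann minds",
    "radical constructivism",
    "multiple origins of life",
]

def connection_prompts(ideas):
    """Yield one question per unordered pair of kernel ideas."""
    for a, b in combinations(ideas, 2):
        yield f"How does '{a}' connect with '{b}'?"

prompts = list(connection_prompts(kernels))
# n kernels yield n*(n-1)/2 pairwise prompts: 4 kernels -> 6 prompts.
print(len(prompts))  # prints 6
```

Each prompt becomes a candidate node-generating question; as new insights are appended to the list, the number of pairs grows quadratically, which is why the exercise keeps extending the network.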
December 2013

We must understand that morality has relative aspects, and
therefore what was moral in the past may be immoral in the present. One
cannot judge the life of a people of more than two thousand years ago by
measuring its moral values against the criterion of your own century.
Among certain peoples of the East, polygamy is still considered good
morals, serving to adjust the imbalance produced by an excess of births
of women over a small percentage of men. Some Asian tribes brand as
immoral the fact that the Western widow survives her deceased husband
instead of being cremated with him in the purifying fire. The morality, so
sublime and wholesome, that Jesus preached in his own day was the very
reason he was crucified, for that Christian morality was considered
subversive or debilitating in the face of the predominance of the lower
instincts of the men of that era.
At logocentrism’s core is the gloss of a faux integrated and unified self,
one concealing and underlying a de facto loose confederation of
proto-selves. This “trick” of the project-ive unified self is imitated again and
again at numerous different levels, as this splintered confederation of
mental faculties known as the human mind projects its reifying gaze
everywhere and all around itself. Some must suppose this is but an
appearance borne of an incomplete and tattered conceptualization or
mapping of the mind in terms of numerous supposed faculties as the
science of psychology remains in its long infancy as a kind of "psyence
of scichology". One important example of this "logocentric reification"
is the instinctive (as well as subconscious in the sense of
"consciousness-grounding”) understanding of all verbal communication
as being on a par with “telepathy accompanied by articulate sounds”.
The infant acquiring the language of his parents never enjoys the
advantage of having information input to his developing brain, but
develops a theory of his own for making sense out of the funny sounds
being made by big people by interpreting mere data, which only in
retrospect becomes information, retroactively, c.f., April 2014 wik=“Ernst
von Glasersfeld was a prominent proponent of radical constructivism.
This school of thought claims that knowledge is not a commodity which
is transported from one mind into another. Rather, it is up to the
individual to "link up" specific interpretations of experiences and ideas
with their own reference of what is possible and viable. That is, the
process of constructing knowledge, of understanding, is dependent on
the individual's subjective interpretation of their active experience, not
what "actually" occurs. Understanding and acting are seen by radical
constructivists not as dualistic processes, but as 'circularly conjoined'.
Though "data" may pass between minds, these are merely syntactically
and not semantically structured signals: information is always created
anew, in situ on the receiving end, though sometimes with externally
received data as a guide to the construction of the intersubjective
envelope containing the data, while the information resulting from the
processing of this externally received data is a wholly original invention
of the individual consciousness of the recipient. [1]
May 2014

I sometimes feel as though I am living in the shadow of some
great, disruptive and disillusioning realization, perhaps only to be
personally encountered many years or even decades later. No one gets
out of here alive and granted, one's ultimate demise sometimes casts a
long shadow into the living past of the dying, especially if one is
neurotic. I am speaking of something even darker and hidden within the
time traveling penumbra of one's mortal end...unfortunately, any crudely
fashioned chemicals that might be available to the infant science of
psychiatry could only treat this condition at the risk of lobotomizing the
poet in me.
Constructivist Foundations is a free online journal publishing peer
reviewed articles on radical constructivism by researchers from multiple
domains. See also: Francisco Varela, Humberto Maturana, and Heinz
von Foerster”
An envelope of encryption corresponding to a carrier frequency: this is
the fundamental grammar which gives proof of intersubjectivity.
All we perceive are Platonic forms from the ground of our own being,
rather than from the ground of being itself. This is not the solipsistic
nature of the world, but of the possibility of perception.
January 2014

If a systematic and thorough going observation and analysis in
terms of signal over noise ratios were performed on the communication
of couples and close knit groups on the one hand and of individuals in
socially and culturally diverse groups on the other, and the results
thereof integrated with long standing analyses of the dynamics of
conflicting witness accounts of accidents and crime scenes, optical
illusion data, “delusions and madness of crowds” phenomena,
anthropological studies of shamanic and shamanically-mediated group
spirituality, entanglement of cognitive metalevels, informal fallacies
pertaining to causality, Bohmian causal analysis, Darwinian behavioral
genetics, e.g., Jaynesian cognitive theory, Benjamin Libet’s brain
physiological studies of the relation of intention, retentive memory and
consciousness, and all of this placed in the background of a
sociolinguistic treatment of the El Greco Paradox, it would likely be
revealed that the information-to-data ratio, I/D, is extraordinarily high. A
perhaps naïve view is that a perfect optimization of signal to noise ratio
would necessitate flawless transmission of information between minds,
which precisely tracks the internal vs. external, subjective vs.
intersubjective continuum divide. The untranslatability of mutually
unintelligible human languages informs a hypothesis of the
incommensurate residue of untranslatability of thoughts between
members of the same culture who are speakers of the same language.
April 2014
Communication is not the transmission of information between
minds, but the negotiated middle ground between diverse subjectivities
in which internal semantic and associative adjustments are made as
interpreted data are reprocessed on each side of the communication in a
kind of “hand shake” wherein each party satisfies himself that an
agreement in terms of consensual meaning has been reached. This
becomes especially true when going beyond the relatively unchallenging
demands of concretely descriptive language.
January 2014

As they say, "without context there is no meaning". So the
myriad circulating ion currents in the brain's gray matter can only "be
about something", i.e., make reference to something beyond the
otherwise closed system of the brain, if there is some kind of interaction
or "hand-shake" between the relatively limited and otherwise narrowly
defined quantum processes in the brain and broader quantum-encoded, encrypted processes within some grounding, embedding and open-ended
context in which every process in the brain which is conscious becomes
so only by virtue of being "registered" within this embedding
computational-informational infrastructure.
Modern-day brain-physiological
research, c.f., Stuart Hameroff (University of Arizona) et
al., indicates that this embedding dynamic informational medium is none
other than the quantum vacuum electromagnetic field. Presumably there
must be some sort of "cosmic FCC" which regulates and partitions the total
available bandwidth in this quantum vacuum field so that there is no
interference or cross-chatter between your thoughts and my thoughts.
Of course "registration" does kind of carry the implication of archiving
somewhere, say through non-locally connected quantum entanglements
in the vacuum's zero-point energy fluctuation field. By the way,
quantum energy uncertainty is entirely owing to the quantum
fluctuations in this zero-point vacuum field, which through the
Heisenberg time-energy uncertainty relation, ΔE·Δt ≥ ℏ/2, implies that
temporal change, i.e., temporality, is owing to the constant perturbing
effect of this field. It stands to reason that whatever is the cause of
temporality and change would not itself have had a beginning in time,
unless, of course, something from altogether outside of time was what
had put this field in place.
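Numerically, the standard form of the relation just invoked, ΔE·Δt ≥ ℏ/2, fixes a floor on the size of a vacuum fluctuation lasting a given interval (a sketch of mine using the standard value of ℏ; the function name is made up for illustration):

```python
HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def min_energy_fluctuation(dt):
    """Smallest energy spread dE (joules) compatible with dE*dt >= hbar/2
    for a fluctuation persisting for dt seconds."""
    return HBAR / (2.0 * dt)

# The shorter-lived the fluctuation, the larger the energy it may "borrow":
femtosecond = min_energy_fluctuation(1e-15)   # ~5.3e-20 J
attosecond = min_energy_fluctuation(1e-18)    # ~5.3e-17 J
```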
December 2013

Analogous to the above is the natural assumption that the
conscious movement of the limbs of the body is on a par with telekinesis
accompanied by kinesthetic sensations. This immediacy of contact that
one seems to possess with the external world and with other minds does
not bespeak either telepathy or telekinesis, but of an operant
conditioning born of self-programming and innate or homegrown theory
construction that was fully snapped in place well in advance of the
development of one’s critical faculties of reason. And the lesson
taught to us by the analysis of the El Greco paradox in visual
space must be carried through into the auditory and tactile or
haptic spaces, as well. This lesson stated in its greatest generality is
simply this: that all direct implications as represented in the naïvely
realistic world of the projective space of Newton’s “sensorium” of the
mind and which course through the bulk of Einstein’s imaginative
mollusk are in fact superpositions of constructively and destructively
interfering data streams. Epiphenomenalism is false as originally
pertaining to the individual, though it may well prove to be a pretty fair
theory of the relationship of conception and will of the collective
consciousness in relation to its apparent action in the larger world. There
is perhaps then no real collective, but only the intersubjective. This
distinction, of course, is either lost on or matters not to the practical
man. Each differing mental content that is, through the common
conditioning of a shared sociolinguistic culture, mutually addressed via
consonant labels for nevertheless differently perceived objects in the
public space, is jointly juxtaposed and compared with the rest, and so
although these contents be radically different in absolute terms, i.e., when
per impossible compared abstractly, each divorced from its grounding
context, they are regarded as mutually and commonly perceived and understood. “Mutual
understanding” is however not necessarily the same thing as two or more
minds possessing the same or like understanding of something.
Call me a reactionary, but I am alarmed by what is happening to the
German language. I think Germans should be as protective of their
native tongue as indeed the French are! When I reflect upon how
Deconstruction necessarily applies to itself, I realize that conservatism in
some wisely chosen areas should be among the hallmarks of a true
"progressive". Structure is necessary. Why? Because the Planck mass-energy limit and therefore the largest possible vacuum energy
fluctuation, i.e., the largest coherent and integrated physical structure
which the vacuum can produce “in a single go” as a virtual object or
simulacrum possesses a maximum mass of only 4.341×10⁻⁹ kg = 2.435
×10¹⁸ GeV/c². The operative insight here is best pointed up by the idea
of the 3d printer. In its current stage of development (end of 2013),
machines, which is to say, objects with articulated moving parts, fitted
together according to an engineering design and intended to perform a
specific set of functions, cannot be generally manufactured in a 3d
printer just by selecting “properties” and then hitting a “print” button.
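As an aside, the mass figure quoted a few sentences back (4.341×10⁻⁹ kg, the reduced Planck mass) is easy to verify with E = mc² and the standard joule-per-GeV conversion (my own sanity-check script, not part of the original argument):

```python
C = 2.99792458e8             # speed of light, m/s
J_PER_GEV = 1.602176634e-10  # joules in one GeV

mass_kg = 4.341e-9                       # figure quoted in the text
energy_gev = mass_kg * C**2 / J_PER_GEV  # E = m c^2, expressed in GeV

# Agrees with the quoted 2.435e18 GeV/c^2 to better than 0.1%:
assert abs(energy_gev - 2.435e18) / 2.435e18 < 1e-3
```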
Many of the individual moving parts still must be printed separately,
filed or ground down to more exacting specifications and then
assembled together. The embodiment of conscious entities may indeed
be similarly called for: the body may function as a mere scaffolding of
the soul, rather than as Aristotle believed, as its integral and indissoluble
component. This “scaffolding” may then be necessary for the formation
of the subtler and more finely detailed structures and internal
connections of the soul, for example, to produce precise sets of
boundary conditions for such quantum mechanical wavefunctions as
describe brain microtubule substructures, e.g., tubulin polymers so that
the interaction (as well as the self-interaction) of certain higher/larger
and lower/smaller scale dynamic components of the fundamental
quantum field is facilitated. The assembly of the human person all at
once, in a single piece, would necessarily leave out the presence of the very
subtlest of functional structures that by their very nature cannot be
predetermined within an overarching design, c.f., the difference
between, e.g., androids and robots, humans and angels, c.f., the growing
threat of fundamental quantum noise to the designated proper functioning
of high density integrated circuit chips (and not only on account of the
stray capacitance and inductance of imperfections within the chip's
structure) due to the innate electromagnetic interference (jamming,
amplification, distortion, filtering, eavesdropping, quantum decoherence,
superposition, entanglement, teleportation and tunneling in the case of
quantum computing) posed by quantum fluctuations of the vacuum’s
momentum-energy, which are fundamental and ineradicable, stemming
as they must from fundamental Heisenberg momentum-energy
uncertainty. Certain important mental features must be learned in a
temporal process and cannot be built in or “programmed in”. Again, the
body is “the scaffolding of the soul”, which is absolutely necessary for
the production of a conscious being, one possessed of a moral sense and
a free will and which is perhaps the only type of being with whom God
is interested in having a relationship. The body also provides the kernel
and crucible of suffering, which is a necessary ingredient in the
development of compassion. The angels, which possess no capacity for
the experience of physical pain, are themselves incapable of genuine
compassion, nor do they possess knowledge as Man does of Good and
Evil. Angels, utterly secure in their persons whilst intervening in the
affairs of men at God’s behest, are in a somewhat analogous position to
the Arkonides, that ancient and highly advanced extraterrestrial race,
already in a millennia-long decline as galactic imperator, as described in
the self-congratulatory anthropocentric German space opera, Perry
Rhodan, who have been following the evolution of mankind since
almost its very beginnings and who with a great mixture of emotions
realize that Man’s future in the wider cosmos shall assuredly outshine
their own glorious past and who now look to the Terran race to take up
the heavy mantle of a now splintered galactic empire, but who must first
provide much mentoring, guidance and material aid. July 2013 I believe that
the Perry Rhodan serialized novellas were inspired by the idealistic
notion on the part of nationalist-leaning German authors that a rewrite of
the history of the Third Reich could be undertaken wherein all of the
genocidal and distinctly Hitlerian elements could be removed and
Germany’s 20th Century shame could be recast as a cosmic Austro-Hungarian Imperium.
Russell, Clark, Jake and Markus. . .“Twain thought that the
German language was a dreadful thing. However, for my part, I am a
great admirer of it. English metaphysics translated into this tongue
appears more profound, scientific prose clearer, poetry more deeply
resonant and Eric Cartman, singularly ueberlustig!”
April 2014


Various previously attempted, as well as all future, deconstructions of
the self ("postmodern" or otherwise) have constituted, and shall only ever
constitute, modes of description, never an actual, still less a complete,
description itself. The science of psychology is better described as a
"psyence of

Instability goes hand in hand with the fine-tuning of a system. This in turn
requires an extremely intricate system of feedback and control existing
at the ground of being root level.
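The claim that instability plus fine-tuning demands an intricate system of feedback and control can be illustrated with a standard toy from control theory (my own sketch; the gains and time step are arbitrary): an exponentially unstable state is held at its finely tuned set point only so long as a corrective feedback loop keeps running.

```python
# x' = a*x is unstable for a > 0; feedback u = -k*x (k > a) stabilizes it.
A, K, DT = 1.0, 3.0, 0.01

def simulate(feedback, steps=2000, x=0.001):
    """Euler-integrate the perturbed state with or without control."""
    for _ in range(steps):
        u = -K * x if feedback else 0.0
        x += (A * x + u) * DT
    return x

assert abs(simulate(feedback=True)) < 1e-6   # held at the finely tuned point
assert abs(simulate(feedback=False)) > 1.0   # left alone, it blows up
```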
'In the years after the initial experiments, Couder and Fort used the oil
bath to perform several of the classic experiments in quantum mechanics,
including Young's double-slit experiment, and found that the walkers
exhibited many similarities to the entities used in the original
experiments. One area where the walkers' analogy with quantum
mechanics fails, however, is entanglement, the weirdest quantum
phenomenon of all, which describes how the physical state of two
particles can be intricately linked no matter how far apart in the
universe they are. For this to happen, a wave must occupy a very high
number of dimensions so particles can affect one another over large
distances, faster than the speed of light. However, in a walker system
the waves will always occupy just two dimensions, given by the length
and width of the oil tank', c.f., 'Can an oil bath solve the mysteries of
the quantum world?'
Only a transcendental ground of being can accommodate the rationality
of excluding possibilities from the universe of discourse.
A transcendental ground of being satisfies all of the requirements for
Deity, i.e., 'big G' godhood, but in addition, so much more.
The many unbelievable events and segments of history suggest the
possibility of a continual retrofitting of past alternate histories to our
present age. This is a further illustration of the operation of
multidimensional time. A case of the future being the history you didn't
Originally there was but a single shooter for the attempted assassination
of JFK, but additional shooters were inserted back into the historical time
stream by additional conspirators from the future. A second shooter was
also unsuccessful, and so a third shooter was inserted; ultimately a fourth
shooter was inserted, at which time the assassination attempt was
successful. Interference from future time travelers also helps to explain
the incredible density of conspirators surrounding the assassination of
JFK - a density not otherwise supportable by a merely naturally
occurring linear-time stream.
It may well be the case that consciousness only exists as or within a
manifold and cannot exist as general, universal, transcendental, free-floating, nonspecific or otherwise unstructured. And that therefore this
notion of consciousness as such is but a projection and a kind of
opposite of a reification.
When we make metaphysical assertions about something or other being
beyond or altogether beyond the realm of being or beyond being itself,
of course we can only be referring to beyondness as it were, with respect
to our inadequate concept of Being.
To say 'as it were' is to signal the use of a metaphor, but still deeper, it is
to invoke a universe of discourse other than the one which one inhabits.
But the subjectivity of color perception probably can be extended to
other things beyond color.
There are many different ways of maintaining the system of conveying
distinctions such as class inclusion and class exclusion and this is really
the only thing that language is capable of doing - all else
is collective if not indeed conspiratorial projecting and make believe.
There is however no concrete performative need at all to convey actual
subjective content of thought or sense data, speaking from the standpoint
of the requirements of natural selection. But there certainly is a
Darwinian requirement that individuals of a breeding population believe
that they can convey and receive subjective information to each other
and from each other. This is much akin to how behavioral genetics
favored the development of small splinter portions of the breeding
population who possessed shamanic powers. The presence of shamans,
witch doctors, seers and so on within a breeding population enhanced the
integration of the breeding population and helped it to more
energetically unify and organize its collective energies in the struggle for
survival in a harsh environment amongst other competing groups also
struggling. Because we share the same physical world "out there", and
because we share, more or less, the same language, it doesn't matter so
much that our subjective perceptions of the meanings of words within our
common language may differ, because a common word, though it splits
into different subjectively interpreted meanings many times, is
nonetheless united in having a single reference in the external world.
Illustrate this with a diagram in the shape of a diamond with a vertical
line bisecting it. The bottom vertex is a word which refers - the upper
vertex is an object referred to by the word, the left and right vertices of
the diamond are my perception and your perception, respectively, which
may differ as to the subjective content of the word held in common
between us. Just think of what's called the El Greco paradox. El Greco
was supposed to have depicted animals and humans with gnarled and
twisted features because that was the way he actually visually perceived
them. But upon a moment's reflection, clearly El Greco must have drawn
the figures more or less representationally, even if he was attempting to
draw them as he in fact saw them, since the canvas which shows the
depictions he has drawn would be similarly distorted.
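The diamond diagram described above can be mocked up as a toy data structure (everything here, the word "apple", "object_42" and the two "contents", is a hypothetical placeholder of mine): reference succeeds through the shared top vertex even though the left and right vertices never match.

```python
word = "apple"                 # bottom vertex: the shared word
public_referent = "object_42"  # top vertex: the single external referent

# Left and right vertices: two differing private contents for one word.
perception = {
    "me":  {"word": word, "content": "reddish-round-sweet"},
    "you": {"word": word, "content": "greenish-tart-crisp"},
}

def referent_of(speaker):
    """Both private mappings terminate in the same public object."""
    return public_referent if perception[speaker]["word"] == word else None

assert referent_of("me") == referent_of("you")                      # reference united
assert perception["me"]["content"] != perception["you"]["content"]  # contents differ
```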
January 2014

The random mutation + natural selection formula applies
equally well to the competition between evangelical atheists and scientific
creationists, who will always be able to develop an effective resistance
to atheist debunking arguments. But there is a limit to what random
mutations can accomplish, which is vastly outstripped by the capabilities
of intelligent design. If this statement is indeed false, it may turn out
to be false on one level because it is true on a deeper level.
We must make a distinction between a “Russellian paradox” of
reference, e.g., the set of all sets that are not members of themselves vs.
an infinite regress of metalevels in which the role of cause and effect
switch places in an oscillatory pattern after the fashion of a
superposition, vs. referent. (See David Bohm’s discussion in his
textbook of quantum mechanics, Quantum Theory (1951) concerning the
mental aspects of quantum mechanics). Discuss how the fact that the “great
evangelical atheists” are dying out during the same period in which deeper
layers of complexity, regulatory genetic control mechanisms and
epigenetics, are being uncovered means that a new generation of
biological scientists of similar intellectual caliber shall not likely replace
them.

January 2014

But refer to those commandments and please note that
compliance with half of them makes slavery impossible. But there is
understandably no admonition in the Bible to follow 5 out of 10
commandments. An 11th commandment against slavery would have
been equally redundant. Being either an actress who took up activism or
an activist who took up acting, am I right in supposing that Ms. Sorvino
possesses something less than a stellar intellect? You know, Ziad, Ms.
Sorvino's is an outstanding example of some of the more naive criticisms
of Christianity of which white liberals (who are not intellectuals) are
frequently guilty. There are indeed intelligent criticisms that can and
have been leveled against the Christian faith. This is not one of them.
Empirical mathematics is founded upon the notion that mathematical
complexity is capable of exceeding the grasp of the divine mind. This is
similar to how the complexity of the domain of eigenvalues can outstrip
the complexity of the domain of eigenfunctions.
The Taylor power series expansion of analytic functions involves
exponentiation, but the rules of exponentiation break down when it comes
to transfinite arithmetic.
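A standard set-theoretic fact (supplied here by me as context for the remark above) shows how exponentiation fractures in the transfinite: the "same" base and exponent give different answers under the ordinal and cardinal readings, so the familiar exponent laws cannot carry over intact.

```latex
% Ordinal exponentiation is computed as a limit of finite powers:
2^{\omega} \;=\; \sup_{n<\omega} 2^{n} \;=\; \omega
% whereas cardinal exponentiation obeys Cantor's theorem:
\qquad\text{but}\qquad 2^{\aleph_0} \;>\; \aleph_0 .
```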
Compose a list of paradoxes that are related to Russell's paradox.
Unlike being off by a mere factor of two, as is the case for the lion's
share of crackpotty physics theories, a theoretic model which predicts
effects which propagate, act, increase or decrease in the reverse direction
relative to what is actually observed, cannot be rescued with a mere
patch, but must be thrown upon the scrap heap of failed theories. Kind
of like if Maxwell's equations of electromagnetism had predicted a
Lorentz force with the wrong sign.
Once an abiding system of feedback is in place, one which perturbs
the natural selective forces heretofore acting alone on human behavioral
genetics, at this very point a honing-in process starts up, one which
shall in practical terms intelligently shape human behavioral genetics in
accord with the ground relations of this system, to which there has
suddenly been linked a feedback coupling.
There is a bit of a kindred problem in Terence McKenna's notion of
brain activity as an ongoing dye-marker perturbation of the quantum
field. This is related to another problem which I first brought up in
connection with the mental illness likely to occur during deep space voyages
in zero gravity, because of the twofold coherence and decoherence
processes that operate alongside each other within the quantum brain.
January 2014

Cellular organization: Recent research implies that gravity
helps cells create patterns. In microgravity, the microtubules in
developing cells might not organize the same way they would on Earth,
even after the astronauts return. It is unknown how this will affect the
[. . .] uncovered a compelling reason why the dream of colonizing space may
be a non-starter. It seems that the skeletons within living cells may not
form properly in zero gravity. This means that it may be impossible to
live in space over the long-term without creating a form of artificial
gravity. Most cells have skeletons made up of microtubules made from
fibres of the protein tubulin. New Scientist magazine reports that Dr
James Tabony and his colleagues from the French Atomic Energy
Commission mixed up cold solutions of mammalian tubulin with an
energy-releasing compound. When the mixture was warmed to body
temperature for six minutes, microtubules began to form in distinct
bands at right angles to gravity. Next, the team sent up tubulin on a
European Space Agency (Esa) rocket to expose it to the effect of
weightlessness. They found that when microtubules formed, they
pointed in all directions. Dr Tabony said: "This shows gravity triggers
the pattern." Previous work by Dr Marian Lewis of the University of
Alabama at Huntsville produced similar results. Dr Lewis's team tested
the impact of weightlessness on human white blood cells that were
flown on board the space shuttle. After a day in orbit, the microtubules
grew in random directions. The findings might explain some of the
health problems people living in space have, such as depressed immune
systems”, c.f.,

The uniform state-space with its combinations and permutations of
distinct states must always be understood to be an abstract projection.
I have said this before but I believe it bears saying again: consciousness
as such may indeed be like the state-space... nothing more than an
abstract projection and reification of what can only truly happen at an
individual level.
PHP on the server side and JavaScript on the client side.
Having the epiphany that you personally don’t really know anything at
all means that Man's knowledge can be at best but a collective illusion. . .
and all this within the context of a sheer abundance of grace. How can
one not then have faith upon recognizing the Providence involved in
such a coherent manifestation of the radical unknown, i.e., the World as
subjectively perceived? One is called forth from the Void and enters the
world from one unknown only to pass from it into another unknown.
What is lost on some is that the world itself is yet a third unknown,
rendering the first and the latter qualitatively distinct. And so
metaphysical work can only be performed by experience if it is possible
to transcend all dual opposite categories. kwo=“I do not fear death. I
had been dead for billions and billions of years before I was born,
and had not suffered the slightest inconvenience from it.” - Mark Twain

The number of distinctly possible universes in the multiverse outstrips
the number of possibly distinct human minds in this universe - that is an
important distinction or qualification to have made, but it does not reduce
the original argument, that is, its original force, but merely acts as a
kind of patch to the original argument.

I don't speak from the top of the mountain but from many valleys.

Anne Behrnes says Buddhism is existential and not metaphysical and
that the same principle applies to all religions. It's all distraction, she
says, a way to not be present to your life.
I was awoken by the dream of a sleeping avatar representing a friendship
lost. She hurled curses foul and threatening towards me, the fool who
disturbed her slumber.
Philip Jose Farmer, To Your Scattered Bodies Go.

Entropy vis a vis appearance of order vs. actual order.
Quote Feyerabend chapters.
So how does Jesus' redemptive act perform the necessary metaphysical
work to actually redeem mankind if man is not greater than he is, and Jesus
is not other than he is thought to be, namely, man as an unsuccessful
attempt to make a Jesus?
Grace as the mechanism which counteracts the root of all evil which is
Without the presence of daily and abiding joy we know that the person
does not indeed possess the faith necessary to accept the gift of God's
The many paradoxes of the early 20th Century in mathematics, physics,
psychology, art and literature signaled the breakdown of the 19th
Century's conception of reality. Paradox signals impending awareness of
a larger system containing the system from which one is currently
operating. Rationality is the mysterious nature of how the elements of
one system transform as they are caught up into higher systems. Nature
is hierarchically structured. We have no concept of consciousness
because it is indeed consciousness that underlies rationality.
December 2013

Neural impulses can only add together within a preexisting
system which provides the context within which they can be interpreted.
Both the spatial binding and temporal integration of brain function rely
upon quantum entanglement or quantum nonlocality within brain
microtubule tubulin dimer proteins.
How can what happens within an isolated or closed system be about
anything, which is to say, how can it make reference to something outside
of itself? Deterministic causality is context-free causality or causality
within an isolated or closed system.
Time telepathy and the simulation argument.
One would be expected to possess a phenomenal sense of time and
internal clock. December 2013
Ironic that the only real evidence for evolution is the microevolutionary
adaptability which makes species robust against fluctuating
environmental conditions.
He was in part; she was in part; he was not in part; they were playing the
part; she got the part. There should be a new verb tense for those who are
repetitively playing more or less the same character in a plot. This is the
concept of playing the same part, but differently each time, in each
different alternate universe.
Does being reunited with everybody familiar from one's earthly life
constitute in any way a disproof of solipsism?
Scaffolding of the stepwise building up of mass larger than the Planck
mass is analogous to the need for embodiment for the development of
the soul.
Social construction of the ego.
The infant provides the theory that gives context to the sociolinguistic
context of 'big people making funny sounds'.
Temporal integration versus the integration of proto selves.
Paradigm-busting philosophical
disingenuousness or wonder.






And in the teaching/retelling by admixtures of both.
I am very much for celebrating the great successes of 20th Century
physics but we should be guarded against the arrogance typical of some
hard scientists by contemplating the likely derision in which 40th
century scientists will hold 30th century scientists. If the currently strong
remaining strand of the generalized “future-hype” of the 1960’s which
survives within the current day, i.e., the artificial intelligence community
and its promoters, were destined not to fizzle out within its current
scientific/engineering paradigm, or, even better, were not to be altogether
overthrown in the shift to the paradigm succeeding it, then one would have
to suspect just how pervasive and successful indeed has been Descartes'
“deceiving demon” in preparing beforehand all the myriad layers of
nested camouflage which the career of post-Enlightenment
technological development has systematically peeled away, one by one,
perhaps all too easily in retrospect. One really must wonder, in light
of the piquant observation above, whether October 2014 fcbk= “it's *all*
mythology, I say. . . Man's science, his religion, the whole lot!” This is
an overgeneralization, “every self-respecting scientist and religionist
would likely agree”. “For example, the Ptolemaic epicyclical model did
kind of work, didn't it? But we now know that the Gods of Olympus
were a bunch of phooey, right?”
Very small children, who are not yet three years of age, will happily play
side by side without really interacting with one another. A certain level
of advancement in the sociolinguistic programming of the children’s
neural networks, which is determined both internally and externally, say
by behavioral genetics and social interaction, respectively, must have
first been achieved to enable interpersonal cooperation and
communication and so a bootstrap function must somewhere be in
evidence, either in the brain or, more likely, in the brain-environment system.
If there is no concept of consciousness there can be no theory of
consciousness, and consequently no understanding of what consciousness
truly is or how it operates.
Solipsism and the social nature of reality. The engaging nature of people
that is of other minds takes our mind off of such self-destructive
metaphysical questions. An afterlife composed of an endless empty
meadow. Had somebody not been born could they just wait until the
universe goes bang again and get another chance? I had a dream with
you which explained everything to me. The profundity of auto text
correction is similar to that of misheard lyrics.
What are we to make of the anthropic cosmological
principle taken together with a multiverse in which the number of universes
vastly outstrips the number of bona fide persons, which is to say original
persons, who are not merely alternate versions of said persons? And
where all the bodies and universes are superposed, although
consciousnesses do not superpose in a quantum fashion. This is because of
the underlying dynamics of consciousness involvement with the
mechanisms of decoherence and wavefunction collapse. So by the
anthropic principle, oneself must be bona fide, but all of the other selves
in one's populated world are merely alternate versions of true or bona
fide selves, each of which are located in some unimaginably far distant
alternate universe. This would imply that only oneself possesses free will
and consciousness; everyone else in one's populated environment is
merely an intelligently acting automaton.
January 2014

The Anthropic Cosmological Principle constitutes an epic fail
as explanation of the intricate and harmonious complexity of the cosmos
being as it is merely the anemic stepbrother to a secret lone rightful heir,
that of the Solipsistic Cosmological Principle.
Because acceleration only rotates the four-momentum vector but does
not change its absolute magnitude, an external force is not
actually possible.
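The kinematic point above, that a boost only "rotates" the four-momentum while leaving its Minkowski magnitude untouched, is easy to exhibit numerically (a sketch of mine in units with c = 1; `boost_x` is an illustrative name):

```python
import math

def boost_x(E, px, beta):
    """Lorentz-boost the (E, px) components along x with velocity beta (c=1)."""
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return gamma * (E - beta * px), gamma * (px - beta * E)

E, px = 5.0, 3.0               # invariant m^2 = E^2 - px^2 = 16
E2, px2 = boost_x(E, px, 0.6)  # the components change ("rotate") ...
assert (E2, px2) != (E, px)
# ... but the Minkowski magnitude is preserved:
assert abs((E2**2 - px2**2) - (E**2 - px**2)) < 1e-9
```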
Though the data passed away with the destruction of the brain originally
containing that data, the contextualization and interpretation of this
data remains, in the form of a network of quantum entanglements in
the electromagnetic field. Consciousness and its referential ground.
The appearances are overdetermined and possess an entropy; the self or
its consciousness is underdetermined, acting more or less as a heat bath
probe though with filtering capabilities, which are quite extraordinary in
that they extract quantum encoded entanglement based information from
the pristine vacuum electromagnetic field. In the reprocessing, the
initial and boundary conditions for this vacuum are also extracted. This
bypasses the traditional fine-tuning problem.
The anthropic
cosmological principle is but an inflection of a much more stringent
principle, that of the solipsistic cosmological principle.
One’s consciousness is not determined by some immense,
exquisite set of causal conditions with respect to a particular fixed and
random fine-tuning of the fundamental physical constants of the
Universe, but rather by the fine-tuning itself, which therefore, vis a
vis the Anthropic Cosmological Principle cannot be anything like
random. November 2014 (There is something continually “going on” in the
background in some ultimate context, which demands that this “fine-tuning” be continual and endlessly adaptable and creative, but must also,
in the final analysis, be just for the sake of “staying in the game”, i.e.,
maintaining objectivity qua intersubjectivity as opposed to mere
absolute subjectivity.) December 2013 @$ prn=The “internal appearances” place
a far greater constraint upon this fine-tuning of the fundamental physical
constants in support of the character of one’s own conscious experience
than do the “external appearances”, which concern themselves only with
the apparent configuration of people and animals, tables and chairs, etc.
within one's subjectively perceived spacetime. Consequently, the
fundamental physical constants must have been fine-tuned to something
like 14 decimal places in one’s own case, though these constants need
only to have been fine-tuned to 7 or 8 decimal places in order for other
peoples’ behavior to be as manifestly coherent as it traditionally appears
to be. This is a kind of inverse "time scale reductionism", which I will
term "frequency scale reductionism", i.e., the behavior of systems
requiring cosmological fine-tuning to N decimal places is determined by
underlying systems requiring cosmological fine-tuning to N + 1 decimal
places. So by the very same logic proffered in support of the Anthropic
Cosmological Principle, one could put forward an equally cogent
argument in favor of a so-called Solipsistic Cosmological Principle.
November 2014
I am reminded here of A. A. Michelson’s infamous assertion,
made during the dedication of a physics laboratory in Chicago in 1894
that, "Our future discoveries must be looked for in the sixth decimal
place." (This was by way of his noting that all of the important physical
laws had already been discovered). And in recollecting Michelson’s
assertion in connection with this discussion of the Anthropic
Cosmological Principle, I am inclined to believe that he was indeed
correct, but for reasons neither he nor anyone of his era could have
imagined (except for possibly Boltzmann himself).
February 2014

Bousso and Polchinski (2003) calculate that string theory
predicts 10^500 distinct universes, based upon each universe possessing
slightly different fundamental physical constants. This would seem to
imply that there is "room at the bottom" to support hyperfine tuning well
beyond the intersubjectively measurable realm of a mere 4 to 14 decimal
places. So the seemingly hyper-fine tuning of the Bohr magneton to 14
decimal places, for example, is merely superadded to the much more
precise fine tuning that is required for the functioning of an individual
subjectivity/consciousness. This is paradoxical fine tuning that starts out
finer and progresses to coarser. Egoic consciousness is revealed for what
it is: an intersubjectively mediated structure of initial and boundary
conditions placed upon qualia-consciousness or consciousness qua
substance/dynamic integral form. Ego is a structure of an individual
consciousness that is effectively the collaborative effort of myriad
ego-transcending qualia-consciousnesses. The implication here is that, even
if one is indeed a brain in a vat, the feeding of impulses into one’s
isolated brain that manages to succeed in producing the illusion of a
coherent world (and world view) must be managed through a
collaboration within a collective of big-headed alien scientists and

cannot be the product of a single big-headed alien scientist. Language is
inherently social in nature and the ego is inevitably a sociolinguistic
structure! Metaphysical solipsism is only a metaphorical interpretation
of a hybrid methodological-epistemological solipsism. There are no
solipsist big-headed alien scientists. This is in part because science is
necessarily a social endeavor.
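A back-of-envelope version of where the 10^500 figure comes from may be useful here. The counting is standard in string-landscape discussions; the specific values (roughly ten allowed values per flux quantum number, roughly five hundred independent cycles) are illustrative assumptions, not figures from this note:

```latex
% Heuristic landscape counting: K independent flux quantum numbers,
% each taking of order L discrete values, multiply out to
%   N_vacua ~ L^K  distinct vacua.
N_{\text{vacua}} \sim L^{K},
\qquad L \sim 10,\; K \sim 500
\;\Longrightarrow\;
N_{\text{vacua}} \sim 10^{500}.
```

On this counting, each vacuum fixes a slightly different set of effective fundamental constants, which is the sense in which the note's "room at the bottom" is meant.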
(documentary: "Universe or Multiverse?")
How naive is the level of discussion, philosophically speaking, since the
larger metaphysical implications are ignored so that this paradigm-busting
theory is treated solely within the obsolete paradigm.
August 2014

Just as we need transcendental mind to ground our concept of
consciousness and so a theory of “other minds”, we need to allow the
possibility of metaphysics as an enabler of intersubjective objectivity as
a hedge against solipsism.
It seems all causal relationships within four-dimensional spacetime can
be encoded as quantum entanglement on a two-dimensional surface. I
think this is pretty solid proof of the holographic universe theory.
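For context, the quantitative statement usually cited behind the holographic picture is the Bekenstein-Hawking area law (a standard result; the link to the note's claim about encoding causal relationships is the author's own inference):

```latex
% Bekenstein-Hawking entropy: the information capacity of a region
% scales with the AREA A of its boundary, in Planck units,
% rather than with the region's volume.
S \;=\; \frac{k_B\, c^{3} A}{4\, G \hbar}
  \;=\; \frac{k_B\, A}{4\, \ell_P^{2}},
\qquad
\ell_P = \sqrt{\frac{G\hbar}{c^{3}}}.
```

It is this area (rather than volume) scaling that motivates describing the bulk as "encoded on a two-dimensional surface."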
One has to be thrown into uncertainty in order to have the opportunity to
reduce that uncertainty which is to say acquire information and learn.
Wittgenstein's method appears to be one not of providing insight, but of
providing the elements which the creative imagination needs in order to
have its own insights.
Is the aboriginal hologram model consistent with the notion of the
rationality of language and with the reprocessing of 'shard' experience
into new forms entertained within the original hologram reconstituted?
Energy degeneracy and Hameroff's insistence on keeping qubit states
from entering a Bose condensation.
Instead of wiping us out, will the robots enforce humankind's own
principles of ethics and fairness, or will they only be able to be intelligent
by becoming self-conscious, that is, by connecting to the cosmic Internet
and in so doing becoming evil, just as mankind is evil?
I like the way Hameroff describes the mechanism of action of
hallucinogens in terms of how the hallucinogenic compound donates
electron resonance to the receptors. Of course, we already know that this
resonance has to do with the exchange of data and information between
tubulin dimers and the quantum vacuum electromagnetic field.
These compounds simply change the interface where the filtering of
information occurs within the vacuum electromagnetic field's sub-spectra.
There is an important distinction to be made between the aboriginal
hologram and the dynamic informational ground providing the context
for the collective reprocessing of information which is being collected
by each individual shard of the original hologram. Some distinctions are
being suggested here which to me are reminiscent of the emanationism
of Plotinian philosophy or of Neoplatonism.
Hameroff says we do not want strong Froehlich coherence in the
microtubule tubulin dimer quantum switches ('qubits') because this
would result in Bose condensation of all of the qubits and you can't
really do any meaningful computation with all of the qubits in the exact
same state such as would be required by strong Froehlich coherence of
the qubits. The devil is in the details. Similarly, we can't have any
meaningful experiential computation if there's only one transcendent
infinite consciousness: the hologram needs to be shattered, and each shard
must have its own state, so that in cooperation there can be meaningful
computation, collectively.

Consider the entropy of the unused possibilities for various states of a
system with respect to the closedness or openness of the system's state
space, thermodynamically. Quantum field theory in curved spacetime,
and also in curved phase spaces and in curved state spaces. Gravitational
decoherence of the state vector or wave function.
We see the logic of the anthropic principle at work whenever we listen
to an accomplished individual in some field or discipline describing how
he arrived at the original inspiration to pursue his successful
long-standing career. The interviewee points to various junctures in his early
schooling or professional career which seemed very much driven by
chance events. However the underlying logic informing the choices of
this once young researcher or scientist exhibits a cohesiveness strongly
suggestive of teleological influence and almost as though he is being
guided by his future self, somewhat akin to how a quantum computation
is performed by mutual interference of myriad distinct branches of a
superposition. And it is almost as though there is a stiff competition
between future alternate versions of this young person vying with each
other to influence his choices in such a manner that they come into being
as the adult versions of this young person instead of their competitors.
January 2014

Can the logic of the anthropic principle be applied to the
question of what constitutes the present moment, or what is called "now"?
Spatial and temporal scale in the solar system; the realism dilemma.
Of course, the logical answer with respect to the ontological status of the
other is to say that each individual simply possesses their own unique
subset of the resonantly tuned vacuum fluctuation field.
Any amateur rhetorician who knows both the canon of ancient Scripture
and who has merely a popular science writer's grounding in the basic
sciences can with a little imagination and effort make any recent

scientific discovery backwards compatible with a verse or verses
plucked out of context from those canonical scriptures. Doing so proves
nothing other than perhaps the cleverness of the rhetorician. Remember
that the Sophists of ancient Greece used to pride themselves on being
able to make the worst side of an argument appear the better one and
there are still many around today keeping this cynical tradition alive.
What is more than a little paradoxical today is that some of these folks
are unaware of their own cynical motivations.
Daniel van Gent and Dr. Robert de Brandes may end up being the
collective Marconis of the 21st Century. That is, if the Chinese don't end
up stealing all of their ideas. The team has already successfully applied
for several European patents of their novel communications technology.
The signals which live in a mysterious domain known as 'Hilbert space'
cannot be hacked or eavesdropped on or attenuated, jammed or
otherwise interfered with by other communications signals. The
technology, which has already been successfully demonstrated in
principle, is ideal for communication between underground installations
and between submarines which are submerged for long periods of time.
The technology seems potentially applicable to the real time telemetry of
space probes, but that's where the inventors would likely get into trouble
with the US Patent Office, which holds to the long-standing policy of not
accepting patent applications for designs that are deemed to invoke
principles that potentially fly in the face of established physical law.
What these patent examiners need is a hefty dose of Hume and
If you are considering shutting down your FB page or want to preserve it
against the day that FB decides to shut down your page, then this
application is for you.
With penultimate subtlety are the initial and boundary conditions of the
universe for the fine tuning of the consciousness of the other managed in
resonance with this universe whose fundamental physical constants have
ultimately been resonantly tuned to uniquely correspond with my

The only way to democratically apply the anthropic cosmological
principle to self and others is to follow a poly-solipsistic or
emanationist theology. In other words, the only way to avoid the
ridiculous conclusion of the solipsist philosopher who misapplies the
logic of the anthropic cosmological principle is for one to adopt
polysolipsism . . . we collaborate with each other to produce this theater
within which we can connect, learn, grow and love.
The world perhaps is orchestrated by all secretly collaborating and, like
the occasional catching of a familiar voice in a crowded room of
conversations, one occasionally hears one's own voice coming back at one.
The logic of the anthropic cosmological principle is a veritable
metaphysical stick of dynamite; it's a stick of dynamite that begs to be
used in the appropriate way to solve some long-standing and annoying
metaphysical problems that date back to the ancient Greeks and on which
we have apparently made little progress in the last twenty-five hundred
years.
Just as ordinary people do, philosophers take for granted the implicit
deep infrastructure which graces language, mind, and all of the faculties
of the human person.
Philosopher Aaron James's theory in his book, Assholes: A Theory, is
that "a person counts as an asshole when, and only when, he
systematically allows himself to enjoy special advantages in
interpersonal relations out of an entrenched sense of entitlement that
immunizes him against the complaints of other people... The asshole acts
out of a firm sense that he is special, that the normal rules of conduct do
not apply to him". This is by far the best definition of the colloquially
known "asshole" ever made known to me.
Vsauce 2
PBS Idea Channel
Combine the belief that there is nothing greater than oneself with the
anthropic principle and you have something approaching solipsism.
Why wouldn't you be the most superior and perfect being, unless such a
being had arranged it so that you were further down the spectrum, below
perfect godhood?
Instability goes hand in hand with fine tuning of a system. This in turn
requires an extremely intricate system of feedback and control existing
at the ground of being root level.
'In the years after the initial experiments, Couder and Fort used the oil
bath to perform several of the classic experiments in quantum mechanics,
including Young's double-slit experiment, and found that the walkers
exhibited many similarities to the entities used in the original
[experiments]. One area where the walkers' analogy with quantum mechanics fails,
however, is entanglement, the weirdest quantum phenomenon of all, which
describes how the physical state of two particles can be intricately linked
no matter how far apart in the universe they are.
For this to happen, a wave must occupy a very high number of
dimensions so particles can affect one another over large distances,
faster than the speed of light. However, in a walker system the waves
will always occupy just two dimensions, given by the length and width
of the oil tank', cf., 'Can an oil bath solve the mysteries of
the quantum world?'

Only a transcendental ground of being can accommodate the rationality
of excluding possibilities from the universe of discourse. The
transcendental ground of being satisfies all of the requirements for
Deity, i.e., 'big G' godhood, but in addition, so much more.
November 2013

Anything within the state space of possible states is possible.
But pursuing the logic of your principle still further, we must say that
multiple (and perhaps an unlimited number of) state spaces are possible.
Exclusion of possibilities underlies the integral unity, coherence and
cohesiveness of worldhood as such. The structure of possibility is
probably fractal, which automatically carries the implication of excluded
possibilities. For example, it is not possible for me to experience your
conscious mental states or sense data. Another example: the real line
possesses a topology, though one that is nonlinear.



November 2013

The bizarre counter-intuitive behavior of the wave function is
smoking gun evidence that we reside within a so-called ancestor
simulation and that we are collectively responsible for the operation of
said simulation.
I couldn't agree with you more, Sam! What you are suggesting is a
"regime-culture" change, which, though radical, is just. If the robots do
not wipe us out a few generations from now, but instead apply their
implacable logic to the forced implementation of humans' own common-sense
notions of fairness and ethical behavior, then maybe your powerful
vision of how things ought to be can be made a reality.
I just looked in my email drafts folder and was psychically crushed
under the weight of so many abortive thoughts.
Managed compassion is the inevitable solution stumbled upon by the
meta-guilty conscience, which feels keenly the mild but nagging guilt of
not feeling guilty.

Rene Girard Mimetic Theory
Well ordering of the real line. McAfee.
Our Holy Father the Pope by Don Caffrey Ignatius Press
I agree with that article, Ziad. However, the belief in a supreme
intelligence or at the very least in a 'cosmic programmer' of the universe
somewhat akin to George Lucas' 'The Force' may continue among the
relatively uneducated classes for many generations. Strangely, there will
be a rise in the belief both in epistemological solipsism and in a
poly-metaphysical solipsism, i.e., the belief that the universe is a
collaboration and sociolinguistic construct of myriad individual
consciousnesses. The deepening realization that there is no evidence for
ETs will begin to reinforce the idea that we are members of an at least
billion-year-old civilization living within an 'ancestor simulation'.
Philosophers and cosmologists, as well as some physicists and a
goodly number of philosophy-educated people of the not-so-distant
future, will take it for granted that the probability of universe-simulacra,
for example, Boltzmann brains, et al., greatly outstrips the probability of
so-called 'real' universes. This realization is natural in light of the
compelling logic of the anthropic principle.
January 2014

There is a historical collection of initializations, updates and
patches in the quantum field, corresponding to which there are pointers in
the brain; however, these pointers are ambiguous when not currently
engaged in the wider underlying quantum field. How large of a temporal
slice or chunk is required in order to accurately duplicate a global brain
By a new and different logical pathway I have returned to the notion of
resonant tuning of Boltzmann brains. This pathway is the resonant
tuning, to say 14 or more decimal places, of various fundamental
physical constants. Any physical instrument or device which relies on

resonant frequencies and which has been engineered to the extreme
limits of sensitivity should perhaps be able to pick up the effect of one,
and perhaps multiple, individuals in its immediate vicinity, due to the
normal operation of their conscious minds, or perhaps only when they are
in special mental states such as meditative states. The anomalous
interactions between conscious individuals and random number
generators may be a manifestation of this type of phenomenon.
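As a purely illustrative aside, the "anomaly" such RNG experiments look for is a statistical deviation of a bit stream from chance expectation. A minimal sketch of the standard z-score test on a nominally fair bit stream (hypothetical data and ordinary binomial statistics; nothing here is from the experiments the note alludes to):

```python
import math
import random

def z_score(bits):
    """Deviation of the count of 1s in a nominally fair bit stream
    from its chance expectation n/2, measured in units of the
    binomial standard deviation sqrt(n)/2."""
    n = len(bits)
    ones = sum(bits)
    return (ones - n / 2) / (math.sqrt(n) / 2)

# An ordinary pseudo-random stream should stay well within |z| < ~3-5;
# an "anomalous" influence would have to push |z| persistently beyond that.
random.seed(0)
stream = [random.randint(0, 1) for _ in range(100_000)]
z = z_score(stream)
print(abs(z) < 5)  # no anomaly expected from a PRNG
```

The claimed effects in the literature the note gestures at are, at most, tiny shifts of exactly this statistic accumulated over very many trials, which is why their interpretation remains contested.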
One paradigm shift which will significantly affect the evolution vs.
intelligent design debate is the dichotomy of faith versus reason and
faith versus evidence, miracle versus natural law: Kuhn and Feyerabend
and their philosophies of science vs. the Enlightenment notion of the
inevitable linear progress of scientific discovery.
November 2013

Within current Multiverse theory, the number of possible
alternate universes astronomically outstrips the admittedly large but
relatively tiny number of possible distinct human brains. Add to this the
peculiar logic of the Anthropic Cosmological Principle and you may
find yourself in the possession of a probable truth that dare not speak its
name, c.f., the suggestion of the front book cover of the
Doesn’t it make sense that, with such an inconceivably large number of
possible universes that one could be born into, that one would naturally
be born into (brought into being in) just the very universe possessing a
quantum mechanical ground state or quantum vacuum that was the most
exquisitely fine-tuned in terms of the precise collective settings and/or
adjustments of the more than 20 fundamental physical constants in order
to be compatible with both the unique requirements of one’s peculiar
flavor of subjective consciousness, the phenomenal contents and qualia
uniquely befitting this consciousness, and the equally peculiar mode of
quantum mechanical functioning of the spiritual-material interface
(quantum mind-brain) qua reducing valve-consciousness filter? And
doesn't it follow, moreover, that from the standpoint of one's own
unique anthropocentric and 1st-person point of view, for the brains of
any and all other human beings among whom one has now found
oneself, the fine-tuning of the fundamental physical constants of this
universe is not quite so exquisitely precise as in one's own particular
case?
One of the outcomes of living within an ancestor simulation is that not
all viruses of the mind originate with other human minds.
Clayton Smith says that the fact that the Pauli Exclusion Principle does
not apply to consciousness is connected with the idea of the fine tuning
of consciousness, the hard encryption of quantum-entanglement-encoded
information, and the anthropic principle. In the movie, A.I.
(Artificial Intelligence), the idea from the metaphysics of mind is
expressed by one of the super-advanced AI’s from 2000 years in David’s
future that, "David, I often felt a sort of envy of human beings, of that
thing they call 'spirit'. Human beings have created a million explanations
of the meaning of life - in art, in poetry, and mathematical formulas.
Certainly human beings must be the key to the meaning of existence.
But human beings no longer existed. So, we began a project- that would
make it possible to recreate the living body of a person long dead from
the DNA in a fragment of bone or mummified skin. We also wondered
would it be possible to retrieve a memory trace in resonance with a
recreated body. And you know what we found? We found the very
fabric of spacetime itself appeared to store information about every
event which had ever occurred in the past. But the experiment was a
failure. For those who were resurrected only lived through a single day
of renewed life. When the resurrectees fell asleep on the night of their
first new day, they died again. As soon as they became unconscious,
their very existence faded away into darkness. So you see, David, the
equations have shown that once an individual spacetime pathway had
been used it could not be reused. If we bring your mother back now it
will only be for one day. And you will never be able to see her again.”
There seem to be two fundamental views on the nature of individual
consciousness in relation to consciousness as such. One is that
consciousness in effect obeys Fermi-Dirac statistics (this is my view);
the other is the more common mystical view that the consciousness of
the individual obeys Bose-Einstein statistics.
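For reference, these are the two textbook distributions being invoked; the mapping onto consciousness is this note's metaphor, not physics. What the metaphor trades on is the occupancy behavior: fermionic states admit at most one occupant ("distinct shards"), while bosonic states admit unlimited occupancy and can condense into a single state ("one transcendent consciousness"):

```latex
% Mean occupation of a single-particle state of energy E,
% at temperature T and chemical potential mu.
\bar{n}_{\mathrm{FD}}(E)
  = \frac{1}{e^{(E-\mu)/k_B T} + 1} \;\le\; 1
  \quad \text{(Fermi-Dirac: at most one occupant per state)},
\qquad
\bar{n}_{\mathrm{BE}}(E)
  = \frac{1}{e^{(E-\mu)/k_B T} - 1}
  \quad \text{(Bose-Einstein: unbounded occupancy, condensation possible)}.
```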
Seeing the depiction of parallel earths and the parallel lives of the
humanoids that live in these parallel worlds just reminds one of how
cultural patterns become historically locked in by chance events.
There is a relationship between inertia, robustness and holographic

All distinctively quantum interference effects are in reality
self-interference effects. The litmus test of objective realism is the presence
or absence of self-interference effects at some theoretically
predetermined threshold. The question is whether man possesses the
insight and imagination to propose such a theory.
When our method of experimenting begins to get at the bootstrap
mechanism of physical reality, it is here that mental effects should become
'All important truths are encompassed for all time by my own beloved
intellectual prejudices.' If this is in fact what one secretly believes, then
is it nevertheless possible to somehow bootstrap one's consciousness out
of such a mental trick bag? Probably not. Therefore an advisable course
of action is to make sure that this trick bag is as large as possible.
That's right. One can only move from one trick bag of intellectual
prejudices to another trick bag.
That is because all information is contextualized and structured data.
There is no such thing as raw information in the absence of a

transcendental ground of mentality, which is to say transcendental mind.
Quantum interference effects are manifestations of a collective
information dynamics, not of a physical mechanism as such.
Yeah, that Jesus logic is perfect. We were all God and all chose
limitation, including the amnesia of the transcendent self that goes with
this foolhardy act, perpetrated out of the belief that the other would do
the same. Most everyone forgot the original mission, having become
enmeshed in Maya. Some, like Buddha and Jesus, discovered how to
recontact the higher self, though without truly becoming one with it.

November 2013

Yeah, I'm hardly ever on here. We'll have to chat soon.
When I see you again, it will be like no time at all has passed. It will be
as though we had just resumed that personal conversation of 18 years
ago. Time and space are meaningless between great friends. By all
means, re-read part of my comment with all of the appropriately classic
Leonard Nimoy intonation.
November 2013

I have made it a habit to purchase books on Amazon that
have inverted-bell-curve ratings spectra, i.e., a lot of one-star and
five-star ratings and relatively fewer 2- to 4-star ratings, quite regardless of
whether I think I’d agree with the author’s thesis or not. It’s a great way
to step outside of the echo chamber of one’s own intellectual biases and
prejudices. I highly recommend that readers of Meyer’s book next turn
to Mark Perakh’s book, Unintelligent Design. Life is too short and the

universe of ideas too vast to only read authors one agrees with.
October 2013 fcbk=

Guisean Buddhistentialist meditative states are achieved
not in the Lotus position, but while standing and gazing intently in the
mirror at one’s own admirable reflection. The facial muscles must relax
completely and one must appear as unenthused as possible, all the while
one rhythmically intones the empowering phrase, which one has
received from the master. Then and only then, if one is worthy, a
channeling of his contemptuous spirit takes over and the insistent mantra
of subvocal reverberation continues to grow, now powerfully with its
own inner voice, one possessed of a mildly disdainful Brooklyn accent,
which yet continues to build, feeding upon itself until it suddenly erupts,
like a dagger of the mind, giving birth anew to this forever
disembarrassing phrase, only now released from the very depths of
crushing ennui. In a blinding flash, the ego is liberated from its
self-imposed dictatorship of caring with just these four simple words: "Who
gives a fuck?"
August 2014 fb=

Psychologist, Sir Cyril Burt writes: "Our sense organs and
our brain operate as an intricate kind of filter which limits and directs the
mind's clairvoyant powers, so that under normal conditions attention is
concentrated on just those objects or situations that are of biological
importance for the survival of the organism and its species. ... As a rule,
it would seem that the mind rejects ideas coming from another mind as
the body rejects grafts coming from another body." This is the "brain as
reducing valve" idea that was made popular during the early 20th
Century by William James, Henri Bergson and Aldous Huxley. On this
view, the brain is a kind of "tuner", which does not produce
consciousness, but merely "acts as a dye marker" (Terence McKenna) to
shape and structure the preexistent consciousness and filter its dense
profusion of jumbled information. Is there mumble in the jumble? :p
And so when we hear the accustomed inner voice, which we
unreflectingly associate with our innermost self, are we merely in such
instances a highly evolved hominid ape whose brain is momentarily
channeling a human spirit that only ever truly exists within the cosmic

matrix, and the hominid ape's own egoic, experienced consciousness
never extends beyond the 1.5 second duration of its specious present?
Who can fracking say?
December 2013

"To my son, Nicholas Coope (1980–98), who fell to his death
climbing in Glen Clova: a brave and thoughtful lad, proudly
remembered." - From a book dedication by the boy's mother. Seemed
placed there by the author mostly out of considerations of style. (I am
amazed sometimes by how subjective my perceptions can be when it is
only my own ego which benefits.)
October 2013 fcbk=Naomi Jakins

Concerning An Inconvenient Truth: it is a
convenient book . . . for Al, since it helped put him at the ground floor as
majority partner investor in the newly emerging global cap-and-trade
commodities market, which his book helped to create the demand for. :
Still more, if philosophers and cosmologists of some future age but
possess the patience and acumen to look, many of the secrets of the
Universe shall be laid bare within the past seven years of Brian's not so
humble Facebook postings. Unfortunately, we who remain ensnared
within this benighted century (unlike Brian) possess neither.
Search Term Month Year (contribution put within context of a search term
– primitive hyperlink)
@$ “at money” (an important or seminal passage)
@? “kernel idea requiring further development”
au= “author is” (an important or seminal thinker)
cit= “citation is” (important citation)
con= “concept” (important concept)
cont’d= (to be continued)
epi= “epigram” (a candidate bon mot)
ess= “essay” (passage containing a promising essay topic)
fcbk= “serves as an interesting Facebook posting”, e.g., Lime Cat, Lime
Cat Universe, etc.
fic= “novel or short story idea”
hyp= “hypothesis” (important hypothesis)
“keyword” (an exhaustive list needs to be developed for
essay production)
kwo= “quote from elsewhere in this document”
ref= “reference” (missing reference)
ph= (a borrowed phrase, whose original context requires background)
per= (from personal conversation or correspondence)
prn= “principle” (important principle)
pru= “proof” (a proof is being demonstrated)
“vocabulary” (a term whose meaning is less than certain or
contextually clear)
coi= “coining” (the coining of a new phrase with future illustrative use)
web= “web address”
wik= “wikipedia citation or reference”
rsc+ “research” (any “proword” or phrase that seems to require further
research)
scribd= “desirable search term or phrase”
<phrase or something> (Google search is indicated)
Month yyyy (month and year a passage was added, e.g., “June 2011”)
He was someone who had somehow long ago chosen me as the backstop
for the projection of his many unfulfilled dreams - a situation which I
sometimes find to be more than a little disconcerting. I can assure you
that I am quite miserable enough possessing but a single, functioning
conscience...I cannot brook carrying two of those around!
Thanks, Dr. Sarfatti. I am always amazed at all of the disinformation
and unfounded speculation that exists about you on the Internet. All of
these phony Johnny-come-latelies who've never done any real research,
theoretical or experimental, who want to cast aspersions. You're one of
the few real physicists out there. The fact that you care enough to try to
enlighten the rest of us rather than just staying within the echo chamber
of academia has confused more than a few people, apparently. Another
thing: some of your books are on, but I'd like to know how
to get access to some of your earlier publications. If only there were a
one-stop shopping location for your papers! : )


 February 2005 Re: Puthoff's N.E.W. theory

 Jack Sarfatti
Message 1 of 1 , Feb 20, 2005
On Feb 20, 2005, at 8:05 AM, Russell S Clark wrote:

> I like Volovik's approach to Sakharov's idea of induced gravity as a
> solution to the cosmological constant problem.
I have my own version of this. It's different from Volovik's, but has
some overlaps.
> To be honest, I felt the same way and was even kind of excited and
> intrigued when I first encountered the Puthoff, Haisch, Rueda
> theory. But then I slowly realized that quantum statistics is a
> much broader basis for an "already unified" induced gravity
> theory than is Puthoff's Zitterbewegung-EMQG induced gravity. The

> principles of quantum statistics would likely be easily accommodated and
> retrofitted by future theories, e.g., supersymmetry, M-theory,
> etc. This appeared unlikely for a theory of gravity and inertia
> based upon so narrow an abstraction as electromagnetism.
I don't understand your point. I think the Puthoff, Haisch, Rueda
theory is wrong because it is too simplistic. For one thing it only
deals with the transverse polarized virtual photons. It ignores
longitudinal virtual photons and virtual electron-positron pairs. It
has no "Higgs Ocean" coherence and, therefore, no possibility of
explaining the emergence of Einstein's gravity. Haisch simply assumes
Einstein's GR, I actually derive it FROM the cohering of the random
ZPF. Puthoff & Haisch do not even understand ODLRO AKA coherence.
None of the relevant math is in their papers. The effect they talk about is
small and has already been considered in a paper in Rev Mod Phys on
"quantum friction".
> Once I realized that quantum statistics embraces all of the
> interactions of fermionic and bosonic particles whereas
> electromagnetism only certain fermionic particles, e.g., not neutrinos
> and only one bosonic particle, i.e., the photon, it became difficult
> for me to take Puthoff's theory seriously anymore. A kind of
> philosophical disdain for Puthoff's program crystallized - and this
> despite the fact that I could never have come up with the admittedly
> clever mathematical arguments Puthoff submitted in support of his
> theory.
> On a less dour note, in Volovik's recently published book, The
> Universe in a Helium Droplet, Volovik gives theoretical grounds for
> why the cosmological constant and the average matter density of the
> Universe should be approximately equal throughout much of the course
> of cosmological expansion.

Yes, so do I. Puthoff does not even recognize the depth of the problem
and hand waves it away.
> Matter and vacuum perturb each other's spacetime symmetry to induce
> gravitational field because (as Feynman pointed out) the Pauli
> Exclusion and what might be called the "Pauli Inclusion" principles
> apply equally to real as well as to virtual fermions and bosons.
> Matter and vacuum mutually interact as traces of a common destiny,
> uh, density matrix. :)
No, that is meaningless. The precise equations for this are
Bu = bu^aPa/h
{Pa} = Lie Algebra of T4
Bu = (Goldstone Phase),u from partial cohering of random ZPF in
inflationary vacuum phase transition
Einstein-Cartan tetrad = eu^a = &u^a + bu^a
& is Kronecker Delta
guv(curved) = (&u^a + bu^a)(Flat)ab(&v^b + bv^b)
Gauge transforms on Bu -> GCT Diff(4) tensor transforms of local
- missing in Puthoff's other "PV" theory.
> BTW, is the notion of an electron-positron spin-0 field interpreted as
> the timelike component . . . and the photon as the spacelike
> (of an appropriately defined four vector) a "not even wrong" notion?

> I was thinking in analogy with the electric charge (a scalar) being
> represented by the timelike component and the charge currents by the
> spacelike 3-vector components of the "spacetime-like" current density
> four vector in Maxwell's equations.
Until you write the equations I don't really know what you mean. :-)
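For reference, Sarfatti's ASCII equations in the reply above can be transcribed into standard notation. This is my transcription, on the stated assumptions that & denotes the Kronecker delta, (Flat)ab the Minkowski metric η_ab, and u, v spacetime indices μ, ν:

```latex
B_\mu = b_\mu{}^a P_a/\hbar, \qquad \{P_a\} = \text{Lie algebra of } T_4,
e_\mu{}^a = \delta_\mu{}^a + b_\mu{}^a \quad \text{(Einstein--Cartan tetrad)},
g_{\mu\nu} = \bigl(\delta_\mu{}^a + b_\mu{}^a\bigr)\,\eta_{ab}\,\bigl(\delta_\nu{}^b + b_\nu{}^b\bigr).
```

That is, the curved metric is the flat metric dressed by the tetrad perturbation b, which Sarfatti attributes to the Goldstone phase of the partially cohered ZPF.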


Anomalous quantum phenomena exhibit the mutual interference of
heretofore thought distinct bootstrap mechanisms, that of the mental and
that of the physical.
He was just repeating funny voices and as we know the computer
programs which devise these various humorous comments and sayings
do not actually possess the power of human consciousness. Since he
was only absentmindedly mimicking or repeating under his breath what
he heard the funny voices saying I would say that these charges of
sexism are wholly unfounded.
She plays like a Roman with her eyes on fire. He plays like a Roman
with his eyes on fire. These two different pronouns do not seem to
function perfectly analogously in the above exemplar sentences, and we
might wonder why this is the case. “Her” refers to a particular person,
while “his” refers to any one of a number of hypothetical Romans.
Niels Bohr the Alice
The akashic record possesses all information past, present, and future;
however, the information is not necessarily structured temporally as we
understand time.
Akashic paradigm.
The moral dilemma question is: would you intentionally push a person

off a cliff to his or her death in order to save some great mass of
strangers say from a nuclear detonation? There is a presumptive
metaphysical ethical fallacy lying at the root of such a hypothetical
ethical dilemma. The metaphysical presumption is that individual
human experience fits rationally into some collective matrix through
which and by which human suffering at the level of the individual can be
massed together and accumulated into some very much greater
transpersonal or trans-human suffering, say on the part of a deity observing
and experiencing the suffering of and through those many individuals.
The paradox is how an atheist who considers himself moral and ethical
might answer this dilemma vs. how a Christian theist who also considers
himself moral and ethical might answer this same dilemma.
Human suffering doesn't add in any absolute or objective sense, and the
moral intuition that it *does add* is informed by a kind of
'cryptotheism', one shared by theists and atheists alike! : )

Similar to the chronology protection mechanism, there is another
mechanism which prevents individual human consciousnesses from
gaining access to the control interface of the karmic field.
Darwinian evolution theory does not propose a model or theory of the
origin of life. So Darwin's theory can be likened to the notion of a
bridge that is anchored in the middle of a chasm on one side and to the
edge of a steep cliff on the other. One cannot begin to bridge a chasm
from its midpoint without a sky crane (or otherwise suspending the law
of gravity). Whatever is holding up the bridge in the middle is still
secretly present in order to prevent its collapse: our “sky crane” of
course is chemical evolution and the self-organizing properties of atoms
and molecules, which drove evolution for its first billion years before
the advent of a unit of heredity, e.g., primitive DNA or RNA, which is to
say, before mutations began to be shaped by natural selection. What
were “mutations” or, "spontaneous changes to the phenome” guided by

during those first billion years of chemical evolution? Again, by these
self-organizing properties of atoms and molecules. Is there any reason to
suppose that these “self-organizing properties” ceased to operate after
the advent of a unit of heredity, i.e., DNA, at which time natural
selection commenced its operation? Conclusion: mutations of DNA are
not random, but were shaped and continue to be shaped by the
self-organizing properties of atoms and molecules. These principles of
chemical self-organization did not evolve, but are owing to the initial
conditions of the universe, which were determined during the initial
infinitesimal fractions of a second (I say "fractions" instead of the
singular, "fraction" because time likely possesses more than one
dimension) during the Universe's Big Bang when the fundamental
physical constants, e.g., Planck's constant, Bohr magneton, mass and
charge of the electron, speed of light, electric field permittivity,
magnetic field permeability, Avogadro's number, Boltzmann's constant,
Pauli Exclusion Principle, gravitational constant, etc. were determined.
Of course, the simultaneous and more or less instantaneous fixing of all
of these physical constants to 12 decimal places or more (a notable
exception here is the gravitational constant, 4-5 decimal places)
represents an extremely large quantity of information and cosmologists
and astrophysicists now recognize that the Universe began in a state of
extraordinarily low entropy. This low entropy state is now believed by
the experts to be the result of an entropy fluctuation within a much
larger system, e.g., a “Multiverse”. My response to this is: "You don't
say?" : )
Combine the anthropic cosmological principle with the etiology paradox
and what do you come up with?
December 2013

It is more than a bit ironic that the only real empirical proof
for the theory of evolution is the design feature of micro-evolutionary
adaptability of each species, which makes it robust against fluctuating
environmental conditions. Of course, the theory-laden evidence for
Darwinian evolution is indeed in plentiful supply.

He was in part, she was in part, he was not in part, they were playing
their parts, she got the part. There should be a new verb tense for
variably repeating time, e.g., for repetition of a particular scene or act of
a play in which actors are participating. Would being reunited with all of
one’s lost loved ones from one’s earthly life in any way constitute a
disproof of solipsism? Indeed not. If anything it would be strong
evidence in favor of the hypothesis that the world is one’s oyster, as it
were.
October 2013


Stalin's speeches published in Problems of Leninism
October 2013

Psychiatrist Brian Weiss mentioned the possibility that persons
could be reincarnated from not only other galaxies but also other
universes and other dimensions.
Transcendental otherness and the plurality of consciousness is related to
the principle of hard encryption, and is also connected to the spontaneous
decoherence of wave functions, to inertia and gravitation, and to the
distinction between subjectivity and physical reality, interiority and
exteriority. If the outside is relative to the inside, but there is a plurality
of interiors or inner domains, then how do we come up with a common
public space called the outside, that is to say, the outside for all
concerned, for everyone?
Is it truly correct to think of the miraculous as that which occurs by
virtue of a mechanism that is merely currently unknown to us? I would
have to say this is incorrect, since the advancement of science and
technology reveals ever new mechanisms which are realized to be just as
scientific as other older and well established mechanisms were to an
earlier stage of the advancement of science. Consider here the fact that a
wavefunction collapse occurs by virtue of no mechanism at all.
Cosmologists likely fail to understand the disturbing metaphysical and
spiritual implications of their latest cosmological theories. When the
number of bubbling universes greatly exceeds the number of possible
distinct human beings and one considers this in light of the anthropic
cosmological principle....the disturbing spiritual implication here is
I dreamed the 'you can't say that in GERMAN paradox. '
Without the success of the argument for God from analogy, that is, of a
mind that is both transcendent and universal, the argument from analogy
for other minds fails.
Scaffolding vs. whole fluctuations vs. specified complexity vs. intelligent design.
Where knowledge and reality intersect is within the mind of God.
Science is not the edifice of truth; science is the scaffolding of truth.
No one knows where this place is. This is but one of the many logical
conclusions which can be drawn from the fact that science does not
progress in linear fashion, but a certain amount of destruction of prior
established scientific truth goes hand in hand with the advancement of
science. Just consider how 30th Century scientists shall view 20th
century science and in turn how 40th Century scientists shall view 30th
Century science, and it becomes clear that there is no future stable
plateau from which contemporary science can confidently assess
previous eras in the advancement of scientific truth.
If Bohm’s principle applies universally, then the world is constituted as a
hologram. Decoherence is induced by complexity that outstrips Bohm’s
causal principle, meaning energy acquires bulk as higher dimensional
Of course there is no real physical meaning attributable to the phrase,
'photon bouncing off of an electron' in the absence of an observer
properly equipped to perform a position and momentum measurement
on the impact site for the two particles which are also waves.
In the absence of an observer the quantum vacuum serves as the default
ground of quantum entanglement.
Can a grounding or substantive quantum entanglement be transferred
from the quantum vacuum to a quantum observer and back all the while
conserving entanglement? Is entanglement conservation violated by this?
The observation that ambient photons within the laboratory should not
be able to trigger the collapse of a superposition state certainly applies to
the case of the two-slit experiment performed using buckyballs.
Nevertheless, an observer can use photon radiation of precisely the same
momentum, energy, and polarization in order to observe which slit the
buckyball went through, and this does succeed in collapsing the
interference pattern. This is true even though the very same types of
photons existed amongst the ambient radiation within the laboratory
when the observer was not looking at the slits to see which slit each
buckyball went through, and this ambient photon radiation in the
laboratory does not succeed in causing the interference pattern to
collapse. Apparently the observer makes the difference, not the physical
interaction of photons with the buckyballs, say through collisions.
In other words, if the observer is not watching which slit the buckyball
goes through and just looks at the phosphorescent screen for the
presence or absence of an interference pattern, then he cannot rely on the
presence of ambient photon radiation, of identical character to that which
would have been required to observe which slit each buckyball goes
through, to produce collapse of the interference pattern with the
exhibition of particle-like behavior on the part of the buckyballs. So it
does not appear that a physical interaction is responsible for collapse of
the interference pattern, but rather the presence of the observer looking at
the slits.
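The contrast at issue here, between summing amplitudes (no which-path information) and summing probabilities (path marked), can be sketched numerically. The following is a one-dimensional toy model with made-up geometry (the slit separation d and wavenumber k are arbitrary illustrative values of my own choosing), not a simulation of the actual buckyball experiment:

```python
import numpy as np

# Toy two-slit model: each slit contributes a unit-amplitude wave at
# screen position x. With no which-path information the amplitudes add
# before squaring; with the path marked, the probabilities add instead.

def intensities(x, d=1.0, k=20.0):
    phi1 = np.exp(1j * k * np.sqrt(1 + (x - d / 2) ** 2))
    phi2 = np.exp(1j * k * np.sqrt(1 + (x + d / 2) ** 2))
    coherent = np.abs(phi1 + phi2) ** 2                  # fringes
    incoherent = np.abs(phi1) ** 2 + np.abs(phi2) ** 2   # no fringes
    return coherent, incoherent

def visibility(intensity):
    # Standard fringe visibility: (Imax - Imin) / (Imax + Imin).
    return (intensity.max() - intensity.min()) / (intensity.max() + intensity.min())

x = np.linspace(-2, 2, 2001)
coh, inc = intensities(x)
print(round(visibility(coh), 2))  # strong fringes (visibility near 1)
print(round(visibility(inc), 2))  # fringes washed out (visibility near 0)
```

The coherent sum shows near-unit fringe visibility; the incoherent, path-marked sum is flat, which is the "collapsed," particle-like pattern.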
Nature wanted to be Newtonian but didn't think the problem completely through.
The acausal behavior of perceived physical reality is perhaps as much
owing to the brain's overabundant complexity which outstrips the
capabilities of nature to anticipate and create representations of equal
complexity to those of which the brain is capable. This leads to a
consistent underdetermination of representations causally speaking.
All this time we've been looking at the quantum decoherence and
quantum measurement problems from the standpoint of energy,
momentum and momentum-energy exchanges between the observing
system and the observed system, but the decoherence problem can be
handled in the context of the hot, wet brain and its microtubules and
their coherent tubulin dimer energy states by looking at the problem
from the standpoint of information rather than energy, that is, if what is
happening inside the brain's microtubule network outstrips in terms of
complexity or information density, the quantum computing capacity of
the cosmic CPU or quantum vacuum by which each future time step in
the evolution of Schrodinger's wave equation is computed, then we may
have alternate conditions for quantum coherence, which do not depend

on such things as temperature.
The inversion relations for electron mobility vis-a-vis the two slit
experiment... this relation may actually be applicable to solving the
quantum decoherence problem for a wet hot brain specifically where the
coherent quantum states of the tubulin dimers of the brain's
microtubules are concerned.
I see no problem with utilizing a consistent empirical relationship
between two observables in order to establish a new axiom.
These two observables, between which our empirical relation exists,
namely that of the inversion relation noted earlier, are consciousness and
electron mobility vis-a-vis the two-slit experiment and the quantum
observer. A problem to be further investigated here should be:
consciousness is always consciousness of the individual, not an
intersubjective quantity such as a quantum observable.
This inversion relation that I have been talking about is simply this:
when consciousness is present at the two slit apparatus the electron wave
function collapses so that we get particle behavior. On the other hand,
during general anesthesia when electron mobility is induced to switch
from wavelike to particle- like by the presence of the anesthetic gas,
there is a loss of consciousness. “Another model can help explain long
range cooperativity in biomolecules. Soviet biophysicist A. S.
Davydov has considered almost lossless energy transfer in biomolecular
chains or lattices as wave-like propagations of coupled conformational
and electronic disturbances: "solitons." Davydov used the
soliton concept to explain molecular level events in muscle contraction,
however solitons in the cytoskeleton may do what electrons do in
“The loss of electron mobility by interaction with anesthetics leads to the
loss of protein self or at least the ability to protein to control its motions

by quantum jumps”, cf. “BIO-SYSTEMS AS SELF-ORGANIZING
QUANTUM SYSTEMS” by Matti Pitkänen.
September 2014

“5.2 A possible vision about quantum brain

TGD framework encourages strongly to give up the cherished belief about brain as a
seat of consciousness. The following working vision seems to be plausible at least to
me just now.
1. Brain and body as sensory organs of electromagnetic selves
In TGD framework life is self-organization phenomenon involving in essential
manner Earth's magnetic field serving as template for the condensation of biomatter.
In TGD universe ourselves involve in essential manner electromagnetic field
structures (topological field quanta) having size measured using Earth size as a unit.
Our physical bodies can be seen as kind of sensory and motor organs of these
electromagnetic selves. In particular, physical death can be seen only as a death of a
mental image about physical body. These higher level selves are multibrained
organisms analogous to multicellulars and use our brains (in particular during sleep)
for their own purposes”, cf.

If intelligent design is a false hypothesis, then when we examine the
informational structures of life such as RNA and DNA, and complex
macromolecules such as enzymes and proteins which modulate the
expression of RNA and DNA, we should find that there is only a single
level of description at work; in other words, there should be an absence
of meta-levels of description. Such informational mechanisms as error
correction codes, operating systems, compilers, and hyperlinking
(computational nonlocality, if you will) are all examples of meta-levels
at work within a text or within a computer program, bespeaking a
language and programming infrastructure that possesses too much
specified complexity to have arisen from passively filtered entropy
fluctuations.
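The passage above leans on error correction codes as a paradigm case of a meta-level mechanism. A minimal concrete instance is the classic Hamming(7,4) code, sketched below in Python purely as an illustration of the concept (it makes no claim about biological error correction): three parity bits form a second, descriptive layer over four data bits, and that meta-layer lets a single flipped bit be located and repaired.

```python
# Hamming(7,4): encode 4 data bits with 3 parity bits; any single-bit
# error in the 7-bit codeword can be located and corrected.

def hamming74_encode(d):                 # d = [d1, d2, d3, d4]
    p1 = d[0] ^ d[1] ^ d[3]              # parity over positions 1,3,5,7
    p2 = d[0] ^ d[2] ^ d[3]              # parity over positions 2,3,6,7
    p3 = d[1] ^ d[2] ^ d[3]              # parity over positions 4,5,6,7
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]   # positions 1..7

def hamming74_decode(c):
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3      # 1-based position of a flipped bit
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1             # correct it
    return [c[2], c[4], c[5], c[6]]      # recover the data bits

data = [1, 0, 1, 1]
code = hamming74_encode(data)
code[4] ^= 1                             # corrupt one bit in transit
print(hamming74_decode(code) == data)    # True: the meta-level recovers the message
```

The parity bits say nothing about the world; they describe the data bits. That self-referential, second-order character is exactly what is meant above by a "meta-level of description."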
Would solipsistic Boltzmann brains solve the Fermi paradox, and what is
the strange logic that underlies such a proposal? Microcosm exhibiting
Descent with modification leads to a wider and wider divergence of
forms such that any given form tends in later epochs to find itself much
more alone and unique within the state space of possible peers. By the
time such a late stage in evolution has been reached such that individual
consciousness becomes possible and emerges by this time the
divergence has grown so wide that it is unlikely that any given
individual consciousness should find any peers coexisting within the
reach of communication unless a kind of Nexus be provided. This is
somewhat the obverse of the argument that the self being a
sociolinguistic construct cannot emerge in a solitary state but must be
always and everywhere surrounded by peers with which it has
communicated and with which it can communicate.
But the
phenomenon of miscommunication reveals to us that it is not the input
of actual information which leads to the emergence of the individual
consciousness as sociolinguistic construct, but merely the input of data
impulses which are uninterpreted unless a paradigm or theory is in place
with which to interpret them.

Earth's civilization must then be secretly extremely old, not only in the
galaxy but perhaps within the universe at large, such that the average
distance of peer extraterrestrial civilizations, namely those of comparable
or greater technological development, may well be too great to be
We should now investigate what might well be called the Fermi
metaparadox within the discipline of the philosophy of mind and the
problem of other minds.
The dissolution of a paradox by way of the advent of a new paradigm
usually signals the appearance of a metaparadox.
Can long lived metastable states of organic molecules be passed down
from one generation to the next, enabling a greater scope of action for
quantum entanglement?
The process of biological evolution is not one hundred percent onward
and upward. Just consider the case, first considered by Darwin himself, of
placing newer or modified species in competition with their earlier
forebears in the current environment of the modified species, and how the
newer or modified species would emerge victorious, rendering extinct
the older forms from which they are derived. Now, having said this,
consider somewhat the reverse case, where the newer or modified forms
of the older species are placed in the older species' contemporary
environment and allowed to compete with them there; here the advantage
of the newer forms over the older ancestral forms would not be so clear,
because of considerations of coevolution and ecology.
That an individual consciousness' apparent peers are intelligent is more a
testament to the ultra high fine tuning of that individual mind's
consciousness than it is to the fine tuning of the brains of those peers.
Note here that we are relating the fine tuning of consciousness with fine
tuning of brains.

The existence of a specious present is inconsistent with the principle of
time scale reductionism. Moreover, time scale reductionism is
incompatible with multi-dimensional time.
Young souls have many peers old souls are relatively peerless.
Of course this principle would not be expected to hold within a universe
that is infinitely old.
This principle would only be expected to hold in a universe which had a
beginning.
Only truly old souls would be expected to possess consciousness,
because of the radical context dependency of consciousness. What
provides this context? Myriad previous lifetimes, and the sensory
perception inputs and quantum entanglement generated therefrom, along
with proto-conscious thought processes laid down in the quantum
vacuum, providing ever richer contexts from which future incarnations
may draw; and in so doing the context for future incarnations becomes
ever richer, so that greatly more aged souls appearing in those later
incarnations shall have the necessary context in order to experience
Hypothesized entities, substances, and principles: this is the mainstay of
the 19th-century amateur natural philosopher such as Darwin.
I found Robert Zubrin's statement very intriguing, namely, that since the
simplest creatures on the planet are bacteria, and bacteria are extremely
advanced in terms of their cellular machinery and the language of their
genetic code, which is to say DNA, to assume that bacteria are at the
beginning of the evolutionary process is like assuming that the iPhone is
at the beginning of the technological evolutionary process.
It was upon these considerations that Robert Zubrin considered it likely

that life originated elsewhere perhaps on Mars or perhaps another star
system which passed through our solar system's Oort cloud, causing
comets from its corresponding Oort-like cloud to seed the Earth's inner
solar system with organic material.
Transgressing the boundaries in one's written communications means
busting up pre-registered complexes of behavioral epigenetics of one's
naive, untutored reader.
Is it possible to distinguish these two cases: differentiation of selves
versus differentiation of the otherness of selves?
The concept of consciousness is grounded in the otherness of the self not
in the self as one among many instantiations of consciousness per se or
as such.
Given the level of consciousness attainable by ordinary human beings on
planet Earth, it is puzzling how long our life spans can be, upwards of 100
years, and not necessarily accompanied by serious mental decline or
diminution of self-awareness; puzzling when one considers how utterly
small is the sociolinguistic and cultural milieu of planet Earth for such
PETA disclaimer: no Christians were offended during the posting of this
When Jesus opened St Paul's eyes, he opened them too wide.
From Sartre's Being and Nothingness: cit=“If we attempt somehow
regarding the Other what Descartes attempted to do for God with that
extraordinary “proof by the idea of perfection” which is wholly
animated by the intuition of transcendence, then for our apprehension of

the Other qua Other we are compelled to reject a certain type of negation
which we have called an external negation. The Other must appear to the
cogito as not being me. This negation can be conceived in two ways:
either it is a pure, external negation, and it will separate the Other from
myself as one substance from another substance – and in this case all
apprehension of the Other is by definition impossible; or else it will be
an internal negation, which means a synthetic, active connection of the
two terms, each one of which constitutes itself by denying that it is the
other. This negative relation will therefore be reciprocal and will possess
a twofold interiority: This means first that the multiplicity of “Others”
will not be a collection but a totality (in this sense we admit that Hegel is
right) since each Other finds his being in the Other. It also means that
this Totality is such that it is on principle impossible for us to adopt “the
point of view of the whole.” In fact we have seen that no abstract
concept of consciousness can result from the comparison of my
being-for-myself with my object-state for the Other. Furthermore this totality –
like that of the For-itself – is a detotalized totality; for since
existence-for-others is a radical refusal of the Other, no totalitarian and
unifying synthesis of “Others” is possible. It is in the light of these few
observations that we in turn shall now attack the question of the . . .”
Consciousness, my consciousness even, is more general than the most
general medium of my own experience; this also points up the
transcendental nature of consciousness and the concept of
consciousness. Does the notion of consciousness as “con-sciousness”
point, etymologically speaking, to the socially context-based nature of

Consciousness as such or in general transcends my individual
consciousness in much the same way that objectivity transcends
Penrose's one graviton Planck mass energy limit can be recast in terms of
quantum information theory, which is to say in terms of the abstract

relationships such as combinations and permutations of the various
relationships between the physical components of a quantum system
which can outstrip in complexity the computing capacity of the
underlying quantum field. This is because the quantum field, whose
entanglements constitute at the very least the causal relationships by
which the quantum vacuum computes each succeeding state of the
Schrodinger wave equation for the system, *is physical* and increases in
mass according to the cube of the system radius, while the abstract
description of the system, i.e., its wavefunction, increases exponentially
with the radius. Spontaneous decoherence is a function of how the
abstract overwhelms the purely physical that is responsible for
computing all future states of Psi.
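The cube-versus-exponential comparison in this passage is easy to make concrete. The toy arithmetic below is purely illustrative (my own stand-in numbers, not a physical calculation): n plays the role of the number of two-state components of the system, n**3 stands in for the cubic "physical" growth the passage posits, and 2**n counts the amplitudes in the abstract quantum description.

```python
# Illustrative arithmetic: where does the exponential description
# first overtake cubic growth?

def hilbert_dim(n):
    # Number of amplitudes needed to describe n two-state components.
    return 2 ** n

def crossover(poly=lambda n: n ** 3):
    # First n >= 2 at which 2**n exceeds n**3.
    n = 2
    while hilbert_dim(n) <= poly(n):
        n += 1
    return n

print(crossover())  # prints 10: from ten components on, 2**n > n**3 for good
```

The point is only that the exponential side wins almost immediately and then runs away, which is the shape of the argument for spontaneous decoherence above.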
September 2013

Consider causal relationships with which the universe or
nature has no “experience”.
Certain informal fallacies of reasoning had survival value because they
more readily enabled the would-be demagogue hunter-gatherer to more
quickly mobilize the tribe or clan into action against a threat to the
group's survival posed by a neighboring tribe.
The human brain requires sleep for eight hours once every 24 hours in
order to rest up after enforcing the strictures of causality and logic for
the previous 16 hours running.
Instead of following established procedures, you know somebody who
knows somebody who is at the control panel and in a position to turn
knobs at will. You don't necessarily have to arrive at a given board
position in the game of chess by playing step by step in conformance
with the rules of chess, but are allowed the option of placing pieces
directly onto the board in such a manner that an otherwise impossible
mating position is effected.
This points up the relationship between free will and self-consciousness,
particularly in connection with Kurt Gödel’s Incompleteness Theorems.
Consciousness has a knowledge of the system by which its will is
implemented in the world, its limitations as well as its proactivities.
The logic of coherent tiny subdomains of genetic base-pair sequences as
yet untried within a virtually infinite state space of possible genetic base-pair
sequences. How is optimization in an evolutionary sense possible
within such a virtually infinite state space? That is, given the large
negentropy barriers that separate different degrees of fitness on the
rugged fitness landscape. Is the ground of emergence transcendent and
necessarily so? Can teleology in evolutionary biology be an emergent
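The "virtually infinite" size of this state space is easy to quantify (a back-of-the-envelope sketch; the length of 300 base pairs and the ~1e80 atom count are merely illustrative figures):

```python
# Size of the genetic sequence space: 4 possible bases per position.
def sequence_space(length_bp: int) -> int:
    return 4 ** length_bp

# Even a short 300-base-pair stretch admits more variants than the
# commonly cited estimate of atoms in the observable universe (~1e80).
print(sequence_space(300) > 10 ** 80)   # prints True
```

Exhaustive search of such a space is out of the question, which is what gives the negentropy-barrier worry its bite.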
The soul is the basis but not the ground of spiritual development.
Equivocation of sense is the basis of many logical fallacies but it is also
the basis of the intuition which permits thinkers to sneak outside of a
given system or paradigm.
October 2013

An interesting example of equivocation of sense is the obvious
ambiguity of the phrase, “philosophical zombies”, that is, between its
technical philosophy-of-mind acceptation and one of its more “ordinary
language” or literal interpretations. For example, do philosophical
zombies in sense A tend towards philosophical zombiehood in sense B,
and so on?
Many seeming paradoxes are born of our inability to adjust the sense
and scope of the technical terms with which we treat a problem's analysis.
Paradoxically, the delayed choice experiment was devised both to
demonstrate nonlocality and to show that quantum entanglement of spins
is a genuine physical effect.
Quantum nonlocality and David Bohm's causality principle suggest that
the world is indeed a type of computer simulation; the quantum vacuum
serves here as the cosmic quantum CPU.

The fact that quantum non-local information cannot be transmitted faster
than light is an indication that quantum nonlocally connected or encoded
information may form the basis, at least in part, of the integral unity of the
individual consciousness, information which is hard quantum encrypted.
The principle of quantum nonlocality teaches us that what we call
substance is an emergent phenomenon or property which arises above
the Planck limit.
“And this would be manifestly favorable to natural selection by
affording a better chance of the occurrence of profitable variations.
Unless such occur, natural selection can do nothing”, cf. Darwin’s
Origin of Species.
Darwin's remarks concerning changing conditions increasing variation
from his seminal work, TOOS: 'and this would be manifestly favorable
to natural selection by affording a better chance of the occurrence of
profitable variations. Unless such occur, natural selection can do
nothing.' Note that profitable changes, triggered or enabled by
environmental change in a random fashion, nevertheless rely on
coherent and coordinated reactions at the genetic level.
Explain the time-bubble analogy of the specious present as a means of
understanding the notion of two-dimensional time.
Reinvestigate conservation of four-dimensional angular momentum in
the light of subatomic particle collisions' non-conservation of three-dimensional angular momentum.
Reexamine the interpretation of quantum intrinsic spin as angular
momentum about the time axis, especially while considering the concept
of quantum entanglement and instantaneous Lorentz frames of reference.
Turning inside out in three dimensions as a rotation in four-dimensional
space vis-a-vis enantiomer molecules. In turn relate this to the
relationship between consciousness and the particle-wave duality of electron
mobility in quantum microtubules that is pointed up by the classic two-slit
experiment and the Meyer-Overton law of anesthetic action upon
quantum microtubule electron mobility.
The diamond-in-the-rough model (DRM) of intersubjective communication
applied to the infant's development of a hypothesis of speech sound.
The DRM demonstrates that information was never transmitted to
the brain or mind of the infant and that all of the meanings and coherent
structures of perception and cognition which the child develops later on
are wholly the product of transformations, reactions, processing and
reprocessing of data into information which took place within the brain
of the child.
Derivative time versus integral time.
When you observe two people communicating at cross purposes,
perhaps secretly in agreement with each other from the point of view
of a third party, or a comedy of errors of misunderstanding playing
out, one is reminded that no information is put into anyone's head
until they already understand the signifiers.
Theistic evolution involves supernatural quality control and quality
assurance applied to the process of natural selection by a super-intelligent
designer-engineer, such that Richard Dawkins' Mount
Improbable is always the tallest Mount Improbable and never the
smallest Mount Improbable.
Genetic drift is not enough to provide natural selection with the
variability that it needs to select beneficial mutations from; there must
be, in addition to genetic drift, a gradient which drives the forward
advance of greater and greater complexity. With the advent of the first
unit of heredity, this gradient would have been the gradient that drove the
billion years of chemical evolution leading from carbon, hydrogen,
oxygen and nitrogen to the very first primitive RNA or DNA molecule.
An important question is whether or not, with the advent of the first
replicating information-bearing molecules, primitive RNA or primitive
DNA, feedback occurred between the gradient which
drove chemical evolution and these first information-bearing molecules,
changing that gradient and perhaps steepening it. A dynamic interaction
between mind at large and the wavefunction of organic molecules seems
to be required here.
Shouldn't there be some limitation on the complexity of unitary
wavefunctions, which is to say a limit on how complex a wavefunction
can be without either being a superposition or devolving into a density
matrix representation of a statistical mixture? A statistical mixture of
wavefunctions does not have a corresponding quantum observable. So the
limitation on wavefunction complexity is twofold: on the one hand there
is the wavefunction which is too complex to be observed in the sense of
having a corresponding observable; on the other hand there is the wavefunction
which represents a system that cannot be computed by the cosmic CPU or
quantum vacuum because it outstrips the representational capabilities of
that quantum vacuum.
The dynamics of chemical evolution form the subconscious for the
process of biological evolution and are responsible for all of its
fortuitous insights into more perfect adaptation and coadaptation.
Of course there is no real physical meaning attributable to the phrase,
'photon bouncing off of an electron' in the absence of an observer
properly equipped to perform a position and momentum measurement
on the impact site for the two particles which are also waves.
In the absence of an observer the quantum vacuum serves as the default
ground of quantum entanglement.

Can a grounding or substantive quantum entanglement be transferred
from the quantum vacuum to a quantum observer and back all the while
conserving entanglement? Is entanglement conservation violated by this
The observation that ambient photons within the laboratory should not
be able to trigger the collapse of a superposition state certainly applies to
the case of the two-slit experiment performed using buckyballs.
Nevertheless, an observer can use photon radiation of precisely the same
momentum, energy and polarization in order to observe which slit the
buckyball went through, which does succeed in collapsing the
interference pattern. This is true even though the very same types of
photons existed amongst the ambient radiation within the laboratory
when the observer was not looking at the slits to see which slit each
buckyball went through, and this ambient photon radiation in the
laboratory does not succeed in causing the interference pattern to
collapse. Apparently the observer makes the difference, not the physical
interaction of photons with the buckyballs, say through collisions.
In other words, if the observer is not watching which slit the buckyball
goes through and just looks at the phosphorescent screen for the
presence or absence of an interference pattern, then he cannot rely on the
presence of ambient photon radiation, of identical character to that which
would have been required to observe which slit the buckyball goes
through, to produce collapse of the interference pattern with the
exhibition of particle-like behavior on the part of the buckyballs. So it
does not appear that a physical interaction is responsible for collapse of
the interference pattern, but rather the presence of the observer looking at
the slits.
Nature wanted to be Newtonian but didn't think the problem completely through.
The acausal behavior of perceived physical reality is perhaps as much
owing to the brain's overabundant complexity, which outstrips the
capability of nature to anticipate and create representations of equal
complexity to those of which the brain is capable. This leads to a
consistent underdetermination of representations, causally speaking.
All this time we've been looking at the quantum decoherence and
quantum measurement problems from the standpoint of energy,
momentum and momentum-energy exchanges between the observing
system and the observed system, but the decoherence problem can be
handled in the context of the hot, wet brain and its microtubules and
their coherent tubulin dimer energy states by looking at the problem
from the standpoint of information rather than energy, that is, if what is
happening inside the brain's microtubule network outstrips in terms of
complexity or information density, the quantum computing capacity of
the cosmic CPU or quantum vacuum by which each future time step in
the evolution of Schrödinger's wave equation is computed, then we may
have alternate conditions for quantum coherence, which do not depend
on such things as temperature.
The inversion relation for electron mobility vis-a-vis the two-slit
experiment ... this relation may actually be applicable to solving the
quantum decoherence problem for a wet, hot brain, specifically where the
coherent quantum states of the tubulin dimers of the brain's
microtubules are concerned.
I see no problem with utilizing a consistent empirical relationship
between two observables in order to establish a new axiom.
The two observables between which our empirical relation exists,
namely the inversion relation noted earlier, are consciousness
and electron mobility vis-a-vis the two-slit experiment and the quantum
observer. A problem to be further investigated here: consciousness
is always the consciousness of an individual, not an intersubjective quantity
such as a quantum observable.
This inversion relation that I have been talking about is simply this:
when consciousness is present at the two-slit apparatus, the electron
wavefunction collapses so that we get particle behavior. On the other hand,
during general anesthesia, when electron mobility is induced to switch
from wavelike to particle-like by the presence of the anesthetic gas,
there is a loss of consciousness.
If intelligent design is a false hypothesis, then when we examine the
informational structures of life such as RNA and DNA and complex
macromolecules such as enzymes and proteins which modulate the
expression of RNA and DNA, we should find that there is only a single
level of description at work; in other words, there should be an absence
of meta-levels of description. Such informational mechanisms as error
correction codes, operating systems, compilers and hyperlinking
(computational nonlocality, if you will) are all examples of meta-levels
at work within a text or within a computer program, bespeaking a
language and programming infrastructure that
possesses too much specified complexity to have arisen from passively
filtered entropy fluctuations.
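Error correction codes, named above as a meta-level, can be illustrated with the textbook Hamming(7,4) code (an illustrative sketch only, not tied to any biological mechanism): the parity bits carry no message content of their own but describe relationships among the data bits, and that meta-description suffices to locate and repair any single flipped bit.

```python
# Hamming(7,4): 4 data bits protected by 3 parity bits; any single
# flipped bit can be located and corrected. The parity bits are
# meta-information about the data rather than additional data.

def encode(d):                           # d = [d1, d2, d3, d4]
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]   # positions 1..7

def decode(c):
    # Each check covers the positions whose binary index has that bit
    # set; the three check results, read as a number, point at the error.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]       # positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]       # positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]       # positions 4, 5, 6, 7
    syndrome = s1 + 2 * s2 + 4 * s3      # 0 means no error
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1             # repair the flipped bit
    return [c[2], c[4], c[5], c[6]]      # recover d1..d4

word = encode([1, 0, 1, 1])
word[4] ^= 1                             # corrupt one bit in transit
assert decode(word) == [1, 0, 1, 1]      # the meta-level restores the data
```

The check bits are computed *from* the message yet sit at a different descriptive level than the message, which is the sense of "meta" the note invokes.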
<Radical Existentialism teaches>

The problem of initial conditions and inherent chaotic dynamics of the
universe. Initial conditions have to be distributed across time.
People secretly identify the unknown with the sum of things they know
they don't know and forget to consider the vastly larger domain of
unknown unknowns.
Behavioral genetics and the Mr. Potato Head theory.
Every theory of Darwin's day has since been overthrown or at least
revolutionized and each replaced by a modern theory that emerged in the
light of much later discoveries that were completely or largely
unsuspected in Darwin’s time... every theory, that is, except Darwin's. Is
this a happy coincidence or is there a logical explanation at work here?
Does the sociology of science really have nothing valid to say on this
question? If so, then this only compounds the coincidence and raises
more serious suspicions of question-begging.
Milton's bonding of angelic beings…
Heaven is not all of the best stuff on earth with none of the bad stuff.
We must distinguish between persons whose minds interface with reality
at the surface versus those whose minds interface with reality at a
An example of bootstrapping is when accidental attributes end up
becoming central or defining attributes of a system.

Biocentrism: rather than demonstrating that life is necessary for the
existence of the universe, just the converse is demonstrated, namely that
all of our theories are not about the world or the things in the world
outside of the mind; all of our theories are about the self.
Apply the anthropic principle to the self negotiating the myriad quantum-universe
branchings. All of one's parallel selves are merely zombie
versions of one's true self. Continuing with this logic, one's true self
continually enters parallel quantum universes that are largely populated
by zombie versions of other people from one's original or home quantum
'The importance of entanglement for determining space-time structure
is something that three years ago only a few of us were thinking about,'
says Mark Van Raamsdonk. 'Now a lot of people are realizing that it's an
important piece of our thinking about quantum gravity.' From 'The great
quantum space-time tangle' by Adam Becker.
'The great quantum space-time tangle', published by New Scientist:
'SPACE-TIME, the very fabric of our universe, may be a tangled place.
Entanglement, a feature of quantum mechanics that links objects over
great distances, could be responsible for its structure. It all sounds a little
wild, but the id'
About 551 documents for 'quantum collapse wavefunction consciousness "Wigner's friend"'
July 2013

With regard to the fundamentals, e.g., metaphysics, epistemology,

ethics, religion, politics, etc., all smart people, or at least all people who
fancy themselves smart, have already made up their minds as to which
side of the issues they stand on. So any polemical work does well to
state up front what ax is being ground with the understanding that the
work shall only serve, if it succeeds at all, in providing grist to already
like-minded souls, who are more or less passionate about the positions
on the issues, which they hold in common with the author. If a goodly
many of these kindred spirits perceive that the author’s work provides
them with additional rational justification for already long held
intellectual prejudices, then the polemical work offered has every chance
of being a success. By and large humans do not hear much less do they
heed messages from outside of their respective echo chambers. It is the
height of arrogance to believe that one is skilled enough to fashion an
argument which can succeed in changing some smart person’s stand on
any important issue. Experienced polemicists know this and yet they still
devote much effort to conceiving clever arguments and submitting them
for publication. The motivation here is primarily desire for monetary
“If we consider all possible worlds there may be plenty of
universes with naturally occurring fine-tuned constants, but there may be
more fine-tuned universes that have a god who set those constants. This
argument rests on the assumption that the most likely explanation for
improbable (in this particular sense) non-evolved things is intelligence.
It is more of an analogical argument and might be summarized as: "from
our experience we can see that non-evolved things that exhibit the
characteristics of being delicately balanced to achieve something
complex (in this case life) are more likely to be designed than to come
about by chance". As Plantinga puts it: "It's as if there are a large
number of dials that have to be tuned to within extremely narrow limits
for life to be possible in our universe. It is extremely unlikely that this
should happen by chance, but much more likely that this should happen
August 2013

How can the probability of a coin toss turning up as either heads or tails
be 50 percent, given the more or less causal determination of the flipping
of each coin?
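One standard resolution holds that the outcome is deterministic but extremely sensitive to initial conditions, so any realistic spread of launch spins averages out to one half. A crude toy model (the flight time, spin rates and the heads rule below are invented for illustration, not real coin dynamics):

```python
import math

# Deterministic toy flip: the coin leaves the hand spinning at rate
# `omega` (rad/s) and lands after time `t`; call it heads iff it
# completes an even number of full turns.
def lands_heads(omega: float, t: float = 0.5) -> bool:
    full_turns = int(omega * t / (2 * math.pi))
    return full_turns % 2 == 0

# Sweep a narrow band of launch spins: the outcome alternates so
# rapidly with omega that the average lands very near one half.
samples = [lands_heads(100.0 + 0.001 * k) for k in range(100_000)]
print(sum(samples) / len(samples))
```

Each individual flip is fully determined; the 50 percent emerges from the coarse distribution over initial conditions, not from any indeterminism in the coin.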
'Eine unendlich überlegene Wissenschaft' ('an infinitely superior
science'): there is no English phrase which sounds as good in the ear of an
Englishman or an American as this phrase sounds in the ear of a German
speaker. The linguistic constraints
of rhyme, meter and cadence, which are peculiar to a particular tongue
do not only limit what one can say or translate from one language into
another, but also underlie the profundity of what is said and can be said
in some instances.
There is a non-logical component to the halting trial and error process of
scientific discovery which cannot be intersubjectively communicated to
successive generations of scientists, which is always lost in that process,
but which is a key component to the underlying creative intelligence of
scientific discovery.
We know that God is not looking, but not that He does not exist, because
of the existence of as yet uncollapsed wavefunctions.
Can we generalize the Monty Hall probability puzzle by casting God in
the role of Monty Hall in such a way as to explain wavefunction

The Wheeler delayed choice experiment does not prove that there is
backwards in time causation rather it demonstrates that a moment of
time takes time to be integrated at the quantum level.
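The proposed generalization of the Monty Hall puzzle trades on its standard counterintuitive result, which a short simulation confirms: switching doors wins about two thirds of the time, because the host's door-opening injects information into the game.

```python
import random

def play(switch: bool) -> bool:
    """One round of Monty Hall: three doors, one prize; the host always
    opens a non-chosen, non-prize door; return True on a win."""
    prize = random.randrange(3)
    choice = random.randrange(3)
    opened = next(d for d in range(3) if d != choice and d != prize)
    if switch:
        choice = next(d for d in range(3) if d != choice and d != opened)
    return choice == prize

random.seed(0)
trials = 100_000
stay = sum(play(False) for _ in range(trials)) / trials   # near 1/3
swap = sum(play(True) for _ in range(trials)) / trials    # near 2/3
print(stay, swap)
```

The casting of God in the host's role would amount to a constrained, information-revealing intervention of just this kind.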
June 2013

The superposition of macroscopic objects points up the
importance of quantum entanglement between sub-Planck-mass cellular
units, each maintaining its own coherence and unitary evolution
although simultaneously nonlocally connected, at least each to its
neighbors. Hierarchical entanglement and the role of consciousness in
piecing out the macroscopic world.
'Quantum gravity and consciousness - what's the connection? Following
Stuart Hameroff, Roger Penrose and their ORCH-OR theory, as well as
Robert Lanza's Biocentrism, we say that, because of the Planck mass-energy
quantum decoherence limit, the microworld and the macroworld
must shake hands, as it were, inside our own heads, but only there. The
only thing we know of with the power of the bootstrap is consciousness
itself, and so a workable 'theory of everything', i.e., a theory of
'something's coming from nothing', has to be sought there. I mean, a
'philosophical zombie' is not going to give us such a theory.'
July 2013

"Shannon information with its notion of information being
equivalent to a reduction in uncertainty does not take into account the
intentionality and context sensitivity of information (the "uncertainty"
must be "about something" because information itself - as interpreted
data - must be "about something"). Reductionistic causal links dissolve
into the inherent fuzziness of space-time (at the micro level), which is
dictated by the Heisenberg uncertainty principle. It is intuitively evident
that this fuzziness is dynamic and forms the substrate of the very
operation of mind as a particular instantiation of consciousness even
though mind cannot possess or conceive of a general notion, concept or
category of what consciousness is, cf. Gödel’s Incompleteness.
Consciousness *as such* on the other hand, may well be more
fundamental than this "spacetime fuzziness", which may indeed rather
represent the natural indefatigable restlessness of this broader
consciousness - what eastern mystics term "the play of Lila". The
infinite regress prompted by the question, "why is there something rather
than nothing", is not a reductio ad absurdum. It is rather the inevitably
dynamic nature and restless condition of being. Quantum decoherence
prevents the collapse of the causal chain (at the macro level) and thus
allows the operation of temporality. In a subtly analogous way, quantum
decoherence also prevents the reduction of the higher pleasures of life
such as spiritual or intellectual pleasures to mere utilitarian calculated
sums of equivalent small quantities of direct stimulation of the
hominid-ape brain's pleasure center. In short, quantum decoherence introduces
just enough of just the right compartmentalization of experience in order
to prevent the annihilation of spiritual potentiality which is otherwise
borne of the totalization of experience as reprocessed stimulus-response."
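Shannon's measure itself is easy to state: the entropy H = -Σ p·log₂(p) quantifies the average uncertainty removed per symbol while saying nothing about what the symbols are *about*, which is exactly the gap the passage above points to. A minimal computation:

```python
import math

def shannon_entropy(probs):
    """Average uncertainty in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one full bit per toss; a heavily biased coin
# carries far less, and a certainty carries none.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))
print(shannon_entropy([1.0]))
```

The calculation runs identically whether the two outcomes are heads/tails or war/peace; intentionality never enters the formula.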
They use the right language, which proves they understand the problem.
We don't need a concept of consciousness if there is only one
consciousness. 'One of a kind' is not a kind and is a contradiction in terms.
Daniel Yergin, The Quest: a lady on the metro is intently reading it.
The information for the Cambrian explosion might have existed in
reservoir during the Precambrian; this is doubtful.
What else is entailed by supposing that, if the evolutionary process were
not actually random but possessed a kind of lookahead capability, then
the rate of evolution would simply be increased without any qualitative
difference in the kinds of living organisms that evolutionary process
Latent information in reservoir, as it were, is equivalent to assuming a
lookahead capability or teleology in evolutionary development.
There is a very definite limit to the amount of information that can be
derived from reprocessing pre-existing information in the absence of the
introduction of any altogether new information, that is
June 2013

This notion of altogether new information, which necessarily
comes from outside of the system rather than simply resulting from the
reprocessing or reshuffling of information already latent in the system, is
interesting in connection with the theory of intelligent design. The
paradox of functional information is that non-functional information
plays an essential role in providing the infrastructural context, which is
metainformation.
If the universe has always been here, then why hasn't it collapsed under
the weight of the trash piled up outside?
There should be a big difference between genetic bootstrapping and the
mere execution of a genetic algorithm.
Quantum nucleation instead of singularity; decoherence, self-existence,
thermodynamics.
What if being and existence are derivative categories in but tiny portions
of the unlimited? Such questions are prompted by meditation upon the
notion of transcendence.
Free will means that if unlimited opportunity were given humanity to go
back and replay its historical course, then human history would never
replay the same way twice, but only occasionally appear to do so.
The singularity occurs at the point at which our software merges with
the ancestor simulation software.
August 2013

Given a properly full-blooded concept of transcendence,
existence itself becomes a predicate. This is because existence would no
longer be the most general mode of being, but merely one mode of being
alongside other modes of being, e.g., mathematical subsistence, and it
would then become meaningful to speak of existence as a predicate,
since existence would not then itself be utmost in generality.
I wonder what Alvin Plantinga and Joseph Campbell would have made
of the logic of the anthropic principle had they encountered it during
their heyday as philosophers.
The transcendental nature of the anthropic cosmological principle
consists in the fact that this principle applies equally to each and every
person amongst the myriad persons that exist in this universe.
The initial and boundary conditions of the universe are after a fashion
like a Bible Code palimpsest.
It depends on which encryption key code (in the sense of consciousness
as individualized hard encryption) one applies in interpreting this
cosmological Bible code in these boundary and initial conditions as to
which solipsistic cosmological principle is realized in actuality.
The only things we know of in the universe that aren’t
mechanisms are 1) consciousness and 2) wavefunction collapse/reduction
of the state vector. Time as fundamental a priori form/Kantian
supercategory or intuition. Bergson says that consciousness is the
intuition of time’s passing, i.e., Bergson’s durée. So we apparently
must seek the chronology protection mechanism in consciousness itself.
The mechanism of hard encryption is also intimately associated with
consciousness, cf. the absolute mutual compartmentalization of
personal or individual consciousness.
We already noted that
decoherence rate is the one temporal aspect which resists the otherwise
universal action of time dilation, which is an indication that the
mechanism of time dilation is, according to this selfsame logic, to be
sought in the underlying mechanism of quantum decoherence.
December 2012

July 2013

The magnitude of gravitational field intensity is ceteris paribus
correlated with the strength of gravitational time dilation (weak field
approximation), but is also thought to drive, in part at least, quantum
decoherence, itself a temporal process. Quantum decoherence appears
the only temporal process currently known to science whose rate is not
subject to gravitational time dilation in the same uniform manner as
indeed are all other known temporal (physical) processes. This suggests
that the mechanism underlying quantum decoherence may be among the
building blocks of the mechanism of gravitation. The discovery of any
remaining building blocks of this mechanism may have to await the
identification of further nonuniformities in the response of specific
physical processes to the effects of gravitational time dilation. Now if,
per impossibile, some form of dualism turned out to be the case, then we
might anticipate some new form of deep-space sickness in the form of a
kind of insidious and cumulative impairment of normal mental
functioning experienced by astronauts during long voyages in zero gee
or artificial gravity, say via nonuniform alteration in tubulin dimer
decoherence in relation to the temporal evolution of brain quantum
coherent states, and this on account of the twofold differential action of
gravitational time dilation upon quantum brain coherent and decoherent
processes, heretofore unseen by the processes of natural selection which
originally fashioned an astronaut’s hominid ape’s brain.
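For reference, the weak-field relation the note leans on, with Φ the (negative) Newtonian potential at the clock's location:

```latex
% Gravitational time dilation, weak-field approximation: a clock at
% potential \Phi runs slow relative to a distant observer by
\frac{d\tau}{dt} \;\approx\; 1 + \frac{\Phi}{c^{2}} \;=\; 1 - \frac{GM}{r c^{2}}
% so greater field intensity (more negative \Phi) means stronger dilation.
```

Every known physical clock obeys this uniformly; the note's claim is that the decoherence rate alone may not.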
It is obvious that if there was sufficient chemical self-organization at
the level of atoms and simple organic molecules to drive forward the
first billion years of chemical evolution prior to the advent of a unit of
heredity, then the first primitive organisms would have arisen
everywhere across the surface of the earth and at more or less the same
time. So Darwin's notion of descent with modification from a common
primitive ancestor is thus seen to be readily falsified. Neither chance nor
necessity can explain the origin of specified complexity in the genome.
There is a third category, which is the action of an intelligent will of a
designer or genetic engineer. The third category is not self-organization
operating exclusively on its own but a self-organizing dynamic
operating under the influence and input of intelligence or conscious
Information doesn’t come from the physics and the chemistry, but out of
how these processes are deployed or arranged, just as general
anesthesia is not induced as a result of a specific chemical action.
May 2013

But there's still the problem of how to avoid the seeming
necessity of extending the logic of the anthropic cosmological principle
from the penultimate fine-tuning of physical constants in terms of
making carbon-based life a winning possibility to the ultimate fine-tuning
of said constants in terms of *my* consciousness - inevitable in
the fullness of infinite time (what is called the fine-tuning of
consciousness) - that is, within the context of a "Mixmaster multiverse"
wherein every possible individual consciousness, yours, mine, Boutros
Boutros-Ghali's, etc., is inevitably at some point called forth unbidden
from the screaming abyss *into its own anthropically, er, uh,
solipsistically fine-tuned, biocentric universe* populated by yours truly
and a host of also-ran "philosophical zombies". The happy
appearances within this otherwise dismal poly-solipsistic scenario are
presumably only saved by admitting, contra hyp, those aforementioned
Boltzmann brains, albeit quantum-entangled ones (by virtue of BB's
necessarily being engendered within a unitary underlying, fundamental
physical, which is to say quantum, process, e.g., "Big Bang" or
whatever). The cash value of objectivity is "intersubjectivity", although
logically the "subjective" (read here: "infra-subjective") has no analysis
in "inter-subjective" terms. This emergent economy of rationally
self-interested, though cosmically lonely transcendent beings who are
inadvertently collaborating to produce the workaday world of utilitarian
common sense seems unavoidable when you consider that the realm of
limitation (space, time, causality, and so on) provides the only possible
avenue of escape for beings otherwise forever bored by the powerful
illusion of knowing everything. This headlong dive into the realm of
limitation is foolhardy, based as it was upon a hope against hope that
there must be others who are similarly bored. [insert forgotten quotation
by Bertrand Russell on the power of boredom] [insert spooky quote by
Wittgenstein on the necessity of remaining silent]
Posted on May 23, 2013 in response to Jeremy White’s assertion that I
could throw away my melatonin and not stay up worrying about
Boltzmann Brains because the latest development in string theory had
rendered BB’s highly


Combine Plantinga’s modal ontological argument (MOA) for God’s
existence with the multiverse of Tipler and Barrow’s Anthropic
Cosmological Principle (ACP) through identification of the term
"possible world" in the MOA with "universe" in the ACP, in which these
universes comprise the multiverse. In so doing one naturally asks oneself
the question: would I have been called forth from the void (by the
solipsistic cosmological principle (SCP), i.e., the logic of the ACP
applied and brought down to the level at which each self-aware mind is
generated - by the multiverse) into a universe in which God, who is
admittedly possible in some universes, i.e., “possible worlds”, is not
possible? Perhaps only as a "philosophical zombie" (in other words, only
as an entity that others might confuse with being me). An important
supplementary question here is…does God fail to exist in universes in
which He is possible, which given God’s omnipotence should be
rephrased as the following: does God choose not to exist in any
universes in which He is possible, still more in the one in which I now exist? To
wit, did God abandon me? If so, then for what possible reason?
The modal ontological argument can be attacked at an additional point to
that of the concept or definition of a “maximally great being”. This
second weak point of the MOA, which I have never seen anyone attack
before is this notion that the actual world is included in the set of all
possible worlds. But just as “fire must be breathed” into the equations of
physics in order for our mathematically describable universe to be real
as opposed to merely possible, some crucial factor sets the real world
apart from this world as merely hypothetical. Similarly, there is an
important distinction between an eigenvector in a quantum superposition
and its corresponding eigenvalue as an experimental outcome of the
measurement.
June 2013

"The logic of the Anthropic Cosmological Principle (and in particular the
update of this principle by quantum multiverse theory), truly only applies to the
individual ego or *its* consciousness. It is mere courtesy which permits this
principle to be applied to humankind as a whole and still more to carbon-based
life," c.f.,
But of course any argument which cannot be communicated (because of the
necessary absence of an audience to hear and understand it) must be rejected out of

hand. (Yahoo email signature updated 06/01/2013 in an email of au=Russ Kick's
TUTTO QUELLO CHE SAI È FALSO to Leah Haight). October 2014 The signature
line above is related to my observation that most cutting edge cosmological
theories carry implications on multiple levels that call into serious question the
empirical basis for said theories.
By the very same logic proffered in support of the Anthropic Cosmological
Principle one could put forward an equally cogent argument in favor of a so-called
Solipsistic Cosmological Principle. But any argument which cannot be
communicated (because of the necessary absence of an audience to hear and
understand it) must be rejected out of hand.

One should, of course, go on to consider the probabilities of the
polysolipsistic cosmological principle. Each of us is "here" as a
conscious being entertaining the logic of the anthropic cosmological
principle and so the fact of one’s being “here” means that the universe or
multiverse (in which a rampant fecundity principle is a corollary) had to
be structured and function in just such a manner as to bring oneself into
self-conscious existence. This notion carries the implication that the
varying temporal (or multi-temporal for that matter) conditions that from
eternity past had conspired to bring one into being, unbidden from the
“screaming abyss” were enabling factors, i.e., some finite set of
necessary conditions, say like the application of cold water, slaps to the
face and smelling salts to a person who has fallen unconscious rather
than a sufficient condition for creation of one's self, whole-cloth ex
nihilo.
"There are great ideas undiscovered, breakthroughs available to those
who can remove one of truth's protective layers" - Neil Armstrong, July
20, 1994 November 2012 That protective layer is what is called "the
obvious”, i.e., that which obviates. Culturally based or traditional truth
is path-dependent truth, which is to say conditional truth and points up
an important question for philosophy: is path-independent or
unconditional truth ever given or if such exists, but is not “given” can it
nonetheless be accessed? (Leading us back into the notion of
path-dependence, but now at a meta level.)
June 2013

Because of the seemingly genetically hard-wired confounding of the
rules of implication in the human brain, specifically that of modus
ponens with modus tollens – undoubtedly selected for on account of its
having all along worked more often than not – humans are inclined to
mistake correlations for causal connections, think teleologically and still
more characteristically, to regularly indulge in magical thinking.
Mystical thinking, still more, i.e., the tendency to conceptualize in terms
of all but rather than nothing but is also probably owing to this inborn
informal logic, which has been moreover promoted and sustained by the
always unification/systematization-transcending “mosaic logic” of the
consciousness-diversity-infused and highly social breeding population,
which, after all, is what nature has all along been selecting for rather
than the perfectly adapted individual/ego, which must be
Gödelian-incomplete as a survival-problem-solving analytical engine! Coherence
necessarily contains within itself the very seeds of incoherence (in the
sense of not ultimately being practical as a survival strategy)! This is
why the ideologues and system builders of western culture are so
fascinating to study, historically speaking, but are so boring to attempt to
understand on their own terms.
"I know that most men, including those at ease with problems of the
greatest complexity, can seldom accept even the simplest and most
obvious truth, if it be such as would oblige them to admit the falsity of
conclusions which they have delighted in explaining to colleagues,
which they have proudly taught to others, and which they have woven
thread by thread into the fabric of their lives." - Leo Tolstoy
“One night, lightning struck the oak tree. Eddie saw it the next morning.
It lay broken in half, and he looked into its trunk as into the mouth of a
black tunnel. The trunk was only an empty shell; its heart had rotted
away long ago; there was nothing inside—just a thin gray dust that was
being dispersed by the whim of the faintest wind. The living power had
gone, and the shape it left had not been able to stand without it. Years

later, he heard it said that children should be protected from shock, from
their first knowledge of death, pain or fear. But these had never scarred
him; his shock came when he stood very quietly, looking into the black
hole of the trunk. It was an immense betrayal—the more terrible because
he could not grasp what it was that had been betrayed. It was not
himself, he knew, nor his trust; it was something else. He stood there for
a while, making no sound, then he walked back to the house. He never
spoke about it to anyone, then or since.”
- Ayn Rand, Atlas Shrugged
You cannot quantize the probability of your falling in love with a given
attractive and desirable person, which is to say that this probability
cannot be represented as the absolute square of any possible
wave function.
“This belief, however, elevates fallible human thought on par with the
Word of God. And what we discover is not a development of doctrine
but a departure
from it.” **Not if the HS guided the selection of books that were
included in the Vulgate, and those who were guided in this were also
guided in the development of early Church traditions and teachings.**
“Rome does not allow private interpretation of Scripture out of fear that
could undermine the authority of the Bible and the Church.” **Just
think about what a serious problem heresies were to the early Church. If
one accepts the existence of a positive principle of evil (as opposed to a
mere absence of good), does one imagine that this principle remained
uncharacteristically silent and uninvolved with influencing the creation
and dissemination of heresies? What better way to attack the Church
than at its root, that is to say, during its infancy?**

“Pope Leo XIII (1810-1903) stated: "God has delivered the Holy
Scripture to the Church, and.... in reading and making use of His Word,
(men) must follow the Church as their guide and teacher.” **So the
Church should have just allowed members of the illiterate masses of all
cultures and historical traditions to read the Holy Scriptures on their
own, unsupervised and without any instruction as to interpretation and
application of those Scriptures? Really? **
“The same Pope also said that it is impossible for any legitimate
interpretation to be extracted from the Bible that is at variance with the
doctrine of the Church. Any interpretation that is opposed to Church
doctrine is therefore false.” **The fundamentalist Protestant principle
of Sola Scriptura could have only become viable after the fashion of the
creation of a metabolic by-product as opposed to the spontaneous
generation of a wholly intact DNA molecule within the primeval, prebiotic soup. Moreover, Sola Scriptura is supported by a vast
infrastructure of Theological Seminaries. **
“In other words, the RC church professes to provide divine guidance for
her members. She demands recognition as the infallible interpreter of the
Scriptures.” **This is a twisting of words: the guidance to her members
provided by the Church is informed by divine guidance. A Church that
is not guided by God is no Church at all, but merely a house of spiritual
complacency. There is an equivocation of doctrine vs. interpretation of
doctrine here - God provides the former, while His Church provides the
latter. **
“The 16th century Reformers were in unanimous agreement in their
opposition when Rome claimed that teaching authority lay in the
magisterium with the pope as its chief shepherd under Christ.” **To
what do we owe the vast proliferation of Protestant denominations and
their further 1st, 2nd, etc. order splinterings, which shows the triumph of
chaos over the intention to faithfully transmit God's message down the
generations?** “The Reformation of the Church was the Lord's
intervention to lead His church back to the Gospel. The decline of

medieval Christianity was very gradual. The more serious errors didn't
arise until as late as the 14th and 15th centuries. Eventually the result of
this descending darkness was serious. The problem was with what Rome
had added to the Bible over the centuries." **The errors, accumulated
over many centuries more or less harmlessly, not having reached any
critical threshold, only arrived at this threshold just at the end of the
Middle Ages and just at the advent of the Renaissance - mere
coincidence?**
“The Protestant Old Testament is the same as the Hebrew Scriptures
(except for the order of the books). Roman Catholics, on the other hand,
add additional books – Tobit, Judith, Wisdom of Solomon,
Ecclesiasticus, Baruch and 1 & 2 Maccabees – as well as some extra
sections to Daniel and Esther, to form their version of the Old
Testament. Though most of the Old Testament books are quoted
frequently by New Testament writers, these extra RC books are never
quoted.” **Many books, over two dozen, in point of fact, are quoted
from in the Old Testament that were themselves not included in the
Canon of Scripture.**
Sense perceptual juxtapositions that produce multimodal cognitive
dissonance make excellent vectors for the insertion of wordless
artistically fertile metaphors.

Add personal Kantian Metaphors here

Can we utilize and understand metaphor without being taken in,
captured, hypnotized by metaphor (narrative structure)? Off the autism
axis – between autistic and “normal”, disjointed recollected biography
leaves room for revisionism, rationalization and higher integration (2d
time). Dreams and multi-dimensional time, c.f., ΔEΔt in the context of
2d time.
Without transcendent universal mind there is no distinction between

consciousness being a One or a many. The collective mass of what we
don't know we know. And of course, it is this inherent and perhaps
unbounded multifacetedness of the individually collected human
experiential data, which must transcend the meaning of individual
human experience (as attributed by each individual to his or her own
experience) that so strongly suggests the reality of the transcendent
realm and its Deity. But what services this “unbounded
multifacetedness” is the underlying rationality of language qua
infrastructure of intersubjectivity. This transcendent realm may be
thought to be akin to Peirce's "unlimited semiosis". Deity is implied
by the notion that there is an objective observer that is in turn implied by
the necessity of an ultimate “tying off” point of this otherwise
unbounded semiosis.
Having the epiphany that you personally don’t really know
anything at all means that Man’s knowledge can be at best but a
collective illusion. . . and all this within the context of a sheer abundance
of grace. How can one not then have faith upon recognizing the
Providence involved in such a coherent manifestation of the radical
unknown? One is called forth from the Void and enters the world from
one unknown only to pass from it into another unknown. What is lost on
some is that the world itself is yet a third unknown, rendering the first
and the latter qualitatively distinct. And so metaphysical work can only
be performed by experience if it is possible to transcend all dual
opposite categories. You emerged into the world from a hole you didn’t
crawl into and shall leave it from a hole you can’t crawl out of. No one
as they say, “gets out of here alive”.
August 2012

“If a man is standing in the middle of the forest speaking and there is no
woman around to hear him - Is he still wrong?” “Yep...because of the X
chromosome lurking in each cell of his body.”
“Clay tablets would have been better” Comment about the fragility of
digital media data storage.

Begrudging the cultural and artistic expressions of others. Two forms of
the very same kind of blind faith: that of the “God of the Gaps” versus
“Science will eventually find the answers”. What are the implications of
a radical logical self-consistency approach to a theory of truth? How
does the principle of quantum superposition endanger the law of
excluded middle? Or does quantum superposition actually support the
excluded middle? The principle of the loose ends only showing on the
underside of the carpet? Is behavioral genetics fundamentally
responsible for the coherence and cohesiveness of the perceived external
world and society?

@? Develop a list of kernel idea search terms to facilitate expanding
expositions of the ideas in this text. (Actually, hyperlinked Excel
spreadsheets should be developed for all of the above superscripted
“footnotes” to include hyperlinking different Excel objects together.
Better yet the Excel spreadsheets should merely exist in potentia as
queries to an Access database. Developing a list of key terms, that is,
those with say, more than 100 hits within this document, might facilitate
“There is no remembrance of men of old, and even those who are yet to
come will not be remembered by those who follow.” - Ecclesiastes

“Whosoever reflects on four things, it were better for him if he had not
come into the world: what is above; what is beneath; what is before; and
what is after.”
- The Mishnah, Hagigah 2:1
“I could say harsh things about it but I cannot bring myself to do it—it is
like hitting a child. Man is not to blame for what he is.... He is flung
head over heels into this world without ever a chance to decline, and
straightaway he conceives and accepts the notion that he is in some
mysterious way under obligations to the unknown Power that inflicted
this outrage upon him...” – Reflections on Religion, Mark Twain
My contributions to these writings during the period, January thru
November 2011 were made while deployed to Bagram Airfield,
Afghanistan with the 415th MI BN in support of DFIP operations for
OEF. Contributions made during July 2003 through February 2004 were
made during my deployment to Gjilane, Kosovo as THT 6 Team
Sergeant in support of KFOR’s HUMINT collection efforts in support of
OEF there. By the way, if you happen upon these writings and it is not
yet mid-century (by that I mean the 21st !) then permit me to indulge in a
bit of naïve, wishful thinking by saying that there is perhaps a fair
chance that I'm still knocking about this rock, so feel free to provide me
with your feedback, bitte schön.

The perception of one’s own intentional thought process is
temporally interwoven with the perception of sense data and memory as
well as that of one's own bodily movements, and orchestrated within
Libet’s 500 millisecond window of pre-consciousness in just such a way
as to maintain awareness of intention and action as interconnected so as
to present the appearance of a freely acting “center of volition.” This
brings up the puzzling question of what might indeed be the value to
natural selection of merely the appearance of a self possessing a free
will if in fact the continual stable appearance to the human animal of its
possessing a self as the author of its own decisions and actions is an
illusion without substantive causal efficacy.
July 2011

We sometimes revisit perforce the haughty metaphysical opinions of our
youth because the passing of all the learning and experience of the

decades has done nothing to undermine them and these same opinions if
refined and dressed up as though uttered by a mature person will not so
quickly be dismissed out of hand and thus lend their weight to sober
reflection.
ArXiv: hep-th/9308061v1: au= “Aharonov, Anandan, and Vaidman [1]
have recently argued that in addition to its usual epistemological role,
the wave function in quantum mechanics in certain situations also has an
ontological status. In other words, in addition to acting as a device in the
theory to encode the conditions (of our knowledge of the world) it must
also, in certain circumstances, be regarded as real [italics mine], in the
sense that one can completely determine an unknown wavefunction of a
single system as opposed to an ensemble of states [italics mine].
Certainly if their claim were true, that one could take a single system
with an unknown wavefunction, and completely determine that wave
function on that single system, one would have to accord the wave
function a reality on the grounds that that which is measurable is real. In
the course of this paper I will argue that they have failed to establish the
measurability of the wave function, and thus have failed in their attempt
to demonstrate the reality of the wave function. The argument is
however subtle. Thus the plan of this paper will be to first discuss the
problem of reality in quantum mechanics, to set the stage for the question
that they are trying to answer." My comment: au=Aharonov proved the
reality of A, the vector potential so that if A is identified with the photon
wavefunction (see au=Bohm in this connection), then the reality of Psi
has been adequately demonstrated.

"Science proceeds as if the past was the home of explanation; whereas
the future and the future alone holds the key to the mysteries of the
present. When that first cell divided, the meaning of that division was to
be discovered in the future, not in the past; when some pre-human
ancestor first uttered a human sound, the significance of that sound was
to be interpreted by human language, not by apish grunts; when the first
plant [showed] solicitude for its seed, the interest of that solicitude lay
in the promise of affection. Things must be judged in the light of the
coming morning, not in the setting stars." (au=Sedgwick, 1916) And here
it is again as stated by au=Terence
McKenna: “For me, the key to unlocking what is going on with history,
creativity, and progressive processes of all sorts is to see the state of
completion at the end as a kind of higher-dimensional object that casts
an enormous and flickering shadow over the lower dimensions of
organization, of which this universe is one”, c.f., cit=Trialogues at the
Edge of the West. Philosophers have a term for this: causal
supervenience. The details of how causal supervenience works may
forever remain mysterious; however, in general we can say that it must
involve the spontaneity (whether or not intended) of vacuum
fluctuations, which are correlated in a manner which does not support a
time-reversible causal connection.
Presumably entropy-laden,
irreversible processes also have a causal (time-reversible) physical basis,
at least with respect to sufficiently small scales of spacetime, but at some
sufficiently large spacetime scale, the nature of the correlation of
vacuum fluctuations underlying the physical process in question invokes
quantum entanglement that is not reversible. (Relevant to the issue of
wavefunction collapse/state vector reduction). Given that quantum
entanglement is a relation of absolute simultaneity, i.e., in all reference
frames, we suspect the irreversibility comes into play on account of
higher dimensions of temporality. November 2012 Quantum decoherence is
intimately connected with entropy, statistics (probability) and temporal
irreversibility. Reversibility of time seems to require an additional
degree of freedom best provisioned within a plane of complex or
otherwise multidimensional time.
April 2011

"Sedgwick's principle" is particularly relevant in connection with
higher order regulation of gene expression. 98.5% of the DNA base pair
patterns in the human genome are held in common with the Chimpanzee

and the Bonobo Ape. Approximately 50% of base pair sequences are
the common genetic heritage of both humans and the lowly yeast mold.
Recent research reveals that approximately 50% of human DNA is also
of viral origin. Perhaps so much of human DNA is of viral origin
because these viral genes are leftover “vectors” utilized in the distant
past as means of inserting gene sequences which otherwise would have
taken too long to develop on their own via natural evolutionary
processes. August 2011 Viruses, which are not included in any of the
three Linnaean kingdoms of taxonomic classification (well, arguably
there are now four as of this writing) because they are not considered by
biologists to be alive, seem to predate the appearance of the first unit of
heredity, i.e. DNA and/or RNA, and so must have originated and
developed according to the intrinsic self-organizing properties of atoms
and molecules – what is thought to have been responsible for the first
billion years of chemical evolution that took place prior to and wholly in
the absence of Darwinian natural selection. Because viruses function so
efficiently within the cell and cell nucleus as well as interoperate
admirably with strings of DNA and/or RNA, both reorganizing genetic
base pair sequences as well as altering the expression of these same
sequences, it is tempting to suppose that viruses themselves originated
within ancient living cells or cellular nuclei. But this would of course
land us in a “chicken or the egg” paradox. November 2012 To recap: viruses
came about prior to DNA/RNA, which themselves were necessary for
the appearance of the first cells, but interoperate with both, even though
viruses originated in an exclusively self-organizing process that is not
dependent upon natural selection. And now, however, we are expected
to accept on faith that random chance mutations working in combination
with natural selection alone succeeded in bringing into being the entire
spectrum of biological order that we witness today all around us. August 2012
This reminds us of the case in which a shattered hologram is
gradually reassembled and meanwhile the image encoded in the
hologram becomes ever sharper. (Do base pair sequences within viruses
function akin to metaphors in relation to the base pair sequences of the
DNA within the cellular machinery that the invading virus subverts?)
Now if this impossibly causally twisted temporal relationship could be

shown to be a mere projection or shadow (appearance) of what is
occurring in a higher dimension onto some lower dimension, then this
paradox would be solved or, rather dissolved. Since causal relationships
within 1-dimensional time are, according to Bohm’s causal principle,
equivalent to a specific set of correlated vacuum fluctuations, but which
only constitutes a tiny subset of the total array of correlated fluctuations
within the quantum vacuum, we may seek the needed higher
dimensional causal relationships, i.e., causal relationships within higher
temporal dimensions, within the quantum vacuum, perhaps within its
higher order, e.g., 2nd, 3rd, 4th loop, etc. (There is a New Scientist article
of recent writing entitled something to the effect of “matter is composed
of vacuum fluctuations”). Or perhaps even confining ourselves to 1 st
order fluctuations, if two or more sets of frequencies can be shown to be
orthogonal, out of which distinct time series can be constructed, then this
may be strong indication of higher temporal dimensionality. (This
orthogonality is distinctly different from that by which functions of
different frequencies are assigned different weighting coefficients within
a Fourier expansion of a single time domain function).
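The familiar Fourier sense of orthogonality contrasted with here can be illustrated numerically. A minimal sketch (assuming NumPy is available): sinusoids at distinct integer frequencies have a vanishing inner product over one period, which is the standard property against which the note's speculative "orthogonal time series" is being distinguished.

```python
import numpy as np

# Inner product <sin(2*pi*m*t), sin(2*pi*n*t)> over one period [0, 1),
# approximated on a uniform grid; it vanishes when m != n.
t = np.linspace(0.0, 1.0, 10_000, endpoint=False)
dt = t[1] - t[0]

def inner(m, n):
    """Approximate the integral of sin(2*pi*m*t)*sin(2*pi*n*t) over [0, 1)."""
    return np.sum(np.sin(2 * np.pi * m * t) * np.sin(2 * np.pi * n * t)) * dt

print(inner(3, 5))  # ~0: distinct integer frequencies are orthogonal
print(inner(4, 4))  # ~0.5: a frequency paired with itself is not
```

This is exactly the orthogonality that lets a Fourier expansion assign independent weighting coefficients to different frequencies within a single time domain.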
August 2012

A unit of heredity is to natural selection as a rational individual
is to the free market. Both DNA and the rational individual are
providential kernels that grace their respective dynamical systems while
transcending the scope of explanation supported by the logic of either.
April 2011

What makes mankind different from his more primitive apelike
forebears is not so much distinct differences in genetic base pair
sequences as it is differences in the regulation of the expression of these
genes held in common with earlier or less evolved life forms. There have
been simultaneously two trains of evolution operating: evolution of the
individual gene and evolution of the regulation of the expression of
genes and gene sequences. It appears that the “temporal space” within
which evolution takes place must possess a basis of at least two
dimensions. June 2011 The fact that the regulation of the expression of
genetic base pair sequences is open ended implies that the interpretation
of the genetic code as a language is no mere cute analogy, @# but is a

fact of some profound and enduring significance! The context
sensitivity of the genetic code is likely to be a two-way affair. The DNA
only contains information if it is a component of an informational
system, which is to say that any information stored in it has to have been
put there and that the molecule is open to modification so as to receive
additional information. All this is by way of saying that an element only
contains information if it is contained within a feedback circuit or
network such that the “flow of information” is two-way.
April 2011

Communications refer to the necessary or important features of
things without ever specifying the things themselves. Only data is
transmitted between minds, which is then interpreted as information.
Context gives meaning just as nonlocality gives reference. One comes
from one unknown, spends the duration of one’s life in another
unknown, only to pass on to still another unknown. (Aside: it is a most
important life’s goal to find the message that was left waiting for one
within the world into which one was born). And the two domains of the
unknown bookending this infinitesimal existence, we tend to suppose
are eternity past and eternity future. The question arises whether these
two domains of oblivion are indeed identical or, does this mote of a
single fleeting human existence enjoy or partake of the metaphysical
power to divide eternity such that it must take on an aspect of
everlastingness, itself of two parts. This brief speck in time represented
by a human existence converts eternity into an everlasting oblivion.
(Eternity into "everlastingness") September 2011 I am reminded here of the
playfully irreverent statement by the crucified character played by Eric
Idle at the end of the Monty Python film, Life of Brian, which goes
something like this: “you started with nothing and now you’ve ended
with nothing – you haven’t lost anything!” September 2011 However,
consider that the proposition, “Russell Clark does not exist” uttered in
1958 (a year before my mother conceived me) and “Russell Clark does
not exist” uttered in 2058 (after I am dead) can’t possibly mean/refer to
the same thing: @$in the first instance, the subject “Russell Clark”
doesn’t refer to anything (only if the “Russell Clark” from the second
utterance is what is intended), otherwise this term refers to any number

of persons named “Russell Clark” – past, present and future, including
this Russell Clark. So intentionality doesn’t equate with reference
because being and what is called existence are not coextensive
categories. What distinguishes intentionality and reference, suggested
above, shares some features with McTaggart's incompatible time
predicates. For “Russell Clark” to successfully refer to this Russell
Clark (the current one writing this), it must fail to refer to him at a
certain earlier and all previous times. But there is something, which, if it
were ever successfully referred to once and named at that time, it would
be successfully referred to with this name at all other times. November 2012
This rather invokes the notion of registration, whether of one’s brain as
a system that shall be “plugged into” and interoperate with some cosmic
Internet or as a consciousness which is assigned, as it were, its own
unique and delimited signal bandwidth, etc., we cannot say. September 2011
And this something is that something which avoids contradiction of the
incorrigible principle, ex nihilo nihil fit. Things which don’t exist get
and stay connected (say in the quantum nonlocality sense) with things
that do exist via the operation of mind. In this way all things are
connected regardless of what categories we might apply to them,
including minds via the operation of mind. But how might two or more
minds be connected when the subjective by definition is not composed
of the intersubjective ("inter"-subjective), unless "inter" only obtains its
meaning via the above alluded to connection principle. This is the case
where the metaphysics of pluralism fails utterly and a kind of monism
must take its place. Another example of failure to refer is what is called
unspecified reference. An example of this is arbitrarily making up a
name for a character in a story not yet conceived of and never written
down. February 2012 All important philosophical, religious and evangelical
atheistic writings (in fact, any writings, which imply metaphysical
claims, affirmative or negative) are open-ended in that they necessarily
contain equivocations of sense traceable to this confusion of the notions
of being, existence and subsistence! January 2014 These evangelical atheists
neglect the behavioral genetic basis of religion; they ignore the fact that
religion is part of the heritage of humankind.

May 26, 2013

D-Wave Corporation recently offered a quantum computer with a 512
qubit chipset capable of accessing 10^154.127 distinct quantum states to
perform a calculation. There are only about 10^80 particles in the
observable universe. D-Wave's computer is then apparently capable of
accessing 10^74 alternate universes to perform a calculation. Don't
worry though: there are no people in those alternate quantum universes,
just calculating zombies.
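The quoted arithmetic checks out in a few lines (a sketch only; the "alternate universes" reading is of course this note's Everettian gloss, not a vendor claim):

```python
import math

# 512 qubits span 2**512 basis states; log10(2**512) = 512 * log10(2).
n_qubits = 512
log10_states = n_qubits * math.log10(2)
print(f"10^{log10_states:.3f}")       # 10^154.127, the figure quoted above

# Compare with the rough count of particles in the observable universe, ~10^80:
print(f"10^{log10_states - 80:.0f}")  # ~10^74 more states than particles
```

The same exponent can be had exactly as an integer: `2**512` is a 155-digit number, consistent with 10^154.127.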
The principle of spontaneous decoherence, namely that of Penrose’s
“one graviton limit”, i.e., Planck mass limit in the magnitude of a single
fluctuation of the vacuum, defies Leibniz’ principle that whatever
conditions were sufficient to create a thing are necessary at every
succeeding moment to maintain that thing in existence. We may take as
a new principle that, the inadequacy of nonlocality implies the necessity
of temporality. Correlated fluctuations embrace distinct subsets: those
which are or are not causality preserving and/or energy conserving.
This reminds us of the humorous saw that, “if it weren’t for the
existence of time, everything would happen at once!” The Planck mass
fluctuation limit means that (cont’d):
“In essence the solution to this would appear to lie in the use of
gravitation theory by Penrose (2004) though the actual mechanism still
needs to be fully elucidated. There and in earlier works he points out that
general relativity (which must be included in an eventual integrated
quantum theory, although it is still far from clear how this might be
done) implies that quantum states which differ sufficiently in their
gravitational fields cannot be superposed. Thus any future theory that
integrates gravitation with the approach being used here will contain a
lower limit for the time separation of moments of consciousness which
matches the time scale on which the difference between the gravitational
fields of alternative states reaches the level identified by Penrose. This
limitation makes the theory consistent with data for atomic systems,
while producing differences from conventional theory that should
already be detectable for large conscious systems. Note that this differs
from Penrose (2004) for which only the size of the system, not its
consciousness, is relevant, a criterion that may already be falsified
(Schlosshauer, 2006)”, c.f., A New Quantum Theoretical Framework for
Parapsychology (2008).




“A song for remembering lost loved ones, both departed and living.”

We are forced to presume that correlations, and quite strong ones, exist between the
various peaks on the rugged fitness landscape of biological complexity which is scaled by natural selection
acting upon allegedly random mutations.

The engineer is limited in his algorithm composition by Kurt Gödel's Incompleteness Theorem; however, random
mutation is not limited in this way.

There is a point of diminishing returns, perhaps unknown to intelligent design theorists, where the tinkering of
random mutation and natural selection can achieve deeper and subtler complexity than can be achieved by an
engineer, regardless of how talented and gifted that engineer may be.

February 2014 Fcbk=

The philosophically naive error of intelligent design
theorists is their presumption that grammatical chauvinism on the part of
speakers of human languages possesses validity beyond the linguistic
context, in other words, that it is relevant to the description of objective
being. They should remember that the passive construction leaves the
question of a subject wholly indefinite. The adjective "intelligent" may
only be a property of a fundamental process, i.e., one having no
beginning in time. The noun, "design" does not necessarily imply an
activity. An activity does not imply an agent. An agent does not imply
an agency. An agency does not imply a founding. A founding does not
imply a founder. Thanks, Michael! An old chestnut or shall I say

"onion". : p. The motive behind ID is indeed disingenuous aka backdoor
creationism. Believing grammatical relationships to be as robust as
logical or causal implications is my biggest criticism of the purveyors of
ID. . .not their cynicism, which I take for granted. "You can't get
something from nothing . . . unless it's the quantum vacuum." I.e., the
quantum vacuum is "nothing". And "nothing" comes from
nothing... Heisenberg energy uncertainty is the causal basis of
temporality and so could have had no beginning in time. And
"everything" comes from "nothing" because the only difference between
real and virtual particles and fields is (and here's the
rub...*undifferentiated* energy).
There are no things, only "things".

All superposition states are collapsed by consciousness; therefore
consciousness itself cannot enter into a quantum superposition state. But
does this in turn imply that there is only one consciousness, that there
cannot be orthogonal consciousnesses?

Lynn, <being a nut> and <feeling like a nut> are not mutually
exclusive (what quantum physicists term "orthogonal") states and so
cannot enter into an authentically quantum superposition. Also, because
consciousness causes collapse of any and all quantum superposition
states, consciousness cannot itself enter into, or form a part of, a
superposition state. "Quantumness", as a property of brain functioning,
would therefore not be connected in any predictable or law-like fashion
to consciousness. Consequently, a quantum computer could not be
expected to reliably indicate, e.g., lighting up or not lighting up when the
brain or parts of the brain possess a certain type of consciousness (such
as the "para-consciousness" of which you speak). I am uncertain how
you define paranormal so I cannot speak to that part of your possibly
whimsical question.
German idealist philosophy contained the seeds of German nationalism
which reached its full flowering in National Socialism.

Commonsensically, quantum nonlocality and entanglement appear to
be a natural consequence of the notion of intrinsic spin, which is a kind
of inaccessibly internal form of angular momentum which possesses no
equivalent in classical physics.
There are conspiracy theorists who believe that Paul of Tarsus hijacked
Christianity and realized his earlier goal of preventing the establishment
of an earthly kingdom by doing away with the notion of an earthly
kingdom in favor of a spiritual kingdom. Take a look at the following:
The notion informing an aggressively regressive taxation system is that
of performing work within the economy versus performing work upon
the economy. Working within the economy, one finds everything at
every turn facilitating one's productivity and profitability; performing
work upon the economy from outside, one instead meets resistance,
friction and viscosity at every turn as one tries to eke out a small
modicum of disposable income in order to support one's family and
build a modest future.
“Distinct memory/identity states of the observer (that are also
his “states of knowledge”) cannot be superposed: This censorship is
strictly enforced by decoherence and the resulting einselection. Distinct
memory states label and “inhabit” different branches of Everett’s
“Many Worlds” Universe. Persistence of correlations is all that is needed
to recover “familiar reality”. In this manner, the distinction between
epistemology and ontology is washed away: To put it succinctly (Zurek,
1994) there can be no information without representation in physical
states. There is usually no need to trace the collapse all the way to
observer’s memory. It suffices that the states of a decohering system
quickly evolve into mixtures of the preferred (pointer) states. All that
can be
known in principle about a system (or about an observer, also
introspectively, e.g., by the observer himself) is its decoherence-resistant
‘identity tag’ – a description of its einselected state.
“Apart from this essentially negative function of a censor the
environment plays also a very different role of a “broadcasting agent”,
relentlessly cloning the information about the einselected pointer states.
This role of the environment as a witness in determining what exists was
not appreciated until now: Throughout the past two decades, study of
decoherence focused on the effect of the environment on the system.
This has led to a multitude of technical advances we shall review, but it
has also missed one crucial point of paramount conceptual importance:
Observers monitor systems indirectly, by intercepting small fractions of
their environments (e.g., a fraction of the photons that have been
reflected or emitted by the object of interest). Thus, if the understanding
of why we perceive quantum Universe as classical is the principal aim,
study of the nature of accessibility of information spread throughout the
environment should be the focus of attention. This leads one away from
the models of measurement inspired by the “von Neumann chain”
(1932) to studies of information transfer involving branching out
conditional dynamics and the resulting “fan-out” of the information
throughout environment (Zurek, 1983, 1998a, 2000). This new ‘quantum
Darwinism’ view of environment selectively amplifying einselected
pointer observables of the systems of interest is complementary to the
usual image of the environment as the source of perturbations that
destroy quantum coherence of the system. It suggests the redundancy of
the imprint of the system in the environment may be a quantitative
measure of relative objectivity and hence of classicality of quantum
states”, c.f., Decoherence, Einselection and the Quantum Origins of the Classical.
“Hilary Putnam argued that ‘‘something is wrong with the
[conventional] theory.’’ Superposition, an object being simultaneously
in a state A and a state B, a particle behaving as if it goes simultaneously
through slit 1 and slit 2, is the quantum-mechanical measurement
paradox. But conditions in the macroworld are different: In the
macroworld, a cat being both alive and dead at the same time does not

occur; the conditions cannot be superposed. Therefore, Putnam claimed,
the assumptions of conventional quantum mechanics constitute a
contradiction. He noted that Wigner (and Henry Margenau) defended the
adequacy of the received view (quantum jumps, collapse of the state
vector) along a somewhat different line: According to them quantum
mechanics presupposes a cut between the observer and the object. Any
system whatsoever can be taken as the object; however, the observer
himself cannot be included. The observer always treats himself as
possessing definite states which are known to him”, c.f., Wigner’s
‘‘Polanyian’’ Epistemology and the Measurement Problem.
January 2012

Can it be demonstrated that the concept of temporally pure
causation, i.e., pure temporal persistence, i.e., as an isolated system and
in the absence of causal context of a substantive or “beable” is logically
inconsistent? In a physical sense this appears impossible on account of
the grounding of physical temporality in an embedding quantum vacuum
energy fluctuation field (responsible for the decay of energy eigenstates,
transition between energy eigenstates and temporal evolution of density
functions). Could phase rotation of the system wavefunction qualify as
pure, context-free temporal evolution?
“Von Neumann’s Process 1 is the physical aspect of the choice on the
part of the human agent. Its psychologically described aspect is
experienced and described as a focusing of attention and effort on some
intention, and the physically described aspect consists of the associated
choice of the basis vectors and of the timings of the action. Then there is
a feedback quantum jump whose psychologically described aspect is
experienced and described as an increment in knowledge, and whose
physical aspect is a “quantum jump” to a new physical state that is
compatible with that increment in knowledge”, c.f., cit=Gravity and
Consciousness (35700017)

Without quantum entanglement being involved in the act of perception
of the outcome of a quantum experiment (likely through the quantum
vacuum in which the observed system and the observer’s brain are

commonly embedded), there would be the ever-present necessity of the
consciousness (whose brain it is supplying the observation to this
consciousness) revamping its system of interpreting the quantum
behavior of “its brain”, e.g., how does the observer synch up his
subjective perception of “dead cat” vs. “live cat” with the actual
Schrodinger’s cat experimental outcome of “dead cat” vs. “live cat”?
November 2011

It is very significant that the prime phenomenological
manifestation of the “ground of being” in the realm of physical being,
i.e., “quantum entanglement” is just now making itself known. Science
through the emergence of the field of the study of QE appears to be
arriving at a point analogous to that of the dreamer who finds himself at
the cusp between un-self-consciousness and “lucidity”. Science has
finally found one of the “handles” of the “reality bootstrap mechanism”
and so must be especially careful for it is just at the point of the onset of
lucidity that the dream continuum encounters a bifurcation point
between meta- and robust-stability. June 2012 I suspect that many of the
properties of the wavefunction, particularly with respect to the
function’s “fragility” are shared in common with those of the lucid
dreaming state of consciousness and are manifestations of two things: 1)
self-interfering feedback and 2) the limited computing capacity of the
“cosmic CPU”.
May 1997

Inertial mass may be based in the density of momentum
exchanges taking place
between the various subatomic particles and quantum fields composing a
given mass, while gravitational mass may be based in the density of
energy exchanges taking place between these subatomic particles/
quantum fields and the quantum vacuum field. The equivalence of
inertial and gravitational masses may be an artifact of the conservation
of momentum-energy uncertainty or the conservation of virtual
momentum and energy as a momentum-energy fluctuation four vector
within four dimensional spacetime.

“Detailed investigation by H. Poincaré of the Lorentz
invariants resulted in his discovery of the pseudo-Euclidean
geometry of space-time. Precisely on such a basis, he established the
four-dimensionality of physical quantities: force, velocity, momentum,
current. Poincare’s first short work appeared in the reports of the French
Academy of sciences before A. Einstein’s work was even submitted
for publication. That work contained an accurate and rigorous solution
of the problem of electrodynamics of moving bodies, and at the same
time it extended the Lorentz transformations to all natural forces, of
whatever origin they might be”, c.f., The Theory of Gravity by A. A. Logunov.
Relativistic mass increase may be associated with a shift in the relative
densities of 3-momentum and imaginary 4-momentum exchanges taking
place between the mass and itself and between the mass and its
embedding quantum vacuum, respectively. (This remark added March
1, 2006 in terms of my “old parlance” of May 1997) June 2011 The reason
for the equality of inertial and gravitational mass is to be found in the
happy coincidence of there being a common mechanism between mass’
effect upon the vacuum and vacuum’s effect upon mass. @$ Mass is an
effect of gravity’s interaction with the vacuum *and* gravity is an effect
of mass’ interaction with this same vacuum.
July 1997

It is only nonzero expectation values of momentum-energy which
may possess gravitational or inertial mass. And what contributes to this
mass is any boundary conditions placed upon the quantum vacuum field
which alters this field so that the 3-momentum fluctuations and
imaginary 4-momentum fluctuations do not precisely cancel. June 1998 The
expectation values may always be defined in terms of a fluctuation term
and an uncertainty. This fluctuation term may be intrinsic to the
quantum vacuum field and the uncertainty may be associated with the
observer of the quantum system. Through a kind of coherence (or
resonance) between the intrinsic vacuum fluctuation term and the
observer's uncertainty, the emergence of a nonzero expectation value,
i.e., a classical observable, may emerge. There is no reason why we
cannot attribute the entirety of the term, ΔE, to the observer performing

the energy-determining measurement. But to do so means that one is
considering the Heisenberg Uncertainty Principle (HUP) to be entirely
epistemological in nature. To attribute this uncertainty entirely to the
quantum system itself is to maintain that the Heisenberg uncertainty is
ontological in nature. This alternative interpretation of the HUP is not
feasible, however, as the observer’s brain equally constitutes a quantum
mechanical system just as does the system he is observing or performing
measurements upon. The perhaps more reasonable approach to
interpreting the HUP might be to compromise between the two extremes
by admitting that there is a dynamic interrelationship between the
uncertainties of both the observer’s brain and the quantum system he is observing.
March 2006

It might be supposed that the manner in which the world
appears, i.e., is perceived by the observer is merely a collaborative
product of the observer and the world through the quantum interference
between the correlated quantum fluctuations of the observer’s brain and
the correlated quantum fluctuations constituting the system being
observed. But this cannot ultimately be a correct description of the
mechanism of conscious observation because the quantum correlated
fluctuational structure of the observer’s brain possesses this “necessary
interiority” by virtue of outstripping the brain’s embedding quantum
vacuum’s computational capacity qua quantum computer. March 1, 2006
There must arise an exact matching or mutual coherence of internal and
external frequencies for an object to become manifest.
Conscious manifestation of form within the observer’s perceptions of his
environment is supported by the irreducible complexity of the quantum
fluctuation-correlational structure, c.f., Rezik’s Three-way Vacuum
Nonlocality, which we have said engenders interiority to this quantum
fluctuation structure that cannot be modeled within the simpler
fluctuation structure of the brain’s embedding in a quantum vacuum
substrate, November 2011 c.f., au=Plotinus in cit=Ennead IV.4.23, 4–19: “Well,
then, the soul will either apprehend alone by itself or in company with
something else. But how can it do this when it is alone and by itself?

For when it is by itself it apprehends what is in itself, and is pure
thought. If it also apprehends other things [i.e. sensibles], it must first
have taken possession of them as well, either by becoming assimilated to
them, or by keeping company with something which has been
assimilated. But it cannot be assimilated while it remains in itself. For
how could a point be assimilated to a line? For even the intelligible line
would not assimilate to the sensible one, nor would the intelligible fire
or man assimilate to the sense-perceived fire or man. . . . But when the
soul is alone, even if it is possible for it to direct its attention to the
world of sense, it will end with an understanding of the intelligible; what
is perceived by sense will escape it, as it has nothing with which to grasp
it. Since also when the soul sees the visible object from a distance,
however much it is a form which comes to it, that which reaches it,
though it starts by being in a way without parts, ends in the substrate
which the soul sees as color and shape with the extension it has out
there.” July 2011 If the substrate of being is information rather than
independently existing fundamental particles, then there is really no
impasse posed by it being otherwise necessary for matter to “bootstrap”
itself into greater complexity. It is tempting to suppose that vacuum
3-way and higher-order nonlocality is the source of the 2-way nonlocality of
all quantum mechanical states/systems. The interiority of subjectivity
would on this view not be a truly emergent quality of biological systems,
but merely the awareness of this interiority would be. November 2012 The
dependence of n-way nonlocality for smaller “n” on n-way nonlocality
for larger “n” constitutes a kind of causal supervenience placed within
the context of quantum entanglement and correlational structures and
reminds us of holographic time scale reductionism, i.e., in which
structures occupying smaller intervals of time are conditioned by
structures occupying larger intervals of time. This “entanglement
supervenience” for want of a more convenient term intuitively points up
Kauffman’s fundamentality or principle of the fundamentalness (along
with subatomic particles and fields) of consciousness as such (as
opposed to any particular consciousness).
One way to model the relativistic effects of mass, length and time might

be to think of a mass as the result of the mutual interference of myriad
alternate universe copies of the abstract objects possessing the formal
structure which the mass exhibits upon analysis.
The greater the
coherence of the MWI (many worlds interpretation) quantum copies of
the object, that is, the more closely the copies mutually interfere, the
greater is the mass of the object, the more dilated is the external time
(and more contracted is the internal time) of the mass. In this way
Penrose’s connection between gravity and wavefunction collapse may
be explored further. March 2006 Perhaps this is better put in terms of the
greater the density of cohering of (and mutually interfering, i.e.,
quantum correlated) MWI quantum near duplicates, the greater is the
object’s inertial mass.
These frequencies, or their spectra, would be associated with the
fluctuations in the vacuum's intrinsic energy and with fluctuations in the
ability of an ideal observer to determine the system's energy
independently of the effect of the vacuum energy fluctuations,
respectively. We are basing this idea of observer-system resonance as
the basis of perception on an idea expressed by David Bohm in his book,
Quantum Theory. According to au=Bohm (1951), @$the fluctuations in
an observable, in combination with the correlations of the phases of
these quantum fluctuations, together comprise the average values and
average temporal evolution of any observable. An act of observation
has the effect of destroying the delicate phase relations between the
eigenfunctions, the product of which constitute the pure state
wavefunction representing the state of the system. July 2011 (Can changes
in the entanglement signature of nonlocally correlated fluctuations
induced by acts of conscious observation impact causal relationships?
Conversely, can causally induced changes to the system’s momentum
and energy be traced to corresponding changes in nonlocally correlated
fluctuations in the system’s momentum and energy?)
This is reflected in the instantaneous shift in the values of all
incompatible observables relative to the observable the value of which is
being more fully determined as a result of observation. The effect of the

fluctuation energy upon our energy measuring devices is, of course, an
effect which even the perfect calibration of our energy-measuring
instruments cannot in principle eradicate.
If the observer’s
consciousness inevitably induces collapse of the wavefunction for the
system he’s observing, then this is perhaps because: @$1) The dynamics
of the observer’s conscious mental processes is fundamentally quantum
mechanical in nature and 2) The mental processes of the observer are
quantum entangled with those of the system under observation.
Mass-energy is a result of an imbalance in these two energy terms. In
this way particles are seen to be not flux-stabilities in themselves, but
structured alterations in the flux-stabilities as a result of the influence,
penultimately, of our energy measuring devices - ultimately per von
Neumann - the influence of not the individual mind per se but the
consciousness, fundamental in nature, which is structured through the
complex system of boundary conditions upon the very same vacuum
field being measured (in essence) constituted from the operation of the
observer's brain, since the existence of the brain as a mass-energy
system, would otherwise presuppose, if identified with the observer's
individual consciousness, the existence of that which its observations are
partially constituting. @$“If reality is this second way, then the role of
the neuronal system is not to mysteriously create awareness and mind
from alien substance. Rather, it is to organize a pre-existing propensity
for awareness into useful, functional awareness, and provide for its
modulation by useful information”, c.f., cit=Implications of a
Fundamental Consciousness, au=MacDonald (1998).
"The mere
possibility of observation results in the reduction of the state vector."
November 2007
If a great enough interlocking feedback between such
possibilities comes about which then alters the statistics of matter and
energy (including the embedding vacuum energy field), which results in
a great enough contraction/collapse in the density rate of change in these
state vector reductions (through the conversion of disjoint states into
correlated mixtures), producing an overall coherent state, then a barrier
will spontaneously be created between internal and external, i.e., a
rudimentary real, as opposed to a mere hypothetical, possible observer

will be engendered (consider here the necessary interiority of the brain
of the quantum observer), c.f., so-called “Boltzmann brains.” June 2012
“We would like to argue that this is not the case. Suppose we do not
look at the whole box at once, but only at a piece of the box. Then, at a
certain moment, suppose we discover a certain amount of order. In this
little piece, white and black are separate. What should we deduce about
the condition in places where we have not yet looked? If we really
believe that the order arose from complete disorder by a fluctuation, we
must surely take the most likely fluctuation which could produce it, and
the most likely condition is not that the rest of it has also become
disentangled! Therefore, from the hypothesis that the world is a
fluctuation, all of the predictions are that if we look at a part of the world
we have never seen before, we will find it mixed up, and not like the
piece we just looked at. If our order were due to a fluctuation, we would
not expect order anywhere but where we have just noticed it”, c.f.,
Richard Feynman on Boltzmann Brains by au=Sean Carroll. “Sean’s
“idea,” in a nutshell, is that the large-field instability of the Standard
Model Higgs potential — if the top mass is a little heavier than current
observations tell us that it is — is a “feature”: our (“false”) vacuum will
eventually decay (with a mean lifetime somewhere on the order of the
age of the universe), saving us from being Boltzmann brains. This is
plainly nuts. How can a phase transition that may or may not take
place, billions of years in the future, affect anything that we measure in
the here-and-now? And, if it doesn’t affect anything in the present, why
do I #%@^} care? The whole Boltzmann brain “paradox” is a category
error, anyway”, c.f.,
“In many scenarios involving universes of infinite size or
duration, Boltzmann brains are infinitely more common than human
beings who arise in the ordinary way. Thus we should expect to be
Boltzmann brains, in contradiction to observation.[italics, mine]” But
this is an unwarranted assumption on the authors’ part, as there is
essentially no way to distinguish the case of one’s presently being a
Boltzmann brain versus being a really existing human person.
September 2014
“Furthermore, there is a simple test to see whether you are a Boltzmann
brain. Wait 1 second and see if you still exist. Most Boltzmann brains
are momentary fluctuations. So the prediction of the above argument is
that you will vanish in the next second. When you don’t, you conclude
that this argument has made a severely wrong prediction.1”
1 “If you are concerned with the fact that you could never observe your own ceasing to exist, you can change
the argument to say that the theory that you are a Boltzmann brain predicts that your observations of the
external world are coherent only by chance, and that subsequent observations will not remain coherent.”

However, perhaps by invoking a sufficiently general @$ prn=Kantian
anthropic principle, such a handy observational test may not be available
to a Boltzmann brain, especially if there is an intersubjective principle of
mental continuity, i.e., between and across distinct Boltzmann brains,
say via some extrapolation of quantum entanglement, c.f., the arXiv article by Davenport and
Olum, Are there Boltzmann brains in the vacuum? But because of the
necessary intentionality of consciousness and the reigning logic of
Wittgenstein’s Private Language Argument as the antidote to solipsism,
we may have a solution based upon the inherent inadequacy of a closed
system as a generator of conscious states: a Boltzmann brain qua
vacuum fluctuation must not merely carry the signature of contextual
quantum entanglement, but must be actually and in concrete fact
quantum entangled with an open-ended external realm, but not just any
such, but one that possesses the unity and coherence of a really existing
universe. On this view, substance and continuity are necessary
components of the requisite topology of really existing mind as
embedded within really existing world.

The Boltzmann-Schuetz cosmology is a weird idea, but weirdness
alone ought not be held as an objection. There is a consequence,
however, that Boltzmann seems not to have noticed. On such a scenario,
the vast majority of occurrences of a given nonmaximal level of entropy
would be near a local entropy minimum, and so “One should regard it as
overwhelmingly probable that, even given our current experience,

entropy increases towards the past as well as the future, and everything
that seems to be a record of a lower entropy past is itself the product of a
random fluctuation (rather than genuinely causally antecedent).
Moreover, you should take yourself to be whatever the minimal physical
system is that is capable of supporting experiences like yours; apparent
experiences of being surrounded by an abundance of low-entropy matter
are illusory. That is, you should take yourself to be what has been called
a Boltzmann brain. (Footnote: “The term is due to Andreas Albrecht. It
first appears in print in Albrecht and Sorbo (2004). The consequence of
the Boltzmann-Schuetz cosmology, that we should take the fluctuation
we are in to be no larger than necessary, seems to have been first pointed
out by Arthur Eddington.)”, c.f.,
Notes on Thermodynamics and Statistical (Myrvold 2013)
By the equivalence principle, fermion-antifermion production in a
gravitational field should not exist for a freely falling observer. But
neither should this freely falling observer witness a blackbody spectrum
of photons. Since through the (likely identical) particle production
mechanisms underlying both Hawking radiation and Davies-Unruh
radiation, the fermion-antifermion and boson particle production fields
are observer-dependent in their intensities, there should be an invariant
transformation rule by which we can connect the respective particle
production rates for bosons and fermion-antifermion pairs out of the
vacuum. November 2007 This reminds us of the equilibrium principle
manifest in the relationship of the charged particle and its electric field
during free fall – a charged particle accelerates through an electric
potential in such a manner that the particle’s electric field appears to it
(relative to the particle’s instantaneous Lorentz frame) to be spherically
distributed about itself.
This invariance is probably not that of simple Lorentz invariance because
the observer-dependent shift in intensities/current densities of the
particle production is dependent upon the acceleration of the observer,
not on his relative velocity. For instance, the masses of the particles

produced, in the case of fermion-antifermion production, varies in an
opposing sense to the manner in which the fermion-antifermion rate
alters due to an arbitrary Lorenz-transformation of the gravitational field
engendering the enhanced f(+)/f(-) production. And this occurs just in
such a manner that the mass-creation rate for f(+)/f(-)'s remains constant.
This would seem to imply that the concept of mass is absolute within the
Theory of Relativity. The 4-volume in which f(+)/f(-)
creation/annihilation is taking place within this gravitational field is also
unaffected by an arbitrary Lorentz transformation, since the length
contraction and time dilation take place in opposite senses as well. In
this way, the mass creation rate for f(+)/f(-)'s divided by the local
4-volume we are considering, i.e., the 4-current density of the general
relativistic particle production, is conserved under an arbitrary
Lorentz transformation of the gravitational field inducing the particle
production field.
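The compensation of length contraction and time dilation invoked here is the standard invariance of the 4-volume element under a boost; as a supporting step (a textbook result, not part of the original note):

```latex
% Invariance of the 4-volume element under a Lorentz boost:
% the contracted length and the dilated time compensate exactly.
\[
dV'\,dt' \;=\; \Bigl(\frac{dV}{\gamma}\Bigr)\,(\gamma\,dt) \;=\; dV\,dt,
\qquad \gamma = \bigl(1 - v^2/c^2\bigr)^{-1/2}.
\]
% Equivalently, the Jacobian determinant of any Lorentz transformation
% is unity, so a rate per unit 4-volume keeps its normalization.
```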
Returning to our first
analogy, this exchange of information is not actually occurring among
the pixels (as was the case for the gnats), but is, for the greater part,
occurring within the CPU itself, that is, between its individual circuit
elements; in small part, this exchange of energy/information is taking
place between the CPU and the pixels on the screen it is controlling.
The greater the ratio of information exchanges taking place between the
CPU and itself relative to those taking place between the CPU and
the screen, the slower will be the maximum permissible velocity across
the screen for an object represented on this screen, assuming an absolute
clock rate for the CPU (akin to the notion of cosmological proper time).
A similar statement would apply to the "acceleration" of the cursor
across the screen - the larger the group of pixels which one wishes to
simultaneously move across the screen, the smaller will be the maximum
acceleration attainable by the coherent group of pixels represented by
the cursor.
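A toy model of this claim (every name and formula below is my own illustrative assumption, not something given in the text): if the CPU runs at a fixed clock rate and only a fraction of its cycles go to screen updates, the displacement available to a coherent group of pixels per unit "cosmological" time shrinks as the internal-to-screen ratio and the group size grow.

```python
def max_speed(clock_hz, internal_to_screen_ratio, group_size):
    """Pixels of displacement per second available to a coherent group.

    Toy assumptions: each screen-directed cycle can move the whole
    group by one pixel, and cycles divide between internal work and
    screen updates in the given ratio.
    """
    screen_fraction = 1.0 / (1.0 + internal_to_screen_ratio)
    return clock_hz * screen_fraction / group_size

# A larger internal-to-screen ratio, or a larger "cursor", lowers the cap:
v1 = max_speed(1e9, internal_to_screen_ratio=10, group_size=100)
v2 = max_speed(1e9, internal_to_screen_ratio=1000, group_size=100)
v3 = max_speed(1e9, internal_to_screen_ratio=10, group_size=10000)
```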
The quantum principle of the identity of indiscernibles is weakened for
composite objects, which is related to the action of the 2nd Law of
Thermodynamics. The discernibility of composite objects
grows sharper as each object develops its own unique history (as
opposed to mere “trajectory”), and the failure of this quantum principle,
i.e., the identity of indiscernibles, in view of the 2nd Law is
bound up in the vacuum-energy/cosmological-constant paradox
(non-gravitating vacuum energy).
It is moreover bound up with the
important distinction of data and information. The acceleration of an
object changes the state of the object in a way that must be reconciled
not only with boundary but also with the initial conditions by which the
object was originally constituted as well. Thus gravitational fields must
be capable of possessing nonlocal components. 03/01/06 We recall here
that a Bohmian-style interpretation of the EPR experiment with a
Stern-Gerlach device calls for backward-in-time signaling linking the
separately measured particles with their earlier state as part of a
composite spin-0 particle just prior to this particle’s spontaneous
decay. 03/01/06
If we are looking for something to play the role of "mass" within our
computer analogy we would do so in vain unless we modify somewhat
(in a way which doesn't render our analogy useless, I think) the
programming of the software driving the computer monitor (output
device). If we were to think of the quantum vacuum field as generating
and sustaining all of the various "forms" such as all of the particles and
fields of spacetime, doing this in a manner exactly paralleling that in
which a CPU creates/sustains all of the digital graphical representations
appearing on a computer screen, then the suggestion arises that perhaps
there is not only a maximum possible velocity, but also a maximum
possible acceleration through spacetime. An obvious choice for this
maximum acceleration is simply c²/L_Planck. (But certainly the maximum
acceleration becomes smaller in magnitude as we treat progressively
more complex systems.) An equivalent representation of this limit is
c/t_Planck. And, of course, we are thinking of L_Planck as the dimension of a
three dimensional "pixel" composing the spatial part of global
spacetime, while (t_Planck)⁻¹ represents the clock rate of the "global
spacetime central processing unit (CPU)," i.e. the global quantum
mechanical vacuum.
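A quick sanity check on these two representations of the limit (the numerical constants are assumed standard values, not quantities from the text):

```python
c = 2.998e8          # speed of light, m/s (assumed standard value)
L_planck = 1.616e-35 # Planck length, m (assumed standard value)
t_planck = 5.391e-44 # Planck time, s (assumed standard value)

a_max_1 = c**2 / L_planck  # first representation of the maximum acceleration
a_max_2 = c / t_planck     # equivalent representation

# The two agree because L_planck = c * t_planck; both are ~5.6e51 m/s^2.
```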
We stated earlier that the temporality of a

quantum mechanical system is owing entirely to the presence of energy
uncertainty within this system. 03/01/06 (It should be noted here that
different energy uncertainties of identical magnitude might represent
different quantities of information upon interaction with the environment
due to their possessing distinct quantum fluctuation-correlational
structures.) We now realize that the temporality of quantum
mechanical systems owes to the interaction of this system with the
fluctuating quantum mechanical vacuum; consequently, the rate at which
time passes within a given region of spacetime is a function of the
energy density of the vacuum within this region. We have proposed
that the inertial mass of a body is directly related to its binding energy
due to nongravitational forces. This is a seeming paradox since binding
energy is negative and should result in an overall reduction in the inertial
mass (positive energy) of the body.
May 1997

An example of a change only in gravitational binding energy is
when an increase in negative binding energy resulting from the action of
gravitation alone is exactly counterbalanced by the general relativistic
increase in the mass-energy of the body. To wit, here we have increased
the gravitational binding energy of a body without having affected the
total energy, and hence the inertial mass, of the body.
When the density of a given region of space increases, the result is not
merely a simple decrease in the energy density of the vacuum.
Rather, the momentum current density tensor, which is diagonal in
free space, experiences a shuffling of its components so that it is no
longer diagonal with respect to a free-space Minkowski spacetime.
August 1996

There is another way of looking at the phenomenon of inertia in
terms of how the spins of real bosons (integral spin) and real fermions
(half-integral spin) couple to the spins of virtual bosons and fermions.
The particle manifestations of the vacuum momentum-energy
fluctuations may be incorporated into the view of the earlier stated

mechanism of inertia/gravitation alluded to in the paragraph
immediately above. It is through the spin-coupling of real and virtual
particles that the momentum current density components are altered
from their diagonal 2nd rank tensor distribution to a non-diagonal
component distribution of this momentum energy which underlies
manifest gravitational fields. The theory of "squeezed states," where the
uncertainties in momentum along a particular axis are increased by
borrowing momentum uncertainty from along other orthogonal axes,
may provide the necessary mathematical framework within which the
effects of matter upon vacuum momentum-energy uncertainty may be
adequately described: matter affects the quantum vacuum by inducing a
broadening of the vacuum's momentum uncertainty, i.e., its momentum
fluctuation spectrum, by utilizing fluctuation energy provided by the
vacuum's uncertain energy, which is decreasing in step at the same time.
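As a minimal numerical illustration of the squeezed-state bookkeeping invoked here (the squeeze parameter r and the vacuum quadrature values are textbook conventions, not quantities from the text): squeezing scales one quadrature's uncertainty up by e^r and the conjugate one down by e^(-r), leaving the product at the Heisenberg minimum.

```python
import math

hbar = 1.0545718e-34  # J*s (assumed standard value)

def squeezed_uncertainties(r):
    """Quadrature uncertainties of a squeezed vacuum state (textbook result).

    One quadrature shrinks by e^-r while its conjugate grows by e^r;
    the product stays at the minimum-uncertainty value hbar/2.
    """
    d0 = math.sqrt(hbar / 2.0)  # vacuum value in natural quadrature units
    return d0 * math.exp(-r), d0 * math.exp(r)

dx, dp = squeezed_uncertainties(1.5)
product = dx * dp  # invariant under squeezing: equals hbar/2
```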
August 1996

This may be described in terms of the rotation of the matter +
vacuum momentum current density tensor. A second rank tensor
multiplied by this diagonal momentum current density four vector would
produce the appropriate connection between this four vector at points in
spacetime infinitesimally contiguous with one another. Such a 2nd rank
tensor must somehow be assimilated to the metric tensor of general
relativity. Instantaneous correlations would manifest themselves as a
departure from locally deterministic causality and could constitute an
explanation for the existence of Heisenberg energy uncertainty. I think
there is no doubt that if the basic framework of Special Relativity is to
be maintained, then we are forced to accept an origin for nonlocal
correlations which lies completely outside four dimensional spacetime
or at least outside the local absolute past/future of the best possible
causal chain. November 2007 These nonlocal correlations must always
underlie causal interactions and so can never be explicable in terms of
them. We know that a photon traveling through free space experiences
acceleration due to the cosmological expansion, and that this
acceleration is equal to Hc, where H is Hubble's constant and c is the
speed of light in vacuum.
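Putting rough numbers to this (H is not specified in the text; I assume the conventional ~70 km/s/Mpc, so this is an order-of-magnitude sketch): Hc comes out near 10⁻⁹ m/s², which is also the scale of the anomalous Pioneer acceleration the text goes on to mention.

```python
c = 2.998e8    # speed of light, m/s
Mpc = 3.086e22 # meters per megaparsec
H = 70e3 / Mpc # Hubble constant, assumed ~70 km/s/Mpc, converted to 1/s

a_cosmo = H * c  # cosmological acceleration Hc, ~7e-10 m/s^2
```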
Therefore, if a spherical mass is
instantaneously converted into pure energy, i.e., photons, the photons

will instantly and collectively exert a positive pressure.
Consequently, the vacuum must exert a pressure upon spherical masses
which is equal and opposite to this quantity. This notion should
be investigated further in connection with the Pioneer Anomaly, i.e., the
anomalous component of acceleration that some physicists have
connected to either a cosmological constant or to the cosmological
acceleration field in some other sense, dark energy. November 2007 Some
theoretical evidence for this claim can be provided by the calculation of
a work-energy integral. This integral is ultimately motivated by an
extension of the equipartition theorem of kinetic gas theory to the
cosmological distribution of energy in the Universe. This question will
be addressed at a later occasion, however. As for the integral itself, it is
used to calculate the work which the Universe performed on some small
volume of energy as the energy density of this volume decreased from a
very high value early in the history of the Universe (say in the first few
seconds) until the present epoch of cosmological expansion when the
energy density of this volume has become almost negligible. One may
think of this work as being performed on this volume by some
cosmological acceleration force field and if we assume that this tiny
volume managed to hold itself together without expanding throughout
the entire expansion phase, then this volume must have exerted a force
upon the Universe equal and opposite to the cosmological force which
was attempting to spread it apart. Conservation of momentum holds for
the combined mass-energy/vacuum-energy system so that there is a
balancing of the force of the Hubble cosmological force field acting
upon the vacuum and the gravitational force of the vacuum acting upon
the total matter distribution of the Universe. We may even say that the
vacuum's gravitational field is simply a reaction force produced by the
tendency of the Hubble cosmological acceleration force to alter the
momentum of the vacuum. This reaction force acts to conserve the
momentum of the vacuum energy field. This action - reaction force
relationship is expressed by the equation given below,

H²r × E_v = E_0 × GM/R²,
where GM/R² = the acceleration field produced by the vacuum's
gravitational field. The gravitational field of matter distributions is not
an inherent property of these distributions, but must be conceived along
the same general lines as the electrical repulsive force between
dislocations or holes in an otherwise electrically neutral crystalline
matrix. This idea is more or less captured by the following relationship,
(E_v/E_0) × H²r = {(c²/r) × (1 − E/E_v) − c²/r} = g,
where the term, c²/r, is the acceleration field produced by the vacuum
reaction - force which compensates the action of the Hubble
cosmological acceleration force upon the vacuum energy field. Hc is the
cosmological acceleration field, which acts upon freely moving photons,
and implies the existence of a precisely balancing and opposing reactive
force upon the particles forming a bound matter distribution. Let us
assume that this tiny volume is that occupied by a neutron and that the
work-cycle is to begin at an early epoch in the Universe's expansion
when the average energy density of the vacuum was equal to that of the
neutron itself: approximately 10³³ Joules/m³. The work integral is
defined to be:
W = ∫ P dV,
where the limits of integration are to be from P_i to P_f, and where P
and ρ are the pressure and energy density of the vacuum, respectively.

We will at first define the work integral in terms of F·dr. F is just the
cosmological acceleration force acting on the tiny volume and F =
MH²r, where r is the radius of the volume, M is the mass contained
within the volume and H is Hubble's constant. If the volume were to

expand exactly in step with the expansion of space in its immediate local
region, then the surface of the volume would move with respect to its
center (chosen as coordinate origin) with velocity Hr, and the
acceleration of this surface with respect to the chosen origin would be
d/dt[Hr] = H[dr/dt] = H²r, so that, again, F = MH²r. The work integral
is then
W = ∫ MH²r dr,
between the distance limits R_i = R_neutron and R_f = R_universe. To transform
this work integral into one in terms of pressure and volume rather than
force and distance involves defining the parameters ρ and dρ, i.e., mass
density and differential of mass density, respectively.
ρ = 3M/(4πR³) ⟹ R = [3M/(4πρ)]^(1/3), and
dR = −(1/3) × [3M/(4π)]^(1/3) × ρ^(−4/3) dρ.
If one performs this work integral one finds that the energy necessary to
expand a neutron-packet of unbound neutron mass-energy is precisely,
E = G(M_neutron)²/R,
so that the energy necessary to prevent this expansion is
E = −G(M_neutron)²/R.
This result is, of course, provided that the mass density of the universe is
given by the formula,
ρ = 3H²/(4πG).
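The claimed result can be checked numerically (the neutron radius and the log-spaced quadrature below are my own choices for the check; F = MH²r, ρ = 3M/(4πR³), and the flatness relation, which I read as ρ = 3H²/(4πG), are the relations used in the text). Substituting the density relations into F in fact collapses the integrand to GM²/r², whose integral from R_neutron to R_universe is G·M²/R_neutron to high accuracy:

```python
import math

G = 6.674e-11  # m^3 kg^-1 s^-2 (assumed standard value)
M = 1.675e-27  # neutron mass, kg (assumed standard value)
R_n = 0.8e-15  # assumed neutron radius, m
R_u = 1.1e26   # radius of the observable Universe used in the text, m

def integrand(r):
    # F = M * H^2 * r with H^2 = 4*pi*G*rho/3 and rho = 3M/(4*pi*r^3);
    # algebraically this reduces to G * M**2 / r**2.
    rho = 3.0 * M / (4.0 * math.pi * r**3)
    H2 = 4.0 * math.pi * G * rho / 3.0
    return M * H2 * r

# Log-spaced trapezoidal quadrature across ~41 decades of r:
n = 4000
us = [math.log(R_n) + (math.log(R_u) - math.log(R_n)) * i / n
      for i in range(n + 1)]
work = 0.0
for u0, u1 in zip(us, us[1:]):
    r0, r1 = math.exp(u0), math.exp(u1)
    f0, f1 = integrand(r0) * r0, integrand(r1) * r1  # dr = r du
    work += 0.5 * (f0 + f1) * (u1 - u0)

expected = G * M**2 / R_n  # the result claimed in the text
```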

This formula for the mass density of the Universe is implied by an
equality of magnitude of the kinetic energy and gravitational binding

energy of the Universe as a whole. This equality constitutes a proposed
solution to the so-called "flatness problem" of cosmological theory.
Alan Guth's Inflationary Theory was originally proposed to solve,
essentially, just this cosmological problem. A rough and ready
definition of the flatness problem is the nearly exact equality between
the Universe's expansion velocity and its "escape velocity" - of
somewhere between 1 part in 10¹² and 1 part in 10⁶⁰, depending upon
which sources in the literature are cited; the problem is not that this
exact ratio conflicts with the standard "Big Bang" cosmological model,
but rather that it is obviously a non-arbitrary (structural) feature of the
Cosmos of which the model can offer no meaningful explanation.
Proponents of the Standard Model are forced to lump this fact in with
the other initial conditions which were set at the beginning of the
Universe's expansion, e.g., extremely low entropy, and which physical
science cannot explain, such as the fundamental physical constants and
the Universe's initial mass. Other proponents of this model invoke the
Anthropic Cosmological Principle to explain this ratio. Its argument
goes like the following: if the ratio of escape velocity vs. expansion
velocity is too much greater than 1, then the Universe would have
already re-collapsed by this time and we would not be here; on the other
hand, if the ratio is too much less than 1, then the density of the universe
would not have been great enough, for long enough, to allow the
formation of stars and galaxies so that yet again we would not be here to
worry about the question. But the Anthropic Cosmological Principle
cannot explain the exactness with which this ratio approaches unity, but
can only provide relatively crude limits on either side of this ratio, say
between 0.9 and 1.1, conceivably these limits could be one order of
magnitude smaller, say between 0.99 and 1.01 - this shaves off only 1
or 2 orders of magnitude and there are still at least 11 more orders of
magnitude in need of explanation. 1 October 2011 A related problem in

cosmology and one which is not currently thought to pertain to the
sensitivity of initial and boundary conditions is that of the stupendous
mismatch between the predicted and observed energy densities of the
vacuum represented by the “cosmological constant”: cosmological
theory predicts a value for this constant on the order of 10⁻²⁶ kg/m³ while
quantum theory predicts a value on the order of 10⁹⁵ kg/m³, a discrepancy
of approximately 120 orders of magnitude! When one considers, on the
one hand, that the densities of the mass energy and vacuum energy
(cosmological prediction) are of the same magnitude, if not almost
precisely equal (the condition for a “flat” universe) and on the other,
that, if gravity were to be attributed, not to the absolute energy density of
the vacuum, but to the difference between two large energy densities
approximating one another by an order of magnitude and constituting
two distinct components of the enormous vacuum energy density
predicted by quantum mechanics, then, if this energy density difference
is attributed to an “effective energy density” for the vacuum, i.e., that to
which Einstein’s equivalence principle applies, then in turn we would be
able to explain away the enormous energy density predicted by quantum
theory for the vacuum using a logic similar to that of renormalization
theory. The cost of all this is perhaps only an aesthetic one for purist
relativists, that of the seeming loss of theoretical elegance and simplicity
in the form of a considerably reduced universality of Einstein’s strong
equivalence principle. Note that the effective energy density in this
approach to solving the “cosmological constant problem” is now
comparable to that of the cosmologically predicted mass density. The
above approach does not seem so farfetched once one considers that the
starting mass density of the universe must have been very sensitively
determined by initial and boundary conditions upon the cosmological
constant itself. An additional consideration here is that a so-called flat
universe also requires that the density of the gravitational binding energy

of the Universe be equal to its mass energy density, cf. the derivation
above for the equality of, e.g., the neutron’s mass and gravitational
binding energy densities. On account of the combined principles of
equipartition of energy between degrees of freedom and conservation of
quantum entanglement between the two complementary components of
the fluctuation momentum-energy of the quantum vacuum, which
together make up its total energy density *(zero-point energy), we
suspect that it is these two components of the physical vacuum, which
are placed out of mutual balance by the perturbing effect of inertial
mass, while the back reaction of this induced vacuum momentum-energy
imbalance upon mass constitutes the origin of gravitational mass.

(zero-point energy as dictated by the fact that not only are momentum
and position “incompatible observables” - ΔpΔx > h, but all functions of
momentum are similarly incompatible with all functions of position, and
so the kinetic and potential energies are mutually incompatible, i.e., H =
f(p) + f(x) = total energy, which must fluctuate due to Δf(p)Δf(x) > f(h),
where h is Planck’s constant and H is the total energy Hamiltonian).

Here the mismatch is still greater. But invoking the so-called
Solipsistic Cosmological Principle may well place far tighter constraints
upon the value of the cosmological constant than does a mere Anthropic
Cosmological Principle.
“The strong anthropic principle (SAP) as explained by Barrow and
Tipler (see variants) states that this is all the case because
the Universe is compelled, in some sense, for conscious life to
eventually emerge. Critics of the SAP argue in favor of a weak anthropic
principle (WAP) similar to the one defined by Brandon Carter, which
states that the universe's ostensible fine tuning is the result of selection
bias: i.e., only in a universe capable of eventually supporting life will
there be living beings capable of observing any such fine tuning, while a

universe less compatible with life will go unbeheld.” The above is from
the wikipedia entry on the anthropic principle. Making a few
substitutions to convert from the general to the particular, we have the
following: “The strong anthropic principle (SAP) as explained
by Barrow and Tipler (see variants) states that this is all the case because the
Universe is compelled, in some sense, for my conscious life to
eventually emerge. Critics of the SAP argue in favor of a weak anthropic
principle (WAP) similar to the one defined by Brandon Carter, which states
that the universe's ostensible fine tuning is the result of selection bias: i.e., only in a
universe capable of eventually supporting” my life “will there be” me,
“capable of observing any such” ultra-fine “tuning, while a universe less
compatible with my life will go unbeheld” by me. The key here is that,
the physical constants of the universe, e.g., gravitational constant,
Planck's constant, electron mass, electron charge, fine structure constant,
speed of light, Bohr magneton, electric field permittivity constant, etc.,
ad nauseam, must all be fine-tuned to six decimal places for
carbon-based life to be possible. However, for something very similar to
humanity to be possible, then these constants must be tweaked to
perhaps eight decimal places. The catch: these constants might need to
be tweaked to 12 or more decimal places for creation to be tuned to the
precise information frequency spectrum of my consciousness for it to be
put into operation!

November 2007

The theory which I propose, however, which can be
considered to be an extension of Van Flandern's S-hypothesis, explains
this ratio not as an arbitrary initial condition, but as a necessary feature
of any universe where the energy of cosmological expansion drives the
forces of the universe's gravitation; to wit, if the expansion velocity of

the Universe were greater than it is, then the energy of its expansion
would be greater and hence the gravitational energy of the Universe
would be correspondingly increased such that the ratio of unity would be
maintained. This postulate has a favorable bearing on many other
unsolved problems of cosmological theory. Our postulate states, in
essence, only that the total gravitational potential and kinetic energies
are equal. The postulate is not a mere arbitrary assumption however, as
it is supported by the principle of energy equipartition. However, this
principle can only be applied if it is assumed that there exists some form
of energy which acts as a medium physically linking these two types of
energy, i.e., the energy of position with the energy of momentum, in
order that the equilibrium between them can be maintained through
mutual energy exchanges - in much the same way that the rotational and
vibrational energies of gas molecules maintain a balance through
continual exchange of kinetic energy between these molecules through
random collisions. February 2013 A chaotic, thermalized-entropic Anthropic
proto-consciousness forms the dynamical substrate of this third medium.
Due to the availability of infinite time, Boltzmann brain-like large
entropy fluctuations occasionally occur, say, every quadrillion centuries
or so, producing an intensely focused stream of Anthropic-consciousness
for the cosmological equivalent of a nanosecond – 80 years or so...
February 2013

What happens when a material with a larger thermal
equilibrium time constant thermally interacts or exchanges energy with a
substance possessing a relatively smaller thermal equilibrium time
constant? “According to recent astrophysical observations the large
scale mean pressure of our present Universe is negative suggesting a
positive cosmological constant-like term. The issue of whether
nonperturbative effects of self-interacting quantum fields in curved
space-times may yield a significant contribution is addressed. Focusing
on the trace anomaly of quantum chromodynamics, a preliminary

estimate of the expected order of magnitude yields a remarkable
coincidence with the empirical data, indicating the potential relevance of
this effect”, cf. “Small Cosmological Constant from the QCD Trace
Anomaly?”, Physical Review Letters, Volume 89, Number 8, 19 August
2002.
August 2013

If the maths pointed to Boltzmann brains outnumbering
humans, our theories of space and time could be compromised. That's
because we would no longer be 'typical' observers, and might not have
the ability to see reality from the 'correct' perspective. But according to a
new report by New Scientist, new understandings of string theory and
the theory of multiple universes might just give us an escape clause.
Physicists Claire Zukowski and Raphael Bousso at Berkeley say that the
key to this balance (of us versus the superbrains) is whether or not
universes expand forever and linger, full of Boltzmann brains, for
much longer than creatures like humans would be able to survive, cf.
February 2013
The growing amounts of man-hours and money being
dedicated to developing more efficient search engine software along
with the ever increasing collective number of man-hours being spent on
performing, e.g., google web searches along with the most recent
response to this demand for connected information in the form of
Facebook’s beta test version of graph search, the now long under way
explosive growth of interdisciplinary science, the increasing perception
that the growth paradigm of the 20th Century has had its day and is now
safely retired, improvements in nanotechnological science, the growing
viral nature of information and last but not least the subtly evident
expanding colonization of sociolinguistic consciousness by a syntax of
web search strings which are more and more supplanting snippets of
modern-traditional “self dialogue” – all of this and more suggests that a
new knowledge and information paradigm is about to befall the

postmodern mind. And because, as noted elsewhere, language is not a
mere passive implement for the communication of ideas, as it was
conceived to be from the Enlightenment through the first half of the
20th Century, nonlinearities of thought and conception were inevitable,
though the accelerating coordination of human and machine is set to
potentiate this. The specialized concepts and protocols of programmers
and software developers are collectively constituting a new and fertile
field of handy metaphors with the power to reprocess and transvaluate
what was until recently an inviolable naïve realism and common sense
of scientists and intellectuals. In a similar way, particles are no longer
the passive occupants and passengers of the void as they were
understood to be prior to the advent of 2nd quantization quantum theory.
This was in large part owing to the democratization of particles and
waves as both different aspects of an underlying active quantum field,
i.e., zero-point field or quantum vacuum. So too, ideas as the occupants
and passengers of a universal dynamic medium, that of language, must
lose their presumed qualities of sharpness, discreteness,
unchangeableness and well-behavedness. Mankind is on the verge of
having his concept map redrawn for him in ways yet to be imagined.
There’s money to be made in search engine algorithm development
because as Bob Dobbs says, “You’ll pay good money to know what you
think!” Of course the down side of rapidly advancing search engine
technology when combined with similar trends in the social networking
domain is that all of us subgeniuses belatedly realized that the vast
proportion of our most original insights were indeed shared with
thousands of other similar “subgeniuses”. The difference between the
genius and the ordinary man is that the genius merely possesses more
sensitive “antennae”.
February 2013

Because the quantum decoherence problem will never be
solved for field-mobile units (androids), the truly powerful conscious

computing, i.e. the computational “heavy lifting”, will have to be
performed at a specially isolated location, i.e., one that is insulated
electrically, magnetically, acoustically, etc., as well as cryogenically
cooled and the results of these continuous real time computations
transmitted to the brain of the mobile unit, which is a classical digital
computer, except for specially designed interfaces composed of filters,
tuners and transponders, etc. Soon after this technology gains currency,
it will be realized that nature has not solved the decoherence
problem either, but has indeed evolved (in man and the higher animals) an
adequate quantum-classical interface substrate in the form of neural
microtubule structures and their tubulin dimers. (Since the
self-organizing properties of atoms and molecules are derived from and more or
less continuously sustained by the dynamical evolution of an underlying
quantum field, i.e., the quantum vacuum, it is not surprising that a
“remote control connection” (in the truest sense of this phrase) would be
maintained – all the way up to the Penrosean decoherence limit. One
should expect there to be a penumbra region somewhat beyond the
decoherence limit wherein the bidirectional, mutual updating of system
and underlying quantum field is no longer nonlocal/instantaneous and a
system of “real time” becomes necessary. The phenomenon of
decoherence may not merely be integral to the emergence of a direction
of time, but of temporal structure and scale.) Like the mobile field
units, each human has their own dedicated partition of the entire
quantum vacuum signal spectrum. The question arises as to where each
“base station” is. (One reason data and information are two different
things is that all information is quantum encrypted and encoded
information and hence is nonlocal (perhaps in the same sense in which
Hilbert space is nonlocal).) The nonlocal nature of conscious computing
data (information) means that the individual mind possesses no specific
location within any particular spacetime. So, then, the appearance of a

common spacetime, one populated with friends, family, acquaintances,
etc., must be taken in light of a likely providential arrangement, and
indeed adds new force to Alvin Plantinga’s dichotomy of theism versus
solipsism. Is it that our/their “base stations” are local and our interface
radio spectrum is nonlocal? Putting the question in these terms may lead
us to suppose that self and other are both projections. We trust that my
projection of the other and the other’s projection of the self syncs up
with his projection of the other and my projection of the self.

May 1997

A ready candidate for this medium connecting the gravitational
potential and kinetic energies of particles, for example, is the fluctuating
component of the Hamiltonian for the quantum system in question. As
stated earlier, this fluctuation component of the Hamiltonian cannot be
"screened." This fluctuation component of the Hamiltonian may be
thought of as the product of its space and time components, H(r) and
H(t). This Hamiltonian is, of course, only an average, ⟨H(r,t)⟩. There
are three basic types of interaction for this system: exchanges of
momentum/energy between the parts of the system entirely among
themselves, exchanges of momentum/energy between the system and
other similar systems, and exchanges of momentum/energy between the
system and its fluctuation Hamiltonian.
July 1998

The equation, constant = p*p + r*r, seems to imply that p and r
may both be incompatible observables despite the absence of
fluctuations in the sum, p*p + r*r. But if one looks at this equation, one
immediately realizes that it is the equation of a circle in phase space.
But a circle in phase space represents a precisely defined trajectory in
phase space which, in turn, implies that p and r, though each uncertain in
an epistemological sense, must at any moment both possess precise
values. And this fact would contradict the thesis of p’s and r’s
incompatibility as observables.
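A standard counterexample bears on this argument (a textbook observation, added here, not part of the original note): for a harmonic-oscillator energy eigenstate the quantity p² + x² is sharp, yet x and p are each genuinely indeterminate, so the sharp "circle" is an energy shell, not a point tracing a trajectory.

```latex
% Oscillator in units with m = \omega = \hbar = 1:
\[
H = \tfrac{1}{2}\,(\hat p^{\,2} + \hat x^{\,2}),
\qquad H\,|n\rangle = \bigl(n + \tfrac12\bigr)|n\rangle,
\]
% The eigenstate |n> has \Delta H = 0 (the "circle" p^2 + x^2 = 2E is
% sharp), and yet both quadratures remain uncertain:
\[
\Delta x = \Delta p = \sqrt{\,n + \tfrac12\,} \;>\; 0.
\]
```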

Conservation of vacuum 4-momentum is asserted here to provide the
mechanism by which the necessary energy exchanges are effected
between the gravitational and kinetic energies of the vacuum. We have
already seen how gravitational acceleration itself, i.e., the conversion of
potential energy into kinetic energy (the converse of acceleration under
thrust) which has been formalized by, e.g., Hamilton's canonical
equations of motion, results from a spatio-temporal vacuum energy
density gradient which itself, in turn, comes into being through the
operation of the principle of vacuum momentum conservation, and
which sustains itself in existence through the vacuum's fundamental
dynamism of self-energy-exchange. Gravitational potential, it is said,
cannot be defined absolutely. Rather, only relative differences in
potential are meaningful. For mathematical convenience, all potentials
are referenced with respect to a potential at infinity, where the 1/R
dependence of the potential causes it to vanish. Yet this
definition contains a presumption, namely, that it is meaningful to speak
of a gravitational potential at an infinite distance. In actuality, the
furthest that a mass can be placed so that its potential is a minimum is at
the so-called edge of the observable Universe, that is, just this side of the
spherical light horizon - where the cosmological red-shift of
electromagnetic radiation becomes infinite. According to some simple
calculations I have performed, this distance is roughly 1.1 x 10^26 meters.
In the particular case of our own Earth this potential is about 20 orders
of magnitude smaller than the potential at the Earth's surface - a
vanishingly small value of approximately 10^-30 Joules per Kilogram.
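A quick numerical sketch of the "about 20 orders of magnitude" comparison, using standard values for Earth's mass and radius together with the 1.1 x 10^26 m horizon distance quoted above (all figures rough):

```python
import math

G = 6.674e-11          # m^3 kg^-1 s^-2
M_earth = 5.972e24     # kg
R_surface = 6.371e6    # m
R_horizon = 1.1e26     # m, the horizon distance estimated in the text

phi_surface = G * M_earth / R_surface   # |potential| at Earth's surface, J/kg
phi_horizon = G * M_earth / R_horizon   # |potential| at the horizon distance, J/kg

orders = math.log10(phi_surface / phi_horizon)
print(phi_surface, phi_horizon, orders)  # ratio is roughly 20 orders of magnitude
```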
The Schrodinger equation may be thought of as describing diffusion
along the ict axis. Moreover, Graham's Law of effusion states that more
massive particles diffuse more slowly than less massive particles.
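The two statements can be juxtaposed in a few lines. The diffusion coefficient hbar/2m below is the standard identification when the free Schrodinger equation is read as a diffusion equation along imaginary time, so heavier particles indeed "diffuse" more slowly, in qualitative agreement with Graham's law:

```python
import math

hbar = 1.054571817e-34  # J s

def effusion_rate_ratio(m1, m2):
    """Graham's law: rate1 / rate2 = sqrt(m2 / m1)."""
    return math.sqrt(m2 / m1)

def schrodinger_diffusion_coeff(m):
    """Diffusion coefficient of the free Schrodinger equation read
    along imaginary time: D = hbar / (2 m)."""
    return hbar / (2.0 * m)

m_e = 9.109e-31   # electron mass, kg
m_p = 1.673e-27   # proton mass, kg
print(effusion_rate_ratio(m_e, m_p))   # lighter particle effuses ~43x faster
print(schrodinger_diffusion_coeff(m_e) > schrodinger_diffusion_coeff(m_p))
```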
According to Hawking and Bekenstein, the entropy of a black hole is
directly proportional to the surface area of the hole. This relation is
given below.
S ~ 4piR^2

But the energy density of the black hole is given by the relation,

e = 3c^4/4piGR^2

so that,

S ~ e^-1

where S is the entropy and e is the energy density of the black hole,
respectively. Consequently, if the energy density of the vacuum is equal
to the energy density of black hole masses, then the entropy of the
vacuum should increase with decreasing vacuum energy density. We
believe that the energy density of the vacuum is equal to the effective
energy density of black holes because the radial outward pressure of the
vacuum, Pvac, must be 0 at the event horizon surface of a black hole and
the vacuum obeys the equation of state, namely, evac = Pvac.
Furthermore, as already stated elsewhere, eo = emass + evac, because
there is no fundamental distinction between creating mass from the
vacuum energy locally available within a particular region of spacetime
and importing already existing mass from outside this region of
spacetime into this region because, in turn, matter particles may not be
thought of as having a permanent, continuous existence after the manner
of the substances of Aristotelian physics; this follows from the fact that
there is no real distinction between relativistic and non-relativistic mass.
Pvac must be 0 here because the matter composing a black hole may
exchange energy only with itself; it exchanges no energy with the
vacuum energy field outside its event horizon. ematter = eo in this
particular case and, as well, evac = Pvac = 0. Again, half of the
mass-energy contained within the black hole is due solely to the general
relativistic increase in mass which had accumulated once the hole had
formed:

e = eo{1 - GM/Rc^2},

eo = 3c^4/4piGR^2.
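The claimed inverse relation between black-hole entropy and energy density follows directly from the two expressions above, since their product is independent of R (constants of proportionality dropped); a symbolic check:

```python
import sympy as sp

R, G, c = sp.symbols('R G c', positive=True)
S = 4 * sp.pi * R**2                      # entropy ~ horizon area
e = 3 * c**4 / (4 * sp.pi * G * R**2)     # the text's black-hole energy density

product = sp.simplify(S * e)
print(product)   # independent of R, hence S is proportional to 1/e
```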

Generalizing this result, we may say that the maximum rate of
increase in the entropy of the vacuum is parallel to the direction along

which the decrease in the vacuum's energy density is maximal. In so-called free space, the direction along which the maximal decrease in the
vacuum's energy density exists is along the ict axis; in other words, the
vacuum energy density varies in a purely temporal manner in free space,
i.e., due to cosmological expansion. Therefore, the so-called
thermodynamic arrow of time points in a direction orthogonal (in free
space) to any 3 dimensional rectangular system of coordinates
describing an inertial frame of reference; moreover, a gravitational field
is associated with an alteration in the orientation of the thermodynamic
arrow of time because a component of the direction of the maximally
increasing vacuum entropy now points radially inward - in the simple
case of spherical masses. The thermal particle creation which is
observed to occur within accelerated reference frames is a manifestation
of a creation/annihilation process which is normally balanced in the free
space vacuum but which is unbalanced within the accelerated frame.
In the presence of a gravitational potential, the arrow of time possesses a
component along the vacuum energy density gradient so that a new time
axis is defined within this new vacuum which exactly corresponds to the
new time axis as defined within the general theory of relativity as
applied to the Minkowski light cone.
November 1996

The second law of thermodynamics only applies to physical
processes taking place within a closed system which is in interaction
with an infinite heat reservoir. The 2nd Law does not, however, apply to
open thermodynamic systems since in these systems no global
thermodynamic arrow of time can be consistently defined. Such
thermodynamic arrows can only be defined locally. This reminds us of
how standing waves cannot form in containers of infinite size. So the
concept of a particle, which is itself just a Gaussian packet of superposed
standing waves, can only possess validity in a local sense; globally
speaking, the notion of a particle does not refer to anything which
possesses ultimate reality, but is an abstraction grounded in a low-order
approximation. See Quantum Field Theory in Curved Spacetime and
Black Hole Thermodynamics by Robert M. Wald, University of Chicago
Press, for further discussion of the limitations of the "particle concept"

in strongly curved or rapidly time-varying spacetimes. This book also
discusses the phenomenon of particle production in expanding Einstein-de Sitter spacetimes as being closely related to Hawking radiation.
December 1996

Changes in the boundary conditions of the wavefunction
which take place with a rapidity such that,
dB/dt > ΔB/Δt ~ (ΔE/h) × ΔB,

where B are the boundary conditions of the quantum mechanical
superposition state, S, will inevitably result in a collapse of the
wavefunction, Psi, into one of its eigenstates of the observable bound by
B. This is provided that the new boundary conditions, B', are stabilized
to within c × Δt, where Δt is the time uncertainty in the time interval of
this transition, B ===> B'. Wavefunctions representing locally-connected
quantum mechanical systems are constituted by a system of
boundary conditions placed upon the nonlocally-connected quantum
vacuum stress-momentum-energy field. The principle of superposition
illustrates the importance of unrealized possibilities: they play a
substantive role in the behavior of the real.

October 2011

This notion apparently does not occur to defenders of the doctrine of
Modal Realism, e.g., David Lewis. Initial and boundary conditions are
what distinguish the merely virtual from the real, the possible from the
actual, cf. Max Tegmark’s hierarchy of multiverse types.
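The collapse criterion stated at the head of this entry can be written as a small predicate. This is simply the entry's own inequality restated in code, not an established result, and all names and numbers below are illustrative:

```python
hbar = 1.054571817e-34  # J s

def collapse_predicted(dB_dt, delta_B, delta_E):
    """The entry's criterion: boundary conditions changing faster than
    (delta_E / hbar) * delta_B are predicted to collapse the wavefunction.
    The argument names here are illustrative, not a standard API."""
    return abs(dB_dt) > (delta_E / hbar) * abs(delta_B)

# illustrative numbers only
print(collapse_predicted(dB_dt=1e20, delta_B=1.0, delta_E=1e-19))  # fast change
print(collapse_predicted(dB_dt=1.0, delta_B=1.0, delta_E=1e-19))   # slow change
```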
The energy uncertainty of a quantum mechanical system, ΔE, is both
independent of the observer, that is, it represents an ontological, rather
than a merely epistemological, uncertainty in the energy of the system
and it is dependent upon the state of the observer's knowledge of this
system. This suggests that the observer and his state of knowledge are
essentially separable. His knowledge of quantum mechanical system
states is from the inside, meaning that the observer's knowledge is coded
nonlocally in the quantum energy uncertainty of his own brain, itself a
quantum mechanical system! The brain of the observer simply

provides a set of boundary conditions upon the quantum vacuum energy
field. Thermal particle production is expected to occur in the direction of
the entropy gradient of a vacuum possessing a gravitational potential;
and the principle of relativity demands that particle production be
associated with the global increase in vacuum entropy engendered by the
process of cosmological expansion. The maximally entropic state within
any region of spacetime is that of the vacuum itself. In general, due to
gravitational time dilation, the entropy of matter distributions can never
catch up, so to speak, with the entropy of the vacuum: the result of this
is that matter and energy distributions can never quite reach a state of
thermodynamic equilibrium within an expanding universe.
December 1996

It is our belief that the global orientation of the arrow of time
is determined by the global distribution of matter in the Universe, and
that without the presence of matter, there is no determinate direction for
the arrow of time. This implies that the Universe conceived of as a
radically open system cannot possess a complete, self-consistent
topological description. Using the analogy of a system of vibrating
strings: a finite sum of Fourier component functions, F(w), adequately
describes the system of string vibrations provided that each of the strings
be "anchored" on at least one end, which is to say that, in the absence of
spatial boundary conditions placed upon the strings' vibrations, standing
wave patterns of string vibration cannot exist and no purely spatial
description of the system of string vibrations is possible - only a
spatiotemporal description is possible in this case, and one in which
there is no unique decomposition of the spatiotemporal description into a
particular 3(space) + 1(time) manifold. A result similar to the one
above obtains where no unique time direction for the dynamical
evolution of the system can be specified. The ratio of mass energy
density to vacuum energy density varies with R^-1 for spherical masses.

e = eo{1 - GM/Rc^2}

The previous formula seems to imply that when R = R_Schwarzschild,
the energy density of the vacuum has only been reduced to
1/2 of its normal free space value. However, this is to neglect the effect
which a reduced vacuum energy density has upon the measurement of
mass values: the inverted fraction by which the vacuum's energy density

is reduced gives us the fraction by which the masses occupying this
vacuum relativistically increase. In other words, the mass of a body may
increase to just short of 1/2 of its Schwarzschild value and still remain
stable against total gravitational collapse. When the mass of a body
increases to just over its Schwarzschild mass a positive feedback occurs
between each successive "cycle" of relativistic mass increase,
whereupon half of the vacuum's energy has already been "displaced" by
the piling on of mass from outside, while the other half of the vacuum's
energy is converted directly into mass energy entirely through
relativistic mass increase. This is the reason why we may properly say
that the true energy density of the vacuum is not 3c^4/8piGR^2, but
actually twice this value: eo = 3c^4/4piGR^2. Also, when one considers
the process of "evaporation of black holes" via the mechanism of
Hawking radiation, it is easy to see that in a very real sense the density
of black holes must be exactly twice that predicted by the general theory
of relativity, more particularly, via the Schwarzschild solution to the
field equations: a quantity of mass-energy, 2mc^2, where m is the mass of the
black hole, must be created from out of the vacuum before a black hole
of mass, m, evaporates completely.
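The half-value claim for the vacuum energy density can be checked symbolically: substituting the Schwarzschild radius R_s = 2GM/c^2 into the formula e = eo{1 - GM/Rc^2} used above gives exactly eo/2.

```python
import sympy as sp

G, M, c, e0, R = sp.symbols('G M c e0 R', positive=True)
e = e0 * (1 - G * M / (R * c**2))   # the reduced vacuum energy density
R_s = 2 * G * M / c**2              # Schwarzschild radius

print(sp.simplify(e.subs(R, R_s)))  # e0/2: half the free-space value at R = R_s
```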
The ultimate substratum which mediates all the fundamental physical
interactions must itself be nondeterministically chaotic in nature; or else
time cannot be considered a true dynamical variable. Since a
fundamental process of creation and annihilation underlies all particle
interactions, the action of the vacuum energy field may be identified
with the translation of all composite matter along a direction orthogonal
to the total set of orthogonal spatial axes.
January 1997

Space without Time is Determinism. Time without Space is
Chaos. Determinism and Chaos are simply opposite ends of a single
continuum. Complexity is that which governs the movement of a
dynamical system back and forth along what we might well term the
Cosmos/Chaos continuum. The underlying order which pushes a
dynamical system this way and that along this continuum cannot itself

be described in terms of a classical, dynamical system because this order
necessarily operates from outside this continuum. What ultimately
governs this movement of dynamical systems along this continuum is
the underlying fluctuations in spacetime.
November 1997

Deterministic change can only be a phenomenal appearance
since either the deterministic phenomena are the play of projections
from determinate objects from within higher dimensional spaces
containing our space or the phenomena conceal an indeterminism at a
deeper level behind the appearances. In the same way that the continual
creation and destruction of a circular disk confined to a two dimensional
sphere may be thought of as the continuous penetration or projection of
a three dimensional cylinder orthogonally through this two dimensional
spherical surface, we may model the continual process of creation and
annihilation of spherical massive bodies as the continuous penetration or
projection of hypercylindrical bodies orthogonally through a three
dimensional hypersurface constituting normal three dimensional space.
If massive bodies were composed of permanent, continuously existing
substance, there would be no reason to postulate the existence of an
additional 4th spatial axis associated with the dimension of time. It is
the energy of matter's continual re-creation of itself which constitutes the
latent energy of matter, E = mc^2. When a material body is uniformly
accelerated, the body is no longer re-creating itself along the time
dimension alone, but must be considered to be in the act of re-creating
itself along two orthogonal component directions: part of the energy of
re-creation is associated with a momentum in the direction the body is
accelerating, and the remaining part of this re-creation energy is
associated with the body's momentum in a direction orthogonal to this
acceleration vector, and moreover, orthogonal to the 3 dimensional
space (instantaneous inertial frame) which it occupies at any given
moment. Our question at this juncture, then, is: is there any reason for
treating a 4th spatial dimension as being ontologically real, rather than
as just an abstract entity within a particular formalization of special
relativity? Yes. We list them below.

1) Conservation of vacuum momentum.
2) The conversion of mass to energy as the 90° rotation of imaginary
momentum into real momentum.
3) The thermodynamic arrow of time in a gravitational field.
4) The Hubble distance-velocity relationship describing galactic
recession.
5) The tunneling of all masses through a hyperspherical potential barrier.
6) The derivation of Einstein's mass-velocity relationship within an
expanding four-hyperspherical universe.
7) The conservation of four-dimensional angular momentum as an
explanation for the perihelion advance in the orbit of the planet Mercury.
8) The implication of quantum mechanics that real particles possess no
continuous existence, but are essentially being continuously created and
annihilated.
The spacetime interval is given by ds^2 = c^2t^2 - x^2 - y^2 - z^2, so
that the interval, ds, may take on either real
or imaginary values. If ds^2 > 0, then two events separated by this
interval are locally connectable, and may be connected by a series of
reversible interactions. If ds^2 < 0, then two events separated by this
interval are nonlocally connectable, and may only be connected by a
series of irreversible interactions. All reversible processes are mediated
by vacuum processes which are themselves irreversible.
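The sign convention above can be captured in a small classifier; the labels "local" and "nonlocal" follow the usage of this entry (timelike and spacelike separation, respectively):

```python
def interval_squared(c, t, x, y, z):
    """Spacetime interval ds^2 = (c t)^2 - x^2 - y^2 - z^2 between an
    event and the origin."""
    return (c * t)**2 - x**2 - y**2 - z**2

def connection_type(ds2):
    """Classification used in the text: ds^2 > 0 locally (causally)
    connectable, ds^2 < 0 only nonlocally connectable."""
    if ds2 > 0:
        return "local"
    if ds2 < 0:
        return "nonlocal"
    return "lightlike"

c = 3.0e8
print(connection_type(interval_squared(c, 1.0, 1.0e8, 0, 0)))  # inside light cone
print(connection_type(interval_squared(c, 1.0, 1.0e9, 0, 0)))  # outside light cone
```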
Since gravitation is a phenomenon resulting from conservation of four-momentum, the sign of mass (+/-) is immaterial to the direction of the
gravitational acceleration vector. If anything analogous to what might
be termed mass charge exists, it is in the form of an imaginary mass.
Imaginary mass would have the effect of producing a gravitational field
with an acceleration vector which is reversed in its normal direction.
This suggests to us that the mass, or energy, of the vacuum field is itself

imaginary so that real mass may be understood as a deficit of imaginary
energy within the vacuum field, producing the normal gravitational
acceleration vector field. If gravitons, as massless
particles, are assumed to be the true mediators of the gravitational force,
then there is a serious problem with interpreting the gravitational field
associated with a spherical wavefront of gravitons which is expanding
outward at the speed of light. We notice that in the many various forms
in which the Heisenberg Uncertainty Principle may be stated there is
always the product of two uncertainties in physical quantities which is
greater than or equal to Planck's constant and that one of these paired
uncertainties is with respect to a physical quantity which is conserved,
and for which there exists a quantum number, while the other paired
uncertainty is with respect to a physical quantity which is not conserved,
and for which no quantum number exists.
To list just a few examples of this general rule: ΔEΔt ≥ h, ΔpΔx ≥ h,
ΔnΔL ≥ h, etc. Moreover, each form of expression of the fundamental
Heisenberg uncertainty relation may be, in turn, paired with another
such expression where the conserved quantities of the two paired
expressions form with one another a symmetrical tensor which possesses
the property of Lorentz-invariance, while the unconserved quantities of
the two paired expressions form, with one another, another symmetrical
tensor which also possesses the property of Lorentz-invariance. It is the
Lorentz-invariant tensorial relationship of the paired conserved quantities
which is responsible for the Lorentz invariance and tensorial nature of the
paired unconserved quantities and not the converse. For example, the
fact that momentum and energy may be subsumed together under a
unified description as the relativistic momentum-energy tensor is what is
responsible for the tensorial nature of the interrelationship of the space
and time variables, i.e., the Lorentz-invariance of space and time which
manifests itself separately as time-dilation and length-contraction which
is observed within frames of reference traveling an appreciable fraction
of the velocity of light relative to an observer reference frame. The
momentum-energy tensor is, by the way, also responsible for the
Lorentz-invariant, tensorial nature of the Maxwell tensor describing the

electromagnetic field, and we may now see why the Maxwell tensor
does not possess a term denoting the divergence of the magnetic field,
i.e., why magnetic monopoles do not exist in nature. Aesthetically
minded physicists have for generations noted this missing term in
Maxwell's equations and suggested the inevitable existence of
monopoles, since their existence would render the electromagnetic field
equations more perfectly symmetrical. But we see now that the lack of
greater symmetry in Maxwell's equations is explicable in terms of the
presence of the even deeper symmetry of the Heisenberg uncertainty
relations, and so this apparent lack of symmetry on the part of the
electromagnetic field need no longer be viewed as a "flaw" in the
structure of mathematical physics.
This deeper symmetry may be understood in the following way: the
fluctuation in electric field strength (an unconserved quantity) is due to
the uncertainty in the position (an unconserved quantity) of a conserved
quantity - electric charge, combined with the uncertainty in momentum
(a conserved quantity) of the magnetic charge (an unconserved
quantity). If we try to establish the fluctuation in the magnetic field
strength independently of the fluctuation in electric field strength, we
end up violating the symmetry of the uncertainty relations, e.g., the
fluctuation in magnetic field strength (an unconserved quantity) is due
to the uncertainty in charge momentum (a conserved quantity) of a
conserved quantity - electric charge, combined with the uncertainty in
charge position (an unconserved quantity) of the magnetic charge
(assumed here to be a conserved quantity). Again, the symmetry is only
restored here by treating magnetic charge as an unconserved quantity.
We may apply our rule in a more direct fashion by postulating an
uncertainty relation which obtains provided that magnetic charges do
exist. This uncertainty relation is the product of uncertainties in electric
and magnetic charge. To wit, the product in the uncertainties of these
two physical quantities must be greater than or equal to the value of
Planck's constant. Following our same symmetrically-based rule, we
find that Planck's constant must be less than or equal to the product of
uncertainties in a conserved quantity and an unconserved quantity. This

new uncertainty relationship would be written, expressing Planck's
constant as the lower limit for the product of the uncertainty in electric
charge with the uncertainty in the quantity of magnetic charge, divided
by c, the speed of light, in order to have consistency of physical
dimensions. Again, only one of these paired quantities is the conserved
quantity, and this conserved physical quantity must be the electric
charge. So we see from consideration of the symmetry exhibited by the
many alternate expressions of the Heisenberg uncertainty principle, that
if monopoles exist, their charge cannot be a conserved quantity so that
magnetic charge may not possess a quantum number. However, if
Maxwell's equations are modified to allow for the existence of magnetic
charge, the symmetry of these equations demands magnetic charge
conservation, but this leads to a contradiction with the more general
symmetry argument for the non-conservation of magnetic charge, and so
we see that magnetic charge cannot exist.
Particle creation in a non-inertial reference frame is not a
symmetrical process: it is not possible for one to accelerate in such a
manner that real particles become virtual particles, i.e., are absorbed
back into the vacuum energy field from which they were originally
created in the same way that it is not possible to accelerate in such a
manner that the local rate at which time passes increases rather than
decreases; however, if a given real particle does not have an infinite
lifetime (which no particle does), then within an unaccelerated reference
frame, the lifetimes of quasi-stable particles, as viewed from an
accelerated reference frame, will be shortened by the relativistic time
dilation factor.
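The standard dilation of particle lifetimes invoked here is easy to make concrete; the sketch below uses the muon's proper lifetime and an assumed speed of 0.99c:

```python
import math

def gamma(v, c=2.998e8):
    """Lorentz factor for speed v."""
    return 1.0 / math.sqrt(1.0 - (v / c)**2)

tau_muon = 2.197e-6            # s, muon proper lifetime
v = 0.99 * 2.998e8             # assumed speed for the example
tau_lab = gamma(v) * tau_muon  # lifetime measured in the frame where it moves
print(gamma(v), tau_lab)       # gamma ~ 7.1, lifetime stretched accordingly
```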
It is in terms of this fundamental asymmetry that we can more
simply resolve the so-called twin paradox of special relativity: the
acceleration of the spacefaring twin and the earthbound twin cannot be
considered to be merely relative because the twin in the rocket ship
observes thermal particle production within his vacuum, while the twin
confined to the Earth observes no such phenomenon within his own
vacuum. This phenomenon of particle production within accelerated

reference frames is to be expected because a particle is real only if its
energy is greater than the energy uncertainty of quantum system to
which it belongs, and the time dilation associated with accelerated
motion affects the fundamental uncertainty relation, ΔEΔt ≥ h, such that
some particles which were virtual within the unaccelerated frame
relativistically increase their energy which is now even greater in
relation to a reduced energy uncertainty, and so "become" real particles
within the new vacuum state. All particles which are virtual in one
particular reference frame are real particles with respect to some other
reference frame; the converse of this is not the case, however - the
irreversibility enters the picture, as stated before, through the differential
observations of thermal particle production within the vacuum of
observers within different inertial frames of reference. This relationship
between real and virtual particles within special relativity can perhaps be
understood as a restatement of the principle of causality within special
relativity: events which are causally connected in one particular
reference frame are causally connected and have the same time order
within all possible reference frames, and it is only those events which
are not causally connected (nor potentially causally connected) which
might have the order of their occurrence switched when observed from
the standpoint of different reference frames, from which it also follows
that events which are not causally connected within a given frame of
reference, are not connected in any reference frame. Real particle
production within a vacuum of reduced energy uncertainty may be
interpreted as being converse but parallel to the process of virtual
particle production within a vacuum of increased energy uncertainty.
Also, if real particles are understood as feedback structures of virtual
particle processes which essentially may be understood as a network of
circular energy fluxes, these individual processes being causally
connected with one another within one particular spacetime, then these
feedback structures are destroyed when the energy uncertainty of the
vacuum becomes greater than the energy of the real particles, so that an
increase of energy uncertainty is associated with a loss of information in
the form of the cybernetic control "holding the particles together."

June
The so-called twin paradox of special relativity is easily solved using

the principle of the conservation of four momentum. It is only the
accelerating twin who changes the distribution of three momentum
across the four components of his conserved four momentum. The
other, stay-at-home twin does not change the distribution of the four
components of his four momentum and so all relativistic effects are only
associated with the accelerated twin.
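The asymmetry can be made quantitative with the usual proper-time formula: for a traveling twin cruising at 0.8c while 10 years pass on Earth, only 6 years elapse on board (acceleration phases idealized away in this sketch).

```python
import math

def proper_time(coordinate_time, v, c=2.998e8):
    """Proper time elapsed on a clock moving at constant speed v over
    the given coordinate time: tau = t * sqrt(1 - v^2/c^2)."""
    return coordinate_time * math.sqrt(1.0 - (v / c)**2)

t_earth = 10.0                   # years elapsed on Earth
v = 0.8 * 2.998e8                # traveling twin's cruise speed
print(proper_time(t_earth, v))   # 6.0 years for the traveler
```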
Consequently, the particles which are produced within an
accelerated reference frame, say, within the curved spacetime of a
gravitational potential, may not "appear" out of the vacuum in a
collective state of causal interconnection with one another. The only
assurance that a set of particles is not causally connected with one
another, i.e., locally connected, is if they are nonlocally connected.
Moreover, the notion of the continuous existence of particles is simply
not consistent with the asymmetry of virtual particle/ real particle
transformations which are necessitated by a change in Lorentz frames.
We know that virtual particles do not preserve their identity from one
moment to the next; by "moment" we mean a period of time greater than
h/E, where E is the total energy of the virtual particle-antiparticle pair
which has been spontaneously created out of the vacuum state. We also
know that this particular virtual pair will appear as a pair of real particles
with respect to an accelerated frame of reference. So if the virtual pairs
possess no enduring continuous existence within a flat spacetime, then
neither do they possess a continuous existence within any other possible
frame of reference. This notion follows simply from the principle of the
general equivalence (from the standpoint of the fundamental invariance
of physical law) of all frames of reference. From this we arrive at the
general result that real particles, what we call matter, must be a stable
pattern of fluctuation of the field energy of the quantum mechanical
vacuum, whereas virtual particles are unstable patterns of vacuum field
fluctuation. Here we see that the fundamental difference between stable
and unstable patterns of vacuum fluctuation, real and virtual particles,
respectively, is not qualitative, but quantitative; it is due merely to the
availability or non-availability of raw, undifferentiated energy. The

structure of all possible matter configurations already exists latent within
the vacuum fluctuation field; what is required to "create" these
configurations is simply the necessary quantity of raw energy.

April 2011
But this is only true up the limit set by the Planck energy.
Configurations of vacuum stress-momentum-energy larger than EPlanck
must have been “assembled” from configurations smaller than EPlanck,
cf. arXiv:quant-ph/0603269 v2, 12 Jul 2006.

April 2011
When energy is supplied to the vacuum, the structures which are
produced are simply those which are the most probable and hence the
simplest. More exotic configurations of matter may be produced if
energy is supplied to the vacuum field while it is experiencing
"improbable" fluctuation patterns. These so-called improbable
fluctuations are simply those which possess a more fleeting existence.
In the August 1993 issue of Scientific American there appears an article
which describes experiments in which the time for photons to quantum
mechanically tunnel through a barrier is measured for a coherent beam
of incident photons where 99% of the beam is reflected off of the
barrier, but in which approximately 1% of the photons are transmitted
("tunnel") across the barrier. The experimental data indicated that the
photons which tunneled through the barrier traveled at superluminal
speeds, some of the photons reaching 1.7c. The phenomenological
explanation for this was that the tunneling photons changed the shape of
their wavefunctions such that the peak of the wave function is shifted in
the direction of photon tunneling, resulting in the photons having a finite
probability of being found just on the opposite side of the barrier
somewhat earlier than if the shape of their wavefunctions had
experienced no distortion. Increasing the width of the barrier decreased
the probability of photons successfully tunneling through the barrier, but
resulted in increased measured superluminal velocities for the photons,
which actually succeeded in tunneling through the barrier.
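The qualitative trade-off reported in the article, lower transmission but higher apparent traversal speed for wider barriers, can be sketched with a toy model of the saturating (Hartman-regime) tunneling time. The decay constant and saturated traversal time below are assumed for illustration only, not taken from the experiment:

```python
import math

# Toy model: with an opaque-barrier traversal time that saturates at a
# constant tau, the apparent traversal speed L / tau grows with barrier
# width L even as the transmission probability falls off exponentially.
kappa = 1.0e7        # 1/m, assumed evanescent decay constant
tau = 1.0e-15        # s, assumed saturated traversal time

def transmission(L):
    """Opaque-barrier transmission probability, T ~ exp(-2 kappa L)."""
    return math.exp(-2.0 * kappa * L)

def apparent_speed(L):
    """Apparent traversal speed if the traversal time stays fixed at tau."""
    return L / tau

L1, L2 = 0.5e-6, 1.0e-6
print(transmission(L1) > transmission(L2))      # thicker barrier: fewer photons
print(apparent_speed(L2) > apparent_speed(L1))  # ...but higher apparent speed
```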
June 1997

In this case, the photons' wavefunction peak had to shift toward
the opposite end of the barrier faster if they were to be observed on the
other side of the barrier within the short time that it would have taken for

the photon to be absorbed by the barrier.
In theory, particles which quantum-tunnel through a potential barrier
possess a negative kinetic energy, and hence an imaginary momentum
while engaged in the tunneling process. If the four-momentum of the
tunneling photons is conserved (as it is required to do by special
relativity), then an increased photon imaginary momentum must be
precisely compensated by an increased real photon momentum such that
the magnitude of total four-momentum of the photon is, again,
conserved: the tunneling photons are effectively being scattered in
four-dimensional spacetime!

April 2011

The tunneling photons possess a negative imaginary momentum while
in the act of tunneling through the barrier.

April 2011
A photon scattered within a four-dimensional space would experience a
decrease in its so-called real momentum; (actually, in this case, the real
momentum of the photon is simply the momentum associated with its
motion though the space which is directly observable to us, i.e., 3
dimensions) however, the scattering of a photon within a 4-dimensional
space where it is possible for the interval, ds^2 < 0, superluminal
velocities are made possible by the conservation, as stated earlier, of the
photon's four-momentum. If there is a functional relationship between
the integral of both the gravitational self-energy and the kinetic energy
of cosmological expansion, then there will be a functional relationship
between the gravitational self-energy of expansion and the kinetic
energy of expansion such that when the kinetic energy of cosmological
expansion approaches zero, the gravitational self-energy of the Universe
approaches zero, implying a flat global spacetime geometry. Because of
the negative feedback coupling between the kinetic and gravitational
self-energies, we expect that these two energies are strongly coupled in
the early history of the cosmological expansion, but become very weakly
coupled by this relatively late epoch in the history of the Universe. In
this scenario we expect a time variation in the strength of Newton's
gravitational constant which is proportional to the time derivative of
the quantity e^(-t/T), where T = 1/H, H being Hubble's constant.

This gives a time variation of G of (H/e) x G. Since the coupling
between the gravitational self-energy and the kinetic energy of
cosmological expansion is virtually zero in the present epoch of the
Universe's history, we expect that there will obtain a force of
cosmological repulsion which almost exactly counterbalances the
gravitational force which would tend to slow and eventually reverse the
process of cosmological expansion.
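The stated rate of variation of G follows from differentiating the assumed decay law at the present epoch t = T. A minimal numerical check of this step (the exponential form G ∝ e^(-t/T) is this notebook's own ansatz; the value of H is an assumption of order the observed Hubble constant):

```python
import math

H = 2.3e-18        # assumed Hubble constant, 1/s (about 70 km/s/Mpc)
T = 1.0 / H        # the decay timescale of the coupling

def coupling(t):
    return math.exp(-t / T)

# central-difference derivative of e^(-t/T) at the present epoch, t = T
dt = 1.0e-6 * T
deriv = (coupling(T + dt) - coupling(T - dt)) / (2 * dt)

# |d/dt e^(-t/T)| at t = T is (1/T) e^(-1) = H/e,
# so the fractional time variation of G comes out to (H/e) x G
assert abs(abs(deriv) - H / math.e) < 1e-9 * H
print(f"|dG/dt| / G = {abs(deriv):.3e} per second")
```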
QUESTION: We know that for low velocities, the addition of velocities
is according to Galilean relativity, i.e., velocities are simply additively
superposed. However, it does not appear that small accelerations may
be simply additively superposed according to Galilean relativity.
According to what rule are both large and small accelerations added
together to yield the total relative acceleration? The energy required to
rotate a pure imaginary momentum by 90° so that this momentum
becomes a pure real momentum is just mc^2. This quantity of energy
may be thought of as the latent energy of matter which it possesses by
virtue of its being initially accelerated by the forces of the Big Bang
explosion. The negative kinetic energy of matter implies the existence
of a hyperspherical potential barrier through which all matter tunneled
(in quantum mechanical fashion) and through which it continues to
tunnel. This notion constitutes a kind of hyper-extended inflationary
theory. The gradient of this potential associated with this barrier may be
described by a pure imaginary four-vector (in "free space"), while the
orientation of the gradient of this hyperspherical potential is altered in
the presence of mass-energy in such a manner that the magnitude of the
gradient (in four dimensions) is always conserved. Given a typical
distribution of matter, in general this four-vector will possess no
nonzero components, and the introduction of new matter into this
distribution will transform the components of the potential gradient
four-vector after the manner of a second rank tensor. In fact, this tensor
provides the "connecting rule" by which the gradient transforms, as we
move along an arbitrary trajectory through a given matter distribution,
considering in succession points along the trajectory which are only

negligibly distant from one another (so that the potential does not change
"too rapidly" between successive points). All of the terms of Einstein's
general relativistic field equations are second rank tensors, the energy-momentum tensor providing the rule by which the metric tensor at one
point in spacetime is transformed at infinitesimally contiguous points of
spacetime. We must keep in mind that the potential gradient around any
particular particle of matter is described by a four-vector, and it is only
the meshing of the gradients of one particle's vector field with that of its
neighbor which requires the use of a tensor description.
July 1997

Russell Clark sent me this email:

Thema: Luxon Theory
Datum: 30.07.97 02:39:28
In support of your rather interesting theory the following: Imagine, as you say, that matter is
indeed travelling at the speed of light all the time, even when it appears to be at rest. The initial
momentum of a given mass then might be,

p = m(ic)
Now when the mass is accelerated to a velocity v, the mass' new velocity becomes,

v' = sqrt[v**2 + (i**2)c**2] = sqrt[v**2 - c**2]
such that the final momentum of the mass is now

p'= m' x sqrt[v**2 - c**2]
If we equate the initial momentum with the final momentum (conservation of 4-momentum, if
you will) we have,

m(ic) = m' x sqrt[v**2 - c**2] yielding,
m = m'/(ic) x sqrt[v**2 - c**2]

ic = sqrt[i**2 x c**2] = sqrt[-c**2]
so that

m = m'/sqrt[-c**2] x sqrt[v**2 - c**2] = m' x sqrt[1 - v**2/c**2]
such that

m' = m/sqrt[1 - v**2/c**2]
which is just the special relativistic mass formula of Einstein. The coefficient "i" comes into play
in the above manipulations because multiplication by "i", or e**(i*pi/2), is the only way to rotate a

vector in 3 space without producing another vector within the same 3 space; multiplication by "i"
takes a 3-vector out of the 3-d manifold and so represents the relationship of time to the other
three spatial axes.
There are other reasons for, perhaps, including the coefficient "i" within quantum tunneling, for
example, as in the case of matter tunneling through a hyperspherical potential barrier while
appearing at rest in 3 spatial dimensions.
Moreover, we may integrate the momentum of a mass, m, as it is accelerated from a velocity
v(init.) = ic to a final velocity v(fin.) = c. This yields,

Intgrl[mv]dv (v = ic to v = c) = 1/2 mv**2
evaluated between the limits of v = ic and v = c which yields,

1/2m(c**2) - 1/2m(i**2 x c**2) = mc**2
Of course the energy of motion is just the integral of the momentum from the initial to the final velocity.

Best Regards,
Russell Clark
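Clark's manipulation can be checked directly with complex arithmetic: for v < c, sqrt[v**2 - c**2] is the pure imaginary i·sqrt[c**2 - v**2], so equating the initial momentum m(ic) with the final momentum m'·sqrt[v**2 - c**2] reproduces m' = m/sqrt[1 - v**2/c**2]. A numerical verification in natural units (c = 1):

```python
import cmath

c, m, v = 1.0, 1.0, 0.6   # natural units; arbitrary sub-light velocity

# Einstein's relativistic mass formula, the email's conclusion
m_prime = m / (1 - v**2 / c**2) ** 0.5

# conservation of the email's momenta: m(ic) = m' x sqrt[v**2 - c**2]
p_initial = m * 1j * c
p_final = m_prime * cmath.sqrt(v**2 - c**2)
assert abs(p_initial - p_final) < 1e-12

# the closing integral: Intgrl[mv]dv from v = ic to v = c gives
# (1/2)m(c**2) - (1/2)m(ic)**2 = mc**2
energy = 0.5 * m * c**2 - 0.5 * m * (1j * c)**2
assert abs(energy - m * c**2) < 1e-12
```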

If this vector field were assumed to be quantized, so that a unique
exchange particle, or boson, were thought to mediate the action of the
field, then this boson would have a spin of 1, not 2, and hence could not
be described as a graviton, itself the mediator of a purely attractive force
field; a spin 1 particle, however, is the exchange particle of a force field
which is, like the photon, either attractive or repulsive, depending on
whether the gradients of the potentials of both particles are of like sign or
of opposite sign. The "charge" of the matter particles corresponds to
the case of the particles being either of real or imaginary mass, as stated
earlier. The effect, however, of two matter particles of either both real
mass or both imaginary mass upon each other's spin 1 vector fields is to
create a stress within the spacetime between the two particles which, as
we stated earlier, must be described in terms of a tensor field. The
imaginary mass of virtual particles, as alluded to earlier, would result in
a mutually repulsive force field tending to drive these virtual particles
apart from one another, resulting in the cosmological expansion of the
vacuum, or of space itself. Localized deficits in the density of imaginary
mass (due to the "displacing" presence of real mass) would manifest

themselves in a diminution of the cosmological acceleration vector
describing the cosmological force of repulsion obtaining between all
virtual particles.
The acceleration of massive particles due to
gravitational fields may be interpreted as an attempt on the part of real
massive particles to maintain a spherically symmetrical distribution of
vacuum energy about themselves - a condition obtaining for a particle at
"rest" with respect to some fundamental reference frame. The general
relativistic effect of mass increase within a gravitational field may be
explained in terms of a function of the alteration in the three variables:
vacuum energy density, magnitude of the hyperspherical potential
barrier, and the imaginary momentum of the particle experiencing the
mass increase. The mechanism by which the vacuum energy density is
reduced by the presence of mass-energy has already been discussed.
The reduction in the local value of the hyperspherical potential is
explained in terms of the projection of its gradient within an altered
spacetime. The alteration in the imaginary momentum is also explained
in terms of its projection within the same altered spacetime.
The retardation in the local rate of cosmological expansion which
manifests itself as a linear increase in the loss of synchronization of
clocks separated by a difference in gravitational potential and which
according to general relativity is an effect of gravitational time dilation
alone, is on our view on account of the conservation of four-momentum
of the body engendering the gravitational potential. The mass of the
body, as measured from the point of weaker gravitational potential, is
increased by a fraction equal to the fractional change in the vacuum's
zero-point energy density at the point of greater potential, relative to the
point of weaker potential, where the density of this vacuum energy ( in
free space ) is equal to the density of mass energy of a black hole mass
of radius equal to the radius of the body in question which is producing
the difference in gravitational potential. One might justifiably ask about
any 2nd or higher order effects which could arise out of the particular
cosmological vacuum mechanism that we propose for the gravitational
field. For instance, if the time rate of decrease in the energy density of
the vacuum is suppressed (relative to its "free space" value) in regions of

spacetime possessing massive bodies, then wouldn’t one expect a kind
of "piling up" of vacuum energy in those regions of spacetime where
general relativistic time dilation is locally strongest in such a manner
that a repulsive gravitational field develops? (cf. Dr. Brian L. Swift.)
The relationship in general relativity between mass and curvature where
increasing curvature leads to increasing mass as well as increasing mass
leading to increasing curvature has an analogy within our theory of
gravitation based on spatiotemporal variation in vacuum energy density.
Within our theory, decreasing vacuum energy density leads to increasing
mass and increasing mass leads to decreasing vacuum energy density.
Within our theory, the role of the metric tensor components, gik,
corresponds to the 2nd partial derivatives of vacuum energy density with
respect to the variables x,y,z, ict. The stress-momentum-energy tensor
of general relativity corresponds to the 2nd partial derivatives of the
mass, or nongravitational binding energy density within our theory. The
1st partial derivatives are not sufficient to provide the mathematical
structure needed to describe the spatiotemporal variations in the vacuum
energy density responsible for the parasitic gravitational force. We must
remember that Newton's third law of action-reaction is modified within
relativity theory and that it does not strictly hold within this theory. No
gravitational forces must lie along any 3-hypersurface of simultaneity
within 4-dimensional spacetime. It is easy to see why this is so when
one considers two distinct points which are gravitationally coupled, i.e.,
connected by a geodesic arc.
The time rates of change in the vacuum energy density at these two
spacetime points differ by an amount related to the differential severity
of gravitational time dilation (relative to some arbitrary 3rd point in
spacetime) and so there is a variation in the time rate of change of
vacuum energy density as one moves along the geodesic arc connecting
these two points. We believe that the role of the curvature tensor within
general relativity is to fix the relationship of the metric and momentum-energy tensors with respect to the condition of spacetime at the arbitrary
point within it where the observer is located. Are two gravitationally
coupled points within spacetime linked by a geodesic arc of the

spacetime, or are they linked by an arc length of null spacetime interval,
where ds = 0? Suppose a spin 0 particle decays into two spin 1/2
particles of opposite sign (so as to conserve the spin quantum number),
and the two spin 1/2 particles become separated by a great distance;
when a quantum spin measurement is then performed upon one of the
two particles, the wavefunction which describes both particles
"collapses," so that the spin orientation of the unmeasured particle
must instantly become opposite to that of the spin orientation observed
in the measurement of the former spin 1/2 particle. This EPR (Einstein-Podolsky-Rosen) type
gedanken experiment, performed within a curved spacetime raises an
interesting question concerning the wavefunction which describes the
two particles, as this wavefunction takes two different forms at two
points along any segment of a curved spacetime. If the communication
between the two spin 1/2 particles is nonlocal and hence
"instantaneous," then the wavefunction experiences a discontinous
change at the point in spacetime occupied by the second particle, i.e., the
wavefunction as expressed within spacetime B is instantaneously
expressed in terms of the nonlocally connected spacetime A, where
measurement of the spin of the first particle was performed; in this way
the spins of the two particles would add to zero, resulting in spin
remaining a "good" quantum number. However, the only way to avoid
the appearance of discontinuity (of the wavefunction) , in this case, is to
postulate the existence of a physical description which is more
fundamental than the wavefunction itself so that the wavefunction
becomes but the projection, within a given local spacetime, of the more
fundamental physical description, which itself remains continuous.
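The perfect anticorrelation invoked in this EPR scenario is a standard property of the spin singlet state; the following sketch assumes ordinary flat-spacetime quantum mechanics and simply verifies that the spins add to zero along any common measurement axis (numpy, spin operators in units of hbar/2).

```python
import numpy as np

# singlet state of two spin-1/2 particles: (|up,down> - |down,up>)/sqrt(2)
singlet = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

# spin operator along direction theta in the x-z plane (units of hbar/2)
def spin(theta):
    return np.array([[np.cos(theta), np.sin(theta)],
                     [np.sin(theta), -np.cos(theta)]])

# correlation <S_a (x) S_b> for measurement axes a and b
def correlation(a, b):
    return singlet @ np.kron(spin(a), spin(b)) @ singlet

# same axis: outcomes perfectly anticorrelated, total spin remains zero
assert np.isclose(correlation(0.0, 0.0), -1.0)

# general axes: E(a, b) = -cos(a - b), the standard singlet correlation
assert np.isclose(correlation(0.3, 1.1), -np.cos(0.3 - 1.1))
```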
June 1997

If such a more fundamental description of the quantum
mechanical system exists, then why is the reduction of the wavepacket
or collapse of the wavefunction itself necessarily accompanied by a
discontinuous change in the probabilities for observation/measurement
of physical observables? We might rather assume for consistency's sake
(that of QM) that the wavefunction describing the particle pair must
undergo a "self-collapse" when some critical separation of the particles
is reached - a separation at which the difference in the representation of

the pair, in terms of its wavefunction expressed within the local
spacetimes of either particle of the pair, has reached some critical value.
This critical value would, according to Penrose, be related to the mass-energy difference of the spacetimes in which each particle is embedded.
Perhaps as long as this mass-energy difference is less than the most
energetic massless particle which can be defined within a self-consistent
theory of quantum gravity, say, the mass-energy of a Planck particle of
some 10^-8 kg, there is no necessity that the wavefunction describing the
particle pair undergo what Penrose terms "Objective Reduction" (OR),
because, perhaps, the energy difference up to this critical value of mass-energy can be compensated through the exchange of a massless quantum
(boson), i.e., through exchange of a virtual particle representing a
vacuum 3-momentum fluctuation. Another possible explanation of the
objective reduction of the pair's wavefunction is related to the overall
energy uncertainty of the component of the quantum vacuum of both
particles. This is to suggest that when the difference in mass-energy of
the local spacetimes of both particles exceeds the energy uncertainty of
the nonlocally connected component of the local vacua of the particles,
objective reduction of the pair's wavefunction must take place - for
otherwise, the mass-energy difference in the local spacetimes of the
particles has outstripped the nonlocally-connected vacuum's ability to
compensate the disparity in the local spacetime representations of the
pair's wavefunction in the spacetimes of each particle, resulting in the
incommensurability of the quantum numbers of each particle should a
reduction of the pair's wavefunction take place after this critical
difference in spacetimes has been reached - as a result of the spatial
separation of the particles. What has been said thus far suggests that
quantum entanglement, i.e., nonlocal connectivity, of particles or fields
within significantly differing local spacetimes may not be admissible in
a consistent theory of quantum gravity. This, in turn, suggests that
nonlocal vacuum process may not actually be responsible for the
maintaining of particular spacetime geometries or that, there is some
rather small limit to the differences in local spacetime curvatures within
an overall nonlocally connected vacuum. We must investigate the
possibility that the temporality, i.e., the rate of time's passage relative to

cosmic time, of a local spacetime is directly related to the nonlocal
connection of the local vacuum of this spacetime to the nonlocally-connected vacuum of the universe at its largest scale.
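For reference, the Planck-particle threshold invoked above is the standard Planck mass, sqrt(hbar*c/G), computed here directly from CODATA values of the constants:

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # Newton's constant, m^3 kg^-1 s^-2

m_planck = math.sqrt(hbar * c / G)   # Planck mass, ~2.18e-8 kg
E_planck = m_planck * c**2           # its mass-energy, the critical value above

print(f"Planck mass   = {m_planck:.3e} kg")
print(f"Planck energy = {E_planck:.3e} J")
```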

The result of this maneuver, however, is that quantum mechanics could
no longer be viewed as a "complete theory," since the wavefunction
would no longer constitute a complete description, in general, of a
quantum mechanical system.
On the other hand, if the expression of the wavefunction remains in
terms of its own local spacetime, then there is no unique wavefunction
which describes both particles prior to a measurement being performed
on one of the particles, so that the spins of the two particles would not
necessarily add to zero after a spin measurement is performed, with the
result that spin would not be a "good" quantum number within a curved
spacetime. In such a case, the general invariance of physical law within
the theory of relativity would be violated. According to the physicist
David Bohm, in his book, The Special Theory of Relativity, the latent
energy, E = mc^2, which any particle of mass, m, possesses, exists by
virtue of internal motions, which may be thought of as taking place
within the particle, or alternately defining the existence of the particle,
and that the conversion of mass into energy, and vice versa, consists
merely in converting the circular internal motions of a number of
mass(ive/less) virtual particles into a set of linear external motions of a
number of massless real particles, and then converting them back again
into the original set of circular internal motions. It is as though one were
to take a tiny particle in rapid linear motion, bend or divert this motion
so that it assumed the form of a rapid circular motion, so that the particle
now possessed the appearance of a ring, and then utilize a portion of this
circular motion to set the ring rotating so rapidly that the ring now took
on the appearance of a solid sphere, most of which, to be sure, would be
composed of empty space, but which would possess a great deal of
energy by virtue of the two perpendicular internal circular motions
which, in conjunction with one another, defined the sphere's existence,

and then to proceed to undo, or reverse this series of operations,
retrieving the original linear motion with which one started. We know,
of course, that this simple analogy of the "kinetic sphere" is rather naive,
and that it is only intended as a basic model of the interconnected
meshwork of virtual particle reactions, which defines the existence of
any real particle. The important point here is that there is an exact
parallel between this internal circular motion and the linear motion of
massive particles along the imaginary axis of our cosmological model of
the hypersurface which is expanding at the speed of light, or rather, of
the hyperspherical potential energy barrier through which all massive
particles are presently in the process of quantum mechanically tunneling,
at approximately the speed of light. To explore this parallel, we need
to make a relatively simple observation about the relationship of the
internal circular motions to the external linear motions into which they
are converted whenever mass is converted into energy. We notice from
the example of the "kinetic sphere," that it only required two
independent (orthogonal) internal motions in order to define the
existence of this 3 dimensional object. We imagine that this conversion
of these two circular, orthogonal "internal" motions will result in the
creation of two linear, orthogonal "external" motions.
We believe this conversion process is described by an isomorphic group
operation, such that the number of dimensions of motion is conserved,
while the orthogonality of the motions is retained, because the
conversion of energy into mass, the reverse of this operation, is
accomplished through a continuous series of simple Lorentz
transformations, i.e., through the relativistic mass-velocity relationship
of Einstein, and because we know that the conversion of energy into
mass is a reversible (symmetrical) operation so that the conversion of
mass into energy can be described in terms of a linear matrix operation;
i.e., it is group-theoretic in nature.
But we know that the direct
conversion of mass into energy produces an out-rush, if you will, of
released energy which streams outward in 3 spatial dimensions. This
obvious empirical fact seems to require that there be 3 independent
orthogonal, circular, internal motions which underlie the latent energy of

massive bodies, and this implies that massive bodies which possess this
latent energy, E = mc^2, must be, either themselves, 3 dimensional
hypersurfaces binding a 4 dimensional hypervolume, such that the mass
possesses three independent degrees of freedom, or that they must
possess two circular (orthogonal) internal motions, defining two
dimensional surfaces binding 3 dimensional volumes constituting the
massive particles, and that the third orthogonal internal degree of
freedom is that associated with the linear motion of these 3 dimensional
objects which occupy a 3 dimensional hypersurface which is expanding
within four spatial dimensions at the speed of light. There seems to be a
problem, however, in associating all of the latent energy of motion, E =
mc^2, with the linear degree of freedom associated with the cosmological
expansion, simply because it means ignoring the contributions from the
two other internal degrees of freedom, corresponding to the internal
motions of massive bodies. If, however, we assume an equipartition of
energy between the energy magnitudes associated with all three degrees
of freedom, and this seems reasonable because the Lorentz
transformation of special relativity represents a symmetrical operation,
then the energy, mc^2, which is needed to rotate the pure imaginary linear
momentum by 90°, to convert it into a pure real momentum, is provided
by the two energies, 1/2mc^2 and 1/2mc^2, respectively, associated with
the two circular internal degrees of freedom. In this way, the two
energies, 1/2mc^2, combined with the negative kinetic energy,
-1/2mc^2, of massive particles tunneling through the hyperspherical
potential energy barrier, yield the new energy, +1/2mc^2, associated
with the pure real momentum of outstreaming massless particles which
results from the total conversion of a real massive body into energy.
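The bookkeeping of this equipartition argument is simple enough to state explicitly (taking the kinetic energy of tunneling as -1/2 mc^2, per the text; natural units):

```python
c, m = 1.0, 1.0   # natural units

internal = 0.5 * m * c**2 + 0.5 * m * c**2   # two circular internal degrees of freedom
tunneling = -0.5 * m * c**2                  # negative kinetic energy while tunneling

# the 90-degree rotation of the imaginary momentum costs mc^2, supplied internally
assert internal == m * c**2

# net energy associated with the pure real momentum of the outstreaming quanta
assert internal + tunneling == +0.5 * m * c**2
```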
We now see that the energy of massive bodies, mc^2, may be thought of
as stemming, alone, from the internal motions defining these bodies,
which is released whenever these circular internal motions, i.e., the
energy circulating within the feedback loops of the virtual particle
reactions composing the massive bodies, is "deflected" into the linear
motion of real massless particles. An additional bonus from these
considerations is that it is now possible to see that the distinction

between virtual and real particles is not a fundamental one. Mass, on
this view, is simply a function of the topological structure of the virtual
particle reactions which occur everywhere within the quantum
mechanical vacuum on account of a fundamental energy uncertainty of
the vacuum state which, in turn, stems from the fact that the Hamiltonian
of the vacuum is, itself, a function of the "incompatible" observables,
position and momentum. Moreover, the massless force-carrying
particles, i.e., bosons, which are the end product of any total conversion
of matter into energy, exist solely by virtue of their interaction with the
vacuum state, and in no way depend upon, or are defined by, any
self-interaction. Consequently, these massless bosons, e.g., photons, can be
considered to be virtual particles even though they are capable of being
observed. In other words, the mass which a given volume of space
possesses is merely a function of the imbalance in the ratio of the volume's
self-interaction to its, if you will, not-self-interaction: in free space,
where no matter is present, the flux density of energy exchange between
the interior of an arbitrary volume with itself and the flux density of
energy exchange between the interior of this volume and its exterior, is
delicately balanced. It is the alteration of this balance in favor of greater
self-energy exchanges, which engenders the phenomenon of mass. The
self-energy exchanges correspond to the energies of circular internal
motion, discussed earlier, which we invoked as a simplistic model of the
interconnected meshwork of virtual particle reaction paths defining the
existence of massive bodies. There is a very convenient mathematical
description of this self-energy and so-called not-self-energy exchanges;
these are, respectively: the energy density and the pressure of the
vacuum. This balance of external and internal vacuum energy
exchanges is exact, indicating that the condition of free space obtains,
therefore, when the pressure and energy density of the vacuum are equal;
and it is on this condition that the speed of light has its maximum local
value, as seen from application of Mach's formula for the speed of
pressure wave oscillations within a material medium. There is no reason
why we may not apply Mach's formula in this case because the only
essential difference between the propagation of pressure oscillations in a
material medium such as the Earth's atmosphere and such oscillations in

the vacuum is that of Lorentz invariance, i.e., the value of the speed of
sound within a material medium is dependent on the state of motion of
the observer performing the velocity measurement, while the velocity of
"sound," i.e., light, within the vacuum is itself independent of the state of
motion of the observer within this vacuum.
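The claim that the propagation speed peaks when pressure balances energy density can be illustrated with the Newton-Laplace form of Mach's formula, v = sqrt(dp/drho). Assuming a linear vacuum equation of state p = w * rho * c^2 (with rho the mass-equivalent density; the linear form is an assumption made here for illustration), the wave speed reaches c exactly when w = 1:

```python
import math

c = 2.99792458e8   # m/s

def wave_speed(w):
    """Pressure-wave speed for the equation of state p = w * rho * c^2:
    v = sqrt(dp/drho) = sqrt(w) * c."""
    return math.sqrt(w) * c

# pressure equal to energy density (w = 1): oscillations propagate at c
assert abs(wave_speed(1.0) - c) < 1e-6

# any pressure deficit relative to the energy density gives a slower speed
assert wave_speed(0.5) < c
```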
Another way to point up this difference between the vacuum and an
ordinary material medium is to note that the particles within the vacuum,
being virtual particles, do not possess a continuous existence with time
and so cannot be chosen as the origin of an absolute frame of reference
within which the velocity of pressure oscillations of the vacuum might
be measured relative to an observer's state of motion. An ordinary
material medium such as the atmosphere is composed of particles which
possess a continuous existence and so an absolute frame of reference
may be established within this medium by which the velocity of pressure
oscillations of the medium may be measured which is then dependent
upon the state of motion of the observer.
The ict axis of Minkowski four-dimensional spacetime may be
understood not to represent a physically real 4th dimension, in an
analogous sense to the other three familiar spatial dimensions, but that it
merely functions as an abstraction within the formalism of special
relativity to model conservation laws, e.g., energy, momentum, etc., and
the linear transformation law connecting inertial reference frames (the
Lorentz transformation) which, in turn, govern the general relationship of
the internal and external motions of real/virtual mass and field energy
within a universe of three spatial dimensions and one time dimension.
Perhaps now it is easy to see why it is not necessary to underpin the
mathematical structure of special relativity with a physically real 4th
dimension. It is perhaps possible to remain consistent with Einstein's
and Minkowski's view of time as being associated with what is merely
an abstract (not physically real) dimension. If the Universe is not, in
fact, an expanding 4-dimensional hypersphere, then the Hubble
distance-velocity relationship for galactic recession requires the
existence of a repulsive cosmological force field whose force increases

linearly with galactic separation. The gradient of the hyperspherical
potential, postulated earlier to explain, in part, the imaginary coefficient
of the ict axis, would itself then have to be interpreted as a manifestation
of the negative time-rate-of-change in the energy density of the quantum
mechanical vacuum which occurs due to the global cosmological
expansion. Moreover, the continuous series of local Lorentz
transformations which may be thought to connect two non-inertial
reference frames (centered about two points in space of differing
gravitational potential) would be understood in terms of a continuous
tensor transformation of the four independent components of
spatiotemporal variation in vacuum energy density, i.e., the gravitational
energy gradient (spatial variation in vacuum energy density) in
conjunction with the temporal variation of vacuum energy density,
connecting the two non-inertial frames of reference.
The equivalence between spatiotemporal variations in vacuum energy
density and variations in spacetime curvature may be more simply
grasped by examining two different expressions of the Heisenberg
uncertainty principle within curved spacetimes. These two expressions
are ΔEΔt > h and ΔpΔx > h.
We know that within a curved spacetime, say, in the vicinity of a
massive spherical body, there is a general relativistic length contraction
along the spherical body's radial direction while at the same time there is
relativistic dilation of time. If we are considering virtual particles, then
the > sign appearing in the two formulas, above, may be replaced by an
= sign, so that a dilation and a contraction in the variables, Δt and Δx,
respectively, must be coupled with an inversely proportional shrinkage
and dilation in the dual variables, ΔE and Δp, respectively. In this way,
the energy of the vacuum decreases as one moves into regions of
increasing gravitational potential while the momentum of the vacuum, if
you will, increases along this direction. If the vacuum momentum is
correctly described by a four-vector of conserved magnitude, then the

vacuum momentum may only increase with increasing strength of local
gravitational potential at the expense of a compensating decrease in the
vacuum's momentum along an orthogonal direction. It is the decrease in
the vacuum's momentum in the direction orthogonal to the radius of our
spherical massive body with which we must associate the decrease in the
vacuum's energy along the body's radial direction. So we obtain what
perhaps appears to be a trivial result: the momentum of the vacuum
along a certain direction may only be increased by utilizing the energy
of the vacuum itself associated with its momentum in directions
orthogonal to the direction of increasing momentum, so that local mass
distributions do not, themselves, provide the energy required to support
the existence of a local gravitational field; the effect of mass is merely to
redirect the vacuum momentum, utilizing the locally available energy of
the vacuum itself; to put this in the language of Einstein: mass does not
produce spacetime curvature, it locally alters the global curvature of
spacetime. This may all seem like an exercise in splitting hairs, but
there is an important difference in these two interpretations in the
relationship of mass to spacetime curvature: if mass, or what amounts to
mass, alone, is responsible for the existence of spacetime curvature, then
an "empty" universe may not possess a globally curved spacetime. On
the other hand, if mass merely locally alters the background spacetime
curvature, then there is nothing to prevent the existence of so-called
empty, curved spacetimes.
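The coupled scaling of the uncertainty pairs invoked earlier (ΔEΔt and ΔpΔx saturated for virtual quanta) can be sketched as follows; the fluctuation scales and the weak-field dilation factor are assumed illustrative numbers, not derived quantities:

```python
hbar = 1.054571817e-34   # J s

# free-space vacuum fluctuation, saturating the uncertainty relations
dt, dx = 1.0e-21, 1.0e-13          # assumed illustrative scales, s and m
dE, dp = hbar / dt, hbar / dx

# near a massive body: time dilates, radial lengths contract (same factor here)
gamma = 1.0 + 1.0e-9               # assumed weak-field factor
dt_g, dx_g = dt * gamma, dx / gamma
dE_g, dp_g = hbar / dt_g, hbar / dx_g

assert dE_g < dE   # vacuum energy uncertainty falls with depth of potential
assert dp_g > dp   # vacuum momentum uncertainty rises along the radial direction

# both products still saturate the uncertainty relations
assert abs(dE_g * dt_g - hbar) < 1e-45 and abs(dp_g * dx_g - hbar) < 1e-45
```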
It is not correct to say that energy and information are interdefinable so
that if energy is a conserved quantity, then information is also a
conserved quantity. A simple counterexample suffices here. It is
possible for transitions to occur, within a gas, say, where both the
entropy and the energy of the gas are conserved, even though the
different configurations between which the transitions occur may be
thought of as representing different quantities of information so that
information is not itself conserved. The notions of energy and entropy
are separable from the notion of information because the former are only
definable with respect to a closed system of a finite number of distinct
state space configurations while the latter is always defined with respect
to something outside the system in which its coded configuration is
defined. It is not possible for one thing to represent another unless there
be at least two distinct levels of description available to the system
within which the representation is to be constructed. If we waive the
requirement of an "external" observer who is to give different meanings
to the different configurations, then information and energy are not separable.
Another reason for not equating the two, i.e., energy and information, is
on account of the existence of energy degeneracy. Since different
wavefunctions may possess the same associated energy eigenvalue,
it should be possible for a quantum mechanical system possessing
energy degeneracy to undergo arbitrary transitions from one degenerate
eigenfunction to another without the changes being associated with changes
in any definable QM observables. 07/98
I have found that under certain unusual sleeping conditions such as the
presence of bright lights, chemical stimulants (caffeine or a nicotine
patch), after having previously taken a standard dose of melatonin, and
especially after having returned to sleep just after being briefly
awakened, that I am able to experience dreams in which I become aware
that I am, in fact, merely dreaming and that none of what I am
witnessing and doing is real but is something which I am manufacturing
out of my own consciousness, including myself as one of the
participants in the action. But precisely when this happens, when my
consciousness of my role as creator of all I survey becomes total, the
events I am witnessing begin to lose their independent character and the
whole scene begins to quickly dissolve whereupon I immediately
awaken. It is as though my simultaneous existence as subject and object
is outlawed as paradoxical, except in the partial sense of my being a
mere participating "character" in the unfolding action. This is perhaps
because the "currents" which power the dream become like the
tributaries of a river whose flow must quell upon reaching the level of
the river's source; as long as the "self" which is creating the dream phenomena (the source) is "higher" than the "self" which participates in
it on equal par with the other "participants" (the tributaries) the action
continues in a natural and uninterrupted manner. In the fleeting moments
of near total consciousness, before the dreamscape has had adequate
time to disintegrate, it usually occurs to me to try to do something,
which is considered impossible in waking life. I may cause various
objects around me to leap into the air or explode or I might even move
myself to hover in the air and fly about. But usually there is barely time
even to begin to try out my new "powers" before I am forced to awaken.
Overwhelmingly, such miraculous acts as these I freely perform in the
normal state of dreaming consciousness in which I have not the slightest
clue that I am doing anything extraordinary.

“Furthermore, we have even developed a basic understanding
about how these afferent-efferent connections are blocked,
normally during REM sleep, allowing the brain to self stimulate
and generate the strange "offline" realities we experience during
dreams (Gottesmann, 1999; Maquet, 2000)”. . . Thus, we can
describe the brain, metaphorically, as a kind of “virtual reality
generator”, which allows the environment outside the brain to be
experienced inside it. This "out-of-brain" world comprises not only
the body's external environment, but also the internal environment
of other organs outside the brain (actually, we are going to
demonstrate that the brain's "virtual system" generates not only a
virtual world, but also a virtual self in the center of this virtual
world). Each brain generates this virtual world and self using the
afferent stimuli, external and internal to the body, and the virtual
self produces virtual decisions and actions that will affect our body
through efferent outputs (Merker, 2005). . . “1- THE "BRAIN IN
THE VAT" ARGUMENT: as brains do not exist in isolation from
their surrounding environment ("inside vats"), it makes no sense to
postulate that what we experience as our day-to-day reality was
generated by and inside our brains (Dennett, 1991)”. April 2014 The
inputs and outputs to the “virtual reality generator” of the brain,
however, possess a distinctive structure that is peculiar to a
linguistic grammar, which necessarily differentiates interaction of
the person’s brain via coded impulses with other intelligent entities
possessing brains from the mere interaction of his brain with an
unintelligent environment populated by a cadre of automata or
“philosophical zombies.” Consciousness on this view is a reified
projection of an ego qua sociolinguistic construct, rather than being
a kind of unique and mysterious “thinking substance”. (One is
reminded here of how it is very difficult to fake an authentic-sounding laugh.) In other words, one cannot achieve self-consciousness, which is the precondition of reified consciousness
as the medium of conscious experience in the absence of a
properly robust sociolinguistic context. On an underlying physical
level, one would expect “grammatical structure” to be coded in
terms of a set of programmed actual and potential quantum
entanglements that provide an encrypted signature of quantum
correlates of conscious states. Descartes’ “thinking substance” here
is a kind of hard-encrypted quantum entanglement signature
possessing an open-system topology that is altogether distinct from
that of a closed system machine intelligence. For one, time is
spatialized in such a closed system, whereas temporality is a
genuine possibility for an open system. For another, closed
quantum mechanical systems do not decohere, being isolated from
the environmental influences (to include vacuum electromagnetic
field fluctuations) that otherwise induce this decoherence, or wavefunction collapse. Coherence, cohesiveness, robustness, unity,
integration, reflexivity, and so on, are system qualities that cannot
be supported within a closed form computational state space, i.e., a
state space that is not “context-embedded”.
So chance
combinational/permutational shufflings cannot bring about new
states of the system which are capable of sustaining themselves
against a background of perturbing fluctuations, i.e., “noise”. It is
true that “without context there is no meaning”, but the converse of
this is also true: “without meaning there is no context”. Clearly,
structures of quantum entanglement of real or simulated
computational state space states cannot be included as additional
states within the preexistent state space, for this leads to an infinite regress.

imagine, and perhaps even build, organisms and machines that can
behave in a way similar to humans, but without having an internal
(and conscious) "virtual reality", evolution had no reason to have
selected the development of "neuro-virtual" processes inside our
brains (Chalmers, 1996). The answer to this argument is related to
the answer to the “Brain in a Vat” argument. @$What functions
analogously to “grammar” here is the notion of supervenience or of
supervenient causation. The Gödelian flexibility to “jump outside
the system”, e.g., throw out the current set of postulates and
axioms, rules of inference and even to assume the truth of
theorems not provable from the newly adopted logico-deductive
system and what is more, to guess well the time to do this, is just
the kind of neural flexibility a person needs, which I believe
demands supervenience. Causal supervenience could be serviced
and supported by sufficiently sensitive microstructures in the brain
being slightly altered so as to come into resonance with the
vacuum electromagnetic field in such a way as to facilitate
information exchange between “brain and creative void” so as to,
in turn, facilitate the otherwise relatively diminished capacity of a
classical, unsupervised neural network to reprocess its data into
higher order structures that more aptly simulate the external world
along with the mental worlds of would be competitors for limited
resources. We already know from Bohm’s causal principle that the
fluctuation-correlation subspectrum representing deterministic
cause-and-effect constitutes but a tiny subdomain within the total
 c.f., The Phenomenal World Inside the Noumenal Head of the
Giant: Linking the Biological Evolution of Consciousness with
the Virtual Reality Metaphor.

Sam Bankester My internal virtual reality generator is quite mad

Lime Cat Yeah, but it really only got out of control after you realized
that you have one - a probable case of system instability provoked by
outputs feeding back into inputs!
Sam Bankester Oh wow..if the internal virtual self is aware he exists,
creating a strange loop, then the possibility arises that strange loop
entanglement between internal and external selves is taking place every
time we dream. Ahhhhh!!!!! Of whom is each self aware? Tiny
explosion just erupted somewhere in my hippocampus. Pretty sure I'm
permanently baked now.

Lime Cat Nah, you've been ready for that insight for a long time now.

 Lime Cat For my part, I am still trying to live down this bizarre
notion of my having all along been a virtual Lilliputian living
within the neural network simulation of some alien
Brobdingnagian brain!


Lime Cat [shared link: “Gulliver's Travels – Wikipedia, the free encyclopedia”]

Lime Cat Of course, this lumbering Brobdingnagian is not actually self-conscious; rather, his brain has created a simulation containing a virtual
self along with a consciousness of and referring exclusively to…this
virtual self. So self-consciousness is really only "self"-consciousness,
i.e., it is not actual, but merely metaphorical.
Jesus Christ informs us that were we to have a mustard seed's faith we
would be able to hurl mountains into the sea. As Christians we cannot
help but take him at his word on this. Perhaps here we are now able to
glimpse a fundamental difference between certainty and what we call
faith. Usually Christians equate the two in some fashion, but confer
with such common exhortations as "You cannot be certain of this, but
you must have faith." If ordinary parlance can clarify the point I want to
make it can also obscure: "I'm not as certain of this as you" vs. "He has
more faith than I." That is, we commonly speak both of degrees of
certainty as well as degrees of faith. Yet in the most literal sense faith is
something which indeed admits of degree; certainty, on the other hand,
doesn't - you are either certain about something or you are not, in which
case, you are entertaining doubt. But if faith and certainty are different
in the way I describe, then one cannot have faith and have certainty at
one and the same time. But we just said that if we are not certain then
we must in some way entertain doubt. Faith can exist in the face of
doubt, but it takes a faith, which is completely free from all doubt for us
to budge mountains from their very foundations. We cannot, of course,
have certainty that any thing in particular shall come to pass unless we
have the type of assurance which only God himself provides in the form
of Scripture or Revelation. Can we distinguish Jesus and God the Father
in the following manner: Jesus is the incarnate, humble God; we might
say that the Angels and ourselves witness his glory in Heaven, though it is not experienced by Himself personally. God the Father is Himself not a humble Being, having never experienced the humiliation which Christ experienced for us on the Cross, but experiences this Glory in an intimately personal way which we will never be able to imagine; by
meditating upon it, however, we may learn to marvel at the mere idea of
it. Jesus says that we must come to him as little children. What this
means is that our belief in Him is based on our faith and has nothing to
do with our intellects and so what we call certainty is not here the
relevant quantity. We should not therefore envy those who seem to
possess intellectual certainty concerning their Christian beliefs as still
more clever arguments to the opposite side may be waiting just around
the corner. And one is in a Catch-22 situation here because if one
possesses the headstrong pride to resist the seductive and highbrowed
attacks upon one's own intellectualized beliefs, then one holds these
beliefs for the wrong reasons. On the other hand, if one gives way to
"superior" argument, all is lost; if indeed anything worthwhile was
risked in the first place. Faith can exist in the face of the gravest of
doubts, c.f., "Oh Lord, please give me faith now in my moment of doubt."
Intellectual certitude is toppled by even a slight suspicion. It is
characterized by ignorance of the subtler complications at still deeper
levels and dismay is perhaps experienced when they are suddenly
uncovered, threatening what were thought to be pat arguments.
However a Christian who has tried his hand at intellectualizing his
beliefs but who has now learned to base them on faith, probably retains
some respect for their intellectual dimension although he now trusts that
the hidden complications must ultimately work out in his favor. The
Sophists of ancient Greece enjoyed indulging in disputation and debate
for its own sake alone and used to pride themselves on being able to make the worse side of an argument seem the better one. Although such people are still around today, intellectual justifications
need not even be attacked to lose their quality of conviction as they have
a natural half-life which is on the whole quite short; the newness of
discovering or inventing them must soon enough begin to wear off. The
Christian beliefs which they were thought to support are now weakened
although the intellectual vanity which they are really in service to merely goes on to still subtler or fancier arguments. Christian apologists
must realize that part of what motivates them to put pen to paper is
identical to that which motivates any writer, that is, intellectual vanity
and pride. It is an inevitable fact that if one were ever to become
conscious of not being a proud person one would at that very same time
become proud about it. If I work a miracle I do so in complete humility;
as soon as I become conscious of myself as the author of this great sign
the conduit of my power becomes squelched. The faith of the mustard
seed which allows me to hurl the mountain is intimately connected with
my conviction that it is not myself which is the source of this terrific
power, but God Himself. This complete humility which is inextricably
one with the faith necessary to perform miracles can only exist if it is not
conscious of itself. As we know a miracle involves the suspension of
natural law which normally prevents it. We also know that natural law
can only be formulated concerning objects or processes which are in
principle reproducible as reproducibility depends in turn upon what is
called reducibility. This is why we refer to deterministic laws describing
reproducible phenomena as reductionistic. Reductionistic laws are
"closed" form expressions of physical relationships which is to say they
are mathematical. "Closed" because the expression contains in an
abstract way everything that can possibly occur between the terms
related to one another by the expression - all that is needed are the initial
and boundary conditions; nothing is able to come in from "outside," as it were, to disrupt or complicate the lattice-work of physical law.

And this is just the point. In interacting with a truly unique entity or
system I am not able to independently vary the circumstances in which I
observe this system unless I am capable of distinguishing what are the
dependent from what are the independent variables and to succeed in
doing this I must be able to vary these experimental circumstances in a
way which can be disentangled from the resulting changes in the
behavior of the system itself as opposed to the conditions of the
experimental setting. I can do this if the previous state of the system is
made not to carry over and influence the experiment at a later time when
the system is in its new state, c.f., au=Rupert Sheldrake, cit=The Presence
of the Past. Essentially this can only happen if the new state of the
system is just the state of some different but identical system so that
these states become decoupled from each other thus allowing that
mutual relationships among the dependent (internal) variables to be
distinguished from the relationships of the dependent variables to the
independent (external) variables which can now in this way be
manipulated independently. So "closedness" and "openness" of a system
are not merely convenient descriptive labels but quite aptly describe
genuine topological relations between the system variables. Should
there exist variables which can be varied independently but which we
are not able to control, then usually this is a symptom of over-complexity of the system, which is best overcome by the defining of
various "random" variables which are then treated statistically.
However, this leaves the possibility of independent variables which are
neither random nor controllable by us. These would be undefinable or unknown variables and represent influences upon our
system which issue from some hidden region "outside" the space-time
continuum. Naturally, such systems will not be reductionistic and hence
they are not reproducible but instead, are unique indeed. Every act
which issues from the Will of God is necessarily a mystery to all but
Himself. But the difference is that a good apologist knows how hopeless
it is to completely subdue this monster and so exploits it in the service of
a lofty purpose: that of hopefully turning the intellectual vanity of
others against their unbelief for just long enough so that something of
real convicting power might be given its chance to work, which is to say,
God's grace. By a natural extension of this analogy we may easily
illustrate the major difference between what might be termed the Eastern
versus Western view of God and Man as well as their relation to one
another. The eastern view of this relationship seems to be that one participant in the dream is secretly the dreamer himself and that even though the dream eventually passes it
is soon enough replaced by yet another where this special participant
assumes some other possibly quite different form and perhaps even
recalls events and participants from some previous dream.
June 2013 @$

The greatest necessity for providential intelligent design is
pointed up by the plurality of consciousness, i.e., consciousnesses, each
of which possesses recursive privileged access and sense data
incorrigibility as there must be supervenience of being (of some kind)
over these multitudes of consciousnesses. The only potential candidate to
provide said supervenience (c.f., philosopher Daniel Dennett) constitutes
a “field of one”, which is to say, God. Consciousness works for
everybody and not just for one person, which would indeed have been
the case for a solipsist who did not indeed possess the creative powers of
a bona fide deity.
Although reincarnation is usually associated with this kind of conception
it is not really consistent with the full-blooded Eastern view that the
individual soul is just an illusion; if I am, on account of this doctrine, not
really different from you or anybody else for that matter, then neither am
I any more like one particular person who lived at any time in the past
than I am from any of his millions of contemporaries unless, of course, I
secretly subscribe to the more or less theistic view that there is, indeed,
such a thing as an individual immortal soul. If I were to claim that I am
the reincarnation of Thales, the founder of Greek philosophy, then you
may make the deduction from this that I could not then be the
reincarnation of Lao-tzu, the founder of Taoism, as he and Thales were contemporaries; but then this in turn implies that the distinction between Thales and Lao-tzu cannot secretly be an illusion and this is contra hyp.
Salvation for the Easterner is effected by the shedding of this illusion of
individuality; the paltry and minuscule particle of being returns to its
source which is Being itself. Here the Ego is identified with self-consciousness. On the Western, theistic view, on the other hand, the
Ego and the individual Soul are indeed two distinct entities, the Ego
being a false aggrandizement of one's true individuality (the Soul),
alienating it from the Divine Will and known as Sin which is Pride.

Salvation for Christianity is effected by the removal of Sin (Pride) with
the result that the individual Will and the Divine Will become One. We
should consider in this connection the puzzling statement in the book of
Ecclesiastes, that the spirit returns to the Father who made it.
Ecclesiastes 12:7 "Then shall the dust return to the earth as it was: and
the spirit shall return unto God Who gave it." (August 2012 This is
apparently without regard to the ultimate fate of the soul). Another
important distinction here is the manner in which Salvation is effected.
For the Easterner, Salvation can be achieved through the individual's
own efforts, through the practice of various types of self-discipline, and
concerns itself not so much with seeking a desirable state in the Afterlife
as with the avoidance of suffering in an endless cycle of Death and
Rebirth to which one is confined through the influence of Karma.

Perception is largely governed by expectations conditioned by
previous experience. Memory is selective. The tale grows in the retelling. The most we can mean by "objective reality" is intersubjective
reality. December 2012 However, the subjective cannot be analyzed in terms
of intersubjective categories; that is so by definition, from which it follows that (1) the subjective is not objectively real and (2) what is called the real contains more than the merely objectively real. If the objective and
subjective are truly disjoint, then how are they to be related together?
Only by being meaningfully related (please pardon all of the italicized
“foot stomping”; it is not intended to be patronizing) together, of course,
that is, by the grasping of a subjective insight by a solitary mind….
Language is incapable of conveying actual substance, or content, but can
only convey distinctions and similarities at multiple levels of
abstraction. Human suggestibility December 2012 and auto-suggestibility
(these go hand in hand, e.g., guru vs. initiate, mentor vs. protégé,
sleeping prophet, entranced yogi, etc.) … is always a factor in the
seeming immediacy of interpersonal communication. The commonality
of language and environment structure creates the illusion that our
perceptions of this environment are similar, if nonidentical to those of
our neighbors. The vagueness and confusion, if not the outright
contradictory nature of, our conceptions and beliefs is never fully
revealed to the individual in the absence of problem-solving
applications. The courtesy and civility of strangers and acquaintances are
frequently mistaken for genuine caring and regard for oneself and one's
interests. The continual appearance of so-called "coincidences," which
make life seem more intimately related to the individual than it is in
reality, is largely an artifact of the filtering of sensory/perceptual data in
the light of recent learning, still reverberating within the individual's
subconscious. Altruism is an investment in the future caring and
kindness extended to oneself. (One assumes that investing in raising
one's self-esteem through the practice of generosity shall someday reap a dividend in the form of like generosity from others.)
Originality is frequently a combination of sloppy scholarship and an
imperfect memory. Superstition is a metaphorical, inductive
generalization upon coincidentally connected events.

“Your brain is not like a computer. It’s well laid out and energy
efficient, but it’s really a survival machine that has been optimized over
millions of generations of natural selection and evolution to help you
survive, even lying to you when necessary in the interest of keeping you
alive. That’s a central principle of how brains are organized”, c.f., The
Neuroscience of Everyday Life, Sam Wang, Ph.D.


We do not change. We simply embark upon new self-deceptions.

December 1999

Around 2015 rocker Michael Jackson will undergo the
lengthiest surgery to date to become the world's first human cyborg. He
will be plugged into the next generation of the internet utilizing biochips
implanted in his brain. Twenty years later Michael will have his brain
removed, placed in a vat of some nutritive media, and with his brain
connected by myriad electrodes to a quantum neural network computer,
will manage to more or less download himself into a still later
generation of the internet composed partly by computers capable of
directly interacting with the quantum vacuum of spacetime. By 2060
AD, Michael will be the undisputed ruler of cyberspace. By this time a
whole generation of persons will be spending most of their time in this
simulated world of cyberspace. Many of these will decide to live the
rest of their conscious lives there. Michael will become in a very real
sense a God by this time. By the time a move is made to destroy his
brain, he will have succeeded in totally incorporating his mind into this
quantum vacuum (Inter) net and will possess genuine, if dubious
immortality. By this time, it will be discovered that existence had
already been taking place, prior to the emergence of the first generation
Internet, for countless millennia within an ancient Internet established by
our true ancestors millions of years ago. What we had all this time
interpreted as the precise mathematical physical laws of nature was
merely the manifestation of ancient cyber technology and programming
engineered by those distant ancestors of ours, hideous, tentaculated, if
not totally formless, beings of pure energy. It will only take the
simulated reality (Internet) approaching the subtlety and complexity of
this ancient reality simulation for this to become apparent centuries
before the dawn of 3000 AD. As Carl Sagan has said, if the Universe is
billions of years old, then the average age of civilizations in the Universe
would be on the order of 100's of millions of years. So what are the
chances then, that ours is only five or six thousand years old? Given a
Gaussian distribution of civilization age, ours would occupy the
0.0000000001st percentile. What are the chances of that? By the year
3000 AD, our civilization will have discovered its true age - perhaps
1,000,000,000 years or more! Hopefully Y2K is only a glitch in the topmost (incomplete) level of simulation and does not exist on any deeper levels,
i.e., Y2M, Y2G, etc., or we might all be in for a real surprise!
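The percentile claim above can be sanity-checked numerically. This is only a sketch: Sagan's remark fixes just the order of magnitude of the mean, so both the mean (100 million years) and the spread chosen below are illustrative assumptions, with the spread picked so that the tail lands near the quoted figure.

```python
import math

def lower_tail(x, mu, sigma):
    """P(age <= x) for a Gaussian N(mu, sigma^2), using erfc for
    accuracy in the far lower tail."""
    return 0.5 * math.erfc((mu - x) / (sigma * math.sqrt(2.0)))

# Illustrative assumptions (not given in the text): mean civilization
# age 100 million years, standard deviation ~14 million years.
mu, sigma = 1.0e8, 1.43e7
ours = 6.0e3  # roughly six thousand years

p = lower_tail(ours, mu, sigma)
# Our civilization sits about 7 standard deviations below the mean,
# putting p on the order of 1e-12 -- i.e., roughly the
# 0.0000000001st percentile quoted above.
print(p)
```

With a wider spread the percentile grows rapidly (at two standard deviations it is already a few percent), so the quoted figure implicitly assumes a rather tight distribution of civilization ages.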
Are all signals secretly meaningful, none intrinsically noisy? July 2011 Or
is it just that one man’s noise is another man’s signal? There’s no clear
demarcation between data, information, and executive function signaling
in the brain as these all are quantities of different levels of abstraction,
each possessing the same underlying biophysical reality as the other, or
so we suppose. The probability of a given complex signal belonging to
two or more distinct contexts should be rendered small by virtue of the
complexity potential of the signal carrier code addressing a much greater
number of contexts than which are actually possible. October 2011 @$There
may be a necessity of making a further distinction between data and
information in the light of recent quantum entanglement and
teleportation experiments and in light of Hawking’s notion of
chronology protection. We perhaps need to distinguish between
controllable, intentional information and information merely as
“interpreted data”.
Thought experiments involving quantum
entanglement-based communication between two or more quantum computers are what suggest the need for this additional distinction.
Since the gimmick of “parallel universes” is the favorite mechanism of
choice by science fiction writers for the obviating of time paradoxes, it
might be worthwhile to seek one or more of the essential components of
chronology protection in gedanken experiments connected with quantum
superposition, specifically in the observational/experimental logic
pointed up by “delayed choice”, “quantum non-demolition” and
“quantum eraser” experiments. There are several distinct stages in any
causal chain of events that potentially lead to a “time paradox” where
one could insert a “no-go” theorem in order to head off its formation.
An event horizon of a black hole is an example of chronology
protection. This is because the formation of a “naked singularity”,
which is otherwise permitted would enable backward timelike
trajectories through spacetime. But causality is also violated by any
freely willed act. The causal paradoxes involved in free will are most
likely buffered and defused through the probabilistic quantum structure
of spacetime, in particular the superposition principle and the principle
of the complementarity of incompatible observables, i.e., the fact that
these paired observables are related through Fourier Series and inverse
Fourier Series expansions. April 2014 Take a look at the link below on the
two-way relatedness of quantum mechanics and free will, c.f., “A quantum experiment consists
basically in a device (e.g. an interferometer or a polarizer) that receives
energy (light, electrons, etc.) from a source, and thereafter can produce
two different detection outcomes we denote ‘1’ (if detector D1 clicks),
and ‘0’ (if detector D0 clicks). The outcomes refer to registered results
that are accessible to the experimenter (a human observer).
Consequently, the final results of an experiment after many runs build a
registered sequence of bits: 1,1,0,1,0,0,0,1,0,0…The distribution of the
bits ‘1’ and ‘0’ in this string depends on device’s settings, which the
experimenter can choose. For instance, by changing the settings the
experimenter can cause the outcome’s distribution to go from 45
percent ‘1’ and 55 percent ‘0’, to 70 percent ‘1’ and 30 percent ’0’.
Standard quantum physics establishes two things:
a) For a large number of detections the outcomes fulfill a determined
statistical distribution depending on the settings (for instance: 45
percent of ‘1’ and 55 percent of ‘0’). But the theory does not establish
how many outcomes make a large number of them.
b) For each single quantum event the outcome (which of the two
detectors clicks) is unpredictable in principle for the experimenter.
In this sense it is said that the quantum laws are essentially statistical,
and the order of the outcomes unpredictable. However quantum physics
does not establish that the sequence of outcomes (the bit string) must
happen without any order and be meaningless.
What is more, recent experiments are bringing to light that the
experimenter’s free will and consciousness should be considered axioms
(founding principles) of standard quantum physics theory. So for
instance, in experiments involving “entanglement” (the phenomenon
Einstein called “spooky action at a distance”), to conclude that quantum
correlations of two particles are nonlocal (i.e. cannot be explained by
signals traveling at velocity less than or equal to the speed of light), it is
crucial to assume that the experimenter can make free choices, and is
not constrained in what orientation he/she sets the measuring devices.”
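The quoted setup can be caricatured in a few lines of code. This is only a toy sketch, not quantum mechanics proper: the probability law (a Malus-style cos² dependence on a hypothetical `setting_angle`) is an assumption chosen purely to illustrate the quoted point that the settings fix the long-run distribution of bits, but never any single outcome.

```python
import math
import random

def run_experiment(setting_angle, shots, seed=0):
    """Toy model of the quoted experiment: each run yields 1 (detector D1
    clicks) or 0 (detector D0 clicks). The long-run frequency of 1s is
    fixed by the device setting via an assumed Malus-style law
    p1 = cos^2(setting_angle), while each individual outcome is random."""
    rng = random.Random(seed)
    p1 = math.cos(setting_angle) ** 2
    return [1 if rng.random() < p1 else 0 for _ in range(shots)]

bits = run_experiment(0.8, 10_000)
frac_ones = sum(bits) / len(bits)  # close to cos(0.8)**2, about 0.485
```

Changing `setting_angle` shifts the statistics (say, from roughly 49/51 to 70/30), exactly as the quoted passage describes, yet the registered bit string itself remains patternless.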

(July 2012 Parallel multiverse selves act freely without engendering causal
paradoxes – this is a corollary to the principle of chronology protection
and what should be understood to be a kind of defn=“containment
principle”.) Turning to the theory of the correlation-dissipation of
fluctuations in the vacuum’s energy within the finer “interior structure”
of manifest Heisenberg energy uncertainty, i.e., the “hidden variables”
of quantum uncertainty we may profitably seek to define each stage in
the causal chain of a “closed spacetime loop” in its act of establishing
itself where nature could reliably and opportunistically intervene. The
development of a standing wave structure within a superluminal signal
wave packet allows us to make signal phase velocity effectively
dependent upon the wavepacket group velocity instead of vice versa (the
usual case), provided that the wavepacket is itself built up from the
mutual interference of “retarded” and “advanced” waves, which are in
turn dependent upon the initial and boundary conditions. This
presupposes the notion that wavepackets, which travel with group
velocity can be treated as “phase waves”, c.f., signals propagating with
imaginary momentum during “quantum tunneling”. But this is subject
to the special conditions for wavepacket lifetime in relation to the
wavepacket’s Heisenberg energy uncertainty, i.e., simply that ΔE·Δt ≥
ħ. The distinction between the measurability of real particles and fields
on the one hand and the immeasurability of virtual particles and fields
on the other (except in statistical terms) should be of service here.
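For reference, the textbook dispersion bookkeeping behind the phase/group velocity talk above (an uncontroversial standard relation for a relativistic de Broglie wave, not a claim about closed loops) runs:

```latex
v_p = \frac{\omega}{k}, \qquad v_g = \frac{d\omega}{dk}, \qquad
\hbar^2\omega^2 = c^2\hbar^2 k^2 + m^2 c^4
\;\Longrightarrow\; v_p\, v_g = c^2
```

so a subluminal group velocity \(v_g < c\) forces a superluminal phase velocity \(v_p > c\), which is why phase waves can be "faster than light" without carrying a signal.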
Again, at each crucial step in the would be formation of a closed
spacetime loop (which by the way seems to necessarily invoke a
topology of two-dimensional time), there should be opportunity for a
presumed chronology protection mechanism to intervene. Note that if
causality is really underpinned in higher dimensional time, i.e., 5d, 6d,
etc. spacetime, then closed time loops in 4d spacetime are no real threat
to causality, but merely present the appearance of such to creatures not
possessing the perspectival advantage afforded by intersubjectively
mediated theories of causality within these higher spacetime dimensions.
The final stage at which a chronology protection mechanism could
reasonably intervene, actually is composed of three distinct stages,
namely the following: that of the formation of the observer’s and

experimenter’s ideas, that of the reliable recording and that of rational
application and implementation of these ideas. We would of course like
“chronology protection” to possess an underlying mechanism, but
unfortunately, “mechanism” by its very nature is essentially part of the
general thing which a chronology protection “mechanism” is there to
protect. This suggests that the chronology protection “mechanism” is to
be sought within recursive structures, which would have to possess no
beginning in time (to exempt them from needing to be “chronology
protected” themselves). It is starting to look like we shall be forced to
seek chronology protection at some “transhuman” level, e.g., Laszlo’s
Akashic Record, Hawking’s Chronology Protection Agency, the “big-headed
alien programmers” “outside” the Ancestor Simulation, and so
on. Because of the fundamental limitations imposed upon automation,
i.e., “mechanism” by Gödel’s and Turing’s theorems, and the suspected
immunity of consciousness to the theoretical limitations imposed by
these theorems (probably because consciousness itself is the theory
builder), we shall have to take our lead from au=Chomsky that the most
crucial step in the state censorship of ideas does not occur “at the
printing press”, i.e., the mechanism, but at the very root of the process
for the operation of an otherwise free press, that is, at the level of “the
formation of ideas”, i.e., at the level of “consciousness”. We may
ultimately seek the master mechanism of chronology protection at the
highest possible level of correlation of quantum fluctuations in the sense
of “higher” versus “lower” information processing, or at the boundary
between reducible and irreducible complexity within the quantum
vacuum, namely at the Planck energy. December 2012 @$ “Every recursive
function has its equivalent iterative (non-recursive) function. Even when
such equivalent iterative procedures are written, an explicit stack is to be
used.” web= This can be considered an alternative way of defining
reductionism, which may facilitate defining more helpful concepts
within philosophical discussions involving determinism, e.g., artificial
intelligence, free will, reductionism, consciousness, quantum nonlocality
and hidden variables, etc.
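The quoted rule of thumb is easy to make concrete. In this sketch (the function names and tree encoding are my own illustrations, not from any source), the recursive version's implicit call stack becomes an explicit Python list in the iterative version:

```python
def sum_recursive(node):
    """Sum all values in a tree encoded as (value, [children]).
    The recursion rides on the interpreter's implicit call stack."""
    value, children = node
    return value + sum(sum_recursive(c) for c in children)

def sum_iterative(node):
    """The equivalent iteration: the implicit call stack is replaced
    by an explicit list used as a stack of nodes still to visit."""
    total, stack = 0, [node]
    while stack:
        value, children = stack.pop()
        total += value
        stack.extend(children)
    return total

tree = (1, [(2, []), (3, [(4, [])])])
# both functions return 1 + 2 + 3 + 4 = 10
```

The equivalence is what licenses treating "recursion" as a description of structure rather than of mechanism, which is the use the note above puts it to.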

November 2014 Fcbk=

Lime Cat The more immediate danger is that posed by
the combination of an extremely intimate and invasive virtual reality
technology that is supported by new and largely untested quantum
entanglement based computation. Adventurous young cybernauts will
enthusiastically and collectively jack their brains into this highly
realistic, telepathic-interactive communal mental space not realizing
that, as they do so, a half-billion year old firewall is about to be bypassed, and far too quickly for any human brain to adapt. After all, the
human brain was designed by evolution to be in part a resonant circuit
tuning to, but much more than this, a complex network responsible for
blocking, filtering and transducing quantum entanglement encoded
information, that which resides within the quantum vacuum
electromagnetic field, itself a reservoir of *all* of the information ever
produced and/or processed by every brain and conscious computer in the
cosmos since life first arose, perhaps billions of years before the
formation of our solar system. The human brain was only meant to
permit access to consciousness of information signals pertaining to
efforts to solve humble, practical and terrestrial-based problems, (those
of *this* environment), e.g., finding food, water, a mate, protecting
offspring, seeking shelter, outwitting competitors for food, defending
against marauding animals, invading tribes, etc. What do you suppose
will happen to the average human brain that suddenly has to contend
with quantum information leaking into consciousness, which was
originally put into the quantum field by beings a billion years more
advanced than man? Quantum information is not limited by the speed of
light, but is nonlocal and instantaneously available across the entire
universe, even from beyond this Universe's observable horizon, many
billions of light years away! Anyway, you get the picture. The moral of
the story is: don't be among the first to strap one of those inductive
skullcaps onto your noggin!
December 2011 epi=fcbk=

"God didn’t build the Heisenberg Uncertainty Principle into the
fundamental structure of the Universe in order to grace humankind with
the "elbow room" befitting a free will, but merely to spare Himself the
annoyance of the myriad trivial happenings of His creation." Had his
disciples only taken pains to jot down Jesus' numerous pronouncements
on relativistic quantum field theory, there would today be many more
scientists espousing the Christian faith, I'm sure.

October 2014 fcbk=

Alex McFahd There must be a buried holy book with all the advanced
physics and engineering written by Jesus for only people like us

Lime Cat In the middle of the Sermon on the Mount, Jesus made the
"T" sign with his hands ("time out", I think) and said, "Everybody close
your eyes and plug your ears. For the next 10 minutes I'm going to
address the end times’ people 2000 years from now, so don't peek: "the
continuum hypothesis can be definitively proved merely by projecting
2nd order Zermelo-Fraenkel set theory onto a fractal Galois manifold.
Unfortunately this contradicts Cantor's diagonal argument and unmasks
my Father's creation as a vast ancestor simulation of quantum entangled
Boltzmann brains...." Of course, the early Church Fathers, who
assembled the Bible from a few shrewdly selected gospels, histories,
poems and epistles from out of a vast collection of widely varying
versions, couldn't make heads or tails of what had been dutifully written
down a few hundred years earlier about Jesus' scientific pronouncements
and so these fragments were suppressed from the Canon and almost
completely lost for all time. Thankfully, these important sayings of Jesus
were secretly preserved by a splinter Gnostic Christian group and were
finally provided with a meaningful exegesis in the 1930's by the then
few surviving members of the Church of the Quantum Vacuum. In
modern day, a spinoff (no pun intended) cult of this electrodynamic faith
survives as the Church of the Reformed Quantum Vacuum. This
disparate group of true believers, which possesses only a virtual presence
on the Internet, would have by this time long ago degenerated into a

gaggle of hare-brained aging solipsists were it not for the trusty influence
of cyber-peer pressure. The Church has been cyber-squatting since the
turn of the Millennium at
Coming Soon...
November 1996

What psychologists, philosophers, and theologians have
referred to for millennia as "conscious-ness" is something apart from
any structures of thought which might "inhere" in it and is apart from
any particular sensory, perceptual, or conceptual modalities of this
consciousness. This is related to Hume's complaint that whenever he
tried to perceive what is called the Self, he only could seize upon some
particular sensation or perception or other.
July 1997

The fundamental process by which the objects of perception are
generally constituted cannot itself, of course, ever be itself an object of
perception. Stating this in quantum mechanical terms, quantum
observables cannot constitute or be used to describe what consciousness
is in its essence, but are necessarily mere manifestations of
consciousness. Now the quantum state of a system is what creates the
possibility of observables. Consciousness, therefore, must be intimately
associated with the creation of such quantum states, but not through the
mere temporal evolution of pre-existing quantum states. Consciousness
must be ultimately mediated through discontinuous changes in boundary
conditions, the primary example of which is the instituting of boundary
conditions upon the fundamental quantum (vacuum) field from outside
spacetime. This is essentially what the process of informing means.
“The question is actually quite simple in nature. Is consciousness a
primordial aspect of existence, or is consciousness somehow derivative
of the information that is quantized out of the void of empty space? Does
consciousness somehow arise from the way information is coherently

organized, or is its nature primordial, an aspect of the existence of the
void? Is the void the true nature of consciousness?” “…It is possible to
prove that it is impossible for consciousness to arise from the way
information is coherently organized, no matter how complex that
organization. If the only logical alternative is proven to be false, then in
effect we have proven the only true proposition, which is that the void is
the nature of consciousness.” See Non-duality: A Scientific Perspective,


A school shooting that, though perhaps not "senseless", still constituted
a stark example of the regrettable taking of innocent human life. . . a
heinous moral outrage in any historical epoch. I hate
media discussions of shooters' motives because the notion of "motive"
implies another notion, that of rational action, which the taking of
innocent human life indeed never is. In future, dear mass media, please
withhold the identity of the shooter, do not discuss his supposed motives
or agenda, do not interview his former girlfriend, classmates, family
members, do not publish his suicide note, direct viewers/listeners to his
Facebook, Twitter or Youtube page, etc. Grieve for and lend support to
the families of the survivors, but by all means allow the grotesque and
twisted soul of the shooter to anonymously return to the terrible void
from which he was unfortunately called forth, unbidden. (Posted via
Facebook on the day of the Sandy Hook Elementary School mass
shooting in Connecticut, USA)
And so, information, unlike energy, cannot be understood in terms of the
operation of any continuity equation. The action of consciousness is
inherently discontinuous with respect to any particular system of abstract
forms. This type of action necessarily involves violation of any
conservation laws and so the action of consciousness is inherently
asymmetrical and irreversible, and, moreover its dynamic cannot be
understood in terms of quantizable physical parameters/variables. An

operation, such as the searching of a memory cache, does not constitute
what is called thought, but is merely the seeking of thought for a
connection which will merely enable its manifestation or expression in
terms of a representation which is not unique, as would be appropriate to
the operation of a code, and is always completed through a kind of
open-ended complementing of the cipher medium through the interpretive
action of its ground. This complementing is not logical, but translogical.
Consciousness, as understood by these thinkers, if it indeed "exists," as
opposed to possessing being, must be something forever prior to and
underlying all these modalities or structurings of it. Consciousness is not
something which is merely common to all these, for if it is, it is
something which transcends individual experience since common to the
experiences of any and all possible/actual persons, but the most that one
can possibly mean by the term "consciousness" is one's own individual
consciousness. The only way we can actually possess a genuine
intuition of consciousness is if the distinction between different
individual consciousnesses is merely like the distinction between a given
individual's consciousness at various particular times within that
individual's biography but with the merely incidental difference that the
structurings/modalities of one individual consciousness are not
cross-referenced to those of another individual by virtue of a more general
modality which includes them - itself less general than consciousness at
February 2013 (020713)

It is not that a paradigm shift or Copernican revolution,
i.e., where Man is pushed once again, precipitously (the wind from my
fan just now pushed the sticky note off of my web cam lens as I was
writing “precipit” – I had placed it there about eight hours before) away
from his center (in the Biblical Garden) and something “we know not
what” is pushed toward the center, is signaled by a contradicting of
reason, but by merely the appearance and current understanding (of the
appearances) which is evaluated in terms of a “transcending of reason”.
Reason, like consciousness then is something that we never actually
have ever caught hold of, ever only having possessed a metaphor

masquerading as a concept, and yet somehow we find ourselves
convinced, unreflectingly that we indeed possess a concept of it
(reason). Triviality, as in mathematics, is sometimes itself only an
appearance that masks the profound and mysterious. Just as in the
question of the topology of the real line there is an infrastructural
context gracing things ab initio as it were. Of course, this line of critical
reasoning is very question-begging in nature, considering that we are
presupposing some kind of distinction between concept and metaphor
and we have secretly questioned the very possibility of there being
concepts at all – only metaphors. This is similar to the self-thwarting
metaphysical assertion that reality is just a dream or “I have only ever
known my own sense data or psychological states”. The word,
“psychology” for instance presupposes not merely an umbra of context,
but also a penumbra, that is, a kind of meta-context, c.f., my playful text
messages to Ziad on the afternoon of February 6, 2013. Wittgenstein
would berate us here for trying to get language to do more work than it
has the capacity for (which is his general critique by the way). One must
wonder how he ever satisfied himself that the indescribable part, what
fundamentally couldn’t be communicated via language, even existed in
the first place – a clear case of Russell’s so-called privileged access. A
sufficiently advanced hardware/software combination would appear as
the natural infrastructure gracing creation, i.e., physical reality. “In
2013 an important breakthrough was made by Zhang Yitang who proved
that there are infinitely many prime gaps of size n for some value of n <
70,000,000.” It appears that not only the real line, but also the natural
numbers possess a non-trivial topology.
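Zhang's theorem concerns the infinitude of some bounded gap, which no finite computation can decide; but small-scale gap statistics are easy to tabulate, and a sieve makes the quoted claim tangible. A minimal illustrative sketch:

```python
from collections import Counter

def prime_gap_counts(limit):
    """Sieve of Eratosthenes up to `limit`, then tally the gaps between
    consecutive primes. Zhang's 2013 theorem says some even gap value
    below 70,000,000 occurs infinitely often among all primes."""
    is_prime = bytearray([1]) * limit
    is_prime[0:2] = b"\x00\x00"
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            # cross out multiples of p starting at p*p
            is_prime[p * p :: p] = bytearray(len(is_prime[p * p :: p]))
    primes = [i for i in range(limit) if is_prime[i]]
    return Counter(b - a for a, b in zip(primes, primes[1:]))

gaps = prime_gap_counts(100)
# 25 primes below 100 -> 24 gaps; gap 2 (twin primes) occurs 8 times
```

The recurring clusters of small gaps in such tallies are the empirical shadow of the "non-trivial topology" of the primes gestured at above.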
December 2013

The principle that Emerson seems to be illustrating can perhaps lead us
into a deeper understanding of time and a more hopeful appreciation of
time's potential, which passes well beyond the usual paltry conception of
time as unidimensional and inexorably finite, or, if not indeed finite,

then infinite in the merest sense of everlasting. The higher
dimensionality of time is to be sought in its fundamental plural, if partly
entangled, multiplicity of scale and connectedness. The endless,
open-ended reprocessibility of the data of humanly experienced time (long
term biographical memory), when placed in the light of potentially
endlessly novel and unpredictably unfolding future contexts, points to an
infrastructure of intelligibility (and of reason) gracing existence at its
root and is indeed what makes the faculties that we instinctively take for
granted possible. The data of individual human experience are not
solipsistic, but are secretly collective inputs to a much larger system, one
capable of combining and reprocessing this data into transpersonal and
even transhuman meanings. Language, ethics and culture all secretly
invoke this sweeping collective transhistorical context, which, like the
way subliminal infrasound lends perceived sound an edge that is itself
noticed only if suddenly absent, significantly enhances the perceived
meaning of the data of individual human experience. Enrico Fermi once
famously asked, "Where are they?" For my part, I have my answer. They
are here. They are us. Or, rather they will be us.
November 2013

Each great surge in evolution is signaled by a "mass
extinction". But avoiding the mass death doesn't guarantee "you'll"
advance. Just take a look at how many living fossil species there are still
thriving on this planet. One has to have been favored by the process.
Despite all of the restrictions and conditions posed by the laws of
physics, chemistry, psychohistory, etc., there's still plenty of room
available in which grace may operate. Here are a few notable examples:
the potency of a general anaesthetic is solely a function of its
concentration in brain lipids and is independent of the chemical
properties of the anaesthetic used (as Overton and Meyer first
observed), the information specificity
of DNA genetic base pair sequences is not in any way determined by the
chemical properties either of nucleotide bases or of their connecting
sugar-phosphate bonds, covalent bonding is only possible because of a
fundamental indeterminacy (Heisenberg uncertainty principle) on
account of the necessary quantum superposition in each bound electron's
position, the three-body problem has no closed form analytic solution,

i.e., deterministic chaos, etc., and so on.
Since causal connections (local causality) constitute only a tiny subset
of the entire available spectrum of correlated fluctuations, it may be that
although the three-body problem is not solvable in terms of physics, it
may nonetheless be soluble in terms of engineering. The three-body
problem may find its solution if each body is conceived of as a
nonlocally connected fluctuation packet that possesses nonlocal
connections to each of the two other “fluctuation packets”.
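Solving the problem "in terms of engineering" presumably means numerical integration. Below is a minimal sketch (the initial conditions and function names are arbitrary illustrations of mine, not anything from the text): three mutually gravitating bodies stepped forward with semi-implicit Euler. Because the pairwise Newtonian forces cancel in sum, total momentum stays conserved even though no closed-form trajectory exists.

```python
def accelerations(pos, masses, G=1.0):
    """Pairwise Newtonian gravitational accelerations in the plane."""
    n = len(pos)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            r3 = (dx * dx + dy * dy) ** 1.5
            acc[i][0] += G * masses[j] * dx / r3
            acc[i][1] += G * masses[j] * dy / r3
    return acc

def integrate(pos, vel, masses, dt, steps):
    """Semi-implicit Euler stepping: crude, but enough to illustrate."""
    for _ in range(steps):
        acc = accelerations(pos, masses)
        for i in range(len(pos)):
            vel[i][0] += acc[i][0] * dt
            vel[i][1] += acc[i][1] * dt
            pos[i][0] += vel[i][0] * dt
            pos[i][1] += vel[i][1] * dt
    return pos, vel

# three equal masses on a rough triangle, net momentum zero
masses = [1.0, 1.0, 1.0]
pos = [[1.0, 0.0], [-0.5, 0.866], [-0.5, -0.866]]
vel = [[0.0, 0.5], [-0.433, -0.25], [0.433, -0.25]]
pos, vel = integrate(pos, vel, masses, dt=0.001, steps=1000)
px = sum(m * v[0] for m, v in zip(masses, vel))
py = sum(m * v[1] for m, v in zip(masses, vel))
```

The chaos shows up as sensitivity to the initial data, not as any failure of the step-by-step computation, which is the distinction the note is driving at.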
As the anthropologist and comparative religion scholar Joseph
Campbell remarked, "...when you ...look back over your lifetime,
it can seem to have had a consistent order ... as though composed by
someone. Events that when they occurred had seemed accidental and
occasional or as if by accident turn out to have been indispensable
factors in the composition of a consistent plot. So who composed that
plot? Schopenhauer suggests that just as your dreams are composed by
an aspect of yourself of which your consciousness is unaware, so, too,
your whole life is composed by the will within you. And just as people
whom you will have met apparently by mere chance became leading
agents in the structuring of your life, so, too, you have served
unknowingly as an agent, giving meaning to the lives of others, the
whole thing gears together like one big symphony, with everything
unconsciously structuring everything else. It is as if our lives are the
dream of a single dreamer in which all the dream characters are
dreaming, too; so that everything links to everything else, moved by the
one will to life which is the universal will in nature. It’s a magnificent
idea – an idea that appears in India in the mythic image of the Net of
Indra, which is a net of gems, where at every crossing of one thread over
another there is a gem reflecting all the other reflective gems....", c.f.,
The Power of Myth.
All representation requires a figure and ground relationship. All
figure cannot be ground just as all ground cannot be figure, for this
would create a closed system devoid of any context resulting in a
February 2013

complete absence of meaning. This is in large part why the
combinational-permutational state space model for system evolution is
not really coherent, ignorant as it is of the distinctness of levels of
description and grounded phenomena such as evolutionary emergence,
cohesiveness, robustness, context dependence, etc.
Meaning requires context and interpretation, and therefore temporality;
in this way consciousness and temporality cannot be separated.
Temporality is unavoidable in open systems, so all future developments
cannot be encompassed in the permutational and combinational
possibilities of a state space.
On subjectivity: ...if the logic we invoke is able to convince our own
selves but no one else, can we really say that it has value?
If we say that sanity is merely relative, do we not embroil ourselves in
an infinite regress or self-contradiction?
Gradually and imperceptibly, the palimpsest is eventually accepted as
the genuine document.
Distinguish earthly enlightenment from metaphysical enlightenment
with respect to the philosophical system of Buddhism.
If we want to view God as an invention of mankind, then the Copernican
revolution pushes mankind further and further from the center, but it
also pushes God further from the center toward the periphery.
Narrative structure of consciousness; the integration of time; memory
updating; 2d temporality; metaphor vs. concept; consciousness as the
medium or substance of experience; the projection of the self as an
intentional object; and the 'structure of consciousness'.
If Cantor’s diagonal is longer, then it must be infinitely longer, and if
there is no definition of 'infinitely longer', is there a hidden
contradiction in Cantor’s diagonal argument? We can apply Richard's
paradox to unearth the contradiction in Cantor's diagonal argument:
Richard's paradox is founded on the notion that there is indefinability
at work, and a similarly ill-defined notion lies at the root of Cantor's
definitions for his diagonal argument.
Consciousness primarily has a context updating and pattern recognition
function. This is in part why we say that consciousness is the metaphor
of all metaphors.
We could try to run through, in 'life review' fashion, all of the possible
cultural memes in the state space, but the meaning of the memes added
amongst those already accumulated must shift in meaning and
significance in a way that could not possibly be predicted; in short,
there are no meme state spaces with “compact support”. Also, there is a
back reaction of newly added memes onto memes already stored in
memory. We must distinguish the projection of lower dimensions onto
higher dimensions from the projection of higher dimensions onto lower
dimensions. Is there a distinction to be made here between hard
encryption and decompression?
In the absence of consciousness temporal evolution of information
already stored in memory would not require a second or higher
dimension of time with which to consistently describe it
Memory cannot be analyzed in terms of simultaneous experience
because experience requires memory to come to consciousness.
Learning a language is like learning the story and the dramatis personae.

Are backstory possibilities woven at the same time as one is learning
more of the explicit story?
Paradigm shifts help to prevent paradoxes from developing during the
course of the evolution of Science and scientific theory. What kind of
shift would prevent a temporal paradox? In both cases it is the same kind
of shift when one is talking about the consistency of one's personal
Hypothesis, theory, law, paradigm. Is this series multi-directional, with
feedback loops?
Search strings which produce no hits on Google but which produce
results when a specific website is searched are an example of Internet
compartmentalization. What about Google searches that produce hits
for certain search strings when the same strings produce no hits when
any given compartment is searched?
I just noticed this morning that I was receiving text messages to myself
in my email inbox; this seems somehow thematic and coincidental.

Cantor's diagonal argument depends on the notion of 'infinitely longer'
in order to prove the existence of higher orders of infinity. In this way
Cantor's argument is shown to be invalid because it begs the question:
one has to already have the notion of higher orders of infinity in order
to define the concept of 'infinitely longer'. There are indeed higher
orders of infinity, but this has not been adequately demonstrated by
Cantor's Diagonal Argument.
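Whatever one makes of this objection, the diagonal construction itself is purely mechanical, and it is worth seeing in code how little it presupposes. In this sketch (the names and the sample enumeration are illustrative), we flip the i-th digit of the i-th listed binary sequence, producing a prefix of a sequence that differs from every listed one:

```python
def diagonal(enumeration, n):
    """Given any claimed enumeration of infinite binary sequences,
    where enumeration(i) returns the i-th sequence as a function from
    position to digit, build the first n digits of the anti-diagonal
    sequence: it differs from the i-th sequence at position i, so it
    can appear nowhere in the list."""
    return [1 - enumeration(i)(i) for i in range(n)]

# A sample "enumeration": sequence i has a 1 only at position i.
listed = lambda i: (lambda j: 1 if i == j else 0)
d = diagonal(listed, 5)
# d[i] != listed(i)(i) for every i checked
```

Note that the construction never invokes a notion of "infinitely longer" lists, only a digit-by-digit mismatch, which is why most mathematicians locate no circularity in it.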
What are likely to be the psychological effects of inadvertently
stumbling upon one's own nexus of meaning and significance?
There must be a valid way of establishing Cantor's conclusion, say by
using quantum information theory, perhaps combined with topology.
We don't necessarily have to specify numbers in order to count them -
the principle of integration is proof of this. This may help us to avoid
David Deutsch's infinite parallel universes as the necessary
metaphysical implication of quantum computation. The analytic concept
of compact support poses problems for the notion of computational
state spaces as models of parallel universes and meme state spaces,
which are essentially viral.
What will happen as the web more aggressively enables self-selection
effects for example applied to predictive searches for products services
information and contacts? This reminds us of the restriction of vacuum
resonant mode spectra in quantum cavity electrodynamics experiments.
'From the date of his retirement from the chair Kant declined in strength,
and gave tokens of intellectual decay. His memory began to fail, and a
large work at which he wrought night and day, on the connection
between physics and metaphysics, was found to be only a repetition of
his already published doctrines', 1911 edition of Encyclopedia
Britannica, c.f.,
March 2013 I was doing research on 404 errors and do not recall
stumbling upon a 404 error for the past few years; what a coincidence.
Specifically, I was going to find out where the 404 area code was
located.
Multigenerational succession of knowledge, the sociology of science,
paradigm shifts: all of this versus the accumulation of knowledge
within the mind of a single individual. Automaticity, operational
modification (Op Mod), creativity, viral memes and Bohm's active
information as word-to-the-wise hints of a radically creative medium.
There must be a fundamental recollected component to learning. There
must be a viral memetic component to learning. The four types of
causation: efficient, material, formal and teleological must all play a
role in the transmission of information. July 2013 Teleology of an
evolutionary look-ahead or “heads
up” capability, wherein alternate or parallel courses of development
(which entails that each evolutionary course be tracked and therefore
“threaded”) are continuously compared, say in terms of whether or not
they lead to structures down-line that involve back-reacting
consciousness, would seem to offer a much faster evolutionary
development than the near
infinite stretches of permutational-combinational time required for
tinkering or cobbling together the boundary conditions necessary for
self-aware or volitional consciousness. If it is not the case that there is
no “there there” in the sense of an overarching context for these
evolutionary processes, then there must be an embedding self-organizing
information storage and processing substrate qua underlying
evolutionary infrastructure gracing everything.
As long as one stays within the domain of one's naturally given talents
and abilities, it is easy to imagine that the self is coherent, unified,
integrated, freely willing in its decisions and judgments, creative,
perceptive, etcetera. Very similar to the sundowning phenomenon,
however, is the ungraceful and uncertain behavior of the individual
when operating outside the restricted domain of his natural talents.
The determinism within the moment is distinctly different, radically

different in fact from the determinism which connects distinct moments.
If I had but known that immortality would be discovered 10 years after
my death, I would have surely taken much better care of myself in my
youth.
For those of you of a certain age, say: you want to find yourself? Well
then, become the scholar of all upon which you "expertly" held forth in
your youth. Otherwise, in your quest for self-discovery, you are likely
to find someone else.
Feedback, returnings, retracings - without context there is no meaning.
The doubling back of experience upon itself is certainly an integral part
of this meaning providing context. @$
Not only must we have context to have meaning, but this context must
be narratively structured, and narrative structure demands dramatic
conflict.
Narrative structure requires multiple independent centers of volition.
What man fails to realize is that he has all along been packing on the
shell of the cosmic egg . . . from the inside!
Is there a middle ground between noise and an intended message, say
through filter adjustments of mutually interfering signals?
The epistemological infrastructure enabling the possibility for
reprocessing of information, moreover for unlimited semiosis, strongly
suggests this third category of signal.
Here is the real reason why the anthropic principle is anthropic: self-
consciousness requires a narrative structure, which in turn depends on
dramatis personae to function.

Stress-momentum-energy is encoded in the fluctuations of the vacuum,
similar to the Matrix, in a fashion which simulates curvature of space
and time; you're correct, space and time are not 'actually' curved. But on
a deeper philosophical level, all of our concepts are merely metaphors
masquerading as such. That's why all of our concepts should be
mentioned in double sets of quotation marks and used within a sentence
in a single set of quotation marks. This possibly brings up rather thorny
metaphysical issues. :p
Concerning anthropic bias: how do you determine what a representative
sample is?
Karma is superstition applied to the fact that everyone is born self-interested and nobody gets out of here alive.
The facade of the future as a projection of the past shall change before
its substance can be fully fleshed out as it were.
In other words, Karma would appear to be as it so appears regardless of
whether or not the concept possesses any genuine metaphysical validity.
Is there a concept of a reverse anthropic principle that needs to be
explored?
It should be obvious that the observer selection effect does not fine tune
for a particular observer, but rather fine tunes for an admittedly highly
constrained class of individuals into which the selected observer fits.
If a system is fine-tuned so that it selects for an entity possessing an
internal principle of unity, coherence and integration, then we would

expect there to be a similarly unifying, integrative principle at work
within the dynamics of *that* system.
'If a system exhibits fine tuning such that it selects for an entity
possessing an internal principle of unity, coherence and integration, then
shouldn't there be a similarly unifying, integrative principle at work
within the dynamics of the system that gave rise to that entity?' This
begs the question: 'what is meant by 'internal' here?'
How do we reconcile egalitarian copresence of 'me-like' selves with the
competitive fine tuning of universes individually and anthropically
selected for?
Is it inevitable, or at least overwhelmingly likely, that 'me-like' selves
have evolved on me-like brains? This is exactly analogous to the
question of the probability of human-like life having evolved on
Earthlike planets.
Could there be anything analogous to a Disclosure Project for the
existence of other minds?
March 2014

Thinking outside the box like Jacques Vallee, let us suppose that
the riddle of extraterrestrial beings shall only be solved once the
philosophical problem of other minds has also been solved. In other
words, we shall only detect extraterrestrial beings in the same manner
that we become able to detect the existence of other minds. Since
indirect detection is out of the question on account of the vastness of
interstellar distances, e.g., radio waves, we shall only detect
extraterrestrial minds via quantum signatures which their intelligent or
conscious mental activity leaves behind in the form of entanglement in
the vacuum electromagnetic field or "Akashic record" that we don't yet
possess the proper encryption and frequencies to access. But the method
will undoubtedly be the same that shall one day allow us to detect the
quantum signature of other minds. Whatever new technology that
entails, it may well lie a century or more in the future. As humankind’s

scientific description of space, time, matter and energy changes as its
scientific understanding grows, and its technology along with it, there
shall be reached a stage where the seemingly insuperable interstellar
distances between earth and the nearest extraterrestrial civilizations
will have been overcome. However, by then our conception of
spacetime and mass-energy, along with our deepest understanding of
life, consciousness and the basis of our personal identity shall have
necessarily radically altered in step with what shall be seen as an
enormous technological advance - along a path that cannot be charted,
even with the advantage of hindsight.
September 2014

This is one of the essential characteristics of a radically recursive or
“boot-strapping” process: hindsight on what brought you to where you currently stand
does not even approach “20-20”.
Does egalitarian yet competitive self-selection have more to say about
the structure of the universe than it does about the structure of the self?
Wet paper bag project
The only being in a position to select in an anthropic cosmological sense
is either the self, that is, true self or God. This is why the true dichotomy
is not between atheism and theism, but between solipsism and theism.
Either God created this single universe or the self selected its own
universe from out of a virtually infinite multiverse that was eternally
existing.
What context systems and controls must be in place for 0's and 1's to
behave just like real currency? Is this analogous to how zeros and ones
can function as thoughts and intentions in a conscious computer?
Recapitulate my discussion on context continuity and consciousness
with Captain Guise. Static or snapshot structure is never adequate for
establishing the identity of an entity which requires continuity over time.

An example of the flaw of the multiverse theory is an outcropping of
rocks, in a particular universe out of the virtual infinity of universes, that
looks exactly like Mount Rushmore. If a mind could be present in that
universe to make sense of the outcropping, i.e., human faces, political
leaders, colonial America, etc., then continuity becomes a meaning-giving
contextual component.
To have successfully deconstructed the desire for approval from a peer
group with whom one does not really wish to conform is one of the
milestones of personal maturity.
When reason is enlisted to aid moral courage on the one hand or
disingenuous cynicism on the other, then this is rhetoric.
April 2013

Hawking introduces imaginary time into cosmology

Black hole collapse stops at the event horizon: is conservation of
momentum important here, or is information theory the more important
consideration, for example via the holographic principle?
Meditate upon a language in which every possible simple combination
has an assigned meaning.

An example of irreducible complexity in the realm of reproduction that
does not explicitly involve small nucleic acids and proteins is the
following. The 1st litter is composed of the last generation's kids and the
litter they produce themselves (the 2nd litter) is adopted and raised by
the grown kids of the first litter. Metaphor as scaffolding for
construction of concepts that otherwise must spring into being whole.

Concepts prompted by metaphors prompted by experience conditioned
by forms of intuition which constitute our broadest concepts.
There is a distinct difference between random mutations to a pre-existing,
highly ordered cybernetic control system vs. random changes
which were thought to have led to the original development of said
cybernetic control system.
Relate the quantum no cloning theorem to the possibility of
transhumanist consciousness uploading. Is there a tension between the
quantum no cloning theorem and the principle of the indistinguishability
of quantum particles?
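The no-cloning side of the tension asked about above can at least be made concrete. A standard textbook observation, sketched here in plain Python with hypothetical helper names and no quantum library assumed: a CNOT gate copies the basis states |0⟩ and |1⟩ onto a blank qubit perfectly, yet fails on the superposition |+⟩, which is why no linear operation can clone arbitrary unknown states.

```python
import math

def tensor(q1, q2):
    """Tensor product of two single-qubit states [a, b] -> [a00, a01, a10, a11]."""
    return [q1[0] * q2[0], q1[0] * q2[1], q1[1] * q2[0], q1[1] * q2[1]]

def cnot(state):
    """CNOT on a 2-qubit state: flip the second qubit where the first is 1,
    i.e. swap the |10> and |11> amplitudes."""
    a00, a01, a10, a11 = state
    return [a00, a01, a11, a10]

def fidelity(s, t):
    """|<s|t>|^2 between two normalized real state vectors."""
    return abs(sum(a * b for a, b in zip(s, t))) ** 2

def try_to_clone(qubit):
    """Naive 'cloner': pair the unknown qubit with a blank |0> and apply CNOT."""
    return cnot(tensor(qubit, [1.0, 0.0]))

# The basis states |0> and |1> are copied perfectly...
for basis in ([1.0, 0.0], [0.0, 1.0]):
    print(fidelity(try_to_clone(basis), tensor(basis, basis)))  # 1.0

# ...but the superposition |+> is not: CNOT yields an entangled Bell state
# instead of two independent copies, and the fidelity drops to 1/2.
plus = [1 / math.sqrt(2), 1 / math.sqrt(2)]
print(fidelity(try_to_clone(plus), tensor(plus, plus)))  # ~0.5
```

This is only the linearity argument behind no-cloning; how (or whether) it bears on indistinguishability is left as the open question the note poses.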
Cryogenic patients involuntarily enlisted to take part in an expedition to
the nearest star.
For every universe anthropically selected from out of myriad other
universes in the mix master multiverse there would be a great number of
universes possessing the same potential for conscious entities such as
human beings as the anthropically selected universe, but in which no
intelligent life exists, these universes only awaiting the design-engineered
collection of special conditions needed to bring about intelligent beings
similar to humans.
The second dimension of time could indeed be the timeline of the will
and intention of the Design Engineer.
How do multiple centers of free volition arise from a process that
possesses only a single dimension of time?
We need to look at things in a less metaphorical and more abstract
manner.
For random changes to be meaningful and functional, there must be a
pre-existing framework of interpretation available to make sense out of

the random changes. The structure that suffers a random change must ab
initio be about something.
Holism, irreducible complexity and two dimensional time.
Metaphysics of cancer.
Even active seeing, which we take for granted, requires training and
experience. It is by no means a passive process. There is a critical
developmental window in early childhood: if the appropriate input is not
received by the occipital lobe of the brain, then the necessary neural
network required to process visual data into visual information will
never be properly formed. There are no uninterpreted data.
This is true for any sense perception process that requires pattern
recognition functions.
Human language still bears the imprint of the blockheaded neural-network
pattern-recognition hominid linguistic system. But what started
out as blockheadedness in the pattern recognition and speech
recognition program turned into something much more subtle and useful:
the facility of metaphor. Speech pattern recognition, which was at first
applied to the other, eventually became applied to first-person speech
production. The principle that ontogeny recapitulates phylogeny very
much applies to the acquisition of linguistic capability.
Only a transcendental mind can possess a concept of consciousness.
The falsity of solipsism implies the existence of a transcendental
Universal Mind.

The fact that there was a beginning to time proves that there is more than
one dimension of time.

Only the transcendent other can pierce the veil of solipsism.
What is more important to the formation of life: boundary and initial
conditions, or the innate dynamic process? The question of
which makes life more probable is undefined in the case of the dynamic
process, in other words no probability can be assigned to the dynamic
process relative to the probabilities of favorable initial and boundary
conditions for life.
Do we land ourselves in an untenable position of infinitely regressive
Bayesian statistics, that is, of ever deeper levels of initial and
boundary conditions?
Bayesian statistics is a merger between set theory and probability theory.
Finding life on other planets is not going to help us dissolve the
irreducible complexity problem of the origin of life.
A Little Golden Book. Scribd search.
May 2013

Investigate the problem of how the multiverse and anthropic
reasoning don't really allow us to sidestep the other problem of fine
tuning and intelligent design. The difference between accidental fine
tuning, as in the case of the multiverse, and intentional or intelligent
fine tuning is that in the case of intelligent fine tuning there is an
underlying dynamic which necessarily involves continual adjustment that is
ongoing rather than merely ab initio. So there is a very real and
qualitative difference between intentional and accidental fine tuning
which should be detectable by observation and experiment say of
dynamical quantum processes in the vacuum. We should eventually
uncover evidence of the action of this sustainment of fine tuning of the
underlying physical substrate in line with the intended continuum of the
computational state space.

December 2013

Michael Behe asks in his book, The Edge of Evolution, how
far does the intelligent design extend downward into the lower tiers of
the taxonomic hierarchy, e.g., beyond sub-phyla and on to classes,
orders, families, etc.? This suggests that intelligent design is an
interaction between an Engineer and a preexisting dynamical substrate,
which itself may well require its own intelligent design-based
explanation of origin. This is the God “who does not make His own
dust.” Behe points out that the Hox gene “tool box”, which is responsible
for orchestrating anatomical development during embryogenesis, as well
as for underpinning the morphogenetic relationships within a taxonomic
category, was already in place perhaps 50 million years or more prior to
the advent of the Cambrian explosion, which is to say, prior to the
advent of the anatomical forms whose development these same
regulatory genes would, in a later geological epoch, be destined to
orchestrate. I am reminded of the
open-ended rationality of being.
If we cannot logically disprove a proposition to which our opponent is
emotionally attached it is not enough to merely demonstrate that there is
no logical or experimental or observational ground in favor of the
proposition. One must go on to deconstruct the discourse and narrative
behind the position of one's opponent.
To help us more fully understand the relevant connection between the
philosophical problem of other minds and the theological problem
of the being and existence of God, we must first pose the question: what
is the relationship between the question of other universes and the
existence or nonexistence of God? The problem of other minds and the problem of
other universes are at a deeper level ultimately the same, that is, in the
absence of an ultimate context and backdrop of a transcendent Universal
Mind, itself possessing a true concept of consciousness, such that each
individual particular mind is but an instantiation of consciousness as
such, there is a degeneracy of indicators of possible worlds, namely,
consciousness (in the sense of ultimate selfhood) and universe.

It is said that without context there is no meaning, but this question is
still more complex, because we have to talk about meaning for whom,
which bespeaks the possibility of multiple and perhaps altogether
disjoint contexts. Topology is of course an important consideration here.
The in-betweenness which lies between universes is one and the same
with that which is given prior to the beginning.
Empedocles' paradox concerning reasoning and determinism is related to
John Searle's notion of the closed system, intentionality and hard
artificial intelligence.
Duration or emergent time as opposed to deterministic time seems to
require continual input from outside the time line.
In the same way that quantum computing, which is intrinsically parallel
in its structure, can reduce linear time to logarithmic time, such that
anything that normally would take a vast stretch of linear time can be
accomplished in a much shorter one, with something similar to this
we don't need an infinite amount of time for any and all possibilities to be
realized, because the quantum nature of the vacuum is the underlying
infrastructure of all structure and function within space and time. Isn't
time as an abstraction just a projection from within finite spacetime, and
does this concept fall into the same category as things like the soul, free
will and consciousness, for which we have no self-consistent abstract
category or concept? So the totality of possibilities can only be realized
within a structure of multiple finite space-time manifolds. No finite
set of possibilities, however large, can be defined against the backdrop of an
infinite spacetime. One wonders, then, what infinite spacetime is for, other
than an abstract category for purposes of argumentation and analogy
making or illustration. Unlimited semiosis must fulfill the role of the
possibilities grounded by a supposed infinite spacetime.
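The linear-to-logarithmic compression the note invokes can be illustrated with a purely classical analogy: counting the comparisons a linear scan needs versus a binary search over the same range. This is only an analogy for the shrinking of search time, not actual quantum parallelism (Grover's algorithm, for instance, gives a square-root rather than logarithmic speedup for unstructured search).

```python
def linear_search_steps(n, target):
    """Comparisons a linear scan over range(n) needs to reach target."""
    steps = 0
    for i in range(n):
        steps += 1
        if i == target:
            break
    return steps

def binary_search_steps(n, target):
    """Comparisons a binary search over range(n) needs to reach target."""
    lo, hi, steps = 0, n - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if mid == target:
            break
        elif mid < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

n = 1_000_000
print(linear_search_steps(n, n - 1))  # 1000000: cost grows linearly with n
print(binary_search_steps(n, n - 1))  # ~20: cost grows only with log2(n)
```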
A new variety of solipsism has reared its head with the advent of the
latest cosmological theories concerning the anthropic principle,

Boltzmann brains, the multiverse, cosmological fine tuning and so on.
This new version of solipsism invokes the notion of the fine-tuning of
consciousness along the same lines as the anthropic cosmological fine
tuning with which one is already familiar. In this updated version of
solipsism, other minds do indeed exist but they do not exist within the
same universe within the multiverse as does one's own mind. In this
particular case, a concept of consciousness which underlies the
possibility of the existence of other minds would only be possible if
there was some transcendent or Universal Mind which could bridge the
gap between universes within the multiverse as the basis of the intersubjective subjectivity, which is absolutely required by a self-consistent
notion of a plurality of subjective individual consciousnesses.
Consequently, a valid concept of consciousness must presuppose that
consciousness and its operation and function transcends the physical
laws of this particular universe, as embodied in the particular exact
values of fundamental physical constants which had to be fine-tuned to
produce this particular universe that gave rise to this particular
individual consciousness, namely mine. We say this because the
infrastructure for the fine-tuning of physical constants for distinct
universes within the multiverse must exist within the multiverse itself
and is not accessible within any given particular universe within such a
multiverse.
Unity of space and time with its implication of a concept of unified
spacetime places space and time on an equivalent footing. One possible
implication of the concept of spacetime might be the following: if there
is nothing after (time), then there is nothing outside (space).
Is there any significant difference in the philosophical implications of
the fact of our all meeting in an afterlife vs. us all having met in this life?
The requirements of cosmological fine tuning are far greater for
mammalian species than for bacterial species. It is clear that the

requirements for fine-tuning physical constants and other cosmological
conditions becomes more inexact and less demanding as one considers
ever simpler forms of life.
The case of there being a single consciousness, or a 'consciousness-as-such',
which is merely structured in myriad distinct ways, representing
different structures of self-consciousness where each is not substantially
distinct, i.e., multiple self-consciousnesses in which each is grounded in a
non-unique consciousness qua medium of egoic experience. With
regard to the appearances, these two cases are not in any way
distinguishable. That is, given the absence of an overarching
cosmic, transcendental consciousness.
April 1997

In other words, if consciousness possesses being as a particular
among particulars, it may only possess an objective meaning if it possesses
an intersubjective meaning. But according to what has just been
observed, various self-consciousnesses must be merely different
structurings of a single consciousness and so consciousness therefore
must have a being which transcends any particularity, any particular
form of itself. Consciousness as such is therefore nondual, transcendent
of Form and so must be constituted from beyond the "field of time."
Consciousness is constituted within Eternity and so qualifies as the
ground of existence. This is pure Consciousness.
It is just that the many different structurings of consciousness at large do
not completely cohere, but form numerous relatively disjoint domains
marking the different individual consciousnesses. Consciousness as
such, on this view, cannot be understood by an individual consciousness
which is merely a transitory, finite structuring of this fundamental
consciousness. Here again, we see our principle of the inability of the
stream ( the individual's stream of consciousness, in this case ) to rise to
the level of its source ( consciousness at large). It will not be possible
for computer science or brain physiology or whatever to come to an
adequate theoretical understanding of the process which gives a
particular individual the gift (or curse) of the individual consciousness

that he happens to possess unless any of these sciences can somehow
ground the explanation of his mental processes in the consciousness
originating utterly from outside him qua individual. One must, in other
words, first understand consciousness as such before moving on to an
explanation of the individual's own particular consciousness. But this
means that science will not be able to arrive at a general theory of
consciousness through induction from individual cases. So application
of the scientific method is barred from treating the problem of
consciousness as such. Nothing that the individual is capable of
perceiving could count for the essential feature which makes both his
consciousness and that of the other examples of consciousness as such.
Suchness is inherently transcendental. Negative Karma is the result of
the individual's choices made in previous lifetimes while the equivalent
to this in Christianity is Sin which is inherited by all human beings at
birth and cannot be overcome through the individual's efforts alone but
only through the active intervention of God himself in His immanent
form as Jesus Christ. Although in a historical sense the person of Jesus
is the incarnation of God the Father in human form, in no way does He
become absorbed or lose His identity upon being assumed into Heaven
but is Himself unique and eternally part of the Divine Trinity. We
know that the individual soul does not return to the great Ocean of Being
at each interval between incarnations, as the identity of the individual
would be utterly blotted out whenever this occurred.
May 2014

Karma is not a conserved quantity and therefore the law of
karma, should one exist, must be fundamentally dissimilar to a physical
law, which is invariably described in terms of the
conservation of some key physical quantity such as momentum,
energy or spin.
The principle of the positivism of evil demands that Karma not be a
conserved quantity.
Free will necessarily involves creativity and therefore non- conservation
of certain qualities.

It is quite humbling to reflect upon how there are two distinct kinds of
unknown - the unknown which is continuous with the known to which
inevitably a pathway leads, and that other unknown, which is altogether
disjoint from or discontinuous with the known.
Good karma does not always make up for bad. There are impersonal
natural forces which appear to change the local zero point or set point of
karma.
There is no platonic heaven of sensations or subjective perceptions or
thoughts, as indeed there are abstract objects such as perfect triangles.
Concrete things cannot be built up out of timeless abstractions.
May 2014

"Next life" . . . "next" here is in the sense of metaphysically
different rather than later in time. To simultaneously avoid the reductio
of solipsism and maintain the utter distinctness of “suchnesses”
(“consciousnesses as such”), the multiple instantiation of consciousness
posed by temporally existent minds must not be simply, but
complexly multiple. There must indeed then be an order of
transcendence possessed by consciousness as such. This points to the
necessity of a multiverse in a metaphysical sense.

November 2011 (Edited for Facebook posting in November 2013)

What I like about Biocentrism
is this dumping of our unfounded prejudice about the alleged necessity of time
being linear. Well then, after the fashion of the main character from the British
comedy, The Black Adder there remains the ever present possibility that one
should arise again anew from the turbulence endlessly generated by the tension
within the interface of existence and being, that is, whatever the original sufficient
reason may have been that originally brought one forth, unbidden from the
screaming abyss of the ever creative void, should do so again, in fact, an unlimited
number of times and this despite the necessity of “metaphysical work” (the realm
of being having been enlarged, at least in terms of an expansion of being's "ground
of existence" ) being performed during each “incarnation”. That particular quiddity

or "haecceity" specified by the unique anthropic cosmological constraints posed by
the individual consciousness as "substantial medium of one's possible experience"
shall always abide as the exactingly fine filter of the fluctuating quantum vacuum
underlying all physical reality, and which tirelessly spins off these ever so slightly
though distinctly different universes, each complete with its own vast locker
combination of finely-tuned fundamental physical constants, determined to 14
decimal places as befits a precise resonant tuning of each fractally branching
universe to the demands of a stable, unified and integrally whole consciousness.
The multiverse embraces an unlimited multidimensional time and temporality.
Death is only a "dead end" therefore with respect to a given particular time line.
Your life is being lived out in *this* time, which is not to be identified with some
unique, one-shot, all-encompassing physical time (there's that unfounded prejudice
again). New timelines are always being generated as the multiverse continues its
eternal, playful and experimental, budding, branching and blossoming process. The
transcendent shall always continue to explore the realm of limitation, that is, "the
realm of what it's like to not know everything", since that's the only field that is
open to the transcendent's hopeful/fearful curiosity. Although this body and brain
pass away, the abstract laundry list of precise conditions for one to be, not
"reincarnated", but generated anew (just as one was originally gestated and born
this time around) in the midst of the unending novelty of Brahma's (Lila's) dance
and play, does not. This infinitely fine and subtlest of sieves, the anthropic
cosmological principle, shall be ever present to filter the unending noise, *the vast
majority, but not all of which belong to other minds.* As the Universe was not
created in time, but time and temporality were created with it (along with space),
there is no limit to the number of dimensions of time.
February 2014

A natural question
is whether, in light of the metaphysical puzzles and paradoxes which attend the
logic of the Anthropic Principle, there is any determinable relationship between
possible temporal dimensions and the parallel universes of the Many Worlds
Interpretation (MWI) of Quantum Mechanics, i.e., the many distinct minds, which
may indeed constitute determinations of distinct grounds (of being).

Otherwise there would be no definite guarantee that one's eventual
escape from the Wheel of Life, through Spiritual Enlightenment, would
be indeed a permanent one since obviously whatever process started the
chain of incarnation in the first place might well again give rise to one's
very self-same soul, as it would customarily do between each
incarnation, reinitiating the whole process; if one insists that the original
process is unique and hence unrepeatable, then one is implying that each

person's individual identity is really unique and therefore not in the least
illusory and furthermore that neither is our separation from the Godhead
an illusion, but the very real and potentially alarming circumstance of
our Earthly existence.
January 2013

Again, laws are abstractions that relate
classes of individuals and not the individuals themselves. Following
Russell, physical laws are mere descriptions of how nature in fact
behaves and hence carry nothing in the sense of a governing force.
Physical law artifacts such as “curved spacetime” and “wavefunction”
are perhaps nothing more than scientific metaphors and far from being
bona fide metaphysical entities – they are the user friendly “front end”
for accessing a “back end” that is indeed very remote from the potential
tinkering of simulated beings. In software architecture there may be
many layers between the hardware and end user. Each can be spoken of
as having a front end and a back end. The front is an abstraction,
simplifying the underlying component by providing a user-friendly
interface.
November 2014

To get the gears of the whole meaning and
coincidence machine turning, you have to provide initial inputs. Examine
human daily life in terms of the database concepts of "front end vs. back
end."
“In software design, the model-view-controller architecture
for example, provides front and back ends for the database, the user, and
the data processing components. The separation of software systems into
front and back ends simplifies development and separates maintenance.
A rule of thumb is that the front (or "client") side is anything you can see
when you view the code source of the page (when you are on the client
side, i.e. not on the server.) To view the server-side (or "back-end")
code, you must be on the server. The confusion arises when you have to
make front-end edits to server-side files. Most HTML designers, for
instance, don't need to be on the server when they are developing the
HTML; conversely, the server-side engineers are, by definition, never on
anything but a server. It takes both, to be sure, to ultimately make a
functioning, interactive website.”
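The front-end/back-end division described in the quote can be sketched in miniature; the class and method names below are hypothetical, chosen only to mirror that split, not taken from any particular framework.

```python
# A minimal sketch of front-end / back-end separation. The "back end"
# owns the data and its rules; the "front end" only formats what the
# back end exposes, mirroring the client/server division in the quote.

class BackEnd:
    """Server side: storage and business rules, invisible to the client."""
    def __init__(self):
        self._records = {}

    def save(self, key, value):
        self._records[key] = value

    def load(self, key):
        return self._records.get(key)

class FrontEnd:
    """Client side: a user-friendly interface over the back end."""
    def __init__(self, backend):
        self._backend = backend

    def show(self, key):
        value = self._backend.load(key)
        return f"{key}: {value if value is not None else '(no entry)'}"

backend = BackEnd()
backend.save("note", "context is meaning")
ui = FrontEnd(backend)
print(ui.show("note"))     # note: context is meaning
print(ui.show("missing"))  # missing: (no entry)
```

Swapping the storage inside `BackEnd` (say, for a real database) would leave `FrontEnd` untouched, which is the maintenance separation the quoted passage describes.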
January 2013

September 1992

Leibniz writes in 1714, "Moreover, it must be avowed that

perception and what depends upon it cannot possibly be explained by
mechanical reasons, that is, by figure and movement. Supposing that
there be a machine, the structure of which produced thinking, feeling,
and perceiving; imagine this machine enlarged but preserving the same
proportions, so that you might enter it as if it were a mill. This being
supposed, you might enter its inside; but what would you observe there?
Nothing but parts which push and move each other, and never anything
that could explain perception." The philosopher David Cole argues to
disarm Leibniz' "mill" argument in the following way, “Blow up a tiny
drop of water until it is the size of a mill. Now the H2O molecules are as
big as the plastic models of H2O molecules in a chemistry class. You
could walk through the water droplet and never see anything wet."
Cole's point is that we all know that an individual water molecule is
"wet" and so our inability to see its wetness from the inside, as it were,
doesn't prove that it’s not really wet; similarly, a machine might be
conscious even though this consciousness is invisible from every
perspective - except the machine's, of course. There are a number of
serious objections to Cole's refutation of Leibniz' mill argument. Firstly,
it is doubtful whether the property of wetness may be attributed to a
single water molecule in the first place because wetness can be seen to
be an essentially collective phenomenon. The following example will
serve to explain. In the modern version of Robert Millikan's oil drop
experiment, wherein he demonstrated that the charge of the electron was
quantized, oil drops, being somewhat messy and inconvenient, have
today been replaced by tiny silicon spheres. These little spheres are so
diminutive (about .001 millimeters in diam.) that they are not visible to
the naked eye, nor are they palpable in the hand, unless there are quite a
large number of them assembled together, in which case they "coalesce"
together to form a little pool in the palm. This pool is indistinguishable
from one made of water except that there is a slight difference in the
perceived viscosity of the "fluid" which seems a little on the cloudy side.
The point here is that this collection of tiny spheres feels wet; and there
is no question whatever that a tiny sphere of silicon, blown up to the size
of one of Cole's overgrown H2O molecules, simply could not possess the
property or quality of wetness.

But what is all this thought to prove? Well, in Cole's case, a person
hearing his argument experiences a contradiction of sorts between his
expectation that size (which is owing to external perspective) shouldn't
make any difference in the wetness (intrinsic property) of a water
molecule, and his vague, visually based intuition that "if I can see the
damned parts of the thing, then, well, how can it really be wet?" In the
case of the tiny spheres of the "oil drop" experiment, we know that
wetness is an emergent collective property which depends upon our not
being able to distinguish the individual spheres, either visually or in
tactile manner. But the reader should not think that I have simply done
Cole's argument one better; for a quality which arises in perception
strictly through our inability to perceive what is part of the object "out
there" cannot be thought to be a quality of the object itself, but is
supplied by the perceiving mind. But there is an even more serious
objection to Cole here. A hypothetical set of conditions, relating to mere
appearance, cannot be of any support to an argument if the would-be
physical state of affairs, to which the appearance corresponds, is itself
impossible - however clearly one "intuits" its possibility. This
observation is very much operative in the case of Cole's overgrown H2O
molecule: to conceive of the water molecule as increased in size by a
factor of roughly 10 billion is equivalent to shrinking Planck's constant
by the square of this factor, or by a factor of about 100,000,000 trillion (10^20);
this is because the constant contains units of length squared.
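The scaling arithmetic here can be checked directly; this is only a sketch of the dimensional reasoning, with the roughly 10-billion-fold factor taken from the text:

```python
# Sketch of the scaling arithmetic above. Planck's constant carries units of
# kg*m^2/s, i.e., it contains length squared; rescaling all lengths by a
# factor lam therefore rescales the constant by lam**2.
lam = 1e10            # the text's roughly 10-billion-fold enlargement
h_factor = lam ** 2   # factor by which h must shrink in the enlarged units
print(f"{h_factor:.0e}")  # 1e+20, i.e., about 100,000,000 trillion
```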
But as any theoretical physicist will tell you, the energy uncertainty (via
Heisenberg’s principle) is proportional to this constant and this energy
uncertainty is responsible for the fluctuations in vacuum energy that
"pump up" the electronic structure of all atoms. With the value of h so
diminished, all of the electrons within the H2O molecule will collapse
into their respective nuclei, making the water molecule's covalent bond
structure, and so its intrinsic "wetness", flatly impossible. Since all
matter particles are created and sustained through the dynamical action
of the fluctuating vacuum energy, these particles exist merely as
abstractions from the vacuum structure, with the quantum vacuum

providing the unified natural physical law for the cosmos; we might therefore
expect that all peculiar phenomena which emerge across the interface
between the microscopic and macroscopic domains (of Reality) are due
to the fundamental vacuum dynamism; and this goes for the emergence
of "mind" within the developing human brain, as it has been found that
individual cortical neurons are capable of responding to the presence of
vacuum electromagnetic field fluctuations. But if a unified law of the
vacuum exists, it must be of a transcendental nature since the vacuum
constitutes an open system; any statement of physical law for such a
vacuum must necessarily leave something out of its scope. This means
that a so-called physical law for the vacuum therefore, is always for the
vacuum plus certain superimposed boundary and initial conditions.
Infinite perturbative levels of description of the vacuum's Hamiltonian
are possible. It is perhaps more than a strange coincidence that our
inability to distinguish, one from another, the discrete energy states of
the H2O molecule, reflected in this molecule's quantum energy
uncertainty, is responsible for its quality of "wetness," just as in the case
of the tiny silicon spheres, which, through our ignorance of their
individuality, coalesced to form a pool of wetness. To continue what
may amount to the unfair importation of facts into a philosophical
discussion, we will continue in our vein of drawing lessons from
quantum theory so as to further examine the possibility of a "thinking
machine." Another way of looking at Leibniz' mill argument against a
conscious computing device is in terms of the blueprint containing the
design from which the would-be computing device is constructed. We
should ask ourselves this important question, "How does a computing
device, while it is engaged in the very act of "thinking," constitute a
better embodiment of mental activity than does the network of ink
scratches on paper which constitute its blueprint?" "Well," we might
say, "the ink scratches are just sitting there on the paper, they're not
doing anything - still less could they be thinking!" You see, somehow
we have the intuition that something which is moving or undergoing
change possesses a better chance of having thought than something
which isn't moving. Partly this is just the influence of that residue of
animistic thinking which we have inherited from our primitive ancestors,

but partly, again, it is, I believe, a case of our seeing, however vaguely,
into the heart of a problem which, it seems to us, involves an elusive
peculiarity of consciousness' relation to time. I am reminded here of
Kant’s assertion that consciousness is the intuition of the passage of
time! @$Take-away?: there are at least as many dimensions of time as
there are individual consciousnesses.
Now, if a computer can simulate the human thought process so well that
the simulation becomes the reality, then what is there, in principle, to
prevent a human being, if only a particularly gifted one, from simulating
the functioning of a computing device - by simply being given a
problem and then tracing out with his eye the relevant wire and circuit
element symbols on the blueprint so as to produce the correct answer?
Now such a hypothetical human being would not need to know how to
solve the problem posed to him, but he need only know how to read the
circuit diagram describing the computer whose functioning he is
simulating. Now suppose that this talented human were to simulate the
computer's act of imagining a red sphere against a background of sky blue. To do this the human must read the right series, in the right
sequence, of markings on the computer's logic diagram; does it matter
how fast the series is read off? - that is a question which we will have
cause to examine later. But if we postulate that consciousness is a
necessary component in the faculty of recognition - a faculty very much
involved in the human's act of reading a stream of symbols - then it
appears we require consciousness (that of the human) to get
consciousness (that of the computing device): even if the recognition
required to read or interpret the symbols of the circuit diagram doesn't
itself require consciousness, it nevertheless requires the utilization of
circuitry (and plenty of it!) not described anywhere on the blueprint
itself because presumably this part of the computer's blueprint has to be
read as well and so we're landed in a viciously circular, infinite regress.
But suppose, all the same, we tried to construct the blueprint in this
manner, representing at each progressively tinier level of calligraphic
scale, the details of the blueprint at one particular level, which were to
provide the instructions for reading the blueprint at the next higher scale.

At some level along the downward spiraling hierarchy of spatial scale
we would be working, whether with micro-rapidograph or
submicroscopic etching tool, at a scale where the particle behavior of
matter (as collections of independently existing "things") gives way to
its wavelike behavior. It is at this scale where the tiny subdiagrams
which we try to etch into the paper are subject to the seemingly random
fluctuations of the vacuum electromagnetic field energy so that the tiny
etchings themselves fluctuate at this level.
It is clear that, however these sub-sub-etc.-diagram etchings fluctuate,
they do it in a way which still manages to capture the design of our
computing device. It now appears that any attempt to produce a static
description of the computing system architecture, i.e., its blueprint,
results in a dynamic fluidity in the structure of the physical realization of
this description at the level of scale marking the approximate boundary
between particle and wavelike behavior of matter. July 2011 The ultimate
context for the atomic scale circuitry of our proposed conscious
computing device must be the quantum vacuum electromagnetic field,
itself the original source of Heisenberg momentum-energy uncertainty.
The vacuum fields provide the context for the computer’s circuitry,
which itself provides the initial and boundary conditions upon the
vacuum fields’ self-interaction. This circuitry acts passively to allow the
vacuum to reprocess its own preexistent quantum entanglements
(containing preexistent quantum entanglement-encoded information).
January 2014

The dynamics of intelligent design manifests itself at precisely
the same scale of time and space as observer quantum self-interference
effects come into play. These interference effects are likely related to
quantum correlations having established themselves between
superpositions of networks of nonlocally connected microtubule tubulin
dimer states of the observer’s brain and analogous superposition states in
matter with which the observer’s brain has become quantum entangled.
(The subtle relationships between quantum superposition, entanglement,
correlations and fluctuations are to be profitably explored further in just
this context.) This fact of the self-organizing properties of particles and

fields coinciding with the bootstrapping dynamics of human
consciousness may well forever systematically impede any and all
attempts to falsify the intelligent design hypothesis. Strong indicators of
this threshold should arise at precisely the point at which computer
engineers' attempts to extend design down to the nano level - at which
some spontaneous bootstrapping behavior of quantum computer
integrated circuits induces auto-jailbreaking. The distinction between
hardware and software should have already been more or less obliterated
before this stage is reached. Ironically it is the self-interference of
intelligence as such which shall prevent the falsifiability of the
Intelligent Design hypothesis. @$ This is a kind of backdoor application
of a curious adaptation of Kurt Gödel’s Incompleteness Theorem to
computer engineering and shall reveal what is a stronger notion than
falsifiability within the philosophy of science. June 2014 eml=”I have
considered part of what Ziad is talking about on many occasions over
the past few years. Once the first practical quantum computers begin
operating, it is likely that unanticipated phenomena will be observed in
the behavior of these computers, which shall really only be input/output
interfaces to the quantum vacuum (which is what does all of the actual
computation and information processing, the “heavy lifting”, if you will
– not the human engineered hardware). Because of the instantaneous
connectivity of information in the nonlocally connected quantum
vacuum, information put there right now billions of light years away or
billions of years ago will all be simultaneously available to become
quantum entangled and interfere with the quantum entanglements
generated by these new quantum computers. Some sort of filtering
system will have to be developed to deal with the “outside interference”
of relic nonlocally encoded information, the theory for which does not
yet exist.” November 2014 Boltzmann brains do all of this “heavy lifting” that
lifts the operation of the human brain above the level of a biological
automaton/philosophical zombie operating only on a preestablished
Skinnerian stimulus-response program. @$This “outside interference”
may be attributed to coherent discrete quantum vacua viral memes and
“meme nodes”, which are themselves not fully fledged Boltzmann
brains, but may be more appropriately styled prn=“Boltzmann ganglia”.

(“Boltzmann ganglion”, although a logical extension of the concept of
“Boltzmann brain” is itself currently a “Boltzmann meme”, i.e., a
nonsensically contextualized solitary appearance of the term “Boltzmann
ganglion”, c.f., search within
– while the term, “Boltzmann ganglia” is not yet itself a “Boltzmann
whackblatt” (BW), which is to say, it is as yet an unrealized Googlewhackblatt.)
Dear Greg, I’d like to share with you some thoughts I’ve
had since we last spoke in person concerning the fundamental
limitations of artificial intelligence, i.e., “why computers can’t think.”
One feature that we can point to straight away as being necessary to
the possession of consciousness in a cybernetic system is memory.
Memory is what makes James’ “stream of consciousness” the
Heraclitean river into which one cannot step twice. au=Henri Bergson has
pointed out in his book, Time and Free Will, that a genuine recurrence of
a thought or feeling is impossible if it is suffused with the memory of its
having occurred at some earlier time – this peculiarly self-referential
quality of the memory could not have formed part of the texture of the
thought as it originally occurred: if the same notion occurs to one at
points in one’s life widely separated in time, the context will be
significantly altered between each such occurrence so that this notion as
it appears in this new context must stand in a metaphorical relationship
to itself in its originating context – that is, if the latter thought is to
constitute the recollection of the former. Paradoxically enough, if the
second occurrence of an idea were really identical to what it was at its
first occurrence, then the experience of having remembered something,
since there could be no memory, must be an abstract feature of an
experiential field which is itself temporally integrated. The process of
the temporal integration of experience is a process which itself must
occur in a time sequence. We might ask the question, What is the
purpose of experiential, i.e., phenomena which formerly bodied forth
within the consciousness stream, if they are never again recollected?, or
rather, never can again be recollected?
How could these very
experiential phenomena ever have formed part of the consciousness-stream,
September 1992 eml=
itself essentially characterized by the unique property of
temporal integration or wholeness, if they are never referred to further
on “down the stream?” The answer, or something fundamentally akin
to an answer to this question , must be that all experience is in some
sense remembered experience, that each and every experience which one
can point to as it “bodies forth” in the stream of individual
consciousness contains within itself densely-packed myriad references to
analogues of itself in earlier “incarnations.”
Conscious experience
itself, in other words, but makes its appearance within the stream as
already temporally integrated. Another theory is that there are myriad
different but interacting selves connected with the normal functioning of
a human brain, each with its own information frequency bandwidth, with
the range of frequencies peculiar to each being associated with its own
scale of temporal integralness or wholeness. One of the major functions
of the faculty of attention may be the switching of consciousness
between different bandwidths associated with distinct experiential
temporal scales. July 2011 Consider the brain stimulation experiments of Libet,
in which the cortical area of the hand must be electrically stimulated for
500 ms before a primary evoked potential (EP) associated with conscious
perception of stimulation of the hand develops, together with the fact
that stimulation of the hand several hundred milliseconds after cortical
stimulation of the corresponding area had already begun was always
perceived as occurring hundreds of milliseconds earlier than the direct
cortical stimulation. Libet interpreted these findings as meaning that an
approximately 500 ms time delay or time buffer was needed in order to
effect the necessary temporal integration of conscious sense-perception.
Also, the backwards-in-time referral of the stimulation event by up to
500 ms is a phenomenon (one might say epiphenomenon) of this
temporal integration as well as being tangible proof of the temporal
multidimensionality of conscious experience and the subjective,
projective nature of the perceived unidirectionality of so-called objective
time. Along the lines of the “Boltzmann Brain” idea, the quantum
vacuum is supposed to possess a latent recursive structure, one that
includes not only perceptions and memories of experiences, but also

memories of memories of experiences, etc. July 2012 “If one has an
equilibrium state that lasts an infinite time, fluctuations around
equilibrium can lead to any state whatever popping out of the vacuum
just as a statistical fluctuation, with emergence of a local arrow of time.
This leads to Poincare’s Eternal return (any state whatever that has
occurred will eventually recur) and the Boltzmann Brain scenario: you
can explain the existence of Boltzmann brains not as a result of
evolution, but just as an eventual inevitable result of statistical
fluctuations, that is, if an infinite amount of time is available ([12]:201227)”,
Large bandwidths associated with large information frequency ranges
would be in turn associated with large integral time scales in which
richer, more meaningful and nuanced experiences would be possible
than on smaller time scales.
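The claim that larger bandwidths go with larger integral time scales has a well-known quantitative counterpart in signal processing, the Fourier duration-bandwidth bound Δt·Δf ≥ 1/(4π); the numbers below are purely illustrative and not drawn from the text:

```python
import math

# Fourier duration-bandwidth trade-off: for any signal, the RMS duration dt
# and RMS bandwidth df satisfy dt * df >= 1/(4*pi). A Gaussian pulse
# saturates the bound, so its bandwidth is fixed once its duration is chosen.
def gaussian_bandwidth(dt):
    """RMS bandwidth (Hz) of a Gaussian pulse with RMS duration dt (s)."""
    return 1.0 / (4 * math.pi * dt)

dt = 0.5                      # illustrative half-second integration window
df = gaussian_bandwidth(dt)   # the corresponding minimal bandwidth
assert abs(dt * df - 1 / (4 * math.pi)) < 1e-12
```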
The commonsensical way of comparing these different scales
experientially is in terms of the concept of retentive memory. A new
anti-anxiety drug much used in dental offices and perhaps more popular
among both dentists and their patients than the old standby, i.e., laughing
gas, is the drug Ativan. Ativan succeeds as an anti-anxiety agent due to
its peculiar property of contracting the anxious patient’s retentive
memory span. Pain experienced by the dental patient during surgical
procedures is eased through the use of this drug, not by an actual
reduction of the intensity of the pain itself, but through a reduction of
the patient’s anxiety about the pain he is experiencing.
The temporal window within which the patient is constrained to
experience what is happening to him while he is under the influence of
the drug is simply too contracted to contain temporally unwieldy
thoughts such as, “Oh no, he’s going to hit the root,” or , “I wonder if
it’s going to get worse?!” In retrospect, my experience of my own state
of consciousness while under the drug’s influence was so like how I
imagine my childhood consciousness to have been, although

paradoxically, I was much more afraid of the dentist then than I am now.
This suggests an altogether new way of understanding the function of
the brain as the “organ of consciousness.” Instead of building up
conscious experience from some determinate set of primitive elements,
e.g., sense data, the brain abstracts from a sea of information signals
(eternally preexistent, as we will argue later), in much the same manner,
I believe, that the single strand of DNA extracts from its cytoplasmic
soup the nucleotide bases it needs to duplicate itself. The brain acts as a
kind of template for thought in the sense that the activity of the brain as
a whole serves as a frame within which the particular features of this
activity of the signal matrix, e.g., quantum vacuum electromagnetic
field, which is complementary to the particular brain excitation pattern,
as it were, is defined through “resonating”, i.e., interacting with a
peculiar spectrum of vacuum electromagnetic field fluctuations, c.f.,
ancient Google cache of HeartMath Institute webpage detailing
experiments in which DNA electromagnetic signatures were recorded in
the vacuum electromagnetic fields of resonant quantum cavities (circa
Certainly the more a task is repeated, the more the series of actions
constituting this task converges to a rigidly inflexible sequence of acts:
less and less, therefore, does the sequence of neural discharges
underlying its outward performance admit modification due to the
presence of the constantly varying web of cerebral electrochemical
events taking place within the totality of contemporaneous brain
discharges. This discharge sequence, which forms the biophysical basis
of a task being memorized by rote, becomes more and more impervious
to the influence of what is occurring within the brain as a whole, i.e.,
progressively more insensitive to the neurochemical context; and to
precisely the extent to which the neural sequence becomes impervious to
outside influences, to this very extent it sinks into the milieu of the
unconscious mental processes.

So memory on the part of cybernetic systems, generally speaking, makes
their mental states, their computational states, nonreproducible in time.
It can also be shown that memory prevents a cybernetic machine’s
computational states from being reproducible in space, i.e., they cannot
be copied.
One of the functions of
consciousness may be the inscribing of patterns of sensory data into
memory for use in the interpretation of patterns of sensory stimulation
occurring in the future – for establishing the context of these future
events, which have not yet been experienced consciously.
But the converse of this is also true, namely, that the faculty of memory
is essential for the existence of conscious states of awareness for it is
only through the placing of sensory stimuli into a context formed in light
of previous experience that these new stimuli might be categorized, that
patterns of stimuli might be recognized in terms of the presence of some
object or objects, in other words, that sensory experience might possess
the property of intentionality.
This is one of the essential characteristics of historical change which
distinguishes it primarily from deterministic change. Historical change
possesses temporal integrity because the series of events/processes
comprising it tells a story, or potentially does so, by virtue of referring
to a subject or a set of related subjects. In a deterministic series of events, in
a very real sense, nothing new ever happens because all of the
information about the entire series already exists at the time of the first
event in the series. ^^(look at fringe effects due to boundary and initial
conditions – if the series has a beginning, then it can’t be completely
deterministic – there is a powerful theorem lurking here, I suspect! –
probably just Gibbs’ Theorem or an adaptation/lemma associated with
Gibbs’ Theorem though)^^ Time is spatialized in such a series because
the series is infinitely divisible; it is utterly without the property of
integral wholeness, contained within a single instant of time which
sweeps along a line of finite length. (Paradoxically, infinite divisibility is
intimately connected with a holographic topology, which is to say an
important variety of integral wholeness.) Events within a historical

series, however, do not assume a recognizable identity unless something
is known about some of the events leading up to them as well as
concerning events succeeding in their wake. Moreover, the identity and
meaning of historical changes evolve in light of subsequent historical
change, as well as in light of increasing knowledge of the history of the
events themselves. This is to say,
historical events are always partially indeterminate with respect to the
revelation of future historical change; furthermore, indeterminacy forms
the very ground of temporality in that all changes of state are ultimately
predicated upon it.
This assertion is borne out quite literally by
quantum theory through the statement of its time-energy uncertainty
principle first put forward by Werner Heisenberg, one of the theory’s
early founders. One of the major implications of this quantum principle
is that the transitoriness of all quantum processes is directly proportional
to the size of the energy uncertainty of the quantum mechanical system
in which these processes are taking place. The magnitude of this energy
uncertainty is directly related to the degree by which the system violates
the energy conservation principle of classical physics.
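The proportionality just described can be put in order-of-magnitude form as Δt ~ ħ/ΔE; the sketch below uses an arbitrary 1 eV figure for illustration, which is not drawn from the text:

```python
# Order-of-magnitude form of the time-energy uncertainty relation invoked
# above: the characteristic duration of a quantum process scales inversely
# with its energy uncertainty, Delta_t ~ hbar / Delta_E.
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
EV = 1.602176634e-19    # joules per electron-volt

def timescale_seconds(delta_e_ev):
    """Characteristic timescale for a process with energy uncertainty delta_e_ev (eV)."""
    return HBAR / (delta_e_ev * EV)

t = timescale_seconds(1.0)  # roughly 6.6e-16 s for a 1 eV uncertainty
```

The inverse scaling is the point: doubling the energy uncertainty halves the characteristic transit time of the process.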
Information is characterized by the appearance of something new, or the
notification that something new has occurred – itself the occurrence of
something new. Information and energy differ from one another in the
sense that the very same spatiotemporal series of inputs of energy into
an energy system may occur again and again, but the analogue to this
situation for data and informational systems is not definable. Here
repeated data inputs to an informational system do not constitute two
separate identical series of information inputs to the system, and, again,
this is due to the possession of memory by any informational system
worthy of the name.
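The asymmetry claimed here between repeated energy inputs and repeated information inputs can be illustrated with a minimal stateful system; this is a toy sketch whose class and method names are invented for illustration:

```python
# Toy illustration: because the system keeps a memory, a repeated,
# physically identical input sequence is not an identical *information*
# input - the second presentation arrives against a changed internal state.
class MemorySystem:
    def __init__(self):
        self.history = []

    def receive(self, datum):
        is_novel = datum not in self.history  # informative only if unseen
        self.history.append(datum)
        return is_novel

s = MemorySystem()
first = [s.receive(x) for x in "abc"]   # [True, True, True]
second = [s.receive(x) for x in "abc"]  # [False, False, False]
```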
This follows from the indeterminate’s two basic properties: 1) the
negation of the indeterminate is itself also indeterminate so that it
contains within itself a contradiction which it transcends and from which
anything at all may follow, and 2) the indeterminate, not existing in any
particular state at any particular time, must be forever undergoing all
manner of fluctuations.
To wit, the indeterminate is the eternal

excluded middle – the void through which everything passes as it
changes from what it is into what it is not. It must be clear that when A
changes into B, it does so by passing through a state which is neither A
nor B, and so there must be that which is necessarily neither/nor in
general, i.e., the indeterminate. Intentionality fulfills the function of an
underlying fundamental substance, or subject of change. The physical
is characterized by wholes being determined through merely the sum of
their parts; while the mental is characterized by parts being determined
by wholes. This definition of the distinction of mental and physical is
very similar to that provided by Schopenhauer.
These unconscious mental processes are probably logical or
computational in their essential nature and only alert the conscious mind
when a recursive tangle between several different levels of description
takes place, for what is called recursion cannot be adequately dealt with
by any logical/deductive system whatever which possesses at least the
deductive power of simple arithmetic. A process becomes automatic to
precisely the degree that it occurs without the aid of conscious guidance
or reflection. This is owing to each successive occurrence being less and
less connected to the situational context, more and more connected
internally to its immediately previous occurrence. October 2013 "When you
point a video camera at the same closed circuit TV monitor to which this camera is connected...
Since consciousness at large is the activity of the fundamental quantum field, conscious scrutiny
of specific activities of this field, say by performing laboratory observations of quantum
mechanical systems, e.g., two-slit experiment, would naturally be expected to engender
c.f., "Anomalous phenomena" may be in essence
though conveniently characterized as any phenomena exhibiting a partial breakdown of that
normal, workaday, hard-and-fast division between the observer and the observed, the internal
and the external, that is presumed by 19th Century physics. A total breakdown of this division
could be characterized as stirring awake from a lucid dream.

Another characteristic of conscious thought which is essential to it is
what John Searle calls, in his book Minds, Brains and Science,
“intentionality.” A thought possesses intentionality if it is “about
something.” In other words, thoughts, to be of the conscious variety,
must be transcendent in the same sense in which ink scratches on paper

in a language known to the person reading them, i.e., the interpreter of
these scratches, transcend them as physical tokens. The ink scratches
on paper which constitute the circuit diagram of a so-called
supercomputer do not appear to be transcendent, at least as far as the
computer itself is concerned, as the behavior of the computer is simply
isomorphic to, i.e., runs parallel to, the structure embodied in the diagram.
Any sort of causal interaction between the physical embodiment of the
diagram, i.e., the computer’s “hardware” and any program it might be
carrying out, i.e., its “software” disrupts this nice isomorphism and
would constitute transcendence by the computer of its program together
with the physical embodiment of its circuit diagram, its hardware. The
distinction hardware/software significantly parallels the distinction
(more fundamental for our purposes), that of energy/information. October
But any computer program is always implemented within a causal
context, however, the fluctuation-correlation structure of the underlying
quantum vacuum does not significantly contribute to this context until a
delicacy, fineness and subtlety of operation is reached approaching that
of energies comparable in size to quantum fluctuations, which are in turn
comparable in size to the Heisenberg energy uncertainty of the
computer’s central processor in its capacity as a quantum mechanical system.
Although the context of thought is forever changing, we do not find
ourselves lost in a bewildering phantasmagorical world of endless
metamorphosis. The human mind is able to utilize notions, in their
original occurrence as insights, in ever newly-arising contexts. Some
philosophers of mind style mind as the metaphor of all metaphors. The
stability of the Self within the stream of consciousness is heavily
dependent upon its facile use of metaphor. Metaphor, however, is only
the application of categories of thought in one context which have been
borrowed from another. Needless to say, these categories had to be at
some point created ab initio through the more general process of
abstraction, or, the formation of abstract categories. The process of
abstraction always involves the treatment of certain details in which
things differ as unimportant so that other features may be foregrounded

and grouped together within the same set or class. The creation of a
system of such categories sets the stage for recognition on the part of a
cybernetic system. There will always be multiple ways of categorizing
the data which are continually streaming into the sensory apparatus of
the system; no hard and fast rule or set of rules may be worked out ahead
of time to prevent the emergence of an ambiguous collection of data, and
so the necessity is always at hand of deciding how the data will be
interpreted, and this may only be accomplished through metaphor or
through the defining of new broader or narrower categories with which
one structures the ambiguous data. Note that the information content of
data is always open-ended and contingent. Abstraction requires first of
all the capacity of the cybernetic system to define for itself what is to be
considered relevant and what is not. Relevance, however, may only be
established in the light of some previously determined aim or purpose.
Purposes are always defined in service to the larger or broader aim
which is in view. The broader the purpose which one is pursuing, the
greater the scope which one has in satisfactorily fulfilling it. For human
beings, this broadest purpose, the instinct which we share with the rest
of biological creation, is simply the ever-recurring goal of physical survival.
If the Copenhagen interpretation of quantum mechanics is essentially
correct, i.e., where the wavefunction is a probability wave representing
the state of an observer's knowledge so that it is indeed the
consciousness of the observer which is responsible for collapsing the
wavefunction and not the physical disturbance to the wavefunction
provoked by his measuring device, then it should be possible to carry out
a "delayed choice" type experiment: a standard two-slit interference set
up is constructed where two video cameras are substituted for two
conscious observers, one "viewing" both slits (camera A) and another
camera "viewing" the backstop where either an interference pattern or a
random "buckshot" pattern of photon strikes appears. If this experiment
is performed in the absence of a human observer and then afterwards,
perhaps years later, the film in the backs of cameras A and B is examined, it will be found that the order in which the cameras are opened and their film emulsions examined will make a difference in whether the film from camera B has recorded on it either an interference pattern of photon waves or a "buckshot" pattern of photon "bullets." In other words, if the film in camera A is examined first, then
an observer possesses knowledge as to which slit each photon passed
through so that the wavefunction of the paramagnetic particles coating the surface of the film emulsion in camera A undergoes a collapse from
the previous superposition state leading to an interference pattern to a
positional eigenstate leading to the "buckshot" pattern of photon
"strikes." On the other hand, if the back of camera B is opened up first
and its film developed and examined, then one finds that an interference
pattern has been recorded on the film. But what now for the film in the
back of camera A which had been set up to "view" and record events at
the double-slit? Should not the series of images recorded on this film be
smeared out just enough to prevent us from telling which photons
traveled through which slits? If this is the case, then the images stored
on the film of camera B may be used to tell us whether camera A in fact
did or did not record the "actual paths" taken by the photons, though the
double-slit superposition state associated with the photon interference
pattern does not require any unique and mysterious influence of human
consciousness upon the results of the experiment, but amounts to
nothing more than the effect of camera A in blocking the "pilot waves"
traveling through the slits through which the photons are observed not to
be traveling.
March 1997

This preposterously counter-intuitive thought experiment can be
defused if one requires that merely the possibility of an observer gaining
knowledge about which slit the electrons went through would be
sufficient to collapse the electron position wavefunctions so as to
produce the "buckshot" pattern of electron strikes on the phosphorescent
backstop. This is actually what has been demonstrated by several
ingenious "delayed-choice" experiments, which have been performed
during the 1990s. And it is the position of the camera relative to the
slits which, of course, determines this.
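The quantitative difference between the two recorded patterns comes down to whether amplitudes or probabilities are summed at the backstop. A minimal numerical sketch of that difference (the slit geometry, wavelength, and small-angle model are illustrative assumptions, not the apparatus described above):

```python
import numpy as np

# Far-field double-slit model: each slit contributes a complex amplitude
# at detector position x; wavelength lam, slit separation d, screen distance L.
lam, d, L = 500e-9, 10e-6, 1.0          # metres (illustrative values)
x = np.linspace(-0.2, 0.2, 2001)        # detector coordinate on the backstop

# Path-length difference gives each slit's phase at x (small-angle approx.)
phi1 = 2 * np.pi * (d / 2) * x / (lam * L)
phi2 = -phi1

# Coherent case (no which-path record): add amplitudes, then square.
I_interference = np.abs(np.exp(1j * phi1) + np.exp(1j * phi2)) ** 2

# Which-path case (camera A records the slit): add probabilities instead.
I_buckshot = np.abs(np.exp(1j * phi1)) ** 2 + np.abs(np.exp(1j * phi2)) ** 2

# The coherent pattern oscillates between ~0 and 4; the which-path
# pattern is a featureless constant 2 (the "buckshot" background).
print(I_interference.min(), I_interference.max())
print(I_buckshot.min(), I_buckshot.max())
```

The point is purely arithmetical: once which-path information exists, the cross term between the two slit amplitudes drops out and the fringes vanish.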
October 2011
As an aside, consider that the question of whether or not neutrinos can be observed with superluminal velocities may be dependent upon the experimental setup, which is in accordance with the
logic of the observer’s role in interpreting the results of a two-slit
experiment. If true, this would mean that the logic of superposition and
wavefunction collapse serves the vital role of “chronology protection.”
Note that chronology protection is of no concern when one is speaking of the subjective information or incommunicable knowledge of the individual subject. Any information that cannot be translated into
intersubjective communication must avoid the stringent restrictions of
chronology protection. I say “must” in this connection because of the
“fecundity principle” of quantum mechanics (first uttered by Feynman) that anything not forbidden by the laws of quantum
mechanics happens. Since the information contained in energy structures smaller than the Heisenberg energy uncertainty, ΔE, of the system cannot be communicated locally – only being able to accompany
a state undergoing quantum teleportation, it follows that the physical
processes constitutive of consciousness (assuming consciousness is not
“non-physical”) are of a piece with the fundamental processes of virtual
particles, fields and their reactions. If consciousness is to be considered
“non-physical”, then the distinction of “physical-nonphysical” lines up
with the parallel distinction of “real-virtual” of quantum field theory.
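To give a sense of scale for the bound invoked above, the time-energy uncertainty relation ΔE·Δt ≥ ħ/2 can be evaluated for some representative durations (the chosen timescales are my own illustrative assumptions, not figures from the text):

```python
hbar = 1.054571817e-34  # J*s (reduced Planck constant, CODATA value)

def energy_uncertainty(dt_seconds: float) -> float:
    """Minimum energy uncertainty (joules) for a process of duration dt."""
    return hbar / (2 * dt_seconds)

# Illustrative timescales (assumed for the sketch):
for label, dt in [("1 ms (neural spike scale)", 1e-3),
                  ("1 fs (molecular vibration)", 1e-15)]:
    dE = energy_uncertainty(dt)
    # Convert joules to electron-volts for readability.
    print(f"{label}: dE >= {dE:.3e} J = {dE / 1.602176634e-19:.3e} eV")
```

Even at millisecond timescales the bound is minuscule (~10^-13 eV), which is what makes "structures smaller than ΔE" so far below any locally communicable signal.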
January 2012
Einstein’s rod and clock convention could perhaps be recast in
terms of neutrinos as “relativity yardstick” instead of photons. If the
neutrino maximum speed could be pinned down to just a tiny fraction of
a percent faster than the speed of light, a tiny fraction which only results
in a correspondingly tiny fraction of a percent error in all heretofore well
established measurable relativistic corrections, e.g., with respect to time,
mass, length, etc., then the theory of special relativity could be retained
more or less intact. It would be just as though Einstein had started his
gedanken experiments using the neutrino instead of the light ray
(photon) such that special relativity remains valid as a physical principle.
This of course poses problems for the myriad physical analogue
alternative interpretations of relativity, e.g., Desiato’s polarizable
vacuum model. The momentum and energy deficits in particle
interaction calculations that formerly relied on a photon-based special relativity might then point to the existence of heretofore undetected (and
unsuspected) theoretical particles of extremely low mass, which would
now be necessary in order to reestablish an exact balance of, for
example, nuclear particle scattering and reaction equations.
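The claim that a slightly larger invariant speed would shift all relativistic corrections by only a comparably tiny fraction can be checked directly. A sketch, with the fractional excess ε chosen arbitrarily for illustration:

```python
import math

c = 299_792_458.0          # m/s, defined speed of light
eps = 1e-5                 # assumed fractional excess of the invariant speed
c_new = c * (1 + eps)

def gamma(v: float, c_inv: float) -> float:
    """Lorentz factor computed with invariant speed c_inv."""
    return 1.0 / math.sqrt(1.0 - (v / c_inv) ** 2)

# Compare time-dilation factors for a fast particle under both conventions.
v = 0.9 * c
g_old = gamma(v, c)        # photon-based relativity
g_new = gamma(v, c_new)    # hypothetical neutrino-based yardstick

# The fractional shift in gamma is of the same order as eps
# (amplified by v^2/c^2 / (1 - v^2/c^2) at high speeds).
print(g_old, g_new, abs(g_new - g_old) / g_old)
```

For ε of order 10^-5 the correction to γ at 0.9c is of order 10^-5 as well, consistent with the text's suggestion that established relativistic measurements would absorb the change.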
But what if we could assure nature, as it were, that despite the
appropriate positioning of the camera in front of the double-slit, the
observer, or any observer, would be unable to take advantage of the
appropriate physical arrangement of the camera in order to determine
which slit the electrons go through? I believe that, in this case, the
interference pattern would, again, reappear on the phosphorescent
backstop! If this were the case, then the observer would regain his
mysterious status with respect to wavefunction collapses, c.f., au=Vic
Stenger, cit=The Myth of Quantum Consciousness. One must, to wit,
assure, first of all, that it is possible to establish a closed system within
which the experimental apparatus is to be contained, in order to, in turn,
assure that, no matter how large a physical arena this quantum
experiment is performed in, the observer will not possess the possibility
of knowing the trajectories of the electrons. Not all closed experimental
situations can assure this, but a closing off of the experimental setup from the rest of an open reality must be achievable to
assure the inability of the observer to draw on hidden resources to divine
the trajectories of the electrons from their source to the backstop. (Does
this mean we must here be able to isolate the system, which is the
subject of our experiment from its embedding quantum vacuum so as to
effectively separate the system from the observer’s brain, which is also
“embedded” in the context of this same “quantum vacuum”?) This
suggests that the observer's ability to collapse the wavefunction consists
in a peculiar connection which he is able to make with an open-ended
reality, a reality which, as alluded to earlier, is therefore indeterminate,
i.e., nondeterministic. It is interesting to note that it is only within a
closed physical system, where the boundary conditions of the vacuum
field are changing only adiabatically, that a superposition state may be
supposed to exist. Presumably, the closed system cannot adequately
accommodate the phenomenon of the observer's consciousness, which is what disturbs the system, resulting in a collapse of the superposition
state which heretofore existed within it, and this, just by virtue of the
mere possibility that the observer may obtain knowledge of the system's state with respect to the superposition observables. October 2011 Whether or not a system is “closed” or “open” is of material importance
to the question of whether what the system “contains” is context-free
data or context-dependent information. An important question is
whether the deciding difference here is to be determined exclusively
through some fundamental difference in the correlational structure of the
system’s “fluctuation matrix”, e.g., recursive vs. nonrecursive, etc.
Throughout this discussion, we must not lose sight of the fact that the
wavefunction itself does not actually represent anything physically real
or measurable, and so all purported interactions occurring between
wavefunctions must be realized in terms of the interaction of their
associated probability density functions. (Caveat: there is growing experimental evidence at the time of this writing, October 2011, that the wavefunction is measurable and so constitutes a real physical entity.)
For example, Aharonov experimentally proved the reality of A, the magnetic vector potential, by measuring changes in the quantum phase of A within a region where the magnetic field, B, was absent. So if A is identified with the wavefunction of the photon, this proves the reality of that wavefunction. A superposition state is only defined where each of the component
superposed wavefunctions has an associated probability via the square of
its amplitude, although here the assignment of unique probabilities to both the interference pattern and the "buckshot" pattern is problematic - a turn of events which, on the Copenhagen interpretation, is determined solely by the decision of the conscious observer as to which camera, A or B, he/she opens first.
Remember that in the theory of quantum mechanics a particular event
only possesses a probability of 1 if it has already occurred. It is in this
sense in which we speak of the superposition state as a combination of
quantum states, no one of which is real in itself.

The pre-Socratic philosopher, au=Parmenides, was of that philosophical
tradition which considered the ultimate metaphysical question to be
"Why is there something rather than nothing?" And he is noted for
having proclaimed "Nothing does not exist." @$But he considered that
all real change necessarily involved the instant-by-instant creation of new attributes ex nihilo. Parmenides concluded from this that change was itself impossible: being cannot come from nonbeing; therefore the universe is a static and indestructible closed system, and time was for au=Parmenides a kind of tenacious illusion. Time
would later be characterized similarly by au=Einstein. In the present day,
owing to the advent and development of the Quantum Theory, the
suggested reformulation of this most fundamental metaphysical question
is: "Why is there Information rather than Chaos?" For those persons for
whom the question, "why is there something rather than nothing," is
meaningful, belief in the existence of a transcendent reality beyond
space and time, and what is more, beyond the most general dichotomy,
the dual opposite categories, existence vs. nonexistence, the granting of
the being of Deity is theoretically but a small step. Such persons merely
have to be convinced of the necessity of Will within the realm beyond
Representation, c.f., cit=The World as Will and Representation (au=Schopenhauer). For
other persons, this most fundamental of metaphysical questions is, as
Martin Gardner puts it,
"cognitively meaningless." September 2011 As the late 20th Century
Philosopher, au=Robert Nozick pointed out in his cit=Philosophical Explanations, “The question cuts so deep…that any approach that
stands a chance of yielding an answer will look extremely weird.
Someone who proposes a non-strange answer shows he didn’t
understand the question” [italics mine]. As logicians are fond of saying:
“Everything follows from a contradiction”. If “nothing” and
“everything” are veritable dual-opposite categories, then logic tells us
that everything in between these two extremes stems from the
interrelation or interactivity of the two. In a sense, it is “nothing” which
gives “everything” its ontological status. June 2014 (c.f., National Review
article on atheism, “In the beginning was Nothing and Nothing was with
Nothing and Nothing was Nothing…”, which was pointed out to me by Randy Evans.) September 2011 On the information paradigm of existence,
“nothing” corresponds to chaos, while “everything” corresponds to pure,
self-existent information and anything in between would seem to depend
upon both. In a twist of au=Clarke’s principle, i.e., that “any sufficiently
advanced technology would be indistinguishable from magic”, we could
say that, any sufficiently information-dense structure would be
indistinguishable from chaos. Here we are relying on the rather
speculative intuitive notion that encryption is necessary to achieve
optimal information “packing densities”. Examples of this are the
number 19 code in the Quran, the Bible Code, Hebrew number-letter
correspondence, vowel suppression or absence of spaces in ancient
Hebrew texts. So the width of one’s loupe, where and how far it is
placed above the dense array of characters, determines what is read off.
This is all by way of pointing up the active role which the “interface”
has in determining information content. Can an alternate theory of mind
on the one hand, and of the public space on the other be developed from
a confluence of such concepts as selective attention, bandwidth, conservation of psychic energy, lucid dreaming, recursive functions,
fractal geometry, Wigner’s friend, Cramer’s Transactional and Everett’s
Many Worlds interpretations of quantum theory (and its many recent
“Many Minds” adaptations), epistemological solipsism, objectivity =
intersubjectivity, hard encryption, data compression, chaos theory,
quantum decoherence as grounded in an individualized vacuum
spectrum or domain? The opposite side of the coin from epistemological
solipsism is the fact that one is unknowable to others. An important
question in this connection is whether the self that is unknowable to
others is knowable to the self. If the only parts of the self that can be
known to itself are its socially or sociolinguistically constructed parts,
then one’s own self is just as unknowable as is the self of others,
perhaps? tbctd September 2011 The most efficient form of encryption is likely
to invoke self-referentiality or incursiveness of some kind. We propose
that the reductio ad absurdum of self-referentiality is represented by
consciousness. Consciousness represents the ultimate exemplar of
“hard encryption”. The doctrines of the incorrigibility of sense data and

Russell’s “privileged access” to the contents of consciousness are
consistent with the notion that the hard encryption represented by consciousness is “incorrigible” and the contents to which it grants access are “privileged.” Substance is indifferent to the passage of time
and so must be bound up in the phenomena of emergence through
substance’s ability to transcend all possible historical accounts of a
process. Just think of Leibniz’ monads here as “possessing no
windows”. The quality or qualities of substance, being that it/they
universally pervade(s) all things that exist, would forever be free of
being intersubjectively identified or classified. Classification depends
upon some things or states being the case that fall outside a given class’
purview. The individual consciousness qua substance that all contents
of a given individual consciousness must be supposed to possess can
never be identified or classified by that individual, moreover, on account
of the hard encryption represented by consciousness, nor shall any other
individual be able to identify or classify the substance that makes up that
individual’s subject contents of consciousness. Note that there could
potentially be an indefinite number of distinct differences between the
consciousnesses of an unlimited number of individual minds, both real
and possible, opening up the possibility of an ever greater number of
abstract categories and relations involving these individual
consciousnesses. And yet all of this taxonomic knowledge is forever
totally inaccessible, except to a transcendental being. It was not without
very good reason that au=Alvin Plantinga devoted an entire book to the
topic of cit=God and Other Minds. July 2012 Unfortunately, one cannot find a free copy of Plantinga's 1967 book, and he only cuts to the chase and states his real argument in the last 3 or 4 pages of his book. The rest of
the book is groundwork examining all of the arguments in favor of God's
existence and why these arguments fail. Plantinga's own argument does
not fail, once one really takes to heart the fact that one has only ever
known one's own sensory states and…that Man does not possess, nor
even can he conceivably possess, a concept of consciousness (as
opposed to a concept of a state or states of consciousness). Even if one
lives forever in a heaven or hell after this life, it will still be *just you*
experiencing *your own thoughts and sensory states*. The reality of a

genuine plurality of subjective states of conscious experience (of distinct
persons, what is better understood as a @$multiplurality) necessarily
brings in a perspective transcending mere everlastingness of an
individual (particular) mind, i.e., that of a Universal Mind. In a word,
even if you live for infinite time, you cannot have proof of the existence
of other minds - this shall always remain a question of faith, a faith no
weaker than that of the deist who has faith in a Universal Mind. September
2014 epi=
“One would hope that the plurality of being transcends the mere
“forgetfulness of the self” for in the absence of transcendental mind
these two cases are not distinct, but degenerate into the very same
instance, i.e., they are degenerate cases of the same state of affairs,
namely that of an absent-minded solipsist.” Intersubjectivity is
transcendent in the sense that the plurality of mind is grounded in a unity
beyond space and time and is not the mere degenerate unity of a secret
numerical identity. April 2012 Ethics without any practical possibility for
reciprocation and without an underlying karmic metaphysical principle,
e.g., ethics for a consciousness that is continually branching through new
and possibly only projectively extant universes. This is an example of
the hidden presumption of theism. Many other examples exist such as
from philosophy of science, linguistics, art, sociology, psychology, etc.
November 2011
The essence of Plantinga's argument comes during the few
remaining pages of the book, which regains new life in light of recent
developments in cosmology (anthropic principle, multiverse, Boltzmann
brains), quantum mechanics (Many Minds Interpretation) and artificial
intelligence/virtual reality (Nick Bostrom's "ancestor simulations"). . . . Each is an eternal entity, having escaped cosmic loneliness for a time by having taken limitation upon itself. But there is the question of how the
infrastructure that facilitates the ability to do this managed to be in place and available for our use. Although from the vantage point of any given
temporality, this infrastructure shall have always been in place such that
there is no ground for us to ask after a “first cause”, there still remains
the question, “for what reason is it there?” It seems that if causality is
not part of the question, then purpose or end must be. There lies the rub,
i.e., the area where friction develops between comatose traditional

beliefs and metaphysical hypotheses.
September 2012

"Without transcendent, universal mind there is no distinction
between the case of consciousness being a one or its indeed being a
many." The universal consciousness field splits the number degeneracy,
e.g., @$“photon number degeneracy” of the Boltzmann brains, which
real, biological brains resonantly tune to whenever those brains enjoy
conscious states of awareness. The seeming integral unity as well as
the temporal continuity of conscious experience necessarily
“piggybacks” off of the nonlocal connectivity of unique vacuum
fluctuation frequency spectra. November 2014 (We must make sure and not
lose sight of the fact that, when it comes to the workaday conscious
experience of the socially defined world of “ordinary life”, it is
evolutionarily only necessary that just enough and no more be done to
“maintain the appearances” – and there are deep metaphysical-epistemological consequences which would indeed follow from taking
this proposition as a kind of economic principle on a par in generality
with that of the principle of the conservation of energy.) September 2012 And
so personal identity qua substantial continuity of mind and mental states
is far more a function of vacuum nonlocality than it is a function of
specific reproducible instantaneous patterns of neural or microtubule
network interaction configurations in a given biological brain. In a
word, personal identity is a function of resonant Boltzmann brains qua
nonlocally connected, quantum-coherent vacuum entropy fluctuations.
And here the normal distinction of closed versus open thermodynamic
systems must be reinterpreted in light of this nonlocal connectivity,
which in some sense renders mysteriously fuzzy this otherwise hard and
fast distinction from the theory of classical thermodynamics. July 2013 edt= “
The universal consciousness field splits the number degeneracy, e.g.,
“photon number degeneracy” of the Boltzmann brains, which real,
biological brains resonantly tune to via vast networks of quantum-entangled microtubule tubulin dimers whenever those brains suffer
conscious states of awareness. The continuity of conscious experience
necessarily piggy-backs off of the non-local connectivity of prescribed

unique vacuum fluctuation frequency spectra within respective
prescribed bandwidths. And so personal identity qua substantial
continuity and unity of mind and mental states is far more a function of
the intrinsic vacuum quantum non-locality of Boltzmann Brains than it
is a function of specific, reproducible instantaneous patterns of neural or
microtubule tubulin dimer network interaction configurations within a
given biological brain. In a word, personal identity is a function of
naturally hard-encrypted resonant Boltzmann brains qua non-locally
connected, quantum-coherent vacuum entropy fluctuations. And here the
normal distinction of closed versus open thermodynamic systems must
be reinterpreted in light of this non-local connectivity, which in some
sense renders mysteriously definite and precise this otherwise fuzzy
distinction inherited from the theory of classical thermodynamics. The
so-called specious present of cognitive psychology, represented by
anywhere from fractions of a second to several seconds in duration along
one axis in multidimensional time (dependent upon something akin to
IQ) is constituted by the coherence time of the particular Boltzmann
Brain to which one's brain is resonantly tuned at a given moment." What
is absurd here is that, "The problem of existence in light of modern
physics and cosmology becomes just this: did the universe tunnel into
existence from a false vacuum state many billions of years ago, or did
my consciousness just tunnel a fleeting moment ago into some freak
accident of a Boltzmann brain which is only just now on the verge of
quantum decoherent collapse?"
October 2012

When one rejects God and adopts atheism, what one has
secretly done is to have rejected the notion of a transcendent, universal
mind as well as that of the very ground of mind as such. The rejection
of the transcendent other is therefore just a temporary stopover along the
path to the rejection of other minds and otherness as such. This is the
case even if, as it turns out, one’s thinking does not possess sufficient
logical consistency to carry one along the full course of this
philosophical path – a null result that is overwhelmingly probable. This
is a restatement of the piquant observation that, “if naive realism is
metaphysical baking powder, then atheism is just half-baked solipsism.”

This is all a direct consequence of the fact that, we can have no general
concept or category, termed “consciousness” such that the conscious
states of awareness of each and every other person should constitute a
true instantiation (along with one's own individual consciousness) of
said general concept . . . that is, with this observation set juxtaposed
alongside the indisputable fact that solipsism is false and other minds
(whose ground of mentality is apart from that of one's own) do in fact
exist. But the grounding of a conception, that is, of its knowability, if
you will is distinctly epistemological and not a matter of metaphysics.
“The term is impossible to define except in terms that are unintelligible
without a grasp of what consciousness means. Consciousness is a
fascinating but elusive phenomenon; it is impossible to specify what it
is, what it does, or why it evolved. Nothing worth reading has been
written about it” [italics mine], c.f., Sutherland, S., International
Dictionary of Psychology, 1989.
The isolated system possessing only “internal” connections and
the system connected to an outside are topologically distinct and
therefore logically distinct and objectively dissimilar. There is no
possibility for an isolated system to realistically simulate connection to
an open-ended external realm. This open-endedness is vouchsafed by the
presence of the other and the self must be a sociolinguistic . . . well, not
so much construct as collaborative preparation. This and more is all
captured in Wittgenstein's "Private Language Argument". Plantinga and
Levinas, their philosophy of mind, that is, brings the inextricably
commingled errors of solipsism and atheism into stark relief in the light of Wittgenstein's Private Language Argument. As has been said many times before, “without context
there is no meaning”. Multiple points of view are required to support a
veritable distinction between “inside” and “outside”. A system
composed of “the solipsist and the external world” is a howler of a
misnomer for its rightful name is “the solipsist and his external world”,
which, of course, is no external world at all. The solipsist and his
external world, so-called is always logically reducible/equivalent to . . .
the solipsist and his private sense data. A closely related topological
application to philosophy of mind and epistemology is whether the
April 2014

mental states of the solipsist could have all been fed into the solipsist’s
mind by a team of evil, big-headed aliens. This is just the humorous,
compact travel version of the thesis of the self as sociolinguistic
construct. Without the guiding rails of a grammar, the mind is
overwhelmed by the chaos of its internal babbling. And yet it is indeed
“babbling babes” who re-creolize the Pidgin language of their parents,
generation after generation. That’s how any coherent language sounds to
an infant who as yet possesses no taught rules of grammar: as a Pidgin. But
what of Wittgenstein’s Private Language Argument here? At least on the surface, the marked ability of toddlers to creolize their parents’ Pidgin-like system of articulate sounds seems to sap the PL argument of some of its force, unless one interprets this
genetic ability as being founded upon some kind of collaborative design.
The genetic base pair sequences that inform this unique ability of
babbling toddlers itself possesses a grammar, rather than being the
expression of a haphazardly cobbled together Pidgin, as natural selection
of random genetic mutations would entail. Remember that mutations are
always being “interpreted” and “expressed” within the context of a
multilevel gene regulatory network. The organic problems of internal
logical consistency, which so affect and make untenable the solipsist’s
position, also infect that of the atheist. It is the other of the other who
makes the rules underlying the system. Contemplate the irreducibility of
“the other of the other” to “the other”. The necessity of transcendent
other is glimpsed in this irreducibility. June 2014 Without genuine
intersubjectivity, objectivity cannot be simulated and the notion of
objectivity would then be an incoherent one. Similarly, without self-consciousness, Man, as a philosophical zombie hominid ape, would have
never arrived at the notion of God as a transcendent other.
It is thought by less gifted MI linguists that the 3/3 linguist who
maintains his or her 3/3 rating in significant part by remembering the
answers on the test from previous takings maintains an unfair advantage.
This complaint is however informed by a failure to appreciate that the
more gifted linguist must have reached a 3/3 on his or her own at some
previous stage.

July 2013

True agnosticism is being undecided equally between two possibilities for the origin of the universe: that the other is God, or that the self is God.

True atheism is the belief that the self is but an illusion as only God
vouchsafes the moral status of the individual with respect to the ethics of
his social order.
Generally speaking, the Earth's population decreases as one moves into the past, but the number of one's ancestors doubles geometrically with each generation back.
Clearly at some point in the past sexual reproduction was not available
to propagate a given species. This is an example of the application of
the anthropic principle.
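The doubling claim is easy to make concrete: the nominal ancestor count 2^n overtakes any plausible historical population within a few dozen generations, so distinct ancestor slots must be filled by repeated individuals (pedigree collapse). A rough sketch (the population figure is a round illustrative value, not a precise estimate):

```python
# Nominal ancestor count doubles each generation back: 2**n.
# World population around 1000 CE was very roughly 3e8 people
# (round figure, used only for illustration).
PAST_POPULATION = 3e8

generation = 0
while 2 ** generation <= PAST_POPULATION:
    generation += 1

# With ~25-year generations, 2**n outstrips the whole population only a
# few centuries back, so family trees necessarily overlap.
print(generation, generation * 25, "years")
```

The walk-back runs out of distinct people in under a millennium, which is the arithmetic behind the observation above.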
This reminds us of Fred Hoyle's prediction of the 7.65 MeV energy level of carbon-12.
Quantum entanglement could have been predicted from an analysis of
the large 3-way collision cross section for Helium nuclei in the
formation of the 7.65 MeV resonant carbon-12 state. @$This is because of
the generalization of the notion of cross section that is involved in
interpreting a 3-way collision which is otherwise extremely improbable.
This is especially revealed in the arithmetic that applies to the addition
of cross sections for helium nuclei in the 3-way collision by which
carbon-12 is formed in the stellar interior.
The probabilities don't add according to the normal rules. The probability calculus implies subsystems of negative, imaginary, and complex probabilities.
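The "abnormal" addition alluded to here is the standard quantum rule that amplitudes, not probabilities, are summed over indistinguishable paths. A minimal sketch with illustrative, unnormalized amplitudes (not the actual triple-alpha cross sections):

```python
import cmath

# Two indistinguishable paths with complex amplitudes (illustrative values).
a1 = 0.6 * cmath.exp(1j * 0.0)
a2 = 0.6 * cmath.exp(1j * cmath.pi / 3)

# Classical rule: probabilities add.
p_classical = abs(a1) ** 2 + abs(a2) ** 2

# Quantum rule: amplitudes add first, then square.
p_quantum = abs(a1 + a2) ** 2

# The difference is the interference (cross) term 2*Re(a1*conj(a2)),
# which can enhance or suppress the classical sum depending on phase.
cross_term = 2 * (a1 * a2.conjugate()).real
print(p_classical, p_quantum, cross_term)
```

With a relative phase of π/3 the quantum sum exceeds the classical one by the cross term; with a phase near π it would fall below it, which is the sense in which the "normal rules" of probability addition fail.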

The ego as the tuning filter of molecular and atomic resonant
interactions and reactions. The anthropic principle puts the self at the
only fixed point in the otherwise totally open and chaotic realm of being.
The general nature of quantum entanglement is the general nature of
consciousness ( in the absence of a veritable concept of consciousness)
which preserves the precise boundaries of the domain of the self.
Why would Mike find it useful to buy such things in a ward at the Children's Hospital?
'There are no spies, only some squirrels that are more secret than others.'
January 2013

Behavioral genetics is perhaps the determining factor in how
this “necker cube” of an epigram is perceived: “If naïve realism is
metaphysical baking powder, then atheism/theism is just half-baked
solipsism.” Plantinga’s critique of evolutionary naturalism (as self-defeating, logically) points up a dilemma (originally hit upon by the ancient Greek philosopher Empedocles) in attempting to apply behavioral genetics and epigenetics to critical analysis. Developmental trends in search engine technology and intellectual property philosophy in the postmodern age shall one day soon assure us that an army of young scholars shall see to it that the true genealogies of paradigm-busting ideas are brought to light and credit given where credit is due.
At the level of individual human beings, there are myriad though
mutually conflicting points of coherence pertaining to an equally diverse
number of points of view.

If naive realism is metaphysical baking powder, then atheism is just
half-baked solipsism. April 2012 Buddhism seems to be the only viable
path for bypassing the theism vs. solipsism dual opposition. This is
because Buddhism views the self as an illusion, either that of the
individual or of God.
January 2013 fcbk=

"The evolutionary process has a finite amount of time in
which to work (because of eventual heat death), while Boltzmann brains
have an infinite amount of time to appear in any given universe (that
doesn't recollapse), and in the environment that birthed those universes.
The ordinary observers created by natural selection simply drop out of
consideration, being overwhelmed by infinitely more Boltzmann
brains."
This reformulation constitutes, almost by itself, the answer to its
precursor: the pre-Socratic question "Why is there something rather
than nothing?" is insoluble in its demand for a relation between being
and nonbeing apart from their mutual exclusiveness whereas the modern
counterpart to this question does not at all demand from us the
impossible as there are many examples, both empirical and
mathematical, where chaotic systems acquire order through self-organization or ordered systems become chaotic through an increase in
entropy; cf. “deterministic chaos”. July 2011 Data may be considered
to be the embodiment of information in the sense of constituting the
necessary, but not the sufficient condition for the presence of
information. There must be a special characterization of some subset of
the sum total of necessary conditions for some state to occur or obtain,
which combined with another subset of such conditions constitutes the
sufficient condition. The relationship of the two subsets would be
complementary, akin to figure-ground. April 2012 If information is
gerundial, as in informing, then there should be something akin to proto-information. The question then is whether this implies such a process as
proto-informing. It might be profitable to distinguish
instructions, data, metadata, and information. Analogies such as Internet,
webpage, hyperlink, operating system, cloud/cloud computing may be
both helpful as well as limiting here, hence the ever present need to
create new contexts, i.e., new myths.
But what, you may ask, is contained within the Quantum Theory which
suggests this reformulation? Very simply, the Quantum Theory does not
treat the vacuum as a veritable emptiness, but rather as a medium of

chaotic fluctuations of positive and negative energy which cancel each
other, averaging out to zero net energy over distances larger than an
atomic diameter, say. Subatomic particles, the penultimate constituents
of matter, come into existence when energy fluctuations over a small
region of the vacuum respond to each other's presence through the
accidental formation of feedback paths among themselves. These
feedback structures may remain stable for only extremely fleeting
periods of time or they may become robust and persist against their
chaotic backdrop for longer periods permitting the formation of more
complex hierarchical structures. The presence of information is the key
ingredient determining if such fluctuation networks persist against the
background of quantum fields. In terms of information theory, the
vacuum is filled with an infinite number of messages crossing it to and
fro from every direction; material particles are constituted by more
messages being exchanged within this region than between this region
and the "outside" of this region. On this interpretation, matter does not
respond instantaneously to accelerations (possesses inertia) owing to a
communication bottleneck existing between its interior and the
surrounding vacuum; matter cannot respond to the world in "real time,"
but must take time out to "process" the coded instructions which it
receives from its "inputs." One need here only compare the ease with
which a single gnat can change its direction in flight (to avoid an
obstacle, say) to the difficulties involved when an entire swarm of gnats,
or a swarm of swarms of gnats, for that matter, attempts to perform the
same maneuver based on the intelligence (in the military sense) of a
small group of harbinger gnats. These chaotic fluctuations of vacuum
energy are a manifestation of the Heisenberg uncertainty principle. This
principle states a numerical relationship between the dual physical
quantities position/momentum and time/energy. The bridging
constant between these dual quantities is Planck's constant, h, and the
modern expression of this relation is:
ΔX·ΔP ≥ ħ/2 and ΔT·ΔE ≥ ħ/2, where ħ = h/2π,
which is related to Planck's older relation,

E = h * f;
where E is energy (Joules), f is frequency (hertz), X is distance (meters),
P is momentum (kilogram-meters per second), and T is time (seconds); h is, of course, Planck's
constant which has units of Joule-seconds. There is a more sophisticated
and complete matrix algebraic statement of the principle, but this need
not concern us here.
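As a concrete instance of Planck's relation E = h·f, one can compute the energy of a single photon; a minimal illustrative calculation (the green-light frequency is my own choice, not from the original):

```python
# Photon energy from Planck's relation E = h*f.
h = 6.62607015e-34   # Planck's constant, J*s (exact in the 2019 SI)
f_green = 5.45e14    # frequency of green light, Hz (illustrative value)
E = h * f_green      # energy of one green photon, in joules
print(f"{E:.3e} J")  # 3.611e-19 J, i.e. roughly 2.25 electron-volts
```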
Heisenberg's uncertainty principle is an
epistemological one as it rigidly specifies how the accuracy in our
knowledge of one physical quantity affects the accuracy of our
determination of the remaining paired quantity. Heisenberg's principle
can be obtained by generalizing Planck's relation in terms of the matrix
algebraic expression:
p·q − q·p = (h/2πi)·I.
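Born's matrix relation is equivalent to q·p − p·q = iħ·I (since ħ = h/2π), and it can be checked numerically with finite truncations of the harmonic-oscillator ladder operators; a sketch in NumPy (illustrative, not from the original; natural units with ħ = 1, and all names are my own):

```python
import numpy as np

HBAR = 1.0  # natural units; any positive value illustrates the identity

def ladder(n):
    """Annihilation operator a, truncated to an n-dimensional number basis."""
    return np.diag(np.sqrt(np.arange(1, n)), k=1)

n = 8
a = ladder(n)
adag = a.conj().T
# Position and momentum in oscillator units (m = omega = 1):
q = np.sqrt(HBAR / 2) * (a + adag)
p = 1j * np.sqrt(HBAR / 2) * (adag - a)

comm = q @ p - p @ q  # q*p - p*q, which should equal i*hbar*I
# The identity holds exactly except in the last diagonal entry,
# an artifact of truncating the infinite matrices:
print(np.allclose(comm[:-1, :-1], 1j * HBAR * np.eye(n)[:-1, :-1]))  # True
```

The corner defect is instructive: no finite pair of matrices can satisfy the relation exactly, since the trace of a commutator vanishes while the trace of iħ·I does not.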
If consciousness is, itself, required to collapse the wave-function, then
consciousness must originate in the interaction of uncollapsed
wavefunctions. This suggests that the wavefunctions interacting with
one another within consciousness are of the “already collapsed” variety,
that is the perceptual representations of wavefunctions all interact based
upon a subluminal propagation of mutual influence. Wavefunctions which have not yet collapsed are capable of interacting
with one another at a distance instantaneously and this sort of
phenomenon is referred to by Physicists as the Einstein-Podolsky-Rosen,
or E.P.R. effect. There are two basic schools of the Quantum Theory.
Where they differ is in their interpretation of the status of Heisenberg's
uncertainty principle. One school maintains that this uncertainty is due
merely to the practical limitations of observation, that is; the uncertainty
is only epistemological in nature. The other school maintains that this
uncertainty is a theoretical limitation, that is; the uncertainty is
ontological in character. The dispute between these two schools is
solved easily enough, however. In the 1990s, when computers have
reached a relatively high level of sophistication, it is not uncommon to
encounter the opinion, among otherwise enlightened (educated)

individuals, that computers are capable of, or exhibit, a kind of
elementary consciousness. These are the same people who would affirm
without hesitation that earlier, more primitive computers, such as
Babbage’s Analytical Engine (designed in the 1840s) or
perhaps even the ENIAC (circa 1945), which calculated artillery
trajectories, are themselves incapable of anything approaching what might be called
conscious thought. This reveals an intuition that somehow sheer
complexity is the essential factor, which separates the mechanical brute
or automaton from the sophisticated high speed digital computer of
today. August 2011 Note: “sheer complexity” of deterministic computing is
only important because there exists some intrinsic threshold within those
nondeterministic fields in which the classical digital state machine
is/becomes embedded, which provides context for an otherwise
meaningless, context-free affair (just as in the case of Babbage’s
Analytical Engine).
It is likely that this threshold lies at the
boundary between the quantum and classical worlds, which exhibit
wavelike vs. particle-like behavior, respectively. There is no context-free threshold of computing complexity at which any qualitative change
in the nature of computing is to be rationally expected. That is just
magical thinking.
Even among those who flatly deny that modern high speed computers
possess anything like real intelligence or consciousness, there is the
implicit assumption of sheer complexity as the necessary magical
ingredient: a revolutionary jump in switching speed, memory capacity,
architecture design - all of which are essentially functions of increased
density of miniaturized components - would undoubtedly bring about the
necessary gain in complexity, i.e., that which approaches the complexity
of the human brain itself, so that machines would acquire a kind of
consciousness. Marvin Minsky - the leading figure within the so-called
hard-AI community - once designated human brains as nothing more
than "meat machines." But if there is this almost ineffable intuition
about a vital connection between complexity and consciousness then a
perhaps even greater or deeper one is that between the notions of
consciousness and freedom. So-called hard-AI theorists such as Minsky,

Dennett, and the Churchlands use analytical arguments which miss the
point in objecting to Searle because they do not address their criticisms
to the principal thesis that he advances, namely, that the causal powers
of matter play an essential role in determining the phenomenon of
consciousness and that such causal considerations go beyond those of
formal symbol manipulation.
Where these two intuitions meet and
reinforce one another is when one considers a digital computing device,
say, where the packing densities of the microelectronic components
approach that of naturally occurring crystals. It is at precisely this point
where we expect to see the quantum mechanical effects described by the
famous Heisenberg Uncertainty Principle. Here despite all attempts at
insulation and grounding, filtering, or rectification, it becomes
nevertheless impossible to force the device to operate according to some
pre-established blueprint of operation, i.e., program, as the fluctuating
voltages and electric currents inherently reside within the device as a
consequence of the interaction of the device's wires and circuit elements
with the vacuum electromagnetic field which in its own turn must
fluctuate randomly. This random and irreducible fluctuation in the
vacuum's electromagnetic field is due to an extension of the Uncertainty
principle which states that the electric and magnetic field strengths may
not both be simultaneously specified at any point in space - in much the
same fashion in which the position and momentum of an individual
particle may not be simultaneously specified. A sharp determination of
the electric field at a point will cause a large spread of uncertainty in
measurements of the magnetic field at this point and vice versa. The
fluctuating (quantum) voltages and currents to which the circuitry of
any really advanced computing device would be subject would be
utterly useless and manifest themselves as noise signals disruptive to the
normal operation of the device - unless the device could manage to
interact with these fluctuating fields.
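The disruptive role of such fluctuating voltages can be illustrated with a toy Monte Carlo sketch (entirely illustrative; classical Gaussian noise stands in for the quantum fluctuations, and all names and values are my own):

```python
import numpy as np

rng = np.random.default_rng(0)

def bit_error_rate(noise_sigma, n_bits=200_000, v_low=0.0, v_high=1.0):
    """Fraction of bits misread when Gaussian noise rides on two logic levels."""
    bits = rng.integers(0, 2, n_bits)
    volts = np.where(bits == 1, v_high, v_low) + rng.normal(0.0, noise_sigma, n_bits)
    read = (volts > (v_low + v_high) / 2).astype(int)  # threshold at the midpoint
    return float(np.mean(read != bits))

# Noise far below the logic swing is harmless; noise comparable to it is fatal:
print(bit_error_rate(0.05))  # essentially 0
print(bit_error_rate(0.5))   # roughly 0.16 of all bits misread
```

The point of the sketch is the threshold behavior: once the fluctuation amplitude approaches the logic swing, no amount of filtering at the readout recovers the intended program of operation.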
February 1998

Clearly the spatiotemporal scale at which these quantum
fluctuations take place represents an insuperable physical barrier to the
continued operation of Moore's Law which states that, microprocessor
computing power increases by doubling every eighteen months to two

years, given, at least, no significant departure from conventional
microprocessor architectural design as it has manifested itself over the
previous four or five generations of microprocessor development. October
Although in some sense spatial scale is only meaningful within the
context of locality, with respect to Penrose’s “one graviton
limit” energy scale is indeed relevant.
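The doubling rule cited above can be sketched as a simple compound-growth calculation (a toy model; the starting count and two-year doubling period are my own illustrative inputs):

```python
def transistor_count(years_elapsed, start_count=1e6, doubling_years=2.0):
    """Project device count under Moore's-law doubling (a toy model)."""
    return start_count * 2 ** (years_elapsed / doubling_years)

# Ten years at a two-year doubling period multiplies the count 32-fold:
print(transistor_count(10) / transistor_count(0))  # 32.0
```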
The breakthrough in the evolution of microprocessor technology which
will make possible the continuation of Moore's Law, at the same time as
it transcends it, will come in the form of a significant paradigm shift in
the relationship between computer architecture designers and
programmers and knowledge. This paradigm shift will manifest itself in
two distinct but closely related ways. The movement will take place
from a representational to a participatory basis for the communication of
information and knowledge. Instead of knowledge undergoing many
transformations from information to data to information and back again
at each stage in its passing from one person to another, knowledge will
be communicated not through any physical transmission of data, but
through a nonrepresentational and participatory sharing of knowledge
between minds.
But this is still not enough. Our intuition that the phenomenon of
consciousness is a radically deep one pushes us to suppose that this
device - however it is supposed to function - merely sets the stage for
this chaotically fluctuating vacuum field to interact with itself - the
device becomes just an intermediary, a facilitator, of a process which
must ultimately fall under the control of this energy itself. We see that
our intuition about the importance of smallness and complexity captured
in the notion of sensitivity (to vacuum fluctuations) and our intuition
concerning freedom (of vacuum energy to self-organize) appear to
intersect. There is an exact parallel between the relationship of energy
and entropy to each other and the relationship between signal bandwidth
and signal information capacity. We might liken the comparison
between a conscious (intelligent) computer and automaton (unintelligent
computer) in the following manner. A dumb computer is searching a

maze for its exit... There are a number of respects in which the paradigm
shift from a bottom-up to a top-down metaphysics may be realized:
1) Physical processes are not "pushed up from below" by blind efficient
causation, but are "pulled up from above" by teleological causation.
This may be seen through Margenau's observation that all differential
equations representing processes of blind causation may be recast as any
one of an infinite family of integral equations (depending on initial
conditions) where some physical quantity such as time, energy, distance,
etc., is minimized or maximized. Teleology, however, in its own way,
presupposes the existence of a determinate framework just as much as
does classical physics; in fact, events are not merely determined within
teleological causation, but are overdetermined.
2) The vacuum is not empty as it was conceived to be in the 18th century
classical physics with solid particles caroming through it, but the
vacuum is, rather, a plenum, a fullness of energy while so-called
particles are mere excitations of this vacuum medium. The energy
density of the vacuum is far greater than the energy density of the
particles "occupying" it.
3) Chaos may be reinterpreted as a thermal reservoir of virtually infinite
information content as opposed to a condition of no information.
4) There is an empirical-theoretical spectrum with the unified theory of
physics at the theoretical end of this spectrum and pure consciousness at
the opposite empirical end. Therefore, it is just as meaningless to ask
what the fundamental "constituents" of matter, posited by unified
physical theory, are in themselves as it is to inquire into the process by
and through which the phenomenon of consciousness originates.
Both questions are posed at the wrong extreme of the empirical-theoretical spectrum, so that any attempt to answer them appears
incoherent or self-contradictory. If the empirical-theoretical really
constitutes a spectrum which exhaustively "covers" reality, then we

expect that the bootstrap explanations applied at each end of this
spectrum must somehow merge or interpenetrate.
5) The creation of material particles is not the direct conversion of
energy into matter, rather the energy required is that needed to dissociate
them from the network of interactions in which they pre-exist. Creation
is not ex nihilo, but is an abstraction of a low level of structure from a
preexisting dynamic whole of virtually infinite (maximal) complexity.
Each act of abstraction, however, is founded on negations performed
within a predefined whole, which is itself a form of abstraction of a
higher order than mere negation; negation is an operation that
presupposes the ability to partition a system into disjoint and
complementary halves. This setting up of such a system, decomposable
into complementary partitions, is the higher-order abstraction which
cannot be understood as being based in mere negation within a larger
system. The transformation of elements within a particular system of
representation through expansion of the context grounding the
representational elements is a kind of transformation which cannot be
explained in causal or merely rational terms.
6) Consciousness is channeled, structured, limited, abstracted by the
functioning of the human brain, it is not produced through its action.
The brain acts, per the Bergson–James–Lange theory, as a kind of
reducing valve.
7) Gravitational time dilation, rather than being an effect of a
gravitational field, may be an essential part of the physical vacuum
mechanism by which matter produces a gravitational field.
8) Rather than conservation of four-momentum being deduced from the
theory of special relativity, conservation of four-momentum is the very
foundation upon which the edifice of special relativity is built. What is
referred to as locality is the sum of physical processes governed via the
strong coupling mediated through the exchange of energy between
particles possessing an energy greater than the energy uncertainty of the

quantum mechanical system within which these energy exchanges are
occurring. What is called nonlocality is the sum of physical interactions
governed via the weak coupling mediated through the exchange of
energy between particles possessing an energy smaller than the energy
uncertainty of the quantum mechanical system within which these
"weak" energy exchanges are occurring.
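The energy-uncertainty criterion in point 8 trades on the standard order-of-magnitude rule Δt ≈ ħ/ΔE for how long an energy fluctuation may persist; a sketch (standard constants; the function name and the pair example are my own):

```python
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
C    = 2.99792458e8      # speed of light, m/s
M_E  = 9.1093837015e-31  # electron mass, kg

def fluctuation_lifetime(delta_e_joules):
    """Order-of-magnitude persistence time allowed by dt ~ hbar / dE."""
    return HBAR / delta_e_joules

# A virtual electron-positron pair "borrows" dE = 2*m_e*c^2:
print(fluctuation_lifetime(2 * M_E * C**2))  # roughly 6e-22 seconds
```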
The presence of real photons is evidence that at some point in spacetime
a fermion made an energy transition which was triggered either by
bombarding real photons or the action of the vacuum electromagnetic
field, i.e., spontaneous emission. Of course, both processes must be
invoked again, repeatedly, to explain the existence of the bombarding
photons. This infinite regress converges in the sense that at progressively
earlier moments we find the vacuum electromagnetic field in an ever
more compressed state and as the process of spontaneous emission tends
to outstrip that of stimulated emission at high frequencies the
explanation for the decay of excited fermionic states is found to lie
exclusively with the action of the vacuum electromagnetic field.
June 1998

The reason for the momentum fluctuation spectrum of an electron
contained within a quantum well being identical to the spectrum of
possible discrete energy transitions between possible quantum well
energy levels may be on account of the following simple observation.
Such transitions downward by a real electron are stimulated to occur
either by real or virtual photons while such transitions upward by a
virtual electron are stimulated to occur likewise either by a real or virtual
photon, and the spectrum of such virtual photons represents that of the
vacuum electromagnetic waves with which the bound electron can
resonate and with which it can exchange energy. Since the photon
propagates through vacuum part of the time as an electron/positron pair,
and in a gravitational field the density of virtual fermion/antifermion
pairs is somewhat decreased, it follows that the velocity of the photon
through this modified vacuum will be correspondingly decreased. It
follows from this that the energy density of the vacuum must vary

proportionally to the cube of the local value of the speed of light within
the gravitational-field-laden, and hence modified, vacuum. This may be
similarly be interpreted as the energy density of the vacuum being
proportional to the inverse cube of the frequency of vacuum
electromagnetic waves. This is just the relationship of vacuum energy
density to virtual photon frequency which renders the quantum vacuum
perfectly Lorentz-invariant.
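The quantum-well level spectrum invoked at the start of this entry can be made concrete with the textbook infinite square well, whose levels are E_n = n²π²ħ²/(2mL²); a sketch (standard formula; the 1 nm width and all names are my own illustrative choices):

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E  = 9.1093837015e-31  # electron mass, kg

def well_level(n, width):
    """Energy of level n (n = 1, 2, ...) for an electron in an infinite square well."""
    return (n * math.pi * HBAR) ** 2 / (2 * M_E * width ** 2)

L = 1e-9  # a 1 nm well, an arbitrary illustrative width
levels = [well_level(n, L) for n in (1, 2, 3)]
# The spectrum of possible discrete transitions is the set of level differences:
transitions = [upper - lower for i, lower in enumerate(levels) for upper in levels[i + 1:]]
print(levels[0])  # about 6.0e-20 J (roughly 0.38 eV)
```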
In Nature, Oct. 19, p. 574, the time required for quantum
mechanical tunneling of an electron across a Josephson junction was
measured. This result means that there is some meaning which can be
attached to the velocity of the particle during its act of quantum tunneling.
Sudden, nonadiabatic compression of the Casimir plates
should result in the spontaneous emission of photons by the vacuum.
Similarly, nonadiabatic expansion of tightly compressed plates should
result in the spontaneous absorption of some real photons, which happen
to be within the geometry of the plates at this time.
NOTE: This statement may not be true because the Einstein coefficient
of spontaneous absorption is identically zero; the coefficients of
spontaneous emission, and hence, of stimulated absorption and emission,
may be changed through altering the vacuum electromagnetic energy
density utilizing Casimir plates, resonant cavities, etc.
The Universe might be described by a wavefunction representing
its tunneling through a hyperspherical barrier, in four real spatial
dimensions. The quantum tunneling of the Universe through this
hyperspherical barrier may be alternately described as the collapse of a
false vacuum state and the subsequent creation of free particle
wavefunctions propagating along an imaginary axis of a four
dimensional hypersphere of 3 real + 1 imaginary spatial dimensions.
The probability density of this wavefunction adjusts as time passes
reflecting the increasing uncertainty of its would-be position eigenstate.
Any vector at a point where its scalar product, with the wavenumbers of
the eigenfunction expansion (of the universal wavefunction), is zero is

assigned an imaginary coefficient reflecting its being rotated 90° with
respect to the wavenumber set of the eigenfunction expansion. There
was a recently announced discovery that the linear Hubble relationship
between galactic distances and recession rates does not strictly hold, but
that the recession velocities are distributed discretely with increasing
distance, each velocity being roughly an integral multiple of 72 km/s.
These observations suggest two distinct but related possibilities.
One, that the initial collapse of the quantum mechanical vacuum state
occurred in discrete stages in much the same way that an excited
electron decays from a highly excited state. Two, that the Universe
tunneled, in quantum mechanical fashion, out of a hyperspherical potential
barrier where, as in the usual case, the transmission coefficient varied
sinusoidally with the wavenumber. The vacuum electromagnetic field
is said to be incompressible, but this is not strictly true. The vacuum
electromagnetic field actually appears to decrease in energy density
when confined within a resonant cavity of decreasing volume. This
seems to suggest that the energy density of the vacuum electromagnetic
field is in a sense negative. We may think of the effect of shrinking the
resonant cavity upon the photons present within this cavity in two
distinct ways:
1) The photons' wavelengths are simply compressed by the cavity
shrinkage factor or
2) The zero-point of the vacuum electromagnetic field is altered by a
certain fraction so that the energy of photons within the cavity "appear"
to be greater (relative to the new zero-point) by this same fraction. Of
course, the first alternative appears more intuitively evident but
embodies the simplistic assumption that the photons within the cavity
possess some permanent and abiding existence rather than being a
packet of energy which is continually being emitted (created) and
absorbed (annihilated) by the fluctuating electromagnetic vacuum field.
If a photon is in a momentum eigenstate, then the position of this photon
along its translation axis is totally uncertain. We say therefore that in

the position representation of the photon's wavefunction the
probability density along the particular photon's translation
axis is uniform, and hence vanishingly small at every point. Consequently, a photon or photon beam which is in
a momentum eigenstate - and hence an energy eigenstate also - does not
alter the probability versus frequency distribution function (along its
translation axis) for virtual photons of like eigenenergy. This may be
seen to follow from the fact that an increased likelihood of finding a
photon of a particular eigenenergy within a certain spatial interval means
that the probability vs. frequency distribution function in this region
experiences a peak at the frequency corresponding to this eigenenergy.
The rates of stimulated emission and absorption of electromagnetic
radiation at a particular frequency are proportional to the density of the
ambient radiation at this frequency. The constants of proportionality are
the Einstein coefficients of emission and absorption, respectively. It was
stated earlier as a general principle that all physical processes were
mediated through the exchange of energy between matter and the
vacuum, the reservoir of energy uncertainty. This principle may be
made more specific by invoking the Einstein relationships for
electromagnetic radiation emission and absorption as the mechanism for
all energy emission - absorption, that is, for all forms of energy
exchange, so that the rates at which all physical processes take place
becomes proportional to the spectral energy density of the fluctuating
boson fields of the vacuum - in accordance with our earlier intuitions.
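The earlier claim that spontaneous emission outstrips stimulated emission at high frequencies can be made quantitative with the standard Einstein-coefficient result: against a thermal field at temperature T, the ratio of spontaneous to stimulated rates is exp(hν/kT) − 1. A sketch (textbook physics; the function name and frequencies are my own):

```python
import math

H = 6.62607015e-34   # Planck constant, J*s
K = 1.380649e-23     # Boltzmann constant, J/K

def spontaneous_over_stimulated(freq_hz, temp_k):
    """Ratio A/(B*rho) = exp(h*f/(k*T)) - 1 against a thermal field at T."""
    return math.expm1(H * freq_hz / (K * temp_k))

T = 300.0  # room temperature
# Optical frequencies: spontaneous emission dominates overwhelmingly.
# Microwave frequencies: stimulated processes dominate instead.
print(spontaneous_over_stimulated(5e14, T))  # astronomically large
print(spontaneous_over_stimulated(1e9, T))   # much less than 1
```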
This assignment of the Einstein mechanism (for want of a more
convenient term) for physical processes in general depends upon the
implicit assumption that in the absence of stimulated emission (and
absorption) the coefficients of spontaneous emission and absorption are
identical - just as the coefficients of stimulated emission and
absorption are identical in the absence of spontaneous emission. But the
problem here is that there really is no such thing as spontaneous
absorption - as noted before this condition would violate the principle of
energy conservation. Spontaneous emission appears to only occur to
electrons which have already been elevated to excited energy levels
through stimulated absorption - in other words the energy fluctuations of

the vacuum serve merely to trigger the decay of excited states produced
through ambient electromagnetic radiation. However, this would not be
the case if spontaneous absorption applied only to energy in the form of
virtual particles. The lifetime of a virtual particle is determined by the
uncertainty principle, and therefore the absorption of these particles out
of the vacuum does not violate conservation of energy. It must be observed
here that the assignment of the value 0 to the coefficient of spontaneous
absorption is only required by the assumption that the energy density of
the vacuum is itself zero. A number of experiments on vacuum cavity
resonance suggest that spontaneous emission rates are suppressed by
imposing boundary conditions upon the electromagnetic vacuum. It is
our deepest suspicion that the fraction by which the emission rate is
suppressed is equal to the fraction by which the density of the
electromagnetic vacuum is reduced through the imposed boundary
conditions. In the chapter on nonclassical light in the work, Light and
Quantum Fluctuations, a correspondence is drawn between the effect of
a dielectric medium within a certain region of the vacuum and the
alternate introduction of specific boundary conditions upon this vacuum,
say, utilizing conducting plates, resonant cavities, etc. In this chapter it
was concluded that the fractional increase in the index of refraction is
directly proportional to the fractional increase in the electromagnetic
energy density of the vacuum with the wavenumber being also altered
by this fraction, but with the frequency being unaltered by the dielectric
medium, so that a fractionally decreased local value of the speed of light results.
How do we represent a trajectory despite the fact that the motion of
the particle must be continually recast in terms of a time varying set of
basis functions? This time variation of the basis functions must contain
an element of randomness, or unpredictability since otherwise a unique
unchanging basis could be found with which to represent the motion.
Distinct trajectories can only be co-represented within the same
presentational space if each and all are differing projections of a single
evolving trajectory. Each eigenfunction is related to its noncommuting
spectrum of superposed complementary eigenfunctions in the sense that

figure is related to ground. The complementary eigenfunction spectrum
is a data set; the selection of one of these eigenfunctions within the
observational context constitutes the engendering of a bit of information.
The component eigenfunctions become mutually coupled provided that
their wavefunction resists alteration through external influences. The
eigenfunctions are coupled to one another if each contains at least a tiny
projection along all of the other eigenfunctions, which together with it
make up their wavefunction. This is only possible if this set of
eigenfunctions contributes to the defining of the Hilbert space geometry
within which they find expression. This requires that the time evolution
of the wavefunction be nondeterministic, which is to say, nonunitary.
The information content of a given structure is determined by the
degree to which it approximates its intentional object. On this view,
things are defined in terms of a holographic contextual matrix or system.
Meaning is context-dependent. Because of this, there is a world of
difference between what is called data and what is called information.
Information may be thought of as data provided with context adequate to
determine its meaning; information is processed data, while data may be
conversely thought of as uninterpreted signals. Data are overdetermined
by information; information is underdetermined by data. Data may be
physically transmitted through space, but this is not so for information.
Data suggest myriad possible informational structures while information
narrows one's focus upon a tiny subset of an unlimited variety of
different possible sets of data. Recalling the beads on a string analogy,
rational numbers may be represented by a finite number of terms of a
convergent infinite series, itself, representing an irrational number. This
finite set of terms, gotten by truncating a convergent infinite series, are
amenable to arithmetic manipulation. This is because, metaphorically
speaking, we are able to take the beads off their finite string and re-thread them in arbitrary order without changing the topological
relationships of the beads. Not so for an infinite number of beads on an
infinite string. A finite number of beads on an infinite string may
correspond to matrices. The rearrangement of the order of the beads is
here a reversible process or procedure and so may not be thought to

possess intrinsic information.
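The beads-on-a-string point (finitely many terms may be re-threaded freely, infinitely many may not) can be illustrated with the alternating harmonic series: it sums to ln 2, an irrational number, yet by Riemann's rearrangement theorem an infinite reordering (one positive term, then two negatives) converges to ln 2 / 2 instead. A sketch (a standard result; the code is my own illustration):

```python
import math

def term(k):
    """k-th term of the alternating harmonic series, k = 1, 2, 3, ..."""
    return (-1) ** (k + 1) / k

N = 100_000
original = sum(term(k) for k in range(1, N + 1))

# Re-threading finitely many beads leaves the sum unchanged:
shuffled = sum(term(k) for k in reversed(range(1, N + 1)))

# An infinite rearrangement (one positive term, then two negative ones)
# converges to half the original limit:
rearranged, pos, neg = 0.0, 1, 2
for _ in range(N):
    rearranged += 1 / pos - 1 / neg - 1 / (neg + 2)
    pos += 2
    neg += 4

print(abs(original - shuffled))     # zero up to float rounding
print(original, math.log(2))        # both near 0.69315
print(rearranged, math.log(2) / 2)  # both near 0.34657
```

Finite permutations are reversible and information-free, exactly as the passage argues, while the infinite rearrangement genuinely alters the object represented.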
I am fascinated by systems with a group theoretic structure. More
generally, I am intrigued by specialized language systems. Why do such
systems appear to be "closed" and yet permit the appearance within
themselves of genuinely novel, or emergent, structures? Emergence is
always explainable in terms of the interpretation of such structures
within the context of larger, in fact, "open" systems. Formal symbol
manipulating systems, such as computing devices, do not admit the
existence of what are called semantic structures. Information is reduced,
or de-interpreted, if you will, by one programmer, to produce a coherent
set of inputs to the computing device, and the outputs engendered by the
computational process are then re-interpreted by another (or the same)
programmer. The computational process itself, in isolation from the
interpretation process, which bounds it, is not "about anything." Structure only has
meaning when it is de-constituted back into the unbounded self-referential flux from which it originally arose through the process of
abstraction. In fact, the general procedure of composing a computer
program is itself an example par excellence of reduction or abstraction.
If the activity of the system as a whole, or, rather, as a totality, is without
meaning because of its not being embedded in some larger context, then
neither are any sub processes occurring within it meaningful. By
extension, the human brain must be embedded in a larger mediating
context, which is itself completely open-ended in its possibilities otherwise those processes occurring within any given human brain
would not be, as alluded to earlier, "about anything." September 2011 Such
brain processes could not then possess intentionality, which is yet
another way of seeing the incompatibility of free will and determinism,
cf. Empedocles’ remarks concerning the incompatibility of “atoms and
determinism” and logical reasoning (ratiocination), cf. God and the

Besides excluding temporal evolution by being deterministic, closed
"dynamic" systems lack temporality because, in addition, being closed
bound-energy systems, their energy may only change in discrete
amounts. Any finite set of data within an infinite informational system
has an unlimited number of theoretical structures, which suffice to
explain the coherence of these data. An infinite informational system is
able to contain within itself a complete symbolic representation of its
own structure.
The phrase "information processing" is a confusing and
ambiguous one as information is probably itself a stable pattern of
interlocking activity within the flux of a more substantive and
comprehensive data processing action.
On this view, data and
information are not synonymous commodities - data possessing merely a
form in the restricted sense of a spatiotemporal frequency, in itself
possessing no content or intentional object, which is to say meaning.
Whereas information is the result of the interpretation of data, not in the
sense of divining their intended message or meaning (data possess none
such in and of themselves), but in the less obvious sense of reconciling
the new data with a long interpretive history stored in memory based on
data received previously.
Given an infinite number of possible 'events,' the probability of any
one occurring is infinitesimal, still less could the 'same' events occur
repeatedly and in predictable order, unless the events were causally
overdetermined by a sequence of preceding events which themselves
constitute a backwards diverging sequence of necessary causes. The

origin of these "infinitely improbable events" must be a nonlocally
connected infinite set of events (a continuum) where the singular event
is an intentional object defined in terms of the self-referential
topological relationship/interaction of infinite subsets within the
continuum.
Another paradoxical usage popular in the literature of
physiological psychology, artificial intelligence, philosophy of mind,
etc., is the phrase "transmitting information." One must realize that only
energy may be transmitted, information is always constructed through
the interpretation of data received in situ; information does not
physically move from place to place. On this view, information is not a
conserved quantity, at least in the sense of some physical continuity
equation governing its 'flow,' and so if energy and information are in
some physical context interdefinable, it should only be under a set of
circumstances where the principle of conservation of energy does not
strictly hold. Transmission presupposes the notion of the conveyance of
some conserved quantity within some closed space or continuum. We
now know that "closed continuum" is a contradiction in terms. The only
such situation known to physics is the one in which processes occur
within a frequency spectrum with a lowest frequency larger than the
reciprocal of the quantum mechanical time uncertainty of the physical
system within which the relevant processes are occurring. Another
reason to believe that a physical continuity equation does not apply to
information or its flows is that information appears to reside in between
the discrete energy levels of crystalline quantum systems, and so
information is not here really spatially localizable, in principle.
Reducing the energy uncertainty of a neural network will squelch some
of the nonlocally virtual interactions occurring within the energy bands
of the network because the bands will be contracted resulting in a
contraction of the bandwidth of vacuum electromagnetic field
frequencies available to the network, reducing the data processing
capacity of the network.
Information is not stored in the brain’s neural network at any
particular physical location within the brain per se; a more

approximately correct description is to say that learned information is
stored at various discrete energy levels of this network, conceived as a
quantum mechanical system. When a “piece of information” or a
memory is recalled, the neural network will attempt to connect to a new
spectrum of the nonlocally connected quantum vacuum fluctuation field.
In essence, the brain becomes embedded in a new vacuum or ground
state which causes a restructuring or reconstituting of its “stack” of
energy levels at which the data was stored. 06/98 After the restructuring
of this stack, a new array of discrete energy levels prevails along with a
new spectrum of possible virtual energy transitions within the new stack.
Now the brain has become resonantly tuned to a new spectrum of
(nonlocally-connected) vacuum energy fluctuations. This is how data or
“passive information” gets re-presented as “active information.” The
brain may be thought of analogously to a hardware interface between the
individual soul and the impersonal and open-ended information
reservoir. As indicated already, data encoded by the vacuum in the
discrete energy structure of the brain is overdetermined. This same data
as decoded from “memory traces” within the brain’s energy structure is
underdetermined. What permits a quantity of data within the brain’s
neural network to persist as this selfsame quantity is established neither
by any physical continuity which the brain may possess from one
moment to the next, nor can the persistence of this data be placed on any
formally descriptive footing. The informational continuity of the
“memory traces”, i.e., data stored within the brain’s neural network is
maintained outside the brain in the sense of this continuity being of a
nonlocal nature: not contained within the brain’s local spacetime. So on
this view, the brain may be thought of as a kind of “terminal” interfacing
with the “network” of the nonlocally connected, fluctuating quantum
vacuum energy field. The physical traces within the brain associated
with memory consist merely of pointers or, borrowing from the more
current Internet metaphor, these traces are to be thought of (along with
Laszlo) as being akin to “links to World Wide Web sites” so that
memories are not stored in the brain but merely memory addresses.
Continuing the Internet analogy, these physical memory traces within
the brain may be understood after the fashion of web browser

“bookmarks.” Particular eigenvalues of energy associated with the
discrete quantized energy levels of the brain’s neural network cannot
remain proper memory addresses for information dynamically stored
within the “quantum vacuum network” if the underlying eigenfunctions
are not adequately “tracked” through adequate self-interaction of the
vacuum with itself through the brain as quantum neural network
hardware interface. This is due to the inevitable presence of energy
degeneracy within the brain. The brain may be functioning as a running
convolution integrator of multiple unbounded vacuum spectra. The
brain in this way establishes resonant and therefore maximal
connectedness between different vacuum topologies. These vacuum
topologies are not contained within the local spacetime of the brain
because any particular metric must presuppose an already given
spacetime topology.
There seem to be two conflicting views of the vacuum
electromagnetic field in its important role in opening up the otherwise
mechanically determined processes of the human neural network.
Firstly, the v.e.f. provides context for real particle/field processes
occurring within the brain, and secondly, it provides the field of possible
informational structures which are filtered and selected by the brain's
neural network (if only passively) to give meaning to its interior
processes. The reason the question, "Why does time appear to pass at
the particular rate that it does?" does not really make sense is that
the interpretation of sensory data is radically dependent upon the timing
of the events represented by these data with respect to the mind which
interprets them and gives them contextual significance, and so there is
no such thing as identical sequences of events occurring at different
rates. It is more true to say that formal systems are created with the
intent of demonstrating (formally) certain theorems which some person
already has in mind, rather than, that theorems are to be deduced from
within already existing formal deductive systems of inference. Energy
and information are not interdefinable within a closed dynamical system.
What are called "mind" and "matter" are not fundamental categories in
terms of which fundamental distinctions may be validly thought to

subsist. Both terms represent somewhat complementary ways of
abstracting from the fundamental substantive process of the absolute
ground of being. Reality as it is in and of itself is neither and both of
these "things." A unitary and unique "pure consciousness" offers itself
up as the best candidate for ultimate Reality or the ground of existence:
it is the most harmonious integration of all possible abstract forms while
being at the same time the most concrete, logically a priori, entity.
April 1997

The distinction between that which has form and that which is
formless is a distinction which cuts across the distinction between mind
and matter since one may speak of both formless mind and formless
matter.
December 1996

But there may, indeed, be no most harmonious integration as
such, but an unlimited number of progressively higher integrations. To
suppose that there is some unique highest integration would be to
presume that there can be some objective rule relating lower-level
manifestations of the ground into a convergence.
Recursive structures
may only come into existence by being distilled from other recursive
structures more complex than themselves.
Particle creation at the event horizon of a black hole gives rise to a
precisely thermal spectrum. This suggests that the vacuum itself is in
thermal equilibrium with itself so that the vacuum must be continually
exchanging energy with itself. The time rate of change of all
physical quantities depends on the existence of energy uncertainty, since
dq/dt = (i/ℏ)[H, q] + ∂q/∂t, where ∂q/∂t denotes the explicit time
dependence of the observable q. On this
view, quantum mechanical systems possess energy uncertainty because
they are continually perturbed by intrinsic vacuum energy fluctuations.
In this way, all mass-energy systems are in a process of constant energy
exchange with the quantum mechanical vacuum. Since all macroscopic
transfers and exchanges of energy between two points in spacetime are
mediated via the submicroscopic energy exchanges occurring within the
vacuum, it follows that conservation of energy macroscopically is
dependent upon conservation of energy exchanges within the vacuum. It

is not possible to distinguish different time rates of change within a
closed dynamical system. This is because such a closed system
possesses only a finite number of discrete energy levels, and when the
total system is in a particular energy eigenstate, its energy uncertainty is
0 so that there are no vacuum fluctuations available with which to
mediate changes in physical observables of the system. We may define
the distance separating two events as a function of the number of
vacuum momentum fluctuations existing between the two said events.
Similarly, we may define the time interval between two such events as a
function of the number of vacuum energy fluctuations existing between
the two said events. Of course, the partitioning of the relativistic
momentum-energy tensor into pure momentum versus pure energy
components is dependent upon the particular Lorentz reference frame
within which one performs the momentum and energy measurements;
the converse of this is also true. Since the energy levels at which
information is stored in a neural network are defined in terms of the
lowest stable energy of the neural network as a whole, virtual energy
transitions between these energy levels presuppose a coupling between
the wavefunctions describing the quantum mechanical states of all of the
individual neurons of the network in the sense of their being nonlocally
connected.
It is the spontaneous coherence in which the neural network is
embedded which provides the ultimate context within which the
neurological events are to be interpreted. This coherent field is that of
the nonlocally connected vacuum electromagnetic fluctuation field. The
many worlds interpretation of the quantum measurement problem may
be understood as a reversal in causal relationship between the
uncollapsed wavefunction representing the mind of the observer and the
uncollapsed wavefunction representing the potentialities of the quantum
mechanical system being observed by this mind in the following
manner: when the observer notes the collapse of the wavefunction with
respect to an observable he is attempting to measure, what is actually
occurring is the collapse of the wavefunction describing the observer's
mind so that it (the observer's mind) now abstracts from the Weltall one

particular eigenvalue of the object wavefunction, but without inducing a
collapse of the object wavefunction itself. Without a God's eye view of
Reality in which to ground these complementary possibilities, there is
no legitimate distinction that can be made between them. One might
ask what is the fundamental difference between these two interpretations
if there is not some third realm, independent of both the observer's and
object wavefunctions in terms of which one interpretation might be
favored over the other as being ontologically prior. This third realm
belongs neither to that of causality (the mutual interaction of collapsed
wavefunctions), nor to that of contingency (the interaction of collapsed
with uncollapsed wavefunctions, and vice versa), but to that realm
constituted solely by the mutual interaction of all uncollapsed
wavefunctions. This realm we may refer to as the composite
contingency - necessity manifold or continuum.
There is an exactly parallel assimilation of the category of space-time with our category of necessity-contingency. In this way we may
realize that the concepts of locality and nonlocality constitute a
distinction that cuts across that constituted by the polar concepts chance
and necessity, time and space. There is chaos, Heraclitus' ever-living
fire, the dynamic substance out of which all forms are derived. Then
there are the forms, themselves, both actual and potential. But there is a
third factor, if you will, and it is whatever power extracts these forms
from the flux. This power possesses the freedom of the flux, but also the
order of all those forms which it is capable of extracting, or, rather,
abstracting from this flux, and so is not contained within either category,
that of order and that of chaos. July 2011 Epicurus writes to Herodotus
that “. . . we must admit that nothing can come of that which does not
exist; for were the fact otherwise, then everything would be produced
from everything, and there would be no need of any seed.” Now the great
relevance of Epicurus’ notion of the essential importance of a “seed”
to us is that there is indeed a “third power”, that is, a “Dritte
Macht”, apart from Monod’s “chance and necessity”, and that this
power is information. Information is abstract in that there are endless
open-ended means of encoding information as data. We say this with

the express understanding that information is never exhaustively
determined by data, but context is always required in addition to the
awareness and intention by which a set of abstract relations was enacted
when the information was originally encoded. (By the way, there must
be a converse process to abstraction, i.e., Whitehead’s concretion.)
The material medium in which information is encoded as data can never
be uniquely associated with said information, except by an arbitrary act
(arbitrary from the standpoint of determinism), that is, by assignment
and convention, not to mention interpretation wherein new information
is engendered from old via the operation of metaphor (reprocessing of
information native to one context within a distinctly different context). If
there is an evolutionary process which profits by being graced with a
preexistent infrastructure of the very subtlest of data processing
machinery which is suited to operate (by imposing initial and boundary
conditions) upon a medium, then the question becomes whether this
medium must itself be creatively dynamic in the sense of “self-organizing”, or if merely the presence of a sufficient density of “relic
information” encoded within the medium’s fundamental processes
should provide sufficient grist for an upward evolutionary process. August
One can only derive the complex from the simple within the context
of a dynamical ground that is altogether more complex than any
structure that can evolve and be sustained within it. This subtlety is
vouchsafed by the innate capacity for this ground of being to bootstrap
an indistinguishable simulacrum of itself within itself, i.e., the ground of
being transcends mere topology and still more spacetime topology
(which is merely a specific form of topology).
According to the molecular biologist Stuart Kauffman, the
evolvability of dynamic systems is maximized precisely on the boundary
between the system's chaotic and orderly regimes, far from system
equilibrium. Good is that which enhances creativity which is the
explicit expression of implicit integral wholeness. Evil constitutes that
which seeks to destroy, confuse, disintegrate as well as to impair the
expression of unity and wholeness through creativity. All creativity is in
reality re-creativity (of God).

The probability spectrum of a given wavefunction may be
underdetermined so that there exists an unlimited number of ways in
which an ensemble of measurements of the eigenstates of the
wavefunction with respect to a particular observable may sum together
so that the wavefunction appears perfectly normalized; this property may
permit an additional degree of freedom within quantum mechanical
virtual processes not previously suspected to exist.
Probability density conservation in 4-dimensional spacetime is at the
heart of the underlying physical mechanism for gravitation that we are
proposing. For instance, the gravitational reddening of starlight may be
simply explained in terms of this concept of probability (density)
conservation. Probability conservation is the most general statement of
the principle of causality. There is an absolute simultaneity, which
mental events distinctly enjoy due to the fact that they do not admit of
perspective; if anything they constitute perspective. However, the order
in which neurophysiological occurrences occur (in the brain) is at least
partially dependent upon the reference frame (in the relativistic sense)
that these events occur (as observables). There must be an embedding of
these neural events in a substrate, which extends beyond the merely
neurophysiological in order for a reference frame to be defined in which
there can arise a correspondence between subjective and objective
simultaneities. The nonlocally connected vacuum electromagnetic field
offers itself as the prime candidate for this embedding substrate.
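The probability conservation appealed to at the start of this passage has a conventional statement; the following is the standard textbook continuity equation in four-dimensional form (the connection to the gravitational mechanism proposed here is the present text's own conjecture, not part of the standard formula):

```latex
\partial_\mu j^\mu \;=\; \frac{1}{c}\frac{\partial (c\rho)}{\partial t} + \nabla \cdot \vec{j}
\;=\; \frac{\partial \rho}{\partial t} + \nabla \cdot \vec{j} \;=\; 0,
\qquad j^\mu = (c\rho,\ \vec{j})
```

Here \(\rho\) is the probability density and \(\vec{j}\) the probability current; the vanishing four-divergence is what makes the conservation statement frame-independent.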
If metaphysical dualism is false in the strict sense of there existing two
distinct and parallel fundamental processes, one physical, the other
mental, but if this doctrine is nevertheless true in the less restrictive
sense of there actually existing mental and physical realms which are not
distinct but somehow mutually interacting, then it is in principle
impossible to formalize the operation of mind.
It is quite true what many psychologists (as well as lay persons) have
noted concerning the tendency of a task to become executable without

the aid of conscious attention the more and more that it is performed.
However, what has not perhaps been widely noted by either is the
somewhat contrary tendency for one to become more, rather than less,
aware of the abstract operations lying behind the performance of a task
in new contexts where the specific concrete operations constituting the
task would never otherwise suggest themselves. This tendency for us to
become aware of the abstract operations specific to one particular oft-repeated task within a context normally foreign to it, or at least for our
performances of operations within new previously unrelated contexts to
be guided by these abstract operations, I refer to as operational
modulation - or op-mod, for short. What we are calling op-mod may be
alternately thought of as the manipulation of something in terms of an
operational metaphor; it is itself the very essence of the human tool-using intelligence, and may be considered to be a general property of
any neural-network computing device.
More specifically, op-mod is peculiar to the problem solving strategy of
the neural network device because the specific neural circuits which are
utilized by such a network for solving one particular "problem" will
necessarily overlap with neural circuits which are being established in
the course of attempting to solve “similar” problems in new extraneous
contexts.
The existence of the ground of Reality consists exhaustively in its very
activity. Consequently, that which creates this ground is that which
sustains this ground; from which further follows the truth of Leibniz's
principle that, "the conditions sufficient to create the world are necessary
at every succeeding moment to sustain its existence."
But the implications of quantum mechanics as pertains to what is called
the quantum vacuum conceived of as the naturalistic interpretation of the
ground of being in the application of this concept to induced gravity
theory or effective field theories of gravity and inertia may suggest that
Leibniz’ principle must break down in connection with the fundamental
quantum-thermodynamic phenomenon of environmental decoherence.

March 2011

Decoherence is witness to the fact that the conduit of
communication between the quantum system and its supporting vacuum
state does not possess “enough bandwidth” for the system to update
itself “in real time”, hence the relatedness of gravitational decoherence
and gravitational time dilation.
We know that there has to have always been something in existence and
so the ground of Reality must be self-sustaining, and hence, self-creating. It follows that the ground of existence necessarily exists, and
so is eternal. All possibility ultimately lies dormant within that which
necessarily exists. In the language of quantum mechanics, every
determinate eigenstate with respect to a particular physical observable
may be alternately represented as a series of eigenstates with respect to
an indeterminate physical observable incompatible with the first.
December 1996

When one conceives of some universal substance or "stuff"
which does not depend on any activity for its existence, one is
conceiving of something which is at once a form and a substance. One
is conceiving of a substance which is a particular determination of that
which possesses greatest generality.
Hermann Weyl notes in his book, "The Open World," that the state of a
two-electron system is not determined by the state of each individual
electron added together, but that the states of each electron may be
deduced from the state of the two-electron system.
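Weyl's observation can be made concrete with the simplest case, the two-electron spin singlet. A minimal numerical sketch (numpy assumed available; the singlet is the standard textbook example, not something specified in Weyl's passage):

```python
import numpy as np

# Singlet state of two electrons: (|01> - |10>) / sqrt(2)
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
rho = np.outer(psi, psi)  # joint density matrix of the two-electron system

r = rho.reshape(2, 2, 2, 2)  # indices (i, j; k, l) for particles (1, 2)
rho1 = np.trace(r, axis1=1, axis2=3)  # trace out particle 2 -> state of particle 1
rho2 = np.trace(r, axis1=0, axis2=2)  # trace out particle 1 -> state of particle 2

# Each electron separately is maximally mixed (I/2), deducible from the
# joint state...
print(rho1)
# ...while the joint state is NOT the two individual states "added together":
print(np.allclose(np.kron(rho1, rho2), rho))  # False
```

The individual states follow from the two-electron state, but combining the two individual states fails to recover the joint state, which is exactly the asymmetry Weyl notes.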
Leibniz's series: 1 - 1/3 + 1/5 - 1/7 + 1/9 - 1/11 + . . . , no longer
converges to the same value when the terms are rearranged into a sum of
the following two sequences: (1 + 1/5 + 1/9 + . . . ) + ( -1/3 - 1/7 -
1/11 - . . . ), each of which taken alone diverges. This is a
rather common property of what are called conditionally convergent
series. This property is very mysterious, but can be made to seem
less so if one pictures each term of the sequence as a numbered bead on
a string. A finite number of terms of the series may be rearranged
arbitrarily to produce the identical sum, and this may be thought to be
possible simply because the string, being finite in length, permits the

removal, and hence, rethreading of all the beads onto the string in any
arbitrary order. However, given an infinite number of beads, the string
is now itself infinite in length and so it is no longer possible to remove
the beads so as to put them into a new order.

The order of the beads may only be changed into that represented by
the two sums provided that the original string is cut, and this changes the
topological relationship of the beads; in a finite sequence the order of the
terms (beads) may be rearranged without altering the topological
relationship of the beads.
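The bead-and-string picture can be checked numerically. A minimal sketch (the two-positives-per-negative rethreading and the limiting value pi/4 + ln(2)/4 are standard Riemann-rearrangement facts, assumed here rather than taken from the text):

```python
from math import pi, log

def leibniz_partial(n):
    """Partial sum of 1 - 1/3 + 1/5 - 1/7 + ... in the original order."""
    return sum((-1) ** k / (2 * k + 1) for k in range(n))

def rearranged_partial(blocks):
    """Same beads, rethreaded: two positive terms 1/(4k+1) for every
    one negative term -1/(4k+3), drawn from the text's two subsequences."""
    s, p, q = 0.0, 0, 0
    for _ in range(blocks):
        s += 1 / (4 * p + 1); p += 1
        s += 1 / (4 * p + 1); p += 1
        s -= 1 / (4 * q + 3); q += 1
    return s

print(leibniz_partial(200_000))     # approaches pi/4       ~ 0.785398
print(rearranged_partial(100_000))  # approaches pi/4 + ln(2)/4 ~ 0.958685
```

The same terms, rethreaded in a different order, converge to a different number; the rethreading is exactly the irreversible cut-and-restring operation described above.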
Herein lies the irreversibility of the
procedure. It is also interesting to note that Leibniz' series converges to
the value of pi/4: the value of convergence is itself an irrational
number possessing a decimal expansion with no determinate
order whatever, so that what we have is an equation between an irrational
number and an infinite sum of rational numbers, on the one hand, and,
on the other hand, an equation holding between an infinite sum of terms
possessing a mathematically determinate sequential order with respect to
a simple mathematical operation, namely, addition, and an infinite sum
of terms possessing no mathematically determinable sequential order - no sequential order with respect to any definable mathematical
operation. We may suspect that Cantor's argument for the existence of
what he calls nondenumerable infinity, i.e., the famous "diagonal
argument," can be applied to the decimal expansion of pi to show that
this sequence of decimal fractions itself constitutes a nondenumerable
set of rational numbers. What is interesting here is that no possible
rearrangement of the indeterminate sequence of nondenumerable
rational numbers constituting the decimal expansion of pi will produce
an irrational which diverges although there do exist rearrangements of
the terms of Leibniz' series which diverge. From this simple fact we may
deduce that there is no infinite sequence of denumerably infinite subsets
of terms taken from Leibniz' series, on the left hand side of our equation,
which will produce a one-to-one correspondence with the individual
rational numbers of the infinite sequence of rational numbers in the
decimal expansion of pi. July 2011 Although there’s a kernel idea here that
requires further development, what has just been
noted about Leibniz’ series reveals that pi has a topological structure.
Does the fact that the real line possesses a topology imply that the
infinite real line must possess some kind of closed-loop structure?
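Cantor's diagonal construction invoked above can be illustrated on a finite sample. A toy sketch (the digit rows and the "+1 mod 10" rule are illustrative choices, not anything from the text):

```python
def diagonal_missing(rows):
    """Build a digit sequence differing from the n-th listed sequence
    in its n-th digit, so it cannot appear anywhere in the list."""
    return [(row[n] + 1) % 10 for n, row in enumerate(rows)]

rows = [[3, 1, 4, 1],
        [2, 7, 1, 8],
        [1, 4, 1, 4],
        [5, 7, 7, 2]]
d = diagonal_missing(rows)
print(d)  # [4, 8, 2, 3]: differs from every row at the diagonal position
```

Applied to any purported complete enumeration of infinite digit sequences, the diagonal sequence escapes the list; that escape is the whole force of the nondenumerability argument the passage leans on.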
November 2013

The principle that Emerson seems to be illustrating can
perhaps lead us to a deeper understanding of time and a more hopeful
appreciation of time's potential, which goes way beyond our usual paltry
conception of time as unidimensional and inexorably finite. The higher
dimensionality of time is to be sought in its multiplicity of scale and
connectedness. The topology of the real line, despite its abstraction is
anything but linear, so what chance is there for something which we
have only ever known intuitively as a solitary, individual stream of
consciousness to be linear?
Words in a text make up a context, but then, at some level of complexity,
when removed from that text they uniquely call back to it and infuse the
context of the originating text into the next text into which they are inserted.
Specified complexity. The reprocessing of moments as contextual
chunks (chunkings of data) is in no way exhausted by the multiple duty,
which these chunks can perform as context-free composite elements.
Think of Stephen C. Meyer’s plastic letters and magnetic whiteboard
illustration of the specified complexity of the information contained in
base pair sequences. These possess no immediate chemical context with
respect to the determination of those specific sequences.
Gödel has stated that his incompleteness theorem applies only to logico-deductive systems at least as powerful as that represented by arithmetic
(Peano Arithmetic). This is because the proof of the theorem is based on
the Gödel-numbering procedure, where each operator, as well as all the
symbols utilized by the system, are represented by Gödel numbers,
while all of the logical operations of the system are defined in terms of
arithmetic operations. So we may say that arithmetic is definable within
all so-called Gödelian deductive systems. The domain of all arithmetical
operations is a domain devoid of topological structure. Self-referential
propositions introduce a topological structure into the domain of proof.
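The Gödel-numbering procedure described here can be sketched concretely. A minimal toy version (the prime-power encoding is the common textbook scheme; the particular symbol codes are arbitrary illustrative choices, not Gödel's original assignments):

```python
def first_primes(n):
    """First n primes by trial division."""
    ps, c = [], 2
    while len(ps) < n:
        if all(c % p for p in ps):
            ps.append(c)
        c += 1
    return ps

def godel_number(codes):
    """Encode a symbol-code sequence (c1, c2, ...) as 2**c1 * 3**c2 * 5**c3 ..."""
    g = 1
    for p, c in zip(first_primes(len(codes)), codes):
        g *= p ** c
    return g

def godel_decode(g):
    """Recover the code sequence by reading off prime exponents.
    Assumes every code is >= 1, so no prime in the sequence is skipped."""
    codes, c = [], 2
    while g > 1:
        if all(c % p for p in range(2, c)):  # c is prime
            e = 0
            while g % c == 0:
                g //= c
                e += 1
            codes.append(e)
        c += 1
    return codes

print(godel_number([3, 1, 2]))  # 2**3 * 3**1 * 5**2 = 600
print(godel_decode(600))        # [3, 1, 2]
```

Because encoding and decoding are pure arithmetic (multiplication and factorization), statements about formulas become statements about numbers, which is what lets arithmetic talk about its own proofs.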

Rational numbers are the sums of convergent infinite series where the
order in which the terms of the series appear does not affect the value of
the sum. We may say in this case that rational numbers occupy a
number field possessing arithmetic, or null, topological structure.
Irrational numbers, on the other hand, are the sums of infinite series,
which may diverge if the order in which the terms of the series appear
are altered. We may say that the irrational numbers occupy a number
field possessing a topological structure. The degrees of freedom
required for certain reactions, or interactions, to take place, are only
allowable within a space of large enough dimensionality to
accommodate them. The unreasonable effectiveness of mathematics
within the physical sciences, borrowing the famous phrase of the
quantum physicist Eugene Wigner, is owing to the radically and,
perhaps, infinitely, overdetermined nature of natural phenomena. To
wit, sensory data are grossly insufficient to determine uniquely the
information structures with which they are interpreted and explained. A
genuinely recursive system may only be derived from a recursive system
equally or more complex than itself, or if the recursive system is
"constructed" out of simpler recursive elements, the control system
effecting or mediating the process of construction is, itself, a recursive
system, of greater complexity than the system being constructed. The
information content of a particular structure is defined by the degree of
preciseness to which the system approximates its intentional object.
This definition is best understood in terms of the "shattered hologram"
metaphor. A molecule belonging to a complementary molecule pair,
two molecules which naturally hydrogen-bond to one another, favors the
spontaneous self-assembly (from locally available components) of the
molecule to which it bears a topologically complementary relationship.
More generally, the spontaneous self-assembly of molecules is favored
by a vacuum containing energy resonances complementary to those
upon which the molecule's energy structure depends for its sustained
existence. On this view, the quantum vacuum electromagnetic field may
be thought of as a kind of dynamic template which "informs" certain
simple molecules "how to self-assemble," with these simple molecules

acting as complex waveguides receptive, or sensitive, to a certain tiny
portion of the spectrum of electromagnetic frequencies originating from
within this vacuum.
July 2011

An important question in this connection is whether there are
contingent conditions for the emergence of altogether new structures,
and whether there is any contradiction in the dynamical substrate of the
quantum vacuum being able to support and sustain emergent structures
that it is nonetheless unable to anticipate. Another way to put this is:
can open-ended conditions be posited for irreducibly complex structures
to “boot strap” themselves into existence? Here the dynamical substrate
is intelligent, creative, however not all-knowing. Here also, the complex
structures engendered are irreducibly complex, however this is in the
absence of intelligent design, as only intelligent recognition is required.
The teleology bespoken by the emergence of irreducibly complex
structures in one temporal dimension can be given a causal explanation
through the operation of feedback structures in higher dimensions of
time. September 2011 There are two fundamentally disparate concepts of
intelligent design: in the one, such as that of a biochemist or molecular
biologist, the designer applies design concepts derived wholly from his
study of already available biological structures and systems; in the other,
that of a demiurge or deity, the designer develops a system or structure by
directly lifting it out of chaos, calling it out of the inchoate flux of open-ended
If what might be called prn=time-scale reductionism (TSR) constitutes a
fundamentally false understanding of the dynamics of natural
phenomena, then the traditional philosophical view of time as
possessing only a single dimension must be abandoned. Time-scale
reductionism says, simply, that events taking place over a certain time
interval are owing exclusively to events taking place over intervals of
time smaller than and “contained within” the first time interval, which
are in turn dependent upon events occurring over smaller time intervals,
and so on. July 2011 The phenomena of quantum entanglement and
teleportation, particularly within the transactional interpretation of
quantum mechanics, c.f., Cramer, appear to flout the principle of TSR.
One ready example of the failure of TSR is the case of prn=historical
time. In the case of historical time, there is a critical “window of
opportunity” within which certain events must transpire if certain
significant changes or revolutions, e.g., cultural, social, or political, are to
occur. Paradoxically, the sensitivity to initial conditions of the timeline
goes hand-in-hand with the timeline’s “robustness”, c.f., the misguided,
awkward and ultimately unsuccessful attempts of future time travelers to
meddle with the timeline. More broadly, events in a historical sequence
do not merely cause each other or concatenate as in a blind causal
sequence of events, but events in the historical process echo as well as
anticipate events in the past and future, respectively. It is because
historical events both create and react to a temporal context that this
type of determination in time is possible. And here it is
obvious that the temporal context is only efficacious if it is also
meaningful, which implies the operation of consciousness in both its
individual and collective forms. Quantum entanglement may be
understood as causality operating collectively rather than merely
individually as in the case of classical physics. We should be mindful
here that what is called thermodynamics is merely a collective
description of particles acting individually according to Newtonian
mechanics and which does not invoke any new concept of causality.
October 2011
It may turn out that we shall only succeed in developing a
“concept of consciousness” for the individual by borrowing from the
theory of the consciousness of the collective. If individual consciousness
is not a metaphysical entity, i.e., substance, but is instead a social
construct, then the philosophical quest to solve Chalmers’ “hard
problem” of consciousness shall be seen to have been all along the
pursuit of a red herring. “Contrary to what most people believe, nobody
has ever been or had a self. But it is not just that the modern philosophy
of mind and cognitive neuroscience together are about to shatter the
myth of the self. It has now become clear that we will never solve the
philosophical puzzle of consciousness—that is, how it can arise in the
brain, which is a purely physical object—if we don’t come to terms with
this simple proposition: that to the best of our current knowledge there is
no thing, no indivisible entity, that is us, neither in the brain nor in some
metaphysical realm beyond this world. So when we speak of conscious
experience as a subjective phenomenon, what is the entity having these
experiences?” cit=The Ego Tunnel: The Science of the Mind and the Myth
of the Self (au=Metzinger). June 2014 It does not, however, appear that
consciousness can be meaningfully held to “not exist” on the grounds
that it is merely an “abstract feature” of certain complex components of
physical reality, e.g., the quantum vacuum, rather than an objective
feature of being as such or a scientifically real, fundamental physical
field such as an electromagnetic or gravitational field.

John Searle, the philosopher of language and mind, has stated that formal
computational systems are incapable of consciousness because such
formal systems do not effectively exploit the causal powers of matter
available for utilization by the human brain. Since the
causal powers of matter, as Searle terms them, stem from what is forever
spontaneously occurring in the natural realm at the very smallest
dimensions of time and space, the process of abstraction, itself founded
upon the systematic ignorance of finer details of structure and function,
introduces a kind of built-in blockheadedness into systems of "artificial
intelligence" which are physically realized from relatively macroscopic
and "insensitive" component parts, in accordance with "analytically
closed-form" designs.
Vacuum fluctuations which are simultaneous in one reference frame
(Lorentz frame) will not necessarily be simultaneous in other frames.
This theoretical implication of special relativity for quantum mechanics,
combined with the fact that the energy density of the quantum vacuum is
decreasing with time as the universe expands, leads us to deduce that,
not only is the density of the quantum vacuum different in different
Lorentz frames, but so is its time rate of decrease.
I do not think that au=Hugh Everett's prn=many worlds interpretation of
quantum mechanics is consistent with the implications of quantum
experiments which have been performed in the last few decades since
the time (1957) when he originally proposed his interpretation of
quantum theory. In Everett's theory, the collapse of the wavefunction is
interpreted as a sudden, discontinuous branching of the observer from
one parallel universe, where the wavefunction is uncollapsed, to a new
parallel universe where the wavefunction exists in one of its component
eigenstates. From this do we suppose that all of the collapsed
wavefunctions within our universe owe their existence to observations
made by quantum physicists doing experiments in other universes?
April 2013

Physicalism or realism is just one interpretation among myriad
competing interpretations of quantum mechanics. epi= "Although
scientific positivism hasn't succeeded in abolishing metaphysics, it has
perhaps managed to contain metaphysics to a sub-spectrum of possible
interpretations of quantum mechanics, namely those invoking (or not
invoking) hidden variables."
August 2013

Humorously (no pun intended) paraphrasing Hume more than a
bit: If we place into a box any volume that may or may not treat
of divinity or school metaphysics, for instance; let us ask, Does it
contain any abstract reasoning concerning quantity or number? Yes and
No, in quantum superposition. Does it contain any experimental
reasoning concerning matter of fact and existence? Yes and No, in
quantum superposition. Perform an observation upon the volume.
Commit it to the flames, if it collapses into a quantum state containing
nothing but sophistry and illusion, but if not, stock it in the British
Library. What key property must data or information possess (or fail to
possess), say, contained in some not so distant future version of an
electronic book, such that a consistent interpretation could be found for
the data in which the book is both a scientific work and a treatise on
metaphysics, still more, such that these two interpretations are connected
by virtue of being in quantum superposition?

Enantiomer molecules, that is, molecules which were once thought
to be identical in every way except that they are the mirror reflection of
each other, have recently been generally found to differ in respect to
their binding energies. So-called "right-handed" molecules, such as the
amino acids, D-tyrosine, D-glutamine, etc., have been found to possess
smaller binding energies (and hence are less stable) than their mirror
image counterparts, the L - series amino acids of the same names. Given
the existence of a spatial fourth dimension, it is possible to convert a
right-handed molecule into its identical left-handed counterpart by
pulling the molecule into 4 - space, rotating it 180° in a plane normal
to the 3 - hypersurface, and returning the molecule to its original
position within this 3 - hypersurface. This would suggest the existence
of a preferential curl field acting within this four dimensional continuum
in a direction opposing the rotation of an L - molecule and aiding the
rotation of an R - molecule. This mechanism would be one logical way
to account for the observed differences in the binding energies of
identical L - and R - molecules. August 2013 Moreover, such a curl field
may provide a rational explanation for the predominance of matter as
opposed to antimatter in our observable universe. “While McLaughlin
concludes that emergence is impossible in the light of quantum
mechanics, Hendry regards issues connected with the status of molecular
structure as supporting emergence. The present author suggests that one
should not be persuaded by either of these arguments and pleads for a
form of agnosticism over the reality of emergence and downward
causation until further studies might be carried out”, c.f., cit=Top-down
causation regarding the chemistry–physics interface: a skeptical view
(au=Scerri, 2011).
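The hyperdimensional rotation invoked here is a standard mathematical fact, and it can be checked with a small numerical sketch. The point set below is a toy stand-in for a chiral structure (an arbitrary illustrative choice, not a model of any actual molecule): embedded in 4 - space at w = 0 and rotated 180° in a plane normal to the 3 - hypersurface, it returns to w = 0 as its mirror image.

```python
import numpy as np

# Toy chiral point set in 3D: a scalene tetrahedron (no mirror symmetry).
# Purely illustrative -- not a model of any actual molecule.
points3d = np.array([
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0],
    [0.0, 2.0, 0.0],
    [0.0, 0.0, 3.0],
])

# Embed in 4 - space on the w = 0 hyperplane.
points4d = np.hstack([points3d, np.zeros((len(points3d), 1))])

# Proper rotation by 180 degrees in the x-w plane (y, z fixed).
theta = np.pi
R = np.eye(4)
R[0, 0] = np.cos(theta); R[0, 3] = -np.sin(theta)
R[3, 0] = np.sin(theta); R[3, 3] = np.cos(theta)

rotated = points4d @ R.T

# The set returns to the w = 0 hyperplane ...
assert np.allclose(rotated[:, 3], 0.0)
# ... as the mirror image: the reflection x -> -x within 3 - space.
assert np.allclose(rotated[:, :3], points3d * np.array([-1.0, 1.0, 1.0]))
```

Within 4 - space the transformation is a proper rotation (determinant +1); it is only its restriction to the 3 - hypersurface that appears as a reflection, which is why the handedness reverses.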

But such imagined hyperdimensional rotations must be seen to be
only a metaphor for a re-creation of the molecule into its mirror-reversed
double. This is because the metric of Minkowski spacetime is not
positive definite, but indefinite. Information is neither created
nor destroyed; information is always conserved, and when it appears to
be created, it is merely being transduced, re-expressed within another
medium.
There are myriad different media through which portions of the eternally
pre-existent information may be expressed, but there exists a primary
medium which contains all information originally. All other media
through which information might be expressed are ultimately dependent
upon this primary information medium. In the same way that the
transduction of energy from one medium, say mechanical, to another
medium, say electrical, is always accompanied by a loss of a portion of
the transduced energy as heat energy (whereby entropy is increased),
some information is always lost in the transduction of information from
the primary medium to other secondary media. For this reason, no
informational systems or structures are permitted to come into being
which possess an information density greater than that of the volume
which they occupy, this volume being pervaded by energy in its primary
form (vacuum energy). In the same way, there is a limit to the
mass-energy density of any particular volume of spacetime; this limit is that
specified by Schwarzschild’s equation for the energy density of black
holes. The information which is inevitably lost as a result of the
transduction of information from the primary medium to secondary
media simply passes back into the primary medium.
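The black-hole limit mentioned here can be written out explicitly. For a Schwarzschild black hole of mass M, the horizon radius and the mean density enclosed within it are:

```latex
r_s = \frac{2GM}{c^2},
\qquad
\bar{\rho} = \frac{M}{\tfrac{4}{3}\pi r_s^{3}}
           = \frac{3c^{6}}{32\pi G^{3} M^{2}} ,
```

so the limiting mean mass-energy density of a region scales as the inverse square of the mass it contains; larger regions saturate at lower densities.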
July 1998

Information cannot be independent of the medium in which it is
expressed. Data, on the other hand, are independent of the medium in
which they are expressed.
March 1998

The pre-existence and transduction of information are not
logically self-consistent notions. Pre-existence implies something which
is continually a part of the temporal progression of the whole but which
itself remains latent and changeless. Transduction of information also
implies a contradictory context-freedom for information. For the
transduction of information implies that, like energy, no information is
gained or lost in its "changing form" as it passes from one medium
through another and then to another, and so on. This is to say that the
media carrying information contribute nothing to the content of this
information. And this is also to say that information is always abstract
and is constituted by relationships. One then might ask, what is it then
which differentiates information from mere data - or are they
synonymous? Data and information may be understood as constituting
a merely relative distinction. What is meant by this is that what are
data with respect to one context may function as information with
respect to another. In other words, information is data interpreted in light
of context while data in this same context function as information with
respect to smaller subcontexts contained therein. Since information is
context-dependent, it would follow that all information possesses a
characteristic lifetime rather analogous to the half-life of a radioactive
isotope.
The law of the temporal evolution of information systems is provided by
the pre-existing spatial distribution of information. The determinate is
dependent upon the indeterminate. The finite exists only through its
participation with the infinite. All transformations are definable in terms
of mere projection operations; therefore, these transformations, when
investigated, always reveal the presence of conservation laws which
seem to govern, or provide constraints upon, these transformations.
What is called the unity of apperception in Kant's Critique of Pure
Reason is synonymous with the existence of the underlying noumenon,
which provides the rationality of any particular series of perceived
continuous transformations entertained within a finite mind. The
interpenetration of the categories of time and space supports the unity
of apperception.

A functionalist theory of mind must presuppose a
decomposition of spacetime into a particular "3 + 1" configuration of
absolute space and absolute time. It must do this in order to define the
boundary between what are merely input-output operations and what
constitute operations of the processing of inputted data/information into
outputted data/information. We may understand the distinction between
information and data to be simply this: information are data placed in
context and interpreted in light of this context; data are information
taken out of context, that is to say, data are simply context-free
information. Now in order for information to be passed from one person
(or subjectivity) to another, information which the one person intends to
convey to the other must be translated, or more aptly, perhaps, converted
into context-free data.
Although there is no such distinction as subjective versus
intersubjective data, we may support such a distinction for information.
Intersubjective information may be understood as information which
two or more persons may hold in common with one another due to
similarities in the nature of the mental boundary conditions, if you will,
which act upon their respective consciousnesses. Subjective information
may be understood as information which is private to each person and
therefore cannot be held in common between different subjectivities.
Different consciousnesses are not all exemplars of consciousness itself, or
consciousness at large, by virtue of each individual consciousness
possessing one or more general qualities in common with the others,
for this would be to presuppose that different individual consciousnesses
are simply different structurings of a single fundamental consciousness.
If we suppose that the constituting of each individual subjective
spatiotemporal continuum assumes the prior existence of an objective, as
opposed to an absolute, spatio-temporal continuum, and that this
objective space and time are, in turn, constituted out of the activity of
some fundamental consciousness, then each individual consciousness, or
ego, is merely one particular structuring of the fundamental and unitary
consciousness among many other such possible structurings.
Temporality presupposes the givenness of Space. Duration, however,
does not presuppose the givenness of Space. Temporality pertains to the
evolution of things existing within a particular space. The temporal
evolution of a particular spacetime treats this spacetime as though it is
itself a "thing." The local temporal evolution of spacetime, which is an
admissible concept within classical general relativity, is not reducible to
the temporal evolution of entities and their mutual spatial relations
within this spacetime. Temporality pertains to changes occurring to the
system boundary conditions. Duration pertains to changes of the system
including the changes to the system boundary conditions. Within

temporality, the rate at which a sequence of events or evolution takes
place cannot be determined; temporality and duration are required for
this. Temporality is duration plus deterministic causal relations taking
place within a particular spacetime frame of reference.
May 1997

And this is perhaps another important distinction which can be
made between the subjective (mental) and the objective (physical): in
subjectivity, the forms of time and space are fused and continuous with
one another. Communication between different subjectivities, that is,
intersubjectivity which is objectivity, requires that the fused
spatiotemporality of each subjectivity be decomposed into separate
space and time dimensions within the realm of the objective.
The increase in complexity of coherent systems with time would seem to
involve the creation ex nihilo of quantities of information. There are
reasons for believing, however, that what is really involved in cases such
as this is merely the partial redistribution of data along a spatial
information gradient onto a gradient which is part spatial and part
temporal where the total quantity of information is conserved in the
process. With the introduction of excitation energy, the nonlocal,
distributed information content is partially transformed into local,
lumped information content. The information content contained within
a purely spatial manifold is nonlocally encoded through relations
within the manifold which originate externally to it. The relations which
are constitutive of a spacetime manifold cannot be represented in terms
of a distribution of relations between localities on this manifold. No
causal theory can explain the manner in which a spacetime is
constituted. A spacetime is constituted nonlocally, that is, through the
operation of nonlocally connected dynamical processes which are only
partially located within this spacetime. Is it, in principle, possible for all
the neural firings which comprise the brain state to which is associated a
particular mental state to have been stimulated to occur entirely from
outside the brain's neural network, obviating the need for intricate
feedback loops connecting the neurons with one another which normally
support such patterns of neuron firings? Intuitively we suspect that
merely reproducing the requisite neural firing patterns from outside the
network would not be sufficient to produce the normally occurring
associated mental state. This is because the observed neural firings
would only possess a determinate order in terms of the perception of
their order by means of a neural network genuinely possessing intricate
feedback structures.
We might, in turn, be puzzled by the force of this particular intuition
which has at its root the notion of the importance of timing and
synchronization of the neurons with respect to one another. But this
would really only be important if there was something which the
neurons incidentally interact with in the course of their normal process
of functioning to which the order of their "firing" might be
fundamentally related. We might then seek to include this additional
something and produce the changes in it also from outside, just as in the
case of the neurons. Notice that in every case where we are supposedly
able to reproduce a given sequence of neural firings, we are dependent
upon a favorable condition wherein the time interval between firings
within a given small region of the brain is larger than the time
uncertainty of the quantum system constituting the neural network. We
find that our earlier intuition about the problem of the timing of the
events appropriate to the establishing of the requisite brain states crops
up yet again. The timing of local causal interactions is between
particular boundary conditions of, and relative to, the nonlocally
connected vacuum in which the energy uncertainties of the neural
network as a quantum mechanical system ultimately originate. There is
still something with respect to which the patterned events (comprising
the requisite brain states) occur which is important from the standpoint
of timing and synchronization, and we might, therefore, again, seek to
include it, just as before. The point here is that this process of trying to
include the entire background against which the timing of the brain
events are significant can never be completed; we face an infinite regress
here, or, if successful in including the entire background, then there
remains nothing against which the rate of causally sequential events or
the timing of not causally connected, but merely correlated, events within
the network may be established. This regress is apparently resolved within
the quantum mechanically uncertain time interval of the network and
therefore is forever beyond manipulation from outside, that is to say,
there cannot exist a determinate program adequate to produce the timing
necessary to integrate or unify the neural firings into the requisite
coherent pattern we term consciousness.
This timing is not to be
understood after the normal fashion of a mechanical timing of
articulated events. For purely relative timing in the above sense does not
take into account the duration of the whole process (relative to its
determining ground). Moreover, the rate at which a sequence of
causally connected physical occurrences unfolds is determined through
the availability of the spacetime constituting vacuum momentum/energy
fluctuations with which the network must continually interact.
To restate, this is because the ultimate embedding substrate of the neural
network functions through interconnected events possessing a time
uncertainty which prevents their delicate synchronization from ever
being introduced from outside - outside either in the physical/spatial
sense or in the purely formal sense of a design or template imposed upon
the concrete dynamical substrate through the imposition of boundary
conditions. Of course, what is called dualism is completely ruled out in
the case where the brain is thought to function in a deterministic
manner. This is because the isomorphism which must maintain between
brain states and mental states precludes the possibility that these two
qualitatively different types of states are causally connected; for any
effective causal interaction between the two would necessarily disrupt
the isomorphism which is presupposed by the dualistic theory of mind.
On the other hand, in the absence of causal interaction between brain
states and mental states, there is no rational basis upon which we can say
that particular mental states correspond to, or belong to a particular
brain. On the other hand again, however, if dualism is rejected and
causal relationships are allowed to obtain between brain states and
mental states, then both types of events/processes must be mediated by
the very same underlying substance, or substrate. In this way, the
whole distinction between what are called brain states and mental states
completely breaks down, and one is forced to adopt a monistic theory of
mind. Because of the veritable existence of temporality, we know that
the fundamental dynamism mediating all physical processes must be
irreversible (in the thermodynamic sense).
Consequently, the
appearance of reversibility in physical reactions, e.g., chemical, nuclear,
etc., is just that, and the entities which take part in these physical
reactions/processes are fundamentally overdetermined by the underlying
dynamism grounding them, producing an underdetermination of their
mutual causal interactions which is in principle irremediable. This is
due to the essential nature of causal analysis as always being performed
in terms of the laws governing the dynamical relations between
indefinitely reproducible entities. October 2014 Refer to philosopher Alain
Badiou’s treatment of the concept of overdetermination.
March 1997

In the passage from one "state" to the next, a system exists
momentarily in a configuration which cannot be characterized as either
state, and this no matter how finely we might seek to partition the system's
flow of change. From this it immediately follows that it is what is
peculiar to a dynamic system, yet common to all such systems generally,
which permits them to transcend a sequential state description of the
change they experience, and on this their ineradicable temporality depends.
Temporality, to wit, exists exclusively in the domain which transcends
or lies beyond all possible abstract descriptions. The concrete is the
temporal. As we have already commented, the medium of abstraction
must not admit of an adequate description in terms of any set of abstract
categories, since the categories presuppose the existence of that which
brings them forth. The notion of historicism, in the sense provided by
the theories of Marx and Weber, is conceptually unintelligible because it
assumes the existence of a distinction the validity of which it then later
denies. That is the distinction between physical causal factors and
historical factors in the explication of social, political, cultural, and
economic developments. The validity of historicism would mean that
history as a science of large scale human development doesn't really
work, that it doesn't have anything substantive to say at all because the
real causal efficacy behind the changes which history has traditionally

studied lies at a level of description which is at once lower and more
fundamental than that where historical explanations are articulated.
“Althusser believes that Marx did not fully
comprehend the significance of his own work, and was able to express it
only obliquely and tentatively. The shift can be revealed only by a
careful and sensitive "symptomatic reading". [15] Thus, Althusser's project
is to help readers fully grasp the originality and power of Marx's
extraordinary theory, giving as much attention to what is not said as to
the explicit. Althusser holds that Marx has discovered a "continent of
knowledge", History, which is analogous to the contributions of the
pre-Socratic natural philosopher Thales to mathematics, and of
Galileo to physics, [16] or, better, Freud's psychoanalysis, [17] in that the
structure of his theory is unlike anything posited by his predecessors.
Althusser believes that Marx's work is fundamentally incompatible with
its antecedents because it is built on a groundbreaking
epistemology (theory of knowledge) that rejects the distinction
between subject and object. In opposition to empiricism, Althusser
claims that Marx's philosophy, dialectical materialism, counters the
theory of knowledge as vision with a theory of knowledge as production.”
That which is the source and sustainer of all things cannot be viewed as
being anything but infinite. The paradigmatic example of this
transduction process is the spontaneous production of fundamental
particles out of the vacuum within accelerated reference (non-Lorentzian)
frames.
Evolution may only be a local phenomenon - not a global phenomenon;
evolution in the sense of the genuine emergence of wholly
unprecedented and unanticipated forms, structures, or dynamisms -
without this process of development somehow drawing upon a
pre-existing reservoir of information within which these "emergent" forms
are at least implicitly prefigured, and which mediates and underpins the
evolutionary developmental process - is tantamount to the acceptance of
a fundamental process which is itself uncaused and which is not
admitted to be the cause (or reason) for its own existence. In the
vacuum, information exists in a nonlocal, simultaneously connected
form. When the vacuum energy fluctuations mediate the occurrence of
physical processes, there is a transduction of nonlocal, parallel and
simultaneously connected information into a new local, sequential, and
temporally connected form. But such a transduction phenomenon
cannot take place within the pristine vacuum, that is, within the vacuum
in the absence of any action upon it even if this action ultimately arises
from itself. This vacuum must have something to react with or against;
it must be "seeded." The ground of existence cannot be outstripped by
any possible existent "thing." Nonlocality presents the possibility of
putting quantum mechanical probability on a "rational" footing; in other
words, a given wavefunction is normalizable on the average. The
condition of normalizability is not a very restrictive condition on a
quantum mechanical wavefunction; there are an infinite number of ways
to refract or decompose a given wavefunction into a spectrum of
eigenstates (of an incompatible observable) so as to satisfy the
normalization condition.
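The non-restrictiveness of the normalization condition can be illustrated numerically: once a discretized wavepacket is normalized, its expansion coefficients in any other orthonormal basis (here the discrete momentum basis, reached by a unitary Fourier transform, the eigenbasis of an observable incompatible with position) satisfy the same condition automatically. The grid and wavepacket below are arbitrary illustrative choices.

```python
import numpy as np

# A discretized wavepacket on a finite grid (arbitrary toy example).
x = np.linspace(-10.0, 10.0, 256)
dx = x[1] - x[0]
psi = np.exp(-x**2 / 2) * np.exp(1j * 3.0 * x)

# Impose the normalization condition: sum |psi|^2 dx = 1.
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)
assert np.isclose(np.sum(np.abs(psi)**2) * dx, 1.0)

# Re-expand the same state in a different orthonormal basis: the
# discrete Fourier (momentum) basis. The change of basis is unitary,
# so the new coefficients are normalized with no further condition.
amps = psi * np.sqrt(dx)                 # dimensionless amplitude vector
coeffs = np.fft.fft(amps, norm="ortho")  # unitary change of basis
assert np.isclose(np.sum(np.abs(coeffs)**2), 1.0)
```

By unitarity (Parseval's relation), any such decomposition preserves the normalization, which is why normalizability alone constrains the wavefunction so weakly.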
December 1996

It cannot be by way of some impossibly complex causal
sequence that a thing manifests itself as such. For instantaneous context,
which cannot be analyzed in causal terms, must play a role in the act of
determination. The fundamental paradigm shift which marked the
transition from classical (Newtonian) mechanics to the mechanics of
quantum phenomena may be captured in the manner in which the
implied unified physical law constrains the phenomena of nature:
classical physical law states that what it prescribes to occur must
necessarily occur - any behavior apart from this being forbidden;
quantum mechanical physical law states that what it does not proscribe
or forbid to occur necessarily occurs. This constitutes a kind of
fecundity principle. And the possibilities can only be defined when
boundary conditions are superimposed upon the system which in its
natural state is not necessarily inclusive of any particular set of boundary
conditions. The boundary conditions upon the system are what
constitutes locality. Only the effects upon the bounded system can be
measured. Causality concerns the dynamics of the boundary conditions
but cannot capture the behavior of the system in an unconditional way,
that is, in the absence of any supplied boundary conditions. If the
quantum mechanical vacuum is the origin of temporality, then the
vacuum must itself be timeless, which is to say, eternal, c.f., Ehrenfest's
Theorem, in which the instantaneous rate of change of the expectation
value of an operator is given by the expectation value of the commutator
of that operator with the Hamiltonian, so that any operator commuting
with the Hamiltonian has a constant expectation value.
Moreover, that which is the originator of space must itself be spatially
unlimited. Human intelligence has evolved to a point just short of that
required to think something genuinely interesting. C.f., Understanding
Quantum Physics by Michael A. Morrison for a fuller explication of
how temporality and Heisenberg energy uncertainty are related.
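For reference, the standard statement of Ehrenfest's theorem ties the time evolution of an expectation value to the commutator with the Hamiltonian:

```latex
\frac{d}{dt}\langle \hat{A} \rangle
= \frac{1}{i\hbar}\,\bigl\langle [\hat{A}, \hat{H}] \bigr\rangle
+ \Bigl\langle \frac{\partial \hat{A}}{\partial t} \Bigr\rangle ,
```

so an observable that commutes with the Hamiltonian, and carries no explicit time dependence, has a stationary expectation value.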
The neural network computer does not store information in any
particular location within the network, but stores information at
particular energy levels of the global interaction of the network as a
whole. Each new bit of data which is fed into the network is stored at a
next higher energy level of the network. What ultimately becomes this
next higher energy level is determined by a virtually chaotic process of
neural "firings" which occurs throughout the network and which is
stimulated by the introduction of a data input signal. Myriads of neurons
throughout the entire network continue to fire randomly until a new state
of least energy is reached by the neural network as a whole.
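The picture sketched here, of information stored not at a location but as a global least-energy state reached through myriad firings, resembles a Hopfield network. A minimal sketch under that assumption (the pattern, network size, and update rule are illustrative, not from the text):

```python
# Toy Hopfield-style network: a pattern is stored in the weights of the
# whole network, and recall proceeds by repeated "firings" that drive the
# network into a global energy minimum.

def train(patterns, n):
    # Hebbian weights: w[i][j] accumulates pairwise correlations over patterns
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / n
    return w

def energy(w, s):
    # Global energy of state s; stored patterns sit at minima of this function
    n = len(s)
    return -0.5 * sum(w[i][j] * s[i] * s[j] for i in range(n) for j in range(n))

def recall(w, s, steps=20):
    # Repeatedly update each unit by the sign of its input until stable
    s = list(s)
    for _ in range(steps):
        for i in range(len(s)):
            h = sum(w[i][j] * s[j] for j in range(len(s)))
            s[i] = 1 if h >= 0 else -1
    return s

pattern = [1, -1, 1, -1, 1, -1]
w = train([pattern], len(pattern))
noisy = [-1, -1, 1, -1, 1, -1]   # stored pattern with one bit flipped
restored = recall(w, noisy)      # settles back into the low-energy state
```

The recalled state has strictly lower global energy than the corrupted input, which is the sense in which "information is stored at energy levels of the network as a whole".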
October 1996

According to a kind of James-Lange theory of the operation of
mind (the view of the individual mind as a kind of "reducing valve") the
brain does not react to environmental stimuli in the commonsensical
way, but it is a dynamic system of resonators which are continually
retuned to new signals within the quantum vacuum electromagnetic field
in response to stimuli. September 2011 But this notion of the brain acting
merely as a kind of filter of vacuum signals is only part of the story. The
brain also reprocesses the vacuum signals of a spectrum peculiar to it,

and it is this reprocessed spectrum that undergoes the filtering action.
April 2012
“The fact that a high level of consciousness is associated with
complex neural structures does not prove that the neural structures
produce this consciousness.” – David Bohm
June 2012

An important question is whether or not multiple brains process
quantum entanglement from the same vacuum information spectrum,
putting back reprocessed and new entanglements into this same
spectrum or whether each does so only in interaction with its own,
unique signature-quantum vacuum information spectrum. If each brain
only interacts with its own vacuum information and signal spectrum, the
question arises as to whether entanglement information can be passed
between consciousnesses or only signals, data and instructions, the
meaning of which is exclusively the domain of the recipient of
interpersonal data. There is also an additional distinction to be drawn
between data within different regions of the same brain versus data
transmitted via physical signals passing between one brain and another.
If the separations between different energy levels within the neural
network (representing different bits of information) are close enough
together in energy, then it becomes very probable that there will be a
process of continual virtual energy transitions occurring between the
various discrete energy levels of the network throughout its entirety. An
interesting point here is that these virtual energy transitions within the
network owe entirely to the action of the quantum fluctuations in the
energy of the vacuum electromagnetic field. Moreover, the probabilities
of given neural energy transitions occurring within the network are
determined by the presence of the constantly occurring virtual energy
transitions of the network which, again, are mediated entirely by way of
the quantum mechanical fluctuations in the vacuum electromagnetic
field, themselves, owing to the necessary presence of Heisenberg energy
uncertainty within the quantum vacuum. An essential difference
between what are called virtual and what are called real energy
transitions, is parallel to the distinction between what are called virtual
particles and what are called real particles, respectively, in the theory of

particle physics, namely, virtual energy transitions cannot be measured
directly, whereas real energy transitions can be measured directly, for
instance, in laboratory experiments. The real energy transitions which
take place within the neural network, and which are responsible for the
communication of its processed information to the "outside world," i.e.,
to the consciousness of both the individual subject as well as his
listeners, in the case of verbal communication, are, themselves,
overdetermined phenomena. This is to say, there are an indefinite
number of distinct sequences of virtual energy transitions, which are
capable of producing the very same real energy transition within the
neural network. This assertion reminds us of Feynman’s sum of histories
formalism for calculating the probabilities of fundamental particle
reactions. If it were not for the existence of energy degeneracy within
the neural network, there would be only one path of neural firings
possible connecting one energy level of the network to the next higher
one. The operation of a neural network would in this case be formalizable
in the form of a computer algorithm. Thus is the way to what is called
intentionality opened up and made possible: the very same determinate
sequence of neural firings may have an unlimited number of alternative
future brain states in view, in other words. It is interesting to note that
the interaction of the virtual energy states of the neural network is not
mediated primarily by the physical realization of the network itself, but
by the next highest order of perturbations to the energy of the neural
network. What we have been calling "virtual energy transitions," are
really only the first order perturbations to the global energy of the neural
network, conceived of as a quantum mechanical system. The first order
perturbations, what we have been calling, "virtual transitions" within the
network, are themselves, informed or mediated by, the quantum
mechanical perturbations to the first order perturbation energies of the
network, i.e., 2nd order energy perturbations, thus making the first order
perturbations overdetermined phenomena as well. In turn, the second
order perturbation energy transitions (what might whimsically be called,
virtual - virtual energy transitions) are mediated by the occurrence of

transitions between second order perturbation energies, etc., and so on.
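The "first order" and "second order" perturbation energies spoken of here correspond, in standard Rayleigh–Schrödinger perturbation theory (assumed as the relevant formalism), to the corrections to level $n$ under a perturbation $V$:

```latex
E_n^{(1)} = \langle n^{(0)} | V | n^{(0)} \rangle,
\qquad
E_n^{(2)} = \sum_{m \neq n}
  \frac{\bigl|\langle m^{(0)} | V | n^{(0)} \rangle\bigr|^2}
       {E_n^{(0)} - E_m^{(0)}}
```

where the second-order correction is itself a sum over intermediate ("virtual") states, which is the sense in which each order of transition is mediated by the next.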
At this point we might realize that the real energy transitions occurring
within the neural network which are normally thought to be immediately
responsible for the processing of all information by the network, are
engendered not by physical processes occurring at a lower level of
organization, but via processes taking place at higher levels of
organization, represented by the next higher order perturbation energy
description of the neural network. We know that the virtual energy
transitions of any given quantum mechanical system are owing to the
presence of energy uncertainty in the system. It is more accurate,
however, to say that this energy uncertainty is present in the quantum
vacuum itself, and is merely communicated to the quantum system, of
interacting elementary particles, say, through the exchange of energy
between the quantum system and the vacuum, which is itself where the
energy uncertainty originates; we saw earlier that the wavefunction
describing a quantum mechanical system cannot be normalized if the
energy uncertainty is conceived of as being a property of the quantum
system itself, so that it must be an inherent property of the quantum
vacuum itself.
So perhaps we see now that the neural network itself acts merely as a
kind of terminus to an information reduction process, it acts as a kind of
"reducing valve" which serves to abstract a relatively tiny portion of
information from the virtually infinite information content of the
overdetermined quantum vacuum which is instantaneously and
nonlocally connected with itself, and therefore represents the highest
level of information processing because it constitutes the
interconnectivity of energy at its most exhaustive level. On this view,
information, like energy, may not be created or destroyed, but is a
conserved quantity, and its ultimate source is the infinite energy
reservoir represented by the quantum vacuum. We already saw how
Temporality itself stems from the presence of quantum energy
uncertainty, which, in turn, originates in the vacuum, conceived of,
again, as a reservoir of virtually infinite energy density. Consequently,
since Temporality itself has its origin in the vacuum, it follows that

this infinite sea of vacuum energy itself had no beginning in time! The
vacuum now begins to remind us of Heraclitus' "ever living fire," "in
measures kindling and in measures going out," and thereby mediating, as
an eternal flux, all the changes continually taking place in the natural
order. Moreover, Heraclitus' statement that, "everything is an exchange
for fire, and fire an exchange for every thing," reminds us of the
interconvertibility of mass and energy in quantum relativistic field
theory, this interconvertibility being mediated by the continual exchange
of energy between matter and vacuum. Heraclitus' "ever living fire" is
to him the fire of the gods, yet uncreated by the gods. His statement
that "Thunderbolt steers the Universe" no doubt refers to the thunderbolt
wielded by Zeus, the greatest of all the Olympian gods; when this
thunderbolt is identified with "the fire of the gods," that is, with
Heraclitus' ever living fire, the parallel between it and the vacuum
becomes an intriguingly close one; the quantum vacuum, by eternally
mediating all physical processes, manages to "steer the Universe." It is
also interesting that Greek mythology tells us that Time owes its
existence to Chaos through the fact that the god, Chaos, is named as the
father of Kronos. Moreover, the Greek word, arche, which means
source or origin in ancient Greek, is translated into Latin as principium,
i.e., ordering principle. The idea behind this particular translation of
arche into principium is the same one expressed by Leibniz, when he
states in his Monadology that, "the conditions sufficient to create the
world are necessary at every succeeding moment of time in order to
sustain the world's existence." We now arrive at the notion of first
cause, not in the usual sense of first in a temporal sequence, but in the at
once broader and subtler sense of most fundamental or substantive.
“Given such a reality, the author concludes that human mentality
evolved in bottom-up fashion, with mind-associated neuronal systems
not so much creating mind as organizing a pre-existing propensity for
awareness into useful, functional awareness, and providing for its
modulation by useful information,” c.f., Implications of a
Fundamental Consciousness (1998) by Copthorne MacDonald.

Since it is the pattern of virtual particle emission and absorption which
every real particle continually undergoes which determines the mass of
the particle, it follows that real particle masses are determined through
the particular manner in which real particles exchange energy with the
fluctuating quantum vacuum field; consequently, alterations in the
density of the vacuum field energy will affect the masses of particles
occupying this vacuum. We might expect that this relationship between
mass-energy and vacuum-energy is symmetrical in nature because the
interactions mediating the continual exchange of energy between matter
and vacuum are themselves reversible interactions.
November 1996

The quantum vacuum energy fluctuations collectively, as we
have seen, may be understood as the first cause of the world in the more
fundamental sense of sustainer of all of the structures ultimately deriving
from it in that the quantum vacuum is the originator of temporality.
Matter does not possess a genuine substantial existence since its energy
is forever being replenished by the vacuum fluctuations continually
interacting with it, much in the same manner as a particular spot in a
river is continually replenished with new waters so that, as Heraclitus
says, one cannot step twice into the same place within it. This two-way
causal, symmetrical relationship between mass energy and vacuum
energy within quantum field theory reminds us of a similar relationship
between mass and space-time curvature within the theory of general
relativity: the presence of mass within a given region of spacetime
produces an additional curvature in this spacetime; also, an increase in
the curvature of a particular region of spacetime produces an increase in
the mass of particles or material bodies already occupying this region.
Since spatio-temporal variations in the energy density of the vacuum
energy field are correlated with variations in spacetime curvature, we
might suppose that some sort of conformal mapping relationship obtains
between the ratio of real particle to virtual particle energy densities and
the degree of mutual inclination of the time and space axes (of the
Minkowski light cone) to one another. This relationship is also
suggested by the fact that real particles are virtual particles which have

been promoted to the level of real existence through the absorption of
energy; particles are excitations of the vacuum state which is itself a
reservoir or sea of virtual particles. Also, through the application of
Mach's formula for the speed of sound to this vacuum energy reservoir,
we see that such a conformal mapping relationship between Einsteinian
space-time curvature and spatial-temporal variations in the zero-point
energy of the vacuum (or, alternatively, its energy density) must involve
mappings between the hypersolid angle swept out by the light line in
four-dimensional (Minkowski) spacetime, and the energy density (or
pressure) of the vacuum.
We must distinguish between evolution's creative and its critical
faculties. Adaptation is not the purpose of evolution; it is the trial and
error critical process which evolution is subjected to by the contingent
environmental conditions within which it finds itself operating.
Darwinian natural selection is merely a critical process; it is not in any
way a creative process, in and of itself. Natural selection merely
structures, channels or filters the creative suggestions made to it; it plays
no role whatever in the fundamental dynamism driving biological
evolution; natural selection is merely the set of boundary conditions
within which the dynamism of evolution functions, perhaps in the sense
of Bergson's elan vital.
Here again, we have an example of how
boundary and initial conditions are essentially separable from the
dynamisms which they circumscribe. Similar remarks were made regarding
the process of technological advancement, which was viewed as a
progression in sophistication in imposing initial and boundary conditions
upon the invariant and unitary dynamism of Nature. We know that
natural selection is not able to operate unless self-reproducing
information-bearing structures are already in existence; moreover,
natural selection has little opportunity to mold these self-reproducing
structures into more complex forms unless it can profit from the creative
suggestions made to it through the operation of random mutations,
themselves useless if they cannot be passed on to future offspring. So it
is also necessary that something analogous to a genetic code be
contained within these self-reproducing structures, themselves the

expression of the information contained within this genetic code.
The problem, then, with Darwinism, or its modern derivative, neo-Darwinism, is that a great deal of evolutionary development must have
already occurred, in the form of so-called chemical evolution prior to the
appearance of the first self-reproducing, information-bearing,
information expressing structures, before the distinctly Darwinian
process of natural selection of random mutations is permitted to begin.
So the creative dynamism, spoken of previously, is to be identified with
that dynamism lying behind the prebiotic process of chemical evolution,
a process which does not halt its operation once the Darwinian process
of evolution commences, but which continues hand in hand with natural
selection, and, moreover, maintaining its central role as the motivating
dynamism in the evolution of progressively more complex forms of life.
The subjective "I am," which is just the individual's personal
consciousness, dependent upon the objective world outside itself, is not
to be confused with the objective "I AM," (see Nisargadatta Maharaj's
book, I AM THAT), which is the one and unique self-existent
Consciousness which is the source of all individual subjective
consciousness.
July 2013

Quantum decoherence prevents the collapse of the causal chain
and thus allows the operation of temporality. In a subtly analogous way,
quantum decoherence also prevents the reduction of the higher pleasures
of life such as spiritual or intellectual pleasures to mere utilitarian
calculated sums of small quantities of direct stimulation of the hominid
ape's brain's pleasure center. In short, quantum decoherence introduces
just enough of just the right compartmentalization of experience in
order to prevent the annihilation of spiritual potentiality which is borne
of the totalization of experience as reprocessed stimulus-response.
Shannon information with its notion of information being equivalent to a
reduction in uncertainty does not take into account the notion of the
context sensitivity of information.
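The Shannon picture being criticized here, information as reduction in uncertainty, can be put in a few lines. A minimal sketch (the coin example is an illustration, not from the text):

```python
import math

def entropy(probs):
    # Shannon entropy in bits: expected surprise over a probability distribution
    return -sum(p * math.log2(p) for p in probs if p > 0)

before = entropy([0.5, 0.5])   # fair coin: 1 bit of uncertainty
after = entropy([1.0])         # outcome known: 0 bits
information_gained = before - after   # "information = uncertainty removed"
```

Note that nothing in this calculation depends on what the outcome means to the recipient, which is precisely the context-insensitivity the note objects to.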
Reductionism dissolves into the inherent fuzziness of space-time

dictated by the Heisenberg uncertainty principle.
It is clear to me
intuitively that this fuzziness is dynamic and it forms the substrate of the
very operation of mind. Consciousness as such may well be more
fundamental than this fuzziness, which may be a presentation of the
natural, indefatigable restlessness of this consciousness: what eastern
mystics term 'the play of Lila'. In this case the infinite regress is not a
reductio ad absurdum. It is rather the inevitable nature of being.
All : Russell Clark
Date : 07/09/2013 08:55:33 A
The magnitude of gravitational field intensity is *ceteris paribus*
correlated with the strength of gravitational time dilation (weak field
approximation), but is also thought to drive, in part at least, quantum
decoherence, itself a temporal process. Quantum decoherence appears
the only temporal process currently known to science whose rate is not
subject to gravitational time dilation in the same uniform manner as
indeed are all other known temporal (physical) processes. This suggests
that the mechanism underlying quantum decoherence may be among the
building blocks of the mechanism of gravitation. The discovery of any
remaining building blocks of this mechanism may have to await the
identification of further nonuniformities in the response of specific
physical processes to the effects of gravitational time dilation. Now if
*per impossibile* lol some form of dualism turned out to be the case,
then we might anticipate some new form of deep space sickness in the
form of a kind of insidious and cumulative impairment of normal
mental functioning experienced by astronauts during long voyages in
zero gee or artificial gravity, say via nonuniform alteration in tubulin
dimer decoherence in relation to the temporal evolution of brain
quantum coherent states and this on account of the twofold differential
action of gravitational time dilation upon quantum brain coherent and
decoherent processes, heretofore unseen by the processes of natural
selection which originally fashioned an astronaut's hominid ape's brain.
Illustra Media metamorphosis intelligent design YouTube video.
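The weak-field correlation between field intensity and time dilation referred to above can be made concrete. A minimal sketch (the Earth values are illustrative assumptions, not data from the text):

```python
# Weak-field gravitational time dilation: a clock at Newtonian potential Phi
# runs slow relative to a distant clock by a factor of roughly (1 + Phi/c^2),
# with Phi negative inside a gravity well.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M = 5.972e24         # mass of Earth, kg (illustrative)
r = 6.371e6          # Earth's mean radius, m (illustrative)

phi = -G * M / r                   # Newtonian potential at the surface
rate_factor = 1.0 + phi / c**2     # dtau/dt for the surface clock (< 1)

fractional_slowing = -phi / c**2   # roughly 7e-10: a parts-per-billion effect
```

The effect is uniform across ordinary physical processes; the note's speculation is precisely that decoherence rates might deviate from this uniformity.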

Constraints problem in engineering.
Punctuated Equilibrium
Dover, Pennsylvania 2005 school board hearing
“I also realised, after I had finished the book, that I had stolen its central idea—
of mind parasites—from a science fiction story I once read. In this story, the

first man to travel to Mars suddenly has an experience of some strange creature
wrenching itself out of his mind, and hurling itself back screaming towards the
earth, which is its home. Unfortunately, this story ended, in the rather ‘smart’
manner so characteristic of pulp science fiction, with the man landing on Mars,
and immediately being possessed again by the same parasites”, c.f., The Mind
Parasites (1967).
October 2013

(to Lime Cat) Because of what is clearly some kind of quantum resonant tuning between information
laden fundamental quantum fields (quantum vacua) and the brain's microtubule tubulin dimer network, which
provides the meaning for neural events and processes (allowing humans the gift or curse of self-consciousness) and
because it is far more likely that this "tuning" is initiated from the side of the fundamental quantum fields (which are
eternal) rather than from the side of haphazardly evolving brain circuits (themselves merely a transitory
evolutionary development), it is also far more likely that we ourselves are the mind parasites, who have
temporarily and opportunistically taken up residence within the primitive hominid ape's brain (and in so doing,
changing the direction of natural selective pressures affecting hominid brain evolution), than that we are evolved
hominids who have unfortunately been infected by said "mind parasites". Another theory is that "consciousness as
such" is to be equated with the eternal, fundamental quantum field, while "self-consciousness" is merely due to the
transitory coherent resonance of the individual human brain's microtubule tubulin dimer network with an extremely
tiny compatible sub-spectrum of the total quantum vacuum electromagnetic field. Of course, the total
fundamental quantum field is modified via quantum entanglements induced over many thousands of years from the
collective conscious neural activity of billions of hominid brains. And it is a well established result of quantum
mechanics that quantum information (such as is encoded in quantum entanglement) is a conserved quantity. So this
underlying fundamental quantum field possesses the potential to act as a kind of "akashic record" in which the
experience of all conscious entities can be mashed together and reprocessed into broader, transpersonal meanings, as
though occurring within a dynamic matrix of stored information, which can be accessed by something altogether
other than human. So what do you think about the possibility that we are the mind parasites? We give humans the
gift of consciousness and they give us the gift of embodied experience in return.

October 2013 (to Naomi Jakins)

Colin Wilson didn't come up with the idea for the book entirely out of a vacuum. Well, yes,
he was influenced by Forbidden Planet and that Martian Science Fiction Story, but also, he actually did notice this
strange upsurge in the number of "Outsiders" and in the number of suicides. He thought those two statistics were
connected, which also contributed to his conceiving of the storyline. Also the simultaneous arising of the Romantic
Movement and the Industrial revolution.

Would it be coincidental if the rarer mutations represented the addition
of larger amounts of information?
Or evidence of gracious
Mutation rate, population size, wait times, generation, population
genetics, compute, “not enough time”
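The back-of-the-envelope computation these keywords gesture at ("mutation rate, population size, wait times, compute") might look like the following. All numbers are illustrative assumptions, not data from the text:

```python
# Naive expected wait, in generations, for a *specific* point mutation to
# first appear: if each of N individuals has probability mu of acquiring
# that mutation per generation, the expected wait is roughly 1 / (N * mu).

mu = 1e-8        # per-site, per-generation mutation rate (illustrative)
N = 1e4          # effective population size (illustrative)
gen_years = 20   # years per generation (illustrative, hominid-like)

wait_generations = 1.0 / (N * mu)          # expected generations to first arrival
wait_years = wait_generations * gen_years  # same wait expressed in years
```

With these toy numbers the wait is on the order of 10^4 generations for a single specified mutation, before any fixation or multi-mutation requirements are considered; this is the shape of the "not enough time" argument the note flags for scrutiny.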

New theories of evolution. 'Same problem'
At no time in the past did a blank sugar phosphate molecule get filled in
with nucleotide bases on both sides simultaneously.
Lyell - structures in the past built by causal processes currently in operation.
Harmonics of decoherence times, Fourier analysis and multidimensional
time. Failure of time scale reductionism and the transcendence of
causal determinism.
Ontological argument reevaluated in light of the transcendent causality
of the big bang.
Does the universe, with time reversed, really play back to a single point
if indeed it is not gravitationally closed?
Origin(s) of Organismal Form
Decoherence forces the fundamental units of physical reality to be
context-sensitive. The metaphysical implications of this are potentially profound.
Either the fundamental units of physical reality are context-sensitive or
they do not exist. If they are not context-sensitive, then it is as though
there is no context. So meaning could never come into being for
systems of context-insensitive components, regardless of the abstract
complexity of the system. This reminds me of being in the absence of
universal mind.

Such is the case if the unit of being is information. Mind as the author
of information rather than its derivative becomes automatically
'Libet et al. extended their experiments by stimulating a 'relay nucleus' in
the thalamus that intercepts signals from the senses before they reach the
somatosensory cortex. It was found that when this nucleus was
stimulated for 0.5 seconds the subjects reported that the stimulus
occurred 0.2 seconds after it had begun. When the nucleus was
stimulated for less than 0.5 seconds the subjects did not report any
sensation. This supports the concept of a 0.5 second delay whilst the
cortex puts a stimulus in context before it is experienced.
These experiments show that our experience is an output of cortical
processing rather than the processing itself.'
More recently fMRI and direct electrode recording have borne out the
readiness potential experiments. Soon et al. (2008) allowed subjects to
decide to press either a left or right button. They used fMRI to show that
there was spatially organized activity in the polar frontal cortex and
parietal cortex (from precuneus into posterior cingulate cortex) that
predicted the conscious left/right decision and preceded it by about
seven seconds.
In the absence of a concept of consciousness there must be a degeneracy
between the seemingly distinct cases of consciousness being a one or
being a many. Also a degeneracy between being reincarnated and
having only one incarnation.
Functional integration, constraints problem.
Preprogrammed adaptive capability of cells, James Shapiro, natural
genetic engineering, molecular biologists, University of Chicago.
Illustra Media metamorphosis intelligent design YouTube video.

The functional essence of the concept of substance is topology.
Man's empirical science in its entirety has thus far been nothing more
than the poking and prodding of a relational database at its front end,
while man's theoretical science is constituted by little more than clever
guesswork at the general features of the database query language, all the
while unaware of the backend to this database and hence of its designer.
Man must one day come to a knowledge of himself as the database's
programmer. My hope is that he shall not take this eventual success as
indication that he is also the designer.
Who graciously provided the infrastructure which has enabled all of this
impressive growth? We should ask ourselves this question in every case.
It is the quite common type of fantasy, frequently indulged in by proud
and vain human mortals, to imagine oneself in some glorious situation
where one is either vindicated, suddenly elevated in greatness (or
suddenly shown to have been great) or rendered, usually by one's own
efforts, victorious or triumphant over some powerful adversity or
persecution, or moreover, to receive praise and adulation from the many
as one speaks, performs or otherwise acts in a manner which
compellingly displays one's authority. We human beings indulge in this
kind of fantasization a great deal when we are children, perhaps more so
when developing adolescents, while some of us, upon becoming
"adults," tend as we approach middle age to set aside, eventually
completely, such obvious puerile self-glorifications of the imagination.
Some of us, on the other hand, never seem to put such self-gloried
imaginings behind us, despite advancing age and maturity. Everyone has
either heard of or reflected upon the phenomenon of selfishness
exhibited in the strong tendency we all have of seeking out (in secret, of
course) from a pack of family photographs, those particular photos in
which we ourselves appear. Eventually, we become aware of the
implications of this kind of behavior, which shames us: if everyone

were to be this self-centered, then there would be no one left to care for
me as much as (or more than) I care for myself and I would be a selfish
person alone in a universe of selfish persons. Of course, part of one's
motivation for thinking in this way, perhaps unbeknownst to oneself, is
the Kantian notion of the Golden Rule; to wit, that I must act in such a
manner that I will that my action become a universal moral law. But
what of the phenomenon where I imagine myself being some other
person when I am in the midst of some profound or deeply moving
aesthetic or intellectual experience; I imagine what this experience
would be like for this other person and somehow the intensity and
wonder of the experience is amplified for me, myself, through this
psychological projection. Part of the augmentation of the aesthetic
experience for me is the sense of personal, if partly disembodied glory
which redounds to my sense of identity because it is I who am leading
this person, in my mind's eye, to the unaccustomed though fuller
appreciation of this experience. Partly again, the experience is, for me,
augmented because I borrow the other's innocence, using it as a foil
against which the experience may be rediscovered by me in all its
aboriginal wonder. Moreover, I act as this person's spiritual mentor; I
help this person penetrate a mystery which I have long ago discovered
for myself, and if this other person is ultimately identified with my own
self; this is implied because all of these projections occur within my
mind's eye, then I seek to view myself as the father of my own future
spiritual and intellectual development.
But on a more basic human level, I am imagining the sharing of a
profound idea or experience with another person in a way which is
seldom, if ever demonstrated in actual social intercourse with my fellow
human beings; certainly it is love which motivates this peculiar
psychological projection - the kind of love which does not distinguish
self from other.
I have no acquaintance with either physical objects so-called nor with
any phenomena taking place within the domain of other minds; in fact, I
have had no acquaintance with any phenomena whatever other than

those pertaining to my own psychological states, states which are
presumably closely related to the concerted electrochemical activity of
my billions of cortical neurons. Consequently, I am forced to accept the
existence of physical objects and other minds purely on a faith borne of
appearances, which might be easily explained numerous other ways than
those, which seem to be dictated by what is called "common sense." I
wish to remark here that if an omniscient (not to mention omnipotent)
God fails to exist who is, by the way, the only possible vouchsafe for the
existence of an objective world containing other consciousnesses than
my own, then there is absolutely nothing standing in the way of my
drawing the less than comforting conclusion that I alone exist, i.e.,
Solipsism is the metaphysical truth, and moreover, there is absolutely
nothing standing in the way of my concluding that I, myself, am
ultimately responsible for all the phenomena which I have experienced
or ever will experience and that God does indeed exist and that I am
Him. But of course, I wholeheartedly reject such a preposterous
conclusion: solipsism is a thesis which I must reject out of hand, and
with it the proposition that God does not exist. What I have just stated
above is by no means a rational proof of the existence of God. But it is
an argument which reflects the inner logic of the whole of my being in
its ineluctable affirmation of His existence from which I have not the
strength to depart. The very structure of language contains within itself
the subtle presupposition that all human beings possess the belief,
whether they consciously realize it or not, that the sum total of
possible knowledge is integral. But this hypothesis about the inherently
integral nature of knowledge implies, in turn, the existence of a unitary
repository for this sum of knowledge – one that is by its very nature
dynamic (because knowledge cannot be “static”), which is to say, a
universal mind or intellect. It occurs to me that all true mysteries are intimately
connected and intertwined with one another; to find the solution to only
one of these mysteries would mean having found the answer to all, since
in the case of solving either it was necessary to trace back to the same
common origin.
Just listing some examples may succeed in illustrating to one's deeper

intuition that this must be true: a few of these mysteries are that of
existence itself, the origin of consciousness, freedom of the will, the
mystery of Time and along with it that of eternity, the mystery of
immortality and that of divinity. A bright young child may agree with
this observation, remarking, "well, God exists and He knows the answer
to all things." But it really does seem that the contemplation of any one
of these mysteries inevitably leads to the contemplation of all the others
as well as some which I haven't mentioned, and one may ask, “why
might this be so?" Most individuals are totally incapable of what is
called lucid dreaming, dreaming where the dreamer remains aware that it
is he who is in control of all the action of the dream. Freud's doctrine of
the conservation of psychic energy suggests that the control of the dream
action is mediated by the domain of the psyche lying between full
consciousness and the level of consciousness at which the dreamer's
mind operates so that the action of the dream must dissolve upon the
dreamer attaining his full consciousness because the intermediary
domain of consciousness which controls the dream is reduced to nil or
"squeezed out." An analogy will serve here. A river may not rise to an
altitude greater than that of its source, at which level its kinetic energy is
completely transmuted into potential energy. It might therefore be
thought that only those individuals who experience repression of their
normal full consciousness would be capable of "lucid dreaming" as the
control of the dream action would be mediated by the consciousness
within the domain between the individual's repressed consciousness and
his normal potentially full consciousness; this is just a slightly more
abstract way of saying that the psychic energy which is usually
unavailable for utilization by the conscious mind is freed up during the
unconsciousness of sleep and rendered available to the unconscious for
use in mediating the phantasmagorical action of the various dream
sequences, themselves, according to Freud, the acting out of wish-
fulfillments. The upshot of all this is that the presence of lucid
dreaming is a possible indication that the individual experiencing it is
not reaching his normal psychic potential for full wakeful consciousness
and that the reason for this is a deficit of available psychic energy due to
the presence of myriad emotional conflicts lying repressed within his

unconscious mind. Psychic energy is bound up for use in maintaining a
compartmentalization of early experiences repressed from conscious
recollection. There is no reason during the unconsciousness of sleep for
this psychic energy to continue to be diverted to man the defenses
against the recollection of early childhood experiences by the now inert,
anaesthetized ego, and so this psychic energy suddenly becomes
available during deep sleep.
November 1996

I have frequently had the experience of a renewed fixation
upon some person, usually a former love-interest, lasting from hours to
perhaps an entire 24-hour period, whenever I had dreamed about the
person on the previous night. Perhaps the remobilization of repressive
psychic defenses takes a characteristic time of several hours after these
defenses have been relaxed during dreaming.
Repressed wish-fulfillment fantasies may sometimes manifest themselves as "false
memories." The only way that one knows these memories to be false is
through their failure to cohere with other well-established parts of one's
biography, parts which one has memories of having previously
recollected at numerous different times throughout one's past. One does
not possess memories of having recollected the false memories at any
time in the past so that they are not "woven into" one's biographic
database, as it were. These false memories do have, however, the
compelling tinge of having actually happened in the form of a very
idiosyncratic feeling associated with them which is perhaps merely the
more or less requisite vividness of a legitimate recollection. In order for
the categories of Being and non-Being to be of an absolute nature, the
indeterminate or infinite must possess a structure so that nothing
contained within it may possess a definition as such.
But this is
precisely what the indeterminate does not admit. Reality is both
bottomless and without a determinate apex. Reality is, in other words,
boundless and this is what gives it its fundamental, which is to say,
irreducible temporality. October 2014 “Whitehead understands philosophy as
an infinitely open-ended adventure not only (or primarily) because we are
unable to grasp everything at all, but because the world is inexhaustible”, cf.
On In/Finite Becoming: Philosophic Considerations on Whitehead’s
Many Multiple Worlds, Claremont Cosmology Conference, October
2006, by Roland Faber.
The open-endedness of philosophy goes hand in hand with the
inexhaustible nature of, well, nature. The reason for this is readily
enough seen in the necessary interdependence of theory as extension of
thought and technology, specifically in the form of instruments for
detecting and measuring empirical data, as extension of humankind’s
perceptual apparatus, which drives an unending dialectic of advancing
instrumentation technology expanding the domain of detectable data
spurring the development of new and ever subtler theories, which
in turn pushes the evolution of more advanced instrumentation

April 1997

This is Eternity bringing forth Time. Therefore, all categorical
distinctions are transcended by Reality, including that of Being and
non-Being. Existence is then a genuine predicate, but not an absolute one.
Given any domain there is, indeed, a most perfect being, but perfection
is now merely relative and there is no Absolute Perfection. There is,
however, that which transcends any possible relative perfection and this
is the eternally pre-existent, boundless Indeterminate in which unlimited
possibilities exist in potentia.
Quantum Mechanics verifies the old Scholastic metaphysical
understanding of all change or Becoming as occurring due to a deficit of
Being: all real physical processes are mediated via virtual processes;
these virtual processes possess by definition an energy smaller than the
energy uncertainty of the quantum mechanical system which is
comprised by the real processes mediated by them. The total energy
uncertainty of a quantum mechanical system is, by the way, relative to
the reference frame within which the system is "viewed," and therefore
differences in the vacuum's zero-point energy reflect changes in our
frame of motion - in the sense provided by relativity. More specifically,

Lorentz contractions occur not only to the eigenvalues of length, but also
to the quantum uncertainties of length. Similar statements may be made
with respect to momentum, energy, time, etc. The unity of all opposites
cannot itself possess an opposite.
April 1997

Duality arises out of that which is itself Nondual. The basis of
identity which transcends abstract description is that of continuity,
substantial continuity. This is where substance is the concrete medium
from which all forms are generated through abstraction which is
limitation and negation. The modern version of the Being versus
Becoming dichotomy of the pre-Socratic philosophers is that of Space
and Time of modern physics. Whereas Being and Becoming were
thought to be disjoint categories by the Greeks, in modern times, the
theory of relativity has shown space and time to have only a separate
existence as abstractions dependent upon the frame of reference of the
observer within objective spacetime. If through Descartes' categories
of Res Cogitans vs. Res Extensa we relate space to matter and time to
mind, then relativity perhaps points to a unity which transcends this
distinction of mind vs. matter. Newtonian mechanics effectively
"spatialized" time. Physical laws are simply "descriptions of," as
Russell reminds us, “how Nature, in fact, behaves;" Nature does not
"obey" any such "laws," nor is she "governed" by them. This would be
to explain a process in terms of the very derivatives or by-products
which necessarily presuppose the process allegedly "explained" by them.
An example of this fallacy is saying that natural selection "explains" or
"causes" evolution. Natural Selection cannot even begin to operate until
a genome, or some such unit of heredity, is already in existence. The
question concerning the origins of life and that of Darwinian evolution
are seemingly quite distinct. This is a great and ever growing problem
for evolution theory as the science of self-organizing complexity
continues to develop. We speak always of the information contained
within the genetic code as being "expressed," either in terms of the
synthesis of particular proteins or as control of the expression of other
genes. Yet we never seem to think of the fact that language is two-
sided; information is not only expressible, that is, decodable, but it is

also encodable. Without the interpretive function of mind, information
is never expressed; data are merely transduced from one form within one
medium into another form in another. Can the function of expression be
reversible? Transduction seems to fall short of the creativity of
expression. The function of expression is not reversible without the
context for the original encoding, examples of which are the creation of
art, music, or poetry. Would we be satisfied with a concept of
information as merely metadata?
Four “orthogonal” or disjoint domains are implied by the
oppositions: fermions, waves, bosons, particles because there are two
fundamental dualities in quantum physics, symmetric vs. antisymmetric
wavefunctions, i.e., fermions vs. bosons and two fundamentally
complementary states of the vacuum, particle vs. wave/field.
August 2014

January 1998

There are myriad medically documented cases of persons in
states of profound hypothermia having undergone cardiac arrest and
existing in a state of clinical death for periods of up to several hours
who, upon being gradually thawed and heated in an emergency
operating room, revived completely and without exhibiting any sign of
brain trauma or loss of other healthy physiological function. Open heart
surgery is now performed in the former Soviet Union upon patients
whose core body temperatures have been carefully lowered to just above
freezing to buy precious additional hours of surgical time in cases where
particularly complex and life-threatening procedures are required. Now
assuming that the persons who revive are the selfsame individuals whose
bodies had been in a state of clinical death, we may conclude that,
whatever is causally necessary to provide the underlying continuity of
personal identity which remains preserved throughout, must not depend
upon the chemical processes which are significantly impaired or halted
as a result of a near freezing core body temperature. We may deduce
from this that, although one's individual consciousness is structured and
shaped by the near infinite number of electrochemical reactions taking
place within the brain, such electrochemical processes are not,

themselves, responsible for the fact of one's individual consciousness
existing. Perhaps this line of reasoning appears to beg the question,
since the emphasis placed on continuity here presupposes that
consciousness is some kind of substance.
April 1997

Darwinian evolutionary theory is always opposed to the special
creation theory, but Darwinian theory does not really stand in complete
contradiction to the Biblical creation account because it does not admit
flow of information into the genome, but only outward flow in the form
of the genome's expression as phenotype. But in any informational
system, elements do not contain information statically, but within an
interpretive context, which implies that the genome represents a nexus
for the exchange of information between two or more systems, and so
the Darwinian doctrine of one-way flow of genetic information renders
the theory inconsistent and prevents it from being in true opposition to
the special creation theory.


Encoding is over-determination.
Decoding is under-determination. Hence, in a deterministic system, in
which one state of the system simply determines, and neither
over-determines nor under-determines, its succeeding state - in such a
system neither decoding nor
encoding is taking place. Information is not a relevant quantity where
strictly deterministic systems are concerned.
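The claim above — that information is not a relevant quantity for a strictly deterministic system — can be illustrated with Shannon's conditional entropy. This is my own framing of the point, not the author's: if the successor state is a function of the current state, then observing the successor given the current state yields no new information, i.e. H(next | current) = 0, whereas a stochastic successor has positive conditional entropy. The helper function and example data below are illustrative constructions.

```python
# Sketch: conditional entropy H(Y|X) distinguishes deterministic from
# stochastic state-succession. In a deterministic system H(next|current)
# is exactly zero; a noisy successor rule has H(next|current) > 0.
from collections import Counter
from math import log2

def conditional_entropy(pairs):
    """H(Y|X) in bits, estimated from a list of (x, y) observations."""
    n = len(pairs)
    joint = Counter(pairs)                    # counts of (x, y)
    marginal_x = Counter(x for x, _ in pairs) # counts of x alone
    h = 0.0
    for (x, y), c in joint.items():
        p_xy = c / n                 # empirical joint probability
        p_y_given_x = c / marginal_x[x]
        h -= p_xy * log2(p_y_given_x)
    return h

# Deterministic successor rule: y = (x + 1) mod 4.
det_pairs = [(x, (x + 1) % 4) for x in [0, 1, 2, 3] * 10]
print(conditional_entropy(det_pairs))  # 0.0 -- no information conveyed

# Stochastic successor: given x, y is 0 or 1 with equal frequency.
sto_pairs = [(0, 0), (0, 1), (1, 0), (1, 1)] * 10
print(conditional_entropy(sto_pairs))  # 1.0 bit per observation
```

On this toy reading, encoding and decoding only matter where the succession relation over- or under-determines the next state, which is exactly the text's point.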
This precise "deficit of Being" may only be defined in terms of the
complete description of the total process; this description, as already
noted, exists only for itself and cannot be derived from without as an
infinite regress of description stands in our way here. In this connection
we may state the fundamental principle that "the mediator may not be
defined in terms of the total set of processes which it mediates." This
"deficit of Being" of Scholastic philosophy is exactly analogous to the
energy uncertainty of quantum mechanics. If Hegel is correct in saying
that positive entities exist only by virtue of the accumulation of various
negations of relations holding between the Absolute and itself, then each
entity must more or less clearly and distinctly exhibit the unified totality

of Reality in microcosm, the developmental program of which is
contained in the greater quantity of information which is determined
when a number of these entities come into conjunction with one another.
For instance, the molecular bonding of atoms, whether it be ionic,
covalent, or by the weaker Van der Waal's force, cannot be induced to
occur within a previously specified period of time simply through the
manipulation of the atoms "from outside;" one may only place the atoms
in the approximate position whereupon the action of bond formation
ensues through the spontaneous action of the vacuum electromagnetic
field. In fact, the quantum mechanical method utilized in physical or
quantum chemistry to determine the various bonding probabilities has
nothing whatever to say about which precise configuration or shape, out
of the many possible configurations, will be formed as a result of the
spontaneous bonding process.
This fact is readily seen when one attempts to view the spontaneous
conformation of a denatured macromolecule such as a protein as the
result of the amino acids composing the molecule trying myriad
possible different conformations by trial and error until the energetically
favored (most stable) conformation is found. In even relatively small
macromolecules the total number of possible conformations is so
staggeringly large that even if the components try a different
configuration every 10⁻¹³ seconds (a very liberal assumption) the time
required to hit on the "correct" conformation by chance would take
many orders of magnitude longer than the present age of the observable
universe! The wavefunction which describes the total system of
interacting atoms entering into the bonding process is approximated as a
product of the individual wavefunctions describing the approximate
quantum state of the atoms; it is only the complete wavefunction which
describes these atoms as being inextricably entangled with the whole
vacuum-energy/ mass-energy matrix which contains the information
about the shape of the resultant molecule, among other things. No
individual quantum mechanical wavefunction is truly normalizable,
although a large ensemble of such wavefunctions will approach
normalizability in the limit of infinite ensembles. There will always

appear to be coupling between the eigenstates of a wavefunction which
is, itself, merely an approximately exact wavefunction; in reality, there is
only one universal wavefunction, as its normalizability requires.
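The conformational-search arithmetic a few sentences back (Levinthal's paradox) is easy to check directly. The specific numbers below are my illustrative assumptions, not figures from the text: a chain of 100 residues, 3 accessible conformations per residue, one trial every 10⁻¹³ seconds.

```python
# Back-of-the-envelope check of the trial-and-error folding argument.
# Assumed inputs (not from the text): 100 residues, 3 conformations
# per residue, one trial conformation every 1e-13 seconds.
residues = 100
conformations_per_residue = 3
trial_time_s = 1e-13            # time per trial conformation
age_of_universe_s = 4.35e17     # ~13.8 billion years in seconds

total_conformations = conformations_per_residue ** residues
search_time_s = total_conformations * trial_time_s
ratio = search_time_s / age_of_universe_s

print(f"total conformations: {total_conformations:.2e}")
print(f"exhaustive search time: {search_time_s:.2e} s")
print(f"ratio to age of universe: {ratio:.2e}")
```

Even with these generous assumptions the exhaustive search time exceeds the age of the observable universe by many orders of magnitude, which is the text's "staggeringly large" claim made quantitative.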
This process is very much akin to the decrease of fuzziness in a
holographic image which occurs on two or more pieces of a shattered
holographic emulsion when the various pieces are refitted together. On
this view all development of material systems from the simplest
subatomic constituents to the most complex living organisms, consists in
the negation of negation, engendered by the conjunctions of these
constituents which occurs by chance outside the quantum mechanical
positional (or length) uncertainty of the constituents, but which is
actively directed once they interpenetrate within this uncertainty, where
they enter into the effective range of their innate quantum-uncertainty
mediated tendencies toward self-organization, tendencies which are a
manifestation of the partially distinct "image" of the Absolute, i.e., the
universal wavefunction, which each constituent contains in microcosm
within itself. Because creation is conceived under the aspect of the
negation of the negation of contextual relatedness within the Absolute,
this negation which is negated being understood as the reduction of
information resulting from the partial expression of the universal
wavefunction as a product of partial wavefunctions corresponding to
relatively uncertain subatomic constituents, the problem of "Why is
there something rather than nothing?," is no longer rendered insoluble
by the requirement of explaining creation ex nihilo, but it is simply
recognized that there has to have always been something or other, so
what better candidate for this something than that unified plenum which
is its own description, which is per Aristotle its own eternal
contemplation; that entity which Hegel calls the Absolute, and
which we have styled "the universal wavefunction." “In the same way,
the Buddhists express the same idea when they call the ultimate reality
sunnyata for emptiness or void and affirm that it is a living void which
gives birth to all forms in the phenomenological world.” (Capra)


Dr. Scholem, in his Major Trends in Jewish Mysticism, tells us that in any
case, where there is a transition from one state or configuration to
another that, "the void is spanned for an instant." Madame Blavatsky
refers to evolution as a "spiritual Opus in reverse," meaning that the
world arose through a process of involution (a reduction of the infinite to
the finite), but containing a vague recollection of the whole from which
it was originally abstracted and which guides its future development.
This future development is constituted by the return of the Absolute
back unto itself, directed from above and hence is a recapitulation of the
timeless and transcendent within the realm of the temporal and
immanent. This is to understand evolution as the negation of negation.
To say that Reality cannot be given a complete description on account of
the inevitable pitfall of infinite regress is merely to say that, if this
description does not already exist, then it can in no way be constructed;
there is nothing in what has just been said to prevent a complete
description of Reality, which has always existed. In fact, it is the
lack of a complete description which is responsible for the existence of
temporality, cf. fluctuations necessarily associated with a perturbation
theory description of the quantum field. This observation is very much
in the same spirit as the mathematician Kurt Gödel's discovery that the
notion of truth is stronger than the notion of provability; the fact that a
theorem expressible within the symbolic language of a particular logico-
deductive mathematical system may not be constructed from the axioms
of its system utilizing the rules of inference of this system does not
prevent this theorem from "existing" in the sense of its being true, i.e.,
“subsisting”. September 2011 It is actually the interaction of the open-ended
sets of suppressed details (that is, those concrete details suppressed that
originally constituted the categories of abstract thought) which does the
“heavy lifting” on behalf of creative intelligence and not the categories
themselves. The categories, when wielded (just think of tools here),
merely signal to the underlying creative ground which direction it is to
apply the genius of colluding devilish detail to which problem, as well as
conveniently indexing the end products of lucubration and so enabling
their inter-subjective communication. On this view the sociolinguistic
construct of the self is merely the master abstract category, which is but

one of the many categories of thought, and which serves the real master
that is the will.

But one might ask what is the meaning of mathematical theorems which
are beyond the comprehension of any finite mind and which are not true
by construction from a possible collection of axioms, but true in some
more fundamental sense.
Wittgenstein tells us that we may not use
substantives in a manner which attempts to extend the reference of these
terms beyond the scope of all possible experience without falling into
either meaninglessness or absurdity. Therefore, when we ask the
question, "Does God exist?" the most we can possibly mean by this
question is, "Does God exist within our space-time continuum?" We
cannot ask whether God exists in Reality (Wirklichkeit) since Reality
cannot possess a complete description without this description having
always existed and without admitting the existence of such a description,
one necessarily beyond all possible attempts to construct it, and which
may only exist from a single point of view or perspective. So it seems
that one may not ask whether God exists in Reality (in the sense of
Wirklichkeit) without presupposing an answer in the affirmative,
because the admission of the existence of Ultimate Reality is one and
the same with the admission of God's existence. It is meaningful,
however, to ask this same question in the sense of Realität. That which
has always existed, and its complete description, which has always
existed must be a part of this same eternally pre-existent Reality. It is
obvious that the description and the Reality must be essentially one and
the same entity which is, who is, God. The finite may not be complete
without conditions being specified, and these specified conditions may
not obtain except within the purview of a larger context, containing the
particular finite thing.
Only those independently occurring genetic mutations may occur at
lower levels in the gene regulatory hierarchy, i.e., to structural genes,
which become integrated together within a member of an evolving
species which might have possessed a common source in the form of a
single mutation to a higher-order regulatory gene, one which controls the
expression of the original set of structural genes prior to the occurrence
of these independent mutations. The operation of Darwinian natural
selection presupposes the existence of a gene regulatory hierarchy which
controls not only the expression of individual genes, but more
importantly controls the integration of the expressions of large numbers
of genes at the lower levels of this hierarchy.
To reiterate, there has always been something. What better candidate for
this something than that than which nothing greater can be conceived
(per Anselm)? This something has no complete description except
within and through itself. The only truly determinate entity is the
totality of everything which is possible or it is the void, as nothingness
is, by its very nature, determinate. It has been humorously noted that
Time exists so that everything will not happen at once. Given this
obvious truth, then if Reality is infinite in scope, there is no "everything"
for there to possibly be given or occur all at once. For the term
"everything" presupposes a finally completed something which is
incompatible with infinitude. So temporality may be simply necessary
in an infinite Reality. Temporality seems to require that (per Bergson)
genuine novelty continually intrude into the world. The act of creation
is not an event within history; it is the beginning of history; it is the very
inception of novelty. Bergson advocates what might be called a
Machian theory of Time.
But since the continued presence of temporality seems to require
continual activity of creation, it seems more consistent to assume that
creation itself is a fundamental process which itself had no beginning,
which was itself never created, that the activity of creation is that which
has always existed and which requires no explanation other than itself.
On the other hand, however, it seems that this fundamental process of
creation cannot be a unified process, for otherwise it is an act which
could have been consummated instantaneously, once for all.
Temporality must therefore be a process of recapitulation of timeless
reality with Reality itself as the medium through which the

recapitulation is accomplished. If Reality has a complete description, it
is not one which can be constructed, not one which had any origin in
time: it is indeterminacy itself which is ultimately responsible for the
existence of temporality itself. If such a complete description exists it
must have always existed and the description and the reality must be one
and the same. Consciousness offers itself immediately as a likely
candidate for such an ultimate reality since consciousness is its own
representation. If it is true that consciousness is required in order to
collapse a quantum mechanical wavefunction into a determinate
eigenstate, then consciousness, if it had an origin in time, must have
arisen out of the interaction of uncollapsed wavefunctions - it must have
arisen out of a situation of infinite energy uncertainty. The velocity of
light is determined by the ratio of real particle energy to virtual particle
energy. Hence, the elsewhere region of the Minkowski light cone may
be identified with that region of the vacuum which stands in a relation of
Bell nonlocality to the observer. The unity of the conscious mind is no
doubt owing to Bell-nonlocal connectivity of neurons within the brain.
If it is true that there has always been something (as in the metaphysical
question, "why is there something rather than nothing"), out of which the
Universe arose, assuming that this something is not just the Universe
itself, then there must not be, ultimately speaking, a universal or all
encompassing, ongoing process of either evolution or degradation in the
development of Reality. This is because by the present time an infinite
amount of time must have already passed, so that Reality should have
either degraded into utter nothingness or reached a state of eternally
unchanging perfection, and we do not observe the Universe to presently
occupy either of these two extreme states: temporality could not exist
within a universe which derives its existence from a ground of
unchanging perfection (fullness of being); nor could the universe derive
its existence from a ground of nothingness (complete emptiness of being).
December 1996

Evolution and devolution are concepts which may only
possess an application locally, however broadly. We now arrive at the

conclusion that Reality as a whole is neither evolving (increasing in
complexity) nor is it devolving (decreasing in complexity) so that any
apparent changes in complexity in the Universe, e.g., biological
evolution, increasing entropy, are merely local changes which are on the
whole compensated by opposing processes elsewhere. We may think of
causal relationships as obtaining between terms or entities occupying a
plane of equal levels of abstraction with the process of abstraction itself
and its opposing process, that of concretion, being processes which do
not admit of an exhaustively causal analysis. If it is only possible to
alter the boundary conditions and initial conditions which the dynamism
of Nature is subject to, but not to alter in any way the dynamism itself,
then the most advanced technologies will amount to nothing more than
having discovered how to goad Nature into doing, in the best way she
knows how, what she has all along been in the very act of doing. It is
the possibility of formulating recursive propositions and this possibility
alone which allows the domain of true theorems, expressible within the
language of a deductive system, to transcend in magnitude the domain of
provable theorems, i.e., theorems which may be deduced from the
axioms of the system through application of the rules of inference of the
system. There is no comprehensive rule by which a computing device
may recognize the appearance of a recursive proposition since
recursiveness is a structure which can only be suggested; it cannot be
literally stated. All baryons are composed of various combinations of
three quarks drawn from the six possible different types of quark; the
mesons, however, are each composed of a quark-antiquark pair from
among the six fundamental quark types. The fundamental force
responsible for binding together the various quark combinations out of
which all baryons and mesons are composed possesses the bizarre
property of increasing in strength as the distance separating the quarks
increases. The important result of this is that it is impossible to fragment these
quark laden particles into their fundamental constituent quarks: it is
impossible to produce a "free quark," in other words. This is almost as
though the quark does not possess a determinate energy structure except

as it exists within groups of two, three or possibly larger numbers of
quarks. The quark may be an example of an entity whose identity is
totally context dependent with the quark itself, paradoxically, providing
the context for its own determination as a substantive entity. Such an
entity might not possess a definite energy equivalence in the sense that it
is not possible to determine the quark's energy independently of the
energies associated with the force fields the particle produces. An entity
such as the quark which is defined in terms of the combined effect that it
has upon itself (and one or more duplicates of itself) provides us with
the best example to date of what might be called a self-existent entity.
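The composition claims above — baryons as three-quark states, mesons as quark-antiquark states, six flavors — admit a simple flavor-counting aside. This is an illustrative construction of mine, not from the text, and it counts flavor combinations only, before spin and color are considered.

```python
# Counting quark-flavor combinations (illustrative aside):
# baryons = unordered multisets of three quarks (flavors may repeat,
# e.g. the proton is uud); mesons = one quark paired with one antiquark.
from itertools import combinations_with_replacement, product

flavors = ["u", "d", "s", "c", "b", "t"]

# Multisets of size 3 from 6 flavors: C(6+3-1, 3) = 56 combinations.
baryon_flavor_combos = list(combinations_with_replacement(flavors, 3))

# Each of 6 quark flavors paired with each of 6 antiquark flavors.
meson_flavor_combos = list(product(flavors, flavors))

print(len(baryon_flavor_combos))  # 56
print(len(meson_flavor_combos))   # 36
```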
Quantum nonlocality effects could govern the superluminal transmission
of information between various widely separated locations within the
brain's cortex without producing a verifiable violation of special
relativity's restriction on superluminal transmission of information
between separate minds. This would be possible if the very same
nonlocality effects are required for the integration of the individual's
consciousness as a whole. The idea or flash of insight would then be
time-delayed by the necessary time interval involved in conveying this
idea from the location in his brain where the crucial integration occurred
to those parts of his nervous system which pick up this message and in
turn translate it into classically described nerve impulses then
responsible for the ultimate series of muscle cell contractions required to
transmit the message to the external world of attending observers.
Another way of looking at this kind of nonlocality is for the
nonlocalized quantum effects to be confined to a vacuum
energy/information reservoir, exhaustively connected with itself
nonlocally, which is continually tapped into by the neural network of the
person's brain. There seems to be nothing to strictly forbid the existence
of superluminal causal connections between events which lie outside of
any observer's Minkowski light cone.
Since the common sense view alleges that the past is nothing more than
a fixed, crystallized record of what has actually occurred within the
present, it follows from this view then that a present which has not had
adequate time within which to resolve its internal contradictions and
ambiguities must retain a certain small degree of freedom within which
change might continue even after this moment becomes past. In this
way, backwards causality would be admissible, if only for the purpose of
"cleaning up" the "loose ends" of still indeterminate entities occupying
past times. But doesn't common sense also define the past in such a
manner that it does not actually become past as such until this
crystallization process is complete?
January 1998

Within the above schema of "concrete" temporal evolution, the
time dimension cannot be spatialized and treated analogously to an
additional dimension of space such as it is treated by the Minkowski
relativistic formalism.
In other words, common sense defines the past in such a way that the
time axis is necessarily orthogonal to the three dimensions of space and
this at every spatial scale, however small; it defines the past in such a
manner that there is no substantive distinction between past, present and
future, which is to say, it defines the past as a half-infinite time interval
with its later endpoint being the present moment whose status as present
must be completely arbitrary. Within a deterministic time continuum
there is no nonarbitrary basis upon which any particular moment along
the continuum may be chosen over other contenders as being actually
present.
January 1998

The first order approximate description of the first order
perturbations to a given quantum mechanical system may be evaluated
in terms of the discrepancy between the system's first order and its
second order perturbation description.
The true nature of the perturbations of the system, to which the system
entirely owes its genuine temporality, cannot be understood with respect
to any possible common descriptive denominator which these
perturbations may be thought to have with the formal elements of the
first or, for that matter, any higher order perturbative descriptions of the
system. In other words, in absolute terms, the perturbations to any
system cannot be given any formal description or representation of any
kind.
A natural lawlike system of relationships which govern the behavior of a
given entity may only be formulated provided that this entity is not
unique (provided that multiple duplicates of the entity exist and are
available), as in the case of subatomic particles, e.g., the electron.
January 1998

Otherwise, there is no objective means of distinguishing
dependent from independent variables of the system.
As observed by Jacques Monod in Chance and Necessity, "Science
can neither say nor do anything about a unique occurrence. It can only
consider occurrences that form a class, whose a priori probability,
however faint, is yet definite."
We note here that a determinate occurrence merely constitutes a special
case of a definite probability, namely, that of unity. Consequently, if a
probability of unity for a particular occurrence demands that this
occurrence be contained within a finite set of possible occurrences
which altogether form a closed system, then so equally does a definite
probability for this, or any other, occurrence within the selfsame
system. It is doubtful whether the probabilities of "material" events can
change due to disturbances of the system from outside which leave these
events unchanged as discrete entities. For the sensitivity of system
elements within open systems follows from the general nature of open
systems as not being themselves constituted from closed systems. To
wit, additivity and commutativity, which are manifestations of system
reversibility, do not obtain within open systems.
The meaning of discursive symbols must be completely arbitrary if these
symbols are to be interpretable, i.e., if they are to be data to which
information can be associated. There must be an important distinction to
be drawn between the dissemination of data and the transmission of
information. There is no distinction between these two types of
propagation of influence if the endpoints are not taken into account. The
transmission of data from point A to point B is merely an arbitrary link
within a larger arbitrary link connecting point A' to point B' of the longer
path containing path AB. And the smaller path, AB, is arbitrary because
it exists only through the arbitrary abstraction of it from the larger path
A'B'. In other words, there is no reason for considering the path AB
other than arbitrary choice. There is no determinate relationship between
data and information because they do not exist at the same level of
abstraction. This would presuppose, however, that data and information
are only different, generally speaking, in a relative sense, i.e., relative to
the level of abstraction.

The quest for the "theory of everything" is therefore doomed to ultimate
failure, since what we call "everything" is necessarily unique, and this
uniqueness prevents us from separating those "variables" which are
particular to the thing itself from those which owe in part to our
investigatory involvement with this thing. The self, in the act of
investigating ultimate reality, must be included within the dynamic of
the reality for which we are seeking a complete description. This
inherent recursiveness which lies at the heart of any earnest attempt to
develop a complete description of reality is alone responsible for the fact
that the domain of truth necessarily transcends the sum of knowledge
comprising any point of view (of reality).
Quantum Mechanics tells us that a closed dynamical system may only
undergo temporal evolution provided that a certain energy uncertainty
exists within the system. This energy uncertainty is just the standard
deviation of the energy about its mean or expectation value. This energy
uncertainty may be interpreted in terms of a time-average sum of
random energy perturbations to the system "from outside" the system.
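The "energy uncertainty" invoked here, the standard deviation of energy about its expectation value, can be computed directly for a toy state expanded over energy eigenstates. A minimal sketch (my own illustration, arbitrary units, not a construction from the text):

```python
import math

def energy_uncertainty(energies, amplitudes):
    """Standard deviation of energy for a state given by amplitudes
    over the listed energy eigenvalues."""
    probs = [abs(a) ** 2 for a in amplitudes]
    mean = sum(p * e for p, e in zip(probs, energies))
    mean_sq = sum(p * e ** 2 for p, e in zip(probs, energies))
    # max(..., 0.0) guards against tiny negative rounding error
    return math.sqrt(max(mean_sq - mean ** 2, 0.0))

# An energy eigenstate: zero uncertainty, hence (as the text argues)
# no observable temporal evolution.
print(energy_uncertainty([1.0, 3.0], [1.0, 0.0]))  # 0.0

# An equal superposition: nonzero uncertainty; the relative phase
# between the two components evolves in time.
amp = 1 / math.sqrt(2)
print(energy_uncertainty([1.0, 3.0], [amp, amp]))  # approximately 1.0
```

The point the sketch makes is the one in the passage: a stationary state (zero spread about the mean energy) undergoes no observable evolution, while any spread permits relative phases to evolve.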
The phase of the isolated quantum system formally undergoes temporal
evolution, but there is no physical meaning to be attached to an absolute
phase. It is only when another system is brought into interaction with
the first system that we get temporal evolution of the relative phases of
the two systems, an evolution which has measurable and observable
effects. If these energy perturbations, or some irremovable component
of them, are not merely the artifacts of our inadequate perturbative
analyses of quantum systems, but are ontologically real, then the
infinity, and
perhaps the infinite dimensionality, of the world logically follow.
These random energy perturbations manifest themselves in the form of
energy exchanges between the quantum mechanical system and the sea
of virtual particles in which this system is embedded. The interaction of
these virtual particles with the quantum mechanical system is
responsible for virtual transitions of the quantum state of the system to
other quantum states. The only real energy transitions available to the
quantum mechanical (dynamical) system are those from amongst the set
of virtual energy transitions which are continually occurring within the
time interval specified by the system's time uncertainty. The density of
this virtual particle energy sea has a direct bearing upon the rate of
temporal evolution of any given quantum mechanical system.
Our central hypothesis is that the presence of matter has a perturbing
effect upon this virtual particle energy sea, i.e., the quantum vacuum
field, and this perturbing effect is, namely, to decrease the overall
density of this vacuum energy which results in a similar decrease in the
time rate of change of all physical processes within the spatial volume
occupied by this matter. This proposed vacuum mechanism is exactly
similar to the mechanism by which a quantum resonant cavity decreases
the rate of spontaneous emission of 'cavity - detuned' photons by a
Rydberg excited atom. The resonant cavity achieves this by excluding
most of the photons of half-wavelength larger than the cavity diameter:
to wit, it does this by decreasing the energy density of vacuum
electromagnetic field fluctuations of roughly the same energy as that of
the suppressed atomic energy transitions.
Given this inherent circularity in the nature of technological growth, it
follows that the ultimate constituents of the World must be wholly
recursive: they must be their own symbolic representations.
If a "conscious computer" is ever developed in what will undoubtedly
be the far distant future, this mysterious property of such a device will in no
way stem solely from the design or blueprint by which its engineers will
have conceived its construction; the blueprint will, of course, be a
necessary component in the realization of such a conscious machine, but
will have been derived from a necessarily somewhat fortuitous
discovery, owing to much trial and error investigation, of the "correct"
hardware interface of the device with the recursive, self-organizing
matrix of quantum - vacuum energy fluctuations which underpin and
mediate the stability of all fundamental particles and their exchange
forces. Only in appropriate dynamic conjunction with this fluctuating
energy matrix will any realization of a hardware design possess the
topological energy structure sufficient to tap the pre-existing
"consciousness potential" of spacetime. In other words, it is only the
grace of Reality's fundamental dynamism which will permit the eventual
construction of a so-called conscious computing device.
This empirical discovery of the correct interface design will manifest
itself perhaps during a testing phase where a long series of simulated
sensory inputs, of increasing complexity, are in the process of being fed
into the device while its output signals (responses) are being closely
monitored. The memory medium of the device will begin to accumulate
stored or remembered inputs which it has never received through its
various sensory apparatus. Identical sets or series of inputs will produce
significantly different series of outputs both from an individual machine
over time as well as from different machines at the same time - even if
these machines possess identical designs. Occasionally, radically
different series or sets of inputs will produce identical sets of outputs. A
significant portion of the functional relationship between output and
input will depend upon changes in energy in the internal components of
the machine's hardware which are, themselves, smaller than the overall
quantum energy uncertainty of the device as a whole. Moreover, no
mutually orthogonal set of eigenfunctions will describe the "functioning
components" of the device. This is why we have been saying that the
abstract spatial structure of our hypothetical computing device is
nontopological. Clearly, any realization of a static blueprint for a
computing device, regardless of how complex, in the form of a
dynamically functioning prototype, will itself be merely a
topologically-preserving transformation of the blueprint from 2 or
perhaps 3 spatial dimensions to 4 spatial dimensions, rather than the
topology-non-preserving transformation from 3 spatial dimensions to
the 4 dimensions of 3 space and 1 time.
This is because the state space of the transcribed structure, i.e., the
design, can be adequately described in spatial terms. In a very real
sense, an object may not be thought to possess an internality unless it
possesses a genuine "outside" in the sense of a radically open system - a
system which cannot be contained within a closed system; a system is
"closed" only if it is finite and neither receives nor transmits energy to or
from any system except itself. Such a closed system possesses no
genuine internality.
Contingent uniqueness versus necessary uniqueness. The size of the
Universe and the black hole mass limit as important parameters in
determining the density of the quantum vacuum energy.
The asymmetrical nature of time perhaps has some bearing on the
hierarchical structuring of complex macromolecules. The fact that a
molecule has been formed from a set of simpler constituents does not
guarantee that it can then be decomposed back into this set of
constituents. Similarly, the fact that a molecule has been broken down
into a set of simpler constituents does not guarantee that it can be
spontaneously recomposed from this selfsame set of constituents.
Perhaps the asymmetrical nature of temporality implies that any
sufficiently large set of macromolecules may be partitioned into two
disjoint parts; those molecules possessing a bottom - up structure and
those possessing a top - down structure. This distinction which I am
drawing is not a solid theoretical one; it is a pragmatic distinction which
assumes the status of a theoretical distinction when we refer to
molecules occupying either extreme end of the probability spectrum ( in
terms of their ability to form "naturally" from simpler parts ). Will
may only be defined in terms of a "rational" order foreign to itself which
it resists or subverts.
Will is the external manifestation of consciousness. Will is a principle
of incommensurate causation. The set
of lawlike relations which may be said to govern Will's operation are
unique and irreproducible. Rational order is simply that order which can
be made to manifest its lawlike relations in a reproducible manner.
There is no need to invoke a temporal description of this state space - the
only reason one would attempt it is because we project our genuine
temporality onto the mind's eye realization of the computing device in
its act of "functioning." Henri Bergson, in his essay, Time in the History
of Western Philosophy, complained of a confusion which inevitably
cropped up whenever metaphysicians attempted to analyze the problem
of motion. With a kind of gentle contempt he described the frustration
of these philosophers in trying to reconstruct linear motion from infinite
series of "immobilities", i.e., fixed points of infinitesimal length.
He explained their failure as being due to their ignorance of the nature of
a linear interval as a mere projection of "mobility onto immobility." This
projection, naturally as such, does not capture the whole phenomenon,
but merely a point of view with respect to it out of an infinity of equally
valid points of view, and so from a single projection, or even a finite
number of projections, one is never permitted to reconstruct the original
object as it is. There are possible boundary conditions which might be
easily placed upon the dynamic of the "flux" which are nonetheless
infinitely improbable as "natural" occurrences, which is to say that the
operation of intelligence is required to institute them. It is these
seemingly magical self - organizing properties of matter, owing to the
recursiveness of its ultimate "constituents," which make any attempt to
calculate the "improbability" of biological macromolecules an
incoherent and meaningless enterprise. Similar activities are the routine
pastime of myriad scientifically inclined creationists attacking evolution.
The staggeringly large numerical ratios which they cite against the
"chance occurrence" of the simplest eukaryote DNA are calculated upon
a permutational / combinational modeling of a prebiotic "soup" in which
chance collisions continually occur between precursor molecules, e.g.,
peptides, amino acids, nucleic acids, etc. The serious problem with such
a modeling approach is that it is not an empirically derived statistical
calculation as in actuarial computations, where a distinct probability is
assigned to each possible event within a pool, based on the observed
relative frequencies of each event, but is an abstract calculation where
the probabilities are fixed at the outset and remain unchanged
throughout all series of calculations.
For example, there are a vast number of nucleic acid reactions taking
place within the ribosome organelle of every living animal cell which in
the absence of certain mediating enzymes will take place anywhere from
9 to 20 orders of magnitude more slowly than if these enzymes are
present - the ribosome is responsible for "translating" the coded
information of nucleic acids into various macromolecules (proteins) and
in so doing expressing the genotype of the developing organism. We see
from this example that the probability of the occurrence of various
macromolecules essential to the appearance of the first reproducing,
metabolizing organic units begins infinitesimally small when the
molecule's precursors are yet unavailable, but that this probability grows
by large discontinuous jumps each time one of these precursors, the
precursors' precursors, etc. arise inadvertently in the prebiotic "soup" so
that by the time the exhaustive set of macromolecular precursors is
present, the formation of the original macromolecule is virtually assured.
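The discontinuous jumps described above can be caricatured in a few lines. The base probability and the per-precursor boost factor below are invented purely for illustration and carry no empirical weight; the shape of the curve, not the numbers, is the point.

```python
# Toy model (numbers invented for illustration): the per-unit-time
# probability of forming a target macromolecule is multiplied by a
# large factor each time one of its required precursors becomes
# available in the "soup".
def formation_probability(base_prob, precursors_present, boost=1e3):
    return min(1.0, base_prob * boost ** precursors_present)

history = [formation_probability(1e-12, n) for n in range(5)]
# The probability climbs by discontinuous jumps, roughly:
# 1e-12, 1e-9, 1e-6, 1e-3, 1.0 - "virtually assured" once all
# precursors are present.
```

The model mirrors the text's claim that an abstract permutational calculation, with probabilities fixed at the outset, misses exactly this stepwise conditioning of each probability on what has already formed.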
The ribosome itself, despite its inordinately complex structure, has been
observed under experimental conditions to reform spontaneously after
having been dissociated within a carefully prepared enzyme bath into its
precursor polynucleotide constituents - and this within the space of only
several hours! It is indeed true that a countless variety of different
enzymes (of the precisely correct type) must be simultaneously present
along with the polynucleotide matrix for this seemingly magical act of
spontaneous self - organization to occur. This is because the self organization of such an enormously complex organic structure depends,
throughout the entire process, upon the almost simultaneous
juxtaposition (collision is a better term) of three or more precursor
molecules which all happen to have the exactly correct spatial
orientation, with sufficient kinetic energy to overcome the activation
energy barrier against the reaction occurring. It should be noted here
that just the chance of any three compatible molecules ( in terms of a
desired product ) colliding at once with the roughly correct spatial
orientation is an event with a probability only infinitesimally greater
than zero - let alone the question of proper activation energies. And so,
even if the primordial Earth possessed the appropriate reducing
atmosphere with oceans chock-full of all of the required precursor
molecules for the ribosome to properly form, without the necessary
catalyzing cofactors ( the enzymes ) there would not likely have formed
a single such structure by this late epoch. Then perhaps there must have
been an extremely long period of time during which the necessary
enzymes appeared on the scene, one might think. One suspects, then, a
similar self - organizing process behind the formation of these necessary
enzymes, the only difference being that the precursors which we are now
concerned with are simpler, while their precursors must have been
simpler still, and so on. But the precursor macromolecules for many
particular enzymes have, indeed, never been manufactured (because we
don't know how to do it), but have to be obtained from more complex
molecules than themselves, if not from living or recently living
organisms. The theory of evolution, chemical evolution in this case, has
secretly conditioned us to believe that there must be some definite if
inordinately complex sequence: precursors + cofactors ← simpler
precursors + cofactors ← etc., leading back to the very simplest organic
molecules which form by self - organization spontaneously and easily in
a wide variety of environments and without the need for cofactors or
helper molecules of any kind, and that it must have been such a process
(only in reverse) which ultimately led to the first self - reproducing
biological units which could then be developed further through
Darwinian natural selection.
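The three-body orientation argument a page back can be made concrete with a toy Monte Carlo estimate. The 5% per-molecule tolerance is an invented figure, and treating each orientation as an independent yes/no event is a gross simplification; the sketch only shows how quickly a joint coincidence probability collapses.

```python
import random

# Toy Monte Carlo: estimate the chance that three colliding molecules
# all arrive within an assumed orientation tolerance, each modeled as
# an independent event of probability p.
def three_body_alignment_prob(p, trials=100_000, seed=0):
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(trials)
        if all(rng.random() < p for _ in range(3))
    )
    return hits / trials

# Even a generous 5% tolerance per molecule gives a joint probability
# near p**3 = 1.25e-4 - negligible on a per-collision basis, before
# activation energies are even considered.
estimate = three_body_alignment_prob(0.05)
```

The analytic value p**3 is what the simulation converges toward; the code is only a concrete stand-in for the text's "probability only infinitesimally greater than zero".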
The notion of self - organization gives some of us pause because it
concerns a natural process which sits precisely on the fence between
what might be called less - than - self - organization, i.e., formation from
simpler components, and what is aptly called greater - than - self -
organization, i.e., formation from more complex components - and it is
just such a notion which strongly suggests a top - down hierarchy within
the natural order which can only have intelligence at its apex. At every
step in the chain in the formation of higher level precursor molecules,
the mediation of the required reactions is accomplished via self -
organization principles: those who attempt to calculate the probability
against "chance" formation of important precursor molecules forget a
very important general principle first articulated by the great rationalist
philosopher Leibniz - which is that the set of conditions which in
combination are sufficient to produce some complex structure must
necessarily remain in operation at every succeeding moment to sustain
the existence of this structure. The upshot of this is that a complex
structure which owes its origin to mere chance cannot endure, still less
could it respond to its environment in an adaptive fashion. To bend
Nature toward our intentions it is merely necessary for us to block all of
Nature's tendencies except the one paralleling our own.
very nature of recursion not to be derivable from anything but recursion
itself. Therefore, if a recursion or recursive structure has a beginning in
time it is complex, but may only be analyzed in terms of other "simpler"
recursive structures. These simpler components of the recursive
structure are themselves approximations to the larger structure which
they form in combination with one another. The behavior of self -
organizing systems cannot ultimately be explained in terms of principles
of organization which obtain externally, which is to say, such dynamic
structures will not submit to a reductionistic analysis.
The distinction between the "mental" and the "physical" may be drawn
in the following way: both are wholes composed of parts, both possess
principles of organization, but what is termed a physical whole is
defined exclusively in terms of its constituent parts while the "parts"
which "compose" what is termed a mental whole are, themselves,
defined in terms of the whole which they compose. The reconstruction
of a mental whole must be guided in a top - down manner whereas the
construction of a physical whole must be guided in a bottom - up
manner. The principle of a mental whole must exist prior to its actual
realization (in terms of whatever substance). Without substance, change
is not possible. Coextensive with this principle is: change owes itself to
a lack of determination, to a deficit of Being, to negation. From which it
at once follows that substance, rather than being the seat of being, of
thinghood, as common sense conceives it to be, owes its existence, to
the contrary, to a lack of being. It is not possible for a determinate thing
to be made up out of substance insofar as this thing possesses
determination. It is easy enough to see that continuity is required for the
subsistence of what is called historical time which we will henceforth
refer to as temporality. Indeterminate substance is the only basis for the
continuity underlying all change. The theory of entropic processes tells
us that energy and information are interdefinable and this fact in
conjunction with the principle of energy conservation suggests that
information, like energy, may neither be created nor destroyed: the
"creation" of a quantity of information is really constituted by its being
abstracted from the preexisting integral system of information defining
it. Like a piece broken off from a holographic emulsion, there is a
necessary trade - off involved in this abstraction process: the "newly
created" entity only acquires a relative independence from the whole in
which it eternally preexisted (and which defined its true being) by losing
some of its knowledge of itself. There is a direct analogy with this
process in the creation of "new" subatomic particles through the
collision of particles within a linear accelerator.
In the first couple of decades after the first "atom - smashing"
experiments performed with the primitive particle accelerators of the
1930's, it had been supposed that the particle products of these violent
collisions were actually pieces of the colliding particles which had been
jarred loose by the sudden impulsive force of their slamming together.
But soon after this early period the kinetic energies of the particles going
into these accelerator collisions began to significantly exceed the
combined mass - energy of the particles which themselves initiated the
reaction, with the result that the end product of these collisions was a set
of particles with individual member particles possessing a mass greater
than the combined mass of the particles originally participating in the
collision. The common sense "broken pieces" explanation of
accelerator products now had to be modified in some way or rejected
outright. Two alternative interpretations of this "mass paradox" were
suggested by particle theorists: either the product particles were created
from the excitation of the vacuum by the kinetic energy of the collision
with the "input" particles serving as the points of application of the
excitation energy, or they were really inside the initial particles all along
but the excess mass - energy was being exactly balanced by an equal and
opposite negative energy due to the internal binding forces holding the
particles together. The science of Physics, at least before the
development of relativistic quantum field theory, in the 1940's, imagined
that there might be such things as ultimate material constituents elementary particles - out of which all material bodies would be
composed. The implicit Metaphysics behind the notion of elementary
particles is that of substance. There is no such thing as nothing.
Nothing, by its very nature, is a nonexistent entity: it is its own
negation. We might be tempted to say then that "something," being the
opposite of nothing, must exist. But not just any "something" constitutes
the opposite or negation of nothing/nothingness, but only a something
which is, itself, the unity of all that might possibly exist, and the very
essence of which is to exist. In other words, nothing, not being possible
because containing within itself its own negation, implies that there must
have always been something (or other).
But the only guarantee that there has always been something is the
existence of something which contains within itself its own affirmation,
if you will, the reason for its own existence. A fundamental and most
general property of a thing which contains within itself the reason for its
own existence is that of recursion, something which is defined solely in
terms of itself, a recursive structure. There are logical grounds for
believing that there can be only one recursive structure, that there can be
only one self-existent entity - with this entity being the "ground" for
existence of all other entities. A recursive structure, if it may be thought
to be composite, would be composed of parts which are totally
interdependent upon one another; no part is the cause of any other
without also being itself caused by this other part, and so if this recursive
structure had a beginning in time, it must have been given existence
through a pre-existing, broader and more complex recursive structure.
We see now that a given finite recursive structure comes into existence
through a process of uncovering or abstraction from a more complex
whole - through a process of negation. We are reminded of
Michelangelo's claim that a truly great work of sculpture already exists
within its marble block and that all he did in order to produce his works
was merely to free them from the marble in which they were imprisoned.
All simpler recursive structures come into being through a kind of
figure-ground transformation, through a change in perspective. This
reminds us of Leibniz' monads, each of which is only a different
perspective on an eternally pre-existent whole, cf. holography, with
each monad defined in terms of, or dependent upon, all of the other
monads making up the whole. This exhaustive interdependence is what
Leibniz refers to as the preestablished harmony between monads.
Again, a recursion may only be defined in terms of other recursions like
itself. Consciousness is an example of an inherently recursive structure
as it is its own symbolic representation ( of itself to itself).
Consciousness' objectivity is exactly coextensive with its subjectivity;
the representation of a decomposed 3 + 1 space and time is always
against a nondecomposed spacetime where space and time are fused.
This is simply a restatement of Bishop Berkeley’s principle, esse est
percipi, i.e., to be is to be perceived, and as consciousness necessarily
experiences itself as a unity and never as a multiplicity - the objective
reality of any multiplicity of consciousnesses could only exist as a
subjective reality within a larger individual consciousness (itself a unity)
and so cannot really be a multiplicity of consciousnesses at all. This was
one of Schrodinger’s metaphysical insights.
December 2012
We have said
elsewhere that, “without context, there is no meaning”. But not just any
context will do here. Firstly, this meaning giving context cannot be
merely incidental or accidental (since we are presuming here that
ultimately sense and meaning is not forthcoming from a closed system),
but the system must itself be an outgrowth of the meaning-providing
context. This makes a teleological interpretation of intelligent systems
unavoidable. In other words, we must subscribe to a top-down causal
order, one which necessarily admits the operation of what philosophers
term supervenience. “Self” organization or “self”-organizing principles
have no efficacy unless what we term “self”, i.e., that which does the
organizing, is open-ended, i.e., open to higher levels of organization.
Logically, the only system that can effectively “bootstrap” itself
(following Cantor) is an infinite system.
This argument against the multiplicity of consciousness was succinctly
stated by the physicist Erwin Schrodinger in his short seminal work,
Mind and Matter. It follows that since consciousness cannot
experience itself as a multiplicity, it therefore cannot exist objectively as
a multiplicity: consequently there can only be one consciousness.
But if the nature of consciousness is that of self-transcending, then
Schrodinger’s clever line of reasoning loses considerable force. Both
the American psychologist William James as well as the French
philosopher Henri Bergson did not believe that the brain was itself
productive of consciousness but that the brain was merely a kind of
reducing valve (Bergson) which filtered and reduced down the cosmic
consciousness (James) to a vastly simpler form containing roughly the
level of information handling capacity (intelligence) which the human
organism would need to adapt to the limited challenges of an earthly
environment. The novelist and author of Brave New World (1932),
Aldous Huxley, discussed this view of consciousness in his popular
treatise, The Doors of Perception - Heaven and Hell. In this work
Huxley describes the effects of the psychoactive (psychedelic) drugs
mescaline and LSD which he experimented with towards the latter part
of his life in an attempt to trigger or facilitate the mystical experience of
enlightenment in which he had had an almost lifelong curiosity. Huxley
explained the effects of these substances in terms of the James-Bergson
theory of consciousness: the experience of self-transcendence
and transfiguration which Huxley experienced on mescaline was for him
due to the drug's disabling effect upon the cosmic reducing valve. The
brain under the influence of mescaline is no longer able to filter out the
thoughts and intuitions irrelevant to earthly life (because appropriate to
the upwardly contiguous levels of consciousness) - hence the experience
of a vastly larger mental space. This type of explanation would have
been readily understandable to the ancient Greek philosopher Socrates
who viewed learning (through experience or education) as simply a
mechanism by which latent knowledge is recollected. A computational
state space configuration which cannot be duplicated in space can
neither be duplicated in time, except by virtue of the additional
advantage afforded by substantial continuity. Somehow the quantum
no-cloning theorem does not apply to two or more identical states of the
same system. Genealogy is the temporal component of meaning-purveying context.
Moreover, a state space configuration which cannot be duplicated in
time can neither be duplicated in space. In an infinite state space, there
are an infinite number of available states, and so the probability of any
one state occurring is, formally speaking, zero. However, if the state
space has a nonpermutational/combinational structure, then it is possible
for individual states to be overdetermined with a given state representing
an infinite number of different "configurations" within the state space.
This would be the case if the state space is that belonging to a computing
device possessing genuine memory. For the device would either
remember that it had been in the exact same computational state before,
in which case it would have to be in a new state slightly different from
the one it "remembers" being in previously, or it would not remember
ever having been in that state which it had occupied in the past. The
device could only really succeed in exactly remembering one of its
previous states if the device had
1) been in operation for an infinite period of time and,
2) the device possessed a memory of infinite size so as to adequately
contain an infinitely regressive self-representation of one of its previous
states, itself an infinitely regressive self-representation. A state space
configuration can only be copied if it is locally separable from the
nonlocally connected environment which contains it.
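The memory argument above can be made concrete with a toy sketch (a deliberately simplified illustration of the regress, not a claim about physical devices): a device whose state includes a log of its own past states can never re-occupy a state it has recorded, because the very act of recording enlarges the state.

```python
# Toy model: the device's full state is the tuple of everything in its log.
# Recording a snapshot of the current state grows the log, so the new
# state always differs from every state the device "remembers".
def run(steps):
    log = []
    for _ in range(steps):
        snapshot = tuple(log)   # the device's full state right now
        log.append(snapshot)    # remembering it changes the state
    return log

history = run(5)
current_state = tuple(history)
# the current state never coincides with any remembered state
assert all(current_state != past for past in history)
```

Only a device with an infinite log (and an infinite past) could escape this, which is the point of conditions (1) and (2) above.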

Otherwise, the state cannot be defined sharply enough to be copied.
Separating the state from the nonlocally connected (non-representational)
environment in which it is embedded, in order to try to
copy it spatially, results in the destruction of the state earlier than, or just
before, a complete description of the state (required as a template) is in
hand.
required as a template. The velocity of light limits the transmission
speed of intersubjective data which must be carried on a carrier signal of
energy greater than ΔE, the energy uncertainty of the nonlocally
connected quantum substrate through which the carrier propagates. That
which is locally separable from its physical environment is a
nonrepresentational abstraction. An abstract system is a re-presentation
if it remains connected to its defining context; if it remains coupled to
the medium in which it historically originated. Semantical symbol
manipulations are characterized by transformations of symbols through
the alteration of the defining contexts in which the symbols are
"embedded." Syntactical symbol manipulations are characterized by
transformations of symbols through the application of rules of inference
to the symbols within a constant defining context. It is perhaps possible
to see that syntactical and semantical transformations are mutually
"orthogonal" operations.
Recursive relationships involve a content, theorem or proposition, in at
least two distinct contextual levels. Infinitely recursive functions,
therefore, whether they be propositional, mathematical, etc., may
constitute a kind of hybridized operation possessing both semantical and
syntactical features simultaneously.
Is it a valid line of speculation to
suppose that since some particles which are virtual within an inertial
frame of reference, become real particles with respect to an accelerated,
or noninertial, reference frame, therefore inertia/gravitation must bear
some intimate connection with the phenomenon of the "collapse of the
wavefunction," since this phenomenon means the rendering as "real" one
among the many superposed virtual quantum states? Is particle
production associated with nonadiabatic, and hence, irreversible,
changes in the energy density of the quantum vacuum electromagnetic
field?
October 1996

We know that nonadiabatic changes in the boundary conditions
of the infinite potential well problem result in a transition of the particle
energy to an excited state with respect to the new wavefunction
describing the new potential well resulting from this sudden change.
This suggests that perhaps irreversible, or, nonadiabatic, changes in a
quantum mechanical system are necessary for the wavefunction
describing it to undergo "collapse." Perhaps changes in the boundary
conditions of the (non?)locally connected vacuum can be modeled upon
a change in the dynamics of this vacuum in the absence of changes of
the boundary conditions. Perhaps all changes in the dynamics of the
nonlocally connected vacuum are only measurable in terms of their
manifestation as changes in the boundary conditions of a locally
connected quantum system. When the boundary conditions applied to a
given wavefunction are treated classically, then nonadiabatic changes in
the boundary conditions will usually result in a discontinuous change in
the wavefunction, i.e., a collapse in the wavefunction. But if the
classical boundary conditions are themselves treated quantum-mechanically,
then the composite wavefunction will not suffer a
collapse, but will evolve according to the time-dependent Schrodinger
wave equation.
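The sudden (nonadiabatic) change described above for the infinite well can be checked numerically. In this sketch (an illustrative textbook setup, not drawn from the text) the well width is doubled instantaneously; expanding the old ground state in the new well's eigenstates shows the particle left in a superposition of excited states of the new well, with only half the probability in the one new eigenstate whose shape matches the old ground state.

```python
import numpy as np

L = 1.0
x = np.linspace(0.0, L, 20001)                     # old well occupies [0, L]
psi1 = np.sqrt(2.0 / L) * np.sin(np.pi * x / L)    # old ground state

# Eigenstates of the suddenly widened well [0, 2L]; psi1 vanishes on [L, 2L],
# so the overlap integrals can be taken over [0, L] alone.
def phi(n):
    return np.sqrt(2.0 / (2.0 * L)) * np.sin(n * np.pi * x / (2.0 * L))

dx = x[1] - x[0]
probs = [(np.sum(psi1 * phi(n)) * dx) ** 2 for n in range(1, 500)]

# The n = 2 state of the wide well coincides with the old ground state on [0, L],
# yet it carries only half the probability; the rest is spread over other states.
assert abs(probs[1] - 0.5) < 1e-3
assert abs(sum(probs) - 1.0) < 0.01
```

The discontinuous redistribution of probability over the new eigenstates is exactly the "collapse-like" effect of a nonadiabatic change in boundary conditions that the entry above describes.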
January 1997

Is a nonadiabatic change to be understood as a change in
vacuum boundary conditions which cannot be expected (by the vacuum
itself) because ΔB/Δt > ΔB·ΔE/h, that is, because the change occurs
faster than the vacuum's characteristic timescale h/ΔE? It is clear in a
geometrically intuitive sense that transformations of entities which are
not truly independent and separable from an open-ended context or
system in which they are grounded cannot be genuinely reversible, but
only abstractly so, to within a certain approximation. Participatory
knowledge transcends abstract
description in terms of abstract representations of independent "things"
or entities. This is the knowledge based in the intimate interaction with
the open-ended. Is it possible to not be in an eigenstate of any quantum
mechanical observable whatever?
Does this describe the normal
condition which the quantum vacuum finds itself in? If the mode of
interaction of real particles with real particles, i.e., real-real interactions,
is correctly described as deterministically ordered, and the mode of
interaction of real with virtual particles as randomly ordered, then should
we describe the mode of interaction of virtual particles with themselves
as both random and deterministic?
Is a superposition state possible in the absence of wavefunction
boundary conditions? Are some superpositions well-formed in the sense
that they can be inverse Fourier-transformed to a unique eigenstate with
respect to a definable observable? Are some ill-formed in the contrary
sense of not possessing an inverse Fourier-transform to a unique
eigenstate of a single observable? Perhaps well-formed superposition
states may only be defined given appropriate spacetime boundary
conditions, i.e., initial and boundary conditions. Notice that when a
measurement is performed upon one of the separated particles of an EPR
type experiment, that the particles remain nonlocally connected after the
"collapse of the wavefunction" describing the particles jointly, although
the particles are now nonlocally connected in a new way precipitated by
the observer's act of measurement. Has the observer simply succeeded
in discontinuously altering the inertial frame of reference in which the
particle pair is embedded. If so, doesn't he do this by accelerating the
particles? Are the nonlocal connections within the observer's mind
merely apiece with the nonlocally connected vacuum state in which his
brain is embedded, and so when he performs his measurement upon the
particle pair, the pair must "jump into" a new nonlocally connected
vacuum state, resulting in a discontinuous change in its superposition
state? Does the observer recoil nonadiabatically into a new nonlocally
connected vacuum upon performing an act of quantum measurement
which induces what appears to him as a wavefunction collapse? Must
the vacuum possess infinite self-similarity so that "identical events" may
unfold with different rates of temporal evolution, depending upon which
inertial frame of reference they are "viewed from?" Self-similarity can
never be exact. If the vacuum state were merely locally connected, then
its temporal evolution "as a whole" would necessarily follow along a
predetermined continuum of vacuum states. However, a nonlocally
connected vacuum state creates its own trajectory as it evolves.
I am trying to build a case for distinguishing between two seemingly
very different descriptions of the process of quantum measurement,
namely, the discontinuous collapse of the wavefunction of the quantum
mechanical system being observed/measured from a similar collapse of
the wavefunction describing the mind of the observer performing the
measurement, which is to say, the Copenhagen from the "Many Worlds"
interpretation. As is well known, Newton's law of gravitation may be
given a Gaussian formulation exactly paralleling the electromagnetic
flux law. What is surprising is that the black hole mass of a given radius
may also be given a Gaussian formulation.
To wit, (1/(4πG)) x Int(Hc) dS = M_blackhole. It is possible to
"derive" the Pauli Exclusion Principle from the Heisenberg Uncertainty
Principle. This may be shown in the following manner. If two spin-1/2
particles of the same quantum mechanical system were to be in the same
quantum state - which is precisely what the Pauli principle forbids - say,
two electrons "orbiting" the same hydrogen nucleus, it would be possible
for us to measure the kinetic energy (a function of momentum) of one of
the electrons, and then to measure the potential energy (a function of
position) of the other electron with the result that we would have
demonstrated the existence of a quantum mechanical state possessing,
simultaneously, an exact potential and an exact kinetic energy; but this is
precisely what is forbidden to exist by the Heisenberg uncertainty
principle - QED.
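The Gaussian formulation of the black hole mass noted above can be checked numerically under one assumption the entry does not spell out: that H is defined so that Hc equals the Newtonian field strength GM/r² on the integration surface. On that reading, the surface integral recovers the mass exactly, even when the surface is taken at the Schwarzschild radius:

```python
import math

G = 6.674e-11        # m^3 kg^-1 s^-2
c = 2.998e8          # m/s
M = 1.989e30         # kg; one solar mass, an illustrative choice

R = 2.0 * G * M / c**2            # Schwarzschild radius of M
field = G * M / R**2              # assumption: Hc = Newtonian field strength at R
M_recovered = (1.0 / (4.0 * math.pi * G)) * field * (4.0 * math.pi * R**2)

assert abs(M_recovered - M) / M < 1e-12
```

This is, of course, just Gauss's law for the Newtonian field; the noteworthy point in the entry is that it continues to hold at the horizon radius.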
June 1997

This conclusion does not go through, however, if the requirement
is made that two particles which are in the same quantum state must be
described by one and the same wavefunction.
The Feynman path - integral formalism of relativistic quantum field
theory indicates that real particles, i.e., fundamental particles whose
mass - energy is greater than the quantum mechanical energy uncertainty
of the quantum mechanical system to which they belong, may be
represented as stable and interlocking patterns of vacuum energy
fluctuation, that is, as patterns of virtual particle creation, annihilation,
and particle (fermion and boson) exchange processes which form with
one another a stable, interconnected meshwork of feedback loops of
virtual particle reactions.
June 1997

It is not certain what the concept of stability means within the
context of virtual particle processes. "Stable" certainly does not mean
here persistence of a structure against fluctuations or perturbations thermal or otherwise, since the virtual particle processes themselves are
the fluctuation phenomena.
Stability must mean in this case the
relatively unchanging probabilities of recurring patterns of quantum
fluctuation manifesting themselves as virtual particle reactions.
Thus, real fundamental particles are viewed within this formalism as
mere excitations of the vacuum (ground) state with more complex matter
structures, e.g., atoms, molecules, etc., as feedback structures into which
these vacuum excitations are organized - provided that adequate
excitation energy is available.
One possible test as to whether or not a given particle is composite or
simple might be: does the particle have a virtual counterpart, i.e., can
the particle be produced out of the vacuum as a pure energy fluctuation -
out of a fluctuation of purely imaginary 4-momentum?
Although in
theory it should be possible to produce whole atoms, molecules, or more
complex matter structures through direct excitation of the vacuum state
(see above paragraph), the intelligent coordination of the myriad and
highly localized excitations required to do this, from within any
particular modified vacuum state, is probably rendered impossible due to
the inherent uncertainty of total energy which is responsible for vacuum
fluctuations: certain existing boundary conditions to the matrix of
vacuum fluctuations may already be immediately present - in the form of
already created particles, molecules, etc., but these boundary conditions
cannot be produced ab initio, but may only be "reproduced," utilizing
identical pre-existing boundary conditions (in the form of already
available matter) as template and catalyst for the reproduction of the
desired vacuum boundary conditions. Any instrumentalities which we
might employ to alter the vacuum field boundary conditions would only
be effective by virtue of the vacuum field fluctuations themselves which
mediate their action; we must realize that the imposition of genuinely
new boundary conditions upon the vacuum, i.e., without the utilization
of a "template," even if locally, would imply a change in the global
boundary conditions of the entire vacuum energy system (the entire
spacetime continuum).
On the view of matter and vacuum which is
espoused here, matter is seen as not having an existence independent of
the vacuum energy field, rather, the stability of matter at all levels of its
hierarchical structure, is underpinned by the myriad field energy
fluctuations of the quantum vacuum. Consequently, matter does not
possess an independent existence in the Democritean sense of "atoms
and void;" our view is more consonant with that put forward by
Heraclitus, to wit, that everything is composed "of fire in measures
kindling and in measures going out;" all change is driven by the clash of
opposites and all potential for change lies with the tension between these
opposites. Here "fire" is given the modern physical interpretation as
"vacuum energy" and the "clash of opposites," as the creation and
annihilation operators (2nd quantization of quantum theory) into which
all operators corresponding to physical "observables" are analyzable.
What Heraclitus' physics lacked was a basis for physical continuity from one
moment of time to the next; the reproduction of vacuum boundary
conditions (in virus - like manner) supplies this missing element within
modern physics.
Within this understanding of the relationship
between matter and vacuum Democritus' notion of persisting
"substance" no longer has any application and the continuous existence
of real matter particles consists in the continual recreation of a coherent
and interlocking pattern of virtual particle reactions which is apiece with
the larger pattern of vacuum energy fluctuations within the indefinite
surrounding region of spacetime.
October 1996

The basic idea behind a perturbative analysis of a quantum
system is that one is not able to write down with infinite precision the
exact Hamiltonian of the system under consideration and so one
describes the energy of the system in terms of a Hamiltonian plus a
perturbation energy. This perturbation energy is usually the first
nonzero term in an expansion of energy terms where additional terms are
progressively smaller and must be neglected since to include them poses
analytic intractability. In other words, one does not have the precise
energy eigenfunction expansion of the system's wavefunction; if one did,
then one could in theory prepare the system in any one of its energy
eigenstates where the system would exist at a precisely defined energy
for all time, assuming the system were not interfered with as a result of
exchanging energy with some other system. But since the Hamiltonian
of a quantum mechanical system is always a function of the system's
momentum and position, which are incompatible observables, the
energy of the system, which is a function of both the system's
particle/field momentum and particle/field source position, can never be
precisely defined. In this way we see that energy perturbations are not
an ad hoc and practically useful accounting device needed to make up
for a merely practical, and, hence, theoretically removable, ignorance
concerning the system's real energy eigenfunction expansion. But quite
to the contrary, perturbations to the system's energy - any system's
energy - are not merely artifacts of the perturbative analysis, but are
ontologically real, and not due to a temporary inability to specify the
system's true energy eigenfunction expansion. There is a small
component of the perturbation energy which is forever irremediable and
represents the exchange of energy between any quantum system and
another quantum system which is always present.
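The point about perturbative expansions can be illustrated with a small numerical sketch (a generic finite-dimensional example chosen for illustration, not taken from the text): the first-order correction ⟨n|V|n⟩ approximates the exact eigenvalues of H0 + V with a residual error of second order in the perturbation strength, which is what makes truncating the expansion workable in practice.

```python
import numpy as np

H0 = np.diag([1.0, 2.0, 4.0])            # unperturbed Hamiltonian (diagonal)
lam = 0.01                               # small perturbation strength
V = lam * np.array([[0.5, 0.2, 0.1],
                    [0.2, -0.3, 0.4],
                    [0.1, 0.4, 0.2]])    # Hermitian perturbation

exact = np.linalg.eigvalsh(H0 + V)               # full diagonalization
first_order = np.sort(np.diag(H0) + np.diag(V))  # E_n + <n|V|n>

# the residual error is O(lam^2), far smaller than the O(lam) correction
assert np.max(np.abs(exact - first_order)) < 10 * lam**2
```

The entry's claim is then that for real systems some part of V is never removable, so the exact diagonalization step is never actually available.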
An important conclusion to be drawn for quantum theory here is that, the
wavefunction only represents the most that can be known about a
quantum system in the absence of the irremovable perturbations. We
might be tempted to speculate here that more can be known about a
quantum system than can be contained in any wavefunction provided
that the effect of the irremovable perturbations are included. If the
objective and the subjective are considered to be disjoint categories, then
we may say that just as the wavefunction represents the most that can be
objectively known about a quantum system, what can be subjectively
known about a quantum system is due entirely to influences lying
altogether outside of all possible wavefunction descriptions of the
system. Such influences, collectively, are the so-called irremovable
perturbations. We must not straight-away identify such "irremovable
perturbations" with the virtual particles and fields of relativistic quantum
field theory as these entities are largely artifacts of low order
perturbative analysis involving perturbations which are largely
removable, if in theory, the observer should succeed in acquiring greater
knowledge of the system under observation. What uniquely
distinguishes virtual particles and fields from their real counterparts
does, perhaps, point to some of the properties of the medium with which
all quantum systems forever exchange energy, leading to the so-called
irremovable perturbations.
Therefore, the introduction of matter particles into a volume of
spacetime is not distinct in principle from creating these particles ab
initio from a portion of the vacuum energy already present within this
particular volume of spacetime; in an inertial frame of reference, a real
matter particle imparts an excitation energy to the vacuum such that a
particle identical to itself is created out of the fluctuating vacuum field
energy; at the same time the previous particle is destroyed, its mass-energy
providing the excitation energy necessary to re-create itself
anew. In an accelerated, or more generally, a non-inertial reference
frame, a particle’s mass-energy excites the vacuum field in a different
manner, continually producing a new variety of particles to take its
place. It has often been noted in the literature of modern physics that
particle production from the vacuum state is to be expected within
curved spacetimes. This leads us to the idea that merely localized
alterations in boundary conditions of the vacuum field in no way alters
the total energy density of the region occupied by the vacuum field, but
merely changes the ratio of mass-energy to vacuum energy from zero to
some value, approaching infinity in the case of black hole masses.
The general relativistic alteration in the local velocity of light may be
understood in terms of Mach's formula for the speed of sound in an
energy conducting medium in its application to the quantum vacuum.
Mach's formula states that the velocity of sound in an energy conducting
medium is a function of the pressure and the energy density of the
medium. Specifically, the velocity of sound in the medium is the square
root of the pressure of the medium times the speed of light squared
divided by the energy density of the medium.
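Mach's relation as just stated, and the weak-field reduction the passage goes on to claim, can be checked numerically under the identifications the text makes (pressure = Ev - Em, total energy density = Ev + Em, with the ratio Em/Ev playing the role of GM/Rc²; these identifications are the entry's own, and the sketch below merely verifies the algebra):

```python
import math

def v_sound(pressure, energy_density, c=1.0):
    # Mach's formula: v = sqrt(P * c^2 / E)
    return math.sqrt(pressure * c**2 / energy_density)

# with P = Ev - Em and Etot = Ev + Em, set x = Em/Ev (weak field: x ~ GM/Rc^2)
for x in (1e-3, 1e-6):
    v = v_sound(1.0 - x, 1.0 + x)   # units chosen so that Ev = 1 and c = 1
    # sqrt((1-x)/(1+x)) = 1 - x + O(x^2), matching the claimed (1 - GM/Rc^2)c
    assert abs(v - (1.0 - x)) < x**2
```

The agreement is only to first order in x, which is consistent with the text's restriction to the weak field limit.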
Since the pressure of the vacuum is equal to its energy density, and the
pressure of matter is effectively zero, the energy density and pressure
terms in Mach's formula are the total energy density and pressure of
space, respectively; the pressure of the vacuum is always equal to its
energy density, which decreases in step with the increase in the mass-energy
density. By letting the total energy density of space equal the
sum of the vacuum energy and mass-energy densities, i.e., Etot = Ev +
Em, and the vacuum pressure equal the modified vacuum energy
density, i.e., Ev' = Ev - Em, Mach's formula works out to vsound =
sqrt[(Ev - Em)c^2/Etot], which reduces to the result, vsound =
[1 - GM/Rc^2] x c, and this result is identical to the reduced local value of
the speed of light calculated from general relativity (in the weak field
limit). Our requirement of no spatial variation in the total energy density
of space, i.e., that the mass-energy and vacuum energy densities are
complementary, seems to demand that the density of gravitational
energy
If we are correct in reinterpreting the gravitational redshift
of photons propagating in a spatially varying gravitational potential as
being due to a spatial variation in the zero-point of the vacuum's energy
(against which the photon's energy is to be measured), then the
imposition of boundary conditions upon the vacuum field merely
produces local and discontinuous variations in the spatial (and temporal)
distribution of the field energy, leading to the appearance of negative
binding energies which exactly counterbalance the positive gains in
mass - energy which thereby result; it is in this sense that mass-energy
may be thought to occupy a "hollow" in the vacuum energy field and the
"displaced" vacuum energy has merely assumed a new form as mass-energy.
Meff = Mr/sqrt(1 - (c')^2/c^2), where Mr = binding mass and
Meff = effective mass. The binding mass stems from the sum of all (+)
and (-) non-gravitational binding energies. The accumulation of many
such discontinuous energy gradients submicroscopically leads to the
appearance, macroscopically, of continuous energy gradients in the
vacuum. Since the energy of the vacuum field owes its existence
entirely to the quantum mechanical energy uncertainty of spacetime, in
turn owing to the fact that the energy Hamiltonian is a function of
mutually incompatible observables, it follows that the vacuum field
shares in the general properties of quantum mechanical energy
uncertainty. One such property is that energy uncertainty is required for
any discrete change in a quantum mechanical observable; for example,
all changes in a physical system stem from the application of forces
upon the system while all fundamental forces of nature are mediated via
the exchange of virtual bosons between fermions composing the system.

Consequently, physical processes undergo temporal evolution only
insofar as they comprise quantum mechanical systems possessing finite
energy uncertainty, with the rates of the component processes
determined by the magnitude of system energy uncertainty. A fermionboson quantum mechanical system may be thought of as an
interconnected meshwork of temporal fermion energy transitions with
spatial boson momentum transitions, with the fermion wavefunctions
and boson wavefunctions being antisymmetric and symmetric,
respectively, so that increasing the density of interacting fermions and
bosons within a particular region of spacetime results in a decrease in
the energy uncertainty and increase in the momentum uncertainty of the
vacuum state, respectively.
November 1996

Any wavefunction may be alternately represented as a sum of
symmetric and antisymmetric wavefunctions. If one calculates the
probability density function for a wavefunction in this new
representation, one is tempted to give some physical interpretation of the
three distinct components which result.
X*X = Xsym*Xsym + Xanti*Xanti + 2 Xsym*Xanti
The first term represents the probability function resulting from the
mutual interaction of bosons while the second term represents the
probability function resulting from the mutual interaction of fermions.
The third term may represent the probability function resulting from the
interaction of bosons and fermions with each other.
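The three-term decomposition above can be verified directly for any function sampled on a symmetric grid (a purely mathematical check; the boson/fermion reading of the terms is the entry's own interpretation). Note also that the cross term, being an odd function, integrates to zero over a symmetric interval, so it contributes nothing to the total probability:

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 2001)     # symmetric grid, so psi[::-1] samples psi(-x)
psi = np.exp(-(x - 0.3)**2)          # an arbitrary real function, neither even nor odd

psi_sym = 0.5 * (psi + psi[::-1])    # even (symmetric) part
psi_anti = 0.5 * (psi - psi[::-1])   # odd (antisymmetric) part

# pointwise identity: |psi|^2 = |psi_sym|^2 + |psi_anti|^2 + 2 psi_sym psi_anti
assert np.allclose(psi**2, psi_sym**2 + psi_anti**2 + 2.0 * psi_sym * psi_anti)

# the cross term is odd, so its integral over the symmetric interval vanishes
dx = x[1] - x[0]
cross_integral = float(np.sum(2.0 * psi_sym * psi_anti) * dx)
assert abs(cross_integral) < 1e-10
```

The vanishing of the integrated cross term is worth keeping in mind when giving the third term a physical interpretation as a boson-fermion interaction probability.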
July 1997

In Fourier analysis, a function which satisfies the Dirichlet
conditions may always be represented as a Fourier sum of weighted
sine and cosine functions of the bounded variables. We note here that
this function may be represented as either purely even or purely odd, i.e.,
as either purely a Fourier sum of cosine functions or sine functions,
provided that the appropriate transformation of the coordinate system is
performed within which the Fourier expansion is to be computed. In
direct analogy to what has been said concerning Fourier analysis, we
may say that through a judicious transformation of the spacetime
coordinates, we may represent an arbitrary wavefunction as either of
purely even parity or of purely odd parity. The notable exception to this
is what we cannot do, however: take a wavefunction of purely even
parity and transform the coordinate system so that this function is now
represented as possessing purely odd parity, or vice versa. Continuing
with our analogy, we cannot represent a sine function in terms of a sum
of cosine functions and so on. We cannot do this, as was said, through a
transformation of the spacetime coordinates, however, an odd function
can be readily converted into an even function and vice versa through
the mere addition of a phase factor (of pi/2) within the argument of the
function we wish to transform. We know that if an operator does not
commute with the Hamiltonian operator, then the observable
corresponding to the first operator cannot be a conserved quantity.
Conversely, any operator which commutes with the Hamiltonian will be
tied to a change in the total energy of the system if this operator itself
suffers any changes. It is well known that parity is conserved within the
theory of both the electromagnetic and strong nuclear interactions. This
is all to suggest that the judicious insertion of phase factors into each
momentum and energy eigenfunction may result in a transformation of
the momentum-energy eigenfunction, Psi(x,y,z,t), without altering the
momentum-energy tensor, Ti,k, itself. (See spontaneous symmetry
breaking in gauge theory and its artifact exchange boson, which results
from the transition from global to local field symmetry.) This is just
saying that the
wavefunction representing the quantum mechanical system with
momentum-energy tensor, Ti,k, is itself degenerate with respect to the
phase. We may deduce from this that matter cannot exist in either a
purely fermionic or purely bosonic state. Otherwise, we would be in a
position to alter the tensor, Ti,k, describing this matter distribution,
through a non-coordinate transformation, namely, through the mere
introduction of an arbitrary nonperiodic phase factor into the energy
eigenfunction representing this mass distribution. This would constitute
a stark violation of the Equivalence Principle of General Relativity
which implies that each distinct stress-momentum-energy distribution,
as represented by T, uniquely correlates to a distinct curvature of the
spacetime metric. To wit, matter must always exist as a mixed system of
fermions and bosons, namely, any given real matter distribution must be
described by a wavefunction which is neither purely symmetric nor
purely antisymmetric. This figures into the necessity of any quantum
field giving rise to virtual particles.
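The phase-factor remark above (that adding pi/2 to the argument converts an odd function into an even one) is easy to confirm for the basic Fourier components:

```python
import numpy as np

x = np.linspace(0.0, 2.0 * np.pi, 1000, endpoint=False)

# adding a phase of pi/2 turns the odd function sin into the even function cos
assert np.allclose(np.sin(x + np.pi / 2.0), np.cos(x))
# a further pi/2 turns cos into -sin, flipping the parity back again
assert np.allclose(np.cos(x + np.pi / 2.0), -np.sin(x))
```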
November 1996

By calculating expectation values for various observables for
the quantum vacuum, such as <p>, <E>, <p2>, <E2>, etc., we may be
able to exploit our intuitions about what X*qX(vac), where q is the
observable in question, must be in order to guess at the probable
relationships of these various vacuum expectation values.
The relativistic effects upon kinematics (space and time) are grounded in
the relativistic effects upon the dynamics through the conservation of
momentum-energy. We believe, for instance, that the relativistic
contraction of the positional uncertainty of a particle, say, and the
relativistic time dilation (of the particle's lifetime, if it is unstable), do
not lie behind the dilation of Δp and contraction of ΔE, respectively,
through the Heisenberg uncertainty relations. This would be to ground
dynamical effects in mere kinematics. Rather, the kinematics should be
grounded in the dynamics: the effects on space and time are
epiphenomenal to the substantive effects associated with the
conservation of momentum-energy. This is thought to take place
through the Heisenberg uncertainty relations for position/momentum and
time/energy. Changes in the components of the momentum-energy
tensor cause alterations in the tensor of stress-momentum-energy
uncertainty. We may suppose that the presence of real fermions reduces
the number of available vacuum fermionic energy states while the
presence of real bosons increases the number of available virtual bosonic
momentum states, relative to the reduced number of virtual fermionic
energy states. In this manner, more virtual energy transitions occurring
within the vacuum state must be effected via similar transitions
occurring within the massive body in question. This situation is
consistent with the effect mass has upon the surrounding vacuum of
simultaneously decreasing the energy uncertainty and increasing the
momentum uncertainty radially about the gravitating massive body. A
general result of the preceding discussion is that the accumulation of
mass - energy, more particularly binding energy, within a volume of
spacetime causes a corresponding reduction in the density of energy
uncertainty (vacuum energy), in turn resulting in a corresponding
decrease in the rate at which physical processes occur within this
particular region of spacetime. How are we to understand so-called
energy-degenerate transitions within the vacuum state, which is to say,
transitions within the vacuum state not involving a change in the
vacuum's energy?
The degenerate wavefunctions represent the
possibility of change which falls outside of the physically temporal.
An important question in this connection is whether gravitational time
dilation shall have any effect upon the frequency of energy degenerate
transitions. Is the density matrix an approximation made in lieu of the
actual wavefunction which we are for merely practical reasons unable to
specify, or does a quantum system sometimes not possess a bona fide
wavefunction at all? What relation does the 2nd rank tensor relating two
different virtual particle current densities have to the momentum-energy
tensor of the metric tensor of GR? Would an exceedingly
intense beam of coherent electromagnetic radiation (laser beam) result in
a kind of anti-squeezed state? This might have the precisely opposite
effect to that of the Casimir Effect which normally induces an expansion
of the momentum uncertainty along two orthogonal directions to the axis
along which the conducting plates are oriented. A question here is
whether the momentum uncertainty along the time axis (the energy
uncertainty) is also dilated due to a squeezing of the momentum
uncertainty between the plates. The token-reflexives, here and now,
seem to presuppose the token-reflexive, I, or me. Conversely, the
token-reflexives, I, or me, seem to equally presuppose the token-reflexives,
here and now. This seems to suggest that the nonlocal connections,
manifested in the relations of virtual particles/fields to abstract
spacetime may also be essential in mediating the individual
consciousness of observers interacting with spacetime. Within the
context of an expanding universe, then, matter does not merely alter the
density of the vacuum, but also alters the rate at which the density of the
vacuum energy decreases with time due to cosmological expansion, and
since the time rate of change in energy density is, itself, a physical
process, matter, by reducing the energy uncertainty of the vacuum, also
causes a radially varying vacuum field energy density which manifests
itself as a spherically symmetric energy gradient centered about a mass
which is identical to the gravitational field! July 2011 There is an
exponential relationship involved here with the effect of cosmological
expansion upon the time rate of change of vacuum energy density. The
discrete cosmological redshift may be understood in terms of the model
suggested here, c.f., WKB approximation of electron tunneling problem.
Certain versions of Modified Newtonian Dynamics (MOND) call for a
gravitational “constant” with an exponential factor giving rise to the
small anomalous constant acceleration which has been observed
producing inaccuracies in the charting of deep space probe trajectories,
c.f., Pioneer Anomaly and the papers of au=P. W. Anderson.

Changes in the composition of the total energy density of a region of
space with respect to the proportions of mass - energy and vacuum
energy are reflected in the transformation of the spatio-temporal
variation in vacuum energy density from being purely temporal, in the
case of free space, to a mixture of two parts, temporal and spatial, in the
case of typical distributions of matter, to a purely spatial variation of
vacuum energy density, in the case of black hole masses; and there is a
homologous mapping between the degree of tipping of the Minkowski
light cone in curved spacetimes and the degree of transformation of a
temporally varying vacuum energy into one which is purely spatial in its
variation. Within curved spacetimes, the local value of the velocity of
light is reduced below its normal value in "free space," and this may be
envisioned as a narrowing of the hypersolid angle swept out by the
Minkowski light cone centered at a given point within this region
possessing a gravitational potential. This contraction in the area of the
hypersurface of the Minkowski light cone may be alternately described
in terms of a light cone which suffers no contraction of its hypersurface
area, but a decrease in the uniform density of vacuum energy occupying
the uncontracted light cone surface, and hence the equivalence of the
spacetime curvature with the spatiotemporal variation in vacuum energy.
If we are correct in positing an exact equivalence between
spacetime curvature and spatio-temporal variations in the density of the
vacuum's zero-point energy, then the phenomenon of particle production
in a spatially or temporally varying spacetime curvature, or via the
equivalence principle, due to the effects of noninertial motion, may be
explained alternatively in terms of spatial or temporal variations in the
boundary conditions on the vacuum field such that spatial or temporal
variations in its zero-point energy result. In this scenario, the existence
of real particles is understood as just a manifestation of zero-point
energy from the vantage point of a noninertial frame of reference or
equivalently, from the standpoint of a region of the vacuum possessing
"less restrictive" boundary conditions than the region of the vacuum in
which the particles appear. On account of the precisely thermal
spectrum of the particles produced within curved spacetimes and also
due to the unique requirement of a thermal spectrum for the vacuum
itself in order that it possess Lorentz invariance, an entropy may be
meaningfully assigned to both the vacuum as well as the particles
produced from it as a result of the imposed vacuum boundary conditions.
June 2011

There must be a degeneracy in how energy uncertainty in the
vacuum is transformed into 3-momentum uncertainty such that entropy
is increased, i.e., there is a choice among alternative ways to effect this
transformation that is not determined by the initial and boundary
conditions. For example, lifting a particle out of a gravitational potential
and allowing it to fall back to its starting point involves a
path-dependence in addition to creating an “open loop” in spacetime and so
necessarily involves the production of some entropy. “The equivalence
principle strongly suggests that freely falling motion in a gravitational
field should be viewed as analogous to inertial motion in pre-relativity
physics and special relativity,” c.f., cit=Teaching General Relativity,
arXiv:gr-qc/0511073v1, 14 Nov 2005. So what determines the local
velocity of light does so by determining the local velocity of
cosmological expansion. See the relationship between quantum
entanglement, polarization and magnetization, electric field permittivity
and magnetic field permeability, Heisenberg 3-momentum and energy
uncertainty, spin-0, spin-1 and composite spin-2 correlated vacuum
fluctuations, Einstein’s causality principle versus the so-called Bohm’s
causality principle.
Since this production of particles from the vacuum state due to
imposed boundary conditions is a reversible process, because the
particles are reabsorbed if the boundary conditions are later removed, the
change in the entropy of the vacuum field must be exactly compensated
by the entropy increase due to the particle creation so that the total entropy
of the particle - vacuum system is a constant. The Feynman path
integral technique for calculating the ground state energies of atoms may
( in principle ) similarly be utilized to calculate the ground state energy
of the vacuum state of free space or, indeed, the vacuum state of a region
of space in which a gravitational field is present. It is probable that
fewer paths comprise the Feynman integral where a gravitational field is
present than in the free space vacuum; this limits the number of valid
available paths along which energy may be exchanged between two
points in this particular region of spacetime - hence the reduced value of
the integral, and in turn, the decreased value of the vacuum state energy
in this region. The reduced number of Feynman paths, or histories,
means that the vacuum's ability to exchange energy with itself, as well
as its ability to exchange energy with particles and fields, and thusly to
mediate the exchange of energy between particles and fields among
themselves, is correspondingly diminished so that the rate at which the
vacuum's energy density decreases with time ( due to the expansion of
the universe ) is likewise diminished.
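For reference, the textbook (Euclidean) path-integral expression for a ground-state energy, which the passage above proposes to extend to the vacuum itself, reads, up to normalization of the measure:

```latex
E_0 \;=\; -\lim_{T\to\infty} \frac{\hbar}{T}\,
\ln \int \mathcal{D}x(\tau)\; e^{-S_E[x(\tau)]/\hbar},
\qquad
S_E[x] \;=\; \int_0^{T} d\tau \left[ \tfrac{1}{2}\, m\,\dot{x}^2 + V(x) \right].
```

The sum over paths here is the standard quantum-mechanical one; the claim that a gravitational field reduces the number of contributing paths is the author's conjecture, not part of the standard formalism.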
In light of the diminished self-energy of the vacuum, the resultant
increased inertial mass of particles within this altered vacuum may be
viewed in two distinct, but fundamentally similar ways. First, the
diminished capacity of the vacuum to undergo energy exchange with
itself means that it is more difficult for the gravitational field energy to
redistribute itself in response to changes in the matter distribution within
the altered vacuum state; consequently, by the general equivalence of
gravitational and inertial masses, it follows that there is an equal
difficulty for matter configurations to change their distributions in
response to impressed external forces attempting to accelerate these
mass configurations. This is further theoretical evidence for the
complementary relationship between the mass energy density and the
vacuum energy density which together define the total energy density of
any particular region of spacetime. Moreover, if there are already
existing particles both prior and subsequent to the imposition of the
vacuum boundary conditions, then the masses of these previously
existing particles are expected to increase in accordance with the decrease
in the vacuum energy density (and vice versa); this is consistent with
viewing particle production more generally as an increase in mass within
the region of varying vacuum energy - as the conversion of vacuum
energy into mass - energy: the fraction by which particle masses are
increased in transporting them from a region of higher vacuum energy
density to one of lower density must complement the fraction by which
the vacuum energy density decreases between these two points.
This means that the maximum density of mass possible within a certain
spherical region is equal to the maximum density of particles which may
be created from the vacuum energy occupying this region, via excitation
of the vacuum state. We arrive at the interesting result that the density
of the vacuum energy in a certain spherical volume of free space (where
no mass-energy is present) is precisely equal to the mass-energy density
of a black hole which could possibly occupy this same volume. One
important idea which suggests itself within the context of this discussion
is the famous cosmological constant problem and the discordant
interpretations of it within quantum theory and general relativity theory.
There is a 46-order-of-magnitude discrepancy between the calculations
of the value of this constant within these two theories, hence the
profound difficulties in developing a consistent theory of quantum
gravity! Now if the energy of the vacuum is interpreted as suggested by
the work of Sakharov and more recently by the zero-point energy
gravitation theory of Hal Puthoff then rather than being, itself, a source
of gravitational fields, like particle or field energy, the energy of the
vacuum would merely be the mediator of gravitation so that differences
in gravitational potential would correspond exactly to differences in the
energy density of the vacuum at two different points in spacetime. A
uniform distribution of vacuum field energy would therefore have no
more effect upon matter particles within this energy distribution than
would a series of concentric mass shells upon the matter particles
contained within them; which is to say, no effect whatever, and this due
to the precise mutual cancellation of the combined perturbations to the
matter particles by the fluctuating vacuum energy field. Thus, only
differences in vacuum energy density would have any meaning so that
the overall vacuum energy density would play no role in the definition
of Einstein's cosmological constant, and there would be no necessity of
postulating a unique exchange particle mediating the gravitational force;
gravity would not in this case be viewed as a fundamental force as are
the electromagnetic, strong and weak nuclear forces, but would be
understood as a "parasitic" force stemming from the imposing of
boundary conditions upon the combined vacuum electromagnetic, strong
and weak nuclear fields which together owe their existence to the
fundamental energy uncertainty of the vacuum state, described by an
energy Hamiltonian which is a function of incompatible observables.
The pure imaginary momentum of all "rest masses" within the
4-hyperspherical cosmological model may be justified beyond its value as
a convenient mathematical formalism if these masses are viewed as
presently being in the act of tunneling through a hyperspherically
symmetric potential barrier. The gradient of this hyperspherical
potential would be a four - vector with components 1,2, and 3 vanishing
in free space, but transforming through multiplication by a tensor into a
new four - vector with non-vanishing spatial components, resulting in
the appearance of a gravitational field. Certainly this tensor is the
matter-stress-energy tensor described in the field equations of Einstein;
the only difference is that the vacuum energy does not contribute to the
value of T, the matter-stress-energy tensor, which is responsible for
altering the metric tensor which describes the curvature of spacetime, or
alternatively, the spatiotemporal variation in the vacuum field energy
density. It is perhaps now easier to see at an intuitive level why the field
equations of general relativity predict the existence of a universe which
is either globally contracting or expanding: unless the energy density of
the vacuum field is temporally varying in free space, the
matter-stress-energy tensor operates upon a zero four-vector (representing the gradient
of the hyperspherical potential) and the introduction of matter
distributions, represented by the matter-stress-energy tensor, into this
vacuum field, cannot produce a non-zero four-vector, namely,
non-vanishing spatial components of the free space four-vector, i.e., a
gravitational field.
Within this particular cosmological model, the
energy, linear 4-momentum, and angular 4-momentum of a particle is
always conserved, regardless of motions or accelerations which it might
undergo as a result of interactions with other particles and fields.
We are saying here that gravitation is, itself, a four-vector, whose
magnitude is always conserved independently of the matter distribution.
The matter-stress-energy distribution within a particular volume of space
merely alters the decomposition of this four-vector into a new set of
vector components in much the same way that a boost, rotation or
translation produces a new decomposition of the Minkowski four-vector
which describes the instantaneous world segment of a particle; hence,
matter distributions manifest themselves as tensor fields in spacetime. If
the gravitational field owed its existence to the presence of
matter-stress-energy distributions in spacetime, then we would certainly describe the
gravitational field as being itself a tensor field; however, the
gravitational field is actually a conserved four-vector (in the sense that
the magnitude of this vector is conserved), and this four-vector owes its
existence to the inverse square decrease in the vacuum's zero-point
energy density in combination with the inverse cubic decrease in the
mass-energy density which results due to the process of cosmological
expansion. The action of matter distributions, however, must be
described in terms of a tensor field; again, the gravitational field, itself,
is not a tensor field; the action of mass upon this field is, however,
tensorial in nature. As we know, from the many discussions of attempts
to produce quantum gravity theories, quantization of a 2nd order tensor
field results in the appearance of a spin 2 boson which acts as the unique
exchange particle mediating the tensor field. The four-dimensional
zero-point energy gradient does not transform itself with time in free
space in a manner which necessitates a tensor description; consequently,
gravitons will not be present in free space as vacuum field fluctuations;
however, any valid theory of quantum gravity (assuming one is possible)
demands, along with the uncertainty principle, that the total vacuum
field contain virtual gravitons in its mix of fluctuating energy, but
because a tensor does not describe the transformation with time of the
free space vacuum, the quantization of the total free space vacuum field
cannot include spin 2 particles, which is to say, the free space quantum
mechanical vacuum does not possess virtual gravitons and hence does
not possess (per se) gravitational field fluctuations. Consequently,
gravitons do not exist in regions where matter distributions are present
so that the search for gravitational waves must turn out to be a fruitless one.
Another way in which the imaginary coefficient may be justified is
to note that the rate at which the vacuum energy density decreases with
time is proportional to the vacuum energy density itself, just as are the
time rates of all physical processes, so that if the vacuum energy density
is reinterpreted as its probability density (in terms of the square of the
vacuum wavefunction amplitude), then the negative exponential time
evolution of the vacuum probability density implies that the vacuum has
a purely imaginary four - momentum with a four velocity of magnitude
c. The effect of accelerations, for instance, upon a particle is merely to
change the distribution of its total linear/angular momentum within the
conserved 4-quantity. The perihelion shift in the orbit of Mercury,
predicted by general relativity, may be simply understood as a cyclic
redistribution of the planet's 4-angular momentum as it moves around its
orbit so that the 3-dimensional projection of its 4-angular momentum
varies sinusoidally with the orbital period; this causes Mercury's
3-angular momentum to be slightly greater than that predicted by classical
mechanics, producing the observed advance in perihelion. The black
hole, as noted earlier, represents mass-energy in its most compressed
state. For maximum symmetrical energy exchange between any two
shells occupying a given volume of matter ( of uniform density ) where
the density of vacuum energy exchanges is proportional to the density of
the vacuum energy itself, we require that the density of mass-energy
decrease with the inverse square because certainly the density of
bundled energy trajectories (along which all energy exchanges occur)
must also fall off with the inverse square due simply to the geometry of
spherically symmetric radiation of energy in 3 dimensions. We expect
the density of exchange energy, due to vacuum field fluctuations, to be
proportional to the density of energy so exchanged because it has
already been established that the rate at which all physical processes
occur is proportional to the density of Heisenberg-uncertain energy
(vacuum energy) and the decrease in the density of this energy with the
expansion of the universe is itself a physical process; moreover, there is
a vectorial continuity equation, analogous to a field equation of
Maxwell's, which describes the relationship of spatial and temporal
variations in the density of the vacuum field energy so that the spatial
variation of this zero-point energy will have the same structure as the
temporal variation of the zero-point energy due to cosmological
expansion. The question then arises, "what is the structure of this
variation in vacuum energy density in free space, where no mass-energy
is present?"
Well, the density of the vacuum zero-point energy is only meaningful as
a physical quantity in relation to the density of the mass-energy just as
the energy of a particle is only meaningful in relation to the energy of
the vacuum state, so the general time variation in the mass-energy
density due to cosmological expansion should give us a clue to the
manner in which the vacuum energy density changes with time;
provided that our hypothesis of a dynamic vacuum energy mechanism
for gravitational fields is fundamentally correct. Therefore, if we
postulate this vacuum mechanism, then it is clear that the time variation
of the vacuum energy density in the universe due to cosmological
expansion must be such that the ratio of the temporal variation in
vacuum energy and mass-energy has the same mathematical structure as
the spatial variation in the ratio of these two densities about massive
bodies which acts as a gravitational potential. Since the gravitational
potential decreases inverse linearly so that the strength of the
gravitational field itself decreases with the inverse square, and since the
density of vacuum energy (zero-point energy) must be smaller in
stronger gravitational potentials than at weaker ones because
gravitational time dilation increases in step with the increasing potential,
it follows that the ratio of mass-energy to vacuum energy must decrease
inverse linearly to mimic the inverse linear variation in the magnitude of
gravitational time dilation; remember that gravitational time dilation is
owing to a decrease in available exchange energy with which all
physical processes are mediated. Hence, since the decrease in
mass-energy is with the inverse cube, the decrease in vacuum energy must
itself be with the inverse square. At this point we note that the decrease
in black hole energy density is with the inverse square of black hole
radius. We are therefore led to think of a black hole as constituting the
maximum density of mass-energy possible in the sense that all energy
exchanges occurring within the volume of space occupied by the black
hole, occur between the black hole and itself, symmetrically, with no
exchange energy left over to mediate matter-vacuum energy exchanges.
This is presumably why the intensity of gravitational time dilation
is infinite at the surface of a black hole; the vacuum energy fluctuation
field (zero-point energy) no longer interacts with the black hole mass so
that no physical processes (which can be communicated to the "outside")
are mediated. As stated earlier, it is the interaction of the vacuum
zero-point energy with quantum mechanical systems which is wholly
responsible for all changes in the quantum mechanical observables in the
system, i.e., the temporality of the system.
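The claims just made, that gravitational time dilation becomes infinite at a black hole's surface and that black-hole energy density falls with the inverse square of the radius, correspond to standard Schwarzschild quantities. The sketch below is textbook general relativity with approximate constants, offered only as an illustrative check; the function names are mine, and the inverse-cube/inverse-square exponent bookkeeping at the end is the author's conjectured scaling, not standard cosmology.

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2 (approximate)
c = 2.998e8     # speed of light, m/s (approximate)

def schwarzschild_radius(mass):
    """r_s = 2GM/c^2 for a mass in kilograms."""
    return 2 * G * mass / c**2

def time_dilation_factor(mass, r):
    """sqrt(1 - r_s/r): proper time per unit coordinate time for a static
    observer at radius r > r_s; tends to 0 as r -> r_s, i.e. 'infinite'
    gravitational time dilation at the horizon."""
    return math.sqrt(1 - schwarzschild_radius(mass) / r)

def mean_density(mass):
    """Mean density of a mass inside its own Schwarzschild radius.
    Since r_s is proportional to M, this scales as 1/r_s**2."""
    r_s = schwarzschild_radius(mass)
    return mass / ((4 / 3) * math.pi * r_s**3)

M_sun = 1.989e30  # kg
# Tenfold mass -> tenfold r_s -> mean density falls by 100x (1/r_s^2):
print(mean_density(M_sun) / mean_density(10 * M_sun))  # ~100
# The text's conjectured exponents: mass-energy density ~ r^-3,
# vacuum energy density ~ r^-2, so their ratio ~ r^-1,
# mimicking an inverse-linear gravitational potential:
print((-3) - (-2))  # -1
```

The 1/r_s² behavior of the mean density is what the passage above appeals to in identifying a black hole with the maximum compression of mass-energy.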
The theory of quantum
electrodynamics explains the propagation of fermions and bosons in the
following manner: a massless photon propagates through spacetime by
continually transforming into an e+e- pair and back again into a photon
of identical energy (assuming a flat spacetime), while an electron
propagates through spacetime by continually transforming into

I thought I would write you, first of all, to thank you for the interpretive
astrological chart you mailed to me a couple of days ago. I could have
done so by telephone, but somehow it's more sincere to reciprocate by
writing you - thanks so much for the time and consideration you must
have put into constructing it, as well as explaining its obviously
portentous, if to me somewhat cloudy significance. Even if it turns out
that you have enlisted the aid of a computing device in interpreting it, I
remain flattered, nonetheless, by the obvious attention. I have always
known that the horoscopes appearing in your average city newspaper are
very much like fortune-cookie messages. They are so ambiguous that,
combined with human suggestibility ( cause for the unreasonable
effectiveness of natural languages ), the component words can't help but
conspire to form an intriguing personal insight, fleeting, as it usually
turns out. In the case of this "a boy's first astrological reading," I'm
sentimental in thinking that my chart, lovingly prepared by you, does, in
fact, contain some important hints and warnings which I might do well
to meditate on.
Your claim, through the chart, that I need to "do more Leo things,"
and that I need to seek out persons with a lot of energy in their earth and
air signs in order to balance the plethora of energy flowing from my
water trine (sp?), seems to be really good advice. Also, the fact that the
north node of my moon, representing my path of highest potential, is
opposed by Saturn, representing the influence of my father, or perhaps
fatherly influences, appears to explain chronic problems I've had in the
past in self-definition and development. I confess, Leslie, that I haven't
given enough thought to the other messages in my chart. I have already
ordered the book, Inner Sky, from Eliot's, and I've promised myself that
I will return to the study of my personal chart anew, once I feel I've
gleaned enough of a working knowledge from reading, or skipping
through the book. Getting back to the ambiguity question. Astrology
probably possesses too few built-in constraints upon its interpretive
procedures, the number and variety of which have steadily increased
over the millennia, compared to its relatively smaller and constant
number of possible symbolic structures - notwithstanding the discovery
of a few new planets. I nonetheless feel that it is valuable as a metaphor
in a couple of different ways. Firstly, it acknowledges the fact, kept
secret since the Enlightenment, I think, that the development, or
unfolding, if you will, of history is by no means linear or logical, but at
once cyclical and suprarational: history, both that of civilizations as well
as that of individual persons, repeats itself interminably, though never
exactly in the same way twice. It is an inextricably intertwined dance
between the act of creation and the act of interpretation. If History
seems so amenable to a systematic interpretation, it is only because so
many of the great human figures who have played a role in shaping it
have, themselves, been serious students of history. Secondly, the social
relationships which form between persons place them in various mutual
orbits which help to realize or inhibit their multiple though not
altogether consistent potentialities, determining more fully their identity
and, hence, their life's fate. Assimilating the basic handful of distinct
personality archetypes to their analogous astrological signs opens up a
rich interpretive structure within which the transformative or stultifying
effects of each personality type upon the other may be predicted and
explained. The positions of the various planets within our Solar system
move along cyclical courses, to be sure, but they never exactly repeat
any of their previously held configurations - contrary to what the best of
eighteenth century science might have had to say about it. This seems to
grace existence with a richness of multiple potentialities.

And now, we will practice Microsoft Word by taking notes on Jacques
Derrida, "The Retrait of Metaphor," which was published in ENCLITIC
2:2 (1978): 5-33. As we know, the editors point out that the article works
with two semantic systems for the word RETRAIT, which has a variety
of meanings in French.
The reason that words tend to have myriad meanings is that words
coined to denote things or activities within some original context are
borrowed for use in an unfamiliar or less familiar one. But the first
denotative terms were actually metaphors, since various images were
being assimilated over time (in the subjective experience of primitive
man) to the notion of a thing which appeared and reappeared. So objects
are distilled out of the flux of experience in the developing feedback
between the infant and his environment and constitute a kind of reified
metaphor. October 2014 For example, just contemplate how the word
“togetherness” transcends its merely denotative description in the heart
of a parent when applied to the photo she is beholding of her young
daughter hugging a new puppy. September 2011 au=Nozick notes that
Frege held the view that “concepts cannot be referred to (as

And this is what the Self really is. This is somewhat Kantian and is
along the lines of what Piaget says about cognitive development in the
sensorimotor stage: the infant learns through interacting with a reactive
environment that the image of her hand moving, the sound it makes
when it strikes a mobile hanging over her crib, the feeling in her hand,
the kinesthetic sensations in the arm and shoulder muscles are all part of
the same “thing.” This integration of sensation has to be learned from
experience. Existence definitely precedes essence and objects in the
external world and the Self emerge from the flux of dissociated
sensation simultaneously. We commonly hear of the “thrownness” of
the individual. It is more correct here to speak of the thrownness of the
self and its world together in a single act. Information never once
entered one’s developing infantile brain. Rather, data streamed in the
form of trains of neural impulses into one’s developing brain via the
various sensory nerve channels and were interpreted in terms of
context-providing structures, which were in the midst of being developed by you
or, rather, these context-providing structures and your ego/self were
developing hand-in-hand simultaneously.
Metaphor represents the right brain version of the left brain/analytical
activity of the instantiation of abstract categories. September 2011 Metaphor
cuts across the established lines along which abstract categories are
fashioned and seeks a larger umbrella category for two or more contexts
whereas abstraction or abstract thought consistently works within a
single context. This distinction is akin to that between disciplinary and
interdisciplinary theoretical research.
There is always somewhat of an insight involved in the use of metaphor
and it’s a linguistic competency not likely to ever be equaled by a
machine. When we learn a language this latent structure of metaphor
lying at various levels beneath the surface of language is subconsciously
assimilated and it conditions and delimits all thought, even at its most
creative. Especially then.

“Rich as the English language is in media of expression, it is curiously
lacking in terms suitable to the conveyance of abstract philosophical
premises. A certain intuitive grasp of the subtler meanings concealed
within groups of inadequate words is necessary therefore to an
understanding of the ancient Mystery Teachings”, c.f., The Secret
Teachings of All Ages.
Now mostly what Heidegger does when he's doing metaphysics is to
unearth this latent structure by going back, he thinks, much closer to
where it originated. Usually in the Greek. When you read Heidegger
you realize that, at bottom, that's all metaphysics really is - it's just
archeology of language, the "mining" of latent metaphors which are
masquerading as purely denotative concepts or logical categories. When
I'm doing metaphysics, I always feel that I'm not completely in control
of what I'm thinking and sometimes I feel like I'm more or less a passive
vessel into which insights flow and intuitions crystallize. And that's
because I think that I'm supposed to be utilizing clear and distinct
categories although I'm really utilizing metaphor. All of the time, in fact.
This is why logocentrism doesn’t really work. And logocentrism is
itself the kwd=reification of a metaphor and does not really qualify as a
truly denotative concept. Deconstruction deconstructs itself. The
statement that absolute truth is false cannot be absolutely true!
Deconstruction is a giant case of question-begging, I think.
September 2014

Stuart Young IMHO, "postmodern" is merely a less-consistent, superficial regurgitation of Eastern
philosophy, which doesn't give credit where credit is due. In reality, it is full of "maya," yet thinks of itself as enlightened.
It also throws out the scientific method (no doubt thinking that it's a product of "dead White male culture"). It thinks of
everything as "relative" - but itself.

I am of two minds on this question. I think that there has to be some
insight at the bottom of this intellectual movement, but I also sense that
there may be an organic defect within it, as well. However, if the human
mind indeed possesses the seemingly paradoxical ability to, *per
impossibile*, step outside itself so as to deftly and advantageously use
logically incoherent formulations and invalid deductive steps to
sometimes put itself onto a path to truth that is otherwise inaccessible to
empirical fact, logic, and common sense, then the borrowing from
mysticism is doubly noted; in any case, I don't believe that
postmodernism and deconstruction can be wholesale dismissed out of
hand. The great Austrian logician Kurt Gödel’s proof that truth is
a stronger notion than provability (logical provability) has brought me to
this liberal attitude toward creative uses of invalid lines of argument.
Professional chess players, it has been noted (by Grandmaster Larry
Evans, I believe) calculate moves efficiently and with greater depth and
accuracy as they ascend the rankings from A Player to Expert to
Candidate Master to Master, all the way up to International Master, but,
according to GM Evans, something remarkable happens as one makes
the transition from IM to GM. One ceases to calculate moves and
begins to play by instantaneous pattern recognition. The GM kind of
"groks" the correct move. (Notwithstanding, I believe, Kasparov's 1997
defeat by Deep Blue.)
In the instantiation of a concept or logical category, the grasping of a
particular is prefigured in the pre-existing concept or form which is not
expanded or enlarged through this re-cognition under the concept of one
of its concrete particulars. In metaphor, however, there is a creative
interpretation of the unknown or unfamiliar through the importation of a
contextual web of associations (based in experience) as opposed to
logical relations or abstract categories. A static, stable order in the old
context becomes a dynamic ordering principle whenever it is
transplanted into the new context. The dynamism is generated by the
reactivity of the new context as ground into which a seed or viral
contaminant of foreign meaning is introduced. In metaphor an inductive
as opposed to a deductive step is taken which enlarges the original
category that was borrowed. And all of the entities treated of
denotatively are, as alluded to already, metaphorical constructs. This is
what makes metaphor open-ended and irreducible in the scope of its
action, as well as translogical. Because logic presupposes metaphorical
relationships, the process by which formal categories are
generally brought about cannot itself be given a formal description;
which is to say, no formal description can be given for how formal
descriptions are generally brought into being. Metaphor, which is
prelogical, underlies the production of all formal categories/abstract
concepts. This idea seems to support Alan Watts’ critique of the
presumably inviolate principle come down to us from the ancient Greek
philosophers of ex nihilo nihil fit. September 2011 “Discover all that you are
not — body, feelings, thoughts, time, space, this or that — nothing,
concrete or abstract, which you perceive can be you. The very act of
perceiving shows that you are not what you perceive.” [italics mine] Sri Nisargadatta Maharaj, I am That.
One of the systems has to do with retreats, retracings, withdrawals, and
so on: leading to questions of economy, pathway, passage, and
circulation. The second system has to do with erasure/rubbing but also
usury, by which use and wear BUILD UP or increase value/meaning.
There are two kinds of metaphor - two senses in which metaphor is a
"Retrait." The first is the interpretation of the new in terms of the old.
The second is the reverse of this: the reinterpretation of the old in terms
of the new, such as a metaphor, suggested, for example, by new social
relations enabled by developments in technology. An example of this
might be the drawing of an analogy between the rise of the Internet and
the World Wide Web's impact upon postmodernity and the
social/cultural impact of the printing press upon the Renaissance in
Europe (in terms of the freeing-up of individualism). By making of
history a Palimpsest, we make the transition (passage) into the future
less discontinuous and more comprehensible.
March 2014

One definite advantage of the Internet for the creative process
that is usually overlooked is its value in inspiring lone creativity
that would otherwise have second-guessed itself into quietude, even in
the absence of forum-like context. Similar to how astrophysicists are
able to “observe” stellar evolution “over the course of billions of years”
during a career of 30 or 40 years, one can simulate communication and
collaboration through the deft posing of questions to a sophisticated online

search engine. In this way the ego and the personal are more or less
altogether bracketed out. Search engine serendipity is an art. The
bracketing out of common, ordinary and routine search terms, in
combination with the use of terms as “targeting tags” with which these
general search terms are usually associated, is an online search strategy
that is more likely to yield the kind of surprising and counterintuitive
search engine results capable of stimulating new directions in thought, if
not an altogether breakthrough insight, than is the mere stacking of
search terms in an attempt to progressively sieve information on the
Web to unearth that wondrous clue to one knows not what.
“Wornness, worn-outness, will be important here as well, since Derrida
will be talking about metaphor as something old, something coming near
its end. Is Derrida talking about the ending of History in the sense of the
end of grand narratives?”
Myth is metaphysics clothed in metaphor. The most fundamental myth
is that of the Ego or Self-consciousness. The Ego is the most
fundamental of myths because it represents the operation of metaphor at
its most fundamental: consciousness is an unbounded flux which is in
continual change along a determined but not predetermined path. The
Ego possesses continuity throughout this fluxion despite its always being
the artifact of an ever changing ground. The Ego always manages to
reconstitute itself as such against this changing, grounding flux of
altering consciousness. The Ego in the present moment is always the
importation of a structure from the previous momentary ground
(consciousness) into a new one all the while remaining the self same
Ego. Sorry if I’m belaboring the obvious.
Derrida begins by pointing out that metaphor works with these notions
of passage and circulation: inhabiting, transporting oneself, passing
through, and so on: all of this is of course is good for poetry in general,
and given my fixation, for Vallejo in particular. A key initial idea is that
while we think we "use" metaphor, it in fact comprehends us, sweeps us
away, displaces us: we aren’t like a pilot in his ship, we’re DRIFTING,

The importation of the structuring of the old ground from the preceding
moment manages always (or almost always) to impose a new structure
upon the newly emerging ground which returns the Ego to itself. This
return of the Self to Itself continually, all the while the ground of
consciousness fluctuates underneath it, represents the power of metaphor
in its greatest generality.
For this reason we might term Mind the metaphor of all metaphors. And
that is inevitable, for no speech is possible without metaphor.
[It is not clear to me why Derrida thinks metaphor is coming to the end
of its life he says it’s old, does he say how he knows it’s almost
“retiring” (he says it is retiring)?] But here comes something: because
it’s old, it has MORE and not less weight: a lot is attached to metaphor.
Metaphor is "a suspensive withdrawal and return supported by the line
(TRAIT) delimiting a contour" (9) [this again is good for Vallejo].
Now he asks why we privilege Heidegger’s text (he doesn’t say which
text) on this topic. It seems to be because of H’s concentration on
TRAIT, in the sense of line, the "tracing incision of language" (10).
Now D reveals two of H’s titles: DER SATZ VOM GRUND and
UNTERWEGS ZUR SPRACHE. He also reminds us, in his inimitable
way, that he will quote himself ("WHITE MYTHOLOGY: Metaphor in
the Text of Philosophy") but this is not in order to draw attention to
himself but rather, so as not to have to repeat here what he said there
(yeah, yeah, Jacques-baby).
This is getting difficult. D is going to slip himself through one of H’s
notes on metaphor - in which "the metaphoric exists only within the
boundaries of metaphysics" - as discussed by Ricoeur in LA
METAPHORE VIVE, whose eighth essay, in turn, discusses D’s
"White Mythology" piece. [Gossip: the current piece by Derrida was
read at a symposium in Geneva where Ricoeur also read.] Anyway, the

point is that we will be relating metaphor and metaphysics here, in the
above sense, in which the metaphoric exists only within the boundaries of
metaphysics. [Guessing: as we know, D wants to get beyond
metaphysics, so I suppose this article will try to lead us beyond
metaphor: let’s see, that’s interesting, it sounds THEOLOGICAL to me
and I know D would probably hate me for thinking so.]
D says R didn’t pay enough attention to this point of H’s. So now he will
critique R. First point. R, according to D, assimilates D too easily to H.
Second point. More on R’s misreading of "White Mythology"; over-assimilation to H. [Not having read "WM" or the Heidegger piece on it,
it’s hard for me to comment here.] [Gossip: D comes from a repressive
family background, I can tell, he’s like me, keeps saying "but that’s not
what I said, how can you attribute it to me" - he is very fixated on being
precisely understood, I agree intellectually with that feeling, but what I
am gossiping about here is
his tone.] Here, he’s also mad at R because, D says, R criticizes D from
the place to which D had himself carried the critique.
A key point appears to be that according to R, "WM" makes death or
dead metaphor its watchword - this idea offends D (note though that R’s
text is called LIVE METAPHOR). What D purports to really be talking
about is METAPHOR [he doesn’t explain this here; we have to read "WM",
which I’m beginning to suspect is more interesting than the piece at hand].
Now we talk about economy. A. usury B. the law of the house C.
EREIGNIS [?] D. passage, fraying, transfer, translation, withdrawal
(because, I intuit, metaphor TODAY is withdrawing, according to D).
Now we look at mother-tongue and father-language again, complicated
little arguments, my first guess here is that mother-tongue is not
metaphoric, but father-language is metaphorical and metaphysical, has
to do with formal language, the law, and so forth.

Retreat, tracing, translation: let’s talk about "traits," then. We need
metaphor when we can’t get to Being; if we could get there, there would
be no metaphor. And what Heidegger calls metaphysics ITSELF
corresponds to a withdrawal of Being. So we only get out of
metaphysical/metaphorical discourse by a withdrawal of the withdrawal
of Being.
NEED I SAY MORE?] Anyway, what we’re going to get with
metaphor is a series of retreats, withdrawals; this is how metaphor gets
so complex: as it withdraws, it "gives place" to "an abyssal
generalisation of the metaphoric."
Being, like metaphor, "withdraws into its crypt" [VAMPIRES AGAIN!]
BUT, and this is going to be important, we get a CATASTROPHE when
metaphoricity no longer allows itself to be contained in its
"metaphysical" concept, when (I THINK) metaphor stops being a
metaphor of something that is absent (but whose absence is palpable, as
in the absence of Abraham’s God).

Existence is, of course, the contextualizing of Being. The withdrawal of
Being would mean the loss of coherence of the ground of existence.
Metaphor is the continual recontextualization of Being which maintains
this coherence of ground. Metaphor, in its root and most basic
manifestation, effectively simulates the continued presence of Being.
But this is for some not satisfactory. One tragically desires the Being of
the Other. The phenomena resulting from the very action of Being,
Being’s metaphorical manifestation as existent entities falls short of this
tragic desire for Being. But secretly the Being of the Self and the Being

of the Other are one and the same. For one is the Other for the Other.
But one can see from one’s own case, that one is more than merely the
Other for the Other! Metaphysics is the attempt to discursively describe
what can only be glimpsed, which is the coherence of existence in the
light of Being’s presencing as Other. Metaphysics tries to reconstitute
Being from out of the coherence of Being’s existence even in Being’s
withdrawal. The immediacy of Being obviates the necessity of
metaphysics as "ontological neurosis" caused by its withdrawal. On the
other hand, the thrownness of Being is its thrownness as Self and Other
simultaneously. September 2013 In the same way that the intersubjective falls
short of the objective, the 3rd person based concept of consciousness
falls short of the 1st person concept of consciousness and the 1st falls
short of the concept of consciousness from a/the 0th person perspective.
This is no mere superficial appearance of induction. The 3rd person does
not imply the 2nd and 1st persons, as is pointed up by the notion of
“philosophical zombies”, although the 2nd does imply the 1st just as the
1st implies the 0th as Alvin Plantinga suspected. The 1st does not imply
the 2nd because of the ever-present specter of metaphysical solipsism.
Continued play with these seemingly sophomoric notions may point up a
heretofore unsuspected system or relation.
The Self is a sociolinguistic construct. The self only emerges within the
social environment of a linguistic community. Part of the process of
learning any given task is that of the making of subvocalized, mental
notes to oneself as one is attempting to perform and master the task. So
this is here not entirely a case of learning by doing. But when it comes
to learning the “task” of becoming minimally competent in one’s first
language - this is entirely an example of “learning by doing”. The
understanding of what one is doing appears later after the necessary
preparation of ground.
We have here the real thing in hand and we can dispense with saying
what something is like.
Our difficulties in having an authentic
relationship with Being, which would have powerfully validated the self,
stimulate in us an impulse to hatefully gossip - to talk bad about Being.
Metaphysics is an attempt to deconstruct Being which is
motivated by a dark, underlying necrophilic urge to tear down,
demystify, and demythologize the Other which seems to have rejected
us, not unlike a haughty and unapproachable, would-be lover. But it is
not Being which has done the rejecting here. Rather this necrophilic and
destructive impulse, which manifests itself in the form of a metaphysics
of Being, is precipitated not through Being’s callous rejection of us, but
on account of rage against our own impotence to relate intimately to Being.
But there is another sort of metaphor in Heidegger, a non-metaphysical
one: end of metalanguage, metaphysics, meta-rhetoric, but pure
metaphoricity. By now we’re talking about famous Heidegger lectures
like "The Nature of Language." Metaphors, words, are INCISIONS,
tracings, as in wood-cuttings, gravures, engravings, and these incisions
make possible graftings, so to speak, splicings. And BEING ITSELF IS
translation, a more interesting affirmation than the French original
(“n’arrive qu’à s’effacer”): being happens and comes about only in
effacing itself (there is more on this). The essence of speech is
INCISION [this is interesting, we speak of "incisive arguments" but here

metaphorical phrase ;)) so today, metaphor is withdrawing, splicing,
un/joining. What is happening? “Rien, pas de réponse, sinon que de la
métaphore le retrait se passe de lui-même.” [Roughly: “Nothing, no
answer, except that, of metaphor, the retrait takes place of itself.”]
I have often marvelled at how the movement of Being through time is at
the selfsame, identical instant, both a passing away and a coming into
being. In other words, the coming into being and the passing away of the
Self within the flux of consciousness (during each passing instant) are
grounded in the very same phenomenon, and this paradox of passage is
essential to the continuity of experienced time.

Representation is grounded in the participatory and the objective is
no more “real” than the intersubjective. Representations are metaphors
and convenient recapitulations of an open-ended historical process. All
form is metaphor; the concrete always transcends the metaphorical. 07/98
The rate at which a dynamical system temporally evolves may be given a
consistent definition in terms of the ratio of the density of energy exchanges of the
system with its outside environment to the density of 3-momentum
exchanges of the system with itself. By this definition, the most rapidly
temporally evolving dynamical system would be that of the pristine
quantum mechanical vacuum state - the quantum vacuum in the absence
of real particles or fields. We must note that the notion of the absolute
passage of time, i.e., the passage of time for reality as a whole, is a
meaningless concept, or at least, a concept which cannot be given a self-consistent formulation or interpretation. This fact is intimately related to
the fact that a thermodynamic system to which the notion of entropy
applies (the 2nd Law of Thermodynamics) is by definition an open
system in the sense of a system undergoing continual energy exchange
with a thermal reservoir or "heat bath." A completely closed system,
as noted earlier, would possess initial and boundary conditions resulting
in the quantizing of energy and momentum throughout the system giving
it a closed state space and a Poincare recurrence time which would be

indistinguishable from a finite 4th spatial dimension. In such a system,
with time being spatialized, the notion of the direction of time is
completely arbitrary - there is no outside to which the system is tied
which can serve as a memory of the history of the system to prevent the
system from being completely reversible. The system would be ergodic
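The exact recurrence of such a closed, quantized system can be illustrated numerically. The following is my own minimal sketch, not from the text: the five-level system and its equally spaced spectrum are assumed purely for illustration. When every level is an integer multiple of some E0, all phase factors realign at T = 2*pi*hbar/E0 and the state revives exactly, which is the quantum analogue of the Poincare recurrence described above.

```python
import numpy as np

# Assumed toy system: integer-quantized energy levels E_n = n * E0.
# Every phase factor exp(-i*E_n*T/hbar) equals 1 simultaneously at the
# recurrence time T = 2*pi*hbar / E0, so the state returns to itself.

hbar = 1.0
E0 = 1.0
levels = np.arange(1, 6)            # E_n = n for n = 1..5
rng = np.random.default_rng(0)
c = rng.normal(size=5) + 1j * rng.normal(size=5)
c /= np.linalg.norm(c)              # random normalized superposition

def state(t):
    return c * np.exp(-1j * levels * E0 * t / hbar)

T = 2 * np.pi * hbar / E0           # recurrence time
overlap = abs(np.vdot(state(0.0), state(T)))
print(round(overlap, 10))           # -> 1.0: exact revival
```

An open system coupled to a heat bath would not show this revival, which is the point of the contrast drawn above.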
and possess a conserved phase space volume. In perturbation theory
within quantum mechanics, we find that an incompletely described
dynamical system is approximated by a Hamiltonian possessing a
perturbation energy which may be thought of as a system exactly
described in terms of a Hamiltonian, H0, which is interacting with a
larger energy system through the perturbation Hamiltonian, Hfluc which
is simply added to H0 such that the new wavefunction calculated from
this sum through the Schrodinger equation is just the new wavefunction
expanded in terms of the old one defined in terms of H0. In this way the
actual system is seen to be the old system undergoing virtual transition
between its energy eigenfunctions. The old system's energy uncertainty
is represented in terms of the perturbation energy associated with the
fluctuation Hamiltonian, Hfluc. In this way, it is seen that, in general, the
temporal evolution of any quantum system is representable in terms of
the interaction of an approximate system represented by a zeroeth order
Hamiltonian, H0, with its outside environment from which it has
originally been abstracted. When one has taken into account all possible
perturbations due to real particles and fields interacting with the given
system in question, one is left with the ineradicable residue of the
quantum vacuum itself.
So the concrete (and real) temporality of any
quantum system, when the mere appearance of change in the system due
to inadequacies in our nth order perturbation expansion description of
the system have been taken into account, is wholly attributable to the
action of the quantum mechanical vacuum. So we now come to an
important distinction: changes in the system which are not directly
measurable and hence understood as virtual transitions between energy
levels of an approximate Hamiltonian description of the system versus
transitions between energy levels of the system due to an actual
incompleteness or openness of the system description due to ontological,
i.e., actual, indeterminacy or indefiniteness of the system itself, as

opposed to mere epistemological indefiniteness of the system which is a
mere artifact of an incomplete quantum-perturbative analysis of the
system. This is the distinction of ontological versus epistemological
energy uncertainty of a quantum mechanical system. The above
discussion pertains to the distinction, made in an earlier letter, between /\E,
which I have said may be wholly attributable to the observer, and the
square root [cont’d]
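The perturbative picture sketched above (an exactly described system H0 plus a fluctuation term Hfluc, with the new wavefunction expanded in the eigenbasis of H0) can be illustrated with a toy numerical example. The 3-level matrices below are my own assumptions, not from the text:

```python
import numpy as np

# Toy sketch: diagonalize H = H0 + Hfluc and expand the new ground state
# in the eigenbasis of H0. The squared expansion coefficients give the
# weights of the "virtual transitions" between the old energy eigenstates.

H0 = np.diag([0.0, 1.0, 2.0])                 # exactly-described system
Hfluc = 0.1 * np.array([[0.0, 1.0, 0.0],
                        [1.0, 0.0, 1.0],
                        [0.0, 1.0, 0.0]])     # assumed perturbation

E0, U0 = np.linalg.eigh(H0)                   # old eigenbasis
E, U = np.linalg.eigh(H0 + Hfluc)             # new eigenstates (ascending)

ground = U[:, 0]                              # perturbed ground state
coeffs = U0.T @ ground                        # expansion in the old basis
weights = coeffs**2
print(weights.round(4))                       # dominated by the old ground state
```

The weights sum to 1 by completeness of the old basis; for a weak perturbation almost all of the weight stays on the unperturbed ground state, with small admixtures representing the virtual transitions described above.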

Since momentum and position are incompatible observables, then so are
a function of momentum and a function of position. Now the total
energy of any quantum mechanical system, the Hamiltonian, H(p,r), is
the sum of its kinetic and potential energies, H(p,r) = T(p) + V(r), where
p and r are momentum and position, respectively. So by what has been
said, H(p,r) cannot have a precise value - for this would imply
simultaneously precise values for the kinetic and potential energies,
which, in turn, would imply simultaneously precise values of p and r. So the
value, H(p,r) must undergo fluctuations of a fundamental sort. Now
even the vacuum is a quantum system, i.e., a qm ground state. So the
vacuum's Hamiltonian, that is, its total energy, must also fluctuate.
These fluctuations interact with every particle and field, introducing
uncertainty in the location of particles in phase space, i.e., x-p space.
All measurement does is alter the shape of the area of phase space
"occupied" by the particle. Measurement does not change the area of
phase space where this particle is likely to be found ("occupied" by this
particle), however. The particle does not possess an exact "position"
within the x-p (phase) space. We can never say beforehand how the
vacuum fluctuations interacting with the particle (and out of which the
particle is constituted and sustained) will nonlocally resonate with the
vacuum fluctuations interacting at the time of measurement with the
observer's brain (the observer's brain is also a quantum system, BTW).
Remember that qbar = sqrt[^q**2 - /\q**2 ] where ^q is the fluctuations
of q due to the quantum vacuum and /\q is the uncertainty in q which
may be wholly attributed to the observer's brain due to the influence of
vacuum energy fluctuations upon it! It is the cooperation of these two

terms which results in qbar, the expectation value (classical value) of q!
This perhaps reminds some of you of Huxley's theory of perception: the
receipt of photons by the retina of the observer results in a stimulation of
the brain in such a way that its “ether wave filters” reconfigure so that
the signals representing the object seen are no longer screened out by the
consciousness reducing valve (the brain, that is) which are then "picked
up". The brain is then conceived of as a kind of ether wave tuning
device and perception is just an altering of the set of frequencies of ether
waves (vacuum fluctuations, if you prefer modern parlance) which the
vacuum can resonate with where the brain acts only as a hardware
interface between two unbounded sets of interfering ether wave spectra.
The brain on this view is simply a changeable and complex set of
boundary conditions placed upon the vacuum electromagnetic field's
self-interaction! “What we see and hear, or what we feel and smell and
taste, is only a small fraction of what actually exists out there. Our
conscious model of reality is a low-dimensional projection of the
inconceivably richer physical reality surrounding and sustaining us. Our
sensory organs are limited: They evolved for reasons of survival, not for
depicting the enormous wealth and richness of reality in all its
unfathomable depth. Therefore, the ongoing process of conscious
experience is not so much an image of reality as a tunnel through
reality”, The Ego Tunnel: The Science of the Mind and the Myth of the
Self (Metzinger).
Is there some general relationship between the height of the potential
barrier and the magnitude of the energy uncertainty? Or is there really
no general principle at work here relating these two quantities?
H psi = E psi --> <psi* E psi> = <E> and <psi* E**2 psi> = <E**2>, so
/\E = sqrt{<E**2> - <E>**2}, where H = H(T(p),V(x))
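As a quick numerical check of the dispersion formula /\E = sqrt{<E**2> - <E>**2}, here is a minimal sketch for a two-level superposition; the particular energies and amplitudes are my own example, not from the text:

```python
import numpy as np

# Made-up two-level superposition |psi> = a|E1> + b|E2>:
# <E>   = |a|^2 E1 + |b|^2 E2
# <E^2> = |a|^2 E1^2 + |b|^2 E2^2
# Delta E = sqrt(<E^2> - <E>^2)

E = np.array([1.0, 3.0])                      # assumed eigenvalues E1, E2
a = np.array([np.sqrt(0.75), np.sqrt(0.25)])  # amplitudes, |a|^2+|b|^2 = 1
p = a**2                                      # occupation probabilities

mean_E = np.dot(p, E)
mean_E2 = np.dot(p, E**2)
dE = np.sqrt(mean_E2 - mean_E**2)
print(mean_E, dE)   # <E> = 1.5, Delta E = sqrt(3)/2 ~= 0.866
```

Note that for an exact energy eigenstate the same formula gives /\E = 0; the spread is a property of the superposition.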
What is the relationship between the reduction of the wavepacket upon
an observation being performed on some quantum mechanical system
and the conversion of virtual particles into real particles?

It may be possible to modify Poisson's equation, ∂²P/∂r² = 4π(rho),
to include a 2nd partial derivative of P, the potential, with respect to the
time such that we might assimilate the 2nd partial derivative with respect
to r to the state variable, (rho)mass, and assimilate the 2nd partial
derivative of P with respect to t to the state variable, (rho)vacuum, so
that (rho) in the above equation may be interpreted as the space density
which is a locally conserved quantity.
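For reference, the unmodified 1-D Poisson equation can be solved by finite differences as below; the grid, source profile, and boundary conditions are my own assumptions for illustration. The modification proposed above would simply add an analogous second-difference term in t:

```python
import numpy as np

# Minimal finite-difference sketch of the ordinary 1-D Poisson equation
# d^2P/dx^2 = 4*pi*rho, with an assumed localized source rho and
# Dirichlet boundary conditions P = 0 at both ends.

N = 101
x = np.linspace(0.0, 1.0, N)
h = x[1] - x[0]
rho = np.exp(-((x - 0.5) / 0.1) ** 2)        # assumed source profile

# Tridiagonal second-difference (discrete Laplacian) on interior points
A = (np.diag(-2.0 * np.ones(N - 2)) +
     np.diag(np.ones(N - 3), 1) +
     np.diag(np.ones(N - 3), -1)) / h**2
P = np.zeros(N)
P[1:-1] = np.linalg.solve(A, 4 * np.pi * rho[1:-1])

# Residual check: the discrete Laplacian of P reproduces 4*pi*rho
lap = (P[:-2] - 2 * P[1:-1] + P[2:]) / h**2
print(np.max(np.abs(lap - 4 * np.pi * rho[1:-1])) < 1e-6)
```

The same banded structure carries over if a second difference in t is appended, which is what the proposed reinterpretation of rho as a space density would require.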

Let us examine Einstein's field equation for any potential mathematical
affinity it might have with respect to our equation relating the space
energy density to the sum of the vacuum and mass energy densities.

Tuv = -Ruv - (1/2)R guv

Each of these three terms is what is called a tensor density. They
have physical dimensions of energy density. In Mach's formula for the
speed of pressure wave oscillations in a continuous, energy-conducting
medium, the pressure is associated with the vacuum energy density since
the quantum vacuum always obeys the equation of state that its pressure
and energy density are identical. But this identification leaves only one
possible further identification of the medium energy density; that is, the
energy density must be identified with the total energy density of space,
what is termed within our theory, the space density. In order for an
entropy and temperature to be assigned to the quantum vacuum, we must
suppose that this vacuum remains in thermal equilibrium with this heat
reservoir, the energy density of which is the space density referred to above.
Intuitively, if any further identifications are to be made between terms
within our theory and terms within Einstein's theory, then the following
identifications might be made:
The scalar curvature, R, should be identified with the space density, the

momentum-energy tensor, Tuv, should be identified with the mass-energy density, and the term, -Ruv, should be identified with the vacuum
energy density. The term, guv, which in relativity theory is the
dimensionless dot product of the spacetime coordinate unit vectors, eu
and ev, may be alternatively interpreted to correspond to the ratio of the
sum of the momentum-energy and Riemannian tensor densities to the
scalar energy density. Within our theory, the guv correspond to mixed
2nd order partial derivatives of the ratio of the sum of the vacuum scalar
energy density to the total space energy density.
[Figure: scatter diagram of virtual-virtual, real-real, and real-virtual
particle pairings; garbled in transcription.]

G = U - ST

The free energy G is minimized and configurational entropy is
maximized when rho(v) = rho(m) in the formation of a black hole.
Do the partial derivatives of the gravitational potential transform like the
components of a four-vector? It would appear that an arbitrary Lorentz
transformation of the 1st order partial derivatives of a standard static
gravitational potential should evince the existence of a
time-varying potential, and hence, that of a 4-hyperspherical potential.

There is an important distinction to be made between massive and
massless particles. This distinction consists in the fact that a massive
particle which is seen to be at rest has a 4-momentum which is purely
imaginary, but which may be re-represented by a Lorentz transformation
in terms of a new set of real and imaginary components within some
different inertial reference frame. This is not generally true of massless
particles, however. A massless particle, such as a photon, possesses a
relativistic 4-momentum which is purely real in any and all inertial
reference frames. There is no possible Lorentz transformation which can
succeed in re-representing the 4-momentum of the photon as a mixture
of real and imaginary momentum components. However, in the case of
real massive particles, the relativistic mass increases exactly in step with
the increase in imaginary momentum. This suggests that perhaps
photons do not possess a gravitational mass, and that the true source of
the gravitational field is a massive body's imaginary momentum. How
then, if this is true, do we account for the disappearance of the
gravitational mass which results from the total conversion of mass into
photon energy? Does this energy disappear in the form of longitudinal
pressure waves in the quantum vacuum?
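The frame-independence being appealed to above can be checked numerically through the invariant p.p = E**2 - |p|**2 (units c = 1), which stays equal to m**2 for a massive particle and exactly zero for a photon under every boost. In the (x, y, z, ict) convention the discussion uses, m**2 > 0 means the rest-frame four-momentum is carried entirely by the imaginary (timelike) component, while a photon's four-momentum remains light-like in every frame. The particular masses, energies, and boost velocities below are my own example:

```python
import numpy as np

# Boost (E, px) along x with velocity v (c = 1) and compare the invariant
# E^2 - px^2: it is m^2 for a massive particle and 0 for a photon,
# in every inertial frame.

def boost_x(E, px, v):
    g = 1.0 / np.sqrt(1.0 - v * v)
    return g * (E + v * px), g * (px + v * E)   # boosted (E', px')

m = 2.0
massive = (m, 0.0)        # at rest: E = m, px = 0
photon = (5.0, 5.0)       # light-like: E = |px|

for v in (0.0, 0.5, 0.9):
    Em, pm = boost_x(*massive, v)
    Eg, pg = boost_x(*photon, v)
    print(round(Em**2 - pm**2, 10), round(Eg**2 - pg**2, 10))
# massive column stays m^2 = 4.0; photon column stays 0.0
```

No choice of v mixes the photon's light-like components into a timelike (imaginary, in the ict convention) part, which is the distinction the passage draws between the two cases.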

A photon which is climbing out of a gra