According to a Gerald Massey web page:
"... ......In later life Massey became increasingly interested in Egyptology. He studied the extensive Egyptian records housed in the British Museum, eventually teaching himself to decipher the hieroglyphics. Following years of diligent research into the history of Egyptian civilisation and the origins of religion, Massey concluded that Christianity was neither original nor unique, but that the roots of much of the Judeo/Christian tradition lay in the prevailing Kamite (ancient Egyptian) culture of the region. ...".
Gerald Massey (1828-1907) wrote A Book of Beginnings and The Natural Genesis. In an introduction to those two works, Charles Finch said:
"... Gerald Massey sought to unravel the psychocultural strands emanating from the remotest human heritage. ... His investigations convinced him that the roots of modern culture went back ... to the ancient beginnings in Africa. ... Massey took the position that people and culture migrated out of Africa into the rest of the world ... and that there exists a global cultural unity that is African at its root. Molecular biology is bearing Massey out. ...In Volume II of A Book of Beginnings ... he minutely dissects the religion and culture of the ancient Hebrews to reveal them as ... an off-shoot of old Kemit and, even more remotely ... of Africa itself. ...
Josephus ... in his essay Against Apion paraphrases the Egyptian annalists' explanation of the Exodus, indicating that the people who so departed Egypt were themselves Egyptian ...
In The Natural Genesis ... Massey ... delved ... into the source of symbols. ... Massey was keen to trace the manner in which symbols arose from images taken from the surrounding topography of nature in the inner African cradleland, or "placentaland", ( Ta-Kenset ) ... time-space itself was configured as the uroboric serpent that forms the circle of eternity by taking its tail in its mouth. ...
... it was Massey's contention that revelation and prophecy consisted solely in knowing the major celestial time cycles ... the Precession of the Equinoxes ... ramifying into a Great Year of 26,000 years ... Massey realized ... that the ancients had written a Book ages anterior to the beginning of conventional history, and that Book was inscribed in the heavens. ...
Jesus, or ... Yehoshua's surname "Pandera" meant "panther" and ... the panther-skin was the emblem of the Afro-Kamite priesthood ...".
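The "Great Year" figure in the passage above follows directly from the precession of the equinoxes. A quick sanity check of the 26,000-year number, using the modern measured precession rate of roughly 50.3 arcseconds per year (my figure, not Massey's):

```python
# The equinox drifts about 50.3 arcseconds per year along the ecliptic,
# so a full 360-degree circuit of the zodiac takes roughly 26,000 years.
rate_arcsec_per_year = 50.3          # modern measured precession rate
full_circle_arcsec = 360 * 3600      # 360 degrees in arcseconds
print(full_circle_arcsec / rate_arcsec_per_year)  # ~ 25,770 years
```

The ancients' round figure of 26,000 years is thus within about one percent of the modern value.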
In an Appendix to Ancient Egypt, The Light of the World, Massey compared some Egyptian and Christian terms. Here is only a small sample of an extensive detailed list of correspondences:
"... Egyptian Christian ... Ra, the holy spirit = God the Holy Ghost ... ... the hawk or the dove as = the dove as the bird of the the bird of the holy spirit Holy Spirit ... The trinity of = The Trinity of the Father, Atum(or Osiris) the father, Son, and Holy Spirit ... Horus (or Iu) the son, and Ra the holy spirit Isis, the virgin mother = Mary the virgin mother of Jesus ... of Iu, her Su or son The outcast great mother = Mary Magdalene with with her seven sons her seven devils ... Seb, Isis, and Horus, = Joseph, Mary, and Jesus, the Kamite holy trinity a Christian holy trinity ... Sut and Horus, = Satan and Jesus, the twin opponents the twin opponents ... Anup, the Baptizer = John the Baptist ... The star, as announcer = The Star in the East that indicated for the Child-Horus the birthplace of Jesus ... Hermes, the scribe = Hermas, the scribe ... The paradise of the = The Holy City lighted by one pole-star one luminary that is neither the sun nor the moon = the pole-star ... The ark of Osiris-Ra = The Ark of the New Covenant in heaven ...".
In his books, Gerald Massey had similar detailed lists of correspondences between Egyptian and Christian religious concepts.
A recent book along the lines of Gerald Massey's work is Black Spark, White Fire : Did African Explorers Civilize Ancient Europe?, (Prima Publishing 1997) by Richard Poe, who quotes Diodorus as saying
"... Now the Ethiopians, as historians relate, were the first of all men. They say also that the Egyptians are colonists sent out by the Ethiopians ...".
Martin Bernal wrote Black Athena: The Afroasiatic Roots of Classical Civilization (2 volumes, Rutgers 1987, 1991). His father was J. D. Bernal who in 1929 wrote in "The World, the Flesh, and the Devil"
"... The stage should soon be reached when materials can be produced which are not merely modifications of what nature has given us in the way of stones, metals, woods and fibers, but are made to specifications of a molecular architecture. ...Normal man is an evolutionary dead end; mechanical man, apparently a break in organic evolution, is actually more in the true tradition of a further evolution. ... man himself must actively interfere in his own making ... The decisive step will come when we extend the foreign body into the actual structure of living matter ...
Connections between two or more minds would tend to become a more and more permanent condition until they functioned as a dual or multiple organism. ... The complex minds could ... extend their perceptions and understanding and their actions far beyond those of the individual. Time senses could be altered: the events that moved with the slowness of geological ages would be apprehended as movement, and at the same time the most rapid vibrations of the physical world could be separated. ... The interior of the earth and the stars, the inmost cells of living things themselves, would be open to consciousness ... and ... the motions of stars and living things could be directed. ...
consciousness itself may ... become completely etherealized ... becoming masses of atoms in space communicating by radiation, and ultimately perhaps resolving itself entirely into light. That may be an end or a beginning ...
... leaving on one side the not impossible state in which mankind would be stabilized and live an oscillating existence for millennia, we have to consider ... the alternatives: whether mankind will progress as a whole or will divide definitely into a progressive and an unprogressive part. ...
More and more, the world may be run by the scientific expert. The new nations, America, China and Russia, have begun to adapt to this idea consciously. ... this scientific development could take place by the colonization of the universe and the mechanization of the human body. ... there would be an effective bar between the altered and the non-altered humanity ... If ... the colonization of space will have taken place ... Mankind - the old mankind - would be left in undisputed possession of the earth, to be regarded by the inhabitants of the celestial spheres with a curious reverence. ...
We are on the point of being able to see the effects of our actions and their probable consequences in the future; we hold the future still timidly, but perceive it for the first time, as a function of our own action. ...".
and as Jack Sarfatti says, J. D. Bernal "... was the Founder of Birkbeck College situated on Malet Street behind the venerable British Museum ..." where Gerald Massey studied.
In response to criticism of Black Athena, Martin Bernal wrote Black Athena Writes Back (Duke 2001), in which he said:
"... I [Martin Bernal] have never claimed that my ideas are original, merely that I am reviving some neglected older views and bringing together some scattered contemporary ones ...[ CH = Chadic, BER = Berber, E = Egyptian, O = Omotic, BEJ = Beja, S = Semitic, CC = Central Cushitic, EC = East Cushitic, SC = South Cushitic ]
... By far the most important single reaction to Black Athena has been the publication of ... Black Athena Revisited ...[whose]... senior editor [was] Mary Lefkowitz ...[and who wrote a]... popular book Not Out of Africa [ How Afrocentrism Became an Excuse to Teach Myth As History ]...
In 1991, when Lefkowitz first encountered Afrocentrism through reading Black Athena, she was appalled. She discovered that there were people writing books and teaching that Greek civilization had derived from ... Egypt. ... Lefkowitz's dislike is focused on those who have argued that African Americans share a common heritage with Ancient Egypt and further, that through Egypt, Africa played a significant role in the formation of Ancient Greece and hence "Western Civilization". ... It is this view of Greek hybridity and dependence on older, non-European civilizations that Lefkowitz finds disturbing. ...".
I find it unsettling that, according to a Publishers Weekly review of Not Out of Africa (on an Amazon web page), Lefkowitz says that her attacks on Black Athena are "... defending academic standards ...", a circumstance that reminds me of opposition to new ideas, censorship, the deaths of Socrates and Giordano Bruno, and Gerald Massey's last poem:
For Truth

He set his battle in array, and thought
To carry all before him, since he fought
For Truth, whose likeness was to him revealed;
Whose claim he blazoned on his battle-shield;
But found in front, impassively opposed,
The World against him, with its ranks all closed:
He fought, he fell, he failed to win the day
But led to Victory another way.
For Truth, it seemed, in very person came
And took his hand, and they two in one flame
Of dawn, directly through the darkness passed;
Her breath far mightier than the battle-blast.
And here and there men caught a glimpse of grace,
A moment's flash of her immortal face,
And turned to follow, till the battle-ground
Transformed with foemen slowly facing round
To fight for Truth, so lately held accursed,
As if they had been Her champion from the first.
Only a change of front, and he who had led
Was left behind with Her forgotten dead.
In 1995, Di Nella and Paturel at Lyon observed in astro-ph/9501015 that:
"The distribution of galaxies up to a distance of 200 Mpc (650 million light-years) is flat and shows a structure like a shell roughly centered on the Local Supercluster (Virgo cluster). This result clearly confirms the existence of the hypergalactic large scale structure noted in 1988."
Back in 2003, in astro-ph/0302496, Tegmark, de Oliveira-Costa, and Hamilton said:
"... there is a preferred axis in space along which the quadrupole has almost no power. This axis is roughly the line connecting us with (l, b) = (-80, 60) in Virgo. ... Moreover, this [octopole] axis is seen to be approximately aligned with that for the quadrupole. ... In contrast, the hexadecapole is seen to exhibit the more generic behavior we expect of an isotropic random field, with no obvious preferred axis. ...".
Update to 2006: Cosmology News said:
"... 4 March 2003 A new study shows that the (weak) large-scale temperature fluctuations in the microwave background are perpendicular to a spatial axis pointing towards Virgo. This tentatively suggests that if the universe is indeed multiconnected, it connects up with itself in that direction. In other words, if you want to travel through space in a straight line and return to your starting point, Virgo is the most promising direction to head.8 October 2003 An expository article and a research article in the 9 October 2003 issue of Nature show that the Poincaré dodecahedral space may account for the weak large-scale temperature fluctuations in the microwave background. The Poincaré dodecahedral space is like a 3-torus, but made from a regular dodecahedron instead of a cube. Another crucial difference is that the dodecahedral model implies that space is slightly curved, unlike the 3-torus which is flat like Euclidean space. Confirmation or refutation of the proposed model is expected soon, within weeks or months. Links will be provided here as more information becomes available.
April 2004 The second-year WMAP data, expected by February of this year, has been delayed due to some "surprises" in the data. The nature of the surprises is being kept secret while the WMAP team studies them in hopes of understanding their significance. The surprises may well be related to some anomalies discovered in the first-year data, such as statistically significant differences in the CMB between the northern and southern galactic hemispheres, and unsettling coincidences between the directions of the largest scale fluctuations and the plane of the solar system (the ecliptic). The latter suggest that the largest scale fluctuations on the microwave sky might not be coming from deep space after all, but rather from sources in or around the solar system, or could perhaps even be due to some still undiscovered error in the data analysis. Of course such speculation should be taken with a grain of salt until the WMAP team releases the second-year data along with their analysis of it.
February 2006 Careful analysis has shown that the strange alignment of the broadest Cosmic Microwave Background (CMB) fluctuations is real and significant at the 99% level. The alignment, together with the extreme weakness of those broad fluctuations, presents a real mystery whose resolution is not yet in sight. On the other hand, alignments with the ecliptic turned out to be spurious. The long-awaited second-, third- and fourth-year WMAP data, including the polarization data, still have not appeared. The WMAP team is analyzing these data with utmost care, leading one to speculate that exciting conclusions may follow. ...".
March 2006 - Sean Carroll, in a 16 March 2006 CosmicVariance blog entry, said:
"... the new WMAP results ... I can quickly summarize the major points as I see them. ...
- ... the power spectrum: amount of anisotropy as a function of ... multipole moment l ... The major difference between this and the first-year release is that several points that used to not really fit the theoretical curve are now, with more data and better analysis, in excellent agreement with the predictions of the conventional LambdaCDM model. That's a universe that is spatially flat and made of baryons, cold dark matter, and dark energy.
- In particular, the octopole moment (l=3) is now in much better agreement than it used to be. The quadrupole moment (l=2), which is the largest scale on which you can make an observation (since a dipole anisotropy is inextricably mixed up with the Doppler effect from our motion through space), is still anomalously low.
- The best-fit universe has approximately 4% baryons, 22% dark matter, and 74% dark energy, once you combine WMAP with data from other sources. The matter density is a tiny bit low, although including other data from weak lensing surveys brings it up closer to 30% total. ...
- Perhaps the most intriguing result is that the scalar spectral index n is 0.95 ± 0.02. This tells you the amplitude of fluctuations as a function of scale; if n=1, the amplitude is the same on all scales. Slightly less than one means that there is slightly less power on smaller scales. The reason why this is intriguing is that, according to inflation, it's quite likely that n is not exactly 1. Although we don't have any strong competitors to inflation as a theory of initial conditions, the successful predictions of inflation have to date been somewhat "vanilla" - a flat universe, a flat perturbation spectrum. This expected deviation from perfect scale-free behavior is exactly what you would expect if inflation were true. The statistical significance isn't what it could be quite yet, but it's an encouraging sign.
- ... lower power on small scales (as implied by n<1) helps explain some of the problems with galaxies on small scales. If the primordial power is less, you expect fewer satellites and lower concentrations, which is what we actually observe. ...
- The dark energy equation-of-state parameter w is a tiny bit greater than -1 with WMAP alone, but almost exactly -1 when other data are included. ...
- One interesting result from the 1st-year data is that reionization - in which hydrogen becomes ionized when the first stars in the universe light up - was early, and the corresponding optical depth was large. It looks like this effect has lessened in the new data ...
- A lot of work went into understanding the polarization signals, which are dominated by stuff in our galaxy. WMAP detects polarization from the CMB itself, but so far it's the kind you would expect to see being induced by the perturbations in density. There is another kind of polarization ("B-mode" rather than "E-mode") which would be induced by gravitational waves produced by inflation. This signal is not yet seen, but it's not really a surprise; the B-mode polarization is expected to be very small, and a lot of effort is going into designing clever new experiments that may someday detect it. In the meantime, WMAP puts some limits on how big the B-modes can possibly be, which do provide some constraints on inflationary models. ...".
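The spectral-index point in Carroll's summary can be checked in a couple of lines. The tilt formula P(k) proportional to (k/k0)^(ns-1) is the standard parameterization of the primordial spectrum, not something from the blog post itself:

```python
# How far is ns = 0.95 +/- 0.02 from the scale-invariant value ns = 1?
ns, sigma = 0.95, 0.02
print((1.0 - ns) / sigma)  # roughly 2.5 sigma below scale invariance

# With P(k) proportional to (k/k0)^(ns-1), a tilt ns < 1 means less power
# on smaller scales (larger k). Ratio of power at k = 10*k0 vs k = k0:
ratio = 10.0 ** (ns - 1.0)
print(ratio)  # about 0.89, i.e. ~11% less power per decade of scale
```

This is the "slightly less power on smaller scales" behavior described in the bullet above, at a statistical significance of about two and a half standard deviations.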
The WMAP 3-year Polarization paper says:
"... there is a residual signal in our power spectra that we do not yet understand. It is evident in W band in EE at l = 7 and to a lesser degree at l = 5 and l = 9. We see no clear evidence of it anywhere else. ... The W-band EE l = 7 value is essentially unchanged by cleaning, removing a 10 degree radius around the Galactic caps, or by additionally masking ±10 degree in the ecliptic plane. ... To avoid biasing the result by this residual artifact which also possibly masks some unmodeled dust and synchrotron contamination, we limit the cosmological analysis to the QV combination. ...We detect the optical depth with tau = 0.088 + 0.028 - 0.034 ...
The same free electrons from reionization that lead to the l < 10 EE signal act as test particles that scatter the quadrupolar temperature anisotropy produced by gravitational waves (tensor modes) originating at the birth of the universe. The scatter results in polarization B modes. ... While scalar and tensor fluctuations both contribute to the TT and EE spectra, only tensors produce B modes ... The tensor contribution is quantified with the tensor to scalar ratio r ... Using primarily the TT spectrum, along with the optical depth established with the TE and EE spectra, the tensor to scalar ratio is limited to r < 0.55 (95% CL). When the large scale structure power spectrum is added to the mix ... the limit tightens to r < 0.28 (95% CL). These values are approaching the predictions of the simplest inflation models. ...
The detection of the TE anticorrelation near l = 30 is a fundamental measurement of the physics of the formation of cosmological perturbations ... It requires some mechanism like inflation to produce and shows that superhorizon fluctuations must exist. ...".
The WMAP 3-year Temperature paper says:
"... The new polarization data ... produce a better measurement of the optical depth to re-ionization, tau = 0.088 + 0.028 - 0.034. This new and tighter constraint on tau helps break a degeneracy with the scalar spectral index which is now found to be ns = 0.95 ± 0.02. ...Sky maps of the modes from l = 2 - 8, derived from the ILC map, are shown in Figure 14. ...
... There has been considerable comment on the non-random appearance of these modes. ... It has been noted by several authors that the orientation of the quadrupole and octopole are closely aligned ... and that the distribution of power amongst the a_lm coefficients is possibly non-random. ... the basic structure of the low l modes is largely unchanged from the first-year data. Thus we expect that most, if not all, of the "odd" features claimed to exist in the first-year maps will survive. ... the quadrupole amplitude is indeed low, but not low enough to rule out LambdaCDM. ...
there do appear to be some questionable features in the data ... These features include:
- low power, especially in the quadrupole moment;
- alignment of modes, particularly along an "axis of evil" ...
- the quadrupole and octupole phases are notably aligned with each other and ...
- the octupole is unusually "planar" with most of its power aligned approximately with the Galactic plane ...
- the quadrupole plane and the three octopole planes [are] "remarkably aligned."...
- three of these planes are orthogonal to the ecliptic and the normals to these planes are aligned with the direction of the cosmological dipole and with the equinoxes. This has led to speculation that the low-l signal is not cosmological in origin ...
- "multipole vectors" ... characterize the geometry of the l modes ...[showing]... that the "oriented area of planes defined by these vectors . . . is inconsistent with the isotropic Gaussian hypothesis at the 99.4% level for the ILC map." ...
[ "multipole vectors" are described by Copi, Huterer, and Starkman on a cwru web page with an illustration on which the following image is based:
]
- the l = 5 mode is "spherically symmetric" at 3 sigma, and the l = 6 mode is planar at 2 sigma confidence ...
- l = 3 and 5 modes are aligned in both direction and azimuth ...
- unequal fluctuation power in the northern and southern sky ... the ratio of low-l power between two hemispheres ...[shows]... that only 0.3% of simulated skies have as low a ratio as observed ...
- a surprisingly low three-point correlation function in the northern sky;
- an unusually deep/large cold spot in the southern sky; and
- various "ringing" features, "glitches", and/or "bites" in the power spectrum. ...
the most scientifically compelling development would be the introduction of a new model that explains a number of currently disparate phenomena in cosmology (such as the behavior of the low l modes and the nature of the dark energy) while also making testable predictions of new phenomena. ...".
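As a rough feel for why the axis alignments listed above are considered significant, here is a toy Monte Carlo (my own sketch, not any of the cited analyses) of how often two independent random axes in 3-space happen to align to within 10 degrees:

```python
import random, math

# An "axis" is a headless direction, so the relevant angle is in [0, 90] deg.
# For isotropic random directions the cosine of the angle between them is
# uniform on [-1, 1]; taking the absolute value gives the axis-to-axis angle.
random.seed(0)
trials, hits, max_angle = 100_000, 0, 10.0
for _ in range(trials):
    c = abs(random.uniform(-1.0, 1.0))
    if math.degrees(math.acos(c)) <= max_angle:
        hits += 1
print(hits / trials)  # close to the analytic value 1 - cos(10 deg) ~ 0.015
```

So a chance alignment this tight between two a priori independent multipoles has a probability of only about 1.5%, which is why the quadrupole-octopole alignment attracted so much attention.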
The WMAP 3-year parameter paper says:
"... r and ns are defined at k = 0.002 Mpc-1. ... The WMAP data requires either tensor modes or a spectral index with ns < 1 to fit the angular power spectrum. ...The low l multipoles, particularly l = 2, are lower than predicted in the /\CDM model. ... Models with significant gravitational wave contributions, r = 0.3, make a ... prediction ...[of] ... a modified temperature spectrum with more power at low multipoles ...
The deviation of the primordial power spectrum from a simple power law can be most simply characterized by a sharp cut-off in the primordial spectrum. Analysis of this model finds that putting in a cut off of k_c = 3 x 10^(-4) / Mpc improves the fit ...
the simplest inflationary models predict only mild non-Gaussianities that should be undetectable in the WMAP data ... If the universe were finite and had a size comparable to horizon size today, then the CMB fluctuations would be non-Gaussian ... Since the release of the WMAP data, several groups have claimed detections of significant non-Gaussianities ... Almost all of these claims imply that the CMB fluctuations are not stationary and claim a preferred direction or orientation in the data. ... we choose to address in a unifying manner the large scale "asymmetry", "alignment" and low l power issues discussed in the literature after the first year release ... by testing the hypothesis that the observed temperature fluctuations, Tbar, can be described as a Gaussian and isotropic random field modulated on large scales by an arbitrary function ... mild deviations ... are observed ... the best fit form for f ...[is]... an axis lying near the ecliptic plane. This is the same feature that has been identified in a number of papers on non-Gaussianity. ... If we were eager to claim evidence of strong non-Gaussianity, we could quote the probability of this occurring randomly as less than 2%. We, however, do not interpret the improvement ... as evidence against the hypothesis that the primordial fluctuations are Gaussian. Since the existence of non-Gaussian features in the CMB would require dramatic reinterpretation of our theories of primordial fluctuations, more compelling evidence is required. ... an alternative model that better fits the low l data would be an exciting development. ...".
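The sharp infrared cutoff mentioned in the parameter paper can be sketched as a modified primordial spectrum. This is an illustrative toy of my own, not the WMAP likelihood code; the tilt and pivot values are the ones quoted above:

```python
import numpy as np

# Power-law primordial spectrum with a sharp cutoff at k_c = 3e-4 / Mpc,
# using the quoted pivot k0 = 0.002 / Mpc and tilt ns = 0.95.
def primordial_power(k, A=1.0, ns=0.95, k0=0.002, kc=3e-4):
    """P(k) = A * (k/k0)^(ns-1) for k >= kc, and zero below the cutoff."""
    k = np.asarray(k, dtype=float)
    p = A * (k / k0) ** (ns - 1.0)
    return np.where(k >= kc, p, 0.0)

ks = np.array([1e-4, 3e-4, 2e-3, 1e-1])
print(primordial_power(ks))  # first entry is suppressed to zero by the cutoff
```

Suppressing power below k_c removes the largest-wavelength modes, which is why such a cutoff can improve the fit to the anomalously low quadrupole.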
My opinion from 2003 remains unchanged. It is that dipole (line segment) configurations

*---*

are naturally related to spatial axes in 3-dimensional space, as are quadrupole (square) and octopole (cube) configurations

*---*
|   |
*---*

   *---*
  /   /|
 *---* |
 |   | *
 |   |/
 *---*
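For comparison with these pictures, the standard spherical-harmonic bookkeeping is worth keeping in mind: the l-th multipole has 2l+1 independent real components, and in the Copi-Huterer-Starkman "multipole vector" representation it is described by l unit vectors (plus an overall amplitude). A quick tabulation:

```python
# Component counts for the low multipoles: 2l+1 harmonic coefficients a_lm,
# and l "multipole vectors" in the Maxwell / Copi-Huterer-Starkman picture.
multipoles = {"dipole": 1, "quadrupole": 2, "octopole": 3, "hexadecapole": 4}
for name, l in multipoles.items():
    print(f"{name}: l={l}, {2*l + 1} components, {l} multipole vectors")
```

The cube-and-hypercube picture in the text is the author's own geometric reading; the harmonic counting above is the conventional one.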
However, since higher multipoles, from hexadecapole on up, are related to 4-dimensional and higher-dimensional hypercubes, they do not have such a direct relationship to spatial axes in 3-dimensional space. It seems to me that the dipole, the quadrupole and the octopole are all aligned with respect to the same axis, which corresponds to the Great Attractor in Virgo. Therefore, I thought in 2003, and I still think, that the dipole, quadrupole and octopole are all related to the Great Attractor in Virgo, and that the underlying physics could be the Segal conformal gravity sector of my D4-D5-E6-E7-E8 VoDou Physics Model, because:
A WMAP web page has a nice illustration from the NASA/WMAP Science Team of our expanding universe. Another WMAP web page has a captioned illustration from the NASA/WMAP Science Team.
The Cosmological Constant is described from an Algebraic Quantum Field Theory point of view by Hollands and Wald in gr-qc/0405082, where they say:
"... there are holistic aspects of quantum field theory that cannot be properly understood ... by applying ordinary quantum mechanics to the low energy effective degrees of freedom of a more fundamental theory defined at ultra-high-energy/short-wavelength scales ...the absurdly large value obtained for the stress-energy of a quantum field when computed by applying quantum mechanics without subtractions to the low energy modes of the field is usually referred to as the "cosmological constant problem". In 4 dimensions, a calculation ... would yield an expected energy density of order ... ( 10^19 GeV)^4 ... By contrast, the actual energy density of our universe is only of order ... ( 10^(-12) GeV)^4 ... the enormous discrepancy between the naive mode-sum calculation and the observed energy density is therefore generally viewed as a very serious "problem".
We do not share this view. As we have argued above, there are many aspects of the theory of a quantum field that simply cannot be understood by viewing its low energy degrees of freedom as being independent. The mode sum calculations ... do not properly take into account the holistic aspects of quantum field theory. ... If one accepts the holistic aspects of quantum field theory, there is still a "cosmological constant problem", but it is rather different than the usual formulation of it. The puzzle is not, "Why is the observed energy density of the universe so small?" ... Rather, the puzzle is, "Why is the cosmological constant so large?"
Quantum field theory predicts that the stress-energy tensor of a free quantum field in an adiabatic vacuum state in a slowly expanding 4-dimensional universe should be of order of L^(-4), where L denotes the size and/or radius of curvature of the universe. For our universe, 1/L would be of order 10^(-42) GeV. But observations of type Ia supernovae and the cosmic microwave background strongly suggest that, at the present time, the dominant component of stress-energy in the universe is smoothly distributed (i.e., not clustered with galaxies) and has negative pressure. The energy density of this so-called "dark energy" is thus (10^(-12) GeV)^4, i.e. roughly the geometric mean of the unsubtracted mode sum and quantum field theoretic predictions for vacuum energy density.
... if dark energy does correspond to vacuum energy of an interacting quantum field, it is our view that its properties will be understood only by fully taking into account the holistic nature of quantum field theory.
... Of course, it remains a very significant puzzle as to why quantum field theory possesses holistic aspects, i.e., how they arise from the more fundamental, underlying theory. However, it is likely that we will need a much deeper understanding of the underlying theory in order to account for this. ...".
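The "geometric mean" remark in the Hollands-Wald passage above is easy to check numerically with the quoted orders of magnitude:

```python
import math

# UV mode-sum scale ~ (1e19 GeV)^4; QFT prediction for a universe of
# size L is ~ L^(-4), with 1/L ~ 1e-42 GeV for our universe.
rho_uv = (1e19) ** 4     # naive Planck-scale vacuum energy density
rho_ir = (1e-42) ** 4    # quantum-field-theoretic L^(-4) prediction
rho_gm = math.sqrt(rho_uv * rho_ir)   # geometric mean of the two densities
print(rho_gm ** 0.25)    # ~ 3e-12 GeV, i.e. the observed (10^-12 GeV)^4 scale
```

The fourth root of the geometric mean lands at 10^(-11.5) GeV, which is indeed the (10^(-12) GeV)^4 dark energy scale quoted in the text.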
Such a deeper understanding is supplied by my D4-D5-E6-E7-E8 VoDou Physics Model. It is based on a generalized hyperfinite II1 von Neumann algebra factor whose basic building block is the real Clifford algebra Cl(8). Triality symmetries of Cl(8) give ultraviolet cancellations leading to a zero value of the vacuum fluctuation Cosmological Constant in the full high-energy regime of 8-dimensional spacetime.
These symmetries are inherited by the effective low-energy regime with 4-dimensional spacetime, so that it also has a zero value of the vacuum fluctuation Cosmological Constant /\. The non-zero /\ that we observe now in our universe (and that we measured with WMAP) is due to the Conformal Gravity of 4-dimensional spacetime based on the ideas of Irving Segal, which gives calculated values of the ratio Dark Energy : Dark Matter : Ordinary Matter that are consistent with WMAP observations.
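The real Clifford algebra Cl(8) that the model builds on has dimension 2^8 = 256, graded by antisymmetric products of its 8 generators; a quick bookkeeping check of that grading:

```python
# Cl(8) is graded by k-fold antisymmetric products of 8 generators,
# so the grade-k subspace has dimension C(8, k) and the total is 2^8.
from math import comb

dims = [comb(8, k) for k in range(9)]
print(dims)        # 1, 8, 28, 56, 70, 56, 28, 8, 1
print(sum(dims))   # 256 = 2^8
```

This says nothing about the model's physical claims; it only verifies the standard dimension count for Cl(8).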
The Hollands and Wald paper at gr-qc/0405082 is discussed in physics/0603112 by Bert Schroer, who says therein:
"... there are two important concepts of localizations in relativistic quantum theory: the Newton-Wigner localization and modular localization. ...the N-W localization results from the adaptation of the Born x-space localization probability to relativistic wave functions and connects to a position operator and associated localization projectors and probabilities ...
The use of N-W localization becomes ... deadly wrong (superluminal acausalities) if used for propagation over finite distances.
modular localization results from the attempt to liberate the causal localization inherent in pointlike quantum fields from the non-intrinsic aspects of field-coordinatization. Modular localization theory assigns a preferred role to operator algebras associated with spatial wedge regions; in a sense which can be made precise, wedge algebras implement the best compromise between particles and fields. Just as in the Lagrangian quantization approach the perturbative construction of a model is in principle determined once one has specified the Lagrangian, a QFT in the modular localization setting is uniquely determined in terms of the structure of its wedge algebras (the position of the wedge algebra within the algebra of all operators, or the algebraic structure of generators). The algebras for smaller regions (spacelike cones, double cones) are determined in terms of algebraic intersections of wedge algebras. ...
the local net of spacetime-indexed operator algebras which ... quantum fields ... generate ... is analogous to the coordinate-independent setting achieved in modern differential geometry. ...
Modular localization ... does not lead to projectors and probabilities but is the correct concept for the covariant causal localization of states and operator algebras. ... There are massive Wigner representations in d=1+2 dimensions with anomalous (non-halfinteger) spin whose associated fields have plektonic (braid group) statistics which is inconsistent with a pointlike localization as well as with an on-shell structure. More precisely, even in the "freest version" (vanishing scattering cross section) the realization of braid group statistics requires that any operator whose one-time application to the vacuum is a state with a one-particle component has necessarily a nonvanishing vacuum polarization cloud; in other words, there is no on-shell free anyon field. Many properties of anyons (anyon = abelian plekton) can be seen by applying modular localization to the Wigner representation. ...
if vector potentials become quantum objects ... The standard way... is to temporarily forget the positivity requirement and to uphold the pointlike structure so that the usual perturbative Lagrangian quantization approach could be applied. This is achieved by artificially extending the quantum theory by adding in unphysical ghosts which at the end of the calculation have to be removed.
From the outset it is not clear that, after having done the perturbation theory in this unphysical setting, one can remove the ghosts from quantities which in the classical sense would be gauge invariant, and the best formulation which makes such a descent manifest (by formulating the physical descent as a cohomological problem) is the well known BRST formalism of gauge theory. ... The problem starts if such a BRST "catalyzer" (ghosts are neither in the original problem of spin one particle representations nor in the final physical answers) is not considered as a temporary computational trick, but becomes elevated to the status of a fundamental physical tool. ...
Behind all these remarks is a theory which, after the Hilbert space operator formulation of quantum mechanics, is the most impressive example of a perfect matching of mathematical concepts: the modular (Tomita-Takesaki) theory of operator algebras, with its unifying, awe-inspiring power to relate statistical mechanics, quantum field theoretical localization, and the local quantum physical raison d'etre for the emergence of internal and external symmetries from general properties of operator algebras. ... Connes used these new concepts to significantly extend the classification of factor algebras started by Murray and von Neumann. ...
A particularly radical result in comparison with the standard Lagrangian setting is the possibility to describe a full-fledged QFT with all its structural richness in terms of a finite number of "monades" i.e. copies of one unique object in a common environment such that all physics is encoded in the relative positions of these copies. If one interprets the word monade in this physical realization of Leibniz's philosophy as the unique (up to isomorphism) hyperfinite type III1 Murray von Neumann factor ...
[ In my D4-D5-E6-E7-E8 VoDou Physics Model, the "monade" is a generalized hyperfinite II1 von Neumann algebra factor whose basic building block is the real Clifford algebra Cl(8).
John Baez, in his week 175, described the classification of von Neumann algebras:
"... While classifying all *-algebras of operators is an utterly hopeless task, classifying von Neumann algebras is almost within reach - close enough to be tantalizing, anyway. Every von Neumann algebra can be built from so-called "simple" ones as a direct sum, or more generally a "direct integral", which is a kind of continuous version of a direct sum. As usual in algebra, the "simple" von Neumann algebras are defined to be those without any nontrivial ideals. This turns out to be equivalent to saying that only scalar multiples of the identity commute with everything in the von Neumann algebra.
People call simple von Neumann algebras "factors" for short. Anyway, the point is that we just need to classify the factors: the process of sticking these together to get the other von Neumann algebras is not tricky.
The first step in classifying factors was done by von Neumann and Murray, who divided them into types I, II, and III. This classification involves the concept of a "trace", which is a generalization of the usual trace of a matrix.
Here's the definition of a trace on a von Neumann algebra. First, we say an element of a von Neumann algebra is "nonnegative" if it's of the form xx* for some element x. The nonnegative elements form a "cone": they are closed under addition and under multiplication by nonnegative scalars. Let P be the cone of nonnegative elements. Then a "trace" is a function tr: P -> [0, +infinity] which is linear in the obvious sense and satisfies tr(xy) = tr(yx) whenever both xy and yx are nonnegative.
Note: we allow the trace to be infinite, since the interesting von Neumann algebras are infinite-dimensional. This is why we define the trace only on nonnegative elements; otherwise we get "infinity minus infinity" problems. The same thing shows up in measure theory, where we start by integrating nonnegative functions, possibly getting the answer +infinity, and worry later about other functions.
Indeed, a trace is very much like an integral, so we're really studying a noncommutative version of the theory of integration. On the other hand, in the matrix case, the trace of a projection operator is just the dimension of the space it's the projection onto. We can define a "projection" in any von Neumann algebra to be an operator with p* = p and p^2 = p. If we study the trace of such a thing, we're studying a generalization of the concept of dimension. It turns out this can be infinite, or even nonintegral!
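As a concrete check on Baez's remark that the trace of a projection measures dimension, here is a minimal numpy sketch (my own illustration, not part of Baez's text):

```python
import numpy as np

# A rank-1 orthogonal projection p onto span{v}: p* = p, p^2 = p,
# and its trace equals the dimension of its range.
v = np.array([1.0, 2.0, 2.0])
v /= np.linalg.norm(v)
p = np.outer(v, v)

assert np.allclose(p, p.T)     # self-adjoint
assert np.allclose(p @ p, p)   # idempotent
print(round(np.trace(p)))      # trace = dimension of range = 1
```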
We say a factor is type I if it admits a nonzero trace for which the trace of a projection lies in the set {0,1,2,...,+infinity}. We say it's type I_n if we can normalize the trace so we get the values {0,1,...,n}. Otherwise, we say it's type I_infinity, and we can normalize the trace to get all the values {0,1,2,...,+infinity}. It turns out that every type I_n factor is isomorphic to the algebra of n x n matrices. Also, every type I_infinity factor is isomorphic to the algebra of all bounded operators on a Hilbert space of countably infinite dimension. Type I factors are the algebras of observables that we learn to love in quantum mechanics. So, the real achievement of von Neumann was to begin exploring the other factors, which turned out to be important in quantum field theory.
We say a factor is type II_1 if it admits a trace whose values on projections are all the numbers in the unit interval [0,1]. We say it is type II_infinity if it admits a trace whose value on projections is everything in [0,+infinity]. Playing with type II factors amounts to letting dimension be a continuous rather than discrete parameter! Weird as this seems, it's easy to construct a type II_1 factor. Start with the algebra of 1 x 1 matrices, and stuff it into the algebra of 2 x 2 matrices as follows:
x |-> ( x 0 )
      ( 0 x )
This doubles the trace, so define a new trace on the algebra of 2 x 2 matrices which is half the usual one. Now keep doing this, doubling the dimension each time, using the above formula to define a map from the 2^n x 2^n matrices into the 2^(n+1) x 2^(n+1) matrices, and normalizing the trace on each of these matrix algebras so that all the maps are trace-preserving. Then take the union of all these algebras... and finally, with a little work, complete this and get a von Neumann algebra! One can show this von Neumann algebra is a factor. It's pretty obvious that the trace of a projection can be any fraction in the interval [0,1] whose denominator is a power of two. But actually, any number from 0 to 1 is the trace of some projection in this algebra - so we've got our paws on a type II_1 factor. This isn't the only II_1 factor, but it's the only one that contains a sequence of finite-dimensional von Neumann algebras whose union is dense in the weak topology. A von Neumann algebra like that is called "hyperfinite", so this guy is called "the hyperfinite II_1 factor". It may sound like something out of bad science fiction, but the hyperfinite II_1 factor shows up all over the place in physics! First of all, the algebra of 2^n x 2^n matrices is a Clifford algebra, so the hyperfinite II_1 factor is a kind of infinite-dimensional Clifford algebra. But the Clifford algebra of 2^n x 2^n matrices is secretly just another name for the algebra generated by creation and annihilation operators on the fermionic Fock space over C^n. Pondering this a bit, you can show that the hyperfinite II_1 factor is the smallest von Neumann algebra containing the creation and annihilation operators on a fermionic Fock space of countably infinite dimension. In less technical lingo - I'm afraid I'm starting to assume you know quantum field theory! - the hyperfinite II_1 factor is the right algebra of observables for a free quantum field theory with only fermions. For bosons, you want the type I_infinity factor.
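The doubling construction described above is easy to play with numerically. This is my own hedged sketch (not Baez's code) of the embedding and the renormalized trace:

```python
import numpy as np

def double(x):
    """The embedding x |-> diag(x, x) of the 2^n x 2^n matrices
    into the 2^(n+1) x 2^(n+1) matrices."""
    z = np.zeros_like(x)
    return np.block([[x, z], [z, x]])

def normalized_trace(x):
    """tr / dim: the normalization that makes every doubling step
    trace-preserving."""
    return np.trace(x) / x.shape[0]

# A rank-1 projection in the 2 x 2 matrices has normalized trace 1/2 ...
p = np.diag([1.0, 0.0])
print(normalized_trace(p))   # 0.5

# ... and doubling preserves that value, so the limit algebra (the
# hyperfinite II_1 factor) has projections of trace k / 2^n for all k, n.
q = double(double(p))        # the same projection viewed in the 8 x 8 matrices
print(normalized_trace(q))   # 0.5
```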
There is more than one type II_infinity factor, but again there is only one that is hyperfinite. You can get this by tensoring the type I_infinity factor and the hyperfinite II_1 factor. Physically, this means that the hyperfinite II_infinity factor is the right algebra of observables for a free quantum field theory with both bosons and fermions.
The most mysterious factors are those of type III. These can be simply defined as "none of the above"! Equivalently, they are factors for which any nonzero trace takes values in {0,infinity}. In a type III factor, all projections other than 0 have infinite trace. In other words, the trace is a useless concept for these guys. As far as I'm concerned, the easiest way to construct a type III factor uses physics. Now, I said that free quantum field theories had different kinds of type I or type II factors as their algebras of observables. This is true if you consider the algebra of all observables. However, if you consider a free quantum field theory on (say) Minkowski spacetime, and look only at the observables that you can cook from the field operators on some bounded open set, you get a subalgebra of observables which turns out to be a type III factor! In fact, this isn't just true for free field theories. According to a theorem of axiomatic quantum field theory, pretty much all the usual field theories on Minkowski spacetime have type III factors as their algebras of "local observables" - observables that can be measured in a bounded open set. ...". ]
... the environment ...[is]... a joint Hilbert space in which this operator algebra sits in different positions ... if these relative positions are defined in an appropriate way in terms of modular operator algebra concepts (modular inclusions and intersections with a joint vacuum), then the existence of a Poincare (or conformal) spacetime symmetry group and of a net of local algebras (generated from the action of these symmetries on the monads) are consequences. ...
A recently solved interesting problem of QFT which required a conceptual insight beyond the standard setting is the quantum adaptation of Einstein's local covariance principle to QFT in curved spacetime. The reason why it took such a long time to understand this issue is that the local (patch-wise) isometric diffeomorphisms of the classical theory have no straightforward implementation on the level of quantum states (as compared to the unitarily implemented standard global spacetime symmetries, such as Poincare invariance of the vacuum state in Minkowski spacetime). The standard formalism for expectation values based on a Lagrangian action functional (or any other quantization formalism) does not separate states from operators. ... after the algebraic approach led to such a separation, one learned how to ... formulate quantum local covariance ...[using]... its algebraic functorial formulation in terms of a functor which relates a category of causal manifolds with a category of certain algebras; the old problem one had with states then became clear: states are dual to algebras.
When one dualizes the algebraic statement one finds that only folia of states are invariant; quantum local covariance does not leave them individually unchanged. The upshot of these investigations is a new way of looking at QFT: instead of considering quantum fields on prescribed Lorentzian causally complete (globally hyperbolic) manifolds, a field theory model in the new setting is a functor between all causally complete manifolds and an operator algebraic category (e.g. the Weyl algebra, the CAR algebra, ...).
... the ... quantum version of Einstein's local covariance ... does not support the naive zero point energy counting arguments as in ... S. Weinberg, Rev. Mod. Phys. 61, (1989) 1 ... which treat the vacuum as a relativistic quantum mechanical level system. These arguments have been uncritically used by many particle physicists ... In a very interesting paper Hollands and Wald ...[ gr-qc/0405082 ]... show that the local covariance setting of QFT contradicts such a relativistic quantum mechanical picture of filling momentum space levels ... which is in harmony with the idea that momentum space (Fourier transform) only acquires its physical interpretation through covariant localization and not the other way around. Unfortunately the incorrect idea that QFT is some sort of relativistic quantum mechanics is extremely widespread, so that their arguments probably will not get the attention which they deserve.
Needless to say, generically curved spacetime reference states which replace the Minkowski vacuum do lead to a nonvanishing expectation value of the correctly (in agreement with the local covariance principle) defined stress-energy tensor. There is however a new coupling parameter involving the curvature, and within a curved spacetime setting one has to make assumptions about its numerical strength. Hollands and Wald show that a theory based on an energy-stress tensor quantized according to the requirement of local covariance does not lead to such gigantic values for the cosmological constant. ...".
Further, Schroer goes on to say:
"... The string-theorists "only game in town" claim is based on the belief that the main content of QFT is already known. But if a theory allows for such a radically different conceptual setting as I have indicated ... it is quite far from having reached its closure. It rather seems to call for another post-renormalization revolutionary step before it can reach its final form. ...
... The crisis in particle physics ... finds its most visible outing in the hegemony of string theory ... As long as some leading physicists, including Nobel prize winners, fail to play their natural role as critical observers (in contrast to their more critical predecessors such as Pauli, who kept particle physics in a healthy rational state), the present situation will continue and may even deteriorate. ... There is a related sociological problem. If an idea which promoted the careers of many physicists is kept alive for such a long time, it becomes immune to criticism. I think nobody at this late point would seriously expect that somebody who invested more than 3 decades in a theory which led to tens of thousands of publications but failed to make contact with real physics will come up and say "sorry folks, this was it"? ...
With a large number of chairs at theoretical physics departments worldwide being occupied by string theorists I do not see much hope. ... the chance for a radical change of direction through newcomers entering particle physics will remain extremely dim ... Nowadays somebody who has the capabilities and the guts to resist the lure of the string hegemon in pursuit of his own original ideas will run a high risk to see the end of his academic career with no old-fashioned patent office around which could serve as a temporary shelter. ...".
In the .mov version of Atiyah's 24 Oct 2005 KITP talk Atiyah says (at about 1:12:24) that his class of models is based on "… the past history of a particle moving as a real particle …", which seems to me to be the past world-line of the particle. An audience member describes to Atiyah (at about 1:05:02):
"… a common thread between the class of models you are suggesting, Connes' class of models, and some unsolved problems in string theory. So, one simple way to think about the class of models you are talking about is just to do a power series expansion of x(t-r) in t and … the higher derivatives of t so then you have an infinite order differential equation.
Similarly, quantum field theory on a noncommutative spacetime can be expressed in terms of a star product which is an exponential of derivatives and therefore is also in some sense a differential equation with an infinite number of derivatives and
the best nonperturbative formulation of string theory we have is string field theory which is expressed in terms of Witten's star product on strings which is also expressed in terms of some exponential of derivatives, but which we don't understand how to grapple with as well …".
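The audience member's first point can be made concrete with a short symbolic computation. This is my own illustration, assuming only the standard Taylor expansion x(t-r) = sum_n (-r)^n / n! x^(n)(t) implicit in the quote:

```python
import sympy as sp

t, r = sp.symbols('t r')
x = sp.Function('x')

# Four terms of x(t - r) = sum_n (-r)^n / n! * d^n x / dt^n:
# the delayed value becomes an expression with ever-higher derivatives,
# i.e. (in the limit) an infinite order differential operator applied to x.
series4 = sum((-r)**n / sp.factorial(n) * x(t).diff(t, n) for n in range(4))

# Sanity check: for x(t) = t^3 the four-term expansion is already exact.
check = series4.replace(x(t), t**3).doit()
assert sp.expand(check - (t - r)**3) == 0
print(series4)
```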
It seems to me that a natural physical interpretation of that "common thread" is that strings should be interpreted as world-lines, NOT as individual particles or precursors of individual particles.
However, Atiyah himself does not agree. He said to me by e-mail "Yes my idea is to make things depend on the past world line of the particle. Your comment on infinite numbers of derivatives is pertinent but does not really help. No, I do not think world lines are strings. I should emphasise that my ideas are very tentative and evolving all the time.".
Quantum Ranger, in a comment on Peter Woit's blog, asked "… what happens to the missing "planck-memory", it seems to be forever evolving "backwards", as for sure, even the Planck-memory has to been formated from a previous "past"? …".
I agree that is a good question, and it is also something that nagged in the back of my mind (in the form of why should the memory / past world-line be cut off at the Planck scale).
If there were no past time cutoff, then the basic entity would be the entire (back to the big bang?) past history world-line of each particle. Maybe such a model would be like that of Andrew Gray, who said in quant-ph/9712037 (in the abstract) "… probabilities are … assigned to entire fine-grained histories. The formulation is fully relativistic and applicable to multi-particle systems. It shall be shown that this new formulation makes the same experimental predictions as quantum field theory …".
The same Andrew Gray proposed a "Quantum Time Machine" in version 1 of quant-ph/9804008v1, but he withdrew that proposal on 8 Aug 2004, the same day that he posted version 2 of quant-ph/9712037. Therefore, it seems to me that although Andrew Gray felt his "Quantum Time Machine" was flawed, he still feels that his formulation of quantum theory in terms of "entire fine-grained histories", which sounds to me a lot like Atiyah's model without the Planck-scale cutoff, is valid.
I wonder whether Atiyah knows of Gray's model, and, if so, how he (Atiyah) thinks it compares with his (Atiyah's) model.
Peter Woit said in his blog: "… Both Atiyah and Witten are extremely quick on their feet. … Raoul Bott, who had just walked away from Atiyah and Witten, shaking his head … told me he found listening to the two of them "scary" since they were so much quicker than he was. Bott is a great mathematician also, but one who has to think everything through slowly and carefully to understand it, quite different than Atiyah or Witten. …".
Peter Woit's characterization might be that Atiyah is a Hare and Bott is a Tortoise, yet working together they produced wonderful results. In an interview Bott described his work with Atiyah, saying: "… In most of my papers with Atiyah he would write the final drafts and his tendency was to make them more abstract. …".
Bott went on to say: "… I like the old way of presenting things with an example that gives away the secret of the proof rather than dazzling the audience. … on the whole I like the problems to be concrete. I'm a bit of an engineer. For instance, in topology early on the questions were very concrete - we wanted to find a number! …".
As to physics and physicists, Bott made an observation about the Princeton IAS under Oppenheimer: "… Oppenheimer had taken over, and he was very dominant in the physics community. He had a seminar that every physicist went to. We mathematicians always thought they ran off like sheep, for we would pick and choose our seminars! …". Maybe superstring theory under Witten and Gross is the contemporary manifestation of physicists' sheep-like behaviour.
Bott, in his interview at http://www.ams.org/notices/200104/fea-bott.pdf , said "… the start of my long and wonderful collaboration with Michael Atiyah. We first of all gave a new proof of the periodicity theorem which fitted into the K-theory framework … Then Grothendieck, in the purely algebraic context, gave a … proof …[of]… the index theorem … using his K-theory in the formal, algebraic way. … Before, we had taken complex analysis or algebraic geometry as a given, so that the differential operator was hidden … here, suddenly the topological twisting of the differential operator came into the equation. Of course, Atiyah and Singer immediately realized that this twisting is measured with the homotopy groups of the classical groups, by the so-called symbol. Eventually the whole development of index theory fitted the periodicity theorem into the subject as an integral part. Atiyah very rightly chose Singer to collaborate on this project. …".
In their book Spin Geometry (Princeton 1989 at page 277), Lawson and Michelsohn said: "… In 1982, E. Witten found a different approach … through consideration of symplectic geometry and supersymmetry. … he outlined a proof of the index theorem for the Atiyah-Singer operator … however … none of these methods [including Witten's] applies to prove the index theorem for families or the Cl_k - index Theorem (in their strong forms). These theorems in general involve torsion elements in K-theory which are not detectable by cohomological means. …".
In his book Introduction to Superstrings and M-theory (Second Edition, Springer 1999, 1988 at page 338), Michio Kaku said: "… new developments in supersymmetry have now made it possible to prove the Atiyah-Singer index theorem from a simple Lagrangian. Traditionally, the proof of the Atiyah-Singer theorem has been inaccessible to most physicists because of the intricacies of the mathematical formulation. …".
Reading those excerpts in sequence leads me to think that a reason that superstring physicists are so attached to supersymmetry is that it is only through Witten's supersymmetric approach that they can understand the Atiyah-Singer index theorem.
However, by restricting themselves to the Witten supersymmetric construction, the supersymmetry physics people are cutting themselves off from possibly very fruitful avenues of constructing new and possibly realistic physics models.
For instance, Lawson and Michelsohn, at page 270 of their book cited above, said [I have omitted some tildes etc from notation due to ASCII limitations]: "… Given a real operator … in the basic case, no information is lost under complexification. This is not true, however, if one passes to the index theorem for families. The index of a family of real operators takes its value in the group KO(A), and ... for example … KO(S^n) = Z_2 for n = 1 (mod 8) but K(S^n) = {0} in these dimensions. For this reason Atiyah and Singer established a separate index theorem for families of real operators. It is a more subtle and profound result … the appropriate theory is not KO-theory … It is the more general KR-theory …".
If Kaku's assessment of physicists' inability to understand a KR-theoretical index theorem is correct, then I share Peter Woit's sense of loss if Atiyah is not now "working on the relation between K-theory and physics".
According to Freund in chapter 21 of his book Supersymmetry (Cambridge 1986) where chapter 21 is a NON-SUPERSYMMETRY chapter leading up to a supergravity description in the following chapter 22: "... Einstein gravity as a gauge theory ... Whether the gauge group be the Poincare or the [anti-] de-Sitter group, we expect a set of gauge fields w^ab_u for the Lorentz group and a further set e^a_u for the translations, ... Everybody knows though, that Einstein's theory contains but one spin two field, originally chosen by Einstein as g_uv = e^a_u e^b_v n_ab (n_ab = Minkowski metric). What happened to the w^ab_u ? The field equations obtained from the Hilbert-Einstein action by varying the w^ab_u are algebraic in the w^ab_u ... permitting us to express the w^ab_u in terms of the e^a_u ...". The w do not propagate ... ... We start from the four-dimensional de-Sitter algebra ...so(3,2). Technically this is the anti-de-Sitter algebra ... We envision space-time as a four-dimensional manifold M. At each point of M we have a copy of SO(3,2) (a fibre ...) ... and we introduce the gauge potentials (the connection) h^A_mu(x) A = 1,..., 10 , mu = 1,...,4. Here x are local coordinates on M. From these potentials h^A_mu we calculate the field-strengths (curvature components) [let @ denote partial derivative] R^A_munu = @_mu h^A_nu - @_nu h^A_mu + f^A_BC h^B_mu h^C_nu ...[where]... the structure constants f^C_AB ...[are for]... the anti-de-Sitter algebra .... We now wish to write down the action S as an integral over the four-manifold M ... S(Q) = INTEGRAL_M R^A /\ R^B Q_AB where Q_AB are constants ... to be chosen ... we require ... the invariance of S(Q) under local Lorentz transformations ... the invariance of S(Q) under space inversions ... ...[ AFTER A LOT OF ALGEBRA THAT I WON'T TYPE HERE ]... we shall see ...[that]... the action becomes invariant under all local [anti]de-Sitter transformations ...[and]... we recognize ... 
the familiar Hilbert-Einstein action with cosmological term in vierbein notation ... Variation of the vierbein leads to the Einstein equations with cosmological term. Variation of the spin-connection ... in turn ... yield the torsionless Christoffel connection ... the torsion components ... now vanish. So at this level full sp(4) invariance has been checked. ... Were it not for the assumed space-inversion invariance ... we could have had a parity violating gravity. ... Unlike Einstein's theory ...[MacDowell-Mansouri].... does not require Riemannian invertibility of the metric. ... the solution has torsion ... produced by an interference between parity violating and parity conserving amplitudes. Parity violation and torsion go hand-in-hand. Independently of any more realistic parity violating solution of the gravity equations this raises the cosmological question whether the universe as a whole is in a space-inversion symmetric configuration. ...".
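Freund's field-strength formula R^A_munu = @_mu h^A_nu - @_nu h^A_mu + f^A_BC h^B_mu h^C_nu can be sketched in a few lines of symbolic code. This is my own toy check, not Freund's: to keep it short it uses su(2) structure constants (f^A_BC = epsilon_ABC) as a stand-in for the anti-de-Sitter algebra so(3,2), and the gauge potential h^A_mu(x) is an arbitrary polynomial I made up; the check verifies only the antisymmetry of R^A_munu in (mu, nu):

```python
import sympy as sp

xs = sp.symbols('x0:4')   # local coordinates on the four-manifold M

# an arbitrary polynomial gauge potential h^A_mu(x); A = 0..2, mu = 0..3
h = [[(A + 1) * xs[mu] * xs[(mu + 1) % 4] for mu in range(4)]
     for A in range(3)]

def R(A, mu, nu):
    """Curvature R^A_munu = @_mu h^A_nu - @_nu h^A_mu + f^A_BC h^B_mu h^C_nu,
    with f^A_BC = epsilon_ABC (su(2) toy model)."""
    quad = sum(sp.LeviCivita(A, B, C) * h[B][mu] * h[C][nu]
               for B in range(3) for C in range(3))
    return sp.diff(h[A][nu], xs[mu]) - sp.diff(h[A][mu], xs[nu]) + quad

# the curvature components are antisymmetric in (mu, nu), as they must be
assert all(sp.expand(R(A, m, n) + R(A, n, m)) == 0
           for A in range(3) for m in range(4) for n in range(4))
print("antisymmetry of R^A_munu checked")
```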
Ark Jadczyk physics model discussion - February 2006 based on John Gonsowski Signs of the Times Forum

Date: Sun, 26 Feb 2006 20:54:49 -0500
To: lark1@quantumfuture.net
From: Tony Smith <f75m17h@mindspring.com>
Subject: Signs of Times Forum

Ark, thanks for your message and reference to the Signs of the Times Forum discussion about some aspects of my web site. I am a bit late and slow in replying because I have just returned from a trip to Connecticut (a final illness and funeral, so somewhat sad and stressful). A lot of points have been raised in your discussion with John G, and it is easier for me to deal with them by e-mail to you and let you post it (or any part of it that you find interesting) if you want to do so. Feel free to delete stuff you find uninteresting. Here are some of the points that I see, and some comments in which I will ignore signature issues to try to keep this short:

As to fundamental structure as it emerges from Clifford algebra, see my web page at http://www.valdostamuseum.org/hamsmith/ClifTensorGeom.html

Roughly, if the universe is describable by a union of all possible (some very large) real Clifford algebras, and then you use real periodicity to factor each Clifford algebra into a tensor product of Cl(8) tensor algebras, then that union (in the limit) is a generalization of the well-known von Neumann hyperfinite II1 factor, where the generalization is the replacement of the complex 2x2 structures by the real Cl(8) Clifford algebra. Then, if each Cl(8) in each tensor product Clifford algebra "chain" describes the physics of a small neighborhood in our universe, each "chain" can (and probably will) "fold up" in such a way that each Cl(8) neighborhood is connected to other Cl(8) neighborhoods so that they collectively form a macroscopic region of an 8-dim spacetime that has a Planck-scale lattice structure. In this sense, an 8-dim spacetime "emerges" from a primordial giant "union" or collection of all possible real Clifford algebras.
The 8-dim spacetime has a natural continuum approximation whose structure is M4 x CP2, with physics somewhat related to the Kaluza-Klein model of Batakis as described on my web page at http://www.valdostamuseum.org/hamsmith/YamawakiCP2KKNJL.html#Batakis

This web page also describes in some detail how, in this model, Nambu-Jona-Lasinio type T-quark condensates, and their connection to Higgs and Vacua, explain the three peaks of T-quark events that have been observed by Fermilab (two of those three peaks have been ignored by Fermilab's official consensus publications, but they are there nonetheless, and my web site has some discussions about that).

As to where gravity comes from in this picture, the Cl(8) bivector Lie algebra Spin(8) has as a subalgebra the conformal Spin(2,4) = SU(2,2), which produces gravity by the MacDowell-Mansouri mechanism http://www.valdostamuseum.org/hamsmith/cnfGrHg.html#CnfMMgr

Since the conformal group has the anti-deSitter/Poincare group as a subgroup, it is possible that some regions in our universe (such as our solar system inside the orbit of Uranus) see gravity in a non-expanding Poincare phase, like the fixed pennies on an expanding balloon, while some regions beyond Uranus see gravity in an expanding conformal phase, like the expanding surface of a balloon. The picture is motivated by, and the conformal phase is similar to, Segal's conformal gravity http://www.valdostamuseum.org/hamsmith/SegalConf.html and it is consistent with (and effectively explains) both the Pioneer anomaly and the unusual rotational axis of Uranus.
http://www.valdostamuseum.org/hamsmith/SegalConf3.html#pioneerexpmt

and it also gives Dark Energy : Dark Matter : Ordinary Matter ratios that are in agreement with the WMAP observations http://www.valdostamuseum.org/hamsmith/cosconsensus.html#grvphtncc

Possible exploitation of Dark Energy using coherent arrays of small Josephson Junctions is discussed at http://www.valdostamuseum.org/hamsmith/coscongraviton.html#QEDDE

It is interesting that such coherent Josephson Junction arrays may be related to coherent tubulin arrays in human consciousness http://www.valdostamuseum.org/hamsmith/QuanCon.html and that there might be interesting resonant connections among such related structures http://www.valdostamuseum.org/hamsmith/QuantumMind2003.html

I will note that the link immediately above is to a version of a paper that was barred from the Cornell arXiv due to their blacklisting me. Note that they have allowed posting by others on the subject of quantum consciousness, for example, papers by Fred Thaheld at quant-ph/0509042 and physics/0601060 and others.

As to why strings might be relevant, see my construction of a physically realistic string theory model at http://cdsweb.cern.ch/search.py?recid=730325&ln=en Due to being blacklisted by the Cornell arXiv, it is not there, but I put it up on the above site at CERN shortly before CERN discontinued its EXT series on its preprint server.

As to information and the initial "big bang", I generally subscribe to the approach of Paula Zizzi as I discuss at http://www.valdostamuseum.org/hamsmith/cosm.html#QCdSInfl

There are many technical issues, some of which need more work, but in a short message there is no way that I can deal with them all. However, the end result is that I have constructed a physics model that is in reasonably good (in my opinion) agreement with all experimental observations.
See my web page at http://www.valdostamuseum.org/hamsmith/2002SESAPS.html and, for neutrino masses and mixing angles, http://www.valdostamuseum.org/hamsmith/snucalc.html#asno

I wish that I could say that I would happily discuss by e-mail and in blogs etc all questions in detail, but I am only one person with a limited time in life, and there are MANY very detailed questions, and I have no help from institutional affiliation or school of coworkers, so for further details I refer any interested party to the above links and to all other material on my web site. I know that there may exist errors in detail, and that my terminology may not be precisely what everyone else would like, and that some parts of my web site etc were written years before others, and there may be some inconsistencies due to evolution of my thinking, but I believe that any inconsistencies and any errors in detail and terminology are correctable and that as an overall structure the above model gives a substantially accurate description of nature and some interesting avenues for future exploration (particularly with respect to Dark Energy and consciousness).

Tony

Date: Mon, 27 Feb 2006 11:14:51 -0500
To: lark1@quantumfuture.net
From: Tony Smith <f75m17h@mindspring.com>
Subject: Lagrangian

Ark, on the Signs of the Times Forum, you ask why I use the Lagrangian that I use. My Lagrangian for 4-dim spacetime with CP2 internal symmetry space is inherited from a Lagrangian over 8-dim spacetime that comes from the Cl(8) Clifford Algebra.
The Cl(8) Clifford algebra structures give Lagrangian components as follows:

8-dim vector part -> 8-dim base manifold over which the Lagrangian density is integrated
28-dim bivector part -> gauge boson curvature term of Lagrangian density
16-dim spinor part -> 8-dim fermion spinor particle and 8-dim fermion spinor antiparticle part of Lagrangian density

Further details and the physical interpretation of the other parts of Cl(8) are given on my web site, as are details of how the 4-dim spacetime Lagrangian appears when you break the full octonionic symmetry of the 8-dim spacetime by introducing a preferred quaternionic subspace (it can be thought of as "freezing out" at lower energies such as where our experiments are done). I should note that the (not 1 to 1) supersymmetry between the 28 gauge bosons and the 8 fermion particle types may be useful in cancellations for ultraviolet finiteness for the 8-dim spacetime, which is useful for the 4-dim spacetime Lagrangian of the corresponding low-energy theory, and that the 3 generations of fermions for 4-dim spacetime come from the structure of the dimensional reduction from 8-dim spacetime due to freezing out of a preferred quaternionic subspace. (The 8-dim spacetime fermions have only one generation.) As I would hope might be clear from the above, the Lagrangian is NOT just made up ad hoc, it is a natural construction based on the structure of Cl(8).

Tony

Date: Mon, 27 Feb 2006 15:16:48 -0500
To: lark1@quantumfuture.net
From: Tony Smith <f75m17h@mindspring.com>
Subject: manifold and density

Ark, you ask "... What is your 8-dimensional manifold, and how you define your "Lagrangian density"? ...".
The 8-dim manifold is S1xS7, which is the Shilov boundary of the bounded complex domain of type IV(8) that corresponds to the type BD(I) rank 2 symmetric space Spin(10) / Spin(8)xU(1). After a particular quaternionic structure is frozen out, the resulting Kaluza-Klein type space has two parts: the compact internal symmetry space CP2 = SU(3) / U(2), and the 4-dim spacetime S1xS3, which is the Shilov boundary of the bounded complex domain of type IV(4) that corresponds to the type BD(I) rank 2 symmetric space Spin(6) / Spin(4)xU(1). Of course, I should say that your work, including papers with Coquereaux, has been very important to me in trying to understand such Lie sphere geometry structures. The complex domain of which the 4-real-dim spacetime is the Shilov boundary has physical significance in determining the relative strengths of the forces (gravity, color, weak, electromagnetic) using techniques motivated by (but not identical to) those of Armand Wyler in the 1960s-70s with respect to calculation of the electromagnetic fine structure constant. If you are interested in Wyler's work, you can go to my web site. The unpublished papers that he wrote while at the Princeton IAS under Freeman Dyson's directorship are both in one pdf file at http://www.valdostamuseum.org/hamsmith/WylerIAS.pdf It is about 17 MB in size, and is a long download for slow dialup.
Details about the Lagrangian (sorry for the notation, which is done due to limitations of html when I wrote it up) are at http://www.valdostamuseum.org/hamsmith/2002SESAPS.html#D4D5E6Lagrangian You can see an earlier version (not up to date in all technical details, but generally similar) in more familiar LaTeX notation in a paper that I put on the xxx.lanl.gov archive before it became the Cornell arXiv and before I was blacklisted: http://xxx.lanl.gov/abs/hep-ph/9501252 A very important unconventional technique is application of the work of Meinhard Mayer (who used the book of Kobayashi and Nomizu, volume 1) on dimensional reduction of gauge models. Here are references: Mayer, Hadronic Journal 4 (1981) 108-152, and also articles in New Developments in Mathematical Physics, 20th Universitatswochen fur Kernphysik in Schladming in February 1981 (ed. by Mitter and Pittner), Springer-Verlag 1981, which articles are: A Brief Introduction to the Geometry of Gauge Fields (written with Trautman); The Geometry of Symmetry Breaking in Gauge Theories; and Geometric Aspects of Quantized Gauge Theories. If it were more widely known, maybe my work would be more acceptable to the physics community, but it is not so widely known (of course, people like Mayer, Trautman, et al know it well, but they are not a big proportion of the physics community).

Tony

PS - IIRC (if I recall correctly, which may not be the case as my memory gets more questionable as I age), when I published my early work (including the Mayer mechanism stuff) in the 1980s in the International Journal of Theoretical Physics, I think that Mayer may have expressed an opinion about it, saying something like "... if even half of what this paper says is true, it is a very important paper ...". Unfortunately for me, he may be the only physicist who ever said such a thing, while a lot of physicists obviously dislike my work and/or me (since I am blacklisted).
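The Cl(8) bookkeeping behind the Lagrangian components in the messages above (8-dim vector part, 28-dim bivector part, 16-dim spinor part) is just the graded structure of the Clifford algebra, and can be checked with binomial coefficients. A minimal sketch (the variable names are mine, not Tony Smith's):

```python
from math import comb

# Grade k of Cl(8) has dimension C(8, k); the total dimension is 2^8 = 256.
grade_dims = [comb(8, k) for k in range(9)]
assert grade_dims == [1, 8, 28, 56, 70, 56, 28, 8, 1]
assert sum(grade_dims) == 2 ** 8

vector_dim = grade_dims[1]    # 8-dim vector part -> base manifold
bivector_dim = grade_dims[2]  # 28-dim bivector part -> gauge bosons (= dim of Spin(8))
spinor_dim = 2 ** (8 // 2)    # 16-dim spinor space: 8 half-spinor particles + 8 antiparticles
print(vector_dim, bivector_dim, spinor_dim)  # 8 28 16
```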
Date: Mon, 27 Feb 2006 20:28:51 -0500
To: lark1@quantumfuture.net
From: Tony Smith <f75m17h@mindspring.com>
Subject: Wyler and quaternion stuff
Cc: f75m17h@mindspring.com

Ark, you ask about Wyler and IAS. As far as I know, Wyler's two 1972 IAS papers were never published beyond being presented to Freeman Dyson. My copies were given to me in the 1980s by Robert Gilmore, whom I visited at Drexel to discuss Wyler's work. As to why I am pretty sure the papers were never published, the story is sad and interesting. When Wyler left IAS, he gave the two papers to Freeman Dyson, probably with the Operations paper on top of the Light Cone paper, and Wyler wrote on the top sheet "a Monsieur le Professeur Dyson, avec ma profonde reconnaissance A. Wyler" (to Professor Dyson, with my deep gratitude). When Gilmore was working on Wyler's stuff (1970s), he talked to Dyson, who gave him both papers. Dyson wrote to Gilmore (on the top page adjacent to the writing by Wyler): "R. Gilmore: This seems to be the only copy I have. Don't tell Wyler I gave it to you. F. D." As I talked with Gilmore in the 1980s, Gilmore gave me the papers. Gilmore told me that Dyson had told him that he (Dyson) was unhappy with Wyler because, during his (Wyler's) time at IAS, Wyler did not interact with anyone (did not give talks, did not even engage in informal discussion) but only stayed by himself at his desk in his little cubby-hole cubicle-like study place at IAS. Gilmore said that, although he thought there might be something to Wyler's ideas, Wyler had told him (Gilmore, on a visit to Switzerland) that he (Wyler) was not sure of physical interpretations, but that he (Wyler) thought the math was so beautiful that if he connected it to the fine structure constant maybe a lot of physicists would get interested in the subject matter.
Gilmore said that he (and his wife) were tired of fighting the hostility of the physics community, and therefore he was giving up any further work related to Wyler's approach (that is why Gilmore was OK with giving me the papers). I, however, have been stubborn ( and/or stupid ) enough to pursue such things despite the hostility (up to and including blacklisting) that I have encountered. Since I still have the papers, and they were the only ones that Dyson had, and also the only ones that Gilmore had, I guess that they have never been published anywhere, and they are probably the only copies in existence (unless Wyler kept some copies for himself). It makes me sad to think that the material has been so unappreciated (not to mention viciously attacked by people like David Gross) for so long. ------------------------------------------------------------- Also, you ask about choosing a quaternionic structure. One reason to choose a quaternionic structure is to look at an 8-dim E8 lattice. In 4 dimensions with basis {1,i,j,k}, the 24 D4 lattice nearest neighbors look like +/-1, +/-i, +/-j, +/-k (8 vertices) and (1/2)( +/- 1 +/- i +/- j +/- k) (16 vertices with all 2^4 sign possibilities). In 4-dim, you have the basis axes plus 16 diagonal-type vertices, with the components of the diagonal-type vertices evenly spread among all 4 basis elements. 
In 8 dimensions with octonionic basis {1,i,j,k,e,ie,je,ke}, an E8 integral domain has 240 nearest neighbors that look like +/-1, +/-i, +/-j, +/-k, +/-e, +/-ie, +/-je, +/-ke (16 vertices) and 224 vertices that look like

(±1 ±ie ±je ±ke)/2   (±e ±i ±j ±k )/2
(±1 ±ke ±e ±k )/2   (±i ±j ±ie ±je)/2
(±1 ±k ±i ±je)/2   (±j ±ie ±ke ±e )/2
(±1 ±je ±j ±e )/2   (±ie ±ke ±k ±i )/2
(±1 ±e ±ie ±i )/2   (±ke ±k ±je ±j )/2
(±1 ±i ±ke ±j )/2   (±k ±je ±e ±ie)/2
(±1 ±j ±k ±ie)/2   (±je ±e ±i ±ke)/2

In 8-dim, you have the basis axes plus 224 diagonal-type vertices, but the components of each of the diagonal-type vertices only have 4 of the 8 basis elements, so the E8 integral domain looks like it is sort of fundamentally 4-dim, and particularly that if the element e were replaced by 1, it would go down to 4-dim. Also, there are 6 other E8 integral domains of the same form, for a total of 7, and they correspond in a structurally natural way to the 7 imaginary octonions. The physical significance of the 7 different E8 integral domains appears when you stack 8-branes in the unconventional string theory model that I put on CERN EXT (just before it was discontinued) at http://cdsweb.cern.ch/search.py?recid=730325&ln=en In terms of continuum structure, picking a quaternionic subspace is most naturally described in terms of calibrations of the octonionic spacetime, in which there is an associative 3-form (say, basis {i,j,k}) that "picks out" the 3 spatial dimensions and a coassociative 4-form (say, basis {e,ie,je,ke}) that "picks out" the 4-dim internal symmetry space. Calibrations are described in books like Spinors and Calibrations, by Reese Harvey, and Spin Geometry, by Lawson and Michelsohn. All this may look complicated (so much so as to deter most physicists from working on it), but it fits together so well that it feels right to me (for example, if there were not 7 different E8 integral domains, then the stacking of 8-branes in the string version would not work right).
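The vertex counts above are easy to verify by brute force. A small sketch (labels and layout mine), counting the 24 D4 nearest neighbors and the 240 units of the E8 integral domain from the 14 quadruples listed above:

```python
from itertools import product

# D4 in 4 dims {1,i,j,k}: 8 vertices +/- basis, plus 2^4 = 16 of form (1/2)(+/-1 +/-i +/-j +/-k)
d4_neighbors = 2 * 4 + 2 ** 4
assert d4_neighbors == 24

# The 14 half-integer quadruples listed above (7 complementary pairs of 4 basis elements each)
quadruples = [
    ('1', 'ie', 'je', 'ke'), ('e', 'i', 'j', 'k'),
    ('1', 'ke', 'e', 'k'),   ('i', 'j', 'ie', 'je'),
    ('1', 'k', 'i', 'je'),   ('j', 'ie', 'ke', 'e'),
    ('1', 'je', 'j', 'e'),   ('ie', 'ke', 'k', 'i'),
    ('1', 'e', 'ie', 'i'),   ('ke', 'k', 'je', 'j'),
    ('1', 'i', 'ke', 'j'),   ('k', 'je', 'e', 'ie'),
    ('1', 'j', 'k', 'ie'),   ('je', 'e', 'i', 'ke'),
]
# Each pair on a line is complementary: together the two quadruples use all 8 basis elements.
for a, b in zip(quadruples[0::2], quadruples[1::2]):
    assert set(a) | set(b) == {'1', 'i', 'j', 'k', 'e', 'ie', 'je', 'ke'}

# 16 unit vertices +/- each basis element, plus all 2^4 sign choices for each quadruple
half_integer = {frozenset(zip(q, s)) for q in quadruples
                for s in product((1, -1), repeat=4)}
e8_neighbors = 2 * 8 + len(half_integer)
print(e8_neighbors)  # 240 = 16 + 14 x 16
```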
Also, I should note that in the string version, strings are world-lines, not little particles. Roughly, the string version gives something like the Bohm guiding quantum potential stuff. Short string world-lines are virtual internal loops, and long string world-lines are "asymptotic" particle paths. Such a physical interpretation of strings has not been well received by the conventional superstring community, to say the least.

Tony

Date: Mon, 27 Feb 2006 20:39:49 -0500
To: lark1@quantumfuture.net
From: Tony Smith <f75m17h@mindspring.com>
Subject: Wolf paper

Ark, you ask about a copy of Wolf, J. (1965), J. Math. Mech. 14, 1033. It has been a useful paper for me, but I think there were some technical errors in some of Wolf's early papers, probably including it, and that the most nearly correct version is at Joseph A. Wolf, The Geometry and Structure of Isotropy Irreducible Homogeneous Spaces, Acta Math. 120 (1968) 59-148, and Erratum, Acta Math. 152 (1984) 141-142. The stuff is mentioned in Wolf's book Spaces of Constant Curvature (5th ed) (Publish or Perish 1984), but there (page 202) Wolf says: "... Added in proof. A complete structure theory and classification has now been worked out for the isotropy irreducible riemannian manifolds. Unfortunately it is too long and technical to summarize here. ...", and he makes reference to the above Acta Math paper and its Erratum. I wish that he had expanded the book to include the material, but he did not. Tony
In astro-ph/0512327 Christian Beck says:
"... The physical nature of the currently observed dark energy in the universe is completely unclear, and many different theoretical models co-exist. Nevertheless, if dark energy is produced by vacuum fluctuations then there is a chance to probe some of its properties by simple laboratory tests based on Josephson junctions. These electronic devices can be used to perform 'vacuum fluctuation spectroscopy', by directly measuring a noise spectrum induced by vacuum fluctuations. One would expect to see a cutoff near 1.7 THz in the measured power spectrum, provided the new physics underlying dark energy couples to electric charge. The effect exploited by the Josephson junction is a subtle nonlinear mixing effect and has nothing to do with the Casimir effect or other effects based on van der Waals forces. A Josephson experiment of the suggested type will now be built, and we should know the result within the next 3 years. ...".
The Josephson experiment mentioned by Christian Beck is by P A Warburton of University College London. It is EPSRC Grant Reference EP/D029783/1, "Externally-Shunted High-Gap Josephson Junctions: Design, Fabrication and Noise Measurements", starting 1 February 2006 and ending 31 January 2009, with a value of £242,348. Its abstract states:
"... In the late 1990's measurements of the cosmic microwave background radiation and distant supernovae confirmed that around 70% of the energy in the universe is in the form of gravitationally-repulsive dark energy. This dark energy is not only responsible for the accelerating expansion of the universe but also was the driving force for the big bang. A possible source of this dark energy is vacuum fluctuations which arise from the finite zero-point energy of a quantum mechanical oscillator, hf/2 (where f is the oscillator frequency). Much experimental and theoretical astrophysics and cosmology research is currently focussed on confirming the source of dark energy. A recent publication by Beck and Mackey, however, suggests the possibility that dark energy may be measured in the laboratory using resistively-shunted Josephson junctions (RS-JJ's). Vacuum fluctuations in the resistive shunt at low temperatures can be measured by non-linear mixing within the Josephson junction. If vacuum fluctuations are responsible for dark energy, the finite value of the dark energy density in the universe (as measured by astronomical observations) sets an upper frequency limit on the spectrum of the quantum fluctuations in this resistive shunt. Beck and Mackey calculated an upper bound on this cut-off frequency of 1.69 THz. Measurements of quantum noise in Josephson junctions were performed in a quite different context in the early 1980's. Most notably for this work, the spectrum of zero-point fluctuations in RS-JJ's was measured by the Berkeley group, but only up to 0.6 THz. The upper frequency limit of these measurements was dictated by the gap energy of the lead-alloy superconductors used in that experiment. At higher frequencies tunnelling of quasiparticles dominates over all other electronic processes.
We therefore propose to perform measurements of the quantum noise in RS-JJ's fabricated using superconductors with sufficiently large gap energies that the full noise spectrum up to and beyond 1.69 THz can be measured. Unfortunately niobium junctions, which may now be repeatably and reproducibly fabricated, have a cut-off frequency of, at best, 1.5 THz. There are two candidate families of superconductor which present themselves as viable alternatives to niobium: the nitrides and the cuprates. Nitride junctions have cut-off frequencies of around 2.5 THz, which should give sufficiently low quasiparticle current noise around 1.69 THz at accessible measurement temperatures. Cuprate superconductors have an energy gap an order of magnitude higher than the nitrides, but here there is finite quasiparticle tunnelling at voltages less than the gap voltage, due to the d-wave pairing symmetry. By performing experiments on both the nitrides and the cuprates we will have two independent measurements of the possible cut-off frequency in two very different materials systems. This would give irrefutable confirmation (or indeed refutation) of the vacuum fluctuations hypothesis. ...".
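Beck and Mackey's 1.69 THz figure can be reproduced from the ( pi h / c^3 ) nu^4 zero-point density formula by equating it to the dark-energy density, here taken as 73% of a critical density of about 5 GeV/m^3. This numerical check is mine, not part of the grant abstract:

```python
from math import pi

h = 6.626e-34    # Planck constant, J s
c = 2.998e8      # speed of light, m/s
eV = 1.602e-19   # J per eV

# Dark-energy density: 73% of a critical density of about 5 GeV/m^3
rho_de = 0.73 * 5e9 * eV  # J/m^3

# If the cumulative zero-point density up to frequency nu is (pi h / c^3) nu^4,
# then the cutoff implied by rho_de is the fourth root:
nu_cutoff = (rho_de * c ** 3 / (pi * h)) ** 0.25
print(nu_cutoff / 1e12)  # ~1.7 (THz), matching Beck and Mackey's 1.69 THz bound
```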
Some points that may be relevant to the experiment are:
1 - the critical density in our universe now is about 5 GeV/m^3
2 - it is made up of Dark Energy : Dark Matter : Ordinary Matter in a ratio DE : DM : OM = 73 : 23 : 4
3 - the density of the various types of stuff in our universe now is therefore about DE = 3.65 GeV/m^3, DM = 1.15 GeV/m^3, and OM = 0.2 GeV/m^3
4 - the density of vacuum fluctuations already observed in Josephson Junctions is about 0.062 GeV/m^3 which is for frequencies up to about 6 x 10^11 Hz
5 - the radiation density (for photons) varies with frequency as the 4th power of the frequency, i.e., as ( pi h / c^3 ) nu^4
6 - if Josephson Junction frequencies were to be experimentally realized up to 2 x 10^12 Hz, then, if the photon vacuum fluctuation energy density formula were to continue to hold, the vacuum energy density would be seen to be 0.062 x (20/6)^4 = about 7 GeV/m^3 which exceeds the total critical density of our universe now
7 - to avoid such a divergence being physically realized, neutrinos should appear in the vacuum at frequencies high enough that E = h nu exceeds their mass of about 8 x 10^(-3) eV, or at frequencies over about 1.7 x 10^12 Hz
8 - if Josephson Junctions could be developed to see vacuum fluctuation frequencies up to 10^12 Hz, and if the photon equation were to hold there, then the observed vacuum fluctuation density would be about 0.5 GeV/m^3 which is well over the 0.2 GeV / m^3 Ordinary Matter energy density which means that DE and/or DM COMPONENTS WOULD BE SEEN IN VACUUM FLUCTUATIONS IN JOSEPHSON JUNCTIONS THAT GO UP TO 10^12 HZ FREQUENCY
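Points 4 through 8 are a simple fourth-power extrapolation; a quick numerical restatement (my sketch, using the figures quoted in the points above):

```python
rho_obs = 0.062  # GeV/m^3, vacuum-fluctuation density already seen in Josephson Junctions
nu_obs = 6e11    # Hz, the frequency up to which that density was measured

def rho(nu_hz):
    """Photon vacuum-fluctuation density extrapolated by the nu^4 law."""
    return rho_obs * (nu_hz / nu_obs) ** 4

# Point 6: at 2 x 10^12 Hz the extrapolation exceeds the ~5 GeV/m^3 critical density
assert rho(2e12) > 5.0
print(rho(2e12))  # ~7.7 GeV/m^3 (the "about 7" of point 6)

# Point 8: at 10^12 Hz it already exceeds the ~0.2 GeV/m^3 Ordinary Matter density
assert rho(1e12) > 0.2
print(rho(1e12))  # ~0.48 GeV/m^3 (the "about 0.5" of point 8)
```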
9 - Some other possibly relevant data are:
10^28 cm = present radius of our universe = 10^26 m. The radius of our universe at the time our solar system formed 5 billion years ago may have been about half its present radius.
Uranus orbit = 19 AU = 19 x 1.5 x 10^8 km = 19 x 1.5 x 10^11 m = 2.9 x 10^12 m
Uranus orbit volume = 4/3 x pi x 2.9^3 x 10^36 m^3 = 1.0 x 10^38 m^3
Earth Reserves Duration for 10^10 people using energy at present USA level (Terawatt-years - years of reserves):
1 GeV = 1.6 x 10^(-10) J, 1 eV = 1.6 x 10^(-19) J, and a megaton of TNT is 4.184 x 10^15 joules
Q the quad (short for quadrillion) is defined as 10^15 BTUs, which is about 1.055 x 10^18 joules,
If 10^10 people consumed enough energy to maintain a USA-type standard of living by using energy at the same rate as the present USA, that would be about 100 Q (quadrillion BTU, or 10^15 BTU), or about 300 x 10^11 kw-hours (kilowatt-hours), for each year, for about 3 x 10^8 (300 million) people, or about 10^5 kw-hours/year per person, for a total energy consumption for all 10^10 people per year of about 10^5 x 10^10 = 10^15 kw-hours/year = 3 x 10^3 Q/year.
Using about 10,000 hours in a year as an approximation to about 8,766 hours in a year: 1 Q = 3 x 10^11 kw-hours = 3 x 10^14 watt-hours = 300 x 10^12 watt-hours = 300 Terawatt-hours = 300 x 10^(-4) Terawatt-years = ( 1/30 ) Terawatt-years, so that the total energy consumption for all 10^10 people per year would be about 3 x 10^3 Q/year = about 100 Terawatt-years/year.
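A rough numerical check of the unit chain and totals above (constants and figures as quoted; variable names mine):

```python
BTU = 1055.0                  # J per BTU, so 1 Q = 10^15 BTU ~ 1.055e18 J
Q = 1e15 * BTU                # J per quad
kWh = 3.6e6                   # J per kilowatt-hour
TW_year = 1e12 * 8766 * 3600  # J per Terawatt-year

# Present USA: about 100 Q/year shared among about 3 x 10^8 people
per_person_kwh = 100 * Q / kWh / 3e8
print(per_person_kwh)  # ~1e5 kWh/year per person

# Scaled up to 10^10 people at the same per-person rate:
total_J = per_person_kwh * 1e10 * kWh
print(total_J / Q)        # ~3 x 10^3 Q/year
print(total_J / TW_year)  # ~1 x 10^2 Terawatt-years/year
```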
Narlikar and Padmanabhan, in their book Gravity, Gauge Theories, and Quantum Cosmology (Reidel 1986) say:
"... Let M denote a [Minkowski] spacetime manifold ... Consider the conformal transformation ... F ...[to a]... new manifold ...M_c ... we are essentially quantizing F against the background M ... in Einstein's description the geometry of spacetime provides the background for the operation of other physical interactions ... we have to ask the meaning that can be attached to such interactions during quantum fluctuations of spacetime geometry. In a general fluctuation the causal relation between two spacetime points is not invariant because the light cone structure is not preserved under the fluctuation. There is one exception ... The global light cone structure of a spacetime is preserved in a conformal transformation. Thus, the causal structure of the various propagators describing ... interactions ... is not affected during a conformal fluctuation and a conceptual picture of what is going on is preserved. ...
... the conformal degrees of freedom are capable of being quantized exactly by the path integral approach. ... the restriction to conformal degrees of freedom enables us to discuss the quantum effects in a fully covariant manner. In this respect the approach ... differs from Euclideanization of time ...[which]... departs from general covariance. ...
... cosmology ...[of]... quantum conformal fluctuations ... does lead to significant differences from classical cosmology.
[First]... the probability measure of singular cosmological models in the full range of conformal transformations is zero. Thus, singularity is unlikely to be the menace it has been in the classical context.
Second, our approach has produced a working theory rather than an abstract formalism, and ... explicit answers to path integrals can be obtained. ...
... the ... Quantum gravity ... metric is ... a quantum variable ...[so]... we cannot ascribe a definite value to the metric or to the line interval. ... Consider ... quantum conformal fluctuations around the flat space. Flat space can alternatively be considered to be the gravitational vacuum, and we expect gravitational vacuum fluctuations to be present in the flat space. The action governing the conformal fluctuations ... has the same form as the action for a massless scalar field except for an overall minus sign. ...
The quantity of interest is the probability amplitude for the measurement to give ... the value N ... as the resolution of the apparatus L goes to zero the fluctuations in N continue to increase. ... we cannot attribute a unique proper distance between two events ... In other words, the concept of a unique distance between events ceases to have any meaning when L = L_p ...[there is]... an 'uncertainty principle' delta N delta l > L_p where delta l is the uncertainty in the conformal factor ... the expectation value of the proper interval between any two events in spacetime is bounded from below at ( L_p / 2 pi )^2 ... Physically, we may consider the conformal factor as a 'conjugate variable' to proper length. The vacuum fluctuations of this conformal degree of freedom produce a 'zero point distance' in the spacetime. ... the divergence problem can be avoided if we use the expectation value of the proper length ... we shall quantize the conformal degree of freedom exactly and treat the background metric in a semiclassical limit. The back reaction of the QCF on the metric can be taken into account in the sense of expectation values. ... consider ... a system with cosmological constant /\ ... assume that it is positive ... The classical solution, corresponding to the principle of stationary action, may be taken to be the de Sitter universe ... Quantum fluctuations ... can induce a tunnelling through potential barrier and render the local minimum unstable. The QCFs make the de Sitter spacetime unstable and give it a finite lifetime t . ... The QCFs ... potential has the same form as that of the 'double-hump' potential ... The reciprocal of ... the tunnelling probability per unit time ... gives the lifetime t of the metastable ground state ... the inflation factor ... Z ... for the universe is ... an exponential of an exponential ... Z is huge for a wide range of lambda ... the minimum value of Z is = exp(106) and it occurs around lambda = 11 ...
Two 'natural' choices for lambda are = 1 (... dimensional reasons ...) and = 10^(-8) (if lambda arises from GUTs potentials.) These choices give Z values of exp(10^16) and exp(10^10), respectively. ...".
Jack Sarfatti, in an early draft of his paper at gr-qc/0602022, said
"...there is the conjecture of the added quantum gravity correction to the Heisenberg uncertainty principle: dx = ( hbar / dp ) + ( L_p^2 dp / hbar )
The general idea here is that if you pump too much energy into too small a volume you will create a black hole whose event horizon increases with additional energy causing an increase in the position uncertainty rather than the usual decrease.
With all of the above speculations in mind, I now define the 1-form invariant curved space-time tetrad field as
L = sqrt( hbar G / c^3 ) ( (dtheta)_phi - theta(dphi) )
We can think of the non-closed 1-form L as a line flux density because its integral around a closed loop is a measure of the proper time around the loop. ...
/\ = ( 1/ L_p^2 ) ( 1 - | PSI |^2 )
...[PSI is]... the large-scale vacuum condensate inflation field order parameter in the FRW metric limit ...".
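The corrected uncertainty relation dx = hbar/dp + L_p^2 dp / hbar quoted above has a minimum: setting the derivative in dp to zero gives dp = hbar / L_p, at which dx = 2 L_p, so the position uncertainty can never be squeezed below about two Planck lengths. A quick check (my sketch, not Sarfatti's code):

```python
hbar = 1.055e-34  # J s
G = 6.674e-11     # m^3 kg^-1 s^-2
c = 2.998e8       # m/s

L_p = (hbar * G / c ** 3) ** 0.5  # Planck length, ~1.6e-35 m

def dx(dp):
    # Conjectured quantum-gravity-corrected uncertainty relation quoted above
    return hbar / dp + L_p ** 2 * dp / hbar

dp_star = hbar / L_p  # momentum uncertainty at which dx bottoms out
assert abs(dx(dp_star) / L_p - 2.0) < 1e-9
# Past dp_star, dx grows again: pumping in more momentum/energy *increases*
# the position uncertainty, the black-hole intuition described above.
assert dx(10 * dp_star) > dx(dp_star)
```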
Roger Penrose, in his book The Road to Reality (USA edition, Knopf 2004) says:
"... any non-constancy in ' /\' would have to be accompanied by a compensating non-conservation of the mass-energy of the matter ... a superposed state will spontaneously reduce into one or the other of its stationary constituents in an average timescale of about hbar / E_G , where E_G is the gravitational self-energy of the difference between the two mass distributions ... this ...[is]... gravitational OR ( where OR stands for the 'objective reduction' of the quantum state ). ... the energy uncertainty in E_G would appear to cover such a potential non-conservation, leading to no actual violation of energy conservation. ...
... extremely tiny gravitational energy uncertainty E_G ... say some 10^(-33) of a joule ... is sufficient to give ... a ... collapse lifetime of one-tenth of a second or less ... It should be noted that the timescale hbar / E_G involves the quotient of the two small quantities hbar and G , and so need not be a small quantity in human terms. This is in stark contrast with the characteristic quantum-gravity quantities, the Planck length and the Planck time ... of sizes 10^(-33) cm and 10^(-41) s , which ... arise from the product of hbar and G. ...
... I envisage that the phenomenon of consciousness ... which I take to be a real physical process ... fundamentally makes use of the actual OR process ... A-lattice neuronal microtubules as originally suggested by Stuart Hameroff ... would require some kind of large-scale quantum coherence, acting broadly across considerable regions of the brain ... a conscious event would be associated with a partial state reduction ( orchestrated OR ) of this quantum system. ...
... an ingenious suggestion ... by Andrew Duggins ... depends upon the fact that quite different regions in the brain are responsible for different aspects of perception ... yet ... consciousness ... comes up with ... a single image. ... Duggins's idea is to test to see whether there are significant violations of Bell's inequalities involved in the forming of a conscious image, indicating the presence of non-local EPR-type occurrences ... which would strongly suggest that large-scale quantum effects are part of conscious perception. ...
... Some people ... arguing that consciousness simply 'emerges' as some sort of 'epiphenomenon' ...[ take the ]... position ... of computational functionalism, according to which it is merely computational activity ... that gives rise to conscious mentality. I have argued strongly against this view ... partly using ... Godel's theorem and the notion of Turing computability ... My arguments demand that this missing theory must be a non-computational theory ... i.e. its actions lie outside the scope of Turing-machine simulation ...".
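Penrose's numbers above are easy to reproduce: hbar / E_G with E_G ~ 10^(-33) J indeed gives about a tenth of a second, while the Planck length, built from the product of hbar and G, is ~10^(-33) cm. A quick check (mine, not Penrose's):

```python
hbar = 1.055e-34  # J s
G = 6.674e-11     # m^3 kg^-1 s^-2
c = 2.998e8       # m/s

# Gravitational OR collapse timescale for Penrose's example E_G ~ 10^-33 J
E_G = 1e-33
tau = hbar / E_G
print(tau)  # ~0.1 s: "a collapse lifetime of one-tenth of a second or less"

# Planck length, tiny because it involves the *product* of hbar and G
L_p = (hbar * G / c ** 3) ** 0.5
print(L_p * 100)  # ~1.6e-33 (cm)
```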
Fred Thaheld, in physics/0601060, entitled "An interdisciplinary approach to certain fundamental issues in the fields of physics and biology: towards a Unified Theory", says:
"Recent experiments appear to have revealed the possibility of quantum entanglement between spatially separated human subjects. In addition, a similar condition might exist between basins containing human neurons adhering to printed circuit boards. In both instances, preliminary data indicates what appear to be non-local correlations between brain electrical activities in the case of the human subjects, and also non-local correlations between neuronal basin electrical activities, implying entanglement at the macroscopic level. If the ongoing extended research and the analysis of same continues to support this hypothesis, it may then make it possible to simultaneously address some of the fundamental problems facing us in both physics and biology through the adoption of an interdisciplinary empirical approach based on Bell's experimental philosophy, with the goal of unifying these two fields. ...Only a very few experiments have been conducted to date attempting to explore the possibility of a quantum physics-biology interrelationship, with the first one utilizing pairs of human subjects in Faraday cages, where just one of the pair is subjected to photostimulation, investigating possible electroencephalographic (EEG) correlations between human brains (Grinberg-Zylberbaum et al, 1994). Later experiments, building upon this pioneer research have been performed and continue to corroborate, with increasing experimental and statistical sophistication, these unusual EEG correlations (Standish, 2001; Richards et al, 2002; Standish et al, 2004; Wackermann et al, 2003).
Experiments have also been conducted which have revealed evidence of correlated functional magnetic resonance imaging (fMRI) signals between human brains (Standish et al, 2003). These correlations occurred while one subject was being photostimulated and the other subject was having an fMRI scan performed.
Research has also been ongoing for over a year at the Univ. of Milan (Pizzi et al, 2003; 2004a; 2004b; Thaheld, 2000a; 2004a) utilizing pairs of 2 cm dia. basins containing human neurons on printed circuit boards inside Faraday cages separated by 20 cm. Laser stimulation of just one of the basins reveals consistent wave-form autocorrelations between stimulated and nonstimulated basins. In addition, there are indications that biological quantum nonlocality has been observed in the coherence of induced magnetic dipoles involved in muscle contraction in single actin filaments at the mesoscopic level, (Matsuno, 2001; 2003; Hatori et al, 2002) and, that cell motility underlying muscle contractions is accompanied by a quantum mechanical coherence on a macroscopic scale (Matsuno, 1999; 2001). All these experiments are described in greater detail later in this paper and, when taken together, seem to be pointing us in an unusual direction implying entanglement and nonlocality. ...
References [partial list]
- Beck, F., Eccles, J.C. 1992. Quantum aspects of brain activity and the role of consciousness. Proc. Natl. Acad. Sci. USA. 89, 11357-11361.
- Grinberg-Zylberbaum, J., Delaflor, M., Attie, L., Goswami, A., 1994. The Einstein-Podolsky-Rosen-Paradox in the brain: the transferred potential. Physics Essays 7, 422-428.
- Hagan, S., Hameroff, S.R., Tuszynski, J., 2002. Quantum computation in brain microtubules; decoherence and biological feasibility. Phys. Rev. E 65, 061901.
- Hameroff, S., 1994. Quantum coherence in microtubules: A neural basis for emergent consciousness? J. Consciousness Studies 1, 91-118.
- Hameroff, S., 1998. Anesthesia, consciousness and hydrophobic pockets - a unitary quantum hypothesis of anesthetic action. Toxicology Lett. 100-101, 31-39.
- Hatori, K., Honda, H., Matsuno, K., 2001. Magnetic dipoles and quantum coherence in muscle contraction. quant-ph/0104042.
- He, G-P, Zhu, S-L, Wang, Z.D., Li, H-Z., 2003. Testing Bell's inequality and measuring the entanglement using superconducting nanocircuits. quant-ph/0304156.
- Matsuno, K. 1999. Cell motility as an entangled quantum coherence. BioSystems, 51, 15-19.
- Matsuno, K., Paton, R.C., 2000. Is there a biology of quantum information? BioSystems 55, 39-46.
- Matsuno, K., 2001. The internalist enterprise on constructing cell motility in a bottom-up manner. BioSystems 61, 114-124.
- Matsuno, K., 2003. Quantum mechanics in first, second and third person descriptions. BioSystems 68, 107-118.
- Pizzi, R., Fantasia, A., Gelain, F., Rossetti, D., Vescovi, A., 2003. Looking for quantum processes in networks of human neurons on printed circuit board. Quantum Mind 2, March 15-19, Tucson, Arizona. (http://www.consciousness.arizona.edu/quantummind2/abstracts.html) (http://www.dti.unimi.it/~pizzi).
- Pizzi, R., Fantasia, A., Gelain, F., Rossetti, D., Vescovi, A., 2004a. Non-local correlation between human neural networks on printed circuit board. Toward a Science of Consciousness conference, Tucson, Arizona. (http://consciousness.arizona.edu/tucson2004) Abstract No. 104.
- Pizzi, R., Fantasia, A., Gelain, F., Rossetti, D., Vescovi, A., 2004b. Nonlocal correlations between separated neural networks. Quantum Information and Computation II. ed. E. Donkor, A.R. Pirick, H.E. Brandt. Proceedings of SPIE 5436, 107-117.
- Richards, T.L., Johnson, L.C., Kozak, L., King, H., Standish, L., 2002. EEG alpha wave evidence of neural energy transfer between human subjects at a distance. Tucson Toward a Science of Consciousness conference. (http://www.consciousness.arizona.edu/Tucson2002). Abstract No. 352.
- Standish, L., 2001. Neurophysiological measurement of nonlocal connectivity. In: Science and Spirituality of Healing conference. Kona, Hawaii. (available at: Samueli Institute, http://www.siib.org).
- Standish, L.J., Johnson, L.C., Kozak, L., Richards, T., 2003. Evidence of correlated functional magnetic resonance imaging signals between distant human brains. Alter. Therapies 9 (1), 121-125.
- Standish, L.J., Kozak, L., Johnson, L.C., Richards, T., 2004. Electroencephalographic evidence of correlated event-related signals between the brains of spatially and sensory isolated human subjects. J. Alter. Compl. Med. 10, 307-314.
- Thaheld, F.H., 2000a. Proposed experiment to determine if there are EPR nonlocal correlations between two neuron transistors. Apeiron 7 (3-4), 202-205. (http://redshift.vif.com).
- Thaheld, F.H., 2004b. A method to explore the possibility of nonlocal correlations between brain electrical activities of two spatially separated animal subjects. BioSystems 73, 205-216.
- Wackermann, J., Seiter, C., Keibel, H., Walach, H., 2003. Correlations between brain electrical activities of two spatially separated human subjects. Neurosci. Lett. 336, 60-64. ...".
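The "wave-form autocorrelations" mentioned in the quoted passage above are, at bottom, a signal-processing measure. As a minimal illustration only (the actual analysis in the Pizzi et al. papers is more elaborate, and the waveform values below are made up for the example), a normalized correlation between two digitized signals can be computed like this:

```python
# Illustrative sketch only: how one might quantify correlation between the
# recorded waveform of a laser-stimulated neuron basin and that of a
# separate, non-stimulated basin. This is NOT the published analysis method;
# it just shows a standard Pearson correlation of two equal-length signals.
import math

def normalized_correlation(x, y):
    """Pearson correlation coefficient of two equal-length signals."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Two hypothetical digitized waveforms (values are invented for illustration):
stimulated    = [0.0, 0.8, 1.0, 0.3, -0.5, -0.9, -0.4, 0.2]
nonstimulated = [0.1, 0.7, 0.9, 0.2, -0.4, -0.8, -0.5, 0.1]
print(round(normalized_correlation(stimulated, nonstimulated), 3))
```

A value near +1 would indicate the two basins' waveforms track each other closely; the experimental question is whether such tracking appears when only one basin is stimulated and the two are electromagnetically isolated.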
The book "Inside the Neolithic Mind" by David Lewis-Williams and David Pearce (Thames & Hudson 2005), says in part:
"... Horizontal shamanism (HS) is comparatively democratic ... HS ... depends on individual shamans contacting the spirit by means of hallucinogens. ... In vertical shamanism (VS) the principal component is esoteric knowledge that is revealed to and transmitted within a small elite. ...
the Upper Paleolithic was probably dependent on some form of HS.
During the Neolithic, some of the features ... of VS began to appear. It was these features that enabled people ... to build large towns and to construct massive monuments. ...".
A May 1993 OMNI magazine interview quoted Terence McKenna:
"... In his 1992 book Food of the Gods, McKenna delineates a radical history of drugs and human evolution, chronicling our descent from "stoned apes" and extolling the virtues of psilocybin mushrooms and DMT (dimethyltryptamine), a potent psychedelic compound. ... MCKENNA: ... My book is about the history of drugs ... the key unlocking ...[the]... great mystery ... of how our minds and consciousness evolved from the ape ... is the presence of psychoactive plants in the diet of early man. ...
MCKENNA: From 75,000 to about 15,000 years ago, there was a kind of human paradise on Earth. People ... Nobody went more than three or four weeks before they were redissolved into pure feeling and boundary dissolution. Community, loyalty, altruism, self-sacrifice -- all these values that we take to be the basis of humanness -- arose at the time in a situation in which the ego was absent.
OMNI: If this was all so wonderful, why did it end?
MCKENNA: The most elegant explanation is that the very force that created the original breakthrough swept away its conditions. The climatological drying of Africa forced us out of the forest canopy, onto the grasslands, and into bipedalism and omnivorous diets. We lived in that paradisiacal grasslands situation, but the climate was slowly getting drier. Mushrooms began to be less available. ...
[ Although I agree with Terence McKenna that climate change could have caused a "paradisiacal" culture to end, my opinion is that the most likely such climate change was the end of the Ice Age around 11,600 years ago in which the sea level rose, forcing people to crowd into higher ground, increasing competition for more limited resources, and rewarding predatory competitive behavior and rigid military social/economic structures. ]
MCKENNA: ... My scenario, if true, has enormous implications. For 10,000 years, with the language and social skills of angels, we've pursued an agenda of beasts and demons. Human beings created an altruistic communal society; then, by withdrawing the psilocybin or having it become unavailable, we've had nothing to fall back upon except old primate behaviors, all tooth-and-claw dominance.
OMNI: You're giving an enormous amount of power to a drug. What can you tell me about psilocybin?
MCKENNA: ... For the last 500 years, Western culture has suppressed the idea of disembodied intelligences -- of the presence and reality of spirit. Thirty seconds into the DMT flash, and that's a dead issue. The drug shows us that culture is an artifact. You can be a New York psychotherapist or a Yoruba shaman, but these are just provisional realities you're committed to out of conventional or local customs. ... Psilocybin shows you everything you know is wrong. The world is not a single, one-dimensional, forward-moving, causal, connected thing, but some kind of interdimensional nexus.
OMNI: If everything I know is wrong, then what?
MCKENNA: You have to reconstruct. It's immediately a tremendous permission for the imagination. I don't have to follow Sartre, Jesus, or anybody else. Everything melts away, and you say, "It's just me, my mind, and Mother Nature." This drug shows us that what's waiting on the other side is a terrifyingly real self-consistent modality, a world that stays constant every time you visit it.
OMNI: What is waiting? Who?
MCKENNA: You burst into a space. Somehow, you can tell it's underground or an immense weight is above it. There's a feeling of enclosure, yet the space itself is open, warm, comfortable, upholstered in some very sensual material. Entities there are completely formed. There's no ambiguity about the fact that these entities are there.
OMNI: What are they like, Terence?
MCKENNA: Trying to describe them isn't easy. On one level I call them self-transforming machine elves; half machine, half elf. They are also like self-dribbling jeweled basketballs, about half that volume, and they move very quickly and change. And they are, somehow, awaiting. When you burst into this space, there's a cheer! Pink Floyd has a song, "The Gnomes Have Learned a New Way to Say Hooray." Then they come forward and tell you, "Do not give way to amazement. Do not abandon yourself." ...
OMNI: What are these elves, these creatures about?
MCKENNA: They are teaching something. Theirs is a higher dimensional language that condenses as a visible syntax. For us, syntax is the structure of meaning; meaning is something heard or felt. In this world, syntax is something you see. There, the boundless meanings of language cause it to overflow the normal audio channels and enter the visual channels. They come bouncing, hopping toward you, and then it's like -- all this is metaphor; they don't have arms -- it's as though they reach into their intestines and offer you something. They offer you an object so beautiful, so intricately wrought, so something else that cannot be said in English, that just gazing on this thing, you realize such an object is impossible. ... The object generates other objects, and it's all happening in a scene of wild merriment and confusion.
Ordinarily language creates a system of conventional meanings based on pathways determined by experience. DMT drops you into a place where the stress is on a transcending language.
Language is a tool for communication, but it fails at its own game because it's context-dependent. Everything is a system of referential metaphors. We say, "The skyline of New York is like the Himalayas, the Himalayas are like the stock market's recent performance, and that's like my moods" -- a set of interlocking metaphors. We have either foreground or background, either object or being. If something doesn't fall into these categories, we go into a kind of loop of cognitive dissonance. If you get something from outside the metaphorical system, it doesn't compute. That's why we need astonishment. Astonishment is the reaction of the body to the ineffectiveness of its descriptive machinery. You project your description, and it keeps coming back. Rejected. Astonishment breaks the loop. ...
... Something in an unseen dimension is acting as an attractor for our forward movement in understanding.
OMNI: Attractor?
MCKENNA: It's a point in the future that affects us in the present. ... Our model that everything is pushed by the past into the future, by the necessity of causality, is wrong. There are actual attractors ahead of us in time -- like the gravitational field of a planet. Once you fall under an attractor's influence, your trajectory is diverted.
OMNI: Does the attractor have a kind of intelligence?
MCKENNA: I think so. It's what we have naively built our religion around: God, totem. It's an extradimensional source of immense caring and reflection for the human enterprise. ...
OMNI: How will science explore the after-death state?
MCKENNA: By sending enough people into this other dimension to satisfy themselves that this is eternity. Here the analogy of the New World holds: A few lost sailors and shipwreck victims like myself are coming back, saying, "There was no edge of the world. There was this other thing. Not death and dissolution, not sea monsters and catastrophe, but valleys, rivers, cities of gold, highways." It will be a hard thing to swallow, but then the scientists can go back to doing science on after-death states. They don't have to throw out their method. ...
OMNI: How do you see the future?
MCKENNA: If history goes off endlessly into the future, it will be about scarcity, preservation of privilege, forced control of populations, the ever-more-sophisticated use of ideology to enchain and delude people.
We are at the breakpoint.
It's like when a woman comes to term. At a certain point, if the child is not severed from the mother and launched into its own separate existence, toxemia will set in and create a huge medical crisis.
The mushrooms said clearly, "When a species prepares to depart for the stars, the planet will be shaken to its core."
All evolution has pushed for this moment, and there is no going back. What lies ahead is a dimension of such freedom and transcendence, that once in place, the idea of returning to the womb will be preposterous.
We will live in the imagination.
We will quickly become unrecognizable to our former selves because we're now defined by our limitations: the laws of gravity; the need to eat, excrete, and make money. We have the will to expand infinitely into pleasure, caring, attention, and connectedness. If nothing more -- and it's a lot more -- it's permission to hope. ...".
[My opinion is that a lot of what Terence McKenna says about the past and future of humanity is true, except that brain-altering drugs are not a necessary part of our history, either past or future. Personally, I try to avoid such brain-altering drugs because I see things described by Terence McKenna without them, and I think that chemically altering my brain would interfere with and distort my seeing such things. (Perhaps Terence McKenna would regard me as "unstable" or "fragile".)]
Combining McKenna's ideas with the book "Inside the Neolithic Mind", I am beginning to think that it might go like this:
In the Paleolithic days of sparse populations, maybe everybody could democratically "see" such things, and there would be plenty of mushrooms-whatever to help everybody who needed them to get there. In the early Neolithic, population density increased, mushrooms-whatever were less easily available to everybody, and class structure emerged to control the more dense population.
Restrictions on hallucinogenic drugs would allow the upper-class priests to continue to "see", while the lower-class masses would be more "blind" and tend to follow their priestly leaders (the few members of the masses who could "see" anyway would either be coopted into the priesthood or eliminated as heretics).
The priestly class directed the construction of monuments etc. patterned after such consciousness-experiences, which served to give the less-perceptive masses some rough idea about the stuff, but did not give them all the details, so that the masses would remain in awe of the priests.
This is consistent with the way some contemporary governments control their general population through media, drug-prohibition laws, etc.
A war may occur as early as March 2006, with this scenario:
In order to stop the Iranian Oil Bourse from selling oil in Euros beginning 20 March 2006 (and also to stop Iranian nuke construction) the USA bombs 400 or so Iranian nuke sites and takes the Khuzestan part of Iran (ethnic Arab and 90% of Iranian oil), which is flat and as easy to take with tanks plus air cover as was Iraq. Then China, which has just purchased Khuzestan oil from Iran, tries to protect its oil interests by giving (with cooperation of Russia etc) advanced weapons to Iran. Then Iran uses the advanced weapons to destroy the USA fleet in the Persian Gulf, and USA bases in Iraq, Arabia, Kuwait, etc, and also hits Israel. If the war stops there, the USA loses control of Persian Gulf oil, the USA dollar collapses, and the USA goes into a Great Depression. In light of that, the USA might attack China (and maybe Russia etc) and they might retaliate, and goodbye to many big cities. The ultimate winner in the long run would be whoever could rebuild fastest. Katrina has shown clearly the USA (lack of) rebuilding ability. China has for some years been building underground facilities from which it can recover from nuclear attack. Who do you think will be the long-run winner? Should you start to learn Mandarin? Of course, the above might not happen, but the Ides of March are very dangerous this year of 2006, and, as long as there are big-power conflicts over the limited supply of cheap Persian Gulf oil, the danger remains.
According to a 23 January 2006 OpEd News article by Mike Whitney: " ... America monopolizes the oil trade. Oil is denominated in dollars and sold on either the NYMEX or London's International Petroleum Exchange (IPE), both owned by Americans. This forces the central banks around the world to maintain huge stockpiles of dollars even though the greenback is currently underwritten by $8 trillion of debt and even though the Bush administration has said that it will perpetuate the deficit-producing tax cuts. America's currency monopoly is the perfect pyramid-scheme. As long as nations are forced to buy oil in dollars, the United States can continue its profligate spending with impunity. (The dollar now accounts for 68% of global currency reserves up from 51% just a decade ago) The only threat to this strategy is the prospect of competition from an independent oil exchange; forcing the faltering dollar to go nose-to-nose with a more stable (debt-free) currency such as the euro. That would compel central banks to diversify their holdings, sending billions of dollars back to America and ensuring a devastating cycle of hyper-inflation. ... Krassimir Petrov, Ph.D in economics, says in a recent article The Proposed Iranian Oil Bourse ... "From a purely economic point of view, should the Iranian Oil Bourse gain momentum, it will be eagerly embraced by major economic powers and will precipitate the demise of the dollar. The collapsing dollar will dramatically accelerate U.S. inflation and will pressure upward U.S. long-term interest rates. 
At this point, the Fed will find itself between ... between deflation and hyperinflation - it will be forced fast either to take its "classical medicine" by deflating, whereby it raises interest rates, thus inducing a major economic depression, a collapse in real estate, and an implosion in bond, stock, and derivative markets, with a total financial collapse, or alternatively, to take the Weimar way out by inflating, whereby it pegs the long-bond yield, raises the Helicopters and drowns the financial system in liquidity, bailing out numerous LTCMs and hyperinflating the economy. No doubt, Commander-in-Chief Ben Bernanke, a renowned scholar of the Great Depression ..., will choose inflation. ...".
In 2000, the Physics / Math Community acted as a force for freedom in supporting Piet Hut (and the concept of tenure) when the Princeton IAS (ironically, created by Bamberger money as a home for Einstein when universities such as Princeton felt uncomfortable about accepting a Jew) attempted to expel Piet Hut, a tenured faculty member.
In 2006, the Physics / Math Community acted as a discriminatory Gentlemen's Club in opposing Peter Woit (and the concept of freedom of thought and expression) when the Cornell arXiv barred his blog from trackbacks.
The two situations have a common factor:
perhaps due to 6 years of Bush, Enron, outsourcing to China, 9/11 Patriot Act, and Oil Wars,
the 2006 Physics / Math "community of scientists and scholars" does not have the courage it showed back around 2000.
Maybe the 21st century will be known as a Dark Age.
A petition on the web was effective in the case of the Institute for Advanced Study against Piet Hut, where the issue was an attempt by the IAS to dismiss him even though he had tenure.
According to a 14 November 2000 article in the Princeton Packet by Jeff Milgram:
"… The institute last week dropped its lawsuit in federal court in Trenton to force Dr. Hut to live up to an agreement to leave the faculty he joined in 1985. Dr. Hut also dropped a counterclaim against the institute. "I believe that two of the most important considerations were that my legal position was very strong and the reaction of the community of scientists and scholars was even stronger," said Dr. Hut, who received the support of more than 30 colleagues around the world. …".
The "support of more than 30 colleagues around the world" was evidenced by such things as public web statements.
Although the link to the "lawsuit page" on Edward Nelson's web page is no longer effective, if you use the Wayback Machine you can see a page that lists statements of support as of 7 November 2000 by about 60 people from many places, including Princeton, Cambridge, Caltech, Tokyo, M.I.T., Berkeley, and many others.
As Richard Muller said in his statement:
"… I suspect that the Institute for Advanced Study will look very foolish, in about a decade or two …".
The past few weeks have been the most distressing to me since I was blacklisted by the Cornell arXiv in 2002.
From 2002 to recently, my pain of being blacklisted was counterbalanced by seeing new data and comparing it with my physics model (see note 1 below).
During that time, I thought that the Cornell arXiv behaviour was an aberration that would be corrected when the physics community as a whole realized what was going on.
Now, not only does it seem to me that less new and interesting things are emerging about physics,
but, through the case of the Cornell arXiv barring Peter Woit's blog from trackbacks (see note 2), and discussion of that on Sean Carroll's blog and the blog of Chad Orzel, the whole physics community DOES realize what is going on, but its reaction is (see note 3):
- Approval by Jacques Distler (a member of the Cornell arXiv Physics Advisory Board) of the actions of Harvard Professor Lubos Motl, who in his blog (to which the Cornell arXiv permits trackbacks) and in comments on other blogs names me (and some others) as "completely moronic crackpots", with no discussion whatsoever of the merits or flaws in the physics models developed by me (and those of some others); and
- toleration by the vast majority of blog commenters of the practices of the Cornell arXiv, which have been described as "... formation of a "Gentleman's" private club ..." functioning like "... a guild run garden party with a guest list ...".
Now (perhaps I have been very naive over the past several years), for the first time I realize that it is not just an aberration at the Cornell arXiv, but that a consensus of the physics community really does consider itself to be a club for which I am unfit for membership, and therefore no physicist will ever consider my physics work no matter how useful it might be.
To me, the situation is almost as incomprehensible as it is reprehensible,
and the pain reminds me of 1959 when I was denied service at the Cloister in Sea Island, Georgia.
Therefore, as of now, I am resolving to cease all efforts to engage in physics-related e-mail discussions, blog commentary, etc.
In 2002 I spoke at SESAPS 2002 about my overall physics model which had calculations of ratios of particle masses and force strengths, KM parameters, and the UCC-DCC mass difference, as well as discussion of three peaks in Fermilab T-quark event data.
In 2003 I saw some interesting work relating the Riemann Hypothesis to quantum theory, along the lines proposed by Hilbert and Polya.
In 2004 I calculated neutrino masses and mixing angles and the Dark Energy : Dark Matter : Ordinary Matter ratio that was observed by WMAP.
In 2005 I realized, based on papers of Yamawaki and his coworkers, that two of the three Fermilab T-quark data peaks corresponded to T-quark condensates with Nambu-Jona-Lasinio and Bardeen-Hill-Lindner structure, and the third to a T-quark condensate with an 8-dimensional Kaluza-Klein NJL structure, and I had correspondence with Carlos Castro that clarified some details of my calculations using techniques motivated by Armand Wyler.
Peter Woit's reaction to actions taken by the Cornell arXiv is evidenced by some of his statements on his blog and web page
"... This whole experience of having to engage in a detailed public defense of my credentials as a researcher, including responding to a large number of nasty attacks on this subject from people who have no credentials of their own, has been a rather trying experience. But the process has left me (and I gather most people who have been following it) with a lot of evidence that this is all about suppressing my criticisms of string theory, and I think the behavior of some of the people involved has been deeply disgraceful and unprofessional. ... There are lots of interesting issues about the role of the arXiv as it increasingly replaces the historical role of peer-reviewed journals. But one overriding fact is that it has become extremely important for the math and physics communities, and the people managing it have an increasingly important responsibility.
The arXiv does need to protect itself against crackpots, and I can see justification for a certain amount of lack of transparency at times because of this....
I'm a lot more elitist and willing to see suppression of crackpottery than many of my commenters ...
I've just deleted a bunch of comments about Piet Hut, Witten, Brian Josephson, etc. The institute controversy over Piet Hut has nothing whatsoever to do with the issue at hand here, which is the decision by the arXiv moderators to suppress links to this weblog. I'm not arguing that the arXiv should not suppress things for which a legitimate case can be made that they are crackpot science. From what I have seen of Josephson's work, I think such a legitimate case can be made. ..."
and by the fact that Peter Woit deleted from his blog a comment by "anon" that said in part:
"... It is sad that CERN Doc Server is no longer accepting preprints and copies of publications from external researchers like yourself and Tony Smith. I can't update my own paper on it, as CERN only accepts external feed from arXiv now, so arXiv really has now achieved a dictatorial power to decide which direction science goes. ...".
So, it seems to me (as of now 12 March 2006) that Peter Woit is:
Jacques Distler said on his blog:
"... as a member of the arXiv Physics Advisory Board ... I'm going to ... try to explain the thinking that went into the policy ... For a paper to be accepted to appear on the arXivs, it must go through a two-stage filter.
1. The author must be an approved submitter, usually through having been endorsed.
2. Each paper from an approved submitter must be accepted by the moderator for that section of the arXiv. ...
In the case of papers, the second-stage filter of moderation is clearly necessary. ...
If you are banned from posting papers to the arXivs ... such comments will be deleted ...
it was decided that ... Trackbacks would go through just a single stage of filtering. ... The solution which was adopted, in the end, was that trackbacks would be accepted if they come from active researchers. It's not particularly hard to figure out who's an active researcher: just look at their publications. ...".
A Chad Orzel blog said:
"... Having the ArXiv board decide who is and isn't an "active researcher" is just insane, if the goal is actually to avoid controversy. Not only is the closed-group nature of the decision ample fodder for conspiracy theorists, just the name is a disaster. If you're going to be insulting, why not go all the way, and just call your approved posters "Really Smart People"? Really, this policy is so stupid, it had to be the work of a committee. It takes a lot of smart people working together to miss something so bloody obvious. ... The whole stupid situation was made even worse by having the standards known only to people on the ArXiv board ...".
A comment by Doug on Sean Carroll's blog said:
"... how do we eliminate the "noise" of those at the bottom of the spectrum whose unsophisticated, kluged together, ideas are burdensome to the sophisticates? The only answer is the formation of a "Gentleman's" private club ... The scientific "cronyism" of academia is being exposed more and more by the Internet. ...".
A comment by c niedman on Jacques Distler's blog said:
"... The trouble for your group seeking to put up a velvet rope is that ... science isn't a guild run garden party with a guest list. ... Would Watson or Crick have been active researchers to Erwin Chargaff before their big break? Would Alexander Grothendieck be allowed trackbacks as a blogger if he started his own site in his current state? How about Irving Segal as a cosmologist (rather than a mathematician)? Would John Nash be allowed to comment on game theory when not in a proper frame of mind? What about Papakyriakopoulos before Ralph Fox gave him legitimacy at Princeton? How about letting Teichmuller and Jordan share their views on national socialism (as well as moduli and commutative algebras). What if Streleski or Kaczynski wishes to start picking up research (and blogging) again? The Bogdanovs? Luca Turin? Kary Mullis? Ramanujan (pre Hardy)?
... At its best, it's more like a free-for-all rave where we all throw up our ideas to see which brainwaves stick to the walls. Like it or not, the next big thing can come from anywhere and, more importantly, anyone. ... just hope against hope that the next big idea comes from one of an A-list colleague and not a disgruntled outcast. ...".
A comment by Benni on Peter Woit's blog said:
"... It could be, that the next Einstein won't be endorsed, because his paper is so original and so complex that no endorser would have time to read through these many formulas which have nothing to do with present day theory. ...".
A comment by Dick Thompson on Peter Woit's blog said:
"... Tony Smith, who didn't ...[ get in arXiv ]... although way off the main road, and maybe presented with "attitude", seem to be mathematically sound - I haven't seen anybody say otherwise. ...". Peter Woit replied "... Dick, I agree with you about the Tony Smith ... question. ...".
Harvard Professor Lubos Motl commented on Sean Carroll's blog:
"... Expecting that someone has a right for his blog articles to be ... linked in scientific journals and their electronic counterparts is a crazy idea, especially if these blog articles are primarily addressed to completely moronic crackpots such as Chris Oakley, Danny Lunsford, Quantoken, and others ..." and on his blog says "... it's a crackpot. Greetings to Chris Oakley, Danny Ross Lunsford a.k.a. Iman Zumbal, Quantoken, Tony Smith, secret milkshake, MathPhys, Pudding, and many others. ...".
Even though Beethoven allowed Matthias Artaria to replace the Grosse Fugue finale to Op. 130 String Quartet in B flat major with a "more accessible" finale, with the Grosse Fugue being published separately, in CD liner notes for The Late String Quartets, Melos Quartett, Deutsche Grammophon (1986), Constantin Floros wrote: "... Theodor Helm ...[wrote]... in 1910 ...[about 83 years after Beethoven's death]... :
"It is precisely those works of Beethoven which the great majority of his contemporaries described as confused, insufferable, even "crazy" (this epithet was applied to the last quartets almost universally for more than a lifetime) which now grip, thrill, involve and move us the most." It is curious that Beethoven himself foresaw this development. When the news was brought to him that one of his quartets, played by Schuppanzigh, had met with a poor reception, he said laconically: "One day it will please them." ...".
In his book Mathematical Cosmology and Extragalactic Astronomy (Academic Press 1976) (pages 72-75, 88-91), Irving Ezra Segal says:
"... The suitably scaled 15 linearly independent generators Lij of symmetries of unispace [ Segal's term for the Conformal RP1 x S3 SpaceTime used in the D4-D5-E6-E7 model - from here on, I will call it Conformal SpaceTime - the 15 generators generate the Conformal Group Spin(2,4) = SU(2,2) ] ... differ from the 11 generators of the group of global conformal transformations in Minkowski space by terms of the order 1 / R^2 [ as R becomes infinite, where R is the radius of curvature of Conformal SpaceTime ] ...
- The angular momenta Lij ... [ i,j = 1,2; 2,3; 3,1 ] ... have ... the same expression both in Minkowski space and in [Conformal SpaceTime] ...
- The same is true of the boosts ... -iL0,j ... ( j = 1,2,3 ) ... and the infinitesimal scale transformation [ L-1,4 ] ...
- The scale generator - L-1,4 ... determines a ... scalar field. This ... is most naturally interpreted from a gravitational standpoint ...
- ... two ordered sets, each containing four of the Lij, converge on the same ... fields in Minkowski space ... in particular, R^(-1) L-1,j and R^(-1) Lj,4 [ for j = 0,1,2,3 ] both ... agree ... [as R becomes infinite] with the [corresponding Minkowski] conventional energy-momentum component. ... The differences L-1,j - Lj,4 thus are ... representable by a ... vector field, which physically would appear most naturally as potentially related to gravitational phenomena ... ".
Dark Energy (also known as the Cosmological Constant) comes from the 10 Rotation, Boost, and Special Conformal generators of the Conformal Group Spin(2,4) = SU(2,2), so, to first approximation, the Dark Energy fraction of our Universe should be about 10 / 15 = 67%.
Black Holes, including Dark Matter Primordial Black Holes, are curvature singularities in our 4-dimensional physical spacetime. Since Einstein-Hilbert curvature comes from the 4 Translations of the 15-dimensional Conformal Group Spin(2,4) = SU(2,2) through the MacDowell-Mansouri Mechanism (in which the generators corresponding to the 3 Rotations and 3 Boosts do not propagate), to first approximation the Dark Matter Primordial Black Hole fraction of our Universe should be about 4 / 15 = 27%.
Ordinary Matter gets mass from the Higgs mechanism, which is related to the 1 Scale Dilatation of the 15-dimensional Conformal Group Spin(2,4) = SU(2,2), so, to first approximation, the Ordinary Matter fraction of our Universe should be about 1 / 15 = 6%.
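The three fractions above come from a simple partition of the 15 generators of the Conformal Group Spin(2,4) = SU(2,2). A minimal sketch of the arithmetic (the generator counts are taken from the text above; before rounding, the percentages are 66.7%, 26.7%, and 6.7%):

```python
# Partition of the 15 generators of the Conformal Group Spin(2,4) = SU(2,2)
# as described in the text above:
#   3 rotations + 3 boosts + 4 special conformal -> Dark Energy
#   4 translations                               -> Dark Matter (primordial black holes)
#   1 scale dilatation                           -> Ordinary Matter
generators = {
    "Dark Energy (rotations, boosts, special conformal)": 3 + 3 + 4,  # = 10
    "Dark Matter (translations)": 4,
    "Ordinary Matter (dilatation)": 1,
}
total = sum(generators.values())  # 15 generators in all
for name, count in generators.items():
    print(f"{name}: {count}/{total} = {100 * count / total:.1f}%")
```

These first-approximation fractions are what the text compares with the WMAP-observed Dark Energy : Dark Matter : Ordinary Matter ratio.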
According to a 23 March 2006 ESA news web page:
"... Martin Tajmar, ARC Seibersdorf Research GmbH, Austria; Clovis de Matos, ESA-HQ, Paris; and colleagues have measured ... a gravitomagnetic field ... generate[d] ...[by]... a moving mass ... Their experiment involves a ring of superconducting material rotating up to 6 500 times a minute. Superconductors are special materials that lose all electrical resistance at a certain temperature. Spinning superconductors produce a weak magnetic field, the so-called London moment. The new experiment tests a conjecture by Tajmar and de Matos that explains the difference between high-precision mass measurements of Cooper-pairs (the current carriers in superconductors) and their prediction via quantum theory. They have discovered that this anomaly could be explained by the appearance of a gravitomagnetic field in the spinning superconductor (This effect has been named the Gravitomagnetic London Moment by analogy with its magnetic counterpart). ... Although just 100 millionths of the acceleration due to the Earth's gravitational field, the measured field is ... one hundred million trillion times larger than Einstein's General Relativity predicts. ... The electromagnetic properties of superconductors are explained in quantum theory by assuming that force-carrying particles, known as photons, gain mass. By allowing force-carrying gravitational particles, known as the gravitons, to become heavier, they found that the unexpectedly large gravitomagnetic force could be modelled. ... The papers can be accessed on-line at the Los Alamos pre-print server using the references: gr-qc/0603033 and gr-qc/0603032. ...".
In gr-qc/0603033, Tajmar, Plesescu, Marhold, and de Matos say:
"... a rotating superconductor produces a magnetic field proportional to its angular velocity. ...... in addition to this so-called London moment, also a large gravitomagnetic field should appear to explain an apparent mass increase of Niobium Cooper-pairs. This phenomenon was indeed observed and induced acceleration fields outside the superconductor in the order of about 100 micro g were found. The field appears to be directly proportional to the applied angular acceleration of the superconductor following our theoretical motivations. ...".
In gr-qc/0603032, Tajmar and de Matos say:
"... In quantum field theory, superconductivity is explained via a large photon mass as a consequence of gauge symmetry breaking and the Higgs mechanism. The photon wavelength is then interpreted as the London penetration depth and leads to a Photon mass about 1/1000 of the electron mass. ... As the photon's mass is non-zero, the usual Maxwell equations for electromagnetism transform into the well known Proca equations with additional terms due to the finite Photon wavelength ...[for]... the Proca equations ... :
- The "Meissner" part ... shielding of electromagnetic fields entering the superconductor ... becomes important only for large photon masses (which is not the case in normal matter) ...
- but the "London Moment" part ... a magnetic field ... generated due to the rotation ... becomes important for very small photon masses ...
the photon mass inside normal matter ... is ... (due to the negative sign in the Larmor theorem) always a complex value independent of the sign of charge ...
we ... postulate that the complex mass will change into a real value by passing from normal to coherent matter ...
Recent experimental results on the gravitomagnetic London moment ...[ gr-qc/0603033 ]... tend to demonstrate that gravitational dipolar type radiation associated with the Einstein-Maxwell equations is real. This implies that Maxwellian gravity is not only an approximation to the complete theory, but may indeed reveal a new aspect of gravitational phenomena associated with a vectorial spin 1 gravitational boson, which we might call the graviphoton. ...
[ In my Segal Conformal Gravity model, there are 15 such spin 1 graviphotons, which are generators of the Conformal Group Spin(2,4) = SU(2,2). ]
we will therefore use the term graviphoton for studying the Proca type character of gravity and its consequences on cosmology, coherent matter and high energy particle physics. ...
Similar to the case of the photon, the graviphoton mass is not zero based on the measurement of the cosmological constant. ... The recent measurement of the cosmological constant /\ = ( 1.29 +/- 0.23 ) x 10^(-52) m^(-2) by WMAP can be linked to the graviton (graviphoton) mass by a recent result from Novello et al and others
m_g = ( hbar / c ) sqrt( 2 /\ / 3 ) = 3 x 10^(-69) kg ...
the graviton (graviphoton) [ mass ] is ... a real number which is ... confirmed by our experimental results ... Similar to electromagnetism, we obtain a Meissner and a London moment part for the gravitomagnetic field generated by matter ... The gravi[pho]ton mass ... therefore describes the inertial properties of matter in accelerated reference frames. This is a very fundamental result and new insight into the foundations of mechanics. It can be also interpreted as a form of Mach's principle. ...
if we take the case of no local sources .. the graviphoton mass will be zero, and we will find, by solving the weak field equations in the transverse gauge, the "classical" freely propagating degrees of freedom of gravitational waves associated with a massless spin 2 graviton ...
[ This is the case in which the 6 rotations and boosts and the 4 translations all are used by the MacDowell-Mansouri mechanism to make the spin 2 gravitons,
and the 4 special conformal transformations and 1 dilation are fixed and constant, and so do not propagate. ]
However, in the case of local sources, a spin-1 graviphoton will appear. ...
[ With local sources, the 1 dilation propagates and gives Higgs mass
and the 4 special conformal transformation graviphotons also propagate and get mass and produce Dark Energy /\ . ]
The cosmological constant for Einstein's equation ... is then given by /\ = ( 3 / 2 ) ( 1 / lambda_g^2 ) = ( 3 / 2 ) mu_0g rho_m ...[where]... mu_0g = 4 pi G / c^2 ...
The average density of the universe is rho_m = 10^(-26) kg m^(-3) . This gives a gravi[pho]ton mass of m_g = 3.2 x 10^(-69) kg and a cosmological constant ... of /\ = 1.3 x 10^(-52) m^(-2) . Those values are exactly within present experimental observations!
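The two quoted relations can be cross-checked numerically; here is a Python sketch using standard values of G, c, and hbar together with the quoted average density rho_m = 10^(-26) kg m^(-3):

```python
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
hbar = 1.0546e-34   # reduced Planck constant, J s

rho_m = 1e-26       # quoted average density of the universe, kg/m^3

# mu_0g = 4 pi G / c^2 (gravitomagnetic analogue of the permeability)
mu_0g = 4 * math.pi * G / c**2

# Lambda = (3/2) mu_0g rho_m  (quoted relation)
Lam = 1.5 * mu_0g * rho_m

# m_g = (hbar/c) sqrt(2 Lambda / 3)  (quoted graviphoton mass relation)
m_g = (hbar / c) * math.sqrt(2 * Lam / 3)

print(f"Lambda = {Lam:.1e} m^-2")   # ~1.4e-52, vs quoted 1.3e-52
print(f"m_g    = {m_g:.1e} kg")     # ~3.4e-69, vs quoted 3.2e-69
```

The small discrepancies from the quoted 1.3 x 10^(-52) m^(-2) and 3.2 x 10^(-69) kg are rounding-level, consistent with the paper having used slightly different constants.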
How large is the local gravi[pho]ton mass? In a piece of iron for example, the absolute value of the graviton mass would be 2.8 x 10^(-54) kg, which is still undetectably small. ...
So in fact, the vacuum energy density is equal (up to the numerical factor of 0.75) to the energy density of matter ( rho_E = rho_m c^2 ). This ... gives a totally new perspective to the energy of the vacuum - being defined as function of the local density of matter. ...
[ In my Segal Conformal Gravity model, the basic numerical factor is 0.67 = 10 / 15, the fraction of the 15 conformal group generator graviphotons that are in the 10 generator subgroup Q made up of the 6 rotation and boost graviphotons and the 4 special conformal transformation graviphotons. According to gr-qc/9809061 by R. Aldrovandi and J. G. Pereira, "... a semi-direct product between Lorentz and special conformal transformation groups ...[forms]... the group Q ...[which has the]... symmetry of the strong cosmological-constant limit ...". ]
some of the many consequences ...[of]... a graviton mass ...[are]... :
- actually solv[ing] the cosmological coincidence problem (observations showed that dark energy makes up about 73% of the energy present in the universe). ... it is a natural consequence of the fact that the graviton mass is a function of the matter density and a flat universe. We can use this analytical result also to express a direct relationship between the Hubble constant and the cosmological constant H^2 = (4/9) /\ c^2 ...
- A graviton mass leads to a frequency dependence on the propagation of gravity in free space. ...
- a non-zero cosmological constant has also an influence on the Schwarzschild solution for black holes ... In our case, the cosmological constant of a black hole is given by /\_S = 36 c^4 / ( 343 M^2 G^2 ) ... This gives a gravitational horizon which is just a little bit smaller than the Schwarzschild radius ... This has the consequence that the gravitational force inside the black hole should now decrease with a Yukawa type modification affecting gravitational and gravitomagnetic fields. The denser the black hole, the stronger will be the gravitational force decay ...
- a spin-1 field such as the graviphoton in Einstein-Proca equations prohibits black hole solutions ...[ according to ]... Obukhov, Y.N., Vlachynsky, E.J., Ann. Phys. 8(6), 497-509 (1999) ...
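The quoted relation H^2 = (4/9) /\ c^2 can be checked numerically; a Python sketch follows, using the quoted WMAP value of the cosmological constant (the conversion to km/s/Mpc and the comparison value H ~ 70 km/s/Mpc are my additions, not from the text):

```python
import math

c = 2.998e8                 # speed of light, m/s
Lam = 1.29e-52              # quoted WMAP cosmological constant, m^-2

# H^2 = (4/9) Lambda c^2  (quoted relation)
H = math.sqrt((4 / 9) * Lam * c**2)   # s^-1

# Convert to the conventional km/s/Mpc units for comparison
Mpc = 3.0857e22             # metres per megaparsec
H_kms_Mpc = H * Mpc / 1e3

print(f"H = {H:.2e} s^-1 = {H_kms_Mpc:.0f} km/s/Mpc")   # ~70 km/s/Mpc
```

The result lands close to the observed Hubble constant of roughly 70 km/s/Mpc, which is the "cosmological coincidence" claim being made in the bullet above.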
[ In my Segal Conformal Gravity model, our universe is made up of regions that are in one of two phases, as in the model of pennies glued on an expanding balloon:
In the Dark Energy expanding balloon regions (75% of our universe) the 4 special conformal generator spin-1 graviphotons propagate, so the Einstein-Proca equations prevail and there are no Black Holes.
In the Ordinary Matter penny regions, the special conformal generator spin-1 graviphotons do not propagate and the 1 dilation Higgs/graviphoton produces Ordinary Matter (5% of our universe) and the 4 translation graviphotons produce Primordial Black Hole Dark Matter (20% of our universe).
The 6 Lorentz generator graviphotons act in accord with a generalized MacDowell-Mansouri mechanism to give torsion and connect with spinor fermions. ]
- Perhaps the most important consequence of the local graviton mass is its relation to superconductivity. ... the application of Proca equations to superconductivity is well established. In a superconductor, we now have a ratio between matter being in a normal and in a condensed (coherent) state. So we have two sets of Proca equations, one which deals with the overall mass and one with its condensed subset.
The coherent part of a given material (e.g. the Cooper-pair fluid) is also described by its own set of Proca equations ... but with one important difference: Instead of the ordinary mass density rho_m we have to take the Cooper-pair mass density rho_m* ... The ... Meissner part ... is not different from our previous assessment for normal matter, but the second part changes ... due to the fact that the gravi[pho]ton mass depends on all matter in the material, not just the coherent part. ... We switched here from the real graviton (graviphoton) mass to a complex value similar to what we discussed for the photon. This gives again the right sign for the gravitomagnetic London moment as observed ...
[ In the Dark Energy coherent state, the analogue of Cooper-pairs and electromagnetic photons is virtual particle-antiparticle pairs of the vacuum and Segal Conformal Gravity special conformal transformation graviphotons. ]
the presence of Cooper-pairs inside the superconductor leads to a deviation from the equivalence principle and from the classical gravitational Larmor theorem. A rigid reference frame mixed with non-coherent and coherent matter is not equivalent to a rigid reference frame made of normal matter alone, with respect to its inertial and gravitational properties.
However, in the case of a Bose-Einstein condensate where we have only coherent matter ... the equivalence principle is again conserved. ...
gravitomagnetic fields are also present inside the superconducting ring. The first part is the classical London moment, whose origin is due to the photon mass, and the second part is its analog gravitomagnetic London moment, which will produce an additional field overlapping the classical London moment. ... depending on the superconductor's bulk and Cooper-pair density, the magnetic field should be higher than classically expected. Indeed, that has been measured without apparent solution throughout the literature. ...
Tate et al used a sensitive London moment measurement to determine the Cooper-pair mass in Niobium. This mass was found to be larger ( m*/ 2 m_e = 1.000084 ) than the theoretically expected value ( m*/ 2 m_e = 0.999992 ). ... the local graviphoton mass ...[gives]... not only a conjecture to explain Tate's anomaly, but also a good reason why a rotating superconductor should produce a gravitomagnetic field which is larger than classical predictions from ordinary rotating matter. ...
Higgs condensate acts like molasses and slows down anything that interacts with it. The stronger the interactions between the particles and the Higgs condensate are, the heavier the particles become ... The standard model predicts that the vacuum energy density is directly proportional to the square of the Higgs mass m_H ... the Vacuum Expectation Value (VEV) ... v ... of the Higgs mass ... essentially measures the mean number density of Higgs particles n_H condensed in the zero-momentum state (vacuum) ... Assuming that a Cooper-pair condensate is a possible form of a Higgs condensate, we can equal the density of condensed Higgs particles n_H to the Cooper-pair density n_s in a superconductor. ... we can estimate the mass of the Higgs boson as a function of the local density of mass m_H = ( 3 / 2 ) ( rho_m / n_s ) . Taking the example of Niobium ( rho_m = 8570 kg m^(-3), n_s = 3.7 x 10^28 m^(-3) ), we estimate the Higgs mass as m_H = 192 GeV. ... This result leads us to consider coherent matter on the same physical footing as spacetime vacuum. Coherent matter would then be a form of vacuum. ...
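The quoted Niobium estimate can be reproduced from the numbers given; a Python sketch (the kg-to-GeV conversion factor is a standard value I have added, not from the text):

```python
rho_m = 8570.0      # Niobium mass density, kg/m^3 (quoted)
n_s = 3.7e28        # Cooper-pair density in Niobium, m^-3 (quoted)

# m_H = (3/2) (rho_m / n_s)  (quoted relation)
m_H_kg = 1.5 * rho_m / n_s

# Convert to GeV/c^2: 1 GeV/c^2 = 1.783e-27 kg
m_H_GeV = m_H_kg / 1.783e-27

print(f"m_H = {m_H_kg:.2e} kg = {m_H_GeV:.0f} GeV")   # ~195 GeV, vs quoted 192 GeV
```

The result agrees with the quoted 192 GeV to within rounding of the input densities.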
[ The Cooper-pair condensate analogue for the Higgs is a condensate of Truth Quark-AntiQuark pairs, acting in a 3-element system of Higgs, Truth Quarks, and the Vacuum (see hep-ph/0307138 by Froggatt) , in accord with Nambu-Jona-Lasinio type models as described by Yamawaki et al in hep-ph/9603293 and hep-ph/0311165 . ]
the local density of mass determines respectively the local mass of the graviphoton as well as the local mass of the Higgs boson. On the other side by equalling both equations we deduce that the local Higgs boson mass is proportional to the square of the local graviphoton's mass ... This can be understood as being a fundamental bridge between linearized general relativity with a cosmological constant, and the standard model. ...in classical matter the photon has a complex and the graviton(graviphoton) a real value. In coherent matter we suggest the hypothesis that it is exactly the other way round, which solves the sign change problems associated to the classical and gravitomagnetic London moment. ...".
In cond-mat/0602591, de Matos and Tajmar say:
"... Since Cooper pairs have anomalous mass excess, and all rest mass comes from the Higgs mechanism, and since rest mass interacts through gravity, what is the relationship between gravity and the Higgs mechanism in a superconductor?Some attempts to answer parts of that question can be found in the literature literature 29-35, but the final quantum theory of gravity is still not completed. ...
- 29 Sardanashvily, G. A., "Gauge Gravitation Theory. What is the Geometry of the World?", Los Alamos Physics Archive, gr-qc/9410045, 1994
- 30 Sardanashvily, G. A., "Gravity as a Higgs Field. I. Geometric Equivalence Principle", Los Alamos Physics Archive, gr-qc/9405013, 1994
- 31 Sardanashvily, G. A., "Gravity as a Higgs Field. II. Fermion-Gravitation Complex", Los Alamos Physics Archive, gr-qc/9407032, 1994
- 32 Sardanashvily, G. A., "Gravity as a Higgs Field. III. Nongravitational Deviations of Gravitational Fields", Los Alamos Physics Archive, gr-qc/941103, 1994
- 33 Bluhm, R., Kostelecky, V. A., "Spontaneous Lorentz violation, Nambu- Goldstone Modes, and Gravity", Los Alamos Physics Archive, hep-th/0412320, 2004
- 34 Smith, F. D. T., "SU(3) X SU(2) X U(1), higgs, and Gravity from Spin(0,8) Clifford Algebra Cl(0,8)", Los Alamos Physics Archive, hep-th/9402003, 1994
- 35 Smith, F. D. T., "Higgs and Fermions in D4-D5-E6 Model based on Cl(0,8) Clifford Algebra", Los Alamos Physics Archive, hep-th/9403007, 1994 ...".
I regret that due to me being blacklisted by the Cornell arXiv, my more recent work along those lines is not available on the Physics Archive for Tajmar, de Matos, and others to see.
Stuart Hameroff said in the abstract of his paper at Biosystems Volume 77, Issues 1-3, November 2004, Pages 119-136:
"... It is proposed here that normal mirror-like mitosis is organized by quantum coherence and quantum entanglement among microtubule-based centrioles and mitotic spindles which ensure precise, complementary duplication of daughter cell genomes and recognition of daughter cell boundaries. ... Impairment of quantum coherence and/or entanglement among microtubule-based mitotic spindles and centrioles can result in abnormal distribution of chromosomes, abnormal differentiation and uncontrolled growth, and account for all aspects of malignancy. ...".