How about this one: Surely distant starlight must prove an old universe. No one disputes that the stars and galaxies are very distant, millions and even billions of light years away. Of course, a light year is a distance measurement that indicates how far light, traveling at its presently measured speed of 186,000 miles a second, travels in one year. So doesn’t this prove the light needed millions and billions of years to get here?
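As a quick check of the scale involved, multiplying that speed by the number of seconds in a year gives the distance a light year represents; the seconds-per-year figure below is my own round number, used only for illustration.

```python
# Back-of-the-envelope: how far light travels in one year at 186,000 miles/s.
miles_per_second = 186_000
seconds_per_year = 365.25 * 24 * 60 * 60          # about 31.6 million seconds
print(f"{miles_per_second * seconds_per_year:.2e} miles")   # ~5.87e12, almost 6 trillion miles
```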
This problem assumes that the speed of light has always been the same, and that clocks have always measured time passing at the same rate in all times and places in the history of the universe. It may seem like an open and shut case, but actually, as we shall see, explaining distant starlight is a task for all cosmologies, including the conventional Big Bang model. I am not a physicist or astronomer by profession, but I want to make everyone aware of the work of many physicists and astronomers who have put forth scientifically sound alternative cosmological models that indicate or at least accommodate a young universe, on the order of 6,000-10,000 years old. These models question some of the underlying assumptions of the conventional models such as the big bang theory.
The Big Bang theory has a light travel time problem:
It should be shown at the outset that even the conventional models have to solve a distant starlight problem. The problem is called the “horizon” problem. In the big bang model, the universe began with a small point called a singularity, which then expanded rapidly. Before expansion, this model requires that different regions of the universe started with very different temperatures, yet today we can detect electromagnetic radiation coming from great distances all over the known universe, and this radiation shows that the temperature is very uniform everywhere. But how did this happen between regions that are now billions of light years apart? It could happen only by these regions exchanging electromagnetic heat and light energy until the temperature evened out, which is what happens when an ice-cold glass of water comes to room temperature if we wait long enough. Electromagnetic energy traveling at the current speed of light would not have had time to even out the temperature for points billions of light years apart, since they would have to have exchanged light and heat energy many times.
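To get a feel for the scale of the problem, here is a rough sketch using round numbers of my own choosing (they are not from any of the referenced articles): even a single one-way exchange of radiation between two regions separated by tens of billions of light years takes longer, at today's speed of light, than the conventional age of the universe, and reaching a common temperature would require many such exchanges.

```python
# Rough illustration with assumed round numbers (my own, for scale only).
separation_ly = 20e9           # two regions ~20 billion light years apart
conventional_age_yr = 13.8e9   # commonly cited conventional age, in years

# At the current speed of light, radiation covers one light year per year,
# so a single one-way exchange between the regions would take:
one_exchange_yr = separation_ly
print(one_exchange_yr, conventional_age_yr, one_exchange_yr > conventional_age_yr)
# 20 billion years > 13.8 billion years: even one exchange does not fit,
# and thermal equilibrium would require many exchanges back and forth.
```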
This is why inflation theory was brought in to save the big bang model from this horizon problem. Inflation theory, which actually has no convincing supporting evidence, {1} has the universe expanding more slowly at first, which supposedly allows the temperature differences to smooth out before a rapid, explosive inflation after that. As can be seen in the referenced articles, there is no known cause for this inflation, nor a known mechanism for stopping it, among other problems.
Therefore, the big bang’s starlight travel time problem remains, and so one cannot dismiss the Biblical chronology in favor of the conventional one.
Natural or Supernatural?:
There are underlying assumptions in conventional models that the processes and rates observed today were always in operation in the past, an assumption called uniformitarianism. There is also an up-front exclusion of the possibility of a supernatural creation event, in which different processes and rates were brought into play. If we assume instead that a creation event really happened, then there would have been processes that don’t happen today, operating at much faster rates than we see today. We can’t argue that a supernatural explanation is wrong simply because it can’t be explained by natural means. That is circular reasoning, and it excludes the supernatural on philosophical grounds.
So let’s look at some alternative cosmological models that have been proposed to deal with the distant starlight problem, both natural and supernatural models.
The Dasha Theory by Dr. Dan Faulkner:
Astronomer Dr. Dan Faulkner has come up with one alternative model, which he calls the “Dasha Theory,” named after the Hebrew word used in Genesis meaning “to grow” or “to bring forth,” as in Genesis 1:11. God “brought forth” the stars and their light so Adam could see them on Day 4 of creation. Remember, creation is said to be a miraculous process, like the virgin birth of Christ or the resurrection. So this is a model that accommodates the supernatural.
In this model, the current laws of physics don’t come into existence until after the creation period. The Bible speaks many times of the heavens being stretched out during creation (see Isaiah 40:22, Job 9:8, Psalm 104:2, and over a dozen other verses). The light from the stars could have been brought forward (dasha) abnormally fast by an undescribed process, enabling it to be seen on Day 4.
Some have objected to this by pointing out that we can see supernovas (star explosions) in distant galaxies, and so, they say, these couldn’t have happened during the creation period, since everything was pronounced “very good.” Faulkner defends his theory by pointing out that “very good” doesn’t necessarily mean perfection, and no life is lost in a supernova explosion. “Very good” could simply mean fulfillment of moral good, and things doing what they are designed to do. For more on this, Faulkner has put out a video called “The Dasha Theory” (Dasha Theory-A starlight and time solution by Dr. Danny Faulkner).
Now you might have been tempted to tune me out for starting with a solution that invokes supernatural processes. While this is totally appropriate, given that my starting assumption here is not to exclude but to include the supernatural (after all, I am talking about creation!), there are also other models out there that invoke natural processes and known scientific theories. They simply change some starting assumptions, as we shall see. These models are scientifically based and yet point to a young universe. Let’s look at four different alternative models, put forth by four scientists.
Dr. Russell Humphreys’ White Hole/Achronicity Model:
Physicist Dr. Russell Humphreys’ first model, called the White Hole Cosmology, was introduced in 1994 in his book Starlight and Time. {2} His model is based on Einsteinian Relativity and uses the same equations as the standard big bang theory, but with some different starting assumptions.
Standard big bang cosmology assumes that the universe has no center and no edge, with matter filling all of space. Since there would be no boundary and no empty space around the matter, there would be no unique center or center of mass, and no net gravitational force, since every galaxy would be surrounded by an even distribution of other galaxies. What many people don’t realize is that this is a purely arbitrary assumption, not required by the scientific evidence, but based on the idea that Earth has no special place in the cosmos, such as in or near the center. It is called the Copernican Principle.
Hawking and Ellis {3} comment on the reason for it: “…we are not able to make cosmological models without some admixture of ideology. In the earliest cosmologies, man placed himself at the center of the universe. Since the time of Copernicus we have been demoted to a medium sized planet going round a medium sized star on the outer edge of a fairly average galaxy…we would not claim our position in space is specially distinguished in any way.”
Notice that they call this principle an “admixture of ideology”. That is, they start up front with the idea that the creation account is false, and that man has no special place in the cosmos. This does not come from observable evidence but from a philosophical conclusion that we are the result of random processes and not from a Creator with a special purpose and place for us. The only physical evidence they point out is that the universe is isotropic, that is, it looks about the same in every direction. However, we shall see that these alternative models can explain this property of the universe just as well without assuming the Copernican Principle.
On the other hand, the creation account in Genesis implies that the universe does have a center (Gen 1:2), from which God caused the universe to expand outward from the center of a large mass. And there is now actual scientific data indicating that the universe may have a center of mass after all (see these articles):
Prestigious journal endorses basics of creationist cosmology
Massive Quasar Cluster Refutes Core Cosmology Principle
We also get from relativity theory that gravity affects clocks. A clock at high altitude runs faster than a clock at a lower elevation; this has been verified experimentally many times. It happens because the clock at the lower altitude is deeper into the “gravitational well” of the Earth. The deeper into a gravitational well, the more clocks slow down. So when someone asks how long it took starlight to get here, we need to ask, “by whose clocks?” Although this time dilation effect, as it is called, is not large today, even for clocks far out in space, there is evidence that the universe has expanded greatly, and when it was much smaller, time would have run much faster at the edge of the universe than at the center, which would be deep in the universe’s “gravitational well.” All these effects fall out of the same General Relativity equations used in the standard model. So in this model, light from distant stars would have plenty of time to reach Earth, where clocks would have been running slower. So what effect makes this possible?
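To make the clock effect concrete, here is a minimal sketch of the standard weak-field formula for gravitational time dilation, applied to the familiar Earth example above. The numbers (Earth's mass, the GPS orbital radius) are approximate values I have supplied for illustration; they are not taken from Humphreys' book.

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24    # mass of Earth, kg (approximate)
c = 2.998e8     # speed of light, m/s

def clock_rate(r):
    """Rate of a stationary clock at radius r relative to one far from the mass,
    using the Schwarzschild factor sqrt(1 - 2GM/(r c^2)); deeper in the
    gravitational well (smaller r) means a slower clock."""
    return math.sqrt(1.0 - 2.0 * G * M / (r * c**2))

r_surface = 6.371e6   # Earth's surface, m
r_gps = 2.6571e7      # GPS orbital radius, m (approximate)

gain_per_day = (clock_rate(r_gps) - clock_rate(r_surface)) * 86400
print(f"A GPS-altitude clock gains roughly {gain_per_day * 1e6:.0f} microseconds "
      "per day over a surface clock (gravitational effect only).")
```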
If the universe has a center, then there is a gravitational center of mass. If the universe has expanded, then at one time in the past there was the same amount of matter as today, but packed into a smaller space. If the universe was smaller by a factor of fifty, as referenced by Humphreys (Starlight and Time, page 22), relativity allows it to be inside either a black hole or a white hole. All the matter would be contained inside what is called the event horizon of a black hole, the event horizon being where time is greatly slowed or stopped. But black holes do not expand. However, General Relativity allows for a white hole, which reverses the events: unlike a black hole, which holds everything in, the white hole requires that light and matter inside the event horizon expand outward, and as they do, the event horizon shrinks in diameter. So if you have a bounded universe that has expanded, General Relativity indicates you have a white hole (Starlight and Time, pages 25-26).
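The claim that all the matter could fit inside an event horizon can be made plausible with a quick estimate. Here is a minimal sketch of the Schwarzschild radius formula, r_s = 2GM/c², using a round-number mass for a bounded universe that I have assumed purely for illustration; it is not a figure from Starlight and Time.

```python
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
LIGHT_YEAR = 9.461e15  # metres in one light year

def schwarzschild_radius(mass_kg):
    """Event-horizon radius of a mass: r_s = 2 G M / c^2."""
    return 2.0 * G * mass_kg / c**2

M_assumed = 1e53   # kg, a hypothetical round-number mass for a bounded universe
r_s = schwarzschild_radius(M_assumed)
print(f"r_s is about {r_s / LIGHT_YEAR / 1e9:.0f} billion light years")
# A mass of that order has an event horizon billions of light years across,
# so a bounded, once-compact universe would sit well inside its own horizon.
```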
Since time would stand still at the event horizon, as the event horizon got smaller it would eventually reach Earth at the center, on Day 4 if we go by the creation account, and while clocks were running fast in the distant universe, they would be stopped or running very slowly on Earth. So you would see distant objects in the universe age billions of years, and light would have plenty of time to reach Earth. (For a layman’s summary in Starlight and Time, see pages 9-29; for a more technical explanation, see pages 83-128.)
For links to Humphreys’ basic theory, some modifications he made to it, and his responses to critics, see these articles:
New Vistas of Space-Time Rebut the Critics
Russel Humphreys answers various critics
In 2007 Humphreys made some further modifications and came up with an improved version of the original model. He explains in this excerpt from his article referenced below {4}:
“In November of 1915 Albert Einstein published the crowning conclusion of his General Theory of Relativity: a set of sixteen differential equations describing the gravitational field. Solutions to these equations are called metrics, because they show how distance-measuring and time-measuring devices (such as rulers and clocks) behave. The equations are so difficult to solve that new metrics, giving solutions under specific conditions, now appear only once every decade or so. Metrics are foundational; they open up new ways to understand space and time. For example, the first metric after Einstein’s work, found by Karl Schwarzschild in 1916, not only explained the detailed orbits of planets, but also pointed to the possibility that ‘black holes’ might exist.
In the fall of 2007 I published a new metric as part of an explanation of the ‘Pioneer anomaly’, a decades-old mystery about the slowing-down of distant spacecraft. Compared to many modern metrics, the new one is rather simple. It describes space and time inside an expanding spherical shell of mass. I was interested in that problem because of the ‘waters that are above the heavens’ that Psalm 148:4 mentions as still existing today above the highest stars (see figure 1). The waters would be moving outward along with the expansion of space mentioned in 17 Scripture passages.
[Figure 2 caption in the original: A moving clock measures the spacetime interval ds between two events.]
According to data in my previous paper, the total mass of the shell of waters is greater than 8.8 × 10⁵² kg, more than 20 times the total mass of all the stars in all the galaxies the Hubble Space Telescope can observe. However, because the area of the shell is so great, more than 2 × 10⁵³ m², the average areal density of the shell is less than 0.5 kg/m². By now the shell must have thinned out to a tenuous veil of ice particles, or perhaps broken up into planet-sized spheres of water with thick outer shells of ice. It is only the waters’ great total mass that has an effect on us, small but now measurable.
[Figure 3 caption in the original: Gravitational potential Φ inside a spherical shell of mass increases as the radius R of the shell increases.]
Because of the great mass of the ‘waters above’, I could neglect the smaller mass of all the galaxies in deriving the metric. Although other distributions of mass could also solve the Pioneer mystery, this one seems more applicable to biblical cosmology.
Being relatively simple, the new metric clarifies a new type of time dilation that was implicit in previous metrics but obscured by the effects of motion. This new type, which I call achronicity, or ‘timelessness’, affects not only the narrow volume of space at or just around an ‘event horizon’ (the critical radius around a black hole at which time stops), but all the volume within the horizon. Within an achronous region, we will see, time is completely stopped. I pointed out a related effect, ‘signature change,’ in an earlier paper, but all I had to go on then was an older metric, the Klein metric, which was quite complicated. The complexity obscured what that metric suggested could happen to time. The cosmology this paper outlines is a new one that does not stem from the Klein metric.” {4}
“The new metric I derived in 2007 has yielded several interesting results. One is a straightforward explanation of the Pioneer anomaly. In this paper, it has revealed a new type of time dilation, achronicity. The fundamental cause of achronicity appears to be that gravitational potential becomes so negative that the total energy density of the fabric of space becomes negative. That stops the propagation of light, all physical processes, and all physical clocks, thus stopping time itself.
I have examined the effect only for essentially motionless bodies (having velocities very much less than that of light). In a later paper, I hope to explore some of the interesting and possibly useful effects of achronicity for non-negligible particle velocities. The speculative scenario in the previous two sections shows how useful achronicity could be in creation cosmology. Other scenarios are easily possible, and I hope that other creationists making alternative cosmologies will find timelessness a good tool.”
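As a quick sanity check of the figures Humphreys quotes above, dividing the stated mass of the shell by its stated area does give an areal density of roughly the size he reports; this is just the arithmetic, not a derivation of either number.

```python
# Arithmetic check of the quoted figures only (both values are the ones
# Humphreys cites; nothing here is derived independently).
mass_of_shell = 8.8e52   # kg
area_of_shell = 2e53     # m^2
print(mass_of_shell / area_of_shell)   # ~0.44 kg per square metre, i.e. "less than 0.5 kg/m^2"
```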
This new model builds on Humphreys’ previous models. As he shows above, it is based on a new solution (metric) of Einstein’s General Relativity equations, and it allows for a new type of time dilation that is an even more powerful solution to the light travel time problem. He uses the illustration of space being stretched out like a trampoline, noting that there are many Bible verses that seem to speak of space as a kind of “material” that can be stretched, rolled, etc. Modern science has a concept of a “material” of space as well. As the mass of the stars is added, the fabric of space drops below a critical level into a timeless zone, and then, as space is stretched, the created stars and galaxies come out of the timeless zone, and their light follows that zone all the way back to Earth, which is the last to emerge from it. For a series of lay-friendly articles by Humphreys and Dr. Larry Vardiman that explain this much better than I can, see:
A New Creationist Cosmology in No Time at All-Part 1
A New Creationist Cosmology-in No Time at All-Part 2
A New Creationist Cosmology-in No Time at All- Part 3
This model also solves a mystery concerning the Pioneer spacecraft:
Creation Cosmologies Solve Spacecraft Mystery
What these models show is that several possibilities are viable within General Relativity, depending on your beginning assumptions, and that several of them accommodate a young universe.
Dr. John Hartnett’s Cosmological Relativity Model:
Another relativistic time dilation theory has been put forth by physicist Dr. John Hartnett. His model is a five-dimensional cosmology based on the work of Moshe Carmeli, called Cosmological General Relativity. Hartnett’s theory is an extension of Carmelian General Relativity, as he spells out in his book Starlight, Time, and the New Physics. {5}
This theory proposes a five-dimensional space-time-velocity solution rather than a four-dimensional space-time solution as in the standard big bang Friedmann-Lemaitre model. The new dimension is the radial expansion velocity of space. Carmeli solved the same equations, but in five dimensions rather than four.
His model has clocks in what he calls cosmic time running about 1 trillion times faster than Earth clocks on Day 4, due to the effects of the accelerated expansion of space as stars and galaxies were being created. It is similar to Humphreys’ model in that it also starts with the assumption of a center of mass and an expanding universe. However, rather than a gravitational effect as in Humphreys’ cosmology, this is due to the extra dimension of velocity from the very rapid expansion of the cosmos. In other words, the clocks run faster in the distant cosmos due to this rapid stretching of space. What falls out of his equations is that in galaxies undergoing this extreme acceleration, clocks would run much faster than clocks on Earth. The farther away, the faster the acceleration, as if the galaxies were fixed on a rubber sheet that was being stretched. This rapid expansion then slowed greatly after the creation period, and clocks no longer ran as fast in the distant cosmos.
The five-dimensional solution to Einstein’s equations was done by Carmeli, and it contains the four-dimensional standard solution as a subset. The model assumes a spherically symmetrical, isotropic universe, but not a homogeneous one, that is, one with matter evenly distributed throughout it. And it turns out that when a survey was done of the distribution of galaxies in a large section of space, called the Sloan Digital Sky Survey, it was found that the galaxies are not evenly distributed, but lie in large concentric circles around a center point which is slightly off from Earth’s location. This poses problems for the standard model, but fits with Hartnett’s and Humphreys’ models. This survey showed we are at least near the center of the visible universe.
His model has the creation of the solar system first, then the rapid expansion of space outside the solar system. So within the solar system the four-dimensional space-time model would apply, but in the rest of the cosmos the five-dimensional space-time-velocity model would apply.
Again, Hartnett can explain this much better than I can, in this excerpt from his article A 5D Spherically Symmetric Expanding Universe is Young {6}:
“In this paper I explore an extension of Carmelian cosmology. In itself, it has had success in describing the large-scale structure as seen in the type 1a supernovae distance modulus versus redshift data, and in fitting to the anomalous rotation curves of spiral galaxies. I propose that the only 5D spacetime-velocity metric that can be correct on both the local scale, reproducing the 4D spacetime metric of SR and GR, and on the cosmological scale, reproducing the 4D space-velocity metric of CSR and CGR, is one that requires that enormous cosmological acceleration and accompanying time dilation has occurred, in the past, between Earth clocks and those in the rest of the universe. This means the universe is very young as measured by Earth clocks. It only has the appearance of great age because we are biased by the vast size of the universe. Based on the observed retardation of cosmological clocks in the distant universe, I postulate that during Creation Week, specifically on Day 4, Earth clocks ran extremely slowly compared to the rest of the universe.
This means that if the new theory is shown to fit the observations of the large-scale structure of the universe and is consistent with Einstein’s well-tested special relativity theory, then we are forced to conclude that the correct understanding of the expanding universe means that clocks on Earth once ran at much slower rates than clocks in the universe. As a result, we have a mechanism for light to travel to Earth from the most distant galaxies within the biblical timescale.”
“This is the critical point in understanding this paper. The assumption is that during the creation of the heavenly bodies on Day 4 the universe underwent a very rapid expansion. For example, the Bible tells us:
‘He wraps himself in light as with a garment; he stretches out the heavens like a tent’ (Psalms 104:2).
‘He sits enthroned above the circle of the earth, and its people are like grasshoppers. He stretches out the heavens like a canopy, and spreads them out like a tent to live in’ (Isaiah 40:22).
‘This is what God the Lord says—he who created the heavens and stretched them out, who spread out the earth and all that comes out of it, who gives breath to its people, and life to those who walk on it’ (Isaiah 42:5).
‘This is what the Lord says—your Redeemer, who formed you in the womb: I am the Lord, who has made all things, who alone stretched out the heavens’ (Isaiah 44:24).
The very fabric of space was stretched, and during that time of stretching, stars and galaxies were created.”
Hartnett’s cosmology explains anomalies such as type Ia supernovas, which have higher redshifts than they should, without resorting to “dark energy.” {7} It also explains the anomalous rotation curves of spiral galaxies without invoking unproven “dark matter” or “dark energy,” because it takes into account the effects of the expansion of space on the galaxies. {8}
Hartnett, who was formerly an atheist, comments on this new cosmology and its relation to the Bible: “But don’t be mistaken; though Carmeli is some sort of rebel in that he has challenged the established thinking, in his mind his new theory does not present as anything more than a new type of big bang model. However, we can apply the same theory to extract a new model that is consistent with what we would expect, starting with the Genesis history. The starting conditions cannot be determined from observations, and even if we could see back in time to the beginning, the same data could support a range of different historical interpretations; there is no unique history presented by the evidence. To get the correct starting conditions you would need the testimony of an eye-witness to those events, which is what we have in God’s account of what He says He did, in Genesis 1.” {9}
So again we see it comes back to worldview, as cosmologist George Ellis admits: “People need to be aware that there is a range of models that could explain the observations. For instance, I can construct you a spherically symmetrical universe with Earth at its center, and you cannot disprove it based on observations. You can only exclude it on philosophical grounds. In my view there is absolutely nothing wrong in that. What I want to bring into the open is the fact that we are using philosophical criteria in choosing our models. A lot of cosmology tries to hide that.” {10}
Dr. Jason Lisle’s Anisotropic Synchrony Convention Model:
Astrophysicist Dr. Jason Lisle has come up with yet another way to potentially explain the distant starlight problem. He acknowledges the value of the previous models, but also suggests that the time for starlight to get to Earth depends on the convention one uses to measure time. His model is called the Anisotropic Synchrony Convention. A synchrony convention is a procedure used for synchronizing clocks that are separated by a distance. This theory is based on the fact that the speed of light in one direction, that is, the one-way speed of light, cannot actually be objectively measured. What is measured in experiments is the round-trip speed of light, using mirrors to reflect the light back. So it is possible that the one-way speed of light could actually be instantaneous, even though the round-trip (two-way) speed of light is constant.
Lisle explains why we can’t measure the one-way speed of light in this excerpt from his article: {11}
“In other words, we are free to choose what the speed of light will be in one direction, though the “round-trip” time averaged speed is always constant.
The reason that the one-way speed of light cannot be objectively measured is that you need a way to synchronize two clocks separated by a distance. But in order to synchronize two clocks separated by some distance, you have to already know the one-way speed of light. So it cannot be done without circular reasoning.
We need to have a way of synchronizing clocks to know the one-way speed of light. But we need to know the one-way speed of light in order to synchronize clocks. Einstein was well aware of this dilemma. He said, “It would thus appear as though we were moving here in a logical circle.”
Einstein’s resolution to this dilemma was to suggest that the one-way speed of light is not actually a property of nature but is instead a convention—something that we may choose!” {11}
So we can actually choose a convention, similar to choosing local time over Universal Time on Earth. Anisotropic refers to light having different speeds in different directions, as opposed to the convention Einstein used, isotropic, in which the speed of light is the same in all directions.
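The point that the one-way speed is a convention while the round-trip speed is fixed can be illustrated with the standard Reichenbach parameterization (my own illustration here, not drawn from Lisle's article). Whatever one-way speeds we assume, as long as they combine to give the measured round-trip speed, every timing experiment with a mirror comes out exactly the same.

```python
# Round trip to a mirror at distance d and back under different synchrony
# conventions. epsilon = 0.5 is Einstein's choice (equal speeds both ways);
# epsilon close to 1 approaches ASC (the return leg nearly instantaneous).
c = 299_792_458.0   # measured round-trip speed of light, m/s
d = 1_000.0         # one-way distance to the mirror, m

def round_trip_time(epsilon):
    c_out  = c / (2 * epsilon)         # assumed outbound one-way speed
    c_back = c / (2 * (1 - epsilon))   # assumed return one-way speed
    return d / c_out + d / c_back

for eps in (0.5, 0.9, 0.999):
    print(eps, round_trip_time(eps))   # the round-trip time never changes
```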
Genesis may imply the Anisotropic Synchrony Convention (ASC), since starlight was made available immediately. So in this convention the one-way speed of light from the distant galaxies to Earth was instantaneous.
It may seem unlikely that light would not have the same speed in all directions. But even though we may assume for everyday use that light speed is constant in all directions as measured by our clocks, in a relativistic universe, as we approach the speed of light, time and space no longer have absolute values independent of the observer.
In his more technical article, Lisle shows that using the Einsteinian convention, with the speed of light the same in all directions, leads to some interesting results when one observer is in motion relative to another. {12} In fact, they will get different answers as to whether some events happened at the same time, or in what order they happened. With ASC, two observers see the same events as simultaneous, regardless of their velocity.
He also makes the case that since we can choose a convention, it makes sense to see which one fits the Bible. As we said above, light traveling very fast from the stars to Earth would fit the ASC. Also, people through most of history would not have known anything about the speed of light or look-back time, and with ASC it is not necessary to know the distance to an object, so ASC best preserves the clarity of Scripture. Things in space would be seen as they happen. Astronomers seem to use ASC when they name a supernova after the year they saw it, rather than the year they believe the light left the source. ASC is just one more possible model that depends on one’s starting assumptions rather than on the observations.
Finally, I come to a model that takes a different approach to the data that support relativity, and proposes something radical to many physicists: a faster speed of light in the earlier days of the universe. I will look mainly at physicist Barry Setterfield’s theory, although recently other scientists have also proposed a change in light speed to help solve some of the big bang theory’s problems.
Barry Setterfield’s CDK Model: An Overlooked Cosmology, the Heat Problem in Accelerated Radioactive Decay, and the Zero Point Energy
The Institute for Creation Research RATE team, in their work spanning about eight years, found evidence for acceleration of radiometric decay. For example, they found large amounts of helium in rocks, equivalent to millions of years of decay end products, that should have diffused out of the rocks over the supposed millions of years. They also found polonium halos, which again were evidence of quick formation as well as accelerated decay. But there is a problem with the idea of accelerated decay under the theories proposed, the problem of excessive heat, which is especially acute if you have a large amount of decay well after the beginning of creation, and a large amount of accelerated decay happening in just the one year of the Flood. They did propose some mechanisms that would absorb this heat, {13} but I think there is a creationist cosmology, proposed by physicist Barry Setterfield, that solves this radioactive heat problem. And this same model also deals with the light travel time from distant stars and galaxies.
As we have seen above, models for light travel time based on relativistic time dilation have been proposed that would work in theory, but they have been hard to confirm with direct experimental data. Another problem for the conventional big bang model has been trying to explain the formation of galaxies and stars with gravity alone. Its proponents need to invoke things like shock waves from already existing exploding stars to push the matter together so gravity can supposedly take over, but this doesn’t explain the formation of the first stars and galaxies. Dark matter is proposed to explain the motions of galaxies and the holding together of star clusters, and dark energy is also invoked to explain the supposed acceleration of the universe’s expansion, even though there is no hard evidence that either dark matter or dark energy exists.
But there is an overlooked cosmology, proposed by physicist Barry Setterfield, that could possibly solve all of these problems neatly. This is the ZPE or Zero Point Energy cosmology, also called CDK, or the slowing of light speed. The ZPE, according to the theory, is an energy that pervades the vacuum of space at a temperature of absolute zero, even after all gases, liquids, and solids are removed. The ZPE is not an idea that Setterfield came up with; rather, Max Planck first conceived of it in a paper he wrote in 1911 (Planck’s second paper), in which he specified it as the cause and the measure of quantum uncertainty (Planck’s constant). Planck had written his first paper in 1901, which was the basis for much of today’s QED (quantum) physics, but he was actually unsatisfied with that first paper because it did not give a real physical mechanism for the jitter motion of particles and the associated uncertainty. In 1925, Einstein, Nernst, and others examined this proposal and approved of it. The ZPE’s existence was verified by Mulliken in 1925, from the measured shift of the spectral lines of boron monoxide, and it has been confirmed in other ways since, by evidence such as the Casimir effect, the inability to freeze liquid helium without pressure, the “noise” in electronic circuits, and the existence of van der Waals forces. We don’t feel the ZPE for the same reason we don’t feel air pressure from the atmosphere: it is the same inside and out. For a more complete technical explanation of all this, please read Journal of Theoretics, {14} especially the first 10 pages. This article discusses two approaches to modern physics: QED (Quantum Electrodynamics) and SED (Stochastic Electrodynamics). Also, for an excellent, easy-to-read summary of Setterfield’s work by another scientist, please look at The Setterfield Cosmology. {15}
Included in the above article {15} is the fact that in 1962 de Broglie, who had written one of the early papers that helped introduce quantum physics, suggested that physicists had missed something and should take a second look at the ZPE, which provides an actual physical explanation for quantum phenomena, instead of these just being an inherent property of matter. This re-examination is now ongoing in the area of SED physics.
The ZPE, according to the theory, originated when God stretched out the universe, creating a kind of potential energy similar to stretching out a rubber band. When you release a stretched rubber band, the potential energy is converted to kinetic energy. For the ZPE, this manifests itself in virtual particles (Planck particle pairs), the density of which increases as the “rubber band” continues to snap back, making space “thicker,” in effect. Photons of light traveling through space are momentarily absorbed and then re-emitted by these particles. The more of them there are, the more the light is slowed down. Thus light starts at the beginning of the universe at a much higher speed, and as the kinetic ZPE increases, there are more virtual particles to go through, like a runner having to jump more and more hurdles, and so the light slows down.
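A toy calculation may help make the “more hurdles, slower runner” picture concrete. The numbers below are entirely made up for illustration; this is not Setterfield's mathematics, only a sketch of the qualitative idea that a fixed delay per absorption and re-emission lowers the effective speed as the particle density grows.

```python
# Toy illustration only: a photon crossing a fixed distance is briefly delayed
# each time it meets a "virtual particle." The denser the particles, the lower
# the effective speed. (Hypothetical numbers, not Setterfield's actual model.)
c0 = 3.0e8                    # unimpeded speed, m/s
delay_per_encounter = 1e-17   # assumed delay per absorption/re-emission, s
distance = 1.0                # path length, m

def effective_speed(encounters_per_metre):
    travel_time = distance / c0 + distance * encounters_per_metre * delay_per_encounter
    return distance / travel_time

for density in (0.0, 1e6, 1e8, 1e9):   # encounters per metre of path
    print(f"{density:10.0e} encounters/m -> {effective_speed(density):.3e} m/s")
```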
There is much data to support the light speed slowdown. Measurements of light speed over the last few centuries have shown a consistent decrease, even when error bars are taken into account, using 16 different methods of measurement. Some creationist reviews were skeptical, but a comprehensive defense of the statistical trends has since been published by statistician Alan Montgomery and physicist Lambert Dolphin, and it has not been answered to date. See: Defense of statistical trends in light speed. {16} {17}
Other quantities have been measured as changing in sync with light speed, such as Planck’s constant (increasing), the electron rest mass (increasing), and a slowing of atomic time relative to orbital time. Eleven “constants” in all have been measured as changing. This is contrary to what is said by some creationists, who don’t seem to consider that these quantities have actually been measured as changing.
As mentioned in the article in ref. {15}, Setterfield is not the only scientist who has put forth the idea of a faster light speed in the past. Rinus Kiel comments:
“Is Setterfield alone in his views? Certainly not! Several investigators have spoken about this theme! Because there is a painful problem in Big Bang, that must be solved, namely that galaxies, even as far as 13 billion light years away, do not show any trace of cosmological evolution, but are fully ‘evolved’ and adult. Which means that the supposed cosmological evolution should have taken place in 0.7 billion years, being only 5% of the age of the universe. Some examples:
Victor S. Troitskii of the Radiophysical Research Institute in Gorki (Russia) concluded in 1987, as a result of his investigations into the redshift anomalies, that the speed of light was much higher in the beginning of the cosmos and that other ‘constants’ varied along with it.
J.W. Moffat in 1993: There must have been a high speed of light in the beginning of the cosmos.
Andy Albrecht and João Magueijo (1999): Many cosmological puzzles are solved easily if only the speed of light in the beginning was very high.
John D. Barrow (1999): In a BBC TV-interview he said: “Call it heresy, but all the big cosmological problems will simply melt away, if you break one rule, the rule that says the speed of light never varies”.
Moffat as well as Albrecht, Magueijo and Barrow assume that this high initial light speed has decreased very quickly to the current value; but they have not taken seriously the relation to the other ‘constants’ as Troitskii did.
Most significantly, the ZPE model explains a problem with the values of increasing redshifts measured in distant galaxies, called quantization, which has not been solved with experimental data under the current Big Bang model. Tifft and others found that the redshifts go in discrete jumps in value, as if the velocities of cars measured on the highway only came in multiples of 5, with nothing in between. This is of course a problem if the redshifts are simply Doppler shifts.
The ZPE model sheds some light on this problem. What we could be seeing is an atomic phenomenon rather than an expansion effect. The ZPE maintains the stability of atomic orbits. An electron orbiting a nucleus radiates energy, and so it should spiral into the nucleus unless some input of energy counteracts this collapse. In QED physics, quantum laws are invoked to solve this difficulty, but without a physical explanation. The ZPE provides one, in that electrons absorb energy from the ZPE, which balances the energy they radiate and keeps the orbits stable. Now, if the ZPE is higher, then the orbits of the electrons will be at a higher energy level, and light emitted from the atoms of the light source will be of higher energy, and thus bluer. So as we look back in time, the ZPE was lower, light was faster, and light was also emitted at a lower energy level, and therefore redder. The quantization occurs because the electron energy levels are discrete and can only take on certain values before they are bumped up to the next orbital energy level.
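The discreteness being appealed to here is the ordinary quantization of electron energy levels. As a reminder of what “discrete levels” look like, here is a minimal sketch using the textbook Bohr formula for hydrogen; it is standard physics, not Setterfield's ZPE calculation.

```python
# Textbook Bohr energy levels for hydrogen, shown only to illustrate that
# electron energies come in discrete steps with nothing in between.
RYDBERG_EV = 13.6   # hydrogen ionization energy, eV

def energy_level(n):
    """Energy of the n-th allowed orbit in electron-volts (negative = bound)."""
    return -RYDBERG_EV / n**2

for n in range(1, 6):
    print(f"n = {n}:  E = {energy_level(n):7.3f} eV")
```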
Again I will refer you to the link above, which explains this better than I just did: The Setterfield Cosmology. {15} I agree with Kiel in this paper that Setterfield’s cosmology deserves some attention. (Incidentally, Setterfield’s geological model has Noah’s Flood farther down in the geological column, with the fossils forming in post-Flood catastrophic conditions. However, he still has a young-earth perspective, and I personally think that whatever differences Setterfield’s geological model of Noah’s Flood has from other creationist models are a separate issue and have little to no bearing on the general validity of his cosmology. From what I see, I think there is good evidence that at least a majority of the fossils are from Noah’s Flood, but that I will cover in a separate series of articles. I think most of his differences over the placement of the Flood/post-Flood boundary in the geological column, and over how much of the column came from the Flood, can be answered by the Catastrophic Plate Tectonics model and other models. And it is healthy science to have several competing models out there. Remember, we have only one eyewitness account, and that is in the Bible.)
In addition, the ZPE model reproduces the same experimental results that confirm General and Special Relativity. See: A New Look at Relativity and the Zero Point Energy. {18} For example, in the article Setterfield describes how the ZPE, which consists of electromagnetic waves, causes the jitter motion of sub-atomic particles, which in turn send out a secondary electromagnetic field. This has been verified experimentally. This secondary radiation increases the strength of the ZPE locally:
“The jittering of sub-atomic particles by the ZPE results in these charged particles sending out a secondary electromagnetic field. This is predicted by classical physics and verified experimentally. This secondary radiation boosts the strength of the ZPE locally. Thus, an adjacent charged particle will experience two driving forces: First are the driving forces of the ZPE causing it to oscillate, and second are the forces due to the secondary fields produced by the ZPE-driven oscillations of the first particle. Similarly, the ZPE-driven oscillations of the second particle will cause their own secondary fields to act back on the first particle.
The net effect is an attractive force between the particles. The sign of the charge does not matter; it only affects the phase of the interactions. This net attractive force between the particles has been shown by Haisch, Rueda and Puthoff to be identical to gravity. Thus, where there are many particles, there are many secondary fields which manifest as gravity, and which, at the same time, boost the ZPE strength locally, so that it becomes significantly greater. This local increase in ZPE strength around a large collection of particles results in the slowing of atomic clocks as does any increase in ZPE strength. Further details are in “General Relativity and the Zero Point Energy:” (This paper also shows that the perihelion advance of Mercury is predicted by SED physics. Earlier, this had been one of the strongholds of Einstein’s theory).”
“In other words, the presence of the ZPE and its effects fulfills the requirements as the actual physical mechanism which replaces the purely mathematical modeling of relativity.” {18}
Similarly, the ZPE also explains the effects of Special Relativity, the slowing of clocks at high speeds:
“Einstein’s theory of Special Relativity has to do with the effects of velocities on moving objects. These effects include increases in atomic masses as velocities become high, as well as the resulting slowing of atomic clocks. We have observed that the acceleration of an electron through a linear accelerator results in an increase in mass of the electron. This has been hailed as proof that relativity is correct. However, the SED approach predicts exactly the same effect as a result of the existence of the ZPE.
Using the SED approach, it has been shown that the masses of sub-atomic particles all come from the “jiggling” of these particles by the impacting waves of the ZPE. This “jiggling” imparts a kinetic energy to these mass-less particles and this energy appears atomically as mass. An increased “jiggling” occurs when a particle is in motion, because more ZPE waves are impacting the particle than when it is at rest. An increase in particle mass is then the result. The higher the velocity, the more “jiggling” occurs and the greater the resulting mass. This has been mathematically quantified.
As the mass increases, it can be shown that the rate of ticking of atomic clocks slows down since kinetic energy is conserved. Atomic clocks are based on the rates of atomic processes. Atomic processes are governed by the atomic masses which, if they increase, require either more energy for the same amount of speed, or less speed for the same amount of energy. Since one of the basic laws of nature is that energy is conserved, then the atomic particles must move more slowly as they gain mass. This, in turn, would mean that atomic time varies with changes in mass, and that any increase in mass would result in a slowing of the atomic clock. This has been experimentally demonstrated by accelerating a short half-life radioactive particle. As the mass has increased with the speed, the rate of decay has slowed down. This experiment has been used to show Einstein’s theory of Special Relativity is right, but the same result is predicted from SED physics with acceleration through the ZPE.” {18}
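The measurable effect described in both paragraphs, more mass and a slower internal clock at higher speed, is quantified by the Lorentz factor, which any framework (Einstein's, or the SED approach described above) has to reproduce. Here is a minimal sketch of that factor; the sample velocities are my own choices for illustration.

```python
import math

c = 299_792_458.0   # speed of light, m/s

def gamma(v):
    """Lorentz factor: measured mass scales up by gamma, and a fast particle's
    internal 'clock' (for example, its decay rate) slows by a factor of gamma."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

for fraction in (0.1, 0.9, 0.99, 0.999):
    print(f"v = {fraction:>5} c  ->  gamma = {gamma(fraction * c):8.3f}")
```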
I recommend reading Setterfield’s articles Reviewing the Zero Point Energy and, for a more comprehensive paper, ZPE and atomic constants. A paper that discusses plasma physics in conjunction with the ZPE, and how this approach could solve some of the problems with galaxy formation, is Plasma universe. It’s hard to argue with the plasma theories for the origin of the galaxies, since in both lab work and computer simulations the plasma forms itself into the same filamentous structures we see in the shapes of galaxies. Here also is a mainstream physics paper that discusses the great magnitude of the ZPE: Zero Point energy article Calphysics Institute. Notice in this last article how they consider the ZPE approach but are reluctant, as Setterfield has described, to go down the SED path, choosing String Theory and M-theory instead. As de Broglie suggested, the mainstream physics world may need to take a second look at Planck’s second paper.
But what about this heat problem with radioactive decay? How does the ZPE approach solve this sticky problem? The speed of light, c, is proportional to the rate of radioactive decay for all types of radiometric decay. Both are due to atomic processes affected by the ZPE (see Radiometric dating and the ZPE). The RATE team found much evidence for accelerated decay, including helium retention in zircons, and uranium and orphan polonium haloes. But what to do with all the heat generated by accelerated decay? Setterfield’s model has some answers. One is that the radiation density would not change; see the previous link Radiometric dating and the ZPE and also Radiant Energy Emission.
The properties of space are affected by the ZPE as we noted above. I am going to quote Setterfield’s simplified explanation: “1. Space transmits electromagnetic waves, such as light. This means space itself must have both electric and magnetic properties. The electric property of space is referred to as ‘permittivity’ and the magnetic property is referred to as ‘permeability.’ These properties are governed by the number of virtual particles popping in and out of existence in a given volume. When there are fewer virtual particles per given volume, both the permittivity and the permeability of space are lower, which means that there is less resistance to the electric and magnetic elements of the photon (‘packet’ of light). Without this resistance, light travels more quickly.
2. In combination with the first point, when the speed of light was faster, a photon of light would travel farther in one second than it would travel now. That means that the same amount of light, or any radiation, would take up a greater volume at any one time. And THAT means that in any given, or defined, volume, the actual density of radiation from any given reaction would be less before than now.
3. Although faster radioactive decay rates mean that more radioactive atoms are decaying in a given time, the heat problem is offset by two factors: First that the amount of heat radiation in a given volume is lower, as explained in the previous two points. Secondly, as explained earlier in this paper, as we go back in time we are also going back to before so much energy was available to the atom. Before each quantum jump, the atom had lower energy than after. So the net effect here is that the earlier in time, the lower the energy of the atom, even though the light speed and therefore the actual rate of decay were faster. This lower energy in the atom thus somewhat reduced the amount of heat released by any given decay process.
Thus, the expected ‘frying’ effect of a higher radio decay rate which would be part of a time of higher light speed was counteracted by several factors:
First, the initial depth in the earth of radioactive materials.
Second, the increased volume taken up by any given photon.
Third, the lower energy in the atom in the past” (Taken from A Simplified Explanation of the Setterfield Hypothesis)
He also discusses how, if the earth started out in a cold state with an ocean covering the surface, then even with a faster rate of radioactive decay, and with the radioactive elements deep in the earth (being brought up to the earth’s crust later in earth’s history), it can be calculated that the temperature of the core today, assuming the biblical age for the earth of about 7,000 years, would be about 5,800 degrees, with about 1,900 degrees now at the top of the lower mantle. This is close to today’s temperature estimates for those regions.
Here is the more technical explanation of Radiation Density and heat production taken from the above link Radiometric dating and the ZPE: “Radiation energy densities also influence the heat from radioactive decay processes. With lower ZPE, and higher decay rates, came a greater number of gamma and X-rays. But this was moderated by the lower vacuum permittivity and permeability so radiation intensities were the same as today. Now radioactive decay processes produce heat. This applies to gamma and X-radiation, which are electromagnetic in character, as well as alpha and beta radiation, which are sub-atomic particles. Gamma and X-radiation can often accompany alpha and beta emission also. There is a reason why heat is produced by all these forms of radioactive decay.
The high energy forms of radiation ionize and/or excite the atoms of the substances through which they pass. The ionization process strips an electron(s) off those atoms with which the alpha, beta or gamma radiation interacts, while the excitation process shifts an electron(s) to a higher orbit in the host atom. In the case of excited atoms, the electrons return to lower orbits, emitting low energy photons which appear as heat, until the atom is again in its ground state. In the case of ionization, where the electrons are stripped off, these electrons cause the secondary excitation of atoms with which they interact. The process continues until all the kinetic energy originally imparted to these electrons is used up by the excitation process. As the excited atoms return to their ground state, they emit low energy photons that again result in heat. Gamma radiation produces ionization and the excitation of atoms over a relatively large distance, while alpha and beta particles only produce significant results over a short distance.
It can therefore be seen that the majority of heat from radioactive decay is generated by the lower energy photons. Since they are electromagnetic in character, they are subject to the moderating effects of the permittivity and permeability of space as shown by (76) and (77). Thus, even though a given radioactive source emitted more electromagnetic waves per unit time of both high and low energy, their effects would have been the same as the fewer number of waves emitted by that same source today. Thus heat production was similarly moderated.”
Here is another quote, taken from a presentation given in Germany by Rinus Kiel, who has done some research on the Setterfield cosmology already referred to above: “Radioactivity and other ‘constants’
“Several other ‘constants’ of nature vary in rhythm with the strength of the ZPF.
Radioactive decay rate
The radioactive decay rate also depends on the strength of the ZPF, and thus follows the light speed curve, meaning that decay rates were faster when the light speed was higher.
The RATE project
A few words about the results of the RATE project (the radiometric dating research program of ICR):
Conclusion: It seems that there is solid scientific data to support the ZPE cosmology, and it solves some sticky problems like the heat problem in accelerated decay, as well as the undeniable problem of the quantized redshifts, the data from which can be used to draw the light speed decay curve. This curve also fits the curve for the decline in the measured light speed. Setterfield’s theory, along with plasma theory, does away with the need for dark matter and dark energy.
The RATE project mentioned above produced some great results supporting accelerated radiometric decay, but it is dogged by that heat problem. The cosmologies based on time dilation to explain the light travel time problem work in theory, but they don’t have as much hard data supporting them as Setterfield’s theory does. This theory also provides, along with plasma theory, an explanation for the formation of galaxies, which has been a great difficulty for conventional big bang theories. It is an outgrowth of the theories of well-known scientists such as Planck, de Broglie, and others. The fact that Setterfield’s cosmology solves all of these problems simultaneously signals to me that it, or some explanation like it, can be of invaluable use to the rest of the creationist world in providing a coherent theory for a young earth and universe that has experimental backing.
It’s time all of the creationist organizations started working together on this!
For discussions and Setterfield’s answers to challenges to his model, see his website www.setterfield.org, under http://www.setterfield.org/GSRdiscussion.html and http://www.setterfield.org/GSRcritics.html
For a comprehensive reference on Barry Setterfield’s work, see his book Cosmology and the Zero Point Energy, Natural Philosophy Alliance Monograph Series, No. 1, 2013.
{1} Steinhardt, Paul J., The Inflation Debate: Is the Theory at the Heart of Modern Cosmology Deeply Flawed?, Scientific American, April 2011, pp. 37-43. Also see articles:
Light-travel time: a problem for the big bang
New study confirms BICEP2 detection of cosmic inflation wrong
{2} Humphreys, D. Russell, Starlight and Time: Solving the Puzzle of Distant Starlight in a Young Universe, Master Books, Colorado Springs, CO, 1994.
{3} Hawking, Stephen W., and Ellis, G.F.R., The Large Scale Structure of Space-Time, Cambridge University Press, Cambridge, 1973, p.134.
{4} Humphreys, D. Russell, New Time Dilation Helps Creation Cosmology, Journal of Creation 22(3):121-127, December 2008.
{5} Hartnett, John, Starlight, Time, and the New Physics: How We Can See Starlight in Our Young Universe, Creation Book Publishers, Atlanta, Georgia, 2007.
{6} Hartnett, John, A 5D Spherically Symmetric Expanding Universe is Young, Journal of Creation 21(1):69-74, April 2007.
{7} See ref. {5}, Starlight, Time, and the New Physics, pp. 64-67, which shows that the Carmelian model agrees with observed matter densities whereas the conventional model doesn’t. Also see links: What are type 1a Supernovae telling us?
{8} See ref. {5}, pp. 42-48. Also see the article in ref. {6}, A 5D Spherically Symmetric Expanding Universe is Young.
{9} Starlight, Time, and the New Physics, p. 64.
{10} Gibbs, W.W., Profile: George F.R. Ellis; Thinking Globally, Acting Universally, Scientific American 273(4):28-29, 1995.
{11} Lisle, Jason, Distant Starlight: The Anisotropic Synchrony Convention, Answers Magazine, Jan.-Mar. 2011, pp. 68-71.
{12} Lisle, Jason, Anisotropic Synchrony Convention: A Solution to the Distant Starlight Problem, Answers Research Journal 3:191-207, 2010.
{13} Radioisotopes and the Age of the Earth, L. Vardiman, A. Snelling, and E. Chaffin, editors, Institute for Creation Research, El Cajon, CA, and Creation Research Society, St. Joseph, Missouri, 2000, pp. 369-375.
{14} Setterfield, Barry, Exploring the Vacuum, Journal of Theoretics, 2002.
{15} Kiel, Rinus, The Cosmology of Barry Setterfield, 2008.
{16} Dolphin, Lambert, and Montgomery, Alan, Is the Velocity of Light Constant in Time?, 1993.
{17} Montgomery, Alan, A Determination and Analysis of Appropriate Values of the Speed of Light to Test the Setterfield Hypothesis, 1995.
{18} Setterfield, Barry, A New Look at Relativity and the Zero Point Energy, Jan. 18, 2010.