

Hubble's Law

During the 1920s, Edwin Hubble discovered that the Universe is expanding, with galaxies moving away from each other at a velocity given by an expression known as Hubble's Law: V = H x D. Here V represents the galaxy's recessional velocity, D is its distance from Earth, and H is a constant of proportionality called the Hubble constant.

The exact value of the Hubble constant is somewhat uncertain, but it is generally believed to be about 75 kilometers per second for every megaparsec in distance, km/sec/Mpc. (A megaparsec is given by 1 Mpc = 3 x 10^6 light-years.) This means that a galaxy 1 megaparsec away will be moving away from us at a speed of 75 km/sec, while a galaxy 100 megaparsecs away will be receding 100 times faster, at 7,500 km/sec. So essentially, the Hubble constant sets the rate at which the Universe is expanding.
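
The relationship is simple enough to check numerically. The short Python sketch below simply evaluates V = H x D for a couple of distances; the function name is illustrative, and the value H = 75 km/sec/Mpc is the one quoted above:

    H = 75.0  # Hubble constant in km/sec per megaparsec, the value quoted above

    def recession_velocity(distance_mpc):
        """Recessional velocity in km/sec of a galaxy distance_mpc megaparsecs away."""
        return H * distance_mpc

    print(recession_velocity(1))    # 75.0 km/sec for a galaxy 1 Mpc away
    print(recession_velocity(100))  # 7500.0 km/sec, 100 times faster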

The standard picture of cosmology explains how to visualize this expanding universe. As an example, consider a loaf of raisin bread, with raisins sprinkled evenly throughout it. As the bread expands while it bakes, all the raisins move further and further apart from each other. Seen from any raisin, all the other raisins in the bread appear to be receding with some velocity.

This model also explains the linearity of Hubble's Law, the fact that the recession velocity is proportional to distance. If all the lengths in the universe double in 10 million years, then something that was initially 1 megaparsec away from us will end up a further megaparsec away, while something that was 2 megaparsecs away will end up a further 2 megaparsecs away. In terms of the speed at which the objects appear to be receding from us, the object twice as distant recedes twice as fast!
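
The same proportionality can be seen in a short sketch. In the toy calculation below, the scale factor of 2 and the 10-million-year interval are just the numbers used in the paragraph above, and the variable names are illustrative. Every initial distance is doubled over the same time span, so the apparent recession speed comes out directly proportional to the starting distance:

    SCALE = 2.0      # all lengths in the universe double...
    TIME_MYR = 10.0  # ...over 10 million years, as in the example above

    for initial_mpc in (1.0, 2.0, 5.0):
        final_mpc = SCALE * initial_mpc
        speed = (final_mpc - initial_mpc) / TIME_MYR  # in Mpc per million years
        # speed grows in direct proportion to the initial distance
        print(initial_mpc, "Mpc ->", speed, "Mpc/Myr")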



Last Updated: 8/18/98