
Did the universe just get 15 percent larger?

Astronomers question the accuracy of a key figure used to calculate the distance of faraway galaxies.


Trust but verify. The strategy Ronald Reagan applied to nuclear stockpile measurements during the Cold War is now being applied by astronomers.

When measuring the size and age of the universe, astronomers have relied on the "Hubble constant," named after the late American astronomer Edwin Hubble. The constant relates a galaxy's distance to the speed at which it's moving away from us.

Astronomers determine a galaxy's speed by looking at its light. The redder the light, the faster the speed. Divide that speed by the Hubble constant, and you get the galaxy's distance from Earth.
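That division is Hubble's law in its simplest form, d = v / H0. A minimal sketch of the calculation (the values below are hypothetical illustrations, not figures from the article):

```python
# Hubble's law: distance = recession velocity / Hubble constant.
# H0 below is a commonly cited ballpark value, used here only for illustration.

H0 = 72.0  # Hubble constant, in km/s per megaparsec

def distance_mpc(recession_velocity_km_s: float) -> float:
    """Estimate a galaxy's distance in megaparsecs from its recession velocity."""
    return recession_velocity_km_s / H0

# A galaxy receding at 7,200 km/s would lie about 100 megaparsecs away.
print(distance_mpc(7200.0))  # 100.0
```

Note the stakes: if the true Hubble constant turns out to be roughly 15 percent smaller than assumed, every distance computed this way grows by the same proportion.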

As Max Bonamente at NASA's Marshall Space Flight Center in Huntsville, Ala., has explained, "astronomers absolutely need to trust this number because we use it for countless calculations."

Attempts to verify that number, however, don't yet support that level of trust.

This high-stakes need for verification has energized a painstaking effort to refine the Hubble constant. Fritz Benedict and Barbara McArthur at the University of Texas at Austin led the international team whose refined measurement appears in this month's Astronomical Journal.

To verify the Hubble constant, astronomers compare the distance they get using the constant against the distance they get from an independent measurement based on Cepheid variable stars. A Cepheid's light varies with time, and the period of that variation tracks the star's inherent brightness. Once astronomers know this inherent brightness, or intrinsic luminosity, they can estimate the distance to the Cepheid by noting how dim it appears from Earth.
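The dimming follows the inverse-square law: the observed flux falls off as the square of the distance, so F = L / (4&#960;d&#178;) can be inverted to solve for d. A simple sketch of that step, with illustrative units (this is the general physics, not the team's actual pipeline):

```python
import math

def distance_from_flux(luminosity_watts: float, flux_w_per_m2: float) -> float:
    """Distance (meters) to a star of known intrinsic luminosity,
    given its measured flux, via the inverse-square law F = L / (4*pi*d^2)."""
    return math.sqrt(luminosity_watts / (4.0 * math.pi * flux_w_per_m2))
```

In practice astronomers work in magnitudes rather than raw fluxes, but the principle is the same: a known luminosity plus an apparent brightness yields a distance.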


