Monthly Archives: November 2009

Special relativity 1 : Quantum gravity 0

One of the key postulates of Einstein’s theory of special relativity is that the speed of light in a vacuum is constant. This means that whatever the energy of the photons making up the ray of light, its speed is always the same. It’s called Lorentz invariance, and it’s been on trial once again, courtesy of the Fermi Large Area Telescope (LAT) Collaboration.

In a paper published in Nature last month, Abdo and colleagues analysed the light coming from a distant and fleeting gamma-ray burst to try to pick up any variation in the speed of its photons – and found none, at least down to a limit.

Gamma-ray bursts (GRBs) are thought to accompany the collapse of massive stars or the merger of compact objects, and they are the brightest events in the universe – despite the fact that most of them are billions of light years away from Earth. The radiation emitted during a GRB is extremely intense, typically releasing as much energy in a few seconds as the Sun will in its entire lifetime. GRBs are good candidates for measuring a variation in light speed because of the cosmological distances their light has to travel to reach us: even a tiny difference in photon speed accumulates over that journey into an arrival-time delay large enough to show up in the sharp features of the burst's light curve.

Artist's illustration of a gamma-ray burst (GRB 080319B)

Researchers at the Fermi LAT Collaboration were alerted to a particularly interesting gamma-ray burst after it was picked up by both the Large Area Telescope and the Gamma-ray Burst Monitor aboard the Fermi Gamma-ray Space Telescope. The telescope, currently in low Earth orbit, is a joint project between NASA, the US Department of Energy and government agencies in France, Germany, Italy, Japan and Sweden. A photon with an energy of 31 GeV, emitted less than a second after the start of the burst, was singled out and used to set a limit on any variation in the speed of light.

In special relativity no such limit should exist, because there is no length scale at which Lorentz invariance breaks down. However, some theoretical physicists think that at very small lengths it could in fact be violated. In order to formulate a “theory of everything”, they are attempting to reconcile gravitational effects with quantum mechanics and create a theory of quantum gravity. According to some of these theories, at the Planck scale (lengths of approximately 1.62 × 10⁻³³ cm) quantum effects should alter the nature of space-time itself, making the speed of light depend on a photon's energy.
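In the simplest such scenarios the effect is linear: a photon of energy E arrives early or late by roughly Δt ≈ (E/E_QG) × (D/c), where E_QG is the energy scale at which quantum gravity kicks in and D is the distance travelled. A rough back-of-envelope sketch in Python shows why a single 31 GeV photon is so constraining; the light-travel time here is an assumed illustrative value, not the burst's measured distance, and the quantum-gravity scale is simply taken to be the Planck energy:

```python
# Back-of-envelope estimate of the arrival-time lag a 31 GeV photon would
# pick up over cosmological distances if photon speed varied linearly with
# energy: delta_t ~ (E / E_QG) * (D / c).
# Illustrative numbers only -- the published analysis uses the burst's
# measured redshift and a proper cosmological distance.

E_photon_GeV = 31.0           # energy of the flagged photon
E_planck_GeV = 1.22e19        # Planck energy, ~1.22 x 10^19 GeV
E_qg_GeV = E_planck_GeV       # assume the violation sets in at the Planck scale

YEAR_S = 3.156e7              # seconds in a year
D_over_c_s = 7e9 * YEAR_S     # assumed ~7 billion years of light travel time

delta_t = (E_photon_GeV / E_qg_GeV) * D_over_c_s
print(f"expected lag: {delta_t:.2f} s")   # roughly half a second
```

Because the photon actually arrived less than a second after the burst began, a lag of this size would have been visible in the data – which is how one photon can probe physics at the Planck scale.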

In the research conducted by the Fermi LAT Collaboration, however, two independent methods showed Lorentz invariance to hold true down to the Planck length divided by 1.2 – that is, to scales slightly beyond the Planck scale itself. This is a blow to those quantum gravity theories that require the fabric of space-time to be altered at such small scales.

While this may be bad news for some modern-day physicists, it’s good news for Einstein – more than a century on, his theory of special relativity still stands.

Reference: Nature doi:10.1038/nature08574

Image Credit: NASA/Swift/Mary Pat Hrybyk-Keith and John Jones


Leave a comment

Filed under Felix Articles, Physics

Geoengineering – Just a load of smoke and mirrors?

Geoengineering is seen by some as a quick and easy way out of global warming, and by others as dangerous, unpredictable and just another excuse not to cut carbon emissions. But what actually is it? There is no generally accepted definition, but anything that can be classed as a large-scale scheme to manipulate the effects of global warming would probably fit the bill. There are a variety of ways this manipulation can be done, but most methods fall into two categories: solar radiation management and greenhouse gas remediation. In other words, reducing the amount of sunlight reaching the Earth, or soaking up greenhouse gases from the atmosphere.

The first category includes techniques intended to increase the albedo (or reflectivity) of the Earth, for example sending billions of aluminised reflective balloons into the atmosphere, or positioning a giant mirror in space that would act as a sunshade between the Sun and the Earth. One of the most noteworthy of these proposals aims to inject millions of tonnes of sulphate aerosols into the stratosphere, where they would act as condensation nuclei and influence the microphysical and optical properties of clouds. This is the idea of Dutch Professor Paul Crutzen, who shared the 1995 Nobel Prize in Chemistry for work on the chemistry of the ozone layer. The 1991 eruption of Mount Pinatubo has already shown that this method would succeed in reducing the average global temperature: afterwards, sunlight reaching the Earth was reduced by 10% and global temperatures decreased by 0.5°C, staying at their new level for around three years. However, ozone destruction also increased substantially, and average worldwide precipitation dropped significantly in the 16 months following the eruption.
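The link between albedo and temperature can be illustrated with a textbook zero-dimensional energy-balance model, in which absorbed sunlight balances emitted thermal radiation: σT⁴ = S(1 − α)/4. A minimal sketch in Python, using standard values for the solar constant and present-day albedo; the one-percentage-point albedo increase is an arbitrary illustrative perturbation, not any specific proposal's figure:

```python
# Toy zero-dimensional energy-balance model: absorbed sunlight equals
# emitted thermal radiation, sigma * T^4 = S * (1 - albedo) / 4.
# Shows how a small increase in planetary albedo lowers the
# equilibrium effective temperature.

SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0         # solar constant, W m^-2

def effective_temp(albedo: float) -> float:
    """Equilibrium effective temperature (K) for a given planetary albedo."""
    return (S * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

t_now = effective_temp(0.30)   # present-day albedo ~0.30 gives ~255 K
t_geo = effective_temp(0.31)   # one percentage point more reflective
print(f"cooling: {t_now - t_geo:.2f} K")   # roughly 0.9 K
```

This toy model ignores the greenhouse effect, so it gives the effective radiating temperature (~255 K) rather than the surface temperature, but the direction and rough magnitude of the change carry over: brightening the planet by a single percentage point is worth the better part of a degree.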

The second major type of geoengineering method involves enhancing natural “carbon sinks”. Fertilising the oceans with iron to stimulate phytoplankton growth falls into this category, as does reforestation. These processes aim to attack the rising levels of greenhouse gases, and hopefully lock them away for years to come.

There are plenty of advantages to geoengineering. For instance, many of the methods that decrease the amount of sunlight hitting the Earth would be very cheap to implement – by one estimate, global temperatures could be driven down to ice-age levels for around 0.01% of the USA’s gross domestic product. Also, the changes resulting from geoengineering would appear much more quickly than any changes arising from cutting emissions, suggesting that it may be a good way to buy the Earth some time while we wait to see the results of emissions cuts.

Criticisms are also easy to come up with. It has been argued that these techniques will only put off the inevitable, and in doing so risk making politicians and others complacent about emissions cuts. What’s more, little detailed research has been done into some of these methods, so there may be unintended consequences we know nothing about at present. For example, reducing solar input could lower crop yields, which could cause famine.

In an ideal world, to combat global warming we would cut emissions by the 60–80% needed to stabilise the concentration of CO2 in the atmosphere. However, from 2001 to 2002 emissions actually increased by 2%, and this trend doesn’t look set to change much in the near future. Soon, some geoengineering methods may become necessary to stave off drastic warming before it’s too late.

Leave a comment

Filed under Felix Articles