Category Archives: Felix Articles

A galaxy far, far away… (take two)

Last November I wrote about the most distant galaxy ever seen. Since then, one even further away has been found. This post is (slightly) adapted from an article I wrote for Felix about this new galaxy.

The newly discovered galaxy is circled in the top left hand corner. Credit: NASA/ESA

There haven’t always been stars and galaxies in the universe, and the era when the first of them formed and began to reionise the neutral gas filling space, known as the epoch of reionisation, is the subject of much interest in astrophysics. A paper published in Nature details a discovery that could tell us more about this mysterious time.

In the paper, Rychard Bouwens and colleagues say they have detected the most distant galaxy ever seen; the light from the galaxy was emitted only 500 million years after the Big Bang. This age puts it well within the epoch of reionisation.

The galaxy has the highest redshift ever observed; it was from this that the team were able to work out how long ago its light set off towards us.

Cosmological redshift is a measure of how fast an object is moving away from the Earth and is a consequence of the expanding Universe. Objects with higher redshifts are receding faster than those with lower redshifts, which means they are further away and that we see them as they were further back in time. To put it simply, the higher the redshift, the further back in time we are looking.
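
For readers who want the numbers behind that statement, redshift is defined by how much a light wave is stretched between emission and observation. The relation below is standard cosmology, and the z ≈ 10 figure comes from the paper's title; the factor of eleven is just the arithmetic that follows.

```latex
% Redshift relates the observed wavelength to the emitted one:
\[
  1 + z \;=\; \frac{\lambda_{\mathrm{obs}}}{\lambda_{\mathrm{emit}}}
\]
% For this galaxy z \approx 10, so every wavelength arrives stretched by a
% factor of about 11: light emitted in the ultraviolet reaches us in the infrared.
\[
  z \approx 10 \quad\Longrightarrow\quad \lambda_{\mathrm{obs}} \approx 11\,\lambda_{\mathrm{emit}}
\]
```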

Bouwens and colleagues used something known as the Lyman-break technique to identify the galaxy. The technique relies on a sharp drop in a galaxy’s spectrum caused by the absorption of energetic photons by the neutral gas that surrounds galaxy-forming regions. A candidate found this way can then be checked against images at shorter wavelengths, where a genuinely distant galaxy should all but disappear.
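
To make the idea concrete, here is a toy sketch in Python of how a "dropout" selection might look. The band names, flux values and detection thresholds are invented for illustration and are not taken from Bouwens et al.; the point is simply that a genuine candidate is clearly detected redward of the break and consistent with nothing blueward of it.

```python
def is_dropout_candidate(flux_blue, flux_blue_err, flux_red, flux_red_err,
                         nondetection_sigma=2.0, detection_sigma=5.0):
    """Toy Lyman-break ('dropout') selection.

    A candidate must be clearly detected in the band redward of the expected
    break and consistent with zero flux blueward of it. The two-band
    simplification and the thresholds are illustrative only.
    """
    undetected_blueward = flux_blue < nondetection_sigma * flux_blue_err
    detected_redward = flux_red > detection_sigma * flux_red_err
    return undetected_blueward and detected_redward

# Example: strong signal in the red band, nothing significant in the blue band.
print(is_dropout_candidate(flux_blue=0.5, flux_blue_err=1.0,
                           flux_red=40.0, flux_red_err=2.0))  # True
```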

For a long time the observations required to study the reionisation epoch were out of reach, but recent images from Hubble are making the detection and study of far away galaxies possible for the first time. The new galaxy was discovered in images taken by Hubble’s Wide Field Camera 3.

Bouwens and colleagues also looked at the rate of star formation in the period just after the epoch at which we see the newly discovered galaxy. They found that in just 200 million years the rate of star formation increased tenfold. This places the newly discovered galaxy right in the heart of the reionisation epoch, and sheds new light on how the stars and galaxies we see today formed.

It seems that galaxies around at the same time as this new discovery may not have been able to fully reionise the universe on their own. That leaves a mystery: how did the universe go from being filled with neutral gas to an ionised one, in which electrons and protons have been stripped apart? However, the existence of galaxies at this time does suggest that the first stars formed around 100 million years beforehand, roughly 400 million years after the Big Bang.

“We’re seeing huge changes in the rate of star birth that tell us that if we go a little further back in time we’re going to see even more dramatic changes,” said Garth Illingworth, a co-author of the paper from the University of California at Santa Cruz. “We’re moving into a regime where there are big changes afoot. Another couple of hundred million years back towards the Big Bang, and that will be the time when the first galaxies really are starting to build up.”

References
Bouwens, R., Illingworth, G., Labbe, I., Oesch, P., Trenti, M., Carollo, C., van Dokkum, P., Franx, M., Stiavelli, M., González, V., Magee, D., & Bradley, L. (2011). A candidate redshift z ≈ 10 galaxy and rapid changes in that population at an age of 500 Myr. Nature, 469(7331), 504–507. DOI: 10.1038/nature09717


Filed under Felix Articles, Physics

Huge leap forward in quantum computing

Originally published on 19/11/10 in felix, the student newspaper of Imperial College London.

Do you think you could make sense of this sentence if every fourth word was missing? How about trying to hold a conversation when you can only hear three quarters of what the other person is saying? Cutting out a fraction of the information being transferred in a given situation may make life slightly difficult, but it certainly doesn’t stop the meaning being conveyed in most cases. This is because of the redundancy built into language. However, redundancy is not only useful for conversations on a dodgy phone line – it can also come in handy in the world of quantum computing, as two researchers explained in a paper published in Physical Review Letters last week.

"Cosmic internet"

I've no idea what this picture is supposed to represent, but you have to admit it does look pretty.

The research was carried out by Sean Barrett, of Imperial College, and Thomas Stace, at the University of Queensland in Brisbane, Australia. They found that if a quarter of the qubits (the quantum equivalent of bits, which store information in a classical computer) are lost, the computer can still function as normal. Barrett and Stace looked at the remaining information and used a code that could check for errors to decipher what was missing. “It’s surprising, because you wouldn’t expect that if you lost a quarter of the beads from an abacus that it would still be useful,” said Dr Barrett.
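
As a purely classical analogy for surviving 25% loss (and emphatically not the quantum error-correcting code Barrett and Stace analysed), consider storing three blocks of data together with one parity block: any single missing block out of the four can be rebuilt from the other three. A minimal Python sketch:

```python
def add_parity(blocks):
    """Append a parity block (bitwise XOR of the data blocks)."""
    parity = 0
    for b in blocks:
        parity ^= b
    return blocks + [parity]

def recover(blocks_with_one_missing):
    """Rebuild a single missing block (marked as None) by XOR-ing the rest."""
    missing_index = blocks_with_one_missing.index(None)
    value = 0
    for b in blocks_with_one_missing:
        if b is not None:
            value ^= b
    restored = list(blocks_with_one_missing)
    restored[missing_index] = value
    return restored

data = [0b1011, 0b0110, 0b1110]                     # three data blocks
stored = add_parity(data)                           # four blocks in total
damaged = [stored[0], None, stored[2], stored[3]]   # one of four (25%) lost
print(recover(damaged)[:3] == data)                 # True: the data survives the loss
```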

One of the main differences between a classical bit and its quantum equivalent is that the latter can exhibit entanglement. This means that, no matter how far apart two entangled qubits are, their states remain linked: measure one and you instantly know what a measurement of the other will give. Quantum computers take advantage of this effect, as well as another property of quantum systems known as superposition, to perform certain calculations much faster than classical computers. At the moment, though, the largest quantum computers have only two or three qubits.
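
For the curious, here is a toy illustration of what entanglement looks like on paper: a two-qubit Bell state written as a vector of amplitudes in NumPy. This is a generic textbook example, not tied to any particular hardware or to the Barrett and Stace paper.

```python
import numpy as np

# Two-qubit basis order: |00>, |01>, |10>, |11>
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # (|00> + |11>) / sqrt(2)

# Probability of each measurement outcome is the squared amplitude.
probabilities = (np.abs(bell) ** 2).round(2).tolist()
print(dict(zip(["00", "01", "10", "11"], probabilities)))
# {'00': 0.5, '01': 0.0, '10': 0.0, '11': 0.5}
# Each qubit on its own is completely random, yet the two outcomes always match:
# that correlation is the entanglement quantum computers exploit.
```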

It had previously been thought that large quantum computers would be very sensitive to missing information, but this research shows that they should be much more robust than we’d imagined. At this stage, the work is theoretical and scientists must do a lot more in order to make quantum computers bigger than a few qubits in the lab.

When large quantum computers are a reality, they may have the potential to revolutionise fields as far apart as drug modelling, electronics and code breaking. However, we won’t know exactly what applications quantum computers will be best suited to until we’re able to make one.

“At the moment quantum computers are good at particular tasks, but we have no idea what these systems could be used for in the future,” said Dr Barrett. “They may not necessarily be better for everything, but we just don’t know. They may be better for very specific things that we find impossible now.”

Reference:
Barrett, S. D., & Stace, T. M. (2010). Fault tolerant quantum computation with very high threshold for loss errors. Phys. Rev. Lett., 105, 200502. arXiv: 1005.2456v2


Filed under Felix Articles, Physics

Physicists catch glimpse of dark matter

In early December last year, the particle physics community was abuzz with rumours of a discovery made by the Cryogenic Dark Matter Search (CDMS-II) collaboration in the US. Announcement talks were scheduled for 18th December, all seminars before that date were cancelled, and there were rumours of a paper submitted by the collaboration to Nature (rumours that a senior editor at Nature denied shortly after they began to spread). The CDMS had, it was said, found evidence of the existence of dark matter.

It would have been the last major physics breakthrough of the noughties – but sadly, it was not entirely true. The CDMS were announcing their latest results, but those results did not show irrefutable evidence of dark matter’s existence.

Nevertheless, that does not mean that what they discovered isn’t important. In a paper submitted to arXiv on the same day as their announcement talks, the CDMS team reported two events with the characteristics expected of a particular class of dark matter candidates: weakly interacting massive particles, or WIMPs.

WIMPs are likely to have masses similar to those of atomic nuclei. Despite never having been seen, they are considered one of the main candidates for dark matter, with the appropriately named massive compact halo objects (MACHOs) as their main rival. Because WIMPs interact only through the weak nuclear force and gravity, they rarely interact with normal matter, making them very difficult to spot.

The CDMS experiment is located half a mile underground in the disused Soudan mine in Minnesota, and uses germanium and silicon detectors cooled to almost absolute zero. At such low temperatures, a WIMP passing through one of the crystals deposits a tiny amount of energy, producing both heat and a charge signal that drifts in an applied electric field and is recorded by a computer in the lab. Once the data has been collected, it is analysed to distinguish background events from the interesting ones.

What the CDMS found was two interesting events showing the characteristics expected of WIMPs. However, there is still a 23% chance that these events were merely due to background particles such as cosmic rays or radioactive decay. For a discovery beyond reasonable doubt, meaning less than a one in a thousand chance that the events were produced by background alone, around five events would have been needed in the data rather than two. Despite not providing absolute proof, the results will help to set new upper limits that may rule out some of the theories currently proposed to explain the dark matter problem.
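
As a back-of-the-envelope check, the chance of the background alone faking a given number of events can be estimated with Poisson statistics. The expected background of roughly 0.9 events used below is an assumption chosen for illustration because it reproduces the quoted 23%; it is not a figure stated in this article.

```python
from math import exp, factorial

def prob_at_least(n, mu):
    """Probability that a Poisson background with mean mu produces n or more events."""
    return 1.0 - sum(exp(-mu) * mu**k / factorial(k) for k in range(n))

mu_background = 0.9  # assumed expected number of background events (illustrative)

print(round(prob_at_least(2, mu_background), 2))  # ~0.23: two events are not that unlikely
print(round(prob_at_least(5, mu_background), 4))  # ~0.002: five events would be far harder to explain away
```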

The CDMS collaboration are now increasing the sensitivity of their experiment by trebling the number of detectors. The upgrade should be finished by summer 2010; after that, all they can do is sit back and wait patiently for some WIMPs to come along.


Filed under Felix Articles, Physics

First results from the LHC

The LHC has been subject to much media coverage and criticism since it was started up last September only to break down little more than a week later, but the first results from the record-breaking particle accelerator are finally in. Admittedly, nothing earth-shattering has been discovered yet, but the turnaround speed of the paper alone must be a record.

The first collisions took place on Monday 23rd November, and since then the twin proton beams have reached an energy of 1.18 TeV each, smashing the previous record of 0.98 TeV. However, researchers working on ALICE, one of the six experiments at the LHC, took data from some of the very first collisions, when the protons were circulating at only 450 GeV per beam. They had their paper accepted by the European Physical Journal C just 8 days after the collisions took place. Presumably, most of the paper was written before the LHC was even fired up, with gaps left to fill in the results when they came in.

Before being accepted by the European Physical Journal C, the paper was posted online at arXiv.org, an open access repository for scientific papers not yet published elsewhere. Anyone interested can read it online, but be warned: you’ll have to skip to page 6 just to reach the abstract, thanks to the list of authors (over two pages’ worth) and participating institutes (more than 100 in total).

In the paper, Aamodt and colleagues describe how some 284 events recorded in the first collisions were used to measure the pseudorapidity density of the charged particles produced. Pseudorapidity is a quantity particle physicists use to describe the angle at which a particle emerges relative to the beam axis. This may not sound very exciting, but the ALICE collaboration are pleased because the results agree with theory and with previous experiments, meaning that the LHC is working well and should provide high quality data when they get to the really interesting stuff.
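
For anyone who wants the definition, pseudorapidity is a standard function of the polar angle θ measured from the beam axis; this is the textbook formula rather than anything specific to the ALICE analysis.

```latex
% Pseudorapidity in terms of the polar angle \theta from the beam axis:
\[
  \eta \;=\; -\ln\!\left[\tan\!\left(\tfrac{\theta}{2}\right)\right]
\]
% \eta = 0 corresponds to a particle emitted at right angles to the beams;
% large |\eta| means the particle travels nearly along the beam direction.
```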

The first super-high-energy collisions at the LHC are on track to start in early 2010, and will reach energies of 3.5 TeV per beam.

Reference: arXiv:0911.5430v2


Filed under Felix Articles, Physics

Special relativity 1 : Quantum gravity 0

One of the key postulates of Einstein’s theory of special relativity is that the speed of light in a vacuum is constant. This means that whatever the energy of the photons making up the ray of light, its speed is always the same. It’s called Lorentz invariance, and it’s been on trial once again, courtesy of the Fermi Large Area Telescope (LAT) Collaboration.

In a paper published in Nature last month, Abdo and colleagues analysed the light coming from a distant and fleeting gamma-ray burst to try to pick up any variation in the speed of its photons – and found none, at least down to a certain limit.

Gamma-ray bursts (GRBs) are believed to be released during supernova explosions or the mergers of compact stars, and are the brightest events occurring in the universe – despite the fact that most of them are billions of light years away from Earth. The radiation emitted during a GRB is extremely intense, typically releasing as much energy in a few seconds as the Sun will in its entire lifetime. GRBs are good candidates for measuring a variation in light speed because of the cosmological distances the light has to travel to reach us – even tiny differences in photon speed accumulate enough to show up in the sharp features of the burst’s light curve.

Artist's illustration of a gamma-ray burst (GRB 080319B)

Researchers at the Fermi LAT Collaboration were alerted to a particularly interesting gamma-ray burst after it was picked up by both the Large Area Telescope and the Gamma-ray Burst Monitor, which are aboard the Fermi Gamma-ray Space Telescope. This telescope is a joint project between NASA, the US Department of Energy and government agencies in France, Germany, Italy, Japan and Sweden, and is currently in low Earth orbit. A photon, with an energy of 31 GeV, emitted less than a second after the start of the burst was singled out and used to find a limit for the variation of the speed of light.

In special relativity this limit should not exist, as there is no length scale at which Lorentz invariance breaks down. However, some theoretical physicists think that at very small lengths it could in fact be violated. In their attempts to formulate a “theory of everything”, they are trying to reconcile gravitational effects with quantum mechanics and create a theory of quantum gravity. According to some of these theories, at the Planck scale (lengths of approximately 1.62 × 10⁻³³ cm) quantum mechanics should interact with gravity, influencing the nature of space-time and so changing the speed of light.
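
To see roughly why a single energetic photon can set such a limit: in the simplest (linear) parametrisation used in these tests, which is a generic form rather than necessarily the exact expression used by Abdo and colleagues, the extra travel time grows with the photon energy and the distance travelled.

```latex
% Assume the photon speed picks up a small, linear energy dependence:
%   v(E) \approx c \left( 1 - E / E_{\mathrm{QG}} \right)
% Then over a distance D the high-energy photon lags behind by roughly
\[
  \Delta t \;\approx\; \frac{E}{E_{\mathrm{QG}}} \, \frac{D}{c}
\]
% A 31 GeV photon arriving within a second of the burst onset, after travelling
% for billions of years, therefore pushes E_{\mathrm{QG}} up to and beyond the
% Planck energy.
```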

In the research conducted by the Fermi LAT Collaboration, however, Lorentz invariance was found by two independent methods to hold true down to the Planck length divided by 1.2. This is a blow to some quantum gravity theories that require the fabric of space-time to be altered on small scales.

While this may be bad news for some modern day physicists, it’s good news for Einstein – after over 100 years, his theory of special relativity still stands.

Reference: Nature doi:10.1038/nature08574

Image Credit: NASA/Swift/Mary Pat Hrybyk-Keith and John Jones


Filed under Felix Articles, Physics

Geoengineering – Just a load of smoke and mirrors?

Geoengineering is seen by some as a quick and easy way out of global warming, and by others as dangerous, unpredictable and just another excuse not to cut carbon emissions. But what actually is it? There is no generally accepted definition, but anything that can be classed as a large-scale scheme to counteract the effects of global warming would probably fit the bill. There are a variety of ways this can be done, but most methods fall into two categories: solar radiation management and greenhouse gas remediation. In other words, reducing the amount of sunlight reaching the Earth, or soaking up greenhouse gases from the atmosphere.

The first category includes techniques intended to increase the albedo (or reflectivity) of the Earth, for example sending billions of aluminised reflective balloons into the atmosphere, or positioning a giant mirror in space to act as a sunshade between the Sun and the Earth. One of the most noteworthy proposals of this type aims to inject up to two million metric tons of sulphate aerosol into the atmosphere, where it would act as condensation nuclei and influence the micro-physical and optical properties of clouds. This is the idea of the Dutch professor Paul Crutzen, who shared the Nobel Prize in Chemistry in 1995 for his work on the chemistry of the ozone layer.

Mount Pinatubo has already shown that this method could reduce the average global temperature: after its eruption in 1991, sunlight reaching the Earth was reduced by 10% and global temperatures dropped by 0.5°C, staying at their new level for around 3 years. However, ozone destruction also increased substantially, and average worldwide precipitation dropped significantly in the 16 months following the eruption.

The second major type of geoengineering method involves enhancing natural “carbon sinks”. Fertilising the oceans with iron to stimulate phytoplankton growth falls into this category, as does reforestation. These processes aim to attack the rising levels of greenhouse gases, and hopefully lock them away for years to come.

There are plenty of advantages to geoengineering. For instance, many of the methods that reduce the amount of sunlight hitting the Earth would be very cheap to implement: by some estimates, global temperatures could be lowered enough to trigger a new ice age for around 0.01% of the USA’s gross domestic product. The changes brought about by geoengineering would also appear much more quickly than any resulting from cutting emissions, suggesting it could be a way to buy the Earth some time while we wait to see the results of emissions cuts.

Criticisms are also easy to come up with. It has been argued that these techniques would only put off the inevitable, and in doing so risk making politicians and others complacent about cutting emissions. What’s more, little detailed research has been done into some of these methods, so there may be unintended consequences we know nothing about at present. For example, reducing the solar input could lower crop yields and lead to famine.

In an ideal world, to combat global warming we would cut emissions by the 60-80% needed to stabilise the concentration of CO2 in the atmosphere. However, from 2001 to 2002 emissions actually increased by 2%, and that trend doesn’t look set to change much in the near future. Before long, some geoengineering methods may become necessary to stave off drastic warming before it’s too late.


Filed under Felix Articles