
Huge leap forward in quantum computing

Originally published on 19/11/10 in felix, the student newspaper of Imperial College London.

Do you think you could make sense of this sentence if every fourth word was missing? How about trying to hold a conversation when you can only hear three quarters of what the other person is saying? Cutting out a fraction of the information being transferred in a given situation may make life slightly difficult, but it certainly doesn’t stop the meaning being conveyed in most cases. This is because of the redundancy built into language. However, redundancy is not only useful for conversations on a dodgy phone line – it can also come in handy in the world of quantum computing, as two researchers explained in a paper published in Physical Review Letters last week.

"Cosmic internet"

I've no idea what this picture is supposed to represent, but you have to admit it does look pretty.

The research was carried out by Sean Barrett, of Imperial College, and Thomas Stace, at the University of Queensland in Brisbane, Australia. They found that if a quarter of the qubits (the quantum equivalent of bits, which store information in a classical computer) are lost, the computer can still function as normal. Barrett and Stace looked at the remaining information and used a code that could check for errors to decipher what was missing. “It’s surprising, because you wouldn’t expect that if you lost a quarter of the beads from an abacus that it would still be useful,” said Dr Barrett.
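Barrett and Stace's scheme relies on a sophisticated topological error-correcting code, but the basic idea that redundancy lets lost information be reconstructed can be shown with a much simpler classical analogy (this is an illustrative sketch, not their actual construction): a single parity bit is enough to recover one erased bit.

```python
# Classical analogy only: a single XOR parity bit lets you recover one
# lost ("erased") bit, because the erased value is the XOR of everything
# that survives. Barrett and Stace's quantum code is far more elaborate,
# but exploits redundancy in the same spirit.

def encode(bits):
    """Append an XOR parity bit to the data bits."""
    parity = 0
    for b in bits:
        parity ^= b
    return bits + [parity]

def recover(codeword, lost_index):
    """Reconstruct one erased position as the XOR of all surviving bits."""
    value = 0
    for i, b in enumerate(codeword):
        if i != lost_index:
            value ^= b
    return value

data = [1, 0, 1, 1]
codeword = encode(data)      # [1, 0, 1, 1, 1]
print(recover(codeword, 2))  # prints 1, the erased bit
```

The trade-off is the same as in the quantum case: you pay for robustness with extra storage, and the code can only tolerate a limited fraction of losses.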

One of the main differences between a classical bit and its quantum equivalent is that the latter can exhibit entanglement. This means that, no matter how far away two entangled qubits are, if one changes so will the other – instantaneously. Quantum computers take advantage of this effect, as well as another property of quantum systems known as superposition, to perform complicated calculations much faster than classical computers. At the moment, though, the largest quantum computers have only two or three qubits.
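Superposition and entanglement can be sketched numerically. The snippet below is a toy illustration (state vectors only, no real quantum hardware): it builds a single-qubit superposition and the two-qubit Bell state, whose measurement outcomes are perfectly correlated.

```python
import numpy as np

# Computational basis states |0> and |1> as vectors.
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# Superposition: one qubit in the state (|0> + |1>) / sqrt(2).
plus = (zero + one) / np.sqrt(2)

# Entanglement: the Bell state (|00> + |11>) / sqrt(2). It cannot be
# written as a product of two single-qubit states.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

# Measurement probabilities for outcomes 00, 01, 10, 11: the qubits
# always agree, giving 00 or 11 with probability 1/2 each.
probs = bell ** 2
print(probs)  # [0.5 0.  0.  0.5]
```

The point of the Bell state is the perfect correlation: whichever outcome one qubit gives, the other is guaranteed to match, which is the property the article describes.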

It had previously been thought that large quantum computers would be very sensitive to missing information, but this research shows that they should be much more robust than we’d imagined. At this stage, the work is theoretical and scientists must do a lot more in order to make quantum computers bigger than a few qubits in the lab.

When large quantum computers are a reality, they may have the potential to revolutionise fields as far apart as drug modelling, electronics and code breaking. However, we won’t know exactly what applications quantum computers will be best suited to until we’re able to make one.

“At the moment quantum computers are good at particular tasks, but we have no idea what these systems could be used for in the future,” said Dr Barrett. “They may not necessarily be better for everything, but we just don’t know. They may be better for very specific things that we find impossible now.”

Sean D. Barrett & Thomas M. Stace (2010). Fault tolerant quantum computation with very high threshold for loss errors. Phys. Rev. Lett. 105, 200502. arXiv: 1005.2456v2


Filed under Felix Articles, Physics

A galaxy far, far away…


When we look up into the sky at night, we see stars (even in London I can usually spot a few!). But there haven’t always been stars and galaxies in the universe. In a period known as the dark ages – not to be confused with the other dark ages – there was no light at all. After the ionised gas that filled the universe in its very early life cleared, there was a very long period of, well, nothing. The universe was transparent, but contained no stars or galaxies for just less than 400 million years. The process that allowed stars and galaxies to eventually form is known as reionisation, and new research published in Nature last month details a discovery that may open a new window on to this time.

Galaxy UDFy-38135539 shown in the Hubble Ultra Deep Field. Image: NASA

In the paper, Lehnert and colleagues reported detecting the most distant object physicists have ever seen: a galaxy, the light from which was emitted less than 600 million years after the Big Bang. It’s the first galaxy known to have lived fully within the epoch of reionisation.

The galaxy, which goes by the catchy name UDFy-38135539, has a redshift* of z = 8.6 – the highest ever observed – and it was from this that the team were able to calculate the galaxy’s age. UDFy-38135539 was first spotted by Hubble’s Wide Field Camera 3, but Lehnert and colleagues made ground-based observations using an instrument called SINFONI on the Very Large Telescope in Chile to look at the galaxy in more detail.

The instrument helped by splitting up the light from the galaxy in a process known as spectroscopy, allowing the team to look for a feature called the Lyman-α line. Each photon making up the Lyman-α line would have been emitted when an electron in a hydrogen atom dropped down from a higher energy level to a lower one. The photons that Lehnert and colleagues observed were ultraviolet when they were emitted from the galaxy, but by the time they reached Earth their wavelengths had been stretched into the infrared region, giving the high redshift mentioned above. This stretching of the wavelength occurs because space itself expanded during the enormous time the photons took to travel here – they were created just 600 million years after the Big Bang. This may sound like a long time, but if you consider that the age of the Universe is 13.7 billion years, you’ll appreciate that 600 million years is actually very early on in the grand scheme of things, and certainly very, very long ago. Physicists know that reionisation started within 600 million years of the Big Bang, so Lehnert and colleagues concluded that galaxy UDFy-38135539 must have lived within the epoch of reionisation.
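The stretching described above follows a simple relation: the observed wavelength is (1 + z) times the emitted wavelength. A quick sketch, taking the Lyman-α rest wavelength of roughly 121.6 nm:

```python
# Cosmological redshift: observed wavelength = (1 + z) * emitted wavelength.
LYMAN_ALPHA_NM = 121.6  # rest wavelength of the Lyman-alpha line (ultraviolet)

def observed_wavelength(rest_nm, z):
    """Wavelength of light after being redshifted by z."""
    return (1 + z) * rest_nm

obs = observed_wavelength(LYMAN_ALPHA_NM, 8.6)
print(round(obs))  # prints 1167 (nanometres), well into the infrared
```

So for z = 8.6 the line is stretched by a factor of 9.6, which is why a line emitted in the ultraviolet arrives at Earth in the infrared.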

Diagram showing the universe from the Big Bang to now. Image: Wikipedia

At the beginning, the Universe was very hot and dense, with conditions similar to those in a particle accelerator. After three minutes, the Universe had expanded and cooled enough to have formed all of the elementary particles, as well as protons and neutrons (also in these first three minutes was the matter-antimatter annihilation that destroyed most of the antimatter in the universe). For 400,000 years after this the universe was full of ionised gas, and was opaque. Then, in a period known as recombination, the electrons and protons in the gas got together and formed atoms; this made everything a lot clearer. For around 400 million years after that, the universe remained transparent and rather empty, with no stars or galaxies. It was only when astronomical objects, possibly quasars or small galaxies, began to form due to gravitational collapse that things began to get a bit more interesting.

These objects poured radiation out into the universe, causing reionisation. During reionisation, electrons were stripped back off the hydrogen atoms that had formed during recombination, and the universe began to turn back into an ionised gas. Fortunately, due to the expansion that had taken place in the time between recombination and reionisation, the universe was able to remain transparent despite these bubbles of plasma that had begun to form all over the place. This process led to the formation of the stars and galaxies we see today, but physicists are not yet sure exactly how it all happened.

Because the galaxy UDFy-38135539 lived within the epoch of reionisation, it may be able to help us explain how reionisation started, and how these objects that formed in the cosmological dark ages were able to transform the universe from a mostly neutral one, to one filled with ionised gas.

There are already a few other faraway candidates lined up for study too, and Lehnert and colleagues have shown that such study is possible with current instruments, but astronomers will have to wait for the new wave of telescopes to really study reionisation in detail. The James Webb Space Telescope (JWST), which is the successor to Hubble, and the Extremely Large Telescope (ELT), which is the successor to the VLT, are two that will allow for this more detailed investigation. Both are due to be up and running later this decade.

* Cosmological redshift is a measure of how fast an object is moving away from the Earth and is a consequence of the expanding Universe. Objects with higher redshift are moving away from the Earth faster than those with lower redshifts, and are further away too. See an earlier post of mine for an image showing redshift in action.


M. D. Lehnert, N. P. H. Nesvadba, J.-G. Cuby, A. M. Swinbank, S. Morris, B. Clement, C. J. Evans, M. N. Bremer & S. Basa (2010). Spectroscopic confirmation of a galaxy at redshift z = 8.6. Nature, 467. arXiv: 1010.4312v1

Further reading

BBC article, including a short phone interview with Malcolm Bremer, one of the physicists on the team.

Blog post at Cosmic log, which includes a Q&A with lead researcher Matt Lehnert.


Filed under Physics