An Introduction to Quantum Computing

2008-2009

Over the past 30 years, the silicon chip and the transistor have transformed the world: personal computers, mobile devices and other electronics have made enormous processing power available to the public. Yet despite the rapid progress of conventional computers, some problems still cannot be solved efficiently, such as factorising large integers (the problem underpinning RSA encryption). It is widely believed that, unless a particularly clever algorithm is devised, this limitation is fundamental to conventional computing. For this reason, the search for new computational technologies has been going on for as long as computing itself.

Quantum computing was an idea first popularised by the physicist Richard Feynman in the early 1980s. He proposed that large computational advantages lie in quantum theory, but it was Peter Shor who, in 1994, demonstrated concretely that a quantum computer could outperform conventional computing, with his quantum algorithm for factoring integers. Since then, significant research effort has gone into building a reliable quantum computer.

As part of my Physics course at Durham University, I was asked to write an essay and give a presentation on the current state of quantum computing: its history and unsolved problems.

Read the essay here, or view the presentation here.
