
The father of computing, augmented reality, and bionic eyes

Anna Friedlander

Opinion

5/10/2009

Late last month the British Government apologised for its appalling treatment of Alan Turing, the father of modern computing.
In a 1936 paper (“On computable numbers, with an application to the Entscheidungsproblem”) Turing proved that a machine could perform any mathematical computation that can be represented by an algorithm.
In the same paper Turing proved that the halting problem is undecidable—that is, that no algorithm exists that can determine, for every possible computer program, whether it will eventually stop or run forever—and conceived of a ‘universal machine’ that could perform the tasks of any and all other machines.
In essence, every computer today is merely a variation on the ‘universal machine’ described by Turing.
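The machine Turing described is simple enough to sketch in a few lines of modern code. The following is a minimal, illustrative simulator—not Turing’s original formulation—and the state names and the example transition table are invented for this sketch. The `max_steps` cap is a practical necessity that nods to the halting problem: there is no general way to know in advance whether a given machine will ever stop.

```python
def run_turing_machine(transitions, tape, state="start", max_steps=10_000):
    """Run a Turing machine until it reaches the 'halt' state (or give up).

    transitions maps (state, symbol) -> (new_state, write_symbol, move),
    where move is -1 (left), 0 (stay), or +1 (right).
    """
    tape = dict(enumerate(tape))  # sparse tape; unwritten cells read as "_"
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            # Return the visited portion of the tape as a string.
            cells = range(min(tape), max(tape) + 1)
            return "".join(tape.get(i, "_") for i in cells)
        state, write, move = transitions[(state, tape.get(head, "_"))]
        tape[head] = write
        head += move
    # We can only give up after a step budget -- Turing proved that no
    # algorithm can decide, in general, whether a machine will ever halt.
    raise RuntimeError("no halt within max_steps")

# Example machine: flip every bit on the tape, then halt at the first blank.
flipper = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}
print(run_turing_machine(flipper, "1011"))  # -> 0100_
```

Feed the simulator a different transition table and it becomes a different machine—which is the essence of Turing’s universality insight: one machine, given a description of any other, can imitate it.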
Turing was among the first people to conceive of artificial (machine) intelligence, and developed what is now known as the ‘Turing Test’. This test for whether we should consider a machine ‘intelligent’ hinges on whether that machine could fool a human, by means of question and answer, into thinking it is a fellow human being. Turing sidestepped the issue of how we could determine whether the machine was really intelligent or just a pretty good facsimile by cutely pointing out that we have no such assurance that our fellow human beings are truly intelligent either—we just politely agree that we all are.
During World War II, Turing was instrumental in breaking the secret codes the German forces used to communicate with each other (to coordinate troop movements or bombing targets, for example). Such messages were encrypted and decrypted using machines like the Enigma, whose codes Turing’s work enabled the Allied forces to decipher.
But Turing was gay, and in 1952 he was arrested for “gross indecency”. He subsequently elected to receive oestrogen injections rather than jail time—effectively chemically castrating him.
Turing committed suicide in 1954, reportedly by biting into a cyanide-laced apple.
Turing’s treatment seems particularly cruel when considering his punishment was meted out by the same state that had benefited so much from his service cracking German codes during World War II.
I thought it appropriate to honour Alan Turing’s life and triumphs by reflecting on a couple of awesome new technologies that would not have been possible without the work of Turing and his contemporaries—the mothers and fathers of modern computing.
Early computers were so big they took up a room, or even an entire floor of a building. Today they fit on your desk, and even in that cute little phone in your back pocket.
For a while now, scientists and engineers have been working on the computers embedded in cellphones to enable them to ‘augment’ reality.
A number of companies, including Nokia, have recently demonstrated phones that combine their camera, GPS receiver, and internet connection to superimpose information (the names of constellations, for example) on objects in view. Standing on the street choosing between two restaurants? The goal of such technology is that you can simply hold your phone up and see reviews or ratings superimposed on each.
There is no word yet on whether such reality augmenting phones will come standard with Don’t Panic printed upon them in reassuring lettering.
In a similar development, engineers at the University of Washington have started work that could make our eyes bionic.
Writing in IEEE Spectrum, Professor Babak Parviz said that the goal of his team at Washington is to create a device that allows data to be superimposed on the wearer’s visual field—he describes this as something like what the Terminator sees in the movies of the same name (to my relief, a thorough internet search does not reveal any links between Parviz and Skynet or Cyberdyne Industries).
The engineers have created a prototype contact lens that incorporates control and communication circuits, and a tiny antenna. The antenna receives information, which is projected onto the retina using a tiny LED.
So if you’re reading this on a computer (if the university hasn’t blocked access to Salient on the SCS computers yet), or in the print edition in the back of a lecture theatre while texting your friends, spare a thought for Alan Turing—a man whose departure from this world did not befit his amazing contribution to it.
If you’re interested in reading more about Alan Turing, David Leavitt’s The Man Who Knew Too Much is an excellent biography of Turing’s life and work. It’s a great mix of the personal and the professional: from Turing’s family and relationships, and insights into his personality, through his education alongside contemporaries like Bertrand Russell, to in-depth explanations of his seminal paper “On Computable Numbers” and the other big ideas and developments of the time, his work for the Allied forces in World War II, and the machines the Allies and their German adversaries used to create and crack coded messages.