French philosopher Denis Diderot famously said: “How near are genius and madness! Men imprison them and chain them, or raise statues to them.” It's a maxim that holds particularly true for great scientific minds, which have often been subject to both these fates as their work moved from being highly contested to widely accepted. But even more frequently, pioneering individuals in science and technology remain unrecognised, overlooked by history even when their innovations have transformed our understanding of the world.
This feature marks the first in a series exploring the unsung heroes of technology, their innovations and eccentricities, breakthroughs and downfalls, and how they have shaped the way we live, whether society acknowledges it or not.
Alan Mathison Turing did many things in his brief lifetime, but conforming was not one of them. The so-called 'father of computer science', World War II code-breaker, and artificial intelligence pioneer would have turned 100 this year, although he managed to achieve far more in his 41 years than many who have lived a full century.
Born in London on 23 June 1912, Turing showed his talents at an early age, along with the characteristics that marked him as both remarkable and slightly eccentric. His preference for mathematics and science over more classical disciplines like Latin and English, for example, caused his teachers to worry he wouldn't acquire a “balanced” education. He also preferred to develop ideas and processes from scratch, relying on his own insights rather than previous work. This approach led to him conducting all manner of experiments in his dorm room, the results of which often lay around like a “witches' brew” on countertops and window sills, according to one housemaster at Sherborne School, which Turing attended from the age of 13.
There was no question, however, of his eagerness to learn, evidenced by his cycling almost 100km to attend his first day of school, which happened to coincide with the 1926 General Strike in Britain. At 16, Turing encountered Albert Einstein's theories on relativity, not only understanding them but inferring that Einstein was questioning Newton's laws of motion, from a text in which this wasn't made explicit.
His school years also marked Turing's recognition of his homosexuality and the beginnings of an interest in the workings of the human mind. This line of thinking was fuelled by the sudden death of Turing's close friend, Christopher Morcom, who died from tuberculosis at age 18. The loss devastated Turing, and prompted questions around the material and spiritual nature of the brain and consciousness - concepts which would preoccupy him throughout his lifetime.
Pursuing his primary interest, Turing went on to study mathematics at King's College, Cambridge, in 1931, gaining first-class honours and becoming a Fellow in 1935. A turning point came a year later, when Turing produced a paper in which he defined concepts like 'computation' and 'algorithm' in an attempt to solve the 'Entscheidungsproblem' (decision problem).
Essentially, this problem asked whether there could be a general algorithm capable of deciding, for any given mathematical statement, whether that statement is provable. In his paper, 'On computable numbers, with an application to the Entscheidungsproblem', Turing proposed a hypothetical machine, later known as a Turing machine, which would execute tasks according to a “table of behaviour” - a finite set of instructions which it would read from a tape of theoretically infinite length. Such a machine was considered universal if its table of behaviour was sophisticated enough to read the tables of other Turing machines and execute the same instructions - effectively, one machine for any number of tasks. In this way he concluded that a universal machine would be able to compute anything that is computable.
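Turing's 'table of behaviour' translates quite naturally into modern code. The following is a minimal sketch in Python of a machine reading a tape and following a finite rule table; the function name, the encoding and the bit-flipping example machine are illustrative choices, not anything from Turing's paper.

```python
# A minimal sketch of a Turing machine driven by a "table of behaviour".
# The encoding (dicts, '_' for blank, +1/-1 for head moves) is an
# illustrative modern choice, not Turing's original notation.

def run_turing_machine(table, tape, state="start", max_steps=1000):
    """Run until no rule matches (the machine halts) or max_steps is hit.

    table maps (state, symbol) -> (symbol_to_write, head_move, next_state).
    """
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):     # guard: we cannot know in advance if it halts
        symbol = cells.get(head, "_")
        if (state, symbol) not in table:
            break                  # no applicable rule: the machine halts
        write, move, state = table[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells))

# Example table of behaviour: flip every bit, halt at the first blank cell.
flipper = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
}
print(run_turing_machine(flipper, "1011"))  # prints "0100"
```

A universal machine, in these terms, would be one whose table can decode another machine's table from the tape and step through it - exactly the 'one machine for any number of tasks' idea described above.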
Now, of course, it's natural to think of these concepts in terms of a computer running a software program, but this was the mid-1930s. It would be another 30 years before something as simple as a handheld electronic calculator became available, never mind the complex workings of a personal computer. Turing proved that the decision problem had no solution, by showing it was impossible to decide algorithmically whether a given Turing machine would ever halt. Without Turing's knowledge, another mathematician, Alonzo Church at Princeton, had reached the same conclusion just months earlier, although their methods differed. Turing eventually went on to study under Church, obtaining his PhD from Princeton University in June 1938, before returning to Cambridge.
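The shape of that proof can be sketched in code. Suppose someone claims to have a function that always answers correctly whether a program halts; Turing's diagonal argument builds a program that defeats any such candidate. The stub below is deliberately wrong (it always answers True), purely to keep the sketch runnable, and all the names are illustrative.

```python
# Sketch of the diagonal argument behind the halting problem, with
# illustrative names. No correct halts() can exist; this naive stub
# stands in for any claimed candidate.

def naive_halts(program, data):
    return True  # a guess, and necessarily sometimes wrong

def defeat(halts):
    """Build a program that does the opposite of whatever 'halts' predicts."""
    def contrary():
        if halts(contrary, contrary):
            while True:   # predicted to halt, so loop forever
                pass
        # predicted to loop forever, so halt immediately
    return contrary

contrary = defeat(naive_halts)
# naive_halts claims that contrary halts when run on itself, but actually
# running contrary() would loop forever - the prediction is wrong, and
# the same construction defeats every possible halts() function.
print(naive_halts(contrary, contrary))  # prints True, incorrectly
```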
While it's obviously impossible to credit the invention of the computer to one person, as Paul Gray writes in Time: “The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine.”
Breaking codes
Turing, at this time still in his twenties, could happily have continued working on challenging mathematical conundrums at Cambridge, but the small matter of a world war intervened. He had been working part-time for the Government Code and Cypher School (GCCS), Britain's main code-breaking centre, based at Bletchley Park, and when the war broke out in 1939 he joined full time. It was here that Turing entered the next phase of his career, helping to save millions of lives in the process.
Soon after joining GCCS, Turing was set to work on deciphering German military messages created by the Enigma machine, which scrambled ordinary text into seemingly random gibberish. Within weeks he had come up with an electromechanical machine capable of deciphering code more effectively than the Polish system - called the 'bomba kryptologiczna' - from which Turing's creation (the “bombe”) got its name.
Refined with input from mathematician Gordon Welchman, the bombe became one of the key methods for deciphering Enigma messages, with more than 200 in operation by the end of the war. Turing went on to make several key cryptography breakthroughs, including cracking the 'unbreakable' Enigma naval communications in December 1939; coming up with a process to decipher Lorenz messages, created by a German machine code-named Tunny; and helping create a secure speech device called Delilah.
While battlefield heroics offered visible victories, the code-breaking battles taking place behind closed doors were no less pivotal. In a BBC article on Turing's wartime legacy, Jack Copeland writes: “As early as 1943 Turing's machine was cracking a total of 84,000 Enigma messages each month - two messages every minute. Turing personally broke the form of Enigma that was used by the [Nazi] U-boats preying on the North Atlantic convoys of arms, food, and raw war materials from the US to Britain, without which a starving Britain could not have survived.”
Copeland cites historians as saying the code-breaking activities at Bletchley Park, especially the cracking of the U-boat Enigma code, “shortened the war by as many as two to four years”. Had the U-boat codes not been broken and the war dragged on that long, another 14 million to 21 million people might have been killed, notes Copeland. Turing was awarded the Order of the British Empire for his wartime services in 1945, but the secrecy surrounding the code-breaking operations meant his role remained unpublicised for many years.
For all his brilliant cryptography skills, Turing was also considered something of an oddball at Bletchley. There are a number of stories about his particular eccentricities, including a system he invented to prevent the chain from falling off his bicycle (this happened at regular intervals, so he began counting the number of revolutions and got off in time to adjust the chain by hand), as well as chaining his mug to the radiator pipes, to prevent it from being stolen. A gifted long-distance runner, Turing would sometimes jog the 64km to London for meetings, and almost qualified for the 1948 Olympic Games.
Breaking ground
After the war, Turing returned to the machines he had conceptualised in solving the decision problem, looking at practical applications for his theoretical ideas. He was invited to design an actual Turing machine by the National Physical Laboratory (NPL) in London, and submitted the design for what he called the Automatic Computing Engine (ACE) in March 1946. However, Turing's plans for ACE, particularly in terms of storage capacity, were highly ambitious and there were delays in the project being approved. Turing eventually became disillusioned, returning to Cambridge in 1947 where his work broadened into the fields of neurology and physiology.
Progress on ACE continued slowly at the NPL, and a scaled-down version, the Pilot ACE, was eventually built there in 1950 without Turing's personal involvement. By then he had moved to Manchester University, where he was appointed Reader in the mathematics department and turned his attention to the university's own early computers.
Turing's next great breakthrough came in 1950, when he published a seminal paper called 'Computing machinery and intelligence', which proposed the now famous 'Turing test'. The test was intended as a method for determining whether a machine could be considered 'intelligent'. It rested on the idea that if a human, conversing through written messages, couldn't reliably tell whether the party on the other end was another human or a machine, the machine could be said to be "thinking".
The questioning method was not aimed at checking the machine's ability to give the correct answer, but how closely the answer resembled a typical human response. The field of artificial intelligence has since moved on from this notion, but the Turing test remains a significant milestone in the thinking around machine intelligence.
Breaking laws
While many of Turing's ideas were ahead of their time, the man himself was tied to 1950s England, and the social norms that came with it. It was thus his liberal approach to his personal life (he was open about his homosexuality) rather than his professional endeavours that led to his decline and, some believe, his eventual death.
While at Manchester, Turing had become involved with a 19-year-old working-class man named Arnold Murray, who was later implicated in a robbery at Turing's home. During the ensuing investigation, Turing admitted to a sexual relationship with Murray, at a time when homosexual relations were still a criminal offence in Britain. He was tried and convicted of "gross indecency" in March 1952, under the same legislation which saw Oscar Wilde imprisoned, and given a choice between prison and chemical castration. Faced with losing his academic career, he agreed to the latter, receiving oestrogen injections for a year, which eventually rendered him impotent.
Turing's conviction also led to the removal of his security clearance, and he was barred from continuing his work with the Government Communications Headquarters, the intelligence agency that had evolved from GCCS in 1946.
Then, on 8 June 1954, a few months after the end of his treatment and just weeks before his 42nd birthday, Turing was found dead with a half-eaten apple lying at his bedside. The official cause was cyanide poisoning, and while an inquest determined it was suicide, his mother and some others believed his death was accidental, perhaps due to his careless handling of chemicals (he conducted frequent experiments in a chamber he called “the nightmare room”). Other friends speculated that Turing, seeking to end the persecution he was suffering as a result of his homosexuality, had staged the poisoned apple as a tribute to his favourite fairy tale, Snow White and the Seven Dwarfs. In any event, police never tested the apple for the presence of cyanide.
More than 50 years after Turing's death, on 10 September 2009, British prime minister Gordon Brown made an official public apology on behalf of the British government for “the appalling way” Turing was treated. The apology followed an Internet campaign which had gathered thousands of signatures, and Brown acknowledged the computing pioneer's “utterly unfair” treatment, apologising on behalf of both the government and “all those who live freely thanks to Alan's work”.
This nod to the impact Turing has had on the lives of millions, through his advances in the field of computing, emphasises how often such contributions are overlooked. Steve Kroon, senior lecturer in the computer science department at Stellenbosch University, says this hasn't changed much, despite the modern focus on technology. “I think the 'businessmen' of tech will continue to be the ones who are well-known, while the guys making the ground-breaking discoveries (possibly at the labs of Google or Apple) will remain mostly unknown outside of their specialist field.”
Nonetheless, Turing's genius will live on in the lives of all those dependent on computers to complete their daily tasks. As one commentator remarks: “Every time we use a computer today we are enjoying the legacy of Alan Turing and other computing pioneers. And it's not just computers on desks, it's absolutely everywhere - they're embedded in the 21st century, and one can't help thinking, if he could have used any of the kit we have today, it would have been a remarkable thing.”