The A.M. Turing Award has been given annually by the Association for Computing Machinery since 1966. Recipients are chosen for their technical contributions to the field of computing. Often referred to as the “Nobel Prize of Computing,” the Turing Award is generally regarded as the highest honor in the field. For the first 40 years after its inception it was awarded exclusively to men, but in 2006 Frances E. Allen became the first woman to win computer science’s most prestigious award.
Allen won the Turing Award for her “pioneering contributions to the theory and practice of optimizing compiler techniques that laid the foundation for modern optimizing compilers and automatic parallel execution.” Her 45-year career at IBM laid the groundwork for modern compiler optimizations and the automatic parallel execution of code. Not only did she develop the theory, she put it into practice, building high-performance compilers and systems throughout her career. These compiler techniques are fundamental to how we write and develop code today.
Allen grew up in a small town in upstate New York. She gravitated toward mathematics in high school and went to Albany State Teachers College (now SUNY Albany) to pursue her degree. After returning to her hometown to teach high school mathematics for a brief period, she continued her education at the University of Michigan, Ann Arbor. At Michigan she began taking courses in computing, and in 1957 she graduated with a master’s degree in mathematics.
Allen had originally planned to return to Peru, New York, and continue teaching at her local high school, but IBM recruited her at one of its on-campus career fairs. Her mathematical and computing prowess, coupled with her ability to teach, landed her a role at IBM’s Thomas J. Watson Research Center, where she taught staff scientists John Backus’s newly developed programming language, FORTRAN.
At the time, programs were written in assembly or machine code, and the staff scientists were incredibly good at optimizing code for their specific machine and task. Many were skeptical that learning this new high-level language would be worthwhile. However, Allen, with Backus’s help, was able to win over the IBM scientists by citing the two main goals of the developing language: programmer productivity and application performance. These two goals would become a theme of Allen’s career.
In order to teach the class, Allen first had to learn the language herself, often studying its components only a few weeks before her students did. To learn FORTRAN she began reading the compiler’s source code, and this sparked her interest in compilers. “It set my interest in compiling, and it also set the way I thought about compilers, because it was organized in a way that has a direct heritage to modern compilers,” Allen remarked.
Allen’s next big project at IBM was the compilers for the Stretch and Harvest supercomputers. Stretch was IBM’s first transistorized supercomputer, and Harvest was a custom add-on built for the NSA for code breaking. Harvest, with its stream coprocessor and the TRACTOR magnetic tape system, could process 3 million characters per second. Allen and her team set upon the daunting task of building one compiler framework targeting two machines and supporting three source languages. This was incredibly ambitious at a time when most compilers were written in assembly and targeted a single machine.
During the Stretch/Harvest project Allen also served as IBM’s liaison to the NSA. She helped coordinate the design of the Alpha programming language, built to detect patterns in arbitrary text. In 1962 Allen spent a year at the NSA installing the system and defining acceptance tests.
The Stretch/Harvest program is often viewed as a major failure because of its aggressive performance estimates: it promised 100 times the speed of the IBM 704 but delivered only about 30 times. The compiler work Allen was involved in, however, was incredibly successful. It was one of the first instances of a compiler with a shared optimizing back end that could accept multiple source languages and produce code for multiple machines.
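That shared-back-end structure can be sketched abstractly. The toy pipeline below (all names, the statement syntax, and the intermediate form are my invention, not the Stretch/Harvest design) shows the key idea: front ends lower source text to one intermediate representation, a single optimizer improves it, and interchangeable back ends emit code for different machines.

```python
def parse_assign(src):
    # Toy "front end": turn a line like "x = 2 + 3" into an IR tuple.
    target, expr = src.split("=")
    left, op, right = expr.split()
    return ("assign", target.strip(), (op, int(left), int(right)))

def constant_fold(ir):
    # The shared optimizer: evaluate constant arithmetic once, up front.
    kind, target, (op, left, right) = ir
    if op == "+":
        return (kind, target, ("const", left + right))
    return ir

def emit_machine_a(ir):
    # Toy back end for one machine (handles only the folded "const" form).
    _, target, (_, value) = ir
    return f"LOAD #{value}\nSTORE {target}"

def emit_machine_b(ir):
    # Toy back end for a second machine, sharing the same optimized IR.
    _, target, (_, value) = ir
    return f"MOVI {target}, {value}"

back_ends = {"a": emit_machine_a, "b": emit_machine_b}

def compile_line(src, machine):
    return back_ends[machine](constant_fold(parse_assign(src)))

print(compile_line("x = 2 + 3", "a"))  # LOAD #5 / STORE x
print(compile_line("x = 2 + 3", "b"))  # MOVI x, 5
```

Adding a new source language or a new target machine means adding one component, not a whole new compiler, which is what made the shared design so economical.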
Allen continued her work on compilers when she joined Project Y, the Advanced Computing Systems (ACS) project. The ACS compiler was one of the first built for a CPU that could work on multiple instructions simultaneously rather than processing them one at a time. This concurrency introduced a whole new set of challenges, and new optimization techniques were developed to meet them, including flow analysis: instead of representing a program as a linear sequence of statements, the compiler represented it as a graph that could be analyzed to discover further optimizations, such as reusing an already-computed value in another region of code. Allen’s work on the ACS compiler led to programs that executed much faster than on previous systems.
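The graph-based idea can be illustrated with one classic flow analysis, available expressions, which tells the compiler where a previously computed value may be reused. The sketch below is my own minimal illustration on a toy diamond-shaped graph of basic blocks, not the ACS compiler’s implementation:

```python
def operands(expr):
    """Variables mentioned in a toy expression like 'a+b'."""
    return set(expr.replace("+", " ").split())

def transfer(stmts, avail_in):
    """Push a set of available expressions through one basic block."""
    avail = set(avail_in)
    for target, expr in stmts:
        avail = {e for e in avail if target not in operands(e)}  # kill
        if "+" in expr and target not in operands(expr):         # gen
            avail.add(expr)
    return avail

def available_on_entry(blocks, edges, entry):
    """Iterate to a fixed point: an expression is available at a block's
    entry only if every predecessor makes it available on exit."""
    preds = {b: [] for b in blocks}
    for b, succs in edges.items():
        for s in succs:
            preds[s].append(b)
    every = {e for stmts in blocks.values() for _, e in stmts if "+" in e}
    avail_in = {b: (set() if b == entry else set(every)) for b in blocks}
    changed = True
    while changed:
        changed = False
        for b in blocks:
            if b == entry:
                continue
            outs = [transfer(blocks[p], avail_in[p]) for p in preds[b]]
            new_in = set.intersection(*outs) if outs else set()
            if new_in != avail_in[b]:
                avail_in[b] = new_in
                changed = True
    return avail_in

# Each block is a list of (target, expression) assignments.
blocks = {
    "entry": [("t", "a+b")],
    "left":  [("y", "a+b")],  # a+b was computed in entry: reusable here
    "right": [("a", "5")],    # redefines a, killing a+b
    "join":  [("z", "a+b")],  # not reusable: a may have changed on one path
}
edges = {"entry": ["left", "right"], "left": ["join"],
         "right": ["join"], "join": []}

avail = available_on_entry(blocks, edges, "entry")
print("a+b" in avail["left"])   # True: the compiler may reuse t
print("a+b" in avail["join"])   # False: a+b must be recomputed
```

None of this is expressible if the program is only a linear sequence of statements; the analysis needs the branching and merging structure that the graph records.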
One of Allen’s final projects at IBM was the Parallel TRANslation (PTRAN) group, which built a compiler that took FORTRAN programs written for sequential execution and generated code capable of executing on parallel computer architectures. This project introduced the concept of the program dependence graph, a representation now used by many parallelizing compilers to detect and extract parallelism from sequential code.
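The core question such a compiler asks of a loop is whether one iteration reads a value written by an earlier iteration. The toy test below (a drastic simplification of what a program dependence graph captures; the encoding and function name are my own, not PTRAN’s) checks for that kind of loop-carried dependence on statements of the form `a[i + offset]`:

```python
def carried_dependence(writes, reads):
    """Each access is (array_name, index_offset) for a loop over i.
    A loop-carried dependence exists when the loop writes a[i + w]
    and reads a[i + r] on the same array with w != r: some iteration
    then touches an element produced by a different iteration."""
    for arr_w, off_w in writes:
        for arr_r, off_r in reads:
            if arr_w == arr_r and off_w != off_r:
                return True
    return False

# a[i] = a[i-1] + 1  -> reads what the previous iteration wrote: sequential
print(carried_dependence(writes=[("a", 0)], reads=[("a", -1)]))  # True
# a[i] = b[i] * 2    -> each iteration touches only its own elements
print(carried_dependence(writes=[("a", 0)], reads=[("b", 0)]))   # False
```

When no carried dependence is found, the iterations are independent and the compiler is free to distribute them across processors.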
In 1989 Allen became IBM’s first female Fellow. In 1991 she became an IEEE Fellow, and in 1994 an ACM Fellow. In 2002 she received the Augusta Ada Lovelace Award from the Association for Women in Computing, and that same year she retired from IBM.
Throughout her career Allen focused on taking programs as programmers like to write them and making them run efficiently through sophisticated analysis and optimization of the code. Arguably, without Allen’s compiler optimization work, FORTRAN would not have been adopted as successfully in early computing, given programmers’ reluctance to trade performance for high-level abstractions. Without her work we would not have modern compilers with flow analysis and automatic parallelization. Her work has had a truly lasting impact and a major influence on how computing is done today.