Code Nation - Michael J. Halvorson

memory, including a plan for allocating resources in an IBM PC that contains 64KB of RAM. (Photo by ©Doug Wilson/CORBIS/Corbis via Getty Images)

      As this example demonstrates, there is more than meets the eye to building non-trivial computer programs, and much of the process takes place well before the programmer loads the code into memory and actually runs the program. The emphasis here is on teamwork, and it serves as a corrective to the misconception that programming is usually the work of a solitary coder sitting alone in front of a computer screen.

      Conceptually, programming involves refining algorithms, the ordered collections of steps that are proposed to automate processes and solve problems elegantly and efficiently. Some algorithms are limited in scope, like the eight or ten steps that might be necessary to receive contact information from a user and store it in a computer file. (To complete this task, an algorithm might prompt the user for a name and address, assign the input to temporary variables, check the variables for suitable content, format the content, and then insert the information into a database at the appropriate location.) Algorithms can also be incredibly complex, such as the comprehensive searching and sorting schemes that Google uses to sift through data gathered from the World Wide Web, then present this information to a user via a commercial web browser.
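      To make the contact-information example concrete, the short Python sketch below walks through the same steps; the function name, database file, and table layout are invented for illustration rather than taken from any particular program.

        import sqlite3

        def collect_contact(db_path="contacts.db"):
            """Prompt for a name and address, validate them, and store them."""
            # Prompt the user and hold the input in temporary variables.
            name = input("Name: ").strip()
            address = input("Address: ").strip()

            # Check the variables for suitable content.
            if not name or not address:
                raise ValueError("Both a name and an address are required.")

            # Format the content consistently.
            name = name.title()

            # Insert the information into the database at the appropriate location.
            with sqlite3.connect(db_path) as conn:
                conn.execute("CREATE TABLE IF NOT EXISTS contacts (name TEXT, address TEXT)")
                conn.execute("INSERT INTO contacts (name, address) VALUES (?, ?)",
                             (name, address))

        if __name__ == "__main__":
            collect_contact()

      Even a small routine like this one reflects the layered planning described above: each step in the algorithm becomes a distinct, checkable piece of the program.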

      Systematic attempts to teach programming must somehow train students to become proficient in coding skills, the use of algorithms, debugging techniques, and other important abilities. To appreciate how this relatively obscure problem-solving process became a popular movement, we turn now to the proliferation of programming languages in the 1950s, and the development of an extremely successful programming language, FORTRAN (Formula Translation).

      In 1981, computer programming pioneer Jean Sammet observed that, by her count, there were already some one thousand computer languages in the U.S.8 The proliferation of languages was not new, nor was it tied to the development of PCs. In fact, 20 years earlier there were already so many languages in use on mainframe computers that the journal Communications of the ACM published a “Tower of Babel” image on its January 1961 cover. (See Figure 3.3.) The image depicted the mythical tower-to-heaven structure described in the biblical book of Genesis, glossed with the names of dozens of computer languages on the tower’s rings. The artwork recalls earlier critiques of “progress” in America’s social and political history, and it may also poke fun at the hubris of software industry officials for propagating so many compilers. (Human hubris is a pressing concern of the Genesis narrative.)

      In fact, the task of learning to program has sometimes been explained as a simple process of picking a computer language and learning all its features, as if mastering a language’s grammar is the same thing as learning to think logically, or to understand how a computer processes information. As we observed earlier, however, software development involves much more than simply learning the syntax of instructions, as important as that may be. Jean Sammet captured the importance of language syntax when she commented: “In the last analysis [language choice] almost always boils down to a question of personal style or taste.”9 In other words, the syntax of languages is interesting, but much in the differences between systems is simply a matter of fashion or technical culture.


      Figure 3.3 Communications of the ACM “Tower of Babel” cover image (January 1961), depicting the multiplication of computer programming languages. (Courtesy of the ACM)

      So where did all the languages come from and why do programmers propagate them?

      As I noted earlier, the first electronic computers did not utilize software or what we now call “programming” at all. They were hardwired devices that performed individual tasks, such as calculating the trajectory of a rocket. If you wanted to change the problem being computed, you didn’t modify the software, you changed the wiring to accommodate the problem. By hand.

      An example of this type of device is the so-called Atanasoff–Berry Computer, conceived in 1937 to solve linear equations, one problem at a time. The computer was assembled over a 5-year period at Iowa State College. When it was finished, the machine could be set up to solve two linear equations with up to 29 variables. This was impressive work and the results were highly valued by the Iowa State Physics Department. But in this context the device was essentially a single-purpose computer that specialized in linear equations.

      One of the first programmable computers was the ENIAC computer, designed by John Mauchly and J. Presper Eckert at the University of Pennsylvania. Dedicated in 1946, the ENIAC utilized sophisticated wiring, 18,000 vacuum tubes, panels of switches, and punched-card equipment for input and output.

      The physical task of programming the ENIAC was considered less important than the abstract task of devising complex numerical calculations that the machine could solve. Accordingly, the job of “setting up” the computer with punched cards, cables, and switches was left to skilled female workers who had been trained in mathematics. (Note: The ENIAC team used different terminology than later computer designers, so the terms “programming” and “programs” in this section are somewhat anachronistic.)10

      Regardless of gender considerations, it was not easy to create actual programs for the ENIAC system. In addition to conceptual errors and coding mistakes that arose as part of the planning process, the early log books indicate that there were regular shutdowns due to faulty tubes, short circuits, carry errors, divider faults, water leaks, and other problems.11 In short, programming involved a host of physical issues in the early days that modern software developers have no knowledge of or responsibility for today.

      In the 1950s, most early programs were written in numerical machine code, which consisted of 1s and 0s representing instructions for a specific computer. Programmers needed to learn the instruction set for a given computer, and then express the instructions in binary (base-2), octal (base-8), or hexadecimal (base-16) number systems depending on the machine’s internal architecture. Octal was especially common in early computer systems like the DEC PDP-8, ICL 1900, and the IBM mainframes, which structured internal memory using 12-bit, 24-bit, and 36-bit words.
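      To illustrate the notations involved, the small Python snippet below prints a single invented 12-bit instruction word in each of the three number systems mentioned above; the value itself is hypothetical and not drawn from any real machine.

        # One hypothetical 12-bit instruction word, rendered three ways.
        word = 0b101001110101
        print(format(word, "012b"))   # binary (base-2):       101001110101
        print(format(word, "04o"))    # octal (base-8):        5165
        print(format(word, "03X"))    # hexadecimal (base-16): A75

      The octal and hexadecimal forms are simply compact groupings of the underlying binary digits, which is why they suited machines organized around 12-, 24-, or 36-bit words.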

      Grace Murray Hopper described the intricacies of designing programs in this era in a keynote address at the first ACM conference on the history of programming languages, held in 1978 to document the achievements of the early days. (Hopper is shown with a Univac I computer system in Figure 3.4.) At the conference, Hopper explained that she wrote machine code programs in octal, where she routinely added, subtracted, multiplied, and divided in base-8 arithmetic. Hopper performed routine mathematical calculations in her head as she completed her work, although it was also common for engineers to use lookup tables to save time. (Ironically, Hopper later found it difficult to balance her own checkbook using base-10 arithmetic, as she was so steeped in using octal.)12 Later, when newer computers arrived, Hopper and her peers used assembly language as a way to write computer instructions in a more readable (textual) format.
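      A single worked example in Python suggests why this kind of mental arithmetic took practice: the digits look familiar, but in base-8 they carry different values. (The numbers here are invented and are not Hopper’s own calculations.)

        # In octal, 17 + 3 is written 22, because "17" in base-8 is fifteen in decimal.
        a = int("17", 8)      # fifteen
        b = int("3", 8)       # three
        print(oct(a + b))     # prints 0o22, i.e. eighteen in decimal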


      Figure 3.4 The operator’s console of a Univac I computer with four computer programmers (1957). From left to right, Donald Cropper, K. C. Krishnan, Grace Murray Hopper, and Norman Rothberg. (Courtesy of the Computer History Museum)

      In assembly language, program instructions are composed using short names or abbreviations (mnemonics) for machine language instruction codes. For example, the instruction “ADD” instructs the central processing unit to add the contents of one register to another. Symbolic names are also used to reference memory locations in assembly language program code.
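      A minimal Python sketch of this idea appears below: a toy assembler pass that translates mnemonics such as “ADD” and symbolic memory names into numeric codes. The opcodes, addresses, and six-bit word layout are invented for illustration and do not correspond to any real instruction set.

        # Toy assembler pass: mnemonics and symbolic names become numbers.
        OPCODES = {"LOAD": 0o01, "ADD": 0o02, "STORE": 0o03, "HALT": 0o77}   # invented opcodes
        SYMBOLS = {"PRICE": 0o20, "TAX": 0o21, "TOTAL": 0o22}                # invented addresses

        source = [
            ("LOAD",  "PRICE"),   # load PRICE into the accumulator
            ("ADD",   "TAX"),     # add TAX to the accumulator
            ("STORE", "TOTAL"),   # store the sum at TOTAL
            ("HALT",  None),
        ]

        for mnemonic, operand in source:
            opcode = OPCODES[mnemonic]
            address = SYMBOLS.get(operand, 0)
            # Pack opcode and address into one 12-bit word (6 bits each), shown in octal.
            word = (opcode << 6) | address
            print(f"{mnemonic:<6} {operand or '':<6} -> {format(word, '04o')}")

      On a real machine an assembler performs essentially this translation, with the opcode widths, addressing rules, and mnemonics dictated by the processor’s architecture.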

      Most computer science students learn assembly language as part

