Curatorial Insights Archives – CHM (Computer History Museum)
https://computerhistory.org/blog/category/curatorial-insights/

Fifty Years of the Personal Computer Operating System
April 18, 2024 · https://computerhistory.org/blog/fifty-years-of-the-personal-computer-operating-system/

Fifty years ago, PC software pioneer Gary Kildall demonstrated CP/M, the first commercially successful personal computer operating system.

The post Fifty Years of the Personal Computer Operating System appeared first on CHM.

PC software pioneer Gary Kildall demonstrated CP/M, the first commercially successful personal computer operating system, in Pacific Grove, California, in 1974. What follows is the story of how his company, Digital Research Inc. (DRI), established CP/M as an industry standard, and how it subsequently lost that market to a Microsoft version that copied the look and feel of the DRI software.

Early Days

Gary Arlen Kildall was born to a family of Scandinavian descent in Seattle, Washington, in 1942. His inventive skills flourished in repairing automobiles and having fun but suffered in scholastic pursuits. He qualified for admission to the University of Washington based on his teaching experience at the family-owned Kildall Nautical School rather than his high school grades.

Dorothy and Gary, circa 1978. Photo: Courtesy Kildall Family

Gary entered college and married his high school sweetheart Dorothy McEwen in 1963. He was one of 20 students accepted into the university’s first master’s program in computer science. Here, his mathematical talents were applied to a subject that fascinated him: all-night sessions programming a new Burroughs computer. To avoid the uncertainty of the draft at the height of the Vietnam War, on graduating with a PhD, he entered a US Navy officer training school and was posted to serve as an instructor in computer science at the Naval Postgraduate School (NPS) in Monterey, California.

Herrmann Hall, Naval Postgraduate School, Monterey. Creative Commons CC0 1.0 Universal Public Domain Dedication.

Gary remained at NPS as an associate professor after his tour of duty ended in 1972. He became fascinated with Intel Corporation’s first microprocessor chip and simulated its operation on the school’s IBM mainframe computer. This work earned him a consulting relationship with the company to develop PL/M, a high-level programming language that played a significant role in establishing Intel as the dominant supplier of chips for personal computers.

To design software tools for Intel’s second-generation processor, he needed to connect to a new 8″ floppy disk-drive storage unit from Memorex. He wrote code for the necessary interface software that he called CP/M (Control Program for Microcomputers) in a few weeks, but his efforts to build the electronic hardware required to transfer the data failed. The project languished for a year. Frustrated, he called electronic engineer John Torode, a college friend then teaching at UC Berkeley, who crafted a “beautiful rat’s nest of wirewraps, boards and cables” for the task.

This is going to be a “big thing”

Late one afternoon in the fall of 1974, together with John Torode, in the backyard workshop of his home at 781 Bayview Avenue, Pacific Grove, Gary “loaded my CP/M program from paper tape to the diskette and ‘booted’ CP/M from the diskette, and up came the prompt: *.”

“This may have been one of the most exciting days of my life, except, of course, when I visited Niagara Falls,” he exclaimed. “We now have the power of an IBM S/370 [mainframe computer] at our fingertips.” This is going to be a “big thing,” they told each other, and they “retired for the evening to take on the simpler task of emptying a jug of not-so-good red wine … and speculating on the future of our new software tool.”

By successfully booting a computer from a floppy disk drive, they had given birth to an operating system that, together with the microprocessor and the disk drive, would provide one of the key building blocks of the personal computer revolution. While they knew it was important, neither realized the extraordinary impact it would have on their lives and times.

781 Bayview Avenue, Pacific Grove, circa 1974. Photo: Courtesy Kildall Family

As Intel expressed no interest in CP/M, Gary was free to exploit the program on his own and sold the first license in 1975. He continued teaching part-time at NPS, and in 1976, with his wife Dorothy as cofounder, they established Intergalactic Digital Research to pursue commercial opportunities. They shortened the company name to Digital Research Inc. (DRI) when it became available.

Glenn Ewing, a former NPS student, approached DRI with the opportunity to license CP/M for a new family of disk subsystems from fast-growing microcomputer maker IMSAI Inc. Reluctant to adapt the code for yet another controller, Gary worked with Ewing to split out the hardware-dependent portions so they could be incorporated into a separate piece of code called the BIOS (Basic Input Output System).

Before CP/M, computer manufacturers designed their operating systems to work only with their own hardware and peripheral equipment. An IBM OS would only work with IBM computers; a Burroughs OS would only work with Burroughs computers, etc. Applications had to be written for each computer’s specific OS. Such “closed systems” made it difficult or impossible to mix and match the best pieces of equipment and software applications programs from different manufacturers.

The BIOS code allowed all Intel and compatible microprocessor-based computers from other manufacturers to run CP/M on any new hardware. This capability stimulated the rise of an independent software industry by expanding the market’s potential size for each product. A single program could run without modification on computers supplied by multiple manufacturers, laying an essential foundation for the personal computer revolution.
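The idea Kildall introduced can be sketched as a table of routines: the portable operating system core calls only fixed entry points, and each machine vendor supplies its own hardware-dependent implementations behind them. The sketch below is illustrative only; the names and interfaces are invented for this example and are not DRI's actual CP/M BIOS interface.

```python
# Hypothetical sketch of the OS/BIOS split. The portable "OS" code talks to
# hardware only through a per-machine BIOS object, so the same OS code can
# run on any machine that supplies a matching BIOS.

class Bios:
    """One machine's hardware-dependent layer (here simulated in memory)."""

    def __init__(self, output):
        self._screen = output          # stand-in for a console output device

    def console_out(self, ch):
        """Write one character to the console."""
        self._screen.append(ch)

    def read_sector(self, track, sector):
        """Read one disk sector; here we pretend it holds its coordinates."""
        return bytes([track, sector])


def os_print(bios, text):
    """Portable OS code: knows nothing about the hardware behind the BIOS."""
    for ch in text:
        bios.console_out(ch)


screen = []
bios = Bios(screen)
os_print(bios, "A>")                   # CP/M-style drive prompt
print("".join(screen))                 # -> A>
```

Porting the "OS" to a new machine means rewriting only the `Bios` class, exactly the property that let a single CP/M binary run on hundreds of different computer models.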

DRI advertisement from 1978

Dorothy and Gary opened their first office at 716 Lighthouse Avenue, Pacific Grove, on the upper floor, with a view of Monterey Bay. They sold CP/M disks via mail order and walked to the post office every workday to pick up checks resulting from ads placed in industry magazines such as Byte and Dr. Dobb's Journal of Computer Calisthenics and Orthodontia.

A licensing deal with computer manufacturer IMSAI bestowed credibility across the industry. CP/M became accepted as a standard and was offered by most early personal computer vendors, including pioneers Altair, Amstrad, Kaypro, and Osborne.

Outside the DRI office at 801 Lighthouse Avenue in November 1980. Photo: John Pierce

In 1978, revenue topped $100,000 per month, and DRI purchased a Victorian house at 801 Lighthouse Avenue for the company headquarters. By 1980, DRI employed more than 20 people, and Fortune magazine reported that the company generated revenue of $3.5 million, five times the revenue of Microsoft at that time. Gary also acquired a Piper aircraft that allowed him to fly from Monterey to meet regularly with his customers in Silicon Valley and beyond.

To accommodate the expanding engineering staff hired to service the hundreds of different computer models used by more than a million people worldwide, DRI purchased a 1909 American Foursquare-style residence at 734 Lighthouse. Today, it houses the offices of the Carmel Pine Cone newspaper.

Gary in 734 Lighthouse Avenue. Photo: John Pierce

One Friday afternoon, Gary called the engineering staff together and announced that he would give them all a raise over the weekend. On Monday, when they returned to work, contractors began raising the building to make room in the basement for a new Digital Equipment Corporation VAX 11/750 computer system. After several weeks, supported by heavy wooden beams and house jacks, the engineers’ desks were five feet higher.

By 1983, DRI’s annual sales reached $45 million. The company employed over 500 people, including more than 100 engineers, and had expanded into another building at 160 Central Avenue, which today houses the Monterey Bay Aquarium’s offices.

The IBM PC Effect

In 1980, IBM established a new business division in Boca Raton, Florida, to develop a desktop computer for the mass market. To get the IBM PC, as it became known, to market as quickly as possible, they used commercially available components, including an Intel microprocessor chip. Bill Gates knew Gary from early discussions about merging their companies and setting up shop in Pacific Grove, so when an IBM procurement team visited Microsoft to license the BASIC interpreter program, he referred them to DRI for an operating system.

Gary at Monterey Airport with his Piper Aerostar. Photo: Tom Rolander

When the IBM team arrived in Pacific Grove, they met in the morning with Dorothy and DRI attorney Gerry Davis to agree on the terms of a non-disclosure agreement. Gary, who had flown his aircraft to Oakland to meet an important customer, returned in the afternoon, as scheduled, to discuss technical matters. IBM wished to purchase CP/M outright, whereas DRI sought a per-copy royalty payment in order to protect its existing base of business. The meeting ended in an impasse over financial terms, but Gary believed they had essentially agreed to do business. 

Kildall tried to renew the negotiations a couple of weeks later. IBM did not respond because, in the meantime, Bill Gates purchased an OS from Seattle Computer Products that was written to emulate the look and feel of CP/M. He then sold a one-time, non-exclusive license to IBM, which used the designation PC DOS. With great foresight, he retained the right to license the product to others as MS-DOS.

When Gary learned of this transaction, he threatened IBM with a lawsuit over what he believed was an illegal copy of CP/M. IBM responded by agreeing to fund DRI to adapt CP/M for the PC and to make both brands of OS available to customers. With CP/M’s reputation and enhanced features, DRI believed customers would opt for the better product.

IBM announced the PC on August 12, 1981, but with the PC-DOS list price set at $40 versus $240 for CP/M, most customers simply chose the former as the lower-cost option. Attorney Gerry Davis recalled that “IBM clearly betrayed the impression they gave Gary and me.”

Aftermath

DRI continued to thrive for several years with a multi-tasking operating system for the IBM PC-XT and a host of new products. The company also introduced operating systems with windowing capability and menu-driven user interfaces years before Apple and Microsoft.

At its peak, DRI employed over 500 people and opened operations in Asia and Europe. However, by the mid-1980s, in the struggle with the juggernaut created by the combined efforts of IBM and Microsoft, DRI had lost the basis of its operating systems business.

Dispirited, Gary, who never relished the responsibility of managing a large company or displayed the cut-throat business acumen of a Gates, sold the company to Novell Inc. of Provo, Utah, in 1991. Ultimately, Novell closed the California operation and, in 1996, disposed of the assets to Caldera, Inc., which used DRI intellectual property assets to prevail in a lawsuit against Microsoft.

In other pursuits, Gary founded KnowledgeSet with his friend and DRI VP of engineering, Tom Rolander, where they created the first CD-ROM encyclopedia for Grolier.

In an oral history for the Computer History Museum, Brian Halla, Intel’s technical liaison to DRI, recalls that Gary “showed me this VAX 11/780 that he had running in his basement, and he was so proud of it, and he said, ‘I figured out a way to have a computer generate animation,’ and he said, ‘Watch this.’ And he runs a demo of a Coke bottle that starts real slowly and starts spinning, and so as maybe several months went by, he lost interest in this, and he sold his setup to a little company called Pixar.”

Kildall continued to innovate after selling DRI. He moved to Austin, Texas, where he founded Prometheus Light and Sound to explore wireless home networking technology and participated in charitable work for pediatric AIDS.

Gary Kildall died in 1994 at age 52 following an accident in Monterey. His ashes are buried in Seattle, the hometown he shared with Bill Gates. Dorothy McEwen Kildall purchased the Holman Ranch in Carmel Valley and served on many community boards, including the Heritage Society of Pacific Grove. She died in 2005.

The Legacy of Gary Kildall

In 1995, the Software and Information Industry Association presented Gary Kildall with a posthumous Lifetime Achievement Award, citing eight significant areas in which he contributed to the microcomputer industry.

In an obituary published in the Microprocessor Report in 1994, his friend, the late John Wharton, commented, “I don’t think Gary ever really begrudged Bill Gates his business success or his personal fortune. … what I think Gary wanted most was to share his excitement and enthusiasm for computers and technology with others.”

Gary Kildall in 1988. Photo: Copyright Tom O’Neal, Carmel Valley, CA

On April 25, 2014, the Institute of Electrical and Electronics Engineers (IEEE), “The world’s largest professional association for the advancement of technology,” installed a bronze IEEE Milestone in Electrical Engineering and Computing plaque outside the former DRI headquarters at 801 Lighthouse Avenue. The Milestone program honors important events in electrical engineering and computing. Achievements such as Thomas Edison’s electric light bulb, Marconi’s wireless communications, and Bell Labs’ first transistor are recognized with plaques in appropriate locations.

The citation reads: “Dr. Gary A. Kildall demonstrated the first working prototype of CP/M (Control Program for Microcomputers) in Pacific Grove in 1974. Together with his invention of the BIOS (Basic Input Output System), Kildall’s operating system allowed a microprocessor-based computer to communicate with a disk drive storage unit and provided an important foundation for the personal computer revolution.”

In 2017, US Navy dignitaries, friends, family, and peers gathered to celebrate the dedication of the Gary A. Kildall Conference Room on the Naval Postgraduate School campus in Monterey. The ceremony included the installation of a duplicate of the IEEE plaque in the conference room.

Despite this wide recognition of his technical accomplishments, Gary’s legacy remains mired in a tangle of myths and conspiracy theories. The most persistent is driven by a 1982 comment attributed to Bill Gates and published in the London Times: “Gary was out flying when IBM came to visit, and that’s why they did not get the contract.”

Harold Evans, a former editor of the Times, atoned for that story in a PBS documentary and in his book They Made America: Two Centuries of Innovators from the Steam Engine to the Search Engine. The chapter on Gary, subtitled “He saw the future and made it work. He was the true founder of the personal computer revolution and the father of PC software,” offers a sympathetic telling of the life and times of the entrepreneurial genius who helped give birth to the PC operating system 50 years ago this year.

Additional information at the Computer History Museum 

Comments in quotes in this article without source attribution are from Gary’s unpublished draft of Computer Connections: People, Places, and Events in the Evolution of the Personal Computer Industry, written in 1993. The Kildall family has authorized the online publication of extracts from this memoir in the blog Gary Kildall: In His Own Words.

The Computer History Museum has also made the source code of several early releases of CP/M available for non-commercial use.

A search for “Kildall” in the CHM collection catalog yields 45 records comprising objects, documents, and images, including a video of the 2014 CP/M IEEE Milestone Dedication event.

 

Main image: Gary Kildall at the first West Coast Computer Faire in the San Francisco Civic Auditorium in 1977. [CHM Object ID: 500004174 © Tom Munnecke/Hulton Archive/Getty Images]

Amplifying History
March 6, 2024 · https://computerhistory.org/blog/amplifying-history/

CHM acquires two early hearing aids that used the new transistor technology of the late 1940s to revolutionize the devices.

CHM Acquires Innovative Hearing Aids

With their hearing aids, the hard of hearing have consistently been among the earliest commercial adopters of new electronic technologies, as historian Mara Mills has shown. The histories of disability and of technology interweave in many different, fascinating, and important ways. The desire among many of the hard of hearing for assistive technology—hearing aids—has been a significant force in the history of electronics.

Mara Mills, “Hearing Aids and the History of Electronics Miniaturization,” IEEE Annals of the History of Computing, April-June 2011, pp. 24-44.

In the early years of the 20th century, the first hearing aids to use electricity were based on telephone technology. Many users found them terribly conspicuous, yearning for something smaller, more comfortable, and far less visible. The advent of the vacuum tube introduced the age of electronics, opening new possibilities for creating, altering, and transmitting sound.

Hearing aid manufacturers quickly adopted the new electronics technology, using tubes to amplify sound. They, along with the makers of portable radio sets, had an insatiable demand for smaller tubes that were more robust and less power hungry. The electronics industry responded, and the smaller tubes enabled miniaturized, less conspicuous hearing aids.

Throughout the Second World War, military demands further drove the miniaturization of electronics, with ultra-miniature vacuum tubes, and the intensive development of printed circuitry. After the war, hearing aid manufacturers quickly used these developments to further their own decades-long efforts at miniaturization.

Transistor Technology

At the end of the 1940s, an incredibly important development in the history of electronics occurred: the invention of the transistor. The transistor could perform the electronic functions of a vacuum tube but operated on an entirely different principle. Instead of directing flows of electrons through the vacuum inside a tube, transistors worked by controlling flows of electrons through solid materials called “semiconductors.” On this principle, transistors could be made much smaller than vacuum tubes and required far less power. Eventually, they were made to be extremely reliable as well.

While early transistors were dedicated to telecommunications and military production, the first commercial uses of transistors were in hearing aids, a fitting fact given that the hard of hearing had long been among the earliest adopters of new electronics technology and miniaturization. Perhaps the world’s foremost collector of transistors and related objects, Jack Ward, recently donated to the Computer History Museum two very early hearing aids that employed transistors: the Sonotone Model 1010 and the Telex Model 954.

Sonotone 1010 donated by Jack Ward to the Computer History Museum. Photo by Aurora Tucker.

Inside the Sonotone 1010 donated by Jack Ward to the Computer History Museum. Photo by Aurora Tucker.

The Telex 954 donated by Jack Ward to the Computer History Museum. Photograph by Aurora Tucker.

The transistor was first demonstrated at the Bell Telephone Laboratories at the very end of 1947; the Sonotone 1010 appeared on the market in December 1952 and was the first commercial product to use transistors. The Telex Model 954 appeared somewhat later, in December 1954, and, like the Sonotone 1010, used two miniature vacuum tubes along with one transistor to amplify sound. CHM is honored to have these remarkable devices in its collection, preserving artifacts so important to both disability history and the history of electronics.

Explore More

Mara Mills, “Hearing Aids and the History of Electronics Miniaturization,” IEEE Annals of the History of Computing, April–June 2011, pp. 24-44.

Jack Ward’s Transistor Museum.

Main image: Inside the Telex 954 and the edge of the Sonotone 1010 donated by Jack Ward to the Computer History Museum. Photo by Aurora Tucker.

Echoes of History
June 14, 2023 · https://computerhistory.org/blog/echoes-of-history/

Three generations of Sutherlands visit CHM to see Jim Sutherland's 50-year-old creation, the ECHO IV home computer system.

Visits with computing pioneers are magical. Recently, Jim Sutherland, creator of one of CHM’s most intriguing artifacts, came to the Museum’s environmentally controlled storage facility with his son and grandson to see something he made over half a century ago.

Sutherland was the visionary engineer who, in 1965, built a home computing system based on minicomputer parts he had scavenged from work. He called it ECHO IV, an acronym for the “Electronic Computing Home Operator.”

ECHO quickly caught the attention of the media, appearing in dozens of publications. Like some of today’s coverage of new technology, the tone vacillated between wonder and irony. Even Jim’s wife Ruth remarked at the time, “At first, I thought it might really replace me!”

The Sutherland family in front of ECHO IV. Jim sits at ECHO IV’s keyboard. His wife, Ruth, puts a raincoat on daughter Sally, while Jay and Ann look on. (Photo: Pittsburgh Post-Gazette, 1966)

Jim’s son Jay and grandson Evan took a transcontinental flight to visit CHM and see, perhaps for the last time, this wonderful invention of nearly 60 years ago. It was deeply moving to witness Jim’s joy at rediscovering something he had not seen in decades, seeing his pride at showing his grandson what he had built, and hearing Jay’s detailed memories of using ECHO IV as a young boy of about Evan’s age.

Occasions like this are great opportunities for revisiting the history of specific objects and asking questions of their creators. As former CHM Trustee Donna Dubinsky once said, “We live in an era when we can ask the great inventors of our days directly about their work . . . imagine being able to go up to Michelangelo and ask him questions.” And so, earlier that day, while Jay and Evan were on a guided tour of CHM, I conducted an extended oral history with Jim about ECHO IV as he sees it from today’s perspective. Stay tuned!

Main image:  From left, ECHO IV, Jim, Jay, and Evan Sutherland at CHM, May 18, 2023.

 

SUPPORT CHM’S MISSION

Blogs like these would not be possible without the generous support of people like you who care deeply about decoding technology. Please consider making a donation.

Weather By Computer
June 7, 2023 · https://computerhistory.org/blog/weather-by-computer/

Using homegrown satellite communications equipment in the early '60s, the CDC 1604 laid the foundation for modern weather forecasting tools.

Laying the Foundation

Established on the Monterey Peninsula in 1961, the Fleet Numerical Weather Facility (FNWF), known locally as Fleet Numerical, was chartered to apply the newly emerging processing power of digital computers and communications technology to provide accurate weather and ocean condition prediction services to the US Navy. 

Based on the Naval Postgraduate School (NPS) campus and at a facility on Point Pinos in Pacific Grove, and using homegrown satellite communications equipment and Model #1, Serial #1 of the Control Data Corporation 1604 computer, FNWF laid the foundations for modern weather forecasting technology.

The Fleet Numerical Weather Facility

The Navy established a Numerical Weather Problems Group (Project NANWEP) in Suitland, MD, in 1958 to generate operational weather prediction products for the Navy. To take advantage of the computing capability at NPS, in March 1959, the Navy assigned NANWEP to Monterey under Capt. Paul M. Wolff, where, in 1961, it was renamed the Fleet Numerical Weather Facility (FNWF).

Wolff distinguished FNWF’s mission from other activities in the field: “Atmospheric and oceanographic analysis and prediction problems have been faced before — in the universities, in industry, in governmental agencies. To my knowledge, however, FNWF acts singularly in its treatment of the two fluids as a single, coupled system. Correct solutions to environmental problems demand this approach.” [1]

The Navy Acquires a Supercomputer

The NPS Department of Mathematics purchased its first electronic automatic digital computer, a National Cash Register NCR 102A, in 1953. It was used in practically all phases of the physical sciences, including early approaches to weather simulation.

NCR 102A at NPS. Photo by Dean Vannice. Source: Calhoun: The NPS Institutional Archive

Pioneering computer architect Seymour Cray and his team built the world’s first commercially successful transistorized computer at Control Data Corporation (CDC) in Minneapolis in 1959. The central computer weighed one ton, and the console half a ton. With a cycle time of 5 microseconds, the 48-bit CDC 1604 was claimed to be the fastest machine of its time.

CDC 1604 transistorized logic module. Source: Calhoun: The NPS Institutional Archive

In 1958, when the Bureau of Ships contracted to acquire ten 1604s from CDC, Cray lobbied for the first system to be delivered to Monterey. [2] And in January 1960, he personally supervised the installation of Model #1, Serial #1 of the CDC 1604 in Room 101A of Spanagel Hall.

The CDC 1604 is delivered and assembled in Spanagel Hall. Source: Calhoun: The NPS Institutional Archive

“I was there when Cray sat at the 1604 console and, like a master pianist, ran through the test programs,” said Edward Norton Ward, a mathematician and the first computer technician hired by Professor W. R. Church, Chairman of the Mathematics Department. “I watched and listened. When it’s raining knowledge, you just hold out your hand.” [3]

Lt. Harry Nicholson at a 1604 console. Capt. Nicholson served as Commanding Officer of FNOC from 1982 to 1986. Source: Calhoun: The NPS Institutional Archive

According to Professor Douglas Williams, who became Director of the NPS Computer Center in 1963, “It was used by submitting machine language programs on paper tape. There was no operating system and no assemblers, compilers, or utilities. I obtained a Fortran compiler—folklore says it was written by Seymour Cray.” [4] Despite its limitations, the 1604 boasted impressive computing power for the time, with 32,768 words of 48-bit core main memory and 100,000 computations per second.
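Those figures are easy to sanity-check. The 1604 was word-addressed, with no notion of 8-bit bytes, so the byte conversion below is only a modern framing for scale:

```python
# Back-of-the-envelope check of the CDC 1604's core memory capacity.
words = 32_768          # 2**15 addressable words
bits_per_word = 48

total_bits = words * bits_per_word
print(total_bits)                  # total core capacity in bits: 1,572,864
print(total_bits // 8 // 1024)     # roughly 192 KiB if recast in modern bytes
```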

CDC 1604 tape drive storage units. Source: Calhoun: The NPS Institutional Archive

In August 1960, the Monterey Peninsula Herald reported that FNWF demonstrated “the first surface weather map to be produced by a computer … it cuts the time for compiling hemispheric weather forecasts from hours to minutes. And is 40 percent more accurate than old hand methods.”

Printout shows the temperature at the sea surface and various depths. Source: Weather by Computer

FNWF acquired its own CDC 1604 in 1961, which was installed alongside the school’s machine in the converted first-floor lobby of Spanagel Hall. The complete system, incorporating a CDC 160A for data transmission, a Varian 530 plotter, ASR-33 teletype machines, and tape storage drives, is shown below.

Diagram of Control Data computer system FNWF. Source: Weather by Computer

In 1963, CDC published a report, Weather By Computer, that described the FNWF operation and programs written for the 1604 that generated a broad range of weather predictions for naval operations worldwide.

Weather By Computer. Source: Computer History Museum

To eliminate conflicts between the immediate demands of computer processing time for weather prediction and the teaching needs of the school, in 1964, FNWF built a dedicated computer center on the NPS campus. To handle the increased computational load, a CDC 3200 computer (a 24-bit version of the 1604) was purchased in October and was running at full capacity by the end of the year.

Also, in 1964, FNWF established a separate Communications Division to design, fabricate, and test special-purpose electronic communications and interface devices to serve unique requirements for receiving and transmitting real-time weather data. Administrative offices, workshops, and R&D laboratories were located in a former Navy radar training facility on Point Pinos at 1352 Lighthouse Avenue, Pacific Grove.

When NPS acquired an IBM 360 Model 67 in 1967, Model #1, Serial #1 of the CDC 1604 was transferred to FNWF and moved to Point Pinos, where it continued to serve for archival storage of weather data.

All FNWF operations were consolidated at a single site at the Navy Annex, Monterey Airport, in 1974. Today, as the Fleet Numerical Meteorology and Oceanography Center (FNMOC), it serves as a primary DoD production site for computer-generated meteorological and oceanographic analysis and forecast products worldwide. The operation is highly respected, and its computing capability ranks among the most powerful in its field in the world.

Main image: The CDC 1604 supercomputer with a figure as scale. Source: Wikipedia. https://en.wikipedia.org/wiki/CDC_1604

 

NOTES

[1] Wolff, Paul M., “Oceanographic Data Collection,” Bulletin of the American Meteorological Society, Vol. 49, No. 2 (February 1968), p. 96.

[2] Created by Congress in 1940, the Bureau of Ships’ responsibilities included supervising the design, construction, conversion, procurement, maintenance, and repair of ships and other craft for the Navy; managing shipyards, repair facilities, laboratories, and shore stations; developing specifications for fuels and lubricants; and conducting salvage operations.

[3] Honegger, Barbara, “NPS Computing: 50 Years Golden and Growing,” Calhoun: The NPS Institutional Archive.

[4] Douglas Williams, Interviews, Calhoun: The NPS Institutional Archive.

[5] The Computer History Museum collection holds a 1604 main cabinet, all three sections of the operator’s console, and the core memory unit, together with numerous other related documents and manuals.

A Backup of Historical Proportions
May 10, 2023 · https://computerhistory.org/blog/a-backup-of-historical-proportions/

Discover what surprises await in CHM's release of the Xerox PARC file system archive.

Access the Xerox PARC file system archive here.

An Ancient Anxiety

“Is my phone really backed up in the Cloud?” “When was the last time I backed up my laptop?” “Is it true that I need a local backup of my Google Drive?!” “Oh dear, I forgot my password!” Now that we have interwoven computers so deeply into our daily lives, an ancient anxiety has become a fiercer everyday companion for us. For centuries we have worried “Are my most precious records okay?” In the past, we calmed this anxiety using a variety of technologies: safety deposit boxes, shoe boxes, photo albums, photocopies, scriptoria, institutional archives, and more. In a world of digital computing, we are all too aware of the fragility of record keeping. In some ways, our ancient anxiety has expanded.

Scriptoria were dedicated spaces for the copying of manuscripts, making this a drawing of a 16th century backup. From the National Gallery of Art. https://www.nga.gov/collection/art-object-page.74850.html

Computing professionals have been living with this digital flavor of archival anxiety for longer than the rest of us. From the very beginning, the fluidity and fungibility of digital information came with fragility. Making matters worse, many of the means for holding and storing digital information were less reliable and much harder to work with than today’s. As a result, computing professionals met their anxiety about—and real challenges of—digital fragility with a new discipline: They started to make purposeful copies. They began to back things up.

A Laboratory for the Office of the Future

PARC in 2022. Cmichel67, CC BY-SA 4.0 <https://creativecommons.org/licenses/by-sa/4.0>, via Wikimedia Commons

In 1970, the well-heeled corporate behemoth Xerox, with a nearly perfect monopoly on the quintessential office technology of photocopying, cut the ribbon on a new and ambitious bet on its future: the Xerox Palo Alto Research Center (PARC). PARC was a large research and development organization composed of distinct laboratories. Several concentrated on extending Xerox’s dominance of photocopying, like the General Science and Optical Science Laboratories. Others, specifically the Computer Science and Systems Science Laboratories, were aimed at a new goal. They would develop computer hardware and software that could plausibly form the basis for the “office of the future” some ten to fifteen years hence, giving Xerox a profound head start in this arena. The previous year, Xerox had leapt into the computer industry through the purchase, for an enormous sum, of the company SDS.

The leadership of PARC scoured the computing community across the United States and recruited what proved to be an astonishing collection of young talent. Part of the attraction PARC held for this cohort was, surely, the fact that the new laboratories held the opportunity to pursue a vision about the future of computing that they already held deeply. In this future, computing would be increasingly personal, graphical, interactive, and networked. Xerox’s deep pockets, and a PARC leadership that shared this vision, proved compelling.

Backing Up the Office of the Future

At PARC, the new recruits wanted to have the same sort of computing environment they had been familiar with in their academic research: a PDP-10 mainframe from the Digital Equipment Corporation running the timesharing TENEX operating system from BBN. Xerox refused. They had just purchased SDS, a maker of timesharing computers, and couldn’t countenance such a major purchase from their prime competitor. The PARC computing crowd responded by simply building their own clone of a PDP-10, calling it MAXC (an eye-poking pun on the name of the founder of SDS, Max Palevsky), and installing TENEX. Immediately, they began to back up what they were creating with MAXC. Using a TENEX program named BSYS, the PARC researchers could store their data and programs on 9-track magnetic tapes. Tape backups had arrived at PARC.

A 9-track tape drive in the collection of the Computer History Museum. https://www.computerhistory.org/collections/catalog/102752062

The next several years to 1975 contained a remarkable flourishing of computing developments at PARC. The researchers created the Alto computer and a swath of novel software for it that, through the subsequent decades, has broadly defined our use of computers. To learn more about this remarkable story, you might start here. Critical to the use and success of the Alto were PARC’s innovations in computer networking, specifically the creation of Ethernet for wired connectivity. Ethernet wove the many Altos across PARC together, further connecting them to the now two MAXC systems as well as a variety of printers. Moreover, PARC researchers developed the PUP networking protocol, allowing Xerox to knit together many local Ethernet networks across the US into a sprawling corporate internetwork.

Individual Alto users could store and back up their files in several ways. Altos could store information on removable “disk packs” the size of a medium pizza. Through the Ethernet, they could also store information on a series of IFSs, “Interim File Servers.” These were Altos outfitted with larger hard drives, running software that turned them into data stores. The researchers who developed the IFS software never anticipated that their “interim” systems would be used for some fifteen years.

With the IFSs, PARC researchers could store and share copies of their innovations, but the ancient anxiety demanded the question: “But what if something happened to an IFS?!” Here again, Ethernet held a solution. The PARC researchers created a new tape backup system, this time controlled by an Alto. Now, using Ethernet connections, files from the MAXC, the IFSs, and individuals’ Altos could be backed up to 9-track magnetic tapes. Later, at the end of the 1970s, the PARC researchers even developed a new program called ARCHIVIST, which ran on a more powerful successor to the Alto known as the Dorado. ARCHIVIST automated the process, allowing researchers to archive to and retrieve files from the IFSs by sending simple commands through electronic mail.
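ARCHIVIST’s mail-driven interface can be imagined as a tiny command parser. The sketch below is purely illustrative: the actual ARCHIVIST command syntax is not documented here, and the command names and file-path format are invented for the example.

```python
# Hypothetical sketch of an ARCHIVIST-style mail command parser.
# The real command syntax is not documented here; the verbs below are invented.
def parse_archivist_command(mail_body):
    """Parse a one-line mail command such as 'ARCHIVE <file>' or 'RETRIEVE <file>'."""
    tokens = mail_body.strip().split()
    if len(tokens) != 2 or tokens[0].upper() not in {"ARCHIVE", "RETRIEVE"}:
        raise ValueError("unrecognized command: " + mail_body.strip())
    # Return the normalized verb and the file reference untouched.
    return tokens[0].upper(), tokens[1]
```

A server loop would read each incoming message body, call the parser, and either copy the named file to the tape system or mail it back to the requester.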

From Backup to Migration

Nearly a decade later, at the close of the 1980s, PARC’s researchers increasingly adopted commercially produced computers from outside the company, rather than the Altos, Dorados, and other systems that they had devised in-house. These outside computers were new workstations produced by a local firm, Sun Microsystems. While the Sun systems were directly inspired by the Altos, they brought PARC closer to the computing mainstream through Sun’s embrace of the Unix operating system and microprocessors. This shift to Sun introduced yet another wrinkle into PARC’s solutions to its archival anxieties.

A Sun workstation in a Stanford laboratory. https://www.computerhistory.org/collections/catalog/102657163

By the start of the 1990s, PARC’s computer researchers began storing their information on new Unix-based servers using Sun’s Network File System (NFS) protocol, which has gone on to be a standard for Unix and Linux systems worldwide. These new PARC NFS servers used 8mm digital tape cassettes for backup. MAXC was decommissioned, and no one used the ARCHIVIST system anymore. PARC had accumulated an impressive thicket of 9-track magnetic tapes holding backups of programs, data, messages, and documents from the astonishing contributions of PARC to computing across the 1970s and 1980s, but now no one was using the 9-track tape systems anymore. With this, a particularly horrible aspect of the ancient archival anxiety came to the fore: “What if I lose the key to my lock box?” “What if I can’t access my backups anymore?” Now backup’s twin, migration, took center stage. PARC’s computer crowd wrote fresh programs that migrated the data from the 9-track tapes to the new 8mm digital tape cartridges. The older tapes were discarded, and the 8mm tapes of this remarkable record of the work of the 1970s and 1980s then sat for another decade.

An 8mm data tape cartridge. Mister rf, CC BY-SA 4.0 <https://creativecommons.org/licenses/by-sa/4.0>, via Wikimedia Commons

Migration Springs Eternal

Like hope, migration springs eternal. In 2003, the researchers at PARC realized that, while 8mm tapes were still in use, other media were becoming more popular. To keep the archive of PARC’s astonishing accomplishments accessible, migration would again be necessary. In that year, PARC researcher Dan Swinehart approached Al Kossow to tackle the challenge. Kossow was then a senior engineer at Apple Computer and already known as a passionate preservationist of both computer hardware and software, especially around the Alto. Kossow was able to transfer all the data from the 8mm tapes to a set of DVD-ROM discs. Again, this unique archive for the history of computing was safe, sound, and accessible—strictly within the confines of PARC.

A few years later, in 2006, Kossow joined the Computer History Museum (CHM) full-time as the Robert N. Miner Software Curator. When he had worked on the migration of the PARC archive to DVD, Kossow had created an extra CD-ROM onto which he had copied almost 15,000 files relating to the work done specifically on the Alto in the 1970s and 1980s, reflecting his keen appreciation for the importance of the Alto in the history of computing. Now at CHM, he and others began an effort to see if PARC would be willing to donate the extra Alto CD-ROM to CHM, and thereby open it up to the world. Testifying to the perseverance of CHM and the sagacity of PARC, the Alto archive CD-ROM was donated to CHM in 2011 with permission to make it public.

Public Translation

CHM now faced a major challenge. How could this nearly four-decade-old software, data, and information be made accessible to today’s public? The information had been created on a now deeply obsolete experimental computer, with research software that no one had touched in decades. Much of the Alto archive was in now arcane formats for printing like “Press” or for document editing like “Bravo.” Certainly, you couldn’t provide the public with working Altos to read the archive.

Paul McJones (right) in 1973, with Edsger Dijkstra. https://mcjones.org/dustydecks/archives/2011/04/

An answer to the dilemma came in 2013, through the contributions of Paul McJones. McJones is a retired software engineer and established software preservationist who had met many (future) PARC researchers while he was working at Berkeley in the 1960s and 1970s. In the second half of the 1970s, McJones had done programming for the new division set up by Xerox to commercialize PARC’s computer innovations. He again worked with many former PARC researchers in projects at DEC’s laboratory in Palo Alto, and then again as a principal scientist at Adobe.

During 2013, McJones crafted a program that processed the Alto archive, creating HTML translations of Bravo files and PDF translations of Press files, and organizing them into a set of web pages for access, search, and browsing. With this further act of migration qua translation, the Alto archive was at last ready to share with the world, and in 2014 https://xeroxalto.computerhistory.org/ went live.

From the Alto Archive to the PARC File System Archive

Since its launch, the Alto archive has proved essential for efforts at both CHM and the Living Computer Museum (LCM) in Seattle. At LCM (which sadly closed during the COVID pandemic), senior software engineer Josh Dersch used the archive and Al Kossow’s Bitsavers repository to build ContrAlto, an emulator for the Xerox Alto that can run on contemporary computers and, in turn, run the software found in the Alto archive. At LCM, ContrAlto was a key part in an impressive Alto restoration that visitors could use. At CHM itself, the Alto archive proved indispensable to a number of projects, ranging from its own restoration of an Alto to a major event on the history of the Alto and a series of video ethnographies of software innovations on the Alto.

Charles Simonyi (standing) and Tom Malloy demonstrate the Bravo word processor on a restored Alto for a 2017 Computer History Museum event. Courtesy Doug Fairbairn.

But what of the rest of the PARC archive from the 1970s and 1980s that resided on the sixteen or so DVDs that remained sitting in a box? Could the rest of the archive be collected by CHM and, through it, be released to the public? Did the archive contain sensitive personal information that should not be released? Did it contain intellectual property that was still vital for PARC, or that was owned by others? Could these types of materials be identified and filtered out?

Once again, Paul McJones offered his expertise and help. Acting as a CHM volunteer, he entered into an NDA (non-disclosure agreement) with PARC enabling him to work there on the remaining archive. He copied the archive from the DVDs to a contemporary hard drive and identified personal files that should be filtered out. He used his translation and organization program to make the remaining archive readable and accessible and transferred it to PARC researchers and legal staff for review. Eventually it was sent to CHM. The resulting archive of nearly one hundred fifty thousand unique files of PARC’s groundbreaking work from the 1970s and 1980s arrived at CHM on a thumb drive and could now be made available to the public.

A screenshot of a Press file, detailing PARC backup procedures, now rendered as PDF in the new archive.

With the new archive, new challenges arose in preparing it for public release. Paul McJones’ program could convert Press and Bravo files to PDF and HTML, making them readable, but not the Tioga files found in great abundance in the new archive. Tioga is the file format for a successor text editor to Bravo that the PARC researchers had created and used extensively in the 1980s. A significant fraction of the archive remained inscrutable. This time, Josh Dersch, the creator of the Alto emulator, answered the call. He was able to supply logic for Paul McJones’ program to render Tioga files as HTML documents. The archive was finally unlocked.

The PARC File System Archive, Unlocked

The nearly one hundred and fifty thousand unique files—around four gigabytes of information—in the archive cover an astonishing landscape: programming languages; graphics; printing and typography; mathematics; networking; databases; file systems; electronic mail; servers; voice; artificial intelligence; hardware design; integrated circuit design tools and simulators; and additions to the Alto archive. All of this is open for you to explore today at https://info.computerhistory.org/xerox-parc-archive. Explore!

One thing that is missing, hopefully temporarily, is the set of files related to the critically significant programming language and environment Smalltalk. Smalltalk is a key piece in both the history of object-oriented programming and that of the graphical user interface. The Smalltalk materials in the archive are currently under review by the company Cincom, which owns significant intellectual property rights in Smalltalk and markets Smalltalk-based software globally today. An additional unresolved question is what 8mm tape backups may possibly remain at PARC for the NFS servers, holding the archives of work done at PARC across the 1990s and in the new millennium. It is a topic for further investigation.

Exploring the Archive: The Unexpected Story of Interscript

What kinds of discoveries await in https://info.computerhistory.org/xerox-parc-archive? I’d like to share something really surprising and fascinating that I came across in the archive—a new story that has enriched my view of a topic in the history of computing that is tremendously important. I hope that it might inspire you to find your own discoveries in the archive.

Take a moment to consider how most writing occurs today. What tools do people most commonly use? Pencil and paper? Pen or brush and ink? Compare that to all the writing that we do through computing: taps on a keyboard—physical or onscreen—assembling texts, messages, posts, mail, lists, and documents of a bewildering assortment. Think also of voice to text, itself a kind of writing, sending messages, submitting search queries, and the like. In many parts of the world today, I think it’s very safe to say that most writing takes place through computing. How did this happen? One thing is certain, it did not happen on its own. How did we make computers write? This question animates my new book project, and while it is at an early stage one finding is absolutely clear: Many of the most innovative minds in the history of computing have devoted an extraordinary amount of time and energy to this very project of making computers write.

One of the episodes in this long historical project is the creation of PostScript, a coding language that afforded the ability for computers to produce high-quality printed pages. It acted as a common language that let you print exactly what you wanted, no matter which computer, app, or printer you happened to be using. I wrote about the story of PostScript a few months ago, when CHM released the source code for PostScript in connection with the fortieth anniversary of Adobe, the company that made it. While you may not be familiar with PostScript, you are certainly intimately aware of a technology that directly developed out of it: PDF.

Adobe cofounders John Warnock (left) and Chuck Geschke (right). Courtesy Adobe Inc. and Doug Menuez.

Adobe was founded in 1982 by two Xerox PARC computer researchers, Chuck Geschke and John Warnock, and their first order of business was to create PostScript. The reason was that the pair had worked with others—Butler Lampson, Bob Sproull, and Brian Reid—on a very similar project at PARC, the coding language Interpress. While Interpress differed from PostScript in some aspects of fundamental approach, the intention behind Interpress was exactly the same: creating a coding language for the high-quality printing of documents. Computers, programs, and printers that could “speak” Interpress would be able to cooperate seamlessly. The Interpress effort had started in Geschke’s laboratory at PARC in 1979, and by 1981 it had reached an advanced state of development. Leadership at Xerox had even agreed that Interpress would become the whole corporation’s standard, but that this would take years to happen. Concerned about that slow pace in the face of rapid developments in computing, Geschke and Warnock left PARC in 1982, forming Adobe to get a standard coding language for printing quickly into the world.

What I stumbled across in https://info.computerhistory.org/xerox-parc-archive reveals part of the story of what happened next for the researchers who had worked on Interpress and who remained at PARC. This was a new effort, initially called InterDoc, later Interscript, that aimed to do for editable documents just what Interpress and PostScript did for printable documents. Perhaps the same approach—creating a new coding language for the interchange of documents between various computers and apps—could work here as well.

The promise of Interscript, in two figures from a 1985 Xerox document. https://bitsavers.org/pdf/xerox/interscript/IntroductionToInterscript.pdf

The Interscript effort, as electronic mail messages held in the archive show, really took off in 1981 as the success of Interpress became clear. Spurred by Butler Lampson and Jim Mitchell, the project also included Brian Reid, who had worked with Lampson on Interpress, as well as Bob Ayers and Jim Horning, who worked especially closely with Mitchell. The Interscript project ran from 1981 into at least early 1984.

An email exchange in the archive, documenting the emergence of the first name for the effort, InterDoc.

What the Interscript team immediately discovered was that editable documents presented a greater challenge than printable documents. Editable documents were inherently dynamic. By definition, they were going to be subject to constant change. And these changes were not just about what words they contained and in what order. These changes were also about the organization and appearance of the text, the layout, from outlined or numbered text to headlines, captions, illustrations, columns, and the like. Creating a coding language that could contend with such dynamic complexity was a true challenge.

Furthermore, the editors that were emerging in the first half of the 1980s ran from the rudimentary to the elaborate. This spread of editor functions was itself another challenge. How could an interchange format work from the simplest to most complex editors? How could simple editors just work on the parts of a document that they could, but leave everything else alone?

The tree structure of Interscript’s layout templates. https://bitsavers.org/pdf/xerox/interscript/IntroductionToInterscript.pdf

To meet the challenges, the Interscript team again turned to computer science. Not only would they turn to a new coding language as part of the solution, but they would also look to one of the key organizational forms of computer science, the “tree” data structure. In it, elements are connected to one another in a hierarchy, just like the trunk of a tree leads to branches, and on to sticks and then twigs, each step in greater profusion. Very roughly put, Interscript would capture the possible layouts of an editable document as a tree structure of possible templates. Careful control algorithms would then guide the “pouring” of the text into the proper templates of the tree. These scripts would allow the editable document to be reconstituted, edited, and then shared between different computers and programs.
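The tree-and-pouring idea can be made concrete with a toy example. The sketch below is not actual Interscript code, and all names, the word-count "capacity" measure, and the depth-first fill order are invented here purely to illustrate the described approach of pouring text into a tree of layout templates.

```python
# Illustrative sketch (not Interscript itself): a layout captured as a tree of
# templates, with text "poured" into the leaf slots in document order.

class Template:
    """A node in the layout tree: a container of children, or a leaf slot."""
    def __init__(self, name, children=None, capacity=None):
        self.name = name
        self.children = children or []
        self.capacity = capacity  # max words a leaf slot holds; None for containers

def pour(template, words):
    """Fill leaf slots depth-first; return {slot name: text} and leftover words."""
    filled = {}
    remaining = list(words)
    def visit(node):
        nonlocal remaining
        if node.capacity is not None:  # leaf slot: take up to `capacity` words
            filled[node.name] = " ".join(remaining[:node.capacity])
            remaining = remaining[node.capacity:]
        for child in node.children:
            visit(child)
    visit(template)
    return filled, remaining

# A page template: a headline over a two-column body.
page = Template("page", children=[
    Template("headline", capacity=3),
    Template("body", children=[
        Template("column-1", capacity=5),
        Template("column-2", capacity=5),
    ]),
])

text = "Interscript pours text into a tree of layout templates for editable documents"
filled, leftover = pour(page, text.split())
```

A simple editor could then rewrite the text in one slot while leaving the rest of the tree alone, which is exactly the property the Interscript team wanted for interchange between editors of very different sophistication.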

A portion of a November 1983 Tioga document, rendered as HTML in the archive, summarizing some of the milestones and motivations of the Interscript project.

Although significant progress had been made on designing Interscript into early 1984, the effort then appears to have ended abruptly. While Butler Lampson, in a telephone interview with me recently, holds that it ultimately ended because it was “naïve” given the complexity of editable documents, another factor was that, at the end of 1983, the Computer Science Laboratory at PARC descended into chaos. This was the laboratory that housed the Interscript project, and its charismatic leader Bob Taylor abruptly resigned, soon followed by half of its technical staff. Lampson left to rejoin Taylor, and Mitchell, temporarily thrust into the position of manager for the unravelling laboratory, himself quickly departed for Acorn Computers.

Unsolved Problems

Remarkably, Lampson explains, no one has yet solved the problem that Interscript set out to address. We still lack a common format for editable documents that can contend with layout well. In his view, only partial and de facto solutions exist. Microsoft’s Word, itself originally directly based on the Bravo editor from Xerox PARC, has become a de facto standard, but only because any editor needs to work with Word documents if it is to be commercially viable. And even so, we all know how layout suffers when moving a document from one editor to another. PDF, with its roots in printed documents, only succeeds in limited ways with editing. For his part, Mitchell believes that the fundamental approaches of Interscript had great promise, and that if they had been more diligently pursued by PARC and Xerox, our lives with electronic documents could have been much different, and for the better.

So here, in a single, small directory in https://xeroxparcarchive.computerhistory.org, lies a fascinating story about making computers write, and an unsolved problem within it. Who knows, perhaps the person who finally solves it will find inspiration in the archive.

Acknowledgements

This archival project, and this article, would have been impossible without the efforts of:
Al Kossow
Paul McJones
Hansen Hsu
Josh Dersch
Butler Lampson
Jim Mitchell
“JKF,” the author of the 1991 README in the rosetta.tar file of the archive at PARC, who is very likely James K. Foote
Tim Curley
Heather Walker
Eric Bier
John Kitchura
PARC, A Xerox Company
Xerox

Additional Resources

David C. Brock, “50 Years Later, We’re Still Living in the Xerox Alto’s World,” https://spectrum.ieee.org/xerox-alto

The Alto in CHM’s flagship exhibition, Revolution: The First 2000 Years of Computing, https://www.computerhistory.org/revolution/input-output/14/347

A selection of video recordings featuring an Alto computer restored by CHM, https://youtube.com/playlist?list=PLQsxaNhYv8dbSX7IyztvLjML_lgB1C_Bb

A 1986 lecture by Alan Kay, “The Dynabook—Past, Present, and Future,” https://www.youtube.com/watch?v=GMDphyKrAE8&list=PLQsxaNhYv8dbIuONzZcrM0IM7sTPQFqgr&index=8

A 1986 lecture by Butler Lampson, “Personal Distributed Computing – The Alto and Ethernet Software,” https://www.youtube.com/watch?v=h33A-KWJKDQ&list=PLQsxaNhYv8dbIuONzZcrM0IM7sTPQFqgr&index=9

A 1986 lecture by Chuck Thacker, “Personal Distributed Computing – The Alto and Ethernet Hardware,” https://www.youtube.com/watch?v=A9n2J24Jg2Y&list=PLQsxaNhYv8dbIuONzZcrM0IM7sTPQFqgr&index=10

Main Image: The Xerox PARC File System Archive, newly released by the Computer History Museum.

 


The post A Backup of Historical Proportions appeared first on CHM.

The Remarkable Ivan Sutherland https://computerhistory.org/blog/the-remarkable-ivan-sutherland/ Tue, 21 Feb 2023 12:50:16 +0000 https://computerhistory.org/?p=26909 CHM releases to the public for the first time a full oral history with Ivan Sutherland, pioneer of computer graphics, virtual reality, asynchronous systems, and more.

The post The Remarkable Ivan Sutherland appeared first on CHM.

In His Own Words

Ivan Sutherland has blazed a truly unique trail through computing over the past six decades. Along the way, he helped to open new pathways for others to explore and dramatically extend: interactive computer graphics, virtual reality, 3D computer graphics, and asynchronous systems, to name but a few.

The Computer History Museum is delighted to make public its two-part oral history with Ivan Sutherland, one of the most influential figures in the story of computing to date. These new oral history interviews present a wonderful opportunity to learn more about Ivan Sutherland’s life in computing directly from the source, with his own reflections and interpretations and in his own words. The transcripts for these interviews can be viewed and downloaded here and here. And the full interview video can be viewed below.

The Museum is deeply grateful to Bob Sproull, a lifelong colleague of Sutherland and himself a major figure in computing, for his roles as instigator, interviewer, and editor for these oral histories, and for involving me, Marc Weber, and Jim Waldo in the effort. The Museum is also delighted to make these oral history interviews public during the 60th anniversary year of Ivan Sutherland’s breakthrough in interactive computer graphics, the program Sketchpad, for which he earned his PhD from MIT in 1963.

A Man of Many Parts

There is a phrase, far more popular in 17th and 18th century England than it is today, that recurs for me when thinking about Ivan Sutherland and the remarkable story of his life in computing: “A man of many parts.” The description was used for an individual who had made serious contributions to a domain, while also possessing multiple, and often diverse, talents and pursuits. The description fits Ivan Sutherland well, but I think it also misses something important: there is a commonality in Sutherland’s multiple contributions and accomplishments, a connective tissue or shared wellspring for his many parts.

To get at this wellspring, start with geometry. From his youth, Sutherland possessed an unusually keen spatial, geometric intuition. In his mind and at his hands, he experienced an immediacy in perceiving how things fit and worked together. Perspective drawing involves a set of techniques to represent a three-dimensional scene on the two-dimensional plane of a sheet of paper or a stretch of canvas. These renderings can proceed in different ways, determined by the number of vanishing points employed. Together the vanishing points define the viewpoint of the observer. One-point, two-point, and three-point perspectives are all very different, providing distinct ways to understand the represented scene.

Thomas Eakins, Untitled (Perspective study of boy viewing object), Hirshhorn Museum and Sculpture Garden, Smithsonian Institution, Washington, DC, Gift of Joseph H. Hirshhorn, 1966. Accession Number 66.1553.A-B

This switching of viewpoints, the ability to look at something from a fresh and unexpected angle, and then to integrate this new perspective with those that came before, seems to me the link between Sutherland’s unusual spatial intuition and his diverse contributions in computing. It’s an ability to find a new viewpoint on a subject, to look at it from this novel perspective, and then to explore how this vantage might change the subject itself through fresh solutions and directions.

Connections and Intersections

Ivan (left) and Bert Sutherland at the Computer History Museum in 2017. Courtesy Doug Fairbairn.

In what follows, I trace some of the lines of Sutherland’s story, intersecting them with related materials held in the Museum’s collection. As recounted in his oral history interviews, Ivan’s life in computing was profoundly shaped by interactions he and his brother Bert had with two central figures in the early history of computing: Edmund Berkeley and Claude Shannon. Bert, who went on to a remarkable career in computing himself, distinguished by his roles as a research manager at Xerox PARC and at Sun Laboratories, told his story in his own oral history with the Museum.

The Sutherland brothers, through a connection of their mother’s, began visiting Edmund Berkeley in New York City from their home in Scarsdale while Ivan was still in grade school. At the time, Berkeley was establishing himself as a leading author, publisher, and consultant for the new world of digital computers. In Berkeley’s offices, the Sutherland brothers encountered his light-seeking robot “Squee,” now in the collection of the Computer History Museum, which also holds some of Berkeley’s papers.

Berkeley’s “robot squirrel,” Squee. Computer History Museum, B1630.01, © Mark Richards. https://www.computerhistory.org/collections/catalog/B1630.01

The Sutherland brothers worked on their own versions of light-seeking robots afterward at home, using surplus parts their engineer-father helped them to source in New York City, and the pursuit became rather long-lasting for Ivan. As an undergraduate engineering student at Carnegie Tech (today’s Carnegie Mellon University), and then again during his early stint as a graduate student at Caltech (before moving to MIT after one year), Sutherland continued to build more advanced, refined light-seeking robots of his own design. The reason? Aesthetics, he explains in his oral history. For Sutherland, engineering design has a strong aesthetic dimension. Beauty and simplicity gave the practice of engineering an aesthetic, an affective pull. “In fact, I think that engineering and art are very closely related,” he explains.

Ivan Sutherland discusses a surplus military gunsight computer his father installed for the brothers in the family kitchen. From the new CHM oral history.

In Berkeley’s offices, the Sutherland brothers also had the opportunity to work with his new creation, Simon, a very simple and inexpensive computer. Unlike the giant mainframes of this era, which relied on thousands of vacuum tubes, Simon was animated by a handful of inexpensive relays—simple electrical on/off switches. Nevertheless, the machine was able to perform mathematical and logical operations.

Berkeley’s Simon. Computer History Museum, 102627259, © Mark Richards. https://www.computerhistory.org/collections/catalog/102627259

Further, Simon was programmable, using instructions encoded on a punched paper tape. During his high school years in the 1950s, Ivan Sutherland was able to devise a working program for Simon, allowing it to perform division, quite a feat for the humble machine. “I’m quite proud of having written a division routine for a two-bit computer when I was in high school,” he explains in the oral history. “So I can almost literally say I’ve been in the computer business nearly all my life.”

Through Berkeley, the Sutherland brothers were introduced to another key figure in the early years of digital computing: Claude Shannon, renowned for his development of information theory. While a maestro of abstraction, Shannon was also a keen builder. When the brothers visited his office at Bell Telephone Laboratories in northern New Jersey, Shannon showed them his creation, Theseus. It consisted of a small maze of movable metal panels affixed to the top of a metal box containing magnets and relay electronics like Berkeley’s Simon. Through the action of the relay electronics and magnets, a toy mouse was able to find its way through the maze and then “remember” the successful route. While the Sutherland brothers were duly impressed, their attempts to recreate this early effort in machine problem-solving and artificial intelligence proved unsuccessful.

Claude Shannon with Theseus. Computer History Museum, 102630792. https://www.computerhistory.org/collections/catalog/102630792

Breakthrough at MIT

After graduating from Carnegie Tech, Ivan Sutherland headed to Caltech for graduate studies in electrical engineering. There, as he recounts in his oral history, he was invited to attend a lunch with Marvin Minsky and Oliver Selfridge, two central figures in digital computing at MIT and the new field of artificial intelligence. Over the meal, Sutherland listened to Minsky and Selfridge’s enthusiastic reports of new computer developments at MIT and its Lincoln Laboratory. Adding to Sutherland’s excitement about the computer activity at MIT was the fact that Claude Shannon had moved there. Sutherland quickly decided to continue his graduate work at MIT, and Shannon agreed to advise him.

Two unidentified women at the controls of the TX-2 in 1962. http://www.bitsavers.org/pdf/mit/tx-2/photographs/2022-10-31/P91-206_RR_127176.jpg

Once at MIT, Sutherland met with Wesley Clark, the designer and impresario of an immensely powerful experimental computer, the TX-2, at MIT’s Lincoln Laboratory. Clark had designed the TX-2 incorporating two critical innovations in computer component technology: high-speed switching transistors and large capacity magnetic core memories. The machine would provide valuable lessons about the use, capabilities, and potential of these new technologies.

A transistorized “flip-flop” logic module from the TX-2. Computer History Museum, 102732767. https://www.computerhistory.org/collections/catalog/102732767

But perhaps more importantly, for Clark the TX-2 had the potential to make real a kind of computing that could become more widespread in the future. As Sutherland explains in his oral history, “Wes took TX-2 and treated it as a window into the future of what computing might be if everybody had one of his own.” Sutherland proposed to use TX-2 to create software for generating engineering drawings. Without hesitation, Clark gave him access to the machine.

Ivan Sutherland discusses the origins of Sketchpad in his new CHM oral history.

In January 1963, Ivan Sutherland successfully completed his PhD on the system he created on the TX-2, Sketchpad. With it, a user was able to interactively, and in real time, create line drawings on the computer’s CRT screen, using a light pen for direct input on the display. Sketchpad afforded many different capabilities for working with these line drawings, such as the automatic completion of shapes, sizing, the ability to copy and repeat elements, and more.

Ivan Sutherland using Sketchpad on the TX-2, circa 1962-1963. Computer History Museum, 102652182. https://www.computerhistory.org/collections/catalog/102652182

A Sketchpad image, from Ivan Sutherland’s dissertation, 1963. Computer History Museum, 102726907. https://www.computerhistory.org/collections/catalog/102726907

For Sutherland, and for many others who experienced and learned about it, Sketchpad represented much more than just a new way to create line art. As he put it in his thesis, “The Sketchpad system makes it possible for a man and a computer to converse rapidly through the medium of line drawings. Heretofore, most interaction between men and computers has been slowed down by the need to reduce all communication to written statements that can be typed; in the past, we have been writing letters to rather than conferring with our computers… The Sketchpad system… opens up a new era of man-machine communication.” A listing of Sutherland’s source code for Sketchpad in the Computer History Museum’s collection is available here, and his 1994 lecture about the history of Sketchpad can be viewed here.

Innovation in the Military

After MIT, Sutherland fulfilled his ROTC commitments to military service by serving in the US Army, first at the NSA, where he continued work on computer graphics, and then as the second director of the Information Processing Techniques Office of ARPA, the Advanced Research Projects Agency of the Department of Defense. Only in his mid-twenties, Sutherland succeeded the MIT psychologist J. C. R. Licklider, who had established the office and its leading role in supporting computer science and artificial intelligence research in the nation.

While Sutherland continued many of Licklider’s projects at ARPA, he added new projects of his own to the mix. Critically for Sutherland, he supported a new effort by Wesley Clark, the designer of the TX-2 who had moved from MIT to Washington University, St. Louis. Clark had created an innovative small computer for an individual user called the LINC, especially suited to the real-time needs of biomedical research, and moved the project and team to St. Louis. (Clark discusses the history of the LINC in a 1986 talk here.) Now, Clark envisioned an entirely new approach to computer design. In it, computers would be built up from distinct units, each unit providing an entire function. In this way, computers could be composed in a flexible and bespoke manner, built with just what was needed for some use, no more. Clark called the approach macromodules, and Sutherland funded the research.

Wesley Clark (left) and Charles Molnar (right) with a LINC computer. Molnar was a key figure in the macromodule research with Clark. Computer History Museum, 102680046. https://www.computerhistory.org/collections/catalog/102680046

The researchers in Clark’s macromodule effort succeeded in building a variety of different units, such as the addition unit below, donated to CHM by Ivan Sutherland. The modularity of this new approach entailed a radical departure in digital computing design. In the mainstream, all the operations of a computer were coordinated by following the regular beat of a single electronic signal, the “clock.” The macromodule approach required an alternate, asynchronous approach to the orchestration of computer operations. The practical challenges and theoretical potential of asynchronous systems became a central passion and focus for Ivan Sutherland thereafter.

An addition function macromodule from Wesley Clark’s research group. Computer History Museum, 102766550. https://www.computerhistory.org/collections/catalog/102766550

Computer Graphics at Harvard and Utah

After his appointment at ARPA, Sutherland accepted a tenured engineering faculty position at Harvard University. There, Sutherland expanded his graphical ambitions from the two-dimensional abilities of Sketchpad to the concept of three-dimensional graphics and a new interface for experiencing them. Sutherland created a laboratory of graduate and undergraduate students alike, aimed at creating views of 3D scenes—drawn with lines—as well as a display worn on the head that would present different views of the 3D scene depending on the direction that the user looked. By the close of the 1960s, they had a system in place that could do just that. This project is frequently cited as an early milestone in the history of virtual reality. Sutherland discusses the project and its relation to virtual reality in this 1996 lecture.

The head-mounted display from Sutherland’s Harvard project. Computer History Museum, 102680042. https://www.computerhistory.org/collections/catalog/102680042

Students of USC professor and VR researcher Scott Fisher image Sutherland’s head-mounted display at the Computer History Museum in 2022 for a project to recreate his laboratory in virtual reality.

Some early results of the USC virtual reality effort.

Soon afterward, Sutherland left Harvard for the University of Utah, and for a new startup he was cofounding to pursue systems for 3D computer graphics. The key partner for Sutherland in both moves was David C. Evans, an accomplished computer researcher. Evans was establishing a computer science department focused on 3D computer graphics, the same focus as the company he was starting with Sutherland. The new company, Evans and Sutherland, moved quickly to produce workstations for creating 3D graphics, beginning with the LDS-1 and then moving on to the very successful Picture System. Other products and efforts became essential to computer animation and to military pilot training.

A page from a brochure for the Picture System. Computer History Museum, 102646288. https://www.computerhistory.org/collections/catalog/102646288

Sutherland and Evans also fostered a remarkably productive and creative community of students in computing and especially computer graphics at Utah, counting the cofounders of Adobe, Pixar, Silicon Graphics, and more among its members. Some of these figures discussed this remarkable environment in a 1994 meeting.

Sutherland’s experiences through his time in Utah comprise just the first half of his story in computing and engineering. Beyond it lies another startup, a faculty career at Caltech, a revolution in VLSI microchip design, a walking-robot project at Carnegie Mellon, venture capital investing, a consulting firm that became the basis for Sun Laboratories, and fresh contributions to asynchronous systems that continue to this day at Portland State. For these stories, Sutherland’s new oral history interviews (Part 1 and Part 2) are an incredible source, as are this event with the Sutherland brothers in 2004 and this retrospective lecture by Ivan Sutherland at the Computer History Museum in 2005.

Main image: Ivan Sutherland. Photo credit: BBVA Foundation.


SIGN UP!

Learn more about the Art of Code at CHM or sign up for regular emails about upcoming source code releases and related events.


The post The Remarkable Ivan Sutherland appeared first on CHM.

The Lisa: Apple’s Most Influential Failure https://computerhistory.org/blog/the-lisa-apples-most-influential-failure/ Thu, 19 Jan 2023 13:31:19 +0000 https://computerhistory.org/?p=26613 CHM publicly releases the source code to Apple's Lisa computer, including its system and applications software.

The post The Lisa: Apple’s Most Influential Failure appeared first on CHM.

Happy 40th Birthday to Lisa! The Apple Lisa computer, that is. In celebration of this milestone, CHM has received permission from Apple to release the source code to the Lisa software, including its system and applications software.

Access the code here.

What is the Apple Lisa computer, and why was its release on January 19, 1983, an important date in computer history? Apple’s Macintosh line of computers today, known for bringing mouse-driven graphical user interfaces (GUIs) to the masses and transforming the way we use our computers, owes its existence to its immediate predecessor at Apple, the Lisa. Without the Lisa, there would have been no Macintosh—at least in the form we have it today—and perhaps there would have been no Microsoft Windows either.

From DOS to GUI

Through the 1970s and even into the early 1990s, a majority of personal computer users interacted with their machines via command-line interfaces, text-based operating systems such as CP/M and MS-DOS in which users had to type arcane commands to control their computers.

Apple II ProDOS Command-line interface. The catalog command shown lists the files on the current disk. Public domain.

The invention of the GUI, especially in the form of windows, icons, menus, and pointer (WIMP), controlled by a mouse, occurred at Xerox PARC in the 1970s, on the Alto, a computer with a bitmapped graphics display designed to be used by a single person, i.e. a “personal computer,” despite the research prototype’s high cost. Key elements of the WIMP GUI paradigm, especially overlapping windows and popup menus, were invented by Alan Kay’s Learning Research Group for their children’s software development environment, Smalltalk.

Screenshot of Smalltalk-78 emulation running at https://smalltalkzoo.thechm.org/HOPL-St78.html. Shown is a demo given by Dan Ingalls to Steve Jobs at PARC in 1979. Overlapping windows were a key new feature of Smalltalk, which was a development environment. Note the lack of icons, buttons, or an ever-present menu bar. Commands, including window resizing, were executed by right-clicking the mouse and selecting from a popup menu.

In 1979, a delegation from Apple Computer, led by Steve Jobs, visited PARC and received a demonstration of Smalltalk on the Alto. Upon seeing the GUI, Jobs instinctively grasped the potential of this new way of interacting with a computer and didn’t understand why Xerox wasn’t marketing this technology to the public. Jobs could see that all computers should work this way, and he wanted Apple to lead the way by bringing this technology out from the research lab to the masses.

From Apple II to Lisa

Apple had already been working on a computer in its own R&D labs to leapfrog the company’s best-selling, but command-line-based, Apple II personal computer. It was code-named “Lisa” after Lisa Brennan (now Brennan-Jobs), Steve Jobs’ child with a former high school girlfriend, whom he initially refused to acknowledge as his own. The code-name stuck, and a backronym, Local Integrated Systems Architecture, was invented to obfuscate the connection to Jobs’ daughter.(1) Unlike the Apple II, which was aimed at the home computer market, the Lisa would be targeted at the business market, would use the powerful Motorola 68000 microprocessor, and would be paired with a hard drive.

After the PARC visit, Jobs and many of Lisa’s engineers, including Bill Atkinson, worked to incorporate the ideas of the GUI from PARC into the Lisa. Atkinson developed the QuickDraw graphics library for the Lisa and collaborated with Larry Tesler, who left PARC to join Apple, on developing the Lisa’s user interface. Tesler created an object-oriented variant of Pascal, called “Clascal,” that would be used for the Lisa Toolkit application programming interfaces. Later, Tesler worked with Pascal creator Niklaus Wirth to evolve Clascal into the official Object Pascal.

Apple Lisa 2 screenshot. Icons on the desktop and the menu bar with pulldown menus at the top of the screen have made their appearance. This interface is very similar to that of the original Macintosh. Photo Courtesy of David T. Craig. CHM Object ID 500004666.

A reorganization of the company in 1982, however, removed Jobs from having any direct influence on the Lisa project, which was subsequently managed by John Couch. Jobs then discovered the Macintosh project started by Jef Raskin. Jobs took over that project and moved it away from Raskin’s original appliance-like vision to one more like Lisa—a mouse-driven GUI-based computer but more affordable than the Lisa.

Steve Jobs with John Couch, VP and General Manager of the Lisa division, showing off the original Lisa, 1983. Photo courtesy of John Couch.

Competition and Collaboration

For a few years, both the Lisa and Macintosh teams competed internally, although there was collaboration as well. Bill Atkinson’s QuickDraw graphics became part of the Macintosh, and Atkinson thus contributed to both projects. Lisa software manager Bruce Daniels actually left the Lisa project to work on the Macintosh for a period of time, greatly influencing the direction of the Mac towards the Lisa’s GUI. Larry Tesler’s work on the object-oriented Lisa Toolkit application frameworks would later evolve into the MacApp frameworks, which used Object Pascal. Owen Densmore, who had been at Xerox, worked on printing on both the Lisa and the Macintosh.

Bill Atkinson’s Apple ID badge. Atkinson was an important figure in the creation of the Lisa, developing key aspects of the user interface. Credit: Folklore.org

Managers in the Lisa development group. From left to right: Wayne Rosing (hardware, later all of Lisa engineering), Larry Tesler (applications software and libraries, user interface design and testing), Bruce Daniels (software, systems architecture). Photo by John Blaustein. Scan of page 97 of Personal Computing Magazine, March 1983, CHM #102661078.

The Lisa’s user interface design underwent many different versions before finally arriving at the icon-based desktop metaphor familiar to us from the Macintosh.(2) Nevertheless, the final Lisa Desktop Manager still has a few key differences from the Mac. One was a document-centric rather than application-centric model. Each program on the Lisa featured a “stationery pad” that resided on the desktop, separate from the application icon. Users tore off a sheet from the stationery pad to create a new document. Users rarely interacted with the application’s icon itself, but rather with these stationery pads.(3) The idea of centering the user’s world around documents rather than applications would reemerge in the 1990s with technologies such as Apple’s OpenDoc and Microsoft’s OLE.

The Cost of Innovation

Lisa was released to the public on January 19, 1983, at a cost of $9,995. This was two years after Xerox had released its own commercial GUI-based workstation, the Star, for $16,595, which was similarly targeted towards office workers. The high price of both machines compared to the IBM PC, a command-line based PC which retailed for $1,565, doomed them both to failure. But there were other significant problems too. The Lisa’s sophisticated operating system, which allowed multiple programs to run at the same time (“multitasking”), was too demanding even for its 68000 processor, and thus ran sluggishly. The Lisa shipped with a suite of applications, including word processing and charts, bundled with the system, which discouraged third-party developers from writing their own software for it. The original Lisa also shipped with an unreliable floppy drive (“Twiggy”), designed in-house.

Brochure showing Lisa 1 screen and Twiggy floppy drives. Brochure text lists the original specs, a 32-bit Motorola 68000 processor (16-bit data bus), 1 MB RAM, and 364 x 720 resolution bitmap display. External ProFile hard disk is not shown. CHM #102634506

From Lisa to the Mac

Announced in the famous Super Bowl ad, the Apple Macintosh shipped in January 1984 for $2,495. Eliminating the hard drive, multitasking, and other advanced features, and greatly reducing the memory, made it much more affordable than the Lisa. An innovative marketing program created by Dan’l Lewin (today CHM’s CEO) that sold Macintoshes at reduced prices to college students contributed significantly to the Mac’s installed base. The advent of PostScript-driven laser printers like the Apple LaserWriter in 1985, combined with the page layout application PageMaker from third-party software company Aldus, created a brand-new killer application—desktop publishing—for the Macintosh.(4) This new market would grow to a billion dollars by 1988, and the Macintosh became the first commercially successful computer with a graphical user interface and a product line that continues to this day.

The Lisa 2 series, consisting of two models, Lisa 2/5 and 2/10, priced at $3,495 and $5,495, respectively, was announced alongside the Macintosh in January 1984. Lisa 2 replaced the original Lisa’s twin Twiggy floppy drives with a single Sony 3.5” floppy drive, the same drive that was in the Mac. In January 1985, the Lisa 2/10 was rebranded as the Macintosh XL with MacWorks, an emulator that allowed it to run Mac software, but despite improved sales this product was killed off in April 1985 to focus on the Mac.(5)

The Lisa 2 series was announced in January 1984, with the Macintosh, as part of the Apple 32 SuperMicro series. Note that the twin Twiggy drives have been replaced by the Mac’s Sony 3.5” floppy drive. Not only did this improve reliability, but it also improved compatibility with the Mac, allowing the two machines to use the same floppy disks. CHM #102689034

The release of the GUI-based Lisa and its successor the Macintosh inspired several PC software companies to create software “shells” that would install GUI environments on top of MS-DOS command-line based IBM PCs. The first of these was VisiOn, released in late 1983 by VisiCorp, the publisher of the first spreadsheet program VisiCalc. This was followed in 1985 by GEM from Digital Research, the company behind the command-line based CP/M operating system. Microsoft followed with Windows the same year.

The Influence of Innovation

Both GEM and Windows were released after the Macintosh and were influenced by user interface elements from the Mac. Though Windows was first released in 1985, it was not widely used by most PC users until Windows 3.0 arrived in 1990. Between Windows and the Macintosh, GUIs have become the primary user interface paradigm on personal computers.

Lisa in use by John Couch’s son, with Couch looking on. The image illustrates “What You See Is What You Get,” with Couch holding a printout that mirrors the drawing on the screen. Despite this marketing image, the Lisa, at $9,995, was not aimed at the home computer market, but rather at office professionals. But, used to selling retail, Apple lacked experience in direct sales, which was how computers were sold to businesses, a strategy IBM had perfected. Businesses also required IBM mainframe compatibility, which the Lisa did not have. Corporate customers preferred the IBM PC, which cost only $1,565. Photo Courtesy of John Couch.

Despite the Lisa’s failure in the marketplace, it holds a key place in the history of the GUI and PCs more generally as the first GUI-based computer to be released by a personal computer company. Though the Xerox Star 8010 beat the Lisa to market in 1981, the Star was competing with other workstations from Apollo and Sun. Perhaps more importantly, without the Lisa and its incorporation of the PARC-inspired GUI, the Macintosh itself would not have been based on the GUI. Both computers shared key technologies, such as the mouse and the QuickDraw graphics library. The Lisa was a key steppingstone to the Macintosh, and an important milestone in the history of graphical user interfaces and personal computers more generally.

NOTES

(1) https://www.folklore.org/StoryView.py?project=Macintosh&story=Bicycle.txt

(2) Roderick Perkins, Dan Smith Keller, and Frank Ludolph, “Inventing the Lisa User Interface,” Interactions 4, no. 1 (January 1, 1997): 40–53, https://doi.org/10.1145/242388.242405. See also https://www.folklore.org/StoryView.py?project=Macintosh&story=Busy_Being_Born.txt 

(3) https://www.callapple.org/modern-apple-computing/the-legacy-of-the-apple-lisa-personal-computer-an-outsiders-view/

(4) John Scull and Hansen Hsu, “The Killer App That Saved the Macintosh,” IEEE Annals of the History of Computing 41, no. 3 (July 2019): 42–52, https://doi.org/10.1109/MAHC.2019.2918094

(5) Owen Linzmayer, Apple Confidential 2.0: The Definitive History of the World’s Most Colorful Company, Rev. 2nd ed. (San Francisco, CA: No Starch Press, 2004), 79–80.



“The Surface State Job” https://computerhistory.org/blog/the-surface-state-job/ Mon, 12 Dec 2022 16:37:25 +0000 https://computerhistory.org/?p=26494 Happy 75th birthday to the transistor! Find out who was REALLY behind its invention.

The post “The Surface State Job” appeared first on CHM.

Celebrating the 75th Anniversary of Bardeen and Brattain’s Invention of the Transistor

We should tell Shockley what we did today

— Walter Brattain

While driving home from Bell Telephone Laboratories’ Murray Hill facility in New Jersey during a “Miracle Month” of intense activity culminating in their demonstration of the first transistor, Walter Brattain told his carpool colleagues, “Bardeen and I produced an amplifier using the field effects at very low frequencies . . . But the next night, I swore them all to secrecy. They weren’t supposed to know anything about this.” [1]

John Bardeen in 1956. Photo courtesy: nobelprize.org

Walter Brattain in 1956. Photo courtesy: nobelprize.org

Walter Houser Brattain (1902–1987) was born to American parents in Amoy (now Xiamen) China. On their return to the US, the family settled on a ranch near Tonasket, Washington. He received a master’s degree from the University of Oregon and a PhD from the University of Minnesota. Brattain joined Bell as a research physicist in 1929, where he was noted as a skilled experimentalist.

Born in Madison, Wisconsin, theoretical physicist John Bardeen (1908–1991) skipped three grades in school as a child prodigy. He earned a master’s degree from the University of Wisconsin and a PhD from Princeton, where he pursued an interest in solid-state physics.

Under the secret code name “The Surface State Job,” their project was an important priority within Bell Labs, the research arm of the American Telephone and Telegraph Company: finding a smaller, lower-power replacement for the bulky, power-hungry vacuum tubes in telephone switching systems. Mervin J. Kelly, the Labs’ research director, believed that crystalline semiconductor materials, such as germanium or silicon, might offer a solution. In 1936 he recruited William Shockley from the Massachusetts Institute of Technology (MIT) to research solid-state materials for this application.

Born in London, England, to American parents, William Bradford Shockley spent his youth in Palo Alto, California, just yards from the famed Hewlett-Packard garage. A precocious child, he was “ill-tempered, spoiled, almost uncontrollable, who made his doting parents’ lives miserable.” [2] He earned a bachelor’s degree at the California Institute of Technology and a PhD in theoretical physics from MIT. Brilliant (Intel’s Gordon Moore commented that Shockley “could see electrons”) but egotistical and volatile, he enjoyed management’s support but was less popular with his peers. [3] “He understood everything but people,” according to Nobel Laureate Charles Townes. [4]

Shockley’s semiconductor amplifier idea

Convinced he could find a solution based on solid materials, in 1939, Shockley wrote, “It has today occurred to me that an amplifier using semiconductors rather than vacuum is in principle possible.” Brattain assisted Shockley with experiments on his idea for what we would call today a field-effect transistor (FET) but achieved no useful result. 

World War II disrupted this work, but it resumed in 1945 when Shockley hired John Bardeen and asked him to see if he could find anything wrong with his design. Bardeen initially concluded that it should have worked.

The FET is a device that uses an electric field to control the flow of current in a semiconducting material. Shockley had published a paper in his MIT days that assumed that electrons near the surface would be as free to move about as those in the bulk of the material. On March 19, 1946, Bardeen determined theoretically that they were not. He concluded that electrons in that region must be trapped, thus creating a surface state that formed a barrier to movement. 

Bardeen and Brattain, aided by physicist Gerald Pearson and chemist Robert Gibney, devoted themselves to figuring out if Bardeen’s theory was correct. By early 1947, in a laboratory experiment, they demonstrated the presence of the barrier. As their manager, Shockley offered suggestions on how to breach the barrier but was not involved in their work on a day-to-day basis.

“The Magic Month”

On Monday, November 17, Gibney suggested that Brattain apply a voltage between a metal plate on the upper surface and a contact on the rear of a slab of germanium crystal to create a strong electric field perpendicular to the surface. A drop of liquid electrolyte at the point where electrical contacts touched the material neutralized the surface state and produced a measurable field effect in the structure. 

Following Bardeen’s suggestion to probe the surface with a sharp metal point surrounded by the electrolyte, on November 21 Brattain produced a functioning amplifier, albeit only at very low frequencies. A couple of weeks of long hours and feverish activity at blackboards and lab benches followed, during what Shockley called “The Magic Month.” Combining fortuitous “accidents” in processing the material with smart intuition in taking advantage of what they learned, the pair achieved amplification without the presence of the electrolyte.

Bardeen calculated that reducing the distance between the two contacts would enhance the effect. Brattain came up with an ingenious approach that involved cementing gold foil onto a plastic wedge and with surgical precision slicing the tip with a razor blade to create two contact points separated by the width of a sheet of paper.

On the afternoon of Tuesday, December 16, 1947, they attached a spring to press the crude contraption firmly against the germanium surface. Brattain found that if he wiggled it just right, “I had an amplifier with the order of magnitude of 100 amplification, clear up to the audio range.” [5] The solid-state semiconductor amplifier was born. 

He and Brattain agreed: “We should tell Shockley what we did today.” [1]

Elements of Bardeen and Brattain’s Transistor. Image: © Computer History Museum

Bardeen seldom discussed his work at home; however, that night, he remarked casually to his wife who was peeling carrots in the kitchen, “We discovered something today.” “That’s great,” she responded automatically. Sometime later, Jane found out that the something was the transistor. [6]

The demonstration 

Shockley admitted that their news “provoked conflicting emotions in me. My elation with the group’s success was balanced by the frustration of not being one of the inventors.” [7] But, realizing the importance of their breakthrough, he arranged a demonstration of the amplifier for Bell executives on Tuesday afternoon, December 23, 1947.

Brattain’s record of the December 23, 1947 demonstration. Courtesy Lucent Technologies 1997.

Brattain recorded in his notebook that with a microphone and headphones, “This circuit was actually spoken over and . . . could be heard and seen on the scope presentation.” Sadly no one remembers what was said, just that it worked. Shockley called it a “magnificent Christmas present.”

Within days after Christmas, Bell Labs’ patent attorneys began to document their work and prepare for a public announcement. As Shockley’s ego-driven, self-promotional activities made him the most visible spokesman for Bell Labs, orders came down the line that no pictures be taken of Bardeen and Brattain without his presence. Publicity photos at the time show him front and center of the scene.

Electronics magazine cover. © McGraw-Hill Publishing Company, Inc

At the first press conference in New York on June 30, 1948, a spokesman claimed the transistor “may have far-reaching significance in electronics and electrical communication.” Unimpressed, The New York Times relegated the story to “The News of Radio” page, below the announcement of a soap opera sponsor.

New York Times report on the announcement of the transistor. Published on July 1, 1948

Brattain’s colleague John Pierce is credited with coming up with the name. Aware that it operated on the principle of trans-resistance, Pierce derived transistor from the associated electronic component called a resistor.

Western Electric, the equipment arm of AT&T, began manufacturing point-contact transistors in 1951 and, by mid-1952, was producing more than 6,000 devices a month, predominantly for telephone switching systems and hearing aids.

Sonotone 1010 (1952), the first commercial hearing aid to use a transistor. Photo: Joe Haupt (CC BY-SA 2.0)

A more practical transistor

According to Brattain, Shockley, who was pushing to incorporate some of his ideas into their patent filing, “called both Bardeen and I in separately, shortly after the demonstration, and told us that sometimes the people who do the work don’t get the credit for it. I told him, ‘Oh hell, Shockley, there’s enough glory in this for everybody.’ ” But he “went off by himself and worked at home, and in a way ceased being a member of the research team.” [1]

Spurred by professional jealousy at not being more visibly involved with the transistor’s invention and a need to maintain his standing relative to his subordinates, Shockley began a month of intense theoretical activity alone. He determined that point-contact transistor operation was not the near-surface field effect that had been assumed but was due to an entirely different structure in the bulk of the crystal called a P-N junction.

As a result of this work, on January 23, 1948, Shockley conceived a distinctly different element, called a junction transistor, that proved to be more reliable and easier to build in volume than the point-contact device. Fabricating working transistors still presented formidable challenges until Bell Labs announced the advance on July 4, 1951. His version became the dominant active electronic building block for the next two decades by enabling new generations of powerful computers.

John Bardeen accepts the Nobel Prize. Walter Brattain waits behind him.

Based on his theoretical contributions to the understanding of semiconductor physics and his invention of the junction transistor, Shockley joined Bardeen and Brattain in accepting the 1956 Nobel Prize in Physics for “researches on semiconductors and their discovery of the transistor effect.”

Sources of Quotations

[1] “Oral History Interview of Walter Brattain - Session II,” May 28, 1974. Niels Bohr Library & Archives, American Institute of Physics. Retrieved on 11.1.2022 from https://www.aip.org/history-programs/niels-bohr-library/oral-histories/4532-2

[2] Transistorized! ScienCentral, Inc. and The American Institute of Physics (1999). Retrieved on 1.16.2016 from: http://www.pbs.org/transistor/album1/shockley/index.html

[3] “Interview with Gordon E. Moore,” March 3, 1995, Silicon Genesis: Oral Histories of Semiconductor Industry Pioneers, Stanford University. Retrieved on 11.1.2022 from: https://landley.net/history/mirror/interviews/Moore.html

[4] “Absent at the Creation,” Ronald Kessler, The Washington Post (April 6, 1997) Retrieved on 11.1.2022 from: https://www.washingtonpost.com/archive/lifestyle/magazine/1997/04/06/absent-at-the-creation/2a432ee5-b1e3-49b9-93f2-ad821d1832dd/

[5] “Oral History Interview of Walter Brattain - Session I,” June 1964. Niels Bohr Library & Archives, American Institute of Physics. Retrieved on 11.3.2022 from https://www.aip.org/history-programs/niels-bohr-library/oral-histories/4532-1

[6] Vicki Daitch & Lillian Hoddeson, True Genius: The Life and Science of John Bardeen, Joseph Henry Press (2002)

[7] John Rhea and Paul Plansky, “Twas Two Days Before Christmas,” Electronic News, December 18, 1972.

For More Information

Michael Riordan and Lillian Hoddeson, Crystal Fire: The Birth of the Information Age (New York: W. W. Norton, 1997)

Joel Shurkin, Broken Genius: The Rise and Fall of William Shockley (London: Macmillan, 2006)

The Silicon Engine - Timeline, 1950s https://www.computerhistory.org/siliconengine/timeline/ 

David A. Laws, “The Lunch That Launched Silicon Valley,” The Bold Italic (Feb 25, 2021) https://medium.com/p/7a3c4d9906f3/edit

Main image: Bardeen and Brattain’s Point Contact transistor 1947. Photo: Bell Telephone Laboratories


The post “The Surface State Job” appeared first on CHM.