Remarkable People Archives - CHM
https://computerhistory.org/blog/category/remarkable-people/

In Memoriam: Gordon Bell (1934–2024)
https://computerhistory.org/blog/in-memoriam-gordon-bell-1934-2024/
May 23, 2024

CHM remembers the incomparable Gordon Bell, a computer pioneer, brilliant engineer, cofounder of the Museum, and CHM Fellow.

Engineer, Entrepreneur, Visionary

Chester Gordon Bell was born in Kirksville, Missouri, on August 19, 1934, and was named after both his father, Chester Bell, an electrician and appliance repairman, and his mother, Lola Gordon Bell, a grade school teacher. Bell gained experience working with electricity at an early age, wiring houses and repairing appliances for the family business, Bell Electric. Hard work and a passion for science experiments and puzzles led him to attend the Massachusetts Institute of Technology (MIT), where he earned his Bachelor of Science and Master of Science degrees in Electrical Engineering. While at MIT, Bell worked with the computing pioneers Jay Forrester and Ken Olsen; all three would become CHM Fellows.

In 1958, Bell was a Fulbright Scholar at the University of New South Wales in Australia, where he taught computer systems design and proposed to his bride-to-be, Gwen, via an English Electric DEUCE, a computer modeled on Alan Turing’s Pilot ACE.

In 1960, Olsen and colleague Harlan Anderson recruited Bell to their new company, Digital Equipment Corporation (DEC). DEC was instrumental in the development of minicomputers—a new class of computer they defined—and Bell quickly became a pivotal figure in the company’s growth through his technical skills and engineering leadership.

Bell’s first major project at DEC was the PDP-1 (Programmed Data Processor-1), one of the earliest minicomputers. The PDP-1 was revolutionary for its time, a powerful yet relatively affordable computer for which Bell designed the I/O subsystem (and invented the UART). Bell’s work on the PDP-1 laid the foundation for the subsequent PDP computer series, including the wildly successful PDP-8 (1965) and PDP-11 (1970) models.

Gordon was respected for both his technical and engineering leadership skills at DEC and Microsoft. Bell (left) at the PDP-6, Digital Equipment Corporation, 1964.

From 1966 to 1972, on leave from Digital, Bell was Professor of Computer Science and Electrical Engineering at Carnegie-Mellon University. He and Allen Newell coauthored Computer Structures (1971), rewritten in 1982 with Dan Siewiorek. The book was an ambitious attempt to compare the computer architectures of the time using a specialized taxonomy the authors had developed, and it was widely used in universities.

In 1972, Bell returned to Digital as vice president of research and development and led the development of DEC’s VAX family of minicomputers (1975–1978), models of which defined technical computing around the world for the next decade, from small laboratories to large multi-user university departments. Despite DEC’s promotion of its in-house VMS operating system, it was on VAX computers that the competing Unix system evolved. In part because of this, the VAX family was arguably the most important group of computer systems of its time, both for the growth of the internet and for the flourishing of academic computer science in universities. Bell was also instrumental, along with Intel and Xerox, in promoting Ethernet as a standard for local area networks, a key milestone in its acceptance (as was the establishment of the IEEE 802 working group).

With Ken Olsen and his then-wife Gwen, Bell cofounded the Digital Computer Museum in Marlborough, Massachusetts (1979), which became The Computer Museum in Boston (1984). With cofounder and networking entrepreneur Len Shustek, it became The Computer Museum History Center in Mountain View, California (1996), and finally the Computer History Museum (1999). (We write now from the mighty oak that has grown from the acorn these visionaries planted.) Since its founding, Gwen and Gordon Bell were generous donors and supporters of the Museum and attracted other generous supporters through their network of personal contacts.

Museum History

Gordon understood before many others that the history of the technology revolution needed to be preserved. The Computer Museum, Boston, was cofounded by Bell in 1979.

Bell left DEC again in 1983 after suffering a heart attack in March of that year. After recovering, he cofounded the supercomputer startups Encore Computer, Ardent (1986), and Stardent (1989). The goal at Encore Computer and its follow-on companies was to build computers with supercomputer-level performance and visualization at lower cost by using off-the-shelf components.

In the 1980s, Bell became involved with the National Science Foundation to help drive its investments in supercomputing, and in 1987 he established the Gordon Bell Prize in conjunction with the Association for Computing Machinery to drive innovation in parallel processing—a prize that remains highly sought after by the high-performance computing community.

In recognition of his efforts, in 1991, President George H.W. Bush awarded Bell the National Medal of Technology “for his continuing intellectual and industrial achievements in the field of computer design; and for his leading role in establishing cost-effective, powerful computers which serve as a significant tool for engineering, science, and industry.”

Gordon was awarded the National Medal of Technology by President George H.W. Bush in 1991.

In the 1990s, Bell was instrumental in convincing Microsoft to establish a research arm, and he worked at Microsoft Research from 1995 to 2012 on various projects, including his daily “lifelogging” application, MyLifeBits.

Gordon pioneered wearable technology with his passion for “lifelogging.” Shown here with his wearable camera on the cover of IEEE Spectrum, 2005.

In 2003, Bell was named a Computer History Museum Fellow, “For his key role in the minicomputer revolution, and for contributions as a computer architect and entrepreneur.”

Gordon Bell accepts his 2003 Fellow Award.

Gordon was a founding trustee of the Computer History Museum in Mountain View. Shown here with his wife Sheridan Sinclaire-Bell, at the CHM Fellows Awards, 2019. Bell was made a CHM Fellow in 2003.

Bell could think at the micro and macro scales with ease. After decades of observation, he proposed his eponymous law: “Roughly every decade a new, lower priced computer class forms based on a new programming platform, network, and interface resulting in new usage and the establishment of a new industry.” This is an observation-based conclusion that uses the history of computing as its lodestar: mainframes, minicomputers, microcomputers, smartphones, the Cloud . . . all are chapters in Bell’s “book” of computer classes.

Datamation magazine called Bell “the Frank Lloyd Wright of computing.” As an architect of computing systems across five decades, he not only witnessed but personally shaped our civilization’s transition from room-sized mainframes to the personal computer and beyond, making the computer (“the most incredible invention of the 20th century,” as Bell called it) smaller, more interactive, more affordable, and ubiquitous. Gordon was like a meteor, beginning life in rural Missouri and streaking across our skies with ideas, enthusiasm, and energy.

With heavy but grateful hearts we say, “Farewell, Gordon. It has been the honor of a lifetime to share in your remarkable life.”

Thoughts and Tributes

Gordon Bell passed away on May 17, 2024. He is survived by his first wife, Gwen, and his second wife, Sheridan Sinclaire-Bell, whom he married in 2009; his son, Brigham, and his daughter, Laura Bell, both from his first marriage; his stepdaughter, Logan Forbes; his sister, Sharon Smith; and four grandchildren.

Below, members of the CHM community remember Gordon.

Len Shustek, Cofounder, Computer History Museum

I am so sorry to hear this news. Gordon was such a force of nature that it seemed like he would go on forever. I knew him first by reputation when I was a computer science student in the ’60s. I’ve known him in person from the day in 1995 that he showed up in my office to talk about the computer museum I wanted to start and said, “I have a deal for you.” He did, and it worked. Gordon was brilliant. He was opinionated. He wasn’t always consistent, but he was always passionate. He made things happen, and the world is better for it. He will be missed by many, including me.

Jack Dongarra, 2021 ACM Turing Award

Gordon was a friend, colleague, and mentor. He established his prize, initially set at $1,000, for the best speedup of a real application running on a real machine. The first year’s prize was awarded to the entrant who demonstrated the highest speedup. Subsequent winners had to double the previous winner’s speedup until either the speedup reached 200 times that of the sequential application or 10 years had passed—whichever came first. Today, the Bell Prize is highly sought after. Gordon was a special person in many ways. His work and our memory of him will be with us for a long time, and his presence will be sorely missed.
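
To make the arithmetic of that doubling rule concrete, here is a minimal sketch in Python (illustrative only, not part of Dongarra’s tribute); the 10x first-year speedup is a hypothetical starting point, while the 200x cap and 10-year limit come from the description above.

```python
# Minimal sketch of the escalating Gordon Bell Prize targets described
# above. Assumptions: a hypothetical 10x first-year speedup; the 200x cap
# and 10-year limit are taken from Dongarra's description.

def doubling_targets(first_speedup=10, cap=200, max_years=10):
    """Return (year, required_speedup) pairs under the doubling rule."""
    targets = []
    required = first_speedup
    for year in range(1, max_years + 1):
        if required > cap:
            break  # doubling has passed the 200x cap, so the scheme ends
        targets.append((year, required))
        required *= 2  # each winner must double the previous speedup
    return targets

for year, required in doubling_targets():
    print(f"year {year}: required speedup of {required}x or better")
```

Starting from a 10x win, the requirement climbs to 160x by year five; the next doubling would exceed the 200x cap, ending the escalation well inside the 10-year window.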

Donna Dubinsky, Former CEO, Palm, Inc. and CHM Trustee

I had the opportunity to interact a lot with Gordon through some of the early years at CHM. I always found him to be an inspirational and original thinker. Gordon was one of those people where I leaned in when he was talking. I knew that something fresh, interesting, and unexpected would come forth.

Ray Ozzie, Former CTO, Microsoft

I can’t adequately describe how much I loved Gordon and respected what he did for the industry. As a kid I first ran into him at Digital (I was then at DG) when he and Dave were working on VAX. So brilliant, so calm, so very upbeat and optimistic about what the future might hold. The number of times Gordon and I met while at Microsoft, with him acting as a sounding board and helping me through challenges I was facing, is uncountable. Gordon, I and we will miss you. Thank you for what you gave to each of us, and to all. You were one of my heroes.

Professor John Gustafson, First Gordon Bell Prize Winner, 1988

Gordon Bell had an innate sense of what computers could be and should be. He was not only a pioneer in computing, but the creator of other pioneers through the challenges he set forth. I know of no other person in computing who has done so much to help the careers of others, while also advancing the industry.

Eric Hahn, Founding Partner, Inventures Group

Gordon changed my life. In 1975, Gordon had the generosity and grace to arrange for a pimple-faced 15-year-old to purchase a PDP-8. That teenager was me, and that PDP-8 was the starting point of what became a 50-year career in software development. I don’t believe a day has gone by when I am not thinking about or writing software—a direct result of Gordon’s generosity. As luck would have it, in 2000, I was given the chance to return some of my good fortune to Gordon through the Computer History Museum by helping to fund the move of Gordon’s collection of DEC artifacts to its permanent home on the West Coast. Gordon’s contributions to the industry are legendary, but I will never forget how he transformed my own small life. I will miss him dearly.

In Memoriam: Niklaus Wirth (1934–2024)
https://computerhistory.org/blog/in-memoriam-niklaus-wirth-1934-2024/
January 5, 2024

CHM remembers CHM Fellow Niklaus Wirth, developer of the Pascal programming language, who passed away on January 1, 2024.

Increasingly, people seem to misinterpret complexity as sophistication, which is baffling—the incomprehensible should cause suspicion rather than admiration.

— Niklaus Wirth

With sadness we note the passing of Niklaus Emil Wirth, Swiss computer scientist and 2004 CHM Fellow, who died at the age of 89 on January 1, 2024.

Niklaus Wirth was born in Winterthur, Switzerland, in 1934. He received the degree of electronics engineer from the Swiss Federal Institute of Technology (ETH-Zurich) in 1959, an MSc from Laval University (1960), and a PhD in electrical engineering and computer science from UC Berkeley (1963).

Upon graduation from Berkeley, Wirth became an assistant professor at the newly created computer science department of Stanford University. From 1968 until his retirement in 1999, he was a professor at ETH in Zurich. There, he developed the programming languages Pascal (1970), Modula-2 (1979), and Oberon (1988). Pascal, in particular, became a widely used programming language in computer science education and influenced a generation of students and professional programmers.

Following two separate sabbatical leaves at the Xerox Palo Alto Research Center (PARC) in California, Wirth became an enthusiastic adopter of the groundbreaking workstations he saw there, and returned home inspired to build similar systems. While doing so, he simultaneously created several elegant and useful programming languages and environments that had profound research implications.

Wirth contributed to both the hardware and software aspects of computer design and wrote influential books on software engineering and structured programming. Among other recognitions, he received the ACM Turing Award (1984).

Main Image: Professor Wirth (right) receiving the 2004 CHM Fellow Award from CHM CEO John Toole (left).

Learn More

Niklaus Wirth items in CHM’s permanent collection include:

Lecture: “Odysseys in Technology: How I Became Interested in Programming Languages,” by Niklaus Wirth. [https://www.computerhistory.org/collections/catalog/102703182]

“The Personal Computer Lilith,” by Niklaus Wirth. [https://www.computerhistory.org/collections/catalog/102680745]

“MODULA-2” by Niklaus Wirth. [https://www.computerhistory.org/collections/catalog/102721252]

Computer History Museum Fellow Awards 2004 Video. [https://www.computerhistory.org/collections/catalog/102705912]

Other Items related to Wirth and his work: https://www.computerhistory.org/collections/search/?s=wirth&page=2

In Memoriam: John Warnock (1940–2023)
https://computerhistory.org/blog/in-memoriam-john-warnock/
August 22, 2023

CHM remembers John Warnock, Adobe cofounder, who passed away on August 19, 2023.

A Mathematician with an Artist’s Eye

I draw, I paint, I studied geometry, I’m a mathematician. I’ve never separated my visual world and the world of mathematics.

— John Warnock

Adobe has announced that cofounder John Warnock died at the age of 82 on August 19, 2023. His death will be a loss to many, most of all his closely knit family, but extending as well to many professional colleagues and, indeed, to us at the Computer History Museum.

Among his many interests, Warnock was engaged with history. He collected books from across the history of science, technology, and mathematics. He donated artifacts and documents from the history of Adobe to CHM, where he became a Fellow in 2002; he participated in our workshop on the history of desktop publishing in 2017 and was instrumental in facilitating our recent public release of the PostScript source code. Indeed, we were working with Warnock to finalize our two-part oral history interview with him when we received the news of his passing. We could think of no better way to remember him than to make this interview available today.

CHM’s Oral History with John Warnock, Part 1. Download transcript here.

CHM’s Oral History with John Warnock, Part 2. Download transcript here.

Remembering John Warnock

The Computer History Museum collaborated with Adobe to create the Adobe Experience Museum on the ground floor of the new Adobe Founders Tower in downtown San Jose. Shown here are John and Marva Warnock at the ribbon cutting in January 2023.

Early Years

John Warnock was born in the suburbs of Salt Lake City, Utah, in 1940. His childhood home was thoroughly artistic, with parents and siblings alike drawing and painting. By his own description, Warnock was a “lackluster student,” but that by no means meant he was without interests. Working as a “stock boy” in a photography shop during the summers fueled, and financially supported, what turned out to be a lifelong passion for photography.

A major shift in Warnock’s life, however, came through a dynamic high school mathematics teacher. Warnock said, “He had a very interesting way of approaching things. He said, ‘We’re going to use a college trigonometry book. How many of you can solve all the problems in the book?’ A week later, a guy would come in with a stack of paper and have solved all the problems in the book. He sort of challenged people, and a big part of the class would do it. I mean, just do it. He was an amazing guy and I fell in love with mathematics . . . it changed me from being a non-student to a good student.”

Graduating from high school in 1958, Warnock followed family tradition and attended the University of Utah in Salt Lake City. While he was set on studying mathematics, little else was decided: “I thought I’d probably end up being a teacher at that time. I really hadn’t thought about it much. I guess I’d thought about being a photographer or something like that, but I would say teaching or photography or something like—I really didn’t think about it much.”

Finishing his undergraduate studies in just three years, Warnock stayed on at Utah for a master’s in mathematics. From there, he got a job at IBM in 1963 where he had his first exposure to computers: “Well, with mathematics behind you and learning FORTRAN and things like that, it was fairly straightforward. I learned how to do that. But mostly when you . . . started visiting customers and solving real problems, that’s when you started to learn about operating systems, how [they] worked. As my customers would always say, ‘You’re here so we can train you’ . . . So I learned how to program.” One of the accounts he had at IBM was the University of Utah Computer Center.

Discovering Computer Graphics

The prospect of being drafted for the Vietnam War led Warnock back to the University of Utah and a deferment so that he could pursue a PhD in mathematics. To support himself, he worked in the university’s Computer Center, continuing much of the work he had done as an IBM employee. There, he encountered Gordon Romney, the first graduate student in the new Computer Science Department being formed by Dave Evans, who had come to Utah with a huge ARPA contract to make it a center of computer graphics research: “They were experimenting with displays and how you do three-dimensional stuff on displays. One day Gordon Romney, who was working on it, came in and he said, ‘I have a problem,’ and he sort of described the hidden surface problem to me. I was in the Computer Center. I thought about that problem and then said, ‘Gee, here’s a way you could solve it.’ Gordon said, ‘You probably ought to talk to Dave.’”

John Warnock’s journey into computer graphics at the University of Utah.

Warnock did, and soon switched to pursuing a PhD in computer graphics under Dave Evans. With Evans’ recruitment of Ivan Sutherland to the Utah faculty, it did indeed become the most important site of computer graphics research, and Warnock was in the thick of it. The department also connected Warnock to the larger community of ARPA-supported computer science researchers across the country, which had a profound impact on him and on computing: “It was great. No, it was really great. I think that’s essentially sort of what launched all computing that we know today.” Warnock made significant strides in creating 3D color images as part of his graduate studies at the close of the decade, and then found immediate work in computer services for a few years.

An early color 3D graphics image by John Warnock from the late 1960s. Courtesy John Warnock.

In the early 1970s, Dave Evans recruited Warnock back into the computer graphics fold, hiring him into the firm Evans and Sutherland, which Evans and Ivan Sutherland had started in Utah to pursue the commercial potential of 3D computer graphics and animation. Soon, Warnock was opening a new Evans and Sutherland office in Mountain View, California, to work on an ambitious simulation of New York Harbor, powered by 3D color graphics, for training ship pilots.

With his success with the harbor pilot simulator, and with a reorganization of Evans and Sutherland in the works, Warnock was asked to move back to Utah to become a vice president at the company. Interested in remaining in the Bay Area, he instead reached out to William Newman, a friend from Utah computer graphics who was now working at the nearby Xerox Palo Alto Research Center (PARC). Newman quickly connected him to Chuck Geschke, another member of the ARPA community of computer scientists, who was also working at PARC. The pair met, and as Warnock recalled: “We’re roughly the same age, and roughly the same education. We have the same number of kids that are the same ages. We both refereed soccer. We hit it off well. He said, ‘I’m starting a new laboratory called the Imaging Sciences Laboratory. I’d like to have you interview.’”

Warnock quickly joined Geschke’s lab: “My primary responsibility was to make device-independent graphics, so that a piece of software could drive not only the bitmap display, it could drive color displays or grayscale displays, and deal with a much broader spectrum of stuff. But, also, potentially could drive high resolution printers.” This work directly fed into a new challenge for the group. PARC’s researchers had created a radical new kind of networked personal computer, the Alto, that Xerox was now turning into a real product. As Warnock recalled: “[Xerox] has started to make the Star a commercial product . . . They found that they had no universal printer protocol, and so they said, ‘Well, we’ve got to sell printers to these guys. Why don’t we have PARC design a printer protocol for Xerox?’ This was also one of the most fascinating projects in the world, because we very rarely met. We did the whole design via email, and we’d have arguments via email, and everything.”

Cofounding Adobe

Warnock put much of what he had learned about computer graphics at Evans and Sutherland and PARC into this new effort, and the printing protocol that came out of it, called Interpress. While he and Geschke thought it had significant potential for Xerox and for the computing world at large, Xerox told them it would be over a half-decade before it could be fully implemented by the company. Warnock recalled: “I went into his office, and I said, ‘We can live in the world’s greatest sandbox for the rest of our life, or we can do something about it.’” They did, leaving PARC to create Adobe in 1982.

John Warnock on cofounding Adobe.

Chuck Geschke (seated) and John Warnock (standing) at Adobe. © Doug Menuez/Stanford University Library Systems

At Adobe, they would create this universal printing protocol—allowing diverse computers and programs to create high quality text and images on a great variety of printers—more fully along the lines Warnock thought best. Everything to be printed, including fonts, would be treated as geometry, as mathematically described shapes. Adobe’s protocol, PostScript, also used careful mathematics to ensure the quality of text on printers and screens. As Warnock recalled: “The first time we tried this, it just worked like a champ.”
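
Warnock’s geometric approach is concrete enough to sketch. Below is a minimal Python illustration (not Adobe’s code) of the kind of mathematically described shape involved: PostScript outlines are built from path segments such as cubic Bezier curves (the curveto operator), and the control points here are invented purely for the example.

```python
# Minimal sketch of a glyph outline segment as pure geometry: a cubic
# Bezier curve, the kind of mathematically described shape that
# PostScript-style outlines are built from. Control points are invented.

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1]."""
    u = 1.0 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return (x, y)

# Sample a few points along one outline segment of a hypothetical glyph.
for i in range(5):
    t = i / 4
    print(cubic_bezier((0, 0), (10, 40), (40, 40), (50, 0), t))
```

Because the shape is stored as equations rather than pixels, the same outline can be rendered crisply at any size on any device, which is the heart of device-independent printing.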

PostScript became a roaring success for Adobe, both for its mathematical approach and for a savvy insight by its founders. They licensed fonts from the biggest typography houses worldwide: “Chuck and I had a very strong feeling that it was worth your weight in gold if you could license the real thing. Now, you could have the same outlines, but if you couldn’t call it Times Roman, and you couldn’t call it Helvetica, the design community wouldn’t trust it.”

Alliances were essential for the success of Adobe’s digital press. This photograph from the January 1985 debut of Apple’s LaserWriter, which relied on Adobe’s PostScript, captures the alliances Adobe forged, alliances that would prove critical to the growth of the company. At far left is the then-president of the Linotype Group, Wolfgang Kummer, who licensed Linotype’s famous fonts; next to Kummer is John Warnock, Adobe cofounder and architect of PostScript; followed by Steve Jobs, cofounder and then chairman of Apple; and finally Aaron Burns, cofounder of the International Typeface Corporation, which also licensed its large typeface library to Adobe. Courtesy John Warnock.

Applying Life Lessons

Following the success of PostScript, Warnock went on to become a champion for important new application products for Adobe. He was deeply involved with the creation of Illustrator, a popular drawing application closely tied to PostScript technology. He was a key champion for Photoshop, the pathbreaking application for digital photography: “I started my own dark room I guess when I was 15 years old and I started mixing my own chemicals for the dark room when I was like 16 or 17 and I had my own enlarger and I had a graphics camera and I was on the yearbook photography staff, so I’ve had a long history with cameras and photography. And I learned to do a lot of things in the dark room and I started seeing those in Photoshop, the things you could do in a dark room and then it became totally obvious that anything you could do with an image, you could do in software.”

John Warnock on how his experiences in photography prepared him for Photoshop.

Importantly, John Warnock was responsible for the creation of PDF. Early on in the company, he had figured out a way to modify PostScript so that printing would happen faster in Steve Jobs’ first demo of the LaserWriter, Apple’s first laser printer for the Macintosh. Years later, in 1991, he envisioned that this same kind of modification would be key to creating a new kind of electronic document that would have high visual quality but also be safe for electronic sharing through email or the internet. From the launch of this new format, PDF, in 1993 and for years afterward, Warnock championed the effort. After several years, PDF caught on like wildfire, and has become an international standard for digital documents and their exchange. In all of these application efforts, Warnock kept the user perspective at the forefront, because he was one himself: “And from day one, I’ve used Illustrator every day of my life. And I’ve used Photoshop every day of my life. And I’ve used Acrobat every day of my life.”

A Legacy

PostScript. Illustrator. Photoshop. Lightroom. PDF. InDesign. Behind all of these technological success stories, Warnock saw the importance of culture, of bringing people together in a way they could seize opportunity sustainably: “I think a big part of the company . . . is the culture of the company. And one of the things that Chuck’s always said is always hire people who are smarter than you. And we’ve also tried to be very egalitarian . . . it’s always been giving people opportunity, it’s always taking care of people, being fair . . . [W]hen Chuck and I first started the company, we explicitly said to each other, we want to build a company that we would like to work for. Okay, this is sort of number one. We also felt that transparency is really, really, really important . . .We sort of tried to bend over backwards to keep both the cultural and ethnic distribution of the company and the male-female balance there and I think we do really well at that . . . always believed that diversity, hybrid vigor is great!”

John Warnock on the values behind Adobe’s corporate culture.

While the deaths of John Warnock and Chuck Geschke are truly a great loss, their commitments to technological innovation, artistic excellence, diversity, and humane values are enduring examples of lives well led, examples that should be studied by generations of technologists to come.

Main image: © Doug Menuez/Stanford University Library Systems

In Memoriam: Gordon Moore (1929–2023)
https://computerhistory.org/blog/in-memoriam-gordon-moore-1929-2023/
March 25, 2023

CHM remembers the visionary Gordon Moore, who pioneered the development of the integrated circuit and forever changed our world.

Architect of our Digital World 

Gordon Earle Moore died on March 24, 2023, at the age of 94. Moore was a man of parts, with wide-ranging talents and accomplishments. His most longstanding commitments were those to his family—as husband of 73 years to Betty I. Moore and as father to sons Ken and Steve Moore—to the study of science and technology, and to sport fishing. Moore was also a PhD chemist, semiconductor manufacturing technologist, self-described “accidental entrepreneur,” industrial R&D leader, corporate executive, venture investor, and philanthropist.

Moore is perhaps most widely known for the phenomenon of “Moore’s Law,” the developmental dynamic in silicon microchips that, for over a half-century, resulted in exponential increases in the complexity and functionality of microchips with accompanying exponential decreases in the cost of digital electronics. This dramatic increase in the functionality and affordability of digital electronics in the form of the silicon microchip has been foundational to the widespread use of digital electronics and computation globally, and in all areas of society and culture, thereby producing our contemporary digital world. Through roles in technology and business leadership at the two most important silicon microchip firms to date—Fairchild Semiconductor and the Intel Corporation—both of which he cofounded, Gordon Moore served as a key architect of our digital world.

A 16-minute biographical film about Gordon Moore, cowritten by Computer History Museum historian David C. Brock.

Origins

Born on January 3, 1929, in San Francisco, Moore spent his first decade in the coastal farming village of Pescadero, north of Santa Cruz on the San Francisco Peninsula. Moore’s family had been the first Anglo settlers of the village in the 1840s, and for most of his life he lived and worked within a forty-mile radius of his childhood home. Moore’s father was a deputy sheriff, and his mother was a Pescadero native. It was in Pescadero, appropriately, that Gordon Moore developed his lifelong passion for sport fishing.

The family moved to Redwood City at the close of the 1930s, where Gordon Moore attended the public high school. It was in Redwood City that Moore was first exposed to a chemistry set and, with it, the capacity to make explosives. Moore soon turned a backyard shed into a sophisticated if risky explosives laboratory, undertaking various experiments with rocketry and detonation. Like many chemists before him, it was “flashes, bangs, and stinks” that drew Moore into the study of chemistry. 

One of the first in his family to pursue higher education, Moore enrolled at San Jose State as his first step toward becoming a chemist. In his two years there, before transferring to the University of California, Berkeley, Moore met Betty I. Whitaker, a friendly, savvy journalism major. They married in 1950, immediately after Gordon’s graduation with a BS in chemistry and days before he was to begin his graduate studies in chemistry at the California Institute of Technology (Caltech). Betty Moore was herself a native of the San Francisco Peninsula and a devotee of sport fishing.

An oral history interview with Gordon Moore in 2008. The transcript is available here.

Early Career

Moore quickly earned his PhD in physical chemistry at Caltech, where he specialized in experimental research in the field of infrared spectroscopy. His work required precise measurement, careful mathematical analysis, and intricate equipment for the careful control of difficult materials. Moore then spent just over two years as a research chemist, refining these same skills, at the Applied Physics Laboratory, the guided missile laboratory of the US Navy operated by Johns Hopkins University. While there, he attended a public lecture by William Shockley, one of the nation’s preeminent physicists and the recent inventor of the junction transistor at the Bell Telephone Laboratories. Interested in more directly applied work, Moore explored a variety of positions, including a post at the Lawrence Livermore National Laboratory—then at the peak of its prominence—which Moore turned down. It was at Livermore that William Shockley found Moore’s resume, prompting an out-of-the-blue telephone call from Shockley to Moore with the offer of a job back on the San Francisco Peninsula.

William Shockley (seated) with his Bell Labs colleagues John Bardeen (standing left) and Walter Brattain (standing right) in 1948. The trio later shared the Nobel Prize in physics for their pioneering work on the transistor. Credit: Bell Laboratories.

In 1955 and 1956, Shockley was busy assembling key staff for a new laboratory in Mountain View, California. Shockley had left Bell Labs and joined forces with another Caltech PhD chemist, Arnold Beckman, who had built a large electronics and instrumentation firm, Beckman Instruments. Beckman, who already had two subsidiaries in the Bay Area, contracted with Shockley to create a new laboratory there, the Shockley Semiconductor Laboratory of Beckman Instruments, Inc., to develop and then mass produce a promising new form of junction transistor, the double-diffused silicon transistor. The new transistor was thought to have significant market potential for military applications and industrial automation. The manufacturing of these transistors was fundamentally a chemical process, and Shockley therefore needed experienced, sharp experimental chemists in addition to electrical engineers, physicists, metallurgists, and mechanical engineers. Moore jumped at the chance to serve as one of these chemists, to apply his chemical knowledge and experimental skills to the forefront of electronics technology, and to return to the Bay Area.

The staff of Shockley Semiconductor salute William Shockley at a champagne brunch in honor of his Nobel Prize. Read the transcript of the 2006 oral history interview with several of the staff of Shockley’s laboratory.

The Shockley Semiconductor Laboratory quite literally brought silicon electronics to what became known as Silicon Valley, but its impact was broader. By 1957, Moore and seven other members of Shockley’s staff had become profoundly dissatisfied with the technological and strategic direction of the laboratory and with its management by William Shockley and Arnold Beckman. Having voiced to both men their concerns about the lack of focus on and commitment to producing the double-diffused silicon transistor, and having had their proposals rebuffed, the eight dissidents decided to leave.

Fairchild Semiconductor

Cofounders of Fairchild Semiconductor at their first location at 844 Charleston Road in Palo Alto. From left to right, Vic Grinich, Gordon Moore, Robert Noyce, Julie Blank.

Realizing that the dissident group possessed all of the fundamental skills and knowledge to produce the double-diffused silicon transistor, and working with the New York investment bank Hayden Stone, the group eventually forged an agreement with defense contractor Fairchild Camera and Instrument to create a new firm, Fairchild Semiconductor Corporation, to manufacture the new silicon transistor. This was the first Silicon Valley spinoff, which would set the mold for hundreds of semiconductor and, later, IT startups and spinoffs.

At Fairchild Semiconductor, Moore made direct contributions to the technology for making the double-diffused silicon transistor and increasingly took on greater responsibility for the overall technology strategy for the firm. Fairchild Semiconductor’s technological and business success came rapidly. In just five years, with Moore directing R&D, the firm created a profitable business in silicon transistors and diodes, made cofounder Jean Hoerni’s conception of a new “planar process” for making devices into a technology adopted by the global semiconductor industry for making silicon electronics, and used the planar process to create the astonishingly successful silicon planar integrated circuit—the silicon microchip as we have come to know it. Further, the Fairchild Semiconductor founders all achieved financial independence when the parent company exercised its right to buy the firm, making it a division of Fairchild Camera and Instrument. 

Gordon Moore’s original patent and laboratory notebooks from his career at Fairchild Semiconductor are preserved in the Computer History Museum’s collections, along with hundreds of other Fairchild notebooks and documents. Digital copies of the Moore patent notebooks are available here: Notebook LN #2, Notebook # 6, Notebook LN #3, Notebook LN #4.

Moore and Noyce at Fairchild Semiconductor.

Among the original cofounders, Robert Noyce and Gordon Moore had become first among former equals. Noyce was the general manager for the semiconductor division, while Moore ran R&D. With the huge opportunity in electronics represented by the silicon microchip, the other six cofounders rapidly left for new silicon microchip startups. Moore was uniquely positioned to grasp and analyze this opportunity, as he had intimate knowledge of the silicon manufacturing technology and of the process of developing silicon microchips for digital electronics.

Moore’s Law

In 1963, Moore began to publish his analysis of the silicon microchip opportunity. Moore believed that the basic approach for manufacturing silicon microchips, a kind of chemical printing process, was open to dramatic refinement and extension. In short, with enough investment of creative effort, time, and money, the chemical printing process could be made almost arbitrarily better. There was no fundamental brick wall preventing future progress, at least not for decades. With this, Moore saw that the extensibility of the chemical printing process would favor, both in terms of cost and performance, making silicon microchips with ever more digital functionality. Ever tinier transistors would be formed together in ever greater numbers on silicon microchips that would perform more and more digital electronics. Silicon microchips would make digital electronics profoundly cheaper over time. As a result, they would eventually pervade all of electronics, and dramatically increase the use of digital electronics in society.

In a 1965 article presenting this analysis, appearing in the trade magazine Electronics under the title “Cramming More Components onto Integrated Circuits,” Moore underscored his view with a prediction. The number of transistors put onto microchips was going to double every year, for at least the coming decade, in order to get the maximum economic benefit from digital electronics, that is, to make them cheaply and profitably. This regular doubling of microchip “complexity” for maximum economic benefit would have its timescale refined by Moore in the coming years, and it would come to be generally known as “Moore’s Law.” This regular doubling, associated with exponential decreases in the cost of digital electronics, did in fact play out over the next half century, providing the foundation for the creation of our digital world.
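
The force of the prediction is easiest to see as arithmetic. Below is a minimal sketch in Python (illustrative only, not from Moore’s article); the 1965 baseline of 64 components is an assumed starting point chosen for the example.

```python
# Minimal sketch of Moore's 1965 prediction: the number of components per
# chip doubles every year for a decade. The 64-component baseline for 1965
# is an assumption made for illustration.

BASE_YEAR, BASE_COUNT = 1965, 64

for year in range(BASE_YEAR, BASE_YEAR + 11):
    count = BASE_COUNT * 2 ** (year - BASE_YEAR)  # one doubling per year
    print(f"{year}: ~{count:,} components per chip")
```

Ten annual doublings multiply the count by 2^10 = 1,024, so the assumed 64-component chip of 1965 grows to roughly 65,000 components by 1975, the scale Moore’s extrapolation reached.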

This 14-minute film, made for the 50th anniversary of Moore’s Law, provides in-depth background and explanation. It was cowritten by Computer History Museum historian David C. Brock. See Moore’s original manuscript of what would become his famous 1965 publication in Electronics.

The Intel Years

In 1968, Gordon Moore decided to leave Fairchild to create a new silicon microchip startup with Robert Noyce. Noyce was dissatisfied with the top management of the company, and Moore was increasingly frustrated that he could not successfully transfer new technologies out of his R&D laboratory into manufacturing. Moore believed that silicon microchip technology had advanced to the point where microchips could become a contender for use as computer memory. Digital silicon microchips were finding increasing use for computer logic—the parts of computers that perform calculations. The parts of computers that held the information to be calculated upon—computer memory—were formed from a cheap, reliable form of magnetic technology called core memory. Moore believed that by pushing the chemical printing process still further, silicon microchips could win computer memory from magnetic cores. Since computer memory was the bulk of the cost of digital computers, and the production of digital computers was continually expanding, Moore thought that memory microchips could be a tremendous business. In the summer of 1968, Moore and Noyce created the Intel Corporation. Their first hire was Moore’s primary colleague in Fairchild R&D, Andy Grove.

While Moore and Noyce both made personal investments to create Intel, the bulk of the funding came from venture capital investments organized by Arthur Rock. Rock had been the principal financier behind the creation of Fairchild Semiconductor and subsequently relocated to San Francisco to pioneer the business of venture capital partnerships for investing in high-technology startups. Moore, like the other cofounders of Fairchild Semiconductor, was a key early limited partner in Rock’s and others’ venture capital partnerships. Indeed, all of the Fairchild Semiconductor cofounders were initial investors in Intel. Moreover, several Fairchild Semiconductor alumni established many of the most prominent venture capital firms in Silicon Valley, including Kleiner Perkins and Sequoia.

In this 2015 interview, Gordon Moore and Arthur Rock discuss the formation of Intel. The transcript is available here.

At Intel, Moore, Noyce, and Grove formed a triumvirate that led the firm for the next 30-plus years. Noyce was, until his untimely death in 1990, very much the public face of and champion for the firm, working closely on strategic decisions with Moore and Grove. Moore served as the ultimate technological strategist for the firm, holding key decisions on investments in R&D and manufacturing capacity, as well as on which types of products the firm would pursue. To Grove fell the fantastically difficult remit of creating an organization to execute on these strategies and decisions, and of driving it to do so.

As executive vice president of Intel until 1975, Moore made several technological contributions that helped the firm secure early leads and successes in the microchip memory business he had envisioned. Under Moore’s leadership, Intel came to lead the market for DRAM memory chips in this period and to pioneer the creation of, and market for, EPROM memory chips, the predecessor to today’s Flash memory. Further, Moore supported Intel’s creation of the first commercially successful microprocessor—the entire central processing unit of a computer on a single chip. Together, the microprocessor, DRAM, and EPROM would become critical to the rise of personal computing. Read a 2007 oral history interview with the members of the Intel team behind the firm’s first microprocessor, the 4004.

From 1975 to 1987, Gordon Moore served as Intel’s longest-serving CEO to date. In those years, Intel faced growing domestic and international competition in all of its markets for silicon microchips. Moore, with key advice and support from Grove, made several critical decisions in this period that set the stage for the firm’s dramatic success. Moore agreed to a strategic retreat from the DRAM business Intel had pioneered, reorienting the firm toward the microprocessor and, to a lesser extent, toward non-volatile memory chips like the EPROM and Flash. In the microprocessor business, Moore approved a number of key strategies. Intel ensured that its microprocessors were backward-compatible, allowing users to maintain their investment in software, most especially Microsoft’s operating systems and the application software written for them. Intel also engaged in aggressive marketing programs against its main rivals and targeted the end consumer with its “Intel Inside” campaign. Critically, Moore also approved the decision for Intel to act as the “sole source” for the Intel 386 microprocessor, a bet-the-company-scale move that resulted in Intel’s dominant franchise in commercial microprocessors outside of mobile devices.

Historian Richard Tedlow discusses the decision to sole-source the Intel 386 microprocessor.

Andy Grove served as CEO from 1987 to 1997, leading Intel’s microprocessor strategy, with Moore acting as chairman of the board over the same period. During these years, Moore took on greater responsibilities outside of the firm. In particular, he led an important effort that resulted in the Semiconductor Industry Association’s Technology Roadmap for Semiconductors. This roadmap, later expanded to include international partners, served as a mechanism by which the international semiconductor industry and its suppliers of materials and equipment worked together to explicitly maintain Moore’s Law.

Philanthropy

After his chairmanship of Intel, Moore increasingly turned his attention to philanthropy. During the internet boom, Moore’s decades-long holdings in Intel had made him one of the wealthiest Americans. Most notably, Gordon and Betty Moore created the Gordon and Betty Moore Foundation in 2000 with a gift of several billion dollars. The foundation supports scientific research, environmental and conservation activities, and Bay Area concerns. Through this foundation and their private wealth, the Moores made some of the largest philanthropic gifts ever to higher education (Caltech) and environmental conservation (Conservation International). In recent years, Gordon and Betty Moore lived on the island of Hawaii, returning to the Bay Area regularly for meetings of their foundation’s board.

For a detailed history of semiconductor electronics, see the Museum’s online exhibition Silicon Engine.

Museum historian David C. Brock is the coauthor of a biography of Gordon Moore, Moore’s Law: The Life of Gordon Moore, Silicon Valley’s Quiet Revolutionary (Basic Books, 2015). A discussion about the book can be viewed here.

Courage, Resilience, and Sharing
https://computerhistory.org/blog/courage-resilience-and-sharing/
June 27, 2022

Integrated circuit pioneer and CHM Fellow Lynn Conway has been sharing her story of gender transition and creating a supportive online community for transgender and LGBTQ+ youth.

Lynn Conway and Transgender Community Online 

Across the United States, countless Americans have celebrated and honored LGBTQ+ Pride Month this year, 2022. They have paraded, cheered, and reflected in solidarity with their families, friends, and strangers alike. These observances have taken place against an outrageous, hateful, and ultimately failed attempt by right-wing bigots to mar and disrupt Pride. Political and religious figures on the Right have made horrifying, immoral calls for violence and discrimination against the LGBTQ+ community. In Idaho, a group of over thirty violent white nationalists was arrested en route to a Pride event. In state legislatures across the nation, over two hundred and thirty bills have been introduced aiming to strip the human and civil rights of LGBTQ+ people, particularly transgender people. Indeed, transgender people have been particular targets of the Right. This attack is not unique to the American Right; it is a project of the Right internationally.

For centuries, transgender people have had the courage and resilience to live their lives in the face of bigotry and violence. Many of them have been exemplars of what Martin Luther King, Jr. meant in 1959 when he exhorted us to “Make a career of humanity. Commit yourself to the noble struggle for equal rights. You will make a better person of yourself, a greater nation of your country, and a finer world to live in.” For many transgender people, their career of humanity has been in creating community, defining and sustaining spaces in which transgender people can find one another, share experiences and knowledge, and learn from each other. One of these transgender people is 2014 CHM Fellow Professor Lynn Conway.

Lynn Conway in her Xerox PARC office, 1983. Photograph by Margaret Moulton. Source: https://ai.eecs.umich.edu/people/conway/BioSketch.html

CHM honored Conway for “her work in developing and disseminating new methods of integrated circuit design.” While working at Xerox PARC in the 1970s, she created innovative new methods for designing the next generation of incredibly complex microchips. Collaborating with famed Caltech electronics expert Carver Mead, Conway developed a crucial book on these methods, Introduction to VLSI Systems. Using this book, Conway devised and taught a new university course and then created innovative techniques for teaching the new methods widely, fundamentally changing how a new generation and community of designers created microchips—the building blocks of our digital world. Not only an innovator, Conway was also a teacher and community builder.

The cover of Lynn Conway and Carver Mead’s breakthrough book, Introduction to VLSI Systems. Source: https://www.computerhistory.org/revolution/digital-logic/12/287/1613

Conway’s success with her new approaches to design and teaching, in collaboration with Carver Mead, was quickly and widely recognized. Source: https://ai.eecs.umich.edu/people/conway/Awards/Electronics/ElectAchiev.html

Conway went from success to success, leading major programs at DARPA in the 1980s, then becoming a professor of electrical engineering and computer science at the University of Michigan, where she also served as an associate dean of engineering. In 2000, she began to use her personal website at the University of Michigan for a new dimension of teaching and community building. Conway began to share her story of gender transition and her experiences as a transgender person publicly for the first time.

Over the coming years, Conway made her website into a space for community building, information exchange, and activism. She and other transgender people shared both their stories and knowledge they had gathered on the site. Conway used the growing prominence of her site for her activism, particularly to protect and support transgender and other LGBTQ+ youth.

Lynn Conway’s website, 2022. Source: https://ai.eecs.umich.edu/people/conway/

Conway’s use of her website as a space for teaching, sharing, and community building is part of a broader story and history of LGBTQ+ people using the online world for these activities. Washington State University professor Avery Dame-Smith is one of the leading researchers documenting and preserving this history through his Queer Digital History Project. His project is collecting oral histories, cataloging pre-2010 LGBTQ+ online communities, and archiving documents and newsgroups.

Avery Dame-Smith’s Queer Digital History Project, 2022. Source: https://queerdigital.com/tuarchive

Amid strife and hardship, it is so hopeful to recall these stories of brave people who are using the online world to “make a career of humanity” in such powerful ways. 

Cover image: A recent portrait of Lynn Conway, courtesy of Lynn Conway

Xpress Yourself: Pride, Stonewall, and Tim Gill
https://computerhistory.org/blog/xpress-yourself-pride-stonewall-and-tim-gill/
June 9, 2022

Quark software founder Tim Gill has played an important role in ensuring that historic sites related to LGBTQ+ communities are protected and preserved.

A Software Entrepreneur and LGBTQ+ History

With each passing year, the place of June as LGBTQ+ Pride Month seems to gain greater recognition and wider celebration. This year, I was delighted to talk to my 76-year-old aunt about her experience marching for the first time in a Pride parade—from Lambertville, New Jersey, across a bridge into New Hope, Pennsylvania—as part of an outreach by her church to the LGBTQ+ community. It made me recall how moving it was to march in Pride in Northampton, Massachusetts, with my children and their elementary school community, now too many years ago. With the growing recognition of Pride, including by many corporations that simultaneously fund actively anti-LGBTQ+ politicians and political groups, the historian in me feels that it is important to recall why it is that Pride Month is celebrated in June.

The Stonewall Inn, Greenwich Village, New York City, 1969. Source: https://commons.wikimedia.org/wiki/File:LgStonewall.jpg

The reason is, of course, historical. It was on June 28, 1969 that a police raid of the Stonewall Inn—a LGBTQ+ bar on Christopher Street in Greenwich Village, New York City—was met with direct and sustained resistance by the community: the Stonewall uprising. Led to a great degree by Black and Brown LGBTQ+ people (who self-identified as queens, drag queens, transvestites, and lesbians), a growing community hampered the arrest of the Inn’s patrons and employees, continuing their protest for LGBTQ+ human and civil rights for days afterward. These direct actions of June 1969 were inspirational for the continued fight for LBGTQ+ rights that has been fought since. Today, the human and civil rights of the transgender community, and in particular transgender youth, are under renewed and severe attack.

One of the only known photographs from the morning of June 28, 1969, as a crowd begins to assemble and confront the police at the start of the Stonewall Uprising. Source: https://en.wikipedia.org/wiki/Stonewall_riots#/media/File:Stonewall_riots.jpg

Marsha P. Johnson, a founding member of the Gay Liberation Front and a prominent figure in the Stonewall uprising. Johnson self-identified as a drag queen and transvestite. Source: https://en.wikipedia.org/wiki/Marsha_P._Johnson

The history and meaning of Stonewall have been particularly important to computer programmer and entrepreneur, Tim Gill. During the Presidency of Barack Obama, Gill worked with the US Department of the Interior to identify important historical landmarks for LGBTQ+ history. In 2016, the US Park Service created the Stonewall National Monument in Christopher Park, across from the Stonewall Inn. Through his Gill Foundation, Tim Gill and his husband, Scott Miller, ensured that the Covid-19 pandemic did not spell the end of the Stonewall Inn, raising significant funds to see it through.

National Park Service signage for the Stonewall National Monument in New York City. Source: https://commons.wikimedia.org/wiki/File:Stonewall_National_Monument.jpg

As detailed in his 2019 oral history interview for the Computer History Museum, Tim Gill was born in 1953 and grew up in Colorado. With an interest in mathematics and science fiction as a youth, it was perhaps not completely surprising that he was hooked when he first encountered computers during his junior year of high school. Gill even enrolled in courses at the Colorado School of Mines at the time to keep access to computers. Back at high school, he helped form a computer club.

Gill attended the University of Colorado, Boulder, from 1972 to 1976, studying computing and mathematics while also taking programming jobs for local businesses. It was in the first weeks of his first year at Boulder that Gill began living openly as a gay man. As he explains in his oral history: “That happened within about two weeks of getting to college. I went and visited the Boulder Gay Liberation group, which is what it was called, and I think by my[. . .] freshman, [or] sophomore year, I was their office manager.” It was the beginning of his active involvement in a cause that he champions to this day.

After graduating and trying his hand at software work at a few established firms, Gill decided to try something of his own. What would become Quark Incorporated began with his project to create a word processor for a then-new Apple product, the Apple III, successor to the wildly popular Apple II. The word processor’s success allowed Gill to quickly repay the $2,000 his parents had loaned him for the effort. His next step was to evolve the editor so that it could control a professional-quality phototypesetter. This evolution led to QuarkXPress, software positioned as the typesetting and publishing tool for professional printing and publishing, and soon in competition with desktop publishing software like Aldus PageMaker as that market expanded tremendously. Quark thrived, reaching hundreds of millions of dollars in annual sales.

A promotional button for the QuarkXPress publishing software. Source: https://www.computerhistory.org/collections/catalog/102641932

In January 2000, Gill left Quark to devote himself more fully to the work of the Gill Foundation, which he had created in the 1990s. The foundation has been one of the leading organizations supporting the fight for LGBTQ+ human and civil rights. As Gill explains in his oral history: “…what you learned, now that I’ve been involved in LGBT rights for a long time, is that people are initially afraid of change, and they’re afraid of people they don’t know and afraid of people they don’t think are like them in some way. And so LGBT people were like that, and so literally it was a matter—so here, what, 14 or something odd years later—well, it’s a little longer than that, but certainly less than 25—we have gay marriage or same-sex marriage, and that entire time was really about having a conversation with the American people and making them think about it and reason about it and get past their initial knee-jerk reaction. And so that really is what the Gill Foundation is about . . . and we have been communicating with the broader American public, and slowly over time we win, because when you can get people to stop and think about it rationally, in the end there’s no real reason to put different people in different boxes.”

Getting to know each other better. Overcoming our fears of difference. Treating everyone equally. Sounds like a great program.

Main image: Tim Gill, in 2019. Photograph by Hyoung Chang/Denver Post. Source: https://www.denverpost.com/2019/07/14/tim-gill-colorado-lgbtq-rights/


SUPPORT CHM’S MISSION

Blogs like these would not be possible without the generous support of people like you who care deeply about decoding technology. Please consider making a donation.

Lillian Schwartz: Pushing the Medium https://computerhistory.org/blog/lillian-schwartz-pushing-the-medium/ Wed, 17 Nov 2021 16:38:40 +0000 https://computerhistory.org/?p=23407 Meet new CHM Fellow Lillian F. Schwartz, a pioneer at the intersection of art and technology.

The artist Lillian Schwartz was honored with a CHM Fellow award in 2021 for her pioneering work at the intersection of art and computing. In the first half of the 1970s, she created a remarkable series of films that brought the new technology of computer animation into the artworld. Her work, and that of artists like her, expanded the scope of media art and spurred fresh developments in technology. Through a decades-long residency at Bell Labs, and collaborations at IBM and the MIT Media Lab, Schwartz continued to explore the computer’s possibilities as an artistic medium. Through teaching and writing, she shared what she had learned with others. Schwartz’s work has been exhibited widely and internationally at museums and galleries including the Museum of Modern Art, the Brooklyn Museum, and the Whitney. Several of her films from the 1970s are in the permanent collection of the Museum of Modern Art. Her entire archive is now held by The Henry Ford.

Portrait of the Artist as a Girl (1927-1944)

Born in Cincinnati, Ohio in 1927, Lillian Schwartz was the child of recent immigrants to the United States. Her father had come from Russia and her mother from England. Schwartz’s father worked as a barber, while her mother worked as the caregiver for the couple’s thirteen children.

Lillian Schwartz c. 1930. Schwartz’s childhood coincided with the Great Depression of the 1930s. This, and her father’s ailing health, spelled significant economic challenges for the family. Image from the Collections of The Henry Ford.

A volunteer serves free soup to a line of men in 1936. National Archives Identifier 196174.

From the youngest age, Schwartz followed her creative impulse, turning household materials into artistic media: drawing with sticks in dirt, sculpting with bread dough, and drawing on the walls of the family house. Her creative and expressive pursuits, along with those of her siblings, were especially encouraged by her mother. Through her siblings’ possessions, Schwartz was also exposed to more traditional artistic media—Conté drawing crayons, oil paints, and more.

Conté crayons, like those used by Schwartz for her earliest drawings. From Wikimedia Commons, under Creative Commons: https://creativecommons.org/licenses/by-sa/3.0/deed.en

Lillian Schwartz in the 1930s. Image from the Collections of The Henry Ford.

With her father suffering from heart disease, and in the throes of the Great Depression, Schwartz and her siblings worked from an early age, and the family moved often. They experienced violent anti-Semitism, and her mother often kept her home from school out of fear for her physical safety. While these experiences positioned her as an outsider, her creativity and passion for art were undiminished.

Anti-Semitic publications appeared, and appear, across the United States. This one from 1940 originated in the Midwest, where Lillian Schwartz lived. Collection of the United States Holocaust Memorial Museum.

Portrait of the Artist as a Nurse, Wife, and Mother (1944-1960)

Poster for the U.S. Cadet Nurse Corps. From the collections of the Pritzker Military Museum and Library. Accessed through WikiMedia.

Lillian Schwartz found a way to further her education: the Cadet Nurse Corps, which started in 1943. This federal program addressed a dramatic shortage of nurses by covering almost every cost of nursing school, and it explicitly did not discriminate based on race or religion. Cadet nurses were simply required to serve as nurses for the duration of the war. Schwartz jumped at the chance, joining the Cadet Nurse Corps at the University of Cincinnati. She found nursing difficult, feeling herself perhaps too sensitive for it, but the experience reinforced her identity as an artist. At the University, she also met a young physician, Jack, who would soon become her husband.

Lillian Schwartz in high school. As high school was ending, her true desire was to study art at college. Her family’s circumstances, however, made that seem impossible; instead, her sister found her a job as a secretary. Image from the Collections of The Henry Ford.

After the war, Schwartz accompanied her husband to occupied Japan, where he was completing his service. The couple was based in the port city of Fukuoka, which had been firebombed during the war. With infrastructure of all kinds severely damaged, disease was a serious challenge. Schwartz began suffering severe neck pain and stiffness and was quickly diagnosed with polio. Paralyzed from the waist down and in her right arm, she became a patient in the hospital where her husband worked.

A mislabeled photograph of Fukuoka, c. 1950. Note the uniformed U.S. military member to the far right. Collection of Rob Ketcherside, Flickr. https://creativecommons.org/licenses/by-nc-sa/2.0/

Lillian Schwartz was not alone in her strenuous efforts to recover from polio. This photograph from 1948 shows a patient in Ann Arbor, Michigan working to recover from polio through physical therapy. From the Ann Arbor District Library.

During her long recovery, Schwartz worked with an expert in the art of Japanese calligraphy. With him, she practiced visualizing each motion and brushstroke needed to create the complex forms. This practice of detailed visualization, Schwartz found, greatly expanded her artistic imagination. Recovered, but still living with post-polio syndrome, she returned to the U.S. with Jack, and the family soon settled in northern New Jersey.

An example of the expressive and considered brushwork involved in Japanese calligraphy. Image from Ayu Nabila on Wikimedia Commons. https://creativecommons.org/licenses/by-sa/4.0/deed.en

As her husband established his pediatric practice, Schwartz dove deeper into her study and practice of art. She studied both oil and watercolor painting, and then began creating sculptures in acrylic. In her painting and sculpture both, she made creative and unusual use of her materials. In her words, she was always “pushing the medium.”

Lillian Schwartz with one of her early acrylic “acrylicast” sculptures. She treated the bulk of the acrylic with solvents and blowtorches to achieve visual transformations, and affixed spheres and other shapes to the surfaces. Often, she presented these sculptures with lighting elements. Courtesy of Lillian and Laurens Schwartz

Portrait of the Artist as an Experimenter (1960-1969)

The New York City skyline, in June 1960. Photographer: Harold Edgeberg. From Wikimedia Commons. https://creativecommons.org/licenses/by-sa/2.0/deed.en

In the 1960s, New York City figured ever larger in Lillian Schwartz’s life and art. Her New Jersey home was a short half-hour train ride away, and Schwartz regularly visited New York’s art museums and galleries, often bringing her children. On other visits, she collected surplus and abandoned technological materials, transforming them into new artistic media. She created a long series of acrylic sculptures, manipulating the material, shaping it, and adding elements to it. By the mid-1960s, she was creating kinetic sculptures, with colored liquids pumping through glass tubing.

Experiments in Art and Technology, E.A.T., was established in 1967 in New York City. One of the co-founders was the Bell Labs engineer Billy Kluver, who also collaborated with Andy Warhol. Kluver is pictured here lecturing about E.A.T. in 1967. Photographer unknown. Courtesy E.A.T. Archives. All rights reserved.

The other co-founder of E.A.T. was the American painter, Robert Rauschenberg. He is pictured here with one of his works in 1968. From the Dutch National Archives, via Wikimedia Commons.

This work brought her into a new development in the New York artworld: E.A.T., Experiments in Art and Technology, led by the painter Robert Rauschenberg and the Bell Labs engineer Billy Kluver. Schwartz attended meetings of the E.A.T. group, which paired engineers with artists to collaborate on new works of art. As part of E.A.T., in the late 1960s she developed a new, complex, multimedia, interactive sculpture, Proxima Centauri. In it, a globe covered by colorful, changing projections retreated into its base when viewers tripped a sensor. The sculpture was included in the exhibit “The Machine as Seen at the End of the Mechanical Age” at the Museum of Modern Art, along with several other E.A.T. works. The exhibit ran from November 27, 1968, to February 9, 1969.

The entry for Proxima Centauri from the MoMA catalog for the “Machine as Seen at the End of the Mechanical Age.” Per Biorn was a Bell Labs engineer who was paired with Lillian Schwartz to collaborate with her on the sculpture. From the Collection of the Museum of Modern Art.

View of the “Proxima Centauri” globe, and a description of the sculpture’s technology. Courtesy Lillian and Laurens Schwartz.

Ken Knowlton (left) and Leon Harmon (right) with their computer-generated print, “Studies in Perception.” Courtesy of the AT&T Archives and History Center.

Portrait of the Computer as the Artist’s Tool (1969-Present)

Lillian Schwartz’s invitation to Bell Labs grew into a decades-long residency. She was quickly enthralled by the computer, and several computer professionals assisted her in exploring how she might use it to create art. Her first work with the computer was a print titled “Head,” formed much like “Studies in Perception.” Characteristically, however, Schwartz decided to push this new medium, transforming the image through subsequent silk-screen printing.

The silkscreened image, “Head,” created by Schwartz when she first came to Bell Labs. At upper left is her original sketch. In the middle is a sketch on graph paper, translating her original sketch into alphanumeric characters. The final silkscreen is based on a computer printout of these alphanumeric characters. From Lillian and Laurens Schwartz’s book, The Computer Artist’s Handbook (W. W. Norton), 1992.
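The general technique behind prints like “Studies in Perception” and “Head”—reducing a photograph or sketch to a grid of typographic characters chosen for how much ink they put on the page—is easy to sketch in modern code. The following is only an illustrative reconstruction in Python, not Knowlton and Harmon’s or Schwartz’s actual process; the character ramp, proportions, and file name are all invented for the example.

```python
# Illustrative sketch only: map each pixel's brightness to a character whose
# "ink density" roughly matches it, in the spirit of "Studies in Perception."
from PIL import Image

# Characters ordered from dense/dark to sparse/light (an invented ramp).
DENSITY_RAMP = "@#8&o;:,. "

def image_to_characters(path: str, cols: int = 80) -> str:
    img = Image.open(path).convert("L")  # "L" = 8-bit grayscale
    # Printed characters are taller than they are wide, so squash the rows
    # to keep the picture's proportions roughly correct.
    rows = max(1, int(cols * img.height / img.width * 0.5))
    img = img.resize((cols, rows))
    lines = []
    for y in range(rows):
        line = []
        for x in range(cols):
            brightness = img.getpixel((x, y)) / 255  # 0.0 black .. 1.0 white
            idx = min(int(brightness * len(DENSITY_RAMP)), len(DENSITY_RAMP) - 1)
            line.append(DENSITY_RAMP[idx])
        lines.append("".join(line))
    return "\n".join(lines)

if __name__ == "__main__":
    print(image_to_characters("head_sketch.png"))  # hypothetical input image
```

Knowlton and Harmon’s actual system was far more refined than this toy, and, as the caption above notes, Schwartz used the character printout only as the basis for her final silkscreen.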

Her focus soon shifted to the very new technology of computer animation, and she began interacting with computer researcher Ken Knowlton, who had developed BEFLIX, a programming language used at Bell Labs to generate computer animations. Schwartz took short animations generated at her direction, manipulated them in various ways, and combined them with other imagery and techniques to create a series of short films, moving computer animation into the artworld. Her earliest efforts led to official AT&T sponsorship, and her films from the first half of the 1970s incorporating computer animations garnered both acclaim and wide showings.

Lillian Schwartz in her studio, where she transformed and incorporated the computer animations generated at Bell Labs into her final art films. For them, she intermixed the computer animations with other found footage from the Labs as well as new sequences that she created using various media. Courtesy Lillian and Laurens Schwartz.

A selection from Lillian Schwartz’s 1970 film, “Pixillation.” Moving Image from the Collections of The Henry Ford.

A selection from Lillian Schwartz’s 1971 film, “UFOs.” Moving Image from the Collections of The Henry Ford.

A selection from Lillian Schwartz’s 1971 film, “Olympiad.” Moving Image from the Collections of The Henry Ford.

A selection from Lillian Schwartz’s 1972 film, “Mutations.” Moving Image from the Collections of The Henry Ford.

A selection from Lillian Schwartz’s 1972 film, “Googolplex.” Moving Image from the Collections of The Henry Ford.

A selection from Lillian Schwartz’s 1974 film, “Metamorphosis.” Moving Image from the Collections of The Henry Ford.

In this same period, Schwartz engaged with another emergent electronic medium, video, to produce video art. She drew on the resources of the TV Lab at the television station WNET/Thirteen in New York City, where Nam June Paik and others had developed pioneering video art and video art tools.

A selection from Lillian Schwartz’s 1977 video piece, “Trois Visage.” Moving Image from the Collections of The Henry Ford.

Lillian Schwartz at work at Bell Labs in the 1970s. Courtesy of Lillian and Laurens Schwartz.

Through the support of electronic music pioneer Max Mathews, a high-ranking Bell Labs researcher, Schwartz received official status at the Labs, continuing her use of computers for artmaking and her interactions with scientists and engineers for several decades.

Max Mathews (right) with Lawrence Rosler, in a 1966 exploration of a graphical language for generating computer music at Bell Labs. Courtesy of AT&T Archives and History Center.

She pursued other important collaborations as well, with IBM and particularly with the MIT Media Lab in the use of Symbolics artificial intelligence computers for artmaking. In these collaborations, and on her own using primarily Apple Macintosh computers, Schwartz continued to push the boundaries of the computer as an artist’s tool and to educate others in its use. Continuing her work with both computers and video, she won an Emmy Award for a public-service advertisement promoting the Museum of Modern Art’s 1984 reopening.

Lillian Schwartz c. 1980s. Courtesy Lillian and Laurens Schwartz.

While vision challenges prevent her direct use of the computer today, Schwartz continues to make art, creating a series of drawings that she intends to animate using the computer.

Lillian Schwartz at work in her New York City apartment, August 2021. Photograph by Matthew Cavenaugh.

Her body of work—both her computer films and her video art—marked important steps in the emergence of time-based media art, opening fresh avenues in both the history of computing and the history of art.

Lillian Schwartz with her CHM Fellows award, August 2021. Photo by Matthew Cavenaugh.

Lillian Schwartz with her CHM Fellows award, and with her son Laurens Schwartz, in her New York City apartment, August 2021. Photo by Matthew Cavenaugh.

Meet 2021 CHM Fellow Honoree Raj Reddy https://computerhistory.org/blog/meet-2021-chm-fellow-honoree-raj-reddy/ Thu, 17 Jun 2021 16:22:50 +0000 https://computerhistory.org/?p=22103 CHM Fellow Raj Reddy and his students established the foundations for technologies we use every day, including speech recognition, computer vision, and autonomous robotic systems. He is also a passionate advocate of universal access to information.


I believe artificial intelligence can be used to create a humane society.

— Raj Reddy

Inventing the Future

Do you control your phone with voice commands? Did you know your car was built by dozens of robots? Have you read books online for free? For over five decades, Raj Reddy and his students have established the foundations for these technologies: speech recognition, computer vision, autonomous robotic systems, and universal access to information.

In the early 1960s, as a young graduate student, CHM Fellow Raj Reddy became intrigued by the predictions of early AI pioneers, namely that computers would beat a world chess champion within just 10 years. The idea that machines could one day do what was considered intelligent human behavior captured Raj’s imagination. It also marked the beginning of his prescient vision of computing as a broad field in service to humanity.

1958

Raj Reddy (second from right) grew up in the farming village of Katur, India, and was the only member of his family to get an advanced education, graduating from Guindy Engineering College with a BS in civil engineering in 1958. Credit: Courtesy Raj Reddy

1960

Raj (left) went on to pursue a master’s degree in civil engineering at the University of New South Wales in Sydney, Australia, in 1960. He and two of his fellow students from Guindy Engineering College studied together there. Credit: Courtesy Raj Reddy

1961

At the University of New South Wales, Raj’s professors Stan Hall and Bob Woodhead introduced him to his first computers, including the DEUCE (pictured), descended from the ACE computer Alan Turing designed at Britain’s National Physical Laboratory. Raj used it for his civil engineering research, and it opened his eyes to a whole new world. Credit: Courtesy of the University of New South Wales, Australia

Pioneering the Early Years of AI

The beginning of artificial intelligence can be traced to the 1956 Dartmouth Summer Research Project on Artificial Intelligence, organized by two young researchers, John McCarthy and Marvin Minsky, who would both soon join the MIT faculty. The pair convened researchers from assorted disciplines at Dartmouth College to discuss a new field for which McCarthy coined the name “artificial intelligence.” Among the many discussions and presentations, researchers Allen Newell and Herbert Simon presented their groundbreaking paper on the Logic Theory Machine, one of the earliest programs to mimic human problem-solving skills.

Shortly after the conference, leading AI laboratories were founded, including the Stanford Artificial Intelligence Laboratory (SAIL) under John McCarthy in 1963, as well as others at MIT and Carnegie Mellon University, forming the basis of AI research for decades to come. In that same year, Raj came to Stanford as the first PhD student to do work in the new artificial intelligence lab.

1963

Stanford AI Lab founder John McCarthy, Raj’s thesis advisor beginning in 1963, encouraged students to pursue their interests, suggesting projects but taking an overall hands-off approach. McCarthy is shown here, in about 1967, at Stanford’s IBM 7090 computer, playing a correspondence computer chess game against a Soviet team. Credit: Courtesy of Stanford University Libraries, Department of Special Collections and University Archives

1964

Partly because of his multilingual background and partly because Stanford’s new DEC PDP-1 computer had an analog-to-digital converter, Raj programmed his first speech recognition program—a vowel recognizer—on that machine. The PDP-1 was one of the first interactive computers, enabling new applications, including the first video game, Spacewar! Credit: Courtesy of Alexey Komarov

1966

In 1963, Raj began his doctoral studies at Stanford under artificial intelligence pioneer John McCarthy, becoming Stanford’s first graduating computer science PhD in 1966: he defended his thesis in the morning, while Bill McKeeman, the second, defended his that afternoon. Credit: Courtesy of Raj Reddy

1968

Raj stayed on at Stanford as an assistant professor and created a voice command interface for the Stanford Arm, a precursor to modern robots today. This historic video shows his speech recognition system sending commands to the Arm to pick up blocks. Credit: Courtesy of Stanford University Libraries, Department of Special Collections and University Archives.

Building A World-Class Institution and Empowering Students

The Department of Computer Science at Carnegie Mellon University (CMU) was founded in July 1965 by programming language expert Alan Perlis and AI pioneers Allen Newell and Herbert Simon. Raj moved to CMU in 1969, becoming the fourth pillar of their world-class computer science department.

In 1979, Raj founded and then led the Robotics Institute, the first robotics department at any US university. He went on to create world-class centers and institutes at CMU for language technologies, human-computer interaction, machine learning, and software research. Setting up these departments separately from the Department of Computer Science itself reflected Raj’s conviction that computing is a broader field—with wider social impact—than the study of computers themselves.

Over his five-decade teaching career, Raj empowered thousands of students, many of whom soon made vital contributions to a wide array of technologies. He believed that mentors could play a pivotal role in giving students the freedom, guidance, and tools to pursue their ideas.

1972

Compelled by the mystery of how speech information is encoded in variable waveforms, Rockefeller University graduate students James and Janet Baker set themselves the goal of creating practical automatic speech dictation. With Raj’s encouragement, they transferred to CMU in 1972 and developed a novel statistical approach that was an early form of machine learning, and which became the dominant approach in speech recognition systems. Founding Dragon Systems in 1982, they announced the first built-in PC speech recognition in 1984, and in 1997, released Dragon NaturallySpeaking, the first continuous speech dictation program for PCs. Shown here at The Computer Museum, Boston, in 1990 for the announcement of DragonDictate-30K. Credit: Jim and Janet Baker
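The statistical approach the Bakers pioneered is today identified with hidden Markov models, in which the most probable sequence of hidden states (sounds or words) is recovered from ambiguous acoustic observations. As a rough, hypothetical illustration of that core idea, here is Viterbi decoding over a toy two-phoneme model; the states, labels, and probabilities below are invented, and this is a sketch of the general technique, not the Bakers’ DRAGON system.

```python
# Toy Viterbi decoding over a hidden Markov model: find the most likely
# hidden state sequence (here, pretend phonemes) behind a series of
# observed acoustic labels. All numbers below are invented for illustration.

def viterbi(obs, states, start_p, trans_p, emit_p):
    # best[t][s]: probability of the likeliest path ending in state s at time t
    best = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]  # back[t][s]: predecessor of s on that likeliest path
    for t in range(1, len(obs)):
        best.append({})
        back.append({})
        for s in states:
            prob, prev = max(
                (best[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states
            )
            best[t][s] = prob
            back[t][s] = prev
    # Trace the winning path backward from the best final state.
    path = [max(best[-1], key=best[-1].get)]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

# Hypothetical model: two phonemes, two coarse acoustic labels.
states = ("ah", "ee")
start_p = {"ah": 0.6, "ee": 0.4}
trans_p = {"ah": {"ah": 0.7, "ee": 0.3}, "ee": {"ah": 0.4, "ee": 0.6}}
emit_p = {"ah": {"low": 0.8, "high": 0.2}, "ee": {"low": 0.1, "high": 0.9}}

print(viterbi(["low", "low", "high"], states, start_p, trans_p, emit_p))
# -> ['ah', 'ah', 'ee']: the likeliest hidden sequence given the toy numbers
```

Real systems work with thousands of states and probabilities learned from training data, but the decoding idea is the same.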

1985

In the 1980s, Raj’s student Kai-Fu Lee led the SPHINX team, extending the Bakers’ statistical approach so that systems could recognize speech regardless of the speaker. Lee has held executive positions at Apple, Microsoft, and Google. In recent years, he created Sinovation Ventures, a venture capital fund investing in Chinese technology startups, and has published two books, AI Superpowers and AI 2041, that explain the power of AI and predict how it will change the future of humanity. Credit: Courtesy of Kai-Fu Lee

1990

Raj founded the CMU Robotics Institute and led it until 1992. Notable faculty include futurist Hans Moravec (pictured), whose 1988 book Mind Children predicted that robots would evolve superintelligence by 2040; Takeo Kanade, a leading expert in the field of computer vision; Manuela Veloso, who has created soccer-playing robots to compete in RoboCup; and Red Whittaker, who created robots to work in dangerous environments such as failing nuclear power plants. Credit: Courtesy Carnegie Mellon University

2005

Since the 1980s, the CMU Robotics Institute has pioneered self-driving vehicles with the NavLab project, a series of autonomous vans funded by DARPA. In the 2000s, DARPA’s Grand Challenge pitted autonomous vehicle teams led by CMU’s Red Whittaker against his former Institute colleague Sebastian Thrun. The 2005 event was won by Thrun’s Stanley (left), with Whittaker’s H1ghlander and Sandstorm of CMU placing third and second (middle and right, respectively). Credit: Courtesy Carnegie Mellon University

Fostering Technology in Service to Humanity

Since the 1980s, Raj has sought to broaden access to information and computing technologies around the world. With Jean-Jacques Servan-Schreiber, Nicholas Negroponte, Seymour Papert, Alan Kay, Terry Winograd, and others, Raj was instrumental in building Le Centre Mondial Informatique et Ressource Humaine (the World Center for Computing and Human Resources) with the French government to bring computing technology to developing nations. For his contributions, Raj received the Legion of Honor from French President François Mitterrand.

Raj also led the FiberAfrica project to bring high-speed internet access to Africa and helped launch the Million Book Project to digitize the world’s books and make them available online to everyone around the world. Finally, he cofounded Rajiv Gandhi University of Knowledge Technologies in India, whose mission is to provide an education to students from rural villages.

2007

Raj helped create the Million Book Project, a free online digital library for anyone, anywhere in the world, that includes more than 1.5 million volumes, with partners in China, India, and the United States. Shown here are members of the group. Credit: Courtesy of Raj Reddy

2008

In 2008, Raj cofounded Rajiv Gandhi University of Knowledge Technologies (RGUKT) in his home state of Andhra Pradesh in India. The university was created to provide an education in computing technologies to bright students from rural villages who might not otherwise have access to higher learning, bringing opportunity to young people much like Raj himself once was. Credit: RGUKT

For the last twenty years, Raj Reddy has focused his energies on realizing technology’s promise for social good. He’s working to improve access to information for the 2.5 billion illiterate people in the world.

To participate in CHM’s FREE June 24 event, Empowering Humanity Through Technology: A Celebration of Raj Reddy, register here.

About the 2021 CHM Fellow Awards

The 2021 CHM Fellow Awards mark the Museum’s first-ever virtual Fellow Awards. CHM will celebrate the 2021 Fellows in a yearlong, four-part series of thought-provoking virtual events and engaging digital content exploring the story and impact of each honoree, as well as the present and future of tech for humanity.

Learn more about this year’s honorees and the 2021 Fellow Awards.

Headline Sponsor

Education Sponsors: First Tech and Oracle
