Excerpts
Preface
Introduction
Part 1: Growing Up
- My Ancestors 1824–1993
- Childhood 1953–1967
- High School 1968–1971
- University 1971–1975
Part 2: My Life’s Work
- GE Utica, New York 1975–1980
- GE Syracuse, New York 1980s
- GE, Lockheed Martin, ABB 1990s
- Downsized 2001–2007
- ASML 2007–2021
Part 3: Family and Meaning
- Family Life
- Retirement
- Looking Back on My Life
Afterword
Photos
Appendix 1: The Doctrine of Completed Staff Work
Appendix 2: The Ballad of the Harp-Weaver
About the Author
Preface
I was a foot soldier in the Fourth Industrial Revolution. There were days when I thought I might die of a heart attack from the pressure. When working on the cutting edge of technology, the pressure from the global market is intense. Nevertheless, I am happy to have played a part in the inexorable march of technology. One of the reasons I retired was so I could escape alive to start a new chapter in my life.
To understand the Fourth Industrial Revolution, it helps to be reminded of the earlier revolutions.
- The First Industrial Revolution was based on steam and water-powered machines in the late 1700s.
- The Second Industrial Revolution was based on the discovery and introduction of electricity in the late 1800s.
- The Third Industrial Revolution, or the Digital Revolution, was based on the introduction of computers in the late 1900s.
- The phrase “Fourth Industrial Revolution” was popularized in 2016 by Klaus Schwab, the founder of the World Economic Forum, in a book of the same name. The Fourth Industrial Revolution combines advances in artificial intelligence (AI), robotics, the Internet of Things (IoT), 3D printing, genetic engineering, quantum computing, and other technologies. What all these technologies have in common is that they are made possible by ever more powerful and cheaper computer chips, also called integrated circuits.
The dramatic increase in the power of chips, together with their shrinking size and cost, is described by Moore’s Law, formulated in 1965 by Gordon Moore, who later co-founded Intel. The law states that the number of components on an integrated circuit doubles roughly every two years. I worked at ASML for the last fourteen years of my career, where we made Moore’s Law possible. This book describes the work we did at ASML in more detail. In the earlier years of my career, all my work depended on chips and Moore’s Law.
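The arithmetic behind Moore’s Law is worth pausing on. Here is a minimal sketch (not from the original text), assuming a clean doubling every two years and using the Intel 4004’s roughly 2,300 transistors in 1971 as the starting point:

```python
# Moore's Law as arithmetic: the component count doubles every `period` years.
def moores_law(start_count, start_year, year, period=2):
    """Project the transistor count forward from a known starting point."""
    doublings = (year - start_year) / period
    return start_count * 2 ** doublings

# Starting from the Intel 4004 (1971, ~2,300 transistors),
# fifty years of doubling is 25 doublings:
projection = round(moores_law(2_300, 1971, 2021))
print(f"{projection:,}")  # roughly 77 billion
```

Even this toy projection lands in the same range as the transistor counts of the largest real chips of the early 2020s, which is what makes the law so remarkable.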
Introduction
My life is a struggle.
—Voltaire
Sometimes, I pretend everything is under control. But there is a struggle beneath the surface that I don’t reveal to just anyone. When I was a child, I would say whatever I thought. But when I grew up, I learned to be guarded about what I revealed to others. I am reserved. I like to present as being in total control of my life. Of course, I control how I react to events in my life. I own my choices. I do control a lot of my life, but there is so much of life that is out of my control. I often adhere to the rule, “Some things are better left unsaid.” In this book, I will try to break that rule and share some of my struggles in life. I don’t want to mislead you. I have had a good life, but I have had my struggles.
I was born in 1953 when the average cost of a new house was $9,500, the average annual pay was $4,000, and the first color television set cost $1,175. The Soviet Union tested a hydrogen bomb. The Korean War was over after three years, Texas Instruments invented the transistor radio, and Dwight D. Eisenhower was the president of the United States.
In November 1953, Richard Grimsdale and Douglas Webb demonstrated a prototype transistorized computer. This event is fitting since my career started as a computer operator. Later, I worked in finance at a laboratory doing cutting-edge research on electronic components. And finally, my career ended with helping to make the chips used in all the electronic devices in our lives, like cars and computers. Throughout my career, I was involved in developing, using, and manufacturing high-tech electronic devices.
Why did I write a memoir? First, I thought my sons might be interested to know more about my life. There are stories here we never discussed. Some we never discussed because, at the time, it didn’t seem they needed to know. Others I was not emotionally ready to talk about. Now I am ready. Living in this next chapter of my life has freed me from the reservations of my youth.
Before, I was too preoccupied with the tyranny of the urgent and my many responsibilities to devote time to writing. Now that I have the time, writing is a way to reflect on my life. And in writing, the power of my reflection is magnified. Writing has a discipline that intensifies and clarifies my thoughts and memories. It helps to bring my life into focus. This lengthy self-examination helps me conclude that my life has been worth the many struggles, even though, in the past, I wasn’t always sure.
Three months before retiring, I told Dorota, one of my colleagues at work, about my plan to retire. She asked what I planned to do in retirement. I told her some of my plans, and she responded, “You should write a book. You read so much that you could write a book.” At first, I didn’t think this was a practical suggestion, so I didn’t give it much thought. But this idea must have been brewing in my mind for a few months. And now, when writing this book, I remembered her suggestion and wrote the story of my life for my sons and myself. I wanted to summarize, analyze, and document my life. Finally, I hope the general reader will also be interested.
I found it practical to compartmentalize my life into two parts: my career and my personal life. Mostly, I will talk about these parts of my life separately, just as I lived them.
Evolution of Technology in My Lifetime
The history of high-tech in my lifetime is the history of how integrated circuits and computers enabled almost all advances in technology. Many, if not most, of today’s emerging technologies depend on computers, and computers rely on chips. The result is a virtuous cycle: chips enable new technologies, and those new technologies drive an explosion in demand for ever more advanced chips.
Here is a partial list of some of the key milestones during my lifetime. All of these milestones depend on computers and chips.
- 1958: Integrated Circuit – Invented by Jack Kilby at Texas Instruments.
- 1968: Intel – Founded by Robert Noyce and Gordon Moore.
- 1970: Canon Pocketronic Calculator – The first handheld battery-powered calculator.
- 1971: Intel 4004 Microprocessor – Intel introduced the 4004 microprocessor, the first significant step in microprocessor technology with twice the number of transistors and five times the operating speed of existing chips. The increase in performance made a single-chip CPU possible, replacing the existing multi-chip CPUs.
- 1973: The first handheld cell phone call was made using a prototype of what would become the Motorola DynaTAC 8000X, the world’s first commercial cell phone.
- 1975: The Altair 8800 – The first commercially successful microprocessor-based computer.
- 1976: Apple Computer Company was founded by Steve Jobs, Steve Wozniak, and Ronald Wayne, who sold his founder’s equity stake in what would become the world’s most valuable company for $800.
- 1977: Apple II – The Apple II was one of the first successful mass-produced computers, and in 1979, Software Arts introduced VisiCalc for the Apple II, one of the first killer apps and the first commercial spreadsheet program.
- 1981: IBM PC and MS-DOS – These two products together created the desktop PC market.
- 1983: Lotus 1-2-3 was introduced and was the IBM PC’s killer app.
- 1984: ASML was founded as a joint venture between Philips and ASM International. ASML is the leading manufacturer of photolithography tools used to manufacture microprocessors and memory devices.
- 1989: Launch of the first operational (Block II) GPS satellite.
- 1990: Nexus – The first web browser – Tim Berners-Lee created the first browser, initially called WorldWideWeb. Its name was later changed to Nexus to avoid confusion with the entity we now call the web.
- 1994: Amazon was founded by Jeff Bezos in his garage in Bellevue, Washington.
- 1998: Larry Page and Sergey Brin founded Google.
- 2005: YouTube was founded by Steve Chen, Chad Hurley, and Jawed Karim, three former employees of PayPal.
- 2007: The iPhone was introduced.
- 2010: ASML shipped the first EUV lithography tool NXE:3100 to TSMC in Taiwan.
- 2014: The Amazon Echo was introduced.
- 2020: The Nobel Prize in Chemistry was awarded to Jennifer Doudna and Emmanuelle Charpentier for CRISPR gene-editing technology.
- 2020: OpenAI’s GPT-3 language model became available to users. OpenAI is an artificial intelligence research laboratory. GPT-3 can create computer code, poetry, and prose and is one of the most interesting AI systems ever produced.
Fairchild, Texas Instruments, and the Microchip
The microchip is at the core of information and communications technology, including desktops, laptops, cellphones, and tablets. The private sector produced the first integrated circuits, notably Texas Instruments and Fairchild Semiconductor, but the U.S. government was instrumental in fostering the development of the microchip industry. Among the earliest users of the new integrated circuits were the Air Force and NASA, which used them in missile technologies and space-guidance systems. The federal government contributed to microelectronics research and development and served as the industry’s first and largest customer.
The first transistor was created in 1947 by a Bell Labs team under the direction of William Shockley. Texas Instruments was the first company to produce a silicon transistor in 1954. Shockley established Shockley Semiconductor Laboratory in 1955. Eight scientists, including Gordon Moore and Robert Noyce, dissatisfied with Shockley’s dictatorial management style, left Shockley in 1957 and started Fairchild Semiconductor. Fairchild won a contract to supply IBM with silicon transistors for the Air Force’s new supersonic B-70 bomber. The Air Force required transistors that could operate quickly and withstand high temperatures.
Fairchild completed its first order in July 1958: 100 silicon transistors priced at $150 each. Fairchild, however, had trouble producing semiconductors consistently. Fairchild outbid Texas Instruments to provide transistors for the Air Force’s Minuteman ballistic missile guidance system. Fairchild’s transistors were poor quality and regularly failed in lab tests. Fairchild needed to quickly fix its product and developed a new transistor design with a thin, protective coating of silicon oxide put on top of the transistor. This technique, called the planar process, resulted in reduced costs and more dependable transistors.
Jack Kilby of Texas Instruments created the first integrated circuit made of semiconductor components in the fall of 1958, and in March 1959, Texas Instruments patented a whole circuit on a single semiconductor chip. Integrated circuits contain every transistor on a single piece of silicon instead of having various transistors executing discrete functions as separate devices.
Texas Instruments was given a $1 million contract by the Air Force to design and construct circuits made of silicon. Kilby’s design came first, but it required each component to be connected by hand with thin gold wire. Fairchild discovered a way to mass-produce integrated circuits by building on the planar process. Texas Instruments and Fairchild engaged in a patent dispute over the following years but ultimately decided to share licensing rights to the integrated circuit.
NASA decided in 1962 that Fairchild integrated circuits would be used in the Apollo guidance computer prototype. The Air Force used integrated circuits in the guidance system for the Minuteman II missile. NASA purchased 60% of the integrated circuits made in the United States by the middle of the 1960s. A chip cost $32 in 1961, $1.25 by 1971, and less than a cent by 2000 for a much more powerful chip.
Steve Jobs said, “Some people can do one thing magnificently, like Michelangelo, and others make things like semiconductors or build 747 airplanes — that type of work requires legions of people. In order to do things well that can’t be done by one person, you must find extraordinary people.” In 1968, Robert Noyce and Gordon Moore left Fairchild and founded Intel. Today, Intel, Samsung, and the Taiwan Semiconductor Manufacturing Company (TSMC) are the three globally leading semiconductor manufacturing firms.
Apple, IBM, and the Computer
Steve Jobs and Steve Wozniak established Apple in 1976. They built the Apple I in Jobs’s garage. By offering the first color graphics and a floppy disk drive, the Apple II transformed the computer industry. By the time Apple went public in 1980, sales had grown from $8 million in 1978 to $117 million.
In 1981, IBM introduced the IBM PC with the Microsoft Disk Operating System (MS-DOS). IBM’s entrance into the market caused an explosion in the personal computer market by establishing a technology standard and positioning the technology for widespread use. Other companies were making computers, but none had the brand-name recognition of IBM, which had a 62% global market share of mainframe computers.
Bill Gates said, “I think it’s fair to say that personal computers have become the most empowering tool we’ve ever created.” In 2005, IBM sold its PC business to the Chinese company Lenovo, one of the world’s top PC companies. Lenovo, HP, Dell, and Apple comprise 70% of the global PC market.
My Career in High-Tech
My first job was as a computer operator at GE in Utica, New York. I ran a Honeywell 6060 mainframe computer in a large cleanroom. We usually had a team of about five to ten people to run the computer, which was used to solve complex engineering problems in designing the radar systems we sold to the Department of Defense. Later, I did finance work at GE in Syracuse, New York, where we designed and manufactured many different radar systems. Radar – Radio Detection and Ranging – uses radio waves to locate objects like ships, airplanes, and satellites. Radar systems have an antenna, a transceiver, and a processor and are made with chips and other electronic components. I also worked at a GE laboratory, where we did research to develop advanced electronic components. Next, I moved to GE in New Jersey, where we manufactured advanced electronic assemblies used in building various electronic systems for the Department of Defense. Then I moved near Princeton, New Jersey, with Lockheed Martin, where we designed and manufactured satellites, which are likewise built from electronics and integrated circuits. Then, at ABB, we designed and manufactured electronic systems such as control systems to automate factories and industrial robots, often used to manufacture cars.
My last company was ASML, where we designed and manufactured the photolithography tools used to manufacture chips by companies like Intel, Samsung, and TSMC. In 1965, Gordon Moore, who would later co-found Intel, proposed what became known as Moore’s Law: the number of components on an integrated circuit doubles every two years. When I ended my career at ASML, our work was critical to keeping Moore’s Law alive. Moore’s Law and integrated circuits have enabled almost every technological advance in my lifetime. They are synonymous with, and drive, the Fourth Industrial Revolution, which is causing an exponential change that will dwarf all preceding industrial revolutions and change all of our lives in ways that are difficult to imagine.
Here are some emerging technologies that are changing the world as part of the Fourth Industrial Revolution: 3D printing, machine learning, genetic engineering, robots, edge computing, quantum computing, cloud computing, virtual reality, augmented reality, blockchain, and the Internet of things.
The first chief technology officer at Microsoft, Nathan Myhrvold, said, “The way Moore’s Law occurs in computing is really unprecedented in other walks of life. If the Boeing 747 obeyed Moore’s Law, it would travel a million miles an hour, and a trip to New York would cost about five dollars. Those enormous changes just aren’t part of our everyday experience.”
I didn’t have a plan for my life. Life happened to me. My father’s life was a convergence of science and art. He traveled the world as a pilot for the Air Force, evaluated radar systems, and wrote radar system instruction manuals for GE. He made stained glass art in his free time, played the organ, and painted in oils and watercolors.
My life was also a convergence of science and art. My artistic side is revealed in many ways. I used to paint and write poetry in my twenties and studied liberal arts and English in college. I was a singer in a band in college. I led singing in church for years. I listen to classical music and jazz every day. I am a voracious reader, reading an average of a book weekly. And finally, I have completed my memoir. Mary Karr, one of the leading memoir writers, advocates for memoir as an art form in her book The Art of Memoir. Her own memoir, The Liars’ Club, is widely regarded as having started the current memoir boom and was on the New York Times bestseller list for more than a year.
Before I started my first job as a computer operator, I didn’t know I was interested in technology. That changed after my first week on the job. And as the years passed, technology became more fascinating to me, and I spent the next forty-five years working in high-tech. I never realized until today, as I write this, that my life was also a convergence of science and art as I followed in my father’s footsteps.